
DX11/DX12 CPU overhead and how it relates to Ryzen performance

Joined
Dec 6, 2016
Messages
748 (0.28/day)
First a video from NerdTechGasm: AMD vs NV Drivers: A Brief History and Understanding Scheduling & CPU Overhead


I think the video nicely explains why GCN architecture has performance problems with some DX11 games and why it shines in DX12/Vulkan.

Basically, the GCN GPUs have hardware schedulers which work best when they are fed from multiple CPU cores - the DX12 way. In DX11, if the developer puts all game logic on the primary thread, the GCN GPU will sit idle and underperform.
Nvidia, on the other hand, doesn't have this problem: their driver uses a software scheduler which intercepts draw calls, distributes them across more CPU cores and then sends them, reassembled as command lists, back to the GPU.
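To make the scheduling difference concrete, here's a toy Python sketch (purely illustrative - the function names are made up and this is nothing like real driver code): a DX11-style single-thread submission versus a driver that farms draw-call recording out to worker threads and reassembles the results in their original order before handing them to the GPU.

```python
# Toy model of driver-side draw-call scheduling. "record_command" stands
# in for the CPU-side work of validating and encoding one draw call.
from concurrent.futures import ThreadPoolExecutor

def record_command(draw_call):
    # Stand-in for per-draw CPU work (state validation, encoding).
    return f"cmd({draw_call})"

def submit_dx11_style(draw_calls):
    # Single immediate context: every draw recorded on one thread.
    return [record_command(d) for d in draw_calls]

def submit_driver_threaded(draw_calls, workers=4):
    # Split the stream across worker threads, record each chunk in
    # parallel, then interleave the chunks back into original order
    # so the combined command list is identical to the serial one.
    chunks = [draw_calls[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        recorded = list(pool.map(
            lambda chunk: [record_command(d) for d in chunk], chunks))
    return [recorded[i % workers][i // workers]
            for i in range(len(draw_calls))]

calls = list(range(8))
# Same command stream either way; only the CPU-side parallelism differs.
assert submit_driver_threaded(calls) == submit_dx11_style(calls)
```

The point is that the final, ordered command list the GPU sees is the same; the driver just spreads the recording cost over more cores, which is why this helps on CPUs with spare threads.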

The second video from AdoredTV explains why current CPU benchmarks might be biased:

Nvidia's CPU-dependent scheduling can cause problems on Ryzen CPUs because of the CCX latency. This gives an edge to Intel CPUs when paired with Nvidia GPUs in both DX11 and DX12. GCN GPUs don't have this problem and thus work as well on Ryzen as on Kaby Lake.
This could easily be tested with a Fury X in April, when the Ryzen 5 CPUs launch.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
This video is a direct reaction to that of Adored:
tl;dw: Nvidia is fine with DX12 on Ryzen 7.

Also a video on RAM performance for Ryzen (2133 to 3600 tested, and also vs. an i7 7700K @ 5 GHz / 3200 RAM):
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.57/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
First a video from NerdTechGasm: AMD vs NV Drivers: A Brief History and Understanding Scheduling & CPU Overhead


I think the video nicely explains why GCN architecture has performance problems with some DX11 games and why it shines in DX12/Vulkan.

Basically, the GCN GPUs have hardware schedulers which work best when they are fed from multiple CPU cores - the DX12 way. In DX11, if the developer puts all game logic on the primary thread, the GCN GPU will sit idle and underperform.
Nvidia, on the other hand, doesn't have this problem: their driver uses a software scheduler which intercepts draw calls, distributes them across more CPU cores and then sends them, reassembled as command lists, back to the GPU.

The second video from AdoredTV explains why current CPU benchmarks might be biased:

Nvidia's CPU-dependent scheduling can cause problems on Ryzen CPUs because of the CCX latency. This gives an edge to Intel CPUs when paired with Nvidia GPUs in both DX11 and DX12. GCN GPUs don't have this problem and thus work as well on Ryzen as on Kaby Lake.
This could easily be tested with a Fury X in April, when the Ryzen 5 CPUs launch.

This is fing amazing.
Explains so much.
So far NVIDIA doesn't seem too interested in fixing the driver's thread scheduler,
because everybody was blaming Ryzen's performance instead.
But, as the second video says, once Vega is out, if Nvidia doesn't fix their driver, people will wonder how come Vega runs well on Ryzen while Nvidia doesn't.
Especially given how popular Ryzen CPUs are.
So there will be another thing to look forward to: an NVIDIA driver patch.
 
Joined
Jun 24, 2015
Messages
35 (0.01/day)
First a video from NerdTechGasm: AMD vs NV Drivers: A Brief History and Understanding Scheduling & CPU Overhead


I think the video nicely explains why GCN architecture has performance problems with some DX11 games and why it shines in DX12/Vulkan.

Basically, the GCN GPUs have hardware schedulers which work best when they are fed from multiple CPU cores - the DX12 way. In DX11, if the developer puts all game logic on the primary thread, the GCN GPU will sit idle and underperform.
Nvidia, on the other hand, doesn't have this problem: their driver uses a software scheduler which intercepts draw calls, distributes them across more CPU cores and then sends them, reassembled as command lists, back to the GPU.

Woah, that video is... why the heck do reviewers never talk about these things? Everything from the past few years with AMD and NV in DX11, and now in DX12 and Vulkan, makes so much sense now.

This should be compulsory viewing for anyone interested in GPU tech, to understand these important differences between AMD & NV; it helps us make sense of benchmarks so much better.
 
Joined
Jan 8, 2017
Messages
8,946 (3.35/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
This was known for a long time; I wonder why it took so long to reach the spotlight of review sites and people in general. Even more baffling to me is why AMD never responded to what Nvidia did with their drivers by making adjustments to their architecture; they basically dragged this issue with them to this day. It caused insanely powerful hardware for its time, such as the Fury X, to fail really hard.
 
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
This was known for a long time; I wonder why it took so long to reach the spotlight of review sites and people in general. Even more baffling to me is why AMD never responded to what Nvidia did with their drivers by making adjustments to their architecture; they basically dragged this issue with them to this day. It caused insanely powerful hardware for its time, such as the Fury X, to fail really hard.

Mostly this. As usual we have AMD explaining why they are underperforming long after the fact, in some failing attempt to clear stock of their old hardware and push reasons for us to buy their new product. One wonders why AMD wasn't doing something within GCN or its drivers to recover while the hardware was still new. Instead, AMD was content to push an ungodly amount of TFLOPS and shaders, and even HBM, onto their GPUs to keep them afloat.

Now, with Ryzen, they finally seem to catch on to these things ON TIME and actually make things happen. AMD always seems to believe the balance will magically swing in their favour, but the reality is they need to keep pulling those strings throughout every part of the industry to really get the top performance spot.

In the end, buyers don't give a rat's ass about the 'why', just about the performance they get. It's the reason my CPU budget for Ryzen has been moved towards a new GPU; I'll be waiting for Zen 2.
 
Joined
Jan 10, 2011
Messages
1,329 (0.27/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) LG 24MK430 primary && Samsung S24D590 secondary
Case Corsair Graphite 780T.
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Coolermaster MM530.
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 22.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
I'm starting to see a recurring theme of YouTubers opening their pro-Zen arguments by attacking established media outlets with implicit or explicit accusations of bias.

Also, I find it fascinating that some people base their arguments on the prevalent APIs and the most commonly used aspects of each <spoiler>(taking an example from the first video: not using deferred contexts in D3D11)</spoiler>, yet complain that reviewers follow the same logic when picking graphics cards for their tests! So, hey, since we're complaining irrationally, where are my OpenGL 4.5 benchmarks?

The first video is an interesting one, though: one of the few rational presentations out of the swamp of mediocrity and BS that is YouTube. One nitpick: the use of "Int" for "instruction" confused me for a moment. -_-

nVidias CPU dependent scheduling can cause problems on Ryzen CPUs because of the CCX latency. This gives an edge to Intel CPUs when paired with nVidia GPUs on both DX11 and DX12. GCN GPUs don't have this problem and are thus working on Ryzen as good as on Kaby Lake.
This could be easily tested with a Fury X in april when the Ryzen 5 CPUs launch.

Why wait for the R5? If GCN is relatively agnostic to CPU architecture, as implied, the R7 results should be quite sufficient to show a smaller delta between it and comparable Intel CPUs than there is with an Nvidia GPU. If that's the case (and I honestly can't find any benchmarks using a variety of cards), then it can be inferred that Nvidia's software scheduler is significantly at fault here.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I'm starting to see a recurring theme of YouTubers opening their pro-Zen arguments by attacking established media outlets with implicit or explicit accusations of bias.

Also, I find it fascinating that some people base their arguments on the prevalent APIs and the most commonly used aspects of each <spoiler>(taking an example from the first video: not using deferred contexts in D3D11)</spoiler>, yet complain that reviewers follow the same logic when picking graphics cards for their tests! So, hey, since we're complaining irrationally, where are my OpenGL 4.5 benchmarks?

The first video is an interesting one, though: one of the few rational presentations out of the swamp of mediocrity and BS that is YouTube. One nitpick: the use of "Int" for "instruction" confused me for a moment. -_-



Why wait for the R5? If GCN is relatively agnostic to CPU architecture, as implied, the R7 results should be quite sufficient to show a smaller delta between it and comparable Intel CPUs than there is with an Nvidia GPU. If that's the case (and I honestly can't find any benchmarks using a variety of cards), then it can be inferred that Nvidia's software scheduler is significantly at fault here.
It certainly puts a new light on the fact that most benched Ryzen with an Nvidia card only.
Interesting video, and by and large it confirmed exactly what I believed to be the case.
This is exactly what the secret sauce in Nvidia's drivers has always been: a software-based scheduler, optimised by Nvidia to reach its potential. Don't get me wrong, I see this as a reasonable angle of approach by them.
But I would add that, for me, it's secrets like this - Nvidia's use of tiling and the lossy colour compression they didn't tell anyone about - that make me not buy from them; they are very tight-lipped and IMHO snide about how they get the performance they do (good performance, I agree).
That, and selling not much hardware in BOM terms for high ticket prices.
 
Joined
Jan 8, 2017
Messages
8,946 (3.35/day)
It certainly puts a new light on the fact that most benched Ryzen with an Nvidia card only.
Interesting video, and by and large it confirmed exactly what I believed to be the case.
This is exactly what the secret sauce in Nvidia's drivers has always been: a software-based scheduler, optimised by Nvidia to reach its potential. Don't get me wrong, I see this as a reasonable angle of approach by them.
But I would add that, for me, it's secrets like this - Nvidia's use of tiling and the lossy colour compression they didn't tell anyone about - that make me not buy from them; they are very tight-lipped and IMHO snide about how they get the performance they do (good performance, I agree).
That, and selling not much hardware in BOM terms for high ticket prices.

I doubt that made as big a difference as they claim; it would mean that the mere 192 GB/s of my 1060 would be way more effective, yet every time an intensive alpha-based effect shows up in a game, performance drops significantly, as expected. All there is to Nvidia right now is high clocks, low power consumption and effective drivers; any fancy talk from them is meaningless.
 
Joined
Jan 10, 2011
Messages
1,329 (0.27/day)
But I would add that, for me, it's secrets like this - Nvidia's use of tiling and the lossy colour compression they didn't tell anyone about - that make me not buy from them; they are very tight-lipped and IMHO snide about how they get the performance they do (good performance, I agree).
That, and selling not much hardware in BOM terms for high ticket prices.

Dunno about the "not telling" part. NV does pour a lot of detail into their dev documentation, and surely game devs, whom such information really concerns, are aware of these details (and I'm pretty sure I've passed plenty of press-conference slides talking about those two "secrets").
Not to say that they don't do proprietary stuff at all, however; their GameWorks strategy comes to mind when thinking of how much NV loves to abuse its quasi-monopoly over video-game graphics.

I completely agree that Nvidia is quite snide. I might be a willing consumer, but god knows we don't need yet a third full-scale monopoly in the field! It's enough that we have to suffer Microsoft and Intel in the OS and CPU markets.
 
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
It certainly puts a new light on the fact that most benched Ryzen with an Nvidia card only.
Interesting video, and by and large it confirmed exactly what I believed to be the case.
This is exactly what the secret sauce in Nvidia's drivers has always been: a software-based scheduler, optimised by Nvidia to reach its potential. Don't get me wrong, I see this as a reasonable angle of approach by them.
But I would add that, for me, it's secrets like this - Nvidia's use of tiling and the lossy colour compression they didn't tell anyone about - that make me not buy from them; they are very tight-lipped and IMHO snide about how they get the performance they do (good performance, I agree).
That, and selling not much hardware in BOM terms for high ticket prices.

What do you mean "not telling"? They really do, beyond some PowerPoint slides, if you care to read the in-depth source material they put out. Also, CUDA isn't really a black box...

It's not like, when you can't imagine "how they do it", it's suddenly some underhanded, super-arcane workaround to achieve performance. Nvidia have just been refining their tech with every new gen, and this is where they are now. Lossy colour compression, for example: I have yet to find a SINGLE realistic scenario where a user would ever notice it. I can't, I couldn't, and now that I have Pascal, I still can't. That's a bit like defending "tessellation where you can't see it" (*wink* if you know what I'm getting at) instead of removing it to improve performance.

Keep in mind that all big performance jumps that don't happen in hardware and architecture happen by way of optimization. Optimization means cutting away the parts that are not essential to the experience of using the product. It is only logical, then, to take a look at any kind of compression you can get your hands on. And it's happening big time since DX12 - we're cutting up everything that's fed to the GPU: textures, colour, draw calls, even geometry. And on the hardware side, Nvidia's been shaving away everything that's not needed for gaming, which is why we have "handicapped Titans" that are nothing like their pro-market counterparts.
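(As a purely illustrative aside - this is a generic delta-coding sketch, not Nvidia's or AMD's actual algorithm - the idea behind delta colour compression is that neighbouring pixels are usually similar, so storing small differences costs far fewer bits than storing raw values, which is where the bandwidth saving comes from.)

```python
# Toy delta compression of one scanline. The round trip is exact, i.e.
# this particular variant loses nothing; only the representation shrinks.
def delta_encode(pixels):
    # Store the first pixel raw, then only the differences.
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def delta_decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

scanline = [200, 201, 201, 203, 202, 202, 204, 205]  # smooth gradient
deltas = delta_encode(scanline)
assert delta_decode(deltas) == scanline     # round trip is exact
# The deltas fit in a couple of bits each, vs. 8 bits per raw value:
assert all(abs(d) <= 2 for d in deltas[1:])
```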
 
Joined
Jan 10, 2011
Messages
1,329 (0.27/day)
And it's happening big time since DX12 - we're cutting up everything that's fed to the GPU: textures, colour, draw calls, even geometry.

Other than depth tests, how can you cut geometry?
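(For what it's worth, one classic answer that predates D3D12 - and D3D itself - is back-face culling: dropping triangles that wind away from the viewer before they are ever shaded. A toy sketch, not tied to any real API:)

```python
# Back-face culling via the signed area of a screen-space triangle.
# Counter-clockwise winding => positive area => front-facing; clockwise
# triangles face away and can be discarded before rasterisation.
def signed_area(tri):
    (x0, y0), (x1, y1), (x2, y2) = tri
    return 0.5 * ((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))

def cull_backfaces(triangles):
    # Keep only front-facing (counter-clockwise) triangles.
    return [t for t in triangles if signed_area(t) > 0]

front = [(0, 0), (1, 0), (0, 1)]   # CCW: positive area, kept
back  = [(0, 0), (0, 1), (1, 0)]   # CW: negative area, culled
assert cull_backfaces([front, back]) == [front]
```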
 
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
Joined
Jan 10, 2011
Messages
1,329 (0.27/day)
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
Ah, forgot about that one! But wasn't it available long before d3d12 (or any d3d at all)?

Yeah I was actually looking for an example within DX12, because I remember reading something about this not too long ago, but had a couple too many beers over the weekend I guess
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
What do you mean "not telling"? They really do, beyond some PowerPoint slides, if you care to read the in-depth source material they put out. Also, CUDA isn't really a black box...

It's not like, when you can't imagine "how they do it", it's suddenly some underhanded, super-arcane workaround to achieve performance. Nvidia have just been refining their tech with every new gen, and this is where they are now. Lossy colour compression, for example: I have yet to find a SINGLE realistic scenario where a user would ever notice it. I can't, I couldn't, and now that I have Pascal, I still can't. That's a bit like defending "tessellation where you can't see it" (*wink* if you know what I'm getting at) instead of removing it to improve performance.

Keep in mind that all big performance jumps that don't happen in hardware and architecture happen by way of optimization. Optimization means cutting away the parts that are not essential to the experience of using the product. It is only logical, then, to take a look at any kind of compression you can get your hands on. And it's happening big time since DX12 - we're cutting up everything that's fed to the GPU: textures, colour, draw calls, even geometry. And on the hardware side, Nvidia's been shaving away everything that's not needed for gaming, which is why we have "handicapped Titans" that are nothing like their pro-market counterparts.
I meant they tell only those they need to; devs know, sure, but none of them mentioned the tile-based rendering either, so it, like other things, must be covered by NDA to a degree, IMHO.
And I get it; I'm not per se saying Nvidia is bad or anything, just that I prefer open, ethically open companies, and while few money-making organisations are perfect, I have values I won't bend.
This video points out clearly that Nvidia is using CPU cycles without clearly advising the user of it. While few games use all CPU resources, you could encounter a game that suffers issues when all CPU resources are in use. Also, as is shown, a system using Nvidia will in some situations draw more power (total system power) through its CPU overhead than the same game on a 480, all while Nvidia shouts about its cards' efficiency; again, a bit frowny.
I don't just game, though, so I want to know how it's working, not just that it does work.
Nvidia changed their game after Fermi, going with different designs for compute and gaming chips; I wanted, and still do want, the best of both worlds at reasonable prices, ethically if possible.

I use 10-bit (12 available) full RGB on a pro 4K monitor, and I would notice the colour palette, mate, as would most; no washed-out look.


@Vya Domus, you're wrong: the colour compression alone freed up memory bandwidth for more FPS, tiling is said to free up more on-chip memory for your app to use (i.e. in cache etc.), and removing all the double compute hardware reduced power use. All in all big things, all with pros and cons.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Nvidia's CPU-dependent scheduling can cause problems on Ryzen CPUs because of the CCX latency.
Any real test results behind this?
Same-CCX 4-core tests have not fared much better than full-CPU tests so far.

Also, last time Nvidia's draw-call management was talked about, I think they said this stuff really didn't scale beyond 2 threads in this form.
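(Illustrative aside, with made-up numbers rather than measurements: Amdahl's law shows why a driver that can only parallelise part of a frame's CPU work sees quickly diminishing returns from extra threads.)

```python
# Amdahl's law: speedup is capped by the serial fraction of the work.
def speedup(parallel_fraction, threads):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / threads)

# Suppose (assumption) half of a frame's CPU time is parallelisable
# driver overhead; going from 2 to 8 threads then barely helps:
p = 0.5
print(round(speedup(p, 2), 2))  # 1.33
print(round(speedup(p, 8), 2))  # 1.78 - diminishing returns
```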

It certainly puts a new light on the fact that most benched Ryzen with an Nvidia card only.
Beyond the fact that there is currently a certain lack of AMD cards above the $200 midrange one? :)
For gaming tests, pairing a $500 high-end CPU with a $200 midrange GPU would not really make much sense, and CrossFire has its own problems.
 
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
I meant they tell only those they need to; devs know, sure, but none of them mentioned the tile-based rendering either, so it, like other things, must be covered by NDA to a degree, IMHO.
And I get it; I'm not per se saying Nvidia is bad or anything, just that I prefer open, ethically open companies, and while few money-making organisations are perfect, I have values I won't bend.
This video points out clearly that Nvidia is using CPU cycles without clearly advising the user of it. While few games use all CPU resources, you could encounter a game that suffers issues when all CPU resources are in use. Also, as is shown, a system using Nvidia will in some situations draw more power (total system power) through its CPU overhead than the same game on a 480, all while Nvidia shouts about its cards' efficiency; again, a bit frowny.
I don't just game, though, so I want to know how it's working, not just that it does work.
Nvidia changed their game after Fermi, going with different designs for compute and gaming chips; I wanted, and still do want, the best of both worlds at reasonable prices, ethically if possible.

I use 10-bit (12 available) full RGB on a pro 4K monitor, and I would notice the colour palette, mate, as would most; no washed-out look.


@Vya Domus, you're wrong: the colour compression alone freed up memory bandwidth for more FPS, tiling is said to free up more on-chip memory for your app to use (i.e. in cache etc.), and removing all the double compute hardware reduced power use. All in all big things, all with pros and cons.

But they did inform the user - not specifically of the CPU cycles, of course, but they did make noise about their DX11 optimization driver back then:

http://www.geforce.com/whats-new/articles/nvidia-geforce-337-50-beta-performance-driver

The performance per watt of the Nvidia cards also doesn't (generally) change because of this: they also pump out more FPS, and consequently use a bit more power to do it.

From my personal experience, I just don't see what you're getting at. Nvidia hasn't been hiding all that much, and when they do, it gets uncovered so quickly that I doubt they'll try it again (the 970's 3.5 GB comes to mind). Literally all of their documentation on technical aspects is very complete and exhaustive.

Now, of course, AMD is a different company, I agree, but at the same time, doing everything 'in the open' has not really helped them in any way, and one could conclude that obscurity has actually helped Nvidia get its technology widely supported, while AMD's open nature has not. In the end you need market share to have your technology adopted.
 
Joined
Jan 8, 2017
Messages
8,946 (3.35/day)
@Vya Domus, you're wrong. The colour compression alone freed up memory bandwidth for more FPS, tiling is said to free up more on-chip memory (e.g. in cache) for your app to use, and removing the double-precision compute reduced power use. All big things, all with pros and cons.

I don't really care what it's said to do; as long as there is no solid, detailed info with contextual examples, it's just vaporware to me. Memory bandwidth bottlenecks are hard to track down. Spatial anti-aliasing and alpha-based effects are the only things that can directly benefit from this, and in reality it's difficult to determine how much of a difference it makes, since the Pascal cards that benefit from higher bandwidth (1080/1080 Ti/Titan XP) are way faster anyway, and you can't say with certainty how much of the FPS boost comes from fancy algorithms and how much from raw bandwidth. There is no doubt it has an effect, but unless someone writes some intelligent piece of code to test this in particular and proves it's game-changing, I remain skeptical. Nvidia wouldn't have pushed GDDR5X if their memory optimization techniques played such a big role.
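To put rough numbers on the bandwidth argument: if lossless delta colour compression removes some average fraction of framebuffer traffic, effective bandwidth scales by the inverse of what remains. A back-of-the-envelope sketch; the 20% saving is an illustrative assumption, not a measured figure:

```python
# Back-of-the-envelope: effective memory bandwidth with lossless
# delta colour compression. The saving fraction is an assumed
# illustrative value, not a measurement.

def effective_bandwidth(raw_gbps: float, traffic_saved: float) -> float:
    """Raw bandwidth divided by the fraction of traffic that still
    has to cross the memory bus after compression."""
    return raw_gbps / (1.0 - traffic_saved)

raw = 320.0    # GB/s: GTX 1080's GDDR5X (10 Gbps x 256-bit bus)
saved = 0.20   # assume 20% of colour traffic compressed away

print(f"{effective_bandwidth(raw, saved):.0f} GB/s effective")
```

Which is exactly why it is hard to isolate in benchmarks: the uplift is folded into the raw bandwidth figure and only shows up in bandwidth-bound workloads.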
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
In reply to both above: fine, so our opinions differ.
The world spins on. :)
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
It's funny how many people are ignoring my first post and missing that Adored's claims were effectively shown to be wrong due to a faulty driver. Well, at least the OP checked it. Ignorance. :p

9:45 We know that titan xp isn't performing anything near it's full potential on the Ryzen 7 platform. ¯\_(ツ)_/¯
It's one game where the Titan XP performs badly, and at a low resolution at that, so I wouldn't read much into it.
 
Joined
Sep 17, 2014
Messages
20,953 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
It's funny how many people are ignoring my first post and missing that Adored's claims were effectively shown to be wrong due to a faulty driver. Well, at least the OP checked it. Ignorance. :p


It's one game where the Titan XP performs badly, and at a low resolution at that, so I wouldn't read much into it.

I don't waste time on random YouTubers spouting nonsense; that's probably the main reason. ;) I'm getting really good at recognizing them these days.
 

Kanan

Tech Enthusiast & Gamer
I don't waste time on random YouTubers spouting nonsense; that's probably the main reason. ;)
You're arrogant again, which is also why I don't care to talk to you anymore. You're also in the wrong thread then, I'd say.

Btw I wasn't talking about you. :p
 
Joined
Jan 8, 2017
Messages
8,946 (3.35/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It's funny how many people are ignoring my first post and missing that Adored's claims were effectively shown to be wrong due to a faulty driver. Well, at least the OP checked it. Ignorance. :p


It's one game where the Titan XP performs badly, and at a low resolution at that, so I wouldn't read much into it.

It's not that we ignored it, but that video wasn't really worth much attention to begin with, since it offers no real conclusion.
 