
AMD "Navi 31" Memory Cache Die Has Preparation for 3D Vertical Cache?

Joined
Apr 14, 2022
Messages
671 (0.86/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
In case you forgot, AMD is still on top in terms of performance/price and Nvidia at the bottom, so this is complete nonsense.


Such as ?


They have HIP and OpenCL; HIP is not supported on Windows, but I doubt this is relevant to the vast majority of users.

You missed my point. The problem is not raw performance, yet you posted a performance chart.
The problem is the package compared to the performance.
Everything apart from the performance is worse:
Image reconstruction tech, RT performance, driver issues, no CUDA alternative, no NVENC alternative (at least the 7900s got AV1), etc.

So they are not cheap enough for the package they offer.
But we are off topic.

On topic: even if the 3D V-Cache works with GPUs, it will give AMD something it doesn't necessarily need. More performance.
 
Joined
Dec 12, 2016
Messages
1,341 (0.49/day)
More latency because less lanes?
I'm not an expert in this, but "serial" usually implies one after the other rather than all at once. In other words, I don't think all lanes are accessed simultaneously the way they are in parallel connections, so latency should increase as you require more lanes for more bandwidth, if latency is even a factor at all.
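For what it's worth, here's a crude back-of-envelope model of how I picture it (every number below is made up purely to illustrate the idea; a real interconnect is far more complicated): each transfer pays a fixed link latency plus the time to serialize the payload across however many lanes are available, so extra lanes mostly buy serialization time and bandwidth, while the fixed latency stays put.

Code:
# Crude model of a serialized link; all numbers are invented for illustration.
def transfer_time_ns(payload_bits, lanes, gbit_per_lane, fixed_latency_ns):
    bits_per_ns = lanes * gbit_per_lane  # 1 Gbit/s per lane == 1 bit/ns per lane
    return fixed_latency_ns + payload_bits / bits_per_ns

for lanes in (4, 8, 16):
    t = transfer_time_ns(payload_bits=512, lanes=lanes,
                         gbit_per_lane=16, fixed_latency_ns=10)
    print(f"{lanes:2d} lanes: ~{t:.0f} ns per 512-bit transfer")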

Someone with more knowledge can correct me or provide more detail.
 
Joined
Sep 1, 2020
Messages
2,068 (1.51/day)
Location
Bulgaria
I think the technology is not so strictly serial if you look at it closely; there are also elements of parallel data transfer.
 
Joined
Dec 26, 2020
Messages
368 (0.29/day)
System Name Incomplete thing 1.0
Processor Ryzen 2600
Motherboard B450 Aorus Elite
Cooling Gelid Phantom Black
Memory HyperX Fury RGB 3200 CL16 16GB
Video Card(s) Gigabyte 2060 Gaming OC PRO
Storage Dual 1TB 970evo
Display(s) AOC G2U 1440p 144hz, HP e232
Case CM mb511 RGB
Audio Device(s) Reloop ADM-4
Power Supply Sharkoon WPM-600
Mouse G502 Hero
Keyboard Sharkoon SGK3 Blue
Software W10 Pro
Benchmark Scores 2-5% over stock scores
I like the design, but it barely checks out. It probably saves quite a bit of cost, but that's not good enough. The memory/cache side of things seems fine, which makes me wonder whether those cores really are any better. 35-40% faster with 20% more cores doesn't seem that good for a claimed "50% better perf/watt"; that should have meant either lower power or higher performance than what we actually got... It just seems to me those cores are again (as always?) too weak, or there are too few of them. Going from the 3090 Ti to the 4090 added roughly 50% more cores, all of which were faster as well.
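A quick sanity check on that arithmetic (these are just the rough figures from this thread plugged into a calculator, nothing measured):

Code:
# Rough check of what "+35-40% performance with +20% cores" implies per core,
# and what a "50% better perf/watt" claim would allow in theory.
cores_ratio = 1.20
for perf_ratio in (1.35, 1.40):
    per_core = perf_ratio / cores_ratio
    print(f"+{(perf_ratio - 1) * 100:.0f}% perf with +20% cores "
          f"-> ~{(per_core - 1) * 100:.0f}% per-core gain")

perf_per_watt = 1.50
print(f"+50% perf/watt -> {perf_per_watt:.1f}x perf at equal power, "
      f"or equal perf at ~{1 / perf_per_watt:.2f}x power")

The per-core gain works out to roughly 12-17%, which is what makes the "50% better perf/watt" claim feel underwhelming.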

Perhaps AMD should try making a bigger Navi: 128 CUs instead of 96, two more MCDs, that sort of thing. It might be too power hungry, but unless there are core-level bugs (which is likely) it would get to 4090 level...
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Image reconstruction tech, RT performance, driver issues, no CUDA alternative, no NVENC alternative (at least the 7900s got AV1), etc.

Define "competitive image reconstruction tech"; FSR works fine. RT comes with a colossal performance hit on every card, including Nvidia's. You can't seriously use it as some kind of big selling point when even the 4090 can't hit 60 fps at 4K natively in some games, not to mention that RT is still not exactly widespread, and it has been almost five years since RTRT became a thing.

I already said there is an alternative to CUDA, and VCN is comparable to NVENC. And you still failed to mention what those driver issues are.
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
X3D was only good because Intel was being incompetent for 4 straight years
Nvidia, however, never sits back and relaxes
You are right that Nvidia is not sitting back currently. (I don't think we can say "never", though; they have had eras where they were bad, like the GeForce FX series, which was a huge flop. They simply didn't aim far enough ahead, and they suffered for it.)

But the fact that X3D is good on CPUs has nothing to do with Intel; it's about how CPUs work. CPUs are very latency dependent, whereas GPUs are designed to hide memory latency. Having 3D V-Cache on a CPU cuts round trips to main memory, so the CPU can continue its computation sooner. Code that runs on a CPU also tends to be loops that reuse the same instructions and data frequently.

On a GPU, the cache is there to increase effective bandwidth, not to reduce latency. The benefit is that you can use a smaller bus: a 4080 has a narrower bus than a 3080 but still manages to perform far better thanks to its large L2 cache.

With RDNA3, they increased the bus width for the top-end SKU from 256-bit to 384-bit but reduced the cache from 128 MB to 96 MB. The thing is, going to 192 MB of cache (as rumored) with 3D V-Cache might not raise the hit rate by much, and little increase in cache hit rate means little increase in effective bandwidth.
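To put some rough numbers on the hit-rate argument (a toy model with invented hit rates, not anything AMD has published): if a fraction of requests hit the cache, only the misses have to go out to GDDR6, so effective bandwidth scales roughly as raw DRAM bandwidth divided by the miss rate.

Code:
# Toy model: effective bandwidth vs. cache hit rate.
# The hit rates below are invented for illustration, not measured values.
def effective_bandwidth(dram_gbps, hit_rate):
    # Only misses consume DRAM bandwidth; assumes the cache itself is never
    # the bottleneck, which is a simplification.
    return dram_gbps / (1.0 - hit_rate)

dram_gbps = 960  # roughly the raw bandwidth of a 384-bit, 20 Gbps card
for hit_rate in (0.50, 0.58, 0.62):
    print(f"hit rate {hit_rate:.0%} -> "
          f"~{effective_bandwidth(dram_gbps, hit_rate):.0f} GB/s effective")

Going from a 50% to a 62% hit rate in this toy model only moves effective bandwidth from about 1.9 TB/s to about 2.5 TB/s, which is why a modest hit-rate bump from doubling the cache may not transform the card.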

But anyway, is RDNA 3 really bandwidth starved? I am not sure.
 
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
RDNA3's problem is ray tracing performance, and I doubt extra cache will help there.

And just to be clear, my point of view is that it doesn't matter that rasterization is still king. Ray tracing is what helps (if performance is great), or hurts (if it is not), in marketing those chips TODAY.
 
Joined
Apr 19, 2018
Messages
1,067 (0.48/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
At least this explains the missing performance.
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Ray tracing is what helps (if performance is high), or hurts (if it is not), in marketing those chips TODAY.

Does it? Nvidia seems to be plastering everything with "get 100x more FPS with DLSS4637" and they rarely, if ever, advertise anything related to RT by itself, because they know it still sucks.
 
Joined
Aug 21, 2013
Messages
1,723 (0.44/day)
RDNA3's problem is ray tracing performance, and I doubt extra cache will help there.

And just to be clear, my point of view is that it doesn't matter that rasterization is still king. Ray tracing is what helps (if performance is great), or hurts (if it is not), in marketing those chips TODAY.
How is this a problem for RDNA3? It has roughly 3090-level RT performance, and I did not see many people calling the 3090's RT performance a "problem".
Yes, the 4090 raised that bar, but that does not mean older cards somehow got worse.

And yes, AMD is still one generation behind in RT performance, but that's not too bad considering they started years after Nvidia. Nvidia's RT is not better because of some "magic". It is better because they were the first to bring it to dGPUs.
 
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Does it? Nvidia seems to be plastering everything with "get 100x more FPS with DLSS4637" and they rarely, if ever, advertise anything related to RT by itself, because they know it still sucks.
DLSS was created both to cheat on performance and to make ray tracing something more than a laughable slideshow. But considering that Nvidia has reached 90% market share while AMD offers superior rasterization options, at least under $500, I believe people really are looking at RT performance when buying.

How is this a problem for RDNA3? It has roughly 3090-level RT performance, and I did not see many people calling the 3090's RT performance a "problem".
Yes, the 4090 raised that bar, but that does not mean older cards somehow got worse.

And yes, AMD is still one generation behind in RT performance, but that's not too bad considering they started years after Nvidia. Nvidia's RT is not better because of some "magic". It is better because they were the first to bring it to dGPUs.
I can see your point and have used it in arguments in the past: what was "amazing RT performance" nine months ago is considered "crap performance" today? It doesn't make sense. I agree.

But the thing is, AMD needs a strong win to start selling GPUs, and the only way to get it is to close the gap with Nvidia in RT. Being one generation behind doesn't help when, for a decade or longer, people and the press have relegated AMD GPUs to the value option with the "crappy drivers". People will prefer to buy an old-gen Nvidia card, even one inferior in rasterization, over a brand-new AMD card if AMD isn't clearly beating Nvidia at something. Being a generation behind also means AMD has to compete not only with the 4000 series but with the 3000 series too, for the reason I just mentioned. People will buy the inferior (in rasterization) Nvidia card because of AMD's online reputation.
AMD needs to be better in EVERYTHING compared to Nvidia's last gen and start closing the gap with Nvidia's latest gen. RX 6000 was a step, even two steps, forward. RX 7000 is more like half a step backwards. Maybe going chiplets didn't give them enough time to improve in other areas. I don't know.
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I believe people really are looking at RT performance when buying.

Dude, I personally know several people who have Nvidia GPUs and have no clue what either RT or DLSS is. The vast majority of users are completely clueless about these things; they look at the charts but rarely have any idea what they even mean.

Some of them might spot the option in the menus: they enable it, see that performance falls off a cliff with hardly any visual improvement, and never touch it again.
 
Joined
Dec 12, 2016
Messages
1,341 (0.49/day)
Dude, I personally know several people who have Nvidia GPUs and have no clue what either RT or DLSS is. The vast majority of users are completely clueless about these things; they look at the charts but rarely have any idea what they even mean.

Some of them might spot the option in the menus: they enable it, see that performance falls off a cliff with hardly any visual improvement, and never touch it again.
Vya Domus is right. The vast majority of gamers buy consoles, use mobile, buy pre-built PCs, use laptop integrated graphics, etc. The sliver of users that buy a discrete video card for a DIY build is quite small by comparison. The number of educated DIY buyers is even smaller than that. And of that share, AMD has between 10 and 20%.

Nvidia's main selling point is its brand. But to become the top brand, they had to deliver and keep delivering, even if buyers are unaware of why it's better. AMD has its work cut out to topple Nvidia as the top brand, and better RTRT performance is only one of many priorities on the way to that goal. That's why AMD iterates on RTRT but hasn't doubled down on it yet: it's still a distant-future must-have tech. Rasterization is still king, and I personally give AMD the benefit of the doubt, as they are close to toppling Intel in the CPU space.
 
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Dude, I personally know several people who have Nvidia GPUs and have no clue what either RT or DLSS is. The vast majority of users are completely clueless about these things; they look at the charts but rarely have any idea what they even mean.

Some of them might spot the option in the menus: they enable it, see that performance falls off a cliff with hardly any visual improvement, and never touch it again.
Ask them why they chose Nvidia. See how many of them will reply with "A friend told me to avoid AMD", or "Someone told me they are faster in modern games", and so on.
I doubt there will be many replying "It was cheaper".

The vast majority of gamers buy consoles, use mobile, buy pre-built PCs, use laptop integrated graphics, etc.
Nvidia. 90% market share.
RTX 3050 outselling RX 6600.

Look, both of you, and anyone else reading this thread: I am an AMD fan, but then I see people buying the RTX 3050 over the RX 6600 and Nvidia grabbing 90% of the market.

Now ask yourselves, not me, yourselves: if people don't care, why is Nvidia selling everything at any price?

I don't think it's just brand recognition. I can understand it when a company is at 60% and the other at 40%, or even 70% vs 30%. But 90% vs 8%?
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Ask them why they chose Nvidia. See how many of them will reply with "A friend told me to avoid AMD", or "Someone told me they are faster in modern games", and so on.
I doubt there will be many replying "It was cheaper".
Exactly, so it has nothing to do with RT performance.
 
Joined
Feb 11, 2009
Messages
5,422 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
RDNA3's problem is ray tracing performance, and I doubt extra cache will help there.

And just to be clear, my point of view is that it doesn't matter that rasterization is still king. Ray tracing is what helps (if performance is great), or hurts (if it is not), in marketing those chips TODAY.

Well, at the moment it does not seem to suffer in RT performance per se; it seems to suffer with Nvidia's take on RT implementation (which, I know... shocking).
RDNA3 seems pretty darn close in Unreal Engine 5's RT with the Fortnite patch (according to the Hardware Unboxed video)
 
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Exactly, so it has nothing to do with RT performance.
That "Someone told me it's faster in modern games" was meant to mean RT, because AMD cards are usually faster than Nvidia at many price points in rasterization. But people buy Nvidia. Why?
I have posted something like this in the past.

What would someone choose?
A card that produces 200 fps in raster and 50 in ray tracing?
Or a card that produces 150 fps in raster and 75 in ray tracing?

I could have gone with the second option.
 
Joined
Jul 15, 2020
Messages
982 (0.69/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.025mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F26 with "Instant 6 GHz" on
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 85%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
So I guess those 3D V-Cache variants, if they exist, will be the better choice for gaming, if that is what you do with your GPU. Just like their 3D V-Cache CPU siblings.
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
That "Someone told me it's faster in modern games" was meant to mean RT.
But it isn't, because in the end they never actually check whether that's true; it's just random crap people say.

What would someone choose?
A card that produces 200 fps in raster and 50 in ray tracing?
Or a card that produces 150 fps in raster and 75 in ray tracing?
You're assuming an average consumer would even know or care enough to look into it at that level, which they don't. You may base your choices on those metrics, but most don't; as I said, a lot of them don't even know what those things are, so how would they ever even check what RT performance is?

The main metric by which people choose what to buy is actually price. I recently saw a news article saying the RX 7900 XT and 4070 Ti are the best sellers; why do you think that is? It's because they're the cheapest new GPUs, that's it.

Nvidia has a huge advantage in the mobile space: practically every laptop that ships with a dedicated GPU uses Nvidia, so there people don't even have a choice, they have to buy Nvidia. And people buy a ton of laptops; I don't know if there are any market-share statistics for that, but it wouldn't surprise me if a massive chunk of Nvidia's market share comes from mobile parts.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
In case you forgot, AMD is still on top in terms of performance/price and Nvidia at the bottom, so this is complete nonsense.



They have HIP and OpenCL; HIP is not supported on Windows, but I doubt this is relevant to the vast majority of users.

The problem is that the RTX 4070 Ti performs notoriously poorly at 4K due to its narrow memory bus and relatively low memory capacity. If you use a more reasonable 1440p target for that segment, the story changes as performance per dollar breaks even:



It will pull ahead at 1080p as well:



Yes... the GPU market is so bad I just had to bring up that this filth they call the 4070 Ti actually seems to offer "good value for money", and it's ridiculous - this is a low-end card they are selling in the premium segment, it's just... man, what a time...
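For scale on the bus-width point above, a quick peak-bandwidth calculation (spec figures quoted from memory, so treat them as approximate):

Code:
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps).
# Bus widths and data rates below are quoted from memory; double-check them.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 4070 Ti (192-bit, 21 Gbps)": (192, 21),
    "RTX 3080   (320-bit, 19 Gbps)": (320, 19),
    "RX 7900 XT (320-bit, 20 Gbps)": (320, 20),
}
for name, (bits, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bits, rate):.0f} GB/s")

The 4070 Ti ends up at roughly 500 GB/s against roughly 760-800 GB/s for the other two, which is a big part of why it falls off at 4K.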

The other issue, which is the compute problem, is a long-standing one, though. No one sane uses OpenCL (in fact it has been deprecated by Apple since 2018, and they've never been one to shy away from pulling the plug on technologies they don't believe in). Both AMD and Intel will need to develop and offer a comparable GPU C compiler if they want any chance of challenging CUDA's almost two-decade-long reign. Intel might be in the best position for this: their software R&D capabilities are vast and they can pour resources into it... but they need a viable product first, and I have a feeling that is what Intel's graphics division is currently busy with.

I must say I'm fascinated by the idea that integrating 3D V-Cache onto every product will radically alter their characteristics to the point it could be considered a holy band-aid for every woe that would affect one of AMD's compute architectures... I'm not sure I believe in outmuscling existing problems, not against a competitor such as NVIDIA.
 
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
You're assuming an average consumer
I am assuming, you are assuming. You are ALSO ASSUMING. Your whole post is like "I KNOW WHAT PEOPLE WANT AND HOW THEY CHOOSE". Well, no, you don't.

In the end, I repeat.

RTX 3050 outsells RX 6600.
Nvidia at 90% market share.

You're all avoiding the huge elephant in the room. Is this just "brand recognition"? I don't think so. If it were brand recognition alone, Ryzen would have failed in CPUs, because AMD's brand recognition back then was of the "they made Bulldozer, avoid at any cost" kind. But they offered something unbeatable at the time: 8-core CPUs that could also perform. They were still far behind in single-thread, but in multi-thread they were killing it.

GPUs? NOT beating current gen in raster, offering last-gen RT, and also having to suffer that "crap drivers" reputation.
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
The problem is that the RTX 4070 Ti performs notoriously poorly at 4K due to its narrow memory bus and relatively low memory capacity. If you use a more reasonable 1440p target for that segment
Buying close-to-$1,000 GPUs for 1080p and even 1440p is comical though.

The other issue, which is the compute problem, is a long-standing one, though. No one sane uses OpenCL (in fact it has been deprecated by Apple since 2018, and they've never been one to shy away from pulling the plug on technologies they don't believe in). Both AMD and Intel will need to develop and offer a comparable GPU C compiler if they want any chance of challenging CUDA's almost two-decade-long reign. Intel might be in the best position for this: their software R&D capabilities are vast and they can pour resources into it... but they need a viable product first, and I have a feeling that is what Intel's graphics division is currently busy with.

As I already said, this is completely irrelevant for the vast, overwhelming majority of consumers. AMD has also moved from OpenCL to HIP, which can actually be compiled for both AMD and Nvidia.

Intel is absolutely not in the best position to compete with CUDA; they've been completely out of the loop on the GPGPU front, and to this day they don't have any GPU compute products in the hands of their customers, to my knowledge. And this is with their vast resources; they've simply failed to deliver anything.

RTX 3050 outsells RX 6600.
And you think it's because of RT performance? Both of those cards are worthless for RT; actually, the RX 6600 is sometimes faster. Your theory falls apart.

NOT beating current gen in raster
What does that even mean? The 7900 XTX is faster than the 4080 in raster and cheaper. Let's assume AMD had some 600 mm² RX 7950 XTX XXX that was faster than the 4090 in raster, which was entirely feasible; they could have done that if they wanted to. Why would that have made any difference? Those GPUs account for a minuscule portion of the market.

having to suffer that "crap drivers" reputation.
And what do you want them to do about that? Their drivers work fine; that "reputation" mostly comes from trolls who have never owned AMD cards and talk utter nonsense.

Ryzen would have failed in CPUs.
They did kind of fail for a while; it took a long time for them to gain significant market share and for people to realize that a lot of Intel's products were asinine, and that it was not a good idea to pay hundreds of dollars for overclocked quad cores every generation and have to change the platform on top of that.
 
Last edited:
Joined
Sep 6, 2013
Messages
3,054 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
And you think it's because of RT performance? Both of those cards are worthless for RT; actually, the RX 6600 is sometimes faster. Your theory falls apart.
My theory doesn't fall apart because you say so. Especially when you base that conclusion on nothing.

People buying RTX 3050 over RX 6600 do it because Nvidia is winning the charts and its cards are considered to offer the best performance in everything, especially the feature most talked about over the last 2-3 years at least: ray tracing. It has been common knowledge practically forever, and not just my opinion, that the best high-end card also sells the low-end cards. So people go and buy an RTX 3050 because Nvidia offers the fastest cards and also because its cards are considered the fastest in ray tracing. That's what they heard or were told. So people who don't know about hardware will buy the Nvidia card because "Nvidia is faster". The RX 6600 is the better card and at a lower price point, but people either choose Nvidia or avoid AMD.

Buying close-to-$1,000 GPUs for 1080p and even 1440p is comical though.
Buying close to $1000 and ignoring RT performance could also be described as comical.

But yeah, the RTX 4070 Ti does have, as usual for this category of Nvidia models, just enough memory capacity to perform and sell today but fall apart tomorrow.
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
People buying RTX 3050 over RX 6600 do it because Nvidia is winning the charts
Which charts? Like I said, the 6600 is faster.

that the best high-end card also sells the low-end cards.
So basically your theory is that if AMD has some hypothetical top performer in literally anything, then they could sell every heaping pile of crap they come up with, because people will just buy it?

Why would that be advantageous to you, the consumer?

Buying close to $1000 and ignoring RT performance could also be described as comical.
Why wouldn't you ignore it? Cards at that price point work fine in RT at 1080p and even 1440p, but at 4K RT performance is crap on everything, and you'd be an idiot, in my opinion, to buy any card for that purpose in particular, because if games are barely playable at that resolution right now, they're going to run like absolute crap in a few years.
 