
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Can you point to a game that uses ecores together with pcores?
Yeah, plenty. Cyberpunk, Spider-Man Remastered, Spider-Man: Miles Morales.

What are YOU talking about, both CCDs have a CCX with 8C/16T with performance cores only. The 2nd CCD runs at slightly lower clockspeeds BUT USES THE SAME MICROARCHITECTURE and the cores are still FAST, with no risk of E cores complicating software compatibility. 7950X = 16 performance cores, 13900K = 8 performance cores.

Intel's E cores use a dated and MUCH LOWER CLOCKED microarchitecture. The cores are NOT FAST at all. Their primary goal is to fool consumers into thinking the chip has more cores than it actually has.
The i5-13600K is a "14 core chip" but only has 6 performance cores :roll: Intel STILL only offers 6-8 performance cores across the board on mainstream chips in the upper segment. The rest is useless e cores.

Ryzen 7000 chips with 3D cache will beat Intel in gaming anyway. Hell, even the $400 7800X3D will beat the 13900KS with its 6 GHz boost, twice if not triple the peak watt usage and an $800 price tag, and Intel will abandon the platform after 2 years as usual, meaning 14th gen will require a new socket and a new board. Milky milky time. Intel's architecture is inferior, which is why they need to run at high clockspeeds to be able to compete; SADLY for Intel this means high watt usage.

However, the i9-12900K/KS and i9-13900K/KS are pointless chips for gamers, since the i7 delivers the same gaming performance anyway, without the HUGE watt usage. Hell, even the i5s are within a few percent.
You said ecores are useless. I said they are as useless as the 2nd CCD. Can you actually give me a couple of applications where the 2nd CCD boosts performance but ecores don't? If not, then you HAVE to admit that ecores are as useless as the 2nd CCD.
 
Joined
May 31, 2016
Messages
4,331 (1.49/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yeah, plenty. Cyberpunk, Spider-Man Remastered, Spider-Man: Miles Morales.
Can you show me any proof that ecores are being used simultaneously with pcores by the games you have pointed out?
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Can you show me any proof that ecores are being used simultaneously by the games you have pointed out?
Sure, is a video showing each individual core's usage enough as proof?
 
Joined
May 31, 2016
Messages
4,331 (1.49/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Sure, is a video showing each individual core's usage enough as proof?
Can you tell whether it's the game that is using the ecores, or whether it's Windows 11 background processes whose usage is being shown? After you switch the ecores off (assuming they are being used by the game), does the performance drop? If it does drop, by how much?
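For what it's worth, you don't even need a BIOS toggle to run that test; restricting the game's CPU affinity to the P-core threads approximates "ecores off". A rough sketch with Python and psutil — the process name and the assumption that logical CPUs 0-15 are the P-core threads are mine, not anything from this thread, so verify the mapping on your own system, and note that some games or anti-cheat may ignore or reset affinity:

```python
# Hypothetical sketch: restrict a running game to the assumed P-core threads
# (logical CPUs 0-15 on a 13900K) to approximate "ecores off" without a reboot.
import psutil

GAME_EXE = "Cyberpunk2077.exe"    # assumed process name, adjust to the game being tested
P_CORE_THREADS = list(range(16))  # assumed P-core logical CPUs; verify the numbering first

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_THREADS)  # scheduler may now only place its threads on P-cores
        print(f"Pinned PID {proc.pid} to CPUs {P_CORE_THREADS}")
```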
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Can you tell whether it's the game that is using the ecores, or whether it's Windows 11 background processes whose usage is being shown? After you switch the ecores off (assuming they are being used by the game), does the performance drop? If it does drop, by how much?
Yes, the game is using the ecores; it's pretty obvious to tell: 16 ecores at 30 to 50%+ usage. And yes, in those specific games I mentioned, performance drops by around 15% with ecores off in CPU-demanding areas of the game (apartments, Tom's Diner, etc.). That's with the 13900k; I don't know the exact numbers with a 12900k, but since that's the CPU I'm using right now I can test it.
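If anyone wants to check that kind of per-core claim themselves rather than eyeball an overlay, logging the P-core and E-core groups separately is straightforward. A minimal sketch with Python and psutil, assuming the usual 13900K enumeration (logical CPUs 0-15 = P-core threads, 16-31 = E-cores; confirm the mapping on your own machine before trusting the split):

```python
# Minimal sketch: log average utilisation of the assumed P-core and E-core groups
# while the game is running in a CPU-heavy area. The CPU numbering is an assumption.
import psutil

P_THREADS = range(0, 16)   # assumed P-core hyperthreads on a 13900K
E_CORES   = range(16, 32)  # assumed E-cores on a 13900K

def sample(interval=1.0):
    """Return (P-core average %, E-core average %) over one sampling interval."""
    per_cpu = psutil.cpu_percent(interval=interval, percpu=True)
    p_avg = sum(per_cpu[i] for i in P_THREADS) / len(P_THREADS)
    e_avg = sum(per_cpu[i] for i in E_CORES) / len(E_CORES)
    return p_avg, e_avg

if __name__ == "__main__":
    # Sustained E-core load well above the few percent of background tasks
    # suggests the game's own threads are landing there.
    for _ in range(30):
        p, e = sample()
        print(f"P-cores: {p:5.1f}%   E-cores: {e:5.1f}%")
```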
 
Joined
Feb 22, 2022
Messages
99 (0.12/day)
System Name Lexx
Processor Threadripper 2950X
Motherboard Asus ROG Zenith Extreme
Cooling Custom Water
Memory 32/64GB Corsair 3200MHz
Video Card(s) Liquid Devil 6900XT
Storage 4TB Solid State PCI/NVME/M.2
Display(s) LG 34" Curved Ultrawide 160Hz
Case Thermaltake View T71
Audio Device(s) Onboard
Power Supply Corsair 1000W
Mouse Logitech G502
Keyboard Asus
VR HMD NA
Software Windows 10 Pro
Yeah, plenty. Cyberpunk, Spider-Man Remastered
Is that the same Spider-Man remaster that runs better on a 7900XT than on an overclocked 4070ti?
 
Joined
May 31, 2016
Messages
4,331 (1.49/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Yes, the game is using the ecores; it's pretty obvious to tell: 16 ecores at 30 to 50%+ usage. And yes, in those specific games I mentioned, performance drops by around 15% with ecores off in CPU-demanding areas of the game (apartments, Tom's Diner, etc.). That's with the 13900k; I don't know the exact numbers with a 12900k, but since that's the CPU I'm using right now I can test it.
Funny, you still haven't shown anything, just written up your theories. The 13900k and the 12900k will act exactly the same with ecore and pcore utilization, since Windows is scheduling those, not the CPU itself.
According to the 13900K benchmark on TPU with Cyberpunk, that is not correct at any resolution. Performance does not drop by 15% when disabling ecores, but rather by 1.8% at 1080p.
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Is that the same Spider-Man remaster that runs better on a 7900XT than on an overclocked 4070ti?
And the same Cyberpunk that runs better on a 3080 than on a 7900xt?
 
Joined
Jan 8, 2017
Messages
9,063 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
And the same Cyberpunk that runs better on a 3080 than on a 7900xt?


1 fps dude, it runs 1 fps faster with RT. 1 fps

Meanwhile it's like 40% faster with RT off, not even worth comparing the two.

"runs better on a 3080", the nonsense rubbish you say never ceases to amaze me, you might just take the cake for the worst fanboy I've seen on this site yet.
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Never said the 4090 is a low power card. I said it's incredibly efficient. Which it is.

Never said the 13900k is exceptional. Actually, I swapped back to my 12900k because I prefer it. What I did say is that ecores are not useless in gaming, since there are games that benefit a lot from them. Try to actually argue with what people are saying instead of constantly strawmanning.
You never said anything on topic so far.

Can this shitpostfest be locked yet? It's clear that some would rather argue about AMD vs Nvidia or CPUs.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Yeah, plenty. Cyberpunk, Spider-Man Remastered, Spider-Man: Miles Morales.


You said ecores are useless. I said they are as useless as the 2nd CCD. Can you actually give me a couple of applications where the 2nd CCD boosts performance but ecores don't? If not, then you HAVE to admit that ecores are as useless as the 2nd CCD.

Oh really? TechPowerUp's test showed 2% less performance in Cyberpunk with e cores off.

And if you actually run Windows 10 instead of 11, most games will perform like crap because there's no Thread Director support, which is essential so that e-cores are not used for stuff that actually matters.
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Funny, you still haven't shown anything, just written up your theories. The 13900k and the 12900k will act exactly the same with ecore and pcore utilization, since Windows is scheduling those, not the CPU itself.
According to the 13900K benchmark on TPU with Cyberpunk, that is not correct at any resolution. Performance does not drop by 15% when disabling ecores, but rather by 1.8% at 1080p.
I have videos on my channel showing ecore usage on a 13900k.

I'll show you when I'm home

View attachment 284827

1 fps dude, it runs 1 fps faster with RT. 1 fps

Meanwhile it's like 40% faster with RT off, not even worth comparing the two.

"runs better on a 3080", the nonsense rubbish you say never ceases to amaze me, you might just take the cake for the worst fanboy I've seen on this site yet.
So it runs slower than a 2.5 year old card. Splendid

Oh really? TechPowerUp's test showed 2% less performance in Cyberpunk with e cores off.

And if you actually run Windows 10 instead of 11, most games will perform like crap because there's no Thread Director support, which is essential so that e-cores are not used for stuff that actually matters.
I don't really care what tpup showed, I have the CPU and the game. If tpup doesn't test in CPU-demanding areas that need more than 8 cores, then obviously you won't see a difference.
 
Joined
Jan 8, 2017
Messages
9,063 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
So it runs slower than a 2.5 year old card. Splendid

Well, looks like you forgot this, a game where a 4070ti is slower than Nvidia's own 2.5-year-old previous generation. What should I make of this, our resident fanboy? I suppose that's splendid as well.

 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead
Everyone is entitled to their opinion. Do you really think 'you have me' now or something? Get a life.

W1z has already confirmed that, despite his general statements about the need for more VRAM (generally: core runs out as VRAM runs out), which I do agree with, the exceptions do make the rule, and we've already seen several examples appear in his own reviews where this had to be acknowledged. Similarly, W1z has also been seen saying the technologies in play here are progress, and that he likes to see it. But he has also been saying how abysmal the performance can get in certain games. It's a thing called nuance. You should try it someday.

And that's my general stance with regard to these new features too: the general movement is good. But paying through the nose for them today is just early adopting into stuff with an expiry date, and very little to show for it. Upscaling technologies are good. And they're much better if they are hardware agnostic.

Similarly, RT tech is good. And it's much better if it's hardware agnostic.
AMD is proving that the latter in fact 'just works' too.

And that is why Nvidia's approach is indeed a gimmick, where fools and money get parted. History repeats.
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Everyone is entitled to their opinion. Do you really think 'you have me' now or something? Get a life.

W1z has already confirmed that, despite his general statements about the need for more VRAM (generally: core runs out as VRAM runs out), which I do agree with, the exceptions do make the rule, and we've already seen several examples appear in his own reviews where this had to be acknowledged. Similarly, W1z has also been seen saying the technologies in play here are progress, and that he likes to see it. But he has also been saying how abysmal the performance can get in certain games. It's a thing called nuance. You should try it someday.

And that's my general stance with regard to these new features too: the general movement is good. But paying through the nose for them today is just early adopting into stuff with an expiry date, and very little to show for it. Upscaling technologies are good. And they're much better if they are hardware agnostic.

Similarly, RT tech is good. And it's much better if it's hardware agnostic.
AMD is proving that the latter in fact 'just works' too.

And that is why Nvidia's approach is indeed a gimmick, where fools and money get parted. History repeats.
We were talking about dlss / fsr, that's at least what the post I quoted was talking about. Those are definitely not gimmicks. Rt, sure, you can call it that, especially in certain games.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
I have videos on my channel showing ecore usage on a 13900k.

I'll show you when I'm home


So it runs slower than a 2.5 year old card. Splendid


I don't really care what tpup showed, I have the CPU and the game. If tpup doesn't test in CPU-demanding areas that need more than 8 cores, then obviously you won't see a difference.
I do, because I look at facts and proof. If you actually need more than 8 cores in a game, you are screwed anyway, because E cores are slow and run at ~4 GHz using a dated architecture.

i7-13700K has the same game performance as i9-13900K.

Even i5-13600K only performs 1% behind i9-13900K. For half the price.

Efficiency cores give you exactly nothing. Performance cores are what matters for gaming performance.

Ryzen 7800X3D will smack i9-13900K for half the price in a few weeks. Oh, and half the watt usage.

You can enable DLSS 3 to make fake frames tho, that will remove cpu bottleneck :roll:
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Well, looks like you forgot this, a game where a 4070ti is slower than Nvidia's own 2.5-year-old previous generation. What should I make of this, our resident fanboy? I suppose that's splendid as well.

View attachment 284833
The 4070ti is slower than a 2-year-old, much more expensive card. The XT is slower than a 2-year-old, cheaper card.

But it doesn't matter, it wasn't me who made the claim. Someone said "are we talking about the spiderman that the 7900xt is faster than a 4070ti" as if that means something. It doesn't. It's a pointless statement that doesn't seem to bother you. You seem particularly bothered when a specific company is losing. Someone would even call you biased

I do, because I look at facts and proof. If you actually need more than 8 cores in a game, you are screwed anyway, because E cores are slow and run at ~4 GHz using a dated architecture.

i7-13700K has the same game performance as i9-13900K.

Even i5-13600K only performs 1% behind i9-13900K. For half the price.

Efficiency cores give you exactly nothing. Performance cores are what matters for gaming performance.

Ryzen 7800X3D will smack i9-13900K for half the price in a few weeks. Oh, and half the watt usage.

You can enable DLSS 3 to make fake frames tho, that will remove cpu bottleneck :roll:
Well, obviously you don't look at facts and proof, because you don't have the actual CPU. I do, and I'm telling you that in areas that are CPU-demanding, ecores boost performance by a lot.

I'll make some videos with ecores off as well, since on my channel I only have ones with ecores on, and you'll see that there is a difference.
 
Joined
Jan 8, 2017
Messages
9,063 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Someone would even call you biased
Yeah sure because I am the one who says a card that runs 1 fps faster in RT and 40% slower in raster in a game "runs better". That's totally not a laughable statement and it doesn't sound like something someone who is biased would ever say.
 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
We were talking about dlss / fsr, that's at least what the post I quoted was talking about. Those are definitely not gimmicks. Rt, sure, you can call it that, especially in certain games.
Correct, and I mentioned them. But this topic is about 'the use of AI' for GPU acceleration in general, too - see title.

None of the technologies in play 'require AI' just because Nvidia said so, and the point isn't proven by Nvidia having a larger share of the market now, either. That just proves the marketing works - until a competitor shows a competitive product/a design win (like Zen!) and the world turns upside down. See, the truth isn't what the majority thinks it is. The truth is what reality dictates - a principle people seem to have forgotten in their online bubbles. And then they meet the real world, where real shit has real consequences. Such as the use of die space vs cost vs margins vs R&D budgets.
 
Joined
Jan 8, 2017
Messages
9,063 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
None of the technologies in play 'require AI' just because Nvidia said so
I still remember the first implementation of DLSS in Control, where Remedy said it didn't actually use tensor cores, proving they're completely redundant.
 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I still remember the first implementation of DLSS in Control, where Remedy said it didn't actually use tensor cores, proving they're completely redundant.
Yeah, I never quite understood what's 'AI' about upscaling anyway. You just create a set of rules to implement upscaling; there is no way you're passing through every scene in a game to determine what's what. People play the game with a flexible viewport; they don't rerun a benchmark.
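To make the 'set of rules' point concrete, here's a toy sketch of a purely hand-written spatial upscale in Python/numpy - no training, no tensor hardware, just bilinear resampling plus a cheap sharpen pass. It is emphatically not FSR or DLSS, just an illustration of what rule-based upscaling means:

```python
# Toy sketch of a rule-based (non-AI) spatial upscale: bilinear resampling
# followed by a simple unsharp-mask pass. Illustrative only, not FSR/DLSS.
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Upscale an HxWxC float image by an integer factor with bilinear filtering."""
    h, w, _ = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Cheap unsharp mask: boost the difference between the image and a box blur."""
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:] +
                        img[1:-1, 1:-1]) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

frame = np.random.rand(540, 960, 3)            # stand-in for a rendered 960x540 frame
upscaled = sharpen(bilinear_upscale(frame, 2)) # "rules only" upscaled output
```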

Nvidia is clearly charging ahead with their implementation and marketing because having to dial it back would:
A. destroy their dual-use strategy for datacenter and consumer GPUs
B. force them to revert to old technology sans special cores
C. Redesign the CUDA core to actually do more per clock, or somehow improve throughput further while carrying their old featureset

They realistically can't go back, so strategically, AMD's bet is a perfect one - note that I said this exact thing when they talked about their proprietary RTX cores shortly after it was initially announced. Also, the fact that Wang is saying now what he said years ago at around the same time... Time might be on either company's side, really. It's going to be exciting to see how this works out. Still though, the fact that AMD is still on the same trajectory is telling; it shows they have faith in the approach of doing more with less. Historically, doing more with less has always been the success formula for hardware - and it used to be the 'Nvidia thing'.
 
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Correct, and I mentioned them. But this topic is about 'the use of AI' for GPU acceleration in general, too - see title.

None of the technologies in play 'require AI' just because Nvidia said so, and the point isn't proven by Nvidia having a larger share of the market now, either. That just proves the marketing works - until a competitor shows a competitive product/a design win (like Zen!) and the world turns upside down. See, the truth isn't what the majority thinks it is. The truth is what reality dictates - a principle people seem to have forgotten in their online bubbles. And then they meet the real world, where real shit has real consequences. Such as the use of die space vs cost vs margins vs R&D budgets.
But you are not paying through the nose for them. That's just false. At launch prices, the 7900xt was 15% more expensive than the 4070 Ti, while being only 12% faster in raster, much slower in RT, with worse upscaling and worse power draw. So how exactly are you paying through the nose for them? The 4070 Ti in fact had better performance per dollar even on just raster. I'm sorry, but it seems to me you are paying through the nose for AMD.
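For what it's worth, the perf-per-dollar part of that claim is easy to sanity-check with the US launch MSRPs ($899 for the 7900 XT, $799 for the 4070 Ti). The ~12% raster figure below is the one quoted in this post, not measured data:

```python
# Worked perf-per-dollar example using launch MSRPs and the raster figure
# quoted in the post above (the 12% lead is the poster's claim, not a measurement).
msrp_7900xt, msrp_4070ti = 899, 799
raster_7900xt, raster_4070ti = 1.12, 1.00   # relative raster performance (assumed)

ppd_7900xt = raster_7900xt / msrp_7900xt * 1000
ppd_4070ti = raster_4070ti / msrp_4070ti * 1000

print(f"7900 XT : {ppd_7900xt:.2f} relative perf per $1000")   # ~1.25
print(f"4070 Ti : {ppd_4070ti:.2f} relative perf per $1000")   # ~1.25
# Essentially a wash on raster perf/$ at MSRP, before RT, upscaling or power draw.
```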
 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
But you are not paying through the nose for them. That's just false. At launch prices, the 7900xt was 15% more expensive than the 4070 Ti, while being only 12% faster in raster, much slower in RT, with worse upscaling and worse power draw. So how exactly are you paying through the nose for them? The 4070 Ti in fact had better performance per dollar even on just raster. I'm sorry, but it seems to me you are paying through the nose for AMD.
That's just false, you say, and yet, perf/dollar has barely moved forward since the first gen of RTX cards. You're living a nice alternate reality :)

Note my specs, and note how I'm not paying through the nose at any time ever - I still run a 1080 because every offer past it has been regression, not progress. You might not want to see it, but the fact is, the price to get an x80 GPU has more than doubled since then and you actually get less hardware for it. Such as lower VRAM relative to core.

I'm not even jumping on a 550-600 dollar RX 6800 (XT) because we're in 2023 now and this is the original MSRP of years back. That's paying too much for what it's going to do, even if it nearly doubles game performance relative to the old card.

There are a LOT of people in this dilemma right now. Every offer the market currently has is crappy in one way or another. If a deal is hard to swallow, it's a no deal in my world. Good deals feel like a win-win. There is no way any card in the new gen is a win-win right now.

Chasing the cutting edge has never been great, even when I did try doing so. I've learned I like my products & purchases solid and steady, so that I get what I pay for.

Hey, and don't take it from me, you don't have to:
 
Last edited:
Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
And that is why Nvidia's approach is indeed a gimmick, where fools and money get parted. History repeats.
This is what you said that I disagreed with. Of course prices are up, but they are up for both AMD and Nvidia cards. In fact, as is evident from the 4070ti launch price compared to the 7900xt's, you are not paying for those gimmicks; they come for free, since the 4070ti had the better value even for pure raster performance, completely excluding RT, DLSS and FG.

So who's the fool here? AMD buyers are paying more money for fewer features, higher power draw, worse RT and similar raster per dollar.
 
Joined
Nov 14, 2021
Messages
105 (0.11/day)
I'd agree generally. AI is taking off, but very little is turnkey, through a simple exe. It's usually this whole environment you have to set up. Nvidia has had Tensor cores since 2018 in consumer chips, yet no AI in games. Then there's some stuff that may use AI cores, or could use AI cores, but runs just fine without them. Nvidia has RTX Voice, which is awesome. But apps do a fine job without AI cores. I have voice.ai installed and it uses no more than 3% of my CPU. We have so much CPU overhead, and we keep getting more and more cores that already go underutilized. For games, Nvidia has DLSS, but the competitors are still pretty dang good.

With RDNA3 we are seeing AI accelerators that will largely go unused, especially for gaming until FSR3 comes out. Zen5 will introduce AI accelerators and we already have that laptop Zen that has XDNA. On top of all the CPU cycles that go unused.

It's coming, but I think it's overrated in the consumer space atm. It's very niche to need those Tensor cores on a gaming GPU. On the business side, AMD has had CDNA with AI. What is really limiting is consumer software and strong AI environments on the AMD side. For gaming I'm more excited for ray tracing and would rather that be the focus. RT is newer and needs that dedicated hardware. But generally, we are still lacking in how much hardware we are getting to accelerate RT performance, even from Nvidia. If, for example, Nvidia removed all that Tensor hardware, replaced it with RT hardware and just used FSR or similar, that would be mouth-watering performance.

For AMD's argument, if they made up for it in rasterization and ray-tracing performance, that would make sense. But they can't even do that. Seems more like AMD just generally lacks resources.
 