
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

Joined
Jun 5, 2018
Messages
216 (0.10/day)
...AMD announced a DLSS 3.0 competitor some time ago during the launch of RDNA3 GPUs. Technically you are correct, AMD won't have a DLSS 3.0 competitor with RDNA4, but that's because they will already have it under RDNA3.

Ok, they did not claim anywhere that it includes frame generation; instead they used the term "fluid motion frames", but sure, let's assume so. But then when? All they claimed is 2023, and it is still nowhere to be found, and we are approaching March. Maybe by the time this comes out and actually works, RDNA4 will already be launched.
 
Joined
Nov 11, 2016
Messages
3,133 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
I'd agree generally. AI is taking off, but very little is turnkey through a simple exe. It's usually this whole environment you have to set up. Nvidia has had Tensor cores since 2018 in consumer chips, yet no AI in games. Then some stuff that may use AI cores, or could use AI cores, runs just fine without them. Nvidia has had RTX Voice, which is awesome. But apps do a fine job without AI cores. I have voice.ai installed and it uses no more than 3% of my CPU. We have so much CPU overhead, and we keep getting more and more cores that already go underutilized. For games, Nvidia has DLSS, but the competitors are still pretty dang good.

With RDNA3 we are seeing AI accelerators that will largely go unused, especially for gaming, until FSR3 comes out. Zen 5 will introduce AI accelerators, and we already have that laptop Zen that has XDNA, on top of all the CPU cycles that go unused.

It's coming, but I think it's overrated in the consumer space atm. It's very niche to need those Tensor cores on a gaming GPU. On the business side, AMD has had CDNA with AI. What is really limiting is consumer software and strong AI environments on the AMD side. For gaming I'm more excited for ray tracing and would rather that be the focus. RT is newer and needs that dedicated hardware. But generally, we are still lacking in how much hardware we are getting to accelerate that RT performance, even from Nvidia. If, for example, Nvidia removed all that Tensor hardware, replaced it with RT and just used FSR or similar, that would be mouth-watering performance.

As for AMD's argument, if they made up for it in rasterization and ray-tracing performance, that would make sense. But they can't even do that. Seems more like AMD just generally lacks resources.

Yup, AMD still loses in rasterization even when they throw everything plus the kitchen sink at it; it's quite pathetic.

And it's not like AMD is doing more with less; they are doing less with more. The 7900XTX with a 384-bit bus + 24GB VRAM barely beats the 256-bit 4080 by a hair in raster and loses in everything else ;). The BOM on the 7900XTX is definitely higher than that of the 4080, and the only way for AIBs to earn any profit is selling the 7900XTX at ~1100usd, which makes it a worse choice than the 1200usd 4080.

Everyone and their mother should realize by now Nvidia is just letting RTG survive enough to keep the pseudo duopoly going.
 
Joined
Oct 4, 2017
Messages
696 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
For the love of god, people, read the short article. AMD is not giving up on AI.

Nobody claimed AMD was giving up on AI; the claim was that they are already behind the competition on that front, and things aren't going to get any better for them, since they seem to have dropped the ball on the idea of competing head to head with Nvidia.

"Wang said that with the company introducing AI acceleration hardware with its RDNA3 architecture, he hopes that AI is leveraged in improving gameplay—such as procedural world generation, NPCs, bot AI, etc; to add the next level of complexity; rather than spending the hardware resources on image-processing."

They're just focusing on what will have the biggest impact for gaming on their gaming GPUs.

Do you know why this speech sounds hollow? Because it's based on thin air!

AMD pulls the old switcheroo and claims they have implemented AI acceleration hardware in RDNA3 for what? Features that may or may not be a thing by the time RDNA3 goes EOL? When Nvidia implemented AI acceleration hardware in Turing, they also immediately put games that would leverage said hardware on the table; they didn't wait for it to happen.

Yet somehow you are falling for it... well, the majority of the market isn't.
 
Joined
Jan 14, 2019
Messages
10,057 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I think that depends on the screen quality and size. I would not suggest getting a 32-inch if you are planning to play at 1080p at some point in the future and are not going to upgrade your GPU.
Agreed, although I'm not getting anything. I'm OK with 24" 1080p. My desk isn't that big anyway. :)

Of course it doesn't increase IQ in all games, but it doesn't really decrease it either. I cannot for the life of me tell the difference, not even with screenshots next to each other. Even Balanced looks great on static screens, but then it loses in motion where you can see some minor artifacts, but DLSS Quality is amazing.
Considering that you're talking about higher resolutions, I believe you. It's a different story with 1080p, that's all I'm saying.
 
Joined
May 31, 2016
Messages
4,331 (1.49/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Agreed, although I'm not getting anything. I'm OK with 24" 1080p. My desk isn't that big anyway. :)
Damn, I wish I was in your shoes now. I'd really want to go through the amazement when I get to switch to 4K and get my mind blown. Now it is the other way around: when I switch to 1080p I still get my mind blown, but in a different way. It is your choice; I'm sure 1440p and 4K will knock on your door at some point. What I can tell you is, when you switch, you will not regret it. Maybe it is not your time yet.
 
Joined
Jul 13, 2016
Messages
2,890 (1.01/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Ok, they did not claim anywhere that it includes frame generation; instead they used the term "fluid motion frames", but sure, let's assume so. But then when? All they claimed is 2023, and it is still nowhere to be found, and we are approaching March. Maybe by the time this comes out and actually works, RDNA4 will already be launched.

Remember that when you add frames to something, it makes it look more fluid. That's ultimately the goal of DLSS 3.0: making the game look fluid. Not sure how AMD is going to approach it, but frame interpolation (which is what Nvidia is doing) doesn't require AI to work.

IMO there's no huge rush to release it because, as tech outlets like HWUB have reported, DLSS 3.0 is only useful in very specific scenarios due to its drawbacks. The introduced latency will always be a problem for DLSS 3.0 unless Nvidia fundamentally changes the technology. Unless Nvidia starts generating the next frame instead of inserting a frame between the current one and the last, DLSS 3.0 frame insertion will always be niche. The end of 2023 is still a year before RDNA4's likely launch, mind you.
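To make the latency point concrete, here's a toy Python timeline of interpolation-style frame generation (my own simplified numbers and scheduling, not how DLSS 3 actually paces frames): the in-between frame can only be built once the next real frame exists, so every real frame ends up displayed roughly half a frame-time later than it otherwise would be.

```python
# Toy sketch (assumed numbers, not NVIDIA's pipeline): why interpolating
# between the previous and the current frame adds latency.

def present_times(frame_ms, n_frames, interpolate):
    """Return a list of (display_time_ms, label) for a simple frame timeline."""
    timeline = []
    for n in range(1, n_frames + 1):
        ready = n * frame_ms  # real frame n finishes rendering here
        if interpolate and n >= 2:
            # The in-between frame needs both frame n-1 and frame n,
            # so it can only be shown once frame n already exists...
            timeline.append((ready, f"generated {n - 1}.5"))
            # ...and real frame n is held back ~half a frame-time to keep
            # pacing even. That hold-back is the added input latency.
            timeline.append((ready + frame_ms / 2, f"real {n}"))
        else:
            timeline.append((ready, f"real {n}"))
    return timeline

# 60 fps native (16.7 ms per frame): frame 2 shows at ~33.3 ms natively,
# but at ~41.7 ms with interpolation, despite the doubled frame count.
print(present_times(16.7, 3, interpolate=False))
print(present_times(16.7, 3, interpolate=True))
```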
 
Joined
Dec 28, 2012
Messages
3,515 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
Nobody claimed AMD was giving up on AI; the claim was that they are already behind the competition on that front, and things aren't going to get any better for them, since they seem to have dropped the ball on the idea of competing head to head with Nvidia.



Do you know why this speech sounds hollow? Because it's based on thin air!

AMD pulls the old switcheroo and claims they have implemented AI acceleration hardware in RDNA3 for what? Features that may or may not be a thing by the time RDNA3 goes EOL? When Nvidia implemented AI acceleration hardware in Turing, they also immediately put games that would leverage said hardware on the table; they didn't wait for it to happen.

Yet somehow you are falling for it... well, the majority of the market isn't.
It sure seems to rhyme with AMD dropping out of the high-end CPU market way back yonder.

"CEO Rory Read has made the comment that AMD will no longer compete head to head with Intel in the CPU market"

Ahh, dark days.
 
Joined
Jan 14, 2019
Messages
10,057 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Damn, I wish I was in your shoes now. I'd really want to go through the amazement when I get to switch to 4K and get my mind blown. Now it is the other way around: when I switch to 1080p I still get my mind blown, but in a different way. It is your choice; I'm sure 1440p and 4K will knock on your door at some point. What I can tell you is, when you switch, you will not regret it. Maybe it is not your time yet.
My time will come when I can get such a monitor for under £200 and a graphics card to drive it without upscaling for under £400. Maybe. :) My current system really stretches the limit of how much I'm willing to spend on my parts.
 
Joined
Nov 14, 2021
Messages
105 (0.11/day)
I just thought of this. Phoenix is Zen 4 with integrated RDNA3, which comes with 2 AI accelerators per CU, so up to 24 of them. Then it also comes with the XDNA AI engine.

I wonder how that is going to be handled. Will they have similar or different functionality? Can they help each other out? Is the OS going to have to decide which one to use? Can end users pick which ones to use? Just wait, we are going to be blown up the rear end with AI cores. Zen 5 is going to have an XDNA AI engine, it will also come as an APU with AI accelerators in the RDNA part, and then those with discrete cards will have AI cores there too.
 
Joined
Feb 8, 2017
Messages
199 (0.07/day)
I'd rather AMD invest all that die space into more shaders and clock speed and make GPUs faster. Ngreedia is only doing these things to check boxes on their advertising; these are all worthless "features" that wouldn't be needed if we actually had faster GPUs that could provide more FPS without the need to cheat!
 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I'd rather AMD invest all that die space into more shaders and clock speed and make GPUs faster. Ngreedia is only doing these things to check boxes on their advertising; these are all worthless "features" that wouldn't be needed if we actually had faster GPUs that could provide more FPS without the need to cheat!
Well honestly, raster graphics are simply done for the most part. We're now mulling over whether subpixels are the right color at the right time, go figure.

That's what they're using 'AI' for, after all. It's hilarious in all of its sadness; the low-hanging fruit is long gone and graphical improvements have hit diminishing returns big time. Expensive post-processing is expensive, always has been, and RT then sells us the idea that brute-forcing the whole scene's lighting is a great step forward. Again, it's hilariously stupid and sad if you think about it. It is desperate commerce looking for desperate measures to keep any semblance of progress in the GPU space, to keep selling products to us.

You can still put RT and non-RT scenes side by side and be challenged to spot a difference. The vast majority of the lighting is still rasterized/pre-cooked, and the moment it's not, the performance nosedives. If we RT a full scene, you're down to unplayable FPS on the fastest GPU on the planet right now (Portal), and again, struggling to see the point / what's gained in actual graphics fidelity.

Fact is, some shit's just done at some point.
 
Joined
Jul 9, 2015
Messages
3,413 (1.05/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
"AI" talk in terms of GPU is a very straightforward number cruncher. You have X mm2 of silicon, you make use of it.

Literally a comment by the Leather Man himself, when Frau Su rolled out some "AI" crap GPU that rivaled his.

This is why AMD sits at an all time low of 10% dGPU market share

Yeah. I mean, latest quarter earnings:

1.6 billion made by AMD GPU + Console business in one quarter.
1.57 billion made by NV (a sharp drop from earlier years mind you).

"But that marketing company told me so".

You can still put RT and non RT scenes side by side and be challenged to spot a difference.
I can spot the difference 100% of the time if the FPS counter is on, though. :D

includes frame generation
Something my 5+ year old TV has been doing.

Yes, it also adds lag, naturally.
 
Joined
Oct 4, 2017
Messages
696 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
Yeah. I mean, latest quarter earnings:

1.6 billion made by AMD GPU + Console business in one quarter.
1.57 billion made by NV (a sharp drop from earlier years mind you).

"But that marketing company told me so".

Guy throws consoles into the mix when the topic is dGPUs; might as well throw in cellphones while you are at it :roll::roll::roll:. According to your logic, Intel dominates the dGPU market :roll:.

[Attached screenshot: dGPU market share figures]


Even in a declining market, Nvidia managed to grind out more market share over AMD, which tells you how well AMD's plans are going... unless of course you are trying to tell me that AMD plans to abandon the dGPU market and focus only on making iGPUs and SoCs for consoles :roll:.
 
Joined
Sep 17, 2014
Messages
21,080 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Guy throws consoles into the mix when the topic is dGPUs; might as well throw in cellphones while you are at it :roll::roll::roll:. According to your logic, Intel dominates the dGPU market :roll:.


Even in a declining market, Nvidia managed to grind out more market share over AMD, which tells you how well AMD's plans are going... unless of course you are trying to tell me that AMD plans to abandon the dGPU market and focus only on making iGPUs and SoCs for consoles :roll:.
News flash: the majority of console games are also key drivers for the PC platform. It's only been like that for 20-odd years; I know it's hard to stay up to date.
 
Joined
Mar 7, 2010
Messages
957 (0.18/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144htz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
I'm sure glad the first comments are always by people who comprehend business and economics; I'm sure they're running multi-billion businesses that are responsible for employing thousands of people. :rolleyes:

I think it's a good strategy. Nvidia keeps making the wrong hard bets:
  • Lowering all cards below the 4090 by two tiers and pretending that crypto-hashing is still a thing, to keep jacking up their prices.
  • Maxing out L2 cache from 6MB to 72MB to make their cards memory-dependent instead of more efficient.
  • Not going after modularization as aggressively as AMD, so they're forcing higher prices for relatively close to the same performance.
  • Dedicating silicon to very specific kinds of AI stuff that doesn't inherently go towards gaming performance.
  • Making all of their implementations work only on their cards.
  • But hey, even if the 4030 is mislabeled as a "4060", it's nice to see an xx30-class card come out with 8GB of video RAM!
  • Making their customers think that their $400 card, which is going to be 30% slower than AMD's equivalent, is better because the $2,000+ card they can't afford is faster than AMD's fastest.
Yeah, a lot of people fall for the last one, but there are lots of people who monkey-see, monkey-do and make zero effort to do any research.

AMD, on the other hand, makes reasonable products that perform well, though they just absolutely suck at marketing. I've never once seen their marketing department make a point about their products, other than the cost/value ratio, that covers any of the reasons I buy anything technology-related. That being said:
  • Their technologies work on both AMD and Nvidia cards (probably Intel's too), so game developers have a much better incentive to implement something that will work on more cards rather than fewer.
  • AI hardware isn't supposed to be complex; it's supposed to be generic and numerous, just like the calculations themselves.
THIS!
 
Joined
Oct 15, 2010
Messages
951 (0.19/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Well, it doesn't really matter whether you think it's worth it or not. Why is it okay for the 7900 XT not only to be 15% more expensive at launch and consume 30% more power, but also to get its ass handed to it in RT?
Why is it okay for the 4080 to be 71% more expensive than the 3080 while only being 50% faster?
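Put in perf-per-dollar terms, taking the post's own 71% / 50% figures at face value (a rough sketch, not measured data), the 4080 works out to noticeably worse value than the 3080:

```python
# Perf per dollar from the numbers quoted above (71% higher price, 50% faster).
price_ratio = 1.71   # 4080 launch price relative to 3080 (poster's figure)
perf_ratio = 1.50    # 4080 performance relative to 3080 (poster's figure)
value_ratio = perf_ratio / price_ratio
print(f"Perf per dollar vs. 3080: {value_ratio:.2f}x "
      f"(~{(1 - value_ratio) * 100:.0f}% worse value)")
# -> roughly 0.88x, i.e. about 12% less performance per dollar than the 3080
```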
 
Joined
Apr 8, 2008
Messages
329 (0.06/day)
I think AI, ML and some other HW acceleration are crucial for the consumer these days, and even more so in the future, and not just for gaming; consumers don't just game on their Radeon and GeForce cards. That's why most content creators use NVIDIA nowadays: the non-gaming features of the GPU are outstanding and influential on NVIDIA's side, and they keep adding more features like RTX acceleration for prosumers (the OptiX engine for 3D rendering).

I don't mean an overpowered implementation, like 2x over NV, but something competitive and useful that doesn't affect the other business they have.

And if this is their idea about AI, why did they add AI to their Phoenix APUs?
 
Joined
Jul 9, 2015
Messages
3,413 (1.05/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Guy throws consoles into the mix when the topic is dGPU
Yeah. Why would consoles matter, huh? :D

Oh, wait, AMD reported console APU + GPU sales combined. THAT IS WHY.

Oh. But that's hard to follow, isn't it?


Around 7 million console APUs sold by AMD in the same period. Reported 1.6 billion in revenue. (NV reported 1.57 billion from its GPU business.)

So the question, which apparently needs a rocket scientist, judging by the comments in this thread: how much of that 1.6 billion chunk is console APUs?

How much could AMD realistically charge for a bare APU chip, if consoles start at $399 with a controller, SSD and whatnot?
Say $100-150.

That gives us between
1600 − 7×100 = 900 million
and
1600 − 7×150 = 550 million
of GPU revenue on AMD's side.

For "12% of the market share" to be true, AMD would need to make less than 1.57 billion (NV revenue) × 12/84 ≈ 220 million on GPUs.
Console APUs would then have to cost, on average, (1600 − 220) / 7 ≈ $197.

No way in hell is AMD getting half of the PS5 console price for its chip alone.
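For anyone who wants to re-run that back-of-the-envelope math, here's a quick Python version. Every input in it is the post's own assumption (7 million consoles, $100-150 per APU, the 84%/12% unit-share split), not a reported figure:

```python
# Re-running the estimate above. All inputs are the poster's assumptions.
amd_gaming_rev_m = 1600   # AMD gaming segment revenue, $M (dGPUs + console APUs)
nv_gaming_rev_m = 1570    # NVIDIA gaming revenue, $M
consoles_m = 7            # assumed console APUs shipped in the quarter, millions

for apu_price in (100, 150):
    gpu_rev = amd_gaming_rev_m - consoles_m * apu_price
    print(f"APU at ${apu_price}: implied AMD dGPU revenue ~ ${gpu_rev} M")

# If the 84%/12% unit-share split also held for revenue:
implied_gpu_rev = nv_gaming_rev_m * 12 / 84
implied_apu_price = (amd_gaming_rev_m - implied_gpu_rev) / consoles_m
print(f"12% share would imply ~ ${implied_gpu_rev:.0f} M in dGPU revenue, "
      f"i.e. ~ ${implied_apu_price:.0f} per console APU")
```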

getting its ass handed to it in RT?
This is how we refer to a 20% more expensive card being 16% faster at RT these days. :roll:

If you wonder what RT is: the most impactful aspect of this feature is "bring down my FPS by 40-50%".
It was promised that "new GPUs" wouldn't be affected, because they have "more RT thingies".
But for some reason it didn't happen, even in the 3rd generation of RT cards.
As if the "RT thingies" were still largely utilizing good old raster thingies... :D
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Yup, AMD still loses in rasterization even when they throw everything plus the kitchen sink at it; it's quite pathetic.

And it's not like AMD is doing more with less; they are doing less with more. The 7900XTX with a 384-bit bus + 24GB VRAM barely beats the 256-bit 4080 by a hair in raster and loses in everything else ;). The BOM on the 7900XTX is definitely higher than that of the 4080, and the only way for AIBs to earn any profit is selling the 7900XTX at ~1100usd, which makes it a worse choice than the 1200usd 4080.

Everyone and their mother should realize by now Nvidia is just letting RTG survive enough to keep the pseudo duopoly going.

The 7900 XTX beats or performs on par with the 4090 in plenty of games, while the 4090 costs 60% more. The 4090 is terrible value.
The 7900 XTX even has higher minimum fps than the 4090 at Ultra settings 4K in Hogwarts Legacy: https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/

AMD does less with more? Hahah. Nvidia needed a node advantage and GDDR6X. You compare bus width and memory size but don't talk about GDDR6 vs GDDR6X and the manufacturing process... Once again, clueless. No wonder Nvidia is already prepping the 4090 Ti and 4080 Ti :laugh: RDNA3 is getting faster and faster with every driver, and OC headroom on the 7900XTX is like 10-15%, meanwhile you can get 2-5% tops on Nvidia, because they already maxed them out and don't allow proper OC.

Most gamers don't give a F about ray tracing. It's a gimmick. No Nvidia card will do heavy ray tracing at high res without using upscaling anyway. Maybe the 5000 series will be able to; the 2000, 3000 and 4000 series are too slow for proper RT unless you accept much lower fps with huge dips. Even the 4090 is slow as hell with RT on high.
 
Joined
Nov 11, 2016
Messages
3,133 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
The 7900 XTX beats or performs on par with the 4090 in plenty of games, while the 4090 costs 60% more. The 4090 is terrible value.
The 7900 XTX even has higher minimum fps than the 4090 at Ultra settings 4K in Hogwarts Legacy: https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/

AMD does less with more? Hahah. Nvidia needed a node advantage and GDDR6X. You compare bus width and memory size but don't talk about GDDR6 vs GDDR6X and the manufacturing process... Once again, clueless. No wonder Nvidia is already prepping the 4090 Ti and 4080 Ti :laugh: RDNA3 is getting faster and faster with every driver, and OC headroom on the 7900XTX is like 10-15%, meanwhile you can get 2-5% tops on Nvidia, because they already maxed them out and don't allow proper OC.

Most gamers don't give a F about ray tracing. It's a gimmick. No Nvidia card will do heavy ray tracing at high res without using upscaling anyway. Maybe the 5000 series will be able to; the 2000, 3000 and 4000 series are too slow for proper RT unless you accept much lower fps with huge dips. Even the 4090 is slow as hell with RT on high.

Yes, Nvidia is prepping the 4090 Ti to beat RDNA4. What a dumpster fire RDNA3 is; the 4090 will be the next 1080 Ti, which took AMD 3.5 years to beat ;)
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Yes, Nvidia is prepping the 4090 Ti to beat RDNA4. What a dumpster fire RDNA3 is; the 4090 will be the next 1080 Ti, which took AMD 3.5 years to beat ;)
Considering 7900XTX is already close overall, I doubt it

Btw, the 7900XTX uses 50 fewer watts than the 4090 and performs on par with or beats it in plenty of games in raster, which is the only thing that matters to most gamers :roll: The 4090 is 60-75% more expensive as well, what a steal :laugh:

Nvidia is prepping 4090 Ti because AMD is prepping 7950XTX :toast:
 
Joined
Nov 11, 2016
Messages
3,133 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Considering 7900XTX is already close, I doubt it

Btw, the 7900XTX uses 50 fewer watts than the 4090 and performs on par with or beats it in plenty of games in raster, which is the only thing that matters to most gamers :roll: The 4090 is 60-75% more expensive as well, what a steal :laugh:

Plenty of gimmicky games that the majority of gamers don't care about anyway ;), but AMD is paying YTers to include those games in their benchmark suites, all right.

Oh well if you can't afford 4090, don't feel too bad ;)

Oh, and Nvidia is prepping Blackwell too, which could be twice as fast as the 4090 if Nvidia is serious; poor RDNA3/4 are 1-2 gens behind :/
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Plenty of gimmicky games that the majority of gamers don't care about anyway ;), but AMD is paying YTers to include those games in their benchmark suites, all right.

Oh well if you can't afford 4090, don't feel too bad ;)

Oh, and Nvidia is prepping Blackwell too, which could be twice as fast as the 4090 if Nvidia is serious; poor RDNA3/4 are 1-2 gens behind :/

I can easily afford it, but I am not stupid :laugh: When are you replacing that dated OLED? Can't afford better, or do you enjoy low nits and crappy HDR?

It's funny to see how butthurt you are about the 7900XTX getting closer and closer. You probably cleared out the bank to buy that entry-level 4090 :laugh: Probably why you reused an old HX850 :roll:

Glad I am not forced to use Windows 11 to make my CPU work right :laugh:

Yeah, a gimmick game that sells 12+ million copies in 2 weeks :laugh:

Yes, Nvidia is prepping the 4090 Ti to beat RDNA4. What a dumpster fire RDNA3 is; the 4090 will be the next 1080 Ti, which took AMD 3.5 years to beat ;)
Keep dreaming, the 1080 Ti was a $699 card, not $1599 and up like the 4090. The 1080 Ti had proper connectivity; the 4090 doesn't even have DP 2.1 in 2023 :roll:
The 7900 XTX is already close to the 4090 in tons of games and even beats it in some, for 600 dollars less and 50 watts less :laugh:

Nvidia also skimped on the VRAM on the 4090; the 4080 has higher-clocked memory.

4080 Ti and 4090 Ti = DP 2.1 + high-speed GDDR6X modules. Both will kill off the 4090 and make it EoL. Then a year after, the 5070 will come out and beat the 4090 :laugh: And by then, resale value will be sub-400 dollars :laugh:
Great buy :toast: But I guess you can afford it, that's why you bought the absolute cheapest 4090 and skimped on other parts :laugh: Remember to replace that dated PSU before it pops, you are using high-wattage parts after all :p
 
Joined
Nov 11, 2016
Messages
3,133 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Persistent troll baiting against AMD
Poor AMD, trying their hardest every single day to fix RDNA3 (with a ton of copium), meanwhile Nvidia is just chilling with their full AD102 chip selling for over 7k :rolleyes:.
Maybe AMD should just rename RTG to CTG (Console Technology Group)
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Persistent troll baiting against Nvidia
Poor AMD, trying their hardest every single day to fix RDNA3 (with a ton of copium), meanwhile Nvidia is just chilling with their full AD102 chip selling for over 7k :rolleyes:.
Maybe AMD should just rename RTG to CTG (Console Technology Group)
Truth hurts, I see :laugh: They milked you hard this time. Did your 4090 catch fire yet?
 