Friday, May 3rd 2024

AMD to Redesign Ray Tracing Hardware on RDNA 4

AMD's next-generation RDNA 4 graphics architecture is expected to feature a completely new ray tracing engine, claims Kepler_L2, a reliable source of GPU leaks. Currently, AMD uses a component called the Ray Accelerator, which performs the most compute-intensive portion of the ray intersection and testing pipeline, while the rest of AMD's hardware ray tracing approach still relies heavily on the shader engines. The company debuted the Ray Accelerator with RDNA 2, its first architecture to meet the DirectX 12 Ultimate spec, and improved the component with RDNA 3 by optimizing certain aspects of its ray testing, bringing about a 50% improvement in ray intersection performance over RDNA 2.

The way Kepler_L2 puts it, RDNA 4 will feature a fundamentally transformed ray tracing hardware solution compared to the ones on RDNA 2 and RDNA 3. This would likely delegate more of the ray tracing workload to fixed-function hardware, unburdening the shader engines further. AMD is expected to debut RDNA 4 with its next line of discrete Radeon RX GPUs in the second half of 2024. Given the chatter about a power-packed event by AMD at Computex, with the company expected to unveil the "Zen 5" CPU microarchitecture on both server and client processors, we might expect some talk about RDNA 4, too.
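To make the current division of labour concrete, here is a minimal conceptual C++ sketch of a hybrid ray tracing step using an ordinary stack-based BVH traversal. The structure and names (Node, hitBox, hitTri, trace) are illustrative assumptions, not AMD's actual hardware or driver code: the intersection math corresponds roughly to what the Ray Accelerator handles in fixed function today, while the traversal loop is the part that still runs on the shader engines and which a more fixed-function RDNA 4 design could absorb.

```cpp
// Conceptual sketch only: on RDNA 2/3 the traversal loop in trace() runs as
// ordinary shader code, while the box/triangle intersection math (hitBox/hitTri)
// is the portion handled by the fixed-function Ray Accelerator.
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 o, d; float t_max; };
struct Tri  { Vec3 v0, v1, v2; };
struct Hit  { float t = INFINITY; int prim = -1; };
struct Node {
    Vec3 bmin, bmax;   // axis-aligned bounding box
    int first = 0;     // child index (inner node) or first triangle (leaf)
    int count = 0;     // 0 = inner node with children at first and first+1
};

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// "Fixed-function" portion: pure intersection math (the Ray Accelerator's job).
static bool hitBox(const Ray& r, const Node& n) {
    float t0 = 0.0f, t1 = r.t_max;
    const float ro[3] = {r.o.x, r.o.y, r.o.z}, rd[3] = {r.d.x, r.d.y, r.d.z};
    const float lo[3] = {n.bmin.x, n.bmin.y, n.bmin.z}, hi[3] = {n.bmax.x, n.bmax.y, n.bmax.z};
    for (int a = 0; a < 3; ++a) {                    // slab test per axis
        float inv = 1.0f / rd[a];
        float ta = (lo[a] - ro[a]) * inv, tb = (hi[a] - ro[a]) * inv;
        if (ta > tb) std::swap(ta, tb);
        t0 = std::max(t0, ta);
        t1 = std::min(t1, tb);
        if (t0 > t1) return false;
    }
    return true;
}

static bool hitTri(const Ray& r, const Tri& tri, float& t) {  // Moeller-Trumbore
    Vec3 e1 = sub(tri.v1, tri.v0), e2 = sub(tri.v2, tri.v0);
    Vec3 p = cross(r.d, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;
    float inv = 1.0f / det;
    Vec3 s = sub(r.o, tri.v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.d, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;
    return t > 0.0f && t < r.t_max;
}

// Shader-managed portion: the stack-based BVH traversal that still occupies the
// compute units on RDNA 2/3, and the part a redesigned RT engine could offload.
static Hit trace(const Ray& r, const std::vector<Node>& nodes, const std::vector<Tri>& tris) {
    Hit best;
    int stack[64], sp = 0;
    stack[sp++] = 0;                                 // start at the root node
    while (sp > 0) {
        const Node& n = nodes[stack[--sp]];
        if (!hitBox(r, n)) continue;                 // box test (fixed function)
        if (n.count == 0) {                          // inner node: visit both children
            stack[sp++] = n.first;
            stack[sp++] = n.first + 1;
        } else {
            for (int i = 0; i < n.count; ++i) {      // leaf: triangle tests (fixed function)
                float t;
                if (hitTri(r, tris[n.first + i], t) && t < best.t)
                    best = {t, n.first + i};
            }
        }
    }
    return best;
}

int main() {
    std::vector<Tri>  tris  = {{{0, 0, 5}, {1, 0, 5}, {0, 1, 5}}};
    std::vector<Node> nodes = {{{0, 0, 5}, {1, 1, 5}, 0, 1}};    // one leaf holding the triangle
    Ray r{{0.2f, 0.2f, 0.0f}, {0.0f, 0.0f, 1.0f}, 100.0f};
    return trace(r, nodes, tris).prim;               // 0: the ray hits the triangle at t = 5
}
```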
Sources: HotHardware, Kepler_L2 (Twitter)

227 Comments on AMD to Redesign Ray Tracing Hardware on RDNA 4

#51
AnarchoPrimitiv
dgianstefaniIntel's upscaler and RT tech is already ahead.
That's not impressive and nobody should consider it to be so....Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
#52
dgianstefani
TPU Proofreader
SithaerNot as crazy as some ppl being so much against new tech/progress on a TECH forum only because they dislike it for whatever reason.
Progress doesn't necessarily have to happen on a need to ask basis, if that was the case then we would be still using some basic tech playing old ass games cause most of the stuff we have nowadays is something that nobody directly asked for I think. 'we could play this who asked for it game forever, its pointless and kinda stupid imo'

Personally I have no issues with RT/Upscaling or any of that stuff and actually I like using DLSS or DLAA if I have the headroom for it.
RT depends on the game but the tech on its own is pretty cool imo. 'Cyber/Control/Ghostwire Tokyo I did finish with RT+DLSS on'

At this point and going forward I consider those things a selling point whenever I will upgrade my GPU cause if the price diff is not too big then I will pick up the card with the better feature set and currently that is Nvidia so I'm glad to see that AMD is working on those. 'I have no probs with AMD itself, I've had my share of AMD hardware in the past with no issues'
Well said.

Consumers like these features. Therefore a consumer serving company will cater to their customers.

End of story.
AnarchoPrimitivThat's not impressive and nobody should consider it to be so....Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
Maybe R&D spending is... important?

How many generations does it take to get it right? For these two things, RT hardware and upscaling, for Intel, one generation.

Take notes AMD.
#53
Vya Domus
dgianstefaniAh so now we're back to raster. Sorry, you move the goalposts so frequently it's hard to keep up.
Well we're back to whatever you like, you've failed spectacularly to back up every single one of your claims so far.
dgianstefaniWe were talking about "how Intel fares in RT" to use your words.
You ignored my post where I showed Intel can't even match AMD's first-gen RT; I don't know what more can be said on that subject. Then you brought up prices; no clue as to what you were trying to prove. The A770 is so much cheaper because it sucks, no one is buying it for its stellar RT performance, which actually still sucks.
#54
AnarchoPrimitiv
dgianstefaniLmao.

Ok. 85% of the GPU market is wrong, AMD is right, and the rest of us are just fanboys.

Interesting take.

Even Intel got priorities right with Arc; perhaps AMD, who should have known better even with RDNA 2, is finally getting their heads around the concept of RT and upscaling being core features, not lazy "we have this feature too" checkboxes to tick.
Also, ever consider that AMD's reasons for not doing something aren't choice so much as financial constraint? It seems as though whenever somebody compares Nvidia, Intel, and AMD, they just assume they're all competing on an even playing field, when that couldn't be further from the truth. Nvidia and Intel are so much bigger than AMD and have so many more resources that we shouldn't view it as AMD underachieving compared to Nvidia and Intel; rather, what AMD is able to achieve with so few resources, while their competition has every advantage, is truly impressive.

Up until the last three or four years, Nvidia had an R&D budget over twice as big as AMD's and Intel had one over 7x as large, and despite that, AMD was able to compete and hang with Nvidia and actually beat Intel. Can anybody name an example from any other industry where a company is able to successfully compete while being completely outmatched by its competition resource-wise?
#55
TheinsanegamerN
AnarchoPrimitivThat's not impressive and nobody should consider it to be so....Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
Intel also has huge software development not related to GPUs. They have side businesses larger than AMD.

How much of that 3x budget is going to GPUs specifically?
#56
Colddecked
dgianstefaniThis is one of the things needed. Actual full hardware acceleration rather than a half hearted approach.
We AMD fans prefer the term jimmy rigged, but will also accept janky.

It's not a bad thing for a tech company to innovate beyond current demand or trends. Say what you will about people not wanting RT or AI upscaling back then; right now it's a necessity, and Nvidia has positioned themselves as the clear leader in those areas. Nvidia is simply reaping what they sowed.
#57
dgianstefani
TPU Proofreader
Vya DomusWell we're back to whatever you like, you've failed spectacularly to back up every single one of your claims so far.


You ignored my post where I showed Intel can't even match AMD's first-gen RT; I don't know what more can be said on that subject. Then you brought up prices; no clue as to what you were trying to prove. The A770 is so much cheaper because it sucks, no one is buying it for its stellar RT performance, which actually still sucks.
A £500 card that is more performant than a £300 one and the price is irrelevant? OK. I'll remind you of that next time you're mocking the price of NVIDIA cards.
#58
Vya Domus
dgianstefaniA £500 card that is more performant than a £300 one and the price is irrelevant? OK. I'll remind you of that next time you're mocking the price of NVIDIA cards.
Dude what are you even saying.

The 6800XT is a discontinued four-year-old card that's still much faster in raster, by the way, but that doesn't matter; the point is Intel sucks in raster and can't even match first-gen AMD RT performance. The 7700XT is more expensive because it's simply a much better GPU.
#59
RGAFL
dgianstefaniAh so now we're back to raster. Sorry, you move the goalposts so frequently it's hard to keep up. We were talking about "how Intel fares in RT" to use your words.

Considering RT is employed as the default lighting in new game engines, it's not quite as shocking as you might think to expect expensive cards to be performant.
In the biggest game engines, i.e. Unreal Engine 5 or Unity, no, it is not the default lighting system. Lumen needs to be toggled on, and Unity needs to use the HDRP shaders, which are bugged to hell and hardly used. Let me know what other engines default to RT lighting and I'll ask some of my dev colleagues.
#60
Onasi
All this bickering is pointless. Either AMD rolls out a good, viable product line and gains market share, or they won’t. That’s it. I don’t think the average customer cares about the technical minutiae. Yes, NV has a massive mindshare advantage, but not an insurmountable one. After all, before Zen, AMD was “done” in the CPU market according to many. But, it turns out, all you need is a good product. Same with Intel GPUs: if Battlemage is an improvement, the drivers no longer take forever to mature (to be fair, that wasn’t just Intel’s fault; there were extenuating circumstances out of their control), and it’s priced well, I do think it could be a success.
In the end, the product is all that matters. And said product is holistic - hardware, software, features, the lot.
#61
Vya Domus
RGAFLLet me know what other engines defaults to RT lighting
None do; his logic is that if a game has RT, that must mean the engine itself is RT by default, lol.
#62
dgianstefani
TPU Proofreader
RGAFLIn the biggest game engines, i.e. Unreal Engine 5 or Unity, no, it is not the default lighting system. Lumen needs to be toggled on, and Unity needs to use the HDRP shaders, which are bugged to hell and hardly used. Let me know what other engines default to RT lighting and I'll ask some of my dev colleagues.
Lumen introduced in 5.0, enabled by default from 5.1 onwards.

ColddeckedWe AMD fans prefer the term jimmy rigged, but will also accept janky.

It's not a bad thing for a tech company to innovate beyond current demand or trends. Say what you will about people not wanting RT or AI upscaling back then; right now it's a necessity, and Nvidia has positioned themselves as the clear leader in those areas. Nvidia is simply reaping what they sowed.
Precisely.
OnasiAll this bickering is pointless. Either AMD rolls out a good, viable product line and gains market share, or they won’t. That’s it. I don’t think the average customer cares about the technical minutiae. Yes, NV has a massive mindshare advantage, but not an insurmountable one. After all, before Zen, AMD was “done” in the CPU market according to many. But, it turns out, all you need is a good product. Same with Intel GPUs: if Battlemage is an improvement, the drivers no longer take forever to mature (to be fair, that wasn’t just Intel’s fault; there were extenuating circumstances out of their control), and it’s priced well, I do think it could be a success.
In the end, the product is all that matters. And said product is holistic - hardware, software, features, the lot.
Well said.

Additionally it took until Zen 3 for AMD to be competitive in all areas, ST+MT and in games.

So looking at Arc Alchemist I see many similarities.

Ah mod had to edit my phone screenshot size again :toast:, my bad, easy thing to forget those pixel densities :D
#63
Onasi
Wait, wasn’t the whole selling point of Lumen that it doesn’t need dedicated RT hardware and is a software implementation instead?
#64
RGAFL
dgianstefaniLumen introduced in 5.0, enabled by default from 5.1 onwards.
It's the default Global Illumination system. Big difference. I work with Unreal Engine 5. Lumen needs to be toggled on. I can post pics if proof needed.
#65
dgianstefani
TPU Proofreader
OnasiWait, wasn’t the whole selling point of Lumen that it doesn’t need dedicated RT hardware and is a software implementation instead?
It's both: software, hybrid, and hardware accelerated. It has to be, because until the PS5 Pro, consoles have anaemic RT hardware.
#66
Onasi
dgianstefaniSo looking at Arc Alchemist I see many similarities.
As I mentioned, Intel got an unexpected stumbling block there. I know for a fact from a reliable first hand source that quite a bit of software development, especially for GPU drivers, was done for a while at Intel Russia. I am sure I don’t need to explain further what the issue is.
#67
RGAFL
Seems I'm half right, half wrong. Lumen is the default in the latest Unreal Engine, V5.4.1. You still need to enable 'Generate Mesh Distance Fields' and switch from point-based lighting to global illumination for it to work properly, but it seems Lumen is becoming the default. I'm using V5.3 at the moment, which does need it toggled on, so something to bear in mind for the future.
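For anyone who wants to check their own project, this is roughly what turning Lumen on looks like in a project's DefaultEngine.ini on a recent UE 5.x build. This is a minimal sketch from memory, so treat the exact console variable names and values as assumptions that can differ between engine versions; the editor's Project Settings UI normally writes these entries for you.

```ini
[/Script/Engine.RendererSettings]
; Use Lumen for dynamic global illumination and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Needed by Lumen's software tracing path ("Generate Mesh Distance Fields")
r.GenerateMeshDistanceFields=True
; Optional: use hardware ray tracing (RT cores / Ray Accelerators) when present
r.Lumen.HardwareRayTracing=True
```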
#68
Noyand
john_They already started moving those. Stable Diffusion, AI, "smart" NPCs will be all the new rage in future marketing material from Nvidia. And don't ask me why gamers need AI. Nvidia, the tech press and the hordes of (paid and unpaid) trolls will try to convince us that AI in gaming is the most important thing in the world.
Many game studios have labs where they try out new tech to see how it could benefit game development. AI lip sync was already used in commercial games before the marketing went apeshit with A.I. Back in 2020, I had a teacher who was working at Ubisoft Annecy, and he told me about the "innovation lab" and how there were people there already experimenting with A.I. Ubisoft is also helping out start-ups working on new tech (and that includes A.I. start-ups) for the entertainment industry. Behind closed doors, the gaming industry is not as conservative as many people think... they are just being very careful about what gets released in their AAA games.

From our consumer POV we are under the impression that Nvidia created the waves, but I think Nvidia is actually surfing a wave that was already there but needed a push. When I was in uni I had a seminar where a guy talked about how digital twins were having a big influence on his job... and that was a few weeks after Nvidia kept talking about digital twins at GTC.
#69
AnotherReader
OnasiAs I mentioned, Intel got an unexpected stumbling block there. I know for a fact from a reliable first hand source that quite a bit of software development, especially for GPU drivers, was done for a while at Intel Russia. I am sure I don’t need to explain further what the issue is.
It isn't just the drivers. Even after two years, the A770 is significantly slower than the 6700 XT despite having 60% more compute and nearly 50% more bandwidth. This is why I'm not expecting more than a 4070 competitor from Battlemage. I may be pleasantly surprised, but that remains to be seen.
#70
ToTTenTranz
dgianstefaniNice scene of a console port with no RT, or use of tech other than simple raster.

Really proves your point well.
Horizon Forbidden West is an excellent way to prove that point, because 95% of the time it looks far better than CP2077, Control or whatever other game with mediocre geometry and textures that Nvidia convinced you looks so good because of all that performance-breaking RT with a gazillion bounces per ray.

Like what happened with the last two console generations, the whole AIB market could have been moving towards $100-$500 GPUs doing temporally upscaled 2K-4K resolution with excellent performance and visuals using screen-space effects, perhaps with limited RT reflections here and there (e.g. Spider-Man 2 or Ratchet & Clank).

Instead Nvidia convinced most GPU reviewers to tell people RT performance is such an important thing so they could upsell their $1500-2000 GPUs, so we're back to AMD having to adapt their hardware to Nvidia's latest naked-emperor. Just like they had to when Nvidia was telling reviewers they had to measure performance rendering sub-pixel triangles in Geralt's hair.
SithaerNot as crazy as some ppl being so much against new tech/progress on a TECH forum only because they dislike it for whatever reason.
I wonder where all this progressivism was when ATi/AMD tried to introduce TruForm, then the first tessellation hardware in DX10, then TruAudio.
All of which were good enough to be widely adopted in consoles (i.e. a much higher volume of gaming hardware than PC dGPUs), but which for some reason the PC crowd ignored.
#71
Onasi
AnotherReaderIt isn't just the drivers. Even after two years, the A770 is significantly slower than the 6700 XT despite having 60% more compute and nearly 50% more bandwidth. This is why I'm not expecting more than a 4070 competitor from Battlemage. I may be pleasantly surprised, but that remains to be seen.
And the 7900XTX (full chip) is massively more powerful on paper than the 4080S (another full chip), yet it is only slightly ahead, and only in several pure raster scenarios. Same story with, for example, the 4070S vs the 7900 GRE. There’s a reason why comparing sheer stats doesn’t work between different architectures.
#72
Noyand
Vya DomusYeah it's nice isn't it, here's how Intel fares in RT :



A 6800XT is faster; strange how Intel has all of this ultra-performant hardware RT implementation that many Intel and Nvidia fanboys assure me is a must-have, yet their GPUs can't even beat AMD's top RDNA2 first-gen non-hardware RT GPU and are twice as slow as AMD's second-gen, still non-hardware, RT implementation.

But yes AMD is in serious danger, yikes.
Your comparison is weird: without RT the A770 is slightly slower than an RX 6700XT, but then you came out and compared the A770 to the RX 6800XT to prove that Intel's RT implementation is worse than AMD's? And you are calling people out on being dishonest?
#73
RGAFL
The major problem with the AMD cards is the way they use the shaders. Where Nvidia have dedicated Tensor and RT cores to do upscaling/ray tracing etc., AMD use the shaders they have and dual-issue them. Very good for light RT games, as their shaders are good, probably the best pure shaders out there, but when RT gets heavy they lose about 20-40% having to do both the RT calculations and the raster work. Dedicated units or cores will help out a lot since, like I said, the shaders are excellent, power-wise not far off 4090 performance when used correctly. But AMD do like their programmable shaders.
#74
Denver
NoyandYour comparison is weird: without RT the A770 is slightly slower than an RX 6700XT, but then you came out and compared the A770 to the RX 6800XT to prove that Intel's RT implementation is worse than AMD's? And you are calling people out on being dishonest?


The Intel implementation is inferior; Arc dedicates more space to RT and AI than the others. What's smart or better about dedicating a huge part of the die to RT just to shout that it has more RT performance, when in reality it's irrelevant in 99% of games, and the card ends up struggling to compete with the RX 6600/RX 7600?

A770: 406 mm², 21,700 million transistors.
RX 7600/RX 7600 XT: 204 mm², 13,300 million transistors.

And no, it's not worth having a design 2x larger for such a "glorious" advantage.
#75
dgianstefani
TPU Proofreader
Denver

The Intel implementation is inferior; Arc dedicates more space to RT and AI than the others. What's smart or better about dedicating a huge part of the die to RT just to shout that it has more RT performance, when in reality it's irrelevant in 99% of games, and the card ends up struggling to compete with the RX 6600/RX 7600?

A770: 406 mm², 21,700 million transistors.
RX 7600/RX 7600 XT: 204 mm², 13,300 million transistors.


The Arc A770 obliterates the RX 7600.

The 7600XT you are comparing to is from a three-month-old review with old drivers, and you've specifically chosen 1080p with RT off; the Intel card scales better at higher resolutions and/or with RT on.
What's smart or better about dedicating a huge part of the die to RT just to shout that it has more RT performance, when in reality it's irrelevant in 99% of games, and the card ends up struggling to compete with the RX 6600/RX 7600?
RGAFLSeems I'm half right, half wrong. Lumen is the default in the latest Unreal Engine, V5.4.1. You still need to enable 'Generate Mesh Distance Fields' and switch from point-based lighting to global illumination for it to work properly, but it seems Lumen is becoming the default. I'm using V5.3 at the moment, which does need it toggled on, so something to bear in mind for the future.
Irrelevant huh?

