
AMD to Redesign Ray Tracing Hardware on RDNA 4

It's crazy how anti-AMD some of you are; it's like you live in a parallel universe where AMD is dogshit and everyone else is just light years ahead.

Not as crazy as some people being so much against new tech and progress on a TECH forum, only because they dislike it for whatever reason.
Progress doesn't necessarily have to happen on a "who asked for this" basis; if it did, we would still be using some basic tech and playing old-ass games, because most of the stuff we have nowadays is something that nobody directly asked for, I think. 'We could play this who-asked-for-it game forever, it's pointless and kinda stupid imo.'

Personally I have no issues with RT/upscaling or any of that stuff, and I actually like using DLSS or DLAA if I have the headroom for it.
RT depends on the game, but the tech on its own is pretty cool imo. 'Cyberpunk, Control and Ghostwire Tokyo I did finish with RT+DLSS on.'

At this point and going forward, I consider those things a selling point whenever I upgrade my GPU, because if the price difference is not too big I will pick up the card with the better feature set, and currently that is Nvidia, so I'm glad to see that AMD is working on those. 'I have no problems with AMD itself; I've had my share of AMD hardware in the past with no issues.'
 
Intel's upscaler and RT tech are already ahead.
That's not impressive and nobody should consider it to be so... Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
 
Not as crazy as some people being so much against new tech and progress on a TECH forum, only because they dislike it for whatever reason.
Progress doesn't necessarily have to happen on a "who asked for this" basis; if it did, we would still be using some basic tech and playing old-ass games, because most of the stuff we have nowadays is something that nobody directly asked for, I think. 'We could play this who-asked-for-it game forever, it's pointless and kinda stupid imo.'

Personally I have no issues with RT/upscaling or any of that stuff, and I actually like using DLSS or DLAA if I have the headroom for it.
RT depends on the game, but the tech on its own is pretty cool imo. 'Cyberpunk, Control and Ghostwire Tokyo I did finish with RT+DLSS on.'

At this point and going forward, I consider those things a selling point whenever I upgrade my GPU, because if the price difference is not too big I will pick up the card with the better feature set, and currently that is Nvidia, so I'm glad to see that AMD is working on those. 'I have no problems with AMD itself; I've had my share of AMD hardware in the past with no issues.'
Well said.

Consumers like these features; therefore, a consumer-serving company will cater to its customers.

End of story.

That's not impressive and nobody should consider it to be so... Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
Maybe R&D spending is... important?

How many generations does it take to get it right? For these two things, RT hardware and upscaling, it took Intel one generation.

Take notes, AMD.
 
Ah so now we're back to raster. Sorry, you move the goalposts so frequently it's hard to keep up.
Well, we're back to whatever you like; you've failed spectacularly to back up every single one of your claims so far.

We were talking about "how Intel fares in RT" to use your words.
You ignored my post where I showed Intel can't even match AMD's first-gen RT; I don't know what more can be said on that subject. Then you brought up prices, and I have no clue what you were trying to prove. The A770 is so much cheaper because it sucks; no one is buying it for its stellar RT performance, which actually still sucks.
 
Lmao.

Ok. 85% of the GPU market is wrong, AMD is right, and the rest of us are just fanboys.

Interesting take.

Even Intel got its priorities right with Arc. Perhaps AMD, who should have known better even with RDNA 2, is finally getting its head around the concept of RT and upscaling being core features, not lazy "we have this feature too" checkboxes to tick.
Also, have you ever considered that AMD's reason for not doing something isn't choice so much as financial constraint? It seems as though whenever somebody compares Nvidia, Intel, and AMD, they just assume they're all competing on an even playing field, when that couldn't be further from the truth. Nvidia and Intel are so much bigger than AMD and have so many more resources that we shouldn't view it as AMD underachieving compared to Nvidia and Intel; rather, what AMD is able to achieve with so few resources, while its competition has every advantage, is truly impressive.

Up until the last three or four years, Nvidia had an R&D budget over twice as big as AMD's and Intel had one over 7x as large, and despite that, AMD was able to compete and hang with Nvidia and actually beat Intel. Can anybody name an example from any other industry where a company is able to successfully compete while being completely outmatched by its competition resource-wise?
 
That's not impressive and nobody should consider it to be so... Intel literally spends over 3x more on R&D than AMD, so should we really be impressed by the fact that Intel's ahead after spending an obscene amount of money?
Intel also has a huge software development effort not related to GPUs. They have side businesses larger than AMD.

How much of that 3x budget is going to GPUs specifically?
 
This is one of the things needed: actual full hardware acceleration rather than a half-hearted approach.
We AMD fans prefer the term jimmy-rigged, but will also accept janky.

It's not a bad thing for a tech company to innovate beyond the current demand or trends. Say what you will about people not wanting RT or AI upscaling back then; right now it's a necessity, and Nvidia has positioned itself as the clear leader in those areas. Nvidia is simply reaping what it sowed.
 
Well, we're back to whatever you like; you've failed spectacularly to back up every single one of your claims so far.


You ignored my post where I showed Intel can't even match AMD's first-gen RT; I don't know what more can be said on that subject. Then you brought up prices, and I have no clue what you were trying to prove. The A770 is so much cheaper because it sucks; no one is buying it for its stellar RT performance, which actually still sucks.
A £500 card that is more performant than a £300 one and the price is irrelevant? OK. I'll remind you of that next time you're mocking the price of NVIDIA cards.
 
A £500 card that is more performant than a £300 one and the price is irrelevant? OK. I'll remind you of that next time you're mocking the price of NVIDIA cards.
Dude, what are you even saying?

The 6800XT is a discontinued four-year-old card that is still much faster in raster, by the way, but that doesn't matter; the point is Intel sucks in raster and can't even match AMD's first-gen RT performance. The 7700XT is more expensive because it's simply a much better GPU.
 
Ah so now we're back to raster. Sorry, you move the goalposts so frequently it's hard to keep up. We were talking about "how Intel fares in RT" to use your words.

Considering RT is employed as the default lighting in new game engines, it's not quite as shocking as you might think to expect expensive cards to be performant.
In the biggest game engines, i.e. Unreal Engine 5 or Unity, no, they are not the default lighting system. Lumen needs to be toggled on, and Unity needs to use the HDRP shaders, which are bugged to hell and hardly used. Let me know what other engines default to RT lighting and I'll ask some of my dev colleagues.
 
All this bickering is pointless. Either AMD rolls out a good, viable product line and gains market share, or they won't. That's it. I don't think the average customer cares about the technical minutiae. Yes, NV has a massive mindshare advantage, but not an insurmountable one. After all, before Zen, AMD was "done" in the CPU market according to many. But, as it turns out, all you need is a good product. Same with Intel GPUs: if Battlemage is an improvement, if the drivers no longer take forever to improve (to be fair, that wasn't just Intel's fault; there were extenuating circumstances out of their control), and if it's priced well, I do think it could be a success.
In the end, the product is all that matters. And said product is holistic: hardware, software, features, the lot.
 
Let me know what other engines default to RT lighting
None do; his logic is that if a game has RT, that must mean the engine itself uses RT by default, lol.
 
In the biggest game engines, i.e. Unreal Engine 5 or Unity, no, they are not the default lighting system. Lumen needs to be toggled on, and Unity needs to use the HDRP shaders, which are bugged to hell and hardly used. Let me know what other engines default to RT lighting and I'll ask some of my dev colleagues.
Lumen was introduced in 5.0 and enabled by default from 5.1 onwards.

[screenshot attached]


We AMD fans prefer the term jimmy-rigged, but will also accept janky.

It's not a bad thing for a tech company to innovate beyond the current demand or trends. Say what you will about people not wanting RT or AI upscaling back then; right now it's a necessity, and Nvidia has positioned itself as the clear leader in those areas. Nvidia is simply reaping what it sowed.
Precisely.
All this bickering is pointless. Either AMD rolls out a good, viable product line and gains market share, or they won't. That's it. I don't think the average customer cares about the technical minutiae. Yes, NV has a massive mindshare advantage, but not an insurmountable one. After all, before Zen, AMD was "done" in the CPU market according to many. But, as it turns out, all you need is a good product. Same with Intel GPUs: if Battlemage is an improvement, if the drivers no longer take forever to improve (to be fair, that wasn't just Intel's fault; there were extenuating circumstances out of their control), and if it's priced well, I do think it could be a success.
In the end, the product is all that matters. And said product is holistic: hardware, software, features, the lot.
Well said.

Additionally, it took until Zen 3 for AMD to be competitive in all areas: single-threaded, multi-threaded, and in games.

So looking at Arc Alchemist I see many similarities.

Ah, a mod had to edit my phone screenshot size again :toast:, my bad, it's easy to forget those pixel densities :D
 
Wait, wasn’t the whole selling point of Lumen that it doesn’t need dedicated RT hardware and is a software implementation instead?
 
Wait, wasn’t the whole selling point of Lumen that it doesn’t need dedicated RT hardware and is a software implementation instead?
It's both: software, hybrid, and hardware-accelerated. It has to be, because until the PS5 Pro, consoles have had anaemic RT hardware.
 
So looking at Arc Alchemist I see many similarities.
As I mentioned, Intel hit an unexpected stumbling block there. I know for a fact from a reliable first-hand source that quite a bit of software development, especially for GPU drivers, was done for a while at Intel Russia. I am sure I don't need to explain further what the issue is.
 
Seems I'm half right, half wrong. Lumen is the default in the latest Unreal Engine 5.4.1. You still need to enable 'Generate Mesh Distance Fields' and switch from point-based lighting to global illumination for it to work properly, but it seems Lumen is becoming the default. I'm using 5.3 at the moment, which does need toggling to switch it on, so something to bear in mind for the future.
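For anyone curious, here is roughly what that toggling boils down to in a project's DefaultEngine.ini. Treat this as a sketch from memory: the console-variable names below are my assumption of the relevant settings and may differ between engine versions, so double-check against your own install.

```python
# Illustrative sketch only: prints the renderer settings commonly associated with
# enabling Lumen in a UE5 project config. Names and values are assumptions, not gospel.
LUMEN_SETTINGS = {
    "r.DynamicGlobalIlluminationMethod": "1",  # 1 = Lumen GI (0 = none, 2 = screen space)
    "r.ReflectionMethod": "1",                 # 1 = Lumen reflections
    "r.GenerateMeshDistanceFields": "True",    # mesh distance fields, used by software Lumen tracing
}

# Lines you would expect to see under the renderer settings section:
print("[/Script/Engine.RendererSettings]")
for key, value in LUMEN_SETTINGS.items():
    print(f"{key}={value}")
```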
 
They've already started moving those. Stable Diffusion, AI, "smart" NPCs will be all the new rage in future marketing material from Nvidia. And don't ask me why gamers need AI. Nvidia, the tech press, and the hordes of (paid and unpaid) trolls will try to convince us that AI in gaming is the most important thing in the world.
Many game studios have labs where they try out new tech to see how it could benefit game development. AI lip sync was already used in commercial games before the marketing went apeshit with A.I. Back in 2020, I had a teacher who was working at Ubisoft Annecy and told me about the "innovation lab" and how there are people there already experimenting with A.I. Ubisoft is also helping out start-ups working on new tech (and that includes A.I. start-ups) for the entertainment industry. Behind closed doors, the gaming industry is not as conservative as many people think it is... it is just being very careful about what gets released in its AAA games.

From our consumer POV we are under the impression that Nvidia created the waves, but I think Nvidia is actually surfing on a wave that was already there and just needed a push. When I was in uni, I had a seminar where a guy talked about how digital twins were having a big influence on his job... and that was a few weeks after Nvidia kept talking about digital twins at GTC.
 
As I mentioned, Intel hit an unexpected stumbling block there. I know for a fact from a reliable first-hand source that quite a bit of software development, especially for GPU drivers, was done for a while at Intel Russia. I am sure I don't need to explain further what the issue is.
It isn't just the drivers. Even after two years, the A770 is significantly slower than the 6700 XT despite having 60% more compute and nearly 50% more bandwidth. This is why I'm not expecting more than a 4070 competitor from Battlemage. I may be pleasantly surprised, but that remains to be seen.
 
Nice scene of a console port with no RT, or use of tech other than simple raster.

Really proves your point well.

Horizon Forbidden West is an excellent way to prove that point, because 95% of the time it looks far better than CP2077, Control, or whatever other game with mediocre geometry and textures that Nvidia convinced you looks so good because of all that performance-breaking RT with a gazillion bounces per ray.

Like what happened with the last two console generations, the whole AIB market could have been moving towards $100-$500 GPUs doing temporal 2K-4K resolution with excellent performance and visuals using screen-space effects, perhaps with limited RT reflections here and there (e.g. Spider-Man 2 or Ratchet & Clank).

Instead, Nvidia convinced most GPU reviewers to tell people RT performance is such an important thing so it could upsell its $1500-2000 GPUs, so we're back to AMD having to adapt its hardware to Nvidia's latest naked emperor, just like it had to when Nvidia was telling reviewers they had to measure performance rendering sub-pixel triangles in Geralt's hair.


Not as crazy as some people being so much against new tech and progress on a TECH forum, only because they dislike it for whatever reason.
I wonder where all this progressivism was when ATi/AMD tried to introduce TruForm, then the first tessellation hardware in DX10, then TruAudio.
All of them were good enough to be widely adopted in consoles (i.e. a much higher volume of gaming hardware than PC dGPUs), but for some reason the PC crowd ignored them.
 
It isn't just the drivers. Even after two years, the A770 is significantly slower than the 6700 XT despite having 60% more compute and nearly 50% more bandwidth. This is why I'm not expecting more than a 4070 competitor from Battlemage. I may be pleasantly surprised, but that remains to be seen.
And the 7900XTX (full chip) is massively more powerful on paper than the 4080S (another full chip), and yet it is only slightly ahead, and only in some pure raster scenarios. Same story with, for example, the 4070S vs the 7900 GRE. There's a reason why comparing sheer stats doesn't work between different architectures.
 
Yeah, it's nice, isn't it? Here's how Intel fares in RT:

[attached: RT benchmark chart]

A 6800XT is faster. Strange how Intel has all of this ultra-performant hardware RT implementation that many Intel and Nvidia fanboys assure me is a must-have, yet their GPUs can't even beat AMD's top RDNA2 first-gen non-hardware-RT GPU and are twice as slow as AMD's second-gen, still non-hardware, RT implementation.

But yes AMD is in serious danger, yikes.
Your comparison is weird: without RT, the A770 is slightly slower than an RX 6700XT, but then you came out and compared the A770 to the RX 6800XT to prove that Intel's RT implementation is worse than AMD's? And you're calling people out for being dishonest?
[attached: benchmark charts]
 
The major problem with the AMD cards is the way they use their shaders. Where Nvidia has dedicated Tensor and RT cores to do upscaling, ray tracing, etc., AMD uses the shaders it has and dual-issues them. That is very good for light-RT games, as their shaders are good, probably the best pure shaders out there, but when RT gets heavy they lose about 20-40% having to do the RT calculations and the raster work. Dedicated units or cores will help out a lot because, like I said, the shaders are excellent, power-wise not far off 4090 performance when used correctly. But AMD does like its programmable shaders.
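A rough way to picture that 20-40% figure (a toy model I made up purely for illustration, nothing to do with how a real GPU scheduler works): if RT work has to run on the same shader ALUs as the raster work, the two costs pile up; with dedicated units, most of the RT cost can overlap with the raster work instead.

```python
# Toy model (illustrative only): per-frame cost when RT shares the shader ALUs
# versus running mostly on dedicated units. All numbers are made up.

def frame_time_shared(raster_ms: float, rt_ms: float) -> float:
    """Shared ALUs: RT and raster compete for the same units, so their costs add."""
    return raster_ms + rt_ms

def frame_time_dedicated(raster_ms: float, rt_ms: float, overlap: float = 0.8) -> float:
    """Dedicated RT units: a large fraction of the RT cost hides behind raster work."""
    return raster_ms + rt_ms * (1.0 - overlap)

raster_ms, rt_ms = 10.0, 4.0  # invented per-frame costs in milliseconds
shared = frame_time_shared(raster_ms, rt_ms)
dedicated = frame_time_dedicated(raster_ms, rt_ms)
print(f"shared ALUs:     {shared:.1f} ms/frame")
print(f"dedicated units: {dedicated:.1f} ms/frame")
print(f"relative penalty: {shared / dedicated - 1.0:.0%}")  # lands in that 20-40% ballpark
```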
 
Your comparison is weird: without RT, the A770 is slightly slower than an RX 6700XT, but then you came out and compared the A770 to the RX 6800XT to prove that Intel's RT implementation is worse than AMD's? And you're calling people out for being dishonest?
[attached: benchmark charts]


The Intel implementation is inferior; Arc dedicates more die space to RT and AI than the others. What's smart or better about dedicating a huge part of the die to RT, just to shout that it has more RT performance, when in reality it's irrelevant in 99% of games and the card ends up struggling to compete with the RX 6600/RX 7600?

A770: 406 mm², 21,700 million transistors.
RX 7600/RX 7600 XT: 204 mm², 13,300 million transistors.

Performance:

[attached: performance comparison chart]

And no, it's not worth having a design 2x larger to have such a "glorious" advantage:
[attached: RT performance chart]
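Quick sanity check on the "2x larger" claim, using nothing but the die-size and transistor figures quoted above:

```python
# Ratio check using the figures quoted in this post (the poster's numbers, not mine).
a770_mm2, a770_mtr = 406, 21_700       # Arc A770: die area (mm²), transistors (millions)
rx7600_mm2, rx7600_mtr = 204, 13_300   # RX 7600 / 7600 XT: die area (mm²), transistors (millions)

print(f"die area ratio:   {a770_mm2 / rx7600_mm2:.2f}x")   # ~1.99x -> the "2x larger" claim
print(f"transistor ratio: {a770_mtr / rx7600_mtr:.2f}x")   # ~1.63x
```

So by those figures the die really is about twice the area, though the transistor count gap is closer to 1.6x.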
 