
AMD Path Tracing Toyshop Demo

It looks nice, but I'm far from a graphics expert. A site I visit wasn't as impressed (https://www.notebookcheck.net/Lackl...rge-gap-with-RTX-50-series-GPUs.971854.0.html), but I don't spend gaming time staring at puddles etc.

What do you guys think?
The demo itself is less impressive than what Nvidia has put out recently. It looks quite a bit behind, which is probably accurate for what it is. But really, whether that matters is a bit secondary. AMD does have the tech in a usable and apparently performant form.

Real games or applications and apples-to-apples comparisons are what will determine whether they are on par or not. The visuals of real titles are not determined by AMD or Nvidia and will put cards from both on equal-ish ground. It remains to be seen how implementations of all the custom tech like neural rendering or ray reconstruction end up happening. Now that there is more than one real player, some standardization would be nice?
 
The thing that hit me from the video Ozzer posted was that, yeah, the NVidia marble video was awesome, but they've got £2000+++ hardware to play that (plus more software tricks). AMD just has a 9070XT, which is in my price range. I still think a lot of RT is just NV trying to hammer AMD because they can, rather than RT being something we absolutely need.

When the 4090 came out it seemed like the card to last forever as it was so damn powerful and so ahead of everything else, but then NV NVidia-ed themselves by bringing out path tracing which immediately hammered the 4090. Lol. That's changed how I view the xx90 cards (and NV), not for the better.
 
The thing that hit me from the video Ozzer posted was that, yeah, the NVidia marble video was awesome, but they've got £2000+++ hardware to play that (plus more software tricks). AMD just has a 9070XT, which is in my price range. I still think a lot of RT is just NV trying to hammer AMD because they can, rather than RT being something we absolutely need.
Marbles? Wasn't that a Turing demo that ran on a high-end Quadro? That would be a bit faster than a 2080 Ti, which in more recent generations is the performance range of a 3070/Ti, 4060 Ti/4070, or 7800 XT.
 
The thing that hit me from the video Ozzer posted was that, yeah, the NVidia marble video was awesome, but they've got £2000+++ hardware to play that (plus more software tricks). AMD just has a 9070XT, which is in my price range. I still think a lot of RT is just NV trying to hammer AMD because they can, rather than RT being something we absolutely need.

When the 4090 came out it seemed like the card to last forever as it was so damn powerful and so ahead of everything else, but then NV NVidia-ed themselves by bringing out path tracing which immediately hammered the 4090. Lol. That's changed how I view the xx90 cards (and NV), not for the better.
It was the same story with sli, physx, hairworks, tessellation, gsync, dlss... Planned obsolescence.
 
I'll be honest, I can't make out the imperfections that other people are seeing. The soft image is basically the only thing that stood out to me. I suspect that once reviewers boot up Cyberpunk and compare path-tracing properly it will be a marginal difference. NVIDIA will likely come out ahead of course but if we're zooming in to find differences then I'd consider that a job well done for AMD. They can only get better.
 
Marbles? Wasn't that a Turing demo that ran on a high-end Quadro? That would be a bit faster than a 2080 Ti, which in more recent generations is the performance range of a 3070/Ti, 4060 Ti/4070, or 7800 XT.
You're right, 2020 - I didn't catch that!

OK so NV have quite a large lead, but they're pushing RT with the big guns - the 5090 is silly expensive, and it made the 4090 (which was also silly expensive) look decidedly not good enough in a year or so. AMD have nothing in that range, so NV is maybe pushing boundaries, but who's winning? RT is still a long way from being universally available.

It was the same story with sli, physx, hairworks, tessellation, gsync, dlss... Planned obsolescence.
Yeah. Haven't missed them (the 970 was the last NV card I had). I now think spending too much is a fool's errand, as something, possibly unexpected or from a tangent, comes along that suddenly makes what you have seem meh. Flagship wow is great to dream about, and helps all the content providers, but...

Spending £2k on a series of gfx cards over a period of 10 years (?) seems a better bet than buying a 5090 and clinging on for dear life hoping for the best.

I think I'm happy that I look at these demos and say "wow" and don't really see all the bad things. I liked this toyshop demo. I hope the 9070XT does well.
 
It was the same story with sli, physx, hairworks, tessellation, gsync, dlss... Planned obsolescence.
What? Come on, you know better than that. And none of these is an example of planned obsolescence. Either the feature fell out of favor or became a standard, neither of which even resembles planned obsolescence.
- SLI/CF died because of increasing maintenance costs for IHVs and the theoretical promise of a more inherent replacement in the form of DX12 multi-GPU.
- For PhysX, Nvidia tried to hold on to the GPU-accelerated version by hampering the CPU version. They gave up, and PhysX is still pretty widely used, just usually not running on GPUs. A major factor in this was also the rise of multi-core CPUs, which were no longer that bad at physics calculations.
- Hairworks, or a successor of it, is still in GameWorks. Plus it was joined by other implementations of hair rendering - TressFX was maybe the most prominent one in the news. But hair rendering, once a "special" technology, went mainstream and is part of various engines and (physics) middleware these days.
- Tessellation is just mainstream. It is a standard thing and used all over the place. Btw, tessellation came out first as an ATi (now AMD) innovation.
- Marketing aside, GSync came out a good 1.5 years ahead of FreeSync, and it took longer than that before things got on par - arguably. Now adaptive sync is a standard feature, basically implemented at the scaler level in almost any monitor that needs it. But that took long years to happen.
- Not sure what you mean by DLSS. Newer versions or features like MFG being limited to newer generations? There does seem to be a technical reason for most of that. And it is usually not the outright "cannot run" that Nvidia hints at and popular understanding veers towards; it is older cards not being able to run the thing without an oversized performance hit.

The on-topic feature, ray tracing, is quite different from these things from the get-go. DXR has been part of Microsoft's DX12 as a standard from the beginning. Vulkan took a bit longer but also has ray tracing in the API as a standard thing. Implementations can and should vary.
 
The only thing I can conclude from comparing this tech demo to Nvidia's is that I like NV's artists' work better. Which is irrelevant to the actual technical prowess being demoed here...

Isn't most of the blurring people are complaining about from intentional motion blur and DoF?
 
Isn't most of the blurring people are complaining about from intentional motion blur and DoF?
The biggest problem I can see quite clearly is ghosting, which is different from motion blur. If you watch the robot's arm, there's a whole trail where it used to be:
[attached screenshot: 1741104375790.png]

There's some blurriness on the green of the "shoulder", for example, that seems like motion blur, but that trailing effect behind anything that moves looks exactly like what you see with bad DLSS/FSR implementations.
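
For anyone wondering why ghosting and motion blur look different: temporal upscalers like DLSS/FSR reuse pixels from previous frames, blending reprojected history with the current frame. Where an object has moved away and the stale history isn't rejected, the old pixels fade out slowly and you get exactly that kind of trail. A toy sketch of the idea in Python (a 1-D "frame" with one moving bright pixel; everything here is made up purely for illustration, not how any real upscaler is written):

import numpy as np

FRAMES, WIDTH, ALPHA = 8, 16, 0.1  # ALPHA: weight of the new frame in the blend

def render(pos, width=WIDTH):
    # Current frame: one bright moving "object" pixel on a dark background
    frame = np.zeros(width)
    frame[pos] = 1.0
    return frame

history_naive = np.zeros(WIDTH)    # blind exponential accumulation
history_reject = np.zeros(WIDTH)   # accumulation with crude history rejection

for t in range(FRAMES):
    current = render(pos=t)  # the object moves one pixel per frame
    # Naive blend: stale bright pixels decay by only 10% per frame -> a trail
    history_naive = ALPHA * current + (1 - ALPHA) * history_naive
    # Crude rejection: where history disagrees wildly with the new frame,
    # throw it away (real upscalers use motion vectors + neighborhood clamps)
    stale = np.abs(history_reject - current) > 0.5
    history_reject[stale] = current[stale]
    history_reject = ALPHA * current + (1 - ALPHA) * history_reject

print("naive :", np.round(history_naive, 2))   # trail of old positions
print("reject:", np.round(history_reject, 2))  # trail suppressed

When the rejection logic misfires (bad motion vectors, transparency, fast thin geometry), you land in the naive case, which is the trailing visible behind the robot's arm.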
 
My old eyes don't see these things, in real life or in games! I maintain that if you're enjoying the game and immersed, you're not puddle-peeping and missing all this. Which is great if you didn't pay for it. Not so sure if you're paying the NV tax.
If you have to take a still or slow things down, you're doing it wrong!
 
It was the same story with sli, physx, hairworks, tesselation, gsync, dlss... Planned obsolescence.

Can't have "planned obsolescence" if your hardware doesn't support any of it, or only begrudgingly does, anyway. Hell, why bother: let's all just go back to playing on an Atari 2600; after all, that fancy NES and its 56-color graphics are just unnecessary.

My old eyes don't see these things, in real life or in games! I maintain that if you're enjoying the game and immersed, you're not puddle-peeping and missing all this. Which is great if you didn't pay for it. Not so sure if you're paying the NV tax.
If you have to take a still or slow things down, you're doing it wrong!

Thread started well and then veered into some kind of damage control? What was the entire point of this? Like, to reinforce the point that something that AMD now seems capable of doing isn't needed? I don't get it?
 
What do you guys think?
Their neural denoiser looks poor.

Can't have "planned obsolescence" if your hardware doesn't support any of it, or only begrudgingly does, anyway. Hell, why bother: let's all just go back to playing on an Atari 2600; after all, that fancy NES and its 56-color graphics are just unnecessary.

Thread started well and then veered into some kind of damage control? What was the entire point of this? Like, to reinforce the point that something that AMD now seems capable of doing isn't needed? I don't get it?
AMD fans will scoff at progress in scene rendering, then wonder why their brand is under 10% market share.
 
Their neural denoiser looks poor.

For their first-generation tech? I'm giving it an approval stamp... NV's got a ~7-year head start with Turing, after all. More so than the denoiser, I just noticed the video looks somewhat oversharpened, but it's probably running with FSR 4 to achieve good frame rates.

Personally, I welcome AMD with open arms to the modern GPU world
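
A bit of context on why the denoiser is such a big deal in path tracing: each pixel is a Monte Carlo estimate built from random light paths, and the error only shrinks with the square root of the sample count. Real-time budgets sit around 1-2 samples per pixel, so the raw image is a noisy mess and nearly all of the final quality comes from the denoiser. A quick Python illustration of that convergence rate (the "scene" is a made-up stand-in, just to show the 1/sqrt(N) behaviour):

import numpy as np

rng = np.random.default_rng(0)
TRUE_RADIANCE = 0.5  # what a fully converged render of this pixel would show

def pixel_estimate(spp):
    # Monte Carlo pixel: average of `spp` noisy light-path contributions
    samples = rng.uniform(0.0, 1.0, size=spp)  # stand-in for path samples
    return samples.mean()

for spp in (1, 4, 16, 64, 256):
    estimates = np.array([pixel_estimate(spp) for _ in range(2000)])
    err = np.abs(estimates - TRUE_RADIANCE).mean()
    print(f"{spp:4d} spp -> mean error {err:.4f}")  # roughly halves per 4x spp

Getting from 1 spp to a clean image by brute force would need thousands of samples per pixel, hence every vendor leaning on increasingly clever (now neural) denoisers instead.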
 
For their first-generation tech? I'm giving it an approval stamp... NV's got a ~7-year head start with Turing, after all. More so than the denoiser, I just noticed the video looks somewhat oversharpened, but it's probably running with FSR 4 to achieve good frame rates.

Personally, I welcome AMD with open arms to the modern GPU world
Yes and no. A step in the right direction, but frankly, this is 2025 and it looks far worse than Nvidia's 2020 PT demo.
 
Yes and no. A step in the right direction, but frankly, this is 2025 and it looks far worse than Nvidia's 2020 PT demo.

IMO it's more that AMD's technology is still relatively immature, but you need to crawl before you can walk, after all. They are on the right track; with some time and elbow grease, I think AMD will pull it off. I'm cautiously optimistic about UDNA right now; if they pull this generation off well, I think I might be grabbing one of their upcoming high-end cards. Though I'd like to actually receive my 5090 before I start thinking of upgrades for it :laugh:
 
Thread started well and then veered into some kind of damage control? What was the entire point of this? Like, to reinforce the point that something that AMD now seems capable of doing isn't needed? I don't get it?
Apologies, I was fishing for more info - I liked the demo but couldn't really see some of the bad things the link I posted highlighted. Ozzer posted a really useful video in response. It made me think about so many things. I'm trying to figure out what I want for myself rather than being led by influencers with their own motives. I have to read/watch their content to get the info, but I don't have to agree.

If I'm busy on my nth attempt to find a tactic to kill the bad boss that's been slaying me, it's all about the game and the hopeful buzz of success, with the quality of the shadow under his arm being very far from my mind! I do like >120Hz, but I'm not sure I can tell between higher refresh rates. I do remember older games being less blurry (which I like), but the jaggies - telegraph wires and chain-link fences everywhere. Just finished another playthrough of all the Half-Lifes/System Shocks and some other oldies. It was great.

It made me remember that there are visual tricks done at a pixel level to improve the bigger picture - is this partly at play when pixel peepers call out some artifact, when in reality it's there to fool your eyes into seeing what's not really there (to see "better")? I think it's like AA, or a kind of predistortion to correct for a weird transfer function, to use an engineering analogy. Not always, though - some stuff is just a mess without any intent.
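
That AA analogy holds up pretty well. The oldest trick in the book, supersampling, literally renders "what's not really there": it averages several sub-pixel samples so a hard edge turns into a gradual ramp across a pixel, which the eye reads as a cleaner line than the pixel grid can actually draw. A minimal Python sketch (the edge function is hypothetical, purely to show the idea):

import numpy as np

def coverage(x, y):
    # Hypothetical scene: fully lit below the diagonal edge y = 0.37 * x
    return 1.0 if y < 0.37 * x else 0.0

def shade(px, py, ss=4):
    # Average an ss x ss grid of sub-pixel samples (4x4 = 16x supersampling)
    offsets = (np.arange(ss) + 0.5) / ss
    samples = [coverage(px + dx, py + dy) for dx in offsets for dy in offsets]
    return sum(samples) / len(samples)

row = [shade(px, 2.0) for px in range(10)]  # one scanline crossing the edge
print(["%.2f" % v for v in row])  # a hard 0/1 jaggy becomes a smooth ramp

Same principle behind those telegraph wires and chain-link fences: without enough samples per pixel they shimmer, and modern AA/upscaling tech is mostly different ways of getting those extra samples cheaply (often from previous frames, which loops back to the ghosting discussion above).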

NV and AMD are in the same arena but aren't equals. Their output seems to fall into three categories: 1) making things better for the consumer, 2) making the other look worse, and 3) creating ways to make their products seem better than they actually are (frame gen, etc.). Each has a different emphasis/priority, and as a result nothing from one is directly comparable with the other. I want good, but there's a limit to how much I want to pay. Of course I can get better if I pay more (probably), but... This is what this thread is about to me.

Sorry for a long one :D
 
Well, as little as I want to use RT and PT it looks like AMD is advancing in the right direction for people who do want it. The demo is a bit rough but a promising start.

With any luck this will help AMD claw back some desperately needed market share.
 