
XFX Radeon RX 5700 XT THICC III Ultra

Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
There's no way users who buy a card TODAY should worry about ray tracing. Consoles will have it, but PC games - what do we have? Two games that use different approaches, and neither does full ray tracing? And what about the development cycle? Blizzard "considers" ray tracing for D4. When will it be out? I'd say at least three years...

Besides, the gains from hardware acceleration are somewhere between "probably" and "maybe". Every current CPU and GPU can do the required math. Whether a change to the CUs would bring a gain (or is even possible at all) is one question; another is whether it sacrifices something else for it, e.g. slowing down something that is actually used.

NVIDIA uses it as a questionable marketing trick; "early adopters" just get what they typically get - an option they don't need. Yet. In two years, users, regardless of their knowledge, will probably be looking for a new GPU anyway, and that is when ray tracing (and its implementation, and its natural slowness as a property of the algorithm) can become an issue.

Other than that, good review.
There are currently several titles out already with RT, and more (AAA titles too) are planned.

I don't think any title does "full" RT of a scene. That was never its intent. ;)
 
Joined
Oct 12, 2019
Messages
128 (0.08/day)
"Are planned" is the key phrase here. I mentioned Blizzard just as an example; the pipeline for releasing games takes years. Look around and the majority of titles are still DirectX 11...

I don't think any title does "full" RT of a scene. That was never its intent. ;)

No, that *has* to be the goal, or it isn't ray tracing. We already have substitutes... The reason it is implemented only partially is most likely that no GPU today can do full real-time ray tracing. People (outside the circle of those who have actually done some rendering) usually get this wrong: there is no partial ray tracing. Each light ray is either calculated completely (with all recursions) or it is not. No middle ground. I *did* say "none full", and I meant exactly that. True, *some* things may look better - but they can also look better on any other CPU/GPU, depending on its compute capacity.

Don't get me wrong, it's just that I worked with rendering methods for a long time. There is just one full or "pure" ray-tracing method. There are a few "shortcuts" that do less math, but the difference is usually quite noticeable. Also, ray tracing is not necessarily photorealism. I, too, am excited to see this huge step in computer graphics, but I'm also a bit skeptical about it happening that soon: more objects, vertices, light sources, and shadows mean exponentially more calculation. The first truly ray-traced games will need to reduce something (most likely light sources; next, reflective surfaces; and so on). They can still look good... We'll see, in time...
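To show what I mean by "every ray, with all recursions", here's a toy sketch. The scene is completely made up (just a list of surfaces a ray bounces through, each with its own emitted light and reflectivity); the point is the structure: the color of the first hit depends recursively on every deeper bounce, down to the depth cut-off.

```python
# Toy "full" ray trace: a ray bounces down a list of surfaces, each bounce
# adding that surface's own light scaled by the accumulated reflectivity.
# Invented scene, but it shows the all-recursions-or-nothing structure:
# the value at depth 0 depends on every deeper bounce.

def shade(surfaces, i=0, depth=0, max_depth=8):
    """surfaces: list of (emitted_light, reflectivity) the ray hits in order."""
    if depth > max_depth or i >= len(surfaces):
        return 0.0
    emitted, reflectivity = surfaces[i]
    # own light + whatever arrives via the reflected ray (the recursion)
    return emitted + reflectivity * shade(surfaces, i + 1, depth + 1, max_depth)

# Third surface is matte (reflectivity 0), so the recursion ends there:
scene = [(0.1, 0.9), (0.2, 0.5), (0.7, 0.0)]
print(shade(scene))  # 0.1 + 0.9 * (0.2 + 0.5 * 0.7)
```

Cut the recursion short (lower max_depth) and you get a different, darker image; that's why the "shortcuts" are visible.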
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Just look up the list of expected titles, bud. ;)

As far as splitting hairs over RT and what NVIDIA is doing... I just don't care. Nobody claimed it was full-scene RT, man... so, coolio, but not really a talking point. ;)
 
Joined
Oct 12, 2019
Messages
128 (0.08/day)
Did you, by chance, read anything except the first sentence? I'll shorten it:
- Ray tracing does a LOT of math
- To be called ray tracing, it must trace every light ray, with all recursions
- Implementing it on a whole scene with lots of light sources and reflections requires either enormous computing power, or the scene must be kept simple by design, which does not look good - especially on "non-ray-tracing" (TM NVIDIA) GPUs
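A back-of-the-envelope count makes the "LOT of math" point concrete. Assume the worst case: every hit spawns one shadow ray per light plus a reflected AND a refracted ray (branching factor 2). The numbers (8 lights, 1080p) are illustrative only, not from any real renderer.

```python
# Worst-case ray count per primary ray: each hit costs one shadow ray per
# light and spawns two child rays (reflection + refraction), so the total
# grows exponentially with recursion depth.

def rays_per_primary(num_lights, max_depth, depth=0):
    if depth > max_depth:
        return 0
    children = 2 * rays_per_primary(num_lights, max_depth, depth + 1)
    return 1 + num_lights + children  # this ray + shadow rays + branches

for depth in (1, 2, 4, 8):
    per_pixel = rays_per_primary(num_lights=8, max_depth=depth)
    print(f"depth {depth}: {per_pixel:>5} rays/pixel, "
          f"{1920 * 1080 * per_pixel / 1e9:.1f}G rays per 1080p frame")
```

That is rays per *frame*; multiply by 60 fps and you see why real scenes have to cut light sources, reflections, or depth somewhere.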
 
Joined
May 11, 2009
Messages
3 (0.00/day)
There are people out there who aren't aware. I know that you guys commenting are experts. This is the same thing as "no analog VGA on DVI" back in the day, or "no integrated graphics" on Ryzen. I'm trying to educate people here; it would be easy to just hide it and not face your criticism.

Also, it's not impossible that ray tracing will take off in a year or two, driven by next-gen consoles and NVIDIA. All I want is for people to think about the points I make and come to their own conclusions.

I see where both sides are coming from. May I suggest an alternative third section for exactly this type of issue, like a yellow/warning-symbol entry noting that it has no RT cores. A "Bad" for AMD because it doesn't have NVIDIA RTX? That seems wrong/inaccurate.
 

namesurename

New Member
Joined
Dec 17, 2019
Messages
1 (0.00/day)
Hello, a little update from somebody who bought the actual card.
In short: it is a disaster.

Tested as-is in the Time Spy benchmark: GPU temp up to 90 °C, edge temp 100+ °C, loud as a jet engine (3,000 rpm max).
Tried to undervolt it and set a fan profile. It helped, but not much; the temps are still absolutely unhealthy, and the fans can't cool it even at 1,600 rpm.

So I remembered what Gamers Nexus Steve did, and repasted my card.
The memory modules are properly covered with thermal pads and everything, and the GPU die and heatspreader make proper contact judging by the thermal paste pattern... But the thermal paste itself... Well, I can't call it a paste; it was a dry, chunky, plastic-like substance.

After that, with a healthy undervolt of 1,905 MHz at 1,052 mV (maybe it could go a bit lower, but at 950 mV it crashes), a less aggressive fan profile, and a -5% power limit, it's at last quiet and still retains 95% of its performance. Still 75 °C GPU and 86 °C hotspot, but that's fine. At least it's a decent card now.

Still, for a really big heatsink (and it is BIG) to have issues like this, I think they really cheaped out on the heat pipes.
Or maybe I just got unlucky?
 

Firestorm1439

New Member
Joined
Dec 29, 2019
Messages
3 (0.00/day)
I bought this card about two weeks ago. The first thing I noticed is that the card boosts way above the boost clock target, often reaching 2,080 MHz for extended periods. Temperatures with a 26 °C ambient are GPU 75-78 °C, memory 75 °C, VRM 68 °C. So far I am super impressed; perhaps I just won the silicon lottery with this sample. I haven't had any issues with driver stability either.
 
Joined
Jul 10, 2015
Messages
749 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
Lol, is the ambient that hot before or after gaming?
 