
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

If it has flagship or very near flagship performance, it is a lite version of the flagship.

A quality which can be bestowed upon the RTX 3080 all the same. But no.
 
A quality which can be bestowed upon the RTX 3080 all the same. But no.
Can you even use a 3080 for 4K gaming with new games?
 
Can you even use a 3080 for 4K gaming with new games?

You cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it. From W1zz's latest game performance review:

[Image: performance-3840-2160.png, relative performance at 3840×2160]
 
@SailorMan-PT - please post in English, or your next posts will be removed. This is an English-speaking forum, thank you.

Please post in English ("Bitte auf Englisch posten"), since you wrote in German.
Does that mean that my jaw has to drop in front of the 4090's performance with no consideration towards its price or power consumption? Sorry, not gonna happen.

Since you mentioned it, even the 7900 XTX is way above the limit of what I consider a sensible price for a GPU. That's why I'm not affected by AMD's decision of not making a halo RDNA 4 GPU. If they can pull off a decent midrange card, I'm game.
Well, after watching and studying several benchmarks, there are cards that serve the mid-range market. Considering technical value relative to price, I would consider the AMD Radeon GPUs better, because of their advantage of larger video memory and a wider memory bus: 16 GB + 256-bit vs. 12 GB + 192-bit.
Those are the Radeon RX 7800 XT and RX 7900 GRE. I don't think I have to mention the respective NVIDIA GPUs, because they are well known.
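(A rough sketch of the bandwidth arithmetic behind that 256-bit vs. 192-bit point; the effective memory speeds below are assumptions for illustration, not figures from this thread.)

```python
# Back-of-the-envelope peak memory bandwidth: bytes per transfer * effective data rate.
def bandwidth_gb_s(bus_width_bits: int, effective_speed_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_speed_gbps

# Effective memory speeds are assumed for illustration only.
configs = {
    "16 GB, 256-bit @ 19.5 Gbps (assumed)": (256, 19.5),
    "12 GB, 192-bit @ 21.0 Gbps (assumed)": (192, 21.0),
}

for label, (bus_bits, speed) in configs.items():
    print(f"{label}: ~{bandwidth_gb_s(bus_bits, speed):.0f} GB/s")
```

Under those assumed clocks the wider bus still comes out ahead, roughly 624 GB/s vs. 504 GB/s, even though the narrower card runs faster memory.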

Those are the Radeon RX 7800 XT and RX 7900 GRE. I don't think I have to mention the respective NVIDIA GPUs, because they are well known.

@SailorMan-PT - please post in English, or your next posts will be removed. This is an English-speaking forum, thank you.

Please post in English ("Bitte auf Englisch posten").
Yeah, I understand, but my smartphone translated the English into German when I typed the reply, so it appeared in German. I thought I had written my reply in English; it was one big misunderstanding.
 
AMD may lose a golden opportunity to beat Nvidia this year

A year and a half after the launch of RDNA 3, AMD’s graphics card lineup has grown a little stagnant — as has Nvidia’s. We’re all waiting for a new generation, and according to previous leaks, AMD was getting ready to release RDNA 4 later this year. Except that now, we’re hearing that it might not happen until CES 2025, which is still six months away.
Launching the new GPUs in the first quarter of 2025 is a decision that could easily backfire, and it’s never been more important for AMD to get the timing right. In fact, if AMD really decides to wait until January 2025 to unveil RDNA 4, it’ll miss out on a huge opportunity to beat Nvidia.
...
But AMD’s gaming segment appears to be bleeding money. Its gaming revenue dropped by 48% year-over-year, and even AMD itself doesn’t expect it to get better.
Jean Hu, AMD’s CFO, recently talked about how the subpar sales of its Radeon GPUs affected the gaming segment in a big way. The company predicts that the revenue in that segment will continue to decline. Where’s AMD going to make money then? It’s simple: From its data center and client segment.

 
Nvidia ray tracing is a con. I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in an old movie like Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part 1 on PC. AMD should focus on getting higher polygon-count models and more detailed textures…
 
Nvidia ray tracing is a con. I mean, even Cyberpunk on a 4090 looks nowhere near as good as the CGI in an old movie like Lord of the Rings. AMD shouldn't bother with ray tracing; even on NVIDIA the performance degradation is too big. AMD should keep on being a Rasta man and use HDR like The Last of Us Part 1 on PC. AMD should focus on getting higher polygon-count models and more detailed textures…

Following this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway.
 
You cannot use a 6800 XT either, since it performs so poorly. Despite having only 10 GB, the 3080 still outperforms it. From W1zz's latest game performance review:

[Image: performance-3840-2160.png, relative performance at 3840×2160]
That is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.
 
That is at max settings.

If you drop them, you can use a 3080 or a 6800 XT at 4K.

That much seems obvious; the argument there (in this necro'd thread) is that the 6800 XT was superior for 4K gaming, and it's not.
 
Following this absurd logic, every single API feature developed in the past 20 years is a con, since you can get 70% of the job done with DirectX 9.0c anyway.
The effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…
 
The effect/difference ray tracing adds is subtle, whereas the performance degradation is anything but; that statement is beyond reproach…

I didn't specifically mention RT to begin with, but I disagree; it's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong, look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
 
I disagree; it's not something that can be truly appreciated unless you're already playing at 4K or higher, IMO. The atmospherics in Metro Exodus, for example, or more recently the full path tracing in Wukong, look absolutely phenomenal on the right setup.

At the end of the day, people make a big deal of RT because their hardware can't handle it and they cannot afford a suitable upgrade.
It's great advice to actually play something instead of just peeping at screenshots.

Because the best-looking thing we could do for a screenshot is very high-quality baked lighting that wouldn't be dynamic at all. It doesn't matter, screenshots don't move. This is also why so many old games still look good and seem to have good lighting in screenshots: it's just static.

When you get to dynamic lighting, the non-RT techniques have many flaws. They can look somewhat good, but they easily fall apart. RT does a much better job in those kinds of scenarios. It just makes the lighting much more realistic, which at the same time makes it less obvious. That is a strange thing, but it is the case. Like I said previously, that is a style you go for.

Right now, the main problem I see with path tracing and RT in general is not really the performance impact, it's the quality of the denoiser. Movies just brute-force that issue with many more rays and heavy offline denoising algorithms.

DLSS 3.5 is a bit better on that front, but it still has many flaws. Having a quality denoiser will be key to making RT look like what we see in movies.

As for the performance impact, this is just the current impact. Most of the shaders we use today would destroy the first few generations of GPUs that supported them. In 2-3 generations, even mid-range cards should have plenty of power to run RT at 1080p.
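(To illustrate the denoiser point, here is a toy Monte Carlo sketch for a single hypothetical pixel, not taken from any real renderer: noise shrinks roughly with the square root of the sample count, which is why movies can brute-force it with thousands of rays per pixel while real-time RT has to lean on a denoiser.)

```python
import random
import statistics

# Toy single-pixel Monte Carlo estimator (hypothetical scene, for illustration):
# each ray "hits the light" with 30% probability, so the true pixel value is 0.3.
def pixel_estimate(samples_per_pixel: int) -> float:
    hits = sum(1 for _ in range(samples_per_pixel) if random.random() < 0.3)
    return hits / samples_per_pixel

# The spread (stdev) across repeated renders is the visible noise; it falls
# roughly as 1/sqrt(samples per pixel).
for spp in (1, 4, 64, 1024):
    estimates = [pixel_estimate(spp) for _ in range(200)]
    print(f"{spp:5d} spp: mean={statistics.mean(estimates):.3f}  "
          f"stdev={statistics.pstdev(estimates):.3f}")
```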
 
If MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card of 9700XT/7970 levels of greatness.
 
6800 XT was superior for 4K gaming, and it's not.
Neither of them is good for 4K without dropping settings, so you are correct.
 
If MLID or RGT say something good, then at least 80% of the time it's guaranteed to be late and slow again. Only when these two declare that AMD has lost it should we all start waiting for AMD to pull out a card of 9700XT/7970 levels of greatness.

The current situation is so bad anyway, because the current lineup is already two years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust the pricing accordingly, so that at least some stock moves off the warehouse shelves.
 
The current situation is so bad anyway, because the current lineup is already two years old and needs optimisations and updates for 2024/2025 usage.
The only thing that matters now is to release something (anything) new, no matter if it's good or bad, then adjust the pricing accordingly, so that at least some stock moves off the warehouse shelves.
Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?

[Image: table of GPU launch dates]


Nothing in the current AMD 7000-series lineup is 2 years old yet, most of it isn't even 1 year old yet - and the AMD 7000-series is younger than the RTX 40-series and the Intel Arc series.
 
Why are you so bad at super-basic, incredibly easy-to-check facts, Arf?

[Image: table of GPU launch dates]

Nothing in the current AMD 7000-series lineup is 2 years old yet, most of it isn't even 1 year old yet - and the AMD 7000-series is younger than the RTX 40-series and the Intel Arc series.
Also, what if it's old? What matters is that it still works, right?
 
That much seems obvious; the argument there (in this necro'd thread) is that the 6800 XT was superior for 4K gaming, and it's not.
I own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
 
I own a 6800 XT with the snot overclocked out of it. It benchmarks faster than a 7900 GRE, and what you are saying is correct.
Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will they. There is only a game and its performance. GPUs move along doing what they can, and especially now with dynamic 'anything' in graphics... if you put up with high latency and lower IQ, you can have 4K on many cards...
 
Throw a UE5 game like Black Myth at a 4090 and it's not a real 4K killer either... it manages, at best ;) OTOH, throw Minesweeper at a 6800 XT and it'll do 4K120...

4K cards don't exist and never have, nor will they. There is only a game and its performance. GPUs move along doing what they can, and especially now with dynamic 'anything' in graphics... if you put up with high latency and lower IQ, you can have 4K on many cards...
True, it's a weird situation; current hardware seems off. I'm hoping to upgrade to Blackwell, but I'd want 30% more performance than a 4090 to feel comfortable parting with the cash a 5090 will cost. The 6800 XT is a hell of a card, so much so that I kept it after going through three 7900 XTX cards that all had issues with drivers and latency. Software is too important, and my 6800 XT and four-month hell with the 7900 XTX have shown me AMD still sucks in that department; the Adrenalin software is terrible in so many ways. I'm going back to Nvidia, the premium is worth it IMO. But I'll be putting my 6800 XT on the wall with my other favourite cards.
 
20 months old, give or take 3-4 months, while the design process can be traced back at least 3 or 4 years earlier.
So, a 2018 thing. :D
Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the "already 2 years" that you wrote, which quite unambiguously means more than 24 months.

As for the design process being 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense - please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.
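(For reference, the age arithmetic; the launch dates below are as commonly reported and the "as of" date is an assumption based on when this exchange took place, so treat the exact figures as illustrative.)

```python
from datetime import date

# Launch dates as commonly reported (assumptions, not data from this thread).
launches = {
    "RX 7900 XTX / XT":     date(2022, 12, 13),
    "RX 7600":              date(2023, 5, 25),
    "RX 7800 XT / 7700 XT": date(2023, 9, 6),
    "RX 7900 GRE (global)": date(2024, 2, 27),
}

as_of = date(2024, 8, 25)  # roughly when this exchange took place (assumed)

for name, launched in launches.items():
    months = (as_of.year - launched.year) * 12 + (as_of.month - launched.month)
    print(f"{name}: ~{months} months old")
```

Which matches the 6-to-20-month range above: the oldest RDNA 3 card comes out at roughly 20 months, not over 24.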
 
Yes, I literally posted a table with all of the launch dates for you, ranging from 6 to 20 months. 20 months is not the "already 2 years" that you wrote, which quite unambiguously means more than 24 months.

As for the design process being 2018, what does that have to do with anything? Every CPU and GPU in the last 30 years has been in development for multiple years before launch. Once again, you're spouting nonsense - please stop, or at least do a basic sanity check on whether what you type is sane, relevant, or worthwhile.

The one spouting nonsense is you.
You are counting the period between a product going out into the wild and the present moment, which is the wrong way; the right way to measure how old a product is, is to count from its set-in-stone decision, tape-out, or finalized feature set. Between that moment in time and the physical release to the wild, there can be multiple other milestones, feature-set updates, etc.

Also, what if it's old? What matters is that it still works, right?

Wrong. AMD's market share is close to non-existent in the OEM market, which means that while the products do indeed physically work, they are market failures.

88% of all GPU shipments are NVIDIA; the rest are AMD and Intel.

[Image: GPU shipment market share chart]

 