
Leaked AMD Radeon RX 7700 & RX 7800 GPU Benchmarks Emerge

W1zzard's power consumption measurements for gaming aren't very conclusive, since he's using only 1(!) game (Cyberpunk 2077) as a reference. ;) Optimus Tech used at least 6 games for his tests, where the RTX 4080 draws around 100 W to 200 W less power than the 7900 XTX. In CS:GO the 4080 even draws ~60% less power, that's just crazy! And he didn't even use a lower power target, which would bring the numbers down even more, lol.




Also, if you're looking at the 100 most played games on Steam, you will find maybe a handful of heavily demanding AAA games. :) Folks mostly play older or indie games, and that's what matters in the end.
Yes, an Nvidia-sponsored title, seems fair?!?

@W1zzard, couldn't 3 non-sponsored titles, or even one, make up the power figures? That game was made with Nvidia hardware in mind every step of the way.
 
AMD is actually losing market share to Intel, not gaining share from Nvidia. Latest GPU figures are sad for all but terrible for AMD. I'm skipping this gen entirely. I thought Turing was bad.
That is their strong OEM relations at work. Still, it hurts AMD. We'll see how the long term trend plays out.
 
couldn't 3 non-sponsored titles, or even one, make up the power figures? That game was made with Nvidia hardware in mind every step of the way.
There's non-sponsored titles out there? Those publishers are leaving money on the table!

I need a game that's AAA, highly popular, well-optimized for, somewhat recent, supports RT, not CPU limited

What would you pick other than Cyberpunk?
 
There's non-sponsored titles out there? Those publishers are leaving money on the table!

I need a game that's AAA, highly popular, well-optimized for, somewhat recent, supports RT, not CPU limited

What would you pick other than Cyberpunk?
I don't have all the answers either and didn't suggest I did.

It's worth a conversation, I would say, if you too have no idea, because using one game heavily supported by one OEM is not unbiased.
 
one game heavily supported by one OEM is not unbiased
Both AMD and Intel have invested a lot of resources into optimizing Cyberpunk, and I'm sure they are looking at it right now, because of that upcoming expansion
 
Both AMD and Intel have invested a lot of resources into optimizing Cyberpunk, and I'm sure they are looking at it right now, because of that upcoming expansion
So it's fair because, now, Intel and AMD are optimising a game made with Nvidia's help?!? OK, I doubt my company could get past ISO with that argument.
 
Both AMD and Intel have invested a lot of resources into optimizing Cyberpunk, and I'm sure they are looking at it right now, because of that upcoming expansion
Hogwarts Legacy is the top selling game of 2023, and by now, it runs fine though it may pose problems for the V-Sync test with any cards slower than the 3070. It shouldn't replace CyberPunk, but you can use CyberPunk and a couple of other games to measure the average and peak power consumption.
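As a rough sketch of what that multi-game aggregation could look like (all wattage numbers below are made up for illustration, not actual measurements):

```python
# Hypothetical per-game power readings in watts; the values are illustrative only.
readings = {
    "Cyberpunk 2077": {"avg": 280, "peak": 320},
    "Hogwarts Legacy": {"avg": 265, "peak": 340},
    "Resident Evil 4": {"avg": 240, "peak": 300},
}

# Average power: mean of the per-game averages; peak: worst case across all games.
avg_power = sum(g["avg"] for g in readings.values()) / len(readings)
peak_power = max(g["peak"] for g in readings.values())
print(f"average: {avg_power:.0f} W, peak: {peak_power} W")
```

The idea being that the headline "gaming power" number reflects several titles, while the peak still captures the single worst case.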
 
Hogwarts Legacy
Yeah that could be a replacement for CP, but it's NVIDIA sponsored, too, afaik, and takes forever to compile shaders every time you start it
 
Yeah that could be a replacement for CP, but it's NVIDIA sponsored, too, afaik, and takes forever to compile shaders every time you start it
Resident Evil 4 has also sold well and isn't Nvidia sponsored, but its RT implementation is minimal.
 
Resident Evil 4 has also sold well and isn't Nvidia sponsored, but its RT implementation is minimal.
# of Steam reviews:
Cyberpunk: 550k
Hogwarts: 150k
RE4: 55k

I think CP is the most-rated game with RT support, which should be a decent proxy for popularity
 
# of Steam reviews:
Cyberpunk: 550k
Hogwarts: 150k
RE4: 55k

I think CP is the most-rated game with RT support, which should be a decent proxy for popularity
Thanks for taking the time to discuss this. CyberPunk has been on the market for over 31 months compared to 5 months for Hogwarts, but that is another point in its favour: maturity. I'm not advocating for replacing CP; I just think you should base the results on more than one game. However, if it's too time-consuming to add another game, then CP 2077 is perfect for all the reasons you gave.
 
# of Steam reviews:
Cyberpunk: 550k
Hogwarts: 150k
RE4: 55k

I think CP is the most-rated game with RT support, which should be a decent proxy for popularity
I would suggest 3 tests:

CP2077, something to balance it like Jedi: Survivor, and a compute load from a reputable source. I would have said F@H; it's worthy but not replicable.
 
Obviously the only game that should be tested is Ashes of the Benchmark. Even though it is old, it is probably still the best-optimized DX12 game.
 
Both AMD and Intel have invested a lot of resources into optimizing Cyberpunk
I dunno what AMD have been investing, but almost half the time the game gets updated, AMD cards run it slower and slower and slower:

https://www.techpowerup.com/review/amd-radeon-rx-6700-xt/11.html. This is how it's been 2 years ago. 56 FPS on a 6700 XT.
https://www.techpowerup.com/review/msi-geforce-rtx-3080-suprim-x-12-gb/11.html. One year ago, 50 FPS on a 6700 XT.

My own benchmarks: 69 FPS, ver. 1.6. 57 FPS, ver. 1.63. Settings: 1440p, Ultra, no RT, no FSR.
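For scale, the regression in those two numbers works out like this (simple arithmetic on the FPS figures quoted above):

```python
# FPS figures from the post: same card, same settings, different game versions.
fps_v160 = 69   # ver. 1.6
fps_v163 = 57   # ver. 1.63

# Relative performance drop between versions, in percent.
drop_pct = (fps_v160 - fps_v163) / fps_v160 * 100
print(f"performance drop: {drop_pct:.1f}%")  # ~17.4%
```

That's roughly a sixth of the card's performance gone between patches.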

FSR, though, is completely broken as of now. EXTREME ghost-glitching whilst driving anything faster than walking speed. Unplayable. Only fixed by a 3rd-party modder; CDPR hasn't even bothered to admit the issue.

On the other hand, nVidia GPUs run it faster and faster and faster. And DLSS doesn't break after patches! And works awesome!

Honestly, it feels like both AMD and CDPR are not interested in how AMD GPUs behave in this game. That's why I'm really considering throwing CP2077 out of the VSync 60 Hz benchmarking as soon as it is humanly possible.
 
It's going to be screwed, like everything else this generation.

Well, we have this thing called "past precedent". We know that AMD is not interested in providing good perf/$, and even with disappointing cards they still under-deliver on price. AMD is not a budget brand; the CEO herself has stated this more than once.

We have watched AMD release a flat line in perf/$ ratios compared to rDNA2. So why would that change now?

What is a "budget" brand? A budget brand is one that operates under significantly lower operating expenses/costs (maybe in countries in Africa, or India, China, Eastern Europe) and has relatively low reputation and popularity.
AMD cannot operate under lower expenses than its competitors, but it is obviously the brand with a tiny market share.
AMD's CEO is wrong and must change.
 
There's non-sponsored titles out there? Those publishers are leaving money on the table!

I need a game that's AAA, highly popular, well-optimized for, somewhat recent, supports RT, not CPU limited

What would you pick other than Cyberpunk?

I think you should use a different set of games for the power consumption measurements, in addition to your regular max performance/consumption tests. I would focus on less demanding & most played games; there's a lot to pick from on SteamCharts. Guess MP games are kinda harder to test, but there are also enough SP games on the list. Just some examples:

MP: Counter-Strike: Global Offensive, Dota 2, PUBG: BATTLEGROUNDS, Apex Legends & Call of Duty®: Modern Warfare® II | Warzone™
SP: Grand Theft Auto V, Cities: Skylines, The Witcher 3: Wild Hunt, Fallout 4 & Stellaris

Also test the efficiency with lower power targets. Performance-per-Watt sweet spot. Nobody cares about overclocking, esp. since there is nothing to gain for most games. Power consumption, heat & noise are way more important to most of the folks out there. ;)
 
I think you should use a different set of games for the power consumption measurements, in addition to your regular max performance/consumption tests. I would focus on less demanding & most played games; there's a lot to pick from on SteamCharts. Guess MP games are kinda harder to test, but there are also enough SP games on the list. Just some examples:

MP: Counter-Strike: Global Offensive, Dota 2, PUBG: BATTLEGROUNDS, Apex Legends & Call of Duty®: Modern Warfare® II | Warzone™
SP: Grand Theft Auto V, Cities: Skylines, The Witcher 3: Wild Hunt, Fallout 4 & Stellaris

Also test the efficiency with lower power targets. Performance-per-Watt sweet spot. Nobody cares about overclocking, esp. since there is nothing to gain for most games. Power consumption, heat & noise are way more important to most of the folks out there. ;)

Getting consistent numbers from multiplayer online games that involve unpredictable situations is a royal PITA. It's likely not feasible to do as often as needed for every GPU and CPU that W1zz tests. Maybe once as a special during "slow" (ha, not bloody likely!) times. Same for efficiency targeting. Maybe one special with representative high- but not top-end GPUs like the 4080, 7900 XT, 3080 Ti, 6800 XT. Nobody other than a crackpot like me would undervolt their 6600 XT and 1660 Super for efficiency reasons; there's simply not enough watts in it to matter.
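To put a number on that "consistency" problem: reviewers usually quantify run-to-run spread with the coefficient of variation (stdev relative to the mean). A minimal sketch, with entirely made-up FPS samples for a scripted benchmark versus a live multiplayer match:

```python
import statistics

# Hypothetical FPS samples from 5 repeated runs; values are illustrative only.
scripted_runs = [142.1, 141.8, 142.5, 141.9, 142.3]  # built-in benchmark scene
mp_runs = [138.0, 151.2, 129.7, 146.5, 133.9]        # live multiplayer match

def cv_pct(samples):
    """Coefficient of variation: run-to-run spread relative to the mean, in %."""
    return statistics.stdev(samples) / statistics.mean(samples) * 100

print(f"scripted: {cv_pct(scripted_runs):.2f}% | multiplayer: {cv_pct(mp_runs):.2f}%")
```

With spreads like the multiplayer one, you'd need many repeats per card to tell a real 5% difference from noise, which is exactly why it doesn't scale to every GPU/CPU review.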
 