
PowerColor Radeon RX 7900 GRE Hellhound

Boorock70

New Member
Joined
Jul 13, 2023
Messages
2 (0.01/day)
That insane video playback power consumption will be a real problem for AMD once people realize it draws 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those watts tbh...
 

ARF

Joined
Jan 28, 2020
Messages
4,020 (2.56/day)
Location
Ex-usa
That insane video playback consumption will be a real problem for AMD

Let's not put the whole of AMD or all of its products under a common denominator. Some AMD products are great, running with exceptionally low video playback power consumption, while others are bad; the 7900 GRE in particular is heading in the right direction.
The previous Sapphire review showed disastrous power consumption, the worst of them all:

26 February, this year:
1711214564971.png


Today:
1711214603711.png


Radeon RX 6600 and Radeon RX 6600 XT are the best:

1711214643533.png

1711214656586.png


I do hope AMD will fix the awful Counter-Strike 2 performance, because I want my game running at 200+ FPS, not at 130ish FPS. :banghead:

I read https://www.reddit.com/r/GlobalOffensive/comments/16v83ig and by the looks of it, there is a thing called DXNAVI which causes the problems.
 
Joined
Jun 5, 2020
Messages
6 (0.00/day)
I'm probably unlucky, but my memory won't go over 2500 MHz on my Sapphire Pure.

My maximum score is 74 FPS in Time Spy GT, I guess I lost the lottery :(
 
Joined
Aug 7, 2019
Messages
343 (0.20/day)
That insane video playback power consumption will be a real problem for AMD once people realize it draws 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those watts tbh...
Let's be real for a second here. What would be a "tolerable" power consumption? Minus 20 W, like the middle of the table? 20 W is 0.02 kW, i.e. 0.02 kWh for each hour of playback; 0.02 kWh * 0.20 €/kWh = 0.004 € per hour is the difference between running this one and one in the middle of the table. Double the price of electricity to 0.40 €/kWh? Still less than one cent per hour between the two.

Let's work out how much YouTube/whatever you'd need to watch to make a 10 € difference: 10 € / 0.004 €/h = 2500 hours.

The difference is huge in relative terms, but not something to slit your veins over.
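The same arithmetic as a quick Python sketch, for anyone who wants to plug in their own tariff; the 20 W delta and 0.20 €/kWh price are just the example values from above:

```python
# Back-of-the-envelope cost of a GPU power-draw delta during video playback.
# The 20 W delta and 0.20 EUR/kWh tariff are the example values from the post.

def extra_cost_per_hour(delta_watts: float, eur_per_kwh: float) -> float:
    """Extra cost in EUR for each hour the delta is sustained."""
    return (delta_watts / 1000.0) * eur_per_kwh  # W -> kW, then kWh * EUR/kWh

def hours_for_difference(target_eur: float, delta_watts: float, eur_per_kwh: float) -> float:
    """Hours of playback before the delta adds up to target_eur."""
    return target_eur / extra_cost_per_hour(delta_watts, eur_per_kwh)

print(extra_cost_per_hour(20, 0.20))       # 0.004 EUR per hour
print(hours_for_difference(10, 20, 0.20))  # 2500.0 hours for a 10 EUR difference
```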
 

Boorock70

New Member
Joined
Jul 13, 2023
Messages
2 (0.01/day)
Let's not put the whole of AMD or all of its products under a common denominator. Some AMD products are great, running with exceptionally low video playback power consumption, while others are bad; the 7900 GRE in particular is heading in the right direction.
The previous Sapphire review showed disastrous power consumption, the worst of them all:

26 February, this year:
View attachment 340301

Today:
View attachment 340302

Radeon RX 6600 and Radeon RX 6600 XT are the best:

View attachment 340303
View attachment 340304

I do hope AMD will fix the awful Counter-Strike 2 performance, because I want my game running at 200+ FPS, not at 130ish FPS. :banghead:

I read https://www.reddit.com/r/GlobalOffensive/comments/16v83ig and by the looks of it, there is a thing called DXNAVI which causes the problems.
True!
I should say RX 6800 (and up), because I do own an RX 6700 XT with 12 W power consumption during video playback... and I'm OK with that :peace:
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Impressive. Too impressive...
1711223889935.png

Nearly 10 °C better than the Nitro+, which clearly has a slightly 'beefier' cooler.

Physics is telling me there must be something wrong with the methodology (ambient temps? GPU orientation?),
or
Powercolor is the only AIB using superior TIM and/or mounting pressure.

1711223520186.png
1711223531484.png


1711223561869.png1711223575185.png

1711223644321.png1711223675047.png

1711223715764.png1711223746620.png

It almost looks like Powercolor opted for much thinner thermal pads, and the TIM remnants look a different color (different TIM).
So, maybe Powercolor actually did make a superior assembly?


NGL, I'm a little bit miffed that my 'premo' Nitro+ is being beaten (thermally) by some Wet Nose*.

1711224412444.png1711224675303.png
*Someone over at Powercolor is a BattleTech fan.
 
Joined
Apr 17, 2021
Messages
525 (0.47/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
That insane video playback power consumption will be a real problem for AMD once people realize it draws 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those watts tbh...

Is this a troll post? This is not a laptop. Many RTX cards idle at 20 W; what is 37 W while watching YouTube, lmao. "Insane", I don't even know how to respond to that nonsense. People used to use a 100 W lightbulb and think nothing of it; now, if your video card uses 10 W more than the other guy's while watching YouTube, it is "insane".

10 W, max 15 W is required. This is a very low-load task; you can't overload the GPU to insane numbers like 40 or 50 watts...

"insane" another troll... just say "it uses 10W more than the competition" to watch video...

---

I see where this narrative is going. I hope these are the same people that don't buy Intel.
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Is this a troll post? This is not a laptop. Many RTX cards idle at 20 W; what is 37 W while watching YouTube, lmao. "Insane", I don't even know how to respond to that nonsense. People used to use a 100 W lightbulb and think nothing of it; now, if your video card uses 10 W more than the other guy's while watching YouTube, it is "insane".



"Insane", another troll...

---

I see where this narrative is going. I hope these are the same people that don't buy Intel.
'Green Mindshare' :laugh:

Won't go so far as to say 'paid agitator', but they do 'learn' from them:
-Pick one point, take it to an extreme, and get everyone focused on it.
(Bonus points for purposely muddling and conflating data that runs counter to one's argument)
 
Joined
Feb 14, 2008
Messages
988 (0.17/day)
Location
Cincinnati, Ohio
Processor AMD Ryzen 1700 @ 3.825ghz 1.36v
Motherboard ASRock x370 Taichi
Cooling Noctua NH-C14s
Memory Corsair Vengeance RGB 16gb 3200
Video Card(s) Gigabyte GTX 1080 ti
Display(s) LG 4k IPS
Case Be Quiet Pure Base 600 + more fans
Audio Device(s) Sound Blaster Z
Power Supply EVGA P2 750w
Mouse G900
Keyboard G810
Software Windows 10 Pro
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself, but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod, because it looks that much better.

XeSS Balanced modded into Alan Wake 2 looked better than FSR Quality.

I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself, but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod, because it looks that much better.

XeSS Balanced modded into Alan Wake 2 looked better than FSR Quality.
XeSS works in CP'77. I too thought it looked 'a little better' vs. FSR. However, FSR is actively being improved.
Me, though: CP'77 MTFO is the *only* title where I need anything like FSR/XeSS. @ 1080P, my GRE is just about the right level of overkill.
I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
Please. Yes.
The GPU (vBIOS) database facilitates this, but it seems like those of us w/ GREs are 'shy' :laugh:

Historically, Samsung is the superior option. However, here... unless there are some major latency/timing differences, it's looking like 20 Gbps GDDR6 from Samsung and Hynix are 'equal' in quality.
My Nitro+ has Hynix RAM and does the same 2600-2614 MHz as seen in the Hellhound review (which was equipped w/ Samsung VRAM).

Entirely unlike what occurred w/ HBM2 on the Vega 56/64, etc., and the RX 580's GDDR5:
early-production V56 and V64 are all Hynix IIRC, and can barely hit 1 GHz HBM clocks, while my Vega 10 16GB MI25s (all Samsung HBM2) can all pull off 1107 MHz or higher.
On Polaris, the couple of Samsung-equipped 580s I had mined more efficiently than the Hynix-equipped examples.


I almost grabbed a Hellhound off Amazon when my Gigabyte GRE from NE came out of the box in pieces.
Kinda regretting that after seeing the markedly better thermal performance and Samsung VRAM,
but there were no trusted reviews of the Hellhound at the time.

The *advertised* purple fans were a bit of an aesthetic turn-off; plus, I'm more a Periphery Freebirth, not some canister-born Clanner :D
 
Joined
Aug 7, 2019
Messages
343 (0.20/day)
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself, but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod, because it looks that much better.

XeSS Balanced modded into Alan Wake 2 looked better than FSR Quality.

I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
Last time I checked, Intel GPUs don't have DLSS, and it's not listed under their "cons".

FSR or XeSS being better or worse than each other is irrelevant, since both of them and DLSS are garbage. It's like having a contest where everyone dies and the rotten corpse of the winner is exhibited as if it were any better than the other two.
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Last time I checked, Intel GPUs don't have DLSS, and it's not listed under their "cons".

FSR or XeSS being better or worse than each other is irrelevant, since both of them and DLSS are garbage. It's like having a contest where everyone dies and the rotten corpse of the winner is exhibited as if it were any better than the other two.
Not the exact PoV I'd take but...
Agreed

XeSS and FSR are non-proprietary and 'algorithmic'.
DLSS is 100% nVidia-proprietary, ML-based, and 'trained' by nVidia on a title-by-title basis.

I'd 'argue' that DLSS is 'better' but...
A. It's something entirely different, functionally.
B. All these 'intelligent' scaler/pixel generators are crap.
You lose so much detailed definition in faces and objects/textures, and can feel 'the fuzziness' in motion. Whatever happened to 'crispness'?


People seem to think the GRE is 'overkill' or 'excessive' for 1080P gaming; they call it 'a 1440P card'.
I disagree. It *is* the 1080P card.
Once you subtract out DLSS/FSR/XeSS, modern titles need the grunt to perform smoothly.
 
Joined
Aug 7, 2019
Messages
343 (0.20/day)
Not the exact PoV I'd take but...
Agreed

XeSS and FSR are non-proprietary and 'algorithmic'.
DLSS is 100% nVidia-proprietary, ML-based, and 'trained' by nVidia on a title-by-title basis.

I'd 'argue' that DLSS is 'better' but...
A. It's something entirely different, functionally.
B. All these 'intelligent' scaler/pixel generators are crap.
You lose so much detailed definition in faces and objects/textures, and can feel 'the fuzziness' in motion. Whatever happened to 'crispness'?


People seem to think the GRE is 'overkill' or 'excessive' for 1080P gaming; they call it 'a 1440P card'.
I disagree. It *is* the 1080P card.
Once you subtract out DLSS/FSR/XeSS, modern titles need the grunt to perform smoothly.
The problem is that everyone is so busy declaring a winner that the fact they are all garbage is lost in the discussion.

Example: https://www.techpowerup.com/review/starfield-xess-1-2-vs-dlss-3-vs-fsr-3-comparison/


How can you say any of them are ANY good when you have a massive, screen-tall artifact staring at you in the middle of the screen? Do you know which one wins? NO UPSCALING, because it doesn't introduce artifacts, blurriness and so on!

At least now the media outlets have stopped hammering on the "better than native" BS. My fucking god, it was like being in They Live and only a handful of us had the glasses.
 

Btzen

New Member
Joined
Mar 23, 2024
Messages
5 (0.10/day)
The V-sync 60 Hz test stood out to me in this one, and I was wondering if anyone had any insight on it.

power-vsync1.png


It's a huge difference across two of the same model card on the same driver. The Nitro+ only sits at 103 W in this test, so it's not even a general increased-power-draw difference.
The graph for the Hellhound shows much more variation in power draw from data point to data point than any of the other GRE cards. I almost thought it might have been a testing error because the difference is so huge, but I assume that would have been retested once noticed?

Is this likely to be a BIOS issue, maybe unfixable because, as far as I know, PowerColor doesn't update the BIOS on their cards, or something drivers could sort out?

I'd also be curious to know if this sort of behavior happens in other games when the framerate is capped, because this would have a huge effect on the overall power efficiency of the card when it's not running at 100%, which I would argue most games don't, unless you run everything with uncapped FPS regardless of screen sync. I'm still learning about power draw on cards, so any insight is appreciated.

Edit: Adding an overlaid chart of the power data-point graphs to show what I mean above about the strange behavior, and how much more variation there is at V-sync than in any other review I've seen on this site.

power-consumption1.png
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
The V-sync 60 Hz test stood out to me in this one, and I was wondering if anyone had any insight on it.

View attachment 340384

It's a huge difference across two of the same model card on the same driver. The Nitro+ only sits at 103 W in this test, so it's not even a general increased-power-draw difference.
The graph for the Hellhound shows much more variation in power draw from data point to data point than any of the other GRE cards. I almost thought it might have been a testing error because the difference is so huge

Is this likely to be a BIOS issue, maybe unfixable because, as far as I know, PowerColor doesn't update the BIOS on their cards, or something drivers could sort out?

I'd also be curious to know if this sort of behavior happens in other games when the framerate is capped. I'm still learning about power draw on cards, so any insight is appreciated.
My immediate guess:
Powercolor purposely altered the frequency/voltage curve to minimize 'power state stutter'*. No gamer or enthusiast will ever want to use VSYNC; it's a huge 'latency add'. The only time I've EVER used VSYNC is when a game's lighting or physics get 'messed up' by excessive framerate.

(IMEO) Basically, the vBIOS is set to 'hold' higher clocks and voltages, rather than downclocking so aggressively for momentary 'power savings'.

In my experience across Polaris, Vega and Navi, the bigger the delta in moment-to-moment reported clocks, the more you 'feel' some kind of hitching/stuttering.
It's a well-documented cause of high Avg and good Min FPS but a poor 'experience' (due to frametime inconsistency).
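To show what 'frametime inconsistency' means here, a toy Python sketch with made-up frametimes (not TPU's data or methodology): two traces with the same average FPS, where only the spread differs:

```python
# Toy example: two invented frametime traces (ms) with identical average FPS.
# Average FPS hides the jitter; the spread (stddev / worst frame) exposes it.
from statistics import mean, pstdev

steady  = [10.0] * 8                                      # perfectly even pacing
jittery = [6.0, 14.0, 6.0, 14.0, 6.0, 14.0, 6.0, 14.0]    # same mean, big swings

for name, frametimes in (("steady", steady), ("jittery", jittery)):
    avg_fps = 1000.0 / mean(frametimes)    # both report 100 FPS average
    print(f"{name:8s} avg {avg_fps:.0f} FPS | "
          f"worst frame {max(frametimes):.1f} ms | "
          f"stddev {pstdev(frametimes):.1f} ms")
```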


Notably, this is tangentially referenced in the review: 'minimum clocks' in the OCing portion.
MOST people are not so 'latency sensitive', and cannot quantify the 'difference' without extensive and damned-near pedantic testing.
With cursory, standardized testing, there is no quantifiable difference (other than power consumption).

Personally,
leaving 500 MHz minimum clocks vs. 2000 MHz minimum clocks on my GRE is very noticeable.
I feel less 'frametime jitter' @ 2 GHz min. core clocks.



*I bought a Nitro+ 7900 GRE for the multiple vBIOSes.
I don't want to be the first to try it, but I am interested in "cross-flashing".
 

Btzen

New Member
Joined
Mar 23, 2024
Messages
5 (0.10/day)
My immediate guess:
Powercolor purposely altered the frequency/voltage curve to minimize 'power state stutter'. No gamer or enthusiast will ever want to use VSYNC; it's a huge 'latency add'. The only time I've EVER used VSYNC is when a game's lighting or physics get 'messed up' by excessive framerate.

(IMEO) Basically, the vBIOS is set to 'hold' higher clocks and voltages, rather than downclocking so aggressively for momentary 'power savings'.

In my experience across Polaris, Vega and Navi, the bigger the delta in moment-to-moment reported clocks, the more you 'feel' some kind of hitching/stuttering.
It's a well-documented cause of high Avg and good Min FPS but a poor 'experience' (due to frametime inconsistency).


Notably, this is tangentially referenced in the review: 'minimum clocks' in the OCing portion.
MOST people are not so 'latency sensitive', and cannot quantify the 'difference' without extensive and damned-near pedantic testing.
With cursory, standardized testing, there is no quantifiable difference (other than power consumption).

Personally,
leaving 500 MHz minimum clocks vs. 2000 MHz minimum clocks on my GRE is very noticeable.
I feel less 'frametime jitter' @ 2 GHz min. core clocks.

I had just edited my post above, as I forgot to include the image I made overlaying the Hellhound's and the Pure's power consumption graphs, which is relevant.

In it you can see there is actually much larger variance from data point to data point than in the others, which makes the average much higher as a result. If what you're saying about the min clock holds true (and the minimum clock speed slider not mattering much is also noted in the other GRE reviews), I would have thought it would be the opposite, with fewer spikes moment to moment.
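To make that concrete, a throwaway Python comparison with invented numbers (only the ~103 W steady figure echoes the review): a flat trace and a spiky trace, where the upward spikes alone drag the average up:

```python
# Invented power traces (W). Only the ~103 W steady figure echoes the review;
# the spiky values are made up purely to show the effect on the average.
from statistics import mean

steady = [103] * 10
spiky  = [103, 170] * 5   # same baseline, periodic upward spikes

print(mean(steady))  # 103
print(mean(spiky))   # 136.5 -- the spikes alone inflate the average
```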

Side note: I use V-sync in most games because I am annoyingly sensitive to screen tearing, and for some reason, despite my TV reporting that it is adaptive-sync compatible, it never seems to work.
 
Joined
Apr 18, 2019
Messages
2,053 (1.11/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
I had just edited my post above, as I forgot to include the image I made overlaying the Hellhound's and the Pure's power consumption graphs, which is relevant.

In it you can see there is actually much larger variance from data point to data point than in the others, which makes the average much higher as a result. If what you're saying about the min clock holds true (and the minimum clock speed slider not mattering much is also noted in the other GRE reviews), I would have thought it would be the opposite, with fewer spikes moment to moment.

Side note: I use V-sync in most games because I am annoyingly sensitive to screen tearing, and for some reason, despite my TV reporting that it is adaptive-sync compatible, it never seems to work.
Point! :D

Still, it's likely 'related'. Perhaps just for a different reason.
It *does* look like it's 'aggressively and rapidly clock-shifting', for some reason.

(W/o further info/testing)
I'd go so far as to theorize this is still from a 'gaming tune' curve.

I know I'm not alone in my utter disdain for VSYNC.
VRR display technologies were, unironically, a dream come true for me and many gamer-enthusiasts.
The card may be 'freaking out' at the improper use case.

Edit:
Speaking towards your issue w/ your Adaptive Sync...
I'm *very* tearing-sensitive too, but even more latency-sensitive.

There's a 'trick' to getting it to work smoothly on older and poorly-implemented Adaptive Sync displays:
you *cannot* allow your framerate to exceed (refresh rate - 1) FPS, nor let your minimum framerate drop below (VRR range floor + 1) FPS.

Setting up a 'by-profile' Radeon Chill or framerate cap of 143 FPS on my ancient 32" 144 Hz screen made a huge difference when I was still running a Vega. On my GRE, I can notice FreeSync range excursions in CP'77 and BattleTech.
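The 'trick' boils down to one line of arithmetic; a tiny Python sketch of it below. The 144 Hz refresh is my screen; the 48 Hz VRR floor is an assumed example value, so check your own panel's range:

```python
# The VRR 'trick' as arithmetic: keep the framerate strictly inside the
# panel's variable-refresh window. 144 Hz refresh is the screen from the post;
# the 48 Hz floor is an assumed example -- check your own display's VRR range.

def safe_fps_window(refresh_hz: int, vrr_floor_hz: int) -> tuple[int, int]:
    """(min_cap, max_cap): stay at floor+1 or above, and at refresh-1 or below."""
    return vrr_floor_hz + 1, refresh_hz - 1

low, high = safe_fps_window(144, 48)
print(f"Cap framerate between {low} and {high} FPS")  # between 49 and 143 FPS
```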
I'll be 'addressing' those minor issues after I get the 24.3.1 drivers back onto my system.
(Doing F@H for Team EHW @TM requires a driver w/ full OpenCL support)
 

Btzen

New Member
Joined
Mar 23, 2024
Messages
5 (0.10/day)
Point! :D

Still, it's likely 'related'. Perhaps just for a different reason.
It *does* look like it's 'aggressively and rapidly clock-shifting', for some reason.

(W/o further info/testing)
I'd go so far as to theorize this is still from a 'gaming tune' curve.

I know I'm not alone in my utter disdain for VSYNC.
VRR display technologies were, unironically, a dream come true for me and many gamer-enthusiasts.
The card may be 'freaking out' at the improper use case.

Edit:
Speaking towards your issue w/ your Adaptive Sync...
I'm *very* tearing-sensitive too, but even more latency-sensitive.

There's a 'trick' to getting it to work smoothly on older and poorly-implemented Adaptive Sync displays:
you *cannot* allow your framerate to exceed (refresh rate - 1) FPS, nor let your minimum framerate drop below (VRR range floor + 1) FPS.

Setting up a 'by-profile' Radeon Chill or framerate cap of 143 FPS on my ancient 32" 144 Hz screen made a huge difference when I was still running a Vega. On my GRE, I can notice FreeSync range excursions in CP'77 and BattleTech.
I'll be 'addressing' those minor issues after I get the 24.3.1 drivers back onto my system.
(Doing F@H for Team EHW @TM requires a driver w/ full OpenCL support)
Re-checked the Efficiency page of the review vs. the other GRE cards, and it does have higher minimum clocks, but that still doesn't seem to explain the weird spikes in the V-sync test, and the high minimum clocks on other Hellhound models don't seem to cause this behavior. Unless it's just an issue with minimum clocks that high on the GRE in particular.

I also presume this test was only run on the performance BIOS, and would be curious whether the same behavior is seen on the quiet BIOS, or whether it could be overridden on the user end.

I'm also not a fan of V-sync; I've just learnt to live with it through decades of gaming on both consoles and cheap screens, and it helps that I play mostly third-person action games. And my TV isn't even a two-year-old model (bought less than a year ago), it's just being dodgy.
 
Joined
Dec 30, 2010
Messages
2,107 (0.43/day)
The higher media playback power thing is, I think, related to forcing the modules to stay powered on. I'm not sure how to put it (lol), but the whole Threadripper line has an insane idle power requirement as well, due to the nature of the chip, the cache and all that.

It just disappoints me that the whole 7x00 series has been locked out of MPT and such. You could easily uncork the cards to an excessive 600 W power consumption if you wanted to, and break some records with that.
 
Joined
Nov 22, 2020
Messages
60 (0.05/day)
Processor Ryzen 5 3600
Motherboard ASRock X470 Taichi
Cooling Scythe Kotetsu Mark II
Memory G.SKILL 32GB DDR4 3200 CL16
Video Card(s) EVGA GeForce RTX 3070 FTW3 Ultra (1980 MHz / 0.968 V)
Display(s) Dell P2715Q; BenQ EX3501R; Panasonic TC-P55S60
Case Fractal Design Define R5
Audio Device(s) Sennheiser HD580; 64 Audio 1964-Q
Power Supply Seasonic SSR-650TR
Mouse Logitech G700s; Logitech G903
Keyboard Cooler Master QuickFire TK; Kinesis Advantage
VR HMD Quest 2
Impressive. Too impressive...
View attachment 340337
Nearly 10 °C better than the Nitro+, which clearly has a slightly 'beefier' cooler.

Physics is telling me there must be something wrong with the methodology (ambient temps? GPU orientation?),
or
Powercolor is the only AIB using superior TIM and/or mounting pressure.
It stuck out to me too, but actually PowerColor's 7900 GRE Hellhound is performing similarly to the 7800 XT Hellhound, which was only modestly better than the competition. The Sapphire 7900 GRE Nitro's cooler should be close too, but it is substantially worse now than when it was used on the 7800 XT (the fan speed graph shows why).
 
Joined
Sep 9, 2017
Messages
182 (0.07/day)
System Name B20221017 Pro SP1 R2 Gaming Edition
Processor AMD Ryzen 7900X3D
Motherboard Asus ProArt X670E-Creator
Cooling NZXT Kraken Z73
Memory G.Skill Trident Z DDR5-6000 CL30 64GB
Video Card(s) NVIDIA RTX 3090 Founders Edition
Storage Samsung 980 Pro 2TB + Samsung 870 Evo 4TB
Display(s) Samsung CF791 Curved Ultrawide
Case NZXT H7 Flow
Power Supply Corsair HX1000i
VR HMD Meta Quest 3
Software Windows 11
Out of curiosity, why are the 4080 Super results not included in the charts?
 
Joined
Jun 21, 2013
Messages
542 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
Out of curiosity, why are the 4080 Super results not included in the charts?
Because it's the same as the 4080, so there is no point.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,083 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Out of curiosity, why are the 4080 Super results not included in the charts?
What @Pumper said: it's kinda pointless. But now I'm realizing that people might not be aware it's pointless, so it's been added to the summary graphs, and I'll include it in the future, too.
 