
AMD RX 7900 XTX OC Does Cross 3 GHz Barrier, But in Non-Gaming Workloads

btarunr

Editor & Senior Moderator
Staff member
AMD's Radeon RX 7900 XTX RDNA3 graphics card does cross the 3 GHz engine clock barrier, but not in gaming use-cases, finds a ComputerBase.de article in which the German publication compares the overclocking experience between the RX 7000-series RDNA3 and NVIDIA RTX 40-series "Ada" architectures. The RX 7900 XTX was found to hit engine clock speeds as high as 3455 MHz, but only when handling the Blender rendering benchmark, not typical gaming workloads.

The GPU could even be pushed to 3548 MHz with a power-draw of around 400 W, but it wasn't stable, the article notes. The top frequencies the GPUs could hit with gaming workloads were around 2.90 GHz. What could be happening with games is that more of the GPU's hardware resources are tapping into its power-limit (such as the memory controllers, caches, and other special SIMD functions), which could be impacting the engine clock boosting headroom. ComputerBase.de used a Sapphire RX 7900 XTX NITRO+ custom-design graphics card in its testing, which comes with three 8-pin PCIe power connectors and higher overclocking headroom than what the reference-design cards are capable of.
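The power-budget explanation can be sketched with a toy model. To be clear, this is a back-of-the-envelope illustration, not AMD's actual power management; the function, the fitting constant, and the wattage figures are all invented. The idea: dynamic power scales roughly with f·V², and voltage rises roughly with frequency, so shader power grows roughly with the cube of the clock. Under a fixed board power limit, every watt drawn by other blocks shrinks the achievable shader clock.

```python
# Hypothetical model, NOT AMD's actual power management: shader power
# is approximated as k * f^3 (dynamic power ~ f * V^2, with V rising
# roughly linearly with f). Every watt consumed by other blocks
# (memory controllers, caches, fixed-function units) comes out of the
# shader budget. All constants are invented for illustration.

def max_shader_clock_mhz(power_limit_w, other_blocks_w, k=9.0e-9):
    """Solve k * f^3 = watts left for the shaders, returning f in MHz."""
    available_w = power_limit_w - other_blocks_w
    if available_w <= 0:
        return 0.0
    return (available_w / k) ** (1.0 / 3.0)

# Compute-style load (e.g. Blender): few other blocks drawing power.
blender_clock = max_shader_clock_mhz(400, 20)
# Gaming-style load: memory controllers, caches, etc. all busy.
gaming_clock = max_shader_clock_mhz(400, 130)

print(f"compute-style load: ~{blender_clock:.0f} MHz")
print(f"gaming-style load:  ~{gaming_clock:.0f} MHz")
```

With these made-up numbers, the compute-style case lands near 3.5 GHz and the gaming-style case noticeably lower, qualitatively matching the Blender-versus-gaming gap the article describes.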



View at TechPowerUp Main Site | Source
 
Are they trying to say it was a poor design? :laugh:
 
The power draw is terrible in comparison. Efficiency is out the window.
 
Are they trying to say it was a poor design? :laugh:

Not really. What I can see coming here is new GPUs: a 7950, 7950 XT, 7950 XTX.

Simple: if you have headroom for higher clocks, why not milk the public?
 
I'm so looking forward to not gaming at 3500 MHz.

:twitch:
 
These cards are in a pre-beta stage; the fact that AMD is asking over $1K for these abominations is insane
 
They rushed it out seemingly. My guess is the next revision will clock much better.

The real question though is if either team can put out a sub $500 card that is worth buying.
 
The power draw is terrible in comparison. Efficiency is out the window.

That's a no-brainer considering the clock is going beyond 3 GHz here.

Chips simply have a best efficiency/performance curve. Once you start raising clocks past it, power consumption rises out of proportion, or the gains you get from OC'ing are minimal.
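That efficiency/performance curve can be sketched numerically. This is a toy model with made-up constants, not measured data for any real GPU: performance is taken as roughly linear in frequency, while dynamic power scales like C·f·V² with voltage rising along the frequency curve, so perf-per-watt falls quickly past the sweet spot.

```python
# Toy illustration of the efficiency/performance curve. All constants
# (capacitance-like factor, voltage curve) are invented: performance
# ~ frequency, power ~ C * f * V^2 with V rising with f, so
# perf-per-watt strictly decreases as clocks climb.

def board_power_w(f_mhz, cap=8.0e-5, v_base=0.6, v_slope=2.0e-4):
    volts = v_base + v_slope * f_mhz      # toy voltage/frequency curve
    return cap * f_mhz * volts * volts * 1000.0

def perf_per_watt(f_mhz):
    return (f_mhz / 1000.0) / board_power_w(f_mhz)

for f in (2000, 2500, 3000, 3500):
    print(f"{f} MHz: {board_power_w(f):6.0f} W, {perf_per_watt(f):.4f} perf/W")
```

In this sketch, pushing from 2 GHz toward 3.5 GHz more than doubles the power draw while efficiency drops by roughly 40%, which is the "efficiency out the window" effect in miniature.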
 
These cards are in a pre-beta stage; the fact that AMD is asking over $1K for these abominations is insane
No matter what AMD releases you would always say the same thing. Both AMD and Nvidia have good products. Many here are unhappy about pricing and business practices on the nVidia side. AMD still has a ways to go on the RT and SS side.

Drivers are fine from both camps. I have a Radeon 7900XT (couldn’t find an XTX) and I’m enjoying gaming with it. I’m sure everyone here is enjoying whatever GPU they have.

Stop using words like abomination. It makes us all look even nerdier and out of touch than we are. There are abominations in this world. GPUs are not among them.
 
These cards are in a pre-beta stage; the fact that AMD is asking over $1K for these abominations is insane

Why is this "prebeta", and what the hell does it have to do with what AMD is asking for their GPUs ? You are aware that, percentage-wise, these cards overclock higher than Ada relative to stock clocks, right ? Why you would even care about clock speeds for different GPUs is beyond me anyway.

Can you try for one nanosecond to not be a giga fanboy and say the most off the wall stuff possible ?
 
No matter what AMD releases you would always say the same thing. Both AMD and Nvidia have good products. Many here are unhappy about pricing and business practices on the nVidia side. AMD still has a ways to go on the RT and SS side.

Drivers are fine from both camps. I have a Radeon 7900XT (couldn’t find an XTX) and I’m enjoying gaming with it. I’m sure everyone here is enjoying whatever GPU they have.

Stop using words like abomination. It makes us all look even nerdier and out of touch than we are. There are abominations in this world. GPUs are not among them.
Then why do I have plenty of AMD CPUs and GPUs?

Why is this "prebeta", and what the hell does it have to do with what AMD is asking for their GPUs ? You are aware that, percentage-wise, these cards overclock higher than Ada relative to stock clocks, right ? Why you would even care about clock speeds for different GPUs is beyond me anyway.

Can you try for one nanosecond to not be a giga fanboy and say the most off the wall stuff possible ?
It's pre-beta because it has a myriad of issues: Hrd not working, consuming vast amounts of power in video playback or multi-monitor, drivers constantly crashing, etc. I didn't even touch on the cooler situation yet.

It's obvious that the cards weren't ready for release yet.
 
It's obvious that the cards weren't ready for release yet.
Could say the same about certain cards whose power connectors melt; I am sure you'd agree with me, right ? I'd rather have a card with a defective cooler compared to one that melts, though neither are great options.

Hrd not working
No clue what that is.

Then why do I have plenty of AMD CPUs and GPUs?
Don't know, do you ? Had an Nvidia card before and I still had no problem hating all the shitty things Nvidia did, these things are not mutually exclusive. It's funny because you saying that proves you think exactly like a fanboy lmao.

consuming vast amounts of power in video playback or multimonitor
Hardly a catastrophic issue.

drivers constantly crashing etc.
And you know that for a fact how exactly ? I must blink every time my driver crashes, so I missed them all up until now.
 
Could say the same about certain cards whose power connectors melt; I am sure you'd agree with me, right ? I'd rather have a card with a defective cooler compared to one that melts, though neither are great options.
Any card's connectors will melt if you don't plug them in properly. Just a fact.

Here you go, 8pins melting



Does that mean the connectors that AMD uses are melting? You'll find hundreds of threads with 8-pins melting. Try to hide your bias, man.

You saying your drivers don't crash is irrelevant. My 16-pin didn't melt either; does that mean no one's did?
 
Then why do I have plenty of AMD CPUs and GPUs?
Because you don't understand that you can sell them once you don't need them?

Or do you happen to have a large family, a huge house and a need for plenty of gaming PCs?
 
You saying your drivers dont crash is irrelevant.

And you saying they do is ? Astounding logic mate, especially since you don't own one.

The thing is some 16pins melt, you said drivers "constantly" crash. It would be pretty stupid if I said connectors constantly melt wouldn't it ? Because they don't. Well, saying the drivers constantly crash is just as stupid and untrue. See the problem ?
 
And you saying they do is ? Astounding logic mate, especially since you don't own one.

The thing is some 16pins melt, you said drivers "constantly" crash. It would be pretty stupid if I said connectors constantly melt wouldn't it ? Because they don't. Well, saying the drivers constantly crash is just as stupid and untrue. See the problem ?
16-pins melt at the same rate as any other cable that hasn't been plugged in properly. That's just a fact; I don't know why you insist on that nonsense.

How do I know drivers crash on AMD? A read through the AMD owners' threads on every single forum on planet earth suggests that. HDR not working? Again, the internet is full of complaints about this.
 
Then why do I have plenty of AMD CPUs and GPUs?
Not sure what this has to do with anything, but that isn't the same as actually using them lol.
It's pre-beta because it has a myriad of issues: Hrd not working, consuming vast amounts of power in video playback or multi-monitor, drivers constantly crashing, etc. I didn't even touch on the cooler situation yet.

It's obvious that the cards weren't ready for release yet.
The power consumption was fixed with drivers, drivers crashing isn't an issue, and the vapor chamber is hardly an issue compared to melting power connectors: you can avoid the former by buying an AIB card, but you can't avoid a power connector melting, as Nvidia forced all of their AIBs to use that poorly designed 12VHPWR connector.

I'd rather have a card with a defective cooler compared to one that melts, though neither are great options.
I would honestly rather deal with the possibility of having a defective cooler. IMO it's easier to solve, as you can either put a water block on the card or just buy an AIB 7900 XTX if you can't wait for the vapor chamber issue to be fully resolved. There isn't any avoiding the melting power connector unless you also buy a new PSU and have a massive case so the power connector doesn't get pushed loose, which was never a problem with the 8-pin connectors.
Don't know, do you ? Had an Nvidia card before and I still had no problem hating all the shitty things Nvidia did, these things are not mutually exclusive. It's funny because you saying that proves you think exactly like a fanboy lmao.
Most of my GPUs have been from Nvidia, yet I don't like their business practices. The greed that started with RTX 2000-series pricing and has gotten out of hand with the RTX 4000 cards, plus the melting power connector, is why I'll probably support AMD, though the XT and XTX need to be $100 cheaper.
 
These cards are in a pre-beta stage; the fact that AMD is asking over $1K for these abominations is insane
Imagine how RTX is still in pre-beta stage altogether in its 3rd iteration. Performance still abysmal, low hanging fruit gone, shitty frame generation trickery with quality loss to fill the gap... Nvidia's in a glass house, not a good place to be throwing stones from IMHO

And they even have the gall to ask over 1,5k ;)
And on top of that, AMD "just wingin' it" on RT keeps them within spitting distance now on their latest series, go figure, without all the nonsense around it.
The whole state of GPUs is abysmal right now; even Nvidia's stack relative to Ampere is a sorry state of affairs.
 
That's just a fact
No, it's not; you don't know that. Do you have any proof it happens at any rate, lesser or higher ? I am sure you don't; keep in mind how many products use 8-pin and how many use 12VHPWR.

You don't know what the word "fact" means just as you don't know what "constantly" means.

Again, the internet is full of complaints about this.
No, it's not. Define "the internet" and "full"; once again you use expressions that mean nothing and can't be proven.

HDR not working?
Considering you misspelled it so badly the first time around that I didn't even know what you were talking about, I doubt you're sufficiently apt at determining what works and what doesn't.

Especially since you don't have one. I do and to your dismay I just checked and HDR works fine.

By the way, I am surprised you are so thoroughly educated on this subject. Do you do anything else besides scouring the internet for obscure problems on AMD cards ? If you asked me what problems there are on Nvidia, I'd have no clue besides something huge that was covered in the media, like the power connectors thing, though I am sure there are plenty. So are you still sure you're not a fanboy ? Like 100% sure ? I am still wondering, because you're not very convincing.
 
The lower clocks in games suggest that the fixed function units aren't able to clock as high as the shaders. I've no idea if AMD will ever release a version of RDNA3 that fixes this, but for future products, hopefully, they'll both have the same headroom. It's clear that quite a bit of performance was left on the table due to the shaders outpacing the rest of the GPU.
 
When fevgatos said he read it on the internet, it proves he or she hasn't owned an AMD product in a while, if ever. Hey fevgatos, why do you need to read about AMD crashes on the internet when you own the last few generations? Shouldn't you be able to speak from experience?
 
The power consumption was fixed with drivers, drivers crashing isnt an issue, and the vapor chamber is hardly an issue compared to melting power connectors, as you can avoid that by buying an AIB card, but you can't avoid a power connector melting as Nvidia forced all of their AIBs to use that poorly designed 12VHPWR connector.
Yeap, plugging the cable properly is definitely something you can't avoid :roll: :roll:

Imagine how RTX is still in pre-beta stage altogether in its 3rd iteration. Performance still abysmal, low hanging fruit gone, shitty frame generation trickery with quality loss to fill the gap... Nvidia's in a glass house, not a good place to be throwing stones from IMHO

And they even have the gall to ask over 1,5k ;)
And on top of that, AMD "just wingin' it" on RT keeps them within spitting distance now on their latest series, go figure, without all the nonsense around it.
The whole state of GPUs is abysmal right now; even Nvidia's stack relative to Ampere is a sorry state of affairs.
How is that even an argument? Whatever you think of RT, it's still way better on Nvidia than it is on AMD, so I don't see exactly what point you are trying to make.

No, it's not; you don't know that. Do you have any proof it happens at any rate, lesser or higher ? I am sure you don't; keep in mind how many products use 8-pin and how many use 12VHPWR.

You don't know what the word "fact" means just as you don't know what "constantly" means.


No, it's not. Define "the internet" and "full"; once again you use expressions that mean nothing and can't be proven.


Considering you misspelled it so badly the first time around that I didn't even know what you were talking about, I doubt you're sufficiently apt at determining what works and what doesn't.

Especially since you don't have one. I do and to your dismay I just checked and HDR works fine.

By the way, I am surprised you are so thoroughly educated on this subject. Do you do anything else besides scouring the internet for obscure problems on AMD cards ? If you asked me what problems there are on Nvidia, I'd have no clue besides something huge that was covered in the media, like the power connectors thing, though I am sure there are plenty. So are you still sure you're not a fanboy ? Like 100% sure ? I am still wondering, because you're not very convincing.
Man, you keep going around threads repeating the same stuff about melting 16-pins, even after I showed you that 8-pins have the same problem. If you don't plug the cable in properly, it's going to melt, whether it's a 16-pin or an 8-pin. Yet you keep repeating it... you are irredeemable.
 