Tuesday, October 1st 2013

Radeon R9 290X Clock Speeds Surface, Benchmarked

Radeon R9 290X is looking increasingly good on paper. Most of its rumored specifications and SEP pricing were reported late last week, but the one detail that eluded us was clock speeds. A source who goes by the name Grant Kim, with access to a Radeon R9 290X sample, disclosed its clock speeds and ran a few tests for us. To begin with, the GPU core is clocked at 1050 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperature into account. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give them the benefit of the doubt, even though they go against AMD's ">300 GB/s memory bandwidth" bullet point in its presentation.
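For what it's worth, the quoted 288 GB/s figure follows directly from those clocks; a quick back-of-the-envelope check (assuming GDDR5's standard four data transfers per memory clock):

```python
# Sanity check of the quoted memory bandwidth from the leaked clocks.
# GDDR5 moves data four times per memory clock: 1125 MHz -> 4.5 GT/s effective.
mem_clock_mhz = 1125
transfers_per_clock = 4                      # GDDR5 (quad data rate)
bus_width_bits = 512

effective_gtps = mem_clock_mhz * transfers_per_clock / 1000   # 4.5 GT/s
bytes_per_transfer = bus_width_bits / 8                       # 64 bytes across the bus
bandwidth_gbs = effective_gtps * bytes_per_transfer

print(f"{bandwidth_gbs:.0f} GB/s")  # prints "288 GB/s"
```

Hitting AMD's ">300 GB/s" claim with this bus width would require roughly a 1250 MHz (5.0 GHz effective) memory clock, which is why the reported 1125 MHz figure stands out.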

The tests run on the card include frame-rate and frame-latency measurements for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at a resolution of 5760 x 1080 pixels. An NVIDIA GeForce GTX TITAN was pitted against it, running the latest WHQL driver. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display setups, so this may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame rates in most tests.
RAGE
The OpenGL-based RAGE is a different beast. With AA turned off, the R9 290X puts out overall lower frame rates and higher frame latency (lower is better). It gets even more inconsistent with AA cranked up to 4x MSAA. Without AA, the frame latencies of both chips remain under 30 ms, with the GTX TITAN looking more consistent, and lower. At 4x MSAA, the R9 290X is all over the place with frame latency.
TESV: Skyrim
The tester somehow got the game to work at 5760 x 1080. With no AA, both chips put out similar frame rates, with the GTX TITAN having a higher mean, and the R9 290X spiking more often. In the frame-latency graph, the R9 290X has a bigger skyline than the GTX TITAN, which is not something to be proud of. As an added bonus, the VRAM usage of the game was plotted throughout the test run.

GRID 2
GRID 2 is a surprise package for the R9 290X. The chip puts out significantly and consistently higher frame rates than the GTX TITAN with no AA, and offers lower frame latencies. Even with MSAA cranked all the way up to 8x, the R9 290X holds up pretty well on the frame-rate front, but not on frame latency.
Crysis 3
This CryEngine 3-based game offers MSAA and FXAA anti-aliasing methods, so it wasn't tested with both disabled. With 4x MSAA, both chips offer similar frame rates and frame latencies. With FXAA enabled, the R9 290X offers higher frame rates on average, and lower latencies.
Battlefield 3
That leaves us with Battlefield 3, which, like Crysis 3, supports MSAA and FXAA. At 4x MSAA, the R9 290X offers higher frame rates on average, and lower frame latencies. With FXAA, it gets even better for AMD's chip on both fronts.
Overall, at 1050 MHz (core) and 4.50 GHz (memory), it's advantage AMD, looking at these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many thanks to Grant Kim.

100 Comments on Radeon R9 290X Clock Speeds Surface, Benchmarked

#51
Steevo
EarthDog: Interesting speculation, but you can't take a clock speed and map it 1:1 to performance. Same with SP/shader count...
If you do that math and look at the scaling in SP count and core speed, the correlation in performance is dependent on the memory and GPU core speed. It is usually within a few percent on all cards except those artificially crippled.

And we are not talking about exponential increases, but percentage of raw output standardized.

So in the 7970 tests the increase was directly tied to core and memory speed: a 10% increase in both resulted in a 10% increase in performance. Same with the Titan.


I am aware of the boost speeds the Titan uses, so it does make it an apples-to-coconuts comparison. If this test card is stuck at 800 MHz and is still hand in hand with a Titan that is boosting to 1 GHz speeds, what happens when the memory is ramped up and the core is capable of reaching the same clocks as the 7970?
#52
RCoon
Steevo: I am aware of the boost speeds the Titan uses, so it does make it an apples-to-coconuts comparison. If this test card is stuck at 800 MHz and is still hand in hand with a Titan that is boosting to 1 GHz speeds, what happens when the memory is ramped up and the core is capable of reaching the same clocks as the 7970?
The card was benched at 1050 MHz (edited in the main article), so unless somebody clarifies further:

290X @ 1050 MHz vs. Titan boosted to 1000 MHz
#53
EarthDog
Who knows, Steevo.

Let's wait for real benchmarks instead of continuing to speculate with values that really can't be used to achieve any greater accuracy than guessing in the first place. :)
#54
m1dg3t
How I LOOOOOOOOOOVE speculation! At least this thread hasn't turned into an nVidia thread :laugh:

W1zz: I got some spare coke and an extra hooker, you interested? :p

:peace:
#55
RCoon
m1dg3t: How I LOOOOOOOOOOVE speculation! At least this thread hasn't turned into an nVidia thread :laugh:

W1zz: I got some spare coke and an extra hooker, you interested? :p

:peace:
Extra? You have two?
#56
Steevo
You can't handle two? Coke got to your willy son?

I heard they are going to fix that problem with some magic blue powder, that's right, add crushed Viagra to the coke for a better pecker picker upper.
#57
Casecutter
Honestly, it's more a leak of what the "Hawaii" part can do than a statement of actual R9 290X performance. Those clocks could be more representative of the Hawaii LE part.
This crap is going to bubble up... and there's less inference to draw from it because we don't know how old the sample is, or what the drivers were.
So keep the seat belt on; it's going to get bumpy while we're in this holding pattern.
#58
m1dg3t
RCoon: Extra? You have two?
An "extra" would imply more than 1, nothing more. Nothing less. Use your imagination :)
Steevo: You can't handle two? Coke got to your willy son?

I heard they are going to fix that problem with some magic blue powder, that's right, add crushed Viagra to the coke for a better pecker picker upper.
:laugh: Heinekken hard - on strikes again! :nutkick:
Casecutter: Honestly, it's more a leak of what the "Hawaii" part can do than a statement of actual R9 290X performance. Those clocks could be more representative of the Hawaii LE part.
This crap is going to bubble up... and there's less inference to draw from it because we don't know how old the sample is, or what the drivers were.
So keep the seat belt on; it's going to get bumpy while we're in this holding pattern.
AMD/ATi are trolling nVidia and the mainstream, that is all. Same thing nVidia has been doing from time... Looks like the red team finally put the shoe on the other foot ;)
#59
btarunr
Editor & Senior Moderator
Graphs corrected for 1050/4500 MHz.
#60
FrustratedGarrett
Those are decent numbers. When not overclocked, the 290X performs around the same as the Titan, except in Crysis 3 and RAGE. It seems that AMD cards don't do well in those two games. Even when overclocked, the 290X still loses to the Titan in them.

Of course, this is a multi-display setup, and things might look different on a 1440p monitor.

All of this shouldn't matter at this point, though. We need to see how efficient Mantle is. If Mantle does deliver on the 9x promise, things could change; it's quite likely, given that console developers have managed to squeeze far more performance out of those console chips using low-level coding that interfaces semi-directly with the hardware.

Truly, what matters at this point to me as a PC gamer is getting next-gen PC games; a revolution in PC games. We need games with better mechanics, graphics, and physics.
All these old games just don't cut it for PC gamers like myself. The crappy game mechanics and physics especially are no longer bearable.
#61
Xero717
I can't help but think that if AMD truly had a Titan killer, they would have bragged about having the world's fastest single GPU at GPU'14. Their marketing is beyond over the top, and they would have had a field day with that.

TrueAudio is neat, don't get me wrong, but this was a GPU conference, and they spent more time talking about that than touting their new flagship. Maybe they learned from Bulldozer, but I doubt it.

As a side note, I have no doubt they will beat nVidia's offerings at price/performance.
#62
tacosRcool
I must be losing my mind, but I thought there was another article on the R9 290X with a dual BIOS, with the other one being overclocked?
#63
Recus
RCoon: The card was benched at 1050 MHz (edited in the main article), so unless somebody clarifies further:

290X @ 1050 MHz vs. Titan boosted to 1000 MHz
You must OC the Titan if you want it to boost to 1000 MHz, so either the R9 290X was also OC'd, or this is BS again.
#65
the54thvoid
Intoxicated Moderator
tacosRcool: I must be losing my mind, but I thought there was another article on the R9 290X with a dual BIOS, with the other one being overclocked?
No, I saw that too. I figure it got pulled before it evoked the wrath of AMD.
#66
sweet
Recus: You must OC the Titan if you want it to boost to 1000 MHz, so either the R9 290X was also OC'd, or this is BS again.
You are a typical victim of the false assumption created by nVidia's dynamic boost. Cards with this technology always boost themselves to the highest stable clock, as long as the power target (6xx) or temperature target (Titan, 7xx) is satisfied. In the case of the Titan, at stock settings it runs games at more than 1 GHz. The base clock is mostly just for show.
#68
EarthDog
Not sure I missed anything...?

Here is a paragraph from the article talking about what you are though...
As mentioned in the initial article, the 876 MHz stock boost clock was quite conservative. When run for short periods and a moderate distance from the temp target (at stock that’s 80 °C), it was boosting to 1006 MHz consistently. If you adjusted the fan profile to keep it away from 80 °C, it would probably stay there indefinitely.
Is adjusting the fan profile overclocking? Overclocking is adjusting the clock speeds, in my head. He adjusted the fan profile in the second graph, which was then overclocked and hit 1100+ MHz. The first is stock. ;)

So, with temps in order and not using the overly 'quiet' stock fan settings, this card will boost to 1006 MHz all day long. I guess we are splitting hairs? lol!
#69
sweet
Casecutter: I think you missed where they said, "and the temp target was set at 90°C"... that isn't stock, correct?

W1zzard only found slightly below 1000 MHz in his reference review, but it ran there a lot as it's a dark diamond.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/34.html
I can see the 993 MHz to 1006 MHz gap is small enough, right? So this bench is between a 1050 MHz R9 290X and a 993 MHz Titan. Personally, I think it's a fair comparison, not to mention the memory clock of the R9 290X is far lower than the Titan's.
#70
Patriot
hilpi: Very fishy Tomb Raider benchmark. Min FPS higher than or the same as the avg FPS...
Either those two columns got switched by accident or they are fake... who knows...

A min of 29 makes more sense with a 35 avg... it would fall about the same % behind AMD...
#71
EarthDog
sweet: ...not to mention the memory clock of the R9 290X is far lower than the Titan's.
You are forgetting about the 512-bit bus on the 290X... ;)
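Indeed; as a rough illustration (taking the Titan's stock 6.0 GHz-effective GDDR5 on a 384-bit bus as an assumption), the wider bus cancels out the lower memory clock almost exactly:

```python
def bandwidth_gbs(effective_gtps: float, bus_width_bits: int) -> float:
    # bytes moved per transfer (bus width / 8) times transfers per second
    return effective_gtps * bus_width_bits / 8

r9_290x = bandwidth_gbs(4.5, 512)  # leaked clocks: 288.0 GB/s
titan = bandwidth_gbs(6.0, 384)    # stock Titan:   288.0 GB/s
print(r9_290x, titan)
```

So despite the 25% lower effective memory clock, the two cards would land at essentially the same raw memory bandwidth.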
#72
Ahhzz
Slacker: I can't wait for this card's NDA to be officially lifted so it gets benched. It looks really promising, and maybe a great upgrade from my 6970 :)
Absolutely. I've got about a year to look for solid replacements for my dual 6950's, and I'd love to see ASUS put out a rock-steady OC pair for a good price...
#73
Casecutter
EarthDog: I guess we are splitting hairs? lol!
True enough not a worry.
#75
TRWOV
Too bad there aren't any GPU WUs available at WCG. I bet this would net about 200K PPD :roll: