Tuesday, October 1st 2013

Radeon R9 290X Clock Speeds Surface, Benchmarked

Radeon R9 290X is looking increasingly good on paper. Most of its rumored specifications and SEP pricing were reported late last week, but the clock speeds continued to elude us. A source who goes by the name Grant Kim, with access to a Radeon R9 290X sample, disclosed its clock speeds and ran a few tests for us. To begin with, the GPU core is clocked at 1050 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperatures into account. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give them the benefit of the doubt, even though they go against AMD's ">300 GB/s memory bandwidth" bullet point in its presentation.
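For reference, that bandwidth figure is simple arithmetic from the numbers above; here's a quick sketch (Python, purely illustrative) of how the 288 GB/s works out, and what AMD's ">300 GB/s" claim would imply instead:

```python
# GDDR5 bandwidth back-of-envelope using the leaked figures
memory_clock_mhz = 1125                 # reported memory clock
effective_mtps = memory_clock_mhz * 4   # GDDR5 is quad-pumped: 4500 MT/s ("4.50 GHz effective")
bus_width_bits = 512                    # R9 290X memory interface width

# bandwidth = transfer rate x bus width in bytes
bandwidth_gbs = effective_mtps * (bus_width_bits // 8) / 1000
print(f"{bandwidth_gbs:.0f} GB/s")      # 288 GB/s

# AMD's ">300 GB/s" bullet point would instead need roughly 1250 MHz (5 GT/s) memory chips:
print(f"{1250 * 4 * (bus_width_bits // 8) / 1000:.0f} GB/s")  # 320 GB/s
```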

The tests run on the card include frame-rate and frame-latency measurements for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at a resolution of 5760 x 1080 pixels. An NVIDIA GeForce GTX TITAN was pitted against it, running the latest WHQL driver. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display, so it may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame-rates in most tests.
RAGE
The OpenGL-based RAGE is a different beast. With AA turned off, the R9 290X puts out overall lower frame-rates and higher frame latency (lower is better). It gets even more inconsistent with AA cranked up to 4x MSAA. Without AA, the frame latencies of both chips remain under 30 ms, with the GTX TITAN looking more consistent, and lower. At 4x MSAA, the R9 290X is all over the place with frame latency.
TESV: Skyrim
The tester somehow got the game to work at 5760 x 1080. With no AA, both chips put out similar frame-rates, with the GTX TITAN having a higher mean, and the R9 290X spiking more often. In the frame-latency graph, the R9 290X has a bigger skyline than the GTX TITAN, which is not something to be proud of. As an added bonus, the VRAM usage of the game was plotted throughout the test run.

GRID 2
GRID 2 is a surprise package for the R9 290X. The chip puts out significantly and consistently higher frame-rates than the GTX TITAN with no AA, and offers lower frame latencies. Even with MSAA cranked all the way up to 8x, the R9 290X holds up pretty well on the frame-rate front, but not on frame latency.
Crysis 3
This CryEngine 3-based game offers both MSAA and FXAA anti-aliasing, so it wasn't tested without one of the two enabled. With 4x MSAA, both chips offer similar frame-rates and frame latencies. With FXAA enabled, the R9 290X offers higher frame-rates on average, and lower latencies.
Battlefield 3
That leaves us with Battlefield 3, which, like Crysis 3, supports MSAA and FXAA. At 4x MSAA, the R9 290X offers higher frame-rates on average, and lower frame latencies. It gets better for AMD's chip with FXAA, on both fronts.
Overall, at 1050 MHz (core) and 4.50 GHz (memory), it's advantage AMD, looking at these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many thanks to Grant Kim.

100 Comments on Radeon R9 290X Clock Speeds Surface, Benchmarked

#26
HTC
Recus: From slightly faster than GTX 780 to faster than Titan. Seems legit :wtf:
Remember: this is 5760 x 1080 pixels resolution.

A wider interface + higher memory should mean better performance @ higher resolutions but not necessarily so @ lower ones, no?
#27
librin.so.1
inb4 "lolnub Y U run RAGE with vsync for benchmarks? lolwutanub!". :ohwell:
#28
dj-electric
Those specs are rather weird yet understandable.
Heck, if they are true in conjunction with those tests, then massive overclocking potential is waiting, in both the core and memory segments.

BUT, it is not unreasonable to see 1250 MHz memory chips on the 290X to get that >300 GB/s they were talking about, otherwise why would they present such a bold lie?

I don't know, just waiting a couple of days to get my hands on one and there will be no more doubts, for me.
#29
Pedro Lisboa
Fake benchmarks

These are fake benchmarks :shadedshu.
Wait for some official tests and after that make a serious comparison between the Titan/GTX 780 and the R9 290X.
#30
Recus
HTC: Remember: this is 5760 x 1080 pixels resolution.

A wider interface + higher memory should mean better performance @ higher resolutions but not necessarily so @ lower ones, no?
B.. but Titan has 6 GB memory for high res. :eek:
#31
EarthDog
Steevo: I am guessing the review sample is an ES spin, so the final could have higher clocks, as well as increased memory clocks, as AMD has mentioned.


Considering this is almost hand in hand with the Titan at 800 MHz, 1 GHz would provide somewhere around 20% more performance, depending on achievable memory speeds.

A 27% increase in SP count, but W1zz's benchmarks of the Titan vs. the 7970 show the AMD only 12% slower at that resolution, so either the card is crippled and someone got it for other testing, or they have failed to reach their target.

An efficiency improvement in the 7970 for higher core speeds and better memory would have been a better investment for them if this were true. I doubt it is true; instead, it's most likely a crippled ES card.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/22.html

tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan/images/skyrim_5760_1080.gif

Or it's meant to show the validity of the core architecture at the same speed as the Titan, while consuming less power and possibly having more features.
Interesting speculation, but you can't take a clock speed and map it 1:1 to performance. Same with SP/shader count...
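Back-of-envelope, the naive 1:1 estimate from those quoted clocks would look something like this (a rough Python sketch using the figures from the thread, not a real performance model):

```python
# Naive "1:1" clock-scaling estimate from the figures quoted above (illustrative only)
base_clock_mhz = 800      # ES BIOS clock assumed in the quoted comment
bench_clock_mhz = 1050    # clock the leaked benches were actually run at

naive_gain = bench_clock_mhz / base_clock_mhz - 1
print(f"Naive core-clock scaling: +{naive_gain:.0%}")  # ~+31%

# Real frame-rates scale well below this: memory bandwidth stays at 288 GB/s and games
# at 5760 x 1080 are rarely purely core-bound, hence "you can't 1:1 it to performance".
```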
#32
springs113
Wizz got the card!!! He's laying low for that matter.
#33
dj-electric
springs113: Wizz got the card!!! He's laying low for that matter.
W1zz gets cards so early he is probably sitting for hours in front of the screen, just working on his evil laugh and munching on popcorn while reading all the speculation and arguments.

i.imgur.com/BWUVqyi.png
#34
HTC
Recus: B.. but Titan has 6 GB memory for high res. :eek:
You're right ... DUH ...

Maybe the difference is the interface? Dunno :confused:
#35
jigar2speed
I need your address WIZZARD, I got some beer I would love to share - :toast:




Sheesh - me plans to rob Wizzard while he is drunk and score that shiny R9 290X
:D
#36
springs113
Lol Wizz come roll with me lol, I can get you free flights to anywhere in the world. lmfao... I think after this review it's bye bye hydro copper (at least from my main rig). Even though I have the Creative ZXR, I always preferred AMD's version of audio to NVIDIA's. Plus AMD's colors (visuals) have always appeared to be more precise/crisp.
#37
Filiprino
So, AMD still sucks on OpenGL. Nice.
#38
Crap Daddy
Dj-ElectriC: I don't know, just waiting a couple of days to get my hands on one and there will be no more doubts, for me.
Couple of days?
#39
EarthDog
A "couple" seems to vary significantly from person to person, LOL! Couple WEEKS, sure. :p
#40
Slomo4shO
btarunr: At $599 (compared to TITAN's $999), I'm sure even you can't complain.
The Titan has been available since February. In addition, there are already 780 models that outperform the Titan within $100 of the launch price of the R9 290X... Releasing a product with equivalent performance 8 months later isn't impressive by any means, as I am sure price cuts can easily make this product lineup irrelevant if NVIDIA decides to release the rumored Titan Ultra this year. Also, Maxwell is less than a year away. This launch is becoming more and more disappointing as the days progress.
#41
W1zzard
Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125
#42
springs113
W1zzard: Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125
I know you have the card W1zz
#43
Vario
unholythree: Agreed, I'm pretty close to an AMD fanboy; I want them to succeed badly but hype is a dangerous thing. I know why I'm still running a 1090T.
Because the 1090t is the last seriously badass thing AMD has produced on the cpu side of things.



6 cores baby ... not modules, cores.
#44
RCoon
W1zzard: Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125
Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?
Vario: Because the 1090t is the last seriously badass thing AMD has produced on the cpu side of things.

i.imgur.com/82i6YGm.jpg

6 cores baby ... not modules, cores.
1100T would like a word with you.
That being said, my 1055T @ 4Ghz chirps gleefully at games.
#45
Vario
RCoon: Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?



1100T would like a word with you.
That being said, my 1055T @ 4Ghz chirps gleefully at games.
Yeah, you're right.
#46
EarthDog
LOL, so does my Intel CPU, but who really gives a crap? LOL! This is r9 290X stuff peeps! :)
#47
Vario
EarthDog: LOL, so does my Intel CPU, but who really gives a crap? LOL! This is r9 290X stuff peeps! :)
Yeah man, I think I'll upgrade my 7970 once these have been out for 6 months to a year. Get that price drop.
#48
Slomo4shO
W1zzard: Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125
You missed this in your edit:
Over all, at 800 MHz (core) and 4.50 GHz (memory), it's advantage-AMD, looking at these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many Thanks to Grant Kim.
#49
sweet
RCoon: Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?
You forgot the fact that the Titan constantly boosts to 1 GHz when running at stock settings ;) nVidia's dynamic boost was created for this benchmark cheat.
#50
EarthDog
sweet: nVidia's dynamic boost was created for this benchmark cheat.
:roll: :slap: