Tuesday, October 1st 2013

Radeon R9 290X Clock Speeds Surface, Benchmarked

Radeon R9 290X is looking increasingly good on paper. Most of its rumored specifications and SEP pricing were reported late last week, but the one detail that eluded us was clock speeds. A source who goes by the name Grant Kim, with access to a Radeon R9 290X sample, disclosed its clock speeds and ran a few tests for us. To begin with, the GPU core is clocked at 1050 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks based on load and temperature. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give them the benefit of the doubt, even though the memory figure goes against AMD's ">300 GB/s memory bandwidth" bullet point in its presentation.
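
As a quick sanity check on that figure (plain arithmetic, not additional data from the source), peak GDDR5 bandwidth is simply the memory clock times four transfers per clock times the bus width in bytes:

```python
# Back-of-the-envelope GDDR5 bandwidth check for the reported clocks.
def gddr5_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: clock (MHz) x 4 transfers/clock x bus width (bytes)."""
    return mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

print(gddr5_bandwidth_gb_s(1125, 512))  # 288.0 GB/s -- matches the figure above
# Clearing the ">300 GB/s" claim on a 512-bit bus would need roughly 1175 MHz or more:
print(gddr5_bandwidth_gb_s(1175, 512))  # ~300.8 GB/s
```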

The tests run on the card cover frame rates and frame latency for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at a resolution of 5760 x 1080. An NVIDIA GeForce GTX TITAN running the latest WHQL driver was pitted against it. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display, so this may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame rates in most tests.
RAGE
The OpenGL-based RAGE is a different beast. With AA turned off, the R9 290X puts out overall lower frame rates and higher frame latency (lower is better). Things get even more inconsistent with AA cranked up to 4x MSAA. Without AA, the frame latencies of both chips remain under 30 ms, with the GTX TITAN looking more consistent and lower. At 4x MSAA, the R9 290X is all over the place with frame latency.
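
Frame-latency "consistency" here simply means how tightly the individual frame render times cluster. As a rough illustration of how such metrics are commonly derived from a frame-time log (the numbers are made-up placeholders, not the tester's data):

```python
import statistics

def frame_time_stats(frame_times_ms):
    """Summarize a frame-time log: average FPS plus spread metrics."""
    avg_ms = statistics.mean(frame_times_ms)
    # Approximate 99th-percentile frame time: spikes show up here first.
    p99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]
    return {
        "avg_fps": round(1000.0 / avg_ms, 1),
        "avg_ms": round(avg_ms, 1),
        "p99_ms": p99_ms,
        "stdev_ms": round(statistics.pstdev(frame_times_ms), 1),
    }

# Hypothetical log: mostly ~25 ms frames (about 40 fps) with a couple of 45 ms spikes.
print(frame_time_stats([25, 24, 26, 25, 45, 25, 24, 44, 25, 26]))
```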
TESV: Skyrim
The tester somehow got the game to work at 5760 x 1080. With no AA, both chips put out similar frame rates, with the GTX TITAN having a higher mean and the R9 290X spiking more often. In the frame-latency graph, the R9 290X has a taller skyline than the GTX TITAN, which is not something to be proud of. As an added bonus, the game's VRAM usage was plotted throughout the test run.

GRID 2
GRID 2 is a surprise package for the R9 290X. The chip puts out significantly and consistently higher frame rates than the GTX TITAN with no AA, and offers lower frame latencies. Even with MSAA cranked all the way up to 8x, the R9 290X holds up pretty well on the frame-rate front, but not on frame latency.
Crysis 3
This CryEngine 3-based game offers MSAA and FXAA anti-aliasing, so it wasn't tested with AA disabled. With 4x MSAA, both chips offer similar frame rates and frame latencies. With FXAA enabled, the R9 290X offers higher average frame rates and lower latencies.
Battlefield 3
That leaves us with Battlefield 3, which, like Crysis 3, supports MSAA and FXAA. At 4x MSAA, the R9 290X offers higher average frame rates and lower frame latencies. It gets even better for AMD's chip with FXAA, on both fronts.
Overall, at 1050 MHz (core) and 4.50 GHz (memory), it's advantage AMD, going by these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many thanks to Grant Kim.

100 Comments on Radeon R9 290X Clock Speeds Surface, Benchmarked

#76
Patriot
TRWOVToo bad there aren't any GPU WUs available at WCG. I bet this would net about 200K PPD :roll:
:toast: Fellow DCer.
There are other DC projects that make good use of GPUs, both BOINC and F@H.
Posted on Reply
#77
EpicShweetness
I'm just gonna stand back, wait for the release, and let the flame war burn itself out. If it's a $600 card, it's gonna be amazing regardless. If it ends up being less than $600, we have an absolute game changer on our hands. :rockout:
Posted on Reply
#78
Fluffmeister
Fairly interesting results, if unsurprising in places (GRID 2 is just a continuation of the DiRT Showdown results). Looking at things from a purely technological-progress perspective, I can't help but think it's still being compared to what amounts to cut-down nV technology that has already been on the market in one form or another for a good year now.

I appreciate the excitement of course, and no doubt it will thankfully bring prices into more reasonable realms for all us mere mortals.
Posted on Reply
#79
FR@NK
I'm hoping for the $499-$549 range. I might do $599 for an ASUS custom PCB.
Posted on Reply
#80
sweet
FR@NKI'm hoping for the $499-$549 range. I might do $599 for an ASUS custom PCB.
FYI, there will be no custom cards for the R9 290X, just like Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition units.
We will have to wait for the R9 290, a 780 counterpart, for custom PCBs.
Posted on Reply
#81
Serpent of Darkness
Epic Fails @ Math...

@ the Tomb Raider Benchmark Graph:

-FXAA-
GTX Titan, Min = 35.0 fps; Max = 40.9 fps; avg stated = 29.1 fps; actual avg = 37.95 fps.
error off = 24.62%.

This implies that the GTX Titan's avg is smaller than its minimum...


R9 290X, Min = 32.0 fps; Max = 48.6 fps; avg stated = 38.6 fps; actual avg = 40.30 fps.
error off = 4.22%.

Looking at the huge error in the GTX Titan's figures, this graph comes into question, and so does the author's ability.

-MSAA 4x-
GTX Titan, Min = 15.1 fps; Max = 19.4 fps; avg stated = 15.1 fps; actual avg = 17.25 fps.
error off = 12.46%

Here, the author states that the minimum fps is equal to the average... Again, this is another error...

R9 290X, Min = 14.7 fps; Max = 22.0 fps; avg stated = 18.2 fps; actual avg = 18.35 fps.
error off = 0.82%
Posted on Reply
#82
RCoon
sweetFYI, there will be no custom cards for the R9 290X, just like Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition units.
We will have to wait for the R9 290, a 780 counterpart, for custom PCBs.
Like Titan (and arguably the 780) it will be made for water! :rockout:
Posted on Reply
#83
haswrong
sweetFYI, there will be no custom cards for the R9 290X, just like Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition units.
We will have to wait for the R9 290, a 780 counterpart, for custom PCBs.
AMD gone green -> AMviDia? They copy just about everything, except the performance..
I expected much, much more consistent and higher performance results.. this card changes nothing and will be easily dominated by the EVGA GTX 780 Classified for just a slightly higher price.

Basically, there's no wonder AMD was hiding these cards as long as humanly possible.. if they had a game changer, it'd be available in the summer already. Yup, it's the bitter reality..
Posted on Reply
#84
uuuaaaaaa
Serpent of Darkness@ the Tomb Raider Benchmark Graph:

-FXAA-
GTX Titan, Min = 35.0 fps; Max = 40.9 fps; avg stated = 29.1 fps; actual avg = 37.95 fps.
error off = 24.62%.

This implies that the GTX Titan's avg is smaller than its minimum...


R9 290X, Min = 32.0 fps; Max = 48.6 fps; avg stated = 38.6 fps; actual avg = 40.30 fps.
error off = 4.22%.

Looking at the huge error in the GTX Titan's figures, this graph comes into question, and so does the author's ability.

-MSAA 4x-
GTX Titan, Min = 15.1 fps; Max = 19.4 fps; avg stated = 15.1 fps; actual avg = 17.25 fps.
error off = 12.46%

Here, the author states that the minimum fps is equal to the average... Again, this is another error...

R9 290X, Min = 14.7 fps; Max = 22.0 fps; avg stated = 18.2 fps; actual avg = 18.35 fps.
error off = 0.82%
The first one that you pointed out is correct, the avg cannot be below the min fps.

The other ones are perfectly possible. The average is not (Max_fps + Min_fps)/2... these are weighted averages. Imagine a situation where your game runs at 55 fps 99% of the time, and 1% of the time where it runs at 300 fps. Is the average fps 205, or will it be closer to 55?
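
A tiny sketch makes that concrete (the 55/300 fps split is the hypothetical case above, not measured data):

```python
# Time-weighted average FPS: total frames rendered divided by total time,
# not (min + max) / 2.
def avg_fps(segments):
    """segments: list of (fps, fraction_of_total_time) pairs."""
    total_time = sum(frac for _, frac in segments)
    total_frames = sum(fps * frac for fps, frac in segments)
    return total_frames / total_time

# 99% of the run at 55 fps, 1% of it at 300 fps.
print(avg_fps([(55, 0.99), (300, 0.01)]))   # ~57.5 fps -- much closer to 55 than to 205
```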
Posted on Reply
#85
Xzibit
I don't think it's a mystery to anyone who has owned the Tomb Raider reboot that its benchmark has been screwy since launch.
Posted on Reply
#86
H82LUZ73
SlackerI can't wait for this card's NDA to be officially lifted so it gets benched. It looks really promising and maybe a great upgrade from my 6970 :)
That makes two of us. I will just go with one card this time; CrossFire is a pain to maintain sometimes. I am also interested in the Blu-ray/audio playback of this 290X card. Say, Wizzz, that is a hint ;)
Posted on Reply
#87
Recus
sweetYou are a typical victim of the false assumption created by nVidia's dynamic boost. Cards with this technology always boost themselves to the highest stable clock, provided the power target (6xx) or temperature target (Titan, 7xx) is satisfied. In the case of Titan, at stock settings it runs games at more than 1 GHz. The base clock is mostly just for show.
I know how it works. I'm talking about this: s15.postimg.org/lmwsuf5rf/Capture.jpg
Posted on Reply
#88
Patriot
uuuaaaaaaThe first one that you pointed out is correct, the avg cannot be below the min fps.

The other ones are perfectly possible. The average is not (Max_fps + Min_fps)/2... these are weighted averages. Imagine a situation where your game runs at 55 fps 99% of the time, and 1% of the time where it runs at 300 fps. Is the average fps 205, or will it be closer to 55?
His post title, "Epic Fails @ Math", was aptly chosen... :laugh:

Yes, I think many of us have noted that your min can't be above your average.
Looks like the two rows got swapped, tbh...
Posted on Reply
#89
sweet
RecusI know how it works. I'm talking about this s15.postimg.org/lmwsuf5rf/Capture.jpg
For Kepler cards like Titan, even though it says the boost clock is 876 MHz, it still boosts to ~1000 MHz when running at stock settings.
Try googling the term "Kepler boost" and you will know what I meant by "cheating".
Posted on Reply
#90
HumanSmoke
sweetFor Kepler cards like Titan, even though it says the boost clock is 876 MHz, it still boosts to ~1000 MHz when running at stock settings.
Try googling the term "Kepler boost" and you will know what I meant by "cheating".
If reasonably intelligent dynamic core tuning is your idea of cheating, then pretty much everyone is doing it. I'll take a wild guess and say you don't have an issue with Intel's or AMD's implementations (both CPU and GPU) of the same technology, considering both these companies' use of it predates Nvidia's.

Nice bait though. :toast:
Posted on Reply
#91
Ahhzz
I was pretty sure both camps were guilty of catering their cards to their "preferred brand" of benchmark... kinda sounds like Intel, way back :)
Posted on Reply
#92
HumanSmoke
AhhzzI was pretty sure both camps were guilty of catering their cards to their "preferred brand" of benchmark... kinda sounds like Intel, way back :)
Pretty much. Nvidia had the whole 3DMark03 issue, and ATI pretty much invented benchmark tuning for GPUs when it released a top-end card (Rage Pro Turbo) whose only difference from the card it replaced was a driver fine-tuned for synthetic benchmarks.
The graphics landscape is littered with "optimizations" from most anyone who ever released hardware.
Posted on Reply
#93
buildzoid
HumanSmokeIf reasonably intelligent dynamic core tuning is your idea of cheating, then pretty much everyone is doing it. I'll take a wild guess and say you don't have an issue with Intel's or AMD's implementations (both CPU and GPU) of the same technology, considering both these companies' use of it predates Nvidia's.

Nice bait though. :toast:
AMD doesn't have dynamic OC on their GPUs. Intel and AMD CPUs only turbo to the specified turbo frequency: if it says max turbo frequency 4.2 GHz on the box, the CPU will run up to 4.2 GHz and never above it. If Nvidia specified the boost speed as 1 GHz it wouldn't be cheating, but when you say 867 MHz and run at 1 GHz, it's cheating.
Posted on Reply
#94
HumanSmoke
buildzoidAMD doesn't have dynamic OC on their GPUs. Intel and AMD CPUs only turbo to the specified turbo frequency: if it says max turbo frequency 4.2 GHz on the box, the CPU will run up to 4.2 GHz and never above it. If Nvidia specified the boost speed as 1 GHz it wouldn't be cheating, but when you say 867 MHz and run at 1 GHz, it's cheating.
It's actually 876 MHz... and that figure is the "minimum guaranteed" boost (the maximum guaranteed sustainable clock), not the maximum achievable, which is fairly common knowledge to be 992 MHz at default voltage (9 speed bins of 13 MHz/0.013 V steps). It's not so much cheating as being ignorant of how the boost algorithm actually works.

As an amusing aside, why do a number of people howl about the max sustainable boost over the minimum guaranteed boost, yet always quote the lower specification numbers associated with the base frequency? For example, every man and his AMD-loving dog seems to attribute Titan's FP32 throughput as 4,500 GFLOPS of compute... yet if you take the boost into consideration (i.e. just as in gaming benchmarks), the number is actually around 5,333. Weird, huh?
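
For anyone following the arithmetic, both figures fall straight out of the published Kepler numbers; a rough sketch (Titan's 2688 shaders and 837 MHz base clock are the commonly published specs, assumed here rather than taken from this thread):

```python
# Kepler does 2 FP32 FLOPs per shader per clock, so throughput = shaders * 2 * clock.
SHADERS  = 2688            # GTX Titan (GK110), published spec -- assumption, not from the thread
BASE_MHZ = 837             # published base clock -- assumption, not from the thread
BOOST_FLOOR_MHZ = 876      # "minimum guaranteed" boost quoted above
BIN_MHZ, BINS   = 13, 9    # speed bins quoted above

max_boost_mhz = BOOST_FLOOR_MHZ + BINS * BIN_MHZ     # 993 MHz (quoted above as 992/993)
gflops_base   = SHADERS * 2 * BASE_MHZ / 1000        # ~4500 GFLOPS, the commonly quoted figure
gflops_boost  = SHADERS * 2 * 992 / 1000             # ~5333 GFLOPS at the 992 MHz ceiling

print(max_boost_mhz, round(gflops_base), round(gflops_boost))   # 993 4500 5333
```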

Anyhow, we seem to be getting off the subject.
Posted on Reply
#96
sweet
HumanSmokeIt's actually 876 MHz... and that figure is the "minimum guaranteed" boost (the maximum guaranteed sustainable clock), not the maximum achievable, which is fairly common knowledge to be 992 MHz at default voltage (9 speed bins of 13 MHz/0.013 V steps). It's not so much cheating as being ignorant of how the boost algorithm actually works.

As an amusing aside, why do a number of people howl about the max sustainable boost over the minimum guaranteed boost, yet always quote the lower specification numbers associated with the base frequency? For example, every man and his AMD-loving dog seems to attribute Titan's FP32 throughput as 4,500 GFLOPS of compute... yet if you take the boost into consideration (i.e. just as in gaming benchmarks), the number is actually around 5,333. Weird, huh?

Anyhow, we seem to be getting off the subject.
I understand your point of view, but you seem to be new to the term "Kepler boost". The difference between 876 MHz and 993 MHz (9 steps of 13 MHz) is significant. And in any case, running the card in benchmarks at a higher clock than its rated one is an act of cheating. Dynamic boost is a nice feature for users, though.

On the topic, the final specs ;)
www.webhallen.com/se-sv/hardvara/184362-asus_radeon_r9-290x_4gb_battlefield_4_bundle_limited_edition__battlefield_4_premium

GPU Codename – Hawaii
GPU Process – 28 nm
Stream Processors – 2816
TMUs – 176
ROPs – 44
Base Clock – 800 MHz
Turbo Clock – 1000 MHz
VRAM – 4GB GDDR5
Memory Bus – 512-Bit
Memory Clock – 1250 MHz (5GHz Effective)
Memory Bandwidth – 320 GBps
Power Configuration – 8+6 Pin
PCB VRM – 5+1+1
Die Size – 424mm2
Transistors – 6 billion
3D Technology – DirectX 11.2, OpenGL 4.3 and Mantle
Audio Technology – AMD TrueAudio
Personally, I'm disappointed with the VRM configuration :mad: The core clock will be hard to push further, but the memory could go much higher if Samsung/Hynix chips were used.
Posted on Reply
#97
Blín D'ñero
sweet[...]

On the topic, the final specs ;)
www.webhallen.com/se-sv/hardvara/184362-asus_radeon_r9-290x_4gb_battlefield_4_bundle_limited_edition__battlefield_4_premium

GPU Codename – Hawaii
GPU Process – 28 nm
Stream Processors – 2816
TMUs – 176
ROPs – 44
Base Clock – 800 MHz
Turbo Clock – 1000 MHz
VRAM – 4GB GDDR5
Memory Bus – 512-Bit
Memory Clock – 1250 MHz (5GHz Effective)
Memory Bandwidth – 320 GBps
Power Configuration – 8+6 Pin
PCB VRM – 5+1+1
Die Size – 424mm2
Transistors – 6 billion
3D Technology – DirectX 11.2, OpenGL 4.3 and Mantle
Audio Technology – AMD TrueAudio
Personally, I'm disappointed with the VRM configuration :mad: The core clock will be hard to push further, but the memory could go much higher if Samsung/Hynix chips were used.
What "final". That page states:
Vi kommer endast att få in ett fåtal av dessa så passa på att boka nu! Observera att priset på produkten och nedan specifikationer är preliminära och kan komma att ändras (framförallt klockfrekvenserna är inte 100 % bekräftade ännu).
Bokningar är inte bindande och om vi justerar ner priset får ni givetvis tillbaka mellanskillnaden, precis som vanligt på Webhallen med andra ord.
in google english:
We will only get a few of these in, so make sure to book now! Please note that the price of the product and the specifications below are preliminary and subject to change (in particular, the clock frequencies are not 100% confirmed yet).
Reservations are not binding, and if we adjust the price downward you will of course get the difference back, as usual at Webhallen.
7299 Kr (Swedish Kroner) is € 845. If that's true, then...:twitch:

Hmm well they sell the Titan for 8490 Kr ( = € 983,-) ... which is normal.

[EDIT:
Naaa... price is already known to be $599, and their description translates:
Price, images, product description and release date are preliminary and subject to change.

[...]
/EDIT]
Posted on Reply
#98
sweet
Blín D'ñero7299 Kr (Swedish Kroner) is € 845. If that's true, then...:twitch:

Hmm well they sell the Titan for 8490 Kr ( = € 983,-) ... which is normal.
There are only 8,000 units of this version. It is bundled with BF4 Premium (~$140 on Origin), and carries a premium price for the first owners.
Posted on Reply
#99
Blín D'ñero
The description says
Pris, bild, produktbeskrivning och releasedatum är preliminära och kan komma att ändras.
ASUS Radeon R9-290X 4GB med Battlefield 4 på köpet i extremt begränsad utgåva!

Exklusivt för Webhallen får du dessutom battlefield 4 Premium på köpet. Premium innehåller bland annat:
• Fem expansioner, som du dessutom får tillgång till två veckor före andra.
• Prioritet till kö på fulla servrar.
• 12 st Battlepacks (nya vapen, skins, kamoflage, dog tags etc)
in google english:
Price, images, product description and release date are preliminary and subject to change.
ASUS Radeon R9-290X 4GB with Battlefield 4 included, in an extremely limited edition!

Exclusive to Webhallen, you also get Battlefield 4 Premium for free. Premium includes, among other things:
• Five expansions, which you also get access to two weeks before everyone else.
• Priority in the queue on full servers.
• 12 Battlepacks (new weapons, skins, camouflage, dog tags, etc.)
For free.
That makes the card € 845.
Posted on Reply
#100
erocker
*
Thread cleaned up. Any trolling, flaming or posting in an uncivilized manner may result in the loss of posting privileges.

Please behave appropriately.

Thank you.
Posted on Reply