Monday, October 7th 2013

Final Radeon R9 290 Series Specifications Leaked

Disappointed at the $729.99 Newegg.com pricing of the Radeon R9 290X? No worries. AMD's second SKU based on the "Hawaii" silicon could be lighter on the wallet. Japanese retailers leaked the specification sheets of both the upcoming R9 290X and its lighter sibling, the R9 290 (non-X). Specifications of the R9 290X match rumors: the chip features 2,816 stream processors, GPU clocks of up to 1000 MHz, single-precision floating-point performance of 5.63 TFLOP/s, and 4 GB of GDDR5 memory across a 512-bit wide memory interface, clocked at 5.00 GHz, yielding 320 GB/s of memory bandwidth. The R9 290, on the other hand, features 2,560 stream processors, GPU clocks of up to 948 MHz, 4.9 TFLOP/s of single-precision floating-point performance, and the same memory subsystem as the R9 290X. Both cards feature an identical combination of power connectors, one 8-pin and one 6-pin PCIe, and both feature hardware support for DirectX 11.2, OpenGL 4.3, and Mantle.
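Those throughput figures follow directly from the listed specs. A quick back-of-the-envelope sketch (not from the spec sheets themselves), assuming GCN's two FLOPs (one fused multiply-add) per stream processor per clock and GDDR5's effective data rate:

```python
def sp_tflops(stream_processors, clock_mhz):
    # Peak single-precision: 2 FLOPs (one FMA) per stream processor per clock
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

def bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    # Bytes per transfer (bus width / 8) times effective transfer rate
    return bus_width_bits / 8 * effective_clock_ghz

print(sp_tflops(2816, 1000))    # R9 290X -> ~5.63 TFLOP/s
print(sp_tflops(2560, 948))     # R9 290  -> ~4.85 TFLOP/s
print(bandwidth_gbs(512, 5.0))  # both    -> 320.0 GB/s
```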

Source: Hermitage Akihabara

52 Comments on Final Radeon R9 290 Series Specifications Leaked

#1
TheinsanegamerN
by: a_ump
Eh, 2560 to 2816 is only a ~9.1% difference, not 25%; just thought I'd clarify. I suppose if you take into account the lower clocks I could see that, though. Maybe, but they have the same memory, and a 50 MHz downclock isn't really that much imo, plus you know vendors are going to release custom PCBs and OC'd-out-of-the-box cards. We'll just have to wait, but I think performance between those two is going to be very close.

EDIT: Also, how does the dual-GPU card fit into AMD's new naming scheme? R9 295X? 290XT(X)?
R299XTreme Space Heater Edition?
#2
Yeoman
by: a_ump
Eh, 2560 to 2816 is only a ~9.1% difference, not 25%; just thought I'd clarify. I suppose if you take into account the lower clocks I could see that, though. Maybe, but they have the same memory, and a 50 MHz downclock isn't really that much imo, plus you know vendors are going to release custom PCBs and OC'd-out-of-the-box cards. We'll just have to wait, but I think performance between those two is going to be very close.

EDIT: Also, how does the dual-GPU card fit into AMD's new naming scheme? R9 295X? 290XT(X)?
R9-290X2 might be a basic solution; after all, the X is already present. But it will be interesting to see what they go with. I'm a little surprised they didn't leave the "x90" spot for a dual-GPU option.
#3
jihadjoe
So 290X vs Titan and 290 vs GTX780?
#4
hardcore_gamer
by: Yeoman
R9-290X2 might be a basic solution; after all, the X is already present. But it will be interesting to see what they go with. I'm a little surprised they didn't leave the "x90" spot for a dual-GPU option.
Maybe they haven't planned a dual-GPU card because the power consumption of a single card is already too high. :confused:
#5
Yeoman
by: hardcore_gamer
Maybe they haven't planned a dual-GPU card because the power consumption of a single card is already too high. :confused:
Could be. Though I suppose only time will tell.

A 7990 is already a pretty power-hungry beast, but judging from the relatively low clock speeds of the 290/290X (about 800 MHz sans any turbo function, I believe), I don't know if some kind of twin 290/290X is out of the question entirely.

I suppose '90' was never a clear indicator of whether a card was dual-GPU or not; after all, there is a 7790 (obviously not dual 7750s or whatever), so I think there are a number of ways open to them to mark something as a dual-GPU flagship card... I guess it's not something they will sweat right now. It will be a while before we see one, I suppose, although hopefully not as long as it took for a reference 7990 to materialise.
#6
HisDivineOrder
Looks like we may FINALLY get a GK110 part for less than $500, and it will be because AMD FINALLY decided to actually compete. This is why them sitting out the GPU race for two years was such a horrible thing for all of us.

This is why relying on bundles for two years is a horrible move. This is why them starting the current generation (7970) out with a $50 increase for a marginal 10-20% gain in performance was such a horrible thing for us all. That allowed nVidia to rely on a mid-range part at high-end pricing that was only very barely less expensive.

Now AMD shows up, two years later and after six months of nVidia owning the high end, with their true high-end part. And they're raising the price... again, on the high end.

At least they're also releasing a lower part at a reasonable price, which will force nVidia to do the same. But man, it sure took them forever to do it.
#7
Brusfantomet
by: bogami
512-bit and 4 GB of RAM! I'm a little concerned whether that is enough! The last time ATI made a 512-bit part, the Radeon 2900 XT, it didn't deliver the expected performance and the 8800 GTX won the day!
I seriously hope the R9 290X will be better and cheaper than the TITAN and GTX 780. :D
Given that NVIDIA seriously overprices its best products, only a competitive manufacturer can splinter prices!
We don't want just a €500 price on the market, but also much more capable GPUs!
The first visible test results show a better product than the competition, though the price is a little too high. Can't wait for the first real tests; the 15th is coming, and if all goes well, it will be time to buy an AMD GPU! :toast:
It was a 512-bit internal bus; the memory interface to the RAM was 256-bit. The 8800 GTX had 384-bit.

Might be time to retire my two 6950 cards.
#8
NeoXF
by: Brusfantomet
It was a 512-bit internal bus; the memory interface to the RAM was 256-bit. The 8800 GTX had 384-bit.

Might be time to retire my two 6950 cards.
The GTX 280/285 was 512-bit...
#9
Xzibit
by: jigar2speed
But but but Crossfire is broken- AMD driver Sucks :slap: /flame suits on.
Unless AMD is going the Nvidia route and only supporting DirectX 11.2 through software calls (only the 780/Titan support DX 11.1 natively), there is a small possibility they improved the new line-up. Don't hold your breath, though.

Sapphire's specs have all chips supporting DirectX 11.2, PowerTune, and ZeroCore from the R7 240 on up, so we're looking at new/improved chips across the line.

If not, Raj mentioned a phase-2 driver to be released in autumn.
#10
TheinsanegamerN
These cards look wonderful. Now, whether to get an R9 290/290X, or spend only $300 and grab a 7970 while I can... the 550 Ti is getting a little slow for 1080p gaming.
#11
corsaro
by: TheinsanegamerN
These cards look wonderful. Now, whether to get an R9 290/290X, or spend only $300 and grab a 7970 while I can... the 550 Ti is getting a little slow for 1080p gaming.
That's like "I think my 7790 will have a problem with 4K games!"
#12
Hilux SSRG
Still trying to wrap my head around AMD's new gfx naming scheme. Does anyone know what the die size will be on this monster? And if it's 28nm?
#13
W1zzard
by: Hilux SSRG
Still trying to wrap my head around AMD's new gfx naming scheme. Does anyone know what the die size will be on this monster? And if it's 28nm?
it's 28 nm because there is no better process available at this time at TSMC, where all NV and AMD GPUs are produced.
#14
Intel God
Has everyone seen this terrible 290X Fire Strike score?



Compared to stock-clocked 780s.

#15
v12dock
Hmm, the FP32 performance is ~1.6 TFLOP/s higher than the 2880-shader Kepler chip's. The R9 290X should be a computing beast.

#16
jihadjoe
by: jigar2speed
^ +1. Also, AMD seems to be holding these cards back on specs - the GTX 770 has 7 GHz RAM; the same applied to the R9 would boost the memory bandwidth heavily.
I imagine 7 GHz RAM is now in short supply. If you check out the recent batch of reviews, some of the new cards barely make it past 6 GHz.
#17
Ahhzz
by: HopelesslyFaithful
I find the 280x more interesting. You get way more performance with 2 cards at 600 bucks compared to the 290x single :/
I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with Crossfire setups, and, iirc, the (doubled) RAM is not used in a XFire setup. Plus, the 280X is just a rebranded 7970 with a little boost, which doesn't put me in a comforted mood... I think I'll just hope for a good price point on the 290 in a year or so.


by: Brusfantomet
it was 512 bit internal bus, the memory interface to the ram was 256 bit, GTX 8800 had 384 bit.

Might be time to retire my to 6950 card.
I'll be making that statement this time next year :toast: Little early to adopt it, plus I'm not budgeted for it yet, but next year, I'll be scrounging some pennies!
#18
HumanSmoke
by: v12dock
Hmm the FP32 performance ~1.6TF higher than the 2880 kepler chip. The R9 290X should be a computing beast
Will depend on driver support and application. Case in point: the FirePro W8000 should annihilate the Quadro K5000 (3,225 GFLOPS vs 2,169, a 48% advantage), but due to the vagaries of workload and priorities in driver writing...

Hardware is only one part of the equation... as is usual.
#19
Blín D'ñero
by: Ahhzz
I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with the Crossfire setups, and, iirc, the (double) RAM is not used in a XFire setup. Plus, the 280x is just a [...]
List those "too many"? You can't, because the reality is that Crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.
#20
HopelesslyFaithful
by: Ahhzz
I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with the Crossfire setups, and, iirc, the (double) RAM is not used in a XFire setup. Plus, the 280x is just a rebranded 7970 with a little boost, which doesn't put me in a comforted mood... I think I'll just hope for a good price point in a year or so on the 290.




I'll be making that statement this time next year :toast: Little early to adopt it, plus I'm not budgeted for it yet, but next year, I'll be scrounging some pennies!
I thought I'd included saying the 7970/280X would be more interesting. Granted, I am a big F@H fan, so that's why I would go with the CrossFire setup personally. But I get the interest in the 290/X.
#21
TheinsanegamerN
by: Blín D'ñero
List those "too many"? You can't, because reality is that crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.
He is referring to the fact that some people believe you get double the frame buffer with two cards, i.e. two 3 GB 7970s would give you 6 GB of frame buffer, when in fact you only have 3 GB.
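The mirroring described above is easy to model: under AFR, each GPU keeps a full copy of every texture and buffer, so the usable pool equals the smallest card's VRAM, not the sum. A minimal sketch (function name is mine, for illustration):

```python
def effective_vram_gb(card_vram_gb):
    # AFR mirrors all resources on every GPU, so usable VRAM is
    # bounded by the smallest card's framebuffer, not the total.
    return min(card_vram_gb)

print(effective_vram_gb([3, 3]))  # two 3 GB 7970s -> 3, not 6
```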
#22
TheinsanegamerN
by: corsaro
That's like "I think my 7790 will have a problem with 4K games!"
Sorry, I meant 550 Tis (SLI). Forgot the s on the end.
#23
theoneandonlymrk
Well well, the day has come. Now come on, W1zzard, release a review, it's Oct 8 :D Some of us are still up n' all :respect::respect::D
#24
eidairaman1
by: MikeGR7
I sure hope we'll see the glory days of 6950 again and 290 unlocks to 290X!
You must have come after the R300 series.
#25
Ahhzz
by: Blín D'ñero
List those "too many"? You can't, because reality is that crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.
Let's see... Skyrim had issues with Crossfire, as did Fallout 3 and New Vegas; I heard that Serious Sam had some problems, never played it myself; COD: Black Ops 2, BC2, and I heard about BF3 too, though I didn't play it... Yeah, they come out with their CAP for them sometime down the road, but there are still issues that you shouldn't have to deal with. I'm ignoring the statements about "growing pains" and the like.



Here. You may educate yourself.
http://www.tomshardware.com/forum/245454-33-crossfire-faqs

"The data is mirrored by both cards so two 1GB cards will still result in 1GB of VRAM being available."

So, in games like Skyrim, where VRAM is at a premium, loading large-resolution texture packs can hit your VRAM limit very quickly. A dual 2 GB card setup doesn't provide you with 4 GB of usable VRAM; you're limited to the 2 GB, since the data is mirrored on both cards.

Nice tone, by the way. :nutkick: