
Final Radeon R9 290 Series Specifications Leaked

Eh, 2560 to 2816 is only a ~9.1% difference, not 25%; just thought I'd clarify. I suppose if you take the lower clocks into account I could see that, though. Maybe, but they have the same memory, and a 50 MHz downclock isn't really that much IMO; plus you know vendors are going to release custom-PCB and factory-overclocked cards. We'll just have to wait, but I think performance between those two is going to be very close.

EDIT: Also, how does the dual-GPU card fit into AMD's new naming scheme? R9 295X? 290XT(X)?
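For what it's worth, the gap depends on which card you take as the baseline, and the rumored downclock widens it a bit. A quick sketch (the clock speeds are hypothetical placeholders, since final clocks weren't public at the time):

```python
# Shader-count gap between the leaked R9 290X (2816 SPs) and R9 290 (2560 SPs).
sp_290x, sp_290 = 2816, 2560

diff_up = (sp_290x - sp_290) / sp_290 * 100     # relative to the 290's count
diff_down = (sp_290x - sp_290) / sp_290x * 100  # relative to the 290X's count
print(f"+{diff_up:.1f}% more SPs, or {diff_down:.1f}% fewer")  # +10.0% / 9.1%

# Fold in the rumored ~50 MHz downclock on the 290. These clocks are
# assumptions for illustration only, not confirmed specifications.
clk_290x, clk_290 = 1000, 950  # MHz, hypothetical
gap = (sp_290x * clk_290x) / (sp_290 * clk_290) - 1
print(f"~{gap * 100:.1f}% raw ALU throughput gap")  # ~15.8%
```

So both "~9.1%" and "~10%" are defensible readings of the same spec gap, and neither is anywhere near 25%.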

I think he means the difference between the 290 (non-X) at 2560 SPs and the R9 280X at 2048 SPs, which is 25%.
 
eh, 2560 to 2816 that's only a ~9.1% difference, not 25%. [...]

EDIT: also how does the dual-gpu card fit into AMD's new naming-scheme? R9 295X? 290XT(X)?

R299XTreme Space Heater Edition?
 
eh, 2560 to 2816 that's only a ~9.1% difference, not 25%. [...]

EDIT: also how does the dual-gpu card fit into AMD's new naming-scheme? R9 295X? 290XT(X)?

R9-290x2 might be an obvious solution; after all, the X is already present. But it will be interesting to see what they go with. I'm a little surprised they didn't leave the "x90" spot open for a dual-GPU option.
 
So 290X vs Titan and 290 vs GTX780?
 
R9-290x2 might be a basic solution, afterall the x is already present. [...]

Maybe they haven't planned a dual-GPU card because the power consumption of a single card is already too high. :confused:
 
Maybe they haven't planned a dual-GPU card because the power consumption of a single card is already too high. :confused:

Could be. Though I suppose only time will tell.

A 7990 is already a pretty power-hungry beast. Judging from the relatively low clock speeds of the 290/290X (about 800 MHz sans any turbo function, I believe), I don't know if some kind of twin 290/290X is out of the question entirely.

I suppose '90' was never a clear indicator of whether a card was dual-GPU or not; after all, there is a 7790 (obviously not dual 7750s or whatever). So I think there are a number of ways open to them to mark something as a dual-GPU flagship card... I guess it's not something they will sweat right now. It will be a while before we see one, I suppose, although hopefully not as long as it took for a reference 7990 to materialise.
 
Looks like we may FINALLY get a GK110 part that's less than $500 and it will be because AMD FINALLY decided to actually compete. This is why them sitting the GPU race out for two years was such a horrible thing for all of us.

This is why relying on bundles for two years is a horrible move. This is why them starting the current generation (7970) out with a $50 increase for a marginal 10-20% gain in performance was such a horrible thing for us all. That allowed nVidia to get away with a mid-range part at high-end pricing that was only barely less expensive.

Now AMD shows up two years later after six months of nVidia owning the high end with their true high end part. And they're raising the price... again on the high end.

At least they're also releasing a lower part at a reasonable price, which will force nVidia to do the same. But man it sure took them forever to do it.
 
512-bit and 4 GB of RAM! I'm a little concerned whether that will be enough. The last time ATI went 512-bit, with the HD 2900 XT, it didn't deliver the expected performance and the 8800 GTX won the day.
I seriously hope the R9 290X will be better and cheaper than the Titan and GTX 780. :D
Given how seriously NVIDIA overprices its top products, only the best products from a competitive manufacturer can bring prices down.
We don't just want €500 pricing on the market, we want much more capable GPUs too!
The first visible test results show a better product than the competition, though the price is a little too high. Can't wait for the first real tests; the 15th is coming, and if all goes well it will be time to buy an AMD GPU! :toast:

It was a 512-bit internal bus; the memory interface to the RAM was 256-bit. The 8800 GTX had 384-bit.

Might be time to retire my two 6950 cards.
 
it was 512 bit internal bus, the memory interface to the ram was 256 bit, GTX 8800 had 384 bit. [...]

GTX 280/285 was 512bit...
 
But but but, Crossfire is broken, AMD drivers suck :slap: /flame suit on.

Unless AMD is going the Nvidia route and only supporting the newer DirectX feature levels in software (only the 780/Titan support DX 11.1 natively). There is a small possibility they improved the new line-up. Don't hold your breath, though.

Sapphire's specs have all chips supporting DirectX 11.2, PowerTune & ZeroCore from the R7 240 on up, so we're looking at new/improved chips across the line.

If not, Raj mentioned a phase-2 driver to be released in autumn.
 
These cards look wonderful. Now, whether to get an R9 290/290X, or spend only $300 and grab a 7970 while I can... the 550 Ti is getting a little slow for 1080p gaming.
 
These cards look wonderful [...] the 550ti is getting a little slow for 1080p gaming.

That's like "I think my 7790 will have a problem with 4K games!"
 
Still trying to wrap my head around AMD's new gfx naming scheme. Does anyone know what the die size will be on this monster? And if it's 28nm?
 
[...] Does anyone know what the die size will be on this monster? And if it's 28nm?

It's 28 nm, because there is no better process available at this time at TSMC, where all NV and AMD GPUs are produced.
 
Has everyone seen this terrible 290X firestrike score?

[Image: leaked R9 290X Fire Strike score]

Compared to a stock-clocked 780's:

[Image: GTX 780 Fire Strike score]
 
Hmm, the FP32 performance is ~1.6 TF higher than the 2880-shader Kepler chip's. The R9 290X should be a computing beast.

[Image: FP32 compute comparison charts]
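For context, peak FP32 throughput on these GPUs is just shaders x clock x 2 FLOPs (one fused multiply-add per shader per clock). A rough sketch, with the clocks as unconfirmed placeholders rather than known specs:

```python
# Peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock.
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# Clocks below are assumptions for illustration; neither was confirmed here.
r9_290x = peak_tflops(2816, 1000)  # ~5.63 TFLOPS at a hypothetical 1 GHz
gk110 = peak_tflops(2880, 875)     # ~5.04 TFLOPS at a hypothetical 875 MHz
print(f"R9 290X ~{r9_290x:.2f} TF vs GK110 ~{gk110:.2f} TF")
```

The actual delta depends entirely on the final clocks, which is why the leaked figures vary so much.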
 
^ +1. Also, AMD seems to be holding these cards back on memory specs: the GTX 770 has 7 GHz RAM, and the same applied to the R9 would boost memory bandwidth heavily.

I imagine 7 GHz RAM is now in short supply. If you check out the recent batch of reviews, some of the new cards barely make it past 6 GHz.
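The bandwidth math here is simple: bus width in bytes times the effective data rate. A sketch using the leaked 290X figures (the 7 Gbps case on the 512-bit bus is hypothetical):

```python
# Memory bandwidth in GB/s: (bus width / 8) bytes per transfer x data rate in Gbps.
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits / 8 * rate_gbps

r9_290x = bandwidth_gbs(512, 5.0)       # 320.0 GB/s, the leaked spec
hypothetical = bandwidth_gbs(512, 7.0)  # 448.0 GB/s with GTX 770-class 7 Gbps chips
gtx_770 = bandwidth_gbs(256, 7.0)       # 224.0 GB/s for comparison
print(r9_290x, hypothetical, gtx_770)
```

The wide 512-bit bus lets AMD hit high bandwidth with slower, cheaper memory chips, which fits the short-supply theory above.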
 
I find the 280X more interesting. You get way more performance with two cards at 600 bucks compared to a single 290X :/

I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with Crossfire setups, and, IIRC, the (double) RAM is not used in a CrossFire setup. Plus, the 280X is just a rebranded 7970 with a little boost, which doesn't put me in a comfortable mood... I think I'll just hope for a good price point in a year or so on the 290.


[...] Might be time to retire my two 6950 cards.

I'll be making that statement this time next year :toast: A little early to adopt it, plus I'm not budgeted for it yet, but next year I'll be scrounging some pennies!
 
Hmm, the FP32 performance is ~1.6 TF higher than the 2880-shader Kepler chip's. The R9 290X should be a computing beast.
Will depend on driver support and application. Case in point: the FirePro W8000 should annihilate the Quadro K5000 (3225 GFLOPS vs 2169, a 48% advantage), but due to the vagaries of workload and priorities in driver writing...

[Image: SPECviewperf results]


Hardware is only one part of the equation...as is usual
 
I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with the Crossfire setups, and, iirc, the (double) RAM is not used in a XFire setup. Plus, the 280x is just a [...]

List those "too many"? You can't, because reality is that crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.
 
I love your logic, and almost agree. But I can't.
Too many "high-end" games don't do well with the Crossfire setups [...]

I'll be making that statement this time next year [...]

I thought I included saying the 7970/280X would be more interesting. Granted, I am a big F@H fan, so that's why I would go with the CrossFire setup personally. But I get the interest in the 290/X.
 
List those "too many"? You can't [...] Which "(double) RAM" is not used?? Educate yourself about AFR.

He is referring to the fact that some people believe you get double the framebuffer with two cards, i.e. two 3 GB 7970s would give you 6 GB of framebuffer, when in fact you only have 3 GB usable: with AFR, each card renders alternate frames and keeps its own full copy of the data.
 
that's like '' i think my 7790 will have a problem with 4k games!''

Sorry, meant 550 Tis (SLI). Forgot the s on the end.
 
Well well, the day has come. Now come on, Wizzard, release a review, it's Oct 8 :D Some of us are still up and all :respect::respect::D
 