Wednesday, October 9th 2013

Radeon R9 290X Features 64 ROPs

A leaked company slide by AMD confirmed that its high-end "Hawaii" silicon indeed features 64 raster operations units (ROPs). Compared to its predecessor, "Tahiti," the slide claims 2 times the ROPs (32 on "Tahiti") and 1.4 times the stream processors (2048 on "Tahiti," which works out to 2816 on "Hawaii"). Other known specifications include up to 1 GHz GPU clock, up to 5.00 GHz memory clock, and a 512-bit wide GDDR5 memory interface holding 4 GB of memory. Reviews of the Radeon R9 290X could surface around mid-October.
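Assuming the leaked figures are accurate, the slide's multipliers and the peak memory bandwidth they imply are easy to sanity-check in a few lines (note that the "1.4x" stream-processor figure is a rounding of the exact 1.375x ratio):

```python
# Sanity-checking the leaked "Hawaii" specs against "Tahiti" (figures from the slide)
tahiti_rops, tahiti_sps = 32, 2048
hawaii_rops, hawaii_sps = 64, 2816

rop_scale = hawaii_rops / tahiti_rops  # 2.0, matching the "2x ROPs" claim
sp_scale = hawaii_sps / tahiti_sps     # 1.375, which the slide rounds up to "1.4x"

# Peak GDDR5 bandwidth = bus width in bytes x effective data rate (GT/s)
bus_bits, data_rate_gtps = 512, 5.0
bandwidth_gb_s = (bus_bits / 8) * data_rate_gtps  # 320.0 GB/s

print(rop_scale, sp_scale, bandwidth_gb_s)
```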

Source: WCCFTech

130 Comments on Radeon R9 290X Features 64 ROPs

#1
hardcore_gamer
by: Big_Vulture
Add to that the Mantle API, and GK110 is a little crying kid :laugh:.
That depends on how many devs adopt the API. I'm not a fan of proprietary APIs, including PhysX. This will further fragment PC gaming.
#2
TheHunter
by: hardcore_gamer
That depends on how many devs adopt the API. I'm not a fan of proprietary APIs, including PhysX. This will further fragment PC gaming.
I said this somewhere else:
Or better yet, remove DirectX/OpenGL from the picture and focus only on Mantle and NVAPI.

Both architectures would be used to their fullest, and no one would complain about it.
At least in next-gen engines, since older GPUs won't be strong enough anyway.
#3
TheGuruStud
by: hardcore_gamer
That depends on how many devs adopt the API. I'm not a fan of proprietary APIs, including PhysX. This will further fragment PC gaming.
It's not proprietary (just like everything else AMD is doing), and it's designed to unify gaming. DirectX is terrible, and I guess AMD felt OpenGL is far too limited and consoles need Mantle to squeeze out performance.

The ball is in Nvidia's court, but they're the proprietary douche bags.
#4
Hilux SSRG
by: TheGuruStud
It's not proprietary (just like everything else AMD is doing), and it's designed to unify gaming. DirectX is terrible, and I guess AMD felt OpenGL is far too limited and consoles need Mantle to squeeze out performance.

The ball is in Nvidia's court, but they're the proprietary douche bags.
No one yet knows if it's proprietary with a licensing cost to NVidia or completely open. I actually wouldn't mind if they charged NVidia an annual fee.

Don't forget that AMD has the proprietary TressFX.
#5
Crap Daddy
by: TheGuruStud
It's not proprietary (just like everything else AMD is doing), and it's designed to unify gaming. DirectX is terrible, and I guess AMD felt OpenGL is far too limited and consoles need Mantle to squeeze out performance.

The ball is in Nvidia's court, but they're the proprietary douche bags.
It is. It works only on the GCN arch. Wait, it is still IN the works. We'll see again in two months' time what's with this in ONE game. As for the rest, we'll just have to wait more. Until then we can enjoy our non-Mantle games on any hardware we might have.
#6
TRWOV
What I find maddening about this card is the 8+6 pin connector: only 300 W TDP for such a beast? :twitch:
#7
Hilux SSRG
by: TRWOV
What I find maddening about this card is the 8+6 pin connector: only 300 W TDP for such a beast?
That's concerning, especially because the die size is huge, 424 mm² I think. AIB vendors will up it to 8+8 pins, no doubt.
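For context, the PCI Express specification budgets 75 W for the slot plus 75 W per 6-pin and 150 W per 8-pin auxiliary connector, which is where the 300 W ceiling for the reference 8+6 layout comes from; a minimal sketch of the arithmetic (the 8+8 figure is for the hypothetical AIB design mentioned above):

```python
# Board power ceilings implied by PCIe connector configurations (watts, per PCI-SIG limits)
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

reference_8_6 = SLOT + EIGHT_PIN + SIX_PIN    # 300 W: the stock 8+6-pin layout
aib_8_8 = SLOT + EIGHT_PIN + EIGHT_PIN        # 375 W: a hypothetical 8+8-pin AIB board

print(reference_8_6, aib_8_8)  # 300 375
```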
#8
esrever
by: Hilux SSRG
No one yet knows if it's proprietary with a licensing cost to NVidia or completely open. I actually wouldn't mind if they charged NVidia an annual fee.

Don't forget that AMD has the proprietary TressFX.
TressFX runs on DirectCompute, which is DX11. Nvidia runs it as well.
#9
Hilux SSRG
by: esrever
TressFX runs on DirectCompute, which is DX11. Nvidia runs it as well.
I stand corrected sir/madam. :slap:
#10
Kaleid
by: FX-GMC
Lol, this guy.
It's a tiny fan, it's crap.
#11
TRWOV
What's so bad about reference coolers? The 7870 I have in my HTPC is reference, and it isn't noisy. Last time I checked, temps were in the low 70s, but considering the case only has a single 120mm exhaust and it's OCed to 1200/1500, I'd say temps are fine.
#12
the54thvoid
by: Big_Vulture
Add to that the Mantle API, and GK110 is a little crying kid :laugh:.
Hmm, I like how people anthropomorphize inanimate objects. It won't make a graphics card cry. It will only make idiots cry. And it will only make idiots shout triumphantly as if owning a certain brand is like following a football team.

Anybody using terms like "This is gonna kick Titan's butt" or "Titan fall" or "this will make GK110 cry" has lost the plot. It's a piece of technology, not your brother in a schoolyard fight.

Folk need to just grow up.

Here's how it is. If the R9 290X truly trounces the performance of a GTX Titan, then that is a very good thing for everyone. As a person who puts money aside for tech products, I'll go and buy the AMD card (once water blocks are out). To me it's a win-win situation.

In BF4 at 2560x1440 with Ultra settings I get roughly 60fps, with drops to 50 and peaks at 70. That's how fast Titan is but if 290X gets 15% more (minimum 60fps) I'll get that card as well.

And in 'x' months Nvidia will release another card that will beat the performance of the 290X, and the idiots will come clamoring back saying '290 Xpired' and other such nonsense.

People - it's a graphics card. If you're getting in any way attached to it, please, go outside and feel the breeze, cuddle your partner or phone your dear old mum. There's more to life than triumphalism/defeatism about tech for goodness sake.
#13
Fluffmeister
The very same GK110 GPUs that were installed and running in the Titan SC over a year ago. :P

I for one certainly hope AMD's latest and greatest GPU can mix it with year-old nV tech.

Can't wait till they actually frickin release it.
#14
acerace
Looks like some owners are butthurt. Huh, typical when a new product is announced.
(Not related to any living or dead person).
#15
SIGSEGV
by: Fluffmeister
The very same GK110 GPUs that were installed and running in the Titan SC over a year ago. :P

I for one certainly hope AMD's latest and greatest GPU can mix it with year-old nV tech.
huh ? what?
Oh please dude.. :slap:
#16
The Von Matrices
I just don't understand the hype over some of the features AMD is touting when we have been down a similar path before.

For almost a decade (since the 8800 series) Nvidia has been trying to convince everyone that CUDA and PhysX are the greatest things on Earth. Since then it has only been a disappointment in the gaming world, with very few games supporting these features and their benefits being minimal at best. Now we have AMD touting Mantle and TrueAudio with no games to demonstrate them, limited performance data, and only the promise that support will come in the future. This is Nvidia circa the 8800 series, and I see no reason why this situation will turn out any differently.

Anyone who buys a graphics card (or any product) based upon what the manufacturer promises in the future is crazy. Buy a card based on what it supports now, not what the manufacturer claims it will support in the future.
#17
TheHunter
by: the54thvoid
Hmm, I like how people anthropomorphize inanimate objects. It won't make a graphics card cry. It will only make idiots cry. And it will only make idiots shout triumphantly as if owning a certain brand is like following a football team.

Anybody using terms like "This is gonna kick Titan's butt" or "Titan fall" or "this will make GK110 cry" has lost the plot. It's a piece of technology, not your brother in a schoolyard fight.

Folk need to just grow up.
I said it mostly as a joke, don't get so hung up about it; besides, it's true anyway ;)
#18
EpicShweetness
by: Hilux SSRG
That's concerning especially because the die size is huge, 424mm2 I think. AIB vendors will up it to 8 8 pins, no doubt.
Isn't Nvidia's own GK110 551 mm² in size? That's a bigger chip, then, and Nvidia has done some impressive power optimizations on that, so... just saying.
#19
crazyeyesreaper
Chief Broken Rig
This is what I expected; the 44 ROPs reported earlier was wrong. I expected 56 or 64, so this is a welcome addition. ROPs won't be a limiting factor, so now it will come down to efficiency and clock speeds. If these cards clock well, and the memory clocks well to drop latency a bit, these could be seriously good performers.
#20
shovenose
by: Pinktulips7
Hey hmm, this guy, what? :mad:
Oh, right, narrow-minded idiots don't typically see themselves as such.
I'm the first one to say NVIDIA ROCKS, AMD SUCKS, but you know what? That doesn't mean you can make unfounded bullshit accusations about a product that isn't even sold yet. I for one am interested to see if AMD's R9 290(X) can get them back in the game.
#21
crazyeyesreaper
Chief Broken Rig
290X = 2816 / 64 = 44 shaders per ROP
7970 = 2048 / 32 = 64 shaders per ROP
7870 = 1280 / 32 = 40 shaders per ROP

So in terms of the 7870, which when overclocked could take on a 7950 relatively easily, the 290X has a much better shader-to-ROP ratio, and with the increased bus width it should prove to be power hungry but well balanced in terms of shaders / ROPs / TMUs / bandwidth. The 290X looks to be shaping up nicely, but final performance numbers will bring us the truth of the matter. Diminishing returns will still be a problem.
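The shader-per-ROP arithmetic above can be checked directly with the figures quoted in this thread:

```python
# Shaders per ROP for the cards compared above (stream processors, ROPs)
cards = {
    "R9 290X": (2816, 64),
    "HD 7970": (2048, 32),
    "HD 7870": (1280, 32),
}
ratios = {name: sps / rops for name, (sps, rops) in cards.items()}
print(ratios)  # {'R9 290X': 44.0, 'HD 7970': 64.0, 'HD 7870': 40.0}
```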
#22
FX-GMC
by: Kaleid
It's a tiny fan, it's crap.
You would know, seeing as how you must own one to be so sure of that statement. Oh wait, you don't. Nice SPECULATION.

I can't wait to see how it benchmarks. I couldn't care less about the fan as long as it keeps the card within operating temps, with maybe a small OC.
#23
jigar2speed
by: Crap Daddy
It is. It works only on the GCN arch. Wait, it is still IN the works. We'll see again in two months' time what's with this in 15 games (Frostbite engine). As for the rest, we'll just have to wait more. Until then we can enjoy our non-Mantle games on any hardware we might have.
Here, you're corrected, and you're welcome btw :)
#24
jigar2speed
by: the54thvoid
Hmm, I like how people anthropomorphize inanimate objects. It won't make a graphics card cry. It will only make idiots cry. And it will only make idiots shout triumphantly as if owning a certain brand is like following a football team.

Anybody using terms like "This is gonna kick Titan's butt" or "Titan fall" or "this will make GK110 cry" has lost the plot. It's a piece of technology, not your brother in a schoolyard fight.

Folk need to just grow up.

Here's how it is. If the R9 290X truly trounces the performance of a GTX Titan, then that is a very good thing for everyone. As a person who puts money aside for tech products, I'll go and buy the AMD card (once water blocks are out). To me it's a win-win situation.

In BF4 at 2560x1440 with Ultra settings I get roughly 60fps, with drops to 50 and peaks at 70. That's how fast Titan is but if 290X gets 15% more (minimum 60fps) I'll get that card as well.

And in 'x' months Nvidia will release another card that will beat the performance of the 290X, and the idiots will come clamoring back saying '290 Xpired' and other such nonsense.

People - it's a graphics card. If you're getting in any way attached to it, please, go outside and feel the breeze, cuddle your partner or phone your dear old mum. There's more to life than triumphalism/defeatism about tech for goodness sake.
While I agree with what you say, that is the practical side of me.

Do you know why people come to TechPowerUp and sign up for the forum? 10% of people are here with their problems, 10% want to learn something new, but 80% are enthusiasts (I am talking about regulars).

What makes an enthusiast? He/she loves new technology, wants to own the very best tech stuff, and will invest in it. But there is a major side effect human beings have in them: a sense of ownership of a particular thing can also make them loyal to that particular product maker, and hence they will always prefer that vendor. But isn't that what enthusiasm is all about?

Nvidia and AMD wouldn't be fighting like cats and dogs for the high-end market if they didn't want enthusiasts to vouch for their high-end products. Remember, the money is in the mainstream market, but word of mouth from an enthusiast is still the greatest marketing tactic to date.

Anyway, back to Titan and AMD's R9 290X: while my practical side will never care which side wins, as it will force the other side to bring pricing down and make a better product for the next round, my enthusiast heart will always, and I mean always, be loyal to a particular product maker. Please note that doesn't guarantee I will buy their product, but I will take their side for sure.
#25
RejZoR
by: hardcore_gamer
That depends on how many devs adopt the API. I'm not a fan of proprietary APIs, including PhysX. This will further fragment PC gaming.
It won't fragment anything. If your card doesn't support Mantle, it will still run just the same using Direct3D (just a tiny bit slower, clearly). Feature-wise, I don't think there will be much difference, since the HW is of the same class. PhysX, on the other hand, without support either doesn't work at all or works piss-poorly even on a 4+ GHz quad core...

So in the end it's just a matter of developers taking time to use the Mantle API in their engine. And frankly, I'd prefer Mantle over anything. Extra performance is always nice, whereas I have yet to see anything useful from PhysX other than lame eye candy that still badly degrades performance even on fully supported HW...