
AMD GPU'14 Event Detailed, Announces Radeon R9 290X

I don't see any CF connector on the card??? :wtf:

It is there, at the very same spot it normally is... look closely, there's just no cutout where the CrossFire connector/adapter normally slides on, but the markings with the metal/copper conductors are there.
 
It is there, at the very same spot it normally is... look closely, there's just no cutout where the CrossFire connector/adapter normally slides on, but the markings with the metal/copper conductors are there.
I was kind of under the impression that the CrossFire bridge was incompatible with 4K resolution. Something about the maximum playfield/screen resolution that can be transferred being ~4 megapixels. I know the CrossFire bridge isn't overly friendly for bandwidth (~2.5 GT/s), which I assumed was why all the talk of the transfers being made over the PCI-E bus.

BTW: Here's a pretty clear picture of the Crossfire finger contacts you are talking about.
(image: big_radeon-r9-290x-4.jpg)


EDIT: Just found this at Tech Report.
We noted this fact way back in our six-way Eyefinity write-up: the card-to-card link over a CrossFire bridge can only transfer images up to four megapixels in size. Thus, a CrossFire team connected to multiple displays must pass data from the secondary card to the primary card over PCI Express. The method of compositing frames for Eyefinity is simply different. That's presumably why AMD's current frame-pacing driver can't work its magic on anything beyond a single, four-megapixel monitor.
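For anyone wondering how 4K stacks up against that limit, here's a quick back-of-the-envelope sketch; the exact cutoff is my assumption, taking "four megapixels" to mean a 2560x1600 frame:

```python
# Rough arithmetic behind the ~4-megapixel CrossFire bridge limit quoted above.
# The exact cutoff is assumed here to be a 2560x1600 frame (~4.1 MP).

def megapixels(width, height):
    return width * height / 1_000_000

BRIDGE_LIMIT_MP = megapixels(2560, 1600)  # the "four megapixel" frame

RESOLUTIONS = {
    "2560x1600 (30-inch panel)": (2560, 1600),
    "3840x2160 (4K/UHD)": (3840, 2160),
    "5760x1080 (triple 1080p Eyefinity)": (5760, 1080),
}

for name, (w, h) in RESOLUTIONS.items():
    mp = megapixels(w, h)
    verdict = "fits over the bridge" if mp <= BRIDGE_LIMIT_MP else "has to go over PCI-E"
    print(f"{name}: {mp:.1f} MP -> {verdict}")
```

So a single 4K panel is already roughly double what the bridge link is said to carry, which lines up with the frame-pacing limitation Tech Report mentions.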
 
Wow, 6 pages of comments :). I just wanted to say: I'm digging the looks. Really fancy right there. They remind me of one of my all-time favourite cards, the Asus ROG GTX 285 Matrix Edition. I wouldn't mind even more bling bling :) You know what they say: performance may win the fight, but style wins the crowd ;)
 
I kind of hoped for something at the $249 mark, but the R9 270X looks like a pretty solid card. 5500 in Firestrike "Standard" is around the same as a GTX 760, while costing less. If we're lucky, it will also overclock like a madman. Since BF4 is AMD-optimized, it's definitely a no-brainer for me.

Now we just have to wait for availability, hoping to grab one before BF4 releases.
 
I need EK, or preferably Swiftech, to come out with a block, and I'm selling my 780 as long as this is on par with it. I may end up keeping my 780 and giving it to my wife or brother, or just get selfish, keep it for myself, and put it in my HTPC case.
 
I don't think a proprietary API (Mantle) has much of a place in today's world. Have you ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware with ~1/3* of the market share, never mind what percentage of those cards are even capable of running it?
 
I don't think a proprietary API (Mantle) has much of a place in today's world. Have you ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware with ~1/3* of the market share, never mind what percentage of those cards are even capable of running it?

You are right, it doesn't. But is it worth spending some money to make Nvidia look bad in one AAA game? AMD says yes. Thing is, when the 290X is in the hands of reviewers, the Mantle-spiced BF4 will not be available yet.
 
So all we really got today was prices on all but the newest cards, no benchmarks, and a boatload of marketing BS.

I mean a big boatload. A shipload.

Really turned off by this. Audio? They spent most of the time talking about audio!?!?

Not what anybody wanted to hear.

I was thrilled to hear about it - it could really add to the immersion. Not that more info on the gfx side wouldn't have been nice too. :)
 
I don't think a proprietary API (Mantle) has much of a place in today's world. Have you ever wondered what mostly caused Glide's demise? On top of that, will game developers want to implement multiple versions of code, or code that can only run on a specific vendor's hardware with ~1/3* of the market share, never mind what percentage of those cards are even capable of running it?

Aren't all the consoles running AMD hardware? Will they be using Mantle on the consoles? If so, they would have their foot in the door. All console ports would use this.
 
Aren't all the consoles running AMD hardware? Will they be using Mantle on the consoles? If so, they would have their foot in the door.

The PS4 and XBO natively support Mantle through their AMD GCN-based graphics hardware. Both PCs and consoles will benefit from this new API. Thank you, AMD.
 
Excellent. They finally dropped the requirement for an active DisplayPort adapter for 'legacy' monitors in Eyefinity. (Presumably eliminating the screen tearing you get on one monitor from using an adapter, not to mention the adapters dying.) I suspect a 290 or 280X will be my next card. I'll wait for the reviews, but the healthy helpings of VRAM on both cards should be ideal for triple monitors. Plus, excellent compute performance never hurts.

And here I just installed my adapter last night on my 24" Asus to finally set up Eyefinity... *sigh*... Oh well, I won't be in the market for one until next year (closer to the Star Citizen release) anyway...
 
Pricing on all cards except the one everyone is interested in.

/Damp squib

I can assure you not everyone is interested in "just" that...

GPUs....audio technology....say wut ? 0_o

GPUs are used to farm virtual gold (coins)... or to hack your damn passwords and whatnot...
Surprised by first-hand audio integration on them... why?

Little birds say: 8,000 points in 3DMark Fire Strike. The GTX 780 does 8,500. :o

On an overclocked i7, yes. However, I'm willing to bet AMD stupidly does its in-house runs with something like an A10-6800 or, at best, an FX-8350.

Edit: Examples
-> Stock R7970 + stock FX-9590 -> ~5700 marks in FS
-> Stock R7970 + stock i7-3820 -> ~7100 marks in FS
 
On an overclocked i7, yes.

Nah, a stock 3960X or 4770K does 8,700 no problem... in fact, much closer to 9,000, to be honest. I have a 7970 and a GTX 780, plus the modern CPUs, plus the 3DMark Firestrike benches already done, and can link them easily enough.

But that's beside the point. We have no clock speed info, no shader count, and no real testing done. Those scores could change.

Anyway, to me "immersive" = VR, and I suppose AMD has its reasons for going with this, although the real use might not be immediately apparent. What does strike me as odd about the whole thing is that Microsoft seems to be one step ahead of AMD here, and that the butt-load of Win7 users who are not likely to change OS any time soon, plus some hardware restrictions, will make TrueAudio a difficult sell for some time yet.
 
I think the consoles are the ace in the hole here. If both the Xbox and the PS4 use this and Mantle, all the AAA console ports will have it too. To me it seems AMD is really leveraging their position in consoles to push their new tech.

TressFX was open to anyone. I think all of this new stuff will be too. Of course, it will run best on AMD hardware.
 
I don't think a proprietary API (Mantle) has much of a place in today's world.

It could get some attention if AMD manages to show at least a 20% performance increase in Crysis 3 using Mantle as opposed to running it via DirectX. Never hurts to become independent of Microsoft.

I hope AMD doesn't drop OpenGL support too...

I have a different question: how does one utilize the audio features? Does the graphics card communicate with a sound device, or what?
 
The CrossFire bridge isn't complete yet; the card isn't ready yet.

Hence the extended discussion about audio and nothing concrete about graphics capability.

Very disappointing pile of crap presented yesterday.
 
Nah, a stock 3960X or 4770K does 8,700 no problem... in fact, much closer to 9,000, to be honest. I have a 7970 and a GTX 780, plus the modern CPUs, plus the 3DMark Firestrike benches already done, and can link them easily enough.

But that's beside the point. We have no clock speed info, no shader count, and no real testing done. Those scores could change.

Anyway, to me "immersive" = VR, and I suppose AMD has its reasons for going with this, although the real use might not be immediately apparent. What does strike me as odd about the whole thing is that Microsoft seems to be one step ahead of AMD here, and that the butt-load of Win7 users who are not likely to change OS any time soon, plus some hardware restrictions, will make TrueAudio a difficult sell for some time yet.

My info is based on HWBot submissions, so no monkey business; also, I'm not talking about GTX 780 scores...
Point being, I wouldn't swear by it, but I'm pretty sure AMD's scores are based on an FX-8350 (FX-9590 at most) system (PCI-Express 2.0, as a reminder), which doesn't mean much when benching a lower-end card, but will hold back considerably faster cards, especially in synthetics.

It could get some attention if AMD manages to show at least a 20% performance increase in Crysis 3 using Mantle as opposed to running it via DirectX. Never hurts to become independent of Microsoft.

I hope AMD doesn't drop OpenGL support too...

I have a different question: how does one utilize the audio features? Does the graphics card communicate with a sound device, or what?

Drop OGL support as in drop Linux support? LOL, never.
Especially since there's a huge chance that Valve's Steam Machines/SteamOS will be based on AMD APUs.


Edit: I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.
 
I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.

Bloatware for those that don't like to (or worse, that don't know how to) configure their games themselves. There's my comment.
 
Edit: I'm surprised no one is commenting on AMD's response to nVidia's "GeForce Experience" or whatever it's called.

What's there to comment on anyway? AMD are the best of the best! At least that's what they told me during the livestream.

Well, it's a caddish data-mining operation for monitoring what programs users run (games at this stage, but there's certainly much more info to mine). I wonder what or who decides the optimal game settings (users may broadcast their settings back to AMD, but the most-used settings don't have to be the smoothest)... and AMD = copycat, as with the boost/turbo feature.
 
What's there to comment on anyway? AMD are the best of the best! At least that's what they told me during the livestream.

Well, it's a caddish data-mining operation for monitoring what programs users run (games at this stage, but there's certainly much more info to mine). I wonder what or who decides the optimal game settings (users may broadcast their settings back to AMD, but the most-used settings don't have to be the smoothest)... and AMD = copycat, as with the boost/turbo feature.

Only a fool of the Spinal Tap variety would believe Nvidia invented boost/turbo.

For years, all chips have had performance steps etched into their low-level microcode, and there is no difference at all between how that and boost works; the only change is the thermal/power monitoring. Err, no, actually they were doing that anyway too, just not as well.
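Roughly what that looks like, as a toy sketch; all the step values, limits, and names here are mine, just to illustrate the idea (real silicon does this in firmware with much finer granularity):

```python
# Toy illustration of the performance-step idea: pick the next clock step
# from load and thermal/power headroom. All numbers are made up.

P_STATES_MHZ = [300, 600, 900, 1000]   # assumed performance steps
TEMP_LIMIT_C = 80                       # assumed thermal limit
POWER_LIMIT_W = 250                     # assumed board power limit

def next_step(current, temp_c, power_w, load_pct):
    """Return the index of the next performance step."""
    # Back off if we're over the thermal or power budget.
    if temp_c > TEMP_LIMIT_C or power_w > POWER_LIMIT_W:
        return max(current - 1, 0)
    # Step up while the chip is loaded and headroom remains ("boost").
    if load_pct > 90 and current < len(P_STATES_MHZ) - 1:
        return current + 1
    # Idle down when there's little to do.
    if load_pct < 20:
        return max(current - 1, 0)
    return current

# Example: heavy load with headroom -> step up from 900 MHz to 1000 MHz.
print(P_STATES_MHZ[next_step(2, temp_c=70, power_w=180, load_pct=95)])
```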

My guitar amp goes to 11 too, btw; I Tipp-Exed it on, it's way better now.

Oh, and I understand it all, no worries. I get that they use it better now to auto-OC (:laugh::rockout::laugh:) when a game is not loading the card up as much in heat or load terms, and that's why I don't own a GK104-based card.
 
At least these will ram the prices of everything down.

Thank you, AMD.
 
Only a fool of the Spinal Tap variety would believe Nvidia invented boost/turbo.

For years, all chips have had performance steps etched into their low-level microcode, and there is no difference at all between how that and boost works; the only change is the thermal/power monitoring. Err, no, actually they were doing that anyway too, just not as well.

OK, I didn't notice until it became an advertised feature. In my defence, I usually only change graphics cards every 4-5 years...


At least these will ram the prices of everything down.
Thank you, AMD.

Would be nice if it rammed down the price of RAM too, somehow...
 