Monday, March 16th 2015

More Radeon R9 390X Specs Leak: Close to 70% Faster than R9 290X

Earlier today, AMD reportedly showed its industry partners (likely add-in board partners) a presentation, which leaked to the web as photographs that look reasonably legitimate at first glance. If these numbers for AMD's upcoming flagship product, the Radeon R9 390X WCE (water-cooled edition), hold up, it could spell trouble for NVIDIA and its GeForce GTX TITAN X. To begin with, the slides confirm that the R9 390X will feature 4,096 stream processors based on a more refined version of the Graphics Core Next architecture, with the core ticking at speeds of up to 1,050 MHz. The R9 390X could sell in two variants: an air-cooled one with tamed speeds, and the WCE variant, which comes with an AIO liquid-cooling solution that lets it throw everything else out of the window in psychotic and murderous pursuit of performance.

It's the memory where AMD appears to be an early adopter (just as its HD 4870 was the first to run the faster GDDR5). The R9 390X features a 4096-bit wide HBM memory bus, holding up to 8 GB of memory, clocked at 1.25 GHz. The resulting memory bandwidth will end up much higher than that of the 5.00 GHz 512-bit GDDR5 setup on the R9 290X. Power connectors will be the same combination as the previous generation (6-pin + 8-pin). What does this all boil down to? A claimed single-precision floating-point performance figure of 8.6 TFLOP/s. One wonders how NVIDIA's GM200 compares to that. AMD claims that the R9 390X will be 50-60% faster than the R9 290X in benchmarks such as Battlefield 4 and Far Cry 4. Expectations for NVIDIA's upcoming product are only bound to get higher.
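As a sanity check, the headline figures fall out of simple arithmetic. This is a rough sketch, not AMD's math; it assumes the quoted 1.25 GHz HBM clock equates to an effective 1.25 Gbps per pin, and the usual 2 FLOPs (one fused multiply-add) per stream processor per clock:

```python
# Rough sanity-check of the leaked figures (not official AMD numbers).
# Assumption: the quoted 1.25 GHz HBM clock equals 1.25 Gbps effective per pin.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width times per-pin rate, over 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

def sp_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Single-precision TFLOP/s: 2 FLOPs per SP per clock (fused multiply-add)."""
    return stream_processors * 2 * clock_ghz / 1000

print(bandwidth_gb_s(4096, 1.25))  # R9 390X (leaked): 640.0 GB/s
print(bandwidth_gb_s(512, 5.00))   # R9 290X:          320.0 GB/s
print(sp_tflops(4096, 1.05))       # ~8.6 TFLOP/s, matching the claimed figure
```

By that math the leaked card would offer 640 GB/s, exactly double the R9 290X's 320 GB/s, and 4,096 stream processors at 1,050 MHz work out to the claimed 8.6 TFLOP/s.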
Source: VideoCardz

99 Comments on More Radeon R9 390X Specs Leak: Close to 70% Faster than R9 290X

#26
the54thvoid
Intoxicated Moderator
VinskaFor some reason, I don't see people b**ching about the price of TITAN-X nearly as much as people b**ch about the price of R9 390X, despite TITAN-X looking noticeably slower [on paper] and costing a LOT more.
Double standards FTW.
Wtf? Are you serious?
1) I'm not seeing much bitching about price.
2) Head over to most Titan threads for a lesson in price bitchery.

Really, there are no double standards here; if anything, less vocal ranting than in Titan threads. Go look. Don't make me go find quotes.
#27
HumanSmoke
the54thvoidReally, there are no double standards here; if anything, less vocal ranting than in Titan threads. Go look. Don't make me go find quotes.
Just listing the threads that are top heavy with Titan-branded pricing would take the best part of a day....and require some serious font size reduction just to fit them all on a page!
#28
arbiter
HumanSmokeJust listing the threads that are top heavy with Titan-branded pricing would take the best part of a Week....and require some serious font size reduction just to fit them all on a page!
Corrected that for ya.
#29
Naito
There be interesting times ahead, indeed.
#30
xvi
newtekie1"Up To 1,050MHz" That translates to "the card will overheat and throttle just like the 290x, but it will last long enough for benchmarks to finish so the reviews look better".
If it's available, shouldn't they use it? Yeah, it conveniently makes benchmarks look better and yes, AMD's marketing department will squeeze every last drop of PR they can out of it, but it also gives us as consumers the ability to get more out of our cards via aftermarket cooling instead of having them locked down from the get-go.
#31
Rahmat Sofyan
When this kind of news gets into the TPU.com newsroom, that's something really interesting...

and maybe true :).
#32
arbiter
xviIf it's available, shouldn't they use it? Yeah, it conveniently makes benchmarks look better and yes, AMD's marketing department will squeeze every last drop of PR they can out of it, but it also gives us as consumers the ability to get more out of our cards via aftermarket cooling instead of having them locked down from the get-go.
IMO that is a cop-out; it's a way to claim the card is faster than it will really run, by throwing on a crap cooler that can't reasonably cool the card. People gave NVIDIA hell for the 970 fiasco, and AMD doing this is just as bad, yet they get a pass.
#33
lastcalaveras
If the 390X is this quick and the 380X is just a rebrand of the 290X, I see a huge gap in the performance line-up.
#34
Rowsol
ZoneDymoThey really need to stop making graphs like that last one.
I know right, so deceptive.
#35
GhostRyder
arbiterIMO that is a cop-out; it's a way to claim the card is faster than it will really run, by throwing on a crap cooler that can't reasonably cool the card. People gave NVIDIA hell for the 970 fiasco, and AMD doing this is just as bad, yet they get a pass.
There is a difference: you can replace a cooler, but you cannot replace the built-in memory structure.
NaitoThere be interesting times ahead, indeed.
It sure is going to be something this round, that's for sure; the biggest jump in performance in a while.
arbiterYea they have been also playing standards game lately claiming stuff should be standard of open source. They want stuff for free cause they can't afford to make it themselves.
Not really; open source means it is more likely to be used than if it's closed source. If you keep something locked up where no one can use it without extreme royalties, people are not going to use it...
xviIf it's available, shouldn't they use it? Yeah, it conveniently makes benchmarks look better and yes, AMD's marketing department will squeeze every last drop of PR they can out of it, but it also gives us as consumers the ability to get more out of our cards via aftermarket cooling instead of having them locked down from the get-go.
Well, there was a problem with the way it was listed; however, I feel that after they patched the cooler performance, things evened out to being just fine. Mine, for example, are reference OC models and were all able to maintain their clocks under tests. But whatever, that's in the past, so I hope this round we will be fine from the get-go.
64KI'm sorry Vinska to tell you this but it's because AMD buyers want something for nothing and it's killing AMD. They want a top tier GPU but they don't want to pay for the research and engineering costs that made it possible in the first place.
http://www.msn.com/en-us/money/stockdetails/fi-126.1.AMD.NAS?symbol=AMD&form=PRFISB

AMD Net Profit Margin % -7.32
http://www.msn.com/en-us/money/stockdetails/fi-126.1.NVDA.NAS

Nvidia Net Profit Margin % +12.77
AMD has charged too little for their chips for too long and it may very well bankrupt them if they don't stop it.
AMD has to earn back the OEMs more than anything. They already have plenty of performance and have shown they can compete fine; the problem is more that fewer big-name OEMs use their parts, plus consumers automatically view the company with less on the shelves as the "off-brand" company.

Well, I'm just curious now about these 8 GB cards and the WC editions...
#36
Mysteoa
ZoneDymoThey really need to stop making graphs like that last one.
Deceptive for whom? For the one who can't read the Y scale? Nobody is going to buy the card based on that graph. It will only make you a little excited before you see the scale on the graphs.
#37
librin.so.1
arbiterYea they have been also playing standards game lately claiming stuff should be standard of open source. They want stuff for free cause they can't afford to make it themselves.
Keep in mind that
open source != free to write
A vast majority of "serious" open-source code is, in fact, written by paid[1] developers working on it full-time.
So making it open source doesn't mean they don't have to pay for it to be written (and AMD does hire quite a few developers to write FOSS code for them).

[1]and paid quite well at it, too.
#38
AsRock
TPU addict
XzibitPrice wise its going to be north of $550 GTX 980 base prices. A $200 premium will put it into $750. No word on a 390 so that could be a starting point for a 390 and the 390X & WCE filling up to 1k price tags.

I think it all depends on how aggressive Nvidia is with Titan X pricing.
NVIDIA, aggressive on their top-tier card? Erm, I'd be extremely surprised if that happened.
#39
xvi
arbiterIMO that is a cop-out; it's a way to claim the card is faster than it will really run, by throwing on a crap cooler that can't reasonably cool the card. People gave NVIDIA hell for the 970 fiasco, and AMD doing this is just as bad, yet they get a pass.
I agree that it skews benchmarks higher than what real-world would be and that's definitely not ideal, but there are cases where people will see those boost clocks most of the time even in real-world on inexpensive stock coolers (read: not the Bahamas).
The alternative here is to make the card run slow all the time even if the conditions are perfect for 24/7 boost clock speeds.

The solution is a practice that I believe most review websites do follow, which is to simply play through a game and record clock speeds and temps over time. A good review would test real-world clock speeds even after the card has heated up.
lastcalaverasIf the 390x is this quick and the 380x is just a rebrand of the 290x I see a huge gap in the performance line-up.
Don't forget the 390 non-X. We don't have specs for that yet. Even then, a performance gap this big isn't unheard of, is it?
#40
FordGT90Concept
"I go fast!1!11!1!"
NC37Been noticing a lot of GPUs relisting themselves as DX12 from 11.2 or 11.1. Now this is listing something else. Is it just assumed that DX11.1 or 2 are automatically DX12 chips?
No. DX12 is DX12. DX11.2 cards are not fully compatible with DX12. If you try to play a DX12 game on DX11.2 hardware, the features added in DX12 will not work.
VinskaFor some reason, I don't see people b**ching about the price of TITAN-X nearly as much as people b**ch about the price of R9 390X, despite TITAN-X looking noticeably slower [on paper] and costing a LOT more.
Double standards FTW.
I think it's because people looking at 390X are actually going to buy it. TITAN-X, not so much. Real pain versus imagined pain (on the wallet).


I do hope the WCE is reasonably priced. It's about a 4x step up from my 5870. $400-450 maybe, but I'm not about to blow $500+ on it.
#41
AsRock
TPU addict
390X w/WC under $500 is totally dreamland. I'm expecting at least $700, but I'm OK with what I've got now, so no rush here; to be honest, if I upgrade I'm after a die shrink at least, however good or bad this card is.
#42
FordGT90Concept
"I go fast!1!11!1!"
That's what I'm afraid of. I hope the non-WCE one isn't butchered (hopefully only slightly slower clocks) and is in my price range. I'm thinking I would be happy with 900+ MHz.
#43
manofthem
WCG-TPU Team All-Star!
The 390X is shaping up nicely; I'm looking forward to it if true. After this launch I'll be making my next GPU purchase, which depends on things yet to be seen.
#44
HumanSmoke
FordGT90ConceptNo. DX12 is DX12. DX11.2 cards are not fully compatible with DX12. If you try to play a DX12 game on DX11.2 hardware, the features added in DX12 will not work.
That probably depends upon the features used by developers. What was clarified a week ago was the feature levels (AFAIK DX12 does not have a hardware-level component, and feature-wise supports 11_0, 11_1, 12_0, and 12_1).


NVIDIA's Maxwell doesn't support Tier 3 resource binding, and AMD doesn't support conservative rasterization Tier 1, so it will depend upon whether these are implemented. Conservative rasterization most assuredly will be, but not all DX features are automatic for gaming (DX11.2 springs to mind).
FordGT90ConceptI think it's because people looking at 390X are actually going to buy it. TITAN-X, not so much. Real pain versus imagined pain (on the wallet).
Probably a safe assumption for the latter. Titan X will give an insight into what is to come. I suspect the real battle will be between the 390X and the 980 Ti. If the former lives up to today's hype, then that augurs well for the latter as also worthy of the wait.
#45
HM_Actua1
Is this 70% on the review models...?
#46
Space Lynx
Astronaut
newtekie1"Up To 1,050MHz"

That translates to "the card will overheat and throttle just like the 290x, but it will last long enough for benchmarks to finish so the reviews look better".
Most people don't use a fan curve. I use one, just the custom default from MSI AB, and my reference 290X has never throttled or gone past 70 Celsius... all about dat idle temp staying low and the curve.

Hatas gon hate. que taylor swift song bros
#47
arbiter
xviI agree that it skews benchmarks higher than what real-world would be and that's definitely not ideal, but there are cases where people will see those boost clocks most of the time even in real-world on inexpensive stock coolers (read: not the Bahamas).
The alternative here is to make the card run slow all the time even if the conditions are perfect for 24/7 boost clock speeds.

The solution is a practice that I believe most review websites do follow, which is to simply play through a game and record clock speeds and temps over time. A good review would test real-world clock speeds even after the card has heated up.
I understand things are never perfect in most people's systems, but in the case of the 290X, even open-air benches (which do help keep the video card cooler) didn't save it. A lot of reviews under stock fan profiles saw a 15-20% drop in clocks, to 800-850 MHz, after 5-10 minutes. AMD could have manned up and spent some money on a cooler that could at least keep the card near the 1,000 MHz they claimed. If it could barely keep over 800 MHz on an open-air bench during testing, that is terrible. Yeah, I know you can put an aftermarket cooler on them, but you shouldn't need to rely on that for the card to run at the speeds they claim.
#48
15th Warlock
That's gonna be one monster of a card!

Wonder if the AIO cooling solution will get in the way of multiple-card configs; I'm sure I won't be able to fit two separate 120 mm or 140 mm coolers on my current HAF-XB Radeon build...

What's weird is that one of those slides was leaked a couple of days ago, and ppl called it out as a fake because of a couple of typos:



Notice how "functionallity" was misspelled, and Direct X 12_Tier has a weird underscore.

Now in this new leak, the typos have been corrected; oh well, it might be that the slide was fixed between the time it was leaked and this new presentation... Kinda suspicious though :wtf:

Source: www.guru3d.com/news-story/amd-radeon-390x-wce-water-cooled-edition.html
#49
Para_Franck
Why don't they release it and take my money already, before I spend it on something else much more useless, like a new receiver/controller for my custom RC truck.
#50
newtekie1
Semi-Retired Folder
GhostRyderThere is a difference, you can replace a cooler but you cannot replace the built in memory structure.
And the advertised specs on the AMD card actually skewed benchmarks, which led to worse-than-expected performance for the consumer. I'm amazed that people are totally OK with AMD listing and advertising clock speeds that they know their cards can't maintain in stock form. However, when NVIDIA lists some incorrect specs on private spec sheets, specs that they don't advertise openly, everyone freaks out.
xviIf it's available, shouldn't they use it? Yeah, it conveniently makes benchmarks look better and yes, AMD's marketing department will squeeze every last drop of PR they can out of it, but it also gives us as consumers the ability to get more out of our cards via aftermarket cooling instead of having them locked down from the get-go.
xviThe alternative here is to make the card run slow all the time even if the conditions are perfect for 24/7 boost clock speeds.
Then do what NVIDIA does and pick an obtainable base clock that can be maintained constantly, and let the card boost if temperature allows. Give at least a guaranteed base clock that the card won't go below. But then again, advertising the core clock at 600 MHz would be an embarrassment...