Wednesday, April 2nd 2014

Radeon R9 295X2 Press Deck Leaked

Here are some of the key slides from AMD's press deck (reviewer presentation) for the Radeon R9 295X2 dual-GPU graphics card, ahead of its April 8 launch. The slides confirm specifications that surfaced earlier this week: the card bears the codename "Vesuvius" and carries two 28 nm "Hawaii" GPUs with all 2,816 stream processors per chip enabled, alongside 176 TMUs, 64 ROPs, and a 512-bit wide GDDR5 memory interface per GPU. The two chips are wired to a PLX PEX8747 48-lane PCI-Express 3.0 bridge chip. There's a total of 8 GB of memory on board, 4 GB per GPU. Lastly, clock speeds are revealed: the GPUs run as high as 1018 MHz, and the memory at 5.00 GHz (GDDR5-effective), for a total memory bandwidth of 640 GB/s.
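That bandwidth figure follows directly from the bus width and data rate; here's a minimal sanity-check sketch (using decimal gigabytes, as the marketing figures do):

```python
# Back-of-the-envelope check of the quoted memory bandwidth.
bus_width_bits = 512         # per GPU
data_rate_hz = 5.00e9        # GDDR5-effective transfer rate

per_gpu_bw = (bus_width_bits / 8) * data_rate_hz   # bytes per second
card_bw = 2 * per_gpu_bw                           # two independent GPUs

print(f"Per GPU: {per_gpu_bw / 1e9:.0f} GB/s")     # 320 GB/s
print(f"Card total: {card_bw / 1e9:.0f} GB/s")     # 640 GB/s
```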

The Radeon R9 295X2 indeed looks like the card pictured earlier this week by members of the ChipHell tech community. It features an air+liquid hybrid cooling solution, much like the ASUS ROG ARES II, co-developed by AMD and Asetek. A pair of pump-blocks cools the GPUs, plumbed into a common coolant loop running through a single 120 mm radiator+reservoir unit with a 120 mm fan. A centrally located fan on the card ventilates heatsinks that cool the VRM, memory, and the PCIe bridge chip.
The card draws power from two 8-pin PCIe power connectors, and appears to use a 12-phase VRM to condition power, built from CPL-made chokes and International Rectifier DirectFETs. Display outputs include four mini-DisplayPort 1.2 connectors and one dual-link DVI (digital only). The total board power is rated at 500 W, so AMD is clearly drawing more than the 150 W specification from each of the two 8-pin connectors, and a PSU with strong +12V rails is required to drive them. Looking at these numbers, we'd recommend at least an 800 W PSU for a single-card system, ideally with a single +12V rail design. The card is 30.7 cm long, and its coolant tubes exit from its top. AMD expects the R9 295X2 to be at least 60 percent faster than the R9 290X in 3DMark Fire Strike (Performance).
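For context, here's a rough power-budget sketch against the nominal PCI-SIG limits; the even split between connectors, and the slot staying within its 75 W limit, are assumptions, since AMD doesn't publish per-connector figures:

```python
# Nominal PCI-SIG power limits vs. the R9 295X2's rated board power.
SLOT_W = 75          # PCIe x16 slot, spec maximum
EIGHT_PIN_W = 150    # per 8-pin connector, spec maximum

spec_ceiling = SLOT_W + 2 * EIGHT_PIN_W       # 375 W nominal
board_power = 500                             # AMD's rated TBP

# If the slot stays within its 75 W limit, the connectors carry the rest.
per_connector = (board_power - SLOT_W) / 2    # ~212 W per 8-pin

print(f"Spec ceiling: {spec_ceiling} W, rated TBP: {board_power} W")
print(f"Assumed draw per 8-pin: {per_connector:.0f} W (vs. 150 W spec)")
```

At roughly 212 W per connector, the card exceeds the 8-pin specification by about 40 percent, which is why the PSU's per-rail current delivery matters as much as its total wattage.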
Source: VideoCardz

78 Comments on Radeon R9 295X2 Press Deck Leaked

#51
Xzibit
Why not pull up the one where you didn't know where the PSU was in a server.

Strange how you only express negative views in AMD threads.
HumanSmoke: I don't like any dual GPU card on principle - not just this one.
Duallies are usually more problematic, overclock worse than two single cards, have more driver issues, lower resale value, and generally aren't a saving over two separate cards.
Hypocrite much?

I'm flattered though really. Never had a Hobbit pay so much attention to me.
#52
arbiter
So most reviews say the 290(X) uses around 300 W, but AMD is claiming 250 W? Even if the PCIe connectors can nominally give 300 W, it's probably not the best idea for AMD to rely on that to power this card, and I doubt the "up to" clock will be constant. That's the other thing that sucks about AMD: they tell you you'll get "up to" this performance, so you'll most likely get less. At least NVIDIA says you get at least X, plus whatever the card can do after that.

With that said, I think the Titan Z was just NVIDIA firing a shot across AMD's bow to get them to respond, and they did. Probably won't be long till we see NVIDIA fire back with something.
#53
HumanSmoke
Xzibit: Why not pull up the one where you didn't know where the PSU was in a server.
Nah, that's bullshit :shadedshu:
Xzibit: Strange how you only express negative views in AMD threads.
Obviously, since I'd prefer two 290X's to a single 295X2 :shadedshu:
Xzibit: Hypocrite much?
The quote is actually consistent with what I've been saying. Are you sure you understand what the word hypocrite means?
Xzibit: I'm flattered though really. Never had a Hobbit pay so much attention to me.
Well done. You are here... where you've always been.
arbiter: So most reviews say the 290(X) uses around 300 W, but AMD is claiming 250 W? Even if the PCIe connectors can nominally give 300 W, it's probably not the best idea for AMD to rely on that, and I doubt the "up to" clock will be constant.
Well, the board is specced for a nominal 375-watt delivery (2 x 8-pin plus the slot), but it will surely pull more than 150 watts per PCIe 8-pin - as is quite common at the high end.
Even if the board only clocks to its maximum 1018 MHz, I'd think that any overclocking will pull significantly more current than the PCIe specification allows, so any prospective owner might want to check their PSU's power delivery per rail (real or virtual). I really couldn't see this card getting to stable 290X overclocks.
#54
alwayssts
buildzoid: The VRM on this seems to be a doubled-up and more compact version of the R9 290X VRM, so the GPUs can easily get fed 375 A each, for a 750 A total current allowance. Just don't expect the PCIe 8-pins (30-40 A) to be able to carry that much power (750 A @ 1.5 V = 1125 W = 93 A @ 12 V), so if you do plan to use all of the VRM's capability, you should solder on one or two more 6-pin connectors or risk burning something. So VRM-wise you're good, but the dual 8-pins are insufficient. Most quad R9 290X OC attempts I've seen used two 1600 W PSUs, so this card OCed plus an OCed Intel hexa-core/AMD octa-core will need a 1200 W+ PSU.
Each wire is rated for 13 A, just derated for the PCI-SIG spec. Three are active +12 V conductors in both the 6-pin and 8-pin (the 8-pin adds extra grounds). 13 A × 6 @ 12 V = 936 W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card close to 1 kW.
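A quick sketch of that per-wire arithmetic, taking the cited 13 A rating at face value (as noted further down the thread, 13 A corresponds to 16 AWG wire, and the connector housing rather than the wire is the practical limit):

```python
# Theoretical ceiling of two 8-pin connectors at the cited wire rating.
AMPS_PER_WIRE = 13      # ~16 AWG; 18 AWG would be ~10 A
LIVE_WIRES = 3          # +12 V conductors per 6- or 8-pin connector
VOLTS = 12

per_connector = AMPS_PER_WIRE * LIVE_WIRES * VOLTS   # 468 W per 8-pin
two_connectors = 2 * per_connector                   # 936 W total

print(f"{per_connector} W per connector, {two_connectors} W for two")
```

The PCI-SIG rating of 150 W per 8-pin is thus roughly a third of what the copper itself could carry, which is the 2-3x over-engineering factor discussed below.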
#55
sweet
alwayssts: Each wire is rated for 13 A, just derated for the PCI-SIG spec. Three are active in both the 6-pin and 8-pin (the 8-pin adds extra grounds). 13 A × 6 @ 12 V = 936 W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card close to 1 kW.
Finally, a proper explanation.
An 8+8-pin card can pull more than just 375 W, as the 7990 proved. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30 A rail.
By the way, the 780 Ti is a 6+8-pin card, and it can pull a lot more than just 300 W.
#56
HumanSmoke
alwayssts: I have no idea if the slot can go out of spec.
It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially those that overclocked) showed power draw in excess of 75 watts.

[Source]
#57
RCoon
SIGSEGV: How much salary do you get from NVIDIA?
Clearly not enough, if you saw the car I drive. I'd also demand a free, better GPU as a perk of the job. They can keep their Shield for all I care.

Is it too much to ask to have a civilised thread without everyone sharpening their pitchforks and getting all potty-mouthed?
I think I preferred it when the worst thing said in one of these threads was "that card looks ugly". I'd almost welcome Jorge at this point (this is a filthy lie).
#58
alwayssts
sweet: Finally, a proper explanation.
An 8+8-pin card can pull more than just 375 W, as the 7990 proved. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30 A rail.
By the way, the 780 Ti is a 6+8-pin card, and it can pull a lot more than just 300 W.
Right. There are actually many cards that do it. Before the days of software TDP limits (so AMD/NVIDIA could upsell you PowerTune and the like), many people volt-modded 4850s way out of spec. The same game was played in reverse when NVIDIA's 500 series was essentially clocked to the very edge of the PCIe spec for its plugs, and overclocking brought them substantially over. The fact of the matter is that while you could say PCIe plugs are an evolution of the old "Molex" connector guidelines, which is all well and good, they are over-engineered by a factor of around 2-3. This card just seems to bring that down to around 2.
HumanSmoke: It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially those that overclocked) showed power draw in excess of 75 watts.
Very interesting! Thanks for this. So it's a factor of around 2 then, unless the card was simply limited to that power draw (to stay in a 75+75 W envelope) and the slot's real ceiling is even higher.

Well then, I guess you could take 936 W plus at least 141 W from the slot. When you factor in the VRM rated at (taking his word for it) 1125 W, it gives an idea of what the card was built to withstand (which is insane). It seems engineered for at least 2x spec, which sounds well within reason for anything anybody is realistically going to do with it. I doubt many people have a power supply that could even pull that with a (probably overclocked) system under load, even with an ideal cooling solution.
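Pulling those thread figures together in one place (all three inputs are forum estimates taken at face value, not measurements):

```python
# Rough "what was this built to withstand" tally from the thread's numbers.
connector_ceiling_w = 936   # two 8-pins at 13 A per wire (alwayssts, above)
slot_observed_w = 141       # highest slot-only draw HumanSmoke cited
vrm_rating_w = 1125         # buildzoid's VRM estimate, taken at his word

deliverable_w = connector_ceiling_w + slot_observed_w   # 1077 W
print(f"Theoretical delivery: {deliverable_w} W vs. VRM rating: {vrm_rating_w} W")
```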

Waiting for the good ol' XS/Coolaler gentlemen to hook one up to a bare-bones, cherry-picked system and prove me wrong (while busting some records). That seems to be pretty much what it was built to do.

On a weird side note, the clocks kind of reveal something interesting inherent to the design. First, 1018 MHz is probably where 5 GHz memory runs out of bandwidth to feed the thing. Second, 1018 MHz seems like where 28 nm would run at 1.05 V, as it's in tune with the binning of past products (like 1.218 V for the 1200 MHz 7870, or 1.175 V for the 1150 MHz 7970). It's surely a common voltage/power scaling threshold on all processes, but in this case it sits halfway between the 0.9 V spec and the 1.2 V where 28 nm scaling seems to typically end.

Surely they aren't running the GPUs at 1.05 V and using 1.35 V 5 GHz RAM, but it's interesting that they *could*, and probably conserve a bunch of power, if not even drop down to 0.9 V and a slower memory speed. If I were them, I would bring back the UBER AWSUM RAD TOOBULAR switch, toggling between 1.05 V/1.35 V at said clocks and 1.2 V/1.5-1.55 V (because 1.35 V 5 GHz RAM is 7 GHz RAM binned by Hynix/Samsung at that spec). That would actually be quite useful both for the typical person who spends $1000 on a video card (like we all do from time to time) and for those who want the true most out of it.
#59
xorbe
I don't believe that chart up there claiming a 750 Ti pulls 141 W peak - try again with 100 ms divisions on the measuring hardware.
#60
Bytales
radrok: Would've loved to see it as a 3x8-pin, just in case someone wants to abuse it for overclocking. 2x8 doesn't leave much room for power :(
They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
They were more practical in their design. If you took a look at the PCB, you would understand why they went with the dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.

What I would like to see is how ASUS would design its ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.
#61
Bytales
HumanSmoke: ...
The reality is that people go where the performance is at this level of expenditure - and performance (and price) resides with combinations of single GPU boards, not some PR grab for the halo which might (or might not) give midrange card buyers a chubby.

You really sound like someone who has never bought enthusiast-grade hardware.
The reality is I am interested in the 295X2 because it allows me to use four GPUs in only four slots' worth of space, occupying just two physical slots on the motherboard. The way it's designed (the outputs on the back are on a single row), a custom-made waterblock would allow a 295X2 to be made single-slot, widening my possibilities and winning me another slot on the motherboard.

That's why I would personally be interested in this card.
#62
radrok
Bytales: They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
They were more practical in their design. If you took a look at the PCB, you would understand why they went with the dual 8-pin power connectors: there isn't room for a third, and the card is long as it is.

What I would like to see is how ASUS would design its ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.
I expect overkill on these kinds of GPUs, nothing less.
#63
Blín D'ñero
RCoon: I think, and I really want to believe, the 295X2 will beat the Titan Z. The Titan Z will be voltage-locked, and it will be bottlenecked by the air cooler. The AMD, however, will have a much higher thermal tolerance before throttling (I hope), and will possibly have higher voltage headroom.

All that being said, the Titan Z is not designed for gaming, for the 50th time; it's a cheap DP compute card for home servers. This needs to be stapled to every forum header, title, post, thread, the whole works, until everyone on the internet understands it.
Isn't it? Then why don't you post your comment here (@ geforce.com), where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet," "a serious card built for serious gamers," etcetera?
#64
RCoon
Blín D'ñero: Isn't it? Then why don't you post your comment here (@ geforce.com), where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet," "a serious card built for serious gamers," etcetera?
Gaming is not and has never been a Titan's primary purpose (note: a Titan without DP compute is essentially a 780 Ti). Nobody reads keynotes, nobody watches release events, nobody pays attention to anything; they just blurt out what they think before doing research. I'm sick and tired of saying the same thing over and over again.
Yes, the Titan Z is viable for gaming, and is probably very good at it. That is not its primary purpose. Please research.
Before anybody says anything, I would never buy a Titan. This isn't "lol you must work for NVIDIA"; this is common sense, because I actually watched the release event where its primary purpose was specifically outlined.
Cheap DP compute servers. For the millionth time.

"The GTX Titan Z packs two Kepler-class GK110 GPUs with 12GB of memory and a whopping 5,760 CUDA cores, making it a "supercomputer you can fit under your desk," according to Nvidia CEO Jen-Hsun Huang"

Anybody who buys a Titan Z for gaming probably needs to rethink their life, and apply for the Darwin Award.

In other news, I heard this is an AMD thread regarding the 295X2?
#65
pr0n Inspector
alwayssts: Each wire is rated for 13 A, just derated for the PCI-SIG spec. Three are active in both the 6-pin and 8-pin (the 8-pin adds extra grounds). 13 A × 6 @ 12 V = 936 W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a double 8-pin card close to 1 kW.
16 AWG is rated for 13 A; 18 AWG for 10 A.
But the bottleneck is the connector, not the conductor.
#67
Slizzo
Yeah, that design makes sense. While it does suck that it's not using a full-cover waterblock, I can see how a rush to market would mean they didn't want to wait for a full-coverage block to be designed, and chose to go this route.
#68
radrok
Oh my god I didn't notice, you can make this thing single slot with a waterblock.

Oh my god.
#71
Blín D'ñero
nem: CHIPHELL did it again, here is the reality

[...]
Why does your post contain a hyperlink to itself, instead of a link to the source, Chiphell?
#72
radrok
Can't wait to see the power consumption figures; they should be as high as its performance.
#73
MxPhenom 216
ASIC Engineer
Xzibit: The 295X2 just has to have better price/performance than the Titan Z/790 = win.

We should know more when the 295X2 launches on the 8th.

Anyone know when the Titan Z will launch?
As far as I'm concerned, the Titan Z is not the 790.
#74
radrok
MxPhenom 216: As far as I'm concerned, the Titan Z is not the 790.
Agreed; there is literally no point in marketing the Titan Z for gaming.

NVIDIA should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and such. 6 GB per GPU for less than 1399 USD and it would have a winner.
#75
HumanSmoke
radrok: NVIDIA should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and such. 6 GB per GPU for less than 1399 USD and it would have a winner.
The sites that leaked the advance info regarding the Titan Z seem convinced that a 780/780 Ti-based GTX 790 dual card was incoming, and that it would be distinct from the Titan Z. My guess is that NVIDIA will play out the same scenario as the original GeForce GK110 launch: snag the price-no-object crowd with the Titan, then, once the initial sales buzz dies down, launch a more price-friendly 700-series card that AIBs would have more input into.
Likely, NVIDIA could tune the board frequencies for performance once the 295X2 arrives... although it wouldn't surprise me for AMD to counter with a lower-cost, air-cooled dual 290 (non-X) either.

Haven't owned a dual board since a GeForce 6600 GT duallie (early adopter-itis), and I don't see any of these offerings convincing me to return to a dual card.
If all these SKUs come to pass, it seems like both camps are treading water until 20 nm/16FF comes onstream, and the more boards they launch, the less likely it seems that we'll see an early appearance of the new process.