
Radeon R9 295X2 Press Deck Leaked

Discussion in 'News' started by btarunr, Apr 3, 2014.

  1. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,804 (1.42/day)
    Thanks Received:
    731
    Yes. I think the 295X2 is a waste of time for the enthusiast.
    Why? Even AMD's own propaganda pegs the 295X2 at 60% better than a single card, at a 154% higher price.
    Not really. People aren't going to buy the 295X2 for CUDA development, or for Octane, or for any number of CUDA-accelerated productivity apps (of no small importance given the hit-and-miss state of OpenCL support)... and as a gaming head-to-head, who cares, when two single cards are faster and cheaper?
    :D I've got a long way to go before I reach the lack of understanding and trolling you exhibit in the Nvidia threads, Xzibit :clap:
     
  2. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,298 (1.26/day)
    Thanks Received:
    381
    Why not pull up the one where you didn't know where the PSU was in a server?

    Strange how you only express negative views in AMD threads.

    Hypocrite much?

    I'm flattered though really. Never had a Hobbit pay so much attention to me.
     
  3. arbiter

    Joined:
    Jun 13, 2012
    Messages:
    308 (0.31/day)
    Thanks Received:
    50
    So most reviews say the 290(X) uses around 300 watts, but AMD is claiming 250 watts? Even if PCI-E can give 300 watts, it's probably not the best idea for AMD to rely on that to power this card, and I doubt the "up to" clock will be constant. That's the other thing that sucks about AMD saying you will get "up to" this performance: you'll most likely get less. At least Nvidia says you'll get at least X, plus whatever the card can do after that.

    With that said, I think the Titan Z was just Nvidia putting a shot across AMD's nose to get them to respond, and they did. It probably won't be long until we see Nvidia fire back with something.
     
  4. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,804 (1.42/day)
    Thanks Received:
    731
    Nah, that's bullshit :shadedshu:
    Obviously, since I'd prefer two 290X's to a single 295X2 :shadedshu:
    The quote is actually consistent with what I've been saying. Are you sure you understand what the word hypocrite means?
    Well done. You are here... where you've always been.
    Well, the board is specced for a nominal 375 watts of delivery (2 x 8-pin plus the 75 W slot), but it will surely pull more than 150 watts per PCI-E 8-pin, as is quite common at the high end.
    Even if the board only clocks to its 1018 MHz maximum, I'd think that any overclocking will pull significantly more current than the PCI-E specification allows, so any prospective owner might want to check their PSU's power delivery per rail (real or virtual). I really can't see this card getting to stable 290X overclocks.
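    A minimal sketch of that budget arithmetic, assuming the PCI-SIG limits of 75 W from the x16 slot and 150 W per 8-pin auxiliary plug (figures from the spec, not measurements of this card):

    ```python
    # In-spec PCI-E power budget for a card with n 8-pin aux connectors.
    SLOT_W = 75          # x16 slot, PCI-SIG limit
    EIGHT_PIN_W = 150    # per 8-pin aux plug, PCI-SIG limit

    def spec_budget(n_eight_pin):
        """Nominal in-spec board power in watts."""
        return SLOT_W + n_eight_pin * EIGHT_PIN_W

    print(spec_budget(2))  # 375 -- the "nominal 375 watts" quoted above
    ```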
     
    Last edited: Apr 4, 2014
  5. alwayssts

    alwayssts

    Joined:
    May 13, 2008
    Messages:
    429 (0.17/day)
    Thanks Received:
    103
    Each wire is rated for 13 A; it's just derated for the PCI-SIG spec. Three are live in both the 6-pin and the 8-pin (the 8-pin has extra grounds). 13 A x 6 wires @ 12 V = 936 W. I have no idea if the slot can go out of spec, probably not, but still... you could (theoretically) feed a dual 8-pin card close to 1 kW.
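    The same ceiling worked through in a short sketch; the 13 A per wire and three live 12 V wires per plug are the post's figures, not official ratings:

    ```python
    # Theoretical wattage the wiring alone could carry, spec ignored.
    AMPS_PER_WIRE = 13        # claimed 16 AWG rating
    LIVE_WIRES_PER_PLUG = 3   # 12 V conductors per 6- or 8-pin plug
    VOLTS = 12

    def wire_ceiling(n_plugs):
        return AMPS_PER_WIRE * LIVE_WIRES_PER_PLUG * n_plugs * VOLTS

    print(wire_ceiling(2))  # 936 -- roughly 1 kW before the slot's share
    ```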
     
  6. sweet

    Joined:
    Oct 1, 2013
    Messages:
    137 (0.27/day)
    Thanks Received:
    33
    Finally a proper explanation.
    An 8+8 card can pull more than just 375 W, as the 7990 proved. Just make sure your PSU is single-rail, or that each 8-pin is on a separate, healthy 30 A rail.
    By the way, the 780 Ti is a 6+8 card, and it can pull a lot more than just 300 W.
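    A rough per-rail sanity check along those lines; the 500 W board power below is a hypothetical figure for illustration, not a measured draw:

    ```python
    # 12 V current each aux plug must supply, assuming the slot delivers
    # its full 75 W and the remainder splits evenly across the plugs.
    SLOT_W = 75
    VOLTS = 12

    def amps_per_plug(board_watts, n_plugs=2):
        return (board_watts - SLOT_W) / n_plugs / VOLTS

    print(round(amps_per_plug(500), 1))  # 17.7 -- comfortable on a 30 A rail
    ```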
     
  7. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,804 (1.42/day)
    Thanks Received:
    731
    It can, but it varies with motherboard design. A prime example would be the recent GTX 750/750 Ti reviews that used the reference (no aux power) design for testing. More than a few results (especially those that overclocked) yielded a power draw in excess of 75 watts.
    [chart: GTX 750 Ti power draw, peaking at 141 W, well above the 75 W slot limit]
    [Source]
     
    alwayssts says thanks.
  8. RCoon

    RCoon Gaming Moderator Staff Member

    Joined:
    Apr 19, 2012
    Messages:
    8,518 (8.14/day)
    Thanks Received:
    4,661
    Location:
    Gypsyland, UK
    Clearly not enough if you saw the car I drive. I'd also demand a free, better GPU as a perk of the job. They can keep their Shield for all I care.

    Is it too much to ask to have a civilised thread without everyone sharpening their pitchforks and getting all potty-mouthed?
    I think I preferred it when the worst thing said in one of these threads was "that card looks ugly". I'd almost welcome Jorge at this point (this is a filthy lie).
     
    james888 and radrok say thanks.
  9. alwayssts

    alwayssts

    Joined:
    May 13, 2008
    Messages:
    429 (0.17/day)
    Thanks Received:
    103
    Right. There are actually many cards that do it. Before the days of software to limit TDP (so AMD/Nvidia could upsell you PowerTune and the like), many people volt-modded 4850s way out of spec. The same game was played in reverse when Nvidia's 500 series was essentially clocked to the bleeding edge of the PCI-E spec for its plugs, and overclocking brought them substantially over. The fact of the matter is that while you could say PCI-E plugs are an evolution of the old 'molex' connector guidelines, which is all well and good, they are over-engineered by a factor of around 2-3. This card just seems to bring that down to around 2 (see the sketch below).
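    A quick sketch putting a number on that over-engineering claim, reusing the 16 AWG wiring figures quoted earlier in the thread:

    ```python
    # Conductor capacity of one 8-pin plug vs. its 150 W spec rating.
    SPEC_8PIN_W = 150
    WIRE_8PIN_W = 13 * 3 * 12   # 468 W: 13 A x 3 live wires x 12 V

    print(WIRE_8PIN_W / SPEC_8PIN_W)  # 3.12 -- about 3x headroom per plug
    ```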


    Very interesting! Thanks for this. So it's a factor of around 2 then, unless the card is simply limited to that power draw (to stay in the 75+75 W spec) and the real headroom is even higher.

    Well then, I guess you could take 936 W plus at least 141 W then. When you factor in the VRM rated at (taking his word for it) 1125 W, it gives an idea of what the card was built to withstand (which is insane). It seems engineered for at least 2x spec, which sounds well within reason for anything anybody is realistically going to be able to do with it. I doubt many people have a power supply that could even pull that with a (probably overclocked) system under load, even with an ideal cooling solution.

    Waiting for the good ol' XS/coolaler gentlemen to hook one up to a bare-bones, cherry-picked system and prove me wrong (while busting some records). That seems to be pretty much what it was built to do.

    On a weird side note, the clocks kind of reveal something interesting inherent to the design. First, 1018 MHz is probably where the 5 GHz memory runs out of bandwidth to feed the thing. Second, 1018 MHz seems like where 28nm would run at 1.05 V, as it's in tune with the binning of past products (like 1.218 V for the 1200 MHz 7870, or 1.175 V for the 1150 MHz 7970). It's surely a common voltage/power scaling threshold on all processes, but in this case halfway between the 0.9 V spec and the 1.2 V where 28nm scaling seems to typically end. Surely they aren't running them at 1.05 V and using 1.35 V 5 GHz RAM, but it's interesting that they *could*, and probably conserve a bunch of power, if not even drop down to 0.9 V and a slower memory speed. If I were them I would bring back the UBER AWSUM RAD TOOBULAR switch that toggled between 1.05 V/1.35 V for said clocks and 1.2 V/1.5-1.55 V (because 1.35 V 5 GHz RAM is 7 GHz RAM binned by Hynix/Samsung at that spec). That would actually be quite useful, both for the typical person that spends $1000 on a video card (like we all do from time to time) and for those that want the true most out of it.
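    For the power-saving intuition, a first-order sketch using the usual dynamic-power rule of thumb (P scales with f x V^2); the 1.05 V and 1.2 V points are the post's estimates, not confirmed bins:

    ```python
    # Relative dynamic power at a new frequency/voltage operating point.
    def relative_power(v_new, v_old, f_new=1018, f_old=1018):
        return (f_new / f_old) * (v_new / v_old) ** 2

    # 1018 MHz at 1.05 V vs. the same clock at 1.2 V:
    print(round(relative_power(1.05, 1.2), 2))  # 0.77 -> roughly 23% less
    print(round(relative_power(0.9, 1.2), 2))   # 0.56 at the 0.9 V floor
    ```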
     
    Last edited: Apr 4, 2014
  10. xorbe

    Joined:
    Feb 14, 2012
    Messages:
    528 (0.48/day)
    Thanks Received:
    90
    Location:
    Bay Area, CA
    I don't believe that chart up there that claims a 750 Ti is pulling 141 W peak -- try again with 100 ms divisions on the measuring hardware.
     
  11. Bytales

    Joined:
    Oct 15, 2010
    Messages:
    45 (0.03/day)
    Thanks Received:
    3
    They don't design something just because someone would love to see it, or because someone else would supposedly like to abuse it.
    They were more practical in their design. If you take a look at the PCB, you'll understand why they went with the dual 8-pin power connectors:
    there isn't room for a third, and the card is as long as it is.

    What I would like to see is how ASUS would design their ARES III. They would probably use a wider PCB and triple 8-pin power plugs. Even so, I don't think there would be room for 8 GB per GPU.
     
  12. Bytales

    Joined:
    Oct 15, 2010
    Messages:
    45 (0.03/day)
    Thanks Received:
    3
    The reality is I am interested in the 295X2 because it allows me to use 4 GPUs in only 4 PCI slots' worth of space, and ultimately only 2 physical slots on the motherboard. The way it's designed (the outputs on the back are on a single row), a custom-made waterblock would allow a single 295X2 to be made single-slot, thus widening my possibilities and winning me another slot on the motherboard.

    That's why I would personally be interested in this card.
     
  13. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,992 (2.45/day)
    Thanks Received:
    803
    Location:
    Italy
    I expect overkill on these kinds of GPUs, nothing less.
     
  14. Blín D'ñero

    Blín D'ñero

    Joined:
    Feb 21, 2009
    Messages:
    248 (0.11/day)
    Thanks Received:
    50
    Location:
    Netherlands
    Isn't it? Then why don't you post your comment here (@ geforce.com), where they say it "is a gaming monster, built to power the most extreme gaming rigs on the planet", "a serious card built for serious gamers"... etcetera.
     
  15. RCoon

    RCoon Gaming Moderator Staff Member

    Joined:
    Apr 19, 2012
    Messages:
    8,518 (8.14/day)
    Thanks Received:
    4,661
    Location:
    Gypsyland, UK
    Gaming is not, and has never been, a Titan's primary purpose (read: a Titan without DP compute is AKA the 780 Ti). Nobody reads keynotes, nobody watches release events, nobody pays attention to anything; people just blurt out what they think before doing research. I'm sick and tired of saying the same thing over and over again.
    Yes, the Titan Z is viable for gaming, and is probably very good at it. That is not its primary purpose. Please research.
    Before anybody says anything, I would never buy a Titan. This isn't "lol you must work for Nvidia"; this is common sense, because I actually watched the release event where its primary purpose was specifically outlined.
    Cheap DP compute servers. For the millionth time.

    "The GTX Titan Z packs two Kepler-class GK110 GPUs with 12GB of memory and a whopping 5,760 CUDA cores, making it a "supercomputer you can fit under your desk," according to Nvidia CEO Jen-Hsun Huang"

    Anybody who buys a Titan Z for gaming probably needs to rethink their life, and apply for a Darwin Award.

    In other news, I heard this is an AMD thread regarding the 295X2?
     
    Last edited: Apr 4, 2014
  16. pr0n Inspector

    pr0n Inspector

    Joined:
    Dec 8, 2008
    Messages:
    1,334 (0.59/day)
    Thanks Received:
    164
    16 AWG is rated for 13 A, 18 AWG for 10 A.
    But the bottleneck is the connector, not the conductor.
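    Those ampacity figures as a small lookup, with the resulting conductor-side ceiling per plug; as noted above, the practical limit sits at the connector pins rather than the wire:

    ```python
    # Quoted ampacity per gauge (actual ratings depend on insulation
    # temperature and bundling).
    AWG_AMPS = {16: 13, 18: 10}

    def conductor_watts(awg, live_wires=3, volts=12):
        """Conductor-side ceiling for one 6/8-pin plug at 12 V."""
        return AWG_AMPS[awg] * live_wires * volts

    print(conductor_watts(16))  # 468 W of wire capacity per plug
    print(conductor_watts(18))  # 360 W -- still far above the 150 W spec
    ```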
     
  17. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,298 (1.26/day)
    Thanks Received:
    381
    New images

     
    Last edited: Apr 4, 2014
    Suka, HammerON, erocker and 2 others say thanks.
  18. Slizzo

    Slizzo

    Joined:
    Aug 2, 2011
    Messages:
    388 (0.30/day)
    Thanks Received:
    76
    Yeah, that design makes sense. While it does suck that it's not using a full-cover waterblock, I can see how a rush to market would mean they didn't want to wait for a full-coverage block to be designed, and chose to go this route.
     
  19. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,992 (2.45/day)
    Thanks Received:
    803
    Location:
    Italy
    Oh my god, I didn't notice: you can make this thing single-slot with a waterblock.

    Oh my god.
     
    RCoon says thanks.
  20. nem

    nem

    Joined:
    Oct 22, 2013
    Messages:
    49 (0.10/day)
    Thanks Received:
    4
    Location:
    Cyberdyne CPU Sky Net
    Last edited: Apr 10, 2014
    Xzibit and Suka say thanks.
  21. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,298 (1.26/day)
    Thanks Received:
    381
  22. Blín D'ñero

    Blín D'ñero

    Joined:
    Feb 21, 2009
    Messages:
    248 (0.11/day)
    Thanks Received:
    50
    Location:
    Netherlands
    Why does your post contain a hyperlink to itself, instead of a link to the source, Chiphell?
     
  23. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,992 (2.45/day)
    Thanks Received:
    803
    Location:
    Italy
    Can't wait to see power consumption figures; those should be as high as its performance.
     
  24. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,410 (6.34/day)
    Thanks Received:
    2,531
    Location:
    Seattle, WA
    As far as I'm concerned, the Titan Z is not the 790.
     
  25. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,992 (2.45/day)
    Thanks Received:
    803
    Location:
    Italy
    Agreed, there is literally no point in marketing the Titan Z for gaming.

    Nvidia should grow some and get a proper dual 780 Ti on the market, without all the fuss about DP compute and so on: 6 GB per GPU for less than 1399 USD, and they would have a winner.
     
