
GeForce GTX 780 Ti Pictured in the Flesh

Discussion in 'News' started by btarunr, Nov 2, 2013.

  1. Raptorpowa

    Raptorpowa

    Joined:
    Sep 4, 2013
    Messages:
    48 (0.11/day)
    Thanks Received:
    11

Whoever wins the crown, I'll go with the fastest one and treat it really well with the pedestal addition to my SM8. Not only that, I pre-ordered the RIVE Black Edition to replace the old fart i7 920, so this GPU will be happy in its new home. The three HD 7950s will stay with the i7 920 in a new case, probably a Corsair 750...
     
  2. ensabrenoir

    ensabrenoir

    Joined:
    Apr 16, 2010
    Messages:
    1,248 (0.74/day)
    Thanks Received:
    207
It's the law of the Techno-Jungle... baby

.......wow, all this science, math, number crunching..... balderdash!!! You got the fastest GPU? You claim the right to whatever price you want! It's the Law!
     
  3. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.25/day)
    Thanks Received:
    42
So these early leaks show NVIDIA "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load :p
     
    INSTG8R says thanks.
  4. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    508 (0.53/day)
    Thanks Received:
    107
Let's not forget that this is a $699 card and its reference cooler is kind of shiny and elegant. I'm so sure that cooler can cool this card better than the 290X reference ($549) :laugh: /sarcasm.
I love the way NVIDIA milks the cash cow :laugh:
     
  5. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,990 (2.66/day)
    Thanks Received:
    803
    Location:
    Italy
This card needs two eight-pin power connectors and at least eight power phases just for the core.

And lol to people saying this has no overclocking headroom: 2688-CUDA-core Titans can reach 1300-1400 MHz core with 1.3 V.

I wouldn't be surprised to see 1500 MHz core at 1.5 V on Classifieds with this chip.
     
    OC-Rage says thanks.
  6. ShurikN New Member

    Joined:
    Nov 3, 2013
    Messages:
    10 (0.03/day)
    Thanks Received:
    2
    Yea... good luck with the power bill...
     
    OC-Rage says thanks.
  7. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    508 (0.53/day)
    Thanks Received:
    107
    they don't care
     
    radrok says thanks.
  8. Suka

    Suka

    Joined:
    May 6, 2013
    Messages:
    54 (0.10/day)
    Thanks Received:
    7
    Location:
    Steam Servers
The power figures of the 780 Ti make the 290X look good now; the guys who complained about its power consumption will be like (fill in your thoughts here) :laugh: Assuming all this is true.
     
  9. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.25/day)
    Thanks Received:
    42
Oh, don't forget their features and proprietary stuff... $699 is nothing for such a fancy cooler that trades 2 dB bla... bla, two times louder bla... bla, 3D active-shutter glasses (sooo 2010), boost lighting on a TN panel, and the upcoming G-Sync :laugh:
Subjectivity is bliss, ignorance is the new logic :laugh:
     
  10. xorbe

    Joined:
    Feb 14, 2012
    Messages:
    414 (0.41/day)
    Thanks Received:
    60
    Location:
    Bay Area, CA
    780Ti length: 281mm (11.0")
    Titan length: 267mm (10.5")
     
  11. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,367 (0.95/day)
    Thanks Received:
    462
    The spec sheet clearly shows 7GHz memory, which on a 384-bit bus would give it 336GB/s of memory bandwidth. This is more than the R9 290X's 320GB/s bandwidth, so according to your reasoning it shouldn't lose to or tie the 780Ti at 4K (even though it does). I suspect ROP performance is more of the issue here.

    I think more likely than not these numbers are correct. But what I've learned from R9 290X speculation and hype is just how cherry picked these initial leaked benchmarks are (for better or for worse depending on the bias of the source.) You can't get a full idea of the card's advantages and disadvantages just based on 5 benchmarks. The R9 290X's performance looked great from initial benchmarks and specifications, but then the final reviews showed the heatsink and power consumption, which significantly dulled the appeal.
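The bandwidth figures quoted above follow from simple arithmetic: effective data rate per pin times bus width, divided by 8 bits per byte. A quick sketch (the clock and bus-width numbers come from the posts; the function name is mine):

```python
def mem_bandwidth_gbs(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate * bus width / 8 bits-per-byte."""
    return effective_gbps_per_pin * bus_width_bits / 8

# GTX 780 Ti (per the leaked spec sheet): 7 Gbps GDDR5 on a 384-bit bus
print(mem_bandwidth_gbs(7, 384))  # 336.0 GB/s
# R9 290X: 5 Gbps GDDR5 on a 512-bit bus
print(mem_bandwidth_gbs(5, 512))  # 320.0 GB/s
```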
     
    Last edited: Nov 3, 2013
  12. mastrdrver

    mastrdrver

    Joined:
    Feb 24, 2009
    Messages:
    3,162 (1.51/day)
    Thanks Received:
    586
    I disagree as the 4k benches help anyone with multiple monitors to get a good feel of how the card will perform.

    You also do not need multiple cards, but you do need bandwidth. That's the biggest killer of 4k and multiple monitor setups. You can see this in the benchmarks of the 290x as the resolution scales to 4k.
     
  13. Eagleye

    Joined:
    Dec 4, 2012
    Messages:
    72 (0.10/day)
    Thanks Received:
    12
I just hope W1zzard tests this card in the same manner as the 290X, e.g. sticking his hand in front of the air vent to see how it does. I also hope all reviewers, including W1zzard, warm the card up before benching as was done for the 290X; otherwise the tests are null.

Now back to the card... Wow, this thing is going to take the record for the hottest, highest-power, and probably loudest card ever made. The electric bill alone will double the price of this card within a year. :nutkick:
     
  14. chinmi

    Joined:
    Nov 9, 2011
    Messages:
    95 (0.09/day)
    Thanks Received:
    10
    Location:
    Indonesia
When the R9 290X came out, compared to a GTX 780, the R9 290X was:
1. cheaper
2. faster
3. hotter
4. more power-hungry
than the GTX 780... Most NVIDIA fanboys' reaction to the R9 290X on YouTube and in review comments: it's too hot, and that power bill is outrageous!! Who cares if the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 FTW!

Then the 780 Ti came out, and compared to it, the R9 290X is:
1. cheaper
2. slower
3. hotter
4. more power-hungry
than the GTX 780 Ti... I bet most NVIDIA fanboys' reaction on YouTube and in review comments is gonna be: it's faster!! Who cares about heat and the power bill, even though the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 Ti FTW!

    :roll:
     
    Roph says thanks.
  15. Sihastru

    Sihastru

    Joined:
    Apr 26, 2009
    Messages:
    357 (0.18/day)
    Thanks Received:
    67
    Location:
    You are here.
    An argument that can be used by both camps is not an argument at all. And you forgot about the noise levels.
     
  16. jagd

    Joined:
    Jul 10, 2009
    Messages:
    460 (0.23/day)
    Thanks Received:
    89
    Location:
    TR
I agree with you, but the problem is more complicated. If a company gives a cherry-picked benchmark list to reviewers/review sites, asks them to show it, and dictates which specs must be mentioned, it's time to ask how independent those reviewers are and how many steps removed they are from that company's PR/marketing. A similar thing happened with the Xbox 360, and most of the gaming media is now trying to downplay 720p games on the Xbox One vs. 1080p BF4 and CoD on the PS4.
http://www.neogaf.com/forum/showthread.php?t=704836

Are you sure they're only fanboys? Shills? Social media marketers? Focus group members? Remember, NVIDIA got caught with its hand in the cookie jar :slap:
     
  17. rainzor

    Joined:
    Sep 27, 2012
    Messages:
    7 (0.01/day)
    Thanks Received:
    4
    Location:
    Croatia
So no one noticed how, when it comes to power consumption, the 290X is on par with the Titan in half of those "tests" and on par with the GTX 780 in the other half? Every bench I've seen so far shows it consumes at least 40 W more than the Titan and double that compared to the 780.

Oh yeah, $599 for the 3 GB version and $649 for 6 GB, or GTFO.
     
    OC-Rage says thanks.
  18. OC-Rage

    OC-Rage

    Joined:
    Oct 20, 2013
    Messages:
    84 (0.21/day)
    Thanks Received:
    3
HA HA, BEATS all GPUs

Hi,

see, this card with 3 GB of VRAM beats all single and dual GPUs.

The power is there, and it's cheaper than all of them.

Performance on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat. :banghead::D
     
  19. repman244

    repman244

    Joined:
    Apr 7, 2011
    Messages:
    1,104 (0.83/day)
    Thanks Received:
    456
    So why is nobody complaining about the power consumption now? :rolleyes:
     
    Roph says thanks.
  20. Raptorpowa

    Raptorpowa

    Joined:
    Sep 4, 2013
    Messages:
    48 (0.11/day)
    Thanks Received:
    11
Is it 4 GB of VRAM and a 512-bit bus like the 290X? If not... the 290X is the one for me, cuz I'll be rocking three 27" Crossover monitors soon...
     
  21. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,433 (1.90/day)
    Thanks Received:
    1,653
    Location:
    Glasgow - home of formal profanity
See that 'eek'? I've mentioned twice in this thread that its power consumption is bloody high. The only mitigating factor is that IF the figures are true, it matches dual-GPU performance.

The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy, then all those blasting the 290X will need to keep their mouths shut or criticise this card too. The power usage isn't really an issue: it's on the same node as a GK104 chip and therefore has the same (in)efficiencies. So if this single card matches a GTX 690, we should expect it to draw similar power. If it draws a lot more than its relative gain over a GTX 690 would justify, then it is less efficient.

    Power usage is only an argument from a performance/watts ratio. Apologies for using the 290X graph but it is relevant and has all the big players.

[Image: performance-per-watt comparison chart from the R9 290X review]
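The performance-per-watt framing in this post can be made concrete with a toy calculation. All numbers below are hypothetical placeholders, not measurements for any card:

```python
def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    """Higher is more efficient: relative performance divided by board power draw."""
    return relative_perf / board_power_w

# Hypothetical card A: baseline performance (1.00) at 250 W.
# Hypothetical card B: 10% faster (1.10) but drawing 290 W.
card_a = perf_per_watt(1.00, 250)
card_b = perf_per_watt(1.10, 290)

# B wins on raw speed, yet A wins on efficiency:
print(card_a > card_b)  # True
```

The point being: raw power draw alone says nothing; a card can draw more total watts and still be the more efficient design if its performance rises by a larger proportion.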
     
    xvi and qubit say thanks.
  22. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,870 (3.88/day)
    Thanks Received:
    3,498
    Location:
    Quantum well (UK)
    Quite agree. If it's hot and noisy then you can bet I'll be criticising it. I might generally prefer NVIDIA's products, but if they put out a lemon, I'm gonna call them out on it.
     
  23. Crap Daddy

    Crap Daddy

    Joined:
    Oct 29, 2010
    Messages:
    2,758 (1.86/day)
    Thanks Received:
    1,050
First, these leaks are very far from what a professional review means. Videocardz says the 780 Ti was clocked 50 MHz above stock. I find it hard to believe that NVIDIA will launch a card that's as noisy, hot, and power-hungry as the 290X. At stock clocks, expect the reference 780 Ti to draw less power than the 290X while performing better. While it seems impossible to convincingly surpass the 290X in several Gaming Evolved titles, I think it's fair to assume that in every other game this card can achieve around a 10% improvement.
I also think NVIDIA will finally allow better overclocking of the card, a situation where power consumption and heat will shoot through the roof. But that is to be expected.
     
  24. 1c3d0g

    1c3d0g

    Joined:
    Dec 9, 2007
    Messages:
    699 (0.28/day)
    Thanks Received:
    59
    Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:
     
  25. 20mmrain

    20mmrain

    Joined:
    Oct 6, 2009
    Messages:
    2,774 (1.48/day)
    Thanks Received:
    826
    Location:
    Midwest USA
My question is: who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important... like, I don't know, building a card that doesn't use 350 watts by itself and doesn't require a nuclear facility to cool it? It won't be long before all cards come with water blocks or need a 700-watt PSU just for the card alone.

All of this seems like laziness to me! It's been a long time since any real progress has been made on the video card front! This re-badged card just proves it some more.
     
