Saturday, November 2nd 2013

GeForce GTX 780 Ti Pictured in the Flesh

Shortly after its specifications sheet leaked, pictures of a reference GeForce GTX 780 Ti (which aren't renders or press shots) surfaced on the ChipHell Forums. The pictures reveal a board design that's practically identical to the GTX TITAN and GTX 780, with "GTX 780 Ti" markings on the cooler. The folks over at ChipHell also posted five sets of benchmark results, covering various 3DMark tests, Unigine Valley, Aliens vs. Predator 3, Battlefield 3, and BioShock Infinite, on a test bed running a Core i7-4960X at 4.50 GHz with 16 GB of quad-channel DDR3-2933 memory. Given its specifications, it comes as no surprise that the GTX 780 Ti beats both the GTX TITAN and the R9 290X, and goes on to offer performance on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat.
The benchmark results from ChipHell's run follow.


Source: ChipHell Forums
Add your own comment

92 Comments on GeForce GTX 780 Ti Pictured in the Flesh

#1
ensabrenoir
It's the law of the Techno-Jungle... baby

.......wow, all this science, math, number crunching..... balderdash!!! You got the fastest GPU? You claim the right to whatever price you want! It's the Law!
Posted on Reply
#2
1d10t
So these early leaks show NVIDIA "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load :p
Posted on Reply
#3
SIGSEGV
by: 1d10t
So these early leaks show NVIDIA "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load :p
let's not forget that this is a $699 card, and their reference cooler is kind of shiny and elegant. I'm so sure that cooler can cool this card better than the 290X reference ($549) :laugh: /sarcasm.
I love the way NVIDIA milks the cash cow :laugh:
Posted on Reply
#4
radrok
This card needs two eight pin power connectors and at least eight power phases just for the core.

And lol at people saying this has no overclocking headroom; 2688-core Titans can reach 1300-1400 MHz core with 1.3 V.

Wouldn't be surprised to see 1500 MHz core on 1.5 V Classifieds with this chip.
Posted on Reply
#5
ShurikN
by: radrok
Wouldn't be surprised to see 1500 MHz core on 1.5 V Classifieds with this chip.
Yea... good luck with the power bill...
Posted on Reply
#6
SIGSEGV
by: ShurikN
Yea... good luck with the power bill...
they don't care
Posted on Reply
#7
Suka
The power figures of the 780 Ti make the 290X look good now; the guys who complained about its power consumption will be like (fill in your thoughts here) :laugh: Assuming all this is true
Posted on Reply
#8
1d10t
by: SIGSEGV
let's not forget that this is a $699 card, and their reference cooler is kind of shiny and elegant. I'm so sure that cooler can cool this card better than the 290X reference ($549) :laugh: /sarcasm.
I love the way NVIDIA milks the cash cow :laugh:
Oh, don't forget their features and proprietary stuff... $699 is nothing for such a fancy cooler that trades 2 dB bla... bla for being 2 times louder bla... bla, 3D (sooo 2010) active-shutter glasses, boost lighting on a TN panel, and the future G-Sync to mark a duet between Justin Timberlake and Justin Bieber :laugh:
Subjectivity is bliss, ignorance is the new logic :laugh:
Posted on Reply
#9
xorbe
780Ti length: 281mm (11.0")
Titan length: 267mm (10.5")
Posted on Reply
#10
The Von Matrices
by: Slomo4shO
Take a look at the 4K benchmarks. Seems the card is limited by memory bandwidth...
The spec sheet clearly shows 7 GHz memory, which on a 384-bit bus gives it 336 GB/s of memory bandwidth. That's more than the R9 290X's 320 GB/s, so by your reasoning the 780 Ti shouldn't lose to or tie the 290X at 4K (even though it does). I suspect ROP performance is more of the issue here.
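For reference, both bandwidth figures above fall straight out of bus width times effective data rate; a quick sketch (data rates as quoted in the thread, with the 290X's 5 GHz effective GDDR5 from its spec sheet):

```python
# Memory bandwidth in GB/s = (bus width in bits / 8 bits per byte)
# * effective data rate in GT/s
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits / 8 * data_rate_gtps

# GTX 780 Ti: 384-bit bus, 7 GHz effective GDDR5
print(mem_bandwidth_gbs(384, 7.0))  # 336.0 GB/s
# R9 290X: 512-bit bus, 5 GHz effective GDDR5
print(mem_bandwidth_gbs(512, 5.0))  # 320.0 GB/s
```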

by: Suka
Assuming all this is true
I think more likely than not these numbers are correct. But what I've learned from the R9 290X speculation and hype is just how cherry-picked these initial leaked benchmarks are (for better or for worse, depending on the bias of the source). You can't get a full picture of a card's advantages and disadvantages from just 5 benchmarks. The R9 290X's performance looked great from initial benchmarks and specifications, but then the final reviews revealed the heatsink and power consumption, which significantly dulled the appeal.
Posted on Reply
#11
mastrdrver
by: the54thvoid
Single card 4k is pointless tbh. Need dualies for that.
I disagree, as the 4K benches help anyone with multiple monitors get a good feel for how the card will perform.

You also don't need multiple cards, but you do need bandwidth. That's the biggest killer in 4K and multi-monitor setups. You can see this in the 290X's benchmarks as the resolution scales to 4K.
Posted on Reply
#12
Eagleye
I just hope W1zzard tests this card the same way he did the 290X, e.g. sticking his hand in front of the air vent to see how it does. I also hope all reviewers, including W1zzard, warm the card up before benching, as was done for the 290X; otherwise the tests are null.

Now back to the card.. Wow, this thing is going to take the record for the hottest, highest-power, and probably loudest card ever made. The electric bill alone will double the price of this card within a year.:nutkick:
Posted on Reply
#13
chinmi
when the R9 290X came out, compared to a GTX 780, the R9 290X was:
  1. cheaper
  2. faster
  3. hotter
  4. more power-hungry
than the GTX 780... most NVIDIA fanboys' reaction to the R9 290X on YouTube and in review comments: it's too hot, and that power bill is outrageous!! who cares if the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 FTW!

now the 780 Ti comes out, and compared to it, the R9 290X is:
  1. cheaper
  2. slower
  3. hotter
  4. more power-hungry
than the GTX 780 Ti... I bet most NVIDIA fanboys' reaction on YouTube and in review comments is gonna be: it's faster!! who cares about heat and the power bill, even though the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 Ti FTW!

:roll:
Posted on Reply
#14
Sihastru
An argument that can be used by both camps is not an argument at all. And you forgot about the noise levels.
Posted on Reply
#15
jagd
I agree with you, but the problem is more complicated: if a company gives a cherry-picked benchmark list to reviewers/review sites and asks them to show it along with a given spec list, it's time to question how independent the reviewers are, and how many steps removed they are from that company's PR/marketers. A similar thing happened with the Xbox 360, and most of the gaming media is trying to downplay 720p games on Xbox One vs. 1080p BF4 and CoD on PS4
http://www.neogaf.com/forum/showthread.php?t=704836

by: The Von Matrices

I think more likely than not these numbers are correct. But what I've learned from the R9 290X speculation and hype is just how cherry-picked these initial leaked benchmarks are (for better or for worse, depending on the bias of the source).
Are you sure they are only fanboys? Shills? Social media marketers? Focus group members? Remember, NVIDIA got caught with its hand in the cookie jar :slap:
by: chinmi

than the GTX 780 Ti... I bet most NVIDIA fanboys' reaction on YouTube and in review comments is gonna be: it's faster!!
Posted on Reply
#16
rainzor
So no one noticed how, when it comes to power consumption, in half of those "tests" the 290X is on par with the Titan, and in the other half on par with the GTX 780? Every bench I've seen so far shows it consuming at least 40 W more than the Titan, and double that compared to the 780.

oh yea, $599 for the 3 GB version and $649 for the 6 GB, or GTFO
Posted on Reply
#17
OC-Rage
HA HA, BEATS ALL GPUs

hi

see, this card with 3 GB of VRAM beats all single and dual GPUs

the power is there, and it's cheaper than all GPUs

"performance that's on par with dual-GPU cards such as the GTX 690, and HD 7990. For a single-GPU card, that's a great feat"

:banghead::D
Posted on Reply
#18
repman244
So why is nobody complaining about the power consumption now? :rolleyes:
Posted on Reply
#19
Raptorpowa
is it 4 GB of VRAM and a 512-bit bus like the 290X? If not... the 290X is the one for me, cuz I'll be rocking three 27" Crossover monitors soon....
Posted on Reply
#20
the54thvoid
by: repman244
So why is nobody complaining about the power consumption now? :rolleyes:
but also consumes 10-15% more power, which is probably about 20% more than a Titan, which could be about 80-100 watts? :eek:
See that 'eek'? I've mentioned its power consumption twice in this thread. It's bloody high. The only mitigating factor is that IF the figures are true, it matches dual-GPU performance.

The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy then all those blasting the 290X will need to keep their mouths shut or also criticise this card. The power usage isn't an issue in itself. It's on the same node as the GK104 chip, and therefore has the same (in)efficiencies. So if this single card matches a GTX 690, we should expect it to draw similar power. If it draws much more than the relative increase over a GTX 690 implies, then it is less efficient.

Power usage is only an argument as a performance-per-watt ratio. Apologies for using the 290X graph, but it is relevant and has all the big players.

Posted on Reply
#21
qubit
Overclocked quantum bit
by: the54thvoid
The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy then all those blasting the 290X will need to keep their mouths shut or also criticise this card.
Quite agree. If it's hot and noisy then you can bet I'll be criticising it. I might generally prefer NVIDIA's products, but if they put out a lemon, I'm gonna call them out on it.
Posted on Reply
#22
Crap Daddy
by: the54thvoid
The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy then all those blasting the 290X will need to keep their mouths shut or also criticise this card.
First, these leaks are a long way from a professional review. Videocardz says the 780 Ti was clocked 50 MHz above stock. I find it hard to believe that NVIDIA will launch a card that's as noisy, hot, and power hungry as the 290X. At stock clocks, expect the reference 780 Ti to draw less power than the 290X while performing better. While it seems impossible to surpass the 290X convincingly in several Gaming Evolved titles, I do think it's fair to assume that in every other game this card can achieve around a 10% improvement.
I also think NVIDIA will finally allow better overclocking of the card, a situation where power consumption and heat will shoot through the roof. But that is to be expected.
Posted on Reply
#23
1c3d0g
by: repman244
So why is nobody complaining about the power consumption now? :rolleyes:
Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:
Posted on Reply
#24
20mmrain
My question is, who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important .... like, I don't know, building a card that doesn't use 350 W by itself and doesn't require a nuclear facility to cool it. It won't be long before all cards come with water blocks, or need a 700 W PSU just for the card

All of this seems like laziness to me! It's been a long time since any real progress was made on the video card front! This re-badged card just proves it some more.
Posted on Reply
#25
Pandora's Box
by: 20mmrain
My question is, who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important .... like, I don't know, building a card that doesn't use 350 W by itself and doesn't require a nuclear facility to cool it. It won't be long before all cards come with water blocks, or need a 700 W PSU just for the card

All of this seems like laziness to me! It's been a long time since any real progress was made on the video card front! This re-badged card just proves it some more.
We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.
Posted on Reply