
GeForce GTX 780 Ti Pictured in the Flesh

[Image: GeForce GTX 780 Ti]


Whoever wins the crown, I will go with the fastest one and treat it really well with the pedestal addition to my SM8. Not only that, I pre-ordered the RIVE Black Edition to replace the old fart i7 920, so this GPU will be happy in its new home. The three HD 7950s will stay with the i7 920 in a new case, probably a Corsair 750...
 
It's the law of the Techno-Jungle... baby.

...wow, all this science, math, number crunching... balderdash!!! You've got the fastest GPU? You claim the right to whatever price you want! It's the Law!
 
So these early leaks show NVIDIA "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load :p
 
So these early leaks show NVIDIA "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load :p

Let's not forget that this is a $699 card and their reference-design cooler is kind of shiny and elegant. I'm so sure that cooler is able to cool this card better than the 290X reference ($549) :laugh: /sarcasm.
I love the way NVIDIA milks the cash cow :laugh:
 
This card needs two eight pin power connectors and at least eight power phases just for the core.

And lol at people saying this has no overclocking headroom; 2688-CUDA-core Titans can reach 1300-1400 MHz core with 1.3 V.

Wouldn't be surprised to see 1500 MHz core at 1.5 V on Classified cards with this chip.
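For a rough sense of why that headroom matters, here's a back-of-envelope sketch of theoretical FP32 throughput versus core clock. The shader counts and clocks below are just the figures tossed around in this thread, plus an assumed 2880-shader full GK110; nothing here comes from the leaked benchmarks.

```python
# Back-of-envelope: theoretical FP32 throughput = 2 FLOPs (FMA) per CUDA core per clock.
# Shader counts and clocks are the ones mentioned in this thread, not leak data.
def fp32_tflops(cuda_cores: int, core_mhz: float) -> float:
    return 2 * cuda_cores * core_mhz * 1e6 / 1e12

print(fp32_tflops(2688, 876))   # Titan near stock boost   -> ~4.7 TFLOPS
print(fp32_tflops(2688, 1300))  # overclocked Titan        -> ~7.0 TFLOPS
print(fp32_tflops(2880, 1500))  # hypothetical full GK110  -> ~8.6 TFLOPS
```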
 
Wouldn't be surprised to see 1500 MHz core at 1.5 V on Classified cards with this chip.
Yeah... good luck with the power bill...
 
The power figures of the 780 Ti make the 290X look good. Now the guys who complained about its power consumption will be like (fill in your thoughts here) :laugh: Assuming all this is true
 
Let's not forget that this is a $699 card and their reference-design cooler is kind of shiny and elegant. I'm so sure that cooler is able to cool this card better than the 290X reference ($549) :laugh: /sarcasm.
I love the way NVIDIA milks the cash cow :laugh:

Oh, don't forget their features and proprietary stuff... $699 is nothing for such a fancy cooler that trades 2 dB blah... blah, two times louder blah... blah, 3D active-shutter glasses (soooo 2010), LightBoost on a TN panel, and the upcoming G-Sync to mark a duet between Justin Timberlake and Justin Bieber :laugh:
Subjectivity is bliss, ignorance is the new logic :laugh:
 
780 Ti length: 281 mm (11.0")
Titan length: 267 mm (10.5")
 
Take a look at the 4K benchmarks. Seems the card is limited by memory bandwidth...

The spec sheet clearly shows 7 GHz memory, which on a 384-bit bus would give it 336 GB/s of memory bandwidth. This is more than the R9 290X's 320 GB/s, so by your reasoning it shouldn't be losing to or tying with the R9 290X at 4K (even though it does). I suspect ROP performance is more of the issue here.
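For reference, here's a quick sanity check of those bandwidth figures using the standard formula (effective memory clock × bus width ÷ 8); the clocks and bus widths are the ones quoted above, nothing new.

```python
# Peak memory bandwidth = effective memory clock (GT/s) * bus width (bits) / 8
def mem_bandwidth_gbs(effective_clock_gtps: float, bus_width_bits: int) -> float:
    return effective_clock_gtps * bus_width_bits / 8  # GB/s

print(mem_bandwidth_gbs(7.0, 384))  # GTX 780 Ti: 336.0 GB/s
print(mem_bandwidth_gbs(5.0, 512))  # R9 290X:    320.0 GB/s
```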

Assuming all this is true

I think more likely than not these numbers are correct. But what I've learned from R9 290X speculation and hype is just how cherry picked these initial leaked benchmarks are (for better or for worse depending on the bias of the source.) You can't get a full idea of the card's advantages and disadvantages just based on 5 benchmarks. The R9 290X's performance looked great from initial benchmarks and specifications, but then the final reviews showed the heatsink and power consumption, which significantly dulled the appeal.
 
Single card 4k is pointless tbh. Need dualies for that.

I disagree, as the 4K benches help anyone with multiple monitors get a good feel for how the card will perform.

You also do not need multiple cards, but you do need bandwidth. That's the biggest killer of 4K and multi-monitor setups. You can see this in the benchmarks of the 290X as the resolution scales to 4K.
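A rough illustration of the point: the pixel count (and with it the per-frame framebuffer traffic) climbs quickly as you move from 1080p to surround or 4K. The resolutions below are just common examples, not anything from the leaked benches.

```python
# Pixel counts relative to 1080p; more pixels per frame -> more ROP/bandwidth pressure.
resolutions = {
    "1920x1080":      1920 * 1080,
    "2560x1440":      2560 * 1440,
    "3x 2560x1440":   3 * 2560 * 1440,   # triple-monitor surround
    "3840x2160 (4K)": 3840 * 2160,
}
base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MPixels ({pixels / base:.2f}x 1080p)")
```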
 
I just hope W1zzard tests this card in the same manner he did the 290X, e.g. sticking his hand in front of the air vent to see how it does. I also hope all reviews, including W1zzard's, warm the card up for benches as was done for the 290X; otherwise the tests are null.

Now back to the card... Wow, this thing is going to take the record for the hottest, highest-power-usage, and probably loudest card ever made. The electric bill alone will double the price of this card within a year. :nutkick:
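For what it's worth, here's a back-of-envelope check on that electric-bill claim; every number in it (board power, gaming hours, electricity rate) is an assumption, so the result mainly shows how sensitive the claim is to those inputs.

```python
# Back-of-envelope yearly running cost; all inputs are assumptions, not leak data.
card_watts = 300        # assumed full-load board power
hours_per_day = 4       # assumed daily gaming time
price_per_kwh = 0.15    # assumed electricity rate, $/kWh

yearly_kwh = card_watts / 1000 * hours_per_day * 365
print(f"{yearly_kwh:.0f} kWh/year -> ${yearly_kwh * price_per_kwh:.0f}/year")
# ~438 kWh/year -> ~$66/year with these assumptions
```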
 
When the R9 290X came out, compared to a GTX 780, the R9 290X was:
  1. cheaper
  2. faster
  3. hotter
  4. higher in power consumption
than the GTX 780... Most NVIDIA fanboys' reaction to the R9 290X on YouTube and in review comments: it's too hot, and that power bill is outrageous!! Who cares if the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 FTW!

Then the 780 Ti came out; compared to the R9 290X, the R9 290X is:
  1. cheaper
  2. slower
  3. hotter
  4. higher in power consumption
than the GTX 780 Ti... I bet most NVIDIA fanboys' reaction on YouTube and in review comments is gonna be: it's faster!! Who cares about heat and the power bill, even though the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 Ti FTW!

:roll:
 
An argument that can be used by both camps is not an argument at all. And you forgot about the noise levels.
 
I agree with you, but the problem is more complicated. If a company gives a cherry-picked benchmark list to reviewers/review sites and asks them to show it, along with a spec list that must be mentioned, then it's time to question how independent the reviewers are and how many steps away they are from that company's PR/marketers. A similar thing happened with the Xbox 360, and most of the gaming media is trying to downplay 720p games on Xbox One vs. 1080p BF4 and CoD on PS4:
http://www.neogaf.com/forum/showthread.php?t=704836

I think more likely than not these numbers are correct. But what I've learned from R9 290X speculation and hype is just how cherry picked these initial leaked benchmarks are (for better or for worse depending on the bias of the source.)

Are you sure they are only fanboys? Shills? Social media marketers? Focus group members? Remember NVIDIA got caught with its hand in the cookie jar :slap:
than the GTX 780 Ti... I bet most NVIDIA fanboys' reaction on YouTube and in review comments is gonna be: it's faster!!
 
So no one noticed how, in half of those "tests", the 290X is on par with the Titan, and in the other half on par with the GTX 780, when it comes to power consumption? Every bench I've seen so far shows it consumes at least 40 W more than the Titan and double that compared to the 780.

Oh yeah, $599 for the 3 GB version and $649 for the 6 GB, or GTFO.
 
HA HA, BEATS all GPUs

Hi,

See, this card with 3 GB of VRAM beats all single and dual GPUs.

The power is there, and it's cheaper than all of those GPUs.

Performance that's on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat. :banghead::D
 
So why is nobody complaining about the power consumption now? :rolleyes:
 
Is it 4 GB VRAM and a 512-bit bus like the 290X? If not... the 290X is the one for me, cuz I will be rocking three 27" Crossover monitors soon...
 
So why is nobody complaining about the power consumption now? :rolleyes:

but also consumes 10-15% more power, which is probably about 20% more than a Titan, which could be about 80-100 watts? :eek:

See that 'eek'? I've mentioned its power consumption twice in this thread. It's bloody high. The only mitigating factor is that IF the figures are true, it matches dual-GPU performance.

The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy, then all those blasting the 290X will need to keep their mouths shut or also criticise this card. The power usage isn't an issue in itself: it's on the same node as the GK104 chip, and therefore has the same (in)efficiencies. So if this single card matches a GTX 690, we should expect it to draw similar power. If it draws a lot more than the relative increase over a GTX 690 would suggest, then it is less efficient.

Power usage is only an argument in terms of the performance-per-watt ratio. Apologies for using the 290X graph, but it is relevant and has all the big players.

[Chart: performance per watt at 2560x1600]
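To make the perf-per-watt point concrete, here's a toy comparison; the relative-performance and board-power numbers are placeholders picked purely for illustration, not figures from the leak or from that chart.

```python
# Toy perf-per-watt comparison; all numbers are illustrative placeholders.
cards = {
    "R9 290X (uber)":    {"rel_perf": 1.00, "board_watts": 250},
    "GTX 780 Ti (leak)": {"rel_perf": 1.05, "board_watts": 280},
    "GTX 690":           {"rel_perf": 1.05, "board_watts": 300},
}
base = cards["R9 290X (uber)"]["rel_perf"] / cards["R9 290X (uber)"]["board_watts"]
for name, c in cards.items():
    ppw = c["rel_perf"] / c["board_watts"]
    print(f"{name}: {ppw / base:.2f}x the 290X's perf/watt")
```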
 
The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy, then all those blasting the 290X will need to keep their mouths shut or also criticise this card.

Quite agree. If it's hot and noisy then you can bet I'll be criticising it. I might generally prefer NVIDIA's products, but if they put out a lemon, I'm gonna call them out on it.
 
The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy, then all those blasting the 290X will need to keep their mouths shut or also criticise this card.

First, these leaks are very far from what a professional review means. Videocardz says the 780 Ti was clocked 50 MHz above stock. I find it hard to believe that NVIDIA will launch a card that's as noisy, hot, and power hungry as the 290X. At stock clocks, expect the reference 780 Ti to draw less power than the 290X while performing better. While it seems impossible to convincingly surpass the 290X in several Gaming Evolved titles, I do think it's fair to assume that in every other game this card can achieve around a 10% improvement.
I also think that NVIDIA will finally allow better overclocking of the card, a situation where power consumption and heat will shoot through the roof. But that is to be expected.
 
So why is nobody complaining about the power consumption now? :rolleyes:

Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:
 
My question is, who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important... like, I don't know, building a card that doesn't use 350 watts by itself and doesn't require a nuclear facility to cool it. It won't be long before all cards come with water blocks or need a 700-watt PSU just for the card.

All of this seems like laziness to me! It's been a long time since any real progress has been made on the video card front! This re-badged card just proves it some more.
 