Saturday, November 2nd 2013

GeForce GTX 780 Ti Pictured in the Flesh

Shortly after its specifications sheet leaked, pictures of a reference GeForce GTX 780 Ti (which aren't renders or press-shots) surfaced on ChipHell Forums. The pictures reveal a board design that's practically identical to those of the GTX TITAN and GTX 780, with a "GTX 780 Ti" marking on the cooler. The folks over at ChipHell Forums also posted five sets of benchmark results, covering various 3DMark tests, Unigine Valley, Aliens vs. Predator 3, Battlefield 3, and BioShock Infinite, on a test-bed running a Core i7-4960X at 4.50 GHz and 16 GB of quad-channel DDR3-2933 memory. Given its specifications, it comes as no surprise that the GTX 780 Ti beats both the GTX TITAN and the R9 290X, and goes on to offer performance that's on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat.
The benchmark results from ChipHell's run follow.


Source: ChipHell Forums

92 Comments on GeForce GTX 780 Ti Pictured in the Flesh

#1
RCoon
Forum Gypsy
[quote="Pandora's Box, post: 3009254"]We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.[/quote]When 20 nm comes, it's going to be a lot more costly for both AMD and NVidia. 28 nm is refined, and as far as they're concerned it's dirt cheap to produce something that has much lower failure rates - one of the many reasons 20 nm is taking so long. If and when 20 nm does come, the GPUs are probably going to be pretty expensive.
Posted on Reply
#2
NeoXF
It's Radeon HD 7970 vs GeForce GTX 680 all over again...

I see this being mostly a driver war... again, which is great; before that generation we didn't have drivers that squeezed the living daylights (in terms of performance) out of the hardware (at least, in the limited way high-level APIs can allow).

It's fair to think that AMD might release a newer stepping of Hawaii Pro/XT (Pro-H2/XT2? Ha...) with better yields and maybe higher stock clocks (for one thing, though, the memory could be clocked a lot higher).

An R9 290X with increased efficiency, a 1075 MHz reference boost clock, and 8 GB of 6500 MHz GDDR5, please...

BTW, anyone know if Hawaii XT is the full Hawaii chip? 3072 ALUs / 192 TMUs sound like way better numbers... Maybe only for the Pro market. :(

All in all, performance is on the way up and prices on the way down, and the (GP)GPU landscape isn't boring anymore, for the moment.
Posted on Reply
#3
symmetrical
Yayyy it looks like..... the other gpus..... :wtf:
Posted on Reply
#4
SIGSEGV
by: 1c3d0g
Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:
:roll:
A "shitty card" that tackles $650 and even $1,000 cards?
Yes sir, it's a shitty card indeed... drivers, blah blah, classy...

Are you mad bro? :roll:
Posted on Reply
#5
repman244
by: 1c3d0g
Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:
U mad?
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price :rolleyes:
Next thing you know, you'll say that the higher power consumption is a plus too :laugh:
The driver point is moot, since both sides have their fair share of problems.

It's just funny to see how few mention the power consumption now, like they did when the 290X was tested. I personally don't care, but it's funny to read.
Posted on Reply
#6
EpicShweetness
I might hold out for the R9 290, but with the performance I get now, it's questionable why I shouldn't just wait for 20 nm. All these "new" cards are 28 nm pushed to its limits, and while impressive, there are key areas (in my book) these cards sacrificed for their sheer performance. So whatever, I'll wait; there are certain things I want from my cards.
Posted on Reply
#7
OC-Rage
Who says?

by: repman244
U mad?
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price :rolleyes:
Next thing you know, you'll say that the higher power consumption is a plus too :laugh:
The driver point is moot, since both sides have their fair share of problems.

It's just funny to see how few mention the power consumption now, like they did when the 290X was tested. I personally don't care, but it's funny to read.
I think this card is better: faster and cheaper than the R9 290X.

When any company unleashes a faster, more powerful GPU, think about power consumption, noise, and heat.

I don't think any 20 nm GPU is coming soon.

Wait for more benchmarks and you'll see the GTX 780 Ti beats all AMD GPUs, single and dual.
Posted on Reply
#8
MxPhenom 216
Corsair Fanboy
by: OC-Rage
I think this card is better: faster and cheaper than the R9 290X.

When any company unleashes a faster, more powerful GPU, think about power consumption, noise, and heat.

I don't think any 20 nm GPU is coming soon.

Wait for more benchmarks and you'll see the GTX 780 Ti beats all AMD GPUs, single and dual.
Um,

Cheaper no. Faster yes.

NVidia already confirmed a $699 price for the 780 Ti ($150 more than the R9 290X). It also probably won't beat dual-GPU cards, but if it even gets near one, it'll be right around GTX 690 performance.
Posted on Reply
#9
Jaffakeik
Are those really leaks? As they say in the description, I think it's been done on purpose just to add some hype for potential customers, so they're calling it a leak just to hide the advert :D But it looks like a monster.
Posted on Reply
#10
HumanSmoke
by: SIGSEGV
:roll:
Shitty card which tackle 650$ even 1000$ card
Yes sir. It's a shitty card indeed.. driver blah blah classy...

Are you mad bro? :roll:
Conversely, the $550 290X offers 4% (quiet) / 11.1% (scream) more performance than the reference 780 for a 10% higher price and three fewer games (or a 16% higher price w/ 2 fewer games)... and that's assuming you could find a 290X in stock.

Nice cherry picking. Even nicer incoherent rant.
Kirk Lazarus doesn't approve.
Posted on Reply
#11
Recus
Looks like the smaller die is the useless die.
NVIDIA clearly targets the Radeon R9 290X in their comparisons. The green team is well aware of AMD's problems with the noise and temperature of their new flagship. NVIDIA's theory is that since the R9 290X uses a 455 mm² die while the GTX 780 Ti is based on a 533 mm² die, the larger die has lower thermal density: less power concentrated into a given area. Long story short, NVIDIA's GPU will generate less heat per square millimeter, so it is easier to dissipate.

NVIDIA ran a 20-minute Crysis 3 test with the R9 290X and 780 Ti. According to their data, after 2 minutes the R9 290X drops to 720 MHz, while the GTX 780 Ti sustains a 940 MHz clock. The average clock speeds are 799 MHz and 968 MHz for the 290X and 780 Ti respectively.
http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications
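For what it's worth, the thermal-density claim is easy to sanity-check with rough numbers: the die areas quoted above (455 mm² vs. 533 mm²) and the TDP figures that come up elsewhere in the thread (300 W for the 290X, 250 W for the 780 Ti). A quick back-of-the-envelope sketch, assuming average power spread evenly across the die:

```python
# Rough sanity check of the thermal-density argument. These are the die
# areas quoted above and the TDP figures cited elsewhere in the thread;
# real hotspot density varies a lot across a die, so this is only an average.

def thermal_density(tdp_watts, die_area_mm2):
    """Average power per unit die area, in W/mm^2."""
    return tdp_watts / die_area_mm2

r9_290x = thermal_density(300, 455)    # ~0.66 W/mm^2
gtx_780ti = thermal_density(250, 533)  # ~0.47 W/mm^2

print(f"R9 290X:    {r9_290x:.3f} W/mm^2")
print(f"GTX 780 Ti: {gtx_780ti:.3f} W/mm^2")
```

By these numbers the bigger GK110 die would indeed have roughly 30% lower average thermal density, which is the gist of NVIDIA's pitch.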
Posted on Reply
#13
the54thvoid
by: radrok
I don't get what you people gain by throwing gasoline on threads, especially you.
You should know by now, Rad, that some people can only post in certain colours. I like my gfx cards like I like my hoes - hot, expensive, and composed of quality silicon.
Posted on Reply
#14
Crap Daddy
Supposedly NVIDIA's gaming performance chart comparing the 780 Ti and 290X (Quiet), courtesy of videocardz:

Posted on Reply
#15
The Von Matrices
I'm calling BS on their power consumption numbers; something is seriously wrong.

W1zzard tested the GTX 690 at 274 W peak and the R9 290X at 282 W peak. Yet somehow the 780 Ti in these charts consumes 75 W more than either of those. That would make it a 375 W card, which it can't be, since it only has a 300 W power design (a 6-pin and an 8-pin connector plus the slot).

I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.

The only way I could see those numbers making sense is if they were taken from FurMark or OCCT. AMD clamps down on power much harder than NVidia in those power viruses, so the results are really unrepresentative of the card's normal power consumption.
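For reference, that 300 W ceiling follows directly from the PCI-SIG limits: 75 W from the slot, 75 W from a 6-pin connector, 150 W from an 8-pin. A minimal sketch of the connector math:

```python
# In-spec board power for a given connector configuration, per the
# PCI Express spec: the slot delivers up to 75 W, a 6-pin auxiliary
# connector up to 75 W, and an 8-pin auxiliary connector up to 150 W.

PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_power_limit(six_pins, eight_pins):
    """Maximum in-spec board power for a card with the given connectors."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_limit(1, 1))  # 300 W: slot + 6-pin + 8-pin
```

A 375 W draw would need a second 8-pin connector (75 + 150 + 150), which the pictured reference board doesn't have.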
Posted on Reply
#17
Crap Daddy
by: The Von Matrices
I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.
Moving on to power characteristics: the GTX 780 Ti has a much lower TDP than the 290X, at 250 watts. This is actually the same number as for the TITAN and 780.
source: http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications

Apparently they have official specs of the card. Another interesting thing is:
NVIDIA equipped the GTX 780 Ti with a new feature called Power Balancing. Power drawn from the 6-pin connector, the 8-pin connector, and the PCI Express interface is normally balanced across these three sources depending on the current load. However, if the user overclocks the card, power delivery becomes unbalanced, and the card draws more power from one source than the others. To fix this problem, NVIDIA came up with Power Balancing: with the feature enabled, the GPU can steer power from one input to another. This should improve overclocking capabilities in comparison to the GTX 780 or the TITAN.
Posted on Reply
#18
NeoXF
The amount of ignorance and fanboyism in this thread is too damn high!
Posted on Reply
#19
MxPhenom 216
Corsair Fanboy
by: NeoXF
The amount of ignorance and fanboyism in this thread is too damn high!
The amount of worthiness in this post is too damn low.
Posted on Reply
#20
Eagleye
by: The Von Matrices
The only way I could possibly see those numbers make sense is if they were taken from Furmark or OCCT. AMD clamps down on power much harder than NVidia in those power viruses so the results are really unrepresentative of the normal power consumption of the card.
Wrong; NVidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially FurMark. See TPU reviews.

This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).
Posted on Reply
#21
nemesis.ie
by: the54thvoid
You should know by now, Rad, that some people can only post in certain colours. I like my gfx cards like I like my hoes - hot, expensive, and composed of quality silicon.
Are you sure you don't prefer your ladies with "quality silicone" instead? :roll:
Posted on Reply
#22
The Von Matrices
by: Eagleye
Wrong; NVidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially FurMark. See TPU reviews.

This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).
Are you looking at the same TPU data I am? It's obvious that both AMD and NVidia clamp board power consumption to the specification. Look at the maximum power consumption chart from the R9 290X review. The TITAN and the 780 are both 250 W cards, and they are held to within 7% of that specification. The R9 290X is a 300 W card, and it is held to within 5% of that specification. The leaked specifications call the 780 Ti a 250 W card. There is no way the 780 Ti is drawing 75 W more than the R9 290X when other 250 W NVidia cards draw 50 W less.

I share your skepticism, but unlike you, I don't think this is in any way the best-case power consumption scenario for the 780 Ti; in fact, I'd argue it's much worse than the worst case. It looks like the tester overclocked a reference board to get better performance results, and it's drawing vastly more power because of it. I would expect actual 780 Ti performance to be slightly lower than these results, but board power to be significantly lower.
Posted on Reply
#23
erocker
Release a card for $500 over all other GPUs. Some buy it, some are on the fence, some think it's a ripoff.

...Then release a card for $200 over all other GPUs and it's a bargain!!

Cunning moves!
Posted on Reply
#25
qubit
Overclocked quantum bit
I think this thread has run its course. Let's wait until some hard evidence of its performance turns up. It won't be long now.
Posted on Reply