
GeForce GTX 780 Ti Pictured in the Flesh

My question is: who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important... like, I don't know, building a card that doesn't use 350 watts by itself and doesn't require a nuclear facility to cool it. It won't be long before all cards come with water blocks or need a 700-watt PSU just to power the card by itself.

All of this seems like laziness to me! It's been a long time since any real progress has been made on the video card front! This re-badged card just proves it some more.

We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.
 
We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.

When 20nm comes it's going to be a lot more costly for both AMD and NVidia. 28nm is refined and, as far as they are concerned, it's dirt cheap to produce, with much better yields (one of the many reasons 20nm is taking so long). If and when 20nm does come, the GPUs are probably going to be pretty expensive.
 
It's Radeon HD 7970 vs GeForce GTX 680 all over again...

I see this being mostly a driver war... again, which is great; before that generation we didn't have drivers that squeezed out every last bit of performance (at least, to the limited extent high-level APIs allow it).

It's fair to think that AMD might release a newer stepping of Hawaii Pro/XT (Pro-H2/XT2? Ha...) with better yields and maybe higher stock clocks (for one thing tho, the memory could be a lot higher clocked).

R9 290X w/ increased efficiency and 1075MHz reference boost clock & 8GB of 6500MHz GDDR5s please...

BTW, anyone know if Hawaii XT is full Hawaii chip? 3072 ALUs / 192 TMUs sounds like way better numbers... Maybe only for the Pro market. :(

All in all, performance is on the way up and prices on the way down, and the (GP)GPU landscape isn't boring anymore, for the moment.
 
Yayyy it looks like..... the other gpus..... :wtf:
 
Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:

:roll:
A shitty card that tackles $650 and even $1,000 cards?
Yes sir, it's a shitty card indeed... driver blah blah, classy...

Are you mad bro? :roll:
 
Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out? :rolleyes:

U mad?
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price :rolleyes:
Next thing you know you'll be saying the higher power consumption is a plus too :laugh:
The driver point is moot since both sides have their fair share of problems.

It's just funny to see how very few mention the power consumption now like they did when the 290x was tested. I personally don't care but it's funny to read.
 
I might hold out for the R9 290, but with the performance I get now it's questionable why I shouldn't just wait for 20nm. All these "new" cards are 28nm pushed to its limits, and while impressive, there are key areas (in my book) these cards sacrificed for their sheer performance. So whatever, I'll wait; there are certain things I want from my cards.
 
Who says?

U mad?
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price :rolleyes:
Next thing you know you'll be saying the higher power consumption is a plus too :laugh:
The driver point is moot since both sides have their fair share of problems.

It's just funny to see how very few mention the power consumption now like they did when the 290x was tested. I personally don't care but it's funny to read.

I think this card is better: faster and cheaper than the R9 290X.

When any company unleashes a faster, more powerful GPU, think about

power consumption, noise, and heat.

I don't think any 20nm GPU is coming soon.

Wait for more benchmarks and you'll see the GTX 780 Ti beats all AMD GPUs, single and dual.
 
I think this card is better: faster and cheaper than the R9 290X.

When any company unleashes a faster, more powerful GPU, think about

power consumption, noise, and heat.

I don't think any 20nm GPU is coming soon.

Wait for more benchmarks and you'll see the GTX 780 Ti beats all AMD GPUs, single and dual.

Um,

Cheaper no. Faster yes.

NVidia already confirmed a $699 price for the 780 Ti ($150 more than the R9 290X). It also probably won't beat dual-GPU cards, but if it even gets near one it'll be right around GTX 690 performance.
 
Are those really leaks? As they say in the description, I think it's been done on purpose just to add some hype for potential customers, so they're just naming it a leak to hide the advert :D But it looks like a monster.
 
Looks like smaller die is useless die.

NVIDIA clearly targets the Radeon R9 290X in their comparisons. The green team is well aware of AMD's problems with the noise and temperature of their new flagship. NVIDIA's theory is that since the R9 290X uses a 455 mm² die and the GTX 780 Ti is based on a 533 mm² die, the larger die equates to lower thermal density: less power concentrated into a given area. Long story short, NVIDIA's GPU generates less heat per square millimeter, so it is easier to dissipate.
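The thermal-density argument boils down to watts per square millimeter. A quick back-of-the-envelope check using the die sizes quoted above (the board-power figures are my own assumptions, roughly each card's TDP class, not official numbers):

```python
# Back-of-the-envelope thermal density from the quoted die sizes.
# Power figures are assumed (TDP-class), not official.
cards = {
    "R9 290X":    {"die_mm2": 455, "power_w": 290},
    "GTX 780 Ti": {"die_mm2": 533, "power_w": 250},
}

for name, c in cards.items():
    density = c["power_w"] / c["die_mm2"]
    print(f"{name}: {density:.2f} W/mm^2")
# R9 290X comes out around 0.64 W/mm^2, GTX 780 Ti around 0.47 W/mm^2
```

Under those assumed figures the bigger, lower-power die does spread its heat over noticeably more area, which is the whole of NVIDIA's point.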

NVIDIA ran a 20-minute Crysis 3 test with the R9 290X and 780 Ti. According to their data, after 2 minutes the R9 290X drops to 720 MHz, while the GTX 780 Ti sustains a 940 MHz clock. The average clock speeds are 799 MHz and 968 MHz for the 290X and 780 Ti respectively.

http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications
 
I don't get what you people gain by throwing gasoline on threads, especially you.

You should know by now Rad that some people can only post in certain colours. I like my gfx cards like i like my hoes - hot, expensive and composed of quality silicon.
 
Supposedly Nvidia gaming performance chart comparing 780Ti and 290X (Quiet) courtesy videocardz:

NVIDIA-GeForce-GTX-780-Ti-gaming-performance.png
 
I'm calling BS on their power consumption numbers; something is seriously wrong.

W1zzard tested the GTX 690 at 274W peak and the R9 290X at 282W peak. Yet somehow the 780 Ti in these charts consumes 75W more than either of those. That would make it a roughly 355W card, which it couldn't be since it only has a 300W power design (slot plus 6+8-pin PCIe connectors).

I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.

The only way I could possibly see those numbers make sense is if they were taken from Furmark or OCCT. AMD clamps down on power much harder than NVidia in those power viruses so the results are really unrepresentative of the normal power consumption of the card.
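For anyone wanting to check the arithmetic: the in-spec ceiling for a 6+8-pin board follows from the PCI Express connector limits (slot 75 W, 6-pin 75 W, 8-pin 150 W), and the 282 W peak is W1zzard's R9 290X figure quoted above.

```python
# In-spec board power for a card with one 6-pin and one 8-pin connector,
# per the PCI Express spec: slot 75 W + 6-pin 75 W + 8-pin 150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
board_limit = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(board_limit)  # 300

# The leaked chart implies roughly 75 W more than the R9 290X's 282 W peak:
implied_draw = 282 + 75
print(implied_draw, implied_draw > board_limit)  # 357 True
```

357 W against a 300 W design is exactly why those chart numbers don't add up.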
 
I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.

Moving on to power characteristics: the GTX 780 Ti has a much lower TDP than the 290X, at 250 watts. This is actually the same number as for TITAN and the 780.

source: http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications

Apparently they have official specs of the card. Another interesting thing is:

NVIDIA equipped its GTX 780 Ti with a new feature called Power Balancing. Normally, power drawn from the 6-pin power connector, the 8-pin power connector, and the PCI Express interface is balanced across these three sources depending on the current load. However, if the user overclocks the card, power delivery becomes unbalanced, and the card draws more power from one source than the others. To fix this problem NVIDIA came up with Power Balancing: with the feature enabled, the GPU can steer power from one input to another. This should improve overclocking capabilities compared to the GTX 780 or the TITAN.
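NVIDIA hasn't published how Power Balancing actually works, but the idea of steering demand across rails with individual limits can be sketched like this (purely illustrative, not NVIDIA's algorithm; the `balance` function and its proportional-fill policy are my own invention):

```python
# Toy sketch of power steering across rails, each with its own limit.
def balance(demand_w, limits_w):
    """Distribute total demand across rails without exceeding any limit.

    Returns per-rail draw (proportional to each rail's capacity),
    or None if total demand exceeds total capacity.
    """
    capacity = sum(limits_w)
    if demand_w > capacity:
        return None
    # Fill each rail proportionally to its capacity, so no single rail
    # is pushed past its limit while others still have headroom.
    return [demand_w * lim / capacity for lim in limits_w]

# Rails: PCIe slot (75 W), 6-pin (75 W), 8-pin (150 W)
print(balance(270, [75, 75, 150]))  # [67.5, 67.5, 135.0]
```

The point of a scheme like this for overclocking is that extra load gets shifted onto whichever input still has headroom instead of tripping one rail's limit.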
 
The amount of ignorance and fanboyism in this thread is too damn high!
 
The amount of ignorance and fanboyism in this thread is too damn high!

The amount of worthiness in this post is too damn low.
 
The only way I could possibly see those numbers make sense is if they were taken from Furmark or OCCT. AMD clamps down on power much harder than NVidia in those power viruses so the results are really unrepresentative of the normal power consumption of the card.

WRONG; Nvidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially Furmark. See TPU reviews.

This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).
 
You should know by now Rad that some people can only post in certain colours. I like my gfx cards like i like my hoes - hot, expensive and composed of quality silicon.

Are you sure you don't prefer your ladies with "quality silicone" instead? :roll:
 
WRONG; Nvidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially Furmark. See TPU reviews.

This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).

Are you looking at the same TPU data I am? It's obvious that both AMD and NVidia clamp board power consumption to the specification. Look at the maximum power consumption chart from the R9 290X review. Titan and the 780 are both 250W cards, and they are held to within 7% of that specification. The R9 290X is a 300W card, and it is held to within 5% of that specification. The leaked specifications call the 780 Ti a 250W card. There is no way the 780 Ti is drawing 75W more than the R9 290X when other 250W NVidia cards draw 50W less.

I share your skepticism, but unlike you I don't think this is in any way the best-case power consumption scenario for the 780 Ti; in fact, I would argue it's much worse than the worst case. It looks like the tester tried to overclock a reference board to get better performance results, and it's drawing vastly more power because of it. I would expect actual 780 Ti performance to be slightly less than these results, but board power to be significantly less too.
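To put numbers on those tolerances (using only the rated TDPs and percentages quoted above):

```python
# In-spec ceilings implied by the tolerances TPU's measurements show.
titan_780_ceiling = 250 * 1.07  # 250 W cards held to within 7%
r9_290x_ceiling = 300 * 1.05    # 300 W card held to within 5%
print(round(titan_780_ceiling, 1), round(r9_290x_ceiling, 1))  # 267.5 315.0
```

So even at the loose end of the tolerance, a "250 W" 780 Ti should top out around 267 W, nowhere near the ~355 W the leaked chart would imply.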
 
Release a card for $500 over all other GPUs. Some buy it, some are on the fence, some think it's a ripoff.

...Then release a card for $200 over all other GPUs and it's a bargain!!

Cunning moves!
 