Thursday, April 10th 2014

NVIDIA GeForce GTX 880 Detailed

NVIDIA's next-generation GeForce GTX 880 graphics card is shaping up to be a true successor to the GTX 680. According to a Tyden.cz report, the GTX 880 will be based on NVIDIA's GM204 silicon, which occupies the same position in its product stack as GK104 does in the GeForce "Kepler" family. It won't be the biggest chip based on the "Maxwell" architecture, but it will have what it takes to outperform even the GK110, in the same way GK104 outperforms GF110. The DirectX 12-ready chip will feature an SMM (streaming multiprocessor, Maxwell) SIMD design identical to that of the GeForce GTX 750 Ti, only with more SMMs, spread across multiple graphics processing clusters (GPCs) and probably cushioned by a large slab of cache.
This is what the GTX 880 is shaping up to be.

  • 20 nm GM204 silicon
  • 7.9 billion transistors
  • 3,200 CUDA cores
  • 200 TMUs
  • 32 ROPs
  • 5.7 TFLOP/s single-precision floating-point throughput
  • 256-bit wide GDDR5 memory interface
  • 4 GB standard memory amount
  • 238 GB/s memory bandwidth
  • Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
  • 230W board power
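Two of the leaked numbers can be sanity-checked against the others. A quick back-of-the-envelope sketch (assuming the usual 2 single-precision FLOPs per CUDA core per clock, and taking the 900 MHz base clock and 7.40 GHz effective memory rate from the list above):

```python
# Sanity-check the leaked GTX 880 numbers against each other.
cuda_cores = 3200
base_clock_ghz = 0.900      # GHz, quoted core clock
mem_clock_ghz = 7.40        # GHz effective (GDDR5)
bus_width_bits = 256

# Single precision: 2 FLOPs (one fused multiply-add) per core per clock.
tflops = cuda_cores * 2 * base_clock_ghz / 1000
print(f"{tflops:.2f} TFLOP/s")      # 5.76 -> matches the quoted 5.7 TFLOP/s

# Memory bandwidth: effective data rate x bus width in bytes.
bandwidth_gbs = mem_clock_ghz * bus_width_bits / 8
print(f"{bandwidth_gbs:.1f} GB/s")  # 236.8 -> close to the quoted 238 GB/s
```

The compute figure lines up exactly; the bandwidth comes out slightly under the quoted 238 GB/s, so at least the headline numbers are internally consistent.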
Sources: PCTuning Tyden.cz, Expreview

102 Comments on NVIDIA GeForce GTX 880 Detailed

#26
Harry Lloyd
Absolutely ridiculous specs.

Compared to 780 Ti this has:
15% more Gflops, 1/3 less memory bandwidth, fewer TMUs and ROPs, and an enormous TDP. Bear in mind, Maxwell has 50% better performance/watt, and 20 nm adds another 30%.

Whoever came up with those specs is a complete idiot.
Posted on Reply
#27
SimplexPL
Harry Lloyd, post: 3092371, member: 121315
Absolutely ridiculous specs.

Compared to 780 Ti this has:
15% more Gflops, 1/3 less memory bandwidth, fewer TMUs and ROPs, and an enormous TDP. Bear in mind, Maxwell has 50% better performance/watt, and 20 nm adds another 30%.

Whoever came up with those specs is a complete idiot.
Yeah, I was just about to write the same thing. All those people who got excited about those specs - read them again and compare to specs of current GPUs.

Luckily this source has zero credibility, so let's hope these specs turn out to be completely false, or that they turn out to be the specs of a GTX 860 :D
Posted on Reply
#28
20mmrain
I got some solid info that Nvidia will be releasing this card with a black cooler and under the Titan designation.
It will be called the "GTX 880 Titan-ZZ Top Edition" and cost $10,000 US (but it will not be the fully unlocked version). Don't worry, though: those waiting for the fully unlocked version after spending their money on the GTX 880 won't have to wait long, because three months later Nvidia will release the more powerful GTX 880 Ti Super Duper Titan ZZ Top Edition for the price of $20,000 US.

On a serious note, yes, I know I was trolling above, and I am glad to see Nvidia finally make improvements. However, I just hope that Nvidia doesn't try to charge an arm and a leg for this card when it is unnecessary to do so.
Posted on Reply
#29
Harry Lloyd
SimplexPL, post: 3092372, member: 84024
...or they turn out to be specs of GTX 860 :D
Very realistic, indeed (except for the TDP).

660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).

680 had 100% more Gflops than 580, and cost exactly the same, 500 $.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W


So how could a card with just 15% more Gflops and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
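The arithmetic behind this objection can be sketched in a few lines (all inputs are the comment's own assumptions, not official figures: a 250 W board power for the 780 Ti, ~15% more GFLOPs, and the combined "80% better performance/watt"):

```python
# Back-of-the-envelope check of the TDP argument above.
# Inputs are the commenter's assumptions, not official figures.
tdp_780_ti = 250.0   # W, assumed 780 Ti board power
perf_gain = 1.15     # ~15% more single-precision GFLOPs
eff_gain = 1.80      # the comment's combined "80% better perf/watt"

implied_tdp = tdp_780_ti * perf_gain / eff_gain
print(f"{implied_tdp:.0f} W")  # ~160 W, nowhere near the leaked 230 W
```

Under those assumptions the leaked 230 W figure is about 70 W too high, which is the crux of the complaint.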
Posted on Reply
#30
the54thvoid
Well, I just bought a new card so it follows all the next gen stuff will appear in about a month...

But seriously, yawn. This is getting old. New gfx card releases are becoming as dull as Intel's new CPUs. How these things are used and the things they are designed for, i.e. games, need to move forward.

I'd much rather see innovation in gaming and software dictating what the gfx vendors need to do. It's almost like designing a race car that goes 8000 mph in a straight line only to find out we're racing round a donut. Energy efficiencies are pushing NV forward in the mobile space but as far as desktop, something 'more' is required to make it all good.

People are applauding the 295x2 but that's also a boring turd. There's nothing inventive about it - it's a dual chip card with no refinement and a cooling solution slapped on by pragmatism, not innovation. Does it power 4K? Of course it does. But so do 2 x 290x.

Titan Z? Who knows. If it's 300 W and 90% the performance of the 295X2, it's an achievement. But still, meh.

Next gen needs a big rabbit out of a small hat.
Posted on Reply
#31
CounterZeus
Finally, my new graphics card detailed! That, or an 870 if my PSU doesn't cut it.
Posted on Reply
#32
KainXS
so are we going to have the Titan mashup again

GTX 880 -> GTX 880 Titan (48 ROPs) -> GTX 880 Ti (48 ROPs) -> GTX 880 Titan Z/Black/:confused:/:pimp: (all SPs/ROPs unlocked for $$$$)

I don't think it's true myself.
Posted on Reply
#33
ISI300
This is why no one should buy Titan-Zs or 295X2s. Sooner or later, this thing will release, and AMD has had plenty of time to get their architecture right. Let's hope they can compete.
Posted on Reply
#34
64K
Harry Lloyd, post: 3092375, member: 121315
Very realistic, indeed (except for the TDP).

660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).

680 had 100% more Gflops than 580, and cost exactly the same, 500 $.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W


So how could a card with just 15% more Gflops and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
Agreed. The TDP doesn't make sense for Maxwell GM204 with those specs. If the TDP is wrong, then what other specs are also wrong? And why would whoever leaked this info know enough to make the other specs somewhat believable, but screw up the TDP? I have to throw the whole leak out the window for now. Maybe some of it will prove true at the end of this year. We'll see.
Posted on Reply
#35
xorbe
I am going to camp until ALL of the 800 series cards are on the table (probably when they start talking about the 900 series). They burned their goodwill by going 780 -> 780 Ti -> 6 GB 780 Ti last round. Luckily we were able to flash a lot of original Titans towards 1100 MHz and side-step that whole disaster. Still down one SMX, but not much different from the 6 GB 780 Ti that was finally sold months and months later.
Posted on Reply
#36
TheHunter
So another "mid-range" chip dressed up as high-end, aka GK104 all over again. Meh, talk about milking again.
Posted on Reply
#37
TheGuruStud
This won't ship for another 7-8 months, but we have specs?

Bwahahhahahahahhahahahahahahhahahahahhhhhahahahahahahhaha

Clickbait crap.
Posted on Reply
#38
TheDeeGee
OC-Rage, post: 3092311, member: 139764
Wow, goddamn, what is this monster?

Ready for 4K games, ready for OC. I think 256-bit is not enough for high resolutions, despite the 7.40 GHz memory speed.
Maxwell has an L2 cache of 2MB, so 256-Bit is far from a bottleneck for high resolutions.

Apart from that, it runs at 7.4 GHz, which might even be OCed to 8 GHz.
Posted on Reply
#39
MxPhenom 216
Corsair Fanboy
buggalugs, post: 3092353, member: 56431
What are you going on about? This is Nvidia's next-gen hardware; it won't be out for a while, and we're still talking rumours. AMD will have something to compare soon enough. AMD will be built on 20 nm too, so performance will be +/- 10%... just like every generation. Both are built at the same factory.

The plan is the same as always: milk the consumers. Nvidia will sell you a cut-down GTX 880 first, with disabled ROPs, TMUs, CUDA cores, etc. Then AMD will release something as good or better, then Nvidia will unlock some features of the same silicon and call it the 880 Ti, so consumers need to spend hundreds on a new card.

There is no "better" graphics card company; they are both in on it. Everything is pre-planned: Nvidia knows exactly what AMD is doing, and AMD knows exactly what Nvidia is doing.
No, GM210 or whatever they are calling big-die Maxwell will likely arrive as the GTX 980, which is what I'd expect Nvidia to use to counter AMD's release. Just like what the 780 is to the 680.
Posted on Reply
#40
Slizzo
Yeah, if the 880 is going to be GM204, then I'm waiting for whatever GM200 or GM210 will be.
Posted on Reply
#41
H2323
buggalugs, post: 3092353, member: 56431
What are you going on about? This is Nvidia's next-gen hardware; it won't be out for a while, and we're still talking rumours. AMD will have something to compare soon enough. AMD will be built on 20 nm too, so performance will be +/- 10%... just like every generation. Both are built at the same factory.

The plan is the same as always: milk the consumers. Nvidia will sell you a cut-down GTX 880 first, with disabled ROPs, TMUs, CUDA cores, etc. Then AMD will release something as good or better, then Nvidia will unlock some features of the same silicon and call it the 880 Ti, so consumers need to spend hundreds on a new card.

There is no "better" graphics card company; they are both in on it. Everything is pre-planned: Nvidia knows exactly what AMD is doing, and AMD knows exactly what Nvidia is doing.
Nice job, first practical thing I have read in the comments for some time. That said, AMD might do some GPU fabbing at GloFo. But yes, it's a game.
Posted on Reply
#42
crazyeyesreaper
Chief Broken Rig
yup fake hardcore super fake fake fake fake fake. :roll:
Posted on Reply
#44
BiggieShady
crazyeyesreaper, post: 3092504, member: 68032
yup fake hardcore super fake fake fake fake fake. :roll:
When was the last time a Chinese tech website covered a story from a Czech tech website, it all ended up in the TPU news section... and it turned out to be fake?
Posted on Reply
#45
thebluebumblebee
Harry Lloyd, post: 3092375, member: 121315
Very realistic, indeed (except for the TDP).

660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).

680 had 100% more Gflops than 580, and cost exactly the same, 500 $.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W



So how could a card with just 15% more Gflops and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
Unfortunately, you are comparing GPUs from totally different "families". This is the trick that Nvidia has played on us. Remember that what we're paying for is the silicon.
580 GF110
680 GK104 (GTX 460/560 family) - remember the "shorty" 670s?
660 GK106 (GTS 450/550 family)
and I've got to throw this in:
750 Ti GM107 (GT 640/GTS 650)

Let's do some real power comparisons, using w1zzard's PEAK numbers:
580 = 229 watts, 780 Ti = 269 watts
560 Ti = 159 watts, 770 = 180 watts
550 Ti = 112 watts, 660 = 124 watts
650 = 54 watts, 750 Ti = 57 watts
Has Nvidia reduced power usage in a family?
TheHunter, post: 3092451, member: 104610
So another "mid-range" dressed up as high-end aka GK104 chip.. Meh talk about milking again.
Bingo!
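The per-family comparison above can be tabulated quickly (the wattages are the peak numbers quoted in the comment, and the pairings follow its family groupings):

```python
# Peak power, old family member vs. its successor, per the comment above.
pairs = {
    "580 -> 780 Ti": (229, 269),
    "560 Ti -> 770": (159, 180),
    "550 Ti -> 660": (112, 124),
    "650 -> 750 Ti": (54, 57),
}
for name, (old, new) in pairs.items():
    pct = (new / old - 1) * 100
    print(f"{name}: {old} W -> {new} W ({pct:+.0f}%)")
# Every pairing goes up, not down -- which is the comment's point.
```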
Posted on Reply
#46
mroofie
Harry Lloyd, post: 3092375, member: 121315
Very realistic, indeed (except for the TDP).

660 had 20% more Gflops than 580, and cost 230 $ (580 cost 500 $).

680 had 100% more Gflops than 580, and cost exactly the same, 500 $.

580 TDP - 244 W
680 TDP - 195 W
660 TDP - 140 W


So how could a card with just 15% more Gflops and 80% better performance/watt have a TDP of 230 W? Complete bullshit.
I agree, the TDP should be below 200 W.
This info is lacking credibility!!
TPU, shame on you!!!!!!!!!!!!!!!!!!
Posted on Reply
#47
64K
thebluebumblebee, post: 3092527, member: 55599
Unfortunately, you are comparing GPUs from totally different "families". This is the trick that Nvidia has played on us. Remember that what we're paying for is the silicon.
580 GF110
680 GK104 (GTX 460/560 family) - remember the "shorty" 670s?
660 GK106 (GTS 450/550 family)
and I've got to throw this in:
750 Ti GM107 (GT 640/GTS 650)

Let's do some real power comparisons, using w1zzard's PEAK numbers:
580 = 229 watts, 780 Ti = 269 watts
560 Ti = 159 watts, 770 = 180 watts
550 Ti = 112 watts, 660 = 124 watts
650 = 54 watts, 750 Ti = 57 watts
Has Nvidia reduced power usage in a family?

Bingo!
However.....

GF104 160 watts
GK104 190 watts
GM204 230 watts - doesn't make sense.

If we're looking at GM210 then yes, but not GM204. I fully expect a 250 watt Big Maxwell but that won't come from GM204. If Nvidia follows suit with the Kepler releases then look for that about a year after GM204. So I'm thinking about 1.5 years at earliest.
Posted on Reply
#48
rooivalk
looking forward to 860 :) hopefully this year.
Posted on Reply
#49
mroofie
rooivalk, post: 3092552, member: 98395
looking forward to 860 :) hopefully this year.
The 60 range only comes a few months after the 80 and 70 range :/
Posted on Reply
#50
bogami
Again, mostly outdated stuff. Short and thick, as if it had something to hide under a thick, ineffective cooler. Not to mention the price: since this is the middle-class GPU that should replace the GTX 770, €300 is a realistic price! Of course Nvidia will not be so generous, and will impose abnormal prices on us for an outdated GPU while the next generation is already developed. Horror. It gets to me when I see what they show us as already developed, and then they sell us a generation of obsolete parts, just to draw profit from the patents. The future Tegra is based on the second generation of Maxwell and so on (the car-support demo)! Processor developers lag behind their capabilities, and long-known solutions take a few years to finally be put into practice. We are really lacking strong competition that could ......
As far as the progress of this processor goes, I can say that the initially predicted ratio of 1 (FG) vs. 16 (MG) was later revised to 1 vs. 8. DirectX 12 is coming, and we'll see what it brings us. I hope that AMD will not be left behind with their new generation.
Posted on Reply