Friday, September 29th 2017

NVIDIA GeForce GTX 1070 Ti by Late October

It looks like NVIDIA's next performance-segment graphics card, the GeForce GTX 1070 Ti, could be launched sooner than expected. A report by NordicHardware pins its launch date at October 26, 2017, ahead of the "early-November" date that was doing the rounds earlier. It's also entirely possible that the card will be announced on October 26, with reviews posted that day but market availability beginning in November.

Based on the 16 nm "GP104" silicon, the GTX 1070 Ti is being designed to be almost as fast as the GTX 1080. It features 2,432 CUDA cores, 152 TMUs, 64 ROPs, and a 256-bit wide GDDR5 memory interface, holding 8 GB of memory. The card is expected to perform (and be priced) within 12 percent of the GTX 1080. Its main competitor from the AMD stable is the Radeon RX Vega 56.

Source: NordicHardware
Add your own comment

31 Comments on NVIDIA GeForce GTX 1070 Ti by Late October

#1
Anymal
It is gp104 and are you sure that it will be on 16nm?
Posted on Reply
#2
iO
Ah, the GTX 1080 Miners Edition...
Posted on Reply
#3
NC37
Anymal said:
It is gp104 and are you sure that it will be on 16nm?
Was about to say, that should be the 104 there.
Posted on Reply
#4
Parn
Should be GP104 there.

Any NV miners would rejoice as they finally can have a near full GP104 without the dreadful GDDR5X.
Posted on Reply
#5
ppn
In case of 9 Gbps GDDR it will be much closer, within 2 percent. The 1080 costs $549; $549 * 0.88 ≈ $483 is too high... And I see no point in releasing such a product for gamers; they can just buy a full 1080 and OC it to 12 Gbps. $349 for the old 1070 and $399 for the new Ti makes more sense. And miners have already been satisfied, as we see steady supply now; unless the Ti sells for like $299, it could work for a month before the ice age. The difficulty is bombing. There is much wisdom and prophecy in the 106, since GV106 may very well be as fast as a full 1080, just like GP106 is as fast as GM204.
Posted on Reply
#6
Anymal
iO said:
Ah, the GTX 1080 Miners Edition...
Ti as in Titanium. It has to be mined.
Posted on Reply
#7
lyndonguitar
I play games
planning to build a new rig with GTX 1080 this coming nov, do you think it will replace the 1080? would it be better to just buy a 1070 Ti?
Posted on Reply
#8
Anymal
If you want to mine and game, then the 1070 Ti would be better.
Posted on Reply
#9
NeDix!
lyndonguitar said:
planning to build a new rig with GTX 1080 this coming nov, do you think it will replace the 1080? would it be better to just buy a 1070 Ti?
It's gonna be hard to get one, I guess, since this is the 1080 miners edition. Still, I suggest checking prices for both, but the 1080 should still be better.
Posted on Reply
#10
newtekie1
Semi-Retired Folder
lyndonguitar said:
planning to build a new rig with GTX 1080 this coming nov, do you think it will replace the 1080? would it be better to just buy a 1070 Ti?
The 1080 will still perform better. However, this will be cheaper. I think the "12%" thing needs some perspective, because people seem to be thinking this puts the 1070Ti closer in price to the 1080, when in reality it puts it very near the current prices of the 1070.

If you look at card models that exist in both 1070 and 1080 versions, you'll see this.

For example, the Zotac 1070 and 1080 Mini cards. The 1080 Mini is $510; 12% would be about $60, putting the price of the 1070Ti at about $449. The Zotac 1070 Mini is $429. So the 1070Ti would be $60 cheaper than the 1080, but only about $20 more expensive than current 1070 prices. If you are already looking at a 1070, it is a no-brainer to just spend the $20 more for a 1070Ti (of course, prices on the 1070s are likely to go down after the 1070Ti's release). However, it is a harder justification to spend the extra $60 to jump to a 1080, especially if the rumors are true that the 1070Ti performs almost identically and overclocks to outperform a stock 1080.
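The arithmetic above works out as a quick sketch (the street prices are those quoted in this comment; the 12 percent gap is still a rumor):

```python
# Price positioning sketch using the street prices quoted above and the
# rumored "within 12 percent" gap. All figures come from the discussion,
# not from any official price list.
gtx_1080_mini = 510   # Zotac GTX 1080 Mini, quoted street price
gtx_1070_mini = 429   # Zotac GTX 1070 Mini, quoted street price

# Rumored positioning: roughly 12% below the GTX 1080.
gtx_1070ti_est = round(gtx_1080_mini * (1 - 0.12))   # ~449

premium_over_1070 = gtx_1070ti_est - gtx_1070_mini   # ~20 over the 1070
savings_vs_1080 = gtx_1080_mini - gtx_1070ti_est     # ~61 under the 1080

print(gtx_1070ti_est, premium_over_1070, savings_vs_1080)
```

As the comment argues, the estimate lands about $20 above the 1070 but about $60 below the 1080, which is why the "12%" framing reads more like 1070 pricing than 1080 pricing.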

The pricing roughly works out the same for other models too, like the ASUS Strix 1070 and 1080, MSI Armor cards, and Gigabyte G1 Gaming cards.
Posted on Reply
#11
lyndonguitar
I play games
Thanks for the answers. I don't mine, so my priority will be gaming.

But I guess we'll see the benchmarks to confirm whether the supposed $60 difference is worth it for the supposedly more powerful 1080. I am worried that it will become like the GTX 1080 Ti, which essentially killed the first Pascal Titan.
Posted on Reply
#12
Vya Domus
lyndonguitar said:
Thanks for the answers. I don't mine, so my priority will be gaming.

But I guess we'll see the benchmarks to confirm whether the supposed $60 difference is worth it for the supposedly more powerful 1080. I am worried that it will become like the GTX 1080 Ti, which essentially killed the first Pascal Titan.
Either way, decide quickly; once released, they aren't going to stay on the shelves for too long.
Posted on Reply
#13
cdawall
where the hell are my stars
These are going to sell out. Can't wait for them to hit the market; I have set aside a BTC to buy a rig of them.
Posted on Reply
#14
Vayra86
lyndonguitar said:
planning to build a new rig with GTX 1080 this coming nov, do you think it will replace the 1080? would it better to just buy a 1070 Ti?
No, this card looks bandwidth starved.
Posted on Reply
#15
Th3pwn3r
I feel like the only real reason this card is good is that it may cause a downward shift in the pricing of the cards beneath it.
Posted on Reply
#16
Prima.Vera
iO said:
Ah, the GTX 1080 Miners Edition...
I thought nVidia cards suck big time at "mining"
Posted on Reply
#17
cdawall
where the hell are my stars
Prima.Vera said:
I thought nVidia cards suck big time at "mining"
They were a little more expensive than AMD cards for the same hashrate, but typically more power efficient. A 1070 does 32 MH/s at 120-140 W; a 480/580 does 31 at 140 W. In ZEC, however, they are quite a bit better: 440 sol/s at 150 W vs 350-380 sol/s at 150 W.
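The per-watt comparison implied above can be sketched as follows (using the anecdotal figures quoted in this comment; taking midpoints of the quoted ranges is my own simplification):

```python
# Efficiency comparison from the anecdotal figures quoted in the comment.
# Midpoints of the quoted ranges are used as a simplifying assumption.
def per_watt(rate, watts):
    """Hashrate per watt of board power."""
    return rate / watts

# Ethash, MH/s per watt (120-140 W range for the 1070 -> midpoint 130 W)
gtx1070_eth = per_watt(32, 130)
rx580_eth = per_watt(31, 140)

# Equihash (ZEC), sol/s per watt (350-380 sol/s range -> midpoint 365)
gtx1070_zec = per_watt(440, 150)
rx580_zec = per_watt(365, 150)

print(f"ETH MH/s per W: 1070 = {gtx1070_eth:.3f}, 480/580 = {rx580_eth:.3f}")
print(f"ZEC sol/s per W: 1070 = {gtx1070_zec:.2f}, 480/580 = {rx580_zec:.2f}")
```

On these numbers the 1070 edges out the 480/580 in Ethash efficiency and leads clearly in Equihash, which matches the comment's conclusion.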
Posted on Reply
#18
Melvis
Why? This card makes no sense to me...
Posted on Reply
#19
BiggieShady
Vayra86 said:
No, this card looks bandwidth starved.
At which resolution though ... less bandwidth (260 vs 320 GBps) goes well with less TMUs (less texture fill rate and less texture filtering) and similar compute ... any noticeable difference may be at 4K only
Posted on Reply
#20
efikkan
Bandwidth requirements for textures are not dependent on display resolution, but texture resolution. So unless the game explicitly uses a higher resolution texture for higher screen resolution, the bandwidth requirements for textures will remain the same. Memory bandwidth is likely to be the least of this card's problems.
Posted on Reply
#21
Vayra86
BiggieShady said:
At which resolution though ... less bandwidth (260 vs 320 GBps) goes well with less TMUs (less texture fill rate and less texture filtering) and similar compute ... any noticeable difference may be at 4K only
The 1080 already benefits from more memory bandwidth, and this 1070ti loses more in bandwidth than it loses in core capability.

Theoretically it may not matter much, but in practice, even at 1080p you can notice the benefits of running at 11 Gbps compared to stock, especially in the minimum fps. It's something similar to the GTX 660 Ti and the 970: all performance-segment cards that have fallen off more quickly than the well-rounded, full-die Gx104's. Similarly, the 680 > 770 had the exact same core, but the 770 came out faster due to faster VRAM.

Therefore it's strange seeing these specs come out of the Nvidia stable. Usually they do it the other way around (such as the 1080 11 Gbps).

Another reason to avoid the 1070ti is the TDP compared to its core count; it's clear as day these are failed 1080's, there is no way you're winning the lottery here. So you get sloppy GP104 cuts paired with the 1070's memory for a near-1080 price. This really isn't a good deal.
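The bandwidth-versus-cores trade-off described above can be quantified from the published specs (a rough sketch; the GTX 1070 Ti figures are still rumored, and 256 GB/s assumes 8 Gbps GDDR5 on the 256-bit bus):

```python
# Ratio check for the "loses more bandwidth than core capability" argument.
# GTX 1080 specs are published; GTX 1070 Ti figures are still rumors.
bw_1070ti = 256.0    # GB/s, assuming 8 Gbps GDDR5 on a 256-bit bus
bw_1080 = 320.0      # GB/s, 10 Gbps GDDR5X on a 256-bit bus
cores_1070ti = 2432  # rumored CUDA core count
cores_1080 = 2560    # full GP104

bw_ratio = bw_1070ti / bw_1080           # 0.80 -> 20% less bandwidth
core_ratio = cores_1070ti / cores_1080   # 0.95 -> only 5% fewer cores

print(f"bandwidth ratio: {bw_ratio:.2f}, core ratio: {core_ratio:.2f}")
```

On these assumptions the card gives up 20 percent of the 1080's bandwidth but only 5 percent of its cores, which is the asymmetry the comment is pointing at.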
Posted on Reply
#22
efikkan
Vayra86 said:
The 1080 already benefits from more memory bandwidth, and this 1070ti loses more in bandwidth than it loses in core capability.
Computational performance still matters more, so it probably wouldn't be a big deal.

Vayra86 said:

Another reason to avoid the 1070ti is the TDP compared to its core count; it's clear as day these are failed 1080's, there is no way you're winning the lottery here.
You need to learn how binning works. By your logic every GTX 1070 is a failed GTX 1080.
Posted on Reply
#23
BiggieShady
efikkan said:
Bandwidth requirements for textures are not dependent on display resolution, but texture resolution. So unless the game explicitly uses a higher resolution texture for higher screen resolution, the bandwidth requirements for textures will remain the same. Memory bandwidth is likely to be the least of this card's problems.
Oh but you're wrong; for the last 7 years all game engines have used deferred rendering as the norm, rendering the scene compositely in layers using g-buffers and building the frame buffer from that ... with a higher resolution frame buffer come equally high resolution g-buffers, times the number of layers, so both bandwidth and fill rate requirements rise at 4K ... case in point, a texture can be only 2x2 pixels and those 4 texels can cover your entire 4K screen - you still require a certain texture fill rate in a deferred renderer, even though your shaders happily compute while the texture samplers interpolate between 4 samples for each pixel on your 4K screen ;)
Posted on Reply
#24
Vayra86
efikkan said:

You need to learn how binning works. By your logic every GTX 1070 is a failed GTX 1080.
Which, at its core, is what it is...

The 1070ti enables Nvidia to sell more failed 1080's at a higher price.
Posted on Reply
#25
efikkan
BiggieShady said:
Oh but you're wrong; for the last 7 years all game engines have used deferred rendering as the norm, rendering the scene compositely in layers using g-buffers and building the frame buffer from that ... with a higher resolution frame buffer come equally high resolution g-buffers, times the number of layers, so both bandwidth and fill rate requirements rise at 4K ... case in point, a texture can be only 2x2 pixels and those 4 texels can cover your entire 4K screen - you still require a certain texture fill rate in a deferred renderer, even though your shaders happily compute while the texture samplers interpolate between 4 samples for each pixel on your 4K screen ;)
Try reading my post again.
Frame buffers scale with screen resolution; going from 1440p to 4K, even with AA, only increases the consumption by megabytes. With tiled rendering, the frame buffers mostly stay cache-local, resulting in very marginal bandwidth changes across resolutions.
Texture resources, which use most of the bandwidth, are not proportional to screen resolution; they are proportional to detail level.
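The "megabytes" point above can be put in numbers with a back-of-envelope sketch (4 bytes per pixel per render target is an assumed simplification; real G-buffer layouts vary per engine):

```python
# Back-of-envelope render-target sizing. Assumes 4 bytes/pixel per target,
# which is a simplification; real G-buffer formats differ per engine.
def target_mib(width, height, targets=1, bytes_per_pixel=4):
    """Size in MiB of `targets` render targets at the given resolution."""
    return width * height * targets * bytes_per_pixel / 2**20

fb_1440p = target_mib(2560, 1440)               # ~14 MiB plain frame buffer
fb_4k = target_mib(3840, 2160)                  # ~32 MiB
gbuffer_4k = target_mib(3840, 2160, targets=4)  # ~127 MiB for 4 G-buffer layers

print(f"1440p FB: {fb_1440p:.1f} MiB, 4K FB: {fb_4k:.1f} MiB, "
      f"4K 4-layer G-buffer: {gbuffer_4k:.1f} MiB")
```

Even a four-layer deferred G-buffer at 4K stays in the low hundreds of MiB, tens of MiB more than at 1440p, which is small next to the texture working set, supporting the point that texture detail, not screen resolution, dominates bandwidth.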
Posted on Reply
Add your own comment