
NVIDIA GeForce GTX 1070 Ti by Late October

btarunr

Editor & Senior Moderator
It looks like NVIDIA's next performance-segment graphics card, the GeForce GTX 1070 Ti, could be launched sooner than expected. A report by NordicHardware pins its launch date at October 26, 2017, ahead of the "early-November" date that was doing the rounds earlier. It's also entirely possible that the card will be launched on October 26, with reviews posted the same day, but with market availability only beginning in November.

Based on the 16 nm "GP106" silicon, the GTX 1070 Ti is being designed to be almost as fast as the GTX 1080. It features 2,432 CUDA cores, 152 TMUs, 64 ROPs, and a 256-bit wide GDDR5 memory interface, holding 8 GB of memory. The card is expected to perform (and be priced) within 12 percent of the GTX 1080. Its main competitor from the AMD stable is the Radeon RX Vega 56.
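As a rough sanity check on that "almost as fast" claim, here is a back-of-envelope FP32 throughput comparison. The 2,432-core figure is the rumored spec above; the 1070 Ti boost clock is purely an assumption (the GTX 1070's 1683 MHz), while 1733 MHz is the GTX 1080's reference boost clock.

```python
# Back-of-envelope FP32 throughput comparison (not from the article).
# The 1070 Ti boost clock is an assumption (same 1683 MHz as the GTX 1070);
# 1733 MHz is the GTX 1080's reference boost clock.

def fp32_tflops(cuda_cores: int, boost_clock_mhz: float) -> float:
    """Theoretical FP32 throughput: 2 FLOPs (FMA) per core per clock."""
    return 2 * cuda_cores * boost_clock_mhz * 1e6 / 1e12

gtx_1070_ti = fp32_tflops(2432, 1683)   # assumed boost clock
gtx_1080    = fp32_tflops(2560, 1733)

print(f"GTX 1070 Ti: {gtx_1070_ti:.2f} TFLOPS")
print(f"GTX 1080:    {gtx_1080:.2f} TFLOPS")
print(f"Deficit:     {(1 - gtx_1070_ti / gtx_1080) * 100:.1f} %")  # ~8 %
```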



 
It is GP104, and are you sure that it will be on 16 nm?
 
Ah, the GTX 1080 Miners Edition...
 
Should be GP104 there.

Any NV miners would rejoice, as they can finally have a near-full GP104 without the dreadful GDDR5X.
 
With 9 Gbps GDDR5 it would be much closer, within 2 percent. The 1080 costs $549, and $549 × 0.88 ≈ $483 is too high. I see no point in releasing such a product for gamers; they can just buy a full 1080 and OC the memory to 12 Gbps. $349 for the old 1070 and $399 for the new Ti makes more sense. And miners have already been satisfied, as we see steady supply now; unless the Ti sells for something like $299, it would only work for a month before the ice age. The difficulty is bombing. There is much wisdom and prophecy in the 106, since GV106 may very well be as fast as a full 1080, just like GP106 is as fast as GM204.
 
Planning to build a new rig with a GTX 1080 this coming Nov. Do you think it will replace the 1080? Would it be better to just buy a 1070 Ti?
 
If you want to mine and game, then the 1070 Ti would be better.
 
Planning to build a new rig with a GTX 1080 this coming Nov. Do you think it will replace the 1080? Would it be better to just buy a 1070 Ti?

It's gonna be hard to get one, I guess, since this is the 1080 miners edition. I'd still suggest checking prices for both, but the 1080 should still be better.
 
Planning to build a new rig with a GTX 1080 this coming Nov. Do you think it will replace the 1080? Would it be better to just buy a 1070 Ti?

The 1080 will still perform better. However, this will be cheaper. I think the "12%" thing needs some perspective, because people seem to think it puts the 1070 Ti closer in price to the 1080, when in reality it puts it very near the current prices of the 1070.

If you look at models that exist as both 1070 and 1080 cards, you'll see this.

For example, take the Zotac 1070 and 1080 Mini cards. The 1080 Mini is $510; 12% of that is about $60, putting the price of the 1070 Ti at about $449. The Zotac 1070 Mini is $429. So the 1070 Ti would be $60 cheaper than the 1080, but only about $20 more expensive than current 1070 prices. If you are already looking at a 1070, it is a no-brainer to just spend the $20 more for a 1070 Ti (of course, prices on the 1070s are likely to go down after the 1070 Ti's release). However, it is a harder justification to spend the extra $60 to jump to a 1080, especially if the rumors are true that the 1070 Ti performs almost identically and overclocks to outperform a stock 1080.

The pricing roughly works out the same for other models too, like the ASUS Strix 1070 and 1080, MSI Armor cards, and Gigabyte G1 Gaming cards.
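A minimal sketch of the arithmetic behind that Zotac Mini comparison, assuming the rumored 12 percent gap translates directly into price (street prices as quoted above):

```python
# Illustrative pricing math based on the Zotac Mini example quoted above.
# The 12 % figure is the rumored performance/price gap, not a confirmed MSRP.

zotac_1080_mini = 510          # current street price ($)
zotac_1070_mini = 429          # current street price ($)

est_1070_ti = zotac_1080_mini * (1 - 0.12)     # ~ $449
print(f"Estimated 1070 Ti price: ${est_1070_ti:.0f}")
print(f"Savings vs. 1080 Mini:   ${zotac_1080_mini - est_1070_ti:.0f}")  # ~ $61
print(f"Premium over 1070 Mini:  ${est_1070_ti - zotac_1070_mini:.0f}")  # ~ $20
```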
 
Thanks for the answers. I don't mine, so my priority will be gaming.

But I guess we'll see the benchmarks to further confirm whether the supposed $60 difference is worth it for the supposedly more powerful 1080. I am worried that it will become like the GTX 1080 Ti, which essentially killed the first Pascal Titan.
 
Thanks for the answers. I don't mine, so my priority will be gaming.

But I guess we'll see the benchmarks to further confirm whether the supposed $60 difference is worth it for the supposedly more powerful 1080. I am worried that it will become like the GTX 1080 Ti, which essentially killed the first Pascal Titan.

Either way, decide quickly; once released, they aren't going to stay on the shelves for long.
 
These are going to sell out. Can't wait for them to hit the market; I have set aside a BTC to buy a rig of them.
 
Planning to build a new rig with a GTX 1080 this coming Nov. Do you think it will replace the 1080? Would it be better to just buy a 1070 Ti?

No, this card looks bandwidth starved.
 
I feel like the only real reason this card is good is that it may cause a downward shift in the pricing of the card beneath it.
 
I thought nVidia cards suck big time at "mining"

They were a little more expensive than AMD cards for the same hashrate, but typically more power efficient. A 1070 does 32 MH/s at 120-140 W; a 480/580 does 31 MH/s at 140 W. However, in ZEC they are quite a bit better: 440 Sol/s at 150 W vs. 350-380 Sol/s at 150 W.
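A quick sketch of the efficiency math implied by those numbers; where the post gives a range (120-140 W, 350-380 Sol/s), taking the midpoint is my assumption:

```python
# Rough mining-efficiency comparison using the figures quoted above
# (ETH in MH/s, ZEC in Sol/s). Wattages are the poster's estimates.

cards = {
    #             ETH MH/s, ETH W, ZEC Sol/s, ZEC W
    "GTX 1070":   (32, 130, 440, 150),   # 120-140 W taken as ~130 W
    "RX 480/580": (31, 140, 365, 150),   # 350-380 Sol/s taken as ~365
}

for name, (eth_rate, eth_w, zec_rate, zec_w) in cards.items():
    print(f"{name:>10}: {eth_rate / eth_w:.3f} MH/s per W (ETH), "
          f"{zec_rate / zec_w:.2f} Sol/s per W (ZEC)")
```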
 
Why? This card makes no sense to me...
 
No, this card looks bandwidth starved.
At which resolution, though? Less bandwidth (260 vs. 320 GB/s) goes well with fewer TMUs (less texture fill rate and less texture filtering) and similar compute ... any noticeable difference may show up at 4K only.
 
Bandwidth requirements for textures are not dependent on display resolution, but on texture resolution. So unless the game explicitly uses higher-resolution textures at higher screen resolutions, the bandwidth requirements for textures will remain the same. Memory bandwidth is likely to be the least of this card's problems.
 
At which resolution, though? Less bandwidth (260 vs. 320 GB/s) goes well with fewer TMUs (less texture fill rate and less texture filtering) and similar compute ... any noticeable difference may show up at 4K only.

The 1080 already benefits from more memory bandwidth, and the 1070 Ti loses more in bandwidth than it loses in core capability.

Theoretically it may not matter much, but in practice, even at 1080p you can notice the benefits of running the memory at 11 Gbps compared to stock, especially in the minimum fps. It's something similar to the GTX 660 Ti and the 970: performance-segment cards that have fallen off more quickly than the well-rounded, full-die Gx104s. Similarly, the 680 and the 770 had the exact same core, but the 770 came out faster due to faster VRAM.

Therefore it's strange seeing these specs come out of the NVIDIA stable. Usually they do it the other way around (such as the 11 Gbps 1080).

Another reason to avoid the 1070 Ti is its TDP compared to its core count; it's clear as day these are failed 1080s, and there is no way you're winning the lottery here. So you get sloppy GP104 cuts paired with the 1070's memory for a near-1080 price. This really isn't a good deal.
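To put numbers on the bandwidth-versus-core point above, here is a rough sketch assuming the rumored 8 Gbps GDDR5 on a 256-bit bus for the 1070 Ti, against the GTX 1080's stock 10 Gbps GDDR5X:

```python
# Bandwidth deficit vs. core-count deficit, 1070 Ti relative to GTX 1080.
# The 1070 Ti's 8 Gbps GDDR5 is assumed from the rumors; the GTX 1080
# ships with 10 Gbps GDDR5X (11 Gbps on the refreshed model).

def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return gbps_per_pin * bus_width_bits / 8

bw_1070_ti = bandwidth_gb_s(8, 256)    # 256 GB/s
bw_1080    = bandwidth_gb_s(10, 256)   # 320 GB/s

core_ratio = 2432 / 2560               # ~0.95
bw_ratio   = bw_1070_ti / bw_1080      # 0.80

print(f"Core deficit:      {(1 - core_ratio) * 100:.0f} %")   # ~5 %
print(f"Bandwidth deficit: {(1 - bw_ratio) * 100:.0f} %")     # 20 %
```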
 
The 1080 already benefits from more memory bandwidth, and the 1070 Ti loses more in bandwidth than it loses in core capability.
Computational performance still matters more, so it probably wouldn't be a big deal.

Another reason to avoid the 1070 Ti is its TDP compared to its core count; it's clear as day these are failed 1080s, and there is no way you're winning the lottery here.
You need to learn how binning works. By your logic, every GTX 1070 is a failed GTX 1080.
 
Bandwidth requirements for textures are not dependent on display resolution, but on texture resolution. So unless the game explicitly uses higher-resolution textures at higher screen resolutions, the bandwidth requirements for textures will remain the same. Memory bandwidth is likely to be the least of this card's problems.
Oh, but you're wrong. In the last 7 years, all game engines have adopted deferred rendering as the norm: they render the scene compositely in layers using g-buffers and build the frame buffer from that ... with a higher-resolution frame buffer come equally high-resolution g-buffers, times the number of layers, so both bandwidth and fill-rate requirements rise at 4K ... case in point, a texture can be only 2x2 pixels, and those 4 texels can cover your entire 4K screen - you still require a certain texture fill rate in a deferred renderer, even while your shaders happily compute, because the texture samplers still interpolate between those 4 samples for every pixel on your 4K screen ;)
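To put rough numbers on that, here is a sketch of how g-buffer write traffic scales with resolution in a deferred renderer. The layout (four 32-bit render targets plus a 32-bit depth buffer) is a hypothetical example, not any particular engine:

```python
# Per-frame g-buffer write traffic for a hypothetical deferred renderer:
# four 32-bit (4-byte) render targets plus a 32-bit depth buffer.
# Overdraw and lighting-pass reads are ignored; the scaling is the point.

BYTES_PER_PIXEL = 4 * 4 + 4   # 4x RGBA8 targets + D32 depth = 20 bytes

def gbuffer_gb_per_second(width: int, height: int, fps: int) -> float:
    """G-buffer write bandwidth in GB/s."""
    return width * height * BYTES_PER_PIXEL * fps / 1e9

print(f"1080p @ 60 fps: {gbuffer_gb_per_second(1920, 1080, 60):.1f} GB/s")
print(f"4K    @ 60 fps: {gbuffer_gb_per_second(3840, 2160, 60):.1f} GB/s")  # 4x
```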
 
You need to learn how binning works. By your logic, every GTX 1070 is a failed GTX 1080.

Which, at its core, is what it is...

The 1070 Ti enables NVIDIA to sell more failed 1080s at a higher price.
 