
GTX 1080 Ti coming January: lower price and same performance as Titan X Pascal

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.80/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Sounds great. Depending on the benchies, I might sell my GTX 1080 and get this.

Nvidia’s upcoming GTX 1080 Ti graphics beast is reportedly launching at next year’s CES in January, featuring the company’s high-end GP102 GPU. Packing 3328 CUDA cores, a 1.6 GHz boost clock, 12 GB of GDDR5X memory and an impressive 10.8 TFLOPs of graphics horsepower, the new GeForce GTX 1080 Ti is set to deliver Titan X Pascal flagship performance at a significantly lower price. This latest leak comes from China, where effectively all of Nvidia’s channel partners manufacture their graphics cards.

http://wccftech.com/nvidia-gtx-1080-ti-launch-january
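Those leaked numbers are at least internally consistent. A quick back-of-the-envelope check, assuming the usual 2 FLOPs (one fused multiply-add) per CUDA core per clock — the core count and boost clock here are the rumored figures, not confirmed specs:

```python
# Rough sanity check of the leaked 10.8 TFLOPs figure.
# Assumes 2 FLOPs (one FMA) per CUDA core per clock, which is how
# Nvidia usually quotes peak single-precision throughput.
cuda_cores = 3328        # leaked core count
boost_clock_hz = 1.6e9   # leaked 1.6 GHz boost clock

peak_tflops = 2 * cuda_cores * boost_clock_hz / 1e12
print(f"{peak_tflops:.2f} TFLOPs")  # ~10.65, close to the quoted 10.8
```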
 
Didn't we get news before that the GTX 1080 Ti would be using GDDR5, not GDDR5X?
 
Didn't we get news before that the GTX 1080 Ti would be using GDDR5, not GDDR5X?
Yeah, but I'd regarded that as bullshit because GDDR5 is simply too slow for that GPU. Maybe with a 512-bit bus (Nvidia hasn't used one since the GTX 200 series).

Nice rumours though.
 
Maybe with a 512-bit bus (Nvidia hasn't used one since the GTX 200 series).
Yeah, it's a shame that. I used to get simple nerdy pleasure from counting the 512 circuit board traces on my GTX 285. :ohwell:
 
Yeah, but I'd regarded that as bullshit because GDDR5 is simply too slow for that GPU. Maybe with a 512-bit bus (Nvidia hasn't used one since the GTX 200 series).

Nice rumours though.

It's not like there is a massive bandwidth difference between the two. GDDR5 is only about 20% slower than GDDR5X.
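That ~20% figure checks out if you assume the typical per-pin data rates of the era — 8 Gbps for GDDR5 and 10 Gbps for GDDR5X — on the same 384-bit bus. The per-pin rates are my assumption here, not something stated in the leak:

```python
# Bandwidth on a 384-bit bus at assumed per-pin rates:
# GDDR5 at 8 Gbps per pin vs GDDR5X at 10 Gbps per pin.
bus_width_bits = 384

gddr5_gb_s = bus_width_bits * 8 / 8     # 384 GB/s
gddr5x_gb_s = bus_width_bits * 10 / 8   # 480 GB/s

shortfall = 1 - gddr5_gb_s / gddr5x_gb_s
print(f"GDDR5 {gddr5_gb_s:.0f} GB/s vs GDDR5X {gddr5x_gb_s:.0f} GB/s "
      f"({shortfall:.0%} slower)")
```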
 
It's trending on Twatter:
Sales of KY jump to an all-time high, along with manuals on "How to Touch Your Toes for Nvidia"
:)o_O:laugh:
 
It's not like there is a massive bandwidth difference between the two. GDDR5 is only about 20% slower than GDDR5X.
20% is a lot if that's what's lacking in the end. And I bet it would be.

Also, I wouldn't pay 800-1000 bucks for a GPU with old tech. GDDR5X is a must there.
 
TBH, I'm more interested to see if AMD can produce good competition. If so, Nvidia is going to regret their more expensive G-Sync tech.
 
TBH, I'm more interested to see if AMD can produce good competition. If so, Nvidia is going to regret their more expensive G-Sync tech.
Wrong thread?
Also, G-Sync/FreeSync is pretty worthless when you drive a high-refresh monitor at 144 Hz. Not really relevant for high-end cards like Vega.
 
It's trending on Twatter:
Sales of KY jump to an all-time high, along with manuals on "How to Touch Your Toes for Nvidia"
:)o_O:laugh:

In fairness, nobody is forcing anyone to buy it. Nvidia isn't pushing anyone into a corner; people are willingly picking the soap up. And that's a symptom of no proper high-performance competition.

TBH, I'm more interested to see if AMD can produce good competition. If so, Nvidia is going to regret their more expensive G-Sync tech.

I don't think they do at all, the same way Apple doesn't regret pricing their phones as they do, with limited tech that other companies freely use, like NFC for file and browser transfers. Nvidia have made money from what they do and will continue to do so. G-Sync works and people enjoy it. It's made NV some cash.
 
In fairness, nobody is forcing anyone to buy it. Nvidia isn't pushing anyone into a corner; people are willingly picking the soap up

it's a choice of
images

or

images

Both Scrape the **** from your Ass
one is more Expensive than the other
 
it's a choice of
images

or

images

Both Scrape the **** from your Ass
one is more Expensive than the other
Lol.

Well, I wouldn't say G-Sync is comparable to Apple tech, because Apple tech usually doesn't lose to the competition or end up almost forgotten. FreeSync easily won that war. But as I said earlier, not that it really matters anyway, technically speaking. Also, how many users actually care and buy a sync monitor? Not too many, I guess.
 
Lol.

Well, I wouldn't say G-Sync is comparable to Apple tech, because Apple tech usually doesn't lose to the competition or end up almost forgotten. FreeSync easily won that war. But as I said earlier, not that it really matters anyway, technically speaking. Also, how many users actually care and buy a sync monitor? Not too many, I guess.

I strongly disagree (though I don't have numbers to back this up); I think adaptive sync has been pretty popular.
My point, though, was that if top-end G-Sync monitors are still priced $100 to $300 over their FreeSync counterparts, it could give AMD sales a pretty big boost (assuming the price and performance of the competing cards are close).
 
Looks like we can expect to see 1080 Ti, Zen, and Vega, all in first part of 2017.

Found a site that has some interesting comparison specs on Vega, the 1080 Ti, and Polaris. Vega is going to shred in Vulkan and DX12. Hopefully more devs will offer a Vulkan option like id did with Doom, as it has proven far more effective than DX12, at least for AMD users.

Vega will have 4096 stream processors vs. 3328-3840 for the 1080 Ti, 16 GB of VRAM vs. 12 GB, a 4096-bit memory bus vs. 384-bit, and a whopping 1 TB/s of bandwidth vs. 480 GB/s. This lists the 1080 Ti as having GDDR5X too, but that may have been changed by Nvidia.

http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/

I was planning on going 1080 Ti, but I'm not so sure now.
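The bandwidth figures in that comparison do line up with the bus widths if you assume roughly 2 Gbps per pin for Vega's HBM2 and 10 Gbps per pin for the 1080 Ti's GDDR5X — both per-pin rates are my assumptions, since the leak doesn't state them:

```python
# Bandwidth = bus width (bits) x per-pin rate (Gbps) / 8 bits per byte.
vega_gb_s = 4096 * 2 / 8   # HBM2: 1024 GB/s, i.e. the ~1 TB/s in the leak
ti_gb_s = 384 * 10 / 8     # GDDR5X: 480 GB/s, matching the listed figure
print(vega_gb_s, ti_gb_s)  # 1024.0 480.0
```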
 
I strongly disagree (though I don't have numbers to back this up); I think adaptive sync has been pretty popular.
My point, though, was that if top-end G-Sync monitors are still priced $100 to $300 over their FreeSync counterparts, it could give AMD sales a pretty big boost (assuming the price and performance of the competing cards are close).
It didn't up to this point, because not that many users actually care about it. That was the same point I tried to make with my last post.

Nvidia has strong sales on their GPUs, and maybe every fifth (or fewer) Nvidia GPU user has a G-Sync monitor; it's probably the same for AMD. It's not really important sales-wise.
 
Yeah, it's a shame that. I used to get simple nerdy pleasure from counting the 512 circuit board traces on my GTX 285. :ohwell:
Darn, I miss the BFG GTX 275 I had. Should have kept it. It actually still ran games pretty decently, as long as you didn't shove all the sliders to max.

I wonder how well the 1080 Tis will fold. If they are close to a Titan X Pascal, I may nab used ones down the road.
Though that also depends on what AMD's new big dies can do. The RX 480s fold nicely for their price range.
 
Looks like we can expect to see 1080 Ti, Zen, and Vega, all in first part of 2017.

Found a site that has some interesting comparison specs on Vega, the 1080 Ti, and Polaris. Vega is going to shred in Vulkan and DX12. Hopefully more devs will offer a Vulkan option like id did with Doom, as it has proven far more effective than DX12, at least for AMD users.

Vega will have 4096 stream processors vs. 3328-3840 for the 1080 Ti, 16 GB of VRAM vs. 12 GB, a 4096-bit memory bus vs. 384-bit, and a whopping 1 TB/s of bandwidth vs. 480 GB/s. This lists the 1080 Ti as having GDDR5X too, but that may have been changed by Nvidia.

http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/

I was planning on going 1080 Ti, but I'm not so sure now.

Look at the core counts of the Fury X compared to the 1080. Vega will be an upgraded Fury in terms of performance; hard to know how far ahead for sure.
If the 1080 Ti matches the Titan X, Vega might not pull it off. The problem is that if Nvidia launches it early, we may have to wait quite a while for Vega to appear.
I really would love to see the two released together.
 
Look at the core counts of the Fury X compared to the 1080. Vega will be an upgraded Fury in terms of performance.

Fury was made to compete with the 900 series, and Polaris to compete in the sub-$300 mainstream market. As WCCFTECH states, it's Vega that is aimed at competing with the 1000 series, and some of the key specs we don't even know yet.

Many have been anxiously awaiting a high-end AMD card though, because clearly the RX 480 does very well for its price.
 
Darn, I miss the BFG GTX 275 I had. Should have kept it. It actually still ran games pretty decently, as long as you didn't shove all the sliders to max.

I wonder how well the 1080 Tis will fold. If they are close to a Titan X Pascal, I may nab used ones down the road.
Though that also depends on what AMD's new big dies can do. The RX 480s fold nicely for their price range.
I still have my 285, and I'll bet it will run quite well as long as memory demands are kept down and it's a DX9 or DX10 game. Might just go and have a fiddle with it now that you bring it up...
 
So it's 980 Ti vs. Fury all over again; Nvidia are just waiting for AMD to show up.

Volta baby, bring it!
 