
1080 Ti and blessed!!

I simply want to save the card for later - nothing bizarre about that.

As for saving cards, I've been doing that for over a decade now, never mind three years ago.
I have my reasons (benchmarking related) and that's it.
 
It's a bizarre concept to be stashing video cards for later use. Would you have ever considered doing this, say, three years ago?
Yes, it's always handy to have a spare on hand.
 
It's a bizarre concept to be stashing video cards for later use. Would you have ever considered doing this, say, three years ago?

Why? If there is life in them... they have purpose. And if there's no life in them, maybe they just look nice.

I don't think the concept got any more bizarre given the current shortages; you either have the spare card(s) or you don't, and you won't be getting them now anyway :D

Personally I'm more partial to selling used hardware off while it still has economic value, preferably as close to MSRP as possible :) I could probably sell my 1080 at a net gain right now... 500 EUR doesn't even seem like a stretch anymore, and I paid 420...
 
Why? If there is life in them... they have purpose. And if there's no life in them, maybe they just look nice.

I don't think the concept got any more bizarre given the current shortages; you either have the spare card(s) or you don't, and you won't be getting them now anyway :D

Personally I'm more partial to selling used hardware off while it still has economic value, preferably as close to MSRP as possible :) I could probably sell my 1080 at a net gain right now... 500 EUR doesn't even seem like a stretch anymore, and I paid 420...
See, this is what I thought was the standard rationale WRT holding onto video cards (i.e. selling used hardware off while it still has economic value), but with the GPU crisis going on now I can see that holding on to older video cards might be the wisest move.
 
I've just read a post saying Nvidia MIGHT re-release the 1080 Ti and 1650 cards. Read a bunch of comments saying it would be a bargain at $250. How would a new 1080 Ti cost $250 when they sold for $300-350 used? But I have to agree that this card, for its age, is still a beast.
 
Just a few weeks before the latest mining craze I was thinking of buying a GTX 1080 Ti, and I remember I could get one for around 250€. Instead, I decided to buy an R9 Fury for just 75€... Well, I love the Fury, it does what it's supposed to do, and even its price has been pumped up lately, but I wish I could go back in time just to get that 1080 Ti!!!
 
Why? If there is life in them... they have purpose. And if there's no life in them, maybe they just look nice.

I don't think the concept got any more bizarre given the current shortages; you either have the spare card(s) or you don't, and you won't be getting them now anyway :D

Personally I'm more partial to selling used hardware off while it still has economic value, preferably as close to MSRP as possible :) I could probably sell my 1080 at a net gain right now... 500 EUR doesn't even seem like a stretch anymore, and I paid 420...
I only keep hold of dead, useless but cool shit, apparently. Odd.
 
I've just read a post saying Nvidia MIGHT re-release the 1080 Ti and 1650 cards. Read a bunch of comments saying it would be a bargain at $250. How would a new 1080 Ti cost $250 when they sold for $300-350 used? But I have to agree that this card, for its age, is still a beast.
How could Nvidia do this, though, without producing more Pascal GPUs? If the semiconductor foundries are already overbooked, how are you going to get a line producing three-year-old GPUs that are two generations obsolete? Why not produce Turing parts? Or more Ampere parts? Why waste a foundry line making obsolete GPUs?
 
How could Nvidia do this, though, without producing more Pascal GPUs? If the semiconductor foundries are already overbooked, how are you going to get a line producing three-year-old GPUs that are two generations obsolete? Why not produce Turing parts? Or more Ampere parts? Why waste a foundry line making obsolete GPUs?

Nvidia is still making the Tegra X1+ on TSMC 16 nm; now they can reuse that 16 nm capacity for the 1080 Ti while probably transitioning the new Tegra to TSMC 5 nm (for the next-gen Switch).
 
Going back in time or forward in time, I think there are few cards better than the 1080 Ti. It was amazing when it launched and the price was good. The only missing feature that would have been nice today is DLSS, but even without it the card still does okay in the new Crysis, aka Cyberpunk, and 1440p at mostly high settings is doable at around 60 fps.
 
Nvidia is still making the Tegra X1+ on TSMC 16 nm; now they can reuse that 16 nm capacity for the 1080 Ti while probably transitioning the new Tegra to TSMC 5 nm (for the next-gen Switch).
OK, if you're going to commit yourself to making obsolete 16 nm Pascal parts, why not at least manufacture the BEST of the Pascal parts, like the Titan Xp? Then let the AIBs design their own custom boards with custom cooling solutions for it?
 
I was very lucky to get a day-one RTX 3080; as the months go past, it is not lost on me just how lucky that really was.

But I kept my GTX 1080 (non-Ti), got that beast of a card at launch, and truth be told it's still no slouch either; I'd 100% have stuck with it rather than pay inflated/scalper prices in the aftermath of the launch(es).

It sits mining away in a rig that is on a timer so it only runs when my solar panels are generating over 1 kW. If it's still going when I replace my 3080 down the line, it will likely go into that box, and the GTX 1080 can take its place mounted on my wall with a tribute poster that's in the making.

That was always my original plan for it; in four years of gaming I really grew fond of the GTX 1080, and it enabled some amazing memories for me while just powering through anything I could throw at it. It can take its place in a shrine for all eternity, plus serve as a backup/retro-bench/fun card if the itch strikes.
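For anyone curious how to automate that kind of solar-gated mining schedule in software rather than with a plug-in timer, here is a minimal sketch. It assumes a hypothetical read_solar_output_watts() helper (a stand-in for whatever the inverter or monitoring system actually exposes) and a miner run as a systemd service named miner.service; all of those names are illustrative, not anything the poster described using.

```python
import subprocess
import time

THRESHOLD_WATTS = 1000       # only mine while solar output is above ~1 kW
POLL_INTERVAL_SECONDS = 300  # check every 5 minutes


def read_solar_output_watts() -> float:
    """Hypothetical helper: query the solar inverter/monitoring system.

    Placeholder only - replace with whatever API your inverter exposes.
    """
    return 0.0


def set_miner_running(running: bool) -> None:
    """Start or stop the mining workload (assumed here to be a systemd service)."""
    action = "start" if running else "stop"
    subprocess.run(["systemctl", action, "miner.service"], check=False)


def main() -> None:
    miner_on = False
    while True:
        watts = read_solar_output_watts()
        if watts > THRESHOLD_WATTS and not miner_on:
            set_miner_running(True)
            miner_on = True
        elif watts <= THRESHOLD_WATTS and miner_on:
            set_miner_running(False)
            miner_on = False
        time.sleep(POLL_INTERVAL_SECONDS)


if __name__ == "__main__":
    main()
```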
 
This is a great feel-good subject. I 46.4% don't need a 3070 now :) as I'm stroking my 1080 :love: :)
 
If you check how the 1080 Ti fares in most NEWER GAMES, we are talking RTX 2060 levels more often than not. Far from high-end. Barely mid-range if you ask me. If the game supports DLSS, the 2060 will beat the 1080 Ti by 50% or so while using less than half the watts. An absolute monster? Not at all. Simply a mid-range card at best.

I don't get why people still praise the 1080 Ti. If we were able to buy GPUs at MSRP, we would not even be talking about the 1080 Ti in 2021, tbh... It's a 2016 architecture.

Yeah, still a decent GPU for many people, but far from high-end. Especially not if you're a 1440p/144+ Hz gamer.

I had both the 980 Ti and the 1080 Ti, and I think the 980 Ti was overall the better card. Mine overclocked like crazy, running 1650 MHz 3D clocks using custom firmware and performing like a GTX 1080.
Turing was a pretty bad release, and my 1080 Ti did not come close to the RTX 2080 in comparison, regardless of OC. Maxwell overclocked way better than Pascal, especially the 980 Ti.

The problem with both cards today is the lack of optimization from Nvidia and game devs. Maxwell and Pascal are pretty much forgotten at this point; all focus is on Turing/Ampere. And this is why Maxwell and Pascal drop lower and lower in new game benches.

In a year, when the 4000 series launches, it will be even worse. Once you are three gens behind, there is close to no support/optimization (even though the products are still supported "on paper"). It works if it works. I know plenty of people who are already seeing this happen on Maxwell (visual artifacts in newer games, like Warzone for example) and wonky performance in general, with drivers not fixing the issues - Nvidia/AMD don't bother, and neither do the game devs (they don't test much with old, obsolete hardware; they typically test with current- and last-gen stuff).

Glad I sold my 1080 Ti a few weeks before the 3080 came out, and I received mine for MSRP. The upgrade cost me 225 dollars, lol. Night-and-day difference in performance, especially when I output to my OLED TV at 4K/120 Hz (made possible by HDMI 2.1, which my 1080 Ti did not have). DLSS is the true magic on RTX cards. Couldn't care less about ray tracing.
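As a rough back-of-the-envelope check on the perf-per-watt claim in the first paragraph of the post above, taking the stated figures at face value (a ~50% FPS lead for the 2060 with DLSS at less than half the power draw; illustrative assumptions, not measured data):

```python
# Back-of-the-envelope perf-per-watt comparison using the figures claimed
# in the post above (illustrative assumptions, not measured data).

perf_1080ti = 1.0                     # baseline relative performance
perf_2060_dlss = 1.5                  # "beats the 1080 Ti by 50% or so" with DLSS

power_1080ti_w = 250.0                # GTX 1080 Ti rated board power
power_2060_w = 0.5 * power_1080ti_w   # "less than half the watts", per the post

ppw_1080ti = perf_1080ti / power_1080ti_w
ppw_2060 = perf_2060_dlss / power_2060_w

print(f"2060 perf/W advantage: {ppw_2060 / ppw_1080ti:.1f}x")  # -> 3.0x under these assumptions
```

Real numbers will of course vary by title and settings; this only shows how the poster's two figures combine.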
 
I'm on top of the world and someone comes along and kicks ya in the dangly bits :) 43.3%
 
I was curious and checked the prices on Fleabay of "for parts or repair" cards like the ones I got earlier (GTX 970).
The prices of those have almost doubled since I got mine around late 2019 to early 2020. I looked at a few different models; all seem to show the same pricing trend, especially the newer ones.

They all seem to be rising in "worth" even if broken or just dead.
 
If you check how the 1080 Ti fares in most NEWER GAMES, we are talking RTX 2060 levels more often than not. Far from high-end. Barely mid-range if you ask me. If the game supports DLSS, the 2060 will beat the 1080 Ti by 50% or so while using less than half the watts. An absolute monster? Not at all. Simply a mid-range card at best.

Not sure about other resolutions, but at 1440p my 1080 Ti (stock) lands in RTX 2070/RTX 2070 SUPER/5700 XT territory. With my personal performance targets, it definitely maintains that "higher-end" performance-value experience. I guess we're not all built the same... in the few AAA titles I play, I can't see much difference between 100 fps and 144 fps, hence the "absolute monstrosity" reference :) Achieving higher FPS doesn't come with much compromise either; a few tweaks to quality settings, with no noticeable visual difference, easily pop it north of 100.

Definitely not refuting your claim about "newer games" though... I haven't tried any recently launched titles. I'm scarred for life by Battlefield for now, and a couple of others.
 
If you check how the 1080 Ti fares in most NEWER GAMES, we are talking RTX 2060 levels more often than not. Far from high-end. Barely mid-range if you ask me. If the game supports DLSS, the 2060 will beat the 1080 Ti by 50% or so while using less than half the watts. An absolute monster? Not at all. Simply a mid-range card at best.

You do have a point. Here is a performance comparison for CP2077 at 1440p:

[Chart: Cyberpunk 2077 average FPS comparison at 2560x1440]


...and that is with DLSS off.
 
I don't get why people still praise the 1080 Ti. If we were able to buy GPUs at MSRP, we would not even be talking about the 1080 Ti in 2021, tbh... It's a 2016 architecture.

You answered your own question. These are not normal times. The 1080 Ti is still very much relevant in gaming today. Yes, it will fade away when the 4xxx series arrives, and the 2xxx will fade away too. This is how it has always been with GPUs.

For now, it is almost impossible to buy a 3xxx card unless you are willing to get scalped.
 
You answered your own question. These are not normal times. The 1080 Ti is still very much relevant in gaming today. Yes, it will fade away when the 4xxx series arrives, and the 2xxx will fade away too. This is how it has always been with GPUs.

For now, it is almost impossible to buy a 3xxx card unless you are willing to get scalped.

That goes for all modern graphics cards, to be honest. Not just the 1080 Ti. I'm sure that those who bought an RX 580 for 200 USD a year ago feel pretty lucky right now, too.
 
How could Nvidia do this, though, without producing more Pascal GPUs? If the semiconductor foundries are already overbooked, how are you going to get a line producing three-year-old GPUs that are two generations obsolete? Why not produce Turing parts? Or more Ampere parts? Why waste a foundry line making obsolete GPUs?
Perhaps because old (14nm) technology can be made on old equipment and so supplement output.
 
Perhaps because old (14nm) technology can be made on old equipment and so supplement output.
GP102 (Titan Xp, Titan X, 1080 Ti), GP104 (1080, 1070 Ti, 1070, some 1060s) and GP106 (1060) are made with TSMC 16nm FinFET. GP107 (1050 and 1050 Ti) and GP108 (1010, 1030) are made with Samsung 14nm LPP.
 
You do have a point. Here is a performance comparison for CP2077 at 1440p:

[Chart: Cyberpunk 2077 average FPS comparison at 2560x1440]

...and that is with DLSS off.

Crikes!! That's one hell of a heavy title... poorly optimised, or game mechanics gone wildly larger than life?
 
Crikes!! That's one hell of a heavy title... poorly optimised, or game mechanics gone wildly larger than life?

Well, I'll admit I cherry-picked this game. But other modern titles follow a similar trend, such as RDR2, AC Valhalla, MSFS, etc., putting the 1080 Ti between a 2070 and a 2070 Super in performance. Hardly remarkable, and that's coming from someone who owns a lesser card, an RTX 2060 Super.
 
CDPR is still working on patching Cyberpunk 2077. On release it was pretty much a mess, imo. I think in maybe a year it will run much better. CDPR is busier than a one-legged man in a butt-kicking contest. They are working on patching this game, expansions for this game, an update to The Witcher 3 to bring it up to next-gen level and add ray tracing, and a new game.
 