
GTX 1080 Ti coming January: lower price and same performance as Titan X Pascal

I still have my 285 and I'll bet it will run quite well as long as memory demands are kept down and a DX9 or DX10 game is being run. Might just go and have a fiddle with it now that you bring it up...
Yep, when I had to send in the used OG Titan I managed to kill by folding, I had to use the GTX 275 for a while. It actually ran Skyrim decently enough.
 
Yep, when I had to send in the used OG Titan I managed to kill by folding, I had to use the GTX 275 for a while. It actually ran Skyrim decently enough.
I'm not surprised since these sorts of apps really hammer graphics cards. Was there much coil whine?

On my GTX 580 (I've got this one), running Folding@Home was fairly quiet for about a minute (the fan spun up, of course), but then the most god-awful noise (a roughly 400–500 Hz tone) could be heard coming directly from the power coils on the card. On top of that, the noise somehow found its way into the sound card as well, putting the same loud tone through the speakers. Needless to say, I stopped running it and forgot all about it. Besides driving me nuts and potentially inviting a neighbour complaint, I doubt the card would have lasted very long. Basically the thing is cheaply built to a price point, so it can't stand a little stress. :shadedshu:
 
Look at the core counts of the Fury X compared to the 1080. Vega will be an upgraded Fury in terms of performance; it's hard to know how far ahead for sure.
If the 1080 Ti matches the Titan X, Vega might not pull it off. The problem is, if Nvidia launches it early, we may have to wait quite a while for Vega to appear.
I really would love to see the two released together.
Polaris was a 15% IPC increase over GCN 3 (Fury/R9 285/380), and Vega is an increase over Polaris: a new architecture, "more new" compared to Polaris than Polaris was compared to Fury. I guess Vega is supposed to be the "really new" architecture; at least I hope so. So I have some hope that it might be good. At least this time it will not have a VRAM handicap compared to Nvidia's offerings.
 
I'm not surprised since these sorts of apps really hammer graphics cards. Was there much coil whine?

On my GTX 580 (I've got this one), running Folding@Home was fairly quiet for about a minute (the fan spun up, of course), but then the most god-awful noise (a roughly 400–500 Hz tone) could be heard coming directly from the power coils on the card. On top of that, the noise somehow found its way into the sound card as well, putting the same loud tone through the speakers. Needless to say, I stopped running it and forgot all about it. Besides driving me nuts and potentially inviting a neighbour complaint, I doubt the card would have lasted very long. Basically the thing is cheaply built to a price point, so it can't stand a little stress. :shadedshu:
Surprisingly, it did not whine much. It just went all glorious black screen. I even tested it in another system: nope, no output. I even tried different video connections. At least EVGA has some awesome warranties; that's the reason I love used EVGA cards for folding.

So far my 960 and two 980 Tis have survived me. The 980 STRIX... I want to beat it to death. It has the most god-awful fans or coil whine or something; it sounds like an angry banshee or an evil hornet if I even dare fold on the card. That's the last time I'm touching an ASUS card, especially a STRIX. (I also dislike that it's locked to 1.212 V.)
 
Wrong thread?
Also, G-Sync/FreeSync is pretty worthless when you drive a high-Hz monitor like 144 Hz. Not really relevant on high-end cards like Vega.

Ehhh, look up the actual function of G-Sync and FreeSync; you might find your statement to be wrong. Having a high refresh rate is one thing, but having the monitor perfectly synced to every frame rendered by the GPU is another.
 
Ehhh, look up the actual function of G-Sync and FreeSync; you might find your statement to be wrong. Having a high refresh rate is one thing, but having the monitor perfectly synced to every frame rendered by the GPU is another.
Ehhh, no, I do understand it, and you're not getting my point. If you have a high-end card and a high-Hz monitor, your fps is so high that G-Sync or any other sync isn't needed anymore, simply because of the very fast response times of the monitor. Also, Windows 10 does feature what some call "fast sync" (it's on in windowed mode), which makes the rest of Free/G-Sync useless if you have at least 80 to 100 fps and a high-Hz monitor (meaning adaptive sync is mainly useful up to 60 Hz). If you don't believe or understand what I'm saying, I couldn't care less, though; it's off-topic anyway.

Also you're double posting like a total noob.
 
Ehhh, no, I do understand it, and you're not getting my point. If you have a high-end card and a high-Hz monitor, your fps is so high that G-Sync or any other sync isn't needed anymore, simply because of the very fast response times of the monitor. Also, Windows 10 does feature what some call "fast sync" (it's on in windowed mode), which makes the rest of Free/G-Sync useless if you have at least 80 to 100 fps and a high-Hz monitor (meaning adaptive sync is mainly useful up to 60 Hz). If you don't believe or understand what I'm saying, I couldn't care less, though; it's off-topic anyway.

Also you're double posting like a total noob.

Well, considering I was able to experience it first-hand on my monitor (mind you, it's only 100 Hz) and then try it on my roommate's FreeSync monitor with his rig, FreeSync still wins from what I could tell. Still way smoother, zero tearing, etc.
 
Well, considering I was able to experience it first-hand on my monitor (mind you, it's only 100 Hz) and then try it on my roommate's FreeSync monitor with his rig, FreeSync still wins from what I could tell. Still way smoother, zero tearing, etc.
I haven't had any tearing since I got this 120 Hz monitor over 4 years ago. I had stutter from CrossFire, but that's gone since I sold my HD 5970. I didn't say FreeSync/G-Sync is useless, though; I meant it's more of an unneeded luxury if you have a high-Hz monitor with decent fps (80+) and no drops.
You should not have any tearing when playing windowed under Windows 10: fast sync, as I said. Nvidia now additionally offers this feature for fullscreen on its own on 10- and 9-series GPUs (they call it Fast Sync; the function implemented in Windows has no special name). I'm playing windowed most of the time anyway. FreeSync/G-Sync is generally useful for 60 Hz rigs, or machines that get low fps on a high-Hz monitor.

That said, I like ULMB; it's more useful to me than G-Sync. New G-Sync monitors have that feature too. It makes the picture quality much better.
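Side note on the numbers being argued here: the "tearing matters less at high refresh rates" claim boils down to simple arithmetic, since a torn frame is overwritten at the next display refresh, so the worst-case lifetime of a tear line is one refresh interval. A minimal sketch (illustrative figures only, not from anyone's post):

```python
# Worst-case time a tear artifact stays on screen is one refresh
# interval, so higher refresh rates shrink that window.

def refresh_interval_ms(hz: float) -> float:
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 100, 120, 144):
    print(f"{hz:>3} Hz -> refresh every {refresh_interval_ms(hz):.2f} ms")
```

At 144 Hz a tear can persist for at most about 6.9 ms versus about 16.7 ms at 60 Hz, which is the intuition behind calling adaptive sync an "unneeded luxury" on high-Hz panels; whether that window is actually invisible in practice is the point being disputed.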
 
Why would they make the GTX 1080 Ti the same performance as the Titan X Pascal at a lower cost? That wouldn't make any sense at all.
 
Why would they make the GTX 1080 Ti the same performance as the Titan X Pascal at a lower cost? That wouldn't make any sense at all.

The thing with Nvidia is they have a successful strategy going.

They released the Kepler Titan for $1,000. Later they released the 780 Ti, which was faster than the Titan, for $700, but with half the VRAM.

They released the Maxwell Titan X for $1,000. Later they released the 980 Ti, which was almost as fast as the Titan X with non-reference coolers, for $650, but with half the VRAM.

Now they've released the Pascal Titan X for $1,200, and later they'll release a 1080 Ti which is almost as fast as a Titan XP, for maybe $850.

What Nvidia has been successful in doing is releasing the Titans first, scooping up the gamers who are willing to pay the high price to be early high-end adopters, and then releasing another version of their high-end GPU for a good bit less afterwards. No doubt they will continue this in the future until it doesn't work anymore.
 
Why would they make the GTX 1080 Ti the same performance as the Titan X Pascal at a lower cost? That wouldn't make any sense at all.
Because they're fleecing early adopters again.
 
Well, considering I was able to experience it first-hand on my monitor (mind you, it's only 100 Hz) and then try it on my roommate's FreeSync monitor with his rig, FreeSync still wins from what I could tell. Still way smoother, zero tearing, etc.

I totally agree... I have an X34 100 Hz G-Sync monitor, and if you turn G-Sync off you instantly notice the difference. It's not a huge one, but it's there.

I haven't had any tearing since I got this 120 Hz monitor over 4 years ago. I had stutter from CrossFire, but that's gone since I sold my HD 5970. I didn't say FreeSync/G-Sync is useless, though; I meant it's more of an unneeded luxury if you have a high-Hz monitor with decent fps (80+) and no drops.
You should not have any tearing when playing windowed under Windows 10: fast sync, as I said. Nvidia now additionally offers this feature for fullscreen on its own on 10- and 9-series GPUs (they call it Fast Sync; the function implemented in Windows has no special name). I'm playing windowed most of the time anyway. FreeSync/G-Sync is generally useful for 60 Hz rigs, or machines that get low fps on a high-Hz monitor.

That said, I like ULMB; it's more useful to me than G-Sync. New G-Sync monitors have that feature too. It makes the picture quality much better.

80+ fps and no drops... that's almost impossible with current AAA games; see Deus Ex.
 
I totally agree... I have an X34 100 Hz G-Sync monitor, and if you turn G-Sync off you instantly notice the difference. It's not a huge one, but it's there.



80+ fps and no drops... that's almost impossible with current AAA games; see Deus Ex.
It's not impossible; my settings just aren't stupid ultra all the way. If you carefully choose intelligent settings that aren't "ultra ultra ultra", it's easily possible. For example, in GTA Online on my machine I only set to ultra the settings that aren't extreme ones like grass.

@64K
The problem with what you say is that in GTX 700-series times they HAD to release full Kepler to fight the 290X. Now that Nvidia has no competition, they will just release a further cut-down GPU and ask even more money for it than before. Compared to the 780 Ti it's just laughable. The only thing better will be the amount of VRAM, but it will be nowhere near full shaders.
 
What I'm hoping is that Nvidia is planning ahead a little for the Vega 10 release, which should come in Q1 next year. They should know they need something to counter it, because a $1,200 Titan XP will surely be a lot more expensive than a Vega 10, but that's purely guesswork on my part. I don't expect the 1080 Ti to have the same number of shaders as the Titan XP, but I do expect it to be released to board partners with permission to use non-reference cooler designs, so I'm expecting something like a repeat of the Maxwell Titan X and 980 Ti.
 
Yes, I too think that with better cooling and PCBs it will be almost as good as the Titan XP, or even better, same as the 980 Ti. That's always granted the Titan XP isn't modded with watercooling, which every user who spent that much money on a GPU simply should do, because it's absolutely the smart thing to do.

My only hope for Vega 10 is that it can trade blows with the 1080 Ti, or be a tad faster than custom 1080s. To hope for even more is unrealistic, if you ask me.
 
It would be funny if AMD kept everything quiet and then, bam, out of the blue, top-end cards for a decent price. NVIDIA wouldn't be able to counter that for several months. But if you give out all the projections and info, of course they'll prepare in advance, lol.
 
It would be funny if AMD kept everything quiet and then, bam, out of the blue, top-end cards for a decent price. NVIDIA wouldn't be able to counter that for several months. But if you give out all the projections and info, of course they'll prepare in advance, lol.
Yeah, dreaming is a lot of fun. Nonetheless, I think Vega will be an alternative to Nvidia's high-end cards; that will be good enough for the market situation, I guess.
 
I still have my 285 and I'll bet it will run quite well as long as memory demands are kept down and a DX9 or DX10 game is being run. Might just go and have a fiddle with it now that you bring it up...

More like fondle, amirite?

Anyway, I'm loving the sound of the first half of 2017.
 
It would be funny if AMD kept everything quiet and then, bam, out of the blue, top-end cards for a decent price. NVIDIA wouldn't be able to counter that for several months. But if you give out all the projections and info, of course they'll prepare in advance, lol.
There was a time when that actually happened... see the HD 3000, HD 5000, and HD 7000 series; let's hope this will be one of those moments. I have nothing to gain from this, as I already bought a GTX 1080 and I have a G-Sync monitor as well... but for the sake of the customers and the GPU/CPU market, AMD should wake the f**k up :)
 
AMD will hit us with another Radeon 4870, and NVIDIA will shit bricks.
 
It's not impossible; my settings just aren't stupid ultra all the way. If you carefully choose intelligent settings that aren't "ultra ultra ultra", it's easily possible. For example, in GTA Online on my machine I only set to ultra the settings that aren't extreme ones like grass.

It's a different ballgame once you go above 1080p gaming. At 1440p and 3440x1440, many new-generation games will give you frame drops that are noticeable without G-Sync/FreeSync, even on medium settings with a 980 Ti.
 
More like fondle, amirite?

Anyway, I'm loving the sound of the first half of 2017.
lol, you am right. The details are quite unspeakable. :laugh:
 