
NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked - Twice

Ah, the mythical gimping.

Not optimizing new software for older arches is not gimping. Read a dictionary sometime. Gimping is retroactively LOWERING the performance of a previous product, which has been proven time and time again to be completely TRUE with NVIDIA....
Fixed that for you.
Gimping is lowering the performance of older hardware with new drivers, in comparison to older drivers. Perhaps you should read a dictionary sometime.
Optimising new software for old hardware has nothing to do with it; older cards should at least be capable of running at their prior levels of performance, not lower, which has been shown many times on forums like this one.
 
Guess the whole "take it with a grain of salt" thing is getting lost, as everyone is getting salty over the Green Camp "milking" people's wallets when they haven't even pre-ordered a card or benched one for themselves. Leaks are becoming the norm here because impatient, trigger-happy people are benching new cards with current drivers, which don't even unlock the card's full potential. As for the whole "boycott NVIDIA" over how they set the prices, I doubt it'll hurt their stock when the animation & CGI industry has already jumped on the real-time ray-tracing bandwagon, effectively replacing all their current Maxwell or Pascal based Quadro cards while saving money at the same time. For me, I'll be playing the waiting game until those who own the cards under NDA get the correct drivers and updated benchmark software, so we can see how it will fare.
 
Why, are you planning to buy this card for 1080p gaming? :laugh::laugh::laugh::laugh::roll:

You obviously don't game a whole lot if you say this. Resolution is irrelevant: games get heavier, and both your 1080 and mine will be struggling at 1080p before long. When Pascal launched, people said the exact same nonsense you typed up here about the 1080, only applied to 1440p. Go figure. A higher resolution just locks you into buying the most expensive GPUs every single time, while it doesn't even pay off that much in image quality. Who's laughing here? I'll take 1080p any day and get stable, high FPS at max settings.

As for the 2080 Ti and the sad people who need to post everywhere that they pre-ordered one... the joke's on you :D Enjoy those 10 Giga Rays at 30 fps.

25-30% best case seems accurate and as predicted. It also puts the 2080 and 2070 in perspective: those are practically worthless now. It also means the Pascal lineup will hold its value, making our second-hand sales that much more valuable. Thanks NVIDIA ;)
 

For some games, a higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 2K 144Hz) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
 

Fair point. And in close combat you'd lose, because a higher pixel count demands more accuracy = higher miss chance ;)

There's a reason people lower their res in competitive shooters, far more often than they raise it so they can camp in the bushes. It feels to me like you're grasping at straws here, sry...
 

The only advantage I found moving from a 40" 4K screen to a 32" 1440p one is that I needed a lower mouse DPI setting, so I was more accurate with snap movements.
That may have been an issue with my mouse or even a latency issue on the TV (it wasn't exactly made for PC gaming), but it's still relevant.

The main reason people use lower-res monitors is that that's what those older games were designed for... even on a 40" TV you should have seen how tiny everything was at 4K, and that's the real reason 1440p is more popular - because UI scaling isn't prevalent yet; even at the operating system/common apps level it's still immature tech.
 

The main reason people use the lower res monitors is because

You've just given the answer yourself. A lower res means fewer pixels to aim for, making the chance of hitting the right one(s) much higher. You can also lower the DPI, which makes your hand's movement more pronounced and thus more deliberate and accurate. It's simply easier to point at a bigger target than at a smaller one.

This has nothing to do with UI scaling; in competitive shooters nobody looks at the UI, you just know what you have at all times (clip size, reload timing, etc. is all muscle memory). For immersive games and for strategy and isometric stuff, yes, high res and crappy UI visibility go hand in hand... I can imagine a higher res being nice to have for competitive games such as DOTA, though.
 
RTS games like DOTA are exactly where it's useless, because you've got a locked perspective/field of view - you can't see further or any differently.

I've got 1080p, 1440p and 4K in the room with me and direct experience with all of them, even at LAN parties dealing with retro games. Some games work great no matter the resolution (Call of Duty 2 runs with no issues in Windows 10 at 4K, for example) and others are very broken (Supreme Commander works great at 1440p 144Hz despite its 100fps cap, but the UI is utterly broken at 4K).

No competitive gamer is going to choose a resolution, practise on it and learn it, and then go to an event with the hardware provided and have all their muscle memory be out of whack.
 
If these results are for real, it doesn't look bad... still way too expensive for now. If people don't pre-order like crazy, the price will definitely go down.
In terms of prices, it really looks bad in my opinion.

This ^^^ is an example of tech snobbery.
There's nothing wrong with gaming at 1080p.
There is nothing wrong with it. But it becomes wrong from its foundations when you defend it in the context of a $1050-1100 GPU.
 
Still not a genuine 4K 60fps card then.
 
This is fake.
Why? With every graphics option enabled except AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games, 4K 60 may be hard to achieve with the same options.
 
Fact: 4K 60fps or higher is still not possible when game devs keep putting all sorts of "features" into their games, even though they know how low-level APIs like DX12 or Vulkan work. I will still take 1080p/1440p/ultrawide @ 60+ fps over 4K. Since when is 4K 60fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60fps?
 
Why? With every graphics option enabled except AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games, 4K 60 may be hard to achieve with the same options.
I mean the leaks are fake.
 
I'd be interested to see what the difference is at equal clock speeds.

The 2080 Ti FE comes overclocked out of the box and should run cooler, which means it can sustain much higher clocks than the stock 1080 Ti FE.
Around 33.2%: ((4352/3584 * 100) - 100) / 12 * 18.6
That's based on the difference in cores and transistors, not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. And given that my numbers aren't actually off compared to the released benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.
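For anyone who wants to reproduce that back-of-the-envelope figure, here is the same arithmetic as a rough Python sketch. The core and transistor counts are the commonly quoted TU102/GP102 numbers; scaling the core-count gain by the transistor ratio is purely the assumption of the post above, not a measured result.

```python
# Rough uplift estimate for the RTX 2080 Ti over the GTX 1080 Ti, following
# the post above: scale the CUDA-core increase by the transistor-count ratio.
# Clock speeds and architectural efficiency are deliberately ignored.

cores_2080ti = 4352        # TU102 CUDA cores
cores_1080ti = 3584        # GP102 CUDA cores
transistors_2080ti = 18.6  # billions
transistors_1080ti = 12.0  # billions

core_gain_pct = (cores_2080ti / cores_1080ti - 1) * 100            # ~21.4%
estimate_pct = core_gain_pct * (transistors_2080ti / transistors_1080ti)

print(f"core count increase: {core_gain_pct:.1f}%")   # 21.4%
print(f"scaled estimate:     {estimate_pct:.1f}%")    # 33.2%
```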
 
Since when is 4K 60fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60fps?
It's not a "standard" in the sense that a typical gaming PC offers it, but it's a standard we should aim for... and possibly stay at.
Most people don't notice much difference when going beyond either 4K or 60fps (nor are they willing to pay for it).

Of course there will be people who want to play on multiple screens or 144Hz.
And since there's demand, there should be a "halo" product that can do that for a premium. I'd imagine that by 2025 such GPUs will have no problem with 4K@144fps or triple 4K@60fps.

But wouldn't it make sense for the rest of the GPU lineup to concentrate on 4K@60fps and differentiate by image quality (features, effects)?
I mean... Nvidia currently makes 10 different models per generation (some in two RAM variants). Do we really need that? :-)
 
Yet Maxwell was faster than Kepler on the same 28nm. Let's not kid ourselves: NVIDIA could still have easily doubled performance (or at least come very close to it) if they had dropped the Tensor and RT cores. People aren't spoiled; there have always been huge jumps in performance in the GPU space, that's actually the norm. It's NVIDIA that has spoiled itself. Funny how even when they go all out with the biggest dies they can make, they still choose to make it so that it's not the best they can do.


Bingo. It's a bloody 750 mm² die!!! They could have made this a 6080-SP card with 16 GB on a 512-bit bus and 1 TB/s of bandwidth. It would have been easily capable of 4K 144Hz.

But instead we get this inefficient modest upgrade so they can sell defective compute cards to oblivious gamers.
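As a quick sanity check on that hypothetical 512-bit card, the claimed bandwidth follows from bus width and per-pin data rate. The sketch below assumes common GDDR6 speed bins; the 512-bit bus itself is the poster's hypothetical (the actual 2080 Ti ships with 14 Gbps GDDR6 on a 352-bit bus, about 616 GB/s).

```python
# Peak memory bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8.
# The 512-bit bus is the hypothetical configuration from the post above.

bus_width_bits = 512
for data_rate_gbps in (14, 16):   # common GDDR6 speed bins (assumed)
    bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
    print(f"{data_rate_gbps} Gbps x {bus_width_bits}-bit = {bandwidth_gbs:.0f} GB/s")

# 14 Gbps -> 896 GB/s; 16 Gbps -> 1024 GB/s, i.e. roughly the 1 TB/s quoted.
```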
 
Obviously I'll wait to see an actual comparison from a solid/reliable reviewer like TPU, but if the numbers are accurate, that's incredibly disappointing given the price points - just more support for those of us who are planning on skipping this iteration from NVIDIA. Awful pricing, and not nearly enough of a performance jump to justify it...

Looking forward to the TPU review, though.
 
I would not call 30%+ a "tiny bit".

People got spoiled by the Maxwell ---> Pascal jump, completely ignoring the fact that the previous gen was stuck on 28nm for a long time.

Meanwhile, I will keep my pre-order. Any 20XX is better than my current GPU, that's for sure.

Leave % out of it, really. I get a severe allergic reaction when people plaster percentages everywhere to make a point and salivate over 10% or 15% differences. Great, 30%. Then you look at the actual framerate difference and it's about 13 fps (example: 45 fps + 30%). 30% sounds like a HUGE difference, but in actual framerate, 13 fps is a difference, though hardly a game-changing one. And when that's the best-case scenario, many games may see even lower gains, as evident from the graphs (fake or not). And then you ask yourself: a 1000€ graphics card for a 13 fps boost? Really? For someone who shits money, sure. Every FPS counts. But for rational people, sorry, just no.
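To put the percentage-versus-framerate point in concrete terms, here is a small sketch using the 45 fps example from the post above (the other baselines are just illustrative assumptions):

```python
# The same percentage uplift buys very different absolute fps gains
# depending on the baseline framerate.

uplift = 0.30  # the ~30% best-case gain from the leaked numbers

for base_fps in (45, 60, 100):
    new_fps = base_fps * (1 + uplift)
    print(f"{base_fps} fps -> {new_fps:.1f} fps (+{new_fps - base_fps:.1f} fps)")

# 45 fps -> 58.5 fps (+13.5 fps)
# 60 fps -> 78.0 fps (+18.0 fps)
# 100 fps -> 130.0 fps (+30.0 fps)
```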
 
What are you talking about? The lower the fps, the more noticeable the difference. 45 + 13 is 58, which turns an unplayable or hard-to-play experience into a pretty playable one again. 100 + 30 fps is not even close to being that impactful. I can't game at 45; 58 is not great, but it's fine. 100 and 130 are both super fast, and while I do notice the difference, it's not make or break anymore. I went 980 -> 980 Ti -> 1080, so I'm used to 20-25%, and as long as it's reflected in min fps I can see it very clearly in games.
Leaving out percentage comparisons in favour of direct fps comparisons would be an absolutely awful idea.
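A frame-time view makes the same argument the other way around, assuming the example framerates above: the lower the starting framerate, the more milliseconds each frame gets shorter for the same jump.

```python
# Frame time (ms per frame) shows why a given fps jump matters more at low
# framerates: 45 -> 58 fps shaves far more off each frame than 100 -> 130 fps.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for old_fps, new_fps in ((45, 58), (100, 130)):
    saved = frame_time_ms(old_fps) - frame_time_ms(new_fps)
    print(f"{old_fps} -> {new_fps} fps: frames get {saved:.1f} ms shorter")

# 45 -> 58 fps: frames get 5.0 ms shorter
# 100 -> 130 fps: frames get 2.3 ms shorter
```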
 
Around 33.2%: ((4352/3584 * 100) - 100) / 12 * 18.6
That's based on the difference in cores and transistors, not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. And given that my numbers aren't actually off compared to the released benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.

AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
For some games, a higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 2K 144Hz) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.

That's definitely true. I noticed a major disadvantage with iron sights when I gamed on my laptop recently (900p).
 
AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
Source.
 

Google "7nm Radeon 2018" and filter only news from the past month. TBH it's somewhat a mixed bag of rumors and facts. What we DO know is AMD is launching 7nm Vega this year, and most recent rumors point to a Vega coming to gaming as well. However some rumors point to Navi possibly launching this year as well.

The point is, we do know AMD is bringing 7nm cards THIS year. But it's up in the air whether it will be Vega or Navi (or both), and whether it will be a full launch or a paper launch. I just wouldn't bother with overpriced and soon-to-be-outdated 12nm products.
 