Monday, September 3rd 2018

NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked - Twice

Caveat emptor, take this with a grain of salt, and all the usual warnings about hardware performance rumors come to mind here foremost. That said, a Turkish YouTuber, PC Hocasi TV, put up and then quickly took down a video going through his benchmark results for NVIDIA's new flagship GPU, the GeForce RTX 2080 Ti, across a plethora of game titles. The results, which you can see by clicking to read the whole story, are not out of line, but some of the titles are in beta (Battlefield V) or are online shooters (PUBG), so a second grain of salt is needed to season this gravy.

As it stands, 3DCenter.org put together a nice summary of the relative performance of the RTX 2080 Ti compared to last generation's GeForce GTX 1080 Ti. Based on these results, the RTX 2080 Ti is approximately 37.5% faster than the GTX 1080 Ti in average FPS and ~30% faster in minimum FPS. These figures are in line with hardware analysts' expectations, and the timing of the results, so close to the GPU's launch, does lend some credence to the numbers. Adding to this leak is yet another, this time based on a 3DMark Time Spy benchmark, which we will see past the break.
The second leak in question comes from an anonymous source who sent VideoCardz.com a photograph of a monitor displaying a 3DMark Time Spy result for a generic NVIDIA graphics device with code name 1E07 and 11 GB of VRAM on board. With a graphics score of 12,825, this is approximately 35% higher than the ~9,500 average for the GeForce GTX 1080 Ti Founders Edition. This increase matches up closely with the average increase seen in the game benchmarks and, if it holds with release drivers as well, the RTX 2080 Ti brings a decent but not overwhelming performance increase over the previous generation in titles that do not make use of in-game real-time ray tracing. As always, look out for a detailed review on TechPowerUp before making up your mind on whether this is the GPU for you.
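As a quick sanity check, the ~35% figure above is simply the ratio of the two graphics scores. A minimal sketch of that arithmetic (the inputs are the leaked and approximate figures quoted above, not verified results):

```python
# Sanity check of the leaked Time Spy uplift; both inputs are approximate/leaked figures.
gtx_1080_ti_avg_score = 9500     # approximate average graphics score for the GTX 1080 Ti FE
rtx_2080_ti_leak_score = 12825   # graphics score shown in the leaked screenshot

uplift = rtx_2080_ti_leak_score / gtx_1080_ti_avg_score - 1
print(f"Time Spy graphics uplift: {uplift:.1%}")  # ~35.0%
```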
For those interested, screenshots of the first set of benchmarks are attached below (taken from Joker Productions on YouTube):

86 Comments on NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked - Twice

#51
Tsukiyomi91
Guess the whole "take it with a grain of salt" thing is getting lit, as everyone is getting salty over the fact that the Green Camp is "milking" people's wallets when they haven't even pre-ordered one, or even benched one for themselves. Leaks are becoming the norm here because impatient people who are too trigger-happy are benching new cards with current drivers, which don't even unlock the card's full potential. Another thing about the whole "boycott Nvidia" over how they set the prices: I doubt it'll hurt their stock when the animation & CGI industry has already ridden the real-time ray-tracing bandwagon, effectively replacing all their current Maxwell- or Pascal-based Quadro cards while saving money at the same time. For me, I'll be playing the waiting game until those who own the cards and signed the NDA get the correct drivers and updated benchmark software, so we can see how it will fare.
#53
Vayra86
Prima.Vera: Why, are you planning to buy this card for 1080p gaming? :laugh::laugh::laugh::laugh::roll:
You obviously don't game a whole lot if you say this. Resolution is irrelevant, games get heavier and your and my 1080 will be struggling with 1080p before long. When Pascal launched, people said the exact same nonsense you typed up here about that 1080 but then applied it to 1440p. Go figure. A higher res only locks you into buying the most expensive GPUs all the time every time, while resolution doesn't even pay off that much in image quality. Who's laughing here? I'll take 1080p any day and get stable, high FPS at max settings.

As for the 2080ti and the sad people who need to post everywhere that they pre-ordered one... the joke's on you :D Enjoy those 10 Giga Rays at 30 fps.

25-30% best case seems accurate and as predicted. It also puts the 2080 and 2070 in perspective, those are literally worthless now. It also means the Pascal line up will remain stagnant in value making our second hand sales that much more valuable. Thanks Nvidia ;)
#54
Mussels
Freshwater Moderator
Vayra86: You obviously don't game a whole lot if you say this. Resolution is irrelevant, games get heavier and your and my 1080 will be struggling with 1080p before long. When Pascal launched, people said the exact same nonsense you typed up here about that 1080 but then applied it to 1440p. Go figure. A higher res only locks you into buying the most expensive GPUs all the time every time, while resolution doesn't even pay off that much in image quality. Who's laughing here? I'll take 1080p any day and get stable, high FPS at max settings.

As for the 2080ti and the sad people who need to post everywhere that they pre-ordered one... the joke's on you :D Enjoy those 10 Giga Rays at 30 fps.

25-30% best case seems accurate and as predicted. It also puts the 2080 and 2070 in perspective, those are literally worthless now. It also means the Pascal line up will remain stagnant in value making our second hand sales that much more valuable. Thanks Nvidia ;)
For some games, higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 1440p 144 Hz) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
#55
Vayra86
Mussels: For some games, higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 1440p 144 Hz) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
Fair point. And in close combat you'd lose, because you need more accuracy: higher pixel counts = a higher miss chance ;)

There is a reason people lower their res in competitive shooters, much more so than raising it to camp in the bushes. It feels to me like you're grasping at straws here, sry...
#56
Mussels
Freshwater Moderator
Vayra86: Fair point. And in close combat you'd lose, because you need more accuracy: higher pixel counts = a higher miss chance ;)

There is a reason people lower their res in competitive shooters, much more so than raising it to camp in the bushes. It feels to me like you're grasping at straws here, sry...
The only advantage I found moving from a 40" 4K screen to a 32" 1440p one is that I needed a lower mouse DPI setting, so I was more accurate with snap movements.
That may have been an issue with my mouse, or even a latency issue on the TV (it wasn't exactly made for PC gaming), but it's still relevant.

The main reason people use lower-res monitors is that that's what those older games were designed for... even on a 40" TV you should have seen how tiny everything was at 4K, and that's the real reason 1440p is more popular - UI scaling isn't prevalent yet; even at the operating system/common apps level it's still immature tech.
#57
Vayra86
Mussels: The only advantage I found moving from a 40" 4K screen to a 32" 1440p one is that I needed a lower mouse DPI setting, so I was more accurate with snap movements

The main reason people use the lower res monitors is because
You've just given the answer yourself. A lower res means fewer pixels to aim for, making the chance of hitting the right one(s) much higher. You can also lower the DPI, which makes your hand's movement more pronounced and thus more accurate. It's simply easier to point at a bigger target than at a smaller one.

This has nothing to do with UI scaling; in competitive shooters nobody looks at the UI, you just know what you have at all times (clip size, reloads, etc. are all muscle memory). For immersive games and for strategy and isometric stuff, yes, high res and crappy UI visibility go hand in hand... I can imagine a higher res is nice to have for competitive games such as DOTA, though.
#58
Mussels
Freshwater Moderator
RTS games like DOTA are exactly where it's useless, because you've got a locked perspective/field of view - you can't see farther or any differently.

I've got 1080p, 1440p and 4K in the room with me and direct experience with all of them, even at LAN parties dealing with retro games. Some games work amazingly no matter the resolution (Call of Duty 2 runs with no issues in Windows 10 at 4K, for example) and others are very broken (Supreme Commander works great at 1440p 144 Hz despite its 100 fps cap, but the UI is utterly broken at 4K).

No competitive gamer is going to choose a resolution, practise on it and learn it, and then go to an event with the provided hardware and have all their muscle memory be out of whack.
#59
B-Real
Liviu Cojocaru: If these results are 4 real, it doesn't look bad... still way too expensive for now. If people don't pre-order like crazy the price will definitely go down
In terms of prices, it really looks bad in my opinion.
Caring1: This ^^^ is an example of tech snobbery.
There's nothing wrong with gaming at 1080p.
There is nothing wrong with it. But it is wrong from its foundations when you defend it in the context of a $1,050-1,100 GPU.
#60
BluesFanUK
Still not a genuine 4K 60fps card then.
#61
cucker tarlson
BluesFanUK: Still not a genuine 4K 60fps card then.
This is fake.
#62
medi01
londiste: The GTX 1080 Ti is a $700 card. So should be the RTX 2080.
Remind me of the 980 Ti's price, and when the 1080 Ti started to cost as much.
#63
B-Real
cucker tarlson: This is fake.
Why? If you want every graphics option enabled except for AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games, 4K 60 may be hard to achieve with the same options.
#64
Tsukiyomi91
Fact: 4K 60 fps or higher is still not possible when game devs are putting all sorts of "features" into their games, considering they know how low-level APIs like DX12 or Vulkan work. I will still take 1080p/1440p/ultrawide @ 60+ fps over 4K. Since when is 4K 60 fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60 fps?
#65
cucker tarlson
B-Real: Why? If you want every graphics option enabled except for AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games, 4K 60 may be hard to achieve with the same options.
I mean the leaks are fake.
#66
Berfs1
Darksword: I'd be interested to see what the difference is at equal clock speeds.

The 2080 Ti FE comes overclocked out of the box and should run cooler, which means it can sustain much higher clocks than the stock 1080 Ti FE.
Around 33.2%; ((4352/3584*100)-100)/12*18.6
That comes from the difference in cores and transistors, not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. And given that my numbers aren't off compared to the leaked benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.
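For clarity, here is a sketch of the back-of-envelope arithmetic in the post above. The core and transistor counts are as the poster quotes them, and treating the transistor ratio as a performance multiplier is the poster's own assumption, not an established rule:

```python
# Back-of-envelope estimate from the post above; all figures are as quoted by the poster.
cores_2080ti, cores_1080ti = 4352, 3584                 # CUDA core counts
transistors_2080ti, transistors_1080ti = 18.6, 12.0     # billions of transistors

core_gain_pct = (cores_2080ti / cores_1080ti) * 100 - 100               # ~21.4% more cores
estimate_pct = core_gain_pct / transistors_1080ti * transistors_2080ti  # scaled by transistor ratio
print(f"Estimated uplift: {estimate_pct:.1f}%")  # ~33.2%
```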
#67
notb
Tsukiyomi91: Since when is 4K 60 fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60 fps?
It's not a "standard" == "typical gaming PC offers it", but it's a standard we should go for... and possibly stay there.
Most people don't notice much difference when going beyond either 4K or 60fps (nor are they willing to pay for it).

Of course there will be people who want to play on multiple screens or 144Hz.
And since there's demand, there should be a "halo" product that can do that for a premium. I'd imagine that by 2025 such GPUs will have no problem with 4K@144fps or triple 4K@60fps.

But wouldn't it make sense for the rest of GPU lineup to concentrate on 4K@60fps and differentiate by image quality (features, effects)?
I mean... Nvidia currently makes 10 different models per generation (some in two RAM variants). Do we really need that? :-)
#68
Captain_Tom
Vya Domus: Yet Maxwell was faster than Kepler, on the same 28 nm. Let's not kid ourselves, Nvidia could still have easily doubled performance (or at least come very close to it) if they had dropped the Tensor and RT cores. People aren't spoiled; there have always been huge jumps in performance in the GPU space, that's actually the norm. It's Nvidia that has spoiled itself. Funny how, even when they go all out with the biggest dies they can make, they still chose to make it so that it's not the best they can do.
Bingo. It's a bloody 750 mm^2 die!!! They could have made this a 6,080-SP card with 16 GB on a 512-bit bus and 1 TB/s of bandwidth. It would have been easily capable of 4K 144 Hz.

But instead we get this inefficient, modest upgrade so they can sell defective compute cards to oblivious gamers.
#69
Unregistered
Obviously I'll wait to see an actual comparison from a solid/reliable reviewer like TPU, but if the numbers are accurate, that's incredibly disappointing given the price points - just adding more support for those of us who are planning on skipping this iteration from NVIDIA. Awful pricing, and not nearly enough of a performance jump to justify it...

Looking forward to the TPU review though.
#70
RejZoR
xkm1948: I would not call 30%+ a "tiny bit"

People got spoiled with Maxwell--->Pascal, completely ignoring the fact the previous gen was stuck in 28nm for a long time.

Meanwhile I will keep my pre-order. Any 20XX is better than my current GPU that is for sure.
Leave % out of it, really. I get a severe allergic reaction when people plaster percentages around to make a point and salivate over 10% or 15% differences. Great, 30%. And then you look at the actual framerate difference and it's about 13 fps (example: 45 fps + 30%). 30% sounds like a HUGE difference, but in actual framerate, 13 fps is a difference, but hardly a game-changing one. And that's the best-case scenario; many games may see even lower gains, as is evident from the graphs (fake or not). And then you ask yourself: a 1000€ graphics card for a 13 fps boost? Really? For someone who shits money, sure, every FPS counts. But for rational people, sorry, just no.
#71
cucker tarlson
What are you talking about? The lower the fps, the more noticeable the difference. 45 + 13 is 58, making an unplayable or hard-to-play experience pretty playable again. 100 + 30 fps is not even close to being that impactful. I can't game at 45; 58 is not great but it's fine. 100 and 130 are both super fast, and while I do notice the difference, it's not make-or-break anymore. I went 980 -> 980 Ti -> 1080, I'm used to 20-25%, and as long as it's reflected in minimum fps I can see it very clearly in games.
Dropping percentage comparisons in favor of direct fps comparisons would be an absolutely awful idea.
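A frame-time view makes both sides of this exchange concrete: the same relative uplift saves far more milliseconds per frame at a low baseline than at a high one. A minimal sketch using the illustrative numbers from the two posts above:

```python
# Same ~30% uplift at two baselines; the 45 and 100 fps figures come from the posts above.
for base_fps in (45, 100):
    boosted_fps = base_fps * 1.30
    saved_ms = 1000 / base_fps - 1000 / boosted_fps   # frame-time reduction in milliseconds
    print(f"{base_fps} -> {boosted_fps:.1f} fps: {saved_ms:.1f} ms less per frame")
# 45 -> 58.5 fps saves ~5.1 ms per frame; 100 -> 130 fps saves ~2.3 ms per frame.
```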
#72
Captain_Tom
Berfs1: Around 33.2%; ((4352/3584*100)-100)/12*18.6
That comes from the difference in cores and transistors, not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. And given that my numbers aren't off compared to the leaked benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.
AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
Mussels: For some games, higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 1440p 144 Hz) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
That's definitely true. I noticed a major disadvantage with iron sights when I gamed on my laptop recently (900p).
#73
cucker tarlson
Captain_Tom: AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
Source.
#74
Captain_Tom
cucker tarlson: Source.
Google "7nm Radeon 2018" and filter to news from the past month. TBH it's somewhat of a mixed bag of rumors and facts. What we DO know is that AMD is launching 7 nm Vega this year, and the most recent rumors point to a Vega coming to gaming as well. However, some rumors point to Navi possibly launching this year as well.

The point is, we do know AMD is bringing 7 nm cards THIS year. But it's up in the air whether it is Vega or Navi (or both), and whether it will be a full or a paper launch. I just wouldn't bother with overpriced and soon-to-be-outdated 12 nm products.
#75
cucker tarlson
Point is, your point is misleading. No 7 nm Vega for gamers this year or in early 2019.

www.techpowerup.com/forums/threads/amd-7nm-vega-by-december-not-a-die-shrink-of-vega-10.247006/

AMD at its Computex event confirmed that "Vega 20" will build Radeon Instinct and Radeon Pro graphics cards, and that it has no plans to bring it to the client-segment. That distinction will be reserved for "Navi," which could only debut in 2019, if not later.
How much will 7 nm and 4-stack HBM2 bring to Vega anyway? It's at 1080 performance now. If the 2080 Ti is 1.35-1.4x of the 1080 Ti, which is itself about 1.3x of Vega 64, that's 1.35-1.4 times 1.3x, so roughly 1.8x of Vega 64. 80% faster? Don't think so... 7 nm Vega can probably beat the 16 nm 1080 Ti, though not by much - 2080 performance in other words, and 10-15% faster than the 2070. Worth waiting months for? If you've got money for a GPU now then I don't think so; might as well pick up a 1080 Ti when they drop the price, there won't be much difference from 7 nm Vega.
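For reference, a sketch of the chained-ratio estimate in that last paragraph; both factors (the ~1.3x of the GTX 1080 Ti over Vega 64 and the leaked 1.35-1.4x uplift of the RTX 2080 Ti) are the poster's assumptions:

```python
# Chained-ratio estimate from the post above; every factor here is an assumption, not a measurement.
gtx1080ti_over_vega64 = 1.30          # GTX 1080 Ti assumed to be ~1.3x of Vega 64
rtx2080ti_over_1080ti = (1.35, 1.40)  # leaked RTX 2080 Ti uplift range over the GTX 1080 Ti

low = rtx2080ti_over_1080ti[0] * gtx1080ti_over_vega64
high = rtx2080ti_over_1080ti[1] * gtx1080ti_over_vega64
print(f"Implied RTX 2080 Ti vs. Vega 64: {low:.2f}x to {high:.2f}x")  # roughly 1.8x, as the post says
```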