Monday, September 3rd 2018

NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked - Twice

Caveat emptor, take this with a grain of salt, and the usual warnings when dealing with rumors about hardware performance come to mind here foremost. That said, a Turkish YouTuber, PC Hocasi TV, put up and then quickly took down a video going through his benchmark results for the new NVIDIA GPU flagship, the GeForce RTX 2080 Ti, across a plethora of game titles. The results, which you can see by clicking to read the whole story, are not out of line, but some of the titles tested are still in beta (Battlefield 5) or are online shooters with variable performance (PUBG), so there is a second grain of salt needed to season this gravy.

As it stands, 3DCenter.org put together a nice summary of the relative performance of the RTX 2080 Ti compared to the GeForce GTX 1080 Ti from last generation. Based on these results, the RTX 2080 Ti is approximately 37.5% better than the GTX 1080 Ti in average FPS and ~30% better in minimum FPS. These numbers are in line with expectations from hardware analysts, and the timing of these results, so close to the GPU's launch, does lend some credence to them. Adding to this leak is yet another, this time based off a 3DMark Time Spy benchmark, which we will see past the break.
The second leak in question is from an anonymous source who sent VideoCardz.com a photograph of a monitor displaying a 3DMark Time Spy result for a generic NVIDIA graphics device with code name 1E07 and 11 GB of VRAM on board. With a graphics score of 12,825, this is approximately 35% higher than the average score of ~9500 for the GeForce GTX 1080 Ti Founders Edition. This increase in performance matches up closely with the average increase in the game benchmarks above and, if these numbers stand with release drivers as well, then the RTX 2080 Ti brings a decent but not overwhelming performance increase over the previous generation in titles that do not make use of in-game real-time ray tracing. As always, look out for a detailed review on TechPowerUp before making up your minds on whether this is the GPU for you.
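As a quick sanity check on the arithmetic above, here is a minimal Python sketch. The Time Spy scores are the figures quoted in this story (the ~9500 GTX 1080 Ti FE average is itself an approximation), and the per-game FPS pairs in the second part are placeholders to illustrate the method, not the leaked results:

from statistics import geometric_mean

# 1) Time Spy graphics-score uplift, using the figures quoted in this story.
rtx_score = 12825   # leaked score for the unnamed "1E07" board
gtx_score = 9500    # approximate GTX 1080 Ti FE average
print(f"Time Spy uplift: {(rtx_score / gtx_score - 1) * 100:.1f}%")  # ~35.0%

# 2) How a summary like 3DCenter's is typically built: average the per-title
#    FPS ratios (geometric mean), not the raw FPS values. Placeholder data.
fps_pairs = [(82, 60), (110, 80), (95, 70)]  # (new card, old card), hypothetical
ratios = [new / old for new, old in fps_pairs]
print(f"Mean uplift across titles: {(geometric_mean(ratios) - 1) * 100:.1f}%")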
For those interested, screenshots of the first set of benchmarks are attached below (taken from Joker Productions on YouTube):

86 Comments on NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked - Twice

#26
TheinsanegamerN
xkm1948 said:
I would not call 30%+ "tiny bit"

People got spoiled with Maxwell ---> Pascal, completely ignoring the fact that the previous gen was stuck on 28nm for a long time.

Meanwhile I will keep my pre-order. Any 20XX is better than my current GPU, that is for sure.
Kepler --> Maxwell was also a 30%+ jump, and both of those were on 28nm. Arguably, the 30% for the tweaked arch + transistor shrink of Pascal was a disappointment.

People were spoiled by NVIDIA doing a good job of creating faster architectures, not by a single arch change. If RTX 2000 is indeed just a small jump over Pascal, that will be a disappointment on par with the 400 series.
xkm1948 said:
Meh, why buy old stuff?
Because you need an upgrade? Once the 2000 series comes out, it will be 6 months before you can buy one, and signs point to it not being all that much faster.

Getting 85% of that future performance today for 50% of the price is a good deal. It's not like that old tech will just stop working.
Posted on Reply
#27
mastrdrver
The benchmarks are 4K results. Let me know when there are leaks of actual usefulness.
Posted on Reply
#28
RichF
TheinsanegamerN said:
It's not like that old tech will just stop working.
Depends on how much Nvidia gimps the 1080/1080Ti in driver updates.
Posted on Reply
#29
TheinsanegamerN
RichF said:
Depends on how much Nvidia gimps the 1080/1080Ti in driver updates.
Ah, the mythical gimping.

Not optimizing new software for older arches is not gimping. Read a dictionary sometime. Gimping is retroactively LOWERING the performance of a previous product, which has been proven time and time again to be completely false with NVIDIA.

They just no longer optimize for previous generations. AMD did this too, and I'll bet you were not complaining when they stopped optimizing for the 4000 series, or Evergreen, or the 9000 series, etc. The only reason they continue to optimize for GCN is because they have no choice; they are still selling that arch.

The moment GCN is either replaced or updated to a point where it no longer functions similarly to GCN 1, optimization for GCN will stop completely. It's SOP for GPU makers, and has been since the beginning of time.
Posted on Reply
#30
rtwjunkie
PC Gaming Enthusiast
RichF said:
Depends on how much Nvidia gimps the 1080/1080Ti in driver updates.
There comes a time when you cannot improve performance through software anymore... when the driver has eked out the maximum performance and efficiency.
Posted on Reply
#31
Chaboxx
Why wasn't there any benchmark in the presentation? I think NVIDIA knows that the RTX 2080 Ti isn't as fast for its time as the 1080 Ti was in its day....
Posted on Reply
#32
carex
If you see a 30% jump, that can only be possible at 4K because of the GDDR5X vs GDDR6 bandwidth difference.
Otherwise the average difference would be something like 15-20% max.
Posted on Reply
#33
Mussels
Moderprator
For a 1080 like mine to a 2080 Ti, the gain looks worth it.

Just gotta wait til the price is less than three kidneys.
Posted on Reply
#34
enya64
Witcher 3 runs between 57-64 FPS on my Asus 1080 Ti/6700K at 4K on Ultra, with HairWorks only on hair. Am I missing something? I can afford the 2080 Ti, but I think I'd rather spend the money on the i9-9900K, or maybe a fall vacation back to New Orleans for me and my wife. Surely the next generation of GPUs in 2020, with the release of the new consoles, will make much more sense to purchase, since there will be a new baseline of graphics fidelity.
Posted on Reply
#35
sweet
xkm1948 said:
I would not call 30%+ "tiny bit"

People got spoiled with Maxwell ---> Pascal, completely ignoring the fact that the previous gen was stuck on 28nm for a long time.

Meanwhile I will keep my pre-order. Any 20XX is better than my current GPU, that is for sure.
The price has doubled, so 30%+ is still a disappointing improvement.
Posted on Reply
#36
cucker tarlson
Sandbo said:
Especially as 1080 Tis are now significantly cheaper.
Really? Because they're disgustingly expensive here; an RTX 2080 preorder is cheaper. Where are you from?
Posted on Reply
#37
eidairaman1
The Exiled Airman
A potato is faster

Mussels said:
For a 1080 like mine to a 2080 Ti, the gain looks worth it.

Just gotta wait til the price is less than three kidneys.
By that time you'd have a mid-range board that whips both a 2080 Ti and a 1080 Ti...
Posted on Reply
#38
Notea
I've made a comparison between my card, a Strix 1080 OC, and the alleged 2070 or 2080 (not sure which), and it's approximately 25% faster in the graphics benchmark in 3DMark Time Spy. So I'd say a 25% improvement is plausible if we compare clock for clock... and as everyone mentioned, the 2080 Ti FE is factory overclocked, while an overclocked 1080 Ti can gain as much as 10% over reference clocks... maybe more.
Strix 1080 OC Vs 2070/2080
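A rough way to make the clock-for-clock comparison described above is to scale one card's score by the ratio of core clocks before comparing. A minimal Python sketch, assuming linear clock scaling (an approximation; memory bandwidth matters too) and using the nominal FE boost clocks rather than measured ones:

# Naive clock-for-clock normalization: assumes the graphics score scales
# linearly with core clock, which is only a rough approximation.
def clock_normalized_uplift(score_new, clock_new, score_old, clock_old):
    """Percent uplift after scaling the new card down to the old card's clock."""
    scaled_new = score_new * (clock_old / clock_new)
    return (scaled_new / score_old - 1) * 100

# Nominal FE boost clocks: 1635 MHz (RTX 2080 Ti FE) vs 1582 MHz (GTX 1080 Ti FE).
print(f"{clock_normalized_uplift(12825, 1635, 9500, 1582):.1f}%")  # ~30.6%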
Posted on Reply
#39
HD64G
25-30% is the normal performance increase on the (almost) same node. Here, clocks are almost the same and the chips have got much bigger. So, a 35% performance increase is OK, but the price increase isn't at all. NVIDIA did that to let their huge Pascal stock sell out, and because the yields on these big dies are low, the prices have to give them the same profits the small Pascal dies did. The point is that customers get much worse value for money from the next GPU gen, maybe for the first time ever in PC history. And that is what matters for us customers. Fanboys and crazy and/or rich people not included, of course.
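To put rough numbers on that value-for-money point, a minimal sketch, assuming the ~35% uplift from the leaks above and the commonly cited launch prices ($699 GTX 1080 Ti, $1,199 RTX 2080 Ti FE); both inputs are assumptions until reviews land:

# Performance per dollar, normalized to the GTX 1080 Ti. The 1.35x uplift is
# the leaked figure above; prices are launch MSRPs, not street prices.
cards = {
    "GTX 1080 Ti": {"perf": 1.00, "price": 699},
    "RTX 2080 Ti FE": {"perf": 1.35, "price": 1199},
}
for name, c in cards.items():
    print(f"{name}: {1000 * c['perf'] / c['price']:.2f} perf per $1000")
# With these inputs the new card delivers roughly 20% less performance per dollar.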
Posted on Reply
#40
techy1
xkm1948 said:
Meh, why buy old stuff?
Why buy a top-midrange GPU (the 2080) for $900+?

If NVIDIA gets away with this (i.e., these sell well), trust me: the next RTX 3070 will cost $899 for the FE edition, with a nominal base price of $799 (which in the actual market == null). And guess what: yes, AMD will come around, but they won't be like "let's sell our RTX 3070 equivalent for $400 less"; they'll be like "let's adjust to the market and give $50, maybe $100, off." So thanks to all the proud preorderers and the "+30% for a +80% price jump is not that bad, don't hate, if you have the money go for it" crowd, we might have $1,000 midrange GPUs as soon as next gen, even with no mining craze.
Posted on Reply
#41
londiste
Moofachuka said:
I'd upgrade my 1070 to a 2070 if it matches 1080 Ti performance at around the same price as a 1080 Ti... But that won't happen lol...
The GTX 1080 Ti is a $700 card. So should be the RTX 2080 once the release panic is over.

techy1 said:
Why buy a top-midrange GPU (the 2080) for $900+?
Why would you? I see RTX 2080 preorders for €810-€815 here in Euroland. This follows the FE pricing pretty well. Once the initial grab is over, the prices will drop down to MSRP. This happened with both Pascal and Maxwell before. It will happen again. Just a little patience.
Posted on Reply
#42
Mussels
Moderprator
prices will drop over time, and we'll all trickle in our upgrades like usual
Posted on Reply
#43
Prima.Vera
mastrdrver said:
The benchmarks are 4K results. Let me know when there are leaks of actual usefulness.
Why, are you planning to buy this card for 1080p gaming? :laugh::laugh::laugh::laugh::roll:
Posted on Reply
#45
notb
Nkd said:
30% for almost double the price! When you look at it that way it's NVIDIA raping wallets lol! Then again if you got the money go for it. All power to you.
So the performance gain isn't far off 1080 Ti SLI, but it delivers in all games (unlike SLI), uses one slot, and surely doesn't have 2x the 1080 Ti power draw. Not the worst deal.

Sure, the price is Titan-level, but... the card is very Titan-like in other aspects as well: Tensor cores, RT cores...

Maybe it's a shift in the product lineup? RTX could replace Titan and we may still get a GTX 2080. Dropping the GTX brand seemed very fishy from the beginning.
Honestly, I hoped for Tensor and RT cores in budget cards this generation. But having read a bit more about this tech and the alleged R&D put into it, an RTX 2060 suddenly seems much less plausible... an RTX 3060 as well... Damn. :-/
Posted on Reply
#46
Liviu Cojocaru
If these results are for real, it doesn't look bad... still way too expensive for now. If people don't pre-order like crazy, the price will definitely go down.
Posted on Reply
#47
Caring1
Prima.Vera said:
Why, are you planning to buy this card for 1080p gaming? :laugh::laugh::laugh::laugh::roll:
This ^^^ is an example of tech snobbery.
There's nothing wrong with gaming at 1080p.

Darksword said:
I'd be interested to see what the difference is at equal clock speeds.

The 2080 Ti FE comes overclocked out of the box and should run cooler, which means it can sustain much higher clocks than the stock 1080 Ti FE.
Wut!
All rumours say it runs hotter, hence the dual-fan setup.
You an Nvidia shill sent here to dispel that rumour?
Posted on Reply
#48
londiste
Caring1 said:
This ^^^ is an example of tech snobbery.
There's nothing wrong with gaming at 1080p.
No it is not. A 1080 Ti is already overkill for anything 1080p. I mean sure, you get a half-idling GPU with awesome efficiency, but it would not be worth the cost.

Caring1 said:
All rumours say it runs hotter, hence the dual-fan setup.
Everyone is hoping the dual-fan open-air cooler does a better job at keeping the GPU temperature down. Coupled with GPU Boost 4.0, that should translate into higher clock bins.
As far as radiated heat goes, it's going to be close enough to the 1080 Ti, with the same 250 W of power consumption (or 10 W more, at 260 W, for the clock bump).
Posted on Reply
#49
Gungar
TheinsanegamerN said:
Ah, the mythical gimping.

Not optimizing new software for older arches is not gimping. Read a dictionary sometime. Gimping is retroactively LOWERING the performance of a previous product, which has been proven time and time again to be completely false with NVIDIA.

They just no longer optimize for previous generations. AMD did this too, and I'll bet you were not complaining when they stopped optimizing for the 4000 series, or Evergreen, or the 9000 series, etc. The only reason they continue to optimize for GCN is because they have no choice; they are still selling that arch.

The moment GCN is either replaced or updated to a point where it no longer functions similarly to GCN 1, optimization for GCN will stop completely. It's SOP for GPU makers, and has been since the beginning of time.
Lowering the performance of a previous product and stopping optimization for a one-year-old (or even two-year-old) GPU are the same thing. If you pay $800 for a GPU, you should be entitled to at least four years of driver optimization.
Posted on Reply
#50
Caring1
TheinsanegamerN said:
Ah, the mythical gimping.

Not optimizing new software for older arches is not gimping. Read a dictionary sometime. Gimping is retroactively LOWERING the performance of a previous product, which has been proven time and time again to be completely TRUE with NVIDIA....
Fixed that for you.
Gimping is lowering the performance of older hardware with new drivers, compared to older drivers. Perhaps you should read a dictionary sometime.
Optimising new software for old hardware has nothing to do with it; older cards should be at least capable of running at prior levels of performance, not lower, which has been shown many times through forums like this one.
Posted on Reply