
NVIDIA RTX 2080 Ti Ray-tracing "SOTR" Barely Manages 30-60 FPS at Full HD

btarunr

Editor & Senior Moderator
Perhaps a lot of driver optimization and game patching is still due, but early performance numbers for real-time ray-tracing on NVIDIA's thousand-dollar GeForce RTX 2080 Ti don't look encouraging. German tech publication PCGH tested the enthusiast-segment graphics card on "Shadow of the Tomb Raider," one of the poster children for NVIDIA's upcoming ray-tracing acceleration, and found that with all the eye-candy cranked up, the card barely manages 30 to 60 frames per second at Full HD (1920 x 1080 pixels).

NVIDIA and Eidos (developers of "Shadow of the Tomb Raider") were quick to respond to the PCGH story. They stated that the build of the game demoed at Gamescom is pre-release and the studio is still optimizing it for the NVIDIA GeForce RTX series, and that the GeForce RTX hardware is running on pre-launch beta drivers that have yet to receive "Game Ready" optimizations for SOTR. Catch PCGH's video presentation at the source link below.



 
...I really need to see benchmarks on these cards. Between this and the huge price hikes, Turing cards will have to obliterate Pascal cards in performance to be worth buying.
 
Can't say I am shocked.
 
Next question: what is the performance with RT disabled, and how does it compare to a 1080 Ti?
 
The 1080p claim comes from indirect sources (Shadowplay output, if I remember correctly).
Eidos has said on Twitter that this build is very early and they are still optimizing.

...I really need to see benchmarks on these cards. Between this and the huge price hikes, Turing cards will have to obliterate Pascal cards in performance to be worth buying.
They won't obliterate Pascal. Expect 20-25% faster in non-RT use cases; the Titan V is roughly the performance the 2080 Ti will deliver.
 
They won't obliterate Pascal. Expect 20-25% faster in non-RT use cases; the Titan V is roughly the performance the 2080 Ti will deliver.
With every day that goes by, my thoughts head exactly in this direction.
I am very afraid we are in for something colossally disappointing, for the first time in a long time.
 
Jensen: "JiggaRayz - it just works"... No, it does not, and it will have to be turned off even on the RTX 2080 Ti, even on 1080p@60 Hz monitors. Now that we can get that fluff out of the question, this RTX launch looks more and more like a complete dud. If the RTX 20-series does not give +50-60% over the equivalent GTX cards in RTX-off scenarios, we have got ourselves the worst price/performance GPUs since the peak of the mining craze, and that is awful.
 
Is this a situation where a dual card configuration would help? I know that’s really expensive, but right now it looks like we have to take a big step back in resolution to get the benefits of RT. Is it going to be worth the trade off? Is it just a situation where 5 years from now we’ll have a GPU as big as two 2080Ti’s that will be able to run RT at better than 1080p?
 
Also, the guy who did the video posted that the demo at Jensen's presentation ran on a Quadro 6000; source: the artist's LinkedIn.
 
Is this a situation where a dual card configuration would help? I know that’s really expensive, but right now it looks like we have to take a big step back in resolution to get the benefits of RT. Is it going to be worth the trade off? Is it just a situation where 5 years from now we’ll have a GPU as big as two 2080Ti’s that will be able to run RT at better than 1080p?
SLI in games is pretty much dead and gone, so that's an even bigger waste of time and money.
 
A $1,200 card barely does 60 FPS at 1080p with glorified ray tracing.
Let that sink in for a moment...
 
I'm curious about the differences in RT performance within the RTX series. Looking at NVIDIA's chart, the 2080 is a 20% drop from the 2080 Ti, and the 2070 a 40% drop, in GR/s and RTX-OPS (whatever those are). Will things look uglier, perform worse, or both on the lesser cards?
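Back-of-envelope, if frame rate scaled linearly with those throughput figures (it almost certainly won't, and drivers and patches will move the numbers), PCGH's 30-60 FPS on the 2080 Ti would work out roughly like this:

# Naive linear scaling of PCGH's observed 30-60 FPS (RTX 2080 Ti, 1080p, RT on)
# using the relative deltas from NVIDIA's chart. Purely illustrative arithmetic.
relative_rt_throughput = {
    "RTX 2080 Ti": 1.00,  # baseline
    "RTX 2080":    0.80,  # ~20% drop per the chart
    "RTX 2070":    0.60,  # ~40% drop per the chart
}
fps_low, fps_high = 30, 60  # PCGH's observed range on the 2080 Ti

for card, scale in relative_rt_throughput.items():
    print(f"{card}: ~{fps_low * scale:.0f}-{fps_high * scale:.0f} FPS (naive estimate)")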
 
Next question: what is the performance with RT disabled, and how does it compare to a 1080 Ti?
I think far, far more people are interested in this.

While RT is of more interest to me compared to other nvidia tech, just like with any of the other gamewreck tech the performance hit tends not to be worth it.

Considering the time between the gens and the cost, I am hoping for at least 40% faster. Obviously more is nice, but any lower than that and I will feel robbed.
 
I think far, far more people are interested in this.

While RT is of more interest to me compared to other nvidia tech, just like with any of the other gamewreck tech the performance hit tends not to be worth it.

Considering the time between the gens and the cost, I am hoping for at least 40% faster. Obviously more is nice, but any lower than that and I will feel robbed.

Tiny Huang didn't say a single word about actual performance, which should make it obvious. Clocks haven't changed and the CUDA core count is modestly increased at best. Good luck getting over 20% with Turding.
 
My estimate from a few weeks back, of a 10-20% performance difference from Pascal to Turing with RT off, together with lower efficiency and a much higher price, might turn out to be true after all.
 
I have been called an NVIDIA fanboy more than once, in a derogatory tone. Speaking as someone who has bought only NVIDIA cards since a Radeon HD 4890 probably 10 years ago: this release is a mess. The price is bad, the touted new feature is limping along, and the anticipated performance increase is modest at best.
 
What will be interesting is if AMD's 7 nm-based card has much greater in-game performance than the 2080 Ti, but no focus on ray tracing and a much lower price.
 
SLI in games is pretty much dead and gone, so that's an even bigger waste of time and money.
I think the question is: can RT actually, finally become a possibility with this tech, or will games always outpace it? TR is one of those games that doesn't always need a high frame rate, but there are countless games that do. I guess my point is that RT seems like it might be a possibility, but the market might have to be willing to go backward in resolution for a while for the sake of more accurate lighting. If two 2080 Tis can cure the FPS issues, then maybe an RTX 2580 Ti would be able to handle the task?

What could make RT trivial is that players will need to be able to see and appreciate the differences. As much as I like pretty games, it still comes down to the actual content. I don't just want to look at the scenery, and eventually we just adjust to whatever the graphics are anyway.
 
word:

Hype-turd-shankityskankskank :D
 
The sad thing is there are plenty of truly idiotic individuals out there who will still pay full whack for it, just encouraging NVIDIA to turn further into Intel.
 
I think that performance is about right, because NVIDIA wants people to buy two cards, so 30 FPS for one card with all the bells and whistles will be enough to make people buy two. If I hadn't bought a Titan V, I would have been buying two 2080 Tis.
We really need AMD to release something really fast, to encourage NVIDIA to up the performance.
Otherwise we are going to be milked for every minor FPS increase. :(
 
If RT becomes a thing, NVIDIA will just add more tensor cores. It's currently 8 per SM of 64 cores; make it 1:1, 64 shader cores and 64 tensor cores, and we have a 4K144 ray-tracing monster.
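For scale (a purely hypothetical mix, not anything NVIDIA has announced beyond the Turing per-SM figures above):

# Ratio arithmetic for the per-SM mixes described above.
# The 1:1 layout is hypothetical, not a real NVIDIA SKU.
def tensor_to_shader_ratio(tensor_cores: int, shader_cores: int) -> float:
    return tensor_cores / shader_cores

turing_sm = tensor_to_shader_ratio(8, 64)        # 0.125 -> 1 tensor core per 8 shader cores
hypothetical_sm = tensor_to_shader_ratio(64, 64) # 1.0   -> the 1:1 mix imagined above

print(f"Turing SM: {turing_sm:.3f}, hypothetical 1:1 SM: {hypothetical_sm:.1f} "
      f"({hypothetical_sm / turing_sm:.0f}x the tensor cores per SM)")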
 
What about VR future-proofing? Are these cards prepared for dual 2K-4K screens? To me, that is more important than a few new reflections in the same old, same-as-always gameplay that current flat-screen gaming is all about.

New gameplay requires a new medium, and I don't see any intention of bringing that medium to the forefront.
 
The Volta GPU is miles faster in games, even if it's only 60 FPS... you just can't help but notice it is buttery smooth, more so than a Titan Xp, so Turing should be fine with over 4,000 cores. All we've seen are RTX-on scores; we need to see them without! :toast:
 
...I really need to see benchmarks on these cards. Between this and the huge price hikes, Turing cards will have to obliterate Pascal cards in performance to be worth buying.

No shit, as in I agree and then some. Since when do we get all this hype, know all the stats, and get to pre-order the damned thing with ZERO reviews or even a whiff of performance, minus a few cherry-picked benchmarks? I'm very skeptical of this rollout just for the fact that NVIDIA can literally get people to pre-order these cards... without even knowing what they do in real-life performance. I don't usually pay much attention to the rollout process of video cards, but it doesn't seem like you can usually buy the damn card (especially for over a GRAND) with nothing but hype and hope.

I mean, the only reason we have this unflattering benchmark is that some German publication did it and published it. If the card were worth the big bucks it is priced at... why didn't NVIDIA let any notable benchmark numbers leak for games people actually play?
 