
Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

btarunr

Editor & Senior Moderator
NVIDIA's performance expectations for the upcoming GeForce RTX 3090 "Ampere" flagship graphics card point to a massive generation-over-generation RTX performance gain. Measured at 4K UHD with DLSS enabled on both cards, the RTX 3090 is shown offering a 100% performance gain over the RTX 2080 Ti in "Minecraft RTX," a greater than 100% gain in "Control," and close to an 80% gain in "Wolfenstein: Youngblood." NVIDIA's GeForce "Ampere" architecture introduces second-generation RTX, according to leaked Gainward spec sheets. This could entail not just more ray-tracing machinery, but also higher IPC for the RT cores. The spec sheets also refer to third-generation tensor cores, which could enhance DLSS performance.



 
Those benchies are looking pretty sexy, but I bet the price won't be.
Note that all 3 games in the chart are running RTX, so this is just raytracing performance, not general rasterization.
Also this is running with DLSS on, and no idea if Ampere has any performance gains in DLSS over Turing either.
 
So the 3-4x RT performance gain rumor was fake after all.
 
So the 3-4x RT performance gain rumor was fake after all.

The only thing real, will be the absurd price.




Also, that 2x will surely now mean that some titles will be playable (fps-wise) with all the RTX features enabled.... others, still not so much.
 
Based on the TPU review of Control, RTX on cuts the 2080 Ti's performance in half at 1080p.


Without DLSS, the 2080 Ti gets about 100 FPS avg at 1080p, and 50 FPS avg at 1080p with RTX.

So the RTX 3090 would get about 100 FPS at 1080p with RTX (>2x from this chart).

That means the RTX 3090 needs at least 200 FPS at regular 1080p, that is, without RTX. Of course, this assumes no RT efficiency improvement. Let's assume there is some major RT efficiency improvement, so instead of retaining 0.5x performance with RTX on, we retain 0.7x. Then the RTX 3090 would be running 133 to 150 FPS without RTX. So, a 30% to 50% performance uplift in non-RTX games. Also, we have 5248 versus 4352 CUDA cores, so the CUDA core increase by itself should give at least 20% more performance in non-RTX.

That is just some quick napkin math. I am leaning more towards a 30%~35% performance increase, but there is a good chance that I am wrong.
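That napkin math can be sketched in a few lines of Python. The FPS figures and the 0.5x/0.7x retention factors are the post's own assumptions (taken from the TPU Control review and the leaked slide), not measured Ampere data:

```python
# Infer the RTX 3090's implied rasterization uplift from the leaked
# 2x RTX-on figure, under different assumptions about how much
# performance RTX on leaves intact. All inputs are assumptions.

TI_RASTER_FPS = 100   # 2080 Ti, Control, 1080p, RTX off (per TPU review)
TI_RT_FPS = 50        # 2080 Ti, Control, 1080p, RTX on
LEAK_GAIN = 2.0       # leaked slide: 3090 doubles RTX-on performance

def implied_raster_fps(rt_retention):
    """FPS the 3090 would need with RTX off, if RTX on retains
    `rt_retention` of raster performance (0.5 = half, like Turing)."""
    rt_fps_3090 = TI_RT_FPS * LEAK_GAIN   # ~100 FPS with RTX on
    return rt_fps_3090 / rt_retention

for retention in (0.5, 0.7):
    raster = implied_raster_fps(retention)
    uplift = raster / TI_RASTER_FPS - 1
    print(f"retention {retention}: {raster:.0f} FPS raster, "
          f"{uplift:.0%} uplift over 2080 Ti")
```

With Turing-like retention (0.5) the implied raster figure is 200 FPS; with improved retention (0.7) it drops to roughly 143 FPS, matching the post's 30% to 50% uplift range.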
 
Ah yes, Control doesn't even give you playable frame rates on a 2080 at 4K, so all this means for Control is that you might get 60 fps at 4K now... but that's a big maybe. I mean, I barely get 60 fps on my system at 1080p... OK, that's without DLSS, but hey, the game is a graphics hog.
 
source: yuten0x (Twitter)

OK... and... is that a reliable source, and not just some random person on the internet?
 
Someone at wccftech (yeah, that site) posted this image.

243565ef1bffb1267a440051f7314e77aac7283417056e4f7733928e50260996.jpg


And there are other examples there too.

So it could be fake.
 
Based on the TPU review of Control, RTX on cuts the 2080 Ti's performance in half at 1080p.


Without DLSS, the 2080 Ti gets about 100 FPS avg at 1080p, and 50 FPS avg at 1080p with RTX.

So the RTX 3090 would get about 100 FPS at 1080p with RTX (>2x from this chart).

That means the RTX 3090 needs at least 200 FPS at regular 1080p, that is, without RTX. Of course, this assumes no RT efficiency improvement. Let's assume there is some major RT efficiency improvement, so instead of retaining 0.5x performance with RTX on, we retain 0.7x. Then the RTX 3090 would be running 133 to 150 FPS without RTX. So, a 30% to 50% performance uplift in non-RTX games. Also, we have 5248 versus 4352 CUDA cores, so the CUDA core increase by itself should give at least 20% more performance in non-RTX.

That is just some quick napkin math. I am leaning more towards a 30%~35% performance increase, but there is a good chance that I am wrong.

I think you missed the 4K resolution with DLSS enabled at the bottom right of the screen.
 
Well at 1440p it means RT might actually be usable
 
Some random Twitter user is now a reliable source?
 
Someone at wccftech (yeah, that site) posted this image.

243565ef1bffb1267a440051f7314e77aac7283417056e4f7733928e50260996.jpg
Maybe. To me it looks like a regular JPEG artifact. It looks the same at the 100% line as well, which makes it consistent to me. This is a leak, after all, so we can't expect the best quality.
 
Someone at wccftech (yeah, that site) posted this image.

243565ef1bffb1267a440051f7314e77aac7283417056e4f7733928e50260996.jpg


And there are other examples there too.

So it could be fake.
While I want to believe what you are saying, there is a possibility it had something to do with the aliasing of the image and multiple pixel rescales.

EDIT: look at every intersecting bar. It isn't fake, it's due to low resolution.
 
Only a day now; I'll just wait for the official ones.
 
So the 3-4x RT performance gain rumor was fake after all.
Not really. That rumor said that the RT performance HIT would be 4x smaller, which might very well be the case, given that things are presented differently here; but to make sure, you would need to do some tedious napkin math. Given that this might very well be fake, I don't see the point. Let's just wait a little for the real tests to come in, and then we'll get our pitchforks :)
 
Not really. That rumor said that the RT performance HIT would be 4x smaller, which might very well be the case, given that things are presented differently here; but to make sure, you would need to do some tedious napkin math. Given that this might very well be fake, I don't see the point. Let's just wait a little for the real tests to come in, and then we'll get our pitchforks :)
Do a Google search for 4x the RTX performance and see what others have seen and said; it's not what you're saying.
It's rumours, so there's that, but it wasn't a 4x smaller performance hit. I don't recall ever seeing that said.
Pitchfork at the ready :p
 
And... keep in mind this is with RTX ON... where the 2080 Ti takes a 40% performance hit because it was first-generation RTX and the implementation wasn't the best.

So I could totally believe that WITH RTX ON, the performance boost between the flagship cards will be double: improve the tensor cores and tweak the implementation so it doesn't take nearly as much of a performance hit, add the rumored 60% performance uplift, and it's very possible to hit those numbers.
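As a rough sketch of how those factors multiply out: assuming (hypothetically) a 60% raster uplift and an RTX-on hit reduced from 40% to 25%, the generational RTX-on gain lands right at 2x. The Ampere hit figure here is an illustrative assumption, not a leaked number:

```python
# How a raster uplift and a smaller RTX-on penalty combine into a
# generational RTX-on gain. TURING_HIT is from the post above; the
# Ampere hit passed in below is a made-up illustrative value.

TURING_HIT = 0.40    # 2080 Ti loses ~40% with RTX on (per the post)

def rtx_on_gain(raster_uplift, ampere_hit):
    """Generational RTX-on gain if Ampere is `raster_uplift` faster
    in raster and only loses `ampere_hit` with RTX enabled."""
    return (1 + raster_uplift) * (1 - ampere_hit) / (1 - TURING_HIT)

# 60% raster uplift, hit cut from 40% to 25%: 1.6 * 0.75 / 0.6 = 2x
print(f"{rtx_on_gain(0.60, 0.25):.2f}x")  # prints 2.00x
```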
 
The hype train has reached its destination and crashed into the station at full speed.
 
You guys are completely forgetting about DLSS 2.0: it basically looks like 1440p or 4K even though it's running the game at 720p or 1080p. Control is the best example, but I have a feeling NVIDIA is all-in on RTX and DLSS 2.0 as a combined feature. Just my guess, but yeah.
 
It wouldn't be surprising that the first revision/update of RTX brought hefty performance gains in that area. I don't think I have any RTX games yet though.
 
Note that all 3 games in the chart are running RTX, so this is just raytracing performance, not general rasterization.
Also this is running with DLSS on, and no idea if Ampere has any performance gains in DLSS over Turing either.
I don't think DLSS is hardware accelerated. It's hardware accelerated on Nvidia's training servers, but not on the GPU itself.
 
Do a Google search for 4x the RTX performance and see what others have seen and said; it's not what you're saying.
It's rumours, so there's that, but it wasn't a 4x smaller performance hit. I don't recall ever seeing that said.
Pitchfork at the ready :p
Well, that's always the way I understood it, but maybe it's just me (and Tom from Moore's Law Is Dead...).

Let's wait and see, but I think that ~50% of that performance gain comes from purely rasterized performance, so only ~50% RT performance gain sounds a bit disappointing indeed, at least if you care a lot about RT and plan on buying a 3090. I'm neither of those two, so I'm keeping calm atm.
 
Wow, so good! No. Who the heck cares about RT?
You guys are completely forgetting about DLSS 2.0: it basically looks like 1440p or 4K even though it's running the game at 720p or 1080p. Control is the best example, but I have a feeling NVIDIA is all-in on RTX and DLSS 2.0 as a combined feature. Just my guess, but yeah.
And what other game supports DLSS 2? Anyway, honestly, I do not want to use lowered settings when I buy an $800-1200 card: I want to use it at native resolution, and I don't want it rendering at a lower resolution that won't match the native one.
 