Monday, August 31st 2020

Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

NVIDIA's performance expectations for the upcoming GeForce RTX 3090 "Ampere" flagship graphics card underline a massive generation-over-generation RTX performance gain. Measured at 4K UHD with DLSS enabled on both cards, the RTX 3090 is shown offering a 100% performance gain over the RTX 2080 Ti in "Minecraft RTX," a greater than 100% gain in "Control," and close to an 80% gain in "Wolfenstein: Youngblood." NVIDIA's GeForce "Ampere" architecture introduces second-generation RTX, according to leaked Gainward spec sheets. This could entail not just more ray-tracing hardware, but also higher IPC for the RT cores. The spec sheets also refer to third-generation Tensor cores, which could enhance DLSS performance.
Source: yuten0x (Twitter)

131 Comments on Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

#1
Ravenmaster
Those benchies are looking pretty sexy, but I bet the price won't be
#2
Zubasa
Ravenmaster: Those benchies are looking pretty sexy, but I bet the price won't be
Note that all three games in the chart are running RTX, so this is just ray-tracing performance, not general rasterization.
Also, this is running with DLSS on, and we have no idea whether Ampere has any DLSS performance gains over Turing either.
#3
Calmmo
So the 3-4x RT performance gain rumor was fake after all.
#4
XL-R8R
Calmmo: So the 3-4x RT performance gain rumor was fake after all.
The only thing that will be real is the absurd price.

Also, that 2x will surely mean that some titles will now be playable (fps-wise) with all the RTX features enabled... others, still not so much.
#5
xkm1948
Based on TPU's review of Control, RTX on cuts the 2080 Ti's performance in half at 1080p:

www.techpowerup.com/review/control-benchmark-test-performance-nvidia-rtx/4.html

Without DLSS, the 2080 Ti gets about 100 FPS average at 1080p, and about 50 FPS average at 1080p with RTX on.

So the RTX 3090 would get about 100 FPS at 1080p with RTX on (>2x, going by this chart).

That means the RTX 3090 needs at least 200 FPS at regular 1080p, that is, without RTX. Of course, this assumes no RT efficiency improvement. Let's assume there is some major RT efficiency improvement, so instead of a 0.5x performance penalty we have a 0.7x penalty. Then the RTX 3090 would be running 133 to 150 FPS without RTX, so a 30% to 50% performance uplift in non-RTX games. Also, we have 5248 versus 4352 CUDA cores, so the CUDA core increase by itself should give at least 20% more performance in non-RTX titles.

That is just some quick napkin math. I'm leaning more towards a 30-35% performance increase, but there is a good chance that I am wrong.
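
For anyone who wants to poke at that napkin math, here is a minimal Python sketch that just re-runs the figures quoted in the post (the ~100/50 FPS Control numbers from the linked review, the >2x claim from the slide, and the CUDA core counts). The 0.5x and 0.7x RT penalties are the post's own assumed scenarios, not measurements.

```python
# Re-running the napkin math from the post above; all inputs are the
# figures quoted in the post, not new measurements.

turing_raster_fps = 100          # 2080 Ti, Control, 1080p, RTX off (per TPU review)
turing_rtx_fps = 50              # 2080 Ti, Control, 1080p, RTX on
ampere_rtx_fps = 2 * turing_rtx_fps   # ">2x" claim from the slide -> ~100 FPS

# Scenario A: RT penalty unchanged (RTX on runs at 0.5x of raster speed).
ampere_raster_a = ampere_rtx_fps / 0.5            # 200 FPS needed without RTX

# Scenario B: big RT efficiency gain (RTX on runs at 0.7x of raster speed).
ampere_raster_b = ampere_rtx_fps / 0.7            # ~143 FPS without RTX

uplift_a = ampere_raster_a / turing_raster_fps - 1   # +100% raster uplift
uplift_b = ampere_raster_b / turing_raster_fps - 1   # ~+43% raster uplift

# Raw CUDA core increase alone (3090 vs 2080 Ti):
core_uplift = 5248 / 4352 - 1                        # ~+21%

print(f"Scenario A raster uplift: {uplift_a:.0%}")
print(f"Scenario B raster uplift: {uplift_b:.0%}")
print(f"CUDA core count increase: {core_uplift:.0%}")
```

The 0.7x case lands at roughly +43% raster, inside the 30-50% range the post arrives at.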
#6
TheLostSwede
News Editor
Ah yes, Control doesn't even give you playable frame rates on a 2080 at 4k, so all this means for Control is that you might get 60fps at 4K now... But that's a big maybe. I mean, I barely get 60fps on my system at 1080p... Ok, that's without DLSS, but hey, the game is a graphics hog.
#8
ZoneDymo
source: yuten0x (Twitter)

OK... and... is that a reliable source, and not just some random person on the internet?
#9
john_
Someone at wccftech (yeah, that site) posted this image.

And there are other examples there too.

So it could be fake.
#10
Chris34
xkm1948: Based on TPU's review of Control, RTX on cuts the 2080 Ti's performance in half at 1080p:

www.techpowerup.com/review/control-benchmark-test-performance-nvidia-rtx/4.html

Without DLSS, the 2080 Ti gets about 100 FPS average at 1080p, and about 50 FPS average at 1080p with RTX on.

So the RTX 3090 would get about 100 FPS at 1080p with RTX on (>2x, going by this chart).

That means the RTX 3090 needs at least 200 FPS at regular 1080p, that is, without RTX. Of course, this assumes no RT efficiency improvement. Let's assume there is some major RT efficiency improvement, so instead of a 0.5x performance penalty we have a 0.7x penalty. Then the RTX 3090 would be running 133 to 150 FPS without RTX, so a 30% to 50% performance uplift in non-RTX games. Also, we have 5248 versus 4352 CUDA cores, so the CUDA core increase by itself should give at least 20% more performance in non-RTX titles.

That is just some quick napkin math. I'm leaning more towards a 30-35% performance increase, but there is a good chance that I am wrong.
I think you missed the 4k resolution with DLSS enabled at the bottom right of the screen.
#11
Calmmo
Well, at 1440p it means RT might actually be usable.
#12
Chomiq
Some random Twitter user is now a reliable source?
#13
SL2
john_: Someone at wccftech (yeah, that site) posted this image.

Maybe. To me it looks like a regular JPEG artifact. It looks the same at the 100% line as well, which makes it consistent to me. This is a leak, after all, so we can't expect the best quality.
#14
Berfs1
john_: Someone at wccftech (yeah, that site) posted this image.

And there are other examples there too.

So it could be fake.
While I want to believe what you are saying, there is a possibility it has something to do with aliasing of the image and multiple pixel rescales.

EDIT: Look at every intersecting bar. It isn't fake; it's due to the low resolution.
#15
ViperXTR
Only a day now; I'll just wait for the official ones.
#16
BoboOOZ
Calmmo: So the 3-4x RT performance gain rumor was fake after all.
Not really, that rumor said the RT performance HIT would be 4x smaller, which might well still be the case; the figures here are presented differently, so to make sure you would need to do some tedious napkin math. Given that this might very well be fake, I don't see the point. Let's just wait a little for the real tests to come in, and then we'll get our pitchforks :)
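
A toy calculation makes the difference between those two readings concrete; the 100 FPS raster baseline and the 50% Turing RT hit below are made-up illustrative figures, not numbers from the slide or the rumour.

```python
# Toy comparison of the two readings of the "4x" rumour.
# The 100 FPS baseline and 50% Turing RT hit are invented for illustration only.

raster_fps = 100.0
turing_rt_hit = 0.50                                   # Turing loses 50% with RT on
turing_rt_fps = raster_fps * (1 - turing_rt_hit)       # 50 FPS

# Reading 1: "4x the RT performance" -> 4x the RT-on frame rate.
reading_1 = 4 * turing_rt_fps                          # 200 FPS (needs big raster gains too)

# Reading 2: "the RT HIT is 4x smaller" -> the hit shrinks from 50% to 12.5%.
ampere_rt_hit = turing_rt_hit / 4
reading_2 = raster_fps * (1 - ampere_rt_hit)           # 87.5 FPS at the same raster speed

print(reading_1, reading_2)   # 200.0 vs 87.5 -- very different claims
```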
#17
TheoneandonlyMrK
BoboOOZ: Not really, that rumor said the RT performance HIT would be 4x smaller, which might well still be the case; the figures here are presented differently, so to make sure you would need to do some tedious napkin math. Given that this might very well be fake, I don't see the point. Let's just wait a little for the real tests to come in, and then we'll get our pitchforks :)
Do a Google search for "4x the RTX performance" and see what others have seen and said; it's not what you're saying.
It's rumours, so there's that, but it wasn't a 4x smaller hit in performance. I don't recall ever seeing that said.
Pitchfork at the ready :p
#18
phanbuey
And... keep in mind this is with RTX ON, where the 2080 Ti takes a ~40% performance hit because it was first-generation RTX and the implementation wasn't the best.

So I could totally believe that, WITH RTX ON, the performance boost between the flagship cards will be double: improve the tensor cores, tweak the implementation so it doesn't take nearly as much of a performance hit, add the rumored 60% performance uplift, and it's very possible to hit those numbers.
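
As a rough sketch of how those two effects could stack up to a 2x RTX-on gain: the 40% Turing hit and the 60% raster uplift are the figures mentioned above, while the 25% Ampere hit is an arbitrary placeholder for "a smaller hit", not a leaked number.

```python
# Rough stacking of the two effects described in the post above.
# 40% Turing RT hit and 60% rumoured raster uplift come from the post;
# the 25% Ampere RT hit is an arbitrary placeholder.

turing_raster = 100.0
turing_rt = turing_raster * (1 - 0.40)          # 60 FPS with RTX on

ampere_raster = turing_raster * 1.60            # +60% rumoured raster uplift
ampere_rt = ampere_raster * (1 - 0.25)          # 120 FPS with a smaller RT hit

print(ampere_rt / turing_rt)                    # 2.0 -> a 2x RTX-on gain
```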
#19
Dirt Chip
The hype train has reached its destination and crashed into the station at full speed.
#20
Space Lynx
Astronaut
You guys are completely forgetting about DLSS 2.0: it basically looks like 1440p or 4K even though it's running the game at 720p or 1080p. Control is the best example, but I have a feeling NVIDIA is all in on RTX and DLSS 2.0 as a combined feature. Just my guess, but yeah.
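
Simple pixel-count arithmetic (nothing DLSS-specific) shows why rendering at a lower internal resolution frees up so much performance to spend on RT:

```python
# Pixel counts for the internal vs output resolutions mentioned above.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Rendering internally at 720p and outputting 1440p shades 1/4 of the pixels;
# 1080p -> 4K is likewise a 4x reduction in shaded pixels.
print(pixels["1440p"] / pixels["720p"])   # 4.0
print(pixels["4K"] / pixels["1080p"])     # 4.0
```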
#21
xorbe
It wouldn't be surprising if the first revision/update of RTX brought hefty performance gains in that area. I don't think I have any RTX games yet, though.
#22
bug
Zubasa: Note that all three games in the chart are running RTX, so this is just ray-tracing performance, not general rasterization.
Also, this is running with DLSS on, and we have no idea whether Ampere has any DLSS performance gains over Turing either.
I don't think DLSS is hardware accelerated. It's hardware accelerated on Nvidia's training servers, but not on the GPU itself.
#23
BoboOOZ
theoneandonlymrk: Do a Google search for "4x the RTX performance" and see what others have seen and said; it's not what you're saying.
It's rumours, so there's that, but it wasn't a 4x smaller hit in performance. I don't recall ever seeing that said.
Pitchfork at the ready :p
Well, that's always the way I understood it, but maybe it's just me (and Tom from Moore's Law Is Dead...).

Let's wait&see, but I think that 50% of that performance gain comes from purely rasterized performance, so only ~50% RT performance gain sounds a bit disappointing, indeed, at least if you care a lot about RT and you plan on buying a 3090. I'm neither of those 2, so I'm keeping calm atm.
#24
B-Real
Wow, so good! No. Who the heck cares about RT?
lynx29: You guys are completely forgetting about DLSS 2.0: it basically looks like 1440p or 4K even though it's running the game at 720p or 1080p. Control is the best example, but I have a feeling NVIDIA is all in on RTX and DLSS 2.0 as a combined feature. Just my guess, but yeah.
And what other games support DLSS 2.0? Anyway, honestly, I do not want to use lowered settings when I buy an $800-1200 card: I want to use it at native resolution, not at a lower resolution that won't match the native one.
#25
BoboOOZ
bug: I don't think DLSS is hardware accelerated. It's hardware accelerated on Nvidia's training servers, but not on the GPU itself.
It's hardware accelerated on both sides: the neural network is trained at NVIDIA and is then put to work in real time on each GPU to "guess" how to extrapolate those images.
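
A minimal sketch of that split, assuming nothing about the real DLSS network or API: the "upscaler" below is just a least-squares matrix fit standing in for the trained model, with training done once offline and only the cheap forward pass run per frame on the local GPU.

```python
# Toy illustration of the train-once / infer-per-frame split described above.
# The "model" is a single least-squares matrix, not the actual DLSS network.
import numpy as np

def train_upscaler(lo_res_samples, hi_res_samples):
    # Offline step (NVIDIA's training servers in the real pipeline):
    # fit weights mapping low-res frames to high-res frames.
    weights, *_ = np.linalg.lstsq(lo_res_samples, hi_res_samples, rcond=None)
    return weights

def upscale_frame(weights, lo_res_frame):
    # Per-frame step (the part that runs on the local GPU every frame,
    # on Tensor cores in the real implementation): a single forward pass.
    return lo_res_frame @ weights

rng = np.random.default_rng(0)
lo = rng.random((8, 16))      # 8 fake training frames, 16 "pixels" each
hi = rng.random((8, 64))      # matching 64-"pixel" high-res targets
weights = train_upscaler(lo, hi)            # done once, ahead of time

frame = rng.random((1, 16))                 # one new low-res frame
print(upscale_frame(weights, frame).shape)  # (1, 64) -- done every frame locally
```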