Monday, August 31st 2020

Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

NVIDIA's performance expectations for the upcoming GeForce RTX 3090 "Ampere" flagship graphics card point to a massive generation-over-generation RTX performance gain. Measured at 4K UHD with DLSS enabled on both cards, the RTX 3090 is shown offering a 100% performance gain over the RTX 2080 Ti in "Minecraft RTX," a greater than 100% gain in "Control," and close to an 80% gain in "Wolfenstein: Youngblood." NVIDIA's GeForce "Ampere" architecture introduces second-generation RTX, according to leaked Gainward spec sheets. This could entail not just more ray-tracing hardware, but also higher IPC for the RT cores. The spec sheets also refer to third-generation Tensor cores, which could enhance DLSS performance.
Source: yuten0x (Twitter)

131 Comments on Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

#51
Metroid
BoboOOZ
There, I fixed it for you.
Exactly, and that is why they went with 3090 instead of 3080 Ti. If they had chosen 3080 Ti and put a $2,000 price tag on it, people would complain because the 2080 Ti is around $999. Since there was no RTX 2090 to compare against, they can charge whatever they want.
Posted on Reply
#52
Jayp
lynx29
You guys are completely forgetting about DLSS 2.0: it basically looks like 1440p or 4K even though it's running the game at 720p or 1080p. Control is the best example, but I have a feeling Nvidia is all in with RTX and DLSS 2.0 being a combined feature. Just my guess, but yeah.
I hate that Control is the example everyone uses. That game looks like ass regardless of how you run it. It seems purposefully shitty on the visual side so that RT looks like it did something special. Ray tracing is still not worth worrying about on the consumer end; there is so much that can still be done in the traditional graphics world. RT is currently pretty lame: pick from a handful of RT techniques which you want to add to the game to kill performance. If you look at the hardware required so far to produce ray-traced effects and the actual visual outcome, we have a long way to go until this is actually making waves in the gaming world. Every RT implementation I have seen so far could easily be forgotten once you start playing the game and stop staring at the 1-3 new effects. Considering many developers haven't pushed current tech to the max, I find ray tracing in games to be an advertising ploy.

Think about how much those RT and Tensor cores add to the price of these Nvidia GPUs, and think about what they actually do for the gameplay experience. As a 2080 Ti owner, I wish I could have just gotten a version of the card minus the ray tracing hardware.

I really hope AMD implements RT functionality in a way that doesn't drive up the cost of the card and leave useless hardware when RT is off.

Lastly, DLSS would mean so much more if it weren't available in so few titles. DLSS needs to work in all games if consumers are going to pay for it.
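For reference, the upscaling lynx29 describes maps an output resolution down to a lower internal render resolution. A minimal sketch, using the commonly cited per-mode scale factors (the exact factors and mode names below are assumptions, not official NVIDIA values):

```python
# Rough DLSS-style render-resolution math. The per-mode scale factors are
# the commonly cited ones; treat them as assumptions, not official numbers.
DLSS_SCALES = {
    "Quality": 2 / 3,      # ~66.7% of the output resolution per axis
    "Balanced": 0.58,
    "Performance": 0.5,    # e.g. 4K output rendered internally at 1080p
}

def internal_resolution(out_w, out_h, mode):
    """Return the internal resolution the game renders at before upscaling."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

This is why "looks like 4K even though it's running at 1080p" holds in Performance mode: the GPU shades a quarter of the output pixels and the upscaler reconstructs the rest.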
Posted on Reply
#53
Anymal
Step by step, nv is known for that.
Posted on Reply
#54
Shatun_Bear
thepath
You might have a point if the extra video memory were useful. But it won't be.



Many of the games you mention are not DLSS 2.0.
I'm pretty sure that Battlefield, Tomb Raider, FFXV, Metro, etc. are all using DLSS 1.0.

However, more and more games will use DLSS 2.0 in the future, and some of them are big titles like Cyberpunk 2077.
You don't need to damage control for them.

I'm saying 10GB on an $800 graphics card is a joke. For that kind of money you should expect to use it for 3-4 years, and 10GB won't be enough when both new consoles have 16GB, RDNA2 will have 16GB cards for the same price or likely cheaper, and Nvidia will nickel-and-dime everyone with a 20GB 3080 model next month. The only people buying a 10GB $800 graphics card are pretty clueless ones.
Posted on Reply
#55
Searing
Yeah, we all know RT and DLSS are nonsense meant to find a use for datacenter hardware and mask bad performance gains. There are many better ways to get nice visuals. Look at all the upcoming PS5 games like Ratchet and Clank (where RT is basically off and there's no DLSS).

Control is a marketing stunt. A nasty-looking TAA game with no proper reflections without RT; then they turn it on and go "look how great this looks". Compare it to Spider-Man on PS4 to see what an actual game should look like without using RT "features".

It's been sad gaming on PC since the 1080 Ti came out and showed us what an actual video card should be, but now NVIDIA is mostly not gaming-focused.
Posted on Reply
#57
BoboOOZ
Shatun_Bear
You don't need to damage control for them.

I'm saying 10GB on an $800 graphics card is a joke. For that kind of money you should expect to use it for 3-4 years, and 10GB won't be enough when both new consoles have 16GB, RDNA2 will have 16GB cards for the same price or likely cheaper, and Nvidia will nickel-and-dime everyone with a 20GB 3080 model next month. The only people buying a 10GB $800 graphics card are pretty clueless ones.
It's pretty simple, actually: if TFLOPS and bandwidth go up, memory size should go up accordingly. The 10GB will only be good in the future for people who want ultra-high-framerate/low-res gaming.
Posted on Reply
#58
P4-630
BoboOOZ
The 10GB
Should be good enough for ~3 years @ 1440p for me.
Posted on Reply
#59
FreedomEclipse
~Technological Technocrat~
Yeah. Given how the early iterations of RTX were "it just works", I'm gonna wait for the reviews.
Posted on Reply
#61
BoboOOZ
P4-630
Should be good enough for ~3 years @ 1440p for me.
That I agree with, but wouldn't a 3070 suffice for that?
Posted on Reply
#62
P4-630
BoboOOZ
That I agree with, but wouldn't a 3070 suffice for that?
I want some more horsepower/FPS this time by going up a tier this year as well :D (using a 2070 Super at the moment)
However I'll wait for reviews first.
Posted on Reply
#63
medi01
Searing
Compare it to Spider-Man on PS4 to see what an actual game should look like without using RT "features"
Or compare it to PS5 not using "RT features":

Aretak
www.guru3d.com/news-story/faked-nvidia-geforce-3090-slides-are-making-rounds-on-the-www.html

Fake.
I call fake because it's not NVIDIA's style to have charts with 0 as the baseline.



And, frankly, I don't get what "oddities" of JPEG compression are supposed to indicate.
This kind of "leak" doesn't need to modify an existing image; it's plain text and bars.
Posted on Reply
#64
M2B
Searing
Yeah, we all know RT and DLSS are nonsense meant to find a use for datacenter hardware and mask bad performance gains. There are many better ways to get nice visuals. Look at all the upcoming PS5 games like Ratchet and Clank (where RT is basically off and there's no DLSS).

Control is a marketing stunt. A nasty-looking TAA game with no proper reflections without RT; then they turn it on and go "look how great this looks". Compare it to Spider-Man on PS4 to see what an actual game should look like without using RT "features".

It's been sad gaming on PC since the 1080 Ti came out and showed us what an actual video card should be, but now NVIDIA is mostly not gaming-focused.
Too bad for you, Ratchet and Clank uses ray-traced reflections, and the next Spider-Man on PS5 will also use ray tracing :)
Try harder next time.
Posted on Reply
#65
medi01
M2B
Too bad for you, Ratchet and Clank uses ray-traced reflections, and the next Spider-Man on PS5 will also use ray tracing :)
I don't get why people are excited about R&C; to me it looks like last gen.
Posted on Reply
#66
wtfbbqlol
About Unreal Engine 5's global illumination (i.e. the Lumen tech) vs. hardware ray tracing:

According to a Eurogamer interview:
"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware. Lumen uses a combination of different techniques to efficiently trace rays," continues Wright. "Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large-scale light transfer."
So Unreal Engine 5 has figured out a good way to do fast and beautiful indirect lighting via a new "software" ray tracing technique. But this technique appears to be limited to indirect lighting; it may not do reflections or hard shadows (or else Epic would have wanted to show that off too).
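As a toy illustration of tracing against a signed distance field (the "software" ray tracing Wright describes), here is a minimal sphere-tracing sketch. This is my own simplified example, not Epic's code: it uses one analytic SDF rather than Lumen's mix of screen-space, mesh-SDF and voxel traces.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    # Signed distance from point p to the sphere's surface:
    # negative inside, zero on the surface, positive outside.
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4, max_dist=100.0):
    """March along a unit-length ray; each step advances by the SDF value,
    the largest step guaranteed not to overshoot the nearest surface."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t          # hit: distance travelled along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
print(hit)  # ~4.0: the sphere's near surface is 4 units away
```

Because each step only queries a distance field, no triangle-intersection hardware is involved, which is Wright's point about not requiring RT cores.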
Posted on Reply
#67
thesmokingman
john_
Someone at wccftech (yeah, that site) posted this image.



And there are other examples there too.

So it could be fake.
Reminds me of the Madden 20 team using Madden 19 assets and not even bothering to hide it. Like zoom into the stadium and you see it says Madden 19 lol.
Posted on Reply
#68
Chris34
RedelZaVedno
Games that support RTX ray tracing as of now...
1. Amid Evil / 2. Battlefield V / 3. Control / 4. Call of Duty: Modern Warfare / 5. Deliver Us the Moon / 6. Justice Online / 7. JX3 / 8. Mechwarrior 5: Mercenaries / 9. Metro Exodus (and The Two Colonels DLC) / 10. Minecraft / 11. Quake II RTX / 12. Shadow of the Tomb Raider / 13. Wolfenstein: Youngblood

and DLSS 2.0...
1. Cyberpunk 2077 / 2. Death Stranding / 3. F1 2020 / 4. Minecraft / 5. Bright Memory / 6. Mechwarrior V: Mercenaries / 7. Deliver Us The Moon / 8. Control / 9. Wolfenstein: Youngblood / 10. Anthem / 11. Metro Exodus / 12. Battlefield / 13. Shadow of the Tomb Raider / 14. Final Fantasy XV / 15. Monster Hunter: World

Is it worth the price increase over Maxwell/Pascal? NO

Metro Exodus is still DLSS 1.0.
Dunno where this site got its DLSS 2.0 list.
Even the site itself clearly states they don't know, in the comments of the comparison video they uploaded ->
Ozarc Gaming 1 month ago

Most likely 1.0, as the devs haven't taken the time to upgrade the game in a long time. DLSS 1.0 is actually slightly faster in my Mechwarrior tests www.ozarc.games/dlss2-benchmarks-mechwarrior5/ (this could be different for different games though), but 2.0 basically just made it easier for devs to implement compared to 1.0, so in-game speeds are very similar. If you look at the comment section in that article, an NVIDIA dev replied, but when asked why 2.0 was slightly slower he didn't get back to me. Thanks
Edit: I didn't see, there's Final Fantasy XV too that's DLSS 1.0.

This site is just clickbait, ffs.
Posted on Reply
#69
Vya Domus
BoboOOZ
It's pretty simple, actually: if TFLOPS and bandwidth go up, memory size should go up accordingly.
There is absolutely no reason why that would be the case.
Posted on Reply
#70
dragontamer5788
Shatun_Bear
You don't need to damage control for them.

I'm saying 10GB on an $800 graphics card is a joke. For that kind of money you should expect to use it for 3-4 years, and 10GB won't be enough when both new consoles have 16GB, RDNA2 will have 16GB cards for the same price or likely cheaper, and Nvidia will nickel-and-dime everyone with a 20GB 3080 model next month. The only people buying a 10GB $800 graphics card are pretty clueless ones.
Consoles split their RAM between system RAM and VRAM. The PS5 will have 16GB of GDDR6, but a standard computer with a 3080 will have 10GB of VRAM + 16GB of system RAM (easily 20+ GB combined).

With Microsoft Flight Simulator staying below 8GB of VRAM, I find it highly unlikely that any other video game in the next few years will come close to needing more. 10+ GB of VRAM is useful in many scientific and/or niche (e.g. V-Ray) use cases, but I don't think it's a serious issue for any gamer.
Posted on Reply
#71
Hotobu
Metroid
So there we have it: as soon as I saw the image of NVIDIA comparing the 2080 Ti vs. the 3090, it became clear the 3090 was meant to be the 3080 Ti, but for some reason they wanted a different name. In truth, 3090 = 3080 Ti.
I doubt that. If you look at the rumored prices, there's a huge gap between the 3080 and the 3090. I think it's a pretty good bet that a 3080 Ti will come out some months down the line to fill that space.
Posted on Reply
#72
Krzych
Hardware Geek
Wccf is the absolute worst. Just awful people on there in general. The comments section is almost always a bunch of political BS that has nothing to do with the article posted. I don't understand the fanboying either.
I actually found the comment section there to be increasingly funny over time. It used to really annoy me when I was new to all of this and taking it seriously, but now it just feels so stupid that it is no longer triggering and simply comical. It just doesn't seem serious. I'd much rather be concerned about sites like DSOG being completely overrun by chronic malcontents putting 'soy' and 'sjw' in between every word and being 100% serious about it, the name of that website really makes perfect sense now :P WCCF is more like comedy.
Posted on Reply
#73
bug
Krzych
I actually found the comment section there to be increasingly funny over time. It used to really annoy me when I was new to all of this and taking it seriously, but now it just feels so stupid that it is no longer triggering and simply comical. It just doesn't seem serious. I'd much rather be concerned about sites like DSOG being completely overrun by chronic malcontents putting 'soy' and 'sjw' in between every word and being 100% serious about it, the name of that website really makes perfect sense now :p WCCF is more like comedy.
Is it still funny when you realize those guys repeat that in real life and are believed by their less-savvy friends, because they read it on the internet?
Posted on Reply
#74
Searing
M2B
Too bad for you Ratchet and Clank uses Ray Traced reflections and the next spider-man on PS5 will also use ray tracing :)
Try harder next time.
I already covered that. The last Spider-Man is what I was talking about, and Ratchet and Clank is 99 percent not RT rendering. Use your brain. The graphics you see in Ratchet and Clank are due to large increases in rasterization performance.
Posted on Reply
#75
Shatun_Bear
dragontamer5788
Consoles split their RAM between system RAM and VRAM. The PS5 will have 16GB of GDDR6, but a standard computer with a 3080 will have 10GB of VRAM + 16GB of system RAM (easily 20+ GB combined).

With Microsoft Flight Simulator staying below 8GB of VRAM, I find it highly unlikely that any other video game in the next few years will come close to needing more. 10+ GB of VRAM is useful in many scientific and/or niche (e.g. V-Ray) use cases, but I don't think it's a serious issue for any gamer.
No, the next-gen console OS takes only 2.5GB, leaving 13.5GB purely for the GPU (on Series X the memory is split, but it's still 13.5GB for games), which, again, is more than the paltry 10GB on this 3080.

And you can already push past 8GB today; it's going to get much worse once the new consoles are out and developers drop PS4/XB1 for multiplatform titles.
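Putting the two posters' figures side by side (all numbers are the ones quoted in this thread and treated as assumptions; real OS reservations and PC configurations vary):

```python
# Console: one unified memory pool, minus the quoted OS reservation.
console_total_gb = 16.0
console_os_gb = 2.5                                       # Shatun_Bear's figure, assumed
console_game_budget = console_total_gb - console_os_gb    # shared by "CPU" and "GPU" data

# PC: split pools, per dragontamer5788's framing.
pc_vram_gb = 10.0                                         # rumored 3080
pc_system_ram_gb = 16.0                                   # a typical build, assumed
pc_total = pc_vram_gb + pc_system_ram_gb                  # combined, but only 10GB is VRAM

print(console_game_budget, pc_total)  # 13.5 26.0
```

The disagreement in the thread comes down to which number matters: the console's 13.5GB shared budget versus the PC's 10GB of dedicated VRAM backed by system RAM.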
Posted on Reply