NVIDIA GeForce RTX 5090 3DMark Performance Reveals Impressive Improvements

New gen xx80 not catching the previous xx90 class = Ngreedia showing the middle finger to all of us while... :nutkick:
 
Still a pretty poor jump considering the bus, memory and the extra stuff in the die.
Even if we assume your 35% is close, history clearly shows that would still be in line with gen-on-gen trends.

New gen xx80 not catching the previous xx90 class = Ngreedia showing the middle finger to all of us while... :nutkick:
There's an embarrassing statement..
 
There's an embarrassing statement..
Embarrassing statement? How so? The 2080 was 39% faster than the 1080 and matched the top dog 1080 Ti, the 3080 was 63% faster than the 2080 and beat the 2080 Ti by 36%, and the 4080 was 49% faster than the 3080 while defeating the 3090 with a 30% raster lead. The 5080 = 4080 + 15%, and at least 11% slower than the 4090. This has never happened in the dGPU market on Ngreedia's side. A total generational shitshow :banghead:
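(For reference, the "at least 11% slower than the 4090" figure is just ratio arithmetic; a minimal sketch, assuming the ~30% 4090-over-4080 raster lead cited in the post, with the 4080 normalised to 1.0:)

```python
# Ratio arithmetic behind the "at least 11% slower than the 4090" claim.
# Assumption: the 4090 leads the 4080 by ~30% in raster, per the figure in the post.
perf_4080 = 1.00                 # normalise the 4080 to 1.0
perf_4090 = perf_4080 * 1.30     # ~30% faster than the 4080
perf_5080 = perf_4080 * 1.15     # "5080 = 4080 + 15%"

deficit_vs_4090 = 1 - perf_5080 / perf_4090
print(f"5080 vs 4090: {deficit_vs_4090:.1%} slower")  # ~11.5% slower
```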
 
Embarrassing statement? How so? The 2080 was 39% faster than the 1080 and matched the top dog 1080 Ti, the 3080 was 63% faster than the 2080 and beat the 2080 Ti by 36%, and the 4080 was 49% faster than the 3080 while defeating the 3090 with a 30% raster lead. The 5080 = 4080 + 15%, and at least 11% slower than the 4090. This has never happened in the dGPU market on Ngreedia's side. A total generational shitshow :banghead:
Context is always important. What have you missed?

Also, only people lacking in reasonable maturity use terms like that... Just throwing it out there.
 
Context is always important. What have you missed?


Also, only people lacking in reasonable maturity use terms like that... Just throwing it out there.

Dude, thank you! Finally someone said it. That term should be banned from forums; it would force people to grow up a bit.
 
Ah yes, 3DM benchmarks that hold no water in real-world scenarios.
 
Context is always important. What have you missed?


Also, only people lacking in reasonable maturity use terms like that... Just throwing it out there.
Firstly, I've just given you the "context". Secondly, just looking at the price trajectories, profit margins and market share over the years, the Ngreedia branding stands firm. This company is an eye-watering example of a de facto monopoly and should be broken up under antitrust laws, just like Microsoft and many others. But I know, I know, hell will freeze over sooner; after all, we live in times of oligarchic kleptocracy where rich sociopaths call all the shots.
 
GPU advancement for the last 30 years hasn't been marketed exclusively on fake frames and upscaling.

All the frames are fake, regardless of the technology and techniques used to create them. You didn't think what you're looking at is real, did you?

Looks to me like the 4090 got destroyed :laugh:

Hence AMD’s panic.

Oh well, guess I will take the 30% higher raster, 40% higher RT and the improved DLSS Super Res, Ray Reconstruction and frame pacing (with MFG).

Sucks to have to settle for the best.
 
Embarrassing statement? How so? The 2080 was 39% faster than the 1080 and matched the top dog 1080 Ti, the 3080 was 63% faster than the 2080 and beat the 2080 Ti by 36%, and the 4080 was 49% faster than the 3080 while defeating the 3090 with a 30% raster lead. The 5080 = 4080 + 15%, and at least 11% slower than the 4090. This has never happened in the dGPU market on Ngreedia's side. A total generational shitshow :banghead:
That's a silly way of comparing this. Basically you are punishing Nvidia for having a very fast 4090. Let's say, hypothetically, that the 4090 was as fast as the 4070. Suddenly the 5080 would be wayyy faster than that, but does that make the 5080 a better card? Nope.
 
That's a silly way of comparing this. Basically you are punishing Nvidia for having a very fast 4090. Let's say, hypothetically, that the 4090 was as fast as the 4070. Suddenly the 5080 would be wayyy faster than that, but does that make the 5080 a better card? Nope.
Strange logic, but fine. You're obviously OK waiting 2.5 years to get a 15% performance increase gen to gen on average and zero price-to-performance improvement at the high end.
 
This is what I want to believe… but in reality it's 20-30% in rasterization and 50% in RT, and that's not appealing at all (I don't care about RT at all).

I'll wait for either an RTX 5090 Ti (one that actually pushes rasterization, not just RT) or the RTX 6090.

Strange logic, but fine. You're obviously OK waiting 2.5 years to get a 15% performance increase gen to gen on average and zero price-to-performance improvement at the high end.
The "Apple" tactic, but since it's Nvidia, they get a pass…
 
Strange logic, but fine. You're obviously OK waiting 2.5 years to get a 15% performance increase gen to gen on average and zero price-to-performance improvement at the high end.
That's not what I said.
 
Nah, the point was we used to always get +X% performance per dollar every new generation, with X being beyond 100 a couple of decades ago. The 4090 destroyed the 3090. The 3080 Ti destroyed the 2080 Ti. The 2080 was not only faster than the 1080 Ti, it also had a couple of new features. The 1080 Ti left the 980 Ti with no business.

Today, we get a more expensive GPU with even more ridiculous power draw that is faster by about as much as it costs more. FPS/$ doesn't change for the better, so what is there to be impressed with? nVidia are taking our quid for granted, and this is what happens. Sure, the 5090 will make the 4090 look pathetic at AI workloads, but that's the only field where you don't need an FPS counter to tell the difference between them.

They CAN do better; there's no physical evidence they can't. We haven't run out of nanometres. We haven't run out of silicon. So why try harder when you can be king even without releasing anything?
And this right here is why a lot of people are upset at Nvidia. This is happening in real time, but people are "sweeping it under the rug" and accepting the "very impressive" improvement.
 
It's only in synthetic benchmarks that it's 33% to 35% faster than the RTX 4090; in games it was on average 20% better than the RTX 4090.
 
To be fair, the 4090's gains over the 3090 are "out of the norm" compared to previous releases. The performance increase of Lovelace over Ampere is indeed historic.

I personally am looking forward to the 5090 release, and not only because of the gaming enhancements. We get new tools for video editing and a boost in speed from 32 GB of GDDR7 memory. Handling huge videos will be a breeze. The 4090 was extremely impressive, and the 5090 will be more efficient for many tasks. These cards are also a great investment: I sold both of my 4090s for $1,400 each. Getting roughly 70% or more back to invest in new hardware is excellent in my opinion.

I will make my purchases early in the morning. I'm excited to put these cards to use. I wish I could use my existing Lian Li Strimer Plus V2 cables; I purchased them for both of my systems, but of course they are 12VHPWR. These cables were over 40 bucks apiece, LOL.
 
GPU advancement for the last 30 years hasn't been marketed exclusively on fake frames and upscaling.
I'm not talking about fake frames. I'm talking about that pure 20 to 30 percent improvement on the same node. Marketing, as always, is BS, and gamers always take things for granted. Now many gamers regard Nvidia's Pascal as one of their best series; heck, some even dub it another "mistake" Nvidia did not want to make, like the 1080 Ti. But back then many gamers gave Pascal flak, saying Nvidia was being lazy by just shrinking Maxwell to 16 nm and clocking it to the moon. They also complained about how expensive the 1080 Ti was at $700. Now some people say "I miss when the top GPU only cost $700".

Nah, the point was we used to always get +X% performance per dollar every new generation, with X being beyond 100 a couple of decades ago. The 4090 destroyed the 3090. The 3080 Ti destroyed the 2080 Ti. The 2080 was not only faster than the 1080 Ti, it also had a couple of new features. The 1080 Ti left the 980 Ti with no business.

Today, we get a more expensive GPU with even more ridiculous power draw that is faster by about as much as it costs more. FPS/$ doesn't change for the better, so what is there to be impressed with? nVidia are taking our quid for granted, and this is what happens. Sure, the 5090 will make the 4090 look pathetic at AI workloads, but that's the only field where you don't need an FPS counter to tell the difference between them.

They CAN do better; there's no physical evidence they can't. We haven't run out of nanometres. We haven't run out of silicon. So why try harder when you can be king even without releasing anything?
If such a thing were as easy to do as you imply, then AMD would already have done it and competed with Nvidia head to head for the top performance. Plus, you really love a monopoly, don't you?
 
Impressive?? Where??

Don't forget:

"Across all 18 titles the GeForce RTX 4090 performed on average 77% faster than the GeForce RTX 3090, a 1.8X increase in 4K performance gen on gen."
The 5090 and 4090 are almost on the same TSMC node (4 nm), whereas TSMC 4N was like a two-node jump compared to Samsung 8 nm...

The 4090 has ~56% more CUDA cores, ~48% higher core clocks, ~7.7% more memory bandwidth (GDDR6X) and 12x the L2 cache of the 3090 (72 MB vs 6 MB)!
The 5090 has ~33% more CUDA cores, ~4.4% lower core clocks, ~77.7% more memory bandwidth (GDDR7) and ~22% more L2 cache than the 4090 (88 MB vs 72 MB).

Unless NVIDIA was able to make the CUDA core IPC on Blackwell 40-50% higher than on Lovelace, the same performance jump was never going to happen!
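(A rough back-of-the-envelope check of that point, scaling raw shader throughput as cores × clock and ignoring IPC, bandwidth and cache entirely; the deltas are the ones listed above, so treat this as a crude sketch rather than a performance prediction:)

```python
# Naive shader-throughput scaling (cores x clock) using the spec deltas listed above.
# This ignores IPC, memory bandwidth and cache, so it only indicates how much scaling
# the shader array alone could provide; actual gaming gains land well below it
# (the 4090 was "only" ~77% faster than the 3090 per NVIDIA's own figure quoted above).
ampere_to_lovelace = 1.56 * 1.48            # 4090 vs 3090: ~56% more cores, ~48% higher clocks
lovelace_to_blackwell = 1.33 * (1 - 0.044)  # 5090 vs 4090: ~33% more cores, ~4.4% lower clocks

print(f"3090 -> 4090: ~{ampere_to_lovelace - 1:.0%} more raw shader throughput")     # ~+131%
print(f"4090 -> 5090: ~{lovelace_to_blackwell - 1:.0%} more raw shader throughput")  # ~+27%
```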
 
Hype for cards the vast majority of us can't buy (due to NV deliberately not making anywhere near enough, just to drive hype, demand and prices up) or simply cannot afford... After all, tech sites and YouTubers are being gifted over $4,000 worth of cards over the next few weeks from NV alone, let alone all the AIBs that will also be gifting these cards over the next month. It's a damn good year for them, and they are expected to hype them, at least for now. Expect public statements denying this, which are bound to follow the backlash when folks realise the vast majority of the midrange, a.k.a. affordable, versions of these cards are barely any faster than the previous gen "Super" refreshes; think single-digit percentage points over the 4070 Ti Super and 4070 Super cards!

But onto the performance claims... It cannot be much more than ~33% or so faster, due to not having the hardware resources, unless it's memory bandwidth bottlenecked of course, which synthetic benchmarks often are and most games aren't, reducing these kinds of clickbait articles to an e-peen measuring contest only. The only other thing these cards can do much better at is anything that uses FP4/INT4, which has been unlocked on this refresh (because it is not a new chip; it's just bigger and more of the same with a small handful of bolted-on extras and some software-locked features), but I'm not aware of any games or benchmarks that would be able to take advantage of this yet, unless reviewers have a new build of their benchmarks of course.

I was hyped up for this series, until I found out just how bad it is. It will go down as NV's worst release ever. I sincerely hope they don't make us wait three years for the next new µarch.
 
If the 5090 were priced at $1,600 like the 4090, nobody would complain; the problem is that it is being sold for 25% more for 30-40% more raw performance. DLSS 4 is another story.
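(In perf-per-dollar terms, and using only the figures in that post, ~25% more money for 30-40% more performance works out to just a ~4-12% perf/$ gain; a quick sketch:)

```python
# Perf-per-dollar change implied by the figures in the post above:
# the 5090 sold for ~25% more than the 4090, for 30-40% more raw performance.
price_ratio = 1.25  # 5090 price / 4090 price (~$2,000 vs ~$1,600)

for perf_gain in (0.30, 0.40):
    perf_per_dollar_change = (1 + perf_gain) / price_ratio - 1
    print(f"{perf_gain:.0%} faster -> perf/$ changes by {perf_per_dollar_change:+.0%}")
# 30% faster -> +4%, 40% faster -> +12%
```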
 
Seriously with that? Do you have issues contextually understanding graphs, yes?...

...nevermind..
What did I say that was wrong...? Have you not seen the reviews of the 5090 that came out a few hours after my comment? A geomean performance improvement of roughly 25%; if anything, I overstated the awful perf improvement.

Or did you not understand something? You also do realise that the 5090 is the card with the largest perf uptick? The rest of the cards are mostly going to be single-digit improvements over the 40x0 series.

I was proven right...
 