Wednesday, October 5th 2022

NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

Rather than another leak of performance numbers for the upcoming RTX 4090 and 4080 cards, NVIDIA has itself shared Overwatch 2 figures for the new lineup. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 from 400 FPS to 600 FPS, as the new cards were simply too fast for the previous limit. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.

As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, paired with an Intel Core i9-12900K CPU. The RTX 4080 16 GB managed 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous-generation cards seems a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080 that managed to push 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p above 360 FPS, and an RTX 3080 Ti for 1080p above 360 FPS.
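
For context, here is a minimal sketch (an editorial illustration, not part of NVIDIA's material) converting the quoted average FPS figures into per-frame render times:

```python
# Convert NVIDIA's quoted average FPS figures (from the article above)
# into average per-frame render times.
quoted_fps = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

for card, fps in quoted_fps.items():
    frame_time_ms = 1000 / fps  # average milliseconds spent rendering one frame
    print(f"{card:>15}: {fps:>3} FPS  ~ {frame_time_ms:.2f} ms/frame")
```

At these framerates, the gap between the RTX 4090 and the RTX 3080 works out to roughly 2 ms per frame (about 1.97 ms versus 4.02 ms).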
Source: NVIDIA

98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

#26
awesomesauce
Ravenmaster: If the 4090 is good then why didn't they show the results at 4K resolution?
This.

If you're buying a 4090 and playing at 1440p, you're doing it wrong...

Weird that it's coming from NVIDIA, but their marketing PR sucks.
Posted on Reply
#27
the54thvoid
Intoxicated Moderator
Saying how fast your card can go in a game like Overwatch is like boasting how fast your caravan can go on a downhill slope.

Can't wait for reviews with real world performance.
Posted on Reply
#28
TheoneandonlyMrK
So the 4090 got a PR fluff piece and that's news.

The interesting thing to me is the two 4080s, one of which is pure arse relative to its namesake.

20% is not a small amount, so one had better be 20% cheaper than the other or I'm done with any interest in 4###.
Posted on Reply
#29
TheLostSwede
News Editor
Outback Bronze: Yeah, well I'd like to see Cyberpunk hit that.

How much do you think one of those bad boys would cost too?
Only for E-sports titles...

More than most of us are going to be willing to spend on a 1080p monitor?
Posted on Reply
#30
Outback Bronze
TheLostSwede: Only for E-sports titles...
Professional gaming too?
Posted on Reply
#31
TheLostSwede
News Editor
Outback Bronze: Professional gaming too?
Uhm, what's that?
Posted on Reply
#32
Outback Bronze
TheLostSwede: Uhm, what's that?
Competitive, should I have said?
Posted on Reply
#33
TheLostSwede
News Editor
Outback Bronze: Competitive, should I have said?
Yeah, so E-sports, as they seem to like to call it, so it sounds truly competitive.
Posted on Reply
#34
birdie
This is the first time in 25 years that I'm happy to say,

Fuck NVIDIA.

The pricing is atrocious. The naming is misleading if not outright deceitful.

Also fuck DLSS 3.0. The tech is great but the way they use it to inflate performance figures? Absolutely scammy.

I want the GeForce 40 series sales to tank hard. This is a bloody rip-off.

What's the worst about this situation? AMD will release RDNA 3.0 cards with similar raster/ray tracing performance and will make them just 5-10% cheaper and thus both companies will enjoy insane margins. No competition whatsoever, only a fucking duopoly.
Posted on Reply
#35
Chrispy_
Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms, a mean improvement of 1.5ms and this is taken into consideration alongside the following other variables:
  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
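
A rough sketch of that arithmetic (an editorial illustration, assuming the ~200 FPS baseline and the latency ranges listed above):

```python
# Frame-time gain from pushing ~200 FPS to 500 FPS, set against the other
# latency contributors listed in the post above. Ranges are (best, worst) in ms;
# the "best" human figure uses premeditated timing, the "worst" uses reflex time.
def frame_interval_ms(fps: float) -> float:
    return 1000 / fps

peak_gain = frame_interval_ms(200) - frame_interval_ms(500)  # 5.0 - 2.0 = 3.0 ms
mean_gain = peak_gain / 2                                     # ~1.5 ms on average

other_latency_ms = {
    "server tick interval": (8.3, 8.3),
    "ping/jitter to game server": (10, 25),
    "input sampling": (2, 2),
    "pixel response": (2, 5),
    "human timing / reflex": (20, 120),
}

best_case = sum(best for best, worst in other_latency_ms.values())    # ~42 ms
worst_case = sum(worst for best, worst in other_latency_ms.values())  # ~160 ms

print(f"peak gain {peak_gain:.1f} ms, mean gain {mean_gain:.1f} ms")
print(f"against a best-case latency budget of {best_case:.0f} ms")
print(f"against a worst-case latency budget of {worst_case:.0f} ms")
```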
Posted on Reply
#36
ratirt
TheoneandonlyMrK: So the 4090 got a PR fluff piece and that's news.

The interesting thing to me is the two 4080s, one of which is pure arse relative to its namesake.

20% is not a small amount, so one had better be 20% cheaper than the other or I'm done with any interest in 4###.
I think the 12 GB's price should be in the ballpark of a 3070's, which it obviously will not be.
It seems like NV is using this as a last-resort chance to cash in before it's all over. Weird.
Posted on Reply
#37
awesomesauce
Chrispy_: Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's one ft Shroud was perhaps the most controlled and in-depth one I've seen and if an e-sports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100fps or 120fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms, a mean improvement of 1.5ms and this is taken into consideration alongside the following other variables:
  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
I think the same. Over 240 fps your eyes will not see the difference, and between 120 and 240 it's marginal.

They are showing off in a game where the graphics suck.
Posted on Reply
#38
ratirt
awesomesauce: I think the same. Over 240 fps your eyes will not see the difference, and between 120 and 240 it's marginal.
I can bet there will be people who will tell you they see a difference between 240 Hz and 500 Hz for sure. It would seem that nowadays (it's been happening for some time now) people just want to be considered different or special or better. Freakin' sickness.
Posted on Reply
#39
TheLostSwede
News Editor
ratirt: I can bet there will be people who will tell you they see a difference between 240 Hz and 500 Hz for sure. It would seem that nowadays (it's been happening for some time now) people just want to be considered different or special or better. Freakin' sickness.
13 ms per picture is apparently what's required for us to understand what we've seen.
mollylab-1.mit.edu/sites/default/files/documents/FastDetect2014withFigures.pdf

PC Gamer wrote a piece on FPS a few years ago as well.
www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/
Posted on Reply
#40
Pumper
Hoopi: 24% performance increase between the 4080 16 GB and 12 GB, 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12 GB was meant to be the 4070.
False, the gap between 4090 and 16GB 4080 shows that the 16GB should be a 4070 and the 12GB should be a 4060.
Posted on Reply
#41
TheDeeGee
spnidel: 4080 12 GB only 47 more FPS compared to the 3080 lolololol
lolololol, people still expecting 400% increase every generation.
Posted on Reply
#42
HaKN !
Pumper: False, the gap between the 4090 and 16 GB 4080 shows that the 16 GB should be a 4070 and the 12 GB should be a 4060.
Finally, someone gets it :)
Posted on Reply
#43
mama
Price to performance?
Posted on Reply
#44
nguyen
Man, OW2 is such a shit e-sport, I wonder if anyone would invest in being a pro OW2 player LOL
Posted on Reply
#45
Chrispy_
Pumper: False, the gap between the 4090 and 16 GB 4080 shows that the 16 GB should be a 4070 and the 12 GB should be a 4060.
Agreed.

Nvidia has historically aimed for a 25-30% performance gap between tiers. This has been approximately true going back as far as the Maxwell architecture, about eight years' worth of consistent product segmentation from Nvidia.
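
As a quick check (editorial sketch, not from the post), the adjacent-card gaps implied by the OW2 figures quoted in the article can be computed directly:

```python
# Percentage gap between each card and the next one down, using the
# Overwatch 2 1440p FPS figures quoted in the article above.
quoted_fps = [
    ("RTX 4090", 507),
    ("RTX 4080 16 GB", 368),
    ("RTX 4080 12 GB", 296),
    ("RTX 3080", 249),
    ("RTX 3070", 195),
    ("RTX 3060", 122),
]

for (faster, f_fps), (slower, s_fps) in zip(quoted_fps, quoted_fps[1:]):
    gap_pct = (f_fps / s_fps - 1) * 100
    print(f"{faster} is {gap_pct:.0f}% faster than {slower}")
```

By those numbers the 4090 leads the 4080 16 GB by roughly 38%, while the 16 GB leads the 12 GB by about 24%.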
Posted on Reply
#46
Chomiq
Can't wait to hear GN's take on this, with their "Rainbow Six Siege becomes completely unplayable below 615.3 fps average" meme :D
Posted on Reply
#47
Chrispy_
mama: Price to performance?
Nvidia says no, and changes the topic to something else.
Posted on Reply
#48
ratirt
TheLostSwede: 13 ms per picture is apparently what's required for us to understand what we've seen.
mollylab-1.mit.edu/sites/default/files/documents/FastDetect2014withFigures.pdf

PC Gamer wrote a piece on FPS a few years ago as well.
www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/
Nothing has been mentioned about a profound, visible difference at a 500 Hz refresh rate. People able to perceive anything beyond a 200 Hz refresh rate are rather scarce.
But I can bet people will claim to see a difference with 500 Hz from their first day of purchase. I think that is a bit of a stretch on their side.
Posted on Reply
#49
TheoneandonlyMrK
TheDeeGee: lolololol, people still expecting 400% increase every generation.
Yeah, total madness.

Except Huang declared 2-4x the performance, so they're getting it from the horse's ass, I mean mouth. :D
Posted on Reply
#50
the54thvoid
Intoxicated Moderator
It's an easy enough study to conduct. Gather a whole bunch of participants and subject them to 100, 200, 300, 400 and 500 Hz images without prior knowledge of which refresh rate they are viewing. Gather the subjective results and you have a population sample of sensitivity to high-refresh imagery.
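
A minimal sketch of what such a blinded protocol might look like; the run_session stub is hypothetical and stands in for the actual viewing-and-rating step:

```python
# Each participant views the candidate refresh rates in a random order,
# without being told which is which, and gives a subjective smoothness rating.
import random

REFRESH_RATES_HZ = [100, 200, 300, 400, 500]

def run_session(participant_id: int, rate_hz: int) -> int:
    # Hypothetical stub: in a real study the 1-10 rating would come from
    # the participant after viewing the test footage at rate_hz.
    return random.randint(1, 10)

results = []
for participant in range(1, 31):  # e.g. 30 participants
    order = random.sample(REFRESH_RATES_HZ, k=len(REFRESH_RATES_HZ))
    for rate in order:
        results.append({"participant": participant,
                        "rate_hz": rate,
                        "rating": run_session(participant, rate)})

# Mean rating per refresh rate across the whole sample
for rate in REFRESH_RATES_HZ:
    ratings = [r["rating"] for r in results if r["rate_hz"] == rate]
    print(f"{rate} Hz: mean rating {sum(ratings) / len(ratings):.2f} (n={len(ratings)})")
```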
Posted on Reply