Wednesday, October 5th 2022
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
Rather than leaving it to another leak, NVIDIA has shared official performance numbers for its upcoming RTX 4090 and 4080 cards in Overwatch 2. According to NVIDIA, Blizzard had to raise Overwatch 2's framerate cap from 400 FPS to 600 FPS, as the new cards were simply too fast for the previous limit. NVIDIA also claims that with Reflex enabled, system latency will be reduced by up to 60 percent.
As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, paired with an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to previous-generation cards seems a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080 pushing 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p 360 FPS+, and an RTX 3080 Ti for 1080p 360 FPS+.
Source: NVIDIA
98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers
If you're buying a 4090 and playing at 1440p, you're doing it wrong..
Weird that it's from NVIDIA, but their marketing PR sucks.
Can't wait for reviews with real world performance.
The interesting thing to me is the two 4080s, one of which is pure arse relative to its namesake.
20% is not a small amount, so one better be 20% cheaper than the other or I'm done with any interest in 4###.
More than most of us are going to be willing to spend on a 1080p monitor?
Fuck NVIDIA.
The pricing is atrocious. The naming is misleading if not outright deceitful.
Also fuck DLSS 3.0. The tech is great but the way they use it to inflate performance figures? Absolutely scammy.
I want the GeForce 40 series sales to tank hard. This is a bloody rip-off.
What's the worst about this situation? AMD will release RDNA 3.0 cards with similar raster/ray tracing performance and will make them just 5-10% cheaper and thus both companies will enjoy insane margins. No competition whatsoever, only a fucking duopoly.
LTT's video featuring Shroud was perhaps the most controlled and in-depth test I've seen, and if an e-sports world champion barely sees any improvement between 144 and 240 Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144 Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100 fps or 120 fps. It's likely no coincidence that an FPS higher than the server tickrate achieves nothing of value.
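To put the diminishing-returns point in numbers, here is a minimal back-of-envelope sketch (mine, not from the comment above; the list of refresh rates is just illustrative) showing that frame time falls as 1000/Hz, so each step up the refresh-rate ladder buys a smaller and smaller absolute improvement:

```python
# Illustrative sketch: frame time shrinks as 1000/Hz, so each refresh-rate
# step saves less absolute time than the one before it.
refresh_rates = [60, 100, 120, 144, 240, 360, 500]  # assumed example steps

prev = None
for hz in refresh_rates:
    frame_ms = 1000.0 / hz  # time between frames in milliseconds
    saved = f", saves {prev - frame_ms:.2f} ms vs previous step" if prev else ""
    print(f"{hz:>3} Hz -> {frame_ms:.2f} ms per frame{saved}")
    prev = frame_ms
```

Going from 60 to 100 Hz saves about 6.7 ms per frame; going from 360 to 500 Hz saves less than 0.8 ms.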
I'm curious if there's any real benefit to running north of 200fps whatsoever. At 500fps we're talking about a peak improvement of 3ms and a mean improvement of 1.5ms, and that has to be weighed against the following other variables:
- 8.3ms server tick interval
- 10-25ms median ping/jitter to the game server via your ISP
- 2ms input sampling latency
- 2-5ms pixel response time
- 80-120ms human reflex time, or 20-50ms limitations of human premeditated timing accuracy (i.e., what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
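A quick sketch of that arithmetic, using the commenter's own per-component estimates (the 20-50 ms premeditated-timing alternative is left out for simplicity), to show how small the 200 fps to 500 fps gain is relative to the whole latency chain:

```python
# Latency budget from the comment above: (best_ms, worst_ms) per component.
budget_ms = {
    "server tick interval": (8.3, 8.3),
    "ping/jitter to server": (10.0, 25.0),
    "input sampling": (2.0, 2.0),
    "pixel response": (2.0, 5.0),
    "human reflex": (80.0, 120.0),
}

best = sum(lo for lo, _ in budget_ms.values())
worst = sum(hi for _, hi in budget_ms.values())

# Mean render-latency gain from 200 fps -> 500 fps: half a frame time on
# average, i.e. (1000/200)/2 - (1000/500)/2 = 1.5 ms.
gain = (1000 / 200 - 1000 / 500) / 2

print(f"total chain: {best:.1f}-{worst:.1f} ms")
print(f"mean gain at 500 fps vs 200 fps: {gain:.1f} ms "
      f"({100 * gain / worst:.1f}% of the worst-case chain)")
```

With those figures the chain totals roughly 102-160 ms, and the 1.5 ms mean gain is about 1% of the worst case.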
It would seem like NVIDIA is treating this as a last chance to cash in before it's all over. Weird.
they are showing off in a game where the graphics suck
mollylab-1.mit.edu/sites/default/files/documents/FastDetect2014withFigures.pdf
PC Gamer wrote a piece on FPS a few years ago as well.
www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/
Nvidia has historically aimed for a 25%-30% performance gap between tiers. This has held approximately true going back as far as the Maxwell architecture, roughly 8 years of consistent product segmentation from Nvidia.
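As a rough sanity check against the article's own Overwatch 2 figures (one game at one resolution, so only indicative), here is a minimal sketch computing the gaps between the three announced cards:

```python
# 1440p Ultra Overwatch 2 numbers from NVIDIA's slide, as reported above.
fps = {"RTX 4090": 507, "RTX 4080 16 GB": 368, "RTX 4080 12 GB": 296}

cards = list(fps.items())
for (hi_name, hi_fps), (lo_name, lo_fps) in zip(cards, cards[1:]):
    gap = 100 * (hi_fps / lo_fps - 1)  # percentage lead over the tier below
    print(f"{hi_name} is ~{gap:.0f}% faster than {lo_name}")
```

By these numbers the 4080 12 GB sits ~24% behind the 4080 16 GB, roughly a full traditional tier step despite sharing the 4080 name, while the 4090 leads the 4080 16 GB by ~38%.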
But I can bet people will claim to see a difference with 500 Hz on their first day of purchase. I think that is a bit of a stretch on their side.
Except Huang declared 2-4x the performance, so they're getting it from the horse's ass, I mean mouth. :D