Wednesday, October 5th 2022

NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

Rather than suffering yet another leak of performance numbers for its upcoming RTX 4090 and 4080 cards, NVIDIA has shared some Overwatch 2 figures for the new cards itself. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 from 400 FPS to 600 FPS, as the new cards were simply too fast for the previous cap. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.

As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, paired with an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous generation is seemingly a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080, which managed to push 249 FPS, followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p 360 FPS+, and an RTX 3080 Ti at 1080p 360 FPS+.
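Taking NVIDIA's claimed figures at face value, the relative uplifts are easy to sanity-check. A quick sketch — the FPS values below are NVIDIA's own marketing numbers, not independent measurements:

```python
# NVIDIA's claimed Overwatch 2 FPS at 1440p Ultra, Core i9-12900K
fps = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

def uplift(new: str, old: str) -> float:
    """Relative performance gain of `new` over `old`."""
    return fps[new] / fps[old] - 1

print(f"4090 vs 3080:       +{uplift('RTX 4090', 'RTX 3080'):.0%}")        # +104%
print(f"4080 12 GB vs 3080: +{uplift('RTX 4080 12 GB', 'RTX 3080'):.0%}")  # +19%
```

By NVIDIA's own numbers, the 4090 roughly doubles the 3080 here, while the 4080 12 GB is only about 19 percent ahead of it.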
Source: NVIDIA

98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

#1
Xaled
And where are the 3090 and 3090 Ti?

Edit: oh! Even the 3080 Ti is not included.
#2
HaKN !
Nice try Nvidia, im getting 600fps with my 2080Ti & i9-10850k at 1440p, low settings. Ultra settings are not worth the cost in OW
#3
Gungar
HaKN !: Nice try Nvidia, im getting 600fps with my 2080Ti & i9-10850k at 1440p, low settings. Ultra settings are not worth the cost in OW
Yeah, but with the 4090 you would have the privilege of using 150 Watts more and paying only 1,600 dollars.
#4
Hoopi
24% performance increase between 4080 16gb and 12gb, 28% performance increase between 3080 and 3070. This makes me even more convinced that the 4080 12gb was meant to be the 4070.
#5
Dirt Chip
Wake me up when we reach 1000fps.
Hoopi: 24% performance increase between 4080 16gb and 12gb, 28% performance increase between 3080 and 3070. This makes me even more convinced that the 4080 12gb was meant to be the 4070.
It's just a name, drop it.
#6
lesovers
Doing the calculation, the 3080 to 3090 Ti gain @ 1440p is about +18% FPS. We therefore have a 3090 Ti to 4090 gain @ 1440p of about +72% using this game.
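lesovers' extrapolation can be reproduced in a couple of lines. Note the ~18% 3080-to-3090 Ti gap at 1440p is an assumption from typical relative-performance data, not a figure from NVIDIA's slide:

```python
# NVIDIA's claimed 1440p figures
fps_3080 = 249
fps_4090 = 507

# Assumed 3080 -> 3090 Ti uplift at 1440p (~18%, per lesovers)
est_3090ti = fps_3080 * 1.18

gain = fps_4090 / est_3090ti - 1
print(f"Estimated 3090 Ti: {est_3090ti:.0f} FPS")  # 294 FPS
print(f"Implied 4090 gain: +{gain:.0%}")           # +73%
```

Depending on how the 18% figure is rounded, the implied 3090 Ti to 4090 gain lands around +72–73%, matching the estimate above.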
#7
Bwaze
Dirt Chip: It's just a name, drop it.
And half a grand. ;-)
#8
P4-630
The 3090 (Ti) missing in the slides... just because they are still good enough...
#10
Chaitanya
Hoopi: 24% performance increase between 4080 16gb and 12gb, 28% performance increase between 3080 and 3070. This makes me even more convinced that the 4080 12gb was meant to be the 4070.
It's much worse looking at the core count drop from the 4090 on down: the 4080 16 GB was supposed to be the 4070, while the 4080 12 GB was a 4060 Ti/Super.
#11
bobsled
What’s the bet NVIDIA has DLSS enabled and this isn’t native 2560x1440 rendering?
#12
Cippo95
Nvidia then: "4080 is 2-4X Faster than 3080Ti".

Nvidia now: "4080 is 1.2X Faster than 3080".


Maybe they should have said "4080 is 1-4X Faster than 3080Ti"
#13
Ravenmaster
If the 4090 is good then why didn't they show the results at 4K resolution?
#14
Bwaze
Yeah, those presentation figures were all mixed - DLSS 2.X or even no DLSS vs. DLSS 3.0...

But I'm sure that's OK in eyes of fanboys, since they will lock DLSS 3.0 to Ada exclusively, although there is no hardware reason to do so...
#15
spnidel
4080 12gb only 47 more fps compared to 3080 lolololol
#16
GunShot
It's not all roses for Overwatch 2 and shameful AB. Can we say... DDoS attack... uhm... TWICE?! :roll:
#17
Chomiq
3080 Ti / 3090 / 3090 Ti were the sh*t until few months ago and now they're shunned by Nvidia? Or is it just that they would make 4080's look bad?
#18
ratirt
Bwaze: Yeah, those presentation figures were all mixed - DLSS 2.X or even no DLSS vs. DLSS 3.0...

But I'm sure that's OK in eyes of fanboys, since they will lock DLSS 3.0 to Ada exclusively, although there is no hardware reason to do so...
The NV new release smells awfully fishy to me. DLSS 3.0 and the 4080 12 GB/16 GB naming is just manipulation and confusion for the uninformed, who will pay a premium price for something that is a 4070. Don't like it at all.
#19
Outback Bronze
They could have given us something more demanding no?

Why do I need 500fps?
#20
ratirt
Chomiq: 3080 Ti / 3090 / 3090 Ti were the sh*t until few months ago and now they're shunned by Nvidia? Or is it just that they would make 4080's look bad?
NV needs to compare to 3080 since there is a meaningful performance upgrade and justification of the enormous price tag on the 4000 series cards.
#21
Unregistered
Interesting, like most said: no 3090 or 3090 Ti, nor a 6900/6950 XT with SAM, which are usually very fast at resolutions lower than 4K.
The prices of the RTX 4000 series look even more ridiculous.
#22
Bwaze
Well, independent reviewers will of course show the performance results as they are. But they are getting rare, and I'm sure Nvidia will try to flood YouTube with influencers showing results with DLSS 3.0...
#23
TheLostSwede
News Editor
Ravenmaster: If the 4090 is good then why didn't they show the results at 4K resolution?
Because no 500+ FPS.
Outback Bronze: They could have given us something more demanding no?

Why do I need 500fps?
Because Nvidia wants you to buy a 500 Hz display?
#24
Outback Bronze
TheLostSwede: Because Nvidia wants you to buy a 500 Hz display?
Yeah, well I'd like to see Cyberpunk hit that.

How much do you think one of those bad boys would cost too?
#25
Crackong
lol, they pretended the 3090 (Ti) didn't exist?