
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

TheLostSwede

News Editor
Rather than another leak of performance numbers for NVIDIA's upcoming RTX 4090 and 4080 cards, the company has shared some Overwatch 2 performance figures of its own. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 from 400 FPS to 600 FPS, as the new cards were simply too fast for the previous cap. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.

As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, paired with an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous generation is seemingly a bit unfair, as the highest-end comparison we get is an unspecified RTX 3080, which managed to push 249 FPS. This is followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB if you want to play Overwatch 2 at 1440p at 360 FPS+, and an RTX 3080 Ti for 1080p at 360 FPS+.






View at TechPowerUp Main Site | Source
 
Nice try, Nvidia. I'm getting 600 FPS with my 2080 Ti & i9-10850K at 1440p, low settings. Ultra settings aren't worth the cost in OW.
 
Nice try, Nvidia. I'm getting 600 FPS with my 2080 Ti & i9-10850K at 1440p, low settings. Ultra settings aren't worth the cost in OW.

Yeah, but with the 4090 you would have the privilege of using 150 watts more and paying only 1,600 dollars.
 
24% performance increase between the 4080 16 GB and 12 GB, 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12 GB was meant to be the 4070.
 
Wake me up when we reach 1000fps.

24% performance increase between the 4080 16 GB and 12 GB, 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12 GB was meant to be the 4070.
It's just a name, drop it.
 
Yes, doing the calculation: the 3080 to 3090 Ti gain at 1440p is about +18% FPS. That means the 3090 Ti to 4090 gain at 1440p works out to about +72% in this game.
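For anyone wanting to check these figures, here is a quick sketch using the FPS numbers from NVIDIA's slide as quoted in the article. Note the ~+18% 3080-to-3090 Ti figure is the poster's own estimate, not from NVIDIA's slide, so the 3090 Ti number below is an extrapolation:

```python
# NVIDIA's published 1440p Ultra Overwatch 2 figures (FPS), per the article
fps = {
    "RTX 4090": 507,
    "RTX 4080 16GB": 368,
    "RTX 4080 12GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

def gain(new, old):
    """Relative FPS gain of `new` over `old`, in percent."""
    return (fps[new] / fps[old] - 1) * 100

print(f"4080 16GB over 4080 12GB: +{gain('RTX 4080 16GB', 'RTX 4080 12GB'):.0f}%")  # ~24%
print(f"3080 over 3070:           +{gain('RTX 3080', 'RTX 3070'):.0f}%")            # ~28%

# The 3090 Ti isn't in NVIDIA's slide; using the poster's ~+18%
# 3080-to-3090 Ti estimate to extrapolate a hypothetical 3090 Ti figure:
fps_3090ti = fps["RTX 3080"] * 1.18  # ~294 FPS (estimate, not measured)
print(f"4090 over est. 3090 Ti:   +{(fps['RTX 4090'] / fps_3090ti - 1) * 100:.0f}%")  # ~73%
```

The 24% and 28% figures quoted above check out exactly; the 3090 Ti-to-4090 gain lands at roughly +72-73%, depending on rounding of the estimated 3090 Ti baseline.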
 
The 3090 (Ti) is missing from the slides... just because they're still good enough...
 
24% performance increase between the 4080 16 GB and 12 GB, 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12 GB was meant to be the 4070.
It's much worse if you look at the core-count drop from the 4090 down: the 4080 16 GB was supposed to be the 4070, while the 4080 12 GB was a 4060 Ti/Super.
 
What’s the bet NVIDIA has DLSS enabled and this isn’t native 2560x1440 rendering?
 
Nvidia then: "4080 is 2-4X faster than 3080 Ti."
[Image: geforce-rtx-4080-graphics-cards-available-november.jpg]
Nvidia now: "4080 is 1.2X faster than 3080."
[Image: overwatch.jpg]

Maybe they should have said "4080 is 1-4X faster than 3080 Ti."
 
Yeah, those presentation figures were all mixed up - DLSS 2.X or even no DLSS vs. DLSS 3.0...

But I'm sure that's OK in the eyes of the fanboys, since they will lock DLSS 3.0 to Ada exclusively, although there is no hardware reason to do so...
 
The 4080 12 GB is only 47 FPS more than the 3080, lolololol.
 
It's not all roses for Overwatch 2 and shameful AB. Can we say... DDoS attack... uhm... TWICE?! :roll:
 
The 3080 Ti / 3090 / 3090 Ti were the sh*t until a few months ago, and now they're shunned by Nvidia? Or is it just that they would make the 4080s look bad?
 
Yeah, those presentation figures were all mixed up - DLSS 2.X or even no DLSS vs. DLSS 3.0...

But I'm sure that's OK in the eyes of the fanboys, since they will lock DLSS 3.0 to Ada exclusively, although there is no hardware reason to do so...
NV's new release smells awfully fishy to me. Between DLSS 3.0 and the 4080 12 GB vs. 16 GB naming, it's just manipulation and confusion for the uninformed, who will pay a premium price for something that is really a 4070. Don't like it at all.
 
They could have given us something more demanding, no?

Why do I need 500 FPS?
 
The 3080 Ti / 3090 / 3090 Ti were the sh*t until a few months ago, and now they're shunned by Nvidia? Or is it just that they would make the 4080s look bad?
NV needs to compare to the 3080, since then there is a meaningful performance upgrade to justify the enormous price tag on the 4000-series cards.
 
Interesting: like most have said, there's no 3090 or 3090 Ti, and no 6900/6950 XT with SAM either, which are usually very fast at resolutions lower than 4K.
The price of the RTX 4000 series looks even more ridiculous.
 
Well, independent reviewers will of course show the performance results as they are. But they are getting rare, and I'm sure Nvidia will try to flood YouTube with influencers showing results with DLSS 3.0...
 
Because Nvidia wants you to buy a 500 Hz display?

Yeah, well, I'd like to see Cyberpunk hit that.

How much do you think one of those bad boys would cost too?
 