Wednesday, October 5th 2022

NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

Rather than leaving the numbers to another leak, NVIDIA has itself shared Overwatch 2 performance figures for its upcoming RTX 4090 and 4080 cards. According to NVIDIA, Blizzard had to raise the framerate cap in Overwatch 2 to 600 FPS, as the new cards were simply too fast for the previous 400 FPS cap. NVIDIA also claims that with Reflex enabled, system latency is reduced by up to 60 percent.

As for the performance numbers, the RTX 4090 managed 507 FPS at 1440p in Overwatch 2 at the Ultra graphics quality setting, paired with an Intel Core i9-12900K CPU. The RTX 4080 16 GB manages 368 FPS, with the RTX 4080 12 GB coming in at 296 FPS. The comparison to the previous generation is seemingly a bit unfair, as the highest-end card we get to compare against is an unspecified RTX 3080, which managed to push 249 FPS. It is followed by an RTX 3070 at 195 FPS and an RTX 3060 at 122 FPS. NVIDIA recommends an RTX 4080 16 GB for playing Overwatch 2 at 1440p at 360 FPS and above, and an RTX 3080 Ti for 1080p at 360 FPS and above.
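Taking NVIDIA's figures at face value, the generational ratios are easy to pull out. Below is a quick sketch in Python using only the FPS values quoted above; treat it as back-of-the-envelope math on vendor-supplied numbers, not independent benchmarking.

```python
# NVIDIA's claimed Overwatch 2 FPS at 1440p Ultra, Core i9-12900K system.
claimed_fps = {
    "RTX 4090": 507,
    "RTX 4080 16 GB": 368,
    "RTX 4080 12 GB": 296,
    "RTX 3080": 249,
    "RTX 3070": 195,
    "RTX 3060": 122,
}

baseline = claimed_fps["RTX 3080"]  # highest-end previous-gen card NVIDIA included
for card, fps in claimed_fps.items():
    print(f"{card:>15}: {fps} FPS ({fps / baseline:.2f}x the RTX 3080)")
```

By that math the RTX 4090 lands at roughly 2x the RTX 3080 in this title, which is exactly the kind of flattering ratio a first-party slide tends to highlight.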
Source: NVIDIA

98 Comments on NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

#76
evernessince
The performance gap between the 4090 and the "4080" 12 GB and 4080 16 GB really shows how little Nvidia is giving people below the flagship this time.

Nvidia have managed to create a lineup that makes the crazy pricing of the xx90 tier look good.
TheinsanegamerN: If the tick rate of the servers is only 120 then anything over 120 FPS should be useless, no?

Higher FPS is still beneficial because the server tickrate is not synchronized to when your GPU outputs a frame. In other words, there is a gap between when a frame is generated on your end and when the server ticks. The higher your framerate, the smaller that gap will be.

Higher FPS is beneficial but that benefit is smaller the higher you go.
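To put rough numbers on that, here's a minimal sketch, assuming an unsynchronized 120 Hz tickrate (the figure from the question above, not a confirmed Overwatch 2 value). On average a tick lands halfway through a frame interval, so the newest frame is about half a frame time stale when the server samples it:

```python
# Mean staleness of the newest rendered frame at server tick time, assuming
# the tick clock and the frame clock are unsynchronized, so a tick lands
# (on average) halfway through a frame interval. Toy numbers, not measurements.
TICKRATE = 120  # Hz, taken from the question above; an assumption

print(f"Server tick every {1000.0 / TICKRATE:.2f} ms at {TICKRATE} Hz")
for fps in (120, 240, 360, 500):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:3d} FPS: frame time {frame_time_ms:5.2f} ms, "
          f"mean staleness at tick ~{frame_time_ms / 2:.2f} ms")
```

Going from 120 to 500 FPS cuts that mean gap from roughly 4.2 ms to 1 ms — real, but a clearly shrinking return.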
Posted on Reply
#77
Vayra86
Dirt Chip: Wake me up when we reach 1000 fps.


It's just a name, drop it.
It's a name and a tier/price point, buddy. Wake up. The fact that we get a new x80 that does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for 2-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's just subpar. In Ampere it was capacity, now it's bus width/bandwidth, and it's the most effective way to reinforce planned obsolescence. We can talk all day about magical new cache alleviating memory requirements, but in the end, if you start pushing that memory, it just won't suffice and you'll experience subpar performance. History repeats.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product with the same 'just a name' at twice the amount. This is proof that Nvidia is stalling in gen-to-gen progress, and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not. I called this very thing in every RT topic since Huang announced it: this is going to be bad for us. And here we are now, 3 generations of RT in, still very little to show for it, while everything sensible has escalated into absolute horror: TDP, price, and size. And that's ON TOP of the bandaid to fix abysmal FPS numbers called DLSS. And here's the kicker: it's not even turning out to be great news for Nvidia in terms of share price.

They're down to a third of their peak share price now, with no signs of it turning around.

So sure, I'll drop it now :) No Ada anytime soon, it's no biggie, I still haven't the slightest feeling I'm missing out on anything useful.
Posted on Reply
#78
the54thvoid
Intoxicated Moderator
Vayra86: It's a name and a tier/price point, buddy. Wake up. The fact that we get a new x80 that does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for 2-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's just subpar. In Ampere it was capacity, now it's bus width/bandwidth.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product with the same 'just a name' at twice the amount. This is proof that Nvidia is stalling in gen-to-gen progress, and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not.

So sure, I'll drop it now :) No Ada anytime soon, it's no biggie, I still haven't the slightest feeling I'm missing out on anything useful.
I don't always agree with you, but in this case we have Nvidia trying to justify a fucking 4-slot GPU at $1600 (or whatever) with crazy power consumption, and they're selling performance on a bunch of software implementations like DLSS 3 and other voodoo. No. When my 2080 Ti dies, I'll go AMD, or PS5.

Screw this tech party. The PC master race is dead and Nvidia is killing it. At least, that's my opinion.
Posted on Reply
#79
Anymal
Xaled: And where are the 3090 and 3090 Ti?

Edit: oh! Even the 3080 Ti is not included.
Do you want to put W1zzard out of business? There is a reason we are waiting for legit reviews.
Posted on Reply
#80
Vayra86
the54thvoid: I don't always agree with you, but in this case we have Nvidia trying to justify a fucking 4-slot GPU at $1600 (or whatever) with crazy power consumption, and they're selling performance on a bunch of software implementations like DLSS 3 and other voodoo. No. When my 2080 Ti dies, I'll go AMD, or PS5.

Screw this tech party. The PC master race is dead and Nvidia is killing it. At least, that's my opinion.
PCMR is just reaching logical pinnacles IMHO, and we should stop trying to reinvent things that are just fine. The only reason that happens is disgusting shareholders and commercial considerations. None of it is making gaming more fun or better (it's more likely to do the opposite, looking at monetization in games). At some point, things are just good. I'm still missing that kind of focus on just making cool game concepts, franchises, etc., and that also reflects on the RT push; it's just a graphics effect, not integral to gameplay. That's the stuff indie developers do: not focusing on graphics. PCMR isn't dead, but it needs to readjust its focus. And let's place that perspective in the current time too: climate is an issue; going bigger and more power-hungry is completely counterintuitive and soon an economic fallacy, because the bill to fix it is so much higher (and growing faster). Nvidia is seriously heading for an iceberg here, and that one ain't melting.

In the end, by going console you are literally saying 'I go for content', much more so than by buying a 1600-dollar GPU, that's for sure.
Posted on Reply
#81
Dirt Chip
Vayra86: It's a name and a tier/price point, buddy. Wake up. The fact that we get a new x80 that does so little gen-to-gen is the reason they're respinning their old Ampere stack below the x80 for a while, for example. It's the reason we're overpaying for 2-year-old performance. Meanwhile, the die underneath is just way below expectations for an x80-tier product, while the price is beyond reasonable for that tier. And every time, it's the memory that's just subpar. In Ampere it was capacity, now it's bus width/bandwidth, and it's the most effective way to reinforce planned obsolescence. We can talk all day about magical new cache alleviating memory requirements, but in the end, if you start pushing that memory, it just won't suffice and you'll experience subpar performance. History repeats.

In 2017 the street price of an x80 (with a notably more powerful x80 Ti above it, the 1080 Ti) was about 450~520 EUR. Now we're getting a handicapped product with the same 'just a name' at twice the amount. This is proof that Nvidia is stalling in gen-to-gen progress, and bit off more than it could chew with RT. Are you paying for their fucked-up strategy? I sure as hell am not. I called this very thing in every RT topic since Huang announced it: this is going to be bad for us. And here we are now, 3 generations of RT in, still very little to show for it, while everything sensible has escalated into absolute horror: TDP, price, and size. And that's ON TOP of the bandaid to fix abysmal FPS numbers called DLSS. And here's the kicker: it's not even turning out to be great news for Nvidia in terms of share price.

They're down to a third of their peak share price now, with no signs of it turning around.

So sure, I'll drop it now :) No Ada anytime soon, it's no biggie, I still haven't the slightest feeling I'm missing out on anything useful.
So you say you get less for more? The product is not to your taste for some reason?
The name isn't the performance.
The performance is the performance, and as always: no good, no money. So simple.
I'm with a GTX 970 and will use it for the time to come.
Posted on Reply
#82
Vayra86
Dirt Chip: So you say you get less for more? The product is not to your taste for some reason?
The name isn't the performance.
The performance is the performance, and as always: no good, no money. So simple.
I'm with a GTX 970 and will use it for the time to come.
I'm looking at the market and how it develops; that's the angle of my post, and it also influences my buying decision.
Posted on Reply
#83
Dyatlov A
Nvidia is totally disgusting; for two years they weren't giving us video cards at retail price, just serving miners. Now they want to sell a 4060 Ti under the 4080 name with a super big price tag. I already knew back in 2001 that they were big bastards, when they killed off 3dfx.
Posted on Reply
#84
gffermari
It's funny that the 4090 is the only card of the lineup that has some value.
....stuck with the 2080 Ti, I was hoping to skip the 3000 series.
Maybe I'll wait for the 4080 Ti.

The RDNA3 GPUs are said to use chiplets.
If they keep the same 6800 XT/6900 XT raster performance and double or triple the RT numbers, I'm sold on AMD.
Posted on Reply
#85
AdmiralThrawn
TheLostSwede: Because no 500+ FPS.

Because Nvidia wants you to buy a 500 Hz display?
I absolutely love monitor marketing when they show one specific fps vs. another and blur the shit out of one of them to make the other look better.
Posted on Reply
#86
AusWolf
The most pointless numbers of the year. Why don't they share Half-Life 2 or Quake 3: Arena performance numbers while they're at it?
Posted on Reply
#87
Why_Me
Looking forward to the real reviews. The 4080 16 GB could be a winner, along with the eventual 4080 Ti (20 GB?). Pair one of those cards with a Raptor Lake build for FPS gaming at 1440p and you're good to go.
Posted on Reply
#88
wolf
Performance Enthusiast
Pumper: False, the gap between the 4090 and the 16 GB 4080 shows that the 16 GB should be a 4070 and the 12 GB should be a 4060.
To an extent I barely care what they're called; I care what they're priced at relative to the performance on offer, and that is absolute bollocks for the 4080 series. The 4090 seems better, but it's still ludicrous.

Nvidia is 100% taking advantage of early adopters and this small window they have before AMD launches and while the 30 series sits on shelves. More models will follow, and price adjustments should follow too. It all depends on how greedy AMD wants to be as well, and on that front, sadly, I don't expect much either.
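For what it's worth, you can put rough value numbers on that using NVIDIA's claimed FPS from the article and the announced US launch MSRPs; a sketch, assuming MSRP rather than street pricing:

```python
# NVIDIA's claimed 1440p Overwatch 2 FPS paired with announced US launch
# MSRPs in USD (assumption: launch MSRP, not street price).
cards = {
    "RTX 4090":       (507, 1599),
    "RTX 4080 16 GB": (368, 1199),
    "RTX 4080 12 GB": (296, 899),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price:.3f} claimed FPS per dollar")
```

On vendor numbers alone the three cards land in roughly the same FPS-per-dollar band, which is to say neither 4080 undercuts the flagship on value.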
Posted on Reply
#89
phanbuey
Ravenmaster: If the 4090 is good then why didn't they show the results at 4K resolution?
I don't think there are Reflex monitors at 360 Hz at that resolution yet. The current 4K panels are slow af, with the exception of the Samsung G8, which is a generally subpar VA monitor with scanline issues and QC problems for $1200.
Posted on Reply
#90
onemanhitsquad
Hoopi: 24% performance increase between the 4080 16 GB and 12 GB, 28% performance increase between the 3080 and 3070. This makes me even more convinced that the 4080 12 GB was meant to be the 4070.
JayzTwoCents' video talks about that exact subject.
Posted on Reply
#91
nguyen
Yawn, no one here realizes that the huge FPS, coupled with Nvidia Reflex, offers insanely low input delay (which matters more for e-sports than the FPS itself).

For example, I can't see a difference when FPS is above 120, but I can feel the input delay difference between FPS in the 100s and in the 200s in PUBG.

Here is how Nvidia Reflex vs AMD Anti-Lag stack up in some e-sports titles, including Overwatch
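The frame-time arithmetic behind feeling 100 vs 200 FPS is simple enough to sketch. The ~3-frame pipeline depth below is an illustrative assumption (render-queue depth varies per game, and shrinking it is exactly what Reflex targets), not a measured value:

```python
# Frame time is one of several components of end-to-end input delay.
# PIPELINE_FRAMES is an illustrative assumption, not a measurement.
PIPELINE_FRAMES = 3

for fps in (100, 200):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} FPS: {frame_time_ms:.1f} ms/frame, "
          f"~{PIPELINE_FRAMES * frame_time_ms:.0f} ms of frame-queue latency")
```

Under those assumptions the swing is around 15 ms, which is plausibly within what players can feel even when they can't see the extra frames.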
Posted on Reply
#92
fevgatos
Chrispy_: Has anyone done a serious, controlled study on whether framerates above, say, 200 FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group-tested at 60, 144, and 240 Hz and observed in slo-mo - and yes, 60 Hz was measurably worse, but the difference between 144 and 240 fps is negligible enough to call it margin of error and human variance.

LTT's one featuring Shroud was perhaps the most controlled and in-depth one I've seen, and if an e-sports world champion barely sees any improvement between 144 and 240 Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144 Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100 or 120 fps? It's likely no coincidence that FPS higher than the server tickrate achieves nothing of value.
Let me start off by saying I game on a 120 Hz monitor.

Yes, the difference between 144 and 240 is smaller than between 60 and 144. That much is obvious. I've seen LTT's testing, but it's fundamentally flawed. What you really need to do to figure out whether going from 144 to 240 makes a difference is to play for a week on a 240 Hz monitor and then go back to 144. Then your 144 will look like a snail. Playing on a 144 and then going to a 240, yeah, you are not going to notice much difference, but the moment you go back to your 144 after a week, oh my gawd.

And I'm saying this from personal experience: I played Apex Legends exclusively on a 240 Hz monitor for exactly a week. After going back to my 120, yeah, it seemed terrible.
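The diminishing returns both posts describe fall straight out of the frame-time math; a quick sketch over the refresh rates discussed:

```python
# Frame time saved by each refresh-rate jump shrinks as the rates climb.
rates = [60, 144, 240, 360]
for lo, hi in zip(rates, rates[1:]):
    saved_ms = 1000.0 / lo - 1000.0 / hi
    print(f"{lo:3d} -> {hi:3d} Hz: frame time drops by {saved_ms:.1f} ms")
```

The 60-to-144 jump saves about 9.7 ms per frame, while 144-to-240 saves only 2.8 ms, which is consistent with the difference being easy to see in one direction and easy to miss in the other.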
Posted on Reply
#93
Hofnaerrchen
So this answers the question of why DLSS 3.0 exists and is RTX 4000-only, as well as why RTX 3000 prices are not coming down further: the RTX 4080 12 GB would be a shelf warmer at its current MSRP. Looks like TSMC 4 nm is quite expensive, even if yields are as good as they tell us they are.

Apart from that: if you look up Overwatch 2 performance videos, not even the 3090 Ti manages the numbers NVIDIA is using in its comparison for the 3080. As always, first-party numbers need to be taken with a hefty grain of salt.
Posted on Reply
#94
Lycanwolfen
OOO wow, 1440p. You know most 1080p cards can do 1440p no problem. Try 2160p or 4320p, then I would be impressed.

Nvidia marketing is a joke.

Just for some reference, my two Diamond Monster Voodoo II's in SLI could do over 600 FPS at 1280x1024, and that was in 1999.

After this mega monster, I hope Nvidia learns bigger is not always better.
Posted on Reply
#95
kapone32
Isn't this game 5v5 with PS4 graphics?
Posted on Reply
#97
cvaldes
kapone32: Isn't this game 5v5 with PS4 graphics?
Yes.

Activision Blizzard shrewdly chose minimum system requirements that are fairly inclusive. The game does run on older and relatively modest consoles such as the PS4, Xbox One, and Nintendo Switch.

These older devices have a very significant user base; the Switch alone has sold over 111 million units. Welcoming previous-generation consoles provides a larger pool of players, unlike Valorant (another 5v5), which is Windows-only.

Naturally, this low bar to entry means that PC players can get by with relatively modest systems, like many notebooks.

Unlike its predecessor, Overwatch 2 is free to play and makes its revenue via in-game purchases such as cosmetics and battle passes. Having a large number of active and engaged players is crucial for sustained revenue generation.

From a PC standpoint, it's more important to Activision Blizzard that the game runs well on a GTX 1060, 1050 Ti, and 1050 (the #1, #4, and #9 graphics cards in the latest Steam Hardware Survey, respectively) than on the RTX 4090.
Posted on Reply