
NVIDIA Shares RTX 4090 Overwatch 2 Performance Numbers

It's an easy enough study to conduct. Gather a whole bunch of participants and show them 100, 200, 300, 400 and 500 Hz images without telling them beforehand what refresh rate they're viewing. Collect their subjective impressions and you have a population sample of sensitivity to high-refresh imagery.
 
I've seen a similar study conducted by one of the YT channels. Most of the participants were considered gamers and could only guess at which refresh rate they had been playing; the range was from 30 to 144 Hz. Also, as the article said, even if you can see the difference, it most likely won't do anything for your playing ability.
 
The inclusion of the 3080 Ti would indicate that the 4080 12 GB is the 4070. But the 192-bit bus really makes it a 4060, so the 4050 becomes the 4060 instead. Scares me to think it will cost what I had set aside for the 4070; Jensen is one hell of a player.
 
Nvidia didn't specify which 3080 they used, so it could be a 10 GB card for all we know.
 
I won't count on one game's results to conclude how "good" RTX 4000 will be. There is no doubt that we should see a good bump in performance, but we also know that performance doesn't scale proportionally with increased CUDA cores and/or clock speed. A 60 to 70% bump in performance is likely the best-case scenario for a generational improvement. Those 2 to 4x performance claims are just fluff, and only a result of DLSS 3. If the numbers are so good, why not start with an apples-to-apples comparison of rasterization performance before going into the fluff?
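As a rough illustration of why paper specs overstate real gains, here's a quick Python sketch. The core counts and boost clocks are approximate figures from the public spec sheets, and treating performance as cores x clock is a deliberately naive assumption:

```python
# Naive "paper" scaling vs. a realistic generational uplift.
# Spec figures are approximate values from public spec sheets.

cards = {
    "RTX 3080 Ti": {"cuda_cores": 10240, "boost_mhz": 1665},
    "RTX 4090":    {"cuda_cores": 16384, "boost_mhz": 2520},
}

old, new = cards["RTX 3080 Ti"], cards["RTX 4090"]

# Pretend performance scales perfectly with both shader count and clock speed.
naive_scaling = (new["cuda_cores"] / old["cuda_cores"]) * (new["boost_mhz"] / old["boost_mhz"])
print(f"Naive cores x clock scaling: {naive_scaling:.2f}x")  # ~2.4x on paper

# Real games rarely come close to paper scaling; as noted above,
# a 60-70% uplift would already be a strong generation.
print("Realistic generational bump: ~1.6-1.7x")
```

The gap between the ~2.4x "paper" number and a realistic 1.6-1.7x uplift is exactly why single-game marketing slides deserve skepticism.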
 
So many uninformed people just jumping on the bandwagon and spreading fake news. Please fact-check before posting.

Overwatch and Overwatch 2 do not support DLSS. All of those fps gains are pure rasterization, with no AI image upscaling.

 
 
Wake me up when we reach 1000fps.


It's just a name, drop it.

Actually it's a further devaluing of the xx70 series, so more people get less performance. But let's hope their stock price recovers!
 
Has anyone done a serious, controlled study on whether framerates above, say, 200FPS really make a difference? I've seen a few videos where esports titles like CS:GO were group tested at 60, 144, 240Hz and observed in slo-mo - and yes, 60Hz was measurably worse but the difference between 144 and 240fps is negligible enough to call margin of error and human variance.

LTT's test featuring Shroud was perhaps the most controlled and in-depth one I've seen, and if an esports world champion barely sees any improvement between 144 and 240 Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144 Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100 or 120 fps? It's likely no coincidence that an FPS higher than the server tickrate achieves nothing of value.

I'm curious if there's any real benefit to running north of 200 fps whatsoever. At 500 fps we're talking about a peak improvement of 3 ms and a mean improvement of 1.5 ms, and that has to be weighed against the following other variables:

  1. 8.3ms server tick interval
  2. 10-25ms median ping/jitter to the game server via your ISP
  3. 2ms input sampling latency
  4. 2-5ms pixel response time
  5. 80-120ms human reflex time
    or
    20-50ms limitations of human premeditated timing accuracy (ie, what's your ability to stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still 42ms of unavoidable latency. Throwing several thousand dollars at hardware to support 500+ FPS does seem to me like a fool's errand just to gain a mean improvement of 1.5ms in the best possible case scenario, and worst case you've spent all that for 1.5ms out of 160ms.
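To put those numbers side by side, here's a back-of-the-envelope sketch in Python. It only restates the best-case figures listed above, so treat it as a rough illustration rather than a measurement:

```python
# Back-of-the-envelope latency budget, using the rough figures from the list above.

def frame_time_ms(fps: float) -> float:
    """Average interval between rendered frames, in milliseconds."""
    return 1000.0 / fps

# Going from 200 fps to 500 fps
gain_peak = frame_time_ms(200) - frame_time_ms(500)  # ~3.0 ms best case
gain_mean = gain_peak / 2                            # ~1.5 ms on average

# Best-case values for everything else in the chain (from the list above)
latency_stack_ms = {
    "server tick interval": 8.3,           # ~120 Hz tick
    "ping/jitter to server": 10.0,         # 10-25 ms typical
    "input sampling": 2.0,
    "pixel response": 2.0,                 # 2-5 ms
    "premeditated timing accuracy": 20.0,  # 20-50 ms
}

total = sum(latency_stack_ms.values())
print(f"Frame-time gain, 200 -> 500 fps: {gain_mean:.1f}-{gain_peak:.1f} ms")
print(f"Everything else in the chain:    ~{total:.0f} ms")
print(f"Share of the total budget:       ~{100 * gain_mean / (total + gain_mean):.0f}%")
```

Even with best-case values everywhere else, the extra frames buy only a few percent of the total input-to-reaction chain.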
Nice comment.

You also have to add:
what kind of monitor is used, AND
what size of monitor is used.

I did a comparison playing Overwatch in a 27-inch window and a 32-inch window on my monitor at 1440p and 165 Hz with an AMD 5700. The difference in size was noticeable in the data, but actual playing performance? In my case no, not really.

Since you mentioned tick rate: OW tried to pull a fast one when it came out in 2016, when the game was being run on its legacy servers at a tick rate of 24. A loud riot ensued, and I believe it went up to 120 because of all the complaints.

Finally, NGreedia can talk all they want about these fantastic numbers, but if the game and its netcode can't use them, they won't be used.
 
The 4090 is only for those who think they can see the difference between 500 and 240 fps.
If you're not one of those people, the 4090 is not for you.
Please don't buy it and leave it for those who can.
 
I don't have too many games in my library that utilize DLSS. DLSS and FSR don't interest me; they are not the parts of the reviews I look at. I can't wait for the rasterization performance comparisons. You can't really expect people to be excited about DLSS or FSR when most games don't utilize them; right now it's just niche software, think Nvidia PhysX, until they're more widely adopted.

Nvidia must be shitting bricks if they're focusing this heavily on sharing performance results for competitive games to make the numbers look good. Right now, all these two graphs are showing me is that the 3070 and 3080 (10 GB...? 12 GB...? Both?) give remarkable performance in Overwatch 2, and even the 3060 handles the game well on maxed-out settings at 1440p.

I don't play these competitive games and I don't need freakishly high fps to play games. Clearly I'm not their target market for this display of performance, and I understand that, but I don't see how they're doing anything positive with their PR here with this kind of "leak" or shared information.
 
Wait a second... they can actually log in?
 
I'll soon test how the 1080 Ti handles it. I hope it's not that much more demanding than the original OW; that ran like a charm on a 980 @ 1500 MHz back in the day.
 
So many uninformed people just jumping on the bandwagon and spreading fake news. Please fact-check before posting.

Overwatch and Overwatch 2 do not support DLSS.
Can you please quote the exact posts that you consider "so many uninformed people"?

In the 55 posts before yours, DLSS is only mentioned by three people and two of them are talking about DLSS as the marketing focus of the 40-series as a whole, and clearly not referring to these 4090 Overwatch 2 results. That one person:
What’s the bet NVIDIA has DLSS enabled and this isn’t native 2560x1440 rendering?
is making a cynical, accusatory remark in the form of a question. One single person questioning the legitimacy of an unverified benchmark isn't what I'd call "so many people" or "spreading fake news". Whilst it's unlikely that Nvidia are using an unreleased internal build for DLSS development, his cynicism is at least warranted because it would not be the first (or even the tenth) time that Nvidia has been caught cheating in benchmarks and fudging their numbers.

For a first post on TPU, it really helps not to lead with accusations and insults aimed at the community; that's never going to be a good entrance, however you try to defend it.

Welcome to TPU and try to be better.
 
If the tick rate of the servers is only 120 then anything over 120 FPS should be useless, no?
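Here's the arithmetic behind that question as a small sketch (Python; the 120 Hz tick figure is simply taken from the posts above):

```python
# How many frames get rendered per server tick at a given fps,
# assuming the ~120 Hz tick rate mentioned above.

TICK_RATE_HZ = 120
tick_interval_ms = 1000 / TICK_RATE_HZ  # ~8.3 ms between server updates

for fps in (120, 240, 500):
    frame_interval_ms = 1000 / fps
    frames_per_tick = tick_interval_ms / frame_interval_ms
    print(f"{fps:>3} fps: {frame_interval_ms:.1f} ms/frame, "
          f"~{frames_per_tick:.1f} frames rendered per server tick")
```

The extra frames above 120 fps don't give the server any more information per tick; whether the locally smoother motion and slightly fresher frames are worth anything is exactly what's being debated above.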
 
Why tf do you need 600fps for a shitty game?????????????????????
 
I can only speak from my limited experience, but I do notice a difference in games like COD between 144 fps and 220-240 fps (using an RTX 3080 and a Samsung G7 1440p display). Gameplay seems smoother and slightly faster. Again, it's my subjective experience, but I've been lowering settings on purpose to hit that 200+ fps mark.
 
I can definitely tell the difference between 120 and 165 Hz. I have never used a display faster than 240 Hz, but even side by side (well, okay, desk in front of me vs desk behind me) I cannot tell the difference between 165 and 240 Hz.

Interestingly, I don't notice any increase in immediacy of input between 120 and 240Hz. Maybe it depends on the game engine itself, and I don't play COD.
 
Get ready guys, somehow Nvidia's stats will blow your mind yet again. A full 4x more fps at 100% more price with a 10% real gain. It makes perfect sense, just trust them.

 
lolololol, people still expecting 400% increase every generation.
nvidia's own marketing said the 4080 is "2-4x faster than the 3080 ti", so yes, lololol
good job being a corporate cocksucker :)
 