
Counter-Strike 2 Performance Benchmark

CS2 can also be launched with the command-line argument -vulkan to use the Vulkan API.
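For anyone wondering where that goes: a minimal example, assuming the standard Steam client, is to right-click CS2 in your library, open Properties > General > Launch Options, and enter:

-vulkan

Removing it again falls back to the game's default renderer.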
 
Dear heavens, that's terrible performance on AMD compared to Nvidia's cards. Drivers, ASAP!

Even considering MSAA, it's ridiculous.

And by the way, what's MSAA doing in a game in 2023?
Still the best form of AA out there, what else?
 
Dear heavens, that's terrible performance on AMD compared to Nvidia's cards. Drivers, ASAP!


Still the best form of AA out there, what else?
Nah, I was looking at the preliminary tests as soon as the game was released, and with a 7800X YouTubers were getting 400-700 FPS; the game is not demanding. Maybe it's just the 8X MSAA taking its toll in the TPU test.
 
I think this is the smallest difference in graphics quality between low and high settings I've seen in a game, while you get triple the FPS.
 
This is the only game I have ever seen that looks better to me on Low settings vs. Very High. I was never a CS:GO fan, so meh; I think I have 3 hours total in the first CS:GO.

I tried CS2 for about five minutes but uninstalled it quickly. I love Valve, but this series just isn't for me.
 
The 4060Ti being slower than the 3060Ti is such an NVIDIA moment.
 
Is it my eyes, or is there no real difference between High and Very High?
 
Is it my eyes, or is there no real difference between High and Very High?
If anything, I think the shadows on Very High look too hard. I can't tell from the screenshots, but looking at the VRAM consumption I guess that High uses 4X MSAA instead of 8X, which is absolutely overkill for Counter-Strike :roll:
 
If anything, I think the shadows on Very High look too hard. I can't tell from the screenshots, but looking at the VRAM consumption I guess that High uses 4X MSAA instead of 8X, which is absolutely overkill for Counter-Strike :roll:
You're right!
I prefer the High shadows though, and the +100 FPS that comes with them :)
 
Also, is it just me who prefers the look of the High preset in the screenshots rather than Very High? I dunno, on Very High the shadows look TOO HARD for their own good.
@KrazyT thanks for proving I'm not alone in this opinion :D
 
High dynamic range should be tested: does it work, does it impact performance, and so on.
The HDR setting tested was at "Quality", so I guess that is HDR on, but does that only work on HDR monitors?
I've got an OLED monitor, and if I set HDR down to "Performance" the game looks like washed-out shit, so I have to use the HDR "Quality" setting.
By the way, the game looks better in-game on my OLED than in the screenshots; I guess the picture format doesn't support HDR anyway.
It looks like HDR is supported on YouTube but not in the picture formats we commonly use, which is a bit off if you ask me; pictures should have had support first.
 
Holy crap, that 4090 is flying to the fking moon.

2x an RX 7900 XT in minimum FPS.

That's both a compliment to Nvidia's arch and to how optimized this game is. Damn.

The 4060Ti being slower than the 3060Ti is such an NVIDIA moment.
Yeah, and then there's that. Ada is such a sad moment in GPU history. All that potential, only really given to a $1,500 SKU.
 
The 4060Ti being slower than the 3060Ti is such an NVIDIA moment.
Yeah, and then there's that. Ada is such a sad moment in GPU history. All that potential, only really given to a $1,500 SKU.
I don't know which I laugh at more: the fact that the 4060 Ti gets bitchslapped by the 3060 Ti at every resolution, or the 16GB version getting gobsmacked by the 8GB one every time (and neither of them really being able to overpower a 2080).
 
On Low, does the game use 2GB more VRAM than on High? What?


No, the difference at 1080p Very High is only 400MB. I think all 4GB cards can handle that easily.

Also, you'll notice certain cards can handle a way higher VRAM load than their rated capacity, like the 6500 XT at 1440p Very High. As long as the game does not exceed the card's bandwidth, you can often exceed the fixed VRAM load by up to 50% (swapping to main memory).
 
RDNA 3 really takes shots to the hips here.
The 6700 XT beating the much more powerful 7700 XT by more and more as the resolution increases is very telling, as is the XTX sitting below a 3090 Ti.

AMD, new drivers plz!
I constantly have crashes with my 7900XT
GG AMD
 
Curious to understand why the 7800XT trails the 6800XT and 6800 so significantly in this game?
 
The 4060Ti being slower than the 3060Ti is such an NVIDIA moment.

Yeah... that's pretty weird. It shakes out in the minimums, but still.

It's nice to see the 2080 Ti still putting in work (probably possible to obtain ~1080p360/1440p240/4k120 averages or 1080p240/1440p120/4k60 minimums with an overclock in many configs), continuing to prove its long-term relevance/utility as a good option (in pretty much any game it will hit a comfy frame rate at a common resolution, or at least ~1080p30 if you want to dabble with ray tracing) for those who paid the money at launch for longevity or are willing to buy used for just the bare minimum to stay relevant in pretty much all games this generation. It's weird, though, to see it beating a 7800 XT at stock and probably competing with the best of N21 using an overclock. That's downright embarrassing with respect to current drivers for some products (or whatever the case may be; perhaps how the game was programmed).

It's another one of those games in which the fallacy of a fixed hierarchy shows itself, as consistency isn't always there at the resolutions you may expect for certain products over time (due to VRAM, bandwidth, or whatever being mismatched). The 2080 Ti has held up pretty well though, gracefully declining down the ladder and now pretty much showing it can sustain itself (in raster). TL;DR: it's an 11GB card that should be an 11GB card, whereas many (nearly all Nvidia) cards don't have enough VRAM for their performance level. Sometimes that's enough for 4K, but usually it isn't. Often it's enough for 1440p, sometimes it isn't. It will pretty much work at 1080p (or 4K DLSS Balanced; 1253p) for a long-ass time, and by virtue of being the oldest card with that performance level it's the best bargain (at often ~1/2 the price of something comparable, ~1/4 the price of something twice as fast) imho. Who cares that it is 5 years old; it's one of the few products that actually makes sense.
 
Curious to understand why the 7800XT trails the 6800XT and 6800 so significantly in this game?
Compared to the 6800 XT, the 7800 XT has fewer CUs. Compared to the 6800, I can only imagine it's due to the latency hit of the chiplet design.
 
I'm DISGUSTED with CS2 and want to go back to CS1.

#1 They took out the Auto-snipers
#2 The M4 silenced is still 10 bullets inferior to the AK-47...as if OBAMA and a bunch of liberals got to the CS developers or something. It used to be the best gun in the game.
#3 While I love the "blackout effect" if you get shot in the head, I hate the new hit detection. It feels "off" somehow.
 
I'm DISGUSTED with CS2 and want to go back to CS1.

#1 They took out the Auto-snipers
#2 The M4 silenced is still 10 bullets inferior to the AK-47...as if OBAMA and a bunch of liberals got to the CS developers or something. It used to be the best gun in the game.
#3 While I love the "blackout effect" if you get shot in the head, I hate the new hit detection. It feels "off" somehow.
CS 1.6?
 
Crazy performance difference between low and very high, while the visual difference is basically just shadows and aliasing. If only it worked like this in AAA games.

I guess 8X MSAA is the biggest hit to performance. The screenshots indicate there's a lot of pixel crawl anyway. FSR is version 1.0, right?
 
I'm DISGUSTED with CS2 and want to go back to CS1.

#1 They took out the Auto-snipers
#2 The M4 silenced is still 10 bullets inferior to the AK-47...as if OBAMA and a bunch of liberals got to the CS developers or something. It used to be the best gun in the game.
#3 While I love the "blackout effect" if you get shot in the head, I hate the new hit detection. It feels "off" somehow.
CS:GO remains active until all tournaments are finished. Then it's history.
 
No, the difference at 1080p Very High is only 400MB. I think all 4GB cards can handle that easily.

Also, you'll notice certain cards can handle a way higher VRAM load than their rated capacity, like the 6500 XT at 1440p Very High. As long as the game does not exceed the card's bandwidth, you can often exceed the fixed VRAM load by up to 50% (swapping to main memory).
You can, but there is likely an FPS hit, and it might also be big enough at times to give you a stutter.
 