
Alan Wake 2 Performance Benchmark

Anything lower is for indie/old games, HTPC, display output, etc. Sometimes you can squeeze out decent settings with an xx50 Ti card.
Isn't that what entry level means?

Let's bury this topic, though.
 
Well, what a difference in this game. Switched from an RTX 3080 to an RTX 4090 and it's a constant 115 FPS.
 
Anyone tried the new driver Hotfix from NVIDIA?

Click here to download the GeForce Hotfix display driver version 546.08 for Windows 10 x64 / Windows 11 x64. (@W1zzard maybe you want to add it to NVCleanstall)
[Alan Wake 2] Addressing gradual stability and performance degradation over extended periods of gameplay [4334633]
Windows 10 transparency effects are not displaying correctly after driver update [4335862]
Random Bugcheck may be observed on certain systems [4343844]
 
xx60 series is usually bare bones for gaming based on performance. Anything lower is for indie/old games, HTPC, display output, etc. Sometimes you can squeeze out decent settings with an xx50 Ti card.
Let's put this one to bed finally. Take a look at the 3060 review from TechPowerUp. In 1080p games, the 3060 averages 115 fps. I would not call 115 fps bare bones in the least. In fact, many would say 115 fps with reasonable settings is perfect.

Generally, what is considered entry level is 30 fps, but I'll even say it goes up to 60 fps. At 1440p, the mid-range 3060 is able to average 85 fps. Again, that is at 1440p; entry-level cards are not supposed to deliver good framerates at higher resolutions with high settings. Even the 48 fps at 4K is pretty incredible for the 3060.
 
1440p, not "4K", is the OPTIMAL resolution for the RTX 4090.

[attached benchmark chart]
 
1440p, not "4K", is the OPTIMAL resolution for the RTX 4090.

View attachment 321479
Hahaha, yeah, that's one of the reasons I struggle with upgrading my display. Right now I can downsample, but if I get a 4K display I feel like my card will age faster than with my 1440p one. And I still run DLSS when it's there; I mean, why wouldn't you want to save power? But it's not always there, especially in the niche games I happen to play. With a 1440p screen you can go back to native in a few years and it still looks good; on a 4K display, anything below native looks bad.

Anyway, Alan Wake II has been running great for me. I play with RT off, but it bugs me that mirrors don't work. Come on, we had mirrors in games way before RT.

Let's put this one to bed finally. Take a look at the 3060 review from TechPowerUp. In 1080p games, the 3060 averages 115 fps. I would not call 115 fps bare bones in the least. In fact, many would say 115 fps with reasonable settings is perfect.

Generally, what is considered entry level is 30 fps, but I'll even say it goes up to 60 fps. At 1440p, the mid-range 3060 is able to average 85 fps. Again, that is at 1440p; entry-level cards are not supposed to deliver good framerates at higher resolutions with high settings. Even the 48 fps at 4K is pretty incredible for the 3060.
I like the 3060 too, but I'm afraid I don't have such nice words for the 4060 or 4060 ti.
 
I'm so far ahead of that guy: I've been playing at 1440p since my GTX 1060. Friggin' slow people only realizing that after getting a 4090, smh.
 
Hahaha, yeah, that's one of the reasons I struggle with upgrading my display. Right now I can downsample, but if I get a 4K display I feel like my card will age faster than with my 1440p one. And I still run DLSS when it's there; I mean, why wouldn't you want to save power? But it's not always there, especially in the niche games I happen to play. With a 1440p screen you can go back to native in a few years and it still looks good; on a 4K display, anything below native looks bad.

Anyway, Alan Wake II has been running great for me. I play with RT off, but it bugs me that mirrors don't work. Come on, we had mirrors in games way before RT.


I like the 3060 too, but I'm afraid I don't have such nice words for the 4060 or 4060 ti.

Duke Nukem had mirrors.
 
mirrors don't work
And they won't work any time soon. Game devs gave up on making mirrors work without RT a long while ago, and we'll have to wait until RT becomes available to mainstream gamers. I mean, $200 GPUs not outright dying from activating RT, which is not currently the case.

4K display
The main advantage of having a 4K display is that you can drop to 1080p when your GPU can't handle the native resolution and it still looks more or less (rather more than less) fine, unlike 1080p/720p on 1440p displays. 1080p on 1440p is just cursed because 1440/1080 = 1.33 repeating, a fractional scale factor; 720p is just 720p and looks awful regardless.
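The integer-versus-fractional scaling point above can be sketched in a few lines of Python (the function and labels are my own illustration, not from the thread; the idea is that an integer factor maps each source pixel onto an exact NxN block of display pixels, while a fractional one forces blurry interpolation):

```python
def scale_factor(display_h: int, content_h: int) -> float:
    """Vertical scale factor when showing content_h content on a display_h panel."""
    return display_h / content_h

cases = {
    "1080p on 4K (2160p)": scale_factor(2160, 1080),  # 2.0: integer, scales cleanly
    "1080p on 1440p":      scale_factor(1440, 1080),  # ~1.33: fractional, looks soft
    "720p on 1440p":       scale_factor(1440, 720),   # 2.0: integer, but very low res
}
for name, s in cases.items():
    kind = "integer" if s.is_integer() else "fractional"
    print(f"{name}: x{s:.2f} ({kind})")
```

This is why the poster finds 1080p tolerable on a 4K panel but "cursed" on a 1440p one.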

I'm currently a 4K display owner and my GPU is by no means a 4K performer (6700 XT). I play games at 4K when the FPS is right and at 1080p when 4K is too hard to run. 27" makes for a reasonable PPI at 1080p, so I don't suffer from it whatsoever. Not to mention 1080p looks better on a 4K display than it does on a 1080p display due to the many more pixels available. XeSS and (not so often) FSR artifacts are usually unnoticeable at 4K, too.

That said, pairing a 4090 with a 4K display makes all the sense in the world, especially if you don't chase the absolute maximum settings. Many games look awesome on medium-high presets.
 
That said, pairing a 4090 with a 4K display makes all the sense in the world, especially if you don't chase the absolute maximum settings. Many games look awesome on medium-high presets.
I think that whoever spends that much money on a 4090 expects to run everything at 4K maximum :)
 
I think that whoever spends that much money on a 4090 expects to run everything at 4K maximum :)
2017: an $800 1080 Ti gets beaten by at least 6 games at 4K.
2019: a $1,200 2080 Ti gets beaten by at least 5 games at 4K.
2021: a $1,500 (de facto $4,000) 3090 gets beaten by at least 3 games at 4K.

I didn't count RT-enabled results; those would give the 2080 Ti and 3090 even bigger numbers. 4K was and still is a YOLO resolution for maxed-out gaming. We're currently in the 1440p era (the 1080p era ran from the mid-2010s to Q1 2022). The 4090 is a 4K-capable device, sure, yet it's more of a high-preset card than an ultra-preset one.
 
Is it just me that finds these kinds of reviews useless for most people?

I think most people will use DLSS/FSR at whatever resolution they're using, and they only tested those with the 4090/7900 XTX.

I play on a 4K screen with a 3090. I'd love to know if I'll be able to reach 60 fps with DLSS Quality or Balanced and RT off, and I have zero interest in knowing how it performs at native 4K, because it just isn't a native 4K card anymore (most cards aren't). So this huge review gave me some data about general performance and how a 4090/7900 XTX will actually run the game, but that's it.

I know it's a lot of work, but I really think these reviews need DLSS/FSR data for more cards.
 
if I'll be able to reach 60 fps with DLSS Quality or Balanced
Performance at 4K with DLSS Quality is somewhere in the middle between native 4K and native 1440p, closer to native 4K (the average of 43 and 73 is 58, so you should expect low to mid 50s).
Performance at 4K with DLSS Balanced is closer to native 1440p, but not exactly there. You should expect low to mid 60s, sometimes high 50s.
This is how DLSS boosts performance in heavily GPU-bound games. These calculations might be a wee bit inaccurate, but you get the idea: about +35% with DLSS Quality and approximately +55% with DLSS Balanced.
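The estimate above is just a linear interpolation between the two native results. A minimal sketch (the function name and the 0.5/0.75 weights are my own illustration of the reasoning; 43 and 73 FPS are the thread's native 4K and native 1440p figures for the 3090):

```python
def estimate_dlss_fps(native_4k: float, native_1440p: float, weight: float) -> float:
    """Interpolate between native results; weight=0 is pure 4K, weight=1 is pure 1440p."""
    return native_4k + weight * (native_1440p - native_4k)

# Quality: roughly the midpoint, leaning toward native 4K cost
quality = estimate_dlss_fps(43, 73, 0.5)    # 58.0 FPS -> "low to mid 50s" in practice
# Balanced: closer to native 1440p, but not all the way there
balanced = estimate_dlss_fps(43, 73, 0.75)  # 65.5 FPS -> "low to mid 60s"
print(quality, balanced)
```

Those midpoints also line up with the quoted uplifts: 58/43 is about +35%, and 65.5/43 is roughly +50-55%.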
 
Performance at 4K with DLSS Quality is somewhere in the middle between native 4K and native 1440p, closer to native 4K (the average of 43 and 73 is 58, so you should expect low to mid 50s).
Performance at 4K with DLSS Balanced is closer to native 1440p, but not exactly there. You should expect low to mid 60s, sometimes high 50s.
This is how DLSS boosts performance in heavily GPU-bound games. These calculations might be a wee bit inaccurate, but you get the idea: about +35% with DLSS Quality and approximately +55% with DLSS Balanced.

I know that it "should" be somewhere around there. I just think that these native-resolution reviews, only trying DLSS on the most powerful and expensive cards in the world (oh the irony), are bad design.
 
2017: an $800 1080 Ti gets beaten by at least 6 games at 4K.
2019: a $1,200 2080 Ti gets beaten by at least 5 games at 4K.
2021: a $1,500 (de facto $4,000) 3090 gets beaten by at least 3 games at 4K.

I didn't count RT-enabled results; those would give the 2080 Ti and 3090 even bigger numbers. 4K was and still is a YOLO resolution for maxed-out gaming. We're currently in the 1440p era (the 1080p era ran from the mid-2010s to Q1 2022). The 4090 is a 4K-capable device, sure, yet it's more of a high-preset card than an ultra-preset one.
The thing people don't realize is that the mere existence of an uber-high-end card results in games becoming more demanding, so the card still can't hit 4K 60. If the 4090 didn't exist, PT in Cyberpunk or Alan Wake wouldn't exist either, so a 4090 (even though it wouldn't exist in this parallel universe) would max out almost everything at 4K. Even if a hypothetical 4090 were 3 times faster than the current 4090, there would still be games it couldn't run, because instead of 1-2-3 rays, games would come out with 5-10 rays, etc. There will never be a card, no matter how fast, that can "max out" every game.
 
The thing people don't realize is that the mere existence of an uber-high-end card results in games becoming more demanding, so the card still can't hit 4K 60. If the 4090 didn't exist, PT in Cyberpunk or Alan Wake wouldn't exist either, so a 4090 (even though it wouldn't exist in this parallel universe) would max out almost everything at 4K. Even if a hypothetical 4090 were 3 times faster than the current 4090, there would still be games it couldn't run, because instead of 1-2-3 rays, games would come out with 5-10 rays, etc. There will never be a card, no matter how fast, that can "max out" every game.
I was thinking this way too, until I checked various GPU reviews starting with the early Kepler days.

Back in 2012, it was unusual to play maxed out at 1440p even on the fastest GPUs on the market. They could barely go beyond 60 FPS at 1920x1200, and not in each and every game.

You could argue GPUs weren't as expensive back then, and you'd be right. But let's compare what the GTX 680 ($500 in 2012, about $670 in 2023 dollars) and the RTX 4070 ($600 in 2022) are capable of.
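For what it's worth, the price comparison is a simple inflation adjustment. A tiny sketch (the ~1.34 cumulative factor for 2012 to 2023 is an assumption back-solved from the post's own $500/$670 numbers, not an official CPI figure):

```python
def adjust_for_inflation(price: float, cpi_factor: float) -> float:
    """Convert a historical price into later-year dollars."""
    return price * cpi_factor

# GTX 680 launch price, 2012 dollars -> approximate 2023 dollars
gtx_680 = adjust_for_inflation(500.0, 1.34)
print(round(gtx_680))  # ~670, a bit above the RTX 4070's $600 launch price
```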

[attached benchmark charts: per-game GTX 680 results and an RTX 4070 1440p performance summary]

I only cherry-picked the sub-60 FPS games for the GTX 680. I decided to do the same for the RTX 4070, yet it delivers 60+ FPS in every single game at 1440p (as of Q2'23, when it had just been released).

Furthermore, the RTX 4070 delivers more FPS at 1440p than the GTX 680 delivered at 1080p.

Having a PC has become much less expensive. You don't need to upgrade every 1.5 generations. You don't have games absolutely ravaging your top-tier GPU a year after the purchase. Ultimate 1080p was impossible pre-2010; now it's a question of eight hundred dollars, tax excluded. 4K will get there, too. My bet is we'll have maxed-out 4K gaming as the norm 10 years from now.
 
This game is unplayable. Low or high, I cannot get enough base fps to reduce mouse input lag to something I want to play with.
I get 15 fps in the beginning scene with the body in the forest at max settings, 4K, and my preferred 220 W on my 4090; if I push my 4090 to 430 W, I get 25 fps.
And I repeat: the lowest settings still do not give playable fps. It is seriously 100 fps with large mouse input lag :(
The game is shit, so no loss. I just like to give games a chance or use them as tech demos.
 
This game is unplayable. Low or high, I cannot get enough base fps to reduce mouse input lag to something I want to play with.
I get 15 fps in the beginning scene with the body in the forest at max settings, 4K, and my preferred 220 W on my 4090; if I push my 4090 to 430 W, I get 25 fps.
And I repeat: the lowest settings still do not give playable fps. It is seriously 100 fps with large mouse input lag :(
The game is shit, so no loss. I just like to give games a chance or use them as tech demos.

Highly highly highly disagree.

The game isn't shit at all; it's an absolute masterpiece and genre-bending. If you enjoy complex, weird, completely cuckoo out-there narratives (Lynch style et al.), then it's an amazing game.

There's obviously something wrong with your PC, because my 5800X3D + 4090 at 4K, all maxed out (DLSS Q + FG, of course), played beautifully the 3 times I played through it. I never once saw 15 fps, so there's something deeply wrong with your Windows installation or PC.
 
I did try the last patch, even with the latest DLSS. It works fine with the FG mod for people with 20-series cards: 100+ fps.
 
This game is unplayable. Low or high, I cannot get enough base fps to reduce mouse input lag to something I want to play with.
I get 15 fps in the beginning scene with the body in the forest at max settings, 4K, and my preferred 220 W on my 4090; if I push my 4090 to 430 W, I get 25 fps.
And I repeat: the lowest settings still do not give playable fps. It is seriously 100 fps with large mouse input lag :(
The game is shit, so no loss. I just like to give games a chance or use them as tech demos.
Something must be wrong with your system. I got 50-70 FPS on a 7800 XT with ultra settings, no RT, at 3440x1440 native.

Highly highly highly disagree.

The game isn't shit at all; it's an absolute masterpiece and genre-bending. If you enjoy complex, weird, completely cuckoo out-there narratives (Lynch style et al.), then it's an amazing game.

There's obviously something wrong with your PC, because my 5800X3D + 4090 at 4K, all maxed out (DLSS Q + FG, of course), played beautifully the 3 times I played through it. I never once saw 15 fps, so there's something deeply wrong with your Windows installation or PC.
I agree, both AW1 and 2 are masterpieces in their own way.

Although, I guess a twisty-bendy narrative style isn't meant for everyone. There's a reason why ratatata-waaaagh-boom-boom angry teenager keyboard smasher games are popular (even though I haven't found it).
 
Let's put this one to bed finally. Take a look at the 3060 review from TechPowerUp. In 1080p games, the 3060 averages 115 fps. I would not call 115 fps bare bones in the least. In fact, many would say 115 fps with reasonable settings is perfect.
It's a decent card, but definitely not more than mid-range. The 3060 even has sufficient VRAM. It handles a lot of titles at 1440p or below, yes, but in some titles I can't even get over 80 fps on a 4090 without frame generation!
 