
Alan Wake 2 Performance Benchmark

If needing a 4090 to run 1440p at not even 120 fps is a quality release to you, you might wanna get a reality check.
These results echo what we've seen in other recent games. So all devs are bad now and you're right, is that it?

It's okay to have an opinion, but the numbers don't lie... and neither do the VRAM requirements.

60 FPS is a fine bar to have, and it was actually the bar we judged things by for a long time. Games can't suddenly max out high-refresh monitors just because people bought them; that's never been the case. There was a sweet spot in the days before RT when GPUs would just slaughter anything DX11 rasterized: the Pascal-to-Turing days. Oh, and by the way, those new monitors also run at 1440p or 4K now.

Those days are over. And regardless, you can still get 60 FPS at maxed raster settings with most cards that aren't sub-midrange or ancient.
 
Maybe AMD is hitting the VRAM limit while NVIDIA is not? Generally NVIDIA has better memory management and handles resource allocation a bit more efficiently.

This is something I specifically experience with modded Minecraft, especially with heavy texture and shader packs. NVIDIA GPUs manage full-VRAM situations better than AMD in that specific use case, though I haven't used my NVIDIA card in Minecraft in a long time, and I know how to work around it on AMD.
 
That is definitely false. I've verified that ray traced reflections are only active when path tracing is active (screen space reflections are used otherwise)

Edit: This is from NVIDIA's reviewer guide, suggesting AO is part of Path Tracing, too.

Edit2: Settings description in-game
So it does use PT for GI as well? In Cyberpunk, with PT enabled all other RT effects are disabled and PT does everything. Since you have the game now: when using PT, does it disable the other RT effects?
 
Applies only to cards with sufficient VRAM, apparently. Lol, a 4070 Ti is sometimes slower than a 16GB 4060 Ti because it's running out of VRAM.
By making Path Tracing mandatory for anything above Low RT (the PhysX tactics all over again), and with Frame Generation probably a necessity for smooth frame rates in many cases, this needs a 4000-series card. For the settings TPU used in its Path Tracing tests, a 4070 with FG seems to be the minimum for 1080p and 70 fps, a 4070 Ti with FG for 1440p and 60 fps, and a 4090 for 4K and 60 fps.

I wonder how much time it will take for FSR 3 to be implemented in this game, if ever. That would make some high-end RTX 3000 and Radeon cards somewhat valid options, at least at 1080p.
 
VRAM does not matter when NVIDIA is beating AMD.

The 8GB 3060 Ti is beating the 12GB 6700 XT at all resolutions (rasterization) in both average and minimum fps.

Also, the 8GB 3060 Ti beats the 6700 XT at 1080p with RT, but both cards had poor fps... any higher resolution is a slideshow on both cards, so those RT results don't matter. If you're going to use RT on a 3060 Ti, you'll most likely use DLSS and lower some settings to get decent fps... so VRAM is never going to be an issue.
 
Thanks for the review and guide! But the fact that the game basically forces you to use an upscaler is appalling, imo. I thought these technologies were supposed to help a card last longer, not become a crutch.
It doesn’t force you to use an upscaler
 
Lots of cards have been added and charts were updated
 
VRAM capacity saves the day with RT on for some of the Radeon GPUs.


Sorry, but any benchmark that shows very poor fps is irrelevant. If Card X gets 27 fps and Card Y gets 5 fps at extreme settings, it won't matter in real life because NO ONE WILL PLAY AT THOSE SETTINGS.

AMD's extra VRAM does not provide an advantage over NVIDIA at any playable fps.

The 12GB RTX 4070 beats the 20GB 7900 XT at 1080p/1440p with RT (both cards are already below 60 fps)... anything higher than that is useless.

Path tracing benchmarks are useless... even the 7900 XT gets very poor fps... NOBODY will use path tracing on AMD cards, or on most NVIDIA cards.

The RTX 4070 (not even the Ti) is better than the 7900 XT when using RT at any realistic settings people will use in real life... END OF STORY.
 
It doesn’t force you to use an upscaler
When you have to use an INI edit to get native res to show up, I consider that being forced.
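For context, the edit being described is the usual pattern of overriding a game's render-resolution settings in its config file by hand. The file location and key names below are purely illustrative assumptions, not the game's actual identifiers; a sketch of the kind of change meant:

```
; Hypothetical renderer config (path and key names are assumptions,
; not Alan Wake 2's real ones); typically found under %LOCALAPPDATA%
[Renderer]
OutputResolutionX=2560   ; display resolution
OutputResolutionY=1440
RenderResolutionX=2560   ; set equal to output resolution = native, no upscaling
RenderResolutionY=1440
```

The point is that when render resolution equals output resolution, the upscaler has nothing to do, which is what "native res" means here.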
 
10GB card, as predicted; don't go complaining now ;)

Compromises are always gonna happen sooner or later

The 10GB 3080 beats the 16GB 7800 XT & 6900 XT at 1440p/4K (rasterization) in BOTH average and minimum fps.

The 10GB 3080 beats the 20GB 7900 XT at 1080p/1440p (ray tracing) in BOTH average and minimum fps.

Anything above that is useless... even the 7900 XT runs below 30 fps when using path tracing.

IN OTHER WORDS, 10GB is not an issue...

I love how you guys twist things up... NVIDIA is clearly doing better in this game, even if it has lower VRAM, at any realistic settings.

Yep, when the RTX 3000 series launched I called it: 16GB VRAM is the bare minimum all the cards should have. Now even the RTX 4000 series is showing how mediocre the product was, with no DP 2.0 for high-refresh/high-resolution OLED panels, nor enough VRAM. Everything below the 4080 was a turd right out of the gate. Overpriced turds.

Ha? Did you even look at the benchmarks and fps?!

The 10GB 3080 beats the 16GB 7800 XT & 6900 XT at 1440p/4K (rasterization) in BOTH average and minimum fps.

The 10GB 3080 beats the 20GB 7900 XT at 1080p/1440p (ray tracing) in BOTH average and minimum fps.

Path tracing is pointless... the fps is so poor even on the top AMD GPU.

The 8GB 3060 Ti is beating the 12GB 6700 XT at all resolutions (rasterization) in both average and minimum fps.

Also, the 8GB 3060 Ti beats the 6700 XT at 1080p with RT, but both cards had poor fps... any higher resolution is a slideshow on both cards.

The performance is, frankly, appalling. 39 FPS on a 6700 XT, when said 6700 XT is stronger than either home console? How poorly optimized is this?

And forced upscaling. Utter trash. Will skip.

Which is absolutely hilarious. Everyone who was so smug about how 16GB was useless and how people were overreacting about 8GB cards is getting it smothered all over their faces: no, 8GB is in fact NOT enough for modern GPUs.

First of all, people will not be running this game at extreme settings with path tracing on midrange GPUs anyway...

Second of all, NVIDIA's 8GB GPUs are performing well in this game compared to AMD's 12GB GPUs of the same generation... for example, the 3060 Ti beats the 12GB 6700 XT in rasterization (all resolutions) and at 1080p with normal RT (not path tracing). Anything above that runs below 20 fps on the 6700 XT... it doesn't matter who wins when both cards are below 20 or 30 fps, because nobody will play at those settings anyway.

Are you guys even looking at the benchmarks?!
 
- This is probably the first game where my old 2080 Ti fails because of insufficient VRAM, and that at 4K (RT/PT).
I should have kept that card as a souvenir.

- It's obvious that NVIDIA GPUs manage VRAM better.

- NVIDIA said the 4070 Ti is a 1440p card (the 4090 a 4K one, and the 4080 an ultrawide 1440p one, accordingly).
Even if the 4070 Ti cannot deliver the desired fps at a given resolution, it's not nice to see it fail because of a lack of VRAM.
And that's the difference between the 3080 and the 3090, the 3070 and the 2080 Ti, the 4070 Ti and the 4090: the big models never (or extremely rarely) fail because of a lack of VRAM.
 
3060 is entry level and those who bought it should understand that. Reviews aren't conducted on the lowest common denominator and I hope that doesn't become the norm. However, I have stumbled across a few reviewers who focus on entry level hardware on YouTube.
The 3060 is not entry level. It is the #1 GPU used on Steam and is as mainstream as it gets. More people will use that GPU to play Alan Wake 2 than any other, so seeing how the game runs on it is actually really informative for a lot of gamers.
 
Added a page with performance numbers for various DLSS settings
 
playable by less than 5% of the global gamer community ahahah :laugh:
 
Lots of cards have been added and charts were updated
I imagine the updated charts are only for the added GPUs, since the AMD performance numbers look to be the same, but I'm asking just to make sure.

Did you get a chance to test the Alan Wake AMD drivers? (I ask since maybe there weren't any noteworthy improvements that would warrant retesting everything.)
 
If needing a 4090 to run 1440p at not even 120 fps is a quality release to you, you might wanna get a reality check.
You don't need a 4090, and you don't need to run ultra settings. You have to understand that "ultra" settings are not the same across games. Just because I can get 120 fps at ultra in game X doesn't mean game Y is unoptimized because I can't; it might just mean that game Y has much better graphics. Can you name a game that gets 120 fps on a 4070 at 1440p and also looks better than AW2?
 
Page 5 has a typo showing ridiculous numbers with Ray Tracing, much lower than Path Tracing, in two of the three comparison quality screenshots, as you can see here:

[screenshot attachment]
 
[Despicable Me reaction GIF]

Are you telling me I can't use my brand-new old-stock Voodoo 5 card for this game?

Looks like this is a game that can make my RTX 4090 sweat a little bit.

Could we maybe call this the new Crysis (until we get the real Crysis 4)?
 
The 3060 is not entry level. It is the #1 GPU used on steam and is mainstream as it gets. More people will use that GPU to play Alan Wake 2 than any other. So seeing how it runs on the game is actually really informative for a lot of gamers.
For gaming GPUs, the xx60 cards are entry level, and even if half of all gaming machines run one, that won't change the fact.
 
More like an RTX 4090 exclusive
And even that doesn't deliver a solid 4K60 with everything turned on. What a pile of crap when it comes to optimization.
 
Well, it's a good thing they excluded most gamers by making it Epic Games Store exclusive because otherwise people might be mad that their hardware isn't capable of running it.

Someone said this is a fine game to put aside and wait a couple of generations to play and I think that's me, too. By then, Epic will have pivoted and probably released this on Steam because EGS will be a distant memory by then.
 
Is it me, or does it look like it's lacking anti-aliasing even on DLSS Quality? Usually DLSS is quite good with anti-aliasing... I wonder if it has anything to do with the INI settings referenced in the article? Film grain, vignette, etc.
 
This game proves that unless you go for the 4090, this generation of VGAs is worthless. And that is to play @ real 1440p! No, 1080p upscaled is NOT real 1440p.

I currently own a 3070, so I'm thinking it'll be way better to save money and wait for the 50xx series or AMD's 8xxx series, and hope they cope better with next-generation games. The same thing happened when Cyberpunk first came out, when only the 2080 Ti could max out its graphics...

Ray tracing/path tracing will become the standard for high-quality graphics, the same way it happened with Transform & Lighting (T&L) back in the 2000s. And the only VGA capable of running it natively is the 4090, as proven by both Cyberpunk 2077 2.0 and Alan Wake 2.

In a way, it's great that programmers are pushing the PCs to the limit again (like when we HAD to move on from the 486s to the Pentiums and then the Pentium II with voodoo cards, and so on).

On the other hand, PC gaming is getting very expensive again (like it was in the 486 era), which is BAD.
 