
Alan Wake 2 Performance Benchmark

I currently own a 3070, so I'm thinking it's better to save my money, wait for the 50XX series or AMD's 8XXX series, and hope they'll cope better with next-generation games. The same thing happened when Cyberpunk first came out, when only the 2080 Ti could max out its graphics...
GeForce RTX 50 series arrives in 2025, but it won't be cheaper. And Radeon RX 8000 won't have any high-end. ;)
 

Yeah, but most likely an RTX 5070 (non-Ti) will have 4080-to-4090-level performance at about the price of a 4070 Ti or a little more, so waiting a year could be interesting.

An RX 8900 XTX could also land at "4080 or better" performance, which would make it interesting as well, being cheaper than ngreedia's stuff. That is, if AMD figures their RT stuff out.

I'm also hopeful that ngreedia finally wakes up to the lack of VRAM on their cards... Their midrange 40-series cards with 8GB running the game at 3 FPS was a bad joke.
 
Ok. Will the RTX 5090 be released at the end of 2024? Very likely not. And there won't be any RX 8900 series either.
 
GeForce RTX 50 series arrives in 2025, but it won't be cheaper. And Radeon RX 8000 won't have any high-end. ;)

You are making that assumption based on a rumor, which is silly given that 99% of rumors in this market turn out to be false. The 7900 XTX already has a small GPU die yet competes with the 4080. AMD could keep the die size the same with a single GCD, and with the new architecture and node shrink it'd be competitive with the 5080. That assumes AMD doesn't figure out how to finally put multiple GCDs on one package, which would make it very hard for Nvidia to compete. This isn't like Nvidia, where they need to dedicate all their high-end chips to AI. Given the modular nature of AMD's GPUs, if AMD were swamped with AI chip contracts it would make far more sense for them to sell only high-end GPUs, because the margins are far larger and AMD can build them cheaply from the lower-binned chiplets that don't make it into AI products. Nvidia doesn't have that luxury, and it's one of the many advantages of chiplets.
 
First of all, thanks for running DLSS benches across the spectrum. I'm sure it's a time-suck, but a lot of people trust your results and it's actually extremely interesting information.

I have an extremely weird request. Any way you (or anyone else) could run 4k balanced and quality on a 4070ti? I'm just curious if it hits the frame buffer cap.

What I get from this:

Native:

1080p: 2080 Ti (stay frosty, old buddy)
1440p: 7800xt
4k: 4090

RT:
1080p: 4070ti
1440p: 4090
4k: 4090

PT:
1080p: 4080
1440p: Nothing
4k: BWAHAHAHAHA

DLSS/FSR:
1440p Quality (960p): 2080 Ti
4k Balanced (1253/1270p): (4070ti?)/7900xt (hic sunt dracones)
4k Quality (1440p): (4070ti?)/7900xt
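Those render resolutions in parentheses line up with the usual upscaler scale factors. The mode-to-ratio mapping below is my assumption of the standard DLSS/FSR2 ratios, not something stated in the review; a quick sketch:

```python
# Approximate render height for an upscaler mode at a given output height.
# Assumed scale factors: Quality = 1/1.5 (both vendors),
# DLSS Balanced = 0.58, FSR2 Balanced = 1/1.7.
def render_height(output_height, scale):
    return round(output_height * scale)

out_4k = 2160  # 4K output height
quality       = render_height(out_4k, 1 / 1.5)  # 1440
dlss_balanced = render_height(out_4k, 0.58)     # 1253
fsr_balanced  = render_height(out_4k, 1 / 1.7)  # 1271, i.e. the ~1270p above
print(quality, dlss_balanced, fsr_balanced)
```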


Hmm. Yep. Sounds right. I'll stick to thinking 2080 Ti (~$300) is the value option. A 7800xt (~$500) if somebody is a buying-new normie. 7900xt is the performance play. Everything else is ridiculous imho.

If you're that abnormally rational fellow that bought a 4070ti for 1080p, congratulations on your RT. You beat the system, and it only cost $800.

I don't blame the game; games get more demanding and this literal exact stack was bound to happen (I've only written about it 37 times).

I blame the GPU market/products, which just ain't conducive for most people at >1440p.

Not sayin', just sayin', you can see why we need new 16GB parts above 7800xt/4070ti in performance but below 7900xt/4080 in price (for 4k quality/balanced DLSS/FSR).

This is one of those instances where I could see a 4070ti 16GB doing 4k 'quality' and Navi 4 doing 4k 'balanced', but ofc there are other situations where it will probably be reversed, or it might be a wash. This is why I'm curious how 4070ti performs at those settings. It *might* be interesting to prospective 4070ti buyers.
 
Ah, the old 10GB debate rears its head, and it's not like the card has good performance [relative to all tested cards] at resolutions and settings where it would get playable framerates anyway... oh wait, it does...

Personally, I think it acquits itself very well for a 3-year-old card that was half the price of its big-brother GPU, in perhaps 2023's best-looking game, with literally the most advanced rendering features available. What's on show here is massively respectable, even downright impressive; again, this is a 3-year-old card playing with the most advanced visual features on offer.

Tell you what, I'd rather play on a 10GB card that has DLSS and RR to lean on, than a 12-16GB card and be forced to use the FSR code path and lack the muscle to even try PT, sub 30fps at 1080p...

We all made our choices, and I hate to disappoint the internet, but I'm still happy with mine, I'm getting out of it exactly what I want and expect.
 
It seems I'll be easily getting 60+ FPS without RT with my setup, no need for upscaling, either. All the cries in the system requirements thread were pointless, just as I thought. :)
 
Tell you what, I'd rather play on a 10GB card that has DLSS and RR to lean on, than a 12-16GB card and be forced to use the FSR code path and lack the muscle to even try PT, sub 30fps at 1080p...

The only 10GB Nvidia card that gets over 30 FPS at 1080p + PT is the 3080, the rest of the lower VRAM Nvidia cards perform horridly:

[attached chart: 1080p path tracing performance]


Your conclusion is silly; you shouldn't be forced to rely on upscaling in any game, particularly at 1080p, where there will be a hit to image quality regardless of which upscaling tech you use. 42 FPS with upscaling is not what I'd call something to write home about.

It seems I'll be easily getting 60+ FPS without RT with my setup, no need for upscaling, either. All the cries in the system requirements thread were pointless, just as I thought. :)

Please take a look at the test setup page. Upscaling is forced in this game, and it's already factored into the given performance numbers. I think TPU needs to do a better job representing that by putting it in every chart, particularly since it's an apples-to-oranges comparison, with different settings enabled for different vendors, and these charts will undoubtedly be linked out of context in the future.
 
The only 10GB Nvidia card that gets over 30 FPS at 1080p + PT is the 3080, the rest of the lower VRAM Nvidia cards perform horridly:

That's because they're lower powered GPUs - Not a VRAM limitation.

Please take a look at the test setup page. Upscaling is forced in this game and it's already factored into the given performance numbers.

Yep. Default script from W1zz shows DLSS on:

Code:
"m_eSSAAMethod": 2,

And a render resolution below native:

Code:
"m_iOutputResolutionX": 2560,
"m_iOutputResolutionY": 1600,
"m_iRenderResolutionX": 1707,
"m_iRenderResolutionY": 1067,
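Those captured values are consistent with a Quality-preset 1/1.5 scale; a quick arithmetic sanity check of my own, not something from the review:

```python
# The render resolution from the config is ~2/3 of the output resolution,
# i.e. the standard Quality upscaling ratio of 1/1.5.
output = (2560, 1600)
render = (1707, 1067)

assert round(output[0] / 1.5) == render[0]  # 1706.67 -> 1707
assert round(output[1] / 1.5) == render[1]  # 1066.67 -> 1067
print(render[0] / output[0])  # ~0.667
```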
 
Will add performance results for more cards throughout the day.

Charts need to be updated to indicate the type of upscaling used and, in general, that upscaling is on. I've already seen four people in the comments miss that detail because it's only on the test setup page. It's also important for when those charts eventually get used in other comparisons.

That's because they're lower powered GPUs - Not a VRAM limitation.

The fact that the 16GB 4060 Ti gets nearly double the performance of the 8GB 4060 Ti disproves this assertion. That might hold true for the 3060 but it does not for anything higher.
 
The only 10GB Nvidia card that gets over 30 FPS at 1080p + PT is the 3080, the rest of the lower VRAM Nvidia cards perform horridly:
It's the only 10GB card I was referring to...
Your conclusion is silly, you shouldn't be forced to rely on upscaling in any game, particularly at 1080p where there will be a hit to image quality regardless of what up-scaling tech you are using. 42 FPS with upscaling is not what I'd call something to write home about.
My conclusion is right for me, and it's also realistic. Again, we're talking about the latest game release of 2023 pushing visuals to the bleeding edge of what's possible. I won't die on the hill of refusing upscaling when it's not only excellent quality, and capable of actually enhancing image quality (RR), but also such a good tool to have in the toolbox for tweaking my personal gaming experience.

I'm not mad that this game came out thoroughly pushing the envelope and has steep requirements, I applaud it, and have reasonable expectations relative to the age and spec of my hardware. I didn't even think I'd get to play AAA path traced titles at all on an Ampere generation GPU, yet here we are.

I can't tell you how to feel about upscaling, this game, its requirements, and so on, but I find them all quite reasonable given the hardware used and the resulting visuals.
 
Damn, another game where the 6700XT 12GB can't outclass the 3070/Ti 8GB. Gentlemen, three years have passed since the launch of the 3000 and 6000 series and the start of the big VRAM debate. How long do we have to wait? In 2-3 years these GPUs won't be able to render even 20 FPS at 1080p, even with 128GB of VRAM.
I'm glad that TPU also introduced Low limits.
 
Alan Wake 2 is a great game with state-of-the-art graphics, 2023 turns out to be the best year ever for PC gamers :D
 
My goodness, the visuals are super impressive... Welp, it'll be a purchase two years later.
 
[attached chart: Alan Wake 2 VRAM usage]


Nice... Nvidia RTX 4080 16GB = dead at 4K with RT PT + FG... LOL

Right now, maybe we can wait for the new RTX 4080 20GB Super / RTX 4080 Ti 24GB next year...

BTW, the average price for the RTX 4080 16GB in my country today is still US$1400 (US$1200-1600)...
 
Is it me, or does ray reconstruction remove shadows? If you look at the image with the chain fence, there is absolutely no shadow from it, unlike in the raster image. I'm not sure if that's because the raster version is wrong to add the shadows, or because the PT isn't working correctly.

Even in the DLSS and FSR comparison, if you look at the shadows from the power lines and telephone pole on the left brick building, the PT version's shadows seem to disappear versus the raster version.


Path tracing is actually broken in this game and removes a ton of shadows; look at DSOGaming's article, where he mentions it completely removes a ton of shadows.
It's working accurately; that's what you call the shadow penumbra effect. Because the hanging rectangular light is an area light rather than a point light source, it diffuses the light.
 
Upscaling is forced in this game
Really? :eek: That's scummy AF if true! :shadedshu:

Edit: The review also says that you can set the render resolution to your in-game resolution which will basically run without upscaling.
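If that's right, the tweak would presumably be setting the render resolution equal to the output resolution, using the same keys as the config snippet posted earlier in the thread (an untested assumption on my part):

Code:
"m_iOutputResolutionX": 2560,
"m_iOutputResolutionY": 1600,
"m_iRenderResolutionX": 2560,
"m_iRenderResolutionY": 1600,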
 
Does this game support 1st person camera?
 
Another game you can barely play; despite the looks, that is horrible. RT and PT are supposed to show us advancement, and they're great features, but it seems they also come with a performance hit so sharp that even the top cards can barely handle it at 1080p, where upscaling tech has image quality flaws.
 
Really? :eek: That's scummy AF if true! :shadedshu:

Edit: The review also says that you can set the render resolution to your in-game resolution which will basically run without upscaling.
Crutches are needed, you thought this was free perf? Tsk tsk
 
nice...... nvidia rtx 4080 16gb = dead on 4K
It's quite evident that, although these VRAM numbers were measured, they aren't strictly what's necessary for great performance; it seems allocation can overshoot actual need by 2GB+.
 