
Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

Is no one else bothered by the memory requirements? 16 GB from lowest to highest.
Memory usage for game logic does not really change with graphical settings. You can test this in any game: change the settings and RAM usage won't budge.
 
Memory usage for game logic does not really change with graphical settings. You can test this in any game: change the settings and RAM usage won't budge.
RAM consumption increases significantly on GPUs with little VRAM available, if it is pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and maintain information about large areas of the game in memory. This can increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configuration and hardware.
 
This game looks really really good.
 
Remedy is highly subsidized by Nvidia, so this is expected.
However, Nvidia can only get their hands on smaller engines, because the others really need to work well on consoles.
They did this with Remedy's and CDPR's engines, but CDPR moved on to UE5, and Alan Wake is an Epic exclusive, so it'll be irrelevant in terms of sales.


Also, expect Alex Battaglia to speak of this game as the second coming of Christ, and for review sites to add this sales- and mindshare-wise irrelevant Alan Wake to their benchmark repertoire for an absurdly long period of time.
 
Maybe this is just intended to boost some high-end GPU sales in the interim, then closer to launch they'll say "Oops, that was a typo! The person responsible for sacking people has been sacked."
Temporarily, for about a year after the game's release (that's how long the GameWorks affiliation is valid, as far as I know).
 
Who said that? Did you test it? ;)

My point is, let's wait until the game is out and see for ourselves. Developer-recommended specs hardly ever match reality.
Look at the picture. It says that.

I find the requirements reasonable.
You can play at 1440p/60+ with a 2080Ti at nearly max settings. I mean, what did you expect to get from a GPU with this level of performance at 1440p in a fall 2023 AAA title?
The RT requirements are reasonable too. You can play at 1440p with a 3080/4070-level GPU.
And I believe the 2080Ti can still run it at 1440p with RT on, with a few tweaks here and there.

4K/60 with path tracing on? This is a joke. Not only 4K with path tracing on, but 60 fps as well?
They could put a 4090 there and I would still be OK with that.
Path tracing only became playable because of frame generation. There is no other way to play a AAA path-traced game.
Where are you getting this? A 2080Ti is about 3070 performance. The chart shows that you need a 3070 to run it at 60 fps on medium, at 1080p with DLSS on Performance. That's 540p internally. So you need a 3070 to run the game at 540p, 60 fps, medium settings... and that's reasonable? lol.

Also, RT medium and game medium at 60 fps in 1080p DLSS Quality (so 720p internally) needs a 4070. And they haven't put the 3080 in the chart, and they list 12GB of VRAM. So you need a 12GB card with 4070 performance (one of those rare 12GB 3080s will do) to run the game at medium RT, medium graphics, 60 fps in 720p lol...
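For anyone who wants to sanity-check those internal resolutions, here's a minimal sketch using the commonly published per-axis DLSS scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5; exact values can vary slightly by title):

```python
# Rough internal-resolution calculator for DLSS/FSR upscaling modes.
# Scale factors are the commonly published per-axis values.
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(width, height, mode):
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(1920, 1080, "performance"))  # (960, 540)   -> "540p"
print(internal_res(1920, 1080, "quality"))      # (1280, 720)  -> "720p"
print(internal_res(3840, 2160, "performance"))  # (1920, 1080) -> "1080p"
```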
 
Now you're kinda starting to sound like me.

When you look at it as needing a 4070 to play a game at decent settings, that indeed does suck. When you realize a 2080 Ti really isn't THAT far off a 4070 (~10-15%, maybe ~20% at the outside in absolute performance depending upon where the bottleneck is, and it can be 2x the performance of a stock 6600xt when overclocked), it's not so bad.

I do agree that it's important to look at *some* nuance in settings, as we're kinda spoiled by the fact W1zzard cranks everything up in reviews, and the 2080Ti is turning into a <1440p/60fps GPU in some scenarios. I agree 1440p60 will probably be doable with a little finessing that won't impact IQ that much (if you're running a fairly decent CPU). We shall see, but make no mistake that reasonable settings (say 1440p60 or 4K DLSS Balanced) are left off for a reason... they likely make too much sense on a cheaper product to sell newer, more expensive GPUs (which is the point of a game like this).

I really, honestly, do not blame games for advancing in required spec. I blame nVIDIA's pricing racket and performance segmentation. AD104 is handicapped, AD103 is expensive. It would be great if AMD would catch up in RT, but I'll take the cheaper prices in the meantime.

At some point people will realize the next step up from that level of perf is not RT, but 16GB of VRAM (hence the 1080p60 medium RT for the 4070/ti; 1440p is conspicuously missing because it too probably won't run performantly enough on AD104). It will be interesting to watch the 4070ti age versus the 7800xt. I imagine the 7800xt aging even better vs the 4070ti than the 2080 Ti has versus the 4070, and in the end the achievable playable performance will be similar (if not better on the 7800xt once 12GB becomes a limiting factor at ~1440p, regardless of RT).

I think there's also a dang good chance the same will be true of AMD/Intel's next GPUs versus the 4080 with regard to longevity. The 4080 will likely always be a very slightly better GPU, and that's by (probably anticipated) design, but (hopefully) there will come a point where people realize it's just not worth the premium unless you're absolutely enraptured by nVIDIA's paid-for, tailored features in a handful of games, features specifically there to up-sell you to a needlessly expensive GPU that still can't run at a decent resolution/framerate with those features turned on. JMO.

As a 'what will X get me' sorta guy, I really only see two GPUs currently: the 2080ti, which is old but a bangin' value, and the (6800xt)/7800xt if you're buying new. There needs to be a GPU that is ~30% faster but also ~30% more money; a slightly cheaper 7900xt, if you will. The only way anything changes in the current landscape is if the 4080 gets a ton cheaper, and I just can't see them going below the price of a 7900xtx, because greed. The 7900xt might get cheaper, though.

I can also imagine the 7800xt becoming cheaper, and navi4x/BM being more than proportionally better in perf/$ than the 4070ti/4070 ti super, or whatever they call a further cut AD103, let alone the 4080. TBD value prop with the 4070 (just like) tea tea, as I think the 4070ti/4080 are going to lose a metric ton of value fairly quickly, and it's unknown how nVIDIA will attempt to slot that into the equation. If it's under $700 and can compete with the 7900xt/n4x/BM in value, I will be pleasantly surprised, but I'm not holding my breath.

With that being the case, I just don't see how anyone can make an argument for nVIDIA's current GPUs unless you are that person for whom money is no object; the next evolution of an Apple fanboy, complete with paying the tax for a bespoke feature. I can appreciate RT as much as the next guy, but I just don't see the value until the next generation (at the very least). Until that point, I just don't see why someone would buy anything more expensive than a 7900xt, and preferably that performance would be (and soon likely will be) even less expensive.
A 4070 is nearly 30% faster overall than a 2080 ti at 1440p. https://tpucdn.com/review/nvidia-ge...ion/images/relative-performance-2560-1440.png

100 / 78 ≈ 1.28, i.e. ~28% faster
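Spelled out (using the relative-performance figures from the linked TPU chart, where the 4070 sits at 100% and the 2080 Ti at 78%):

```python
# Relative performance at 1440p per the linked TPU chart:
# 4070 = 100%, 2080 Ti = 78%.
print(100 / 78)  # ~1.28 -> the 4070 is roughly 28% faster
```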
 
About the VRAM requirement.
The numbers quoted for "2160p" necessitate DLSS/FSR in performance mode, where the game renders at 1080p internally.
So, it would follow that 12 GB VRAM may be necessary for 1080p with high details, and 16 GB with RT/PT on.
Yep, in 1080p :oops:
 
Look at the picture. It says that.
It's a bloody picture, nothing more. A lot of games run well on systems way below the official minimum spec. That's why I suggest waiting until release before drawing conclusions.
 
Where are you getting this?

[image: the official system requirements chart]

Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High, while at the same time the 2080Ti cannot exceed 1080p/60 medium? Really?
Is the 4070 twice as fast as the 2080Ti and we don't know it?

The 2080Ti won't have any problem at 1440p with its 11GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
 
Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High, while at the same time the 2080Ti cannot exceed 1080p/60 medium? Really?
Is the 4070 twice as fast as the 2080Ti and we don't know it?

The 2080Ti won't have any problem at 1440p with its 11GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
That requirements list states "DLSS Performance" to achieve those results, which means heavy upscaling and a much lower internal resolution.
Also, if we're talking ray tracing, the newer cards are better at it.
 
It's a bloody picture, nothing more. A lot of games run well on systems way below the official minimum spec. That's why I suggest waiting until release before drawing conclusions.
It's a picture put out by the publisher of the game days before release, and you think it's just meaningless?

Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High, while at the same time the 2080Ti cannot exceed 1080p/60 medium? Really?
Is the 4070 twice as fast as the 2080Ti and we don't know it?

The 2080Ti won't have any problem at 1440p with its 11GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
If they've put that gap between them for what they call 4K and 1080p (really 1080p and 540p), then I'd assume it's strictly a VRAM requirement, but they didn't wanna add a middle chart at 1440p (720p) 60 medium for the 6700XT alone, and then the 1080p (540p) 60 medium column for the 3070 alone.

Assuming it is purely based on the game eating VRAM, a 2080Ti may be able to do 1440p (720p internal) medium and hit 60. Unless the game also favours the newer architectures more. But I doubt it.

In an ideal world they'd do a much bigger chart, with more res/quality configs and way more cards tested, especially older ones. But this always seems to be too difficult for big publishers to push, so random little youtubers end up doing it. Or TPU hopefully. But if this is what they're going with, then benchmarks using no image reconstruction are really gonna show what a hog this game will be.

Also, RIP Series S. That's gonna be like 30fps at 360p upscaled to 1080p with FSR2, probably.
At this rate I'd be surprised if the PS5/Series X versions have a 60fps mode at all. And their 30fps modes will be like 540p internal.
 
It's a picture put out by the publisher of the game days before release, and you think it's just meaningless?
Considering how meaningless lots of official game system requirements are, yes.
 
Considering how meaningless lots of official game system requirements are, yes.
Even the bad ones aren't meaningless, just often a fair bit off, and those usually only say minimum and recommended without even bothering to mention resolution or framerate.
The amount of info in this one suggests they've at least tested a bunch of these setups.
 
Even the bad ones aren't meaningless, just often a fair bit off, and those usually only say minimum and recommended without even bothering to mention resolution or framerate.
The amount of info in this one suggests they've at least tested a bunch of these setups.
The last time I saw a game that didn't run below the official minimum spec and needed the recommended spec for stable gameplay was about 20 years ago.
 
In an ideal world they'd do a much bigger chart, with more res/quality configs and way more cards tested, especially older ones.
Question: will it run on Pascal and RDNA1, which don't have DX12 mesh shaders?
 
So you need a 3070 to run the game at 540p, 60 fps, medium settings... and that's reasonable? lol.
The answer to that is, we don't know. If the game at medium settings looks twice as good as the best looking game at ultra then YES, that's reasonable. If the game at medium looks like crap then it's unreasonable. So "medium" settings on their own mean ABSOLUTELY nothing, and it's very cringeworthy when people mention presets to argue about whether a game is optimized or not, as if presets are set in stone and remain the same between games.
 
All this negativity about the "system requirements" of a game that isn't even out yet is astonishing! Not that playing at 4K with RT ultra was a requirement to enjoy it anyway.

How many times have we seen one system being recommended by the developer, and the game running just fine on another system with half the computing power?

I would tend to agree if there wasn't a trend of recent games needing DLSS / FSR in order to get decent FPS.

Why am I getting flashbacks to PhysX, Arkham Knight, HairWorks, etc.?

Probably because this is an Nvidia-sponsored title where they are nuking performance to the point where you need to buy new cards. The only difference is that GPUs used to be more affordable. A 970 was $330, and that got you 73% of flagship performance (and that's before you consider OC headroom). Now you get only 50% of flagship performance for $600, and it's a card designed to last precisely 2 years, until the next gen comes out and Nvidia-sponsored titles add an additional sample per ray that halves performance on last-gen cards.
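Putting rough numbers on that comparison (a sketch using only the figures quoted above, ignoring inflation and absolute performance gains; the $600 card is assumed to be the 4070):

```python
# Flagship-relative performance per dollar, using the figures quoted above.
gtx_970  = 0.73 / 330  # ~73% of flagship performance for $330
rtx_4070 = 0.50 / 600  # ~50% of flagship performance for $600 (assumed)

print(gtx_970 / rtx_4070)  # ~2.65 -> the 970 bought you roughly 2.6x more
                           # flagship-share per dollar
```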
 
Care about Raytracing?
Have fun getting a loan to play games.
Just wait until UE5 gets into full swing. It was supposed to be the next big thing in engines, but so far the games that use it run like crap.
 
Nvidia sold us this RT, upscaled future. Outside of a few exceptions, it's mostly harmed PC gaming, not improved it, IMO.
Most people (Nvidia buyers) are still buying up everything Jensen dishes out, along with his vision for future profits. In one of the recent DF interviews, Nvidia is now pushing the idea that native frames are the fake frames, and the real frames are the upscaled, frame-generated DLSS 3.5 frames. You've gotta be some kinda sucker to buy that shit.
 
Another visual masterpiece in the making; let's hope the game is as engaging as its visuals :)
 
Care about Raytracing?
Have fun getting a loan to play games.
Just wait until UE5 gets into full swing. It was supposed to be the next big thing in engines, but so far the games that use it run like crap.

UE5 stutters on a single card & you think it's good?
 
Never thought that in 2023, GPUs would run games at 2003 resolutions because of "business".
 
Never thought that in 2023, GPUs would run games at 2003 resolutions because of "business".
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?
 