Friday, October 20th 2023

Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

Alan Wake II by Remedy Entertainment promises to be the year's most visually intense AAA title. The developer put out tiered system requirements lists that highlight just what it takes to max the game out. As is common these days, the company released separate lists for RT and non-RT experiences. The common minimum requirements across all tiers include 90 GB of SSD-based storage, Windows 10 or Windows 11, and 16 GB of main memory. At the bare minimum, you'll need a quad-core Intel Core i5-7600K or comparable processor. For all other tiers, Remedy recommends at least an AMD Ryzen 7 3700X or an Intel equivalent (which would mean at least a Core i7-10700K), or an 8-core/16-thread processor that's as fast as the 3700X.

The bare minimum GPU requirement calls for an NVIDIA GeForce RTX 2060 or Radeon RX 6600. With this, you can expect 1080p @ 30 FPS, and can use the "Quality" setting with DLSS 2 or FSR 2. The non-RT "Medium" tier is either 1440p @ 30 FPS or 1080p @ 60 FPS. For 1440p @ 30 FPS, you'll need a GPU at least as fast as a GeForce RTX 3060 or Radeon RX 6600 XT; 1080p @ 60 FPS requires at least a GeForce RTX 3070 or Radeon RX 6700 XT. The "Ultra" non-RT preset at 4K @ 60 FPS, which is the best experience you can possibly have without ray tracing, demands at least a GeForce RTX 4070 or Radeon RX 7800 XT. Ray tracing is a whole different beast.
The "Low" ray tracing tier, which is medium raster graphics settings with low ray tracing, for 1080p @ 30 FPS, demands at least a GeForce RTX 3070 or Radeon RX 6800 XT. The "Medium" ray tracing tier, which is medium raster graphics settings with medium ray tracing and path tracing enabled, for 1080p @ 60 FPS gameplay, demands at least a GeForce RTX 4070. There's no AMD Radeon GPU with the ray tracing performance of an RTX 4070 in its price-range, so Rockstar didn't recommend an AMD option. The "High" ray tracing preset, which combines high raster graphics with high ray tracing, and path tracing; for gameplay at 4K with 60 FPS; requires a GeForce RTX 4080.

157 Comments on Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

#76
Denver
Vya DomusMemory usage for game logic does not really change with graphical settings. You can test this in any game: change the settings and RAM usage won't budge.
RAM consumption can increase significantly on GPUs with little VRAM available, if it's pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and keep information about large areas of the game in memory, which can increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configurations and hardware.
Posted on Reply
#79
ToTTenTranz
Remedy is highly subsidized by Nvidia so this is expected.
However, Nvidia can only get their hands on smaller engines because the others really need to work well on consoles.
They did this with Remedy's and CDPR's engines, but CDPR has moved on to UE5, and Alan Wake II is an Epic exclusive, so it'll be irrelevant in terms of sales.


Also, expect Alex Battaglia to speak of this game as the second coming of Christ, and review sites to keep this sales- and mindshare-wise irrelevant Alan Wake in their benchmark repertoire for an absurdly long period of time.
Posted on Reply
#80
Shtb
Darmok N JaladMaybe this is just intended to boost some high-end GPU sales in the interim, then closer to launch they'll say "Oops, that was a typo! The person responsible for sacking people has been sacked."
Temporarily, for about a year after the game's release (that's how long the GameWorks affiliation is valid, as far as I know).
Posted on Reply
#81
Arkz
AusWolfWho said that? Did you test it? ;)

My point is, let's wait until the game is out and see for ourselves. Developer recommended systems hardly ever match the truth.
Look at the picture. It says that.
gffermariI find the requirements reasonable.
You can play at 1440p/60+ with a 2080 Ti nearly at max settings. I mean, what did you expect to get from a GPU with this level of performance at 1440p on a fall 2023 AAA title?
The RT requirements are reasonable too. You can play at 1440p with a 3080/4070-level GPU.
And still, the 2080 Ti, I believe, can run it at 1440p with RT on with a few tweaks here and there.

4K/60/path tracing on? This is a joke. Not only 4K res with path tracing on, but 60 FPS as well?
They could put a 4090 there and I would still be OK with that.
Path tracing came into existence because of frame generation. There is no other way to play a AAA path-traced game.
Where are you getting this? A 2080 Ti is about 3070 perf. The chart shows that you need a 3070 to run it at 60 FPS on medium, at 1080p with DLSS on Performance. That's 540p internally. So you need a 3070 to run the game at 540p, 60 FPS, medium settings... and that's reasonable? lol.

Also, RT medium and game medium at 60 FPS in 1080p with DLSS Quality, so 720p internally, needs a 4070. And they haven't put a 3080 in the chart, and said 12 GB VRAM. So you need a 12 GB card with 4070 perf (one of those rare 12 GB 3080s will do) to run the game at medium RT, medium GFX, 60 FPS at 720p, lol...
Posted on Reply
#82
robb
alwaysstsNow you're kinda starting to sound like me.

When you look at it as needing a 4070 to play a game at decent settings, that indeed does suck. When you realize a 2080 Ti really isn't THAT far off a 4070 (~10-15%, maybe ~20% on the outside in absolute performance depending upon where the bottleneck is, and it can be 2x the performance of a stock 6600 XT when overclocked), it's not so bad. I do agree that it's important to look at *some* nuance in settings, as we're kinda spoiled by the fact W1zzard cranks everything up in reviews and the 2080 Ti is turning into a <1440p/60fps GPU in some scenarios. I agree 1440p60 will probably be doable with a little finessing that won't impact IQ that much (if you're running a fairly decent CPU). We shall see, but make no mistake that reasonable settings (say 1440p60 or 4K DLSS Balanced) are left off for a reason... they likely make too much sense on a cheaper product to sell newer, more expensive GPUs (which is the point of a game like this).

I really, honestly, do not blame the games advancing in required spec. I blame nVIDIA's pricing racket and performance segmentation. AD104 is handicapped, AD103 is expensive. It would be great if AMD would catch up in RT, but I'll take the cheaper prices in the mean time.

At some point people will realize the next step up from that level of perf is not RT, but 16GB of VRAM (hence the 1080p60 med RT for the 4070/ti; 1440p is conspicuously missing bc it too probably won't run performant-enough on AD104). It will be interesting to watch 4070ti age versus 7800xt. I imagine 7800xt aging even better vs 4070ti than 2080 ti versus 4070, and in the end the achievable playable performance will be similar (if 7800xt not better when 12GB becomes a limiting factor at ~1440p regardless of RT). I think there's also a dang good chance the same will be true with AMD/Intel's next GPUs and 4080 wrt longevity. 4080 will likely always be a very slightly better GPU, and that's by (probably anticipated) design, but (hopefully) there will reach a point where people realize it's just not worth the premium unless you're absolutely enraptured by nVIDIA's paid-for tailored features in a handful of games specifically there to up-sell you to a needlessly expensive GPU that still can't run at a decent resolution/fr with those features turned on. JMO.

As a 'what will x get me' sorta guy, I really only see two GPUs currently: 2080ti, which is old but a bangin' value, and (6800xt)/7800xt, which is if you're buying new. There needs to be a GPU that is ~30% faster but also ~30% more money; a slightly cheaper 7900xt if you will. The only way anything changes in the current landscape is if 4080 gets a ton cheaper, and I just can't see them going below the price of a 7900xtx bc greed. 7900xt might get cheaper though.

I can also imagine the 7800 XT becoming cheaper, and Navi 4x/BM being more than proportionally better in perf/$ than the 4070 Ti/4070 Ti Super, or whatever they call a further-cut AD103, let alone the 4080. TBD value prop with the 4070 (just like) tea tea, as I think the 4070 Ti/4080 are going to lose a metric ton of value fairly quickly and it's unknown how nVIDIA will attempt to slot that into the equation. If it's under $700 and can compete with the 7900 XT/N4x/BM in value I will be pleasantly surprised, but I'm not holding my breath.

With that being the case, I just don't see how anyone can make an argument for nVIDIA's current GPUs unless you are that person for which money is no object; the next evolution of an Apple fanboy complete with paying the tax for a bespoke feature. I can appreciate RT as much as the next guy, but I just don't see the value until the next generation (at the very least). Until that point, I just don't see why someone would buy something more expensive than a 7900xt, and preferably that performance would be (and soon likely will be) even less expensive.
A 4070 is nearly 30% faster overall than a 2080 Ti at 1440p. tpucdn.com/review/nvidia-geforce-rtx-4070-founders-edition/images/relative-performance-2560-1440.png

100 / 78 ≈ 1.28, i.e. roughly 28% faster.
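For anyone checking that math, a quick sketch; the 78% reading is the assumption here, taken from the linked TPU chart with the 4070 normalized to 100%:

```python
# Converting a relative-performance reading (4070 = 100%, 2080 Ti = 78% at 1440p,
# as read from TPU's chart) into a "how much faster" figure.
rtx_4070 = 100
rtx_2080_ti = 78
speedup = rtx_4070 / rtx_2080_ti            # ≈ 1.28
print(f"RTX 4070 is about {(speedup - 1) * 100:.0f}% faster")  # ~28%
```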
Posted on Reply
#83
QuietBob
About the VRAM requirement.
The numbers quoted for "2160p" necessitate DLSS/FSR in performance mode, where the game renders at 1080p internally.
So, it would follow that 12 GB VRAM may be necessary for 1080p with high details, and 16 GB with RT/PT on.
Yep, in 1080p :oops:
Posted on Reply
#84
AusWolf
ArkzLook at the picture. It says that.
It's a bloody picture, nothing more. A lot of games run well on systems way below the official minimum spec. That's why I suggest waiting until release before drawing conclusions.
Posted on Reply
#85
gffermari
ArkzWhere are you getting this?

Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High while the 2080 Ti cannot exceed 1080p/60 medium at the same time? Really?
Is the 4070 twice as fast as the 2080 Ti and we don't know it?

The 2080 Ti won't have any problem at 1440p with 11 GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
Posted on Reply
#86
progste
gffermari
Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High while the 2080 Ti cannot exceed 1080p/60 medium at the same time? Really?
Is the 4070 twice as fast as the 2080 Ti and we don't know it?

The 2080 Ti won't have any problem at 1440p with 11 GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
That requirements list states "DLSS Performance" to achieve those results, which means heavy upscaling and a much lower internal resolution.
Also, if we're talking ray tracing, the newer cards are better at it.
Posted on Reply
#87
Arkz
AusWolfIt's a bloody picture, nothing more. A lot of games run well on systems way below the official minimum spec. That's why I suggest waiting until release before drawing conclusions.
It's a picture put out by the publisher of the game days before release, and you think it's just meaningless?
gffermari
Because the above does not make much sense.
The 4070 (aka 3080 level) can play at 4K/60 High while the 2080 Ti cannot exceed 1080p/60 medium at the same time? Really?
Is the 4070 twice as fast as the 2080 Ti and we don't know it?

The 2080 Ti won't have any problem at 1440p with 11 GB of VRAM. I cannot say the same for the 3070 (although the Remedy games do not consume that much VRAM).
If you change DLSS Quality to Balanced and reduce one or two settings, you can get 50-60s with RT on.
If they've put that gap between what they call 4K and 1080p (really 1080p and 540p), then I'd assume it's strictly a VRAM requirement, but they didn't wanna add a middle chart at 1440p (720p) 60 medium for the 6700 XT alone, and then a 1080p (540p) 60 medium column for the 3070 alone.

Assuming it is purely based on the game eating VRAM, a 2080 Ti may be able to do 1440p (720p) medium and hit 60. Unless it also favours the newer architectures more. But I doubt it.

In an ideal world they'd do a much bigger chart, with more res/quality configs and way more cards tested, especially older ones. But this always seems to be too difficult for big publishers to push, so random little youtubers end up doing it. Or TPU hopefully. But if this is what they're going with, then benchmarks using no image reconstruction are really gonna show what a hog this game will be.

Also RIP Series S. That's gonna be like 30fps 360p upscaled to 1080 with FSR2 probably.
At this rate I'd be surprised if the PS5/Series X versions have a 60fps mode at all. And their 30fps modes will be like 540p internal.
Posted on Reply
#88
AusWolf
ArkzIt's a picture put out by the publisher of the game days before release, and you think it's just meaningless?
Considering how meaningless lots of official game system requirements are, yes.
Posted on Reply
#89
Arkz
AusWolfConsidering how meaningless lots of official game system requirements are, yes.
The bad ones aren't meaningless, just often a fair bit off; they usually just say minimum and recommended, and don't even bother mentioning what res or frame rate.
The amount of info on this suggests they've at least tested a bunch of these setups.
Posted on Reply
#90
AusWolf
ArkzThe bad ones, which aren't meaningless, just often a fair bit off, usually just say minimum and recommended, and don't even bother mentioning what res or framerate.
The amount of info on this suggests they've at least tested a bunch of these setups.
The last time I saw a game that didn't run below the official minimum spec and needed the recommended spec for stable gameplay was about 20 years ago.
Posted on Reply
#91
gffermari
ArkzIn an ideal world they'd do a much bigger chart, with more res/quality configs and way more cards tested, especially older ones.
Question: will it run on Pascal and RDNA 1, which don't have DX12 mesh shaders?
Posted on Reply
#92
fevgatos
ArkzSo you need a 3070 to run the game in 540p 60fps medium settings... And that's reasonable? lol.
The answer to that is, we don't know. If the game at medium settings looks twice as good as the best looking game at ultra then YES, that's reasonable. If the game at medium looks like crap then it's unreasonable. So "medium" settings on their own mean ABSOLUTELY nothing, and it's very cringeworthy when people mention presets to argue about whether a game is optimized or not, as if presets are set in stone and remain the same between games.
Posted on Reply
#93
evernessince
AusWolfAll this negativity about the "system requirements" of a game that isn't even out yet is astonishing! Not that playing at 4K with RT ultra was a requirement to enjoy it anyway.

How many times have we seen one system being recommended by the developer, and the game running just fine on another system with half the computing power?
I would tend to agree if there wasn't a trend of recent games needing DLSS / FSR in order to get decent FPS.
AnarchoPrimitivWhy am I getting flashbacks to PhysX, Arkham Knight, hair works, etc?
Probably because this is an Nvidia-sponsored title where they are nuking performance to the point where you need to buy new cards. The only difference is that GPUs used to be more affordable. A 970 was $330 and that got you 73% of flagship performance (and that's before you consider OC headroom). Now you get only 50% of flagship performance for $600, and it's a card designed to last precisely 2 years until the next gen comes out and Nvidia-sponsored titles add an additional sample per ray that halves performance on last-gen cards.
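Taking those figures at face value, a rough sketch of the value comparison; the percentages and prices are the ones quoted above, not measured data:

```python
# Rough perf-per-dollar comparison using the figures quoted above
# (73% of flagship performance for $330 vs. 50% for $600); illustrative only.
gtx_970 = 73 / 330      # ≈ 0.221 % of flagship per dollar
current = 50 / 600      # ≈ 0.083 % of flagship per dollar
print(f"Value ratio: {gtx_970 / current:.2f}x")  # ≈ 2.65x better value then
```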
Posted on Reply
#94
blacksea76
Care about Raytracing?
Have fun getting a loan to play games.
Just wait until UE5 gets into full swing; it was supposed to be the next big thing in engines, but so far, games that use it run like crap.
Posted on Reply
#95
AMD718
CammNvidia sold us this RT, Upscaled future. Outside a few exceptions its mostly harmed PC gaming, not improved it IMO
Most people (Nvidia) are still buying up everything Jensen dishes out, along with his vision for future profits. In one of the recent DF interviews, Nvidia is now pushing the idea that native frames are the fake frames and the real frames are the upscaled, frame-generated DLSS 3.5 frames. You've gotta be some kinda sucker to buy that shit.
Posted on Reply
#96
nguyen
Another visual masterpiece in the making; let's hope the game is as engaging as its visuals :)
Posted on Reply
#97
DemonicRyzen666
blacksea76Care about Raytracing?
Have fun getting a loan to play games.
Just wait until UE5 gets into full swing; it was supposed to be the next big thing in engines, but so far, games that use it run like crap.
UE5 stutters on a single card & you think it's good?
Posted on Reply
#98
Merluz
Never thought that in 2023, GPUs would run games at 2003 resolutions because of "business".
Posted on Reply
#99
AusWolf
MerluzNever thought that in 2023, GPUs would run games at 2003 resolutions because of "business".
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?
Posted on Reply
#100
TheNightLynx
watzupkenThis is a reflection of the terrible state of PC games. Needing DLSS/FSR performance mode to get playable framerates is ridiculous. I don't know how well a game that runs badly on most systems will sell. In fact, a PS5 game like Spider-Man 2 seems to offer very good visuals on very low-end hardware in 2023 without resorting to upscaling at a miserable resolution. DLSS/FSR at Performance mode = very low resolution.
Consider that PC hardware and console hardware have never been so similar. Same CPUs, same GPUs, and in Microsoft's case, the same development environment and APIs. Imho, the real problem is not the multiplatform dev style.
I think the problem is that the videogame business has become so biiig.
Why optimize if the title sells anyway?
Bugs? If any are annoying, we'll release some patches; otherwise, keep the bugs.
So we buy eternally unfinished games that need patches for years to become what they should have been on day one. And after 2-3 years of patches, even mid-range hardware can run them well.
Posted on Reply