
Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

This is a reflection of the terrible state of PC games. Needing DLSS/FSR Performance mode to get playable framerates is ridiculous. I don't know how well a game that runs badly on most systems will sell. In fact, a PS5 game like Spider-Man 2 seems to offer very good visuals on very low-end hardware in 2023 without resorting to upscaling from a miserable resolution. DLSS/FSR at Performance mode = very low internal resolution.
Consider that PC hardware and console hardware have never been so similar. Same CPUs, same GPUs, and in Microsoft's case the same development environment and APIs. IMHO the real problem is not the multiplatform dev style.
I think the problem is that the videogame business has become so biiig.
Why optimize if the title sells anyway?
Bugs? If any are annoying we'll release some patches; otherwise, keep the bugs.
So we buy eternally unfinished games that need patches for years to become what they should have been on day one. And after 2-3 years of patches, even mid-range hardware can run them well.
 
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?
We've woken up, or at least I have anyway.

I also know that a good number of DLSS fans don't want to use DLSS just to attain playable framerates; they want to use it as a way to supercharge performance, which we know is no longer the developers' intention.

Or you have people like me who realized how nice playing a game at native resolution is: free of most motion artifacts and as sharp as can be.
 
"The way it's meant to be played!"

In the era of zero or even negative price/performance increases with each new generation, there is very little demand to upgrade - unless your old card suddenly doesn't cut it even for measly 1080p at 60 Hz without upscaling.

I wonder how many "badly optimized" new games have actually been intentionally made "to be the new Crysis", without the actual generational leap in image quality that Crysis brought...
 
... a good number of DLSS fans don't want to use DLSS just to attain playable framerates; they want to use it as a way to supercharge performance ...
I think that's stupid, but each to their own, I guess.

Or you have people like me who realized how nice playing a game at native resolution is: free of most motion artifacts and as sharp as can be.
I realised this the very first moment I tried DLSS in Cyberpunk 2077. Whatever anybody says, upscaling will always be a last resort to get acceptable performance in my opinion.
 
RAM consumption increases significantly on GPUs with little VRAM available, if it's pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and keep information about large areas of the game in memory. This may increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configurations and hardware.
Yes, VRAM usage goes up, especially in an open-world game, but I don't believe Remedy makes open-world games. Alan Wake and Control are not open-world games as far as I can recall.

While it is known that Remedy tends to push graphical requirements with each game release, I cannot help but feel that they have gone way over the limit here. They are not selling a game anymore; they are selling a graphics showcase. What is the point of releasing a "beautiful game" when only a small handful of people can truly enjoy the eye candy? It is good to test the boundary somewhat, but this is way overboard when you see 4K path tracing requiring an RTX 4080 @ DLSS Performance mode (aka 1080p upscaled) in order to play @ 60 FPS. Seriously... I am voting with my wallet and not buying this game, because it just means I need to shell out even more money for a better graphics card.
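For anyone who wants to sanity-check the "1080p upscale" shorthand, here is a minimal sketch (not tied to any particular game) that computes the internal render resolution behind the usual upscaler presets. The per-axis scale factors are the commonly cited ones and can vary between DLSS/FSR versions, so treat them as approximations:

```cpp
// Rough sketch: internal render resolution behind common upscaler presets.
// Scale factors are the commonly cited per-axis ratios (approximate; they can
// differ slightly between DLSS/FSR versions).
#include <cmath>
#include <cstdio>

int main() {
    struct Mode { const char* name; double scale; };
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},   // ~66.7% per axis
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };
    const int outW = 3840, outH = 2160;     // 4K output resolution

    for (const Mode& m : modes) {
        std::printf("%-17s -> %4d x %4d internal\n", m.name,
                    (int)std::lround(outW * m.scale),
                    (int)std::lround(outH * m.scale));
    }
    return 0;
}
```

At 4K output, Performance mode works out to 1920x1080 internally, which is exactly why people call it a 1080p upscale.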
 
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?

Maybe those fans don't exist; maybe they're just evangelists working for businessmen who can no longer sell a perpetual increase in resolution or fps/Hz, so the new plan is to sell the old resolutions, upscaled, as the new future - effectively killing low- to medium-budget PC gaming.
Yesterday you played 1080p@1080p, today you play 1080p@4K, tomorrow you'll play 720p@4K, the day after tomorrow 800x600@8K, and so on.

It's so nostalgic remembering the old days when I wasn't able to play some games (Civ3 is the first I clearly remember) because my monitor (or video card drivers, idk) was stuck at 800x600 and the minimum required resolution was 1024x768... Now the tables are turning...
 
And everyone here was bashing me a week ago because I said the RTX40x0 series is not powerful enough for the latest games, and a "Super" variant is a waste of money for just 10% more perf. Not so great if you're only getting 4K at 20 FPS; now you can have 4K at 23 FPS for just $1200!

The RTX40x0 series is soooo over!
 
And everyone here was bashing me a week ago because I said the RTX40x0 series is not powerful enough for the latest games, and a "Super" variant is a waste of money for just 10% more perf. Not so great if you're only getting 4K at 20 FPS; now you can have 4K at 23 FPS for just $1200!

The RTX40x0 series is soooo over!
It's equally ironic to get all the bashing for saying that "DLSS is not a QoL feature" and then see all the booing Remedy gets for requiring DLSS for AW2 to run. I mean, what happened to DLSS being a QoL feature? If it really is one, then having to enable it for decent framerates isn't a problem, surely? :rolleyes:
 
It's equally ironic to get all the bashing for saying that "DLSS is not a QoL feature" and then see all the booing Remedy gets for requiring DLSS for AW2 to run. I mean, what happened to DLSS being a QoL feature? If it really is one, then having to enable it for decent framerates isn't a problem, surely? :rolleyes:
DLSS looks awful (compared to native) on a large 4K display. I also consider it cheat-marketing. At best, it should only be acceptable for a 4070 or lower card to require upscaling in order to get 1440p at 60 FPS; a high-end gaming card should not require DLSS in order to meet performance expectations. If I wanted to play at 1080p 60 FPS, then that's the kind of monitor I would have.

I can't wait for the 50x0 series to launch. Let's hope Nvidia does not hike the prices again, or hobble the performance of anything below the 90-class card, like they did to the 4080 this time.
 
It seems to be wrong information; the gap between the 3060 and 3070 is big... and where is the 3080, or 1440p at 60 FPS? Yet they prefer 4K?
I have to agree, that graph seems a bit off.
Nothing here makes sense; no wonder people on r/nvidia are roasting them to hell.
This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny for a start anyway.


I hate this. Can we stop doing this?
Yes, they really should stop doing that.
 
Obviously most of you do not realize how games are developed, and that improving even one thing at the scale of a game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundary in graphics, and now the only way to do that is by using Nvidia's tech.

GPU performance increases by 20-40% every generation, while just a reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.

So, yes, it's completely justified that a 4080 can run this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300,000% in one or two generations.

To end: I'm happy when some companies push the boundary and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think and realize why this happens before posting or complaining about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)

In short, Alan Wake 2 will be one of the first games that will take full advantage of DX12 Ultimate.

Ironically, a lot of PC gamers have been wondering when they’d see a game that supports Mesh Shaders. And now that a game requires them, the exact same people are crucifying Remedy.

For what it’s worth, Remedy has been constantly pushing the graphical boundaries of PC games. Remember Quantum Break? That game came out in 2016, and it took four years until it was playable at 4K. CONTROL was also one of the best-looking games of its time. And now Alan Wake 2 will have Path Tracing effects.

Again, it’s really funny witnessing PC gamers constantly asking for a new “Crysis” game. And when a “Crysis” game does appear on the horizon, those same folks start calling it an “unoptimized mess”. Here is a fun fact: Crysis WAS unoptimized when it came out, due to its awful, largely single-threaded CPU utilization.
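On the Mesh Shaders point above: for the curious, here is a minimal sketch (assuming Windows, the D3D12 headers, and linking against d3d12.lib; this is the generic D3D12 feature check, not Remedy's code) of the single CheckFeatureSupport query an app can use to find out whether the installed GPU and driver expose mesh shaders at all:

```cpp
// Hypothetical sketch: query Mesh Shader support on the default adapter.
// Assumes Windows + D3D12; build with d3d12.lib linked.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    // Create a device on the default adapter at a 12_0 baseline feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable device found.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7)))) {
        std::printf("Mesh shaders: %s\n",
                    opts7.MeshShaderTier == D3D12_MESH_SHADER_TIER_NOT_SUPPORTED
                        ? "NOT supported" : "supported");
    } else {
        std::printf("OPTIONS7 query unavailable (runtime/driver too old).\n");
    }
    device->Release();
    return 0;
}
```

If MeshShaderTier comes back as not supported, the card simply isn't DX12 Ultimate class (pre-Turing on Nvidia, pre-RDNA2 on AMD), which is presumably why older GPUs don't appear anywhere on the requirements sheet.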
 
Obviously most of you do not realize how games are developed, and that improving even one thing at the scale of a game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundary in graphics, and now the only way to do that is by using Nvidia's tech.

GPU performance increases by 20-40% every generation, while just a reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.

So, yes, it's completely justified that a 4080 can run this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300,000% in one or two generations.

To end: I'm happy when some companies push the boundary and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think and realize why this happens before posting or complaining about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)
Crysis also looked like nothing else back in the day with proper physics and amazing graphics.

These days, games get released with absurd requirements, but when you look at the graphics they offer, you're left scratching your head as to why the requirements are so high when the visuals are nowhere near the "wow" level Crysis hit back in the day.
 
Crysis also looked like nothing else back in the day with proper physics and amazing graphics.

These days, games get released with absurd requirements, but when you look at the graphics they offer, you're left scratching your head as to why the requirements are so high when the visuals are nowhere near the "wow" level Crysis hit back in the day.
I think only the UE games look like shxt (not all of them, obviously), because the developers and even Epic don't have a clue how to make the engine work consistently well.
Most of the in-house game engines deliver exceptional graphics: the A Plague Tale series, Cyberpunk, all Remedy games, etc.
Even the plasticky graphics from Sony's (PS) engines are OK, and better than any Crysis.

The thing is that games were way simpler back then: just a bump in texture resolution, added cube maps/screen-space reflections, or a bit more detailed models, and they were transformed. We are past that. That's why raster graphics are dead. We have accomplished the mission of having extremely good models and textures. Lighting has always been the problem, and although we knew the solution, we didn't have the hardware to run it. After decades, we can run RT or PT games. Yes, at 1080p, and that's a success.
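(Aside, for context on "the solution" mentioned above: it is essentially the rendering equation, which path tracing estimates by Monte Carlo sampling. Stated here in its textbook form, not as anything specific to Remedy's engine.)

```latex
% The rendering equation (Kajiya, 1986): outgoing radiance L_o at surface
% point x in direction omega_o is the emitted radiance plus all incoming
% radiance over the hemisphere, weighted by the BRDF f_r and the cosine term.
% Path tracing estimates the integral with Monte Carlo sampling, which is why
% it costs so much more than rasterised, pre-baked lighting.
L_o(x,\omega_o) = L_e(x,\omega_o)
  + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i
```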

If gamers don't like that, we can go back, revive the MXs and stop even using pixel shaders in games.
 
I think only the UE games look like shxt (not all of them, obviously), because the developers and even Epic don't have a clue how to make the engine work consistently well.
UE5 graphics look great in demos and showcases but we have very few UE5 games to really compare.
Most of the in-house game engines deliver exceptional graphics: the A Plague Tale series, Cyberpunk, all Remedy games, etc.
The strength of the in-house engines is that they're best suited for the task. Not their graphics.
Even the plasticky graphics from Sony's (PS) engines are OK, and better than any Crysis.
Comparing to a game that came out in 2007 is not much of a benchmark to pass.
That's why raster graphics are dead. We have accomplished the mission of having extremely good models and textures.
Raster is not dead and won't be for a long time. Even the games you bring up as examples here are still hybrids of raster and RT, with some PT experiments.

Also, this works both ways. Since raster has been perfected so much over the years, people can't always tell the difference, or they may actually prefer raster to more accurate methods. For example, RT/PT often makes the scene darker. Yes, it may be more realistic, but people are not always playing games for their realism.
Lighting has always been the problem, and although we knew the solution, we didn't have the hardware to run it.
Lighting is the least of the problems that impact the realism of games. Animations and physics matter much more than accurate reflections or light bounces.
What good is a PT game when a dumb, robotic NPC stumbles onto the scene and opens its mouth with bad lip syncing? And just like that, the "magic" is gone.
Or you throw a grenade into a coffee cup and all that happens is a small black stain on an intact coffee cup.
If gamers don't like that, we can go back, revive the MXs and stop even using pixel shaders in games.
That's a BS argument. So if I don't like anything running at 720p 30 FPS on a reasonably priced midrange card, then I should go back to the pre-pixel-shader era?
How about I go "back" to native 4K 240 FPS, with a barely noticeable difference in some graphical elements (like shadows) but much higher resolution and framerate?

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame-generation hacks to get it running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up in price like Nvidia (and AMD too) have done. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR. Yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.
 
Obviously most of you do not realize how games are developed, and that improving even one thing at the scale of a game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundary in graphics, and now the only way to do that is by using Nvidia's tech.

GPU performance increases by 20-40% every generation, while just a reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.

So, yes, it's completely justified that a 4080 can run this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300,000% in one or two generations.

To end: I'm happy when some companies push the boundary and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think and realize why this happens before posting or complaining about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)
Bro... this is how FM ran on recommended hardware for ULTRA RT + 4K at 1080P ...



An update later, it is still janky and drops to 40 FPS at 1080p, not even at 4K.



Then look at the visuals and tell us... are they truly worlds apart for this kind of bad performance on top-performing hardware?
 
I have to agree, that graph seems a bit off.
Time will tell, tbh. Once it's out, the benchmarks will drop, and if it's a badly optimized game... they will get bombed one way or another.

This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny for a start anyway.
Oh no, totally, it's not an Nvidia issue; I didn't mean to make it seem that way. People often discuss new game releases and many other things on r/nvidia. It seems to be a good place to debate most of the time, compared to other tech subs.
 
This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny for a start anyway.

Can anyone say conclusively whether Nvidia sponsoring this game or other titles isn't pushing requirements up? Game sponsorship in general is bad for the industry. Both Nvidia and AMD use it as a tool to control the performance and features of games. AMD was very likely limiting DLSS in Starfield prior to the blowback, and Nvidia is without a doubt using sponsorships to manipulate games in its favor and to increase sales. Nvidia has gotten Gigabyte, ASUS, and MSI to limit their top SKUs to Nvidia-only cards. I'm willing to bet they could easily get game devs to target 30 FPS for the base game unless people enable DLSS. Now that sells cards.
 
RAM consumption increases significantly on GPUs with little VRAM available, if it's pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and keep information about large areas of the game in memory. This may increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configurations and hardware.
Sure, but how do you capture that in a spec sheet that also specifies what amount of VRAM you need? You go under, you get shit perf. Done.

Fact is, you can make a case for more RAM any time of the day, but you simply don't need it if you have a well-balanced system. For gaming, it seems the norm is 16 GB, and 24 or 32 won't help you. And from personal experience... the only games that chew RAM are city sims that you take far into the late game. That's an outlier.

Can anyone say conclusively whether Nvidia sponsoring this game or other titles isn't pushing requirements up? Game sponsorship in general is bad for the industry. Both Nvidia and AMD use it as a tool to control the performance and features of games. AMD was very likely limiting DLSS in Starfield prior to the blowback, and Nvidia is without a doubt using sponsorships to manipulate games in its favor and to increase sales. Nvidia has gotten Gigabyte, ASUS, and MSI to limit their top SKUs to Nvidia-only cards. I'm willing to bet they could easily get game devs to target 30 FPS for the base game unless people enable DLSS. Now that sells cards.
Neh... not sure I totally agree there. It does feed dev budgets. It has also pushed ahead some features we later got everywhere. And at the same time, it's a bit like a purchased review.

Also, game sponsorship inherently means developers and GPU manufacturers having 'the dialogue', which I think is great for an industry. The world revolves around money; money opens doors.

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame-generation hacks to get it running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up in price like Nvidia (and AMD too) have done. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR. Yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.
Yep... as above... the world revolves around money. RT is clearly a corporate push to extract more of it. And look where Nvidia is right now, money wise. Look where gaming is, in tandem. Not looking good. Can't last.

But this has been my line of thinking ever since RT was announced and I saw how AMD responded to it. I think AMD's strategy is sound, and I think Nvidia's is a gamble. Since that moment I've only been reinforced in my stance. AMD is still selling console GPUs, they created a chiplet GPU, and they've got a stable environment going driver-wise, while keeping largely in step with Nvidia's performance movement. In the meantime they're expanding on that technology lead by fusing CPU and GPU together on chips with much better yields. This is as steady as it gets. RT? Whatever. RDNA3 runs it too, but doesn't rely on it to sell.

"The way it's meant to be played!"

In the era of zero or even negative price/performance increases with each new generation, there is very little demand to upgrade - unless your old card suddenly doesn't cut it even for measly 1080p at 60 Hz without upscaling.

I wonder how many "badly optimized" new games have actually been intentionally made "to be the new Crysis", without the actual generational leap in image quality that Crysis brought...
Yeah... put this nonsense next to what Crysis was in terms of advancements. Both 1 and 3. It's hilarious. They still haven't really, convincingly surpassed it, have they? Sure, resolution goes up. But what else? FPS goes down :p
 
UE5 graphics look great in demos and showcases but we have very few UE5 games to really compare.

The strength of the in-house engines is that they're best suited for the task. Not their graphics.

Comparing to a game that came out in 2007 is not much of a benchmark to pass.

Raster is not dead and won't be for a long time. Even the games you bring up as examples here are still hybrids of raster and RT, with some PT experiments.

Also, this works both ways. Since raster has been perfected so much over the years, people can't always tell the difference, or they may actually prefer raster to more accurate methods. For example, RT/PT often makes the scene darker. Yes, it may be more realistic, but people are not always playing games for their realism.

Lighting is the least of the problems that impact the realism of games. Animations and physics matter much more than accurate reflections or light bounces.
What good is a PT game when a dumb, robotic NPC stumbles onto the scene and opens its mouth with bad lip syncing? And just like that, the "magic" is gone.
Or you throw a grenade into a coffee cup and all that happens is a small black stain on an intact coffee cup.

That's a BS argument. So if I don't like anything running at 720p 30 FPS on a reasonably priced midrange card, then I should go back to the pre-pixel-shader era?
How about I go "back" to native 4K 240 FPS, with a barely noticeable difference in some graphical elements (like shadows) but much higher resolution and framerate?

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame-generation hacks to get it running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up in price like Nvidia (and AMD too) have done. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR. Yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.

I was talking about UE4, not 5. Although the engine is capable of creating some great results, it's extremely inconsistent. I don't blame it that much, though. UE4 is an old engine, and asking it to include RT, PT or whatever 2023 tech is already too much.

I agree, the in-house engines are best suited for the task. But the result is the graphics.
Like in A Plague Tale: you couldn't get millions of rats running around in a general-purpose game engine. You have to reinvent part of the engine, or develop a new one from scratch, so it can handle it.

The fact that raster has been perfected does not mean that we stay with prebaked lighting. We advance and we want more dynamic elements: first the lighting, then the physics, etc.

The fact that we have hybrid RT/PT games shows how difficult it is to calculate these things in real time. And that's how the requirements are justified. As long as there is a normal settings preset in-game, so gamers can actually play it, I'm fine with the extreme requirements for the RT/PT level of settings.
 
Neh... not sure I totally agree there. It does feed dev budgets.

No one knows the exact amount a sponsorship contributes to a game's budget. One thing is for sure, though: the larger the contribution a sponsorship makes to a game, the more those devs are beholden to that money. It does feed the dev budget, but at the cost of features being tailored towards what AMD / Nvidia want and not what the devs / gamers want. It doesn't take a leap of faith to see that AMD / Nvidia will tailor features to sell cards.

It has also pushed ahead some features we later got everywhere. And at the same time, it's a bit like a purchased review.

You can't make this statement definitively; it's impossible to tell whether a feature would or would not have been added sans sponsorship. The industry has innovated the vast majority of game features without the need for sponsorships. There are far more instances of sponsorships leading to low-performance or low-quality features compared to what the industry is already capable of: FXAA, HairWorks, and PhysX, among others. PhysX worked perfectly fine via a CPU code path until Nvidia nuked that and sponsor-forced the neutered version into a bunch of games.

Also, game sponsorship inherently means developers and GPU manufacturers having 'the dialogue', which I think is great for an industry. The world revolves around money; money opens doors.

I don't see how Nvidia and AMD telling devs what to implement is better dialogue than devs plainly speaking to either company. It's not as if that communication can only happen over contracts; both companies have engineers dedicated to game optimization for their GPUs.
 
No one knows the exact amount a sponsorship contributes to a game's budget. One thing is for sure, though: the larger the contribution a sponsorship makes to a game, the more those devs are beholden to that money. It does feed the dev budget, but at the cost of features being tailored towards what AMD / Nvidia want and not what the devs / gamers want. It doesn't take a leap of faith to see that AMD / Nvidia will tailor features to sell cards.



You can't make this statement definitively; it's impossible to tell whether a feature would or would not have been added sans sponsorship. The industry has innovated the vast majority of game features without the need for sponsorships. There are far more instances of sponsorships leading to low-performance or low-quality features compared to what the industry is already capable of: FXAA, HairWorks, and PhysX, among others. PhysX worked perfectly fine via a CPU code path until Nvidia nuked that and sponsor-forced the neutered version into a bunch of games.



I don't see how Nvidia and AMD telling devs what to implement is better dialogue than devs plainly speaking to either company. It's not as if that communication can only happen over contracts; both companies have engineers dedicated to game optimization for their GPUs.
Can you give me some examples of Nvidia sponsorship harming a game? Please don't go 10 years back; something recent.

AFAIK all Nvidia-sponsored games run pretty great on competitors' cards; they are well optimized, they look good, they support competitors' features, etc. None of that can be said about AMD-sponsored games like Godfall, Forspoken, Jedi, Immortals, etc.
 