
PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

Gamers, I do not understand you. After the last couple of years of abysmal failures regarding optimisation (amongst other things) in released games, where even a 4090 can take a hike, what the hell is with this "just upgrade your crap PC, bro" attitude?
Are you gluttons for punishment?
True gamers are gluttons for progress in evolving the state of the art. This game does that.
 
Honestly, after a while, we have to let progress be progress and move on with new tech. This isn't like most UE5 titles, which look alright while having no technology that actually warrants that level of meh performance. This game has actual hardware reasons to cast certain GPUs into the flames, while also looking insanely good (the geometric complexity is absurd if you stop to look at it, taking great advantage of mesh shaders).

If we don't start embracing things like mesh shaders now, we could be sacrificing performance that doesn't need sacrificing. Mesh shaders can let games either look better geometrically or run better, because they are an insanely good optimization tool (and if anyone compares this to DLSS or FSR, please don't, it's silly to do so).
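For anyone wondering how the "mesh shaders not required" part can even work: an engine can simply ask D3D12 at startup whether the card exposes mesh shaders and pick a code path accordingly. A minimal sketch of that check - the CheckFeatureSupport call is the real D3D12 API, while the renderer methods at the bottom are hypothetical placeholders, not Remedy's actual code:

#include <d3d12.h>

// Returns true if the device reports at least mesh shader tier 1.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false; // older runtime/driver: treat as unsupported
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

// Hypothetical call site:
// if (SupportsMeshShaders(device))
//     renderer.UseMeshletPipeline();  // amplification + mesh shaders
// else
//     renderer.UseVertexPipeline();   // classic vertex/index path for older GPUs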
 
There is always a way to optimize. The same goes for ray tracing: there are many techniques where RT can be optimized to use as few rays as possible while still keeping good image quality.
AMD, for example, has a demo on their dev website for hybrid reflections, which uses both SSR and RT, with RT stepping in where SSR cannot work properly. It looks great and performs well. They also have a whitepaper on a global illumination system that uses RT, but rather than brute-forcing it with PT, it uses radiance caching to improve performance.
An even more recent example in games: Metro Exodus Enhanced Edition and the Witcher 3 remaster both used RTGI, which is a mix of probes and RT. Both, especially ME, had amazing lighting, and while they were costly to run, it was not as costly as PT.
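To make the hybrid reflections idea concrete, here is a rough C++-style sketch of the per-pixel decision. The actual AMD sample is HLSL shader code, so every name below is a made-up placeholder; it only illustrates the SSR-first, RT-fallback logic:

// Conceptual sketch only, not AMD's code.
struct Color  { float r, g, b; };
struct Pixel  { int x, y; };
struct SSRHit { bool valid; Color color; };

SSRHit TraceScreenSpaceRay(const Pixel& p);  // cheap: march the depth buffer
Color  TraceHardwareRay(const Pixel& p);     // expensive: one hardware-traced ray

Color ShadeReflection(const Pixel& p)
{
    // Try screen-space reflections first; they cost almost nothing.
    SSRHit ssr = TraceScreenSpaceRay(p);
    if (ssr.valid)
        return ssr.color;

    // The reflected ray left the screen or the hit is occluded in the depth
    // buffer, so pay for a single ray-traced sample for this pixel instead.
    return TraceHardwareRay(p);
}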

For Alan Wake 2, the best example is reflections. In order to get RT reflections in this game you need to turn on PT, which is stupid. They could have added an RT reflections option pretty easily; it was not a problem in Control either.
But they chose to brute-force it with PT, probably because of the sponsorship with Nvidia.

Nvidia has a habit of presenting technologies and then using them to the fullest, even beyond the point of diminishing returns and even when it tanks their own GPUs (as long as it tanks the competition more, it's good). It happened before and it's happening now. It will not become mainstream anyway because of consoles, and just as GameWorks, PhysX, etc. didn't see mainstream adoption outside of Nvidia-sponsored titles, the same will happen here. Did those smoke simulations and hair simulations look really nice? They did. Was it viable to use them in every game? Nope.

That is not to say that RT or PT will disappear; it's just that acceleration for those calculations will be used differently, to assist raster in places where it cannot improve visuals anymore.
Maybe for the next-gen consoles we will see fully PT games become mainstream.
 
Maybe in the mid 90's that was a thing, but definitely not in the 2000's, iirc.
Yes it was. Look at DirectX's version history. Basically every single major version, and sometimes even minor versions needed a new graphics card to run. That is: DirectX 7: 1999, DirectX 8: 2000, DirectX 8.1: 2001, DirectX 9: 2002, DirectX 9.0c: 2004. That's 5 graphics card upgrades within 5 years. Only if you wanted to run the latest games, of course - because they didn't even start without compatible hardware. And here we are, crying that you only get X FPS on a 7 year-old graphics card in the newest AAA game. :rolleyes:

20 years ago, we paid 200 bucks every year. Now, we pay 500-800 every 5 years. That's it.

Edit: Half-Life 2 was regarded as a bloody miracle for supporting DirectX 7 as well as 9, and ran on a GeForce 2 at ultra low graphics, which was a 4 year-old graphics card at that time.
 
It's interesting that people are mentioning older games like Crysis and comparing how cards of the day performed. Looking back at W1zz's review of the GTX 680, using 1200p as he did at the time, there wasn't a card that managed 60 FPS:
[Crysis benchmark chart from the GTX 680 review]


Metro 2033 battered the cards even harder:
[Metro 2033 benchmark chart from the GTX 680 review]


Just interesting to compare...
 
I now understand what leather jacket meant when he said "it just works": it will run, but he doesn't promise it will run well.
 
It's interesting that people are mentioning older games like Crysis and comparing how cards of the day performed. Looking back at W1zz's review of the GTX 680, using 1200p as he did at the time, there wasn't a card that managed 60 FPS:
[Crysis benchmark chart from the GTX 680 review]

Metro 2033 battered the cards even harder:
[Metro 2033 benchmark chart from the GTX 680 review]

Just interesting to compare...
I do believe that back then there just wasn't that much room to lower visual settings; it would look like horse shit.
Nowadays, the modern games that are supposedly "setting the bar" have a lot of room to reduce visual quality, but the performance you gain is so minimal it's not worth it.
Obviously it's because the developers don't optimize for this experience, but it does sting a bit.
I'd bet most would much prefer a nice 60 fps over a "next-gen experience".
 
Just interesting to compare...
Not to mention, for Crysis specifically, the fastest card at the time of launch, if memory serves, was the 8800 GTX. If we round node shrinks and arch changes into 'major' generational leaps, the GTX 680 is a full 3 generations beyond an 8800 GTX (8/9 series, 200 series, 400/500 series, then 600 series) - and it still wasn't cracking 60 fps @ 1200p with 4xMSAA, which admittedly was a tall bar for Crysis. I wonder if, a full 3 generations from now, AW2 will be playable at native 4K60+...

I never played the first Alan Wake, but I'm very much enjoying all the content and discussion on 2; it seems like a masterpiece for its time. Might be time to pick up the first game remastered while it's cheap and wait for a special on 2.
 
Pi$$ off, game engine. 16 fps @ 1080p on a 1080 Ti??? Are you kidding me?
Seriously, the video card companies are hand-in-hand with those callous game devs so they can sucker the plebs into always buying the latest and most powerful GPU.
Just like on mobile phones...
Disgusting.

I feel like this is an emotional overreaction, to be honest. The 1080 Ti is based on a 7-year-old GP102 processor in its worst shipping configuration. It had more than its valiant run; this was inevitable, just let it go. :)
 
No, it absolutely was. Back then it was easy to get a massive hardware performance gain via a simple and cheap node shrink, and anything on top of that in terms of actual design improvements was just gravy (and there was plenty of low-hanging fruit there too).

Nowadays node shrinks are barely an improvement and hideously expensive to boot, rasterisation has been optimised to the Nth degree so there's almost no room for improvement via that avenue, and we've only barely started down the far more complex ray- and path-tracing road, where optimisations are hindered by the slowdown in node shrinking.

I meant the part about having to buy a new GPU each year to play new games because the games wouldn't even launch on last year's GPU.
Yes it was. Look at DirectX's version history. Basically every single major version, and sometimes even minor versions needed a new graphics card to run. That is: DirectX 7: 1999, DirectX 8: 2000, DirectX 8.1: 2001, DirectX 9: 2002, DirectX 9.0c: 2004. That's 5 graphics card upgrades within 5 years. Only if you wanted to run the latest games, of course - because they didn't even start without compatible hardware. And here we are, crying that you only get X FPS on a 7 year-old graphics card in the newest AAA game. :rolleyes:

20 years ago, we paid 200 bucks every year. Now, we pay 500-800 every 5 years. That's it.

Edit: Half-Life 2 was regarded as a bloody miracle for supporting DirectX 7 as well as 9, and ran on a GeForce 2 at ultra low graphics, which was a 4 year-old graphics card at that time.

Do you have examples? I was pretty deep into games at that point (on a nothing budget!) and have absolutely no memory of this being a thing at all. Not every year anyway.
 
Me in twenty-****ing-thirty: Why does my 4090 not run this new graphically intense game well?!?!



*(Assuming I'm still alive)
 
I meant the part about having to buy a new GPU each year to play new games because the games wouldn't even launch on last year's GPU.


Do you have examples? I was pretty deep into games at that point and have absolutely no memory of this being a thing at all.

Well, try running Oblivion on a Fury Maxx (which would be 7 years old by then). Right, that one needs Windows 98...

Oblivion even booted on a GeForce FX (3 years old by its release) if you disabled HDR and fell back to bloom lighting but... Can't speak for fps...

Yep, until unified shaders, and I'd argue DirectX 11 cards, upgrades were essentially mandatory. Things are cozier now; support for obsolete/downlevel hardware is quite good.
 
I never played the first Alan Wake, but I'm very much enjoying all the content and discussion on 2; it seems like a masterpiece for its time. Might be time to pick up the first game remastered while it's cheap and wait for a special on 2.
Do that! It's one of my favourite games, ever.
 
Do you have examples? I was pretty deep into games at that point (on a nothing budget!) and have absolutely no memory of this being a thing at all. Not every year anyway.
I just gave you examples. :confused:

Every new DirectX version needed a new graphics card to run. You can look up any game using the DirectX versions above, which came out every year. OpenGL games weren't any better, like Quake 3 and other id Tech 3 engine based ones. Or there's id Tech 4, which Doom 3 ran on, and which (quoting Wikipedia) "would not even run on high end graphics cards in 2004 as the engine required at least 512 MB of video memory to display properly and at playable speeds." That is, at Ultra graphics, it required hardware that wasn't even available until about 2 years later!

Like I said above, Half-Life 2 was revolutionary in the sense that it supported 3 major DirectX versions, making it run on a 4-year-old GeForce 2, which was unheard of.

I got my first PC in 1998 which had partial support for DirectX 7 (more like 6.1). Two years later, no new game would run on it. At all. So I basically missed the 2000-2004 era entirely.

Edit: Or what about the proprietary APIs that ran only on specific hardware, like Glide on 3DFX?

Well, try running Oblivion on a Fury Maxx (which would be 7 years old by then). Right, that one needs Windows 98...

Oblivion even booted on a GeForce FX (3 years old by its release) if you disabled HDR and fell back to bloom lighting but... Can't speak for fps...

Yep, until unified shaders, and I'd argue DirectX 11 cards, upgrades were essentially mandatory. Things are cozier now; support for obsolete/downlevel hardware is quite good.
How could I forget about Oblivion? You basically had to mod it to make it run properly on anything other than a high-end GeForce 7800 or such.

Or even then, you turned on HDR and cranked up the grass density/distance, and your PC died. :laugh:
 
True gamers are gluttons for progress in evolving the state of the art. This game does that.
I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
 
I cannot play 3rd person. Too many hours playing FPS since Doom came out. Non-starter.
 
I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
Do you own stock in nVidia?
 
I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
I actually agree, but you have to push beyond what your baseline is using at the minimum spec required.

That's the AAA issue.
 
Many times devs make ultra settings not as settings for the present, but as settings for the future. A good example of this is Kingdom Come: Deliverance, and Metro Exodus Enhanced to an extent (the maximum preset is ridiculous to run and doesn't offer that much of a visual improvement).

I suspect this is one such case, as the low-to-mid settings look absolutely stunning, and because this is Remedy Entertainment, and nothing is ever normal with these guys, they pushed, and they pushed big time.

The minimum spec is the minimum spec for a reason: it's the minimum, not the recommended.
 
Do you own stock in nVidia?
Don't even have an Nvidia GPU. And I'm not a frequent upgrader either. But if the state of the art isn't always moving forward, then why are we even gaming on PCs? We could sit around with our PS2s and agree games could never use more capabilities than that hardware had.
 
Don't even have an Nvidia GPU. And I'm not a frequent upgrader either. But if the state of the art isn't always moving forward, then why are we even gaming on PCs? We could sit around with our PS2s and agree games could never use more capabilities than that hardware had.
To an extent, you're right. However, not all of us are made of money and can afford the latest and greatest GPUs. Or are willing to put it on a credit card to buy it. Talk about alienating a large portion of the market.
 
Epic, this is really the content that you guys stand out for. Thank you for the update.
 
Did you make the same complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned. When developers push these boundaries, hardware is pushed to keep up, and consumers benefit as a result.
Crysis had REVOLUTIONARY graphics; even today, many games don't have the same level of graphical fidelity and technology.

Alan Wake looks great, but nothing special, nothing we haven't seen before, nothing that would warrant this level of poor performance!

Plus Crysis sold like 10 copies due to its absurd requirements, which I think will be the case for Alan Wake 2 on PC as well.

And the thing is, I think Hogwarts Legacy looks just as good, if not better in certain areas, yet the game can easily be run by 1000 series cards at the lowest settings.
 