Friday, October 27th 2023

PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.

There was some confusion on online gaming forums over the requirement for hardware mesh shaders. Many people assumed that the game would not work at all on GPUs without mesh shader support, locking out a large number of gamers. Through the course of testing for our performance review, we learned that while "Alan Wake II" does rely on hardware mesh shader support, the lack of it does not break gameplay. You will, however, pay a heavy performance penalty on GPUs that lack the feature. On such GPUs, the game shows a warning dialog that the GPU lacks mesh shader support (screenshot below), but you can choose to ignore this warning and play the game anyway. The game treats mesh shaders as a "recommended GPU feature," not a requirement. Without them, you can expect a severe performance loss, best illustrated with the AMD Radeon RX 5700 XT based on the original RDNA architecture, which lacks hardware mesh shaders.
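For context on how a game can make this call at runtime: Direct3D 12 exposes mesh shader support as a queryable capability rather than a hard device requirement, which is what lets a title warn the user and then carry on with a fallback path. Below is a minimal, hypothetical sketch of such a check using the standard CheckFeatureSupport API; this is not Remedy's actual code, just an illustration of the mechanism:

```cpp
// Minimal sketch: detect hardware mesh shader support the way a game's
// startup code might, then warn (rather than abort) when it is absent.
// Build on Windows with: cl /EHsc meshcheck.cpp d3d12.lib
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter. Feature level 11_0 keeps
    // pre-DX12-Ultimate cards (Pascal, RDNA 1) eligible to run at all.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No Direct3D 12 device available.\n");
        return 1;
    }

    // The mesh shader tier is reported via the OPTIONS7 feature struct.
    // The query itself fails on older runtimes, hence the SUCCEEDED guard.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool hasMeshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    if (hasMeshShaders) {
        std::printf("Mesh shaders supported.\n");
    } else {
        // This is the point where a game could surface its "recommended
        // GPU feature missing" dialog and select a slower geometry path.
        std::printf("Warning: GPU lacks hardware mesh shader support.\n");
    }
    return 0;
}
```

On a GTX 1080 Ti or RX 5700 XT this query reports no support, which lines up with the dialog the game shows.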
In our testing at 1080p, without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti. In most other raster-only titles, the RX 5700 XT with the latest AMD drivers is known to perform about as fast as an RTX 2080; here it's seen lagging behind the GTX 1660 Ti. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and Tensor cores of its RTX 20-series cousins, does feature hardware support for mesh shaders, and is hence able to perform along expected lines. We have included a projection of how the RX 5700 XT typically fares in our testing; it usually ends up roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000-series "RDNA 2" and current RX 7000-series "RDNA 3" fully support hardware mesh shaders across all GPU models.

That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close to maximum settings with FSR Quality, which is not such a terrible tradeoff, even if you still have to make compromises. We didn't spot any rendering errors or crashes.

Once we knew the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of that generation, we were greeted with the same warning dialog as on the RX 5700 XT: that the GPU is missing support for mesh shaders. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, trailing it by nearly two-thirds. At launch, the RX 5700 XT was a little slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a tiny bit faster. Since the card lacks DLSS support, FSR is the only upscaling option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.
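To put that last number in perspective: FSR 2's quality presets are per-axis resolution scale factors (1.5x for Quality through 3.0x for Ultra Performance, per AMD's published documentation), so Ultra Performance renders only one-ninth of the output pixels. A quick illustrative sketch of the internal render resolutions at 1080p (not code from the game):

```cpp
// Back-of-envelope: internal render resolutions behind the FSR 2 presets
// at a 1920x1080 output. Scale factors are per axis, as published by AMD.
#include <cstdio>

int main() {
    const int displayW = 1920, displayH = 1080;
    const struct { const char* name; float scale; } presets[] = {
        {"Quality",           1.5f},   // 1280 x 720 internal
        {"Balanced",          1.7f},   // 1129 x 635 internal
        {"Performance",       2.0f},   //  960 x 540 internal
        {"Ultra Performance", 3.0f},   //  640 x 360 internal
    };
    for (const auto& p : presets) {
        std::printf("%-17s -> %4d x %4d\n", p.name,
                    static_cast<int>(displayW / p.scale),
                    static_cast<int>(displayH / p.scale));
    }
    return 0;
}
```

Even with the game rendering internally at just 640x360, the GTX 1080 Ti couldn't reach 30 FPS, which underlines how expensive the fallback geometry path is without mesh shaders.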

100 Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

#1
Prima.Vera
Pi$$ off game engine. 16fps @1080p on a 1080 Ti??? Are you kidding me?
Seriously, the video card companies are hand-in-hand with those callous game devs, so they can sucker the plebs into always buying the latest and most powerful GPU.
Just like on mobile phones...
Disgusting.
#2
sLowEnd
Prima.Vera: Pi$$ off game engine. 16fps @1080p on a 1080 Ti??? Are you kidding me?
Seriously, the video card companies are hand-in-hand with those callous game devs, so they can sucker the plebs into always buying the latest and most powerful GPU.
Just like on mobile phones...
Disgusting.
Pascal is old enough to be missing features (e.g. proper async compute) that people shouldn't be surprised are beginning to have consequences. The card is nearly 7 years old. It had a good run.
#3
Assimilator
Prima.Vera: Pi$$ off game engine. 16fps @1080p on a 1080 Ti??? Are you kidding me?
Seriously, the video card companies are hand-in-hand with those callous game devs, so they can sucker the plebs into always buying the latest and most powerful GPU.
Just like on mobile phones...
Disgusting.
Did you make the same complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned. When developers push these boundaries, hardware is pushed to keep up, and consumers benefit as a result.
#4
PapaTaipei
16 FPS on a 1080 Ti at 1080p with no upscaling or RT. This is hilarious.
#5
zo0lykas
Why so many posts about this unbalanced game? If the creators can't optimize the game, they shouldn't force us to buy a £2k GPU...

Just let it go.
#6
Hyderz
Yes, it runs, but with wheelchair or crutches support...
#7
PapaTaipei
Assimilator: Did you make the same pathetic complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned.
The difference is Crysis was a massive upgrade in terms of visual fidelity compared to what was on the market. The engine to this day is still used extensively.
#8
pavle
Missing functionalities sure do take their toll, like when testing NVIDIA Kepler cards in modern titles a few years back.

Edit: The game does look mighty impressive, even on low rendering settings.
#9
FierceRed
Alan Wake 2 causing this level of posterior injuries was definitely not on my bingo card.

The graphical bar has been raised. I hope the gameplay is actually fun.
#10
the54thvoid
Super Intoxicated Moderator
FierceRed: Alan Wake 2 causing this level of posterior injuries was definitely not on my bingo card.
You're playing bingo all wrong. :D
#11
azrael
I don't think I've ever seen a chart where the GTX 1080, let alone the Ti version, is at the very bottom. I guess we're nearing the time where I have to sell a kidney to afford a new graphics card.
#12
AusWolf
Does anybody remember the late '90s - early 2000s when you had to buy a new graphics card every year because of new feature sets that made new games not even start on the old card? Anyone?

Sure, they were cheaper, but instead of paying $200 every year, now we pay $500-800 every 5 years. Pascal was awesome, and the 1080 (Ti) had a long and prosperous life, but sometimes, we have to move on.
#13
zo0lykas
AusWolf: Does anybody remember the late '90s - early 2000s when you had to buy a new graphics card every year because of new feature sets that made new games not even start on the old card? Anyone?

Sure, they were cheaper, but instead of paying $200 every year, now we pay $500-800 every 5 years. Pascal was awesome, and the 1080 (Ti) had a long and prosperous life, but sometimes, we have to move on.
Yes, but back in the day we didn't pay £60 to pre-order the basic version of a game and £99 for the full one, and a GPU didn't cost a whole month's salary.
#14
AusWolf
zo0lykas: Yes, but back in the day we didn't pay £60 to pre-order the basic version of a game and £99 for the full one
That's easy - don't do it. Wait for a discount and play something else in the meantime. :)
zo0lykas: a GPU didn't cost a whole month's salary.
But you swapped it every year if you wanted to play the latest games. The 1080 Ti has been with us for 7 years and is only starting to show signs of weakness.
#15
lexluthermiester
So the results of this testing show that users of these cards should:

1. Turn down their settings from "High", and customize/optimize better.
2. Drop to a lower resolution (yes, 720p is perfectly playable AND enjoyable).

As a side note, Remedy needs to stop with the silly warnings.
#16
AusWolf
Prima.Vera: Just look at some gameplay or reviews on YouTube, TPU or whatever. There is absolutely nothing impressive about the graphics, what are you talking about? Crysis was a MAJOR graphical upgrade over the existing games.
This game looks good, but nothing impressive or jaw dropping.
My expectation is that the closer we get to realism, the less impressive games will look compared to previous ones, and the higher the hardware cost will be. Just like the difference between 360p and 720p is massive, the difference between 720p and 1080p is significant, between 1080p and 4K it's detectable, and between 4K and 8K you need a microscope for pixel-peeping; yet the GPU requirements rise exponentially.
#17
wolar
Assimilator: Did you make the same complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned. When developers push these boundaries, hardware is pushed to keep up, and consumers benefit as a result.
But you could actually see where the graphics went in Crysis (mostly). I can't really see it in this game though; I've seen games with 99% of the graphical appearance that run 3-5 times better. How is this possible besides shit optimization?
#18
lexluthermiester
AusWolf: My expectation is that the closer we get to realism, the less impressive games will look compared to previous ones, and the higher the hardware cost will be.
I would not disagree with this statement. I find myself increasingly leaning toward games that have good visuals but not exactly realistic. I have remembered that I play games to jump into another world, one of fantasy and adventure. Super realistic GFX can be great if they add to the immersion of the experience. They do nothing for how well crafted the experience is. Great GFX alone do not make a quality game.
#19
sLowEnd
AusWolf: My expectation is that the closer we get to realism, the less impressive games will look compared to previous ones, and the higher the hardware cost will be. Just like the difference between 360p and 720p is massive, the difference between 720p and 1080p is significant, between 1080p and 4K it's detectable, and between 4K and 8K you need a microscope for pixel-peeping; yet the GPU requirements rise exponentially.
Anyone remember this old image?

Adding more polygons gets diminishing returns pretty quickly. Good art direction is definitely more impactful than brute forcing detail.
#20
Assimilator
PapaTaipei: The engine to this day is still used extensively.
The original engine is absolutely not used at all, so this comment is both stupid and pointless.
sLowEnd: Anyone remember this old image?

Adding more polygons gets diminishing returns pretty quickly. Good art direction is definitely more impactful than brute forcing detail.
This is called the Pareto principle, and for graphics I'd go so far as to say that every 10% increase in visual fidelity now takes 90% more rendering power. Realism is hard; rasterisation has used a lot of tricks to get to 95% of it for a very long time, but that last 5% simply cannot be faked and has to be done the hard way. And this is what we are finally starting to see in titles such as this.

The same cohort of people who refuse to understand this will continue to complain, of course. But we cannot do anything about those who choose - despite having access to the entirety of human knowledge via the internet - to be uneducated.
#21
Frick
Fishfaced Nincompoop
Assimilator: Did you make the same complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned. When developers push these boundaries, hardware is pushed to keep up, and consumers benefit as a result.
Crysis was pretty well optimised though, if you fiddled with settings.
AusWolf: Does anybody remember the late '90s - early 2000s when you had to buy a new graphics card every year because of new feature sets that made new games not even start on the old card? Anyone?

Sure, they were cheaper, but instead of paying $200 every year, now we pay $500-800 every 5 years. Pascal was awesome, and the 1080 (Ti) had a long and prosperous life, but sometimes, we have to move on.
Maybe in the mid 90's that was a thing, but definitely not in the 2000's, iirc.
#22
TheoneandonlyMrK
The confusion amongst gamers was caused by news-based rumour shitposting: news teams on sites like this reported that the 5700 XT couldn't play the game at all.

Journalism usually involves a degree of fact checking. Yet here we are.

"Gamers are confused." So are journalists.
#23
Assimilator
Frick: Maybe in the mid 90's that was a thing, but definitely not in the 2000's, iirc.
No, it absolutely was. Back then it was easy to get a massive hardware performance gain via a simple and cheap node shrink, and anything on top of that in terms of actual design improvements was just gravy (and there was plenty of low-hanging fruit there too).

Nowadays node shrinks are barely an improvement and hideously expensive to boot, rasterisation has been optimised to the Nth degree so there's almost no room for improvement via that avenue, and we've only barely started down the far more complex ray- and path-tracing road, where optimisations are hindered by the slowdown in node shrinking.
#24
Easo
Gamers, I do not understand you. After the last couple of years of abysmal optimisation failures (amongst other things) in released games, where even a 4090 can take a hike, what the hell is with this "Just upgrade your crap PC, bro"?
Are you gluttons for punishment?
#25
Assimilator
Easo: Gamers, I do not understand you. After the last couple of years of abysmal optimisation failures (amongst other things) in released games, where even a 4090 can take a hike, what the hell is with this "Just upgrade your crap PC, bro"?
Are you gluttons for punishment?
True gamers are gluttons for progress in evolving the state of the art. This game does that.