
PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
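
For context: mesh shader support is an optional Direct3D 12 capability that an engine can query at startup and fall back from - presumably what makes "not required" possible here. A minimal sketch of that query (a standalone illustration, not Remedy's actual code):

Code:
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    // nullptr adapter = system default; 11_0 is the minimum level D3D12 accepts.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Mesh shaders are reported through the OPTIONS7 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7)))
        && opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;

    std::printf(meshShaders ? "Mesh shader path available\n"
                            : "Falling back to the vertex-shader path\n");
    device->Release();
    return 0;
}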

Me in twenty-****ing-thirty: Why does my 4090 not run this new graphically intense game well?!?!



*(Assuming I'm still alive)

The problem is that it's perfectly possible for a 4090 not to do that right now.
Let's not even mention other cards.

Sorry, I really do not understand you all.
 
I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
 
I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
Check numbers at your favorite Torrent site
 
I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
Is the Epic Games Store like some black sheep of the gaming world?
 
Is the Epic Games Store like some black sheep of the gaming world?
Many people (myself included) don't like it because of the exclusivity deals, with no extra service offered in return. With that said, I just bought and installed the game on EGS (this will be my first and only purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
 
Many people (myself included) don't like it because of the exclusivity deals, with no extra service offered in return. With that said, I just bought and installed the game on EGS (this will be my first and only purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
Oh my, I knew it was trash - I even used it, but that was long ago. I didn't realize it was still trash. Welp, thanks for letting me know that I should keep avoiding it! Bummer.
 
16 fps on a 1080 Ti at 1080p with no upscaling or RT. This is hilarious.
Even more hilarious is the fact that a 7900 XTX that costs over $1,000 can only do around 40 fps at 4K with no upscaling and no RT, and that's with a 7800X3D, as per Daniel Owen's latest video.
Imagine paying four figures for a GPU (and another four figures for the platform) only to have to use upscaling just to get to 60 fps, while the game still isn't running at max settings (by that I mean RT included).

What a sad state of affairs.
 
Trying to chip in with a more measured response here: I think the primary issue people have with such steep system requirements (one they aren't articulating properly) is that a lot of these modern games don't look "good" enough to justify such monstrous requirements. Fidelity improvements have plateaued compared to what people were used to back in the day, so it's hard to understand WHY we suddenly need to rely on tech such as upscaling, especially in non-RT modes, just to get the game to run decently. Remnant II, for example, really doesn't look all that different from a game from 2 or even 5 years ago, yet we have top-end cards unable to maintain a consistent framerate.

It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
 
Many people (myself included) don't like it because of the exclusivity deals, with no extra service offered in return. With that said, I just bought and installed the game on EGS (this will be my first and only purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
It's a game delivery device. It works to that extent, so I don't see the need for partisan comparisons about which game delivery device works best. Just play the game you want, whether on Steam or otherwise.

Trying to chip in with a more measured response here: I think the primary issue people have with such steep system requirements (one they aren't articulating properly) is that a lot of these modern games don't look "good" enough to justify such monstrous requirements. Fidelity improvements have plateaued compared to what people were used to back in the day, so it's hard to understand WHY we suddenly need to rely on tech such as upscaling, especially in non-RT modes, just to get the game to run decently. Remnant II, for example, really doesn't look all that different from a game from 2 or even 5 years ago, yet we have top-end cards unable to maintain a consistent framerate.

It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
True, but I expect performance costs will decrease as the new standard settles in.

Even more hilarious is the fact that a 7900 XTX that costs over $1,000 can only do around 40 fps at 4K with no upscaling and no RT, and that's with a 7800X3D, as per Daniel Owen's latest video.
Imagine paying four figures for a GPU (and another four figures for the platform) only to have to use upscaling just to get to 60 fps, while the game still isn't running at max settings (by that I mean RT included).

What a sad state of affairs.
I don't have to imagine. And yes, it is a sad state of affairs. Tell me who to blame other than myself.
 
It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
Couldn’t have said it better myself.

Lifelike is lifelike. What more do we need?
 
How could I forget about Oblivion? You basically had to mod it to make it run properly on anything other than a high-end GeForce 7800 or such.

And even then, you turned on HDR, cranked up the grass density/distance, and your PC died. :laugh:

It really shows how revolutionary G80 was, IMO. You can play games as recent as Borderlands: The Pre-Sequel on a GeForce 8800 GTX or Ultra without a problem (and if you happen to own a Quadro FX 5600 like me, the 1.5 GB helps immensely in handling 1080p smoothly as well), and even some minor recent releases like the Final Fantasy pixel remasters run flawlessly on these cards - in DirectX 11, too! Sure, it's feature level 10_0, but still, quite well supported for what it is.

That means that despite being a GPU from 2006, it's still capable of tackling basic games released well into the 2020s. Unified shaders with full programmability really were the bedrock of modern graphics.
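
To illustrate: a DX11 title stays compatible with 10_0-class hardware like G80 simply by listing the lower feature levels it accepts at device creation - the runtime hands back the highest one the card supports. A minimal sketch (the function name is just illustrative):

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for the best available feature level, accepting 10_0-class
// hardware (G80/G92, Quadro FX 5600) as the floor.
HRESULT create_device(ID3D11Device** device, ID3D11DeviceContext** context)
{
    const D3D_FEATURE_LEVEL accepted[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11 hardware
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,   // G80-era cards still qualify here
    };
    D3D_FEATURE_LEVEL got = {};
    return D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                             accepted, sizeof(accepted) / sizeof(accepted[0]),
                             D3D11_SDK_VERSION, device, &got, context);
}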
 
Trying to chip in with a more measured response here:

I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.

With no particular reference to anyone in this thread; just a general thought.

In other news, I would quite like a winter home in the Bahamas or maybe Antigua and Barbuda, but my bank statements keep telling me it ain't happening anytime soon.

We all have our crosses to bear.
 
I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.

With no particular reference to anyone in this thread; just a general thought.

In other news, I would quite like a winter home in the Bahamas or maybe Antigua and Barbuda, but my bank statements keep telling me it ain't happening anytime soon.

We all have our crosses to bear.

There might be some of that, sure. But at the same time you have, for example, the director of a major game studio telling people with the best hardware available to "just upgrade your PC" when the studio's game just doesn't look good enough to justify running this badly on current-generation hardware. Yet it does run badly, and somehow that's considered the consumer's fault for not owning better-than-the-best hardware?

You can't help but feel slighted, considering the cost of PC parts these days.

Older gamers (including myself) were expecting basically photo-realistic graphics by this point, with the hardware to match. The graphics just aren't there, and yet the mythical hardware is still required.
 
I agree with all of that, and more besides. There are valid annoyances and concerns on all sides.
 
It really shows how revolutionary G80 was, IMO. You can play games as recent as Borderlands: The Pre-Sequel on a GeForce 8800 GTX or Ultra without a problem (and if you happen to own a Quadro FX 5600 like me, the 1.5 GB helps immensely in handling 1080p smoothly as well), and even some minor recent releases like the Final Fantasy pixel remasters run flawlessly on these cards - in DirectX 11, too! Sure, it's feature level 10_0, but still, quite well supported for what it is.

That means that despite being a GPU from 2006, it's still capable of tackling basic games released well into the 2020s. Unified shaders with full programmability really were the bedrock of modern graphics.

I'd agree if not for the fact that D3D11 feature level 11_0 became mandatory for games to even boot in short order after DX11 came out in 2009(!) on Windows 7 (14 years old already, how time flies!), which made DX10-only cards like the G80 and G92 obsolete in many titles a lot quicker than they should have been, given the raster power they had.

It's still a sore point for me, since in 2009 I bought a laptop with a DX10-only Mobility Radeon HD 4670 1 GB which, while low-mid range at the time, I could tweak and tune in many contemporary games to run them fluidly - until feature level 11_0 became ingrained in more and more titles, which it couldn't boot at all.

DX10, in many ways, was a short-lived, transitional period that lasted only 3 years and had few games made using its feature set (devs mostly made games for 9.0c or 11 and skipped 10 entirely, with some, like CAPCOM, making token efforts on PC, e.g. DMC4 and RE5).

DX11 turned out to be the longest-lived DX era, still relevant to this very day.
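
To put the "couldn't boot at all" part in concrete terms: a title that hard-requires 11_0 probes for only that level, so the check simply fails on 10_0/10_1 cards before the game ever gets going. A rough sketch (my own illustration, not any specific game's code):

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Passing null output pointers just tests whether the level is supported,
// without actually creating a device.
bool has_feature_level_11_0()
{
    const D3D_FEATURE_LEVEL required = D3D_FEATURE_LEVEL_11_0;
    return SUCCEEDED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE,
                                       nullptr, 0, &required, 1,
                                       D3D11_SDK_VERSION, nullptr, nullptr,
                                       nullptr));
}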
 
I'd agree if not for the fact that D3D11 feature level 11_0 became mandatory for games to even boot in short order after DX11 came out in 2009(!) on Windows 7 (14 years old already, how time flies!), which made DX10-only cards like the G80 and G92 obsolete in many titles a lot quicker than they should have been, given the raster power they had.

It's still a sore point for me, since in 2009 I bought a laptop with a DX10-only Mobility Radeon HD 4670 1 GB which, while low-mid range at the time, I could tweak and tune in many contemporary games to run them fluidly - until feature level 11_0 became ingrained in more and more titles, which it couldn't boot at all.

DX10, in many ways, was a short-lived, transitional period that lasted only 3 years and had few games made using its feature set (devs mostly made games for 9.0c or 11 and skipped 10 entirely, with some, like CAPCOM, making token efforts on PC, e.g. DMC4 and RE5).

DX11 turned out to be the longest-lived DX era, still relevant to this very day.

Agreed, although we likely have the overwhelmingly negative reception of Vista amongst gamers to thank for the 9.0c/11 split :/
 
Oh my, I knew it was trash - I even used it, but that was long ago. I didn't realize it was still trash. Welp, thanks for letting me know that I should keep avoiding it! Bummer.
There is ZERO innovation on the Epic store, as far as I see it. The only things keeping it alive are the exclusives and the free games. I wholeheartedly wish Epic would go bankrupt so I can have their games on Steam.

It's a game delivery device. It works to that extent, so I don't see the need for partisan comparisons about which game delivery device works best. Just play the game you want, whether on Steam or otherwise.
If the game delivery device prevents my monitor from entering sleep mode while installing, then it's not doing its job properly, is it? When I googled it, I saw posts on the EGS support site going back to 2018 with the same issue, and it still hasn't been dealt with. Besides, why can't I set it up so it only does that: deliver my games? Every other platform has a default screen which I can set to my game library. Even Ubisoft's app! Why do I have to look at the EGS storefront with a big "Alan Wake 2 out now, buy now" banner every time I open the app when I already own the game? Sorry, this level of turdiness doesn't fly with me.
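
For the technically curious: you can see which process is keeping the display awake by running "powercfg /requests" from an elevated prompt (look under DISPLAY). The usual way an app causes this - and presumably what the Epic launcher does, though that's my assumption - is a continuous display power request that it never clears:

Code:
#include <windows.h>

// ES_CONTINUOUS makes the request stick until explicitly cleared, so if an
// app sets this during a download and forgets to clear it, the monitor
// never sleeps.
void block_display_sleep()
{
    SetThreadExecutionState(ES_CONTINUOUS | ES_DISPLAY_REQUIRED);
}

void allow_display_sleep()
{
    // Reverting to ES_CONTINUOUS alone releases the display requirement.
    SetThreadExecutionState(ES_CONTINUOUS);
}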

I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.
I would generally agree, although back when PC parts were cheap, we had to upgrade every year. Now, it's enough to upgrade every 5 or so years. Whether I spend 200 quid every year or 800 every 5 years makes little difference in the long run (unless I spend the money I should be saving for my next upgrade in the meantime, which, I assume, the biggest complainers do).
 
There is ZERO innovation on the Epic store, as far as I see it
100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
 
100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
They don't have to if you're forced to buy their games on EGS by having no other choice. This is why I find their practice utterly disgusting. AW2 will be the first and last game I ever buy on EGS.

Not to mention, they can't even be bothered to fix age-old bugs by the looks of it, which is even sadder.
 
Does anybody remember the late '90s - early 2000s when you had to buy a new graphics card every year because of new feature sets that made new games not even start on the old card? Anyone?

Sure, they were cheaper, but instead of paying $200 every year, now we pay $500-800 every 5 years. Pascal was awesome, and the 1080 (Ti) had a long and prosperous life, but sometimes we have to move on.
The writing was on the wall for a long time already.

People are just in denial... I jumped on the 7900 XT because I already saw what was going to happen, and that it would affect my gaming too - it started doing so with TW: Warhammer already, and that's not even a state-of-the-art engine. For much the same reasons, I strongly recommend people get 16 GB+ of VRAM and solid hardware rather than buying/paying for better software, for anything midrange or up. And here we are...
 
The writing was on the wall for a long time already.

People are just in denial... I jumped on the 7900 XT because I already saw what was going to happen, and that it would affect my gaming too - it started doing so with TW: Warhammer already, and that's not even a state-of-the-art engine. For much the same reasons, I strongly recommend people get 16 GB+ of VRAM and solid hardware rather than buying/paying for better software, for anything midrange or up. And here we are...
And Nvidia sells a 12 GB card for $800+ as if it's normal. The only 16 GB card they have is horribly priced at $1,200+ and is the worst-selling 80-class card in history. The next jump is 24 GB for $1,700+. Nothing in between. If they had a 16 GB option around $800, that would at least be reasonable and would be competition for the 7900 XT.
 
And Nvidia sells a 12 GB card for $800+ as if it's normal. The only 16 GB card they have is horribly priced at $1,200+ and is the worst-selling 80-class card in history. The next jump is 24 GB for $1,700+. Nothing in between. If they had a 16 GB option around $800, that would at least be reasonable and would be competition for the 7900 XT.
For the life of me, I don't get why people bought Ada below the 4080 to begin with. It's a complete waste of time. At least you're leading in the number of abbreviations you need to switch on to game properly, well, yay.

The person who invented the wagon said the same thing.
If anything is built on stagnation, it's the automotive business itself, though, lol. Heck, it thrives on stagnation.

100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
Yep, and that's quite a worrying conclusion, because it means Epic's business doesn't float on a recurring customer base that just 'wants to be on EGS'. It's puzzling, isn't it... they throw millions into game funding, and nothing into the customer experience/journey.

I used to give EGS the benefit of the doubt, but I just can't anymore. Their launcher is horrible to use; it actively detracts from playing games through EGS. Because of that, I don't trust my game library being there either.
 
Yep, and that's quite a worrying conclusion, because it means Epic's business doesn't float on a recurring customer base that just 'wants to be on EGS'. It's puzzling, isn't it... they throw millions into game funding, and nothing into the customer experience/journey.

I used to give EGS the benefit of the doubt, but I just can't anymore. Their launcher is horrible to use; it actively detracts from playing games through EGS. Because of that, I don't trust my game library being there either.
I honestly don't understand what Epic's business model is at this point. I think the problem is that they don't, either.
 
I honestly don't understand what Epic's business model is at this point. I think the problem is that they don't, either.
I think the closest guess is "exclusivity contracts -> gamers don't have a choice -> money comes in without spending on development and innovation".
 