
Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

At least if it runs like ass with all the RT features on, there will be a valid reason for it.

Nowadays even shitty Unity games opt for DLSS to cover up the lack of any performance optimization, and still look the same as stuff released 10 years ago.
Growing pains for the technology. In a couple of generations, no one will be able to tell the difference between upscaling and native.

Oh my. You are so happy supporting Nvidia, you don't even want to hide it. Of course you avoid giving a real answer to what I wrote.

So, you just sidestep the fact that Nvidia is pushing developers to optimize their games in a way that not only makes other cards, even older Nvidia models, look bad, but also makes games look much worse visually on anything that doesn't support a specific Nvidia feature. In the past it was PhysX, then Hairworks and GameWorks, then ray tracing, now path tracing, and of course we arrived pretty fast at a situation where developers are unable to optimize a game and Frame Generation is absolutely necessary for smooth...... 1080p gaming with an RTX 4090.

This has been happening for the last 15 years. When Nvidia promoted PhysX, developers "$$$forgot$$$" how to code physics effects in games and became totally dependent on PhysX. Then they stopped knowing how to build their own libraries that work great everywhere and became totally dependent on GameWorks. Then they decided that moving hair means Hairworks only; they could have implemented their own solution, but they didn't know how. Then we had the tessellation fiasco with a fully tessellated ocean under a city!!! Then we got ray tracing, which was promoted as a breakthrough, yet here we are today where every tech site out there calls even DLSS 3.0 mediocre without Ray Reconstruction. Developers are starting to forget how to code lighting without ray tracing. Now Frame Generation is here to make RTX 3000 and below, and anything non-Nvidia, look bad, so optimization just died.
Of course, when AMD had DirectX 10.1 support and Nvidia didn't, Nvidia pushed for the removal of a patch supporting DX10.1 from a specific game, because Radeon cards were getting a 20% boost in a specific area and that was bad for Nvidia, of course.

Anyway, having a conversation with someone who finds security and feels superior by supporting the strongest brand and playing back their marketing messages, probably with a smile on their face, certain that the other person "must be sweating hard", is pointless. And it has been a long time since I was 15 years old and enjoyed conversations with a 15-year-old mentality.

Have a nice day.

PS: Modders don't need to offer support, so they can throw out anything they want from day 1. Still, you avoided the fact that if AMD really wanted to block DLSS, they could lock the game to FSR libraries. Nvidia used to lock out CUDA and PhysX by disabling them whenever the primary card wasn't an Nvidia one. If someone really wants to block something, they can.
At the highest end of the stack, I definitely am, because who would ever buy a 7900xtx over a 4090?

Holding path tracing back because lil Timmy has to turn it off is Timmy's problem. Where's FSR 3 support, AMD? Only on 2 terrible games. Where's Anti-Lag+, AMD? Delayed. I have a 6800 in my living room that was RMA'd twice until I got a working one. My 5700 XT was garbage for like 2 years before it had a decent driver. My 7900X3D and 5800X are great!... when they boot; otherwise they cycle restarts for 30 minutes straight.

Whose fault do you think these issues are? It's mine for buying them lmao
 
because who would ever buy a 7900xtx over a 4090?

You are being a troll/fanboy. The 7900 XTX and 4090 are nowhere near the same price; there is like a $600-700 price difference between them. People always compare these cards while seemingly ignoring the ludicrous price delta, and the 7900 XTX is cheaper than even the 4080 by a good amount. That alone would be a good reason for someone to pick the 7900 XTX.
 
At the highest end of the stack, I definitely am, because who would ever buy a 7900xtx over a 4090?

Holding path tracing back because lil Timmy has to turn it off is Timmy's problem. Where's FSR 3 support, AMD? Only on 2 terrible games. Where's Anti-Lag+, AMD? Delayed. I have a 6800 in my living room that was RMA'd twice until I got a working one. My 5700 XT was garbage for like 2 years before it had a decent driver. My 7900X3D and 5800X are great!... when they boot; otherwise they cycle restarts for 30 minutes straight.

Whose fault do you think these issues are? It's mine for buying them lmao
Oh my. All planets align to offer you great experiences with Nvidia cards and things go bananas when you try to use AMD hardware.
Thanks for the comedy.
 
Let's try and keep it on topic, people, thanks!
 
That misses the POINT.

DLSS 3.5 increases performance by 500%


This shitcake is Batman and Crysis all over again.

You needed PhysX or it looked less good.
Or the tessellation power for a hidden sea.

Now we need an RTX 4090 to hit 1080p native.


My two finger salute for this game is infinite.


Not even on sale is this turdburger being bought.

I hope only 4090 owners buy in at best, and the dev sinks out of existence for this shit.

Bought fools.
I'm currently on AMD hardware, but I'll buy it anyway, as I love the first game. My list of games to play is so long that by the time I actually get to play it, I'll probably be able to run it at 4K with RT on at 120 FPS. :laugh:

I don't know what's wrong with people wanting to play everything on Ultra graphics straight after release, as if there isn't any other game or graphics setting to play at. I remember when The Witcher 2 came out, I had to play it at 900p with medium-ish graphics, and I wasn't whining.
 
You are being a troll/fanboy. The 7900 XTX and 4090 are nowhere near the same price; there is like a $600-700 price difference between them. People always compare these cards while seemingly ignoring the ludicrous price delta, and the 7900 XTX is cheaper than even the 4080 by a good amount. That alone would be a good reason for someone to pick the 7900 XTX.
Obviously he's one of those people who thinks products should be compared based on their position in the company's product stack (e.g. Company A's highest end vs Company B's highest end), rather than on a price basis. By such logic, we should be pitting the A770 against the 7900 XTX and RTX 4090 because the A770 is Intel's best card, right? Completely reasonable. /s
 
Obviously he's one of those people who thinks products should be compared based on their position in the company's product stack (e.g. Company A's highest end vs Company B's highest end), rather than on a price basis. By such logic, we should be pitting the A770 against the 7900 XTX and RTX 4090 because the A770 is Intel's best card, right? Completely reasonable. /s
You can compare whatever you want. His point is, if you want the best, you buy an Nvidia.
 