
Hogwarts Legacy Benchmark Test & Performance Analysis

It might be related to the DLSS settings bug, where greyed-out options are still in effect.


A 5500 XT is not a ray tracing card and never will be, but I do agree ray tracing is not worth it.
In the end, they paid a very small amount of money for that sweet, cute 5500 XT, so they can enjoy the game since their expectations are in check. For what it's worth, the game actually runs fine at 1080p with medium/high settings on an 8 GB buffer most of the time. No ray tracing, no fuss, no ultra-setting hunt: stable performance.

People paid huge money for GPUs with gimped VRAM, and they can't honestly keep their expectations in check, since the investment now feels hollow just two years after the GPU's release.
 
I'm not insulting your choice of GPU; it's just that ray tracing costs so much performance that it's barely viable even on top-end GPUs without features like DLSS, so on anything midrange or below it's just not worth even trying.
 
Sure, then show me an instance where RTAO makes a huge difference. I'll wait.
Any game with RTAO has quite a noticeable difference even compared to HBAO/HDAO, less so compared to VXAO, but that one is rare and halfway towards RT anyway. For me, the specific example was Deathloop; not so much because the effect is any different, but because I ended up playing it extensively with RT both on and off. With RT off, a whole lot of the lighting was just... off. Enough light leaks to notice, especially around smaller items or odd placements. I looked up the TPU article, and the comparison screenshots somewhat capture it, though maybe not in the best way.

Hogwarts seems to be a bad example right now:
 
okay.

Looking at those screenshots, some fucking imbecile must've done the lighting system for this game... it just looks wrong,
and it's thus completely ruining the graphics.
Sorry, but I've seen 15-year-old games with better lighting, which in turn makes their graphics far more believable and, personally speaking at least, better. No amount of polygons can conceal the fact that the outdoor lighting looks as if every other vertex is a subtle light source or something, making everything look washed out and overexposed at the same time, as if you'd turned a 2010s game's graphics settings to the minimum.

sigh.
Same. It looks almost like an HDR movie played on an SDR screen without any tonemapping: too washed out and, as you said, overexposed. WHYYYYYYY????
I hope there'll be some ReShade presets to the rescue.
 
Same. It looks almost like an HDR movie played on an SDR screen without any tonemapping: too washed out and, as you said, overexposed. WHYYYYYYY????
I hope there'll be some ReShade presets to the rescue.
I've just looked through the screenshots again, and I guess you guys are right... Outdoor scenes look like there's some intense blueish-white colour filter applied, or there's just way too much fog. It sort of reminds me of the first Assassin's Creed game where every city was some shade of yellow or orange, except for Acre which was blue all around. I didn't understand why back then, and I still don't understand now.

I think the developers overdid it with the AO this time around. I'll still have to see how annoying it is during actual gameplay. We might just be crying wolf based on screenshots.
 
Any game with RTAO has quite a noticeable difference even compared to HBAO/HDAO
You still haven't given me any concrete examples, however. I can give you plenty of examples where RTAO makes no difference, like the recent Dead Space remake.

For me, the specific example was Deathloop; not so much because the effect is any different, but because I ended up playing it extensively with RT both on and off. With RT off, a whole lot of the lighting was just... off. Enough light leaks to notice, especially around smaller items or odd placements.
That's because of global illumination, not AO; light leaking is the result of improper GI. That game uses ray-traced sun shadows, which is why it looks so different.
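To make the AO-vs-GI distinction concrete, here is a minimal sketch in plain Python, assuming a hypothetical trace_ray(origin, direction, max_distance) scene-intersection callback that returns True on a hit (this illustrates the general techniques, not the game's or any engine's actual code): AO only measures how enclosed a point is by nearby geometry, while a shadow ray tests whether a specific light actually reaches the point, which is the part that stops light from leaking where it shouldn't.

```python
import math
import random

def sample_hemisphere(normal):
    """Uniformly pick a direction in the hemisphere around `normal` (rejection sampling)."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        length = math.sqrt(sum(c * c for c in d))
        if 0.0 < length <= 1.0:
            d = tuple(c / length for c in d)
            # Flip the direction into the hemisphere facing the same way as the normal.
            if sum(a * b for a, b in zip(d, normal)) < 0.0:
                d = tuple(-c for c in d)
            return d

def ambient_occlusion(point, normal, trace_ray, radius=0.5, samples=32):
    """RTAO-style estimate: the fraction of short hemisphere rays that escape nearby geometry.
    This only darkens ambient light in creases and corners; it never asks where light comes from."""
    unoccluded = 0
    for _ in range(samples):
        if not trace_ray(point, sample_hemisphere(normal), max_distance=radius):
            unoccluded += 1
    return unoccluded / samples  # 1.0 = fully open, 0.0 = fully occluded

def sun_visibility(point, sun_direction, trace_ray):
    """Ray-traced sun shadow: one long ray toward the sun. If anything blocks it, the point
    receives no direct sunlight, which (together with proper GI) is what prevents light leaks."""
    return 0.0 if trace_ray(point, sun_direction, max_distance=float("inf")) else 1.0
```

So even a perfect ambient_occlusion term can only darken corners; if the sun or a GI probe wrongly "sees" an interior, the leak stays until a visibility test like sun_visibility (or properly traced GI) says otherwise.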
 
Looks like Zen 3 performance is having a hard time with it.


Edit: it turns out he was running with DLSS 3 frame generation enabled on the Intel system without knowing it. Still, something looks off about Zen performance.
 

Looks like Zen 3 performance is having a hard time with it.

Wow! I'm wondering how Zen 4 is doing... does anyone have a spare 50 quid so I can buy the game and test it? :roll:
 
CapFrameX posted some results: the 13900K is 45% faster than the 7950X with RT.
Is that with or without that frame generation "bug"?
 
It's not a bug. When you disable DLSS2, you can't turn FG on or off because it's greyed out as an option. So if you have FG on and you disable DLSS2, FG remains on.
It's a "bug" because it's on by default on 40X0 cards when it should not: you have to go "through hoops" to disable it.
 
It's a "bug" because it's on by default on 40X0 cards when it should not: you have to go "through hoops" to disable it.
What do you mean, man? It's a simple button in the menu. What hoops? lol. W/e
 
What do you mean, man? It's a simple button in the menu. What hoops? lol. W/e
I linked it in my very first reply to you (first link), which you obviously didn't read...

Let me help you:


It's explained in the video, starting @ 22:34.
 
I linked it in my very first reply to you (first link), which you obviously didn't read...

Let me help you:


It's explained in the video, starting @ 22:34.
I have the game; do I need someone to explain to me how to turn FG on or off? lol

And yeah, that video is complete BS. It never turns itself on randomly, he is making stuff up lol
 
I have the game; do I need someone to explain to me how to turn FG on or off? lol

And yeah, that video is complete BS. It never turns itself on randomly, he is making stuff up lol

So you're calling @W1zzard a liar... I see...

It doesn't turn itself on randomly: it's on BY DEFAULT if you have a 40X0 card, and you have to turn it off (if you don't want it on, ofc), but you can't do so DIRECTLY because the option is greyed out, which is why you have to "go through hoops".
 
So you're calling @W1zzard a liar... I see...

It doesn't turn itself on randomly: it's on BY DEFAULT if you have a 40X0 card, and you have to turn it off (if you don't want it on, ofc), but you can't do so DIRECTLY because the option is greyed out, which is why you have to "go through hoops".
But the video you posted says that it turns on randomly: "make sure FG didn't decide to turn itself on". WTF does that even mean? It doesn't turn itself on.

And yes, you can directly turn it OFF, because DLSS2 is also on by default.
 
But the video you posted says that it turns on randomly: "make sure FG didn't decide to turn itself on". WTF does that even mean? It doesn't turn itself on.

And yes, you can directly turn it OFF, because DLSS2 is also on by default.

Be so kind as to ask NVIDIA about it: perhaps THEY can enlighten you.

Watch the HU video I linked: it goes on about this for JUST UNDER a minute, so be sure to watch THAT portion to understand the issue.
 
And yes, you can directly turn it OFF, because DLSS2 is also on by default.
If an option is greyed out, 99% of people would assume that it's off, as its sort-of "parent" option is also off. Maybe not you, but the majority of people, surely.

If the two options work independently from each other, then why do both get greyed out when you turn off DLSS? This is why people call it a bug.
 
If an option is greyed out, 99% of people would assume that it's off, as its sort-of "parent" option is also off. Maybe not you, but the majority of people, surely.

If the two options work independently from each other, then why do both get greyed out when you turn off DLSS? This is why people call it a bug.
By default it's not greyed out, because DLSS is also on by default, at least it was on my computer. It only greys out when you turn DLSS OFF, which I agree is silly, but Hardware Unboxed's video made it seem like FG decides to randomly turn itself on and off, which is definitely not the case.
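For what it's worth, the behaviour being argued about here is a classic UI-state pattern: greying out a dependent toggle disables the control, but it doesn't necessarily reset the setting behind it. A minimal, hypothetical sketch in Python (a toy model of the pattern, not the game's actual settings code) looks like this:

```python
class GraphicsSettings:
    """Toy model of the 'greyed out but still in effect' pattern described above."""

    def __init__(self):
        # Assumed defaults matching what posters describe on a 40-series card:
        # both upscaling and frame generation start out enabled.
        self.dlss_enabled = True
        self.frame_generation = True

    def set_dlss(self, enabled: bool) -> None:
        # The contentious part: turning DLSS off greys out the FG toggle in the menu,
        # but the stored frame_generation flag is never cleared.
        self.dlss_enabled = enabled

    def fg_toggle_greyed_out(self) -> bool:
        # The FG control is only interactive while DLSS is on.
        return not self.dlss_enabled

    def frame_generation_active(self) -> bool:
        # The renderer reads the raw flag, so FG can stay in effect
        # even while its toggle looks disabled in the menu.
        return self.frame_generation


settings = GraphicsSettings()
settings.set_dlss(False)                    # user turns DLSS off
print(settings.fg_toggle_greyed_out())      # True  -> looks off/disabled in the UI
print(settings.frame_generation_active())   # True  -> but FG is still running
```

Whether you call that a bug or a design choice is exactly the disagreement above; the usual fix for this pattern is to clear or re-derive the dependent setting whenever its parent changes, so the UI state and the effective state can't drift apart.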
 
Friends, that's how Unreal Engine is. If you played A Plague Tale: Requiem, performance drops by 50% as soon as you enter a settlement with more NPCs, and if you tried the Unreal Engine 5.1 City Sample demo, the same thing happens: CPU limit!
Plague Tale: Requiem doesn't use Unreal Engine.
 