
Hogwarts Legacy Benchmark Test & Performance Analysis

PC HDR broken
 

Attachments

  • HDR.jpg (678.5 KB)
If you don't run "lab conditions", then you end up thinking that a Pentium 4 is as fast as a 13900K, because they get the same framerate at 8K.

The usefulness of lab-conditions testing is that the moment a game shows up (like Cyberpunk, Hogwarts, or Control) with a 50% difference in RT performance, we don't blame drivers or optimizations because the difference is "supposed" to be 17%. We know the difference isn't supposed to be 17%, and the 50% we are seeing is perfectly normal.
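The bottleneck argument above can be put into a toy model (my own illustration, not anything from the benchmark): each frame needs both the CPU and the GPU to finish, so the delivered framerate is capped by whichever is slower.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Toy bottleneck model: a frame ships only when both the CPU and
    the GPU have finished their work, so the slower side sets the cap."""
    return min(cpu_fps, gpu_fps)

# At 8K the GPU is the limit, so wildly different CPUs look identical:
assert effective_fps(cpu_fps=60, gpu_fps=12) == effective_fps(cpu_fps=500, gpu_fps=12)

# At 720p low the GPU cap is lifted and the CPU difference shows up:
assert effective_fps(cpu_fps=60, gpu_fps=900) != effective_fps(cpu_fps=500, gpu_fps=900)
```

The FPS numbers are made up; only the min() relationship is the point.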
I don't disagree. The only problem is that you can't define "lab conditions" when you look at games. Sure, you can test at 720p low to create a CPU bottleneck, but how do you create a bottleneck in the RT engine? Games, and even 3D benchmarks, work with complex scenes that include many different elements and require all of your GPU to work as a system. I'm sure it's possible to create a combination of elements that tanks Nvidia GPUs just as much as it does AMD ones, even if, for some reason, we haven't really found it yet.

I'm also not questioning whether AMD is worse at RT than Nvidia (they obviously are). I'm just saying that there is some ambiguity in how much it is the GPU's fault, and how much drivers and game support are a contributing factor. Do you remember Crysis 2 (or was it 3?) that worked with a huge tessellated sea under the map which tanked performance especially on AMD GPUs? I'm thinking along this line. No one questioned whether AMD was worse in tessellation than Nvidia. We only questioned whether a huge tessellated sea under the map was necessary.

Unless you create a scene that uses one specific effect only, with no textures, polygons, or anything to use only one specific part of your GPU (I don't know if it's possible), you can't be sure of the objective performance difference. As long as this ambiguity exists (it probably always will), I'd much rather just look at results from individual games and see which GPU is faster in what game without drawing conclusions. As for Hogwarts Legacy, I think we can agree that there is something in the RT Ultra setting that kills AMD GPUs, but running benchmarks, or arguing about other games won't tell us what it is.

Edit: TL;DR: All I'm saying is, the situation with Hogwarts Legacy at Ultra RT is probably more complex than "AMD is shit at RT".
 
Just one moment, guys :)

Have you seen this?

In short, you should be able to verify how much VRAM is being used in Afterburner (4.6.3 Beta 2 Build 15840 or later):
1. Near the top, next to "Active Hardware Monitoring Graphs", click the "...".

2. Click the checkmark next to "GPU.dll" and hit OK.

3. Scroll down the list until you see "GPU Dedicated Memory Usage", "GPU Shared Memory Usage", "GPU Dedicated Memory Usage \ Process", and "GPU Shared Memory Usage \ Process".

4. Pick and choose what you want tracked using the checkmarks next to them. "GPU Dedicated Memory Usage \ Process" is the number that most closely reflects the number we find in the FS2020 Developer Overlay and Special K (DXGI_Budget, except Unwinder uses the D3DKMT API).

5. Click "Show in On-Screen Display" and customize as desired.

***

I've been using this "GPU Dedicated Memory Usage \ Process" monitoring for a week; it shows lower usage than the VRAM allocation figure I was watching before.
This shows what's allocated per process, which is still not the actual usage we're looking for. It's useful in the sense that it doesn't include VRAM used by DWM (Desktop Window Manager), browser tabs, etc.
 
I wholeheartedly disagree. RTAO is very distinguishable and so far always results in a clearly more realistic look for the game than other AO methods.
I'm with @Vya Domus on this one. When I looked at the comparison screenshots, the only obvious differences I spotted between RT and non-RT were the shadows and reflections. I had to look more carefully to see the difference in AO, and even then, I wouldn't say it's clearly better, just different. And these are just screenshots, I bet I would see even less difference while playing the game.
 
I wholeheartedly disagree. RTAO is very distinguishable and so far always results in a clearly more realistic look for the game than other AO methods.

Sure, then show me such an instance where RTAO makes a huge difference, I'll wait.
 
Just played a bunch of this game at 4K with DLSS 3, quality, with 0.5 sharpness enabled and RT Ultra. It runs amazingly so far; I haven't seen it drop under 100 FPS, and no lag at all. Amazing game.
 
Just played a bunch of this game at 4K with DLSS 3, quality, with 0.5 sharpness enabled and RT Ultra. It runs amazingly so far; I haven't seen it drop under 100 FPS, and no lag at all. Amazing game.
Yeah, DLSS 3 is amazing... it doubles your FPS with no visible difference (at least I can't see any difference during actual gameplay).

Just did a test of the patch that just came out

Update Feb 10: The game is now released for everyone and there's a new patch. I've tested the patch on RTX 4090 and RX 7900 XTX, with RT on and off, there's no change in performance. The DLSS 3 menu bug is also not fixed.
 
So RT low makes it perform better on the 7900 XTX. How does overall image quality compare?
 
Just played a bunch of this game at 4K with DLSS 3, quality, with 0.5 sharpness enabled and RT Ultra. It runs amazingly so far; I haven't seen it drop under 100 FPS, and no lag at all. Amazing game.
I'm also at 4K, DLSS 3 off. Actually, only RT shadows are heavy; if you turn those off, you can get 60-65+ with everything else on Ultra at native 4K + RT, lol.
 
Reflections are night and day, but shadows look better at low, in my opinion.
 
Yeah, I think there is a bug in the game or AMD's drivers (probably the latter, or both). At Ultra settings, 3440x1440, RT Ultra, and no FSR on a stock RX 6800, I'm getting 20 FPS in a more demanding scene. With RT low I get 27 FPS; with no RT, 57 FPS.

BUT, after playing with the settings with RT on, I suddenly got stuck at 4 FPS at 99% GPU utilization. I saved, restarted the game, and performance came back. Guessing something like this is happening for you, @W1zzard, though more frequently/permanently.
 
Reflections are night and day, but shadows look better at low, in my opinion.
You can completely turn RT shadows OFF with everything else at Ultra, and the game performs insanely, unexpectedly fast. I'm talking about 4K native on a 4090, maybe even a 4080.
 
You can completely turn RT shadows OFF with everything else at Ultra, and the game performs insanely, unexpectedly fast. I'm talking about 4K native on a 4090, maybe even a 4080.
Then I guess we found the culprit: there's something in the game's implementation of RT shadows that slows down Nvidia cards and completely kills AMD ones.
 
Then I guess we found the culprit: there's something in the game's implementation of RT shadows that slows down Nvidia cards and completely kills AMD ones.
I don't know if I'm in the heaviest areas of the game yet; I'm running around Hogwarts now. Maybe it gets heavier later, but at this point, with shadows off and everything else maxed out + RT at native 4K, you can get an almost locked 60. With DLSS Quality you get 80-90+.

The game also looks to be CPU-bound at some point: around 110-130 FPS, it stays there no matter how much I lower the resolution. Nvidia driver overhead, I assume, although HUB with his 7700X was dropping as low as 80. :L
 
Yeah, DLSS 3 is amazing... it doubles your FPS with no visible difference (at least I can't see any difference during actual gameplay).

Just did a test of the patch that just came out

Update Feb 10: The game is now released for everyone and there's a new patch. I've tested the patch on RTX 4090 and RX 7900 XTX, with RT on and off, there's no change in performance. The DLSS 3 menu bug is also not fixed.

Can you reduce RT shadows to low with everything else at Ultra, then run a benchmark?
 
I don't know if I'm in the heaviest areas of the game yet; I'm running around Hogwarts now. Maybe it gets heavier later, but at this point, with shadows off and everything else maxed out + RT at native 4K, you can get an almost locked 60. With DLSS Quality you get 80-90+.

The game also looks to be CPU-bound at some point: around 110-130 FPS, it stays there no matter how much I lower the resolution. Nvidia driver overhead, I assume, although HUB with his 7700X was dropping as low as 80. :L
I would love to test it myself, but I can't bring myself to give 50 quid for a game. It'll have to wait, I'm afraid. :(
 
Friends, that's how Unreal Engine is. If you played A Plague Tale: Requiem, performance drops by 50% as soon as you enter a settlement with more NPCs; if you tried the Unreal Engine 5.1 City Sample demo, the same thing happens. CPU limit!
 
Reflections are night and day, but shadows look better at low, in my opinion.
This was mainly the sort of thing I was getting at by asking. Comparisons in screenshots of lighting effects can be hard to distinguish. That temporal component I mentioned earlier is probably more noticeable. Of course, once you start moving around, minor differences start disappearing, especially if motion blur and flaring get used. Something seems wrong with the non-RT reflections, though. I've played games where reflections are handled pretty well without RT. In this game, reflections are either mush or nonexistent when RT is off.
 
This game is fucking broken. Imagine having multiple monitors while playing this game: one added monitor already uses close to 4 GB of VRAM, just having YouTube playing on the second one while browsing on the first. :kookoo:
 
Friends, that's how Unreal Engine is. If you played A Plague Tale: Requiem, performance drops by 50% as soon as you enter a settlement with more NPCs; if you tried the Unreal Engine 5.1 City Sample demo, the same thing happens. CPU limit!
A Plague Tale doesn't use Unreal Engine, though; it uses Asobo's custom in-house engine.
I don't know if it's a variant of Unreal Engine, however. It really looks like an Unreal Engine game, some of its settings too, but I can't be quite sure.
 
Definitely. I clean-installed the newest drivers and ran the game. Not sure what else I could do.
It might be related to the DLSS settings bug, where greyed-out options are still in effect.

I'm playing it on a 5500 XT 8 GB at 1080p with a Ryzen 5600X, high settings, FSR 2.0, at 40-50 FPS.

Why do I need to buy a 4080 to turn RT on?

Really not worth it.
A 5500 XT is not a ray tracing card and never will be, but I do agree ray tracing is not worth it.
 