
Hogwarts Legacy Benchmark Test & Performance Analysis

I don't get the hate. First 'Next gen' games are always like this. Devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
Visuals are getting harder to improve meaningfully without impacting performance too much, that's how diminishing returns work. I also wouldn't expect AAA, story-based games to err on the side of performance rather than visuals.
I see your point, but I still don't think one should need a $400 GPU just to run the game at 1080p with RT off.

On the bright side, as long as Low gives you 95% of the visual quality of Ultra with 3x the FPS, I'm fine. :)

On the other bright side, I remember when I bought my 6750 XT and some people tried to suggest upgrading my monitor because 1080p is so oldschool, and the 6750 XT can do so much more, and I was like "nah, it'll be fine". Who was right then? :laugh:
 
Thanks for another in-depth performance review, love these! Really appreciate the inclusion of 1% lows :clap:

Judging by the screenshots the graphics look good, even great at times. But certainly not jaw-dropping or "next-gen". Neither do they justify the high requirements, especially @ 4K where only the top performing cards are able to reach 60 fps.

What I like is that the 7900XTX is 50% faster than the 6900XT in raster, finally in line with AMD's initial claims.
What I don't like is the abysmal performance of the whole 6000/7000 series in RT. We should be seeing at least double the number of frames here. Perhaps it's a driver issue, but more likely to do with the game engine itself. Hopefully it will get addressed in a future patch.

I'm not surprised by the VRAM allocation @ 4K either. I've tested several older games at this resolution and seen 10-12 GB, or 13-14 GB with RT on. And at 1080p, a few existing games can already allocate 8-9 GB with details maxed out.
 
Thanks for the review; it's always nice to have another RPG in my list of games I haven't played yet :D The 99th percentile frame time is a welcome addition.
 
I'm playing the game and it does look great, but I'm disappointed with the performance. It really sucks that developers are bringing console games over to PC and not optimizing them at all. My 3080 Ti should be able to run this at 1440p with RT on at a steady 60 FPS, but it doesn't. I also noticed that my video card and CPU are not reaching 100% usage, even though my frames were in the 40s. My GPU usage was around 50% and CPU in the 30s. This game badly needs optimization.
 
I suppose the game only uses 2-3 cores, and if they're not strong enough (that is, they're not the latest and greatest), it'll choke the GPU.
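The reasoning above (a few overloaded threads starving the GPU) can be sketched with a toy model: whichever side needs more time per frame paces the frame rate, and the other side sits partly idle. The numbers below are purely illustrative, not measurements from this game.

```python
def bottleneck_model(cpu_ms_per_frame: float, gpu_ms_per_frame: float):
    """Toy single-queue model: the slower of the two stages paces the frame.

    Returns (fps, gpu_utilization). Real engines pipeline work across
    frames, so this is only a first-order approximation.
    """
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    fps = 1000.0 / frame_ms
    gpu_utilization = gpu_ms_per_frame / frame_ms
    return fps, gpu_utilization

# Illustrative: a render thread needing 25 ms/frame with a GPU that only
# needs 12 ms/frame yields 40 FPS at ~48% GPU usage, i.e. CPU-bound,
# which matches the "40s FPS at ~50% GPU" symptom described above.
fps, util = bottleneck_model(cpu_ms_per_frame=25.0, gpu_ms_per_frame=12.0)
print(round(fps), round(util * 100))  # 40 48
```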
 
It's 1% low. Will add that to the chart notes. I'm working on adding min FPS to my other testing. Spent the whole of January retesting CPUs, GPUs and SSDs… and still not finished.. fml
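For anyone curious how a "1% low" figure is usually derived from a frame-time capture: one common definition is the average FPS over the slowest 1% of frames. The exact method varies between tools, so treat this as a sketch of that common definition, not TechPowerUp's exact pipeline:

```python
def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    """Average FPS over the slowest 1% of frames (one common definition)."""
    if not frame_times_ms:
        raise ValueError("need at least one frame time")
    slowest_first = sorted(frame_times_ms, reverse=True)
    n = max(1, len(slowest_first) // 100)  # at least one frame
    avg_ms = sum(slowest_first[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at 10 ms plus one 20 ms hitch: the average FPS is ~99,
# but the 1% low is 50 FPS, which is what exposes the stutter.
times = [10.0] * 99 + [20.0]
print(one_percent_low_fps(times))  # 50.0
```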

Honestly, you're at a point now where, if you wanted, you could probably step back, take the role of a traditional CEO, and just hire a GPU game tester and an NVMe tester to follow your testing outline. I think deep down you enjoy it though; it's just getting to be a bit too much because there is so much variety now.

Regardless, this game looks interesting to me, thanks for the review. :toast:
 
"VRAM usage on AMD and NVIDIA is very similar and not too unreasonable."

@W1zzard I would say 9GB as a minimum for 900p is kinda unreasonable :D
Bah, I keep forgetting about this line.. I always copy-paste an older review as a template.. fixed
 
I don't get the hate.
Reeding iz HRd.
First 'Next gen' games are always like this. Devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
"Next gen" started 3 years ago, and current mid-range PC hardware is already far more powerful than the consoles. It also doesn't justify the weaker console running at 30 FPS and the PC port running at 11 on more powerful hardware. With previous "next gen" transitions, it was always the other way around.
Visuals are getting harder to improve meaningfully without impacting performance too much, that's how diminishing returns work. I also wouldn't expect AAA, story-based games to err on the side of performance rather than visuals.
By that logic, this game should run fine on 1000-series cards, since it's getting harder to improve visuals, and a AAA story-focused game shouldn't be trying to push insane visuals (and it doesn't; it looks OK, but not groundbreaking). And yet, despite the game looking worse on Low than some games from a few years ago, it chugs HARD.

This is evidence of a very poorly optimized title. A game that runs at 30 FPS on an Xbox Series S should not be hitting 11 FPS on a 6700 XT.
 
:love: That's the nicest thing I've read today
To be fair, I just read the review myself, and I want to buy the game now. :D I didn't have much faith in the project, but these lines really got me:
"Hogwarts Legacy is one of the best movie/book adaptations in a game that I've encountered in a long time. Not only did the developers get the Hogwarts Castle right, they were also able to fill the world with lots of unique and entertaining places that are worth visiting. What's even more important is that these are believable in J.K. Rowling's Harry Potter universe and don't feel like the devs went crazy with their own ideas." ... "Even if you're not particularly into Harry Potter, but enjoy the magic/fantasy/wizardry concept, then you should definitely take a look, check out the numerous game reviews online."

I'm just sad about how poorly it runs, but as long as I'll only have to sacrifice RT on my machine, oh well... :)
 
okay.

Looking at those screenshots, some fucking imbecile must've done the lighting system for this game... it just looks wrong, and it's completely ruining the graphics.
Sorry, but I've seen 15-year-old games with better lighting, which in turn makes their graphics far more believable and, personally speaking at least, better. No amount of polygons can conceal the fact that the outdoor lighting looks as if every other vertex is a subtle light source or something, making everything look washed out and overexposed at the same fucking time, as if you've turned a 2010s game's graphics settings to the minimum.

sigh.
 
I didn't have much faith in the project
Me neither, but I really have to admit it's a good game.. wish I had more time :/ not a big Harry Potter fan btw and I still like it
 
I like Harry Potter... I just didn't think any adaptation of anything still had a chance of turning out to be anything like the original material anno 2023. Reading the review, this one is a refreshing exception. :)

Edit: Also, because the first three Harry Potter PC games were awesome, but the series took a massively deep dive into poorly executed console port territory afterwards.
 
When I was a kid I remember my parents driving me to line up for the midnight releases of the books. It was a cool time to be a kid. I haven't re-read them since and didn't like the movies though. This will be cool game for my nostalgia for sure. :toast:
 
The movies were the exact opposite to the PC games, imo (see my post above). They started out kind of simple and bland, but got better in later episodes. 4-5-6 are my favourites. 2 and 3 aren't too bad, either.
 
What do you mean, full path? As in the entire scene is ray traced? Obviously not; you can't do that in real time on anything.
I suppose it was meant as all indirect lighting handled by hardware RTGI + RTAO + RT reflections, plus distant shadows via RT shadows... like Metro Exodus Enhanced Edition and Dying Light 2.

Actual full path tracing would be Quake II RTX... maybe Minecraft with RTX... not sure about Portal with RTX
 
This game is awesome; my eyes have bags. Thanks for the review!

EDIT: I think the best part of the review so far is that, after all these years, @W1zzard can finally play as himself in a game. It must be freeing.
 
 
No Intel Arc graphics card tested, even though Intel is the only company that provided optimized drivers. I know that even their A770 will be at the bottom of the charts, but I'm still interested to know how Intel's hardware and drivers evolve...
Came to say the same thing.

Let's see Arc up there. Looking forward to seeing the blue bars! LOL

FIRST to optimise drivers. FIRST ROW on the performance charts (ascending). LOL
 
Adding Intel's A770 and A750 to the charts would be a very welcome addition.
 
On a 1070 Ti at 1080p and Medium settings, the game ate about 7.5 GB of VRAM. It still looks good and runs well (the CPU helps...).
On High, the game crashed with an "Out of video memory" error almost straight away when arriving at Hogwarts, but was fine until then, even with half the settings on Ultra (because I'm a masochist).
I've gotten 2-3 of those 30-second shader-compilation pauses so far.

Regardless, there is clearly a need for optimisation, but it's otherwise acceptable, imho.
 

Most games are like this today: they look 95% as good at medium-to-high settings, and you gain lots of FPS.

My config:
5900X + 64 GB RAM @ 3600 MHz + RTX 3090
1. Why does the game allocate so much VRAM? I've got 24 GB on the 3090. Bug!
2. Maybe someone should try this:

I might try this on Witcher 3, I always get stuttering in that game.
 