
Hogwarts Legacy Benchmark Test & Performance Analysis

W1zzard

Hogwarts Legacy lets you relive the steps of Harry Potter and is a fantastic adaptation of the books. In our performance review, we're taking a closer look at image quality, differences between the graphical setting presets, VRAM usage, and performance on a selection of modern graphics cards.

 
Is Hogwarts Legacy using full path-traced RT? The 3060 is faster than the 7900 XTX at 4K with RT on..
If it's hybrid RT, it's supposed to be around the 3090.
 
It's hybrid RT, of course
 
Is Hogwarts Legacy using full path-traced RT? The 3060 is faster than the 7900 XTX at 4K with RT on..
If it's hybrid RT, it's supposed to be around the 3090.

What do you mean by full path? As in the entire scene is ray traced? Obviously not, you can't do that in real time on anything.
 
That RT performance makes no sense. No way it's working as intended.

Either way, the game looks a bit meh overall, even with RT on (it's certainly way below what you should expect when comparing the visuals to the FPS numbers), not to mention that I even prefer the visuals of the low RT setting.
 
That RT performance makes no sense. No way it's working as intended.
Not that it'd be the first time. Remember Portal RTX? It was atrocious on anything that wasn't the 4000 series; that didn't make any sense either.
 
Not that it'd be the first time. Remember Portal RTX? It was atrocious on anything that wasn't the 4000 series; that didn't make any sense either.
It made a bit of sense, as it's built with Nvidia's RTX Remix tools, so Jensen made sure to cripple AMD performance.
 
It made a bit of sense, as it's built with Nvidia's RTX Remix tools, so Jensen made sure to cripple AMD performance.

It sucked on the 3000 and 2000 series as well; it was clearly meant to cripple performance on anything that wasn't Ada.
 
Nvidia spent good money .....
 
This game is sponsored by neither Nvidia nor AMD lol, yet some people here are blaming Nvidia for some reason ;)

Though the RT results do crack me up
 
Another game where Radeons bite the dust with RT enabled. Another game where the difference between Low and Ultra graphics is RT, AO, shadow quality and nothing else. What a surprise.

Anyway, being true to the original HP world got my interest piqued. :)
 
No Intel Arc graphics card tested, even though Intel is the only company that provided optimized drivers.. I know that even their A770 will be at the bottom of the charts, but I'm still interested to see how Intel's hardware and drivers evolve...
 
Every AAA release this year so far (Forspoken, Dead Space and now this) performs dreadfully despite providing no meaningful uplift in graphical quality. Hogwarts and Forspoken especially look more or less like PS4 titles with slightly better textures.
I find it a big shame that DLSS/FSR are now just used as a crutch instead of enabling next-gen graphics.
 
It seems nobody tested the game on Radeon with ray tracing, because they would have noticed the white trees that turn to the correct green color once you walk up to them.
This is absolutely mind boggling.

Hogwarts Legacy tested: How realistic are the high system requirements? [Update: RTX 4090 vs. RX 7900 XTX, drivers] - image in original size (29) (pcgameshardware.de)
[Image: Hogwarts-Legacy_Radeon-RX-7900-XT-X_Visual-Corruption-pcgh.png]
 
No Intel Arc graphics card tested, even though Intel is the only company that provided optimized drivers.. I know that even their A770 will be at the bottom of the charts, but I'm still interested to see how Intel's hardware and drivers evolve...
Meh, I forgot.. will add later today

Good to see that MIN FPS chart.

Is that a "real" MIN FPS or something like 0.1%?
Will it be implemented in all subsequent game reviews?
It's 1% low. Will add that to the chart notes. I'm working on adding min FPS to my other testing. Spent the whole of January retesting CPUs, GPUs and SSDs… and still not finished.. fml
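For anyone curious how a 1% low differs from an absolute minimum FPS: it's taken from the slowest 1% of all frames in the run rather than from a single worst frame, so one-off hitches don't dominate the number. Below is a minimal Python sketch of one common way to compute it from captured frame times; this is an illustration under that assumption, not necessarily the exact method used in the review.

```python
# Illustrative only: one common way to derive a "1% low" FPS figure from
# per-frame render times (in milliseconds). TechPowerUp's exact methodology
# isn't described in this thread, so treat this as an assumption.

def one_percent_low_fps(frame_times_ms):
    """Average FPS over the slowest 1% of frames in a capture."""
    if not frame_times_ms:
        raise ValueError("no frame times captured")
    worst_first = sorted(frame_times_ms, reverse=True)  # slowest frames first
    count = max(1, len(worst_first) // 100)             # worst 1%, at least one frame
    avg_ms = sum(worst_first[:count]) / count
    return 1000.0 / avg_ms                              # ms per frame -> FPS

# Example: a mostly smooth 60 FPS run (16.7 ms frames) with a few heavy stutters
times = [16.7] * 990 + [50.0] * 10
print(f"1% low: {one_percent_low_fps(times):.1f} FPS")  # ~20 FPS, far below the average
```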
 
I wonder how it manages 30 FPS with RT on the PS5, while on PC, with probably more detail, it's at 11 FPS on a 6700 XT, a GPU a bit more powerful than what the PS5 has.
PC games these days should have a release/launch date and a "playable" date one year later, after many patches, driver updates and bug fixes.
 
Appreciate the review W1z - makes me glad I've not paid MSRP for a new Nvidia card in years, since I've bought my last couple used. Even the latest and greatest 4090, costing anywhere from $1600-$2000 in many places, can't run the game at 4K with RT... ouch.
 
Meh, I forgot.. will add later today


It's 1% low. Will add that to the chart notes. I'm working on adding min FPS to my other testing. Spent the whole of January retesting CPUs, GPUs and SSDs… and still not finished.. fml
Appreciate the thoroughness nonetheless!
 
Thanks W1zzard, my son read your review and now wants to buy the game. So I guess his homework will have to wait now..... LOL.. :)
 
I wonder how it manages 30 FPS with RT on the PS5, while on PC, with probably more detail, it's at 11 FPS on a 6700 XT, a GPU a bit more powerful than what the PS5 has.

It might not actually be using hardware RT; it looks like developers target software implementations on consoles - Fortnite and Crysis Remastered use software RT on both the PS5 and Series X, for example. So it may be the case that they just don't bother to properly implement the hardware RT path for PC.
 
I wonder how it manages 30 FPS with RT on the PS5, while on PC, with probably more detail, it's at 11 FPS on a 6700 XT, a GPU a bit more powerful than what the PS5 has.
PC games these days should have a release/launch date and a "playable" date one year later, after many patches, driver updates and bug fixes.
We're back to devs not caring about PC ports yet again, just like the old days.

Devs have been spoiled by abundant memory and hardware of new consoles, and apparently 20 FPS stutters are just acceptable to gamers now. Hell, go look at the Intel optimized driver thread and you'll see one poster after another meatshielding Nvidia/AMD for lack of driver optimization, since that's the "old school" way. The good old days of the PS3/4 era when devs had to actually optimize to get their games to run are behind us.
 
We're back to devs not caring about PC ports yet again, just like the old days.

Devs have been spoiled by abundant memory and hardware of new consoles, and apparently 20 FPS stutters are just acceptable to gamers now. The good old days of the PS3/4 era when devs had to actually optimize to get their games to run are behind us.
Yep. Because the answer to every performance problem is "just throw a 4090 at it" nowadays.
 
I don't get the hate. The first 'next gen' games are always like this. Devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
Visuals are getting harder to improve meaningfully without impacting performance too much; that's how diminishing returns work. I also wouldn't expect AAA, story-based games to err on the side of performance rather than visuals.
 