
Hogwarts Legacy Benchmark Test & Performance Analysis

Oh dear god, that RT effort from RDNA3 is abysmal, whether you have any interest in RT or not. 16 GB cards are now needed for 1080p gaming. First Forspoken and now Hogwarts. Got to say the 7800 XT, aka the 7900 XT, is disappointing in this game.
 
What is going on here? The A770 is on par with the 3090 in RT and much faster than the 7900 XTX?! Bruh, this is some wild crap.
 
People seem to keep waiting for the day when the 10GB 3080 chokes because of VRAM, but that day is yet to come
Well, I don't know how a very simple question led you to draw that conclusion. And no answer :(
 
Well the chart says usage, not allocation. It clearly means memory being used. Unless my definitions are somehow wrong.

So that begs the question... why isn't the card tanking to AMD levels of RT performance???
 
That one is obvious: the 3080 is equivalent to the 6800 XT in raster, and we all know by now that Nvidia has better RT performance. But that's the origin of my question (14 GB of VRAM used) - why? I was hoping someone with knowledge on the subject would explain.
 


Ouch
 

LOL, so he changed the wording for the "less" knowledgeable people :D
Thanks.
 

It's clear you needed help and were struggling with the miracle of the 10 GB wonder.
 
If the game was not using more than 10 GB, then it is no wonder or miracle at all - it's just business as usual. And for the record, because you seem to be laser-focused on the VRAM usage debate: I don't care. I know that by the time more than 10 GB of VRAM is used, the 6800 XT and 3080 won't be relevant; they won't have enough power to drive games that use that amount.
 

Well indeed, and if the 10 GB card isn't suffering as badly as expected, then there really isn't anything else to say - hell, I'd argue it's perfect for the level of grunt the GPU has.

In short, I think you may have answered your own question.
 
Something I would also like to know: is there any tool to know exactly how much VRAM is being used?
 
HWiNFO64 (its monitoring is a bit extensive and can lower FPS slightly) used together with RTSS (RivaTuner Statistics Server).
Or, if you have an AMD Radeon, the performance metrics section of the Adrenalin control panel.
Also, for Nvidia, MSI Afterburner includes RTSS.
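If you'd rather log it yourself than read an overlay, here's a minimal Python sketch using the nvidia-ml-py bindings (import name pynvml) on Nvidia cards. Note that, like the overlay tools, NVML reports memory that has been allocated on the GPU, not what the game is actively touching:

import time
import pynvml

# Minimal sketch: poll dedicated VRAM allocation on the first Nvidia GPU via NVML.
# Like overlay tools, this shows allocated memory, not the game's true working set.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()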
 
The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from the "Ultra" default, which helps a lot with performance, especially on AMD. RX 7900 XTX can now reach 60 FPS with ray tracing "low," at 1080p. Not impressive, but still a huge improvement over 28 FPS. In terms of VRAM usage, Hogwarts Legacy set a new record: we measured 15 GB VRAM allocated at 4K with RT, 12 GB at 4K with RT disabled. While these numbers seem shockingly high, you have to put them in perspective. Due to the high performance requirements you'll definitely not be gaming with a sub-16 GB card at 4K. 8 GB+ at 1080p is certainly a lot, but here, too, owners of weaker GPUs will have to dial down their settings anyway. What's always an option is to use the various upscaling methods like DLSS, FSR and XeSS, which lower VRAM usage, too, by running at a lower internal rendering resolution.
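To put the upscaling point into numbers, here's a small sketch using the commonly quoted per-axis scale factors for the DLSS/FSR 2 Quality, Balanced and Performance presets (treat them as approximate) to show how much smaller the internal render resolution gets at a 4K output:

# Rough sketch: internal render resolution per upscaling preset at 3840x2160 output.
# Scale factors are the commonly quoted per-axis values and are approximate.
OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for name, scale in PRESETS.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    ratio = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{name:11s}: {w}x{h} internal (~{ratio:.0%} of the native pixel count)")

Resolution-dependent render targets shrink roughly with that pixel count; textures and BVH data don't, so total VRAM usage drops by less than these ratios.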


So you have never tested with DLSS 2 Quality, Balanced, Performance, or Auto for lower-VRAM cards? Such as using DLSS Quality with an RTX 3080 10 GB / RTX 4070 Ti 12 GB at Ultra settings (without RT) at 2160p?
 
Looks like y'all are losing your minds over the poor performance, blaming the cards, drivers, and so on.

It's shituvo guys, always has been.
 
This game's all over social media today for all the wrong reasons, but one thing stood out: one user found .ini files you can customise the ray tracing in, and found the defaults to be absolute garbage for both performance and quality.

The FB screenshots couldn't really show much of a difference, but the FPS values were a lot higher with his changes (and since that's some random FB user, I'm sure better guides will exist elsewhere soon enough).
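For anyone who wants to poke at it themselves: in an Unreal Engine 4 game these overrides normally go into Engine.ini under the game's Saved\Config\WindowsNoEditor folder (for Hogwarts Legacy that should be under %LOCALAPPDATA%\Hogwarts Legacy\, but double-check the exact path). The cvars below are standard UE4 ray tracing settings; the values are purely illustrative, not the tweaks from that Facebook post:

[SystemSettings]
; Standard UE4 ray tracing cvars - values here are illustrative only.
; Stop tracing reflections on rougher surfaces:
r.RayTracing.Reflections.MaxRoughness=0.4
; One reflection ray per pixel:
r.RayTracing.Reflections.SamplesPerPixel=1
; Trace reflections at half resolution:
r.RayTracing.Reflections.ScreenPercentage=50
; One ambient occlusion ray per pixel:
r.RayTracing.AmbientOcclusion.SamplesPerPixel=1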

People seem to keep waiting for the day when the 10GB 3080 chokes because of VRAM, but that day is yet to come :p
High-performance storage and RAM will alleviate that issue.
Someone on DDR4-2133 with a SATA SSD would have a stutter fest, but review-level hardware with high-speed RAM and storage behind it won't suffer anywhere near as much.


And yes, it really is that drastic an issue - I fixed a friend's system with a weak-ass 2 GB GTX 960 and nearly doubled her FPS in DX12 games by OCing her RAM from 2133 to 2667. Higher VRAM is a buffer, but if a system can stream the data fast enough, it isn't needed (though it can cause those 1% and 0.1% lows to dip).

One of my Intel machines (i7-6700, locked to DDR4-2133) has great CPU performance but was *garbage* with a 4 GB GTX 980, with lots of stuttering - all gone with an 8 GB GTX 1070. The exact opposite fix to the same problem.
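As a rough sanity check on the bandwidth argument, here's a small sketch comparing theoretical peak transfer rates of the links involved once a game spills past VRAM (all figures are back-of-the-envelope peaks; the GDDR6 entry is a generic 256-bit, 14 Gbps example card, not any specific GPU):

# Theoretical peak bandwidths relevant to VRAM spill-over (rough, not measured).
def ddr4_dual_channel_gbs(mt_per_s):
    # 64-bit channel = 8 bytes per transfer, two channels.
    return mt_per_s * 8 * 2 / 1000

links = {
    "DDR4-2133 dual channel": ddr4_dual_channel_gbs(2133),  # ~34.1 GB/s
    "DDR4-2667 dual channel": ddr4_dual_channel_gbs(2667),  # ~42.7 GB/s
    "PCIe 3.0 x16 (one direction)": 15.75,                  # GB/s
    "SATA III SSD": 0.6,                                    # ~600 MB/s
    "GDDR6, 256-bit @ 14 Gbps": 14 * 256 / 8,               # 448 GB/s
}
for name, gbs in links.items():
    print(f"{name:30s} ~{gbs:6.1f} GB/s")

The gulf between local VRAM and anything on the far side of the PCIe bus is why spilling over hurts so much, and why faster system RAM and storage soften the stutter rather than eliminate it.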
 
Is Hogwarts Legacy using full path-traced RT? The 3060 is faster than the 7900 XTX at 4K with RT on...
If it's hybrid RT, it's supposed to be around the 3090s.
Nah, the RT is pretty much just borked in places right now.
Also, the RT effects are pretty low-res and noisy, so it's definitely an optimization issue. CP2077 looks better and runs less poorly in RT.
 
Nvidia spent good money .....
You can either say that Nvidia spent good money, or that AMD did not. Nvidia had a good head start in terms of hardware-accelerated RT, so it's not surprising most RT-heavy titles favor Nvidia hardware. I actually don't think RDNA3's RT performance is bad, since we can tell it's around the capability of Ampere's RT performance. But the inconsistent RT performance seems to suggest a lack of optimization for AMD hardware.
 
@W1zzard Did you use a Seasonic Vertex PX 850 W ATX 3.0 or a GX?

Thanks!
It's GX, thanks, fixing

But but, you write "usage" in your chart, so?

If you come up with a better wording that's accessible to everyone, I can change the charts

Something I would also like to know: is there any tool to know exactly how much VRAM is being used?
Nothing that's accessible to us as far as I know

The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from the "Ultra" default, which helps a lot with performance, especially on AMD. RX 7900 XTX can now reach 60 FPS with ray tracing "low," at 1080p.
Yup, I mention this in the conclusion
 
You can either say that Nvidia spent good money, or that AMD did not. Nvidia had a good head start in terms of hardware-accelerated RT, so it's not surprising most RT-heavy titles favor Nvidia hardware. I actually don't think RDNA3's RT performance is bad, since we can tell it's around the capability of Ampere's RT performance. But the inconsistent RT performance seems to suggest a lack of optimization for AMD hardware.
People are quick to jump to conspiracy theories on the internet; the fact that the 3090 is barely faster than the A770 in 4K RT means something is not working right.
Do people seriously think that the A770 is supposed to match a 3080 Ti in 4K RT?
On top of that, the RT in this game is nothing special for it to run this poorly.


Edit: Apparently HUB also found a menu bug in this game.
Hardware Unboxed: "OMG! I solved the issue, it's not a Ryzen bug but rather a menu bug. Although DLSS and all forms of upscaling were disabled & greyed out in the menu, frame generation was on for just the 40 series. I had to enable DLSS, then enabled FG, then disable FG and disable DLSS to fix it!" / Twitter
 
Is it just me, or is the graphics engine nothing special? It even looks outdated.
And boy, the AMD cards really do suck at ray tracing - not only in this game, but in every game so far...
Looks like the only proper card to play this game with everything maxed is the RTX 4090.

Any news on when the 50x0 series is going to be released, haha?
 
It is still on Unreal Engine 4.
 
The AMD numbers with RT on don't match another benchmark of this game at all (13900K, RT Ultra), where the Nvidia numbers were pretty much the same:
1080p/1440p, Ultra + Ultra RT:

7900 XTX - 59/42 fps
6900 XT - 45/29 fps
6800 XT - 40/26 fps
RTX 4070 Ti - 62/47 fps (the 1440p result is much higher than the 41 fps in this review, true)

I've never seen that big a difference in the same game with the same CPU. Because of that, I think there is something wrong with the RT-on tests here for the AMD cards - sure, AMD sucks at RT, but not that much. This is too strange for me.
 