
Hogwarts Legacy Benchmark Test & Performance Analysis

Yeah, dude. Things are rough. I just ordered my i9-13900KS this week and realized I had splurged on what's basically the most eccentric gaming CPU of all time - and by the time my Raptor Lake upgrade is complete, my RTX 3090 will be the oldest part in the build, the graphics card I've used the longest, and still the most expensive component - and this year it turns 3 full years old :(

I just can't upgrade it to the 4090, it's too expensive, and the 4070 Ti is a sidegrade that'd cost me 12 GB of VRAM in exchange for... DLSS3 FG? Maybe. Not to mention if I sold it, I couldn't make enough on it to afford a 4070 Ti to begin with. Yikes.
You purchased an unlocked i9 for gaming?
 
I just played for a few hours, haven't noticed one stutter, perhaps me locking CPU + GPU to a fixed frequency is keeping the frametimes consistent :rolleyes:


Looks like I get 10 FPS more in the CPU-limited area than the YouTuber with a 7700X
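
If anyone wants to sanity-check whether a clock lock like that actually holds in-game, here's a minimal Python sketch (assuming an NVIDIA card and the nvidia-ml-py / pynvml package, which is my choice of tooling) that just samples the reported clocks once a second - if the numbers wobble under load, the lock isn't sticking:

```python
# Minimal sketch: watch whether the GPU clocks stay pinned while the game runs.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package; run alongside the game.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        print(f"core: {core_mhz} MHz  mem: {mem_mhz} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```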
 
MSI are still doing the whole overheating VRMs thing?!

I thought they learned with B550, but it seems not
No, it's just MSI running overly conservative settings: they drop the IMC clock at DDR5-6000 before everyone else does. That led to the CPU running at half the IMC clock and terrible memory latency.
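
Just to put rough numbers on what "half the IMC clock" means at that speed - a quick back-of-the-envelope sketch in Python; the ratios listed are illustrative, not a claim about which gear MSI actually falls back to:

```python
# Back-of-the-envelope: DDR5-6000 clocks at common IMC (memory controller) ratios.
ddr_rate_mts = 6000               # DDR5-6000 = 6000 MT/s
mem_clk_mhz = ddr_rate_mts / 2    # DDR transfers twice per clock -> 3000 MHz memory clock

# Halving whichever ratio the other boards run is where the latency penalty comes from.
for label, divisor in [("1:1", 1), ("1:2", 2), ("1:4", 4)]:
    print(f"IMC at {label}: {mem_clk_mhz / divisor:.0f} MHz")
```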
 
I just played for a few hours, haven't noticed one stutter, perhaps me locking CPU + GPU to a fixed frequency is keeping the frametimes consistent :rolleyes:


Looks like I get 10 FPS more in the CPU-limited area than the YouTuber with a 7700X
I get similar results on a 12900K; the problem is that the game is incredibly CPU-bound at very low framerates. I mean, we are both dropping to the 60s due to a CPU limitation. Imagine someone without a top-end CPU, like a 3600X / 10600K or something similar.
 
MSI are still doing the whole overheating VRMs thing?!

I thought they learned with B550, but it seems not
I'd be happy to test it... if my MSi had VRM temperature sensors. :laugh:

All I can say is, I don't see a CPU clock change during a 30-minute Cinebench run, so the problem must be something else, I guess.

No, it's just MSI running overly conservative settings: they drop the IMC clock at DDR5-6000 before everyone else does. That led to the CPU running at half the IMC clock and terrible memory latency.
That can be corrected with manual settings.
 
I get similar results on a 12900K; the problem is that the game is incredibly CPU-bound at very low framerates. I mean, we are both dropping to the 60s due to a CPU limitation. Imagine someone without a top-end CPU, like a 3600X / 10600K or something similar.

Yeah, but I have not experienced any stutter with Ultra RT, which is great, and it's not like any Ada owner would turn off Frame Generation in this game :D.

Hogwarts Legacy would have been a great marketing push for Frame Generation, but somehow Nvidia is not sponsoring this game :rolleyes:
 
I've got 24 GB of VRAM and 64 GB of RAM. Only half the VRAM is actually in use: 12 GB in VRAM, with another 12 GB allocated somewhere on the NVMe. So 12 GB used + 12 GB allocated.
The game is CPU limited, that's all.
The game also uses 23 GB of RAM.
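
If anyone wants to double-check numbers like that outside the in-game overlay, here's a rough Python sketch (assuming an NVIDIA card plus the pynvml and psutil packages, my choice of tooling) that reports what the driver itself sees - note this is device-wide usage, the same kind of figure GPU-Z shows, not the game's per-process allocation:

```python
# Rough sketch: report driver-side VRAM usage and system RAM usage.
# Assumes an NVIDIA GPU, the nvidia-ml-py (pynvml) package, and psutil.
import pynvml
import psutil

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
vram = pynvml.nvmlDeviceGetMemoryInfo(handle)
ram = psutil.virtual_memory()

gib = 1024 ** 3
print(f"VRAM: {vram.used / gib:.1f} / {vram.total / gib:.1f} GiB used (device-wide)")
print(f"RAM:  {ram.used / gib:.1f} / {ram.total / gib:.1f} GiB used")

pynvml.nvmlShutdown()
```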
 
There appears to be more to this game than these charts are telling us. Steve Walton did an exhaustive Hogwarts Legacy performance comparison and, in his usual insane Aussie fashion, went all-out with it. He tested it with fifty-three different video cards. I don't even want to think of how long that must have taken, but he's discovered that the 8 GB and 10 GB VRAM buffers of the RTX 3070 and RTX 3080, respectively, seriously cripple those cards when playing in Hogsmeade.
 
There appears to be more to this game than these charts are telling us. Steve Walton did an exhaustive Hogwarts Legacy performance comparison and, in his usual insane Aussie fashion, went all-out with it. He tested it with fifty-three different video cards. I don't even want to think of how long that must have taken, but he's discovered that the 8 GB and 10 GB VRAM buffers of the RTX 3070 and RTX 3080, respectively, seriously cripple those cards when playing in Hogsmeade.

That is why, on all Nvidia RTX 3000/4000 cards with under 16 GB of VRAM, you shouldn't use RT with Ultra graphics at 2160p... also, DLSS must be ON...

IMO...
 
The new AMD GPU driver seems to bring a 6-8% uplift in RT performance, as some of us have noted in 3DMark in another thread. Can somebody confirm if the same applies in Hogwarts Legacy?
 
That is why, on all Nvidia RTX 3000/4000 cards with under 16 GB of VRAM, you shouldn't use RT with Ultra graphics at 2160p... also, DLSS must be ON...

IMO...
Yup. What I thought was funny was the fact that the RTX 3060 out-performed the RTX 3070 and RTX 3080. I can't say that I ever thought I'd see that. :laugh:

Hogwarts Legacy would have been a great marketing push for Frame Generation, but somehow Nvidia is not sponsoring this game :rolleyes:
Probably because, in Hogsmeade, the nVidia cards get exposed for their lack of VRAM. The RTX 3060 actually out-performs the RTX 3080.

I don't think that's what Jensen would consider to be a valuable marketing tool. :laugh:
 
Yup. What I thought was funny was the fact that the RTX 3060 out-performed the RTX 3070 and RTX 3080. I can't say that I ever thought I'd see that. :laugh:


Probably because, in Hogsmeade, the nVidia cards get exposed for their lack of VRAM. The RTX 3060 actually out-performs the RTX 3080.

I don't think that's what Jensen would consider to be a valuable marketing tool. :laugh:
Jensen probably thinks that all valuable members of "the Nvidia community" are using 40-series already. :laugh:
 
Played this at my mate's last night for a little bit. The gameplay/story doesn't really interest me, but performance is a hot topic. He has a 9700K + 3080 system, and there were no VRAM-related issues from the 3080 to note, with or without RT, Textures set to Ultra, at 4K using DLSS Quality; the game looked and played great. I tried a bit to replicate what HUB saw, but his results are basically in line with TPU's testing (with DLSS off). I wonder what's up with their system.
 
Played this at my mate's last night for a little bit. The gameplay/story doesn't really interest me, but performance is a hot topic. He has a 9700K + 3080 system, and there were no VRAM-related issues from the 3080 to note, with or without RT, Textures set to Ultra, at 4K using DLSS Quality; the game looked and played great. I tried a bit to replicate what HUB saw, but his results are basically in line with TPU's testing (with DLSS off). I wonder what's up with their system.

The new patch has reduced the texture pool size limit, so there's less VRAM usage and more texture streaming (similar to Doom Eternal's setting). Therefore HUB's testing is rather outdated by now, but they like to make a big deal out of VRAM :rolleyes:.

I'm also playing the game just because it's fun; I'm completely uninterested in the HP-verse.
 
Therefore HUB's testing is rather outdated by now, but they like to make a big deal out of VRAM :rolleyes:.
Yeah, AMD's volunteer marketing department are working overtime congratulating themselves about this one game 'obsoleting' the 3080, which turned out to be the game's issue lol. HUB love a good poke (at either brand, to be fair) and can't resist stirring the pot, but I agree it's essentially their review/testing that's actually obsolete.
 
Yeah, AMD's volunteer marketing department are working overtime congratulating themselves about this one game 'obsoleting' the 3080, which turned out to be the game's issue lol. HUB love a good poke (at either brand, to be fair) and can't resist stirring the pot, but I agree it's essentially their review/testing that's actually obsolete.
They don't call it 'AMD Unboxed' for nothing.
 
You purchased an unlocked i9 for gaming?

Why not? I'm upgrading from a Ryzen 5950X. Anything less and it wouldn't be worth it ;)
 
The new patch has reduced the texture pool size limit, so there's less VRAM usage and more texture streaming (similar to Doom Eternal's setting). Therefore HUB's testing is rather outdated by now, but they like to make a big deal out of VRAM :rolleyes:.
That's just the nature of YouTube, I guess. I remember when they made a big deal out of cheap B560 motherboards not being able to supply enough power and sustain stable clocks on certain higher-end processors. Everybody was bashing Intel for this at the time, while I was like: "Yeah, no sh*t, Sherlock! What about not putting high-end CPUs into cheap-ass motherboards?" :laugh:
 
The new patch has reduced the texture pool size limit, so there's less VRAM usage and more texture streaming
Hmm, no mention of it in the official patch notes. Did you check the specific line in Engine.ini before and after?

The February 14 patch does address a number of performance and stability issues, among them crashing caused by VRAM spillover.
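
For anyone who wants to check their own install, here's a rough Python sketch that looks for the stock Unreal texture streaming pool cvar (r.Streaming.PoolSize) in the user-side Engine.ini - the config path below is my assumption about where the game keeps its UE4 user config, so adjust it if yours lives elsewhere:

```python
# Rough sketch: look for a texture streaming pool override in the user Engine.ini.
# The path is an assumption about where Hogwarts Legacy keeps its UE4 user config.
import os
import re

ini_path = os.path.expandvars(
    r"%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini"
)

with open(ini_path, encoding="utf-8") as f:
    hits = [line.strip() for line in f
            if re.match(r"\s*r\.Streaming\.PoolSize\s*=", line, re.IGNORECASE)]

print("\n".join(hits) if hits else "no r.Streaming.PoolSize override found")
```

If nothing shows up, the game is presumably just using whatever pool size its own quality presets set internally.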
 
Jensen probably thinks that all valuable members of "the Nvidia community" are using 40-series already. :laugh:
Yep, everyone without an RTX 40-series card has no value to him, not even people with RTX-30 and below. :D

The new patch has reduced the texture pool size limit, so there's less VRAM usage and more texture streaming (similar to Doom Eternal's setting). Therefore HUB's testing is rather outdated by now, but they like to make a big deal out of VRAM :rolleyes:.
Well, even if that's the case (and I'm glad that it is), the day is not too far off when having less than 12GB really WILL cripple performance in hi-res gaming. I remember when the standard amount of VRAM for high-end cards was 512MB and 1GB was considered an upgrade. The HD 7970 started being crippled by its 3GB VRAM buffer long before the GPU itself was considered weak. The whole reason that high-end cards get larger VRAM buffers is so that the power of their GPUs can be leveraged further into the future without issue.

If you don't think that VRAM is a big deal, then I guess that GPU manufacturers should just put 8GB on everything. After all, it's not a big deal, eh?
I'm also playing the game just because it's fun; I'm completely uninterested in the HP-verse.
I liked the books and movies but the HP universe isn't even remotely as interesting as those of JRR Tolkien or HP Lovecraft so I'm not really into it either. I find that Sci-Fi universes are the most interesting of any genre anyway. Fantasy doesn't really have universes, it just has worlds. :laugh:
 
If you don't think that VRAM is a big deal, then I guess that GPU manufacturers should just put 8GB on everything. After all, it's not a big deal, eh?
Strawman bad. This isn't in the spirit of what they said at all.
 
If you don't think that VRAM is a big deal, then I guess that GPU manufacturers should just put 8GB on everything. After all, it's not a big deal, eh?
I'd rather have 8 GB of VRAM and better RT performance than 16 GB of VRAM and terrible RT performance.
 
I'd rather have 8 GB of VRAM and better RT performance than 16 GB of VRAM and terrible RT performance.
3090 or 7900 XT, if you can find both at the same price?
Bonus: or ~£100 more for a 4070 Ti?
 