
Hogwarts Legacy Benchmark Test & Performance Analysis

Yeah, dude. Things are rough. I just ordered my i9-13900KS this week and came to the realization that I'd splurged on what's basically the most eccentric gaming CPU of all time. By the time my Raptor Lake upgrade is complete, the RTX 3090 I have will be the oldest part, the graphics card I've used the longest, and still the most expensive component in the build, and this year it turns three full years old :(

I just can't upgrade it to the 4090 (it's too expensive), and the 4070 Ti is a sidegrade that'd cost me 12 GB of VRAM in exchange for... DLSS 3 FG? Maybe. Not to mention that if I sold it, I couldn't make enough on it to afford a 4070 Ti to begin with. Yikes.
I wouldn't worry about it. Just turn the quality down a notch, and enjoy double the frame rate with no loss in image quality.

Ultra quality is a scam at this point.
 
Did you have trouble understanding the chart that was shown to you?
The chart includes games that don't have serious RT, which skews the results. Pretty self-explanatory stuff: when your average includes FC6 and SotTR, you're not really testing RT. But whatever man, you wanna think AMD matches the 4080 in RT, good for you, I won't stop you.
 
The chart includes games that don't have serious RT, which skews the results. Pretty self-explanatory stuff: when your average includes FC6 and SotTR, you're not really testing RT. But whatever man, you wanna think AMD matches the 4080 in RT, good for you, I won't stop you.
Sort of yes, sort of no. Personally, I'd say 17% difference isn't the end of the world, especially from a much cheaper GPU.
 
I wouldn't worry about it. Just turn the quality down a notch, and enjoy double the frame rate with no loss in image quality.

Ultra quality is a scam at this point.

Yeah, that's what I'm gonna have to do. Ouchie on my pride though... :oops:
 
I can take some screenshots later; this is not the case AT ALL. It's ray tracing, like, entire forests.
I was referring more to the indoor environments. I didn’t go through all the screenshots though. I do think when it comes to lighting, it’s not as easy to represent the feel from a screenshot, as there can be a temporal component to it.
 
Yeah, that's what I'm gonna have to do. Ouchie on my pride though... :oops:
Tell your pride to stop whining. :p Just look at the screenshot comparisons in the review. The situation is pretty self-explanatory. ;)
 
I was referring more to the indoor environments. I didn’t go through all the screenshots though. I do think when it comes to lighting, it’s not as easy to represent the feel from a screenshot, as there can be a temporal component to it.

Yeah, this precisely. The inside of the castle is all marble walls and flooring. It's very dynamic, and the day/night cycle happens in real time (light streams in through the windows). Things like mirrors actually reflect you, etc. A still doesn't do it justice; between haze (fog), lights, and shadows, there is a lot going on in any given scene.
 
Sort of yes, sort of no. Personally, I'd say 17% difference isn't the end of the world, especially from a much cheaper GPU.
The difference in RT capabilities between the two cards is nowhere close to 17%, but again, sure, whatever, I don't wanna argue. If you remove raster or games with non-existent RT (like FC6, SotTR, etc.) from the picture, the difference skyrockets. Go check some actual full RT benches, see what's going on, and then come back and tell me their difference is 17%. It just isn't.
 
Now I just have to ask: is this game badly optimized, or is even the RTX 4090 aging fast? These low FPS numbers on a 4090 at 4K are kinda crazy. The lowest FPS is under 60, and RT totally kills even this card.
 
Now I just have to ask: is this game badly optimized, or is even the RTX 4090 aging fast? These low FPS numbers on a 4090 at 4K are kinda crazy. The lowest FPS is under 60, and RT totally kills even this card.
Or it's time people realized RT is heavy. REALLY heavy. It takes something like thousands of supercomputers a year to render a 3D movie at 24 fps, and we're asking home hardware to do it 60 times per second.
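For a rough sense of scale, here's a back-of-envelope sketch in Python; the rays-per-pixel and bounce counts are just assumptions for illustration, not what this game actually dispatches:

```python
# Back-of-envelope ray budget at 4K / 60 FPS.
# The 2 rays per pixel and 2 bounces below are illustrative assumptions,
# not what Hogwarts Legacy (or any specific game) actually uses.
width, height = 3840, 2160
fps = 60
rays_per_pixel = 2      # e.g. one reflection ray + one shadow ray (assumption)
bounces = 2             # assumption

rays_per_frame = width * height * rays_per_pixel * bounces
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame / 1e6:.0f} M rays per frame")
print(f"{rays_per_second / 1e9:.2f} B rays per second")
# ~33 M rays per frame and ~2 B rays per second, before denoising even
# starts, which is why real-time RT leans so hard on upscalers and denoisers.
```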
 
Running an RTX 4080 and a Ryzen 3900X at 4K with ultra settings, 165 Hz, ray tracing, DLSS 3, frame generation, Reflex+, etc... I am getting frame rates over 100 FPS and smooth gameplay. Looks pretty great as well.
 
Or it's time people realized RT is heavy. REALLY heavy. It takes something like thousands of supercomputers a year to render a 3D movie at 24 fps, and we're asking home hardware to do it 60 times per second.
No doubt RT is heavy. But even without RT, the minimum FPS is still below my preferred minimum target of at least 60 FPS.
 
It allocates that much memory, caches additional textures, etc.
But it does not need that much to function normally
After all, if the vRAM is there, why not use it for *something*

Yes, so long as the game's most frequently used bits of data are stored in VRAM, having to put the rest in main system memory isn't going to incur a massive performance penalty. It's only when that frequently used data set exceeds the size of VRAM that you start having issues.

You can see a bit of that with the 3070 in this benchmark going from 1080p to 1440p, as it doesn't scale as well relative to the 2080 Ti.
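Purely as a sketch of that working-set point, here's a toy model: the bandwidth figures are ballpark (roughly 448 GB/s for the 3070's GDDR6 and about 32 GB/s for a PCIe 4.0 x16 link) and the hit rates are made up for illustration:

```python
# Rough model of effective memory bandwidth when part of the working set
# spills out of VRAM and has to come over PCIe instead.
# Bandwidth numbers are ballpark figures; hit rates are illustrative only.

def effective_bandwidth(vram_hit_rate, vram_bw=448.0, pcie_bw=32.0):
    """Mix of VRAM and PCIe traffic: each byte is served from one or the other."""
    miss_rate = 1.0 - vram_hit_rate
    return 1.0 / (vram_hit_rate / vram_bw + miss_rate / pcie_bw)

for hit in (1.0, 0.99, 0.95, 0.90):
    print(f"{hit:.0%} of accesses in VRAM -> ~{effective_bandwidth(hit):.0f} GB/s")
# Even a few percent of traffic falling back to system RAM drags the
# effective bandwidth down hard, which is the "frequently used data set
# exceeds VRAM" cliff described above.
```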
 
LOL, look at the game menu: 340 W, 123 FPS, 1440p on a 3090.
 

Now I just have to ask: is this game badly optimized, or is even the RTX 4090 aging fast? These low FPS numbers on a 4090 at 4K are kinda crazy. The lowest FPS is under 60, and RT totally kills even this card.
It's just garbage optimization on an AAA game. Nobody should be surprised by this anymore.
 
It's just garbage optimization on an AAA game. Nobody should be surprised by this anymore.
Sadly, it's not that rare for games to release badly optimized and full of bugs. Cyberpunk 2077 is a good example of that.
 
The game looks interesting even though I'm not an HP fan. Unfortunately, even if I were and wanted to pick it up, I literally couldn't play it even on Low settings @ 1080p on my current system. My GPU only has 6 GB of VRAM. Six. I wouldn't even get my foot in the door before the game crashed or something.

(I'm supposed to get a new card but...yeah....)
 
The difference in RT capabilities between the two cards is nowhere close to 17%, but again, sure, whatever, I don't wanna argue. If you remove raster or games with non-existent RT (like FC6, SotTR, etc.) from the picture, the difference skyrockets. Go check some actual full RT benches, see what's going on, and then come back and tell me their difference is 17%. It just isn't.
Well, if you remove all games except for Hogwarts Legacy, you'll jump to the conclusion that AMD cards are bad without, and really bad with RT. The question is, how relevant is the average, and how relevant are the handful of games that you pick for yourself. In the end, everyone needs to answer this for themselves.

By the way, I checked, in nearly half of the tested games, the 7900 XTX is at, or close to 4080 level in RT. You can say that those games have poor RT effects that aren't demanding, but... so what?
 
Well, if you remove all games except for Hogwarts Legacy, you'll jump to the conclusion that AMD cards are bad without, and really bad with RT. The question is, how relevant is the average, and how relevant are the handful of games that you pick for yourself. In the end, everyone needs to answer this for themselves.

By the way, I checked, in nearly half of the tested games, the 7900 XTX is at, or close to 4080 level in RT. You can say that those games have poor RT effects that aren't demanding, but... so what?
So when you are trying to determine the difference between two cards specifically in RT, you have to exclude everything but RT. If you don't, you'll end up thinking that the difference between the two cards is, let's say, 10%, and then a game like Cyberpunk with the full RT package drops and you see a 50% difference and wonder wtf is going on.

It's for the same reason you test CPUs at low resolution: to exclude everything but the CPU performance. Taking averages on RT games like Far freaking Cry 6 and SotTR, among others, is just a joke. Correct me if I'm wrong, but in every game that utilizes heavy RT, the 7900 XTX is basically sandwiched between a 3080 and a 3090, right? The 4080 is 50% faster than that.
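Just to make that averaging point concrete, here's a quick sketch of how much the "average RT lead" moves depending on which games you include; the per-game ratios below are placeholders I made up to illustrate the math, not benchmark results:

```python
# Illustrative only: hypothetical 4080-vs-7900 XTX performance ratios per game.
# These numbers are placeholders, NOT measurements from any review.
from statistics import geometric_mean

rt_ratio_4080_vs_xtx = {
    "light-RT game A (FC6-style)":   1.02,
    "light-RT game B (SotTR-style)": 1.05,
    "heavy-RT game C":               1.45,
    "heavy-RT game D":               1.55,
}

avg_all   = geometric_mean(rt_ratio_4080_vs_xtx.values())
avg_heavy = geometric_mean([v for k, v in rt_ratio_4080_vs_xtx.items()
                            if k.startswith("heavy")])

print(f"Average over all RT games:  +{(avg_all - 1) * 100:.0f}%")
print(f"Average over heavy RT only: +{(avg_heavy - 1) * 100:.0f}%")
# The blended average lands well below the heavy-RT-only figure, which is
# the whole argument about what a mixed RT chart is really measuring.
```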
 
GPU                 Raster FPS    Raytrace FPS    Hit %
RTX 2080 Ti 11GB    81            36              -56%
ARC 770 16GB        62            46              -26%
RX 6600 XT 8GB      59            8               -86%

I took one set of data: cards typically considered "better", suitable for playing at 1920x1080 at around 60 FPS or thereabouts (in raster).

I am actually pretty impressed with the ARC 770. It is slow for raster but pretty good for RT. Perhaps drivers will improve raster. But there is actually hope for the next version of Intel ARC, whatever they call it. Even, at the right price, for the curious, the current ARC 770!

Performance per Watt needed!
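If anyone wants to sanity-check the Hit % column, it's just the relative drop from raster to ray-traced FPS, using the numbers from the table above:

```python
# Recompute the Hit % column from the raster and raytrace FPS in the post above.
cards = {
    "RTX 2080 Ti 11GB": (81, 36),
    "ARC 770 16GB":     (62, 46),
    "RX 6600 XT 8GB":   (59, 8),
}

for name, (raster_fps, rt_fps) in cards.items():
    hit = (rt_fps - raster_fps) / raster_fps * 100
    print(f"{name}: {hit:.0f}%")
# -> roughly -56%, -26% and -86%, matching the table.
```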
 
Forspoken drops to 720p on the PS5. So wtf are you talking about with PC gaming? You can match the console experience with 5-year-old cards; you don't need $1,600 or $1,200.

Oh yeah, Forspoken, a game where textures don't even load on cards with 8 GB of VRAM. :D

Hogwarts definitely has issues. There is stutter and some weird CPU bottleneck.


Pretty much every new game has some major problems on PC. You spend thousands on hardware, but you get stutter, crashes and various other problems.

Consoles are not perfect either, but most games are way more polished than they are on PC. Console games don't get much better over time, though, while PC games definitely do. PC is a great platform to play games after a few weeks or months, or older games which you can max out with newer hardware. But the day 1 experience is often horrendous.
 
So when you are trying to determine the difference between two cards specifically in RT, you have to exclude everything but RT. If you don't, you'll end up thinking that the difference between the two cards is, let's say, 10%, and then a game like Cyberpunk with the full RT package drops and you see a 50% difference and wonder wtf is going on.

It's for the same reason you test CPUs at low resolution: to exclude everything but the CPU performance. Taking averages on RT games like Far freaking Cry 6 and SotTR, among others, is just a joke. Correct me if I'm wrong, but in every game that utilizes heavy RT, the 7900 XTX is basically sandwiched between a 3080 and a 3090, right? The 4080 is 50% faster than that.
You're not wrong... all I'm saying is, I prefer testing relevant games at relevant resolutions and not drawing any overarching conclusion from the results. The reasons why the 7900 XTX performs horribly in Hogwarts Legacy with RT on can be many, but it doesn't change the situation that it performs horribly and you shouldn't turn RT on in this particular game if you have an AMD card.

It might be a driver issue. It might be a bug. It might be some specific code that runs better on Nvidia cards. Or maybe AMD's RT is really bad. My point is, "in the end, it doesn't even matter". :rockout:

We can't even know for sure unless every single game ran this badly with RT on, which is clearly not the case.
 
Oh yeah, Forspoken, a game where textures don't even load on cards with 8 GB of VRAM. :D
The same textures running on the console version? Again, it drops to 720p on a PS5. I'm sure any card from the last 5 years can play the game at 720p. So yeah, typically when it comes to console vs. PC gaming, they are comparing apples and oranges: fully maxed-out RT + 4K + extreme uber ultra settings vs. 720p low with no RT on consoles. So yeah, sure, whatever.
 
How come the game uses 14 GB of VRAM at 1440p but the RTX 3080 is not crippled by it?

People seem to keep waiting for the day when the 10GB 3080 chokes because of VRAM, but that day is yet to come :p
71 FPS in 4K without RT on a $1600 card. 58 FPS on a $1200 card.

Impressive! PC gaming is amazing!

Hey, could be worse: try turning RT on with the latest cutting-edge AMD cards costing north of $900. Even Intel does better, which is kinda... embarrassing.
 