
Hogwarts Legacy Benchmark Test & Performance Analysis

Why don’t you show FPS with DLSS 2.0 and 3.0 and RT vs FSR 2.0?

We're back to devs not caring about PC ports yet again, just like the old days.

Devs have been spoiled by abundant memory and hardware of new consoles, and apparently 20 FPS stutters are just acceptable to gamers now. Hell, go look at the intel optimized driver thread and you'll see one after another meatshielding nvidia/AMD for lack of driver optimization, since that's the "old school" way. The good old days of the PS3/4 era when devs had to actually optimize to get their games to run are behind us.
Dynamic resolution, medium setting and RT on minimum
 
Honestly, this runs like Cyberpunk: if you crank everything and turn RT on, you get 38-45 FPS on a 4090. But the game looks identical with DLSS Quality/3.0, and that runs at 120+ FPS.
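For a sense of why the upscaled numbers jump so much, here's a back-of-the-envelope sketch in Python. The per-axis scale factors are the commonly published approximate presets for DLSS 2 / FSR 2, not values taken from this game.

# Rough illustration only: approximate per-axis render-scale factors for
# common upscaler quality presets (treat these as ballpark figures).
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

NATIVE_W, NATIVE_H = 3840, 2160  # 4K output resolution

for name, scale in PRESETS.items():
    w, h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    share = (w * h) / (NATIVE_W * NATIVE_H) * 100
    print(f"{name:17} renders {w}x{h} internally (~{share:.0f}% of native 4K pixels)")

Shading less than half of the native pixels is where most of the gain comes from; DLSS 3 frame generation then inserts extra presented frames on top of that.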

I think with optimization plus a good upscaling implementation, the result will be very good.
Optimization in modern games seems to mean turning your quality settings down from Ultra to High and doubling your FPS with no loss in image quality.

What I don't understand is why doesn't this happen on the developer's side before people start accusing them of not optimizing the game properly.
 
Optimization in modern games seems to mean turning your quality settings down from Ultra to High and doubling your FPS with no loss in image quality.

What I don't understand is why doesn't this happen on the developer's side before people start accusing them of not optimizing the game properly.

I mean, I have to wonder what is considered ahead of its time? I think for the moment it's ray tracing.

In which case, is it really a case of optimization? I don't think improved visuals can come without a cost.

Crysis got a pass for punishing systems for years. What makes games like Cyberpunk different?
 
Hmm, not a bad showing from the Arc A770 card with RT enabled, good effort from Intel there.
 
I mean, I have to wonder what is considered ahead of its time? I think for the moment it's ray tracing.

In which case, is it really a case of optimization? I don't think improved visuals can come without a cost.

Crysis got a pass for punishing systems for years. What makes games like Cyberpunk different?
It's not just that. If you compare the screenshots of Ultra+RT and Low quality, I doubt you'll see a lot of difference. What I can spot is the quality of AO, RT and shadows, but they're all part of the RT package. It's a very minor difference. And we're talking about a couple of screenshots here, not a moving, live game where counting polygons is the least of your concerns. Technically, you can triple the game's performance by sacrificing some near-indistinguishable image quality enhancements.
 
Hopefully this game improves after a few patches. It looks awesome!
 
Thanks for that! :) RT results look promising. Crashes do not, unfortunately.


I think this only proves once again that VRAM allocation and VRAM usage aren't the same thing.
It allocates that much memory, caches additional textures, etc.
But it does not need that much to function normally.
After all, if the VRAM is there, why not use it for *something*?
@Avro Arrow

What could be the cause of this?

Because the game allocates that much VRAM on my 3090, yet there's no benefit from the 3090's extra VRAM over the 3080!
"Allocated" is not "used". You can see cards with 8 GB dropping down in their relative positioning, that's when they are actually running out of VRAM (enough to make a difference in FPS)
Yeah, I know the difference between "allocated" and "used" which is why I was puzzled. The chart says "usage", not "allocation":
(attached screenshot: VRAM usage chart)

Thanks to all of you. I really was going crazy trying to figure it out. I remember encountering situations in which the card I had at the time didn't have enough VRAM even though it was certainly powerful enough (HD 7970) so I thought that there might be some magic secret that I just didn't know about! :confused: :laugh:
 
Based on the article's data, I would say that all GPUs are running terribly, especially if you compare the cost in fps with the graphical result obtained.
I disagree. Now granted, I haven't played the game to see if the visuals are there, but the GPUs perform where I'd expect them to for a game with almost the full RT package. It actually performs better than Cyberpunk, for example.
 
The chart says "usage", not "allocation":
Ah, using "usage" is just to make it easier to understand for less technical people, because "allocation" is such a big word
 
I mean, I have to wonder what is considered ahead of its time? I think for the moment it's ray tracing.
Yep. I've been saying that very thing for years now. It's not even that big of a deal despite people calling it a "game-changer". A real game-changer was hardware tessellation back in the day because it made games look completely different, like night and day. RT on the other hand might be better at telling you what time of day it is in the game, but not much else.
In which case, is it really a case of optimization? I don't think improved visuals can come without a cost.
I agree with you there. If that weren't the case, the RX 6600 would be considered high-end! :laugh:
Crysis got a pass for punishing systems for years. What makes games like Cyberpunk different?
Crysis was more of a challenge that Crytek threw at ATi and nVidia. It was a gauntlet thrown down that gave people something to aspire to, as in "just TRY to run this game with only one video card!" I think that Hogwarts Legacy is that kind of game.

However, games like Control, Metro: Exodus, Cyberpunk 2077 and Spider-Man were just nVidia trying to validate their claims about RT and DLSS. From what I've seen, at least with me, they failed.

@Avro Arrow
Look at this guy: 11 GB dedicated + 13 GB allocated VRAM.
Yeah, some other people said the same thing and I agree, it's the only thing that makes sense. What puzzled me was the fact that the chart said "VRAM Usage" instead of "VRAM Allocation" so it made me think the wrong thing. Thanks for that, I was racking my brain over this one! :laugh:

Ah, using "usage" is just to make it easier to understand for less technical people, because "allocation" is such a big word
Yep, and it sends techies like me into a tizzy! :laugh:
 
The A770 is both performing below expectations and above expectations. It's performing below expectations without raytracing where it's barely faster than a 6600 XT without game ready drivers. On the other hand, with raytracing, it's a 3080 competitor now.

I disagree. Now granted, I haven't played the game to see if the visuals are there, but the GPUs perform where I'd expect them to for a game with almost the full RT package. It actually performs better than Cyberpunk, for example.
Your bias is showing. Tell me of another game where the A770 is as fast as a 3090 at 4K when raytracing.

(attached screenshot: ray tracing benchmark chart)
 
If only Intel could get non-RT performance to consistently match the 6700 XT instead of the 6600 XT.
Why don’t you show FPS with DLSS 2.0 and 3.0 and RT vs FSR 2.0?


Dynamic resolution, medium setting and RT on minimum
Dynamic resolution is not optimization.
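To clarify what's actually being argued about: dynamic resolution trades render scale against frame time on the fly; it hides the cost of a scene rather than reducing it. A toy controller might look like the sketch below (the target, damping, and clamp range are made-up illustration values, not taken from this game).

# Toy dynamic-resolution controller: drop the render scale when a frame runs
# over budget, raise it again when there's headroom. Illustration only.
def update_render_scale(scale: float, last_frame_ms: float, target_ms: float = 16.7) -> float:
    headroom = target_ms / last_frame_ms    # >1.0 = under budget, <1.0 = over budget
    new_scale = scale * headroom ** 0.5     # damped so the scale doesn't oscillate wildly
    return max(0.5, min(1.0, new_scale))    # clamp between 50% and 100% of native resolution

# Example: a 25 ms frame (40 FPS) at full resolution pushes the scale toward ~82%.
print(update_render_scale(1.0, 25.0))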
 
What is it with people asking "how does Arc perform"? The answer is irrelevant until or unless Intel fixes their driver to be usable on the most basic level, and as Wizz demonstrated their so-called "Game Ready" drivers are not. Why don't you rather ask more useful questions, like "does shit smell?" or "is space a vacuum?" or "is Raja Koduri capable of making a product that is not a steaming turd?"
 
What is it with people asking "how does Arc perform"?
Because people are interested in how the Arc cards work?
The answer is irrelevant until or unless Intel fixes their driver to be usable on the most basic level, and as Wizz demonstrated their so-called "Game Ready" drivers are not.
Well, we wouldn't know that if W1z didn't test, so I'm not sure how you expect people to learn it otherwise.
Why don't you rather ask more useful questions, like "does shit smell?" or "is space a vacuum?" or "is Raja Koduri capable of making a product that is not a steaming turd?"
Who shat in your cereal this morning?
 
I have to agree with some others regarding the visuals. It’s not terrible, but doesn’t at all seem representative of the end result regarding the performance. A lot of surfaces look entirely flat, with just texturing added, and there just seems to be a sparse feeling to the environment. Maybe it looks better in-person, but I can think of other games I’ve played that look better and feel more immersive while being playable on considerably lesser hardware.
 
just seems to be a sparse feeling to the environment.

I can take some screenshots later; this is not the case AT ALL. It's ray tracing entire forests.
 
How come the game uses 14 GB of VRAM at 1440p, but the RTX 3080 is not crippled by it?
 
71 FPS in 4K without RT on a $1600 card. 58 FPS on a $1200 card.

Impressive! PC gaming is amazing!
 
71 FPS in 4K without RT on a $1600 card. 58 FPS on a $1200 card.

Impressive! PC gaming is amazing!
Do you get more FPS at the same quality from a different platform? I don't get these kinds of posts... Forspoken drops to 720p on the PS5, so wtf are you talking about with PC gaming? You can match the console experience with 5-year-old cards; you don't need $1,600 or $1,200.
 
"Allocated" is not "used". You can see cards with 8 GB dropping down in their relative positioning, that's when they are actually running out of VRAM (enough to make a difference in FPS)
But but, you write "usage" in your chart, so?
 
Your bias is showing. Tell me of another game where the A770 is as fast as a 3090 at 4K when raytracing.

My bias towards what? Intel is the only one with drivers ready for the game. And as far as I know they are doing pretty well at RT. I don't know if they match the 3090, probably not, but they are doing well in general.
 
71 FPS in 4K without RT on a $1600 card. 58 FPS on a $1200 card.

Impressive! PC gaming is amazing!

Yeah, dude. Things are rough. I just ordered my i9-13900KS this week and came to the realization that I had splurged on what's basically the most eccentric gaming CPU of all time. By the time I get my Raptor Lake upgrade complete, the RTX 3090 I have will be the oldest part, the graphics card I've used for the longest time, and still the most expensive component in the build, and this year it will turn 3 full years old :(

I just can't upgrade it to the 4090, it's too expensive, and the 4070 Ti is a sidegrade that'd cost me 12 GB of VRAM in exchange for... DLSS3 FG? Maybe. Not to mention if I sold it, I couldn't make enough on it to afford a 4070 Ti to begin with. Yikes.
 
You think AMD cards have the same RT performance as a 4080? Lol
Did you have trouble understanding the chart that was shown to you?
 