
Hogwarts Legacy Benchmark Test & Performance Analysis

3090 or 7900 XT, if you can find both at the same price?
Bonus: or ~£100 more for a 4070 Ti?
4070 Ti > 7900 XT > 3090. The 3090's problem isn't so much performance; its power draw was kinda nuts compared to my 4090
 
I have a question. I play this game with everything cranked at 1080p with RT on (I have a 6700 XT, R7 1700, 16 GB DDR4) and for some reason I can't look at bushes. I'm serious: whenever I go outside and look at a tree or something, my FPS drops from 40-60 FPS to 4 FPS. I think it's trying to ray trace through the leaves and not having a good time. Does anyone have any idea what might be happening here? Thanks in advance :)
 

You might have Ray Tracing set on High/Ultra. Bring it down in quality, or turn it off. RT on AMD isn't working like it's supposed to right now. See the Benchmark section of this review you're replying to:

[Chart: RT performance, 1920x1080]
 
People seem to keep waiting for the day when the 10GB 3080 chokes because of VRAM, but that day is yet to come :p
It definitely happens. I had a 3080 when I upgraded to a 48-inch 4K OLED. There were 2-3 titles that ran at 5-10 fps vs 60+ before. Reducing VRAM-intensive settings fixed it. I never had any issues at 3440x1440.

Nah, the RT is pretty much just borked in places right now.
Also the RT effects are pretty low res and noisy, so definitely an optimization issue. CP2077 looks better and runs less poorly in RT.
It's rendering RT at 33% resolution; you can fix it in the INI file. 100% resolution with 1 sample looks better than 33% resolution with 4 (the default).
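For anyone who wants to try it, the usual route for Unreal Engine games is an Engine.ini override along these lines; treat the file path and exact cvar names as a best-guess sketch rather than values I've verified against this game's build:

; user Engine.ini, typically under %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\ (path may differ)
[SystemSettings]
; render RT reflections at full resolution instead of the reduced default
r.RayTracing.Reflections.ScreenPercentage=100
; drop to one sample per pixel to keep the cost in check
r.RayTracing.Reflections.SamplesPerPixel=1

Same idea as above: full resolution with one sample rather than a third of the resolution with four.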

I just played for a few hours and haven't noticed one stutter; perhaps locking my CPU and GPU to a fixed frequency is keeping the frametimes consistent :rolleyes:

Looks like I get 10 FPS more in the CPU-limited area than the YouTuber with a 7700X
Just two days ago, I hit a spot that dropped me into seconds-per-frame territory. I'm running a 4090/7950X combo with 64 GB of RAM...
 
Strawman bad. This isn't in the spirit of what they said at all.
My whole point was that the amount of VRAM matters. Any argument against that would have to be that it doesn't; otherwise there's no argument.
 
It definitely happens. I had a 3080 when I upgraded to a 48-inch 4K OLED. There were 2-3 titles that ran at 5-10 fps vs 60+ before. Reducing VRAM-intensive settings fixed it. I never had any issues at 3440x1440.

What games were those, out of interest? I'm at 4K and haven't had any issues as it stands.

But as you say, when dropping one or two settings can massively improve performance while you play spot-the-difference with image quality, it's hard to be too upset when it's literally one or two games.
 
Every AAA release this year so far (Forspoken, Dead Space and now this) performs dreadfully despite providing no meaningful uplift in graphical quality. Hogwarts and Forspoken especially look more or less like PS4 titles with slightly better textures.
I find it a big shame that DLSS/FSR are now just used as a crutch instead of enabling next-gen graphics.
Is it a surprise though?

The few devs that speak openly about game development on big AAA titles always talk about how, by the time they start working on performance, it's right at the end: the engine and features are set in stone, it's usually just before the game is released, so there's very little time to do it, and it's a very low priority.

So if a tech is released which improves performance, that would just translate to not having to achieve as much optimisation, as the tech now does it for them.

The video footage of this game is an embarrassment; it should never have been approved for release. But stuttery gameplay sadly seems to be the norm now.
 
My whole point was that the amount of VRAM matters. Any argument against that would have to be that it doesn't; otherwise there's no argument.
Reality is a lot more nuanced than "any argument against that would have to be that it doesn't [matter]"; you don't really get to decide that. And using a strawman to try and make your point doesn't help you at all either.
 
So, can this RTX 3080 10GB still play this PC RPG at 2160p 60 fps with ultra detail, without RT, using only DLSS Quality?
 
Reality is a lot more nuanced than "any argument against that would have to be that it doesn't [matter]"; you don't really get to decide that. And using a strawman to try and make your point doesn't help you at all either.
Actually, you're wrong on a couple of things. Firstly, I DO get to decide if I think that a lack of VRAM on a card is detrimental and I DO get to decide if I want to tell people about it on a forum that exists for that very purpose.

If something is obviously true, there's no such thing as a strawman argument supporting it. An argument can only be a strawman if it's supporting a falsehood, because that's why it's called a strawman in the first place: there's nothing holding it up except false rhetoric. Saying that a card without enough VRAM is a bad purchase is not false rhetoric; it's an obvious truth. Therefore, it's not an argument made of straw but an argument made of granite.

The reason that it's especially relevant today is that we're in a pretty specific situation here where so many people who bought last-gen cards paid double or more what they would've paid for the previous-gen. The fact that they did have to pay that much makes every shortcoming on these cards all the more egregious, especially shortcomings that limit the cards' longevity.

If you suddenly had to pay twice as much as what was historically normal for something, wouldn't you be all the more angry that the company that made the product had knowingly done something that would severely limit its useful life? I sure would, and I don't think that this is a strawman argument. Sure, in the days of the RTX 20 and RX 5000 cards, it wasn't nearly as bad a thing because cards were priced in a much more sane manner than they were in the storm that followed, so I'm guessing that this is the nuance you're talking about.

The nuance that I'm talking about is people paying twice as much (or more) for cards that have had their useful lives artificially shortened by small VRAM buffers. I had the same thing happen to me twice (the insufficient VRAM, not the paying double). Once was with the HD 7970 because the GPU was strong enough to use more than 3GB of VRAM and the other was with the R9 Fury. With the Fury I knowingly bought it because the first mining craze of 2017 was on and I was stuck with that 3GB HD 7970 since Crossfire had been essentially dropped. Since the Fury had a TDP of 275W, it wasn't suitable for mining and so was half the price of the RX 580 despite being faster. I decided that the low price and high level of performance for the money was worth it despite the low VRAM because everything else was literally double the price or more. Greg Salazar did a video about it:
However, my long-term experience with the R9 Fury, as long-lived as it was, taught me to never again choose a card, especially a high-end card, if it had significantly less VRAM than its rival. The R9 Fury has had decent longevity but not at anything higher than 1080p and if I had paid full price for it, I probably would've felt cheated.

My whole point is based on altruism and empathy because I'm not suffering from a lack of VRAM, I have an RX 6800 XT and 16GB is not lacking. I'm saying what I'm saying so that people who have been screwed like this know that it happened and why so that they'll see it coming next time and avoid it. I remember pulling my hair out because my HD 7970 had enough GPU horsepower to play a specific game (I don't remember which game because it's been so long) but the 3GB of VRAM just ruined the whole experience. I just don't want others to have to live with that same frustration. I don't see how that makes me a bad person, oh yeah, because it doesn't. I don't know why anyone would push back against this unless they're people who have made this mistake and are in denial or know that it's true and are just being belligerent because they don't want to admit that they messed up.

There's no shortage of people like that on the internet and you know it, especially among the younger crowd.
 
You make it sound like the only tradeoff between Nvidia and AMD is the VRAM. It's not: going for the extra VRAM on the AMD card means you sacrifice a ton of RT performance, probably - at least in the case of the new gen of cards - worse power draw and efficiency, insane amounts of power draw while playing a YouTube video or using dual monitors, etc. On the other hand, what do you have to sacrifice if there is a game that needs more VRAM than your card has? You drop textures from ultra to very high instead. So, not a whole lot.
 
You make it sound like the only tradeoff between Nvidia and AMD is the VRAM. It's not: going for the extra VRAM on the AMD card means you sacrifice a ton of RT performance, probably - at least in the case of the new gen of cards - worse power draw and efficiency, insane amounts of power draw while playing a YouTube video or using dual monitors, etc.
You're correct about the tradeoff in ray-tracing performance. However, power consumption during video playback and with multiple monitors has improved significantly since launch. It's also worth noting that the higher consumption is mainly observed with monitors running different resolutions and refresh rates.

 
If you don't think that VRAM is a big deal, then I guess that GPU manufacturers should just put 8GB on everything. After all, it's not a big deal, eh?
Dude, that's a strawman, he never said put 8gb on everything, you did, Strawman 101.
Firstly, I DO get to decide if I think that a lack of VRAM on a card is detrimental and I DO get to decide if I want to tell people about it on a forum that exists for that very purpose.
What you don't get to decide is the conditions for a successful argument against it, or to conclude that none exists. I mean, sure, nobody can prevent you from saying that and making that argument, I guess, well done. Nobody is saying VRAM doesn't matter at all; of course it does, cards can't function without it and generally speaking more is better, but it's not the be-all and end-all of a video card.

I don't really care about people who bought at twice the price; they were silly enough to do it at all, so they get what they deserve. I'm glad you enjoy your 6800 XT, good for you. I like my 3080, good for me.
 
Dude, that's a strawman, he never said put 8gb on everything, you did, Strawman 101.
No, he said it's not a big deal; I was exaggerating on purpose to show that it actually is. That's not a strawman argument, that's a demonstrative metaphor, but you can call it whatever you want, it doesn't make it true.
I don't really care about people who bought at twice the price; they were silly enough to do it at all, so they get what they deserve. I'm glad you enjoy your 6800 XT, good for you. I like my 3080, good for me.
Some people just have to learn the hard way, I guess. The people who liked my post tell me that some understood the lesson to be learnt, so it was worth it. I really hope, for your sake, that the 10GB holds up longer than I expect it to. I don't revel in or get pleasure from seeing other people in less-than-optimal situations, and I know that a large reason things played out the way they did was ignorance.

If you point out a problem that people didn't notice (like the RTX 3080 having less VRAM than the GTX 1080 Ti), some people will ignore you but some will say "Yeah, that IS pretty low..." and maybe they'll take an RTX 3060 or RX 6800 XT instead, realising that the 10GB is a limiting factor. To call 10GB a limiting factor on a card as powerful as the RTX 3080 isn't an opinion, it's a fact that has been proven more than once over time.

If you didn't spend a pantload on your RTX 3080 then you're all good and my post wasn't intended for you anyway. It's the people who way overspent (and yes, it was stupid of them to do it) on it that my post was aimed at. There's nothing that they can do now, but maybe they'll remember this in the future.
 
To call 10GB a limiting factor on a card as powerful as the RTX 3080 isn't an opinion, it's a fact that has been proven more than once over time.
It absolutely is an opinion. The same way it's an opinion that the RT performance of the 6800xt is a limiting factor. I find that disgustingly bad RT performance way more restrictive than the 10gb vram.
 
Most people who have a 3080 will swap it for something newer long before the 10 GB VRAM becomes a limiting factor.

And to suggest that a 3060 is better because it has more VRAM is weird, to say the least. I would suggest rethinking this point.

People who overspent on a GPU during the mining crisis are an entirely different matter altogether. Based on the past, it was logical to assume that GPU prices would come back down eventually as the availability issue was slowly eliminated. People who couldn't wait and bought a GPU, any GPU, not just a 3080, for way over MSRP, have only themselves to blame (unless they don't care about money, which is also a possibility).
 
you can call it whatever you want, it doesn't make it true.
Same to you my dude, about your Strawman and your other rants. You're an interesting one, and it's nice to get alternative opinions on things, even when I disagree.
 
Most people who have a 3080 will swap it for something newer long before the 10 GB VRAM becomes a limiting factor.
That doesn't mean that they wouldn't have preferred it if their expensive-as-hell card lived longer.
And to suggest that a 3060 is better because it has more VRAM is weird, to say the least. I would suggest rethinking this point.
I said that it's better with regard to how much VRAM it has. Of course it's weird, it's weird that the RTX 3060 has more VRAM than the RTX 3080. We're talking Twilight Zone-level weird here!
People who overspent on a GPU during the mining crisis are an entirely different matter altogether. Based on the past, it was logical to assume that GPU prices would come back down eventually as the availability issue was slowly eliminated.
I completely agree. I was one of those fools myself. Fortunately, I managed to mine enough Ethereum that I only paid about $200 CAD more than MSRP, so I didn't do too badly. I still curse at the mirror from time to time over it though... :laugh:
People who couldn't wait and bought a GPU, any GPU, not just a 3080, for way over MSRP, have only themselves to blame (unless they don't care about money, which is also a possibility).
It's true, I couldn't agree more. I just hope that this time, they learn their lesson instead of going back time and time again to get fleeced like so many dumb sheep.
 
That doesn't mean that they wouldn't have preferred it if their expensive-as-hell card lived longer.
That's kind of a contradiction. If you sell your card 2 years after you bought it to buy the next shiny new thing, then you won't care if it performs poorly in a game 3 years later, will you? ;)

I said that it's better with regard to how much VRAM it has. Of course it's weird, it's weird that the RTX 3060 has more VRAM than the RTX 3080. We're talking Twilight Zone-level weird here!
Oh but there's an explanation. 6 GB wouldn't have been enough for a card of that calibre, but 8 and 12 aren't doable due to the GPU's core config and memory controller. Not to mention the marketing value.
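For what it's worth, the arithmetic behind that (assuming the standard 32-bit-per-chip memory controllers and the chip densities available at the time) works out roughly like this:

RTX 3060: 192-bit bus / 32 bits per chip = 6 chips -> 6 GB with 1 GB chips, or 12 GB with 2 GB chips
RTX 3080: 320-bit bus / 32 bits per chip = 10 chips -> 10 GB with 1 GB chips, or 20 GB with 2 GB chips

So each bus width only gives you two sensible capacities, which is why the in-between numbers aren't doable without changing the memory controller configuration.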

It's true, I couldn't agree more. I just hope that this time, they learn their lesson instead of going back time and time again to get fleeced like so many dumb sheep.
On a 3080-4080 level, maybe, but people who want the best of the best just for the sake of having it will never learn. You could easily sell them a steaming pile of turd with a green 5090 logo on it for $10k. It only has to be marginally better at 4K, or have a new feature that you'll never use.
 
Fortunately, I managed to mine enough Ethereum that I only paid about
I had completely forgotten about this too. My full-hash-rate 3080, bought on day 1 before the price craziness, mined in excess of its own value in ETH; the card paid for itself and then put even more money in my pocket. It would be hard to be mad at it even if the 10GB ever lets me down in a meaningful way.
 
Just sad that we're on 3rd-gen ray tracing and still need upscaling cheats to play with RT at decent fps on a $1,200 GPU in 2023. I feel like just when sharp, detailed, artifact-free, high-res, high-refresh gaming was within the mainstream's grasp... along comes RT and says NOPE!
 
Because RT was made to sell graphics cards, not to advance graphics
Look at PhysX after nvidia bought them

The RT cores were researched for their enterprise-level GPUs for render farms, and then they found a way to make them "useful" in games and pushed to have it implemented, instead of designing a feature for gamers exclusively (like DLSS was, though admittedly as a backlash to how poorly RTX ran).
 
So why are 3D movies made with RT and not traditional raster technology? Are movie studios trying to sell GPUs as well?

...
 
CGI in movies is not rendered in real time, and it's not hardware accelerated. It's rendered on high core count CPUs at less than 1 frame per second.

RT in games will never ever look as good as movies, it's just not possible. I don't think most people realize just how demanding it is. And RT is not about life-like looking graphics, it's about the accurate behavior of light.
Developers have gotten so good at faking stuff with rasterization, that RT often doesn't look mind-blowing. And for it to look much better, you need a lot more computing power.
 
The point is, it's preferred because it looks better.
 