
Are game requirements and VRAM usage a joke today?

Why do you think the 7800XT has the 4070 Ti as an opponent? Hm? :rolleyes:
It was just an example. I could have said any other Nvidia card (and yes, I could have bought a 4070 Ti).

Maybe you tested with DLSS 1.0. Anyway, AMD graphics cards lose half of their performance with RT on, while Nvidia's handle it much better. AMD introduced RT on the principle of "weak is better than nothing", which is undoubted proof of how important this gadget is: dismissed by some right up until the moment AMD catches up with Nvidia. DLSS will be dismissed forever, despite the positive reviews.
Maybe you can stop making assumptions about what I tested. It was version 2.something in Cyberpunk 2077.

As for RT, both Nvidia and AMD cards lose a huge chunk of performance when it's enabled. The fact that AMD loses more is of no significance.

I'm going to disregard the rest of your post as you're going on a tangent with random stuff you just came up with, and not contributing to the conversation.
 
It was just an example. I could have said any other Nvidia card (and yes, I could have bought a 4070 Ti).


Maybe you can stop making assumptions about what I tested. It was version 2.something in Cyberpunk 2077.

As for RT, both Nvidia and AMD cards lose a huge chunk of performance when it's enabled. The fact that AMD loses more is of no significance.

I'm going to disregard the rest of your post as you're going on a tangent with random stuff you just came up with, and not contributing to the conversation.
So your radical verdict on DLSS is based only on the experience of a single very poorly optimized game. Interesting.
If I have time, I will do a DLSS off/on comparison in Cyberpunk.

The RX 7800XT fights the non-Ti 4070. They have similar prices.
TPU has included the 7800XT in the last five reviews of newly released titles. At 1440p, without RT or upscaling, the non-Ti 4070 wins in four of them. Maybe this will be the trend in the future.
The 4070 Ti is far ahead. It's like comparing the 7900XT with the 7900XTX.
 
So your radical verdict on DLSS is based only on the experience of a single very poorly optimized game. Interesting.
I made quite an extended post as to why this would be a less valuable opinion in the eyes of others too, with suggestions on how to improve that, but it's been ignored or skipped by the intended audience... not that it's the first time I've said it, mind you.

To me it doesn't matter which came first, the chicken or the egg (a purchase decision made because of a stance, or a stance formed because of a purchase decision); it's still less valuable when underpinned by incomplete and/or antiquated data, and it's often voiced in a far too matter-of-fact way for what is just a personal opinion. Part of that is my personal gripe with the way people talk about personally held beliefs online.

Naturally, everyone gets to choose their hardware for what matters to them, whether you clutch DLSS pearls or VRAM pearls (for example). We are similar in that we post on forums about how our pearls matter the most and the other pearls are gimmicks, to a greater or lesser extent, and with better or worse arguments.
 
So your radical verdict on DLSS is based only on the experience of a single very poorly optimized game. Interesting.
If I have time, I will do a DLSS off/on comparison in Cyberpunk.

The RX 7800XT fights the non-Ti 4070. They have similar prices.
TPU has included the 7800XT in the last five reviews of newly released titles. At 1440p, without RT or upscaling, the non-Ti 4070 wins in four of them. Maybe this will be the trend in the future.
The 4070 Ti is far ahead. It's like comparing the 7900XT with the 7900XTX.
Ah, so not liking a feature is radical now? Bloody hell! Since when is it mandated to be a yes-man? :roll:
You can compare all you like, and have whatever opinion you want, but please let me have mine, will you? ;)

I know the 4070 Ti is more expensive than the 7800 XT, but I still could have bought one if I wanted to. I just didn't want to. If I spend this much on a graphics card, I want it to have more than 12 GB VRAM. I'm not saying that 16 GB is a necessity (it's really not), but I kind of expect it at this price point. You're free to disagree with me once again, or put on the same old "your GPU will suffer long before your VRAM" song, but I like to have the extra insurance to avoid the possibility of another 2 GB 960 situation.

I made quite an extended post as to why this would be a less valuable opinion in the eyes of others too, with suggestions on how to improve that, but it's been ignored or skipped by the intended audience... not that it's the first time I've said it, mind you.
I said it before, and I'll say it here. I'm not interested in upscaling tech, be it DLSS, XeSS, FSR, or anything else. I consider it a helping hand for older or less powerful graphics cards, but that's it. If I can, I play at my monitor's native resolution for the sharpest image possible. I accept that some people hold a different opinion, but I don't see how that makes mine any less valid, or why I should invest more time and money in testing a technology that I don't care about. Nvidia/AMD/Intel can polish it all day long, but native will always look better.
 
Nvidia/AMD/Intel can polish it all day long, but native will always look better.
And this is where we disagree the most, I think (major tech press who have tested it say this too [better than native = possible]), and where I think first-hand experience of seeing it do better with your own eyes would be beneficial, because it absolutely can and does. It's not a blanket rule per se, but it's so common in the games I play that it's an insta-enable, making top-tier upscaling a desirable feature to me and, as you can tell, many others.
 
I said it before, and I'll say it here. I'm not interested in upscaling tech, be it DLSS, XeSS, FSR, or anything else. I consider it a helping hand for older or less powerful graphics cards, but that's it. If I can, I play at my monitor's native resolution for the sharpest image possible. I accept that some people hold a different opinion, but I don't see how that makes mine any less valid, or why I should invest more time and money in testing a technology that I don't care about. Nvidia/AMD/Intel can polish it all day long, but native will always look better.
The problem is you don't care whether or not your opinion is correct. And yes, opinions can be wrong. I've already given you lots of examples in the past where DLSS gives better image quality than native, with or without an increase in framerate on top of it, but you choose to ignore them just to keep saying that it's a helping hand for older GPUs. That sounds very religious to me: you made up your mind before considering the arguments/evidence, so you discard any that don't agree with your original opinion.

It is absolutely, without a shadow of a doubt, wrong that native will always look better. 4K DLSS Q looks WAY better than 1440p native, for example, while performance is almost identical.
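For context on the performance claim: DLSS Quality renders internally at roughly two-thirds of the output resolution per axis, so 4K DLSS Q renders at about 2560x1440 (the same pixel count as 1440p native) before upscaling. A minimal sketch, assuming the commonly documented per-axis scale factors (actual games may expose slightly different presets):

```python
# Approximate per-axis render scale for common DLSS presets
# (commonly documented values; treated here as assumptions).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution DLSS upscales from for a given output."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Quality renders internally at ~1440p:
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

This is why 4K DLSS Q and 1440p native carry a similar shading cost per frame.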
 
without a shadow of a doubt wrong that native will always look better
It's somewhat baffling that in 2023 people still say native is best, as if it is some holy grail of image quality that can never be improved upon.

Supersampling existed for many years before DLSS, and it's obvious that a 4x render resolution downsampled punches higher than any given panel's native resolution, so it's strange to keep hearing that people think native is the pinnacle.

Native serves as a good reference point for IQ, but it's not at the top of the list, so it's no wonder a top-tier upscaling solution with the single best temporal AA underpinning it can match or exceed native + TAA.
 
It's somewhat baffling that in 2023 people still say native is best, as if it is some holy grail of image quality that can never be improved upon.

Supersampling existed for many years before DLSS, and it's obvious that a 4x render resolution downsampled punches higher than any given panel's native resolution, so it's strange to keep hearing that people think native is the pinnacle.

Native serves as a good reference point for IQ, but it's not at the top of the list, so it's no wonder a top-tier upscaling solution with the single best temporal AA underpinning it can match or exceed native + TAA.
Proponents of native remind me of the "audiophiles". Ten years ago it was very popular: the supposed audiophiles called people who listened to music via YouTube peasants, while they themselves were using Spotify. And although it's true that YouTube was bad for sound quality, Spotify was nowhere near the pinnacle either.

So, to build on top of the argument: native is a helping hand for weaker GPUs, and it's always going to look worse than supersampling. See what I did there?

Btw, it is actually true that I only use native in games where my 4090 is too weak to push DLDSR + DLSS for higher resolutions. I see no other reason to use native, since of all the available options, it looks and performs the worst.
 
And this is where we disagree the most, I think (major tech press who have tested it say this too [better than native = possible]), and where I think first-hand experience of seeing it do better with your own eyes would be beneficial, because it absolutely can and does. It's not a blanket rule per se, but it's so common in the games I play that it's an insta-enable, making top-tier upscaling a desirable feature to me and, as you can tell, many others.
Disagreeing is fine. :) I also recently tested FSR in God of War, and while the quality was a tad better than DLSS in Cyberpunk (due to the game using a much later version, I guess), it still wasn't comparable to native.

The problem is you don't care whether or not your opinion is correct. And yes, opinions can be wrong.
No, there's no such thing as a correct/incorrect opinion. If you don't like onions, I won't force you to eat them.

I've already given you lots of examples in the past where DLSS gives better image quality than native, with or without an increase in framerate on top of it, but you choose to ignore them just to keep saying that it's a helping hand for older GPUs. That sounds very religious to me: you made up your mind before considering the arguments/evidence, so you discard any that don't agree with your original opinion.

It is absolutely, without a shadow of a doubt, wrong that native will always look better. 4K DLSS Q looks WAY better than 1440p native, for example, while performance is almost identical.
You've given me 4K + DLSS vs 1440p native images, which is a bullshit comparison. The real comparison is at the same output resolution with DLSS enabled vs disabled, which I said to you on the spot.

We're arguing about opinions, which is utterly pointless, so I'm out.
 
No, there's no such thing as a correct/incorrect opinion. If you don't like onions, I won't force you to eat them.
If my opinion is that onions are flying orange disks then yes, my opinion is definitely just wrong.
 
If my opinion is that onions are flying orange disks then yes, my opinion is definitely just wrong.
That's not an opinion. That's misinformation. And I know what DLSS is.
 
You've given me 4K + DLSS vs 1440p native images, which is a bullshit comparison. The real comparison is at the same output resolution with DLSS enabled vs disabled, which I said to you on the spot.
Why is that the real comparison? What does "real comparison" even mean? What matters is image quality at similar framerates, and in that regard DLSS is way superior to native. The only reason to use native is if your card doesn't support DLSS or is too slow to supersample.

I have a 1440p monitor, so my choices are 1440p native (which also means TAA most of the time), 1440p DLSS Q, or 4K DLSS Q.

Both DLSS options either look or perform better than native, so why the heck would I ever use native? Just give me a single good reason to use native.
 
Why is that the real comparison? What does "real comparison" even mean? What matters is image quality at similar framerates, and in that regard DLSS is way superior to native.
Right, let me explain...
  • If X game gives me 90 FPS at native with upscaling off, then happy days, that's how I'm gonna play.
  • If X game gives me 35 FPS at native with upscaling off, but 80 FPS with it on, then I'm gonna think about the cost of image quality vs the extra performance I'm getting.
  • If X game gives me 15 FPS at native with upscaling off, but 60 with it on, then it's a no brainer.
  • Under no circumstances am I gonna lower the resolution if upscaling is available, because everybody knows that upscaling gives a better picture than simply rendering at a lower resolution. That's not even a question, so there's no point comparing.
Get it?
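The bullets above boil down to a simple threshold rule on the native framerate. As a purely hypothetical sketch (the function name, thresholds, and the 60 FPS target are illustrative, not anything stated in the post):

```python
def choose_mode(native_fps: float, target: float = 60) -> str:
    """Hypothetical sketch of the decision rule in the bullets above.
    Thresholds and the 60 FPS target are illustrative assumptions."""
    if native_fps >= 1.5 * target:   # e.g. 90 FPS native: happy days, play native
        return "native"
    if native_fps >= 0.5 * target:   # e.g. 35 vs 80 FPS: weigh IQ cost vs gain
        return "weigh image quality vs extra performance"
    return "upscaling"               # e.g. 15 vs 60 FPS: no-brainer

print(choose_mode(90))  # native
```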
 
Right, let me explain...
  • If X game gives me 90 FPS with upscaling off, then happy days, that's how I'm gonna play.
  • If X game gives me 35 FPS with upscaling off, but 80 FPS with it on, then I'm gonna think about the cost of image quality vs the extra performance I'm getting.
  • If X game gives me 15 FPS with upscaling off, but 60 with it on, then it's a no brainer.
  • Under no circumstances am I gonna lower the resolution if upscaling is available, because everybody knows that upscaling gives a better picture than simply rendering at a lower resolution. That's not even a question, so there's no point comparing.
Get it?
I'm not asking you what you are going to use. You can use whatever you want, that's fine. I'm saying that using DLSS is always better than native.

If a game gives me 80 FPS natively at 1440p, I'm going to supersample to 4K and then use DLSS Q. Similar performance, much better image quality. There is no reason to run natively.
 
You can use whatever you want, that's fine.
Exactly. Thank God I'm not being forced to like stuff.

If a game gives me 80 FPS natively at 1440p, I'm going to supersample to 4K and then use DLSS Q. Similar performance, much better image quality. There is no reason to run natively.
Nobody talked about supersampling. You're deflecting the topic.
 
Exactly. Thank God I'm not being forced to like stuff.


Nobody talked about supersampling. You're deflecting the topic.
Clearly I'm talking about it. Can you tell me why I would actively choose to play natively instead of supersampling + DLSS? I'd only consider native if my card was too slow or didn't have access to DLSS.
 
Clearly I'm talking about it.
You can talk about airplane construction for all I care - it has nothing to do with what I said, not to mention the original topic of this thread.

I'm out.
 
If the FPS isn't enough, I'll just turn down some settings, which will give me more FPS than DLSS Performance, and be done.

Yes, DLSS can make some objects look better.
Yes, FSR can make some objects look worse, because it upscales them from a lower resolution.
But again: why should I use them when I can just manage the game settings and get a lot more FPS?
 
Wow. So much emotion so early in the morning.

Relax guys.
 
so early in the morning.
Time zones are a thing. It's almost 5 pm where I live.

I'm going to supersample to 4K and then use DLSS Q. Similar performance, much better image quality.
This is what I did (but using FSR, since my GPU is from AMD) when I had a 1080p monitor: 3200x1800 + FSR Quality. Performance is similar to that of 1440p, and image quality is on par, give or take. It would definitely look better on a 1440p display, but I never had one; 1080p and 2160p only.
 
If the FPS isn't enough, I'll just turn down some settings, which will give me more FPS than DLSS Performance, and be done.

Yes, DLSS can make some objects look better.
Yes, FSR can make some objects look worse, because it upscales them from a lower resolution.
But again: why should I use them when I can just manage the game settings and get a lot more FPS?
Simple. Because a lot of the time DLSS looks better than native; in roughly half or more of the games that implement it, DLSS > native. So not only do you get better IQ, but also a higher framerate.

Do you want even better image quality than that? Supersampling + DLSS Q offers similar performance to native but way higher image quality. So the question is: why would you ever use native? Native is dead; the only reason to use it is if the game doesn't have DLSS.
 
reason to use it is if the game doesn't have DLSS.
...or game devs screwed it up so hard that DLSS doesn't work properly in a given game.
 
Simple. Because a lot of the time dlss looks better than native. Roughly in half or more of the games implemented, dlss > native. So not only od you get better iq, but also higher framerate.

Do you want even better image quality than that? Supersample + dlss q offers similar performance to native but way higher image quality. So the question is why would you ever use native? Native is dead, the only reason to use it is if the game doesn't have dlss.
Yes, but for DLSS we need an Nvidia card.
So should we ban AMD from selling cards because of DLSS? I don't think so.
Because if we follow your logic, that's where it leads. :)
 
Yes, but for DLSS we need an Nvidia card.
So should we ban AMD from selling cards because of DLSS? I don't think so.
Because if we follow your logic, that's where it leads. :)
OK, so if you have an Nvidia card, there's no point in not using DLSS. Glad we agree.
 