
NVIDIA GeForce RTX 4070 Super Founders Edition

The 7800 XT is 16.67% cheaper while being around 7% slower overall; it's not worth spending an extra $100 on this GPU.

Look, here's what you got wrong:


1080p: 100% / 92% = 108.7%, so almost a full 8.7% of extra performance.

1440p and 4K: 100% / 93% ≈ 107.5%, so about 7.5% extra.

You need to learn some basic maths. No, you can't subtract percentage points and treat the difference as the extra performance obtained. And no, you can't cherry-pick the worst case to say "you only gain X"; that's logically dishonest.
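For anyone who wants to verify the arithmetic, here's a minimal sketch; the 92%/93% relative-performance index values are the ones quoted above, everything else is just illustration:

```python
# Relative performance is a ratio of scores, not a difference of index points.
super_score = 100.0   # RTX 4070 Super, normalized to 100%
xt_score = 92.0       # RX 7800 XT relative index at 1080p (value from the post above)

ratio_gain = super_score / xt_score - 1.0       # correct: 100/92 - 1 ≈ 0.087
point_diff = (super_score - xt_score) / 100.0   # naive subtraction: 0.080

print(f"ratio method:      +{ratio_gain:.1%}")  # +8.7% faster
print(f"point subtraction: +{point_diff:.1%}")  # +8.0%, understates the real gap
```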

Add the other (factory-overclocked) cards with a little extra juice and the gap maybe reaches +10%.

Take RT into consideration, because, you know, you aren't spending $600 to play games without enabling the best graphics possible. And no, spending "only $500" on a 7800 XT to play without RT isn't a valid argument. With RT the difference can be much, much wider (even more so if a game implements complex RT effects or, worse, path tracing).

Take into account the superior DLSS, the better compute technologies, and much better power consumption. Yes, DLSS is very usable at all resolutions, and yes, even at 1080p the image reconstruction is superb.

So many pluses, not only raw performance. And I think I can say about you, and the other users of TPU, that you all know a better-performing product normally doesn't scale its price in the same proportion. Only rare exceptions do.

You can take refuge in the extra memory of the 7800 XT, but please don't misrepresent the data, and stop trolling just because you dislike NVIDIA and feel the need to trash the launch of a more than good enough product.
 
And no, the latter three don't save it. The 4070 Super is a good (not awesome, but good) value product, and at MSRP vs. MSRP it will outsell the 7800 XT hard. It also makes any 7900 XT priced above $650 a bad-value SKU.
Only 12 GB of VRAM for $600 makes the 4070 Super dead on arrival for me. Spending $600 on 12 GB is "Super" weird in 2024; just pick a 4080 Super or 4070 Ti Super.
 
Wait for the prices to come down then. We have all the time in the world ~
 
Only 12 GB of VRAM for $600 makes the 4070 Super dead on arrival for me. Spending $600 on 12 GB is "Super" weird in 2024; just pick a 4080 Super.
12 GB will be enough at 1080p till 2030.
At 1440p, till probably 2028.
At 4K, it's fine now and will probably do until 2026.

It's 8 GB that's not enough as of now. 12 GB is fine. What hurts this GPU more is its low VRAM bandwidth.
 
12 GB won't be enough at 1080p even through next year, let alone 2030.
Nonsense like this will just encourage game devs to optimize PC games even less.
4K? 4K what? Ultra-low settings and the computer shutting down in 2026.
 
12 GB will be enough at 1080p till 2030.
At 1440p, till probably 2028.
At 4K, it's fine now and will probably do until 2026.

It's 8 GB that's not enough as of now. 12 GB is fine. What hurts this GPU more is its low VRAM bandwidth.
I bet our fancy game devs won't even be able to fit their games into 12 GB by next year.
Plus, for an extra $200 you can get a 4080 Super or 4070 Ti Super with a complete spec, and $200 isn't even half the card's price.
 
Why can't we get 4K + DLSS 2.5 Quality + DLSS 3.0 FG vs. FSR + FG results for AMD in heavy RT/PT games?
 
Curious as to why Forza Horizon 5 was removed from the tested games list? It was nice having a couple racing games there to compare.
Too old, and the new Forza is a huge flop with terrible image quality. F1 2023 is included; not sure if any other recent racing game is a success?

Why can't we get 4K + DLSS 2.5 Quality + DLSS 3.0 FG vs. FSR + FG results for AMD in heavy RT/PT games?
Because only two games (Forspoken and Immortals of Aveum) support both FSR 3 FG and DLSS 3 FG.

Maybe once Cyberpunk gets AMD FG support.
 
12 GB will be enough at 1080p till 2030.
At 1440p, till probably 2028.
At 4K, it's fine now and will probably do until 2026.

It's 8 GB that's not enough as of now. 12 GB is fine. What hurts this GPU more is its low VRAM bandwidth.

I don't even have issues with 8 GB personally, or at least I've yet to play a game where it was a real problem; not everyone plays at high resolutions with everything maxed out. (My resolution/monitor has fewer pixels than 1440p but more than 1080p.)
Since I don't plan on upping my resolution/monitor anytime soon, 12 GB would most likely be enough for me for years. (I also almost always use DLSS if a game implements it, because I like it.)

For me this 4070 Super would be an ideal upgrade if GPUs/hardware in general weren't so damn expensive where I live. (Even second-hand prices are high here.)
 
4K? 4K what? Ultra-low settings and the computer shutting down in 2026.
I bet our fancy game devs won't even be able to fit their games into 12 GB by next year.
Yeah, I'm not so sure about that.
I won't showcase every single game here, but if we don't count the VRAM hogs (only a couple of them exist as of now), 4K gaming is pretty much accessible below the 10 GB mark. What ruins the 4K experience for the 4070 series is that these GPUs aren't fast enough to max 4K out, so you'll need either upscaling or some settings turned down for games to reach 60+ FPS.



IoA is a UE5 game, and UE5 is one of the most popular game engines out there. A lot of upcoming AAA games will be based on this engine, and it doesn't slurp up your VRAM as much as you're whining about.

Starfield is also a clear example of why 12 GB is NOT an issue:



Even the Resident Evil 4 Remake, one of the worst-case scenarios for NVIDIA GPUs in general, runs perfectly well at 4K on these GPUs:



Games like Cities: Skylines, The Last of Us, Hogwarts Legacy, etc. are outliers, and they will remain outliers for at least a couple more years.

At 1440p, DLSS Quality shrinks VRAM requirements a bit, and at 4K you're even good to go with DLSS Balanced or even Performance mode. DLSS is by no means ideal, but this feature massively helps these GPUs and will improve over time.
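For a rough sense of why that helps, DLSS renders internally at a lower resolution and upscales to the output; a quick sketch using the commonly cited per-axis scale factors (actual VRAM savings vary per game, so treat this as an approximation):

```python
# Approximate per-axis render-scale factors for DLSS modes (commonly cited values).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, scale in modes.items():
    w, h = round(3840 * scale), round(2160 * scale)
    print(f"4K output + DLSS {name}: internal render ~{w}x{h}")

# Quality ~2560x1440, Balanced ~2227x1253, Performance 1920x1080.
# Smaller internal render targets mean smaller framebuffers and related
# allocations, which is a big part of why VRAM pressure drops with DLSS.
```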

Overall, 12 GB on both these GPUs is fine. I'd very much love to see ~23 Gbps VRAM on the Super, though, because this video card suffers from limited VRAM bandwidth, especially at higher resolutions.
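For context on what that memory bump would mean: peak bandwidth is just the per-pin data rate times the bus width divided by eight. A rough sketch (the 21 Gbps / 192-bit figures are the 4070 Super's stock memory spec; the 23 Gbps value is the hypothetical upgrade mentioned above):

```python
def vram_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak VRAM bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

print(vram_bandwidth_gb_s(21, 192))  # 504.0 GB/s -> stock 4070 Super (21 Gbps, 192-bit)
print(vram_bandwidth_gb_s(23, 192))  # 552.0 GB/s -> hypothetical ~23 Gbps memory, roughly 10% more
```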
 
Overall, 12 GB on both these GPUs is fine. I'd very much love to see ~23 Gbps VRAM on the Super, though, because this video card suffers from limited VRAM bandwidth, especially at higher resolutions.
Today, yes; tomorrow, maybe. I don't want to repeat the 3070 Ti experience: while the RX 6800 is still relevant, there are a lot of videos about how bad having 8 GB is today, and in 2025 we could get the same videos about 12 GB. If it were a $400 card, then yes, it'd be fine, but spending $600 on a card that probably won't last two years? I don't know, when you can pay a bit extra and get a 16 GB one with a 256-bit bus.

I don't even have issues with 8 GB personally, or at least I've yet to play a game where it was a real problem; not everyone plays at high resolutions with everything maxed out. (My resolution/monitor has fewer pixels than 1440p but more than 1080p.)
I traded my 3070 Ti for an RX 6800 when games like Hogwarts Legacy started to stutter hard and refused to load textures in Hogsmeade.
 
... spending $600 on a card that probably won't last two years ...
What do you mean by LASTING? What is the price range of NVIDIA PC graphics cards, something like $300-1600? Where in this range is $600? Is it reasonable to expect that a card from the lowest third of the range will be able to play the most demanding games at the most extreme settings at high framerates, or at all? I do not think so.
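A quick back-of-the-envelope check of where $600 actually falls; the $300-$1600 endpoints are the ones assumed in the post above:

```python
low, high, price = 300, 1600, 600
position = (price - low) / (high - low)
print(f"${price} sits {position:.0%} of the way up the range")  # ~23%, i.e. in the lowest third
```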
 
I didn't go to CES; I spent the time on a cruise with the family and was back on Thursday. Something like 20+ more GPU reviews are coming this month.
On a cruise! Oh you poor thing! ;):D
Well, I agree with that. Had the 4070 Ti Super with 16 GB of memory been available when I was buying the card, I would have bought it instead of the ordinary 12 GB 4070. Now I have a 4070, and I will probably lose at least $150 selling it, realistically $200. That is a loss for me because NVIDIA had a stupid and incomplete product lineup. The guy in the leather jacket needs to pay me $200. I am serious.
I can't agree with you because nVidia's practices haven't changed much over the past fifteen years and are no secret. Jensen didn't hold a gun to your head and it was YOU who chose the RTX 4070 over the RX 6800 XT and RX 6900 XT for your PC, not Jensen. After what nVidia did with the RTX 2080 Ti's price increase just before the RTX 3070 came out, it is a wonder that anyone is willing to trust them.

Your purchase choices are your responsibility, not Jensen Huang's so he owes you nothing. I only hope that, for your sake, you'll choose better next time.
Look up the Mindfactory GPU sales data. The 4070 is usually in the first three positions.
Mindfactory is only in Germany, and the only data I could find was on this forum from back in April 2023:
[attached: Mindfactory graphics card sales chart]

Six of the top ten cards were Radeons, with the 4070 Ti having five of them ahead of it. From what I see here, only 1,800 GeForce cards were sold compared to 2,660 Radeons. The RTX 4070 did have the top spot (by a rather thin margin), but it certainly didn't have the top three (how would that even work?).

On the other hand, Tweak Town did an article in November titled:
AMD's top-end RDNA 3 sales blow away NVIDIA rivals - is this why new Super GPUs are coming?
The first sentence completely contradicts what you just said: "AMD's RX 7800 XT is embarrassing NVIDIA's RTX 4070, frankly, and overall, higher-end graphics card sales are skewed heavily towards Team Red."

So where did you get this data that the RTX 4070 wasn't being destroyed in sales by the RX 7800 XT? I'd be interested to see it because I like to think that most people use good sources (instead of just making things up out of thin air) and so I'm interested to see what your source says and what their perspective is. More data is always a good thing.
What do you mean by LASTING? What is the price range of NVIDIA PC graphics cards, something like $300-1600? Where in this range is $600? Is it reasonable to expect that a card from the lowest third of the range will be able to play the most demanding games at the most extreme settings at high framerates, or at all? I do not think so.
I don't understand something. You said that you were pissed off at nVidia for what they did to you but here you are fiercely defending them. What's up with that?
Only 12 GB of VRAM for $600 makes the 4070 Super dead on arrival for me. Spending $600 on 12 GB is "Super" weird in 2024; just pick a 4080 Super or 4070 Ti Super.
I'm FAR happier with the card I chose than I would have been with a 4070 Ti or 4080 and I didn't have to wait for it because I bought it last August.
 
I see posts about this card and 4K. This card is meant for 1080p. If you want high-end gaming at 4K, then look at the 4090.
 
I can't agree with you because nVidia's practices haven't changed much over the past fifteen years and are no secret. ...

Your purchase choices are your responsibility, not Jensen Huang's so he owes you nothing. I only hope that, for your sake, you'll choose better next time.
Sorry, but I can make an informed and rational decision only if I have the information. If NVIDIA clearly stated what cards they are going to make now and in the near future, I could have chosen what to buy and would have no reason to be unhappy about my decision later. These are expensive products, and quite frankly this treatment of customers is disgusting.

So where did you get this data that the RTX 4070 wasn't being destroyed in sales by the RX 7800 XT?
I do not care about this, I just said that these cards were selling well, nothing more.

I don't understand something. You said that you were pissed off at nVidia for what they did to you but here you are fiercely defending them. What's up with that?
I am not defending them. I just want to point out that when I buy a product that costs A FRACTION of the best product, I need to expect that the performance of this product will have some limitations in comparison with the best product.
 
1 down, two left to go
So far ain't nothing super about it except the super stupid name and stupid high price
 
It's so ridiculous that people claim ray tracing is better on a 4070 Super than on AMD when the performance penalty is such that the game is almost unplayable.
Anyone who wants to play with ray tracing today needs a 4090, no less.
 
Well, I agree with that. Had the 4070 Ti Super with 16 GB of memory been available when I was buying the card, I would have bought it instead of the ordinary 12 GB 4070. Now I have a 4070, and I will probably lose at least $150 selling it, realistically $200. That is a loss for me because NVIDIA had a stupid and incomplete product lineup. The guy in the leather jacket needs to pay me $200. I am serious.
Nah you should have either waited or not bought it at all.

It was clear from the get-go that 12 GB was obsolete territory in 2023. Any card with that VRAM amount was going to lose value faster than you could blink.

And yet, in 2024, we have a Highly Recommended review of a card with the same VRAM amount that already notes 'edge cases where 12GB is an issue'. Imagine paying $600 for a spanking-new card that, right out of the box on release day, can't properly run what its core power says it should.

That's definitely a novelty for an x70, that's for sure.

The x70 S is great, if you forget you'll probably still be using this card in 2025-2026. Great product for those with a short attention span; that's all I can say about it. The only reasonable card in the Ada lineup now is the 4070 Ti Super, except then you're down nearly a grand and you're stuck with the most ridiculous GPU product name of the century.

FWIW, just ride that 4070 as long as it'll go. Resale on that card is not pretty.
 
not that hard, but it will result in a huge drop in social sharing of our charts -> less traffic -> less $$ -> less time to justify all those reviews with all this testing


Check the test description at the start of the power page, it will make a lot of sense
I understand, but perhaps it could be created as a Premium tool available only to site supporters.

As for the power consumption data, I had already noticed that information, but I stand by my position, going by the strict meaning of the word "maximum".
 
Sorry, but I can make an informed and rational decision only if I have the information. If NVIDIA clearly stated what cards they are going to make now and in the near future, I could have chosen what to buy and would have no reason to be unhappy about my decision later. These are expensive products, and quite frankly this treatment of customers is disgusting.
Yes, but that has been true for years. It's the reason I stopped buying GeForce cards back in 2007. The information about whether they could be trusted has been around for a dog's age (the GTX 970 VRAM class-action lawsuit). People KNOW that nVidia is a terrible company and still choose to support them, like you did. It's the old "I couldn't help it, I'm a scorpion!" tale all over again.

Now, if you're new to all of this, then it was an honest mistake that you made, a mistake that thousands have made, so I don't fault you for it. All you have to do is learn from it, and one day you'll laugh about it, like every other non-life-or-death mistake that everyone has made. You've found TPU, and that's a great thing, because you will learn a lot here. I didn't want to just assume that you were new to this based on your join date, because the join date only says when you joined the forum, not how much actual experience you have. I've had people think I was new to all of this two years ago when I joined, but I actually did my first build in June of 1988. I don't assume that someone's new to this, because it's very easy to sound insulting that way, and I try not to do that.
I do not care about this, I just said that these cards were selling well, nothing more.
You were quite specific about them holding the top three positions, which is far more than just saying they're selling well. I just wanted to see where that came from, because it would be something I hadn't seen before and I'd find it interesting. A piece of advice, though: if you know of data that backs your argument, just post it. Telling other people to look it up weakens your argument and makes you appear lazy or dishonest. Posting hard data that can't be argued against is always the most effective approach.
I am not defending them. I just want to point out that when I buy a product that costs A FRACTION of the best product, I need to expect that the performance of this product will have some limitations in comparison with the best product.
I meant overall. If you read your posts to yourself, you definitely come across as being in nVidia's corner. What you said here is of course true.
 
How come TUF is listed in the cooling comparison, but the card itself does not have a separate review?
 
Thank you for all the reviews and the time taken to do so.

I believe I suggested this before, but I'm not knowledgeable in the field, so I need to ask: would it be too difficult to implement a unified, merged thread for all the product reviews of the same part? Something that would include the source of each comment with a link to the original review, as seen in the attached image. That way we could have some sort of direct comparison between products under the same section.

Just a thought...

(took @droopyRO comment as an example)
Comments Merger.jpg
 
We're going to need to hope more cards can overclock their memory like that dual card, so we can avoid absolutely pathetic results like the 4070 Super tying the 3080 10 GB in CS2 at 4K because of that bandwidth. Yes, it's still an issue at 1440p; the 4070 Super shouldn't be beating the 3080 10 GB by only 5% either. No AD104 part is suited to 500 GB/s of bandwidth besides a theoretical 4070 with the full 48 MB of L2 cache.
 
I have mixed feelings about this review... I'm happy to see NVIDIA's mid-range card kicking some serious butt with this mid-cycle refresh. The part that saddens me is the fact that my 6900 XT is officially a mid-range card in 2024. Can't wait to see what the 5090 is about, as that will most likely be my next card.
 