
NVIDIA GeForce RTX 4070 Super Founders Edition

So SC2 is still mostly clocks. Hah. Nice
Well, to quote TotalBiscuit from ages past, “Starcraft 2 runs on everything, but doesn’t run well on anything”. Par for the course with Blizzard and their weird engines. Diablo 3 stutters like mad even at high FPS without an SSD. Overwatch was (is?) weirdly memory bound, and going from 2666 to 3200/3600 gave you a disproportionate boost to FPS stability that I haven’t seen in almost anything else. They are just quirky like that.
 
Every GPU prior, it was VRAM killing me.
Opposite experience here: with almost all of my cards over the years I've run out of raw GPU power before running out of VRAM, the only exception being my 4 GB RX 570 not liking Horizon Zero Dawn at all. 'That was most likely a driver/game-related issue, since it did not have missing or low-res textures on my bro's GTX 970.'
That's when I had to upgrade to a GTX 1070, and then again that card was too slow for the newer games that I wanted to play, for example Cyberpunk.

With my 3060 Ti, yet again I run out of GPU power in UE 5 games before running out of VRAM. I finished Immortals of Aveum not long ago on High settings with maxed textures and it had no VRAM issues, but it was completely choking my GPU, and the only way to make it at least playable/enjoyable was to use DLSS. 'Yes, this engine is meant to be played with upscalers.'
UE 5 performance is important to me because I happen to end up playing UE-based games a lot, ever since the UE 3 days.


I do not play sim or heavy strategy games so I can't speak for those, but some of the VRAM hog games from the past year or so are mostly fixed by now. 'The Last of Us, for example, and Hogwarts Legacy is also okay with an 8 GB card as long as you don't use RT.'

I guess you can't have both as a budget/mid-range user; I will most likely always run out of GPU power when new-gen games show up.
I do like high quality textures, but when my card is simply not capable of running the game no matter what, then it's a moot point anyway. 'I'm okay with medium-ish settings usually, but when it's full low then it's time for an upgrade; normally that means ~3 years of GPU runtime for me, tho upscalers do help now at least.'
 
A $600 card for 1080p? :roll:

My brother in Christ, the year isn't 2010 anymore. Saying this is "meant for 1080p" is outright insane. If you said this as some kind of excuse for something, it isn't one: you should be able to play at 4K with a card this expensive at this point in time. That's not unreasonable at all.
And to think that if game optimization had not taken a sudden dip out of nowhere, then maybe 1440p (native) as the standard, with 4K (native) as the next step, would've been a reality. That, and of course, if the GPU market hadn't experienced the metaphorical catastrophe that was the mining and AI booms, inflating prices and skewing generational improvements so that only the seller benefits from the advanced architecture, but not the consumer. That applies to Nvidia; meanwhile, in AMD's camp they simply got too tropical with chiplets, and it's clear those are not ready for primetime just yet, much like Ryzen back in the day. Maybe by RDNA5 we might start to see the rewards, if AMD stops being complacent.
I also want to blame TAA, because fuck TAA.
 
You really need to escape copium mode
That's not copium. That's called realism.

Realistically, $600 for a GPU that demolishes 99+% of games at 1440p and absolutely destroys them at 1080p is not insane. It's a completely fair offer, especially considering what we previously had for such money:

• 2021, RTX 3070 Ti: poor VRAM capacity, basically the "it's fine now, and devil may care what comes later" kind of GPU.
• 2018, RTX 2070 ($500, but with inflation and the +100 USD premium for well-cooled AIB options taken into consideration...): not really a 1440p performer, it was falling short in some games right from the start. Still a reasonable 1080p option, and it can offer a playable experience at 1440p with DLSS and lower settings.
• 2016, GTX 1080 ($600 = more expensive all things considered): even worse at 1440p, and not even 100% ideal at 1080p, with a couple of games wiggling around the 60 FPS mark. Became obsolete a year ago when DX12_2 titles kicked in and rendered this GPU helpless.
• 2014, GTX 980 ($550 = more expensive all things considered): 1440p ain't a thing, and 1080p is a struggle in some titles. Went obsolete around the 2018 mark. (Rough inflation math sketched below.)
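
To make "more expensive all things considered" concrete, here is a minimal sketch (Python) that CPI-adjusts those launch prices into 2024 dollars. The inflation multipliers are rough assumptions on my part rather than official figures, and the prices are the ones quoted above plus the 3070 Ti's $599 launch MSRP:

# Hypothetical sketch: CPI-adjust the launch prices quoted above into 2024 dollars.
# The inflation multipliers are approximate assumptions, not official BLS data.
cards = [
    ("GTX 980 (2014)",        550, 1.32),
    ("GTX 1080 (2016)",       600, 1.29),
    ("RTX 2070 (2018)",       500, 1.24),
    ("RTX 3070 Ti (2021)",    599, 1.17),
    ("RTX 4070 Super (2024)", 600, 1.00),
]

for name, launch_price, cpi_factor in cards:
    in_2024_dollars = launch_price * cpi_factor
    print(f"{name}: ${launch_price} at launch ~= ${in_2024_dollars:.0f} in 2024 dollars")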

I agree with 12 GB being a little bit too little for this money. But this is one of the easiest things to mitigate by lowering settings and enabling DLSS. AMD GPUs are a greater rip-off at this point since they are now behind in everything that's not VRAM.

Of course everyone wants to pay a couple of dollars and get a GPU that's capable of a million FPS in every single game, but it's not a giveaway, it's the real market. And $600 for this device is sane. And it will stay sane up until AMD comes up with something better than the RX 7900 XT at the exact same price point. The 7800 XT, however, falls down to $420-ish territory. It can't be competitive at a higher price anymore.
 
It also depends on what you bench and play; the hardest nut to crack is really the campaign map. Battles are easy on VRAM.

When I ran the WH3 campaign map on my 1080, it was a stuttery mess even with lowered settings that would still hit or exceed 8 GB. 30-40 FPS, but stuttery.
I have over 400 hours in WH3 and do not have stutter issues with a 1080, 3070, or 3080. On the 3080 I run 2560x1440 at max settings. That being said, I think it's a horrid benchmark because the game is built on tech debt from 9 years ago.

Sega ain't what it used to be
 
Opposite experience here: with almost all of my cards over the years I've run out of raw GPU power before running out of VRAM, the only exception being my 4 GB RX 570 not liking Horizon Zero Dawn at all. 'That was most likely a driver/game-related issue, since it did not have missing or low-res textures on my bro's GTX 970.'

You are very likely right, as 4 GB is fine for Horizon Zero Dawn on a 1050 Ti and an RX 6400. I played a lot of the game on both (more on the 6400 because of its much higher performance), and I even got it to "work" on a 4 GB GTX 745! Lol, it was very slow, and dumbing down to 640p with Performance FSR was... less good, but a nice proof of concept.

IMO VRAM issues are overblown but they do crop up on rare occasions in some games.
 
On a 2080 Ti I can't see more than 6 GB in use without the game playing below a comfortable level, and the 4070 Super is barely 50% faster. But still, 16 GB should be the norm. Start saving for a 5070, don't complain about price and power, and have fun.
 
You are very likely right, as 4 GB is fine for Horizon Zero Dawn on a 1050 Ti and an RX 6400. I played a lot of the game on both (more on the 6400 because of its much higher performance), and I even got it to "work" on a 4 GB GTX 745! Lol, it was very slow, and dumbing down to 640p with Performance FSR was... less good, but a nice proof of concept.

IMO VRAM issues are overblown but they do crop up on rare occasions in some games.

Maybe it's just about enough for 1080p, but it did not work at my native 2560x1080 res; even low settings had missing or very low-res textures.
I borrowed my bro's 970 and it was all good apparently; this was an early version of the game, so no upscalers at the time.

But yeah, that's pretty much the only VRAM issue I've had in a game that I actually wanted to play but couldn't at the time because of my GPU and its VRAM limit.
With my 3060 Ti, since September 2022 at most I've had to go from Ultra to High textures and it was all good, and in more recent titles I can hardly tell the difference anyway, but the GPU itself is starting to show its age in more demanding games.
 
On a 2080 Ti I can't see more than 6 GB in use without the game playing below a comfortable level

Slot in any recent gaming CPU instead of your 2690 V4 and see where it goes :^)

I'm not kidding, your CPU is a serious bottleneck for such a powerful GPU.
 
Aaand right on cue here are the "12GB isn't enough" crowd. It is, for the simple reason that consoles don't even have 12GB.

Technically the consoles (Xbox Series X and PS5) have 16 GB of unified memory, exposed in a way that's addressable by both the CPU and GPU cores.

Fortunately a lot of game developers are sensitive to the plight of folks with cards with 12GB or less and are putting more effort into shoehorning their games into limited framebuffers.
 
not that hard, but it will result in a huge drop in social sharing of our charts -> less traffic -> less $$ -> less time to justify all those reviews with all this testing

[ ... ]
Bit late to the party, but make that an optional feature?
Like, release comparison pictures along with an optional interactive feature; or better yet, have that be a separate page entirely where people can interactively compare stuff based on the latest benchmark data.
 
Aaand right on cue here are the "12GB isn't enough" crowd. It is, for the simple reason that consoles don't even have 12GB.
12 GB isn't enough on a $600 GPU. Consoles are using 12 GB, but they are also dynamically raising or lowering the rendered resolution depending on how heavy a scene is in geometry and effects.
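For context, "dynamically raising or lowering rendered resolution" usually means a frame-time-driven control loop, roughly like this minimal sketch; the function name, thresholds and step size are illustrative assumptions, not any console SDK's actual API.

# Hypothetical sketch of frame-time-driven dynamic resolution scaling.
# Thresholds and step size are illustrative assumptions.
TARGET_FRAME_MS = 16.7            # 60 FPS budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0   # render-resolution scale limits

def update_resolution_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the render-resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:      # over budget -> drop resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:    # comfortably under -> raise it
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))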
 
Aaand right on cue here are the "12GB isn't enough" crowd. It is, for the simple reason that consoles don't even have 12GB.
Not to mention throwing allocation numbers around to 'support' it. 10 GB here is still fine at 4K relative to the card's actual GPU muscle. For the money and realistic expectations for games, likely at 1440p or 4K using upscaling, 12 GB is A-OK. If we take MSRPs as the true price, the feature set, efficiency and overall performance easily make this worth it over a 7800 XT, imo. No VRAM pearl-clutching necessary for a majority of buyers, but I can see how it's relevant to the 0.1% of hardcore enthusiasts who know that they want (or perceive that they need) it.
 
Now this is the "REAL" 4070 as it should be in last year.
 
I still think the 7800 XT is the better purchase unless you are super into RT gaming: for $100 less you are getting 8% slower performance, but more VRAM, DP 2.1, frame generation in ALL games, more stable and standard PCIe connectors, a better software suite, and better and more stable drivers.

On the other hand, the 4070 Super for $100 more gets you a decent 8-9% more performance, 40 W less energy consumption and, of course, faster RT if you are really into that. For my taste the 7800 XT has the nod, and I'm sure the 4070 Supers are going to be at least 650 euros in Europe, so a no-go from the start.
 
I still think the 7800 XT is the better purchase unless you are super into RT gaming: for $100 less you are getting 8% slower performance, but more VRAM, DP 2.1, frame generation in ALL games, more stable and standard PCIe connectors, a better software suite, and better and more stable drivers.

On the other hand, the 4070 Super for $100 more gets you a decent 8-9% more performance, 40 W less energy consumption and, of course, faster RT if you are really into that. For my taste the 7800 XT has the nod, and I'm sure the 4070 Supers are going to be at least 650 euros in Europe, so a no-go from the start.

I totally agree! The 7800 XT IMO is set to age a little better with wider bandwidth and more VRAM (primarily hi-res gaming without having to worry about smart dynamic predefined high-quality pruning). Outside of the "8-9% average" performance gain (esp. excluding Doom/CS2 and co.), some of the heavier graphics-lifting titles see a much smaller performance difference. I usually check a couple of individual benchmarks for the titles I play at 1440p, and to be frank, $100 more for the 4070 soup ain't gonna cut it. They would have knocked it out of the park with a 16 GB, wider-bandwidth 4070 Ti SUPER at $600. Can't blame the greenies though, esp. if people are willing to splurge any-and-everything for whatever Nvidia places before them - that's just good business sense! As a consumer, it's a tough one to stomach seeing the 70-class upper middle ground being pushed to $600-$800, especially considering my last 3 upgrades were all 80-class SKUs, which will see no mercy.

Anyway, I lost faith in the greens, but some time ago I got a great deal on a used 10 GB 3080 for just short of £350, so I ain't in the race to go 40-bananas. Yep, 10 GB @ 1440p - a short-term investment to maybe one day grab a 50-series/8000-series (or fall back on a used 4080 SUPER / 7900 XTX, but I refuse to pay anything above sticks and bone, stuck in stone, ~£800).
 
If only it was 16 GB, even if it had slightly fewer CUDA cores. Last time I was ready to sell my 4070 and get the Super version if it had 16 GB. Guess I'll have to wait till the 5070.
The 4070 Ti Super 16 GB might be out of the question for me; those things are huge. The 4070 had the perfect smaller dual-slot-sized ones.
 
Now this is the "REAL" 4070 as it should be in last year.
Sure... at $499.99, but not $599.99. The GTX 1070 cost $379.99 in 2016, which would be $485 in today's money... and that price gives Nvidia 80% margins and board partners a 20% margin. The GTX 1070 had a 256-bit memory interface... Nvidia raises prices and cuts memory bandwidth... Why anybody submits to the spit in the face Nvidia gives you is beyond me. And before anyone says "AMD fanboy"... I own both AMD and Nvidia GPUs... I just don't buy them new...

Performs about as expected. Very close to the 4070 Ti. It's nice the power consumption barely went up compared to the 4070.

Solid card.
Yet the price....Too high.

To play devil's advocate in this particular case, I assume most people want to use settings that are quite beyond what the console versions run at, and thus assume more memory will be needed. Of course, then we get into the whole idea of diminishing returns and whether the minor visual improvements from Uber Epic Ultra settings are actually meaningful, let alone whether anyone at the dev studio bothered to optimize them at all.
But don’t get me wrong, I am with you on this.
The GPU in the PS5/Xbox Series X is equivalent to an RX 6700, which can be bought for $289.99 at Best Buy... The RTX 4070 Super is a $600 (minimum) GPU, so you expect it to curb-stomp a console in performance in every single way, and according to this website's own numbers it is 86% faster than the console GPU. So yeah, you should be able to use settings far beyond what a console could do, and if a console is using 8-12 GB of VRAM while dynamically adjusting render resolution to maintain frame rate, then this class of GPU should have 16 GB at minimum... I feel that the 7800 XT will have superior longevity due to its 16 GB framebuffer. Nvidia is asking you to pay $100-150 more than a 7800 XT for 8% more raster and 25-30% more for some nice shadows? Hard sell, imo, but dummies are going to buy it anyway... because they are stupid like that.
 
Yet the price....Too high.

That's your opinion.

Graphics cards in general have gotten really expensive in the past few years, but relative to what's on the market right now, the 4070 Super improves Nvidia's position. Previously they had a 4070, which was slower and more expensive than the 7800 XT. Now, they have a card that took the price point the 4070 occupied, but is faster than the 7800 XT. It makes the 7800 XT a harder sell, unless AMD responds with price cuts. Paying extra for Nvidia's features is no different than paying for a piece of software. Some people do find value in that, and are willing to fork out the money for it.
 
We shouldn't be asked for extra $$$ for features.
Those should be a bonus.
Most of them are AI stuff.
Pretty soon they'll be shipping us a dummy PCIe card that connects to their servers to provide us with a 1080p experience for just €1000/month.
 
What's the point of unreasonably low price options in the price/performance charts? An RTX 4070 Super for $450?

I think it's more realistic to add $700, $800, $900 options if AIB partners start to complain that card volumes are really low, since all manufacturing is now focused on AI cards... But then again, all prices will go up, even for used cards.
 
Sure... at $499.99, but not $599.99. The GTX 1070 cost $379.99 in 2016, which would be $485 in today's money... and that price gives Nvidia 80% margins and board partners a 20% margin. The GTX 1070 had a 256-bit memory interface... Nvidia raises prices and cuts memory bandwidth... Why anybody submits to the spit in the face Nvidia gives you is beyond me. And before anyone says "AMD fanboy"... I own both AMD and Nvidia GPUs... I just don't buy them new...

Yes... if it was the 4070, then today's 4070 Super would be the 7680-CUDA one, and the 4070 would drop to $499.
 
Look at it this way:

In September 2020, Nvidia launched the RTX 3080 for $699. Sure, due to the crypto madness not a lot of people could buy it at that price, but nonetheless.

And here we are, 3.5 years later, looking at a card that's barely faster, barely cheaper, and if rumours are true, it in fact won't be any cheaper!
 
We shouldn't be asked for extra $$$ for features.
Those should be a bonus.
Most of them are AI stuff.
Pretty soon they'll be shipping us a dummy PCIe card that connects to their servers to provide us with a 1080p experience for just €1000/month.
It's pretty normal for a company with a competitive edge to use that edge as leverage for more profit. It sucks for the consumer, but it's not unexpected behaviour for a company. They aren't benevolent entities.
 