
NVIDIA GeForce RTX 4070 Founders Edition

It is pretty much the same, isn't it, while the core power, as you correctly point out, is virtually doubled. I've made that very comparison; it's the same ballpark, just like your comment of 'it's faster' while then pointing out percentage gaps on raster. I agree, that's the same perf on raster.

See, this kind of bullshit response from your end is why you lose all credibility every time. Everyone with non-hazy vision can see the problem in the relative specs, core to VRAM, except you.


And yet cache also turns into an Achilles' heel even for AMD at 4K, where it drops off against Nvidia's 4090. At that point they're saved (most of the time), to an extent, by hard throughput still being 800 GB/s on a 7900 XT.

Cache does NOT alleviate constraints in the very use cases where you need it most, which is heavy swapping due to large amounts of data needed at will. The two are at odds with one another. At that point you are saved somewhat by generous VRAM capacity.

It's a bit like 'I have super boost clocks' under loads where you already exceed useful FPS numbers by miles. Who cares?? It's nice for benchmarks; in actual gaming it doesn't amount to anything. This is where experience comes in. We've seen this all before, and crippled bandwidth, real, hard bandwidth, is and will always be a defining factor.

Put that 6600 XT on a higher res and it will die horribly, whereas in a relative sense the 1080 Ti would still be standing upright. I experience this now with a 1080 and its 8GB: I can fill the framebuffer and FPS can go down, but the affair is still buttery smooth in frame times.
This makes absolutely no sense…
You are shifting the point, after another nonsensical post about a 1080 Ti comparison, to 4K resolution, which is NOT a target for the RTX 4070 (and not even for the 4070 Ti, to be honest).
Insulting potential buyers doesn't make you any smarter.
 
Lowering texture quality means the GPU is obsolete? :rolleyes:.

I tested Hogwarts Legacy with Low texture quality vs Ultra texture quality and it makes very little difference, saving ~4GB of VRAM @ 4K Ultra RT DLSS.
link
Wow, so after years on ultra we need low now; shit, the copium on show is epic.

Except we are not speaking about hamburgers here, and if people keep choosing Nvidia over AMD, it is because of quality issues with AMD.
Back up your nonsense.

The 6800 XT owns this card just as it owned the 3070, but your kind makes stuff up.

Like driver issues, yet you own, or say you do, two AMD iGPU laptops.

Your choices make your words odd, or vice versa.
 
Virtually no one should be buying generation after generation; like phones, there is little to be gained from yearly updates.


Except for the fact that NVIDIA owners and users are/have been trained to pay up for a new GPU every two years because VRAM ran out.

Or you have the wrong tensor cores doing nothing, etc.

Yet that's a bonus to some. The RX 580 showed how to avoid e-waste; the 3070 and its 4070 replacement are showing how to make e-waste.
Being pragmatic is key! I buy discounted phones and don't "upgrade" every year. Never pay full price. There are not enough smart buyers.

As far as GPUs go, the 16GB vs 8GB comparison from Hardware Unboxed says it all. Games are already using 12GB. People don't realize, or are genuinely confused about, how dirty Jensen Huang did the ones who spent thousands of dollars on their 3070/3070 Ti/3080s, only to have them be obsolete in 2 years.
 
The 4070's MSRP is 669, isn't it, not 600? I do know that's the case in EUR, and in reality I'll probably see it start at 700 for the FE.

It'll easily land even above the 7900 XTX in cost per frame, as a midrange contender. It's hilariously bad.

I can buy a 7900 XT for 836 EUR today and an XTX at just over 1K in the Netherlands. That's a net perf gap of a whopping 50% at roughly the same cost per frame.
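To make that cost-per-frame comparison concrete, here is a minimal sketch; the prices are the ones mentioned in this thread, while the relative-performance figures are rough illustrative assumptions, not measured numbers:

```python
# Cost per (relative) frame. Prices come from the posts above; the
# relative performance values are illustrative assumptions only.
cards = {
    # name: (price_eur, assumed relative performance, 4070 = 100)
    "RTX 4070 FE": (700, 100),
    "RX 7900 XT":  (836, 130),   # assumption: ~30% faster than the 4070
    "RX 7900 XTX": (1000, 150),  # assumption: ~50% faster than the 4070
}

for name, (price, perf) in cards.items():
    print(f"{name}: {price / perf:.2f} EUR per relative-performance point")
```

With those assumed numbers the three cards land within roughly 10% of each other in cost per frame, which is the point being made here.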

700 was about right, it seems, too (and that's for a bottom-end AIB contraption, which for 200 W might just be OK):
View attachment 291364
Instant no. To me this feels like paying for a VW Up with all the options that drives exactly the same as a stock Renault Twingo or Citroën C1. I just can't...
Holy crap, that's $760 USD. Buy whatever's lowest that's within your range? But I get your analogy. It's like buying a car: one has the perfect spec sheet, but the other one gives you better MPG for a lower price. No need to pay a premium when it doesn't add value.

Edit: have not seen listings except for one FE at Best Buy that says coming soon. High-demand product.
 
Wow, so after years on ultra we need low now; shit, the copium on show is epic.


Back up your nonsense.

The 6800 XT owns this card just as it owned the 3070, but your kind makes stuff up.

Like driver issues, yet you own, or say you do, two AMD iGPU laptops.

Your choices make your words odd, or vice versa.
No point in speaking with AMD supporters: they don't even listen.
I usually build 3-4 PCs a month.
On 35-40% of the Radeons I install, customers encounter issues within the first year of usage.
Not hardware problems, just very annoying software/driver-related issues.
I stopped installing Radeons a few months ago because it is just too time-consuming for me.

AMD software division is just very poor.

And the 6800 XT doesn't "own" anything, and wasn't a 3070 competitor to begin with (its price point was on par with the 3080, which is a better card).
 
Smart buyers who bought the 7900 XT at $900, only for it to drop to $800 in a month :D. Could be $700 next month, who knows.

Man, let's hope the 7900 XT/X last a long time, because their resale value is going down the toilet.
Yep, I'm really on the fence here, but about to drop 850 EUR on a 7900 XT... It started at 1050 EUR a few months back over here.

Resale value... well, I think Nvidia Ampere resale value is taking a nosedive just the same. Ada below the 4080... not gonna be pretty 4 years from now. I'm not sure the Nvidia premium is gonna hold for its midrange as it used to. The 3080 is an interesting one to watch now. It guzzles power, has no DLSS 3, and has low VRAM, albeit at decent bandwidth. The 4070 kinda killed its value proposition on second-hand markets now, and it's worse in most ways. Ampere cards below the 3080 are just plain useless in resale.

In the meantime, I can sell my 1080 for 175 EUR today. Already got several offers; 200 will likely work as well. If anything, that proves how little we've progressed since. I bought that card for 420 EUR (could still skip taxes at the time)... Ridiculously good value.
 
People are really divided here.

I think they are all overpriced, the new 4070 and the old 6*** cards (made a bit worse by being an older-gen card); I wouldn't buy either AMD or Nvidia at these prices.
I don't see people divided on prices. Everyone knows cards are overpriced.
All the noise in the thread is made by AMD supporters trying to make their point about "AMD being better", even if it is pointless (no one wants a Radeon) and off topic.
 
People are really divided here.

I think they are all overpriced, the new 4070 and the old 6*** cards (made a bit worse by being an older-gen card); I wouldn't buy either AMD or Nvidia at these prices.
What do you mean? People are happy to pay more for less. It's not that deep. It's actually simple. People are confused by marketing and yt influencers.
 
Smart buyers who bought the 7900 XT at $900, only for it to drop to $800 in a month :D. Could be $700 next month, who knows.

Man, let's hope the 7900 XT/X last a long time, because their resale value is going down the toilet.
Radeon's resale value is very low.
I had a 2070 Super and a 5700 XT. A year and a half ago I sold both; it was super easy for the 2070 but a real PITA to sell the Radeon. I had to lower my initial asking price a lot (and it was already quite low to begin with).
 
I don't see people divided on prices. Everyone knows cards are overpriced.
All the noise in the thread is made by AMD supporters trying to make their point about "AMD being better", even if it is pointless (no one wants a Radeon) and off topic.

Divided into AMD vs Nvidia fanboys was what I meant, sorry if it was not clear.
 
It's not on the chart since it's such a weird card. However, I'm glad I pulled the trigger on a very affordable 6750 XT, which has the exact same VRAM. $265+tax saved vs waiting to get this card. Solid review, Wizz.
 
What do you mean? People are happy to pay more for less. It's not that deep. It's actually simple. People are confused by marketing and yt influencers.

What?! Everybody is paying too much. Too much for the new card, and too much for an end-of-life card that comes as an alternative.
 
Smart buyers who bought the 7900 XT at $900, only for it to drop to $800 in a month :D. Could be $700 next month, who knows.

Man, let's hope the 7900 XT/X last a long time, because their resale value is going down the toilet.
You are happy and proud that you got extorted by Jensen? Hahahahaha
 
Divided into AMD vs Nvidia fanboys was what I meant, sorry if it was not clear.
How can someone be an Nvidia fanboy?
Nvidia is a very "anti-consumer" company.
No one could be "in love" with them. Their price policy is awful, they are greedy, and they don't care about customers at all.

The problem is that there is no alternative. AMD is not one. Their products are just not good enough, and not reliable enough for generic users. Yes, their price policy is much better, BUT only because their market share is irrelevant. Lisa Su already demonstrated with CPUs that once they are no longer the "little AMD", their pricing policy becomes identical to the competitor's.

I was hoping for Intel as a competitor, but their start wasn't exactly stellar. So I'm buying Nvidia products because there is no alternative on the market, not because I like them. They are probably the worst company on the market.
 
It doesn't. How exactly was it supposed to perform?

My expectation was a repeat of Navi 21 vs. GA102, so at the very least on par with AD102, with the usual pitfalls that render it the runner-up. It doesn't come even close to that. As it stands, the 7900 XTX consumes more power than the RTX 4090 (provided you don't buy an artificially limited 2x 8-pin model, by the way; I can make the argument that my 3090 "only" uses 375 W too, but you are leaving performance on the table), and performs substantially worse than the 4090 does in its already hilariously cut-down configuration.

Even their presentations, which were hilariously optimistic, turned out botched beyond repair; they missed these extremely conservative, marketable estimates:

radeon-rx-7900-xtx-vs-6900-4k.jpg


Reality is closer to a Pyrrhic victory, with a 30% lead over the regular 6900 XT and getting sent to pound sand by the 4090:

relative-performance_3840-2160.png


So yes, it's on this that I base my argument of AMD being a no-show with the 7800 XT/Navi 32. Regardless, this thread is about the 4070, and AMD has no answer to the 4070 at present. The poor condition of Navi 31 spells bad news from where I'm standing. The 7900 XTX did not sell well, and it does not perform as well against the competition as its predecessor did; AMD themselves are showing little interest in promoting it, treating it like a bad dream and hoping it will go away soon.
 
I was crying when I saw Argentine prices in the Steam store. :twitch: Same for Turkey. If I had known about that before, I would have bought half of the Steam store empty, like apparently others did. Which is, by the way, the reason Steam cracked down on cross-region trading and region changing, and expanded region locks on games. That's how some people got 10k+ game Steam libraries, not because they're millionaires.

Hahaha yeah, then again, we do earn $300 per month, and it's better for them to earn something than to have 90% of my country downloading cracked games.

Now Resident Evil 4 (and many, many others; most new AA/AAA games for the past year) costs close to 30,000 pesos, or 72 dollars. My monthly salary is 115,000 pesos. My days of buying AAA games are over. We still get the cheap-ish game here and there; those will have to do. Thank you, Americans, I guess.
 
What?! Everybody is paying too much. Too much for the new card, and too much for an end-of-life card that comes as an alternative.
No. There are good-value cards out there, if people care to look. End-of-life cards? You're talking about $550-$700 8GB 3070/3070 Tis?
 
Yep, I'm really on the fence here, but about to drop 850 EUR on a 7900 XT... It started at 1050 EUR a few months back over here.

Resale value... well, I think Nvidia Ampere resale value is taking a nosedive just the same. Ada below the 4080... not gonna be pretty 4 years from now. I'm not sure the Nvidia premium is gonna hold for its midrange as it used to. The 3080 is an interesting one to watch now. It guzzles power, has no DLSS 3, and has low VRAM, albeit at decent bandwidth. The 4070 kinda killed its value proposition on second-hand markets now, and it's worse in most ways. Ampere cards below the 3080 are just plain useless.

In the meantime, I can sell my 1080 for 175 EUR today. Already got several offers; 200 will likely work as well. If anything, that proves how little we've progressed since. I bought that card for 420 EUR (could still skip taxes at the time)... Ridiculously good value.

Just sold my old 3090 for $800 three days ago. I miss that beautiful beast :/, though my house is getting cramped with children's toys, so I want to get rid of old PC stuff (it sold within an hour after I posted).

The second-hand market should never be tangled up with brand-new GPU prices in the first place, especially when those second-hand cards could be ex-mining GPUs.

In the end it doesn't really matter what GPU you buy; any option is as good as any other. No idea why people think overspending some money on a hobby is a grave mistake LOL.
 
Hahaha yeah, then again, we do earn $300 per month, and it's better for them to earn something than to have 90% of my country downloading cracked games.

Now Resident Evil 4 (and many, many others; most new AA/AAA games for the past year) costs close to 30,000 pesos, or 72 dollars. My monthly salary is 115,000 pesos. My days of buying AAA games are over. We still get the cheap-ish game here and there; those will have to do. Thank you, Americans, I guess.
Would it be cheaper to get an Xbox and use Game Pass Ultimate?
 
No. There are good-value cards out there, if people care to look. End-of-life cards? You're talking about $550-$700 8GB 3070/3070 Tis?

This is not that complicated. I think everything (literally) is overpriced, no exceptions; it's the age we live in.
More specifically to this point: the new cards, green or red, are overpriced, and the end-of-life ones are less expensive (not cheap by any means) because they are END OF LIFE.

I see no good-value cards anywhere, not even used; the used market is also insane.
 
I live in Germany, but use eBay USA for these prices, and I do not pick the cheapest seller, but what looks like a reasonable middle ground.

If you don't use eBay, where do people sell their used graphics cards?
In Canada most people sell their GPUs on Kijiji or Craigslist, and we avoid eBay if possible.
 
I pay an obscene $0.51/kWh in California. (In your example, that's a $55/yr difference.) My video cards get handed down so they have a long lifecycle. Where I live, the lifetime electricity cost to run a 4070 exceeds its retail price!

Also, I don't drink Starbucks. But that's beside the point because no one here is interested in how to save money with lifestyle changes. We're talking about what's the best price for a particular gaming activity ("1440p 60fps for this generation's AAA" for example). And electricity should obviously be factored into the total cost of that activity. Many people in this thread feel that a $100 discount on this card would make it attractive. The power bill differential between this card and its closest competition can more than make up for that discount.

Its closest competition is 2+ year old GPUs on older, larger, less efficient process nodes; of course it's not as power friendly. The point, as I previously mentioned, is that we've had this kind of performance available for a year now at similar prices. Power efficiency has been relatively similar between Nvidia and AMD across the entirety of the lineups, with the exception of the 3090 Ti/6950 XT-class parts.

If you really wanna be that picky about my example, the FE and AMD versions of the 4080/7900 XTX draw on average 304 W and 361 W respectively. In that same example it takes 6 years to break even at your electricity cost; at the average US electricity rate it will take multiple years longer. If you want RT, buy a 4080; if you don't, the 7900 XTX is a better card from a purely objective standpoint, and for most it will cost less than the 4080 over its usable lifetime. If and when the 7800/7700 cards get released, the scenario will be the same and the power difference will be non-existent (10-20 W at most), yet people will still use this completely hollow argument to justify buying Nvidia, when, if they don't play games with significant RT (still a relatively small number of games), the AMD product will offer more performance.

I'm not saying any piece of hardware shouldn't be efficient; it's just such a pointless metric in the grand scheme of things, especially when buying higher-end hardware. It's a stretch to believe anyone building a $1500+ rig or spending $600-$800 on a GPU is actively thinking about a $3-10 difference in their power bill, and if they're spending fewer than 3-6 hours a week actually playing games, you're probably talking $1-2 over the course of a year.
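For what it's worth, here is a minimal back-of-the-envelope sketch of that break-even claim. The 304 W / 361 W draws and the $0.51/kWh rate come from the posts above; the ~$200 price gap between the two cards and 3 hours of gaming per day are my own assumptions:

```python
# Rough break-even sketch for the 4080 vs 7900 XTX power argument above.
# Power draws and electricity rate are taken from the posts; the price
# gap and daily gaming hours are assumptions for illustration.
watts_4080, watts_xtx = 304, 361
rate_usd_per_kwh = 0.51
price_gap_usd = 200          # assumed 4080 premium over the XTX
hours_per_day = 3            # assumed gaming time

extra_kw = (watts_xtx - watts_4080) / 1000
extra_cost_per_year = extra_kw * hours_per_day * 365 * rate_usd_per_kwh
years_to_break_even = price_gap_usd / extra_cost_per_year

print(f"Extra electricity cost: ~${extra_cost_per_year:.0f}/year")
print(f"Years until the power savings cover the price gap: ~{years_to_break_even:.1f}")
```

Under those assumptions the extra draw costs roughly $30 a year at $0.51/kWh, so it takes a bit over six years to recoup the price gap, in line with the figure quoted above; at lower electricity rates or fewer gaming hours it takes correspondingly longer.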
 
Yeah, compared to 2.5 years ago, a $100 increase maybe is not bad, but it's not just that; the PERFORMANCE is also lacking.
Let's not go too far: the 3070 was 50% faster than the 2070 for THE SAME price.
2023: the 4070 is 30% faster than the 3070 for 20% MORE money.
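Turning those quoted uplifts into perf-per-dollar terms, as a quick sketch: the 50%/30% figures are taken from the post as-is, and the $499/$499/$599 launch MSRPs for the 2070/3070/4070 are filled in by me.

```python
# Perf per dollar across generations, using the uplift figures quoted
# above and launch MSRPs of $499 / $499 / $599 for the 2070 / 3070 / 4070.
gens = [
    # (name, performance relative to the 2070, launch price in USD)
    ("RTX 2070", 1.00,        499),
    ("RTX 3070", 1.50,        499),   # "50% faster for the same price"
    ("RTX 4070", 1.50 * 1.30, 599),   # "30% faster for 20% more money"
]

prev_ppd = None
for name, perf, price in gens:
    ppd = perf / price
    if prev_ppd is not None:
        print(f"{name}: {ppd / prev_ppd - 1:+.0%} perf per dollar vs previous gen")
    prev_ppd = ppd
```

On those numbers the 3070 improved perf per dollar by about 50% over the 2070, while the 4070 improves it by only about 8% over the 3070.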


This is incredibly simple.
Now go back and read some of those old 2070 reviews. Here are a few choice samples:

"I see no reason to buy any of these RTX cards right now."
"This is what we get when there is no competition :S"
"This generation of Nvidia is just ridiculous. RTX 2070 is basically slightly better 1080 but far below 1080 ti."

The EXACT same complaints we are hearing right now. Suddenly, now the 2070 wasn't a total wet fart that made the 3070 look artificially better by comparison.

In 4 years, people will be citing the 4070 as the reason why the 6070 isn't as good a value as Ada was.

In 2.5 years, we've had MASSIVE INFLATION. You are not getting a 4070 for $400; it just isn't happening. Same reason I can't get a brand-new truck for $37k or a house for $90k anymore.

I pay an obscene $0.51/kWh in California. (In your example, that's a $55/yr difference.) My video cards get handed down so they have a long lifecycle. Where I live, the lifetime electricity cost to run a 4070 exceeds its retail price!

Also, I don't drink Starbucks. But that's beside the point because no one here is interested in how to save money with lifestyle changes. We're talking about what's the best price for a particular gaming activity ("1440p 60fps for this generation's AAA" for example). And electricity should obviously be factored into the total cost of that activity. Many people in this thread feel that a $100 discount on this card would make it attractive. The power bill differential between this card and its closest competition can more than make up for that discount.
If $50 in electricity is a problem you shouldn't be spending $600 on a video card. Period.
 

Would it be cheaper to get an Xbox and use Game Pass Ultimate?
Definitely. But I've been a PC gamer for over 25 years, and downloading cracked games is even cheaper than Xbox Game Pass, while I can still play at ultra settings.

I still have a backlog of like 300 games out of the 550 on my Steam account from the past 10 years that I wanna play, though. I just won't be able to buy any more AAA games; no way I'm spending 25% of my monthly income on a single game. It's sad, because the RE4 demo looks great, and I just got a used 6800 XT after selling my 2060 Super and PS4.
 