
NVIDIA GeForce RTX 4070 Founders Edition

Really, my 6800XT does not go over 255 W, so try again.
I also have a reference 6800XT, and this is correct: TBP is 255 W unless you modify the BIOS for this model.
 
Not picking on you personally, but I hate when people use the power cop-out argument.

Take the 4080 vs. the 7900 XTX: gaming 3 hours per day in the US with a ~100 W power difference in favor of the 4080, the yearly price difference for power alone is ~$10.95. With the average price of a 4080 being $50–$100+ higher, it takes at least 5 years just to break even, which is longer than the actual life cycle of the card.

People need to stop with the power-cost nonsense. If you're worried about a $10 difference in your power bill due to your PC, you have no business buying any GPU if it's a question of affordability. Don't buy a coffee from Starbucks every week if you're worried about saving money.
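The break-even arithmetic above is easy to sanity-check. A minimal sketch, assuming the ~$0.10/kWh rate implied by the $10.95 figure (actual US rates vary quite a bit):

```python
# Break-even estimate: GPU price premium vs. its electricity savings.
# Assumptions from the post above: ~100 W draw difference, 3 h/day of
# gaming, and ~$0.10/kWh (the rate implied by the $10.95/year figure).

def annual_power_cost_diff(watts_diff, hours_per_day, usd_per_kwh):
    """Extra electricity cost per year for the higher-power card."""
    kwh_per_year = watts_diff / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

def breakeven_years(premium_usd, watts_diff, hours_per_day, usd_per_kwh):
    """Years until the power savings repay a higher purchase price."""
    return premium_usd / annual_power_cost_diff(watts_diff, hours_per_day, usd_per_kwh)

yearly = annual_power_cost_diff(100, 3, 0.10)
print(f"Yearly cost difference: ${yearly:.2f}")  # $10.95
print(f"Break-even on a $55 premium: "
      f"{breakeven_years(55, 100, 3, 0.10):.1f} years")  # 5.0 years
```

At a $50–$100 price premium this gives roughly 5–9 years to break even, matching the "at minimum 5 years" claim.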

I have no issue with how much power AMD cards use; I have a 4090, after all. It's the performance they give in 2023 for that power, and the extra heat it produces. I can get my 4090 down close to 300 W while gaming while losing a negligible 5% in performance... At the same time, if I were in the market and 2–3 GPUs performed similarly, I'd go with the one that consumes the least amount of power. That has nothing to do with electricity cost, just the waste heat it produces.

Same reason I would rather have a 4080: it's not hard to cut 50–60 W on it and keep about the same performance as a 7900XTX.

Also, I much prefer DLSS to FSR, and if I were targeting a 4070, I'd likely use it for 1440p gaming, where FSR sucks even worse.

Either way, even the 2-year-old GPU in my secondary PC offers more performance than this card, and I already ditched that card from my primary PC. That card is also slightly power limited, because again, saving 70–80 W on it costs almost no performance.

I live in sunny California, so summers are pretty warm; I'd rather keep waste heat to a minimum.
 
Hmmm, so to get the same power as my twin 1070 Tis in SLI, I need a 4090 to run 4K over 100 fps in all games.

WTF, a 1070 Ti is slower than an RTX 2060, so this is already faster than what you have even assuming perfect scaling; the fact that zero new games support SLI means that in most games this is over twice as fast as what you have. A 4090 would be comically faster...

I also have a reference 6800XT, and this is correct: TBP is 255 W unless you modify the BIOS for this model.

I'll go with W1zzard's numbers over randoms' on the internet: 280 W gaming, 324 W peak. And transients would be in the neighborhood of 600 W, which is over double the 4070's.
 
I have no issue with how much power AMD cards use; I have a 4090, after all. It's the performance they give in 2023 for that power, and the extra heat it produces. I can get my 4090 down close to 300 W while gaming while losing a negligible 5% in performance... At the same time, if I were in the market and 2–3 GPUs performed similarly, I'd go with the one that consumes the least amount of power. That has nothing to do with electricity cost, just the waste heat it produces.

Same reason I would rather have a 4080: it's not hard to cut 50–60 W on it and keep about the same performance as a 7900XTX.

Also, I much prefer DLSS to FSR, and if I were targeting a 4070, I'd likely use it for 1440p gaming, where FSR sucks even worse.

Either way, even the 2-year-old GPU in my secondary PC offers more performance than this card, and I already ditched that card from my primary PC. That card is also slightly power limited, because again, saving 70–80 W on it costs almost no performance.

I live in sunny California, so summers are pretty warm; I'd rather keep waste heat to a minimum.

You can undervolt any GPU to achieve similar results. Obviously the baseline is different, but undervolting a GPU to save 30–50 W with almost no performance penalty isn't unique to one brand.

At worst, a 50–100 W heat addition is going to change the room temperature by 1–3 °F, which is largely insignificant. It's just another obtuse justification, mental gymnastics to prop up some hollow argument on the internet. Unless you're sucking down triple the wattage, you will not notice the effect on your wallet or your room temperature.

As if someone isn't going to buy the fastest possible card for the best price, regardless of power, 9 times out of 10 anyway. It's such a dumb argument to make.
 
You can undervolt any GPU to achieve similar results. Obviously the baseline is different, but undervolting a GPU to save 30–50 W with almost no performance penalty isn't unique to one brand.

At worst, a 50–100 W heat addition is going to change the room temperature by 1–3 °F, which is largely insignificant. It's just another obtuse justification, mental gymnastics to prop up some hollow argument on the internet. Unless you're sucking down triple the wattage, you will not notice the effect on your wallet or your room temperature.

As if someone isn't going to buy the fastest possible card for the best price, regardless of power, 9 times out of 10 anyway. It's such a dumb argument to make.

Yes, but Ada is way more efficient than RDNA3, and comically more so than RDNA2, so the performance loss vs. power/heat reduction is on another level.


Again, I'm just saying: when performance is already similar but one card is substantially better in the power department... If the 6800XT were, say, 20% faster, sure...

That's why I said the 6950XT is more interesting, but it uses 3090 Ti-like power, making it unappealing for the performance it gives, at least in any model worth owning.
 
Not impressed. For $600, the least you would expect is 4070 Ti-level performance, or a 30%+ increase over the last-gen 3070.

At 1440p/4K, the 4070's performance gap is insane: the 4090 is ~97% faster and the 4080 ~56% faster. Compared to these insanely "overpriced" cards, the 4070 feels like a gimped wobbler a generation behind.

... I have to admit, I was expecting the 4070 to go for around $700–$750, but seeing that this card is a ridiculous 20–25% inferior to the Ti variant, the $600 punt doesn't surprise me one bit.

@W1zzard, any chance you can add the 6950XT to the performance charts?
 
I'll go with W1zzard's numbers over randoms' on the internet: 280 W gaming, 324 W peak. And transients would be in the neighborhood of 600 W, which is over double the 4070's.
You can choose to believe what you will.

I know what I see on my side.

[screenshots: in-game power readings]
 
Cool, I'll still trust the dude who's been reviewing hardware for two decades over someone who wants to post a random screenshot trying to defend their hardware.
The screenshots speak for themselves; there's nothing to defend.

And I've been building computers for 25+ years.

You do you :)
 
The screenshots speak for themselves; there's nothing to defend.

And I've been building computers for 25+ years.

You do you :)

I also trust AMD more than I trust randoms on the internet...



Typical Board Power (Desktop)
300 W

As well as GamersNexus

[screenshot: GamersNexus power figures]

I guess randoms get the low-power peasant editions and only reviewers get the 300 W-ish cards...

But as you say you do you..

On a side note, it isn't personal; I don't trust it when people talk about temps or average performance, etc., either, especially when there are professional sources.
 
That's what the $800 4070 Ti is for.

[chart: average FPS at 2560×1440]
Every 70-series card in the last 8 years, except this one, has been faster than the 80-series card of the prior generation. Serious stagnation if the 70 Ti is what's required to surpass the 3080 in performance.
 
Every 70-series card in the last 8 years, except this one, has been faster than the 80-series card of the prior generation. Serious stagnation if the 70 Ti is what's required to surpass the 3080 in performance.

I think this is pretty much how every non-fanboy feels, to be honest: decent product, terrible price. I don't really care about naming; they could have called it the 4060, given it the performance of the 4070 Ti for $600, and I would have felt a lot better about it...

Although if you need a new GPU and only have $600 to spend, it's this, the 6950XT, or waiting for the 7800XT/7800. For some there is still the 6800XT, but given the weak RT performance and the meh FSR, it's not appealing to me.
 
Yes, but Ada is way more efficient than RDNA3, and comically more so than RDNA2, so the performance loss vs. power/heat reduction is on another level.


Again, I'm just saying: when performance is already similar but one card is substantially better in the power department... If the 6800XT were, say, 20% faster, sure...

That's why I said the 6950XT is more interesting, but it uses 3090 Ti-like power, making it unappealing for the performance it gives, at least in any model worth owning.

Why are you comparing a previous-gen card on a different node and expecting there not to be a power-usage difference?

The big takeaway from this is that, MSRP-wise and retail-price-wise, we've had 4000-series performance at similar prices (with the obvious exception of the 4090) for a year now, and longer for the lucky few who bought 3000- and 6000-series cards at retail prices long ago.

Yet everyone is marveling over typical generational improvements while moving down SKUs and moving prices up further. When higher-end cards like the 7900XT/XTX are matching the price-performance value at the targeted resolution of 1440p, I'm not sure why people aren't more painfully aware of how bad a generation of GPUs we're getting.
 
I also trust AMD more than I trust randoms on the internet...



Typical Board Power (Desktop)
300 W

As well as GamersNexus

[screenshot: GamersNexus power figures]

I guess randoms get the low-power peasant editions and only reviewers get the 300 W-ish cards...

But as you say you do you..

On a side note, it isn't personal; I don't trust it when people talk about temps or average performance, etc., either, especially when there are professional sources.
I took this from today's review of the 4070; your picture is from their 6800XT review.

These are not numbers you will usually see in-game, and I stand by my original post to kapone32, based on what I see in-game with this card.

You are free to disagree, but my original post was for him.
[screenshot: power chart from today's 4070 review]
 
I took this from today's review of the 4070; your picture is from their 6800XT review.

These are not numbers you will usually see in-game, and I stand by my original post to kapone32, based on what I see in-game with this card.

You are free to disagree, but my original post was for him.
[screenshot: power chart from today's 4070 review]

And you won't see 450 W on my 4090 90% of the time; I still won't sit here and say it isn't a 450 W GPU...

Why are you comparing a previous-gen card on a different node and expecting there not to be a power-usage difference?

The big takeaway from this is that, MSRP-wise and retail-price-wise, we've had 4000-series performance at similar prices (with the obvious exception of the 4090) for a year now, and longer for the lucky few who bought 3000- and 6000-series cards at retail prices long ago.

Yet everyone is marveling over typical generational improvements while moving down SKUs and moving prices up further. When higher-end cards like the 7900XT/XTX are matching the price-performance value at the targeted resolution of 1440p, I'm not sure why people aren't more painfully aware of how bad a generation of GPUs we're getting.

I'm just pointing out that the still-new alternatives in the general price range are not very appealing. You can still make a case for all of them; all I said was that I would take the 4070 over the 6800XT, even if the 6800XT is $100 cheaper new... That's me; if you'd rather have the 6800XT, good for you. Your needs and use case are different from mine; life goes on.

The 4070 isn't very appealing to begin with.
 
I think this is pretty much how every non-fanboy feels, to be honest: decent product, terrible price. I don't really care about naming; they could have called it the 4060, given it the performance of the 4070 Ti for $600, and I would have felt a lot better about it...

Although if you need a new GPU and only have $600 to spend, it's this, the 6950XT, or waiting for the 7800XT/7800. For some there is still the 6800XT, but given the weak RT performance and the meh FSR, it's not appealing to me.
I love Nvidia and wouldn't even consider an AMD card in this price range, but the 4070 is just a supreme example of when it's time to pause and think about how stupid Nvidia thinks we are. If my 3080 Ti abruptly died and I needed a new video card right now, I would just buy an A750 and wait it out until something actually reasonably priced came out. Both AMD and Nvidia are unfortunately getting away with robbery, because the 4070 will sell out, and the 7800 will probably be $650 and really be a secret 7700 XT.
 
I love Nvidia and wouldn't even consider an AMD card in this price range, but the 4070 is just a supreme example of when it's time to pause and think about how stupid Nvidia thinks we are. If my 3080 Ti abruptly died and I needed a new video card right now, I would just buy an A750 and wait it out until something actually reasonably priced came out. Both AMD and Nvidia are unfortunately getting away with robbery, because the 4070 will sell out, and the 7800 will probably be $650 and really be a secret 7700 XT.

I mentioned this in a different post: if my GPU just abruptly died, this would not be something I would feel good about purchasing, but neither are any of the alternatives currently available in the general price range, and to me the 4070 Ti is even worse. That leaves just the really expensive 4080/7900XTX as cards I would consider, assuming a 4090 wasn't an option from a monetary perspective.

Nvidia has really made it hard, even for people who don't mind spending a decent amount of money. Before, a 70/80-tier card was a no-brainer; now you've got to think really hard about it.
 
The sad thing is, this is their best-value Ada card. In isolation it's decent, but it's basically only 3080 speed at $100 less than that card's RRP when it was new. And it's 33% dearer than the 3070 for around 25% more performance. That's not progress at all. Only the 4090 has delivered progress in the real sense, and it's still the only card I would buy if I had more money than sense.

My 6800XT holds up really well, but my 2080 S would get its @ss kicked.
 
You can choose to believe what you will.

I know what I see on my side.
So you are unaware that RDNA2 doesn't report the full power usage of the entire card, only the power consumed by the ASICs (GPU + VRAM), missing VRM power loss and fans. Plenty of reports are going around, such as one from Igor'sLAB (link). Your 255 W TBP 6800XT is actually using ~300 W; the 6900 XTXH reports a TBP of 290 W but actually uses ~350 W, and increasing the TBP to 335 W would have the card use ~430 W (VRM power loss is higher when drawing more power).

RDNA3, however, does report the full power usage.
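The gap described above can be roughly quantified. A sketch only: the ~1.18 overhead factor below is back-derived from the 255 W → ~300 W example in this post, not an official figure, and real VRM losses grow non-linearly with load.

```python
# Rough estimate of RDNA2 total board power from the software-reported
# "TBP", which on RDNA2 covers only the ASICs (GPU + VRAM) per the post
# above. The overhead factor 300/255 ≈ 1.18 is derived from the quoted
# 255 W -> ~300 W example and is only a ballpark.

def estimated_board_power(reported_asic_watts, overhead_factor=300 / 255):
    """Scale reported ASIC power up to an approximate total board power."""
    return reported_asic_watts * overhead_factor

print(f"{estimated_board_power(255):.0f} W")  # 300 W
print(f"{estimated_board_power(290):.0f} W")  # ~341 W (post cites ~350 W)
```

The 6900 XTXH case (290 W reported, ~350 W actual) lands above this linear estimate, consistent with losses rising faster at higher power.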
 
So you are unaware that RDNA2 doesn't report the full power usage of the entire card, only the power consumed by the ASICs (GPU + VRAM), missing VRM power loss and fans. Plenty of reports are going around, such as one from Igor'sLAB (link). Your 255 W TBP 6800XT is actually using ~300 W; the 6900 XTXH reports a TBP of 290 W but actually uses ~350 W, and increasing the TBP to 335 W would have the card use ~430 W (VRM power loss is higher when drawing more power).

RDNA3, however, does report the full power usage.
Thanks for the link. I remember something about this posted on that site, but it was much older; this article is from Nov 2022, so it's more recent. I will review it.

And I'm glad that RDNA3 does report it, as I may be moving up to an XFX Speedster MERC310 RX 7900XTX this summer.

 
Take the 4080 vs. the 7900 XTX: gaming 3 hours per day in the US with a ~100 W power difference in favor of the 4080, the yearly price difference for power alone is ~$10.95. With the average price of a 4080 being $50–$100+ higher, it takes at least 5 years just to break even, which is longer than the actual life cycle of the card. [...] Don't buy a coffee from Starbucks every week if you're worried about saving money.
I pay an obscene $0.51/kWh in California. (In your example, that's a ~$55/yr difference.) My video cards get handed down, so they have a long life cycle. Where I live, the lifetime electricity cost to run a 4070 exceeds its retail price!

Also, I don't drink Starbucks. But that's beside the point, because no one here is interested in how to save money with lifestyle changes. We're talking about the best price for a particular gaming activity ("1440p 60 fps for this generation's AAA titles", for example). And electricity should obviously be factored into the total cost of that activity. Many people in this thread feel that a $100 discount on this card would make it attractive. The power-bill differential between this card and its closest competition can more than make up for that discount.
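The electricity rate is what flips this argument. A quick sketch, carrying over the earlier ~100 W and 3 h/day assumptions; $0.51/kWh is the rate quoted above, the other rates are illustrative:

```python
# Yearly cost of a ~100 W power-draw difference at 3 h/day of gaming,
# across electricity rates. $0.51/kWh is the California rate quoted in
# the post above; the other two rates are illustrative assumptions.

WATTS_DIFF = 100
HOURS_PER_DAY = 3

for rate in (0.10, 0.16, 0.51):
    yearly = WATTS_DIFF / 1000 * HOURS_PER_DAY * 365 * rate
    print(f"${rate:.2f}/kWh -> ${yearly:.2f}/yr, "
          f"$100 premium repaid in {100 / yearly:.1f} yrs")
```

At $0.10/kWh a $100 premium takes ~9 years to recoup; at $0.51/kWh it takes under 2, which is why the same card can be a fine deal in one place and a poor one in another.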
 
"Loses"

:D

New cards will always cost more than used cards, bro.

[attachments 291210, 291211: performance charts]

Uses the RT chart as an example, LMAO! :laugh: Because that's the only scenario where it takes the cake (besides low power consumption). I really expected more objectivity from a "Staff Member".

It also depends on who is testing what. Just look at Linus's review: in it, the 2.5-year-old (!) AMD 6800XT totally smokes the 4070 in pure rasterization. 1080p = 14% faster, 1440p = 9% faster, 4K = 10% faster! And not to forget the 7900XT for just €170 more, running circles around the 4070, even in "ray-traced glory".


Ray-tracing performance doesn't even matter, because with the low performance of the card it becomes unplayable anyway. The low power consumption, on the other hand, is great, but it becomes meaningless because of the high price of the card. If you get the 6800XT for, let's say, €200 cheaper than the 4070 (used ones go for way less), you have to play for 5.5 years (!) to make up for it (3 hrs of gaming a day). And that's a steep calculation at 33.5 euro cents per kWh; almost anywhere else, power is way cheaper.

TL;DR: the card is really good. It's not a shoebox like the 4090, 4080 & 4070 Ti; it fits in every case. Power consumption is great. The price is just horrible. Avoid for now. Wait for the AMD 7800/XT; it might be the far better deal, or at least put some pressure on Nvidia's consumer-shafting price structure.
 
You can buy two 6800XTs for the same amount of money
Nobody wants ONE 6800XT; are you suggesting people buy two?
 
Nobody wants ONE 6800XT; are you suggesting people buy two?

He's suggesting that the 4070 is a terrible value in his country because it's twice as expensive as the 6800XT; not sure why that's so hard to understand.
 