
Is the RTX 4070 the new performance-per-watt card?

No - I'm arguing that if you want 144 FPS at 2160p, then that's what you should target with your GPU choice. My 60 FPS was only an example of a larger, hungrier GPU eating less power when FPS-capped. ;)

But another question: how much time do you need to recoup the investment and price difference from, let's say, the RX 6750 XT's $400-500 price tag down to the $250 price tag of the RX 6650 XT? That's a lot of money, and you would need decades to burn through it in electricity cost difference.
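For concreteness, here's a minimal back-of-the-envelope version of that payback calculation in Python. Every input below (street prices, board-power gap, daily play time, electricity tariff) is an illustrative assumption, not measured data:

```python
# Back-of-the-envelope payback estimate. All inputs are illustrative
# assumptions (prices, board-power gap, play time, tariff), not data.

price_gap_usd = 450 - 250    # e.g. assumed RX 6750 XT vs RX 6650 XT street prices
power_gap_w   = 250 - 180    # assumed board-power difference under load
hours_per_day = 3            # assumed daily gaming time
usd_per_kwh   = 0.30         # assumed electricity tariff

kwh_per_year  = power_gap_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * usd_per_kwh

print(f"Extra energy cost: ~${cost_per_year:.0f}/year")
print(f"Payback time: ~{price_gap_usd / cost_per_year:.0f} years")
# ~ $23/year, so roughly 9 years at these inputs; with lighter use or
# cheaper power it easily stretches into the decades claimed above.
```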
 
But another question: how much time do you need to recoup the investment and price difference from, let's say, the RX 6750 XT's $400-500 price tag down to the $250 price tag of the RX 6650 XT? That's a lot of money, and you would need decades to burn through it in electricity cost difference.
That's absolutely right. My only reasons for choosing the 6750 XT were longevity (12 GB VRAM), and lower temperatures and noise in a scenario like the above.
 
RX 6750 XT's 400-500$ price tag to the 250$ price tag of RX 6650 XT?
Computers use a lot of juice.. I think you would be surprised.
 
how much time do you need to return the investement and price difference from, let's say, RX 6750 XT's 400-500$ price tag to the 250$ price tag of RX 6650 XT?
It basically survives one generation more due to massive VRAM capacity and greater general performance in comparison, so it pays for itself by staying on point for a couple of years longer. Compare the 1080 Ti to the 1070: the former is still completely fine for 1080p gaming on high settings, while the latter struggles in all recent games. An additional $320 got you a GPU which lasted way longer and provided a better experience overall.

Not to mention some people get paid for their computing power, so some of this money could've returned to the user through mining, rendering or doing any other maths on their PC. And the 1080 Ti yields way more profit than the 1070. The same situation, albeit with a little less difference, applies to the 6600 XT versus the 6700 XT.

I'm on @AusWolf's side: I got a 6700 XT (basically the same GPU) and a 1080p display, albeit with a higher refresh rate (83 Hz vs 60 Hz). My GPU is completely on point as of now and will stay on point for a long time. I won't need to upgrade for a while, especially if FSR3 proves actually good.
 
That's absolutely right. My only reasons for choosing the 6750 XT were longevity (12 GB VRAM), and lower temperatures and noise in a scenario like the above.

I see pricing thrown out there all the time as a reason for buying one specific GPU over another; I guess I just look at it differently. When I buy a GPU, I plan on keeping it 4 years: 2 years in my primary PC and 2 in my secondary PC, give or take 6 months. Let's say one GPU is $1200 and another is $1600. I don't look at it as $400 more; I look at it as about $8 more a month to have the better-performing GPU over the time I plan on keeping it. I always ask myself: is my hobby worth that $8-9 more a month? It's almost always yes.


The 4070 is an awesome card with a silly price, same with the 4070 Ti, but again the buyer has to decide if their hobby is worth it, assuming the card offers them the performance they are looking for.

I do think the 7800 XT, assuming the buyer is OK with FSR, is going to be the new PP champ though, and the GPU I'll probably recommend to most people for the remainder of the generation.
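A quick sketch of the monthly-cost math from the post above, using the poster's own example prices and ownership period:

```python
# Amortizing the price difference over the planned ownership period,
# using the example figures from the post above.

price_a, price_b = 1200, 1600   # the two candidate GPUs (USD)
months_owned     = 4 * 12       # 4 years: 2 in the primary PC, 2 in the secondary

extra_per_month = (price_b - price_a) / months_owned
print(f"Extra cost: ${extra_per_month:.2f}/month")   # -> $8.33/month
```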
 
Yesterday I saw this video Bryan from Tech Yes City made on the Zotac RTX 4070 Spider-Verse, undervolting it to 130 W without much performance drop.


Since electricity prices are on the rise here, I was wondering if the RTX 4070 would be the best 1440p option at ~150 W max?

The RX 6950 XT easily beats it on price to performance, not even close. Unfortunately it's an embarrassing time to be an Nvidia customer.
 
Thread's first post isn't news.
This is why I'm always advocating for power optimization if possible.

Pretty much all Ada cards have this effect where they can retain their performance while shaving huge amounts of power off.
I'm currently using an RTX 4080 and I'm enjoying this kind of effect while running it at 250 W. 70 W shaved is not a bad deal.
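For anyone who wants to check this on their own card, here's a minimal sketch that reads the current power draw and the allowed power-limit range through NVML; it assumes an Nvidia GPU and the pynvml Python bindings. Actually lowering the limit (e.g. to 250 W) generally requires administrator rights and is commonly done with nvidia-smi -pl 250 instead:

```python
import pynvml

# Read power telemetry via NVML (pip install nvidia-ml-py).
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

draw_w  = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000            # mW -> W
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000  # mW -> W
lo, hi  = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

print(f"Current draw: {draw_w:.0f} W, limit: {limit_w:.0f} W")
print(f"Allowed limit range: {lo / 1000:.0f}-{hi / 1000:.0f} W")

pynvml.nvmlShutdown()
```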
 
Thread's first post isn't news.
This is why I'm always advocating for power optimization if possible.

Pretty much all Ada cards have this effect where they can retain their performance while shaving huge amounts of power off.
I'm currently using an RTX 4080 and I'm enjoying this kind of effect while running it at 250 W. 70 W shaved is not a bad deal.

Yeah, I agree. I run my 4090 at 330 W and get 95% of stock performance; in a lot of games it doesn't break 300 W.
 
I don't feel bad about my purchase at all.

Me either, honestly I'd buy a second one if I could.

If GPU prices weren't so stupid I'd own both a 7900 XTX and a 4090... I'd also love to go back to having one Intel and one Ryzen based system as well.
 
I see pricing thrown out there all the time as a reason for buying one specific GPU over another; I guess I just look at it differently. When I buy a GPU, I plan on keeping it 4 years: 2 years in my primary PC and 2 in my secondary PC, give or take 6 months. Let's say one GPU is $1200 and another is $1600. I don't look at it as $400 more; I look at it as about $8 more a month to have the better-performing GPU over the time I plan on keeping it. I always ask myself: is my hobby worth that $8-9 more a month? It's almost always yes.


The 4070 is an awesome card with a silly price, same with the 4070 Ti, but again the buyer has to decide if their hobby is worth it, assuming the card offers them the performance they are looking for.

I do think the 7800 XT, assuming the buyer is OK with FSR, is going to be the new PP champ though, and the GPU I'll probably recommend to most people for the remainder of the generation.
Price isn't even my first consideration, to be honest. I buy the majority of my hardware out of curiosity, and not because I need it. :ohwell: Costs and returns play a weird game when you look at PC hardware as a hobby, and not as an investment.
 
Me either, honestly I'd buy a second one if I could.

If GPU prices weren't so stupid I'd own both a 7900 XTX and a 4090... I'd also love to go back to having one Intel and one Ryzen based system as well.
I was going to go with the 7900XT.. but that wicked Prime day sale came along and literally saved me hundreds.

What I really wanted was to stretch to the 4080, but I cannot justify that for the amount that I play. And I play at 3840x2160 60 FPS.. my 3070 Ti was doing that too, just not with my games maxed out or nearly maxed out.. in CP2077 I do have to use DLSS, otherwise performance torpedoes.
 
Me either, honestly I'd buy a second one if I could.

If GPU prices weren't so stupid I'd own both a 7900 XTX and a 4090... I'd also love to go back to having one Intel and one Ryzen based system as well.
I have two HTPCs that are both Intel + Nvidia. Only my main gaming rig is full AMD. :)
 
The 4070 is only new since April, and the 4080 has held the performance-to-energy-efficiency crown for almost a year now, if you don't mind +100% price for 50% more performance. It undervolts exactly the same.

 
The RX 6950 XT easily beats it on price to performance, not even close. Unfortunately it's an embarrassing time to be an Nvidia customer.
Maybe in price, but not in power under full load.

Yeah, I agree. I run my 4090 at 330 W and get 95% of stock performance; in a lot of games it doesn't break 300 W.
Thread's first post isn't news.
This is why I'm always advocating for power optimization if possible.

Pretty much all Ada cards have this effect where they can retain their performance while shaving huge amounts of power off.
I'm currently using an RTX 4080 and I'm enjoying this kind of effect while running it at 250 W. 70 W shaved is not a bad deal.
Most RTX 4080 and 4090 cards can properly do this, which means it should be a factory setting, because it shows the efficiency the 40 series can deliver but Nvidia doesn't give you, unless you look at the Quadro series with the same GPUs.

The 4070 is only new since April, and the 4080 has held the performance-to-energy-efficiency crown for almost a year now, if you don't mind +100% price for 50% more performance. It undervolts exactly the same.
I am not talking watts per frame but total power consumption here, because 4 W per frame when you run 200 frames per second is 800 W (yeah, I know that's not exactly how you calculate this, but still).

At the end of the day, for my own wallet, I'd just love a total system power of max 200 W at 1440p with 100 FPS average at medium/high settings. That would be real innovation to see, but it won't happen with a dedicated GPU anytime soon; rather with an APU, which AMD is planning to push more of in the future, if anyone believes the rumour mill that AMD is kinda leaving the entry level to its APU products.
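To make the watts-per-frame vs. total-consumption distinction concrete, here's a tiny sketch using the post's own illustrative figures (they aren't measurements of any specific card):

```python
# Watts-per-frame (efficiency) times frame rate gives the implied
# total board power, per the post above. Figures are illustrative.

def total_power_w(watts_per_fps: float, fps: float) -> float:
    return watts_per_fps * fps

print(total_power_w(4.0, 200))  # 800.0 -> "efficient" card, huge total draw
print(total_power_w(5.0, 60))   # 300.0 -> worse W/FPS, lower total draw
# Capping the frame rate is what turns good efficiency into low consumption.
```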
 
I have seen the 400 W limit card at 360 W while running F@H, crazy stuff lol..
 
I was going to go with the 7900XT.. but that wicked Prime day sale came along and literally saved me hundreds.

What I really wanted was to stretch to the 4080, but I cannot justify that for the amount that I play. And I play at 3840x2160 60 FPS.. my 3070 Ti was doing that too, just not with my games maxed out or nearly maxed out.. in CP2077 I do have to use DLSS, otherwise performance torpedoes.

I just know from my past year of mixing an AMD RX 6800 XT with Nvidia RTX 3070 and 3090 cards. Even now, I've got an offer from someone of an RTX 4070 Dual plus £115 / 145 USD cash for my RX 7900 XT.

I am just thinking I might end up disappointed switching to Nvidia again. Last year it wasn't fun hunting RX 6800 XTs (1. Gaming X Trio, 2. Reference and 3. Red Devil), and I wish to avoid that. The RX 7700 XT and 7800 XT don't really look like the cards for me :(
 
It basically survives one generation more due to massive VRAM capacity and greater general performance in comparison, so it pays for itself by staying on point for a couple of years longer. Compare the 1080 Ti to the 1070: the former is still completely fine for 1080p gaming on high settings, while the latter struggles in all recent games. An additional $320 got you a GPU which lasted way longer and provided a better experience overall.

The same can be said about the 8GB RTX 3070, and the 12GB RTX 4070 in 12 months.

 
The same can be said about the 8GB RTX 3070, and the 12GB RTX 4070 in 12 months.
Much more difference in actual price and much less difference in performance, plus I was speaking about same-gen cards, but yes.
 
I just know from my past year of mixing an AMD RX 6800 XT with Nvidia RTX 3070 and 3090 cards. Even now, I've got an offer from someone of an RTX 4070 Dual plus £115 / 145 USD cash for my RX 7900 XT.

I am just thinking I might end up disappointed switching to Nvidia again. Last year it wasn't fun hunting RX 6800 XTs (1. Gaming X Trio, 2. Reference and 3. Red Devil), and I wish to avoid that. The RX 7700 XT and 7800 XT don't really look like the cards for me :(

If we're just looking at it academically, we had a short thread here with some results.


The 4070 seems to have a good amount of UV headroom. My 4070 Ti is more along normal lines for Turing/Ampere/Ada: respectable, but nothing out of the ordinary.
 
We bought some of the most hated video cards ever made
many people that did not buy one hate that I did buy a stupid card, the 3080 10GB

Damn you dumb card that I got for MSRP at launch
which gave me so damn many great experiences through entire games over 3 years and counting
and paid for itself and then some in ill gotten mining gains because why not I bought one anyway
which continues to game like a beast

what a dumb ripoff pos.

Wish I never bought it.

/s
 
More VRAM is always better :D
Look at the 1% low FPS: RTX 3080 10GB - 39 FPS | RX 6800 16GB - 51 FPS @1440p



 
It basically survives one generation more due to massive VRAM capacity and greater general performance in comparison, so it pays for itself by staying on point for a couple of years longer. Compare the 1080 Ti to the 1070: the former is still completely fine for 1080p gaming on high settings, while the latter struggles in all recent games. An additional $320 got you a GPU which lasted way longer and provided a better experience overall.

Not to mention some people get paid for their computing power, so some of this money could've returned to the user through mining, rendering or doing any other maths on their PC. And the 1080 Ti yields way more profit than the 1070. The same situation, albeit with a little less difference, applies to the 6600 XT versus the 6700 XT.

I'm on @AusWolf's side: I got a 6700 XT (basically the same GPU) and a 1080p display, albeit with a higher refresh rate (83 Hz vs 60 Hz). My GPU is completely on point as of now and will stay on point for a long time. I won't need to upgrade for a while, especially if FSR3 proves actually good.
Higher-end GPUs, except the truly overpriced ones, are a great return on investment.

Back when things were sane, I could spend around a hundred bucks every 2-3 years to upgrade to a same-tier new GPU and sell off the old one. That was with stuff in the x70/x80 tier. I managed to extend that through the 1080 and into the 7900XT I have now; I think over the last 15 years I spent $1500~1800 on GPUs in total, with the 7900XT in my hands at the end of it.

If you can pony up the initial cost of, say, 600-700, you can ride that money for a looong time. But you have to play it smart: don't buy into low-VRAM GPUs like the x70 tier is now, for example, and upgrade while your purchase price is still nice and you can still resell the old card because it can run things properly. And don't be afraid of second-hand either; I haven't had a single second-hand GPU fail on me, in fact.

So it's quite important to not just look at what you're getting out of a GPU today, but more in terms of what it might do 5 years ahead of today. That's the perspective of that next buyer for you, and it's your perspective if the current gen at that time isn't offering what you want. And yeah, higher-end also just means a better experience, less compromise, and often better build quality on coolers, etc.
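A minimal sketch of that buy-then-resell upgrade chain; every price below is invented for illustration, not a real purchase history:

```python
# Net out-of-pocket cost of a chain of GPU upgrades with resale.
# All prices are made-up illustrations.

prices  = [650, 700, 680, 900]   # successive purchase prices (USD)
resales = [450, 480, 500]        # what each replaced card sold for (USD)

total = prices[0] + sum(buy - sold for buy, sold in zip(prices[1:], resales))
print(f"Total spent: ${total}")  # -> $1500, with the newest card still in hand
```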
 
but more in terms of what it might do 5 years ahead of today. That's the perspective of that next buyer for you, and it's your perspective if the current gen at that time isn't offering what you want.
I've got a split personality regarding this matter.

One side of me says it's unreasonable to invest money in a more powerful GPU right now; I already have a device which is fully sufficient and will stay sufficient for half a decade from now.
The other side of me has very itchy hands and buys a new GPU less than a year after the previous upgrade...

I'm currently forcing myself to stop it and start investing in something useful. For example, ~~funny powder smuggling~~ purchasing carpenter tools and actually making furniture for sale.
 
More VRAM is always better :D
Look at the 1% low FPS: RTX 3080 10GB - 39 FPS | RX 6800 16GB - 51 FPS @1440p



Hmmmmmm. I don't disagree but cherry picking won't help us ;)

The RX 6950 XT easily beats it on price to performance, not even close. Unfortunately it's an embarrassing time to be an Nvidia customer.
Well only if you believe your GPU is an expression of yourself I guess...
 