Monday, April 3rd 2023
NVIDIA GeForce RTX 4070 has an Average Gaming Power Draw of 186 W
The latest leaked slide for the GeForce RTX 4070 confirms most of the specifications and reveals some previously unknown details, including the 186 W average power draw. While the specification list does not mention the number of CUDA cores, it does confirm the card will be based on the AD104 GPU with 36 MB of L2 cache, and come with 12 GB of GDDR6X memory offering 504 GB/s of maximum memory bandwidth, which points to a 192-bit memory interface and 21 Gbps memory.
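Those last two figures line up exactly: peak bandwidth is the bus width in bytes multiplied by the per-pin data rate. A quick illustrative check (our arithmetic, not anything from the slide itself):

```python
# Illustrative arithmetic only: peak bandwidth = bus width (bytes) x data rate (Gbps)
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a GDDR-style memory subsystem."""
    return bus_width_bits / 8 * data_rate_gbps

# Figures from the leaked RTX 4070 slide: 192-bit bus, 21 Gbps GDDR6X
print(peak_bandwidth_gb_s(192, 21))  # 504.0, matching the slide's 504 GB/s
```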
The slide posted by Videocardz also compares the upcoming GeForce RTX 4070 with the previous-generation RTX 3070 Ti and RTX 3070, showing a significant increase in shader count, RT cores, and Tensor cores, not to mention that the RTX 4070 carries 3rd Gen RT cores and 4th Gen Tensor cores. It will also support DLSS 3 and feature AV1 and H.264 NV encoders. The most interesting part of the slide is the power draw comparison, showing a TGP of 200 W, which is lower than on the RTX 3070 Ti and the RTX 3070. The card also draws less power under average gaming, during video playback, and at idle. According to NVIDIA's own slide, the GeForce RTX 4070 has an average gaming power draw of 186 W, a video playback draw of 16 W, and an idle power draw of 10 W, all lower than on the RTX 3070 Ti and the RTX 3070.
The slide also pretty much confirms the previously reported $599 price tag, at least for some of the RTX 4070 graphics cards, as some custom models will definitely be priced significantly higher. So far, it appears that NVIDIA might not change the launch price, and, as reported earlier, $599 leaves plenty of room for custom RTX 4070 graphics cards without coming close to the more expensive RTX 4070 Ti cards, which sell for close to $800.
Source: Videocardz
56 Comments on NVIDIA GeForce RTX 4070 has an Average Gaming Power Draw of 186 W
Point is, Nvidia did the "this is the MSRP but AIBs are free to come out with more expensive SKUs" thing with the 4070 Ti, and of course most of the cards available weren't at MSRP; they're gonna do the same with this one as well. Depends on where you are, I guess; they're still over MSRP where I live.
$1189 for a crappy Ventus model on an already crappily priced GPU, so tempting.
Hasn't it been like 3 years you've been wanting to buy a GPU? Are you ever going to buy anything? Keep drinking that green Kool-Aid, maybe they'll eventually release something you can buy.
pcpartpicker.com/product/KWnypg/msi-gaming-x-trio-geforce-rtx-4070-ti-12-gb-video-card-rtx-4070-ti-gaming-x-trio-12g <--- MSRP
pcpartpicker.com/product/wRpzK8/zotac-gaming-trinity-geforce-rtx-4070-ti-12-gb-video-card-zt-d40710d-10p <--- MSRP
Not gonna check all of them but you get the point.
Sure, if $800 is the most you can spend and you don't like the 7900 XT for whatever reason, you don't really have a choice; otherwise it's the most meh Nvidia GPU in a long a$$ time.
There's tons of variability in it, though. In some games my 4090 draws 350 W at 100% utilization, while in others it's 420 W. It's actually not easy to hit the power limit on a stock 4090 in games. A 186 W average for a 200 W GPU seems pretty normal.
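For context on how per-game averages like these get measured: NVML exposes the board power sensor that monitoring tools poll. Below is a minimal sketch, assuming the pynvml bindings (the nvidia-ml-py package) and an NVIDIA driver are installed; it samples the first GPU once a second and keeps a running average.

```python
# Minimal power-draw logger via NVML; assumes the nvidia-ml-py package
# (pip install nvidia-ml-py) and an NVIDIA driver are present.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

samples = []
try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts
        samples.append(watts)
        print(f"now: {watts:6.1f} W | running average: {sum(samples) / len(samples):6.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```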
A review that actually tests multiple games:
Seems very much in line with what Nvidia claimed.
Oddly, the only time I can even get it to 420 W is in Cyberpunk/Witcher 3 next-gen using frame generation with all the settings maxed out.
I haven't been paying close attention to it lately, though. The card runs cool and quiet even when set to 600 W, at least compared to the Ampere card it replaced anyway.
Also, and every 4090 owner likely knows this, if you're willing to sacrifice 5% or so of the performance you can drop it to around 320-340 W pretty easily.
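One way to apply that kind of cap without touching clocks is NVML's software power limit. The sketch below is a hedged illustration, not anything from the thread: it assumes the same pynvml bindings plus admin/root privileges, and the 340 W target is just the figure quoted above.

```python
# Hedged sketch: cap board power via NVML's software limit (needs admin/root).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML works in milliwatts; check the range the driver allows before setting.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = min(max(340_000, min_mw), max_mw)  # ~340 W, the figure quoted above

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(driver allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```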
12GB VRAM isn't too little. It's the minimum acceptable amount, but it is at least acceptable.
Sub 200W is also fine. Good, even.
The $599 price point is too much given that just seven years ago the 1070 was $379. I'm not sure inflation and manufacturing costs have risen by almost 60% in that time, but at least in the current market, $599 isn't awful. That says more about how bad the market is than about the 4070's price being good, though; I think the years since Pascal have all been pretty rubbish, with minimal performance/$ improvement from Nvidia.
At the very least, I'm looking forward to some decent products in the midrange. If the price rumours of the 4070 being $749 were wrong, perhaps the horror-show that is an 8GB 4060 Ti will also be wrong. $300 was too much for an 8GB card in 2022, and in 2023, 8GB is absolutely too little VRAM at whatever Nvidia intends to charge above the 3060 Ti's $399. Presumably another 20% like the rest of the Ada lineup so far, which would mean a $479 4060 Ti, perhaps (quick check below). All AMD would have to do then is put 12GB on their sub-$500 cards for some very easy wins in benchmarks. I don't like it, but that's how it's been this year so far.
This is of course before we see what AMD's lower end next gen cards have in store.
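For what it's worth, the percentages quoted in the comment above check out; a throwaway sanity check:

```python
# Throwaway check of the price math above (illustrative only).
gtx_1070, rtx_4070 = 379, 599
print(f"1070 -> 4070: +{(rtx_4070 / gtx_1070 - 1) * 100:.0f}%")  # ~ +58%

rtx_3060_ti = 399
print(f"3060 Ti + 20%: ${rtx_3060_ti * 1.2:.0f}")  # $479 guess for a 4060 Ti
```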
Mind you, I don't see gamers being impressed with any options on the market right now, let alone Nvidia's super cut-down $600 4070 with barely enough VRAM. Nvidia isn't trying to impress you; they are trying to pick your pocket, and gamers shouldn't buy unless they are getting a 4090 or have a gun held to their head. 12GB is a repeat of the 3070: the card could just barely fit new AAA games in its VRAM buffer at the time, but not even 2 years later we are already seeing it having to drop settings and suffer stuttering issues. The same is likely to happen to the 4070 / 4070 Ti. This kind of price for cards that will last less than 2 years is not what I'd call acceptable, and it'll kill PC gaming, as the vast majority of people cannot afford to drop that kind of money on just their GPU every 2 years or less.
The Xbox and PS5 both have 16GB of RAM, usually allocating 10-12GB as VRAM, and both consoles target 4K. Both are "current" for the next 3-4 years, and when their successors appear in 2027 (rumoured), game devs won't instantly swap to optimising for the newest consoles; they tend to move away from the outgoing generation over a year or so, while the vast majority of their paying customers are still on the older hardware.
IMO 12GB is enough for at least 3 years, maybe even 5. Meanwhile, the 10GB of the 3080 and 8GB of the 3070 were widely questioned at launch. I forget whether it was a Sony or Microsoft presentation that claimed up to 13.5GB of the shared memory could be allocated to graphics, but the point is that we had entire consoles with 13.5GB of effective VRAM costing less than GPUs that were hobbled out of the gate by miserly amounts of VRAM.
The 3070 in particular has been scaling poorly with resolution for a good year now, but it's only in the last couple of months that it has really struggled. In 2023 we've had four big-budget AAA games that run like ass at maximum texture quality on 8GB cards, with 3070 and 3070 Ti owners given the no-win choice between stuttering or significantly lower graphics settings.