
NVIDIA GeForce RTX 4070 Allegedly Launches on April 13

GFreeman (News Editor, Staff member)
It has been pretty much confirmed that the NVIDIA GeForce RTX 4070 (non-Ti) is launching in April, and now the rumored date has been narrowed down to April 13th. The latest report comes from hongxing2020, a well-known leaker on Twitter with a solid track record who correctly called the RTX 30 and RTX 40 series launch dates. In case you missed it, the NVIDIA GeForce RTX 4070 is based on the same AD104 GPU as the RTX 4070 Ti, with slightly fewer cores, but it still comes with the same memory specification as the Ti version.

This means the GeForce RTX 4070 should feature 46 streaming multiprocessors (SMs), which should leave it with 5,888 CUDA cores enabled. It will come with 12 GB of GDDR6X memory on a 192-bit memory interface. The TDP is rumored at 200 W. There were rumors that NVIDIA could have three different SKUs for the RTX 4070, with 16 GB, 12 GB, and 10 GB of VRAM, but so far this has remained a vague rumor stemming from Eurasian Economic Commission (EEC) regulatory filings. NVIDIA is slowly completing the RTX 40 series lineup, so hopefully we will not have to wait too long for updates on the RTX 4060 Ti and the RTX 4060. NVIDIA founder and CEO Jensen Huang will hold the opening keynote at GTC on March 21st, so we could get at least some updates on the future GeForce lineup.
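As a quick sanity check, here is a rough sketch of how those numbers fit together, assuming Ada's 128 CUDA cores per SM and the same 21 Gbps GDDR6X as on the RTX 4070 Ti (the data rate is an assumption, not a confirmed spec):

# Back-of-the-envelope check of the rumored RTX 4070 specs
sms = 46
cuda_cores = sms * 128                    # 5,888 CUDA cores (128 per SM on Ada)
bus_width_bits = 192
data_rate_gbps = 21                       # assumed 21 Gbps GDDR6X, as on the 4070 Ti
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(cuda_cores, bandwidth_gb_s)         # 5888 504.0 (GB/s)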



 
y/A/w/N...

Soooo.... they're about to launch yet ANUTHA over-priced addition to their line-up.... yee haw :) /s
 
I'm waiting for an RTX 4070 Super.....
 
Don't worry guys, with Nvidia's pricing pattern, this SKU and the next ones are going to be pretty affordable! /s

 
3080 performance for $699.... :laugh:
Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.
 
Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.

I meant on average, but yeah, I wouldn't be surprised if the 3080 beats it in some scenarios.
 
with slightly fewer cores
Going from 7,680 to 5,888 is not exactly "slightly". Considering the rest of the specs are no different from the Ti's, we are probably talking about a $650-$700 card.
 
It could launch at 500 USD, 12 days earlier.
 
Going from 7,680 to 5,888 is not exactly "slightly". Considering the rest of the specs are no different from the Ti's, we are probably talking about a $650-$700 card.
??? 192-bit vs 128-bit, 500 GB/s vs 300 GB/s, 12 GB vs 8 GB
 
Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.

Yep, the 3080 even has more memory bandwidth than the 4080. This 4070 likely won't even beat the 3070. The 4070 Ti is probably one of the most gimped cards out of the gate I've ever seen; an $800+ 1440p card is some kind of joke in 2023. I doubt Nvidia will price the 4070 at $500, but really this should be a 4060 or 4050 and should cost no more than $330.
 
I gave him the benefit of the doubt and assumed he was talking about the 3080 Ti model...
The 3080 Ti is only ahead in Battlefield V at 4K.

So it's still bad stuff, whatever he smokes.
 
The 3080 Ti is only ahead in Battlefield V at 4K.

So it's still bad stuff, whatever he smokes.

It's technically ahead in 6 games, reference vs. reference spec, at 4K... But in most cases I would probably call it a tie... Both were/are terrible cards given the price, so it's irrelevant I guess, and the 4070 will be just as bad, if not worse.

16 GB for sure.

We already have the Super variant... it's the Ti, and it's meh AF.
 
Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.
And what cases would those be?

[chart: average FPS at 2560x1440]

[chart: gaming power draw]
 
??? 192-bit vs 128-bit, 500 GB/s vs 300 GB/s, 12 GB vs 8 GB
The article says the 4070 will have a 192-bit data bus, GDDR6 and 12 GB of RAM. So other than making a mistake in the type of VRAM, GDDR6X vs GDDR6, where am I wrong?
So, probably 384 GB/s of bandwidth for the non-Ti vs 504 GB/s for the Ti?
OK, $600-$650 then, to compensate for the cheaper memory? No. Nope. Still $650-$700.
Nvidia is moving everything up now that it can.
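For reference, those bandwidth figures follow directly from bus width times data rate; the 16 Gbps GDDR6 and 21 Gbps GDDR6X rates below are assumptions based on current SKUs:

# Memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gb_s(192, 16))   # 384.0 GB/s: 192-bit GDDR6, if the non-Ti gets GDDR6
print(bandwidth_gb_s(192, 21))   # 504.0 GB/s: 192-bit GDDR6X, as on the 4070 Ti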
 
The most interesting part is the FPS/W ratio across GPU generations. I had a 3080 12 GB which drew 400 W at stock settings. With undervolting, 250 W at 100% performance; below 250 W it could not hold the frequency needed to maintain performance as intended.

If the 4070 delivers 3080 performance at 150-180 W after some fine-tuning of the voltage, that would be really nice. Some rumors do say 200 W in the specs. Then again, it's not only about saving money on energy costs, but also the cooler's noise levels. I am allergic to noise. :)
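To put rough numbers on that hope, here is a quick sketch with performance normalized to the undervolted 3080; the tuned-4070 wattage is purely hypothetical:

# Rough perf/W comparison (illustrative numbers, not measurements)
perf = 100                   # normalize undervolted 3080 performance to 100
power_3080_uv = 250          # undervolted 3080, watts (see above)
power_4070_tuned = 170       # hoped-for tuned 4070, watts (assumption)
print(perf / power_3080_uv)      # 0.40 perf/W
print(perf / power_4070_tuned)   # ~0.59 perf/W, roughly a 47% efficiency gain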
 
The article says the 4070 will have a 192-bit data bus, GDDR6 and 12 GB of RAM. So other than making a mistake in the type of VRAM, GDDR6X vs GDDR6, where am I wrong?
So, probably 384 GB/s of bandwidth for the non-Ti vs 504 GB/s for the Ti?
OK, $600-$650 then, to compensate for the cheaper memory? No. Nope. Still $650-$700.
Nvidia is moving everything up now that it can.
You are right, the 4060 Ti is the one with 300 GB/s and 8 GB of RAM. My mistake.
 
And what cases would those be?

[chart: average FPS at 2560x1440]

[chart: gaming power draw]

He said "cases", so obviously the chart of Averages you're showing would not show specific cases....isn't that obvious?
 
He said "cases", so obviously the chart of Averages you're showing would not show specific cases....isn't that obvious? Or was this a bad faith attempt at an "owning"?
I asked what cases... it's as simple as that. With that said, the 3080 is a power-hungry, overpriced dog compared to the 4070 Ti.
 
The most interesting part is the FPS/W ratio across GPU generations. I had a 3080 12 GB which drew 400 W at stock settings. With undervolting, 250 W at 100% performance; below 250 W it could not hold the frequency needed to maintain performance as intended.

If the 4070 delivers 3080 performance at 150-180 W after some fine-tuning of the voltage, that would be really nice. Some rumors do say 200 W in the specs. Then again, it's not only about saving money on energy costs, but also the cooler's noise levels. I am allergic to noise. :)
At a 55% power limit my 4070 Ti uses about 160 W with a 5% performance reduction.
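For context, a quick sketch taking those numbers at face value and assuming the 4070 Ti's 285 W reference power limit:

# 55% power limit on a 285 W card, and the implied efficiency gain
tdp = 285
print(0.55 * tdp)        # ~157 W, consistent with the ~160 W observed
print(0.95 / 0.55)       # ~1.73x perf/W vs. stock, given the quoted 5% perf loss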
 
At a 55% power limit my 4070 Ti uses about 160 W with a 5% performance reduction.

Could you run Superposition 1080p Extreme and post a screenshot of the result with the score and FPS visible? I have never been too interested in using the PL slider; I prefer to adjust the voltage directly, which tames the wattage.

On the other hand, the kind of 160 W scenario you mentioned makes this interesting for laptops. With a TGP of 140 W and above, and a CUDA core count like the 3,072 in the "4060M", I see a possibility of pushing the GPU to 3,000 MHz without trouble.

A 140 W TGP follows the logic of more CUDA cores = lower frequency, fewer CUDA cores = higher frequency, to stay within the 140 W cap. :rolleyes:
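A toy model of that tradeoff, assuming dynamic power scales roughly with SM count x clock x voltage squared; all figures below are illustrative assumptions, not leaked specs:

# Toy fixed-TGP tradeoff: dynamic power ~ SMs * f * V^2 (very rough)
def rel_power(sms, f_ghz, volts):
    return sms * f_ghz * volts ** 2

wide = rel_power(36, 2.5, 1.00)     # hypothetical wider config
narrow = rel_power(24, 3.0, 1.05)   # 3,072 cores / 128 = 24 SMs, pushed to 3 GHz
print(narrow / wide)                # ~0.88: fewer SMs leave clock headroom in the budget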
 
At a 55% power limit my 4070 Ti uses about 160 W with a 5% performance reduction.

That must be a very specific use case, because there's no way that's the average with uncapped framerates.

I play with Vsync, so I prefer a fixed clock with a manual undervolt. On my 3080, most games draw 200-250 W depending on GPU utilization. I saw as much as 280 W in Metro Exodus EE at max usage.
But in Aliens: Fireteam Elite I saw over 300 W in some scenes. And that was at 1800 MHz @ 0.8 V, which is a crazy undervolt compared to the stock 1.05 V. With a power limit at stock voltage, performance would drop like crazy.

A fixed clock is so much better for a capped framerate. 1800 MHz @ 0.8 V vs. 1905 MHz @ 0.9 V was a difference of ~200 vs. ~250 W, even though the framerate was identical.

Of course, all this is with the "prefer max performance" mode. Adaptive can save you more power, but it's horrible for frametimes.
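Those two operating points line up reasonably well with the usual dynamic-power rule of thumb, P ~ f * V^2, which ignores static leakage and board power, so treat it as a rough guide only:

# Predicted power ratio from P ~ f * V^2, using the two operating points above
ratio = (1905 / 1800) * (0.9 / 0.8) ** 2
print(ratio)    # ~1.34 predicted vs. ~1.25 observed (250 W / 200 W)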


If the 4070 is $600, I might consider side-grading my 3080. This was my first card with such ridiculous power consumption, and without the undervolt I would've gotten rid of it a long time ago.
 