
NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

Yeah, that's correct, it's a bug. Flower power generations of graphics cards. :)
You can get full trilinear filtering either by turning trilinear optimizations off in the Nvidia Control Panel or by choosing the "High Quality" rendering option. With Nvidia Quadro cards you get true trilinear under OpenGL even with rendering set to "Quality".
 

Attachments

  • nvidia_tril-opt-off_w_16af.png
I did that for a while. But since I wasn't seeing any difference (I'm not a pixel peeper), I kinda forgot about it.

For those not in the know, this isn't about artificial patterns, but about a trilinear optimization that translated to texture shimmering and, iirc, visible transitions between various LOD levels.
 
I haven't found any problems with brilinear yet, but I anticipate a lack of pixel fill rate from the RTX 4070 having only 64 ROPs instead of the full 80. :)
 
Sometimes ROPs get scaled back simply because they're underutilized. We'll need benchmarks to see actual impact.

Fwiw, I would even be ok with the same or up to 10% less performance than current gen if that translates to a 25% or more reduction in price.
 
Nvidia and value for mon... I can't finish that sentence, as it doesn't exist as of now. AMD needs to do something special.
 
If the 4070 matches the 3080 12GB, costs $100 less, and arrives 3 months after the Ti, that's good enough for me. Can't wait. And knowing Nvidia: the 2070 got more expensive in proportion to how much faster it was than the 1070, and that was after a mining crash, when they went on the offensive just because. For no good reason.
 
Remember

"The more you buy(pay), the more you save(get)".
 
If the 4070 matches the 3080 12GB, costs $100 less, and arrives 3 months after the Ti, that's good enough for me.
I think the rumor is that both the 4070 and 4070 Ti will launch together. However, launching is one thing; availability is another.
 
RTX 4070=Garbage.
 
This is potentially good news. A die about 1/4 smaller than the previous gen, roughly the same specs, and a narrower memory bus. All the premises for something a little faster than Ampere at lower prices. I'm curious how this plays out.
What lower prices?
 
Wow... I just saw that Nvidia planned to launch the 4080 12GB with a midrange chip. What a joke. lol

If you look at the specs, the 4080 16GB is also a midrange Ada GPU. Nvidia has been playing this trick since the GTX 680 release, and even on tech sites such as this one it goes mostly unnoticed by the members. The 4090 is the only high-end Ada so far.
 
Until the 4090 Ti/Super/Turbo.
 

The 4090 will remain a high-end Ada GPU forever. Just not the flagship when/if the 4090 Ti comes out.
 
What lower prices?
I said "premises", didn't I?

Nvidia has also reported declining sales, so we know there is some pressure on them. We'll see how this plays out.
 
The only hope is that AMD might be willing to save us :(

RTX 3070 - 5888 shaders, 256-bit, 392 sq. mm, $499
RTX 4070 - 5888 shaders, 192-bit, 295 sq. mm, $699-799

View attachment 273690

View attachment 273691
NVIDIA GeForce RTX 4070 rumored to feature 5888 CUDA cores, 12GB memory and 250W TDP - VideoCardz.com

Nvidia is digging itself new, deeper holes.

The shitshow by Nvidia continues... now it's up to us to vote with our wallets...

AMD is selling the 6700 XT successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999

But people are still going to buy it and claim they're voting with their wallets for a better future.

A lot of people in Nvidia threads are furious about Nvidia's pricing, but they weren't going to buy anyway. In fact, they're happy for Nvidia to raise prices so that AMD responds with lower ones (Titan Z $3000 vs. R9 295X2 $1500), letting them buy only AMD, as always.

 
If the 4070 matches the 3080 12GB and costs $100 less

Let's see:

RTX 3080: 8704 shaders, 272 TMUs, 96 ROPs, 320-bit, 628 sq. mm.
RTX 4070: 5888 shaders, 184 TMUs, 64 ROPs, 192-bit, 295 sq. mm.

I don't see how RTX 4070 will get close to RTX 3080 even in the most cherry-picked cases.

AMD is selling the 6700 XT successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999

You need to calculate the total die area, which for Navi 31 is not 306 sq. mm but over 500 sq. mm.

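The total-silicon point can be sanity-checked with simple arithmetic. Navi 31 is a chiplet design: one graphics die (GCD) plus six memory-cache dies (MCDs). As a rough sketch, the 306 mm² GCD figure is taken from the post above, and the ~37.5 mm² per-MCD figure is an approximation from published die measurements, not an official number:

```python
# Navi 31 total die area: one GCD plus six MCDs.
# 306 mm^2 GCD is from the post; ~37.5 mm^2 per MCD is approximate.
gcd_mm2 = 306.0
mcd_mm2 = 37.5
num_mcds = 6

total_mm2 = gcd_mm2 + num_mcds * mcd_mm2
print(total_mm2)  # 531.0 -- over 500 sq. mm of total silicon, as stated
```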
 
It seems Nvidia has successfully sold the idea that next-gen performance should simply be priced on top of last-gen pricing. Maybe the 5000 series will bring something palpable.
 
I'm not overly impressed. Apart from the extra L2 cache and some increased numbers on the fake 4080, the new GPUs seem to have lower specs. I guess the third-party reviews will confirm or refute my viewpoint.
The only spec that is lower on the 4070s compared to the 3070s is the 192-bit memory bus width. But this is way, way more than compensated for by the twelve-times increase in L2 cache. "Extra" is a bit of an understatement.

And look at their theoretical compute performance. The 4070ti has almost 2x the compute power and the 4070 has about 1.5x the 3070s. Those aren’t numbers they can fudge. They don’t necessarily translate one-to-one in game performance, but the 4070s are going to be *significantly* faster than the 3070s. Roughly 1.5x-2x faster.
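The compute comparison above can be reproduced with the standard FP32 formula (shaders × 2 FMA ops per clock × clock). A rough sketch, where the ~2.61 GHz Ada boost clock is the leaked rumor, not a confirmed spec:

```python
# Theoretical FP32 throughput in TFLOPS: shaders * 2 ops/clock (FMA) * GHz / 1000.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000

tf_3070 = fp32_tflops(5888, 1.725)  # ~20.3 TFLOPS (official boost clock)
tf_4070 = fp32_tflops(5888, 2.61)   # ~30.7 TFLOPS (rumored boost clock)

print(tf_4070 / tf_3070)  # ~1.51 -- roughly the 1.5x figure quoted above
```

As the post notes, theoretical TFLOPS don't translate one-to-one into frame rates, but the ratio gives a ballpark.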
 
The only spec that is lower on the 4070s compared to the 3070s is the 192-bit memory bus width.
Actually, since the 4070 is on GDDR6X, it still has more bandwidth than the 3070 even with the narrower bus. It's the 4070 Ti that loses some bandwidth (~16%) compared to the 3070 Ti.

Still, at the end of the day internal details can be irrelevant. What matters is $$$/fps.
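The bandwidth claim is easy to check: peak bandwidth is bus width in bytes times per-pin data rate. A quick sketch, where the 21 Gbps GDDR6X rate assumed for both 40-series cards is the rumored figure, not confirmed:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3070    = bandwidth_gbs(256, 14)  # 448 GB/s, GDDR6
rtx_4070    = bandwidth_gbs(192, 21)  # 504 GB/s, GDDR6X (rumored rate)
rtx_3070_ti = bandwidth_gbs(256, 19)  # 608 GB/s, GDDR6X
rtx_4070_ti = bandwidth_gbs(192, 21)  # 504 GB/s, GDDR6X (rumored rate)

print(rtx_4070 / rtx_3070)            # 1.125 -- the 4070 still gains bandwidth
print(1 - rtx_4070_ti / rtx_3070_ti)  # ~0.17 -- the 4070 Ti loses about a sixth
```

With these rumored rates the 4070 Ti drop works out to ~17%, close to the ~16% cited above.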
 
Let's see:

RTX 3080: 8704 shaders, 272 TMUs, 96 ROPs, 320-bit, 628 sq. mm.
The 4070's boosted clock makes it roughly an 8960-shader, 96-ROP equivalent, or about 50% more, but the 192-bit bus will affect things badly. I'm just grateful for the extra 2 GB. At first they hinted at 10 GB, and that would have been really bad.
 
AMD is selling the 6700 XT successor for $999. How is that a save?

6700 XT 335 mm², 96 MB Infinity Cache $479
7900 XTX 306 mm², 96 MB Infinity Cache $999

Uhh… the 7900 XTX is not the successor to the 6700 XT. The specs you're comparing mean nothing because those are two different architectures, each manufactured on a different process node.

6700 XT transistor count: 17.2 billion
7900 XTX transistor count: 58 billion

6700 XT memory: 12 GB, 192-bit bus, 16 Gbps
7900 XTX memory: 24 GB, 384-bit bus, 20 Gbps

Sooooo the 7900 XTX has 3.37x more transistors, 2x the RAM, 2.5x the RAM bandwidth, about 4.5x the compute power, and costs about 2x more. Seems like a pretty decent deal to me, even if it *was* the successor to the 6700 XT.

Actually, since the 4070 is on GDDR6X, it still has more bandwidth than the 3070 even with the narrower bus. It's the 4070 Ti that loses some bandwidth (~16%) compared to the 3070 Ti.
My point was that the GDDR6/X bus width and data rate don't really matter on the 4070s, because the huge increase in L2 cache more than compensates for any loss in memory bus throughput.

They're using the same strategy AMD started with the 6000 series. The 6800 XT/6900 XT could get away with 256-bit GDDR6 at 16 Gbps because they had 128 MB of L3 cache to compensate. Now we're seeing Nvidia follow suit, increasing the size of their L2 cache by 1200%. Nvidia's massive L2 cache is analogous to AMD's Infinity Cache: GDDR6/X is too slow to keep up with these new GPUs, and the solution is to implement an enormous amount of cache.
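The compensation argument can be made concrete with a very simplified model: requests that hit the on-die cache never touch DRAM, so DRAM traffic shrinks by the miss rate and effective bandwidth is amplified accordingly. The hit rates and bandwidth figures below are purely illustrative, not measured values:

```python
# Toy model: cache hits never touch DRAM, so effective bandwidth is
# DRAM bandwidth divided by the miss rate (1 - hit_rate).
def effective_bandwidth_gbs(dram_gbs: float, hit_rate: float) -> float:
    return dram_gbs / (1 - hit_rate)

# Hypothetical 4070: 504 GB/s of GDDR6X behind a large L2 cache.
print(effective_bandwidth_gbs(504, 0.50))  # 1008.0 -- a 50% hit rate doubles it
```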
 
The 4070's boosted clock makes it roughly an 8960-shader, 96-ROP equivalent, or about 50% more

How do you know?
I am inclined to believe that clock speed doesn't scale performance linearly. It's more like a 10% higher clock means 5% higher performance.

but the 192-bit bus will affect things badly. I'm just grateful for the extra 2 GB. At first they hinted at 10 GB, and that would have been really bad.

I agree. I think the 4070 is a badly designed chip with severe internal imbalances that could cause performance issues, especially at 2160p, 4320p and beyond.
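The "10% clock, 5% performance" rule of thumb above corresponds to a sub-linear power-law model, perf ∝ clock^α with α ≈ 0.5. This is a heuristic sketch of that rule, not a validated model:

```python
# Sub-linear clock scaling: performance multiplier = clock multiplier ** alpha.
# alpha = 1 would be perfect scaling; alpha ~ 0.5 matches the rule of thumb.
def perf_multiplier(clock_multiplier: float, alpha: float = 0.5) -> float:
    return clock_multiplier ** alpha

print(perf_multiplier(1.10))  # ~1.049 -- a 10% clock bump yields ~5% performance
```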
 