
NVIDIA GeForce RTX 3070 Founders Edition

W1zzard

NVIDIA's GeForce RTX 3070 is faster than the RTX 2080 Ti, for $500. The new FE cooler is greatly improved, runs very quietly, and even has fan stop. In our RTX 3070 Founders Edition review, we'll also take a look at RTX performance, energy efficiency, frametimes, and overclocking.

 
Nice review! But how was the experience with coil whine, if any?
 
Another excellent and comprehensive review.

Now to see the hype train stop one way or another, once we see what it's up against.
 
A wonderful review and a great card; I just can't quite accept the fact that it's still priced quite high:

GTX 770 - $399
GTX 970 - $329
GTX 1070 - $449

Lastly, it's the most power-efficient Ampere GPU, but I expected a much higher overall efficiency for this generation. It's only a few percent more efficient than the GTX 1660 Ti, which is based on a much less advanced node. I expect availability will totally suck for at least the next three months - Ampere cards are so good that several generations of AMD/NVIDIA owners are in line to get them. The RTX 3080 is still nowhere to be seen: https://www.nowinstock.net/computers/videocards/nvidia/rtx3080/
 
Another fraudulent release:
Not available on the market.
If it is available, it will be at a much higher price than the announced one.
If Nvidia does what it did with the 3080, it should be sued (unfortunately, only people in the US can do that).

A wonderful review and a great card; I just can't quite accept the fact that it's still priced quite high: [...]
Except the actual price will be much, much higher than the announced $500.
 
Very nice GPU. Basically a 2080 Ti minus 3 GB of GDDR6, with a 20% TDP reduction, for 500 bucks. Let's see what AMD has to offer tomorrow. I have a gut feeling that it will offer on-par performance, 4 GB more VRAM, worse ray tracing, and a slightly better TDP for 50 bucks less. In that case I'll go with AMD, as 8 GB will soon not suffice for 4K gaming. Plus, supporting the underdog when it has the superior offer is always a good deed.
 
In that case I'll go with AMD, as 8 GB will soon not suffice for 4K gaming.

W1zzard has mentioned this, and it has been dispelled a thousand times already:

GeForce RTX 3070 comes with 8 GB memory, and that will be the basis for a lot of discussion, just like the RTX 3080's 10 GB. I feel like 8 GB is plenty of memory at the moment, especially considering that this card is targeted at 1440p gaming, which isn't as memory-intensive as 4K. Even at 4K, I feel like you'll run out of shading power long before memory becomes an issue. Next-gen consoles do have more memory, but their 16 GB is for OS + game + graphics, which means effective graphics memory is close enough to the 8 GB the RTX 3070 offers. Nobody can predict the future; should you ever feel VRAM is running out, just sell the RTX 3070 and buy whatever card is right at that time. Personally, I'd rather pay less today than spend extra to solve an unknown problem. 8 GB of GDDR6 memory costs around $40, so it will be interesting to see at which price point AMD's reportedly 16 GB RDNA2 Radeon RX 6000 comes in and whether the extra memory will be able to make a difference.
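As a rough back-of-the-envelope version of that console-memory point (the OS reservation and CPU-side figures below are assumptions for illustration, not published numbers):

```python
# Rough sketch: how much of a console's 16 GB unified memory is
# plausibly left for graphics. Both deductions are assumptions.
TOTAL_UNIFIED_GB = 16.0
OS_RESERVED_GB = 2.5       # assumed OS/system reservation
GAME_CPU_SIDE_GB = 5.0     # assumed game logic, audio, streaming buffers

graphics_budget_gb = TOTAL_UNIFIED_GB - OS_RESERVED_GB - GAME_CPU_SIDE_GB
print(f"Estimated console graphics budget: {graphics_budget_gb:.1f} GB")
# -> 8.5 GB, in the same ballpark as the RTX 3070's dedicated 8 GB
```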
 
That is ridiculously quiet, especially for a card that is (or should eventually be) sold at the baseline MSRP. I'm pretty impressed.

For everyone complaining about the price, maybe it helps to think of this as what would've been a 3080, and the real 3080 as what a 3080 Ti would've been. This still matches a 2080 Ti for $500.
 
I just can't quite accept the fact that it's still priced quite high

If you don't like it, you're more than welcome, as a free consumer, to go buy the AMD alternative that gives the same amount of performance for cheaper...
 
If you don't like it, you're more than welcome, as a free consumer, to go buy the AMD alternative that gives the same amount of performance for cheaper...

I've never owned a GPU that costs more than $300 :cool: - not my price range. I'm just concerned that this generation's xx60 cards will cost a lot more than the 10xx/16xx/20xx cards did. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for the RTX 4060.
 
Noice, one of my target GPUs upgrading from a GTX 1070. But yeah, I won't be able to get one anyway, since I won't find it anywhere, and even if I do, it's already twice the price locally.
 
GeForce RTX 3070 comes with 8 GB memory, and that will be the basis for a lot of discussion, just like the RTX 3080's 10 GB. [...]

Well, I'm only into flight sims, where there are tons of textures to load, so more VRAM really makes a difference in some titles. DCS World runs smoother on an 11 GB 1080 Ti than on an 8 GB 2070 Super/2080, for example. For me, 8 GB is as low as I'd go on a tight budget, 10 GB is pretty much the minimum to consider, 12 GB is price-wise optimal, and 16 GB is nice for future proofing.
 
A good improvement, but you can clearly see performance per watt isn't greatly improved with Ampere compared to Turing (220 W vs. 250 W). A good 15%, but that's it.
A noteworthy mention is that GA104 in this configuration uses 1536 more CUDA cores than TU102 (4352), and at a higher core clock as well.

Is there perhaps a bottleneck in the pipeline that causes stalls?
Is the new Ampere architecture starved for memory bandwidth?
Is the CUDA core somewhat 'RISC'-ed / 'bulldozered' compared to Turing CUDA cores?

It's good for consumers that a mid-range card with last-gen flagship performance can now be had for half the price of the previous-gen flagship SKU. But it does achieve that using brute force.
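As a quick sanity check on that perf-per-watt figure, here's a minimal sketch assuming the RTX 3070 and RTX 2080 Ti deliver roughly equal average performance (per the review) at their official 220 W and 250 W board powers:

```python
# Back-of-envelope perf/W comparison. Performance is normalized to 1.0
# for both cards, per the review's finding that they roughly tie.
perf_3070, power_3070 = 1.00, 220.0      # normalized perf, board power (W)
perf_2080ti, power_2080ti = 1.00, 250.0

gain = (perf_3070 / power_3070) / (perf_2080ti / power_2080ti) - 1
print(f"Perf/W improvement: {gain:.1%}")  # -> 13.6%, i.e. "a good 15%"
```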
 
As expected, a very attractive GPU. Very much looking forward to AMD's response tomorrow, but even with this alone, the GPU market is looking better than it has in a while. That tiny PCB would be excellent for a water-cooled SFF build too. Of course it's a shame that the xx70 series is now at $500, but hopefully that means the (permanent) addition of an xx60 Ti tier filling in the ~$400 bracket, allowing for ~$300 xx60 cards. If not... that's a shame.

Now, bring on RDNA 2 and let us see them both duke it out. Can't wait!
 
So, dead is the myth nVidia spread, and some believed, of the 3070 beating the 2080 Ti. And most of those people also believed that Big Navi would battle with this GPU. :laugh:

Nice review as usual @W1zzard :toast:
 
So, dead is the myth nVidia spread, and some believed, of the 3070 beating the 2080 Ti. And most of those people also believed that Big Navi would battle with this GPU. :laugh:

Nice review as usual @W1zzard :toast:
Well... 1% slower at 1080p, 1% faster at 1440p and 2160p. Technically that is a win, I guess? Though for all intents and purposes it is a tie, obviously. AMD's answer to this (whether that's the 6800 or the 6700 XT or something else) will definitely be interesting. As will the next tier down, given that ~$300-400 is often peak GPU value territory.
 
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?
 
Nice review! But how was the experience with coil whine, if any?
From Guru3D:

Much like the 3080, the GeForce RTX 3070 does exhibit coil squeal. Is it annoying? It's at a level where you can hear it. In a closed chassis, that noise would fade away into the background. However, with an open chassis, you can hear coil whine/squeal. All graphics cards make this in some form, especially at high framerates, where it can be perceived.

 
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer for the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?
Quite obviously to steal some thunder. AMD can't 'pop more VRAM' on the card a day before; AMD's card is what it is on that front. Surely clocks are being tweaked, but yeah.
 
Is there perhaps a bottleneck in the pipeline that causes stalls?
Is the new Ampere architecture starved for memory bandwidth?
Is the CUDA core somewhat 'RISC'-ed / 'bulldozered' compared to Turing CUDA cores?

The Ampere SM has 128 CUDA cores; however, only 64 of them can execute FP32 operations concurrently with 64 integer operations in a clock cycle. That means in the best-case scenario the Ampere SM has twice the shading power of a Turing one (128 FP32 per cycle), and in the worst case it's about as fast (64 FP32).

In other words, the SM has some pretty hard constraints making it less efficient per core per cycle in FP32 terms than the raw core count suggests.
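A minimal model of that constraint, assuming one 64-wide FP32-only datapath plus one 64-wide datapath that executes either FP32 or INT32 each cycle (NVIDIA has cited roughly 36 integer instructions per 100 FP32 instructions for typical games; treat that mix as an assumption here):

```python
# Toy model of effective FP32 throughput per Ampere SM per clock.
# Datapath A: 64 lanes, FP32 only. Datapath B: 64 lanes, FP32 or INT32.

def ampere_fp32_per_clock(int_per_100_fp: float) -> float:
    r = int_per_100_fp / 100.0
    # Datapath B splits its cycles between INT32 and leftover FP32 work;
    # balancing the two workloads gives 128 / (1 + r) FP32 ops per clock.
    return 128.0 / (1.0 + r)

# Turing SM: 64 FP32 lanes, INT32 runs on separate lanes
# (holds as long as the INT mix doesn't exceed the FP mix).
TURING_FP32_PER_CLOCK = 64.0

for mix in (0.0, 36.0, 100.0):
    fp32 = ampere_fp32_per_clock(mix)
    print(f"{mix:5.1f} INT per 100 FP: {fp32:6.1f} FP32/clk "
          f"({fp32 / TURING_FP32_PER_CLOCK:.2f}x a Turing SM)")
# ->  0.0: 128.0 (2.00x), 36.0: ~94.1 (1.47x), 100.0: 64.0 (1.00x)
```

So with a typical instruction mix, the per-SM gain comes out closer to ~1.5x than the 2x the doubled core count suggests.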
 
I've never owned a GPU that costs more than $300 :cool: - not my price range. I'm just concerned that this generation's xx60 cards will cost a lot more than the 10xx/16xx/20xx cards did. Also, I'm not a fan of AMD drivers, so, thanks a lot. Looks like I'm waiting for the RTX 4060.

xx70 cards are now pushed to the (bottom of the) high end, the same way BMW upsized the 3 Series from a two-door coupe to a land yacht.
 
I have doubts about card availability at the announced price.
 