
NVIDIA GeForce RTX 3070 Founders Edition

My current GTX 1070 card is worried it will be getting replaced with a 3070 card by Jan/Feb, if stock holds up then. :p
 
At least the power consumption is reasonable on this card; the 3070 isn't a space heater.

Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.
 
At least the power consumption is reasonable on this card; the 3070 isn't a space heater.

Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.

Thank you Samsung...
 
Another fraudulent release:
Not available on the market.
If it is available, it will be at a much higher price than the announced one.
If Nvidia does what it did with the 3080, it should be sued (unfortunately only people in the US can do that).


Except the actual price will be much, much higher than the announced $500.
In my country the cheapest is €620, then €650, and then it goes over €720. Nice, right? Not to mention that it still isn't released...
 
Last time, they dropped the price by $50 the day before release... which can only be seen as raising the performance white flag. You don't lower prices when you have the better product; you raise them.

AMD announced pricing on the 5700 series.
NV countered with Super (bigger and more expensive to produce) versions of its cards.
AMD adjusted pricing on its GPUs with chips of a "whopping" 250 mm² size.
The "white flag" is fanboi imagination.

Thank you Samsung...
Thank you Jensen for OC-ing the card to the max.
 
Holding your breath and crying? None of that is happening here. Grow up and talk like an adult.

Of course games will work with 8GB, that isn't the point. I like how you just missed it: from the 770 to the 1070 we got 4x more VRAM, yet from the 1070 to the 3070 we didn't get any more at all. And I'm the unreasonable one? We've never gone 4 years without more VRAM, and this during the launch year of new consoles that will drive up VRAM requirements very soon. More importantly, build it and they will come: give all your cards more VRAM and game makers will release content for them. The month the 1070 launched I suddenly had games with new video options and texture packs that used more than 4GB, and the same people were "crying", like yourself, that we didn't need 8GB then; yet voilà, we got better settings than the console games and used the VRAM up quickly. Doom Eternal already needs more than 8GB for top settings, and more games will soon.
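Just to put numbers on that VRAM point, here is a quick back-of-the-envelope comparison of the x70-class cards; the capacities are recalled from the spec sheets, so treat them as approximate.

```python
# Rough VRAM history of NVIDIA's x70-class cards, in GB.
# Figures recalled from spec sheets; double-check before relying on them.
x70_vram = {
    "GTX 770 (2013)": 2,
    "GTX 970 (2014)": 4,   # 3.5 GB fast + 0.5 GB slow segment, counted as 4
    "GTX 1070 (2016)": 8,
    "RTX 2070 (2018)": 8,
    "RTX 3070 (2020)": 8,
}

print(f"770 -> 1070: {x70_vram['GTX 1070 (2016)'] / x70_vram['GTX 770 (2013)']:.0f}x the VRAM")
print(f"1070 -> 3070: {x70_vram['RTX 3070 (2020)'] / x70_vram['GTX 1070 (2016)']:.0f}x the VRAM")
```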

NO man, you have to understand, you buy 4K-capable cards to game at 1440p with High settings. And that's fine for 700-800 dollars' worth of GPU nowadays. Can't be nitpicking too much about 10GB, right? Look at the goddamn EF PEE ES for the money! This card is fine.

And texture packs?! Consoles don't get those either, stop whining.

Practical experience with more VRAM usage? No, you're lying ;) It's not commonplace at all. I don't see it myself every day either. Allocated is not "in use"! It's fine! Stutter doesn't exist!

/s
 
This review, and in fact 95%+ of reviews, uses cards sent in by the manufacturer. Has anyone ever tested how close to "typical" these review cards are?

In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?

Imagine you are manufacturing graphics cards. Wouldn't you take a bunch of them, benchmark them, and pick the best ones to send in for reviews?

The only case I could imagine where this wouldn't be a problem is if the fluctuations are really small, like 1-3%.
 
Thank you Samsung
We don't really know if Samsung is at fault, or whether Nvidia pushed the cards way past reasonable limits (not unlike AMD), or indeed the Ampere uarch itself. The only way we'd know for sure is if Nvidia launched the exact same cards on TSMC 7nm or another "better" node.
 
We don't really know if Samsung is at fault, or whether Nvidia pushed the cards way past reasonable limits (not unlike AMD), or indeed the Ampere uarch itself. The only way we'd know for sure is if Nvidia launched the exact same cards on TSMC 7nm or another "better" node.
Samsung's 8nm node isn't made for big dies, so when they are pushed to higher clocks than their optimum they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure, methinks). It is also obvious that the Ampere arch is compute-focused and doesn't like high clocks either. So, Ampere on Samsung was a bad combo. Pascal might have worked better if made on this node, but Ampere and Turing not so well, with their big dies and tensor cores.
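The "pushed past its optimum" part is basically the first-order dynamic power relation P ≈ C·V²·f: once voltage has to rise to hold a higher clock, power climbs much faster than performance. A toy sketch with invented voltage/frequency points, just to illustrate the shape of the curve:

```python
# Toy illustration of dynamic power P ~ C * V^2 * f: power grows roughly with
# the cube of clock speed once voltage must scale along with frequency.
# All numbers below are invented for illustration, not measured Ampere data.
points = [
    # (clock in GHz, voltage in V)
    (1.7, 0.85),
    (1.9, 0.95),
    (2.0, 1.05),
]

base_clock, base_volt = points[0]
for clock, volt in points:
    rel_perf = clock / base_clock
    rel_power = (volt / base_volt) ** 2 * (clock / base_clock)
    print(f"{clock:.1f} GHz @ {volt:.2f} V: ~{rel_perf:.0%} perf, ~{rel_power:.0%} power")
```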
 
Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.

So it is a space heater then.
 
You can undervolt a 3080 to 200 W levels and lose about 5% performance. So that means, as stated before, these cards are great up to a certain point and then lose efficiency.
 
You can undervolt a 3080 to 200 W levels and lose about 5% performance. So that means, as stated before, these cards are great up to a certain point and then lose efficiency.
If you get a good sample you might get there. But ~35% less power with that little performance loss sounds rare. Still, GPU undervolting is extremely valuable with these power-hungry cards.
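Taking the numbers in these two posts at face value (the 3080's 320 W stock board power, ~200 W undervolted, ~5% performance lost), the perf-per-watt math works out roughly like this; the inputs are the claims above, not my own measurements.

```python
# Back-of-the-envelope efficiency gain from undervolting an RTX 3080, using the
# rough figures quoted above: 320 W stock, ~200 W undervolted, ~5% perf loss.
stock_power = 320.0   # W, reference board power
uv_power = 200.0      # W, claimed undervolted draw
perf_loss = 0.05      # ~5% performance lost

power_saving = 1 - uv_power / stock_power
perf_per_watt_gain = (1 - perf_loss) / (uv_power / stock_power) - 1

print(f"Power saving: {power_saving:.1%}")                             # ~37.5%
print(f"Performance-per-watt improvement: {perf_per_watt_gain:.1%}")   # ~52%
```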

This review, and in fact 95%+ of reviews, uses cards sent in by the manufacturer. Has anyone ever tested how close to "typical" these review cards are?

In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?

Imagine you are manufacturing graphics cards. Wouldn't you take a bunch of them, benchmark them, and pick the best ones to send in for reviews?

The only case I could imagine where this wouldn't be a problem is if the fluctuations are really small, like 1-3%.
Some reviewers have done things like that, and some reviewers purposely buy products off the shelf for review to control for this - for example, GamersNexus often buys GPUs for testing rather than getting them from the OEM. That obviously isn't possible for pre-release reviews, but in general the variance is essentially zero. Given the relatively loose voltages and aggressive boosting algorithms on GPUs, you won't see much of a difference unless you're undervolting - and no OEM would be dumb enough to send out a pre-undervolted review sample, even if it would technically be possible.
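If someone actually wanted to quantify the GPU "silicon lottery" across review samples, the obvious measure is the spread of scores across several retail cards of the same model, e.g. as a coefficient of variation. A minimal sketch with placeholder FPS numbers, not real measurements:

```python
# Sketch: quantify card-to-card variance across retail samples of one GPU model.
# The FPS figures are placeholders, not measurements from any real cards.
from statistics import mean, stdev

samples_fps = [143.2, 142.8, 144.1, 143.5, 142.9]  # same benchmark, different cards

avg = mean(samples_fps)
cv = stdev(samples_fps) / avg  # coefficient of variation

print(f"Mean: {avg:.1f} FPS, card-to-card spread: {cv:.2%} of the mean")
```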
 
Well, I'm thoroughly surprised by the performance. I expected it to deliver "2080 Ti performance" only with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad; a very intriguing GPU.
It's a shame it isn't going to be available for a while. I haven't seen a single 3080 in any Hungarian store since release, and I have a feeling the 3070 is going to be even worse, as I expect the demand to be higher.
Maybe in 3-4 months the 30-series will actually be "released" and available.
 
Well, I'm thoroughly surprised by the performance. I expected it to deliver "2080 Ti performance" only with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad; a very intriguing GPU.
It's a shame it isn't going to be available for a while. I haven't seen a single 3080 in any Hungarian store since release, and I have a feeling the 3070 is going to be even worse, as I expect the demand to be higher.
Maybe in 3-4 months the 30-series will actually be "released" and available.
The RTX 3070 is good overall, but the GTX 970 and GTX 1070 were better when they launched. The hype increases every year and for some reason people cannot think clearly.

That's probably because of excellent Nvidia marketing.
 
Thank you Samsung...
Why Samsung? In this day and age of MCU design you design a chip around the intended node and its limitations, not the other way around.
MCUs are pushed to the brink of stability with these ultra-high 2 GHz boost clock frequencies.
 
Wondering why Fortnite is not used when testing cards?
It's one of the most-played games, and I for one would base my purchase on the results for it.
 
Wondering why Fortnite is not used when testing cards?
It's one of the most-played games, and I for one would base my purchase on the results for it.
Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.
 
Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.
That, plus patches can occur at random times, so I can't guarantee a consistent game version between test runs.
 
Samsung's 8nm node isn't made for big dies, so when they are pushed to higher clocks than their optimum they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure, methinks). It is also obvious that the Ampere arch is compute-focused and doesn't like high clocks either. So, Ampere on Samsung was a bad combo. Pascal might have worked better if made on this node, but Ampere and Turing not so well, with their big dies and tensor cores.

I don't think Nvidia pushed for high clocks with their Ampere design. If they had, we would probably already be seeing something beyond 2.2 GHz. Starting with Turing, Nvidia has been looking for ways other than raising clock speed to increase performance. Also, while Samsung's node isn't made for big dies, one way or another Nvidia had to "force" Samsung to make their big GPU; how else is Samsung going to get the experience if they never make one?
 
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti; that would be a very good card if the price is OK.
If I were going for an NV card I'd wait for the 3070 Ti.
 
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti; that would be a very good card if the price is OK.
If I were going for an NV card I'd wait for the 3070 Ti.
I can already see it consuming almost the same power as the RTX 3080, even with some shader cores disabled, as it would use the same die.
 
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti; that would be a very good card if the price is OK.
If I were going for an NV card I'd wait for the 3070 Ti.
$200 price gap, 100W power consumption gap, 23% performance gap at 1440p? Yeah, that does open the door for a $600 3070 Ti, as long as there are enough sufficiently defective GA102 dice to make that viable as a product. Though it would also make for a rather crowded product stack, with AIB 3070s easily exceeding the price of a base 3070 Ti, making developing such cards a lot less attractive for partners. Going by recent history we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.
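For what it's worth, here is roughly where a $600 3070 Ti would land on price per unit of performance, using the figures in the post above ($500/$700 MSRPs and the ~23% 1440p gap); the Ti's performance number is a pure assumption for illustration, not a leak.

```python
# Rough positioning sketch for a hypothetical $600 RTX 3070 Ti.
# 3070/3080 figures come from the discussion above; the Ti performance value
# is assumed (guessed to land mid-gap), purely for illustration.
cards = {
    # name: (MSRP in USD, relative 1440p performance, 3070 = 1.00)
    "RTX 3070":              (500, 1.00),
    "RTX 3070 Ti (assumed)": (600, 1.12),
    "RTX 3080":              (700, 1.23),
}

for name, (price, perf) in cards.items():
    print(f"{name:24s} ${price}  {perf:.2f}x perf  ~${price / perf:.0f} per 3070-equivalent")
```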
 
I can already see it consuming almost the same power as the RTX 3080, even with some shader cores disabled, as it would use the same die.
Well, I was focusing mostly on the performance difference between the two rather than on what the power consumption looks like.
If you compare the 3080 and the 3090 the difference is not that significant, and yet there are rumors about a 3080 Ti being stuck between the two. It would have made more sense to have a 3070 Ti to cover that performance range.

we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.
Actually, I don't care about the name of the card, Ti or Super or whatever. The point is that the performance gap between the two is huge. It would have made more sense to make a 3070 Ti/Super than a 3080 Ti sitting in the 12% (or 15%, whatever it is) performance gap between the 3080 and the 3090.
I don't think a replacement is a great option here; the gap is huge. I'm also not convinced that a replacement would attract more people; on the contrary, actually. People want the cards now, not later when they already have whatever 3000-series card they bought.
 
In my country you can't find one, and even if you get lucky at some point, it costs around $840.
If AMD doesn't put out something good, we are looking at a future where having a decent GPU will cost a serious sum of money.
 