
NVIDIA Announces the GeForce RTX 3060, $330, 12 GB of GDDR6

Ok, let me see if I get it: less GPU power, more VRAM? ROFL, wtf
 
3080 twice the 2080, 3060 twice the 1060, lol
 
Quite strange they compared it with the four-and-a-half-year-old 1060 rather than the 2060/2060 Super it is going to replace.
Millions of players? Yes, Nvidia. You should try to reach millions of actual players instead of the scalpers and miners you've reached so far...

So this card got more VRAM than the 3060 Ti, 3070 and 3080. Why, Nvidia??
I think this will be asked many times... With that kind of bus, the choice was between 6 and 12. 6 GB would not have been enough, so... you have your answer ;)
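To spell out the arithmetic behind that answer (a rough sketch on my side, assuming standard GDDR6 parts): a 192-bit bus is six 32-bit memory channels, and the chip densities actually on the market are 1 GB and 2 GB, so the symmetric options are 6 GB or 12 GB and nothing in between.

```python
# Rough sketch of the VRAM capacities a given bus width allows, assuming one
# GDDR6 package per 32-bit channel and the common 1 GB / 2 GB chip densities.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    channels = bus_width_bits // 32  # each GDDR6 package sits on a 32-bit channel
    return [channels * d for d in densities_gb]

print(vram_options(192))  # [6, 12]  -> a 192-bit card is either 6 GB or 12 GB
print(vram_options(256))  # [8, 16]  -> the 3060 Ti / 3070 class bus
print(vram_options(320))  # [10, 20] -> the 3080's bus
```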

Nice little card, with a plentiful pool of VRAM, I love it.

Hate to say this but I wonder if that MSRP includes the new 25% tariff on all video cards manufactured in China ☹️
As usual, MSRP will only apply to a few US customers... Customers outside the US will most probably pay something like €400-450 for this card.

How about you post some of those examples? People often claim there is plenty of evidence that VRAM limits are an issue, yet can never point to tests showing the results. Maybe a YouTube video at BEST, but never any well-tested, verified results. One of the biggest symptoms of VRAM limitations is stuttering, which shows up as atrocious 1% low FPS. Most review sites test this now. I've yet to see ONE that shows such behaviour from a 3070 or 3080.
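For context, here's roughly what a "1% low FPS" number measures (my own illustration, not any particular review site's exact method): take the recorded frame times, average the slowest 1% of frames, and convert that back to FPS. VRAM-starvation stutter shows up as a handful of very long frames, so this figure collapses even while the average FPS still looks fine.

```python
# Rough illustration of a "1% low" FPS figure computed from frame times in ms.
# Review tools differ in the exact method; this is just one common approach.
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    tail = worst[:max(1, len(worst) // 100)]      # the worst 1% of frames
    return 1000.0 / (sum(tail) / len(tail))       # average of that tail, as FPS

# A mostly smooth ~60 fps run with ten 80 ms hitches, the kind VRAM thrashing causes:
frames = [16.7] * 990 + [80.0] * 10
avg_fps = 1000.0 * len(frames) / sum(frames)
print(round(avg_fps))               # ~58 -> the average still looks fine
print(one_percent_low_fps(frames))  # 12.5 -> the 1% low gives the stutter away
```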

Are you REALLY insinuating that the common denominator for PC development is 8GB, when >90% of PC gamers use cards with 6GB of VRAM or less? Are you even aware of how rare high-end cards are? Polaris was the first generation where >6GB cards were affordable for the masses, with the 8GB 470 and 480.

So you admit the only examples you can find use mods (LMFAO) and everything else is assumed based on speculation and nothing else? One or two benchmarks intentionally written to break hardware HARDLY count as general gaming.
This. I'm not happy about Nvidia's memory size choices, but so far I have never seen a test showing we are approaching VRAM issues on a 10 GB card, even at 4K.

The only reason I think Nvidia decided to put 12GB of VRAM on this is what happens to the 3070 maxed out with DLSS on 8GB. Plus they couldn't release a 6GB version; customers would throw a fit.
What happened? Nothing. So far the 3070 is fine with 8 GB, even maxed out.
 
RIP RTX 3060. AMD will release the RX 6700 XT and RX 6700. The RX 6700 XT will compete with the RTX 2080/S, and the RX 6700 with the RTX 2070/S. Nvidia could release an RTX 3060 Super after the RX 6700, or maybe a price cut. However, in the 900, 1000 and 2000 series, the xx60 GPU had half the xx80's core count, so the RTX 3060 should have 4,352 cores. If the RTX 3060 were $279 it would dominate its price region, but the 3060's price is bad. At least, I hoped to see something like the GTX 760's promise, but all I see is a 3060 ballooned with VRAM. Even so, I believe Nvidia will release an RTX 3050 Ti 6GB for $179-199. I would choose the RTX 3050 Ti over the RTX 3060 and wait for the RTX 4000 series.
 
So this card got more VRAM than the 3060 Ti, 3070 and 3080. Why, Nvidia??
I feel a card at this level is unlikely to use anything more than 8GB, since it is good for up to 1440p. So the rest of the VRAM is there just to compete with the mid-range cards from AMD, which are also expected to have 12GB of VRAM over a 192-bit bus. If AMD includes the massive Infinity Cache on their mid-range cards, then I think the 6700 XT may be faster. Just my guess.
 
1,300 fewer CUDA cores than the 3060 Ti? That translates into 35% less rendering performance. It's basically a 2060S/2070 with 12 gigs of VRAM for 70 bucks less. So only a 12-15% gain over the 2060?
That's a very shitty generational performance uplift. The only GPUs worth buying in the Ampere lineup are the 3060 Ti and the 3080 at MSRP. They could both use 2 gigs of additional VRAM, but still; everything else is kind of meh.
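Napkin math on the announced core counts, for anyone who wants to check the figures (clocks, memory bandwidth and the workload all matter too, so treat this as a rough bound rather than a performance prediction):

```python
# Announced CUDA core counts: 3,584 for the RTX 3060 vs 4,864 for the 3060 Ti.
cores_3060, cores_3060ti = 3584, 4864

deficit = cores_3060ti - cores_3060
print(deficit)                                      # 1280 -> the "~1,300 fewer" figure
print(f"{deficit / cores_3060ti:.0%} fewer cores")  # 26% fewer cores
print(f"{cores_3060 / cores_3060ti:.0%} of the Ti's shader throughput at equal clocks")  # 74%
```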

Yep, Nvidia's Ampere stack is fast looking like a big pile of 'just below par' crap with tons of software to make it work properly. In BETA, mind, and subject to change.

Starting to look less attractive as it gets seeded further. Strange how that works. But the lackluster specs along with the shitty price situation are quickly pushing me to a no-buy again. They made a big mistake trying to pre-empt the way they did.
 
Any chance of an RTX 3050, or is this the weakest card able to effectively push RT at 1080p/60 Hz?
 
And no power connectors. :rolleyes:
Maybe it's in that dark corner on the right.
Clearly just a simplistic render, but it gives an idea of the design language Nvidia is going for with the FE. The silver S of the 3070 and 3060 Ti is gone, and it looks to me like the only fan on the front (the bottom as we see it) is at the back. There can't be another fan on the other side because that's where the PCB is, so this is certainly looking like either a blower or a blow-through design, based on this artist's impression.

Not really final enough to say, but the texture we can see on the bottom does look like a cover, rather than the chevron-angled heatsink fins we've seen on higher-tier cards.
As for power connectors, Nvidia will be hooking into the PCB which will be short again, so it's very likely they'll be in the middle of the card like the rest of the Ampere line-up.

The other option is that this render is so half-assed because Nvidia isn't planning on an FE and will hope the partners carry the design at launch. That's usually what happens from the 50-series and down, with renders of Nvidia's own design only existing as marketing images on Nvidia's website. That could apply from the 60-series this time....
 
If it only has the one fan, I'd say skip the 3060 FE.

That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).
 
RIP RTX 3060. AMD will release the RX 6700 XT and RX 6700. The RX 6700 XT will compete with the RTX 2080/S, and the RX 6700 with the RTX 2070/S. Nvidia could release an RTX 3060 Super after the RX 6700, or maybe a price cut. However, in the 900, 1000 and 2000 series, the xx60 GPU had half the xx80's core count, so the RTX 3060 should have 4,352 cores. If the RTX 3060 were $279 it would dominate its price region, but the 3060's price is bad. At least, I hoped to see something like the GTX 760's promise, but all I see is a 3060 ballooned with VRAM. Even so, I believe Nvidia will release an RTX 3050 Ti 6GB for $179-199. I would choose the RTX 3050 Ti over the RTX 3060 and wait for the RTX 4000 series.
Yes... AMD will release other products with very poor RT performance and customers will still buy GeForce cards...
 
Yes... AMD will release other products with very poor RT performance and customers will still buy GeForce cards...
"The customers" will buy whatever fits their interests. They only game where rt is noticeable is cp2077 and not even the 3090 can max it out unless you compromise. I just took a look a the list of 30 games that i play or will play and not even one has support for rt. So i guess for the vast majority of "customers" right now rt is meaningless. Maybe next gen.
 
Well, Nvidia's market share has risen during the Turing period, so...
 
Well, Nvidia's market share has risen during the Turing period, so...
RT or not, Nvidia has good GPUs, and I bet that Nvidia's share is 90% GTX 1060 and GTX 1660.
 
If it only has the one fan, I'd say skip the 3060 FE.

That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).
Utter nonsense! There are hundreds (maybe even thousands) of cards, both past and present, that adequately cool GPUs over 175W with a single fan. I'm currently using a single-fan card with a total board TDP of around 165-170W and it's quiet, because that single fan only runs at 1,400 rpm even under a full FurMark torture test.

Cooling 175W should be a piece of cake for a single fan. The 3080 at 350W doesn't seem to have any problem (175W per fan), and with two fans on a card there's much less heatsink per fan too.
 
Hi,
If you want to OC a card to the max, a single fan is nowhere near enough cooling unless you live near the Arctic Circle and leave the windows open lol

Hybrids, which only have one fan, might be the exception.
 
Nvidia's releases are getting more comical and less real with each release.
They're not even going to bother with a real reference release at MSRP, so naturally the OEMs that just announced price hikes on every other GPU are not going to be selling many of these at MSRP. Wave one may have some at that price to dodge legal pursuit, but after that these will quickly scale the price range, IMHO.

As for the card, it looks good. I wouldn't mind a go, but I don't think I will bite.

They're trying too hard via alternative means (HUB et al) to get my cash ATM.
 
"The customers" will buy whatever fits their interests. They only game where rt is noticeable is cp2077 and not even the 3090 can max it out unless you compromise. I just took a look a the list of 30 games that i play or will play and not even one has support for rt. So i guess for the vast majority of "customers" right now rt is meaningless. Maybe next gen.
That's your opinion based on... let me see... ah ok, an RX 580... Very relevant.
 
If it only has the one fan, I'd say skip the 3060 FE.

That one fan can't be enough to cool a 175W card (more, since it'll still use the 12-pin connector for up to 225W of power draw).
What if that one fan has RGB?

 
The GTX 1060 is by far the most popular card with gamers on Steam and the RTX 2060 is well up there too; no doubt Nvidia hopes to continue that trend with the RTX 3060. Hell, it even has enough VRAM to attract the "Lol 10GB, I must have loads of VRAM because of dodgy consoles in the coming years!!!1" crowd.

Smart.
 
That's your opinion based on... let me see... ah ok, an RX 580... Very relevant.
I mean come on, you have a 3080 so all of a sudden RTX means more than the sun. WTAF dude, his 580 is WAYYYYY more representative of the gaming PC public ATM than your 3080 is, yet his opinion is worth less than yours? I DON'T THINK SO.

Remove those blinkers; your perspective is not everyone's, nor any more right.
 
I mean come on, you have a 3080 so all of a sudden RTX means more than the sun. WTAF dude, his 580 is WAYYYYY more representative of the gaming PC public ATM than your 3080 is, yet his opinion is worth less than yours? I DON'T THINK SO.

Remove those blinkers; your perspective is not everyone's, nor any more right.
Says who? Someone "incidentally" owning another AMD card with no RT support...

I didn't say my opinion is worth more. I just said I've ACTUALLY played several games using RT, and every one of them counts. And in the near future it will be even more relevant.

PS: I saw an "/RTX 2060" in your signature. I don't know what it means, but even if you own one, an RTX 2060 is not exactly the hardware to test RT with.
 