
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

Full chips should be 5376 shaders / 384-bit and 3584 shaders / 256-bit; in reality those doubled shader counts are a sort of hyperthreading, i.e. dual shader operations per clock.

We should evaluate performance per transistor: 28,000 million vs. 18,600 million should result in roughly 50% better performance, provided the memory bandwidth also grows by 50%.
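For what it's worth, here's that ratio as a quick back-of-the-envelope check, using only the transistor counts quoted above (illustrative only, since it assumes performance per transistor and bandwidth keep pace):

```python
# Rough performance-per-transistor check using the counts quoted above
# (in millions of transistors; purely illustrative, not a prediction).
ampere_transistors = 28_000   # ~28 billion (GA102, as quoted)
turing_transistors = 18_600   # ~18.6 billion (TU102, as quoted)

growth = ampere_transistors / turing_transistors - 1
print(f"Transistor budget grew by {growth:.0%}")  # ~51%
# If perf per transistor (and bandwidth) keeps pace, that budget alone
# would suggest roughly +50% performance.
```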
 
Secondly, unless having the full chip delivers noticeably more performance, what value does it have to you if your chip is fully enabled? The Vega 64 Water Cooled edition barely outperformed the V64 or the V56, yet cost 2x more. That is just silly. I mean, you're welcome to your delusions, but don't go pushing them on others here. Cut-down chips are a fantastic way of getting better yields out of intrinsically imperfect manufacturing methods, allowing for far better products to reach end users.
If I may add, cut-down chips may actually even be "better" in some ways. The GTX 970, for example, achieved higher performance per TFLOP than the GTX 980, because it effectively had more scheduling resources and cache per core. I believe we saw something similar with the Vega 56 vs. Vega 64 too, where the Vega 56 did better than it "should" have when overclocked against its bigger brother.

LOL. And you actually believe what Jensen said in that reveal?
You are the one challenging the performance figures, you are the one to prove it. Or should I just take your word for it? :rolleyes:

Yes, I would call Ampere catastrophically ineffective garbage.
You are just going to keep digging yourself deeper, aren't you? ;)
Here is another shovel for you;
So if Ampere is catastrophically ineffective, what does that make Polaris, Vega, Vega 20 and Navi then?
 




Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that, historically, Nvidia is full of shit. Intel and AMD as well. I used simple math on the available data. Independent reviews will tell the truth, as always. But fanboys are going to believe what they want to believe, and buy anything at any price. Trying to use logic and real-world data with fanboys is like trying to have a logical conversation with a religious fanatic.
 
You're part of the problem here, not part of the solution. Saying "Nvidia is full of shit" does nothing to promote the view that one should always wait for independent reviews. All you're doing is riling people up and inciting unnecessary conflict. I don't trust Nvidia's cherry-picked performance numbers either, but I still find myself arguing against you simply because of how you are presenting your "points".
 
I'm sorry. I'll try to never again use simple math if that offends you. :) I'm done with this topic. Peace out. :rockout:
Oh, quick edit: I will be picking up an RTX 3080, because I have connections over at Gainward and I can get one at factory price. :)
 

Kinda funny, I think you are full of BS too. Where do you find numbers that say the 2080 Ti is 10-15% faster than the 1080 Ti? Your ass?
 
I will admit to losing interest in anything math related after middle school, but I certainly never came across any type of math that involved calling anyone "full of shit". That might just be me though. Besides that, trying to calculate gaming performance from the on-paper specs of a new architecture with major changes (such as the doubled ALU count, with who knows what other changes likely creating bottlenecks that were previously not there and dropping perf/TFLOP) is arguably more dubious than Nvidia's claims. Their numbers are cherry-picked, but at least they have a basis in some tiny corner of reality. Purely theoretical calculations of performance add very little to the discussion, underscoring the fact that we should all be waiting for a wide selection of third-party reviews before making any kind of decision. And, to be honest, we should also be waiting to see what the competition comes up with. RDNA 2 looks very promising, after all.
 
Coming from an RTX 2080, the 3080 seems enticing.
 
Ah, yes, the true sign of a superior intellect: ad hominem attacks, insults and pejoratives. Reported. If you can't present your arguments in at least a somewhat polite way, maybe you shouldn't be posting on forums?
Let's not insult each other here. People disagree with my opinions and I disagree with theirs sometimes. But we can have a polite discussion. :lovetpu:
 
Hi there, everyone!
Keep it on topic.
Quit the arguing and try to keep the discussion on the technical side.
No name calling allowed and enough BS has been thrown and discussed.
As the Guidelines state:
Be polite and constructive; if you have nothing nice to say, then don't say anything at all.

Thank You and Have a Great Day
 
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and the 3080 is 28% (250 W vs. 320 W), while the performance difference is jumping around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?
 
That's pretty interesting indeed. Benchmarks will be well worth the read when they arrive.
 

Most likely. A cherry-picked RT game with DLSS, or prosumer software. The perf/W graph they provided says 1.9X, but in reality isn't it more like 1.5X?
Realistically, Ampere is 1.35-1.40X faster than Turing in normal rasterization performance.

The problem is that most currently available RT titles are bound by the DXR 1.0 implementation, so they might not be able to show the full RT potential of Ampere over Turing, even though it will still be considerably faster in those titles.
 

Yeah, the 1.9x perf/watt figure is derived from here:

[NVIDIA's FPS-vs-power chart comparing Turing and Ampere]


Basically, the 2080 Super gets 60 fps at 250 W TGP while the 3080 can get the same 60 fps at only 130 W, meaning the 2080 Super uses 1.9X more watts per FPS.
Which is not the right metric; it should have been FPS divided by power use (perf/watt), by which Ampere is somewhere between 35-45% more efficient (the higher the power usage, the more efficient Ampere becomes relative to Turing; Ampere seems to scale much better with voltage than Turing).
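To make that distinction concrete, here is a minimal sketch contrasting the two readings, using only the rough figures quoted in this thread (250 W vs. 130 W at 60 fps from the slide, and the ~+40% at 320 W vs. 250 W estimate from the DOOM comparison); these are forum estimates, not measurements:

```python
# Two ways of reading "performance per watt", using the rough numbers
# quoted in this thread (estimates, not measured results).

# 1) NVIDIA's framing: hold the frame rate constant and compare power draw.
turing_watts_at_60fps = 250      # 2080 Super power for 60 fps (from the slide)
ampere_watts_at_60fps = 130      # 3080 power for the same 60 fps (from the slide)
iso_fps_ratio = turing_watts_at_60fps / ampere_watts_at_60fps
print(f"Iso-FPS power ratio: {iso_fps_ratio:.2f}x")             # ~1.92x

# 2) The conventional framing: FPS divided by power at each card's rated TDP.
perf_ratio = 1.40                # ~40% faster (DOOM Eternal side-by-side estimate)
power_ratio = 320 / 250          # 3080 TDP vs. 2080 Ti TDP
print(f"Perf/W at rated TDP: {perf_ratio / power_ratio:.2f}x")  # ~1.09x
```

The gap between those two numbers comes down to where on the voltage/frequency curve the cards are compared, which is also why the 35-45% estimate above lands somewhere in between.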
 
LOL. And you actually believe what Jensen said in that reveal? He also said 2 years ago that the RTX 2080 Ti gives 50% better performance than the 1080 Ti. In reality, it turned out to be more like 10-15%. So it's not hard to beat your previous (Turing) architecture when that brought minimal improvements to performance, RTX excluded of course. The RTX 2080 achieves around 45 fps in Shadow of the Tomb Raider at 4K with the same settings Digital Foundry used in their video. They say the RTX 3080 is 80% faster, so simple math says that is around 80 fps. The RTX 2080 Ti achieves around 60 fps (source: Guru3D). So the performance difference between the RTX 2080 Ti and 3080 is 30-35% based on Tomb Raider. RTX 2080 Ti: 12 nm manufacturing process, 250 W TDP, 13.5 TFLOPS. RTX 3080: 8 nm manufacturing process, 320 W TDP, 30 TFLOPS. And only a 30-35% performance difference from this power-hungry GPU with double the number of CUDA cores. Yes, I would call Ampere catastrophically ineffective garbage.
Also, where the F is Jensen pulling that data about Ampere having 1.9 times the performance per watt of Turing? It would mean that an RTX 3080 at 250 W TDP should have 90% higher performance than the RTX 2080 Ti. But no, it has a 320 W TDP and 30-35% higher performance. So the power efficiency of Ampere is only minimally better watt for watt. The claim about 1.9X efficiency is a straight-out LIE.

I wonder if Jensen believes that bullshit coming out of his mouth?

Exactly. It appears here like people haven't witnessed an Nvidia/AMD launch before, or have collective ignorance.

How on earth are some of you spewing these marketing numbers from Nvidia and thus hyperbolic assertions about Ampere without a single reputable benchmark of any of the cards? You're like the perfect consumer drones in a dystopian future where consumerism has gone into overdrive.

I will say it again: let's come back to these comments in a couple of weeks when W1zzard has put the cards through his benchmark suite. Let's see how these '200% faster' and '2-3X the performance' exclamations hold up when you blow away the smoke and shatter the mirrors to look at actual rasterization performance averages across more than 10 games.
 
Hey, we still remember Polaris was gonna offer a whopping 2.5x perf/watt over cards like the 290X:
[AMD marketing slide claiming 2.5x performance per watt for Polaris]


You're right, don't believe everything you read. ;)
 

In the past, CUs could do 1 FP and 1 INT op per clock.
Ampere CUs can do 1 FP and 1 INT op, or 2 FP ops.
So technically it's twice the TFLOPS, but the same CU.

Saying it is twice the CUs is like AMD claiming double the number of Bulldozer cores.
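As a rough illustration of that "twice the TFLOPS, same CU" point, here's a peak-throughput sketch using the commonly listed shader counts and boost clocks (4352 @ ~1.55 GHz for the 2080 Ti, 8704 @ 1.71 GHz for the 3080); peak numbers only, since real workloads rarely keep both datapaths busy with FP32 every cycle:

```python
# Peak FP32 throughput = shaders x 2 ops per FMA x clock (peak only,
# purely illustrative; listed specs, not measured performance).

def peak_fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

tu102_2080ti = peak_fp32_tflops(4352, 1.545)   # ~13.4 TFLOPS
ga102_3080   = peak_fp32_tflops(8704, 1.710)   # ~29.8 TFLOPS

print(f"2080 Ti: {tu102_2080ti:.1f} TFLOPS, 3080: {ga102_3080:.1f} TFLOPS")
# The 3080's "8704 cores" are paired datapaths that can issue FP32+INT32
# or FP32+FP32 per clock, so the doubled peak only applies when no INT
# work is in the mix - hence twice the TFLOPS, but the same SM/CU count.
```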
 
So I have a 750 W Bronze PSU. I know I need to upgrade, but to which wattage?
Probably not. You should invest some money in a watt meter to measure the actual wattage being drawn from your outlet.

In simple terms, this is how I set up all of my PSUs for my computers. It seems to work well for me; it might work for you.

Right now I have an 850 W 80+ Gold PSU, rated at 90% efficiency @ 50% load.
So the number I use is 850 x 90% / 2 (the 50% load point) = ~382 W. I personally want to keep my draw below this number, as it will prolong the life of the PSU. I have a few PSUs in the 10-year range ATM, and those are the comps I mostly use to play games on.
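If it helps, here's that rule of thumb as a tiny sketch (the helper name is mine and purely hypothetical; the efficiency values are the nominal 80 Plus figures at 50% load):

```python
# Sketch of the PSU headroom rule of thumb described above: rated wattage
# scaled by efficiency at 50% load, halved for the ~50% load ceiling.
# Helper name is hypothetical; efficiency values are nominal 80 Plus figures.

def load_ceiling_watts(rated_watts: float, efficiency_at_50pct: float) -> float:
    return rated_watts * efficiency_at_50pct / 2

print(load_ceiling_watts(850, 0.90))   # ~382.5 W for an 850 W 80+ Gold unit
print(load_ceiling_watts(750, 0.85))   # ~318.8 W for a 750 W 80+ Bronze unit
```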

With my watt meter I measure my usage during normal use as well as while I'm playing games on the computer in question. I do this for 2 or more hours, recording the data for that rig.

My normal load for my current computer is about 105 W. My max level while playing games is 250 W, well below the 50% threshold of my PSU. I should have no problems with any future upgrade; however, I'm not the type who buys on a whim. I do my research first, then buy if needed.

My comp is an AMD 3600, 32 GB of G.Skill Flare X CL14 PC3200 RAM, an MSI X570 A-Pro MB, a Visiontek RX 5700 w/ 10% undervolt, a 500 GB Samsung 860 EVO SSD, a WD 2 TB HDD, and an LG Blu-ray drive.

Again, this is how I set up my PSUs with regard to watt usage, and it has worked for me for at least 15 years.


As for this whole new generation of video cards?

When Navi comes out, then we can see all of the performance gains over the last generation of cards.

HOWEVER... And this is something that people should be aware of.

1. Most of the world is running on 1080p OR Less.
2. Gaming monitors are mostly at the 1080p level. Recently 1440p has been becoming more and more of the sweet spot, as 4K is still too expensive for what you are getting.
3. Video gaming advancements are still 3 to 5 years behind the current technology.
4. Most people are tight on money these days.

It makes no difference if your monitor cannot handle the additional power these new cards provide.

I am averaging over 165 fps at 1440p playing Overwatch on a 32-inch, 1440p, 165 Hz Pixio PX329 monitor, using an RX 5700 video card that I got new for $280.00 nine months ago. Heh, Grandpa here playing pew pew. Everything right now is working great.

Unless Nvidia (and AMD) can pull 50% real performance gains over last year's video cards, only those gerbils will be buying them, because the video cards most people currently have on hand are still doing the job.

Unless there is a fantastic price-vs-performance ratio on these new upcoming video cards, I don't think I'll be throwing 5 to 7 hundred dollars at one.
 
Judging by the DF results, we have a 55% faster 70-tier card and a 75% faster 80-tier card. So Nvidia pulled a Bulldozer on us: in their desire to impress with 2x FP performance, Nvidia created a GPGPU card that is going to mine well, but not so good for games or for prices. Although my 1060 mined 1 ETH in one month, and that can buy me a 3070 now, lol. But still, 1.66x more transistors fitted into the same ~450 mm² die size for the 70-tier card, and it is being hampered by the ridiculous VRAM gimping. What is very disappointing is that 16 Gbps memory is used; there is not much difference here. 66% more transistors and essentially the same memory, only ~16% faster; that is garbage. It should be 20 Gbps and 16 GB.
 
But is that really needed though?
I'm pretty sure 750W is plenty unless he/she is overclocking heavily or using a HEDT CPU (and probably enough even then).
I would be much more concerned with the quality and age of the PSU.

I had one computer with a heavily used ~10 year old Corsair TX750 80+ bronze (I believe) which was very unstable, and I put in a Seasonic Focus+ Gold 650W, and it improved a lot.

In general I would prefer quality over quantity, I'll take a Seasonic 650W over a crappy 1000W any day. Still, when the budget allows it, I do choose some breathing room to try to keep the typical load around ~50% for long-term stability as you said.
 

My old RX 480 runs Terraria at 4K at 60 fps :)
 
I was just playing Ace Combat 7 with my monitor plugged into my laptop's RTX 2060 at 4K, 80 fps, max settings. The Vega is as capable, but we all game differently on different games.
 
So does my 1070. And you can even get 4K monitors in the $300+ range.

But there is a big difference between something running at 60 Hz and something running at 144 Hz+ with an IPS panel at 4K. I've seen the difference and it's nice... but not that nice when the monitor I'm looking at starts at $600; then add a card that can take full advantage of that monitor, and that is a lot of money.

That is why I purchased the Pixio 32-inch, 1440p, 165 Hz monitor for under $300, though it is a bit of overkill for me as I'm so used to a 27-inch monitor.

Hardware Monitor did a review in late 2018

But it was Level 1 Tech that sold me on the Monitor.

And finally the 27 inch Brand Name monitor that I wanted cost more than this monitor.

Now back to the Nvidia 3000 series of video cards. You know there is going to be limited supply of the 3000 series at launch... right? Yeah, I am hearing the same rumors about limited supply issues and, of course, price increases. So if those rumors do come true, it will be just like the 2000 series' limited-supply launch.

I'll just wait until late October/Holiday season to pick up any additional components as needed as well as what AMD has to offer.

But I am interested to see the actual gaming performance over previous generations, as well as how much heat these cards will generate.
 
He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me question it a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?
It was. And still is for older games from when it was released. But game tech moves on, and newer games may not be playable at 4K on anything but the newest cards.

That 4K is a moving finish line is what I have always said. Your 4K card today won't be one for long.
 
Ordered factory OC RX 5700XT last week...
@ a price point with PP that suited my budget.
The last time I had an Nvidia card in my then-Intel gaming rig was back in 2008!
There's something about mixing AMD platform with NVIDIA product that just doesn't sit well with me today. :fear:
 