
NVIDIA's Next-Generation Ampere GPUs to be 50% Faster than Turing at Half the Power

50% better RTX they mean
 
Talk of high-end Navi is rumour that didn't come from AMD, so don't blame them!

Nvidia... it's all BS. Fu Nvidia!

Man this forum.

Hey hey logic and reasoning is offensive to my gut feelings and love for certain brands, take it back! /s

On a serious note, tech progression doesn't give a flying fk about fans. Nvidia dominates GPU-accelerated computing and AI in industry and academia, and has good reason to keep pushing large gen-to-gen performance boosts. This is the best way to entrench themselves in the lucrative market.
 
This is the best way to entrench themselves in the lucrative market.
They're already entrenched, they want to maintain their market lead. And they have no choice, AMD is hot on their heels and Intel will soon enter the discrete GPU market as well. They're the current GPU king of the hill and they have to fight to maintain that status. This level of competition is great for the market!
 
I bet that the "50% uplift" is in RTX ON performance while non-RTX performance will remain the same in each segment.
I bet you will be eating crow with this statement later in the year... ;)

If you, or anyone who agreed with/thanked his post, didn't read: this is simply talking about the process and die shrink and what can theoretically come along with it. To make a statement like this (isolating where the improvement comes from and assuming it's solely marketing/RTX) is premature at best, ignorant at worst.
 
I bet you will be eating crow with this statement later in the year... ;)

If you, or anyone who agrees with this, didn't read: this is simply talking about the process and die shrink and what can theoretically come along with it. To make a statement like this (isolating where the improvement comes from and assuming it's solely marketing/RTX) is premature at best, ignorant at worst.

that ORRRR we have been part of this business long enough to expect some BS from this sort of article.
 
Say what you want, but Nvidia is at least 2 generations ahead of AMD Radeon, and the gap will most likely increase. Not that I like it, but this is the situation.
 
that ORRRR we have been part of this business long enough to expect some BS from this sort of article.
Nobody in their right mind is expecting a 50% uptick in actual performance. However, to pin it down to RTX performance only is incredibly premature (at best). With the information given, there is no way anyone can tell. And I assure you there will be an increase in both RTX and non-RTX performance.

Just scrolled through the thread... same lulz... :)
 
that ORRRR we have been part of this business long enough to expect some BS from this sort of article.
Perhaps. However, Nvidia's track record says otherwise. They delivered RTX as claimed, more or less on time. This information may not have come from an Nvidia statement, but it's from a source that is seemingly legit, and Nvidia hasn't debunked it, which they have a habit of doing.
 
Uhh, great, but at 2080 Ti prices? Would be irrelevant to me.

Last time I checked, Nvidia had an entire portfolio of segments going from 200 bucks up to 1200 bucks, and the same applies to Ampere (or whatever Nvidia calls the new gen). Nobody will force you to pay 2080 Ti money! If Ampere ends up being anywhere close to those numbers, you will have 2080 Ti performance for mid-tier money. This is why innovation is great!
 
They're already entrenched, they want to maintain their market lead. And they have no choice, AMD is hot on their heels and Intel will soon enter the discrete GPU market as well. They're the current GPU king of the hill and they have to fight to maintain that status. This level of competition is great for the market!

On the CPU front AMD is gaining some mindshare and market share. On the GPU computing front RTG is just horrible.
 
@Valantar regardless, I would still look forward to seeing how it performs.
 
Nobody in their right mind is expecting a 50% uptick in actual performance. However, to pin it down to RTX performance only is incredibly premature (at best). With the information given, there is no way anyone can tell. And I assure you there will be an increase in both RTX and non-RTX performance.

Just scrolled through the thread... same lulz... :)
It's just a rumour, bro; it pins nothing down, just like Navi's 2x rumour. Hyperbolic nonsense that's likely to be true in only 2 out of 10 applications, not all, as ever.
 
It's just a rumour, bro; it pins nothing down, just like Navi's 2x rumour. Hyperbolic nonsense that's likely to be true in only 2 out of 10 applications, not all, as ever.

True, it is only a rumor. This being said, if there is anyone in the GPU industry that can pull off those numbers, it's precisely Nvidia, so the rumor becomes much more credible in this case!

No matter how you look at it, Nvidia on 12nm (which is closer to 16nm) is dominating the market on both performance and efficiency, so you can only expect mind-blowing efficiency now that they move to 7nm EUV, which is a huge step from 12nm. From there everything becomes possible; it's just a matter of how you adjust your efficiency/performance ratio.
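
To put some toy numbers on that efficiency/performance trade-off (the 0.6x power factor below is purely illustrative, not a foundry-quoted figure):

Code:
# Hypothetical node shrink: same clocks draw only 60% of the power (made-up factor)
node_power_factor = 0.6

# Option A: keep performance flat and pocket the efficiency
iso_perf_power = 1.0 * node_power_factor  # 0.60x power, 1.00x perf

# Option B: spend the headroom on clocks; dynamic power scales ~f*V^2,
# roughly cubic in frequency if voltage rises along with it
freq_boost = 1.15
pushed_power = node_power_factor * freq_boost ** 3

print(f"iso-performance: {iso_perf_power:.2f}x power")
print(f"+15% clocks:     {pushed_power:.2f}x power")  # ~0.91x power for ~1.15x perf

Same shrink, two very different marketing headlines, which is exactly why a single "50% faster at half the power" number should be taken loosely.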
 
I see that all the usual suspects who give AMD the benefit of the doubt are unwilling to extend Nvidia the same courtesy.

I assume this performance claim is nothing but a bold prediction based on thin air.
Speculation is fine, but only when clearly labeled as such. I also expect Nvidia's next gen to be a big step up: a good node shrink, a refined process, and a new architecture certainly have potential, but remember they can still fail.

I think we hear this every time a new GPU is about to come out: the competitor starts spreading rumors to stop people from buying the competitor's product. Sometimes I think companies like AMD and Nvidia don't really need to do this, because their fanboys will do the job for them for free. I still remember when some people said you should hold back from getting a GTX 1080 because Vega would be coming out in October 2016. They said there was a shortage of the 1080 because of demand anyway, so you might as well wait until October 2016.
Companies certainly seed rumors from time to time, but most of the time it's just certain webpages and youtubers trying to drive traffic.

But unfortunately sometimes the fanboys take it too far and end up hurting their "team". Back when the GTX 1080 Ti launched, some claimed it was to spoil Vega's triumph, and when Vega finally shipped, many refused to accept that it was the biggest Vega chip. The same thing happened with Navi, fueling the rumor of Navi12 being "big Navi" to crush the RTX 2080 Ti…
 
Will it be half the price?

Yeah, I know... good luck with that.
 
I believe it will be "up to" 50%, which is reasonable...
 
I know that. I was replying to, and quoted, someone (and the 'supporters') who thinks this is RTX only...
I don't think anyone is saying "it might be 50% faster in RTX, but everything else will stay the same", which is what you're saying here. All that's being said is that a 50% generation-on-generation absolute performance increase (for all workloads or for conventional rasterization workloads) is quite optimistic, and combining that with a 50% absolute power draw drop is borderline crazy. After all, that would make for a tripling of performance per watt (150% perf at 50% power = 300% perf at 100% power). When has that ever happened between two generations? If they could pull that off Nvidia would own the GPU market entirely for the foreseeable future, but it sounds very unrealistic. I at least do not doubt that there will be significant improvements in absolute performance when Nvidia moves to 7nm - anything else would be a scandal - but 3x perf/W? No. I'll gladly admit how wrong I was if that comes to pass, but for now, I'm putting this in the "make-believe" column.
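
If anyone wants to sanity-check that arithmetic, here's a quick back-of-the-envelope sketch (the 1.5x and 0.5x figures are just the rumored numbers, nothing confirmed):

Code:
# Rumored claim: +50% performance at half the power vs. Turing
turing_perf, turing_power = 1.0, 1.0
ampere_perf = 1.5 * turing_perf    # rumored +50% performance
ampere_power = 0.5 * turing_power  # rumored half the power

gain = (ampere_perf / ampere_power) / (turing_perf / turing_power)
print(f"perf/W multiplier: {gain:.1f}x")  # -> 3.0x, i.e. triple the perf/W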
 
I don't think anyone is saying "it might be 50% faster in RTX, but everything else will stay the same", which is what you're saying here.
lol, I quoted the person who said exactly that. So did you earlier, in fact. Perhaps they meant something else? No idea... but that is what was posted. I'll quote it again...

I bet that the "50% uplift" is in RTX ON performance while non-RTX performance will remain the same in each segment.


I don't know what that means outside of RTX getting the improvement while everything else stays the same...

Nobody should hold on to the 50% value mentioned. It's not based on anything outside of the die shrink, really. I'd gather we'll see notable (25%+) increases in both RTX and non-RTX gaming. Perf/W will also dramatically increase if that is a metric one cares about.
 
Is it going to 6,000... 7,000... 8,000 cores and all the other bells and whistles to match? A smaller die isn't everything.
 
lol, I quoted the person who said exactly that. So did you earlier, in fact. Perhaps they meant something else? No idea... but that is what was posted. I'll quote it again...

I don't know what that means outside of RTX getting the improvement while everything else stays the same...

Nobody should hold on to the 50% value mentioned. It's not based on anything outside of the die shrink, really. I'd gather we'll see notable (25%+) increases in both RTX and non-RTX gaming. Perf/W will also dramatically increase if that is a metric one cares about.
I obviously missed that one; definitely don't agree there (though I wouldn't put it past Nvidia to accompany the new gen with another per-tier price bump either). Completely agree that even increases in both (or ideally a bit more in RT to balance things out) ought to be their target.
 
4K will always be a tough one, especially when RT gets bigger. I'd accept 1080p on that panel for the next half decade if I were you ;)

The 3070 should be a decent entry-level 4K card; think GTA V @ 60 fps @ 4K at fairly high settings, minus high-end AA, which is less necessary at 4K anyway.
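
For a rough sense of why 4K is so demanding, here's the raw pixel math (back-of-the-envelope only; real game performance doesn't scale linearly with pixel count):

Code:
# Pixel counts per frame
res_1080p = 1920 * 1080  # 2,073,600 pixels
res_4k = 3840 * 2160     # 8,294,400 pixels

print(f"4K pushes {res_4k / res_1080p:.0f}x the pixels of 1080p")  # -> 4x
print(f"pixels per second at 4K / 60 fps: {res_4k * 60:,}")        # ~498 million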
 
What chart?

Maxwell 28nm to Pascal 16nm. Pascal got performance and efficiency.

[Attached chart: turing-shading.png]

The game performance one was bullshit, too, but not this egregious.

And more BS:

[Attached chart: turing-gddr6.png]
 