Friday, January 3rd 2020

NVIDIA's Next-Generation Ampere GPUs to be 50% Faster than Turing at Half the Power

As we approach the release of NVIDIA's Ampere GPUs, which are rumored to launch in the second half of this year, more rumors and information about the upcoming graphics cards are appearing. Today, according to the latest report from Taipei Times, NVIDIA's next generation of graphics cards based on the "Ampere" architecture is rumored to offer as much as a 50% performance uplift over the previous-generation Turing GPUs, while drawing half the power.

Built on Samsung's 7 nm manufacturing node, Ampere is poised to be the new king among future GPUs. The rumored 50% performance increase is not impossible, given the features and improvements the new 7 nm node brings. From the density gains alone, NVIDIA could extract at least 50% extra performance from the smaller node. Performance should increase even further, however, because Ampere will bring a new architecture as well. By combining a new manufacturing node with a new microarchitecture, Ampere is said to cut power consumption in half, making for a very efficient GPU solution. We still don't know whether the performance gains will come mostly in ray tracing applications, or whether NVIDIA will focus on general graphics performance.
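Taken at face value, 50% more performance at half the power works out to roughly triple the performance per watt. A minimal back-of-the-envelope sketch in Python, with the Turing baseline normalized to 1.0 (both inputs are the rumored figures, not measurements):

```python
# Back-of-the-envelope check of the rumored claim (rumored figures, not measurements).
turing_perf, turing_power = 1.0, 1.0  # Turing baseline, normalized

ampere_perf = turing_perf * 1.5       # rumored: +50% performance
ampere_power = turing_power * 0.5     # rumored: half the power draw

gain = (ampere_perf / ampere_power) / (turing_perf / turing_power)
print(f"Implied perf/W improvement: {gain:.1f}x")  # -> 3.0x
```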
Source: Taipei Times

180 Comments on NVIDIA's Next-Generation Ampere GPUs to be 50% Faster than Turing at Half the Power

#103
trog100
i think i'm about to enter what i would call my consolidation phase.. this means i will stop buying new stuff just because i can and simply use what i have..

my last consolidation phase lasted well over six years.. he he

trog
Posted on Reply
#104
Xaled
Valantar
I'd say the headline ought to be along the lines of "Nvidia's Next-Generation Ampere GPUs reportedly 50% faster than Turing at Half the Power". I'll agree that a headline with zero reservations or caveats, given that this is second-hand information with dubious sourcing making bombastic claims at a very early point, is poor journalism, but I don't see it as a double standard or bias - inaccuracies like this are quite par for the course across the board for TPU, sadly.
I would've called it poor journalism if it were consistent across all news, unbiased. But this one was made intentionally. I strongly suspect they got paid by Nvidia just to use such a title. Otherwise they would've used the same logic that they used in the AMD and Intel news.
Posted on Reply
#105
cucker tarlson
Xaled
I would've called it poor journalism if it were consistent across all news, unbiased. But this one was made intentionally. I strongly suspect they got paid by Nvidia just to use such a title. Otherwise they would've used the same logic that they used in the AMD and Intel news.
they do

Casecutter
So by this we are being told Ampere greatness is perhaps 6 months out, although there's no clue as to who's going to be spinning these wafers for Nvidia... but they'll be great.
old architecture will be replaced by a new one called something terrific.

but seriously, it's Samsung.
Posted on Reply
#106
lemonadesoda
I'm more interested in the half-power thing. I don't want a nuclear oven in my house: neither the heat nor the noise of keeping it cool. So, I'm in!
Posted on Reply
#107
TheGuruStud
LOL, a couple of you guys just got roasted by a youtuber.
Posted on Reply
#108
HTC
TheGuruStud
LOL, a couple of you guys just got roasted by a youtuber.
Elaborate, please.
Posted on Reply
#109
TheGuruStud
HTC
Elaborate, please.
Crying about AMD's prices, b/c they need to be cheaper than Nvidia's. Aka the double standard, since no one says that about Nvidia.

Shit, we all know they do it even when cheaper.
Posted on Reply
#110
HTC
TheGuruStud
Crying about AMD's prices, b/c they need to be cheaper than Nvidia's. Aka the double standard, since no one says that about Nvidia.

Shit, we all know they do it even when cheaper.
I was hoping for the video link of said youtuber roasting "a couple of you guys" but that works too ...
Posted on Reply
#111
TheGuruStud
HTC
I was hoping for the video link of said youtuber roasting "a couple of you guys" but that works too ...
Posted on Reply
#112
HTC
TheGuruStud

Funny: I'm around 6:20 into that video right now, lol

EDIT

Didn't see much roasting ... some, but not much.
Posted on Reply
#114
HTC
Xzibit
Who does he use as a double standard in pricing @ 9:30 from TPU?
It's quite easy to find: just do an advanced search with some of the words and add date parameters.

I'd rather not show who it is directly because it may lead to other types of issues: I'll leave it @ that.
Posted on Reply
#115
Nero1024
I may be in the minority, but I am more curious about the new design of the cards, because I am pretty sure the performance will deliver.
Posted on Reply
#116
jabbadap
TheGuruStud


The game performance one was bullshit, too, but not this egregious.

And more BS
Well, those are best-case scenarios, but they are actually correct slides addressing the benefits from the shader and memory subsystem evolution between the two generations...

But yeah, someone has probably seen similar slide(s) comparing Ampere to Turing too and does not have the whole picture.
Posted on Reply
#117
candle_86
R0H1T
Yes & pigs will also fly when that happens :rolleyes:

Reminds me of Intel's PR these days with multiple ***
Nvidia did it before; they only got lazy lately.

Geforce 6800 Ultra - four times as fast as Geforce FX 5950 Ultra
Geforce 7800GTX - Twice as fast as 6800 Ultra
Geforce 8800GTX - Twice as fast as 7900GTX
GTX 280 - Twice as fast as 8800 Ultra
GTX 480 - Twice as fast as GTX 280
GTX 680 - Twice as fast as GTX 580
And then nvidia got lazy
Posted on Reply
#119
nguyen
It's soo easy to understand when looking at 1080 Ti vs 980 Ti


(Since Nvidia uses dual fans for their new GPUs, I picked a dual-fan 1080 Ti model with the same power consumption as the 1080 Ti FE)

Basically, if you lower the power limit of the 1080 Ti to 50%, it still retains 70% of its performance (I have tested this), which is a good 35% faster than the 980 Ti. Obviously they could be testing a prototype 3080 Ti with a lowered power limit, but when cranked to full power I suspect it would easily outperform the 2080 Ti by 100%, same as Maxwell --> Pascal.
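To make that arithmetic explicit, here is a minimal sketch in Python of the scaling argument above. It assumes the 1080 Ti is roughly 93% faster than the 980 Ti at its full power limit; that baseline and the 70% retention figure come from the poster's own numbers, not from a published benchmark:

```python
# Sketch of the power-limit scaling argument (poster's estimates, not benchmarks).
perf_980ti = 1.00              # baseline: GTX 980 Ti
perf_1080ti_full = 1.93        # assumed: 1080 Ti ~93% faster at its full power limit
retained_at_half_power = 0.70  # claimed: fraction of performance kept at a 50% power limit

perf_1080ti_half = perf_1080ti_full * retained_at_half_power
print(f"1080 Ti at 50% power vs 980 Ti: +{(perf_1080ti_half / perf_980ti - 1) * 100:.0f}%")
# -> +35%, matching the figure quoted above
```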
Posted on Reply
#120
Xzibit
nguyen
It's soo easy to understand when looking at 1080 Ti vs 980 Ti
The speculation came from an investment firm to its clients.
Posted on Reply
#121
R0H1T
candle_86
Nvidia did it before; they only got lazy lately.
300% more efficient (50% faster at half the power works out to 3x the perf/W) outside edge cases like RT ~ nope. I can understand one or the other, but they'll literally have to defy physics to get there.
Posted on Reply
#122
Casecutter
cucker tarlson
they do
old architecture will be replaced by a new one called something terrific.

but seriously, it's Samsung.
Well, what's in a name? But seriously, are you presenting fact?
Posted on Reply
#123
dicktracy
Finally, some GPU news instead of boring CPU news.
Posted on Reply
#124
cucker tarlson
Xzibit
Who does he use as a double standard in pricing @ 9:30 from TPU?
probably himself.
5500 XT 8GB costs exactly the same as the 1660 Super in reality while it's a tier down in performance.



btw I bet that amd shirt is a loan :laugh:
Posted on Reply
#125
dicktracy
candle_86
Nvidia did it before; they only got lazy lately.

Geforce 6800 Ultra - four times as fast as Geforce FX 5950 Ultra
Geforce 7800GTX - Twice as fast as 6800 Ultra
Geforce 8800GTX - Twice as fast as 7900GTX
GTX 280 - Twice as fast as 8800 Ultra
GTX 480 - Twice as fast as GTX 280
GTX 680 - Twice as fast as GTX 580
And then nvidia got lazy
They were obviously milking it, but it's not hard to see that Nvidia will push their GPUs even harder than before, since Intel is jumping in and the AI revolution is right around the corner. Everyone knows the GPU is simply the future of next-generation computing. Tomorrow's landscape will revolve heavily around the GPU, and the CPU will only become less and less relevant, especially when we're talking about AI-enhanced software. Intel will most likely put tons of resources into their GPU development, and Nvidia would be foolish to hold back even a little. Jensen Huang is too good a leader to let that happen, though.
Posted on Reply