Monday, November 13th 2017

NVIDIA "Volta" Architecture Successor Codenamed "Ampere," Expected GTC 2018

NVIDIA has reportedly codenamed the GPU architecture succeeding its upcoming "Volta" architecture after André-Marie Ampère, the French physicist and pioneer of electromagnetism after whom the unit of electric current is named. The new NVIDIA "Ampere" architecture will make its debut at NVIDIA's 2018 GPU Technology Conference (GTC). As with the company's recent architecture launches, one can expect an unveiling of the architecture followed by preliminary technical presentations from NVIDIA engineers, with actual products launching somewhat later, and consumer-grade GeForce products later still.

NVIDIA has yet to launch GeForce products based on its upcoming "Volta" architecture, even as its current "Pascal" architecture turns 18 months old in the consumer graphics space. Should NVIDIA continue the four-digit model numbering of its GeForce 10-series "Pascal" family, one can expect "Volta"-based cards to make up the GeForce 20-series, and "Ampere"-based cards the GeForce 30-series. NVIDIA has yet to disclose the defining features of the "Ampere" architecture; we'll probably have to wait until March 2018 to find out.

Source: Heise.de

97 Comments on NVIDIA "Volta" Architecture Successor Codenamed "Ampere," Expected GTC 2018

#1
efikkan
And we're discussing pretty much everything besides the name…:cool:
#2
EarthDog
the54thvoid said:
So much bickering over a rumour that starts a verbal conflict over what is effectively a toy. Because some people are really upset that desktop GPUs might not be the focus of Nvidia. Wow, really.

Again, a rumour. A desktop gfx card to facilitate adults playing games, a toy.

Jeez.
LOL, I don't care what they focus on... I don't even know. I just know that in the end it's going to be a solid gaming card, and I couldn't care less what NVIDIA is focusing on. Perhaps with NVIDIA 'distracted', AMD can bring some competition in the high-end space instead of midstream on down, with way more power use out of the box...

zOMG l00k at t3h clocks and that d13 spac3 zomgzzzzz... whatevs... just lets see the results of the effort. Those who eat crow will eat crow when we know.
#3
Vya Domus
EarthDog said:

zOMG l00k at t3h clocks and that d13 spac3 zomgzzzzz... whatevs...
It's not like you are forced to partake in more technically oriented discussions. I don't even know why you are annoyed.
#4
bug
Vya Domus said:
It's not like you are forced to partake in more technically oriented discussions. I don't even know why you are annoyed.
I'm going to take a wild guess here and say he's annoyed someone came into this thread and casually made some blanket statements about Nvidia shifting focus away from gamers.

Edit: And we all swallowed that hook, line and sinker, thus making sure it will happen again...
#5
Vya Domus
bug said:
I'm going to take a wild guess here and say he's annoyed someone came into this thread and casually made some blanket statements about Nvidia shifting focus away from gamers.
I know it hurts , I really do. :)
#6
EarthDog
It's more humorous to me than annoying to see these things brought up now, honestly. If we speculate on the speculation, we can form long-lasting and never-changing opinions!!!!!

But bug hit it on the head... it's hilarious, that first post about NVIDIA's focus. It's like, to me: OK, so they aren't focused. Look what is sitting on the table compared to a 'focused' AMD? It's like shitting on it just to shit, not that there is a reason to do so. I mean, thanks? I appreciate the warning? I... I was not entirely sure how to respond to it. Second-grade silicon... OK, again, look what that 'second-grade silicon' (whatever the F that actually means...) is doing?! YAY AMD, GREAT JOB ON COMPUTE!!! Gaming... not so much. YAY NVIDIA, GREAT JOB ON GAMING!!! Compute, not so much. Buy the right friggin' tool for the job. They both have decent tools for each, some better than others and vice versa. Point is, only time will tell how it shakes out... in the meantime, I'll be dreaming of a more efficient and more powerful card with little fear of being let down. And if I am, my mouth is PLENTY big enough to insert my foot... lots of practice and all. ;)
#7
Vya Domus
EarthDog said:
I was not entirely sure how to respond to it.
Like I said, you don't have to; I think that's really the problem, not my claims and arguments. You don't necessarily have to go out of your way every time to disagree with me just for the sake of doing so.
#8
EarthDog
And you don't have to go out of your way to post the kind of polarizing comment that warrants such a response...

"keep dreaming people........." - Really, it's borderline trolling/baiting into an argument to me... neither you nor I have much idea of what will be coming soon. So to come out with such a polarizing (and I am being nice there) comment really warrants a response. Don't open doors you don't want others to walk through... then turn and blame the person responding to the initial comment... lol... seriously. Glass houses... stones... yeah.
#9
Vya Domus
Maybe the words that I used weren't such a great choice, but my point was pretty valid and there was no intention of trolling; otherwise I would have left a long time ago and just left it as it is. Get past the semantics, it's just not constructive.

EarthDog said:
Don't open doors you don't want others to walk through... then turn and blame the person responding to the initial comment
You got that backwards; it's you who's blaming me. I have no problem with different opinions; you do, and you're trying to call me out for that.
#10
EarthDog
I don't have a problem with a different opinion; it's the polarizing wording which warranted a response. If you had worded it differently, perhaps "With NVIDIA's focus not being on consumer graphics for Volta, I wonder what kind of gaming GPU Volta will be", or "I'm worried what kind of gaming GPU it will be", I am 100% certain we wouldn't be in this situation. Instead, it was a dramatized "KEEP DREAMING PEOPLE". You set the tone; others responded to it. Glass houses... stones. Broken glass. #ubroughtitonyourself. :p

I don't have it backward. You posted a polarizing comment, with you now even admitting 'poor word choice', and people took exception to the tone you set. You also kept going with the 'second-grade silicon' comment. What happened here clearly isn't a chicken-or-the-egg conundrum.

What isn't constructive is starting off with such a polarizing, speculative post like yours and then retreating into a corner playing dumb. You are a very smart dude, and those words were chosen intentionally... and poorly. And here we are.

I digress. Back to Volta.
#11
medi01
Sooo, ignoring which company is "focused" and which isn't, are there any theories on why nVidia would skip consumer Volta?

efikkan said:
In reality prices are quite okay at the moment.
That's one logical comment, great job!

Naturally, prices are "quite okay" when there is no competition.
It's literally written in every serious book on economics: prices are at their best precisely when there is no competition, right?
And examples are easy to find: 8-core CPUs, which were $1k before Ryzen arrived.
What, is $1k too much for 8 cores? Didn't some CPUs cost $1k ages ago, eh?
Hypocrites!
#12
EarthDog
medi01 said:
Sooo, ignoring which company is "focused" and which isn't, are there any theories on why nVidia would skip consumer Volta?
You'll have to ask Vya, he seems to be aware and in the know. ;)
#13
bug
Vya Domus said:
Maybe the words that I used weren't such a great choice but my point was pretty valid and it had no intention of trolling...
But you didn't post anything relevant to this thread's subject, nor did you make an attempt to argue your statement (other than that consumer cards weren't the main event at some conference).
What do you do when you do have the intention of trolling?
#14
Vya Domus
EarthDog said:
I don't have a problem with a different opinion, it's the polarzing wording which warranted a response. If you would have worded it differently, perhaps, "With NVIDIA's focus not being on the consumer graphics for Volta, I wonder what kind of gaming GPU Volta will be", or, "I;m worried what kind of gaming GPU it will be" I am 100% certain we wouldn't be in this situation. Instead, it was a dramatized... "KEEP DREAMING PEOPLE". You set the tone, others responded to it. Glass houses...stones. Broken Glass. #ubroughtitonyourself. :p
Except I really didn't set any tone; just because you got annoyed for some reason doesn't mean everyone did, and as you can see, people brought arguments instead of starting a shit show. I think you are the only one who's dramatizing this more than it really is.

EarthDog said:

I digress. Back to Volta.
That was the topic before your heroic intervention against my evil comments. :p

bug said:
nor did you make an attempt to argue your statement
It's not my problem that you don't read them, really.
#15
EarthDog
The record will speak for itself. I am not worried in the least. :)

Since you seem to be the person in this thread who believes there is a loss of focus... medi came up with a great on-topic question...
medi01 said:
are there any theories on why nVidia would skip consumer Volta?
Or are you suggesting it will come out... wait, what are you suggesting? All we got was 'keep dreaming, they aren't focused on consumer Volta' and 'second-rate silicon'... and GTC (which showed a game there; clearly there is SOME focus on gaming if they're showing a game at a non-gaming conference)...

Is it simply industry trends leading your opinion, or something more than a hunch (think Brady Bunch!! :p)?
#16
bug
Vya Domus said:
It's not my problem that you don't read them, really.
Read what? Maybe we have different definitions of "argument", but you didn't post a single link or source in any of your posts in this thread, other than the $3bn that went into Volta's development.
#17
Vya Domus
bug said:
other than the $3bn that went into Volta's development.
I see you can read after all :laugh:.

bug said:
Read what? Maybe we have different definitions of "argument"
Just a couple more tries and you'll get there.
#18
bug
Vya Domus said:
I see you can read after all :laugh:.
Just a couple more tries and you'll get there.
I see. So your definition of an argument coincides with my definition of an insult. Hence our disconnect.
#19
Vya Domus
bug said:
I see. So your definition of an argument coincides with my definition of an insult. Hence our disconnect.
I think you are seriously going off track by trying to suggest that I am insulting you in some way.

The disconnect comes from the fact that you keep going off on a tangent with nothing relevant to say, while I tried my best to stay on topic. You keep telling me that I have "no arguments", which is a ridiculous claim, as you've clearly been able to find them, as shown by your own comments; instead you chose to stay "blind".

It's pretty obvious to me that you are trying to derail the hell out of this thread. And this isn't the first time I've seen you trying to do this. So the ignore list it is, then, since you clearly have no intention of ending this.
#20
wiyosaya
nVidia does not have any competition right now. What is to stop them from milking their customers, à la sIntel, until they do have competition?

nVidia certainly did not follow the standard path with Volta; why on Earth would they follow it with this new architecture? There is too much milk left in the consumer space, and we can see that nVidia clearly thinks this way with Star Wars-themed Titans.
#21
cyneater
Watt Next?

Is there Resistance to the new code names?

Do the new code names make the marketing department's brain Hertz?
#22
wiyosaya
cyneater said:
Watt Next?

Is there Resistance to the new code names?

Do the new code names make the marketing department's brain Hertz?
No, but there is Ohm for the Resistance! :D
#23
bug
wiyosaya said:
nVidia does not have any competition right now. What is to stop them from milking, ala sIntel, their customers until they do have competition?
You're thinking about this the wrong way. They're responsible to their shareholders to rake in as much cash as possible; they'd get in trouble if they didn't take advantage of the situation ;)

wiyosaya said:
nVidia certainly did not follow the standard path with Volta; why on Earth would they follow it with this new architecture? There is too much milk left in the consumer space, and we can see that nVidia clearly thinks this way with Star Wars-themed Titans.
Of course we're all speculating here, but AMD isn't standing still and Nvidia knows it. That is why the possibility of Ampere (or whatever) trickling down to consumer space is real (imho). Another possibility is that they release Ampere for HPC and move Volta into consumer space. But there are many variables none of us here on TPU know about, so in the end we'll just have to see.
#24
Assimilator
bug said:
You're thinking about this the wrong way. They're responsible to their shareholders to rake in as much cash as possible; they'd get in trouble if they didn't take advantage of the situation ;)

Of course we're all speculating here, but AMD isn't standing still and Nvidia knows it. That is why the possibility of Ampere (or whatever) trickling down to consumer space is real (imho). Another possibility is that they release Ampere for HPC and move Volta into consumer space. But there are many variables none of us here on TPU know about, so in the end we'll just have to see.
Well I think we can count on at least one thing: Navi will be late.
#25
ensabrenoir
....only thing I'm reading/seeing is that if Nvidia (bunch of evil geniuses) HAD more aggressive competition... we'd be getting Ampere next instead of still waiting on Volta... Gotta sudden urge to root for Intel and their recent acquisition...