Monday, November 13th 2017
NVIDIA "Volta" Architecture Successor Codenamed "Ampere," Expected GTC 2018
NVIDIA has reportedly codenamed the GPU architecture that succeeds its upcoming "Volta" architecture after André-Marie Ampère, the 19th-century French physicist who pioneered the study of electromagnetism and after whom the unit of electric current is named. The new NVIDIA "Ampere" GPU architecture, which succeeds "Volta," will make its debut at the 2018 GPU Technology Conference (GTC), hosted by NVIDIA. As with the company's recent architecture launches, one can expect an unveiling of the architecture followed by preliminary technical presentations from NVIDIA engineers, with actual products launching somewhat later and consumer-grade GeForce products later still.
NVIDIA has yet to launch GeForce products based on its upcoming "Volta" architecture, as its current "Pascal" architecture turns 18 months old in the consumer graphics space. Should NVIDIA continue the four-digit model number scheme of its GeForce 10-series "Pascal" family, one can expect "Volta"-based products to make up the GeForce 20-series and "Ampere"-based products the GeForce 30-series. NVIDIA has yet to disclose the defining features of the "Ampere" architecture; we'll probably have to wait until March 2018 to find out.
Source:
Heise.de
97 Comments on NVIDIA "Volta" Architecture Successor Codenamed "Ampere," Expected GTC 2018
zOMG l00k at t3h clocks and that d13 spac3 zomgzzzzz... whatevs... let's just see the results of the effort. Those who eat crow will eat crow when we know.
Edit: And we all swallowed that hook, line and sinker, thus making sure it will happen again...
But bug hit it on the head... it's hilarious, that first post about NVIDIA's focus. It's like, to me, OK, so they aren't focused. Look what is sitting on the table compared to a 'focused' AMD? It's like dumping on it just to dump on it, not because there's a reason to. I mean, thanks? I appreciate the warning? I... was not entirely sure how to respond to it. 'Second-grade silicon'... OK, again, look what that 'second-grade silicon' (whatever that actually means...) is doing! YAY AMD, GREAT JOB ON COMPUTE! Gaming... not so much. YAY NVIDIA, GREAT JOB ON GAMING! Compute, not so much. Buy the right friggin' tool for the job. They both have decent tools for each, some better than others and vice versa. Point is, only time will tell how it shakes out... in the meantime, I'll be dreaming of a more efficient and more powerful card with little fear of being let down. And if I am, my mouth is PLENTY big enough to insert my foot... lots of practice and all. ;)
"keep dreaming people........." - Really, it's borderline trolling/baiting into an argument to me... neither you nor I have much idea of what will be coming soon. So to come out with such a polarizing (and I am being nice there) comment really warrants a response. Don't open doors you don't want others to walk through... then turn and blame the person responding to the initial comment... lol... seriously. Glass houses... stones... Yeah.
I don't have it backward. You posted a polarizing comment, with you now even admitting 'poor word choice', and people took exception to the tone you set. You also kept going with the 'second-grade silicon' comment. What happened here clearly isn't a chicken-or-the-egg conundrum.
What isn't constructive is starting off with such a polarizing speculative post like yours and then retreating into a corner playing dumb. You are a very smart dude, and those words were chosen intentionally... and poorly. And here we are.
I digress. Back to Volta.
Naturally, prices are "quite okay" when there is no competition.
It is literally written in any serious book on economics: when there is no competition, that is when prices stop being OK.
And examples are easy to find: 8 core CPUs, which were 1k USD before Ryzen arrived.
What, is 1k too much for 8 cores? Didn't some CPUs cost 1k ages ago, eh?
Hypocrites!
What do you do when you do have the intention of trolling?
Since you seem to be the person in this thread who believes there is a loss of focus... medi came up with a great on-topic question... Or are you suggesting it will come out... wait, what are you suggesting? All we got was 'keep dreaming, they aren't focused on consumer Volta' and 'second-rate silicon'... and GTC (which showed a game there; clearly there is SOME focus on gaming if they're showing a game at a non-gaming conference)...
Is it simply industry trends leading your opinion, or something more than a hunch (think Brady Bunch!! :p)?
The disconnect comes from the fact that you keep going off on tangents with nothing relevant to say while I tried my best to stay on topic. You keep telling me that I have "no arguments," which is a ridiculous claim, as you've clearly been able to find them, as shown by your own comments, but instead you chose to stay "blind."
It's pretty obvious to me that you are trying to derail the hell out of this thread. And this isn't the first time I've seen you trying to do this. So the ignore list it is, then, since you clearly have no intention of ending this.
NVIDIA certainly did not follow the standard path with Volta; why on Earth would they follow it with this new architecture? There is too much milk left in the consumer space, and we can see that NVIDIA clearly thinks this way with the Star Wars-themed Titans.
Is there Resistance to the new code names?
Do the new code names make the marketing department's brain Hertz?