
NVIDIA "Ampere" Successor Reportedly Codenamed "Hopper"

btarunr

Editor & Senior Moderator
Staff member
NVIDIA has reportedly codenamed a future GPU architecture "Hopper," in honor of Grace Hopper, an eminent computer scientist who invented one of the first linkers and programmed the Harvard Mark I computer that aided the American war effort in World War II. This came to light as Twitter user "@kopite7kimi," who has had a fairly high hit-rate with NVIDIA information, tweeted not just the codename but also a key NVIDIA product design change. The tweets were later deleted, but not before 3DCenter.org reported on them. To begin with, "Hopper" is reportedly succeeding the upcoming "Ampere" architecture slated for the first half of 2020.

"Hopper" is also rumored to introduce MCM (multi-chip module) GPU packages, which are packages with multiple GPU dies. Currently, GPU MCMs are packages that have one GPU die surrounded by memory dies or stacks. This combination of GPU dies could make up "giant cores," at least in the higher end of the performance spectrum. NVIDIA reserves MCMs for only its most expensive Tesla family of compute accelerators, or Quadro professional graphics cards, and seldom offers client-segment GeForce products.



Write your best puns below...
 
Multi-chip? This is definitely not a desktop chip for gamers.
 
Write your best puns below...
There are at least two in here...probably more. :roll:
[image: dennis-hopper-super-mario-bros.jpg]


MCM was rumored about Navi too. I'll believe it when I see it.
 
With MCM rumors, can prolly expect nothing less than $1k price tags on them.
 
Should have been 'Lovelace' after Ada Lovelace.

...But I can see why that might have had all sorts of unintended connotations, which Nvidia wouldn't have wanted (for those who know their Porn history)
 
Wake up! The hero! The sun!
 
With MCM rumors, can prolly expect nothing less than $1k price tags on them.

Isn't the very point of doing MCM to make it cheaper? A large monolithic die can have much lower yield, which in turn can raise the price.
 
No, the move to a new node will be more expensive at first, more than the likely savings via MCM. Having said that, the final price/margins depend on Nvidia and how much (more) they'd like to milk their loyal user base!
 
We all have, the market has been stagnant for too long.

We are almost in 2020 and we are still playing with 2016/2017 performance... (I don't count the 2080 Ti, as it has an obscene price tag).
 
Should have been 'Lovelace' after Ada Lovelace.

...But I can see why that might have had all sorts of unintended connotations, which Nvidia wouldn't have wanted (for those who know their Porn history)
Had to look that up. That's some retro stuff way before my time lol
 
Multi-chip? This is definitely not a desktop chip for gamers.

Probably not in its first implementation, but it will certainly trickle down to the gaming segment sooner or later.
 
Probably not in its first implementation, but it will certainly trickle down to the gaming segment sooner or later.
You mean definitely not its first implementation. I doubt the second will be for gaming either, since this is multi-chip. This one might only be for workstations.
 
Isn't the very point of doing MCM to make it cheaper? A large monolithic die can have much lower yield, which in turn can raise the price.
No. The point of MCM is to make devices utilizing multiple chips with a common interface.
With MCM, Nvidia will be able to split the CUDA, Tensor and RTX parts.
Of course it may lower their costs (both manufacturing, because of yields, and R&D, because chips can be reused).
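For illustration, the yield side of this argument can be sketched with a simple Poisson defect model. The die areas and defect density below are made-up illustrative numbers, not NVIDIA figures:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Fraction of defect-free dies under a simple Poisson model:
    Y = exp(-A * D), with die area converted from mm^2 to cm^2."""
    return math.exp(-(die_area_mm2 / 100.0) * defect_density_per_cm2)

# Illustrative only: a 750 mm^2 monolithic die vs. a 190 mm^2 chiplet,
# at an assumed defect density of 0.2 defects/cm^2.
d = 0.2
mono = poisson_yield(750, d)     # ~0.22: only about 22% of big dies are clean
chiplet = poisson_yield(190, d)  # ~0.68: about 68% of each small chiplet is clean

print(f"monolithic yield: {mono:.2f}")
print(f"chiplet yield:    {chiplet:.2f}")
```

The exponential means yield degrades much faster than linearly with area, which is the usual case for stitching several small dies together instead of fabbing one giant one.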

It will not lower GPU prices. Product prices are only affected by supply and demand.

Intel's GPU will be MCM from the start.
Not sure about AMD (Navi was expected to be MCM).
 
Regardless of whether this rumor is true or not, "Hopper" seems like a worthy codename to honor someone who has made significant advances in computer science. I hope they will eventually honor Dennis Ritchie, who may have had the largest lasting impact on computer software.

Still, we don't have anything solid confirming what "Ampere" is at this point: whether it's a gaming architecture, an overarching architecture, a data-center architecture, or not real at all.
I would like to remind people that some details reported "everywhere" can still turn out to be completely wrong, like AMD Arcturus as a GCN successor, Turing cards being named 11xx, or Zen 2 running at 5 GHz…

It will not lower GPU prices. Product prices are only affected by supply and demand.
Not quite true. Production and development cost does to some extent impact how low the prices can be pushed, given there is competition in the first place of course.
 
Should have been 'Lovelace' after Ada Lovelace.

...But I can see why that might have had all sorts of unintended connotations, which Nvidia wouldn't have wanted (for those who know their Porn history)

Yeah, the Lovelace name hasn't been as... shiny since Linda.
 
Not quite true. Production and development cost does to some extent impact how low the prices can be pushed, given there is competition in the first place of course.
Nope. :)
Prices are driven by supply and demand. That's it. This is not finance and not logistics. It's pure math applied to how people use their resources (which we started to call "economy" a few centuries ago).
This is based on just 4 assumptions:
- there's a distribution of how much money buyers can spend,
- there's a limited amount of goods,
- each buyer wants to buy for as little as possible,
- each seller wants to sell for as much as possible.

As you can see, there are no costs in this theorem. :)

Costs can be seen as a sensible lower limit, but they don't impact the price. They only impact profitability. Because a company can (and often does) sell under costs. And it is allowed to do so. But if it does this too often, it's not profitable and collapses.
A company always sells for as much as it can. It will not voluntarily sacrifice earnings. :)
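Those four assumptions can be sketched as a toy one-unit-per-buyer auction (all valuations and quantities below are invented for illustration): with a limited supply, the clearing price is simply the supply-th highest willingness-to-pay, and production cost never appears in the formula.

```python
def clearing_price(valuations, supply):
    """Price at which exactly `supply` units sell: the supply-th highest
    willingness-to-pay. Each buyer wants one unit and pays as little as
    possible; the seller charges as much as the market will bear."""
    bids = sorted(valuations, reverse=True)
    return bids[supply - 1]

# Six would-be buyers, only three cards on the shelf:
buyers = [1200, 1000, 900, 700, 500, 300]
print(clearing_price(buyers, 3))  # 900 -- no cost term anywhere
```

Shrink the supply to one unit and the price jumps to 1200; nothing about manufacturing cost enters the calculation, which is exactly the claim being made above.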
 
Nope. :)

Prices are driven by supply and demand. That's it.
<snip>
As you can see, there are no costs in this theorem. :)
That's nonsense.
There will always be a threshold where the manufacturer crosses the line of profitability and will only sell at a loss for specific reasons, such as grabbing market share (for long-term profits), marketing reasons (e.g. a big sale), or offsetting the costs somehow with additional products or services (bundled or optional upgrades).

A good example is AMD's Vega: even after HBM supplies were plentiful and the mining craze was long gone, market supplies remained artificially low, simply because it wouldn't have been profitable to price them low enough to truly compete with Pascal.

Costs can be seen as a sensible lower limit, but they don't impact the price. They only impact profitability. Because a company can (and often does) sell under costs. And it is allowed to do so. But if it does this too often, it's not profitable and collapses.

A company always sells for as much as it can. It will not voluntarily sacrifice earnings. :)
And just there you admitted that cost is a factor.

If you know how the stock market works, there are usually both buy and sell orders many times larger than the actual traded volume, so the theoretical supply is there, but the supply is always dependent on a minimum price. The same is true for goods and services: the amount of product that Nvidia, AMD or Intel is willing to produce depends on the profit they can make from it.

Now let's leave the philosophy at the schools where it belongs, anyone with common sense understands that cost is always a factor.
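The point about supply depending on a minimum price can be sketched as a toy auction with a cost floor (all numbers invented for illustration): if the seller withholds units rather than sell below cost, the cost raises the price and cuts the volume traded.

```python
def clearing_with_floor(valuations, supply, unit_cost):
    """Toy auction where the seller refuses to sell below cost: the
    effective price is max(market-clearing price, unit cost), and fewer
    units trade when demand at that price falls short of supply."""
    bids = sorted(valuations, reverse=True)
    price = max(bids[min(supply, len(bids)) - 1], unit_cost)
    sold = sum(1 for b in bids if b >= price)
    return price, min(sold, supply)

buyers = [1200, 1000, 900, 700, 500, 300]
print(clearing_with_floor(buyers, 3, 950))  # (950, 2): cost floor raises price, cuts volume
```

With a low cost (say 100), the result matches a pure supply-and-demand clearing at (900, 3); with a cost of 950, only two units change hands at 950, which mirrors the Vega example of supply being kept artificially low.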
 
You mean definitely not its first implementation. I doubt the second will be for gaming either, since this is multi-chip. This one might only be for workstations.

Well... you never know, hence the "probably". No matter what, MCM is the future. Nvidia has certainly already worked on or completed several MCM designs for different applications (they don't pull those things off overnight) and definitely has the resources to brute-force any challenge an MCM design can present for gaming GPUs!

You can argue that they don't really have the need, as they dominate the market. But then again, Nvidia pioneered ray tracing in gaming and dedicated hardware to run it, and they had no need to do that either; today it's paying off (considering the strategic advantage they have in this field over the competition), so the same can happen with MCM.

Don't get me wrong, I don't see them pulling an MCM-based gaming GPU in 2021, but in 2022 there's a big chance of that happening. Ultimately it will depend on the degree of competition they receive from AMD and Intel in 2020 (or the lack of it).
 
We don't know what Ampere is about; a news piece about the name of the architecture that comes after it is as useless as it gets.
 
Why wouldn't a "chiplet" approach work with GPUs?
Especially having a separate RT chiplet?
 