Friday, March 11th 2016

NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations - launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," which could let NVIDIA win the crucial $550 and $350 price-points, will be a lean machine. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance, and to weather a potential price-war with AMD.

As part of its efforts to keep GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard across three previous generations. The new standard doubles densities, so one could expect NVIDIA to equip its GP104-based products with 8 GB of memory as standard. GDDR5X gives a new lease of life to GDDR5, whose clock speeds had plateaued around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps/pin at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
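For a rough sense of what those per-pin speeds mean in practice, here is a minimal back-of-the-envelope sketch; the 256-bit bus width is an assumption about GP104, not a confirmed specification.

```python
# Back-of-the-envelope peak memory bandwidth: (bus width in bits / 8) * per-pin rate.
# The 256-bit bus is an assumed figure for GP104, not confirmed by this report.
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits / 8 * pin_rate_gbps

for rate in (7, 10, 12, 14):  # GDDR5 today, then the GDDR5X speed grades above
    print(f"256-bit @ {rate} Gbps/pin -> {peak_bandwidth_gbs(256, rate):.0f} GB/s")
# 224, 320, 384 and 448 GB/s respectively
```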

The GP104 will be built on TSMC's 16 nm FinFET process. NVIDIA is hoping to unveil the first GP104-based products in April, at its annual GPU Technology Conference (GTC), with possible market availability by late May or early June 2016.
Source: Benchlife.info

135 Comments on NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

#1
esrever
I would be surprised if it did use GDDR5X, given how little enthusiasm Nvidia and pretty much every other tech company have shown for it. I expect normal GDDR5 over GDDR5X.
Posted on Reply
#2
P4-630
Nice! I will wait for that!
Using intel HD530 now on my new skylake build, still gaming on my ROG laptop just a little longer.
Posted on Reply
#3
FordGT90Concept
"I go fast!1!11!1!"
GDDR5X was announced not that long ago. It takes time to design a new memory controller and push it to market. AMD will likely do the same with their mid-range cards.
Posted on Reply
#4
Solaris17
Super Dainty Moderator
Damn I just retired my 290x with a 980ti like 2 weeks ago.
Posted on Reply
#5
Frick
Fishfaced Nincompoop
Gooooooooosddddddddd I just want the stuff on the market so we can skip the hype.
Posted on Reply
#6
Casecutter
When did/does GDDR5X reach production readiness? The info I had said volume production would be this summer.
www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress

My money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and better the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.
Posted on Reply
#7
rtwjunkie
PC Gaming Enthusiast
My money is on a paper-launch in late May-early June, with hard launch as soon as the new GDDR5X is ready for production.
Posted on Reply
#8
the54thvoid
Intoxicated Moderator
CasecutterWhen did/does GDDR5X reach production readiness? The info I had said volume production would be this summer.
www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress

My money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and better the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.
Quite possible on the performance metric. The GTX 980 didn't exactly outclass the 780 Ti, so the '1080' might be on par with the 980 Ti but, as you say, with far less power draw. It'd be strange for Nvidia to release a card they weren't sure would take the crown, so perhaps they know Polaris is coming far later in the year, when they can release a 1080 Ti? Almost reminiscent of the 780 being bettered by the 290X, with the 780 Ti then dropped a week or so later. It'd be nice if Polaris comes out with guns blazing, though, and forces a price war with Nvidia.
Posted on Reply
#9
truth teller
FordGT90ConceptIt takes time to design a new memory controller and push it to market.
not really, since the protocol is mostly untouched; it's mostly just a speed bump and a wider word/bus access
FordGT90ConceptAMD will likely do the same with their mid-range cards.
most definitely, which is not a bad thing in itself, since on mid/low-end SKUs there won't be much benefit in going the HBM way
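A minimal sketch of that "mostly a speed bump" point, assuming the widely reported 8n-to-16n prefetch change; the clock figures below are illustrative round numbers, not quoted specifications.

```python
# Illustrative sketch: GDDR5X's headline change is doubling the prefetch
# (8n -> 16n), so a similar DRAM array clock can feed roughly twice the
# per-pin data rate while the surrounding protocol stays largely the same.
# The clock values are assumed round numbers, not product specifications.
def per_pin_rate_gbps(array_clock_mhz: float, prefetch: int) -> float:
    return array_clock_mhz * prefetch / 1000  # Gbps per pin

print(per_pin_rate_gbps(875, 8))   # ~7 Gbps  - mature GDDR5
print(per_pin_rate_gbps(625, 16))  # ~10 Gbps - first GDDR5X speed grade
```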
Posted on Reply
#10
Naito
Darn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...
Posted on Reply
#11
manofthem
WCG-TPU Team All-Star!
I'll be looking forward to this. I haven't upgraded in a long while and I'm already scratching the itch :)
Posted on Reply
#12
NC37
Will be disappointing. Reports are the 1080 will have a 104 instead of a 110, which means the 1060 will be a 128-bit piece of junk, unless they do like they did with the 600 series and run 192-bit. Either way, it'll be more of the same. We haven't had a good 256-bit chip in the 60 series since Fermi.
Posted on Reply
#13
rtwjunkie
PC Gaming Enthusiast
NaitoDarn. I'm ready to throw down some money on a Pascal, but I'm gonna have to wait longer...
Wait longer for what? For a long time it had been looking like mid to late summer. This is earlier than expected.
Posted on Reply
#14
HumanSmoke
FordGT90ConceptGDDR5X was announced not that long ago. It takes time to design a new memory controller and push it to market. AMD will likely do the same with their mid-range cards.
JEDEC might have only just ratified the specification, but it doesn't necessarily follow that memory controller and its associated physical-layer IP development starts from that point. QA, verification and test tooling has been available for some time.

More likely is that the base GPU (for both vendors) is prepped for both IMC/PHY logic blocks. GDDR5 initially, and GDDR5X when production hits its stride. If 14/16nm is anything like previous nodes, both companies will want a low (R&D)-cost refresh option.
CasecutterWhen did/does GDDR5X reach production readiness? The info I had said volume production would be this summer.
www.anandtech.com/show/10017/micron-reports-on-gddr5x-progress
The salient point is mass production. Just as with initial HBM supplies, a mass-production announcement does not preclude products already shipping with the chips. SK Hynix announced volume production of HBM only a week before the Fury X launched - which definitively proves the chips used on the Fiji series of cards were fabricated some months before the Hynix announcement, given the lead time needed for GPU package assembly, board assembly, and shipping. Micron are already sampling GDDR5X, and they certainly didn't wait around for JEDEC to ratify the specification before beginning work.
CasecutterMy money is on the GTX 1080 with 8 GB GDDR5, 256-bit, 150 W TDP; it can best the Fury X and better the reference 980 Ti, and they'll sell for $550. They'll update it with GDDR5X and higher clocks (more or less a rebrand) when the FinFET process permits.
Excepting the naming (I sincerely doubt "1080" will be a model number), that seems reasonable. Nvidia will almost certainly target high-end mobile and performance-segment discrete as they did with GM204. Smaller die size will offset higher wafer costs, so $500-550 for the GTX _80 and $350 for the GTX _70 seems in line with previous pricing.
the54thvoidQuite possible on the performance metric. The GTX 980 didn't exactly outclass the 780 Ti, so the '1080' might be on par with the 980 Ti but, as you say, with far less power draw. It'd be strange for Nvidia to release a card they weren't sure would take the crown, so perhaps they know Polaris is coming far later in the year, when they can release a 1080 Ti? Almost reminiscent of the 780 being bettered by the 290X, with the 780 Ti then dropped a week or so later.
I'm picking that GP104 has enough performance to pick off the custom 980 Tis and Fury cards, but maybe not much more. Price the cards at $500-550 and $330-350 (GTX _70) and they will still stack up well for consumers in comparison. One thing is certain: the company usually knows how to sell product, so they will have the angles covered.
the54thvoidIt'd be nice if Polaris comes out with guns blazing, though, and forces a price war with Nvidia.
Sounds like a Grimm faery tale, unfortunately. Except for some hollow sabre-rattling and PR-driven token price cuts, both AMD and Nvidia seem intent on giving the impression of a price war without actually engaging in one.
Posted on Reply
#15
rruff
btarunrIt looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations - launching the second biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product.
GTX 750 doesn't count? It was way before the 970.
Posted on Reply
#17
Naito
rtwjunkieWait longer for what? For a long time it had been looking like mid to late summer. This is earlier than expected.
For the FULL Pascal. 4K takes a lot of GPU power to drive.
Posted on Reply
#18
midnightoil
rtwjunkieWait longer for what? For a long time it had been looking like mid to late summer. This is earlier than expected.
Um, NVIDIA's first cards are expected in very low volume at either the beginning of Q4 or end of Q3, no earlier. There's no way there'll be a Titan until late Q1 '17 at the very earliest.

This whole report is completely made up.

GDDR5X won't be in volume production until the end of Q3. So how exactly are NVIDIA going to launch a GDDR5X card at the end of Q2?

Answer: they can't and won't.

They still haven't demo'd real cards to anyone as far as we know. If they had it would have leaked. AMD had their first demos to the press in early December with real cards. Select partners had access earlier than that.
Posted on Reply
#20
PP Mguire
midnightoilUm, NVIDIA's first cards are expected in very low volume at either the beginning of Q4 or end of Q3, no earlier. There's no way there'll be a Titan until late Q1 '17 at the very earliest.

This whole report is completely made up.

GDDR5X won't be in volume production until the end of Q3. So how exactly are NVIDIA going to launch a GDDR5X card at the end of Q2?

Answer: they can't and won't.

They still haven't demo'd real cards to anyone as far as we know. If they had it would have leaked. AMD had their first demos to the press in early December with real cards. Select partners had access earlier than that.
GDDR5X is expected to reach high-volume production in the summer, which means GP104 could very well roll out in the fall, as partners are supposed to be receiving samples now. How they got "available by June" is beyond me, but a paper launch seems like it could happen.
Posted on Reply
#21
ZoneDymo
pretty sad for such a "premium" company to go for GDDR5X for all but the most overpriced of their products.
You would think that that massive market share would count for something.
Oh well...guess MOAR profits rule again and the blind will all gladly bend over.
Posted on Reply
#22
Naito
ZoneDymopretty sad for such a "premium" company to go for GDDR5X for all but the most overpriced of their products.
You would think that that massive market share would count for something.
Oh well...guess MOAR profits rule again and the blind will all gladly bend over.
The GP104 is not their flagship chip, so it is understandable. Having said that, GDDR5X is nothing to scoff at, if what I've read is to be believed. 8 GB/256-bit SKUs with the bandwidth one might expect from a 512-bit memory system running GDDR5 may become more commonplace, especially in the mid-to-high end. Surely this is a good thing for consumers?
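A quick back-of-the-envelope check of that 256-bit-vs-512-bit comparison; the speed grades used are assumptions based on the figures in the article, not product specifications.

```python
# Sanity check of the "256-bit GDDR5X ~ 512-bit GDDR5" comparison.
# Speed grades are assumed: ~7 Gbps for mature GDDR5, 14 Gbps for top-end GDDR5X.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(512, 7))    # 448.0 GB/s - wide GDDR5 bus
print(bandwidth_gbs(256, 14))   # 448.0 GB/s - narrow bus, top GDDR5X grade
```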
Posted on Reply
#23
TheDeeGee
Solaris17Damn I just retired my 290x with a 980ti like 2 weeks ago.
It was known for ages that the new cards would come out around May/June.
Posted on Reply
#24
Jism
Since AMD was involved in the development of HBM and HBM2, it's said to have first priority on HBM1 & HBM2 chip supply as well. The reason NVIDIA is choosing GDDR5X is simply that it can't get HBM2 chips yet.

HBM is still superior to GDDR: fewer chips, less power, much tighter latencies and much greater bandwidth.
Posted on Reply
#25
Naito
JismHBM is still superior to GDDR: fewer chips, less power, much tighter latencies and much greater bandwidth.
GDDR5X closes the gap considerably when compared to HBM1, so much so that unless you're gaming at 4K, you'd probably never notice the difference, if then.
Posted on Reply