
Rambus Talks HBM3, DDR5 in Investor Meeting

Raevenlord

News Editor
Rambus, a company that has long straddled the line between innovator and patent troll, has shed some more light on what can be expected from HBM3 memory (when it's finally available). In an investor meeting, company representatives shared details regarding HBM3's improvements over HBM2. Details are still scarce, but at least we know Rambus' expectations for the technology: double the memory bandwidth per stack compared to HBM2 (4,000 MB/s), and a more complex design, which leaves the 2.5D approach behind due to the increased height of the HBM3 memory stacks. Interestingly, Rambus is counting on HBM3 being produced on 7 nm technologies. Considering the overall semiconductor manufacturing calendar for the 7 nm process, this should place HBM3 production in 2019 at the earliest.
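For a rough sense of what "double the bandwidth per stack" would mean, here is a back-of-the-envelope sketch in Python. It assumes the standard 1024-bit HBM interface and HBM2's nominal 2.0 Gbps per-pin rate; the doubled HBM3 figure is extrapolated from Rambus' claim, not taken from their slides:

```python
# Back-of-the-envelope HBM per-stack bandwidth (illustrative figures).
# HBM2 uses a 1024-bit interface per stack (8 channels x 128 bits); a
# per-pin rate of 2.0 Gbps yields the commonly quoted 256 GB/s per stack.
# Doubling the per-pin rate (one plausible reading of Rambus' claim)
# gives 512 GB/s.

INTERFACE_WIDTH_BITS = 1024  # pins per HBM stack

def stack_bandwidth_gbs(per_pin_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s for a given per-pin rate in Gbps."""
    return INTERFACE_WIDTH_BITS * per_pin_gbps / 8  # bits -> bytes

print(f"HBM2 @ 2.0 Gbps/pin: {stack_bandwidth_gbs(2.0):.0f} GB/s")  # 256 GB/s
print(f"HBM3 @ 4.0 Gbps/pin: {stack_bandwidth_gbs(4.0):.0f} GB/s")  # 512 GB/s
```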

HBM3 is also expected to bring much lower power consumption than HBM2, in addition to increased memory density and bandwidth. However, the "complex design architectures" in the Rambus slides should give readers pause. HBM2 production has apparently struggled to meet demand, with lower-than-expected yields the most likely culprit. Knowing the trouble AMD has had in successfully packaging HBM2 memory with the silicon interposer and its own GPUs, an even more complex implementation of HBM memory in HBM3 could signal more troubles in that area - maybe not just for AMD, but for any other takers of the technology. Here's hoping AMD's woes were due only to one-off snags on its packaging partners' side, and don't spell trouble for the HBM implementation itself.

Other details that surfaced in the Rambus investor meeting pertain to DDR5 memory. Rambus says DDR5 will also be built on a 7 nm fabrication process, which squares with Micron's assertion that the new memory specification should be ready for production in 2020. Since DDR5 requires much higher production volumes than HBM3, it makes sense for HBM3 to enter production and reach customers slightly earlier: the new 7 nm process can be proven out on a lower-volume, higher-margin product, helping ensure DDR5 yields reach adequate rates.

 
Rambus...? NOOOOO! I will never forget the BS they did in the early 2000's
 
Rambus...? NOOOOO! I will never forget the BS they did in the early 2000's

At least we have dual-channel memory thanks to them... The patent stuff is a bit messed up.
 
Rambus...? NOOOOO! I will never forget the BS they did in the early 2000's
What BS is that? Rambus RAM rocked. Whether you ran a Pentium 4 or even a Pentium 3 [I had both], the RAM bandwidth was unmatched by anything else at the time.
 
What BS is that? Rambus RAM rocked. Whether you ran a Pentium 4 or even a Pentium 3 [I had both], the RAM bandwidth was unmatched by anything else at the time.
So were the heat, energy usage, latency, and use of terminators.
 
What BS is that? Rambus RAM rocked. Whether you ran a Pentium 4 or even a Pentium 3 [I had both], the RAM bandwidth was unmatched by anything else at the time.
So was their patent trolling.
 
So were the heat, energy usage, latency, and use of terminators.
The SCSI bus standard used termination schemes, but that didn't stop it from being the highest-performing bus architecture of its time. Heat was easily solved by adding better cooling (fans) to the case. And back then people didn't care so much about power usage. Latency was not an issue and was on par with the other RAM standards of the time.
So was their patent trolling.
I'm not saying Rambus, the company, didn't make mistakes. I'm saying that Rambus, the product, was a great performer.
 
The SCSI bus standard used termination schemes, but that didn't stop it from being the highest-performing bus architecture of its time. Heat was easily solved by adding better cooling (fans) to the case. And back then people didn't care so much about power usage. Latency was not an issue and was on par with the other RAM standards of the time.
Right, the highest-performing of its class, which RDRAM wasn't overall, while its implementation was more expensive and complex than anything else on the RAM side of things at the time. DDR effectively killed it a year or so after RDRAM was released, by being cheaper, offering more performance, and using the same implementation DRAM had had for the last 20 years. Your common computer chassis in the late 90's didn't have much headroom for cooling, because the Pentium II/III, Athlon, et al. didn't require it - power usage wasn't that high anyway, hence why people didn't care.
I understand you want to advocate for RDRAM, but there wasn't anything particularly great about it besides bandwidth. And even that was beaten by the time DDR PC-3200 came around.
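For reference, here is the peak-bandwidth arithmetic behind that claim - a minimal sketch using nominal per-channel figures (real-world throughput and latency told a more complicated story):

```python
# Nominal peak bandwidth per channel for the memory types under discussion.
# PC800 RDRAM: 16-bit channel at 800 MT/s = 1.6 GB/s; Intel's i850 chipset
# ran two channels for 3.2 GB/s. DDR SDRAM: a single 64-bit channel.
rdram_pc800 = 16 * 800 / 8 / 1000   # GB/s per RDRAM channel -> 1.6
rdram_i850_dual = 2 * rdram_pc800   # i850 dual-channel -> 3.2 GB/s

def ddr_gbs(mts: int) -> float:
    """Peak GB/s for one 64-bit DDR channel at the given MT/s."""
    return 64 * mts / 8 / 1000

for name, mts in [("DDR266 / PC2100", 266),
                  ("DDR333 / PC2700", 333),
                  ("DDR400 / PC3200", 400)]:
    verdict = "matches/beats" if ddr_gbs(mts) >= rdram_i850_dual else "trails"
    print(f"{name}: {ddr_gbs(mts):.2f} GB/s ({verdict} dual PC800 at "
          f"{rdram_i850_dual:.1f} GB/s)")
```

On this math a single DDR400 channel only ties dual-channel PC800, which is why the crossover point gets argued both ways in this thread.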
 
Don't post completely incorrect information claiming to know what you are talking about.
I understand you want to advocate for RDRAM
That's an assumption on your part. I'm not advocating so much as simply objectively recalling experiences. I was there to work with and see that equipment perform, and it was the best money could buy for a time. I didn't say it wasn't pricey, only that it performed very well. And you're right, it wasn't surpassed in performance until DDR 3200. By that time Intel had abandoned the RDRAM standard anyway.
Your common computer chassis in the late 90's didn't have much headroom for cooling
That is total rubbish. Most cases of even decent quality had great cooling headroom.

Grow up a little, the both of you. I was building and working on PC's in the 1980's, which is likely before either of you were born. What I share here is always based on actual experience, not drivel discovered through Google.
 
That's an assumption on your part. I'm not advocating so much as simply objectively recalling experiences. I was there to work with and see that equipment perform, and it was the best money could buy for a time. I didn't say it wasn't pricey, only that it performed very well. And you're right, it wasn't surpassed in performance until DDR 3200. By that time Intel had abandoned the RDRAM standard anyway.

That is total rubbish. Most cases of even decent quality had great cooling headroom.

Grow up a little, the both of you. I was building and working on PC's in the 1980's, which is likely before either of you were born. What I share here is always based on actual experience, not drivel discovered through Google.

Most OEM cases in the early 2000's, when RDRAM was a thing, certainly did not have good cooling. Typically we are talking about a single 80 mm exhaust fan in the PSU, a single 80 mm exhaust fan on the back, and that was all.

I actually owned one of the systems in question. Socket 423 P4 with RDRAM.

RDRAM was good for next to no time and was attached to a garbage CPU platform, aka Intel's Socket 423. Socket 423, by the way, didn't even make it a year, much like the RDRAM it was equipped with. (To be fair, there were a handful of RDRAM-equipped s478 boards, but with DDR performing better and being cheaper, that lasted next to no time.)

You being older doesn't make you know anything. I have dealt with enough older individuals that have trouble turning on a PC let alone knowing what they are talking about with one.
 
Most OEM cases in the early 2000's, when RDRAM was a thing, certainly did not have good cooling. Typically we are talking about a single 80 mm exhaust fan in the PSU, a single 80 mm exhaust fan on the back, and that was all.
Where were you buying your PC cases? The thrift store? Like today, back then you could buy good ones and you could buy crap. And PC components back then didn't create as much heat, so it didn't take much to keep them cool. A fan on the front, one on the back, and one in the power supply. The good ones at that time also had venting for a fan on the side panel for blowing air directly at the mobo. If you were having cooling problems, you were buying crap cases.
I actually owned one of the systems in question. Socket 423 P4 with RDRAM.
Good for you. I built and tested hundreds of them. Benchmark testing showed very clearly that RDRAM, particularly the 800 MHz offerings, provided measurable and useful gains in every aspect of computing, including gaming, which made up the majority of the systems I built.
RDRAM was good for next to no time and was attached to a garbage CPU platform, aka Intel's Socket 423. Socket 423, by the way, didn't even make it a year, much like the RDRAM it was equipped with. (To be fair, there were a handful of RDRAM-equipped s478 boards, but with DDR performing better and being cheaper, that lasted next to no time.)
The 423 P4 1300 MHz and 1400 MHz models were outperformed by the top-end P3 models, but from 1500 MHz up, the P4 was the clear winner. AMD's models weren't even contenders at that time. Socket 423 was a solid offering that wasn't well received by the industry because of RDRAM costs. Costs aside, P4 + RDRAM = premium performance. However, Socket 423 wasn't alone in getting RDRAM. There were also single and dual Pentium 3 boards which provided excellent performance advantages over SDRAM.
You being older doesn't make you know anything.
True! Experience and first hand professional knowledge are what define that I know a lot. And seemingly more than you.
I have dealt with enough older individuals that have trouble turning on a PC let alone knowing what they are talking about with one.
Really? That's it?

EDIT: There is a reason why Nintendo and Sony chose RDRAM over other types for their game systems of that time. I wonder what that reason was? It certainly wasn't the cost...
 
Where were you buying your PC cases? The thrift store? Like today, back then you could buy good ones and you could buy crap. And PC components back then didn't create as much heat, so it didn't take much to keep them cool. A fan on the front, one on the back, and one in the power supply. The good ones at that time also had venting for a fan on the side panel for blowing air directly at the mobo. If you were having cooling problems, you were buying crap cases.

Do you know what OEM means? The users we have been talking about this whole time were not DIY users.

Good for you. I built and tested hundreds of them. Benchmark testing showed very clearly that RDRAM, particularly the 800 MHz offerings, provided measurable and useful gains in every aspect of computing, including gaming, which made up the majority of the systems I built.

The 423 P4 1300 MHz and 1400 MHz models were outperformed by the top-end P3 models, but from 1500 MHz up, the P4 was the clear winner. AMD's models weren't even contenders at that time. Socket 423 was a solid offering that wasn't well received by the industry because of RDRAM costs. Costs aside, P4 + RDRAM = premium performance. However, Socket 423 wasn't alone in getting RDRAM. There were also single and dual Pentium 3 boards which provided excellent performance advantages over SDRAM.

Are you proud of these figures? You tested and worked on a platform so good that it lasted 9 months. Its performance gains were so good it lasted 9 months. Cellphone processors last on the market longer than that. The entire RDRAM platform you are speaking about was off the market faster than AMD rebrands its GPUs right now. RDRAM had one positive attached to it, and that was dual channel. Outside of that it was garbage. Hot, expensive, poorly implemented, etc.

True! Experience and first hand professional knowledge are what define that I know a lot. And seemingly more than you.

You know what, you can tell us all more about a garbage dead-end technology. Please elaborate more on the numbers the P4 Willamette put out vs something modern like a mobile phone. You know, the item with more memory bandwidth and CPU processing power, and not using RDRAM, which died. Like D-E-D dead.

Really? That's it?

EDIT: There is a reason why Nintendo and Sony chose RDRAM over other types for their game systems of that time. I wonder what that reason was? It certainly wasn't the cost...

Consoles use all kinds of terrible technology. They currently use a pair of Athlon 5350s glued together and a 7950 for "4K gaming". If that is your argument for performance, it is going to fall on deaf ears.
 
Do you know what OEM means?
No, I don't. Help me out there.
The users we have been talking about this whole time were not DIY users.
I was speaking from the perspective of a professional system builder who specialized in high-performance PC's. [Say THAT five times fast]
Are you proud of these figures?
You think this is about ego? :kookoo: I am concerned only with factual information. At that time, the benchmarks clearly showed RDRAM's advantage.
You tested and worked on a platform so good that it lasted 9 months. Its performance gains were so good it lasted 9 months. Cellphone processors last on the market longer than that. The entire RDRAM platform you are speaking about was off the market faster than AMD rebrands its GPUs right now. RDRAM had one positive attached to it, and that was dual channel. Outside of that it was garbage. Hot, expensive, poorly implemented, etc.
Market acceptance and adoption have nothing to do with a product's technological performance. Regardless of RDRAM's adoption, it was a superior performer. And those products were around for almost 3 years, not 9 months. DDR RAM was available in dual channel shortly after RDRAM released, and DDR did not perform as well until it hit DDR333, only surpassing RDRAM when it reached DDR400. RDRAM was selected for the N64 and PlayStation 2 because of superior performance. Neither Nintendo nor Sony were in the habit of buying "garbage" for their game systems.
You know what, you can tell us all more about a garbage dead-end technology. Please elaborate more on the numbers the P4 Willamette put out vs something modern like a mobile phone. You know, the item with more memory bandwidth and CPU processing power, and not using RDRAM, which died. Like D-E-D dead.
Interesting associational logic. :wtf: You truly have a dizzying intellect.
Consoles use all kinds of terrible technology. They currently use a pair of Athlon 5350s glued together and a 7950 for "4K gaming". If that is your argument for performance, it is going to fall on deaf ears.
If the result is a good gaming experience, why do you care? That would be like saying the Nintendo Switch is garbage because it isn't the highest-spec'd system on the market. Perhaps this deafness you speak of has less to do with ears and more to do with something else...
 
Since this thread turned into an e-peen contest, I would caution the culprits (you know who you are) to stay on topic.
 