
Samsung Mass Producing Industry's Most Advanced 4 Gb DDR3, Using 20 nm Process

btarunr

Editor & Senior Moderator
Samsung Electronics Co., Ltd., the world leader in memory technology, today announced that it is mass producing the most advanced DDR3 memory, based on a new 20 nanometer process technology, for use in a wide range of computing applications. Samsung has pushed the envelope of DRAM scaling, while utilizing currently available immersion ArF lithography, in its roll-out of the industry's most advanced 20-nanometer (nm) 4-gigabit (Gb) DDR3 DRAM.

With DRAM, where each cell consists of a capacitor linked to a transistor, scaling is more difficult than with NAND flash memory, in which a cell needs only a transistor. To continue scaling to more advanced DRAM, Samsung refined its design and manufacturing technologies and developed modified double patterning and atomic layer deposition processes.



Samsung's modified double patterning technology marks a new milestone, by enabling 20nm DDR3 production using current photolithography equipment and establishing the core technology for the next generation of 10nm-class DRAM production. Samsung also successfully created ultrathin dielectric layers of cell capacitors with an unprecedented uniformity, which has resulted in higher cell performance.

By applying these technologies to the new 20nm DDR3 DRAM, Samsung has also improved manufacturing productivity, which is over 30 percent higher than that of the preceding 25-nanometer DDR3 and more than twice that of 30nm-class DDR3.

In addition, the new 20nm 4Gb DDR3-based modules can save up to 25 percent of the energy consumed by equivalent modules fabricated using the previous 25-nanometer process technology. This improvement provides the basis for delivering the industry's most advanced green IT solutions to global companies.
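For a rough sense of what those relative figures mean, here is a back-of-the-envelope sketch in Python. The baseline productivity and module power values are assumptions purely for illustration, since the release quotes only relative numbers.

Code:
# Back-of-the-envelope illustration of the relative figures quoted above.
# The baseline values are assumptions for illustration only; Samsung gives no absolutes.

baseline_25nm_productivity = 100.0                      # good dies per wafer, arbitrary units (assumed)
productivity_20nm = baseline_25nm_productivity * 1.30   # "over 30 percent higher" than 25 nm
productivity_30nm_class = productivity_20nm / 2.0       # 20 nm is "more than twice" 30nm-class

module_power_25nm_watts = 4.0                           # assumed load power of a 25 nm based module
module_power_20nm_watts = module_power_25nm_watts * (1 - 0.25)  # "up to 25 percent" energy savings

print(f"20 nm productivity (relative):          {productivity_20nm:.0f}")
print(f"30nm-class productivity (at most):      {productivity_30nm_class:.0f}")
print(f"20 nm module power (4 W assumed basis): {module_power_20nm_watts:.1f} W")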

"Samsung's new energy-efficient 20-nanometer DDR3 DRAM will rapidly expand its market base throughout the IT industry including the PC and mobile markets, quickly moving to mainstream status," said Young-Hyun Jun, executive vice president, memory sales and marketing, Samsung Electronics. "Samsung will continue to deliver next-generation DRAM and green memory solutions ahead of the competition, while contributing to the growth of the global IT market in close cooperation with our major customers."

According to market research data from Gartner, the global DRAM market will grow from $35.6 billion in 2013 to $37.9 billion in 2014.

View at TechPowerUp Main Site
 
Any chance we can get more Samsung miracle memory?
 
I hope they will also produce this for desktops. I wonder about the price, the timing, and the difference from current DDR3.
 
Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? With this, a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...........
 
Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? With this, a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...........
This sounds nice. I never knew there was such a thing as that with an ARM processor.
 
What kind of voltage are these running? <1v?
 
What kind of voltage are these running? <1v?

1.25 V +/- 0.06 V

So 1.2-1.3ish.

One of those great PR spin jobs where they are essentially saying it CAN run at 1.2 V (for low-voltage products), whereas 30nm was running at 1.5 V for 'average' products, but in reality the nominal spec of 1.25 V (for low-voltage) is similar to the 30nm spec of 1.35 V (for low-voltage).

So...yeah. TBH, quite surprised they couldn't pull a 2133 MHz bin from that spec...but someone probably will make such a product.
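If you want the spec window and the rough power impact spelled out, here is a quick sketch. The 1.50 V / 1.35 V / 1.25 V figures are the nominal standard DDR3, DDR3L, and new low-voltage specs discussed above; the quadratic scaling is the usual CV^2f approximation, not a measured number.

Code:
# Spec window for 1.25 V +/- 0.06 V, and rough dynamic-power scaling (P ~ V^2 at fixed frequency).

nominal_v = 1.25
tolerance_v = 0.06
print(f"Spec window: {nominal_v - tolerance_v:.2f} V to {nominal_v + tolerance_v:.2f} V")

for label, volts in [("standard DDR3", 1.50), ("DDR3L", 1.35), ("20 nm low-voltage", 1.25)]:
    relative_power = (volts / 1.50) ** 2
    print(f"{label}: {volts:.2f} V -> ~{relative_power * 100:.0f}% of the 1.50 V dynamic power")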
 
Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? With this, a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...........

Right....... Because DDR5 has been developed already right?
 
There isn't much to be gained in actual system performance from this DRAM. It's just a refinement of existing DDR3. The slightly lower power consumption and cool tech are nice and all, but you won't actually see any tangible system gains. They are just applying some DDR4 requirements to DDR3, as DDR4 is primarily for servers and not a cost-effective DRAM solution for desktops or laptops.
 
So much mass production and still the price of RAM is high.
 
Right....... Because DDR5 has been developed already right?


Considering we are using it in the new consoles and on graphics cards....... I know it's "G" DDR5, but the specifications can't be that hard to implement considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole, we all appreciate it.
 
Considering we are using it in the new consoles and on graphics cards....... I know it's "G" DDR5, but the specifications can't be that hard to implement considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole, we all appreciate it.

You have suggested that we skip a technology which God knows how many people have worked on for a long time. I do not think that it was a clever suggestion.
 
You have suggested that we skip a technology which God knows how many people have worked on for a long time. I do not think that it was a clever suggestion.
http://en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not market adoption or real-world performance.

Perhaps we could get together and watch an HD-DVD or a LaserDisc and post about it to our MySpace pages on our BlackBerrys? Oh, could we!!!!?
 
^ Haha, He said "MySpace!!" :laugh:
 
http://en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not market adoption or real-world performance.

Perhaps we could get together and watch an HD-DVD or a LaserDisc and post about it to our MySpace pages on our BlackBerrys? Oh, could we!!!!?
Dammit you guys are so far ahead, I'm still using Betamax.
 
Dammit you guys are so far ahead, I'm still using Betamax.
Make sure to embrace the future, but only one step at a time, so we make sure the big good companies who only care about our personal lives can make the pittance they do serving us gods.
 
http://en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not market adoption or real-world performance.

Perhaps we could get together and watch an HD-DVD or a LaserDisc and post about it to our MySpace pages on our BlackBerrys? Oh, could we!!!!?

If I understand correctly, you're assuming that DDR4 will be a letdown because GDDR4 was not so great? I think I'll have fun quoting you again if it is indeed successful.

Secondly, you're talking about technologies that failed to remain relevant. That being said, I'm not so sure you predicted that (i.e. failure to stay relevant).

Finally, you're comparing tech that is sinking or that has sunk with a tech that hasn't hit the market yet.

P.S.: I was going to talk about the usefulness of posts and the use of sarcasm in said posts, but I changed my mind when I realized you don't like being on the receiving end of criticism.
 
I wonder if the Samsung memory in the Samsung laptop I recently got is this sort.
 
If I understand correctly, you're assuming that DDR4 will be a letdown because GDDR4 was not so great? I think I'll have fun quoting you again if it is indeed successful.

Secondly, you're talking about technologies that failed to remain relevant. That being said, I'm not so sure you predicted that (i.e. failure to stay relevant).

Finally, you're comparing tech that is sinking or that has sunk with a tech that hasn't hit the market yet.

P.S.: I was going to talk about the usefulness of posts and the use of sarcasm in said posts, but I changed my mind when I realized you don't like being on the receiving end of criticism.
But yet you did, good sir!!!!

I am not going to spend the time to break all of my reasoning down for you beyond this: the APU powering the PS4 uses GDDR5, has no issues, and squeezes great performance out of it. Manufacturers are already making GDDR5 and it's years ahead of current RAM. Who is making DDR4? Who has adopted it, despite it having been laid out and planned since 2009? If you had the choice between using a higher-speed new version and a lower-speed old version, would you honestly choose the older version?

We the consumers have the final say in it with our money. Much like GDDR4 was a good small stepping stone but was ultimately cast aside, current DDR4 production (it is in production and has been waiting in the wings for a while now) offers no performance or cost improvement over DDR3 outside a small low-power server segment, while we already have and know of working high-performance uses of GDDR5.

Set your sights higher than trying to disagree with another person.
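For what it's worth, the raw bandwidth gap being argued about here can be estimated with the usual formula: effective transfer rate x bus width / 8. A minimal sketch, using the commonly cited figures for dual-channel DDR3-1600 and the PS4's GDDR5 purely as an illustration:

Code:
# Theoretical peak bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8.
# The transfer rates and bus widths below are commonly cited figures, used only for illustration.

def peak_bandwidth_gb_s(transfer_mt_s, bus_width_bits):
    return transfer_mt_s * bus_width_bits / 8 / 1000  # GB/s

print(f"Dual-channel DDR3-1600 (128-bit): {peak_bandwidth_gb_s(1600, 128):.1f} GB/s")
print(f"PS4 GDDR5 (5500 MT/s, 256-bit):   {peak_bandwidth_gb_s(5500, 256):.1f} GB/s")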
 
Considering we are using it in the new consoles and on graphics cards....... I know it's "G" DDR5, but the specifications can't be that hard to implement considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole, we all appreciate it.
Stevo, GDDR5 is based on DDR3. It's just a different implementation.
 
Wow, good job, Samsung development team. 30% is not so little in relation to the norms required for operation, regardless of the somewhat disruptive high latency (CL-11 at 12600).
I hope we will soon see a new batch of RAM from manufacturers such as G.SKILL and others. :clap:
 
GDDR5 tech is not feasible due to its very high power consumption.

I wonder if this will bring us 16GB DIMMs?

My 32GB of memory is feeling a bit claustrophobic.
 
GDDR5 tech is not feasible due to its very high power consumption.


My 32GB of memory is feeling a bit claustrophobic.


So the fact that the PS4 uses it, and that all current graphics cards use it yet can draw just a few watts during light use......


I don't know what's so hard to comprehend; it's usable, and it's faster.

More than 32GB of memory in anything other than a production environment is currently useless; the bandwidth constraints of paging through 32GB for execution make it impractical with current-generation processors. We need faster RAM and a wider bus.
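To put a rough number on the paging-through-32GB point, a minimal sketch, assuming the dual-channel DDR3-1600 theoretical peak from above and an arbitrary ~70% sustained figure (both are assumptions; real numbers vary by platform):

Code:
# Time to stream 32 GB through memory once at a given bandwidth.
# 25.6 GB/s is the dual-channel DDR3-1600 theoretical peak; the sustained figure is an assumption.

memory_gb = 32
for label, bandwidth_gb_s in [("DDR3-1600 dual-channel peak", 25.6), ("assumed ~70% sustained", 18.0)]:
    seconds = memory_gb / bandwidth_gb_s
    print(f"{label}: {seconds:.2f} s to sweep {memory_gb} GB once")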
 
I am not going to spend the time to break all of my reasoning down for you beyond this: the APU powering the PS4 uses GDDR5, has no issues, and squeezes great performance out of it.

While that's nice, I fail to see how that is a demonstration of DDR5's advantages over DDR4. Now, if you're talking about using GDDR5 as system memory, I wonder if that is feasible. Your sole example is a console, the architecture of which is not exactly the same as our PCs, I believe.

we already have and know of working high-performance uses of GDDR5.

Yet, none of us knows how much time and work are needed for us to have usable DDR5 as system memory. So, should we wait for DDR5 RAM sticks' availability and use DDR3 in the meantime, or should we adopt DDR4 until that occurs?

You've made a bold claim (that it's better for the next generation of memory to be DDR5 in lieu of DDR4), and you've called another user names, just because he was being sarcastic. Then, you yourself have been sarcastic towards me. Of course I'm going to disagree with you. ;)
 