Thursday, February 18th 2021

Intel Rocket Lake-S Lands on March 15th, Alder Lake-S Uses Enhanced 10 nm SuperFin Process

In the latest round of rumors, we have received some really interesting news regarding Intel's upcoming lineup of desktop processors. Thanks to HKEPC media, we have information about the launch date of Intel's Rocket Lake-S processor lineup, along with Alder Lake-S details. Starting with Rocket Lake, Intel has not unveiled the exact availability date of these processors. However, according to HKEPC, Rocket Lake is landing in our hands on March 15th. With 500 series chipsets already launched, consumers are now waiting for the processors to arrive as well, so they can pair their new PCIe 4.0 NVMe SSDs with the latest processor generation.

When it comes to the next-generation Alder Lake-S design, Intel is reported to use its enhanced 10 nm SuperFin process to manufacture these processors. This would mean the node is more efficient than the regular 10 nm SuperFin found in Tiger Lake processors, and improvements such as higher frequencies are expected. Alder Lake is expected to use a big.LITTLE core configuration, with the small cores being Gracemont designs and the big cores being Golden Cove designs. Golden Cove is expected to deliver a 20% IPC improvement over Willow Cove, which exists today in Tiger Lake designs. Paired with PCIe 5.0 and DDR5 technology, Alder Lake is looking like a compelling upgrade, arriving in December of this year. Pictured below is an LGA1700 engineering sample of an Alder Lake-S processor.
Sources: HKEPC, via VideoCardz

76 Comments on Intel Rocket Lake-S Lands on March 15th, Alder Lake-S Uses Enhanced 10 nm SuperFin Process

#1
Legacy-ZA
Now, this is what I call a waste of resources. Shortages everywhere, but hey, let's make a "stop-gap" until the end of the year. /facepalm
Posted on Reply
#2
watzupken
Here we go: 10nm++ = SuperFin, 10nm+++ = Enhanced SuperFin. I think Intel should really consider making the branding less complicated and drop the very cheesy SuperFin naming. If they continue with this naming convention, would 10nm++++ = Further Enhanced SuperFin?
Posted on Reply
#3
Legacy-ZA
watzupken
Here we go: 10nm++ = SuperFin, 10nm+++ = Enhanced SuperFin. I think Intel should really consider making the branding less complicated and drop the very cheesy SuperFin naming. If they continue with this naming convention, would 10nm++++ = Further Enhanced SuperFin?
Yes sir! And don't forget: with every new CPU, you get a new socket and mobo, maybe even some RAM too. :roll:
Posted on Reply
#4
watzupken
Legacy-ZA
Now, this is what I call a waste of resources. Shortages everywhere, but hey, let's make a "stop-gap" until the end of the year. /facepalm
I totally agree that Rocket Lake is a stop-gap solution, but I think Intel is clearly aware that they are falling behind the competition and were forced to take this route. If not for the shortage of AMD chips and the spike in PC demand due to the pandemic, demand and interest for Intel processors would probably be at the lowest in Intel's history. The plan to build Rocket Lake on 14 nm must have been decided at least a year or two before COVID-19 hit.
Posted on Reply
#5
dgianstefani
watzupken
Here we go: 10nm++ = SuperFin, 10nm+++ = Enhanced SuperFin. I think Intel should really consider making the branding less complicated and drop the very cheesy SuperFin naming. If they continue with this naming convention, would 10nm++++ = Further Enhanced SuperFin?
Better than doing a Samsung and calling updated "12nm" "8nm".

You lads like to shit on Intel for everything they do, but they managed to stay competitive for five years on revisions of the same process, which is a testament to their engineers.

It's been shown time and again that Intel's 14nm++ is about as good as other companies' 10 nm, and their 10nm+ is about as good as TSMC's 7 nm.

Calm down and understand all the "nm" numbers are just marketing numbers and don't reflect actual transistor measurements. What's important is the performance.
Posted on Reply
#6
Cobain
Legacy-ZA
Now, this is what I call a waste of resources. Shortages everywhere, but hey, let's make a "stop-gap" until the end of the year. /facepalm
I have enough years of building DIY computers to strongly disagree with you. I will never jump straight to a new platform with a new RAM standard.

I remember buying into the DDR3 hype, just to get stuck at awful 1066 MHz CL13 speeds and paying €150 for an 8 GB kit. Two years down the road, 1600 MHz CL8 cost €50 for 2x4 GB kits.

I didn't learn my lesson and jumped to DDR4 on day one, paying €200 for 2x4 GB 2400 MHz CL18. Two years later, 3200 MHz CL16 was the norm, costing €100 for 2x4 GB.

There is no way I'm jumping on DDR5 and buying the first kits for a lot of money. Next time I jump to a new platform with new RAM, I will make sure the kits I buy last me the entire DDR5 generation.

It's an awful idea to buy new RAM as soon as it gets released. I will wait for it to mature first.
Posted on Reply
#7
Legacy-ZA
Cobain
I have enough years of building DIY computers to strongly disagree with you. I will never jump straight to a new platform with a new RAM standard.

I remember buying into the DDR3 hype, just to get stuck at awful 1066 MHz CL13 speeds and paying €150 for an 8 GB kit. Two years down the road, 1600 MHz CL8 cost €50 for 2x4 GB kits.

I didn't learn my lesson and jumped to DDR4 on day one, paying €200 for 2x4 GB 2400 MHz CL18. Two years later, 3200 MHz CL16 was the norm, costing €100 for 2x4 GB.

There is no way I'm jumping on DDR5 and buying the first kits for a lot of money. Next time I jump to a new platform with new RAM, I will make sure the kits I buy last me the entire DDR5 generation.

It's an awful idea to buy new RAM as soon as it gets released. I will wait for it to mature first.
Strange, I found the exact opposite to be true. Then again, I always did do my research well before I bought anything.
Posted on Reply
#8
Cobain
Legacy-ZA
Strange, I found the exact opposite to be true. Then again, I always did do my research well before I bought anything.
This has nothing to do with research. The first RAM modules on a new DDR standard are always expensive and considerably slower than the kits that release a year afterwards. You will be paying a lot of money for something that I doubt will beat a Samsung B-die DDR4 kit.
Posted on Reply
#9
dgianstefani
The early DDR5 kits will have higher latency numbers that the extra bandwidth won't make up for. Later ones will have better latencies combined with higher bandwidth, and there will have been time for others to beta test the hardware and figure out which are the best dies, etc.

My 4000/14 kit will probably have better timings/latencies than some DDR5 kits, without the energy-saving advantages and at a higher voltage, of course.
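The latency point can be checked with simple arithmetic: absolute first-word latency is the CAS count times the clock period, and DDR transfers twice per clock. A quick Python sketch — the DDR5-4800 CL40 figures stand in for a typical early kit and are not from this thread:

```python
def cas_latency_ns(data_rate_mts: int, cas_cycles: int) -> float:
    """Absolute CAS latency in nanoseconds.

    DDR transfers twice per clock, so one clock lasts
    2000 / data_rate_mts nanoseconds; CAS latency is counted in clocks.
    """
    return cas_cycles * 2000 / data_rate_mts

print(cas_latency_ns(4000, 14))  # DDR4-4000 CL14 kit above -> 7.0 ns
print(cas_latency_ns(4800, 40))  # early DDR5-4800 CL40 (assumed) -> ~16.7 ns
```

Even with 20% more raw bandwidth, such a DDR5 kit would wait more than twice as long for the first word back, which is the point being made here.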
Posted on Reply
#10
DeathtoGnomes
dgianstefani
Better than doing a Samsung and calling updated "12nm" "8nm".

You lads like to shit on Intel for everything they do, but they managed to be competitive for five years on revisions of the same process which is a testament to their engineers.

It's been proven time again that Intels 14nm++ is as good as other companies 10nm, and their 10nm+ is as good as TSMC 7nm.

Calm down and understand all the "nm" numbers are just marketing numbers and don't reflect actual transistor measurements. What's important is the performance.
People don't shit on Intel unless Intel screws the pooch; their 10nm is a fine example. The rest of them are Ryzen fanbois, which is perfectly normal and somewhat balanced against the Intel fanboi collective that shits on AMD at every opportunity. So this PR is to calm the savage natives (Team Intel).

Intel's 14nm is only good for speeds, not for heat. TSMC's 7nm has proven to run cooler and more efficiently for AMD, so why can't Intel provide the same?
Posted on Reply
#11
Cobain
DeathtoGnomes
People don't shit on Intel unless Intel screws the pooch; their 10nm is a fine example. The rest of them are Ryzen fanbois, which is perfectly normal and somewhat balanced against the Intel fanboi collective that shits on AMD at every opportunity. So this PR is to calm the savage natives (Team Intel).

Intel's 14nm is only good for speeds, not for heat. TSMC's 7nm has proven to run cooler and more efficiently for AMD, so why can't Intel provide the same?
That must explain why a 10700K reaches lower temperatures compared to a 5800X...

Posted on Reply
#12
dgianstefani
DeathtoGnomes
People don't shit on Intel unless Intel screws the pooch; their 10nm is a fine example. The rest of them are Ryzen fanbois, which is perfectly normal and somewhat balanced against the Intel fanboi collective that shits on AMD at every opportunity. So this PR is to calm the savage natives (Team Intel).

Intel's 14nm is only good for speeds, not for heat. TSMC's 7nm has proven to run cooler and more efficiently for AMD, so why can't Intel provide the same?
You're wrong. Intel's 14nm can actually be better for heat despite using more power than Ryzen's 7nm, because the die is much larger, allowing easier thermal transfer to the cooler.
Posted on Reply
#13
ratirt
Cobain
That must explain why a 10700K reaches lower temperatures compared to a 5800X...

Yes, it runs cooler. The 10700K die is bigger than the 5800X's, so it is obvious the heat dissipation is better for the 10700K (more surface area to shed heat), and yet the 10700K still uses more energy than the 5800X in both multi-threaded and single-threaded work, by a noticeable margin. Heat is not everything, you know, and you shouldn't stick to just one variable here. Even in gaming, the difference in power usage between the two is noticeable.
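The die-size argument can be put into rough numbers. The die areas and power draws below are approximate public figures, not data from this thread, so treat this as a sketch:

```python
def power_density_w_mm2(watts: float, die_mm2: float) -> float:
    """Heat flux the cooler has to pull through the die surface."""
    return watts / die_mm2

# ~200 mm^2 Comet Lake die at its 125 W PL1 (approximate figures)
print(power_density_w_mm2(125, 200))  # 0.625 W/mm^2

# ~81 mm^2 Zen 3 CCD, assuming ~100 W of package power lands on the cores
print(power_density_w_mm2(100, 81))   # ~1.23 W/mm^2
```

The smaller 7 nm die concentrates its heat in far less area, which is how it can report higher temperatures while drawing less total power.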
Posted on Reply
#14
dgianstefani
Power usage is irrelevant on a desktop as long as temperatures are fine and the PSU can comfortably handle it. The only things that matter are component temperatures, performance and noise.
Posted on Reply
#15
ratirt
dgianstefani
Power usage is irrelevant on a desktop as long as temperatures are fine and the psu can comfortably handle it. Only things that matter are component temperatures, performance and noise.
So temperature is irrelevant as well. You can always slap water cooling on the CPU and the problem is solved.
Noise, performance and temp fixed.
Posted on Reply
#16
dgianstefani
Reading comprehension. I specifically said temperature is what matters, and it's not linearly linked to power usage, as you can clearly see from the 5800X/10700K comparison.
ratirt
So temperature is irrelevant as well. You can always slap a water cooling on the CPU and problem solved.
Noise, performance and temp fixed.
Posted on Reply
#17
londiste
ratirt
Even in gaming the difference in power usage is noticeable between the two.
9W according to the same TPU review :)

Edit:
ratirt
Really? Because here (according to TPU) the difference for gaming between the two is about 35 W, stock vs stock.
My bad. I looked at the OC number for the 5800X.
25 W is not too bad of a difference either, though.
Posted on Reply
#18
ratirt
dgianstefani
Reading comprehension. I specifically said temperature is what matters, and it's not linearly linked to power usage, as you can clearly see from the 5800X/10700K comparison.
I read it, no worries, and I disagree with your statement, with a sarcastic pun. If energy usage is irrelevant in a desktop, then so are temperature, noise and performance, because you can always make those better.
londiste
9W according to the same TPU review
Really? Because here (according to TPU) the difference for gaming between the two is about 25 W, stock vs stock.
Posted on Reply
#19
dgianstefani
ratirt
I read it, no worries, and I disagree with your statement, with a sarcastic pun. If energy usage is irrelevant in a desktop, then so are temperature, noise and performance, because you can always make those better.

Really? Because here (according to TPU) the difference for gaming between the two is about 35 W, stock vs stock.
Energy usage is irrelevant as long as it doesn't affect the things that actually matter: the performance of the system (temperature), the experience of the user (noise) or the longevity of the hardware (a good enough PSU). Desktops are plugged into mains power, so even a couple hundred watts is irrelevant.
Posted on Reply
#20
qubit
Overclocked quantum bit
Releasing a product that will already be in the shadow of its successor a few months later sounds more like a way to help Intel's bottom line than a compelling proposition for the customer. My advice, therefore, is not to upgrade to RL unless you really need a gaming PC right now.

RL has something like a 19% IPC improvement over its predecessor, and AL will be another similar amount over RL, so by waiting it out you're looking at a 40-50% improvement over the current generation. That's well worth the wait in my book.

Look, I've got an ancient 2700K paired with a 2080 SUPER, which can still play all my games very well. Even CoD:MW plays smoothly, although it doesn't reach the 144 Hz framerate I would like, so there's no urgency there. For newer CPUs, the need to upgrade is even less compelling. You may need newer mobo and CPU features, but that's another story.
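The compounding above checks out: two ~19% IPC steps multiply rather than add. A quick sanity check (the 19% figures are the rumored gains quoted in the comment, not measurements):

```python
rkl_gain = 1.19              # rumored Rocket Lake IPC step over current gen
adl_gain = 1.19              # assume a similar step again for Alder Lake
combined = rkl_gain * adl_gain - 1
print(f"{combined:.1%}")     # 41.6%
```

That lands toward the lower end of the 40-50% range mentioned.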
Posted on Reply
#21
ratirt
dgianstefani
Energy usage is irrelevant as long as it doesn't affect the things that actually matter: the performance of the system (temperature), the experience of the user (noise) or the longevity of the hardware (a good enough PSU). Desktops are plugged into mains power, so even a couple hundred watts is irrelevant.
None of the things you mentioned affect anything with the Ryzen 5800X. 75 °C is the temperature this chip operates at, and it does not diminish its performance, make the system louder, or reduce its longevity or lifespan, if you will. It can operate normally at this temp, no problem. So again, temp is irrelevant as long as it is within the range of the specification.
Posted on Reply
#22
dgianstefani
Wrong again. Lower temperatures correlate with lower voltage leakage and better component longevity, as well as higher overclocking potential. Additionally, lower temperatures in a less sensitive component can positively affect nearby components, for example RAM, which is very temperature-sensitive.

Additionally, Zen CPUs have been shown to boost adaptively based on temperature even without a manual OC.
Posted on Reply
#23
ratirt
dgianstefani
Wrong again. Lower temperatures correlate with lower voltage leakage and better component longevity, as well as higher overclocking potential. Additionally, lower temperatures in a less sensitive component can positively affect nearby components, for example RAM, which is very temperature-sensitive.
Now you are reaching for other variables to justify your premise? Like higher clocks and lower voltage? Put a water block on it and problem solved.
Posted on Reply
#24
ratirt
dgianstefani
You're still arguing? Lmao.
Arguing is irrelevant here, since you pick random variables to prove the obvious while still making a flawed argument.
So, just like CPU power consumption is irrelevant as long as you have a desktop PC ("your words, bro"), temp is irrelevant as long as you have a good cooling solution in your desktop PC.
dgianstefani
Wrong again. Lower temperatures correlate with lower voltage leakage and better component longevity, as well as higher overclocking potential. Additionally, lower temperatures in a less sensitive component can positively affect nearby components, for example RAM, which is very temperature-sensitive.

Additionally, Zen CPUs have been shown to boost adaptively based on temperature even without a manual OC.
And to add to this, no OC is guaranteed. The boost for the 5800X up to 4.7 GHz, on the other hand, is within the temperature spec given by AMD. If you hit 70 or 80 °C you will still boost to 4.7 GHz; the temp does not make it run below specification. It can run higher, though, but that depends on silicon quality.
Posted on Reply
#25
Cobain
ratirt
Yes, it runs cooler. The 10700K die is bigger than the 5800X's, so it is obvious the heat dissipation is better for the 10700K (more surface area to shed heat), and yet the 10700K still uses more energy than the 5800X in both multi-threaded and single-threaded work, by a noticeable margin. Heat is not everything, you know, and you shouldn't stick to just one variable here. Even in gaming, the difference in power usage between the two is noticeable.
Depends on whether you consider 25 watts more to be "noticeably higher".

Posted on Reply