Intel "Rocket Lake" an Adaptation of "Willow Cove" CPU Cores on 14nm?

I would definitely dare someone WITH a PhD in CS as pure software engineers know fuck all about hardware

You need to be really out of touch to actually believe that.

Whether you like it or not, if you've been a CS student you will have been exposed to how the hardware functions. There is no such thing as a pure software engineer; you have to know hardware too (to varying levels). Unless we're talking about people making websites and such, but I wouldn't call those engineers, and they are also not likely to have a PhD in CS.

The only people who are technically not required to know anything hardware-wise would be those who dwell in the theoretical side of things; however, those wouldn't really be engineers so much as mathematicians, and they probably do have a PhD in CS, ironically. Still, the presumption that they'd all be clueless is nothing short of arrogant, ignorant, or both.
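To make the "you have to know hardware too" point concrete, here is a minimal C++ sketch (my own illustration, not from anyone in the thread): the two traversals do identical arithmetic, but the one that walks memory the way the cache expects is typically several times faster on a real CPU.

#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int n = 2048;                       // 2048 x 2048 matrix of doubles (~32 MB)
    std::vector<double> a(n * n, 1.0);

    auto time_sum = [&](bool row_major) {
        const auto t0 = std::chrono::steady_clock::now();
        double sum = 0.0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                sum += row_major ? a[i * n + j]   // walks memory sequentially (cache-friendly)
                                 : a[j * n + i];  // jumps n doubles each step (cache-hostile)
        const auto t1 = std::chrono::steady_clock::now();
        const long long ms =
            std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
        std::printf("%-12s sum=%.0f  %lld ms\n",
                    row_major ? "row-major" : "column-major", sum, ms);
    };

    time_sum(true);   // same arithmetic both times...
    time_sum(false);  // ...only the memory access pattern changes
}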
 
Sooner or later, Intel is going to have to transition to a new fab size. They've really dropped the ball here and while it's amusing to watch the powerhouse get beaten down by the underdog, it's going to eventually lead to AMD stagnating, too.

We don't need a single best player. We need multiple options. It's getting ridiculous that Intel is THIS badly out of position...
 
It's because 10nm is ramping up and we will soon have a plethora of 10nm Intel CPUs for desktop PCs. Right?

Anyway, Intel will try to keep the crown for gaming and programs that don't scale past 8 cores/16 threads. At least they have a plan: leave the market that needs more cores to AMD and hold on to what they can.

Pretty good plan because there isn't much that needs more than 8 cores.
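As a rough illustration of why "not much needs more than 8 cores" (assumed numbers, not measurements), Amdahl's law caps the speedup by the serial part of a workload; with, say, 70% of a game's frame time parallelizable, going from 8 to 16 cores barely moves the needle.

#include <cstdio>

// Amdahl's law: S = 1 / ((1 - p) + p / n), where p is the parallel fraction
double speedup(double parallel_fraction, int cores) {
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
}

int main() {
    const double p = 0.70;  // assumed parallel fraction for a typical game workload
    for (int cores : {4, 8, 16, 32})
        std::printf("%2d cores -> %.2fx speedup\n", cores, speedup(p, cores));
    // With p = 0.70 this prints roughly 2.1x, 2.6x, 2.9x, and 3.1x:
    // most of the benefit is already captured by 8 cores.
}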

Let's see what their options are: use the newer node and face potential performance regressions due to lower clocks and worse yields, or use the older node, where they could probably squeeze out more performance with better yields, but power consumption runs rampant.
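A back-of-the-envelope sketch of the power side of that trade-off, using the standard dynamic power relation P ≈ C * V^2 * f (the clocks and voltages below are assumptions for illustration, not Intel specs): squeezing ~10% more clock out of the old node usually needs more voltage too, so power grows much faster than performance.

#include <cstdio>

// Dynamic switching power: P ≈ C * V^2 * f
double dynamic_power(double capacitance, double voltage, double freq_ghz) {
    return capacitance * voltage * voltage * freq_ghz;
}

int main() {
    const double c = 1.0;                        // arbitrary capacitance unit
    const double base = dynamic_power(c, 1.20, 4.7);  // assumed baseline: 4.7 GHz @ 1.20 V
    const double oc   = dynamic_power(c, 1.35, 5.2);  // assumed pushed part: 5.2 GHz @ 1.35 V
    std::printf("clock gain: %+.1f%%, power gain: %+.1f%%\n",
                (5.2 / 4.7 - 1.0) * 100.0, (oc / base - 1.0) * 100.0);
    // ~ +10.6% clock for ~ +40% dynamic power with these assumed numbers.
}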

It's not great. I haven't looked forward with much excitement to anything Intel-related in the last few years, and this only deepens my discontent with them.

I think it will be quite interesting to see what they pull out of their hat. They may be able to get power consumption somewhat tamed with a new arch; the node is not everything.

Woopsie.

I'll say it again: 10nm performance chips are not happening.

lol...I feel like saying the same thing about Big Navi. My Vega 56 is old dammit...
 
2020: another iteration of the 14nm node. No 10nm. I'm not surprised.
 
The 14nm++++ Lakes are way too refined and have become too good (for desktop TDP), kinda like Windows XP was.
A 10nm desktop SKU simply won't live up to expectations if compared to the well-polished 14nm++++++++ (like Vista was to XP).

That is a pretty good point actually: Intel is practically making its own rumored 10nm desktop line obsolete, not just with time but with its own releases.

But then again, that also says a lot about the potential gains of 10nm and makes you wonder why Intel kept pursuing it.
 
Too early for 7nm dead memes?
 