
Intel 12th Gen Core Alder Lake to Launch Alongside Next-Gen Windows This Halloween

btarunr

Editor & Senior Moderator
Intel is likely targeting a Halloween (October 2021) launch for its 12th Generation Core "Alder Lake-S" desktop processors, alongside the next-generation Windows PC operating system, which is being referred to in the press as "Windows 11," according to "Moore's Law is Dead," a reliable source of tech leaks. This launch timing is key, as the next-gen operating system is said to feature significant changes to its scheduler to make the most of hybrid processors (processors with two kinds of CPU cores).

The two CPU core types on "Alder Lake-S," the high-performance "Golden Cove" and the low-power "Gracemont," operate in two entirely different performance-per-watt bands and come with different ISA feature-sets. The OS needs to be aware of these, so it knows exactly when to wake up the performance cores, and what kind of processing traffic to send to which kind of core. Microsoft is expected to unveil this next-gen Windows OS on June 24, with retail availability expected in Q4 2021.
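As a rough illustration of the topology information such a scheduler needs, here is a sketch (my own, not from the article) that classifies cores by the maximum frequency Linux reports in sysfs; on a hybrid chip, the performance cores report the higher cap:

```python
import glob
import re

def core_max_freqs():
    """Read each core's max frequency (kHz) from Linux sysfs.
    Returns an empty dict where cpufreq is unavailable (e.g. many VMs)."""
    freqs = {}
    for path in glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq"
    ):
        cpu = int(re.search(r"cpu(\d+)", path).group(1))
        with open(path) as f:
            freqs[cpu] = int(f.read().strip())
    return freqs

def classify_cores():
    """Crudely split cores into performance/efficiency buckets by max
    frequency -- a stand-in for what a hybrid-aware scheduler knows."""
    freqs = core_max_freqs()
    if not freqs:
        return {"performance": [], "efficiency": []}
    top = max(freqs.values())
    return {
        "performance": sorted(c for c, f in freqs.items() if f == top),
        "efficiency": sorted(c for c, f in freqs.items() if f < top),
    }
```

A real hybrid-aware scheduler gets far richer hints than this (Intel calls its hardware feedback interface "Thread Director"), but frequency caps alone already show why the OS has to treat the two core types differently.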



View at TechPowerUp Main Site
 
sounds like some older games might end up stuttering with this idea... not much demand, so it fails to activate the performance core. Happens a lot on my GTX 1070 laptop unless I turn on discrete-graphics-only in the mobo BIOS. Win 10 still can't decide which GPU I want to run older games on... so not much faith in this...

yes, you may say it doesn't matter, but that's not true if I'm trying to get super high frames from older games, in which case I would need the higher-performing cores. I'm sure it will work fine for modern games, but eh, I'm skeptical.
 
Make no mistake. This design decision is not for us. It's for mobile users, tablets, and other users on battery power. Here's hoping AMD has the sense to make actual desktop chips for desktop users. Or rather keep making them.
 
I wonder if Windows 11 will be free like Windows 10.
 

Halloween :wtf:


Yeah probably reminds Intel of their 10nm horror run :slap:
 
Useful for mobile, not so much for desktops, but it seems like both Intel and AMD will adopt the big.LITTLE approach soon

I think we will see a fair share of issues with programs using the wrong cores :laugh:

This is not something to buy in its 1st generation
 
I keep hearing that but where's the evidence? Is this rumor floated by the same YT clickbaiters?

How do you expect evidence when it's not official? If "Windows 11" is going to be re-made from the ground up to be optimized for the big.LITTLE approach, AMD is pretty much forced to do it, at least for mobile chips, or Intel will destroy them in efficiency
 
By evidence I mean something like roadmaps or at least semi credible leaks, btw is it the usual (YT) suspects who "broke" this story?
 

Halloween :wtf:


Yeah probably reminds Intel of their 10nm horror run :slap:

The lineup uses the same node as TGL; there's zero reason for any further delays. Your sarcasm is misplaced.
 
TGL(U) is a lot smaller than ADL, not to mention clocks. The latter will need all the help it can get.
 
sounds like some older games might end up stuttering with this idea... not much demand, so it fails to activate the performance core. Happens a lot on my GTX 1070 laptop unless I turn on discrete-graphics-only in the mobo BIOS. Win 10 still can't decide which GPU I want to run older games on... so not much faith in this...

yes, you may say it doesn't matter, but that's not true if I'm trying to get super high frames from older games, in which case I would need the higher-performing cores. I'm sure it will work fine for modern games, but eh, I'm skeptical.

But presumably this Windows update is specifically meant to address this problem, and really I don't see it as an issue.
If we look at this really simply, I would say MS would make it so that any program defaults to the performance cores, and that a program specifically has to tell the OS it can run on the, erm, eco cores to actually make use of them.

The OS itself has plenty of background tasks that can be relegated to the eco cores, and I would imagine browsers and programs like Word would do fine on them as well.

I can even imagine a little menu in Windows where users can drag programs from the performance-core section to the eco cores, then start the program and see how it runs. If it works well, that info can be sent back to Microsoft, who can then contact the program's devs about an update that runs it on the eco cores as standard, or perhaps gives users a little in-program switch for those cores.
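Something like that drag-to-eco-cores idea can already be approximated today with CPU affinity. A hedged sketch (Linux-only; the core numbering below is purely an assumption for illustration, real hybrid topologies differ):

```python
import os

# Assumed layout for illustration only: cores 0-7 are "performance",
# cores 8-11 are "eco". A real tool would read the actual topology.
PERF_CORES = set(range(0, 8))
ECO_CORES = set(range(8, 12))

def pin_to_cores(pid, cores):
    """Restrict a process to the given cores (pid 0 = current process),
    clamped to the cores that actually exist on this machine."""
    available = os.sched_getaffinity(pid)
    target = (cores & available) or available  # fall back if none exist
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)
```

`pin_to_cores(0, ECO_CORES)` would confine the current process to the assumed eco cores; the Windows equivalent of this per-process knob is `SetProcessAffinityMask`.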
 
really I don't see it as an issue.
The issue is that the big cores will support something like AVX-512 while the little cores will not. This will be a problem, especially if the scheduler moves a workload across different cores. This is not an issue on ARM.
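This is also why portable software guards ISA features at runtime rather than assuming them. A minimal sketch (Linux, reading /proc/cpuinfo; the kernel names returned are illustrative labels, not a real dispatch table):

```python
def cpu_flags(path="/proc/cpuinfo"):
    """Return the feature flags Linux reports for the first CPU."""
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass
    return set()

def pick_kernel():
    """Dispatch to the widest code path the CPU advertises. On a hybrid
    chip, a flag present on big cores but absent on little ones is exactly
    the trap above: a thread migrated mid-stream would fault."""
    flags = cpu_flags()
    if any(f.startswith("avx512") for f in flags):
        return "avx512"
    if "avx2" in flags:
        return "avx2"
    return "scalar"
```

The catch on a hybrid part is that this check is only valid for the core it ran on, which is presumably why the OS scheduler, not the app, has to own the problem.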
 
SaaSy little Wintel
 
The OS was the main problem with previous concepts like dual-core CPUs, Bulldozer, multi-NUMA Threadripper, or Lakefield... where Linux was able to adapt relatively fast, Microsoft failed across the board, badly on the consumer OS and somewhat less so on the enterprise OS.
I hope MS did their homework and that Intel did not put too much trust in MS.
 
This means that AMD will launch something around Halloween too, to steal the show, or at least some of the show. What could that be? AM5 doesn't seem to be that early.

I keep hearing that but where's the evidence? Is this rumor floated by the same YT clickbaiters?
Even if it's far from confirmed, it's quite believable. For notebook APUs it's very believable. For the desktop ... it depends on how much, and how fast, AMD will converge the development of notebook and desktop Ryzens.
 
sounds like some older games might end up stuttering with this idea... not much demand, so it fails to activate the performance core. Happens a lot on my GTX 1070 laptop unless I turn on discrete-graphics-only in the mobo BIOS. Win 10 still can't decide which GPU I want to run older games on... so not much faith in this...

yes, you may say it doesn't matter, but that's not true if I'm trying to get super high frames from older games, in which case I would need the higher-performing cores. I'm sure it will work fine for modern games, but eh, I'm skeptical.
Old games will be served by the new scheduler in Windows, which will detect the type of app and assign it to the right core.

If you are playing Need for Speed: Underground, FOR SURE you don't need to wake up the big cores, since even a little core is similar to a Haswell core.

This means that AMD will launch something around Halloween too, to steal the show, or at least some of the show. What could that be? AM5 doesn't seem to be that early.


Even if it's far from confirmed, it's quite believable. For notebook APUs it's very believable. For the desktop ... it depends on how much, and how fast, AMD will converge the development of notebook and desktop Ryzens.
AMD will launch Zen 3+ on 6 nm with V-Cache.

The issue is that the big cores will support something like AVX-512 while the little cores will not. This will be a problem, especially if the scheduler moves a workload across different cores. This is not an issue on ARM.
this will be an issue with apps that use AVX-512, but most of them don't. And the ones that do will either use only the HP cores, or whatever scheme Intel has come up with.
I find it funny how some random people on forums think they already know better than the engineers who have worked on this for a few years.
 
I keep hearing that but where's the evidence? Is this rumor floated by the same YT clickbaiters?
Yes.

However, AMD is 3D stacking chiplets which makes the big.LITTLE approach irrelevant for at least 1 generation. 15% gains, if true, offer AMD an ace up their sleeve. Smaller nodes and power optimizations will offset the advantage of Intel's big.LITTLE design in no time.
 
Old games will be served by the new scheduler in Windows, which will detect the type of app and assign it to the right core.

If you are playing Need for Speed: Underground, FOR SURE you don't need to wake up the big cores, since even a little core is similar to a Haswell core.

if you are trying to get 240 Hz / 240 fps at 1080p in older games - you need more than Haswell.
 
if you are trying to get 240 Hz / 240 fps at 1080p in older games - you need more than Haswell.
Old games notoriously have poor support for anything above 60 fps, and some were still tied to CPU clock speed. They also have poor widescreen support and often just stretch everything instead of scaling properly. Unless you mean something else by older games, Haswell is plenty.
 
Old games notoriously have poor support for anything above 60 fps, and some were still tied to CPU clock speed. They also have poor widescreen support and often just stretch everything instead of scaling properly. Unless you mean something else by older games, Haswell is plenty.

oh? I just beat Aliens vs. Predator and it has unlimited FPS if you do the Alt+Enter 3x trick. many games that are quite old actually support unlimited frames. it's mainly console ports over the years that don't... but keep telling yourself that.
 
oh? I just beat Aliens vs. Predator and it has unlimited FPS if you do the Alt+Enter 3x trick. many games that are quite old actually support unlimited frames. it's mainly console ports over the years that don't... but keep telling yourself that.
I don't need to keep telling myself anything. By poor support I mean that you can't expect any old game to work as expected, and by old game I mean early-2000s titles that might not even launch on Windows 10 without extensive tweaking. OG Halo may run at super high fps, but its animations were locked at 30 fps, for example. Colin McRae Rally 2005 is partially tied to CPU speed and doesn't have real widescreen support. Crazy Taxi is locked at 60 fps without widescreen support. Aliens vs. Predator isn't particularly old; it was released in 2010, so it's no surprise that it works fine.
 
sounds like some older games might end up stuttering with this idea... not much demand, so it fails to activate the performance core. Happens a lot on my GTX 1070 laptop unless I turn on discrete-graphics-only in the mobo BIOS. Win 10 still can't decide which GPU I want to run older games on... so not much faith in this...

yes, you may say it doesn't matter, but that's not true if I'm trying to get super high frames from older games, in which case I would need the higher-performing cores. I'm sure it will work fine for modern games, but eh, I'm skeptical.
I haven't seen the specs of how Windows intends to do scheduling on these new hybrid designs, but pretty much any user application (especially games) would have to run on the high-performance cores all the time. Games are heavily synchronized, and even a lightly loaded thread that causes delays can produce serious stutter, or in some cases make the game glitch.

With dual graphics in laptops you're pretty much in double jeopardy anyway, as you both get the stutter problems of integrated graphics and the extra latency of the dedicated GPU streaming the picture back to the integrated one, in addition to the potential problem of running the game on the "wrong" GPU. If you want a completely smooth gaming experience, you have to stick to desktops.
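The synchronization point is easy to demonstrate: a frame that joins all of its worker threads is only as fast as the slowest one, so a single thread stuck on a slow core stalls the whole frame. A toy sketch (sleeps stand in for per-frame work):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def frame_time(task_durations):
    """Simulate one game frame that waits on all of its worker tasks;
    the frame time is dictated by the slowest task, not the average."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=len(task_durations)) as pool:
        list(pool.map(time.sleep, task_durations))
    return time.perf_counter() - start
```

`frame_time([0.001, 0.001, 0.05])` takes roughly 0.05 s, not 0.001 s: one "slow" worker, say one parked on an efficiency core, drags every frame with it.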

Make no mistake. This design decision is not for us. It's for mobile users, tablets, and other users on battery power. Here's hoping AMD has the sense to make actual desktop chips for desktop users. Or rather keep making them.
While it should be fairly obvious why hybrid CPUs make little practical sense for desktop users, most still overlook the primary motivation for doing this: marketing. Most desktops are sold through the big PC makers like Dell, HP, Lenovo, etc. Buyers of these machines (both personal and business) often look at "specs" when deciding to upgrade. Clock speeds have hit a wall, and boosting core counts with powerful cores in low-TDP models is not likely, so the "16 core," "5 GHz," "65 W TDP" specs of the upcoming Alder Lake are mostly there to make it sell easier, never mind that most users will not understand how misleading these specs are.

How do you expect evidence when it's not official? If "Windows 11" is going to be re-made from the ground up to be optimized for the big.LITTLE approach, AMD is pretty much forced to do it, at least for mobile chips, or Intel will destroy them in efficiency
Why?
HEDT/high-end workstation and server CPUs will not have a hybrid design, so do you think Microsoft wants to sabotage the performance of power users and servers?
If AMD feels "forced" to do it, it's because of marketing.

Old games will be served by the new scheduler in Windows, which will detect the type of app and assign it to the right core.
Where is the spec for this?
How would the OS detect the "type" of an application? (And how would a programmer set a flag(?) to make it run differently?)

If you are playing Need for Speed: Underground, FOR SURE you don't need to wake up the big cores, since even a little core is similar to a Haswell core.
The small cores will be very slow and, if the details we've seen are right, will share an L2 cache, which means high latency. They will be totally unsuitable for even an "older" game.

AMD will launch Zen 3+ on 6 nm with V-Cache.
Have we got any confirmation on this beyond some APUs?

However, AMD is 3D stacking chiplets which makes the big.LITTLE approach irrelevant for at least 1 generation. 15% gains, if true, offer AMD an ace up their sleeve. Smaller nodes and power optimizations will offset the advantage of Intel's big.LITTLE design in no time.
I sure hope AMD would be smarter and skip it altogether. I would rather have more cache.
 