
Intel Confirms Core Ultra "Lunar Lake" Packs 45 TOPS NPU, Coming This Year

btarunr

Editor & Senior Moderator
Intel, at its VISION conference, confirmed that its next-generation processor for the ultraportable and thin-and-light segments, the Core Ultra "Lunar Lake," will feature an over four-fold increase in NPU performance, rated at 45 TOPS. This is a significant figure, as Microsoft recently announced that Copilot will perform several tasks locally (on the device), provided the machine has an NPU capable of at least 40 TOPS. The current AI Boost NPU found in Core Ultra "Meteor Lake" processors is no faster than 10 TOPS, and the current AMD Ryzen 8040 series features a Ryzen AI NPU with 16 TOPS on tap. AMD's upcoming Ryzen "Strix Point" processor is rumored to feature 40 TOPS-class NPU performance similar to "Lunar Lake."
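The arithmetic behind those claims is easy to check. A quick sketch, using the TOPS figures quoted above (the precision behind each vendor's rating is an assumption, typically INT8) and Microsoft's reported 40 TOPS floor for on-device Copilot:

```python
# Back-of-envelope check of the NPU figures quoted above.
COPILOT_LOCAL_FLOOR = 40  # TOPS Microsoft reportedly requires for on-device Copilot

npus = {
    "Meteor Lake (AI Boost)": 10,
    "Ryzen 8040 (Ryzen AI)": 16,
    "Lunar Lake (claimed)": 45,
}

for name, tops in npus.items():
    print(f"{name}: {tops} TOPS, meets the 40 TOPS floor: {tops >= COPILOT_LOCAL_FLOOR}")

# The generational uplift Intel is claiming:
print(f"Lunar Lake vs Meteor Lake: {45 / 10:.1f}x")  # 4.5x, i.e. "over four-fold"
```

Of the three, only the claimed "Lunar Lake" figure clears Microsoft's bar.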

Intel also confirmed that notebooks powered by Core Ultra "Lunar Lake" processors will hit the shelves by Christmas 2024 (December). These notebooks will feature not just the 45 TOPS NPU, but will also debut Intel's Arc Xe2 "Battlemage" graphics architecture as the processor's integrated graphics solution. With Microsoft seriously pushing to standardize AI assistants, the new crop of notebooks could also feature Copilot as a dedicated key on their keyboards, similar to the Win key that brings up the Start menu.



View at TechPowerUp Main Site | Source
 
So, how long until data brokers use that AI "assistant" to violate privacy and line their pockets?
 
Can someone educate me on why it's important to run Copilot locally instead of in the cloud? In my opinion, we have only been getting closer and closer to cloud computing, but this seems to be going the other direction. Are Microsoft's servers not better than an ultraportable laptop CPU?
 
Can someone educate me on why it's important to run Copilot locally instead of in the cloud? […]
Hi,
Local AI would be best in a company environment, searching the company's own closed data for employees.

Copilot in general is just another word for web search, so I have no idea what "local" means in that respect, seeing as I doubt anyone would want AI storing web searches in bulk on their machine. Although MS doesn't allow Edge to delete history on close, so they do love storing it locally, same for OS usage data hehe

Bottom line: "cloud" is just a buzzword, like "AI" is, unless it's doing something to improve photos/videos/gaming at the user's request.
 
>>...NPU performance, which will be as fast as 45 TOPs...

It doesn't say whether this is at the INT8 data type, but I think it is.
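For context, an NPU's TOPS rating is usually peak INT8 throughput: MAC units × 2 ops per MAC per cycle (multiply + add) × clock. A rough sketch, where the MAC count and clock are illustrative guesses, not Intel's disclosed figures:

```python
def peak_tops(mac_units: int, clock_ghz: float, ops_per_mac: int = 2) -> float:
    """Peak TOPS = MACs x ops/MAC/cycle (multiply + add = 2) x clock (GHz) / 1000."""
    return mac_units * ops_per_mac * clock_ghz / 1000

# Illustrative only: one MAC/clock combination that lands near 45 INT8 TOPS.
print(round(peak_tops(mac_units=16384, clock_ghz=1.4), 1))  # ~45.9
```

Note that a rating at INT4 would double for the same hardware, which is why the data type matters when comparing vendors' numbers.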
 

duraz0rz

Trust this random person on the Internet
Can someone educate me on why it's important to run Copilot locally instead of in the cloud? […]
Pretend you asked Copilot to "open all PDFs related to hardware in my Documents folder", and that it can handle this query. If this query were executed in the cloud, you would need to send MS the list of files in that folder plus the list of programs that can handle PDFs. It may have to respond "What program do you want these files to open in?" if you have multiple programs that can handle PDFs. All of this costs bandwidth and CPU cycles (the cloud has to wait for your response), adds the complexity of running over a network, invites resource contention, and raises security/privacy issues with sending your personal data over the Internet.

Running Copilot locally on a dedicated AI processor circumvents all of those issues. It also helps tailor it to the things you specifically do. The model they ship to your PC is trained against an initial data set, probably with a lot of the most common tasks or requests they've gathered through their metrics or feedback channels. There will be scenarios the model hasn't encountered and needs to be trained on. That training data could be sent back to Microsoft anonymously and incorporated into the general model that's then shipped out to everyone else.
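A toy sketch of the point above: a file-system query like this can be served entirely on-device, with nothing leaving the machine. `find_pdfs` is a made-up name for illustration, not Copilot's actual API:

```python
from pathlib import Path

def find_pdfs(folder: str, keyword: str) -> list[str]:
    # Everything here touches only the local filesystem: no file listing has
    # to be uploaded to the cloud, and no answer has to round-trip a network.
    return sorted(
        str(p)
        for p in Path(folder).glob("*.pdf")
        if keyword.lower() in p.stem.lower()
    )
```

The interesting (and NPU-heavy) part is mapping the natural-language request to a call like this; the file access itself is trivially local.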
 
Pretend you asked Copilot to "open all PDFs related to hardware in my Documents folder", and that it can handle this query. […]
TL;DR translation: we're gonna be folding@home for MS :)
 
Can someone educate me on why it's important to run Copilot locally instead of in the cloud? […]

Pretend you asked Copilot to "open all PDFs related to hardware in my Documents folder", and that it can handle this query. […]
Also, there are programs that already exist today that use the NPU to run AI tasks on local files. DxO's software uses it for RAW files, so doing that "in the cloud" would mean uploading all those RAW files and then pulling them back down. RAW files are about a MB per megapixel, and today's cameras are anywhere from 20 to 100 MP, maybe more. Even with some really fast internet, that's going to turn a 4-second task into something considerably longer. Some stuff is better in the cloud, no doubt, but I suspect anything manipulating local files is ideally performed locally.
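The bandwidth math behind that example is easy to sketch. The ~1 MB-per-megapixel rule of thumb comes from the post; the 100 Mb/s uplink is an assumed, fairly generous home connection:

```python
def upload_seconds(megapixels: float, num_files: int, uplink_mbps: float) -> float:
    # ~1 MB per megapixel and 8 bits per byte give the size in megabits;
    # divide by the uplink speed in megabits per second.
    size_megabits = megapixels * 1.0 * 8 * num_files
    return size_megabits / uplink_mbps

# A single 45 MP RAW over a 100 Mb/s uplink:
print(upload_seconds(45, 1, 100))  # 3.6 s of upload alone, before any processing
```

The cost scales linearly with file count and megapixels, so a batch of a hundred shots quickly dwarfs a few seconds of local NPU time, and you still have to download the results.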
 
I'm done with Intel mobile processors. It's just not worth buying them until Intel tells Dell, HP and Lenovo to piss off with the external embedded controllers overriding the CPU for throttling. Except that's not gonna happen.

My Ryzen laptop is such bliss. Completely throttle-free, and it works exactly to AMD's specification. That's how good design is done.
 
In a TL DR translation: We're gonna be folding@home for MS :)
Hi,
Not so far-fetched, actually, seeing as they already did this with Windows 10 updates when it first came out: sharing or receiving updates from other PCs on the internet instead of just from MS servers hehe

Sure, sharing within your household's local network can be nice, but the fact is they enabled internet sharing by default at one point without notice.
 