Monday, December 5th 2022

Intel Plans Slew of "Raptor Lake" Product Announcements on January 3

On January 3, 2023, on the sidelines of the International CES, Intel could launch dozens upon dozens of 13th Gen Core processor SKUs spanning every conceivable client PC platform. The event is expected to bring "Raptor Lake" to the mobile platform, with announcements spanning the ultra-portable "Raptor Lake-U," the thin-and-light performance "Raptor Lake-P," the mainstream mobile "Raptor Lake-H," and the enthusiast mobile "Raptor Lake-HX." This would cover almost every mobile device segment, including the 7-9 W U-segment, the 15-28 W P-segment, the 35-45 W H-segment, and the >55 W HX-segment, with several SKUs per segment across the four brand extensions (Core i3 through Core i9).

For the desktop platform, Intel is expected to announce its new Core i9-13900KS "Raptor Lake-S" flagship, the world's first 6 GHz processor, alongside a large number of 13th Gen Core "Raptor Lake-S" non-K SKUs in the 65 W TDP class, which will form the bulk of the company's desktop processor lineup. Besides these, the company will also announce the more affordable Intel H770 and B760 desktop motherboard chipsets, bringing down platform costs (the 13th Gen Core processors are already supported on inexpensive motherboards based on the B660 and H670 chipsets via a BIOS update). Lastly, the company could announce "Raptor Lake" processors for the workstation segment, possibly compatible with the W780 chipset and supporting ECC DDR5 memory, among other features relevant to the commercial desktop and workstation segments.
Sources: Leaf_Hobby (Twitter), VideoCardz

20 Comments on Intel Plans Slew of "Raptor Lake" Product Announcements on January 3

#1
Lovec1990
Cannot wait to see the 13700 non-K tested. It looks better than the 13700K because it might not require serious cooling, and considering how the 12700 non-K handles itself, it's going to be a great option.
Posted on Reply
#2
TheoneandonlyMrK
Hahaaaaaa, is this the pre-slew announcement then.

I can't believe the PR chops on these guys.

I'm still waiting on Arc availability, so months of Raptor Lake PR is likely to be a chuckle fest. Bring it on, Intel.
The parts, not the PR.
Posted on Reply
#3
Minus Infinity
Lovec1990Cannot wait to see the 13700 non-K tested. It looks better than the 13700K because it might not require serious cooling, and considering how the 12700 non-K handles itself, it's going to be a great option.
I don't get that; you can power-limit the 13700K. There's no need to run it at Intel's clocks, which are just there to win benchmarks, bugger the power. Sure, there's no sense paying for overclocking when you won't use it, but for now there is no 13700, and I hate how they delay these variants.
Posted on Reply
#4
Why_Me
Minus InfinityI don't get that; you can power-limit the 13700K. There's no need to run it at Intel's clocks, which are just there to win benchmarks, bugger the power. Sure, there's no sense paying for overclocking when you won't use it, but for now there is no 13700, and I hate how they delay these variants.
It's a money grab by Intel.
Posted on Reply
#5
Lovec1990
Minus InfinityI don't get that; you can power-limit the 13700K. There's no need to run it at Intel's clocks, which are just there to win benchmarks, bugger the power. Sure, there's no sense paying for overclocking when you won't use it, but for now there is no 13700, and I hate how they delay these variants.
If I power-limit the 13700K, wouldn't it lose performance, bringing the 13700 non-K closer to it?

If so, why buy the 13700K over the 13700 at all?
Posted on Reply
#7
Minus Infinity
Lovec1990If I power-limit the 13700K, wouldn't it lose performance, bringing the 13700 non-K closer to it?

If so, why buy the 13700K over the 13700 at all?
Like I said, availability. But now that we know the non-K models are coming soon, yeah, I'll wait for comparisons.
Posted on Reply
#8
Totally
TheoneandonlyMrKHahaaaaaa, is this the pre-slew announcement then.

I can't believe the PR chops on these guys.

I'm still waiting on Arc availability, so months of Raptor Lake PR is likely to be a chuckle fest. Bring it on, Intel.
The parts, not the PR.
Arc got cancelled. Tbf, the leaks, speculation, and hype are what killed it. They could have kept as quiet as possible about it and then made an announcement when it was ready, but instead they decided to make a bunch of promises and set a date.
Posted on Reply
#9
trsttte
TotallyArc got cancelled. Tbf, the leaks, speculation, and hype are what killed it. They could have kept as quiet as possible about it and then made an announcement when it was ready, but instead they decided to make a bunch of promises and set a date.
I might have missed something, but the A750 right beside me says otherwise.
Posted on Reply
#10
Totally
trsttteI might have missed something, but the A750 right beside me says otherwise.
Good for you. They're not making any more just because you have one. Whatever is out there is out there.
Posted on Reply
#11
Zendou
TotallyGood for you. They're not making any more just because you have one. Whatever is out there is out there.
This is one of those "extraordinary claims require extraordinary evidence" situations. ASRock and Acer are currently making cards as well, besides Intel, and Intel just posted updated drivers for the cards, as seen in the article on this site (www.techpowerup.com/301819/intel-arc-gpu-graphics-drivers-31-0-101-3959-released); it does not make sense to do that for an abandoned device. These cards (A750/A770) seem to have more support than the Radeon VII got.
Posted on Reply
#12
Ayhamb99
TotallyArc got cancelled. Tbf, the leaks, speculation, and hype are what killed it. They could have kept as quiet as possible about it and then made an announcement when it was ready, but instead they decided to make a bunch of promises and set a date.
Was there an official announcement stating that Arc was canceled? No, there wasn't, and the leaks about Arc being canceled were denied by Intel engineers themselves.

I really hope they do not end up canceling the entire lineup, though. For a completely new architecture and a first-generation lineup, Arc performed really well IMHO, considering the giant wall they had to climb and having to compete with NVIDIA and AMD, who have been releasing GPUs for decades. They just need more time to flesh out the drivers and architecture.
Posted on Reply
#13
trsttte
I'm sure when/if they decide to cancel Arc, we'll hear about it at the last possible moment; until then it will be hype and marketing as usual. But that moment hasn't come, contrary to all the pseudo-leaks and rants by morons like MooreLawIsDead.
Posted on Reply
#14
mechtech
So..... a dinosaur, a basketball team, a Ford truck, a meme, and now a CPU.

Waiting for the "yo dawg, I heard you like raptors" meme................
Posted on Reply
#15
Minus Infinity
Curious how big a slew is. Is it more than a couple or a few? Is it less than 10? We need to have metrics, goddammit.
Posted on Reply
#16
Lycanwolfen
Intel should just port over some of their Xeons to the desktop market to compete with AMD. It would ramp up their desktop market faster. Also, I think they need to drop the Core name entirely. Maybe go back to Pentium, or something new like Titan or Odin, or maybe Zeus. Raptor, huh? I wonder when the T-rex is coming out.
Posted on Reply
#17
tpu7887
"7-9 W U-segment, to the 15-28 W P-segment"

So the 15-watters are "P" now, eh?
I like the way things are going, at least as far as power envelope targets are concerned, lol.

For example, though... I have a Surface Laptop 2 with an 8250U, I believe. Whatever it is, the key takeaway of what follows is that I'm talking about a reasonably modern 15 W "U" series CPU. Most of the time it peaks at 10 watts, 12 if you're running it really hard. More than 15 watts is only ever hit with a combined GPU + CPU load, and to draw more than 18 watts you pretty much have to be running things for the purpose of drawing power. This is how the chip behaves after (it appears, at least) some firmware updates pushed by MS changed the 25 W target the laptop shipped with (if you didn't know, this is a thing that OEMs can, and do, do with U chips) to 15 watts.

Are these statements of U chips now taking 7-9 W mild "wishful thinking," essentially overstating the 10 W CPU + 5 W GPU that 15 W "U" chips currently kind of try to fit into, or is the performance (relative to how things have progressed) the same, but now drawing only 47% to 60% of the power?
Put another way: are the base clocks gimped?


Separately...
Does it really matter if the TDP of the chip is 7-9 W instead of 15 W? An example using the worst-case scenario of equal IPC between a Gen 1 16 W chip and a Gen 2 8 W chip: most of the time a mobile device's CPU utilization averages 10%. Say that takes you to 12.5% of TDP...

16 watt chip takes 2 watts
8 watt chip takes 1 watt

The matte 13" 16:10 display (thank god 16:9 died - an entire decade of laptops 2009-2019 were all complete garbage - couldn't even use excel except for macs and the Surface Laptop from 2017...). Got distracted...

The matte 14" 16:10 proper beautiful golden ratio display at 225 nits for comfortable viewing indoors
Requires 2.5 watts.
The SSD requires 0.5 watts.
The WiFi adapter requires 0.25 watts
The RAM requires 0.25 watts
The input devices require 0.125 watts
The Bluetooth adapter requires 0.075 watts
The motherboard requires 0.3 watts
Things I haven't accounted for things like the fan running periodically and SSD writes increase average power consumption by another 0.25 watts

Bringing the "everything that isn't the CPU" 's power consumption toooo.....
4.25 watts!

So the 8 watt TDP chip laptop takes 5.25 watts
And the 16 watt TDP chip laptop takes 6.25 watts

So even though it might look like a huge stride is being made, runtime is just 19% longer with the same sized battery.
We went from 6.25 to 5.25.
To go from 5.25 to 4.25, TDP needs to be ZERO
Looks like we're at the end of the road!
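(For anyone who wants to poke at that arithmetic, here's a minimal Python sketch of the same runtime comparison; the wattages and the 12.5%-of-TDP average load are the figures assumed above, not measurements.)

# Minimal sketch of the runtime comparison above; all figures are the
# assumed values from this comment, not measured data.
NON_CPU_DRAW_W = 4.25  # display, SSD, WiFi, RAM, input devices, BT, board, misc

def system_draw(cpu_tdp_w: float, avg_fraction_of_tdp: float = 0.125) -> float:
    """Average whole-system draw: CPU at ~12.5% of its TDP plus everything else."""
    return cpu_tdp_w * avg_fraction_of_tdp + NON_CPU_DRAW_W

old_draw = system_draw(16)  # 6.25 W
new_draw = system_draw(8)   # 5.25 W
print(f"Runtime gain on the same battery: {old_draw / new_draw - 1:.0%}")  # ~19%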

I guess it's not too bad. The Surface Laptop, for example: aside from not being serviceable, it is basically the ideal (non-specialized) laptop, with a good keyboard layout with long key travel, a large and accurate trackpad, a colour-calibrated display with a resolution high enough for proper differentiation of fonts, a touch screen, and speakers that are half decent, though they lack bass. The latest renditions are claimed to have 12-16 hours of battery life. I like to chop 10-15% off the battery life reported in reviews done by sites and people paid by manufacturers for their reviews, and an additional 25% off of that, because displays should not cause eyestrain by being borderline-uncomfortably dim all the time, and I, like most people, am usually multitasking (or at least have some programs open which don't necessarily need to be while I'm doing something, lol).

Yeah, so (12 + 16) / 2 = 14, minus 15% = 11.9, minus 25% = 8.925.

So we're at 9 hours. LEDs will only get so efficient, and the other stuff I listed will only get so efficient. Taking all that into account without making my post even longer than it already is, and just giving the result without all the reasoning and math (in short: "everything else [that isn't the CPU]" goes from 4.25 to 3 watts, and CPU power consumption halves again to 0.5 W average draw): in about 5 years, laptops will get to within what I believe will be 5-10% of their maximum efficiency, which I believe will result in laptops lasting:

13.3875 hours, or, more comfortably: 13.5 hours
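(A similar sketch for the battery-life estimate and the five-year projection above, again using only the figures assumed in this comment, not verified data.)

# Battery-life arithmetic from the comment above (assumed figures only).
claimed_hours = (12 + 16) / 2                 # midpoint of the claimed 12-16 h
realistic_now = claimed_hours * 0.85 * 0.75   # knock off 15%, then another 25%
print(round(realistic_now, 3))                # 8.925 -> "we're at 9 hours"

# Projection: non-CPU draw drops from 4.25 W to 3 W, CPU average halves to 0.5 W.
current_draw_w = 4.25 + 1.0
future_draw_w = 3.0 + 0.5
print(round(realistic_now * current_draw_w / future_draw_w, 4))  # 13.3875 -> ~13.5 hours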
Improved battery technology may increase that number, but even if this Li-ion is what we're stuck with... 13.5 hours doing normal things would give 9-10 hours doing CPU-intensive things. You couldn't transcode videos for 8 hours straight on battery, but any real work will soon be able to be accomplished on a single charge.

This post is getting really long, but what's great about the 13.5 hours runtime is this:

If you only charge your pinnacle laptop's Li-ion battery from between 5% and 10% up to between 65% and 70% (60% of the 13.5-hour battery is 8.1 hours), then your battery will stay viable the entire time you use the laptop. And then you could give it to someone else who could use it as much as you did, 8 hours a day, 365 days a year, over the next 3 years, and only after that would battery capacity start to noticeably diminish. If the laptop was then semi-retired over the next 4 years to an older person who didn't use it as much, and they kept it charged the same way, gold.
(If you're going to try this, keep in mind you need to do a full charge to 100% either every 20 charges or every 45 days. This means leaving it plugged in for a good two hours after the device reports 100%. For numerous (and often varying) reasons, 100% is reported by devices before it's actually reached. The safest and most convenient bet is to just leave the dang thing plugged in overnight on the (approximate) 20th charge or (approximate) 45th day, so that the full charge (maintenance cycle) gets done.)
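(And a toy sketch of that maintenance-charge rule; the 20-charge / 45-day thresholds are the suggestion from this comment, not an official guideline.)

from datetime import date, timedelta

def needs_maintenance_charge(charges_since_full: int, last_full: date, today: date) -> bool:
    """True once either threshold is hit: 20 partial charges or 45 days since the last 100% charge."""
    return charges_since_full >= 20 or (today - last_full) >= timedelta(days=45)

print(needs_maintenance_charge(12, date(2022, 10, 1), date(2022, 12, 5)))  # True (more than 45 days)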
Posted on Reply
#18
Why_Me
mechtechSo..... a dinosaur, a basketball team, a Ford truck, a meme, and now a CPU.

Waiting for the "yo dawg, I heard you like raptors" meme................
And there it is ...

Posted on Reply
#19
Harry Wild
Cannot wait for the 13100, 13400, 13600 Raptor Lakes to be announced! All 65W CPUs too!
Posted on Reply