
Four 9th Gen Core "KF" Processor Models Get Listed

Hilarious. Intel shoved useless GPUs onto everyone, costing more and wasting die space, while telling you to eat a big D.

2018 production-issues Intel: oh, look what we're releasing, totally a new SKU.

Too bad it's not a new die. Or at least I'd be surprised if it were.
 
This reminds me of the i5-2550K, anyone remember that CPU? It was the same as the 2500K but with no GPU and clocked 100 MHz higher (my brother uses one). But some of these are clocked lower? :confused:
 
Isn't the IGP on Intel's CPUs used to help render in Adobe products?
 
Isn't the IGP on Intel's CPUs used to help render in Adobe products?

Most any GPU can accelerate Adobe products.
 
Most any GPU can accelerate Adobe products.

I meant you can use the IGP in Intel's CPUs in conjunction with a dedicated GPU to help lower rendering time, instead of it doing nothing.
 
I meant you can use the IGP in Intel's CPUs in conjunction with a dedicated GPU to help lower rendering time, instead of it doing nothing.
I'm not 100% sure, but I think you can't use both at the same time.
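For what it's worth, whether both are even visible to an application depends on drivers and BIOS settings (many boards power the IGP down once a discrete card is installed). A minimal sketch to check, assuming the third-party pyopencl package and working OpenCL drivers are installed: it lists every GPU the system exposes as a compute device, and if both the Intel IGP and the discrete card appear, a renderer could in principle dispatch work to either.

import pyopencl as cl  # third-party package; requires vendor OpenCL drivers

# Walk every OpenCL platform and print the GPU devices it exposes.
for platform in cl.get_platforms():
    try:
        gpus = platform.get_devices(device_type=cl.device_type.GPU)
    except cl.LogicError:  # this platform exposes no GPU devices
        continue
    for dev in gpus:
        print(f"{platform.name}: {dev.name} "
              f"({dev.global_mem_size // 2**20} MiB)")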
 
Probably short for Kluster Fu*k.
 
Hilarious. Intel shoved useless GPUs onto everyone, costing more and wasting die space, while telling you to eat a big D.

2018 production-issues Intel: oh, look what we're releasing, totally a new SKU.

Too bad it's not a new die. Or at least I'd be surprised if it were.

What's worse is that many review sites count the embedded GPU as a positive in their overall review of the processor... I've always disagreed with that. Embedded mobo GPUs are better.
 
What's worse is that many review sites count the embedded GPU as a positive in their overall review of the processor... I've always disagreed with that. Embedded mobo GPUs are better.
Well, the IGP is a positive. When building an office PC, an IGP is all you need. It's also useful as a failsafe should your main GPU crap out (though for that purpose it definitely doesn't need to take up 30-50% of the die space).
But since nobody bothers embedding a minimal GPU, what I have always criticized about IGPs is the lack of an IGP-less option. AMD has fixed that, and it's nice to see Intel following suit.
 
AMD has fixed that, and it's nice to see Intel following suit.

The difference being that AMD uses a separate iGPU-less die for theirs. I'm sure these Intels still have it on board, just disabled. No way they redesigned the die just to remove the iGPU...
 
The difference being that AMD uses a separate iGPU-less die for theirs. I'm sure these Intels still have it on board, just disabled. No way they redesigned the die just to remove the iGPU...
I believe I pointed that out above; it's an important point. We don't know at this point, but I also tend to believe it's the same die.
 
I believe I pointed that out above; it's an important point. We don't know at this point, but I also tend to believe it's the same die.

So you did. I get lost sometimes, ignore me... :laugh:
 
...I think I've got it... The reason Intel's CPU line has been trash as of late is the shift in focus and resources to their upcoming GPU endeavors, which left the CPU line twisting in the wind. You've got to admit, though: this half-generation of refreshing the same old thing was so good half a generation ago that it still holds its own today and owns the segment. If it were priced better... bah, who am I kidding, Intel's not gonna price it better. I just hope their interest in GPUs wasn't mining-related.
 
There's really nothing spectacular here; Intel's simply looking to boost yields by selling CPUs with defective GPUs instead of throwing them in the trash.

Embedded mobo GPUs are better.

I too think that things that don't exist are better.
 
There's really nothing spectacular here; Intel's simply looking to boost yields by selling CPUs with defective GPUs instead of throwing them in the trash.

I too think that things that don't exist are better.

Doesn't exist anymore* ;)
 
Definitely want to see a delidded die shot and/or measurements to compare. If it's smaller, we know the iGPU is gone. I'm a bit curious whether these are a consumer-socket version of the HEDT X-series design. Intel is having to work all sorts of magic to match AMD's core counts, so these KF models might be a way to save fab space.

The part that makes me wonder whether this really has a disabled iGPU is that there are components of the GPU that can still be used even with a dedicated GPU installed. I know on my iMac 5K, streaming HD video shows the iGPU as active even though a dedicated AMD card handles display output. I'm guessing that's to handle decoding, so would a missing iGPU be a negative from an efficiency standpoint?
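That matches how Quick Sync works on Intel parts: the fixed-function decode block lives in the IGP. As a rough way to check whether that block is usable on a given machine, here's a sketch assuming an ffmpeg build compiled with QSV support is on PATH; "input.mp4" is just a placeholder file name.

import subprocess

# Ask ffmpeg to decode an H.264 file through Quick Sync (the IGP's
# fixed-function block) and discard the output; a zero exit code means
# the QSV decode path worked.
result = subprocess.run(
    ["ffmpeg", "-hwaccel", "qsv", "-c:v", "h264_qsv",
     "-i", "input.mp4", "-f", "null", "-"],
    capture_output=True, text=True,
)
print("QSV decode OK" if result.returncode == 0 else result.stderr[-400:])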
 
Intel seems to move along the same lines as car manufacturers. Let me explain (disclaimer: I'm not really good with cars or model names, just the general idea).

OK, everybody knows what a Porsche 911 is. Good example. You go to the showroom and say: I want to buy a 911. Sure, which model? And then you have many models to choose from. Certainly most will take the classic 911, with all the bells and whistles your wallet can afford (or not, if you live permanently in debt, but that's your problem :p). Anyway... there will be the 0.1% of customers who say: No! I want the track version, GT98whatever, no frills, no bells. And VW/Porsche will say sure, why not. Then they take a normal 911, strip out everything you might want in a car, fit a roll cage, and charge say $100k more for that version, just so the owner can drive 10 km/h faster on a track.

And now we have Intel. The 9900K is stupidly overpriced already, but then they come up with the KF, which doesn't have an iGPU (poo, we all agree on that), and charge even more? o_O
I agree, but I feel it's necessary to point out that the GPU takes up a significant area of the die, so they can probably boost overall yield significantly by branding chips like this. Alternatively, some of these chips might otherwise have ended up as 6-core instead of 8-core parts, in which case they can charge extra for the same silicon just by chopping different bits off.

It's all about optimising profit on the silicon you get back from the fab by cutting the chips differently, not exactly the same as bolting different parts onto a car.

The latter is probably what we're moving towards in the future, though, with multi-chip designs...
 
I agree, but I feel it's necessary to point out that the GPU takes up a significant area of the die, so they can probably boost overall yield significantly by branding chips like this. Alternatively, some of these chips might otherwise have ended up as 6-core instead of 8-core parts, in which case they can charge extra for the same silicon just by chopping different bits off.

It's all about optimising profit on the silicon you get back from the fab by cutting the chips differently, not exactly the same as bolting different parts onto a car.

The latter is probably what we're moving towards in the future, though, with multi-chip designs...
The thing is, shrinking the die makes the die cheaper, but so does disabling faulty IGPs, because you get to sell dies that you'd otherwise have thrown away. That's why trying to guess at this point is an exercise in futility.
 
The die costs the same to produce whether they sell it as the highest-end part or toss the whole wafer in the bin... The production cost of the silicon does not change (significantly) based on how they cut features off the dies.

Assuming there is a market to buy them, Intel wants to cut as many of the dies as possible into the highest-tier, most expensive product, so that their returns are maximised.
 
The die costs the same to produce whether they sell it as the highest-end part or toss the whole wafer in the bin...
The wafer costs the same. Whether you can use one die or one hundred dies from that wafer is what dictates the cost per die. Hence the term yields ;)
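To put rough numbers on that, here's a back-of-the-envelope sketch; every figure is made up purely for illustration, only the arithmetic matters.

# A processed wafer has a fixed cost, so the cost per sellable die is
# simply wafer cost divided by the number of good dies.
WAFER_COST = 9000   # $ per processed wafer (hypothetical number)
GROSS_DIES = 300    # candidate dies per wafer (hypothetical number)

for good in (100, 200, 280):
    print(f"yield {100 * good / GROSS_DIES:5.1f}% -> "
          f"${WAFER_COST / good:7.2f} per sellable die")

# Rebranding dies with a dead IGP as "KF" parts raises the good-die
# count, pulling per-die cost down without touching the wafer cost.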
 