Wednesday, October 6th 2021

Intel's Pat Gelsinger Exclaims "Intel is Back" AMD is "Over"

Intel's recently appointed CEO wasn't mincing words in a recent interview with CRN, where he claimed that Intel not only "have the best product" but also that "this period of time when people could say, 'Hey, [AMD] is leading,' that's over." We'd say them's fighting words, regardless of what various leaks have suggested, since Intel still has a lot to prove with its upcoming Alder Lake CPUs.

Gelsinger continues with "We have 80 percent market share. We have the best software assets that are available in the industry. We do the best job supporting our partners and our OEMs with it. We have an incredible brand that our channel partners, customers want and trust. Wow, that's a lot of assets in that. If the channel partner doesn't see value in that, I want to talk to him." It's pretty clear from this that Intel believes it's doing a bang-up job, and if its customers don't see it, then they need a talking-to.
For those who were hoping for an engineer to be at the reins of Intel again, the interview with CRN reads like a marketing spinner is at the head of the company. "We are back with a very defined view of what it requires to be leadership in every dimension: leadership product, leadership [chip] packaging, leadership process, leadership software, unquestioned leadership on critical new workloads like AI, graphics, media, power-performance, enabling again the ecosystem. This is what we will be doing with aggressive actions and programs over the next couple of years." How Intel is planning to take the lead in the graphics market is going to be interesting to see, if nothing else.

Most of the interview is about how Intel is planning on growing its channel and partner ecosystem, but the article also touches on things like Apple, although once again, Gelsinger dismisses Apple's move away from Intel hardware by saying "We ultimately see the real competition to enable the ecosystem to compete with Apple". This suggests that he doesn't seem to understand why Apple decided to make its own processors in the first place. He also doesn't seem to be a fan of what he calls "Apple's closed garden" while calling Windows an "open ecosystem".

When asked how Intel is going to be able to compete with AMD and the various Arm-based server parts from companies like Amazon and Ampere, he simply answers "do better products". It's hard to take that kind of an answer seriously, and although Intel is hardly in a situation where it's likely to end up on the brink of ruin any time soon, the company has been losing ground in the server, desktop, and notebook markets over the past couple of years.

Gelsinger isn't expecting any further slips in terms of market share, mostly due to the fact that neither Intel nor AMD can increase their production at the moment, and the situation is likely the same for the Arm-based server chip makers. Furthermore, he's expecting pricing to remain stable, although this seems to refer to server parts, as consumer CPUs aren't discussed in the article. He doesn't see a threat from Arm-based server CPUs either, claiming that they have a "very minimal" market share today and that this will remain the case.

One interesting quote about the consumer PC side is that he believes that with Alder Lake, Intel will have the "energy efficiency leadership", something no one else is expecting. That said, it seems like he does have some respect for AMD, saying "AMD has done a solid job over the last couple of years. We won't dismiss them of the good work that they've done". It'll be interesting to see how this unfolds over the next couple of generations of CPUs from both companies, as Intel still has a lot to prove with its new CPU designs.
Source: CRN
Add your own comment

161 Comments on Intel's Pat Gelsinger Exclaims "Intel is Back" AMD is "Over"

#51
Chomiq
Pat needs a wizard robe and he can cosplay as a D&D warlock.

As for AMD being over, I will wait until we see benchmarks and growth in Intel's market share.
Posted on Reply
#52
TechLurker
I don't know; every time someone smack-talks the competition in the tech world before releasing a product with finalized (pre-independent review) benchmarks against current rival equivalents, they and their product end up being an embarrassment. Raja and his "Poor Volta" being a more memorable one. I like the current AMD, where they don't brag until they have the final (pre-independent) benchmarks ready to throw up on the screen, with bars that push out to two screens or more, compared to their nearest competitors. That includes the fact that some of those graphs admit they lose in a scenario or two.

At the same time, Intel does have the money to trash talk, even if it's just to hype investors. They also have their own fabs, as blessed/cursed as that can be, and can maintain stock better unless/until AMD can buy out the majority of TSMC's node production similar to how Apple can and does. And naturally, if it doesn't quite work, they can just ramp up the power reqs again until it's competitive in everything except power efficiency.
#53
las
Chomiq: Pat needs a wizard robe and he can cosplay as a D&D warlock.

As for AMD being over, I will wait until we see benchmarks and growth in Intel's market share.
AMD won't be over; this has happened many times before. AMD will just lower prices and focus on better performance for the money. 90% of the performance at 75% of the price will still sell. It's not like AMD has been far ahead of Intel anyway in terms of performance, considering they had a huge node advantage with 7nm vs 14nm. Mostly it's the watt usage; performance-wise in gaming, emulation and regular workloads, there's not really much difference between Ryzen 5000 and even the 8700K (especially when overclocked to 5 GHz or more like most will do); some tests Ryzen 5000 wins, others the 8700K wins. Performance per watt is the biggest difference, if you care about that for desktop PCs tho. 150 watts or 200 watts, big difference? Personally, I could not care less. GPUs today can peak at 500-600 watts and some people point out a 50 watt increase on a CPU? Come on. Even a cheap 240mm or dual tower air cooler will keep an Intel chip at 5 GHz without any problem. A lot of people are even pushing 5.2-5.4 GHz for 24/7 usage.

My 9900K consumes like 100-150 watts in gaming locked at 5.2 GHz. Could not care less if it can hit 225+ watts in synthetic burn-ins with AVX2.

It's not exactly like a Ryzen 5800X/5900X/5950X running at 4.7 GHz or even 4.8 is "cool" in comparison. It still needs some decent cooling.
#54
Darmok N Jalad
This is also the same company that provided chips to Apple for years, but now that that's over, they're actively campaigning against them by promoting touchscreens and such.

The leaks of Alder Lake have me skeptical. The little cores are only found on expensive SKUs, suggesting this is a core-war gimmick; Windows 11 is likely going to be needed to get the best results; and more than anything, I would like to see some reviews that show that runaway power consumption is a thing of the past. Intel has a lot to prove, IMO. When they were dominant, they didn't talk about "the competition," so the fact that they are having some hard feelings suggests this will be an RX 580 moment: designs pushed to the limit to get some performance credibility back. I guess we'll find out.
#55
las
Darmok N Jalad: This is also the same company that provided chips to Apple for years, but now that that's over, they're actively campaigning against them by promoting touchscreens and such.

The leaks of Alder Lake have me skeptical. The little cores are only found on expensive SKUs, suggesting this is a core-war gimmick; Windows 11 is likely going to be needed to get the best results; and more than anything, I would like to see some reviews that show that runaway power consumption is a thing of the past. Intel has a lot to prove, IMO. When they were dominant, they didn't talk about "the competition," so the fact that they are having some hard feelings suggests this will be an RX 580 moment: designs pushed to the limit to get some performance credibility back. I guess we'll find out.
Apple's M1 has very little to do with Intel chips tho. ARM vs x86.
It's not like Apple uses ARM chips in their higher-end stuff. However, ARM in the MacBook Air makes perfect sense (and in some of the lower-end stuff too).
ARM is cheap and "fast enough" for most stuff while consuming very little power. This way Apple can spend the money on the SCREEN, which matters more for most users, and still be able to sell the MacBook Air for a pretty low price. Perfect for students, for example, with 15-18 hours of battery life. You can use it for 2 days without charging...
ARM chips are heavily dependent on optimization and perfect programming in order to yield good/best performance, and that is the reason why it works this well for Apple, since their ecosystem is more closed and devs generally know which models they are coding for.
#56
londiste
Clickbait.

These two are not equal statements:
AMD is "Over"
"this period of time when people could say, "Hey, [AMD] is leading," that's over."
olymind1: Let's wait and see what kind of CPUs they will deliver under 200€ and VGA between 150-300€.
When looking at the current generation, in the sub-200€ CPU market Intel is the only player in town :)
And 11400F is nothing to scoff at.
Valantar: That is a way oversimplified take. The IPC improvements from Zen/Zen+ to Zen2 are very significant. Sure, some of those are enabled by the increased density of the 7nm node allowing for more transistors and denser circuits, but putting that down to "mostly [the fab node]" is oversimplified to the absurd. Zen2 was far more than Zen/Zen+ shrunk down to 7nm. Zen2 beat Intel outright in IPC, and improved upon Zen+ by ~15%. Zen2 was a one-two punch of a highly efficient node that clocked better than its predecessor and a very noticeable architectural improvement.
While you are correct, "oversimplified to the absurd" is also quite an exaggeration. Rocket Lake is probably a pretty good example of the impact of the "mostly [the fab node]" problem. Rocket Lake is smack in the middle of Zen2 and Zen3 when it comes to IPC, let down primarily by power consumption and a smaller cache. The first is directly related to the fab node used, and the second is usually related to the available transistor budget. The latter is usually also related to the fab node.
#57
TheLostSwede
londiste: Clickbait.

These two are not equal statements:
Not really; in a different part of the interview he says that the good job AMD has been doing is over once Alder Lake is out. Maybe I misinterpreted that, but it sounds like he's saying AMD no longer has a chance against Intel, i.e. they're out.
#58
londiste
Alien88: The real question is, would Intel have made the advances they have recently if AMD had not been eating into their market share? No. If Intel had no real competition, we would still be paying exorbitant prices for sub-par CPUs and APUs with rubbish embedded graphics. That alone makes me want to stick with AMD for all future purchases, they pushed Intel to get off their butts and do better instead of just ripping off PC buyers, like they had been for the last decade. AMD deserves the sales, Intel don't.
You mean Intel's game plan was to stay on 14nm and Skylake for 6 years? I doubt it.
#59
Liquid Cool
Back when Conroe was released, no silly comments were necessary. I was a member of the largest AMD site on the web when the E6600 hit the streets.

The website was a ghost town literally overnight...

From my perspective... Intel has come up with nothing more than incremental improvements since first gen, milking the public (cash cow) with every new generation, and they keep coming back for more. I simply can't understand it myself... other than destroying the world's resources for nothing, what does it accomplish? What are 8-10% IPC improvements netting you at the end of the day?

From my perspective... this extreme acceleration of production is destroying our children's future, and it's these same companies doing this that cry about climate change the loudest. It makes zero sense.

Unless there is an alternate explanation...and I personally believe there is. My thesis would start with taxation and control. What about yours?

Oh...veered off topic. Apologize.

Where is Conroe 2 Intel? Where? When?

Best,

Liquid Cool

P.S. Phanbuey. If I was running China...NOW is the best time to attack Taiwan. They won't get a better opportunity after the mid-term elections. They have a window here and I believe someone is going to mis-step. Israel-Iran seems plausible as well. Not to mention Russia and....ok, sorry, going off topic. I'll stop here.
#60
medi01
Darmok N Jalad: now that that's over, they're actively campaigning against them by promoting touchscreens and such.
What is the issue with that???
#61
DeathtoGnomes
TheLostSwede: Not really, he says that the good job AMD has been doing is over once Alder Lake is out in a different part of the interview. Maybe I misinterpreted that, but it sounds like he's saying AMD no longer has a chance against Intel, i.e. they're out.
That's how I read it. Trying to show confidence in his products, while typically doing an Intel on AMD (the PR-inflated benchmarks).
#62
DeeJay1001
las: I never said the 6000 series were bad, only the lower end is bad and priced too high.
The 6600XT is the only current-gen card that is regularly available for less than $500. Not to mention it unquestionably outperforms the 3060, which is not only less available but also sells for more at street prices. Don't buy into the media BS. Thousands of gamers are buying and enjoying their 6600XT.
#63
HenrySomeone
Gelsinger doesn't bullshit, so this likely means that what I've been saying for a while now (that AMD's good months are coming to an end) is about to come true.
#64
Darmok N Jalad
medi01: What is the issue with that???
It just reeks of hard feelings. A CPU company is bashing a "lifestyle" company using justifications not related to the CPU company's own product merits, but rather platform merits. Intel is making cases that MS should be making, or rather has already made. Up until Apple dropped Intel, Macs not having touchscreens was not an issue to Intel, but now that tired comparison emerges yet again. I still think there's a bigger collaboration between Intel and MS on Windows 11, to the point that Intel is even making the same arguments for Windows over macOS that MS has. Intel needs MS, and needs Windows to be interesting again. Windows 10 isn't the last version of Windows after all, and what a surprise that it launches just before Intel's first big.LITTLE design, one that needs a new scheduler.
#65
Mr.Mopar392
las: It was just a matter of time - AMD did well considering all.

HOWEVER, without Intel being stuck at 14nm _and_ AMD using TSMC 7nm, AMD would never have been able to do what they did. Ryzen 1000 and 2000 on GloFo were nothing special at all; however, they delivered an alternative to Intel which even the most hardcore AMD fanboys had joined, because FX CPUs were pretty much garbage.

With the 3000 series Ryzen became decent, with the 5000 series they became good. This is mostly because of going from GloFo 12nm (which is far worse than Intel 14nm in every aspect) to TSMC 7nm.

AMD have beaten Intel and even Nvidia before, but Intel and Nvidia always came back and took the crown, which resulted in AMD lowering prices and going back to the drawing board. Just look how AMD priced the 5000 series compared to the 1000/2000 and somewhat the 3000 series. AMD started milking just like Intel did for ~10 years. AMD dropped the non-X models and generally priced all the chips way higher than before.

I own 2 Ryzen machines (HTPC + NAS/server) and 1 Intel (gaming rig), so you can stop the fanboy BS. I even own two consoles, so I actually have 4 AMD chips in my home... However I also have 3 Intel laptops :p So I guess it's 50/50
Fanboy in your post all day long, but sleep well believing your bias isn't showing.
#66
Valantar
las: If you buy a 6900XT or 3080Ti/3090 for 1080p gaming, even tho you are running 360 Hz, then you are clueless about how gaming PCs work. You will be CPU (and somewhat RAM) bound anyway, and a card like a 6700XT/3060 Ti will deliver pretty much the same performance in esports titles, which people buying these monitors are playing anyway. Going 1080p/360Hz for maxing out AAA games is downright stupid because the 80% pixel increase going to 1440p will deliver way more impressive visuals while still delivering very high fps.
In case you missed it, I wasn't arguing for 1080p360 with a 6900 XT or 3080 Ti being sensible, I just said it happens.
las: Most "pro gamers" are NOT using high-end GPUs for 1080p and below. Games like CSGO etc. require pretty much NOTHING from the GPU. Streamers generally use higher-end cards, but they tend to run 1440p and above too.
No "pro" gamers play below 1080p these days. And I'm fully aware of how low the CPU requirements for CS:GO and the like are. Again, I just said that it happens. Stating facts, not voicing opinions about those facts.
las: I'm running 1440p at 240 Hz and I'm CPU bound in most cases too, unless I'm maxing out demanding AAA games. Even in Warzone I'm pretty much at 200+ fps at all times, with GPU usage dropping. BF2042 later today should probably require more; I still expect 120 fps minimum tho, but I will aim for 200+ like usual, using custom settings while retaining 95% of Ultra IQ. The game has DLSS tho, so that will be tested.
Yes. And? Once again, with feeling: I never said this was smart, I just said people do it.

You seem to be under the impression that I was somehow arguing that doing these things was smart, sensible, something like that. Please rid yourself of that impression, as I haven't said anything like that. Whether or not these decisions are sensible (I would argue they typically aren't; most high-end GPU purchases are driven by technofetishism and an inability to make rational buying decisions, or at best make sense only under some very specific conditions for that person), this happens. I haven't commented beyond that. I'm just saying that a) these GPUs sell well, and b) there are very few 2160p120+ displays out there, and 1440p144+ displays have been taking off in the past year or two, meaning most people buying high-end GPUs are likely using one of those. And as you say, that leaves you CPU bound. Whatever the rationale may be, an absolute consideration of performance/$ is not the basis for these decisions, and whether the GPUs only really make sense for 2160p is rather immaterial when faced with the reality that 2160p gaming is rare.
las: There have been plenty of Alder Lake leaks showing big leaps in several benchmarks. I don't deny them, like some people do tho. Even tho I'm not interested in buying Alder Lake at all (maybe for a laptop, not for desktop).
Leaks are leaks. They might be accurate, or they might be complete fabrications. We have also had leaks promising major gains, and others showing lacklustre results (which I assume you've missed, since you haven't mentioned them, and they go against what you're saying?). As such, I take the sensible approach of reserving judgement until we have actually reliable benchmarks to go off of. For now, we have no way of knowing anything concrete.
#67
mechtech
TheLostSwede: Indeed. At the same time, AMD has to prove that they can keep up with Intel.
When it comes to chip production, they can't. Unfortunately.

Gelsinger continues with "We have 80 percent market share.

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when you have only 1 'real' competitor with no fabs.

If there were a dozen capable companies making x86 for the mass market and you had 80% market share, that would be something to brag about.
#68
Franzen4Real
btarunr: I've heard the damnedest bullshit from a bullshit orifice that smells good.
a solid candidate for 'Quote of the Year' :respect:
#69
TheLostSwede
mechtech: When it comes to chip production they can't. Unfortunately.
Well, they don't have fabs any more so...
mechtech: Gelsinger continues with "We have 80 percent market share.

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when you have only 1 'real' competitor with no fabs.

If there were a dozen capable companies making x86 for the mass market and you had 80% market share, that would be something to brag about.
Intel obviously doesn't want any more competition than they already have, since to them, it seems competition is a bad thing.
If it wasn't, they wouldn't be doing so many scummy things in the "channel". I mean, we know for a fact that they have done so and most likely still are, or they wouldn't have been fined for it.
#70
ZoneDymo
Ctrl+F "leadership", there we go, that's our one-note Pat back at it again.
#71
Octavean
Big talk indeed…

If true, fine, but if the market dominance and performance come at "drag them across concrete" prices, then I'm not too sure I care.
#72
HenrySomeone
mechtech: When it comes to chip production they can't. Unfortunately.

Gelsinger continues with "We have 80 percent market share.

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when you have only 1 'real' competitor with no fabs.

If there were a dozen capable companies making x86 for the mass market and you had 80% market share, that would be something to brag about.
Oh, these poor, poor team red unfortunates with no fabs! One might almost think they never had them and then got rid of them because that's what they deemed better at the time.
And while not quite a dozen, there were several more companies making x86 chips back in the 90s and Intel's share was still just as high...
#73
seth1911
Sure, for consumers AMD is priced too high; consumers can't buy anything under 150€ from this company. OK, yeah, a 3000G for 100€ and a 1200 for 128€.

After their high rise, they'll kiss the floor again.

AMD lived for a long time off budget and mid-range consumers, and since the 3000 series/Renoir they've pissed on them harder than any company has in the past.
#74
Why_Me
watzupken: If Intel can't hit back at AMD after a year, I don't know what to say about them, to be honest. At least based on rumours and leaks so far, they've all been pointing to a good improvement from a single-core perspective. But the limited number of performance cores will still cost them some performance, in my opinion, at least until we can confirm if this is really the case with official reviews. I believe some of the advantage of Intel Alder Lake is contributed by the DDR5 that comes along with it, if the benchmark/app is bandwidth sensitive.

I feel Pat is missing the pricing factor that will help win back market share/stop their bleeding. There are a few leaks on pricing, but some of them look quite sketchy. So best to wait for Intel's MSRP to determine if they will be able to win back more users. There is no bad product, only bad pricing, and the high cost of entry for Alder Lake CPUs may actually deter people other than hardcore enthusiasts from adopting it early.
B660 boards and locked CPUs will be where it's at.
#75
TheoneandonlyMrK
HenrySomeone: Oh, these poor, poor team red unfortunates with no fabs! One might almost think they never had them and then got rid of them because that's what they deemed better at the time.
And while not quite a dozen, there were several more companies making x86 chips back in the 90s and Intel's share was still just as high...
I mean, Gordon Moore, Intel's CEO and chief G at the time, invented the f£@#ng thing and had a monopoly, singular, until he couldn't produce enough, but I get your slant on the story.
There's a fair few stories as to why Intel retained the lead but they're irrelevant to this thread too.