Wednesday, March 4th 2009

NVIDIA to Try and Develop x86 CPU in Two to Three Years

The graphics market already seems too small for NVIDIA, so the green corporation is looking further ahead to building an x86 processor in the near future. At the Morgan Stanley Technology Conference in San Francisco yesterday, the company revealed that it plans to enter the x86 processor market by building an x86-compatible system-on-chip in the next two to three years. Michael Hara, NVIDIA's senior vice president of investor relations and communications, commented:
I think some time down the road it makes sense to take the same level of integration that we've done with Tegra ... Tegra is by any definition a complete computer on a chip, and the requirements of that market are such that you have to be very low power, very small, but highly efficient. So in that particular state it made a lot of sense to take that approach, and someday it's going to make sense to take the same approach in the x86 market as well.
He also said that NVIDIA's future x86 CPU wouldn't be appropriate for every segment of the market, especially the high end of the PC market, which includes gaming systems, graphics engineering workstations and many others. The x86 chip will mainly be targeted at smaller system-on-chip platforms. No other details were unveiled at the time of this publication, and it's still very early to talk about something that exists only on paper.

Source: Blog on EDN
Add your own comment

51 Comments on NVIDIA to Try and Develop x86 CPU in Two to Three Years

#1
Silverel
I suppose they'll need to get some licensing for that x86 then, eh?

Better bust out some vaseline, Intel needs a nice rubdown.
Posted on Reply
#2
Necrofire
Well put.

And if they ever decide to go 64-bit they'll probably head to AMD's doorstep.

What can nvidia provide that others haven't already?

What would be nice is an SoC made by nvidia put into some mobile gaming platform, and with some open source encouragement, nvidia could take over the mobile gaming market.
Posted on Reply
#3
crazy pyro
Might have something to do with the whole Intel/NVIDIA thing kicking off a couple of weeks ago.
Posted on Reply
#4
Silverel
by: crazy pyro
Might have something to do with the whole Intel/NVIDIA thing kicking off a couple of weeks ago.
The part where nVidia had to give up its chipset or say goodbye to Intel mobos? :rolleyes:
Posted on Reply
#5
crazy pyro
Yeah, NVIDIA starts competing with intel, intel takes them away from the high end CPU market.
Posted on Reply
#6
BazookaJoe
Well good luck to nVidia I suppose...

The CPU market NEEDS some actual competition, let's just hope they can actually compete.

I'd be terrified of trying to play catch-up with the red and blue teams this late in the game.

*Imagines some sort of ultra-cuda processor core with 400 shaders that emulates x86 at a hardware level*
Posted on Reply
#7
DrPepper
The Doctor is in the house
It's not very fair that Intel can decide who gets into the x86 CPU club, is it?
Posted on Reply
#8
DarkMatter
Intel will never let them into x86, considering they are already playing dirty and cheap to fight the Ion platform. Intel is scared, really scared, and I can just imagine how they are going to react to this. Dirty and cheap, that's for sure, but what will the measures be?
Posted on Reply
#9
crazy pyro
I'd seriously consider buying a PC with NVIDIA-only components, it's now becoming possible (a netbook really, but meh, the Intel graphics chipset's pants).
Posted on Reply
#10
spearman914
by: crazy pyro
I'd seriously consider buying a PC with NVIDIA-only components, it's now becoming possible (a netbook really, but meh, the Intel graphics chipset's pants).
Nvidia RAM, PSU, and Nvidia mouse. lol
Posted on Reply
#11
KainXS
I can only see nvidia failing if they try to make desktop CPUs, that's all I see, and I hope that nvidia, as greedy as they are, never makes a CPU.

As much as you say Intel plays dirty, nvidia has been just as dirty, and we all know it.

To tell the truth, I would much rather see a merger between nvidia and Intel, even though that will never happen.
Posted on Reply
#12
crazy pyro
Weeeell, it'd be a job to get hold of the PSU; possibly make the case out of old NVIDIA PCBs? Could get some SLI-certified RAM, that'd be kinda close. And screw the peripherals, I'm too attached to my Razer Deathadder!
Posted on Reply
#13
KBD
by: DarkMatter
Intel will never let them into x86, considering they are already playing dirty and cheap to fight the Ion platform. Intel is scared, really scared, and I can just imagine how they are going to react to this. Dirty and cheap, that's for sure, but what will the measures be?
yea, ain't that the truth. Considering how Intel and Nvidia are butting heads at this very moment, Nvidia execs must be dreaming if they think they'll get the x86 license from Intel.

Yea, it's not fair that Intel is hoarding the rights to x86, but it's their tech and they can do with it as they please. The only thing Nvidia can do is sue them for the right to license it; I think AMD, VIA and others had to do that.
Posted on Reply
#14
KieranD
It's not high-end stuff or even mainstream, it's low-end stuff competing with VIA and the like.
Posted on Reply
#15
lemonadesoda
I think that Intel will be pretty much forced to license x86 to whoever wants it (and will pay up). If not, there would be anti-competitive/damages suits due to x86 "monopoly" in the PC market.

So why is nVidia interested?

Probably because they have analysed carefully what Larrabee is doing. It's something somewhat unpredictable. Is it a GPU? Is it a mathPU? Is it a transputer? Whatever application(s) it will be used for... it is strategically dangerous. If DirectX12/13/NT or whatever has a very different feature set, perhaps suited to flexibility of x86, then hardware "general" shaders are limited. nV needs options. Also...

So many applications today are going multithread for performance. Yes, for gaming, you need at least one or two screaming cores. But if you are encoding, or rendering, or providing server services, you DONT need a single fast core, you are better served by many multicores.

Example: a webserver or mailserver. If you had a "core" per simultaneous transaction, that would be unbelievably responsive to the users. Just as an old Pentium 3 is MORE THAN ENOUGH power to FTP or webserve one or two "clients", a Pentium 3-class CPU, but 512 of them, would happily serve 512 simultaneous requests. Moreover, with good power management, when it sits idle, or is just partly used, it will save oodles of watt-hours.
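The many-slow-cores argument above can be sketched with an ordinary thread pool standing in for the hypothetical 512 cores; the pool size, the simulated I/O delay and the `handle_request` helper are illustrative assumptions, not anything NVIDIA announced:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def handle_request(req_id):
    """One 'core' handling one transaction: slow on its own,
    but it never makes another request wait behind it."""
    time.sleep(0.01)  # simulate I/O-bound work (disk, network)
    return f"served {req_id}"

# One worker per simultaneous transaction, as in the comment's thought
# experiment: 512 weak workers absorb 512 concurrent requests at once.
with ThreadPoolExecutor(max_workers=512) as pool:
    results = list(pool.map(handle_request, range(512)))

print(len(results))
```

With one worker per request the total wall time is roughly one request's latency rather than 512 of them, which is the responsiveness the comment is pointing at; per-worker speed barely matters for this kind of I/O-bound load.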
Posted on Reply
#16
KieranD
by: KBD
yea, ain't that the truth. Considering how Intel and Nvidia are butting heads at this very moment, Nvidia execs must be dreaming if they think they'll get the x86 license from Intel.

Yea, it's not fair that Intel is hoarding the rights to x86, but it's their tech and they can do with it as they please. The only thing Nvidia can do is sue them for the right to license it; I think AMD, VIA and others had to do that.
The AMD and Intel partnership goes back to the 80s, when Intel let AMD manufacture its CPUs under license. They had that agreement, and then Intel suddenly cancelled it and said no, you can't have details of our next CPU. AMD eventually won the right to make its own CPUs based on Intel's, but the practice of reverse engineering Intel's couldn't go on forever, so it made its own CPU, the K5, and then the much more popular K6.

It bought out several companies in order to keep up with Intel, who was a much bigger company and still is.

AMD's own designs only go back to 96 while Intel was back in the 70s, and I think AMD has done wonders to even come close to Intel in market share and product technology.
Posted on Reply
#17
npp
I don't see the point in "why should Intel decide who will make x86 chips" and so on... Well, somebody invented Coca-Cola some time ago, and now you can't simply go out and sell something named Coke with the exact same ingredients... You can make something slightly different, but in the computer world that won't make much sense, you know. The guys invented and spread the stuff all around the world; they can do whatever they want with the architecture and the name. That's how things work.

Besides, I don't see why Intel should be scared of a company like nVidia, when even AMD, after years in the x86 business, trails them slightly at this point (be it in terms of technology or pure finances); speaking of nVidia as a direct opponent in the CPU field simply isn't serious. The Ion platform is an interesting proposition, but it still relies on an Intel CPU. So if it gains popularity, Intel will benefit as well - in fact, much more than now, when many users may be held off by the outdated platform the Atom CPU is sold with.

The point is that every company nowadays is trying to expand its portfolio in a bid to maximize its profits, and that's the whole thing. There are hard times ahead, and being able to profit from as many products as possible seems like a sensible strategy. I'm not saying competition is bad, but numerous examples in other areas have proven that there simply isn't enough room on the market for, say, 3 big players. If nVidia builds an x86 CPU, I don't expect it to gain much more popularity than VIA's products have right now, as stated above. It's just too much fuss about unclear propositions with much more emotional value than practical value.
Posted on Reply
#18
KBD
by: KieranD
The AMD and Intel partnership goes back to the 80s, when Intel let AMD manufacture its CPUs under license. They had that agreement, and then Intel suddenly cancelled it and said no, you can't have details of our next CPU. AMD eventually won the right to make its own CPUs based on Intel's, but the practice of reverse engineering Intel's couldn't go on forever, so it made its own CPU, the K5, and then the much more popular K6.

It bought out several companies in order to keep up with Intel, who was a much bigger company and still is.

AMD's own designs only go back to 96 while Intel was back in the 70s, and I think AMD has done wonders to even come close to Intel in market share and product technology.
yep, that's all true. But for NV it will be very difficult to acquire an x86 license, since they have never done any work for Intel or reverse engineered anything, unlike other CPU manufacturers. Maybe they can work something out if NV sticks to the very low-end mobile devices and won't threaten Intel's dominance elsewhere, but if they don't, there will be a lawsuit against Intel.
Posted on Reply
#19
FordGT90Concept
"I go fast!1!11!1!"
I think this development is more of a derivative of NVIDIA's push for GPGPU than anything. In a close second place is Intel's Larrabee push (x86 graphics card). As I said before, I don't like where this is headed (more integrated components and much smaller selection in terms of performance diversification).
Posted on Reply
#20
DarkMatter
by: npp
Besides, I don't see why Intel should be scared of a company like nVidia, when even AMD, after years in the x86 business, trails them slightly at this point (be it in terms of technology or pure finances); speaking of nVidia as a direct opponent in the CPU field simply isn't serious. The Ion platform is an interesting proposition, but it still relies on an Intel CPU. So if it gains popularity, Intel will benefit as well - in fact, much more than now, when many users may be held off by the outdated platform the Atom CPU is sold with.
Why is Intel scared?

Well, first of all, considering the power of computers today and the way most people use their PCs, every nettop sold is one less "high-end" desktop CPU that Intel sells, and don't fool yourself, Intel's market is and always has been the high-end market.

Let's see: we are Intel, what do we prefer? Of the 500 million PCs sold every year, how many of them do we want to be full desktops with a $200-500 CPU and a $20-100 chipset+IGP, and how many do we want to be $200 for the full PC?

Look, Intel is BIG because it sells big. If they can only sell smaller, cheaper parts, their income will go down-down-down, and that's something they not only don't want, but can't afford because of the infrastructure they have to maintain. They released Atom because every other manufacturer was releasing competing products, and since the market is going to change no matter what you do or how hard you fight, you're better off dominating that segment, because that way you can at least control it.

At the same time they released Atom, they've been saying that nettops and the like have no future as main PCs, at least for now, and that's true with their Atom. But oh! Here comes Ion, a platform that takes that weak CPU and transforms it into something that caters to the needs of 80% of the PC market, and Intel's strategy goes out the window. That's why they've been downplaying Ion with absurd arguments.

No, don't let them fool you, they are very scared, not necessarily of Nvidia, whom they could crush as they please, only for another one to replace them (better the devil you know...). They are scared of the market trends, and of the fact that smaller companies like Nvidia are far better suited to feed that market. As I said, each nettop sold (Intel or not) is one less Intel high-end CPU+chipset sold (much lower margins for Intel), but each nettop with a Nvidia/VIA/whatever chipset is one more PC sold with a Nvidia/VIA/whatever chip, because the PCs they are replacing had none. Intel already owns the market, and it is saturated: 90%+ of first-world households already have at least one PC, so almost every new PC sold in developed countries is a replacement PC. Developing countries: 1) are not a solution in the short term, because they won't match the adoption rate developed countries had in past years, and 2) they are going to choose the cheap PCs to begin with.

Put this in your heads, guys: Intel is scared, very scared, of losing the big-$$ market they dominate with 80% market share and seeing it replaced with another one where, even with 100% of the market, they would still earn 4 times less.

/end of rant
Posted on Reply
#21
Haytch
Nvidia could end up creating the greatest gaming console the world has ever known. They could indeed begin to incorporate CPU technology onto GPUs, as AMD/ATi will surely do in the future. They could possibly rival Intel in the commercially mainstream high-end CPU market, since Intel's high-end CPU market is pretty much for gamers. Anything could happen, anything could not happen.

I look forward to seeing how this turns out and I wish them luck, as I did with the AMD/ATi merger. ATi pwn Intel in GPU technology, and AMD is playing catch-up in the CPU department, but if AMD is able to match Intel or better it, Intel will rain blood on all, or at least try to.

The main thing is... Nvidia have stepped up the game. I wouldn't stress so much in regards to diversity, because there's always those of us that demand better. I'm glad the recession hasn't affected their logic.
Posted on Reply
#22
PP Mguire
by: lemonadesoda
I think that Intel will be pretty much forced to license x86 to whoever wants it (and will pay up). If not, there would be anti-competitive/damages suits due to x86 "monopoly" in the PC market.

So why is nVidia interested?

Probably because they have analysed carefully what Larrabee is doing. It's something somewhat unpredictable. Is it a GPU? Is it a mathPU? Is it a transputer? Whatever application(s) it will be used for... it is strategically dangerous. If DirectX12/13/NT or whatever has a very different feature set, perhaps suited to flexibility of x86, then hardware "general" shaders are limited. nV needs options. Also...

So many applications today are going multithread for performance. Yes, for gaming, you need at least one or two screaming cores. But if you are encoding, or rendering, or providing server services, you DONT need a single fast core, you are better served by many multicores.

Example: a webserver or mailserver. If you had a "core" per simultaneous transaction, that would be unbelievably responsive to the users. Just as an old Pentium 3 is MORE THAN ENOUGH power to FTP or webserve one or two "clients", a Pentium 3-class CPU, but 512 of them, would happily serve 512 simultaneous requests. Moreover, with good power management, when it sits idle, or is just partly used, it will save oodles of watt-hours.
:toast:
Posted on Reply
#23
TreadR
Good luck!... I bet Intel's gonna be pissed!
Posted on Reply
#24
legends84
nvidia making a cpu?? hmm, what socket will it be... impressive...
Posted on Reply
Add your own comment