Wednesday, March 4th 2009

NVIDIA to Try and Develop x86 CPU in Two to Three Years

The graphics market, it seems, is already getting too small for NVIDIA, so the green corporation is looking further ahead to building an x86 processor in the near future. At the Morgan Stanley Technology Conference in San Francisco yesterday, the company revealed that it plans to enter the x86 processor market by building an x86-compatible system-on-chip in the next two to three years. Michael Hara, NVIDIA's senior vice president of investor relations and communications, commented:
I think some time down the road it makes sense to take the same level of integration that we've done with Tegra... Tegra is by any definition a complete computer on a chip, and the requirements of that market are such that you have to be very low power, very small, but highly efficient. So in that particular state it made a lot of sense to take that approach, and someday it's going to make sense to take the same approach in the x86 market as well.
He also said that NVIDIA's future x86 CPU wouldn't be appropriate for every segment of the market, especially the high end of the PC market, which includes gaming systems, graphics engineering stations and many others. The x86 chip will mainly be targeted at smaller system-on-chip platforms. No other details were unveiled at the time of publication, and it's still very early to talk about something that exists only on paper.

Source: Blog on EDN

51 Comments on NVIDIA to Try and Develop x86 CPU in Two to Three Years

#1
eidairaman1
Intel slapped a lawsuit on AMD; I wonder what they're gonna do against NV?
Posted on Reply
#2
AddSub
They will probably be CPUs targeted at low- to mid-range notebooks or netbooks, at first at least. Intel can't be too happy about this, or AMD for that matter. While more competition is usually good for the consumer, these corporations generally do not like competition in any form. Even incompetent competition is usually considered dangerous.

Also, this is not that big of a surprise. There have been rumors about this going back at least 2-3 years, to autumn of 2006, when it was even rumored that nVidia might be buying AMD. Instead, nVidia around that time hired a bunch of former Stexar (ex-Intel) x86 engineers. I'm guessing that's around when nVidia started tinkering with the x86 architecture, behind closed doors at least.
Posted on Reply
#3
TooFast
lol... they will fail. :D
Posted on Reply
#4
bryan_d
Interesting article,

But I too agree that all will be in vain due to the x86 licensing. If game developers all jump to the x86 ship, nVidia is pretty much DEAD. AMD/ATI have their hands on the license, and hence can make C/GPU cards, but nVidia will be left drowning in an abandoned architecture. :(

I bet nVidia is sweating with their efforts in 3D gaming, only to be swept away from them with the coming of Larrabee.


DarkMatter said:
Why is Intel scared??

Well first of all, considering the power of computers today and the usage that most people do of the PCs, every nettop sold is one less "high-end" desktop CPU that Intel sells, and don't fool yourself, Intel's market is and always has been the high-end market.
Seriously? I hope you are being sarcastic.

The high-end market might be the main market here at TPU, but Intel is not stupid enough to believe that's what makes them rich! :) Do you really believe that the 40,590 members of TPU even have X-editions of Intel's line-up? Heck no. Intel fattens itself on CPUs that actually sell; AKA Atoms, the e1/2/4/5/700 series CPUs, and some lower Q's. Period.

It is like saying that Dodge was only successful because of the Viper... it was the Neons that fattened the pockets. :D

PowerVR HSR FTW!

Bryan d
Posted on Reply
#5
3nd0
WoW??

I really don't know why they (nVIDIA) want to destroy their reputation by competing in x86 CPU development.. (I'm not trying to offend anyone, but nVIDIA is doing a good job just developing GPUs!!) 'cause ATI is still my fav.. :rockout:
Posted on Reply
#6
DarkMatter
bryan_d said:
Interesting article,

But I too agree that all will be in vain due to the x86 licensing. If game developers all jump to the x86 ship, nVidia is pretty much DEAD. AMD/ATI have their hands on the license, and hence can make C/GPU cards, but nVidia will be left drowning in an abandoned architecture. :(

I bet nVidia is sweating with their efforts in 3D gaming, only to be swept away from them with the coming of Larrabee.




Seriously? I hope you are being sarcastic.

The high-end market might be the main market here at TPU, but Intel is not stupid enough to believe that's what makes them rich! :) Do you really believe that the 40,590 members of TPU even have X-editions of Intel's line-up? Heck no. Intel fattens itself on CPUs that actually sell; AKA Atoms, the e1/2/4/5/700 series CPUs, and some lower Q's. Period.

It is like saying that Dodge was only successful because of the Viper... it was the Neons that fattened the pockets. :D

PowerVR HSR FTW!

Bryan d
Those lower Q's, as you named them, are indeed high-end parts forced to lower prices by competition. And all those e1/2/4/... are no different: Core2 flavours that Intel had NO CHOICE but to create to compete with AMD's offerings and potentially Via. If you don't believe me, just look back and tell me how many sub-$100-150 CPUs they made in the past. They used to make the bare minimum of Celerons they could get away with. Period.

Anyway, if you read my post more carefully instead of trying to understand it from your ass, this same thing is explained in the post itself: when I say "high-end" (with quotes), I mean $100-500 CPUs and $30-100 chipsets... I think it's clear that I'm talking about "high-end" CPUs as opposed to cheap embedded solutions like the Atom, Nano, etc. The fact that I didn't talk about $300-1500 CPUs (half the people on TPU are in that range...) already contradicts your stupid reply. I can take criticism, but from someone who didn't bother to read and understand my post? NOPE.

Really, I don't know how else to explain to you that, from a modern market point of view, even a cheap $50 e1/2/4 CPU is already high-end compared to a $10 Atom. << The Atom is way overpriced now; competition and simple market trends will bring it down there soon.

On a $50 CPU Intel can keep $10 for themselves; on a $500 one they can keep $100-200 (does it matter if they sell 50-fold fewer of them? NO); on a $10 CPU they can't keep shit in comparison, and as I said, the market is not growing at a pace fast enough to offset that fact.

Nettops are going to cannibalize the market that feeds Intel. I repeat: Intel already owns the market, and a change doesn't benefit them, just as the previous trend change towards cheaper CPUs didn't benefit them either. On the other hand, a new market trend favours almost every other manufacturer, because it's a means of escape from a market where they cannot compete to one where the field is even for everybody. AMD, Via, and Nvidia (if they finally make a CPU) can't compete in the desktop market (AMD does its best, but barely), and nettops, ULPCs and the like can save them. Intel is scared by the fact that, with the current trend, "only" enthusiasts will buy their desktop CPUs in the not-so-far future, and that includes $50 CPUs. How did you say it? Ah, PERIOD.
Posted on Reply
#7
Error 404
If they incorporate stream processing units into this CPU that can accelerate x86 instructions, it would be a truly epic CPU. I hope they can pull this off; the Ion platform looks like a great chipset for any low- to mid-range PC/notebook/netbook, and an nVidia CPU in it would mean nVidia could sell these competitively against Intel.
I want nVidia CPUs, especially if they can be overclocked!
Posted on Reply
#8
eidairaman1
The way you are praising NV for such a practice makes me wonder: why are you using an Intel product if you don't like them very much?
Posted on Reply
#9
Swansen
DarkMatter said:
Why is Intel scared??
Well first of all, considering the power of computers today and the usage that most people do of the PCs, every nettop sold is one less "high-end" desktop CPU that Intel sells, and don't fool yourself, Intel's market is and always has been the high-end market.
Um, Intel has a HUGE hand in the nettop market, are you kidding me??? THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY. That said, no one ever realises that even though the consumer market is large, the corporate market is MUCH larger, as well as the server market. All desktop CPUs are spin-offs of server CPUs.

On Nvidia, everyone is freaking out over nothing. If and when Nvidia gets into the CPU market, it will be cell phones, smart phones, things like that; in 20 or so years we'll probably see an Nvidia desktop CPU.
Posted on Reply
#10
DarkMatter
Swansen said:
Um, Intel has a HUGE hand in the nettop market, are you kidding me??? THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY. That said, no one ever realises that even though the consumer market is large, the corporate market is MUCH larger, as well as the server market. All desktop CPUs are spin-offs of server CPUs.

On Nvidia, everyone is freaking out over nothing, if and when Nvidia gets into the CPU market, it will be cell phones, smart phones, things like that, 20 or so years as we'll probably see a Nvidia CPU.
Please read the post before posting. For God's sake!!

OK. For the nth time, and translated according to "THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY":

- Every nettop sold with an Atom is one less e7200 + G31 mobo sold by Intel.

Which at the same time can be translated as:

- Every $30 given to Intel is one less $150 given to Intel.

Do any of you guys understand this simple thing yet? Because I have no other damn way of saying it...
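The substitution arithmetic in this post can be sketched in a few lines of Python. The $30 and $150 figures are the post's own rough numbers for Intel's take on an Atom nettop versus an e7200 + G31 combo, not actual Intel pricing:

```python
# Hypothetical revenue-substitution sketch using the post's rough figures.
ATOM_TAKE = 30      # what Intel collects on an Atom-based nettop (post's figure)
DESKTOP_TAKE = 150  # what Intel collects on an e7200 CPU + G31 board (post's figure)

def revenue_shift(units_shifted):
    """Net change in Intel's revenue when `units_shifted` buyers pick a
    nettop over a mainstream desktop: each sale swaps $150 for $30."""
    return units_shifted * (ATOM_TAKE - DESKTOP_TAKE)

print(revenue_shift(1_000_000))  # a million swapped sales costs Intel $120M
```

Under these assumed numbers, every unit that migrates down-market erases $120 of revenue, which is the whole of the post's point.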
Posted on Reply
#11
DrPepper
The Doctor is in the house
I understand what you're talking about.
Posted on Reply
#12
3870x2
NV might be making their first move in filing an anti competitive complaint:
1. Try to gain a license for x86.
---a. Success! start making money competing in both markets! GO TO 2.
---b. FAIL! file anti competition complaint. GO TO 3.
2. Make money and compete directly on both fronts with their rival AMD. END.
3. Complaint filed in courts.
---a. Success! start competing with AMD on both fronts. END.
---b. FAIL! AMD 5xxx series GPU > NV GTX 300 series. CEO commits suicide, NV files bankruptcy.
Posted on Reply
#13
DarkMatter
DrPepper said:
I understand what your talking about.
Thanks!! At least one. Yeeeeeeeeehaaaaa!
Posted on Reply
#14
crazy pyro
I understand what you're talking about too, DarkMatter. I'm damn impressed with the performance of my Atom, so Intel won't be selling me a desktop CPU if the machine's only gonna be used for web browsing.
Posted on Reply
#15
3870x2
I think DarkMatter hit it on the spot with his novel on p1. However, Intel is safe, they know it, and they have no reason to be scared. Why?
I work in the US Army's 5th Special Forces Group at Ft. Campbell, Kentucky. We have contracts with both Dell and Intel worth far more than the consumer market demands, and we overpay, drastically. Intel likes dabbling in the consumer market, but it is contracts like these that make them money. Our contracts with AMD are only for server processors, and only dual cores at that, which hold only ~10% of the share.
Posted on Reply
#16
Roph
crazy pyro said:
I understand what you're talking about too Dark Matter, I'm damn impressed with the performance of my atom so intel won't be selling me a desk-top CPU if the machine's only gonna be used for web browsing.
Don't forget VIA's Nano. While it does use slightly more power than the Atom, it's still aimed at the Atom's market and performs a lot better, including handling 1080p fine, which the Atom can't do. Also, it doesn't have to be lumped together with an old, power-hungry chipset, unlike the Atom.

I wonder if nVidia is still considering Ion, but with the VIA Nano instead.
Posted on Reply
#17
KBD
Roph said:
Don't forget VIA's nano. While it does use slightly more power than atom, it's still aimed at atom's market and does perform alot better, including handling 1080p fine, whith atom can't do. And also, it doesn't have to be lumped together with an old, power hungry chipset, unlike atom.

I wonder if nvidia are still considering ION but with the VIA nano instead.
While the VIA Nano is a nice little chip and a direct competitor to the Atom, it is nowhere to be found. Try buying one, or a nettop based on it; it will be very, very hard. But the Atom is so commonplace it's ridiculous. So the Nano is really out of the equation until VIA makes the chip and Nano-based nettops widely available.
Posted on Reply
#18
crazy pyro
Also, there are no Nano netbooks out yet with the battery life of the NC10, well, except possibly the NC20 when it comes out.
Posted on Reply
#19
DarkMatter
3870x2 said:
I think darkmatter hit it on the spot with his novel on p1. However, Intel is safe, they know it, they have no reason to be scared. why?
I work in the US army 5th special forces group in FT Campbell kentucky. We have contracts with both dell and intel worth more than the consumer market demands by far, and we overpay, drastically. Intel likes dabbling in the consumer market, but it is contracts like these that make them money. Our contracts with AMD are only for server processors, and only dual cores at that, which hold only ~10% of the share.
Even then they are scared. Even if the consumer market is smaller, it's still a very big portion of their revenue, and in reality even the smaller portions are very important. The PC business is very harsh and the profits are not so big in comparison to the revenue; that is, expenses are big too. It's not unusual to see tens of billions of dollars in revenue and still millions in losses. So if for any reason you lose 10% of the revenue (which doesn't seem like much at first glance), that could very well mean you went from making profits to some worrying losses, and IMHO that's the reality Intel could face. Being the big company they are, they are much less flexible about restructuring to lower expenses. They depend on their revenues more than most people think. And even if they are far from a difficult situation, they just don't want to be there. They are scared; their actions show it.

EDIT: And don't underestimate Nvidia in that market you mentioned. Maybe not in the army, but in medical environments or in topography CUDA has made a very good impression, doing things that Intel can only dream of, at least for now. CUDA is already being implemented there, and as for the army, a GPGPU solution could do a lot of things better than a CPU: GPS image tracking or pattern finding, for example. What you show is the CURRENT reality, but Intel is scared of the future. Things are bright for them now, and they want them to stay that way forever. Intel right now is the child that has never seen bad days (or has long forgotten them) and is scared of what might be ahead.
Posted on Reply
#20
Swansen
DarkMatter said:
Please read the post before posting. For Gods sake!!

O K. For the nth time. And traduced according to "THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY.":

- Every nettop sold with an Atom, is one less e7200 + G31 mobo sold by Intel.

Which at the same time can be traduced as:

- Every 30$ given to Intel is one less $150 given to Intel.

Do any of you guys understand this simple thing yet? Because I have no other damn way of saying it...
That's not how economics works; you make money on volume. Even then, you're not understanding this: aside from everything else I said, just because the price is X amount higher doesn't mean there is more profit. High-end parts are expensive because they are difficult to make and because there is a smaller market, and vice versa for low-end parts. It's that simple. I understand your point, but your logic has holes. Even then, Intel makes a BOAT LOAD off licensing: they develop something, and then everyone else pays them to make what they developed, almost pure profit. Intel has ZERO reason to be any kind of scared. Intel has an extreme amount of money and will win any kind of suit against them; CHINA, as in the country, couldn't get an x86 license from them. Honestly, 3870X2 pretty much nailed it. And on the nettops, they are a VERY small portion of the market; most people don't like them because they are too small. Intel isn't going anywhere, they aren't "scared", Nvidia isn't going to topple any mountains, and they are a LONG way off from creating a processor that would compete with AMD or Intel. Also, there are a couple of open-source derivatives of CUDA which do just as good a job as CUDA does. On that, Intel is HUGE; they will have an answer to CUDA soon enough.
Posted on Reply
#21
DarkMatter
Swansen said:
High end parts are expensive because they are difficult to make and because there is a smaller market, which is vise versa for low end parts. Its that simple, i understand your point, but your logic, but it has wholes.
:roll: So let me understand that. They told you that marketing BS and you believed it????
My god. While that is true to a point, and was more true some time ago than it is today, it is definitely not true to the extent that prices suggest. The cheapest and the most expensive quad (about $150 and $1500) cost exactly the same to develop (it's the same chip on the same wafer) and almost exactly the same to test, and the excuse for selling them high is that they are selected chips that can clock higher, have some better properties, etc. / end of theory

Now take the Core2: it just happens to OC like a charm, to almost 3.8 GHz, regardless of its original clock. Could it have been clocked at 3 GHz in the first place? Of course, my friend. Do they make a profit selling them at $130? Of course, my friend. Then could Intel sell $150 quads clocked at 3 GHz+ and still make a profit? Of course, my friend. It's common business, but in no way is it because they are harder to make. :laugh:

* Higher-end CPUs DO tend to OC better, but is it truly because they are selected pieces, or because they have a higher multiplier? I just ask. :rolleyes:
Even then, Intel make a BOAT LOAD off licensing, they develop something, and then everyone else pays them to make what they developed, almost pure profit.
Yeah, agreed. And because of that, what happens when that IP starts losing its value, as is the case with x86 CPUs? :cry:

EDIT: You forgot to list "they force the adoption of that thing" and "they ensure exclusivity by not licensing to companies that could do it better" in that last sentence, though.
Posted on Reply
#22
Swansen
DarkMatter said:
:roll: So let me undertand that. They told you that BS, marketing BS and you believed it????

Now we take Core2 and it ju
Yeah, agreed. And because of that, what happens when that IP starts losing it's value as is the case with x86 CPUs? :cry:
....... wow... first off, YOU MISSED HALF OF WHAT I'VE WRITTEN, so whatever, we are even. What about the fall? Everything currently is still based on the x86 architecture, so Intel has a say. Right now the only reason Intel is allowing AMD to make processors is x86_64, in which case we are in a whole different ball game, one in which, if Nvidia wants in, they have to go through AMD and Intel. Good luck. Yes, I understand that licensing doesn't help development, but again, that's a completely different conversation. I agree, I don't agree with the current rules, they only hinder development, but we live in a world where they exist, in turn making Nvidia's climb to anything a very difficult one.
Posted on Reply
#23
DarkMatter
Swansen said:
....... wow... first off, YOU MISSED HALF OF WHAT I'VE WRITTEN, so whatever we are even. What about the fall? everything currently is still based on a x86 architecture, so Intel has say. Right now the only reason Intel is allowing AMD to make processors is because of X86_64. In which case we are in a whole different ball game, in which if Nvidia wants in, they have to go through AMD and Intel, good luck. Yes, i understand that licensing doesn't help development, but again, thats a completely different conversation. I agree, i don't agree with current rules, they only hinder development, but we live in a world where they exists, in turn making Nvidia's climb to anything a very difficult one.
I just answered the points I wanted to make clear, but if you want:

Overall, everything you said was only half true. Like that you make money on volume: you make money on volume and on margins. No margin = no profit. High-end parts have a much higher margin (like 10 or 20 to one), so even if the low end is 10 times bigger by volume, in profits it's only about twice as big. And the most profitable segment is always the mainstream anyway, and it's that segment which is really at risk. Ion, while it could be considered super-low-end, does threaten some of Intel's mainstream solutions, because it has better graphics than Intel's best. If it were a success, that could lead to Nvidia moving the same scheme to higher levels and would, in fact, erase the need for ANY mainstream CPU. Intel would then be left with only the enthusiast segment (which will never disappear; no one, not even Nvidia, said it would) and the new low segment, which would be the Atom (and its successors) rather than the $50-100 CPUs of today. Not the bright future Intel wants, that's for sure. That's the main reason they are trying their best to ban Ion without making too much noise.
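The margin-versus-volume argument above can be made concrete with a tiny sketch. All the numbers below are hypothetical, picked only so the ratios come out roughly as the post claims (far higher per-unit margin on high-end parts against a 10-to-1 volume advantage on low-end ones):

```python
# Hypothetical margin-vs-volume sketch; figures are illustrative, not real Intel data.
def segment_profit(margin_per_unit, units):
    """Total profit is per-unit margin times unit volume;
    neither factor alone decides which segment earns more."""
    return margin_per_unit * units

low_end = segment_profit(margin_per_unit=10, units=10_000_000)  # thin margin, big volume
high_end = segment_profit(margin_per_unit=50, units=1_000_000)  # 5x margin, 1/10th volume

# Ten times the volume buys only twice the total profit here.
print(low_end, high_end)
```

With these assumed ratios the low-end segment out-earns the high-end one by only a factor of two despite shipping ten times as many units, which is the shape of the trade-off the post describes.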
Posted on Reply
#24
bryan_d
Children in Forums=FTL!

I think Darkmatter needs to take a deep breath, and just step away from "Darkmatter land" for a moment.

If you really believe Intel only focuses on its high-end CPUs to drive its profits, then go ahead and believe what you want. But the vast majority of corporate environments, home businesses, educational institutes, and personal computers that I have seen with my own eyes do not use high-end parts; these are what make up Intel's profits.

Well I am finished here. If you will continue to spit and scream at your monitor, go ahead. :)

Darkmatter go see some family and loved ones,

Bryan D.
Posted on Reply
#25
DarkMatter
bryan_d said:
I think Darkmatter needs to take a deep breath, and just step away from "Darkmatter land" for a moment.

If you really believe Intel only focuses on its high-end CPU's to drive its profits then you can go ahead and believe what you want. But the vast majority of Corporate environments, Home Business, Educational Institutes, and personal computers that I have seen with my own eyes do not use high-end; these are what make up Intel's profits.

Well I am finished here. If you will continue to spit and scream at your monitor, go ahead. :)

Darkmatter go see some family and loved ones,

Bryan D.
OK, I will repeat it. The LOW END you are talking about IS HIGH-END NOW; the Atom is LOW now.

I mean, I've been giving numbers, and I've been giving the names of the CPUs I consider high-end in this discussion (ah, and only within the context of this discussion; I'm not crazy yet, thanks). Namely the e7200 and G31. :banghead:

And BTW, I'm very proud of the companies, colleges and government in Spain right now, because they have much better computers than what you are suggesting, my friend. Better than the ones I listed above...

EDIT: Ah:
If you will continue to spit and scream at your monitor, go ahead. :)
I didn't read that the first time; it's pretty clear what kind of thing comes after "I am finished here". But it was so funny that I had to reply to it. :laugh: You are naive at best if you think that anything said here or on any forum can rattle me. Just because I use the language I use doesn't mean I am worked up or something. It means, well, that I am Spanish... maybe? More so... from Bilbao? (Don't worry, if any Spanish people come in, they will understand why this matters :D)
Posted on Reply