
NVIDIA to Try and Develop x86 CPU in Two to Three Years

Discussion in 'News' started by malware, Mar 4, 2009.

  1. v12dock


    Joined:
    Dec 18, 2008
    Messages:
    1,617 (0.74/day)
    Thanks Received:
    326
    Go Go nvidia
     
  2. eidairaman1


    Joined:
    Jul 2, 2007
    Messages:
    13,633 (5.00/day)
    Thanks Received:
    1,850
    Intel slapped a lawsuit against AMD; I wonder what they're gonna do about it against NV?
     
  3. AddSub


    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.33/day)
    Thanks Received:
    152
    They will probably be CPUs targeted at low- to mid-range notebooks or netbooks, at first at least. Intel can't be too happy about this, or AMD for that matter. While more competition is usually good for the consumer, these corporations generally do not like competition in any form. Even incompetent competition is usually considered dangerous.

    Also, this is not that big of a surprise. There have been rumors about this going back at least 2-3 years, to the autumn of 2006, when it was even rumored that nVidia might buy AMD. Instead, around that time nVidia hired a bunch of former Stexar (ex-Intel) x86 engineers. I'm guessing that's also roughly when nVidia started tinkering with x86 architecture, behind closed doors at least.
     
  4. TooFast New Member

    Joined:
    Sep 11, 2005
    Messages:
    391 (0.12/day)
    Thanks Received:
    6
    Location:
    montreal quebec
    lol...........they will fail.:D
     
  5. bryan_d New Member

    Joined:
    Dec 28, 2006
    Messages:
    42 (0.01/day)
    Thanks Received:
    4
    Interesting article,

    But I too agree that it will all be in vain because of x86 licensing. If game developers all jump onto the x86 ship, nVidia is pretty much DEAD. AMD/ATI have their hands on the license, and hence can make C/GPU cards, but nVidia will be left drowning in an abandoned architecture. :(

    I bet nVidia is sweating with their efforts in 3D gaming, only to be swept away from them with the coming of Larrabee.


    Seriously? I hope you are being sarcastic.

    The high-end market might be the main market here on TPU, but Intel is not stupid enough to believe that's what makes them rich! :) Do you really believe that the 40,590 members of TPU even have X-editions of Intel's line-up? Heck no. Intel gets rich off the CPUs that actually sell, AKA Atoms, the e1/2/4/5/700-series CPUs, and some lower Qs. Period.

    It is like saying that Dodge was only successful because of the Viper... it was the Neons that fattened the pockets. :D

    PowerVR HSR FTW!

    Bryan d
     
  6. 3nd0 New Member

    Joined:
    Oct 2, 2007
    Messages:
    2 (0.00/day)
    Thanks Received:
    0
    Location:
    Bandung, Indonesia
    WoW??

    I really don't know why they (nVIDIA) want to destroy their reputation by competing in x86 CPU development. (I'm not trying to offend anyone, but nVIDIA is doing a good job by just developing GPUs!) Because ATI is still my favorite.. :rockout:
     
  7. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    Those lower Qs, as you named them, are indeed high-end parts forced down in price by competition. And all those e1/2/4/... are no different: Core 2 flavours that Intel had NO CHOICE but to create to compete with AMD's offerings, and potentially VIA's. If you don't believe me, just look back and tell me how many sub-$100-150 CPUs they made in the past. They used to make the bare minimum of Celerons they could. Period.

    Anyway, if you read my post more carefully, and don't try to understand it from your ass, this same thing is explained in the post itself, where I say "high-end" (with quotes): $100-500 CPUs, $30-100 chipsets... I think it's clear that I'm talking about "high-end" CPUs as opposed to cheap embedded solutions like Atom, Nano, etc. The fact that I didn't talk about $300-1500 CPUs (half the people on TPU are in that range...) already contradicts your stupid reply. I can stand criticism, but not from someone who didn't bother to read and understand my post.

    Really, I don't know how else to explain to you that, from a modern market point of view, even a cheap $50 e1/2/4 CPU is already high-end compared to a $10 Atom. << The Atom is way overpriced now; competition and simple market trends will bring it down there soon.

    On a $50 CPU Intel can keep $10 for themselves; on a $500 one they can keep $100-200 (does it matter if they sell 50-fold fewer of them? NO); on a $10 CPU they can't keep anything worthwhile in comparison, and as I said, the market is not growing at a fast enough pace to offset that fact.

    Nettops are going to cannibalize the market that feeds Intel. I repeat: Intel owns the market already, and a change doesn't benefit them, just as the previous trend toward cheaper CPUs didn't benefit them either. On the other hand, a new market trend favours almost every other manufacturer, because it's a means of escape from a market where they cannot compete into one where the field is even for everybody. AMD, Via, and Nvidia (if they finally make a CPU) can't compete in the desktop market (AMD does its best, but barely), and nettops, ULPCs, and the like can save them. Intel is scared by the fact that, on the current trend, "only" enthusiasts will buy their desktop CPUs in a not-so-distant future, and that includes the $50 CPUs. How did you put it? Ah, PERIOD.
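The per-unit arithmetic in the post above can be put into a quick sketch. Every dollar figure below is the post's own rough assumption (the $150 is a midpoint of its "$100-200" claim, and the $1 Atom figure is a hypothetical stand-in for "can't keep anything worthwhile"), not a real Intel number:

```python
# Per-unit profit figures as assumed in the post above (illustrative only,
# not real Intel financials).
price_points = [
    # (retail_price_usd, assumed_profit_per_unit_usd, label)
    (50, 10, "mainstream CPU"),
    (500, 150, "high-end CPU"),  # midpoint of the post's "$100-200"
    (10, 1, "Atom-class CPU"),   # hypothetical stand-in figure
]

for price, profit, label in price_points:
    # Margin fraction per unit at each price tier.
    print(f"${price:>3} {label}: ${profit}/unit = {profit / price:.0%} margin")
```

Whatever the exact numbers, the shape of the argument is that the assumed margin grows faster than the price tier, so a market shift toward the $10 tier compresses the profit pool even if unit volume holds up.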
     
    Roph says thanks.
  8. Error 404


    Joined:
    Apr 14, 2008
    Messages:
    1,777 (0.73/day)
    Thanks Received:
    169
    Location:
    South Australia
    If they incorporate Stream Processing Units into this CPU that can accelerate x86 instructions, it would be a truly epic CPU. I hope they can pull this off; the Ion platform looks like a great chipset for any low- to mid-range PC/notebook/netbook, and an nVidia CPU in it would mean nVidia could sell these competitively against Intel.
    I want nVidia CPUs, especially if they can be overclocked!
     
  9. eidairaman1


    Joined:
    Jul 2, 2007
    Messages:
    13,633 (5.00/day)
    Thanks Received:
    1,850
    The way you are praising NV for such a practice makes me wonder: why are you using an Intel product if you don't like it very much?
     
  10. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    Um, Intel has a HUGE hand in the nettop market, are you kidding me??? THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY. That said, no one ever realises that even though the consumer market is large, the corporate market is MUCH larger, as is the server market. All desktop CPUs are spin-offs of server CPUs.

    As for Nvidia, everyone is freaking out over nothing. If and when Nvidia gets into the CPU market, it will be cell phones, smartphones, things like that; it will probably be 20 or so years before we see an Nvidia desktop CPU.
     
  11. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    Please read the post before posting. For God's sake!!

    OK. For the nth time, and translated for "THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY":

    - Every nettop sold with an Atom is one less e7200 + G31 mobo sold by Intel.

    Which at the same time can be translated as:

    - Every $30 given to Intel is $150 less given to Intel.

    Do any of you guys understand this simple thing yet? Because I have no other damn way of saying it...
     
  12. DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.96/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    I understand what you're talking about.
     
    DarkMatter says thanks.
  13. 3870x2


    Joined:
    Feb 26, 2008
    Messages:
    4,875 (1.96/day)
    Thanks Received:
    689
    Location:
    Joplin, Mo
    NV might be making their first move by filing an anti-competitive complaint:
    1. Try to gain a license for x86.
    ---a. Success! Start making money competing in both markets! GO TO 2.
    ---b. FAIL! File an anti-competition complaint. GO TO 3.
    2. Make money and compete directly on both fronts with their rival AMD. END.
    3. Complaint filed in the courts.
    ---a. Success! Start competing with AMD on both fronts. END.
    ---b. FAIL! AMD 5xxx series GPU > NV GTX 300 series. CEO commits suicide, NV files for bankruptcy.
     
  14. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    Thanks!! At least one. Yeeeeeeeeehaaaaa!
     
  15. crazy pyro New Member

    Joined:
    Jun 28, 2008
    Messages:
    1,662 (0.70/day)
    Thanks Received:
    125
    Location:
    Newcastle
    I understand what you're talking about too, DarkMatter. I'm damn impressed with the performance of my Atom, so Intel won't be selling me a desktop CPU if the machine's only gonna be used for web browsing.
     
  16. 3870x2


    Joined:
    Feb 26, 2008
    Messages:
    4,875 (1.96/day)
    Thanks Received:
    689
    Location:
    Joplin, Mo
    I think DarkMatter hit the nail on the head with his novel on p1. However, Intel is safe, they know it, and they have no reason to be scared. Why?
    I work in the US Army 5th Special Forces Group at Fort Campbell, Kentucky. We have contracts with both Dell and Intel worth far more than the consumer market demands, and we overpay, drastically. Intel likes dabbling in the consumer market, but it is contracts like these that make them money. Our contracts with AMD are only for server processors, and only dual-cores at that, which hold only ~10% of the share.
     
  17. Roph


    Joined:
    Nov 1, 2008
    Messages:
    380 (0.17/day)
    Thanks Received:
    109
    Don't forget VIA's Nano. While it does use slightly more power than the Atom, it's still aimed at the Atom's market and performs a lot better, including handling 1080p fine, which the Atom can't do. Also, it doesn't have to be lumped together with an old, power-hungry chipset, unlike the Atom.

    I wonder if Nvidia are still considering Ion, but with the VIA Nano instead.
     
  18. KBD New Member

    Joined:
    Feb 23, 2007
    Messages:
    2,477 (0.87/day)
    Thanks Received:
    279
    Location:
    The Rotten Big Apple
    While the VIA Nano is a nice little chip and a direct competitor to the Atom, it is nowhere to be found. Try buying one, or a nettop based on it: it will be very, very hard. But the Atom is so commonplace it's ridiculous. So the Nano is really out of the equation until VIA makes the chip and Nano-based nettops widely available.
     
  19. crazy pyro

    crazy pyro New Member

    Joined:
    Jun 28, 2008
    Messages:
    1,662 (0.70/day)
    Thanks Received:
    125
    Location:
    Newcastle
    Also, there are no Nano netbooks out yet with the battery life of the NC10, except possibly the NC20 when it comes out.
     
  20. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    Even then they are scared. Even if the consumer market is smaller, it's still a very big portion of their revenue, and in reality even the smaller portions are very important. The PC business is very harsh, and the profits are not so big compared to the revenue; that is, expenses are big too. It's not unusual to see tens of billions of dollars in revenue and still millions in losses. So if for any reason you lose 10% of the revenue (which doesn't seem like much at first glance), that could very well mean you went from making profits to some worrying losses, and IMHO that's the reality Intel could face. Being the big company they are, they are much less flexible about restructuring to lower their expenses. They depend on their revenues more than most people think. And even if they are far from a difficult situation, they just don't want to end up there. And they are scared; their actions show it.

    EDIT: And don't underestimate Nvidia in that market you mentioned. Maybe not the army, but in medical environments or in topography CUDA has made a very good impression, doing things that Intel can't even dream of, at least for now. CUDA is already being implemented there, and as for the army, a GPGPU solution could do a lot of things better than a CPU: GPS image tracking or pattern finding, for example. What you show is the CURRENT reality, but Intel is scared of the future. Things are bright for them now, and they want them that way forever. Intel right now is like a child that has never seen bad days (or has long forgotten them) and is scared of what might be ahead.
     
    Last edited: Mar 5, 2009
  21. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    That's not how economics works; you make money on volume. Even then, you're not understanding this: aside from everything else I said, just because the price is X amount higher doesn't mean there is more profit. High-end parts are expensive because they are difficult to make and because there is a smaller market, and vice versa for low-end parts. It's that simple; I understand your point, but your logic has holes. Even then, Intel makes a BOATLOAD off licensing: they develop something, and then everyone else pays them to make what they developed; that's almost pure profit. Intel has ZERO reason to be any kind of scared. Intel has an extreme amount of money and will win any kind of suit against them; CHINA, as in the country, couldn't get an x86 license from them. Honestly, 3870X2 pretty much nailed it. And on nettops: they are a VERY small portion of the market; most people don't like them because they are too small. Intel isn't going anywhere, they aren't "scared", Nvidia isn't going to topple any mountains, and they are a LONG way off from creating a processor that would compete with AMD or Intel. Also, there are a couple of open-source alternatives to CUDA which do just as good a job as CUDA does. On that note, Intel is HUGE; they will have an answer to CUDA soon enough.
     
    Last edited: Mar 5, 2009
  22. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    :roll: So let me understand that. They told you that marketing BS and you believed it????
    My god. While that is true to a point, and was more true some time ago than it is today, it definitely is not true to the extent that prices suggest. The cheapest and the most expensive quad (about $150 and $1500) cost exactly the same to develop (it's the same chip on the same wafer) and almost exactly the same to test; the excuse for selling them high is that they are selected chips that can clock higher, have some better properties, etc. / end of theory

    Now take Core 2, which just happens to OC like a charm, to almost 3.8 GHz regardless of its original clock. Could it have been clocked at 3 GHz in the first place? Of course, my friend. Do they make a profit selling them at $130? Of course, my friend. Then could Intel sell $150 quads clocked at 3 GHz+ and still make a profit? Of course, my friend. It's common business, but in no way is it because they are harder to make. :laugh:

    * Higher-end CPUs DO tend to OC better, but is it truly because they are selected pieces, or because they have a higher multiplier?? I just ask. :rolleyes:

    Yeah, agreed. And because of that, what happens when that IP starts losing its value, as is the case with x86 CPUs? :cry:

    EDIT: You forgot to list "they force the adoption of that thing" and "they ensure exclusivity by not licensing to companies that could do it better" in that last sentence, though.
     
    Last edited: Mar 5, 2009
  23. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    ....... Wow... First off, YOU MISSED HALF OF WHAT I'VE WRITTEN, so whatever, we are even. What about the fall? Everything currently is still based on the x86 architecture, so Intel has the say. Right now the only reason Intel is allowing AMD to make processors is x86-64, in which case we are in a whole different ball game, where if Nvidia wants in, they have to go through AMD and Intel; good luck. Yes, I understand that licensing doesn't help development, but again, that's a completely different conversation. I agree that the current rules only hinder development, but we live in a world where they exist, which in turn makes Nvidia's climb to anything a very difficult one.
     
  24. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.65/day)
    Thanks Received:
    184
    I just answered the points I wanted to make clear, but if you want:

    Overall, everything you said was only half true. Like the claim that you make money on volume: you make money on volume and on margins. No margin = no profit. High-end parts have a much higher margin (like 10 or 20 to one), so even if low-end volume is 10 times bigger, in profit it's only about twice as big. And the most profitable segment is always the mainstream anyway, and it's that market segment which is really at risk. Ion, while it could be considered super-low-end, does threaten some of Intel's mainstream solutions, because it has better graphics than Intel's best. If it were a success, that could lead to Nvidia moving the same scheme up to higher levels, which would, in fact, erase the need for ANY mainstream CPU. Intel would then be left with only the enthusiast segment (which will never disappear, and no one, not even Nvidia, said it would) and the new low segment, which would be the Atom (and its successors), not the $50-100 CPUs of today. Not the bright future Intel wants, that's for sure. That's the main reason they are trying their best to kill Ion without making too much noise.
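The volume-versus-margin point above can be checked with a toy calculation. The figures are hypothetical, chosen only to illustrate the ratios under discussion (roughly 10:1 margin advantage for high-end versus roughly 10:1 volume advantage for low-end), not real sales data:

```python
# Toy profit model: total profit = per-unit margin x unit volume.
# Hypothetical figures matching the claimed ratios, not real data.
low_end = {"margin": 10, "volume": 1_000_000}   # 10x the high-end volume
high_end = {"margin": 100, "volume": 100_000}   # 10x the low-end margin

low_profit = low_end["margin"] * low_end["volume"]
high_profit = high_end["margin"] * high_end["volume"]

# A 10x volume advantage is exactly cancelled by a 10x margin disadvantage,
# which is why "you make money on volume" is only half the story.
print(low_profit, high_profit)  # 10000000 10000000
```

With these particular ratios the two profit pools come out equal; tilt either ratio and the balance shifts, which is the crux of the disagreement between the two posters.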
     
  25. bryan_d

    bryan_d New Member

    Joined:
    Dec 28, 2006
    Messages:
    42 (0.01/day)
    Thanks Received:
    4
    Children in Forums=FTL!

    I think Darkmatter needs to take a deep breath, and just step away from "Darkmatter land" for a moment.

    If you really believe Intel only focuses on its high-end CPUs to drive its profits, then go ahead and believe what you want. But the vast majority of corporate environments, home businesses, educational institutions, and personal computers that I have seen with my own eyes do not use high-end parts; those machines are what make up Intel's profits.

    Well, I am finished here. If you want to continue to spit and scream at your monitor, go ahead. :)

    Darkmatter go see some family and loved ones,

    Bryan D.
     
