
Integrated Graphics Chip Market to Disappear by 2012 According to Jon Peddie Research

Discussion in 'News' started by alexp999, Mar 5, 2009.

  1. alexp999

    alexp999 Staff

    Joined:
    Jul 28, 2007
    Messages:
    8,045 (3.00/day)
    Thanks Received:
    862
    Location:
    Dorset, UK
    Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, today announced a new study that indicates the end of the market for the popular integrated graphics processor chipset, known as the IGP.

    After fifteen years of stellar growth, the IGP will cease to exist, replaced by graphics embedded in the processor. Integrated graphics are used in desktop and nettop PCs, notebooks, netbooks, and various embedded systems such as point-of-sale terminals, set-top boxes, and signage systems.

    In 2008, 67% of the graphics chips shipped were IGPs. By 2011 that share will drop to 20%, and by 2013 it will be less than one percent.

    However, this will not, as many believe, hurt the discrete graphics and add-in board market. In fact, with hybrid configurations, embedded graphics will enhance discrete GPU sales.

    For a period of time, between 2010 and 2012 there will be three choices for graphics available: traditional discrete GPUs mounted on add-in boards and/or the motherboard, integrated graphics processor (IGP) chipsets, and processors with embedded graphics. One or more of these devices will be employed in PCs.

    Inevitably, market shares will shift as suppliers of IGPs like AMD, Intel, Nvidia, SiS, and VIA find the opportunities for chipsets diminishing, and they will seek to develop new products that take advantage of their specific strengths. We can already see significant maneuvering between Intel and Nvidia: Nvidia is strengthening its high-end offerings with CUDA development tools, and on the mobile side the company has introduced the Tegra platform, which relies on an ARM processor and Nvidia graphics. AMD is going head to head with Intel with Fusion, a CPU with embedded graphics, but it too is building out its workstation and visualization graphics. VIA and its S3 Graphics subsidiary are playing their cards close to the chest, but they are currently attempting to challenge Intel on price in key strategic markets such as netbooks.

    The first integrated graphics controller (IGP) was Sun Microsystems' LEGOS, which came out in 1989 for its SPARC processor. The first integrated graphics controller for the PC was introduced by Silicon Integrated Systems (SiS) for Intel processors in 1997.

    The first processor with embedded graphics will be Intel's Westmere in Q4 2009; AMD will introduce its Fusion processor in Q2 2011, and both companies will employ a 32 nm process.
     
  2. AlCabone New Member

    Joined:
    Oct 2, 2007
    Messages:
    78 (0.03/day)
    Thanks Received:
    2
    Location:
    Europe
    So the GPUs integrated in the CPUs won't be called integrated GPUs.
     
  3. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    I'm not so sure about that. Really, from what I've seen around, there's been a shift lately from Intel IGPs to more capable Ati/Nvidia IGPs. That suggests that people want better IGPs than the ones Intel is offering, and IMHO chipset IGPs will always be able to squeeze out more performance than graphics embedded on the CPU die, due to power and size constraints in the socket. High-end CPUs are already somewhat limited by TDP constraints, and adding a powerful enough GPU will only add more. I can see that shift starting to happen in 2-3 years and being complete in 5, but not as soon as JPR says.

    I'm talking about what the market wants, though. I'm sure Intel will force the change, for example by not allowing IGPs for Westmere, even if they are required/wanted by a good chunk of the market...
     
  4. TreadR New Member

    Joined:
    Mar 4, 2009
    Messages:
    79 (0.04/day)
    Thanks Received:
    4
    AMD's 700 series consumes 15W max... so I don't see those power constraints you've mentioned.
     
  5. jagass New Member

    Joined:
    Feb 10, 2009
    Messages:
    271 (0.13/day)
    Thanks Received:
    13
    Nice article man...Thanks for this...
     
  6. xfire

    xfire New Member

    Joined:
    Nov 22, 2007
    Messages:
    1,395 (0.54/day)
    Thanks Received:
    193
    Location:
    Hyderabad,India
    So graphics cards were introduced to reduce the work of CPUs, and now they are being put back on CPUs. Just make a powerful CPU. (One way of seeing things.)
     
  7. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    Add those 15W to a 65W TDP CPU and you are way out of spec. Increasing the spec to contain the full consumption makes for a more expensive platform: an 80W TDP CPU socket + 20W for the chipset is more expensive than an evenly distributed split like 65W + 35W, or whatever. Extremes always cost more.

    And the high-end 700 series consumes much more than that anyway, regardless of the advertised figure (the 780G easily consumes 30-35W under load). My point is that today's high-end IGP is what people will ask for as low-end in 2 years, and today's CPU sockets couldn't handle a high-end CPU alongside a mid-to-high-end IGP. I don't see that changing much, due to costs.

    Also, let me doubt that Intel's x86-based IGPs are going to be anywhere close in perf/watt to Ati's superb offerings, because that was also taken into consideration in my comment, though I didn't say so. My bad.
     
    Last edited: Mar 5, 2009
  8. TreadR New Member

    Joined:
    Mar 4, 2009
    Messages:
    79 (0.04/day)
    Thanks Received:
    4
    20W for the chipset? I don't know where you're getting this, but as far as I know the 790FX uses less than 10W under load, and since graphics are on the CPU, basic chipsets like the 790FX are all that's required... that means 95W for the CPU (get this... not 65!) + 15W for the on-CPU GPU + 10W for the chipset = 120W... there are boards out there that can handle 140W just for the CPU.

    You're off with that assumption; deal with it and leave the weak arguments for some other time.
     
  9. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    Erm, I think you are misunderstanding what I mean. Integration is supposed to make things cheaper, and for the same reason the trend is to make CPUs with lower and lower TDPs. See the latest CPUs, please. 140W??? That's FAR from cheap. It can be done, of course, but an IGP would probably just be better.

    ALL of that assumes we are asking for the best IGP possible. If we are just talking about something that will make the screen work, then yes, it wouldn't matter. But that's why I stated that I think people are demanding more powerful IGPs, and I think that won't change. I said it in my very first line in this thread.

    Regarding the ATI 700 series: so how do you put together a 15W chipset, a 35-45W CPU, one stick of RAM and one 2500rpm HDD and end up with 100W+ of power consumption under load?? Tell me. Nah, the advertised TDP is one thing and what really happens is another. Constraints don't happen on paper, they happen (and DO happen nowadays on MANY cheaper boards) in the real world!! Don't talk to me about specs when there are more than enough proofs out there of true power figures and true power constraints...

    EDIT: Proof of the 780G. Let's do the math: 15W (780G) + 45W (CPU) + 8W (HDD) + 20W (memory, let's be generous) = 125W????? :eek: It doesn't look like it to me; that gives me 88W, I don't know about you. Ah, it must be that the memory consumes 50W, it must be that, for sure.
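    The back-of-the-envelope math in this exchange can be sketched as a quick check. A minimal sketch, using the per-component wattages quoted above as assumed figures (advertised TDPs, not measurements):

    ```python
    # Rough power-budget check for an IGP platform, summing the
    # component figures quoted in the thread (advertised TDPs;
    # real-world draw under load may differ).
    components_w = {
        "780G chipset (IGP)": 15,
        "CPU (35-45W class)": 45,
        "HDD": 8,
        "memory (generous)": 20,
    }

    total = sum(components_w.values())
    print(f"Sum of advertised TDPs: {total} W")  # 88 W

    # If a measured load figure sits well above this sum, either the
    # advertised TDPs are optimistic or losses elsewhere (VRM, PSU
    # efficiency) account for the gap.
    measured_load_w = 125  # example figure disputed in the post above
    gap = measured_load_w - total
    print(f"Unexplained gap vs. a {measured_load_w} W measurement: {gap} W")
    ```

    This supports the post's point: the quoted components sum to 88W, so a 125W load reading leaves a 37W gap that the advertised figures can't explain.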
     
    Last edited: Mar 5, 2009
  10. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,121 (6.11/day)
    Thanks Received:
    6,182
    Integrating it onto another part is still integrated. The chipset has moved to the CPU, so moving the IGP along with that chipset doesn't mean it isn't integrated anymore.
     
    TreadR says thanks.
    50 Million points folded for TPU
  11. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    I guess they are talking about what integrated Larrabee is supposed to be doing. AFAIK the integrated Larrabee flavour is NOT going to be a "GPU"; it's going to be more like a math coprocessor which happens to be able to render DirectX at the same time it is doing other regular math ops. Like GPGPU, but exactly the other way around. That's what I always understood from graphics on the CPU, "the Intel way II".
     
  12. mechtech

    mechtech

    Joined:
    Dec 26, 2006
    Messages:
    251 (0.09/day)
    Thanks Received:
    18
    I will believe it when I see it; I doubt it will happen that fast. I wonder if anyone/everyone at JPR is willing to bet their testicles on their assumption??? Anyone there, JPR??? I thought so.

    Next.
     
  13. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,961 (6.25/day)
    Thanks Received:
    3,803
    Location:
    IA, USA
    I can't say the general trend JPR is suggesting is incorrect, because it isn't. The GPU is moving from the north bridge to the CPU. AMD would have done this long ago, but they didn't for three reasons: 1) they didn't have the expertise or resources to pursue it, 2) they didn't have enough market presence to dictate to chipset manufacturers the significant overhaul that moving the IGP would require, and 3) they didn't have the memory controller on the processor until the Athlon 64. The first point changed when they acquired ATI. The second point changed when they started manufacturing chipsets themselves. The third they've had for quite some time, but they couldn't act on it until they got somewhere with the first two.

    Intel, on the other hand, has had points one and two for many years but didn't have point 3 until recently; hence the move.

    Now we have to watch the other corporations involved that are being pinned into a corner by the changes at AMD and Intel: VIA and NVIDIA. Both are effectively being driven out of the chipset market. NVIDIA already announced their answer: make an x86 processor to use with NVIDIA chipsets. VIA? I haven't heard anything but I suspect they will continue to support the C7 and future VIA processors but will shift manufacturing focus to other areas like ARM processors.

    Essentially, the entire industry is on the verge of changing. "Integrated" graphics processors are simply relocating to where the memory controller is. It's the ensuing battle following that paradigm shift that's going to cause a lot of heartache.
     
  14. TreadR New Member

    Joined:
    Mar 4, 2009
    Messages:
    79 (0.04/day)
    Thanks Received:
    4
    Try to be sober if you reply!... read my previous post to get the "math".... if you can. And please do take my advice and stop posting weak arguments without a point.
     
