Sunday, October 30th 2011

Are Improving Integrated Graphics Slowly Killing Off Discrete Graphics Cards?

Intel started the trend of improving integrated graphics with their second generation LGA1155 socket Core i3, i5 & i7 line of processors. Depending on the model, these processors sport integrated HD 2000 or HD 3000 graphics right on the processor die, which nowadays give acceptable performance for low-end gaming and can play Full HD 1080p video perfectly. This trend is set to continue with the upcoming Ivy Bridge processors, which will be able to support a massive 4096 x 4096 pixel display, as we reported here. AMD now also have equivalent products with their Llano-based A-series processors. So, where does this leave discrete graphics cards? Well, the low-end market is certainly seeing reduced sales, as there really isn't enough of a performance difference nowadays to always warrant an upgrade from an IGP. As integrated graphics improve further, one can see how this will hurt sales of higher-end graphics cards too. The problem is that the bulk of the profit comes not from the top-end powerhouse graphics cards, but from the low to mid-range cards which allow these companies to remain in business, so cannibalizing sales of these products to integrated graphics could make high-end graphics cards a much more niche product and, crucially, much more expensive to boot.

Hence, it's not surprising to see Digitimes reporting that while NVIDIA are about to produce their next generation Kepler-based GPUs on TSMC's 28 nm process and AMD have already started production of their Southern Islands-based GPUs, the graphics card manufacturers are cautious about jumping in head first with cards based on these new products. Taiwan-based card makers are watching the market before making decisions, according to Digitimes' industry sources:
Compared to the makers' eagerness for the previous-generation GPUs, graphics card makers are rather conservative about the upcoming 28nm chips due to concerns such as TSMC's weak 40nm process yield rate issues may re-occur in its 28nm process and weakening demand for graphics cards and lower-than-expected gross margins.
The poor 28nm yield rate isn't helping either:
Although previous rumors have indicated that TSMC's poor 28nm process yield rate could affect Nvidia's launch of its 28nm GPUs on schedule at the end of 2011, as TSMC already announced its 28nm process has entered mass production, Nvidia's new Kepler GPUs are expected to be announced in December.
All this, of course, is bad news for PC enthusiasts, who are always looking to upgrade their PCs with the latest technology so that they can run power-intensive tasks, such as 3D gaming and distributed computing projects like Folding@Home. On the plus side, a top-end card like a GTX 580 or HD 6970 will not be integrated into an IGP any time soon, because of the sheer power, heat and die size requirements, so there is still hope that affordable high-end cards will remain available.

What's interesting is that, as AMD are now a combined CPU & GPU company, they know full well that their IGP solutions eat into sales of their own discrete low to mid-range graphics cards. It will be worth watching closely how AMD deals with this problem.

79 Comments on Are Improving Integrated Graphics Slowly Killing Off Discrete Graphics Cards?

#1
NdMk2o1o
I really hope it is kept in Comments and Feedback, because I for one am sick to death of seeing posts constantly crapped in and derailed, more specifically posts from said news poster. To be honest I see it a lot from the same group of regular members, which amounts to no more than bullying imo. People seem to overlook that all of these news threads/editorials are normally sourced elsewhere, but still feel the need to attack the OP time and time again. It's out of order and I for one am starting to get a little annoyed at it all.
#2
erocker
Senior Moderator
Okay, now let's keep on topic. :)
#3
ensabrenoir
minigig said:
The only player here to get hurt is NVIDIA. AMD makes money either way; it's still their GPU, and in fact it's cheaper for them to have it on-die.

Intel only gains from having better IGPs.

Since NVIDIA does not have CPUs, they are the losers here in the low-end market.
NC37 said:
Playing a game with an engine from 5+ years ago does not mean they are good.

IGPs will surpass discrete when and only when... IGPs can play modern, current titles at high detail right out of the box. For AMD, this means: pair your APUs with high mid-range GPUs minimum. For Intel this means... just give up; you'll never be able to produce GPUs of this caliber. You are a good CPU maker but you lack the ability to make GPUs.
What Intel does have, though, is resources, i.e. money. A few phone calls and an email or two (yeah, I know it's more involved than that) and Intel acquires/partners with NVIDIA, stir, mix, and in a few years they're stomping as usual. It's just me, but I still see iPads as toys and don't consider them for computing (gonna buy one though to see for myself). I do however realize this is where everything is headed (low power & mobile). Who knows, one day we might be gaming on 3D Eyefinity iPads with holographic screens :laugh:... I might need to patent that.
#4
shiny_red_cobra
Integrated graphics cards are and will always be terrible. /thread
#5
swaaye
Are we talking IGPs as in what's in the PS3 and 360? ;) Because those are extremely popular, successful and capable IGPs.

IGPs on computers serve 90% of computer users. If somebody wants an upgrade, it's because they're 1) a 3D gaming enthusiast, 2) running professional graphics software, or 3) in need of more/different output options. But really, most computer users only need their graphics processor to be able to run the OS GUI.
#6
NdMk2o1o
ensabrenoir said:
What Intel does have, though, is resources, i.e. money. A few phone calls and an email or two (yeah, I know it's more involved than that) and Intel acquires/partners with NVIDIA, stir, mix, and in a few years they're stomping as usual. It's just me, but I still see iPads as toys and don't consider them for computing (gonna buy one though to see for myself). I do however realize this is where everything is headed (low power & mobile). Who knows, one day we might be gaming on 3D Eyefinity iPads with holographic screens :laugh:... I might need to patent that.
I actually agree with part of your post and think it wouldn't be wholly unrealistic for Intel and NVIDIA to partner and come up with an APU that rivals and aims to surpass AMD's; it seems very logical, in fact.
#7
Crap Daddy
It's not the improved IGPs that are killing off discrete graphics cards. It's the game developers, publishers and the general trend of making games for mobile platforms.
#8
wiak
qubit said:
But you've missed the crucial difference: it's not integrated into the CPU, but in the chipset. Putting it in the CPU changes the dynamics considerably, as you can see.
Integrated Graphics, http://en.wikipedia.org/wiki/Graphics_processing_unit#Integrated_graphics_solutions
Basically, graphics built into something that it shares die space and main memory with.

The HD 3200, aka 780G, was the first chipset to allow you to play Blu-ray on the PC in hardware; it also had the best 3D performance at the time, while Intel IGPs were horrible.
http://techreport.com/articles.x/14261/8
http://www.techspot.com/review/233-intel-core-i5-661/page13.html

FordGT90Concept said:
AMD HD 3200 can't even run The Sims 3: Pets without crashing the game so, no. They work for the casual gamer but not the mainstream gamers.
Intel graphics won't run it, and many, many other games too, and AMD do update their drivers on a monthly cycle, duh.
#9
ensabrenoir
Crap Daddy said:
It's not the improved IGPs that are killing off discrete graphics cards. It's the game developers, publishers and the general trend of making games for mobile platforms.
In the final analysis, they're all businesses with no emotional/sentimental ties whose goal is to make a profit. They go where the money is, from PC to PS3 to smartphones, etc. Just as music went from records & tapes to CDs and then to digital, it's evolve or go extinct in the quest for the mighty dollar.
#10
LAN_deRf_HA
When IGPs hit 6670 levels, then they need to worry. Hell, even 4670: you can play Crysis reasonably well on one of those, and Unreal Engine games pretty effortlessly. This is a double-edged problem because of stagnating graphics. Every other game still uses the Unreal Engine, and Crysis is still about the toughest-to-run game you'll come across.
#11
ice_v
jmcslob said:
I think many of you are mistaking piss poor code optimizations for the need of a discrete GFX card.
You're being ripped off and most of you don't realize it.
qubit said:
Yes, that sounds very possible. I don't think anyone expects a modern game like BF3 to run properly on an IGP, regardless of code optimisation, but I believe your point is still valid.

I remember when I posted some benchmarks on here some time back showing 30-40% losses in framerates when going from XP to Vista or 7, and how I got flamed for it by all those in denial. I can never understand why ordinary people like you and me defend the big corps who are screwing us all, over the messenger who highlights these problems. :shadedshu
that's exactly how I think about it.

It all depends on the software (game engines in this case). And it has some logic to it as well:

the need for high-end graphics card(s) (more than one) lies in the need to render a certain videogame, so basically its engine, at its maximum visual potential... this greed for power is determined by the architecture of the engine itself...

...the turning point is: do you continue making more power-hungry engines, or do you make more efficient ones? IMHO efficiency is the winning card... if you could make a game engine that can match the visual impact of the high-end ones but with less computational demand, you wouldn't really need high-end graphics, now would you?

Sounds almost too good to be true... and how could you achieve that?

I would like to know as well, but... it seems to me that some people out there are already way down that path:

http://www.youtube.com/watch?v=00gAbgBu8R4

http://www.youtube.com/watch?v=1sfWYUgxGBE

...it's just a matter of time ;)


ensabrenoir said:
In the final analysis, they're all businesses with no emotional/sentimental ties whose goal is to make a profit.They go where the money is. From pc to ps3 to smartphones etc. etc.Just as music went from record & tape to cd and then to digital; its evolve or extinction in the quest for the mighty dollar.
True... but isn't that the overall tendency of today's industries (not just videogame companies)? The quality of a product is just one of the major consequences of COMPETITION.
#12
KainXS
If IGPs ever have a chance of replacing low-end desktop parts (they will NEVER replace mid or high end), then AMD has to do it; Intel's drivers are terrible, Intel has always been terrible at driver support for their IGPs, and I don't see that changing soon.

IGPs really threaten the market for OEMs who want to sell their parts to companies like Dell and HP for their budget PCs, though; there's really no point any more, since most IGPs now can handle HD video flawlessly. (I remember buying an extra card just for video not even 5 years ago, because that's how bad integrated was, but times have changed.)
#13
Delta6326
..... IGP? HA, that's not going to scare discrete graphics cards. They are needed in too many things: medical, entertainment, TV production.

#14
3volvedcombat
No:

Why?

Because any company dumping huge loads of money on discrete graphics cards is going to make sure they have a market and a product to fit that market reasonably.

Short example:
An MP3 player has up to 120GB of storage, etc.
A hard drive has up to 2-3 terabytes of storage.

The MP3 player is used for multiple productive tasks daily.
The hard drive has one use: store data, and store some more data.

An on-board GPU shares space and is designed into a multi-purpose product for the user.
A discrete GPU is designed for ONE (1) use: to process any graphical output that needs to be displayed.

When someone designs either of those products they will think about different usage scenarios, therefore resulting in an overall difference in performance in different scenarios, guaranteed BASED ON PRIORITIES.
#15
m4gicfour
qubit said:
But you've missed the crucial difference: it's not integrated into the CPU, but in the chipset. Putting it in the CPU changes the dynamics considerably, as you can see.
Irrelevant. The dynamics are different, of course, but you said integrated graphics, which is a term that until very recently meant motherboard graphics and now means CPU or mobo.

ATi or NVIDIA were the ones who started making integrated GPUs capable of playing low-end games for low-end to casual users. AMD's 780G was the first one able to play some of the higher-end games of the day, albeit at reduced resolution and detail settings, which casual gamers don't mind as much as us enthusiasts (casual gamers' preference, from personal experience with those people).

IIRC it was in AMD's roadmap to integrate graphics into the processor long before Intel announced any such thing. Intel just delivered forty dumptrucks full of gold bullion to their R&D department and beat AMD to the punch. Even if Intel was planning to do it before AMD released information about it, they never announced anything. So you could say that Intel delivered on AMD's roadmap before AMD could, with a weaker graphics core. (My google-fu is failing me on this, so I'm going by memory and hence willing to concede this point to anybody who can prove otherwise.)

Even ignoring motherboard graphics, it's only because Intel's integrated graphics started out so damn laughably weak (whether speaking of chipset- or processor-based) that the current upgrades look good; AMD's graphics have already been at the same level Intel is "upgrading to" since they came out. (Adjusting for release date and technology improvements, of course. I'm not implying that the 780G or AMD's low-end APUs are a match for current integrated graphics, just that they're in the same class.)

Oh, BTW, quoting marketing fluff like 4K x 4K resolution support is meaningless. Even today's single-chip enthusiast-class graphics cards would be reduced to a slideshow at that res for anything beyond 2D productivity apps. Intel can claim whatever resolution they want; until I see video recorded at that res playing back fluidly at a minimum 24 fps (with no drops below) on a 4K screen without discrete graphics, I call bullshit. It's like claiming a Chevy S-10 can reach 300mph*

*when dropped out of an airplane, with an aerodynamically designed outer shell surrounding it


(Wow, that whole thing came off a lot more harsh-sounding than I meant it to... Oh well, this note's here to point out that it wasn't meant to.)
#16
Mussels
Moderprator
Of course sales are down. If you released motherboards without an IGP in the past, then a video card was needed to make them work - namely, low-end discrete cards.

Nowadays, everything has onboard video - even mid and high-range products.

The IGP isn't killing off discrete cards; it's killing off ENTRY-level discrete cards.
#17
bika08
There is no problem in this: AMD eats into sales not only of their own but of the whole market, and I think low-end GPUs will be history in the near future.
As for high-end GPUs, there is no way that any GPU manufacturer in the world will make high-end GPUs embedded.
#18
naoan
bika08 said:
There is no problem in this: AMD eats into sales not only of their own but of the whole market, and I think low-end GPUs will be history in the near future.
As for high-end GPUs, there is no way that any GPU manufacturer in the world will make high-end GPUs embedded.
If they could price it at, say, 2/3 of the CPU + GPU price, why not? :)
#19
CyberDruid
PC enthusiasts are not the largest market share. IGP is going to meet the needs of the 99% that we are not part of.
#20
bika08
If they could price it at, say, 2/3 of the CPU + GPU price, why not?
And I think this time is coming soon; wait for AMD Trinity ;) Llano, the 1st gen, is awesome in its performance.
#21
Jetster
Today's integrated HD graphics are not your father's graphics :rockout:

I was skeptical until I tried it. My i3 does everything except DTS-HD passthrough. It will do 1080p and DTS or Dolby Digital without a hiccup. To beat it you have to go with a 5650 or 430 or above.
#22
micropage7
I don't think so. Even as integrated graphics get better, they won't kill the dedicated graphics card; the newer developments just lift integrated graphics into a better position.
A dedicated card is better if you want something more (like performance) over integrated graphics, which are made for all-around needs.
#23
jmcslob
Uhm, I don't think IGPs are likely to kill off low to mid-level GFX cards when those cards can easily be used to boost the IGP by 50% at the lowest cost... If anything, I think we've already seen the effects on high-level GFX cards... Anyone remember the price of an 8800 or a 2900 XT compared to the price of a 570 or a 6970?
http://www.techpowerup.com/reviews/ATI/HD_2900_XT/
http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/

I think with improving IGPs and better mobile devices, the enthusiast market is diminishing.
I think AMD and NVIDIA are both already heading away from the monster card market, especially when most games are heading for cross-market sales.


I look forward to the return of a PC without the need for an overpriced, overpowered GFX solution... I think the problem is many of you weren't around before you "needed" a discrete GFX solution.
#24
qubit
Overclocked quantum bit
jmcslob said:
Uhm, I don't think IGPs are likely to kill off low to mid-level GFX cards when those cards can easily be used to boost the IGP by 50% at the lowest cost... If anything, I think we've already seen the effects on high-level GFX cards... Anyone remember the price of an 8800 or a 2900 XT compared to the price of a 570 or a 6970?
http://www.techpowerup.com/reviews/ATI/HD_2900_XT/
http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/

I think with improving IGPs and better mobile devices, the enthusiast market is diminishing.
I think AMD and NVIDIA are both already heading away from the monster card market, especially when most games are heading for cross-market sales.



I look forward to the return of a PC without the need for an overpriced, overpowered GFX solution... I think the problem is many of you weren't around before you "needed" a discrete GFX solution.
You can't say that here! :eek: :cry:

Yes, what you said matches the fact that NVIDIA will be releasing low-end cards first for their next gen, which is the opposite of what used to happen. Sad, but true. :(
#25
jmcslob
qubit said:
You can't say that here! :eek: :cry:
LuLzzz...
Being an enthusiast forum, I'm sure we'll be (are already) one of the last holdouts for high-end GFX cards.