Tuesday, September 26th 2006
“Merged” CPU-GPU in 2008, Says AMD Chief Technologist
A CPU with an integrated graphics core should become reality in 2008, according to AMD. The CPU will be manufactured using 45nm technology.
"Integration of the CPU and the GPU. Assuming the transaction closes on time, we would target a merged design in the 45nm time frame," said Phil Hester of AMD. Mr. Hester explained that trends in graphics processor development have increased programmability: "We've crossed the point where the GPU can do real programs of a significant size," he said. He then reiterated the point made earlier by another AMD representative in an interview and said that the first merged CPU-GPU will arrive around 2008.
"It may seem like 2008 is a long way away, but that's actually a major design cycle," Mr. Hester said.
Source:
X-bit labs
28 Comments on “Merged” CPU-GPU in 2008, Says AMD Chief Technologist
I'm seeing why AMD bought ATI for this.. I think my next rig in 4 years will be AMD if Intel doesn't pull a cat from their behind or something by then..
EDIT: oh and Intel has let a monster out of the bag:
news.com.com/2100-1006_3-6119618.html?part=rss&tag=6119618&subj=news
p.s. sorry for using this again alecstaar, it seems to back up Intel well on this post. (btw I'm neutral between Intel and AMD)
The guys at bit-tech mentioned this in their article.
Maybe there was a reason why we did away with proprietary computer solutions back in the '90s. Oh wait, someone thinks they can do it better :shadedshu. If one thing goes wrong, the entire MB will need to be replaced, costing more money.
Wow, what's next, integrated RAM? If you are going backwards you might as well go all the way, IMO. :D
The trouble is that GPUs are much more expensive than the Cell, and the Cell is capable of around 256 GB/s of bandwidth. I think this is AMD's attempt to merge the two technologies in order to compete with the Cell more than with Intel's chips.
This will, however, make a big wave in the IGP market :) A super-cheap IGP on the CPU, with AMD's integrated memory controller, putting even more of that memory bandwidth to use..
They will no longer have to build the IGP into the chipset, making the value line that much cheaper to make; they can just attach a GPU onto a CPU package and blammo! Easy IGP CPUs, no matter what mobo..
I for one would love it.. For people that want features like an HDD controller, etc., I could get a decent mobo and a decent CPU with video that's cheap..
or to help early adopters or transitions.
Remember how much of a pain it was to convert from AGP to PCIe? ;) You could now get a high-end mobo with an IGP CPU and be set :)
Not to mention how cheap it would be to make a GPU as simple as most IGPs at a CPU's transistor size. The die-size impact of a GMA950 would be almost negligible!
An IGP would draw 1-2 watts, even less with a die shrink and CPU fab tweaks. It would be negligible, using what would otherwise be dead surface area.
Ever overclock a Celeron or Sempron?
Exactly.
(IGPs right now are FAR from a full GPU; they lack much of what a GPU has: a memory controller and many shaders.)
To all you apocalyptites: the world/gaming is not coming to an end because they are making computers easier to own for those who don't already.
And yes, it is a good move. If we look at the money to be made in systems, it is all in the business sector: integrated graphics, mid-power workstations, low profile, low heat. That is what provides the $$$ to research high-end ring-bus memory controllers, stacked GPUs, and other goodies for the rest of us.
Then you can factor in the fact that new GPUs are as big as what, a quarter? That would bring CPU dies down to ~15-20 per wafer instead of 60+.
Yeah, that's going to be a $2200+ CPU. Really worth it?
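The dies-per-wafer worry a couple of comments up can be sanity-checked with a common back-of-the-envelope formula. This is a minimal sketch: the wafer size and die areas below are hypothetical illustrations, and real counts also depend on scribe lines, edge exclusion, and defect yield.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Rough gross dies-per-wafer estimate: usable wafer area divided
    by die area, minus a standard edge-loss correction term."""
    die_side = math.sqrt(die_area_mm2)  # assume a square die
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / (math.sqrt(2) * die_side)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Hypothetical comparison on a 300 mm wafer: a ~140 mm^2 CPU-only die
# versus a roughly doubled ~300 mm^2 merged CPU+GPU die.
print(dies_per_wafer(300, 140))  # 448
print(dies_per_wafer(300, 300))  # 197
```

Even with made-up numbers, the point holds directionally: doubling die area more than halves gross dies per wafer, because larger dies also waste more of the wafer edge.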