Tuesday, September 26th 2006

“Merged” CPU-GPU in 2008, Says AMD Chief Technologist

A CPU with an integrated graphics core should become reality in 2008, according to AMD. The CPU will be manufactured using 45nm technology.

"Integration of the CPU and the GPU. Assuming the transaction closes on time, we would target a merged design in the 45nm time frame," said Phil Hester of AMD. Mr. Hester explained that the trends in the development of graphics processors increased programmability: "We've crossed the point where the GPU can do real programs of a significant size," he said. He then re-iterated the point made earlier by another AMD representative in an interview with AMD and said that the first CPU-GPU will come around 2008.

"It may seem like 2008 is a long way away, but that's actually a major design cycle," Mr. Hester said.
Source: X-bit labs

28 Comments on “Merged” CPU-GPU in 2008, Says AMD Chief Technologist

#1
Jimmy 2004
NO! Surely this would mean that to upgrade one component you HAVE to upgrade the other? I think they're better separate myself. Plus, it will be bad for us overclockers... if you fry one of them you need to buy a whole new G/CPU.
Posted on Reply
#2
Seany1212
yea but it could also mean that whatever your processor runs at the gpu runs at too, but im not sure how memory would be handled, heck im just putting my theories forward :D, i personally prefer them separate also.
Posted on Reply
#3
pt
not a suicide-bomber
Seany1212yea but it could also mean that whatever your processor runs at the gpu runs at too, but im not sure how memory would be handled, heck im just putting my theories forward :D, i personally prefer them separate also.
separate cards will still exist, but this is a good idea for low-budget PCs :)
Posted on Reply
#4
W2hCYK
I think system memory would turn into vid memory, because DDR3 should be the perfect speed for a video card.. we may also start seeing DDR4 RAM by then, and DDR5? on separate gfx cards.. :-)

I'm seeing why AMD bought ATI for this.. I think my next rig in 4 years will be AMD if Intel doesn't pull a cat from their behind or something by then..
Posted on Reply
#5
magibeg
Well Intel is supposed to come out with a whole new design every 2 years now or something, so Intel definitely will pull a cat from out of their behind. The question would be if it was a lion or a kitten. Won't having the CPU and GPU merge mean more heat in a tiny little space? I think it's a big gamble, but I think chances are it will be more towards the low end to replace integrated graphics on the mobo :P
Posted on Reply
#6
Seany1212
actually, using ATI technology I think it will be the opposite of the low-end stuff, I mean, VIA and Intel make low-end onboard graphics but I reckon ATI can go one step ahead.

EDIT: oh and Intel has let a monster out of the bag:

news.com.com/2100-1006_3-6119618.html?part=rss&tag=6119618&subj=news

p.s. sorry for using this again alecstaar, it seems to back up Intel well on this post. (btw I'm neutral between Intel and AMD)
Posted on Reply
#7
Shift
This is pretty old :/

The guys at bit-tech mentioned this in their article.
Posted on Reply
#8
EastCoasthandle
proprietary solutions :wtf:. Say goodbye to consumer choice. Told you this was coming :rolleyes: ..Consumers, expect to be :nutkick:
Maybe there was a reason why we did away with proprietary computer solutions back in the 90's. Oh wait, someone thinks they can do it better :shadedshu. If one thing goes wrong, the entire MB will need to be replaced, costing more money.


Wow, what's next, integrated RAM? If you are going backwards you might as well go all the way, IMO. :D
Posted on Reply
#9
jocksteeluk
I'll be quite optimistic about this, it should help reduce the size of a standard PC.
Posted on Reply
#10
t_ski
Former Staff
I've been doing some research on the Cell processor for a report for school. They talk about how the GPUs of today are similar to the Cell. Today's GPUs have lots of vertex and shader units, similar to how the Cell has its SPEs. Today's GPUs are capable of doing all kinds of calculations, and provide more bandwidth than current CPUs. An NV 6800 is capable of about 40 GB/s when dealing with fragmented textures, where IIRC the Intel P4 can only do around 6 GB/s.

The trouble is that GPUs are much more expensive than the Cell, and the Cell is capable of around 256 GB/s worth of bandwidth. I think this is all AMD's attempt to try to merge the two technologies in order to compete with the Cell, more than with the Intel chips.
Posted on Reply
#11
POGE
School?! Aren't you like 40? :p
Posted on Reply
#12
Dippyskoodlez
EastCoasthandleproprietary solutions :wtf:. Say goodbye to consumer choice. Told you this was coming :rolleyes: ..Consumers, expect to be :nutkick:
Maybe there was a reason why we did away with proprietary computer solutions back in the 90's. Oh wait, someone thinks they can do it better :shadedshu. If one thing goes wrong, the entire MB will need to be replaced, costing more money.


Wow, what's next, integrated RAM? If you are going backwards you might as well go all the way, IMO. :D
This is far from proprietary.. they will never kill the PCI-E GPU via this method. It's just not feasible in a gazillion ways- from a pin count stance, performance, heat output, and transistor count...

This will, however, make a big wave in the IGP market :) super cheap IGP on the CPU- with AMD's memory controller, put even more of that memory bandwidth to use..
Posted on Reply
#13
Dippyskoodlez
t_skiThe trouble is that GPUs are much more expensive than the Cell, and the Cell is capable of around 256 GB/s worth of bandwidth. I think this is all AMD's attempt to try to merge the two technologies in order to compete with the Cell, more than with the Intel chips.
Yeah, bandwidth is nice, too bad the Cell has no processing power :banghead:
Posted on Reply
#14
EastCoasthandle
DippyskoodlezThis is far from proprietary.. they will never kill the PCI-E GPU via this method. It's just not feasible in a gazillion ways- from a pin count stance, performance, heat output, and transistor count...

This will, however, make a big wave in the IGP market :) super cheap IGP on the CPU- with AMD's memory controller, put even more of that memory bandwidth to use..
Yeah, but this appears to be their solution for laptops and cheap desktops. This was done years ago on the desktop (though not in the same way). It didn't work then and it won't work now.
Posted on Reply
#15
Dippyskoodlez
EastCoasthandleYeah, but this appears to be their solution for laptops and cheap desktops. This was done years ago on the desktop (though not in the same way). It didn't work then and it won't work now.
Intel has shown off these GPU+CPUs already- it's going to happen. Trust me. ;)

They will no longer have to add the IGP into the chipset.. making the value line overall that much cheaper to make. They can just attach a GPU onto a CPU package, and blammo! Easy IGP CPUs, no matter what mobo..

I for one would love it.. I could get a decent mobo for people that want features like an HDD controller, etc., and be able to get a decent CPU with video that's cheap..

Or to help early adopters or transitions.

Remember how much of a pain it was to convert from AGP to PCIe? ;) You could now get a high-end mobo with an IGP CPU and be set :)

Not to mention how cheap it would be to make a GPU as simple as most IGPs when it comes to a CPU's level of transistor size. The die size impact for a GMA950 would be almost negligible!
Posted on Reply
#16
POGE
This will mean lots of heat. You guys aren't remembering that. Lots of heat means lower clocks.
Posted on Reply
#17
Dippyskoodlez
POGEThis will mean lots of heat. You guys aren't remembering that. Lots of heat means lower clocks.
lool!

An IGP will make 1-2 watts, even less with a die shrink and CPU fab tweaks. It would be negligible and provide dead surface area.

Ever overclock a Celeron or Sempron?

Exactly.

(IGPs right now are FAR from a full GPU. They lack much of what a GPU has- a memory controller, and many shaders.)
Posted on Reply
#18
-Thrilla-
Great news for non-gamers and overclockers: small desktops will get even smaller.
Posted on Reply
#19
POGE
Dippyskoodlezlool!

An IGP will make 1-2 watts, even less with a die shrink and CPU fab tweaks. It would be negligible and provide dead surface area.

Ever overclock a Celeron or Sempron?

Exactly.

(IGPs right now are FAR from a full GPU. They lack much of what a GPU has- a memory controller, and many shaders.)
You assume way too much. What if they start integrating high-end parts? You know what I meant too.
Posted on Reply
#20
i_am_mustang_man
This will help reduce the size and power requirements of GPUs, which will carry over into the separate CPU/GPU market. This is good for EVERYONE. Tech progress is good in the long run.

To all you apocalypse types: the world/gaming is not coming to an end just because they are making computers easier to own for those who don't already.
Posted on Reply
#21
Steevo
More n00bs to pwn.



And yes, it is a good move. If we look at the money to be made in systems, it is all in the business sector: integrated graphics, mid-power workstations, low profile, low heat. That is what provides the $$$ to fund research into high-end ring-bus memory controllers, stacked GPUs, and other goodies for the rest of us.
Posted on Reply
#22
EastCoasthandle
DippyskoodlezIntel has shown off these GPU+CPUs already- it's going to happen. Trust me. ;)

They will no longer have to add the IGP into the chipset.. making the value line overall that much cheaper to make. They can just attach a GPU onto a CPU package, and blammo! Easy IGP CPUs, no matter what mobo..

I for one would love it.. I could get a decent mobo for people that want features like an HDD controller, etc., and be able to get a decent CPU with video that's cheap..

Or to help early adopters or transitions.

Remember how much of a pain it was to convert from AGP to PCIe? ;) You could now get a high-end mobo with an IGP CPU and be set :)

Not to mention how cheap it would be to make a GPU as simple as most IGPs when it comes to a CPU's level of transistor size. The die size impact for a GMA950 would be almost negligible!
Until they try to mainstream the idea for solutions beyond the "non-gamers". You know this is only a start, not an end to how far they will take this idea.
POGEYou assume way too much. What if they start integrating high-end parts? You know what I meant too.
Exactly what I am saying...
Posted on Reply
#23
Dippyskoodlez
POGEYou assume way too much. What if they start integrating high-end parts? You know what I meant too.
Then we'll be needing phase-change cooling on any and every CPU. Simply unmanageable due to heat..


Then you can factor in the fact that the new GPUs are as big as, what, a quarter? That would cut the CPU dies down to ~15-20 per wafer instead of 60+.

Yeah, that's going to be a $2200+ CPU. Really worth it?
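
For anyone curious, here's a rough sketch of that dies-per-wafer math, using the standard gross-die estimate. The figures (a 300 mm wafer, a ~140 mm² CPU die, a ~280 mm² high-end GPU grafted onto it) are made-up assumptions just to show how quickly the die count falls, not numbers from the post:

```python
import math

# Assumed figures (illustrative only): 300 mm wafer, ~140 mm^2 CPU die,
# ~280 mm^2 of extra high-end GPU logic added to the same die.
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float, diameter_mm: float = WAFER_DIAMETER_MM) -> int:
    """Gross dies per wafer: wafer area / die area, minus a term for dies lost at the edge."""
    wafer_area = math.pi * (diameter_mm / 2) ** 2
    edge_loss = math.pi * diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

cpu_only = dies_per_wafer(140)            # plain CPU die
cpu_plus_gpu = dies_per_wafer(140 + 280)  # CPU with a big GPU bolted on

print(f"CPU only:      ~{cpu_only} dies per wafer")      # ~448 with these assumptions
print(f"CPU + big GPU: ~{cpu_plus_gpu} dies per wafer")  # ~135, roughly a 3x drop
```

The absolute counts depend entirely on the assumed die sizes, but the ratio is the point: triple the die area and the per-wafer yield (and therefore the price) moves by roughly the same factor in the wrong direction.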
Posted on Reply
#25
pt
not a suicide-bomber
DippyskoodlezThen we'll be needing phase-change cooling on any and every CPU. Simply unmanageable due to heat..


Then you can factor in the fact that the new GPUs are as big as, what, a quarter? That would cut the CPU dies down to ~15-20 per wafer instead of 60+.

Yeah, that's going to be a $2200+ CPU. Really worth it?
it doesn't have to be a top gaming part, an Athlon 3000+ and an X1600 won't get too hot :)
Posted on Reply