Wednesday, June 2nd 2010

AMD Demonstrates Graphics Processing Power of Llano Fusion APUs

AMD demonstrated its first Fusion APU (accelerated processing unit), a "fusion" of a CPU and a GPU on a single die. The first such processor in the works, codenamed Llano, is built on a 32 nm silicon fabrication process and combines a quad-core CPU with a DirectX 11 compliant GPU. AMD's Rick Bergman showed off a wafer of Llano APUs, but he didn't stop there: he surprised the press by demonstrating the APU running Aliens vs. Predator in DirectX 11 mode at a reasonable level of detail. Find a video of the same at the source.

Source: TweakTown

77 Comments on AMD Demonstrates Graphics Processing Power of Llano Fusion APUs

#1
EastCoasthandle
I wouldn't say it's a lot for an APU solution. How much power does it take to power an equivalent CPU and GPU solution?
#2
theubersmurf
by: Imsochobo
the problem was you stuck the CPU in the socket.
You boot up the PC.
Maybe it discovers it as an eng sample.
Mobo webpage.
DL BIOS.
Flash.

Solved.
It's the same if there's a new chip for an LGA1156 platform; doesn't matter, you still need the same update.
So that's not a valid statement in my opinion. I stuff AM3 CPUs in AM2 boards and they work, yes AM2, with AGP and nForce 3 (good old ASRock), and it worked as easily as any other CPU upgrade.
I didn't actually have problems with it myself; it's the number of others who couldn't seem to figure out how to get an AM2 or AM2+ CPU to work in an AM3 mobo (since they already had a chip), or didn't bother to flash, or flashed badly, or whatever.
#3
theubersmurf
by: EastCoasthandle
I wouldn't say it's a lot for an APU solution. How much power does it take to power an equivalent CPU and GPU solution?
I seem to recall my Athlon X2 5000+ being about 89 watts or so? Maybe it was 125, but I haven't used it in a while. I think 25 watts is easily something a lithium-ion laptop battery could sustain, though I think that figure was targeted at netbooks, not so much laptops.

They said they wanted four-core x86 processing, but didn't say how many SPs are on the die, so I'm not sure.
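As a rough sanity check on that 25-watt claim, the battery math can be sketched like this; the pack capacity here is an assumption (a typical 2010-era 6-cell), not anything AMD stated:

```python
# Back-of-the-envelope battery-life check for a ~25 W platform power draw.
battery_wh = 56.0      # hypothetical 6-cell Li-ion pack (an assumption, not an AMD figure)
platform_watts = 25.0  # the APU power figure being discussed in the thread

hours = battery_wh / platform_watts
print(f"{hours:.1f} hours on battery")  # ~2.2 hours
```

So a 25 W platform would indeed be workable for a laptop, if tight for a netbook-class battery.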
#4
Imsochobo
by: a_ump
25 watts is quite a bit for a CPU, but if it can pull off the performance of, say, a Mobility HD 4670 or 5770, then that's damn good. The detail in that screenshot looks good, but I can't tell how smooth it was running.
Let me remind you:
Quad-core: 15 W
GPU: 5 W
Chipset: 5 W

Well, then we're down to an 8-lane PCI-E chipset from AMD,
a single core at 1.6 GHz,
and no GPU has that TDP.

It's basically impossible with a dedicated GPU; the 5770 is too high, the 4670 is more the right direction, maybe slightly less due to memory performance.
Rather excited. Even though I'm not a huge laptop fan, I would like to see better performance on laptops. Intel has really improved, but AMD is going to bring it to another level with GPU performance.
OpenCL may really kick off if Intel manages to bring out "acceptable" performance, at least in terms of GPGPU; 100 gigaflops would be a huge improvement for an app when the part is there anyway for video when required.
The only loser is NVIDIA; GF9400-class parts and such may get really obsolete! (huge profits there!)

And for high-end PCs the GPU part could be used for OpenCL, so no matter what it's useful. PhysX on the GPU-on-the-CPU in high-end PCs :P although NVIDIA wouldn't let us do that anyway :P
#5
WarEagleAU
Bird of Prey
AMD, that is just sweet news: DX11 on a chip, especially if it can run the game like that at decent frame rates. I definitely would want a desktop version (HTPC) and a notebook to boot.
#6
manchesterutd81
I wish I knew whether this will support DDR3 or DDR5.

I'm ready for DDR5... and I hope AMD will jump to it for the next gen of motherboards and processors...
#7
F1reFly
Considering video cards get upgraded far more often and more easily, I doubt AMD will ever want to completely kill off the discrete video card business. I suspect the graphical power of these will remain in the lower end for many years.
#8
kaneda
by: TVman
so it's like the Cell CPU?
If you mean the Cell Broadband Engine, no, not in the slightest.

It should be thrown in with the high-performance computing components, though.


maybe~
#9
pr0n Inspector
Am I getting this right: this only needs to connect to RAM, vRAM, and the southbridge?
#10
FordGT90Concept
"I go fast!1!11!1!"
by: _JP_
Ok, I'm confused... a "fusion" between a GPU and a CPU... results in an APU... and what exactly is that?
Auxiliary Power Unit. ;)

They need a better acronym like CPUWG (Central Processing Unit With Graphics).


by: manchesterutd81
I wish I knew whether this will support DDR3 or DDR5.

I'm ready for DDR5... and I hope AMD will jump to it for the next gen of motherboards and processors...
I imagine it uses the main system RAM just like every other IGP.
#11
CounterZeus
This is very interesting; even cheap laptops could get decent performance. My mate's nv7000M just sucks... he can barely run Warcraft 3 with a mobile Athlon X2 and that crap IGP.
If Intel and NVIDIA would team up, we'd have a serious CPU/GPU war on our hands :D (hopefully with better prices for us customers :p)
#12
Frick
Fishfaced Nincompoop
by: CounterZeus
This is very interesting; even cheap laptops could get decent performance. My mate's nv7000M just sucks... he can barely run Warcraft 3 with a mobile Athlon X2 and that crap IGP.
If Intel and NVIDIA would team up, we'd have a serious CPU/GPU war on our hands :D (hopefully with better prices for us customers :p)
I assume it's an Intel IGP. They do suck.
#13
HalfAHertz
I'm really interested in the memory configuration of this. Will it have two memory controllers, one for the CPU part and one for the GPU part? Will the GPU have on-board memory like AMD's current IGPs? So many questions come to mind...
#14
a_ump
It'd be sweet/innovative if they shared the memory controller. That would definitely show real integration right there.
#15
HalfAHertz
It would be great if we could see a 256-bit bus split 50/50 between the two, and as someone else said earlier maybe DDR5, but I highly doubt that. The easiest thing to do is have dedicated GDDR5 sideport memory on the motherboard and then dual-channel DDR3 for the CPU.
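For perspective, the peak-bandwidth arithmetic for the two configurations mentioned looks like this; the bus widths and transfer rates are illustrative assumptions, not confirmed Llano specs:

```python
# Rough peak-bandwidth arithmetic for the memory setups being discussed.
def bandwidth_gbs(bus_width_bits, mega_transfers_per_sec):
    """Peak theoretical bandwidth in GB/s: (bits / 8) bytes per transfer."""
    return bus_width_bits / 8 * mega_transfers_per_sec / 1000

# Dual-channel DDR3-1333: two 64-bit channels at 1333 MT/s (assumed)
ddr3 = bandwidth_gbs(128, 1333)

# Hypothetical 64-bit GDDR5 sideport at 4 GT/s effective
sideport = bandwidth_gbs(64, 4000)

print(f"dual-channel DDR3-1333: {ddr3:.1f} GB/s")  # ~21.3 GB/s
print(f"64-bit GDDR5 sideport:  {sideport:.1f} GB/s")  # 32.0 GB/s
```

Even a narrow GDDR5 sideport would out-run shared dual-channel DDR3, which is why the sideport idea keeps coming up.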
#16
Trigger911
by: Frick
I assume it's an Intel IGP. They do suck.
I think he said it was an AMD system, so he's probably got an Xpress 200 series IGP.
#17
a_ump
I doubt it; probably more along the lines of an actual discrete card's GPU, like an HD 5550 or something along those lines. Just a guess.
#19
Techtu
by: PCpraiser100
I wonder what OCing would be like???
That's actually something no one questioned until now... and now you've really got me thinking. Would overclocking even be something we'd be able to do on these chips? Lots and lots of questions, and only time will tell :banghead:
#20
theubersmurf
by: Techtu
That's actually something no one questioned until now... and now you've really got me thinking. Would overclocking even be something we'd be able to do on these chips? Lots and lots of questions, and only time will tell :banghead:
In the presentation, they talk about "Black" editions that would be unlocked for enthusiasts. But I wonder about it myself. Two different clock-speed sections on one die? How would that work?
#21
mastrdrver
Talking about how many shaders this thing has, rumors are pretty consistent that the max will be around 400. Basically, a 5570 on a CPU die (since it has the slower DDR3 instead of GDDR5 like the 5670). Combine that with up to a quad-core Phenom minus the L3 and you have Llano, or what was just demoed.

AM3+ (aka AM3r2) is probably coming when this thing debuts, or with Bulldozer. There is a slide that has been posted on several sites that shows it. This is probably what the 8-series chipsets have been designed for.

Also, anyone remember the talk about quad-channel AMD CPUs? If they start pushing up the size of the GPU on the die with the CPU, you're going to need a wider bus or else you will end up with a bandwidth-limited APU. Although the GPU is supposed to become more integrated, like how the integer processing unit was 10+ years ago.
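The 400-shader rumor translates to raw throughput roughly like this; the 650 MHz clock is an assumption borrowed from the desktop HD 5570, not a confirmed Llano spec:

```python
# Theoretical single-precision throughput for an AMD VLIW GPU.
def gflops(shaders, clock_mhz, flops_per_clock=2):
    # Each stream processor can issue a multiply-add = 2 FLOPs per clock.
    return shaders * clock_mhz * flops_per_clock / 1000.0

# 400 SPs at a hypothetical 650 MHz (HD 5570-like clock)
print(gflops(400, 650))  # 520.0 GFLOPS
```

Even with conservative mobile clocks, that comfortably clears the 100-gigaflop figure floated earlier in the thread.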
#22
erocker
by: theubersmurf
Two different clock speed sections on a die? How would that work?
It'll work. They basically do it now with the CPU and the memory controller. :)
#23
AsRock
TPU addict
by: Frick
I assume it's an Intel IGP. They do suck.
Suck? That depends on how you look at it. Laptops should be kept for non-gaming, and when they make them for gaming you have to expect them to blow up due to the crappy cooling a laptop can have. It's just a way for them to make money off people who say they can cope with the heat they produce.

Now if you had it in a PC, that's a different matter.
#24
HalfAHertz
I guess you can always set the frequency of the GPU as a multiple of the HT speed?
#25
theubersmurf
by: erocker
It'll work. They basically do it now with the CPU and the memory controller. :)
I guess that's true, hadn't even thought about that.