
AMD to Sample 32 nm Processors Within H1 2010

btarunr

Editor & Senior Moderator
AMD, in its presentation at the International Solid State Circuits Conference (ISSCC) 2010, detailed its plan to build its much talked about 'Fusion' processor platform, codenamed Llano, central to which is the Accelerated Processing Unit (APU). AMD's APU is expected to be the first design to embed a multi-core x86 CPU and a GPU onto a single die. This design goes a notch ahead of Intel's recently released 'Clarkdale' processor, where Intel strapped a 32 nm dual-core CPU die and a 45 nm northbridge die with integrated graphics onto an MCM (multi-chip module) package. Llano is also expected to feature four processing cores, along with other design innovations.

Among the most notable announcements in AMD's presentation is that the company will begin sampling the chip to its industry partners within the first half of 2010. The Llano die will be built on a 32 nm High-K Metal Gate process, on which each x86 core will be as small as 9.69 mm². Other important components on the Llano die are a DDR3 memory controller, an on-die northbridge, and a DirectX 11 compliant graphics core derived from the Evergreen family of GPUs. The x86 cores are expected to run at speeds of over 3 GHz, and each core has 1 MB of dedicated L2 cache, taking the total chip cache to 4 MB.
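For a rough sense of scale, here is a quick back-of-the-envelope tally of those figures (a minimal Python sketch; only the per-core area and cache sizes come from the presentation, and the GPU/uncore area is left out because it wasn't disclosed):

```python
# Rough tally of the Llano figures quoted above. Only the per-core area
# and L2 size come from the article; GPU/uncore area was not disclosed
# and is deliberately left out.
X86_CORES = 4
CORE_AREA_MM2 = 9.69       # per-core area on the 32 nm HKMG process
L2_PER_CORE_MB = 1         # dedicated L2 cache per core

total_core_area = X86_CORES * CORE_AREA_MM2    # ~38.76 mm² of x86 logic
total_l2 = X86_CORES * L2_PER_CORE_MB          # 4 MB total on-chip L2

print(f"x86 core area: {total_core_area:.2f} mm², total L2: {total_l2} MB")
```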



AMD has also embraced some new power management technologies, including power gating. Power gating allows the system to power down idle x86 cores, pushing their power draw to near zero. This reduces the chip's overall power draw, and the freed headroom can be used to power up the active cores, similar to Intel's Turbo Boost technology. A power-aware clock grid design further reduces power consumption by cutting down on clock distribution across the chip.
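A minimal sketch of the idea, assuming illustrative wattages that are not AMD specifications: gating idle cores to near-zero draw frees budget that the remaining active cores can use.

```python
# Toy model of power gating: gated cores draw ~0 W, so the core power
# budget is shared only among the cores that stay active, letting them
# clock higher (Turbo-Boost-like). All wattages are illustrative guesses,
# not AMD specifications.
def per_core_budget(active_cores: int,
                    chip_budget_w: float = 65.0,
                    uncore_w: float = 17.0) -> float:
    """Return the power budget each active core can use."""
    if active_cores == 0:
        return 0.0
    core_budget = chip_budget_w - uncore_w   # budget left for the x86 cores
    return core_budget / active_cores

for n in (4, 2, 1):
    print(f"{n} active core(s) -> {per_core_budget(n):.0f} W per core")
```

In this toy model, all four cores active gives each about 12 W, while a single active core can use the full 48 W core budget.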

With the APU, AMD thinks it has the right product for tomorrow's market: a chip that packs a large chunk of the motherboard's silicon while not compromising on features due to space constraints within the package. There's room for four processing cores and an Evergreen-derived DirectX 11 compliant GPU. If implemented well enough on the software side, AMD believes it can deliver one chip that handles both serial processing workloads (on the x86 cores) and highly parallel workloads (on the stream processors). Market availability of the chip isn't definitively known, but we expect it to be out early next year, or late this year.
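As a conceptual illustration of that serial-versus-parallel split (the run_on_* helpers below are hypothetical stand-ins, not a real AMD API), a dispatcher might route work like this:

```python
# Conceptual sketch of the serial-vs-parallel split described above:
# branchy, serial work stays on the x86 cores, while wide data-parallel
# work is handed to the on-die stream processors. The run_on_* helpers
# are hypothetical placeholders, not a real AMD API.
def run_on_x86_cores(task: dict) -> str:
    return f"x86 cores handle: {task['name']}"

def run_on_stream_processors(task: dict) -> str:
    return f"stream processors handle: {task['name']}"

def dispatch(task: dict) -> str:
    if task["data_parallel"] and task["elements"] > 10_000:
        return run_on_stream_processors(task)   # GPU half of the APU
    return run_on_x86_cores(task)               # CPU half of the APU

print(dispatch({"name": "parse config file", "data_parallel": False, "elements": 1}))
print(dispatch({"name": "video frame filter", "data_parallel": True, "elements": 2_000_000}))
```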

View at TechPowerUp Main Site
 
Looks promising for low-power PCs.
 
and an Evergreen-derived DirectX 11 compliant GPU.

I'm wondering if the GPU will be able to handle anything recent at an acceptable rez. We will prob see a 5400 variant (80 shaders) in there, although a 5570/50 (400 shaders w/ DDR3) would be a HUGE advancement. They may even go in between these two, shader count-wise; but this is just one man's speculation/hope!!
 
I think that hexa-core is going to be a quad-core processor and a two-core graphics chip... maybe!
 
A pretty quick counter to Intel's on-chip GPUs, and a powerful one! This will be good stuff for HTPCs and all-in-one starter PCs.
I'm tempted to follow the heat on this stuff and get some when it hits the market.
 
and come the 32nm

:}
 
A very exciting step in Computing Technology.
One of the big innovations to look forward to IMO, the other being OLED screens, as opposed to just ever-increasing raw processing power. :)
 
It better hurry up! It's a niche where Intel hasn't really taken over yet.
 
A very exciting step in Computing Technology.
One of the big innovations to look forward to IMO, the other being OLED screens, as opposed to just ever-increasing raw processing power. :)

OLEDs, all-in-one chips, and affordable SSDs imho ;)
 
Fantastic option for lower end builds.

Looks good.
 
I'm trying to be positive but I don't see the Athlon II architecture taking on Sandy Bridge.
 
Fantastic option for lower end builds.

Looks good.

Why lower? It doesn't mention whether or not it's based on the same design as the Phenom IIs, it is expected to clock fast, and it has this APU on board. Even if they use a similar core design and get no clock-for-clock increase in speed, they will be going faster, and you'll essentially have a coprocessor if you're using a discrete card and they handle it correctly in software....
 
Why lower? It doesn't mention whether or not it's based on the same design as the Phenom IIs, it is expected to clock fast, and it has this APU on board. Even if they use a similar core design and get no clock-for-clock increase in speed, they will be going faster, and you'll essentially have a coprocessor if you're using a discrete card and they handle it correctly in software....

As a heavy gamer, I'm not going to be running my games on an integrated graphics chip that has been bolted onto a Phenom II, or any other CPU for that matter. I would use this chip for less demanding tasks.

But yes, if they play their cards right, we may be seeing these kinds of chips smoke in gaming.
 
If they can squeeze a 5850 into the chip, that will spell the end of graphics cards as we know them. However, if they ship a chip with a weak 5xxx and we are able to disable/crossfire it with an external graphics card, it will put the Clarkdales into the junk bin.
 
Considering the language of the article, it seems it won't be ultra-low end graphics, but I can't see them packing a whole lot of power into that 32nm package. At 22nm things may get interesting, but anyway, we can only speculate, as AlienIsGOD said.
 
The more AMD does this, the better competition and prices will be.
 
I would really love to see this as a GIANT step for AMD, but 4 cores @ 3 GHz, an on-die GPU, a DDR3 memory controller, plus the rest of the northbridge handling! Even at 32 nm that's going to be one HOT chip.
I hope they work it out and crush big blue in the notebook/netbook department :nutkick:
 
I'm trying to be positive but I don't see the Athlon II architecture taking on Sandy Bridge.

Maybe not CPU-wise, but AMD/ATi is superior when it comes to IGPs.
 
I'm trying to be positive but I don't see the Athlon II architecture taking on Sandy Bridge.

Sandy Bridge is going to flatten Athlon II performance wise, but we have seen AMD selling their goods at cutthroat prices, so I wouldn't be too bothered.
 
I seriously doubt AMD is trying to do away with dedicated graphics cards, however if the APU kills intel at the encoding game, I think AMD has won.
 
Sandy Bridge is going to flatten Athlon II performance wise, but we have seen AMD selling their goods at cutthroat prices, so I wouldn't be too bothered.
On the other hand, Intel gets flattened performance-wise in graphics. :p
 
On the other hand, Intel gets flattened performance-wise in graphics. :p

+1 on that, and I think we've already reached the point where even a mid-range processor is a lot more powerful than what the average Joe needs, so I think AMD's extra GPU power is going to make for a better product.
 
If AMD really wants a huge takeoff with the new chip, they need to work with devs very closely, i.e. sending out platforms and providing good support on the software/hardware side.

If that's not the case, AMD will be in the same spot as Intel is with its integrated GPU: a graphics core just for graphics, but not for apps.
 
Now it's time for AMD to copy from Intel... will that DX11 core inside support ATi Stream?

Yes, boys and girls, the future is now; not for true gamers, but for everything else.
 