
AMD A-Series APUs Tested Against Sandy Bridge CPUs at Gaming on IGP

Fourstaff

Moderator
Staff member
Joined
Nov 29, 2009
Messages
10,024 (1.91/day)
Location
Home
System Name Orange! // ItchyHands
Processor 3570K // 10400F
Motherboard ASRock z77 Extreme4 // TUF Gaming B460M-Plus
Cooling Stock // Stock
Memory 2x4Gb 1600Mhz CL9 Corsair XMS3 // 2x8Gb 3200 Mhz XPG D41
Video Card(s) Sapphire Nitro+ RX 570 // Asus TUF RTX 2070
Storage Samsung 840 250Gb // SX8200 480GB
Display(s) LG 22EA53VQ // Philips 275M QHD
Case NZXT Phantom 410 Black/Orange // Tecware Forge M
Power Supply Corsair CXM500w // CM MWE 600w
Yes because Plants vs Zombies requires a high-end DX11 GPU! No offense AMD, but unless these CPUs are able to compete with Sandy Bridge in terms of performance, no-one is going to give a hoot about how good the integrated graphics are.

Actually, I think they might have a winner here. A cheap integrated chip in an mITX package plus a weak graphics card in CrossFire could make the consoles sweat. Also, if they manage to use the stream processors properly, this processor might become a beast at media encoding and other massively multithreaded tasks.
 
Joined
Jan 2, 2008
Messages
3,296 (0.55/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
Love that yellow blast graphic with the red "Playable Frame Rates" written on it :D. So '60s commercial.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
[H] posted a review of 6990 + 6970 (essentially 6970 tri-fire) a while back and they found it scaled excellently, beating out GTX 580 SLI in a number of scenarios and at a much lower price point.


I think you misunderstood what I meant by NOW. I meant: upon release, will it work with 2x VGAs plus the "IGP"? Hence my comment about it only being a single discrete add-in. ;)
 
Joined
Aug 7, 2007
Messages
2,723 (0.45/day)
Processor i5-7600k
Motherboard ASRock Z170 Pro4
Cooling CM Hyper 212 EVO w/ AC MX-4
Memory 2x8GB DDR4 2400 Corsair LPX Vengeance 15-15-15-36
Video Card(s) MSI Twin Frozr 1070ti
Storage 240GB Corsair Force GT
Display(s) 23' Dell AW2310
Case Corsair 550D
Power Supply Seasonic SS-760XP2 Platinum
Software Windows 10 Pro 64-bit
I somehow doubt this. A lot.

Why is that? The graphs might very well be fake, but we all know how Intel's integrated graphics perform, especially how their earlier CPUs with IGPs have done. Sandy Bridge has improved quite a bit on that front, but from what I have always noticed, AMD and NVIDIA motherboard IGPs have almost ALWAYS outperformed Intel's counterparts.
 
Joined
Oct 30, 2008
Messages
1,758 (0.31/day)
System Name Lailalo
Processor Ryzen 9 5900X Boosts to 4.95Ghz
Motherboard Asus TUF Gaming X570-Plus (WIFI
Cooling Noctua
Memory 32GB DDR4 3200 Corsair Vengeance
Video Card(s) XFX 7900XT 20GB
Storage Samsung 970 Pro Plus 1TB, Crucial 1TB MX500 SSD, Segate 3TB
Display(s) LG Ultrawide 29in @ 2560x1080
Case Coolermaster Storm Sniper
Power Supply XPG 1000W
Mouse G602
Keyboard G510s
Software Windows 10 Pro / Windows 10 Home
Depends what you use your machine for. Personally, I'd rather have better graphics than a better CPU if I had to choose between the two. Generally, the games I have the most problems with are the GPU-bound ones; CPU-intensive games don't give me much trouble because they tend not to grow more demanding over time.

Hydrophobia: Prophecy is a good example right now. It is a GPU-eating monster. I have to turn my 460's fan up to 100% just to play it and keep temps out of the 100 °C range. GPU usage is also almost always above 90%.

Not that I'd be looking to buy a budget-level laptop right now, but if I did, Fusion would be very attractive. I actually wish Apple would switch to these in their low-end MacBooks, because my old iBook is in need of an update. I only use the thing for writing and web browsing, and haven't wanted to spend the money to replace it until I see some interesting features.
 
Joined
Mar 24, 2011
Messages
2,356 (0.49/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
I am only interested in these for HTPCs/laptops. Even if this slide is legit, you have to account for "AMD inflation", which means the Llano APUs, which prioritized equal performance from CPU + IGP, are barely capable of outperforming SB, which kind of just tacked the IGP on there. That stupid clip-art starburst makes me think this is all fake, though...
 
Joined
Oct 19, 2008
Messages
3,162 (0.56/day)
System Name White Theme
Processor Intel 12700K CPU
Motherboard ASUS STRIX Z690-A D4
Cooling Lian Li Galahad Uni w/ AL120 Fans
Memory 32GB DDR4-3200 Corsair Vengeance
Video Card(s) Gigabyte Aero 4080 Super 16GB
Storage 2TB Samsung 980 Pro PCIE 4.0
Display(s) Alienware 38" 3840x1600 (165Hz)
Case Lian Li O11 Dynamic EVO White
Audio Device(s) 2i2 Scarlett Solo + Schiit Magni 3 AMP
Power Supply Corsair HX 1000 Platinum
If these results are the same as the final product, then we're going to be seeing a lot more PC gamers in the future :toast:
 

sethk

New Member
Joined
Apr 14, 2007
Messages
63 (0.01/day)
Do these APUs or their corresponding chipsets (NB) have a dedicated Video Transcode / encode / decode block, like the Z68's "QuickSync" feature?
 

donanimhaber.com

New Member
Joined
May 12, 2011
Messages
2 (0.00/day)
APU: the definition

An APU integrates a CPU and a GPU on the same die thus improving data transfer rates between these components while reducing power consumption. APUs can also include video processing and other application-specific accelerators.

So, AMD is (or will be) simply but effectively targeting Intel's heart: "the onboard video".
 

wahdangun

Guest
An APU integrates a CPU and a GPU on the same die thus improving data transfer rates between these components while reducing power consumption. APUs can also include video processing and other application-specific accelerators.

So, AMD is (or will be) simply but effectively targeting Intel's heart: "the onboard video".

they're already stabbing onboard video, letting it bleed while mutilating it for breakfast
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
this is gonna be epic in notebooks and premium netbooks. Who wants an Atom now? Or some cheap Pentium with on-chip crapstics?

same goes for HTPCs: who wouldn't want a Radeon HD 5570-class GPU and a Phenom II-class CPU in the same chip in their little HTPC in the living room, instead of Intel with its sucky built-in crapstics?
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
finally, office PCs with good integrated graphics
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,383 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Joined
Feb 19, 2009
Messages
1,151 (0.21/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) M32Q,AOC 27" 144hz something.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
Benchmark Scores 33000FS, 16300 TS. Lappy, 7000 TS.
...to play Battlefield when the boss isn't looking.

My work PC has an i7 2.13 GHz quad with HT, 16 GB RAM and a Mobility HD 5870 (HD 5770-based).
I can play games all nicely :)
 

p3ngwin1

New Member
Joined
May 12, 2011
Messages
24 (0.01/day)
Yes because Plants vs Zombies requires a high-end DX11 GPU! No offense AMD, but unless these CPUs are able to compete with Sandy Bridge in terms of performance, no-one is going to give a hoot about how good the integrated graphics are.



[H] posted a review of 6990 + 6970 (essentially 6970 tri-fire) a while back and they found it scaled excellently, beating out GTX 580 SLI in a number of scenarios and at a much lower price point.

you do know that GPGPU processing is used for more than gaming, right?

Apple uses OpenCL in the OS, and Windows will use it in Win8; Office 2010 has GPU acceleration, as do all of today's Internet browsers (covering practically everything on and off screen), media players, etc.

it's not only needed for games; it's usable for almost anything you do with your OS, and its use is increasing every year. The customer may not care about graphics processors; they just need price/performance information.

to say a GPGPU is not needed for "normal" applications and casual computer usage is to show ignorance of today's applications and their abilities.
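To make the GPGPU point concrete, here is a rough pure-Python mock-up of the programming model that APIs like OpenCL expose: one small kernel applied independently to every element, which is what lets browsers and media players hand whole image-sized arrays to parallel hardware. (The kernel, data, and function names below are invented for illustration; real OpenCL host code is written in C against the Khronos API.)

```python
# Pure-Python mock-up (illustrative only) of the data-parallel "kernel"
# model used by GPGPU APIs such as OpenCL: the same small function runs
# once per element, with no ordering dependency between elements, so a
# GPU is free to run thousands of them at once.

def brighten_kernel(pixel, gain=1.2):
    """Per-pixel kernel: scale one 8-bit value and clamp to 255."""
    return min(255, int(pixel * gain))

def launch(kernel, data):
    """Stand-in for an OpenCL enqueue: apply the kernel to every element.
    On real hardware each call would map to one parallel work-item."""
    return [kernel(v) for v in data]

frame = [0, 100, 200, 250]             # a toy "image"
print(launch(brighten_kernel, frame))  # -> [0, 120, 240, 255]
```

Because no element depends on any other, the runtime is free to execute the kernel calls in any order, or all at once.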
 

Fourstaff

Moderator
Staff member
to say a GPGPU is not needed for "normal" applications and casual computer usage is to show ignorance of today's applications and their abilities.

On the other hand, to say GPGPU is needed for "normal" applications and casual computer usage is hopelessly optimistic: there are very few, if any, "normal" applications which benefit from the boost provided by GPGPU. Specific applications like Photoshop are another story, though.
 

p3ngwin1

New Member
On the other hand, to say GPGPU is needed for "normal" applications and casual computer usage is hopelessly optimistic: there are very few, if any, "normal" applications which benefit from the boost provided by GPGPU. Specific applications like Photoshop are another story, though.

read my post again, the other part you didn't quote:

"Apple use OpenCL in the OS and Windows will use it in Win8, Office 2010 has GPU acceleration, as do all today's Internet browsers (getting towards everything on and off screen practically), media players, etc. "

Apple's "Core Image", for example, uses the GPU for acceleration, as do paid programs like "Pixelmator".

GPUs are ALREADY in use for "normal" operations.

On March 3, 2011, the Khronos Group announced the formation of the WebCL working group to explore defining a JavaScript binding to OpenCL. This creates the potential to harness GPU and multi-core CPU parallel processing from a Web browser.

On May 4, 2011, Nokia Research released an open-source WebCL extension for the Firefox web browser, providing a JavaScript binding to OpenCL.


current versions of Internet browsers such as Opera, Firefox and IE ALL have GPU acceleration for many aspects of browsing: off-screen web-page composition, CSS, JPEG decoding, font rendering, window scaling, scrolling, Adobe Flash video acceleration, and an increasing share of the pipeline that makes up the web-surfing experience.

everything in the browser is evolving to take advantage of the CPU and GPU as efficiently as possible, using APIs such as DirectX, OpenCL, etc.

Office apps, internet browsers, video players.....

between office apps and browsers alone, that's most of the planet's "normal" usage.

what definition of "normal" are you thinking of that refutes that GPUs are already used in today's everyday usage scenarios?
 

p3ngwin1

New Member
With browsers and stuff becoming 3D accelerated, I do think this has a purpose in the office other than gaming.

that's not even considering the power efficiency of having the processor perform its tasks better, saving power.

imagine all of the current companies and corporations, even datacentres and clusters, that would benefit from better performance, better efficiency, or their desired balance of the two.

this is not just for casual customers on the street; these chips are going to be in everything, and the more efficient they are, the better for us in many ways that few people here seem to appreciate.

some people may ask what the point is of having the browser, or Office 2010, accelerated. Maybe they won't notice the performance, but they are not the whole equation and they shouldn't be treated as such.

even if they don't perceive the performance, application developers now have more processing potential with which to make their apps better. If that doesn't happen, then the processing becomes more efficient and prices drop, saving money. Even if the money saved is not a consideration for these office people, their managers and bosses WILL consider it.

it's a WIN for us any way we look at it.

more efficient processors benefit us one way or another, and to think "I don't need a more efficient processor" demonstrates ignorance of this technology and the applications you use.
 

Fourstaff

Moderator
Staff member
what definition of "normal" are you thinking of that refutes that GPUs are already used in today's everyday usage scenarios?

I should have phrased it better: the benefit of having a powerful GPU diminishes extremely rapidly as GPU performance increases for normal applications. In other words, the onboard HD 4200 and Sandy Bridge's IGP are going to be enough, and Bulldozer's 400 SPs will not provide a noticeable boost (noticeable as in you can see and feel that it's faster, rather than faster only in benchmarks).

As a side note, I don't feel any difference between my Mobility 4570 and my sister's Mobility 5650 in web-page performance.

Edit: currently, the main usage of processing power revolves around the x86 architecture, so as long as we are primarily on x86, I cannot see how we will migrate to GPGPU.
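The diminishing-returns point above can be put in rough numbers (the figures here are illustrative assumptions, not measurements): once a GPU renders faster than the display refreshes, extra speed shows up only in benchmarks.

```python
# Illustrative only: perceived smoothness is capped by the display's
# refresh rate, so GPU speed beyond "fast enough" is invisible in use.

REFRESH_HZ = 60  # assumed typical desktop panel of the era

def perceived_fps(rendered_fps, refresh=REFRESH_HZ):
    """What the user can actually see on screen."""
    return min(rendered_fps, refresh)

# Doubling a slow GPU is very noticeable...
print(perceived_fps(25), perceived_fps(50))    # -> 25 50
# ...doubling an already-fast one changes nothing visible:
print(perceived_fps(120), perceived_fps(240))  # -> 60 60
```

This is why a faster IGP helps games that run below 60 fps but does nothing perceptible for web pages that already render instantly.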
 
Last edited:

wahdangun

Guest
I should have phrased it better: the benefit of having a powerful GPU diminishes extremely rapidly as GPU performance increases for normal applications. In other words, the onboard HD 4200 and Sandy Bridge's IGP are going to be enough, and Bulldozer's 400 SPs will not provide a noticeable boost (noticeable as in you can see and feel that it's faster, rather than faster only in benchmarks).

As a side note, I don't feel any difference between my Mobility 4570 and my sister's Mobility 5650 in web-page performance.

Edit: currently, the main usage of processing power revolves around the x86 architecture, so as long as we are primarily on x86, I cannot see how we will migrate to GPGPU.

you know the ironic part is that SB doesn't support OpenCL yet, despite supporting DX10.1.
 

Fourstaff

Moderator
Staff member
you know the ironic part is that SB doesn't support OpenCL yet, despite supporting DX10.1.

Ah :eek: I thought that after the Larrabee nonsense they would have come up with a GPU supporting OpenCL already. Guess not :ohwell:
 

p3ngwin1

New Member
the next evolutionary step in homogeneous processing...

I should have phrased it better: the benefit of having a powerful GPU diminishes extremely rapidly as GPU performance increases for normal applications. In other words, the onboard HD 4200 and Sandy Bridge's IGP are going to be enough, and Bulldozer's 400 SPs will not provide a noticeable boost (noticeable as in you can see and feel that it's faster, rather than faster only in benchmarks).

As a side note, I don't feel any difference between my Mobility 4570 and my sister's Mobility 5650 in web-page performance.

Edit: currently, the main usage of processing power revolves around the x86 architecture, so as long as we are primarily on x86, I cannot see how we will migrate to GPGPU.

for your Mobility reference: it's good that you have hardware that satisfies you for the present and foreseeable future. No need to upgrade unless you perhaps need better battery life from more efficient processors.

regarding x86, that may change with ARM and Microsoft shaking things up with Google's and NVIDIA's help (not to mention continuing investments from TI, Qualcomm, Marvell, Samsung, etc.), and it will certainly change with languages like OpenCL that enable homogeneous acceleration on any hardware.

currently you need to program for either the GPU or the CPU; languages like OpenCL are our first baby steps towards code that automatically takes advantage of any OpenCL-compatible hardware. With OpenCL the goal is to write code without thinking about the hardware. We're just starting, and we already have traction and practical benefits today. This will only get better with time.
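The "write code without thinking about the hardware" goal can be sketched as a device registry: enumerate whatever compute devices exist, pick one, and run the same kernel regardless. (A real OpenCL runtime does this through calls like clGetPlatformIDs/clGetDeviceIDs; the device names, availability flags, and helper functions below are invented for illustration.)

```python
# Sketch of device-agnostic dispatch, the idea behind OpenCL's platform
# model. Device names and availability flags are invented for illustration.

def run_on_cpu(data):
    return [x * x for x in data]        # serial fallback

def run_on_gpu(data):
    return [x * x for x in data]        # same result; would run in parallel

DEVICES = [
    ("gpu", False, run_on_gpu),         # pretend no GPU is present
    ("cpu", True,  run_on_cpu),         # the CPU is always there
]

def dispatch(data):
    """Run the kernel on the first available device."""
    for name, available, impl in DEVICES:
        if available:
            return name, impl(data)
    raise RuntimeError("no compute device found")

print(dispatch([1, 2, 3]))  # -> ('cpu', [1, 4, 9])
```

The application code (`dispatch` and below) never changes when a GPU appears; only the registry entry flips to available.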

as for the hardware, the traditional CPU and GPU have been on a collision course for decades, sharing more and more features, such that it's getting harder to tell them apart by the day.

think of the GPU becoming programmable with DirectX's programmable shaders, then unified shaders, then DirectCompute, and OpenGL's similar parallel efforts.

we now have GPUs like NVIDIA's Fermi architecture with L1 and L2 caches, ECC and a bunch of other HPC and supercomputing features that rock the previously CPU-only supercomputing world. Even Intel's Sandy Bridge CPUs have evolved from the crossbar memory bus to what GPUs have used for years: a ring bus.

the lines separating CPUs and GPUs are blurring, and very soon there will be a single processor that computes everything. There will be no more GPU; only an evolved new generation of processors that may still be called CPUs, but they will no longer be "general". They will be agnostic: neither specialized, like the GPU used to be, nor general, like the CPU used to be.

that only leaves the question of which architecture these chips will follow: x86, ARM, or some other flavour.

these new-generation processors will no longer be "jack of all trades, master of none"; they will evolve into "master of all".

signs of such evolutionary processor design have been seen already.

Sony evolved their thinking along these lines when designing processors with Toshiba and IBM for their PlayStation consoles. The PlayStation 2 had a MIPS core with custom VU0 and VU1 units: similar to DSPs, they could blaze through vector maths and were more programmable than traditional DSPs.
game developers used them in all sorts of ways, from improving graphics functions to supplement the GPU's hard-wired feature set, to EA building an accelerated software stack for Dolby 5.1 that ran solely on the VU0 unit, freeing up the rest of the CPU.

with the PS3, Sony took the idea forward again, generalizing the VU0/VU1 concept even further from its DSP heritage, and came up with the "synergistic processing unit" (SPU), sometimes confusingly called SPEs (synergistic processing elements).

these SPEs would not only be more powerful than the previous VU0 and VU1 units, they would also be more numerous: whereas in the PS2 they were a minority of the main CPU's processing power, in the PS3's Cell BE they would be the majority of its processing potential. They amounted to 8 units attached to a familiar PPC core.

Sony prototyped the idea of not having a separate CPU and GPU for the PS3, toying with two identical Cell BE chips to be used by developers as they wished. The freedom was there, but the development tools to take advantage of massively parallel processors weren't, as seen with Sega's twin Hitachi SH-2 processors generations ago.

we have the parallel hardware; we simply need to advance programming languages to take advantage of it. This is the idea behind OpenCL.

to find a better balance, Sony approached NVIDIA late in the PS3's development and finally settled on the 7800 GT, with its fixed-function vertex and pixel shaders, to go up against ATI's more modern unified-shader architecture in the Xbox 360.

it will be interesting to see Sony's plans for the PS4's architecture, if they continue their commitment to massively parallel processing.

meanwhile, PC projects like Intel's "Larrabee" and AMD's "Fusion" show that processing is heading towards homogeneous computing, with no specialty chips and no processing potential idling in unused custom function blocks.

AMD bought ATI, and their Fusion project will eventually merge the GPU's SIMD units with the CPU's traditional floating-point units, beginning the fusion of CPU and GPU into what will eventually be a homogeneous processor.

just as smart folks like PC game-engine designers and Sony have been predicting since 2006.
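The GPU-SIMD-meets-CPU-FPU merge described above can be seen in miniature with a pure-Python analogy (no real vector intrinsics involved; the lane width is an assumption): a scalar FPU handles one float per instruction, while a SIMD unit applies one instruction to a whole lane group.

```python
# Pure-Python analogy of scalar-FPU vs SIMD execution (illustrative only).

def scalar_scale(xs, k):
    """Scalar-FPU style: one multiply per 'instruction'."""
    out = []
    for x in xs:                          # one element per step
        out.append(x * k)
    return out

LANES = 4  # assumed vector width, e.g. 4 x float32 in a 128-bit register

def simd_scale(xs, k):
    """SIMD style: each 'instruction' covers a whole lane group."""
    out = []
    for i in range(0, len(xs), LANES):
        lanes = xs[i:i + LANES]           # load a vector register
        out.extend(v * k for v in lanes)  # one vector multiply
    return out

data = [1.0, 2.0, 3.0, 4.0, 5.0]
# Same answer either way; the SIMD loop just takes fewer "instructions".
print(simd_scale(data, 2.0))  # -> [2.0, 4.0, 6.0, 8.0, 10.0]
```

Fusion's promise is that the wide-lane path stops being a separate chip and becomes just another execution unit next to the scalar one.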
 

p3ngwin1

New Member
Ah :eek: I thought that after the Larrabee nonsense they would have come up with a GPU supporting OpenCL already. Guess not :ohwell:

it's really odd; you would think Intel, with 50x more resources than AMD, would have at least passable GPU tech in their latest CPUs. The CPUs are great, but damn, how long does it take Intel to catch up in graphics?

AMD, small as they are in comparison, made a massive bet buying ATI years ago, and while it may have been premature by a year or so, and nearly broke the company... it's already paying MASSIVE rewards.

Intel is at least 2 years from catching up to what AMD is selling THIS YEAR with its Fusion technology. Bulldozer tech + GPU processors have great potential, and these current Fusions are based on old AMD CPU designs, so there's even more potential ahead.

Intel seems to be fumbling around with GPU technology as if they don't understand it, like it's exotic or alien. Why can't they make a simple and decent GPU to start? What's with the ridiculous, sub-standard, under-performing netbook-class GPUs?
 

lashton

New Member
Joined
Sep 24, 2010
Messages
63 (0.01/day)
hmmmm

Well that's strange, because it seems Llano is physically smaller than Sandy Bridge!!
 