
AMD Charts Path for Future of its GPU Architecture

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#51
The market is headed toward console games that are DirectX (Xbox 360) and that get recompiled with a few clicks for PC to maximize developer $$.
If you say so. I think you're off base here, and the Microsoft design will cause huge problems downstream. The company ready for tomorrow will be the winner tomorrow.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
16,546 (3.88/day)
Likes
10,908
Location
Parkland County, Alberta
System Name Gamer
Processor Intel i7-6700K (ES)
Motherboard MSI Aegis TI
Cooling Custom Dragon Cooler
Memory 16 GB Kingston HyperX 2133 MHz C13
Video Card(s) 2x MSI GAMING GTX 980
Storage 2x Intel 600P
Display(s) Dell 3008WFP
Case MSI Aegis Ti
Mouse MSI Interceptor DS B1
Keyboard MSI DS4200 GAMING Keyboard
Software Windows 10 Home
#52
If you say so. I think you're off base here, and the Microsoft design will cause huge problems downstream. The company ready for tomorrow will be the winner tomorrow.
You MUST keep in mind that all of this is business, and as such, the future of technology is heavily influenced by the businesses behind it. The least amount of work that brings in the most dollars is what WILL happen, without a doubt, as that is the nature of business.


What needs to be done is for someone to effectively show why other options make more sense, not from a technical standpoint, but from a business standpoint.

And as mentioned, none of the technologies AMD/ATI has introduced over the years really seems to make much business sense, and as such, they fail hard.


AMD's board now seems to realize this... Dirk was dumped and Bulldozer "delayed" simply because that made the MOST business sense... they met market demand, and rightly so, as demand for the existing products is so high that they had no choice but to delay the launch of Bulldozer.

Delaying a new product, because an existing one is in high demand, makes good business sense.
 
Joined
May 31, 2005
Messages
252 (0.06/day)
Likes
21
#53
What I see is AMD selling all of their consumer CPUs under $200, even their 6 core chips. They need new CPU tech that they can get some better margins on. Intel charges 3-4x more for their 6 core chips because they have clear performance dominance.

Buying ATI was a good move because both AMD and NV are now obviously trying to bypass Intel's dominance by creating a new GPU compute sector. I'm not sure if that will ever benefit the common user though because of the limited types of computing that work well with GPUs.

Also, Llano and Brazos are redefining the low end in a way that Intel didn't bother to, so that's interesting too.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#54
Because soon enough the hardware will do the work anyway. It's not always about software. As for Nvidia, they painted themselves into a corner years ago.
The hardware needs software to operate. This comment doesn't even make any sense.
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#55
The hardware needs software to operate. This comment doesn't even make any sense.
Sure it does. What if the CPU scheduler and decoder know how to break workloads across INT, FPU, VLIW, etc.? If it gets smart enough, and there's no reason it can't, then the OS just sees it as plain x86, but the underlying microarchitecture handles a lot of the heavy lifting. If you don't see the genius behind Bulldozer, you're looking in the wrong places. How hard would it be for AMD to introduce VLIW-like elements into that modular core design? Not terrifically hard, and you'd better believe this is the way forward. Traditional x86 is dead.
 
Joined
Apr 21, 2010
Messages
142 (0.05/day)
Likes
19
Location
Perth, Australia
Processor Phenom II X6 1055 @3290/3760Mhz 1.375 HT@2115 NB@2350
Motherboard GA-880GM-usb3 (seems to dislike bclk of over 235)
Cooling Scythe Big Shiruken + TR 1300rpm 140mm
Memory G.Skill Ripjaw 4GB 8-9-8-24-32 1T @1566mhz
Video Card(s) Gigabyte 6850 @875mhz gpu 1150mhz memory
Storage G.Skill Evo 115GB, Seagate 7200.11 1TB, WD Green 5400 1TB
Display(s) LG Flatron E2240V
Case Lian Li V352
Audio Device(s) Realtek ALC 892
Power Supply Seasonic X-650
Software Win 7 Ult 64bit
#56
And even that hyped video encoding is mostly done on the CPU, which makes it utterly useless since it's not much faster than pure CPU anyway. They were bragging about physics as well, but they never delivered. Not with Havok, not with Bullet, not with anything. I mean, if you want to make something, endorse it, embrace it, give developers a reason to develop and build on that feature.
Instead they announce it, brag about it, and then we can all forget about it because it'll never happen.
They should invest those resources into more productive things instead of wasting them on such useless stuff.

The only thing they pulled off properly is MLAA, which uses shaders to post-process the screen and anti-alias it. It works great in pretty much 99.9% of games, is what they promised, and I hope they won't remove it like they did with most of their features (Temporal AA, TruForm, SmartShaders, etc.). Sure, some technologies became redundant, like TruForm, but others just died because AMD didn't bother to support them. SmartShaders were a good example. HDRish was awesome, giving old games a fake HDR effect that looked pretty good, but it worked only in OpenGL, and someone else had to make it. AMD never added anything useful for D3D, which is what most games use. So what's the point?! They should really get their act together, stop wasting time and resources on useless stuff, and start making cool features that can last. Like, again, MLAA.
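For a rough idea of what a shader-style post-process AA pass like MLAA actually does, here's a toy sketch in Python/numpy: flag strong luminance edges, then blend the flagged pixels with their neighbours. Real MLAA runs on the GPU's shaders, classifies edge shapes, and computes proper coverage weights; this is only the general idea, not AMD's implementation.

import numpy as np

def edge_blend_aa(img, threshold=0.1):
    # img is an HxWx3 float array in [0, 1].
    # 1) Per-pixel luminance.
    luma = img @ np.array([0.299, 0.587, 0.114])

    # 2) Flag pixels whose right or bottom neighbour differs sharply in luminance.
    edge = np.zeros(luma.shape, dtype=bool)
    edge[:, :-1] |= np.abs(luma[:, 1:] - luma[:, :-1]) > threshold
    edge[:-1, :] |= np.abs(luma[1:, :] - luma[:-1, :]) > threshold

    # 3) Blend flagged pixels with their 4-neighbourhood (real MLAA uses
    #    pattern-based coverage weights instead of a naive average).
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                           img[1:-1, :-2] + img[1:-1, 2:] +
                           img[1:-1, 1:-1]) / 5.0
    out = img.copy()
    out[edge] = blurred[edge]
    return out

Since a pass like this only reads the finished frame, it doesn't care which API or engine produced it, which is why a shader-based filter can be forced on from the driver for nearly any game.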
Most games these days use at least parts of the Havok, Bullet, or similar libraries. Resident Evil 5 and Company of Heroes are two that mention the use of Havok on the box. Bad Company 2 used parts of Havok or Bullet? Most physics come from these libraries; it's a lot easier for devs than writing their own.
(The below is in reply to someone above; I'm not sure how relevant it is, but it's true nonetheless.)
The whole "do what makes the most money now and deal with the consequences later" ideology is why the American economy is in the state it is. Companies are like children: they want the candy, and lots of it, now, but then they make themselves sick because they had too much. A responsible parent regulates them, no matter how big a tantrum they throw, because they know that cleaning up the mess that results from letting them do as they please is much worse. Just saying that companies will do whatever makes the biggest short-term gains regardless of the long-term consequences doesn't help you or me see better games.
 
Joined
May 16, 2011
Messages
1,430 (0.59/day)
Likes
460
Location
A frozen turdberg.
System Name Runs Smooth
Processor FX 8350
Motherboard Crosshair V Formula Z
Cooling Corsair H110 with AeroCool Shark 140mm fans
Memory 16GB G-skill Trident X 1866 Cl. 8
Video Card(s) HIS 7970 IceQ X² GHZ Edition
Storage OCZ Vector 256GB SSD & 1Tb piece of crap
Display(s) acer H243H
Case NZXT Phantom 820 matte black
Audio Device(s) Nada
Power Supply NZXT Hale90 V2 850 watt
Software Windows 7 Pro
Benchmark Scores Lesbians are hot!!!
#57
Speaking of AMD's graphics future, this is a long but interesting read.

http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute

Graphics Core Next (GCN) is the architectural basis for AMD’s future GPUs, both for discrete products and for GPUs integrated with CPUs as part of AMD’s APU products. AMD will be instituting a major overhaul of its traditional GPU architecture for future generation products in order to meet the direction of the market and where they want to go with their GPUs in the future.
 
Joined
Apr 1, 2008
Messages
2,817 (0.79/day)
Likes
604
System Name HTC's System
Processor Ryzen 5 1600
Motherboard Asrock Taichi
Cooling NH-C14
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 2 * Samsung 1 TB HD103UJ
Display(s) LG 27UD58
Case Corsair Obsidian 650D
Audio Device(s) Onboard
Power Supply Corsair TX750
Mouse Logitech Performance MX
Software Ubuntu 16.04 LTS
#58

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#59
Sure it does. What if the CPU scheduler and decoder know how to break workloads across INT, FPU, VLIW, etc.? If it gets smart enough, and there's no reason it can't, then the OS just sees it as plain x86, but the underlying microarchitecture handles a lot of the heavy lifting. If you don't see the genius behind Bulldozer, you're looking in the wrong places. How hard would it be for AMD to introduce VLIW-like elements into that modular core design? Not terrifically hard, and you'd better believe this is the way forward. Traditional x86 is dead.
There is no way to do it transparently to the OS. You still need software to tell the scheduler what type of info is coming down the pipeline. It will require a driver at minimum.
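To put it in perspective: GPU compute today already needs exactly that kind of explicit plumbing. A minimal sketch with PyOpenCL (assuming PyOpenCL and a working OpenCL driver are installed; the kernel and names are purely illustrative), and notice how none of it is transparent to the OS:

import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()      # picks a device exposed by the vendor driver
queue = cl.CommandQueue(ctx)

a = np.arange(1024, dtype=np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The "parallel" part has to be written as a separate kernel and compiled
# through the driver at runtime.
prg = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * a[i];
}
""").build()

prg.square(queue, a.shape, None, a_buf, out_buf)   # explicit launch
result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)            # explicit copy back

Every one of those steps is the application telling the runtime what kind of work is coming; the OS and the CPU's own scheduler never see it as ordinary x86.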
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#60
There is no way to do it transparently to the OS. You still need software to tell the scheduler what type of info is coming down the pipeline. It will require a driver at minimum.
Why? The driver makes up for the lack of logic on the chip.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#61
If they were capable of giving a chip that kind of logic at this point, we would have things like multi-GPU gfx cards that show up to the OS as a single gpu.

We aren't anywhere near the chips being able to independently determine data type and scheduling like that.
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#62
If they were capable of giving a chip that kind of logic at this point, we would have things like multi-GPU gfx cards that show up to the OS as a single gpu.

We aren't anywhere near the chips being able to independently determine data type and scheduling like that.
What do you think all this APU nonsense is about? Popcorn on Tuesdays?
 
Joined
Jul 10, 2009
Messages
467 (0.15/day)
Likes
89
Location
TR
Processor C2duo e6750@ 2800mhz
Motherboard GA-P43T-ES3G
Cooling Xigmatek S1283
Memory 2x2Gb Kingstone HyperX DDR3 1600 (KHX1600C9D3K2/4GX )
Video Card(s) HIS HD 6870
Storage samsung HD103SJ
Case Xigmatek Utgard
Audio Device(s) X-FI Titanium Pci-e
Power Supply Xigmatek NRP-PC702 700W
Software win7 ultimate -64bit
#63
Havok is different from the others. Intel bought Havok and decided to use it as a software physics API to advertise Intel CPUs; I don't think anyone would have done anything different from what AMD did about this. No one would use a software API when you could do it in hardware.

You know, AMD has always had these great GPU hardware features (the ability to crunch numbers, as in physics), then promised us great software to run on them (GPU-accelerated Havok, anyone?), but the software never materializes.

I'll get excited about this when it is actually being implemented by devs in products I can use.

Yup... I don't think AMD has successfully brought any software feature to market. Maybe x86-64, if you count Intel adopting it.


I see cloud computing as a renamed version of the old terminal / thin-client / server concept. With online gaming, the problem is the connection more than the hardware; you'll need a rock-stable connection, which is always hard to find. http://en.wikipedia.org/wiki/Thin_client
The future is fusion, remember? CPU and GPU becoming one. It's going to happen, I believe it, but are we really gonna "own" it?

These days cloud computing is starting to make some noise, and it makes sense; average guys/gals are not interested in FLOPS performance, they just want to listen to their music, check Facebook, and play some fun games. What I'm saying is that in the future we'll only need a big touch screen with a mediocre ARM processor to play Crysis V. The processing (you know, GPGPU), the heat, and all that will be somewhere in China; we'll be given just what we need, the final product, over a huge broadband pipe. If you've seen the movie WALL-E, think about living on that spaceship, the Axiom; it'd be something like that... creepy, eh?
 
Joined
Nov 21, 2007
Messages
3,684 (1.00/day)
Likes
402
Location
Smithfield, WV
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
#65
What do you think all this APU nonsense is about? Popcorn on Tuesdays?
It's about getting a CPU and GPU into one package, one die, eventually one chip, which will be way more cost-effective than two separate chips. Oh, and taking over the entry/low end of the market from Intel.

That's what that APU common sense is about :p


Very nice find, sir. I want to read it all, but I might have to bookmark it. :toast:
 
Joined
Jan 2, 2009
Messages
9,767 (2.99/day)
Likes
1,779
Location
Suffolk/Essex, England
System Name Joseph's Laptop Clevo P771ZM
Processor 4970k @4/4.4ghz
Motherboard *shrugs*
Cooling About 2 kilos of copper fins and pipes.
Memory 2x 8gb
Video Card(s) GTX 970m 6gb
Storage 500gb Msata SSD 2x 2TB storage drives
Display(s) Built in
Power Supply 300w power brick
Mouse Steam controller
Software Windows ten

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#67
It's about getting a CPU and GPU into one package, one die, eventually one chip, which will be way more cost-effective than two separate chips. Oh, and taking over the entry/low end of the market from Intel.

That's what that APU common sense is about :p

Long range, it's about coming to grips with serial processing and the lack of compute power you get from it.
 
Joined
Nov 21, 2007
Messages
3,684 (1.00/day)
Likes
402
Location
Smithfield, WV
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
#68
Long range, it's about coming to grips with serial processing and the lack of compute power you get from it.
You're talking about GCN then. I was talking short range :p.

Honestly, I definitely think AMD is going to take a leap in innovation over Nvidia these next 5 years or so. I really do think AMD's experience with CPUs is going to pay off when it comes to integrating compute performance into their GPU... well, APU. Nvidia has the lead right now, but I can see AMD loosening that grip.
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#69
You're talking about GCN then. I was talking short range :p.

Honestly, I definitely think AMD is going to take a leap in innovation over Nvidia these next 5 years or so. I really do think AMD's experience with CPUs is going to pay off when it comes to integrating compute performance into their GPU... well, APU. Nvidia has the lead right now, but I can see AMD loosening that grip.
I don't think Nvidia is going to get much further than they have thus far. AMD is set to put a whoopin' on Intel and Nvidia in that area. Given the limits of IPC and clock speed, it's the only way to get where they need to go in the first place.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#70
What do you think all this APU nonsense is about? Popcorn on Tuesdays?
A fancy name for using GPU shaders to accelerate programs. AKA the same shit we already have for gfx cards in the form of CUDA/whatever Stream was renamed to.

I would bet money this is not hardware based at all, and requires special software/drivers to work properly.
 
Joined
Jan 2, 2009
Messages
9,767 (2.99/day)
Likes
1,779
Location
Suffolk/Essex, England
System Name Joseph's Laptop Clevo P771ZM
Processor 4970k @4/4.4ghz
Motherboard *shrugs*
Cooling About 2 kilos of copper fins and pipes.
Memory 2x 8gb
Video Card(s) GTX 970m 6gb
Storage 500gb Msata SSD 2x 2TB storage drives
Display(s) Built in
Power Supply 300w power brick
Mouse Steam controller
Software Windows ten
#71
A fancy name for using GPU shaders to accelerate programs. AKA the same shit we already have for gfx cards in the form of CUDA/whatever Stream was renamed to.

I would bet money this is not hardware based at all, and requires special software/drivers to work properly.
I bet it is hardware-based. It's not just a fancy name, though; it's GPU shaders in the CPU (or next to it, in this case), meaning your CPU/GPU (APU) can handle all the physics and your GPU can focus on being a graphics card.

Or, if all of AMD's CPUs go this way, it means people don't have to buy a GPU straight away, which is also nice.
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#72
A fancy name for using GPU shaders to accelerate programs. AKA the same shit we already have for gfx cards in the form of CUDA/whatever Stream was renamed to.

I would bet money this is not hardware based at all, and requires special software/drivers to work properly.
No, it's about compute power. These first-generation parts are about figuring out the basics: how to get the transistors for both onto the same piece of silicon. The next step will be more transistors on both sides, CPU and GPU, and the step beyond that will be an integration of x86 CPU logic and GPU parallelism. That will give AMD a massive advantage over Nvidia and Intel in compute power and heavy workloads.

AMD got it right six years ago when they started down this road; that's why Bulldozer is modular.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#73
I bet it is hardware-based. It's not just a fancy name, though; it's GPU shaders in the CPU (or next to it, in this case), meaning your CPU/GPU (APU) can handle all the physics and your GPU can focus on being a graphics card.

Or, if all of AMD's CPUs go this way, it means people don't have to buy a GPU straight away, which is also nice.
No, it isn't. It's basically a GPU put on the same PCB as the CPU. The concept is exactly the same as current GPU-accelerated programs. The only difference is the location of the GPU.
No, it's about compute power. These first-generation parts are about figuring out the basics: how to get the transistors for both onto the same piece of silicon. The next step will be more transistors on both sides, CPU and GPU, and the step beyond that will be an integration of x86 CPU logic and GPU parallelism. That will give AMD a massive advantage over Nvidia and Intel in compute power and heavy workloads.

AMD got it right six years ago when they started down this road; that's why Bulldozer is modular.
Will give an advantage =/= currently having an advantage.

Again, this is just GPGPU, the same thing we've had for ages. It is not transparent to the OS and must specifically be coded for. Said coding is always where AMD ends up dropping the ball on this crap. I will not be excited until I see this actually being used extensively in the wild.
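Case in point: whether the shaders sit on a card or next to the CPU cores, software still has to go find them and target them explicitly. A quick PyOpenCL sketch (assuming it's installed; purely illustrative):

import pyopencl as cl

# An APU's integrated GPU and a discrete card both show up here as ordinary
# OpenCL devices; the programming model is the same, only the silicon moved.
for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.ALL):
        print(platform.name, "->", dev.name)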
 
Joined
Jan 2, 2009
Messages
9,767 (2.99/day)
Likes
1,779
Location
Suffolk/Essex, England
System Name Joseph's Laptop Clevo P771ZM
Processor 4970k @4/4.4ghz
Motherboard *shrugs*
Cooling About 2 kilos of copper fins and pipes.
Memory 2x 8gb
Video Card(s) GTX 970m 6gb
Storage 500gb Msata SSD 2x 2TB storage drives
Display(s) Built in
Power Supply 300w power brick
Mouse Steam controller
Software Windows ten
#74
No, it's on the same silicon, man; there's no latency in the CPU-GPU communication (or very little).

It does have benefits.
 

Thatguy

New Member
Joined
Nov 24, 2010
Messages
666 (0.26/day)
Likes
68
#75
No, it isn't. It's basically a GPU put on the same PCB as the CPU. The concept is exactly the same as current GPU-accelerated programs. The only difference is the location of the GPU. Will give an advantage =/= currently having an advantage.

Again, this is just GPGPU, the same thing we've had for ages. It is not transparent to the OS and must specifically be coded for. Said coding is always where AMD ends up dropping the ball on this crap. I will not be excited until I see this actually being used extensively in the wild.
Imagine the power of a GPU with the programming front end of x86 or x87, which are widely supported in compilers right now.

That's where this is headed: INT + GPU. The FPU is on borrowed time, and that's likely why they shared it.