Monday, August 27th 2012

AMD "Vishera" FX-Series CPU Specifications Confirmed

A leaked AMD document for retail partners spelled out the specifications of the first three FX "Vishera" processors by AMD. The new CPUs incorporate AMD's "Piledriver" architecture and, much like the first-generation "Zambezi" chips, will launch as one model each of eight-core, six-core, and four-core chips. The eight-core FX-8350 is confirmed to ship with a 4.00 GHz nominal clock speed and 4.20 GHz TurboCore speed. The six-core FX-6300 ships with 3.50 GHz nominal and 4.10 GHz TurboCore speeds. The quad-core FX-4320, on the other hand, ships with the same clock speeds as the FX-8350. In addition, the document confirmed clock speeds of several socket FM2 A-series APUs, such as the A10-5700 and the A8-5500.

Source: Expreview

493 Comments on AMD "Vishera" FX-Series CPU Specifications Confirmed

#1
AvonX
by: TheLaughingMan
That is not true. They currently don't plan to release another dedicated CPU. The next generation is supposed to bring improvements in the core design, a drop to a smaller fab process, and the addition of an on-die GPU. This would bring their top-end CPUs in line with Intel's current model of including an on-die GPU on every chip.

The question there is whether it will be wasted space, or whether AMD will be smart enough to leverage that extra GPU to its advantage on the software side. Example: while NOT gaming, AMD ZeroCore can turn your dedicated GPU completely off and handle day-to-day tasks with the on-die GPU. OpenCL and DirectCompute could be used to run physics calculations on the on-die GPU while the dedicated card renders the game, etc.
Well, if that is the case, no more AMD stuff for me. :laugh:
I see no reason at all for them to stop doing dedicated desktop CPUs. Since these CPUs come from the server parts, I find it silly for them not to make the extra cash.
I see bad management again at AMD. APUs are not ready for really high performance yet.
I see AMD sinking, and fast. If they have the same mindset for servers as well, then they will sink for sure.
Posted on Reply
#2
Frick
Fishfaced Nincompoop
by: eidairaman1
Most consumers here game more, but I'm sure BD shines in stuff other than gaming.
BD for servers review.

by: AvonX
Well, if that is the case, no more AMD stuff for me. :laugh:
I see no reason at all for them to stop doing dedicated desktop CPUs. Since these CPUs come from the server parts, I find it silly for them not to make the extra cash.
I see bad management again at AMD. APUs are not ready for really high performance yet.
I see AMD sinking, and fast. If they have the same mindset for servers as well, then they will sink for sure.
You do know pretty much every Intel CPU has an integrated GPU as well, don't you? Even the 3770K has one.
Posted on Reply
#3
theoneandonlymrk
by: AvonX
Well, if that is the case, no more AMD stuff for me.
I see no reason at all for them to stop doing dedicated desktop CPUs. Since these CPUs come from the server parts, I find it silly for them not to make the extra cash.
I see bad management again at AMD. APUs are not ready for really high performance yet.
I see AMD sinking, and fast. If they have the same mindset for servers as well, then they will sink for sure.
Don't be silly; APUs already possess way more computational performance than CPUs. The software stack decides what it will use, and often enough that isn't the GPU part. But AMD, ARM, Qualcomm, and (indirectly associated, but still) Intel and NVIDIA are all moving towards a heterogeneous future, with all CPUs configured with an on-die GPU/GPGPU unit. ALL of them are making the same thing; "APU", I believe, is trademarked by AMD, but they are all making them, and soon ARM will have a high-performance APU in its IP stack, as will AMD. If you aren't buying that stuff, then you won't be upgrading your PC at all soon.


APU server parts are already in the marketplace, AFAIK.
Posted on Reply
#4
Steevo
Reviews, and they still have horrible advertising. It looks like something a middle-school child would make to tout the benefits of being class president.
Posted on Reply
#5
Covert_Death
YAYYYYYYYY, I passed up on BD and am glad I did... I'm looking at you, 8350, and hoping for a ~$200 price tag!

I run a PII X4 955 @ 3.99 GHz day in, day out, and this MIGHT actually be an upgrade if it overclocks well! Here's to hoping! (And hoping ArmA 3 is more CPU-optimized!)
Posted on Reply
#6
Oberon
So, can anyone explain the presence of APU-specific features (AVC, "dual graphics" [distinct from CF for AMD], Eyefinity, etc.) on the FX CPUs, which don't have a GPU on board? Smells fishy to me.
Posted on Reply
#7
AvonX
by: theoneandonlymrk
Don't be silly; APUs already possess way more computational performance than CPUs. The software stack decides what it will use, and often enough that isn't the GPU part. But AMD, ARM, Qualcomm, and (indirectly associated, but still) Intel and NVIDIA are all moving towards a heterogeneous future, with all CPUs configured with an on-die GPU/GPGPU unit. ALL of them are making the same thing; "APU", I believe, is trademarked by AMD, but they are all making them, and soon ARM will have a high-performance APU in its IP stack, as will AMD. If you aren't buying that stuff, then you won't be upgrading your PC at all soon.


APU server parts are already in the marketplace, AFAIK.
You can't compare those things with a dedicated CPU and a discrete graphics card for gaming.
Those are way more powerful than APUs currently.
Posted on Reply
#8
pantherx12
by: AvonX
You can't compare those things with a dedicated CPU and a discrete graphics card for gaming.
Those are way more powerful than APUs currently.
Yes, two dedicated pieces of hardware would be faster.

However, they're nowhere near as efficient power-wise, production-wise and space-wise. And let's not forget performance for price, where they would own two dedicated pieces of hardware.


Pretty much every device in the future will be a combination of CPU/GPU. AMD are right when they say the future is fusion.
Posted on Reply
#9
AvonX
by: Frick
BD for servers review.



You do know pretty much every Intel CPU has an integrated GPU as well, don't you? Even the 3770K has one.
You do know that the 3770K is way, way faster than any AMD dedicated CPU or APU, don't you?
That is the main difference. I think you may have misunderstood me.
The point I want to make here is: even if AMD has no plans to do dedicated CPUs, they still have to innovate and push harder to be faster. Currently they really don't have those things in place. So it's a big risk, from my point of view, to go full-APU now, because they are not ready for that yet.
Posted on Reply
#10
AvonX
by: pantherx12
Yes, two dedicated pieces of hardware would be faster.

However, they're nowhere near as efficient power-wise, production-wise and space-wise. And let's not forget performance for price, where they would own two dedicated pieces of hardware.


Pretty much every device in the future will be a combination of CPU/GPU. AMD are right when they say the future is fusion.
The only thing that Intel needs to do is improve their integrated GPU.
I think they have already improved on that lately, if I am not mistaken.
So AMD will be miles behind again soon enough in "CPU/GPU".
They still have to improve a lot on the CPU side.
Posted on Reply
#11
Frick
Fishfaced Nincompoop
by: AvonX
You do know that the 3770K is way, way faster than any AMD dedicated CPU or APU, don't you?
That is the main difference. I think you may have misunderstood me.
The point I want to make here is: even if AMD has no plans to do dedicated CPUs, they still have to innovate and push harder to be faster. Currently they really don't have those things in place. So it's a big risk, from my point of view, to go full-APU now, because they are not ready for that yet.
Ah, yes, I did misunderstand. But they have a lot of catching up to do (if they want to compete at the higher end) anyway, so I'm not sure I agree with you. The APU technology is maturing.

by: AvonX
The only thing that Intel needs to do is improve their integrated GPU.
I think they have already improved on that lately, if I am not mistaken.
Indeed they have. The HD 4000 GPU isn't that bad. It's not as good as AMD's offerings, but it was a step forward.
Posted on Reply
#12
repman244
by: AvonX

The point I want to make here is: even if AMD has no plans to do dedicated CPUs, they still have to innovate and push harder to be faster. Currently they really don't have those things in place. So it's a big risk, from my point of view, to go full-APU now, because they are not ready for that yet.
It's not a risk (Intel went full-APU even though the GPU still lacks a lot). You need to remember that it is a lot cheaper to make only one chip; AMD had to produce both Llano (which is based on the old Stars core) and BD-based chips.
They want to combine those two so that the production cost drops, and that is planned for the FM2 socket.

by: AvonX
The only thing that Intel needs to do is improve their integrated GPU.
I think they have already improved on that lately, if I am not mistaken.
So AMD will be miles behind again soon enough in "CPU/GPU".
They still have to improve a lot on the CPU side.
Don't forget that Trinity's GPU is still based on the VLIW4 architecture; when they introduce a GCN-based APU it will bring much better performance (and great compute power!) and efficiency.
And looking at the Trinity benchmarks, it's actually a great improvement compared to Llano.
Posted on Reply
#13
Dent1
by: Frick
Ah, yes, I did misunderstand. But they have a lot of catching up to do (if they want to compete at the higher end) anyway, so I'm not sure I agree with you. The APU technology is maturing.
The thing is, neither AMD nor Intel truly cares about competing on the high end or low end; they only care about generating revenue.

The high-end segment doesn't exist in gaming like it used to. We're getting to a point where even a CPU from 2009 can play the latest games well, or a cheap £50 CPU from today is plenty. There really isn't any need to spend £200+ on a high-end CPU for gaming. The high-end CPU gaming market is dying, and APUs are putting the last nail in the coffin.

IMO, in a decade or so there will only be APUs, and dedicated video cards will be reserved for the elite gamers who want to SLI or CF their dedicated cards with their APU.
Posted on Reply
#14
suraswami
blah blah.

I guess we will know what a crapshoot this is when reviews are posted.
Posted on Reply
#15
GLD
Sign me up for a 95w FX-4320 please. :)
Posted on Reply
#16
techtard
Just like last time, I'm not going to expect too much from AMD. Looking forward to reviews.
Might even buy a Piledriver rig just to play with and overclock.

As a side note, I recently played some games at a friend's house (dude is an AMD fanboy trapped in the Athlon XP glory days) on his Bulldozer FX-8120 rig, and everything was just fine.
If you didn't measure the FPS on screen, you wouldn't know you were playing on an 'inferior' CPU. Everything was pretty smooth.

Also, it doesn't matter how improved Piledriver may be if the Intel compiler still forces all non-GenuineIntel CPUs to run non-optimal code paths.
Posted on Reply
#17
AvonX
by: Dent1
The thing is, neither AMD nor Intel truly cares about competing on the high end or low end; they only care about generating revenue.

The high-end segment doesn't exist in gaming like it used to. We're getting to a point where even a CPU from 2009 can play the latest games well, or a cheap £50 CPU from today is plenty. There really isn't any need to spend £200+ on a high-end CPU for gaming. The high-end CPU gaming market is dying, and APUs are putting the last nail in the coffin.

IMO, in a decade or so there will only be APUs, and dedicated video cards will be reserved for the elite gamers who want to SLI or CF their dedicated cards with their APU.
You could be correct. If that is the case, I have a great idea, but I don't have the cash to do it. :laugh:
I was thinking of an all-in-one system: a system which would be a console for games and a PC at the same time. APU-style. :laugh:
Posted on Reply
#18
N3M3515
Exciting news, but not for gamers... :ohwell:
SB still wipes the floor with Piledriver in games.
Posted on Reply
#19
eidairaman1
One thing I noticed is that BD was marketed incorrectly. And BTW, Intel at the time was still using unfair business practices, hence the antitrust cases.

by: xenocide
HyperThreading may seem "lol", but the fact is Intel's CPUs are powerful enough that they shine even with it being a kind of hacked-on feature. The numbers don't lie: http://www.anandtech.com/bench/Product/434?vs=287



Basically, the new design caused a hit to per-thread performance, but you could now run 8 threads, so it was supposed to be an even trade-off. The downside was they didn't consider that most applications (if someone brings up compiling or video editing I will choke them; you are incredibly niche and make up less of the market than gamers, even) do not use more than 2-4 threads, so Intel's offerings, and even AMD's older offerings, were better. Not sure why, but Tom's Hardware also noticed the FX series bottlenecked GPUs like crazy.

I hope Vishera is a marked improvement, at least for AMD's sake. I honestly am pretty pleased with having gone for SB instead of waiting for BD. :x



Role reversal is fun like that. I wouldn't say BD "shines" in anything. I'm sure I'll get crucified for that, but everything I've seen indicates that in an optimal environment (basically one that heavily favors AMD) it is at least on par with SB/IB chips, but the remaining 90%+ of the time Intel edges them out. I would say this is more troubling for AMD than NetBurst was for Intel. Intel had tons of capital and a massive market share, so gambling on a new architecture was acceptable. AMD really has neither of those, so releasing BD in the condition it was in was a huge risk. Luckily their GPU and APU sales are doing great.
Posted on Reply
#20
YautjaLord
Been visiting the front page for any info like that on a daily basis: finally some info, ffs! :laugh:

Now all I need to know is when the FM2/1090FX mobos will be released. Here's to a larger (16MB?) L3 cache, real 10-15% performance improvements over BD, and just about everything else BD lacked. Dreaming, yeah, but you can't take it from me now, can you? :) :toast:
Posted on Reply
#21
SIGSEGV
by: AvonX
Well, if that is the case, no more AMD stuff for me. :laugh:
I see no reason at all for them to stop doing dedicated desktop CPUs. Since these CPUs come from the server parts, I find it silly for them not to make the extra cash.
I see bad management again at AMD. APUs are not ready for really high performance yet.
I see AMD sinking, and fast. If they have the same mindset for servers as well, then they will sink for sure.
blah blah blah blah,..

http://gmplib.org/pi-with-gmp.html
http://arstechnica.com/gadgets/2008/07/atom-nano-review/6/
http://www.agner.org/optimize/blog/read.php?i=49
http://en.wikipedia.org/wiki/Intel_C%2B%2B_Compiler
Posted on Reply
#22
AvonX
:laugh:
by: YautjaLord
Been visiting the front page for any info like that on a daily basis: finally some info, ffs! :laugh:

Now all I need to know is when the FM2/1090FX mobos will be released. Here's to a larger (16MB?) L3 cache, real 10-15% performance improvements over BD, and just about everything else BD lacked. Dreaming, yeah, but you can't take it from me now, can you? :) :toast:
:laugh: For the 1090FX mobos, I think yes, you are dreaming. :laugh:
Even if they finally decide to do it, it's still far away from now.
Somewhere near the end of 2013.
Posted on Reply
#23
pantherx12
by: AvonX
The only thing that Intel needs to do is improve their integrated GPU.
I think they have already improved on that lately, if I am not mistaken.
So AMD will be miles behind again soon enough in "CPU/GPU".
They still have to improve a lot on the CPU side.
Yeah, but the thing is, AMD's graphics are better than Intel's graphics to the same degree that Intel's CPUs are better than AMD's.

So when the software catches up with the hardware, they'll both offer decent performance, with AMD theoretically having better compute power (so better for media creation, editing, rendering and gaming).
Posted on Reply
#24
cdawall
where the hell are my stars
by: slybunda
Looks like another fail from AMD: 4.2 GHz clock speed, and it will probably be on par with a 3 GHz Ivy Bridge using half the number of cores.
Move along, folks, nothing to see here.
Who cares? Why is clock speed the deciding factor here? If it pulls the same wattage, who gives two flying shits whether it's clocked at 1 GHz or 100 GHz? Performance per watt is the only thing that should make any difference. If you think any differently, maybe you should rethink things.

by: Hustler
Well, unless these new CPUs have increased their IPC by at least 30% over Bulldozer, which obviously they won't, they're still a waste of time.
Phenom I vs. Phenom II: with the increased clock speeds, as long as we see a 10-15% jump it will be enough to compete in a middle-of-the-road market. Seeing how GPUs continue to be the lag point for video gaming, price point yet again will be a deciding factor for gamers.

by: eidairaman1
Which is fine, considering even the X4 BEs gave the FX-81** series a hard run while costing way less.
Hence why I have not bothered to buy anything that wasn't Phenom II-based. I think everyone should remember the huge performance jump from Phenom I to Phenom II.

by: Frick
BD for servers review.
When you use the correct programs to render, BD does quite well. Weird how an Intel-based rendering program favors Intel. There is a reason AMD added certain things to its technology, and it wasn't to continue using Intel-styled shit.


Now, personally, I don't understand why everyone is against AMD improving products and releasing new chips. You do understand that if it were an Intel-only market we would not see improvements like we do now; prices would skyrocket, etc. Just remember: on release the Athlon X2 4200+ was $537, and Intel's pricing was the same. I am all for a "crappy" Piledriver chip for $200. Intel can keep its $500+ chips.
Posted on Reply
#25
ensabrenoir
CPU releases are so much fun... this is all too familiar, though. All we need is for someone to say AMD's gonna mop the floor with Intel, then it'll be like old times. It would be ironic if the chip no one believes in actually does it. But physics and reality are gonna step in. Just looking forward to some new tech. Don't care if Fisher-Price made it.
Posted on Reply