Monday, September 24th 2012

AMD FX "Vishera" Processor Pricing Revealed

AMD's upcoming second-generation FX "Vishera" multi-core CPUs are likely to appeal to a variety of budget-conscious buyers, if a price list leaked from US retailer BLT is accurate. The list includes pricing for the first four models AMD will launch sometime in October, including the flagship FX-8350. The FX-8350 leads the pack with eight cores, a 4.00 GHz clock speed, and 16 MB of total cache. It is priced at US $253.06. The FX-8350 is followed by another eight-core chip, the FX-8320, clocked at 3.50 GHz and priced at $242.05.

Trailing the two eight-core chips is the FX-6300, carrying six cores, a 3.50 GHz clock speed, 14 MB of total cache, and a price tag of $175.77. The most affordable chip of the lot, the FX-4350, packs four cores, a 4.00 GHz clock speed, and 8 MB of total cache (likely achieved by halving the L3 cache as well). The FX-4350 is expected to go for $131.42. In all, the new lineup draws several parallels with the first-generation FX lineup of the FX-8150, FX-8120, FX-6100, and FX-4150.
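Assuming the leaked BLT figures above are accurate, a quick back-of-the-envelope sketch of what each chip works out to per core (illustrative arithmetic only, not part of the leak):

```python
# Leaked BLT pricing for the four "Vishera" launch models, per the article.
lineup = {
    "FX-8350": {"cores": 8, "ghz": 4.00, "price": 253.06},
    "FX-8320": {"cores": 8, "ghz": 3.50, "price": 242.05},
    "FX-6300": {"cores": 6, "ghz": 3.50, "price": 175.77},
    "FX-4350": {"cores": 4, "ghz": 4.00, "price": 131.42},
}

# Dollars per core for each model in the leaked list.
for model, spec in lineup.items():
    per_core = spec["price"] / spec["cores"]
    print(f"{model}: ${per_core:.2f} per core")
```

By this crude measure the six-core FX-6300 is the cheapest per core, while the quad-core FX-4350 is actually the most expensive per core of the four.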

Source: HotHardware

221 Comments on AMD FX "Vishera" Processor Pricing Revealed

#1
Steevo
by: os2wiz
Never said he was a liar
by: os2wiz
I think you are lying.
:laugh:

I like you, you are funny.
#2
xenocide
I should start bashing Intel products more often; then I could say whatever I wanted without getting it deleted. As for Piledriver, I read someone (nt300 I think) say there was a reason they were keeping it hushed, implying it was because its performance was a sizable improvement. It's important to remember AMD did the same thing with Bulldozer, so I don't buy into that theory too much. I hope AMD makes me eat my words, because on the CPU side of things, aside from APUs for laptops, I have not been too impressed since I replaced my Athlon 64 X2...

by: Steevo
:laugh:

I like you, you are funny.
Meh. It's clear he has differing opinions; I respect his right to that and choose to ignore what he posts unless it's relevant to the conversation at this point.
#3
Super XP
Yes AMD did fabulous marketing with Bulldozer. That awesome high-clocked 8-core magic processor that performed as well as a mouse's fart. Yes I know, AMD now has a better management team. Though I do find it interesting that they are not talking much about Vishera? We shall see soon enough :D
#4
xenocide
by: Super XP
Yes AMD did fabulous marketing with Bulldozer. That awesome high-clocked 8-core magic processor that performed as well as a mouse's fart. Yes I know, AMD now has a better management team. Though I do find it interesting that they are not talking much about Vishera? We shall see soon enough :D
I don't believe their marketing team or strategy has really changed all that much since the management change. I have spoken plenty about Vishera, but whenever I bring up that its improvements are less than a million-fold, people accuse me of lying despite sources saying otherwise, and I'm just tired of arguing about it. I think Vishera will be what Bulldozer should have been, and hopefully it corrects a lot of the awful issues BD suffered from (reported high cache latency, unusually high power consumption for many users, poorly handled scheduling, etc.).

P.S. That question mark is grammatically incorrect since you're not asking a question, but rather making a statement.
#5
eidairaman1
by: xenocide
I don't believe their marketing team or strategy has really changed all that much since the management change. I have spoken plenty about Vishera, but whenever I bring up that its improvements are less than a million-fold, people accuse me of lying despite sources saying otherwise, and I'm just tired of arguing about it. I think Vishera will be what Bulldozer should have been, and hopefully it corrects a lot of the awful issues BD suffered from (reported high cache latency, unusually high power consumption for many users, poorly handled scheduling, etc.).

P.S. That question mark is grammatically incorrect since you're not asking a question, but rather making a statement.
This I agree with. We can only hope the improvements let the chip run cooler, perform faster in single- and multi-threaded apps, and draw less power than BD across the PD lineup.
#6
SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs: have you tried playing games or working with an FX/Bulldozer CPU? Does this Bulldozer CPU really make you suffer a lot while gaming and working? I already tried comparing an FX-8150, an Intel i5 2500K, and an Intel i7 2600K with a GTX 680 in real games and working apps, and I didn't see any differences between those chips except unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, be cheaper than the competitor, and maybe perform better in benchmark apps :nutkick:.
#7
erocker
by: SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs
Yes I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results. :)
#8
os2wiz
A lie does not always make one a liar

by: Steevo
:laugh:

I like you, you are funny.
I agree. I said he was lying because he misrepresented the benchmark results; I do not think he is a liar. A liar is somebody who repeatedly tells falsehoods and is conscious of that. A person can tell one lie, and that doesn't make him a liar. We are all human. Sometimes in the heat of argument people say things they later realize they did not mean. It happens to all of us. I just hope that one doesn't make it an everyday practice.
#9
cdawall
where the hell are my stars
by: SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs: have you tried playing games or working with an FX/Bulldozer CPU? Does this Bulldozer CPU really make you suffer a lot while gaming and working? I already tried comparing an FX-8150, an Intel i5 2500K, and an Intel i7 2600K with a GTX 680 in real games and working apps, and I didn't see any differences between those chips except unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, be cheaper than the competitor, and maybe perform better in benchmark apps :nutkick:.
My ancient Xeon 3440 beats Thuban and Bulldozer in every game I tried. Mind you, I am probably one of the most consistent AMD purchasers around. I skipped Bulldozer. I tried it on a couple of friends' rigs and laughed as my lower-end card rendered more FPS coupled to a Phenom X4 @ 4.0 while he ran his BD @ 4.3.

Bulldozer sucked; it was one of the biggest letdowns I have seen in my lifetime in computers. The idea is great: parallel computing is awesome. AMD, however, needs to step up its game for it to catch on. The server market is already turning its ears toward Bulldozer; it works well with the correct programs. Give them a few (mind you, I said a few) more generations and we will see some shoe exchanging. Intel has had it happen to them on multiple occasions: AMD innovates and Intel copies.
#10
SIGSEGV
by: erocker
Yes I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results. :)
Thanks. I haven't ever used a multi-GPU setup on my FX-8150 (with an ASRock 990FX Extreme4), so I can't say much about that, but I did on my Phenom II X4 955 BE with CrossFire HD 6870s in the past.
#11
TRWOV
by: xenocide
I don't believe their marketing team or strategy has really changed all that much since the management change. I have spoken plenty about Vishera, but whenever I bring up that its improvements are less than a million-fold, people accuse me of lying despite sources saying otherwise, and I'm just tired of arguing about it. I think Vishera will be what Bulldozer should have been, and hopefully it corrects a lot of the awful issues BD suffered from (reported high cache latency, unusually high power consumption for many users, poorly handled scheduling, etc.).

P.S. That question mark is grammatically incorrect since you're not asking a question, but rather making a statement.
If I recall correctly there were Bulldozer shortages even months after the reviews hit, why wouldn't the same occur with Piledriver being a better product and all that? (rhetorical question, not directed at you BTW) :toast:

I don't think that good marketing matters for AMD; it's not like they could just churn out an extra million CPUs every month even if the demand was there. And they always sell out. People might want them to be present in every tier, but does AMD want that too? Why did AMD become fabless in the first place?
#12
os2wiz
by: Super XP
Let's get back to topic os2wiz. No need to explain yourself bro ;)
Piledriver will soon enough prove itself as an improvement over Bulldozer under AMD's new management team.

Even at a higher clock rate that would be great, so long as it runs at a voltage lower than or equal to Bulldozer's.
Thank you and solidarity greetings with the Greek working class.

by: TRWOV
If I recall correctly there were Bulldozer shortages even months after the reviews hit, why wouldn't the same occur with Piledriver being a better product and all that? (rhetorical question, not directed at you BTW) :toast:

I don't think that good marketing matters for AMD; it's not like they could just churn out an extra million CPUs every month even if the demand was there. And they always sell out. People might want them to be present in every tier, but does AMD want that too? Why did AMD become fabless in the first place?
Now that is objective and constructive criticism that I can relate to and accept. The trolling and baiting that some people indulge in is ridiculous though. Let us all maintain some level of intellectual honesty and respect and we will all get a lot more positives out of this forum. I appreciate your post.
#13
Super XP
by: xenocide
P.S. That question mark is grammatically incorrect since you're not asking a question, but rather making a statement.
Yes, my tablet sometimes auto-completes words for me but sometimes adds stuff like ?!/:() for some reason. I didn't pick up on that. ? In time :)
#14
xenocide
by: SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs: have you tried playing games or working with an FX/Bulldozer CPU? Does this Bulldozer CPU really make you suffer a lot while gaming and working? I already tried comparing an FX-8150, an Intel i5 2500K, and an Intel i7 2600K with a GTX 680 in real games and working apps, and I didn't see any differences between those chips except unrealistic numbers from benchmark apps.
I've seen (on Tom's Hardware) and heard (around TPU) that AMD CPUs tend to become a bottleneck for multi-GPU setups a lot quicker than Intel's offerings. I don't much care for SLI/CrossFire, so it's not a huge concern for me. I can definitely say going from my Q6600 to an i5-2500K made a huge difference for me in a lot of games, even with the same GPU. My only conclusion is that my HD 5850 was being bottlenecked, so the CPU can matter a lot more than most people are ever willing to admit.

I would imagine the difference between rendering something in 1 hour rather than 1 hour and 20 minutes isn't a huge deal for most people. I would never say BD was completely unusable, in fact it did pretty well for certain programs, but in my mind there was not enough reason to get it over Intel's offerings at the time.

by: cdawall
My ancient Xeon 3440 beats Thuban and Bulldozer in every game I tried. Mind you, I am probably one of the most consistent AMD purchasers around. I skipped Bulldozer. I tried it on a couple of friends' rigs and laughed as my lower-end card rendered more FPS coupled to a Phenom X4 @ 4.0 while he ran his BD @ 4.3.

Bulldozer sucked; it was one of the biggest letdowns I have seen in my lifetime in computers. The idea is great: parallel computing is awesome. AMD, however, needs to step up its game for it to catch on. The server market is already turning its ears toward Bulldozer; it works well with the correct programs. Give them a few (mind you, I said a few) more generations and we will see some shoe exchanging. Intel has had it happen to them on multiple occasions: AMD innovates and Intel copies.
There were a lot of situations where Phenom II was better than Bulldozer (especially the 4xxx and 6xxx variants). Phenom II had the higher IPC, so with the same number of 'cores' Phenom II performed better clock for clock. It was only really with software optimized for the new instruction sets that this wasn't true. It also depends greatly on the game; obviously certain games are way more CPU-dependent, primarily RTSes and MMOs.

Parallel computing is definitely the future, but it's a slow progression because software developers are slow to support increased numbers of cores, as it would require an increase in development costs. As for the innovation, AMD definitely innovated a lot; their implementation of x86-64 was definitely the smarter move. The way I see it, historically AMD introduces new concepts but Intel really executes them.
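The diminishing payoff from extra cores described above is essentially Amdahl's law: if only part of a program is parallelised, more cores run into a hard ceiling. A quick sketch (the 60% parallel fraction is a made-up illustrative figure, not a measurement of any real workload):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only parallel_fraction of the work
    can use extra cores (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# With a hypothetical 60%-parallel workload, doubling cores from 4 to 8
# buys surprisingly little, which is why software support matters so much.
for cores in (1, 2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(0.6, cores):.2f}x")
```

Even with infinitely many cores, a 60%-parallel workload can never exceed a 2.5x speedup, so an eight-core chip only helps as much as the software lets it.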
#15
theoneandonlymrk
by: erocker
Yes I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.
Respectfully, you're an exception (i.e. you actually bothered to try it out) rather than the norm, and multi-GPU puts you in a smaller bracket yet again, so your summary and experience might not match another user's experience even at that time. Not a dig, just saying :).
#16
Super XP
by: os2wiz
Thank you and solidarity greetings with the Greek working class.

Now that is objective and constructive criticism that I can relate to and accept. The trolling and baiting that some people indulge in is ridiculous though. Let us all maintain some level of intellectual honesty and respect and we will all get a lot more positives out of this forum. I appreciate your post.
Thanks, we Greeks always seem to find a way to excellence :D
#17
Horrux
by: erocker
Yes I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results. :)
I can report the same, but with an added twist: I used to run CrossFire with Radeons on my X6 1100T rig, which went very well; then when I replaced those with GeForces (even on an SLI-compatible motherboard), I had tons of trouble and could not get any real performance increase over a single GeForce, but instead got lots of performance issues. Then I changed my mobo/RAM/CPU to Intel and things have been flying.

From my experience, I deduced that multi-GPU with video cards built around NVIDIA GPUs works much worse on AMD systems than cards built around AMD GPUs. If history is any guide, Intel has paid NVIDIA a fortune to make that happen... Now I'm not saying that is the case, I'm just saying Intel has such a history of monopolistic, illegal, rotten, evil business practices that it wouldn't surprise me overly if such were the case.

Either way, using multiple GPUs will tend to move the bottleneck from the (single) GPU to the CPU, so that's where the CPU's power becomes more relevant.

It remains that in my case, it wasn't simply a situation where the second GTX 570 didn't add any performance, it just made everything stutter like complete madness.
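The point about multi-GPU moving the bottleneck onto the CPU can be sketched with a toy frame-time model (the millisecond figures below are invented for illustration, not measurements from any of the systems discussed):

```python
def fps(cpu_ms: float, gpu_ms: float, n_gpus: int = 1) -> float:
    """Toy model: the frame rate is capped by the slower of the CPU work
    and the (ideally scaled) GPU work; real AFR scaling is far less clean."""
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000.0 / frame_ms

# Hypothetical 12 ms of CPU work and 20 ms of GPU work per frame:
print(fps(12, 20, 1))  # single GPU: the GPU is the bottleneck
print(fps(12, 20, 2))  # two GPUs: now the CPU caps the frame rate
```

In this sketch the second GPU helps, but far less than double, because once the GPU side drops below the CPU's 12 ms per frame, the CPU sets the ceiling; a faster CPU raises that ceiling, which matches the multi-GPU experiences reported in this thread.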
#18
os2wiz
by: Super XP
Thanks, we Greeks always seem to find a way to excellence :D
Now if you can develop a REAL revolutionary party, then maybe you can throw out the oligarchs from the EU banks, Merkel's "miracle workers".
#19
cdawall
where the hell are my stars
by: Horrux
I can report the same, but with an added twist: I used to run CrossFire with Radeons on my X6 1100T rig, which went very well; then when I replaced those with GeForces (even on an SLI-compatible motherboard), I had tons of trouble and could not get any real performance increase over a single GeForce, but instead got lots of performance issues. Then I changed my mobo/RAM/CPU to Intel and things have been flying.

From my experience, I deduced that multi-GPU with video cards built around NVIDIA GPUs works much worse on AMD systems than cards built around AMD GPUs. If history is any guide, Intel has paid NVIDIA a fortune to make that happen... Now I'm not saying that is the case, I'm just saying Intel has such a history of monopolistic, illegal, rotten, evil business practices that it wouldn't surprise me overly if such were the case.

Either way, using multiple GPUs will tend to move the bottleneck from the (single) GPU to the CPU, so that's where the CPU's power becomes more relevant.

It remains that in my case, it wasn't simply a situation where the second GTX 570 didn't add any performance, it just made everything stutter like complete madness.
Mine has none of those issues; it could have been an AMD mobo issue with drivers or BIOS.
#20
theoneandonlymrk
Rumours abound that AM3+ is still going to be used for Steamroller too, happy days :D

To the above and below posts: I can't vouch for NVIDIA on AMD as I haven't used it, but I have CrossFired on AMD (this rig) and Intel platforms with minimal issues (mostly app-specific) for years now. The main thing CrossFire users need to know is to turn AFR on; this brings a near doubling of FPS in all but NVIDIA-optimised games and Metro 2033, though AFR is also the most crash-prone.
#21
Horrux
by: cdawall
Mine has none of those issues; it could have been an AMD mobo issue with drivers or BIOS.
Might be; we'll never know. All I know is I first tried with an ASUS M4A79 Deluxe, which isn't SLI-certified, using patched drivers. With the old, then the new BIOS. Then with a newer mobo, a high-end MSI that had USB 3.0, with the stock, then the new BIOS. Then I just gave up.
#22
erocker
by: Super XP
Piledriver finally got Benchmarked :eek:
My sister's husband's best friend's mother's boyfriend's school teacher confirms this. For now I am taking this information with a grain of SALT :rolleyes:
Don't post useless posts. It accomplishes nothing.
#23
1d10t
by: SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs: have you tried playing games or working with an FX/Bulldozer CPU? Does this Bulldozer CPU really make you suffer a lot while gaming and working? I already tried comparing an FX-8150, an Intel i5 2500K, and an Intel i7 2600K with a GTX 680 in real games and working apps, and I didn't see any differences between those chips except unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, be cheaper than the competitor, and maybe perform better in benchmark apps :nutkick:.
This just reminds me of my co-worker friend. He just switched from Thuban to a Sandy Bridge platform. His job is mainly drawing 2D, editing small video clips, and presenting them beautifully. And he wonders why "those lags" still persist although he "made a good decision".
Then he came to my workshop and tried my computer. After six hours of torturing my computer, he just commented, "Did you use a highly overclocked Ivy Bridge?" :p

by: erocker
Yes I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results. :)
All in all, if you CrossFired or SLIed high-end GPUs, the choice was obvious.
But when it comes to mainstream, i.e. my system with just two HD 7850s, there is no noticeable difference versus a highly overclocked 2600K + GTX 680 in 1080p gaming :)
#24
symmetrical
by: SIGSEGV
Sometimes I want to ask you people who have already compared AMD and Intel CPUs: have you tried playing games or working with an FX/Bulldozer CPU? Does this Bulldozer CPU really make you suffer a lot while gaming and working? I already tried comparing an FX-8150, an Intel i5 2500K, and an Intel i7 2600K with a GTX 680 in real games and working apps, and I didn't see any differences between those chips except unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, be cheaper than the competitor, and maybe perform better in benchmark apps :nutkick:.
I used to have an FX-8120; in certain games like StarCraft 2 and even Bad Company 2 I used to suffer frame dips, to as low as 40 FPS in BC2 and 30 FPS in SC2. After I got my 2600K, everything was locked at 60 FPS with v-sync. Although SC2 is a worst-case scenario for Bulldozer, seeing as it only utilizes 2 threads. Other games like Operation Flashpoint barely hit 45 FPS while the i7 was locked at 60 with zero dips. There are a bunch of other games where the FX-8120 bottlenecked my GTX 580 at the time, and god knows it would bottleneck my current GTX 680.

I also extract a lot of .rar files and do a lot of video conversion with HandBrake. The i7 was dramatically faster in extraction and compression. Although I will say the FX-8120 was great in HandBrake, and overclocked it was the same as the i7. As for photo editing, it would be hard to tell a difference.

In gaming, though, Bulldozer really does suck in comparison. If you REALLY can't wrap your head around that, then take a look at some of the Borderlands 2 CPU scaling benchmarks or the multi-GPU benchmarks (tweaktown.com). There are tons of proven benchmarks that aren't just "bars and graphs" that prove this. It really DOES just suck and is behind even the Phenom II.

However, I will note that everybody's gaming experience will be based on their own standards. Some people will be fine with 30 FPS and claim they see no difference from 60 FPS (they probably need their eyes checked).
#25
NeoXF
by: theoneandonlymrk
Rumours abound that AM3+ is still going to be used for Steamroller too, happy days :D

To the above and below posts: I can't vouch for NVIDIA on AMD as I haven't used it, but I have CrossFired on AMD (this rig) and Intel platforms with minimal issues (mostly app-specific) for years now. The main thing CrossFire users need to know is to turn AFR on; this brings a near doubling of FPS in all but NVIDIA-optimised games and Metro 2033, though AFR is also the most crash-prone.
Since they implied that the FM2 socket will support at least ONE future APU generation, that being the Steamroller one... and FM2 being very similar to FM1, which in turn is very similar to AM3/AM3+... I can see how it's all but certain that performance Steamroller will be on AM3+ as well. Which is good... if they can muster the performance they're boasting on that age-old platform... as well as that they'll be more compelled to change it afterwards for Excavator (old age/obsolescence, DDR4, maybe a need to unify the APU and enthusiast platforms, etc.).