• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

AMD Announces 2012 FX "Vishera" Line of Performance Desktop Processors

AMD is doing the right thing with its pricing. AMD is clearly the winner when it comes to performance per price per PC.

However, I buy Intel for myself because Microcenter always has combo deals (CPU + motherboard) that are at least 35% cheaper than Newegg or any other online retailer; otherwise, I would buy AMD.

Also, people should realize that we've come to a point where CPU performance no longer matters for gaming, thanks to better programming for multicore CPUs.

What really matters now is the GPU. Spend less on the CPU and get a better GPU, and you'll get more FPS!
 

Not true. The issue is that the majority of games are console ports, and with console tech over 5 years old, the games just aren't made to push PCs very hard, if at all.

The trend will continue until the new consoles come out.
 

That's so two years ago. The OLD Unreal Engine has been optimized greatly for multicore CPUs in the past 2 years, if you haven't noticed (Dishonored, for example).

Even if your statement were true (absolutely unlikely), Vishera can handle all non-multicore games just fine at its current clock speed (even a Phenom II can do so).

Fun fact: all Capcom (a huge console game publisher) games in the past three years support multithreading.
 

Funny thing, I never said squat about cores or the AMD chips not being able to handle gaming. My point was, and still is, that because the vast majority of games are straight console ports, they don't require much oomph from PCs. Way to get defensive about nothing.
 
That's also not entirely true. It's not because of consoles that devs don't make games that "push our PCs hard enough".

It's because the market for PC games is not big enough for publishers to make the investment (think about the people who have i7s and GTXs).

The prettier a game, the more expensive it is to make. I'm pretty sure all the devs out there want as much money as possible to make the prettiest game evar, but they just don't get funded that way.

And, I just love to argue.
 
Power consumption is still nothing short of horrible and goes beyond horrible when OC'ed.
Piledriver is a fail in my book.

A 26-watt difference?

[Chart: power consumption at full load]

While not exactly as low as the lower-voltage, lower-wattage, smaller-die Intel chips, it is still a vast improvement over the previous generation.
 
Just what I wanted to see. Surpassed my expectations in Handbrake, though. A much larger bump over the BD and PIIs than I thought it would have.

Think I might just pick one of these up. Might be my last AMD CPU if things don't improve. Now to just wait for a good deal. Sorry Newegg, but $220 is too much. Gimme that $200 price point or a special bundled-game deal and then I'll bite.
 
He clearly said "when overclocked"...
According to TPU, it consumes 254W when OC'ed. Now compare that to the stock 134W. That's almost twice the power consumption. At least to me, it's unacceptable.
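The "almost twice" claim checks out arithmetically. A quick sketch, using only the TPU figures quoted above (not re-measured):

```python
# Power figures quoted from the TPU review cited above
stock_w = 134  # FX-8350 at stock, full load
oc_w = 254     # FX-8350 overclocked, full load

ratio = oc_w / stock_w
increase_pct = (oc_w - stock_w) / stock_w * 100

print(f"OC draws {ratio:.2f}x stock power ({increase_pct:.0f}% more)")
```

So OC'ed it pulls roughly 1.9x, i.e. about 90% more than stock, which is where "almost twice" comes from.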

Read again; he clearly said:
Power consumption is still nothing short of horrible

and there's this:
and goes beyond horrible when OC'ed.
Piledriver is a fail in my book.
 

...and you think my 3820 only eats 130 watts OC'ed as well as stock? I don't think so. Overclocking makes my power usage skyrocket on SB-E. :) Maybe even as high as 225 watts? Could be higher.
 


Cool down, my friend. Why are you comparing Piledriver to LGA 2011? Let's compare it with something within its performance range.

The i7-3770K overclocks from 3.5 GHz to 4.6 GHz, a 31% overclock, and consumes only 39.3% more power under load. (http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/4)

The FX-8350 overclocks from 4 GHz to 4.8 GHz, only a 20% overclock, yet consumes a hefty 59% more power under load. (http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/8)

According to that 8350 review, an i7-3770K at 4.6 GHz would have used only 167W, which is lower than the FX-8350's stock power consumption.

I didn't try to bash you, but I always respect reality rather than speculation. If you really need to compare a lower tier of product to a higher tier, compare it with the best, or at least the second best, and then compare the performance gained against the extra power consumed. For example, compare the FX-8350 @ 4.8 GHz against an i7-3930K @ stock, because the i7-3930K at stock clocks will beat a fully OC'ed FX; then compare the stock i7-3930K's power consumption with the FX's @ 4.8 GHz. Otherwise, compare it with an i7-3770K or i5-3570K, both the FX and the Ivy fully overclocked. Ivy Bridge still overclocks better than the FX-8350 (Ivy Bridge's 31% OC vs. the 8350's 20%) on an air cooler.
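The overclock percentages above follow directly from the clock figures in the linked reviews. A minimal sanity check; the clock and power-increase numbers are the ones quoted in the post, not re-measured:

```python
def oc_percent(base_ghz, oc_ghz):
    """Percentage clock increase from base to overclocked frequency."""
    return (oc_ghz - base_ghz) / base_ghz * 100

# Clock figures quoted in the post (from the linked AnandTech reviews)
ivy_oc = oc_percent(3.5, 4.6)  # i7-3770K: ~31%
fx_oc = oc_percent(4.0, 4.8)   # FX-8350: 20%

# Load-power increases quoted in the post
ivy_power_pct = 39.3
fx_power_pct = 59.0

print(f"i7-3770K: +{ivy_oc:.0f}% clock for +{ivy_power_pct}% power")
print(f"FX-8350:  +{fx_oc:.0f}% clock for +{fx_power_pct}% power")
```

The point being made: the smaller of the two overclocks (20% vs. 31%) costs the larger power increase (59% vs. 39.3%).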
 
That's also not entirely true. It's not because of consoles that devs don't make games that "push our PCs hard enough".

It's because the market for PC games is not big enough for publishers to make the investment (think about the people who have i7s and GTXs).

The prettier a game, the more expensive it is to make. I'm pretty sure all the devs out there want as much money as possible to make the prettiest game evar, but they just don't get funded that way.

And, I just love to argue.

I have to go with [H] on this one, though you provide a perfectly valid point.

We as the PC community want to move forward, but are limited by graphics technology from earlier consoles.

There are a few gleaming examples, though, like Dishonored, that shine through, but they are few and far between.
 
Cool down, my friend. Why are you comparing Piledriver to LGA 2011? Let's compare it with something within its performance range.

The i7-3770K overclocks from 3.5 GHz to 4.6 GHz, a 31% overclock, and consumes only 39.3% more power under load. (http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/4)

The FX-8350 overclocks from 4 GHz to 4.8 GHz, only a 20% overclock, yet consumes a hefty 59% more power under load. (http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/8)

According to that 8350 review, an i7-3770K at 4.6 GHz would have used only 167W, which is lower than the FX-8350's stock power consumption.

I didn't try to bash you, but I always respect reality rather than speculation. If you really need to compare a lower tier of product to a higher tier, compare it with the best, or at least the second best, and then compare the performance gained against the extra power consumed. For example, compare the FX-8350 @ 4.8 GHz against an i7-3930K @ stock, because the i7-3930K at stock clocks will beat a fully OC'ed FX; then compare the stock i7-3930K's power consumption with the FX's @ 4.8 GHz. Otherwise, compare it with an i7-3770K or i5-3570K, both the FX and the Ivy fully overclocked. Ivy Bridge still overclocks better than the FX-8350 (Ivy Bridge's 31% OC vs. the 8350's 20%) on an air cooler.

/facepalm

I think you missed the point... Perhaps you should at least read up on Wikipedia first.
 
Aww gee whiz, I would get a Piledriver CPU right now and do a benchmark of "Son versus Father: a duel where Piledriver tries to prove he's better than his pops!"
Too bad here in Lithuania computer parts usually "lag behind" by at least a week (Bulldozer "lagged" by more than a month, for example). That's a long time to put up with this mouth-watering, as those Piledrivers do look tasty...
 
Power load is the biggest improvement I noticed. Performance is actually better too. Let's compare an 8320 to an 8150 now.
 