
HD 7970: Bulldozer vs. Sandy Bridge vs. Nehalem

My argument was according to W1zzard's review: the Bulldozer performs on par with the Nehalem i7. If Bulldozer's single-threaded performance is so subpar, why does Nehalem get a pass? Sure, it's older, but I bet 7/10 of TPU members still own an i7 920 or had one, and they wouldn't classify it as a shitty performer at all.

It's not that I'm asking for a pass, I'm just pointing out that people buy what's suitable for them. Some people don't mind average gaming performance if they can encode their videos super quick. That doesn't mean the CPU is rubbish, just that it's not for you.

In this review it only performs on par due to GPU limitations at the higher resolutions. If you look at the lower resolutions it loses. Nehalem doesn't need a pass because it offers superior IPC to Bulldozer; when you start looking at benchmarks outside of games you will see this. I am a TPU user with a 920, and I classify it as a better performer than BD, even more so at my overclocked speed.

And I agree people should buy what suits their needs and budget. I didn't say it's total rubbish, but I think Bulldozer is still a flop because it doesn't offer performance much greater than Phenom II or Intel's 3-year-old Nehalem CPUs.

When you spend 4+ years designing a CPU and it doesn't beat your previous model, that is bad no matter how you spin it. And yes, it offers you an upgrade path but not a performance improvement, so why bother upgrading?
 
In an equation such as performance/€ there are two cogs you can tweak, not just one.
One way is obvious: double the performance.
But there is also another way: halve the price!

This is what happened with the iteration from HD5850 --> HD6850/HD6870.
The HD6870 was only slightly faster than the HD5850, but the price was cut nearly in half.
In my book that's just as good, and it led me to buy one.

Of course there are also some other factors playing into that equation (e.g. power consumption), where the HD6870 also did better than its predecessor.
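
As a rough sketch of that equation (the prices and performance numbers here are illustrative assumptions, not benchmark figures):

Code:
# Value = performance / price. Doubling performance or halving price
# both double the value. Numbers below are illustrative assumptions.
def perf_per_euro(relative_performance, price_eur):
    return relative_performance / price_eur

hd5850 = perf_per_euro(100, 260)  # assumed HD5850 price at the time
hd6870 = perf_per_euro(105, 150)  # slightly faster, far cheaper

print(f"HD5850: {hd5850:.3f} perf/euro")
print(f"HD6870: {hd6870:.3f} perf/euro")  # roughly 80% more value per euro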
 
When you spend 4+ years designing a CPU and it doesn't beat your previous model, that is bad no matter how you spin it. And yes, it offers you an upgrade path but not a performance improvement, so why bother upgrading?

Fair enough, the development of Bulldozer took far too long.

To say its upgrade path is "not a performance improvement" is a misconception. Look at the FX-8150 reviews in encoding, 3D rendering, compression, and general productivity, and you'll see that for people coming off an Athlon II or Phenom II X4/X6, the upgrade path does yield a big improvement.

Not all upgrades are based on gaming. I can go from a low-end Athlon II X4 to an FX-8150 spanking an i5 2500K in rendering, just by dropping in a new CPU. The perfect upgrade path for a 3D animator.
 
Fair enough, the development of Bulldozer took far too long.

To say its upgrade path is "not a performance improvement" is a misconception. Look at the FX-8150 reviews in encoding, 3D rendering, compression, and general productivity, and you'll see that for people coming off an Athlon II or Phenom II X4/X6, the upgrade path does yield a big improvement.

Not all upgrades are based on gaming. I can go from a low-end Athlon II X4 to an FX-8150 spanking an i5 2500K in rendering, just by dropping in a new CPU. The perfect upgrade path for a 3D animator.

Yes, in those areas it will show an improvement over Phenom.

The FX-8150 will indeed beat a 2500K in rendering, but it will not beat a 2600K, which is in the same price bracket.

With an AM3 board you don't really have much of a choice when upgrading. And that would be the only reason I would buy one: because there is no other option.
 
Yes, in those areas it will show an improvement over Phenom.

Gd gd
The FX-8150 will indeed beat a 2500K in rendering, but it will not beat a 2600K, which is in the same price bracket.

In the UK the FX-8150 price falls in between both.

Ebuyer.com
2600K - £240
FX-8150 - £210
2500K - £170

Anyway, if rendering all day is my job, I'm running socket AM3, I have limited money, and it's a choice between dropping in an FX-8150 or changing platforms specifically for a 2600K, most people would select the FX-8150 all day.

And to be fair, there are some situations where the FX-8150 does outperform the 2600K; transcoding comes to mind. Now, if you transcode all day, Intel doesn't have a commercial solution.


With an AM3 board you don't really have much of a choice when upgrading.

Huh??? Socket AM3 has more CPUs on its support list than any current Intel socket.

And that would be the only reason I would buy one: because there is no other option.

Granted, if coming from a Phenom II X6, yes, you'll have little option.
 
In this review it only performs on par due to GPU limitations at the higher resolutions. If you look at the lower resolutions it loses. Nehalem doesn't need a pass because it offers superior IPC to Bulldozer; when you start looking at benchmarks outside of games you will see this. I am a TPU user with a 920, and I classify it as a better performer than BD, even more so at my overclocked speed.

And I agree people should buy what suits their needs and budget. I didn't say it's total rubbish, but I think Bulldozer is still a flop because it doesn't offer performance much greater than Phenom II or Intel's 3-year-old Nehalem CPUs.

When you spend 4+ years designing a CPU and it doesn't beat your previous model, that is bad no matter how you spin it. And yes, it offers you an upgrade path but not a performance improvement, so why bother upgrading?

The i7 is a hell of a chip. :respect: As are all the other current chips. And again, this arguing is dumb: buy the cheapest CPU, OC the hell out of it, and use the extra money to buy the best-performing GPU your system supports, whether it is AMD or Intel.
 
Gd gd


In the UK the FX-8150 price falls in between both.

Ebuyer.com
2600K - £240
FX-8150 - £210
2500K - £170

Anyway, if rendering all day is my job, I'm running socket AM3, I have limited money, and it's a choice between dropping in an FX-8150 or changing platforms specifically for a 2600K, most people would select the FX-8150 all day.

And to be fair, there are some situations where the FX-8150 does outperform the 2600K; transcoding comes to mind. Now, if you transcode all day, Intel doesn't have a commercial solution.




Huh??? Socket AM3 has more CPUs on its support list than any current Intel socket.



Granted, if coming from a Phenom II X6, yes, you'd have little option.

I agree with all you have said, and sorry, when I said AM3 I was talking about just moving from a Phenom-level chip. You are correct, there are many more choices on the AMD side; socket compatibility has long been one of the areas where AMD is better than Intel.

As for the transcoding part, I believe Bulldozer cannot touch any of Intel's 6-core chips, so Gulftown and SB-E. And yes, if you mention price they won't be comparable, but then again, if that's my job and a business expense, cost is not that big a factor.
 
I agree with all you have said, and sorry, when I said AM3 I was talking about just moving from a Phenom-level chip. You are correct, there are many more choices on the AMD side; socket compatibility has long been one of the areas where AMD is better than Intel.

As for the transcoding part, I believe Bulldozer cannot touch any of Intel's 6-core chips, so Gulftown and SB-E. And yes, if you mention price they won't be comparable, but then again, if that's my job and a business expense, cost is not that big a factor.

Depends. As a small business I would choose two towers with AMD processors vs one tower with a high-end Intel. Business write-off or not, you still have to have the up-front capital.

As for the GPU comparison: an HD 2900 XT smoked a 3870, yet which was the better seller?

As for GPU-limited benching, I'm just saying: with $280 for a CPU, another $150-200 on a mobo, and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. That would be GPU-limited; i7, i5, AMD, it doesn't matter, they will all push the same framerate.
 
Depends. As a small business I would choose two towers with AMD processors vs one tower with a high-end Intel. Business write-off or not, you still have to have the up-front capital.

As for the GPU comparison: an HD 2900 XT smoked a 3870, yet which was the better seller?

As for GPU-limited benching, I'm just saying: with $280 for a CPU, another $150-200 on a mobo, and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. That would be GPU-limited; i7, i5, AMD, it doesn't matter, they will all push the same framerate.

Yes it does depend.

If my business was rendering work for clients, time is money. So yes, the cost of the Intel rigs would be higher, but the investment will also pay for itself sooner.

I don't remember the difference being as large as you state for it to be considered "smoking" it.
It's been a while since I looked at benchmarks, but I do own an Asus 3870 512MB in my closet somewhere.

I'm someone that prefers high-end single GPUs to CrossFire/SLI due to the stuttering issues and drivers that come with that kind of setup.

But good point if going that route.
 
As for GPU-limited benching, I'm just saying: with $280 for a CPU, another $150-200 on a mobo, and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. That would be GPU-limited; i7, i5, AMD, it doesn't matter, they will all push the same framerate.

That argument has never made sense.

Yes, if you slap in more and more video cards, eventually you end up CPU-limited... but that's it.

People always use the 'logical' extension of 'so that means the CPU doesn't matter!' That is wrong.

Your CPU requirements matter just as much, or even more. And who's to say you can't lower one or two CPU-bound settings to get higher GPU usage/FPS? Every percent matters when you want to get the most out of your video cards.
 
Yes it does depend.

If my business was rendering work for clients, time is money. So yes, the cost of the Intel rigs would be higher, but the investment will also pay for itself sooner.

Rendering performance isn't 2x on a 6-core SB-E, but it is double the cost. You can shave some money off both with lesser boards, but most companies will pick an Intel mobo, so to be fair I picked a midrange AMD 990FX. With the entire PC price factored in, it's still only a $400-ish difference. So for every 2 SB-E rigs you could pretty much build 3 BD rigs... I would get better turnaround running 3 "slower" rigs than 2 "faster" ones.

SB-E

i7 3930K - $599
Intel X79 - $279

Total: $878


BD

FX-8150 - $269
Gigabyte 990FX - $154

Total: $423


Rest of rendering rig

Geil 4x8GB - $279
PowerColor 6770 - $106
Seagate 1TB - $104
Cooler Master case + PSU - $160

Total: $649
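
Adding that up (a quick sketch using the prices listed above; the relative rendering speed is an assumption for illustration, not a benchmark result):

Code:
# Shared parts (RAM, GPU, HDD, case+PSU) are the same for both rigs.
shared = 279 + 106 + 104 + 160        # = $649

sbe_rig = 599 + 279 + shared          # i7 3930K + X79 board  = $1527
bd_rig = 269 + 154 + shared           # FX-8150 + 990FX board = $1072

print(sbe_rig - bd_rig)               # $455: the "$400-ish" difference
print(2 * sbe_rig, "vs", 3 * bd_rig)  # $3054 vs $3216: ~3 BD rigs per 2 SB-E

# Assumed relative rendering throughput (SB-E faster, but not 2x):
sbe_speed, bd_speed = 1.4, 1.0
print(2 * sbe_speed, "vs", 3 * bd_speed)  # 2.8 vs 3.0: the 3 BD rigs win
# Note: if SB-E were more than 1.5x faster, 2 SB-E rigs would win instead.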

I don't remember the difference being as large as you state for it to be considered "smoking" it.
It's been a while since I looked at benchmarks, but I do own an Asus 3870 512MB in my closet somewhere.

It was only a couple of percent in all honesty; the stock OC'd cards were better performers overall.
I'm someone that prefers high-end single GPUs to CrossFire/SLI due to the stuttering issues and drivers that come with that kind of setup.

But good point if going that route.

I like whatever gives the best performance. The last setup I had was dual 4870X2s, dual 4850X2s before that, dual 3870X2s before that, and quad 3850s before that, with Nvidia stuff thrown about in the mix.

That argument has never made sense.

Yes, if you slap in more and more video cards, eventually you end up CPU-limited... but that's it.

People always use the 'logical' extension of 'so that means the CPU doesn't matter!' That is wrong.

Your CPU requirements matter just as much, or even more. And who's to say you can't lower one or two CPU-bound settings to get higher GPU usage/FPS? Every percent matters when you want to get the most out of your video cards.


I am saying, with currently available GPUs at high resolution, what difference does the CPU make? <2 FPS? When new GPUs drop, new CPUs will be out, and as you can see with Intel's 3-year-old i7 920, again not a huge performance gain. As it sits, at a resolution I would play at, there is next to no difference between a 3-year-old i7, a 3-year-old Phenom X4, or a brand-spanking-new SB. Current/last-generation high-end CPUs have more than enough performance for all current games; GPUs are still the limiting factor. Anything below 1080p is pretty much a useless resolution on a $2000 PC.
 
Review is fail... Average frame rates only tell half the tale when it comes to CPU performance.

TPU should have included the minimum frame rates, as that would have shown a much wider gap between the 3 architectures used in the testing.
 
I am still surprised how well the Nehalem processors are holding up, especially given how old they are. I upgraded from my 920 to a 2600K at the start of last year, only because it seemed more fun and it ran much cooler.
 
Review is fail... Average frame rates only tell half the tale when it comes to CPU performance.

TPU should have included the minimum frame rates, as that would have shown a much wider gap between the 3 architectures used in the testing.

Yeah, cause things like updates, slower hard drives, network congestion, and many other factors are always the same.

Wait, they aren't.

Minimum frame rates on a system are not indicative of overall performance. They are useful, but not as much as common sense. Common sense tells me that if it is less than 40 FPS, it's time to drop the settings, drop the resolution, or upgrade.
 
Yeah, cause things like updates, slower hard drives, network congestion, and many other factors are always the same.

Wait, they aren't.

Minimum frame rates on a system are not indicative of overall performance. They are useful, but not as much as common sense. Common sense tells me that if it is less than 40 FPS, it's time to drop the settings, drop the resolution, or upgrade.

Very, very rarely does anything you listed above cause low minimums.

And common sense should tell you that :rolleyes:
 
Very, very rarely does anything you listed above cause low minimums.

And common sense should tell you that :rolleyes:

It takes just one stutter from a hard drive, SATA controller, or background Windows task to cause a min-FPS dip. Reviews that use minimums always have to manually look for such dips and remove them from their results, so you're still getting an average anyway; say, the bottom 10% averaged, instead of the single lowest frame.
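
A minimal sketch of what that averaging looks like, assuming a per-frame time log like the one FRAPS writes out (the frame times below are made up, with one artificial stutter):

Code:
# Convert a frame-time log (ms per frame) into average FPS and a
# "bottom 10% averaged" minimum that smooths out one-off stutters.
frame_times_ms = [16.7] * 95 + [17.2, 16.9, 33.3, 40.0, 250.0]  # one big stutter

fps = sorted(1000.0 / t for t in frame_times_ms)  # per-frame FPS, worst first
avg_fps = sum(fps) / len(fps)

single_frame_min = fps[0]                    # one stutter: 4 FPS "minimum"
bottom = fps[:max(1, len(fps) // 10)]        # worst 10% of frames
bottom_10pct_avg = sum(bottom) / len(bottom)

print(f"avg {avg_fps:.1f} FPS")                         # ~58.6
print(f"single-frame min {single_frame_min:.1f} FPS")   # 4.0
print(f"bottom-10% average {bottom_10pct_avg:.1f} FPS") # ~47.6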
 
It takes just one stutter from a hard drive, SATA controller, or background Windows task to cause a min-FPS dip. Reviews that use minimums always have to manually look for such dips and remove them from their results, so you're still getting an average anyway; say, the bottom 10% averaged, instead of the single lowest frame.

If an HDD stutter causes a frame dip, then it's either a bad engine or you need more memory.

All game data should be kept in local memory, so an HDD shouldn't affect it at all.
 
If an HDD stutter causes a frame dip, then it's either a bad engine or you need more memory.

All game data should be kept in local memory, so an HDD shouldn't affect it at all.

Sorry, but you've clearly never dealt with this stuff before. Short of running everything from a RAM drive, you're never going to prevent these kinds of problems.
 

Sorry, but you've clearly never dealt with this stuff before. Short of running everything from a RAM drive, you're never going to prevent these kinds of problems.

And even then, memory controllers are not perfect. There will still be dips running a RAM drive.
 

Sorry, but you've clearly never dealt with this stuff before. Short of running everything from a RAM drive, you're never going to prevent these kinds of problems.

My drives are inactive when I game... and they have never gone crazy and lowered my FPS.
 
My drives are inactive when I game... and they have never gone crazy and lowered my FPS.

You can't see the dips with the naked eye; they show up in FRAPS logs and such, hence ruining the minimum-FPS benchmarks. If your drives are inactive, you must be playing small console ports. Every game has a load screen, and every OS has background tasks. This isn't worth arguing over; what I've stated is fact: minimum FPS is tricky to measure.
 
You can't see the dips with the naked eye; they show up in FRAPS logs and such, hence ruining the minimum-FPS benchmarks. If your drives are inactive, you must be playing small console ports. Every game has a load screen, and every OS has background tasks. This isn't worth arguing over; what I've stated is fact: minimum FPS is tricky to measure.

It's not tricky at all. And FRAPS is not 100% accurate and causes issues itself.

If you get to the point where HDD access is affecting your frame rate in a big enough way for you to notice it without using frame rate counters (again, not 100% accurate and with their own issues), then there is something wrong with your rig.
 
It's not tricky at all. And FRAPS is not 100% accurate and causes issues itself.

If you get to the point where HDD access is affecting your frame rate in a big enough way for you to notice it without using frame rate counters (again, not 100% accurate and with their own issues), then there is something wrong with your rig.

This has nothing to do with the original discussion of why minimum FPS wasn't used in the reviews.
 
And FRAPS is not 100% accurate and causes issues itself

Doesn't that mean "tricky"? Actually, your statement confirms that.

How do you propose to reliably measure minimum FPS? How do you ensure decent accuracy? What resolution and accuracy for the measurement do you consider acceptable?

How is minimum FPS defined? (Just one frame? Over one second?)
When does a frame start and end anyway? What about the time between frames?

Everybody who shows minimum FPS in their reviews uses FRAPS.
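
To show how much those definitions matter, here's a small sketch comparing two of them on the same made-up frame-time log: the worst single frame versus the fewest frames started in any one-second window:

Code:
# Two definitions of "minimum FPS" over one frame-time log (ms per frame):
# (a) reciprocal of the single longest frame
# (b) fewest frames started in any one-second window
frame_times_ms = [16.7] * 60 + [120.0] + [16.7] * 60  # one long frame mid-run

# (a) single-frame minimum: one 120 ms frame reads as ~8.3 "FPS"
min_fps_single = 1000.0 / max(frame_times_ms)

# (b) worst one-second window: compute each frame's start time, then count
# frames starting inside every window that fits fully inside the log
starts, t = [], 0.0
for ft in frame_times_ms:
    starts.append(t)
    t += ft
total_ms = t
min_fps_window = min(
    sum(1 for s in starts if w <= s < w + 1000.0)
    for w in starts if w + 1000.0 <= total_ms
)

print(f"single-frame min: {min_fps_single:.1f} FPS")  # ~8.3
print(f"worst 1s window:  {min_fps_window} FPS")      # ~54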
 
Doesn't that mean "tricky"? Actually, your statement confirms that.

How do you propose to reliably measure minimum FPS? How do you ensure decent accuracy? What resolution and accuracy for the measurement do you consider acceptable?

How is minimum FPS defined? (Just one frame? Over one second?)
When does a frame start and end anyway? What about the time between frames?

Haha, I read that post as "go play in traffic, antuk15". You know what, W1zzard probably has no idea what he is talking about; I mean, he only runs a forum with 65 thousand members and 2.4 million posts, not to mention some of the most in-depth reviews on the internet. :shadedshu
 