
TechPowerUp $800 Build Guide

Ehm, why? The R9 280 will create a GPU bottleneck, which is the last place you'd want one, plus it's an old card with lower specs than the latest rebrand. Why would you invest in a 3-year-old GPU?
An A10 will create a much slower machine overall, one that is insufficient for any kind of 3D game at 1080p and struggles on the CPU side with virtually anything. Unless you like 30 fps at medium settings.

The A10-7870K is just a fine processor, both CPU and graphics. It is basically a myth that a customer won't be happy using it.
As for the R9 280: don't forget that the R9 380 series offers basically the same performance (it's actually a rebrand) and is the most recent and modern hardware update.
Yes, it is quite an old GPU, but there is nothing more modern. Ask TSMC and GloFo about the inconveniences of their extremely slow transition to 14/16 nm.
 
The A10-7870K is just a fine processor, both CPU and graphics. It is basically a myth that a customer won't be happy using it.
As for the R9 280: don't forget that the R9 380 series offers basically the same performance (it's actually a rebrand) and is the most recent and modern hardware update.
Yes, it is quite an old GPU, but there is nothing more modern. Ask TSMC and GloFo about the inconveniences of their extremely slow transition to 14/16 nm.

http://www.anandtech.com/show/9307/the-kaveri-refresh-godavari-review-testing-amds-a10-7870k/5

720p ultra, 37 fps. Enjoy that A10. It is the equivalent of low-end discrete graphics; you are being delusional, brother. It does not belong in an 800-dollar build at all. The CPU side is weaker than its price equivalents on the Intel side too. There is a good reason these things hardly sell; the market is nonexistent or niche at best. An R9 280 and an i3 is just not a good balance either; you are stuck with a weak GPU that will be replaced faster than you'd like. It was great in 2013, but this is 2015.
 
The CPU side is weaker than its price equivalents on the Intel side too.

I have friends who buy the AMD APUs and don't complain like whining sissies, unlike you. :D

Basically, even if Intel's offerings at the same price level are faster, they are so marginally faster that you will hardly, if ever, notice a difference.

This $800 budget screams low-end PC; an i3 with an R9 290 is plain delusional too. ;)
 
I have friends who buy the AMD APUs and don't complain like whining sissies, unlike you. :D

Basically, even if Intel's offerings at the same price level are faster, they are so marginally faster that you will hardly, if ever, notice a difference.

This $800 budget screams low-end PC; an i3 with an R9 290 is plain delusional too. ;)

If you burn 800 dollars and end up low-end, we are done here ;)
 
I have about 500 USD. What should I upgrade now?
CPU: i5-2550K
Mainboard: ASRock Z68 Extreme4 Gen3
GPU: R9 280X
HDD: 2.5 TB (3 HDDs)
PSU: Cooler Master V1000 (1000 W)

Almost all of my time goes to playing Path of Exile and Dota 2. I also play TW 3 and GTA 5, but not as often.

Thanks in advance.
 
I have about 500 USD. What should I upgrade now?

4K monitor - take, for example, the Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS.

PSU: Cooler Master V1000 (1000 W)

You do not need 1000 W unless you plan a serious CrossFire configuration: two R9 Fury X, or two R9 290X.
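
To put rough numbers on that, here is a minimal back-of-the-envelope sketch; the wattage figures are approximate assumptions (typical board power and CPU load), not measured values.

```python
# Rough PSU sizing sketch. All draw figures are approximate assumptions
# for illustration only, not measured numbers.
CPU_W = 90          # assumed quad-core CPU under gaming load
GPU_290X_W = 290    # approximate board power of one R9 290X
REST_W = 75         # motherboard, RAM, drives, fans (assumed)

def recommended_psu(gpu_count):
    """Estimated system draw plus ~30% headroom."""
    load = CPU_W + gpu_count * GPU_290X_W + REST_W
    return load, round(load * 1.3)

for gpus in (1, 2):
    load, psu = recommended_psu(gpus)
    print(f"{gpus} x R9 290X: ~{load} W load -> ~{psu} W PSU suggested")

# A single card lands around a 550-600 W unit; only the CrossFire case
# gets anywhere near needing a 1000 W class PSU.
```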

If you burn 800 dollars and end up low-end, we are done here ;)

Sorry, but almost all the components, except perhaps the video card, are low-end. Don't tell me that the i3 is anything more, or that cheap LG monitor.
 
Tried to find some prices in Eastern Europe... yeah... processor €125 + VGA €330-360...
So even if it's a nice build, it doesn't work for Europe :(
The $800 build comes to almost €1,000.
 
4K monitor - take, for example, the Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS.

You do not need 1000 W unless you plan a serious CrossFire configuration: two R9 Fury X, or two R9 290X.

Sorry, but almost all the components, except perhaps the video card, are low-end. Don't tell me that the i3 is anything more, or that cheap LG monitor.

Yeah, the CM V1000 2nd is quite cheap in my country, about 85 USD with a 3-year warranty.


I will :D
 
I first thought that too, as I like the 3 GB... the problem is that on PCPartPicker they are drying up fast; the best price is $178 less a $20 rebate for a Gigabyte WINDFORCE... so going $20 less for an R9 285 seems a good trade.

Many 380 2 GB cards have little merit over the PowerColor R9 285 TurboDuo, so bumping up was $25 more, still chasing a $20 rebate that I wanted to use somewhere else. If going the R9 380 route you want 4 GB, and the best price on a 380 4 GB is $204 (still chasing a $20 rebate); +$45 is ludicrous... Then it's almost mandatory to ante up for a 290 at $243 (same old rebate). For 20% more cash, the minimum FPS gain you receive is 20%, most titles gain 30% (some around 40%), with roughly a 20% increase in power draw.

http://www.hardwareluxx.de/index.ph...3-drei-modelle-der-radeon-r9-380-im-test.html
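
For what it's worth, the step-up math can be sanity-checked with a quick sketch. The 285 baseline price is inferred from the "+$45" and "53% more" remarks, so treat the numbers as illustrative rather than quoted retail pricing.

```python
# Quick check of the price steps discussed above (USD, before rebates).
# The 285 baseline is inferred (204 - 45), so this is illustrative only.
prices = {
    "R9 285 TurboDuo": 159,
    "R9 380 4GB":      204,
    "R9 290":          243,
}

base = prices["R9 285 TurboDuo"]
for name, price in prices.items():
    print(f"{name}: ${price}, +{(price / base - 1) * 100:.0f}% over the 285")

# The 380 4GB -> 290 step works out to roughly 20% more cash,
# for a claimed 20-40% FPS gain.
step = (prices["R9 290"] / prices["R9 380 4GB"] - 1) * 100
print(f"380 4GB -> 290 step: +{step:.0f}% cost")
```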

That's the issue: there's a hard cut-off if you stay with the lower CPU. It's crazy spending 53% more money just to get to a 290, and if you're craving the maximum FPS the 290 can offer, it really mandates the i5. ~$700.

Looking at the lower (less costly) level, I ponder how W10/DX12, asynchronous shading, improved multi-core support (or at least getting away from the high CPU overhead of DX11), and low-level APIs will come into play in the future. Meanwhile, not spending big on old 28 nm parts gives you the money/justification to jump to the next 14/16 nm GPUs as soon as they arrive.

The downgrade to the 380 was presented as an option if one wanted to include a beefier PSU with an upgrade track in mind. It's not as good a deal as the TurboDuo, but it still stays within the budget without rebates. You couldn't really cut the cost anywhere else, and it should offer respectable performance at the target resolution for at least a couple of years. Then if one were to upgrade the GPU later, only one component (two if you want to bump the CPU as well) would have to be replaced instead of three. And you'd have a more reliable PSU to boot. Hypothetically.
 
No, they didn't. One license, one system. They sold 3-packs at one time, but those were like $250.

Yeah, I do remember the 3-packs. But I'm somehow managing to run one key on two systems at the moment anyway. Maybe it was different for Student copies? Or perhaps I'm just lucky and their servers haven't noticed.
 
Yeah, I do remember the 3-packs. But I'm somehow managing to run one key on two systems at the moment anyway. Maybe it was different for Student copies? Or perhaps I'm just lucky and their servers haven't noticed.

People overestimate Microsoft's activation system. It is easy to activate the same license on a few machines. They don't really catch on unless you start to abuse it and activate it on a bunch of machines.
 
The A10-7870K is just a fine processor, both CPU and graphics. It is basically a myth that a customer won't be happy using it.

Except all the benchmarks show that when you are planning on using a discrete GPU, even an i3 will outperform it in games.

As for the R9 280: don't forget that the R9 380 series offers basically the same performance (it's actually a rebrand) and is the most recent and modern hardware update. Yes, it is quite an old GPU, but there is nothing more modern. Ask TSMC and GloFo about the inconveniences of their extremely slow transition to 14/16 nm.

Uh, what? There are plenty of more modern cards. The R9 290 and up use GCN 1.1 as opposed to GCN 1.0, and the R9 285 uses GCN 1.2. With the new 3xx series and Fury lines we have more GCN 1.1 cards, and the R9 380 and all the Fury cards are GCN 1.2. The R9 280 is GCN 1.0, so it's two revisions removed from AMD's most modern architecture, plus it performs worse in almost all tests than the R9 380, so there's really no incentive to buy an older, slower card. At least when DX12 comes out, the R9 380 will probably see some improvements.
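
For reference, a small lookup of which GCN revision the cards mentioned here belong to; this is a sketch from memory of AMD's 2012-2015 lineup, so verify against official specs if it matters.

```python
# Rough map of the cards discussed in this thread to their GCN revision
# (from memory; verify against AMD's official specifications).
GCN_REVISION = {
    "R9 280 / 280X":              "GCN 1.0 (Tahiti)",
    "R9 290 / 290X / 390 / 390X": "GCN 1.1 (Hawaii/Grenada)",
    "R9 285 / 380":               "GCN 1.2 (Tonga/Antigua)",
    "R9 Fury / Fury X / Nano":    "GCN 1.2 (Fiji)",
}

for card, arch in GCN_REVISION.items():
    print(f"{card:30s} -> {arch}")
```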

Sorry, but almost all the components, except perhaps the video card, are low-end. Don't tell me that the i3 is anything more, or that cheap LG monitor.

Intel's low end still competes with AMD's high end in almost every gaming test. Yes, there are a few games where AMD is competitive, but for about 90% of them even an i3 can blow the doors off an FX-8350 or their newest APU. AMD won't be a real option until Zen launches.
 
Uh, what? There are plenty of more modern cards. The R9 290 and up use GCN 1.1 as opposed to GCN 1.0, and the R9 285 uses GCN 1.2. With the new 3xx series and Fury lines we have more GCN 1.1 cards, and the R9 380 and all the Fury cards are GCN 1.2. The R9 280 is GCN 1.0, so it's two revisions removed from AMD's most modern architecture, plus it performs worse in almost all tests than the R9 380, so there's really no incentive to buy an older, slower card. At least when DX12 comes out, the R9 380 will probably see some improvements.

What is the performance difference between the R9 280 and the R9 380? Somewhere within the margin of error? Maybe in some titles the R9 380 is even slower?
Or maybe you can quote the price tag of the R9 380, because it is a lame rebrand of the R9 285.
Or that the R9 280 comes with 3 GB of memory, while the R9 380 comes with 2 GB.

Or that it is again nonsense to mention Fury in this thread. I hope you won't recommend an i3 with a Fury.

Or that ALL GCN cards support DX12.

I think on all these points you are rather confused.

And I haven't even mentioned, for the nth time, that I don't care how marginally faster the i3 could be. You will never notice a difference.

It is just for the press's e-peen comparisons of who has the bigger one. :( You guys suck in this regard.
 
About 5-15% depending on the game. There were no benchmarks that I saw in Guru3D's review of the 2 GB R9 380 where the R9 380 was slower than the R9 280. Also, if you think the amount of VRAM will make a difference, you're mistaken; neither card is powerful enough to be memory-bound above 1440p, and there's no way a 280 or 380 would get playable framerates at 4K. The GTX 960 by comparison has 2 GB of VRAM and outperforms the 4 GB variant of the R9 380 as well as the 3 GB R9 280/280X.

And the kicker is, the cards are about the same price. You can get a 280 for $10 cheaper after mail-in rebate, but the upfront cost is nearly identical: $190-200. Also, the i3 isn't marginally faster, it's noticeably faster in most circumstances. An i3 setup versus your proposed A10-7870K setup is only about a $20-30 difference, and at least with the i3 setup you can move up to a better CPU later; the A10 is the top of that pyramid.

You can't really complain about e-peen comparisons and then choose a card with worse performance for the same price just because it has 3 GB of VRAM rather than 2 GB. Performance is key, and cost per performance is one of the best metrics out there. You can bury your head in the sand all you want, but the facts will always win. Enjoy your proposed setup with an APU and a 4K monitor; I'm sure that will be one pretty slideshow.
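
Since "cost per performance" keeps coming up, here is a minimal sketch of the metric itself; the prices and average-FPS figures are placeholder assumptions for illustration, not benchmark results.

```python
# Cost-per-frame as a comparison metric. Prices and average FPS are
# placeholder assumptions, not measured results.
cards = [
    # (name, street price in USD, assumed average 1080p FPS)
    ("R9 280 3GB", 190, 48),
    ("R9 380 2GB", 200, 52),
]

for name, price, fps in cards:
    print(f"{name}: ${price / fps:.2f} per average FPS (${price} / {fps} fps)")

# Lower dollars-per-frame wins; a small price gap is quickly erased
# if one card is even ~8-10% faster.
```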
 
What is the performance difference between the R9 280 and the R9 380? Somewhere within the margin of error? Maybe in some titles the R9 380 is even slower?
I agree with you: if I could get a 280 for what the lowest price was months back (~$140-145), that would be the bang-for-buck pick, although that's no longer the retail price since it's basically EOL. The other question is why consider (pay more for) a 380... this is where re-branding gives folks a boon. Why pay 16% more for a slightly higher clock when I'd OC the 285 and end up at roughly the same MHz as the 380 once overclocked? The 380 brings no "virtue" to what is a budget build.

I realize the 280 having 3 GB and a 384-bit bus feels like a benefit in itself, but at 1080p it's not a huge justification. The 280 vs. the 285 (stock vs. stock) are mostly the same. There's a school of thought that Tahiti can offer higher OC MHz numbers, but I consider the 285 (GCN 1.2) to offer good grunt when overclocked, even at lower MHz. All of that is somewhat speculative/opinion, so I can see the other side.

The thing is, there are times the 285 (even with less memory) finds favor in newer titles and seems to have more oomph when the settings are moved higher. BioShock is one title: at 1080p High the 280 is 7% faster, although at 107 FPS you'd move to Ultra, at which point the 280 drops to 62 FPS and now the 285 is 14% faster.

I see it this way: the 280 has been a great 1080p bang-for-buck choice; that said, if you get an R9 285 for less it is not inferior, and it normally stays ahead of the 280 by at least 5%.

http://www.hardwareluxx.de/index.ph...odelle-der-radeon-r9-380-im-test.html?start=9
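
Taking the review numbers above at face value, here is what those relative percentages work out to in absolute frame rates; this is purely the arithmetic on the quoted figures, not new benchmark data.

```python
# Working out the relative FPS claims above in absolute terms.
# Input figures are the ones quoted from the review, taken as given.
r280_high = 107                 # 280 at 1080p High, as quoted
r285_high = r280_high / 1.07    # 280 claimed 7% faster -> ~100 fps

r280_ultra = 62                 # 280 at 1080p Ultra, as quoted
r285_ultra = r280_ultra * 1.14  # 285 claimed 14% faster -> ~71 fps

print(f"High:  280 {r280_high} fps vs 285 ~{r285_high:.0f} fps")
print(f"Ultra: 280 {r280_ultra} fps vs 285 ~{r285_ultra:.0f} fps")
```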

I sure wish this build guide article had not just offered speculation/opinion, and that we could actually have had gaming benchmarks. Honestly, I think the i3 and a GTX 960 with an additional OC might have a good showing, as it's easy for a novice to OC the 960, without the CPU-overclocking apprehension of working the FX-6300. The upside of an i3/960 is that any "decent" but inexpensive ($30-35) 450-500 W PSU is sufficient, which offsets the added price of the GTX 960.

Honestly, for me the test would be: the i3/960, low-power and simple for a novice to OC (a console substitute); next, the brutish but very affordable FX-6300/285 system with a CPU/GPU OC (CPU 4.3 GHz / GPU 1050/1550 MHz), perhaps not to their extremes but still fairly straightforward to achieve. I would say the conclusions (either of those systems, or btarunr's i3/290) might not bear out cost vs. FPS, with none really offering any huge visual difference from using improved/higher settings.
 
What is the performance difference between the R9 280 and the R9 380? Somewhere within the margin of error? Maybe in some titles the R9 380 is even slower?
Or maybe you can quote the price tag of the R9 380, because it is a lame rebrand of the R9 285.
Or that the R9 280 comes with 3 GB of memory, while the R9 380 comes with 2 GB.

Or that it is again nonsense to mention Fury in this thread. I hope you won't recommend an i3 with a Fury.

Or that ALL GCN cards support DX12.

I think on all these points you are rather confused.

And I haven't even mentioned, for the nth time, that I don't care how marginally faster the i3 could be. You will never notice a difference.

It is just for the press's e-peen comparisons of who has the bigger one. :( You guys suck in this regard.

So you would prefer paying the same price for a system that is going to be weaker and more prone to a bottleneck, you dismiss an i3 as low-end and place AMD's top APU against it as a better alternative despite it being a weaker CPU, and then you come and tell us we suck?

Maybe grow a set of balls and own up to your misinformation, because it is not true that all GCN cards fully support DX12 in all feature levels either, and nobody cares that you don't care or don't want to see which components are the best picks at each price bracket. Last but not least, DX12 is irrelevant for now, especially with a mid-range system like this. To top it all off, you tout high amounts of VRAM on cards that are nowhere near having the GPU horsepower to make use of 3 GB; this is n00b buyer's error number 1, you know, the errors of the uninformed customer base.

Seriously, dude, you are making such a fool of yourself.
 
The case used is much too expensive; I would have checked whether the VGA fits in a DeepCool Smarter LED case. This mini tower comes equipped with two 120 mm fans and room for a third on the side panel, I think. The price is well below $30.
 
AMD's top APU against it as a better alternative despite it being a weaker CPU
I was reading this forum yesterday and found it interesting that one of the charts the author posted (not sure which site he found it on, but it appears German) uses an A10-7870K to run what appear to be AoS tests with enthusiast-level cards.
http://forums.guru3d.com/showthread.php?t=401950

Or that ALL GCN cards support DX12.
it is not true that all GCN cards fully support DX12

AnandTech on Win10 launch day... "AMD’s Catalyst 15.7 driver offers working DirectX 12 support for all GCN 1.0, 1.1, and 1.2 GPUs,"
http://www.anandtech.com/show/9472/windows-10-launch-day-gpu-support-summary

At this point neither side can claim full hardware support for DX12; each side supports specific tiers in hardware, while some features are supported as a subset through software emulation, like the SAD4 shader instruction (which Shader Model 5.0 can optionally support). GCN, for its part, doesn't provide FL 12_1 (conservative rasterization/ROVs).

It's folks trying to work from some "basic explanation", without specifying all the irregularities between native support and emulation, that causes all the mystification. I don't see DX12 as irrelevant; you should be trying to have a system that stays relevant for 18+ months.

I ponder how W10/DX12, asynchronous shading, improved multi-core support (or at least getting away from the high CPU overhead of DX11), and low-level APIs will come into play in the future. Meanwhile, not spending big on old 28 nm parts gives you the money/justification to jump to the next 14/16 nm GPUs as soon as they arrive.
 
Have you seen Haswell i3s keeping up with 8-core FXs in most games?

If you are resetting with a 970 and a locked i5 on a 450-watt PSU, it's a problem with the PSU, not the components' power consumption.
It's not - I had it replaced under warranty and the replacement unit behaves in the same way. And this particular PSU had rave reviews, including the one here on TPU (550 W model): https://www.techpowerup.com/reviews/Seasonic/G550_V2/11.html
 
Niiiice. EVGA PSUs are fantastic. Every PC in my house has EVGA or Silverstone.
 
I have my doubts whether a 500 W PSU is enough for the R9 290. I have a Seasonic G-450 which has 3 A less on the +12 V rail, and it would occasionally reset with a GTX 970 + stock i5-3350P due to a random spike pushing it over the edge. The R9 290 pulls ~70 W (or 6 A) more at peak.
If you are resetting with a 970 and a locked i5 on a 450-watt PSU, it's a problem with the PSU, not the components' power consumption. Btw, 70 watts is just under 6 amps at 12 V.
It's not - I had it replaced under warranty and the replacement unit behaves in the same way. And this particular PSU had rave reviews, including the one here on TPU (550 W model): https://www.techpowerup.com/reviews/Seasonic/G550_V2/11.html
I had to go back and see how this progressed over the last three pages; it piqued my inner engineer, so I thought I'd look into it a little more.

First, for an i5-3350P (3.1 GHz) and a 970 (I would like to know the GPU model/clock), running the eXtreme PSU calculator with a normal complement of RAM, HDD/SSD, DVD, fans, etc., such a system seems to draw 366 W, with a recommended minimum 416 W PSU. I see eXtreme has added (I don't recall seeing it before) "rail" recommendations, indicating 3.3 V / 8.5 A; 5 V / 9.5 A; 12 V / 24.2 A, but is that the minimum under load or the general rating for the PSU? The Seasonic G-450 is rated 3.3 V / 20 A; 5 V / 20 A; 12 V / 37 A, so one would think you're covered there, but under load is a different story. Using the old rule of thumb of taking the load watts (366 W) and adding 25% (457 W), from that point it still looks passable.
http://outervision.com/power-supply-calculator

A review from [H] of the Seasonic G-450 finds that, when loaded to 75% (338 W), the 12 V rail does succumb to ripple/noise that's just slightly elevated, and that's with a known-good 115 V input. If you're not seeing that at the wall, it could perhaps have an adverse effect. But then look at the amps on the rails from [H] at 338 W (75% load), 115 V: 3.3 V / 3 A; 5 V / 5 A; 12 V / 24 A. Is that what eXtreme is indicating (under load)? If so, there's a divergence.
http://www.hardocp.com/article/2014/04/03/seasonic_gseries_g450_power_supply_review/1#.Vd3pl53n8m4
http://www.hardwareinsights.com/wp/seasonic-g-450-review/
http://us.hardware.info/reviews/5312/seasonic-g-series-450w-psu-review-golden-price-war

I'm not one to think or say the Seasonic G-450 is a bad choice; quite the opposite, it's one I'd pick for a moderate build. Though a 69 W CPU and a GPU with a 148 W TDP would seem comfortable for such a PSU, Guru3D has consistently said that for custom OC 970s a 500 W PSU is the minimum. From the numbers it appears to be pushing that PSU at 366 W, or about 81%; seeing what the rails actually provide and what the ripple/noise looks like, I don't know. Like you said, the PSU may see a random spike from PC components, or lower power at the wall from something else kicking on, and the PSU just does what it's intended to do... shut down.
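
A minimal sketch of the arithmetic behind those figures; the inputs are the estimates quoted above, so the output is only as good as those assumptions.

```python
# Sanity-checking the numbers discussed above. Inputs are the quoted
# estimates, so treat the output as a rough check only.
load_w = 366    # eXtreme calculator system estimate
psu_w = 450     # Seasonic G-450 rating

print(f"Rule of thumb: {load_w} W x 1.25 = {int(load_w * 1.25)} W suggested PSU")
print(f"Load on the G-450: {load_w / psu_w * 100:.0f}% of rated capacity")

# The '70 W is just under 6 A at 12 V' remark:
print(f"70 W / 12 V = {70 / 12:.1f} A")

# Summing the [H] rail readings at their 75% load point (the remainder
# of the quoted 338 W presumably sits on the minor rails):
rails = {3.3: 3, 5: 5, 12: 24}    # volts: amps
total = sum(v * a for v, a in rails.items())
print(f"Main rail sum at 75% load: {total:.0f} W")
```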
 
I had to go back and see how this progressed over the last three pages; it piqued my inner engineer, so I thought I'd look into it a little more.

First, for an i5-3350P (3.1 GHz) and a 970 (I would like to know the GPU model/clock), running the eXtreme PSU calculator with a normal complement of RAM, HDD/SSD, DVD, fans, etc., such a system seems to draw 366 W, with a recommended minimum 416 W PSU. I see eXtreme has added (I don't recall seeing it before) "rail" recommendations, indicating 3.3 V / 8.5 A; 5 V / 9.5 A; 12 V / 24.2 A, but is that the minimum under load or the general rating for the PSU? The Seasonic G-450 is rated 3.3 V / 20 A; 5 V / 20 A; 12 V / 37 A, so one would think you're covered there, but under load is a different story. Using the old rule of thumb of taking the load watts (366 W) and adding 25% (457 W), from that point it still looks passable.
http://outervision.com/power-supply-calculator

A review from [H] of the Seasonic G-450 finds that, when loaded to 75% (338 W), the 12 V rail does succumb to ripple/noise that's just slightly elevated, and that's with a known-good 115 V input. If you're not seeing that at the wall, it could perhaps have an adverse effect. But then look at the amps on the rails from [H] at 338 W (75% load), 115 V: 3.3 V / 3 A; 5 V / 5 A; 12 V / 24 A. Is that what eXtreme is indicating (under load)? If so, there's a divergence.
http://www.hardocp.com/article/2014/04/03/seasonic_gseries_g450_power_supply_review/1#.Vd3pl53n8m4
http://www.hardwareinsights.com/wp/seasonic-g-450-review/
http://us.hardware.info/reviews/5312/seasonic-g-series-450w-psu-review-golden-price-war

I'm not one to think or say the Seasonic G-450 is a bad choice; quite the opposite, it's one I'd pick for a moderate build. Though a 69 W CPU and a GPU with a 148 W TDP would seem comfortable for such a PSU, Guru3D has consistently said that for custom OC 970s a 500 W PSU is the minimum. From the numbers it appears to be pushing that PSU at 366 W, or about 81%; seeing what the rails actually provide and what the ripple/noise looks like, I don't know. Like you said, the PSU may see a random spike from PC components, or lower power at the wall from something else kicking on, and the PSU just does what it's intended to do... shut down.

Insightful analysis, and I agree on all counts. For the record, the 970 in question is an Asus Strix 970, which is indeed fairly power-hungry. I don't know the quality of the electrical network here (it's an old building), but it could definitely be a factor too.

In any case, I also used to be one of those guys who would run the numbers and cut it close when picking a PSU, but not anymore. There are factors that are hard to account for, and it's better to pay a little extra and have more headroom than end up with your PC resetting in the middle of a game.
 