Discussion in 'Reviews' started by cadaveca, Oct 19, 2012.
Both really. Will it?
Boy, you can tell who 'dresses to the left' and who 'dresses to the right'. NOBODY FOOKN KNOWS, PEOPLE! (= both sides to the middle)
NT300 seems to, that's why I asked for proof.
Why are people spouting nonsense about make-believe CPUs in an FX-8350 CPU review? Stay on topic.
I say I don't care; I would still rather look at it than another 3-year-old socket.
Oh I know, but nothing says it won't. Hell, they could dominate everything ever created, but no one will know until they release.
I don't either, but I thought it would be funny just to see what he could come up with. HAHAHAHAH
I don't go to bookies or loan sharks. Just the sharks from Wall Street, your neighborhood B of A and JPMorgan Chase.
If I am going to get a broken arm for something, I'd rather get it fighting the bosses and their corrupt system. I got a heart attack from the consequences of that. No regrets. They may have some after what my co-workers and I put them through.
I won't say it again. I will take action if you cannot keep on topic.
By the way, I saw a retail price drop online. Tiger Direct has the FX-8350 for $204.50. You have to pay $2.99 for slow shipping, though. Still better than $219.99 at rip-off Newegg.
TechSpot gave a terrible hack review of the CPU. It used primarily synthetic benchmarks with a lot of single-threaded games and about 3 or 4 applications. Written by an Aussie named Steve. Not nearly as thorough or favorable a review as the one I saw on Tom's Hardware.
I told him so on the site. His only reply was "have a nice day." No effort to defend his review or justify his findings, which were not generally upheld by other testing. I found no intellectual process explained in his review. Tom's broke down everything and explained every nuance in their testing procedure and results.
Man, I posted before I saw your admonition. Sorry. Must have crossed paths at the same time. I was composing the off-topic reply when you were posting yours, so I did not see it.
^ oh get back to work Erocker
Great review Dave
If I still had AMD like I used to, I'd upgrade to Piledriver, but really nothing is enticing me about this CPU over the older 8150... sure, it's a bit faster, but... Bulldozer was what pissed me off so much that I sold my AMD rig and went Intel! People go on about overclocking and I just shake my head. AMD still can't touch Intel. I'm keeping an open mind here, but really: Piledriver at a stock 4.0-4.2 GHz vs. a stock 3.5 GHz 3770K, as an example... sure, the numbers are close, but clock the i7 up to the same speed as the AMD part and Intel HAMMERS Piledriver big time... It's not even a pissing match any more... It's like a Vette vs. a Chevette.
I must say I'm looking forward to upgrading to Z77 and a 3770K in the next month.
And sorry, AMD, you didn't win me back this time around, and I don't think you ever will.
I'd rather spend a bit more money and know what I have than spend less and wish I'd spent more to have more.
All in all, good job, AMD.
So what? My nephew uses my old hand-me-down 4-year-old QX9650/X48 Rampage on LGA 775. It certainly isn't embarrassed by any AMD CPU's performance.
The one that delivers on time.
On a personal note, I like new stuff. If Haswell delivers, I might just buy it... but then, I tend to upgrade yearly - probably why Intel makes so much money!
The same can be said for any full-featured Z77 and X79... or are you too myopic to see past a brand?
Because AMD is the only company to evolve their CPU design? Because Intel CPUs lack multi-threaded performance? Because the next 6-12 months are going to see exponential growth in software tailored to the Bulldozer architecture? Because AMD can stick to a timetable and their performance estimates?
While you're answering those, maybe you can provide links to support your supposition of "every company complaining about multithreading".
So what? How many people use every feature on a motherboard?
Bleat on about dual x8 boards (presumably for CFX/SLI) and rave about "mildly worse" in gaming. Sounds about right.
I'd think that more people might look at options such as choice in the mATX/ITX form factor, onboard WiFi and WiDi, SSD caching and the like.
True, but what's AMD's track record? I know which company I'd trust more to adhere to their timetable.
Rubbish. Dual-GPU cards and CFX/SLI are already pushing against the electrical bandwidth limits of PCI-E 2.0.
I see we've reached the limit of your knowledge.
PCI-E 2.0 @ x16 = 80 GT/sec raw * 80% (8b/10b encode) = 64 Gbit/sec (8 GB/sec)
PCI-E 3.0 @ x8 = 64 GT/sec raw * 98.46% (128b/130b encode) = 63.0 Gbit/sec (~7.88 GB/sec)
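If anyone wants to sanity-check that arithmetic, here's a quick Python sketch. The raw rates and encoding efficiencies are the standard PCI-E 2.0/3.0 spec figures; the function itself is just something I threw together for illustration:

```python
# Effective per-direction PCI-E bandwidth, from the spec figures:
# 2.0 runs 5 GT/s per lane with 8b/10b encoding (80% efficient),
# 3.0 runs 8 GT/s per lane with 128b/130b (~98.5% efficient).

def pcie_bandwidth_gbs(gen, lanes):
    """Effective one-way bandwidth in GB/s for a PCI-E link."""
    rate_gt, efficiency = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    raw_gbits = rate_gt * lanes           # raw line rate in Gbit/s
    return raw_gbits * efficiency / 8     # payload Gbit/s -> GB/s

print("2.0 x16: %.2f GB/s" % pcie_bandwidth_gbs(2, 16))  # ~8.00
print("3.0 x8 : %.2f GB/s" % pcie_bandwidth_gbs(3, 8))   # ~7.88
```

So 3.0 at x8 lands within about 1.5% of 2.0 at x16, which is the whole point.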
Total bandwidth is not the answer. So we've reached the limit of yours as well, eh?
Single-GPU is not dual-GPU or tri-GPU or quad-GPU. W1zz shows the difference with one card... it's worse with two, because not only do you have traffic for each card, but also traffic between the cards themselves.
Nice perf per clock increase!!
If scaling is linear (should it be? I don't know, I am just assuming here), I am not convinced that the 1% drop in performance between PCI-E 2.0 x8 and x16 is indicative of us hitting the wall. I am also not sure if dual-GPU cards require more than 2x the bandwidth of a single-GPU card, and on top of that I have absolutely no knowledge of how much the GPUs talk to each other outside the CF/SLI bridge. Enlighten me.
Nah, I've done testing, and then sold the cards. It's best you test yourself.
Scaling between cards is NOT linear with more than two for sure, and not quite linear with just two. You tell me why it isn't, and you'll answer the question you just asked.
I am no expert, so I'll leave them to explain the ins and outs of why... because I've been saying this for years, but nobody seems to agree... but then you ask those people if they've ever run such configs, and the answer is no.
I feel it's better you educate yourself than relying on me with this subject in particular.
Well, you could always, HumanSmoke, smoke its ass and move on to the next system each year. Nice job you must have, Smokey. Bit of a waste though, a new system each year just to slate AMD in threads.
My main rig's listed; it has 3 cards in it and x8 for the CrossFire setup. I tried x16 but can't keep that config with this card mix, and I lose only 2-4 fps out of a 60-80 average on x8. The x4 for the PhysX card doesn't hinder it either, nor does the immense CPU I have. I'ma get me one, Dave; I'll tell you how it clocks on mad water cooling. I might do a tap OC run too, perma-fresh cold water.
Precious little on the web, and I can never get my two friends with 670s together at any one time to test.
Either way, I feel like there is a need for people with the equipment and knowledge (*hint* Wiz *hint*) to do a write-up on CF/SLI scaling on top of PCIe 3.0/2.0. I wonder if different platforms (X79 etc.) react differently.
Can we get clock-for-clock comparisons of the FX-8150 vs. FX-8350, perhaps at 3.6 GHz and 4.0 GHz?
Different platform? Not really. I mean, there are the PLX PEX8747 or whatever chips used to provide similar bandwidth on each platform, and I did some testing of course, but the same basic things applied.
What I can say for sure is that the very best result when it comes to scaling was with three cards, and only on the Gigabyte X79-UD5, the only X79 board I found to offer three links direct to the CPU with no bridge chips of any kind, and it easily out-performed any other board by 5-8%.
I did mention this to W1zz, and we kinda agreed that those other chips in the link also add latency just by being there, which is something commonly mentioned anyway when it came to NVIDIA's NF200. So naturally, adding another device in the link, namely another GPU, is going to add latency too.
So, because this performance problem exists with multiple GPUs, and drivers seemingly cannot deal with it effectively, running more than two GPUs doesn't make much sense, even with Eyefinity... the added latency of the third card at that res is killer, IMHO.
The whole lack of perfect scaling when adding GPUs says it all, I think.
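To put rough numbers on what "not linear" looks like, here's a little Python sketch. The FPS figures are invented placeholders just to show the shape of the falloff, not results from my testing; drop in your own benchmark numbers:

```python
# Multi-GPU scaling efficiency vs. perfectly linear scaling.
# The FPS values below are hypothetical placeholders, not measured data.

def scaling_efficiency(fps_by_card_count):
    base = fps_by_card_count[0]
    for n, fps in enumerate(fps_by_card_count, start=1):
        ideal = base * n  # what perfectly linear scaling would deliver
        print("%d GPU(s): %6.1f fps, %3.0f%% of linear" % (n, fps, 100 * fps / ideal))

# Hypothetical run: each extra card buys less than the one before it.
scaling_efficiency([60.0, 108.0, 138.0])
```

With those made-up numbers, the second card gets you 90% of linear and the third only 77%, which is the kind of curve that makes a third card hard to justify.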
Why? I did it at 5.0 GHz. I do not have the time for such things unfortunately, as it doesn't really add much usable info to purchasing advice.
Chef - the one that sets the menu.
No worries on the upgrade score, pal. Turning over a rig a year means I can resell for good cash because everything is still covered by warranty.
I build systems in my spare time - mainly bespoke watercooled gaming rigs, w/c'ed and OC'd workstations for stock traders, and budget all-purpose builds for the deaf community, since the internet has become an important access point to keep them in touch with the community (I'm profoundly deaf myself, so I generally know what the requirements are). I also really love building systems - more than using them, I think. To me, upgrading and building is adult LEGO, and I refuse to limit my enjoyment simply because someone else's idea of empirical cost-effectiveness differs from my own.
As for slating AMD... only the BoD and the myopic Walter Mitty fanboy base that dwells in an alternate reality where AMD can do no wrong and benevolently rules the technological world.
That should do wonders for limiting galvanic corrosion.
And to whoever burbled on about AMD having native USB 3.0 support... you'll find that only FM1/FM2 have that. The 700, 800 and 900 series southbridges don't - they rely on third-party controllers.
Already done. Here's an example from HardOCP. Bear in mind that the PCI-E 3.0 system they are using runs at x16/x16 (CFX/SLI) or x16/x8/x8 (3-way CFX/SLI). There are examples on the net which show lesser gains, and also greater gains (mainly using dual-GPU cards at high res).
That was never the point of my answer. The question/accusation posed by cdawall was that AMD's full 2.0-spec x16 implementation was somehow superior to Intel's x8 3.0 (the fact that there are plenty of Intel x16/x16 3.0-spec boards around seems to have escaped him). Of course there are other factors involved - game and driver coding to limit stalls in the CPU, GPU and memory subsystems come to mind - but there's not a lot of point flying off on a tangent when you're supplying answers to a specific statement.
Honestly, who cares? It's out, it draws less power, and it runs faster than BD, so what does it matter anymore? These chips will replace BD totally and make room for SR, so let's end the nonsense.
(Beating a dead horse is only fun for maybe 2 minutes, then it's boring; some start sounding like broken records around here.)
Amen. It's funny, the people buying new GPUs and CPUs every year are probably the same ones slamming people buying iPads and MacBook Pros every year.