Discussion in 'Reviews' started by W1zzard, Apr 2, 2014.
Only one slot on mITX boards... and this is the only way to get Crossfire 290 performance in that form factor. So yeah, just one. The VGA will cost more than the rest of the rig combined.
From 2 to 4 fps
This was recently posted in the wrong subforum and someone got upset, so I was told to repost here to get an answer.
This is a GTX 690 question regarding this review, NOT a question about the 295X2 or the 290X.
Review - 04/08/14 - AMD Radeon R9 295X2 8 GB
*** shows the 690 with driver 335.23 in BF4 at 5760x1080 4xAA at 36.3 fps
Review - 03/04/14 - PowerColor R9 290X PCS+ 4 GB
*** shows the 690 with driver 331.82 in BF4 at 5760x1080 4xAA at 26.1 fps
What caused this 39% boost? Is 36.3 fps a typo for 26.3 fps?
What happened here? What is the correct value?
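For reference, here is how the 39% figure falls out of the two published numbers:

```python
# Comparing the two published BF4 results for the GTX 690
old_fps = 26.1  # driver 331.82, 5760x1080 4xAA (03/04/14 review)
new_fps = 36.3  # driver 335.23, same settings (04/08/14 review)

boost_pct = (new_fps - old_fps) / old_fps * 100
print(f"{boost_pct:.1f}%")  # -> 39.1%
```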
Notice the differences in the test setup for both reviews: that's the reason.
I had a couple of thoughts that I was going to pose to you yesterday.
What CPU are you using? The reason I ask is that BF4 is well known to be CPU intensive as well. W1zzard used an i7-4770K @ 4.2 GHz. Perhaps, in addition to the improved Nvidia drivers between tests, there were also some further optimizations from DICE on the CPU side.
Another thing: did you record your FPS in the exact same part of the game as W1zzard did in the 3/4 and 4/8 benchmarks? DICE could have made some tweaks in that area of the game that factored into these tests.
It's not possible to go back and rerun the benchmarks from 3/4, even by rolling back the drivers, because we can't undo any optimizations that DICE has made since then. And even if you could get W1zzard to rerun both benchmarks, what difference would it make? I sense that you are frustrated with the performance of BF4, which is understandable because so many players are, judging by the posts all over the internet. But getting W1zzard to say he made a typo, if he did, isn't going to alleviate your frustration. Right?
I have already jumped ship. Triple 290X (the lesser evil) is en route and one of my three 680s is already sold. Triple 780 Ti was my first choice but was ruled out because it is a stutter fest in BF4.
I can't say the same for other BF4 players who are less fortunate and don't have the financial backing to jump ship too. I share their pain of not being able to enjoy BF4 as it was intended.
SLI is simply broken in BF4. I clearly did not see that 39% boost, or any boost at all, in any part of BF4.
39% is not exactly a testing margin of error; that is huge.
Hence I'm here asking whether W1zzard made a typo.
CPU is an i7-3770K OC'd to 4.7 GHz.
Many patches and many driver updates have happened. They are both correct for the time in which they were published.
Patches have been updated. No doubt that you have noticed the improvement yourself as FX-GMC has pointed out.
Well, post back here if you want to. There are a lot of BF4 players here that could possibly benefit from your experience. Maybe not 3X R9 290 crossfire. That's exotic...but interesting.
Why the shitty frames in Diablo 3?
I was looking to upgrade my video card, and I have been waiting for the dual AMD card for what feels like 2 years now, only to find it falls way short in the game I play the most.
Crossfire 290X seems to scale OK. Is this supposed to be fixed, or what is really wrong?
How is Diablo 3 actually tested? Can anyone clarify?
The reviewer kinda screwed up somewhere in measuring the power consumption of these cards. The lack of consistency in the games used to measure performance and power consumption caused some of these cards' efficiency bars to go all over the place.
The patches might not be part of the problem; notice how the GTX 760, 780 and Titan greatly improved in efficiency, but the 770 and 780 Ti stayed the same.
And on the AMD side, the cards have either improved slightly or dropped severely in efficiency.
What also could have changed the paradigm:
Thief was part of the benchmark suite in one graph, and wasn't used in the other
Diablo III:RoS was part of the benchmark suite in one graph, and wasn't used in the other
Call of Juarez: Gunslinger was part of the benchmark suite in one graph, and wasn't used in the other
16 games were benchmarked in one graph, 15 in the other.
Driver improvements - Case in point, BF4 in both of the reviews you're pinpointing. Not huge framerate increases for most cards (GTX 690's SLI config excepted), but marked in percentage terms compared to those cards that don't benefit.
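Those efficiency bars boil down to average framerate divided by power draw, so swapping a game in or out of the suite shifts both the numerator and the denominator at once. A rough sketch with made-up numbers (not taken from either review):

```python
# Hypothetical illustration: how a suite change moves performance-per-watt.
def efficiency(avg_fps, power_watts):
    """Performance per watt: average framerate over measured power draw."""
    return avg_fps / power_watts

# Made-up numbers for one card under two different benchmark suites
suite_a = efficiency(60.0, 250.0)  # 16-game suite, old power-test game
suite_b = efficiency(55.0, 230.0)  # 15-game suite, new power-test game

change_pct = (suite_b / suite_a - 1) * 100
print(f"{change_pct:+.1f}% efficiency change from the suite swap alone")
```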
Thanks for the review! The video from the noise test is very cool!
Hi bros, I just have a question: is it possible to make a tri-Crossfire setup out of 295X2s? I heard somewhere that the new CrossFireX XDMA standard makes it possible to Crossfire more than 4 GPUs. Cheers, I just have that doubt.
I wouldn't even imagine the load of issues you'd be running into.
We switched from Crysis 2 to Metro: Last Light for typical gaming power consumption, and the list of games has changed as well.
Hi, I may have recently made a rookie mistake. I bought this GPU along with this PSU: http://www.overclockers.co.uk/showproduct.php?prodid=CA-025-SS. I had heard that Seasonic is one of the best makes and that this was a high-wattage, high-cost PSU that would definitely be compatible with this GPU. I've just read online that the PSU for this card needs to handle 28 amps on a single rail, and I'm unsure if this PSU can. Can anyone ease my mind and tell me whether that's the case, or whether I have purchased a PSU that won't work, like a moron? Thank you.
It will be fine.
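As a rough sanity check (the wattage below is an assumed figure, not taken from the linked product page; check the label on your PSU for its actual 12 V rating): the current a 12 V rail can deliver is simply its rated wattage divided by 12.

```python
# Hypothetical sanity check: 12 V rail capacity vs. the card's 28 A requirement.
def rail_amps(rail_watts, rail_volts=12.0):
    """Maximum current (A) a rail can deliver at its rated wattage."""
    return rail_watts / rail_volts

psu_12v_watts = 850.0  # assumed single-rail 12 V rating; read yours off the PSU label
print(f"{rail_amps(psu_12v_watts):.1f} A available")  # ~70.8 A, well above 28 A
```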
If I were to go all out with money on a GPU, this is the card I would get! Nvidia cards like the Titan, for example, are over $1000 and not even as good as a regular R9 290X. So the R9 295X2 for under $800 would be my pick. For under $800 this GPU puts Nvidia to shame with their overpricing, not to mention Nvidia has nothing yet that outperforms the 295X2. So the R9 295X2 is currently the fastest GPU in the world! I also love the water cooling and how it runs well under 70°C. That is impressive.
Nvidia may have slightly better efficiency, but AMD once again beats them in top performance. Don't get me wrong, I have owned many Nvidia cards and I like them, but AMD has more logical prices. And I don't care if I get a card that runs a few degrees warmer or uses a few more watts; that is meaningless anyway! You are only going to save about a dollar a year in electricity, so I don't understand why anyone would buy a GPU based on using less wattage. Besides, you are supposed to have a good-quality PSU anyway. Unless you are dumb enough to buy an Alienware PC, but then those never have cards like this one. They are a ridiculous setup, like an i7-4790K with an OEM GTX 645 LOL and a crap Hipro-built PSU that will destroy the system not long after it messes up the BIOS, causing it to need constant resets.