Discussion in 'Reviews' started by W1zzard, Aug 28, 2014.
Well, at least WoW is playable at 4K on the card.
Not for any serious raiding with your own guild + spell effects + monsters + don't stand in fire.
So, in other words, stand in the corner, watch people walk by, and acknowledge that it's pretty, at least.
Err... "pretty" might be relative to the situation, since the graphics are not exactly the highest standard by today's measure, lol (but then again, that's not the point of WoW).
I read reviews on four sites; this is one of them. I haven't been around here for very long, but I like it here, and the reviews are part of the reason why. (I ended up here chasing down a creep who cheated me.)
Anyone can post nitpicks - picking the flyshit out of the pepper - but actually do some reviews yourself and you're gonna realize that they're all different and that variables exist every time.
I like being able to depend on a familiar review format. It makes it easier to find the info I want to see (what interests me and what doesn't).
Thanks for the time and effort you put into the reviews here.
A few dozen of those votes are mine, because it has been open for quite a while and you can vote every day.
How representative is this result and how many real unique votes are there?
So, if I understand you correctly, you will wait until 4K becomes mainstream and then you will begin paying more serious attention? Right?
lol, this review is like the total opposite of the one on The Tech Report.
Plenty of 4K in there
I believe you meant to say the R7 260X on page 1, correct?
I'm seeing that the Asus R9 285 Strix and the Gigabyte variant on Tom's are drawing approximately 176W (average).
These figures are somewhat lower than the 189W average reported here for the Sapphire variant, but still in line with a card that draws a lot of power. I was hoping to see this card in the 150W range.
Ugh, why does AMD continue to use a shim that is higher than the core? It forces you to use special heatsinks that are incompatible with every other GPU on the market. The only reason I can imagine they did it in this GPU is to maintain compatibility with the 79x0 and 280 heatsinks.
Can't speak for anyone else, but I voted once only... to keep within the spirit of the thing, y'know?
I guess if I had the Facebook mentality I'd be checking boxes like a demented bingo player - but I don't, so I don't.
I'd guess that W1zzard bases his bench suite on reliable (repeatable) bench scenarios, game popularity/relevance, and feedback from the site membership - admitting to spamming the poll doesn't seem like it advances your position.
Forum users can only vote once on the poll; unregistered visitors, once per IP per 24 hours. These polls are a fun thing, they have no significance for us, so I really don't care if there is cheating.
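(For the curious, a rule like that usually boils down to a tiny dedup check. A minimal sketch in Python - the names and in-memory storage are made up for illustration, not how the forum actually implements it:)

```python
# Hypothetical sketch of a "one vote per IP per 24 hours" check for
# unregistered visitors. Names and storage are illustrative only.
import time

WINDOW = 24 * 60 * 60          # 24 hours, in seconds
last_vote_by_ip = {}           # ip -> unix timestamp of last accepted vote

def try_vote(ip, now=None):
    """Return True if the vote is accepted, False if rate-limited."""
    now = time.time() if now is None else now
    last = last_vote_by_ip.get(ip)
    if last is not None and now - last < WINDOW:
        return False           # already voted within the window
    last_vote_by_ip[ip] = now
    return True
```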
I keep forgetting to change the poll to a new one...
Look around, you'll see variances all over the place. TR's power consumption...
Looks bad, doesn't it? If you're the kind of person who skips straight to the results, you couldn't really tell that the 285 card is overclocked - same as with the gaming benchmarks...
TR doesn't include normalized stock clocks, either as an indicator that the tested card is OC'ed or as the baseline performance W1zzard supplied (i.e., an idea of OC scaling in performance vs. power consumption) - which is the better methodology?
Well, I see the Gigabyte results, but I don't see the Asus results.
Another point I'd highlight is that not all cards are created equal.
I am not spamming or cheating the poll; I am using the right I was officially given to cast my vote as often as I'd like. Calm down.
I'm not talking about gaming at 4K, but about all the other uses where GPUs can and should be employed: television, movies, YouTube videos, etc.
Well, if that's all you care about, I'm pretty sure any card with DP 1.2 can run 4K just fine (60 Hz effective). Some NVS 510s we have running in the office support it, and videos really do not use much GPU power.
Yes, you're probably right. I was just assuming the ideal case where GPU and CPU load stays at ~1% during 4K playback. But if that is already possible, or even not that bad with the loads at a few dozen percent, then I agree with you. I didn't know that.
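(For anyone who wants to check this on their own box: a minimal sketch in Python, assuming the psutil package is installed and a 4K clip is playing in the background. The sample count and interval are my own choices, not anything measured in this thread.)

```python
# Sample overall CPU load while a 4K clip plays in your media player.
# If decode is offloaded to the GPU, CPU usage should stay low;
# a consistently high reading suggests software decoding instead.
import psutil

samples = []
for _ in range(10):                                  # ten one-second samples
    samples.append(psutil.cpu_percent(interval=1))   # blocks for 1 s each

print("avg CPU load:  %.1f%%" % (sum(samples) / len(samples)))
print("peak CPU load: %.1f%%" % max(samples))
```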
All the GTX 770s linked here have had their promotion periods expire... all over $300, at least in Canada.
I think the amount of flak this review gets is due to the rating system's reliance on numbers (and decimals, for Chrissake). For example, what makes it an 8.0 and not an 8.2 or 8.4? Or not a 7.9 or 7.8? Or a 7.95 rounded up to 8 because the reviewer was feeling good that day? (Crossing from 8 into 7 territory is like crossing the Rubicon or the Styx - I can see W1zzard thought real carefully about that!) I'm sure this has been discussed to death already, but take it for what you will and make an informed choice using this review and others, is all.
I tried, and it crashes. Is it incompatible with the 64-bit version of MPC?
Interestingly enough, hardware using the ATI-provided video encoding software from a few years ago offers another GPU-assisted rendering method with MPC: 0-5% CPU use and 0-60% GPU use, depending on file scaling and scene complexity. Attempting to play the same clip with CPU-bound enhancements resulted in a single core maxed out and jumpy playback.
Most people don't use it: they don't understand that it needs to be turned on and needs some filters, their built-in Intel graphics are crap, Netflix's Silverlight plugin doesn't allow true hardware acceleration, YouTube's smorgasbord of file types and streaming does more to hurt quality than to help it, and other limitations make it almost impossible to test.
I have an ATI 750HD tuner card that encrypted digital cable makes virtually worthless. OTA programming benefits from the hardware, as does plugging in an old VHS or DVD player, but why do that when the machine already has a DVD player in it, and backup copies can be had through the pipes that be.
I own and use a 1080p camcorder, have 1080p on my camera, and my new cell does 4K that looks great at 1080p but, I'm sure, meh at actual 4K. You're trying to start a movement around something we aren't sure when or how will be adopted: no delivery service, no content, no reason as of yet. Few cards can push that many pixels without issues while gaming; google "4K tiled screen issues" and read up.
Yes, but do the apps running on your 4K phone actually support 4K, or are they just stretched like most Android apps?
Power consumption results confirmed.
Don't double post.
It records 4K video, but the screen is 1080 resolution. 4K screen resolution on a phone is stupid; there's no need for that pixel density.
Is it really considered a double post if it was posted 8 hours later and has nothing at all to do with the other post? Hmm...
I have a question about the reviewed GPU.
On page 4, A Closer Look, there is this: "A BIOS switch is also available. It lets you switch between a legacy and UEFI BIOS and acts as a safeguard should something go wrong during a BIOS flash."
Is it a two-position switch? Does it make the card operate at different speeds or voltages? In the past, I have seen these serve as a stock-speeds-and-voltages BIOS in one position and accelerated settings in the other.