Discussion in 'News' started by btarunr, Feb 10, 2013.
I'd rather get my money's worth too. +1
The Radeon HD 7790, which will probably get rebranded as the HD 8770 or 8750 once the full-fledged HD 8000 series hits, is proof of how much bullshit this rumor is...
Anyway, can't AMD do 192/320-bit buses? I know nVidia can...
Either way, a two-year schedule with a bigger performance boost from one generation to the next seems way more rational and market-friendly, and even more profitable (if done right).
I'd rather have a driver update that gets rid of CrossFire stuttering than any new hardware... the 7970 is still the best hardware out there by far considering its price; it's only the latency/CF issues that bring it down.
If AMD can ever fix the latency/stutter issues on 7970 CF, I won't be upgrading for a long time. If they can't do that, I'll be heading for GTX 780 SLI or whatever the next high end thing is from Nvidia.
I dunno why driver issues are such a bugbear with AMD. You'd think they'd invest some money into getting this critical piece of software right, wouldn't you?
I haven't experienced any significant graphics problems since going over to nvidia in 2009 and I'm now on my third graphics card with them (GTX 285, GTX 580, GTX 590).
However, I also see many people with AMD cards which work just fine and in CF too, to be fair.
I haven't had a single problem with any of my AMD cards ever. I often wonder if this driver issue is just made up to excuse the lower frame rates than nVidia. Though quite frankly, anyone that thinks they "need" 60+FPS is, in my opinion, a fool with more money than sense.
Then according to your definition, I'm one of those fools, lol.
You're dead wrong there, however. I have a 120Hz monitor, and gaming at a solid 120fps makes for a massive improvement, easily visible without even trying to look for it. It's even noticeable just moving the mouse and windows around the desktop.
And finally, my monitor can handle 144Hz and yes, you can see the difference when compared to 120Hz, although it's more subtle.
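For what it's worth, the raw arithmetic backs up that impression. A quick Python sketch of the frame interval at each refresh rate shows the jump from 60 to 120 Hz halves the per-frame time, while 120 to 144 Hz shaves off far less, which is why the second step feels more subtle:

```python
# Frame interval implied by each refresh rate. The absolute gap shrinks
# as the rate climbs: 60 -> 120 Hz saves over 8 ms per frame, while
# 120 -> 144 Hz saves under 1.4 ms.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
```
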
60fps constant is pretty much mandatory for a multiplayer FPS IMO. Other than that I can live with anything that has minimum framerate above 30.
AMD has stuttering issues with CrossFire, and I have no idea why they can't improve it. When I (and many others) can vastly improve the experience just by using a framerate limiter, why the heck can't AMD figure out what the problem is and fix it? Makes no sense to me at all...
I've more or less alternated between AMD and Nvidia over my 15 years of building PCs, but if Nvidia can release something that doesn't suck to me, my 7970 experiment may be a short-lived one.
If it's so easy to fix, why don't you do it? It's not as easy a problem to solve as one might think. nVidia has the same problem in multi-GPU setups, and it comes down to how long it takes the second card to render its frame and send its frame buffer to the other card. One card always has a leg up, because when one GPU is the master and drives the display, it takes time for the other GPU to send that data over to it. That transfer goes very quickly, well under a millisecond, maybe a couple hundred nanoseconds, but it's added time on top of the rendering load that slows down every second frame. So you end up alternating between one frame rendered a little faster and the next rendered a little slower, and that difference between frames is the stuttering you're seeing.
There was a review on another site that measured jitter and showed that running 3 cards in CrossFire had less jitter than 2. I would hypothesize that's because two of the three cards carry the latency introduced by the CrossFire bridge and one GPU being master. So two out of three frames render in about the same amount of time, with one frame getting rendered slightly faster, and the frame rate jitters less often than with two cards.
So you have a frame whose render time you can't know in advance, and the next frame might render more quickly or slowly than the last depending on what has changed in the scene, on top of the added latency of the CrossFire link and the master handling the frame buffer. So you tell me: how do you make that jitter less? I suspect both AMD and nVidia have programmers and engineers thinking about it, but it's a tough problem, so don't go blaming either company for their multi-GPU shortcomings, because it's not an easy problem to solve at all.
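The mechanism described above can be sketched as a toy model. This is purely illustrative, not measured data: it assumes a fixed, exaggerated transfer penalty whenever a non-master GPU has to ship its frame buffer to the master, and the numbers (`BASE_MS`, `PENALTY_MS`) are made up for the demonstration:

```python
# Toy model of alternate-frame-rendering (AFR) jitter. Assumption:
# GPU 0 is the master driving the display, and every other GPU pays a
# fixed transfer cost to hand its frame buffer over. Numbers are
# illustrative only.
BASE_MS = 10.0     # nominal render time per frame
PENALTY_MS = 1.5   # exaggerated frame-buffer transfer cost

def frame_times(num_gpus, frames=6):
    """Frame i is rendered by GPU i % num_gpus; GPU 0 is the master."""
    return [BASE_MS + (PENALTY_MS if i % num_gpus else 0.0)
            for i in range(frames)]

for n in (2, 3):
    times = frame_times(n)
    deltas = [abs(a - b) for a, b in zip(times, times[1:])]
    print(f"{n} GPUs: frame times {times}, frame-to-frame deltas {deltas}")
```

With two cards every transition alternates fast/slow, while with three cards two consecutive frames share the same penalty, so fewer transitions change pace, matching the reduced-jitter observation in the review mentioned above.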
When all it takes to vastly reduce stutter is a framerate limiter (which, as I understand it, just holds back the frames that would otherwise be displayed ultra-quickly and create jitter), why can't AMD just build that (or something like it) into their software implementation of CF?
That's what makes no sense to me. I'm not claiming to be a tech guru about this, but it isn't at all obvious to me why that would be so difficult to do.
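The limiter idea boils down to pacing frame presentation on a fixed cadence so fast frames can't bunch up. Here is a minimal sketch in Python, assuming a game loop we control ourselves; `limited_loop` and `render_frame` are hypothetical names, not anything from AMD's or nVidia's drivers:

```python
import time

def limited_loop(render_frame, cap_fps=60, frames=120):
    """Pace calls to render_frame so frames are presented at most
    cap_fps times per second: fast frames wait out the remainder of
    their slot, which evens out frame-to-frame delivery jitter."""
    interval = 1.0 / cap_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += interval
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)      # frame finished early: hold it back
        else:
            next_deadline = time.perf_counter()  # frame ran long: resync
```

A real driver-level implementation would have to do this between the GPU and the display rather than in the game loop, which may be part of why it is harder than it looks from the outside.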