
HD 7950 May Give Higher Framerates, but GTX 660 Ti Still Smoother: Report

Discussion in 'News' started by btarunr, Dec 13, 2012.

  1. Melvis

    Melvis

    Joined:
    Mar 18, 2008
    Messages:
    3,528 (1.59/day)
    Thanks Received:
    508
    Location:
    Australia
    Good point, but I'm not too sure it would matter, as this so-called issue has supposedly been around a while. To cut it down, maybe go with the latest drivers from this year and cover all the main Windows OSes people game on (XP/Vista/7/8), since the stuttering they claim happens regardless of OS?
    Last edited: Dec 14, 2012
  2. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.82/day)
    Thanks Received:
    3,776
    It's not a "so called" issue. It's true in some setups. My 4870x2 stuttered. When I went to replace it, I initially went with a 5870, much less stuttering (was still there in some games though), but not enough performance. Basically a cross-grade for me.

    So I tried 2 x 5850. Nice performance boost, but stuttering was back in full effect.

    So I decided to try nVidia. My 580 gets roughly the same frame rates as the 5850 Crossfire combo in the things I tested, but is noticeably smoother. Stuttering is a rare occurrence on my 580. It does still happen on occasion, though much less frequently.

    Of course, ymmv. I'm sure there's more to it than just AMD and their drivers. But on my setup, I get a better experience with nVidia.
  3. Melvis

    Melvis

    Joined:
    Mar 18, 2008
    Messages:
    3,528 (1.59/day)
    Thanks Received:
    508
    Location:
    Australia
    See, this is what I mean: I ran a single 4870X2, then went to 2x 4870X2, and never saw this stuttering issue. I think it depends more on the software and hardware installed on each individual machine than it being all down to just what they're claiming.

    And let's face it, if it were THAT BAD then no one would be buying AMD cards, period, but we both know that isn't true?
  4. mediasorcerer New Member

    Joined:
    Sep 15, 2011
    Messages:
    979 (1.03/day)
    Thanks Received:
    225
    Location:
    coast, melbourne
    I saw this over at Ars yesterday. I'm very happy with mine; as if the human eye can see microstuttering anyway lol. Cinema is 24 frames per second and nobody's complained about that for the last 100 years, have they?

    We are talking milliseconds; can anyone out there honestly tell me they can see in milliseconds lol?


    Give me the bus width and extra vid RAM any day; much more future-proof.


    They are a great card for the money, plain and simple
    Melvis says thanks.
  5. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.82/day)
    Thanks Received:
    3,776
    I think it's more a combination of setup, software, and also an individual's natural ability to see it or not. Though on my setup, AMD was much more guilty of it. Hard to say why that is for sure, but notice I said my setup. Again, ymmv.

    Movies and games are recorded and rendered differently. Movies have blurring which affects the perceived smoothness. The blurring is caused by the camera at capture time. Games generate the images, not capture them, and therefore are not blurred, but perfect frame by frame. The human eye DOES perceive the difference. Blurring fools the human eye and brain into seeing smooth movement.

    The few games that do have some sort of blurring option, generally run much smoother at much lower framerates. Just look at Crysis 1 as an example. With blurring, it rendered smoothly on most setups all the way down in the 30's, whereas other games require a much higher framerate to achieve the same level of smoothness.

    Besides, you do not have to be able to see each individual frame to recognize when something isn't looking smooth. Most people I let see these issues first hand can't put a finger on what's wrong, but they see microstuttering as something that's just a little off, and doesn't feel quite right.


    EDIT: Found what I was looking for to prove my point. Even at 60fps, some settings show a noticeable difference. It's even more pronounced if you view on a high quality CRT.
    http://frames-per-second.appspot.com/
    Last edited: Dec 15, 2012
    aayman_farzand says thanks.
  6. rvalencia

    Joined:
    Nov 3, 2011
    Messages:
    79 (0.09/day)
    Thanks Received:
    8
    http://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/6

    [image: frame latency graph]

    http://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/7
    [image: frame latency graph]

    Might as well return to the XFX 7950 Black with 900 MHz and no turbo boost.

    For an AIB overclock vs AIB overclock comparison, Tech Report should have used the Sapphire 100352VXSR, i.e. a 7950 @ 950 MHz with no turbo boost.
    Last edited: Dec 15, 2012
    the54thvoid says thanks.
  7. mediasorcerer New Member

    Joined:
    Sep 15, 2011
    Messages:
    979 (1.03/day)
    Thanks Received:
    225
    Location:
    coast ,melbourne

    If you say so, you may well be right. I don't get that with my card, and it's just a stock 7950 too; I'm not using the boost BIOS though. Thanks for the reply and info.
  8. jihadjoe

    jihadjoe

    Joined:
    Oct 26, 2011
    Messages:
    381 (0.42/day)
    Thanks Received:
    97
    There's actually an interesting paper from the University of Utah about that:
    http://webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/

    And xbitlabs also had a look at how display technology affects perceived response times:
    http://www.xbitlabs.com/articles/monitors/display/lcd-parameters_3.html

    Anyway, they say something like: the eye (thanks to the brain) is actually able to perceive changes down to 5 ms. That's 200 frames per second.

    Cinema is smooth at 24fps because those frames are delivered consistently.

    i.e., if you plot time vs frames, then
    at 0ms you get frame 1,
    at 41.6ms you get frame 2,
    at 83.3ms you get frame 3 and so on.

    The frames always arrive right on time, and your brain combines them into an illusion of fluid motion. Of course it kinda helps that every frame in a movie is already done and rendered, so you don't have to worry about any render delays.

    On a computer, the case might be like:

    at 0 ms you get frame 1
    at 16.7ms you get frame 2
    at 40ms you get frame 3 (now this frame should have arrived at 33.3ms)
    at 50ms you get frame 4

    Frame 3 was delayed by about 6.7 ms. Going by a consistent 60 fps it should have arrived at 33.3 ms, but processing delays meant it rolled off the GPU late. Your in-game fps counter or benchmark tool won't notice it at all because it still arrived before 50 ms (when frame 4 was due), but your eye, sensitive to differences down to 5 ms, notices this as a slight stutter.
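    The frame-pacing idea above can be sketched in a few lines: compare each frame's actual arrival time against an ideal evenly spaced schedule, and flag any frame that drifts past the ~5 ms threshold the eye can notice. The timestamps here are the hypothetical ones from the post, not real benchmark data.

    ```python
    # Sketch: average fps can look fine while individual frames arrive late.
    def frame_deviations(timestamps_ms, target_fps=60):
        """Return each frame's deviation (ms) from its ideal arrival time."""
        interval = 1000.0 / target_fps  # ideal frame time: ~16.7 ms at 60 fps
        t0 = timestamps_ms[0]
        return [t - (t0 + i * interval) for i, t in enumerate(timestamps_ms)]

    # The example from the post: frame 3 rolls off the GPU late.
    arrivals = [0.0, 16.7, 40.0, 50.0]
    devs = frame_deviations(arrivals)
    # Flag frames later than the ~5 ms sensitivity threshold mentioned above.
    late = [i for i, d in enumerate(devs) if d > 5.0]
    ```

    Running this flags only frame 3 (index 2), which is ~6.7 ms behind schedule even though the four frames still average well above 60 fps.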
    Last edited: Dec 15, 2012
  9. Pehla

    Pehla

    Joined:
    Mar 29, 2012
    Messages:
    283 (0.38/day)
    Thanks Received:
    66
    Location:
    brčko dc/bosnia and herzegovina
    I agree...
    I think people who bought Nvidia feel they must say negative things about AMD because, well, let's face it, they can't do anything else: they have Nvidia!! And since I'm not a fan of either, I can say the same about AMD fans! I just go price/performance, and right now that's AMD!!!
    I would even go with an AMD CPU setup just because they're cheaper, but that doesn't give PCIe Gen3 support, so I gave up on it!!
    :nutkick:
  10. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.82/day)
    Thanks Received:
    3,776
    Oh, that doesn't mean it's going to affect everyone. That's not the argument I'm trying to make. I'm just saying that it is real, and does affect some.

    If you have a great experience with your card, by all means, keep using it. I'm not here to tell you otherwise. After all, what works best for one, doesn't always work best for another. I'm not here to tell you AMD is bad for you. If it works great in your setup, there's no reason for you to worry about it at all.

    The cards just don't seem to work their best in my particular setup. I can't speak for everyone though.

    On the topic of this particular article and related reviews, however, I do like the latency based approach to testing. It seems to fall into line with how my system behaves with these cards.
  11. okidna

    okidna

    Joined:
    Jan 2, 2012
    Messages:
    464 (0.55/day)
    Thanks Received:
    340
    Location:
    Indonesia
    :D Old driver is OLD.

    With newer BETA driver :

    http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/5
    [image: frame latency graph, newer beta driver]

    http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/8
    [image: frame latency graph, newer beta driver]
    the54thvoid and Frick say thanks.
    Crunching for Team TPU
  12. rvalencia

    Joined:
    Nov 3, 2011
    Messages:
    79 (0.09/day)
    Thanks Received:
    8
  13. Ferrum Master

    Ferrum Master

    Joined:
    Nov 18, 2010
    Messages:
    408 (0.33/day)
    Thanks Received:
    88
    Location:
    Rīga
    It is a shame we cannot compare such tests on Linux or MacOS... at least for now... but AMD still lags behind Nvidia's driver binary blobs there anyway...

    Anyway, I see this as a necessary evil. It will shake up the AMD driver team at least.
  14. seronx

    seronx

    Joined:
    Jul 10, 2010
    Messages:
    981 (0.71/day)
    Thanks Received:
    216
    Location:
    USA, Arizona, Maricopa
    http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1157&lid=1&pid=1547&leg=0

    950 MHz for ALUs/TMUs/ROPs(1792/112/32)
    = 3404.8 GFlops/106.4 GTexels/30.4 GPixels
    5 GHz 384-bit GDDR5
    = 240 GB/s

    http://www.zotac.com/index.php?page...6&option=com_virtuemart&Itemid=100313&lang=en

    1111 MHz for ALUs/TMUs/ROPs(1344/112/24)
    = 2986.368 GFlops/124.432 GTexels/26.664 GPixels
    6.608 GHz 192-bit GDDR5
    = 158.592 GB/s

    ---
    In conclusion, it would appear that the 660 Ti has faster timings (renders the scene faster) and does more efficient texel work (can map textures faster).

    --> Higher clocks = faster rendering. <--
    Games don't use the ALUs, the ROPs, and the RAM efficiently on the PC, so more Hz means more performance even if you have significantly fewer units.

    Games (+HPC with CUDA): Nvidia <-- unless AMD is cheaper for the same performance.
    High-performance computing (not with CUDA): AMD
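    The throughput arithmetic above follows directly from the spec sheets: shader GFLOPS is ALUs x 2 FLOPs per clock (multiply-add) x core clock, texel/pixel rates are TMUs/ROPs x core clock, and bandwidth is effective memory clock x bus width in bytes. A quick sketch, using the numbers quoted from the Sapphire and Zotac product pages:

    ```python
    # Theoretical GPU throughput from published specs (a sketch, not a benchmark).
    def throughput(alus, tmus, rops, core_ghz, mem_ghz_eff, bus_bits):
        gflops = alus * 2 * core_ghz        # 2 FLOPs/ALU/clock (multiply-add)
        gtexels = tmus * core_ghz           # texture fill rate
        gpixels = rops * core_ghz           # pixel fill rate
        gbps = mem_ghz_eff * bus_bits / 8   # memory bandwidth in GB/s
        return gflops, gtexels, gpixels, gbps

    hd7950 = throughput(1792, 112, 32, 0.950, 5.0, 384)      # Sapphire 7950 @ 950 MHz
    gtx660ti = throughput(1344, 112, 24, 1.111, 6.608, 192)  # Zotac 660 Ti @ 1111 MHz
    ```

    This reproduces the figures in the post: roughly 3404.8 GFLOPS and 240 GB/s for the 7950, versus roughly 2986.4 GFLOPS and 158.6 GB/s for the 660 Ti, with the 660 Ti ahead only on texel rate thanks to its higher clock.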
    Last edited: Dec 16, 2012
  15. Ferrum Master

    Ferrum Master

    Joined:
    Nov 18, 2010
    Messages:
    408 (0.33/day)
    Thanks Received:
    88
    Location:
    Rīga
    To prove it's right... we'd need to downclock the 660 Ti to even out the theoretical output numbers and then run the benches; then we'd see if those are kernel/driver problems or a hardware limitation in itself... although yes, Skyrim is a mess even so... project stutter.
  16. jihadjoe

    jihadjoe

    Joined:
    Oct 26, 2011
    Messages:
    381 (0.42/day)
    Thanks Received:
    97
    Edit: Ah fk it I just realized I'm DAYS late to the party...

    Just a little tweet from Anand Shimpi:
    https://twitter.com/anandshimpi/status/279440323208417282
    I'm pretty confident btarunr wouldn't have linked the article here either if he felt it was biased.
  17. entropy13

    entropy13

    Joined:
    Mar 2, 2009
    Messages:
    4,870 (2.60/day)
    Thanks Received:
    1,171
    LOL yeah, and talking about "bias"...I'm also a regular in the comments section at Tech Report, and it's Cyril Kowalski that's more frequently called an "Nvidia fanboy" even though he recommended the 7850/7870 over its Nvidia counterparts because of the prices at the time of the reviews.
  18. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,128 (1.97/day)
    Thanks Received:
    1,355
    Location:
    Glasgow - home of formal profanity
    Having read more into it, there is no bias. Any issue with the latency is on a game-to-game, driver-to-driver basis. Here are the older latency graphs for Skyrim. The only card to suffer is the GTX 570.

    [image: Skyrim frame latency graph]

    Yes, older drivers (12.7 beta), but the entire point is: no bias and no AMD crap-out. Also, for each latency blip to be identified as a non-glitch requires continually rerunning the same scene and seeing how the latencies play out.

    And yes, to repeat, the graph above is from older drivers, but the point is still valid: there are no inherent issues with the AMD cards. Nvidia's Vsync may well be doing its intended job here to minimise latency (effectively reducing it by throwing resources, i.e. speed, at those more difficult scenes).
  19. crazyeyesreaper

    crazyeyesreaper Chief Broken Rig

    Joined:
    Mar 25, 2009
    Messages:
    8,074 (4.36/day)
    Thanks Received:
    2,674
    Location:
    04578
    Nvidia has a bit of an edge when it comes to stutter-free gameplay. What people don't realize is that NVIDIA uses some of those transistors in the GPU for that very purpose, while AMD not so much; essentially NVIDIA is spending GPU die space to improve the smoothness of gameplay, to an extent. How well it works, well, that's up to people with their GPUs to decide. In the end, both companies can provide fantastic performance and stutter-free gameplay; apparently for AMD it just requires driver switching lol.
  20. kristimetal New Member

    Joined:
    Mar 5, 2012
    Messages:
    6 (0.01/day)
    Thanks Received:
    0
    Sad

    I'm sad now; I have a 7950 Windforce.
    In Skyrim with 12.8 it sometimes had a small stutter; I updated to 12.10 and the stuttering increased in outside areas (not in the cities, where it runs smoothly, but in some caves the stuttering appears, weird).

    I bought it in July; there was a special offer, I paid 310 euros, it was a deal back then.
    Now I think I should have gone with a GTX 670, but the damn thing even now is around 360 euros (the cheapest with standard cooling), the Asus DCU or Gigabyte Windforce being 380-390 euros.
    :cry:
    Hope AMD improves their drivers fast.:ohwell:
  21. sergionography

    Joined:
    Feb 13, 2012
    Messages:
    264 (0.33/day)
    Thanks Received:
    33
    Whatever the case, turbo for GPUs never made sense to me.
    It sounds like it could skew average fps: super-high fps in easy-to-render scenes that allow thermal headroom, but not so much in the intensive ones that leave no thermal headroom, which is where you need the power.
    If anyone knows of any good reviews that compare minimum fps between boost and non-boost, it would be greatly appreciated.
    xorbe says thanks.
  22. okidna

    okidna

    Joined:
    Jan 2, 2012
    Messages:
    464 (0.55/day)
    Thanks Received:
    340
    Location:
    Indonesia
    7950 Boost reviews but you can also find 7950 non-boost version minimum FPS (and average) as a comparison :

    http://www.bit-tech.net/hardware/2012/08/16/amd-radeon-hd-7950-3gb-with-boost/1
    http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-HD-7950-3GB-PowerTune-Boost-Review
    http://www.hardwarecanucks.com/foru...owercolor-hd-7950-3gb-boost-state-review.html
    Crunching for Team TPU
  23. sergionography

    Joined:
    Feb 13, 2012
    Messages:
    264 (0.33/day)
    Thanks Received:
    33
    See, just like I thought:
    http://www.pcper.com/files/imagecache/article_max_width/review/2012-08-13/bf3-1680-bar.jpg
    here the minimum fps is the same,

    http://www.pcper.com/files/imagecache/article_max_width/review/2012-08-13/bac-1920-bar.jpg
    and here the boost version has a lower minimum for some reason.

    So yeah, it appears the whole boost thing on graphics cards isn't as reliable as claimed; it only lifts average fps via higher max fps, which is useless if you ask me because anything higher than 60 fps isn't noticeable, while going below 60 in some competitive games might be a big problem.
