ATI AA better than NV AA screens

Discussion in 'Graphics Cards' started by wolf2009, Sep 8, 2008.

  1. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    Going by your methodology, it could be applied to AF as well. According to the [H] article, the older 3 series suffered from this, but it no longer applies to the 4xxx series, which is consistent with my experience and with what I've read elsewhere. If anyone has any other links to show otherwise, please post. Even in the [H] article, there were a few instances where the ATI AA solution looked better, apples to apples. That said, the hit from using it is less on the ATI solution, which everyone should be able to agree with; I've seen no evidence of nVidia doing better with AA regarding fps, and the hit is harder on their cards.
     
  2. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.60/day)
    Thanks Received:
    3,778
    Yes, it could be applied to AF as well, but I'm also going by what countless others on this forum have said after switching to a 4 series card from an NV card. Those that have owned both agree: the IQ is the same.

    I should also have some first-hand experience with the switch in the coming weeks, as I plan on getting some 4 series cards.

    I'm not loyal to either camp at all.

    Now, yes, the ATI seems to scale better with AA, but that's not the debate here.
     
  3. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    14,185 (4.84/day)
    Thanks Received:
    2,060
    It would never be apples to apples, as the techniques of each company are different.
     
  4. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    Wile E, if you could, please post any findings you have after going to the new card. I value user experience; after all, it's us that use the cards the most, and we value our purchases more. I'd appreciate that. Also, if you could, try using the ATI special filters and let us know if they truly give a better picture, since that's been the consensus.
     
  5. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,154 (1.58/day)
    Thanks Received:
    906
    Location:
    Milky Way Galaxy
    Hey guys, all I can say, in my opinion, is that when I had the 8800GTS 320MB the detail quality of AA and AF was heaps good, but when I bought the HD4850 and gave it a run, the AA and AF quality seemed better to me. I don't know why, but it just seems clearer and sharper at the same time. With ATI cards, though, they lack colour, while Nvidia has really good colour that can be enhanced with colour vibrance, which makes it look nice. ATI has this now, but I think it's only for Avivo, which sucks, but I'm happy. The thing I like about it is that it's different, but that's just me.
     
  6. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.74/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    I didn't see any aliasing anyway .... why not throw the resolution down then use AA, idk
     
  7. SK-1

    SK-1

    Joined:
    May 15, 2005
    Messages:
    3,315 (0.89/day)
    Thanks Received:
    425
    Location:
    In a Galaxy Far Far...you know the rest.
    I do (weird and anal, I know), especially if it is a new engine release or a graphically beautiful game; when a guy has a lot of money and pride wrapped up in his rig, image quality matters.
     
  8. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    http://www.techreport.com/articles.x/14990/7

    RV770's AF is as bad as on RV670, technically. In games it's not very noticeable, but comparatively MUCH MUCH MUCH MUCH (I could go on forever) MORE noticeable than the difference between AA levels beyond 4x. It's especially bad at 45º angles.

    Also, as you can see, AA scaling is slightly better on the Nvidia GTX 280 until 8xAA is enabled. Even then the GTX is usually above the Radeon's performance level, but all the advantage it has at 4x is gone. You will see the same pattern in Wizzard's reviews and elsewhere. Now, the X2 scales better, but you can't really compare, as it's a dual-GPU card that costs $150 more right now. And in this case the "it's a dual GPU" argument DOES matter; it requires nearly twice as much power as the GTX card, for instance.

    Bottom line is what I already said. A dual-GPU card will ALWAYS perform better at higher AA levels, but they have their own collection of disadvantages, and nowadays it just happens that the high levels are well above the discernible threshold.
     
  9. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    Firstly, it requires nowhere near double the 280. Secondly, show me links where less than 8x AA is less stressful on nVidia cards and, as you've said, actually in nVidia's favor. It's not a $150-more card; it BECAME a $150-more card. And as far as high levels, where's the qualification? By whose authority? Or is it nothing more than one's opinion as to what they'd like? I'm reading the TR article; I'll be back.
     
  10. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    From your link, regarding AF: "The bottom line, I think, on image quality is that current DX10-class GPUs from Nvidia and AMD produce output that is very similar. Having logged quite a few hours playing games with both brands of GPUs, I'm satisfied that either one will serve you well. We may revisit the image quality issue again before long, though. I'd like to look more closely at the impact of those trilinear optimizations in motion rather than in screenshots or test patterns. We'll see." Now, also from your link, regarding AA: "Incidentally, the RV770's performance also scales much more gracefully to 8X MSAA than any GeForce does. The Radeon HD 4870 outperforms even the mighty GeForce GTX 280 with 8X multisampling, and the 4850 practically trounces the 9800 GTX. Believe it or not, I'm already getting viral marketing emails from amdguyintoronto@hotmail.com asking me to test more games with 8X AA. Jeez, these guys are connected." Now, as it says, it scales much more gracefully TO 8x, meaning it has the lesser impact below 8x and starts pulling away even more from 8x and up.
     
  11. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    Well, it's not as much as double under load, but it DOES consume a lot more:

    http://www.techpowerup.com/reviews/Palit/HD_4870_Sonic_Dual_Edition/24.html

    From one of the latest Wizzard's reviews. Total SYSTEM power consumption under load: GTX280 = 293W, X2 = 385W. The system excluding the graphics card consumes around 100W, so:

    GTX280 = 193W
    X2 = 285W

    That's 50% more power under load.
    Average is 60% more.
    Idle is 80% more.
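As a quick sanity check of the subtraction above, here is a tiny Python sketch. The ~100 W system baseline is the post's own estimate, not a measured figure, and the helper name is just illustrative:

```python
# Estimate card-only power draw from total system draw, assuming the
# post's ~100 W system baseline (an assumption, not a measured figure).
SYSTEM_BASELINE_W = 100

def card_power(total_system_w: float) -> float:
    """Card-only draw = total system draw minus the system baseline."""
    return total_system_w - SYSTEM_BASELINE_W

gtx280 = card_power(293)  # 193 W
x2 = card_power(385)      # 285 W
extra_pct = (x2 - gtx280) / gtx280 * 100
print(f"GTX 280: {gtx280:.0f} W, X2: {x2:.0f} W, +{extra_pct:.0f}% under load")
```

The exact figure comes out to about 48% more under load, which the post rounds to 50%.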

    Regarding AF quality: I find it funny that you can tell the difference between 4x and 8x AA when gaming, but you are not able to see the huge difference in AF quality on still images. I say that's some outstandingly skewed perception, mate.

    EDIT: Who said it doesn't scale better to 8x? That's exactly what I said. But I responded to one of your claims that the Radeons scale better at ANY AA level, which is not true at all, as you can see in the link. Jesus, look at the chart; no need to speculate on what he wanted to say with "TO 8x". LOL.
    ATI-friendly game, BTW.
     
    Last edited: Sep 9, 2008
  12. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    I'd also like to point out that this comparison is being done using a $270 card vs. a $400+ card, which, like I said (or inferred), came in at over twice the 4870's pricing within a short period of time. If you follow the links you've posted, you'll find the 3870X2 without AA does quite well against many of today's nVidia solutions. That being said, it's because of the superior AA, and the lack of impact from using AA, on the 4xxx series that those G280 prices came down. "You can't fool the folks" is something I heard a long time ago, and it seems to me ATI wins this one; AA, according to what I've seen in this thread from the OP, is superior on the ATI side of things. As for its demands concerning fps, it should be, and is, well known that this also favors the current ATI solution.
     
  13. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    I'm surprised you can't tell the difference from what the OP posted, as it's obvious. As some have said, sure, it COULD be because of other things, but that only points to one thing: there IS a difference. And as far as what you're saying about the AF, which was by my request and not about the OP, your own link disputes what you've just said.
     
  14. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    We all know AA scaling isn't the same from game to game. It'd be of better service to have more games, to actually see what occurs here. Here's really the thing: unless you have, or are going to buy, a G200 card, then compared to the much lower-priced 4850, nVidia scales poorly. Can we agree on that at least?
     
  15. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    No; below 8x, Nvidia GT200 cards scale better.

    And how many times do I have to say it? As long as Ati's AA seems to reduce its AF quality, you can't say it does better AA. I can see the AA getting better as it goes up in the OP, but at the same time I can see the AF quality going down too, or the whole image getting blurred; I can't tell which. But IMO (aka for me), OVERALL image quality is reduced as AA levels are increased, and so the AA quality is NEVER better; it's higher, it's bigger, smoother, whatever, but never BETTER. Not for me (nor for Jeffredo, as you can see for yourself on the first page, nor for many others), and as neither your opinion nor mine nor anyone's is absolute, can we please conclude that AA and AF quality are both, in practice (gaming), of a similar level on both cards. That is what I've been saying from the start. For me this is finished.
     
    Last edited: Sep 9, 2008
  16. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    I'd like to see that as well. Not sure how they're going to go about it, but possibly the 55nm node will help, plus some low-level coding.
     
  17. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,154 (1.58/day)
    Thanks Received:
    906
    Location:
    Milky Way Galaxy
    SNIFF SNIFF SNIFF, I smell a nvidia fanboy............. COUGH COUGH!!!!!!!!!!!!!! darkmatter:laugh:
     
  18. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    17,516 (5.07/day)
    Thanks Received:
    3,291
    Location:
    Worcestershire, UK
    Actually, to be fair, knowing him fairly well, and also having had my fair share of "debates" over Gfx card issues with him in the past, I can honestly say he is not a fanboi for either side. He does, however, believe in accuracy (at least based on the research he does, and that's a fair bit), and he can sometimes come across as very passionate, but, as I said, that's not aimed at any particular brand.

    Anyways, as soon as I saw the title of this thread, I just knew it was gonna be a flamebait session. No doubt Wolf's intentions were good (they usually are!)... but fanbois always drag down meaningful discussions, in my opinion, especially when it's about IQ! :rockout:
     
    DarkMatter says thanks.
  19. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    I think I've said enough. I find the OP's post not only relevant, but obvious. Maybe nVidia will create a better display engine for their MSAA on the next gen. 8xAA matters, folks, especially when you can apply it.
     
    Tatty_One says thanks.
  20. Widjaja

    Widjaja

    Joined:
    Jun 12, 2007
    Messages:
    4,819 (1.63/day)
    Thanks Received:
    639
    Location:
    Wangas, New Zealand
    Yes, this thread was going to have flaming in it.
    I expected it from the title.

    I may be able to back up this nVidia vs. ATi IQ thing with proof, not with AA, but with mirror reflections being blocky on his 8800GTX while the HD4850 has detailed reflections, as he states.

    Still waiting, though, as I haven't seen with my own eyes what the reflections look like on his card.

    But in saying that, I'm pretty sure my HD4850 has blocky self-shadows in COD4 compared to my old 8800GT.
    Unfortunately, I don't think I'd be able to get a decent comparison of the COD4 self-shadows, since they are always moving.

    With graphics cards it comes down to which is the best bang for buck, as the IQ differences are usually in very small things which are not noticeable.
     
  21. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    17,516 (5.07/day)
    Thanks Received:
    3,291
    Location:
    Worcestershire, UK
    Agreed! But I find it strange making an IQ comparison between a two-gen-old 8800GTX and a current-gen 4850 :confused:
     
  22. Widjaja

    Widjaja

    Joined:
    Jun 12, 2007
    Messages:
    4,819 (1.63/day)
    Thanks Received:
    639
    Location:
    Wangas, New Zealand
    My brother and I didn't think we would find any difference at all between the 8800GTX and HD4850, despite the gen gap.
    We hear and read about this nVidia vs. ATi thing and think, yeah, yeah, whatever, since we had never seen any real proof until just recently.

    I find it interesting there is any difference at all.
    But my brother still has to come up with the screens first to make it valid.
     
  23. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    Again with the screenshots, up and down...
    I clearly want to be called a fanboy again, and I know I will be, because I'm going to make another educated argument that just happens to play in favor of Nvidia; but I know that some intelligent people will be able to differentiate and appreciate the truth behind my words. I just didn't write it down before because I thought it was common knowledge. After so many posts talking about screenshots, and taking them as dogma, I feel I have to explain it.

    You can't compare image quality on still screenshots, especially anti-aliasing, because of its nature. The IQ features on graphics cards are designed for games or videos, which are composed of moving pictures, not for still images. When gaming you are always moving, and even if you aren't, nowadays all games have camera sway to some degree, so no two consecutive frames are identical even if you are standing still. This affects how the anti-aliased image looks, because while in frame 1 a given pixel can have a black value of 80% (after weighting all the fragments that compose that pixel), in the next one it can have a value of 50% (with probably almost the same fragments but different weights), 25% in the next, 80% again, and so on. When this happens at 30+ fps, what you get is a much more anti-aliased image than that of any single frame. It's just a different way of doing it, and one that IMO doesn't hurt texture clarity and detail so much (because pixel color blending is not as pronounced*), but I guess that's my opinion.

    *EDIT: I thought this could require further explanation. Look at the spotlight or chandelier 8xAA close-up pictures in the OP. You can clearly see that on Ati the color exceeds the boundaries of the lamp much more. This way the image appears smoother (with its pros and cons); Nvidia gets the same effect by doing the color blending as frames go by, so the textures don't get as blurred and the MOVING image is as anti-aliased as with the other technique. Of course, Ati's image blends over time too, but it's redundant, as the frames are already blended individually. Why the temporal anti-aliasing is better than the other technique at keeping the detail of textures is complicated, but I think the screenshots speak for themselves, so that much can be taken as fact.

    Consider that most older LCDs only have a few thousand colors and use this same technique to attain "true" color, the millions of colors. Because of the purpose of anti-aliasing, comparing it in screenshots is almost as pointless as comparing DivX IQ in screenshots.
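The frame-to-frame blending described above can be sketched numerically. This toy Python example is illustrative only, not either vendor's actual algorithm: it simply averages the per-frame coverage values from the 80% / 50% / 25% / 80% example to show how rapid frames blend into one perceived shade:

```python
# Toy illustration of temporal averaging: a single edge pixel's coverage
# jitters from frame to frame, and the eye integrates those frames into a
# smoother value than any single frame holds.
def temporal_average(frames: list[float]) -> float:
    """Average a pixel's per-frame coverage values, mimicking how the eye
    blends rapidly changing frames into one perceived shade."""
    return sum(frames) / len(frames)

# Per-frame black coverage for one edge pixel (values from the post above).
coverage = [0.80, 0.50, 0.25, 0.80]
perceived = temporal_average(coverage)
print(f"perceived coverage over 4 frames: {perceived:.4f}")
```

The averaged value (0.5875 here) sits between the per-frame extremes, which is the sense in which the moving image looks more anti-aliased than any one screenshot.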

    But yeah, you can all take the easy path and call me a fanboy. At this point I don't think I care. Every day I learn that most people don't care about the truth (or, as Tatty said, accuracy in the information); they just want to hear that their "something" is better, and by no means do they want to hear about its flaws. I still have this post in my mind, as it pictures very well what I'm saying:

    Sure, because a REVIEW shouldn't take everything true into account, good and bad, in order to describe and recommend a product. It should be based on feelings...
     
    Last edited: Sep 9, 2008
  24. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,399 (1.16/day)
    Thanks Received:
    417
    The differences are so minimal, I don't believe either company purposely attempts to create better AA with the intent of beating the competition. It's more that, as new technology is developed, AA gets better naturally, and of course different products can produce different results.

    I often find that it's the application that ultimately determines how effective the AA will be, and whether or not AA can even be applied properly.

    Also, this is very old news, so to speak, and the comparison of a 4850 and 8800GT seems a bit irrelevant as they're different generations...

    If Nvidia is purposely sacrificing IQ for performance, I would say that's acceptable, given how little difference (if any) there is in IQ. The cards work differently, applications are different, etc. When the driver has to work around obstacles, performance degradation can become apparent. If curbing the quality can limit or reduce that degradation, then I can't see how anyone would complain.

    Lastly, still pictures aren't a fair representation, especially when you consider the addition of post-processing effects...
     
  25. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    14,185 (4.84/day)
    Thanks Received:
    2,060
    I have an idea: how about we close this thread down, as it seems to be drawing all the fanboys.
     