The Ati vs. Nvidia Image Quality Thread

Discussion in 'Graphics Cards' started by DarkMatter, Mar 14, 2008.

Who has the better IQ this generation?

  1. Ati

    84 vote(s)
    56.8%
  2. Nvidia

    19 vote(s)
    12.8%
  3. None. Both have the same.

    45 vote(s)
    30.4%
  1. Kovoet

    Kovoet

    Joined:
    Mar 9, 2008
    Messages:
    864 (0.35/day)
    Thanks Received:
    180
    I've had both cards, and this debate will be ongoing for a long time. I remember being such an ATI fan from the day the 9800XT came out, all the way to the HIS 1950Pro Turbo, which I loved, but since going over to the nVidia 8800GTS 512MB I can't decide anymore. Both cards have their advantages.

    The only thing that gets me about nVidia is that it's going to make me bankrupt one day, because the cards change every 5 minutes.
     
  2. saadzaman126

    saadzaman126 New Member

    Joined:
    Apr 16, 2008
    Messages:
    415 (0.17/day)
    Thanks Received:
    7
    Location:
    Ontario, Canada
    We'll see when both new series come out who has the better card. I hear the 4870X2 is insanity.
     
  3. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    I own an 8800GT and have installed/set up five or six 3800-series cards, as well as getting to play with them.

    Out of the box, without spending any time tweaking, the ATI cards look BETTER in movies and most games. For video playback they are quite a lot better than the 2900; the 2900 lacked the UVD decode block that helps with video playback acceleration.

    Now, with tweaking you can make both look better, but OUT OF THE BOX, without any tweaking, ATI/AMD clearly looks better, and this has been verified by more than a few people, including the votes in this thread. Funny that ATI is winning the vote here, since nVidia's cards are so much faster and more popular in games.

    My X1900XTX looks better for VIDEO PLAYBACK; images are more crisp and clear. I don't know about CPU use, since it doesn't really matter to me. I can watch 720p and 1080p movies and do other things at the same time without lag, so I'm happy.

    Now for gaming: with no AA the cards are close to even, but PER AA SETTING, ATI's is better for QUALITY, while nVidia cards can crank the AA up drastically without taking a major performance hit.

    Here's my personal experience using a Dell 2001FP gaming monitor (BenQ-built):

    In games, 2x ATI AA looks the same to me as 4x or 8xQ nVidia AA depending on the game; some games respond differently to AA settings than others.

    WoW, for example, at 1600x1200 looks as good with 2x ATI AA as it does with 8xQ AA on my 8800GT, and 4x nVidia AA looks about the same as 2x nVidia. I don't know if it's a driver or game bug, but that's my experience. nVidia's supersampled transparency AA looks better than adaptive AA, but doesn't look better than the advanced AA modes you can enable with ATITool's advanced tweaks. Also, supersampled mode really hammers performance in WoW and other games that use a lot of alpha-tested textures for stuff like ground clutter and trees; it can cut the FPS in half or even to a quarter, whereas I can enable adaptive AA + EATM and AlphaSharpenMode and take far less of a hit with the same or better quality.

    Mind you, the new enhanced modes don't work with all games, but those they do work with show a nice quality boost, and you can even disable AAA and get the same quality boost as you had by enabling it, with effectively no performance hit (same performance as not having AAA enabled, but the same quality as it would be with it enabled).
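
    For anyone curious how those transparency/adaptive modes actually get switched on: under D3D9 both vendors expose alpha-to-coverage through driver FOURCC hacks rather than a real API, which is why tweak tools end up poking magic values into render states. A rough C++ sketch from memory; the device pointer is assumed to exist, and treat the exact FOURCC values as my recollection rather than gospel:

    Code:
    #include <d3d9.h>

    // Hypothetical toggle for alpha-to-coverage (antialiasing of alpha-tested
    // textures like foliage) via the D3D9 vendor back doors.
    void EnableAlphaToCoverage(IDirect3DDevice9* dev, bool isNvidia)
    {
        if (isNvidia) {
            // NVIDIA: write the 'ATOC' FOURCC into D3DRS_ADAPTIVETESS_Y.
            dev->SetRenderState(D3DRS_ADAPTIVETESS_Y,
                                MAKEFOURCC('A', 'T', 'O', 'C'));
        } else {
            // ATI: 'A2M1' enables (and 'A2M0' disables) via D3DRS_POINTSIZE.
            dev->SetRenderState(D3DRS_POINTSIZE,
                                MAKEFOURCC('A', '2', 'M', '1'));
        }
        // Alpha testing must be enabled for the coverage trick to kick in.
        dev->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
    }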

    So yeah, nVidia cards are faster, and with higher settings you can get the same quality, but if you are just talking PURE QUALITY stock for stock, the way most people use their cards (most people do not tweak their drivers beyond maybe setting AA/AF levels), ATI wins.

    When you bring AA performance in DX9 and DX9+DX10-shader games into account, nVidia tops ATI, no doubt. In 10.1 ATI pulls ahead according to current reports/reviews, because true DX10 (10.1 is what DX10 was originally meant to be) uses shader-based AA, so it doesn't take the huge hit that trying to do DX9-style AA in shaders/software does.
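
    To make "shader-based AA" concrete: DX10-class hardware can hand a pixel shader the individual sub-samples of a multisampled surface, so the game filters them itself instead of relying on the fixed-function resolve. Here's a minimal sketch of such a custom resolve, with the HLSL embedded as a C++ string; the names are made up for illustration, not taken from any real game:

    Code:
    // Custom 4x MSAA resolve for D3D10.x (hypothetical example).
    // A real engine would compile this with D3DX10CompileFromMemory() and
    // could tone-map or edge-filter per sample, which is the whole point
    // of doing the resolve in a shader instead of in fixed hardware.
    const char* kResolvePS = R"(
        Texture2DMS<float4, 4> gColorMS : register(t0); // 4x MSAA color buffer

        float4 main(float4 pos : SV_Position) : SV_Target
        {
            int2 coord = int2(pos.xy);
            float4 sum = 0;
            [unroll]
            for (int s = 0; s < 4; ++s)
                sum += gColorMS.Load(coord, s); // read each sub-sample directly
            return sum * 0.25f; // plain box filter; swap in anything smarter
        }
    )";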

    You are talking about two totally different designs when you compare the G80/G92 with the R600/RV670.
    G80/G92 are DX9 parts with DX10 shader support (SM4) tacked on; they are designed to work best in the games that were out when the cards launched, and in those that have mostly come out since. This was a good move on nVidia's part, because it allowed them to pull ahead nicely when the 2900 came out.

    The R600/RV670 chips are a NATIVE DX10 design with DX9 supported via software; the problem is that they don't have a hardware AA resolve unit for use when playing DX9-based games, which until very recently has meant nearly all games.

    Crysis and others are not true native DX10 games; they are DX9 games with DX10 (Shader Model 4) shader effects added for the DX10 version, and as such do not use true DX10 shader-based AA, relying on DX9-style hardware AA instead.

    When/if nVidia's next card comes out with NATIVE DX10.x support, we will see how it does with older games. My guess is that they will keep the hardware unit there for games that work best with it, and support shader-based AA for those games that require it or that will run/look best with it.

    I get the impression that the R700-based cards will have a hardware AA unit onboard to improve performance with older games, while also fully supporting shader-based AA as required by the true DX10 specs (MS backed off on some specs for nVidia; originally DX10 was meant to be what DX10.1 is).

    So yeah, nobody's really better IF you tweak things, but out of the box ATI wins in my book, and yet I have an 8800GT (well, it's in the shop; I should have it back soon, I hope).
     
    EastCoasthandle says thanks.
  4. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,482 (11.48/day)
    Thanks Received:
    9,762
    Rebo&Zooty:
    One important thing here is that you are using a CRT screen. That means the RAMDACs on the cards are in use, whereas over DVI they would not be. If possible, compare over DVI as well - it wouldn't surprise me if, in this LCD era, they've been using cheaper RAMDACs lately.
     
  5. ShadowFold

    ShadowFold New Member

    Joined:
    Dec 23, 2007
    Messages:
    16,921 (6.70/day)
    Thanks Received:
    1,644
    Location:
    Omaha, NE
    Seeing as I just went from a 3850 to an 8800GT, I did not notice anything image-quality-wise, just performance increases :D
     
  6. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    Um, wrong, the Dell 2001FP is a 20.1in LCD, and BenQ doesn't make CRTs AFAIK.

    And I also have a 19in KDS monitor that I hook up to watch movies on as I game; it's a CRT.

    http://support.dell.com/support/edocs/monitors/2001fp/EN/specs.htm

    Results are the same on both, and the KDS I have is a very high quality CRT; the LCD is a high quality gaming LCD (despite the specs looking very unimpressive today, it doesn't ghost at all and is VERY crisp).

    And nVidia sometimes uses crappy filtering on the analog portion of lower-end cards, the same crap they pulled back in the GeForce 1/2 days, but back then you could remove the filters with a few snips of pliers and get a huge quality boost from it.

    Haven't seen the high-end nVidia cards use bad filters in years, but the quality of the AA/AF, and hell, the general IQ on the pre-8-series cards was very questionable, to be kind.

    And yes, this monitor is using NATIVE DVI-D; it has VGA, S-Video and component plugs as well (I may hook up a DVD player to it for the hell of it :p ).

    One thing I can say about this monitor: I have yet to find a stand that's better or as good. This one's telescopic, letting me raise the monitor to eye level from sitting directly on the desk. My father's newer Dell 20.1in widescreen (Samsung-built) has a CHEEZY little stand; I had to build him a wooden pillar to set his monitor on to get it to eye level (how lame is that?).

    I didn't pay for this LCD; it was a gift from a good friend, and despite the Dell logo on it, I like it ;) (but I hate Dell :p )
     
  7. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,482 (11.48/day)
    Thanks Received:
    9,762
    Oh, my bad, I thought it was a CRT model (I used to have a 19" CRT with a very similar name).
     
  8. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    FP stands for flat panel :p

    Oh, and a little FYI for you all: even today's best LCDs can't reproduce a TRUE RED; a good/great CRT is better than a kickass LCD for colour reproduction and quality.

    There's a reason most professional graphics studios still use CRTs for their high-end work machines.

    The only problem is the room they take up and how much they cost to ship.

    The place I use to get monitors: http://www.merkortech.com/home.asp

    Good prices, especially if you stick with the free-shipping models. My buddy got a 36in HDTV/CRT from them for 400 bucks around 1.5 years ago, and that's SHIPPED. Granted, it was shipped ground freight so it took 2 weeks to get here, but still, that was a killer price for that unit. Its native res is 1080p, and it has both true digital DVI and VGA. It's got an HDMI port, but the HDMI port doesn't do the HDCP encryption it needs to properly support protected HD content in Vista :p
     
  9. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.18/day)
    Thanks Received:
    3,778
    The G8x and G9x cards are true DX10 cards as well. Their shaders are totally unified and fully programmable. They aren't locked into specific tasks at all.

    And the DX version has nothing to do with the way AA is processed. It doesn't require any specific type of AA processing at all. That's 100% up to the game developers, and always has been.
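
    To illustrate: the multisample level is just something the app requests when it creates its back buffer, whatever the DX version. A rough C++ sketch of the D3D10-era path; the values and the window handle are illustrative only:

    Code:
    #include <windows.h>
    #include <dxgi.h>

    // The game, not the DX version, decides how its back buffer gets
    // multisampled (illustrative values only).
    void FillSwapChainDesc(HWND hwnd, DXGI_SWAP_CHAIN_DESC& sd)
    {
        sd = {};
        sd.BufferDesc.Width   = 1600;
        sd.BufferDesc.Height  = 1200;
        sd.BufferDesc.Format  = DXGI_FORMAT_R8G8B8A8_UNORM;
        sd.BufferCount        = 1;
        sd.BufferUsage        = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        sd.OutputWindow       = hwnd;   // assumed, existing window
        sd.Windowed           = TRUE;
        sd.SampleDesc.Count   = 4;      // 4x MSAA, the app's choice
        sd.SampleDesc.Quality = 0;      // vendor-specific quality level
        // ...then passed to D3D10CreateDeviceAndSwapChain(); a DX9 game
        // would instead set D3DPRESENT_PARAMETERS::MultiSampleType =
        // D3DMULTISAMPLE_4_SAMPLES.
    }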

    And 2xAA on ATI is in no way equal to 4x (or whatever) on Nvidia.
     
