
The Ati vs. Nvidia Image Quality Thread

Who has the better IQ this generation?

  • Ati

    Votes: 85 56.7%
  • Nvidia

    Votes: 19 12.7%
  • None. Both have the same.

    Votes: 46 30.7%

  • Total voters
    150
Joined
Mar 9, 2008
Messages
1,177 (0.20/day)
System Name KOV
Processor AMD 5900x
Motherboard Asus Crosshair Hero VIII
Cooling H100i Cappellix White
Memory Corsair 3600MHz 32GB
Video Card(s) RX 7900XT
Storage Samsung 970 evo 500gb, Corsair 500gb ssd, 500GB 840 pro & 1TB samsung 980pro. M2.SSD 960 evo 250GB
Display(s) ASUS TUF Gaming VG32VQ1B 165Hz, Dell S2721DGFA 27 Inch QHD 165Hz
Case Corsair 680x
Audio Device(s) ON Board and Hyper X cloud flight wireless headset and Bose speakers
Power Supply AX1200i Corsair
Mouse Logitech G502x plus lightspeed
Keyboard Logitech G915 TKL lightspeed and G13 gamepad
Software windows 11
I've had both cards, and this debate will be ongoing for a long time. I remember being such an ATI fan from the day the 9800 XT came out, all the way to the HIS 1950 Pro Turbo, which I loved, but since moving over to the nVidia 8800GTS 512MB I can't decide anymore. Both cards have their advantages.

The only thing that gets me about nVidia is that it's going to bankrupt me one day, because the cards change every 5 minutes.
 

saadzaman126

New Member
Joined
Apr 16, 2008
Messages
415 (0.07/day)
Location
Ontario, Canada
System Name Cyclone
Processor AMD Athlon II X4 640
Motherboard MSI 870 G45
Cooling Cooler Master Hyper 212+, Terminator 3 80mm LED, and 120mm Front LED
Memory Kingston HyperX Genesis 4gb DDR3-1600
Video Card(s) Zotac 1Gb GTS 250
Storage Seagate Barracuda SATA2 7200RPM 500gb
Display(s) LG Flatron 19" LCD
Case Aerocool Jetmaster Jr.
Audio Device(s) Onboard
Power Supply OCZ ModXStream Pro 500W
Software Win 7 Ultimate 64 bit
We'll see who has the better card when both new series come out. I hear the 4870X2 is insanity.
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
I own an 8800GT and have installed/set up 5-6 3800-series cards, as well as getting to play with them.

Out of the box, without spending any time tweaking, the ATI cards look BETTER in movies and most games. For video playback they are quite a lot better than the 2900; the 2900 lacked the dedicated video decode hardware (UVD) that helps with playback acceleration.

Now, with tweaking you can make both look better, but OUT OF THE BOX, without any tweaking, ATI/AMD clearly looks better, and this has been verified by more than a few people, including the votes in this thread. Funny that ATI is winning the vote here, since nVidia's cards are so much faster and more popular in games.

My X1900 XTX looks better for VIDEO PLAYBACK; images are more crisp and clear. I don't know about CPU use, since it doesn't really matter to me: I can watch 720p and 1080p movies and do other things at the same time without lag, so I'm happy.

Now for gaming: with no AA the cards are close to even, but PER AA SETTING, ATI's is better for QUALITY, while nVidia cards can crank the AA up drastically without taking a major performance hit.

Here's my personal experience, using a Dell 2001FP gaming monitor (BenQ-built).

In games, 2x ATI AA looks the same to me as 4x or 8xQ nVidia AA, depending on the game; some games respond differently to AA settings than others.

WoW, for example, at 1600x1200 looks as good with 2x ATI AA as it does with 8xQ AA on my 8800GT, and 4x nVidia AA looks about the same as 2x nVidia. I don't know if it's a driver or game bug, but that's my experience. nVidia's supersampled transparency AA looks better than ATI's adaptive AA, but doesn't look better than the advanced AA modes you can enable with ATITool's advanced tweaks. Also, supersampled mode really hammers performance in WoW and other games that use a lot of alpha-tested textures for things like ground clutter and trees; it can cut the fps to 1/2 or even 1/4, whereas I can enable adaptive AA + EATM and AlphaSharpen mode and take far less of a hit with the same or better quality.
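
To put rough numbers on that hit, here's a back-of-the-envelope cost model (my own sketch, nothing measured from the drivers, and the 40% figure is just an assumption): if a fraction f of a frame's fragments are alpha-tested and supersampling re-shades those s times, frame cost scales by roughly (1 - f) + f*s.

#include <cstdio>

// Toy cost model (hypothetical): supersampled transparency AA re-shades
// alpha-tested fragments once per sample, so frame cost scales by
// (1 - alphaFrac) + alphaFrac * samples.
double estimateFps(double baseFps, double alphaFrac, int samples) {
    double frameCost = (1.0 - alphaFrac) + alphaFrac * samples;
    return baseFps / frameCost;
}

int main() {
    // A WoW-like scene heavy on foliage: assume 40% alpha-tested fragments.
    std::printf("4x TrSSAA: %.0f fps\n", estimateFps(60.0, 0.4, 4)); // ~27, about 1/2
    std::printf("8x TrSSAA: %.0f fps\n", estimateFps(60.0, 0.4, 8)); // ~16, about 1/4
    return 0;
}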

Mind you, the new enhanced modes don't work with all games, but the ones they do work well with show a nice quality boost, and you can even disable AAA and keep the same quality boost with effectively no performance hit (same performance as not having AAA enabled, but the same quality as if it were enabled). See the sketch below for the idea behind it.
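
For the curious, the trick behind EATM-style modes is essentially alpha-to-coverage. A minimal sketch of the idea (my own illustration, not ATI's actual hardware logic): instead of a hard pass/fail alpha test, the fragment's alpha decides how many of the pixel's MSAA coverage bits get set, so leaf edges resolve to in-between shades with no extra shading work.

#include <cstdio>
#include <cstdint>

// Alpha-to-coverage in miniature: map alpha to a number of covered MSAA
// samples. Real hardware also dithers the mask across neighbouring
// pixels; this only shows the basic mapping.
std::uint32_t alphaToCoverage(float alpha, int samples) {
    if (alpha < 0.0f) alpha = 0.0f;
    if (alpha > 1.0f) alpha = 1.0f;
    int covered = static_cast<int>(alpha * samples + 0.5f); // nearest count
    return (1u << covered) - 1u; // e.g. 2 of 4 samples -> mask 0011
}

int main() {
    const float alphas[] = {0.0f, 0.3f, 0.6f, 1.0f};
    for (float a : alphas)
        std::printf("alpha %.1f -> coverage mask %x (4 samples)\n",
                    a, (unsigned)alphaToCoverage(a, 4));
    return 0;
}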

So yeah, nVidia cards are faster, and with higher settings you can get the same quality, but if you're talking PURE QUALITY, stock for stock, the way most people use their cards (most people don't tweak their drivers beyond maybe setting AA/AF levels), ATI wins.

When you bring performance with AA into account in DX9 and DX9+DX10-shader games, nVidia tops ATI, no doubt. In 10.1, ATI pulls ahead by current reports/reviews, because true DX10 (10.1 is what DX10 was originally meant to be) uses shader-based AA, so it doesn't take the huge hit that trying to do DX9-style AA through shaders/software does.
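
To make "shader-based AA" concrete, as I understand it: under DX10.1 the game can read the individual MSAA samples and combine them itself, instead of relying on the card's fixed-function resolve unit. Here's a toy CPU version of that resolve step (my own sketch, nothing vendor-specific):

#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

// Toy "custom resolve": average a pixel's MSAA samples in software.
// Under DX10.1 a shader can read the samples and do exactly this per
// pixel (or weight them however it likes); before 10.1 you were stuck
// with the fixed-function box filter in the ROPs.
Color resolvePixel(const std::vector<Color>& samples) {
    Color out{0.0f, 0.0f, 0.0f};
    for (const Color& s : samples) {
        out.r += s.r;
        out.g += s.g;
        out.b += s.b;
    }
    const float inv = 1.0f / samples.size();
    out.r *= inv; out.g *= inv; out.b *= inv;
    return out;
}

int main() {
    // Four samples straddling a white/black edge -> 75% grey after resolve.
    std::vector<Color> pixel = {{1,1,1}, {1,1,1}, {1,1,1}, {0,0,0}};
    Color c = resolvePixel(pixel);
    std::printf("resolved: %.2f %.2f %.2f\n", c.r, c.g, c.b);
    return 0;
}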

You are talking about two totally different designs when you compare the G80/G92 with the R600/RV670.
The G80/G92 are DX9 parts with DX10 shader support (SM4) tacked on; they were designed to work best in the games that were out when the cards launched, and that have mostly come out since. This was a good move on nVidia's part, because it allowed them to pull ahead nicely when the 2900 came out.

The R600/RV670 chips are a NATIVE DX10 design with DX9 supported via software; the problem is that they don't have a dedicated hardware AA unit for playing DX9-based games, which until very recently meant practically all games.

Crysis and the others are not true native DX10 games; they are DX9 games with DX10 (Shader Model 4) shader effects added for the DX10 version, and as such they don't use true DX10 shader-based AA, relying on DX9-style hardware AA instead.

When/if nVidia's next card comes out with NATIVE DX10.x support, we'll see how it does with older games. My guess is that they will keep the hardware unit there for games that work best with it, and support shader-based AA for games that require it or that run/look best with it.

I get the impression that the R700-based cards will have a hardware AA unit onboard to improve performance in older games, while also fully supporting shader-based AA as required by the true DX10 spec (MS backed off on some specs for nVidia; originally DX10 was meant to be what DX10.1 now is).

So yeah, nobody's really better IF you tweak things, but out of the box ATI wins in my book, and yet I own an 8800GT (well, it's in the shop; I should have it back soon, I hope).
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Rebo&Zooty:
One important thing here is that you are using a CRT screen. That means the RAMDACs on the cards are in use, whereas over DVI they would not be. If possible, compare over DVI as well - it wouldn't surprise me if, in this LCD era, they've been putting in cheaper RAMDACs lately.
 

ShadowFold

New Member
Joined
Dec 23, 2007
Messages
16,918 (2.83/day)
Location
Omaha, NE
System Name The ShadowFold Draconis (Ordering soon)
Processor AMD Phenom II X6 1055T 2.8ghz
Motherboard ASUS M4A87TD EVO AM3 AMD 870
Cooling Stock
Memory Kingston ValueRAM 4GB DDR3-1333
Video Card(s) XFX ATi Radeon HD 5850 1gb
Storage Western Digital 640gb
Display(s) Acer 21.5" 5ms Full HD 1920x1080P
Case Antec Nine-Hundred
Audio Device(s) Onboard + Creative "Fatal1ty" Headset
Power Supply Antec Earthwatts 650w
Software Windows 7 Home Premium 64bit
Benchmark Scores -❶-❸-❸-❼-
Seeing as I just went from a 3850 to an 8800GT, I didn't notice anything image-quality-wise, just performance increases :D
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
Mussels said:
One important thing here is that you are using a CRT screen. That means the RAMDACs on the cards are in use, whereas over DVI they would not be. If possible, compare over DVI as well - it wouldn't surprise me if, in this LCD era, they've been putting in cheaper RAMDACs lately.

Um, wrong: the Dell 2001FP is a 20.1" LCD, and BenQ doesn't make CRTs, AFAIK.

And I also have a 19" KDS monitor, a CRT, that I hook up to watch movies on while I game.

http://support.dell.com/support/edocs/monitors/2001fp/EN/specs.htm

The results are the same, and the KDS I have is a very high-quality CRT; the LCD is a high-quality gaming LCD (despite the specs looking very unimpressive today, it doesn't ghost at all and is VERY crisp).

And nVidia does sometimes use crappy filtering on the analog portion of lower-end cards, the same thing they pulled back in the GeForce 1/2 days, except back then you could remove the filters with a few snips of the pliers and get a 100% quality boost from it.

I haven't seen the high-end nVidia cards use bad filters in years, but the quality of the AA/AF, and hell, the general IQ on the pre-8-series cards, was very questionable, to be kind.

And yes, this monitor is using NATIVE DVI-D; it has VGA, S-Video and component plugs as well (I may hook a DVD player up to it for the hell of it :p ).

One thing I can say about this monitor: I have yet to find a stand that's better, or even as good. This one's telescopic, letting me raise the monitor to eye level from sitting directly on the desk. My father's newer 20.1" widescreen Dell (Samsung panel) has a CHEESY little stand; I had to build him a wooden pillar to set his monitor on to get it to eye level (how lame is that?).

I didn't pay for this LCD; it was a gift from a good friend, and despite the Dell logo on it, I like it ;) (but I hate Dell :p )
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Oh, my bad, I thought it was a CRT model (I used to have a 19" CRT with a very similar name).
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
FP stands for flat panel :p

Oh, and a little FYI for you all: even today's best LCDs can't reproduce a TRUE red; a good/great CRT beats a kickass LCD for colour reproduction and quality.

There's a reason most professional graphics studios still use CRTs on their high-end work machines.

The only problem is the room they take up and how much they cost to ship.
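
If you want to sanity-check the "true red" claim, the usual test is whether a target primary (say NTSC red, CIE xy = 0.67, 0.33) lands inside the triangle formed by a panel's primaries. A rough sketch, with typical sRGB-class LCD primaries plugged in as assumed (not measured) values:

#include <cstdio>

struct XY { double x, y; };

// 2D cross product: which side of segment a->b the point p lies on.
double side(XY a, XY b, XY p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Point-in-triangle test in CIE xy chromaticity space.
bool inGamut(XY r, XY g, XY b, XY p) {
    double d1 = side(r, g, p), d2 = side(g, b, p), d3 = side(b, r, p);
    bool hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
    bool hasPos = d1 > 0 || d2 > 0 || d3 > 0;
    return !(hasNeg && hasPos); // all on one side -> inside the gamut
}

int main() {
    // Assumed sRGB-class LCD primaries (not measurements of any panel):
    XY r{0.640, 0.330}, g{0.300, 0.600}, b{0.150, 0.060};
    XY ntscRed{0.670, 0.330};
    std::printf("NTSC red inside panel gamut? %s\n",
                inGamut(r, g, b, ntscRed) ? "yes" : "no"); // prints "no"
    return 0;
}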

The place I used to get monitors: http://www.merkortech.com/home.asp

Good prices, especially if you stick with the free-shipping models. My buddy got a 36" HDTV/CRT from them for 400 bucks around 1.5 years ago, and that's SHIPPED. Granted, it went ground freight so it took 2 weeks to get here, but still, that was a killer price for that unit. Its native res is 1080p, and it has both true digital DVI and VGA. It's got an HDMI port, but the port doesn't do HDCP like it needs to for it to properly support HD content in Vista :p
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Rebo&Zooty said:
*snip - full post quoted above*

The G8x and G9x cards are true DX10 cards as well. Their shaders are totally unified and fully programmable. They aren't locked into specific tasks at all.

And the DX version has nothing to do with the way AA is processed. It doesn't require any specific type of AA processing at all. That's 100% up to the game developers, and always has been.

And 2xAA on ATI is in no way equal to 4x (or whatever) on Nvidia.
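
One way to ground that: on a plain polygon edge, N-sample AA can only produce N+1 intensity steps, whichever vendor made the chip, so 2x tops out at 3 shades where 4x gives 5; sample pattern quality is a separate argument. A quick sketch:

#include <cstdio>

// An N-sample AA pixel can only show coverage levels k/N for k = 0..N,
// i.e. N+1 distinct shades along a polygon edge. Sample placement
// affects which level each pixel lands on, not how many levels exist.
void printEdgeLevels(int samples) {
    std::printf("%dx AA:", samples);
    for (int k = 0; k <= samples; ++k)
        std::printf(" %.2f", static_cast<double>(k) / samples);
    std::printf("  (%d shades)\n", samples + 1);
}

int main() {
    printEdgeLevels(2); // 3 shades
    printEdgeLevels(4); // 5 shades
    printEdgeLevels(8); // 9 shades
    return 0;
}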
 