
8800gts vs 2900 pro all tests made by me

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,789 (3.89/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
3DMark is not shader intensive? C'mon... How, then, can the HD 2900s, with lower pixel fillrate and lower texel fillrate but much more shader power, be ahead of their respective competitors in this benchmark when they trail them in most games and benchies?

As for the other thing, I was only trying to point out that under those conditions 14,000 was not improbable at all, granted you believe the others at 15,000. It's not that big a difference. You even mention in your post that the CPU clock increase to 4.5GHz alone can account for an 800-point gain. In the same post you assume (based on core/memory) that the shaders are running at 1860MHz, but who knows? That is what I was trying to explain. I didn't say anywhere that you knew nothing about shaders, but you overlooked the fact that they could be using a lower shader clock, and automatically said it was impossible for a single GTS to achieve 14,000 at those speeds. I was basically responding to that post of yours.
The "Since I can't reach..." was aimed at other people, not you. That is why I said that first and then addressed you. Should I have made two posts? Maybe. Still, there are lots of posts saying that they barely reach 14k with their GTX and thus 14k on the GTS is impossible; my first sentence was for them.

Sorry for the rant. :ohwell:

Okey dokey, we'll call it a day! But we do disagree on one point: 3DMark06 is not particularly shader intensive, and actually the 2900XT beats (since Cat 7.8) even a GTX in a number of real-world gaming scenarios... but that's another story! Just as a matter of interest, with no additional shader clocking, just leaving it in "sync", a core speed of 783MHz auto-clocks the shaders to 1761MHz, which is pretty significant in itself!
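For what it's worth, the linked-clock relationship described above can be sketched numerically. This is a rough model that assumes the shader domain simply scales with the core at the ratio implied by the 783MHz → 1761MHz pair quoted in the post; real G80 boards step the shader clock in discrete increments, so treat it as an approximation only:

```python
# Rough model of G80 linked shader clocking (approximation only:
# real hardware moves the shader domain in discrete steps).
# Ratio inferred from the figures quoted above: 1761 / 783 ≈ 2.25.

CORE_TO_SHADER_RATIO = 1761 / 783  # ≈ 2.249

def linked_shader_clock(core_mhz: float) -> float:
    """Estimate the shader clock that follows a given core clock."""
    return core_mhz * CORE_TO_SHADER_RATIO

print(round(linked_shader_clock(783)))  # 1761, the pair quoted above
print(round(linked_shader_clock(863)))  # 1941, what an ~863MHz core would imply
```

By this approximation, an 863MHz core would put the linked shader clock in the region of 1940MHz before any independent shader overclocking is even applied.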
 

Lekamies

New Member
Joined
Dec 16, 2004
Messages
152 (0.02/day)
I do, however, stand corrected on the 8800GTS: it appears it can break 14,000+. It must have dry ice strapped to it and a blessing, or holy water in its cooling system...

Warm water is enough cooling to break that barrier.
Here is a pic after a 3DMark03 run, including RivaTuner's hardware monitor.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.29/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I have heard and read a lot of people claiming big performance boosts from shader overclocking alone, so I will keep my view.

The fact that the new Cat 7.8 makes the 2900 perform better only corroborates my point: shader power is what performance demands. And if there's one thing the Radeons have, it's shader power. Better said, they have as much as 2x the theoretical peak shader power of their Nvidia counterparts.

I didn't know how the shaders overclock when you overclock the core; I assumed they raised their clock, but I doubted they did it in a linear fashion. Now I know it's not linear. Nevertheless, you have to agree that 1860 (or 1836, for that matter) is higher than 1761, don't you?

Now, if you want, you can join me in the "more shader power, more performance" club, which BTW is backed up by the 8800GT and many 3DMark records, for example Lekamies' case. :D
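To put a number on the "as much as 2x" claim, here is a back-of-the-envelope peak-throughput comparison. The ALU counts and stock clocks below are the commonly published specs (an assumption, not figures from this thread), and a MADD is counted as 2 flops per ALU per clock; this is theoretical peak only, not game performance:

```python
# Theoretical peak shader throughput, counting a MADD as 2 flops/ALU/clock.
# ALU counts and stock clocks are the commonly published specs (assumption).
def peak_gflops(alus: int, shader_clock_mhz: float, flops_per_clock: int = 2) -> float:
    return alus * shader_clock_mhz * flops_per_clock / 1000.0

hd2900xt = peak_gflops(320, 742)   # shader clock = core clock on R600
gts_640  = peak_gflops(96, 1188)   # G80 8800GTS shader domain

print(f"HD 2900 XT ≈ {hd2900xt:.0f} GFLOPS")
print(f"8800 GTS   ≈ {gts_640:.0f} GFLOPS, ratio ≈ {hd2900xt / gts_640:.2f}x")
```

On those assumptions the ratio against the GTS does land right around 2x, which is roughly where the claim comes from; counting the G80's extra MUL, or comparing against the GTX instead, shrinks the gap considerably.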
 
Joined
May 12, 2006
Messages
11,119 (1.70/day)
System Name Apple Bite
Processor Intel I5
Motherboard Apple
Memory 40gb of DDR 4 2700
Video Card(s) ATI Radeon 500
Storage Fusion Drive 1 TB
Display(s) 27 Inch IMac late 2017
Warm water is enough cooling to break that barrier.
Here is a pic after a 3DMark03 run, including RivaTuner's hardware monitor.

I'm not impressed. THAT RESOLUTION, NO WONDER!!!! And in a 4½-year-old bench. Let's see 3DMark06. As a matter of fact, why don't you bench it to show me just how wrong I am.
 
Joined
May 12, 2006
Messages
11,119 (1.70/day)
Warm water is enough cooling to break that barrier.
Here is a pic after a 3DMark03 run, including RivaTuner's hardware monitor.

I'm gonna stop now, because what you're saying is total garbage. Good luck. One more thing: I still don't see a 3DMark06 19,000 score with full screenshots, CPU-Z, GPU clocks etc., or for that matter an 8800GTS hitting 14,000 with a screenshot to back it up, other than the Futuremark ORB. A Futuremark link by itself means zero. I want to see a completed test screen and all the other required screens. I bet I won't see them.
 

Tatty_Two

Gone Fishing
I have heard and read a lot of people claiming big performance boosts from shader overclocking alone, so I will keep my view.

The fact that the new Cat 7.8 makes the 2900 perform better only corroborates my point: shader power is what performance demands. And if there's one thing the Radeons have, it's shader power. Better said, they have as much as 2x the theoretical peak shader power of their Nvidia counterparts.

I didn't know how the shaders overclock when you overclock the core; I assumed they raised their clock, but I doubted they did it in a linear fashion. Now I know it's not linear. Nevertheless, you have to agree that 1860 (or 1836, for that matter) is higher than 1761, don't you?

Now, if you want, you can join me in the "more shader power, more performance" club, which BTW is backed up by the 8800GT and many 3DMark records, for example Lekamies' case. :D

I agree completely about the shader power; I think you are misunderstanding what I am trying to say regarding the 1761. The 1761 is not independently overclocked. Unlike the R600 series (where the shader clock is locked and fixed at a lower speed than Nvidia's), on the G80 the shader clock automatically increases as you increase the core clock. So when a user sets his 8800GTS core speed to 863MHz, he already has a shader clock of 1761MHz; he can then go into RivaTuner and overclock the shader clock separately (he has probably hit his core max already at 863MHz) without touching the core. My point was that his shaders are most likely well beyond 1761 if he knows what he is doing, and achieving that score on an 8800GTS suggests he does... make sense? So I am saying the minimum it can be is 1761, and it is likely to be much higher. Of course we don't know. What we do know, though, is that you would expect it to be a lot LOWER, because the difference in CPU speed and core clocks is large, and the "faster" card (perhaps not the best terminology :)) is actually the slower one.
 

ccleorina

New Member
Joined
Mar 9, 2007
Messages
195 (0.03/day)
Location
Overclocking Hell
Processor Intel Core i7 Extreme Edition 965 4.2GHz
Motherboard Gigabyte GA-EX58-EXTREME Intel X58
Cooling Noctua NH-U12P SE1366 120mm SSO CPU Cooler
Memory G.SKILL 6GB (3 x 2GB) DDR3 2000Mhz
Video Card(s) 2 XFX GeForce GTX 295 1792MB GDDR3 SLI
Storage 2 WD VelociRaptor 300GB RAID0 + WD GP 2TB Data
Display(s) 2 Samsung 30W" LCD 6ms
Case Thermaltake SwordM Black Liquid Cooling Case
Audio Device(s) Creative Falat1ty X-Fi
Power Supply Ultra X3 1600W Modular PSU
Software Microsoft Windows Vista Ultimate 64-Bit SP1 OEM
I just want to ask: does the ATI HD 2900 have a shader clock? I can't see it in GPU-Z. :rockout:

And if the shader clock is locked, what is the real shader clock? :roll:

Thanks.....
 
Joined
Feb 18, 2006
Messages
5,147 (0.78/day)
Location
AZ
System Name Thought I'd be done with this by now
Processor i7 11700k 8/16
Motherboard MSI Z590 Pro Wifi
Cooling Be Quiet Dark Rock Pro 4, 9x aigo AR12
Memory 32GB GSkill TridentZ Neo DDR4-4000 CL18-22-22-42
Video Card(s) MSI Ventus 2x Geforce RTX 3070
Storage 1TB MX300 M.2 OS + Games, + cloud mostly
Display(s) Samsung 40" 4k (TV)
Case Lian Li PC-011 Dynamic EVO Black
Audio Device(s) onboard HD -> Yamaha 5.1
Power Supply EVGA 850 GQ
Mouse Logitech wireless
Keyboard same
VR HMD nah
Software Windows 10
Benchmark Scores no one cares anymore lols
I think it's tied to the stock core clock, and from what I've heard it doesn't change, even when you OC the core.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.81/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
I think it's tied to the stock core clock, and from what I've heard it doesn't change, even when you OC the core.
I wish somebody would make a good BIOS editor for R600. :(
 

cefurkan

New Member
Joined
Oct 21, 2007
Messages
206 (0.03/day)
Processor Q6600 2.4 GHZ@ 3.754 GHZ
Motherboard Asus P5Q
Cooling stok fan
Memory 4 GBYT DDR2 CL4 Geil
Video Card(s) MSI 8800 GTX 768 MB @621 GPU 945 Memory
Storage 3*500 raid + 500 harici + 160 GBYT
Display(s) 17. lcd philips
Case feel hurricane 1
Audio Device(s) onboard
Power Supply 750 Watt Silver Stone
Software win xp sp3
I just want to ask: does the ATI HD 2900 have a shader clock? I can't see it in GPU-Z. :rockout:

And if the shader clock is locked, what is the real shader clock? :roll:

Thanks.....

The shader and GPU clocks are the same on ATI.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,789 (3.89/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
I just want to ask... Is ATI HD2900 have shader clock??? I cant see in GPUZ?:rockout:

If the shader clock is locked? So what is the real shader clock?:roll:

Thanks.....

Yes, it is locked and does not increase with core overclocking. The 8800GTS stock shader clock is 1185MHz and will increase with core overclocks; it can also now be increased independently in RivaTuner :D The 2900XT shader clock runs at 800MHz and stays there, which is one of the reasons it has so many SPs, to compensate (although that's not the specific reason).

Just to bore you, though :eek: Technically, for practical purposes, although the 2900XT has 320 SPs, it actually has 64 groups of 5 shaders. Each group of 5 shaders can only run one thread, while each single shader in the 8800 can run one. This means the 8800GTX runs 128 threads per clock and the 2900 runs 64. BUT each thread is worked on by 5 shaders and can hypothetically have 5 instructions run per thread per clock. This equates to 320 instructions per clock versus the GTX's 128. While it is easy to divide the threads among all 64 groups of shaders, it is very difficult to keep all 5 shaders in each group working. This means that in a best-case scenario (all 5 shaders per group working to the max) the 2900 does 2.5x the work per clock of the 8800GTX, and in a worst-case scenario it does half the work per clock of the 8800GTX. So the performance of the 2900XT GREATLY relies on the driver's ability to distribute the instructions of the game being played. This is why we see poor performance in some games and spectacular performance in others with the 2900XT. Future drivers can help this greatly.
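The best-case/worst-case arithmetic above can be expressed as a toy model: instructions issued per clock as a function of how many of the 5 slots in each R600 group the scheduler manages to fill. This is a deliberate simplification of the real hardware, just to show where the 0.5x and 2.5x figures come from:

```python
# Toy model of the per-clock instruction issue described above.
# R600: 64 groups x 5 slots; G80 GTX: 128 scalar shaders, 1 op each.
G80_OPS_PER_CLOCK = 128

def r600_ops_per_clock(slots_filled: float) -> float:
    """Instructions issued per clock when each 5-wide group
    averages `slots_filled` useful slots (1 <= slots_filled <= 5)."""
    return 64 * slots_filled

for filled in (1, 2.5, 5):
    ratio = r600_ops_per_clock(filled) / G80_OPS_PER_CLOCK
    print(f"{filled} slots filled -> {ratio:.2f}x the GTX's issue rate")
```

One slot filled per group gives 64/128 = 0.5x the GTX (the worst case in the post); all five filled gives 320/128 = 2.5x (the best case). Everything in between is the driver's scheduling problem.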
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.29/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I understood your point about the shader clocks being locked, and I understand why you assume the clock is even higher; it makes sense, after all. It happens that I don't, but maybe I'm biased in this respect. I used to frequent a local forum of overclockers whose only goal was to achieve the highest clocks possible, or better said, the highest combined clocks they could reach while staying 100% stable, and they used to say it was common in their circles. The problem is that higher clocks don't always translate into better performance. At some point there are current leakages, sync problems between modules, etc. When this happens, nobody knows what might occur, and one possibility is that higher speeds give worse results, even if the system is (or seems to be) totally stable. OK, we all know this, and I am talking about CPU overclocking for the most part, but the same applies to GPUs, even more so in this case, where core and shaders are async'd. And although you have 3DMark to test performance as you increase clocks, not everybody benches at every speed; some use only stability tests instead.

Another thing I take into account when making my assumptions is this: the whole 8800 line is the same chip at manufacturing time, and then they choose which one becomes which model based on yields or demand. This is nothing new, we all know it, but I don't see anyone paying as much attention to it as I think we should. There's a big difference between the models, so much so that it could have been called a different chip. For example, a chip that was binned as a GTS because of a single defective ROP is going to overclock a lot better than one binned because it couldn't reach GTX speeds as reliably as Nvidia wanted. This isn't new to anyone either, but I don't see anyone giving it the importance it deserves.

So my point was:
First, do we know for sure they are overclocking in a proper manner, so that they get better performance, or are they aiming for higher clocks only? (OK, I didn't read the whole link, I just looked at the screenies and little more, so...) There are so many records, or record claims, out there with higher scores but lower clocks than those in your link that I know where my two cents are going to stay for now.
Second, can a specific 8800GTS perform better at 720/1850 (core/shader) than another at 860/1750? From my point of view, it can.

Wow! That was a long post! I hope it explains my point of view. I will admit that I don't have personal experience with the new cards, either the Nvidia 8 series or the HDs, since I'm out of this business right now, but I do have some knowledge and I read a lot. So yeah, I speak from theory for the most part, and I have to rely on others' experiences, and so I have to believe them. But I have learned one thing about technology: impossible is nothing (or was that Adidas?)
 

DarkMatter

New Member
About the HD 2900's 5 instructions per clock... I remember that when I saw all the specs and diagrams, my guess was that, despite AMD's claims, the chip most likely wouldn't be able to use both the scalar and vector units effectively at the same time. I can't remember why, though, and from that day until now I've looked at the HD SPs as 4-ops/cycle units.

I'm too lazy to go back and reread all the material, so I would love your opinion. Please, everybody, share your thoughts on this.
 

bigboi86

New Member
Joined
Apr 8, 2006
Messages
1,442 (0.22/day)
Location
techPowerUp!
System Name Just getting started....
Processor Athlon II Propus 620 @ 3.51ghz (quad core) L3 cache enabled
Motherboard ASUS M4A785TD-V EVO
Cooling Cooler Master Hyper 212 (great heatsink, 40c max load overclocked)
Memory Kingston HyperX DDR31600 4gb 2x2
Video Card(s) ATI Radeon HD 4850 by XFX
Storage Western Digital Caviar Blue WD3200AAKS 320GB SATA
Display(s) Acer 21.5 inch viewable G215H 1920x1080p, AOC 22inch 1680x1050
Case Antec 300, stock for now
Audio Device(s) Onboard, Turtle Beach headphones / crappy logitech desktop speakers
Power Supply Corsair 650 watt PSU <3
Software Windows 7 x64
In this bench it does, because part of the bench is CPU performance. Regardless, you're wrong; a slow CPU can hold back a GPU. It's called bottlenecking.

That doesn't mean the CPU will cause the graphics card not to overclock as much. They are completely different subsystems.

I am not wrong.

This has nothing to do with benchmarking.
 

Lekamies

New Member
Joined
Dec 16, 2004
Messages
152 (0.02/day)
...or for that matter an 8800GTS hitting 14,000 with a screenshot to back it up, other than the Futuremark ORB. A Futuremark link by itself means zero. I want to see a completed test screen and all the other required screens. I bet I won't see them.

I had only that ORB link as proof of my 3DMark06 score.

So I ran it again at the same clocks and took a screenshot for you.
Here
 

ccleorina

New Member
Yes, it is locked and does not increase with core overclocking. The 8800GTS stock shader clock is 1185MHz and will increase with core overclocks; it can also now be increased independently in RivaTuner :D The 2900XT shader clock runs at 800MHz and stays there, which is one of the reasons it has so many SPs, to compensate (although that's not the specific reason).

Just to bore you, though :eek: Technically, for practical purposes, although the 2900XT has 320 SPs, it actually has 64 groups of 5 shaders. Each group of 5 shaders can only run one thread, while each single shader in the 8800 can run one. This means the 8800GTX runs 128 threads per clock and the 2900 runs 64. BUT each thread is worked on by 5 shaders and can hypothetically have 5 instructions run per thread per clock. This equates to 320 instructions per clock versus the GTX's 128. While it is easy to divide the threads among all 64 groups of shaders, it is very difficult to keep all 5 shaders in each group working. This means that in a best-case scenario (all 5 shaders per group working to the max) the 2900 does 2.5x the work per clock of the 8800GTX, and in a worst-case scenario it does half the work per clock of the 8800GTX. So the performance of the 2900XT GREATLY relies on the driver's ability to distribute the instructions of the game being played. This is why we see poor performance in some games and spectacular performance in others with the 2900XT. Future drivers can help this greatly.

Thanks for nice info....:rockout:
 


cefurkan

New Member
Joined
Oct 21, 2007
Messages
206 (0.03/day)
Processor Q6600 2.4 GHZ@ 3.754 GHZ
Motherboard Asus P5Q
Cooling stok fan
Memory 4 GBYT DDR2 CL4 Geil
Video Card(s) MSI 8800 GTX 768 MB @621 GPU 945 Memory
Storage 3*500 raid + 500 harici + 160 GBYT
Display(s) 17. lcd philips
Case feel hurricane 1
Audio Device(s) onboard
Power Supply 750 Watt Silver Stone
Software win xp sp3
Well, I am bumping this topic for an IQ comparison,

although both cards are already meaningless since

the 8800GT and 3870 are on the market.
 
Joined
Apr 30, 2008
Messages
4,875 (0.84/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory G.Skill Trident Z5 Neo 32GB 6000MHz
Video Card(s) Galax RTX 4060 8GB (Temporary Until Next Gen)
Storage Kingston KC3000 M.2 1TB + 2TB HDD
Display(s) Asus TUF 24Inch 165Hz || AOC 24Inch 180Hz
Case Cooler Master NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
nvidia fanboyism to the max
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
dead thread res to the max :)
 