
Zotac GeForce GTX 280 Amp! Edition

DaMulta

My stars went supernova
Joined
Aug 3, 2006
Messages
16,159 (3.90/day)
Likes
1,416
Location
Oklahoma T-Town
System Name Work in progress
Processor AMD 955 @ 4GHz
Motherboard MSi GD70
Cooling OcZ Phase/water
Memory Crucial2GB kit (1GBx2), Ballistix 240-pin DIMM, DDR3 PC3-16000
Video Card(s) CrossfireX 2x HD 4890 1GB OCed to 1000MHz
Storage SSD 64GB
Display(s) Envision 24'' 1920x1200
Case Using the desk ATM
Audio Device(s) Sucky onboard for now :(
Power Supply 1000W TruePower Quattro
#51
Most of the games I play are DX9 and not DX10. Plus I don't have Vista installed at this moment.
----


So does the card eat a ton of juice overclocked? How big of a PSU would you say you need for Tri-SLI with the 280s, W1zzard?
 
Joined
May 19, 2007
Messages
4,294 (1.11/day)
Likes
509
Processor Intel Core i9 7900X delidded @ 4.7GHz 1.2v
Motherboard ASUS Rampage VI Extreme
Cooling EK-Supremacy EVO, EK-CoolStream PE 360, EK-CoolStream PE 240, EK-DDC Pump/X-RES 100
Memory Corsair Vengeance RED LED 32GB(8GBx4) DDR4 3400MHz
Video Card(s) EVGA GeForce GTX 1080 Ti FE ( EK 1080TI Waterblock./ Backplate )
Storage Samsung 950 Pro NVME 512GB, 2x 850 Evos 1TB, WD Black 1TB
Display(s) Samsung S27B970 27" Wide PLS Monitor - 2560 x 1440
Case Corsair Carbide Air 540
Power Supply Cooler Master Vanguard Series 1000W
Software Windows 10
Benchmark Scores http://www.3dmark.com/fs/12778735
#52
75% of gamers care. The remaining 25% use Vista with DX9 games that were rushed to have some DX10 features
Yeah, true, but it's always good to see what a high-end card can really handle. BioShock was DX10 and I didn't have any problems, but Crysis was a mess :laugh:
 

senninex

New Member
Joined
Dec 2, 2007
Messages
59 (0.02/day)
Likes
1
Location
Penang, Malaysia.
System Name Ms Window Vista Ultimate X64 SP1
Processor Intel E6750 @ 3.2GHz (OC)
Motherboard Gigabyte GA-P35-DS3
Cooling Coolermaster GeminII-S (34C idle- 47C max load- environment temp= ~25C)
Memory Kingston Hyper-X 1066 DDR2 NVD (4x1GB)
Video Card(s) Zotac 8800GT AMP [OC - 760/2200/1800]
Storage WD Caviar SE16 16Mb 250Gb + Seagate Baracuda 750Gb 32Mb
Display(s) Samsung SyncMaster 740N 19"
Case Coolermaster Centurion 5- ATX
Audio Device(s) Integrated Realtek 7.1-channel HD Audio
Power Supply Enermax FMAII 460Watt
Software MS Windows Vista Ultimate
Benchmark Scores 3Dmark06 (default- on vista X64)= ~13200
#53
I'll stick with my Zotac 8800GT AMP ;) for a few more years..
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
18,523 (4.59/day)
Likes
3,156
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i5 2400 :: Athlon II x4 630
Motherboard MSI H67-G43-B3 :: Gigabyte GA-770T-USB3
Cooling Corsair H70 :: Thermaltake Big Typhoon
Memory 4x2GB DDR3 1333 :: 2x1GB DDR3 1333
Video Card(s) 2x PNY GTX1070 :: GT720
Storage Plextor M5s 128GB, WDC Black 500GB :: Mushkin Enhanced 60GB SSD, WD RE3 1TB
Display(s) Acer P216HL HDMI :: None
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) X-Fi Titanium Fatal1ty Pro - iLive IT153B Soundbar (optical) :: None
Power Supply Corsair CX600w :: Unknown
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
#54
The core clock cannot be higher than the shader clock / 2. If the core clock goes beyond that, it will be set to 1/10th of what is requested. For example, if you set 600 MHz with a shader clock of 1200 MHz, it will work. But if you set 601 MHz with a 1200 MHz shader, the actual operating core frequency will be a mere 60 MHz.
I also noticed that if the shader frequency is too high in relation to the core frequency, the card will instantly render artifacts.
Last but not least, changing the PCI-Express clock frequency causes the card to change clocks as outlined in this article.
Wow, that's gay. If you try to squeeze the most out of one part, the other pretty much commits suicide...
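A minimal sketch of the clocking rule quoted above, in Python; the function name and structure are mine, but the shader/2 threshold and the 1/10th fallback come straight from the quoted behavior:

```python
# Sketch of the GTX 280 core/shader clock rule described above (assumed
# from the quote): the core clock must stay at or below shader / 2, and
# any request beyond that runs at 1/10th of the requested frequency.
def effective_core_clock(requested_core_mhz: float, shader_mhz: float) -> float:
    """Return the core clock (MHz) the card would actually run at."""
    if requested_core_mhz <= shader_mhz / 2:
        return requested_core_mhz       # within the limit: applied as-is
    return requested_core_mhz / 10      # beyond the limit: 1/10th fallback

print(effective_core_clock(600, 1200))  # 600.0 -> runs as requested
print(effective_core_clock(601, 1200))  # 60.1  -> drops to ~60 MHz
```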
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
26,540 (6.38/day)
Likes
7,452
Location
Houston
System Name Team Blue
Processor 5960X@4.8 1.42v
Motherboard Asus X99M-WS
Cooling EK Supremacy EVO, MCR220-Stack+MCR220+MCR320, D5-PWM+EK X-RES 140
Memory 4x8GB G.Skill Trident Z 3200 CL16
Video Card(s) (2) EVGA SC BLACK 1080Ti's+ EVGA reference 1080Ti soon to be under water
Storage Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840x2160@60Hz
Case Caselabs Mercury S5
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron
Keyboard Thermaltake Poseidon ZX
Software W10P
Benchmark Scores Zoom zoom mofo
#55
I'm thinking that, at least for me, 4850s will be a better shot. I do have reasons, and I can list them:

1. My dual-PSU setup gives me 3x 6-pin connectors; with 4850s I don't need new PSUs to get new cards (see the rough budget sketch below).
2. I have a CrossFire mobo, so 3x 4850s is plenty doable.
3. 3x 4850 will destroy one GTX 280 very easily, and from the looks of the wattage numbers, at maybe the equivalent of 1.5x GTX 280s in power draw.

I'm seeing a straight-up loss on these unless NV figures out that shoving more shit onto the same die is the wrong way to do things. They need a new architecture; the one they're using hasn't changed in years. Right now I see a blunder very similar to AMD's CPU blunder. Quite simply, they got cocky: the A64 was great, then C2D knocked it flat on its ass. Now apply that here: G92 is great, GT200 is better, and R700 will knock it flat on its ass. A 4870X2 should destroy this card easily. Sure, just like the Intel argument, it's not a "true" single-card setup, but who the fuck cares? It's better performing, and if it's priced like I've heard from some suppliers I know, goodbye GT200.
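To put rough numbers on point 1, here's a back-of-the-envelope sketch using the standard PCI-Express power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin) and the reference connector layouts of the two cards (one 6-pin on the HD 4850; 6-pin plus 8-pin on the GTX 280). The outputs are delivery ceilings, not measured draw from the review:

```python
# Standard PCI-Express power limits, in watts.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pins: int, eight_pins: int) -> int:
    """Max power (W) deliverable to one card via the slot plus connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# HD 4850: one 6-pin each, so three cards fit a 3x 6-pin setup.
print(3 * board_budget(six_pins=1, eight_pins=0))  # 450 W ceiling for 3x 4850
# GTX 280: one 6-pin plus one 8-pin per card.
print(board_budget(six_pins=1, eight_pins=1))      # 300 W ceiling per GTX 280
```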
 

profzerg

New Member
Joined
Jul 8, 2008
Messages
1 (0.00/day)
Likes
0
#56
Good horsepower comparison. In fact, this thread tells me the GT200 series delivers barely twice the performance my 8800 has. My only "complaint" about this review is the lack of a "PhysX inside" feature test. None of the games supports the PhysX engine, as far as I can remember, and in my opinion this is the most valuable improvement in Nvidia's cores.

Thanks for the review and comments.
 