
ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.

Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
The sheer size of the G200 won't allow for a GX2 or whatever. The heat that two of those things produce would burn each other out. Why in the hell would NV put two of them on one card when it costs an arm & a leg just to make one? The price/performance ratio for this card is BS too, when $400 worth of cards, whether it be the 9800GX2 or two 4850s, is not only in the same league as the beast but allegedly beats it. The G200b won't be any different either. NV may be putting all their cash into this giant chip ATM, but that doesn't mean that they're going to do anything stupid with it.

If the 4870X2 & the 4850X2 are both faster than the GTX280 & cost a whole lot less, then I don't see what the problem is, except for people crying about the two-GPU mess. As long as it's fast & DOESN'T cost a bajillion bucks, I'm game.

I agree with your view.

GT200s, well, Nvidia are shaving down their profits just to get these things to sell. AMD, on the other hand, enjoy not having to reinforce their cards and put high-end air cooling on them; they are way better off. If these 4850s sell well, along with the rest of the RV770 line, the GT200 looks like an awful flop.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
On which facts are you guys basing your claims that a die shrink won't do anything to help lower heat output and power consumption?
A 65nm single GPU requires a 6-pin + 8-pin power input (obviously for a higher power draw at peak). How much of that input can be reduced with a die shrink to 55nm? Enough to make a GX2? Without, say, three 6-pin connectors?
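
For reference on that last question, here is a minimal sketch of the board-power arithmetic; the per-connector figures are the nominal PCI Express limits, nothing GT200-specific:

```python
# Rough PCIe board-power budget, per the nominal connector limits:
# the slot itself, a 6-pin PEG connector and an 8-pin PEG connector.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

def board_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum board power for a given power-connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(1, 1))  # 6+8 pin layout         -> 300 W
print(board_budget(3, 0))  # three 6-pin connectors -> 300 W as well
print(board_budget(2, 1))  # 6+6+8 layout           -> 375 W
```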
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.22/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
On which facts are you guys basing your claims that a die shrink won't do anything to help lower heat output and power consumption? It has always helped A LOT. It is helping ATI and will surely help Nvidia. Thinking that the lower power consumption of RV670 and RV770 is based on architecture enhancements alone is naive. I'm talking about peak power; in comparison to what R600 was, idle power WAS improved indeed, and so has GT200's.

Of course it'll lower the heat & power. The only point was that it still wouldn't allow for a GX2, not to mention that the card would cost around $1200 :wtf: :shadedshu

However, a faster & cheaper 400mm² die does have EVERY advantage over a slower, more costly 576mm² die.
 
Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
On which facts are you guys basing your claims that a die shrink won't do anything to help lower heat output and power consumption? It has always helped A LOT. It is helping ATI and will surely help Nvidia. Thinking that the lower power consumption of RV670 and RV770 is based on architecture enhancements alone is naive. I'm talking about peak power, in comparison to what R600 was against its competition; idle power WAS improved indeed, and so has GT200's.

90nm > 65nm (R600 to RV670) was a HUGE leap in transistor size; moreover, remember that RV670 also had a massive chunk of the chip effectively removed: half of the 512-bit memory controller.

Die shrinks do something, but below around 65nm the usefulness of die shrinking isn't really significant. Nvidia's CEO admitted that die-shrinking the GTX280 wouldn't help its extreme heat output a lot. It's fairly reasonable as to why: transistor count is more of a factor. In all cases, G80 > G92 and R600 > RV670 came down to cutting down the memory controller.

By the way, the reason why AMD's cards use more power is simple: their cards use more power phases than Nvidia's. More phases = more power used overall, but each phase is subject to less current and generates less heat.
 
Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
I agree with your view.

GT200s, well, Nvidia are shaving down their profits just to get these things to sell. AMD, on the other hand, enjoy not having to reinforce their cards and put high-end air cooling on them; they are way better off. If these 4850s sell well, along with the rest of the RV770 line, the GT200 looks like an awful flop.


Ya because ATi's been looooooving the way things have turned out the last two and a half years.

Yep, they don't have to put 'high end air cooling' on their products, what a wonderful relief for them!

~
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.22/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
Ya because ATi's been looooooving the way things have turned out the last two and a half years.

Yep, they don't have to put 'high end air cooling' on their products, what a wonderful relief for them!

~

Not to be negative or anything, but none of the stock cards from NV or ATi have high-end cooling fans. The stock casing only restricts most of the fans anyway :(
 

Kreij

Senior Monkey Moderator
Joined
Feb 6, 2007
Messages
13,817 (2.21/day)
Location
Cheeseland (Wisconsin, USA)
The head of ATI Technologies claims that the recently introduced NVIDIA GeForce GTX 200 GPU will be the last monolithic “megachip” because they are simply too expensive to manufacture. The statement was made after NVIDIA executives vowed to keep producing large single-chip GPUs. The size of the G200 GPU is about 600 mm², which means only about 97 can fit on a 300mm wafer costing thousands of dollars.


Why are the wafers limited to 300mm? Can't they use a 600mm wafer and get four times the processors out of it?
Is it just because all the FABs are set up to use that size or is there some kind of physical limit?
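
As a rough cross-check of the "about 97" figure quoted above, the usual dies-per-wafer approximation lands in the same ballpark; the sketch below assumes the 576 mm² die size mentioned earlier in the thread (the news post rounds it to ~600 mm²):

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Common approximation: usable wafer area divided by die area,
    minus an edge-loss term for partial dies along the circumference."""
    r = wafer_diameter_mm / 2
    return round(math.pi * r ** 2 / die_area_mm2
                 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# A ~576 mm^2 die on a 300 mm wafer -> roughly 95 candidate dies,
# close to the ~97 quoted in the news post.
print(gross_dies_per_wafer(300, 576))
```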
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
Not to be negative or anything, but none of the stock cards from NV or ATi have high-end cooling fans. The stock casing only restricts most of the fans anyway :(


orly?
I'm sure I read somewhere about the GTX 280 cooler being designed by CM or something.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
IMO a die shrink to 55nm could open up the possibility of doing a GX2. Maybe not a GTX280 GX2, but one with slightly lower clocks or one cluster disabled, and with enough performance to crunch the X2. Of course it would require more power, but it would still be within the 6+8-pin envelope. I have three "facts", though of course they are only based on my opinion:

1- You have to take into account how power consumption works. It doesn't scale linearly with clocks and voltage, it scales much faster than that, so a slower part would consume a lot less, and the same applies to voltage (see the quick sketch after this post). Because GT200 turned out worse than expected in this area, Nvidia had to lower the clocks, but they have probably kept them as high as possible within the selected power envelope. There's always a sweet spot for performance-per-watt on any chip, and the GTX 280 is probably clocked well above that spot. FACT: look at Wizzard's Zotac AMP! GTX280 review; it consumes a lot more than you would expect from that overclock. Aim a bit below that sweet spot and you have a "low power" chip. For example, a GTX280 GX2 @ 500 MHz would consume a lot less and still leave the HD4870 X2 behind in performance.

2- Nvidia has implemented the ability to shut down parts of the chip in the GT200, and it really works very well. Again, look at Wizzard's power-consumption charts and how it consumes a lot less than the X2 on average, even though its maximum is almost the same. A GX2 card would probably never reach its maximum theoretical power consumption. There's no way you are going to be able to make a total of 64 ROPs work at the same time, for example.

3- Continuing the argument above, IMO if Nvidia did a GX2 it wouldn't be based on the GTX 280, but on the 8800 GS's successor. Nvidia will surely make a 16/20/24-ROP card while maintaining a high shader count (maybe 192/168 SP, the same or one fewer cluster than the GTX260, for example); they would be stupid not to, as it makes more sense than ever. The GS is "weak" because it has 12 ROPs, but 16, on the other hand, are enough for high-def gaming. 16 ROPs x 2 is more than enough, as the X2/GX2 can testify; 32 x 2 is just over-over-overkill and silly.
My bet is that Nvidia will do a 20-ROP, 168/192-SP card for the upper mainstream no matter what, and they could use that for the GX2. Final specs for that hypothetical GX2 would be: 40 ROPs, 336/384 SP, 112/128 TMUs and 2 x 320-bit memory controllers, assuming they can't make the card use a shared memory pool the way the R700 seems set to do. The above card would leave the X2 well behind performance-wise and still be within the power envelope IMO. Of course that envelope would be higher than the X2's, but reachable IMHO, and still within the 6+8-pin layout's 300W; the GTX 280 needs 6+8 pins only by a hair.
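
On point 1, here is a minimal sketch of the usual dynamic-power model, P ≈ C·V²·f; the clocks and voltages below are illustrative assumptions, not GT200 measurements:

```python
# Relative dynamic power: P is roughly proportional to C * V^2 * f.
# The operating points below are illustrative assumptions, not measured data.

def dynamic_power(freq_mhz: float, voltage: float) -> float:
    """Relative dynamic power for a given core clock and voltage (C folded into the unit)."""
    return voltage ** 2 * freq_mhz

full_speed  = dynamic_power(602, 1.18)  # GTX 280-like operating point (assumed voltage)
downclocked = dynamic_power(500, 1.06)  # hypothetical 500 MHz GX2 chip at reduced voltage

ratio = downclocked / full_speed
print(f"one downclocked chip:  {ratio:.2f}x the power")     # ~0.67x
print(f"two downclocked chips: {2 * ratio:.2f}x the power") # ~1.34x rather than 2x
```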
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Why are the wafers limited to 300mm? Can't they use a 600mm wafer and get four times the processors out of it?
Is it just because all the FABs are set up to use that size or is there some kind of physical limit?

Exactly. Wafer size is to fabs and chip manufacturers what CPU sockets are to motherboard and CPU makers (in terms of compatibility), or the ATX standard if you prefer. 450mm wafers are on track already, BTW; 600mm is too much to handle right now.

EDIT: And yes, there is some physical limit too. Keep in mind that wafers are made by slicing silicon ingots to a very thin width (less than 1 mm IIRC) and they have to maintain the same thickness over their entire area. On top of that, the silicon has to be homogeneous throughout the whole wafer too.
 
Last edited:

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Exactly. Wafer size is to fabs and chip manufacturers what CPU sockets are to motherboard and CPU makers (in terms of compatibility), or the ATX standard if you prefer. 450mm wafers are on track already, BTW; 600mm is too much to handle right now.

That's because of yields. The bigger the wafer, the more you lose when a wafer fails. Keeping wafer sizes limited is a precautionary measure (while compromising on manufacturing expenditure).
 

Nyte

New Member
Joined
Jan 11, 2005
Messages
181 (0.03/day)
Location
Toronto ON
Processor i7 x980
Motherboard Asus SuperComputer
Memory 3x 2GB Triple Channel
Video Card(s) 2x Tahiti
Storage 2x OCZ SSD
Display(s) 23 inch
Power Supply 1 kW
Software Win 7 Ultimate
Benchmark Scores Very high!
90nm > 65nm (R600 to RV670) was a HUGE leap in transistor size; moreover, remember that RV670 also had a massive chunk of the chip effectively removed: half of the 512-bit memory controller.

Die shrinks do something, but below around 65nm the usefulness of die shrinking isn't really significant. Nvidia's CEO admitted that die-shrinking the GTX280 wouldn't help its extreme heat output a lot. It's fairly reasonable as to why: transistor count is more of a factor. In all cases, G80 > G92 and R600 > RV670 came down to cutting down the memory controller.

By the way, the reason why AMD's cards use more power is simple: their cards use more power phases than Nvidia's. More phases = more power used overall, but each phase is subject to less current and generates less heat.

RV670/RV620/RV635 = 55 nm
RV630/RV610 = 65 nm
 
D

Deleted member 24505

Guest
What's the average yield for a 300mm wafer then? Does it differ between manufacturers, or is it totally dependent on the size of the wafer?
 

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
What's the average yield for a 300mm wafer then? Does it differ between manufacturers, or is it totally dependent on the size of the wafer?

It depends on the die size: the bigger the die, the fewer units a wafer yields.

That's why ATI is ahead of nVidia (in this respect, atm): they manage to make their die size much smaller than nVidia's.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
What's the average yield for a 300mm wafer then? Does it differ between manufacturers, or is it totally dependent on the size of the wafer?

Of course, articles from The Inquirer are usually full of it, but one such article mentioned that on a 300mm wafer, GT200 yields could be as low as 40%. Somewhere else it was said that the die costs $110 to manufacture, and assembly into the package (package as in electronics, not logistics) sends the cost up to $120. By increasing wafer sizes, you're increasing the risk of yield loss.
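
Putting those numbers together, here is a back-of-the-envelope sketch; the ~95 gross dies come from the dies-per-wafer estimate earlier in the thread, the 40% yield is the Inquirer figure quoted above, and the $5,000 wafer cost is purely an assumption (the news post only says "thousands of dollars"):

```python
# Back-of-the-envelope cost per good die. All inputs are rough figures:
# ~95 gross dies per wafer, 40% yield (The Inquirer), and an ASSUMED $5,000 wafer cost.

gross_dies = 95
yield_rate = 0.40
wafer_cost_usd = 5000  # assumption, not a reported figure

good_dies = gross_dies * yield_rate
cost_per_good_die = wafer_cost_usd / good_dies

print(f"good dies per wafer: {good_dies:.0f}")           # ~38
print(f"cost per good die:   ${cost_per_good_die:.0f}")  # ~$132, near the quoted $110-120
```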
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
That's because of yields. The bigger the wafer, the more you lose when a wafer fails. Keeping wafer sizes limited is a precautionary measure (while compromising on manufacturing expenditure).

Yeah, that's what I wanted to say when I said physical limit, as there is no absolute physical limit for that.
Also, because bigger wafers are possible, I said it works like a standard. Nvidia would surely want bigger wafers for the GT200 at the expense of wafer yields, because the loss in those yields would probably be smaller than the gains in die yields, but since it's like a standard they can't. I don't know if I have explained that well.

EDIT: Also, I highly doubt those Inquirer yield numbers. They are probably in the high 40s, that's what The Inquirer was told, and they just slapped that 40% number on it. That number also only seems extremely low if you don't know how high other GPU yields are. They are probably never higher than 75%, and much lower for new high-end chips, RV770 for example. The difference between, say, 60% and 50% is already very big.
 
Last edited:
Joined
Feb 12, 2007
Messages
1,192 (0.19/day)
Location
scotland
System Name spuds K8-X2
Processor amd athlon X2 4200+ toledo s939 2794mhz 254x11 1.4 vcore
Motherboard MSI K8N Neo4-F v1.0 (MS-7125) nforce4 sata2 mod, laptop cpu heatpipe copper nb cooler
Cooling akasa evo "blue" + 90mm fan, 2x120mm front, 250mm side, 120mm rear, 120mm in psu, pci slot exhaust.
Memory OCZ Platinum XTC DDR PC3200 4GB(4x1024) @254mhz 3-3-3-8 2T
Video Card(s) sapphire HD3870 512mb GDDR4 vf900cu, several ramsinks on components / nvidia 7300gt 256mb secondary
Storage hitachi 160gb (slightly fried) / hitachi 120gb ATA / Seagate 160gb / 2x ps3 seagate 60gb
Display(s) CTX EX1300F 20" flat CRT, 1280x1024@100hz / 19" benq FP91G X / 19" hanns-g (all free)
Case mesh server/gaming black case, 9x 5.25' drive bays, silvestone auto fan controller
Audio Device(s) onboard realtek alc850 7.1/soundblaster LIVE! ct4780 + kxaudio - sony home theatre surround
Power Supply winpower 650w, system draws around 470-500w under load(+all screens)
Software win7 64bit
Benchmark Scores ~16m trips/sec using mty trip generator. triple monitor gaming using SoftTH. 3840x1024
So when Nvidia see this, they say "o rly?"
The next gfx card will be 2 PCBs: one for the GPU, the other for the rest of the components :D
 
Joined
Oct 6, 2005
Messages
10,242 (1.52/day)
Location
Granite Bay, CA
System Name Big Devil
Processor Intel Core i5-2500K
Motherboard ECS P67H2-A2
Cooling XSPC Rasa | Black Ice GT Stealth 240 | XSPC X2O 750 | 2x ACF12PWM | PrimoChill White 7/16"
Memory 2x4GB Corsair Vengeance LP Arctic White 1600MHz CL9
Video Card(s) EVGA GTX 780 ACX SC
Storage Intel 520 Series 180GB + WD 1TB Blue
Display(s) HP ZR30W 30" 2650x1600 IPS
Case Corsair 600T SE
Audio Device(s) Xonar Essence STX | Sennheisser PC350 "Hero" Modded | Corsair SP2500
Power Supply ABS SL 1050W (Enermax Revolution Rebadge)
Software Windows 8.1 x64 Pro w/ Media Center
Benchmark Scores Ducky Year of the Snake w/ Cherry MX Browns & Year of the Tiger PBT Keycaps | Razer Deathadder Black
^ THAT was a good laugh. It wouldn't surprise me...
 
Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
So when Nvidia see this, they say "o rly?"
The next gfx card will be 2 PCBs: one for the GPU, the other for the rest of the components :D

Explain the logic behind that. You're only driving manufacturing prices up by doing that; ever heard of multi-layered PCBs, man?...

HD4850 > 9800GTX by 25% according to AMD; this is fairly believable.

A dual GTX280 is technically impossible within a two-slot card. Why? 65nm to 55nm doesn't bring much of a change in TDP! Nvidia's CEO even admitted it; do I have to repeat this? A GX2 would be viable with, say, a GT200 variant that is similar to the G92 in die size. It was mentioned that a die shrink would only drop the GTX280's heat output down to what, around 200W, which is still ridiculously high (400W+ for a GX2). Who cares about idle when the card is ridiculously hot at load?

Nvidia really shot themselves in the foot; powerful as the GTX280 is, the HD4870X2 will be the more successful product.
 
Joined
Feb 12, 2007
Messages
1,192 (0.19/day)
Location
scotland
System Name spuds K8-X2
Processor amd athlon X2 4200+ toledo s939 2794mhz 254x11 1.4 vcore
Motherboard MSI K8N Neo4-F v1.0 (MS-7125) nforce4 sata2 mod, laptop cpu heatpipe copper nb cooler
Cooling akasa evo "blue" + 90mm fan, 2x120mm front, 250mm side, 120mm rear, 120mm in psu, pci slot exhaust.
Memory OCZ Platinum XTC DDR PC3200 4GB(4x1024) @254mhz 3-3-3-8 2T
Video Card(s) sapphire HD3870 512mb GDDR4 vf900cu, several ramsinks on components / nvidia 7300gt 256mb secondary
Storage hitachi 160gb (slightly fried) / hitachi 120gb ATA / Seagate 160gb / 2x ps3 seagate 60gb
Display(s) CTX EX1300F 20" flat CRT, 1280x1024@100hz / 19" benq FP91G X / 19" hanns-g (all free)
Case mesh server/gaming black case, 9x 5.25' drive bays, silvestone auto fan controller
Audio Device(s) onboard realtek alc850 7.1/soundblaster LIVE! ct4780 + kxaudio - sony home theatre surround
Power Supply winpower 650w, system draws around 470-500w under load(+all screens)
Software win7 64bit
Benchmark Scores ~16m trips/sec using mty trip generator. triple monitor gaming using SoftTH. 3840x1024
The logic is "taking the piss"; where's your sense of humour?
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.22/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
Oh, um it broke when my cousin dropped my guitar.

This is serious.

The only thing that's serious about it is how NV bet the farm on this thing. I'll be collecting that farm when I buy my 4870x2 :p
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-fi Plantinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.


gtx 200 series gpu (from what I've found so far)

A wafer from the 4800 series gpu will offer a whole lot more. However, I haven't found one yet. Anyone have a 55nm wafer pic?
 
Last edited: