
G80 pictures reveal a second RAMDAC processor

zekrahminator

Pictures of the NVIDIA GeForce 8800GTS and 8800GTX show what looks like two graphics processors. However, the smaller second chip is dedicated to "NV I/O".
It turns out that the small chip on the GeForce 8800 cards is an NVIDIA NVIO chip. It provides dual 400 MHz RAMDACs, two dual-link DVI outputs, TV-out, and HDCP support. Our informative friends call the chip the External Video I/O Chip, or simply the external RAMDAC. We have no idea why NVIDIA needed it, as the RAMDAC has normally been part of the GPU itself for generations.
There are also specifications showing the power requirements for the NVIDIA GeForce 8800GTS and GTX. The 8800GTS will require 26 amps on the 12 volt rail, and the 8800GTX will require a whopping 30 amps on the 12 volt rail. The 8800GTX also has a second SLI bridge, which may link to a future physics solution from NVIDIA.
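For rough context, here is that rail spec converted into watts with P = I × V. A quick Python sketch; note these amperages are recommended minimums, not measured card draw, and (as posters point out below) such figures usually cover the whole system:

```python
# P = I * V: convert the quoted 12 V rail amperages into watts.
# These are recommended minimums, not measured card draw.

RAIL_VOLTAGE = 12  # volts

requirements_a = {
    "8800GTS": 26,  # amps on the 12 V rail
    "8800GTX": 30,
}

for card, amps in requirements_a.items():
    print(f"{card}: {amps} A x {RAIL_VOLTAGE} V = {amps * RAIL_VOLTAGE} W")

# 8800GTS: 26 A x 12 V = 312 W
# 8800GTX: 30 A x 12 V = 360 W
```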



 
30 amps?^^ What will quad SLI be then?
4x30 amps?
A bit much or what? And what CPU can feed four of these cards?
 
Will the OCZ GameXStream 600 watt SLI/CF be good enough to run one of these? It has quad 18 amp rails; I'm guessing it's just enough.

Edit: Thought I had posted this already. The four rails have a combined max of about 56 amps, so 18 + 18 = 36 leaves 20 amps for the system and maybe some room for an overclock (rough math sketched below).

I'll probably get one of these suckers in a year's time, hoping my 1900GT will be enough for now.
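A minimal sketch of that rail math, assuming (per the post above) four 18 A rails with a combined 12 V ceiling of roughly 56 A, and treating the card as fully loading the two rails feeding its plugs in the worst case:

```python
# Headroom estimate for a quad-rail PSU, using the figures from the post
# above: four 18 A rails, combined 12 V ceiling of roughly 56 A (assumption).
# Worst case: the card fully loads the two rails feeding its PCI-E plugs.

COMBINED_LIMIT_A = 56   # combined 12 V capacity, per the post
PER_RAIL_LIMIT_A = 18   # each of the four rails

card_budget_a = 2 * PER_RAIL_LIMIT_A        # two PCI-E plugs: 18 + 18 = 36 A
system_headroom_a = COMBINED_LIMIT_A - card_budget_a

print(f"Card budget: {card_budget_a} A on two rails")
print(f"Left for the rest of the system: {system_headroom_a} A")
# Card budget: 36 A on two rails
# Left for the rest of the system: 20 A
```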
 
Those are some power requirements.
 
That card's pretty damn big too.
 
That card's pretty damn big too.

lol yeah, they'll have to start certifying cases for the GeForce 8 series to let you know they can accommodate them :laugh:

I don't see the point in DX10 hardware at the moment, though; DX9 will still be supported for a long time, and first-generation hardware is always a bit iffy. I'll stick with my X1900XT until lots of games start coming out that would make having DX10 a real advantage.
 
The second SLI bridge is for SLI... Just like ATI, NVIDIA is moving to two-way transfer of information... Less stress on the PCI-E lanes means better performance system-wide...
 
Looks like I'll either need to take my hard drive cage out again or get a new case, and I'll definitely need a new PSU. Yay for black PCB though!
 
Quick question: 30 amps on the 12 V rail, but is that 15 amps per plug? 'Cause it's a two-plugger... I don't know much about the whole rail thing, I just want to know enough that it won't explode when I plug it in.
 
You guys are misunderstanding the amperage requirements. That's 30 A for the whole computer, not just the video card. And remember, they have to leave lots of room for tolerance, so you could probably get away with a nice 380 watt PSU for the whole system.
 
Can you say boom on a cheap PSU? It sounds like each card will need its own PSU.
 
You'll need one of those Thermaltake PurePower Express front drive bay thingies for each card! That'd be pretty sick, actually...
 
I have a few words to say:
BAH to DX10, Vista, and DX10 cards :shadedshu
:slap:
 
Have you forgotten that the X1900XT is 30 amps and CrossFire is 36 amps?
 
I am not switching until they fix this stupid high power consumption. Come on, that card will suck 360 watts of power; that's bull. They need to work on that. House breakers can take 15 amps, so that's 1800 watts at 120 volts. If they keep this up, we're going to need a dedicated breaker for the computer.
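Sketching out that breaker arithmetic (this assumes a typical North American 15 A / 120 V circuit and takes the 30 A / 12 V spec at face value as card draw, which other posters in the thread dispute):

```python
# Wall-circuit budget vs. the worst-case figures quoted in this thread.

BREAKER_A = 15   # typical North American household breaker
MAINS_V = 120    # mains voltage

circuit_w = BREAKER_A * MAINS_V   # 1800 W available per circuit
card_w = 30 * 12                  # 360 W if the 30 A / 12 V spec were all card

print(f"Circuit capacity: {circuit_w} W")
print(f"360 W loads per 15 A circuit: {circuit_w // card_w}")
# Circuit capacity: 1800 W
# 360 W loads per 15 A circuit: 5
```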
 
Roll on 2 kW PCs. 2x 8800GTX, OC'd 6800 Extreme. 2 kW power supply please lol.
 
Roll on 2 kW PCs. 2x 8800GTX, OC'd 6800 Extreme. 2 kW power supply please lol.

nah
3 kW

2x 8800GTX + physics
quad core
4 GB of DDR2
NVIDIA 680i
4 HDDs
X-Fi sound card
case with lots of fans

:toast:
 
I'd say Prescott instead of Quad core...
 
I'm not going to bother trying to convince you guys anymore. When it comes out, you can run it on your 1 kW PSU and I'll run it just fine on my 430 watt PSU.
 
Prescott Extreme Edition, OC'd...
 
Just when we get past CPU bottlenecks, bam: a GPU that will need more and more. It has two PSU connectors, weeeeeeee... So all you will need is a new case (size), a new PSU (volts/amps), and a faster, more powerful CPU. Sure, we can get by on what we have, but... OK, sell your car and your present PC and build a new one. Makes me wonder how much longer we can build our own, what with Vista's new rules and all :ohwell:
 
...meh, I don't think the man will ever be able to stop us from building our own PCs.
They might take away the OS that we have to use, but they'll never take our freedom to build our own custom cheap PCs.

Never EVER!!!!
Someone please second me!
 
I am not switching until they fix this stupid high power consumption. Come on, that card will suck 360 watts of power; that's bull. They need to work on that. House breakers can take 15 amps, so that's 1800 watts at 120 volts. If they keep this up, we're going to need a dedicated breaker for the computer.

Pay attention, class:

The 30 A power requirement is for the whole computer, not just the video card. The current high-end cards on the market have very similar requirements, so shut up if you don't know what you are talking about.
 
Pay attention, class:

The 30 A power requirement is for the whole computer, not just the video card. The current high-end cards on the market have very similar requirements, so shut up if you don't know what you are talking about.

I sure hope so. The only reason I'm not sure is that DX10 is known for hot parts, plus that second power dongle on the 8800GTX :p.
 