Wednesday, June 2nd 2010

Galaxy Readies Dual-Fermi Graphics Card

Galaxy is finally breaking ground on graphics cards that carry two NVIDIA GF100 "Fermi" GPUs, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has used for generations of its dual-GPU cards: an internal SLI link between the two GPUs, which connect to the system bus via an nForce 200 bridge chip. The card is Quad SLI capable.

Power conditioning and distribution on this design consists of two sets of 4+1 phase VRM, and the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates the configuration of the GeForce GTX 465. Since we count eight memory chips per GPU system, with no traces on the reverse side of the PCB indicating another two memory chips per GPU on their own memory channels, it is likely that each GPU has a 256-bit wide memory interface. Galaxy, however, calls the card GTX 470 Dual. Output connectivity includes three DVI-D connectors alongside a small air vent. It is likely that the cooler Galaxy designs for this card will dissipate hot air around the graphics card rather than out through the rear panel.
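As a rough sanity check on that bus-width estimate, here is a minimal sketch of the arithmetic, assuming standard GDDR5 devices with a 32-bit interface each (the chip type is an assumption on our part, not something Galaxy has confirmed):

# Back-of-the-envelope check: typical GDDR5 chips expose a 32-bit interface each,
# so the number of chips counted per GPU bounds the memory bus width.
BITS_PER_CHIP = 32      # common GDDR5 device width (assumption)
chips_per_gpu = 8       # memory chips counted per GPU on the PCB

bus_width = chips_per_gpu * BITS_PER_CHIP
print(f"{chips_per_gpu} chips x {BITS_PER_CHIP}-bit = {bus_width}-bit memory interface")
# 8 x 32-bit = 256-bit, consistent with a GTX 465-style configuration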
Source: HotHardware

105 Comments on Galaxy Readies Dual-Fermi Graphics Card

#26
kid41212003
Maybe somewhere between the 465 and the 470, because even with 465s in SLI it's not possible to beat the HD 5970.

I'm expecting something like 416x2 cores, 1GBx2 memory, and a GPU clock @ 650 MHz.
#27
DrPepper
The Doctor is in the house
poo417: Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. If you have the card on water at 100% GPU use, the GPU will use the same power minus the fan (which uses a LOT of power for a fan, around 20-ish watts or even more, can't remember from the water block reviews; it is a 1.1+ amp fan) as the card will on air. Yes, you may get a slight reduction in power consumption due to thermal dissipation and not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
It does make sense; we just don't understand why. I have proof that backs me up.

www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html
#28
douglatins
If this is a 470 card it will break PCI-E specs. A 400W+ card?
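As a rough sketch of the spec arithmetic behind that concern (the 75 W slot limit and 150 W per 8-pin connector are the usual PCI-Express electromechanical figures, not numbers taken from this article):

# Rough PCI-E power-budget math behind the "400 W+ breaks spec" worry.
SLOT_W = 75          # a PCI-E x16 slot can deliver up to 75 W
EIGHT_PIN_W = 150    # each 8-pin PCI-E connector is rated for 150 W

budget = SLOT_W + 2 * EIGHT_PIN_W   # the card pictured has two 8-pin connectors
print(f"Slot + 2 x 8-pin = {budget} W available")  # 375 W
# A 400 W+ card would exceed even that, never mind the spec's nominal 300 W per-card ceiling.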
wolf: Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX 465 to compete well against the 5970.
No, not really; SLI and CF scale pretty much the same, the differences come when one is not working properly.
newtekie1: I was going to come in and say exactly these two points.

These, based on the memory configuration, are definitely GTX465 cores, unless Galaxy hid 2 memory chips on the back of the card.

And yeah, two GTX480s perform the same as two HD5970s, so really two GTX465s would probably scale well enough to match a single HD5970.

And if these are GTX465s, then we are only looking at power consumption in the 300 W range at peak. That shouldn't be too big of an issue considering it is only about 20 W beyond what the HD4870X2 ran at, and they were fine.



It might not make any sense to you, but W1z's latest review of the GTX480 proves it. Fermi uses less power when it is cooler. The fan speed didn't affect the card's power consumption; in fact, in his tests, when he lowered the fan speed via software, power consumption went up because the card was running hotter.

Now, we aren't sure if that is because the VRM area is getting hotter and less efficient, or if the GPU itself is less efficient due to voltage leakage, or a combination of both. However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power. In fact it is almost an exact linear progression: for every 1°C hotter the card runs, it needs 1.2 W more power.
Try getting 40K in Vantage with 2 480s like 2 5970s can.

Also, I pretty much think the Zotac 480 consumes less because of the different fans, and partly because of the heat thing.
#29
newtekie1
Semi-Retired Folder
douglatins: Try getting 40K in Vantage with 2 480s like 2 5970s can.

Also, I pretty much think the Zotac 480 consumes less because of the different fans, and partly because of the heat thing.
I really don't care about Vantage scores, or any synthetic benchmark.

All you need to look at is:


HD5970 CrossFire = 12% better than a single HD5970
GTX480 SLI = 13% better than a single HD5970
Overall, which is the important thing, not just on a single synthetic benchmark.



If you just want to focus on a high resolution (and let's face it, no one buying this configuration is running it at 1024x768 :laugh:):
HD5970 CrossFire = 19% better than a single HD5970
GTX480 SLI = 18% better than a single HD5970
#30
DaJMasta
All your watts are belong to us.
#31
Fourstaff
I am surprised no one has congratulated Galaxy for undertaking this unprecedented challenge. Good luck, Galaxy, and I hope you can come up with something not named barbecue pit! :toast:
#32
trt740
Tatty_One: Hmmmm, with this setup and the hot air dissipated inside, I could turn my case into a nice little hibernation chamber for all sorts of little furry creatures :)
Don't be so sure, the GTX 465 does not run nearly as hot as a GTX 480.
#33
_JP_
Fourstaff: I am surprised no one has congratulated Galaxy for undertaking this unprecedented challenge. Good luck, Galaxy, and I hope you can come up with something not named barbecue pit! :toast:
I think by now the only thing to cheer for is for them to continue development.
I'll give my congratulations when it's done and working. :)

And it won't be a "barbecue pit" if it gets a water cooling kit. :D
#34
poo417
DrPepper: It does make sense; we just don't understand why. I have proof that backs me up.

www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html
Yeah, I know, I read Wizz's review of that. That card is using a different BIOS at the moment and we still don't know what they have done to reduce power use and temps (magic voodoo dust, I think :toast:). Sadly, we can't get current, voltage and power readings from all the relevant parts of the card to see what has been changed. As I said, it does not make sense that there would be a massive drop, and Wizz's graph shows that. I don't call 20 W a lot when we are talking about a 300 W card.

I know that any electronic part becomes less efficient the hotter it gets (Intel P4s, anyone? :laugh:), and I knew from Wizz's review that he had found that with that card. There are other reviews that have found that even running a liquid cooling system and block on a 480, it still uses fewer amps and watts.

www.guru3d.com/article/geforce-gtx-480-liquid-cooling-danger-den-review/9

Hilbert's water block review shows this too. The fan design on the 400s may well be pulling a lot of power even when spinning slowly; it depends on how it is used, I have no idea (about many things, it seems).

I think it is fairly simple: 2 x 480s are a good bit faster than a 5970, even when it is overclocked to 5870 speeds. However, the 495 could struggle at the resolutions that people who buy it would be looking for, 2560x1600 or Eyefinity/Surround, as the 465 and 470 seem to struggle a bit at that type of resolution at the moment. That all may change. I just want to see what the 480s are like at 6000x1080 so I can make my bloody mind up about buying them. :D

Please feel free to call me an idiot; it has been over 20 years since I have done anything to do with P=IV or, as some smart people say...
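For what it's worth, a minimal back-of-the-envelope sketch of that P = I x V relation, using the roughly 1.1 A fan figure quoted earlier in the thread (the 12 V fan rail is an assumption, not a measured value):

# Fan power from P = I * V.
FAN_CURRENT_A = 1.1   # the ~1.1+ A fan mentioned above
FAN_RAIL_V = 12.0     # assumed fan rail voltage

fan_power_w = FAN_CURRENT_A * FAN_RAIL_V
print(f"~{fan_power_w:.1f} W just for the stock fan at full tilt")  # ~13 W
# A bit less than the 20-ish W figure thrown around above, but still a real chunk of the total.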
#35
crow1001
Meh, who the f*** is Galaxy? Some poor man's brand trying to make some waves in the market with its make-believe dual-Fermi solution. I call bull; the card will never see the light of day, Galaxy is just trying to get some media attention. :roll:
#36
Bjorn_Of_Iceland
FordGT90Concept: Are they mad? They can barely cool one, never mind two. Fermi would need a die shrink before it is even reasonable to put two on one card.
Tell that to the people making the 470 a single slot. o_0
#37
EastCoasthandle
crow1001: Meh, who the f*** is Galaxy? Some poor man's brand trying to make some waves in the market with its make-believe dual-Fermi solution. I call bull; the card will never see the light of day, Galaxy is just trying to get some media attention. :roll:
That's a pretty bold claim there. I guess we will see soon enough.
#38
a_ump
That is a bold claim. Every market segment has to be filled by someone, and Galaxy fills the role of cheaper products compared to the norm. That doesn't make their products POS though; grab a Galaxy graphics card and then the same card under a different brand (EVGA, XFX, etc.) and performance will be the same. The difference is in the components used to construct the card, which honestly only matters when it comes to overclocking.
#39
lyndonguitar
I play games
wow, with this card, your rig could fly!
#40
tkpenalty
I love how Galaxy always puts out cards that make everyone go WTF :rockout:

EDIT:
crow1001: Meh, who the f*** is Galaxy? Some poor man's brand trying to make some waves in the market with its make-believe dual-Fermi solution. I call bull; the card will never see the light of day, Galaxy is just trying to get some media attention. :roll:
They've been in the market much longer than you have stuck your head into basic hardware IT. They usually put out much-improved non-reference versions of cards addressing issues such as VRM cooling, OC editions, etc.

EDIT: Galaxy products are cheaper because they tend to simplify the circuitry, thereby saving power and money at no cost to performance, usually with some innovative feature like this: www.galaxytech.com/en/productview.aspx?id=278
#41
newtekie1
Semi-Retired Folder
poo417: Yeah, I know, I read Wizz's review of that. That card is using a different BIOS at the moment and we still don't know what they have done to reduce power use and temps (magic voodoo dust, I think :toast:)
You say you've read the review, then one sentence later say "we still don't know what they've done to reduce power use and temps".

Are you blind? Did you miss this gigantic 3-slot beast sitting on top of the card? Yeah, they must have used "magic voodoo dust" to lower the temps... :rolleyes:

Oh, and we know exactly what they have done to reduce the power use: lowered the temperatures. How do we know? Because when the fans are artificially slowed down, the card gets hotter, and it consumes more power. No magic voodoo dust, just hotter = more power.
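To put a number on that, here is a minimal sketch of the roughly linear temperature-to-power relationship described earlier in the thread (about 1.2 W per extra °C); the 250 W at 70°C baseline is a made-up reference point for illustration, not measured data:

# Rough linear model of "hotter = more power" for Fermi, per the ~1.2 W/°C figure above.
def estimated_board_power(temp_c, base_power_w=250.0, base_temp_c=70.0, watts_per_degc=1.2):
    # Every extra degree C of GPU temperature adds roughly 1.2 W of draw.
    return base_power_w + watts_per_degc * (temp_c - base_temp_c)

for t in (60, 70, 80, 90):
    print(f"{t} °C -> ~{estimated_board_power(t):.0f} W")
# 60 -> ~238 W, 70 -> ~250 W, 80 -> ~262 W, 90 -> ~274 W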
#42
mdsx1950
Can't wait to see some reviews on this card. :)
#43
Unregistered
The electric companies should put some bonus coupons in the box with these.
#44
newfellow
And we've got a new grill in town; now where are the steaks!

@Roph

hehe, good1
#46
mdsx1950
I honestly doubt this card will beat the HD 5970. :shadedshu
#47
theonedub
habe fidem
I think more manufacturers should do what this card does and break the width specs of normal cards instead of making them longer. It would make them easier on cases and give more room for larger heatsinks. (It is wider, isn't it? :))
#48
Tatty_Two
Gone Fishing
trt740: Don't be so sure, the GTX 465 does not run nearly as hot as a GTX 480.
Clearly, but two of them glued together with the hot air flowing inside the case is going to make temps difficult for those with the more "enclosed" type of cases. Am I one of the only ones who thinks, though, that from a performance perspective this might actually be a good thing? In reality it "should" be cheaper than the 5970, possibly slower, yes, but who knows; this might be just the first dual-GPU offering that we will see from them this year... To me choice is good, we shouldn't knock it.
#49
phanbuey
465s are weak, though... they are well behind a 5850, and 5850 CF is basically a 5970... This might match it in some tests, but this is no 5970 killer.
#50
GenTarkin
Yeah, this isn't gonna take off... their dual GTS 250 seems to have died...

I don't see the GTS 250 dual-GPU card they announced a few months back yet... so why would this card be any different?!?! lol