Wednesday, June 2nd 2010

Galaxy Readies Dual-Fermi Graphics Card

Galaxy is finally breaking ground on a graphics card with two GF100 "Fermi" GPUs from NVIDIA, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has used for generations of its dual-GPU cards: an internal SLI link between the two GPUs, which connect to the system bus via an nForce 200 bridge chip, with Quad SLI capability.

The power conditioning and distribution on this design consists of two sets of 4+1 phase VRM, and the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates the configuration of the GeForce GTX 465, and since we count eight memory chips per GPU, with no traces on the reverse side of the PCB for the two additional chips per GPU that would sit on their own memory channels, it is likely that the GPUs have a 256-bit wide memory interface. Galaxy, however, calls the card the GTX 470 Dual. Output connectivity includes three DVI-D connectors, alongside a small air vent. It is likely that the cooler Galaxy designs will dissipate hot air around the graphics card, rather than out through the rear panel.
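The chip-counting inference above can be expressed as a quick calculation: GDDR5 memory chips on these boards each sit on a 32-bit channel, so the visible chip count hints at the bus width. A minimal illustrative sketch (the helper is ours, not from the article):

```python
# Each GDDR5 memory chip occupies a 32-bit channel, so counting chips
# per GPU gives the memory bus width. Illustrative helper, not a tool
# referenced in the article.

BITS_PER_CHIP = 32  # standard GDDR5 per-device interface width

def bus_width(chips_per_gpu: int) -> int:
    """Memory interface width implied by the number of chips per GPU."""
    return chips_per_gpu * BITS_PER_CHIP

print(bus_width(8))   # 256-bit, matching the GTX 465 configuration
print(bus_width(10))  # 320-bit, as on a full GTX 470
```

With only eight chips per GPU and no traces for a further two, the 256-bit GTX 465 configuration is the natural reading.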
Source: HotHardware

105 Comments on Galaxy Readies Dual-Fermi Graphics Card

#51
Unregistered
**Generic comment about how much power this will use and how hot it will run**
#52
_JP_
indybird**Generic comment about how much power this will use and how hot it will run**
**Generic comment on what's been said here about the power consumption regarding the heat**

**Generic EDIT to add the fact that it will probably have a massive cooler and thus being able to cool properly, regardless of how many slot it should take, in order to be retailed**
Posted on Reply
#53
erixx
Like I said on the Galaxy 'space' video card thread: at least these guys show some innovation!!!!!
Posted on Reply
#54
crow1001
Galaxy produced the below GFX card...nuff said...:laugh:

Posted on Reply
#55
theorw
Any price yet? Who bets it's over $700?
Posted on Reply
#56
Tatty_Two
Gone Fishing
theorwAny price yet? Who bets it's over $700?
What, for two 465s? Naaaa, a good benchmark tends to be twice the retail price of a single 465, minus about 10-15%.
Posted on Reply
#57
_JP_
theorwAny price yet? Who bets it's over $700?
My guess is around €450 to €500, because it's not supposed to top an HD 5970 (I guess), so it shouldn't cost as much.
In the worst case, maybe €600, but that would be pushing it (even knowing it is nVidia).
Posted on Reply
#58
JayliN
newtekie1And yeah, two GTX480s perform the same as two HD5970s, so really two GTX465s would probably scale well enough to match a single HD5970.
newtekie1If you just want to focus on a high resolution (and let's face it, no one buying this configuration is running it on 1024x768:laugh:):
HD5970 Crossfire=19% Better than a single HD5970
GTX480 SLI=18% Better than a single HD5970
I'm lost as to what point you're trying to make. Where have you proven that two gtx465s will match a single 5970?

All your graphs have shown is that 2x GTX480 in SLI is faster than a single 5970, which is a no-brainer, since the GTX480 is 11% faster than a 5870 and a 5970 is actually 2x downclocked 5870s in crossfire.

It's no secret that GPUs don't scale well beyond 2. It doesn't matter whether it's crossfire or sli. To imply that crossfire is inferior to sli by comparing benchmarks taken from how 2 ati gpus scale to 4 and how 1 nvidia gpu scales to 2 is nonsense.

If you think 2 465s (which is ~37% slower than a 480 and ~32% slower than a 5870 at 2560x1600) is going to match a 5970, you're going to be surprised.

For your consideration: www.guru3d.com/article/geforce-gtx-465-sli-review/12
Posted on Reply
#59
naram-sin
Sorry for this, but F equals FAIL... or FERMI... whichever comes to mind (or situation) first... and it's an epic fail... Buahahahahahahaah!!! :)

Although, I hope they have a nice comeback, because we don't need another Intel/MS here... prices should go DOWN! :)
Posted on Reply
#60
OneCool
LAME!!!!!!!!!!!!!

If you can't bring in the big guns on a dual GPU card it = FAIL in my book.


They should have used crippled 480s :shadedshu
Posted on Reply
#61
the54thvoid
Intoxicated Moderator
How on earth does the card being hotter mean it uses more power? In physics, heat is a by-product of current; essentially it is waste energy (unless heat is what you are after). Therefore, does it not follow that because the card is drawing so much power, it is getting hotter?
The more cycles a processor performs the hotter it becomes because of the extra 'power' required to do the extra cycles. This is why liquid gas (N2) cooling is used on OC records. The processor does not consume more power because it is hot - it is hot because it consumes more power.

It is a bloody rule of electrical power - that which requires more power becomes hotter. Heat does not generate power.

"However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power"

This is not correct. What is really happening is that so much heat is being lost by an inefficient design that more power has to be pumped in to perform the given task.* If the system is more efficient, as W1zz's review sample must be, it loses less heat and therefore does not require more power. So, to be more accurate:

"Temperature (heat) loss is what causes Fermi to consume so much power"

Well, that and the fact it requires a lot more power in the first place.

*i.e. if I need 100 units (of whatever) to perform an operation and I have a 100% efficient system, I only need to draw 100 units. However, if my system is inefficient, say 75%, then I lose 25 units as heat (or light, or sound). This leaves me with 75 units, which is not enough for the operation, so I have to draw 25 more. My total draw for a 100 unit task is now 125 units because of my 25 unit heat loss. Well done if you follow that!
Posted on Reply
#62
Fourstaff
the54thvoidHow on earth does the card being hotter mean it uses more power? In physics, heat is a by-product of current; essentially it is waste energy (unless heat is what you are after). Therefore, does it not follow that because the card is drawing so much power, it is getting hotter?
The more cycles a processor performs the hotter it becomes because of the extra 'power' required to do the extra cycles. This is why liquid gas (N2) cooling is used on OC records. The processor does not consume more power because it is hot - it is hot because it consumes more power.

It is a bloody rule of electrical power - that which requires more power becomes hotter. Heat does not generate power.

"However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power"

This is nonsense. Fermi consumes so much power because of the architecture and the amount of stuff crammed onto the chip. It does so much processing it requires so much more power and therefore GENERATES more heat.
I hope your physics teacher told you that hot objects have higher resistance, therefore to supply the same amount of current, more power is needed. Hence, the thermal runaway we see in Fermi.
Posted on Reply
#63
Kantastic
the54thvoid*i.e. if i need 100 units (of whatever) to perform an operation and i have a 100% efficient system, i only need to draw 100 units. However if my system is innefficient, say 75%, then i lose 25 units as heat (or light or sound). This leaves me with 75 units which is not enough for the operation, so i have to draw 25 more. My total draw for a 100 unit task is now 125 units because of my 25 unit heat loss. Well done if you follow that!
If you're losing 25% of your units then you will need 133.33 units in order to maintain a stable 100.
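The correction here amounts to dividing by efficiency rather than adding the loss once: the extra units you draw also suffer the same percentage loss. A minimal sketch, using the thread's hypothetical 100-unit task rather than any measured card data:

```python
# Total power that must be drawn so that, after efficiency losses,
# the useful output still covers the task. The 100-unit / 75% numbers
# are the thread's hypothetical example, not measured figures.

def required_draw(task_units: float, efficiency: float) -> float:
    """Units to draw so that efficiency * draw == task_units."""
    return task_units / efficiency

print(required_draw(100, 1.00))  # 100.0 at perfect efficiency
print(required_draw(100, 0.75))  # 133.33...: drawing only 125 would
                                 # deliver 0.75 * 125 = 93.75, short of 100
```

This is why the earlier "125 units" figure undershoots: the compensating draw is itself only 75% useful.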
Posted on Reply
#64
HillBeast
JayliNI'm lost as to what point you're trying to make. Where have you proven that two gtx465s will match a single 5970?
Why are people still going on about 465s? It clearly says below the card "DUAL GTX 470". The cores may be marked as 465s, but bear in mind that this is a mock-up. Do you honestly think that card will resemble the final product? Remember the original Fermis: they had wood screws on them. Someone pull their GTX480 apart and tell me where the wood screws are. Do you seriously think that Galaxy are going to waste a good GTX 470 core on a mock-up card? Do you know how expensive those things are?

It's a Dual 470 and it's as simple as that. If Galaxy says it is, then it is.
Posted on Reply
#65
newtekie1
Semi-Retired Folder
JayliNI'm lost as to what point you're trying to make. Where have you proven that two gtx465s will match a single 5970?

All your graphs have shown is that 2x GTX480 in SLI is faster than a single 5970 which is a no brainer since the GTX480 is 11% faster than a 5870 and a 5970 is actually 2x downclocked 5870s in crossfire.

Its no secret that GPUs don't scale well beyond 2. It doesn't matter whether its crossfire or sli. To imply that crossfire is inferior to sli by comparing benchmarks taken from how 2 ati gpus scales to 4 and how 1 nvidia gpu scales to 2 is nonsense.

If you think 2 465s (which is ~37% slower than a 480 and ~32% slower than a 5870 at 2560x1600) is going to match a 5970, you're going to be surprised.

For your consideration: www.guru3d.com/article/geforce-gtx-465-sli-review/12
Read the quoted text, it really isn't that hard to follow.
FourstaffI hope your physics teacher told you that hot objects have higher resistance, therefore to supply the same amount of current, more power is needed. Hence, the thermal runaway we see in Fermi.
Correct, it is for this same reason that power supplies are less efficient at higher temps.
Posted on Reply
#66
hat
Enthusiast
Posted on Reply
#67
a_ump
I also remember reading that Fermi GF100 has bad leakage. Heat increases leakage, so as said before, in order to maintain stability and compensate for the leakage in Fermi, more power or current must be fed.
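The leakage point can be sketched with a common rule of thumb: leakage power grows roughly exponentially with die temperature, often approximated as doubling every ~10 °C. The constants below are illustrative only, not measured Fermi data:

```python
# Rough model of temperature-dependent leakage power, using the
# rule-of-thumb that leakage roughly doubles every ~10 degrees C.
# All numbers here are illustrative, not measured GF100 figures.

def leakage_power(p_leak_ref: float, t_ref: float, t: float,
                  doubling_deg: float = 10.0) -> float:
    """Estimated leakage power at temperature t, scaled from a
    reference measurement (p_leak_ref watts at t_ref degrees C)."""
    return p_leak_ref * 2 ** ((t - t_ref) / doubling_deg)

# e.g. a hypothetical 30 W of leakage at 60 C grows to 120 W at 80 C
print(leakage_power(30.0, 60.0, 80.0))
```

This is the feedback loop behind "thermal runaway": a hotter chip leaks more, which dissipates more heat, which raises the temperature further unless the cooler can keep up.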
Posted on Reply
#69
HammerON
The Watchful Moderator
Wow, this thread is another "crap on Fermi" thread, and that is just too bad:shadedshu

As far as Galaxy releasing this Dual GTX 465 card; we will just have to wait and see if they do or don't. And then we can judge it accordingly...
Posted on Reply
#70
HillBeast
GenTarkinI don't see their GTS250 dual gpu card they announced a few months back yet.....so why would this card be any different?!?! lol
Well it's not like that card was going to be hard to make either considering it was a single PCB 9800GX2, and they gave up on it.
Posted on Reply
#71
Yellow&Nerdy?
newtekie1I really don't care about Vantage scores, or any synthetic benchmark.

All you need to look at is:

tpucdn.com/reviews/HIS/Radeon_HD_5970_CrossFire/images/perfrel.gif
tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_SLI/images/perfrel.gif
HD5970 Crossfire=12% Better than Single HD5970
GTX480 SLI=13% Better than Single HD5970
Overall, which is the important thing, not just on a single synthetic benchmark.


tpucdn.com/reviews/HIS/Radeon_HD_5970_CrossFire/images/perfrel_2560.gif
tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_SLI/images/perfrel_2560.gif
If you just want to focus on a high resolution (and let's face it, no one buying this configuration is running it on 1024x768:laugh:):
HD5970 Crossfire=19% Better than a single HD5970
GTX480 SLI=18% Better than a single HD5970
I'm not defending ATI or anything, but I think it's because the review of the 5970 Crossfire was done when they were released, which is over 8 months ago. And one of the things that improves with new drivers is multiple video card scaling. But you do have a point there, SLI usually scales better than Crossfire.
Posted on Reply
#72
pr0n Inspector
HillBeastUgh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs which can do MAX 150W each, and the PCI-E socket provides MAX 75W. That's 375W total, and even if the TDP figures are accurate (which they never are) this thing will pull over 450W at least in FurMark. I can't see any cooler being good enough to keep this thing under 100C. Only water cooling would be viable, but even that would be a struggle if it's on the same loop as the CPU.

Seriously by the time they get the TDP good enough to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed in the picture it says below it 'GTX 470 Dual' so that's 215W x 2 = 430W. FAIL!
Those numbers are the minimum requirements. Decent PSUs (read: 18AWG wires, powerful 12V rail(s)) can provide well over 150W on a PCI-E 8-pin* connector.


*in fact, even 6-pin can do it since most 8-pin connectors are 6+2.
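The 375W figure being debated above is just the PCI-E spec ratings summed: 75W from the slot plus 150W per 8-pin connector. A minimal sketch of that arithmetic (the limits are the ones quoted in the thread; the helper itself is illustrative):

```python
# Spec-rated power budget for a PCI-E card: the slot contribution
# plus each auxiliary connector. Limits as quoted in the thread.

PCIE_SPEC_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_budget(connectors):
    """Total spec-rated watts available: slot plus listed connectors."""
    return PCIE_SPEC_W["slot"] + sum(PCIE_SPEC_W[c] for c in connectors)

print(board_budget(["8-pin", "8-pin"]))  # 375 W for this dual-Fermi design
```

As the reply notes, these are spec minimums, not hard electrical limits; a well-built PSU can deliver more per connector than the rating.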
Posted on Reply
#73
Tatty_Two
Gone Fishing
phanbuey465's are weak tho... they are well behind a 5850. 5850cf is basically a 5970... This might match it in some tests, but this is no 5970 killer.
You say weak, and yes, in comparison to a 5850, 5870, 470 or 480, but they fit a niche and are positioned as a "mid range" (or perhaps lower-mid) product, much like the HD4850 was (eventually, once the HD4890 was released). Now, I crossfired two of those 1GB HD4850's at the time and their performance, to be honest, was pretty awesome, but not quite as good as a HD4870x2. However, that didn't stop ATI releasing a HD4850x2, although it was only taken up by two board partners. My point being..... what's the difference here?
Posted on Reply
#74
poo417
newtekie1You say you've read the review, then one sentence later say "we still dont' know what they've done to reduce power use and temps".

Are you blind? Did you miss this gigantic 3-slot beast sitting on top of the card? Yeah, they must have used "magic voo doo dust" to lower the temps...:rolleyes:

Oh, and we know exactly what they have done to reduce the power use: lowered the temperatures. How do you know? Because when the fans are artificially slowed down, the card gets hotter, and it consumes more power. No magic voo doo dust, just hotter = more power.
I don't think what I was trying to say was very clear. On that card, yes. Other cards that have the newer BIOS, still all on the stock heatsink, are cooler and use less power. I am not talking just about the Zotac card. Will sticking a finger on the fan not make it pull even more current, because it is told to run at a certain speed and it can't, so it pulls more current?

Perhaps you can enlighten me as to what you know they have done to the stock cards' BIOSes to lower temps and power without added cooling or faster fans, which is what I was trying to say. Using the graph from the Zotac review, applied to a water-cooled card at 48C under load, it should use a lot less power than it does in the review.

Anyway, back to this post. It will be interesting to see what a bigger company comes up with. With the 4 gig 5970s seeming to all have 3-slot coolers, it does leave a lot of space for a big, efficient cooler. We need something to knock the 5970 off its perch. :D
Posted on Reply
#75
Fourstaff
I can't see how that can be a good "mid range" graphics card, as power consumption is going to be quite high. But I do see this card as a filler between the GTX480 and HD5970.
Posted on Reply