Thursday, May 28th 2009

ASUS Designs Own Monster Dual-GTX 285 4 GB Graphics Card

ASUS has just designed a new monster graphics card that breaks the mold of the reference-design GeForce GTX 295, called the ASUS MARS 295 Limited Edition. Although the card retains the "GeForce GTX 295" name, the same device ID, and compatibility with existing NVIDIA drivers, ASUS has put in two huge innovations that go far beyond yet another overclocked GeForce GTX 295: the company used two G200-350-B3 graphics processors, the same ones that power the GeForce GTX 285. The GPUs have all 240 shader processors enabled, along with the complete 512-bit GDDR3 memory interface. This dual-PCB monstrosity holds 32 memory chips and 4 GB of total memory (each GPU accesses 2 GB of it). Apart from these, each GPU subsystem uses the exact same clock speeds as the GeForce GTX 285: 648/1476/2400 MHz (core/shader/memory).
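
The memory figures above can be sanity-checked with a little arithmetic. This is just a back-of-the-envelope sketch using the bus width, effective clock, and chip count quoted above, nothing more:

```python
# Per-GPU memory subsystem of the MARS 295, from the specs quoted above.
BUS_WIDTH_BITS = 512          # full GTX 285-class memory interface
EFFECTIVE_CLOCK_MHZ = 2400    # GDDR3 effective (double) data rate
TOTAL_CHIPS = 32              # 16 per PCB
TOTAL_MEMORY_GB = 4           # 2 GB addressable per GPU

# Peak theoretical bandwidth per GPU: bus width in bytes x effective clock.
bandwidth_gbs = BUS_WIDTH_BITS / 8 * EFFECTIVE_CLOCK_MHZ / 1000
print(f"Peak bandwidth per GPU: {bandwidth_gbs} GB/s")  # 153.6 GB/s

# Density per chip: 4 GB spread over 32 chips -> 128 MB (1 Gbit) each.
chip_density_mb = TOTAL_MEMORY_GB * 1024 // TOTAL_CHIPS
print(f"Density per chip: {chip_density_mb} MB (1 Gbit)")
```

That 153.6 GB/s per GPU matches the stock GeForce GTX 285 memory subsystem, which is the whole point of the card's design.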

Each PCB holds 16 memory chips, a 6-phase digital PWM power circuit drawing auxiliary power from an 8-pin PCI-E power connector, the GeForce GTX 285-class GPU, and its companion NVIO2 processor. The PCB holding the PCI-Express bus interface also holds the bridge chip. ASUS broke away from using the nForce 200 chip and is instead using a yet-to-be-disclosed third-party bridge chip; currently, PLX and IDT are the two likely sources for such a chip. The memory consists of high-density 0.77 ns chips made by Hynix.
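
The 0.77 ns rating implies some headroom over the stock memory clock. A quick check, assuming the usual convention that the rating is the minimum cycle time of the I/O clock:

```python
# Hynix 0.77 ns GDDR3: the rating is the minimum I/O clock cycle time.
CYCLE_TIME_NS = 0.77
rated_clock_mhz = 1000 / CYCLE_TIME_NS      # ~1299 MHz I/O clock
rated_effective_mhz = 2 * rated_clock_mhz   # GDDR3 double data rate

STOCK_EFFECTIVE_MHZ = 2400                  # MARS 295 stock memory clock
headroom_pct = (rated_effective_mhz / STOCK_EFFECTIVE_MHZ - 1) * 100
print(f"Rated for ~{rated_effective_mhz:.0f} MHz effective, "
      f"~{headroom_pct:.0f}% above the 2400 MHz stock clock")
```

In other words, the chips are rated for roughly 2600 MHz effective, leaving about 8% of guaranteed overclocking headroom before even exceeding spec.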
Power management on each PCB is handled by a Volterra VRM controller that supports the I2C interface, which means the card supports software voltage control: a big plus for ASUS' Voltage Tweak feature, which is gaining in popularity. A fused power circuit provides over-current protection while also facilitating extreme overclocking.
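
Software voltage control over I2C typically amounts to writing a VID code to the VRM controller's output-voltage register. The sketch below is purely illustrative: the device address, register, and VID scale are hypothetical stand-ins and are not taken from any Volterra datasheet.

```python
# Illustrative only: the address (0x40), register (0x00), base voltage and
# step size below are HYPOTHETICAL, not from any Volterra documentation.
VRM_ADDR = 0x40
VOUT_REG = 0x00
VID_BASE_V = 0.8125   # hypothetical voltage at VID code 0
VID_STEP_V = 0.0125   # hypothetical volts per VID code

def volts_to_vid(volts: float) -> int:
    """Convert a target voltage to a register code (hypothetical scale)."""
    return round((volts - VID_BASE_V) / VID_STEP_V)

def set_gpu_voltage(volts: float, bus_num: int = 1) -> None:
    # Requires the smbus2 package and real hardware; shown for structure only.
    from smbus2 import SMBus
    with SMBus(bus_num) as bus:
        bus.write_byte_data(VRM_ADDR, VOUT_REG, volts_to_vid(volts))

print(volts_to_vid(1.1625))  # (1.1625 - 0.8125) / 0.0125 = 28
```

Tools like Voltage Tweak wrap exactly this kind of register write in a slider, which is what makes an I2C-capable VRM attractive to overclockers.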

Internally, the cooler has the same basic construction as the reference cooler, using a single leaf-blower fan. The card spans two expansion slots and is slightly taller than the reference-design card. ASUS also used slightly longer internal bridges that make more room for third-party coolers and the like. Our source at ASUS EMEA conducted a quick 3DMark Vantage test, proving the card's seamless compatibility with existing drivers while also showing a significant performance boost over existing GTX 295 cards. Being Quad-SLI capable, this card finally makes (effective) GeForce GTX 285 quad-SLI possible, and makes for the most powerful desktop multi-GPU setup ever conceived. ASUS designed this card despite pressure from NVIDIA, which enforces a rigid policy restricting its partners from custom-designing the GeForce GTX 295. If everything goes smoothly throughout the development process, the card might make it to a gala launch at Computex.

179 Comments on ASUS Designs Own Monster Dual-GTX 285 4 GB Graphics Card

#1
locoty
by: 1Kurgan1
Another Asus innovation, seems they should almost be making their own GPU's as they also are the only company who makes 4850x2's.
Do you forget Sapphire?

in fact i think Sapphire is the only company who makes 4850x2
Posted on Reply
#2
method526
damn. that thing looks like a c4 brick. i hope cooling will be sufficient. i think the performance wont be a problem.
Posted on Reply
#4
INSTG8R
Meh, just another "I have the biggest e-penis card"....:rolleyes:
Posted on Reply
#5
happita
I would so get this if it was released a few months ago, but now I'm just waiting on DX11 cards. That's too bad, because that card looks like a freakin' tank!
4GB makes my pee-pee go, da doing doing doing!!
Posted on Reply
#6
qubit
Overclocked quantum bit
Frakking awesome card and sell my GTX285 coz me want!

I might just get one and burn out my credit card with the expense. :D Question is, which card??! :eek:

And then the GT300 will come out and I won't want this toy any more, but the shiny new one. hehehehehe
Posted on Reply
#7
buggalugs
by: qubit
Frakking awesome card and sell my GTX285 coz me want!

I might just get one and burn out my credit card with the expense. :D Question is, which card??! :eek:

And then the GT300 will come out and I won't want this toy any more, but the shiny new one. hehehehehe
Dont do it my friend. DX 11 cards are less than 6 months away and this card will look much less impressive and lose most of its value.
Posted on Reply
#8
btarunr
Editor & Senior Moderator
There's always something better 6 months down the line, and will always be so, unless, God forbid, North Korea sends something flying towards California.
Posted on Reply
#9
El_Mayo
by: btarunr
There's always something better 6 months down the line, and will always be so. Unless, God forbid, North Korea sends something flying towards California.
roflmao
that's highly unlikely
Posted on Reply
#10
1Kurgan1
The Knife in your Back
by: locoty
Do you forget Sapphire?

in fact i think Sapphire is the only company who makes 4850x2
Good eye, forgot it was Sapphire, hmmm.... Just seems weird these PCB manufacturers are going crazy, kinda cool though.
Posted on Reply
#11
qubit
Overclocked quantum bit
by: buggalugs
Dont do it my friend. DX 11 cards are less than 6 months away and this card will look much less impressive and lose most of its value.
Indeed, that's good advice my friend. :) While I would very much like to have it, I wasn't actually being serious. It's just that my GTX 285 is very nice, so double that in a special edition card is twice as desirable!

Can you imagine the shock to the system when the GT300 comes out, DX11 support, 4GB GDDR5 as standard (and crucially, all visible) and 3 times the power of a GT200 - the Asus will go from what, $1000 to 150, 100, 50? :eek:

While one would still be paying it off....

Doesn't bear thinking about, does it?
Posted on Reply
#12
Assassin48
im pretty sure that one of atis partner will bring out the 4890x2 no matter what ati says to compete against this thing and people are willing to pay for the latest and greatest

WE WANT POWER !
Posted on Reply
#13
BOSE
This card is for the sheer purpose of bragging rights for Asus, that they were the first to slap 4Gb of ram.

Next thing we will see some one will make a card with 6GB of Ram, then 12GB of Ram, and so on and so on.

Ill bet 4GB will be standard on all performance cards with in a year.
Posted on Reply
#14
El Fiendo
Um, it also uses GTX285 processors instead of GTX275. It has a 512 bit interface and the full 240 shader processors so its not just more memory.
Posted on Reply
#15
erocker
by: El Fiendo
Um, it also uses GTX285 processors instead of GTX275. It has a 512 bit interface and the full 240 shader processors so its not just more memory.
Both the GTX285 and 275 have 240 shader GPU's. The GTX275 just uses the 448 bit bus and 896mb of ram.
Posted on Reply
#16
alexp999
Staff
by: El Fiendo
Um, it also uses GTX285 processors instead of GTX275. It has a 512 bit interface and the full 240 shader processors so its not just more memory.
The GTX 295 uses the GTX 260 GPU, so 216 SPs :)
Posted on Reply
#17
BOSE
It doesnt matter. The point, is that Asus trying to be first at something that no one has done it yet.

Soon it will be standard among all VGA makers. Thus no reason to piss all over your self like a school girl in some cheap porn.
Posted on Reply
#19
Kitkat
by: BOSE
It doesnt matter. The point, is that Asus trying to be first at something that no one has done it yet.

Soon it will standard among all VGA makers.
Hardly. Next couple years will be all about the process SIZE and ram speed. Efficiency over ram amounts. we are on the brink of 22 28 32 45 and 55 (which ppl are running away from even tho would be just as awesome with new architecture.) 4GB as a "soon standard" is silly. The card is a gimmick like most 2g cards now that are the same speed (YES EVEN TEXTURE WISE) as there 1g counterparts.

This all goes back to good solid code and efficiency. Look at crysis 1 as today's standard benchmark, Also silly. Compared side by side with the second one it was "poorly coded". Advancements in CODE were made and the system its self and its performance the second time around was MUCH better. Believing that crysis would "soon become the standard demand for vidcards" was completely SILLY. Running to higher ram (unnecessary) amounts on vid cards leads to looser code that utilizes UNNECESSARY racecourses. With smaller chip and ram processes coming easier, and sooner than they did before, they are the future.

Personally too on the side i dunno if its just me but as a coder i think CONSTRAINTS make better,tighter and more efficient code, which brings out performance that's expected and sometimes even UNEXPECTED from hardware. Sure crysis caused an arms race that gave us the stuff we have now but when they came back the second and even from the previews Ive seen the 3rd time around we saw what CODE did that which hardware couldn't. Always CODE over hardware. I love hardware but if code doesn't utilize ANY of this $hi+ whats the point.
Posted on Reply
#20
BOSE
by: Kitkat
Hardly. Next couple years will be all about the process SIZE and ram speed. Efficiency over ram amounts. we are on the brink of 22 28 32 45 and 55 (which ppl are running away from even tho would be just as awesome with new architecture.) 4GB as a "soon standard" is silly. The card is a gimmick like most 2g cards now that are the same speed (YES EVEN TEXTURE WISE) as there 1g counterparts. This all goes back to good solid code and efficiency. Look at crysis 1 as today's standard benchmark, Also silly. Compared side by side with the second one it was "poorly coded". Advancements in CODE were made and the system its self and its performance the second time around was MUCH better. Believing that crysis would "soon become the standard demand for vidcards" was completely SILLY. Running to higher ram (unnecessary) amounts on vid cards leads to looser code that utilizes UNNECESSARY racecourses. With smaller chip and ram processes coming easier, and sooner than they did before IS the future. Personally too on the side i dunno if its just me but as a coder i think CONSTRAINTS make better,tighter and more efficient, code which brings out performance that's expected and sometimes even UNEXPECTED from hardware. Sure crysis caused an arms race that gave us the stuff we have now but when they came back the second and even from the previews Ive seen the 3rd time around we saw what CODE did that hardware couldn't. always CODE over hardware. I love hardware but if code doesn't utilize ANY of this $hi+ whats the point.
And there was time when we were happy with 32MB of ram in Win 95. Now cell phones have as much ram as desktop PC used to have.

Point is, you cant stop hardware progress, no matter how good the code is. There is always money to be made from something newer, bigger, better, faster hardware. Today its Crysis, tomorrow its the new Packman that will raise the bar.
Posted on Reply
#23
El_Mayo
what if..
i think already said this a few pages back
but what if nVidia made 40nm versions of the 9800GTX
and stuck four of those on one card?
wouldn't that be a tiny bit better?
cheaper and less power hungry!
Posted on Reply
#24
Marineborn
im a ati fanboy, but this is extremly temping...possible 2 of them slid!BWHAHWAHAWHAWHAW HWAHAW HW H!!!! ! ! V@#@!*&^)$ *goes mad with power*
Posted on Reply
#25
El_Mayo
by: Marineborn
im a ati fanboy, but this is extremly temping...possible 2 of them slid!BWHAHWAHAWHAWHAW HWAHAW HW H!!!! ! ! V@#@!*&^)$ *goes mad with power*
you must be joking. lmfao
just why in god's name would you want that?
is your monitor a cinema screen?
Posted on Reply