Thursday, April 15th 2010

MSI Prepares Lower-Cost Lucid Hydra-based LGA1156 Motherboard

MSI is working on its third motherboard that makes use of Lucid Hydra multi-GPU technology, this one intended as a more affordable model than the Big Bang Fuzion for the socket LGA1156 platform. Lucid Hydra technology allows users to mix and match graphics cards across brands and models to scale up performance. A newly-released driver (1.5.106) for Hydra is said to increase functionality by adding full support for DirectX multi-GPU scaling, and to deliver significantly higher performance scaling compared to older drivers. The new motherboard from MSI is the P55-GD88 Hydra, also to be known as the P55A Hydra.

Built on a PCB different from that of the Big Bang Fuzion, the P55A Hydra uses Hi-c capacitors only for the CPU VRM, with normal solid-state capacitors for the rest of the board. Unlike the Big Bang Fuzion it has only two PCI-Express 2.0 x16 slots, but both slots work at x16 speed. Perhaps the only selling point for Hydra in this setup (since the Intel P55 platform already supports both SLI and CrossFire, and tests have shown the performance hit between x16 and x8 to be insignificant even for high-end GPUs) is the ability to mix and match different kinds of graphics cards, including pairing an ATI Radeon card with an NVIDIA GeForce card.

Other features of the P55A Hydra include a 10-phase CPU VRM, consolidated voltage measurement points, two SATA 6 Gb/s ports in addition to seven 3 Gb/s ones, USB 3.0, and connectivity that includes 8-channel audio, gigabit Ethernet, eSATA, and FireWire. The MSI P55A could debut in a few weeks' time, targeting a price point significantly lower than that of the Big Bang Fuzion. Source: TechConnect Magazine

13 Comments on MSI Prepares Lower-Cost Lucid Hydra-based LGA1156 Motherboard

#1
Mussels
Moderprator
you spelled big bang as big band in the first line.
#2
BazookaJoe
I am VERY amped to see some real world testing & benchmarks on this hydra stuff.

Very interesting indeed.
#3
OneCool
They have been on about this forever now. Release it already!! :slap:
#4
Mussels
Moderprator
by: OneCool
They have been on about this forever now. Release it already!! :slap:
i read a review in a magazine, where they claimed some weird results. high end cards don't work, but midrange cards get nice boosts - a 1.8x boost combining an 8800GT with a 4870 or something, and no boost at all combining a GTX 280 with a 5870


lucid need to work on drivers a looooot more.
#5
Cold Storm
Battosai
I almost bought the Fuzion before I bought the Trinergy board.. Reviews of the Lucid ability show what you'd expect for something so new.. There are a lot of problems, and mostly, from what I've seen, to do with NVIDIA. They can get ATI cards to work quite well with one another, but ATI/NVIDIA is a big trial-and-error type thing.

As for the item at hand: it's cool that they are doing this. It makes for more "testers" than they can get from the $360-ish motherboard..
#6
BazookaJoe
by: Mussels
i read a review in a magazine, where they claimed some weird results. high end cards don't work, but midrange cards get nice boosts - a 1.8x boost combining an 8800GT with a 4870 or something, and no boost at all combining a GTX 280 with a 5870

lucid need to work on drivers a looooot more.
Maybe they do - but if the hardware's good, and it's just a matter of drivers, then I'm very interested.

In all reality they would HAVE to tune the drivers after an actual release - there's just too much hardware out there for them to have tested everything :)

I'm just hoping like crazy that the results you tell of are only the results of a software issue...
#7
Mussels
Moderprator
by: BazookaJoe
Maybe they do - but if the hardware's good, and it's just a matter of drivers, then I'm very interested.

In all reality they would HAVE to tune the drivers after an actual release - there's just too much hardware out there for them to have tested everything :)

I'm just hoping like crazy that the results you tell of are only the results of a software issue...
sounded like it.

the only hardware limitation i can think of is if the lucid chip had insufficient bandwidth - you know, they only gave it x4 bandwidth and it needed more, or something.
#8
amschip
I'm still waiting for the review Wizz promised. :)
#9
ToTTenTranz
by: Mussels
i read a review in a magazine, where they claimed some weird results. high end cards don't work, but midrange cards get nice boosts - a 1.8x boost combining an 8800GT with a 4870 or something, and no boost at all combining a GTX 280 with a 5870


lucid need to work on drivers a looooot more.
As soon as they fix the performance to be as good as CrossFire and SLI, Lucid should be a lot better in the long term.

I don't really care about being able to use cards from different companies. Features are going through standardization/open-sourcing at all levels, so eventually the only thing that matters is price/performance.

What I do care about is that Lucid doesn't introduce memory redundancy, so it should make for cheaper high-end systems. 2x HD5870 1GB will give 2GB total for the graphics system, so it'll be cheaper than CrossFiring 2x HD5870 2GB.
#10
Mussels
Moderprator
by: ToTTenTranz
As soon as they fix the performance to be as good as CrossFire and SLI, Lucid should be a lot better in the long term.

I don't really care about being able to use cards from different companies. Features are going through standardization/open-sourcing at all levels, so eventually the only thing that matters is price/performance.

What I do care about is that Lucid doesn't introduce memory redundancy, so it should make for cheaper high-end systems. 2x HD5870 1GB will give 2GB total for the graphics system, so it'll be cheaper than CrossFiring 2x HD5870 2GB.
yeah, making ram additive really is where the goodness will be.


what they need is a method to divvy up the work between the cards properly - you know, it's not two identical cards so 50/50 doesn't work - the user should be able to tweak it themselves (5870 + 5850 you could set 60/40, 5870 + 5770 you could set 70/30, whatever)
#11
ToTTenTranz
by: Mussels
what they need is a method to divvy up the work between the cards properly - you know, it's not two identical cards so 50/50 doesn't work - the user should be able to tweak it themselves (5870 + 5850 you could set 60/40, 5870 + 5770 you could set 70/30, whatever)
I think it already does that automatically.
It determines how much power each GPU has and sends more polygons and textures to the fastest GPU.
#12
Mussels
Moderprator
by: ToTTenTranz
I think it already does that automatically.
It determines how much power each GPU has and sends more polygons and textures to the fastest GPU.
i think that's why they're having issues with scaling tho, the auto-detect is crap.
#13
r9
I'm not too sure about this technology, knowing the scaling inefficiency of SLI and CrossFire, especially in newly released games - and that's the only time you're going to need more than one card. And this mismatches not only the class of GPU, say a 4870 and a 4670, but even the brand.