
Lucid HYDRA 200 Multi-GPU Technology Performance Preview

After all the talk of nVidia blocking Lucid's Hydra, and much speculation about whether nVidia might have quality-control reasons to fear it, we finally get some performance stats.
Thankfully, all this talk of nVidia's block has been recanted :nutkick:, so here's a preview of what we can expect from Hydra at this point: http://www.pcper.com/article.php?aid=815

[Image: Hydra multi-GPU performance preview chart]


Comments?
 
Hmmm ... it works, and that's the most important thing ... :rockout:

But when will this technology be made available to users?
Even though nVidia denied it had anything to do with the MSI Big Bang mobo release being delayed, I seriously doubt that (it's just my opinion, don't get me wrong, guys). nVidia is damaging its image lately ... :shadedshu
 
Nice, might grab an 8800 GT or 9800 GT and slap that with a 5850 when I eventually upgrade.
 
I'd love to see the chip implemented on add-on cards, perhaps a video card with the chip integrated to allow it to pair with another card. They said this would be possible; it would let you use basically any motherboard.
 
Nice, might grab an 8800 GT or 9800 GT and slap that with a 5850 when I eventually upgrade.

Then you wouldn't be able to run DX11 when those games start appearing without disabling Hydra, which defeats the purpose.
 
Hydra will be a benefit to everyone ... this has so much potential. Look at what it does already in its early beta stage. Very nice technology!
 
Then you wouldn't be able to run DX11 when those games start appearing without disabling Hydra, which defeats the purpose.

I thought the 5850 would handle DX11 effects D:
 
So with this fish 'n' chips you can run an 8800 GT, for example, and a 4870 and scale them?

I don't understand! Can I run two completely different chips together and have them process the same thing? I thought they had to use the same graphics core?
 
That's the magic of this new tech : ]

Any two GPUs will scale.

Two identical ones scale best, however.
 
What was up with those GPU configs that didn't use the same card? Totally strange numbers there. Nothing worked as well as a GTX 260 with a GTX 260.
 
We need micro-stutter-begone.

I just hope they have good split-frame rendering algorithms to help prevent micro-stutter. I can imagine a GTS 250 combined with a GTX 260 in alternate-frame rendering would produce some wonky micro-stutter. I remember trying out SFR (when it was still an option in the nVidia control panel), and I found some games would produce a line across the screen :shadedshu
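To see why mismatched cards in alternate-frame rendering cause micro-stutter, here's a toy sketch. The frame times are purely hypothetical numbers (a ~16 ms card paired with a ~33 ms card), not measured figures for any of the GPUs above:

```python
def afr_intervals(fast_ms, slow_ms, frames, dispatch_ms):
    """Gaps between consecutive presented frames when frames are
    dispatched every dispatch_ms and alternate between a fast GPU
    (even frames) and a slow GPU (odd frames)."""
    done = [i * dispatch_ms + (fast_ms if i % 2 == 0 else slow_ms)
            for i in range(frames)]
    return [round(b - a, 1) for a, b in zip(done, done[1:])]

# Dispatch a frame every 25 ms; fast GPU takes 16 ms, slow one 33 ms.
print(afr_intervals(16.0, 33.0, 6, 25.0))  # [42.0, 8.0, 42.0, 8.0, 42.0]
```

Even though the average frame rate looks fine, the frames arrive in an alternating long/short rhythm, which is exactly the juddery feel people call micro-stutter.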
 
and I found some games would produce a line across the screen :shadedshu

I found that if you use vertical sync, the line in the middle goes away (tearing, I think).

I find it odd that I haven't seen any micro-stuttering in my builds since my X1800 XTX CrossFire build.
 
I thought the 5850 would handle DX11 effects D:

What he meant is that the game won't run with DX11 enabled while an 8800 GT (an older-DirectX card) is active; it will scale down, and the Hydra software would run the game in DX10, for example.
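In other words, a mixed pair can only expose the lowest DirectX level every card supports. A minimal sketch of that idea (the card names and version numbers are just illustrative):

```python
def effective_dx_level(cards):
    """Given a dict of card name -> highest supported DirectX version,
    return the level the mixed setup can actually run at: the minimum,
    since every GPU must be able to render the same workload."""
    return min(cards.values())

# Hypothetical rig: a DX11 HD 5850 paired with a DX10 8800 GT.
rig = {"HD 5850": 11.0, "8800 GT": 10.0}
print(effective_dx_level(rig))  # 10.0
```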

So with this fish 'n' chips you can run an 8800 GT, for example, and a 4870 and scale them?

I don't understand! Can I run two completely different chips together and have them process the same thing? I thought they had to use the same graphics core?

Yes, it will work. It redirects the rendering of the scene to both GPUs, and the processor in the chip calculates which GPU should handle the majority of the graphics load. It does so in various ways; among them, it measures the time it takes each GPU to finish a task and "answer" back, and that delay determines which card is more powerful.
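That timing idea can be sketched roughly like this: split the work inversely to each card's probe latency, so the faster responder gets the bigger share. This is just an illustration of the concept described above, not Lucid's actual algorithm, and the GPU names and latencies are made up:

```python
def split_by_latency(latencies_ms):
    """Map GPU name -> probe latency (ms) to GPU name -> share of the
    rendering load. A card that answers twice as fast gets twice the
    share, and the shares sum to 1.0."""
    speeds = {gpu: 1.0 / ms for gpu, ms in latencies_ms.items()}
    total = sum(speeds.values())
    return {gpu: s / total for gpu, s in speeds.items()}

# Hypothetical probe results: GTX 260 answers in 10 ms, GTS 250 in 15 ms.
shares = split_by_latency({"GTX 260": 10.0, "GTS 250": 15.0})
print(shares)  # GTX 260 gets 60% of the load, GTS 250 gets 40%
```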

I just hope they have good split-frame rendering algorithms to help prevent micro-stutter. I can imagine a GTS 250 combined with a GTX 260 in alternate-frame rendering would produce some wonky micro-stutter. I remember trying out SFR (when it was still an option in the nVidia control panel), and I found some games would produce a line across the screen :shadedshu

They do not use AFR; they use "Real Time Distributed Processing": http://www.lucidlogix.com/download/WP-Multi%20GPU.pdf

Check this out: http://jonpeddie.com/publications/whitepapers/multi-gpu/
 
I hope nVidia doesn't screw this up this time, because I can't wait to buy this Hydra thing.
 
Does anyone know how the external PCIe bus works (like in the Hydra demo)? It would be really interesting if that could be adapted to laptops. Imagine not having to buy a new laptop every year just for the video card.
 
I'd love to see the chip implemented on add-on cards, perhaps a video card with the chip integrated to allow it to pair with another card. They said this would be possible; it would let you use basically any motherboard.

Yeah, that would be nice. If it could work over PCIe x1, it would fit most motherboards and still leave room for two cards.

Reason being, the Big Bang motherboard won't be cheap, and you could just use your current motherboard and get two identical cards with the money you save.

As it stands, you need to get that motherboard first (and a new CPU and new memory), and after that you don't have any money left for a new GPU to do the NVIDIATI fusion.

Anyhow, it's nice that it works. I was already happy about X58 boards doing both, and then the cheaper P55 boards. Soon we'll have a third option, and even though I've never used multi-GPU, it would be fun to try (if I had that motherboard already).
 
Does anyone know how the external PCIe bus works (like in the Hydra demo)? It would be really interesting if that could be adapted to laptops. Imagine not having to buy a new laptop every year just for the video card.

ATI and nVidia have messed with that sort of technology for years; the cost of new GPUs with their own power supply and cooling wasn't going to sell enough units to warrant commercialising the work they did on external PCIe graphics cards for laptops. The Hydra demo just used a standard PCIe extension cable; you can get those in shops for desktop mobos now, there's just not much use for them, so you don't see many people using them in desktops.

As for the Hydra demo: good to see it working and still due for release. I wonder if ATI will drop the PLX PCIe bridges on their X2 cards and replace them with customised Hydra chips?
 