
MSI "Big Bang" P55 Motherboard Implements Lucid Hydra

Hrmm... what's funny, now that I think about it, is the location of the Hydra chip on the P55 boards: it's where the traditional northbridge would have been. Almost like LGA 1156 was made with the intention of accommodating Hydra. :D I want Hydra on an X58 board.

EDIT:
http://www.anandtech.com/video/showdoc.aspx?i=3646
 
How much latency does this introduce into the mix? Even as a SoC, you'll have latency introduced with missed cycles and such (assuming a more traditional CPU style architecture, but I could be wrong of course!). So, even if you get better load balancing, could this affect minimum framerates while boosting the average, etc.? Just a thought...
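To put rough numbers on that (purely made-up frame times, not Hydra measurements), a handful of latency spikes drags the minimum framerate way down while barely denting the average:

```python
# Illustrative only: a few long frames hurt the minimum FPS far more than the average.
# Frame times are in milliseconds and completely made up.
frame_times_ms = [16.7] * 95 + [50.0] * 5   # mostly smooth frames, a few stalls

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# -> average: 54.5 fps, minimum: 20.0 fps
```

So the average in a review could look great while the game still feels stuttery.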
 
Considering NVIDIA makes liberal use of its NF200 PCIe bridge chips with a lot of success (most older FSB-era boards, some X58 boards, and there's even one onboard the GTX 295), PCIe doesn't seem to be much affected by latency issues. At least not with current-generation video cards.
 
I was really keen on the idea of the original Lucid demo, the one where the Hydra chip sat alone on a board with only PCI Express slots, with a fat cable running back to a host card in your PC's PCI Express slot.

Of course I think they will sell more on a motherboard, but I really like the idea of a separate graphics subsystem altogether, with its own PSU, cooling, etc. Not to mention that when you upgrade your PC, you don't need to buy another board with a Lucid chip; you can keep the extra box.

I want it so bad;

[Image: lucid-demo.jpg]
 
That reminds me of the Sega CD or Sega 32X. It's a neat idea, but it won't go far. Having it on the motherboard is essential for keeping production costs down. Having a separate box with a PCB housing the Hydra chip, PSU, etc., would get expensive. But yes, it would be awesome having a separate box just for the video cards.
 
Better or not, it's the older one. Old 'n busted.

1156 is Will Smith, and therefore gets the new stuff :P



NVIDIA's Big Bang and Big Bang II were different things.

Wah? 1156 is 1366's lesser sibling. It's not getting any CPUs with more than four cores, and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.
 
It was a Men in Black joke.
 
Wah? 1156 is 1366's lesser sibling. It's not getting any CPUs with more than four cores, and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.

Well, 1156 is replacing s775 and is cheaper than 1366, so why not? Anyway, if the Hydra chip actually works well, I'm positive we'll see it on 1366 boards.
 
Well, well, well... pairing an ATI and an NVIDIA card for performance scaling. Sounds suspicious to me; weird that NVIDIA hasn't filed a lawsuit over this yet.

:skeptical:

I hope Lucid can work it out.
 
I hope Lucid can work it out.

To me, it sounds like it's dividing parts of the screen up: say with two cards, each would think it was driving half that resolution.

1600x1200

Each card would be doing 1600x600 on its own monitor, as far as it was concerned.
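If it did work that way, the bookkeeping would be trivial; here's a toy sketch (illustrative only, not Lucid's actual driver logic) of carving one frame into per-card slices:

```python
# Toy sketch of the split-frame idea: each card gets a horizontal slice of the
# full resolution and treats it as its own display. Not Lucid's actual logic.
def split_frame(width, height, num_gpus):
    slice_height = height // num_gpus
    # each viewport is (x, y, width, height)
    return [(0, i * slice_height, width, slice_height) for i in range(num_gpus)]

print(split_frame(1600, 1200, 2))
# -> [(0, 0, 1600, 600), (0, 600, 1600, 600)]
```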
 
Wah? 1156 is 1366's lesser sibling. It's not getting any CPUs with more than four cores, and it doesn't even overclock as far. Either save money and buy a 775 or go all out on a 1366. 1156 is a dead end.

Ugh. This is just not true. Dead end because of one very expensive CPU? Honestly, there is little difference between 1156 and 1366. Really, the only significant thing 1366 gives you is a six-core CPU that will cost you $1000. Other than that, you could argue that 1156 gives you more options to choose from, i5 or i7 chips.
 
To me, it sounds like it's dividing parts of the screen up: say with two cards, each would think it was driving half that resolution.

1600x1200

Each card would be doing 1600x600 on its own monitor, as far as it was concerned.

Nope, that's what SLI and CrossFire are already capable of doing, in some cases.

Lucid distributes the work by polygons: one GPU handles the characters while the other handles the environment. That way they don't have to load the same texture data, so the system effectively doubles the available video memory with two cards.

That's why you can couple different cards. They're doing different things.
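A rough way to picture that object-level split (the scene objects, texture sizes, and greedy balancing below are all hypothetical, not Lucid's actual scheduler):

```python
# Rough illustration of dividing work per object instead of per screen region.
# Everything here is made up for the example.
draw_calls = [
    {"object": "character_01", "textures_mb": 64},
    {"object": "character_02", "textures_mb": 64},
    {"object": "terrain",      "textures_mb": 256},
    {"object": "buildings",    "textures_mb": 192},
]

gpus = {"gpu0": [], "gpu1": []}

# Greedy balance by texture footprint: each GPU only loads textures for the
# objects it was assigned, so the two cards' memory pools don't overlap.
for call in sorted(draw_calls, key=lambda c: c["textures_mb"], reverse=True):
    target = min(gpus, key=lambda g: sum(c["textures_mb"] for c in gpus[g]))
    gpus[target].append(call)

for name, calls in gpus.items():
    total = sum(c["textures_mb"] for c in calls)
    print(name, [c["object"] for c in calls], f"- {total} MB of textures")
```

Since neither card has to hold the other's textures, the usable video memory is closer to the sum of the two cards rather than everything being mirrored.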
 
I have no time to read the whole thread now, but I want to say that this seems really too good to be true... I will get a mobo that has this new feature before they stop being produced, because maybe there will be a lawsuit...
 