
How To: Enable SLI on pre-i7/i5 hardware

Discussion in 'NVIDIA' started by bluevelvetjacket, May 22, 2008.

  1. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.61/day)
    Thanks Received:
    3,778
    The psu you have actually behaves like a single rail. It doesn't matter where you plug the cables.
     
  2. oiluj New Member

    Joined:
    Nov 12, 2009
    Messages:
    31 (0.02/day)
    Thanks Received:
    2
anatolymik, any news about your (awesome) work on the new 'HyperVisor' version of the SLIpatch?
     
  3. anatolymik

    anatolymik

    Joined:
    Jun 21, 2009
    Messages:
    1,432 (0.65/day)
    Thanks Received:
    824
It won't be a HyperVisor :). At least I don't think so. The thing is, there is a simpler method that enables I/O port emulation.
     
  4. TiN New Member

    Joined:
    Aug 28, 2009
    Messages:
    184 (0.09/day)
    Thanks Received:
    68
    Location:
    Taipei
I may soon test how the SLI patch works with 2x GTX 480 in an Asus X48 REX board.
     
    Sailindawg and Ross211 say thanks.
  5. Ross211

    Ross211

    Joined:
    Jan 9, 2010
    Messages:
    474 (0.24/day)
    Thanks Received:
    115
    Location:
    Kansas
    Looking forward to this :rockout:
     
  6. ken2cky New Member

    Joined:
    Apr 20, 2010
    Messages:
    41 (0.02/day)
    Thanks Received:
    2
I finally have 2x GTX 260 with an SLI bridge in my rig.
I can really see the difference in 3DMark V (almost 2x GPU points) but not in games... why :confused:
I tried Assassin's Creed 2, Mirror's Edge, DiRT 2, and H.A.W.X; SLI or not, I get the same FPS. I'm running at 1280x1024 (17" xD), all maxed out.

To test games with one card I select "Do not use SLI"; all Nvidia Control Panel settings are at their defaults, except that I add the .exe file of the game.

    I see 3 options for SLI: Single GPU, AFR1, AFR2 (or Nvidia Recommended)
    Which one should I select?
     
  7. Ross211

    Ross211

    Joined:
    Jan 9, 2010
    Messages:
    474 (0.24/day)
    Thanks Received:
    115
    Location:
    Kansas
    I don't believe you'll see a benefit with another card if you are running your games in 1280x1024.

You'll only see a significant improvement in your frame rate at 1280x1024 if you get a better CPU (single card or SLI). 1280x1024 simply isn't much work for a single GTX 260, let alone two. At 1280x1024 you are CPU-bottlenecked.

I'm pretty certain SLI scales best at 1920x1080 or higher (e.g. 2560x1600).
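To put rough numbers on the CPU-bottleneck point above, here is a quick back-of-envelope sketch (plain Python, just arithmetic) comparing how many pixels the GPU has to shade per frame at each resolution:

```python
# Pixel counts per frame at the resolutions discussed above. Higher
# resolutions shift the workload toward the GPU, which is where SLI pays off.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1920x1080": 1920 * 1080,
    "2560x1600": 2560 * 1600,
}
base = resolutions["1280x1024"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>8} pixels, {pixels / base:.2f}x the 1280x1024 load")
```

At 2560x1600 the cards shade more than 3x the pixels of 1280x1024 per frame, while the CPU-side work per frame stays roughly the same, so the bottleneck moves off the CPU and onto the GPUs.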
     
  8. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
Yep, it sure does scale well at higher res. I'm at 1920x1200, and games like Crysis scale at least 60% or better. As for SLI settings, open your CP and go to Manage 3D Settings, click the Program Settings tab, click Add, and navigate to the exe that launches the game. The CP will automatically set the correct SLI mode. Done!
     
  9. ken2cky New Member

    Joined:
    Apr 20, 2010
    Messages:
    41 (0.02/day)
    Thanks Received:
    2
    Ok... so I'll have to wait till my 52" TV arrives to see the difference?

But SLIpatch worked perfectly here, right?

Can I just disable SLI and unplug the PCI-E power cables of the 2nd card so it consumes less power? It's a struggle to remove the card from the case... Or would that damage the card?
     
    Last edited: May 20, 2010
  10. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
No, don't unplug the PCI-E cables. If only one or no monitor is plugged in, the card will just downclock. Leave them in SLI, just disable SLI in the CP, and only one card will be used.
Edit: when you switch SLI on/off, you may have to reboot; less likely if you have Win7. When you switch to SLI mode, take a look at Program Settings: the SLI mode each app uses should be highlighted in BLACK BOLD. If not, reboot.
     
  11. ken2cky New Member

    Joined:
    Apr 20, 2010
    Messages:
    41 (0.02/day)
    Thanks Received:
    2
How many watts does a downclocked GTX 260 consume?

    No problem switching SLI/no SLI under Win7. It does everything automatically.

Thanks to all for your answers.
     
  12. Ross211

    Ross211

    Joined:
    Jan 9, 2010
    Messages:
    474 (0.24/day)
    Thanks Received:
    115
    Location:
    Kansas
    I'm not sure how much power a downclocked GTX 260 will consume. You should check out this article Tom's did - Actual Power Consumption And Current Requirements.
     
  13. tdbone1

    Joined:
    Dec 31, 2004
    Messages:
    161 (0.04/day)
    Thanks Received:
    1
    Hi all:

I just got an MSI 890GXM-G65 (AMD, CrossFire) with 2 PCI-E x8 slots.
I also have 2x 8800GTS 512MB (G92 chip) video cards.
I am currently running Vista x64 but will probably be upgrading to Win7 x64.

Can anyone tell me the correct drivers, BIOS mods, or whatever else I need to get SLI working with my setup?

    thanks.
     
  14. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
  15. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
    Last edited: May 21, 2010
  16. tdbone1

    Joined:
    Dec 31, 2004
    Messages:
    161 (0.04/day)
    Thanks Received:
    1
    AWESOME!

    just to recap

    SLIPatch 0.7 WinVista x64 alpha
    Vista x64 (UAC turned to OFF)
    MSI 890GXM-G65 bios 1.6
    (2) msi 8800gts 512MB in SLI (with SLI Bridge connector)
    197.45

Didn't know any driver worked...

Gonna switch to the newer driver and run some benchmarks.

    here are the benchmarks for non-sli and sli with the 197.45

    3400mhz 17x200 (955BE)

    3DMark Score 14869
    SM 2.0 Score 6384
    SM 3.0 Score 6116
    CPU Score 4670


    3400mhz 17x200 SLI

    3DMark Score 17857
    SM 2.0 Score 6848
    SM 3.0 Score 8799
    CPU Score 4785
    Thanks for the quick info.
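For anyone curious how those two runs compare, a quick sketch (plain Python) computing the SLI gain per sub-score from the numbers posted above:

```python
# SLI scaling computed from the 3DMark06 scores posted above (955BE @ 3.4 GHz).
single = {"3DMark": 14869, "SM 2.0": 6384, "SM 3.0": 6116, "CPU": 4670}
sli    = {"3DMark": 17857, "SM 2.0": 6848, "SM 3.0": 8799, "CPU": 4785}
for key in single:
    gain = (sli[key] / single[key] - 1) * 100  # percent gain from SLI
    print(f"{key}: {gain:+.1f}%")
```

The SM 3.0 test scales best (roughly +44%), while the overall score only gains about 20%, largely because the CPU score barely moves and 3DMark06 weighs the CPU heavily.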
     
    Last edited: May 21, 2010
  17. Sailindawg New Member

    Joined:
    Sep 22, 2009
    Messages:
    31 (0.01/day)
    Thanks Received:
    4
Anyone tried this with either a pair of 470s or 480s yet?

    EDIT: Tin, missed your earlier post. Looking forward to your results.
     
    Last edited: May 22, 2010
  18. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
Try Vantage, tdbone1; you'll see a much bigger difference. You will in games too. XP64 under SLI just screams! But 3DMark06 is too CPU-bound.
     
  19. tdbone1

    Joined:
    Dec 31, 2004
    Messages:
    161 (0.04/day)
    Thanks Received:
    1
  20. Neeyon New Member

    Joined:
    May 14, 2010
    Messages:
    5 (0.00/day)
    Thanks Received:
    0
    Everyone is reporting that the newest Nvidia 257+ drivers are boosting framerates for some games like crazy. Has anyone confirmed that these drivers work for SLI Patched boards? I would try it out myself and post my results but unfortunately, I am at work.
     
  21. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
    Haven't had much time to test yet, but 257.15 works fine with sli patch. This is what the new sli section looks like in cp:
[screenshot of the new SLI section in the Control Panel]
     
  22. r3b00t3r New Member

    Joined:
    Dec 7, 2009
    Messages:
    39 (0.02/day)
    Thanks Received:
    7
    Location:
    Alverca / Lisboa / Portugal
I installed these new 257.15 drivers, and ran 3DMark Vantage at default settings before (with 197.45) and after installation.

    SLI works flawlessly with this bench (I haven't tried any games yet), but I don't see any improvements. I had the following results:

197.45 - P22033
257.15 - P21903

Because of the dispersion in 3DMark Vantage results (I've done some testing and concluded that a 200-mark difference is not significant), I think I can say with good confidence that these new drivers have no impact on 3DMark Vantage performance for my cards (and the rest of my system).
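The "within the noise" reasoning above can be written out explicitly (plain Python, using the two scores and the ~200-point dispersion estimate from this post):

```python
# Compare the two Vantage runs against the observed run-to-run noise.
old_score, new_score = 22033, 21903  # before / after the driver update
noise = 200                          # measured run-to-run dispersion
delta = new_score - old_score
print(f"delta = {delta} points ({delta / old_score * 100:+.2f}%)")
print("within noise" if abs(delta) <= noise else "real difference")
```

A 130-point swing on a ~22,000-point score is about 0.6%, well inside the measured dispersion, so no conclusion can be drawn from it.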

Let me point out that the performance boost you mentioned is only for the new GTX 4xx series, with verified results for a single GTX 480.
     
  23. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,513 (1.60/day)
    Thanks Received:
    900
    Location:
    Nelson B.C. Canada
Like I said, I haven't done any testing yet, but from what I've read around the net, these drivers improve performance on all cards. BFBC2 is supposedly really improved. Many people with SLI 260s give it a thumbs up, so I imagine they help with most cards. And use Vantage to check for improvement; the only way 3DMark06 will get higher is if you OC your CPU!
     
    Last edited: May 25, 2010
  24. r3b00t3r New Member

    Joined:
    Dec 7, 2009
    Messages:
    39 (0.02/day)
    Thanks Received:
    7
    Location:
    Alverca / Lisboa / Portugal
Nice to know! I probably won't test these drivers with any games until Thursday or Friday, but it's great to have some gaming improvement to look forward to!
     
  25. ken2cky New Member

    Joined:
    Apr 20, 2010
    Messages:
    41 (0.02/day)
    Thanks Received:
    2
    Does installing new drivers undo the SLIPatch? I mean, how do I install the new drivers?
    1. Uninstall SLIPatch
    2. Uninstall older drivers
    3. Install new drivers
    4. Reinstall SLIPatch
Or can I skip steps 1 & 4 and only do 2 & 3?
     
