
NVIDIA Hybrid SLI Technology

malware

New Member
Joined
Nov 7, 2004
Messages
5,422 (0.72/day)
Location
Bulgaria
Processor Intel Core 2 Quad Q6600 G0 VID: 1.2125
Motherboard GIGABYTE GA-P35-DS3P rev.2.0
Cooling Thermalright Ultra-120 eXtreme + Noctua NF-S12 Fan
Memory 4x1 GB PQI DDR2 PC2-6400
Video Card(s) Colorful iGame Radeon HD 4890 1 GB GDDR5
Storage 2x 500 GB Seagate Barracuda 7200.11 32 MB RAID0
Display(s) BenQ G2400W 24-inch WideScreen LCD
Case Cooler Master COSMOS RC-1000 (sold), Cooler Master HAF-932 (delivered)
Audio Device(s) Creative X-Fi XtremeMusic + Logitech Z-5500 Digital THX
Power Supply Chieftec CFT-1000G-DF 1kW
Software Laptop: Lenovo 3000 N200 C2DT2310/3GB/120GB/GF7300/15.4"/Razer
NVIDIA is planning a new technology called Hybrid SLI that can boost graphics performance on systems with both NVIDIA's IGP (integrated graphics processor) and a discrete GPU, and can also extend the battery life of notebook PCs. Hybrid SLI will turn off the discrete GPU and run only the IGP when the user is working in ordinary 2D applications. When the user runs a 3D application, the technology will automatically turn the discrete GPU back on and boost performance, without any need to reboot the system or manually switch between the two graphics processors. NVIDIA expects to use the new technology in both desktop and notebook PCs. The technology is expected to appear by the end of the year, according to Jen-Hsun Huang, chief executive of NVIDIA Corp.

View at TechPowerUp Main Site
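For illustration only, here is a rough sketch of the switching behaviour described above. The class and names are made up for this example and are not anything from NVIDIA's actual driver; the real decision logic would obviously be far more involved.

# Hypothetical sketch of the Hybrid SLI idea: IGP for 2D, discrete GPU for 3D, no reboot.
class HybridGraphics:
    def __init__(self):
        self.discrete_gpu_on = False  # start on the IGP to save power

    def on_app_start(self, app_is_3d):
        if app_is_3d and not self.discrete_gpu_on:
            self.discrete_gpu_on = True   # power up the discrete GPU, no reboot needed
            print("3D app detected: discrete GPU enabled")
        elif not app_is_3d and self.discrete_gpu_on:
            self.discrete_gpu_on = False  # drop back to the IGP for plain 2D work
            print("Back to 2D: discrete GPU powered down, IGP drives the display")

gfx = HybridGraphics()
gfx.on_app_start(app_is_3d=False)  # desktop work stays on the IGP
gfx.on_app_start(app_is_3d=True)   # launching a game wakes the discrete GPU
gfx.on_app_start(app_is_3d=False)  # closing it drops back to the IGP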
 
Nifty. I like the idea.
 
Should help boost the battery life of high-performance notebooks, which is good...
 
Nice idea. Definitely useful.
 
Hmm... a neat idea, but if you're only going to use the IGP for 2D work, why not get rid of it entirely and have the CPU draw what's on the screen? With today's incredibly fast processors, it shouldn't be that much of a hassle, should it?
 
Definitely sounds interesting. IGP and discrete card. Sounds pretty sweet. Just wonder what game performance is like when the IGP borrows from system memory.
 
I think it would be a little more of a problem than a help. Don't ask why.
 
Definitely sounds interesting. IGP and discrete card. Sounds pretty sweet. Just wonder what game performance is like when the IGP borrows from system memory.

I'd assume it'd release that memory when you load a 3D game.
 
Si, è una tecnologia molto utile. Nvidia è una compagnia forte. :rockout: [Yes, it's a very useful technology. Nvidia is a strong company.]

Uhh... what?
 
CRAP IDEA

Basically it's a kludge because NVIDIA can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop work, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVIDIA should pull their finger out and do the same.
 
Why is it a crap idea to shut down a 100 W draw when it's twiddling its thumbs? Integrated is a lot better than it used to be, and the demand has shown it. I would buy into this on the ATI side.
 
Hybrid Graphic System
Sony's revolutionary Hybrid Graphic System lets you set your graphics performance. A simple hardware switch enables you to toggle between an internal graphics chip for optimal power consumption with excellent performance and an external graphics chip for even more robust performance, for unmatched control of your time and output.

This has been around for a while. I'm sure I saw another laptop (not the Sony referred to in the quote) that had a 7900 Go but fell back on the Intel onboard in 2D. I think it's just a case of NVIDIA giving the technology a name (and probably a patent too).
 
This has been around for a while. I'm sure I saw another laptop (not the Sony referred to in the quote) that had a 7900 Go but fell back on the Intel onboard in 2D. I think it's just a case of NVIDIA giving the technology a name (and probably a patent too).

That laptop had a physical switch to do it, but the laptop had to be powered off to change it.

To those who don't get this, a few facts.

An 8800 GTX uses ~180 W at most.
An HD 2900 XT uses 250 W at most.

Onboard video - 30 W would be a good figure.

Swapping to the onboard saves power and means less heat in the PC (and therefore less cooling). The main 3D card will live a longer life, and the greenies won't hunt you down and murder you in your sleep for contributing to global warming.

I'd be glad for a way to reduce the 200 W power draw of my PC at idle (it's 298 W under load), as it makes it too expensive to leave on downloading overnight.
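For a rough sense of what that overnight saving could look like, here is a back-of-the-envelope calculation. The 100 W reduction and the electricity rate are assumptions for illustration, not measurements from the system above.

# Rough numbers for the overnight-download case; the saving and rate are assumed.
idle_draw_discrete_w = 200   # idle draw reported above with the discrete card active
assumed_saving_w = 100       # guessed reduction if the IGP took over at idle (assumption)
hours_overnight = 8
price_per_kwh = 0.15         # assumed electricity rate, adjust for your region

energy_saved_kwh = assumed_saving_w * hours_overnight / 1000
cost_saved = energy_saved_kwh * price_per_kwh
print(f"~{energy_saved_kwh:.1f} kWh saved per night, roughly ${cost_saved:.2f}")
# -> ~0.8 kWh saved per night, roughly $0.12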
 
Undervolt the video card.
 
Undervolt the video card.

Hat... we can't. NVIDIA/ATI might be able to, but they don't seem to want to. This tech was mainly for laptops, where battery life is king, but it helps desktops too.
 
CRAP IDEA

Basically it's a kludge because NVIDIA can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop work, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVIDIA should pull their finger out and do the same.

Sometimes fanboys can't resist taking down good ideas from the enemy... c'mon man, give NVIDIA a break. :laugh:

Now I wish I had an onboard IGP...
 
how much "hard core" gaming actualy gets done on a notebook anyway??????
(I actualy don't know, not a trick question)
 
CRAP IDEA

Basically it's a kludge because NVIDIA can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop work, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVIDIA should pull their finger out and do the same.

I want you to show me a single AGP 8x/PCI-E 16x 3D video card from Intel. Intel has clock and voltage control down to a dream? Since when? AMD has Cool'n'Quiet on their CPUs; Intel's method barely saves any power at all (no voltage control, only multiplier adjustments). Intel doesn't even MAKE video cards, unless you count onboard.

It seems you already have this technology implanted in your head, as your brain switched off the minute you hit the reply button.

Edit:
Si, è una tecnologia molto utile. Nvidia è una compagnia forte. :rockout:


"Yes, this techology is very useful. Nvidia is a strong company" - is that translation right? I dont speak... spanish?
Anyway, please speak english on the forums :) thank you
 
CRAP IDEA

Basically it's a kludge because NVIDIA can't throttle the power consumption of their discrete GPUs.

Intel can do it. Can you imagine what BULLSHIT it would be to have a 386ULV running your OS for desktop work, only to kick in a Core 2 Duo for hard-core app work? Completely stupid idea. Intel has clock and voltage throttling down to a dream. NVIDIA should pull their finger out and do the same.

Uhh, dude...
Video cards don't have FSBs or multipliers. The most they can do is lower the voltage and clocks when the card isn't under strain, but this technology NVIDIA has is still better than that, because try as you might, you won't get a discrete video card running at 30-50 watts like an integrated graphics chip.

fanboy.
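A quick sanity check on that point, using the usual dynamic-power rule of thumb (power scales roughly with voltage squared times clock speed). The wattage, voltage, and clock figures below are assumed for illustration, and static/leakage power is ignored.

# Why downclocking/undervolting alone can't reach IGP territory (assumed figures).
full_power_w = 180         # assumed full-load draw of a high-end card
v_full, v_low = 1.2, 1.0   # assumed core voltages (stock vs lowered for 2D)
clock_scale = 0.5          # assume the 2D clock is half the 3D clock

idle_power_w = full_power_w * (v_low / v_full) ** 2 * clock_scale
print(f"~{idle_power_w:.0f} W")  # ~62 W: still roughly double a ~30 W IGP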
 
(no voltage control, only multiplier adjustments)

They have voltage control. When it steps down the multiplier, it will also step down the voltage.

Onboard video - 30 W would be a good figure.

I hope to God not. My laptop has a mobile C2D with a TDP of 35 watts, and my power adapter is 65 watts. So even if the CPU drew its full 35 watts, that would leave only about 30 watts for the rest of the system, and keep in mind I have low-end discrete graphics. That's ~30 watts for graphics, the monitor, Wi-Fi, the HDD, optical drive, north bridge, south bridge, USB power, NIC, and maybe a few other things.
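A quick check of that power budget (simplified, since TDP isn't exactly what the CPU pulls from the adapter):

adapter_w = 65    # power adapter rating
cpu_tdp_w = 35    # mobile C2D TDP

rest_of_system_w = adapter_w - cpu_tdp_w
print(f"Everything besides the CPU shares roughly {rest_of_system_w} W")
# -> about 30 W for graphics, screen, drives, chipset, Wi-Fi, USB, etc.,
#    so a 30 W figure for onboard video alone can't apply to a laptop like this.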
 
They have voltage control. When it steps down the multiplier, it will also step down the voltage.


I hope to God not. My laptop has a mobile C2D with a TDP of 35 watts, and my power adapter is 65 watts. So even if the CPU drew its full 35 watts, that would leave only about 30 watts for the rest of the system, and keep in mind I have low-end discrete graphics. That's ~30 watts for graphics, the monitor, Wi-Fi, the HDD, optical drive, north bridge, south bridge, USB power, NIC, and maybe a few other things.

The voltage control doesn't work on any of the C2D systems I've used, and with my power meter I've seen barely a 5-10 W drop in power use (stock and OC'd tests; these were all E6600s on ASUS boards).

AMD at least can drop to 1 GHz (not just two multiplier notches down) and the voltage drops significantly - I've seen a 20-30 W reduction in power use with AMD's CnQ.

As for my 30 W comment - I meant desktop onboard, and should have said <30 W. Also, I meant peak, not TDP or average. I've got an NVIDIA 6150 LE onboard in a PC here and it uses 15 W or so (compared to a PCI card with 512 KB of RAM ;) ), but yeah, I guess <15 W is better with the trend towards passive heatsinks on mobos these days. (Laptops, of course: the lower the better.)
 