
NVIDIA Optimus Technology Delivers Perfect Balance Of Performance And Battery Life

btarunr

NVIDIA Corp. announces NVIDIA Optimus technology, a breakthrough for notebook PCs that chooses the best graphics processor for a given application and automatically routes the workload to either an NVIDIA discrete GPU or Intel integrated graphics - delivering great performance while also providing long battery life.

"Consumers no longer have to choose whether they want great graphics performance or sustained battery life," said Rene Haas, general manager of notebook products at NVIDIA. "NVIDIA Optimus gives them both - great performance, great battery life and it simply works." Just as a Hybrid car chooses between the gas-powered and electric car engine on-the-fly and uses the most appropriate engine, NVIDIA Optimus technology does the same thing for graphics processors. NVIDIA Optimus Technology instantly directs the workload through the most efficient processor for the job, extending battery life by up to 2 times compared to similarly configured systems equipped with discrete graphics processors (GPUs). When playing 3D games, running videos, or using GPU compute applications the high-performance NVIDIA discrete GPU is used. When using basic applications, like web surfing or email, the integrated graphics processor is used. The result is long lasting battery life without sacrificing great graphics performance.



"The genius of NVIDIA Optimus is in its simplicity," said Dr. Jon Peddie, President of Jon Peddie Research, a pioneer of the graphics industry and a leading analyst. "One can surf the web and get great battery life and when one needs the extra horsepower for applications like Adobe Flash 10.1, Optimus automatically switches to the more powerful NVIDIA GPU."




Notebooks with NVIDIA Optimus technology will be available shortly, starting with the Asus UL50Vf, N61Jv, N71Jv, N82Jv, and U30Jc notebooks. For more information on NVIDIA Optimus technology, visit the NVIDIA website here.

 
I wonder if Hasbro will sue!?
 
I don't think having 2 GPUs is very efficient. Why not just make a decent GPU that downclocks to IGP level when it's not needed? That would be something to get excited about.
 
Sorry, the name Optimus is taken, and where's the Fermi?
 
An attempt at countering PowerPlay I take it...:ohwell:
 
An attempt at countering PowerPlay I take it...:ohwell:

"we cant make power efficient cards for shit, so we're inventing this instead"
 
It's fermly on schedule for a release before 2012 :ohnoes:

It better. :ohnoes:



"we cant make power efficient cards for shit, so we're inventing this instead"


kludge ftw?
 
This is HybridPower version 2, but it works with Intel GPUs.
 
I have PowerXpress, switching between 4200 and 4570 whenever I like. Took Nvidia long enough to notice that.
 
Interesting, I have an old Sony Z series laptop that does something similar to this. There's a switch just above the keyboard that either disables the nVidia graphics and enables the onboard Intel, or vice versa.

It's the "stamina/performance" switch.
 
No need for a switch; it's all done on the fly as you "need" the Nvidia GPU, if you believe the PR.
 
Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).
 
Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).

But even when underclocked and undervolted, they still use a huge amount of power compared to a weaker card, and that will continue to be the case until they figure out how to completely turn off parts of the silicon, which I don't think will be much longer.
 
Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).

If the Intel IGP uses 10 W at load (desktop) and a G92 card can use up to 50 W at idle... you get the idea for battery life. The bigger the GPU, the more this will help.

Shit, I want this on desktops - I get ~250 W of power draw from my two cards idling.
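To make that concrete, here is a rough back-of-the-envelope calculation in Python using the 10 W and 50 W figures quoted above. The battery capacity and the rest-of-system idle draw are assumed values for illustration only.

```python
# Rough battery-life arithmetic based on the figures quoted in the post above.
# Battery capacity and the non-GPU system draw are ASSUMED for illustration.

BATTERY_WH = 56.0          # assumed notebook battery capacity (Wh)
REST_OF_SYSTEM_W = 15.0    # assumed CPU/screen/chipset idle draw (W)

IGP_W = 10.0               # integrated graphics figure from the post
DISCRETE_IDLE_W = 50.0     # G92 idle figure from the post

def runtime_hours(gpu_draw_w: float) -> float:
    """Battery runtime with a given GPU draw on top of the base system load."""
    return BATTERY_WH / (REST_OF_SYSTEM_W + gpu_draw_w)

print(f"Idle on integrated graphics: {runtime_hours(IGP_W):.1f} h")
print(f"Idle on the discrete GPU:    {runtime_hours(DISCRETE_IDLE_W):.1f} h")
```

With these assumed numbers the integrated path lasts roughly two to three times as long, which is in line with the "up to 2 times" battery-life claim in the press release.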
 
Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point to this. High-end mobile GPUs can already underclock and undervolt during low usage (powerplay and other technology).

PowerPlay is an ATi tech, so nVidia can't use it and are trying to build a competing technology with this.
I can tell you, for one, that PowerPlay on my 5850 drops its GPU frequency from my overclocked 775 MHz down to 157 MHz, and likewise the OC'd GDDR5 drops from 1125 MHz to 300 MHz, within about 1-2 seconds of going idle, cutting the power draw by a factor of almost 10!
nVidia realise, I think, that in the face of this (and the concern for the environment among most potential consumers), and considering that they're making another monolithic GPU with the GF100 which will no doubt draw some crazy power, they need to reassure everyone that once 10+ people switch on their GF100-rigged PCs, it won't cause a global blackout. :laugh:
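For what it's worth, a near-10x drop is plausible under the usual dynamic-power approximation P ∝ f·V². Here is a quick sanity check in Python, using the clock speeds quoted above; the voltage values are guesses for illustration, not measured figures.

```python
# Sanity check of the "almost 10x" idle power drop using P ~ f * V^2.
# Clock speeds come from the post above; the voltages are ASSUMED values.

def relative_dynamic_power(freq_mhz: float, volts: float) -> float:
    """Unitless figure proportional to dynamic power (f * V^2)."""
    return freq_mhz * volts ** 2

load_3d = relative_dynamic_power(775, 1.10)   # overclocked 3D clock, assumed ~1.10 V
idle_2d = relative_dynamic_power(157, 0.95)   # PowerPlay 2D clock, assumed ~0.95 V

print(f"Frequency scaling alone:     {775 / 157:.1f}x")
print(f"Including the voltage drop:  {load_3d / idle_2d:.1f}x")
```

The clock reduction alone accounts for roughly a 5x saving, and any accompanying undervolt (a squared term) pushes the dynamic-power saving further toward the order-of-10x figure described above.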
 