
NVIDIA Optimus Technology Delivers Perfect Balance Of Performance And Battery Life

Discussion in 'News' started by btarunr, Feb 10, 2010.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    27,676 (11.61/day)
    Thanks Received:
    13,414
    Location:
    Hyderabad, India
NVIDIA Corp. announces NVIDIA Optimus technology, a breakthrough for notebook PCs that chooses the best graphics processor for running a given application and automatically routes the workload to either an NVIDIA discrete GPU or Intel integrated graphics – delivering great performance while also providing great battery life.

    “Consumers no longer have to choose whether they want great graphics performance or sustained battery life,” said Rene Haas, general manager of notebook products at NVIDIA. “NVIDIA Optimus gives them both - great performance, great battery life and it simply works.” Just as a hybrid car switches on the fly between its gas engine and electric motor, using whichever is most appropriate, NVIDIA Optimus technology does the same thing for graphics processors. NVIDIA Optimus technology instantly directs the workload to the most efficient processor for the job, extending battery life by up to 2 times compared to similarly configured systems equipped with discrete graphics processors (GPUs). When playing 3D games, running videos, or using GPU compute applications, the high-performance NVIDIA discrete GPU is used. When using basic applications, like web surfing or email, the integrated graphics processor is used. The result is long-lasting battery life without sacrificing great graphics performance.
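    The switching policy described in the release can be sketched roughly as follows. This is purely illustrative: the workload categories, function names, and routing table below are assumptions for the sake of the sketch, not NVIDIA's actual driver logic.

    ```python
    # Illustrative sketch of Optimus-style GPU routing: send demanding
    # workloads to the discrete GPU, everything else to integrated graphics.
    # The categories here are assumptions, not NVIDIA's real heuristics.

    GPU_HEAVY = {"3d_game", "video_playback", "gpu_compute"}

    def route_workload(workload: str) -> str:
        """Return which GPU an Optimus-style policy would pick for a workload."""
        return "discrete_gpu" if workload in GPU_HEAVY else "integrated_gpu"

    print(route_workload("3d_game"))       # heavy work -> discrete GPU
    print(route_workload("web_browsing"))  # light work -> integrated graphics
    ```

    The point of the design, per the release, is that this decision happens automatically per application, with no manual switch or reboot.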


    “The genius of NVIDIA Optimus is in its simplicity,” said Dr. Jon Peddie, President of Jon Peddie Research, a pioneer of the graphics industry and a leading analyst. “One can surf the web and get great battery life and when one needs the extra horsepower for applications like Adobe Flash 10.1, Optimus automatically switches to the more powerful NVIDIA GPU.”


    Notebooks with NVIDIA Optimus technology will be available shortly, starting with the Asus UL50Vf, N61Jv, N71Jv, N82Jv, and U30Jc notebooks. For more information on NVIDIA Optimus technology visit the NVIDIA Website here.
  2. mcloughj

    Joined:
    Oct 27, 2005
    Messages:
    302 (0.10/day)
    Thanks Received:
    65
    Location:
    Dublin, Ireland
    I wonder if Hasbro will sue!?
    Roph and Zubasa say thanks.
  3. buggalugs

    buggalugs

    Joined:
    Jul 19, 2008
    Messages:
    896 (0.43/day)
    Thanks Received:
    132
    Location:
    Australia
    I don't think having 2 GPUs is very efficient. Why not just make a decent GPU that downclocks to IGP level when it's not needed? That would be something to get excited about.
    Roph says thanks.
  4. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (3.04/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    Sorry the name optimus is taken and where's teh fermi?
  5. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    41,680 (11.98/day)
    Thanks Received:
    9,210
    its fermly on schedule for a release before 2012
  6. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    27,676 (11.61/day)
    Thanks Received:
    13,414
    Location:
    Hyderabad, India
    If they had named it "Optimus Prime", then yes.
  7. inferKNOX

    inferKNOX New Member

    Joined:
    Jul 17, 2009
    Messages:
    899 (0.52/day)
    Thanks Received:
    118
    Location:
    SouthERN Africa
    An attempt at countering PowerPlay I take it...:ohwell:
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    41,680 (11.98/day)
    Thanks Received:
    9,210
    "we cant make power efficient cards for shit, so we're inventing this instead"
  9. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (3.04/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe

    it better



    kludge ftw?
  10. Cheeseball

    Joined:
    Jan 2, 2009
    Messages:
    652 (0.34/day)
    Thanks Received:
    79
    This is HybridPower version 2, but it works with Intel GPUs.
  11. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,098 (5.69/day)
    Thanks Received:
    1,943
    Location:
    Home
    I have PowerXpress, switching between 4200 and 4570 whenever I like. Took Nvidia long enough to notice that.
  12. xrealm20

    xrealm20 New Member

    Joined:
    Jan 29, 2010
    Messages:
    467 (0.30/day)
    Thanks Received:
    129
    Location:
    Houston
    Interesting, I have an old Sony Z series laptop that does something similar to this. There's a switch just above the keyboard that either disables the nVidia graphics and enables the onboard Intel, or vice versa.

    It's their "stamina/performance" switch.
  13. TheLostSwede

    TheLostSwede

    Joined:
    Nov 11, 2004
    Messages:
    909 (0.26/day)
    Thanks Received:
    154
    No need for a switch; it's all done on the fly as you "need" the Nvidia GPU, if you believe the PR.
  14. ktr

    ktr

    Joined:
    Apr 7, 2006
    Messages:
    7,406 (2.53/day)
    Thanks Received:
    687
    Yea, Sony did do this with the switch. Sadly you had to reboot to enable the other GPU (not on-the-fly). IMO, I don't see the point of this. High-end mobile GPUs can already underclock and undervolt during low usage (PowerPlay and other technology).
  15. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,484 (6.35/day)
    Thanks Received:
    5,725
    But even when underclocked and undervolted, they still use a huge amount of power compared to a weaker card, and that will continue to be the case until they figure out how to completely turn off parts of the silicon, which I don't think will be much longer.
    angelkiller says thanks.
    25 Million points folded for TPU
  16. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    41,680 (11.98/day)
    Thanks Received:
    9,210
    if the intel uses 10W at load (desktop) and a G92 card can use up to 50W at idle... you get the idea for battery life. the bigger the GPU, the more this will help.


    shit, i want this on desktops - i get ~250W of power draw from my two cards idling.
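    Mussels' back-of-the-envelope above can be made concrete. The 10 W and 50 W figures come from the post; the 50 Wh battery capacity is an assumed round number for illustration, and this counts GPU draw only, ignoring the rest of the system.

    ```python
    # Rough battery-life comparison using the wattages quoted above.
    # 50 Wh battery capacity is an assumed figure, not from the thread.
    BATTERY_WH = 50.0
    igp_watts = 10.0             # Intel IGP under load, per the post
    discrete_idle_watts = 50.0   # G92-class discrete GPU at idle, per the post

    hours_on_igp = BATTERY_WH / igp_watts
    hours_on_discrete = BATTERY_WH / discrete_idle_watts
    print(f"IGP: {hours_on_igp:.1f} h, discrete at idle: {hours_on_discrete:.1f} h")
    # On GPU power alone, the IGP path lasts 5x longer here -- which is why
    # powering the discrete GPU off entirely beats merely downclocking it.
    ```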
  17. inferKNOX

    inferKNOX New Member

    Joined:
    Jul 17, 2009
    Messages:
    899 (0.52/day)
    Thanks Received:
    118
    Location:
    SouthERN Africa
    PowerPlay is an ATi tech, so nVidia can't use it and are trying to make a competitive technology with this.
    I can tell you for one that the PowerPlay on my 5850 drops its GPU freq from my overclocked 775MHz down to 157MHz, and the same for the OC'd GDDR5 at 1125MHz that drops to 300MHz, within about 1-2 secs of becoming idle, dropping the power draw by almost a factor of 10!
    nVidia realise, I think, that in the face of this (and the concern for the environment by most potential consumers), and considering that they're making another monolithic GPU with the GF100 which will no doubt draw some crazy power, they need to reassure everyone that once 10+ people switch on their GF100-rigged PCs, it won't cause a global blackout.:laugh:
  18. TheMailMan78

    TheMailMan78 Banstick Dummy

    Joined:
    Jun 3, 2007
    Messages:
    20,635 (8.22/day)
    Thanks Received:
    7,244
