
FurMark Returns with Version 1.7.0

Discussion in 'News' started by btarunr, Jul 3, 2009.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    27,711 (11.59/day)
    Thanks Received:
    13,424
    Location:
    Hyderabad, India
    Nearly four months after its previous version, the chaps at oZone3D have released FurMark 1.7.0. This release packs a host of nifty new features and a number of bug fixes. For starters, FurMark can work along with GPU-Z to provide real-time readings of the graphics card's temperatures, voltages, and VDDC current (for cards that support it). An experimental feature lets you tweet your score to your Twitter account. While the stability test or benchmark is running, the main GUI stays minimized, so you don't have to start another instance to run several tests.

    With multiple GPUs doing the rendering, each GPU is given its own temperature graph. You can start or stop the rendering by hitting the space key, without having to close the window. A number of new resolutions have been added, and the application is now also available in Castilian, Bulgarian, Polish, Slovak, and Spanish translations. Issues relating to temperature updates in the graph and to the application's multithreading management have been resolved. Give your graphics cards a sunbath.
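    For the curious: GPU-Z publishes its sensor readings through a named shared-memory block that third-party tools can read, which is presumably how this integration works. Here's a minimal reader sketch in Python, assuming the commonly circulated "GPUZShMem" mapping name and record layout; verify both against the GPU-Z build you run, and note this is not FurMark's actual code.

    [CODE]
    # Sketch: reading GPU-Z's sensor shared memory on Windows (Python 3).
    # The mapping name "GPUZShMem" and the struct layout below follow the
    # shared-memory spec published for third-party tools; treat both as
    # assumptions and check them against your GPU-Z version.
    import ctypes
    import mmap

    MAX_RECORDS = 128

    class DataRecord(ctypes.Structure):
        _fields_ = [("key",   ctypes.c_wchar * 256),
                    ("value", ctypes.c_wchar * 256)]

    class SensorRecord(ctypes.Structure):
        _fields_ = [("name",   ctypes.c_wchar * 256),
                    ("unit",   ctypes.c_wchar * 8),
                    ("digits", ctypes.c_uint32),
                    ("value",  ctypes.c_double)]

    class GpuzShMem(ctypes.Structure):
        _fields_ = [("version",    ctypes.c_uint32),
                    ("busy",       ctypes.c_long),
                    ("lastUpdate", ctypes.c_uint32),
                    ("data",       DataRecord * MAX_RECORDS),
                    ("sensors",    SensorRecord * MAX_RECORDS)]

    def read_sensors():
        # GPU-Z must already be running, so the mapping exists.
        shm = mmap.mmap(-1, ctypes.sizeof(GpuzShMem), "GPUZShMem")
        mem = GpuzShMem.from_buffer_copy(shm)
        shm.close()
        return {s.name: (s.value, s.unit) for s in mem.sensors if s.name}

    if __name__ == "__main__":
        for name, (value, unit) in read_sensors().items():
            print(f"{name}: {value:g} {unit}")
    [/CODE]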

    [image]

    DOWNLOAD: FurMark 1.7.0

    Source: Geeks3D
    Last edited by a moderator: Jul 3, 2009
    r1rhyder and W1zzard say thanks.
  2. MRCL

    MRCL

    Joined:
    May 31, 2008
    Messages:
    5,790 (2.69/day)
    Thanks Received:
    860
    Location:
    Switzerland, Heart of Europe
    I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.
    Crunching for Team TPU
  3. z1tu

    Joined:
    Dec 31, 2008
    Messages:
    245 (0.13/day)
    Thanks Received:
    27
    Location:
    Romania
    Yes, my card is having trouble even without grilling it :roll:
  4. entropy13

    entropy13

    Joined:
    Mar 2, 2009
    Messages:
    4,872 (2.59/day)
    Thanks Received:
    1,172
    There are vampires in Switzerland? :eek:
  5. boogerlad

    Joined:
    Jun 29, 2006
    Messages:
    219 (0.08/day)
    Thanks Received:
    12
    How much CPU does FurMark use?
  6. h3llb3nd4

    h3llb3nd4 New Member

    Joined:
    Feb 15, 2009
    Messages:
    3,323 (1.75/day)
    Thanks Received:
    307
    Location:
    Durban, South Africa
    It's mostly the GPU that it's utilising, so I don't think the CPU is being stressed.
  7. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    27,711 (11.59/day)
    Thanks Received:
    13,424
    Location:
    Hyderabad, India
    It's mostly single-threaded even today.

    [image]
    fullinfusion and boogerlad say thanks.
  8. boogerlad

    Joined:
    Jun 29, 2006
    Messages:
    219 (0.08/day)
    Thanks Received:
    12
    Thanks, that's good. Then I could stress test my CPU, GPU and RAM all at the same time!
  9. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,156 (13.80/day)
    Thanks Received:
    13,607
    Sweet, now I can destroy more than one card at once! :nutkick: I don't like programs that overstress hardware. Too bad they can't tone it down a bit.
    Last edited: Jul 3, 2009
    fullinfusion says thanks.
  10. boogerlad

    Joined:
    Jun 29, 2006
    Messages:
    219 (0.08/day)
    Thanks Received:
    12
    The ultimate torture test: FurMark running at max settings, LinX for 20 passes, and memtest, all at the same time!
  11. boogerlad

    Joined:
    Jun 29, 2006
    Messages:
    219 (0.08/day)
    Thanks Received:
    12
    I don't think the power draw reading is right. 66 watts at full load for a GTX 260?
  12. dcf-joe

    dcf-joe

    Joined:
    Feb 11, 2008
    Messages:
    372 (0.16/day)
    Thanks Received:
    20
    Location:
    Nebraska, USA
    Is this any good for a single 4870x2?

    [image]
  13. sneekypeet

    sneekypeet Unpaid Babysitter Staff Member

    Joined:
    Apr 12, 2006
    Messages:
    21,310 (7.26/day)
    Thanks Received:
    5,842
  14. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,545 (4.00/day)
    Thanks Received:
    11,227
    Of course you can tone it down; work with the different settings available.
  15. Hayder_Master

    Hayder_Master

    Joined:
    Apr 21, 2008
    Messages:
    5,159 (2.35/day)
    Thanks Received:
    637
    Location:
    IRAQ-Baghdad
    Nice, I'm going to try it.
  16. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    7,993 (2.58/day)
    Thanks Received:
    1,088
    Sweet.

    Only 3°C above idle at full load. 41°C, but the house is hot :(
    10 Million points folded for TPU
  17. r1rhyder New Member

    Joined:
    Jul 20, 2008
    Messages:
    194 (0.09/day)
    Thanks Received:
    32
    Location:
    texas
    I could grill a steak on my cards.


    [image]
  18. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.85/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    Curiously, btarunr's shot displays:
    While for my HD4890 it says:
    Why are Radeons running the app with OpenGL 2.1 and not 3.0? These cards are supposed to be OpenGL 3.1 compliant. My shot was taken on official Cat 9.6s.
    For a 55 nm card it wouldn't be a problem. Remember that figure accounts for nothing but the GPU. There are also 14 memory chips onboard that each munch away something like 2 W.

    Here's a shot of a HD4890 getting busy:
    [image]
    The reported wattage figure for this card is even less relevant, as these things have secondary core power circuitry whose output is not included in this figure. And of course memory on top of that.
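    Rough math on how the VDDC-only figure understates the whole board (a sketch; the chip count is from above, everything else is an illustrative assumption):

    [CODE]
    # Back-of-envelope: total board draw from a GPU-Z style VDDC (core) reading.
    # Per-chip wattage, misc-rail draw and VRM efficiency are assumed figures.
    def board_power(vddc_w, mem_chips=14, w_per_chip=2.0,
                    misc_rails_w=10.0, vrm_efficiency=0.90):
        dc_load = vddc_w + mem_chips * w_per_chip + misc_rails_w
        return dc_load / vrm_efficiency  # conversion losses end up as VRM heat

    # boogerlad's 66 W GTX 260 reading plus ~2 W for each of its 14 memory chips:
    print(f"{board_power(66):.0f} W")  # roughly 116 W for the whole card
    [/CODE]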
    Last edited: Jul 4, 2009
  19. boogerlad

    Joined:
    Jun 29, 2006
    Messages:
    219 (0.08/day)
    Thanks Received:
    12
    Then why do some graphics cards need one 6-pin and one 8-pin connector? PCI-E slot = 75 watts, 6-pin = 75 watts, and 8-pin = 150 watts. In total, the max is 300 watts for a graphics card. But none of these cards actually reach that high.
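    The arithmetic as a quick sketch (connector budgets per the PCI Express spec):

    [CODE]
    # PCI Express power ceilings per source (watts, per the spec).
    BUDGET = {"slot": 75, "6-pin": 75, "8-pin": 150}

    def max_board_power(*plugs):
        """Spec ceiling for a card fed by the slot plus the given plugs."""
        return BUDGET["slot"] + sum(BUDGET[p] for p in plugs)

    print(max_board_power("6-pin", "8-pin"))  # 300 W, the total quoted above
    print(max_board_power("6-pin"))           # 150 W for a single-plug card
    [/CODE]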
  20. Arrakis+9

    Arrakis+9

    Joined:
    Aug 10, 2007
    Messages:
    1,452 (0.59/day)
    Thanks Received:
    491
    Also keep in mind that quite a bit of the wattage converted by the VRMs is wasted as heat.
  21. denice25 New Member

    Joined:
    Feb 24, 2009
    Messages:
    316 (0.17/day)
    Thanks Received:
    17
    Thanks for the share.
  22. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.85/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    Because if the card has an onboard PCIe power plug, slot power cannot be used for powering the same load as the 6-pin plug is used for. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something one doesn't want to happen, for a number of reasons.
    Volterra chips are around 90-95% efficient. They seem to be more efficient than other, more conventional VRMs, which is evident from the increased power consumption of the GTX 295 when it went from two PCBs to one PCB that no longer uses Volterra VRMs.
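    To put that efficiency range in numbers (a sketch; the 90-95% is from the post above, the 120 W load is made up):

    [CODE]
    # VRM conversion loss at a given efficiency: whatever doesn't reach the
    # GPU is dissipated in the regulators as heat.
    def vrm_loss_watts(output_w, efficiency):
        input_w = output_w / efficiency
        return input_w - output_w

    for eff in (0.90, 0.95):
        print(f"{eff:.0%}: {vrm_loss_watts(120, eff):.1f} W of heat on a 120 W core")
    # 90%: ~13.3 W vs 95%: ~6.3 W -- hence the jump when the Volterras went away
    [/CODE]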
  23. hat

    hat Maximum Overclocker

    Joined:
    Nov 20, 2006
    Messages:
    16,820 (6.20/day)
    Thanks Received:
    2,032
    Location:
    Ohio
    That doesn't make sense. Why would one use a 75 W power connector when one could simply use the 75 W from the slot, seeing as the power from the slot becomes unavailable when an external power connector is present? And PCI-E 2.0 is 150 W... why would anyone put a single 75 W external power connector (à la 8800 GTS G92) on a card that already gets 150 W from the slot, when using an external power source makes the slot power unavailable? I must have misunderstood somehow...
  24. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    27,711 (11.59/day)
    Thanks Received:
    13,424
    Location:
    Hyderabad, India
    Uhm, no. The PCI-E 2.0 x16 slot provides 75W. Not a Watt more.
  25. hat

    hat Maximum Overclocker

    Joined:
    Nov 20, 2006
    Messages:
    16,820 (6.20/day)
    Thanks Received:
    2,032
    Location:
    Ohio
    You know... you're the first person I've ever heard say that... can you back that statement up?
