
FurMark Returns with Version 1.7.0

btarunr

Editor & Senior Moderator
Nearly four months after its previous version, the chaps at oZone3D have released FurMark 1.7.0. This release packs a host of nifty new features and a number of bug fixes. For starters, FurMark can work alongside GPU-Z to provide real-time readings of the graphics card's temperatures, voltages, and VDDC current (for cards that support it). An experimental feature lets you tweet your score to your Twitter account. While the stability test or benchmark is running, the main GUI stays minimized, so you don't have to start another instance to run several tests.

With multiple GPUs doing the rendering, each GPU gets its own temperature graph. You can start or stop the rendering by hitting the space bar, without having to close the window. A number of new resolutions have been added, and thanks to new translations the application is now also available in Bulgarian, Polish, Slovak, and Spanish (Castilian). Issues relating to temperature updates in the graph and the application's multithreading management are resolved. Give your graphics cards a sunbath.
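The GPU-Z tie-in presumably works through the shared-memory block GPU-Z can expose to other tools. Below is a minimal reader sketch in C, assuming the commonly published "GPUZShMem" mapping name and record layout; treat the struct as an approximation of that interface rather than a verified header, and note that GPU-Z must be running for the mapping to exist.

[CODE]
/* Hypothetical GPU-Z shared-memory reader (Windows).
   Mapping name and struct layout are assumptions based on the commonly
   published GPU-Z interface; check current GPU-Z documentation. */
#include <windows.h>
#include <stdio.h>

#define MAX_RECORDS 128

typedef struct { WCHAR key[256]; WCHAR value[256]; } GPUZ_RECORD;
typedef struct { WCHAR name[256]; WCHAR unit[8]; UINT32 digits; double value; } GPUZ_SENSOR_RECORD;
typedef struct {
    UINT32 version;                           /* interface version */
    volatile LONG busy;                       /* non-zero while GPU-Z is writing */
    UINT32 lastUpdate;                        /* tick count of last refresh */
    GPUZ_RECORD data[MAX_RECORDS];            /* static card info */
    GPUZ_SENSOR_RECORD sensors[MAX_RECORDS];  /* live readings: temps, VDDC, current, ... */
} GPUZ_SH_MEM;

int main(void)
{
    HANDLE map = OpenFileMappingW(FILE_MAP_READ, FALSE, L"GPUZShMem");
    if (!map) { puts("GPU-Z shared memory not found"); return 1; }

    GPUZ_SH_MEM *shm = (GPUZ_SH_MEM *)MapViewOfFile(map, FILE_MAP_READ, 0, 0, sizeof(GPUZ_SH_MEM));
    if (!shm) { CloseHandle(map); return 1; }

    /* Dump every populated sensor record. */
    for (int i = 0; i < MAX_RECORDS && shm->sensors[i].name[0]; i++)
        wprintf(L"%ls: %.*f %ls\n", shm->sensors[i].name,
                (int)shm->sensors[i].digits, shm->sensors[i].value, shm->sensors[i].unit);

    UnmapViewOfFile(shm);
    CloseHandle(map);
    return 0;
}
[/CODE]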



DOWNLOAD: FurMark 1.7.0

View at TechPowerUp Main Site
 
I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.
 
I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.

Yes, my card is having trouble even without grilling it :roll:
 
I can't grill my cards in this heat, they would melt like I do! But nice features, I like the GPU-Z support.

There are vampires in Switzerland? :eek:
 
How much CPU does FurMark use?

It's mostly single-threaded even today.

 
Thanks. That's good. Then I could stress test my CPU, GPU, and RAM all at the same time!
 
Sweet, now I can destroy more than one card at once! :nutkick: I don't like programs that overstress hardware. Too bad they can't tone it down a bit.
 
The ultimate torture test: FurMark running at max settings, LinX for 20 passes, and MemTest, all at the same time!
 
Is this any good for a single 4870x2?

 
Sweet, now I can destroy more than one card at once! :nutkick: I don't like programs that overstress hardware. Too bad they can't tone it down a bit.

Of course you can tone it down; work with the different settings available.
 
Nice, I'm gonna try it.
 
Sweet.



3 °C above idle at full load. 41 °C, but the house is hot :(
 
I could grill a steak on my cards.


 
Curiously, btarunr's shot displays:
Renderer: GeForce GTX 260/PCI/SSE2 - GL=3.0
While for my HD4890 it says:
Renderer: ATI Radeon HD 4800 Series - GL=2.1
Why are the Radeons running the app with OpenGL 2.1 and not 3.0? These cards are supposed to be OpenGL 3.1 compliant. My shot was taken on the official Catalyst 9.6 drivers.
I don't think that the power draw is right. 66 watts at full load for a GTX 260?
For a 55nm card it wouldn't be a problem. Remember that figure accounts for nothing but the GPU. There are also 14 memory chips onboard that each munch away something like 2W.

Here's a shot of a HD4890 getting busy:
[Screenshot: HD4890 power readings under FurMark]

The reported wattage figure for this card is even less relevant, as these cards have secondary core power circuitry whose output is not included in this figure. And of course the memory on top of that.
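To make the GPU-only versus whole-board distinction concrete, here's a rough back-of-the-envelope sum. The 2 W per memory chip and 90% VRM efficiency figures are illustrative assumptions, not measurements:

[CODE]
/* Rough board-power estimate -- every figure here is illustrative. */
#include <stdio.h>

int main(void)
{
    double gpu_core_w = 66.0;   /* the GPU-only reading discussed above */
    double mem_chips  = 14.0;   /* GTX 260 carries 14 memory chips */
    double w_per_chip = 2.0;    /* assumed draw per chip */
    double vrm_eff    = 0.90;   /* assumed overall VRM efficiency */

    double delivered = gpu_core_w + mem_chips * w_per_chip;  /* power into the silicon */
    double drawn     = delivered / vrm_eff;                  /* power pulled from slot + plugs */

    printf("Delivered to GPU + memory: %.0f W\n", delivered);
    printf("Drawn at the connectors:  ~%.0f W\n", drawn);
    return 0;
}
[/CODE]

On these assumptions, a 66 W GPU-only reading already corresponds to roughly 105 W actually being pulled through the slot and the PCIe plugs.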
 
Then why do we need one 6-pin and one 8-pin connector on some graphics cards? PCIe slot = 75 W, 6-pin = 75 W, and 8-pin = 150 W. In total, the max is 300 W for a graphics card. But none of these cards actually reach that high.
 
Also keep in mind that quite a bit of that converted wattage going through the VRMs is wasted as heat.
 

thanks for the share....
 
Then why do we need one 6-pin and one 8-pin connector on some graphics cards? PCIe slot = 75 W, 6-pin = 75 W, and 8-pin = 150 W. In total, the max is 300 W for a graphics card. But none of these cards actually reach that high.
Because if the card has an onboard PCIe power plug, slot power cannot be used for powering the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something one doesn't want to happen, for a number of reasons.
Also keep in mind that quite a bit of that converted wattage going through the VRMs is wasted as heat.
Volterra chips are around 90-95% efficient. They seem to be more efficient than more conventional VRMs, which is evident from the increased power consumption of the GTX 295 when it went from two PCBs to a single PCB that no longer uses Volterra VRMs.
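As a quick sanity check on what that efficiency range means in heat, a small worked example; the 150 W load is an arbitrary assumption:

[CODE]
/* VRM losses at a given efficiency -- the load figure is arbitrary. */
#include <stdio.h>

int main(void)
{
    double load_w = 150.0;             /* assumed power delivered to the GPU */
    double effs[] = { 0.90, 0.95 };    /* efficiency range quoted above */

    for (int i = 0; i < 2; i++) {
        double input = load_w / effs[i];   /* drawn from the 12 V inputs */
        printf("%2.0f%% efficient: %.1f W in, %.1f W dissipated in the VRM\n",
               effs[i] * 100.0, input, input - load_w);
    }
    return 0;
}
[/CODE]

At 90% the VRM sheds roughly 17 W as heat for that load, at 95% under 8 W, which is why the efficiency difference shows up both in total board power and in VRM temperatures.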
 
Because if the card has an onboard PCIe power plug, slot power cannot be used for powering the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something one doesn't want to happen, for a number of reasons.

That doesn't make sense. Why would one use a 75 W power connector when one could simply use the 75 W from the slot, seeing as the power from the slot becomes unavailable when an external power connector is present? And PCIe 2.0 is 150 W... why would anyone put a single 75 W external power connector (à la 8800 GTS G92) on a card that already gets 150 W from the slot, when using an external power source makes the slot power unavailable? I must have misunderstood somehow...
 
Uhm, no. The PCI-E 2.0 x16 slot provides 75W. Not a Watt more.

You know... you're the first person I've ever heard say that... can you back that statement up?
 