RealTemp General Discussion

Discussion in 'RealTemp' started by unclewebb, Jun 28, 2008.

  1. Systemlord New Member

    Joined:
    Nov 9, 2010
    Messages:
    17 (0.01/day)
    Thanks Received:
    0
    Does an AC count towards having a chiller?
  2. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,125 (11.68/day)
    Thanks Received:
    9,461
    No, because that lowers your ambients, not your water temp.
  3. Arctucas

    Arctucas

    Joined:
    Jul 14, 2006
    Messages:
    1,764 (0.60/day)
    Thanks Received:
    290
    No, as Mussels said, while you may achieve lower temperatures in an air-conditioned environment, it is because the A/C has lowered your ambient air temperature.

    I am curious as to how you determined you are getting 10° less than ambient from your loop?

    What hardware and/or software are you using to measure both ambient air temperature and water temperature?

    Some empirical data, along with the above and your specific testing methodology would be appreciated.
  4. Systemlord New Member

    Joined:
    Nov 9, 2010
    Messages:
    17 (0.01/day)
    Thanks Received:
    0
    My data is all wrong, as is my math. :eek:
  5. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    So I have a new i5 2500k ....

    I usually get at idle: 33/30/33/41, with ambient at 27 (that would be the average at stock settings). The temps are good, but the 41 is bugging me. I've reseated the stock cooler once and just installed the Hyper 212+, and I was really careful and did it right. When I had stock cooling the temps were 36/34/36/45 on average (both times I reseated), so basically all temps dropped a proportional amount across the board, and the 4th core remains arbitrarily higher than the rest by a fixed amount.

    My RealTemp sensor test produces a different pattern, and while I haven't yet analyzed it myself (I was going to plot it in an Excel graph), I thought I'd post here. The attached picture is a snapshot of my sensor test at 4.7GHz, not stock settings. You can read the minimum values for idle temps though - they're pretty close to stock settings but a tad higher (probably 1.5C).

    Can I somewhat reliably conclude that the 4th temp is inaccurate?

    I read the documentation for RealTemp, which was a nice explanation, but the bottom line is: can I calibrate my temps? Is there any way I can at least improve the accuracy of the temps? Since I am overclocking, this is obviously something I want to watch.

    thanks guys.

    Attached Files: [sensor test screenshot]

  6. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    The sensors that Intel continues to use are not 100% accurate temperature monitoring devices. The amount of error varies from one core to the next and these sensors have many other undocumented issues. Intel only designed these sensors to control thermal throttling and thermal shutdown and for that purpose, they work great.

    I have no idea whether you have 1 sensor that is accurate and 3 that are wrong or maybe 3 are close to right and only 1 is wrong. With individual threads and cores rapidly entering and exiting various sleep states, there's little point in trying to calibrate them to make the numbers all line up and look pretty. Overclock like everyone else does and do your best to keep the numbers that RealTemp reports as low as possible. That's all those sensors are good for and that's all you can do. It's very difficult to hurt an Intel CPU so you'll be fine.

    RealTemp 3.67
    http://www.mediafire.com/?n99nq4kn95u6i6a

    The latest version has been updated to support the new Sandy Bridge CPUs.
    mlee49 and donkrx say thanks.
  7. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    Thank you for the reply. I just downloaded RealTemp a couple of weeks ago after I built my PC, and I thought I had the version that supports SB; I guess I overlooked that. Thanks for letting me know. I've got it running now.

    If the thermal sensors work well for thermal throttling (which depends on how hot they are and the chip's/cores' limitations), why can't we derive a temperature from them? I ask this because you say they work great for throttling but at the same time are terrible for measuring temperature. Is it simply because thermal throttling doesn't require high resolution, so we cannot conclude anything even if we know the temps that the chip throttles at?

    Additionally, if it is meaningless to calibrate and adjust things, then why does RealTemp (and others) have these implementations for the user to adjust, along with a guide? Under what circumstances should that tool be used?

    I read somewhere a suggestion to grab CPU temps from a monitoring tool such as RealTemp immediately after your computer comes out of sleep; that way you can compare the ambient temp by the computer case with the temp reported in Windows to get an idea of how incorrect the offset is (because the chip should basically be at room temperature if given a long time to cool).

    Is this feasible - could I use HDWRITE=1 in RealTemp.ini, with a low log frequency to grab the temps the instant they are available? Or does the CPU heat up too much while the computer wakes up?
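
    Something like the following is what I had in mind for pulling that first post-wake sample out of the log. This is just a rough sketch in Python; I'm assuming a comma-separated log with a timestamp column first and one column per core, which may not match RealTemp's actual log layout:

    Code:
    import csv

    # Rough sketch: grab the first sample logged after wake and compare each
    # core to a hand-measured ambient. Assumes a comma-separated log with a
    # timestamp column first and one column per core; the real layout may differ.
    AMBIENT = 27.0  # room temperature in C, measured separately

    with open("RealTempLog.txt", newline="") as f:  # hypothetical file name
        reader = csv.reader(f)
        header = next(reader)   # column names
        sample = next(reader)   # first row written after wake

    for name, value in zip(header[1:], sample[1:]):
        try:
            print(f"{name}: {value}C, {float(value) - AMBIENT:+.1f}C vs ambient")
        except ValueError:
            pass  # skip non-numeric columns such as load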
  8. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    I just had a clever idea... could I go into the BIOS and disable the core, then go into Windows and note the difference between it and the ambient temp? That way there would be no switching between C states and all that (some of which I have disabled, btw).
  9. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    Intel's temperature sensors count down to zero as they heat up. If you knew that a sensor reached zero at exactly 100C, then you could work backwards: if that sensor reported 10, you would know that your CPU was 10 degrees away from the thermal throttling point, so it must be at 90C. That's a great theory, but here are some of the problems.
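
    The arithmetic every monitoring program does is just a subtraction. A minimal sketch of the idea (not RealTemp's actual source code):

    Code:
    # The count-down idea as code (a sketch, not RealTemp's actual source).
    # The sensor reports its distance to TjMax; the "temperature" you see is
    # derived by subtraction, so it is only as accurate as TjMax itself.
    def core_temp(distance_to_tjmax, tjmax=100):
        """Estimated core temperature in degrees C."""
        return tjmax - distance_to_tjmax

    print(core_temp(10))  # sensor reads 10 -> 90C, but only if TjMax really is 100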

    TJMax is not a fixed value. With Core i processors, Intel writes a value to each core which software like RealTemp can read, but they call it TJ Target. 100C might be written to that register for each core, but that does not guarantee that thermal throttling will begin right at 100C on each core. Intel does not document their calibration procedure or how much error is in this number, but I don't think 10C of error is unreasonable to assume. Your full-load delta between cores is 7C, which I'm betting is mostly error in how TJMax or TJ Target is set. With the previous Core 2 generation, my theory is that Intel deliberately offset TJMax from one core to the next so that on a quad core, all 4 cores wouldn't reach the thermal throttling temperature at the exact same time. This would give a more gradual decrease in performance during a throttling episode. Throttling one core is oftentimes enough to keep the CPU in check, so maybe Intel did this for a reason. Maybe they are still doing it deliberately with the Core i CPUs. None of this is documented, though, so that's just my opinion.
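
    For anyone curious, the TJ Target number itself sits in a model specific register, IA32_TEMPERATURE_TARGET (0x1A2), with the target in bits 23:16. Here's a sketch of reading it through the Linux msr driver; RealTemp does the equivalent on Windows through its own kernel driver:

    Code:
    import struct

    # Sketch: read IA32_TEMPERATURE_TARGET (MSR 0x1A2) on Linux via the msr
    # driver (modprobe msr, run as root). Bits 23:16 hold the TJ Target that
    # tools like RealTemp display; RealTemp itself uses a Windows driver.
    IA32_TEMPERATURE_TARGET = 0x1A2

    def read_msr(register, cpu=0):
        with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
            f.seek(register)
            return struct.unpack("<Q", f.read(8))[0]

    tj_target = (read_msr(IA32_TEMPERATURE_TARGET) >> 16) & 0xFF
    print(f"TJ Target: {tj_target}C")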

    The next problem is slope error. What this means is the sensor changes values at a slightly different rate than the actual temperature is changing. These sensors might be reasonably accurate when you are +/- 10C from the calibration point but the further you get away from this point, the more these sensors can wander. At idle there might be +/- 10C of slope error. Intel also doesn't say what temperature they calibrate at.
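
    To picture what slope error does to the numbers, here is a toy model with made-up slope values; the real slope of any given sensor is unknown:

    Code:
    # Toy model of slope error with made-up slopes; per-sensor error is unknown.
    # The sensor tracks the real temperature at a slightly wrong rate, so the
    # further from the calibration point, the larger the reported error.
    CAL_POINT = 90  # assumed calibration temperature; Intel doesn't publish it

    def reported(actual, slope):
        return CAL_POINT + slope * (actual - CAL_POINT)

    for actual in (90, 70, 50, 35):
        r1, r2 = reported(actual, 0.85), reported(actual, 1.15)
        print(f"actual {actual}C -> reported {min(r1, r2):.0f}C to {max(r1, r2):.0f}C")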

    When you combine this error with not really knowing how accurate TJMax is, you're left with a number that can't give you 100% accurate core temperatures from idle to TJMax.

    The previous 45nm Core 2 sensors also had another problem where they would get stuck as the CPU cooled off. As the CPU went from 70C to 60C to 50C, etc., the sensor might be fine at 70C and 60C but a sensor could simply stop moving if the CPU got cool enough. Many sensors stopped moving at 50C so a CPU at 50C or 40C or 30C would report the same thing. You can't calibrate a stuck sensor.

    The calibration features I added to RealTemp were a good idea at the time, but they were based on a very tiny sample of processors that I owned. Without some hard engineering data from Intel and some decent quality sensors, trying to design a calibration formula is impossible. You end up having to make so many assumptions that the final number might look nice, but it might not be any more accurate than the original random numbers that some of these sensors put out.

    Intel could have spent some more money and used some high quality temperature sensors that are accurate over a wide range but for most of their customers, that would be wasted money. Most users don't care about the exact core temperature of their CPU. As long as their computer runs and doesn't crash too often, that's good enough for the average Joe.

    If you were able to perfectly calibrate your CPU temperature, what would you do with that data? No one else in the world has 100% accurate temperatures so you still wouldn't be able to compare your results to other users.

    After a long time I finally had to agree with Intel. It's just not that important. As long as your CPU is running below the throttling temperature then it is running within the Intel spec. It is designed to run a long time at that temperature and most enthusiasts lose interest in a CPU long before it ever craps out.

    As an extreme overclocker, it is important to run your CPU as cool as possible but whether it is actually running at 60C or 70C doesn't make any difference as long as your computer is stable. If it is not stable, you need to overclock less or improve your cooling. The sensors are good enough to determine that but they are not good enough to provide 100% accurate temperatures. There is so much error from core to core that you can't compare one core to the next in the same CPU and trying to compare your temps to another user becomes pointless.
    EarthDog and donkrx say thanks.
  10. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    EDIT: Deleted a bunch of stuff because I was being brain dead and not using my own head to answer my questions.

    In RealTemp, why is it that when I enter calibration settings, everything reverts to default on exit? Nothing else in the program settings window does this. I have to play around with it a bit more, but I see other people have this problem too. Let me know if you know of any potential fixes.

    Why does HWMonitor consistently report slightly lower temps than RealTemp and Core Temp do for me? I have the very latest version, and two versions ago they added support for the Nuvoton NCT6776 (a Sandy Bridge-era sensor chip, which I have verified I do have)... so why might they be reporting different values? Core Temp and RealTemp agree with each other.

    I think I may just push my CPU to the limit of throttling, note the temperature (or distance from TJMax), and make sure I stay a certain distance from it. That sounds like the only reliable precaution I can take.
    Last edited: Jul 31, 2011
  11. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    I haven't been able to duplicate the bug that some people have reported about the settings not being saved. Do you push OK when you exit the Settings window?

    I put the RealTemp folder in C:\Program Files (x86) and the calibration settings are saved for me and restored when I restart RealTemp. I sent a test version of RealTemp to the last people that complained about this, but neither of them got back to me. All I can think of is to make sure that you have Read and Write access to the RealTemp.INI file where all of the configuration settings are stored. Sometimes when that file gets dragged to different drives or directories, Windows changes the R/W access to the files within that folder. I'll have one more look at this tomorrow in case there is something I've overlooked, but it's difficult to fix a program if I can't duplicate the problem.
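
    A quick way to check the access is something like this (a sketch; adjust the path to wherever your RealTemp folder actually is):

    Code:
    import os

    # Quick check of R/W access to the settings file. Adjust the path to your
    # RealTemp folder. Note: os.access can miss Windows ACL restrictions, so
    # treat a True here as a hint rather than proof.
    ini = r"C:\Program Files (x86)\RealTemp\RealTemp.INI"

    print("exists:  ", os.path.exists(ini))
    print("readable:", os.access(ini, os.R_OK))
    print("writable:", os.access(ini, os.W_OK))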

    As long as your CPU is not thermal throttling, it is running within the Intel spec. That's all you can depend on. When Intel's sensors count down to zero, your CPU will throttle regardless of what the real core temperature is, so you need to avoid that. When significantly overclocking 45nm Core 2 processors, you oftentimes needed to leave 30C of headroom to ensure that they would remain stable. Core i processors seem to be able to operate reliably at much higher temperatures. Don't let them throttle and let stability be your guide; if those two are good, then no worries.
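
    In code, the only check that really matters is trivial. The headroom number below is a judgment call on my part, not an Intel spec:

    Code:
    # Sketch of the only check that matters: distance to the throttling point.
    # The 20C headroom here is a judgment call, not an Intel spec.
    def safe(distance_to_tjmax, headroom=20):
        """distance_to_tjmax is the raw sensor count (0 = throttling)."""
        return distance_to_tjmax >= headroom

    print(safe(25))  # True: 25C away from throttling
    print(safe(5))   # False: too close, back off or improve cooling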

    You'll have to ask the programmer of HWMonitor what his program is doing. I know RealTemp runs at a slightly higher priority to ensure that it has access to the temperature sensors even when the CPU is 100% loaded, such as when running LinX. HWMonitor might be reading the sensors less frequently or doing some averaging, or something like that.
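
    As an illustration of how averaging alone could explain the lower peaks (made-up numbers, not HWMonitor's actual method):

    Code:
    # Illustration: averaging lowers the reported peaks compared to showing
    # every raw sample. Made-up trace, not HWMonitor's actual method.
    trace = [68, 74, 70, 76, 69, 75, 71, 77]  # hypothetical 1-second readings

    averages = [sum(trace[i:i + 4]) / 4 for i in range(0, len(trace), 4)]

    print("peak of raw samples: ", max(trace))     # 77
    print("peak of 4s averages:", max(averages))   # 73.0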
    donkrx says thanks.
  12. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    Thanks so much for the help. I understand this better now.

    EDIT: I tried a bunch of things trying to get the RealTemp calibration to hold; unfortunately it won't. I did, however, test other options like the alarm temperature and colors, and those are all persistent changes in the INI file. Only the calibration won't save... btw, I also tried version 3.60. If you want me to do anything specific to figure out why it's happening, let me know.

    Thanks for creating and maintaining this program!
    Last edited: Jul 31, 2011
  13. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    donkrx: I just sent you a PM with the updated version. If it fixes the problem or not, just let me know. Thanks.

    Edit: I just noticed that the last two people that mentioned this problem didn't even bother to download the updated version I did for them. :(
  14. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    unclewebb, I decided I'd just post here instead of sending PM.

    The new version 3.68 is working and will save my idle calibration & TjMax corrections.

    Thanks for the fix!
  15. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    Thanks for getting back to me. With that bug finally solved, now I can get the main version of RealTemp updated at TechPowerUp in the near future.
  16. donkrx New Member

    Joined:
    Jul 29, 2011
    Messages:
    8 (0.01/day)
    Thanks Received:
    0
    One other thing I just noticed...

    OldFormula=0 and OldFormula=1 seem to have no detectable difference between them, and both still seem to affect temperatures pretty significantly above 70 degrees.

    Basically, what I was trying to do was leave my temps above 70 unchanged while calibrating the idle temps (because I assume the sensors are reasonably accurate at the upper end, and the idle readings are garbage). When I plotted my temperatures on a graph, I got results VERY similar to what you found originally - in particular, all of my cores looked like the green line in the documentation. Ideally I would be able to straighten out the "V" shape in the graph so there was a linear relationship with respect to load.

    (In the end I got it right, so I'm pretty pleased with the results.)

    Process:

    First I ran Prime for 10 minutes until I reached my max temp of 75; then I ran Intel Burn Test and reached a max temp of 90-91. Along with my max temps, I noted that 3 of my cores were consistent with each other at any temp above 70, and that the first core (Core 0) was consistently 6 degrees lower at any temp above 70.

    I then let the system cool for 10 minutes and used the idle calibration to adjust one temp. Afterwards I ran Prime again, but found that my original max temp of 75 was now 70-71. I cycled back and repeated the process several times until I got it right for each core, then used Intel Burn Test with the same process to make sure the temps that were originally around 90C were still 90C. Finally I checked with Prime once more. It took a lot of trial and error, but I finally got it right, which is great.
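
    For anyone trying the same thing, the correction I was converging on by trial and error amounts to a linear stretch pinned at the hot end. Example numbers from my own cores; this is my own math, not RealTemp's internal formula:

    Code:
    # Sketch of the correction I converged on by trial and error: pin the hot
    # end (where I trust the sensor) and stretch the idle end to remove the
    # "V" shape. Example numbers from my cores, not RealTemp's formula.
    ANCHOR = 90      # full-load reading I trust, from Intel Burn Test
    RAW_IDLE = 41    # what the suspect core reports at idle
    TRUE_IDLE = 33   # what the other cores agree on at idle

    def corrected(raw):
        scale = (ANCHOR - TRUE_IDLE) / (ANCHOR - RAW_IDLE)
        return ANCHOR - scale * (ANCHOR - raw)

    for raw in (41, 60, 75, 90):
        print(f"raw {raw}C -> corrected {corrected(raw):.1f}C")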

    ===========

    So am I misunderstanding how the program works, or is there possibly something wrong? I tried both settings 0 and 1 and they don't seem to be any different.
  17. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    My testing found that these sensors work like the graph in the documentation shows. Intel came along and released some information at one of their IDF forums that showed the error as a totally linear relationship from idle to TJMax. As a small individual programmer, I decided to go along with their version of the truth, but in hindsight I should have ignored this "new information" and never changed my original formula. Most of the information that was released at the first IDF conference turned out to be a farce, and Intel released an updated version of the truth a month later. I was naive to think that Intel was finally going to come clean and release some engineering data about these sensors at IDF to help out developers, but it never happened.

    After that, I gave up on doing calibrations and I haven't looked at this INI option for a long time. I'll check if it still exists or if I got rid of this option from lack of feedback.

    I found Prime95 Small FFTs was the best application for testing purposes. LinX / Linpack testing was able to create more heat, but Small FFTs puts a very even load on each core.
    donkrx says thanks.
  18. slipstream New Member

    Joined:
    Jul 14, 2011
    Messages:
    19 (0.02/day)
    Thanks Received:
    1
    I am wondering whether RealTemp supports GPU temp logging. It logs my CPU temps but shows nothing under GPU in the Excel log file. The thing is, I like to log CPU and GPU temps so I can calculate a rough average when I'm gaming, so I don't fry my processor.
  19. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    RealTemp 3.69
    http://www.mediafire.com/?5ymas9doahivdju

    -fixes calibration factors not being saved correctly for some users.


    slipstream: Does RealTemp report your GPU temperatures? RealTemp reports most Nvidia GPUs and most ATi desktop GPUs but doesn't report all ATi mobile GPUs. If you have enabled GPU temperature reporting in the RealTemp Settings window and RealTemp is showing your GPU temperature then the data should be showing up in the log file. Give me some more details and upload a log file to www.mediafire.com and send me a link so I can have a look.

  20. slipstream New Member

    Joined:
    Jul 14, 2011
    Messages:
    19 (0.02/day)
    Thanks Received:
    1
    Yes, both RealTemp 3.60GT and 3.64GT recognize my GPU:
    http://www.techpowerup.org/uploaded.php?file=110803/Capturenew.png (3.60GT)
    http://www.techpowerup.org/uploaded.php?file=110803/Capturenew2.png (3.64GT)

    I've also uploaded a pic of the RealTemp GT 3.64 settings (just in case):
    http://www.techpowerup.org/uploaded.php?file=110803/settings.png


    Here are the log files:
    For 3.60:
    http://www.mediafire.com/?yg3wapofq321c3b
    For 3.64:
    http://www.mediafire.com/?f2d5al0bszg6ln6

    My specs:
    Core i5-480M
    Nvidia GT 540M
  21. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    Why are you using RealTemp GT? It is designed for the 6-core CPUs. Why not try the regular version, RealTemp.exe?

    Thanks for the bug report. I'll have a look at the GT version and get that fixed up but for your CPU, I think the other version should work OK.

    Edit: I just checked RealTemp GT on an older laptop with an Nvidia GPU. The GPU temperature data is showing up fine in the log file.

    Edit #2: I just had a better look at the log files. The data is showing up correctly, but the column headings are screwed up. Remove the headings CPU_2 and CPU_3 and move the headings GPU and LOAD% to replace those two. A Core i5-480M is only a 2-core processor, so CPU_2 and CPU_3 shouldn't be showing up. I should be able to fix that. Thanks for taking the time to let me know about this. These hyper-threaded dual cores cause my program to think that you have a quad core, so it adds a couple of extra columns to the table even though there is no data.
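
    Until then, a rough way to repair an existing log file (assuming it is comma-separated; adjust the file names to match yours):

    Code:
    import csv

    # Rough repair for the mislabeled log: on a 2-core CPU the CPU_2 and CPU_3
    # headings are bogus; dropping them shifts GPU and LOAD% over the correct
    # data columns. Assumes a comma-separated log; adjust the file names.
    with open("RealTempLog.txt", newline="") as f:
        rows = list(csv.reader(f))

    rows[0] = [h for h in rows[0] if h.strip() not in ("CPU_2", "CPU_3")]

    with open("RealTempLog_fixed.txt", "w", newline="") as f:
        csv.writer(f).writerows(rows)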
    Last edited: Aug 3, 2011
  22. slipstream New Member

    Joined:
    Jul 14, 2011
    Messages:
    19 (0.02/day)
    Thanks Received:
    1
    6 cores! What do people need 6 cores for? To calculate the answer to life, the universe and everything :) I don't know much, but I assumed Intel would not release a 6-core CPU until next year; guess Moore's law still holds. Pardon my ignorance for using the GT version.

    I fixed the headings as you said, but if I delete the log file, the next one generated has its headings screwed up again.
  23. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    I'll get the heading issue fixed up, probably by tomorrow. At least it looks like it is reporting your GPU temperature correctly; I originally thought it might be an Optimus issue. When on battery power, it might be a good idea not to monitor your GPU temperatures. I haven't done any testing, but it is possible that this could constantly wake up the Nvidia GPU and defeat the power savings that you would get by leaving it asleep. RealTemp gets GPU information directly from the driver, so the driver should be smart enough not to wake up a sleeping GPU, but you never know. You might want to run BatteryBar or something like that to keep an eye on power consumption with and without the RealTemp GPU feature active, to see if there is any difference.
    Last edited: Aug 3, 2011
    slipstream says thanks.
  24. slipstream New Member

    Joined:
    Jul 14, 2011
    Messages:
    19 (0.02/day)
    Thanks Received:
    1
    Hey, I don't know what went wrong. Yesterday the 3.69 version was working just fine, but this morning the log file looked like this:
    http://www.mediafire.com/?q82pzs0rhfl1nt2
    Also, now 3.69.exe won't start, but 3.64GT works fine apart from the erroneous log file.
  25. unclewebb

    unclewebb RealTemp Author

    Joined:
    Jun 1, 2008
    Messages:
    946 (0.42/day)
    Thanks Received:
    419
    The log file looks like Nvidia Optimus has turned off your GPU.

    http://www.nvidia.com/object/optimus_technology.html

    If the Nvidia GPU goes to sleep, I hope RealTemp doesn't wake it up to check its temperature, because that would defeat the purpose of Optimus trying to shut the GPU down to save energy. In the log file, I'll try to move the GPU temperature to the far right-hand side so this might be easier to see.

    If you were 3D gaming during that log file you posted and the Nvidia GPU was asleep, that would be a problem but if you weren't doing anything 3D related and the GPU was asleep then the log file looks as it should.

    In the RealTemp log you could try setting the Log File interval to 1 second instead of 20 seconds. That might give you a better idea of what's going on.

    3.69 still starts fine for me. What sort of error message are you seeing if any?
