
GPU-Z Shared Memory Layout

W1zzard

#1
Code:
#define SHMEM_NAME _T("GPUZShMem")
#define MAX_RECORDS 128

#pragma pack(push, 1)
struct GPUZ_RECORD 
{
	WCHAR key[256];
	WCHAR value[256];
};

struct GPUZ_SENSOR_RECORD
{
	WCHAR name[256];
	WCHAR unit[8];
	UINT32 digits;
	double value;
};

struct GPUZ_SH_MEM
{
	UINT32 version;    // Layout version; 1 for the struct shown here
	volatile LONG busy;	 // Is data being accessed?
	UINT32 lastUpdate; // GetTickCount() of last update
	GPUZ_RECORD data[MAX_RECORDS];
	GPUZ_SENSOR_RECORD sensors[MAX_RECORDS];
};
#pragma pack(pop)
If you use this shared memory in your application, please leave a short comment.
 
#3
hi!

Thanks for the shared memory support! It works really well (I use it in C# via a C++ wrapper class), but unfortunately I found no way to access the temperatures and load of both GPUs at the same time (I have an HD3870 X2). :(

Hmm, is there any way? It's really disappointing that NO tool has this feature.. :cry:
 
#4
I really don't mean to sound stupid here, but is this code I can integrate into a C++ program to use memory locations on the video card instead of standard memory locations?
 

brianhama

#5
PS3Divx,

Would you be willing to post that wrapper class you created? If not, would it be feasible to access this shared memory from managed code? I am primarily a .NET developer and this isn't something I have done before.

Thanks,

Brian P. Hamachek
 
#6
Hmmm, I don't know if I get you right, but I think the answer is "no". This "code" is for people who want to access the data read by GPU-Z in their own programs. For example, I want to use the temperature and core load of both of my GPUs in my own C# application. To get these values I access GPU-Z's shared memory, where the data is stored. Hope I could help you a bit. ;)

E: @brianhama: hi, yeah, I'm also quite a C++ newbie, but I managed to create something that works for me.. if you like I can give you the files, it isn't many lines. However, I can't guarantee that it's really good and clean code, if you know what I mean..^^
 

brianhama

#7
I understand that. I am asking if you could post some code showing how to access the shared memory.
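For the native side, opening the mapping is just `OpenFileMapping` plus `MapViewOfFile` against the name from the first post. A minimal, Windows-only sketch (untested here; error handling kept short, and it reads without synchronizing on the `busy` flag, whose exact convention isn't documented in this thread):

```cpp
#include <windows.h>
#include <tchar.h>
#include <cstdio>

#define MAX_RECORDS 128

// Struct definitions from the first post.
#pragma pack(push, 1)
struct GPUZ_RECORD        { WCHAR key[256]; WCHAR value[256]; };
struct GPUZ_SENSOR_RECORD { WCHAR name[256]; WCHAR unit[8]; UINT32 digits; double value; };
struct GPUZ_SH_MEM {
    UINT32 version;
    volatile LONG busy;
    UINT32 lastUpdate;
    GPUZ_RECORD data[MAX_RECORDS];
    GPUZ_SENSOR_RECORD sensors[MAX_RECORDS];
};
#pragma pack(pop)

int main()
{
    // Open the named mapping that GPU-Z creates (GPU-Z must be running).
    HANDLE hMap = OpenFileMapping(FILE_MAP_READ, FALSE, _T("GPUZShMem"));
    if (!hMap) { std::printf("GPU-Z not running?\n"); return 1; }

    const GPUZ_SH_MEM* shmem = static_cast<const GPUZ_SH_MEM*>(
        MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, sizeof(GPUZ_SH_MEM)));
    if (!shmem) { CloseHandle(hMap); return 1; }

    // Dump every populated sensor slot; an empty name marks the end.
    for (int i = 0; i < MAX_RECORDS && shmem->sensors[i].name[0] != L'\0'; i++) {
        wprintf(L"%s: %.*f %s\n",
                shmem->sensors[i].name,
                (int)shmem->sensors[i].digits,   // digits = decimal places to show
                shmem->sensors[i].value,
                shmem->sensors[i].unit);
    }

    UnmapViewOfFile(shmem);
    CloseHandle(hMap);
    return 0;
}
```

From managed code the same mapping can presumably be reached via .NET's memory-mapped-file or P/Invoke facilities, which is what the wrapper class discussed below does.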
 
#8
Alright, I will post the documented code, files, and Visual Studio project soon (in the next 1-3 days) for anyone who needs it..
 

W1zzard

#9
At this time it is only possible to get the readings for the video card currently selected in GPU-Z.
 
#10
Hmm, I see.. is it possible to extend it to all cores in the next version? That would really help me out. However, I think the whole application is designed to show only one core at a time, so it might be a bit complicated to get all data at the same time.. so if that's absolutely not possible, could you at least add some kind of command line argument that defines which of the cores is selected at startup?

Because then I could start GPU-Z twice (one instance per core) and the data of both cores would be written into (the same) shared memory. I could read the data every 500 ms in my app, so I'd get both values.. a bit complicated and dirty, but it would serve my needs if shared memory for all cores is too complex. ;)

PS: at the moment I'm pimping my neat-o wrapper class for you guys.. just love it. :D
 

OlafSt

#12
Maybe it would be a nice idea to have a "GPU Count" value in the shared memory block. Then, for each further GPU, just create another shared-mem block called "GPUZShMem_2", "GPUZShMem_3" and so on. I think this is the easiest way to accomplish this, as nobody knows how many GPUs we will someday have on a graphics card...

BTW, for the next release a "minimize to tray" feature would be nice ;)

Greets,

Olaf
 

JohnnyUT

#14
Alright, here we go with the C# wrapper class for GPU-Z! :) (as I said: it's the "later" night.. :p)

With it you can access the shared memory of GPU-Z from your C# applications quite conveniently.
For example, the following lines will print the temperature and the core load to the console:

Code:
Gpuz.Wrapper.Open();
Console.WriteLine(Gpuz.Wrapper.SensorValue(2));  // temperature
Console.WriteLine(Gpuz.Wrapper.SensorValue(4));  // core load
Gpuz.Wrapper.Close();
Here is the link to the RAR file (unfortunately it's too big to upload directly to the board :/):
http://www.file-upload.net/download-980228/GpuzWrapper.rar.html

It includes:
- 2 DLLs to include and use in your C# application
- the visual studio (2005) project to modify and compile the DLLs on your own
- a visual studio (2005) project which shows the proper usage of the wrapper class
- a small readme file which lists the steps to take if you want to use the wrapper class in your C# application.

Well, I think it's pretty handy and some of you might find it useful! :toast:
Oh, and of course the whole code is absolutely open source, free to modify and use. ;)

Enjoy! :D

PS: Feel free to ask if something is unclear. ;)
 
#17
W1zzard, is it possible to get the GPU count and whether SLI/CrossFire is enabled? What properties should I look at? Sorry, I have no SLI/CrossFire setup to test with.

BTW, I'll use GPU-Z in the Framebuffer Crysis Warhead Benchmark Tool.
 

ascl

#21
Thanks for publishing this.

If anyone is interested, I have sample C# code for accessing this posted on my blog here.
 

tomoyo

#22
Thanks for this interface. I now have a monitoring plugin for RivaTuner that makes use of the GPU load and other information reported by GPU-Z. It works very well.

I have described some details on my blog, but I'm sorry, it's in Chinese.
 

SirReal

#24
LCDSirReal, or SirReal's multipurpose G15 plugin, is a plugin for the Logitech G13/G15 gaming keyboards. It also works on the G19, in black and white. It provides more features than all of the Logitech bundled plugins together, while using much less CPU and memory. Written entirely in C++, it's very stable and efficient. Among the more notable features are support for applications (WinAMP, iTunes, SpeedFan, FRAPS, TeamSpeak...) and real-time system monitoring of networking, CPU and memory usage, and more. And of course it shows the basics such as time, date and waiting mail as well.



Changes in 2.8.3:

  • Adds support for the GPU-Z application
  • You can now prevent the test window from appearing even if the keyboard is unresponsive, see lcdsirreal.txt for more info

http://www.linkdata.se/
 

lastOne

#25
- What happens if 2 or more GPUs are present? Does the structure posted in the OP provide all the info?
- The OP is rather old; is the structure still up to date?