
CUDA and PhysX detection & Clocks

interman

New Member
Joined
Oct 24, 2012
Messages
16 (0.01/day)
Likes
0
#1
Hello, I have some doubts about my hardware and how GPU-Z reports it.
I have a new laptop with a Core i7 Ivy Bridge CPU. Is it possible that the integrated graphics actually supports PhysX and the other ticked features, or is it just a bug? Here is a screenshot:

Second problem: besides the card above, I have a GeForce GT 640M.
I have tried various drivers, but GPU-Z still shows some features unticked that should all be ticked for this card series.

One of the previous versions of GPU-Z showed that it supports CUDA, but the current one doesn't, even though I always install the full, newest available driver.
How does GPU-Z check whether a card supports CUDA?
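(For illustration only: a rough, hypothetical sketch of one way a tool could probe CUDA support, by loading the NVIDIA driver library and counting devices. GPU-Z's actual detection method may well differ.)

```python
import ctypes

def cuda_available():
    """Best-effort CUDA check: load the NVIDIA driver library and count devices."""
    # Try the Windows DLL first, then common Linux shared-object names.
    for name in ("nvcuda.dll", "libcuda.so.1", "libcuda.so"):
        try:
            driver = ctypes.CDLL(name)
            break
        except OSError:
            continue
    else:
        return False  # no CUDA driver library found on this system
    if driver.cuInit(0) != 0:  # CUDA_SUCCESS == 0
        return False
    count = ctypes.c_int(0)
    if driver.cuDeviceGetCount(ctypes.byref(count)) != 0:
        return False
    return count.value > 0
```

On an Optimus laptop this kind of check has the same caveat as any detection tool: forcing the process to run on the NVIDIA GPU ("Run with graphics processor") removes any ambiguity about which adapter is being queried.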
One more thing: GPU-Z shows a default clock and a current clock, right?
But the two values are always equal, which can't be correct. When I switch to the Sensors tab, I can read the current clocks for the GPU core and memory.

Is there any chance this could be corrected?
Thank you very much!
 

Sinfamy

New Member
Joined
Oct 24, 2012
Messages
1 (0.00/day)
Likes
0
#2
Your laptop probably has Nvidia Optimus, as does mine.

You don't need to worry; everything is supported.

If you want proof, add GPU-Z to the NVIDIA program list (in the NVIDIA Control Panel), or right-click GPU-Z and choose "Run with graphics processor" > "High-performance NVIDIA processor".
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,153 (3.43/day)
Likes
18,105
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#3
physx doesn't know about individual cards. it's either available system-wide or not.

the opencl detection is per-card and should be accurate

i vaguely remember seeing somewhere that kepler low-end does not support cuda?
 

interman

#4
Yes, it has Nvidia Optimus. I've done what you said and it shows CUDA.

But the problem with the clocks still persists. When I switch to the Sensors tab, the clocks are different:
 

W1zzard

#5
the first tab shows 3d target clocks, the sensors tab shows real-time current clocks. due to power saving these will be lower than the clocks on the first tab
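A quick way to see both values side by side outside GPU-Z, assuming the NVIDIA driver and its bundled nvidia-smi tool are installed (the sketch below returns None otherwise):

```python
import subprocess

def gpu_clocks():
    """Return (current_mhz, max_mhz) for the graphics clock via nvidia-smi,
    or None if nvidia-smi is unavailable or reports an error."""
    try:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.gr,clocks.max.gr",
             "--format=csv,noheader,nounits"],
            text=True, stderr=subprocess.DEVNULL)
    except (OSError, subprocess.CalledProcessError):
        return None
    # First line corresponds to GPU 0; fields are comma-separated MHz values.
    current, maximum = (int(x) for x in out.splitlines()[0].split(","))
    return current, maximum
```

On an idle Optimus laptop, the current value will typically sit far below the maximum until a 3D load arrives, which matches what the Sensors tab shows.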
 
Joined
Jul 23, 2011
Messages
1,556 (0.66/day)
Likes
1,459
Location
Kaunas, Lithuania
System Name Eternal WIP ][ HP ProLiant server blade
Processor AMD FX-8320 @ 4.4/4.1 GHz ][ 2x Intel Xeon L5520 @ 2.27GHz
Motherboard Asus Crosshair V Formula-Z ][ -
Cooling Corsair H55 ][ Stock
Memory 2x4GB Kingston HyperX Genesis @ 1866MHz ][ 4x4 GiB Hynix @ 1333 MHz; ECC on
Video Card(s) 2x MSI GeForce GTX 770 DirectCU II OC ][ None
Storage 1TB WD Black + 1.5TB WD Green + 750GB Samsung + 64GB OCZ Agility 4 ][ 3x 1 TB Hitachi
Display(s) Asus VG278H + Asus VH226H ][ None
Case Antec Nine Hundred ][ it's a server blade, lol
Audio Device(s) Supreme FX II (Integrated); Using optical S/PDIF output ][ none
Power Supply Corsair AX1200i ][ [stock]
Software Gentoo Linux ][ Linux Mint 16
Benchmark Scores 65970 BogoMIPS. Ha. Ha. Ha. >=|
#6
In other words: the first tab shows the highest power-state clocks, so if you overclocked your GPU, you would see that overclock in the first tab's GPU Clock field, while the monitors on the Sensors tab show the clocks of whatever power state the GPU is currently in.

Imagine you try to overclock your card, so you set a new clock for [the highest power state of] your GPU. Looking at the first tab, you can immediately see whether it took effect. Otherwise, you would need to put some load on the GPU to push it into the high-performance state just to check whether setting the clock worked, which would be tedious.

Thus, it's not a bug - it's a feature.
 
Last edited:

interman

#7
Oh, if that's the case, it's OK. But I think it would be better if the first tab could show the real-time clocks, with all the other parameters calculated in real time as well.

BTW, where can I read the number of TMUs in GPU-Z? Is that possible?
 
Last edited:

W1zzard

#9
Where can I read number of TMUs in GPU-Z?
TMUs are not displayed directly; they are used in the calculation of the texture rate, though
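Since texture rate is just the TMU count multiplied by the core clock, you can back the TMU count out of the numbers GPU-Z does display. A small sketch; the GT 640M figures used in the example are assumptions taken from public spec listings, so check them against your own card:

```python
def texture_fillrate_gtexel_s(tmus, core_clock_mhz):
    """Texture fill rate in GTexel/s: TMUs x core clock (MHz) / 1000."""
    return tmus * core_clock_mhz / 1000.0

# Assumed GT 640M figures: 32 TMUs at a 625 MHz core clock.
rate = texture_fillrate_gtexel_s(32, 625)  # -> 20.0 GTexel/s
```

Running it in reverse, a texture rate of 20.0 GTexel/s at 625 MHz implies 32 TMUs.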
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,710 (5.12/day)
Likes
5,121
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#10

interman

#11
TMUs are not directly displayed, they are used for the calculation of texture rate, though
I know, but it would be great if it showed the number of TMUs directly, so please treat it as a suggestion :D

Patience is golden, you'll get your answer if you don't bump a topic like you did.
I edited my post and thought nobody had read it. It isn't a crime, I guess?
 
Last edited: