
GT 640 Synergy half memory clock

Joined
Dec 27, 2014
Messages
4 (0.00/day)
Location
Austria
System Name Windows XP
Processor AMD Athlon II X3 455
Memory 2*2GB
Video Card(s) NVIDIA GeForce GT 640 synergy
Audio Device(s) -
Joined
Feb 19, 2009
Messages
1,831 (0.33/day)
Location
UK Warwickshire
System Name PC-Chips
Processor Ryzen 5 5600x
Motherboard Asus ROG Strix B550-F Gaming.
Cooling Thermalright Peerless Assassin 120 SE CPU Air Cooler 6 heat pipes.
Memory Patriot Viper 32gig dual channel 3600mhz
Video Card(s) PowerColor HellHound RX 7900 GRE OC
Storage 2X Samsung 860 EVO SSD's 500gig / 2TB crucial P3-NVME / WD-BLUE SN550 1TB M.2 / SP A55 512gig
Display(s) Panasonic 40-inch 4k TV
Case Modded NZXT H510
Audio Device(s) Realtek S1220A - Yamaha A-S501 AMP - 4 x Wharfedale diamond 9.1 speakers - Wharfedale SW150 sub
Power Supply EVGA SuperNOVA G6 750W 80+ Gold
Mouse Some cheap wireless thing
Keyboard Razer Cynosa lite
VR HMD Oculus Quest 2 128gig version
Software Windows 11 pro 64bit
Are you 100% sure it's running at 800MHz?

Could it be that it's sitting in power-saving mode, like 99% of today's GPUs do when idle?

I would download GPU-Z and run the render test to see if the speed jumps up to 1600MHz.
 
Joined
Jul 21, 2010
Messages
386 (0.08/day)
Location
Romania
Processor Intel Core i5 4570
Motherboard Gigabyte GA-Z87-HD3
Cooling Stock
Memory 8GB Kingston ValueRAM CL9 1333MHz DDR3
Video Card(s) Gigabyte GeForce GTS450OC-1GL
Storage 1TB WD Black + 1.5TB WD Black + Kingston V300 120GB
Display(s) T200HD
Case Delux MZ401
Audio Device(s) onboard
Power Supply Enermax NAXN 500W
Software Windows 8 Pro x64
It's 800MHz, but DDR3 has twice the effective clock, so 1600MHz. Everything is okay with your card.
 
Joined
Dec 9, 2013
Messages
911 (0.24/day)
System Name BlueKnight
Processor Intel Celeron G1610 @ 2.60GHz
Motherboard Gigabyte GA-H61M-S2PH (rev. 1.0)
Memory 1x 4GB DDR3 @ 1333MHz (Kingston KVR13N9S8/4)
Video Card(s) Onboard
Storage 1x 160GB (Western Digital WD1600AAJS-75M0A0)
Display(s) 1x 20" 1600x900 (PHILIPS 200VW9FBJ/78)
Case μATX Case (Generic)
Power Supply 300W (Generic)
Software Debian GNU/Linux 8.7 (jessie)
It's 800MHz, but DDR3 has twice the effective clock, so 1600MHz. Everything is okay with your card.
This, and I wouldn't put much trust in what the ZOTAC website says...

Here's the info for my 9500GT: http://www.zotac.com/products/graph...er/DESC/amount/10/section/specifications.html

It says it has a 650MHz core clock, 1800MHz (900MHz) for the memory and 1625MHz for the shaders, while my card has a 550MHz core clock, a 1000MHz (500MHz) memory clock and a 1350MHz shader clock.

But you think the model is wrong? That's the model that is PRINTED on the box and on my card. And the info for the DDR2 version isn't correct either: http://www.zotac.com/products/graph...er/DESC/amount/10/section/specifications.html

Mine is GDDR2, 512MB, 550-500 (1000 effective)-1350.

Disappointment. I would fire all the people in charge of that website. :shadedshu:

But the card is good. :)
 
Joined
Dec 27, 2014
Messages
4 (0.00/day)
Location
Austria
System Name Windows XP
Processor AMD Athlon II X3 455
Memory 2*2GB
Video Card(s) NVIDIA GeForce GT 640 synergy
Audio Device(s) -
Are you 100% sure it's running at 800MHz?

Could it be that it's sitting in power-saving mode, like 99% of today's GPUs do when idle?

I would download GPU-Z and run the render test to see if the speed jumps up to 1600MHz.

Yes, I've run the render test. Same result.
 
Joined
Dec 9, 2013
Messages
911 (0.24/day)
System Name BlueKnight
Processor Intel Celeron G1610 @ 2.60GHz
Motherboard Gigabyte GA-H61M-S2PH (rev. 1.0)
Memory 1x 4GB DDR3 @ 1333MHz (Kingston KVR13N9S8/4)
Video Card(s) Onboard
Storage 1x 160GB (Western Digital WD1600AAJS-75M0A0)
Display(s) 1x 20" 1600x900 (PHILIPS 200VW9FBJ/78)
Case μATX Case (Generic)
Power Supply 300W (Generic)
Software Debian GNU/Linux 8.7 (jessie)
Yes, I've run the render test. Same result.
It is correct; it shouldn't show 1600MHz. My card is 1000MHz effective, and GPU-Z always shows 499.5MHz.

Nothing wrong at all.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
My GPU only uses 800 of its 1600MHz memory clock. When I try to overclock it, the computer crashes.
You're seeing the actual clock rate. DDR stands for Double Data Rate, which means it uses both the rising and falling edge of each clock cycle to transmit data. As a result, a base clock speed of 800MHz has a DDR "rate" twice as high: 1600MHz. So I think you're misunderstanding what you're looking at.
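To put that same arithmetic in code, here is a minimal Python sketch, just for illustration, using the 800MHz value reported in this thread:

```python
# DDR transfers data on both the rising and falling clock edge,
# so the effective ("marketing") clock is twice the real clock GPU-Z reports.
real_clock_mhz = 800        # what the card actually runs at
edges_per_cycle = 2         # Double Data Rate: rising + falling edge

effective_clock_mhz = real_clock_mhz * edges_per_cycle
print(f"{real_clock_mhz} MHz real -> {effective_clock_mhz} MHz effective")
# prints: 800 MHz real -> 1600 MHz effective
```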
 
Joined
Feb 19, 2009
Messages
1,831 (0.33/day)
Location
UK Warwickshire
System Name PC-Chips
Processor Ryzen 5 5600x
Motherboard Asus ROG Strix B550-F Gaming.
Cooling Thermalright Peerless Assassin 120 SE CPU Air Cooler 6 heat pipes.
Memory Patriot Viper 32gig dual channel 3600mhz
Video Card(s) PowerColor HellHound RX 7900 GRE OC
Storage 2X Samsung 860 EVO SSD's 500gig / 2TB crucial P3-NVME / WD-BLUE SN550 1TB M.2 / SP A55 512gig
Display(s) Panasonic 40-inch 4k TV
Case Modded NZXT H510
Audio Device(s) Realtek S1220A - Yamaha A-S501 AMP - 4 x Wharfedale diamond 9.1 speakers - Wharfedale SW150 sub
Power Supply EVGA SuperNOVA G6 750W 80+ Gold
Mouse Some cheap wireless thing
Keyboard Razer Cynosa lite
VR HMD Oculus Quest 2 128gig version
Software Windows 11 pro 64bit
Yes, I've run the render test. Same result.

Sorry, I was half asleep when I posted :( I also have to agree with the people who have replied.

Your card might show 800MHz, but the effective speed is 1600MHz.

My HD7970 has GDDR5 and it shows as 1375MHz, but its effective clock speed is 5500MHz.

It's odd how they do it, and it can be very confusing.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
My HD7970 has GDDR5 and it shows as 1375MHz, but its effective clock speed is 5500MHz.

It's odd how they do it, and it can be very confusing.
It's not that confusing; they're different technologies. GDDR5 is quad-pumped, or QDR (quad data rate; see Wikipedia).

While I don't condone relying solely on Wikipedia for your information, its summary of QDR is fairly straight to the point:
Quad data rate (or quad pumping) is a communication signaling technique wherein data are transmitted at four points in the clock cycle: on the rising and falling edges, and at two intermediate points between them. The intermediate points are defined by a 2nd clock that is 90° out of phase from the first. The effect is to deliver four bits of data per signal line per clock cycle.[1]

In a quad data rate system, the data lines operate at twice the frequency of the clock signal. This is in contrast to double data rate systems where the clock and data lines operate at the same frequency.[1]

QDR technology was introduced by Intel in their Willamette core Pentium 4 CPU, and is currently employed in their Atom, Pentium 4, Celeron, Pentium D, and Core 2 Processor ranges. This technology has allowed Intel to produce chipsets and microprocessors that can communicate with each other at data rates expected of the traditional FSB technology running from 400 MT/s to 1600 MT/s, while maintaining a lower and thus more stable actual clock frequency of 100 MHz to 400 MHz.[2]
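To compare the two cards mentioned in this thread side by side, here is a small Python sketch; the pumping factors simply follow the double- and quad-pumped descriptions above:

```python
# Transfers per clock cycle for the memory types discussed in this thread.
PUMPING = {"DDR3": 2, "GDDR5": 4}  # double-pumped vs. quad-pumped

def effective_mhz(real_clock_mhz, mem_type):
    """Effective clock derived from the real clock that GPU-Z reports."""
    return real_clock_mhz * PUMPING[mem_type]

print(effective_mhz(800, "DDR3"))    # 1600  (GT 640 Synergy)
print(effective_mhz(1375, "GDDR5"))  # 5500  (HD7970)
```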
 
Joined
Dec 27, 2014
Messages
4 (0.00/day)
Location
Austria
System Name Windows XP
Processor AMD Athlon II X3 455
Memory 2*2GB
Video Card(s) NVIDIA GeForce GT 640 synergy
Audio Device(s) -
Thanks! I think I understand it now.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Thanks! I think I understand it now.
I should really clarify this, as terms tend to get thrown around but not properly used, and I'm guilty of this.

The proper way of saying this is that you have DDR3 memory running at 800MHz (1600MT/s), where MT/s stands for mega- (SI prefix) transfers per second, since you're making two transfers per clock cycle.

Once again, to quote Wikipedia (which I should stop doing):
The units usually refer to the "effective" number of transfers, or transfers perceived from "outside" of a system or component, as opposed to the internal speed or rate of the clock of the system.
So it's important to make a distinction between clock rate and transfer rate.
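A rough sketch of why the transfer rate is the number that actually matters for bandwidth (Python; note that the 128-bit bus width is an assumption for a GT 640-class card, not something stated in this thread):

```python
# Clock rate vs. transfer rate, and the resulting memory bandwidth.
clock_mhz = 800                   # real DDR3 clock reported by GPU-Z
transfers_per_clock = 2           # DDR: two transfers per cycle
bus_width_bits = 128              # ASSUMED bus width for a GT 640-class card

transfer_rate_mts = clock_mhz * transfers_per_clock           # 1600 MT/s
bandwidth_gb_s = transfer_rate_mts * 1e6 * bus_width_bits / 8 / 1e9

print(f"{clock_mhz} MHz clock = {transfer_rate_mts} MT/s")
print(f"~{bandwidth_gb_s:.1f} GB/s peak memory bandwidth")    # ~25.6 GB/s
```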
 