How does mains frequency relate to computer clock speed, or does it?
November 23, 2004 6:57 AM

Electricity and computers. 3 questions about:
1] Mains frequency
2] Computer clock speed
3] Putting 1 and 2 together. [~240v inside]

1] At school we were told that the mains supply isn't a constant 60Hz, but that the average over a day is 60Hz. The theory was that under heavy load during the day the power system runs slow (let's say 58Hz), and that during the night it creeps up to 64Hz to compensate. Was this ever true, and is it still so?

2] How do computers regulate clock speed? Is it analogous to the quartz crystal in a watch, or do they multiply up the mains frequency?

3] Does this mean a PC (theoretically) runs faster at night?
posted by twine42 to Computers & Internet (9 answers total)
They use a crystal oscillator to create the clock pulses that drive the CPU and its associated data buses, so unfortunately they are unlikely to change speed overnight!
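For concreteness, here's a minimal sketch (with made-up but typical 2004-era numbers, not from any particular board) of how a core clock is derived from a fixed crystal rather than from the mains:

```python
# Sketch: a PC's clocks come from a fixed quartz crystal, multiplied
# up by a PLL/clock synthesizer; the mains frequency never enters into it.
# All figures below are illustrative.

CRYSTAL_HZ = 14_318_180   # a common PC reference crystal (~14.318 MHz)
FSB_HZ = 200_000_000      # front-side bus clock synthesized from the crystal
CPU_MULTIPLIER = 10       # core clock = FSB * multiplier

cpu_hz = FSB_HZ * CPU_MULTIPLIER
print(f"CPU clock: {cpu_hz / 1e9:.1f} GHz")  # 2.0 GHz, day or night
```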

posted by gi_wrighty at 7:08 AM on November 23, 2004

Response by poster: That's what I guessed, but what about [1]?
posted by twine42 at 7:09 AM on November 23, 2004

In the UK at least, the mains supply runs at an almost constant 50Hz. I once did work experience at a local power station and saw a big display in the control room showing the current frequency of the National Grid - it rarely strayed outside 49.9 to 50.1Hz.

These fluctuations can affect devices that rely on the mains frequency for whatever reason - such as old tape recorders that used the AC supply to drive the motor, which would put a warble in the recordings. Other than that, most modern digital devices convert the AC to a DC supply as it enters the device, and this conversion goes some way towards removing the effects of fluctuations in the supply.
posted by gi_wrighty at 7:18 AM on November 23, 2004

gi_wrighty is right - the clock of a computer is entirely independent of the input voltage. in fact, your computer's power supply irons out the AC into DC, which has no fluctuations.
posted by fake at 7:56 AM on November 23, 2004

There is a small fluctuation in the mains frequency, but not as much as you indicate. You can grab a multimeter and stuff it into your wall socket to check... (Not really recommended... but it'll work)

The powersupply in your computer deals with all sorts of funny power and converts it all to DC for inside the case.

Really the only modern things affected by a frequency change are lightbulbs. That's why they dim a bit when you plug in your 2000W industrial hairdryer. :)
posted by defcom1 at 8:06 AM on November 23, 2004

#1. Mains frequency should be VERY close to 60 Hz at all times in 60 Hz areas, and 50 Hz in 50 Hz areas. The reason is that some clocks are driven by the mains frequency, and they would obviously drift if it wasn't held steady.

This is, in fact, something most power plants carefully regulate, and if they do drift off the base frequency, they make it up by lagging or speeding up at night (but not by much - say it was 59.95 Hz during the day, they'd run at 60.05 overnight). 64 Hz just isn't likely to happen, IMHO.
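As a back-of-the-envelope sketch of how that overnight make-up cancels the error in mains-synchronous clocks (the 59.95/60.05 Hz figures are illustrative, as above):

```python
# Sketch: a clock driven by the mains counts cycles, so its error is
# proportional to the frequency deviation. Running slightly fast at
# night for the same duration cancels a slightly slow daytime.

NOMINAL_HZ = 60.0
day_hz, night_hz = 59.95, 60.05   # illustrative, per the post above
hours = 12

def clock_error_seconds(actual_hz, nominal_hz, duration_s):
    """Seconds a mains-synchronous clock gains (+) or loses (-)."""
    return (actual_hz - nominal_hz) / nominal_hz * duration_s

day_err = clock_error_seconds(day_hz, NOMINAL_HZ, hours * 3600)
night_err = clock_error_seconds(night_hz, NOMINAL_HZ, hours * 3600)
print(f"daytime error:  {day_err:+.1f} s")              # -36.0 s
print(f"overnight fix:  {night_err:+.1f} s")            # +36.0 s
print(f"net over 24 h:  {day_err + night_err:+.1f} s")  # +0.0 s
```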

Now, as an aside, during WW II the Germans (and possibly allies, but of that I'm not certain) had an extremely difficult time generating power at the correct frequency, so at that time wild swings like 64 Hz were very likely.

BTW: As the frequency increases, transformers in linear power supplies (nowadays usually just wall warts) become increasingly inefficient and heat up. This is why using a 2:1 transformer to step down 220 V for 120 V devices isn't always a great idea. Not to mention your cheap clock will be WAY off.

defcom1, the reason your lights dim from the hair dryer isn't a change in line frequency (that's almost impossible without violating a lot of CSA/UL approvals). It's because the dryer draws a large current while the elements heat up, and that current flowing through the resistance of the house wiring drops the voltage available at the socket (V = I*R). The voltage lost this way is dissipated as heat in the wires.
posted by shepd at 8:25 AM on November 23, 2004

shepd, are you sure inefficiency increases with increasing frequency? I would think the lower the frequency, the closer it would be to DC, and more energy would be spent as heat since transformers are only effective with AC.
posted by nTeleKy at 2:03 PM on November 23, 2004

Argh, nTeleKy, yes, you're absolutely right. I wrote that backwards.

However, my suggestion about using 120 volt devices (which are almost always designed for 60 Hz) on 220 volt mains (which is usually 50 Hz) stepped down to 120 volts was right...
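The relation behind this is the transformer EMF equation, V ≈ 4.44·f·N·A·B_peak, so peak core flux scales with V/f. A quick sketch with illustrative values:

```python
# Sketch: peak transformer core flux scales as V/f (from the EMF
# equation V = 4.44 * f * N * A * B_peak). A transformer designed for
# 60 Hz, run on 50 Hz at the same voltage, sees ~20% more peak flux,
# pushing the core toward saturation and extra heating.

def relative_flux(volts, hertz, ref_volts=120.0, ref_hertz=60.0):
    """Peak flux relative to the design point (ratio of V/f values)."""
    return (volts / hertz) / (ref_volts / ref_hertz)

print(f"{relative_flux(120.0, 60.0):.2f}")  # design point: 1.00
print(f"{relative_flux(120.0, 50.0):.2f}")  # same voltage at 50 Hz: 1.20
```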
posted by shepd at 4:32 PM on November 23, 2004

In an authoritative and convincing article about the last big blackout, they threw in an astonishing fact: the entire North American power grid is not just on a 60 hertz cycle, it's the same 60 hertz in every part of the grid - the entire continent's grid is synched to the same rising and falling wave. Don't ask me how they account for signal propagation timing across 4000+ miles, though.

The need to synchronize with the master wave was cited as one of the reasons isolated working power stations couldn't independently power their own local users.
posted by NortonDC at 7:07 PM on November 23, 2004
