Wall clocks vs PC clocks
August 6, 2008 1:25 PM

My phones, watch, mp3 player, wall clocks and bedside alarm clock all keep near-perfect time (±1 minute a month) with little or no brain. Computers, with their comparatively limitless power, seem unable to maintain anything even vaguely resembling accurate time without connecting to an atomic clock on a daily basis (it's not uncommon for my desktop PC to lose 10 minutes when left switched on overnight). Why are computers so bad at keeping time, and why does no-one improve them?
posted by twine42 to Computers & Internet (17 answers total)
 
Response by poster: Dammit... One day I'll learn that preview wrecks post formatting. That was supposed to be three paragraphs.

I know my question singles out my work PC for blame here, but the majority of PCs I work with do the same to a greater or lesser extent. Oh, and I'm not after recommendations for time-syncing programs (I already use the analogx sync for that); I'm just asking why PCs can't keep count themselves.
posted by twine42 at 1:28 PM on August 6, 2008


I'm not sure I agree, at all. My computer's perfectly reliable, and I've never experienced anything like your ten-minute loss.
posted by Tomorrowful at 1:32 PM on August 6, 2008 [1 favorite]


Several former desktops of mine had horrible issues with keeping accurate time. The manufacturers blamed it on the batteries in their respective motherboards. Maybe there's just something about the batteries computer manufacturers use, or about motherboards sitting in storage, running their batteries down for who knows how long, before they're put into computers and sold.
posted by aswego at 1:35 PM on August 6, 2008


First Google hit for PC clock accuracy (scroll down to the "PC Time Accuracy" section).
posted by DevilsAdvocate at 1:39 PM on August 6, 2008


One issue is that in Windows Server domains and other Kerberos environments, a computer's time and date should be dictated by an internal time server, not an external one. Thus, your internal network's time could be totally off from the rest of the world.

On a personal computer, if the time is not set to automatically update from a time server, then again the time could easily be thrown off. I can't find anything to confirm my sketchy memory, but I recall reading about issues with XP's default time.nist.gov server not returning results, which left computers without updated timestamps. Though I'm probably remembering it wrong.
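A quick way to see how far off a machine actually is, independent of whatever the OS sync service says, is to ask a time server yourself. A bare-minimum SNTP query in Python, just as a sketch (pool.ntp.org is only an example server, and a real client would add retries and sanity checks):

    import socket
    import struct
    import time

    NTP_DELTA = 2208988800  # seconds between the NTP epoch (1900) and the Unix epoch (1970)

    def ntp_time(server="pool.ntp.org", timeout=5):
        """Return the current Unix time as reported by an SNTP server."""
        packet = b"\x1b" + 47 * b"\0"  # LI=0, version=3, mode=3 (client); rest zeroed
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            sock.sendto(packet, (server, 123))
            data, _ = sock.recvfrom(48)
        seconds = struct.unpack("!I", data[40:44])[0]  # transmit timestamp, integer seconds
        return seconds - NTP_DELTA

    print("local clock is off by about %+.1f seconds" % (time.time() - ntp_time()))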

And yeah, if the CMOS battery is dying, then the clock will lag whenever the machine is off.
posted by jmd82 at 1:40 PM on August 6, 2008


Mobile phones sync over the air on a regular basis. That's why the time on your phone automatically changes when you get off a plane in a new time zone. Many MP3 players sync up their clocks whenever they are connected to a computer.

Quartz clock accuracy has almost nothing to do with complexity or having a "brain" and a lot to do with the oscillator that keeps time, whose stability is greatly affected by temperature. The inside of a computer isn't all that thermally stable, especially in comparison to your bedroom or kitchen, where you're likely to be keeping a wall clock or alarm clock.
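To put rough numbers on that: crystal error is usually quoted in parts per million, and even small figures add up over a month. A quick back-of-the-envelope sketch in Python (the 100 ppm figure for a warm, uncompensated PC oscillator is only an assumption for illustration; watch crystals are typically specced around 20 ppm):

    SECONDS_PER_MONTH = 30 * 24 * 3600

    for label, ppm in [("watch-grade crystal, ~20 ppm", 20),
                       ("uncompensated PC oscillator, ~100 ppm (assumed)", 100)]:
        drift_minutes = ppm * 1e-6 * SECONDS_PER_MONTH / 60
        print("%s: about %.1f minutes per month" % (label, drift_minutes))
    # ~0.9 min/month at 20 ppm, ~4.3 min/month at 100 ppm

Neither comes close to losing 10 minutes overnight, though; that would take an error of well over 1%, so drift alone probably isn't the whole story for the machine in the question.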
posted by strangecargo at 1:42 PM on August 6, 2008


Meaningless data point: As far as I can recall, I've never had this problem on any PC I've owned, regardless of how infrequently it is connected to the net. If your case is unusual (or the case of a flat back-up battery or similar), then that would explain why no-one seems troubled enough to fix it.
posted by -harlequin- at 2:07 PM on August 6, 2008


Prior to always keeping my computer synced with a time server, gross errors in computer time came down to bad batteries on the motherboard. Changing them kept the clocks pretty much on time, at least as well as my digital wristwatch.
posted by qwip at 2:30 PM on August 6, 2008


Response by poster: It is possible that the computers that suffer from it the most are the oldest PCs. Certainly some of the PCs in the office are 7+ years old.

strangecargo - Yeah, I didn't mean to imply that a quartz watch had a brain, but if a watch can manage that accuracy with a crystal and a bit of power, why couldn't a computer? Actually, do PCs have a quartz (or similar) oscillator in them?
posted by twine42 at 2:31 PM on August 6, 2008


Are you sure your machine isn't synching to another networked machine that is simply set to the wrong time? I had that happen at a previous place of business.
posted by Cool Papa Bell at 3:17 PM on August 6, 2008


Almost everything electronic with a processor or clock in it uses a quartz oscillator to regulate its clock cycles. Computers, mobile phones, clocks, wristwatches, televisions, etc.

Like I said in my first answer, temperature regulation plays a part in quartz accuracy, meaning that the clock in your wall clock or wristwatch is going to tend to be more stable than the clock in a computer, just due to thermal environment. As a quartz oscillator ages, its resonant frequency can also shift over time, a possible reason for older computers to be less accurate than newer ones.

I suspect another reason for the inaccuracy of PC clocks is that it just doesn't matter that much to manufacturers. As long as the clock is close enough, the computer will operate perfectly fine. No matter how accurate it is, a computer's internal clock rate is almost guaranteed to be off compared to that of a more accurate source, and that is why things like NTP exist. Most non-trivial NTP client implementations don't merely sync the internal clock periodically, but actually keep track of the amount of clock drift that exists between the local clock and the reference clock, constantly adjusting the local clock according to its measured frequency offset. The assumption is that the local clock is always off and, once the difference is quantified, it can be kept in check.
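A toy sketch of that drift-tracking idea in Python (real NTP daemons discipline the kernel clock and filter many samples; this only shows the bookkeeping, and the 0.7/0.3 blend is an arbitrary smoothing choice):

    import time

    class DriftCorrectedClock:
        """Toy drift tracker: estimate the local clock's frequency error
        against a reference and correct readings between syncs."""

        def __init__(self):
            self.freq_error = 0.0      # estimated fractional frequency offset (50e-6 = 50 ppm fast)
            self.offset = 0.0          # last measured offset vs. the reference, in seconds
            self.last_sync = time.time()

        def sync(self, reference_time):
            """Feed in a reference reading taken right now (e.g. from an NTP query)."""
            local = time.time()
            new_offset = local - reference_time    # positive means the local clock is ahead
            elapsed = local - self.last_sync
            if elapsed > 0:
                drift_rate = (new_offset - self.offset) / elapsed
                # Blend into the running estimate instead of trusting one noisy sample.
                self.freq_error = 0.7 * self.freq_error + 0.3 * drift_rate
            self.offset = new_offset
            self.last_sync = local

        def now(self):
            """Local time with the measured offset and predicted drift taken out."""
            local = time.time()
            return local - self.offset - self.freq_error * (local - self.last_sync)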
posted by strangecargo at 3:34 PM on August 6, 2008 [1 favorite]


FWIW, my computer last did a time sync a week ago and it is currently off by 3 seconds. However, I leave it on all the time, which may make a difference.
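Assuming it really was dead on a week ago, 3 seconds over a week works out to roughly 5 parts per million, which is wristwatch territory:

    drift_ppm = 3.0 / (7 * 24 * 3600) * 1e6
    print("about %.1f ppm" % drift_ppm)   # ~5 ppm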
posted by flug at 4:22 PM on August 6, 2008


It used to be (when I was a hardware designer, some aeons ago) that computer circuit boards used the electricity mains supply frequency, rather than a quartz oscillator, to regulate the real-time clock. I suspect that circuits have changed a bit, but the inaccuracy and wander of the 50 or 60 Hz supply tended to cause a lot of time drift.
posted by Susurration at 6:30 PM on August 6, 2008


I had always guessed (and it's just a guess) that this is because the oscillator wasn't made to tell time. A wristwatch's crystal is designed to tick over at a predictable (and slow) rate and to be reliable; if it's off by a hundredth of a percent, that's enough to reject it, because it would make a crappy watch. However, when we're talking about a crystal oscillator in a CPU, it can cycle billions of times a second. Nobody cares if it's off by a tenth of a percent, given all the other stuff that needs to go right for a CPU to work. And besides, computer clocks are easy to update.

Also you need smaller crystals for faster vibrations and smaller crystals are harder to make accurately.

disclaimer: I only know what I read on the Internet and I don't know if a computer's clock runs off the CPU clock or not.
posted by Ookseer at 1:32 AM on August 7, 2008


Every computer I've ever had has had time issues of some manner or another. Always slow.

It can't *just* be the battery, because the computer doesn't use the battery when it's plugged in. I've heard the same thing about the oscillator crystal: something about using a cheap and commonly available one that is close enough, but the math on whatever frequency it operates at just doesn't divide quite right. Something along the lines of not being exactly divisible by 60.

And as someone else mentioned, slower is better than faster when it comes to networking and time stamps and the like.
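The classic PC interval timer is the usual example of numbers that don't divide neatly: it is clocked at 1,193,182 Hz and, with the default divisor of 65,536, fires about 18.2 interrupts per second. The rounded increment below is purely hypothetical, just to show how fast a sloppy per-tick value would pile up:

    PIT_HZ = 1193182     # input clock of the classic PC interval timer
    DIVISOR = 65536      # default divisor -> roughly 18.2 interrupts per second

    tick_hz = PIT_HZ / DIVISOR    # 18.2065... Hz, not a round number
    tick_s = DIVISOR / PIT_HZ     # 0.054925... s per tick

    # Hypothetical sloppiness: an OS adding a rounded 55 ms per tick
    # instead of the exact value would gain time at this rate.
    gain_per_day = (0.055 - tick_s) * tick_hz * 86400
    print("%.4f ticks/s, %.4f ms per tick" % (tick_hz, tick_s * 1000))
    print("rounding the increment to 55 ms would gain about %.0f s/day" % gain_per_day)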
posted by gjc at 3:57 AM on August 7, 2008


Why are computers so bad at keeping time, and why does no-one improve them?

They have improved them significantly. For example, they added an Internet connection and software to synchronize the computer's clock with a reliable time source on the Internet! When you can do that, why would you bother making any further improvements? That's as improved as it's ever going to get.

Apple seems to have done something interesting. When the overclocking software came out for the Mac Pro recently, it was revealed that the machine actually keeps time in software, reading the clock chip only at startup. Thus, overclocking the machine after it had started up actually accelerated the real-time clock! I don't know why they did it this way, but I presume it must have been more accurate, possibly due to temperature-related fluctuations.
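Very roughly, and purely as a hypothetical sketch (not Apple's actual implementation), software timekeeping like that amounts to reading the hardware clock once at startup and deriving every later reading from an elapsed-tick counter, so if the tick source speeds up, the derived wall-clock time speeds up with it:

    import time

    # One-time read of the hardware clock at startup (stand-in), plus a tick counter.
    boot_wall_time = time.time()
    boot_ticks = time.monotonic()

    def software_clock():
        """Wall-clock time derived purely from ticks elapsed since boot."""
        return boot_wall_time + (time.monotonic() - boot_ticks)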
posted by kindall at 5:26 AM on August 7, 2008


This thread is closed to new comments.