Daylight saving time in the United States will start this weekend. Are your computers ready?
For daylight saving time (DST), the time will shift forward one hour on Sunday, March 11th at 2am. (And it'll shift back on Sunday, November 4th at 2am.) So what? Well, as I documented in "Daylight Savings Time and Java" (five months ago), computers may not handle this time change properly. The popular press has picked up on this story in the last week or two and begun to ask, "Daylight Savings: Y2K All Over Again?," to which the more technically savvy computer press answers, "Daylight savings a nuisance, but no Y2K."
BTW, changes are necessary. IBM has a site listing the patches for its products; see "IBM Alerts and Daylight Savings Time." Microsoft has a similar Daylight Saving Time Help and Support Center site.
What I don't understand is why DST is such a big deal. Y2K was about us programmers taking shortcuts with how dates are stored, and the chickens came home to roost at the end of the century (give or take a year). But DST isn't about storing dates or times; it's about human preferences for when sunrise and sunset occur, and where the sun sits at the moment we call "noon." What do computers care where the sun is?
DST should only affect the way the current time is displayed to us humans; errors there might confuse us into being an hour late for appointments. But why is it a problem for the computers themselves? The IT ISAC (Information Sharing and Analysis Center) warns in "IT-ISAC: 2007 Daylight Saving Time Alert" (PDF) that the change could cause "failures of systems that depend on correct time stamps to store, monitor or help operate critical infrastructures."
Failures? Really?! How dumb are computers these days? Maybe I don't get it, but this doesn't seem like that difficult a problem to solve.
All computers' clocks should synchronize their time to one unchanging standard, usually Greenwich Mean Time (GMT), or more precisely its modern successor, Coordinated Universal Time (UTC). Many international timestamped data sources "all use UTC to avoid confusion about time zones and daylight saving time." Sound familiar?
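To illustrate what timestamping against that unchanging standard looks like in practice, here is a minimal Python sketch (the log line and "server started" message are just hypothetical examples, not anything from the alerts cited above):

```python
from datetime import datetime, timezone

# Record events in UTC so log entries sort correctly across
# DST transitions and across machines in different time zones.
event_time = datetime.now(timezone.utc)

# The "+00:00" suffix makes the zone explicit in the record.
log_line = f"{event_time.isoformat()} server started"
print(log_line)
```

Because the stored instant carries no local-time ambiguity, two events logged an hour apart stay an hour apart even if a DST shift happens between them.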
I think this is the way Unix computers have worked for 30 years now. Their internal clocks run on GMT/UTC time and never change (except if/when they need to be resynchronized; see "Network Time Protocol (NTP)"). The way they display time to the user does take into account the time zone the computer is located in and daylight saving time, but that's just the display. If you move such a computer to a different time zone and change its time zone setting, the computer doesn't change its clock; it just changes the offset added to the time when displaying it to the user. But logs that the computer keeps and signals sent to other computers should all be scheduled and timestamped by the computer's internal GMT/UTC time.
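You can see this separation of clock and display directly on a Unix-like system. A small Python sketch (using the POSIX-only `time.tzset`, so assume Linux or similar; the zone names are standard IANA identifiers):

```python
import os
import time

# The kernel clock is just seconds since the epoch, in UTC --
# the same number no matter where the machine sits.
now = time.time()

# Display is where the time zone (and DST rules) come in: changing
# TZ changes only the offset applied when rendering, not the clock.
os.environ["TZ"] = "America/New_York"
time.tzset()
eastern = time.strftime("%Y-%m-%d %H:%M %Z", time.localtime(now))

os.environ["TZ"] = "UTC"
time.tzset()
utc = time.strftime("%Y-%m-%d %H:%M %Z", time.localtime(now))

# Same instant, two different renderings.
print(utc, "vs", eastern)
```

The value of `now` never changes between the two renderings; only the human-facing formatting does, which is exactly why a DST patch should be a display-layer fix rather than a clock fix.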
So if computers worked this way today, I would think there should be no system failures just because of DST. Patches would be needed for displaying time to humans and interpreting times entered by humans, but computers and networks should be able to keep time internally just fine. I thought this was closer to the way things worked today, but apparently not. That's what's really frightening.
Why do computers care about daylight saving time?