Apple’s marketing copy for the Watch:
High-quality watches have long been defined by their ability to keep unfailingly accurate time, and Apple Watch is no exception. In conjunction with your iPhone, it keeps time to within 50 milliseconds of the definitive global time standard.
Since the Watch was announced in 2014, Apple has touted its extraordinary accuracy. I’ve never understood why I should be impressed by this.
For nearly a century, quartz oscillators have made it possible to build incredibly precise timepieces. As early as 1929, the U.S. National Bureau of Standards relied on quartz clocks that drifted from actual time by less than half a second per month. These days, even a $10 Casio wristwatch from your local gas station likely loses less than a minute per year—accurate enough for nearly every practical use.
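To put those two drift figures on a common scale, it helps to convert them to fractional accuracy in parts per million. This is plain arithmetic; the specific drift numbers are the ones quoted above.

```python
# Convert a drift (seconds gained/lost over some interval) to parts per million.
SECONDS_PER_MONTH = 30 * 86400
SECONDS_PER_YEAR = 365 * 86400

def drift_ppm(drift_seconds: float, interval_seconds: float) -> float:
    return drift_seconds / interval_seconds * 1e6

# 1929 Bureau of Standards clock: 0.5 s/month
print(f"{drift_ppm(0.5, SECONDS_PER_MONTH):.2f} ppm")  # ~0.19 ppm
# Cheap quartz watch: 60 s/year
print(f"{drift_ppm(60, SECONDS_PER_YEAR):.2f} ppm")    # ~1.90 ppm
```

By this measure the 1929 laboratory clock was actually an order of magnitude more stable than a modern gas-station watch—but both are far better than anyone needs for daily life.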
Digital devices—including laptops and phones—also rely on quartz-based oscillators. But they have an additional advantage over “dumb” timepieces: an Internet connection. Using the Network Time Protocol (NTP), our devices synchronize themselves against precisely tuned time servers on the Internet. NTP keeps our computer clocks within a few dozen milliseconds of “actual” time; that probably explains Apple’s “50 millisecond” figure in the marketing quoted above.
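The core of NTP’s simpler cousin, SNTP (RFC 4330), fits in a few lines: send a 48-byte UDP request to port 123 and read the server’s 64-bit transmit timestamp out of the reply. The sketch below uses only the Python standard library; the function names are my own, and it deliberately ignores the round-trip-delay compensation a real client such as ntpd performs to hit those few-dozen-millisecond error bounds.

```python
import socket
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_EPOCH_OFFSET = 2_208_988_800

def parse_transmit_timestamp(packet: bytes) -> float:
    """Extract the server's transmit timestamp (bytes 40-47) as Unix time."""
    seconds, fraction = struct.unpack("!II", packet[40:48])
    return seconds - NTP_EPOCH_OFFSET + fraction / 2**32

def query_ntp(server: str = "pool.ntp.org") -> float:
    # First byte 0x1B: leap indicator 0, version 3, mode 3 (client).
    request = b"\x1b" + 47 * b"\0"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(request, (server, 123))
        packet, _ = sock.recvfrom(512)
    return parse_transmit_timestamp(packet)
```

A full NTP client additionally timestamps the request and reply locally, estimates the network round-trip delay, and slews the system clock gradually rather than jumping it—which is how ordinary consumer hardware stays within tens of milliseconds of the global standard.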
Now, Apple actually claims that the Watch is “far more accurate as a timekeeping device than the iPhone.” This makes little sense to me, since both devices presumably depend on the same NTP servers.
And even if the Watch were somehow slightly more accurate than my other digital devices… should I care? Do average consumers even need the exact time, down to fractions of a second? Are atomic physicists timing their experiments using the Watch? Do NASA engineers schedule booster ignition using Siri? Do international secret agents synchronize their capers by watching Mickey Mouse’s hand? I honestly can’t imagine a realistic scenario where even a few seconds’ error makes a difference in everyday life.