Time, famously, stops for no one. Yet it doesn’t pass the same way in every place—and at one point in human history, that was a major source of contention.
In the new book The Order of Time, published in April, quantum physicist Carlo Rovelli discusses the history of human tools used to measure time. Hard as it may be to fathom in our current era, people’s lives weren’t always ruled by the clock.
“Sundials, hourglasses, and water clocks already existed in the ancient world, in the Mediterranean region and China—but they did not play the cruel role that clocks have today in the organization of our lives,” Rovelli writes.
Clock-regulated time began in 14th-century Europe, where every city and village had a sundial and measured the passing hours with ringing church bells. Still, these bells didn’t toll together; Venice and Paris, say, weren’t in sync.
Instead, every place, even those close together, operated on a slightly different time. Everyone agreed that noon was when the sun was highest in the sky. But the sun moves from east to west, so midday differed in various locales. The hour indicated on a sundial is called “apparent solar time,” or “true local time.”
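The gap between two towns’ solar noons follows directly from Earth’s rotation: 360 degrees in roughly 24 hours works out to about four minutes of clock time per degree of longitude. Here is a back-of-the-envelope sketch, not from Rovelli’s book, with approximate longitudes chosen purely for illustration:

```python
# Back-of-the-envelope sketch: how far apart two towns' local solar noons fall.
# Earth turns 360 degrees in about 24 hours, so one degree of longitude
# corresponds to roughly 4 minutes of clock time. Longitudes are approximate.

MINUTES_PER_DEGREE = 24 * 60 / 360  # = 4.0 minutes of time per degree

def solar_noon_gap(longitude_a: float, longitude_b: float) -> float:
    """Minutes separating local solar noon at two longitudes (degrees east positive)."""
    return abs(longitude_a - longitude_b) * MINUTES_PER_DEGREE

# Venice (~12.3 E) vs. Paris (~2.35 E): solar noons roughly 40 minutes apart.
print(f"Venice vs. Paris: about {solar_noon_gap(12.3, 2.35):.0f} minutes")
```

By this reckoning, Venice and Paris, the out-of-sync pair mentioned above, kept noons roughly 40 minutes apart.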
It wasn’t until the 19th century that train travel exposed the need for a more uniform way of measuring time. “It is awkward to organize train timetables if each station marks time differently,” Rovelli writes. And so the notion of “standard time” was born.
In November 1840, the Great Western Railway in Britain adopted this fictitious measure, setting its timetables to Greenwich Mean Time. Other railways soon followed suit. Standard time didn’t become the law of the land until 1880, however, when the Statutes (Definition of Time) Act received Royal Assent in the UK.
Britain proposed that the rest of the world follow its lead, suggesting that the whole globe set its clocks by London’s noon, even in places where clock noon would then fall in the middle of the night. As Rovelli puts it, “people are attached to local time,” so this idea didn’t fly.
But compromises had to be made, or chaos would reign. Lack of standardization was bad for businesses like the shipping industry. So, in 1884, delegates from around the world met for the International Meridian Conference in Washington DC, where Britain convinced the other countries to adopt the Greenwich meridian as the prime meridian for measuring longitude, and Greenwich Mean Time as the reference for timekeeping.
The reasoning was practical: the Brits had more ships than all other nations put together, along with well-developed nautical charts and extensive astronomical timekeeping data, all referenced to Greenwich. GMT thus became the standard, and the globe was divided into 24 corresponding one-hour time zones. “In this way, the discrepancy between 12 on the clock and local midday is limited to 30 minutes,” the physicist explains.
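That 30-minute bound is straightforward arithmetic: with 24 one-hour zones, each zone spans 15 degrees of longitude, so no place lies more than 7.5 degrees, about 30 minutes of solar time, from its zone’s reference meridian. A minimal sketch of the calculation, assuming idealized zone widths (real boundaries follow political lines) and an approximate longitude for Detroit, which comes up below:

```python
# Why hour-wide zones cap the gap between clock noon and solar noon at ~30 minutes.
# Idealized 15-degree zones; real zone boundaries follow political borders.

DEGREES_PER_ZONE = 360 / 24          # 15 degrees of longitude per one-hour zone
MINUTES_PER_DEGREE = 24 * 60 / 360   # 4 minutes of solar time per degree

max_gap = (DEGREES_PER_ZONE / 2) * MINUTES_PER_DEGREE   # 7.5 degrees -> 30 minutes
print(f"Maximum clock-vs-sun discrepancy: {max_gap:.0f} minutes")

# The same arithmetic yields the 28-minute adjustment Detroit later resisted:
# the city sits near 83 degrees W, about 7 degrees east of the 90th meridian
# that anchors Central Standard Time.
detroit_gap = (90.0 - 83.05) * MINUTES_PER_DEGREE
print(f"Detroit vs. Central Standard Time: about {detroit_gap:.0f} minutes")
```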
Still, standardization was a slow process, and one that faced some resistance. Railways in the US and Canada coordinated their clocks in November 1883. But the city of Detroit, for example, clung to sun time. In 1900, the Detroit City Council ordered that clocks be set back 28 minutes to conform to Central Standard Time. Citizens refused, however, and the decision was rescinded.
In fact, uniformity wasn’t established in the US until 1918, when the Standard Time Act passed. Today, there are 37 different local times in use around the world.
Now, technology is changing our relationship with time again. In Britain, schools are removing analog clocks from exam room walls, replacing the traditional round-faced timepieces with digital ones that today’s students find easier to read. Malcolm Trobe, deputy general secretary at the Association of School and College Leaders, told The Telegraph in April, “You don’t want them to put their hand up to ask how much time is left.”