Not content with being a half-a-trillion-dollar company with 2 billion users worldwide, Facebook casually announced this week it had invented a new unit of time. The “flick,” at 1/705,600,000th of a second, is longer than a nanosecond (1/1,000,000,000th of a second), and is designed to measure the time individual video frames appear on screen.
In the fields of motion graphics, virtual reality, and video, “frame rate” has long described the frequency at which consecutive images appear on a display. Frame durations, though, are conventionally recorded in seconds or nanoseconds, and dividing one second by most common frame rates produces an infinitely repeating decimal. For example, 24 frames per second works out to one frame every 0.0416666666666666 (ad infinitum) seconds. Rounding those awkward numbers can in turn create syncing errors.
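A quick sketch of the problem (in Python; the frame rates below are illustrative, and the arithmetic, not any particular tooling, is the point): even at nanosecond precision, frame durations at common rates never come out even, so any fixed-precision timestamp has to round.

```python
from fractions import Fraction

# Exact frame durations at common frame rates, expressed in nanoseconds.
# None of these divide evenly, so fixed-precision timestamps must round.
for fps in (24, 30, 90, Fraction(24000, 1001)):  # 24000/1001 ≈ 23.976, the NTSC film rate
    duration_ns = Fraction(1_000_000_000) / fps
    print(f"{float(fps):7.3f} fps -> {float(duration_ns):,.4f} ns per frame")
```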
Facebook’s “flick” is sized so that the duration of a frame at every common frame rate—and of a sample at common audio rates—comes out as a whole number of flicks. It creates a simple, standardized time measurement that in theory will make it easier for those who create such images to describe and sync the speeds of their work. The greater the complexity of the moving image, the more useful flicks become; no surprise, then, that the idea for flicks came from Facebook’s virtual-reality division, Oculus.
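A minimal demonstration (Python; the constant 705,600,000 comes from Facebook’s announcement, while the loop is simply an arithmetic check): each of these video frame rates, and even standard audio sample rates, divides a second’s worth of flicks exactly.

```python
FLICKS_PER_SECOND = 705_600_000  # one flick = 1/705,600,000 of a second

# Common video frame rates and audio sample rates (all in Hz): each divides
# the flick count for one second exactly, so frame and sample lengths are
# whole numbers and never need rounding.
for rate in (24, 25, 30, 48, 60, 90, 120, 44_100, 48_000):
    assert FLICKS_PER_SECOND % rate == 0
    print(f"{rate:>6} Hz -> {FLICKS_PER_SECOND // rate:>10,} flicks")
```

Even the fractional NTSC rates (24,000/1,001 frames per second and its relatives) work out to whole numbers of flicks per frame, which is the design rationale behind choosing 705,600,000 in the first place.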
Facebook’s decision to invent a unit of time to suit its needs may sound obnoxious and audacious. That, though, is how time works—and how it has always worked. Time as we think of it isn’t innate to the natural world; it’s a manmade construct intended to describe, monitor, and control industry and individual production. The number of flicks in a frame rate is no less objectively “real” than the number of seconds in a minute or hours in a day; all units of time are arbitrary inventions used for industrial means.
“We take this concept of time, this amorphous entity, and translate it into one of the most objective tangible entities: money,” says Robert Levine, psychology professor at California State University-Fresno and author of A Geography of Time. “A person gets paid by the hour; a lawyer gets paid by the minute.”
The history of time’s construction reveals its clear links with work. In medieval Europe, for example, wages were paid by the day, and as a result were partially determined by the mercurial habits of the dominant mode of telling time in that age: the sun. Daylight hours changed with the season, of course, and so “when you look at work arrangements for Europe in the middle ages, you find instructions for the length of work in the winter versus in the summer and the varying wages,” says Kevin Birth, an anthropology professor at Queens College in New York and author of Time Blind. It wasn’t until 18th-century Europe, when workers tended to travel more and so experienced different day lengths as they crossed latitudes, that wages began to be paid by the hour, Birth adds.
The industrial revolution in the 19th century brought the technology for far more precise clocks and, with them, heightened attention to the hourly work schedule. Employers started to carefully track attendance, with timestamps marking employees’ arrival and departure each day. Meanwhile, companies also began to adopt the principles of scientific management (often called “Taylorism” after its founder, Frederick Winslow Taylor) to increase production. In this system, workers’ individual movements were carefully monitored, timed with a stopwatch, and then streamlined into a precise, machine-like production line. “There were people looking over workers’ shoulders and measuring how long each element of the task [took],” says Levine. Each movement was then modified to maximize efficiency.
Industrialization also led to standardized time zones; once humans developed the means to travel quickly across large areas, switching between what were then fairly arbitrary local times became confusing and dangerous. In the 19th century, local times varied considerably, and trains would travel forwards and backwards in time even when going to and from stations just a few miles apart. In 1875, American railways ran on 75 different local times—Chicago alone had three—according to an Atlantic article on Vanessa Ogle’s book The Global Transformation of Time. The confusion caused by such inconsistency inevitably led to accidents, explains Levine, and so in 1883, US railroads established the four time zones used today; these were made federal law in 1918. (British railways had already introduced standardized time by 1840.)
For much of the 19th century, each neighborhood in Britain legally ran according to local solar time, even though clocks were set to Greenwich Mean Time. This created great confusion for pubs in 1872, when liquor laws imposed strict opening and closing hours on watering holes. Police could legally fine publicans for selling alcohol past the set hour according to local solar time—even though no one but the law relied on such time. In 1880, Greenwich Mean Time finally became the legal standard: “I guess because the Members of Parliament drank a lot they decided to make Greenwich Time the official time for all of England,” says Birth.
A similarly concerted push for temporal structure played out worldwide when diplomats at the 1884 International Meridian Conference in Washington, DC, decreed there should be an international system of time zones based on Greenwich Mean Time. It took decades to override local systems of telling time—Russia used the Julian calendar and was 13 days behind Western Europe until 1918, notes The Atlantic—and many countries protested this western imposition. Eventually, nearly every country in the world changed its time system to better coordinate and trade with the major powers.
Contemporary units of time are just as arbitrary as time zones. The notion of 24 hours in a day comes from Egyptian timekeeping, the 60-minute hour is Babylonian, and dividing the second into decimal fractions dates to the French Revolution, says Birth. “Any time you look at a clock, it’s a weird hodgepodge of different cultures over time,” he adds.
Time is so strictly and uniformly enforced in contemporary culture that we tend to forget it’s a human invention. “We treat every year as though it’s the same length when there are leap years that are a day longer. We treat every minute as though it’s the same length when there are some minutes that are a second longer,” says Birth. Similarly, we create “little fictions” to pretend that time is a useful indicator of our habits and experiences. “Workers are paid a salary for five days a week with the fiction that [the] week is eight hours a day. How often do they really work eight hours a day?” he asks.
For those whose work plays out on the screen, it likely makes sense to measure time in flicks. Though it sounds arbitrary now, invented measures of time have a tendency to solidify to the point of objectivity. We may scoff at Facebook’s invention today, but in years to come video producers could well plan to spend their time in flicks, rather than minutes. We already track our lives in hours wasted and efficient days; we obediently follow the arbitrary man-made construct of time.