These MIT researchers want to translate Shakespeare into GIFs

Help your computer figure out what the heck is going on here.

Love them or hate them, GIFs rule the web—and a pair of graduate students at the MIT Media Lab want to turn them into a language. As a side project, Travis Rich and Kevin Hu started a site that uses human brainpower to quantify the emotional content of animated GIFs (like the two below). But their site, GIFGIF, is no joke.

“We were talking about GIFs one day,” Hu told Quartz, “and we realized that they’re becoming more and more serious of a medium. They’re more popular, they’re used for more things.” Buzzfeed, for example, recently used GIFs to explain what was going on in Ukraine—reaching an audience that otherwise might have ignored the news. “And we realized,” Hu said, “that we could quantify this usage.”

The site, where visitors pick which of two GIFs relates better to a particular emotion, is powered by another MIT Media Lab project’s platform. Place Pulse used the multiple-choice A/B voting system to assign emotions to pictures of different cities, allowing researchers to quantify, for example, how “sad” or “unsafe” people felt when looking at pictures of Rio de Janeiro.
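The article doesn't specify how GIFGIF or Place Pulse turn pairwise A/B votes into scores, but one common way to aggregate "which of these two relates better to emotion X?" judgments is an Elo-style rating update. The sketch below is purely illustrative—the GIF names, starting rating, and K-factor are assumptions, not details from the project:

```python
def expected(r_a, r_b):
    """Probability that A is picked over B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def record_vote(ratings, winner, loser, k=32.0):
    """Update ratings after a voter picks `winner` over `loser`."""
    ra, rb = ratings[winner], ratings[loser]
    gain = k * (1.0 - expected(ra, rb))
    ratings[winner] = ra + gain
    ratings[loser] = rb - gain

# Hypothetical: three GIFs competing on the emotion "happiness".
ratings = {"gif_a": 1000.0, "gif_b": 1000.0, "gif_c": 1000.0}
record_vote(ratings, "gif_a", "gif_b")  # voter: A looks happier than B
record_vote(ratings, "gif_a", "gif_c")
record_vote(ratings, "gif_b", "gif_c")
# gif_a now ranks highest for "happiness"
```

Run per emotion, a scheme like this yields a ranked, numeric "happiness" (or "sadness," or "unsafe") score for every image from nothing but simple two-way choices—the kind of translation from human judgment to machine-readable data the researchers describe.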

But Rich and Hu, who worked on separate teams but sat near each other (and the Place Pulse group) in the lab, decided to harness the system for their own purposes, to create a visual database of emotion. “It’s the same idea,” Rich said. “Taking something that’s very easy for humans to read—emotion—and translating it for computers.” While humans have no trouble deciphering what a GIF “means,” the same task is impossible for a computer.

Since launching on March 3, the site has drawn an average of 15,000 users a day who vote around 10 times per visit. “The average time is increasing already,” Hu said, “so we’re pretty optimistic for the future.” Their first goal is to build a text-to-GIF translator. “I want people to be able to put in a Shakespearian sonnet and get out a GIF set,” Hu said. But once they’ve gathered quantitative metrics for a large number of GIFs, they think the possibilities are nearly endless. “You could reverse-engineer it and use a GIF to find a movie that fits a certain mood,” Rich said.

The two are also interested in the sociological aspect of emotional GIF analysis. “We’re already seeing that votes vary across different cultures,” Rich said, “and looking at which GIFs are the most volatile—which ones have votes change the most based on country—could help us understand how emotions are interpreted across the world.”

GIFs that express happiness, he said, are almost universally agreed upon, but the emotion of “relief” showed much more variation. “So maybe,” Rich said, “situations where you’re expressing relief are the most likely to be misinterpreted by members of a different cultural group.” But even with cultural variation, GIFs are proving to be more universal than the written word: The researchers recently heard from an ESL teacher who’s using GIFGIF to teach students the words for different emotions.

Rich and Hu think that the most useful applications of their database will come from other researchers, so one of their first projects is to create an open API. They’re excited to see what others do with the data, once they’re able to plug it into their own projects.

Whether or not GIFGIF is able to survive and build a usable database, Rich says, GIF-speak isn’t going anywhere. “Like with any tech trend,” he said, “some people don’t get it. But that’s not going to be an issue in a decade—the people who don’t get it will retire. Whether or not Congress ever convenes to discuss whether or not the Constitution should be translated into GIFs, they’re still a big part of culture and going to remain that way for some time… We hope the tool we’re building will be useful in that future.”