
Why TV captions are still so terebel

June 23, 2014

Imagine sitting down to watch an episode of the HBO hit series Game of Thrones—and hardly being able to understand anything. That’s the case for non-native English speakers and for any of the 36 million deaf or hard-of-hearing Americans. HBO doesn’t expect its viewers to know High Valyrian; that’s why it takes care to offer subtitles, so viewers understand exactly how Daenerys intends to free the slaves of Essos.

If only most online streaming companies took as much care in everyday captioning.

Machine transcription is responsible for much of today’s closed-captioning and subtitling of broadcast and online streaming video. It can’t register sarcasm, context, or word emphasis. It can’t capture the cacophonous sound of multiple voices speaking at once, essential for understanding the voice of an angry crowd of protestors or a cheering stadium. It just types what it registers. Imagine watching the classic baseball comedy Major League and only hearing the sound of one fan shouting from the stands. Or only catching every other line of lightning-fast dialogue when watching reruns of the now-classic sitcom 30 Rock.

As of April 30, streaming video companies are required to provide closed captioning on all programming. There’s no doubt that we’re in a better place than we were even five years ago, when streaming video companies weren’t required to closed-caption any of their content. But there is still a long way to go in improving the accuracy of subtitles. Netflix and Amazon Prime users have bemoaned the quality of the streaming companies’ closed captions, citing nonsense words, transcription errors, and endless “fails.” These companies blame the studios for not wanting to pay for accurate translations, but excuses aren’t flying with paying streaming video subscribers.

Marlee Matlin, the Oscar-winning actress and longtime advocate for better closed captions for the deaf and hard of hearing, recently mentioned in an interview that she knows she’s missing out on much of the action when she’s watching streaming video. “I rely on closed captioning to tell me the entire story,” she says. “I constantly spot mistakes in the closed captions. Words are missing or something just doesn’t make sense. My kids spot it too; they’re aware of sloppy captions and the pieces of information that I’m not being given.”

Context and Knowledge of Cultural Nuance Matters

Machines also fall short when it comes to translating one language into another. It isn’t sufficient to merely exchange words in one language for their equivalents in another. When it comes to translating emotional writing and an actor’s subtle delivery of a piece of dialogue, there’s no substitute for the human touch. Let’s take it a step further. Imagine being a Japanese viewer watching the 1995 film Trainspotting and having to rely on a word-for-word translation of heavy Scottish dialect and slang. To say that much would get lost in translation is an understatement.

A good translator will have lived in the countries where the respective languages are spoken, will be aware of cultural and linguistic nuances, will keep up to date through various media with current affairs and the introduction of new words and phrases, and, most importantly, will have an intuitive feel for the languages. A good translator understands how important these details are because she wants others to be as excited and horrified by everything unfolding on the screen as she is.

The heart of language and understanding

Humans can ensure quality and quantity when it comes to giving beloved films and TV shows the proper translations they deserve. Machines can’t be fans in the same way that humans can. They don’t go back and add more details just to enrich the experience or think carefully about whether an audience will understand why a certain word sounds silly in one language but is a deep and unforgivable insult in another.

With crowdsourced subtitling platforms—like Viki, a TV site powered by a community of fans who translate shows into multiple languages, or Amara, which lets anyone add closed captions to YouTube videos—there are no boundaries on the number of languages a show or film can be translated into, or on how quickly and accurately the content can be made available to viewers all over the world.

Finally, it’s impossible to overemphasize the role subtitling plays in connecting people who are living abroad or in exile.

The demand for better closed captioning is yielding positive results. Congress recently passed a closed captioning law requiring broadcasters to caption internet-distributed video files if the content was broadcast on TV with captions. Netflix is working to comply with the Americans with Disabilities Act to make sure that all its streamed content has subtitles. Amazon Prime is also putting effort behind making sure all of its Instant Video is closed-caption ready. YouTube has improved its closed-captioning and crowdsourcing capabilities. On the human translation front, companies such as Amara, a new transcribe-and-translate platform, are seeing the value of using people to provide better options than machines. Up in the air, there are changes on the horizon as well. Last week, Sen. Tom Harkin (D-Iowa) demanded that US airlines add closed captioning to in-flight movies for the benefit of hearing-impaired airline passengers. Closed captioning for all is a fantastic step in the right direction, but until there’s accurate closed captioning for all, we’re still fighting the good fight.

Follow Tammy on Twitter @TammyNam. We welcome your comments at ideas@qz.com.
