Video buffering is still the scourge of streaming, but broadcast TV may hold the solution

Akamai’s Broadcast Operations Center in Cambridge, Massachusetts.
Image: Akamai/John Agger

As you read this, there are at least a dozen engineers huddled in a facility in Cambridge, Massachusetts, monitoring live streaming videos from around the world. The control room is ablaze with screens. Some have six different video feeds of the same program, others display virtual maps showing traffic spikes across the globe, while the rest detail what’s happening with the technology behind the streams. They’re searching for regional internet outages, traffic overloads, and anything that breaks video feeds.

The crew watches it all, 24 hours a day, seven days a week. They’re the ultimate binge-watchers.

The war room—run by Akamai, a network for serving up online videos—is a new concept in the streaming space, where engineers normally monitor data logs rather than actual video. It mimics those used in broadcast TV and was built to prevent all-too-common technical problems from interrupting live streams.

Twenty-eight percent of all TV viewing is now done through digital streaming, according to a 2015 GfK MRI study. And that share is expected to increase. But for all the promise of streaming video, it’s still bogged down by pesky, persistent technical problems that infuriate viewers.

Remember Taylor Swift’s opening performance at this year’s Grammys? Well, some fans don’t. The live stream that viewers paid to watch on CBS All Access, which costs $5.99 a month, stalled when it couldn’t authenticate viewers’ locations.

Some of the pop star’s fans also missed out on the premiere of her world tour in December, when the video crashed, lagged, or wouldn’t appear on Apple Music. ABC similarly sparked a Twitter storm when it fumbled its live stream of a US Democratic presidential debate. And Tidal’s Yeezy Season 3 feed was an utter failure.

Buffering blunders are extremely common, especially during high-stakes live events like those. A recent US survey by IneoQuest Technologies found that one-third of people encounter buffering—when a program pauses to download data—once in every three videos they watch, and one-quarter experience buffering once in every five videos.

It’s maddening for viewers.

“As soon as you hit a speed bump technologically or digitally, you go elsewhere,” said Disney CEO Bob Iger, at a recent Deutsche Bank conference. “You just don’t want to tolerate it.”

That’s a troubling reality for companies like Disney, CBS, Twitter, and Yahoo that are betting their futures on online video distribution.

So why is buffering still a burden? After all, streaming has been around for at least a decade. And Netflix seems to have conquered this scourge a long time ago.

Well, Netflix is unique, but we’ll get to that.

Streaming is incredibly complicated

Issues like buffering persist online while they’re virtually non-existent in TV because the two media deliver content in completely different ways.

Streaming video is not like TV. The same video doesn’t fly through the same set of tubes to every viewer at once. Everyone who watches streaming video has a single stream that’s uniquely dedicated to them and travels through one of dozens of networks around the world. Those variables cause all kinds of capacity, buffering, and scalability challenges, says Kurt Michel, senior marketing director at IneoQuest, which monitors video quality.

It’s also more complicated than sending, say, a large file through the internet, because streaming video is delivered through small packets of content that are each a few seconds long and need to be delivered consistently to avoid buffering or poor video quality. This is a truncated view of how streaming video works, but it’s a start.
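
To make that concrete, here is a rough, hypothetical sketch (not any player’s actual code) of why segmented delivery stalls: the player keeps a small buffer of downloaded chunks, and if the network falls behind playback, the buffer runs dry and the video pauses.

    import random

    SEGMENT_SECONDS = 4      # video arrives in short chunks, each a few seconds long
    buffer_seconds = 0.0     # video downloaded but not yet played
    download_progress = 0.0  # partial progress toward the next chunk

    for second in range(60):                           # simulate one minute of viewing
        download_progress += random.uniform(0.5, 1.5)  # network speed varies second to second
        if download_progress >= SEGMENT_SECONDS:       # a whole chunk finished downloading
            download_progress -= SEGMENT_SECONDS
            buffer_seconds += SEGMENT_SECONDS
        if buffer_seconds >= 1.0:
            buffer_seconds -= 1.0                      # play one second of video
        else:
            print(f"t={second}s: buffering, waiting on the next chunk")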

What makes buffering worse is that content providers like ABC and FX don’t always know where things went wrong when there’s a problem. They may have to call 10 different vendors to figure it out.

Unlike with TV, where cable providers own the end-to-end network used to deliver content, there are a lot of players involved in streaming. Oftentimes, someone like Comcast or PBS will upload a video to a third-party cloud service, like Amazon Web Services. Then, an encoder like Envivio will compress that video into the six or more versions needed to stream across devices like mobile phones, tablets, desktop computers, and HD TVs. Once those are uploaded to the cloud server, a network like Akamai will deliver them to the viewer, who streams them through an internet connection that comes from Verizon, Time Warner Cable, or another service provider. The player you use to watch the video on an app or webpage may also come from a third party, as might the cloud-based system used to add metadata, titles, captions, and other details to it behind the scenes.
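
Laid out as data, those hand-offs look something like the simplified sketch below (the companies named are just the examples from this story, and the structure is illustrative rather than any provider’s actual workflow):

    # Each stage is typically owned by a different company, which is why tracing
    # a broken stream can mean calling several vendors.
    STREAMING_PIPELINE = [
        ("upload",    "content owner pushes the master file to a cloud service such as Amazon Web Services"),
        ("encode",    "an encoder like Envivio compresses it into six or more versions for phones, tablets, desktops, and HD TVs"),
        ("deliver",   "a network like Akamai caches those versions and serves them to viewers"),
        ("last mile", "the viewer's ISP, such as Verizon or Time Warner Cable, carries the stream into the home"),
        ("play",      "a third-party player in the app or webpage decodes and displays it"),
    ]

    def first_failing_stage(status_by_stage):
        """Return the first stage reporting trouble, if any (hypothetical helper)."""
        for stage, _ in STREAMING_PIPELINE:
            if status_by_stage.get(stage) == "error":
                return stage
        return None

    print(first_failing_stage({"upload": "ok", "encode": "ok", "deliver": "error"}))  # -> deliver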

“It’s incredibly complicated,” said Michel. “With greater complexity you have a greater risk of failure. … To some extent, the streaming industry is built on hoping for the best.”

Netflix is one exception. (Told you we’d get here.) The company built most of the pieces it uses to deliver video from the ground up. It doesn’t own the internet, but Netflix can assume that if nothing is wrong on its end, the problem lies with the internet-service provider (ISP). And it makes that known to customers through its ISP Speed Index, which ranks Netflix performance on providers around the world based on the bandwidth and average viewing experience.

Netflix also doesn’t have any of the bandwidth-intensive challenges that come with large-scale, live programming like the Super Bowl or Oscars, which draw millions of eyeballs at once, or live, linear programming that’s on all day. It strictly streams videos on demand (for now).

But technical issues still keep the company’s leadership up at night, as the platform pushes toward 100 million subscribers.

“The only inhibitor in our growth is how great is our service,” said chief executive Reed Hastings on Monday (April 18) following the release of the company’s first-quarter earnings. “Could we make it, so there’s never buffering? So it always starts up instantly? … If we can do all that, we’ll continue to grow.”

The streaming-video industry has spent the last decade just getting to the point where the technology works reasonably well, most of the time. It’s only now starting to match the level of quality and reliability of TV, with the aim of surpassing it.

Streaming’s meteoric rise over the last few years was staked on convenience. But, to keep growing, the streaming experience has to become better than TV.

“In the early days, 10 seconds of black screen in streaming wasn’t unheard of, but in broadcast that’s catastrophic,” said Mike Green, vice president of marketing for media at Brightcove, an online video platform. “And that’s what’s now expected in streaming.”

Taking cues from broadcast

Content providers can’t all build their own end-to-end streaming solutions like Netflix. It’s expensive and beyond the scope of their businesses. (Streaming is Netflix’s business.)

But there are other ways to streamline this process, the most promising of which are borrowed from the broadcast world.

They involve giving content owners and streaming platforms a clearer picture of what’s happening at each stage of the streaming pipeline. Most content owners have separate dashboards with data from each vendor. But being able to see every third-party tool’s performance in one place cuts down on the finger-pointing behind the scenes and helps solve connectivity problems faster.

That’s the mission of Akamai’s broadcast operations control center, which was introduced Monday at the National Association of Broadcasters trade show in Las Vegas.

It aims to cut response times down from hours and days to seconds and minutes by monitoring as many pieces of the pipeline as possible. “In broadcast, the contingency plans and operational rigor are such that people are able to respond to things in seconds,” said Matt Azzarto, head of media operations at Akamai and a veteran broadcast engineer. That has not been the case with streaming, he said, where services “are at the mercy of visibility.”

Better visibility, Azzarto said, also helps improve disaster-recovery and failover scenarios in streaming. (A failover is when a server or network crashes, and the data is rerouted to another server or network on standby.) It makes sure problems that inevitably arise never interrupt viewers.
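
As a minimal sketch of the failover idea, assuming one primary and one standby origin server (the URLs and health check below are hypothetical), the logic amounts to: check the primary, and switch to the standby the moment it stops answering.

    import urllib.request

    PRIMARY = "https://primary-origin.example.com/live/stream.m3u8"
    STANDBY = "https://standby-origin.example.com/live/stream.m3u8"

    def healthy(url: str) -> bool:
        """Return True if the origin answers quickly with a success status."""
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return resp.status == 200
        except OSError:
            return False

    def pick_origin() -> str:
        # Route viewers to whichever origin is currently up.
        return PRIMARY if healthy(PRIMARY) else STANDBY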

Taking another cue from the broadcast world, companies are also experimenting with single streams during live events, which all users can join, instead of sending thousands of individual streams.

Others are trimming the number of external services they work with, or leaning on online-video platforms like Brightcove and Piksel to manage all the moving parts for them.

And there are other technical ways to combat connectivity problems, like using “adaptive bitrate” technology, which is pretty standard in streaming today. If you watch a YouTube video on a phone, you might notice it blur for a few seconds when you step outside or into another room. That’s a sign that the video is being swapped out to adjust for higher or lower internet bandwidths. Content-delivery networks also re-route traffic locally to keep the core network at AT&T or Time Warner Cable from choking up.
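
The logic behind adaptive bitrate is simple enough to sketch. In rough terms (the renditions and thresholds below are made up, not any player’s real settings), the player measures how fast recent chunks downloaded and picks the best quality it can sustain:

    RENDITIONS = [          # (bitrate in kilobits per second, label), highest first
        (4500, "1080p"),
        (2500, "720p"),
        (1200, "480p"),
        (600,  "360p"),
    ]

    def pick_rendition(measured_kbps: float, safety_margin: float = 0.8) -> str:
        """Choose the best rendition the measured bandwidth can comfortably support."""
        usable = measured_kbps * safety_margin
        for bitrate, label in RENDITIONS:
            if bitrate <= usable:
                return label
        return RENDITIONS[-1][1]      # fall back to the lowest quality

    print(pick_rendition(3500))   # "720p": the player steps down when bandwidth drops
    print(pick_rendition(9000))   # "1080p": and back up when it recovers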

Beyond connectivity issues, there are other challenges with streaming video, like integrating advertising technology. Ever stream a TV show and have the player freeze when it gets to the commercial break? Then, when you reload, the show starts over from the very beginning? If not, you’re lucky. It’s slam-your-device-on-the-table frustrating. It happens because most online video platforms use technology that places an ad player on top of a content player.

Think about all the challenges we just outlined. Now double them. There’s a greater risk of buffering—and aggravating viewers—because the technology relies on those content and ad players talking to each other. “That was an approach that worked in a desktop environment but works less and less with the proliferation of mobile devices,” says Jeremy Helfand, vice president of Adobe Primetime, a marketing service with an alternative method of delivering ads online.

That’s partly why many content providers run pre-roll, or ads that air before the video starts. It cuts down on interruptions mid-stream and the odds that something might go wrong.

Companies like Adobe also get around this by inserting the ads into the actual video. It snips the content, inserts the ad, which might vary by platform or user depending on how personalized it is, and then stitches it all back together. It did this for NBC Sports during the last two Winter Olympics.
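
In rough terms, that stitching approach looks like the sketch below, which is illustrative rather than Adobe’s actual implementation (the chunk names and break point are hypothetical): the ad’s chunks are spliced into the program’s chunks before the stream ever reaches the player.

    content_chunks = ["show_001.ts", "show_002.ts", "show_003.ts", "show_004.ts"]
    ad_chunks      = ["ad_spot_A_001.ts", "ad_spot_A_002.ts"]   # may differ per viewer

    def stitch(content, ads, break_after):
        """Insert ad chunks into the content timeline at a chosen break point."""
        return content[:break_after] + ads + content[break_after:]

    playlist = stitch(content_chunks, ad_chunks, break_after=2)
    # The player sees one continuous list of chunks, so there is no hand-off
    # between an ad player and a content player to go wrong.
    print(playlist)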

There’s no doubt that TV is barreling toward a future in which streaming is king. You can watch everything from new episodes of Game of Thrones to reruns of Breaking Bad to live Thursday night NFL games online nowadays. But streaming still has to grapple with its unwieldy, underlying technology before it takes the crown. Learning from traditional TV, and other technical fixes, could hold the key.