
The ‘See Food’ app from Silicon Valley really happened, and it was also a lie

John P. Johnson/HBO
In HBO’s Silicon Valley, Erlich (T.J. Miller) pitches Jian-Yang’s (Jimmy O. Yang) app as a “Shazam for food.”
By Sarah Kessler

Deputy Editor


“We knew Jian-Yang’s app was in the food space, but we assumed that it was camera-based,” says a nameless venture capitalist in a recent episode of HBO’s Silicon Valley. “Like you take a photo of food, the app returns nutritional information or recipes or how it was sourced.”

Jian-Yang, one of the programmers who live at the “incubator” in which the show is set, has just pitched an app called “Seafood” to VCs at the fictional firm Coleman Blair. His idea is to promote his grandmother’s octopus recipes (she gave him the family recipe, he says, “before she died in a horrible way”), but Seafood quickly gets twisted into the pitch Coleman Blair expected.

By the end of the meeting, Jian-Yang’s app is called “See Food,” it’s a “Shazam for food,” and Coleman Blair has invested, even though the technology doesn’t actually exist. (Shazam is an app that can identify songs from just short clips.)

The whole scene is ridiculous. It’s also not so different from a real app that launched in 2011.

Called “Meal Snap,” the app took users’ food photos, identified the foods within them, and returned their calorie contents in real time. It, too, worked like a “Shazam for food.”

Daily Burn, the company that created Meal Snap, ran a whole suite of fitness- and food-tracking apps at the time, but this magic food identification technology seemed special. “With a little more speed and accuracy, Meal Snap could join the pantheon of truly jaw-dropping apps,” CNET wrote at the time.

Like See Food, however, Meal Snap wasn’t entirely upfront about the actual state of its technology.

In the HBO-version of Silicon Valley, Jian-Yang and Erlich Bachman, who runs the incubator, eventually come up with an automated food tagging system. The problem is that it can only identify “hot dog” and “not hot dog,” and identifying more foods will require scraping endless images of food from the internet to use as training data for a computational model (a task Erlich attempts to trick a Stanford computer science class into completing for free).

Back in 2011, as Meal Snap launched, identifying food in real time using artificial intelligence was out of the question, even if you could recruit a swarm of smart millennials to help.

Meal Snap’s website referred to the technology as “magic.” So did Daily Burn CEO Andy Smith, in interviews. Though he acknowledged that the app’s magic involved humans, he never offered specifics about how it worked.

Here’s how it worked: After users took a photo, Daily Burn routed it to Mechanical Turk, an Amazon job market where companies can pay workers cents to do small tasks like identify photos. Mechanical Turk workers identified the photos as “apple” or “chicken” or “burger,” and Daily Burn matched their descriptions with a database of different foods’ calorie contents.
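The pipeline the article describes — photo in, human label from Mechanical Turk, calorie lookup out — can be sketched roughly as follows. This is an illustrative reconstruction, not Daily Burn’s actual code: the function names and the calorie table are hypothetical, and the Mechanical Turk round-trip is simulated rather than calling Amazon’s real API.

```python
# Hypothetical sketch of a Meal Snap–style human-in-the-loop pipeline.
# All names and numbers here are illustrative, not from Daily Burn.

# A toy calorie database mapping a food label to calories per serving.
CALORIE_DB = {
    "apple": 95,
    "chicken": 335,
    "burger": 354,
}

def label_photo_via_turk(photo: dict) -> str:
    """Return a human-provided label for a food photo.

    In production this step would post the photo as a Mechanical Turk
    task (a "HIT") and wait for a worker's answer. Here we simulate the
    worker's response, which in this sketch is stored on the photo dict.
    """
    return photo["worker_label"]

def estimate_calories(photo: dict):
    """Match the worker's label against the calorie database."""
    label = label_photo_via_turk(photo).strip().lower()
    # Returns None when the worker's label isn't in the database.
    return CALORIE_DB.get(label)

if __name__ == "__main__":
    print(estimate_calories({"worker_label": "Apple"}))
```

The key point the sketch makes concrete: the app’s “magic” step is just a blocking call out to a paid human labeler, which is why, as the article notes below, costs scaled linearly with every photo snapped.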

Daily Burn pivoted soon after (the Meal Snap app in particular wasn’t a great business, as Mechanical Turk fees piled up every time a user snapped a photo). It now streams online fitness classes, which it sells access to on a subscription basis.

“I don’t watch Silicon Valley,” Smith told me when I reached him today. “It’s too close to home. It stresses me out.”
