Everything Google announced at its 2019 I/O developers conference

Google I/O 2019 kicks off
Image: REUTERS/Paresh Dave

Google has a panopticon view into your life. It’s hoping its products will be so “helpful” you won’t mind.

Google opened its Google I/O developer conference in Mountain View today (May 7) by releasing new services and features that introduce artificial intelligence into our everyday lives. Google’s pitch was that its new products would be irresistibly useful by “giving you the tools to increase your knowledge, success, health, and happiness,” said Google CEO Sundar Pichai in his keynote. “We feel so privileged to be developing products for billions of users, and with that comes a deep responsibility for creating products that improve people’s lives… and society as a whole.”

Barely a minute passed before Pichai mentioned security and privacy, a theme that cropped up over and over as Google reassured users it had their best interests at heart. As its products become unavoidable in our lives at work and at home, Google gains unprecedented insight into our personal lives. That might give it a leg up as it races against Apple and Amazon to become the operating system for all our personal needs.

The demonstrations elicited plenty of impressed sounds from the crowd. Want to book a rental car? Google Assistant can find your emailed flight reservation, then pick out the time, pickup location, and car model you’ll likely want on the rental website. Just tap a button to confirm. Sending texts or emails to mom, dad, or “my” doctor? A feature called Personal References means Google figures out exactly who you mean among your friends, family, and acquaintances to make the connection (or even hold conversations on your behalf).

Taken together, today’s announcements point to a future where we’ll never need to leave Google behind, and we may not mind.

AI was everywhere

Google opened the day by handing the DJ booth over to its AI, which took turns mixing tunes with a human (one machine-learning expert in the audience told me it was “not bad”). It quickly became clear that the day would focus on ways Google wants to integrate its technology into the real world.

Duplex for the web: During last year’s conference, developers demoed Duplex—an AI system that understands natural language and holds conversations like a human—making a dinner reservation at a real restaurant. This year, the virtual assistant took a verbal command from a would-be user to book a rental car. To do so, it fetched information about an upcoming flight from email, filled in an Enterprise car-rental form on the website with personal information, dates, and preferred model, and let the user click to finalize it. Google claims this wasn’t a one-off example built for Enterprise: Duplex can adapt to work with websites across the internet (Google is preparing to roll this out broadly late this year).

Google’s amazing shrinking Assistant: Google has been obsessed with reducing the size of its algorithms to fit on a smartphone. Until recently, Google’s Assistant worked by streaming voice data to the cloud, processing it there, and sending the result back to the phone. That causes delays. To let the digital assistant work natively on the phone, Pichai said Google reduced its 500GB of language-processing models to just half a gigabyte. That allows every phone to carry the AI-powered Assistant on the device, so interactions feel fast and seamless. They’re so fast, in fact, that voice may now be a credible replacement for the touch screen. A live demo on stage let the user zip between apps—maps, messaging, photos, and more—without ever touching the screen, a feat not possible before now (and maybe not even now, since Pichai said more was coming “later this year”).
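One standard ingredient in this kind of model shrinking is quantization: storing weights as low-precision integers instead of 32-bit floats. Google hasn’t detailed its exact pipeline, so the sketch below is only a toy illustration of the principle (the 1,000x reduction claimed on stage would require pruning and other techniques on top); the array names and sizes are invented for the example.

```python
import numpy as np

# Toy sketch: quantize float32 "weights" to int8, a 4x storage saving.
rng = np.random.default_rng(0)
weights = rng.standard_normal(1000).astype(np.float32)

# Map the weight range onto int8 [-127, 127] with a single scale factor.
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)

# Reconstructing the weights introduces at most scale/2 of rounding error.
dequantized = quantized.astype(np.float32) * scale
size_ratio = weights.nbytes / quantized.nbytes  # 4.0
max_error = np.abs(weights - dequantized).max()
print(size_ratio, max_error)
```

The trade-off is precision for size: each weight is recoverable only to within half a quantization step, which is why shrunken models typically need careful tuning to keep accuracy.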

Less bias please, Google: AI is making more decisions for us, and our inability to comprehend the decision-making process is risky. Machine-learning models train on datasets that can embed biases, and our understanding of how those biases emerge is still murky. Google’s new methodology surfaces such biases by highlighting which features the algorithm prioritized to draw its conclusions. If a model was trained to identify doctors from digital images, for example, this approach can reveal that the algorithm (incorrectly) treated “male” as a predictive variable, allowing designers to correct it.

Google’s new approach to detecting bias in AI

Accessibility and social impact: Google is testing algorithms to diagnose early-stage lung cancer and to predict floods in India. The Live Transcribe service lets people with hearing loss see spoken words transcribed accurately in real time, and Live Relay lets them reply in text to spoken conversation. Google is also building personalized speech-recognition models that can instantly transcribe speech from people with speech impairments. Finally, Live Caption transcribes videos in real time, whether it’s a YouTube clip or a home video.

All about Home

Google announced the Nest Hub Max ($229) to be released this summer (the standard Nest Hub is $129). The new Hub comes with a camera (and thumping bass). It’s designed with features as convenient as they are creepy: The wide-angle camera will follow you during a video conference if you move across the room. You can use a hand gesture to turn off the music rather than a voice command.

Google never wants you to leave search

A search for news on Google will soon bring up “knowledge panels.” The search giant is pulling together a much richer set of media sources from the web so you never leave the search results page. Google will index podcasts by crawling their content (not just titles) and offering them in search results; users can play a podcast that comes up or save it for later, right from the results page. Users can also view 3D models, created by companies, that appear in search results by clicking on a 3D icon.

Organizing the physical world

Google is turning its prowess at indexing the web toward cataloging the physical world. Google Lens can now use the Android camera and Assistant to apply a rich digital layer to the world. Google product director Aparna Chennapragada pointed a Pixel phone at paper menus to show food reviews and photos, and at a recipe in Bon Appétit Magazine to show an overlaid video of a chef preparing the meal (Google is working with other commercial partners to bring this to more interactions). “With Lens, we’re indexing the physical world for billions of people and places much like search indexes the billions of places on the web,” she said. Chennapragada also showed off how shoppers might match virtual New Balance sneakers with an outfit laid out on a bed, or transport a virtual great white shark anywhere.

Aparna Chennapragada demonstrates Google Lens.
Image: Michael Coren

No more “Hey Google”

It was always a bit of a hassle to utter “Hey Google” before issuing a command. So Google dropped it. For alarms, you can now simply say “stop” to shut one off on any device. To text, you can just issue commands, and Assistant can differentiate between the message content and the command (its brief performance on stage made it hard to really evaluate). As the voice-assistant wars heat up, the ability to seamlessly weave these digital helpers into users’ normal speech patterns might be the deciding factor in who wins.

Android Q and Pixel hardware

Android got a raft of updates. The mobile operating system is ready for foldable phones, even if foldable phones aren’t ready for it (Samsung’s model has been delayed, although Google’s Stephanie Cuthbertson says models are coming later this year). Dark Mode will launch with Android Q (developers, rejoice), sparing users’ eyes the strain of staring at high-contrast white backgrounds. Lots of machine-learning goodies are being crammed into the devices’ native apps, such as Smart Reply (which suggests replies and anticipates your next action, such as opening maps).

Its “Digital Wellbeing” initiative re-emerged this year with more parental controls, Wind Down timers for app usage, and a Focus Mode to pause distractions. Google will soon let users know when it’s time to put the phone down, or let parents place limits on children’s screen time. A beta version is available now.

Google also made sure it was zigging where Apple was zagging. Every comparison of its Pixel camera was placed against a generic “Phone X,” a thinly veiled swipe at the $1,000 iPhone X.

Google announced it would pack premium features, cameras, and services into its starter smartphones. The new Pixel 3a and 3a XL start at $399 and include a 30-hour battery charge, slick camera features such as Night Sight, and, yes, a 3.5mm headphone jack. You’ll be giving up wireless charging and a waterproof casing at that price.