How Google Assistant, Home, and Lens will completely change the way you search

If Google I/O 2017 had a tagline, it might be something like “Rise of the Machine Learning.” The biggest thread throughout the keynote was an aggressive push to fully move from a mobile-first world to an AI-first world, changing the way we use Google and our devices to look at everything around us.

It used to be about algorithms, but now it’s about artificial intelligence. Google doesn’t simply want to be the tool we use in our browsers to find something, it wants to be within reach whenever we have a question, whether or not we’re even looking at a screen. We already see it with Google Assistant on Google Home, but now Google is starting to pull search out of our phones and apps in order to make it accessible everywhere.

But that also means changing our expectations. Today we still need to explicitly ask Google to search for something; the new AI push will use our cameras, calendars, and favorite apps to deliver information where and when we need it, even if we don't know precisely what we're looking for. In some instances, we might not even need to ask a question at all.

Seeing is believing

The only real new product Google unveiled at I/O was Google Lens. It’s not a separate app but rather an underlying platform that supercharges the way our phones integrate with their cameras and our photos. It’s kind of like Google Goggles meets Samsung’s Bixby, with a little augmented reality mixed in for good measure.

Google Lens knows what it’s looking at and can tell you all about it.

And if it works as well as Google says it will, it could dramatically change the way we use search. Instead of opening Google Translate and typing a bit of text into a search field, you’ll be able to point your phone’s camera at the text, and Translate will work its magic. It’s likely similar to the way the Google Translate app uses Word Lens to instantly decipher text—so this is less about introducing a new feature, and more about making something dead simple so we’ll regularly use it.

And eventually it may work for everything else we see, too. Snap a picture of a flower and Google Lens will tell you what species it is—and from there you can use Assistant to learn more about it. This distills all the steps (and time) of a laborious image search down to a single simple action. We’re still Googling, but we’re barely thinking about it.

Home is where the search is

The Google Home smart speaker is already a great way to limit our reliance on our phones without reducing our dependency on search (and Google in general). But now Home stands to become a true hub for our digital lives. The big new feature is hands-free calling—which would have been much splashier had Amazon not unveiled the same thing last week with the Echo Show. Fear not, though, because Google is also giving Home more responsibility for organizing our lives.

Now Google Home can tell you if there’s something you need to know before you ask.

A new feature called Proactive Assistant will rummage through your schedule and reminders to let you know if you’re forgetting something. So if there’s heavy traffic and you have to pick up your kid from soccer practice, Google Home will search through your calendar and traffic reports to let you know you should probably leave a little early.

It’s not too intrusive, either. The spinning circle of lights at the top of the device will alert you to the notifications. From there, you can say, “Hey Google, what’s up,” and it’ll tell you what it found. Down the line, I could see it extending to other Google Now-style services, like breaking news and sports updates, and maybe even third-party apps.

Granted, it could all get a little too overbearing if Google isn’t judicious about what Google Home delivers, so there needs to be a balance between what’s useful and what’s overkill. But there’s a real opportunity to add a new dimension to search in a way that anticipates our needs, and delivers timely and relevant content like a true assistant should.

Personal touch

The mobile-first world that Google is transitioning away from isn’t just driven by traditional search. It’s also driven by apps. Our phones are filled with dozens, nay hundreds of apps that we only need to open a few times a month. And Google is looking to Assistant as the primary way to access all of that information without actually needing to open or even download the apps.

With Transactions, you’ll be able to search, order, and pay for things through Google Assistant.

First, Google is bringing Google Home actions to phones. And with a screen, you’ll not only be able to talk to your apps through Assistant, but also interact with them in fresh ways. Tasks like ordering food and accessing recipes can be done swiftly through Google Assistant, and as with Google Lens, that dramatically cuts down on the time these tasks take.

Once you’ve told Assistant to access a specific app, Assistant will use that app’s database to retrieve whatever you ask for. And soon that will extend to transactions as well, letting us buy products or order food just by asking. Developers can leverage Assistant to create an end-to-end ordering process that walks you through the whole transaction, from searching for something, to customizing your order, to checking out. And if it works as advertised, it will take a fraction of the time it takes now, without ever logging into a website or opening an app.

Google everywhere

Google’s new AI push isn’t about phones or apps or even Android. It’s about bringing Google everywhere it isn’t and using Assistant to fill in the gaps. It’s not a move away from mobile per se, but rather a way to make Google more versatile and expansive as we evolve away from traditional search.

Soon we won’t need a bar to search for something.

But it’s still powered by the two main things that propelled Google to such astronomical heights: speed and accuracy. The three-pronged attack of Google Lens, Home, and Assistant will simultaneously expand our use of search and cut down the time we need to spend with it by delivering prompt, targeted responses.

And pretty soon we won’t need to open Chrome or tap a bar to find out about something. Google will just be there whenever we need it.
