So far smart speakers and smart displays (such as the Amazon Echo and Google Home lines) have been aimed at consumers and homes.
Where’s the smart appliance for business?
As I wrote in this space previously, I believe in a coming convergence between smart speakers and business phone and conference systems. At the time, I believed that convergence would first show up in the form of a business- or enterprise-specific phone device based on the Alexa for Business platform.
Now it looks as if Google will beat Amazon to the office and conference room.
Google Assistant’s big day is Tuesday
In advance of a major set of product announcements scheduled for New York on Tuesday, Google this week started rolling out a redesigned Google Assistant.
The redesign makes Assistant interaction more visual. Home automation controls become richer, with elements such as sliders for lights and speaker volume. The interface is gaining buttons, big pictures and charts.
Swiping up on the Assistant screen gives you an “overview” of your day, using data from Calendar and elsewhere.
Google is also leapfrogging Alexa in ride hailing. You can just say, “Hey, Google, book a ride to the airport,” and Google will give you an instant comparison of wait times and prices across ride services, including Uber and Lyft.
Or you can specify the service in your command.
Either way, you still need to finalize the ride on your phone.
That’s all nice. But for retailers and content publishers, the most important change is that users will be able to make purchases by simply talking.
Users will also be able to upgrade and enhance existing products, such as apps or games, by requesting the changes by voice.
Google will also introduce Google Sign-In for Assistant. The feature has been tested by Starbucks. It lets you simply tell Assistant to “order a tall vanilla latte with extra whip,” and Assistant logs you in and uses your Starbucks card to place the order. It does everything for you except actually drink the coffee.
Google claims that the Sign-In feature doubled “login conversions” for Starbucks.
Another underreported bit of news: Google recently acquired Onward, an AI chatbot startup that builds sales and customer service tools for businesses.
One Onward tool is particularly interesting for Google Assistant integration. Called Agent Q, it uses AI to recommend products to consumers.
Onward technology could also enable Google Assistant integration with major sales and business tools such as Salesforce.
Customer-facing enterprises might want to explore how they can use Google Sign-In for Assistant and other coming announcements, tools and integrations to facilitate commerce and customer service on Assistant-supporting devices.
It would be a mistake to underestimate the coming reach and ubiquity of the Google Assistant.
Amazon has a strong lead in the smart speaker and smart display race, but Google Assistant-powered devices have been outselling Echos and other Alexa devices.
Google also has an advantage that Amazon lacks: it can offer Assistant on its own smartphones, on third-party Android smartphones, in the Chrome browser, and in its mobile apps for iOS and Android, such as the Google app.
A ‘real’ smart display and a ‘virtual’ one
Google will probably announce on Tuesday its own smart display product, called the Home Hub.
Google should also announce a Pixel Stand for the upcoming Pixel 3 and Pixel 3 XL smartphones.
Based on a variety of reports, it’s clear that Google is planning to enable Pixel phones to slip into a kind of “smart display” mode while resting in the dock, which will hold the phone in portrait mode.
Given the word “home” in the names of Google’s smart speaker (Google Home) and upcoming smart display (Google Home Hub), it’s clear that Google’s smart appliances are for consumers and the home.
But the Pixel phones and the Pixel Stand do not have the word “home” in their names. That’s because they’re not just for the home. A smart display in the form of a Pixel phone on a Pixel Stand will serve both home and office, and could thereby become a major inroad for Google Assistant in enterprises and businesses of all sizes.
The new smart display interface will constantly show contextual information such as the time, the weather, battery status, and other data.
Google recently released the third version of its smartwatch operating system, Wear OS, which comes with an improved Google Assistant. The biggest change: proactivity. The Wear OS Google Assistant can offer all kinds of contextual information (some of it based on personal data mined from Gmail).
This makes sense, because wristwatches gather rich contextual data, such as the user’s location and whether the user is walking or sitting.
I think this is a preview of what’s coming for the docked Android phone version of Google Assistant. Phones have even better contextual information than watches, because placing the phone in the dock says a lot about intention — namely that the user is not intending to leave and go somewhere else, but plans to stay in a single place and may want hands-free notifications and assistance.
The docked Google Assistant feature will emphasize preemption and agency: it will make suggestions before you ask, and it will do things for you, all while your Pixel phone is docked and locked.
(It will base these actions on personal data, but it won’t show any personal data on screen while locked unless you grant your specific phone, on your specific Pixel Stand, permission to show personal information while docked.)
Google’s wireless charging support project is code-named Dreamliner. It’s likely that Dreamliner can support third-party chargers and non-Google phones.
Crucially, the Android phones that support these “docked” Google Assistant features will know when the user is at work and can display a different set of information “cards” than when the user is at home.
Even better, unlike the Google Home Hub (which probably does not even have a camera), an Android phone has a front-facing camera and should be able to detect whether the user is sitting at the desk or not present.
Google has reportedly been working on a feature called Face Match for the Google Assistant, which sounds to me like something that could show personal information when the user is recognized, but conceal it when somebody else is present.
All this news and educated speculation adds up to the ability to use Android phones in general, and Google Pixel phones in particular, as smart, context-aware, personalized and secure smart displays in the office.
Finally — a smart display for work!