
The big thing missing from Google's new line-up is a next-gen AI assistant

Google has big plans for Bard and generative AI, but what about a next-generation version of Assistant?
Published on May 21, 2023

Stock photo of the Google Bard website on a phone
Edgar Cervantes / Android Authority

The dust has settled on Google I/O 2023, with sweeping announcements ranging from physical products, such as the Pixel 7a and Pixel Fold, to AI updates like PaLM 2. However, looking back, a glaring absentee from the roll call was a vision tying these products together: a next-gen AI assistant to span Google’s mighty ecosystem.

Of course, with PaLM 2 now powering Google Bard and growing integrations with Search, Gmail, and more, it would be wrong to accuse Mountain View of neglecting advancements in AI. Far from it. Google continues to work at pace to close the gap on ChatGPT, and it clearly has the more established product ecosystem into which it could (and probably should) quickly integrate advanced AI features. However, it’s still unclear what Google’s vision is for AI regarding its physical product portfolio, if it even has one at all.

The Pixel Tablet would have been far more exciting if it brought Bard into our homes.

Search is still the big earner, of course, but chatbots like Bing Chat are already far more impressive than Google Assistant at answering the humdrum queries we often put to smart speakers. Integration into Google’s expansive Home ecosystem seems like an inevitable next step, one that would vastly improve the utility of smart speakers and displays. Yet there was no announcement, not even a forward-looking roadmap, to coincide with the new Pixel Tablet. Undoubtedly, the Tablet would have been a far more exciting prospect if it brought Bard or similar capabilities into the heart of our homes. Instead, we have an expensive, dockable, but otherwise generic Android tablet.

Microsoft Bing Chat and Google Assistant listening side by side
Rita El Khoury / Android Authority

Of course, Google is still ironing out Bard’s kinks, and a sweeping rollout across a range of tangential products would have required moving far more swiftly than Mountain View typically does. Bard’s waitlist only opened in March, after all, and development attention is focused on the impressive power of these huge online-only language models precisely because that’s where the most immediate use cases currently reside. However, that may have to change quite quickly, and Google should be forward-looking.

Although an individual query currently costs just fractions of a cent, scaling up to the equivalent of Google’s 8.5 billion daily searches is quite likely uneconomical; even at, say, half a cent per query, that works out to more than $40 million a day. While Google plans to integrate generative AI into Search, how this affects the profitability of the all-important ads business remains to be seen. This is where the significance of slimmed-down, on-device models has yet to be truly recognized.

The ballooning cost of AI search will make on-device capabilities increasingly important.

There’s a way to go before anything approaching the impressiveness of Bard or ChatGPT runs on your phone without an internet connection, but lower-accuracy models running directly on device are almost certainly an integral part of the AI future, from both a cost and a security perspective. We’ve already seen the possibilities when Qualcomm compressed Stable Diffusion to run on its Snapdragon 8 Gen 2 processor.

Google Pixel 7a in a white case, seen from the back
Robert Triggs / Android Authority

In that vein, Google already has its own custom silicon built specifically for on-device machine learning tasks, including advanced image processing, in its Tensor G2 processor. The chip features in all of its recent hardware launches, powering AI tools like Magic Eraser, and it’s clear that custom silicon with ML smarts will be a core part of future product launches too. So, again, it’s a rather glaring omission that Google has, at least publicly, no imminent plans to level up Assistant and leverage this investment to take broader generative AI to where it would be most helpful: in our pockets.

Google needs to bring AI to where it would be most helpful: in our pockets.

The Tensor G3 processor and Pixel 8 series are still to come this year and may have more in store for us regarding pocketable AI capabilities. New hardware often has to lead before software can follow, after all. But the fact that Google had nothing to say at Google I/O about how AI will influence its smart home, smartphone, and other product ecosystems suggests, to me at least, that we’ll be waiting at least another twelve months before the company attempts to push the envelope.

A year is an awfully long time in the fast-paced world of AI. Google was clearly caught out by the explosive arrival of ChatGPT. Let’s hope it’s not sleeping on the bigger AI picture as well.
