How Google's new technology will change your search experience

Google's Multitask Unified Model

Google's Multitask Unified Model can comprehend data in the form of text, images and videos simultaneously, and draw insights and connections between topics, concepts and ideas.

Photo credit: Shutterstock

Artificial Intelligence (AI) is changing the world in once-unimaginable ways, and Google is now deploying the technology’s full prowess to improve the internet search experience for users.

During the virtual Google Search On event held on September 29, the company unveiled a redesigned search results page that blends images, videos and text with suggested searches and a “things to know” feature.

Utilising a number of AI algorithms, Google has created a new way to understand different formats of information from around the world and package them for easy access when users search on a web browser.

The technology, dubbed Multitask Unified Model (MUM), was first announced at the Google I/O global event in May. It can comprehend data in the form of text, images and videos simultaneously, and draw insights and connections between topics, concepts and ideas.

Deeper insights

Google said MUM will unlock deeper insights when a user submits a search request on Google Search, with plans to integrate the technology into Google Lens visual search, adding a text box so users can refine visual searches with words.

Google Lens, an image recognition technology that lets you use your phone’s camera for real-time translation, identifying plants and animals, copying and pasting text from photos, finding items similar to what’s in the camera’s viewfinder and even getting help with maths problems, will be updated with MUM algorithms.

The company said desktop users of Google Lens in Chrome will also be able to search images with an experience similar to that of mobile and tablet users.

Elizabeth Reid, head of search experiences at Google, said the company was focused on improving how users access the information that guides them through unfamiliar tasks, refining its techniques as well as exploring new designs.

 “With this new improvement, we are transforming the Search page into an endless stream of visual ideas. When exploring a new topic, a new way to visually understand it is by watching videos. By applying MUM, we can identify what is referenced in the video, even when not explicitly mentioned by name,” she said.

But how will this work for users not privy to the new development?

Pandu Nayak, vice president of Search at Google, said that since MUM is an intelligent technology handling multiple data formats in the background, users should expect more choices among the results displayed on Google Search.

Similar patterns

“Say, for instance, I really like a pattern on a shirt but I feel the same would look much better on my socks. By posting the shirt, other similar shirts pop up, and I can add a text request for a pair of socks with similar patterns and colour.

“More information pops up informing me of local stores and large chains that stock such products,” he explained.

The “things to know” feature will list some of the most important tidbits about the subject users are searching for, in a format similar to the “people also ask” feature.

Matt Madrigal, vice president of merchant shopping at Google, said the new technology will improve the online shopping experience for millions of users worldwide, with filters enabling shoppers to find what they want faster.

 “To make it easy to shop at the moment of inspiration, exploration or when ready to make a purchase, we are making items you see online shoppable with Google Lens from your favourite chains and retailers. If you like what you see, you can use the in-stock filter to find a local store selling the item you want.”

Google’s Shopping Graph, a comprehensive, real-time dataset of products, inventory and merchants with over 24 billion listings, could prove critical in a period when the Covid-19 pandemic has accelerated the uptake of e-commerce services.

Address Maker feature

Through the open-source Plus Codes system, Google said, governments and organisations will be able to provide addresses to people and businesses via the Address Maker feature, which is already available in Kenya, The Gambia, India, South Africa and the United States.
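Plus Codes are generated by the publicly documented Open Location Code algorithm, which converts a latitude and longitude into a short base-20 string. As a rough illustration of how a standard 10-digit code is derived, here is a minimal Python sketch of the published algorithm (an illustrative reimplementation, not Google's own library, and omitting the spec's shortening and validation rules):

```python
# Illustrative sketch of Open Location Code ("Plus Codes") encoding.
ALPHABET = "23456789CFGHJMPQRVWX"  # base-20 digit set, chosen to avoid lookalike characters

def encode(lat: float, lng: float) -> str:
    """Return a standard 10-digit plus code for a latitude/longitude pair."""
    lat = max(-90.0, min(90.0, lat))     # clip latitude to the valid range
    lng = (lng + 180.0) % 360.0          # normalise longitude to [0, 360)
    # Work in integer units of 1/8000 of a degree (the 10-digit resolution)
    # to avoid floating-point drift in the digit extraction below.
    lat_units = min(round((lat + 90.0) * 8000), 180 * 8000 - 1)
    lng_units = round(lng * 8000)
    code, divisor = [], 160000           # 20 degrees expressed in 1/8000-degree units
    for _ in range(5):                   # five lat/lng digit pairs
        code.append(ALPHABET[lat_units // divisor])
        code.append(ALPHABET[lng_units // divisor])
        lat_units %= divisor
        lng_units %= divisor
        divisor //= 20                   # each pair refines the cell 20x in each direction
    code.insert(8, "+")                  # the spec places a separator after the 8th digit
    return "".join(code)

print(encode(37.4220, -122.0841))  # Googleplex -> 849VCWC8+R9
```

Each digit pair narrows the area twenty-fold in each direction, so a full 10-digit code pins a location to a cell of roughly 14 by 14 metres, which is what lets Address Maker assign usable addresses where street numbering does not exist.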

“And to make it easier to evaluate the credibility of information, we launched the About This Result feature, which helps you understand the source of information. We keep you safe and secure as you search. It is our responsibility to protect and respect your data,” said the company’s vice president of trust, Daniel Romain.

The company’s wildfire boundary map, which helps environmentalists understand the approximate size and location of a fire right from their device, will from this October also include emergency websites, phone numbers and evacuation information.

Make suggestions

Also coming in the next few weeks is an update to YouTube that will rely on MUM to understand what the video you are searching for is about and make suggestions.

The Environmental Insights Explorer (EIE) Tree Canopy tool, which uses aerial imagery and advanced AI capabilities to identify the places in a city at greatest risk of rapidly rising temperatures, is to be expanded to 100 cities during the first half of 2022.

This gives local governments insights about where to plant trees in order to increase shade and reduce heat.

 Google’s senior vice president for Search said the company’s mission remains to organise the world’s information and make it universally accessible and useful, enabling users to learn helpful new skills without the need for formal education.

 “Anyone can start a home business and instantly connect with billions of potential customers. Students, parents and entrepreneurs, all have the world’s insights at their fingertips. We make this available by helping you tap into the endless knowledge of creators, publishers and businesses from across the web and around the world.”