During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.

Soon, Google says, users will be able to see how busy places are directly in Google Maps without having to search for a specific business, an expansion of the existing busyness metrics. The company also said it’s adding COVID-19 safety information to business profiles across Search and Maps, revealing whether they’re taking safety precautions such as temperature checks.

An algorithmic improvement to Did You Mean, Google’s spell-checking feature for Search, will enable more accurate and precise spelling suggestions. Google says the new model contains 680 million parameters and runs in less than three milliseconds.

Beyond this, Google says it can now index individual passages from webpages as opposed to whole pages. When this rolls out fully, it’ll improve roughly 7% of search queries across all languages, the company claims. A complementary AI component will also help Search to capture the nuances of what webpages are about, leading to a wider range of results for particular search queries.
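Google hasn’t shared implementation details, but the idea behind passage indexing can be sketched roughly: split each page into overlapping passages and score those against the query, rather than scoring only the page as a whole. The window and stride values, the embed() callback, and the cosine-similarity scoring below are illustrative assumptions, not Google’s method.

```python
# Rough sketch of passage-level scoring (illustrative only, not Google's system).
# `embed` is a hypothetical callback that maps text to a fixed-size vector.
from typing import Callable, List, Tuple

import numpy as np


def split_into_passages(page_text: str, window: int = 100, stride: int = 50) -> List[str]:
    """Break a page into overlapping word windows ("passages")."""
    words = page_text.split()
    starts = range(0, max(len(words) - window, 0) + 1, stride)
    return [" ".join(words[i:i + window]) for i in starts]


def best_passage(query: str, page_text: str,
                 embed: Callable[[str], np.ndarray]) -> Tuple[str, float]:
    """Score every passage against the query and return the best match."""
    q = embed(query)
    best, best_score = "", -1.0
    for passage in split_into_passages(page_text):
        v = embed(passage)
        score = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9))
        if score > best_score:
            best, best_score = passage, score
    return best, best_score
```

Scoring passages this way lets a page rank for a query even when only one buried paragraph answers it, which is the behavior Google describes.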

Google is also bringing Data Commons to web search results. The open knowledge repository combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) by mapping them to common entities. In the near future, users will be able to search for topics like “employment in Chicago” on Search and see the relevant data in context.

On the e-commerce and shopping front, Google says it has built cloud streaming technology that lets users see products in augmented reality (AR). With a car, for example, shoppers will be able to view it to scale in their driveway and zoom in on details like the steering wheel, right from their phones. Separately, Google Lens and Google Images will let shoppers discover similar products by tapping on elements like knits, ruffle sleeves, and more.

Above: Augmented reality previews in Google Search. (Image credit: Google)

In another addition to Search, Google says it will deploy a feature that automatically highlights key moments in videos, such as the point where a clip compares different products or shows the steps in a recipe. And Live View in Maps, a feature that taps AR to provide turn-by-turn walking directions, will let users quickly see information about restaurants, including how busy they tend to get and their star ratings.


Lastly, Google says it’ll let users search for songs by simply humming or whistling their melodies. This enhancement will draw on Google’s existing database of song IDs, which already delivers almost 100 million song search results every month.
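Conceptually, this kind of feature reduces the hummed audio to a melodic contour and matches it against contours of known recordings. The toy sketch below illustrates that idea only; the fixed resampling length, the zero-centering used to ignore the singer’s key, and the Euclidean distance are all simplifying assumptions, not Google’s model.

```python
# Toy sketch of melody matching by pitch contour (not Google's implementation).
# A hummed query and each catalog song are represented as sequences of pitch
# values; both are resampled and zero-centered so comparison is key-invariant.
import numpy as np


def normalize_contour(pitches, length: int = 64) -> np.ndarray:
    """Resample a pitch sequence to a fixed length and remove its mean."""
    pitches = np.asarray(pitches, dtype=float)
    resampled = np.interp(np.linspace(0, len(pitches) - 1, length),
                          np.arange(len(pitches)), pitches)
    return resampled - resampled.mean()  # ignore the absolute key


def match_song(query_pitches, catalog: dict) -> str:
    """Return the name of the catalog song whose contour is closest to the query."""
    q = normalize_contour(query_pitches)
    distances = {name: float(np.linalg.norm(q - normalize_contour(p)))
                 for name, p in catalog.items()}
    return min(distances, key=distances.get)
```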

“From new technologies to new opportunities, I’m really excited about the future of search and all of the ways that it can help us make sense of the world,” Prabhakar Raghavan, head of search at Google, said.

Last month, Google announced it will begin showing quick facts related to photos in Google Images, enabled by AI. Starting this week in the U.S. in English, users who search for images on mobile might see information from Google’s Knowledge Graph — Google’s database of billions of facts — including people, places, or things germane to specific pictures. The new feature, which appears on some photos within Google Images, is intended to provide context around both images and the webpages hosting them.

Google also recently revealed it’s using AI and machine learning techniques to more quickly detect breaking news around crises like natural disasters. In a related development, Google said it launched an update using language models to improve the matching between news stories and available fact checks.

Last year, Google similarly set out to solve query ambiguities with an AI technique called Bidirectional Encoder Representations from Transformers, or BERT for short. BERT, which emerged from the tech giant’s research on Transformers, forces models to consider the context of a word by looking at the words that come before and after it. According to Google, BERT helped Google Search better understand 10% of queries in the U.S. in English — particularly longer, more conversational searches where prepositions like “for” and “to” matter a lot to the meaning.


BERT is now used in every English-language search, Google says. Moreover, it’s deployed across languages including Spanish, Portuguese, Hindi, Arabic, and German.

For instance, Google’s previous search algorithm wouldn’t understand that “2019 brazil traveler to usa need a visa” is about a Brazilian traveling to the U.S. and not the other way around. With BERT, which realizes the importance of the word “to” in context, Google Search provides more relevant results for the query.
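To see what “looking at the words that come before and after” means in practice, the open-source bert-base-uncased checkpoint can be queried through the Hugging Face transformers library’s fill-mask pipeline. This is a demonstration with the public research model, not Google Search’s production system.

```python
# Demo of BERT's bidirectional context using the open-source bert-base-uncased
# checkpoint (requires: pip install transformers torch).
# Illustration only; Google Search's production models are not public.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The words on *both* sides of [MASK] steer the prediction: depositing a
# paycheck vs. sitting by the river pull toward different senses of the blank.
for prompt in [
    "He went to the [MASK] to deposit his paycheck.",
    "He sat on the [MASK] of the river and watched the water.",
]:
    top = fill_mask(prompt)[0]
    print(prompt, "->", top["token_str"], f"({top['score']:.2f})")
```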


