Google Search Gets AI Improvements for Better User Experience, Adds ‘Hum to Search’ Feature to Identify Songs

Google Search has added several new AI-based tools to enhance the user experience, the company announced during its Search On virtual event on October 15. The update brings a host of improvements that include a better understanding of misspelled queries, indexing of individual passages from webpages, division of broader searches into subtopics, and splitting of videos into segments. The search giant has also introduced a ‘hum to search’ feature that helps users identify a song stuck in their head by simply humming or singing its tune for a few seconds. Here’s a look at some of the most useful features introduced in Google Search.

Understanding misspellings

Google’s new AI-based improvements to Search aim to better organise the “world’s information and make it universally accessible and useful.” The company said in a blog post that one in 10 queries Google Search receives every day is misspelled. While Google has long suggested correct spellings via its “did you mean” feature, it has now introduced a new spelling algorithm that uses a deep neural net to decipher misspellings better.

For example, if you search for “does algae bloom produce foul order,” Google’s suggestion would now show “does algae bloom produce foul odour.” Google says the new algorithm helps it understand the context of misspelled words and surface the right results in under three milliseconds. The feature went live on October 15, as per Google.
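To give a rough sense of what spelling correction involves, here is a minimal, purely illustrative sketch that ranks candidate words by edit distance against a tiny hypothetical vocabulary. Google’s production system is a deep neural network that also takes query context into account, so this is not its method, only the classic baseline idea.

```python
# Toy, context-free spell corrector based on Levenshtein edit distance.
# Not Google's approach: its new algorithm is a deep neural net that also
# models query context. The vocabulary below is purely hypothetical.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance computed with a single rolling row."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, start=1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,         # delete ca
                dp[j - 1] + 1,     # insert cb
                prev + (ca != cb)  # substitute (free if characters match)
            )
    return dp[-1]

def suggest(word: str, vocabulary: list[str]) -> str:
    """Return the vocabulary word closest to the (possibly misspelled) input."""
    return min(vocabulary, key=lambda candidate: edit_distance(word, candidate))

print(suggest("exercize", ["exercise", "exorcise", "excise"]))  # -> "exercise"
```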

Indexing passages

Google Search will now be able to index specific passages from webpages rather than only ranking entire pages against a query. If a user searches “how can I determine if my house windows are UV glass,” Google will now surface the specific section of a webpage that addresses that exact query, rather than results populated by entire webpages about UV glass. The company said the feature will improve seven percent of search queries across all languages once it is rolled out globally.
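As a purely illustrative sketch of the idea of scoring passages instead of whole pages, the snippet below splits a hypothetical page into paragraphs and ranks each one by simple word overlap with the query. Google’s actual system relies on far more sophisticated neural ranking, so treat this only as an analogy; the page text and query are made up.

```python
import re

# Toy passage ranking: score each paragraph of a page against the query by
# word overlap, instead of scoring only the page as a whole.

def tokenize(text: str) -> set[str]:
    """Lower-case word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def rank_passages(page_text: str, query: str) -> list[tuple[float, str]]:
    """Return passages sorted by the fraction of query terms they contain."""
    query_terms = tokenize(query)
    scored = []
    for passage in page_text.split("\n\n"):
        overlap = len(query_terms & tokenize(passage)) / max(len(query_terms), 1)
        scored.append((overlap, passage.strip()))
    return sorted(scored, reverse=True)

page = (
    "Our shop sells many kinds of replacement windows.\n\n"
    "To check whether your house windows are UV glass, hold a UV lamp "
    "against the pane and look for a faint violet glow."
)

best_score, best_passage = rank_passages(page, "are my house windows UV glass")[0]
print(best_passage)  # prints the paragraph that directly addresses the query
```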

Dividing searches into subtopics

In another new feature, Google Search now throws up subtopics based on the user’s query. For instance, if someone searches “home exercise equipment,” Google will show subtopics such as budget equipment, premium picks, or small space ideas. The feature will start rolling out by the end of this year, the company said.

High quality COVID-19 information

Owing to the ongoing pandemic, Google Search has added a Live View feature that provides essential information about a business before you visit it in person. The update will show you how busy a business is right now to help you maintain social distancing easily. COVID-19 safety information will also be shown on Business Profiles across Google Search and Maps, letting you know if a business requires you to wear a mask or make reservations in advance. It will also show if the employees are taking extra safety precautions like regular temperature checks. Businesses can also choose to keep their online information up to date, including opening hours and store inventory.

Showing key moments in videos

With its new AI-driven tools, Google can now automatically identify key moments in videos and divide them with useful markers that help users skip to the parts they’re interested in. This comes in handy particularly when watching sports highlights or following a cooking recipe. The feature has already been in testing throughout this year, and Google expects that 10 percent of all searches on its platform will be able to use this new technology.

Advanced search to help quality journalism

Google has introduced Journalist Studio to help news professionals efficiently look through massive collections of documents, images, and audio recordings. It introduces a Pinpoint feature that will help reporters sift through “hundreds of thousands of documents by automatically identifying and organizing the most frequently mentioned people, organizations and locations.” To request access to Pinpoint, reporters can sign up starting this week.

Hum a tune to identify a song

In a fun new feature, Google Search now allows users to hum, sing, or whistle a tune to identify a song. The ‘hum to search’ feature works in a similar way to the Shazam app, which helps you identify songs playing around you.
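For a very rough sense of how a hummed query could be matched against known melodies, the sketch below reduces each tune to its up/down pitch contour (the classic Parsons code) and picks the closest match. This is not Google’s implementation; it only conveys the general idea of melody matching, and the song catalogue and pitch values are hypothetical.

```python
# Purely illustrative melody matching via pitch contour (Parsons code).
# Google's 'hum to search' uses its own machine-learning models; this toy
# sketch only demonstrates the general concept with made-up data.

def parsons_code(pitches: list[float]) -> str:
    """Reduce a sequence of pitches (e.g. in Hz) to Up/Down/Repeat steps."""
    code = []
    for prev, curr in zip(pitches, pitches[1:]):
        if curr > prev:
            code.append("U")
        elif curr < prev:
            code.append("D")
        else:
            code.append("R")
    return "".join(code)

def similarity(a: str, b: str) -> float:
    """Fraction of matching steps over the shorter contour (toy metric)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

# Hypothetical reference melodies and a slightly off-pitch hummed query.
catalogue = {
    "Song A": parsons_code([440, 494, 523, 494, 440, 440]),
    "Song B": parsons_code([330, 330, 392, 440, 392, 330]),
}
query = parsons_code([438, 495, 520, 490, 441, 439])

best = max(catalogue, key=lambda title: similarity(query, catalogue[title]))
print(best)  # expected to print "Song A"
```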

The new feature was introduced on October 15 and is available in the Google app on both Android and iOS. When using Google Assistant, simply ask “what’s the song” and then hum the tune. When using Google Search, you can tap the mic icon in the search bar and say “search a song” or “what’s this song.” Google will then throw up suggestions of songs that resemble your tune.
