Google to use MUM to improve visual search, offer broader result fields

Google’s latest research in artificial intelligence will now be visible in its search products, ushering in what the company believes will be “a new way to search and explore information in more natural and intuitive ways.” With the launch of Google’s Multitask Unified Model (MUM), users will begin to see updated methods of visual search and will also be able to understand the topics they search for more broadly. Senior Vice President Prabhakar Raghavan, in a blog post timed to Google’s annual Search On event, explained that with the new visual search feature users will be able to view “a picture of a shirt and ask Google to find you the same pattern, but on another piece of clothing, such as socks.”

The post explains that this can be helpful when a search is difficult to describe accurately in words alone. The visual search can also be used to point to a specific part whose name you do not know and get tutorials on how to use or repair it. Liz Reid of Google Search explained that the new artificial intelligence features accomplish three things.

“First, how do we understand the real question behind your query, and can we do that in new ways, not just with text but with voice and images.

“Second, how do we help you when you sometimes don’t know what the next question to ask should be.

“Third, how do we make it easy to explore information… The web is so big and amazing, but sometimes it can be a little overwhelming,” she said. The blog post states that it will also become easier to explore and understand new topics through “Things to know.”

For many topics, Google will use what it has learned to first show what it knows people are likely to look for. Google promises that soon “MUM will unlock deeper insights you may not have known to search for.” It will also begin showcasing new features that refine and broaden the scope of a search, and will offer a newly designed, browsable results page for easier inspiration. In videos, Google will begin to identify related topics, with “links to easily dig deeper and learn more.”

The post said: “Using MUM, we can even show related topics that are not explicitly mentioned in a video, based on our deep understanding of the information in that video.” When asked how Google will contextualize searches in different languages or from locations with different sensitivities, Reid told indianexpress.com: “MUM really understands concepts, so it can take your query and sort of map it, and then actually connect it to related information that may be expressed in different ways in another language. The fact that it is cross-trained across all the different languages also makes that easier.” Currently, MUM works across 75 different languages and can understand queries in all of them.


News Source: The Indian Express
