In May, at its Google I/O 2021 developer conference, Google introduced the Multitask Unified Model (MUM), a system trained across 75 languages simultaneously that can understand different forms of information, including text, images, and videos. Today, Google revealed that it is using MUM to identify variations in the names of COVID-19 vaccines across languages, which the company says has improved Google Search's ability to surface information about the COVID-19 vaccines for users around the world.
As Google notes, the COVID-19 vaccines released to date – including those from AstraZeneca, Moderna, and Pfizer – have different names depending on the country and region of origin. There are hundreds of COVID-19 vaccine names around the world, and not all of them have historically surfaced at the top of search results when users typed in phrases like “new virus vaccines,” “mrna vaccines,” and “AZD1222”.
MUM, which can transfer knowledge between languages and does not need to be explicitly taught how to perform certain tasks, has helped Google engineers identify more than 800 COVID-19 vaccine name variations in more than 50 languages, according to Pandu Nayak, vice president of Google Search. Given just a few examples of “official” vaccine names, MUM was able to find cross-lingual variations “in seconds,” compared with the weeks the task could take a human team.
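Google has not published the details of how MUM matches name variants, and its learned multilingual representations go far beyond surface string matching. Purely as an illustration of the simpler problem of grouping spelling variants of a vaccine name, here is a toy sketch using character-trigram Jaccard similarity; the candidate list and threshold are hypothetical:

```python
# Illustrative only: group spelling variants of vaccine names by
# character-trigram Jaccard similarity. This is NOT Google's method;
# MUM uses learned multilingual representations, not string matching.

def trigrams(name: str) -> set:
    """Return the set of character trigrams of a normalized name."""
    s = "".join(ch for ch in name.lower() if ch.isalnum())
    return {s[i:i + 3] for i in range(len(s) - 2)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the trigram sets of two names."""
    ta, tb = trigrams(a), trigrams(b)
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

# Hypothetical candidate names and query, for illustration only.
candidates = ["AZD1222", "AZD-1222", "mRNA-1273", "BNT162b2"]
query = "azd 1222"
matches = [c for c in candidates if similarity(query, c) > 0.6]
# matches == ["AZD1222", "AZD-1222"]
```

A string-level approach like this catches punctuation and casing variants but misses translations and transliterations, which is exactly the gap a multilingual model such as MUM is meant to close.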
“This first application of MUM has helped deliver important and timely information to users around the world,” Nayak said in a blog post translated from Japanese. “We look forward to making search more convenient through the use of MUM in the future. Early tests showed that MUM not only improves existing systems, but also helps develop new methods of information retrieval.”
Google previously applied AI to the problem of providing projections of COVID-19 cases, deaths, intensive care use, ventilator availability, and other metrics useful to policymakers and healthcare workers. In August 2020, in partnership with Harvard, the company released models that forecast COVID-19-related developments over the next 14 days for U.S. counties and states.
MUM has potential beyond identifying vaccine names, especially in situations where it can draw on context from images and dialogue. For example, given a photo of hiking boots and the question “Can I use these to hike Mount Fuji?”, MUM can understand the content of the image and the intent behind the query, advising the user that the hiking boots would be appropriate and pointing to relevant content, such as a blog post about hiking Mount Fuji.
MUM can also understand questions like “I want to hike Mount Fuji next fall, what should I do to prepare?” Because of its multimodal capabilities, MUM recognizes that “preparation” can encompass things like physical training as well as the weather. The model could then recommend that the asker bring a waterproof jacket and offer ways to dig deeper into the topic with relevant articles, videos, and images from across the web.
“We are only in the early days of harnessing this new technology,” said Prabhakar Raghavan, senior vice president at Google, on stage at Google I/O. “We’re excited about its potential to solve more complex questions, no matter how you ask… MUM is a game-changer with its language comprehension abilities.”