Google announces six new features in Search

Google has announced six new features and developments in Search to help people gather and explore information in new ways.

Mr Taiwo Kola-Ogunlade, Google's Head of Communication for West Africa and sub-Saharan Africa, said this in a statement on Thursday in Lagos.

Kola-Ogunlade said that the new features, which leverage machine learning, would help people gather information in new ways.

He listed the six new features as multisearch, multisearch near me, translation in the blink of an eye, updates to the Google app for iOS, even faster ways to find what you are looking for, and new ways to explore information.

Kola-Ogunlade disclosed that people use Google Lens to answer more than eight billion questions every month.

He noted that Google introduced multisearch earlier this year as a major milestone in helping people search for information.

“With multisearch, you can take a picture or use a screenshot and then add text to it, similar to the way you might naturally point at something and ask a question about it.

“Multisearch is available in English globally, and will now be rolling out in 70 languages in the next few months.

“Multisearch near me allows you to take a screenshot or a photo of an item, and then find it nearby.

“So, if you have a hankering for your favourite local dish, all you need to do is screenshot it, and Google will connect you with nearby restaurants serving it,” he said.

According to him, with translation in the blink of an eye, one of the most powerful aspects of visual understanding is its ability to break down language barriers.

“Google has gone beyond translating text to translating pictures – with Google translating text in images over one billion times monthly in more than 100 languages,” Kola-Ogunlade said.

He said that, with major advancements in machine learning, Google is now able to blend translated text into complex images.

“Google has also optimised their machine learning models to do all this in just 100 milliseconds — shorter than the blink of an eye,” Kola-Ogunlade added.

He said that this uses generative adversarial networks (GAN models), the same technology that powers Magic Eraser on Pixel.

“Google is working to make it possible to ask questions with fewer words, or even none at all, and still help you find what you are looking for.

“If you do not know exactly what you are looking for until you see it, Google will help you specify your question.

“Google is reinventing the way it displays search results to better reflect the ways people explore topics and to surface the most relevant content,” Kola-Ogunlade said. (NAN)
