• Multisearch could make Google Lens your search sensei

    From TechnologyDaily@1337:1/100 to All on Thu Apr 7 22:15:04 2022
    Multisearch could make Google Lens your search sensei

    Date:
    Thu, 07 Apr 2022 20:59:21 +0000

    Description:
    Google is updating Lens with the ability to mix image and text search for better results

    FULL STORY ======================================================================

    Google searches are about to get even more precise with the introduction of multisearch, a combination of text and image searching with Google Lens.

    After making an image search via Lens, you'll now be able to ask additional questions or add parameters to your search to narrow the results down. Google's use cases for the feature include shopping for clothes with a particular pattern in different colors, or pointing your camera at a bike wheel and then typing "how to fix" to see guides and videos on bike repairs. According to Google, the best use case for multisearch, for now, is shopping results.

    The company is rolling out the beta of this feature on Thursday to US users of the Google app on both Android and iOS. Just tap the camera icon next to the microphone icon, or open a photo from your gallery, select what you want to search, and swipe up on your results to reveal an "add to search" button where you can type additional text.

    This announcement is a public trial of a feature the search giant has been teasing for almost a year; Google discussed it when introducing MUM at Google I/O 2021, then provided more information in September 2021. MUM, or Multitask Unified Model, is Google's new AI model for search, revealed at the company's I/O event that year.

    MUM replaced the old AI model, BERT (Bidirectional Encoder Representations from Transformers); according to Google, MUM is around a thousand times more powerful than BERT.

    Analysis: will it be any good?

    It's in beta for now, but Google sure made a big hoopla about MUM during its announcement. From what we've seen, Lens is usually pretty good at identifying objects and translating text. The AI enhancements, however, will add another dimension and could make it a more useful tool for finding information about the exact thing you're looking at right now, as opposed to general information about something like it.

    It does, though, raise questions about how good it'll be at pinpointing exactly what you want. For example, if you see a couch with a striking pattern on it but would rather have that pattern on a chair, will you reasonably be able to find it? Will the result be at a physical store or at an online storefront like Wayfair? Google searches often surface inaccurate physical inventories of nearby stores; is that getting better as well?

    We have plenty of questions, but they'll likely only be answered once more people start using multisearch. The nature of AI is to get better with use, after all.



    ======================================================================
    Link to news story: https://www.techradar.com/news/multisearch-could-make-google-lens-your-search-sensei/


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)