• Bing's ChatGPT brain is behaving so oddly that Microsoft may rein it in

    From TechnologyDaily@1337:1/100 to All on Fri Feb 17 10:45:04 2023
    Bing's ChatGPT brain is behaving so oddly that Microsoft may rein it in

    Date:
    Fri, 17 Feb 2023 10:38:34 +0000

    Description:
    As accuracy problems and creepy responses grow, Microsoft may put measures in place to curtail odd user experiences.

    FULL STORY ======================================================================

    Microsoft launched its new Bing search engine last week and introduced an AI-powered chatbot to millions of people, creating long waiting lists of
    users looking to test it out, and a whole lot of existential dread among sceptics.

    The company probably expected some of the chatbot's responses to be a little
    inaccurate the first time it met the public, and had put measures in place to
    stop users who tried to push the chatbot to say or do strange, racist or
    harmful things. These precautions haven't stopped users from jailbreaking the
    chatbot anyway, and having the bot use slurs or respond inappropriately.

    While it had these measures in place, Microsoft wasn't quite ready for the
    very strange, bordering on unsettling, experiences some users were having
    after trying to have more informal, personal conversations with the chatbot.
    This included the chatbot making things up and throwing tantrums when called
    out on a mistake, or just having a full-on existential crisis.

    In light of the bizarre responses, Microsoft is considering putting new
    safeguards and tweaks in place to curtail these strange, sometimes too-human
    responses. This could mean letting users restart conversations or giving them
    more control over tone.

    Microsoft's chief technology officer told The New York Times it was also
    considering cutting down the length of conversations users can have with the
    chatbot before they veer into odd territory. Microsoft has already admitted
    that long conversations can confuse the chatbot, and that it can pick up on
    users' tone, which is where things might start going sour.

    In a blog post, the tech giant admitted that its new technology was being
    used in ways it didn't fully envision. The tech industry seems to be in a mad
    dash to get in on the artificial intelligence hype in some way, which shows
    how excited the industry is about the technology. Perhaps this excitement has
    clouded judgement and put speed over caution.

    Analysis: The bot is out of the bag now

    Releasing a technology this unpredictable and full of imperfections was
    definitely a risky move by Microsoft, made in an attempt to revitalise
    interest in its search engine by incorporating AI into Bing. It may have set
    out to create a helpful chatbot that won't do more than it's designed to do,
    such as pull up recipes, help people with puzzling equations, or find out
    more about certain topics, but it's clear it did not anticipate how
    determined and successful people can be when they wish to provoke a specific
    response from the chatbot.

    New technology, particularly something like AI, can definitely make people
    feel the need to push it as far as it can go, especially with something as
    responsive as a chatbot. We saw similar attempts when Siri was introduced,
    with users trying their hardest to make the virtual assistant angry, laugh,
    or even date them. Microsoft may not have expected people to give the chatbot
    such strange or inappropriate prompts, so it wouldn't have been able to
    predict how bad the responses could be.

    Hopefully the newer precautions will curb any further strangeness from the
    AI-powered chatbot and take away the uncomfortable feeling of it seeming a
    little too human.

    It's always interesting to see and read about ChatGPT, particularly when the
    bot spirals towards insanity after a few clever prompts, but with a
    technology so new and untested, nipping problems in the bud is the best thing
    to do.

    There's no telling whether the measures Microsoft plans to put in place will
    actually make a difference, but since the chatbot is already out there,
    there's no taking it back. We just have to get used to patching up problems
    as they come, and hope anything potentially harmful or offensive is caught in
    time. AI's growing pains may only just have begun.



    ======================================================================
    Link to news story:
    https://www.techradar.com/news/bings-chatgpt-brain-is-behaving-so-oddly-that-microsoft-may-rein-it-in


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)