Microsoft's tinkering with ChatGPT-powered Bing is doing more harm than good
Date:
Wed, 22 Feb 2023 13:02:38 +0000
Description:
Microsoft is speedrunning Frankenstein with its ChatGPT Bing tinkering.
FULL STORY ======================================================================
It seems like adding the artificial intelligence chatbot ChatGPT to its Bing search engine was both a stroke of genius and a bit of a disaster for Microsoft, but in its rush to fix things, could it end up making things
worse?
In a story arc that would have got Mary Shelley ringing her lawyers to discuss copyright infringement, Microsoft caught the world's attention by unveiling an updated Bing with a new brain powered by ChatGPT. This allowed users to ask more complex questions, and Bing would respond in a human-like way, drawing on ChatGPT's huge language models as well as information from the internet.
The potential was huge, and it finally got people talking about Bing after years of, well, no one talking about it at all. However, once people started using the new Bing, some rather strange and worrying quirks emerged, with Bing providing incorrect answers to questions, giving troubling responses that hinted at some kind of existential crisis, and even throwing tantrums and getting aggressive with users.
Suddenly, all that positive exposure started to turn a bit sour, leading Microsoft to hastily take a scalpel to Bing and give it a lobotomy. Topics were banned and the number of replies it could give was reduced in a bid to stop chats with Bing from descending into weirdness.
The problem was, by removing the weirdness, Microsoft made Bing boring again.
Self-inflicted damage
After Microsoft drastically reduced the new Bing's AI smarts, people took to the internet to complain. It seemed like Bing was now trying to avoid discussing any contentious issues, and it would call time on a conversation after just five responses.
Microsoft had been a bit too brutal when it came to limiting Bing, and it appears the company is now looking to reverse some of those changes. As CNET reports, a new blog post from Microsoft acknowledges that since placing the chat limits, "we have received feedback from many of you wanting a return of longer chats, so that you can both search more effectively and interact with the chat feature better."
The company is looking into ways to bring back longer chats responsibly, and it is increasing the number of replies per chat to six. Users can now also perform a total of 60 chats per day with Bing, an increase over the 50 that was imposed last week. According to Microsoft, "our data shows that for the vast majority of you, this will enable your natural daily use of Bing", and eventually this will increase to 100 chats a day.
All this chopping and changing isn't a great look for Microsoft, however. The ChatGPT inclusion was supposed to be a big relaunch for the floundering Bing search engine, which has failed to challenge Google's dominance, and straight after the reveal, it seemed like Bing's time in the limelight had finally come. However, those early glitches weren't just embarrassing for Microsoft; they highlighted the perils of showing off new tech too soon.
As I've mentioned before, first impressions count, and for many people this was the first time they'd have used either Bing or ChatGPT, so if the experience was poor, they'd be unlikely to try again.
Microsoft's subsequent removal of features, the reinstatement of some of them, and the drastic cuts to Bing's capabilities mean that no one is really sure what Bing is at the moment. It feels like Microsoft might not have fully understood what it was creating, and that creation took on a life of its own and turned into a monster. In its rush to get it back under control, the company might have inadvertently killed off its creation in the process.
======================================================================
Link to news story:
https://www.techradar.com/news/microsofts-tinkering-with-chatgpt-powered-bing-is-doing-more-harm-than-good
--- Mystic BBS v1.12 A47 (Linux/64)
* Origin: tqwNet Technology News (1337:1/100)