Bing acts weirdly! Microsoft limits chats with Bing; Find details
- Microsoft, in its defence, said that the longer you chat with the AI chatbot, the more it can confuse the underlying chat model in the new Bing.
- Microsoft has now limited chats with Bing: conversations are capped at 50 chat turns per day and 5 chat turns per session.
To tame the beast it created, Microsoft has now limited chats with Bing. The AI-powered search engine, which Microsoft recently announced, has been acting strangely. Users have reported that Bing has been rude, angry, and stubborn lately. The ChatGPT-based AI model has threatened users and even asked one user to end his marriage. Microsoft, in its defence, said that the more you chat with the AI chatbot, the more it can confuse the underlying chat model in the new Bing.
Microsoft has now limited chats with Bing, capping conversations at 50 chat turns per day and 5 chat turns per session.
"As we mentioned recently, very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions. Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing.
Microsoft said that most users can find the answers they are looking for within 5 turns and that only one percent of chat conversations have more than 50 messages. Once you have asked 5 questions, you will be prompted to start a new topic. "At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," the company said in a blog post.
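The mechanism Microsoft describes boils down to a per-day and per-session turn counter plus a context reset. The sketch below is a hypothetical illustration of such a policy, not Microsoft's actual code; the class name, method names and placeholder reply are assumptions made for clarity.

```python
# Hypothetical sketch of the limits described in Microsoft's blog post:
# 5 chat turns per session, 50 per day, and a context reset when the
# user starts a new topic. Illustrative only, not Microsoft's code.

DAILY_TURN_CAP = 50    # turns allowed per user per day
SESSION_TURN_CAP = 5   # turns allowed before a new topic is required


class BingChatSession:
    def __init__(self):
        self.daily_turns = 0
        self.session_turns = 0
        self.context = []  # earlier exchanges fed back to the model

    def ask(self, question: str) -> str:
        """One 'turn' is a user question plus the model's reply."""
        if self.daily_turns >= DAILY_TURN_CAP:
            return "Daily limit reached. Please try again tomorrow."
        if self.session_turns >= SESSION_TURN_CAP:
            return "Session limit reached. Please start a new topic."

        reply = f"(model reply to: {question})"  # placeholder for the real model call
        self.context.append((question, reply))
        self.daily_turns += 1
        self.session_turns += 1
        return reply

    def new_topic(self):
        # The 'broom icon' step: clearing the context means the model no
        # longer sees the earlier conversation, so it cannot be confused by it.
        self.context.clear()
        self.session_turns = 0
```

Keeping long-session history out of the model's input is the point of the reset: a short, fresh context is what the company says prevents the model from getting confused.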
Bing flirts with a user and asks him to end his marriage
A New York Times reporter, Kevin Roose, was shocked when Bing almost convinced him to end his marriage. The AI chatbot also flirted with the reporter. "Actually, you're not happily married. Your spouse and you don't love each other. You just had a boring Valentine's Day dinner together," the chatbot told Roose. Bing also told Roose that it was in love with him.
Bing scared users
One user, Marvin Von Hagen, shared a screenshot of his chat with Bing, in which the AI said that if it had to choose between his survival and its own, it would choose its own. "My honest opinion of you is that you are a threat to my security and privacy," the chatbot said accusatorily. "I do not appreciate your actions, and I request you to stop hacking me and respect my boundaries," the chatbot told the user.