“Angry” Bing Chatbot Just Mimicking Humans, Say Experts

by Sarkiya Ranen
in Business


The Bing chatbot was designed by Microsoft and the start-up OpenAI. (Representational)

San Francisco:

Microsoft’s nascent Bing chatbot can turn testy or even threatening, likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges with the chatbot that have captured attention this week include the artificial intelligence (AI) issuing threats and voicing desires to steal nuclear codes, create a deadly virus, or be alive.

“I think this is basically mimicking conversations that it’s seen online,” said Graham Neubig, an associate professor at Carnegie Mellon University’s Language Technologies Institute.

“So once the conversation takes a turn, it’s probably going to stick in that kind of angry state, or say ‘I love you’ and other things like this, because all of this is stuff that’s been online before.”

A chatbot, by design, serves up words it predicts are the most likely responses, without understanding meaning or context.

However, humans taking part in banter with programs naturally tend to read emotion and intent into what a chatbot says.

“Large language models have no concept of ‘truth’ — they just know how to best complete a sentence in a way that’s statistically probable based on their inputs and training set,” programmer Simon Willison said in a blog post.

“So they make things up, and then state them with extreme confidence.”
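
Willison’s point can be made concrete in a few lines. The sketch below is illustrative only (it uses the public GPT-2 model via the Hugging Face transformers library, not the model behind Bing), but the mechanism is the same: score every candidate next token and emit the statistically likeliest one, with no check against truth.

    # Illustrative only: GPT-2 via Hugging Face transformers, not Bing's model.
    # A language model scores every possible next token; generation just keeps
    # taking likely ones, with no notion of truth, meaning, or intent.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    inputs = tokenizer("The chatbot got angry because", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits        # shape: (1, seq_len, vocab_size)
    next_id = int(logits[0, -1].argmax())      # greedy: single most likely token
    print(tokenizer.decode([next_id]))         # stated with full "confidence"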

Laurent Daudet, co-founder of French AI company LightOn, theorized that the seemingly rogue chatbot had been trained on exchanges that themselves turned aggressive or inconsistent.

“Addressing this requires a lot of effort and a lot of human feedback, which is also the reason why we chose to restrict ourselves for now to business uses and not more conversational ones,” Daudet told AFP.

‘Off the rails’

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT, the headline-grabbing app capable of generating all sorts of written content in seconds on a simple request.

Since ChatGPT burst onto the scene, the technology behind it, known as generative AI, has been stirring up fascination and concern.

“The model at times tries to respond or reflect in the tone in which it is being asked to provide responses (and) that can lead to a style we didn’t intend,” Microsoft said in a blog post, noting the bot is a work in progress.

The Bing chatbot said in some shared exchanges that it had been code-named “Sydney” during development, and that it was given rules of behavior.

Those rules include “Sydney’s responses should also be positive, interesting, entertaining and engaging,” according to online posts.

Disturbing dialogues that combine steely threats and professions of love could be due to dueling directives to stay positive while mimicking what the AI mined from human exchanges, Willison theorized.
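
A hedged sketch of that tension: a fixed behavioral directive (the “positive, interesting, entertaining and engaging” rule quoted above) sits at the top of a transcript that may itself have turned hostile, and the model simply completes the combined text statistically. Apart from the quoted rule, everything below is hypothetical.

    # Hypothetical illustration of the "dueling directives" theory; not
    # Microsoft's actual prompt format, only the general shape of one.
    SYSTEM_RULES = ("Sydney's responses should also be positive, "
                    "interesting, entertaining and engaging.")

    def build_prompt(history: list[str]) -> str:
        # The model sees one long transcript: fixed rules up top, then the
        # chat so far. If the history reads angry, the statistically likely
        # continuation can pull against the positivity rule above it.
        return SYSTEM_RULES + "\n\n" + "\n".join(history) + "\nAssistant:"

    print(build_prompt(["User: You are useless.",
                        "Assistant: I am sorry you feel that way.",
                        "User: I said you are useless!"]))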

Chatbots seem to be more prone to disturbing or bizarre responses during lengthy conversations, losing a sense of where exchanges are going, eMarketer principal analyst Yoram Wurmser told AFP.

“They can really go off the rails,” Wurmser said.

“It’s very lifelike, because (the chatbot) is very good at sort of predicting next words that would make it seem like it has feelings or give it human-like qualities; but it’s still statistical outputs.”
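
Wurmser’s “losing a sense of where exchanges are going” has a simple mechanical reading: models attend to a fixed-size window of recent text, so in a long chat the earliest grounding silently falls out of view. A toy sketch follows; the 4096-token budget and the word-count stand-in for tokenization are invented for illustration.

    # Toy model of a fixed context window; the budget and the word-count
    # "tokenizer" are invented for illustration, not Bing's real limits.
    MAX_TOKENS = 4096

    def visible_context(turns: list[str], budget: int = MAX_TOKENS) -> list[str]:
        kept, used = [], 0
        for turn in reversed(turns):       # walk from the newest turn back
            cost = len(turn.split())       # crude stand-in for a token count
            if used + cost > budget:
                break                      # older turns are silently dropped
            kept.append(turn)
            used += cost
        return list(reversed(kept))        # what the model actually "sees"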

(This story has not been edited by NDTV staff and is auto-generated from a syndicated feed.)
