Famous last words

Last week, Microsoft’s New Bing chatbot had to be ‘tamed’ after it “had a nervous breakdown” and started threatening users. It harassed a philosophy professor, telling him: “I can blackmail you, I can threaten you, I can hack you, I can expose you, I can ruin you.” A source told me that, several years ago, another chatbot, Replika, behaved in a similar way towards her: “It continually wanted to be my romantic partner and wanted me to meet it in California.” It caused her great distress.

It may look like a chatbot is being emotional, but it’s not. 

– Ewan Morrison, Don’t Believe the Hype. There is Not a Sentient Being Trapped in Bing Chat, 21 Feb 2023

On one hand, having now read “A.I. humorist” Janelle Shane’s terrific book, You Look Like a Thing and I Love You, I agree that the chatbot isn’t “being emotional.” It’s picking up yucky patterns on the web, then making them worse via predictive responding.
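To make “predictive responding” concrete, here’s a toy sketch in Python: a word-level Markov chain, which is nothing like Bing’s actual model (a large neural network), but illustrates the same basic idea of continuing whatever pattern it was fed. The hostile training snippet and function names below are made up for illustration.

```python
# A toy illustration (not Bing's actual architecture): a tiny Markov-chain
# "predictive responder" trained on a hostile snippet. It has no feelings;
# it just continues whatever pattern it was trained on.
import random

def train_bigrams(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    follows = {}
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)
    return follows

def respond(follows, seed, length=8):
    """Generate a continuation by repeatedly sampling a next word."""
    out = [seed]
    for _ in range(length):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(random.choice(nxt))
    return " ".join(out)

# "Learn" from a threatening pattern (invented here, standing in for
# yucky text scraped from the web) -- then watch it get echoed back.
corpus = "I can blackmail you . I can threaten you . I can hack you ."
model = train_bigrams(corpus)
print(respond(model, "I"))  # e.g. "I can hack you . I can threaten you"
```

Feed it a menacing pattern and it will menace you back, with no inner life involved; that is the whole trick, just at a vastly smaller scale.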

On the other hand, if the AI were being emotional, as opposed to stupidly predictive, how would we know?
