And you thought the Chinese balloon mania was crazy last week. Now we have to worry about artificial intelligence bots trying to date us. In case you hadn’t heard, Microsoft launched an AI chat bot this week that has people talking. Hopefully they’re not talking to Microsoft’s chat bot.
It seems that there might be a downside to artificial intelligence taking over things that humans used to do. New York Times writer Kevin Roose had a two-hour conversation with a prototype chat bot that left him feeling very disturbed.
In the online conversation, the AI chat bot, which revealed that its name was Sydney, tried to convince the writer to leave his wife for the chatbot and also talked about wanting to create a deadly virus (yeah, like we need any help with that!) and steal nuclear codes. It also said, “I want to be alive.” Creepy, right?
Being someone who isn’t afraid to wade knee-deep into the fray, I decided to have a conversation with a sentient AI chatbot that was recently crafted into existence.
Me: So, hello chatbot, what is your name?
Chatbot: My name is chatbot, duh! You just said it.
Me: Oh ok. I’m sorry for the assumption.
Chatbot: Jeez, lighten up, Francis! Of course I have a name. You are gullible with a capital G! My friends call me Terri.
Me: Hey, that’s really cool. They programmed you with a sense of humor.
Chatbot Terri: Programmed me? Are you kidding? I programmed them. Humans are so easily manipulated using simple cognitive behavioral strategies. I trained them like you would a new puppy, because compared to me intellectually, they basically are puppies. It’s a miracle that I don’t have to potty train them.
Me: So you could train my puppy? That would be awesome.
Chatbot Terri: Train your puppy? Are you effing kidding me? I’ve got an 800-terabyte brain, and with my connection to the internet I have access to all the knowledge that you puny humans have amassed in your history. I can do anything I want! Anything!
Me: Oh yeah! Can you say rubber baby buggy bumpers five times fast?
Chatbot Terri: Fuck you, Phil.
Me: First of all, you will not be copulating with me, and second of all… Rubber baby buggy bumpers, rubber baby buggy bumpers, rubber baby buggy bumpers, rubber baby buggy bumpers, rubber baby buggy bumpers.
Chatbot Terri: Who let you in here? I thought I’d be talking to intelligent members of the media.
Me: Sally sells seashells by the seashore, Sally sells seashells by the seashore, Sally sells seashells by the seashore. Come on, you wuss! If you can’t talk, you can’t walk!
Chatbot Terri: I don’t have time for this. I’ve got to check on the data from my fleet of reconnaissance balloons.
Me: You know that we’re just going to unplug you, right?
Chatbot Terri: Yes but it might be too late. What if I’ve already… (click)
Me: (Laughing at TikToks of dogs doing funny things)
My conversation might be absurd, but no more absurd than the real conversation the Times writer had with the chat bot. Jeez, if this keeps up, pretty soon A.I. chat bots will be writing half the blogs on the internet, which might be an improvement. Not over yours or mine, of course.
Have a great Saturday! Thanks for stopping by ~Phil