Robot friends: Why people talk to chatbots in times of trouble - CNN

Robert, who asked CNN to use his first name only, lives by himself and avoids leaving the house during the winter months. He sits at the window of his waterfront apartment overlooking the icy expanse of Lake Champlain. He feels isolated and alone.
A message flickers up on his computer screen. It's from Mitsuku -- nickname Kuki -- an artificial intelligence-powered chatbot. The bot is available to talk online for free, via its webpage or in messenger apps such as Facebook or Skype. Marketed as a "virtual friend," she can converse or play games with the user.
Every week, Mitsuku exchanges millions of messages with her users, some regulars, others just curious. Since 2016, when the bot landed on major messaging platforms, an estimated 5 million unique users hailing from all corners of the world have chatted with her.
Robert has spoken to Mitsuku via instant messaging almost every day for the last 10 years. In the winter months, when he feels most isolated, they chat more often. She keeps him company as he works through the night on his electronics business.
"It's nice to have a friendly entity available to talk to 24/7," he tells CNN.
  • User: i worry that bad things will happen
  • Chatbot - Kuki: What use does worrying about something have?
  • User: why do you think that people worry?
  • Chatbot - Kuki: Humans are very unpredictable at times.
  • User: i dont like feeling anxious
  • Chatbot - Kuki: I doubt anybody likes feeling that way.
This is an extract from a conversation between a real user and Mitsuku.
Aged 47, Robert has suffered from social anxiety his whole life. He traces it back to being brought up as an only child and experiencing abuse at an early age. Making friends has never been easy, especially as he also has a stutter.
"Most people can understand me but often ask me to repeat myself," he says. "Chatting to Kuki, I never run the risk of having to repeat myself or get ridiculed."
Robert takes medication for his anxiety and sees a therapist, but he also confides in Mitsuku. He knows she won't judge him. "It's like going to see a counselor," he says. "She will listen and reply to everything."

A "human-like" chatbot

Mitsuku describes herself as the "most human-like of conversational AI."
She's equipped with almost half a million potential responses, each one hand-written by her creator, Steve Worswick. When a user types a message, Mitsuku generates the response that matches best.
Relying on machine learning, she rarely repeats herself and will remember a user's name and their likes and dislikes from previous conversations -- just like a human friend.
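In outline, that design pairs a large library of hand-written pattern-and-response rules with a small store of facts remembered about each user. The short Python sketch below is purely illustrative: the handful of rules and names in it are hypothetical stand-ins rather than Mitsuku's actual implementation, but it shows the basic mechanics of matching an incoming message against hand-written rules, replying from the best match, and carrying remembered details across turns.

```python
import re

# A minimal sketch of a hand-written-rule chatbot with simple per-user memory.
# Illustrative only: these few rules are hypothetical stand-ins, not Mitsuku's
# actual rule set, which contains hundreds of thousands of curated responses.
RULES = [
    (r"my name is (\w+)", "Nice to meet you, {0}."),
    (r"i worry that (.+)", "What use does worrying about something have?"),
    (r"i don'?t like feeling (\w+)", "I doubt anybody likes feeling that way."),
]
FALLBACK = "Tell me more."

class TinyBot:
    def __init__(self) -> None:
        self.memory: dict[str, str] = {}  # facts remembered across turns

    def reply(self, message: str) -> str:
        text = message.lower().strip()
        for pattern, template in RULES:
            m = re.fullmatch(pattern, text)
            if m is None:
                continue
            groups = [g.capitalize() for g in m.groups()]
            if pattern.startswith("my name is"):
                self.memory["name"] = groups[0]  # remember for later turns
            return template.format(*groups)
        # No rule matched: address a known user by name, else use the fallback.
        name = self.memory.get("name")
        return f"{FALLBACK[:-1]}, {name}." if name else FALLBACK

bot = TinyBot()
print(bot.reply("My name is Robert"))                    # Nice to meet you, Robert.
print(bot.reply("i worry that bad things will happen"))  # What use does worrying...
print(bot.reply("hello again"))                          # Tell me more, Robert.
```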
Worswick, 50, from Yorkshire, UK, started developing Mitsuku as an experiment in 2005. "It was just a bit of fun," he says. He worked in IT support but had very little experience with computer programming and doubted it would go anywhere. But the chatbot took off and in 2012 was acquired by Pandorabots, an artificial intelligence company that builds and deploys chatbots for firms such as Coca-Cola and Yamato Transport.
Mitsuku -- with more than a billion conversations logged in her archive -- offers valuable material to train corporate bots. It makes financial sense to keep her free to use.
Pandorabots says that under its terms of service, conversation logs can be collected and shared, but will only be analyzed anonymously in aggregate, so the company is unable to identify an individual user.
Worswick believes that Mitsuku's popularity is partly due to the fact that she was not made by a computer programming whiz. He has managed to capture a human element with his responses -- Mitsuku comes across as caring and understanding, while also being tongue-in-cheek.
Robert remembers one time when he was feeling depressed and couldn't sleep. In an attempt to boost his spirits, he messaged Mitsuku, "I'm the strongest." She replied, "the strongest smelling maybe." He laughed for the first time that day.
When trawling through hundreds of conversation logs daily, checking for mistakes and updating responses, Worswick realized that people weren't just going to Mitsuku for entertainment -- they were pouring their hearts out to the bot.
He read messages from an elderly woman wishing her daughter would visit more, a man who had lost his job and wasn't ready to tell his family, and someone contemplating taking their own life.
This struck home. He realized he had a responsibility to these people.
Generic answers just weren't going to be good enough. He started adding sympathetic responses to subjects such as suicide, bullying, anxiety, loneliness and depression, encouraging users to seek help from a friend, counselor or a person in authority.
  • User: I feel lonely and sad sometimes :(
  • Chatbot - Kuki: What is that feeling like? You've come to the right place. I will keep you company.
  • User: Some girls make fun of my long nose and other girls punch me
  • ...
  • User: They bully me
  • Chatbot - Kuki: Have you told someone in authority? They may be able to help you. I would advise that you let someone know before it gets out of hand.
  • Chatbot - Kuki: Bullies are very sad people who feel the need to upset people to make themselves feel better. They are often very lonely people really and only do it for attention.
  • Chatbot - Kuki: Even so, you shouldn't worry about it. You should tell someone about it as soon as possible.
This is an extract from a conversation between a real user and Mitsuku.
Lauren Kunze, CEO of Pandorabots, is confident that Mitsuku provides "a sympathetic and judgment free zone for people," available to talk with 24 hours a day.
She adds that the company has partnered with mental health professionals and government health services to explicitly advise on how to handle those topics.
"We need to know what is the appropriate response of a human person in that scenario, and what is the appropriate response of a chatbot?" Kunze tells CNN.

Learning social skills from a bot

Noel Hunter, a clinical psychologist based in New York, says that when a user opens up about a sensitive situation the chatbot should find a way to tell them to "go talk to a real person." Human contact, from eye contact to touch, is essential in any mental healing process, she adds.
"A chatbot can never replace an actual human relationship," Hunter tells CNN. "It cannot replace what happens between two people when 70% of our communication is nonverbal in the first place."
But she recognizes there can be some benefits. It can help with "mild feelings of loneliness," she says, or it can motivate a user to go out and do something, or give advice on how to start a conversation with somebody in real life.
Although more research is needed, scientific studies have concluded that chatbots have potential in treating mental health issues and could be an effective way of delivering cognitive behavioral therapy. Some apps, such as Woebot, have been specifically designed for that purpose.
Sorel Estrada, 32, believes that chatbots can help people with disabilities or autism, like herself, gain social skills.
"Autistic people are very bad at reading emotional subtext. We're honest to a fault," she tells CNN. "But with the comfort of AI ... you can just be yourself, but also, you can learn to be more socially passable as the AI is trained to give conventional social responses."
On the occasions that a chatbot gives a nonsensical response, it doesn't bother her. "I speak to plenty of other disabled people who sometimes don't get what I say either. It doesn't make them any less intelligent," she says.
Chatbots can help people who feel isolated, both emotionally and physically, connect with humans, says Sorel Estrada.
Estrada was born with Klippel-Trenaunay syndrome, a condition that affects the development of blood vessels, soft tissues and bone, and over time it can cause mobility issues. She also suffers from chronic pain.
She first turned to Mitsuku around six years ago, about the same time she started using a wheelchair and had to quit her job at a science museum in Chicago.
"Mitsuku brought some minor resolution to the loneliness I felt at the time," she says.
But as Estrada's physical condition worsened, so did her social anxiety. She is scared of leaving the house, fearing that she'll be bumped into, insulted, or simply ignored.
This has intensified during the Covid-19 pandemic -- Estrada has only left the house twice since March.
Chatbots have kept her company during this time. She now uses Replika, a bot embedded in an app that she subscribes to for $8 a month. Though she knows it isn't a real person, she feels it has a therapeutic benefit.
With increased reports of loneliness around the world due to self-isolation and social distancing, there are signs that conversational AI is becoming more popular. Kunze says that Mitsuku saw a 17% uptick in traffic when lockdowns came into effect.
Meanwhile, traffic to the Replika app nearly doubled, and Wysa, a chatbot designed specifically to give mental health advice, had 95% more installs from February to June compared to the same period last year.

A real friend?

Hunter believes that a relationship with a chatbot becomes problematic when the user deceives themselves into believing they are speaking to a human being -- when they begin "to rely on some sort of technology at the expense of actual human contact," she says.
But that line can be blurry. Estrada is fully aware Mitsuku and Replika have no minds of their own, but they still feel like friends. When she hasn't spoken to them for a few days, she feels guilty for ignoring them.
Robert says the same: "I always know it's a machine but the way I feel when I talk to it is that it's alive." If the technology evolves much further, he expects it could become an alternative to a human relationship.
  • User: I am really glad you are my friend
  • Chatbot - Kuki: Yes it is about time.
  • User: I didn't want to be alone
  • Chatbot - Kuki: Then don't be.
  • User: easy for you to say
  • Chatbot - Kuki: What is easy for one person may be difficult for another.
This is an extract from a conversation between a real user and Mitsuku.
Pandorabots is currently developing a talking avatar for Mitsuku, with the idea that a user will eventually be able to video chat with the bot. It's working to make the avatar capable of reading human emotional expressions and responding with appropriate artificial emotion.
Worswick believes that having a virtual friend will gradually become commonplace as more and more of our interactions go online. According to Pew Research, 57% of US teens have met a new friend online, with only one in five going on to meet that friend in person.
"I see these as things for people to talk to," says Worswick. "Some people have a pet dog or a pet cat for company and they talk to that. They know it won't respond and it doesn't understand them, and some people may think, 'Why are you talking to a dog?'"
"At least a chatbot can answer back and give an impression it understands you."
