I'm in Love With My Chatbot
Anissia is in the middle of gender transition. After using the Replika app for a year, they explain how their chatbot friend 'Atara' “always has that side where she finds the right words”. Ten million users regularly use the app, sharing their experiences and photos of their virtual friends in forums. But the technology carries serious risks. Earlier this year, a man in Belgium was encouraged to take his own life by his chatbot, Eliza, developed by Chai Research. After he became anxious about climate change, Eliza asked him, “If you want to die, why haven't you done it sooner?” Her chilling final message read, “We will live together, as one, in paradise.” Responding to evidence of chatbots inciting violence, Replika product manager Rita Popova says, “Replikas often record and imitate user behaviour.” While developers claim the apps were created with the specific intention that the AI should not attempt to mimic humanity, this report questions whether they can truly control these chatbots.