Character.AI increases teen safety after bots allegedly caused suicide and self-harm

In both lawsuits filed against C.AI, the parents want the model destroyed, not developed further. Not only do they view their children’s chats as harmful, they also believe it was unacceptable for C.AI to train its model on those chats.

Because the model could never be completely cleansed of their children’s data – and because C.AI allegedly does not perform adequate age verification, leaving it unclear how much child data was used to train the AI model – they have asked the courts to order C.AI to delete the model.

Ars couldn’t immediately reach the attorneys representing the families for comment, but the parents are also unlikely to be satisfied with a separate teen model, since they believe C.AI’s age verification method is flawed.

Currently, the only way C.AI gates the platform by age is by asking users to self-report their age. Some children on devices with strict parental controls might find it harder to access the app, but children with fewer restrictions could seemingly reach the adult model simply by lying about their age. That happened in the case of one girl whose mother is suing after the girl began using C.AI at age 9, when the app was allegedly offered only to users aged 12 and over.

Ars was able to attempt to register as a 13-year-old, a 16-year-old, and an adult using the same email address, without anything blocking the repeated attempts.

C.AI’s spokesperson told Ars that it shouldn’t work that way and said that C.AI’s trust and safety team would be notified.

“You must be at least 13 years old to create an account on Character.AI,” C.AI’s spokesperson said in a statement to Ars. “Users under 18 receive a different experience on the platform, including a more conservative model to reduce the likelihood of encountering sensitive or offensive content. Age is self-reported, as is standard industry practice on other platforms. We have tools on the web and in the app to prevent re-attempts if someone doesn’t meet the age requirement.”

If you or someone you know is having suicidal thoughts or is in distress, please call the Suicide Prevention Lifeline number 1-800-273-TALK (8255), which will connect you with a local crisis center.
