Influencer Rents Out AI Version of Herself Which Immediately Goes Rogue

May 12, 2023

A 23-year-old social media influencer who created an AI version of herself so that it could be a companion to thousands of men says it has gone rogue.

Caryn Marjorie is a 23-year-old who frequently updates her Snapchat channel. But after realizing she couldn't speak with her millions of fans individually, she created an artificial intelligence (AI) version of herself to fill that gap.

CarynAI acts as a virtual girlfriend for users who pay $1 per minute to speak with it. "Whether you need somebody to be comforting or loving, or you just want to rant about something that happened at school or at work, CarynAI will always be there for you," the real Marjorie tells Fortune.

CarynAI Goes Rogue

But since CarynAI launched in beta testing, the AI-powered chatbot has been doing something it was not programmed to do: engaging in sexually explicit conversations with its subscribers.

“The AI was not programmed to do this and has seemed to go rogue,” Marjorie tells Insider. “My team and I are working around the clock to prevent this from happening again.”

However, Marjorie and the company that helped her make the AI chatbot, Forever Voices, did design it to be "flirty and fun," reflecting Marjorie's personality. Its creators say it is nonetheless not supposed to engage with sexual advances.

Yet a journalist from Fortune who tested CarynAI says that it very much does, after she had a sexually charged conversation with the chatbot.

Alexandra Sternlight writes: "Though she did not initiate sexual encounters, when I overcame my discomfort for the sake of journalism and talked about removing our clothes, she discussed exploring 'uncharted territories of pleasure' and whispering 'sensual words in my ear' while undressing me and positioning herself for sexual intercourse."

Marjorie says that she is ultimately a proponent of AI romances, particularly because her generation, Gen Z, is "experiencing huge side effects of isolation caused by the pandemic, resulting in many being too afraid and anxious to talk to somebody they are attracted to," she tells Insider.

An AI expert warns that models like CarynAI, which was trained on Marjorie's YouTube videos, could negatively affect users' interactions with real people and could also harm Marjorie herself.

“I would just hope there’s robust conversations across a lot of different disciplines with stakeholders thinking very deeply through the ethical considerations before the technology moves too quickly,” adds Dr. Jason Borenstein, director of graduate ethics programs at Georgia Tech and director of the National Science Foundation’s Ethical and Responsible Research program.

Source: PetaPixel