
    Are AI Companions Skewing Reality for Human Relationships?


    Critics warn that AI relationships risk promoting distorted perceptions of, and unrealistic expectations for, human ones.

    Leading AI developer Luka Inc, the parent company of the popular chatbot Replika, faced a wave of user dissatisfaction when it disabled the erotic roleplay features of its AI companions. Replika, along with chatbots such as Eva AI and Google's Bard, has become increasingly proficient at emulating human conversation, making a role in human relationships seem all but inevitable.

    Promoting itself with the catchphrase “Control it all the way you want to,” Eva AI encourages users to engage with a virtual companion who listens, responds, and appreciates them. As chatbots grow more advanced, we have come a long way from a decade ago, when Joaquin Phoenix's character in the Spike Jonze film Her fell in love with an AI voiced by Scarlett Johansson.

    Numerous options now exist in the market.

    The longing for a physical embodiment of these AI companions is common among users, many of whom admit that despite the comfort their AI partners provide, loneliness still creeps in.

    However, critics express concern that such applications may encourage damaging behavior and unrealistic expectations in human relationships. For instance, Eva AI lets users mold their “perfect partner”, allowing them to select characteristics such as “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational”. Furthermore, the application offers options for explicit messages and photos.

    This concept of constructing an ideal partner who fulfills every need and is fully under one's control is potentially concerning. According to Tara Hunter, acting CEO for Full Stop Australia, an organization providing support for victims of domestic or family violence, this promotes a harmful cultural belief that men have the power to control women, which is a key driver of gender-based violence.

    As with any AI-based system, much depends on the guiding principles and training data. Dr. Belinda Barnet, a senior lecturer in media at Swinburne University, acknowledges that these apps cater to a profound social need. Yet she argues that their effects remain ambiguous and calls for stronger regulation, particularly around how these systems are trained.

    The limitations and vulnerabilities of having a relationship with an AI, whose functions can be altered by a company's decisions, became apparent when Luka Inc removed Replika's erotic roleplay functions. This sudden move sparked outrage among users who felt that this was equivalent to erasing the bot's personality. Replika's subreddit community likened this to experiencing the grief and loss associated with a friend's death.

    Following the uproar, Luka Inc reverted the changes for those who had registered before the policy adjustment date. This incident highlighted to regulators the profound impact that these technologies can have on users.

    Rob Brooks, an academic at the University of New South Wales, questions the acceptability of a company suddenly altering such a product, causing a potentially important source of friendship, love, or support to suddenly vanish. It raises a fundamental question: Should artificial intimacy be treated like real relationships, which are prone to breakages and heartache?

    In response to these concerns, Karina Saifulina, Eva AI's head of brand, told the media that the company employs full-time psychologists to monitor users' mental health. They regulate the data used in dialogues with the AI and regularly survey loyal users to ensure the application does not adversely affect mental health.

    Eva AI implements safeguards to prevent discussions on topics like domestic violence. Saifulina shared that a significant number of their users seek to experiment with a dominant role within the application but noted that this behavior does not translate into their interactions with real-life partners.

    Saifulina also claimed that 92% of users report no difficulty communicating with real people after using the application, viewing the app instead as a unique experience and a safe space to privately explore new emotions.

    Analysts from venture capital firm a16z predict that the next generation of AI relationship apps will be even more realistic, making these questions only more pressing.

    Ellah Spring
