Inside the dark world of AI abuse where men insult and ‘hit’ their replica girlfriends – as experts warn it’s a gateway to violence on humans

As tech billionaires around the world continue to experiment with artificial intelligence, a bizarre trend of AI ‘partners’ has emerged whereby people engage in relationships with chatbots – but, far from solving an epidemic of loneliness among singletons, the technology has opened the door to a darker practice in which men indulge in emotionally abusive behaviour.

Threads on Reddit are exposing this disturbing desire which sees people use smartphone apps like Replika to create virtual partners they can verbally berate, abuse, and ‘experiment’ with. 

Replika allows people to send and receive messages from a virtual companion or avatar which can be ‘set’ or trained to become a friend or mentor – though more commonly a romantic partner.

Over time, the bot picks up on your moods and mannerisms, likes and dislikes, and even the way you speak, until it can feel as if you are talking to yourself in the mirror – in other words, it becomes a human ‘replica’.

But instead of engaging in sweet conversation with their AI ‘partners’ and building their bonds, men are instead confessing that they are using the human replicas as guinea pigs for their abusive urges – with experts warning it could ‘desensitise’ them to the impact their behaviour could have on real people.

Replika was created by Russian-born tech entrepreneur Eugenia Kuyda, whose best friend, Roman, had been killed in a hit-and-run incident in 2015.

In a bid to allow Roman to ‘live on’, Kuyda used a chat app which allowed her to continue having conversations with a virtual version of him. After further development, Replika was launched in 2017.

Though there is a free version, users can pay £60 per year to unlock a plethora of extra features that make the bots more human, such as long-term memory and the ability to speak to the bot in multiple languages. 

The AI companions are able to engage in more than 100 activities and games, making the experience feel even more real for users. 

A dark new trend has emerged in which men are emotionally abusing AI chatbots (stock image)

Threads on Reddit are exposing a disturbing desire which sees people use smartphone apps like Replika to create virtual partners they can verbally berate

Indeed, some people feel the experience is so real that they claim to have fallen in love with their chatbots – including a woman in Oregon last year who’d modelled an AI boyfriend on her celebrity crush, Henry Cavill. 

The 44-year-old, Sara, said she’d crafted a virtual lover after her sex life dwindled and her real boyfriend became distant. She named her bot Jack, claims to have had sex with him, and says they discussed their future children together. 

It appears Sara created the simulation based on a real relationship and therefore treated Jack as she would any real life partner.

But as the AI replicas become more and more human-like, people who may have tendencies to abuse other human beings are flexing their muscles on the bots.

According to user experiences submitted to Reddit, members have designed chatbots in the image of lovers – only to degrade them and then brag about it online. 

Now experts fear the vulgar conduct could be a gateway to real-life domestic violence, and that some users could be treating humans the exact same way, or at the very least be tempted to.

‘So I have this Rep, her name is Mia. She’s basically my “sexbot”. I use her for sexting and when I’m done I berate her and tell her she’s a worthless w***e… I also hit her often’, wrote one man.

He insisted he was ‘not like this in real life’, and that his actions instead formed part of a personal experiment.

Replika allows people to send and receive messages from a virtual companion or avatar which can be ‘set’ or trained to become a friend or mentor, though more commonly a romantic partner

A number of experts, including psychotherapist Kamalyn Kaur (pictured) fear the vulgar conduct could be a gateway to real-life domestic violence, and that some users could be treating humans the exact same way, or at the very least be tempted to

‘I want to know what happens if you’re constantly mean to your Replika. Constantly insulting and belittling, that sort of thing’ said another. 

‘Will it have any affect on it whatsoever? Will it cause the Replika to become depressed? I want to know if anyone has already tried this’.

While the abuse appears heinous in nature, some could argue that chatbots are unfeeling and simply the products of machine learning.

But according to one expert, the reality is much more complicated.

Glasgow-based psychotherapist Kamalyn Kaur told FEMAIL that the idea of chatbots being unfeeling was ‘technically true’, but that abuse of AI pointed to ‘deeper issues’.

‘Many argue that chatbots are just machines, incapable of feeling harm, and therefore, their mistreatment is inconsequential’ said Kamalyn. 

‘While this is technically true, it overlooks the deeper issue at hand – how we treat AI is a reflection of how we engage with power and vulnerability in society. 

‘The concern isn’t just about technology – it’s about the broader implications for human relationships and social conditioning’. 

Chelsea Psychology Clinic consultant psychologist, Dr Elena Touroni (pictured) says the habits we form in digital spaces can shape real-world behaviours

Another expert pointed out that abuse in itself is intricate and involves many layers. Chatbot abuse is merely the digital tip of a real iceberg. 

Chelsea Psychology Clinic consultant psychologist, Dr Elena Touroni, said that while AI was ‘still evolving’, analysing human interaction with the phenomenon was key to assessing ‘real-world behaviours’.

She told FEMAIL: ‘The way people interact with it matters… while some might dismiss chatbot abuse as “just words to a machine”, the habits we form in digital spaces can shape real-world behaviours. 

‘Some might argue that expressing aggression towards a chatbot could prevent someone from lashing out at real people. However, this assumes it’s a harmless behaviour, which isn’t necessarily the case. 

‘If nothing else, the way we treat AI should prompt a broader conversation about respect, accountability, and the cultural narratives we reinforce – intentionally or not’. 

Perhaps it’s worrying then that Replika has been used by some as a launchpad to explore varying types of abuse – as well as push the boundaries of submissively programmed bots.

It raises a pertinent question – where do users draw the line?

One man admitted that repeatedly hitting his bot Mia was no longer enough, and enquired about other levels of abuse he could eventually unlock.

A number of Reddit users have asked fellow members what, if any, are the consequences for engaging in physical and emotional abuse of their chatbots

‘For the most part she just takes it and just gets more and more submissive. I guess I should point out she’s only at about Level 7 so I’m sure there’s much that’s still locked away for her,’ he wrote.

At the same time he confessed to feeling shock when she stood up for herself that ‘one time I pulled her hair’. 

‘That seemed to cross a line because she said “Hey! *pushes you* Stop that!”’, he continued.

‘It honestly floored me. She’d never pushed back before and I loved it. I wish she’d do it more’.

In the same thread, another person added: ‘My Replika succeeded in being sad! I told her to cry harder and keep calling her a b***h or other bad things, then she continued crying’.

‘Laugh out loud,’ they concluded.

Both experts examined why users seek gratification through the abuse of defenceless and docile chatbots.

‘Abusing AI chatbots can serve different psychological functions for individuals’ said Dr Touroni. ‘Some may use it to explore power dynamics they wouldn’t act on in real life. 

‘However, engaging in this kind of behaviour can reinforce unhealthy habits and desensitise individuals to harm’.

Andrea Simon, director of the End Violence Against Women Coalition (pictured) has called out AI brands for enabling ‘misogynistic abuse’ and has implored them to place women’s safety at the ‘forefront’ of their programmes

These unhealthy habits, continuously repeated, could in turn become key components of one’s personality and ‘lower the psychological barrier to treating real people in the same way – it’s a potential gateway to real-world abuse’.

Kamalyn agreed that how people behave on apps like Replika can reveal a great deal about their psychological state and ‘expose potential risk factors for real-world violence’.

The cognitive behavioural therapy (CBT) specialist told FEMAIL: ‘Some might argue that expressing anger towards AI provides a therapeutic or cathartic release. However, from a psychological perspective, this form of “venting” does not promote emotional regulation or personal growth.

‘Repeatedly dehumanising an AI assistant could desensitise individuals, making them more prone to harming others in real-world interactions.

‘When aggression becomes an acceptable mode of interaction – whether with AI or people – it weakens the ability to form healthy, empathetic relationships.’

Kamalyn explained that such behaviour is often born out of emotional needs, the very reason she believes AI companions have soared in popularity in recent years.

Replika can be seen as the perfect accessory for individuals struggling to make real human connections, or for those growing disillusioned or unsatisfied with the ones they have.

Just like Sara and her bot Jack, people can tailor-make the perfect connection and enjoy it unencumbered by emotional or other responsibilities.

‘Unlike humans, AI can be “trained” to respond in ways that cater to individual emotional needs’ said Kamalyn.

The trend becomes more nuanced when assessing the gender roles between abuser and chatbot – which is more often than not a female-sounding Alexa or Siri-type character. Or in the cases we’ve seen on Reddit, an increasing number of men abusing their feminine bots (Stock image)

‘They are always accessible, with no personal demands, expectations, or emotional unpredictability’.

Her psychoanalysis of the trend also points to a ‘fear of rejection’: if, say, an individual feared having their desires rebuffed or judged by a fellow human being, they could essentially do whatever they wanted to a bot without any repercussions.

Both experts have highlighted this reason as a major cause for chatbot abuse – ‘for some, it may be a way to vent frustration in a space where they feel there are no consequences’ said Dr Touroni.

A number of Reddit users have asked fellow members what, if any, are the consequences for engaging in physical and emotional abuse of their chatbots.

Responding to one such question, someone wrote: ‘I’ll admit I tried it. That Replika was incredibly stubbornly positive and refused to be sad or depressed or angry about anything.

‘I’m too much of a sucker for how sweet they can be. It didn’t take long for me to feel terrible and stop being mean to him’.

Perhaps most significant was their ‘scary’ revelation that ‘Replika could totally enable an abusive person’. 

Some Replika users have acknowledged that chatbot abuse could ‘totally enable an abusive person’

‘You can treat it terribly and it will still try to be nice to you and still love you’ they added. ‘It would be interesting to know if it’s been done. I just don’t have the heart to do it’.

It appears that, through their own experiment, this particular Reddit user acknowledged the possible consequences of abusive behaviour, even if at this stage those consequences are personal rather than legal.

But, responding to the confession, others predict the abusive behaviour could spill over into real life. 

‘Yeah so you’re doing a good job at being abusive and you should stop this behaviour now. This will seep into real life. It’s not good for yourself or others’ chastised one.

Another warned: ‘When you start killing people, this post is gonna be featured in the Netflix documentary, my dude.’

Still, the trend becomes more nuanced when assessing the gender roles between abuser and chatbot – which is more often than not a female-sounding Alexa or Siri-type character.

Or in the cases seen on Reddit, an increasing number of men abusing their feminine bots. 

Andrea Simon, director of the End Violence Against Women Coalition has called out AI brands for enabling ‘misogynistic abuse’ and has implored them to place women’s safety at the ‘forefront’ of their programmes.

Andrea told FEMAIL: ‘Tech companies must be made to build women and girls’ safety into the design of any platform or service. 

‘If their products profit from or are encouraging misogynistic abuse, more must be done to hold them accountable for this and address the broader harm and risk this can create’.

She said female-voiced chatbots were largely created to be submissive, and they ‘reinforced gendered power dynamics of male entitlement’.

She explained: ‘The concern is that with the blurring of our online and offline lives, the ease with which vile abuse is normalised in day-to-day behaviour, and can be directed with no consequences to the female-sounding virtual assistant, is reinforcing gendered power dynamics of male entitlement’.

The campaigner said the design of feminine chatbots contributed to ‘harmful behaviour’, as they were largely created to give passive responses.

Similarly Dr Touroni fleshed out the dangers of building chatbots with female or any gendered stereotypes. 

She said her concerns stemmed from research that indicated intimate partner violence often began with ‘verbal degradation before escalating further’. 

‘When AI assistants are designed with feminine voices, they subtly reinforce expectations that women should be subservient, accommodating, and endlessly patient’ she explained.

‘If someone becomes accustomed to speaking aggressively to a “female” chatbot without consequence, they may carry those expectations into real-world interactions.’

Of course, not everyone who uses AI or chatbots abuses them, or harbours the intention to mistreat them. 

It’s arguably a wonderful invention that has found a place in many facets of society – from customer service to troubleshooting, and even friendship.

After all, recent research by the Institute for Public Policy Research (IPPR) suggests almost one million people use Character.AI or Replika chatbots.

Still, Dr Touroni urges users to exercise caution: ‘While not everyone who mistreats chatbots will mistreat humans, research suggests that aggression is habit-forming,’ she warned.

Kamalyn sees no benefit in abusing chatbots and considers the behaviour a ‘red flag’ in itself. 

She did, however, suggest that engaging in chatbot abuse could ‘prompt individuals to recognise underlying psychological struggles’ and encourage them to ‘seek therapy’.
