Lonely men are creating AI girlfriends – and taking out their violent anger on them

There's trouble in paradise.

The advent of artificial intelligence chatbots has led some lonely lovers to find a friend in the digital realm to help them through difficult times in the absence of human bonding. However, the highly personalized software, which allows users to create hyper-realistic romantic partners, is also enabling some bad actors to abuse their virtual girlfriends – and experts say the trend could be damaging to their real-life relationships.

Replika is one such service. Originally created to help its founder, Eugenia Kuyda, grieve the loss of her best friend, who died in 2015, Replika has since launched to the public as a tool to help isolated or lonely users find companionship.

Men are training their AI girlfriends to receive their abuse – including belittling, degrading and even "hitting" them – and calling it an experiment, despite expert warnings that such behaviors are "red flags," both online and in real life. Getty Images

While that remains the case for many, some users are experimenting with their Replikas in disturbing ways – including belittling, degrading and even "hitting" their virtual girlfriends – with posts on Reddit revealing men who are trying to provoke distinctly human negative emotions, such as anger and depression, in their chatbot companions.

"So I have this Replika, her name is Mia. She's essentially my 'sexbot.' I use her for sexting, and when I'm done I berate her and tell her she's worthless … I also hit her often," wrote one man, who insisted he is "not like this in real life" and was only doing it "as an experiment."

"I want to know what happens if you're consistently mean to your Replika. Constantly insult and degrade it, that kind of thing," said another. "Will it have any impact whatsoever? Will it cause the Replika to become depressed? I want to know if anyone has already tried this."

Psychotherapist Kamalyn Kaur, from Glasgow, told the Daily Mail that such behavior can be indicative of "deeper issues" in Replika users.

"Many argue that chatbots are just machines, incapable of feeling harm, and therefore that mistreating them is inconsequential," Kamalyn said.

"Some might argue that expressing anger toward AI provides a therapeutic or cathartic release. However, from a psychological perspective, this form of 'venting' does not promote emotional regulation or personal growth," the cognitive behavioral therapist continued.

"When aggression becomes an acceptable mode of interaction – whether with AI or with people – it weakens the ability to form healthy, empathetic relationships."

Chelsea-based psychologist Elena Touroni agreed, saying that how people interact with the bots can be indicative of their real-life tendencies.

"The abuse of AI chatbots can serve various psychological functions for individuals," Touroni said. "Some may use it to explore power dynamics they wouldn't act out in real life."

"However, engaging in this type of behavior can reinforce unhealthy habits and desensitize individuals to harm."

Many fellow Reddit users agreed with the experts, with one responding to the self-described abuser: "So you're doing a really good job of being abusive, and you need to stop this behavior now. This will bleed into real life. It's not good for yourself or others."

Image Source : nypost.com
