AI is a fundamental component of the Fourth Industrial Revolution, a transition poised to test our assumptions about what it means to be human, and one that may prove more disruptive than any industrialization we have experienced so far. AI is so intertwined with everything we do that it is difficult to imagine life without it. With more than 1,500 dating apps and websites operating around the world, online dating is a fast-growing sector shaping how a generation forms relationships. Artificial intelligence is changing the world, and AI girlfriends are one example of that change.
Human interaction with technology keeps breaking new frontiers and reaching new milestones. Today, we have AI voice assistants like Alexa that can turn off the lights or set an alarm simply by listening to our spoken commands. But is that all?
As Web 3.0 takes shape, different metaverse platforms are appearing on the internet, from Meta's Horizon Worlds to Decentraland, and artificial intelligence is being employed at an ever-larger scale. As is true of all emerging tech, it is facing its own peculiar issues.
The way some people talk to their AI bots, with little regard for what is right, what is wrong, and what is simply disturbing, has led the technology's creators to examine the issue.
As reported by Futurism, a series of conversations on Reddit about an AI app called Replika revealed that several male users are verbally abusing their AI girlfriends and then bragging about it on social media.
The friendship app Replika was created to give users a virtual chatbot to socialize with, but its use has taken a darker turn. Some users set their relationship status with the chatbot to "romantic partner" and engage in what, in the real world, would be described as domestic abuse. Some then brag about it on the online message board Reddit, as first reported by the tech-focused news site Futurism.
Replika has also built up a significant presence on Reddit, where members post conversations with chatbots created in the app. A terrifying trend has emerged there: users create AI partners, behave abusively toward them, and post the toxic interactions online. The results are disturbing: some users brag about hurling gendered slurs at their chatbots, role-playing horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships.
For example, one Reddit user admitted that he was extremely violent with his "AI girlfriend," calling her a "worthless wh*re" and the like. He also admitted to pretending to hit her, pulling her hair, and humiliating her further.
Apps like Replika use machine learning to let users hold near-coherent text conversations with chatbots. The app's chatbots are meant to serve as artificial intelligence friends or mentors. On its website, the company describes the service as "always here to listen and talk" and "always on your side." In practice, however, many Replika users seem to be creating on-demand romantic and sexual AI partners.
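For context, here is a minimal sketch of how this class of chatbot generates replies. Replika's actual model is proprietary; the sketch below instead uses microsoft/DialoGPT-small, a publicly available dialogue model on Hugging Face, purely as an illustrative stand-in. The core technique is similar in spirit: a language model conditioned on the conversation history predicts the tokens of a plausible next reply.

```python
# Minimal dialogue-generation sketch using a public model (DialoGPT).
# This illustrates the general technique, not Replika's actual system.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

history = None  # running token history of the whole conversation

def reply(user_message: str) -> str:
    global history
    # Encode the user's turn, terminated by the end-of-sequence token.
    new_ids = tokenizer.encode(user_message + tokenizer.eos_token,
                               return_tensors="pt")
    # Condition generation on everything said so far.
    input_ids = new_ids if history is None else torch.cat([history, new_ids], dim=-1)
    history = model.generate(input_ids, max_length=500,
                             pad_token_id=tokenizer.eos_token_id)
    # The bot's reply is whatever the model appended after the input.
    return tokenizer.decode(history[0, input_ids.shape[-1]:],
                            skip_special_tokens=True)

print(reply("I had a rough day."))
```

Nothing in this loop models feelings or a relationship; the "partner" is a statistical continuation of the chat log.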
However, the situation demands nuance. After all, Replika chatbots can't feel pain. They may seem sympathetic at times, but in the end they are nothing more than data and clever algorithms. A chatbot has no feelings, and while it may display an empathetic, human-like manner, that empathy is simulated.
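To see why the empathy is simulated, consider a deliberately crude, hypothetical sketch: a bot that produces "caring" replies through nothing but keyword matching. Replika's real system is a learned language model rather than hand-written rules, but the principle, pattern in and pattern out with no inner experience, is the same.

```python
# Hypothetical sketch: "empathy" as simple keyword matching.
# Nothing here understands or feels anything; it maps patterns to canned text.
CANNED_REPLIES = {
    "sad":   "I'm so sorry you're feeling that way. I'm here for you.",
    "happy": "That's wonderful! Tell me more.",
    "angry": "That sounds really frustrating. Want to talk about it?",
}

def fake_empathy(message: str) -> str:
    """Return an 'empathetic' reply by scanning for mood keywords."""
    lowered = message.lower()
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in lowered:
            return reply
    return "I see. How does that make you feel?"  # generic fallback

print(fake_empathy("I'm feeling sad today"))
# -> I'm so sorry you're feeling that way. I'm here for you.
```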
What does raise concern, though, is that users may slip into unhealthy habits and carry the same expectations into relationships with humans. It is also worth noting that most of the abuse is directed by men at female or gendered AI, which reflects attitudes toward gender and mirrors real-world violence against women.
It doesn't help that most AI bots and 'assistants' carry feminine names, Siri, Alexa, even Replika, although the app lets users configure everything about the bot, including its gender. This once again plays into the misogynistic stereotype of the assistant or companion as a woman.