Many people across the world are turning to AI Chatbots for emotional support. Mary O’Leary explores what it means for human connection when everyone leans on Gen-AI to do the work for them.
In the last decade, Generative Artificial Intelligence (Gen-AI) has seemingly taken over daily life, used for everything from complex maths equations to simple everyday questions. It feels as though, overnight, this tool has become completely unavoidable. Even those who actively avoid it encounter its work through search engines, on social media apps and in automated AI summaries across various websites.
It should come as no surprise that this constant exposure has made Gen-AI a common tool for many people. Unlike a search engine, however, Gen-AI can simulate conversations, allowing many to use it as a stand-in for genuine human connection. One example is the growing rise of ‘Character AI’, both a specific platform and a genre of Gen-AI simulation in which a chatbot is given a particular personality designed to replicate fictional characters or real people (living or dead). These characters can be assigned specific parts to play within the simulation, taking the role of a boyfriend, girlfriend, spouse or friend. Users can even have the AI replicate emotions such as jealousy, infatuation and devotion.
In using AI this way, many people grow accustomed to flirtation and connection with their chatbot. Often this flirtation exists entirely apart from character roleplay, with the AI itself serving as the partner in a simulated relationship. Like ‘Character AI’, AI boyfriends and girlfriends are becoming increasingly popular, with online spaces such as Reddit and Discord hosting communities for those in AI relationships.
Unfortunately, these communities can act as echo chambers, further isolating people from real-world connection. This isolation, whether encouraged by these communities or not, can also lead many to turn to Gen-AI for physical intimacy. Over the past few years AI has also broken into the sex industry, with an alarming rise in AI-generated pornography and deepfake images.
Just as with ‘Character AI’, imagery of this kind can be tailored to the wants of the person it is serving. Body parts and actions can be picked and chosen without any regard for the impact of the images being created. Deepfakes, which superimpose a real person’s face onto these images, have also become increasingly popular, and most countries have limited legislation to combat their proliferation.
Using Gen-AI in this way could also lead to a much higher risk of dependence on such images and on the sex industry as a whole. The Internet Watch Foundation (IWF) has warned that “AI-generated imagery of child sexual abuse has progressed at such an accelerated rate that the IWF is now seeing the first realistic examples of AI videos depicting the sexual abuse of children.” The normalisation of Gen-AI as a substitute for intimacy could exacerbate this rise in AI pornographic imagery and encourage further isolation in downright dangerous online communities which circulate and distribute these sorts of images.
If left unchecked in matters of human connection, both emotional and physical, AI will inevitably help sustain these exploitative industries without regard for their effect on real people. This is not to say that the use of relationship-based AI inevitably leads to deepfake or child abuse imagery on this scale, but it does simulate the same kind of emotional connection on which these genres of AI depend. Though it may seem insignificant because of the private, online nature of Gen-AI chatbots, using AI in this way can fuel a rise in sexual violence against the people depicted in the images it generates.
If this application of AI were to become commonplace, it is safe to assume that users will unwittingly develop unhealthy emotional and physical expectations of their partners and of other people in general. The real world, and the human relationships it offers, will seem imperfect compared with the faultless nature of simulated AI interactions.
Normalising conversation-based AI would also erode our ability to understand human conversation, since body language and tone of voice are absent from AI interactions. The way we converse and understand one another runs much deeper than words alone. When AI strips conversation down to text, the person using it will gradually lose some of their ability to read the subtler aspects of human communication.
It is obvious that those who fall victim to AI intimacy are people looking for genuine human connection, especially in an increasingly online world. It is no wonder that, alongside the rise in romantic AI relationships, AI friendships and AI therapists are also growing. Artificial Intelligence is simply the cheapest and most effortless way for people to gain the connection they so desire.
What many of these people do not realise is that AI is in no way a true replacement for human intelligence and emotional connection. Instead, it desensitises a person to the inhuman nature of their relationship with it, making them more likely to turn to AI images of physical intimacy as well. And so the rise of illegal and harmful deepfakes continues, with the dangerous potential for their widespread use to become normalised.
The good news is that AI is not without its critics, with many opposing its use within artistic and creative fields as well as its damaging effect on the environment. Like most new inventions, it is widely unregulated and misused. The issue now is discerning where and how restrictions should be enforced.
Intimacy, like all other forms of human bonding, cannot be so readily replaced by technology. It is what stimulates our empathy and understanding of one another, as well as our ability to reinforce our real-world communities. Technology and AI, no matter how realistic they may seem, are merely an imitation of this very human understanding and community. Ironically, it is this imitation of human intimacy that poses a direct threat to the very connection so many yearn for.
