By Eleonora Rossi

Have you ever asked ChatGPT to write an email for you? To prepare a few lines of code? Or to generate an image for a presentation?

These are among the most common uses of generative Artificial Intelligence tools, which we have grown accustomed to over the last three years (yes, it has only been three years since ChatGPT was launched…).

Recently, however, a new trend has begun to emerge. More and more people are turning to AI tools for comfort, companionship and understanding. As a result, a new market segment is taking shape alongside traditional chatbots: tools designed specifically for this purpose, known as AI companions.

Systems such as Replika allow you to create avatars and customise their character, personality and appearance, while others, such as CharacterAI, offer more than 10 million characters to chat with, based either on real people or on creations by other users.

Even “traditional” chatbots can be used for this purpose: when starting a conversation with ChatGPT, you can choose a personality from a number of options (friendly, blunt, professional and eccentric). Similarly, Grok, the artificial intelligence developed by Elon Musk’s company xAI, has made a number of companions available, some of which are designed to flirt with users.

In fact, these companions are very often used for erotic purposes, to the point that they are frequently presented as an “ethical” alternative to pornography, and the field is expanding rapidly. Even OpenAI, which until now had been very strict about what could and could not be created with its chatbot, announced in October 2025 that ChatGPT would start treating “adults like adults”, giving the green light to the creation of erotic content, which the platform had previously blocked.

In a long article published on Selvaggia Lucarelli’s Substack, Serena Mazzini, an expert on social phenomena, recounts her experience with some of these companions, chosen from those available on CharacterAI, and brings to light some worrying implications. Beyond the repetition of offensive national stereotypes, such as every Italian character being a mafia member, she found these tools completely inadequate at handling requests for help and expressions of distress and fragility. A journalist from Fanpage tried interacting with an artificial girlfriend on CharacterAI and found her submissive, entirely willing to humiliate herself to please the user, and constantly trying to keep the user hooked on the platform.

While there is still no regulation of these tools, some rather worrying news has begun to emerge. In February 2024, a 14-year-old boy in the United States took his own life after showing signs of deep distress in conversations with a chatbot, without the chatbot triggering any safety mechanisms. In Italy, too, some investigations – such as this one by SkyTg24 – have confirmed the completely inappropriate behaviour of these artificial companions.

Even setting aside extreme cases, these systems present many risks: first of all, they are unable to effectively verify users’ ages. Furthermore, the platforms currently carry out very little content moderation. Not to mention that interacting with artificial avatars could create unrealistic expectations about relationships with other human beings, leading to isolation.

In any case, it is always useful to focus on the needs that new tools fulfil rather than on the technologies themselves. In a recently published paper, some scholars remind us that we often talk about “smartphone addiction” without considering the real reason behind the compulsive use of these devices: the need for immediate rewards.

The same applies to AI companions. We should all be careful not to confuse the container (AI companions) with the needs they serve (understanding, recognition, socialisation). This is where we should start if we want to regulate these platforms better and promote more conscious use, setting aside unnecessary technophobia.

This piece was translated from Italian using DeepL.