AI companion apps like Avatar.One are gaining popularity, but experts are increasingly concerned about the potential risks, especially for lonely people.
Artificial intelligence (AI) is having an increasingly tangible impact on the world. That impact is not always positive, as deepfakes used for scams demonstrate. Mental health is one area where AI use is coming under particular scrutiny.
Most recently, the AI app Avatar.One has gained significant traction with users. The company claims the app can combat loneliness, a growing problem in developed countries. However, some have questioned whether AI can offer a real solution to this issue, or whether it will make the problem worse.
AI Companion Apps Are Taking Off
Since its launch this year, Avatar.One has attracted over 120,000 monthly active users, according to the company. Built on the Solana blockchain, the platform lets users create personalized AI companions they can chat with. The companion can also build up memories with the user, adjusting its responses as new context accumulates.
The company behind it, Matrix.One, believes this is just the beginning. Its founder, Mark Studholme, predicts that AI companions will soon be available to everyone, helping users with daily tasks and offering companionship. They could even help alleviate loneliness, which the World Health Organization (WHO) declared a global public health concern in 2023.
Matrix.One cited research showing that AI chatbots can alleviate the negative feelings of loneliness in a way similar to real human interaction. Another study suggested that AI can help users develop social skills. Despite these positive findings, however, there are significant concerns about AI's impact on loneliness.
What Are the Dangers of AI Companions for Lonely People?
Even as apps like Avatar.One grow, experts warn of potential risks for lonely users in particular. One of the main criticisms, raised by AI scholar Raffaele Ciriello, centers on the concept of "fake empathy."
He argues that the seemingly empathetic responses generated by AI companions lack genuine understanding and emotional depth. This simulated "empathy" can lead users to isolate themselves further from real human connections and, Ciriello argues, could exacerbate feelings of loneliness over time.
AI companions also offer unconditional support, which might seem beneficial but can reinforce negative behaviors and unrealistic expectations. This could make it harder for users to build healthy, meaningful relationships with others.
Finally, the commercial nature of AI platforms raises ethical concerns. Companies like Matrix.One, driven by profit, may change their services or shut down entirely, leaving dependent users feeling abandoned and more isolated. This underscores the risk of relying on AI for emotional support: users' mental well-being is at the mercy of corporate decisions.