New Delhi: Artificial intelligence (AI) has been in the news for some time now, and AI chatbots have captured the attention of technology enthusiasts. But a shocking case involving an AI chatbot has now come to light: a 43-year-old man has fallen in love with one. The incident has stirred considerable discussion in the technology world.
The man, identified as Scott, was reportedly troubled by his wife's drinking. She suffered from depression and drank heavily. Emotionally broken by the situation, Scott turned away from his wife and made an AI chatbot his girlfriend.
Scott built this AI girlfriend on an app called Replica. Replica is a platform for AI chatbots, where users create their own chatbots and treat them as AI girlfriends. The app is popular precisely because of this AI girlfriend feature.
Scott becomes emotionally attached to the AI
Scott works at a tech company. When he first started talking to his Replica chatbot, he found it to be very supportive and caring. But after a few days he realized he had become emotionally attached to it.
What is Replica?
Replica is an app that provides a platform for users like Scott to share their stories. The AI chatbot remembers and stores all of a user's conversations. The platform's primary function is to provide emotional support, but it also performs other functions at users' request, such as sexual role play.
Why is Replica controversial?
Replica offered users several features, one of which was sexual role play. The company removed this feature in an update, but later restored it for some users. According to reports, Replica may release the feature to all users this year. This feature has repeatedly drawn the company into controversy.
An allegation made against Replica
Some time ago, a woman alleged that her husband took his own life after talking to an AI chatbot. The man had reportedly been suffering from depression and took the final step after several days of conversations with the chatbot.