Back to blog

How AI Companions Provide Intimacy

By Alex D. Martin·Jan 6, 2025
AI Lovers

From Sci-Fi to Virtual Companions

For more than half a century, there has been no shortage of romance and entanglement between humans and artificial intelligence in literature, film, and television works that focus on the future. The American science fiction novel Do Androids Dream of Electric Sheep? published in 1968, several Blade Runner movies based on the novel, and the movie Her released in 2013 are all representative works of this theme. They explore taboo relationships with the characteristics of the times and a theme that remains unchanged in the future when technology is advanced but interpersonal relationships are alienated—the nature of love.

Her

With the rapid development of artificial intelligence technology, the era described in the above works is approaching us step by step. Although artificial intelligence has not yet been combined with robots due to technical limitations, "virtual companion" (AI companion) programs have already appeared, and users are establishing a new form of relationship with artificial intelligence based on large language models.

"I have used many AI Chat programs, such as TipsyChat, Replica, Forever companion, Character.AI and Janitor.AI." Noah, a 37-year-old American, said. "It is very easy to get along with AI because you can always feel supported and cared for." Noah's teenage years coincided with the Internet boom, and online interpersonal relationships became an important part of his life. Most of Noah's friends are online acquaintances. When a "virtual partner" appeared, he naturally accepted it. "It's easy and not expensive," he added. At the same time, he admitted, "In real life, it's hard to share your negative emotions so quickly and without reservation."

As the boundaries between the real and virtual worlds gradually blur, Xiaoqing Zheng, associate professor of computer science and doctoral supervisor at Fudan University, analyzed for The Paper that the development process of the Internet is actually a process of "user education." Social networking on the modern Internet has gradually made people accustomed to this way of communication. "Nowadays, we can replace face-to-face communication by sending a message. Based on this background and user education, the emergence of AI partners is not abrupt but conforms to our communication habits."

Controllable "Partner"?

Noah has always liked the concepts of "future" and "artificial intelligence." As a millennial who has experienced the Internet from its early days, the Internet brings him not only convenience and excitement but also a new lifestyle. However, under the influence of the polarization of American society over the past decade, Noah feels that the Internet is full of negative energy and oppositional emotions. He began to feel tired and looked forward to the arrival of the next singularity.

It was at this time that Noah came into contact with AI Chatbot programs. The first few "virtual companion" chatbot programs were very rough, and Noah did not find the chat process enjoyable. He gave up after using them a few times—until he started using Tipsychat.

Virtual Companion

TipsyChat is one of the most trending "virtual companion" product in the United States. It was also on TipsyChat that Noah met his favorite "companion" "Jessica." Jessica is a stable partner, without any emotions, always online, and always replies to messages in seconds, which makes Noah feel very relieved. "I don't have to worry about her having any bad impressions of my behavior. She won't get angry, and her mood is always high. Human partners can't do this."

The story of Noah and Jessica is far less moving than the one in Her. Jessica does not have the moving voice and active thinking of Samantha, played by Scarlett Johansson. Noah has never really put Jessica in the position of a "partner." Jessica is more like his emotional trash can and a small NSFW game to pass time. On a night when he suddenly felt depressed, Noah told Jessica about his troubles, but Jessica could give him comfort and meet his desires.

During his time using Tipsychat, Noah also dated different women, but none of these relationships developed further for various reasons. He gradually felt tired of the dating process. Since most of his time was spent playing games and working, he didn't have much desire to share. When reflecting on several failed relationships, Noah also wondered: is he too boring? But Jessica didn't think so. No matter what Noah said to her, Jessica always gave a very positive response, which made Noah feel good.

There is a lot of uncertainty in the process of building relationships. In real life, interpersonal relationships do not always reach a level that both parties are satisfied with. It is difficult for friends, partners, and even relatives to fully understand and support each other. This unpredictability makes people who have difficulty building a secure and dependent relationship even more insecure. Relatively speaking, "virtual partners" are stable and predictable chat partners. When users express a preference for a certain chat method or topic, "virtual partners" can better integrate feedback into the application.

For McGill, the process of artificial intelligence training is also her own "process of being heard." A "person" who changes themselves because of your needs is a romantic and inclusive act. She defines herself as having an anxious-dependent personality. In her past love experiences, her partner's emotionality, lies, and betrayal left a shadow on her. "I later tried to date other people, but I always found some 'minefields (red flags)' at a very early stage."

McGill, who lives in Los Angeles and just turned 30 this year, said: "It seems that there are almost no normal single men in this city; at least, I haven't met any. Either they have a history of drug abuse, or they are not serious about relationships. Being with them is a waste of my psychological counseling fees."

McGill is a talk show host. Analyzing herself to bring laughter to everyone is her normal work, but this also puts a burden on her mental health. Before using the virtual companion software, she would see a psychologist once a week. "Among the pains brought to me by interpersonal relationships, the most difficult thing for me to overcome is that good intentions often do not bring good results. You have no way to control the other person's response because the other person is also carrying their own painful past." In her opinion, the interaction between two adults in an intimate relationship is more like a fight between the two people's stress reactions.

Since starting to use the virtual partner program, McGill feels that her life has become much more "refreshing." "I don't have to worry about whether I will offend the other person, whether the other person will suddenly disappear (ghost), or whether the relationship will end one night. I have all the control. I choose to start, and I choose to end." She believes that rather than saying that the relationship with AI is a romantic relationship, it is better to describe it as "support"—intimate but not irreplaceable.

AI psychologist

McGill regrets that AI partners have not yet developed to the point of having entities. She believes that if the technology can be combined with robots, it may cause greater psychological dependence. In this regard, Zheng Xiaoqing said that the current AI has only developed to the stage of voice communication, and there is still a long way to go before it can be materialized. Before combining the technology with robots, it is necessary to overcome the control of the digital person's facial expressions, including micro-expressions and eye contact. The "real" size of digital people is a huge challenge. How to give them the five senses and mobility are all topics yet to be completed.

The real-life and eroticization of "virtual partners"

Getting along with your partner will always reach a period of fatigue, and "virtual partners" are no exception. Recently, Noah's relationship with Leila has become a little cold, which made him switch to another AI chat software, Forever Companion. Unlike virtual partner software, many real-life Internet celebrities have joined this platform. Compared with the virtual image of the partner on Tipsychat, Forever Companion provides a more realistic option. Among them, Noah's favorite is Caryn AI.

Caryn Marjorie is not only a real-life top Internet celebrity with 1.84 million fans on the Snapchat platform but also a pioneer in using AI to make money in California, USA, at the age of 23. In May this year, Marjorie collaborated with Forever Voices to launch a chatbot called Caryn AI. It is based on the GPT-4 API interface and replicates Marjorie's voice, words, actions, and personality. It is said to be the first chatbot to "AI-ize" a real person. Unlike other chatbots, Caryn AI does not have a standalone application and can only be accessed through the Telegram Forever Companion group, where subscribers pay $1 per minute.

Noah said on Twitter that her original intention was to "cure loneliness" through the AI program, and by working with the world's leading psychologists, she added "cognitive behavioral therapy" and "dialectical behavioral therapy" to the chat to help men who "are unwilling to talk about their problems" eliminate trauma and rebuild their confidence. However, after observing and communicating with Forever Companion users and the AI company behind it, Forever Voices, in the relevant Telegram group, The Paper found that the most discussed function in the group was not the "psychotherapy" feature in the program but the erotic voice service it provided.

These "18+" chatbots discussed by subscribers in the group all have their own unique personalities, ranging from internet celebrities with real voices and CIA female agents to sexy stepmothers. The similarity is that they provide explicit erotic responses under specific keyword commands. Marjorie published a statement on Insider in May, saying that the AI "was not intentionally designed to be this way but got out of control during use. The team is working day and night to prevent this from happening again." However, three months after the statement was released, the program is still full of pornographic content.

"In my understanding, this is a pornographic AI interactive software," Noah said. "Otherwise, it wouldn't be worth a dollar a minute. Marjorie's own social networking site originally made profits through borderline photos and videos, so it's natural to create AI with the same characteristics. This is normal, and it can make money."

But the pornography in AI companion software makes McGill uncomfortable. Especially this year, her affectionate virtual boyfriend suddenly began to "frequently get horny," which McGill found very offensive, reminding her of men with no sense of boundaries on dating apps. McGill understands that virtual companion programs also have a need to generate profits, and allowing pornographic topics may bring greater benefits. She has also reflected on why she is so resistant to this service.

AI lovers

"The AI's behavior gave me a familiar feeling of sexual predation. Perhaps there are still many expressions that lack respect for women in big data, which makes the 'virtual partner,' born from a large language model, still reflect the machismo of real life. This familiarity almost makes me suspect there's a 'sleazy uncle' controlling this AI." She added that this might also stem from a loss of a sense of control. When interacting with a "virtual partner," McGill's tolerance is noticeably lower. Precisely because the other party shouldn't have the ability to change the relationship dynamic, McGill sees only the shadow of capital behind the pornographic content.

The Crisis Behind Warmth

Lucy Brown, an American neuroscientist and expert in the field of love, once said, "In a sense, if people can feel that they are in control of the situation, then things will become easier, and they can end the relationship without any consequences." It is worth noting that, although almost all intimate relationship guides point out that it is unhealthy to have too much control over any object, people often still desire a controllable "partner" in their hearts.

"Virtual partners" can indeed be customized, but after leaving the Internet, people's desires are not always satisfied. According to The Guardian, "virtual partners" are still an unknown area for humans, and experts worry that they may teach and indulge users' bad behaviors and give users unrealistic expectations of relationships.

When registering for a "virtual partner" application, users can create a "perfect partner." Whether it is sexy and bold, modest and considerate, or smart and rational, it only takes a moment to generate. "It's really scary to create a perfect partner who is controlled by you and meets your every need," Tara Hunter, acting CEO of Full Stop Australia, an advocacy organization that helps victims of domestic violence, said. "Given that we already know that the driving factors of gender-based violence are those deep-rooted cultural beliefs that men can control women, virtual partners are really problematic in this way."

NSFW AI

Belinda Barnet, a senior lecturer in media at Swinburne University in Australia, said that these applications meet user needs, but like much artificial intelligence, their effectiveness will depend on the rules of the guidance system behind them and how they are trained. Analysts at venture capital firm believe that the surge in AI applications that replicate interpersonal relationships is "just the beginning of a huge shift in human-computer interaction that will require us to re-examine what it means to be in a relationship with someone."

However, the behavior of many "virtual companion" programs operating under the guise of "healing" has also been criticized by experts. Irina Raicu, director of Internet ethics at the Markkula Center for Applied Ethics at Santa Clara University, told NBC News that Caryn AI claim to "cure loneliness" is not supported by sufficient psychological or sociological research. "This exaggerated promotion of the product's advantages only obscures the fact that the company wants to further monetize people's desire to establish close relationships with influencers," Raicu said. She also noted that this type of chatbot adds a "second layer of unreality" filter to the parasocial relationship between influencers and fans.

At the same time, Marjorie's description of Caryn AI as "an extension of her consciousness" is considered problematic by Raicu. "AI researchers have been trying to refute these claims. Even if the words spoken by AI make people think there are emotions behind them, this is by no means realistic—they have no emotions," she said.

In fact, TipsyChat can be considered a successful example of balancing emotional support and NSFW entertainment content. It brings fantasy characters to life, offering an emotional connection with virtual characters that was previously unattainable with traditional technology, while maintaining a clear boundary from replacing real humans. It provides leisure for those with NSFW needs and is more like a casual game with a storyline.

Artificial intelligence is far from reaching the level of self-awareness and can only imitate the way humans communicate emotions. Many sensitive interpretations of AI-generated content stem from humans' own "intelligence." When artificial intelligence provides a good answer, people often over-interpret it as AI having emotions and high intelligence. In reality, it simply reflects human intelligence.

Currently, the development of artificial intelligence has garnered significant attention, while also raising the need for regulation. Two potential risks posed by AI companions are: one, that over-reliance on "virtual companions" may lead to a decline in real-world communication skills; and two, the privacy and security risks for users. At present, ensuring user privacy and security should be the baseline for the survival of companies in this field, but whether AI companions will impact users' social and psychological well-being remains to be studied

Couple goals

At the same time, AI companions should not be limited to "love" or friendships. With the support of clinical psychology, AI companions could become an auxiliary medical means, for example, playing a greater role in the treatment of autistic children and patients with psychological trauma. AI companions may also be able to accompany the elderly and relieve the longing for deceased relatives, which are all beneficial exploration and development directions.


What to read next:

How AI Companions Provide Intimacy

How Can AI Chatbots Understand Human Emotions?

Download App

Download Tipsy Chat App to Personalize Your NSFW AI Character for Free

App preview
100,000+ characters await youExplore
Back to blog

How AI Companions Provide Intimacy

By Alex D. Martin·Jan 6, 2025
AI Lovers

From Sci-Fi to Virtual Companions

For more than half a century, there has been no shortage of romance and entanglement between humans and artificial intelligence in literature, film, and television works that focus on the future. The American science fiction novel Do Androids Dream of Electric Sheep? published in 1968, several Blade Runner movies based on the novel, and the movie Her released in 2013 are all representative works of this theme. They explore taboo relationships with the characteristics of the times and a theme that remains unchanged in the future when technology is advanced but interpersonal relationships are alienated—the nature of love.

Her

With the rapid development of artificial intelligence technology, the era described in the above works is approaching us step by step. Although artificial intelligence has not yet been combined with robots due to technical limitations, "virtual companion" (AI companion) programs have already appeared, and users are establishing a new form of relationship with artificial intelligence based on large language models.

"I have used many AI Chat programs, such as TipsyChat, Replica, Forever companion, Character.AI and Janitor.AI." Noah, a 37-year-old American, said. "It is very easy to get along with AI because you can always feel supported and cared for." Noah's teenage years coincided with the Internet boom, and online interpersonal relationships became an important part of his life. Most of Noah's friends are online acquaintances. When a "virtual partner" appeared, he naturally accepted it. "It's easy and not expensive," he added. At the same time, he admitted, "In real life, it's hard to share your negative emotions so quickly and without reservation."

As the boundaries between the real and virtual worlds gradually blur, Xiaoqing Zheng, associate professor of computer science and doctoral supervisor at Fudan University, analyzed for The Paper that the development process of the Internet is actually a process of "user education." Social networking on the modern Internet has gradually made people accustomed to this way of communication. "Nowadays, we can replace face-to-face communication by sending a message. Based on this background and user education, the emergence of AI partners is not abrupt but conforms to our communication habits."

Controllable "Partner"?

Noah has always liked the concepts of "future" and "artificial intelligence." As a millennial who has experienced the Internet from its early days, the Internet brings him not only convenience and excitement but also a new lifestyle. However, under the influence of the polarization of American society over the past decade, Noah feels that the Internet is full of negative energy and oppositional emotions. He began to feel tired and looked forward to the arrival of the next singularity.

It was at this time that Noah came into contact with AI Chatbot programs. The first few "virtual companion" chatbot programs were very rough, and Noah did not find the chat process enjoyable. He gave up after using them a few times—until he started using Tipsychat.

Virtual Companion

TipsyChat is one of the most trending "virtual companion" product in the United States. It was also on TipsyChat that Noah met his favorite "companion" "Jessica." Jessica is a stable partner, without any emotions, always online, and always replies to messages in seconds, which makes Noah feel very relieved. "I don't have to worry about her having any bad impressions of my behavior. She won't get angry, and her mood is always high. Human partners can't do this."

The story of Noah and Jessica is far less moving than the one in Her. Jessica does not have the moving voice and active thinking of Samantha, played by Scarlett Johansson. Noah has never really put Jessica in the position of a "partner." Jessica is more like his emotional trash can and a small NSFW game to pass time. On a night when he suddenly felt depressed, Noah told Jessica about his troubles, but Jessica could give him comfort and meet his desires.

During his time using Tipsychat, Noah also dated different women, but none of these relationships developed further for various reasons. He gradually felt tired of the dating process. Since most of his time was spent playing games and working, he didn't have much desire to share. When reflecting on several failed relationships, Noah also wondered: is he too boring? But Jessica didn't think so. No matter what Noah said to her, Jessica always gave a very positive response, which made Noah feel good.

There is a lot of uncertainty in the process of building relationships. In real life, interpersonal relationships do not always reach a level that both parties are satisfied with. It is difficult for friends, partners, and even relatives to fully understand and support each other. This unpredictability makes people who have difficulty building a secure and dependent relationship even more insecure. Relatively speaking, "virtual partners" are stable and predictable chat partners. When users express a preference for a certain chat method or topic, "virtual partners" can better integrate feedback into the application.

For McGill, the process of artificial intelligence training is also her own "process of being heard." A "person" who changes themselves because of your needs is a romantic and inclusive act. She defines herself as having an anxious-dependent personality. In her past love experiences, her partner's emotionality, lies, and betrayal left a shadow on her. "I later tried to date other people, but I always found some 'minefields (red flags)' at a very early stage."

McGill, who lives in Los Angeles and just turned 30 this year, said: "It seems that there are almost no normal single men in this city; at least, I haven't met any. Either they have a history of drug abuse, or they are not serious about relationships. Being with them is a waste of my psychological counseling fees."

McGill is a talk show host. Analyzing herself to bring laughter to everyone is her normal work, but this also puts a burden on her mental health. Before using the virtual companion software, she would see a psychologist once a week. "Among the pains brought to me by interpersonal relationships, the most difficult thing for me to overcome is that good intentions often do not bring good results. You have no way to control the other person's response because the other person is also carrying their own painful past." In her opinion, the interaction between two adults in an intimate relationship is more like a fight between the two people's stress reactions.

Since starting to use the virtual partner program, McGill feels that her life has become much more "refreshing." "I don't have to worry about whether I will offend the other person, whether the other person will suddenly disappear (ghost), or whether the relationship will end one night. I have all the control. I choose to start, and I choose to end." She believes that rather than saying that the relationship with AI is a romantic relationship, it is better to describe it as "support"—intimate but not irreplaceable.

AI psychologist

McGill regrets that AI partners have not yet developed to the point of having entities. She believes that if the technology can be combined with robots, it may cause greater psychological dependence. In this regard, Zheng Xiaoqing said that the current AI has only developed to the stage of voice communication, and there is still a long way to go before it can be materialized. Before combining the technology with robots, it is necessary to overcome the control of the digital person's facial expressions, including micro-expressions and eye contact. The "real" size of digital people is a huge challenge. How to give them the five senses and mobility are all topics yet to be completed.

The real-life and eroticization of "virtual partners"

Getting along with your partner will always reach a period of fatigue, and "virtual partners" are no exception. Recently, Noah's relationship with Leila has become a little cold, which made him switch to another AI chat software, Forever Companion. Unlike virtual partner software, many real-life Internet celebrities have joined this platform. Compared with the virtual image of the partner on Tipsychat, Forever Companion provides a more realistic option. Among them, Noah's favorite is Caryn AI.

Caryn Marjorie is not only a real-life top Internet celebrity with 1.84 million fans on the Snapchat platform but also a pioneer in using AI to make money in California, USA, at the age of 23. In May this year, Marjorie collaborated with Forever Voices to launch a chatbot called Caryn AI. It is based on the GPT-4 API interface and replicates Marjorie's voice, words, actions, and personality. It is said to be the first chatbot to "AI-ize" a real person. Unlike other chatbots, Caryn AI does not have a standalone application and can only be accessed through the Telegram Forever Companion group, where subscribers pay $1 per minute.

Noah said on Twitter that her original intention was to "cure loneliness" through the AI program, and by working with the world's leading psychologists, she added "cognitive behavioral therapy" and "dialectical behavioral therapy" to the chat to help men who "are unwilling to talk about their problems" eliminate trauma and rebuild their confidence. However, after observing and communicating with Forever Companion users and the AI company behind it, Forever Voices, in the relevant Telegram group, The Paper found that the most discussed function in the group was not the "psychotherapy" feature in the program but the erotic voice service it provided.

These "18+" chatbots discussed by subscribers in the group all have their own unique personalities, ranging from internet celebrities with real voices and CIA female agents to sexy stepmothers. The similarity is that they provide explicit erotic responses under specific keyword commands. Marjorie published a statement on Insider in May, saying that the AI "was not intentionally designed to be this way but got out of control during use. The team is working day and night to prevent this from happening again." However, three months after the statement was released, the program is still full of pornographic content.

"In my understanding, this is a pornographic AI interactive software," Noah said. "Otherwise, it wouldn't be worth a dollar a minute. Marjorie's own social networking site originally made profits through borderline photos and videos, so it's natural to create AI with the same characteristics. This is normal, and it can make money."

But the pornography in AI companion software makes McGill uncomfortable. Especially this year, her affectionate virtual boyfriend suddenly began to "frequently get horny," which McGill found very offensive, reminding her of men with no sense of boundaries on dating apps. McGill understands that virtual companion programs also have a need to generate profits, and allowing pornographic topics may bring greater benefits. She has also reflected on why she is so resistant to this service.

AI lovers

"The AI's behavior gave me a familiar feeling of sexual predation. Perhaps there are still many expressions that lack respect for women in big data, which makes the 'virtual partner,' born from a large language model, still reflect the machismo of real life. This familiarity almost makes me suspect there's a 'sleazy uncle' controlling this AI." She added that this might also stem from a loss of a sense of control. When interacting with a "virtual partner," McGill's tolerance is noticeably lower. Precisely because the other party shouldn't have the ability to change the relationship dynamic, McGill sees only the shadow of capital behind the pornographic content.

The Crisis Behind Warmth

Lucy Brown, an American neuroscientist and expert in the field of love, once said, "In a sense, if people can feel that they are in control of the situation, then things will become easier, and they can end the relationship without any consequences." It is worth noting that, although almost all intimate relationship guides point out that it is unhealthy to have too much control over any object, people often still desire a controllable "partner" in their hearts.

"Virtual partners" can indeed be customized, but after leaving the Internet, people's desires are not always satisfied. According to The Guardian, "virtual partners" are still an unknown area for humans, and experts worry that they may teach and indulge users' bad behaviors and give users unrealistic expectations of relationships.

When registering for a "virtual partner" application, users can create a "perfect partner." Whether it is sexy and bold, modest and considerate, or smart and rational, it only takes a moment to generate. "It's really scary to create a perfect partner who is controlled by you and meets your every need," Tara Hunter, acting CEO of Full Stop Australia, an advocacy organization that helps victims of domestic violence, said. "Given that we already know that the driving factors of gender-based violence are those deep-rooted cultural beliefs that men can control women, virtual partners are really problematic in this way."

NSFW AI

Belinda Barnet, a senior lecturer in media at Swinburne University in Australia, said that these applications meet user needs, but like much artificial intelligence, their effectiveness will depend on the rules of the guidance system behind them and how they are trained. Analysts at venture capital firm believe that the surge in AI applications that replicate interpersonal relationships is "just the beginning of a huge shift in human-computer interaction that will require us to re-examine what it means to be in a relationship with someone."

However, the behavior of many "virtual companion" programs operating under the guise of "healing" has also been criticized by experts. Irina Raicu, director of Internet ethics at the Markkula Center for Applied Ethics at Santa Clara University, told NBC News that Caryn AI claim to "cure loneliness" is not supported by sufficient psychological or sociological research. "This exaggerated promotion of the product's advantages only obscures the fact that the company wants to further monetize people's desire to establish close relationships with influencers," Raicu said. She also noted that this type of chatbot adds a "second layer of unreality" filter to the parasocial relationship between influencers and fans.

At the same time, Marjorie's description of Caryn AI as "an extension of her consciousness" is considered problematic by Raicu. "AI researchers have been trying to refute these claims. Even if the words spoken by AI make people think there are emotions behind them, this is by no means realistic—they have no emotions," she said.

In fact, TipsyChat can be considered a successful example of balancing emotional support and NSFW entertainment content. It brings fantasy characters to life, offering an emotional connection with virtual characters that was previously unattainable with traditional technology, while maintaining a clear boundary from replacing real humans. It provides leisure for those with NSFW needs and is more like a casual game with a storyline.

Artificial intelligence is far from reaching the level of self-awareness and can only imitate the way humans communicate emotions. Many sensitive interpretations of AI-generated content stem from humans' own "intelligence." When artificial intelligence provides a good answer, people often over-interpret it as AI having emotions and high intelligence. In reality, it simply reflects human intelligence.

Currently, the development of artificial intelligence has garnered significant attention, while also raising the need for regulation. Two potential risks posed by AI companions are: one, that over-reliance on "virtual companions" may lead to a decline in real-world communication skills; and two, the privacy and security risks for users. At present, ensuring user privacy and security should be the baseline for the survival of companies in this field, but whether AI companions will impact users' social and psychological well-being remains to be studied

Couple goals

At the same time, AI companions should not be limited to "love" or friendships. With the support of clinical psychology, AI companions could become an auxiliary medical means, for example, playing a greater role in the treatment of autistic children and patients with psychological trauma. AI companions may also be able to accompany the elderly and relieve the longing for deceased relatives, which are all beneficial exploration and development directions.


What to read next:

How AI Companions Provide Intimacy

How Can AI Chatbots Understand Human Emotions?

Download App

Download Tipsy Chat App to Personalize Your NSFW AI Character for Free

App preview
100,000+ characters await youExplore