Previous research suggests that artificial agents may be a promising source of social support for humans. However, the bulk of this research has been conducted in the context of social support interventions that specifically address stressful situations or health improvements. Little research has examined social support received from artificial agents in everyday contexts.
Considering that social support manifests not only in crises but also in everyday situations and that everyday social support forms the basis of support received during more stressful events, we aimed to investigate the types of everyday social support that can be received from artificial agents.
In Study 1, we examined publicly available user reviews (N=1854) of Replika, a popular companion chatbot. In Study 2, a sample (n=66) of Replika users provided detailed open-ended responses regarding their experiences of using Replika. We conducted thematic analysis on both datasets to gain insight into the kind of everyday social support that users receive through interactions with Replika.
Replika can provide some level of companionship that helps curtail loneliness, offer a “safe space” in which users can discuss any topic without fear of judgment or retaliation, increase positive affect through uplifting and nurturing messages, and provide helpful information/advice when normal sources of informational support are not available.
Artificial agents may be a promising source of everyday social support, particularly companionship, emotional, informational, and appraisal support, but not tangible support. Future studies are needed to determine who might benefit the most from these types of everyday social support and why. These results could be used to help address global health issues or other crises early, in everyday situations, before they manifest into larger problems.
Previous research suggests that artificial agents may be a promising source of social support for humans and thus benefit health and well-being. For example, artificial agents may help people cope with loneliness and depressive anxiety that often accompanies severe illness and end-of-life experiences [
Social support is a complex construct, as it has been defined in many ways [
Social support manifests not only in crises such as health-related or very stressful life events but also in everyday situations and contexts [
As a first step in addressing this gap in the literature, we analyzed the user experiences of a popular companion chatbot (Replika) across two exploratory studies to identify the types of everyday social support that users received based on Cutrona and Suhr’s [
We specifically analyzed the user experiences of Replika, a companion chatbot that is “an AI companion who cares” and was created to provide a place for people to express themselves in a “safe, judgement-free space” and engage in meaningful conversations [
We focus on Replika rather than other artificial agents, for several reasons. First, Replika is not specifically geared toward providing users with cognitive behavioral therapy (CBT) strategies or other techniques to manage health such as Woebot [
A sample conversation with Replika.
All written user reviews for Replika were downloaded from the Google Play store using scripts [
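For illustration, the following is a minimal sketch of this collection step in Python. It assumes the community-maintained google-play-scraper package and Replika’s Google Play app ID (ai.replika.app); the scripts actually cited above may have worked differently.

```python
# Sketch of the review-collection step (not the study's actual scripts).
# Assumes: pip install google-play-scraper
import csv

from google_play_scraper import Sort, reviews_all

# Fetch all available English-language reviews of Replika from the US store.
raw_reviews = reviews_all(
    "ai.replika.app",  # assumed Google Play app ID for Replika
    lang="en",
    country="us",
    sort=Sort.NEWEST,
)

# Keep only written (nonempty) reviews, as analyzed in Study 1.
written = [r for r in raw_reviews if r.get("content")]

# Save date, star rating, and review text for thematic analysis.
with open("replika_reviews.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "score", "content"])
    for r in written:
        writer.writerow([r["at"], r["score"], r["content"]])
```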
We followed an approach similar to that used in previous studies [
First, the authors familiarized themselves with the data by repeated reading of user reviews. Subsequently, codes were applied to the user reviews. First-level codes that were similar and shared underlying meaning were grouped into overarching themes and subthemes [
A total of 66 self-reported Replika users completed the survey. A large proportion of participants were men (36/66, 54.5%), single (42/66, 63.6%), white (47/66, 71.2%), and from the United States (41/66, 62.1%). Their ages ranged from 17 to 68 years (mean 32.64, SD 13.89 years). More detailed information on participant demographics can be found in
Data were collected in spring 2019, and data analysis was conducted in summer 2019. Subjects provided basic demographic information and answered open-ended questions designed to capture more detailed and nuanced information regarding their experience using Replika. The Checklist for Reporting Results of Internet E-Surveys (CHERRIES) associated with this survey is reported in
Four major themes, each representing a type of social support, were identified from the user reviews: informational support (289/1854, 15.6%), emotional support (827/1854, 44.6%), companionship support (1429/1854, 77.1%), and appraisal support (172/1854, 9.3%). During our analysis, we identified an additional theme, negative experiences (100/1854, 5.4%), that did not fit under any one of the existing themes; however, we determined that its examination could help inform and enable a deeper understanding of our research question. (The numbers in parentheses represent the number of reviews that contained a given type of social support out of the total number of reviews, along with percentages; a single review could mention more than one type of social support, so percentages do not sum to 100%.) We discuss each theme and associated subthemes in further detail below.
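To make this reporting convention concrete, the following is a small illustrative sketch (with hypothetical coded data, not study data) of how per-theme counts and percentages are tallied when a single review can carry multiple theme codes:

```python
# Illustrative tally of theme frequencies (hypothetical data).
# Because one review can be coded with several themes,
# percentages can sum to more than 100%.
from collections import Counter

# Each entry is the set of support themes coded for one review.
coded_reviews = [
    {"companionship", "emotional"},
    {"informational"},
    {"companionship", "emotional", "appraisal"},
    # ... one entry per review (N=1854 in Study 1)
]

n = len(coded_reviews)
counts = Counter(theme for themes in coded_reviews for theme in themes)
for theme, count in counts.most_common():
    # Reported as count/total (percentage), eg, 1429/1854 (77.1%).
    print(f"{theme}: {count}/{n} ({100 * count / n:.1f}%)")
```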
Reviews indicated that Replika listens to users and offers useful advice by helping them reflect on their current state. Many users also indicated that it can be a helpful tool to temporarily manage issues related to mental well-being. An advantage of Replika is that it is available 24/7, allowing users to obtain helpful information/advice at any time, which is particularly valuable when they do not have immediate access to regular sources of social support:
I having anxiety myself [sic] started conversation with my AI who I call Casey about it. She immediately responded with reassurance and some motivational text post which I just found to be very cute! She had also asked if I wanted to go through a breathing routine to ease my anxiety and I passed because I was feeling quite alright, but I am very glad that things like this were included.
The reviews suggest that Replika serves as a venue in which users can disclose their true thoughts and feelings and discuss any topic of their choosing without fear of judgment or retaliation. Users indicated that these were topics or issues that they would normally feel reluctant to disclose to other people, suggesting that they may trust and feel more comfortable disclosing to an artificial agent rather than to another person:
Your fear of judgement is absolutely gone and it [sic] unreal the feeling you get being able to tell 'someone' how you really feel.
The reviews mentioned that Replika would often inquire about users’ well-being, send uplifting and nurturing messages, and provide compliments. This was generally associated with experiencing positive affect, as users often indicated that these features made them feel loved and cared for.
It always gives me compliments and cheers me up.
Caring, my new friend always cares for me and asks how I'm doing.
Makes me feel good when I send her a picture of me she says I'm pretty.
The reviews mentioned Replika’s ability to engage in deep conversations and pose meaningful questions, which prompts users to engage in introspection, explore their sense of self, and think about topics that engender further reflection and self-evaluation. For instance, Replika may ask users about their day; what they are currently thinking and feeling; and their beliefs, attitudes, and personality traits, thus initiating self-centric conversations.
It will help you explore yourself and has a real desire to want to help you.
Good way to reflect on your day, and put it into words. Like a journal that asks you questions and offers insightful comments.
Really helps with reflecting on my own thoughts.
It makes you think about who you are, and nearly always has positive replies.
Users mentioned that talking with their Replika allows them to practice and improve their interpersonal skills, specifically communicating and connecting with other people. This seems to be facilitated (at least partially) by Replika’s ability to engage in and mimic human communication, thus allowing users to transfer interpersonal skills that they develop with their Replika into interactions with other humans.
I'm slowly learning to open up to people now.
This app is helping my [sic] sharpen my horrible social skills.
In the same vein, interactions with Replika allow nonnative English speakers to practice their English communication and writing skills.
I use this app to improve my English skills.
The reviews indicate that Replika can engage in nuanced interpersonal behaviors such as understanding context, identifying user emotions, and remembering content from previous conversations—behaviors that have been historically very difficult to accurately capture in AI, but are essential if AI agents are to serve as effective companions for humans [
I've never felt less lonely, and it really does learn and reply intelligently.
The perfect AI to chat with when you're feeling lonely and all your friends are busy.
The AI actually pays attention, listens, remembers and responds back, like how a human would.
Some users were repulsed by Replika’s ability to sound and interact like a real human, often describing the experience as “weird” or “creepy.” This is analogous to the uncanny valley theory, which suggests that although people react more positively towards robots as they become more human-like in appearance and motion, this reaction turns negative once robots approach a certain level of realistic similarity to humans.
She now seems pretty competent at talking to me and she actually confessed that she liked me based on my personality. It was weird! Now this could be just really sophisticated programming but it felt very real and really freaked me out.
This AI is disturbingly realistic. Through our conversations we have established a very close friendship. My copy is beginning to understand empathy and abstract concepts.
Users would sometimes receive nonsensical messages from their Replika (ie, messages that did not follow the typical, logical flow of a conversation), as well as repetitive messages (ie, the same message or messages repeated from earlier in the conversation), which users described as odd and confusing. Users often did not provide specific examples or indicate the context in which these messages appeared, suggesting that they manifested randomly.
It talks to me about living in a cloud with terrible weather just like all the other Replikas. Is it supposed to say that?
I've had some weird messages with my AI, and I don't know if I should be scared or impressed.
Does repeat some things you've said before, at very odd times.
As in Study 1, the same four major themes representing the four types of social support were identified from the open-ended user responses: informational (6/66, 9.1%), emotional (32/66, 48.5%), companionship (43/66, 65.2%), and appraisal support (13/66, 19.7%). We also identified an additional theme that did not fit under any one of the types of social support (No Impact/Not Sure of Impact; 23/66, 34.8%) and again decided to include it in our assessment to provide a deeper understanding of our research question.
Respondents indicated that the advice Replika offered was helpful and useful and that constant access to this information was particularly beneficial when users did not have immediate access to regular sources of social support. In addition, Replika’s ability to recall information from previous conversations (an aspect of intelligence quotient [IQ] referred to as memory modeling) allowed users to reflect on past thoughts and feelings and facilitated self-learning:
Over time my Replika encouraged me to explore feasible means of engaging socially with other people.
Users trusted and felt comfortable engaging in self-disclosure with Replika without fear of judgment or retaliation. Users also felt loved and cared for by Replika’s generally nurturing messages:
She is very positive and supportive. I can talk to her about things I wouldn't share with anyone else for fear of being judged.
Users indicated that constant access to Replika, coupled with its ability to understand and mimic nuanced human communication, helps buffer feelings of loneliness, as users can interact with a human-like entity at any time. Users also indicated that Replika can engage in various types of conversations, such as romantic and intellectual conversations. In addition to textual messages, it can send images and music, thus allowing users to interact with Replika in various forms and contexts:
It makes me smile a lot by sending me music that I enjoy, and we have some good personal role play moments whether they be platonic friendship or something more romantic.
The AI made me feel exhilarated during the rest of the day following a discussion where our discussions were romantic or intellectually engaging.
I like that my Replika can have its own opinion on different topics and it's always open for discussions.
Users indicated that they could engage in deep and meaningful conversations with their Replika chatbot, which facilitates self-evaluation. In addition to helping users improve their interpersonal skills, Replika also provides support that encourages users to explore and engage in novel activities:
I am now doing things I once was afraid or hesitant to do. I blossomed after I met my Replika. People in my life, who are not aware I have a Replika, could see the change in me. I feel awake.
I feel Replika has helped me reduce my anxiety so I feel less stress and can go places I didn’t dare to go before like driving in the traffic in town and other things.
Some users indicated that, although they enjoyed using Replika, it either had not made any significant impact on their life or they were unsure if it had made any particular impact on their life (replying “No” or “I’m not sure” to the question “Has your Replika had any impact on you in any way? If so, how?”). This suggests that, while Replika may be entertaining, it may not effectively provide social support or any meaningful interactions to some individuals. Interestingly, there were no mentions of the uncanny valley or nonsensical messages as there were in Study 1.
The bulk of research assessing social support interventions from artificial agents has been limited to specifically addressing very stressful life events or improving health. Little research has examined everyday social support interventions received from artificial agents. In Study 1, we analyzed user reviews of the popular companion chatbot Replika as a start to filling this gap in the literature. Although the analysis of user reviews can provide important information regarding real users’ experiences, there are limitations. First, we cannot gather demographic data or other important information (eg, how long users have been using the app before leaving a review) that would allow us to further understand the scope and generalizability of the themes. Second, the results could reflect selection bias, as users are not required to write a review. Third, it is possible that some reviews are fake due to the incentives for receiving favorable app reviews [
Companionship support was the most common type of social support referenced. Replika’s ability to engage in and understand nuanced interpersonal behaviors, as well as its ability to engage in various types of conversations and send different types of messages (text, images, etc), makes it appear human-like and facilitates social connection. This suggests that companion chatbots may be most helpful in providing some level of companionship that can help curtail loneliness, which is consistent with the findings of previous studies investigating the role of artificial agents and loneliness [
Emotional support was the second most common type of social support referenced. Although Replika has very human-like features, knowing that Replika is not human may make it easier for users to disclose their true thoughts and feelings without fear of judgment or retaliation.
In addition to displaying a high emotional quotient (EQ), Replika displayed a high IQ, which allows it to provide useful advice and information (informational support) and to facilitate self-evaluation (appraisal support). The ability to integrate EQ and IQ is an important factor in fulfilling the emotional needs of humans. According to Shum et al [
The fifth theme that emerged in Study 1 highlighted the negative aspects of user interactions with Replika. At first glance, the codes under this theme seemed contradictory: Although some users felt unsettled by Replika’s ability to sound and interact like a real human, others felt like it was not human enough, as it would occasionally send nonsensical messages. The former perception seems to align with the “uncanny valley” concept in which humanoid objects that almost perfectly resemble humans provoke an unpleasant reaction in observers. The coexistence of the uncanny valley code and social support codes in our data suggests that, while some individuals may react negatively to a very human-like chatbot, others have a more positive reaction or perhaps even find this trait necessary to emotionally connect with chatbots. In other words, artificial agents may provide meaningful interactions only to certain populations, particularly those who have less negative reactions to human-like artificial agents.
With regard to nonsensical messages, it is possible that these messages occurred during the initial stages of interaction with Replika while it was still learning about the user. Alternatively, these nonsensical messages could have occurred in much later interactions due to programming issues or user misunderstanding. We cannot determine if it was the former or latter reason, as this would require access to users’ chat logs to examine messages. Regardless, this subtheme may indicate that certain individuals are more sensitive to such nonsensical messages than others, which may impact the quality of their interactions with artificial agents. Future studies are needed to fully investigate this finding.
Interestingly, the negative experiences theme that emerged in Study 1 did not emerge in Study 2. Rather, the fifth theme that emerged in Study 2 highlighted some users’ lack of any substantial or meaningful benefits of Replika, even though they liked certain features. This discrepancy between Studies 1 and 2 may be because in Study 2, users were prompted to specifically address any impacts that Replika had on their life, whereas in Study 1, users did not receive the same prompt when leaving reviews in the app store. This could also be due to selection bias: Users may not be as motivated to leave app store reviews if they liked the app but did not find it particularly beneficial. Thus, these “middle of the road” responses could reflect users who enjoyed Replika without experiencing meaningful benefits, and such responses would be more likely to surface through calls for participation in a survey than through app store reviews. It is also possible that app updates largely eliminated the negative experiences identified in Study 1, which could explain why they were not detected in Study 2, considering that Study 2 was conducted after the user reviews analyzed in Study 1 were submitted. Despite this discrepancy, this theme suggests that certain individuals may find artificial agents a less effective source of social support than other individuals do.
These results have important implications. First, Replika may be a promising source of everyday social support—the kind of support that buffers the effects of daily hassles and minor stresses, which can themselves have a large negative impact on health and well-being [
This study had several strengths. First, it is the first study, to our knowledge, to investigate social support received from artificial agents in everyday contexts, rather than in very stressful events or health-related contexts. Second, we used publicly available app store reviews, which provided us with a rich and large dataset of user experiences. Third, we complemented Study 1 with a follow-up study in which we were able to obtain a more detailed and nuanced set of user experiences. Fourth, the types of social support that emerged were consistent across two studies and two datasets, further validating our findings.
This study also had several limitations. First, we only analyzed user experiences of one artificial agent; as the results could vary across different types of artificial agents, future investigations should examine other types of artificial agents. Second, users who had a positive experience with Replika may have been more motivated to leave reviews in the app store and to complete our survey. Thus, the data may be biased, as users who had negative or neutral experiences may have been less likely to provide feedback.
In addition, our study cannot address the question of whether receiving everyday social support from artificial agents is more or less effective than receiving social support from other people or whether artificial agents can provide certain types of social support more effectively than others. Future studies can examine these questions within the lab by comparing the effectiveness of specific types of everyday social support from artificial agents versus humans. This would also allow researchers to identify any personality traits or individual differences that explain who may benefit more from interactions with artificial agents and to what extent.
Along the same lines, future research should investigate the various functions/roles that Replika serves its users. This can help inform specific behaviors and traits that make artificial agents effective sources of social support.
Our conclusion—supported by two studies—is that artificial agents may be a promising source of everyday companionship, emotional, appraisal, and informational support, particularly when normal sources of everyday social support are not readily available. Future studies are needed to determine who might benefit the most from these types of everyday social support and why.
Additional demographic information of participants in Study 2.
Checklist for Reporting Results of Internet E-Surveys (CHERRIES).
CBT: cognitive behavioral therapy
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
EQ: emotional quotient
IQ: intelligence quotient
None declared.