This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
Future development of electronic health (eHealth) programs (automated Web-based health interventions) will be furthered if program design can be based on knowledge of eHealth’s working mechanisms. A promising and pragmatic method for exploring potential working mechanisms is the qualitative interview study, in which eHealth working mechanisms can be explored through the perspective of the program user. Qualitative interview studies are promising because they are suited to exploring what is yet unknown, building new knowledge, and constructing theory. They are also pragmatic, as the development of eHealth programs often entails user interviews for applied purposes (eg, getting feedback for program improvement or identifying barriers to implementation). By capitalizing on these existing (applied) user interviews to also pursue (basic) research questions of how such programs work, the knowledge base of eHealth’s working mechanisms can grow quickly. To be useful, such interview studies need to be of sufficient quality, which entails that the interviews should generate enough relevant, high-quality data on the research question (ie,
Building the next generation of automated electronic health (eHealth) programs will require a shift of attention from the performance of individual programs to a joint effort of understanding eHealth’s working mechanisms [
Instead, the development of eHealth programs often relies on rather static traditional behavior change theories [
eHealth’s working mechanisms can be studied using various methods, but a promising and pragmatic avenue of investigation is the qualitative interview, that is, “professional conversations (...) where knowledge is constructed in the interaction between the interviewer and the interviewee (...) about a theme of mutual interest” [
The qualitative interview is also a pragmatic research method, as many researchers already conduct interviews with program users as part of an applied research goal (developing or implementing an intervention). In the process of conducting interviews with program users, a researcher may become intrigued by a more basic research question and may consider the pragmatic solution of pursuing both the applied and the basic research goal in the same interviews by simply adding questions to the existing interview guide. We believe that such studies mixing basic and applied research goals have the potential to become an important asset to the field, by accumulating knowledge on more general issues that may help us understand how eHealth therapy works.
However, to become such an asset, the interviews conducted in these studies should provide what in qualitative methodology is known as
The reflections that are presented in this viewpoint paper arose from some of the authors’ experiences with a specific interview study [
We opted for trying to improve the interview method; however, we found no guidelines within the field of eHealth for how to conduct high-quality qualitative interview studies on potential eHealth working mechanisms. Therefore, we started looking more closely at the interviews we had conducted, asking ourselves what had gone wrong. This process led to the identification of a handful of problems that we believed were likely to have contributed to the difficulties in getting rich data on how the participants related to the program. As we started defining these problems, we discovered that we had also encountered several of them in other eHealth studies we had been involved in [
We wanted our experiences to be of benefit to other researchers with similar agendas and interests, and we therefore sought to describe the problems we had encountered in a way that would maximize their generalizability. Thus, through discussion among ourselves and with other researchers, we conceptualized 5 interview challenges: achieving a joint understanding of the interview topic, keeping participants from straying off the focus of enquiry, aiding recall of specific program experiences, avoiding negative influence of the social interview situation, and structuring the dual-aim interview. Having identified the challenges, we consulted the literature on qualitative methodology to identify methodological tools to counteract each challenge.
Returning to the study that had started this process [
In short, although it seemed a pragmatic solution to use already-planned interviews to pursue the answer to a basic eHealth research question, we experienced that getting rich data on the basic research question was challenging. In the absence of guidelines for conducting high-quality qualitative interviews specifically adapted to the field of eHealth, the process we entered into led to an enhanced methodological awareness and specific methodological tools for increasing study quality. The main focus of this paper is to share the identified challenges and tools with the research community. However, before doing so, we will offer what we consider to be a handy heuristic for understanding some of these methodological challenges: the
We suggest that a person’s interaction with any health intervention can be visualized in terms of a triangle, which includes the individual help seeker, the intervention, and the behavior change processes (
However, interventions may differ according to how much the 2 interacting parties—the person and the intervention—influence the interactional content and the interactional processes. In the case of psychotherapy, both the client and the therapist highly influence both components of the interaction. Taking another example, a person reading a self-help book is also interacting with a health intervention: things also
Considering the working mechanisms of an eHealth program, many programs will influence both the interactional content and the interactional processes. As with a self-help book, the interactional content will usually be decided to a large extent by the program. Moreover, just like a self-help book, the program is a
The invisible interaction is a useful heuristic when considering the challenges of interview studies for exploring eHealth’s working mechanisms. We previously stated that an asset of qualitative interviews is their potential to explore eHealth working mechanisms from the program user’s perspective. However, from this perspective, part of the program’s working mechanisms (the interactional processes) may be invisible to the participant, unless she or he purposefully directs her or his attention toward them. In other words, being largely invisible, the interactional processes may not be part of the conscious experience that the participant is ready to share in an interview. This may create or contribute to certain challenges in exploring eHealth working mechanisms through interviews. We will now present 5 such challenges and suggest methodological tools to counteract them.
Working mechanisms of a behavior change intervention.
Working mechanisms of an automated electronic health intervention.
When a researcher sets out to explore a potential eHealth working mechanism in an interview, it may be difficult to achieve a joint understanding of the interview topic together with the participant. For example, as mentioned previously, in the study that was the starting point for this paper, the researchers were interested in understanding how the participants related to the program [
Failure to get rich data on a research question may indicate a marginal phenomenon—or that the interviewer is failing to communicate the focus of enquiry in a way that facilitates joint understanding with the participant. We believe it is a truism that experience is multifaceted and that an experience can be described from many perspectives. For example, a client may describe a therapy session from a factual perspective of when and where it took place, from an experiential perspective of his or her emotions before, during, and after the session, from a historical perspective of the session as a stage in his or her spiritual development, and so on. If the interviewer’s questions are mostly descriptive, there may be a scarcity of cues concerning which perspective to assume, leaving the decision up to the participant—and the participant’s choice may not be the researcher’s choice. This may be especially challenging in studies on eHealth working mechanisms, as the interactional processes may not be part of the participant’s conscious experience. Therefore, descriptive questions asking for the participant’s program experiences will perhaps not cause him or her to talk about the (invisible) person-program interaction but rather about the program as a thing with a content. On the other side of the conversation, the interviewer may fear that questions aimed more directly at the focus of enquiry will put words in the participant’s mouth and disqualify any subsequent answer.
An interviewer can use several methodological tools to foster a joint understanding of the interview topic with the participant. One such tool is vignettes: vivid, exemplifying prose stories that guide the conversation toward a particular aspect of the participant’s experience [
Another and more direct way of fostering a joint understanding of the interview topic is to involve the participants as coresearchers, or to use epistemic interviewing [
A final tool to clarify and exhaust the interview topic is to ensure the possibility of conducting follow-up interviews [
Coresearcher design and vignettes foster clearer communication, but they may also threaten the study’s validity if the researcher holds on to his or her initial assumptions about the studied process, failing to acknowledge unexpected perspectives. To ensure that these tools strengthen and not hamper the quality of the study, the researcher should adopt what in psychotherapy is known as the
To allow time for joint exploration of the person-program interaction, it is necessary to limit the interview time spent on matters that are not at the core of the research question. Returning to
If the participants continuously stray off the focus of enquiry by spending time on contextual aspects, it can threaten data richness. Aspects that are contextual to the researcher may be aspects the participant wants to share or aspects she or he believes to be important to the investigation. The interviewer may try to lead the conversation back onto the focus of enquiry, but the participant may return to the contextual aspects, turning the interview into a battle over topic. Apart from being unpleasant for both, the result may be scant data on the focus of enquiry. When the focus of enquiry is potential eHealth working mechanisms, the invisible interaction may add to the challenge of straying off the topic. As the participant may be largely unaware of the interactional processes, she or he will instead talk about the aspects of which she or he is aware: the change processes (in isolation from the program) or the program (in isolation from the change processes). Information about the behavior change and about the program is certainly relevant contextual information, but talking about these aspects in isolation should not dominate the interview.
The interview conversation can be kept from straying off the research topic by using in-interview questionnaires to keep contextual answers short. The questionnaire can include questions addressing contextual issues (eg,
Sometimes participants may not recall program experiences in sufficient detail to answer the interviewer’s questions. In the study that inspired this paper [
Recalling specific program experiences may be challenging: although participants may be active program users at the time of the interview, they are not engaging with the program at that particular moment (unless the interview is combined with a
There are, however, methodological tools to remedy the problem of recall in the interview situation: 1 such tool is to get
Another tool for aiding recall is asking memory-facilitating interview questions. If program experiences have not been encoded as specific episodic memories, the interviewer’s phrasing of questions becomes increasingly important, as the words she or he uses will influence the participant’s memory-retrieval process by serving as memory cues [
As a final note on program recall, it may not be necessary for the participant to remember any particular program session at all; the researcher must consider what level of detail is necessary to answer the research questions meaningfully. For some research questions, the sum of program experiences may be more important than any particular experience. If so, using the interview to discuss the participant’s overall experience with the program can be more meaningful than facilitating recall of specific sessions [
All interviews are also social situations, and aspects of the social situation will influence the data [
Gender stereotypes are not the only potential social disturbances in an interview—other social roles may be prominent, and within eHealth research, the interviewer may be particularly prone to be perceived as both an interviewer and a clinician, or as both an interviewer and a developer. Perceiving the interviewer as also a clinician may cause the participant to think of him or her as a therapeutic interactional partner and to be less attentive to the therapeutic agency of the eHealth program. Similarly, perceiving the interviewer as also a program developer may highlight the program as a thing made by someone else, making it more difficult to see the program’s role as a therapeutic agent—or cause the participant to self-censor negative experiences, as 1 of the authors experienced in 2 different studies [
The potentially negative influence of the social interview situation can be counteracted with methodological tools. Acknowledging the potential influence of roles and stereotypes, both before and after the interviews, can minimize their negative effect. Before an interview, researchers should reflect on potentially salient social aspects and whether something should be done about them [
Finally, it is important to acknowledge that although the social interview situation may sometimes be a negative influence on the data, it can also be an asset. Through the interviewer’s reflexivity, the social situation may generate insights that would otherwise be missed. The interview in which the interviewer had rushed through the questions as she feared being labeled an
It was mentioned in the introduction that qualitative interviews are pragmatic for exploring potential eHealth working mechanisms, as the development or implementation of eHealth programs often entails user interviews anyway. Therefore, researchers who are interested in exploring potential eHealth working mechanisms may do so through existing interviews with applied purposes. However, when applied and basic research goals are mixed like this in the same interview study, it may create an additional challenge in getting rich data on the basic research question. In the study that inspired this paper [
Mixing applied and basic research aims can be problematic as different aims may require different interviewing modes. For the interviewer, changing from an
Interviews with both applied and basic research aims may serve both aims through topical blocks and clear introductions. The transition can be facilitated by structuring the interview in topical blocks [
Conducting qualitative interviews is a promising and pragmatic approach for identifying the working mechanisms of automated eHealth programs. Existing user interviews for applied purposes can be used to also pursue basic research questions on eHealth working mechanisms. Researchers planning to conduct user interviews for applied purposes would be wise to secure the possibility of pursuing research questions concerning potential eHealth working mechanisms by including this purpose in the study information provided to ethics boards and prospective participants. However, getting rich data on eHealth working mechanisms through qualitative interviews may be challenging. In this paper, we suggest that challenges may arise partly due to what we have described as the
eHealth: electronic health
This research is funded by grant no. 228158/H10 from the Research Council of Norway. The authors wish to thank Filip Drozd, Caroline L Brandt, and Maja Wilhelmsen for contributing to this paper with their experiences with qualitative eHealth research. Their input provided valuable corrections to the authors’ initial presumptions and helped make this paper applicable to a larger part of the field. The authors also wish to thank Professor Azy Barak for his encouragement and valuable insights regarding the need for theoretical work within the field of eHealth. The authors additionally wish to thank Anne-Lise Middelthon for inspiring insights and advice on qualitative methodology. The second author of this paper, Ayna B Johansen, tragically passed away before this paper was published. We are grateful for her enthusiasm for this paper and for how she helped shape its content. She is remembered and missed.
None declared.