This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
Electronic health screening tools for primary care present an opportunity to go beyond data collection to provide education and feedback to adolescents in order to motivate behavior change. However, there is limited research to guide feedback message development.
The aim of this study was to explore youth perceptions of and preferences for receiving personalized feedback for multiple health risk behaviors and reinforcement for health promoting behaviors from an electronic health screening tool for primary care settings, using qualitative methodology.
In total, 31 adolescents aged 13-18 years completed the screening tool, received the electronic feedback, and subsequently participated in individual, semistructured, qualitative interviews lasting approximately 60 min. Participants were queried about their overall impressions of the tool, perceptions regarding various types of feedback messages, and additional features that would help motivate health behavior change. Using thematic analysis, interview transcripts were coded to identify common themes expressed across participants.
Overall, the tool was well-received by participants who perceived it as a way to enhance—but not replace—their interactions with providers. They appreciated receiving nonjudgmental feedback from the tool and responded positively to information regarding the consequences of behaviors, comparisons with peer norms and health guidelines, tips for behavior change, and reinforcement of healthy choices. A small but noteworthy minority of participants dismissed the peer norms as not real or relevant and national guidelines as not valid or reasonable. When prompted for possible adaptations to the tool, adolescents expressed interest in receiving follow-up information, setting health-related goals, tracking their behaviors over time, and communicating with providers electronically between appointments.
Adolescents in this qualitative study desired feedback that validates their healthy behavior choices and supports them as independent decision makers by neutrally presenting health information, facilitating goal setting, and offering ongoing technological supports.
Most of the leading contributors to adolescent morbidity and mortality are preventable health risk behaviors such as substance use, unprotected sexual activity, and unsafe driving practices [
One strategy for increasing the frequency of screening and follow-up counseling for adolescent health risk behaviors in primary care is the use of electronic screening tools [
Whereas some electronic screening tools have incorporated feedback, educational components, or brief interventions [
In this study, we used qualitative methodology to better understand youth perceptions of and preferences for receiving feedback by evaluating their responses to an electronic screening tool for primary care (the “Check Yourself” tool) that incorporates immediate, personalized feedback for multiple health risk behaviors and reinforcement for health promoting behaviors. Adopting a similar approach to previous qualitative investigations of electronic health tools [
Participants were recruited primarily from an adolescent-focused, academic clinic in Seattle, WA using a purposive sampling approach [
Adolescents were eligible to participate if they were aged 13-18 years (inclusive) and could read and speak English. For participants aged 13-17 years, both youth assent and parental consent were obtained. For participants aged 18 years, youth consent was obtained, and parental consent was not required. Recruitment continued until theoretical saturation was reached (see “Data analysis” section below).
Adolescents first completed the Check Yourself tool and subsequently participated in individual qualitative interviews. The individual interview format was used to protect confidentiality and to allow interviewers time to explore the complexities of each participant’s perspectives. There were three interviewers in total (GZ, KK, and one other interviewer), all of whom completed training in qualitative methods before conducting interviews. Interviews occurred primarily in a private room in the same building as the adolescent clinic used for recruitment, at a time convenient for participants. When logistical constraints did not allow for the interview to occur at the clinic, interviewers met participants in a convenient and private location (eg, a private meeting room at a library). A majority of the interviews lasted approximately 60 min (range: 45-90 min). Interviews occurred from February to July 2015. Interviewers were gender-matched with participants as much as possible to promote comfort for participants in discussing sensitive health topics. Before each interview began, interviewers explained to participants that they did not personally develop the tool and that negative feedback was as valuable as positive feedback. Participants were oriented in this way to ensure that they did not suppress negative feedback to please the interviewer. Additionally, the interview guide contained several questions designed to elicit adolescents’ suggestions regarding how the tool could be improved. Adolescents received US $30 for participating. All study procedures were approved by the institutional review board of Seattle Children’s Research Institute.
Example screenshots from the Check Yourself tool.
The Check Yourself tool is a tablet-based screening instrument developed by researchers at the University of Washington (LR and CM) in conjunction with the digital health company Shift Health [
Semistructured interviews were conducted using an interview guide that covered three areas (
Overall impressions of the Check Yourself tool
What was your overall experience with the tool?
When you used the tool, did you find any parts confusing? Which parts?
Perceptions of motivational feedback
Which health messages made the biggest impression on you?
How did using the tool make you feel about your own health?
Was there anything you thought you would want to change after using the tool?
Which messages made you feel motivated to change?
What additional information could have helped you feel more motivated?
Desired expansions of tool
Are there other things you think we should add to the tool that would be helpful?
Do you think it would be helpful if the tool sent you follow-up resources and websites? What would be most helpful to send?
Interviews were audio recorded and transcribed. Interview transcripts were uploaded to Dedoose, a Web-based qualitative analysis platform [
Following Hill et al [
In total, 26 participants were recruited from the waiting area of the clinic, 5 through word of mouth, and 1 via flyer. A total of 32 interviews were conducted, though 1 interview was excluded from analysis because the participant provided predominantly single-word responses that did not enhance understanding of the youth’s perspective. The final sample therefore consisted of 31 adolescents. Participants’ demographic data are displayed in
Participant demographic data (n=31).
Characteristic | Value
Age (years), mean (SDa) | 15.2 (1.4)
Age (years), n (%) |
13 | 5 (16)
14 | 5 (16)
15 | 6 (19)
16 | 10 (32)
17 | 4 (13)
18 | 1 (3)
Sex, n (%) |
Male | 13 (42)
Female | 18 (58)
Ethnicity, n (%) |
Hispanic | 6 (19)
Non-Hispanic | 25 (81)
Race, n (%) |
African American | 2 (7)
Asian | 3 (10)
White | 20 (65)
Multiracial | 2 (7)
Race not specified | 4 (13)
SD: standard deviation.
Themes are presented below and grouped by the three topic areas of the interview guide: (1) overall impressions of the Check Yourself tool, (2) reactions to the personalized feedback, and (3) desired expansions of the tool. These topic areas are not themes themselves but rather broader categories of themes.
In general, participants reported that the Check Yourself tool was easy to use and that colorful images and interactive content increased their interest in the health information that was presented:
Teenagers, we like color...Bright colors make it fun, make it not like you’re filling out paperwork at a hospital or a clinic.
The true and false questions at the end, or at least it was when I did it. Which was a good memory thing because when I went through it...there I had to answer those things so I actually remembered them.
All participants indicated that they would prefer the Check Yourself tool to paper-and-pencil screening. Some adolescents particularly appreciated that questions were presented one at a time such that responses to previous questions were not visible. They felt this feature would help conceal their responses from family members in a waiting area:
I didn’t want anyone else to see [my responses] because my sister was sitting here, and my mom was sitting here...but this way like I said before was kind of like, you can answer the question and quickly move on and no one will see your answer if you do it fast enough.
Some participants also noted that not displaying responses to previous questions would make it less upsetting to endorse risk behaviors, because they did not have to continue seeing their responses while waiting to see their provider.
Adolescents described distinct ways that they felt the Check Yourself tool could enhance their interactions with doctors. Some thought the tool could function as a “warm up before the main event” of an appointment by priming them to identify their questions and concerns to discuss during the visit. Many participants found it easier and less awkward to disclose health risk behaviors on the tool than face-to-face, and perceived the tool as helpful in reducing providers’ need to ask patients about sensitive topics during the appointment:
I like the idea of having [the tool] because a lot of people, like I know a lot of times I would go to the doctor ready to say something and then get scared and not say it. This way, it’s a little bit impersonal, but at least I’m getting it down and so the doctor, I wouldn’t have to make eye contact with him, but he will know because I put it in there.
Adolescents commented on several aspects of the personalized feedback provided by the Check Yourself tool. In general, participants stated that the tool provided new information including education, tips for health improvement, and information on how their behavior compared with peer norms and national guidelines for adolescent health. Nearly all youth appreciated the presentation of information in a nonjudgmental manner:
It wasn’t super forceful. It was kind of like here’s an amount that you eat, and here’s the amount that people your age normally eat, and here’s the recommended amount...I like how it wasn’t super forceful like black screen, red words, eat more fruits and vegetables.
The most commented-on feedback components were the graphs, which compared the participants’ own responses with peer norms and national health guidelines. Many reported that they found these comparisons motivating:
When I saw the graph of the physical activity, I was way below, and I was just thinking, “Wow, I really should do something about that.”
However, some youth felt that the comparisons were not helpful. The most common concern regarding comparisons to peer norms was that the statistics were not accurate or relevant as they were not consistent with what participants encountered in their close friend groups:
When I was looking at some of the numbers I was like, ‘This doesn’t seem right’ because—or at least my high school it might seem different from what I see around me. I felt like some of those [numbers] might have been a little low, the numbers of the average teen.
Even if no one else in the school [used marijuana] but my friends, it wouldn’t exactly matter to me because people I know do—that’s what’s relevant.
Regarding comparisons with national health guidelines, some adolescents disagreed that recommendations for certain behaviors (eg, screen time) were any healthier than what they were already doing. Participants typically voiced this objection if they had not yet experienced any consequences from their behaviors:
Personally, I’m happy with what I do right now. I have a little bit too much screen time, but I sleep eight hours every night, I do cross country, track, and winter running club...The consequences that [the tool] mentioned were loss of sleep, less time to do athletic stuff, and things like that which were all things that it previously said I was great with.
Sometimes, even if participants recognized that following the health guidelines would be beneficial, they perceived the recommendations as “unreasonable” in the context of modern adolescent lifestyles. This objection was voiced primarily in relation to screen time and exercise recommendations (<2 hours/day and 1 hour, 7 days/week, respectively). Lastly, some participants found the comparisons too stark and suggested that it would be helpful to add validating phrases to soften the message:
The slides that were specifically around nutrition and exercise, especially when you’re not meeting it, it’s very jarring to be like, “You’re not meeting it,” and there’s no kind of soft landing that you get when you talk to a person and they’re like, “Well, you’re doing a pretty good job,” and some of the little changes you need to make to get there.
In addition to normative comparisons, many participants reported that they were motivated by learning about the benefits of healthy behaviors:
When I did see that cause and effect thing it kind of made me think, “Well, that effect would be nice.”
Some others felt that learning about the consequences of risk behaviors was more motivating and specifically requested more alarming statistics. It is worth noting, however, that those endorsing this view tended not to be engaging in risk behaviors themselves and were more often reflecting on what they felt would motivate peers who were.
Many adolescents also perceived value in aspects of the feedback which aimed to promote self-efficacy for and commitment to behavior change, including practical tips to change behaviors. Many were interested in getting tips regardless of whether they were engaging in risk behaviors, as the tips might be useful in the future:
Back to the sexual activity and stuff it gave a feedback, like stages of what is better to use like for birth control-wise...Eventually I’m going to be sexually active and I want to plan what happens and what I should use and I want to be safe.
Participants reported that the tips regarding how to increase and sustain healthy behaviors, as well as how to stay safe if engaging in risk behaviors (eg, ways to drink alcohol responsibly) were all useful. Finally, many adolescents appreciated validation for healthy behaviors:
I think just affirming the fact that—for instance, I put down that I always wear seat belts...It’s nice to be like “Yeah, that’s the right thing to do.”
Participants expressed interest in four potential expansions for the Check Yourself tool: receiving follow-up information, goal-setting, tracking of behaviors over time, and communication with providers in between visits. Regarding follow-up content, many desired more information about specific health behaviors for personal research and learning and thought that it would be helpful if the tool would provide a targeted set of links and resources for areas of interest. Many adolescents were also interested in additional resources for changing behaviors, especially practical “tips” (eg, for how to start going to bed earlier).
Many adolescents commented that they would like to have the opportunity to set goals based on the feedback they received from the Check Yourself tool. In talking about goal setting, participants expressed the importance of being able to determine their own goals and desired a process that emphasizes small steps and integrates with follow-up information:
Maybe there could be an option for their goal in each section like at the end of [the tool]...I think if it’s a goal that could help with sent out information—so if they say their goal is to get two more hours of sleep every night, we could get information about the best ways to do that.
I would say have little steps in there to getting better. Take steps—baby steps—to getting better.
Many participants also wanted to expand the tool to include tracking health behaviors over time and in relation to goals. Some thought that tracking systems would ideally be more integrated and able to recognize and alert them to patterns across health behaviors (eg, relationships between physical activity and eating). As a part of ongoing tracking, some participants indicated an interest in receiving electronic reminders and ongoing motivational messaging about goals that they had set:
I guess [it would be useful to send] just how often I should do it just as a reminder and also encouragement as to why I wanted to do it in the first place.
Notably, some participants indicated a preference for electronic reminders over reminders from parents or other adults, particularly in the context of goals that they had set for themselves:
Those things where your parents would probably remind you, but your parents are like, “Hmm,” but then [with electronic reminders] it’s like oh maybe I should do it because it’s good for me and it’s me doing it, not my parent to be like “Go do it.”
Some adolescents described a desire for their providers to be involved in the behavior tracking process, specifically wanting doctors to view their progress in between visits and to provide encouraging comments.
In this qualitative study, we examined adolescents’ perspectives on an electronic health screening tool for primary care settings that provides personalized feedback. Overall, the tool was well-received by participants, who strongly preferred electronic screening over paper-and-pencil forms. Youth appreciated the colorful and interactive content, valued aspects of the tool that enhanced privacy, and indicated that they would disclose more health risk behaviors to the tool than to paper-and-pencil forms, consistent with prior research on electronic health screening [
Adolescents thought that the feedback delivered by the tool was generally useful and motivating. They appreciated that the feedback was presented with nonjudgmental language and responded positively to a variety of specific feedback components including information regarding the benefits of healthy behaviors, risks of negative behaviors, tips for behavior change, and the reinforcement of good choices.
Whereas many found the comparisons with peer norms and national health guidelines interesting and helpful, a small but noteworthy minority of participants dismissed the peer data as not real or relevant and guidelines as not valid or reasonable. The perspectives expressed by this minority suggest that adolescents are discerning consumers of peer normative data and that peer comparisons should be presented in a way that enhances the perceived relevance and credibility of the information. Additionally, participants in this study could have responded more truthfully about their health risk behaviors than adolescents in national survey research, spuriously making participants’ behaviors appear worse than the national averages. Whereas more data are needed to evaluate the possibility that adolescents respond more honestly to electronic health screening tools than to anonymous national survey research, future interventions including normative feedback should consider the limitations of comparing self-reported health data gathered through different modalities.
In any case, when presenting normative feedback, it may be important to select comparison groups as similar as possible to individual adolescents (eg, with respect to age, gender, and school) and to clearly cite data sources to increase perceptions of relevance and credibility. Notably, the two studies that have investigated Web-based interventions incorporating normative feedback for adolescent alcohol use either did not present school-specific norms or did not present age- and gender-specific norms, and neither found a connection between normative feedback and reductions in alcohol use [
Regarding comparisons with national health guidelines, some participants specifically found the recommendations for screen time (<2 hours/day) and exercise (1 hour, 7 days/week) unreasonable in the context of their daily lives. The media guidelines used in the Check Yourself tool were based on the 2013 guidelines from the American Academy of Pediatrics [
When prompted with possible adaptations to the tool, adolescents expressed interest in receiving follow-up information about health risks, opportunities to set goals and track health behaviors, receiving reminders of planned changes, and communicating with providers electronically between appointments. Participants’ enthusiasm for these additional features suggests that adolescents may envision an ongoing role for technology in health behavior change, and specifically technological supports that enhance their ability to self-regulate behaviors and include the option of seeking professional input if desired. These findings corroborate results from other recent qualitative studies investigating adolescents’ preferences for text messaging-based preventive interventions, which have documented high youth interest in receiving brief, personalized advice and reminders of reasons to change on an ongoing basis [
The qualitative design of this study entails both strengths and limitations. Though we were able to examine nuanced perspectives not easily accessible to quantitative research, adolescents’ opinions regarding effective types of feedback expressed during a qualitative interview may not reflect what would actually have an impact if tested using quantitative methodology. For this reason, results from this study should serve to inform hypothesis generation rather than provide generalizable knowledge. An additional limitation stems from our recruitment from primarily one clinic in Seattle, WA, which resulted in a sample largely representative of this geographic region but whose perspectives may not describe those of adolescents in other areas and health care settings.
With the opportunity to create screening tools that go beyond collecting information from adolescent patients by providing education and feedback, it is useful to seek youth input in order to design content that is acceptable and effective for this population. Adolescents in this qualitative study were specific about their preferences for electronic, personalized feedback on their health behaviors. They expressed engagement with and enthusiasm for the process of receiving feedback in general and appreciated new health information and tips, while at the same time offering some critiques and suggestions regarding specific feedback messages. Participants desired feedback that validates healthy behavior choices and supports them as independent decision makers by neutrally presenting relevant information. Additionally, participants valued feedback that enhances their capacity for self-management by facilitating goal setting and offering ongoing technological supports. Future quantitative outcome research should test screening tools that incorporate these suggestions.
This research was supported by the Agency for Healthcare Research and Quality (AHRQ 5R01HS023383-02; PI: McCarty). The granting agency that supported this research approved the study design and received periodic updates on data collection but was not involved in the analysis of the data, the decision to submit a manuscript, or in the writing of the manuscript itself. No additional financial inputs (eg, honorariums or other forms of payment) aside from those of the granting agency supported the writing of the manuscript. Publication made possible in part by support from the Berkeley Research Impact Initiative (BRII) sponsored by the UC Berkeley Library.
LR and CM designed the study. GZ and KK collected the data. All the authors contributed to the analysis and interpretation of the data. GZ wrote the manuscript with guidance from the other authors. All the authors approved the final version of the manuscript.
SW serves as medical director and co-owner of Shift Health, the company that hosted the electronic intervention examined in this study.