
Published on 05.03.18 in Vol 20, No 3 (2018): March


    Original Paper

    Mode Equivalence of Health Indicators Between Data Collection Modes and Mixed-Mode Survey Designs in Population-Based Health Interview Surveys for Children and Adolescents: Methodological Study

    Corresponding Author:

    Elvira Mauz, MPH, Dipl-Psych

    Department of Epidemiology and Health Monitoring

    Robert Koch Institute

    PO Box 650261

    Berlin, 13302


    Phone: 49 30 18754 3332

    Fax: 49 30 18754 3449



    Background: The implementation of an Internet option in an existing public health interview survey using a mixed-mode design is attractive because of lower costs and faster data availability. Additionally, mixed-mode surveys can increase response rates and improve sample composition. However, mixed-mode designs can increase the risk of measurement error (mode effects).

    Objective: This study aimed to determine whether the prevalence rates or mean values of self- and parent-reported health indicators for children and adolescents aged 0-17 years differ between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web), as well as between a single-mode control group and different mixed-mode groups.

    Methods: Data were collected for a methodological pilot of the third wave of the "German Health Interview and Examination Survey for Children and Adolescents". Questionnaires were completed by parents or adolescents. A population-based sample of 11,140 children and adolescents aged 0-17 years was randomly allocated to 4 survey designs—a single-mode control group with paper-and-pencil questionnaires only (n=970 parents, n=343 adolescents)—and 3 mixed-mode designs, all of which offered Web-based questionnaire options. In the concurrent mixed-mode design, both questionnaires were offered at the same time (n=946 parents, n=290 adolescents); in the sequential mixed-mode design, the SAQ-Web was sent first, followed by the paper questionnaire along with a reminder (n=854 parents, n=269 adolescents); and in the preselect mixed-mode design, both options were offered and the respondents were asked to request the desired type of questionnaire (n=698 parents, n=292 adolescents). In total, 3468 questionnaires of parents of children aged 0-17 years (SAQ-Web: n=708; SAQ-paper: n=2760) and 1194 questionnaires of adolescents aged 11-17 years (SAQ-Web: n=299; SAQ-paper: n=895) were analyzed. Sociodemographic characteristics and a broad range of health indicators for children and adolescents were compared by survey design and data collection mode by calculating predictive margins from regression models.

    Results: There were no statistically significant differences in sociodemographic characteristics or health indicators between the single-mode control group and any of the mixed-mode survey designs. Differences in sociodemographic characteristics between SAQ-Web and SAQ-paper were found: Web respondents were more likely to be male and to have higher levels of education and higher household incomes than paper respondents. After adjusting for sociodemographic characteristics, only one of the 38 analyzed health indicators showed different prevalence rates between the data collection modes, with a higher prevalence rate for lifetime alcohol consumption among the online-responding adolescents (P<.001).

    Conclusions: These results suggest that mode bias is limited in health interview surveys for children and adolescents using a mixed-mode design with Web-based and paper questionnaires.

    J Med Internet Res 2018;20(3):e64




    The assessment of population health using health interview surveys is an established method in many countries and is a cornerstone of health reporting, health policies, and health sciences. However, epidemiological studies have shown decreasing response rates since the 1990s [1-3]. The use of mixed-mode health interview surveys offers respondents various data collection modes and can increase the response rate, improve sample composition, and reduce overall costs [3,4]. Currently, there is considerable interest in using Web-based health survey interviews because of lower costs and faster data availability. Web-based surveys are increasingly becoming standard [5], and they are frequently combined with other modes in mixed-mode designs [6]. However, the use of different survey modes may increase the risk of measurement error (mode effects) [5].

    Mode effects are systematic distortions caused by different survey modes or interview situations [5]. They often arise when there are large methodological differences in the survey situation (self-administered questionnaire vs interviews) or the communication channel (auditory vs visual) [3]. Such differences are minimal between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web)—both are conducted without an interviewer and both use visual perception. For this reason, these 2 self-administered modes (SAQ-Web and SAQ-paper) are considered mode equivalent [4,7,8]. Mode equivalence is shown if an individual gives the same response to the same question or instrument administered through 2 different modes, leading to the same results [9]. For example, research has shown no differences between the 2 data collection modes in prevalence rates of diseases among adult populations [10,11] or in reported health behaviors among adolescents [12].

    However, researchers have discussed mode effects for sensitive topics. Web-based responses are associated with both anonymity and greater individualization. Consequently, SAQ-Web participants are thought to be less affected by social desirability and less oriented toward social norms. The SAQ-Web mode may therefore yield more honest reports, especially compared with interview modes [13,14]. Furthermore, differences have been found between the 2 self-administered modes, for example, in political attitudes [15], reporting of sensitive sexual behaviors [16], and adolescent risk behavior [17]. However, consistency of responses across modes is high, with only a few respondents taking advantage of the greater privacy of the Web mode [16]. Hence, possible mode effects should be investigated before changing or adding modes in existing health surveys. In ongoing longitudinal studies, changing the mode or offering a second mode may jeopardize comparability over time.

    The German Health Interview and Examination Survey for Children and Adolescents (KiGGS) is a nationally representative health interview and examination survey of children and adolescents in Germany [18,19]. It is part of the nationwide health monitoring system administered by the German national public health institute (Robert Koch Institute) [20,21]. KiGGS obtains representative cross-sectional information on German children and adolescents aged 0-17 years at regular intervals. Additionally, based on the first cross-sectional sample (KiGGS baseline; 2003-2006), a KiGGS cohort has been implemented. The baseline respondents are being followed throughout their life course into adulthood [21]. The survey involves physical examinations and tests, as well as laboratory analysis of urine and blood parameters. All the parents and adolescents aged 11-17 years completed paper-based questionnaires [20]. The first follow-up, KiGGS Wave 1 (2009-2012), was conducted using telephone interviews of parents and adolescents [22]. KiGGS Wave 2 (2014-2017) involved a health interview and examination, continuing the baseline concept [23]. The aim of the KiGGS survey is to provide current data on population health, health determinants, and the utilization of health care services. In addition, information is gathered about the incidence of disorders as well as trajectories of multiple health indicators throughout the life course. The data are widely used in national health reporting, health policies, and public health research.

    When planning population-based (health) studies like KiGGS, the survey design must minimize total survey error [24,25]. In addition to lower data quality owing to measurement errors such as mode effects, the total survey error comprises several other kinds of systematic error: an insufficient sample size leads to imprecise estimates (sampling error), and the composition of the sample might differ from the target population (coverage error) owing to errors in the sampling procedure or to systematic nonresponse (nonresponse bias). All these aspects were examined in a methodological pilot study as part of the KiGGS Wave 2 pretest. The pilot study aimed to compare 3 mixed-mode survey designs using Web- and paper-based questionnaires with a single-mode SAQ-paper design in terms of response rates, sample composition, data quality, and effort [26]. The study also explored whether estimates of health indicators differed among the survey designs and data collection modes. This study focused only on the second aim of the pilot study and addressed 2 research questions:

    • Are there any differences in the prevalence rates or mean values of core public health indicators for children and adolescents aged 0-17 years between the single-mode control group using only SAQ-paper and different mixed-mode groups that combine offers of SAQ-paper and SAQ-Web?
    • Are there any differences in prevalence rates or mean values of these indicators between the 2 data collection modes (SAQ-paper and SAQ-Web) if all online respondents are pooled and all paper-and-pencil respondents are pooled across all survey designs?


    Study Design

    The methodological pilot study used a sample of children and adolescents registered in the local resident registries of 20 municipalities in 5 federal states of Germany, covering urban and rural areas as well as the eastern and western regions of the country.

    Data were collected using SAQ-Web or SAQ-paper methods. All selected individuals were invited by mail to participate in the study. They were sent a cover letter with the invitation to participate, information about the study and data privacy, and an informed consent form. Depending on the allocated mode, the invitation comprised a username and password for participation through the Web option along with a paper questionnaire for those allocated to the concurrent mixed-mode design, only a paper questionnaire in the single-mode design, or only the access data for the online questionnaire in the sequential mixed-mode design. The SAQ-Web questionnaire was only optimized for desktop computers. A reminder was sent by mail to respondents who had not replied within 3 weeks of the initial invitation. Participants who did not respond to the reminder were telephoned up to 5 times 4 weeks after the initial invitation. As an additional motivation for prospective participants, each parent and adolescent who had completed a questionnaire received a shopping voucher to the value of €10. The methodological pilot study strictly adhered to the data protection regulations set out in the German Federal Data Protection Act. Participation in the study was voluntary. All parents and participating adolescents were informed about the study’s aims and content, as well as data protection, and they provided informed consent. Following the strict data privacy protocol, prospective participants between the ages of 11 and 17 years received their questionnaires only after their parents provided consent.

    Different questionnaires were used for different age groups. Main health indicators were included on the health questionnaires for parents of all age groups (0-17 years), and self-report data for main health indicators were obtained from adolescents aged 11-17 years. To reduce the risk of mode effects, the 2 questionnaires were designed to be as similar as possible and contained the same wording for the questions and response categories. On the basis of the unified-mode design [27], the wording and formatting of questions and response categories were standardized. To help participants visually distinguish single-choice questions from multiple-choice questions, all survey modes used the same checkbox design. Single-choice checkboxes were round, whereas multiple-choice checkboxes were rectangular. Additionally, multiple-choice questions included the instruction “Multiple entries are possible.” For filter questions, Web-based questionnaires were optimized with filter skips whenever the perceivability of the questions was not impaired. Plausibility checks and ranges were defined for the Web-based questionnaire. Additionally, soft prompting was programmed into the Web-based questionnaire to reduce item nonresponse. These differences were used to capitalize on the advantages of the Web mode for better data quality, and they were the only mode-specific design differences. Detailed information on the survey design and other technical aspects of the Web-based part of the survey is provided in a “Checklist for Reporting Results of Internet E-Surveys” [28] (Multimedia Appendix 1).

    As shown in Figure 1, a gross sample of 11,140 children and adolescents was randomly allocated to 4 survey designs:

    1. A single-mode survey design as a control group—respondents were sent an invitation letter and paper-and-pencil questionnaires, followed by a reminder after 3 weeks
    2. A sequential mixed-mode survey design—respondents were sent an invitation letter and an online access code, followed 3 weeks later with a reminder letter and a paper-based questionnaire
    3. A concurrent mixed-mode survey design—respondents were sent an invitation letter, a paper-based questionnaire, and an online access code (a longer version of the questionnaire was tested with a subgroup of the concurrent mixed-mode design, but this subgroup was excluded from this study) and
    4. A preselect mixed-mode design—respondents were sent the invitation along with a postcard asking participants to choose one of the 2 options (SAQ-Web or SAQ-paper), followed by a reminder with the same offer

    There were no statistically significant differences in the (gross) sample composition across the 4 design groups in terms of known sample characteristics, such as age, sex, municipality size, region, or respondent citizenship, which were obtained from local registries.

    The combined response rate for all survey designs was 38.43% (n=4032), following the internationally used Standard Definitions of Outcome Rates for Surveys of the American Association for Public Opinion Research (AAPOR Response Rate 2) [29]. There were no significant differences in response rates among the concurrent mixed-mode design, the sequential mixed-mode design, and the single-mode control group design. However, there was a significantly lower response rate in the preselect mixed-mode design. Detailed comparisons of response rates, sample compositions, data quality, and efforts among the different survey designs have been published previously [26].
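    The AAPOR Response Rate 2 mentioned above counts both complete and partial questionnaires as responses. The sketch below illustrates the arithmetic with hypothetical case disposition counts, not the study's actual dispositions:

```python
# Illustrative sketch of AAPOR Response Rate 2 (RR2).
# RR2 = (I + P) / (I + P + R + NC + O + U), where I = complete interviews,
# P = partial interviews, R = refusals, NC = non-contacts, O = other
# eligible nonrespondents, U = cases of unknown eligibility.
def aapor_rr2(complete, partial, refusal, non_contact, other, unknown=0):
    responses = complete + partial
    eligible = complete + partial + refusal + non_contact + other + unknown
    return responses / eligible

# Hypothetical counts for demonstration only.
rate = aapor_rr2(complete=350, partial=30, refusal=120,
                 non_contact=400, other=50, unknown=50)
print(f"{rate:.2%}")
```

The denominator grows with cases of unknown eligibility, so RR2 is a conservative rate compared with definitions that discount those cases.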


    For this study, only survey design groups using the same version of the questionnaire were included, with 3468 completed parent-reported health questionnaires for children and adolescents aged 0-17 years and 1194 questionnaires completed by adolescents aged 11-17 years. A response was defined as one completed health questionnaire from either parents or children. Hence, a valid response did not require both parents and children to complete all requested questionnaires. To answer the first research question regarding mode equivalence across the different survey designs, we compared the single-mode control group with each of the 3 mixed-mode groups. To answer the second research question regarding mode equivalence between the 2 data collection modes, data from all survey designs were pooled (Table 1).

    Sociodemographic Characteristics of Responding Parents and Adolescents by Survey Design and Data Collection Mode

    Analyzed Sociodemographic Characteristics

    The sample compositions of participating parents and adolescents were described by various sociodemographic characteristics separately by survey design and data collection mode. The variables examined included individual adolescent characteristics (age, sex, migration background, and highest level of education reached or aspired to); parental characteristics (age, marital status, and participating parent); location (municipality size and region [East vs West Germany]); and household properties (education level and net household income). Household education level was measured using the Comparative Analysis of Social Mobility in Industrial Nations (CASMIN) classification [30]. Household income was assessed using a question on household monthly net income.

    Figure 1. Study design of the methodological pilot study. SAQ-paper: self-administered paper-based questionnaire; SAQ-Web: self-administered Web-based questionnaire.
    Table 1. Cases used in this study.
    Statistical Methods

    Differences between the control group and the different mixed-mode groups and between the 2 data collection modes were tested using chi-squared tests.

    Mode Equivalence of Health Indicators Between Survey Designs and Data Collection Modes

    A wide range of health status indicators and health behaviors for children and adolescents with high public health relevance were analyzed to identify differences between the mixed-mode designs and the single-mode control group, as well as mode differences between SAQ-paper and SAQ-Web.

    Analyzed Indicators of Physical and Mental Health

    Lifetime diagnoses of asthma, hay fever, atopic eczema, and attention-deficit hyperactivity disorder (ADHD) were indicated by parents. Recurrent pain during the last 3 months was measured using the adolescents’ self-reports. Self-rated health (SRH) and chronic diseases were evaluated by parental report using the Minimum European Health Module questions [31], modified for children. Adolescents also answered the SRH question. Impairments owing to health problems were evaluated with a question from the Children with Special Health Care Needs Screener, which was answered by parents [32]. To define obesity, body mass index was calculated based on self-reported weight and height for adolescents and parent-reported weight and height for children aged 3-10 years. The body mass index cut-offs used in this study were determined by German norms [33].
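    The BMI computation itself is simple arithmetic (weight in kilograms divided by the square of height in meters); the sketch below omits the age- and sex-specific German reference cut-offs [33], which are what actually define the obesity threshold in the study:

```python
# Minimal BMI sketch: BMI = weight (kg) / height (m)^2.
# The obesity classification in the study uses German reference percentiles,
# which are age- and sex-specific and not reproduced here.
def bmi(weight_kg, height_cm):
    height_m = height_cm / 100
    return weight_kg / height_m ** 2

print(round(bmi(60, 165), 1))
```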

    Child and adolescent mental health problems were evaluated using the parent- and self-report Strengths and Difficulties Questionnaire (SDQ) [34]. An SDQ total difficulties score was calculated for all children and adolescents. Participants with a borderline or abnormal score (based on German norms) [35] were defined as at risk for emotional and behavioral symptoms. Participants with borderline or abnormal SDQ impact scores were defined as at risk for psychosocial impairment.

    Analyzed Indicators of Health Care Utilization

    As indicators of health care use, pediatrician and orthodontist visits during the past 12 months for adolescents and parent-reported visits to any doctor for children under 11 years were analyzed [36].

    Analyzed Measure of Health-Related Quality of Life

    Health-related quality of life (HRQoL) was measured using KIDSCREEN-27 for adolescents aged 11-17 years, with 5 subscores for physical and psychological well-being, relationships with peers and parents, and school well-being. Scores were summed and transformed into t values [37].

    Analyzed Health Behaviors

    Adolescents reported their current smoking status, water pipe consumption during the past 12 months, second-hand smoke exposure [38], lifetime alcohol consumption, and current use of screen-based media. Excessive use of screen-based media was defined as more than 2 hours per day [39]. Harmful alcohol use and binge drinking were defined using responses to the Alcohol Use Disorders Identification Test (AUDIT-C) [40].

    Following the recommendation of the World Health Organization [41], healthy physical activity was defined as physical activity for at least 60 min per day. Low physical activity was defined as less than 2 days per week of at least 60 min of activity. All questions on physical activity were answered by adolescents aged 11-17 years.

    Statistical Methods

    We calculated prevalence rates for dichotomous health indicators and mean values for HRQoL (a scale outcome) by survey design and data collection mode. We compared these values using z or t tests.
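    The comparison of two prevalence rates can be illustrated with a standard two-proportion z test; the counts below are hypothetical and stand in for, say, the SAQ-paper and SAQ-Web groups:

```python
# Sketch of a two-proportion z test for comparing prevalence rates between
# two groups (e.g. SAQ-paper vs SAQ-Web). Counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                        # pooled prevalence under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))          # two-sided P value
    return z, p_value

# Hypothetical example: 240 of 895 paper respondents vs 110 of 299 Web
# respondents reporting the indicator.
z, p = two_proportion_z(x1=240, n1=895, x2=110, n2=299)
print(round(z, 2), round(p, 3))
```

For mean values on a scale outcome such as HRQoL, a t test on the group means plays the analogous role.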

    Due to the different sample compositions of the SAQ-paper and SAQ-Web groups (see the Results), it was necessary to control for sociodemographic characteristics to identify possible mode effects. Survey modes can differ in selection (different population groups prefer different modes) and measurement (different answers are given by the same person under different modes of administration), so these differences are confounded [42]. Additionally, health status and health behavior differ by sex, education, and other sociodemographic characteristics [43,44]. To eliminate the risk of confounding, we adjusted for sociodemographic characteristics by calculating adjusted prevalence rates using predictive margins [45] based on logistic or linear regression models with sociodemographic factors as covariates. To analyze indicators based on parental reports, we included child attributes (age, sex, and migration background); parental attributes (relationship to the child, age, and marital status); household attributes (education and income); and regional attributes (region and municipality size). Adolescents’ reports were adjusted by child attributes, including the highest level of education completed, as well as household attributes and location. The mode of data collection was another covariate used to identify adjusted prevalence for each mode. Differences were tested using z or t tests.
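    The adjusted prevalence rates described above can be sketched as a logistic regression followed by averaged counterfactual predictions ("recycled predictions"). This is a minimal illustration on simulated data, assuming the statsmodels formula API; the variable names and covariate set are placeholders, not the study's actual model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data (NOT KiGGS data): a binary health indicator that
# depends on data collection mode and two sociodemographic covariates.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "web": rng.integers(0, 2, n),     # 1 = SAQ-Web, 0 = SAQ-paper
    "male": rng.integers(0, 2, n),
    "age": rng.integers(11, 18, n),
})
true_logit = -0.5 + 0.3 * df["web"] + 0.4 * df["male"] + 0.1 * (df["age"] - 14)
df["indicator"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

# Fit a logistic regression with mode and sociodemographic covariates.
model = smf.logit("indicator ~ web + male + age", data=df).fit(disp=0)

# Predictive margin: average the predicted probabilities over the whole
# sample with the mode fixed at each value ("recycled predictions").
def predictive_margin(mode_value):
    return model.predict(df.assign(web=mode_value)).mean()

print(f"adjusted prevalence, SAQ-paper: {predictive_margin(0):.3f}")
print(f"adjusted prevalence, SAQ-Web:   {predictive_margin(1):.3f}")
```

Because both margins are averaged over the same sample, the remaining difference between them reflects the mode coefficient rather than differences in sample composition.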

    For the survey design comparison, crude as well as adjusted prevalence rates and mean values were calculated. A statistical test for differences was conducted between the single-mode control group design and each of the 3 mixed-mode designs. Because the survey design samples did not differ in sociodemographic characteristics (see the Results) and there were only marginal differences between the 2 approaches, only crude prevalence rates and mean values, without adjustment for sociodemographic characteristics, are shown here to simplify the presentation of results.

    Handling of Multiple Testing

    In total, we analyzed 12 health indicators using the parental sample and 28 using the adolescent sample. For these health indicators, we tested each mixed-mode survey design against the control group. Additionally, we used 2 other statistical tests to identify differences between the data collection modes, using first the crude values and then the adjusted values.

    Regarding the research questions, a sensitive approach to detecting possible differences (ie, a higher probability of rejecting the null hypothesis) was needed. Therefore, we addressed the statistical problem of multiple testing by correcting the significance level only for the number of tests performed for each health indicator, and only for the tests comparing the different survey designs. We used the Bonferroni correction to counteract the accumulation of α error [46], applying an adjusted significance level of P<.02 to examine differences between the mixed-mode survey designs and the single-mode control group. For the comparison of data collection modes, a significance level of α=.05 was used.
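    The Bonferroni arithmetic behind the adjusted threshold is a one-liner: the nominal significance level is divided by the number of tests performed per indicator (here, the 3 design comparisons against the control group):

```python
# Bonferroni correction as applied in the design comparison:
# per-test significance level = nominal alpha / number of tests.
alpha = 0.05
tests_per_indicator = 3   # 3 mixed-mode designs, each tested vs the control group
adjusted_alpha = alpha / tests_per_indicator
print(round(adjusted_alpha, 3))
```

Rounding 0.05/3 ≈ 0.0167 upward explains the P<.02 threshold reported in the text.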


    Sociodemographic Characteristics of Responding Parents and Adolescents in Different Survey Designs and Different Data Collection Modes

    Responding Parents

    There were no statistically significant differences in sample composition between the mixed-mode survey designs and the single-mode control group for participating parents. However, the sample sociodemographic characteristics differed significantly between data collection modes (Multimedia Appendix 2). Parents responding online were more often married and had higher household education levels, higher incomes, and younger children than those who responded to the SAQ-paper. More fathers responded via the Web-based questionnaire than in the paper-and-pencil group. There were no significant differences in migration background or parental age. P values close to the significance level were found for region of residence (P=.08), municipality size (P=.05), and child’s sex (P=.06).

    Responding Children and Adolescents

    For the responding children and adolescents (aged 11-17 years), there were no statistically significant differences in sociodemographic characteristics between the different survey designs, but adolescents responding online were more often male, had reached or aspired to reach higher levels of education, and were more likely to live in households with higher education and higher income, compared with adolescents who responded to the SAQ-paper (Multimedia Appendix 3).

    Mode Equivalence of Health Indicators Between Survey Designs and Data Collection Modes

    Physical and Mental Health

    The analyzed indicators of physical and mental health status showed no statistically significant differences by survey design or data collection mode (Table 2). Across modes and designs, parents reported the same results for SRH, chronic disease, impairment owing to health problems, lifetime prevalence of diagnosed diseases, obesity, and mental health problems and impairment. Adolescent self-reports showed no statistically significant differences in SRH, mental health problems and impairment, or chronic pain.

    Health Care Utilization

    No differences in the crude or adjusted prevalence rates were found in adolescent-reported 12-month use of pediatric or orthodontic services (Table 3). The crude prevalence of parent-reported 12-month use of any doctor and of pediatric services (for children under 11 years) differed significantly, with more frequent reports of doctor’s visits in the SAQ-Web group. After adjusting for sociodemographic attributes, this difference disappeared. There were no significant differences between the mixed-mode design groups and the control group for any of the analyzed indicators of health care utilization.

    Health-Related Quality of Life

    HRQoL, measured using the 5 dimensions of the KIDSCREEN-27 for adolescents, was the only indicator scale analyzed. Independent of adjustment, there were no significant differences between the 2 data collection modes (SAQ-paper and SAQ-Web) for any of the observed dimensions (Table 4). Regarding survey design, better psychological well-being was reported in the concurrent mixed-mode design and better relations with parents were reported in the preselect mixed-mode survey design, compared with the single-mode control group. After correcting the significance level for multiple testing, no differences were found by survey design.

    Health Behaviors

    The crude prevalence of lifetime alcohol consumption (self-reported by adolescents aged 11-17 years), as well as hazardous consumption and binge drinking (based on AUDIT-C reports), showed significant differences between SAQ-paper and SAQ-Web, with higher levels of alcohol consumption reported by online participants (Table 5). Although the differences in hazardous consumption and binge drinking between the 2 modes of data collection disappeared after controlling for sociodemographic characteristics, significantly more online respondents than paper-and-pencil respondents reported that they had consumed alcohol.

    There were no differences in other health behaviors assessed (tobacco consumption, physical activity, and media consumption) by survey design or data collection mode.

    Table 2. Physical and mental health status of children and adolescents aged 0-17 years by survey design and data collection mode (prevalence rates).
    Table 3. Health care utilization among children and adolescents aged 0-17 years by survey design and data collection mode (prevalence rates).
    Table 4. Health-related quality of life of adolescents aged 11-17 years by survey design and data collection mode (mean values).
    Table 5. Health behaviors of adolescents aged 11-17 years by survey design and data collection mode (prevalence rates).



    The main aim of this study was to examine the risk of mode effects in a mixed-mode health interview survey for children and adolescents that combined paper-and-pencil questionnaires and Web-based questionnaires. Therefore, we compared prevalence rates and mean values of a broad range of health indicators from 3 alternative mixed-mode designs (all combining paper-and-pencil and Web-based questionnaires) with a single-mode control group (paper-and-pencil only). We also compared results between online respondents and paper-and-pencil respondents regardless of the survey design. First, we examined differences in sociodemographic characteristics by survey design and data collection mode, as it is well documented that sociodemographic characteristics are associated with health status and health behavior [43,44]. Regarding survey design, there were no statistically significant differences in sample composition, prevalence rates, or mean values of the examined health indicators. There were differences in sociodemographic characteristics across the data collection mode groups. After adjusting for these differences, only one of the analyzed health indicators (lifetime alcohol consumption) showed between-group differences. These results indicate that there is limited mode bias in health interview surveys for children and adolescents using a mixed-mode design with Web-based and paper questionnaires.

    Sample Composition and Digital Divide

    Consistent with previous findings, the sample composition of responding parents and of responding adolescents differed by data collection mode. We confirmed the so-called “digital divide” [47-50]—male adolescents and younger fathers preferred the online mode, a well-known systematic difference [5] between these modes [10,49,51-54]. Additionally, SAQ-Web respondents had higher household incomes [15,49,55] and higher household education levels [10,49,54-57]. Despite these differences, and differences in online response rates between the mixed-mode survey designs, there were no statistically significant differences in sample composition between the paper-and-pencil single-mode control group and the 3 mixed-mode groups. To control for the influence of sociodemographic characteristics on health indicators, we first calculated crude prevalence rates and then complemented the analysis with adjusted prevalence rates or adjusted mean values using predictive margins to identify possible mode effects. Comparisons between the mixed-mode survey designs and the single-mode control group were made using only the crude prevalence rates. Using this approach, hardly any statistically significant differences by data collection mode or by survey design were found for the analyzed health indicators.

    Health Status and Health Care Utilization

    Prevalence rates of health complaints, such as diagnosed allergies, diagnosed ADHD, obesity, and chronic pain, were equivalent between the modes, as previous studies of adults [9,11,53,58] and adolescents [12] have shown. A population-based Norwegian study found higher asthma prevalence rates among online respondents; this was interpreted as possible nonresponse bias and not as a mode effect because there were no differences in the prevalence rates for any other condition [59]. A literature review by Hox et al showed that after controlling for selection, small mode effects do appear, most often distinguishing between modes that involve interviewers (face-to-face, telephone) and modes that do not (mail, Web) [42].

    We found similar prevalence rates for SRH, chronic diseases, and impairment owing to health problems between SAQ-paper and SAQ-Web respondents. The 2 previous studies examining these health indicators among adults in general [11] and among older adults [10] also found no differences between these 2 data collection modes. Another study of adults interpreted the higher SRH found among online respondents compared with paper-based respondents as an expression of different sample characteristics linked to the digital divide [49], that is, better-situated people with better health using Web-based questionnaires, and not as a mode effect. We cannot say whether this holds true for the KiGGS methodological pilot study, because we controlled for most characteristics linked to the preference for online participation, such as region of residence and education or income.

    For mental and psychosocial problems, we calculated risk groups for emotional and behavioral problems and for impairment owing to psychosocial problems based on SDQ scores [34]. Both parent- and adolescent-reported scores were equivalent across the examined modes. Several other studies have postulated the comparability of measurement results between these 2 self-administered modes for other standardized mental health questionnaires (eg, depression or anxiety) [12,58,60,61].
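    As a schematic illustration of how such SDQ-based risk groups are formed, the sketch below bands a total difficulties score (range 0-40) into normal, borderline, and abnormal categories. The cutoffs shown are the widely cited parent-report bandings and are an assumption here; they are not necessarily the exact thresholds used in the study.

```python
# Illustrative banding of an SDQ total difficulties score (0-40) into risk
# groups. The cutoffs are the widely cited parent-report bandings
# (an assumption; the study's exact thresholds may differ).
def sdq_band(total: int) -> str:
    if not 0 <= total <= 40:
        raise ValueError("SDQ total difficulties score ranges from 0 to 40")
    if total <= 13:
        return "normal"
    if total <= 16:
        return "borderline"
    return "abnormal"
```

Mode equivalence for such an indicator means that the share of respondents falling into the borderline or abnormal band does not differ systematically between SAQ-Web and SAQ-paper.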

    In their review of 55 studies investigating 79 instruments, Campbell et al [9] found measurement equivalence between electronic and paper-based patient-reported outcome measures and concluded that standardized instruments can generally be used electronically without measurement effects. In our study, we also found comparable results for standardized instruments (the SDQ and AUDIT-C), as well as for self-reported HRQoL (KIDSCREEN-27). No existing studies have compared these particular instruments, but previous studies have compared the Short Form Health Survey-36, a frequently used standardized HRQoL instrument for adults, and found measurement equivalence [9,58,62-64].

    All reports of health care utilization were equivalent between the self-administered modes; this is consistent with prior empirical results, including studies of adult vaccination use [11], adolescent health care use [12], and multiple health care quality indicators [56]. The greater use of pediatric services (and of any doctor) before adjustment for sociodemographic characteristics may be explained by the younger age of children in the online group—in Germany, all children are invited to undergo regular health screening examinations (U3-U9 examinations) from early childhood until the age of 5 years, with a well-established system of reminders and reporting.

    Health Behaviors

    Most of the analyzed adolescent health behaviors (current smoking, 12-month water pipe consumption, second-hand smoke exposure, physical activity, and screen-based media use) showed comparable results and no differences between the 2 modes. These results are consistent with the results of other studies on adolescents [12,65].

    Regarding alcohol consumption, the crude and adjusted prevalence rates for lifetime consumption were significantly higher among SAQ-Web-responding adolescents. After adjusting for sociodemographic characteristics, the difference decreased but could not be fully explained by the sociodemographic differences between the 2 groups of respondents. The prevalence rates of hazardous consumption and binge drinking were comparable between the data collection modes after controlling for sample composition.

    Most previous studies have reported no statistically significant differences in alcohol consumption among adolescents or young adults by these 2 data collection modes [12,66]. However, research comparing sensitive health behaviors is inconsistent. Some studies have found higher adult binge drinking [53] and higher adolescent alcohol consumption [17] in online reports, whereas others have found no difference in sensitive health behaviors in general for college students [67,68] and young adults [69].

    The higher rate of reported lifetime alcohol consumption among SAQ-Web-responding adolescents, in the absence of correspondingly higher rates of hazardous consumption or binge drinking, may be interpreted in multiple ways. For example, this may be a result of different sample properties, such as SAQ-Web-preferring adolescents being more likely to experiment with alcohol consumption. However, it is also possible that this result is a mode effect, under the assumption of identical alcohol consumption in both groups. Web-based questionnaires afford greater privacy because there is no risk of parents checking the responses. Another possible explanation is the lower social orientation in the Internet mode [13]. Both of these explanations assume that Web-based questionnaires are more likely to elicit honest reports, but the similar results between the 2 mode groups for reported harmful alcohol consumption after adjustment contradict this assumption. Taken together, the results for alcohol consumption suggest that lifetime consumption should be used with caution as a health indicator in a mixed-mode design. Hazardous consumption and binge drinking are better indicators because they exhibit mode equivalence and have greater public health relevance than lifetime consumption, which is measured by a single question asking whether the respondent has ever consumed alcohol.
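    For reference, hazardous-consumption screening with the AUDIT-C [40] rests on 3 items, each coded 0-4 and summed to a 0-12 score. The sketch below is illustrative only; the cutoff of 4 is a commonly cited adult threshold and an assumption here, as the study's cutoffs for adolescents may differ.

```python
# Illustrative AUDIT-C scoring: three items (drinking frequency, typical
# quantity, frequency of heavy episodic drinking), each coded 0-4 and summed
# to a 0-12 score. The cutoff is an assumption (a commonly cited adult
# threshold), not necessarily the study's value for adolescents.
def audit_c_score(frequency: int, quantity: int, binge: int) -> int:
    for item in (frequency, quantity, binge):
        if not 0 <= item <= 4:
            raise ValueError("each AUDIT-C item must be coded 0-4")
    return frequency + quantity + binge

def hazardous(score: int, cutoff: int = 4) -> bool:
    return score >= cutoff
```

For example, a respondent scoring 2, 1, and 1 on the 3 items reaches the illustrative cutoff of 4, whereas a single affirmative answer to a lifetime-consumption question carries no information about drinking intensity.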

    Main Result

    Other empirical comparisons of measurement results between different mixed-mode survey designs are rare. Consistent with the one other available result, obtained for the adult population [70], all of the analyzed health indicators for children and adolescents showed comparable results, with no statistically significant differences between the single-mode control group and the 3 mixed-mode groups. Additionally, sociodemographic characteristics did not differ by survey design for parents or adolescents. Regarding measurement comparability, any of the tested mixed-mode health interview survey designs, which offer both Web-based and paper questionnaires, could therefore be used for children and adolescents.

    Strengths and Limitations

    The strengths of the methodological pilot study are the randomized study design, the population-based sample, and the inclusion of a single-mode control group as a reference to interpret the results. However, there are also some limitations, predominantly the relatively small size of the net samples of the analyzed groups. Each survey design had a relatively low number of cases, so interpretations of the results based on the net samples must be made with caution. Possible differences across the 4 survey designs or between the 2 data collection modes could have been overlooked because of a lack of statistical power, particularly regarding the need for correction for multiple testing. Other limitations concern the external validity of the results; the study was conducted in a German setting using register-based samples of children and adolescents, so the results are difficult to generalize to other countries, settings, or populations.
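    The power issue raised above follows directly from the Bonferroni method [46]: with k comparisons at a family-wise error rate of alpha, each individual test is evaluated against alpha/k, so true but modest differences become harder to detect in small net samples. A minimal sketch with hypothetical p values:

```python
# Bonferroni correction: to keep the family-wise error rate at alpha across
# k tests, each individual test is evaluated against alpha / k.
def bonferroni_threshold(alpha: float, k: int) -> float:
    return alpha / k

# Hypothetical p values from k = 3 mode comparisons (illustration only).
p_values = [0.004, 0.03, 0.20]
threshold = bonferroni_threshold(0.05, len(p_values))  # 0.05 / 3
significant = [p < threshold for p in p_values]
# Only the first comparison survives the correction.
```

As the number of compared health indicators grows, the per-test threshold shrinks accordingly, which is why a lack of statistically significant differences in small samples must be interpreted cautiously.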

    Conclusions

    Our results are consistent with those of most previous studies. We found comparable results between the 2 self-administered modes (SAQ-Web and SAQ-paper) for almost all analyzed health indicators, except for lifetime consumption of alcohol among adolescents aged 11-17 years. Likewise, no differences were found between the single-mode control group design and the 3 mixed-mode survey designs that combined the 2 data collection modes.

    These results suggest that health indicators for children and adolescents can be measured using a mixed-mode design combining SAQ-Web and SAQ-paper methods, with a low risk of mode effects and high comparability across the different mixed-mode survey designs [4]. Implementing a Web-based option in existing paper-based interview surveys of children and adolescents therefore carries a low risk of changed measurement values caused by the mixed-mode survey design.

    Acknowledgments

    This study was funded by the Robert Koch Institute and the German Federal Ministry of Health within the German Health Monitoring System. The authors thank Diane Williams, PhD, from Edanz Group for editing a draft of this manuscript.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Checklist for Reporting Results of Internet E-Surveys (CHERRIES).

    PDF File (Adobe PDF File), 92KB

    Multimedia Appendix 2

    Sociodemographic characteristics of responding parents of children aged 0-17 years by survey design and data collection mode.

    PDF File (Adobe PDF File), 58KB

    Multimedia Appendix 3

    Sociodemographic characteristics of responding adolescents aged 11-17 years by survey design and data collection mode.

    PDF File (Adobe PDF File), 50KB

    References

    1. Galea S, Tracy M. Participation rates in epidemiologic studies. Ann Epidemiol 2007 Sep;17(9):643-653. [CrossRef] [Medline]
    2. Bowling A. Mode of questionnaire administration can have serious effects on data quality. J Public Health (Oxf) 2005 Sep;27(3):281-291. [CrossRef] [Medline]
    3. Revilla M. Quality in unimode and mixed-mode designs: a multitrait-multimethod approach. Surv Res Methods 2010;4(3):151-164 [FREE Full text] [CrossRef]
    4. De Leeuw E, Hox J. Internet surveys as part of a mixed mode design. In: Das M, Ester P, Kaczmirek L, editors. Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies. New York: Taylor & Francis Group; 2011:45-76.
    5. Hox JJ, De Leeuw ED, Zijlmans EA. Measurement equivalence in mixed mode surveys. Front Psychol 2015;6:87 [FREE Full text] [CrossRef] [Medline]
    6. Vannieuwenhuyze J, Loosveldt G, Molenberghs G. A method for evaluating mode effects in mixed-mode surveys. Public Opin Q 2011 Feb 09;74(5):1027-1045. [CrossRef]
    7. Couper MP. The future of modes of data collection. Public Opin Q 2011 Dec 15;75(5):889-908. [CrossRef]
    8. Rutherford C, Costa D, Mercieca-Bebber R, Rice H, Gabb L, King M. Mode of administration does not cause bias in patient-reported outcome results: a meta-analysis. Qual Life Res 2016 Mar 3;25(3):559-574. [CrossRef] [Medline]
    9. Campbell N, Ali F, Finlay AY, Salek SS. Equivalence of electronic and paper-based patient-reported outcome measures. Qual Life Res 2015 Aug;24(8):1949-1961. [CrossRef] [Medline]
    10. de Bernardo DH, Curtis A. Using online and paper surveys: the effectiveness of mixed-mode methodology for populations over 50. Res Aging 2013;35(2):220-240. [CrossRef]
    11. Hoebel J, von der Lippe E, Lange C, Ziese T. Mode differences in a mixed-mode health interview survey among adults. Arch Public Health 2014 Dec 22;72(1):46 [FREE Full text] [CrossRef] [Medline]
    12. Mangunkusumo RT, Moorman PW, Van Den Berg-de Ruiter AE, Van Der Lei J, De Koning HJ, Raat H. Internet-administered adolescent health questionnaires compared with a paper version in a randomized study. J Adolesc Health 2005 Jan;36(1):70.e1-70.e6. [CrossRef]
    13. Taddicken M. Methodeneffekte von Web-Befragungen: Soziale Erwünschtheit vs. Soziale Entkontextualisierung. In: Weichbold M, Bacher J, Wolf C, editors. Umfrageforschung: Herausforderungen und Grenzen. Wiesbaden: VS Verlag für Sozialwissenschaften; 2009:85-104.
    14. van Gelder MM, Bretveld RW, Roeleveld N. Web-based questionnaires: the future in epidemiology? Am J Epidemiol 2010 Dec 01;172(11):1292-1298. [CrossRef] [Medline]
    15. Rookey BD, Hanway S, Dillman DA. Does a probability-based household panel benefit from assignment to postal response as an alternative to Internet-only? Public Opin Q 2008 Dec 09;72(5):962-984. [CrossRef]
    16. Burkill S, Copas A, Couper MP, Clifton S, Prah P, Datta J, et al. Using the web to collect data on sensitive behaviours: a study looking at mode effects on the British national survey of sexual attitudes and lifestyles. PLoS One 2016 Feb 11;11(2):e0147983 [FREE Full text] [CrossRef] [Medline]
    17. Paperny DM, Aono JY, Lehman RM, Hammar SL, Risser J. Computer-assisted detection and intervention in adolescent high-risk health behaviors. J Pediatr 1990 Mar;116(3):456-462. [Medline]
    18. Kurth BM, Kamtsiuris P, Hölling H, Schlaud M, Dölle R, Ellert U, et al. The challenge of comprehensively mapping children's health in a nation-wide health survey: design of the German KiGGS-Study. BMC Public Health 2008 Jun 04;8:196 [FREE Full text] [CrossRef] [Medline]
    19. Hölling H, Schlack R, Kamtsiuris P, Butschalowsky H, Schlaud M, Kurth BM. [The KiGGS study. Nationwide representative longitudinal and cross-sectional study on the health of children and adolescents within the framework of health monitoring at the Robert Koch Institute]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2012 Jun 7;55(6-7):836-842. [CrossRef] [Medline]
    20. Kurth BM. [Das RKI-Gesundheitsmonitoring - was es enthält und wie es genutzt werden kann]. Public Health Forum 2012;20(3):4.e1-4.e3. [CrossRef]
    21. Kurth BM, Kamtsiuris P, Hölling H, Mauz E. Strategien des Robert Koch-Instituts zum Monitoring der Gesundheit von in Deutschland lebenden Kindern und Jugendlichen. Kinder- und Jugendmedizin 2016;16(3):176-183. [CrossRef]
    22. Robert Koch-Institut. 2011. KiGGS - Kinder- und Jugendgesundheitsstudie Welle 1. Projektbeschreibung   URL: https:/​/www.​​DE/​Content/​Gesundheitsmonitoring/​Gesundheitsberichterstattung/​GBEDownloadsB/​KiGGS_welle1.​pdf?__blob=publicationFile [accessed 2018-01-30] [WebCite Cache]
    23. Mauz E, Gößwald A, Kamtsiuris P, Hoffmann R, Lange M, von Schenck U, et al. New data for action. Data collection for KiGGS Wave 2 has been completed. J Health Monit 2017;2(3):2-27 [FREE Full text] [CrossRef]
    24. Biemer PP. Total survey error: design, implementation, and evaluation. Public Opin Q 2011 Feb 09;74(5):817-848. [CrossRef]
    25. Groves RM, Lyberg L. Total survey error: past, present, and future. Public Opin Q 2011 Feb 09;74(5):849-879. [CrossRef]
    26. Robert Koch-Institut. Berlin: Robert Koch-Institut; 2017. Methodische Studie zur Durchführung von Mixed-Mode-Befragungen zur Gesundheit von Kindern und Jugendlichen (Pilotstudie KiGGS Welle 2)   URL: https:/​/www.​​DE/​Content/​Gesundheitsmonitoring/​Gesundheitsberichterstattung/​GBEDownloadsB/​Pilotstudie-KiGGS-Welle2.​pdf?__blob=publicationFile [accessed 2018-01-27] [WebCite Cache]
    27. Dillman DA, Smyth JD, Christian LM. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 4th Edition. New York: John Wiley & Sons; 2014.
    28. Eysenbach G. Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res 2004 Dec 29;6(3):e34 [FREE Full text] [CrossRef] [Medline]
    29. American Association for Public Opinion Research (AAPOR). Esomar. 2011. Standard definitions: final dispositions of case codes and outcome rates for surveys (revised 2011)   URL: https:/​/www.​​uploads/​public/​knowledge-and-standards/​codes-and-guidelines/​ESOMAR_Standard-Definitions-Final-Dispositions-of-Case-Codes-and-Outcome-Rates-for-Surveys.​pdf [accessed 2018-01-30] [WebCite Cache]
    30. Brauns H, Scherer S, Steinmann S. The CASMIN Educational Classification in International Comparative Research. In: Hoffmeyer-Zlotnik JHP, Wolf C. editors. Advances in Cross-National Comparison: A European Working Book for Demographic and Socio-Economic Variables. Boston, MA: Springer US; 2003:221-244.
    31. Cox B, van Oyen H, Cambois E, Jagger C, le Roy S, Robine JM, et al. The reliability of the Minimum European Health Module. Int J Public Health 2009 Jan;54(2):55-60. [CrossRef] [Medline]
    32. Bethell CD, Read D, Stein RE, Blumberg SJ, Wells N, Newacheck PW. Identifying children with special health care needs: development and evaluation of a short screening instrument. Ambul Pediatr 2002;2(1):38-48. [Medline]
    33. Kromeyer-Hauschild K, Wabitsch M, Kunze D, Geller F, Geiß H, Hesse V, et al. Perzentile für den Body-mass-Index für das Kindes- und Jugendalter unter Heranziehung verschiedener deutscher Stichproben. Monatsschr Kinderheilkd 2001 Aug 1;149(8):807-818 [FREE Full text] [CrossRef]
    34. Goodman R. The extended version of the Strengths and Difficulties Questionnaire as a guide to child psychiatric caseness and consequent burden. J Child Psychol Psychiatry 1999 Jul;40(5):791-799. [Medline]
    35. Woerner W, Becker A, Rothenberger A. Normative data and scale properties of the German parent SDQ. Eur Child Adolesc Psychiatry 2004 Jul;13 Suppl 2:II3-I10. [CrossRef] [Medline]
    36. Rattay P, Starker A, Domanska O, Butschalowsky H, Gutsche J, Kamtsiuris P, KiGGS Study Group. [Trends in the utilization of outpatient medical care in childhood and adolescence: results of the KiGGS study - a comparison of baseline and first follow up (KiGGS Wave 1)]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2014 Jul;57(7):878-891. [CrossRef] [Medline]
    37. The Kidscreen Group Europe (editors). The KIDSCREEN Questionnaires: Quality of Life Questionnaires for Children and Adolescents. Lengerich, Germany: Pabst Science Publishers; 2006.
    38. Lampert T, Kuntz B, KiGGS Study Group. [Tobacco and alcohol consumption among 11- to 17-year-old adolescents: results of the KiGGS study: first follow-up (KiGGS Wave 1)]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2014 Jul;57(7):830-839. [CrossRef] [Medline]
    39. Manz K, Schlack R, Poethko-Müller C, Mensink G, Finger J, Lampert T, KiGGS Study Group. [Physical activity and electronic media use in children and adolescents: results of the KiGGS study: first follow-up (KiGGS wave 1)]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2014 Jul;57(7):840-848. [CrossRef] [Medline]
    40. Bush K, Kivlahan DR, McDonell MB, Fihn SD, Bradley KA. The AUDIT alcohol consumption questions (AUDIT-C): an effective brief screening test for problem drinking. Ambulatory Care Quality Improvement Project (ACQUIP). Alcohol Use Disorders Identification Test. Arch Intern Med 1998 Sep 14;158(16):1789-1795. [Medline]
    41. World Health Organization. Geneva: WHO; 2010. Global recommendations on physical activity for health   URL: [accessed 2018-01-27] [WebCite Cache]
    42. Hox J, de Leeuw E, Klausch T. Mixed mode research: issues in design and analysis. In: Biemer PP, de Leeuw E, Eckman S, Edwards B, Kreuter F, Lyberg LE, et al, editors. Total Survey Error in Practice: Improving Quality in the Era of Big Data. New York: Wiley; 2017:511-530.
    43. Abuladze L, Kunder N, Lang K, Vaask S. Associations between self-rated health and health behaviour among older adults in Estonia: a cross-sectional analysis. BMJ Open 2017 Jun 09;7(6):e013257 [FREE Full text] [CrossRef] [Medline]
    44. Link BG, Phelan JC. Understanding sociodemographic differences in health--the role of fundamental social causes. Am J Public Health 1996 Apr;86(4):471-473. [Medline]
    45. Graubard BI, Korn EL. Predictive margins with survey data. Biometrics 1999 Jun;55(2):652-659. [Medline]
    46. Bland JM, Altman DG. Multiple significance tests: the Bonferroni method. Br Med J 1995 Jan 21;310(6973):170 [FREE Full text] [Medline]
    47. Estacio EV, Whittle R, Protheroe J. The digital divide: examining socio-demographic factors associated with health literacy, access and use of internet to seek health information. J Health Psychol 2017 Feb 01:1359105317695429. [CrossRef] [Medline]
    48. Kontos E, Blake KD, Chou WS, Prestin A. Predictors of eHealth usage: insights on the digital divide from the Health Information National Trends Survey 2012. J Med Internet Res 2014 Jul;16(7):e172 [FREE Full text] [CrossRef] [Medline]
    49. Shim JM, Shin E, Johnson TP. Self-rated health assessed by web versus mail modes in a mixed mode survey: the digital divide effect and the genuine survey mode effect. Med Care 2013 Sep;51(9):774-781. [CrossRef] [Medline]
    50. Latulippe K, Hamel C, Giroux D. Social health inequalities and eHealth: a literature review with qualitative synthesis of theoretical and empirical studies. J Med Internet Res 2017 Apr 27;19(4):e136 [FREE Full text] [CrossRef] [Medline]
    51. Börkan B. The mode effect in mixed-mode surveys: mail and web surveys. Soc Sci Comput Rev 2010;28(3):371-380 [FREE Full text]
    52. Dolnicar S, Laesser C, Matus K. Online versus paper format effects in tourism surveys. J Travel Res 2009;47(3):295-316.
    53. Link MW, Mokdad AH. Alternative modes for health surveillance surveys: an experiment with web, mail, and telephone. Epidemiology 2005 Sep;16(5):701-704. [Medline]
    54. Callas PW, Solomon LJ, Hughes JR, Livingston AE. The influence of response mode on study results: offering cigarette smokers a choice of postal or online completion of a survey. J Med Internet Res 2010 Oct 21;12(4):e46 [FREE Full text] [CrossRef] [Medline]
    55. Messer B, Edwards M, Dillman D. Determinants of item nonresponse to web and mail respondents in three address-based mixed-mode surveys of the general public. Surv Pract 2012;5(2):1-27 [FREE Full text]
    56. Zuidgeest M, Hendriks M, Koopman L, Spreeuwenberg P, Rademakers J. A comparison of a postal survey and mixed-mode survey using a questionnaire on patients' experiences with breast care. J Med Internet Res 2011 Sep 27;13(3):e68 [FREE Full text] [CrossRef] [Medline]
    57. Smith AB, King M, Butow P, Olver I. A comparison of data quality and practicality of online versus postal questionnaires in a sample of testicular cancer survivors. Psychooncology 2013 Jan;22(1):233-237. [CrossRef] [Medline]
    58. Gwaltney CJ, Shields AL, Shiffman S. Equivalence of electronic and paper-and-pencil administration of patient-reported outcome measures: a meta-analytic review. Value Health 2008;11(2):322-333 [FREE Full text] [CrossRef] [Medline]
    59. Brøgger J, Nystad W, Cappelen I, Bakke P. No increase in response rate by adding a web response option to a postal population survey: a randomized trial. J Med Internet Res 2007 Dec 31;9(5):e40 [FREE Full text] [CrossRef] [Medline]
    60. Austin DW, Carlbring P, Richards JC, Andersson G. Internet administration of three commonly used questionnaires in panic research: equivalence to paper administration in Australian and Swedish samples of people with panic disorder. Int J Test 2006 Mar;6(1):25-39. [CrossRef]
    61. Zhang X, Kuchinke L, Woud ML, Velten J, Margraf J. Survey method matters: online/offline questionnaires and face-to-face or telephone interviews differ. Comput Human Behav 2017 Jun;71:172-180. [CrossRef]
    62. Basnov M, Kongsved SM, Bech P, Hjollund NH. Reliability of short form-36 in an Internet- and a pen-and-paper version. Inform Health Soc Care 2009 Jan;34(1):53-58. [CrossRef] [Medline]
    63. Broering JM, Paciorek A, Carroll PR, Wilson LS, Litwin MS, Miaskowski C. Measurement equivalence using a mixed-mode approach to administer health-related quality of life instruments. Qual Life Res 2014 Mar;23(2):495-508. [CrossRef] [Medline]
    64. Kongsved SM, Basnov M, Holm-Christensen K, Hjollund NH. Response rate and completeness of questionnaires: a randomized study of Internet versus paper-and-pencil versions. J Med Internet Res 2007 Sep 30;9(3):e25 [FREE Full text] [CrossRef] [Medline]
    65. Wyrick DL, Bond L. Reducing sensitive survey response bias in research on adolescents: a comparison of web-based and paper-and-pencil administration. Am J Health Promot 2011 May;25(5):349-352. [CrossRef] [Medline]
    66. McCabe SE, Diez A, Boyd CJ, Nelson TF, Weitzman ER. Comparing web and mail responses in a mixed mode survey in college alcohol use research. Addict Behav 2006 Sep;31(9):1619-1627 [FREE Full text] [CrossRef] [Medline]
    67. Pealer LN, Weiler RM, Pigg RM, Miller D, Dorman SM. The feasibility of a web-based surveillance system to collect health risk behavior data from college students. Health Educ Behav 2001 Oct;28(5):547-559. [CrossRef] [Medline]
    68. Jun MK. Websm. Indiana: Indiana University; 2005. Effects of survey mode, gender, and perceived sensitivity on the quality of data regarding sensitive health behaviors   URL: [accessed 2018-01-27] [WebCite Cache]
    69. McMorris BJ, Petrie RS, Catalano RF, Fleming CB, Haggerty KP, Abbott RD. Use of web and in-person survey modes to gather data from young adults on sex and drug use: an evaluation of cost, time, and survey error based on a randomized mixed-mode design. Eval Rev 2009 Apr;33(2):138-158 [FREE Full text] [CrossRef] [Medline]
    70. Schilling R, Hoebel J, Müters S, Lange C. Berlin: RKI; 2015. Pilotstudie zur Durchführung von Mixed-Mode-Gesundheitsbefragungen in der Erwachsenenbevölkerung (Projektstudie GEDA 2.0). Beiträge zur Gesundheitsberichterstattung des Bundes   URL: [accessed 2018-01-27] [WebCite Cache]

    Abbreviations

    ADHD: attention-deficit hyperactivity disorder
    AUDIT-C: Alcohol Use Disorders Identification Test
    HRQoL: health-related quality of life
    KiGGS: The German Health Interview and Examination Survey for Children and Adolescents
    MM: mixed-mode
    SAQ-paper: self-administered paper-based questionnaires
    SAQ-Web: self-administered Web-based questionnaires
    SDQ: Strengths and Difficulties Questionnaire
    SRH: self-rated health

    Edited by G Eysenbach; submitted 12.04.17; peer-reviewed by T Johnson, J Broering; comments to author 15.06.17; revised version received 14.10.17; accepted 16.11.17; published 05.03.18

    ©Elvira Mauz, Robert Hoffmann, Robin Houben, Laura Krause, Panagiotis Kamtsiuris, Antje Gößwald. Originally published in the Journal of Medical Internet Research, 05.03.2018.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.