Published in Vol 23, No 3 (2021): March

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/15846.
Effects of Information Architecture on the Effectiveness and User Experience of Web-Based Patient Education in Middle-Aged and Older Adults: Online Randomized Experiment


Original Paper

1Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands

2Faculty of Behavioural, Management and Social sciences, University of Twente, Enschede, Netherlands

3Department of Orthopaedic Surgery, Reinier de Graaf Hospital, Delft, Netherlands

Corresponding Author:

Tessa Dekkers, PhD

Faculty of Behavioural, Management and Social sciences

University of Twente

Drienerlolaan 5

Enschede, 7522 NB

Netherlands

Phone: 31 534899741

Email: t.dekkers@utwente.nl


Background: Web-based patient education is increasingly offered to improve patients’ ability to learn, remember, and apply health information. Efficient organization, display, and structural design, that is, information architecture (IA), can support patients’ ability to independently use web-based patient education. However, the role of IA in the context of web-based patient education has not been examined systematically.

Objective: To support intervention designers in making informed choices that enhance patients’ learning, this paper describes a randomized experiment on the effects of IA on the effectiveness, use, and user experience of a patient education website and examines the theoretical mechanisms that explain these effects.

Methods: Middle-aged and older adults with self-reported hip or knee joint complaints were recruited to use and evaluate 1 of 3 patient education websites containing information on total joint replacement surgery. Each website contained the same textual content based on an existing leaflet but differed in the employed IA design (tunnel, hierarchical, or matrix design). Participants rated the websites on satisfaction, engagement, control, relevance, trust, and novelty and completed an objective knowledge test. Analyses of variance and structural equation modeling were used to examine the effects of IA and construct a theoretical model.

Results: We included 215 participants in our analysis. IA did not affect knowledge gain (P=.36) or overall satisfaction (P=.07) directly. However, tunnel (mean 3.22, SD 0.67) and matrix (mean 3.17, SD 0.69) architectures were found to provide more emotional support compared with hierarchical architectures (mean 2.86, SD 0.60; P=.002). Furthermore, increased perceptions of personal relevance in the tunnel IA (β=.18) were found to improve satisfaction (β=.17) indirectly. Increased perceptions of active control in the matrix IA (β=.11) also improved satisfaction (β=.27) indirectly. The final model of the IA effects explained 74.3% of the variance in satisfaction and 6.8% of the variance in knowledge and achieved excellent fit (χ2 [df=17, N=215]=14.7; P=.62; root mean square error of approximation=0.000; 95% CI [0.000-0.053]; comparative fit index=1.00; standardized root mean square residual=0.044).

Conclusions: IA has small but notable effects on users’ experiences with web-based health education interventions. Web-based patient education designers can employ tunnel IA designs to guide users through sequentially ordered content or matrix IA to offer users more control over navigation. Both improve user satisfaction by increasing user perceptions of relevance (tunnel) and active control (matrix). Although additional research is needed, hierarchical IA designs are currently not recommended, as hierarchical content is perceived as less supportive, engaging, and relevant, which may diminish the use and, in turn, the effect of the educational intervention.

J Med Internet Res 2021;23(3):e15846

doi:10.2196/15846



Introduction

Background

Verbal and written patient education methods are often supplemented with web-based education to improve patients’ ability to learn, remember, and apply health information. Such improvements are needed because patients’ recall of traditional education is generally poor [1-3], which negatively affects their satisfaction with care, ability to self-manage, and emotional well-being [4,5].

There are many options to engage patients with web-based education, ranging from animations and interactive exercises to tailored health advice [6]. However, for education to be the most effective, patients must be able to use such functions independently. An efficient information architecture (IA) supports independent use [7,8], yet few studies have systematically examined IA in the context of web-based health education. To support intervention designers in making informed choices that enhance patients’ learning, this paper describes a randomized experiment concerning the effect of IA on the effectiveness, use, and user experience of a patient education website and the theoretical mechanisms that explain these effects. In addition, the study explores the benefit of tailoring IA to specific user profiles.

IA

IA concerns “the structural design of a shared information environment” [9]. It describes “the way in which digital content is organized and displayed, which strongly impacts users’ ability to find and use content” [10]. IA has a pervasive role in website design because it affects the user’s ability to find information with no or very limited training and helps save long-term costs. Web-based environments with effective IAs are typically more scalable, easier to maintain and update, and require fewer redesigns [9].

Yet, despite the importance of IA, there is a lack of primary research that examines IA specifically in the context of web-based health education. A recent review on this subject revealed that to date, only 1 study has empirically manipulated IA in isolation from other design features [10]. This study, conducted in 2012 by Crutzen et al [11] to examine web-based hepatitis information, investigated whether providing users with the opportunity to skip pages (or not) affected website use and user perceptions of efficiency, effectiveness, and enjoyment. It was found that an architecture that provided users with less control over navigation increased both website use and knowledge gain [11]. Although this study demonstrated that IA influences web-based learning experiences, it examined only one particular IA design (the tunnel). Therefore, we argue that a more comprehensive examination of IA is required.

For this purpose, we used the taxonomy of 4 archetypes of IA by Danaher et al [12,13]: tunnel, hierarchical, matrix, and hybrid architectures. Hybrid architectures mix design elements of tunnel, hierarchical, and matrix architectures. Each hybrid mix may thereby present unique advantages and disadvantages that cannot be readily understood before experimentation with the nonhybrid IA designs. Therefore, this study focuses on the three nonhybrid IA designs (ie, tunnel, hierarchical, and matrix) only. The features, advantages, and disadvantages of each design are outlined below, and additional examples of each IA design are presented in the Methods section of this paper.

The tunnel IA design is the most common IA in health interventions: 90%-100% of interventions for chronic illness or mental health support include some form of tunneling [14]. In a typical tunnel IA, users follow a step-by-step approach to access content in a predefined, sequential order. For example, a website that only allows access to new material once users have completed previous lessons can be considered to have a tunneled design. A possible advantage of this IA is that it reduces the complexity of information. However, it also reduces the perceived control of users, which may decrease engagement and lead to nonadherence and attrition [15].

The second IA archetype is the hierarchical design. Hierarchical designs organize content hierarchically, differentiating between major and minor content. Typically, users are first provided with a general overview of the major content present on the website. For example, the official United States government website on health organizes content by major topics such as “Health Insurance,” “Medications,” and “Vaccines and Immunizations.” After selecting the appropriate topic, users can explore nested, minor content to review in detail. Assumed advantages of this IA include increased control over content selection, familiarity, and simplicity. However, usability may be limited when users are unable to locate deeply nested content.

The third IA archetype concerns the matrix design. This IA design presents all available content on 1 home page or dashboard, thereby removing the differentiation between major and minor content and the predefined sequential paths of the hierarchical and tunnel designs, respectively. This allows users to freely navigate content in their preferred order and at their preferred pace. Travel agency websites that display all available travel options first and then allow users to sort by date, price, or location are examples of matrix designs. The matrix IA design is considered engaging yet potentially disorienting and is particularly appropriate for highly educated and experienced users looking for enrichment [15,16].
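The navigational difference between the three archetypes can be sketched as simple adjacency structures mapping each page to the pages reachable from it. This is a minimal illustration only; the page names are invented for the example and are not taken from the experimental websites:

```python
# Sketch: the three nonhybrid IA archetypes as adjacency lists.
# Page names are illustrative placeholders.
PAGES = ["surgery_day", "pain", "medication", "physiotherapy"]

# Tunnel: strictly sequential; each page links only to the next page.
tunnel = {p: ([PAGES[i + 1]] if i + 1 < len(PAGES) else [])
          for i, p in enumerate(PAGES)}

# Hierarchical: a home page groups the topics; each topic links back home.
hierarchical = {"home": PAGES, **{p: ["home"] for p in PAGES}}

# Matrix: every page links to every other page (free navigation).
matrix = {p: [q for q in PAGES if q != p] for p in PAGES}
```

The tunnel graph enforces a single path through the content, the hierarchical graph funnels all navigation through an overview page, and the matrix graph imposes no order at all.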

What Explains the Effects of IA?

Many scholars have condemned the black box approach to eHealth, which offers little understanding of the underlying mechanisms through which web-based interventions (and the tools, techniques, and strategies embedded in them) exert their effects [13,14,17]. IA design has the same issue. Although each IA design has several assumed benefits (eg, increased usability and increased user control), as outlined above, there is no overarching conceptual model of IA effects. This makes it difficult to determine how IA affects the user experience of a health education website. Therefore, we examine in depth 5 aspects of the user experience that may be influenced by IA design: user engagement and user perceptions of control, personal relevance, trustworthiness, and novelty. These are depicted in the conceptual model (Figure 1). We do not hold specific expectations regarding the main effects of IA design but rather expect that each IA design may elicit a different user experience in comparison with the other IA designs, as detailed below.

Figure 1. Conceptual model of information architecture (IA). Solid arrows represent expected effects related to matrix IA design, dashed arrows represent expected effects related to hierarchical IA design, and dotted arrows represent expected effects related to tunnel IA design.
User Engagement

First, we hypothesize that IA design affects user engagement. User engagement is defined as “a quality of user experience characterized by the depth of an actor’s investment when interacting with a digital system” [18,19]. It is often conceptualized as a multidimensional construct composed of cognitive, affective, and behavioral components [20], which means that engagement can both refer to a subjective experience of flow and immersion as well as the actual act of using an intervention [15]. Several recent reviews suggest that user engagement is pivotal for creating an effective and enjoyable web-based experience [15,21].

Our expectations regarding IA design as a determinant of engagement are twofold. First, tunnel IA designs (in comparison with hierarchical and matrix IA) are thought to increase behavioral engagement because the sequential, predefined setup allows researchers to persuasively guide users through the web-based process, resulting in extended use [11,14]. In a study of a web-based smoking cessation intervention, users who viewed content in a set order accessed content more often and for longer [22]. This indicates that tunnel IA design should result in higher levels of behavioral user engagement. In contrast, a more flexible matrix IA design may increase the subjective experience of engagement by providing the user more control over the interaction, as outlined below.

Perceived Active Control

As stated earlier, tunnel IA designs have been found to decrease user perceptions of control [11]. User control is a “user’s ability to voluntarily participate in and instrumentally influence a communication” [23,24]. As matrix IA designs allow users to both influence the selection of content and the order in which content is consumed, this design is expected to increase perceptions of user control. Active user control is a dimension of interactivity [23,24], and interactive interventions, in turn, are associated with a more engaging experience [6,25]. Possibly, this is because users who are able to influence an intervention instrumentally consider this to be an enjoyable experience or become more emotionally invested in the intervention. It is important to note here that perceived interactivity and control appear to be more important than actual website interactivity [26,27]. Together, this indicates that matrix IA designs may also improve (cognitive or affective components of) engagement through increased user perceptions of control.

Perceived Personal Relevance

Perceived personal relevance refers to the extent to which people feel that information is relevant to themselves and their situation [28-30]. People are more motivated to process personally relevant content, leading to deeper processing and greater susceptibility to any persuasive attempts the content makes [28,31,32]. Perceptions of relevance have also been linked to educational enjoyment [33]. We expect that perceived personal relevance may increase knowledge acquisition through the same motivational pathway. Hierarchical and matrix IA designs are the only designs that allow users to select content. We expect that users, to some extent, select content based on what they consider most personally relevant. Therefore, we hypothesize that hierarchical and matrix IA designs (in contrast to tunnel IA design) increase the perceived personal relevance of the health information presented and that this leads to both greater knowledge acquisition and greater satisfaction.

Perceived Trust

Perceived trust is a belief that influences whether a patient is willing to engage with health education [34]. Trust in health information is influenced by source, message, channel, and recipient [35,36] as well as structural website features [37]. A previous study on the credibility of health websites showed that the presence of a navigation menu (as is included in most hierarchical IA designs) increases perceived website credibility, as it reinforces the notion that the website is produced by a professional organization [37]. This type of heuristic evaluation of information credibility can lead to a better experience on the health website [38]. Therefore, we hypothesize that hierarchical IA design positively influences participants’ trust in the health information presented and, in turn, the knowledge and satisfaction derived from the education.

Perceived Novelty

Finally, we considered perceived novelty as a potential explanatory variable. As the tunnel IA design is the norm in health interventions, other IA designs may offer more novel ways to access health information. Novelty in the context of interfaces can “act as a curiosity generating mechanism that arouses the imaginations of users and captures their interest in a site” [39]. Users pay greater attention and effort to novel media [40], subsequently leading to a greater uptake of information. Novelty has also been related to enjoyable experiences of flow and engagement [18,38]. Therefore, we expect that the less common IA designs (hierarchical and matrix) will increase user perceptions of novelty and that increased novelty will improve both user satisfaction and knowledge acquisition through increased attention to the content.

Does One IA Design Fit All?

A final consideration in examining the effects of IA is the role of individual preferences and capabilities. Many recommendations regarding IA design take user characteristics into account. For example, Lynch and Horton [16] describe matrix IA designs (which they refer to as webs) as more suitable for highly educated users with a high level of prior knowledge about the content. It has also been suggested that perceived control over website navigation may be more important to some users than to others [11]. However, the influence of individual differences on the effectiveness and experience of different IA designs has not been empirically tested.

This study used a previously defined set of user profiles of patients [41] who had undergone total joint replacement (TJR) surgery to explore the potential benefit of tailored IA design (Table 1). Each profile represents 1 of 3 ways through which communicative preferences and capabilities may manifest in patients. So-called managing patients prefer open, participative communication, particularly regarding personal circumstances, and have high capabilities and self-efficacy for understanding and applying health information. In comparison, optimistic patients have similar capabilities but find patient-provider communication of lesser importance and only have a slight preference for an open communicative style. Finally, modest patients value both open information and emotional support but have limited self-efficacy and skills in health communication. With these profiles and the recommendations for each IA design in mind, we hypothesize that users with higher preferences for open communication (ie, managing patients) will prefer IA designs that offer more control (ie, matrix), optimistic patients will not prefer any IA design in particular, and modest patients will prefer more supportive IA designs that guide them through the educational content step by step (ie, tunnel).

Table 1. Description of communicative preferences and capabilities of three total joint replacement patient profilesᵃ.

Characteristic | Managing profile | Optimistic profile | Modest profile
Preference for open communication | High | Moderate | Moderate
Preference for emotionally supportive communication | High | Low | Moderate
Critical communication capabilities | High | Moderate | Low
Personal communication capabilities | High | Moderate | Low
Self-efficacy for health information | High | High | Low

ᵃPatient profiles are based on Groeneveld et al [41].

Study Objectives

The aims of this study are threefold: (1) to test the effects of IA in the context of a TJR surgery patient education website on knowledge acquisition and satisfaction with web-based education; (2) to test possible working mechanisms of IAs, including user engagement, perceived user control, perceived personal relevance, perceived trust, and perceived novelty; and (3) to explore the potential of tailored IAs.


Methods

Design

In July 2018, we conducted a between-subjects experiment comparing the knowledge and satisfaction gained from a patient education website with three different IA designs. Ethics approval for this study was obtained from the Human Research Ethics Committee of Delft University of Technology. Participants provided written consent and signed a data processing agreement formulated in accordance with the General Data Protection Regulation.

Participants and Procedure

Participants were recruited using a Dutch web-based consumer research service (respondenten.nl B.V.). Middle-aged to older adults (40-80 years) with self-reported chronic hip or knee joint complaints (including arthrosis, wear and tear, chronic inflammation, birth deficits, or unknown causes) were eligible for participation. To detect a small-to-medium effect (Cohen f=0.15-0.25) on satisfaction and knowledge using an α of .05 and a power of 0.80, a sample size between 159 and 432 participants was needed [42,43]. We aimed to recruit at least 100 participants per condition for a total sample of 300 participants. In total, we were able to enroll 235 participants, of whom 215 were included in the analysis (see the Results section). Participants received monetary reimbursement (15 euro [US $18.2]) for their participation.
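The cited sample size range can be reproduced with a standard noncentral F power computation. A minimal sketch, assuming a one-way ANOVA with 3 equal groups (the function name is ours, not from the paper):

```python
# Power of a one-way ANOVA for a given Cohen f effect size and total N,
# computed from the noncentral F distribution.
from scipy.stats import f as f_dist, ncf

def anova_power(f_effect, n_total, k=3, alpha=0.05):
    df1, df2 = k - 1, n_total - k
    f_crit = f_dist.ppf(1 - alpha, df1, df2)   # critical F at alpha
    noncentrality = f_effect ** 2 * n_total    # lambda = f^2 * N
    return 1 - ncf.cdf(f_crit, df1, df2, noncentrality)
```

With these inputs, a total N of about 159 reaches roughly 80% power at f=0.25, and about 432 at f=0.15, matching the range reported above.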

The complete experiment was conducted on the web via survey software (Qualtrics). Each eligible participant was provided a hyperlink to the survey. After providing consent, participants filled out questionnaires regarding their communication preferences and skills, health, anxiety, and coping behavior, which were used to determine the patient profile [41]. Participants also stated the extent to which they already felt knowledgeable about TJR surgery (part A). In part B, participants were randomly assigned to 1 of 3 experimental conditions using Qualtrics’ built-in randomizer. The allocation sequence and assignments were concealed from all participants, the researchers, and the consultant hired for participant recruitment until all data were collected. Participants were initially asked to focus on either the website’s design or its content. After reviewing the website’s design, participants reported satisfaction and user perceptions. They were then asked to view the website for a second time while focusing on content. Then, they completed a knowledge test designed for the purpose of this study. The order of focus (design vs content) was counterbalanced. Finally, participants shared their sociodemographic information and received a code for reimbursement (part C). Eligible participants who had not started or completed the survey after 3 weeks were reminded via email once.

Materials

Design Process

The three websites were designed between March and June 2018 by a design agency (Panton B.V.) specializing in the design of products, services, and processes for health care under the supervision of the first author. The lead designer was provided with literature on IA [12] and was given access to patient profile role descriptions and anonymized data about patients’ communication preferences and capabilities collected in an earlier study (T Dekkers, PhD, unpublished data, February 2017). In June, prototypes of the websites were pilot tested. To discuss progress and ensure accuracy and quality of health information shared on the patient education websites, the design team met with the first author 10 times throughout the design process. At 2 points in the design process (after first conceptualization and after the pilot tests), the design team also met with the full research team, including an orthopedic surgeon.

Pilot Usability Study

Prototypes of the three websites were pilot tested with 7 patients (age range 46-77 years) scheduled for TJR surgery and 7 informal caregivers (age range 42-76 years) in June 2018. The pilot test focused specifically on usability of the websites rather than effectiveness in terms of knowledge acquisition. Interested patients present at the clinic for scheduled group-based patient education were shown the prototypes after they provided written consent. They first freely explored the websites while mentioning aloud any (positive or negative) aspects that stood out. Then, they were asked to find information about the first checkup after surgery. This assignment was used to identify usability issues and software bugs [44]. Finally, patients were asked to report engagement using the User Engagement Scale-Short Form (UES-SF, see Measurements section). Throughout the pilot test, the cursor of the participants was tracked using screen capture software (CamStudio Recorder v2.7, Rendersoft Development). Screen captures were used both to identify unclear navigational cues and to get an initial impression of whether the users navigated through the IAs as intended (eg, whether patients explored more pages in the matrix design, made use of the table of contents in the hierarchical design, and moved step by step using the next and prior buttons in the tunnel design). The input of patients and caregivers was shared with the lead designer and implemented in the following iteration of the design. This led to significant improvements in usability, including less scrollable text, more prominently displayed contact information, vivid color accents, and larger buttons.

Websites

All websites contained the same textual content based on an existing patient education leaflet titled Instructions after an outpatient Total Hip Prosthesis (THP; Instructies na een Totale Heup Prothese [THP] in dagbehandeling) used by the local hospital (Reinier de Graaf Gasthuis, the Netherlands). The leaflet addressed practical concerns before and after outpatient THP surgery, including preparation for surgery, pain, medication, and physiotherapy. All graphic design elements (including photos, fonts, and color) were equivalent across websites.

The tunnel IA website design had a chronological sequential ordering of topics presented as a timeline, starting with the day of the operation and ending with the 3-month follow-up and frequently asked questions. Navigation was limited to next and previous buttons placed below the text and in the timeline. Topics that were not yet accessible to the user were grayed out (Figure 2). The hierarchical IA website design presented participants with a choice menu in which they selected the phase of their patient journey (eg, in the hospital and able to walk a few steps). After selecting an option, users were presented with topics grouped in a table-of-content menu. Participants could further investigate their chosen topic using the menu and could return to the home page using the buttons or navigation path (ie, bread crumb trail). The matrix IA website design showed all topics in tiles on the home page and provided no suggested reading order. By clicking on the topic tiles or hyperlinks in the body of text, participants could switch between topics. Offline copies of the experimental websites are available on request by contacting the first author.

Figure 2. Annotated screenshots of tunnel, hierarchical, and matrix information architecture (IA) design of a Dutch patient education website to prepare patients for total joint replacement surgery. Tunnel IA: (A) next/previous buttons, (B) grayed-out text (not yet accessible), and (C) next/previous buttons. Hierarchical IA: (D) table of contents, (E) major grouping by recovery phase, and (F) return to main menu. Matrix IA: (G) topic matrix and (H) hyperlink. All screenshots depict the same content about pain and swelling (pijn en zwelling).

Measurements

The primary outcomes of interest are knowledge acquisition and website satisfaction. Satisfaction with web-based education captures both the attitude of patients toward website functioning (eg, satisfaction with comprehensibility and with emotional support derived from the website) as well as their affective attitude (eg, satisfaction with website attractiveness) [45,46]. The secondary outcomes used to test the conceptual model include user perceptions of engagement, control, personal relevance, trust, and novelty. We also measured use by capturing the total time spent on the website in minutes. Finally, we collected short qualitative feedback forms on the perceived advantages and disadvantages of the website.

Knowledge Acquisition and Satisfaction With Website

A total of 5 multiple-choice (MC) questions and 3 open questions about (self-)care after TJR surgery were used to assess knowledge acquisition. The questions were based on the information provided on the websites and included, for example: “After the surgery, it is important to strengthen the muscles surrounding the hip joint. Which ways to do so are recommended by orthopedic surgeons?” Each question included the following answer options: “not been discussed,” “discussed, but I cannot remember the details,” a correct answer, and an incorrect answer (distractor) [47]. For each correct MC answer, participants scored 1 point, and for each open question, an answer sheet was developed that assigned points from 0 (incorrect) through 1 (partly correct) to 2 (fully correct). All points were summed and converted to reflect the percentage of correct answers (0%-100% correct).
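The scoring rule above can be sketched as follows (the function name and example values are illustrative, not from the study's analysis code):

```python
# Knowledge score: 1 point per correct multiple-choice (MC) answer,
# 0-2 points per open question, summed and converted to a percentage
# of the maximum attainable score (5 * 1 + 3 * 2 = 11 points).
def knowledge_score(mc_correct, open_points, n_mc=5, n_open=3):
    max_points = n_mc * 1 + n_open * 2
    total = mc_correct + sum(open_points)
    return 100 * total / max_points

# Example: 4 MC questions correct, open answers scored 2, 1, and 0.
knowledge_score(4, [2, 1, 0])
```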

Satisfaction with patient education was measured using the Website Satisfaction Scale [45,46] comprising three subscales: satisfaction with the (1) attractiveness, (2) comprehensibility of the information, and (3) emotional support received from the website. All items consisted of statements to which participants’ agreement was measured on a 5-point Likert scale (1=totally disagree and 5=totally agree). Statements included “the website looks nice,” “the website is understandable,” and “the website gives ease of mind.” Both the overall index score of satisfaction and the separate subscales achieved excellent reliability (α=.82-.98).

User Perceptions of Engagement, Active Control, Personal Relevance, Trust, and Novelty

We included 5 constructs to explore the theoretical mechanisms through which (tailored) IAs may influence knowledge acquisition and satisfaction. The first is user engagement, as measured through the UES-SF [19]. We obtained permission to translate this validated questionnaire to Dutch according to the guidelines for cross-cultural adaptation of self-reported instruments [48,49] (personal communication by HL O’Brien, May 18, 2018). The instrument contains 12 questions, which form 1 index score (α=.88), and 4 subscales: focused attention (I was absorbed in this experience, α=.75), aesthetic appeal (the website was attractive, α=.87), reward (using the website was worthwhile, α=.71), and perceived usability (I felt frustrated while using the website, α=.79; Multimedia Appendix 1 [18,19,50]). The other user perceptions of interest included perceived active control (during the website visit, I could freely decide what I wanted to see, 4 items, α=.96) [27], personal relevance (the website was relevant to my situation, 2 items, α=.83) [51], trust (the website is sincere and honest, 3 items, α=.97) [34], and novelty (the website incited my curiosity, 3 items, α=.90) [50]. All questions were answered on a 5-point Likert scale (1=strongly disagree and 5=strongly agree).
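The reliability coefficients reported throughout this section are Cronbach α values. As a generic illustration of how such a coefficient is computed from a respondents-by-items matrix (this is not the study's analysis code):

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance
# of the summed scale), with rows = respondents and columns = items.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly consistent items yield α=1; values of .70 and above are conventionally read as acceptable reliability.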

Statistical Methods

We conducted chi-square (χ2) and analyses of variance (ANOVA) tests to check whether background characteristics were evenly distributed over experimental conditions. To test the main effect of IA, 2 ANOVA tests were conducted with satisfaction and knowledge gain as dependent variables. Follow-up pairwise t tests were performed to explore differences between the IA designs, and these were all corrected using the Bonferroni correction. Finally, ANOVA tests were performed with the secondary outcomes (user perceptions) as dependent variables, and the concept of tailored IAs was explored in a two-way ANOVA with condition and profile as the independent variables.
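The main-effect analysis described above can be sketched on simulated data (the group means, SDs, and sizes below are invented for illustration and are not the study's data):

```python
import numpy as np
from scipy import stats

# Simulated satisfaction scores for the three IA conditions.
rng = np.random.default_rng(0)
groups = {
    "tunnel": rng.normal(3.2, 0.7, 70),
    "hierarchical": rng.normal(2.9, 0.6, 70),
    "matrix": rng.normal(3.2, 0.7, 75),
}

# Omnibus one-way ANOVA over the three conditions.
F, p = stats.f_oneway(*groups.values())

# Follow-up pairwise t tests with Bonferroni correction
# (raw P value multiplied by the number of comparisons).
pairs = [("tunnel", "hierarchical"), ("tunnel", "matrix"),
         ("hierarchical", "matrix")]
for a, b in pairs:
    t, p_raw = stats.ttest_ind(groups[a], groups[b])
    p_adj = min(p_raw * len(pairs), 1.0)
```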

To construct a conceptual model of how IA influences satisfaction and knowledge acquisition, we used structural equation modeling. User perceptions of engagement, personal relevance, active control, trust, and novelty (hereafter, mediating variables) were regressed on IA. Satisfaction and knowledge were regressed on IA and the mediating variables. To improve the parsimony and fit of the model, we removed nonsignificant paths. As our hypotheses suggest that IA design may influence perceived control and subsequently user engagement, and ultimately satisfaction and knowledge, we also constructed a separate serial mediation model for this hypothesis specifically. Model chi-square (χ2), comparative fit index (CFI), standardized root mean square residual (SRMR), and root mean square error of approximation (RMSEA) were used to determine model fit. A model was considered to have a good fit when χ2 divided by degrees of freedom ≤3 with P<.05, CFI≥0.95, SRMR≤0.09, and RMSEA≤0.07 [52,53]. All analyses were conducted using R version 3.5.1 [54] with α=.05.
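The fit criteria can be summarized as a single check; a sketch using the cutoff values cited above (the function name is ours):

```python
# Good model fit per the cutoffs used in the text:
# chi-square/df <= 3, CFI >= 0.95, SRMR <= 0.09, RMSEA <= 0.07.
def good_fit(chi2, df, cfi, srmr, rmsea):
    return (chi2 / df <= 3 and cfi >= 0.95
            and srmr <= 0.09 and rmsea <= 0.07)

# The final model reported in the Results meets all criteria:
good_fit(chi2=14.7, df=17, cfi=1.00, srmr=0.044, rmsea=0.000)  # True
```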


Participant Characteristics

We enrolled 235 participants, of whom 215 were included in the analysis (Figure 3). A total of 20 participants completed the survey on a mobile device, despite instructions to view the survey and the websites on a laptop or personal computer. As the layout and, thus, the information architecture of the websites may appear distorted on mobile devices, these participants were excluded from the analysis. There were no significant differences between the excluded and included participants with respect to background characteristics, except for device use (P<.001): excluded participants were less likely to use a personal computer (47% vs 9% reporting nonuse) and more likely to use a tablet (89% vs 41% reporting use). No significant associations were found between background characteristics and experimental conditions, indicating that participants were evenly distributed over the three conditions. All participant characteristics are reported in Table 2. Participants had a mean age of 57 years (SD 7.7); most were female (155/215, 72.1%), had attained lower secondary education (95/215, 44.2%), and were employed or self-employed (118/215, 54.9%). They used the internet daily (mean 3.2 hours, SD 2.1), mainly on personal computers or laptops (91%) and mobile phones (82%). Participants rated their overall health significantly lower (69 out of 100) than the Dutch average of 81.5 for people aged 50-59 years [55,56] and experienced considerable movement-evoked joint pain (mean 4.9, SD 2.3).

Figure 3. Participant recruitment and follow-up diagram. IA: information architecture.
Table 2. Participant characteristics (N=215).
Variable | Value
Age (years), mean (SD)a | 57.18 (7.70)
Sex, n (%)
  Female | 155 (72.1)
  Male | 60 (27.9)
Education, n (%)
  Primary education | 3 (1.4)
  Lower secondary education | 95 (44.2)
  Higher secondary education | 36 (16.7)
  Tertiary education | 81 (37.7)
Occupation, n (%)
  Employed | 83 (38.6)
  Self-employed | 35 (16.3)
  Retired | 37 (17.2)
  Beneficiary | 29 (13.5)
  Other or none | 31 (14.4)
Relationship status, n (%)
  Married or long-term relationship | 132 (61.4)
  Divorced | 41 (19.1)
  Never married | 35 (16.3)
  Widowed | 5 (2.3)
  Other | 2 (0.9)
Social supportb, n (%)
  Partner | 124 (57.7)
  Friend | 75 (34.9)
  Child | 52 (24.2)
  Neighbor | 36 (16.7)
  Family member | 34 (15.8)
  Colleague | 7 (3.3)
  Group (church or sports) | 4 (1.9)
  Other | 2 (0.9)
  No support | 25 (11.6)
Internet use in hours per day, mean (SD)c | 3.17 (2.14)
Device useb, n (%)a
  Personal computer or laptop | 194 (90.7)
  Phone | 175 (81.8)
  Tablet | 88 (41.1)
Self-reported previous knowledge of hip replacement surgery, mean (SD)d | 1.85 (0.92)
Patient profile, n (%)
  Optimistic | 90 (41.9)
  Modest | 72 (33.5)
  Managing | 53 (24.7)

aData were missing for 1 participant.

bParticipants could select multiple answers.

cData were missing for 10 participants.

dData were missing for 2 participants.

Effects of IA on Knowledge Acquisition and Satisfaction

All three websites received predominantly positive feedback in the open qualitative feedback forms; participants appreciated that the websites were clear and well organized. Multimedia Appendix 2 summarizes the qualitative feedback on advantages and disadvantages for each IA. Table 3 and Figure 4 report the overall effects of IA. IA did not directly affect knowledge acquisition (F2,212=1.023; P=.36; ηp2=0.010) or overall satisfaction (F2,212=2.702; P=.07; η2=0.025). IA did, however, have a significant effect on satisfaction with emotional support (F2,212=6.376; P=.002; η2=0.057). Post hoc analyses indicated that participants were significantly less satisfied with the hierarchical IA design (mean 2.86, SD 0.60) than with the matrix (mean 3.17, SD 0.69) and tunnel (mean 3.22, SD 0.67) designs. The hierarchical design was perceived as the least favorable in general: users devoted less focused attention to it (mean difference to tunnel −0.31; P=.03), saw it as less novel (mean difference to tunnel −0.33; P=.02; mean difference to matrix −0.36; P=.01) and less personally relevant (mean difference to tunnel −0.44; P=.006), and found that it provided the least active control (mean difference to matrix −0.32; P=.02).

Table 3. Knowledge acquisition, satisfaction, and user perceptions of patient education website by information architecture.
Outcome, mean (SD) | Tunnel IAa (n=75) | Matrix IA (n=71) | Hierarchical IA (n=69) | P value | Effect size (η2)b
Website satisfaction | 3.69 (0.52) | 3.65 (0.52) | 3.50 (0.48) | .07 | N/Ac
  Attractiveness | 3.73 (0.61) | 3.68 (0.65) | 3.61 (0.61) | .50 | N/A
  Comprehension | 4.24 (0.56) | 4.21 (0.59) | 4.17 (0.71) | .79 | N/A
  Emotional support | 3.22 (0.67) | 3.17 (0.69) | 2.86 (0.60) | .002d | .057
Knowledge acquisition | 51.64 (19.55) | 48.02 (19.75) | 47.3 (19.63) | .36 | N/A
User engagement | 3.71 (0.55) | 3.65 (0.55) | 3.48 (0.57) | .047e | .028
  Focused attention | 3.16 (0.75) | 3.00 (0.70) | 2.85 (0.79) | .04f | .030
  Esthetic appeal | 3.76 (0.68) | 3.75 (0.68) | 3.52 (0.76) | .08 | N/A
  Reward | 3.81 (0.62) | 3.78 (0.57) | 3.58 (0.68) | .06 | N/A
  Perceived usability | 4.08 (0.68) | 4.05 (0.78) | 3.98 (0.78) | .67 | N/A
Perceived active control | 3.84 (0.67) | 3.95 (0.65) | 3.63 (0.74) | .02g | .035
Perceived personal relevance | 3.08 (0.86) | 2.73 (0.83) | 2.64 (0.86) | .005h | .050
Perceived trustworthiness | 3.94 (0.56) | 3.92 (0.57) | 3.78 (0.59) | .21 | N/A
Perceived novelty | 3.43 (0.75) | 3.46 (0.73) | 3.10 (0.76) | .007i | .046
Time spent (minutes:seconds) | 5:53 (4:24) | 5:18 (4:15) | 4:59 (4:09) | .44 | N/A

aIA: information architecture.

bEffect size is only provided for significant differences.

cN/A: not applicable.

dHierarchical IA was significantly different from both tunnel IA (P=.02) and matrix IA (P=.02).

eHierarchical IA was significantly different from tunnel IA (P=.05).

fHierarchical IA was significantly different from tunnel IA (P=.03).

gHierarchical IA was significantly different from matrix IA (P=.02).

hTunnel IA was significantly different from both hierarchical IA (P=.006) and matrix IA (P=.04).

iHierarchical IA was significantly different from both tunnel IA (P=.03) and matrix IA (P=.01).

Figure 4. Main effects of information architecture.

Model of IA Effects

The ANOVA tests demonstrated that the tunnel and matrix designs performed significantly better than the hierarchical IA design. To explain why tunnel and matrix IAs perform better compared with hierarchical IAs, we selected the hierarchical IA as the reference category in the mediation model.

The first mediation model (Model 1) specified that the effect of IA on knowledge and satisfaction would be mediated by user perceptions of engagement, active control, personal relevance, trust, and novelty. Specifying complete mediation results in a fully saturated regression model with zero degrees of freedom, as the number of observations equals the number of parameters [57,58]. Therefore, the first model was interpreted based on the regression paths rather than the fit indices (Table 4). All pathways with P<.10 (exact P values are provided in Table 4) were retained in a second model (Model 2). For Model 3 and Model 4, we continued eliminating pathways at a more stringent cut-off of P<.05.
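The elimination procedure amounts to filtering the estimated paths by their P values and refitting. The sketch below illustrates the filtering step with a few Model 1 paths taken from Table 4; it omits the refitting, so in the real workflow the later-stage P values would come from the re-estimated models.

```python
def prune_paths(paths, alpha):
    """Keep only the paths whose P value falls below the cut-off.

    `paths` maps (outcome, predictor) to (estimate, p_value)."""
    return {path: est_p for path, est_p in paths.items() if est_p[1] < alpha}

# A few Model 1 paths (estimate, P value) from Table 4.
model_1 = {
    ("user engagement", "tunnel IA"): (0.190, 0.02),
    ("user engagement", "matrix IA"): (0.139, 0.08),
    ("personal relevance", "matrix IA"): (0.048, 0.54),
}
model_2 = prune_paths(model_1, alpha=0.10)  # drops the relevance <- matrix path
model_3 = prune_paths(model_2, alpha=0.05)  # drops the engagement <- matrix path
```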

Overall, models 2 to 4 all achieved similarly good fit (Table 5). Model 4 (Figure 5) was selected as the final model, as it was the most parsimonious (indicated by the highest degrees of freedom [59]). This model explained 74.3% of the variance in satisfaction and 6.8% of the variance in knowledge and achieved excellent fit (χ2(df=17, N=215)=14.7; P=.62; RMSEA=0.000; 95% CI 0.000-0.053; CFI=1.00; SRMR=0.044).

Table 4. Pathways included in mediation models 1, 2, 3, and 4.
Outcome and predictor or mediator | Path estimate (Model 1) | P value (Model 1) | Model 2 | Model 3 | Model 4
User engagement
  Tunnel IAa | 0.190 | .02
  Matrix IA | 0.139 | .08
Perceived active control
  Tunnel IA | 0.142 | .07
  Matrix IA | 0.215 | .006
Perceived personal relevance
  Tunnel IA | 0.243 | .002
  Matrix IA | 0.048 | .54
Trust
  Tunnel IA | 0.133 | .09
  Matrix IA | 0.109 | .17
Perceived novelty
  Tunnel IA | 0.208 | .007
  Matrix IA | 0.225 | .004
Knowledge
  User engagement | 0.226 | .045
  Perceived active control | 0.006 | .96
  Perceived personal relevance | 0.089 | .22
  Trust | −0.007 | .93
  Perceived novelty | −0.006 | .95
Satisfaction
  User engagement | 0.382 | <.001
  Perceived active control | 0.273 | <.001
  Perceived personal relevance | 0.169 | <.001
  Trust | 0.227 | <.001
  Perceived novelty | 0.026 | .60
Knowledge
  Tunnel IA design | 0.042 | .59
  Matrix IA design | −0.018 | .82
Satisfaction
  Tunnel IA design | −0.011 | .80
  Matrix IA design | −0.017 | .68
Knowledge
  User engagement×matrix IA | 0.031 | .19
  Perceived novelty×matrix IA | −0.001 | .95
  Trust×matrix IA | −0.001 | .93
  Perceived personal relevance×matrix IA | 0.004 | .58
  Perceived active control×matrix IA | 0.001 | .96
  User engagement×tunnel IA | 0.043 | .12
  Perceived novelty×tunnel IA | −0.001 | .95
  Trust×tunnel IA | −0.001 | .93
  Perceived personal relevance×tunnel IA | 0.022 | .25
  Perceived active control×tunnel IA | 0.001 | .96
Satisfaction
  User engagement×tunnel IA | 0.073 | .02
  Perceived active control×tunnel IA | 0.039 | .09
  Perceived personal relevance×tunnel IA | 0.041 | .01
  Trust×tunnel IA | 0.030 | .11
  Perceived novelty×tunnel IA | 0.005 | .61
  User engagement×matrix IA | 0.053 | .09
  Perceived active control×matrix IA | 0.059 | .02
  Perceived personal relevance×matrix IA | 0.008 | .54
  Trust×matrix IA | 0.025 | .18
  Perceived novelty×matrix IA | 0.006 | .61

aIA: information architecture.

bPathways indicated with a check mark were included in the model formulation.

cPathways indicated with an em dash were excluded in the model formulation.

Table 5. Fit statistics of mediation models 2, 3, and 4.
Model | Chi-square (df) | P value | χ2 divided by df | CFIa | SRMRb | RMSEAc | 95% CI
Model 2 | 4.7 (9) | .86 | 0.522 | 1.00 | 0.027 | 0.000 | 0.000-0.041
Model 3 | 10.8 (13) | .63 | 0.833 | 1.00 | 0.042 | 0.000 | 0.000-0.057
Model 4 | 14.7 (17) | .62 | 0.864 | 1.00 | 0.044 | 0.000 | 0.000-0.053

aCFI: comparative fit index.

bSRMR: standardized root mean square residual.

cRMSEA: root mean square error of approximation.

Figure 5. Structural equation model of the effects of information architecture.

The model explains the effect of IA as follows: compared with hierarchical IAs, health information presented in a tunnel IA is perceived as more personally relevant (β=.18). This subsequently increases user satisfaction (β=.17). Matrix IAs, in comparison with hierarchical IAs, significantly increase the active control users perceive to have over the health information (β=.11), which also increases satisfaction (β=.27). Furthermore, the model shows that next to user perceptions of personal relevance and active control, user engagement and perceived trust in the health information affect users’ satisfaction with a patient education website. Although we hypothesized that perceived novelty would also be affected by IA and affect satisfaction and knowledge in turn, this was not the case. Finally, we already established that IA design did not directly affect knowledge acquisition. The model demonstrated that IA also did not indirectly influence knowledge, as none of the tested mediation pathways were significant. Knowledge acquisition was influenced by user engagement (β=.26), but user engagement itself was unaffected by IA.

Serial Mediation by Perceived Control and User Engagement

The serial mediation model, including perceived control and user engagement, confirmed that IA design did not significantly predict satisfaction (P=.07) or knowledge (P=.36). However, an indirect-only serial mediation by perceived control and user engagement on satisfaction emerged for the matrix IA design (βindirect=.052; z=2.053; P=.04) and the hierarchical design (βindirect=−.063; z=−2.545; P=.01): matrix IA increased active control and, subsequently, user engagement and satisfaction, whereas the hierarchical design decreased active control (Figure 6). Serial mediation was not present for the tunnel IA (P=.65) or for knowledge (matrix: P=.10; hierarchical: P=.06; tunnel: P=.65).
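For reference, indirect effects such as these are products of the constituent path coefficients. With a single mediator, a Sobel z statistic can be computed as below; the coefficients and standard errors here are invented for illustration and are not the paper's estimates (a serial model multiplies three paths and uses a delta-method standard error instead).

```python
from math import sqrt

def sobel(a, se_a, b, se_b):
    """Indirect effect a*b through one mediator and its Sobel z statistic.

    a: path from predictor to mediator; b: path from mediator to outcome."""
    indirect = a * b
    se = sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
    return indirect, indirect / se

# Invented estimates: IA -> active control (a), active control -> satisfaction (b).
indirect, z = sobel(a=0.5, se_a=0.1, b=0.4, se_b=0.1)
```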

Figure 6. Serial mediation model of matrix information architecture effects on satisfaction via active control and engagement. IA: information architecture.

Tailored IAs: Interactions With Patient Profile

Interaction effects between IA and patient profile indicated that some IA designs were preferred more by users with specific profiles (F4,206=2.646; P=.04; ηp2=0.049). Post hoc analyses demonstrated a consistent difference between participants with the managing profile and those with the modest profile when using the tunnel IA design (Figure 7). Managing participants were significantly more satisfied with the tunnel design (mean difference to modest 0.489; P=.04), perceived it as more attractive (mean difference to modest 0.673; P=.01) and trustworthy (mean difference to modest 0.630; P=.009), and found it to provide more active control (mean difference to modest 0.764; P=.009).

Figure 7. Interaction effects between information architecture and patient profile.

Principal Findings

The aim of this study was to investigate how the organization, display, and structural design of a website, that is, its IA, influence patients' experience with web-based patient education and the satisfaction and knowledge derived from the educational content. We wanted to understand whether user perceptions of engagement, control, personal relevance, trust, and novelty could explain how IA affects satisfaction and knowledge. Furthermore, we examined whether a user's profile affected which IA design was most effective or enjoyable to explore the potential of tailored IA design. Research on IA in the context of web-based health education has been sparse [10], which has limited intervention designers' ability to make informed design choices that enhance patients' experiences with web-based education.

This study compared three IA designs: tunnel, hierarchical, and matrix design. We found that in comparison with hierarchical IAs, tunnel and matrix IAs slightly improve user satisfaction. This effect may be explained by increased user perceptions of personal relevance in the tunnel IA and increased perceptions of control in the matrix IA. Contrary to our hypotheses and earlier findings [11], no direct or indirect effects of IA on knowledge acquisition or website use were found. However, the findings did indicate that IA preferences differ between patients with different user profiles. Specifically, patients with a so-called managing profile, who prefer open communication and have high communicative capabilities, are more satisfied with health education that is presented in a tunnel IA.

Our finding that the tunnel IA design specifically affects satisfaction with emotional support is consistent with research showing that tunneled education improved the emotional well-being of patients with type 2 diabetes and chronic low back pain [60]. However, we did not replicate previous findings indicating that tunneling increases the use of web-based health interventions [11,22]. We did observe a trend in this direction, as participants in the tunnel condition used the website longer on average, but the difference was not statistically significant. IA design did not predict knowledge acquisition either, despite previous findings indicating that tunneling improves knowledge acquisition [11]. Instead, user engagement emerged as the only predictor of knowledge acquisition. Some research on patient education indicates that cognitive factors such as working memory and cognitive load may be better predictors of knowledge acquisition than the user experience variables included in this study [61]. As IA design may facilitate cognitive processes, for example, by presenting information in smaller chunks as done in the hierarchical and tunnel designs, exploring whether IA design influences cognitive factors may be a worthwhile avenue for future studies and could help explain a larger portion of the variance in knowledge acquisition. In general, knowledge acquisition scores were low (47%-52%), which contrasts with earlier findings that web-based patient education is effective for orthopedic patients [62] even when websites are consulted just once [63]. However, it is unclear whether these low scores reflect inadequate education, weak source material (the text was not changed when converted from paper to website), or the fact that we did not test knowledge before the experiment. Regarding the latter, if participants had very little knowledge of TJR to begin with, the low attained knowledge levels may still represent decent knowledge acquisition. The participants' low self-reported knowledge of hip replacement supports this assumption: 81% of participants said that they had no or very limited prior knowledge. However, to fully answer this question, future research on IA design including pre-post measurements of health knowledge is needed.

The effects of IA design on user engagement were mixed: the matrix IA achieved the highest subjective (ie, self-reported) engagement, but the tunnel IA was used the longest (albeit not significantly longer). This reflects the dichotomous nature of engagement raised in the introduction, where engagement is thought to include both a subjective component of immersion and a behavioral component of use [15,64,65]. The findings indicate that IA design affects both but that matrix IA designs may be specifically suited for creating subjective experiences of engagement in patients. Furthermore, as only subjective engagement (and not duration of use) predicted actual knowledge scores, a tentative conclusion is that designing engaging experiences may matter more than designing patient education materials that are used the longest. As most studies currently take a "the more use, the better" perspective on engagement, use, and adherence to health interventions [66], this may require a different focus from researchers and designers alike.

Finally, this study focused on three simple IA designs for experimental clarity. Hybrid IAs that combine design elements from different IAs could mitigate the disadvantages associated with nonhybrid IAs. As users were most satisfied with the matrix and tunnel IAs, hybrid matrix-tunnel designs in particular should be explored further. This study also identified that a large proportion of older adults with self-reported joint complaints use mobile phones (82%) and tablet devices (41%). As web-based IA designs cannot simply be ported to smartphones [13], IA designs suitable for health interventions distributed through mobile devices should also be explored. In addition, the field of IA has been affected considerably by the rise of recommender systems (RSs). These machine learning-based information retrieval systems can predict and present relevant content, easing the requirement for an adequate IA that helps users locate content themselves. As this may diminish information overload [67], the potential benefits of combining RS techniques and IA in web-based health interventions warrant further research.

A secondary objective of this study was to explore the potential of tailored IAs. We found that participants with the highest information needs (so-called managers) preferred tunnel IAs. This finding supports the idea that patients' web-based learning experiences may be improved when IA is tailored to relevant user characteristics. However, we did not envision beforehand that the tunnel IA would match the managing profile. Rather, we assumed that participants in this group would prefer a matrix IA, as their skills, high self-efficacy, and preferences for openness and participation are in line with the theoretical ideal user of matrix IA websites [12]. According to the qualitative feedback, one reason why they may have preferred the tunnel IA design instead is that it functioned as a checklist. Completeness or comprehensiveness is 1 of the 5 quality criteria for health information [68], and reassurance that all content had been covered may be particularly important for patients with high information needs. A tendency of patients with high information needs to actively seek out and ensure they have all available information (ie, a monitoring style of coping with threats) has been documented before in research with older patients with cancer [69]. Perceived comprehensiveness was not one of the mediators included in this study, but the questions of whether some patients value it more than others, and which design elements elicit it specifically, may be worthwhile avenues for future research. Finally, the patient profiles included in this study provided insight into orthopedic patients' skills and preferences for general communication, not digital communication specifically. Effective use of eHealth requires composite skills beyond basic literacy, such as being able to operate search functions and knowing what information is available on the web [70]. Therefore, it may be more accurate to tailor IA design to eHealth literacy levels instead of a general profile.

In any case, the incongruence between the anticipated and actual match of patient profile and IA design indicates that translating stated preferences into a tailored design is complex. Although the knowledge base on what works for whom is growing slowly, it may be more beneficial in the meantime to offer users a choice of IAs rather than dictating one design. Studies that explored the benefit of tailoring the mode of health information (eg, text, illustrations, audio-visual material) have successfully used user-initiated tailoring when working with multiple interfaces [71,72]. User-initiated tailoring asks users to customize a website's content and graphical user interface directly. Such customizations improve users' satisfaction, attention, and ability to recall knowledge [71,72]. User-initiated tailoring may also be applicable to tailored IA design if users are offered a choice of IA designs when they first visit the website. A second consideration is to design IAs that support many different styles of health information processing. The work by Pang et al [73] on a website that was purposely designed to support 4 (rather than 1) distinct health information-seeking behaviors showed that users were more engaged with these dynamic interfaces. The commonality between these studies is that users were not restricted or coerced to use the website in a particular way but were able to customize the experience to their self-determined preferences and needs at the time of visiting. Although this design approach may improve the fit between user and design, it may also introduce new issues (such as motivating people to adjust interfaces) that warrant further research. Yet, as more intricate eHealth interventions are developed and examined, it should be noted that none of the IA designs examined here had serious negative effects on satisfaction and knowledge acquisition and that the advantages in user experience, although present, were small. The added value of highly customizable interventions should, therefore, be weighed against the additional costs of developing multiple interfaces.

Strengths and Limitations

This study was conducted among adults who had self-reported joint complaints and may have viewed web-based education differently than patients scheduled for TJR surgery. However, previous studies have successfully tested health education websites in similar populations [11,71,72], and the high self-reported pain and lower health scores indicate that the study sample had considerable health concerns. As such, the sample can actually be considered a study strength as these individuals were likely motivated to learn about orthopedic health. At the same time, as the sample consisted solely of people with orthopedic health concerns, we know little of the generalizability of the findings of this study to other populations. Preferences for IA design may differ when using health education for purposes other than to prepare for TJR surgery (eg, to decide between alternative treatment options or to obtain support in managing a chronic illness), and additional research is needed to explore this.

Another limitation was self-selection, as participants were able to determine whether they wanted to join or leave the study. Between invitation for participation and inclusion in the study, 37% of participants were lost to follow-up. Of particular concern is that 6% of the sample dropped out after viewing the allocated website, as they might have done so based on their (negative) response to the website. This could make the study susceptible to type I errors [74,75]. This problem could not be remedied by intention-to-treat analysis due to the design of the experiment in which the participants that had dropped out generated no outcome data [75]. Therefore, we checked whether dropout was associated with allocation to a specific website, which was not the case. This made it unlikely that participants stopped because they were discontent with the allocated website. Another issue with self-selection was that participants could have been exceptionally interested in and already knowledgeable about TJR surgery. This would explain why we did not find any effects on knowledge. However, both self-reported prior knowledge of hip replacement and knowledge acquisition were generally low. A final limitation is that we determined satisfaction and knowledge gained from visiting the website once. As such, we cannot draw conclusions about experience with the website over time or knowledge retention after longer periods.

The strengths of this study include the experimental design. Although randomized experiments on website features, known as A/B tests or web-based field experiments [76], are common in industry, the method is not often used in academic research on web-based health interventions. Various scholars have advocated moving beyond the black box approach, which assesses only intervention efficacy; testing specific features can help explain the mechanisms by which web-based interventions (do not) improve health outcomes [10,17,22]. By experimentally manipulating one feature and assessing both outcomes and mediating variables, this study takes a step in that direction. Second, the study took a human-centered and interdisciplinary approach to patient education design. The team included interaction designers, clinicians, and psychologists and followed an iterative design process that involved patients early via pilot studies to ensure the usability of all three variants of the website. We believe that this commitment to developing three distinct but comparable, usable, and enjoyable web-based experiences has made it more likely that the effects on satisfaction can be attributed to differences in IA alone.

Conclusions and Recommendations for Intervention Design

Overall, our findings indicate that IA has small but notable effects on users’ experiences with web-based health education interventions, at least in the context of orthopedic patient education. Tunnel IA design, in which users are guided through sequentially ordered content, improves perceptions of personal relevance and, in turn, user satisfaction. This design may be specifically appropriate for patients with high information needs. In contrast, providing users with more control over the way they progress through a web-based health intervention via a matrix IA design has positive effects on user perceptions of active control, which also contributes to higher satisfaction. Although additional research on IA design in different target groups and interventions is needed, hierarchical IA designs are not recommended at the moment, as hierarchical content is perceived as less supportive, engaging, and relevant, which may diminish the use and, in turn, the effect of the educational intervention.

Acknowledgments

This work is part of the research program Tailored health care through customer profiling (Project 314-99-118), which is financed by the Netherlands Organisation for Scientific Research (NWO) and Zimmer Biomet Inc. The authors thank the participants who took part in the pilot study and the consortium partners and reviewers for their constructive feedback on this work.

Conflicts of Interest

Part of the funding of this project is provided by Zimmer Biomet Inc (refer to the Acknowledgments section). This sponsor had no role in the study design of this protocol, data collection, analysis and interpretation, or writing of the report. In the case that this partner wants to apply for a patent based on research findings, publication can be postponed for a maximum of 3 months. No party has the right to prohibit the publication of these findings. The authors have full access to the study data.

Multimedia Appendix 1

Dutch translation of the User Engagement Scale-Short Form: validity, questionnaire items, and instructions for scoring.

DOCX File , 33 KB

Multimedia Appendix 2

Perceived advantages and disadvantages of tunnel, hierarchical, and matrix information architecture designs (translated from Dutch).

DOCX File , 22 KB

Multimedia Appendix 3

CONSORT-eHEALTH checklist (V 1.6.1).

PDF File (Adobe PDF File), 2224 KB

References

  1. Langdon IJ, Hardin R, Learmonth ID. Informed consent for total hip arthroplasty: does a written information sheet improve recall by patients? Ann R Coll Surg Engl 2002 Nov;84(6):404-408 [FREE Full text] [CrossRef] [Medline]
  2. Turner P, Williams C. Informed consent: patients listen and read, but what information do they retain? N Z Med J 2002 Oct 25;115(1164):218. [Medline]
  3. Fagerlin A, Sepucha KR, Couper MP, Levin CA, Singer E, Zikmund-Fisher BJ. Patients' knowledge about 9 common health conditions: the DECISIONS survey. Med Decis Making 2010;30(5 Suppl):35-52. [CrossRef] [Medline]
  4. Krupic F, Määttä S, Garellick G, Lyckhage ED, Kärrholm J. Preoperative information provided to Swedish and immigrant patients before total hip replacement. Med Arch 2012;66(6):399-404. [CrossRef] [Medline]
  5. Kinnersley P, Edwards A, Hood K, Cadbury N, Ryan R, Prout H, et al. Interventions before consultations for helping patients address their information needs. Cochrane Database Syst Rev 2007 Jul 18(3):CD004565. [CrossRef] [Medline]
  6. Morrison LG, Yardley L, Powell J, Michie S. What design features are used in effective e-health interventions? A review using techniques from Critical Interpretive Synthesis. Telemed J E Health 2012 Mar;18(2):137-144. [CrossRef] [Medline]
  7. Kebede MM, Liedtke TP, Möllers T, Pischke CR. Characterizing active ingredients of eHealth interventions targeting persons with poorly controlled type 2 Diabetes Mellitus using the behavior change techniques taxonomy: scoping review. J Med Internet Res 2017 Oct 12;19(10):348 [FREE Full text] [CrossRef] [Medline]
  8. Arden-Close EJ, Smith E, Bradbury K, Morrison L, Dennison L, Michaelides D, et al. A visualization tool to analyse usage of web-based interventions: the example of Positive Online Weight Reduction (POWeR). JMIR Hum Factors 2015 May 19;2(1):8 [FREE Full text] [CrossRef] [Medline]
  9. Morville P, Rosenfeld L. Information architecture for the World Wide Web. 2nd edition. Sebastopol, CA: O'Reilly Media; 2002.
  10. Pugatch J, Grenen E, Surla S, Schwarz M, Cole-Lewis H. Information architecture of web-based interventions to improve health outcomes: systematic review. J Med Internet Res 2018 Mar 21;20(3):97 [FREE Full text] [CrossRef] [Medline]
  11. Crutzen R, Cyr D, de Vries NK. The role of user control in adherence to and knowledge gained from a website: randomized comparison between a tunneled version and a freedom-of-choice version. J Med Internet Res 2012 Mar 09;14(2):45 [FREE Full text] [CrossRef] [Medline]
  12. Danaher BG, McKay HG, Seeley JR. The information architecture of behavior change websites. J Med Internet Res 2005 May 18;7(2):12 [FREE Full text] [CrossRef] [Medline]
  13. Danaher BG, Brendryen H, Seeley JR, Tyler MS, Woolley T. From black box to toolbox: outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions. Internet Interv 2015 Mar 01;2(1):91-101 [FREE Full text] [CrossRef] [Medline]
  14. Kelders SM, Kok RN, Ossebaard HC, Van Gemert-Pijnen JEWC. Persuasive system design does matter: a systematic review of adherence to web-based interventions. J Med Internet Res 2012 Nov 14;14(6):152 [FREE Full text] [CrossRef] [Medline]
  15. Perski O, Blandford A, West R, Michie S. Conceptualising engagement with digital behaviour change interventions: a systematic review using principles from critical interpretive synthesis. Transl Behav Med 2017 Jun;7(2):254-267 [FREE Full text] [CrossRef] [Medline]
  16. Lynch P, Horton S. Web Style Guide: Foundations of User Experience Design. 4th edition. New Haven, CT: Yale University Press; 2016.
  17. Whitton AE, Proudfoot J, Clarke J, Birch M, Parker G, Manicavasagar V, et al. Breaking open the black box: isolating the most potent features of a web and mobile phone-based intervention for depression, anxiety, and stress. JMIR Ment Health 2015;2(1):3 [FREE Full text] [CrossRef] [Medline]
  18. O'Brien H. Theoretical perspectives on user engagement. In: Why Engagement Matters. Cham, Switzerland: Springer International Publishing; 2016:1-26.
  19. O’Brien HL, Cairns P, Hall M. A practical approach to measuring user engagement with the refined user engagement scale (UES) and new UES short form. Int J Human-Computer Studies 2018 Apr;112:28-39. [CrossRef]
  20. Kelders SM, van Zyl LE, Ludden GDS. The concept and components of engagement in different domains applied to eHealth: a systematic scoping review. Front Psychol 2020 May 27;11:926 [FREE Full text] [CrossRef] [Medline]
  21. Ludden GD, van Rompay TJL, Kelders SM, van Gemert-Pijnen JEWC. How to increase reach and adherence of web-based interventions: a design research viewpoint. J Med Internet Res 2015 Jul 10;17(7):172 [FREE Full text] [CrossRef] [Medline]
  22. McClure JB, Shortreed SM, Bogart A, Derry H, Riggs K, St John J, et al. The effect of program design on engagement with an internet-based smoking intervention: randomized factorial trial. J Med Internet Res 2013 Mar 25;15(3):69 [FREE Full text] [CrossRef] [Medline]
  23. Liu Y. Developing a scale to measure the interactivity of websites. JAR 2003 Jun 01;43(2):207-216. [CrossRef]
  24. Liu Y, Shrum LJ. What is interactivity and is it always such a good thing? Implications of definition, person, and situation for the influence of interactivity on advertising effectiveness. J of Advert 2002 Dec;31(4):53-64. [CrossRef]
  25. Vandelanotte C, Müller AM, Short CE, Hingle M, Nathan N, Williams SL, et al. Past, present, and future of eHealth and mHealth research to improve physical activity and dietary behaviors. J Nutr Educ Behav 2016 Mar;48(3):219-228. [CrossRef] [Medline]
  26. Song JH, Zinkhan GM. Determinants of perceived web site interactivity. J of Market 2008 Mar;72(2):99-113. [CrossRef]
  27. Voorveld HAM, Neijens PC, Smit EG. The relation between actual and perceived interactivity. J of Advert 2011 Jul;40(2):77-92. [CrossRef]
  28. Lustria MLA, Cortese J, Gerend MA, Schmitt K, Kung YM, McLaughlin C. A model of tailoring effects: a randomized controlled trial examining the mechanisms of tailoring in a web-based STD screening intervention. Health Psychol 2016 Nov;35(11):1214-1224. [CrossRef] [Medline]
  29. Strecher VJ, McClure J, Alexander G, Chakraborty B, Nair V, Konkel J, et al. The role of engagement in a tailored web-based smoking cessation program: randomized controlled trial. J Med Internet Res 2008 Nov 04;10(5):36 [FREE Full text] [CrossRef] [Medline]
  30. Kreuter MW, Strecher VJ, Glassman B. One size does not fit all: the case for tailoring print materials. Ann Behav Med 1999;21(4):276-283. [CrossRef] [Medline]
  31. Dijkstra A. The psychology of tailoring-ingredients in computer-tailored persuasion. Social Pers Psych Compass 2008 Mar;2(2):765-784. [CrossRef]
  32. Hawkins RP, Kreuter M, Resnicow K, Fishbein M, Dijkstra A. Understanding tailoring in communicating about health. Health Educ Res 2008 Jun;23(3):454-466 [FREE Full text] [CrossRef] [Medline]
  33. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol 2000 Jan;55(1):68-78. [CrossRef] [Medline]
  34. Yi MY, Yoon JJ, Davis JM, Lee T. Untangling the antecedents of initial trust in web-based health information: the roles of argument quality, source expertise, and user perceptions of information quality and risk. Decision Support Systems 2013 Apr;55(1):284-295. [CrossRef]
  35. Hesse BW, Nelson DE, Kreps GL, Croyle RT, Arora NK, Rimer BK, et al. Trust and sources of health information: the impact of the Internet and its implications for health care providers: findings from the first Health Information National Trends Survey. Arch Intern Med 2005;165(22):2618-2624. [CrossRef] [Medline]
  36. Wathen CN, Burkell J. Believe it or not: factors influencing credibility on the web. J Am Soc Inf Sci 2002;53(2):134-144. [CrossRef]
  37. Rains SA, Karmikel CD. Health information-seeking and perceptions of website credibility: examining Web-use orientation, message characteristics, and structural features of websites. Computers in Human Behavior 2009 Mar;25(2):544-553. [CrossRef]
  38. Sillence E, Briggs P, Harris P, Fishwick L. A framework for understanding trust factors in web-based health advice. Int J Human-Computer Studies 2006 Aug;64(8):697-713 [FREE Full text] [CrossRef]
  39. Huang M. Designing website attributes to induce experiential encounters. Computers in Human Behavior 2003 Jul;19(4):425-442. [CrossRef]
  40. Clark RE. Reconsidering research on learning from media. Review of Educational Research 2016 Jun 30;53(4):445-459. [CrossRef]
  41. Groeneveld B, Melles M, Vehmeijer S, Mathijssen N, Dekkers T, Goossens R. Developing digital applications for tailored communication in orthopaedics using a research through design approach. Digit Health 2019;5:2055207618824919 [FREE Full text] [CrossRef] [Medline]
  42. Faul F, Erdfelder E, Lang A, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods 2007 May;39(2):175-191. [CrossRef] [Medline]
  43. Selya AS, Rose JS, Dierker LC, Hedeker D, Mermelstein RJ. A practical guide to calculating cohen's f(2), a measure of local effect size, from PROC MIXED. Front Psychol 2012;3:111 [FREE Full text] [CrossRef] [Medline]
  44. Wiklund M, Kendler J, Strochlic A. Usability Testing of Medical Devices. 2nd edition. Boca Raton, FL: CRC Press; 2016:1-477.
  45. Bol N, van Weert JCM, de Haes HCJM, Loos EF, de Heer S, Sikkel D, et al. Using cognitive and affective illustrations to enhance older adults' website satisfaction and recall of online cancer-related information. Health Commun 2014;29(7):678-688. [CrossRef] [Medline]
  46. Bol N, Smets EMA, Rutgers MM, Burgers JA, de Haes HCJM, Loos EF, et al. Do videos improve website satisfaction and recall of online cancer-related information in older lung cancer patients? Patient Educ Couns 2013 Sep;92(3):404-412. [CrossRef] [Medline]
  47. Jansen J, van Weert J, van der Meulen N, van Dulmen S, Heeren T, Bensing J. Recall in older cancer patients: measuring memory for medical information. Gerontologist 2008 Apr;48(2):149-157. [CrossRef] [Medline]
  48. Beaton DE, Bombardier C, Guillemin F, Ferraz MB. Guidelines for the process of cross-cultural adaptation of self-report measures. Spine (Phila Pa 1976) 2000 Dec 15;25(24):3186-3191. [CrossRef] [Medline]
  49. Sousa VD, Rojjanasrirat W. Translation, adaptation and validation of instruments or scales for use in cross-cultural health care research: a clear and user-friendly guideline. J Eval Clin Pract 2011 Apr;17(2):268-274. [CrossRef] [Medline]
  50. O'Brien HL, Toms EG. The development and evaluation of a survey to measure user engagement. J Am Soc Inf Sci 2009 Oct 19;61(1):50-69. [CrossRef]
  51. Jensen JD, King AJ, Carcioppolo N, Davis L. Why are tailored messages more effective? A multiple mediation analysis of a breast cancer screening intervention. J Commun 2012 Oct;62(5):851-868 [FREE Full text] [CrossRef] [Medline]
  52. Iacobucci D. Structural equations modeling: Fit Indices, sample size, and advanced topics. J Consumer Psychol 2010 Jan;20(1):90-98. [CrossRef]
  53. Hooper D, Coughlan J, Mullen M. Structural equation modelling: guidelines for determining model fit. Electron J Bus Res Methods 2008;6(1):60. [CrossRef]
  54. R Core Team. R: a language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2016.
  55. Szende A, Williams A, editors. Measuring Self-Reported Population Health: An International Perspective Based on EQ-5D. Dordrecht: Springer; 2004:1-115.
  56. Essink-Bot ML, Stouthard ME, Bonsel GJ. Generalizability of valuations on health states collected with the EuroQol© questionnaire. Health Econ 1993 Oct;2(3):237-246. [CrossRef] [Medline]
  57. Dijkstra T. On statistical inference with parameter estimates on the boundary of the parameter space. Br J Math Stat Psychol 1992;45(2):309. [CrossRef]
  58. Baron RM, Kenny DA. The moderator-mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. J Pers Soc Psychol 1986 Dec;51(6):1173-1182. [CrossRef] [Medline]
  59. Raykov T, Marcoulides GA. On desirability of parsimony in structural equation model selection. Structural Equation Modeling: A Multidisciplinary Journal 1999 Jan;6(3):292-300. [CrossRef]
  60. Weymann N, Dirmaier J, von Wolff A, Kriston L, Härter M. Effectiveness of a web-based tailored interactive health communication application for patients with type 2 diabetes or chronic low back pain: randomized controlled trial. J Med Internet Res 2015 Mar 03;17(3):53 [FREE Full text] [CrossRef] [Medline]
  61. Wilson EAH, Wolf MS. Working memory and the design of health materials: a cognitive factors perspective. Patient Educ Couns 2009 Mar;74(3):318-322. [CrossRef] [Medline]
  62. Dekkers T, Melles M, Groeneveld BS, de Ridder H. Web-based patient education in orthopedics: systematic review. J Med Internet Res 2018 Apr 23;20(4):143 [FREE Full text] [CrossRef] [Medline]
  63. Fraval A, Chandrananth J, Chong YM, Coventry LS, Tran P. Internet based patient education improves informed consent for elective orthopaedic surgery: a randomized controlled trial. BMC Musculoskelet Disord 2015 Feb 07;16:14 [FREE Full text] [CrossRef] [Medline]
  64. Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med 2016 Nov;51(5):833-842. [CrossRef] [Medline]
  65. Short CE, DeSmet A, Woods C, Williams SL, Maher C, Middelweerd A, et al. Measuring engagement in eHealth and mHealth behavior change interventions: viewpoint of methodologies. J Med Internet Res 2018 Nov 16;20(11):292 [FREE Full text] [CrossRef] [Medline]
  66. Sieverink F, Kelders SM, van Gemert-Pijnen JE. Clarifying the concept of adherence to eHealth technology: systematic review on when usage becomes adherence. J Med Internet Res 2017 Dec 06;19(12):402 [FREE Full text] [CrossRef] [Medline]
  67. Cheung KL, Durusu D, Sui X, de Vries H. How recommender systems could support and enhance computer-tailored digital health programs: a scoping review. Digit Health 2019;5:2055207618824727 [FREE Full text] [CrossRef] [Medline]
  68. Eysenbach G, Powell J, Kuss O, Sa E. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. J Am Med Assoc 2002 May 22;287(20):2691-2700. [CrossRef] [Medline]
  69. Bol N, Linn AJ, Smets EM, Verdam MG, van Weert JC. Tailored communication for older patients with cancer: using cluster analysis to identify patient profiles based on information needs. J Geriatr Oncol 2020 Jul;11(6):944-950 [FREE Full text] [CrossRef] [Medline]
  70. Norman CD, Skinner HA. eHealth literacy: essential skills for consumer health in a networked world. J Med Internet Res 2006 Jun 16;8(2):9 [FREE Full text] [CrossRef] [Medline]
  71. Nguyen MH, Smets EMA, Bol N, Loos EF, Van Weert JCM. How tailoring the mode of information presentation influences younger and older adults' satisfaction with health websites. J Health Commun 2018;23(2):170-180. [CrossRef] [Medline]
  72. Nguyen MH, van Weert JCM, Bol N, Loos EF, Tytgat KMAJ, van de Ven AWH, et al. Tailoring the mode of information presentation: effects on younger and older adults' attention and recall of online information. Hum Commun Res 2016 Oct 21;43(1):102-126. [CrossRef]
  73. Pang PC, Chang S, Verspoor K, Pearce J. Designing health websites based on users' web-based information-seeking behaviors: a mixed-method observational study. J Med Internet Res 2016 Jun 06;18(6):145 [FREE Full text] [CrossRef] [Medline]
  74. Wertz RT. Intention to treat: Once randomized, always analyzed. Clinical Aphasiology 1995;23:57-64 [FREE Full text]
  75. Gupta SK. Intention-to-treat concept: a review. Perspect Clin Res 2011 Jul;2(3):109-112 [FREE Full text] [CrossRef] [Medline]
  76. Kohavi R, Longbotham R. Online controlled experiments and A/B testing. In: Sammut C, Webb GI, editors. Encyclopedia of Machine Learning and Data Mining. Boston, MA: Springer; 2017.


ANOVA: analysis of variance
CFI: comparative fit index
IA: information architecture
MC: multiple choice
RMSEA: root mean square error of approximation
RS: recommender system
SRMR: standardized root mean square residual
THP: total hip prosthesis
TJR: total joint replacement
UES-SF: User Engagement Scale-Short Form


Edited by G Eysenbach, L Buis; submitted 12.08.19; peer-reviewed by G Ludden, N Vinson, MH Nguyen, C Chen; comments to author 15.05.20; revised version received 28.08.20; accepted 18.11.20; published 03.03.21

Copyright

©Tessa Dekkers, Marijke Melles, Stephan B W Vehmeijer, Huib de Ridder. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 03.03.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.