Background: Providing informed consent means agreeing to participate in a clinical trial and having understood what is involved. Flawed informed consent processes, including missing dates and signatures, are common regulatory audit findings. Electronic consent (eConsent) uses digital technologies to enable the consenting process. It aims to improve participant comprehension and engagement with study information and to address data quality concerns.
Objective: This systematic literature review aimed to assess the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, and study enrollment and retention rates, as well as the effects of eConsent on the time patients took to perform the consenting process (“cycle time”) and on-site workload in comparison with traditional paper-based consenting.
Methods: The systematic review was conducted and reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Ovid Embase and Ovid MEDLINE were systematically searched for publications reporting original, comparative data on the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, enrollment and retention rates, cycle time, and site workload. The methodological validity of the studies that compared outcomes for comprehension, acceptability, and usability across paper consent and eConsent was assessed. Study methodologies were categorized as having “high” validity if comprehensive assessments were performed using established instruments.
Results: Overall, 37 publications describing 35 studies (13,281 participants) were included. All studies comparing eConsenting and paper-based consenting for comprehension (20/35, 57% of the studies; 10 with “high” validity), acceptability (8/35, 23% of the studies; 1 with “high” validity), and usability (5/35, 14% of the studies; 1 with “high” validity) reported significantly better results with eConsent, better results but without significance testing, or no significant differences in overall results. None of the studies reported better results with paper than with eConsent. Among the “high” validity studies, 6 studies on comprehension reported significantly better understanding of at least some concepts, the study on acceptability reported statistically significant higher satisfaction scores, and the study on usability reported statistically significant higher usability scores with eConsent than with paper (P<.05 for all). Cycle times were increased with eConsent, potentially reflecting greater patient engagement with the content. Data on enrollment and retention were limited. Comparative data from site staff and other study researchers indicated the potential for reduced workload and lower administrative burden with eConsent.
Conclusions: This systematic review showed that compared with patients using paper-based consenting, patients using eConsent had a better understanding of the clinical trial information, showed greater engagement with the content, and rated the consenting process as more acceptable and usable. eConsent solutions thus have the potential to enhance the understanding, acceptability, and usability of the consenting process while also addressing data quality concerns, including those related to flawed consenting processes.
Informed consent to participate remains a fundamental aspect of ethical clinical research. In accordance with good clinical practice quality standards, potential participants of a clinical trial must be given adequate information about the study before they decide whether to participate. Providing informed consent means agreeing to take part in the trial and understanding what is involved, including the risks and benefits of participation [ ]. Traditionally, the trial information is conveyed using printed documents that potential participants read before signing to indicate their consent to participate. The informed consent form (ICF), and the associated effective communication of study information, remains among the most challenging and complex processes within the clinical trial landscape. ICFs are known to have poor readability and to take too long to be understood and digested effectively [ ]. A review of ICFs developed for use in phase III oncology clinical trials showed that these were, on average, 21.4 pages long and that many participants had only a poor understanding of the key elements of their trial [ ]. Poor understanding of the study requirements and treatment has been cited as a reason for early withdrawal from clinical trials [ ]. To ensure that potential trial participants fully comprehend the study information, ICFs need to convey complicated and technical information in a way that meets the target group’s health literacy capabilities, and they have to maintain readers’ engagement sufficiently to enable a fully informed decision on whether to participate.
In addition to patient-centered challenges of the ICF process, administrative aspects of the consenting process can pose challenges to investigators conducting clinical trials. Flawed informed consent processes are listed within the top 10 cited regulatory deficiencies and audit findings and are the third highest reason for US Food and Drug Administration (FDA) warning letters to clinical investigators [- ]. Informed consent was among the top 2 most frequently observed issues in a recent auditing case study, conducted across 37 centers, and problems were identified related to processing errors and missing operational records [ ]. Findings included missing signatures, incomplete ICFs, signing of incorrect ICF versions, and unauthorized site staff obtaining consents [ ]. These are serious issues that can undermine the integrity of the consent process and the study, and they can result in the inability of researchers to analyze and report the data as intended.
Electronic consent (eConsent) uses digital technologies to enable the consenting process. Components can include multimedia to complement text-based content; interactivity (eg, to handle questions, test knowledge, explain definitions, and allow patients to resume the process from where they left off); electronic signature capture; status dashboards; and version control technology. eConsent aims to improve participant comprehension and engagement with study information and to address data quality concerns that may limit study integrity [, ]. Although eConsent has been in use for about 15 years, its adoption was slow until recently, with its accelerated uptake driven primarily by the COVID-19 pandemic [ ]. Much of the supporting information on the promised benefits of eConsent comes from informal commentaries and reports from eConsent solution providers. In addition to digital technology, and just as with paper-based ICFs, eConsent solutions require good content to be effective. As in the computing analogy of “garbage in, garbage out,” poor eConsent content will result in poor overall effectiveness in terms of patient comprehension, acceptability, and usability, irrespective of the quality of the delivery technology.
The aim of our systematic review of peer-reviewed research was to provide a summary of qualitative and quantitative evidence to draw conclusions on the relative effectiveness of eConsent in comparison with traditional paper-based consenting.
The systematic literature review was conducted and reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. A completed PRISMA checklist is included in [ ]. We systematically searched the peer-reviewed literature for full papers and conference abstracts relevant to our review using Ovid Embase and Ovid MEDLINE on November 11, 2021. Ovid MEDLINE is equivalent in content to PubMed and additionally includes advanced search options (eg, adjacency operator and within-phrase wildcard) [ , ]. The search string contained terms related to electronic and consenting as follows: ([dynamic OR electronic OR interactive OR multimedia OR online OR tablet OR computer OR digital OR virtual] ADJ4 [consent* OR econsent OR e-consent]). Terms related to “electronic” were limited to the title, abstract, and keywords of a publication. The operator “ADJ4” was used to identify “electronic”- and “consent”-related terms separated by ≤3 words to filter for literature relevant to this review. The records were screened and selected based on our review of the title, abstract, and full text. No language restrictions or publication date limits were applied. The review was not registered, and a protocol was not prepared.
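The proximity logic of the ADJ4 operator (two terms matching when separated by at most 3 intervening words, in either order) can be illustrated with a minimal sketch. This is a hypothetical Python illustration of the matching principle, not part of the review methodology, and it simplifies Ovid’s actual tokenization and truncation rules.

```python
import re

# Hypothetical illustration of Ovid's ADJ4 proximity logic:
# two terms match when separated by <=3 intervening words
# (ie, within 4 word positions), in either order.
ELECTRONIC_TERMS = {"dynamic", "electronic", "interactive", "multimedia",
                    "online", "tablet", "computer", "digital", "virtual"}

def adj4_match(text: str) -> bool:
    words = re.findall(r"[a-z*-]+", text.lower())
    for i, w in enumerate(words):
        if w in ELECTRONIC_TERMS:
            # Collect words up to 4 positions away in both directions (ADJ4).
            window = words[max(0, i - 4):i] + words[i + 1:i + 5]
            # "consent*" truncation approximated with a prefix check.
            if any(v.startswith("consent") or v in {"econsent", "e-consent"}
                   for v in window):
                return True
    return False

print(adj4_match("an electronic informed consent platform"))  # True: 1 word between
print(adj4_match("digital tools for data capture"))           # False: no consent term
```

The window of 4 positions on each side corresponds to ≤3 intervening words, which is why, for example, “electronic informed consent” matches while “digital tools for data capture” does not.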
Inclusion and Exclusion Criteria
Publications reporting original, comparative data on the effectiveness of eConsent in terms of patient comprehension, acceptability, and usability were eligible for inclusion. Comparative data on the effect of eConsent on clinical study enrollment and retention rates, cycle time (ie, time taken to consent), site workload, and stakeholder views were also considered relevant. Head-to-head comparisons of paper-based methods versus eConsent were of particular relevance. Publications that did not present original data (eg, reviews, editorials, and commentaries) were excluded.
Following the systematic literature searches, duplicate records were removed using the deduplicate option in Ovid. All the remaining records were exported to EndNote X9 (Clarivate), and further duplicates were identified and removed manually. After the initial screening of titles and abstracts, during which one reviewer (AB) excluded the ineligible publications (eg, reviews), 2 reviewers (EC and BB) independently assessed the remaining search results and the corresponding full texts. The team of 3 reviewers resolved any disagreements through consensus-based discussion. Reasons for exclusion and inclusion were captured.
Data Collection and Summary
Data extraction (conducted by AB and reviewed by BB) included measures and outcomes for patient comprehension, acceptability, usability, enrollment rates, retention rates, cycle time, site workload, and stakeholder views. The extracted data were summarized descriptively. Data on patient comprehension, acceptability, and usability with eConsent versus paper-based ICFs were tabulated as part of the main descriptive summary. An overview of all the studies identified for inclusion is provided in [ - ].
For studies comparing patient comprehension, acceptability, and usability with eConsent versus paper-based ICFs, we estimated the quality of the evidence by categorizing their methodological validity as “high,” “moderate,” or “limited.” The study methodologies that we categorized as having high validity (score=+++) were those that used comprehensive assessments including detailed and open-ended questions (eg, “Tell me what will be done during the study visits”), possibly using established instruments as part of the formal assessments. Methodologies that involved self-rating by participants (eg, “Did you understand the following aspects of the study?”), without formal testing, were categorized as having moderate validity (score=++). When a methodology that involved limited questioning was used in the studies or when methodological details were not reported, we categorized these studies as having limited validity (score=+).
The systematic literature search identified 1872 publications ( ). Of these 1872 publications, 608 (32.48%) duplicates were excluded before screening, and a further 1228 (65.6%) were excluded based on screening of titles, abstracts, and full publications, with the most common reason for exclusion being that the publication did not report on eConsent research. A total of 36 publications met the eligibility criteria [ - ], and an additional outcomes publication [ ] was retrieved manually based on the identification of its accompanying methodology article during screening. Thus, in total, 37 publications (32 full publications and 5 conference abstracts) were included in this review ( ) [ - ].
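The screening-flow arithmetic above can be reproduced directly from the counts stated in this section (a minimal check using only the numbers reported here):

```python
# Reproducing the screening-flow arithmetic from the counts reported above.
identified = 1872   # records retrieved by the database searches
duplicates = 608    # removed before screening
excluded = 1228     # removed during title/abstract/full-text screening
manual = 1          # outcomes publication retrieved manually

eligible = identified - duplicates - excluded   # publications meeting criteria
included = eligible + manual                    # publications in the review

print(eligible)                                 # 36
print(included)                                 # 37
print(round(100 * duplicates / identified, 2))  # 32.48 (% of identified records)
print(round(100 * excluded / identified, 1))    # 65.6 (% of identified records)
```

Note that both percentages are expressed relative to the 1872 identified records, not to the number of records remaining at each stage.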
The included publications together described 35 studies (2 studies were each covered by 2 publications). Most of the studies (28/35, 80%) were from North America (United States: n=26, 93%; Canada: n=2, 7%). The remaining studies (7/35, 20%) were from Europe (Italy: n=1, 3%; Ireland and United Kingdom: n=1, 3%; United Kingdom: n=1, 3%), Australia (n=2, 6%), and Gambia (n=1, 3%), and 1 (3%) was multinational. Taken together, these studies included a total of 13,281 participants. The number of participants per study ranged from 9 to 3485. In total, 13 (37%) of the 35 studies were conducted as part of randomized (n=10) or nonrandomized (n=3) clinical research studies, 14 (40%) were simulated consent studies, and 8 (23%) were survey or interview studies. Most of the research and simulated consent studies (23/27, 85%) were conducted in person ( ). Comparative data on patient comprehension, acceptability, and usability with eConsenting were provided in 26, 13, and 6 studies, respectively, of which 20, 8, and 5 studies, respectively, included comparisons of eConsent versus paper-based ICFs. Aspects of eConsent in relation to enrollment rates, retention, cycle time, staff workload, and stakeholder views were covered in 12, 1, 13, 3, and 5 studies, respectively. Participant ages ranged from 8 to 91 years in the 14 studies that reported age range information. Among the 23 studies that provided sufficient information on average (mean or median) age, the average age was <50 years in 12 studies and ≥50 years in 11 studies.
Overall, 26 studies (8778 participants in total) assessed aspects of patient comprehension with eConsenting.
Patient Comprehension: eConsenting Versus Paper
Comparative information on comprehension with eConsenting versus paper-based ICFs was provided in 20 studies, including a total of 6769 participants (of whom 5809 contributed comparative data on comprehension). All 20 studies reported significantly better understanding with eConsent, better understanding but without significance testing, or no significant differences in overall understanding ( ).
| Study, year | Sample size, N | Age (years) | Measure | Validityb | Comprehension findingsa |
| --- | --- | --- | --- | --- | --- |
| Abujarad et al, 2021 | 50 | | | | |
| Afolabi et al, 2015 | 311 | | | | |
| Bickmore et al, 2009 | 29 | | | | |
| Buckley et al, 2020 | 97 | | | | |
| Chalil Madathil et al, 2013 | 40 | | | | |
| Chapman et al [, ], 2021 and 2020 | 298 | | | | |
| Harmell et al, 2012 | 35 | | | | |
| Jayasinghe et al, 2019 | 35 | | | | |
| Jeste et al, 2009 | 60 | | | | |
| Knapp et al, 2021 | 109 | | | | |
| McCarty et al, 2015 | 56 | | | | |
| McGraw et al, 2012 | 43 | | | | |
| Rothwell et al, 2020 | 669 | | | | |
| Rothwell et al, 2014 | 62 | | | | |
| Rowbotham et al, 2013 | 75 | | | | |
| Simon et al, 2016 | 200 | | | | |
| Simon et al, 2021 | 501 | | | | |
| Sonne et al, 2013 | 61 | | | | |
| Varnhagen et al, 2005 | 3045 | | | | |
| Warriner et al, 2016 | 33 | | | | |
aSignificant P values, effect sizes, and CIs are reported when provided in the publications.
bMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).
cNR: not reported.
dQuIC: Quality of Informed Consent. Part A=20 questions, self-rated (agree, unsure, and disagree); part B=14 questions to self-rate the understanding of different aspects on a scale of 1 to 5.
eDICCQ: Digitized Informed Consent Comprehension Questionnaire. A total of 26 questions (9 yes or no, 6 multiple-choice single answer, 4 multiple-choice multiple answer, and 7 verbal recall), with investigator-rated responses.
fBICEP: Brief Informed Consent Evaluation Protocol. Contains 12 open questions, scored by the interviewer, and assesses pressure to participate, understanding of care if not consented, benefits, risks, study requirements, purpose of the study, when the study ends, and when participants could withdraw consent.
gUBACC: University of California San Diego Brief Assessment of Capacity to Consent. It contains 10 open questions on study purpose, requirement to participate, impact of withdrawing, study requirements, risks and benefits, and costs.
hMacCAT-CR: MacArthur Competence Assessment Tool for Clinical Research. Assesses Understanding (scores range from 0 to 26), Appreciation (0-6), Reasoning (0-8), and expression of a choice.
iDMQ: Decision-Making Questionnaire.
jHealth-ITUES: Health Information Technology Usability Evaluation Scale.
Different methods were used across studies to assess comprehension, some of which were more robust than others in their approaches. Overall, 10 studies included established instruments to assess comprehension, and their methodological validity was thus categorized as “high” (score=+++) [- , , , , , , , ]. The instruments used included the Brief Informed Consent Evaluation Protocol [ , ], Digitized Informed Consent Comprehension Questionnaire [ , ], MacArthur Competence Assessment Tool for Clinical Research [ , , ], Quality of Informed Consent [ , , , , , ], and University of California San Diego Brief Assessment of Capacity to Consent [ , , , ].
Overall, 60% (6/10) of the “high” validity studies reported significantly better understanding with eConsent than with paper-based ICFs for at least some of the concepts assessed using established instruments, with no statistically significant differences favoring the paper process [, , , , , ]. The remaining 4 (40%) of the 10 studies reported no significant difference in comprehension between eConsent and a paper-based consent process [ , , , ], with 1 study reporting statistically nonsignificant better comprehension with eConsent [ ]. However, confidence in understanding was significantly lower with eConsent than with paper-based ICFs in 1 study that observed no difference in overall understanding [ ].
A further 6 studies used custom surveys or participant self-rating without formal testing to evaluate comprehension, and their methodological validity was thus categorized as “moderate” (score=++) [, , , , , ]; one of these studies used both “high” and “moderate” validity methodologies [ ]. Of the 6 “moderate” validity studies, 67% (n=4) reported significantly better comprehension with eConsent than with paper-based ICFs for at least some of the concepts assessed [ , , , ], with the remainder reporting no significant differences [ , ].
The remaining 5 studies (covered by 6 publications) used limited questioning or did not report methodological details, and their methodological validity was thus categorized as “limited” (score=+) [, - , , ]. Of the “limited” validity studies, 3 (covered by 4 publications) reported better comprehension with eConsent than with paper-based ICFs for at least some aspects [ , - ], and 2 reported no differences [ , ].
Patient Comprehension: Other Evidence
In the study by Rothwell et al ( ), participants in the eConsent group were interviewed after the consent process; several noted that the eConsent format was easy to understand and held their attention more than a paper-based approach would have done. Several further studies assessed comprehension by comparing different electronic formats, by comparing baseline and postconsent time points, or by describing results from interviews about patient preferences [ - , , , , ] ( ). Comprehension was significantly better with a highly interactive eConsent version than with less-interactive versions in a study by Geier et al [ ]. Participants in a study by Naeim et al [ ] found that information was easier to understand when the video presentation was animated rather than text based. Perrault and Keating [ ] found that text layouts using line spacing, bold font, and bullet points could improve comprehension compared with a bullet-pointed flowchart. Golembiewski et al [ ] and Harle et al [ ] observed no significant differences in understanding between a standard tablet-based version and versions that had key terms hyperlinked to additional research-related information. Tait et al [ ] showed that parents’ and children’s understanding of clinical trial–related terminology was improved after eConsenting compared with baseline. Most participants (67%) in a survey of clinical trial researchers by Zeps et al [ ] thought that eConsent would improve patients’ comprehension.
Overall, 13 studies (1694 participants in total) assessed aspects of patient acceptability with eConsenting.
Patient Acceptability: eConsenting Versus Paper
Comparative information on the acceptability of eConsenting versus paper-based ICFs was provided in 8 studies, including a total of 631 participants (of whom 621 contributed comparative data on acceptability). All 8 studies reported significantly higher satisfaction or enjoyment with eConsent, higher satisfaction but without significance testing, or no differences in acceptability ( ). Only one of the studies was categorized as having “high” methodological validity, having used an established instrument to assess acceptability, in this case, the Computer System Usability Questionnaire [ , ]. This study reported statistically significantly higher satisfaction scores with eConsent than with paper-based ICFs [ ]. The methodology used to assess acceptability was categorized as having “limited” validity in the remaining 7 studies (covered by 8 publications) [ , , , , , , , ]. Of these, 6 studies reported higher acceptability with eConsent than with paper-based ICFs [ , , , , , ], and in 3 of these studies, at least some of the differences were statistically significant [ , , ].
| Study, year | Sample size, N | Age (years) | Measure | Validitya | Acceptability findings |
| --- | --- | --- | --- | --- | --- |
| Abujarad et al, 2021 | 50 | | 3 questions as part of a 12-question survey (Likert scale) | + | Significantly higher satisfaction scores with eConsent vs paper for 1 of 3 questions (P=.01); no significant differences for the other 2 questions |
| Bickmore et al, 2009 | 29 | | 1 question (Likert scale) | + | Significantly higher satisfaction scores with eConsent vs paper or verbal (P=.02) |
| Chalil Madathil et al, 2013 | 40 | | CSUQc overall satisfaction score | +++ | Higher satisfaction scores with eConsent formats vs paper; difference across the different formats statistically significant (P<.05) |
| Chapman et al [ ], 2021 and 2020 | 298 | | 3 questions (multiple choice) | + | Similar levels of overall acceptability |
| Harmell et al, 2012 | 35 | | 1 question (multiple choice) | + | Proportion of patients preferring current vs past consenting experience higher with eConsent vs paper (P value NR) |
| Rowbotham et al, 2013 | 75 | | 2 questions (Likert scales) | + | Significantly higher scores with eConsent vs paper for enjoyment (P<.05); no significant difference for satisfaction (P=.09) |
| Sonne et al, 2013 | 61 | | 1 question | + | 79% of participants preferred eConsent over the paper format |
| Warriner et al, 2016 | 33 | | 3 questions (Likert scales) | + | Higher satisfaction with eConsent vs paper, but difference not statistically significant |
aMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).
bNR: not reported.
cCSUQ: Computer System Usability Questionnaire. It contains 19 questions measuring overall satisfaction, system usefulness, information quality, and interface quality.
Patient Acceptability: Other Evidence
Several studies described viewpoints regarding consenting format preferences or compared acceptability across different electronic formats [, , , , , ] ( ). The survey and interview results indicated a preference for eConsent over paper-based ICFs. McGowan et al [ ] reported that 52% of their study sample preferred eConsent, 46% had no preference, and only 3% would have preferred face-to-face consenting. Similarly, in a survey by Vercauteren et al [ ], 41% of the respondents preferred eConsent, 41% had no preference, and only 16% preferred paper-based ICFs. The focus group participants in a study by Jimison et al [ ] thought that eConsent was useful and could replace paper-based ICFs. Only 22% of legally authorized representatives who eConsented on behalf of clinical study patients would have preferred a paper-based ICF in a study by Haussen et al [ ]. The studies by Golembiewski et al [ ] and Harle et al [ ] compared different formats of eConsenting and observed no significant differences in acceptability between the standard version and the version with hyperlinks to additional materials.
Overall, 6 studies (582 participants in total) assessed aspects of the usability of eConsenting for patients.
Patient Usability: eConsenting Versus Paper
Comparative information on the usability of eConsenting versus paper-based ICFs was provided in 5 studies, including a total of 542 participants (of whom 532 contributed comparative data on usability). All 5 studies reported significantly better usability with eConsenting, better usability but without significance testing, or no differences in usability ( ). One study had “high” methodological validity for assessing usability, having measured this via the Computer System Usability Questionnaire, and reported statistically significantly higher usability scores with eConsent than with paper-based ICFs [ ]. One study had “moderate” methodological validity and observed no overall significant difference in usability between eConsent and paper-based ICFs [ ]. Three studies (covered by 4 publications) with “limited” validity reported better usability with eConsent than with paper-based ICFs [ , , , ], and in 2 of these studies, at least some of the differences were statistically significant [ , , ].
| Study, year | Sample size, N | Age (years) | Measure | Validitya | Usability findings |
| --- | --- | --- | --- | --- | --- |
| Abujarad et al, 2021 | 50 | | 1 question (Likert scale) | + | eConsent participants scored the process as significantly less difficult than paper consent participants (P=.02) |
| Chalil Madathil et al, 2013 | 40 | | CSUQc system usefulness and interface quality subscales | +++ | Higher usefulness and interface quality scores with eConsent formats vs paper; difference across the different formats statistically significant (P<.05) |
| Chapman et al [, ], 2021 and 2020 | 298 | | 2 questions (multiple choice) plus successful completion | + | Significantly better scores with eConsent vs paper for engagement with study information (P<.001); no significant difference for improvement. All participants successfully completed the consenting process |
| Jayasinghe et al, 2019 | 35 | | 10 questions (Likert scales) | ++ | Overall, no statistically significant difference with eConsent vs paper |
| Knapp et al, 2021 | 109 | | 1 question (Likert scale) | + | Better scores with eConsent vs paper (P value NR) |
aMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).
bNR: not reported.
cCSUQ: Computer System Usability Questionnaire. It contains 19 questions measuring overall satisfaction, system usefulness, information quality, and interface quality.
Patient Usability: Other Evidence
In a study by Simon et al ( ), participants who were asked about their impressions of the electronic and paper-based informed consent processes described the electronic process as well organized, easy to use, and useful.
A total of 12 studies (6399 participants in total) assessed the effect of the ICF format on patient enrollment, with mixed results. Consenting rates with eConsenting versus paper-based ICFs were compared in 5 studies [, , , , ]. In the study by Bickmore et al [ ], a significantly higher proportion of participants in the eConsent group than in the paper group signed their ICFs (P=.01). Consenting rates were also higher with eConsent than with paper-based ICFs in a study by Chalil Madathil et al [ ] (P value not reported). Consenting rates were similar between the groups in a study by Jeste et al [ ] (P value not reported), and Rothwell et al [ ] reported that consenting rates were similar in the paper-based ICF and video eConsent groups but lower in the app eConsent group (P value not reported). In a study by Simon et al [ ], enrollment was significantly higher with a face-to-face informed consent process than with eConsenting (P=.004), although immediately after the consenting process, similar proportions of the 2 groups had reported their intention to enroll; the eConsent process was conducted at the same location as the face-to-face process.
Overall, 4 studies reported eConsenting rates using different electronic media formats [, , , , ]. No significant differences in enrollment rates were observed with animated versus text-based video consents by Naeim et al [ ], with video- versus text-based consenting by Fanaroff et al [ ], or with different levels of eConsent interactivity by Golembiewski et al [ ] and Harle et al [ ]. Siegel et al [ ] observed an increase in enrollment rates after a content redesign and attributed the increased rates to the web-based consenting being directly integrated with new patient on-boarding.
Furthermore, 3 studies described results about preferences and found that the format of the ICF made little difference to participants’ decision-making regarding study participation [, , ]. In the study by Abujarad et al [ ], participants were asked to score the importance of the consenting process in their decision to participate; scores were not significantly different between the paper ICF and eConsent groups. In the study by Knapp et al [ ], similar proportions of patients in the paper-based ICF and eConsent groups found that the trial information provided helped them make their decision about whether to take part [ ]. Study researchers surveyed in the study by Cagnazzo et al [ ] thought that the use of eConsent had little influence on whether patients declined to participate in a study.
None of the included studies reported overall study retention comparisons between the 2 consenting approaches of paper-based ICFs versus eConsenting. Fanaroff et al  (3485 participants) assessed different formats of eConsenting and found no statistically significant differences in the proportions of enrolled patients who subsequently completed the 2 requested study procedures, namely, a blood draw and survey questions.
A total of 13 studies (2063 participants in total) assessed cycle time, and 10 studies (covered in 11 publications) assessed the comparative effect of eConsent versus paper-based ICFs on consenting times, 2 studies (3 publications) asked about perceived consenting time, and 1 study assessed consenting times with different electronic formats. eConsenting took more time than paper consenting in the studies by Chapman et al [, ] (P=.006), Jayasinghe et al [ ] (P<.001), McCarty et al [ ] (P<.001), Rowbotham et al [ ] (P<.001), Simon et al [ ] (P<.001), Sonne et al [ ] (P value not reported), and Varnhagen et al [ ] (P<.001; partial η2=0.36). eConsenting was faster than paper consenting in the studies by Afolabi et al [ ] and Jeste et al [ ] (P value not reported in either study). Chalil Madathil et al [ ] found no significant effect of consenting condition on time taken to complete the task. Abujarad et al [ ] and Warriner et al [ ] asked participants about their perceived time to complete the task and found no statistically significant differences between the eConsent and paper-based ICF groups. Different electronic formats of eConsenting did not significantly affect consenting times in the study by Golembiewski et al [ ] and Harle et al [ ].
In total, 3 studies (3284 participants in total) assessed site workload. Hospital staff in the study by Chalil Madathil et al reported lower subjective workload with eConsenting than with paper-based formats (P=.02), as assessed using the National Aeronautics and Space Administration Task Load Index. Site advisory group feedback in the study by Vanaken and Masand [ ] included a beneficial reduction in the administrative burden and in the paper trail, although a potential for increased workload was also noted, for example, in relation to training and device management. In the study by Zeps et al [ ], clinical trial researchers noted that eConsent devices could be clunky and prone to malfunction, which increased overall study time and burdened trial staff.
Overall, 5 studies (3416 participants in total) assessed stakeholder views. Staff in the study by Chalil Madathil et al preferred eConsenting formats over paper-based consenting (differences among systems, P<.005). In the study by Warriner et al [ ], findings from a telephone survey of practice sites that administered both consent processes favored eConsent over paper-based ICFs, but the differences were not statistically significant. Health authority representatives favored the broad implementation of eConsent in alignment with local regulations in the study by Vanaken and Masand [ ]. However, approximately half (53%) of the surveyed research participants preferred having both a paper document and an eConsenting system [ ]. Similarly, most centers (65%) in a survey by Cagnazzo et al [ ] preferred using a paper-based ICF in parallel with eConsenting. Clinical research stakeholders surveyed by Cagnazzo et al [ ] in late 2020 thought that, at a regulatory level, the use of eConsent might increase the time to study approval. In the survey of clinical trial researchers’ opinions on eConsent conducted by Zeps et al [ ] in early 2019, a total of 68% of the respondents believed that ethics committees would not approve the use of eConsent or were unsure whether they would. In addition, 67% of the respondents thought that the lack of standardized, consistent guidance across the sector was an important barrier to success, and 60% believed that the high initial cost might be a barrier to uptake.
Our systematic literature review aimed to assess the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, study enrollment and retention rates, cycle time, and site workload, primarily in comparison with traditional paper-based consenting. We identified 37 primary publications for inclusion that together described 35 studies (13,281 participants in total). Our results showed that compared with patients who used paper-based consenting, patients who used eConsent had a better understanding of the trial information, showed greater engagement with the content, and rated the consenting process as more acceptable and usable. Cycle times were increased with eConsent, potentially reflecting this greater engagement with the content. Data on enrollment, retention, and site workload effects were limited. Some general themes emerged in relation to the effectiveness of eConsent, its administrative aspects, and the variability in the eConsenting formats used across studies. We discuss these under the following subheadings.
Comprehension, Acceptability, and Usability
Informed consent involves providing potential clinical trial participants with adequate information on what the study involves, including the risks and benefits of participation, to allow them to make a fully informed decision on whether to participate. Knowing that potential trial participants have understood the study information is thus of utmost importance. Our systematic review showed good evidence of improvements in comprehension with eConsent versus paper-based ICFs. Assessments of patients’ experiences with eConsenting need to distinguish between the content of the eConsent information and the workability of the digital platform. Our findings in terms of comprehension, acceptability, and usability were consistent, showing either overall benefits to patients of eConsenting versus paper-based ICFs or no significant overall differences. Patients reported higher satisfaction and enjoyment with the eConsent process than with paper-based consenting and found eConsenting both more useful and less difficult to use than paper versions. None of the studies reported significantly higher overall patient benefits with paper-based ICFs than with eConsent.
Studies were limited in terms of their exploration of why eConsent was more effective than paper-based consenting. Craik and Lockhart, in their “levels of processing” framework for memory research, suggest that learning and memory are improved when information is processed in depth. This deeper level of processing might be achieved in many ways. Research by Dellson et al [ ] suggests that the use of good graphic design in consent materials, for example, using illustrations rather than text to explain treatment regimens, raises potential participants’ motivation to engage with the materials and facilitates their understanding of the clinical study. In the cognitive theory of multimedia learning, Mayer [ ] proposes that people can learn more deeply with multisensory processing, when audio and visual information is presented simultaneously [ , ]. Further improvements in learning efficiency are obtained with user-focused active engagement [ ]. Compared with text alone, the use of multimedia is also likely to increase attention arousal [ ], which is typically associated with increased learning [ , ]. However, maximizing sustained attention needs to be balanced against the cognitive processing effort, which should not be increased beyond the cognitive capacity of the participant [ ]. In their study of web-based lectures, Chen and Wu [ ] found that visual information presented with a voice-over increased the sustained attention workload and negatively affected learning performance, compared with visual information presented alongside video and audio of the presenter. Future research might wish to explore the role of cognitive capacity in eConsent comprehension and ways to mitigate cognitive demands, for example, via the use of self-pacing [ ].
Effectiveness in Older Age Groups
Encouragingly, many of the studies that we examined included patients up to the age of 91 years. Although the studies did not examine age cohorts separately, the positive effectiveness findings also applied to patients in older adult age groups, suggesting that age does not negatively affect the effectiveness of eConsent, although more data are needed for a comprehensive assessment. In a cardiovascular study that included 298 participants with a mean age of 63 (range 45-74) years, those randomized to eConsent, which consisted of multimedia including video-, audio-, and computer-based finger-signed consent, had a better understanding of the study requirements than their counterparts randomized to traditional paper-based consenting [, ]. Older adults have been found to integrate more of the audiovisual information in their environment when performing tasks and to benefit more from multisensory processing than younger adults do [ ], further supporting the use of eConsent in older age groups. The perceived lower technology literacy in some older cohorts can be mitigated with good solution design and effective training [ ].
In addition to comprehension, the usability data showed that eConsenters had better engagement with study information than their paper-based ICF counterparts, and acceptability was similar for the 2 consenting formats [, ]. Focus group discussions with individuals aged ≥65 years yielded frequently cited advantages of eConsenting, including its convenience and the usefulness of additional features such as definitions, graphics, and audio [ ]. Although not evaluated here, other patient characteristics, such as cognitive ability, dexterity, and technology literacy, likely have a greater impact than age per se on the usability and acceptability of digital solutions within clinical trials.
Overall, there was no consensus across publications as to whether a patient’s likelihood of enrolling in a study is affected by whether the consenting process is electronic or paper based [, , , , ]. When questioned, patients indicated that the format of the ICF made little difference to their decision-making regarding study participation [ , , ]. However, eConsenting had the potential to increase patient enrollment by increasing accessibility when integrated into a web-based patient platform [ ].
Potentially more relevant than enrollment effects is whether improved patient comprehension of the study and its requirements leads to enhanced trial retention. We identified a marked gap in the comparative research on the effect of eConsent on patient retention within clinical studies. The observed improvements in comprehension with eConsent could potentially serve as a surrogate for retention because not fully understanding the study requirements beforehand is known to be a key reason for early withdrawal from clinical trials.
Most studies in this review that assessed the time it took for patients to undertake the consenting process with eConsent versus paper found that eConsenting took more time than paper consenting [, , , , , , , ]. This finding is not unexpected. As eConsent is better able to hold patients’ attention than paper-based approaches [ ], eConsenting patients are likely to engage more fully with the information provided, thus increasing cycle time. Explanations provided by the primary study authors for the increased time taken with eConsent included that this format enabled participants to engage more with the study information [ , ], that participants made use of the opportunities to view additional information available in the eConsent format [ ], or that participants took time to listen to slide narration [ ]. To mitigate the increase in cycle time, clinical researchers might consider providing remote eConsent access ahead of a study visit.
We identified only limited comparative information on site workload. None of the studies assessed workload across the entirety of a clinical trial. The workload advantages of the fully digitized consent process versus paper-based consenting may become visible only later in the clinical study timeline, when the administrative burden with paper-based consenting may increase owing to data quality issues.
One study assessed hospital staff’s subjective workload and found it to be reduced with eConsent versus paper. Advisory group feedback included a reduction in the administrative burden and paper trail, better version control, fewer issues around missing dates or signatures on forms, improved data quality, better participant oversight, and a reduced number of site visits, potentially offset by an increased workload in relation to training and device management [ ]. Site staff and health authority representatives tended to prefer eConsenting formats over paper-based consenting [ , ], although a preference for using both a paper document and an eConsenting system was also reported [ , ]. Technical difficulties with devices were noted as a potential burden for trial staff [ ]. Sonne et al [ ] reported that 1 in 5 participants experienced technical difficulties, including videos not loading or needing to be restarted and internet connection issues, although that study was published in 2013 and is thus unlikely to reflect current setups.
Flawed informed consent processes are among the most common regulatory and inspection findings for clinical trials [- ]. Although we did not review this aspect, we expect that eConsent would implicitly protect against most of the common reasons for such findings. eConsent solutions prevent the lodgment of forms with incomplete information, missing signatures, or signing of incorrect versions and preclude retrospective signing, which cannot be detected or demonstrably proven with paper-based consenting. A fully digital consent process would allow the success of the consent form content and system to be evaluated and continuously improved for patients. With paper-based ICFs, such evaluations would not be possible without including validated questionnaires in each study. Moreover, the ability to track the withdrawal of consent at an individual or a sample level benefits patients by ensuring that their data or samples are not used for future research. The industry would benefit by being able to comply with patients’ wishes by not using data or samples outside of the study. Future work will need to evaluate and confirm these benefits.
The types of eConsent used varied considerably across the included studies. Formats included straightforward digitization of paper documents, signature management systems, audio- and video-enhanced content, and fully interactive systems. Across formats, the findings were consistent with active multimedia engagement principles: significantly improved comprehension was achieved with highly interactive versions compared with less interactive eConsent versions. Animated video–based information was found to be easier to understand than text-based videos [ ]. Even for text-based formats, the use of line spacing, bold font, and bullet points could improve comprehension [ ]. This variability in format has implications for future work, which might explore the differences and the most effective formats further.
There was also variability in terms of how comprehension, acceptability, and usability were assessed. Among the studies that assessed the effectiveness of eConsenting compared with paper-based consenting, half of the 20 studies on comprehension had high methodological validity, but only 1 of the 8 studies on acceptability and 1 of the 5 studies on usability did so.
In its guidance on the use of electronic informed consent, the FDA notes the following:
[Electronic informed consent] may be used to provide information usually contained within the written informed consent document, evaluate the subject’s comprehension of the information presented, and document the consent of the subject or the subject’s legally authorized representative. Electronic processes to obtain informed consent may use an interactive interface, which may facilitate the subject’s ability to retain and comprehend the information.
In line with the FDA guidance, we suggest that a consenting format should be referred to as eConsent only if it can support patient engagement using multimedia components (eg, text, graphics, audio, and video) together with interactive functionalities to share information related to the study. If a digital consent solution does not have these capabilities, we suggest that it should be referred to as a digital consent form rather than a true eConsent solution.
Limitations of our systematic literature search include the fact that the search strategy, which was chosen to keep the number of publications for screening manageable, will have missed some potentially relevant studies. Among eConsent studies, there may be a reporting bias in favor of those finding comparative differences. Differences in the outcome measures used, including differences in their validity, made comparisons between studies challenging and pooling across studies unfeasible. Most studies included in this review did not provide a detailed description of the eConsent format; such information should be included in future studies to allow researchers to assess and compare results across studies. These observations are a call to action to harmonize the analysis, documentation, and reporting of eConsent findings, as well as the parameters defining best practices for eConsent (including whether these are met by current eConsent vendors). The same applies to the terminologies and processes used around eConsent. An ongoing initiative of the European Forum for Good Clinical Practice aims to achieve this standardization within clinical trials.
In conclusion, this systematic review showed overall patient benefits with eConsent versus paper-based consenting in terms of understanding, acceptability, and usability. No study reported significantly better overall patient benefits with paper-based ICFs than with eConsent. eConsenting may also increase enrollment into clinical studies by improving access to research. Comparative data from site staff and other study researchers indicate the potential for reduced workload and lower administrative burden with eConsent. In addition to these benefits, there are various other advantages associated with the use of digital solutions, including preventing flawed consenting processes, ensuring data quality, and supporting study integrity. Importantly, there are several avenues for future research that we believe are necessary. These include, but are not limited to, research that explores the best methodologies to target specific measures of eConsent efficacy (eg, recruitment, retention, and site experience); research that explores cross-cultural and globalization elements of eConsent; and research that makes better use of the theoretical underpinnings for why eConsent methods are more efficacious.
Oxford PharmaGenesis, with financial support from AstraZeneca, provided support for the development of this manuscript for publication, including assistance with the systematic literature review and data extraction. AstraZeneca authors participated in the study design and in data collection and analysis.
EC, BB, and MJ-K were involved in the conception and design of the study. Acquisition, analysis, and interpretation of data were carried out by EC, BB, AB, MJ-K, and AKM. Drafting and critically revising the article were carried out by EC, BB, AB, MJ-K, and AKM. All the authors provided final approval of the version to be published.
Conflicts of Interest
EC is an employee at AstraZeneca. BB is an employee at Signant Health. AB is a contractor at Oxford PharmaGenesis. MJ-K is an employee and shareholder at AstraZeneca. AKM has no conflicts of interest to declare.
PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) checklist. DOCX file, 36 KB
Overview of the studies identified for inclusion. DOCX file, 80 KB
- E6(R2) Good clinical practice: integrated addendum to ICH E6(R1). U.S. Food & Drug Administration. 2018 Mar. URL: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/e6r2-good-clinical-practice-integrated-addendum-ich-e6r1 [accessed 2022-06-23]
- Schumacher A, Sikov WM, Quesenberry MI, Safran H, Khurshid H, Mitchell KM, et al. Informed consent in oncology clinical trials: a Brown University Oncology Research Group prospective cross-sectional pilot study. PLoS One 2017 Feb 24;12(2):e0172957 [https://dx.plos.org/10.1371/journal.pone.0172957] [CrossRef] [Medline]
- Retention in clinical trials: keeping patients on protocols. Advarra. 2021 Mar 23. URL: https://www.advarra.com/resource-library/retention-in-clinical-trials-keeping-patients-on-protocols/ [accessed 2022-06-23]
- Gogtay NJ, Doshi BM, Kannan S, Thatte U. A study of warning letters issued to clinical investigators and institutional review boards by the United States Food and Drug Administration. Indian J Med Ethics 2011 Oct 1;8(4):211-214 [CrossRef]
- Rogers CA, Ahearn JD, Bartlett MG. Data integrity in the pharmaceutical industry: analysis of inspections and warning letters issued by the bioresearch monitoring program between fiscal years 2007-2018. Ther Innov Regul Sci 2020 Sep 24;54(5):1123-1133 [https://europepmc.org/abstract/MED/32096103] [CrossRef] [Medline]
- Bernabe RD, van Thiel GJ, Breekveldt NS, Gispen CC, van Delden JJ. Ethics and the marketing authorization of pharmaceuticals: what happens to ethical issues discovered post-trial and pre-marketing authorization? BMC Med Ethics 2020 Oct 27;21(1):103 [https://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-020-00543-w] [CrossRef] [Medline]
- Takaoka A, Zytaruk N, Davis M, Matte A, Johnstone J, Lauzier F, PROSPECT Investigators, the Canadian Critical Care Trials Group. Monitoring and auditing protocol adherence, data integrity and ethical conduct of a randomized clinical trial: a case study. J Crit Care 2022 Oct;71:154094 [https://linkinghub.elsevier.com/retrieve/pii/S0883-9441(22)00123-X] [CrossRef] [Medline]
- Use of electronic informed consent: questions and answers. U.S. Food and Drug Administration. 2016 Dec. URL: https://www.fda.gov/media/116850/download [accessed 2022-06-23]
- Electronic informed consent implementation guide: practical considerations. European CRO Federation. 2021 Mar. URL: https://www.eucrof.eu/images/Electronic_Informed_Consent_Implementation_Guide_Practical_Considerations_Version_1.0___March_2021_2.pdf [accessed 2022-06-23]
- Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 2021 Mar 29;372:n71 [http://www.bmj.com/lookup/pmidlookup?view=long&pmid=33782057] [CrossRef] [Medline]
- How to compare Ovid MEDLINE and PubMed. Wolters Kluwer. 2019 Nov. URL: https://tools.ovid.com/ovidtools/pdf/Ovid_MEDLINE_and_PubMed_compared.pdf [accessed 2023-06-18]
- Ovid tools and resources portal. Wolters Kluwer. URL: https://tools.ovid.com/ovidtools/medline.html [accessed 2023-06-18]
- Abujarad F, Peduzzi P, Mun S, Carlson K, Edwards C, Dziura J, et al. Comparing a multimedia digital informed consent tool with traditional paper-based methods: randomized controlled trial. JMIR Form Res 2021 Oct 19;5(10):e20458 [https://formative.jmir.org/2021/10/e20458/] [CrossRef] [Medline]
- Afolabi MO, McGrath N, D'Alessandro U, Kampmann B, Imoukhuede EB, Ravinetto RM, et al. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial. Bull World Health Organ 2015 May 01;93(5):320-38A [https://europepmc.org/abstract/MED/26229203] [CrossRef] [Medline]
- Bickmore TW, Pfeifer LM, Paasche-Orlow MK. Using computer agents to explain medical documents to patients with low health literacy. Patient Educ Couns 2009 Jun;75(3):315-320 [https://europepmc.org/abstract/MED/19297116] [CrossRef] [Medline]
- Buckley MT, Lengfellner JM, Koch MJ, Search B, Hoidra C, Lin M, et al. MSK eConsent: digitalizing the informed consent process to improve participant engagement and understanding. J Clin Oncol 2020 May 20;38(15_suppl):2066 [CrossRef]
- Cagnazzo C, Nanni O, Di Costanzo A, Cenna R, Marchetti F, La Verde N, et al. 1856P Electronic informed consent: the need to redesign the consent process for the digital era. Ann Oncol 2021 Sep;32(11-12):S1249 [CrossRef]
- Chalil Madathil K, Koikkara R, Obeid J, Greenstein JS, Sanderson IC, Fryar K, et al. An investigation of the efficacy of electronic consenting interfaces of research permissions management system in a hospital setting. Int J Med Inform 2013 Sep;82(9):854-863 [https://europepmc.org/abstract/MED/23757370] [CrossRef] [Medline]
- Chapman N, McWhirter R, Armstrong M, Fonseca R, Campbell J, Nelson M, et al. Multimedia for delivering participant informed consent in cardiovascular trials. J Hypertens 2021 Apr;39:e217-e218 [CrossRef]
- Chapman N, McWhirter R, Armstrong MK, Fonseca R, Campbell JA, Nelson M, et al. Self-directed multimedia process for delivering participant informed consent. BMJ Open 2020 Jul 26;10(7):e036977 [https://bmjopen.bmj.com/lookup/pmidlookup?view=long&pmid=32713850] [CrossRef] [Medline]
- Fanaroff AC, Li S, Webb LE, Miller V, Navar AM, Peterson ED, et al. An observational study of the association of video- versus text-based informed consent with multicenter trial enrollment: lessons from the PALM Study (Patient and Provider Assessment of Lipid Management). Circ Cardiovasc Qual Outcomes 2018 Apr;11(4):e004675 [https://europepmc.org/abstract/MED/29625993] [CrossRef] [Medline]
- Geier C, Adams RB, Mitchell KM, Holtz BE. Informed consent for online research-is anybody reading?: assessing comprehension and individual differences in readings of digital consent forms. J Empir Res Hum Res Ethics 2021 Jul;16(3):154-164 [CrossRef] [Medline]
- Golembiewski EH, Mainous AG, Rahmanian KP, Brumback B, Rooks BJ, Krieger JL, et al. An electronic tool to support patient-centered broad consent: a multi-arm randomized clinical trial in family medicine. Ann Fam Med 2021;19(1):16-23 [http://www.annfammed.org/cgi/pmidlookup?view=long&pmid=33431386] [CrossRef] [Medline]
- Harle CA, Golembiewski EH, Rahmanian KP, Brumback B, Krieger JL, Goodman KW, et al. Does an interactive trust-enhanced electronic consent improve patient experiences when asked to share their health records for research? a randomized trial. J Am Med Inform Assoc 2019 Jul 01;26(7):620-629 [https://europepmc.org/abstract/MED/30938751] [CrossRef] [Medline]
- Harmell AL, Palmer BW, Jeste DV. Preliminary study of a web-based tool for enhancing the informed consent process in schizophrenia research. Schizophr Res 2012 Nov;141(2-3):247-250 [https://europepmc.org/abstract/MED/22939457] [CrossRef] [Medline]
- Haussen DC, Craft L, Doppelheuer S, Rodrigues GM, Al-Bayati AR, Ravindran K, et al. Legal authorized representative experience with smartphone-based electronic informed consent in an acute stroke trial. J Neurointerv Surg 2020 May;12(5):483-485 [CrossRef] [Medline]
- Jayasinghe N, Moallem BI, Kakoullis M, Ojie MJ, Sar-Graycar L, Wyka K, et al. Establishing the feasibility of a tablet-based consent process with older adults: a mixed-methods study. Gerontologist 2019 Jan 09;59(1):124-134 [https://europepmc.org/abstract/MED/29757375] [CrossRef] [Medline]
- Jeste DV, Palmer BW, Golshan S, Eyler LT, Dunn LB, Meeks T, et al. Multimedia consent for research in people with schizophrenia and normal subjects: a randomized controlled trial. Schizophr Bull 2009 Jul;35(4):719-729 [https://europepmc.org/abstract/MED/18245061] [CrossRef] [Medline]
- Jimison HB, Sher PP, Appleyard R, LeVernois Y. The use of multimedia in the informed consent process. J Am Med Inform Assoc 1998;5(3):245-256 [https://europepmc.org/abstract/MED/9609494] [CrossRef] [Medline]
- McCarty CA, Berg R, Waudby C, Foth W, Kitchner T, Cross D. Long-term recall of elements of informed consent: a pilot study comparing traditional and computer-based consenting. IRB 2015;37(1):1-5 [https://europepmc.org/abstract/MED/26247077] [Medline]
- McGowan CR, Houlihan CF, Kingori P, Glynn JR. The acceptability of online consent in a self-test serosurvey of responders to the 2014-2016 West African Ebola outbreak. Public Health Ethics 2018 Jul;11(2):201-212 [https://europepmc.org/abstract/MED/30135701] [CrossRef] [Medline]
- McGraw SA, Wood-Nutter CA, Solomon MZ, Maschke KJ, Bensen JT, Irwin DE. Clarity and appeal of a multimedia informed consent tool for biobanking. IRB 2012;34(1):9-19 [Medline]
- Naeim A, Dry S, Elashoff D, Xie Z, Petruse A, Magyar C, et al. Electronic video consent to power precision health research: a pilot cohort study. JMIR Form Res 2021 Sep 08;5(9):e29123 [https://formative.jmir.org/2021/9/e29123/] [CrossRef] [Medline]
- Perrault EK, Keating DM. Seeking ways to inform the uninformed: improving the informed consent process in online social science research. J Empir Res Hum Res Ethics 2018 Feb;13(1):50-60 [CrossRef] [Medline]
- Rothwell E, Johnson E, Wong B, Goldenberg A, Tarini BA, Riches N, et al. Comparison of video, app, and standard consent processes on decision-making for Biospecimen research: a randomized controlled trial. J Empir Res Hum Res Ethics 2020 Oct;15(4):252-260 [https://europepmc.org/abstract/MED/32242760] [CrossRef] [Medline]
- Rothwell E, Wong B, Rose NC, Anderson R, Fedor B, Stark LA, et al. A randomized controlled trial of an electronic informed consent process. J Empir Res Hum Res Ethics 2014 Dec;9(5):1-7 [https://europepmc.org/abstract/MED/25747685] [CrossRef] [Medline]
- Rowbotham MC, Astin J, Greene K, Cummings SR. Interactive informed consent: randomized comparison with paper consents. PLoS One 2013;8(3):e58603 [https://dx.plos.org/10.1371/journal.pone.0058603] [CrossRef] [Medline]
- Siegel EM, Hawkins KP, Hildreth L, Grose T, Stringfellow D, Bloomer A, et al. Process improvement in online consenting for the Moffitt Cancer Center Total Cancer Care biobanking protocol. In: Proceedings of the AACR Special Conference on Modernizing Population Sciences in the Digital Age. 2019 Presented at: AACR Special Conference on Modernizing Population Sciences in the Digital Age; February 19-22, 2019; San Diego, CA [CrossRef]
- Simon CM, Klein DW, Schartz HA. Interactive multimedia consent for biobanking: a randomized trial. Genet Med 2016 Jan;18(1):57-64 [https://linkinghub.elsevier.com/retrieve/pii/S1098-3600(21)04297-0] [CrossRef] [Medline]
- Simon CM, Schartz HA, Rosenthal GE, Eisenstein EL, Klein DW. Perspectives on electronic informed consent from patients underrepresented in research in the United States: a focus group study. J Empir Res Hum Res Ethics 2018 Oct;13(4):338-348 [CrossRef] [Medline]
- Simon CM, Wang K, Shinkunas LA, Stein DT, Meissner P, Smith M, et al. Communicating with diverse patients about participating in a biobank: a randomized multisite study comparing electronic and face-to-face informed consent processes. J Empir Res Hum Res Ethics 2022;17(1-2):144-166 [https://europepmc.org/abstract/MED/34410195] [CrossRef] [Medline]
- Sonne SC, Andrews JO, Gentilin SM, Oppenheimer S, Obeid J, Brady K, et al. Development and pilot testing of a video-assisted informed consent process. Contemp Clin Trials 2013 Sep;36(1):25-31 [https://europepmc.org/abstract/MED/23747986] [CrossRef] [Medline]
- Tait AR, Voepel-Lewis T, McGonegal M, Levine R. Evaluation of a prototype interactive consent program for pediatric clinical trials: a pilot study. J Am Med Inform Assoc 2012 Jun;19(e1):e43-e45 [https://europepmc.org/abstract/MED/21803924] [CrossRef] [Medline]
- Vanaken HI, Masand SN. Awareness and collaboration across stakeholder groups important for eConsent achieving value-driven adoption. Ther Innov Regul Sci 2019 Nov;53(6):724-735 [CrossRef] [Medline]
- Varnhagen CK, Gushta M, Daniels J, Peters TC, Parmar N, Law D, et al. How informed is online informed consent? Ethics Behav 2005;15(1):37-48 [CrossRef] [Medline]
- Vercauteren S, Virani A, Longstaff H, Robillard J, Portales‐Casamar E, Lutynski A, et al. Electronic consent for pediatric biobanking: do kids and parents understand what they consent to? Biopreserv Biobank 2020 Jun 12;18(3):O-07 [http://doi.org/10.1089/bio.2020.29065.abstracts] [CrossRef]
- Warriner AH, Foster PJ, Mudano A, Wright NC, Melton ME, Sattui SE, et al. A pragmatic randomized trial comparing tablet computer informed consent to traditional paper-based methods for an osteoporosis study. Contemp Clin Trials Commun 2016 Aug 15;3:32-38 [https://linkinghub.elsevier.com/retrieve/pii/S2451-8654(15)30035-1] [CrossRef] [Medline]
- Zeps N, Northcott N, Weekes L. Opportunities for eConsent to enhance consumer engagement in clinical trials. Med J Aust 2020 Sep;213(6):260-2.e1 [https://europepmc.org/abstract/MED/32794197] [CrossRef] [Medline]
- Knapp P, Mandall N, Hulse W, Roche J, Moe-Byrne T, Martin-Kerry J, (for the TRECA study group). Evaluating the use of multimedia information when recruiting adolescents to orthodontics research: a randomised controlled trial. J Orthod 2021 Dec 06;48(4):343-351 [https://journals.sagepub.com/doi/abs/10.1177/14653125211024250?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub 0pubmed] [CrossRef] [Medline]
- Joffe S, Cook EF, Cleary PD, Clark JW, Weeks JC. Quality of informed consent: a new measure of understanding among research subjects. J Natl Cancer Inst 2001 Jan 17;93(2):139-147 [CrossRef] [Medline]
- Afolabi MO, Bojang K, D'Alessandro U, Ota MO, Imoukhuede EB, Ravinetto R, et al. Digitised audio questionnaire for assessment of informed consent comprehension in a low-literacy African research population: development and psychometric evaluation. BMJ Open 2014 Jun 24;4(6):e004817 [https://bmjopen.bmj.com/lookup/pmidlookup?view=long&pmid=24961716] [CrossRef] [Medline]
- Sugarman J, Lavori PW, Boeger M, Cain C, Edsond R, Morrison V, et al. Evaluating the quality of informed consent. Clin Trials 2005 Sep 03;2(1):34-41 [CrossRef] [Medline]
- Jeste DV, Palmer BW, Appelbaum PS, Golshan S, Glorioso D, Dunn LB, et al. A new brief instrument for assessing decisional capacity for clinical research. Arch Gen Psychiatry 2007 Aug 01;64(8):966-974 [CrossRef] [Medline]
- Appelbaum PS, Grisso T. MacArthur Competence Assessment Tool for Clinical Research (MacCAT-CR). Sarasota, FL: Professional Resource Press; 2001.
- Lewis JR. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum Comput Interact 1995 Jan;7(1):57-78 [CrossRef]
- Craik FI, Lockhart RS. Levels of processing: a framework for memory research. J Verbal Learning Verbal Behav 1972 Dec;11(6):671-684 [CrossRef]
- Dellson P, Nilbert M, Carlsson C. Patient representatives' views on patient information in clinical cancer trials. BMC Health Serv Res 2016 Feb 01;16(1):36 [https://bmchealthservres.biomedcentral.com/articles/10.1186/s12913-016-1272-2] [CrossRef] [Medline]
- Mayer RE. Cognitive theory of multimedia learning. In: Mayer RE, editor. The Cambridge Handbook of Multimedia Learning. Cambridge: Cambridge University Press; 2005.
- Seitz AR, Kim R, Shams L. Sound facilitates visual learning. Curr Biol 2006 Jul 25;16(14):1422-1427 [https://linkinghub.elsevier.com/retrieve/pii/S0960-9822(06)01631-9] [CrossRef] [Medline]
- Shams L, Seitz AR. Benefits of multisensory learning. Trends Cogn Sci 2008 Nov;12(11):411-417 [CrossRef] [Medline]
- Michel N, Cater III JJ, Varela O. Active versus passive teaching styles: an empirical study of student learning outcomes. Hum Resour Dev Q 2009 Sep;20(4):397-418 [CrossRef]
- Andres HP. Multimedia, information complexity, and cognitive processing. Inf Resour Manag J 2004;17(1):63-78 [CrossRef]
- Chen CM, Wang JY. Effects of online synchronous instruction with an attention monitoring and alarm mechanism on sustained attention and learning performance. Interact Learn Environ 2017 Jun 30;26(4):427-443 [CrossRef]
- Steinmayr R, Ziegler M, Träuble B. Do intelligence and sustained attention interact in predicting academic achievement? Learn Individ Differ 2010 Feb;20(1):14-18 [CrossRef]
- Chen CM, Wu CH. Effects of different video lecture types on sustained attention, emotion, cognitive load, and learning performance. Comput Educ 2015 Jan;80:108-121 [CrossRef]
- Merkt M, Weigand S, Heier A, Schwan S. Learning with videos vs. learning with print: the role of interactive features. Learn Instr 2011 Dec;21(6):687-704 [CrossRef]
- de Dieuleveult AL, Siemonsma PC, van Erp JB, Brouwer AM. Effects of aging in multisensory integration: a systematic review. Front Aging Neurosci 2017 Mar 28;9:80 [https://europepmc.org/abstract/MED/28400727] [CrossRef] [Medline]
- Garner K, Byrom B. Attitudes of older people/seniors to completion of electronic patient-reported outcome measures and use of mobile applications in clinical trials: results of a qualitative research study. J Comp Eff Res 2020 Mar;9(4):307-315 [https://www.becarispublishing.com/doi/10.2217/cer-2019-0155?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub 0pubmed] [CrossRef] [Medline]
- eConsent initiative. European Forum for Good Clinical Practice. URL: https://efgcp.eu/project?initiative=eConsent [accessed 2023-03-23]
Abbreviations
eConsent: electronic consent
FDA: Food and Drug Administration
ICF: informed consent form
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
Edited by T de Azevedo Cardoso; submitted 15.11.22; peer-reviewed by R Marshall, H Vanaken; comments to author 27.02.23; revised version received 24.04.23; accepted 27.06.23; published 01.09.23
©Edwin Cohen, Bill Byrom, Anja Becher, Magnus Jörntén-Karlsson, Andrew K Mackenzie. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 01.09.2023.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.