Review
Abstract
Background: Mobile devices offer an emerging opportunity for research participants to contribute person-generated health data (PGHD). There is little guidance, however, on how to best report findings from studies leveraging those data. Thus, there is a need to characterize current reporting practices so as to better understand the potential implications for producing reproducible results.
Objective: The primary objective of this scoping review was to characterize publications’ reporting practices for research that collects PGHD using mobile devices.
Methods: We comprehensively searched PubMed and screened the results. Qualifying publications were classified according to 6 dimensions—1 covering key bibliographic details (for all articles) and 5 covering reporting criteria considered necessary for reproducible and responsible research (ie, “participant,” “data,” “device,” “study,” and “ethics,” for original research). For each of the 5 reporting dimensions, we also assessed reporting completeness.
Results: Out of 3602 publications screened, 100 were included in this review. We observed a rapid increase in publications of all types from 2016 to 2021, with the largest contribution from US authors in every publication type except review articles. Few original research publications used crowdsourcing platforms (7%, 3/45). Among the original research publications that reported device ownership, most (75%, 21/28) reported using participant-owned devices for data collection (ie, a Bring-Your-Own-Device [BYOD] strategy). A significant deficiency in reporting completeness was observed for the “data” and “ethics” dimensions (5 reporting factors were missing in over half of the research publications). Reporting completeness for data ownership and participants’ access to data after contribution worsened over time.
Conclusions: Our work depicts the reporting practices in publications about research involving PGHD from mobile devices. We found that very few papers reported crowdsourcing platforms for data collection. BYOD strategies are increasingly popular; this creates an opportunity for improved mechanisms to transfer data from device owners to researchers on crowdsourcing platforms. Given substantial reporting deficiencies, we recommend reaching a consensus on best practices for research collecting PGHD from mobile devices. Drawing from the 5 reporting dimensions in this scoping review, we share our recommendations and justifications for 9 items. These items require improved reporting to enhance data representativeness and quality and empower participants.
doi:10.2196/51955
Keywords
Introduction
Collecting Person-Generated Health Data for Research Using Mobile Devices and Their Facilitators
The proliferation of mobile devices boosts the generation of person-generated health data (PGHD). In 2018, more than 80% of Americans owned a smartphone [
, ], with modest growth to 85% by 2021 [ ]. Wearable health care devices were used by nearly 28% of the US population by 2020 [ , ]. Among smartphone users, over 50% are collecting “health-associated information” [ ]. We use the phrase “person-generated health data (PGHD)” for health-related data created, recorded, or gathered by or from individuals, family members, or other caregivers to help address a health concern inside and outside clinical settings.
PGHD from mobile devices have significant research implications. Mobile devices could facilitate access to large pools of study participant data in a granular, longitudinal, and personal way, potentially accessing high-frequency data at low costs [
]. In this study, “mobile devices” refers to smartphones and wearables, which include fitness trackers and smartwatches; “research” covers biomedical and behavioral studies [ ].
Because of the above advantages, researchers have made numerous efforts to collect PGHD for mobile device research. Those efforts are enabled by participants willing to share their PGHD for research [
] and by the informatics infrastructure needed for participants to share their data. Examples of informatics infrastructure to enable PGHD sharing for research date from 2015 and 2016, with the releases of ResearchKit (Apple Inc) [ , ] and ResearchStack (Cornell Tech and Open mHealth) [ , ] and the launch of mPower [ ], Sage Bionetworks’ first major smartphone-based health research study. Research practices dealing with PGHD from mobile devices are, in part, guided by the European Union’s General Data Protection Regulation (published in 2016; in effect since 2018) and the Food and Drug Administration’s guidelines on clinical research, medical devices, and mobile health apps (released in 2013, 2015, and 2019). Between 2000 and 2020, more than 12,000 mobile device–related health research papers were published [ ].
Potential Issues With Data Representativeness and Quality When Collecting Person-Generated Health Data From Mobile Devices
Despite the potential advantages of using PGHD for mobile device research, the wide variety of practices to collect PGHD may lead to issues with data quality. These issues may be due to emerging research strategies such as Bring-Your-Own-Device (BYOD; which uses participant-owned devices instead of provisioned devices) that can miss some patient demographics [
], thus leading to low generalizability. Similar issues have been noticed in studies using crowdsourcing platforms to collect data [ ]. Selection bias with wearables is a significant issue due to differences in access [ , ], which in turn can lead to differences in sharing data for research [ ]. Both accessibility and self-selection could lead to lower study sample diversity in terms of race and socioeconomic status. For example, the National Institutes of Health’s All of Us program allows participants to choose to contribute Fitbit (Google LLC) data through its BYOD subprogram. A higher proportion of participants in the BYOD subgroup are White, earn >US $25,000/year, and have college or advanced degrees compared with all participants [ ]. When such data are used in machine learning, the low representation of some patient groups may lead to poor prediction accuracy when applied to those groups [ , ].
Motivation for This Work
To better understand methodological issues such as those related to data representativeness and quality, we aimed to review publications on research collecting PGHD from mobile devices and to characterize the reporting elements of that research thoroughly, particularly those elements relevant to BYOD and crowdsourcing. In doing so, we sought to capture aspects of this field not addressed by previously published reviews. Several reviews explore mobile health research. A review by Cao et al [
] covering the years 2000 to 2020 and a review by El-Sherif et al [ ] covering 2020 to February 2021 identified 4 leading contributors (the United States, the United Kingdom, Canada, and Australia). The 2 reviews, however, focused only on bibliometrics. A scoping review by Fischer and Kleen [ ] published in 2021 investigated data collection by smartphone apps in longitudinal epidemiological studies, and they found a limited number of studies that integrated apps in data collection. Studies with cross-sectional designs were excluded. A scoping review by Huhn et al [ ] published in 2022 summarized wearables in health research and found that most studies were observational and that 93% of the participants were in global health studies. Smartphones were not included in their scope. Other reviews of interest include one that briefly mentioned data collection using mobile devices for health research [ ] and another that discussed the application of sensors (eg, inertial measurement units) in health care [ ]. Our review differs from these reviews by covering wearables and smartphones and their use to collect PGHD in research. We also attempt to be comprehensive in our characterization of reporting practices, our inclusion of a range of study types (eg, both longitudinal and cross-sectional studies), and our examination of bibliographic information. To the best of our knowledge, ours is the first review of publications on research collecting PGHD from mobile devices and the first to examine BYOD practices and the use of crowdsourcing platforms in this field.
This review characterizes the reporting elements of included publications and discusses the relevance of reporting practices for supporting reproducible and responsible research. There are no current, ready-to-use guidelines for reporting research involving PGHD from mobile devices. For example, the Future of Privacy Forum’s “Best Practices for Consumer Wearables & Wellness Apps & Devices” dates back to 2016 [
]. More recently, the Office of the National Coordinator for Health Information Technology released a white paper in 2018 to conceptualize a data infrastructure for the capture, use, and sharing of PGHD in care delivery and research through 2024 [ ]. The final document, however, is still under development. Also relevant are the Structured Template and Reporting Tool for Real-World Evidence (STaRT-RWE) (2021) [ ] and the Mobile Health Evidence Reporting and Assessment (mERA) checklist (2016) [ ], although both are limited. STaRT-RWE does not specify requirements for mobile device studies, and mERA is limited to health interventions. There are also some focused guidelines for app development (2016) [ ] and data integration (2021) [ ], but none fully cover what is needed for high-quality reporting of mobile device research. In fact, reporting deficiencies are widely noted in studies involving mobile devices. A systematic review by Olaye et al [ ] in 2022 found that 30% of publications failed to report quantitative adherence; such missingness can impede a complete understanding of study data. Our review characterizes the magnitude and types of reporting problems in publications on research collecting PGHD from mobile devices.
Methods
We performed a scoping review following the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist (
) [ ]. The scoping review first involved implementing a comprehensive search and screening strategy. Next, the selected articles were characterized according to their bibliographic details and 5 reporting dimensions.
Search Strategy
A PubMed search was used to find instances of explicitly described participant data contributions (eg, through the use of language such as “data donation” and “data sharing”) involving wearables, smartphones, or mobile apps (
). A MeSH (Medical Subject Headings) term analysis of a preliminary sample of articles was conducted to support PubMed query development (Yale MeSH Analyzer [ ]). We selected search terms to balance recall and precision in the search results. We ran the PubMed search on August 7, 2021. The search concepts are listed below, followed by an illustrative sketch of running the combined query programmatically.
- Search concepts
- Research questions:
- Q1a. Data donation
- (data[tw] OR record*[tw] OR information[tw]) AND (donat*[tw] OR donor*[tw])
- Q1b. Data sharing
- (data[tw] OR record*[tw] OR information[tw]) AND (“Information Dissemination”[Mesh] OR sharing*[tw] OR share*[tw])
- Q2. Wearables, smartphones, or mobile applications
- “Wearable Electronic Devices”[Mesh:NoExp] OR “Fitness Trackers”[Mesh] OR wearable sensor*[tw] OR wearable device*[tw] OR wearable technolog*[tw] OR self-tracking[tw] OR self-tracker*[tw] OR fitness tracker*[tw] OR smart watch*[tw] OR smartwatch*[tw] OR “Mobile Applications”[Mesh] OR “Cell Phone”[Mesh:NoExp] OR “Smartphone”[Mesh] OR “Computers, Handheld”[Mesh] OR Mobile health[tw] OR mHealth[tw] OR eHealth[tw] OR e-health[tw] OR e-healthcare[tw] OR mobile application*[tw] OR mobile technolog*[tw] OR app[tw] OR apps[tw] OR cell phone*[tw] OR cellphone*[tw] OR cellular phone*[tw] OR cellular telephone*[tw] OR mobile phone*[tw] OR mobile telephone*[tw] OR smart phone*[tw] OR smartphone*[tw] OR mobile device*[tw] OR personal digital assistant*[tw] OR “Digital Technology”[Mesh] OR digital technolog*[tw]
- Full search strategy
- (Q1a OR Q1b) AND Q2
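For illustration, the combined strategy (Q1a OR Q1b) AND Q2 could be executed programmatically, for example with the rentrez R package. This sketch is not part of the reported methods: rentrez is not named by the authors, the term lists shown are abbreviated, and the output file name is a placeholder.

```r
# Hedged sketch: executing the combined PubMed strategy (Q1a OR Q1b) AND Q2 with
# rentrez. The q1/q2 strings are abbreviated; paste in the full term lists from
# the search concepts above before reproducing the search.
library(rentrez)

q1 <- paste(
  '(data[tw] OR record*[tw] OR information[tw]) AND',
  '(donat*[tw] OR donor*[tw] OR "Information Dissemination"[Mesh] OR',
  'sharing*[tw] OR share*[tw])'
)
q2 <- paste(
  '"Wearable Electronic Devices"[Mesh:NoExp] OR "Fitness Trackers"[Mesh] OR',
  'wearable device*[tw] OR smartwatch*[tw] OR "Mobile Applications"[Mesh] OR',
  '"Smartphone"[Mesh] OR mHealth[tw] OR mobile device*[tw]'  # truncated here
)
query <- sprintf("(%s) AND (%s)", q1, q2)

# use_history = TRUE keeps the result set on the NCBI server so every PMID can
# be retrieved even when the hit count exceeds the default retmax
res <- entrez_search(db = "pubmed", term = query, use_history = TRUE)
res$count  # number of records matched on the day the search is run

# Fetch the matching PMIDs as plain text for import into a screening tool
pmids <- entrez_fetch(db = "pubmed", web_history = res$web_history,
                      rettype = "uilist", retmode = "text")
writeLines(pmids, "pubmed_pmids.txt")  # placeholder file name
```

Note that hit counts change as PubMed is updated, so a rerun today will not exactly reproduce the count obtained on the original search date.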
Screening and Data Charting
Covidence [
] was used to screen publications for eligibility in 2 steps, first by eliminating publications based on a title and abstract review and then by reviewing the full text of the remaining publications. A pre-established set of eligibility criteria was used for screening ( ). Because this is an emerging field and our motivation was to see trends across the full landscape, we did not set criteria for the publication time range. Screening decisions were made only after 2 independent reviewers on the screening team (SS, MA, and RHY) agreed, and all conflicts were resolved by group discussion. Library journal subscriptions and interlibrary loan were used to obtain full text, with the full text of 1 publication [ ] not available from either source. All reviewers conducted pilots for each round of screening.
We charted publications for bibliographic details and along 5 reporting dimensions (“participant,” “device,” “data,” “study,” and “ethics”; double quoted here and in the text below). Because our focus is on health data contributions from participants through BYOD and crowdsourcing platforms, we modeled the reporting dimensions on existing work in the areas of citizen science data contribution [
], bioethical approaches to using personal health data [ ], and mobile health apps [ ]. Upon reviewing the keywords of the included publications (eg, “ethical code,” “consent,” and “informed consent”), we summarized frequently occurring keyword themes (eg, consent). We then refined our charting template by selecting factors related to those themes to improve data adequacy. The final factors included under the 5 reporting dimensions are listed in . A data charting pilot with a random sample of one-fifth of the final pool of publications was conducted before finalizing the data charting template [ ]. A team of 4 reviewers (SS, MA, RHY, and ZL) charted the data using the template built in Covidence [ ], and 2 reviewers independently charted each publication. Consensus was reached by group discussion.
Theme | Criteria |
Phenomenon of interest | Data contribution (ie, data donation or data sharing). |
Technology | Wearables, smartphones, or mobile applications: the scope of wearables is limited to fitness trackers and smartwatches; surveys delivered through smartphone applications are included, and surveys delivered through websites are excluded. |
Perspective | Individual: an individual is either a patient or a nonpatient; data transfers between researchers, between a researcher and an organization, or between organizations are excluded. |
Goal | Collected data are used for research. |
Data Type | Data concerning human health. |
Study Type | —a |
Language | English. |
Time | — |
aNot applicable.
Factors | Definitions
Bibliographic details (3 factors) |
Publication year |
Publication location |
Publication type |
Reporting dimensions (21 factors) |
“Participant” |
Identity of data subject and data contributor |
Living status of data subject |
Purpose of data generation |
“Device” |
Device owner |
Device type |
Data capture mode |
“Data” |
Data type |
Data owner |
Participant access to data after contribution |
“Study” |
Research scenario |
Purpose of data collection |
Used crowdsourcing platform to collect data |
Research design |
Design duration |
Attrition reported |
“Ethics” |
Consent reported |
Informed consent reported |
Consent type |
Consent subject |
Right to opt-out reported |
Monetary benefits |
aDynamic consent is a process facilitated by collaborative and online digital platforms that allows participants to regularly check research activities and modify their consent for any upcoming research projects [
, ]. Dynamic consent can also be described as an approach to consent that enables ongoing engagement and communication between individuals and the users and custodians of their data [ ].
Data Analysis and Visualization
To determine the leading countries and their contributions to various publication types, we compared the counts and percentages of publications by type and location. In addition, we visualized the number of publications by year to evaluate publication trends. We also visualized publication counts by year and location for original research publications. Reporting dimensions, including factors concerning “participant,” “device,” “data,” “study,” and “ethics” (
), were analyzed for original research publications only. We charted the frequency and percentage of original research studies that included these 21 reporting dimension factors.
In addition, 4 factors with multiple-choice options were visualized using Venn (if options ≤3) or alluvial (if options >3) diagrams after excluding publications that did not report these factors. These 4 factors included 1 “device” dimension factor (device owner), 2 “data” dimension factors (data type and data owner), and 1 “study” dimension factor (research scenario). We also analyzed publications according to the frequency of data types collected. We assessed missing factors among original research publications to evaluate changes in reporting deficiencies for each reporting dimension. We compared the percentages of publications missing factors between 2 temporal groups (2010-2016 and 2017-2021). All analyses and visualizations were performed using R (version 4.2.1; R Core Team).
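The following is a minimal sketch of the temporal missingness comparison described above; it is not the authors’ published analysis code, and the object and column names (charted, pub_year, data_owner, and so on) are hypothetical placeholders for a charting data frame with one row per original research publication and NA wherever a factor was not reported.

```r
# Hedged sketch: percentage of publications with a missing factor in each
# temporal group (2010-2016 vs 2017-2021) and the change between groups.
# `charted`, `pub_year`, and the factor column names are assumed placeholders.
library(dplyr)
library(tidyr)

missingness_change <- charted %>%
  mutate(period = if_else(pub_year <= 2016, "2010-2016", "2017-2021")) %>%
  group_by(period) %>%
  # share of publications in each period that did not report each factor
  summarise(across(c(data_owner, data_access, opt_out, consent_type, monetary_benefits),
                   ~ round(100 * mean(is.na(.x))))) %>%
  pivot_longer(-period, names_to = "factor", values_to = "pct_missing") %>%
  pivot_wider(names_from = period, values_from = pct_missing) %>%
  mutate(change = `2017-2021` - `2010-2016`,
         substantial = abs(change) >= 20)  # 20-point threshold used in this review

missingness_change
```

The Venn and alluvial figures for the multiple-choice factors could be drawn with packages such as ggplot2 and its ggalluvial extension, although the plotting packages actually used for the published figures are not stated in the text.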
Results
Search and Data Charting
Our search strategy yielded 3206 results. A total of 100 publications remained following screening [
- ]. All 100 articles are included in our data analyses ( ; bibliographic information in ).
Trends by Publication Year, Location, and Publication Type
Publication year and location were extracted from the 100 included articles and organized by publication type (
, ). The majority of publications had a first author from the United States (46%, 46/100), European Union (20%, 20/100), United Kingdom (11%, 11/100), or Canada (6%, 6/100). Publications designated as “Research” (original research) made up the largest proportion of publications (45%, 45/100). These were followed by publications designated as “Tech” (20%, 20/100), “Other” (17%, 17/100), “Review” (13%, 13/100), and “Protocol” (5%, 5/100). Refer to for definitions of these publication types.
There was a spike in total publications in 2016, with 11 publications in 2016 compared with 3 or fewer annually beforehand. Since 2016, a minimum of 9 publications have been published annually. There was another spike in output in 2019, with 19 publications. Other trends observed were US-dominant “Tech” publications and diversely geolocated “Review” publications.

Reporting in Original Research Publications
Reporting of Participant Factors
Most publications (78%, 35/45) declared or implied that the data subject and contributor were the same individual. Almost all data subjects (96%, 43/45) were living at the time of data contribution. In most publications, analyses were conducted using data generated by data subjects for themselves during routine use (56%, 25/45) rather than explicitly for research (11%, 5/45;
).
Reporting factors | Original research publications (N=45), n (%) |
“Participant” Dimension |
Identity of data subject and data contributor (Are the data subject and the data contributor the same individual?) | ||||
Yes | 35 (78) | |||
No | 0 (0) | |||
Mixed | 6 (13) | |||
Not available | 4 (8) | |||
Living status of the data subject | ||||
Living | 43 (96) | |||
Both living and deceased | 1 (2) | |||
Not available | 1 (2) | |||
Purpose of data generation | ||||
For research | 5 (11) | |||
For self | 25 (56) | |||
Not available | 15 (33) | |||
“Device” Dimension | ||||
Device owner (multiple choices possible)a | ||||
Data subject | 19 (42) | |||
Data contributor | 18 (40) | |||
Researcher | 9 (20) | |||
Not available | 17 (38) | |||
Device type | ||||
Wearable | 7 (16) | |||
Smartphone | 17 (38) | |||
Both | 20 (44) | |||
Not available | 1 (2) | |||
Data capture mode | ||||
Active | 8 (18) | |||
Passive | 8 (18) | |||
Both | 22 (49) | |||
Not available | 7 (16) | |||
“Data” Dimension | ||||
Data type (multiple choices possible)a | ||||
Location | 18 (40) | |||
Genetics, biospecimen | 2 (4) | |||
Genetics, non-biospecimen | 3 (7) | |||
Claims or administrative | 1 (2) | |||
Clinical | 25 (56) | |||
Mental health or lifestyle | 38 (84) | |||
Data owner (multiple choices possible)a | ||||
Data subject | 3 (7) | |||
Data contributor | 4 (9) | |||
Researcher | 0 (0) | |||
Company | 3 (7) | |||
The public | 0 (0) | |||
Not available | 41 (91) | |||
Participant access to data after contribution |
Yes | 7 (16) |
No | 0 (0) |
Not available | 38 (84) |
“Study” Dimension |
Research scenario (multiple choices possible)a |
Public health | 10 (22) | |||
Commercial | 16 (36) | |||
Nonprofit | 37 (82) | |||
Not available | 2 (5) | |||
Purpose of data collection | ||||
Primary | 18 (40) | |||
Secondary | 0 (0) | |||
Both | 12 (27) | |||
Not available | 15 (33) | |||
Used crowdsourcing platform to collect data | ||||
Yes | 3 (7) | |||
Research design | ||||
Observational | 41 (91) | |||
Experimental | 4 (9) | |||
Design duration | ||||
Cross-sectional | 25 (56) | |||
Longitudinal | 18 (40) | |||
Not available | 2 (4) | |||
Attrition reported | ||||
Yes | 14 (31) | |||
“Ethics” Dimension | ||||
Consent reported | ||||
Yes | 41 (91) | |||
Informed consent reported | ||||
Yes | 28 (62) | |||
Consent type | ||||
Blanket | 0 (0) | |||
Broad | 2 (4) | |||
Tiered | 1 (2) | |||
Meta | 0 (0) | |||
Explicit | 5 (11) | |||
More than one type of consent | 2 (4) | |||
Not available | 35 (78) | |||
Consent subject | ||||
Data subject | 4 (9) | |||
Data contributor | 4 (9) | |||
Both | 29 (64) | |||
Not available | 8 (18) | |||
Right to opt-out reported | ||||
Yes | 9 (20) | |||
Monetary benefits | ||||
Unconditional | 2 (4) | |||
Conditional | 9 (20) | |||
No benefit | 6 (13) | |||
Not available | 28 (62) |
aLevels below reporting factors were cleaned to be binary if multiple choices were possible.
Reporting of Device Factors
Among publications reporting the device owner, more reported that devices were owned by participants (data subject or data contributor) than were owned by researchers (75%, 21/28 vs 32%, 9/28;
A). Furthermore, 2 publications reported a mix of device ownership (owned by data subject, data contributor, and researcher), indicating a transfer of ownership (ie, participants retained the provisioned device after the study). Wearables were more often used with smartphones than used alone (wearables and smartphones 44%, 20/45 vs wearables alone 16%, 7/45; ). Most publications reported a combination of active and passive data capturing, and active-alone and passive-alone data capturing were equally popular (active and passive 49%, 22/45 vs active-alone 18%, 8/45 vs passive-alone 18%, 8/45; ).
Reporting of Data Factors
With multiple choices possible, the mental health or lifestyle data type (84%, 38/45;
) was the most collected, followed by clinical (56%, 25/45) and location (40%, 18/45) data types. In comparison, genetics data types (biospecimen 4%, 2/45 and nonbiospecimen 7%, 3/45 [ ]; in total 11%, 5/45 [ B]) and the insurance claims or administrative data type (2%, 1/45; ) were seldom collected. Most publications reported more than one data type (80%, 36/45; refer to table in B). Overall, in 40% (18/45) of publications, the collected data included both clinical data and mental health or lifestyle data ( B). No dominant data ownership arrangement was observed (data subject 7%, 3/45 vs data contributor 9%, 4/45 vs company 7%, 3/45; ).
Reporting of Study Factors
Among studies that reported a research scenario, the most common setting was a strictly nonprofit milieu (47%, 20/43;
D). No study collected data solely for secondary analysis purposes ( ). A limited number of publications reported using crowdsourcing platforms to collect data (7%, 3/45; ). Most studies were observational (91%, 41/45; ). A prevalence of cross-sectional designs over longitudinal designs was observed (56%, 25/45 vs 40%, 18/45; ).
Reporting of Ethics Factors
No publication reported blanket or meta-consent (
). Among those publications reporting monetary benefits for participation, more than one-third reported giving no monetary benefits to participants (35%, 6/17; ).
Reporting Deficiencies in Original Research Publications
The number of publications that missed reporting at least 1 factor under the 5 reporting dimensions broke down as follows: 36% (16/45) for the “participant” dimension, 42% (19/45) for the “device” dimension, 96% (43/45) for the “data” dimension, 78% (35/45) for the “study” dimension, and 96% (43/45) for the “ethics” dimension (
).
Except for the data type factor (“DataType”) and the observational or experimental study design factor (“DesignObsExp”), missingness existed in most factors under the 5 reporting dimensions. In total, 6 factors had missingness in more than half of the original research publications: data owner (91%, 41/45), participant access to data after data contribution (84%, 38/45), right to opt out (80%, 36/45), consent type (78%, 35/45), monetary benefits (62%, 28/45), and attrition (69%, 31/45). These 6 factors were often missing together (
).
The factor for marking studies that used crowdsourcing platforms to collect data (“CrowdsourcingPlatform”) was excluded from the reporting deficiency analysis because using these platforms is optional for research.
Trends in reporting deficiencies were observed in original research publications over time. Reporting deficiencies were measured by the absence, or missingness, of specific reporting factors, with increased missingness reflecting declining reporting performance and decreased missingness reflecting improving reporting performance. Reporting performance declined for some factors and improved for others when compared across the years 2010-2016 and 2017-2021. Furthermore, 2 factors in the “data” dimension exhibited substantial declines in reporting performance: data owner (58% change in missingness; from 2/5, 40% in 2010-2016 to 39/40, 98% in 2017-2021) and participant access to data after contribution (50% change in missingness; from 2/5, 40% in 2010-2016 to 36/40, 90% in 2017-2021). In contrast, 4 factors in the “device” and “ethics” dimensions exhibited substantial improvements in reporting performance: data capture mode (–28% change in missingness; from 2/5, 40% in 2010-2016 to 5/40, 13% in 2017-2021), device type (–20% change in missingness; from 1/5, 20% in 2010-2016 to 0/40, 0% in 2017-2021), consent type (–25% change in missingness; from 5/5, 100% in 2010-2016 to 30/40, 75% in 2017-2021), and consent subject (–25% change in missingness; from 2/5, 40% in 2010-2016 to 6/40, 15% in 2017-2021). The magnitude of change in the percentage of publications with missingness was considered substantial if the absolute value of the difference was equal to or greater than 20% (
and ).
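As a worked illustration of the substantial-change criterion (added here for clarity, using the published counts for the data owner factor):

\[
\Delta_{\text{data owner}} = \frac{39}{40}\times 100\% - \frac{2}{5}\times 100\% = 97.5\% - 40\% = 57.5\% \approx 58\%, \qquad \lvert \Delta \rvert \geq 20\% \Rightarrow \text{substantial decline}
\]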
Discussion
Principal Results
We have characterized the reporting practices of published research studies that collected PGHD using mobile devices. Our findings show that the United States has gained momentum in publishing in this area over the past decade (
). In addition, among the original research articles we identified that reported device ownership, a high proportion used participant-owned devices, ie, a BYOD strategy (75%, 21/28; A), and few studies collected data using crowdsourcing platforms (7%, 3/45; “Used crowdsourcing platform to collect data” in ). We also found reporting deficiencies in the “data” and “ethics” dimensions among original research articles, where most original research articles were deficient in at least 1 factor concerning “data” (96%, 43/45) or “ethics” (96%, 43/45; ).
Trends by Publication Year, Location, and Publication Type
Our analysis of publications by year and location echoed trends seen in the field, with publication spikes in 2016 and 2019 and stable development afterward (
). Like others [ , ], we found that the United States is a leading contributor to these publications. Furthermore, the United States dominates contributions in several publication types, including original research ( ).
Reporting in Original Research Publications
In the “participant” reporting dimension, we found that the mobile device data used for research were more often generated by participants for themselves through routine use rather than generated explicitly for research (56%, 25/45 vs 11%, 5/45; “Purpose of data generation” in
). However, the quality of data acquired through routine use may not meet data quality requirements for research. For instance, individuals engaging in routine device use may not be trained in the best way to wear and use the device to ensure high-quality data collection, thereby leading to potential issues with low or missing wear time and user errors that impact the completeness and correctness of the collected data [ ]. Systematic data quality assessment tools have been developed specifically for the secondary use of PGHD collected from mobile devices. These tools address the quality issues of routine use data by recommending that research reports clarify the definition of nonwear time, the threshold for valid records, the definition of data completeness, and the identification of outliers [ ].
In the “device” reporting dimension, among publications that reported device ownership, we found that three-quarters of original research publications (21/28;
A) reported using mobile devices owned by participants (data contributors or data subjects), indicating the popularity of the BYOD strategy. As mentioned in the introduction, BYOD may introduce demographic bias through disparate access to devices and through self-selection; thus, a comprehensive guideline on using BYOD to collect PGHD, which is not yet available, is needed. As an alternative to BYOD, some researchers provide devices to participants (9/28; A). In our review, participants in 2 of the studies that provided devices ( A) retained the device after the study (shown in A as a mix of device ownership). This strategy may serve as an incentive and help improve device use at the same time. In our own research [ ], however, we found that providing a device when recruiting study participants for cardiovascular wearable studies did not improve sample representativeness in age, race, and education. Thus, more work is needed to understand effective incentives to participate in research that collects PGHD from mobile devices such that recruitment would lead to a more representative study sample.
We found that wearables were predominantly used as secondary devices connected to smartphones rather than as standalone devices (44%, 20/45 vs 16%, 7/45; “Device type” in
), indicating that barriers may exist to the direct collection of data from wearables for research purposes. Evidence to support this supposition is provided by a study published in 2023 [ ] that reported that among those who expressed a willingness to share data from a wearable, only 25% deposited their data in the study’s research database. The primary reason for the refusal to deposit data was the inconvenience of the data transfer process. Previous research findings [ ] have also raised concerns that wearables may be less sustainable than smartphones for remote monitoring over long periods [ ]. However, the reliance on a smartphone for data transfer is not ideal if other data are not collected by the smartphone, such as information about heart rate or sleep. A review published in 2022 [ ] found that a limited number of mobile health apps use Bluetooth, and an even smaller number use standard Bluetooth Low Energy. This indicates a potentially low capability of smartphone apps to interact with external devices, including different types of wearables.
Researchers adopt mobile devices for research due to their ability to capture data passively, thus reducing recall bias introduced by active approaches (eg, surveys). Our findings, however, suggest that active and passive data capturing were equally popular (18%, 8/45 for each; “Data capture mode” in
) and frequently used in tandem (49%, 22/45; “Data capture mode” in ). Researchers may have hesitated to depend solely on passive data collection because this approach can result in low participant engagement due to privacy concerns [ ] and potential data quality issues [ ]. An approach that uses both passive and active data collection can offer more comprehensive data and enable cross-validation of responses through data linkages.
In the “data” reporting dimension, a majority of the publications reported collecting multiple data types (80%, 36/45; refer to table in
B). Furthermore, in 40% (18/45) of publications, the collected data included both clinical data and mental health or lifestyle data ( B). A subset of these publications indicated that clinical data were collected from participants’ electronic health records, reflecting progress made to integrate mobile device data into electronic health records [ ]. Studies seeking to connect multiple data types from study participants should exercise caution due to the increased risk of participant reidentification [ ].
We found that certain data types were collected less frequently than others. For example, insurance claims and administrative data were rarely collected (2%, 1/45; “Data type” in
). This gap may be because of difficulties linking these data to other data [ ]. Also, despite advancements in informatics infrastructure (eg, the HL7 FHIR [Fast Healthcare Interoperability Resources] API) enabling patients to download these data (eg, Medicare claims and encounter data through Blue Button [ ]), only a limited number of beneficiaries have done so. For example, while Blue Button, available since 2010, has been used by over one million Medicare beneficiaries [ ], this is a small fraction of the 65 million people covered by Medicare as of 2022 [ ]. Genetics data collection was also rare (11%, 5/45; B), which is consistent with the prevailing observation that a limited number of genomics mobile apps serve to collect data, compared with other tasks (eg, education and communication) [ , ].
In the “study” reporting dimension, only 7% of original research publications used crowdsourcing platforms to collect data (3/45; “Used crowdsourcing platform to collect data” in
), which may indicate a scarcity of streamlined mechanisms for using these platforms. Given the growing popularity of BYOD research, there is an opportunity for improved mechanisms to leverage such platforms to transfer data from device owners to researchers. We also found that the majority of the studies were observational (91%, 41/45; “Research design” in ), aligning with previous research [ ]. Although mobile devices enable facile longitudinal data collection, we discovered a slightly greater prevalence of cross-sectional studies over longitudinal studies (56%, 25/45 vs 40%, 18/45; “Design duration” in ). This could be due to researchers’ apprehensions about device fatigue [ ] and stakeholders’ concerns regarding privacy [ ].
In the “ethics” reporting dimension, we explored factors such as consent models and monetary benefits because they are relevant to participants’ likelihood of sharing data. For example, consent models, as observed in a survey conducted by Köngeter et al [
], have different acceptance rates among patients with cancer. In our scoping review, we found that no publication reported blanket or meta consent (“Consent type” in ). Since no studies collected data purely for secondary analysis (“Purpose of data collection” in ), the preference for nonblanket consent was justified. Despite debates about meta consent [ ] and dynamic consent [ ] as solutions for modern research, our finding that no studies used meta (dynamic) consent aligns with critics who posit that these innovative approaches require highly participatory technological platforms and may introduce bias due to participants’ technical competence [ ]. Over one-third of the original research publications that reported on monetary benefits provided no monetary benefits to participants (35%, 6/17; “Monetary benefits” in ). Research generally indicates that even small monetary incentives increase consent and response rates [ ], irrespective of risk level [ ]. However, the impact of monetary benefits on wearable study participation remains undetermined and requires examination on a case-by-case basis (eg, for studies conducted on Amazon Mechanical Turk [ ]) given the diversity of mobile device studies.
Prevalence of Reporting Deficiencies in Original Research Publications
Our results show a high level of reporting deficiencies in the “data” and “ethics” dimensions (
), with 5 reporting factors missing in over half of the original research publications: data ownership (91%, 41/45), participant access to data after data contribution (84%, 38/45), right to opt out (80%, 36/45), consent type (78%, 35/45), and monetary benefits (62%, 28/45). Among these 5 factors, we also noticed a trend over time of substantially declining performance in the reporting of data ownership (40% missingness, 2010-2016 vs 98% missingness, 2017-2021; ) and participant access to data after contribution (40% missingness, 2010-2016 vs 90% missingness, 2017-2021). However, over time, there was a substantial improvement in the reporting of consent type (100% missingness, 2010-2016 vs 75% missingness, 2017-2021). In the “study” dimension, nearly 70% (31/45; ) of original research publications failed to report study participant attrition.
Our findings highlight several areas to consider when reporting research with PGHD collected using mobile devices. Recommendations for reporting based on findings from this review are summarized in
. Further work is needed, however, to gain consensus on a more comprehensive set of reporting items. Such a consensus on best practices would greatly facilitate responsible and reproducible research and improve research quality and impact [ ]. For example, a 2012 systematic review indicated a positive association between journal endorsement of a reporting guideline and the quality of reporting in randomized clinical trials published in those journals [ ]. Another study of publications from one journal found that papers that received editorial interventions designed to ensure adherence to reporting guidelines were cited more often, with citation counts 43% higher than those of papers that did not receive the editorial interventions [ ].
Reporting dimension | Reporting phenomena identified in this scoping review | Justification to include reporting dimension as a best practice
“Participant” (1 item) | |
“Device” (2 items) | |
“Data” (2 items) | |
“Study” (1 item) | |
“Ethics” (3 items) | |
aPGHD: person-generated health data.
Limitations
Our review has limitations related to how we gathered evidence from the literature, our choice of reporting factors, and one of our data analysis methods.
Our PubMed search strategy (
) has concepts with explicit data donation and data sharing terms to retrieve publications that describe participant data contributions. Because of this design, the search strategy only retrieved publications whose titles and abstracts contained data donation or data sharing terms. It is therefore possible that our search missed relevant publications that described participant data contributions only in the full text. The articles we analyzed were collected only from PubMed. Articles were not retrieved from other databases (eg, Embase or Web of Science), from the reference lists of included articles, or from the gray literature. Thus, it is possible that articles only available through these other sources were overlooked. Our preliminary searches, however, suggested that PubMed provided literature coverage sufficiently comprehensive for our research objectives.
While we selected reporting factors relevant to our goal of reviewing research that collects PGHD using mobile devices from the perspective of an individual’s data contribution, we may have excluded relevant factors. For example, we did not include the demographic factors of study populations, which may be more broadly relevant to wearable PGHD research. Detecting the low representation of some groups when using wearables for PGHD collection can indicate where biases exist that could lead to downstream inequities. This need is apparent based on findings that aging populations were underrepresented in studies with wearables [
] and in studies of cardiovascular disease that involve wearables [ ]. Such a need requires domain-specific examinations, and we regard it as beyond the scope of this review, which aims to explore the landscape as a first step. As an example of a second-step effort, we investigated studies of cardiovascular disease that involve wearables and noticed that they did not reflect the demographics of patient populations in terms of age, race, education, cigarette smoking status, and hypertension status [ ].
We lowered the granularity of some factors to avoid issues of imbalance in categorical analysis. For example, when analyzing research scenarios, we consolidated academic institutions, for-profit commercial companies, corporations or foundations, independent research organizations, patient-led groups, citizen scientists, and so on, into 3 categories. Thus, our findings do not necessarily represent the full range of possible ways of describing this research.
Despite these limitations, we are confident that having used strict procedures to reduce bias (eg, 2-level screening, independent charting, and a predefined charting template), this review offers a solid foundation for future research. We also feel that the full PubMed query we provide (
) is a valuable contribution to other researchers since it was developed and tested for wearables and smartphones and since publications in the emerging field of mobile devices and PGHD are not well indexed.
Conclusions
We identified trends and patterns in the reporting of research that collects PGHD using mobile devices. Given the growing interest in using a BYOD model among researchers, there is an opportunity for a broader use of crowdsourcing platforms for data transfer from participant-owned devices. There is also the opportunity to develop best practices that address observed reporting deficiencies in this research and make it more reproducible and responsible. Based on our findings, we recommend 9 items for enhanced reporting.
Acknowledgments
SS, ZL, and COT were supported by National Institutes of Health grant NHGRI R35 HG010714.
Data Availability
The data underlying this article are readily available upon reasonable request to the corresponding author.
Authors' Contributions
All authors had full access to the data in the study, take responsibility for data integrity and analysis accuracy, and have approved the final manuscript. Study design was contributed by SS and COT. Review strategy was contributed by SS, RW, and COT. Data collection was managed by SS, MA, RHY, and ZL. Data analysis was done by SS. Drafting was contributed by SS. Intellectual inputs were contributed by RW, DJHM, and COT. All authors contributed to manuscript revision. Supervision was handled by COT.
Conflicts of Interest
None declared.
PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist.
DOC File, 124 KB
Bibliographic information of the articles included in the scoping review.
DOC File, 327 KB
Publication types across publication years and publication locations.
DOC File, 37 KB
Reporting deficiencies in original research publications over time, as measured by the missingness of reporting factors.
DOC File, 47 KB
References
- Martin M. Computer and Internet Use in the United States: 2018. Suitland, MD. United States Census Bureau; 2021.
- Mobile technology and home broadband 2021. Pew Research Center. URL: https://www.pewresearch.org/internet/2021/06/03/mobile-technology-and-home-broadband-2021/ [accessed 2023-01-31]
- Mobile fact sheet. Pew Research Center. 2021. URL: https://www.pewresearch.org/internet/fact-sheet/mobile/ [accessed 2021-04-07]
- Chandrasekaran R, Katthula V, Moustakas E. Patterns of use and key predictors for the use of wearable health care devices by US adults: insights from a national survey. J Med Internet Res. 2020;22(10):e22443. [FREE Full text] [CrossRef] [Medline]
- About one-in-five Americans use a smart watch or fitness tracker. Pew Research Center. URL: https://www.pewresearch.org/fact-tank/2020/01/09/about-one-in-five-americans-use-a-smart-watch-or-fitness-tracker/ [accessed 2023-01-31]
- Rothstein MA, Wilbanks JT, Beskow LM, Brelsford KM, Brothers KB, Doerr M, et al. Unregulated health research using mobile devices: ethical considerations and policy recommendations. J Law Med Ethics. 2020;48(1_suppl):196-226. [FREE Full text] [CrossRef] [Medline]
- Sim I. Mobile devices and health. N Engl J Med. 2019;381(10):956-968. [CrossRef]
- Part 46, Subpart A version of the Final Rule "Federal Policy for the Protection of Human Subjects". 2017. URL: https://tinyurl.com/ypsdr69m [accessed 2024-10-18]
- Taylor CO, Flaks-Manov N, Ramesh S, Choe EK. Willingness to share wearable device data for research among mechanical turk workers: web-based survey study. J Med Internet Res. 2021;23(10):e19789. [FREE Full text] [CrossRef] [Medline]
- IDO T. Apple's ResearchKit frees medical research. Nat Biotechnol. 2015;33(4):322. [CrossRef]
- Jardine J, Fisher J, Carrick B. Apple's ResearchKit: Smart Data Collection for the Smartphone era? London, England. SAGE Publications; 2015:294-296.
- Misra S. ResearchStack being developed as android's ResearchKit cousin, beta in January 2016. iMedicalApps. 2015. URL: https://www.imedicalapps.com/2015/11/researchkit-cob-open-mhealth/ [accessed 2023-05-25]
- Wicklund E. ResearchStack goes live, opening mHealth studies to the android ecosystem. Xtelligent Healthcare Media, mHealth Intelligence. 2016. URL: https://mhealthintelligence.com/news/researchstack-goes-live-opening-mhealth-studies-to-the-android-ecosystem [accessed 2023-05-25]
- Bot BM, Suver C, Neto EC, Kellen M, Klein A, Bare C, et al. The mPower study, parkinson disease mobile data collected using ResearchKit. Sci Data. 2016;3(1):160011. [FREE Full text] [CrossRef] [Medline]
- Cao J, Lim Y, Sengoku S, Guo X, Kodama K. Exploring the shift in international trends in mobile health research from 2000 to 2020: bibliometric analysis. JMIR Mhealth Uhealth. 2021;9(9):e31097. [FREE Full text] [CrossRef] [Medline]
- Cho PJ, Yi J, Ho E, Shandhi MMH, Dinh Y, Patil A, et al. Demographic imbalances resulting from the bring-your-own-device study design. JMIR Mhealth Uhealth. 2022;10(4):e29510. [FREE Full text] [CrossRef] [Medline]
- Mortensen K, Hughes TL. Comparing amazon's mechanical turk platform to conventional data collection methods in the health and medical research literature. J Gen Intern Med. 2018;33(4):533-538. [FREE Full text] [CrossRef] [Medline]
- Dhingra LS, Aminorroaya A, Oikonomou EK, Nargesi AA, Wilson FP, Krumholz HM, et al. Use of wearable devices in individuals with or at risk for cardiovascular disease in the US, 2019 to 2020. JAMA Netw Open. 2023;6(6):e2316634. [FREE Full text] [CrossRef] [Medline]
- Vrijheid M, Richardson L, Armstrong BK, Auvinen A, Berg G, Carroll M, et al. Quantifying the impact of selection bias caused by nonparticipation in a case-control study of mobile phone use. Ann Epidemiol. 2009;19(1):33-41. [CrossRef] [Medline]
- Swan M. Crowdsourced health research studies: an important emerging complement to clinical trials in the public health research ecosystem. J Med Internet Res. 2012;14(2):e46. [FREE Full text] [CrossRef] [Medline]
- Holko M, Ratsimbazafy F, Marginean K, Natarajan K, Cho S, Schilling J. Fitbit 'Bring Your Own Device' data in the All of Us Research Program. United States. American Medical Informatics Association; 2020.
- Panch T, Mattie H, Atun R. Artificial intelligence and algorithmic bias: implications for health systems. J Glob Health. 2019;9(2):010318. [FREE Full text] [CrossRef] [Medline]
- Paulus JK, Kent DM. Predictably unequal: understanding and addressing concerns that algorithmic clinical prediction may increase health disparities. NPJ Digit Med. 2020;3:99. [FREE Full text] [CrossRef] [Medline]
- El-Sherif DM, Abouzid M. Analysis of mHealth research: mapping the relationship between mobile apps technology and healthcare during COVID-19 outbreak. Global Health. 2022;18(1):67. [FREE Full text] [CrossRef] [Medline]
- Fischer F, Kleen S. Possibilities, problems, and perspectives of data collection by mobile apps in longitudinal epidemiological studies: scoping review. J Med Internet Res. 2021;23(1):e17691. [FREE Full text] [CrossRef] [Medline]
- Huhn S, Axt M, Gunga H, Maggioni MA, Munga S, Obor D, et al. The impact of wearable technologies in health research: scoping review. JMIR Mhealth Uhealth. 2022;10(1):e34384. [FREE Full text] [CrossRef] [Medline]
- Grundy Q. A review of the quality and impact of mobile health apps. Annu Rev Public Health. 2022;43(1):117-134. [FREE Full text] [CrossRef] [Medline]
- Vijayan V, Connolly JP, Condell J, McKelvey N, Gardiner P. Review of wearable devices and data collection considerations for connected health. Sensors (Basel). 2021;21(16):5589. [FREE Full text] [CrossRef] [Medline]
- Future of Privacy Forum. 2016. URL: https://fpf.org/wp-content/uploads/2016/08/FPF-Best-Practices-for-Wearables-and-Wellness-Apps-and-Devices-Final.pdf [accessed 2022-04-08]
- Cortez AHP, Mitchell E, Riehl V, Smith P. Conceptualizing a Data Infrastructure for the Capture, Use, and Sharing of Patient-Generated Health Data in Care Delivery and Research through 2024. Washington, DC. Office of the National Coordinator for Health Information Technology; 2018.
- Wang SV, Pinheiro S, Hua W, Arlett P, Uyama Y, Berlin JA, et al. STaRT-RWE: structured template for planning and reporting on the implementation of real world evidence studies. BMJ. 2021;372:m4856. [FREE Full text] [CrossRef] [Medline]
- Agarwal S, LeFevre AE, Lee J, L'Engle K, Mehl G, Sinha C, et al. WHO mHealth Technical Evidence Review Group. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist. BMJ. 2016;352:i1174. [CrossRef] [Medline]
- Chatzipavlou IA, Christoforidou SA, Vlachopoulou M. A recommended guideline for the development of mHealth apps. Mhealth. 2016;2:21. [FREE Full text] [CrossRef] [Medline]
- Guide to integrate patient-generated digital health data into electronic health records in ambulatory care settings. AHRQ. URL: https://digital.ahrq.gov/health-it-tools-and-resources/patient-generated-health-data-i-patient-reported-outcomes/practical-guide [accessed 2022-04-08]
- Olaye IM, Belovsky MP, Bataille L, Cheng R, Ciger A, Fortuna KL, et al. Recommendations for defining and reporting adherence measured by biometric monitoring technologies: systematic review. J Med Internet Res. 2022;24(4):e33537. [FREE Full text] [CrossRef] [Medline]
- Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467-473. [FREE Full text] [CrossRef] [Medline]
- Grossetta NHK, Wang L. The Yale MeSH Analyzer. New Haven, CT. Cushing/Whitney Medical Library URL: http://mesh.med.yale.edu/ [accessed 2021-08-17]
- Covidence Systematic Review Software. Melbourne, Australia. Veritas Health Innovation; 2024.
- Torous J, Staples P, Slaters L, Adams J, Sandoval L, Onnela J, et al. Characterizing smartphone engagement for schizophrenia: results of a naturalist mobile health study. Clin Schizophr Relat Psychoses. Aug 04, 2017. [FREE Full text] [CrossRef] [Medline]
- Bietz M, Patrick K, Bloss C. Data donation as a model for citizen science health research. Citizen Science: Theory and Practice. 2019;4(1). [CrossRef]
- Krutzinna J, Floridi L. The Ethics of Medical Data Donation. Cham, Switzerland. Springer; 2019.
- Schmietow B, Marckmann G. Mobile health ethics and the expanding role of autonomy. Med Health Care Philos. 2019;22(4):623-630. [CrossRef] [Medline]
- Song S, Ashton M, Yoo R, Wright R, Mathews D, Taylor CO. "Data Donation" or "Data Sharing"? A Scoping Review Characterizing Language Use in mHealth Research Involving Person-generated Health Data. Washington, DC. AMIA Annual Symposium 2022; 2022:5-9.
- Siddiqi AEA, Sikorskii A, Given CW, Given B. Early participant attrition from clinical trials: role of trial design and logistics. Clin Trials. 2008;5(4):328-335. [FREE Full text] [CrossRef] [Medline]
- Wendler D. Broad versus blanket consent for research with human biological samples. Hastings Cent Rep. 2013;43(5):3-4. [FREE Full text] [CrossRef] [Medline]
- Wiertz S, Boldt J. Evaluating models of consent in changing health research environments. Med Health Care Philos. 2022;25(2):269-280. [FREE Full text] [CrossRef] [Medline]
- Informed consent: considerations for biobanks. CloudLIMS. URL: https://cloudlims.com/informed-consent-dynamic-broad-tiered-and-meta-consent-for-biobanking/ [accessed 2022-02-23]
- Explicit Consent. Health Research Board. URL: https://www.hrb.ie/funding/gdpr-guidance-for-researchers/gdpr-and-health-research/consent/explicit-consent/ [accessed 2022-02-23]
- Budin-Ljøsne I, Teare HJA, Kaye J, Beck S, Bentzen HB, Caenazzo L, et al. Dynamic consent: a potential solution to some of the challenges of modern biomedical research. BMC Med Ethics. 2017;18(4):1-10. [FREE Full text] [CrossRef] [Medline]
- Ajana B. Digital health and the biopolitics of the quantified self. Digit Health. 2017;3:2055207616689509. [FREE Full text] [CrossRef] [Medline]
- Arab L, Winter A. Automated camera-phone experience with the frequency of imaging necessary to capture diet. J Am Diet Assoc. 2010;110(8):1238-1241. [CrossRef] [Medline]
- Atreja A, Khan S, Rogers JD, Otobo E, Patel NP, Ullman T, et al. HealthPROMISE Consortium Group. Impact of the mobile health PROMISE platform on the quality of care and quality of life in patients with inflammatory bowel disease: study protocol of a pragmatic randomized controlled trial. JMIR Res Protoc. 2015;4(1):e23. [FREE Full text] [CrossRef] [Medline]
- Auffray C, Balling R, Barroso I, Bencze L, Benson M, Bergeron J, et al. Erratum to: making sense of big data in health research: towards an EU action plan. Genome Med. 2016;8(1):118. [FREE Full text] [CrossRef] [Medline]
- Augusto DG, Murdolo LD, Chatzileontiadou DSM, Sabatino JJ, Yusufali T, Peyser ND, et al. A common allele of HLA is associated with asymptomatic SARS-CoV-2 infection. Nature. Aug 2023;620(7972):128-136. [FREE Full text] [CrossRef] [Medline]
- Bietz MJ, Bloss CS, Calvert S, Godino JG, Gregory J, Claffey MP, et al. Opportunities and challenges in the use of personal health data for health research. J Am Med Inform Assoc. 2016;23(e1):e42-e48. [FREE Full text] [CrossRef] [Medline]
- Bloem BR, Marks WJ, Silva de Lima AL, Kuijf ML, van Laar T, Jacobs BPF, et al. The personalized Parkinson project: examining disease progression through broad biomarkers in early Parkinson's disease. BMC Neurol. 2019;19(1):160. [FREE Full text] [CrossRef] [Medline]
- Bocher E, Petit G, Picaut J, Fortin N, Guillaume G. Collaborative noise data collected from smartphones. Data Brief. 2017;14:498-503. [FREE Full text] [CrossRef] [Medline]
- Boker SM, Brick TR, Pritikin JN, Wang Y, von Oertzen T, Brown D, et al. Maintained individual data distributed likelihood estimation (MIDDLE). Multivariate Behav Res. 2015;50(6):706-720. [FREE Full text] [CrossRef] [Medline]
- Bouras A, Simoes EJ, Boren S, Hicks L, Zachary I, Buck C, et al. Non-hispanic white mothers' willingness to share personal health data with researchers: survey results from an opt-in panel. J Particip Med. 2020;12(2):e14062. [FREE Full text] [CrossRef] [Medline]
- Bruno E, Böttcher S, Viana PF, Amengual-Gual M, Joseph B, Epitashvili N, et al. Wearable devices for seizure detection: practical experiences and recommendations from the wearables for epilepsy and research (WEAR) international study group. Epilepsia. 2021;62(10):2307-2321. [CrossRef] [Medline]
- Buoite Stella A, AJČEVIĆ M, Furlanis G, Cillotto T, Menichelli A, Accardo A, et al. Smart technology for physical activity and health assessment during COVID-19 lockdown. J Sports Med Phys Fitness. 2021;61(3):452-460. [CrossRef]
- Burchert S, Kerber A, Zimmermann J, Knaevelsrud C. Screening accuracy of a 14-day smartphone ambulatory assessment of depression symptoms and mood dynamics in a general population sample: comparison with the PHQ-9 depression screening. PLoS One. 2021;16(1):e0244955. [FREE Full text] [CrossRef] [Medline]
- Burkhardt HA, Brandt PS, Lee JR, Karras SW, Bugni PF, Cvitkovic I, et al. StayHome: A FHIR-native mobile COVID-19 symptom tracker and public health reporting tool. Online J Public Health Inform. 2021;13(1):e2. [FREE Full text] [CrossRef] [Medline]
- Cafazzo JA, Casselman M, Hamming N, Katzman DK, Palmert MR. Design of an mHealth app for the self-management of adolescent type 1 diabetes: a pilot study. J Med Internet Res. 2012;14(3):e70. [FREE Full text] [CrossRef] [Medline]
- Çelik Ertuğrul D, Çelik Ulusoy D. A knowledge-based self-pre-diagnosis system to predict Covid-19 in smartphone users using personal data and observed symptoms. Expert Syst. 2022;39(3):e12716. [FREE Full text] [CrossRef] [Medline]
- Chan YY, Bot BM, Zweig M, Tignor N, Ma W, Suver C, et al. The asthma mobile health study, smartphone data collected using ResearchKit. Sci Data. 2018;5:180096. [FREE Full text] [CrossRef] [Medline]
- Chang AR, Bailey-Davis L, Hetherington V, Ziegler A, Yule C, Kwiecen S, et al. Remote dietary counseling using smartphone applications in patients with stages 1-3a chronic kidney disease: a mixed methods feasibility study. J Ren Nutr. 2020;30(1):53-60. [FREE Full text] [CrossRef] [Medline]
- Chen J, Bauman A, Allman-Farinelli M. A study to determine the most popular lifestyle smartphone applications and willingness of the public to share their personal data for health research. Telemed J E Health. 2016;22(8):655-665. [CrossRef] [Medline]
- Cheung C, Bietz MJ, Patrick K, Bloss CS. Privacy attitudes among early adopters of emerging health technologies. PLoS One. 2016;11(11):e0166389. [FREE Full text] [CrossRef] [Medline]
- Chung AE, Sandler RS, Long MD, Ahrens S, Burris JL, Martin CF, et al. Harnessing person-generated health data to accelerate patient-centered outcomes research: the Crohn's and Colitis Foundation of America PCORnet Patient Powered Research Network (CCFA Partners). J Am Med Inform Assoc. 2016;23(3):485-490. [FREE Full text] [CrossRef] [Medline]
- Clarke H, Clark S, Birkin M, Iles-Smith H, Glaser A, Morris MA. Understanding barriers to novel data linkages: topic modeling of the results of the LifeInfo survey. J Med Internet Res. 2021;23(5):e24236. [FREE Full text] [CrossRef] [Medline]
- Cooke Bailey JN, Crawford DC, Goldenberg A, Slaven A, Pencak J, Schachere M, et al. Willingness to participate in a national precision medicine cohort: attitudes of chronic kidney disease patients at a Cleveland public hospital. J Pers Med. 2018;8(3):21. [FREE Full text] [CrossRef] [Medline]
- Cornet VP, Holden RJ. Systematic review of smartphone-based passive sensing for health and wellbeing. J Biomed Inform. 2018;77:120-132. [FREE Full text] [CrossRef] [Medline]
- Deering S, Grade MM, Uppal JK, Foschini L, Juusola JL, Amdur AM, et al. Accelerating research with technology: rapid recruitment for a large-scale web-based sleep study. JMIR Res Protoc. 2019;8(1):e10974. [FREE Full text] [CrossRef] [Medline]
- Deering S, Pratap A, Suver C, Borelli AJ, Amdur A, Headapohl W, et al. Real-world longitudinal data collected from the sleep health mobile app study. Sci Data. 2020;7(1):418. [FREE Full text] [CrossRef] [Medline]
- Eicher-Miller HA, Prapkree L, Palacios C. Expanding the capabilities of nutrition research and health promotion through mobile-based applications. Adv Nutr. 2021;12(3):1032-1041. [FREE Full text] [CrossRef] [Medline]
- Evans BJ. The perils of parity: should citizen science and traditional research follow the same ethical and privacy principles? J Law Med Ethics. 2020;48(1_suppl):74-81. [FREE Full text] [CrossRef] [Medline]
- Fadda M, Jobin A, Blasimme A, Greshake Tzovaras B, Price Ball M, Vayena E. User perspectives of a web-based data-sharing platform (open humans) on ethical oversight in participant-led research: protocol for a quantitative study. JMIR Res Protoc. 2018;7(11):e10939. [FREE Full text] [CrossRef] [Medline]
- Fadrique LX, Rahman D, Vaillancourt H, Boissonneault P, Donovska T, Morita PP. Overview of policies, guidelines, and standards for active assisted living data exchange: thematic analysis. JMIR Mhealth Uhealth. 2020;8(6):e15923. [FREE Full text] [CrossRef] [Medline]
- Faurholt-Jepsen M, Þórarinsdóttir H, Vinberg M, Ullum H, Frost M, Bardram J, et al. Automatically generated smartphone data and subjective stress in healthy individuals - a pilot study. Nord J Psychiatry. 2020;74(4):293-300. [CrossRef] [Medline]
- Fish LA, Jones EJH. A survey on the attitudes of parents with young children on in-home monitoring technologies and study designs for infant research. PLoS One. 2021;16(2):e0245793. [FREE Full text] [CrossRef] [Medline]
- Franklin EF, Nichols HM, House L, Buzaglo J, Thiboldeaux K. Cancer patient perspectives on sharing of medical records and mobile device data for research purposes. J Patient Exp. 2020;7(6):1115-1121. [FREE Full text] [CrossRef] [Medline]
- Freifeld CC, Chunara R, Mekaru SR, Chan EH, Kass-Hout T, Ayala Iacucci A, et al. Participatory epidemiology: use of mobile phones for community-based health reporting. PLoS Med. 2010;7(12):e1000376. [FREE Full text] [CrossRef] [Medline]
- Fylan F, Caveney L, Cartwright A, Fylan B. Making it work for me: beliefs about making a personal health record relevant and useable. BMC Health Serv Res. 2018;18(1):445. [FREE Full text] [CrossRef] [Medline]
- Genes N, Violante S, Cetrangol C, Rogers L, Schadt EE, Chan YY. From smartphone to EHR: a case report on integrating patient-generated health data. NPJ Digit Med. 2018;1(1):23. [FREE Full text] [CrossRef] [Medline]
- Hafen E, Kossmann D, Brand A. Health data cooperatives – citizen empowerment. Methods Inf Med. 2018;53(2):82-86. [CrossRef]
- Hamed A, Curran C, Gwaltney C, DasMahapatra P. Mobility assessment using wearable technology in patients with late-onset pompe disease. NPJ Digit Med. 2019;2(1):70. [FREE Full text] [CrossRef] [Medline]
- Hartmann R, Sander C, Lorenz N, Böttger D, Hegerl U. Utilization of patient-generated data collected through mobile devices: insights from a survey on attitudes toward mobile self-monitoring and self-management apps for depression. JMIR Ment Health. 2019;6(4):e11671. [FREE Full text] [CrossRef] [Medline]
- Heidel A, Hagist C. Potential benefits and risks resulting from the introduction of health apps and wearables into the German statutory health care system: scoping review. JMIR Mhealth Uhealth. 2020;8(9):e16444. [FREE Full text] [CrossRef] [Medline]
- Heidel A, Hagist C, Schlereth C. Pricing through health apps generated data-digital dividend as a game changer: discrete choice experiment. PLoS One. 2021;16(7):e0254786. [FREE Full text] [CrossRef] [Medline]
- Henderson ML, Thomas AG, Eno AK, Waldram MM, Bannon J, Massie AB, et al. The impact of the mKidney mHealth system on live donor follow-up compliance: protocol for a randomized controlled trial. JMIR Res Protoc. 2019;8(1):e11000. [FREE Full text] [CrossRef] [Medline]
- Henriksen A, Johannessen E, Hartvigsen G, Grimsgaard S, Hopstock LA. Consumer-based activity trackers as a tool for physical activity monitoring in epidemiological studies during the COVID-19 pandemic: development and usability study. JMIR Public Health Surveill. 2021;7(4):e23806. [FREE Full text] [CrossRef] [Medline]
- Herbec A, Brown J, Shahab L, West R. Lessons learned from unsuccessful use of personal carbon monoxide monitors to remotely assess abstinence in a pragmatic trial of a smartphone stop smoking app - a secondary analysis. Addict Behav Rep. 2019;9:100122. [FREE Full text] [CrossRef] [Medline]
- Hershman SG, Bot BM, Shcherbina A, Doerr M, Moayedi Y, Pavlovic A, et al. Physical activity, sleep and cardiovascular health data for 50,000 individuals from the MyHeart counts study. Sci Data. 2019;6(1):24. [FREE Full text] [CrossRef] [Medline]
- Hesse BW, Greenberg AJ, Rutten LJF. The role of Internet resources in clinical oncology: promises and challenges. Nat Rev Clin Oncol. 2016;13(12):767-776. [CrossRef] [Medline]
- Hong SJ, Cho H. Privacy management and health information sharing via contact tracing during the COVID-19 pandemic: a hypothetical study on AI-based technologies. Health Commun. 2023;38(5):913-924. [CrossRef] [Medline]
- Househ M, Grainger R, Petersen C, Bamidis P, Merolli M. Balancing between privacy and patient needs for health information in the age of participatory health and social media: a scoping review. Yearb Med Inform. 2018;27(1):29-36. [FREE Full text] [CrossRef] [Medline]
- Hughes A, Landers D, Arkenau H, Shah S, Stephens R, Mahal A, et al. Development and evaluation of a new technological way of engaging patients and enhancing understanding of drug tolerability in early clinical development: PROACT. Adv Ther. 2016;33(6):1012-1024. [FREE Full text] [CrossRef] [Medline]
- Katapally TR. A global digital citizen science policy to tackle pandemics like COVID-19. J Med Internet Res. 2020;22(5):e19357. [FREE Full text] [CrossRef] [Medline]
- Kim G, Bae JC, Yi BK, Hur KY, Chang DK, Lee M, et al. An information and communication technology-based centralized clinical trial to determine the efficacy and safety of insulin dose adjustment education based on a smartphone personal health record application: a randomized controlled trial. BMC Med Inform Decis Mak. 2017;17(1):109. [FREE Full text] [CrossRef] [Medline]
- Kim JW, Ryu B, Cho S, Heo E, Kim Y, Lee J, et al. Impact of personal health records and wearables on health outcomes and patient response: three-arm randomized controlled trial. JMIR Mhealth Uhealth. 2019;7(1):e12070. [FREE Full text] [CrossRef] [Medline]
- Kim TK, Choi M. Older adults' willingness to share their personal and health information when adopting healthcare technology and services. Int J Med Inform. 2019;126:86-94. [CrossRef] [Medline]
- Kolovson S, Pratap A, Duffy J, Allred R, Munson S, Areán PA. Understanding participant needs for engagement and attitudes towards passive sensing in remote digital health studies. Int Conf Pervasive Comput Technol Healthc. 2020;2020:347-362. [FREE Full text] [CrossRef] [Medline]
- Kostkova P, Brewer H, de Lusignan S, Fottrell E, Goldacre B, Hart G, et al. Who owns the data? Open data for healthcare. Front Public Health. 2016;4:7. [FREE Full text] [CrossRef] [Medline]
- Labs J, Terry S. Privacy in the coronavirus era. Genet Test Mol Biomarkers. 2020;24(9):535-536. [CrossRef] [Medline]
- Laurie GT. Cross-sectoral big data: the application of an ethics framework for big data in health and research. Asian Bioeth Rev. 2019;11(3):327-339. [FREE Full text] [CrossRef] [Medline]
- Leal Neto O, Dimech GS, Libel M, de Souza WV, Cesse E, Smolinski M, et al. Saúde na Copa: the world's first application of participatory surveillance for a mass gathering at FIFA World Cup 2014, Brazil. JMIR Public Health Surveill. 2017;3(2):e26. [FREE Full text] [CrossRef] [Medline]
- Levin HI, Egger D, Andres L, Johnson M, Bearman SK, de Barbaro K. Sensing everyday activity: parent perceptions and feasibility. Infant Behav Dev. 2021;62:101511. [CrossRef] [Medline]
- McKendry RA, Rees G, Cox IJ, Johnson A, Edelstein M, Eland A, et al. Share mobile and social-media data to curb COVID-19. Nature. 2020;580(7801):29. [CrossRef] [Medline]
- Moore S, Tassé AM, Thorogood A, Winship I, Zawati M, Doerr M. Consent processes for mobile app mediated research: systematic review. JMIR Mhealth Uhealth. 2017;5(8):e126. [FREE Full text] [CrossRef] [Medline]
- Nakamoto I, Jiang M, Zhang J, Zhuang W, Guo Y, Jin M, et al. Evaluation of the design and implementation of a peer-to-peer COVID-19 contact tracing mobile app (COCOA) in Japan. JMIR Mhealth Uhealth. 2020;8(12):e22098. [FREE Full text] [CrossRef] [Medline]
- O'Doherty KC, Christofides E, Yen J, Bentzen HB, Burke W, Hallowell N, et al. If you build it, they will come: unintended future uses of organised health data collections. BMC Med Ethics. 2016;17(1):54. [FREE Full text] [CrossRef] [Medline]
- Poletti P, Visintainer R, Lepri B, Merler S. The interplay between individual social behavior and clinical symptoms in small clustered groups. BMC Infect Dis. 2017;17(1):521. [FREE Full text] [CrossRef] [Medline]
- Poudyal A, van Heerden A, Hagaman A, Maharjan SM, Byanjankar P, Subba P, et al. Wearable digital sensors to identify risks of postpartum depression and personalize psychological treatment for adolescent mothers: protocol for a mixed methods exploratory study in rural Nepal. JMIR Res Protoc. 2019;8(8):e14734. [FREE Full text] [CrossRef] [Medline]
- Quer G, Gadaleta M, Radin JM, Andersen KG, Baca-Motes K, Ramos E, et al. The physiologic response to COVID-19 vaccination. medRxiv. 2021. [FREE Full text] [CrossRef] [Medline]
- Radin JM, Peters S, Ariniello L, Wongvibulsin S, Galarnyk M, Waalen J, et al. Pregnancy health in POWERMOM participants living in rural versus urban zip codes. J Clin Transl Sci. 2020;4(5):457-462. [FREE Full text] [CrossRef] [Medline]
- Rake EA, van Gelder MHJ, Grim DC, Heeren B, Engelen L, van de Belt TH. Personalized consent flow in contemporary data sharing for medical research: a viewpoint. Biomed Res Int. 2017;2017:7147212. [FREE Full text] [CrossRef] [Medline]
- Rendina HJ, Mustanski B. Privacy, trust, and data sharing in web-based and mobile research: participant perspectives in a large nationwide sample of men who have sex with men in the United States. J Med Internet Res. 2018;20(7):e233. [FREE Full text] [CrossRef] [Medline]
- Rieger A, Gaines A, Barnett I, Baldassano CF, Connolly Gibbons MB, Crits-Christoph P. Psychiatry outpatients' willingness to share social media posts and smartphone data for research and clinical purposes: survey study. JMIR Form Res. 2019;3(3):e14329. [FREE Full text] [CrossRef] [Medline]
- Rohlman D, Dixon HM, Kincl L, Larkin A, Evoy R, Barton M, et al. Development of an environmental health tool linking chemical exposures, physical location and lung function. BMC Public Health. 2019;19(1):854. [FREE Full text] [CrossRef] [Medline]
- Rothstein MA, Wilbanks JT, Brothers KB. Citizen science on your smartphone: an ELSI research agenda. J Law Med Ethics. 2015;43(4):897-903. [CrossRef] [Medline]
- Ságvári B, Gulyás A, Koltai J. Attitudes towards participation in a passive data collection experiment. Sensors (Basel). 2021;21(18):6085. [CrossRef]
- Salamone F, Masullo M, Sibilio S. Wearable devices for environmental monitoring in the built environment: a systematic review. Sensors (Basel). 2021;21(14):4727. [FREE Full text] [CrossRef] [Medline]
- Saleheen N, Chakraborty S, Ali N, Mahbubur Rahman MD, Hossain S, Bari R, et al. mSieve: differential behavioral privacy in time series of mobile sensor data. Proc ACM Int Conf Ubiquitous Comput. 2016;2016:706-717. [FREE Full text] [CrossRef] [Medline]
- Santos-Lozano A, Baladrón C, Martín-Hernández J, Morales JS, Ruilope L, Lucia A. mHealth and the legacy of John Snow. Lancet. 2018;391(10129):1479-1480. [CrossRef]
- Schmitz H, Howe CL, Armstrong DG, Subbian V. Leveraging mobile health applications for biomedical research and citizen science: a scoping review. J Am Med Inform Assoc. 2018;25(12):1685-1695. [FREE Full text] [CrossRef] [Medline]
- Seltzer E, Goldshear J, Guntuku SC, Grande D, Asch DA, Klinger EV, et al. Patients' willingness to share digital health and non-health data for research: a cross-sectional study. BMC Med Inform Decis Mak. 2019;19(1):157. [FREE Full text] [CrossRef] [Medline]
- Simblett SK, Biondi A, Bruno E, Ballard D, Stoneman A, Lees S, et al. RADAR-CNS consortium. Patients' experience of wearing multimodal sensor devices intended to detect epileptic seizures: a qualitative analysis. Epilepsy Behav. 2020;102:106717. [CrossRef] [Medline]
- Sleigh J. Experiences of donating personal data to mental health research: an explorative anthropological study. Biomed Inform Insights. 2018;10:1178222618785131. [FREE Full text] [CrossRef] [Medline]
- Slotwiner DJ, Tarakji KG, Al-Khatib SM, Passman RS, Saxon LA, Peters NS, et al. Transparent sharing of digital health data: a call to action. Heart Rhythm. 2019;16(9):e95-e106. [FREE Full text] [CrossRef] [Medline]
- Smith RJ, Grande D, Merchant RM. Transforming scientific inquiry. Acad Med. 2016;91(4):469-472. [CrossRef]
- Staccini P, Fernandez-Luque L. Secondary use of recorded or self-expressed personal data: consumer health informatics and education in the era of social media and health apps. Yearb Med Inform. 2017;26(1):172-177. [CrossRef]
- Struminskaya B, Toepoel V, Lugtig P, Haan M, Luiten A, Schouten B. Understanding willingness to share smartphone-sensor data. Public Opin Q. 2020;84(3):725-759. [FREE Full text] [CrossRef] [Medline]
- Tajiri E, Yoshimura E, Hatamoto Y, Tanaka H, Shimoda S. Effect of sleep curtailment on dietary behavior and physical activity: a randomized crossover trial. Physiol Behav. 2018;184:60-67. [CrossRef] [Medline]
- Tannis C, Senerat A, Garg M, Peters D, Rajupet S, Garland E. Improving physical activity among residents of affordable housing: is active design enough? Int J Environ Res Public Health. 2019;16(1):151. [FREE Full text] [CrossRef] [Medline]
- Taylor SA, Jaques N, Nosakhare E, Sano A, Picard R. Personalized multitask learning for predicting tomorrow's mood, stress, and health. IEEE Trans Affect Comput. 2020;11(2):200-213. [FREE Full text] [CrossRef] [Medline]
- van Heerden A, Wassenaar D, Essack Z, Vilakazi K, Kohrt BA. In-home passive sensor data collection and its implications for social media research: perspectives of community women in rural South Africa. J Empir Res Hum Res Ethics. 2020;15(1-2):97-107. [CrossRef] [Medline]
- Van Oeveren BT, De Ruiter CJ, Hoozemans MJM, Beek PJ, Van Dieën JH. Inter-individual differences in stride frequencies during running obtained from wearable data. J Sports Sci. 2019;37(17):1996-2006. [CrossRef] [Medline]
- Vitak J, Zimmer M. More than just privacy: using contextual integrity to evaluate the long-term risks from COVID-19 surveillance technologies. Soc Media Soc. 2020;6(3):2056305120948250. [FREE Full text] [CrossRef] [Medline]
- Vo JDV, Gorbach AM. A platform to record patient events during physiological monitoring with wearable sensors: proof-of-concept study. Interact J Med Res. 2019;8(1):e10336. [FREE Full text] [CrossRef] [Medline]
- von Gablenz P, Kowalk U, Bitzer J, Meis M, Holube I. Individual hearing aid benefit in real life evaluated using ecological momentary assessment. Trends Hear. 2021;25:2331216521990288. [FREE Full text] [CrossRef] [Medline]
- Wanyua S, Ndemwa M, Goto K, Tanaka J, K'opiyo J, Okumu S, et al. Profile: the Mbita health and demographic surveillance system. Int J Epidemiol. 2013;42(6):1678-1685. [CrossRef] [Medline]
- Wasfi R, Poirier Stephens Z, Sones M, Laberee K, Pugh C, Fuller D, et al. Recruiting participants for population health intervention research: effectiveness and costs of recruitment methods for a cohort study. J Med Internet Res. 2021;23(11):e21142. [FREE Full text] [CrossRef] [Medline]
- Webster DE, Suver C, Doerr M, Mounts E, Domenico L, Petrie T, et al. The mole mapper study, mobile phone skin imaging and melanoma risk data collected using ResearchKit. Sci Data. 2017;4(1):170005. [FREE Full text] [CrossRef] [Medline]
- Welsh JB, Derdzinski M, Parker AS, Puhr S, Jimenez A, Walker T. Real-time sharing and following of continuous glucose monitoring data in youth. Diabetes Ther. 2019;10(2):751-755. [FREE Full text] [CrossRef] [Medline]
- Wicks P. Patient, study thyself. BMC Med. 2018;16(1):217. [FREE Full text] [CrossRef] [Medline]
- Witteveen D, de Pedraza P. The roles of general health and COVID-19 proximity in contact tracing app usage: cross-sectional survey study. JMIR Public Health Surveill. 2021;7(8):e27892. [FREE Full text] [CrossRef] [Medline]
- Woldaregay AZ, Henriksen A, Issom DZ, Pfuhl G, Sato K, Richard A, et al. User expectations and willingness to share self-collected health data. Stud Health Technol Inform. 2020;270:894-898. [CrossRef] [Medline]
- Yan K, Tracie B, Marie-Ève M, Mélanie H, Jean-Luc B, Benoit T, et al. Innovation through wearable sensors to collect real-life data among pediatric patients with cardiometabolic risk factors. Int J Pediatr. 2014;2014:328076. [FREE Full text] [CrossRef] [Medline]
- Cho S, Ensari I, Weng C, Kahn MG, Natarajan K. Factors affecting the quality of person-generated wearable device data and associated challenges: rapid systematic review. JMIR Mhealth Uhealth. 2021;9(3):e20738. [FREE Full text] [CrossRef] [Medline]
- Cho S. Data quality assessment for the secondary use of person-generated wearable device data. In: Assessing Self-Tracking Data for Research Purposes. New York. Columbia University; 2021.
- Song S, Taylor CO. Data representativeness in cardiovascular disease studies that use consumer wearables. In: The 1st International Workshop on Ethics and Bias of Artificial Intelligence in Clinical Applications (EBAIC 2023). 2023. Presented at: 2023 IEEE 11th International Conference on Healthcare Informatics (ICHI); 2023 June 26; Houston, Texas. [CrossRef]
- Kim J, Im E, Kim H. From intention to action: the factors affecting health data sharing intention and action. Int J Med Inform. 2023;175:105071. [CrossRef] [Medline]
- Patel MS, Polsky D, Kennedy EH, Small DS, Evans CN, Rareshide CAL, et al. Smartphones vs wearable devices for remotely monitoring physical activity after hospital discharge: a secondary analysis of a randomized clinical trial. JAMA Netw Open. 2020;3(2):e1920677. [FREE Full text] [CrossRef] [Medline]
- Hicks JL, Althoff T, Sosic R, Kuhar P, Bostjancic B, King AC, et al. Best practices for analyzing large-scale health data from wearables and smartphone apps. NPJ Digit Med. 2019;2(1):45. [FREE Full text] [CrossRef] [Medline]
- Philip BJ, Abdelrazek M, Bonti A, Barnett S, Grundy J. Data collection mechanisms in health and wellness apps: review and analysis. JMIR Mhealth Uhealth. 2022;10(3):e30468. [FREE Full text] [CrossRef] [Medline]
- Keusch F, Struminskaya B, Antoun C, Couper M, Kreuter F. Willingness to participate in passive mobile data collection. Public Opin Q. 2019;83(Suppl 1):210-235. [FREE Full text] [CrossRef] [Medline]
- Dinh-Le C, Chuang R, Chokshi S, Mann D. Wearable health technology and electronic health record integration: scoping review and future directions. JMIR Mhealth Uhealth. 2019;7(9):e12861. [FREE Full text] [CrossRef] [Medline]
- Dhruva SS, Ross JS, Akar JG, Caldwell B, Childers K, Chow W, et al. Aggregating multiple real-world data sources using a patient-centered health-data-sharing platform. NPJ Digit Med. 2020;3(1):60. [FREE Full text] [CrossRef] [Medline]
- CARIN consumer directed payer data exchange (CARIN IG for Blue Button®). HL7 International. 2021. URL: https://build.fhir.org/ig/HL7/carin-bb/ [accessed 2024-10-18]
- Blue Button® 2.0: improving medicare beneficiary access to their health information. Centers for Medicare & Medicaid Services. 2021. URL: https://web.archive.org/web/20231014191517/https://www.cms.gov/data-research/cms-information-technology/blue-button [accessed 2023-10-14]
- CMS Fast Facts. Baltimore, MD. Centers for Medicare & Medicaid Services; 2023.
- Gasteiger N, Vercell A, Davies A, Dowding D, Khan N, Davies A. Patient-facing genetic and genomic mobile apps in the UK: a systematic review of content, functionality, and quality. J Community Genet. 2022;13(2):171-182. [FREE Full text] [CrossRef] [Medline]
- Talwar D, Yeh YL, Chen WJ, Chen LS. Characteristics and quality of genetics and genomics mobile apps: a systematic review. Eur J Hum Genet. 2019;27(6):833-840. [FREE Full text] [CrossRef] [Medline]
- Shaw RJ, Steinberg DM, Bonnet J, Modarai F, George A, Cunningham T, et al. Mobile health devices: will patients actually use them? J Am Med Inform Assoc. 2016;23(3):462-466. [FREE Full text] [CrossRef] [Medline]
- L'Hommedieu M, L'Hommedieu J, Begay C, Schenone A, Dimitropoulou L, Margolin G, et al. Lessons learned: recommendations for implementing a longitudinal study using wearable and environmental sensors in a health care organization. JMIR Mhealth Uhealth. 2019;7(12):e13305. [FREE Full text] [CrossRef] [Medline]
- Köngeter A, Schickhardt C, Jungkunz M, Bergbold S, Mehlis K, Winkler EC. Patients' willingness to provide their clinical data for research purposes and acceptance of different consent models: findings from a representative survey of patients with cancer. J Med Internet Res. 2022;24(8):e37665. [FREE Full text] [CrossRef] [Medline]
- Ploug T, Holm S. Meta consent - a flexible solution to the problem of secondary use of health data. Bioethics. 2016;30(9):721-732. [FREE Full text] [CrossRef] [Medline]
- Sheehan M, Thompson R, Fistein J, Davies J, Dunn M, Parker M, et al. Authority and the future of consent in population-level biomedical research. Public Health Ethics. 2019;12(3):225-236. [FREE Full text] [CrossRef] [Medline]
- Abdelazeem B, Abbas KS, Amin MA, El-Shahat NA, Malik B, Kalantary A, et al. The effectiveness of incentives for research participation: a systematic review and meta-analysis of randomized controlled trials. PLoS One. 2022;17(4):e0267534. [FREE Full text] [CrossRef] [Medline]
- Bentley JP, Thacker PG. The influence of risk and monetary payment on the research participation decision making process. J Med Ethics. 2004;30(3):293-298. [FREE Full text] [CrossRef] [Medline]
- Altman DG, Simera I. Responsible reporting of health research studies: transparent, complete, accurate and timely. J Antimicrob Chemother. 2010;65(1):1-3. [FREE Full text] [CrossRef] [Medline]
- Turner L, Shamseer L, Altman DG, Schulz KF, Moher D. Does use of the CONSORT statement impact the completeness of reporting of randomised controlled trials published in medical journals? A Cochrane review. Syst Rev. 2012;1(1):60. [FREE Full text] [CrossRef] [Medline]
- Vilaró M, Cortés J, Selva-O'Callaghan A, Urrutia A, Ribera J, Cardellach F, et al. Adherence to reporting guidelines increases the number of citations: the argument for including a methodologist in the editorial process and peer-review. BMC Med Res Methodol. 2019;19(1):112. [FREE Full text] [CrossRef] [Medline]
- Guu TW, Muurling M, Khan Z, Kalafatis C, Aarsland D, ffytche D, et al. Wearable devices: underrepresentation in the ageing society. Lancet Digit Health. 2023;5(6):e336-e337. [CrossRef]
Abbreviations
BYOD: Bring-Your-Own-Device
FHIR: Fast Healthcare Interoperability Resources
mERA: Mobile Health Evidence Reporting and Assessment
MeSH: Medical Subject Headings
PGHD: person-generated health data
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
STaRT-RWE: Structured Template and Reporting Tool for Real-World Evidence
Edited by A Mavragani; submitted 17.08.23; peer-reviewed by T Katapally, E Baker; comments to author 02.02.24; revised version received 12.04.24; accepted 27.09.24; published 20.01.25.
Copyright©Shanshan Song, Micaela Ashton, Rebecca Hahn Yoo, Zoljargal Lkhagvajav, Robert Wright, Debra J H Mathews, Casey Overby Taylor. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 20.01.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.