This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.
Along with the proliferation of health information technologies (HITs), there is a growing need to understand the potential privacy risks associated with using such tools. Although privacy policies are designed to inform consumers, such policies have consistently been found to be confusing and to lack transparency.
This study aims to present consumer preferences for accessing privacy information; develop and apply a privacy policy risk assessment tool to assess whether existing HITs meet the recommended privacy policy standards; and propose guidelines to assist health professionals and service providers with understanding the privacy risks associated with HITs, so that they can confidently promote their safe use as a part of care.
In phase 1, participatory design workshops were conducted with young people who were attending a participating
When considering the use of HITs, the participatory design workshop participants indicated that they wanted privacy information to be easily accessible, transparent, and user-friendly to enable them to clearly understand what personal and health information will be collected and how these data will be shared and stored. The privacy policy review revealed consistently poor readability and transparency, which limited the utility of these documents as a source of information. Therefore, to enable informed consent, the privacy guidelines provided here aim to ensure that health professionals and consumers are fully aware of the potential privacy risks of using HITs to support health and well-being.
A lack of transparency in privacy policies has the potential to undermine consumers’ ability to trust that the necessary measures are in place to secure and protect the privacy of their personal and health information, thus reducing their willingness to engage with HITs. The application of the privacy guidelines will improve the confidence of health professionals and service providers in the privacy of consumer data, thus enabling them to recommend HITs to provide or support care.
Digital health has quickly become an integral component of best practice health care, transforming the way care is delivered. By capitalizing on digital infrastructure, it is widely recognized that digital health solutions improve access to care, particularly for individuals with mobility or transport restrictions, or for those who live remotely where health care resources may be limited [
It is crucial to consider the legal and ethical rights of individuals who choose to both explicitly and passively share their web-based health information. This is essential, particularly in the area of mental health care, where data often contain highly personal information that could cause significant harm and distress if not handled appropriately. There is increasing documentation and guidance in this area, such as the recent release of the National Safety and Quality Digital Mental Health Standards (consultative draft) by the Australian Commission on Safety and Quality in Health Care [
The use of health-related apps has rapidly increased in recent years, with 47% of Australian consumers using apps in 2018 [
This study aims to use co-design methodologies to better understand young people’s preferences for learning about how their personal and health information will be handled by HITs and create prototypes for the InnoWell Platform. The InnoWell Platform was developed by InnoWell, a joint venture between the University of Sydney and PricewaterhouseCoopers (PwC; Australia) through Project Synergy, an Aus $30 million (US $22.1 million) Australian government–funded initiative [
Participatory design (ie, co-design) methodologies are routinely used to ensure that digital tools are designed to meet the needs of the intended user base, thus increasing uptake and engagement [
Our research team conducted a series of 10 participatory design workshops from July to September 2018 in 9 urban and rural
Participants included individuals from the participating
The InnoWell Platform consists of a multidimensional assessment evaluating a range of biopsychosocial domains (eg, psychological distress, sleep, alcohol use, and physical health) to provide a holistic view of the consumer. The assessment results are available in real time and designed to be reviewed collaboratively by the consumer and their health professional to promote shared decision-making about care options, accounting for consumer preferences. A consumer’s progress can then be routinely tracked and monitored over time using assessment tools to inform treatment planning, clinical review, and coordinated care within and between services.
As previously described [
This study was approved by the University of Sydney’s Human Research Ethics Committee (protocol number: 2018/130).
Drawing from digital health privacy and security criteria published by existing research and professional associations, our research team developed a privacy policy risk assessment tool (
Importantly, the privacy policy risk assessment tool is broadly applicable to HITs. To demonstrate its utility, in this study, we evaluated the privacy policies of the apps and e-tools in the
Aligned with established evaluation processes [
The readability levels of each privacy policy were assessed as part of the evaluation process. There are multiple readability formulas available; however, for the purpose of this study, we used the Flesch-Kincaid readability tests because of their acceptance in the health care literature [
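For illustration, the two Flesch-Kincaid measures are straightforward to compute from the average words per sentence and syllables per word. The minimal sketch below uses the published coefficients; the vowel-group syllable counter is a rough heuristic of our own (established readability tools use dictionary-based syllable counts), and the function names are illustrative rather than part of the study's methods.

```python
import re

def count_syllables(word: str) -> int:
    # Approximate syllables as runs of consecutive vowels, dropping a
    # trailing silent "e"; dictionary-based counters are more accurate.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith("le") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_scores(text: str) -> tuple[float, float]:
    # Returns (reading ease, grade level) using the published
    # Flesch-Kincaid coefficients.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level
```

Higher reading ease scores (on a 0-100 scale) indicate easier text, whereas the grade level approximates the years of schooling required to understand it; longer sentences and more polysyllabic words lower the former and raise the latter.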
Descriptive statistics were used to analyze all aspects of the assessment data. SPSS version 25 (IBM Corp) was used for all analyses.
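As a minimal sketch of the descriptive statistics involved, categorical responses can be tallied into the n (%) format reported in the results tables. The `tally` helper below is illustrative only and is not part of the study's SPSS workflow.

```python
from collections import Counter

def tally(responses):
    # Summarize categorical responses as "n (%)" strings, mirroring the
    # format used in the results tables.
    counts = Counter(responses)
    total = len(responses)
    return {answer: f"{n} ({round(100 * n / total)}%)"
            for answer, n in counts.items()}

# Example: 26 of 34 policies answered "Yes", 8 answered "No".
print(tally(["Yes"] * 26 + ["No"] * 8))
# → {'Yes': '26 (76%)', 'No': '8 (24%)'}
```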
As a result of Project Synergy and the development of the InnoWell Platform, a set of core principles and privacy guidelines was used as the starting point for formalizing a more encompassing set of privacy guidelines. A series of consultations were held between 2014 and 2016 (phase 1) and then again between 2017 and 2018 (phase 2) to develop a set of privacy guidelines for Project Synergy. Initial consultations were conducted by Orygen (the National Centre of Excellence in Youth Mental Health) [
Participants included key stakeholder groups, including Orygen (the National Centre of Excellence in Youth Mental Health), the Young and Well Cooperative Research Centre, the Mental Health Commission of NSW (Pacific Privacy Consulting Pty Ltd), the Project Synergy research and development team (the University of Sydney’s Brain and Mind Centre), InnoWell, and PwC (Australia). Select individuals were nominated by each organization, with participants contributing diverse expertise and experience, including 2 former privacy commissioners for NSW and Victoria (Australia).
From the outset of the development cycle of the prototype and as part of phase 1 of Project Synergy (2014-2016) [
Following the review of the initial guidelines developed in phase 1 of Project Synergy, a narrower focus was adopted as the starting point for developing more encompassing privacy guidelines for phase 2 of Project Synergy (2017-2020). Specifically, after the University of Sydney and Pacific Privacy Consulting determined that privacy concerns were the highest priority, the focus was narrowed to 8 core foundation principles to be followed by organizations using the prototype in phase 1 of Project Synergy (2014-2016) [
This paper presents privacy guidelines to assist health service providers in considering the privacy of their consumers when using HITs as part of care. The guidelines were first drafted by the Project Synergy research and development team based on the information gathered through the initial collaborative consultation process. The checklist was then reviewed, discussed, and evaluated by the research team, ultimately resulting in agreement by consensus.
The results of the knowledge translation process highlighted that participants wanted privacy information to be presented before being required to create an account. Specifically, they emphasized the need for privacy information to be readily available, allowing a user to be completely comfortable when entering more sensitive information into a HIT, such as the InnoWell Platform (eg, “Always ask could this site be more secure with my information” [Wollongong workshop]). This included the ability to change permissions concerning data sharing at their discretion (eg, “[I] would want privacy settings in place so that not everyone that shares the system can see” [Broken Hill workshop]). Participants noted that privacy information is frequently confusing and difficult to understand, leaving them unsure whether they should trust the HIT to protect their personal and health information. Thus, multiple participants suggested a pin code or password (eg, “Consider password security like in bank apps.” [Townsville workshop]) to access certain data so the consumer controls who has access to their information in the InnoWell Platform. Importantly, the idea of consumer control extended beyond HIT manufacturers such as InnoWell and included health professionals and supportive others (ie, family members and carers) accessing personal and health information (eg, “Need privacy setting like Facebook...can filter who can see the information” [Bathurst workshop]).
We evaluated 34 privacy policies using the privacy policy risk assessment tool. Most of these apps and e-tools were designed for both youth and adult users (28/34, 82%), whereas the remaining 18% (6/34) were specifically designed for youth audiences (aged ≤25 years). Most apps and e-tools (20/34, 59%) were self-help or self-management tools supporting mental health and well-being, including 3 specifically using cognitive behavioral therapy techniques. There were also 12% (4/34) symptom trackers, 6% (2/34) web-based counseling services, 6% (2/34) planning and time management tools, 6% (2/34) psychoeducational websites, and 1 (3%) mindfulness and meditation app. The remaining apps and e-tools supported fitness (2/34, 6%) and relationships (1/34, 3%).
The summary results from the review of privacy policies are presented in
Summary of the privacy policy assessment results (N=34).
| Privacy policy questions and responses | Value, n (%) |
| --- | --- |
| **Question 1** | |
| No | 0 (0) |
| Yes | 34 (100) |
| **Question 2** | |
| No | 8 (24) |
| Yes | 26 (76) |
| **Question 3** | |
| No | 3 (9) |
| Yes | 31 (91) |
| **Question 4** | |
| No | 26 (76) |
| Some information | 3 (9) |
| Yes | 5 (15) |
| **Question 5** | |
| No | 0 (0) |
| Some information | 1 (3) |
| Yes | 33 (97) |
aHIPAA: Health Insurance Portability and Accountability Act.
In relation to the Flesch reading ease, the privacy policies were all found to fall into the top three of the seven available score categories, with 9% (3/34) rated as
PHIa assessment results (N=34).
| PHI questions and responses | Value, n (%) |
| --- | --- |
| **Question 1** | |
| No | 2 (6) |
| Yes | 32 (94) |
| **Question 2** | |
| No | 4 (13) |
| Yes | 28 (87) |
| **Question 3** | |
| Not clear | 3 (9) |
| Somewhat clear | 2 (6) |
| Yes | 27 (85) |
| **Question 4** | |
| No | 14 (44) |
| Somewhat clear | 1 (3) |
| Yes | 17 (53) |
aPHI: personal health information.
Results of the assessment questions related to data sharing, data preferences, and data storage are presented in
Data sharing and data use assessment results (N=34).
| Data sharing and data use questions and responses | Value, n (%) |
| --- | --- |
| **Question 1** | |
| No | 2 (6) |
| Yes | 32 (94) |
| **Question 2** | |
| No | 11 (33) |
| Yes | 23 (67) |
| **Question 3** | |
| No | 30 (88) |
| Yes | 1 (3) |
| Some permissions | 3 (9) |
| **Question 4** | |
| No | 10 (29) |
| Yes | 22 (65) |
| Informational webpage only | 2 (6) |
| **Question 5** | |
| No | 27 (79) |
| Yes | 4 (12) |
| Unspecified | 3 (9) |
| **Question 6** | |
| Yes | 32 (94) |
| Unspecified | 2 (6) |
| **Data storage duration** | |
| Unspecified or unclear | 12 (35) |
| Until no longer needed | 12 (35) |
| 1 year or less | 2 (6) |
| 1-3 years | 3 (9) |
| More than 3 years | 3 (9) |
| At user discretion | 2 (6) |
| **Type of server** | |
| Secure Australian server | 6 (19) |
| Secure overseas server | 11 (34) |
| Unspecified server | 7 (22) |
| University server | 2 (6) |
| Hospital or PHN | 2 (6) |
| Unclear | 4 (13) |
| **Location of data storage** | |
| Australia | 13 (38) |
| United States | 9 (26) |
| Canada | 3 (9) |
| Europe | 1 (3) |
| Multiple countries | 2 (6) |
| Unclear or unspecified | 6 (18) |
aPHN: Primary Health Network.
Most apps and e-tools (32/34, 94%) stored data on a server, with a small number (4/34, 12%) storing data both on the device (ie, PHI) and on a server (ie, email address and website activity). In addition, 6% (2/34) of apps and e-tools did not specify where the data were stored. Reported data storage durations included up to 1 year (2/34, 6%), 1 to 3 years (3/34, 9%), and more than 3 years (3/34, 9%). Although more than one-third of the apps and e-tools were unclear or did not specify how long data were stored (12/34, 35%), approximately one-third (12/34, 35%) stored the data until no longer needed by the organization. The remaining 6% (2/34) of privacy policies stated that the data would be deleted at the user’s discretion.
Of the 32 apps and e-tools that stored data on a server, 11 (34%) stored data on a secure overseas server, 6 (19%) on a secure Australian server, and 7 (22%) on an unnamed or unspecified server. In addition, 6% (2/32) of apps and e-tools stored data on a university server, 6% (2/32) on a hospital or primary health network server system, and 13% (4/32) were unclear about the type of server used. The location of data storage was mixed between Australia (13/34, 38%), the United States (9/34, 26%), Canada (3/34, 9%), Europe (1/34, 3%), and multiple locations (2/34, 6%), with the remainder unclear as to where data were stored (6/34, 18%).
Most apps and e-tools (27/34, 79%) shared data with relevant third parties, including but not limited to partners, suppliers, collaborators, advisers, and business associates. The types of data shared varied from PHI to aggregated user data, such as location. A small number of apps and e-tools (4/34, 12%) shared information with irrelevant third parties, including social media platforms such as Facebook. In addition, 38% (13/34) of apps and e-tools shared data with a research partner or university, 15% (5/34) shared information with government departments, and 38% (13/34) shared data with a health-related group or person (eg, a support person or health professional).
All privacy policies were assessed for their inclusion of various other details, which are summarized in
Overview of additional details provided by privacy policies (N=34).
| Additional questions and responses | Value, n (%) |
| --- | --- |
| **Question 1** | |
| No | 11 (32) |
| Yes | 20 (59) |
| Unspecified | 3 (9) |
| **Question 2** | |
| No | 26 (76) |
| Yes | 8 (24) |
| **Question 3** | |
| No—does not provide either | 3 (9) |
| Some—provides information for internal or third party only | 16 (47) |
| Yes—provides both internal and third-party expert | 15 (44) |
| **Question 4** | |
| No | 23 (67) |
| Somewhat | 3 (9) |
| Yes | 8 (24) |
As generated through the collaborative consultation process described previously,
The HIT manufacturer has clearly introduced the purpose of its privacy policy.
The privacy policy includes an introduction to the HIT manufacturer and how and why their organization operates.
The privacy policy provides adequate information and addresses my concerns.
If no, I am aware who I need to contact to seek clarification...
The privacy policy is written clearly.
If no, I am aware who I need to contact to seek clarification...
The privacy policy adequately explains how the HIT manufacturer will collect and use personal data.
The privacy policy adequately explains how and when the HIT manufacturer will disclose personal data to third parties.
If the HIT manufacturer shares data with third parties, I am confident that the third-party partners are reputable and will comply with all legislative requirements when collecting, storing, and sharing data.
If no, I am aware who I need to contact to seek clarification...
I am confident the HIT manufacturer has taken the appropriate steps to protect everyone’s data, adopting the strongest security measures.
I have been made aware of how end users can access and correct their personal information on the HIT.
If no, I am aware who I need to contact to seek clarification...
The privacy policy outlines how and when the HIT manufacturer will delete personal data.
The privacy policy outlines how the HIT manufacturer will respond to any data breaches.
The privacy policy includes information on how I can inquire, provide feedback, or make complaints.
There is the opportunity for me to contact a third-party expert to inquire about the privacy policy (such as the Office of the Australian Information Commissioner).
If yes, they are...
From what I read, I feel comfortable using the proposed HIT as part of my clinical care and/or practice.
With their increased experience and exposure, consumers are becoming more sophisticated users of HITs. They can offer valuable insights into how privacy information should be presented to ensure that it is clear, informative, and transparent. Participants of our co-design workshops highlighted the need for all HITs to have a privacy policy that provides relevant data security information before collecting information from an individual, preceding the account creation process. In addition, participants stated that privacy policies should be accessible, transparent, and user-friendly, ensuring that consumers understand what personal and health data will be collected, stored, and shared and, in turn, enabling them to trust the HIT to protect their data. These findings align with those reported by Schueller et al [
As evidenced by our co-design results, consumers are calling for greater clarity and transparency in the privacy policies of HITs so that they can be confident that they understand how their personal and health information may be used. Importantly, all apps and e-tools included in this study had a privacy policy. All but 1 of those policies provided explicit details explaining the manufacturer’s approach to privacy and how personal information is managed and protected. However, approximately one-quarter of the privacy policies did not meet the standards of the Australian Privacy Act 1988 or other international equivalents, raising concerns regarding undisclosed data sharing and poorly secured data storage. Even when the use of data adheres to privacy standards, issues of transparency often arise. For example, a recent review of the data sharing practices of 24 health-related apps found that data were shared with 55 unique entities, including app developers, their parent companies, and third parties (ie, service providers). Subsequently, third parties shared user data with an additional 216 integrated fourth parties (eg, Facebook sharing data with data brokers to enable targeted advertising) [
In addition to poor overall transparency, our results also confirm that privacy policies, when present, are fairly difficult to read and require a college or professional reading level, essentially rendering them useless for a large portion of potential users (eg, children and young people or individuals with cognitive impairments or intellectual disabilities). This aligns with previous research by Robillard et al [
Most apps and e-tools included in this study collected some form of PHI (32/34, 94%), including, in some cases, information related to mental health, with 87% (28/34) of those apps and e-tools then sharing these data in some manner. Although data sharing was disclosed in most privacy policies (27/32, 84%), how PHI was shared was not transparent in 15% (5/32) of the policies. Most apps and e-tools shared data with relevant third parties (27/34, 79%); however, 12% (4/34) also shared information with third parties deemed to be irrelevant, such as social media platforms (ie, Facebook). Of note, few apps and e-tools (4/31, 13%) allowed users to update their permissions concerning data sharing.
Although it is unlawful in Australia, for example, to share PHI for purposes other than those stated in a privacy policy, the complexities of the web-based environment frequently preclude a full understanding of how data are shared and for what purpose [
Most apps and e-tools in our sample stored data on a server (32/34, 94%). Of these, approximately half stored data on a secure overseas (11/32, 34%) or secure Australian (6/32, 19%) server, a small number used a university server (2/32, 6%) or a hospital or primary health network server system (2/32, 6%), and the remainder were unspecified (7/32, 22%) or unclear (4/32, 13%) about the type of server used. Once data are transmitted to a third-party server, it is often difficult to determine the robustness of the privacy and confidentiality standards in place to protect the PHI. For example, Cultura Colectiva, a digital media company with access to user information from Facebook, stored data on a publicly accessible server, resulting in the exposure of 540 million individual records, including user IDs and names [
Given the concerns regarding accessibility, transparency, and readability outlined above, through a consultative process with key stakeholders, our team developed privacy guidelines (
As few consumers will review academic literature before accessing HITs, they are more likely to learn about available apps and e-tools via the internet, app stores, social media, word of mouth, or health professionals. In relation to the latter, it is recommended that health professionals and service providers conduct their own risk assessment before implementing HITs into their service to ensure appropriate risk strategies are in place [
This study has some limitations that are important to note. Although we engaged in a thorough collaborative consultation process to develop the broad structure and content of the privacy guidelines for Project Synergy, the development of the privacy policy risk assessment tool and the guidelines for health professionals and service providers was conducted by the research team, independent of this broader stakeholder group. Therefore, we acknowledge that both the assessment tool and the guidelines may benefit from further input or revision by individuals with expertise in data privacy and security, both from a legal perspective and regarding manufacturers of digital tools. As highlighted by the co-design work presented in phase 1, our group recognizes the importance of including the voice of those with lived experience in our work to reform mental health services and systems of care, including the ethical development and application of digital tools. With that being said, we acknowledge that the privacy policy risk assessment tool and guidelines were developed without contributions from consumers with lived experience of mental ill health.
Given the increasing uptake of HITs, both by individuals for the purposes of self-management and by health professionals as a means to complement clinical services, it is essential that all users have a clear understanding of what personal and health information will be collected, how these data will be shared and stored, and what privacy and security measures are in place to ensure these data are protected. Our findings highlight the ubiquitous poor readability and lack of transparency in existing privacy policies, a stark contrast to what consumers emphasized as essential factors in the presentation of privacy information. Although consumers, health professionals, and services are becoming increasingly reliant on HITs to deliver, support, or enhance care, concerns regarding the privacy of health and personal information are likely to undermine user confidence and willingness to engage with HITs. Therefore, we provide suggested guidelines that can be easily adopted by health professionals and service providers when considering the implementation of HITs, including apps and e-tools, into their service. We recommend that these guidelines be adopted to ensure that HITs are used to their full potential to maximize patient health outcomes while minimizing risk, and that users are informed of privacy and security considerations so they can make educated decisions about whether to share their personal and health information.
Privacy risk assessment tool.
HIT: health information technology
NSW: New South Wales
PHI: personal health information
PwC: PricewaterhouseCoopers
The authors IBH and TAD were integral in securing funding to support this study. This study was designed by HML, AER, TAD, and GL. All data analyses were conducted by HML and AER. All authors have contributed to and approved the final manuscript. This research was conducted on behalf of the Australian Government Department of Health as part of Project Synergy (2017-20). InnoWell was formed by the University of Sydney and PricewaterhouseCoopers (Australia) to deliver the Aus $30 million (US $22.1 million) Australian government-funded Project Synergy. The authors would also like to acknowledge Future Generation Global for funding the Youth Mental Health & Technology Program, which aims to improve young people’s access to quality mental health care, including through the use of co-design methodologies. The authors would like to acknowledge and thank Hannah Yee for her contributions to the knowledge translation and Toby Wong for his assistance with data collection. The authors would also like to thank the participants in the co-design workshops for their insights on how to best present privacy information to users.
IBH is the codirector of health and policy at the Brain and Mind Centre, University of Sydney. The Brain and Mind Centre operates an early-intervention youth service at Camperdown under contract to headspace. He is the chief scientific advisor to and a 5%-equity shareholder in InnoWell Pty Ltd. InnoWell was formed by the University of Sydney (45% equity) and PricewaterhouseCoopers (Australia; 45% equity) to deliver the Aus $30 million (US $22.1 million) Australian government-funded Project Synergy (2017-2020; a 3-year program for the transformation of mental health services) and to lead the transformation of mental health services internationally through the use of innovative technologies. TAD is now the director of research and evaluation at the Design and Strategy Division of the Australian Digital Health Agency. The funding source does not entail any potential conflicts of interest for the other members of the Project Synergy research and development team.