
Published on 02.05.19 in Vol 21, No 5 (2019): May

Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/12390, first published Oct 04, 2018.


    Viewpoint

    Why Reviewing Apps Is Not Enough: Transparency for Trust (T4T) Principles of Responsible Health App Marketplaces

    1King's College London, Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, London, United Kingdom

    2Center for Behavioral Intervention Technologies, Northwestern University, Chicago, IL, United States

    *all authors contributed equally

    Corresponding Author:

    Til Wykes, BSc (Hon), MPhil, DPhil

    King's College London

    Department of Psychology

    Institute of Psychiatry, Psychology and Neuroscience

    De Crespigny Park

    London, SE5 8AF

    United Kingdom

    Phone: 44 20 7848 0596

    Email: til.wykes@kcl.ac.uk


    ABSTRACT

    The overselling of health apps that may provide little benefit, and even harm, needs the health community’s immediate attention. With little formal regulation, a light-touch approach to consumer protection is now warranted to give customers a modicum of information to help them choose from the vast array of so-called health apps. We suggest 4 guiding principles that should be adopted to provide consumers with information that can guide their choice at the point of download. We call these the Transparency for Trust (T4T) principles; they are derived from experimental studies, systematic reviews, and reports of patient concerns. The T4T principles are (1) privacy and data security, (2) development characteristics, (3) feasibility data, and (4) benefits. All our questions are phrased simply so that all consumers can understand them. We suggest that app stores should take responsibility for providing this information and display it with any app marketed as a health app. Even the absence of information would give consumers some understanding and inform their choice. This would also provide some commercial impetus for app developers to consider this requested information from the outset.

    J Med Internet Res 2019;21(5):e12390

    doi:10.2196/12390


    Background

    Digital therapeutics are being touted as having the potential to transform health care by improving people’s experience, increasing effectiveness, and reducing costs. A few digital aids are recommended and integral to health services, but much of the e-health field depends on overselling [1]. This business model seems to be working: the digital health field was estimated to be worth $25 billion globally in 2017 [2,3]. One US survey found that 58% of smartphone users have downloaded at least one health app [4]. The overselling of health apps needs the health community’s immediate attention, as many of these apps may provide little benefit and some may cause harm. With little formal regulation, alternative light-touch approaches to consumer protection are necessary to give customers a modicum of information as a basis for choosing from the vast array of so-called health apps.

    We believe that simple, but informative, evidence should be available at the point of download and propose 4 succinct aspects that represent the critical information required for responsible health app marketplaces. We refer to these 4 principles—privacy and data security, development practices, feasibility, and health benefits—as the Transparency for Trust (T4T) principles. The goal of these principles is to operationalize the steps app marketplaces can take to answer calls for better oversight and for backing up products with data and research [3,5,6]. The T4T principles draw from several sources, including patient and regulatory perspectives, recent systematic reviews, and experimental studies (eg, [4,7-14]). We have used the Innovate UK definition of a health app: an app that contributes to the physical, mental, or social well-being of the user [15]. Our principles are designed to apply to the whole panoply of health apps, from sleep apps to diabetes apps, symptom trackers to mindfulness interventions (eg, [16-18]).

    Health apps are the fastest growing segment of digital therapeutics. Current estimates suggest that over 300,000 health apps exist, with a recent yearly growth rate of 25% [7]. The rapid growth of products, without matching growth in information and regulation, has left very little to separate quality health apps from those that are at best useless and at worst harmful. As a result, consumers are left to navigate the app stores alone. In one fast-growing sector, digital mental health, the likelihood of using an app is also affected by the relative lack of access to, or choice of, mental health services, the stigma and discrimination experienced by sufferers, and the interrelationship between mental and physical health. We know, for instance, that people with depression face a higher risk of developing heart disease than individuals without depression and that, following a heart attack, each additional depressive symptom increases the risk of another heart attack by 15% [17]. Mental health problems also affect morbidity in other disorders such as rheumatoid arthritis and asthma [16]. We have used examples from mental health, but our principles are intended to inform and empower all health app users.


    Why Do We Need Some Simple Principles?

    Exponential Growth and Poor Regulation

    The potential of digital technologies to prevent, alleviate, or help manage ill health has been recognized for many years, and although they offer an enormous opportunity to rethink how services are provided, there are large roadblocks in their way (eg, [19]). A Lancet Psychiatry Commission suggested that digital therapeutics could provide benefits now, complementing current mental health treatments and aiding self-management [8]. However, evidence suggests that some apps are not only ineffective, unsafe, and hard to use but also fail to meet users’ privacy and security expectations [9,20,21].

    Formal regulation is remarkably light and restricted to a narrow selection of health apps that provide formal diagnoses or treatment for specified medical conditions [3,22,23]. Even when an app is regulated, we cannot be sure that it will work. For example, the US Food and Drug Administration (FDA) recently approved the first behavioral health app, reSET, for the treatment of substance use by using evidence from a clinical trial of a Web-based version of the treatment, not the app itself [24]. In the United Kingdom, the Care Quality Commission issued guidance in 2017 for digital health care providers, but this concentrated on safety [25]. The Medicines and Healthcare Products Regulatory Agency (MHRA) provides Conformité Européenne marking for medical devices and Certificates of Free Sale [26] but leaves review to the National Information Board. As it stands, many health apps are marketed with few checks, and even regulatory approval appears to offer little assurance that the specific product was ever directly evaluated.

    More complex regulation has been proposed. The National Institute for Health and Care Excellence (NICE) is curating a National Health Service (NHS) app library. This is a burdensome process, so only a few apps can be assessed in any year. Currently, the library has 78 apps, with only 18 for the fastest developing sector, mental health. This is a minute subset of the 325,000 available [27]. In the United States, the FDA launched a precertification pilot program involving 9 companies to speed the approval process, but this will evaluate the developers and their practices rather than focusing on the product [28]. Apple Inc has introduced additional requirements for developers of medical apps, but these focus mainly on measurement accuracy [29]. There is a middle way to fill this important, and now yawning, gap in consumer information: health app marketplaces could take a lead by providing relatively simple guidance.


    What Is Wrong With Current Systems for Reviewing Health Apps?

    Most proposed evaluation systems (eg, the Mobile App Rating Scale, MARS [30], and the Enlight assessment tools [31]) assess usability, aesthetics, content, user engagement, and available research evidence, and others have been adding to this list [32]. These systems are useful because they facilitate multifaceted and thorough evaluations of apps, but they fall short of allowing clear recommendations. In fact, recent evidence from Canada involving service users demonstrated that a high MARS rating would not, on its own, provide enough information for service users to decide whether to download an app [33]. Advisory bodies such as NICE in the United Kingdom determine what is likely to be cost-effective (considering effectiveness, cost relative to benefit, and comparison treatments) before recommending use in the UK NHS. Consumers, however, want to make choices based on simpler information. One for-profit company, ORCHA [34], provides reviews based on current standards, regulation, and good practice, but its overall score does not allow consumers to decide which components are important to them. PsyberGuide [35], a nonprofit organization, also provides reviews that include a service user focus but does not receive data directly from app developers. Beyond failing to meet all users’ expectations, no method provides clear information at the point of sale, so a potential consumer would have to search in 2 places for the information needed to make a choice. We propose only 4 aspects of apps that represent the critical information required for responsible health app marketplaces. These 4 principles, deemed the T4T principles, are privacy and data security, development characteristics, feasibility data, and benefits.


    Transparency for Trust (T4T) Principles

    Privacy and Data Security

    Privacy and data security are a primary concern for patients and their clinicians [36-38], and their importance has only become more salient with recent events such as the Facebook and Cambridge Analytica scandal [39-41]. The European Union General Data Protection Regulation is strong and introduces new rights for people to access the information companies hold about them, obligations for better data management by businesses, and a new regime of fines across Europe. Regulations elsewhere are weaker, resulting in varying protections internationally. One review prompted the closure of the NHS app store when it was discovered that accredited apps were not encrypting data adequately and did not explicitly describe the personal data leaving the app [42]. Happtique, an early app certification company, met a similar fate when several of its certified apps were hacked, demonstrating the inadequacy of its processes for evaluating privacy and data security [43,44]. Many apps rely on selling the data they collect as their business plan, which jeopardizes personal privacy [42]. There is also evidence of poor practice resulting in fines for selling sensitive information to lottery companies and fraudsters [45]. Privacy concerns change with the evolving technology, even though device operating systems are moving toward encryption on the device by default. Users therefore need information about data leaving the app to make informed decisions about their willingness to provide sensitive health information [46,47].

    Although full formal audits are needed to ensure apps follow their stated procedures [48], even requiring developers to list their privacy and data security procedures in simple terms would be a significant step forward in raising standards [49]. We propose 3 questions: (1) what data leave the device? (2) how are those data stored (eg, de-identified, encrypted)? and (3) who will have access to those data? It should be clear what data, if any, are being sold, to whom, and what steps are taken to ensure that users cannot be identified from those data.

    Development Characteristics

    Development characteristics describe how the app was developed, and our recommendations conceptually overlap with those of the FDA’s precertification pilot and the MHRA in the United Kingdom. Good development practice would involve all stakeholders (clinicians and the target audience) and use evidence-based guidance from the beginning and at all stages of development and testing. The absence of guidance or standards has recently been noted for physical activity and fitness apps: very few of the thousands of Android apps examined provided any measurement or used any of the accepted guidance [50]. We especially emphasize including the target audience. This may seem obvious, but unfortunately, development practices often include clinicians and experts yet rarely involve the target audience until evaluation. Many studies rely on small numbers of participants or convenience samples, for example, soliciting feedback from stressed college students rather than individuals with depression [51]. Again, this may seem obvious, but independent usability evaluations have demonstrated that many popular commercial apps are frustrating and challenging for members of the intended audience, raising questions about their prior involvement and the potential for the app to benefit this community [52]. Recent evidence also suggests that good design contributes not only to usability but also to engagement with health apps [53], and there are several authoritative descriptions of the processes for developing good design [54].

    Developers should outline their design and development process and clearly describe how patients were involved. Our 3 questions are as follows: (1) how were target users involved in the initial design? (2) how were target users involved in usability evaluations? and (3) has usability been independently evaluated?

    Feasibility

    Feasibility evaluations should address how people use the app (usability and user experience), how long they use it (engagement), and whether any serious adverse concerns are discovered (safety). They provide expectations for the frequency and length of use, information that is also vital for assessing benefits. It would not be possible to run a drug trial or market a drug without some concept of the dosing frequency and expected therapeutic dose, and the same should be true of health apps.

    Again, we have 3 questions: (1) what proportion of users continue to use the app after 2 weeks? (2) what adverse events occurred and at what rate? and (3) has feasibility been independently evaluated? We propose a 2-week test not because it represents a likely therapeutic dose but because very few users persist in using a health app after the first week [55]. A standard metric, such as 2 weeks, could enable cross-app comparisons of engagement. As with usability testing, independent evaluation of apps is key to promoting transparency and confidence in findings. Despite the availability of engagement analytics, few are reported even in clinical assessments of feasibility [56]. Independent evaluations could be carried out by service user groups, which would further strengthen service user involvement in the process of development and evaluation. Transparency could be further facilitated by making these datasets available to the research community.

    Health Benefits

    Health benefits should be demonstrated through rigorous evaluations using standardized and accepted outcomes for the target condition. Although many researchers have noted the mismatch between the development cycle for mobile apps and traditional randomized controlled trials [57,58], health apps presented as digital therapeutics still require rigorous evaluation to back up their claims. The speed of development should not be allowed to preclude such evaluations, despite suggestions to the contrary by some academics and designers [59]. We should be presented with direct evidence of an app’s safety and effectiveness because apps are not merely mobile versions of websites, even if they have similar content. People use apps differently, including more frequently and in shorter bursts [60], and these differences could affect their impact. We have already mentioned that this has happened with the first FDA-approved behavioral health app, reSET, which relied on clinical trial evidence from a Web-based version of the treatment [24]. Although triangulation of different sorts of data has been suggested (eg, MindTech [61]), we believe that health apps should undergo a trial to determine their superiority to other treatment options, especially as many unsubstantiated claims have been made [36]. Advertising standards require evidence to support any claims made, so these data fulfill both commercial and patient needs. Evaluations should also consider opportunity cost: using a health app may delay treatments that could be more beneficial, and a delay could worsen the health condition, making it harder to treat. All these benefits and costs need to be weighed in the balance. Our 3 questions are as follows: (1) what is the impact on the health condition? (2) what percentage of users received no benefit or deteriorated? and (3) are there specific benefits that outweigh any costs?


    What Would This Look Like in Practice?

    We have inserted the information from 4 health apps, one of which (Calm) was named app of the year in the iTunes Store in 2017 (see Table 1). We have extracted, where possible, the information on each of our principles from the information provided with the app. The differences are very clear, especially in privacy and health benefits. There could, of course, be more data on benefits held elsewhere, but these were not available at the point of download. What these simple principles also provide is the ability for a consumer to trade off the attributes. Some may want to know that their data are totally secure, whereas others might allow some encrypted, anonymized data to be transmitted if the effectiveness of the app is proven. Indeed, in a recent survey of participants recruited from a mood and anxiety disorder clinic, many respondents were willing to allow an app to collect data directly from their phone, including global positioning system, motion sensors, and screen state [37].

    Table 1. Evaluating apps with the Transparency for Trust principles.

    Are These Principles Different From Those Suggested by Others?

    As we have said, the T4T principles build on those suggested by others in recent years. However, we more clearly operationalize our principles into concrete questions that could be answered and made available to potential users. To do so, we considered information important to regulators, developers, and health services, as well as integrating patient viewpoints taken from a number of different studies [37,66]. Patients are, after all, the consumer group of interest. Their views do not necessarily coincide with those of expert groups, as shown in the Delphi exercise by Zelmer et al [33]. Privacy and security feature in every assessment system and in regulations, and they are high on the list for patients, especially those with a mental health problem, who may be more sensitive about information about them being shared [11]; so they are included here, but in the simplest terms and not buried in an incomprehensible privacy statement. Our principle for a fit-for-purpose app includes development with patients. This principle is often suggested [19] but rarely incorporated into app assessment. As we know that some commercial apps are complex and hard to use for the patient group they were intended for, we have weighted this section highly. Effectiveness is mentioned in many assessment systems, but the promised effects also depend on the dose of the app and how intensively it is used. Patients need to consider this time commitment when deciding to make a purchase. Patients also want to know not just how effective an app is but also whether anyone receives no benefit. This is also important to clinicians, as patients who receive no benefit may view themselves as hopeless cases and not, as in the BlueIce exemplar, simply part of the quarter of patients who report no advantage from following the app.

    Our approach has, therefore, been to provide information to patients at the point of download that allows them to make an informed decision and that they can refer to later as part of their self-management plan.


    Responsibility in the Health App Marketplaces

    If these simple T4T principles are followed, then we will have gone some way toward protecting patients. Whose job is it to monitor the T4T principles? Our view is that formal regulation is not needed. We just need the information that allows patients (and patient groups) to make informed choices. Information that is not true can be picked up by advertising standards authorities. Recent examples of this process are the US Federal Trade Commission fining Lumosity for deceiving consumers with unfounded claims about cognitive benefits [67] and Carrot Neurotechnology for claiming that its app, Ultimeyes, could improve users’ vision [68]. Health apps are not a passing fad, and the low barrier to entry into current app marketplaces has resulted in an environment that at best confuses consumers and at worst delays effective treatment. The problems have been highlighted, but clear solutions like ours have rarely been proposed. Developers may be encouraged to provide these answers by commercial advantage, as apps meeting the T4T principles might increase consumer comfort and produce new revenue streams through increased adoption, not only from direct-to-patient markets but also from health systems. They will also enjoy increased legitimacy among patient groups.

    We also note that the contribution of these principles is that they form a small, yet informative, set of questions that could be adopted relatively simply. We suggest these principles as a first, but important, step. Further steps could explore whether these principles could be defined with more structure. However, this would likely require further empirical work and coordination between different stakeholders in the health app space, particularly developers and purveyors.

    Confidence in the efficacy and safety of these health apps is the least that patients should expect when choosing to buy or use them. It is now time that existing commercial app stores, specifically the Google Play and Apple iTunes stores, step back from their libertarian ideology and adopt some rules for health app marketing. They should tighten the definition of health apps and adopt a system, ours hopefully, that allows patients to understand what to expect from a health app. Although some might believe that this proposal is otherworldly, starting somewhere is important. Health app marketplaces have a duty to follow our suggestions, and health app developers a commercial advantage in doing so; we should not need to wait for another scandal or disaster before the Google Play or Apple iTunes stores step up to the plate and help prevent worthless products being pressed on those with health needs.

    Acknowledgments

    TW acknowledges support from the National Institute for Health Research (NIHR) Maudsley Biomedical Research Centre at the South London and Maudsley NHS Foundation Trust and King’s College London and an NIHR Senior Investigator Award. SMS acknowledges the support of One Mind as well as the Implementation Research Institute at Washington University in St Louis, support from the National Institute of Mental Health (5R25MH08091607), and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative. The views expressed are the authors’ and not necessarily those of the NHS, the NIHR, the Department of Health, or the National Institute of Mental Health.

    Authors' Contributions

    TW presented the outline and, together with SMS, wrote the initial draft. Both authors contributed to the revisions. TW is the guarantor.

    Conflicts of Interest

    TW has developed a novel software intervention (CIRCuiTS) and sits on the PsyberGuide Scientific Advisory Board. She has not received any funding from companies involved in this field. SMS has received funding from One Mind and serves as the Executive Director of PsyberGuide, a nonprofit funded by One Mind. SMS also serves as a scientific advisor to Joyable, Inc, and Potentia Labs, Inc, and has received stock options for these contributions.

    References

    1. Wykes T, Brown M. Over promised, over-sold and underperforming? - e-health in mental health. J Ment Health 2016;25(1):1-4 [FREE Full text] [CrossRef] [Medline]
    2. The Lancet. Does mobile health matter? Lancet 2017 Nov 18;390(10109):2216. [CrossRef] [Medline]
    3. Duggal R, Brindle I, Bagenal J. Digital healthcare: regulating the revolution. Br Med J 2018 Dec 15;360:k6. [CrossRef] [Medline]
    4. Krebs P, Duncan DT. Health app use among US mobile phone owners: a national survey. JMIR Mhealth Uhealth 2015 Nov 4;3(4):e101 [FREE Full text] [CrossRef] [Medline]
    5. Torous J, Roberts LW. Needed innovation in digital health and smartphone applications for mental health: transparency and trust. JAMA Psychiatry 2017 Dec 1;74(5):437-438. [CrossRef] [Medline]
    6. The Lancet. Is digital medicine different? Lancet 2018;392(10142):95. [CrossRef]
    7. Research2Guidance. 2017. 325,000 mobile health apps available in 2017 - Android now the leading mHealth platform   URL: https://research2guidance.com/325000-mobile-health-apps-available-in-2017/ [accessed 2019-03-08] [WebCite Cache]
    8. Holmes EA, Ghaderi A, Harmer CJ, Ramchandani PG, Cuijpers P, Morrison AP, et al. The Lancet Psychiatry Commission on psychological treatments research in tomorrow's science. Lancet Psychiatry 2018 Mar;5(3):237-286. [CrossRef] [Medline]
    9. Bhuyan SS, Kim H, Isehunwa OO, Kumar N, Bhatt J, Wyant DK, et al. Privacy and security issues in mobile health: current research and future directions. Health Policy Technol 2017 Jun;6(2):188-191. [CrossRef]
    10. Academy of Medical Sciences. London: Academy of Medical Sciences; 2018 Nov 14. Our data-driven future in healthcare: People and partnerships at the heart of health related technologies   URL: https://acmedsci.ac.uk/file-download/74634438 [accessed 2019-03-08] [WebCite Cache]
    11. Castell S, Ashford H. Academy of Medical Sciences. London: Academy of Medical Sciences; 2018 Nov 14. Future data-driven technologies and the implications for use of patient data: Dialogue with public, patients and healthcare professionals   URL: https://acmedsci.ac.uk/file-download/6616969 [accessed 2019-03-08] [WebCite Cache]
    12. Hollis C, Sampson S, Simons L, Davies EB, Churchill R, Betton V, et al. Identifying research priorities for digital technology in mental health care: results of the James Lind Alliance Priority Setting Partnership. Lancet Psychiatry 2018 Oct;5(10):845-854. [CrossRef] [Medline]
    13. Federal Trade Commission. Washington: Federal Trade Commission; 2016. Interactive Mobile Health Apps Tool: Developing a mobile health app?   URL: https://www.ftc.gov/tips-advice/business-center/guidance/mobile-health-apps-interactive-tool [accessed 2019-03-08] [WebCite Cache]
    14. Simblett S, Greer B, Matcham F, Curtis H, Polhemus A, Ferrão J, et al. Barriers to and facilitators of engagement with remote measurement technology for managing health: systematic review and content analysis of findings. J Med Internet Res 2018 Jul 12;20(7):e10480 [FREE Full text] [CrossRef] [Medline]
    15. British Standards Institution. London: Innovate UK; 2015. Health and wellness apps: Quality criteria across the life cycle - Code of practice   URL: https://shop.bsigroup.com/upload/271432/PAS%20277%20(2015)bookmarked.pdf [accessed 2019-03-08] [WebCite Cache]
    16. Naylor C, Knapp M, McDaid D. The King's Fund. London; 2012. Long-term conditions and mental health: The cost of co-morbidities   URL: https://www.kingsfund.org.uk/publications/long-term-conditions-and-mental-health [accessed 2019-03-08] [WebCite Cache]
    17. Zuidersma M, Ormel J, Conradi HJ, de Jonge P. An increase in depressive symptoms after myocardial infarction predicts new cardiac events irrespective of depressive symptoms before myocardial infarction. Psychol Med 2012 Apr;42(4):683-693. [CrossRef] [Medline]
    18. Ismail K, Winkley K, Stahl D, Chalder T, Edmonds M. A cohort study of people with diabetes and their first foot ulcer: the role of depression on mortality. Diabetes Care 2007 Jun;30(6):1473-1479. [CrossRef] [Medline]
    19. Bhugra D, Tasman A, Pathare S, Priebe S, Smith S, Torous J, et al. The WPA-Lancet Psychiatry Commission on the future of psychiatry. Lancet Psychiatry 2017 Oct;4(10):775-818. [CrossRef] [Medline]
    20. Nicholas J, Larsen ME, Proudfoot J, Christensen H. Mobile apps for bipolar disorder: a systematic review of features and content quality. J Med Internet Res 2015 Aug 17;17(8):e198 [FREE Full text] [CrossRef] [Medline]
    21. Larsen ME, Nicholas J, Christensen H. A systematic assessment of smartphone tools for suicide prevention. PLoS One 2016;11(4):e0152285 [FREE Full text] [CrossRef] [Medline]
    22. Food and Drug Administration. Washington: Food and Drug Administration; 2015 Feb 9. Mobile Medical Applications: Guidance for industry and food and drug administration staff   URL: https://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM263366.pdf [accessed 2019-03-08] [WebCite Cache]
    23. Food and Drug Administration. Washington: Food and Drug Administration; 2016 Jul 16. Mobile medical applications   URL: https://www.ftc.gov/tips-advice/business-center/guidance/mobile-health-apps-interactive-tool [accessed 2019-03-08] [WebCite Cache]
    24. Campbell AN, Nunes EV, Matthews AG, Stitzer M, Miele GM, Polsky D, et al. Internet-delivered treatment for substance abuse: a multisite randomized controlled trial. Am J Psychiatry 2014 Jun;171(6):683-690 [FREE Full text] [CrossRef] [Medline]
    25. Care Quality Commission. London: Care Quality Commission; 2017. Clarification of regulatory methodology: PMS digital healthcare providers   URL: https://www.cqc.org.uk/file/1295582 [accessed 2019-03-08] [WebCite Cache]
    26. Department for Business EIS. London: Department for Business EIS; 2012 Oct 08. CE Marking: How a product complies with EU safety, health and environmental requirements, and how to place a CE marking on your product   URL: https://www.gov.uk/guidance/ce-marking [accessed 2019-03-08] [WebCite Cache]
    27. National Health Service. London: National Health Service; 2018 Aug 25. NHS Apps Library: Find digital tools to help you manage and improve your health   URL: https://www.nhs.uk/apps-library/ [accessed 2019-03-08] [WebCite Cache]
    28. Food and Drug Administration. Washington: Food and Drug Administration; 2019. Digital Health Software Precertification (Pre-Cert) Program   URL: https://www.fda.gov/MedicalDevices/DigitalHealth/DigitalHealthPreCertProgram/default.htm [accessed 2019-03-08] [WebCite Cache]
    29. Apple Inc. California: Apple; 2019. App Store Review Guidelines   URL: https://developer.apple.com/app-store/review/guidelines/ [accessed 2019-03-08] [WebCite Cache]
    30. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, Mani M. Mobile app rating scale: a new tool for assessing the quality of mobile health apps. JMIR Mhealth Uhealth 2015 Mar 11;3(1):e27 [FREE Full text] [CrossRef] [Medline]
    31. Baumel A, Faber K, Mathur N, Kane JM, Muench F. Enlight: a comprehensive quality and therapeutic potential evaluation tool for mobile and web-based eHealth interventions. J Med Internet Res 2017 Dec 21;19(3):e82 [FREE Full text] [CrossRef] [Medline]
    32. Drake G, Csipke E, Wykes T. Assessing your mood online: acceptability and use of Moodscope. Psychol Med 2013 Jul;43(7):1455-1464 [FREE Full text] [CrossRef] [Medline]
    33. Zelmer J, van Hoof K, Notarianni M, van Mierlo T, Schellenberg M, Tannenbaum C. An assessment framework for e-mental health apps in Canada: results of a modified Delphi process. JMIR Mhealth Uhealth 2018 Jul 9;6(7):e10016 [FREE Full text] [CrossRef] [Medline]
    34. Orcha. Daresbury; 2018. Orcha: Unlocking the power of Digital Health for the population   URL: https://www.orcha.co.uk/ [accessed 2019-03-08] [WebCite Cache]
    35. Psyberguide. Looking for a mental health app?   URL: https://psyberguide.org/ [accessed 2019-03-08] [WebCite Cache]
    36. Simons DJ, Boot WR, Charness N, Gathercole SE, Chabris CF, Hambrick DZ, et al. Do “Brain-Training” Programs Work? Psychol Sci Public Interest 2016 Oct;17(3):103-186. [CrossRef] [Medline]
    37. Di Matteo D, Fine A, Fotinos K, Rose J, Katzman M. Patient willingness to consent to mobile phone data collection for mental health apps: structured questionnaire. JMIR Ment Health 2018 Aug 29;5(3):e56 [FREE Full text] [CrossRef] [Medline]
    38. Hendrikoff L, Kambeitz-Ilankovic L, Pryss R, Senner F, Falkai P, Pogarell O, et al. Prospective acceptance of distinct mobile mental health features in psychiatric patients and mental health professionals. J Psychiatr Res 2019 Feb;109:126-132. [CrossRef] [Medline]
    39. BBC. London: BBC; 2019 Feb 18. Facebook-Cambridge Analytica data breach   URL: https://www.bbc.co.uk/news/topics/c81zyn0888lt/facebook-cambridge-analytica-data-breach [accessed 2019-03-08] [WebCite Cache]
    40. Meyer R. The Atlantic. Washington: The Atlantic; 2018 Mar 20. The Cambridge Analytica Scandal, in Three Paragraphs. What it means for Facebook, for President Trump's world, and for every American   URL: https://www.theatlantic.com/technology/archive/2018/03/the-cambridge-analytica-scandal-in-three-paragraphs/556046/ [accessed 2019-03-08] [WebCite Cache]
    41. Wikipedia. 2019 Feb 21. Facebook–Cambridge Analytica data scandal   URL: https://en.wikipedia.org/wiki/Facebook%E2%80%93Cambridge_Analytica_data_scandal [accessed 2019-03-08] [WebCite Cache]
    42. Huckvale K, Prieto JT, Tilney M, Benghozi P, Car J. Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment. BMC Med 2015 Sep 7;13:214 [FREE Full text] [CrossRef] [Medline]
    43. Dan L. HIStalk. 2014 Dec 10. The Rise and Fall of Happtique, mHealth's First App Prescribing Platform   URL: http://histalkmobile.com/the-rise-and-fall-of-happtique-mhealths-first-app-prescribing-platform/ [accessed 2019-03-08] [WebCite Cache]
    44. Dolan B. Mobi Health News. 2014 Dec 10. SocialWellth acquires health app certification company Happtique   URL: https://www.mobihealthnews.com/38894/socialwellth-acquires-health-app-certification-company-happtique [accessed 2019-03-08] [WebCite Cache]
    45. Pharmacy 2U. 2015 Oct 20. Statement from Pharmacy2U   URL: https://www.pharmacy2u.co.uk/news/statement-oct15/ [accessed 2019-03-08] [WebCite Cache]
    46. Ennis L, Robotham D, Denis M, Pandit N, Newton D, Rose D, et al. Collaborative development of an electronic Personal Health Record for people with severe and enduring mental health problems. BMC Psychiatry 2014 Nov 18;14:305 [FREE Full text] [CrossRef] [Medline]
    47. Robotham D, Mayhew M, Rose D, Wykes T. Electronic personal health records for people with severe mental illness; a feasibility study. BMC Psychiatry 2015 Aug 6;15:192 [FREE Full text] [CrossRef] [Medline]
    48. Papageorgiou A, Strigkos M, Politou E, Alepis E, Solanas A, Patsakis C. Security and privacy analysis of mobile health applications: the alarming state of practice. IEEE Access 2018 Mar;6:9390-9403. [CrossRef]
    49. Rosenfeld L, Torous J, Vahia IV. Data security and privacy in apps for dementia: an analysis of existing privacy policies. Am J Geriatr Psychiatry 2017 Aug;25(8):873-877. [CrossRef] [Medline]
    50. Kebede M, Steenbock B, Helmer SM, Sill J, Möllers T, Pischke CR. Identifying evidence-informed physical activity apps: content analysis. JMIR Mhealth Uhealth 2018 Dec 18;6(12):e10314 [FREE Full text] [CrossRef] [Medline]
    51. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health 2017 Jun 6;4(2):e19 [FREE Full text] [CrossRef] [Medline]
    52. Sarkar U, Gourley GI, Lyles CR, Tieu L, Clarity C, Newmark L, et al. Usability of commercially available mobile applications for diverse patients. J Gen Intern Med 2016 Dec;31(12):1417-1426 [FREE Full text] [CrossRef] [Medline]
    53. Baumel A, Kane JM. Examining predictors of real-world user engagement with self-guided eHealth interventions: analysis of mobile apps and websites using a novel dataset. J Med Internet Res 2018 Dec 14;20(12):e11491 [FREE Full text] [CrossRef] [Medline]
    54. Edwards EA, Caton H, Lumsden J, Rivas C, Steed L, Pirunsarn Y, et al. Creating a theoretically-grounded, gamified health app: lessons from developing the Cigbreak smoking cessation mobile phone game. JMIR Serious Games 2018 Nov 29;6(4):e10252 [FREE Full text] [CrossRef] [Medline]
    55. Owen JE, Jaworski BK, Kuhn E, Makin-Byrd KN, Ramsey KM, Hoffman JE. mHealth in the wild: using novel data to examine the reach, use, and impact of PTSD coach. JMIR Ment Health 2015;2(1):e7 [FREE Full text] [CrossRef] [Medline]
    56. Pham Q, Graham G, Carrion C, Morita PP, Seto E, Stinson JN, et al. A library of analytic indicators to evaluate effective engagement with consumer mHealth apps for chronic conditions: scoping review. JMIR Mhealth Uhealth 2019 Jan 18;7(1):e11941 [FREE Full text] [CrossRef] [Medline]
    57. Mohr DC, Cheung K, Schueller SM, Hendricks Brown C, Duan N. Continuous evaluation of evolving behavioral intervention technologies. Am J Prev Med 2013 Oct;45(4):517-523 [FREE Full text] [CrossRef] [Medline]
    58. Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mHealth evidence workshop. Am J Prev Med 2013 Aug;45(2):228-236 [FREE Full text] [CrossRef] [Medline]
    59. Hollis C, Morriss R, Martin J, Amani S, Cotton R, Denis M, et al. Technological innovations in mental healthcare: harnessing the digital revolution. Br J Psychiatry 2015 Apr;206(4):263-265. [CrossRef] [Medline]
    60. Oulasvirta A, Tamminen S, Roto V, Kuorelahti J. Interaction in 4-second Bursts: The Fragmented Nature of Attentional Resources in Mobile HCI. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '05); April 2-7, 2005; Portland, OR. New York: ACM; 2005. p. 919-928. [CrossRef]
    61. Davies EB, Craven MP, Martin JL, Simons L. Proportionate methods for evaluating a simple digital mental health tool. Evid Based Ment Health 2017 Nov;20(4):112-117 [FREE Full text] [CrossRef] [Medline]
    62. Oxford Health NHS Foundation Trust. BlueIce app   URL: https://www.oxfordhealth.nhs.uk/blueice/ [accessed 2019-03-12] [WebCite Cache]
    63. Calm.   URL: https://www.calm.com/ [accessed 2019-03-12] [WebCite Cache]
    64. MyFitnessPal.   URL: https://www.myfitnesspal.com/ [accessed 2019-03-12] [WebCite Cache]
    65. Dario.   URL: https://mydario.com/ [accessed 2019-03-12] [WebCite Cache]
    66. Schueller S, Neary M, O'Loughlin K, Adkins EC. Discovery of and interest in health apps among those with mental health needs: survey and focus group study. J Med Internet Res 2018 Jun 11;20(6):e10141 [FREE Full text] [CrossRef] [Medline]
    67. Federal Trade Commission. 2016 Jan 15. Lumosity to Pay $2 Million to Settle FTC Deceptive Advertising Charges for Its "Brain Training" Program   URL: https://www.ftc.gov/news-events/press-releases/2016/01/lumosity-pay-2-million-settle-ftc-deceptive-advertising-charges [accessed 2019-03-08] [WebCite Cache]
    68. Federal Trade Commission. 2015 Sep 17. FTC Charges Marketers of 'Vision Improvement' App with Deceptive Claims   URL: https://www.ftc.gov/news-events/press-releases/2015/09/ftc-charges-marketers-vision-improvement-app-deceptive-claims [accessed 2019-03-08] [WebCite Cache]


    Abbreviations

    FDA: Food and Drug Administration
    MARS: Mobile App Rating Scale
    MHRA: Medicines and Healthcare Products Regulatory Agency
    NICE: National Institute for Health and Care Excellence
    NIHR: National Institute for Health Research
    T4T: Transparency for Trust


    Edited by G Eysenbach; submitted 04.10.18; peer-reviewed by S Davis, J Rose, K Stawarz; comments to author 05.01.19; revised version received 21.01.19; accepted 09.02.19; published 02.05.19

    ©Til Wykes, Stephen Schueller. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 02.05.2019.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.