Published in Vol 25 (2023)

Digital Phenotyping: Data-Driven Psychiatry to Redefine Mental Health



1Infrastructure for Clinical Research in Neurosciences, Paris Brain Institute, Sorbonne University - Institut national de la santé et de la recherche médicale - Centre national de la recherche scientifique, Paris, France

2Department of Psychiatry, Pitié-Salpêtrière Hospital, Public Hospitals of Sorbonne University, Paris, France

3Department of Psychiatry, Saint-Antoine Hospital, Public Hospitals of Sorbonne University, Paris, France

4Medical Strategy and Innovation Department, Clariane, Paris, France

5NeuroStim Psychiatry Practice, Paris, France

6Department of Child and Adolescent Psychiatry, Nantes University Hospital, Nantes, France

7Pays de la Loire Psychology Laboratory, Nantes University, Nantes, France

8Institute for Advanced Consciousness Studies, Santa Monica, CA, United States

9Media Lab, Massachusetts Institute of Technology, Cambridge, MA, United States

Corresponding Author:

Antoine Oudin, MD

Infrastructure for Clinical Research in Neurosciences

Paris Brain Institute

Sorbonne University - Institut national de la santé et de la recherche médicale - Centre national de la recherche scientifique

Hôpital Pitié-Salpêtrière

47-83 Bd de l'Hôpital

Paris, 75013


Phone: 33 142161739


The term “digital phenotype” refers to the digital footprint left by patient-environment interactions. It has potential for both research and clinical applications but challenges our conception of health care by opposing 2 distinct approaches to medicine: one centered on illness with the aim of classifying and curing disease, and the other centered on patients, their personal distress, and their lived experiences. In the context of mental health and psychiatry, the potential benefits of digital phenotyping include creating new avenues for treatment and enabling patients to take control of their own well-being. However, this comes at the cost of sacrificing the fundamental human element of psychotherapy, which is crucial to addressing patients’ distress. In this viewpoint paper, we discuss the advances rendered possible by digital phenotyping and highlight the risk that this technology may pose by partially excluding health care professionals from the diagnosis and therapeutic process, thereby foregoing an essential dimension of care. We conclude by setting out concrete recommendations on how to improve current digital phenotyping technology so that it can be harnessed to redefine mental health by empowering patients without alienating them.

J Med Internet Res 2023;25:e44502



The emergence and rapid adoption of digital technologies in medicine have led to changes in medical practice and the conception of health. One such technology is based on the notion of the digital phenotype, which first emerged in 2015 in Nature Biotechnology [1], building on Dawkins’ concept of the extended phenotype (the phenotype being the set of observable characteristics or traits of an organism). Digital phenotyping (DP) refers to the collection of observable and measurable characteristics, traits, or behaviors of an individual, defined as “moment-by-moment quantification of the individual-level human phenotype in situ using data from personal digital devices” [2]. The data can be divided into active and passive subgroups. Active data requires user engagement (eg, the completion of a questionnaire), while passive data is collected without user participation or notification.

The use of passive data collection through sensors of all kinds represents a step change in the clinical observation of patients, as it gathers fine-grained information that can be more relevant to illness phenotypes than data collected exclusively through active input from the patient (eg, ecological momentary assessment and chatbot interactions). Today, there is a proliferation of digital interfaces, as each individual interacts with a variety of connected objects, including wearables and smartphones equipped with a plethora of measurement tools. These devices can store and measure different types of data, including GPS location, proximity to other devices using Bluetooth, walking speed using an accelerometer, heart rate, oxygen level, electrical cardiac activity, sleep quality, perspiration using humidity sensors, tone of voice, activity on social networks, the lexical field of written sentences, etc. The collection of passive data has already led to some progress in various medical disciplines (eg, monitoring cognitive function in cognitive impairment [3], Parkinson disease progression [4], cardiac electrophysiology [5], seizure detection [6], and glucose levels in diabetes [7]). DP serves the dual purpose of fulfilling clinical objectives and logistical aims. The clinical goals include improving health care professionals’ (HCPs’) ability to diagnose patients and select the most effective treatment options. Meanwhile, the logistical objectives involve managing health care systems to ensure optimal performance and efficiency. Nevertheless, DP may also constrain the role of HCPs, who already rely on clinical decision support systems, structuring the profession in a top-down manner at the risk of subjugating and disqualifying their know-how. Furthermore, the collection of quantitative data may dispossess patients of their subjective distress [8].
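To make the notion of passive data more concrete, the following minimal sketch shows how raw sensor streams such as hourly step counts and GPS fixes might be reduced to candidate behavioral features of the kind listed above. All values are synthetic, and the feature names and thresholds are illustrative assumptions, not clinically validated measures.

```python
import math

# Synthetic passive data for one day: hourly (latitude, longitude) fixes
# and hourly step counts. Purely illustrative values.
gps_fixes = [(48.8566, 2.3522)] * 18 + [(48.8606, 2.3376)] * 6  # mostly at one location
hourly_steps = [0] * 7 + [500, 900, 300, 200, 100, 800, 400,
                          300, 200, 700, 900, 400, 100, 0, 0, 0, 0]

def home_time_fraction(fixes, home, radius_deg=0.001):
    """Fraction of GPS fixes within a small radius of an inferred home location
    (a crude stand-in for the 'homestay' features used in mobility research)."""
    at_home = sum(1 for lat, lon in fixes
                  if math.hypot(lat - home[0], lon - home[1]) <= radius_deg)
    return at_home / len(fixes)

daily_steps = sum(hourly_steps)                                  # activity feature
home_fraction = home_time_fraction(gps_fixes, home=(48.8566, 2.3522))  # mobility feature

print(daily_steps, round(home_fraction, 2))
```

In a real DP pipeline, features of this kind would be computed continuously over weeks and fed into statistical models rather than inspected by hand.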

We believe that the emergence of technologies such as DP in medicine underscores the fundamental differences between 2 complementary conceptions of health [9]: one centered on the illness, the other on the patient. Illness-centered medicine has its roots in the ancient Greek medical school of Knidos. The aim is to cure illnesses. It is a medicine focused on the diagnosis and classification of illnesses. It can be related to the myth of Prometheus (ie, delaying or denying death) and corresponds to the objectification of the patient as a body or machine made up of organs and functional systems. Treatments involve invasive gestures (punctures, incisions, etc), the requirement to take medication, or, in the field of psychiatry, neurostimulation techniques such as electroconvulsive therapy. Behind this aggressive dimension of care [10] lies the idea of combating nature rather than seeking to improve coexistence with it. Patient-centered medicine is derived from the Hippocratic tradition. The aim is to care for patients by focusing on their self-experience of their illness, just as in palliative care, where human interactions are fundamental. This holistic (ie, whole person) approach to medicine focuses on prognosis and involves considering mental and social factors in order to improve individual patients’ quality of life. Throughout history, the conception of health has been pulled in these 2 opposite directions, depending on the patient’s or HCP’s point of view [10]. In the past decade, there has been much progress in patient-centered medicine. For example, clinical trials are increasingly using patient-reported outcome measures, as these are now being demanded by health authorities and regulatory agencies [11].

This paper examines how the implementation of DP in psychiatry could redefine mental health. This will be a challenging process owing to the plurality of concepts and approaches it involves. The World Health Organization defines health as a “state of complete physical, mental, and social well-being and not merely the absence of disease or infirmity” [12] and applies a normative approach based on a set of arbitrary conditions. Psychiatry has always claimed to be clinical medicine. In contrast to other medical disciplines, it has no consensual biological markers and no gold standard to help HCPs (ie, psychiatrists, psychologists, nurses, social workers, and therapists) establish diagnosis. The criteria used in psychiatry are clinical and mostly qualitative, stemming from observations of bedridden patients, in accordance with the etymology of this ancestral term (klinê, meaning bed). HCPs must make a thorough assessment of the functional impairment caused by the psychiatric illness in terms of the individual-environment interaction to justify the treatment. Finally, what makes psychiatry so complex is that there are no standards either for the physiopathological explanation of illnesses or for therapeutics. There is no international consensus on therapeutic guidelines, and there is huge interindividual variability in treatment response and tolerance, whether that treatment is pharmacological or psychotherapeutic. In other words, what is beneficial for one patient may not be for another. The same applies to physiopathological models, which range from psychodynamics to neurobiology and from genetic predispositions to environmental factors. In psychiatry, being in poor health can even have secondary benefits [13].

Classifications have been drawn up to justify therapeutic interventions. These can be either categorical, such as the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) [14] and the International Classification of Diseases, or dimensional, such as the “research domain criteria” (RDoC) [15]. Both conceptions have issues [16]. Categorical classification reflects medical tradition but leaves some help-seekers without care due to arbitrary diagnostic thresholds. As they have low intrinsic validity, categorical illnesses also give rise to frontier forms and disorder spectrums with little temporal validity or therapeutic interest [17,18]. Dimensional classification considers all dimensions to be equal in their pathogenicity without taking account of their interplay and causal links (eg, some dimensions may be defense mechanisms from an evolutionary or psychodynamic perspective). It also requires thresholds that are well defined, given that they are exposed to HCPs’ subjectivity. Finally, some dimensions are purely descriptive and do not consider functional impairment or therapeutic implications [19,20].

To delve into these matters, in this viewpoint paper, we begin by discussing the expectations related to the rise of a data-driven approach to mental health in the form of DP. Subsequently, we examine how its use threatens to dehumanize mental health. Lastly, we set out guidelines for ensuring that the implementation of DP in psychiatry fosters more patient-centered mental health.

Emergence of Digital Phenotyping in Psychiatry

Data science emerged in psychiatry some years ago, representing all the digital information about mental health, individuals’ properties, and digital factors involved in the health care processes. It ranges from theoretical variables of interest such as major life events, comorbid diagnosis, and stress factors to blood marker levels and clinical characteristics to functional neuroimaging [21]. DP gives HCPs a new set of digital biomarkers, collected from wearables, smartphones, but also virtual reality devices [22] or in gaming contexts [23], offering the opportunity to model mental health [24] and the extent of individual-environment interactions.

Machine learning (ML) brings powerful tools for exploring the high-dimensional, real-time data involved in DP. It offers the opportunity to “make sense” of these digital signs of the reality they attempt to represent, namely the state of mental health [25-27]. Some see in this technology the potential to better understand the neurobiological mechanisms underlying psychiatric illnesses [28] or to provide new transdiagnostic models for understanding symptoms, in line with the RDoC perspective [29]. In addition, ML could be used to deliver new predictive models. Artificial intelligence (AI) might be capable of “overcoming the trial and error-driven status quo in mental health care by supporting precise diagnoses, prognosis, and therapeutic choices” [30]. These systems may have the ability to predict risks, help HCPs make clinical decisions, increase the accuracy and speed of diagnosis, and facilitate the examination of health records. Integrating this data-driven approach into clinical practice could reduce the workload of HCPs. AI systems can already efficiently perform tasks such as diagnosing skin diseases and analyzing medical images (eg, in neurology, ophthalmology, cardiology, and gastroenterology) [31,32] and could soon be used in clinical decision support systems in psychiatry [33], making compulsory admissions more helpful [34]. DP therefore has a natural home in psychiatry [35,36], helping HCPs access a new set of data based on individuals’ behavioral experiences and increasing their ability to classify and understand symptoms in their contextual and temporal dimensions, as well as the illnesses themselves [37,38]. Data reflecting emotions, levels of energy, behavioral changes [39], symptoms such as sociability, mood, physical activity, and sleep [40], but also logorrhea, agitation, rumination, hallucinations, or suicidal thoughts, could constitute the digital signature of a pathology (Figure 1).
Given that psychiatry usually explains a mental illness and its issues in terms of the difficulty patients have interacting with their environment, it is only logical for DP to arouse such interest, especially as it could compensate for the absence of reliable biomarkers.
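As a deliberately tiny illustration of how ML might turn such digital signatures into a predictive model, the sketch below fits a plain logistic regression (written out by hand, not any published DP pipeline) to synthetic, hand-labeled weekly features. The features, labels, and learning settings are all assumptions made for the example.

```python
import math

# Synthetic weekly features: (mean sleep hours, mean daily steps / 10,000),
# labeled 1 for a simulated symptomatic week and 0 for a stable week.
X = [(7.5, 0.9), (8.0, 1.1), (7.0, 0.8), (7.8, 1.0),    # stable weeks
     (4.5, 0.2), (5.0, 0.3), (4.0, 0.25), (5.5, 0.35)]  # symptomatic weeks
y = [0, 0, 0, 0, 1, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression trained by batch gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(10000):
    gw, gb = [0.0, 0.0], 0.0
    for (x1, x2), target in zip(X, y):
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - target
        gw[0] += err * x1
        gw[1] += err * x2
        gb += err
    w[0] -= lr * gw[0] / len(X)
    w[1] -= lr * gw[1] / len(X)
    b -= lr * gb / len(X)

def predict(x1, x2):
    """1 if the model flags the week as symptomatic, 0 otherwise."""
    return int(sigmoid(w[0] * x1 + w[1] * x2 + b) >= 0.5)

accuracy = sum(predict(x1, x2) == t for (x1, x2), t in zip(X, y)) / len(X)
```

On this toy, linearly separable data the model fits perfectly; real DP data are noisy and high-dimensional, which is precisely where the bias and black box concerns discussed later arise.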

Figure 1. Schematic representation of digital phenotyping.

An Improved Psychiatric Care

In recent decades, the psychometric assessment of patients has involved self-report questionnaires (eg, the Patient Health Questionnaire for Depression [41]) and observer-rated scales (eg, the Hamilton Rating Scale for Depression [42]). As these scales are filled in either by the patient or the HCP, they necessarily have a degree of subjectivity. DP would enable phenomenological data to be collected, with the possibility of establishing ecologically valid psychometric assessments [37,43] (Figure 1). For instance, user-generated content on social media sites such as Reddit allows for the recognition of mental illness-related posts with good accuracy using a deep learning approach [44].

Various studies have already provided evidence about the use of DP for diagnostic purposes. As an example, there is a correlation between circadian rhythm, step counts, or heart rate variability and the diagnosis of a mood disorder or mood episode [45-48]. Other correlations have been found between data and symptoms of schizophrenia [49-51], major depression [52-57], mood disorders [46,58,59], posttraumatic stress disorder [60,61], generalized anxiety disorder [62], suicidal thoughts [63,64], sleep disorders [65], addiction [66], stress [53,67], the postpartum period [68,69], autism [70], and child and adolescent psychiatry [71,72]. Among other examples of the efficiency of DP for prediction or diagnosis in mental health, Instagram photos and Facebook language have been found to be predictors of depression [73,74]; suicidal risk could be assessed from social media [75,76], with increasing precision if DP were to integrate electronic health record data [77,78]; and automated analysis of free speech can measure relevant mental health changes in emergent psychosis [79] or incoherence in the speech of patients with schizophrenia [80]. It is worth noting that clinical utility may be derived from a combination of passive and active data and not from passive data alone [81]. Overall, increased diagnostic accuracy could help avoid treatment delays or errors. For example, there is currently an average 8-year delay in the diagnosis of bipolar disorder in France [82]. Concerning the choice of medication, phenotypic markers already play a major role (eg, the choice of antidepressant for depression depends on whether it is accompanied by insomnia or hypersomnia). DP could bring together more phenotypic markers than HCPs can collect and thus improve the choice of treatment, countering the effect of cognitive bias on decision-making [83].
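The kind of correlation reported in these studies can be sketched as follows. The paired values below are synthetic and chosen only to illustrate the computation; they are not taken from any of the cited work.

```python
import math

# Hypothetical paired weekly observations: heart rate variability
# (RMSSD, ms) and a self-reported depression score. Synthetic values.
hrv = [55, 60, 48, 42, 38, 35, 30, 28]
score = [5, 4, 8, 10, 12, 14, 16, 18]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(hrv, score)  # strongly negative in this contrived example
```

Even a strong correlation of this kind is only an association at the group level; turning it into an individual diagnostic signal requires the validation work the cited studies are still undertaking.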

Finally, DP could improve the follow-up of people with serious mental illnesses and optimize care between 2 consultations, especially when accessing care facilities can be difficult. It could also improve the assessment of treatment efficacy. Some researchers have hypothesized that heart rate variability is influenced by the progression of the patient’s depression and could be a new biomarker of treatment response [84,85]. High diagnostic confidence improves treatment compliance, and problems with compliance are a frequent source of relapse in mental illness [86]. DP could in turn provide useful tools for enhancing therapeutic education and improving the prediction of relapses [48,87-90]. Ultimately, DP could also be used to assess whether patients could benefit from psychotherapy [91], using parameters with proven efficacy [45], as part of an evidence-based medicine strategy.

Patient’s Empowerment

DP introduces new capacities to assess differences between what is normal and what is pathological. Canguilhem [92] identified normative capacity (ie, normativity) as a central condition for gauging to what extent a living person is in good health: “What characterizes health is the possibility to tolerate infractions of the usual norm and to set new norms for new situations.” Thus, all living beings live with their own norms based on their specific biological limits. These norms are defined by Canguilhem [92] as a mode of functioning in certain environmental conditions that allows individuals to have normal abilities and live normal lives. Unlike machines, living beings have the possibility of defining their norms according to environmental conditions in order to ensure real-time adaptation. The limits of these norms are inevitably tested in the course of the individuals’ biological lives as they interact with their environment and pursue their life goals, and when they have a disability that prevents them from meeting their goals, they feel ill. Nonetheless, this experience of the limits is based on self-feeling and is totally personal. When this normative capacity no longer allows an individual to adapt, thus triggering an illness, it is legitimate to intervene in order to restore the ability to set new life goals. Concerning psychiatric symptoms, DP could add a digital dimension to the notion of limits and the production and assessment of norms. Experiencing physical pain when one’s leg is broken is more readily acknowledged, and seeking help is a common response. However, people often encounter challenges in recognizing psychic pain [93], depressive symptoms, or anxiety and find it much more difficult to perceive these conditions as abnormal and deserving of appropriate adaptations or treatment. DP could provide a novel and sensitive approach that empowers individuals to define their own digital mental health and establish their own digital norms. 
In accordance with Canguilhem’s principles, the threshold between the normal and the pathological would become more nuanced [92,94]. Individuals would be able to perceive mental health as something more tangible, allowing them to determine for themselves which conditions contribute to their well-being independently of any scientific paradigms used by HCPs for their evaluation and treatment approaches. Consequently, individuals would have the opportunity to take charge of their mental health care, paving the way for a comprehensive mental health prevention system. With this perspective, it would be simpler to harness individual motivation for making changes and foster active participation in health strategies. In the United States, data-driven psychiatry is starting to emerge, with a close relationship among patients, HCPs, and DP [95]. There have been attempts at self-management, where patients are given greater autonomy with regard to technologies and the management of their symptoms [94]. For instance, some open science applications such as mindLAMP (which collects health data, produces easy-to-understand graphs, enables journaling thoughts and reflections, and offers customized mental health interventions) allow some patients with depression or alcohol dependence to rapidly develop emotional self-awareness, track thinking patterns in real time, gain insight into their progress, connect to the clinical team, engage with medication, or reinforce their confidence in psychotherapy [96]. Other studies have suggested that DP increases patients’ feelings of control over their symptoms [97]. The quantified self approach could help scientists understand pathology better and deserves further exploration. In Western medicine, representations of diseases and health are biased by cultural interpretation. Thanks to this new data-based language for self-evaluation, akin to a kind of “data-feeling,” patients could free themselves from this bias.
Herein lies the idea of compensating for the subjectivity of clinical interviews. DP could introduce a more precise and decisive point of view of the daily life of the patient, where functional impairments, one of the major characteristics of mental illness, must not be neglected. DP is therefore a potential source of objectivity in mental illnesses [98], making it possible to flag up daily abnormalities.

Toward a Personalized Psychiatry

The possibility of empowerment is consistent with psychiatry’s move toward a more personalized approach. It was initially conceived of as the tailoring of psychiatric practices to the patient’s situation based on the HCP’s assessment. However, it could take the form of personalized requests from patients to HCPs, with better comprehension of their mental disorders converting patients into self-researchers investigating their own illness. DP could create a unique network of macro (social and smart cities), meso (relation to environment and situations), micro (the person), and very micro (physiological mechanisms) data integrated into the patient’s daily life. DP would promote personalized psychiatry [99], doing away with the usual lengthy periods of observation in psychiatry. This corresponds to the concept of intelligent health, defined as the expansion of electronic health through the inclusion of built-in data analysis using new technologies, as well as the extension of patient assessment to the patient’s environment and HCPs, and data mining to support decision-making [100]. HCPs and patients must be able to select the parameters that match their characteristics and situation in a move toward clinical augmentation. This added precision in clinical observation should lead to equally shared (between HCPs and patients) models of illnesses and treatments. It opens the way to patient-centered mental health, in which the “symptom network” paradigm takes its place, enabling better phenotypic characterization of disorders and their evolution. The symptom network approach aims at a technologically augmented clinical and therapeutic relationship [101] that could explore how symptoms influence each other without trying to find a unique causal source, within a more consistent and transparent psychopathological framework [102].

To conclude, we can see how DP could bring about major scientific progress in psychiatry toward patient-centered mental health. Nevertheless, this optimistic view of its potential uses or advantages needs to be tempered by practical issues. This quest for optimized performance could also have negative impacts (Textbox 1) on society that still need to be assessed.

Textbox 1. Positive and negative impacts of mental health as defined by digital phenotyping on society.

Positive impacts of patient-centered mental health

  • Precise, continuous, multidimensional psychometric assessment.
  • Deeper and more precise understanding of mental illnesses.
  • Faster and more accurate diagnosis of mental illnesses.
  • Better-adapted treatment based on each patient’s specificities, life history, and needs.
  • Better follow-up and prognosis of the mental illness over time and in changing conditions (eg, outside the hospital).
  • Patients are empowered to monitor the progression and root causes of their disease.
  • All in all: a precise, personalized (in the technical sense), patient-centered definition of mental health.

Negative impacts of dehumanized mental health

  • Nominalism and arbitrariness. It is unclear who or what decides the norms and standards beyond mere statistical means. Norms are disconnected from any lived experience.
  • Reproducing bias and systematic error in the understanding of mental illnesses: only relative objectivity.
  • Depriving patients of their agency, self-perception, and personal definition of what it means to be healthy and well.
  • Creation of arbitrary standards with a high risk of normativity, generating guilt and self-image issues.
  • Biopower: surveillance and privacy issues.
  • Alienated patients experience life in the second person after the technology and the disease.
  • All in all: decentering of the human being, disappearance of the human dimension of mental health.


Although DP introduces the possibility of individuals managing their own health data, increasing the amount of information available for them to assess their own health status, there might also be a process of reduction or simplification that would have massive consequences for patients’ self-determination. Moreover, as their health data would be more readily available to third parties, it might end up being used for other purposes besides improving their health.

Concerns With the Implementation of Digital Phenotyping in Psychiatry

The clinical interview is the moment when patients’ complaints are heard by HCPs and their distress is recognized. Their illness is discussed and diagnosed, and an appropriate treatment is prescribed. Subjective representations of individuals and HCPs are part of the decision-making system in psychiatry, shaping discussions about the individual's state of health and illness. In striving to improve the current state of health, a shared objective guides the HCPs’ decisions. However, considering the data yielded by DP analysis, the decision-making process in psychiatry might undergo profound transformations to align with the evolving understanding of mental health. Canguilhem [92] explained that the risk of nominalism is to reduce life to machine functioning. For example, a healthy individual may start to feel ill after seeing figures that deviate from the norm, while a help-seeker may be declared healthy by DP. The patient’s status may thus become disconnected from the felt experience [103], excluding self-construal, illness narratives, interpersonal dynamics, and social contexts, which are determinants of mental health [18]. The term data-driven (in society, marketing, health, etc) already exists and is used when decision-making is mainly based on data interpretation. The question of who creates the norms and indicators does not seem to be an issue at present. By default, these conditions are fixed either by researchers on the basis of their study findings or by the designers of health applications (eg, walking 10,000 steps per day, limiting screen time to 4 hours per day, and eating certain types of food selected by an algorithm). For example, a marketing campaign by Yamasa Corporation claimed that walking 10,000 steps per day is a factor for well-being. However, the goal was to promote a step tracker during the 1964 Tokyo Olympics. 
They called this technology manpo-kei, which in Japanese literally means “10,000 steps counter.” A systematic review acknowledged the association between walking and a reduction in all-cause mortality [104] but concluded that we can expect more benefits with every 1000 steps we walk beyond 8000. The 10,000-step target is therefore arbitrary and should be reconsidered. This example helps us understand that many recent applications continue to rely on approximations of scientific data [105,106]. At present, too many applications are dedicated to different types of data collection and analysis methods that are not well assimilated [40,107]. Using norms set by DP could deprive individuals of the possibility of discussing which norms they should strive toward in order to achieve better well-being. Studies on DP in psychiatry remain of limited scope [108]; none has yet been applied to clinical diagnosis and treatment or shown improvements in mental health over the long term. Some applications, such as Instagram, are already suspected of damaging users’ mental health [109].

Froment [103] pointed out that historically speaking, the term illness is not technical but undeniably profane and phenomenological. The science of medicine was built on a prescientific conception of illnesses but has gradually adopted concepts developed by HCPs and health authorities. If DP modifies these concepts, it could compromise patients’ quest to understand their mental health as well as the aim of the treatment. Beyond the very definition of mental health, there is an issue with the power of data over people’s freedom and way of life in society. It compromises the very possibility of innovative objectivity or may lead to the outright failure of any attempt to achieve objectivity through this technology.

Relative Objectivity

Datafication, which is defined as the tendency to overrepresent objects with digital data (eg, DP in psychiatry), has several detractors, especially when it comes to the possibility of data representing part of the real world [93,110-112]. For instance, several scales have been developed to quantify psychic pain, but they are very heterogeneous in terms of content and cannot be compared with or substituted for one another [112]. Data power may therefore paradoxically deplete reality [113]. Each of the many steps between data collection, analysis by an algorithm, and implementation by patients or HCPs is a source of bias [114-119]. Algorithmic bias emerges when databases are created without exhaustive data or when data is sourced from patients with coexisting illnesses [120,121]. Additionally, machine learning algorithms give rise to the black box effect (ie, the opacity of the internal processes of a system that produce outputs from inputs) [122]. This concern appears when users (eg, HCPs or patients) lack knowledge about the inner workings of the system, rendering them unable to explain the outcomes of an analysis [123]. Studies exploring AI-based diagnosis do not take either difficult cases or HCPs’ clinical experience into account. AI makes mistakes when HCPs perform poorly [124]. Machine learning searches for statistical invariants, whereas clinical experience provides tacit data (ie, heuristics coupled with models or concepts, used unconsciously by HCPs, and which are complex to objectify) that algorithms have difficulty considering, although they are at the core of clinical practice in psychiatry. Objectivity is a cultural and historical construct that has elicited different approaches over time and across cultures [125]. There has been a tendency to want to quantify illnesses ever since Claude Bernard introduced the nominalist approach denounced by Canguilhem [92].
Some see self-monitoring as the instrumentalization of health, promoting the proliferation of digital tracking tools with spurious ethical claims [126]. Even more so than other specialties, psychiatry may see itself as alienated by so-called algorithmic governmentality and, paradoxically, adopt the fantasy of illness-centered medicine curing psychiatric illnesses. Algorithmic governmentality does away with direct human interaction. Rouvroy and Stiegler [112] and Rouvroy and Berns [127] introduced this notion, defining it as derivative, with norm production reliant on massive data sets and regulations favored over humans’ anticipatory ability. With DP, psychiatry could therefore end up being driven by unreflective and determined rules instead of by the desire to create greater freedom to achieve goals. The risk is that it depends less on the HCP’s experience and allows less space for patients to express themselves [128]. DP may also lead us to a hostile world, driven by financial interests, where mental health becomes an unregulated market [129]. In addition, there is a risk of psychiatry becoming purely defensive [130], where medical tests or coercion are used under legal constraints. Finally, although DP represents an opportunity for patients to express themselves through technologies, it may bring more anxiogenic modalities of communication and hinder relationships [113], whereas such relationships are actually crucial to patients’ mental health care.

Alienation Instead of Empowerment

The opportunity for social and human progress through patients' empowerment in psychiatry remains unclear. DP represents an instrument of biopower (in the sense of Foucault's biopower [131,132]) for users, independently of its efficiency. The growing weight of digital data in decision-making risks dispossessing humans (HCPs and patients) of the tools for producing health data. It may create a dependence on production tools, as already observed in the world of scientific research. Stiegler [133] demonstrated how researchers' daily lives have been changed by the arrival of ever more digital interfaces between the objects they observe and the data required to produce knowledge, claiming that scientists are thereby deprived of their production, as they must pay private industry for the right to use their own discoveries. DP could easily spark a similar process in the mental health industry [134], where the production of feelings becomes a patentable technique, introducing a third party into the relationship established by the clinical interview. Finally, DP risks defining mental health goals and dictating the means of attaining them (Figure 2). Illnesses may become political, part of a biopower seeking social control [135-137]. The term mental disorder used in the DSM-5 (instead of illness or disease) speaks for itself, the implicit meaning being that good mental health corresponds to a supposed mental order: illnesses render individuals unfit and remove their autonomy, leaving them dependent on the quantified self or on medical authorities (specialists) with increasingly narrow areas of expertise. Indeed, the current trend in psychiatry is to develop expert centers in which patients undergo a single clinical assessment, resulting in a fast diagnosis and therapeutic guidelines. This paternalistic psychiatric assessment has little to do with the patient's own expertise, which only HCPs who provide follow-up and long-term care can fully know. The risk is that HCPs relying on this data set will reach their diagnosis too early, interfering with the treatment procedure and compromising the follow-up needed to cure psychiatric illnesses.

Figure 2. Conceptual illustration of the links created by the introduction of digital phenotyping into the health care relationship.

Ethical Issues

The concerns of patients whose intrinsic ability to normalize their environment is impaired are regarded as lying on the border between the normal and the pathological. DP could shift these borders by introducing a third party into the patient-HCP relationship (Figure 2), or a fourth party if the relationship includes a family member or trusted person [138]. This raises the issue of responsibility, which also concerns the DP designer, manager, or analyst, for technological tools are not autonomous. DP threatens to contravene the principles governing new technologies, namely neutrality, diversity, transparency, equity, loyalty, and overall comprehensibility.

Equity has been identified as one of the main ethical challenges of digital health [139]. Indeed, algorithms are built on complex models with user interfaces that are often difficult to understand. If users are not trained, DP will be misused owing to learning bias. HCPs and users could fall victim to the black box effect, which can hinder the traceability of the decision-making process. DP may well exacerbate the existing inequality of access to psychiatric care, penalizing patients from underprivileged backgrounds [94], on the wrong side of the digital divide [140], or exposed to the stigma [141] that extensively affects the psychiatric population [142]. Studies on the use of health data also tend to show that such tools are more acceptable to younger psychiatric patients [143] and that, in schizophrenia, adherence is predicted by a higher education level [144]. There is therefore a risk that these technological advances will only benefit individuals who already enjoy easier access to psychiatric care.

In psychiatry, respect for privacy and confidentiality is also an essential part of the medical principle of patient autonomy, but health digitalization has brought about a paradigm shift. The sharing of information between different HCPs is now considered beneficial for patients, and professional integrity is replacing confidentiality [145,146]. Data anonymization can easily be bypassed by cross-referencing. Personal health data thus threaten to move from the private to the public arena [139] or to be extensively traded by private companies, particularly in psychiatry, where intimate data are of utmost importance for clinical practice. DP indeed has a potential for intrusion that would run counter to medical confidentiality, raising questions about patients' consent to share their health data. Informed consent and data ownership are therefore among the main issues [95,139,147].

Phenomenology currently lies at the core of mental health but risks being dislodged. Good health may carry a moral imperative, with pathologies attributed either to alienation from the environment (nurture) or to biological predispositions (nature). When patients are freed of all responsibility in this way, failure ceases to exist and is replaced by illnesses, as has already happened with learning disorders, substance use disorders, and other deviances, ultimately leading to uneasiness [148]. Indeed, DP could stimulate nudging strategies, defined as “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives” [149]. Nudges could create artificial health but also artificial illnesses, compromising patients’ autonomy [150,151] and agency [152]; DP could thus become a barrier to meeting patients’ mental health needs even as it creates new ones. The drift toward normativity through the medicalization of society can already be observed in the cult of performance [153] and the illusion of omnipotence over illness and death. Built on the notion of personal development, the post-Freudian therapies that first emerged in the 1960s are intended to enable the immediate gratification of drives, thereby increasing the isolation of the self [154]. Their ultimate goal is for individuals to fulfill their personal achievement goals by worshiping authenticity and assertiveness, thereby exacerbating the very symptoms these therapies claim to cure. By further increasing the emphasis on performance, DP could contribute to the loss of identification with generational continuity: the inability of individuals to gradually identify with the wellness and success of others rather than their own and, through the fear of growing old, the inability to think about their posterity and the handing of the baton to the next generation. The COVID-19 pandemic may be a good example of this counterhistorical health trade-off between generations: the sacrifice of liberty on behalf of health, with the collateral damage of poorer mental health among young people [155,156].

More and more digital mental health apps are available in web-based marketplaces. However, their design is seldom informed by rigorous scientific research [105]. The popularity of these apps shows that there is consumer demand; the same need manifests itself when patients turn to complementary medicine because their physician fails to meet their expectations. We can assume that, through their use of these apps, individuals are not only responding to various marketing ploys but also seeking to improve their daily lives. The COVID-19 pandemic undoubtedly fostered this change in the relationship between individuals and connected health, but there is a dearth of qualitative research on the place that connected objects now occupy in people's daily lives [118].


DP gives psychiatry an opportunity to adopt more modern ways of helping patients. However, it raises crucial questions about the values underpinning the definition of mental health [157] and could profoundly undermine the “sacred trust” that patients place in their physician to understand their illness. All in all, interdisciplinary collaboration in DP research is necessary [25,158], bringing together expertise in psychiatry, computer science, data science, innovation, ethics, law, and the social sciences to ensure the development of robust and clinically meaningful DP tools [159]. Further recommendations are needed if there is to be a true revolution in mental health.

Empowerment of Patients and Health Care Professionals


Progress must be made in this area to ensure adequate usability, a considerable challenge owing to the properties of the data (ie, high volume, heterogeneity, noise, and sparseness) [33,108]. The Beiwe platform exemplifies the effort to gather and rationalize passive smartphone data to phenotype psychiatric illnesses [2]. Patients must be able to choose which DP tool they want to use, which data can be collected, and for what purpose, and they must have the right to withdraw [139].

Empathy of Care

DP could contribute to the empowerment of HCPs if its main objective is to facilitate their operational and administrative tasks and support their clinical decisions. At the very least, clinical decision support systems must always be subject to professional validation, and all the data used by the algorithm must be reported. AI-based data analysis should reduce HCPs' workload and allow them to focus on the Hippocratic aspects of care, making their relationships with patients more humanistic, empathic, and centered on the patient's individuality in terms of history, daily life, and symptoms, thereby creating more room for psychotherapeutic approaches. It goes without saying that this can only happen if the number of HCPs is not reduced by public health policies.

Cooperation Between Patients and Health Care Professionals

Just as DP gives individuals new tools for understanding themselves and their behaviors, it opens up new prospects for sharing a common and practical definition of mental health. This definition should be shared, personalized, and evolving. The objective of DP should be to foster better cooperation among HCPs, patients, and their environment and to enhance understanding of their interplay.

Maintaining a Critical Mindset

Information on treatments introduces cognitive bias during the decision-making phase [83]. DP should be used as a debiasing strategy, not simply to create yet another layer of information for HCPs. Patients and HCPs should nevertheless consider the risks of nudging associated with these technologies. DP constitutes only one of the data sets representing patients' mental health, which should instead remain a matter of sensitive, pragmatic, and rational reasoning between patients and HCPs over time.


For Patients

To counter the black box effect and the dispossession of health data, some authors recommend the use of explainable AI to set out the reasoning behind an algorithm's conclusions [160]. This would remove the potential obstacle to patients' and HCPs' empowerment posed by opaque algorithms. Few explainable AI models are currently available [160], but they promise to encompass ethics, security, and safety concerns [161]. In all cases, information and education need to be provided to help users understand DP, how it works, its limitations, and its potential failures.

For Health Care Providers

Data education must also be provided to HCPs. In France, medical students are not yet taught how digital devices can be used to improve patient health. Furthermore, we know that psychiatry residents mistrust digital culture [162]. This could be a major strategic error, as new generations will have to work in a global and competitive medical world.

For Designers

Finally, education must also be provided to the people who design and analyze DP applications. There is a lack of standards for building these tools, and their design needs to be aligned with patients' and HCPs' experiences.



Progress must be made on the security of the data collected [139]; otherwise, this technology will never be acceptable [138]. This is a particular priority in psychiatry, where security is a prerequisite for using data science. It is very much an ethical challenge: if we want this technology to uphold the fundamental principles of privacy, transparency, informed consent, accountability, and fairness [163], the security issue must be resolved within the next few years [147].


It is essential for DP to be available to the whole psychiatric population. The MindLogger platform for mobile mental health assessment is an example of such an initiative to democratize the development of mental health apps [164].

Data Rationalization and Research Priorities

Large-scale longitudinal data sets with standardized evaluation metrics are needed to assess the potential impact of these technologies, and the costs of developing such tools need to be set against their proven and expected benefits. Studies are needed to analyze the qualitative and quantitative impact of an electronic health society on patients [165,166]; they must ensure that the definition of mental health is not based on an artificial boundary between the normal and the pathological and does not become distanced from the patient's experience. The ethical aspects of digital health research need to be considered in every study: equity, replicability, privacy, and efficacy [167]. In particular, qualitative studies must be conducted to define the tools' contents (content validity) before these tools are validated statistically and psychometrically (structural validity, internal validity, cross-cultural validity, measurement error and reliability, and criterion validity) [93]. Transdisciplinary approaches, including phenomenology, must be adopted during this construction process so as not to lose sight of patient-reported outcomes. Concerning algorithms, machine learning, natural language processing, and expert systems are the most studied interventions [115]. To date, studies have focused on diagnosis, prognosis, risk management, or follow-up; have had strong biases; have not taken health care end users into account; and ultimately have not responded to users' needs, as shown by concerns about low engagement [143].


An independent committee is needed to provide a legal framework for the marketing of DP tools, based on recommendations covering their construction, validation, consideration of users' feedback, ethical considerations, and costs for society. DP could thus follow established norms of quality and safety and ultimately prove cost-effective and feasible [168].

At the turn of the 17th century, there was a move from prescientific and Hippocratic medicine to Promethean medicine, driven by scientific discoveries and technological advances in biology. The 20th century's paradigm consisted of using evidence-based medicine to bring about pragmatic progress. Medical practice continues to evolve: patients are once again regarded as experts in their own symptoms, and their preferences are given the same importance as external clinical data and medical experience. DP paves the way for a redefinition of mental health, making it more subjective while taking advantage of 21st-century technologies for preventive, predictive, effective, and personalized medicine. We have noted a certain enthusiasm for these new technologies among the general public, but HCPs remain skeptical, wondering whether the progress touted by private companies is really relevant for all patients and all HCPs. A qualitative study comparing patients' and HCPs' perspectives on the implementation of DP in psychiatry is now needed. DP calls into question the validity of the risk-benefit ratio, bringing another way of expressing and understanding illnesses. Thus, the challenge of DP will be to let patients access their own state of health, creating a new dimension of care in which the borders of mental health are extended, not constrained, by additional digital interfaces and their pitfalls. Algorithmic governmentality should not be used to decide whether individuals deserve mental health care. To conclude, the priority should be to improve patients' abilities to deal with their difficulties. DP has its place in psychiatry, fostering patients' empowerment in terms of their illnesses, their health, their own lives, and those of others.

Data Availability

Data sharing is not applicable to this article as no data sets were generated or analyzed during this study.

Authors' Contributions

AO, VA, RM, and SM contributed to the conceptualization of the manuscript. AO and VA wrote the original draft. VA, SM, and RM supervised the development of this study. All authors contributed to the review and editing of the manuscript. All authors have read and agreed to the submitted version of the manuscript.

Conflicts of Interest

FS is partially funded by Joy Ventures and Tiny Blue Dot Foundation. In the past years, FS founded and received compensation from BeSound SAS and Nested Minds LLC. SM has received speaker and consultant fees from Ethypharm, Bioserenity, and Angelini Pharma.

  1. Jain SH, Powers BW, Hawkins JB, Brownstein JS. The digital phenotype. Nat Biotechnol. 2015;33(5):462-463. [CrossRef] [Medline]
  2. Torous J, Kiang MV, Lorme J, Onnela JP. New tools for new research in psychiatry: a scalable and customizable platform to empower data driven smartphone research. JMIR Ment Health. 2016;3(2):e16. [FREE Full text] [CrossRef] [Medline]
  3. Piau A, Wild K, Mattek N, Kaye J. Current state of digital biomarker technologies for real-life, home-based monitoring of cognitive function for mild cognitive impairment to mild Alzheimer disease and implications for clinical care: systematic review. J Med Internet Res. 2019;21(8):e12785. [FREE Full text] [CrossRef] [Medline]
  4. Diao JA, Raza MM, Venkatesh KP, Kvedar JC. Watching Parkinson's disease with wrist-based sensors. NPJ Digit Med. 2022;5(1):73. [FREE Full text] [CrossRef] [Medline]
  5. Atreja A, Francis S, Kurra S, Kabra R. Digital medicine and evolution of remote patient monitoring in cardiac electrophysiology: a state-of-the-art perspective. Curr Treat Options Cardiovasc Med. 2019;21(12):92. [FREE Full text] [CrossRef] [Medline]
  6. Poh MZ, Loddenkemper T, Reinsberger C, Swenson NC, Goyal S, Sabtala MC, et al. Convulsive seizure detection using a wrist-worn electrodermal activity and accelerometry biosensor. Epilepsia. 2012;53(5):e93-e97. [FREE Full text] [CrossRef] [Medline]
  7. Heinemann L, Freckmann G, Ehrmann D, Faber-Heinemann G, Guerra S, Waldenmaier D, et al. Real-time continuous glucose monitoring in adults with type 1 diabetes and impaired hypoglycaemia awareness or severe hypoglycaemia treated with multiple daily insulin injections (HypoDE): a multicentre, randomised controlled trial. Lancet. 2018;391(10128):1367-1377. [CrossRef] [Medline]
  8. Wittink H, Nicholas M, Kralik D, Verbunt J. Are we measuring what we need to measure? Clin J Pain. 2008;24(4):316-324. [CrossRef] [Medline]
  9. Berche P. Ideal medicine does not exist. Trib Santé. 2012;37(4):29. [CrossRef]
  10. Boch AL. In: Supérieur DB, editor. Médecine technique, médecine tragique: Le tragique, sens et destin de la médecine moderne. Paris. Seli Arslan; 2009.
  11. Germain N, Aballéa S, Smela-Lipinska B, Pisarczyk K, Keskes M, Toumi M. Patient-reported outcomes in randomized controlled trials for overactive bladder: a systematic literature review. Value Health. 2018;21:S114. [FREE Full text] [CrossRef]
  12. Grad FP. The preamble of the constitution of the World Health Organization. Bull World Health Organ. 2002;80(12):981-984. [FREE Full text] [Medline]
  13. Aybek S, Nicholson TR, Zelaya F, O'Daly OG, Craig TJ, David AS, et al. Neural correlates of recall of life events in conversion disorder. JAMA Psychiatry. 2014;71(1):52-60. [FREE Full text] [CrossRef] [Medline]
  14. American Psychiatric Association; American Psychiatric Association; DSM-5 Task Force. Diagnostic and Statistical Manual of Mental Disorders: DSM-5. Arlington, VA. American Psychiatric Association; 2013.
  15. Demazeux S, Pidoux V. The RDoC project: the neuropsychiatric classification of tomorrow? Med Sci (Paris). 2015;31(8-9):792-796. [FREE Full text] [CrossRef] [Medline]
  16. Livesley WJ. A framework for integrating dimensional and categorical classifications of personality disorder. J Pers Disord. 2007;21(2):199-224. [CrossRef] [Medline]
  17. Insel T, Cuthbert B, Garvey M, Heinssen R, Pine DS, Quinn K, et al. Research Domain Criteria (RDoC): toward a new classification framework for research on mental disorders. Am J Psychiatry. 2010;167(7):748-751. [FREE Full text] [CrossRef] [Medline]
  18. Gómez-Carrillo A, Paquin V, Dumas G, Kirmayer LJ. Restoring the missing person to personalized medicine and precision psychiatry. Front Neurosci. 2023;17:1041433. [FREE Full text] [CrossRef] [Medline]
  19. Zimmerman M. Why hierarchical dimensional approaches to classification will fail to transform diagnosis in psychiatry. World Psychiatry. 2021;20(1):70-71. [FREE Full text] [CrossRef] [Medline]
  20. Kessler RC. The categorical versus dimensional assessment controversy in the sociology of mental illness. J Health Soc Behav. 2002;43(2):171-188. [Medline]
  21. Chekroud AM, Bondar J, Delgadillo J, Doherty G, Wasil A, Fokkema M, et al. The promise of machine learning in predicting treatment outcomes in psychiatry. World Psychiatry. 2021;20(2):154-170. [FREE Full text] [CrossRef] [Medline]
  22. Torous J, Bucci S, Bell IH, Kessing LV, Faurholt-Jepsen M, Whelan P, et al. The growing field of digital psychiatry: current evidence and the future of apps, social media, chatbots, and virtual reality. World Psychiatry. 2021;20(3):318-335. [FREE Full text] [CrossRef] [Medline]
  23. Mandryk RL, Birk MV. The potential of game-based digital biomarkers for modeling mental health. JMIR Ment Health. 2019;6(4):e13485. [FREE Full text] [CrossRef] [Medline]
  24. Raballo A. Digital phenotyping: an overarching framework to capture our extended mental states. Lancet Psychiatry. 2018;5(3):194-195. [CrossRef] [Medline]
  25. Smith KA, Blease C, Faurholt-Jepsen M, Firth J, Van Daele T, Moreno C, et al. Digital mental health: challenges and next steps. BMJ Ment Health. 2023;26(1):e300670. [FREE Full text] [CrossRef] [Medline]
  26. Galatzer-Levy IR, Onnela JP. Machine learning and the digital measurement of psychological health. Annu Rev Clin Psychol. 2023;19(1):133-154. [FREE Full text] [CrossRef] [Medline]
  27. Choudhary S, Thomas N, Ellenberger J, Srinivasan G, Cohen R. A machine learning approach for detecting digital behavioral patterns of depression using nonintrusive smartphone data (complementary path to patient health questionnaire-9 assessment): prospective observational study. JMIR Form Res. 2022;6(5):e37736. [FREE Full text] [CrossRef] [Medline]
  28. Montag C, Quintana DS. Digital phenotyping in molecular psychiatry-a missed opportunity? Mol Psychiatry. 2023;28(1):6-9. [FREE Full text] [CrossRef] [Medline]
  29. Torous J, Onnela JP, Keshavan M. New dimensions and new tools to realize the potential of RDoC: digital phenotyping via smartphones and connected devices. Transl Psychiatry. 2017;7(3):e1053. [FREE Full text] [CrossRef] [Medline]
  30. Koutsouleris N, Hauser TU, Skvortsova V, De Choudhury M. From promise to practice: towards the realisation of AI-informed mental health care. Lancet Digit Health. 2022;4(11):e829-e840. [FREE Full text] [CrossRef] [Medline]
  31. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019;25(1):44-56. [CrossRef] [Medline]
  32. McKinney SM, Sieniek M, Godbole V, Godwin J, Antropova N, Ashrafian H, et al. International evaluation of an AI system for breast cancer screening. Nature. 2020;577(7788):89-94. [CrossRef] [Medline]
  33. Arora A. Conceptualising artificial intelligence as a digital healthcare innovation: an introductory review. Med Devices (Auckl). 2020;13:223-230. [FREE Full text] [CrossRef] [Medline]
  34. Oliva F, Ostacoli L, Versino E, Pomeri AP, Furlan PM, Carletto S, et al. Compulsory psychiatric admissions in an Italian urban setting: are they actually compliant to the need for treatment criteria or arranged for dangerous not clinical condition? Front Psychiatry. 2018;9:740. [FREE Full text] [CrossRef] [Medline]
  35. Prakash J, Chaudhury S, Chatterjee K. Digital phenotyping in psychiatry: when mental health goes binary. Ind Psychiatry J. 2021;30(2):191-192. [FREE Full text] [CrossRef] [Medline]
  36. Williamson S. Digital phenotyping in psychiatry. BJPsych Adv. 2023:1-2. [FREE Full text] [CrossRef]
  37. Insel TR. Digital phenotyping: technology for a new science of behavior. JAMA. 2017;318(13):1215-1216. [CrossRef] [Medline]
  38. Insel TR. Digital phenotyping: a global tool for psychiatry. World Psychiatry. 2018;17(3):276-277. [FREE Full text] [CrossRef] [Medline]
  39. Jagesar RR, Roozen MC, van der Heijden I, Ikani N, Tyborowska A, Penninx BWJH, et al. Digital phenotyping and the COVID-19 pandemic: capturing behavioral change in patients with psychiatric disorders. Eur Neuropsychopharmacol. 2021;42:115-120. [FREE Full text] [CrossRef] [Medline]
  40. Mendes JPM, Moura IR, Van de Ven P, Viana D, Silva FJS, Coutinho LR, et al. Sensing apps and public data sets for digital phenotyping of mental health: systematic review. J Med Internet Res. 2022;24(2):e28735. [FREE Full text] [CrossRef] [Medline]
  41. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606-613. [FREE Full text] [CrossRef] [Medline]
  42. Carrozzino D, Patierno C, Fava GA, Guidi J. The Hamilton rating scales for depression: a critical review of clinimetric properties of different versions. Psychother Psychosom. 2020;89(3):133-150. [FREE Full text] [CrossRef] [Medline]
  43. Nisenson M, Lin V, Gansner M. Digital phenotyping in child and adolescent psychiatry: a perspective. Harv Rev Psychiatry. 2021;29(6):401-408. [CrossRef] [Medline]
  44. Gkotsis G, Oellrich A, Velupillai S, Liakata M, Hubbard TJP, Dobson RJB, et al. Corrigendum: characterisation of mental health conditions in social media using informed deep learning. Sci Rep. 2017;7(1):46813. [FREE Full text] [CrossRef] [Medline]
  45. Dahl H, Teller V, Moss D, Trujillo M. Countertransference examples of the syntactic expression of warded-off contents. Psychoanal Q. 1978;47(3):339-363. [Medline]
  46. Cho CH, Lee T, Kim MW, In HP, Kim L, Lee HJ. Mood prediction of patients with mood disorders by machine learning using passive digital phenotypes based on the circadian rhythm: prospective observational cohort study. J Med Internet Res. 2019;21(4):e11029. [FREE Full text] [CrossRef] [Medline]
  47. Valenza G, Nardelli M, Lanata' A, Gentili C, Bertschy G, Kosel M, et al. Predicting mood changes in bipolar disorder through heartbeat nonlinear dynamics. IEEE J Biomed Health Inform. 2016;20(4):1034-1043. [FREE Full text] [CrossRef] [Medline]
  48. Lee HJ, Cho CH, Lee T, Jeong J, Yeom JW, Kim S, et al. Prediction of impending mood episode recurrence using real-time digital phenotypes in major depression and bipolar disorders in South Korea: a prospective nationwide cohort study. Psychol Med. 2022:1-9. [CrossRef] [Medline]
  49. Strauss GP, Raugh IM, Zhang L, Luther L, Chapman HC, Allen DN, et al. Validation of accelerometry as a digital phenotyping measure of negative symptoms in schizophrenia. Schizophrenia (Heidelb). 2022;8(1):37. [FREE Full text] [CrossRef] [Medline]
  50. Lakhtakia T, Bondre A, Chand PK, Chaturvedi N, Choudhary S, Currey D, et al. Smartphone digital phenotyping, surveys, and cognitive assessments for global mental health: initial data and clinical correlations from an international first episode psychosis study. Digit Health. 2022;8:20552076221133758. [FREE Full text] [CrossRef] [Medline]
  51. Staples P, Torous J, Barnett I, Carlson K, Sandoval L, Keshavan M, et al. A comparison of passive and active estimates of sleep in a cohort with schizophrenia. NPJ Schizophr. 2017;3(1):37. [FREE Full text] [CrossRef] [Medline]
  52. Zulueta J, Piscitello A, Rasic M, Easter R, Babu P, Langenecker SA, et al. Predicting mood disturbance severity with mobile phone keystroke metadata: a BiAffect digital phenotyping study. J Med Internet Res. 2018;20(7):e241. [FREE Full text] [CrossRef] [Medline]
  53. Thomée S, Härenstam A, Hagberg M. Mobile phone use and stress, sleep disturbances, and symptoms of depression among young adults--a prospective cohort study. BMC Public Health. 2011;11(1):66. [FREE Full text] [CrossRef] [Medline]
  54. Zarate D, Stavropoulos V, Ball M, de Sena Collier G, Jacobson NC. Exploring the digital footprint of depression: a PRISMA systematic literature review of the empirical evidence. BMC Psychiatry. 2022;22(1):421. [FREE Full text] [CrossRef] [Medline]
  55. Taliaz D, Souery D. A new characterization of mental health disorders using digital behavioral data: evidence from major depressive disorder. J Clin Med. 2021;10(14):3109. [FREE Full text] [CrossRef] [Medline]
  56. Ettore E, Müller P, Hinze J, Riemenschneider M, Benoit M, Giordana B, et al. Digital phenotyping for differential diagnosis of major depressive episode: narrative review. JMIR Ment Health. 2023;10:e37225. [FREE Full text] [CrossRef] [Medline]
  57. Wahle F, Kowatsch T, Fleisch E, Rufer M, Weidt S. Mobile sensing and support for people with depression: a pilot trial in the wild. JMIR Mhealth Uhealth. 2016;4(3):e111. [FREE Full text] [CrossRef] [Medline]
  58. Maatoug R, Oudin A, Adrien V, Saudreau B, Bonnot O, Millet B, et al. Digital phenotype of mood disorders: a conceptual and critical review. Front Psychiatry. 2022;13:895860. [FREE Full text] [CrossRef] [Medline]
  59. Ebner-Priemer UW, Mühlbauer E, Neubauer AB, Hill H, Beier F, Santangelo PS, et al. Digital phenotyping: towards replicable findings with comprehensive assessments and integrative models in bipolar disorders. Int J Bipolar Disord. 2020;8(1):35. [FREE Full text] [CrossRef] [Medline]
  60. Bourla A, Mouchabac S, El Hage W, Ferreri F. e-PTSD: an overview on how new technologies can improve prediction and assessment of Posttraumatic Stress Disorder (PTSD). Eur J Psychotraumatol. 2018;9(Suppl 1):1424448. [FREE Full text] [CrossRef] [Medline]
  61. Schultebraucks K. Digital approaches for predicting posttraumatic stress and resilience: promises, challenges, and future directions. Eur Psychiatr. 2023;66(S1):S50-S50. [FREE Full text] [CrossRef]
  62. Jacobson NC, Bhattacharya S. Digital biomarkers of anxiety disorder symptom changes: personalized deep learning models using smartphone sensors accurately predict anxiety symptoms from ecological momentary assessments. Behav Res Ther. 2022;149:104013. [FREE Full text] [CrossRef] [Medline]
  63. Wong QJ, Werner-Seidler A, Torok M, van Spijker B, Calear AL, Christensen H. Service use history of individuals enrolling in a web-based suicidal ideation treatment trial: analysis of baseline data. JMIR Ment Health. 2019;6(4):e11521. [FREE Full text] [CrossRef] [Medline]
  64. Kleiman EM, Turner BJ, Fedor S, Beale EE, Picard RW, Huffman JC, et al. Digital phenotyping of suicidal thoughts. Depress Anxiety. 2018;35(7):601-608. [CrossRef] [Medline]
  65. Teo JX, Davila S, Yang C, Hii AA, Pua CJ, Yap J, et al. Digital phenotyping by consumer wearables identifies sleep-associated markers of cardiovascular disease risk and biological aging. Commun Biol. 2019;2:361. [FREE Full text] [CrossRef] [Medline]
  66. Ferreri F, Bourla A, Mouchabac S, Karila L. e-addictology: an overview of new technologies for assessing and intervening in addictive behaviors. Front Psychiatry. 2018;9:51. [FREE Full text] [CrossRef] [Medline]
  67. Goodday SM, Friend S. Unlocking stress and forecasting its consequences with digital technology. NPJ Digit Med. 2019;2:75. [FREE Full text] [CrossRef] [Medline]
  68. Feldman N, Perret S. Digital mental health for postpartum women: perils, pitfalls, and promise. NPJ Digit Med. 2023;6(1):11. [FREE Full text] [CrossRef] [Medline]
  69. Hahn L, Eickhoff SB, Habel U, Stickeler E, Schnakenberg P, Goecke TW, et al. Early identification of postpartum depression using demographic, clinical, and digital phenotyping. Transl Psychiatry. 2021;11(1):121. [FREE Full text] [CrossRef] [Medline]
  70. Washington P, Park N, Srivastava P, Voss C, Kline A, Varma M, et al. Data-driven diagnostics and the potential of mobile artificial intelligence for digital therapeutic phenotyping in computational psychiatry. Biol Psychiatry Cogn Neurosci Neuroimaging. 2020;5(8):759-769. [FREE Full text] [CrossRef] [Medline]
  71. Sequeira L, Battaglia M, Perrotta S, Merikangas K, Strauss J. Digital phenotyping with mobile and wearable devices: advanced symptom measurement in child and adolescent depression. J Am Acad Child Adolesc Psychiatry. 2019;58(9):841-845. [CrossRef] [Medline]
  72. Currey D, Torous J. Digital phenotyping data to predict symptom improvement and mental health app personalization in college students: prospective validation of a predictive model. J Med Internet Res. 2023;25:e39258. [FREE Full text] [CrossRef] [Medline]
  73. Reece AG, Danforth CM. Instagram photos reveal predictive markers of depression. EPJ Data Sci. 2017;6(1):15. [FREE Full text] [CrossRef]
  74. Eichstaedt JC, Smith RJ, Merchant RM, Ungar LH, Crutchley P, Preoţiuc-Pietro D, et al. Facebook language predicts depression in medical records. Proc Natl Acad Sci U S A. 2018;115(44):11203-11208. [FREE Full text] [CrossRef] [Medline]
  75. Cheng Q, Li TM, Kwok CL, Zhu T, Yip PS. Assessing suicide risk and emotional distress in Chinese social media: a text mining and machine learning study. J Med Internet Res. 2017;19(7):e243. [FREE Full text] [CrossRef] [Medline]
  76. Won HH, Myung W, Song GY, Lee WH, Kim JW, Carroll BJ, et al. Predicting national suicide numbers with social media data. PLoS One. 2013;8(4):e61809. [FREE Full text] [CrossRef] [Medline]
  77. Karmakar C, Luo W, Tran T, Berk M, Venkatesh S. Predicting risk of suicide attempt using history of physical illnesses from electronic medical records. JMIR Ment Health. 2016;3(3):e19. [FREE Full text] [CrossRef] [Medline]
  78. Barak-Corren Y, Castro VM, Javitt S, Hoffnagle AG, Dai Y, Perlis RH, et al. Predicting suicidal behavior from longitudinal electronic health records. Am J Psychiatry. 2017;174(2):154-162. [FREE Full text] [CrossRef] [Medline]
  79. Bedi G, Carrillo F, Cecchi GA, Slezak DF, Sigman M, Mota NB, et al. Automated analysis of free speech predicts psychosis onset in high-risk youths. NPJ Schizophr. 2015;1:15030. [FREE Full text] [CrossRef] [Medline]
  80. Elvevåg B, Foltz PW, Weinberger DR, Goldberg TE. Quantifying incoherence in speech: an automated methodology and novel application to schizophrenia. Schizophr Res. 2007;93(1-3):304-316. [FREE Full text] [CrossRef] [Medline]
  81. Currey D, Torous J. Digital phenotyping correlations in larger mental health samples: analysis and replication. BJPsych Open. 2022;8(4):e106. [FREE Full text] [CrossRef] [Medline]
  82. Solmi M, Radua J, Olivola M, Croce E, Soardo L, de Pablo GS, et al. Age at onset of mental disorders worldwide: large-scale meta-analysis of 192 epidemiological studies. Mol Psychiatry. 2022;27(1):281-295. [FREE Full text] [CrossRef] [Medline]
  83. Mouchabac S, Conejero I, Lakhlifi C, Msellek I, Malandain L, Adrien V, et al. Improving clinical decision-making in psychiatry: implementation of digital phenotyping could mitigate the influence of patient's and practitioner's individual cognitive biases. Dialogues Clin Neurosci. 2021;23(1):52-61. [FREE Full text] [CrossRef] [Medline]
  84. Hartmann R, Schmidt FM, Sander C, Hegerl U. Heart rate variability as indicator of clinical state in depression. Front Psychiatry. 2019;9:735. [FREE Full text] [CrossRef] [Medline]
  85. Lesnewich LM, Conway FN, Buckman JF, Brush CJ, Ehmann PJ, Eddie D, et al. Associations of depression severity with heart rate and heart rate variability in young adults across normative and clinical populations. Int J Psychophysiol. 2019;142:57-65. [FREE Full text] [CrossRef] [Medline]
  86. Kane JM, Aguglia E, Altamura AC, Gutierrez JLA, Brunello N, Fleischhacker WW, et al. Guidelines for depot antipsychotic treatment in schizophrenia. European Neuropsychopharmacology Consensus Conference in Siena, Italy. Eur Neuropsychopharmacol. 1998;8(1):55-66. [CrossRef] [Medline]
  87. Brietzke E, Hawken ER, Idzikowski M, Pong J, Kennedy SH, Soares CN. Integrating digital phenotyping in clinical characterization of individuals with mood disorders. Neurosci Biobehav Rev. 2019;104:223-230. [CrossRef] [Medline]
  88. Barnett I, Torous J, Staples P, Sandoval L, Keshavan M, Onnela JP. Relapse prediction in schizophrenia through digital phenotyping: a pilot study. Neuropsychopharmacology. 2018;43(8):1660-1666. [FREE Full text] [CrossRef] [Medline]
  89. Onnela JP, Rauch SL. Harnessing smartphone-based digital phenotyping to enhance behavioral and mental health. Neuropsychopharmacology. 2016;41(7):1691-1696. [FREE Full text] [CrossRef] [Medline]
  90. Faurholt-Jepsen M, Busk J, Bardram JE, Stanislaus S, Frost M, Christensen EM, et al. Mood instability and activity/energy instability in patients with bipolar disorder according to day-to-day smartphone-based data—an exploratory post hoc study. J Affect Disord. 2023;334:83-91. [CrossRef] [Medline]
  91. Potier R. The digital phenotyping project: a psychoanalytical and network theory perspective. Front Psychol. 2020;11:1218. [FREE Full text] [CrossRef] [Medline]
  92. Canguilhem G. The Normal and the Pathological. New York. Zone Books; 1989.
  93. Charvet C, Boutron I, Morvan Y, Le Berre C, Touboul S, Gaillard R, et al. How to measure mental pain: a systematic review assessing measures of mental pain. Evid Based Ment Health. 2022;25(4):e4. [FREE Full text] [CrossRef] [Medline]
  94. Sharon T. Self-tracking for health and the quantified self: re-articulating autonomy, solidarity, and authenticity in an age of personalized healthcare. Philos Technol. 2017;30(1):93-121. [FREE Full text] [CrossRef]
  95. Insel T. Digital mental health care: five lessons from Act 1 and a preview of Acts 2-5. NPJ Digit Med. 2023;6(1):9. [FREE Full text] [CrossRef] [Medline]
  96. Rauseo-Ricupero N, Henson P, Agate-Mays M, Torous J. Case studies from the digital clinic: integrating digital phenotyping and clinical practice into today's world. Int Rev Psychiatry. 2021;33(4):394-403. [CrossRef] [Medline]
  97. Simblett S, Matcham F, Siddi S, Bulgari V, di San Pietro CB, López JH, et al. RADAR-CNS Consortium. Barriers to and facilitators of engagement with mHealth technology for remote measurement and management of depression: qualitative analysis. JMIR Mhealth Uhealth. 2019;7(1):e11325. [FREE Full text] [CrossRef] [Medline]
  98. Stern E, Franchi JAM, Dumas G, Moreira J, Mouchabac S, Maruani J, et al. How can digital mental health enhance psychiatry? Neuroscientist. 2022:10738584221098603. [CrossRef] [Medline]
  99. Melcher J, Hays R, Torous J. Digital phenotyping for mental health of college students: a clinical review. Evid Based Ment Health. 2020;23(4):161-166. [FREE Full text] [CrossRef] [Medline]
  100. Berrouiguet S, Perez-Rodriguez MM, Larsen M, Baca-García E, Courtet P, Oquendo M. From eHealth to iHealth: transition to participatory and personalized medicine in mental health. J Med Internet Res. 2018;20(1):e2. [FREE Full text] [CrossRef] [Medline]
  101. Briffault X, Morgiève M, Courtet P. From e-Health to i-Health: prospective reflexions on the use of intelligent systems in mental health care. Brain Sci. 2018;8(6):98. [FREE Full text] [CrossRef] [Medline]
  102. Borsboom D. A network theory of mental disorders. World Psychiatry. 2017;16(1):5-13. [FREE Full text] [CrossRef] [Medline]
  103. Froment A. Maladie: Donner un sens. Paris. Editions des archives contemporaines; 2001.
  104. Hall KS, Hyde ET, Bassett DR, Carlson SA, Carnethon MR, Ekelund U, et al. Systematic review of the prospective association of daily step counts with risk of mortality, cardiovascular disease, and dysglycemia. Int J Behav Nutr Phys Act. 2020;17(1):78. [FREE Full text] [CrossRef] [Medline]
  105. Briffault X. Psychiatrie 3.0: Être soi et ses connexions. Arcueil. John Libbey; 2020.
  106. Wies B, Landers C, Ienca M. Digital mental health for young people: a scoping review of ethical promises and challenges. Front Digit Health. 2021;3:697072. [FREE Full text] [CrossRef] [Medline]
  107. Foltz PW, Rosenstein M, Elvevåg B. Detecting clinically significant events through automated language analysis: Quo imus? NPJ Schizophr. 2016;2:15054. [FREE Full text] [CrossRef] [Medline]
  108. Liang Y, Zheng X, Zeng DD. A survey on big data-driven digital phenotyping of mental health. Inf Fusion. 2019;52:290-307. [CrossRef]
  109. Lup K, Trub L, Rosenthal L. Instagram #instasad?: exploring associations among Instagram use, depressive symptoms, negative social comparison, and strangers followed. Cyberpsychol Behav Soc Netw. 2015;18(5):247-252. [CrossRef] [Medline]
  110. Lupton D. How does health feel? Towards research on the affective atmospheres of digital health. Digit Health. 2017;3:2055207617701276. [FREE Full text] [CrossRef] [Medline]
  111. Lupton D. M-health and health promotion: the digital cyborg and surveillance society. Soc Theory Health. 2012;10(3):229-244. [FREE Full text] [CrossRef]
  112. Rouvroy A, Stiegler B. The digital truth regime: from algorithmic governmentality to a new rule of law. Socio. 2015;4:113-140. [FREE Full text] [CrossRef]
  113. Turkle S. Alone Together: Why We Expect More from Technology and Less from Each Other. New York. Basic Books; 2011.
  114. Leonelli S. Scientific research and big data. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer Edition). Stanford, CA. Metaphysics Research Lab, Stanford University; 2020.
  115. Rahimi SA, Légaré F, Sharma G, Archambault P, Zomahoun HTV, Chandavong S, et al. Application of artificial intelligence in community-based primary health care: systematic scoping review and critical appraisal. J Med Internet Res. 2021;23(9):e29839. [FREE Full text] [CrossRef] [Medline]
  116. Haibe-Kains B, Adam GA, Hosny A, Khodakarami F, Massive Analysis Quality Control (MAQC) Society Board of Directors; Waldron L, et al. Transparency and reproducibility in artificial intelligence. Nature. 2020;586(7829):E14-E16. [FREE Full text] [CrossRef] [Medline]
  117. Fiske A, Prainsack B, Buyx A. Data work: meaning-making in the era of data-rich medicine. J Med Internet Res. 2019;21(7):e11672. [FREE Full text] [CrossRef] [Medline]
  118. Torous J, Myrick KJ, Rauseo-Ricupero N, Firth J. Digital mental health and COVID-19: using technology today to accelerate the curve on access and quality tomorrow. JMIR Ment Health. 2020;7(3):e18848. [FREE Full text] [CrossRef] [Medline]
  119. Birk RH, Samuel G. Digital phenotyping for mental health: reviewing the challenges of using data to monitor and predict mental health problems. Curr Psychiatry Rep. 2022;24(10):523-528. [CrossRef] [Medline]
  120. Walsh CG, Chaudhry B, Dua P, Goodman KW, Kaplan B, Kavuluru R, et al. Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence. JAMIA Open. 2020;3(1):9-15. [FREE Full text] [CrossRef] [Medline]
  121. Monteith S, Glenn T, Geddes J, Whybrow PC, Achtyes E, Bauer M. Expectations for Artificial Intelligence (AI) in psychiatry. Curr Psychiatry Rep. 2022;24(11):709-721. [FREE Full text] [CrossRef] [Medline]
  122. Latour B. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge. Harvard University Press; 1987.
  123. Burrell J. How the machine ‘thinks’: understanding opacity in machine learning algorithms. Big Data Soc. 2016;3(1):205395171562251. [FREE Full text] [CrossRef]
  124. Kiani A, Uyumazturk B, Rajpurkar P, Wang A, Gao R, Jones E, et al. Impact of a deep learning assistant on the histopathologic classification of liver cancer. NPJ Digit Med. 2020;3:23. [FREE Full text] [CrossRef] [Medline]
  125. Daston L, Galison P. Objectivity. New York. Zone Books; 2007.
  126. Ruckenstein M, Schüll ND. The datafication of health. Annu Rev Anthropol. 2017;46(1):261-278. [FREE Full text] [CrossRef]
  127. Rouvroy A, Berns T. Algorithmic governmentality and prospects of emancipation: disparateness as a precondition for individuation through relationships? Réseaux. 2013;177(1):163-196. [FREE Full text] [CrossRef]
  128. Prainsack B. Personalized Medicine: Empowered Patients in the 21st Century?. New York. New York University Press; 2017.
  129. Sharon T. From hostile worlds to multiple spheres: towards a normative pragmatics of justice for the Googlization of health. Med Health Care Philos. 2021;24(3):315-327. [FREE Full text] [CrossRef] [Medline]
  130. Bester JC. Defensive practice is indefensible: how defensive medicine runs counter to the ethical and professional obligations of clinicians. Med Health Care Philos. 2020;23(3):413-420. [CrossRef] [Medline]
  131. Foucault M. The Birth of the Clinic: An Archaeology of Medical Perception. Paris. Presses Universitaires De France; 1973.
  132. Foucault M. Discipline and Punish: The Birth of the Prison. Paris. Gallimard; 1975.
  133. Stiegler B. The Digital Truth: Research and Higher Education in the Age of Digital Technologies. Limoges. FYP éditions; 2018.
  134. Cosgrove L, Karter JM, Morrill Z, McGinley M. Psychology and surveillance capitalism: the risk of pushing mental health apps during the COVID-19 pandemic. J Humanist Psychol. 2020;60(5):611-625. [FREE Full text] [CrossRef]
  135. Fanon F, Farrington C. The Wretched of the Earth. New York. Grove Press; 2002.
  136. Conrad P. The shifting engines of medicalization. J Health Soc Behav. 2005;46(1):3-14. [CrossRef] [Medline]
  137. Conrad P. Medicalization and social control. Annu Rev Sociol. 1992;18:209-232. [CrossRef]
  138. Mouchabac S, Adrien V, Falala-Séchet C, Bonnot O, Maatoug R, Millet B, et al. Psychiatric advance directives and artificial intelligence: a conceptual framework for theoretical and ethical principles. Front Psychiatry. 2020;11:622506. [FREE Full text] [CrossRef] [Medline]
  139. Kilgallon JL, Tewarie IA, Broekman MLD, Rana A, Smith TR. Passive data use for ethical digital public health surveillance in a postpandemic world. J Med Internet Res. 2022;24(2):e30524. [FREE Full text] [CrossRef] [Medline]
  140. Too LS, Leach L, Butterworth P. Mental health problems and internet access: results from an Australian National Household survey. JMIR Ment Health. 2020;7(5):e14825. [FREE Full text] [CrossRef] [Medline]
  141. Henderson C, Evans-Lacko S, Thornicroft G. Mental illness stigma, help seeking, and public health programs. Am J Public Health. 2013;103(5):777-780. [FREE Full text] [CrossRef] [Medline]
  142. Insel T. Healing: Our Path from Mental Illness to Mental Health. New York. Penguin Press; 2022.
  143. Lopez-Morinigo JD, Barrigón ML, Porras-Segovia A, Ruiz-Ruano VG, Martínez ASE, Escobedo-Aedo PJ, et al. Use of ecological momentary assessment through a passive smartphone-based app (eB2) by patients with schizophrenia: acceptability study. J Med Internet Res. 2021;23(7):e26548. [FREE Full text] [CrossRef] [Medline]
  144. Raugh IM, James SH, Gonzalez CM, Chapman HC, Cohen AS, Kirkpatrick B, et al. Digital phenotyping adherence, feasibility, and tolerability in outpatients with schizophrenia. J Psychiatr Res. 2021;138:436-443. [FREE Full text] [CrossRef] [Medline]
  145. Lo B. Professionalism in the age of computerised medical records. Singapore Med J. 2006;47(12):1018-1022. [FREE Full text] [Medline]
  146. Satkoske VB, Parker LS. Practicing preventive ethics, protecting patients: challenges of the electronic health record. J Clin Ethics. 2010;21(1):36-38. [Medline]
  147. Martinez-Martin N, Greely HT, Cho MK. Ethical development of digital phenotyping tools for mental health applications: Delphi study. JMIR Mhealth Uhealth. 2021;9(7):e27343. [FREE Full text] [CrossRef] [Medline]
  148. Cederström C, Spicer A. The Wellness Syndrome. Cambridge. Polity Press; 2015.
  149. Thaler RH, Sunstein CR. Nudge: Improving Decisions about Health, Wealth and Happiness. New Haven. Yale University Press; 2008.
  150. Simkulet W. Nudging, informed consent and bullshit. J Med Ethics. 2018;44(8):536-542. [FREE Full text] [CrossRef] [Medline]
  151. Mills C. Why nudges matter: a reply to Goodwin. Politics. 2012;33(1):28-36. [CrossRef]
  152. Schubert C. On the ethics of public nudging: autonomy and agency. SSRN Electron J. 2015:1-26. [FREE Full text] [CrossRef]
  153. Garcia T. The Life Intense: A Modern Obsession. Edinburgh. Edinburgh University Press Ltd; 2018.
  154. Lasch C. The Culture of Narcissism: American Life in an Age of Diminishing Expectations. New York. Warner Books; 1979.
  155. Guessoum SB, Lachal J, Radjack R, Carretier E, Minassian S, Benoit L, et al. Adolescent psychiatric disorders during the COVID-19 pandemic and lockdown. Psychiatry Res. 2020;291:113264. [FREE Full text] [CrossRef] [Medline]
  156. Panchal U, de Pablo GS, Franco M, Moreno C, Parellada M, Arango C, et al. The impact of COVID-19 lockdown on child and adolescent mental health: systematic review. Eur Child Adolesc Psychiatry. 2023;32(7):1151-1177. [FREE Full text] [CrossRef] [Medline]
  157. Birk R, Lavis A, Lucivero F, Samuel G. For what it's worth. Unearthing the values embedded in digital phenotyping for mental health. Big Data Soc. 2021;8(2):205395172110473. [FREE Full text] [CrossRef]
  158. Chia AZR, Zhang MWB. Digital phenotyping in psychiatry: a scoping review. Technol Health Care. 2022;30(6):1331-1342. [CrossRef] [Medline]
  159. Seiferth C, Vogel L, Aas B, Brandhorst I, Carlbring P, Conzelmann A, et al. How to e-mental health: a guideline for researchers and practitioners using digital technology in the context of mental health. Nat Mental Health. 2023;1(8):542-554. [FREE Full text] [CrossRef]
  160. Balcombe L, De Leo D. Digital mental health challenges and the horizon ahead for solutions. JMIR Ment Health. 2021;8(3):e26811. [FREE Full text] [CrossRef] [Medline]
  161. Arrieta AB, Díaz-Rodríguez N, Del Ser J, Bennetot A, Tabik S, Barbado A, et al. Explainable Artificial Intelligence (XAI): concepts, taxonomies, opportunities and challenges toward responsible AI. Inf Fusion. 2020;58:82-115. [CrossRef]
  162. Bourla A, Ferreri F, Ogorzelec L, Peretti CS, Guinchard C, Mouchabac S. Psychiatrists' attitudes toward disruptive new technologies: mixed-methods study. JMIR Ment Health. 2018;5(4):e10240. [FREE Full text] [CrossRef] [Medline]
  163. Martinez-Martin N, Insel TR, Dagum P, Greely HT, Cho MK. Data mining for health: staking out the ethical territory of digital phenotyping. NPJ Digit Med. 2018;1:68. [FREE Full text] [CrossRef] [Medline]
  164. Klein A, Clucas J, Krishnakumar A, Ghosh SS, Van Auken W, Thonet B, et al. Remote digital psychiatry for mobile mental health assessment and therapy: MindLogger platform development study. J Med Internet Res. 2021;23(11):e22369. [FREE Full text] [CrossRef] [Medline]
  165. Birk RH, Samuel G. Can digital data diagnose mental health problems? A sociological exploration of 'digital phenotyping'. Sociol Health Illn. 2020;42(8):1873-1887. [CrossRef] [Medline]
  166. Lupton D. Self-tracking, health and medicine. Health Sociol Rev. 2016;26(1):1-5. [FREE Full text] [CrossRef]
  167. Torous J, Benson NM, Myrick K, Eysenbach G. Focusing on digital research priorities for advancing the access and quality of mental health. JMIR Ment Health. 2023;10:e47898. [FREE Full text] [CrossRef] [Medline]
  168. Huckvale K, Venkatesh S, Christensen H. Toward clinical digital phenotyping: a timely opportunity to consider purpose, quality, and safety. NPJ Digit Med. 2019;2:88. [FREE Full text] [CrossRef] [Medline]

AI: artificial intelligence
DP: digital phenotyping
DSM-5: Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition
HCP: health care professional
RDoC: Research Domain Criteria

Edited by T de Azevedo Cardoso; submitted 22.11.22; peer-reviewed by S Markham, N Jacobson, T Martino, J Chen, R Bluhm; comments to author 31.03.23; revised version received 10.07.23; accepted 21.08.23; published 04.10.23.


©Antoine Oudin, Redwan Maatoug, Alexis Bourla, Florian Ferreri, Olivier Bonnot, Bruno Millet, Félix Schoeller, Stéphane Mouchabac, Vladimir Adrien. Originally published in the Journal of Medical Internet Research, 04.10.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.