Experiences of Using Digital Mindfulness-Based Interventions: Rapid Scoping Review and Thematic Synthesis

Background Digital mindfulness-based interventions (MBIs) are a promising approach to deliver accessible and scalable mindfulness training and have been shown to improve a range of health outcomes. However, the success of digital MBIs is reliant on adequate engagement, which remains a crucial challenge. Understanding people’s experiences of using digital MBIs and identifying the core factors that facilitate or act as barriers to engagement is essential to inform intervention development and maximize engagement and outcomes. Objective This study aims to systematically map the literature on people’s experiences of using digital MBIs that target psychosocial variables (eg, anxiety, depression, distress, and well-being) and identify key barriers to and facilitators of engagement. Methods We conducted a scoping review to synthesize empirical qualitative research on people’s experiences of using digital MBIs. We adopted a streamlined approach to ensure that the evidence could be incorporated into the early stages of intervention development. The search strategy identified articles with at least one keyword related to mindfulness, digital, user experience, and psychosocial variables in their title or abstract. Inclusion criteria specified that articles must have a qualitative component, report on participants’ experiences of using a digital MBI designed to improve psychosocial variables, and have a sample age range that at least partially overlapped with 16 to 35 years. Qualitative data on user experience were charted and analyzed using inductive thematic synthesis to generate understandings that go beyond the content of the original studies. We used the Quality of Reporting Tool to critically appraise the included sources of evidence. Results The search identified 510 studies, 22 (4.3%) of which met the inclusion criteria. 
Overall, the samples were approximately 78% female and 79% White; participants were aged between 16 and 69 years; and the most used measures in intervention studies were mindfulness, psychological flexibility, and variables related to mental health (including depression, anxiety, stress, and well-being). All studies were judged to be adequately reported. We identified 3 themes characterizing barriers to and facilitators of engagement: responses to own practice (ie, negative reactions to one’s own practice are common and can deplete motivation), making mindfulness a habit (ie, creating a consistent training routine is essential yet challenging), and leaning on others (ie, those engaging depend on someone else for support). Conclusions The themes identified in this review provide crucial insights as to why people frequently stop engaging with digital MBIs. Researchers and developers should consider using person-based coparticipatory methods to improve acceptability of and engagement with digital MBIs, increase their effectiveness, and support their translation to real-world use. Such strategies must be grounded in relevant literature and meet the priorities and needs of the individuals who will use the interventions.


Introduction
Mindfulness involves: (a) attentional monitoring of present-moment experience (e.g., thoughts, feelings, sensations) and (b) orienting towards this experience with acceptance and non-judgement [1]. Mindfulness-based interventions (MBIs) aim to train these skills and have been shown to improve a range of psychological and physical health outcomes in both clinical and non-clinical populations. For example, evidence from meta-analyses of randomised controlled trials suggests that MBIs can reduce depression and anxiety/stress in young people [2], lower pain intensity in patients with chronic pain [3], and reduce symptoms of post-traumatic stress in people with and without a diagnosis [4].
Despite such efficacy, there are numerous challenges to accessing and delivering MBIs, including geographical, logistical, and financial constraints, and a lack of trained mindfulness teachers [5,6]. For example, MBIs are typically face-to-face, multi-session, and facilitated by expert interventionists, such as the mindfulness-based cognitive therapy (MBCT) course that is traditionally delivered by dedicated instructors in eight weekly 2-hr group-training sessions [7]. The translation of MBIs into digital formats has the potential to overcome these constraints, and it is encouraging that early evaluations of digital MBIs report beneficial effects which are comparable to those found with traditional in-person programmes [8,9].
Unfortunately, however, the success of digital MBIs is reliant on adequate engagement, which remains a crucial challenge. Engagement refers to the investment of energy in an activity and includes physical (i.e., actual performance, which researchers often rely on when examining engagement using objective behavioural metrics [10]), affective (i.e., affective reactions), and cognitive (i.e., selective attention) elements [11]. For example, reviews of digital MBIs have found that between 8% and 52-60% of participants do not complete all sessions [9,12]. Although low engagement is a common issue in digital mental health interventions generally [13] (for example, the pooled completion rate from studies of apps for depressive symptoms is 52% [14]), it is particularly important in mindfulness training because regular practice is essential to develop mindfulness skills. Time spent practising mindfulness at home is related to increases in levels of mindfulness and, in turn, improvements in psychological functioning [15]. Similarly, those who report high levels of engagement in digital MBIs report greater improvement in outcomes than those who do not [12].
Given that the success of digital MBIs is related to engagement, and engagement tends to be low in digital MBIs, understanding the factors that facilitate or act as a barrier to engagement in these interventions is crucial to promote engagement and opportunities to benefit. Past research has suggested that there are a range of factors which influence adherence to digital MBIs [5], including accessibility (e.g., across devices and populations with different needs), tailoring (e.g., of content to individual needs), and difficulty (e.g., with sustaining attention). In one study, after engaging in a digital MBI, students with no meditation experience reported that the top three obstacles to practice from a checklist of common challenges were: meditation feeling like "just another task", "feeling distracted", and "feeling sleepy" [16]. However, the use of closed-response questions in such research potentially prohibits the development of a detailed understanding that is grounded in people's own perspectives regarding aspects that help or hinder them from engaging [17].
A more detailed approach, using inductive qualitative analysis, examined factors that hindered or facilitated the engagement of 16 healthcare professionals who participated in a self-help MBI (participants could choose a printed book or online programme) [18]. Results indicated that longer practices, arising negative thoughts, and self-criticism were key hindrances, and shorter practices, motivation to reduce stress, and feelings of control over thoughts were key facilitators. However, over half of participants opted for the book-based intervention in this study, and themes identified from engaging with the online and book-based MBIs were combined. Although the authors reported that themes were comparable across the intervention types, it is possible that barriers and facilitators specific to the online version were obscured by those common to both. It is therefore unclear if these themes would apply to typical digital MBIs, as well as to other populations (e.g., groups who are vulnerable to or experiencing clinical-level concerns, or for whom initial engagement is lower).
While some studies have reported on factors that can influence engagement in digital MBIs, they rarely build a deep understanding of users' experiences, nor do they do so systematically.
User-centred design approaches (such as the person-based approach [19]) emphasise that understanding how people use digital MBIs and identifying core barriers and facilitators to engagement are important first steps in intervention development, which suggest key design objectives to ensure interventions are relevant, acceptable, and engaging to target users, before significant investment is made in evaluation and implementation [20]. This is particularly important in the context of digital mindfulness interventions because, unlike most digital health interventions, engagement in the digital content is designed to facilitate completion of a concurrent non-digital target behaviour that is metacognitive in nature (e.g., an experiential mindfulness exercise) [11]. Since factors influencing engagement vary across different target behaviours, clear guidance is needed to understand which are directly relevant to, and most prominent in, digital MBIs specifically.
This review aimed to synthesise qualitative evidence on individuals' experiences of using digital MBIs targeting psychosocial variables (e.g., anxiety, depression, distress, well-being) to identify key barriers and facilitators to engagement. We chose to perform a rapid scoping review of qualitative data because: (a) factors influencing the effects of interventions are often rooted in variations in attitudes, opinions, thoughts, feelings, and behaviours, and therefore best explored through qualitative study [21]; (b) qualitative evidence is necessary to understand engagement in its entirety (i.e., its physical, cognitive, and affective components [11]); and (c) it ensures that existing evidence can be incorporated into the early stages of intervention development and implementation [22,23]. The knowledge generated from this review will inform the evaluation and development of new and existing digital MBIs, helping them to overcome some of the challenges that individuals face when engaging in these interventions.

Methods
We adhere to the enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) guidelines [24] in reporting this review, and the review itself was guided by the Cochrane rapid review methods recommendations [25] and the PRISMA guidelines for scoping reviews [26] (Multimedia Appendix 1). We developed and pre-registered an a priori protocol that specified the review questions (What are the key barriers and facilitators to engagement in digital MBIs targeting psychosocial variables? How have interventions addressed and used these barriers and facilitators in the past, and in what ways could interventions address and use them in future?); participants, intervention, comparison, outcome, study design (PICOS); electronic database; search strategy; inclusion/exclusion criteria; and data charting form [27].

Inclusion and Exclusion Criteria
The inclusion and exclusion criteria were developed to identify qualitative explorations of individuals' perspectives and experiences of using digital MBIs designed to improve psychosocial variables (Table 1). We excluded studies that did not refer to a digital online intervention (e.g., a biofeedback headband and a device based on vapor, light, and sound, both designed to support mindful breathing) and studies of interventions in which mindfulness was not the main part (e.g., an intervention composed of three evidence-based techniques: cognitive behavioural coaching, motivational interviewing, and mindfulness). We specified that sample age ranges must at least partially overlap with 16-35 years because this is the target age group for our own intervention development. We defined "digital" MBIs as those delivered online by the technology itself (e.g., hardware and electronic devices, software, websites) rather than by healthcare professionals remotely [28]. Human support (e.g., answering questions; providing feedback; offering coaching, orientation, or check-in sessions) was permitted where the support was considered supplementary to the delivery of content, and we report on the presence and format of such support in each included study. We focused on peer-reviewed papers because they will have received some initial quality assessment. Non-reporting bias [29] was minimised in this review because its focus was on generating themes related to engagement rather than estimating effects (i.e., we did not extract quantitative results and included studies with no reported quantitative outcomes).

Table 1
Inclusion and exclusion criteria for selected articles.

Type of publication
Inclusion: Peer-reviewed empirical article (i.e., original research based on observation or experiment).
Exclusion: Not peer-reviewed, or is a review article (i.e., does not contain original research).

Language
Inclusion: Published in English.
Exclusion: Not published in English.

Study design
Inclusion: Qualitative or mixed methods study, or an intervention study with a qualitative component (including free text from questionnaire surveys). May report on a full-scale or pilot-scale project.
Exclusion: Does not include a qualitative component (including free text from questionnaire surveys).

Phenomena of interest
Inclusion: Any information on experiences of using a digital online mindfulness-based intervention (a research or commercially available intervention in which mindfulness is the main part) designed to improve psychosocial variables (i.e., not interventions that solely target physiological variables). If an intervention study, must use psychosocial outcome or process measures.
Exclusion: Does not include any information on experiences of using a digital online mindfulness-based intervention (an intervention in which mindfulness is the main part), or is an intervention study that does not use psychosocial outcome or process measures.

Participants
Inclusion: Sample age range at least partially overlaps with 16-35 years.
Exclusion: Sample age range is entirely <16 years and/or >35 years.

Search Strategy
In consultation with an information specialist (a psychology librarian with extensive training in implementing structured database searches), we developed a comprehensive search strategy to identify articles with at least one keyword related to: (a) mindfulness, (b) digital, (c) user experience, and (d) psychosocial variables, in their title or abstract (Table 2).
Keywords for psychosocial variables were derived from models of disordered eating [30] (i.e., specific focus for our own intervention development), with added terms to broaden the search for all psychosocial variables (e.g., affect, mood, distress, well-being).

Screening
We uploaded the search results to Covidence, an online systematic review software, to streamline the screening process. Consistent with guidance from the Agency for Healthcare Research and Quality [31], we started with a pilot phase to calibrate and test the eligibility criteria. Two researchers independently screened a random selection of 50 studies (10% of records), then met to resolve discrepancies (Multimedia Appendix 2). The first author screened the remaining titles and abstracts. All potentially eligible records were obtained as full text articles. We requested full texts via our institution's inter-library loan service if unavailable online. The first author screened full texts for inclusion in consultation with the wider research team, and the research team verified the final list of included articles.

Data Charting
We used a pilot-tested form to record study characteristics and qualitative data on user experience (Multimedia Appendix 3). Two researchers independently charted data from a full text using a template adapted from the example evidence table for qualitative studies developed by NICE [32], then met to discuss inconsistencies and improvements (Multimedia Appendix 4). The first author charted the remaining data. We took an inclusive approach, charting qualitative data from any study type, such as qualitative data from qualitative studies (i.e., studies that used a qualitative method of data collection and analysis), narrative data from qualitative components of mixed-methods studies, and free text from questionnaire surveys, because various types of qualitative evidence can enrich a synthesis [23]. In the present study, charted qualitative data included quotations from participants, and themes, theory, and interpretations generated by the study's authors. These data were presented as narrative or summarised in tables, and were located in the abstract, results, and discussion sections. We charted all qualitative data related to user experience as verbatim quotations. Multimedia Appendix 5 provides a 17-page excerpt from our extensive data charting table.

Critical Appraisal
We used the quality of reporting tool (QuaRT [33]) to critically appraise the included sources of evidence. The reporting of each study was appraised using four criteria: (a) study design and question, (b) participant selection, (c) data collection, and (d) analysis. We assessed all qualitative studies overall (i.e., as a whole), and all remaining papers (i.e., mixed-methods studies/questionnaire surveys) both overall and considering only qualitative data on user experience (i.e., data included in our qualitative evidence synthesis). After pilot testing the tool with two reviewers, a single reviewer categorised studies as "adequately reported" (satisfied at least two criteria) or "inadequately reported" (satisfied one or no criteria), and the first author verified all judgments and supporting evidence. These criteria have been used in other validated tools (e.g., they represent items 3, 4, 5, and 8 from the Critical Appraisal Skills Programme [CASP] qualitative checklist [34]) and in a review of barriers and facilitators to engagement in digital mental health interventions [13].

Data Analysis
As recommended in the Cochrane Handbook for Systematic Reviews of Interventions [23], we thematically synthesised the charted qualitative data [35]. Thematic synthesis offers a clear and accessible inductive approach to produce descriptive themes that can evolve beyond the content of the primary studies into more in-depth analytic themes. The first author imported all charted qualitative data verbatim into NVivo qualitative data analysis software and freely coded the data line-by-line according to their meaning and content, using words directly from the data where possible. Since qualitative evidence syntheses have received criticism for decontextualising the findings of individual studies [35], the first author read all the charted data (including study aims, methods, and sample) prior to coding each study's findings to preserve its original context and ensure its findings could be fully understood without misinterpretation [36]. The first author then grouped similar codes into "descriptive themes" to summarise their meaning while keeping close to the original findings of the included studies. This was an iterative process that distilled users' perspectives and experiences of using digital MBIs down into their key parts. In the next stage, the wider research team met to discuss the descriptive themes and develop "analytical themes", which go beyond the findings of the primary studies by interpreting the key messages underlying the descriptive themes and using these to answer the review questions. We generated more abstract and analytical themes through an iterative process of inferring barriers, facilitators, and implications for intervention development from the descriptive themes, and making changes to these where necessary. Multimedia Appendix 6 provides more details about the analysis, including a four-page excerpt from our list of codes, the full list of descriptive themes, and a comprehensive example of how we generated analytical themes.

Methodological Streamlining
We took several steps to accelerate the review process so that evidence could be quickly incorporated into the initial phase of intervention planning [37]. First, we limited the inclusion criteria to English-language publications [25]. Second, we restricted searching to PsycInfo as an efficient way to achieve a manageable amount of relevant data, i.e., by using a specialist database for psychological interventions [38] to retrieve the studies most suitable for answering our review questions. This was necessary given that (a) a large number of included studies can yield too much data and undermine qualitative evidence syntheses, and (b) other methods of limiting the number of included studies are time and resource intensive (e.g., purposive sampling [39]). Qualitative evidence syntheses aim to understand the phenomenon of interest in a context rather than aggregate data from large representative samples of studies to achieve statistical generalisability [39]; therefore, we do not anticipate this impacting the findings of this review. Third, one reviewer performed full screening and data charting. We minimised the potential for increased errors and lower reproducibility by piloting forms, estimating interrater reliability, and consulting with the wider research team.
Multimedia Appendix 7 provides more detail about our streamlined approach.

Study Selection
The search identified 510 records. Of these, 79 were included for full text review, and 22 met the inclusion criteria. The search was conducted on 13/09/2021, with a supplementary search on 30/11/2021 prior to analysis (Multimedia Appendix 8).

Study Characteristics
Detailed characteristics of the included studies are described in Table 3. An overview of these characteristics is provided below.

Year and Country
The 22 studies were published between 2010 and 2022, with the majority (77%) published from 2017 onwards.The studies were primarily from the USA (50%), Europe (27%), and Australia (18%).Multimedia Appendix 9 contains details of year and country.

Outcomes
In intervention studies, the most used outcome and process measures were mindfulness, psychological flexibility, and variables related to mental health, including depression, anxiety, stress, and well-being.

Methods
Most studies used in-depth interviews (55%) and/or self-completion questionnaires with open response categories (55%) to collect data, and other studies (14%) used focus groups.The studies primarily used thematic analysis (45%) to analyse data, but other methods included content analysis (27%), descriptive or inferential statistics (18%), and grounded theory (9%).

Critical Appraisal
All studies were assessed as adequately reported (Multimedia Appendix 10), including qualitative studies (n = 8) and mixed-methods studies/questionnaire surveys when evaluated both as a whole and with respect to qualitative data on user experience only (n = 14). Overall, each study reported on study design and question, participant selection, data collection, and analysis. When we evaluated mixed-methods studies/questionnaire surveys considering only data included in our qualitative evidence synthesis, 7 studies did not give details of the analysis method (e.g., authors reviewed open responses for common themes without reference to or full description of the method) and 1 study did not describe data collection sufficiently.

Qualitative Synthesis
We identified three themes: Responses to Own Practice, Making Mindfulness a Habit, and Leaning on Others.

Responses to Own Practice

A prominent aspect of this theme was that participants reported feeling puzzled and confused by the effects they experienced, and questioning the accuracy of their training (e.g., when they fell asleep, whether brief practices "count" or they had "permission" to do a briefer practice when short of time, whether they were in the correct position).
I always want to do things right, and I wasn't sure about how I did the meditation exercises in the beginning. Is this the way I am supposed to do this? -(participant [43]).

When I listen, I have a feeling that I do not quite understand what should happen during the meditation -(participant [51]).

Not knowing exactly what was expected in terms of program structure and training dose (despite information), and lack of adherence towards the recommended dose sometimes induced a sense of insecurity as to whether one was doing the training properly and actually benefiting from it or taking it seriously enough. This could deplete motivation -(author [57]).

I'm worried whether I am doing the practices correctly -(participant [44]).
It's really good to have that permission, so to say. I did do the 3-minute breathing space a few times, but I guess I was thinking that wasn't really doing the homework because it is so brief. It's good to know that "counts" -(participant [44]).

This led to desires for feedback about whether participants had performed training properly and an additional brief "overview" tutorial to aid memory in instances of insecurity [57].

Making Mindfulness a Habit
Another prominent notion was that establishing a consistent training routine is not only an essential part of digital MBIs, but one that requires resolution, perseverance, and self-discipline. Participants recognised that being successful in creating a routine and integrating mindfulness into their lives made regular practice easier, and that regular practice was important when learning a new skill like mindfulness.

It was difficult in that you had to carve out the time really consistently, but it was also really valuable. I don't think the program would be as effective if you weren't being asked to do it daily. What I understand is you're trying to develop a habit -(participant [41]).
To manage the issue of dwindling enthusiasm, the participants made two suggestions.

First, it was important to practise more to make it become a natural habit. […] setting aside time each day for lying down and practising the exercises before sleep and even during the daytime whenever possible, no matter how short the exercise was, could help them build up their perseverance -(author [60]).
However, seeing the value of making mindfulness a habit was not enough to meet the responsibility. Participants reported needing to persist and grapple with the effortful task of making practice a scheduled activity, which involved frequent adjustments to their plans, priorities, and commitments.

You just have to work for it if this is something that you want -(participant [54]).
It's a question of discipline […] I think one should pinpoint that it's strenuous and that one has to be ready to struggle with it because one believes in it -(participant [56]).
Leaning on Others

Participants consistently reported that contact from the programme (e.g., automated reminders, messages of encouragement, personalised feedback, via email, text, or phone call) was helpful in reminding and motivating participants to practise without feeling intrusive.
A consistent message from all interviewees was that any form of feedback or communication from the programme was likely to improve retention.In addition to forms of feedback already mentioned, email (even if automated and using a 'no-reply' address), and text message reminders, were thought to be likely to be helpful without being intrusive -(author [49]).
I enjoyed the reminders that the app sends you -I really found that helpful because otherwise, I would not have remembered to do it -(participant [54]).
Similarly, having a programme "support person" was considered essential. Many valued the existence of an individual (e.g., instructor, coach, therapist, member of the research team) with whom they could discuss programme concepts and receive technical or administrative support. Participants felt it was reassuring to know someone was available if needed, whether via phone, email, or an "Ask a Question" or "Help" function.
All participants saw the value of having a support person available who was only a phone call or email away.Some participants mentioned more frequent interactions with the support person and even those who did not use the support reported that it was an important asset of the program -(author [41]).
Many endorsed that it was "essential" to have a coach and helpful to know that one was available if needed -(author [44]).
Another main expression of this theme was not feeling part of a community, which led participants to feel alone or that they lacked connection or a sense of belonging with other users. This in turn motivated requests for a "community component" (e.g., online forum, message board, group [video or phone] chats) so participants could discuss their intervention experiences, clarify content, and share challenges with other users. This was particularly wanted by participants with a shared lived experience so they could interact, connect, and identify with others (e.g., perinatal women, individuals with epilepsy, cancer patients). While most included studies were of interventions that did not have a community component (n = 16), this component was also highly valued by participants for whom it was present (n = 6).
I think this would be a lot better if there was a Web-based group…I felt alone out here. I would have been engaged more -(participant [41]).
All interviewees agreed that an online forum, which enabled discussion about their programme experiences, was highly desirable and was likely to boost retention significantly through: clarifying aspects of the teaching; sharing and overcoming difficulties with practice; and encouraging participants to remain engaged and complete home practice sessions -(author [49]).

The majority expressed […] a desire for a community function component of the program that would allow them to interact with other perinatal women who were using MMB [Mindful Mood Balance programme] -(author [44]).
A final dimension captured the tendency for participants to seek out support from others in creative ways when the programme itself provided little or no support. Participants reported sharing the programme with significant others, such as family members, friends, and spouses, to help encourage their consistent and continued practice.

I'm talking to my husband about how he can help me protect some time on the weekends to do the longer practices -(participant [44]).
My kids actually started to look forward to it, so they would actually ask to do it. That helped me kind of stay on track -(participant [54]).
Some participants were open with their training, sharing their experiences with the patient and family members and occasionally doing some of the exercises together -(author [57]).
By reaching out to others in their lives, participants were able to orchestrate their own social environment to support their engagement with the programme.This self-made way of forging a helpful foundation for practice not only highlights the impact someone else can have on people's engagement in digital MBIs, but also indicates that people are not reliant on a mindfulness teacher to feel supported.

Principal Findings
This review identified, critically appraised, and synthesised qualitative data from 22 original studies of people's experiences using a digital MBI to identify factors that facilitate or act as a barrier to their engagement in the intervention. Three overarching themes appeared to influence engagement: (a) Responses to Own Practice; (b) Making Mindfulness a Habit; and (c) Leaning on Others. Together, these themes provide crucial insight as to why people frequently stop engaging with digital MBIs. The following discussion elaborates on these areas and offers some recommendations for researchers and developers to guide intervention design and evaluation, and thereby improve acceptability and engagement in digital MBIs, increase their effectiveness, and support their translation into real-world use.
The first theme emphasised how adverse reactions to one's own practice are common and may serve to reduce motivation. This suggests that the tendency to respond negatively to our own experience and application of mindfulness is a major barrier to using digital MBIs, which is consistent with the wider literature on mindfulness interventions and offers initial support for extending this finding to digital intervention formats. For example, in one study, the question "Am I doing it right?" emerged by the second week of a traditional MBCT course [61]. In another, participants reported feeling self-critical when they could not make time to practise and when mindfulness did not appear to work for them [18]. As in the current review, this negative reaction made it difficult for participants to continue to engage, prompting them to give up and remove it from their to-do list. To help overcome this barrier, traditional face-to-face programmes like MBCT explicitly allocate time to anticipating what difficulties and obstacles may arise in doing home practice (e.g., trying to find free time) and how to deal with them [7]. Such content on overcoming barriers may be lost in the translation to digital formats, and our review is the first to highlight the importance of explicitly addressing this in digital MBIs. This finding also indicates that one of the most important factors influencing engagement in digital MBIs is unique to mindfulness specifically, rather than general to digital interventions, and reflects the metacognitive nature of the intervention's target behaviour. Our review offers clear guidance about which particular combinations of factors identified across other literatures (e.g., on digital interventions or mindfulness interventions more broadly) are most influential in the specific context of digital MBIs, which is essential to make these interventions more persuasive, feasible, and relevant to users [20].
The second theme (Making Mindfulness a Habit) highlighted the need for, and effort required by, consistent practice, and a call for personalisation to help achieve this. This suggests that forming a mindfulness habit is a key barrier to sustained engagement with digital MBIs, and that persuasive technological features could help overcome this barrier.
Although prior work on digital interventions has identified personalisation as an important feature, this review is the first to demonstrate its relevance to digital mindfulness interventions specifically. For example, a systematic review of web-based interventions found that the inclusion of persuasive design principles, including tailoring (i.e., provision of content or feedback adapted to factors relevant to a user), explained 55% of the variance in session completion across studies [62]. Our findings suggest that certain factors contributing to engagement with digital content in mobile and web-based interventions more generally may also apply to interventions in which engagement with the digital content is designed to facilitate completion of a non-digital target behaviour (e.g., an experiential mindfulness exercise) [11]. Notably, the threshold of engagement with the digital component that successfully facilitates the non-digital target behaviour can vary between individuals [63], supporting a shift towards patient-treatment matching and person-centred care [64] and underscoring the need to implement this digitally (e.g., through automated personalisation).
Conversely, this theme diverges from results of a thematic analysis of the experience of healthcare professionals who participated in either an online or printed self-help MBI [18].
The healthcare professionals consistently reported that longer practices were more challenging to engage with than shorter ones, whereas our review found considerable variation in preferences for different intervention features (e.g., format, materials, sound), including length of practice, perhaps because of the breadth of MBIs included in our synthesis. This highlights the importance of understanding the key behavioural and psychological needs of the target population to ensure that the intervention addresses them.
The third theme (Leaning on Others) highlighted that those engaging with digital MBIs are encouraged by additional support in its broadest sense, i.e., any communication designed to support any aspect of the intervention, its completion, or its desired outcomes. This includes synchronous (e.g., phone calls, online chat) and asynchronous (e.g., email, text message) communication; support provided to a group of people (e.g., discussion forums, group chats); and other forms of contact (e.g., automated reminders, technical assistance, feedback, reaching out to someone). While these results align with previous research on the impact of additional support in digital interventions [65], the current study cannot draw conclusions about the relative power of each type of support because of the variability across studies. Given this, the way support is provided and reported in research settings needs to be considered. Interventions in almost all the studies in this review included additional support; however, it was not always clear what this constituted. For example, some studies reported that participants could ask questions via email but did not specify whether they received clinical and/or purely technical assistance. Relatedly, participants may not have used the support on offer, although results from the current review indicate that using support is less important than having it available. Additional support in other studies was provided to a group of participants; however, this type of support has been excluded from definitions of guidance [66]. Future research could explore whether there are unique barriers to engagement in guided versus unguided digital MBIs, and compare different types and levels of support, to advance understanding of how, when, and for whom additional support can improve engagement. This is important because there is a trade-off between the provision of support and scalability: if digital MBIs need someone to be always available in order to be engaging, their reach and cost-effectiveness will be limited.
Irrespective of these uncertainties about the relative contributions of different types of support, it is worth noting that social support was found to be a key facilitator of engagement. This is consistent with the historical origins of mindfulness (i.e., as a practice undertaken collectively and in community [67]) and with findings from in-person group settings. In a synthesis of the accounts of individuals with mental health difficulties in group MBIs [68], learning mindfulness within a group was found to be helpful because peer support encouraged perseverance with course demands, and learning alongside people with similar experiences fostered a comfortable and destigmatising environment. Our findings suggest that digital MBIs may suffer decreased engagement as a result of reduced social support.

Implications for Intervention
Researchers can use the factors identified in this review to guide intervention design and ultimately improve engagement with digital MBIs. However, such strategies must be (a) grounded in relevant literature and (b) directly relevant to the individuals who will use the intervention. For example, the second theme suggests that simply instructing people to practise regularly is unlikely to turn practice into a habit. Researchers might consider drawing on research on behaviour change and habit formation, particularly with regard to digital interventions (e.g., gamification to motivate behaviour change). Researchers might also consider carrying out primary qualitative research to ensure that generated strategies are informed by, and meet the priorities and needs of, the intended users. The person-based approach offers a systematic means of integrating theory, evidence, and user perspectives into initial intervention planning [19,20]. The themes highlighted in this review could therefore inform the production of guiding principles within this approach, i.e., intervention design objectives and the key features intended to achieve each objective.

Strengths and Limitations
To the best of our knowledge, this is the first review to synthesise qualitative evidence from individual studies across different contexts to advance understanding of the barriers to and facilitators of engaging with digital MBIs. Using inductive thematic synthesis encouraged the generation of themes that "go beyond" the content of the primary studies to produce novel findings. All 22 studies were assessed as being adequately reported, suggesting that the papers included in this review are of sufficient quality to support the inferences drawn. We also followed established methodological guidance, used an a priori published protocol, and took several steps to increase the validity and reliability of the review, including pilot testing forms and procedures, consulting an information specialist, and holding regular team meetings.
In terms of limitations, we restricted searching to PsycInfo to manage the number of studies in a resource-efficient manner. However, it is possible that this led to the omission of relevant studies or introduced selection bias. Where possible (e.g., in reviews with longer timeframes), researchers should consider searching several sources and using purposive sampling to ensure that the final set of included studies meets relevant criteria (e.g., wide geographic spread or rich data [39]). The studies included in this review reported mostly on White adult females from Western countries, which means the generalisability of our findings to underrepresented groups is unclear. This is an important area for further research because initial engagement with digital and mobile health interventions is lower in some underserved populations (e.g., people with lower socioeconomic status [28]). Relatedly, we excluded studies with samples entirely under 16 years and/or over 35 years of age because our own intervention development focuses on young people. Although the final age range covered was 16 to 69 years, future research would benefit from investigating engagement in younger and older populations, since motivations to use digital interventions may vary across the lifespan.
There was significant heterogeneity across interventions (e.g., commercially available programmes, acceptance and commitment therapy, mindful messaging, guided mindfulness meditations) in the included studies and these differences may have influenced engagement.
Researchers and developers of digital MBIs should also consider how specific elements (e.g., content, mode of delivery, provision of support) might make people more or less likely to stop using the technology. Finally, while this review synthesised evidence from diverse study types, it is worth bearing in mind that engagement with MBIs is usually defined in terms of intervention usage (i.e., physical engagement [69]). It is unclear whether the factors identified in this review also facilitate or hinder aspects of psychological engagement, such as the intention to practise mindfulness, the belief that practising mindfulness will be helpful, and commitment to integrating mindfulness into daily life. This is an important area for further research given evidence that psychological, rather than physical, disengagement from self-help MBIs has a greater impact on cultivating mindfulness [69].

Conclusions
Previous studies have shown the potential of digital MBIs to improve a range of health outcomes. Sufficient engagement with these interventions is required to achieve their intended effects; however, engagement is typically poor. This review synthesised evidence from studies of digital MBIs and identified three key factors that influence user engagement.
We recommend that researchers generate their own solutions to these challenges by drawing upon relevant literature and working with people from the target user population.

Figure 1. PRISMA flowchart for identification and selection of studies. MBCT = mindfulness-based cognitive therapy; MBSR = mindfulness-based stress reduction; ACT = acceptance and commitment therapy; PTSD = posttraumatic stress disorder. a Country of institutional affiliation of the first author. b Data collection and analysis methods for data included in the qualitative evidence synthesis. c Self-completion questionnaire with open response categories.
Three overarching themes were generated: (a) Responses to Own Practice; (b) Making Mindfulness a Habit; and (c) Leaning on Others. Each theme is outlined below using illustrative quotes.

Table 2
Keywords (in the title or abstract) used during the search.
Some studies had samples with a combination of these characteristics. Overall, the samples were approximately 78% female and 79% White, and participants were aged between 16 and 69 years. Using data from the 17 studies that reported the mean sample age, the weighted average was 26.3 years.
Interventions

The digital interventions tested included mindfulness-based stress reduction (MBSR) or MBSR tailored for families living with mental illness (n = 5); MBCT or MBCT tailored for cancer patients, the perinatal period, or people with epilepsy (n = 4); acceptance and commitment therapy (n = 3); commercially available mindfulness programmes (n = 2); and other mindfulness-based programmes (n = 8). Additional support to facilitate intervention completion was included in all studies but one (n = 21; 95%). This ranged from automated reminders and non-clinical (i.e., purely technical) assistance to orientation calls and coaching. At least 19 studies (86%) included human (versus automated) support, and at least 12 studies (55%) included support that went beyond purely technical or administrative assistance (e.g., clinical or psychologically active guidance).

Table 3
Overview of included studies.