Published in Vol 24, No 3 (2022): March

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/32994.
Synthesizing Dimensions of Digital Maturity in Hospitals: Systematic Review

Review

1School of Information Systems, Queensland University of Technology, Brisbane, Australia

2Centre for Health Services Research, The University of Queensland, Herston, Australia

3Digital Health Cooperative Research Centre, Australian Government, Sydney, Australia

4Digital Health Research Network, The University of Queensland, Brisbane, Australia

5Clinical Excellence Queensland, Queensland Health, Brisbane, Australia

6Metro North Hospital and Health Service, Brisbane, Australia

Corresponding Author:

Rebekah Eden, BIT, BAPSC, PhD

School of Information Systems

Queensland University of Technology

2 George Street

Brisbane, 4000

Australia

Phone: 61 434237975

Email: rg.eden@qut.edu.au


Background: Digital health in hospital settings is viewed as a panacea for achieving the “quadruple aim” of health care, yet the outcomes have been largely inconclusive. To optimize digital health outcomes, a strategic approach is necessary, requiring digital maturity assessments. However, current approaches to assessing digital maturity have been largely insufficient, with uncertainty surrounding the dimensions to assess.

Objective: The aim of this study was to identify the current dimensions used to assess the digital maturity of hospitals.

Methods: A systematic literature review was conducted of peer-reviewed literature (published before December 2020) investigating maturity models used to assess the digital maturity of hospitals. A total of 29 relevant articles were retrieved, representing 27 distinct maturity models. The articles were inductively analyzed, and the maturity model dimensions were extracted and consolidated into a maturity model framework.

Results: The consolidated maturity model framework consisted of 7 dimensions: strategy; information technology capability; interoperability; governance and management; patient-centered care; people, skills, and behavior; and data analytics. These 7 dimensions can be evaluated based on 24 respective indicators.

Conclusions: The maturity model framework developed for this study can be used to assess digital maturity and identify areas for improvement.

J Med Internet Res 2022;24(3):e32994

doi:10.2196/32994

Planning a strategic roadmap to successful digital health transformation is challenging [1] due to a busy health landscape with competing drivers for change [2-4]. This is further compounded by the myriad of new technologies health care providers can select from to advance their digital health agenda. Despite both the rapid global uptake of eHealth technologies [5] and digital health being viewed as a panacea [6] for achieving the “quadruple aim” of health care (ie, reducing costs, improving patient experience, improving the work life of health care providers, and advancing population health) [7], the outcomes of digital health transformation are inconclusive and mixed [8,9]. One proposed method for strategically developing a digital health agenda is to follow a roadmap informed by digital maturity assessments [10,11].

In health care, digital maturity is defined as the extent to which digital systems are leveraged to provide high-quality health care, resulting in improved services and service delivery for an enhanced patient experience [12]. Assessing digital maturity is particularly important in hospital settings, due to (1) the complexity and cost of health service delivery involving multidisciplinary teams in acute, high-cost care settings [13]; (2) the necessity for rapid digital transformation that leverages eHealth technologies to cater to the needs of an aging population with increased rates of chronic disease [14]; and (3) the difficulties justifying business cases for large-scale electronic medical record system implementations, which require significant upfront and ongoing costs [15].

To assess digital maturity, a maturity model can be used to allow an organization to evaluate its current digital status across a series of dimensions [1]. However, limitations exist in current approaches for measuring digital maturity in hospitals, as there is a lack of consensus over which dimensions should be assessed [11]. Others have argued that current assessments of digital maturity are insufficient due to their primary focus on technology, with limited incorporation of organizational and human factors [4]. This is further supported by Carvalho et al [16], who emphasize that most digital maturity models lack sufficient depth and breadth for adequate assessment. Currently, there is still no agreement or convergence on how to assess digital maturity in health care.

Failure to understand the appropriate dimensions for assessing the digital maturity of hospitals will hamper the success of digital health transformation and be detrimental to health care outcomes. Therefore, a systematic literature review was conducted to identify the dimensions currently used to assess the digital maturity of hospitals and to synthesize them into a consolidated digital maturity framework. Such a synthesis is necessary for, and will be beneficial to, health care executives and strategic decision-makers in evaluating and planning the digital transformation of their practice. It will also benefit researchers, as it consolidates the maturity dimensions and identifies areas for future research to further refine and strengthen maturity models and their applications.


A systematic literature review of articles describing how digital maturity is assessed in hospital settings was conducted in December 2020, following the guidelines of Templier and Paré [17]. In line with these guidelines, this systematic literature review was developmental in nature, in that it sought to develop a consolidated digital maturity framework. Therefore, unlike aggregative reviews, which seek to include all articles relevant to the phenomenon of interest, developmental reviews cover only a sample of relevant articles [17].

To extract articles, medical databases (eg, PubMed, Cochrane, and Medline) and the Association for Information Systems College of Senior Scholars’ Basket of Eight journals were searched using the following search string: (“maturity model*” OR “digital capabilit*” OR “digital maturity”). These databases and sources were selected due to the prominence of digital health in these domains. However, due to the breadth of information systems literature examining digital maturity across a myriad of contexts other than health care, the following search condition was added for a more targeted review of the information systems sources: “AND (“health” OR “healthcare”)”. This additional search condition was not added to medical databases due to their targeted focus.
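A sketch of how the two search strings described above could be assembled programmatically; the function and the source-type labels are illustrative assumptions, not part of the actual review protocol.

```python
# Illustrative only: composing the search strings described in the text.
# The base string is applied to the medical databases as-is; the
# health-specific condition is appended only for the information
# systems sources, which span many non-health contexts.
BASE_QUERY = '("maturity model*" OR "digital capabilit*" OR "digital maturity")'
HEALTH_FILTER = '("health" OR "healthcare")'

def build_query(source_type: str) -> str:
    """Return the search string for a given source type (hypothetical labels)."""
    if source_type == "information_systems":
        # Narrow the broad IS literature to health care contexts.
        return f'{BASE_QUERY} AND {HEALTH_FILTER}'
    # Medical databases are already health-focused; no extra filter.
    return BASE_QUERY

print(build_query("medical"))
print(build_query("information_systems"))
```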

Articles were excluded if they were (1) focused on settings other than hospitals, as the implementation of eHealth technologies in different contexts (eg, acute vs primary care) requires vastly different resources with large heterogeneity in impact measurement; (2) focused on maturity models not related to digital health (ie, training maturity); (3) not focused on digital maturity; (4) published in a language other than English; and (5) not a full-text article (ie, posters or extended abstracts).

As illustrated in Figure 1 [18], 357 articles were returned from the search, and 215 remained after duplicates were removed. Initially, the first author screened the abstracts, resulting in 149 articles being removed and 66 potentially relevant articles being retained. Next, a full-text review was conducted to determine eligibility, resulting in the exclusion of 37 articles, leaving a total of 29 articles deemed applicable for analysis. To ensure no relevant articles were missed, backward searching of the references was performed. Consistent with Saldaña [19], intercoder corroboration was performed by the second author at each stage when determining whether an article should be included.
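The screening flow above can be verified with simple arithmetic; note that the duplicate count (142) is not stated explicitly in the text and is inferred here from the reported totals.

```python
# Sanity-checking the screening counts reported in the text.
# The number of duplicates (142) is inferred from 357 - 215;
# all other figures are stated explicitly.
retrieved = 357                      # records returned by the search
after_dedup = 215                    # remaining after duplicate removal
duplicates_removed = retrieved - after_dedup
excluded_at_abstract = 149           # removed during abstract screening
potentially_relevant = after_dedup - excluded_at_abstract
excluded_at_full_text = 37           # removed during full-text review
included = potentially_relevant - excluded_at_full_text

print(duplicates_removed, potentially_relevant, included)  # 142 66 29
```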

To analyze the relevant articles, inductive coding [20] was performed in NVivo (version 12 Plus; QSR International), and the maturity model dimensions were extracted. These were first extracted as verbatim codes [19], yielding 245 raw maturity nodes (including terms such as digital architecture, enterprise architecture, infrastructure, technology capabilities, reliability, decision support systems, picture archiving and communication system [PACS], and software applications). In some instances, the raw maturity nodes represented the specific maturity model dimensions as named in the papers; in other instances, they referred to digital maturity stages, as some maturity models provided only stages rather than specific dimensions.

Through the constant comparison method [20], these raw maturity nodes were grouped into respective indicators based on commonalities. This involved considering the definition of each raw maturity node, as in some cases the name of the raw maturity node (as extracted verbatim from the paper) did not reflect its inherent meaning. By comparing the specific definitions of the raw maturity nodes, similar definitions were consolidated into a single indicator. For instance, digital architecture, enterprise architecture, and infrastructure were all related to information technology (IT), so were grouped into the IT infrastructure indicator, while technology capabilities and system reliability were grouped into the technical quality indicator, and decision support systems, PACS, and software applications were grouped into the systems and services indicator. In total, 24 indicators were identified. Following this, constant comparison was again performed to aggregate the indicators into a consolidated set of dimensions based on commonalities amongst indicators. For instance, the IT infrastructure, technical quality, and systems and services indicators were grouped into the IT capability dimension. In total, 7 dimensions were identified.
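The two-step consolidation described above (raw maturity nodes grouped into indicators, and indicators aggregated into dimensions) can be sketched as a pair of lookup tables. This is an illustrative fragment only, using the IT capability examples named in the text; the actual coding covered 245 nodes, 24 indicators, and 7 dimensions, and grouping decisions were driven by node definitions rather than node names.

```python
# Illustrative fragment of the node -> indicator -> dimension mapping
# described in the text (IT capability examples only).
NODE_TO_INDICATOR = {
    "digital architecture": "IT infrastructure",
    "enterprise architecture": "IT infrastructure",
    "infrastructure": "IT infrastructure",
    "technology capabilities": "technical quality",
    "system reliability": "technical quality",
    "decision support systems": "systems and services",
    "PACS": "systems and services",
    "software applications": "systems and services",
}
INDICATOR_TO_DIMENSION = {
    "IT infrastructure": "IT capability",
    "technical quality": "IT capability",
    "systems and services": "IT capability",
}

def classify(raw_node: str) -> tuple[str, str]:
    """Map a raw maturity node to its (indicator, dimension) pair."""
    indicator = NODE_TO_INDICATOR[raw_node]
    return indicator, INDICATOR_TO_DIMENSION[indicator]

print(classify("PACS"))  # -> ('systems and services', 'IT capability')
```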

To provide further confidence in our findings, reliability assessments were also performed. First, coder corroboration was conducted. The first author independently performed coding of verbatim measures (ie, raw maturity nodes) to indicators, then grouped the indicators into dimensions and discussed these decisions with the second author until consensus was reached [19]. This involved ensuring that all verbatim measures were accurately mapped to the indicators. Through discussion, some of the verbatim measures were moved to a different indicator to better reflect their underlying definitions. Subsequently, additional coder corroboration of the indicators and dimensions was performed. This involved the first 2 authors discussing the indicators and dimensions with the rest of the authorship team and resulted in updates to some of the names and definitions [19]. Second, external reliability checks were performed and the dimensions were discussed at external forums, including (1) a statewide digital health steering committee in April 2021, attended by 14 members, and (2) the statewide “Digital Health Grand Rounds” in May 2021, attended by 120 health service executives, digital health researchers, and clinicians. Consensus was reached at the forums on the derived dimensions.

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flowchart.

Digital Maturity Dimensions

In total, 27 distinct maturity models were examined (Multimedia Appendix 1). In some instances, multiple maturity models were examined in a single paper, and in other instances, the same maturity model was examined in multiple papers. In total, 14 papers validated an existing maturity model, 10 papers proposed a new maturity model but did not validate the maturity model, 4 papers both proposed and validated a new maturity model, and 1 paper extended an existing maturity model.

Overall, 24 indicators were identified, which were consolidated into 7 digital maturity dimensions: strategy; IT capability; interoperability; governance and management; patient-centered care; people, skills, and behavior; and data analytics. The dimensions are described in Table 1; detailed examples of the indicators are provided with examples of their verbatim measures in Multimedia Appendix 2. Multimedia Appendix 3 illustrates how the distinct digital maturity models mapped onto the 7 digital maturity dimensions.

The findings of this review indicate that digital maturity is predominantly assessed based on management-oriented and technology-related dimensions. Governance and management (n=22 articles) has been the most prevalent dimension of digital maturity, followed by IT capability (n=18); people, skills, and behavior (n=17); interoperability (n=15); and strategy (n=14). Comparatively, limited research has examined data analytics (n=6) and patient-centered care (n=3). Each dimension is further described in the subsections below.

Table 1. Description of the Digital Maturity Dimensions.

Strategy. The extent to which the organization has developed and implemented a strategic plan to achieve its goals and objectives [16]. Indicators: strategic adaptability, strategic alignment, strategic focus.

Information technology capability. The extent to which the organization has adopted and implemented information technology infrastructure, digital systems, technologies, and services [21] that are usable and effective [22]. Indicators: information technology infrastructure, technical quality, systems and services.

Interoperability. The extent to which data and information can be exchanged between systems within the organization, across care settings, and with patients, caregivers, and families [11]. Indicators: external interoperability, internal interoperability, semantic interoperability, syntactic interoperability.

Governance and management. The extent to which the organization embraces leadership, policies and procedures, structures, risk management of quality and safety, integrated workflows, relationship building, and capacity building [23]. Indicators: change management, data governance, leadership and management, risk management, standards, cultural values.

Patient-centered care. The extent to which patients, caregivers, and families can actively participate in their health decisions, have access to information and health data, and cocreate services and service delivery [24]. Indicators: patient empowerment, patient focus.

People, skills, and behavior. The extent to which stakeholders (internal and external) are digitally literate and motivated to leverage technology [11,25]. Indicators: education and training, knowledge management, individual competence, technology usage.

Data analytics. The extent to which the organization uses data for effective decision-making for the organization, patients, and population health [1]. Indicators: descriptive analytics, predictive analytics.

Governance and Management

The governance and management dimension is described as the extent to which the health care organization possesses formalized and committed leadership, as well as formalized policies, procedures, structures, and workflows [23]. This dimension comprises 6 indicators: leadership and management, change management, cultural values, standards, risk management, and data governance.

The leadership and management indicator encompasses the executive team’s commitment to and support for improving clinical quality [26-28] and fostering innovation across the hospital [24]. This support is essential for all levels of the workforce. The change management indicator recognizes the need to encourage individuals to embrace planned change to achieve desired outcomes [11,29]. The need for innovation and embracing change is further evident in the cultural values indicator, which espouses values of encouraging innovative behaviors [11,29] within a trusting [26-28] and inclusive environment [30,31]. The standards indicator assesses the extent to which processes [30-32], policies, and procedures are based on standards that have been formally agreed and mandated [24] and the extent to which these contribute to optimizing the health care organization [11,12,33]; importantly, adherence to standards does not preclude innovation. The risk management indicator acknowledges the need for the workforce to identify, mitigate, and report risks to ensure the safety, security, and privacy of patients [1,11,26-28] and the workforce [21]. The data governance indicator further assesses whether data integrity, security, and privacy are preserved across the digital systems in health care settings [12], supported by standardized processes and protocols for accessibility and authorization [22,23,34].

IT Capability

The IT capability dimension represents the extent to which the organization has implemented IT infrastructure, digital systems, technologies, and services [21] that are usable and effective [22]. This dimension comprises 3 indicators: systems and services, IT infrastructure, and technical quality.

The systems and services indicator, which examines the digital systems implemented to support clinical care, is the most prominent indicator within this dimension. The systems identified as being important to digital maturity include electronic medical records, clinical decision support systems, e-prescribing, PACS [11,35,36], orders and results management, asset and resource optimization systems [22], and remote and assistive care systems [1,21]. The IT infrastructure indicator focuses on infrastructure [21,37] and architecture [1] designed and installed to support the aforementioned systems and services [12]. The systems and services indicator, as well as the IT infrastructure indicator, largely examine what technology and supporting structures have been implemented but do not account for their effectiveness. Alternatively, the technical quality indicator focuses on how effective, efficient, and fit for purpose the digital systems are [21,38].

People, Skills, and Behavior

The people, skills, and behavior dimension assesses the extent to which stakeholders, both internal and external to the health care organization, are digitally literate and motivated to leverage digital health systems [11,25]. This dimension consists of 4 indicators: education and training, knowledge management, individual competence, and technology usage.

The education and training indicator relates to the strategies adopted by the health care organization to provide individuals with opportunities to grow and develop clinical and technical skills, as well as collaboration and teamwork skills [16,29]. The knowledge management indicator refers to the extent to which workforce capability grows through creating, managing, and sharing knowledge [23,30]. These 2 indicators focus on ensuring organizational practices are in place to foster skill development and knowledge acquisition, whereas the individual competence and technology usage indicators focus on the actual skill sets and behaviors of individuals. For instance, at the individual level, the individual competence indicator takes into consideration that individuals need to possess the skills, knowledge, and capability to use digital systems [21]. In contrast, the technology usage indicator recognizes that systems can be used in different ways and that digitally mature organizations need to ensure systems are used as intended [12] in a pervasive and consistent manner [26-28].

Interoperability

The interoperability dimension represents the extent to which data and information can be exchanged between systems within the organization, across care settings, and with patients, caregivers, and families [11]. In total, 4 interoperability indicators were identified: external interoperability, internal interoperability, semantic interoperability, and syntactic interoperability. The former 2 indicators relate to how information is exchanged between different actors within and between organizations (ie, intraorganizational vs interorganizational information exchange). The latter 2 indicators relate to data transformation and distinguish between the technical and meaningful structure of the information exchanged.

The external interoperability indicator assesses the adoption of standards to integrate systems, services, and data across the entire health care system [1,11,30,39-41]. Conversely, the internal interoperability indicator assesses the integration of systems and data across departments within a single health care organization [30,40]. The external interoperability indicator was more prevalent in the literature than the internal interoperability indicator.

The semantic interoperability indicator examines the extent to which information exchanged between digital systems can be accurately interpreted and understood by each system involved [34,42]. As such, the semantic interoperability indicator is dependent on the transparency of the underlying lexicon and data dictionary to ensure the intended meaning of the information exchange is retained [42]. Alternatively, the syntactic interoperability indicator represents the extent to which technical standards have been defined to enable the consistent, effective, and efficient integration of digital systems and services [30,34].

Strategy

The strategy dimension represents the extent to which the organization has developed and implemented a strategic plan to achieve its goals and objectives [16]. The strategy dimension includes 3 indicators: strategic focus, strategic alignment, and strategic adaptability. This dimension is built on the premise that the digital strategy and the organizational strategy should be aligned and adaptable to support the accomplishment of measurable goals and outcomes related to quality and safety [26-28].

The strategic focus indicator was the most prevalent, with an emphasis on quality and safety [26-28], sustainability and cost effectiveness [39], and ensuring the systematic evaluation of quantifiable results and objectives [24,26-28]. While the strategic focus indicator centers on the core elements that health care organizations focus on, the strategic alignment indicator details the need for the digital strategy to be aligned with the organizational strategy [12,43]. To accomplish this, the digital strategy needs to be grounded on clinical benefits and outcomes [1,11]. In contrast, the strategic adaptability indicator recognizes the importance of organizational strategy and digital system dynamism [1,11,16] and that both should be capable of responding to environmental challenges [44] and emerging opportunities [11,16,24].

Data Analytics

The data analytics dimension examines the extent to which the organization uses data collected in its digital systems for effective decision-making to benefit the organization, patients, and population health [1]. Few studies have reported on this dimension; as such, only 2 indicators have been identified: descriptive analytics and predictive analytics.

The descriptive analytics indicator captures the extent to which data are analyzed to identify and understand historical patterns and trends, facilitating effective decision-making [1]. The predictive analytics indicator focuses on analyzing data to identify potential future risks and opportunities to aid decision-making [24], including “proactive/predictive models of care” [11].

Patient-Centered Care

The patient-centered care dimension encompasses the extent to which patients, caregivers, and families can actively participate in their health decisions, access information and their health data, and cocreate services and service delivery [24]. Only 3 articles in this review examined patient-centered care as a dimension of digital maturity, which resulted in 2 indicators: patient focus and patient empowerment.

The patient focus indicator assesses the extent to which the role of the patient is considered, involved, and valued when designing new models of care [25,31]. The patient empowerment indicator represents the extent to which patients are encouraged to actively participate in their health decisions and have access to relevant information and health data [24].


Summary of Key Findings

In summary, we identified 7 dimensions (ie, strategy; governance and management; IT capability; interoperability; data analytics; people, skills, and behavior; and patient-centered care) of digital health maturity that hospitals need to consider when strategically planning their digital health agenda. In addition, we identified 24 indicators that can be used to measure these dimensions (Figure 2). To operationalize these indicators, future research should seek to rigorously develop specific measurement items and follow extensive internal and external validity and reliability assessment [45].

These dimensions have received varying attention in the literature; however, as we argue in the “Implications for Future Work” section of this paper, a robust digital health maturity assessment must consider all dimensions to a sufficient depth. As such, we considered these dimensions to be equally weighted. Failure to consider a dimension could ultimately prove detrimental to the overall digital transformation agenda.
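Because the review treats the 7 dimensions as equally weighted, a hospital-level assessment could, in principle, aggregate dimension scores with a simple unweighted mean. The sketch below is purely hypothetical: the scoring scale, the function name, and the requirement that every dimension be assessed are illustrative assumptions, not part of any reviewed maturity model.

```python
# Hypothetical sketch: equally weighted aggregation across the 7
# dimensions identified in this review. The numeric scores and their
# scale are illustrative assumptions.
DIMENSIONS = [
    "strategy", "IT capability", "interoperability",
    "governance and management", "patient-centered care",
    "people, skills, and behavior", "data analytics",
]

def overall_maturity(scores: dict[str, float]) -> float:
    """Equally weighted mean across all 7 dimensions. Every dimension
    must be assessed; skipping one could mask a detrimental gap."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Unassessed dimensions: {missing}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
```

The explicit check for unassessed dimensions mirrors the argument above that failure to consider any single dimension could prove detrimental to the overall digital transformation agenda.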

Our findings extend previous systematic literature reviews on digital health maturity models in 3 ways. First, past reviews have sought to identify various maturity models used in health care and analyze them in isolation. For instance, Carvalho et al [37] examined 14 maturity models and Gomes and Romão [46] investigated 26 maturity models commonly employed in health care, providing a descriptive account of each. In contrast, this study synthesized the dimensions present across maturity models to derive a consolidated framework (Figure 2). Second, other reviews have investigated maturity model dimensions, yet had aggregate dimensions that were inappropriately broad. For instance, Tarhan et al [47] developed a consolidated list of only 4 maturity model dimensions: business process, technology, people, and other. Their business process dimension incorporated government regulations, their technology dimension was aligned with the IT capability dimension identified in this study, and their people dimension focused largely on patient safety culture and therefore differed from the people, skills, and behavior dimension identified in this study, which examined individual-level and organizational-level factors. The “other” category incorporated a wide range of factors including “culture, strategy, governance, leadership, interoperability, and data” [47]. The maturity model framework developed in this paper provides a more granular account of the factors previously subsumed into the “other” category. Such a granular account is necessary for effective assessment.

Figure 2. Consolidated Digital Maturity Model Framework for Hospitals.

Implications for Future Work

Through performing this analysis, we identified 4 important areas for future research: (1) balancing digital maturity dimensions; (2) evaluating the impact of dimensions on the quadruple aim of health care; (3) examining the interrelationships between dimensions; and (4) evaluating longitudinal variations in digital maturity. These are discussed in turn below.

Balancing Digital Maturity Dimensions

No maturity model in our review encompassed all 7 dimensions of our consolidated digital maturity model framework. The vast majority of studies focused on organizational capability (ie, governance and management and strategy), technological capability (ie, IT capability and interoperability), and individual capability (ie, people, skills, and behavior). Only 3 papers recognized patient-centered care as a dimension of digital maturity, a gap that lags behind the goals of current medical practice. This marked difference in attention across the dimensions of the maturity models illustrates the traditional corporate focus on the technical and regulatory components of digital health and the neglect of patient outcomes in the digital transformation of health care. This is a clear oversight in current digital maturity assessments, as government policies increasingly place the patient “at the heart of their own treatment plans so that they might develop a commitment to self-management” [48].

Moreover, while technology capability has been a prominent theme in both the IT capability and interoperability dimensions, less attention has been paid to understanding the data analytics dimension [16]. Many of the professed benefits of digital health emanate from the “promise and potential” of the secondary use of health care data [49,50]. This capability is also central to prior government agendas promoting the meaningful use of technology [51]. As such, future research needs to address data analytics as a key aspect of digital maturity, examining not only descriptive and predictive analytics but also the potential of prescriptive analytics.

Our findings (detailed in Multimedia Appendix 1) demonstrate that the vast majority of maturity models have been assessed in developed countries, such as the United States and the United Kingdom. Seldom is the digital maturity of hospitals in developing countries assessed (notable exceptions are the work of Yarmohammadian et al [52] and Moradi et al [31], who examined maturity models in Iran), and cross-cultural comparisons are largely overlooked. Future research therefore needs to examine the extent to which maturity models are equally applicable across cultures and settings. Ammenwerth et al [53] provide some useful guidance on how best to do so.

Evaluating the Impact of Digital Maturity Dimensions

While the importance of the 7 identified dimensions has been raised across multiple papers, their impact on outcomes such as the quadruple aim of health care (ie, reducing costs, improving patient experience, improving the work life of health care providers, and advancing population health [7]) has largely not been assessed (details are shown in Multimedia Appendix 4). Only 2 papers in our study triangulated digital maturity with outcomes. For instance, van Poelgeest et al [44] identified that the higher the digital maturity based on the Electronic Medical Record Adoption Model (EMRAM), the shorter the length of stay, although this was dependent on the location of the hospital. Conversely, Martin et al [12] identified that digital maturity based on a clinical digital maturity index did not influence the mortality, readmission, or complications encountered in the hospital, but found that maturity significantly improved length of stay and the number of harm-free patient care episodes.

Understanding how digital maturity influences outcomes is essential, as past research has found mixed results when assessing the outcomes of the digital transformation of health care, with recommendations made to policy makers to “identify and support the drivers of successful [eHealth] outcomes” [8]. If designed and applied correctly, digital maturity assessments could equip policy makers with tools to evaluate whether they have the drivers in place for successful digital transformation [54]. However, validation of the digital maturity dimensions is still required. Such validation will need to extend beyond measuring operational improvements such as cost savings and productivity goals and consider all 4 health care aims. Failure to adequately recognize the health care aim of improving the working conditions of health care providers will limit successful digital transformation, as demonstrated in many reports of staff dissatisfaction and burnout associated with digital technology in health care [55].

Similar concerns surrounding the validity of digital maturity models have been raised by Thordsen et al [56]. To validate the digital maturity of hospitals, it is necessary to analyze digital maturity through the lens of balanced health care outcomes, as outlined in the quadruple aim. We encourage future research to use a multiple case study design to evaluate both the digital maturity dimensions and key performance indicators related to each aim, and to assess whether digital maturity correlates with health care outcomes. Confounders will likely be present, but this is a necessary first step toward providing health care executives with evidence of the need to evaluate and improve digital maturity. Future research should also perform intervention studies with targeted improvements within each digital maturity dimension and assess the impact on the health care aims, to further understand the mechanisms behind the purported relationship.
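To illustrate the kind of analysis proposed above, the sketch below computes a Spearman rank correlation between per-hospital digital maturity scores and one key performance indicator (mean length of stay). This is a hypothetical, illustrative example only: the maturity scores, the length-of-stay figures, and the minimal standard-library implementation of Spearman's rho are all assumptions for demonstration, not data or methods from the reviewed studies.

```python
def average_ranks(values):
    """Assign 1-based ranks, averaging ties (required for Spearman's rho)."""
    sorted_vals = sorted(values)
    return [
        sorted_vals.index(v) + (sorted_vals.count(v) + 1) / 2
        for v in values
    ]

def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: one maturity score and one KPI (mean length of
# stay, in days) per hospital in a multiple case study. All invented.
maturity_scores = [2.1, 3.4, 4.0, 4.8, 5.5, 6.2]
length_of_stay = [7.9, 7.1, 6.8, 6.0, 5.4, 5.1]

rho = spearman_rho(maturity_scores, length_of_stay)
print(f"Spearman's rho: {rho:.2f}")  # prints Spearman's rho: -1.00
```

A rank-based measure is used here because maturity stages are ordinal rather than interval-scaled; in a real study, a library routine such as one from SciPy, together with adjustment for confounders (eg, hospital size and location, as flagged by van Poelgeest et al [44]), would be preferable to this toy calculation.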

Understanding the Interrelationships Between Dimensions

The digital maturity dimensions in the literature reviewed here were largely examined in a subjective manner, leaving their dependencies and interrelationships open to interpretation, assumption, and variability. Future research should delve into these interrelationships further, as this could reveal the order in which hospitals should seek to improve the digital maturity dimensions. For instance, efforts to improve data analytics and IT capabilities by implementing artificial intelligence algorithms for complex clinical care may be hampered if appropriate clinical governance or a clinical informatics workforce is lacking. As such, future research should examine exemplar cases that have excelled in each of the dimensions to identify their drivers.

Conducting a Longitudinal Analysis of Dimensions

At different stages of a hospital’s digital health journey, different maturity model dimensions may need to be assessed, because digital systems and organizations are dynamic and change over time. Although some maturity models decompose digital maturity into stages, these stages are often simplistic. A few notable maturity models account for this level of detail [16] by varying the measurement criteria across the different stages of maturity, but this approach has largely been overlooked. As such, scholars should perform longitudinal investigations of digital maturity to ensure that assessments remain appropriate to the hospital’s level of IT capability.

Limitations

This review was scoped to the digital maturity of hospitals rather than other health care settings, a restriction necessitated by the vast differences between acute health care and primary care. Future research should investigate the digital maturity of primary care settings to identify the maturity dimensions necessary for their successful transformation.

Although maturity models are widely used in hospitals globally, it is important to note that digital maturity assessments are just one approach to planning and evaluating digital health transformations. Future research should compare the efficacy of digital maturity assessments with other approaches, for instance, digital health benchmarking [53], the NASSS (nonadoption, abandonment, scale-up, spread, and sustainability) framework [57], and organizational readiness surveys [58]. Alternatively, from an evaluation perspective, organizations can adopt measures from established evaluation frameworks [59].

In addition, this literature review was scoped to peer-reviewed outlets in medical databases and leading information systems journals, with grey literature excluded, which could have introduced publication bias. Although this scoping may have missed some articles, it was necessary to ensure that only high-quality, theoretically grounded, rigorously developed models were included. Consequently, proprietary maturity models that are used in practice but have not been examined in the peer-reviewed literature were omitted. Nevertheless, many proprietary maturity models have been examined in peer-reviewed journals and were therefore included in this study, such as the EMRAM of the Healthcare Information and Management Systems Society.

Conclusions

This systematic literature review resulted in a consolidated digital maturity model framework consisting of 7 core dimensions and 24 indicators of digital health maturity. Future research is needed to understand how these dimensions relate to outcomes across the quadruple aim of health care and to extend the traditional IT and corporate focus to include patient and staff considerations. In this way, digital health strategic plans will become aligned with the strategic aims of hospitals and focused on delivering the quadruple aim of health care.

Acknowledgments

This publication is supported by Digital Health CRC Limited and was funded under the Commonwealth Government’s Cooperative Research Centres Programme. We acknowledge support provided by the Queensland Department of Health, The University of Queensland, the Healthcare Information and Management Systems Society, and Queensland University of Technology.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Distinct Maturity Models.

DOCX File, 30 KB

Multimedia Appendix 2

Digital Maturity Dimensions and Indicators.

DOCX File, 24 KB

Multimedia Appendix 3

Mapping of Dimensions to Maturity Models.

DOCX File, 24 KB

Multimedia Appendix 4

Method Applied for Validated Maturity Model.

DOCX File, 24 KB

  1. Johnston DS. Digital maturity: are we ready to use technology in the NHS? Future Healthc J 2017 Oct;4(3):189-192 [FREE Full text] [CrossRef] [Medline]
  2. Eden R, Burton-Jones A, Casey V, Draheim M. Digital Transformation Requires Workforce Transformation. MISQE 2019 Mar 1;18(1):1-17. [CrossRef]
  3. Krey M. Key challenges to enabling IT in healthcare. IEEE Int Conf Healthc Inform: IEEE; 2016 Dec 08 Presented at: IEEE International Conference on Healthcare Informatics; October 2016; Chicago, IL, USA p. 328-333. [CrossRef]
  4. Cresswell K, Sheikh A, Krasuska M, Heeney C, Franklin BD, Lane W, et al. Reconceptualising the digital maturity of health systems. The Lancet Digital Health 2019 Sep;1(5):e200-e201. [CrossRef]
  5. Eden R, Burton-Jones A, Scott I, Staib A, Sullivan C. Effects of eHealth on hospital practice: synthesis of the current literature. Aust Health Rev 2018 Sep;42(5):568-578. [CrossRef] [Medline]
  6. Rahimi K. Digital health and the elusive quest for cost savings. The Lancet Digital Health 2019 Jul;1(3):e108-e109. [CrossRef]
  7. Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Fam Med 2014 Nov;12(6):573-576 [FREE Full text] [CrossRef] [Medline]
  8. Eden R, Burton-Jones A, Scott I, Staib A, Sullivan C. The impacts of eHealth upon hospital practice: synthesis of the current literature. Deeble Institute Evidence Brief 2017 Oct 17;16:1-7.
  9. Zheng K, Abraham J, Novak LL, Reynolds TL, Gettinger A. A Survey of the Literature on Unintended Consequences Associated with Health Information Technology: 2014–2015. Yearb Med Inform 2018 Mar 06;25(01):13-29. [CrossRef]
  10. Mettler T, Pinto R. Evolutionary paths and influencing factors towards digital maturity: An analysis of the status quo in Swiss hospitals. Technol Forecast Soc Change 2018 Aug;133:104-117. [CrossRef]
  11. Krasuska M, Williams R, Sheikh A, Franklin BD, Heeney C, Lane W, et al. Technological Capabilities to Assess Digital Excellence in Hospitals in High Performing Health Care Systems: International eDelphi Exercise. J Med Internet Res 2020 Aug 18;22(8):e17022 [FREE Full text] [CrossRef] [Medline]
  12. Martin G, Clarke J, Liew F, Arora S, King D, Aylin P, et al. Evaluating the impact of organisational digital maturity on clinical outcomes in secondary care in England. NPJ Digit Med 2019;2:41 [FREE Full text] [CrossRef] [Medline]
  13. Eden R, Burton-Jones A, Grant J, Collins R, Staib A, Sullivan C. Digitising an Australian university hospital: qualitative analysis of staff-reported impacts. Aust Health Rev 2020;44(5):677-689. [CrossRef]
  14. Klecun E. Transforming healthcare: policy discourses of IT and patient-centred care. Eur J Inf Syst 2017 Dec 19;25(1):64-76. [CrossRef]
  15. Bassi J, Lau F. Measuring value for money: a scoping review on economic evaluation of health information systems. J Am Med Inform Assoc 2013 Jul;20(4):792-801 [FREE Full text] [CrossRef] [Medline]
  16. Carvalho JV, Rocha, Abreu A. Maturity Assessment Methodology for HISMM - Hospital Information System Maturity Model. J Med Syst 2019 Jan 07;43(2):35. [CrossRef] [Medline]
  17. Templier M, Paré G. A Framework for Guiding and Evaluating Literature Reviews. CAIS 2015;37:112-137. [CrossRef]
  18. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009 Jul 21;339:b2535. [CrossRef]
  19. Saldaña J. The coding manual for qualitative researchers. London: SAGE; 2021.
  20. Gioia DA, Corley KG, Hamilton AL. Seeking Qualitative Rigor in Inductive Research. Organ Res Methods 2013 Jan;16(1):15-31. [CrossRef]
  21. Vidal Carvalho J, Rocha, Abreu A. Maturity of hospital information systems: Most important influencing factors. Health Informatics J 2019 Sep;25(3):617-631 [FREE Full text] [CrossRef] [Medline]
  22. Martin G, Arora S, Shah N, King D, Darzi A. A regulatory perspective on the influence of health information technology on organisational quality and safety in England. Health Informatics J 2020 Jun;26(2):897-910 [FREE Full text] [CrossRef] [Medline]
  23. Potter I, Petersen T, D'Agostino M, Doane D, Ruiz P, Marti M, et al. The Virgin Islands National Information Systems for Health: vision, actions, and lessons learned for advancing the national public health agenda. Rev Panam Salud Publica 2018;42:e156 [FREE Full text] [CrossRef] [Medline]
  24. Grooten L, Borgermans L, Vrijhoef HJ. An Instrument to Measure Maturity of Integrated Care: A First Validation Study. Int J Integr Care 2018 Jan 25;18(1):10 [FREE Full text] [CrossRef] [Medline]
  25. Flott K, Callahan R, Darzi A, Mayer E. A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review. J Med Internet Res 2016 Apr 14;18(4):e75 [FREE Full text] [CrossRef] [Medline]
  26. Kulju S, Morrish W, King L, Bender J, Gunnar W. Patient Misidentification Events in the Veterans Health Administration: A Comprehensive Review in the Context of High-Reliability Health Care. J Patient Saf 2020 Sep 08:E290-E296. [CrossRef] [Medline]
  27. Randall KH, Slovensky D, Weech-Maldonado R, Patrician PA, Sharek PJ. Self-Reported Adherence to High Reliability Practices Among Participants in the Children's Hospitals' Solutions for Patient Safety Collaborative. Jt Comm J Qual Patient Saf 2019 Mar;45(3):164-169. [CrossRef] [Medline]
  28. Sullivan JL, Rivard PE, Shin MH, Rosen AK. Applying the High Reliability Health Care Maturity Model to Assess Hospital Performance: A VA Case Study. Jt Comm J Qual Patient Saf 2016 Sep;42(9):389-411. [CrossRef] [Medline]
  29. Brice S, Almond H. Health Professional Digital Capabilities Frameworks: A Scoping Review. J Multidiscip Healthc 2020;13:1375-1390. [CrossRef]
  30. Blondiau A, Mettler T, Winter R. Designing and implementing maturity models in hospitals: An experience report from 5 years of research. Health Informatics J 2016 Sep;22(3):758-767 [FREE Full text] [CrossRef] [Medline]
  31. Moradi T, Jafari M, Maleki MR, Naghdi S, Ghiasvand H. Quality Management Systems Implementation Compared With Organizational Maturity in Hospital. Glob J Health Sci 2015 Jul 27;8(3):174-182 [FREE Full text] [CrossRef] [Medline]
  32. de Boer JC, Adriani P, van Houwelingen JW, Geerts A. Game Maturity Model for Health Care. Games Health J 2016 Apr;5(2):87-91. [CrossRef] [Medline]
  33. Staggers N, Rodney M. Promoting usability in organizations with a new health usability model: implications for nursing informatics. NI 2012 2012:396 [FREE Full text] [Medline]
  34. Guédria W, Bouzid H, Bosh G, Naudet Y, Chen D. eHealth interoperability evaluation using a maturity model. Stud Health Technol Inform 2012;180:333-337. [Medline]
  35. Kose I, Rayner J, Birinci S, Ulgu MM, Yilmaz I, Guner S, HIMSS Analytics Team, et al. Adoption rates of electronic health records in Turkish Hospitals and the relation with hospital sizes. BMC Health Serv Res 2020 Oct 21;20(1):967 [FREE Full text] [CrossRef] [Medline]
  36. Van de Wetering R, Batenburg R. A PACS maturity model: a systematic meta-analytic review on maturation and evolvability of PACS in the hospital enterprise. Int J Med Inform 2009 Feb;78(2):127-140. [CrossRef] [Medline]
  37. Carvalho JV, Rocha, Abreu A. Maturity Models of Healthcare Information Systems and Technologies: a Literature Review. J Med Syst 2016 Jun;40(6):1-10. [CrossRef] [Medline]
  38. Williams PA, Lovelock B, Cabarrus T, Harvey M. Improving Digital Hospital Transformation: Development of an Outcomes-Based Infrastructure Maturity Assessment Framework. JMIR Med Inform 2019 Jan 11;7(1):e12465 [FREE Full text] [CrossRef] [Medline]
  39. Khanbhai M, Flott K, Darzi A, Mayer E. Evaluating Digital Maturity and Patient Acceptability of Real-Time Patient Experience Feedback Systems: Systematic Review. J Med Internet Res 2019 Jan 14;21(1):e9076. [CrossRef]
  40. Kouroubali A, Papastilianou A, Katehakis DG. Preliminary Assessment of the Interoperability Maturity of Healthcare Digital Services vs Public Services of Other Sectors. Stud Health Technol Inform 2019 Aug 21;264:654-658. [CrossRef] [Medline]
  41. Orenstein EW, Muthu N, Weitkamp AO, Ferro DF, Zeidlhack MD, Slagle J, et al. Towards a Maturity Model for Clinical Decision Support Operations. Appl Clin Inform 2019 Oct 30;10(05):810-819. [CrossRef]
  42. Corrigan D, McDonnell R, Zarabzadeh A, Fahey T. A Multistep Maturity Model for the Implementation of Electronic and Computable Diagnostic Clinical Prediction Rules (eCPRs). EGEMS (Wash DC) 2015;3(2):1153 [FREE Full text] [CrossRef] [Medline]
  43. Van de Wetering R, Batenburg R, Lederman R. Evolutionistic or revolutionary paths? A PACS maturity model for strategic situational planning. Int J Comput Assist Radiol Surg 2010 Jul;5(4):401-409. [CrossRef] [Medline]
  44. Van Poelgeest R, van Groningen JT, Daniels JH, Roes KC, Wiggers T, Wouters MW, et al. Level of Digitization in Dutch Hospitals and the Lengths of Stay of Patients with Colorectal Cancer. J Med Syst 2017 May;41(5):84 [FREE Full text] [CrossRef] [Medline]
  45. MacKenzie SB, Podsakoff PM, Podsakoff NP. Construct Measurement and Validation Procedures in MIS and Behavioral Research: Integrating New and Existing Techniques. MIS Quarterly 2011;35(2):293-334. [CrossRef]
  46. Gomes J, Romão M. Information System Maturity Models in Healthcare. J Med Syst 2018 Oct 16;42(12):235. [CrossRef] [Medline]
  47. Kolukısa Tarhan A, Garousi V, Turetken O, Söylemez M, Garossi S. Maturity assessment and maturity models in health care: A multivocal literature review. Digit Health 2020;6:1-20 [FREE Full text] [CrossRef] [Medline]
  48. Barrett M, Oborn E, Orlikowski W. Creating Value in Online Communities: The Sociomaterial Configuring of Strategy, Platform, and Stakeholder Engagement. Inf Syst Res 2016 Dec;27(4):704-723. [CrossRef]
  49. Raghupathi W, Raghupathi V. Big data analytics in healthcare: promise and potential. Health Inf Sci Syst 2014;2:3 [FREE Full text] [CrossRef] [Medline]
  50. Hersh W. Editorial: Adding value to the electronic health record through secondary use of data for quality assurance, research, and surveillance. Am J Manag Care 2007;13(6):277-278. [CrossRef]
  51. Jha AK. Meaningful use of electronic health records: the road ahead. JAMA 2010 Oct 20;304(15):1709-1710. [CrossRef] [Medline]
  52. Yarmohammadian MH, Tavakoli N, Shams A, Hatampour F. Evaluation of organizational maturity based on people capacity maturity model in medical record wards of Iranian hospitals. J Educ Health Promot 2014;3:18-24 [FREE Full text] [CrossRef] [Medline]
  53. Ammenwerth E, Duftschmid G, Al-Hamdan Z, Bawadi H, Cheung NT, Cho K, Kırca, et al. International Comparison of Six Basic eHealth Indicators Across 14 Countries: An eHealth Benchmarking Study. Methods Inf Med 2020 Dec;59(S 02):e46-e63 [FREE Full text] [CrossRef] [Medline]
  54. Burmann A, Meister S. Practical Application of Maturity Models in Healthcare: Findings from Multiple Digitalization Case Studies. HEALTHINF 2021:100-110. [CrossRef]
  55. Downing NL, Bates DW, Longhurst CA. Physician Burnout in the Electronic Health Record Era: Are We Ignoring the Real Cause? Ann Intern Med 2018 Jul 03;169(1):50-51. [CrossRef] [Medline]
  56. Thordsen T, Murawski M, Bick M. How to Measure Digitalization? A Critical Evaluation of Digital Maturity Models. In: Responsible Design, Implementation and Use of Information and Communication Technology. Skukuza, South Africa: Springer International Publishing; 2020:358-369.
  57. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, et al. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies. J Med Internet Res 2017 Nov 01;19(11):e367 [FREE Full text] [CrossRef] [Medline]
  58. Hussain M, Papastathopoulos A. Organizational readiness for digital financial innovation and financial resilience. Int J Prod Econ 2022 Jan;243:1-15. [CrossRef]
  59. Burton-Jones A, Akhlaghpour S, Ayre S, Barde P, Staib A, Sullivan C. Changing the conversation on evaluating digital transformation in healthcare: Insights from an institutional analysis. Inf Organ 2020 Mar;30(1):1-16. [CrossRef]


EMRAM: Electronic Medical Record Adoption Model
IT: information technology
MM: maturity model
NASSS: nonadoption, abandonment, scale-up, spread, and sustainability
PACS: picture archiving and communication system


Edited by A Mavragani; submitted 18.08.21; peer-reviewed by S Meister, G Martin, K Cresswell; comments to author 08.10.21; revised version received 02.12.21; accepted 28.12.21; published 30.03.22

Copyright

©Rhona Duncan, Rebekah Eden, Leanna Woods, Ides Wong, Clair Sullivan. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 30.03.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.