Background: Although clinical decision support (CDS) alerts are effective reminders of best practices, their effectiveness is blunted when clinicians, faced with an overabundance of inappropriate alerts, fail to respond. An electronic health record (EHR)–integrated machine learning (ML) algorithm is a potentially powerful tool to increase the signal-to-noise ratio of CDS alerts and positively impact clinicians’ interactions with these alerts in general.
Objective: This study aimed to describe the development and implementation of an ML-based signal-to-noise optimization system (SmartCDS) to increase the signal of alerts by decreasing the volume of low-value herpes zoster (shingles) vaccination alerts.
Methods: We built and deployed SmartCDS, which builds personalized user activity profiles to suppress shingles vaccination alerts unlikely to elicit clinician interaction. We extracted all records of shingles alerts from January 2017 to March 2019 from our EHR system, including 327,737 encounters, 780 providers, and 144,438 patients.
Results: During the 6 weeks of pilot deployment, the SmartCDS system suppressed an average of 43.67% (15,425/35,315) of potential shingles alerts (appointments) and maintained stable counts of weekly shingles vaccination orders (326.3 with the system active vs 331.3 in the control group; P=.38) and weekly user-alert interactions (1118.3 with the system active vs 1166.3 in the control group; P=.20).
Conclusions: All key statistics remained stable while the system was turned on. Although the results are promising, the characteristics of the system can be subject to future data shifts, which require automated logging and monitoring. We demonstrated that an automated, ML-based method and data architecture to suppress alerts are feasible without detriment to overall order rates. This work represents the first ML-based alert suppression model deployed in practice and serves as foundational work in encounter-level customization of alert display to maximize effectiveness.
Background and Significance
The potential effectiveness of clinical decision support (CDS) alerts as a scalable tool for promoting evidence-based vaccine administration has led to their frequent use by most health care systems seeking to maximize vaccination rates [- ]. CDS alerts prompting evidence-based practices have been extensively studied and shown to work best when delivered at an appropriate time and place in the clinical workflow, that is, when the clinician is prepared to receive the information [ - ]. Successful CDS alerts have reduced prescribing of brand-name antibiotics [ ], improved lipid management in renal transplant patients [ ], improved compliance with guidelines for treating HIV [ - ], reduced ordering of tests when costs were displayed [ ], and reduced inappropriate prescribing in the elderly through age-specific alerts [ - ].
Although CDS tools are effective reminders of best practices, their effectiveness is blunted by the context in which they are deployed; alert fatigue (clinician desensitization driven by an overwhelming number of poor-quality safety alerts) is the result of an ever-growing number of alerts in the electronic health record (EHR), leading clinicians to commonly ignore or fail to respond appropriately to alerts. Alert fatigue resulting from an excess of poor-quality alerts (eg, alerts firing at inappropriate times or for inappropriate patients) contributes to clinicians’ perception that the bulk of alerts are likely clinically insignificant regardless of their clinical message. As a result, clinicians now override most medication alerts [ - ] and are becoming increasingly desensitized to alarms [ - ]. Although there is limited consensus on how to measure alert fatigue and its unintended consequences, data show that alert fatigue significantly impacts the clinician experience and patient care [ - ].

At our large academic health system, the number of active interruptive alerts for providers grew from 13 in 2012 to 107 in 2018, a more than eightfold increase. In December 2018, our providers ordered the shingles vaccine in response to just 6.43% (2219/34,531) of the alerts, indicating that our clinicians view a majority of these alerts as inappropriate. Consequently, an improved EHR experience for clinicians has become an institutional priority for many health systems, including our own.
Individual-level factors, including clinicians’ bias toward ignoring alerts, poor signal detection resulting from the overwhelming number of alerts, and poor alert reliability, add to the degraded effectiveness of CDS and a worse user experience [, ]. Optimizing a CDS system means optimizing the signal-to-noise ratio of its alerts: increasing the signal, decreasing the noise created by an abundance of inappropriate or poorly timed alerts, or both. To this end, prior work on medication alerts and monitoring alarms has implemented advanced interventions that use rules to surface or suppress alerts, intending to improve the CDS alert signal. A study using basic rules to deactivate irrelevant alerts and manually alter the frequencies of other alerts based on severity decreased override rates from 33.6 to 4.6 per 100 orders [ ]. Similar severity ranking increased the alert acceptance rate by 50%, despite a 60% increase in alert events [ ]. Other studies attempting to reduce noise, however, achieved limited or mixed results [ - ].
Research indicates that delivering alerts at the appropriate time and place in the clinical workflow is key to effective CDS [- ]. Prior work to optimize CDS tools has focused on manual approaches [ , ]; these have proven time consuming, difficult to maintain, and static, limiting scalability. Optimization and incorporation of more sophisticated rules to surface or suppress alerts achieve only limited reductions [ - ]. Machine learning (ML) is a powerful tool for identifying patterns in complex data by using past data to predict future performance. The use of ML in health care has proliferated over the past 10 years across a variety of use cases. ML applied to EHR data specifically shows promise as a tool for improving safety and quality of care; its application to problems such as predicting readmission and sepsis demonstrates the ability of ML to better target alerts to the appropriate user and use case [ - ]. An EHR-integrated ML algorithm is a potentially powerful tool to improve the quality of care by increasing the signal-to-noise ratio of alerts and thereby positively impacting clinicians’ interactions with them. To date, the informatics literature lacks both prospective evaluations of signal-to-noise optimization interventions and detailed accounts of the operational steps necessary to implement ML models in clinical care. Using the shingles vaccination alert as our initial use case, we leveraged historical EHR interaction data (clicks) and patient and provider sociodemographic data to (1) build and train an ML model that predicts the likelihood of provider interaction with the shingles vaccination alert and (2) establish the data architecture necessary to deploy the model in a live environment.
The objective of this case study was to describe the development, implementation, and prospective evaluation of a novel, ML-based, CDS signal-to-noise optimization (SmartCDS) system that suppresses low-value alerts, applied to a shingles vaccination CDS alert.
This work was conducted within a large urban academic hospital system with approximately 1300 beds over several satellite locations. In the fiscal year 2016, 3584 doctors and 4899 nurses treated approximately 38,000 inpatient admissions, 5.8 million outpatient visits, and 150,000 emergency department visits.
This study uses all data from January 1, 2017, to March 11, 2019, to maintain consistency with the shingles alert content and its clinical setting. The dataset includes a total of 695,311 shingles alerts presented to 780 providers over 327,737 encounters, covering 144,438 unique patients. The overall alert interaction rate (any action toward acknowledging the shingles alert in an encounter) during this period was 16%, and the overall rate of shingles vaccine orders in response to the alert was 5%. The alert response options are illustrated in; providers may choose from four actions: opening a SmartSet to sign vaccine orders for targeted patients, overriding health maintenance, postponing or customizing the health maintenance modifier based on refusal, and recording deferral or other decisions made by patients. The alert appears on a side tab at the right side of the EHR interface and may also be ignored or closed when no action is taken.
Alert Suppression Model Construction
A retrospective data query was performed to extract data related to the shingles vaccine alert. Key data elements to extract from the EHR were determined using a combination of descriptive analysis and clinical expertise. After data cleaning, historical changes in alert usage were analyzed to determine the optimal period from which to extract data for model training (2 years of data, from January 1, 2017, to December 31, 2018). Using data from our alert system, we examined average alert response rates and providers’ interaction histories with the alerts to determine a protocol for assigning one unique provider to each alert encounter. Initial analyses demonstrated large variation among clinicians in the frequency of interaction with the alerts (0%-92%), prompting our team to construct variables for an individual clinician’s activity history; these were expanded into several short-term and long-term activity history variables capturing response rates, alert volume, and demographic variables for both clinicians and patients. The clinician-level features affecting response include (1) demographics, clinical roles, and specialties; (2) response rates to previous shingles alerts (both short term and long term); and (3) the number of recent encounters. The patient-level data included patient demographics and the history of targeted shingles alert responses and shingles vaccine orders by clinicians. In addition, a binary flag distinguishing walk-in visits from scheduled office visits was included, as the architecture did not capture walk-in visits in our pilot implementation.
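As a sketch of the activity-history feature construction described above, the short-term and long-term response-rate variables might be derived as follows. Column names and the window size are illustrative assumptions, not the study's actual schema:

```python
import pandas as pd

def build_activity_features(df: pd.DataFrame, short_window: int = 10) -> pd.DataFrame:
    """Derive per-clinician alert-response history features.

    df: one row per alert instance with columns provider_id,
    alert_time, interacted (0/1). All names are hypothetical.
    """
    df = df.sort_values(["provider_id", "alert_time"]).copy()
    grouped = df.groupby("provider_id")["interacted"]
    # Long-term: response rate over all *prior* alerts (shifted to avoid
    # leaking the current alert's outcome into its own feature).
    df["long_term_rate"] = grouped.transform(
        lambda s: s.shift().expanding().mean())
    # Short-term: response rate over the most recent prior alerts.
    df["short_term_rate"] = grouped.transform(
        lambda s: s.shift().rolling(short_window, min_periods=1).mean())
    # Recent activity volume: count of prior alerts for this provider.
    df["prior_alert_count"] = grouped.cumcount()
    return df
```

The `shift()` before each aggregation keeps the features strictly historical, matching the study's use of past interactions to predict the next one.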
Machine Learning Model
The model was designed as a binary classification task. The target labels were based on whether an alert instance was interacted with or whether a follow-up order for shingles vaccination was placed in each primary care visit. The data were split randomly by individual clinician into 80%, 10%, and 10% sets for model training, validation, and testing, respectively, as illustrated in. XGBoost was employed as our ML algorithm, with a learning rate of 0.3, maximum tree depth of 6, minimum child weight of 1, no subsampling, negative log loss, and early stopping (with a maximum of 50 rounds). The validation set was used to monitor model training through early stopping, to derive the operational score threshold, and to evaluate model performance; the test set was used to retrospectively evaluate the effectiveness of the score threshold and the generalizability of the trained model. To evaluate model performance, we obtained a sample of nearly 65,000 primary care visits. The model, which adopts individual profiling of providers to reduce the number of clinically insignificant alerts, was highly effective, with an average area under the receiver operating characteristic curve of 0.919 and an average area under the precision-recall curve of 0.562 using 5-fold cross-validation. Our simulation found that of the 50.00% (6490/12,980) lowest ranked vaccination alerts, 99.77% (6475/6490) would have been ignored by providers if not suppressed. Given that nested cross-validation estimated the corresponding order reduction at a conservative 1%, a 50% suppression threshold was selected in collaboration with clinical stakeholders [ ]. Because the model relies on personal history for its features, it is updated daily to incorporate the latest data and to refresh the 50% score threshold during prospective implementation; upcoming appointments are folded into training, making the training window rolling.
The patients’ appointments for initial primary care visits are excluded from suppression.
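The threshold derivation and the retrospective suppression simulation described above can be sketched as follows. This is a minimal stand-in for the study's procedure; the function names and the synthetic usage are illustrative:

```python
import numpy as np

def suppression_threshold(scores, suppress_frac=0.5):
    """Operational cutoff: alerts whose predicted interaction score
    falls in the lowest `suppress_frac` of the validation set are
    suppressed."""
    return float(np.quantile(scores, suppress_frac))

def simulate_suppression(scores, interacted, suppress_frac=0.5):
    """Retrospective simulation: of the suppressed alerts, how many
    would a provider actually have interacted with (ie, been
    wrongly hidden)?"""
    thr = suppression_threshold(scores, suppress_frac)
    suppressed = scores < thr
    wrongly_hidden = int(interacted[suppressed].sum())
    return thr, int(suppressed.sum()), wrongly_hidden
```

Recomputing the cutoff from fresh validation scores each day mirrors the daily refresh of the 50% score threshold described in the text.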
Pilot Design and Evaluation
A pilot study was designed over 6 weeks () in alternating weekly cycles (model turned on for 1 week, then off for the next) to verify that the data distribution in the training/validation set was applicable to that in production. In the pilot, key statistical measures were examined with the model both turned on and turned off to compare prospective model performance with estimates generated in the retrospective evaluation. Provider responses and follow-up orders associated with suppressed shingles alerts cannot be measured; therefore, prospective model performance was evaluated using the percentage of daily suppressed alerts, the daily alert response rate, the weekly shingles vaccination order count, and the alerts-per-order rate. Weekly aggregated measures were employed because of weekly patterns detected in the clinical setting (eg, Wednesdays and weekend days featured lower alert volume).
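A one-sided two-sample comparison of weekly counts of this kind can be illustrated with a Welch t statistic, here applied to the weekly order means and SDs reported for the pilot. This is a hand-rolled sketch; the study does not specify its exact test implementation:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch (unequal-variance) two-sample t statistic."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Weekly shingles order counts over the pilot, mean (SD), n = 3 weeks
# per arm: model on 326.3 (23.6) vs model off 331.3 (5.1).
t_orders = welch_t(326.3, 23.6, 3, 331.3, 5.1, 3)
```

The small statistic (|t| ≈ 0.36) is consistent with the nonsignificant weekly order difference reported in the Results (P=.38); the exact P value also depends on the degrees-of-freedom correction used.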
Architecture for Signal-to-Noise Optimization System Deployment
After the construction of the ML model for alert suppression (see Methods), we built a new data architecture to operationalize the model (). The overall signal-to-noise optimization (SmartCDS) system was broken into three components: (1) the data extraction module, which identifies planned visits for the next day, with the intent of identifying upcoming vaccine alerts to suppress, and queries the EHR to extract the variables required to run the ML module; (2) the suppression ML model itself (the ML module built as described above); and (3) the suppression module, which leverages a series of application programming interface calls to the EHR to communicate the alerts that should be suppressed. The data extraction module queries the EHR and extracts the features that are then passed to the ML module.
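The three-module flow might be orchestrated roughly as follows. All names and signatures here are hypothetical stand-ins for the EHR-specific components, not the production code:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Appointment:
    encounter_id: str
    features: dict

def run_smartcds_cycle(
    extract: Callable[[], Iterable[Appointment]],  # (1) data extraction module
    score: Callable[[dict], float],                # (2) ML module
    suppress: Callable[[str], None],               # (3) suppression module (EHR API call)
    threshold: float,
) -> List[str]:
    """One nightly SmartCDS pass: score tomorrow's planned visits and
    suppress alerts whose predicted interaction score falls below the
    operational threshold."""
    suppressed = []
    for appt in extract():
        if score(appt.features) < threshold:
            suppress(appt.encounter_id)  # API call into the EHR
            suppressed.append(appt.encounter_id)
    return suppressed
```

Passing the three components in as callables keeps them independently replaceable, which is one way to realize the modularity the architecture description emphasizes.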
The steps, related tasks, and timeline for the development and operationalization of the system are detailed in.
| Step | Duration |
| --- | --- |
| Alert suppression model construction | 3 months |
| Aggregation of production data | 2 weeks |
| Web service endpoint configuration | 3 weeks |
| Machine learning script optimization | 1 month |
| Docker image and container formation | 1 week |
| Reporting and dashboard development | 2 weeks |
aEHR: electronic health record.
Aggregation of Production Data and Configuration of a Web Service Endpoint
With the alert suppression model built and data aggregated to support production, we worked with our institutional EHR team to create predefined and operationally approved queries to build easy-to-access views of our variables of interest in the EHR database. This enabled and automated the data extraction needed to operationalize the SmartCDS system. We then established a local database to serve as a repository for monitoring and tracking data runs, reports, and system errors per best practices. To complete our work on creating the technical capacity necessary to implement the SmartCDS system, we worked with the enterprise information technology team to determine the appropriate Web service to call; we then created the rules necessary to appropriately respond to the data sent to that endpoint and, if appropriate, suppress the target alert (in this case, the shingles vaccination alert).
Optimization of the Machine Learning Script and Formation of the Docker Image and Container
Once the system was built, we validated SmartCDS with the shingles alert. Predictions are made on upcoming appointments by applying the model to our predefined views; the modified ML script generates model predictions (suppress: yes or no), which are saved in a local database and applied to suppress any alert with a predicted score below the threshold (additional details in Methods). The system was designed to be modular and orthogonal with regard to call frequency, instrumentation, and configuration, allowing easy adaptation to new environments. Under these parameters, we formed Docker containers (standard units of software that package code and all of its dependencies so that each application runs quickly and reliably from one computing environment to another) to appropriately configure the Web service, define the frequency and timing of functionality, and log the system’s normal and error events.
Reporting Dashboard Development
A dashboard was developed to ensure that the system was running properly and that model performance remained stable from a safety and operational perspective, and to monitor process outcomes of interest. The dashboard features process outcomes of interest (eg, suppression percentage, daily order counts, and daily alert volume) and factors in the timing and frequency of report delivery based on feedback from clinical and operational stakeholders. Daily logging and weekly monitoring reports (and ) were constructed to enable detection of abnormal model behavior related to data shifts or model failures. If, on any date, the alert-related volume diverges from previous patterns, the alert log stream and the related EHR data can be examined to locate the source of the anomaly.
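The divergence check might look like the following minimal sketch; the z-score cutoff and function names are illustrative assumptions rather than the dashboard's actual logic:

```python
import statistics

def volume_anomaly(history, today, z_cut=3.0):
    """Flag a day whose alert-related volume deviates more than
    `z_cut` standard deviations from the trailing history of daily
    volumes, as a trigger for manual log inspection."""
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return abs(today - mean) > z_cut * sd
```

In practice the trailing window would need to respect the weekly volume patterns noted earlier (eg, comparing Wednesdays with Wednesdays), which this sketch omits.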
Daily Alert–Related Volumes
We leveraged the 6 weeks of data (January 19-March 11, 2019) to compare shingles alert counts, interacted alert counts, and order counts between weeks with the model turned on and weeks with it turned off. Across the 6-week alternating cycle, we observed a 42.2% reduction in mean weekly alert count (3541.0 with the model turned on vs 6123.7 with the model turned off) and no significant reduction in the interacted alert count (one-sided two-sample t test; P=.20) or the order count (one-sided two-sample t test; P=.38) ().
| Group | Time range (weeks) | Alert count, mean (SD) | Interacted alert count, mean (SD) | Order count, mean (SD) | Accumulated alerts per order rate | Accumulated interacted alerts per order rate |
| --- | --- | --- | --- | --- | --- | --- |
| Model turned off | 3 | 6123.7 (232.9) | 1162.3 (22.8) | 331.3 (5.1) | 18.5 | 3.4 |
| Model turned on | 3 | 3541.0 (669.4) | 1118.3 (71.6) | 326.3 (23.6) | 10.9 | 3.5 |
Alerts per Order Rate and Signal-to-Noise Ratio
Our 6-week pilot deployment of the system in the live environment indicates an alert suppression rate of 43.7% of 35,315 appointments (), slightly lower than the predefined 50% threshold, with stable shingles vaccine order volume (no statistically significant difference between active and inactive suppression). Initial inspection showed that, on average, walk-in visits had a higher alert-ignored rate (91%) than scheduled office visits (87%) in 2017 and 2018. Because the model operated only on scheduled appointments in this study and activity history carries the greatest weight in a suppression decision, a suppression rate slightly below 50% was expected.
The ratio of alerts fired to orders placed with the model turned on was almost half that with the model turned off, whereas the ratio of interacted alerts per order placed remained the same. Mapping the average orders placed as the average power of the signal and the average count of ignored alerts with no follow-up orders as the average power of the noise, the signal-to-noise ratio changed from 5.7% to 10.1%, a 78% increase. Furthermore, mapping the interacted alerts (including follow-up orders) as signal and the ignored alerts with no follow-up actions as noise, the signal-to-noise ratio changed from 23.4% to 46.1%, a 97% increase.
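Under the two signal-to-noise definitions above, the ratios can be recomputed directly from the weekly means reported for the pilot:

```python
def signal_to_noise(orders, alerts, interacted=None):
    """Signal-to-noise ratio as defined in the text.

    By default, mean orders placed are the signal and ignored alerts
    with no follow-up order are the noise; if `interacted` is given,
    interacted alerts are the signal instead. Assumes every order and
    every interaction corresponds to a fired alert.
    """
    signal = orders if interacted is None else interacted
    return signal / (alerts - signal)

# Weekly means from the pilot: model off (6123.7 alerts, 331.3 orders,
# 1162.3 interacted) vs model on (3541.0, 326.3, 1118.3).
snr_off = signal_to_noise(331.3, 6123.7)
snr_on = signal_to_noise(326.3, 3541.0)
snr_int_off = signal_to_noise(331.3, 6123.7, interacted=1162.3)
snr_int_on = signal_to_noise(326.3, 3541.0, interacted=1118.3)
```

These reproduce, to rounding, the 5.7% to 10.1% and 23.4% to 46.1% shifts reported above.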
This paper describes the steps and considerations involved in the development and implementation of an ML model for suppressing low-value shingles vaccination alerts in the EHR. As predicted in our simulation, validation of this signal-to-noise optimization (SmartCDS) system demonstrated a substantial reduction in shingles vaccine alerts at limited expense in vaccine ordering. The rate of daily alert interaction among individual clinicians during the 6-week pilot was higher with the model turned on than with it turned off. This result was expected given the 42.2% lower volume of shingles alerts observed alongside stable daily alert interactions. Interestingly, the overall interaction rate gradually decreased over the 6-week cycle (). This is consistent with prior findings that responsiveness to alerts tends to decrease over time [ ]. During the 6-week pilot, the profile of the providers who accepted the alerts did not change, indicating that the profile of patients who were offered the vaccination did not change either; this will be confirmed in our follow-up studies.

To date, our literature review indicates that our SmartCDS system is the first ML-based system to suppress clinically insignificant alerts, or alerts unlikely to be accepted, and to be prospectively evaluated in a large-scale health care system. Relevant literature to date has been limited to retrospective studies focused on identifying false-positive or clinically insignificant physiologic monitor alarms (false alarms). In 2015, PhysioNet opened a challenge to reduce false arrhythmia alarms using a subset of the Medical Information Mart for Intensive Care II waveform database [ ]. The best models showed that, by allowing 30 seconds of delay, false alarms can be better distinguished from true alarms; the best models achieved an 80% reduction in false alarms while missing 1% of true alarms.
Studies focusing on pulse oximetry to reduce false peripheral capillary oxygen saturation (SpO2) alarms, intracranial pressure alarms, and general vital sign monitoring alarms found mixed results, ranging from 25% to 47% alarm reduction with 0% to 5% false-negative rates [ - ]. A more recent study showed that, by allowing delays of up to 3 minutes and using physiologic monitoring waveforms including electrocardiography, SpO2, and arterial blood pressure, an ML model can achieve slightly better performance but fails to generalize stably to unseen data [ ].
The development of a robust reporting structure allows for the logging and monitoring of the system and its impact on clinical outcomes, which are necessary to ensure the stability and safety of the system. Future work will involve gathering feedback from front-line stakeholders to support adaptation of the signal-to-noise optimization system to other alerts; enabling the system to ingest real-time data; further developing the reporting dashboard with effective, user-centered data displays; and establishing a systematic process for setting organizationally acceptable thresholds for alert suppression.
During the pilot implementation and evaluation, the model only operated on scheduled office visits because of infrastructure gaps restricting the ability to incorporate walk-in visits. We are working to address this gap to be able to assess the effectiveness and impact of this model on a global level. On the other hand, it is possible that clinicians will start to adjust to the volume change in the shingles alert delivery, leading to less responsiveness and less ordering. As potential external or systematic biases, such as seasonal effects, could lead to inaccurate observations and conclusions, we will implement a more comprehensive statistical evaluation after updating the infrastructure to systematically address these potential biases.
Our model presented high discriminatory power in the initial prospective evaluation of shingles alert interactions. Our approach was effective in suppressing unnecessary alerts, with limited reduction in overall order volume. This work also provides preliminary evidence that decreasing noise (via suppression) increases the signal-to-noise ratio of interactions and orders. In addition, the process built to operationalize this new ML tool may prove a useful model for deploying tools of this type across many use cases. Future efforts include applying this approach to other EHR alerts globally and conducting comprehensive randomized controlled trials.
The authors thank Dr Simon Jones for providing insights and helping validate the statistical evaluation approaches.
Conflicts of Interest
Daily count of the shingles best practice advisories (BPAs; alerts), interacted shingles BPAs, and shingles BPA follow-up vaccine orders from January 19, 2019, to March 11, 2019. Each bar in all three plots represents one day: purple (dark) if the model was turned on and light green with a black border if the model was turned off; shaded areas annotate dates excluded from downstream statistical analysis. BPA: best practice advisory.
- McDonald CJ, Hui SL, Tierney WM. Effects of computer reminders for influenza vaccination on morbidity during influenza epidemics. MD Comput 1992;9(5):304-312. [Medline]
- Dexter PR, Perkins S, Overhage JM, Maharry K, Kohler RB, McDonald CJ. A computerized reminder system to increase the use of preventive care for hospitalized patients. N Engl J Med 2001 Sep 27;345(13):965-970. [CrossRef] [Medline]
- Kim DK, Riley LE, Hunter P, Advisory Committee on Immunization Practices. Recommended immunization schedule for adults aged 19 years or older, United States, 2018. Ann Intern Med 2018 Feb 6;168(3):210-220. [CrossRef] [Medline]
- Osheroff JA, Teich JM, Middleton B, Steen EB, Wright A, Detmer DE. A roadmap for national action on clinical decision support. J Am Med Inform Assoc 2007;14(2):141-145 [FREE Full text] [CrossRef] [Medline]
- Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. J Am Med Assoc 1998 Oct 21;280(15):1339-1346. [CrossRef] [Medline]
- Overhage JM, Tierney WM, Zhou X, McDonald CJ. A randomized trial of 'corollary orders' to prevent errors of omission. J Am Med Inform Assoc 1997;4(5):364-375 [FREE Full text] [CrossRef] [Medline]
- Dexheimer JW, Talbot TR, Sanders DL, Rosenbloom ST, Aronsky D. Prompting clinicians about preventive care measures: a systematic review of randomized controlled trials. J Am Med Inform Assoc 2008;15(3):311-320 [FREE Full text] [CrossRef] [Medline]
- Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, et al. Grand challenges in clinical decision support. J Biomed Inform 2008 Apr;41(2):387-392 [FREE Full text] [CrossRef] [Medline]
- Bernstein SL, Whitaker D, Winograd J, Brennan JA. An electronic chart prompt to decrease proprietary antibiotic prescription to self-pay patients. Acad Emerg Med 2005 Mar;12(3):225-231 [FREE Full text] [CrossRef] [Medline]
- Garthwaite EA, Will EJ, Bartlett C, Richardson D, Newstead CG. Patient-specific prompts in the cholesterol management of renal transplant outpatients: results and analysis of underperformance. Transplantation 2004 Oct 15;78(7):1042-1047. [CrossRef] [Medline]
- Safran C, Rind DM, Davis RM, Currier J, Ives D, Sands DZ, et al. An electronic medical record that helps care for patients with HIV infection. Proc Annu Symp Comput Appl Med Care 1993:224-228 [FREE Full text] [Medline]
- Safran C, Rind D, Davis R, Ives D, Sands D, Currier J, et al. Guidelines for management of HIV infection with computer-based patient's record. Lancet 1995 Aug 5;346(8971):341-346. [CrossRef] [Medline]
- Safran C, Rind D, Sands D, Davis R, Wald J, Slack W. Development of a knowledge-based electronic patient record. MD Comput 1996;13(1):46-54, 63. [Medline]
- Tierney WM, Miller ME, McDonald CJ. The effect on test ordering of informing physicians of the charges for outpatient diagnostic tests. N Engl J Med 1990 May 24;322(21):1499-1504. [CrossRef] [Medline]
- Simon S, Smith D, Feldstein A, Perrin N, Yang X, Zhou Y, et al. Computerized prescribing alerts and group academic detailing to reduce the use of potentially inappropriate medications in older people. J Am Geriatr Soc 2006 Jun;54(6):963-968. [CrossRef] [Medline]
- Shah N, Seger A, Seger D, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving acceptance of computerized prescribing alerts in ambulatory care. J Am Med Inform Assoc 2006;13(1):5-11 [FREE Full text] [CrossRef] [Medline]
- Tamblyn R, Huang A, Perreault R, Jacques A, Roy D, Hanley J, et al. The medical office of the 21st century (MOXXI): effectiveness of computerized decision-making support in reducing inappropriate prescribing in primary care. Can Med Assoc J 2003 Sep 16;169(6):549-556 [FREE Full text] [Medline]
- Gaikwad R, Sketris I, Shepherd M, Duffy J. Evaluation of accuracy of drug interaction alerts triggered by two electronic medical record systems in primary healthcare. Health Informatics J 2007 Sep;13(3):163-177. [CrossRef] [Medline]
- Smith DH, Perrin N, Feldstein A, Yang X, Kuang D, Simon SR, et al. The impact of prescribing safety alerts for elderly persons in an electronic medical record: an interrupted time series evaluation. Arch Intern Med 2006 May 22;166(10):1098-1104. [CrossRef] [Medline]
- Seidling HM, Schmitt SP, Bruckner T, Kaltschmidt J, Pruszydlo MG, Senger C, et al. Patient-specific electronic decision support reduces prescription of excessive doses. Qual Saf Health Care 2010 Oct;19(5):e15. [CrossRef] [Medline]
- Hussain MI, Reynolds TL, Zheng K. Medication safety alert fatigue may be reduced via interaction design and clinical role tailoring: a systematic review. J Am Med Inform Assoc 2019 Oct 1;26(10):1141-1149. [CrossRef] [Medline]
- Isaac T, Weissman JS, Davis RB, Massagli M, Cyrulik A, Sands DZ, et al. Overrides of medication alerts in ambulatory care. Arch Intern Med 2009 Feb 9;169(3):305-311. [CrossRef] [Medline]
- van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc 2006;13(2):138-147.
- Carspecken CW, Sharek PJ, Longhurst C, Pageler NM. A clinical case of electronic health record drug alert fatigue: consequences for patient outcome. Pediatrics 2013 Jun;131(6):e1970-e1973.
- Hu X, Sapo M, Nenov V, Barry T, Kim S, Do DH, et al. Predictive combinations of monitor alarms preceding in-hospital code blue events. J Biomed Inform 2012 Oct;45(5):913-921.
- Voepel-Lewis T, Parker ML, Burke CN, Hemberg J, Perlin L, Kai S, et al. Pulse oximetry desaturation alarms on a general postoperative adult unit: a prospective observational study of nurse response time. Int J Nurs Stud 2013 Oct;50(10):1351-1358.
- Bonafide CP, Lin R, Zander M, Graham CS, Paine CW, Rock W, et al. Association between exposure to nonactionable physiologic monitor alarms and response time in a children's hospital. J Hosp Med 2015 Jun;10(6):345-351.
- Korniewicz DM, Clark T, David Y. A national online survey on the effectiveness of clinical alarms. Am J Crit Care 2008 Jan;17(1):36-41.
- Embi PJ, Leonard AC. Evaluating alert fatigue over time to EHR-based clinical trial alerts: findings from a randomized controlled study. J Am Med Inform Assoc 2012 Jun;19(e1):e145-e148.
- Genco EK, Forster JE, Flaten H, Goss F, Heard KJ, Hoppe J, et al. Clinically inconsequential alerts: the characteristics of opioid drug alerts and their utility in preventing adverse drug events in the emergency department. Ann Emerg Med 2016 Feb;67(2):240-8.e3.
- Joint Commission. Medical device alarm safety in hospitals. Sentinel Event Alert 2013 Apr 8(50):1-3.
- McCoy A, Thomas E, Krousel-Wood M, Sittig D. Clinical decision support alert appropriateness: a review and proposal for improvement. Ochsner J 2014;14(2):195-202.
- Ong M, Coiera E. Evaluating the effectiveness of clinical alerts: a signal detection approach. AMIA Annu Symp Proc 2011;2011:1036-1044.
- Simpao AF, Ahumada LM, Desai BR, Bonafide CP, Gálvez JA, Rehman MA, et al. Optimization of drug-drug interaction alert rules in a pediatric hospital's electronic health record system using a visual analytics dashboard. J Am Med Inform Assoc 2015 Mar;22(2):361-369.
- Cornu P, Steurbaut S, Gentens K, van de Velde R, Dupont AG. Pilot evaluation of an optimized context-specific drug-drug interaction alerting system: a controlled pre-post study. Int J Med Inform 2015 Sep;84(9):617-629.
- Paterno MD, Maviglia SM, Gorman PN, Seger DL, Yoshida E, Seger AC, et al. Tiering drug-drug interaction alerts by severity increases compliance rates. J Am Med Inform Assoc 2009;16(1):40-46.
- Beccaro MA, Villanueva R, Knudson KM, Harvey EM, Langle JM, Paul W. Decision support alerts for medication ordering in a computerized provider order entry (CPOE) system: a systematic approach to decrease alerts. Appl Clin Inform 2010;1(3):346-362.
- Guzek M, Zorina OI, Semmler A, Gonzenbach RR, Huber M, Kullak-Ublick GA, et al. Evaluation of drug interactions and dosing in 484 neurological inpatients using clinical decision support software and an extended operational interaction classification system (Zurich Interaction System). Pharmacoepidemiol Drug Saf 2011 Sep;20(9):930-938.
- Czock D, Konias M, Seidling HM, Kaltschmidt J, Schwenger V, Zeier M, et al. Tailoring of alerts substantially reduces the alert burden in computerized clinical decision support for drugs that should be avoided in patients with renal disease. J Am Med Inform Assoc 2015 Jul;22(4):881-887.
- Eppenga WL, Derijks HJ, Conemans JM, Hermens WA, Wensing M, de Smet PA. Comparison of a basic and an advanced pharmacotherapy-related clinical decision support system in a hospital care setting in the Netherlands. J Am Med Inform Assoc 2012;19(1):66-71.
- Harinstein LM, Kane-Gill SL, Smithburger PL, Culley CM, Reddy VK, Seybert AL. Use of an abnormal laboratory value-drug combination alert to detect drug-induced thrombocytopenia in critically ill patients. J Crit Care 2012 Jun;27(3):242-249.
- Connor C, Gohil B, Harrison M. Triggering of systolic arterial pressure alarms using statistics-based versus threshold alarms. Anaesthesia 2009 Feb;64(2):131-135.
- Borowski M, Siebig S, Wrede C, Imhoff M. Reducing false alarms of intensive care online-monitoring systems: an evaluation of two signal extraction algorithms. Comput Math Methods Med 2011;2011:143480.
- Scalzo F, Hu X. Semi-supervised detection of intracranial pressure alarms using waveform dynamics. Physiol Meas 2013 Apr;34(4):465-478.
- Schmid F, Goepfert MS, Franz F, Laule D, Reiter B, Goetz AE, et al. Reduction of clinically irrelevant alarms in patient monitoring by adaptive time delays. J Clin Monit Comput 2017 Feb;31(1):213-219.
- van de Velde S, Heselmans A, Delvaux N, Brandt L, Marco-Ruiz L, Spitaels D, et al. A systematic review of trials evaluating success factors of interventions with computerised clinical decision support. Implement Sci 2018 Aug 20;13(1):114.
- Futoma J, Morris J, Lucas J. A comparison of models for predicting early hospital readmissions. J Biomed Inform 2015 Aug;56:229-238.
- Mao Q, Jay M, Hoffman JL, Calvert J, Barton C, Shimabukuro D, et al. Multicentre validation of a sepsis prediction algorithm using only vital sign data in the emergency department, general ward and ICU. BMJ Open 2018 Jan 26;8(1):e017833.
- Osheroff JA. Improving Medication Use and Outcomes with Clinical Decision Support: A Step by Step Guide. Chicago, IL: HIMSS Publishing; 2009.
- Chen J, Aphinyanaphongs Y, Mann D, Iturrate E, Chokshi S, Hegde R. Personalized clinical decision support (CDS) using machine learning: a case study with shingles. In: Proceedings of the AMIA 2019 Clinical Informatics Conference. 2019 Presented at: AMIA'19; May 1-2, 2019; Atlanta, GA. URL: https://www.amia.org/cic2019/presentations
- Clifford G, Silva I, Moody B, Li Q, Kella D, Shahin A, et al. The PhysioNet/Computing in Cardiology Challenge 2015: reducing false arrhythmia alarms in the ICU. Comput Cardiol (2010) 2015 Sep;2015:273-276.
- Chen L, Dubrawski A, Wang D, Fiterau M, Guillame-Bert M, Bose E, et al. Using supervised machine learning to classify real alerts and artifact in online multisignal vital sign monitoring data. Crit Care Med 2016 Jul;44(7):e456-e463.
Abbreviations
- CDS: clinical decision support
- EHR: electronic health record
- ML: machine learning
- SpO2: peripheral capillary oxygen saturation
Edited by G Eysenbach; submitted 30.10.19; peer-reviewed by A Al, P Marier-Deschenes; comments to author 25.12.19; revised version received 19.02.20; accepted 21.02.20; published 29.04.20.

Copyright
©Ji Chen, Sara Chokshi, Roshini Hegde, Javier Gonzalez, Eduardo Iturrate, Yin Aphinyanaphongs, Devin Mann. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.04.2020.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.