“It’s déjà vu all over again.” – Yogi Berra

Artificial intelligence (AI) has the potential to improve care quality by decreasing medical errors and reducing clerical burdens for physicians1. While this embrace of AI is exciting to many providers, it leaves others with a sense of déjà vu: less than twenty years ago, the same promises were made about the then-novel Electronic Health Record (EHR)2. But instead of fostering frictionless healthcare delivery, EHRs have contributed to greater resentment and distrust toward healthcare technologies because of their negative impacts on both clinicians and patients. If doing the same thing over again and expecting a different result is insanity, then the implementation of AI in healthcare needs to be approached with caution. Can we learn from the failures of the EHR to guide the implementation of AI in medicine, or is history destined to repeat itself?

Overpromised and underdelivered

EHR adoption in the early 2000s was driven by noble intentions: improving patient care, increasing efficiency, and reducing healthcare costs. The Institute of Medicine’s landmark report, “To Err is Human: Building a Safer Health System,” highlighted the alarming frequency of medical errors and called for electronic systems to enhance patient safety3. Thus, the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 provided financial incentives for healthcare providers to adopt EHRs, based on their potential to reduce costly errors and save money through increased research and efficiency4.

However, reality fell woefully short of these lofty expectations. Amid the rush to secure funding and capitalize on a greenfield market, quality was sacrificed for rapid availability. The result? Poorly designed interfaces that disrupt workflows and frustrate clinicians. Clinicians continue to face missed diagnoses and care inefficiencies, though these now often stem from information overload, alert fatigue, and the challenges of navigating complex EHR systems5. While these technologies may help prevent errors in specific scenarios, their widespread use has inadvertently hindered the very thing they were meant to improve: patient safety.

From burden to burnout

Furthermore, although EHRs were designed to make information recording and extraction more efficient, the opposite has occurred. EHRs require extensive data entry, which cuts into patient care time and leads to exhaustion. Some physicians spend over 50% of their clinical time at an EHR console, not to mention the additional time spent charting from home after work6,7. Most physicians and nurses went into medicine to provide direct care for patients, not to be data entry clerks. It's unsurprising, then, that decreased patient interaction owing to increased EHR tasks has resulted in dissatisfaction, detachment, and burnout among clinicians. In fact, EHR implementation studies have found a significant increase in provider burnout and decreased job satisfaction8. Perceived work stress associated with EHR use, including complex interfaces and message overload, has also been linked to burnout. A 2019 cross-sectional survey found that physicians who used EHRs and computerized physician order entry had a higher risk of burnout than those who did not9.

New horizon or only a mirage?

To be fair, the push for EHR adoption did have some downstream effects that have led us to this AI moment. It resulted in the creation of vast repositories of digital health data that now serve as the foundation for developing and training AI algorithms. The rich, structured data captured by EHRs (albeit through the labor of many human healthcare providers) has created an unprecedented opportunity for AI to uncover patterns, predict outcomes, and support clinical decision-making. AI-powered tools promise to revolutionize patient care. They may now match or surpass human performance on certain tasks, catching minute details we might otherwise overlook or summarizing important information for us. Some argue that by integrating AI with EHRs, we could finally realize some of the benefits initially envisioned, which are just as relevant today as twenty years ago: improving patient outcomes, increasing efficiency, and reducing costs and errors10. But how different is a machine learning algorithm that detects rare events from a human-encoded hard-stop alert? If we want to stop running in circles and pursue the ideals of a learning healthcare system, we need to actually learn from past mistakes and assumptions.

Learning from machines

Here, the implementation of EHRs offers crucial lessons for the integration of AI in healthcare. Foremost among these is the critical importance of user-centered design and workflow integration. EHRs have often been criticized for their poor usability and interference with clinical workflows, leading to physician burnout and decreased patient interaction11. To avoid similar pitfalls, AI systems must be designed with extensive input from clinicians and seamlessly integrated into existing workflows. This necessitates a collaborative approach involving clinicians, data scientists, and user experience experts throughout the development and implementation process. Furthermore, AI tools should enhance, rather than replace, clinical decision-making, serving as cognitive aids that augment physician expertise.

Of course, user-centered design alone cannot overcome issues like the fragmentation of health data across disparate EHR systems, which has hindered the realization of their full potential12. Another crucial lesson from EHR implementation is the need for robust interoperability and data standardization. For AI to be effectively deployed in healthcare, it is imperative to establish shared formats and protocols that facilitate seamless information exchange across different institutions and AI platforms. This interoperability will not only enhance the performance of AI algorithms by providing access to larger, more diverse datasets but also ensure that AI-generated insights can be meaningfully integrated into clinical care across various healthcare settings.
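To make the payoff of such standardization concrete, the sketch below pulls the same laboratory observation from two institutions and pools it into a single, model-ready structure. It is a minimal sketch only: the server URLs and patient identifier are hypothetical placeholders, FHIR R4 is assumed as one example of a shared exchange format, and the LOINC code 4548-4 is used here as the standardized identifier for hemoglobin A1c.

```python
# Minimal sketch: retrieving the same standardized lab result (HbA1c, LOINC 4548-4)
# from two hypothetical FHIR R4 endpoints and normalizing it for downstream AI use.
# Endpoint URLs and the patient ID are illustrative placeholders, not real services.
import requests

FHIR_SERVERS = [
    "https://ehr-hospital-a.example.org/fhir",   # hypothetical institution A
    "https://ehr-clinic-b.example.org/fhir",     # hypothetical institution B
]
LOINC_HBA1C = "http://loinc.org|4548-4"          # standardized code for hemoglobin A1c

def fetch_hba1c(base_url: str, patient_id: str) -> list[dict]:
    """Query one FHIR server for a patient's HbA1c observations and
    return them in a shared, model-ready structure."""
    resp = requests.get(
        f"{base_url}/Observation",
        params={"patient": patient_id, "code": LOINC_HBA1C},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    results = []
    for entry in bundle.get("entry", []):
        obs = entry["resource"]
        results.append({
            "source": base_url,
            "patient": patient_id,
            "loinc": "4548-4",
            "value": obs.get("valueQuantity", {}).get("value"),
            "unit": obs.get("valueQuantity", {}).get("unit"),
            "time": obs.get("effectiveDateTime"),
        })
    return results

if __name__ == "__main__":
    # Because both servers speak the same standard, pooling requires no
    # site-specific translation logic.
    pooled = []
    for server in FHIR_SERVERS:
        pooled.extend(fetch_hba1c(server, patient_id="example-patient-123"))
    print(pooled)
```

The point of the sketch is that the aggregation step is trivial precisely because formats and vocabularies were agreed upon before, not after, deployment; that is the practical meaning of interoperability for AI.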

Lastly, the EHR experience underscores the importance of ongoing evaluation, iterative improvement, and comprehensive training. Many of the issues with EHRs emerged over time and were exacerbated by inadequate responses to user feedback and insufficient system refinement13. As AI systems are deployed in healthcare, it is crucial to establish rigorous frameworks for continuous monitoring of their performance, safety, and impact on clinical outcomes. Regular audits should be conducted to detect and mitigate potential biases or errors in AI algorithms, which would also help build trust and confidence in their use. Moreover, healthcare institutions must invest in comprehensive, ongoing training programs to ensure that clinicians can effectively leverage AI tools while maintaining their clinical acumen.
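What a "regular audit" might look like in practice can be sketched simply. The example below compares a deployed classifier's sensitivity across patient subgroups on a recent batch of adjudicated predictions and flags gaps that would warrant human review. The field names, the 0.05 disparity threshold, and the synthetic data are illustrative assumptions, not a reference implementation of any institution's audit framework.

```python
# Minimal sketch of a recurring subgroup audit for a deployed classifier.
# Field names ("group", "label", "prediction") and the 0.05 gap threshold
# are illustrative assumptions.
from collections import defaultdict

def subgroup_sensitivity(records: list[dict]) -> dict[str, float]:
    """Compute sensitivity (true-positive rate) per patient subgroup."""
    tp = defaultdict(int)
    fn = defaultdict(int)
    for r in records:
        if r["label"] == 1:  # only true positives/false negatives affect sensitivity
            if r["prediction"] == 1:
                tp[r["group"]] += 1
            else:
                fn[r["group"]] += 1
    groups = set(tp) | set(fn)
    return {g: tp[g] / (tp[g] + fn[g]) for g in groups if tp[g] + fn[g] > 0}

def audit(records: list[dict], max_gap: float = 0.05) -> list[str]:
    """Flag subgroups whose sensitivity trails the best-performing group
    by more than `max_gap`, for escalation to human reviewers."""
    sens = subgroup_sensitivity(records)
    best = max(sens.values())
    return [f"Review {g}: sensitivity {s:.2f} vs best {best:.2f}"
            for g, s in sens.items() if best - s > max_gap]

if __name__ == "__main__":
    # Toy batch of recent, adjudicated predictions (entirely synthetic).
    batch = (
        [{"group": "A", "label": 1, "prediction": 1}] * 90
        + [{"group": "A", "label": 1, "prediction": 0}] * 10
        + [{"group": "B", "label": 1, "prediction": 1}] * 75
        + [{"group": "B", "label": 1, "prediction": 0}] * 25
    )
    for flag in audit(batch):
        print(flag)  # e.g. "Review B: sensitivity 0.75 vs best 0.90"
```

Run on a schedule against recent, clinician-adjudicated cases, even a check this simple turns "ongoing evaluation" from an aspiration into a routine that surfaces drift and disparities before they erode trust.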

Conclusion

The potential of AI to improve healthcare is undeniable. But achieving it depends on successful integration into care, something the EHR experience reminds us technology alone cannot accomplish. Just as AI algorithms learn through iterative exposure to past successes and failures, so too must our approach to implementing them in healthcare. Emphasizing user-centered design, interoperability, and ongoing evaluation is crucial for AI's judicious integration into medicine. And this learning process should extend beyond these technologies to the entire healthcare ecosystem. The effective integration of AI in medicine depends not on algorithms alone, but on our ability to learn, adapt, and thoughtfully incorporate these tools into the complex, human-driven healthcare system. Through this human-centered approach, we can harness AI's capabilities to improve patient outcomes, increase efficiency, and reduce errors in ways that previous technological interventions could only dream of achieving. Otherwise, providers will be left where they are now: with a promising technology at their fingertips, but with an uncanny feeling that it might not help so much as hinder their ability to provide care. In other words, a sense of déjà vu.