Published in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/55472.
Applying Implementation Science to Advance Electronic Health Record–Driven Learning Health Systems: Case Studies, Challenges, and Recommendations

Viewpoint

1Department of Family Medicine, School of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

2Adult and Child Center for Outcomes Research and Delivery Science Center, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

3Department of Biomedical Informatics, School of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

4Colorado Center for Personalized Medicine, School of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

5Division of Hospital Medicine, School of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

6Caring Health Center, Springfield, MA, United States

7Division of General Internal Medicine, School of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

8Ludeman Family Center for Women’s Health Research, University of Colorado Anschutz Medical Campus, Aurora, CO, United States

9VA Eastern Colorado Geriatric Research Education and Clinical Center, Aurora, CO, United States

Corresponding Author:

Katy E Trinkley, PharmD, PhD

Department of Family Medicine

School of Medicine

University of Colorado Anschutz Medical Campus

12631 E 17th Ave

Mail Stop F496

Aurora, CO, 80045

United States

Phone: 1 303 724 3103

Email: [email protected]


With the widespread implementation of electronic health records (EHRs), there has been significant progress in developing learning health systems (LHSs) aimed at improving health and health care delivery through rapid and continuous knowledge generation and translation. To support LHSs in achieving these goals, implementation science (IS) and its frameworks are increasingly being leveraged to ensure that LHSs are feasible, rapid, iterative, reliable, reproducible, equitable, and sustainable. However, 6 key challenges limit the application of IS to EHR-driven LHSs: barriers to team science, limited IS experience, data and technology limitations, time and resource constraints, the appropriateness of certain IS approaches, and equity considerations. Using 3 case studies from diverse health settings and 1 IS framework, we illustrate these challenges faced by LHSs and offer solutions to overcome the bottlenecks in applying IS and utilizing EHRs, which often stymie LHS progress. We discuss the lessons learned and provide recommendations for future research and practice, including the need for more guidance on the practical application of IS methods and a renewed emphasis on generating and accessing inclusive data.

J Med Internet Res 2024;26:e55472

doi:10.2196/55472


Nearly all health care settings in the United States use electronic health records (EHRs) to document the continuum of patient care [1,2], a trend that is increasingly evident in other countries as well [3-5]. Driven by federal incentives and regulations, health systems are being progressively encouraged or even mandated to document patient care in structured and actionable ways to facilitate reporting on quality of care metrics. As the breadth and depth of actionable data captured in EHRs expand, health systems are increasingly able to develop learning health systems (LHSs) [6,7]. In these systems, EHR data and other decision support features are utilized to advance the quintuple aim of health care: improving population health, promoting health equity, reducing health care costs, and enhancing both patient and care team experiences [8].

An LHS aims to “align science, informatics, incentives and culture for continuous improvement and innovation, with best practices seamlessly embedded in the health care delivery process in such a way that new knowledge is captured as an integral by-product” [9]. While neither the concept nor the term LHS is new, significant progress has recently been made in their development, largely due to the growing capabilities of EHRs that make LHSs more feasible. As interest in EHR-driven LHSs increases, so does the interest in implementation science (IS), which plays a crucial role in ensuring that LHSs are feasible, rapid, iterative, valid, reliable, reproducible, equitable, and sustainable [10,11]. Both IS and LHSs aim to advance health and health care in ways that are locally relevant and externally valid, with IS providing methods and approaches that can help achieve these goals within LHSs [10,12].

“IS is the study of how evidence-based practices are feasibly adopted, implemented, and sustained in real-world settings” [10,12]. A core aspect of IS is the use of its theories, models, and frameworks (TMFs), which are theory-driven approaches for evaluating the context in which LHSs are implemented and for guiding the selection of an implementation evaluation plan for a given LHS learning cycle project. IS and its TMFs draw on methods and theories from various disciplines and evolve over time as they are applied to new perspectives and situations, including different LHSs. A unique strength of IS and its TMFs is their capacity to adapt as new scientific methods emerge, while rigorously translating evidence into practice in ways that are locally relevant, yet generalizable, replicable, sustainable, and equitable [12]. Central to achieving the benefits of IS is the use of its TMFs, which offer a systematic and replicable approach to adapting evidence-based practices to local settings in ways that are both pragmatic and scientifically robust. IS TMFs can be integrated with broader LHS frameworks [13,14] to provide more specific guidance on aligning and evaluating the overarching LHS or learning cycle with the local context in ways that remain generalizable [10]. The adaptability of IS, along with its dual emphasis on locally relevant and externally valid findings, makes it particularly well-suited to support the goals of LHSs in their rapidly changing and inherently complex settings. While the success of LHSs undeniably requires a multidisciplinary approach, with theories and models from other disciplines playing a key role [15], this paper specifically highlights the value of applying IS to advance LHSs [16].

With the increasing use of IS TMFs for LHSs, certain inherent challenges have emerged, partly due to the limitations of EHRs [17,18]. Although EHRs serve as a rich data source, they often lack crucial data needed to effectively apply TMFs, such as key patient-reported outcomes and social determinants of health [19]. Such data are often too complex to collect consistently for most patients but are essential for informing patient-centered care decisions and evaluating the impact of a learning cycle. While EHR constraints challenge the application of IS TMFs, these frameworks can also help address some limitations of EHRs, such as expanding the use of EHR data beyond patient care and billing [19,20]. Nonetheless, there is little guidance, and considerable ambiguity, in applying IS TMF constructs to learning cycle projects, particularly when using EHR data. Additionally, some perceive TMFs as overly academic, with concerns that operationalizing them is resource-intensive and not practical, intuitive, or sufficiently rapid [18,21].

To advance the goals of LHSs, there is a critical need for guidance and support on pragmatically adapting and applying IS TMFs. This adaptation must align with the varying resources and expectations of different health systems and work within existing constraints, such as competing priorities, resource limitations, and data availability. The purposes of this paper are to (1) discuss the benefits and challenges of applying IS TMFs to EHR-driven LHS learning cycles; (2) outline the key features of a widely used IS TMF, the Practical, Robust Implementation and Sustainability Model (PRISM), as applied to LHS research; (3) provide 3 pragmatic case studies of this application; and (4) explore future directions, challenges, and opportunities for incorporating IS TMFs to support rapid LHSs and address the limitations of EHR data. We offer recommendations and resources for leaders, clinicians, implementers, quality improvement specialists, and researchers involved in developing an LHS or conducting learning cycles. These recommendations focus on how to feasibly adapt IS approaches and methods to various LHS situations.


Overview of PRISM Applications

In this article, we review 3 case studies illustrating the pragmatic application of an IS TMF and its associated methods, PRISM, to LHSs. These examples highlight useful and practical applications of PRISM and serve as a basis for discussing challenges and future directions related to the use of IS methods and EHR data.

Although many IS TMFs exist [22,23], we selected PRISM to concretely illustrate the application of a TMF within an LHS. PRISM is an expanded version of the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework, incorporating 4 context domains and the 5 RE-AIM outcome dimensions [17,24-27], as detailed in Tables 1 and 2. PRISM evolved from a blend of frameworks from agriculture (Diffusion of Innovation), engineering (Plan-Do-Study-Act quality improvement cycles), and health care (Chronic Care Model) [24]. PRISM is designed to be used throughout the life cycle of a study or project to guide the systematic assessment and alignment of the project with its context, aiming to maximize equitable impact on relevant outcomes and sustainability. PRISM underscores the importance of aligning a project with the diverse perspectives and characteristics of various partners, including those who will be directly or indirectly impacted or involved in approving and funding the programs, such as frontline staff, clinicians, clinic or department leaders, system-level leaders, and boards of directors. For effective alignment, it is crucial to ensure the representation of these partners. A key aspect of promoting equity is defining relevant outcomes that are important from diverse perspectives and measuring the representativeness of these outcomes across various demographics (patients) and types of clinics or providers (settings) [27]. PRISM’s RE-AIM outcomes facilitate discussions about relevant and meaningful outcome measures at different levels of perspective (eg, leadership, clinician, patient) while emphasizing representativeness and pragmatic issues such as adoption and uptake. PRISM also takes into account the external context (eg, policies, guidelines) and the support or infrastructure available for initial implementation and sustainability (eg, resources, audit, and feedback processes). This consideration enhances the likelihood that the project will continue beyond the study timeline or funding. The systematic approach provided by a TMF like PRISM, combined with the RE-AIM pragmatic outcomes, allows a project to be adapted and scaled up in ways that are locally relevant to different situations and health systems.
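For LHS teams that track learning cycles programmatically, the PRISM context domains and RE-AIM dimensions summarized in Tables 1 and 2 can be encoded as a simple project checklist. The Python sketch below is purely illustrative: the domain and dimension names come from PRISM and RE-AIM, but the LearningCycle class, its fields, and the example measure are hypothetical and not part of PRISM or any published tool.

```python
from dataclasses import dataclass, field

# PRISM context domains and RE-AIM outcome dimensions (see Tables 1 and 2).
PRISM_CONTEXT_DOMAINS = [
    "Patient and organizational characteristics",
    "Patient and organizational perspectives of the intervention",
    "Implementation and sustainability infrastructure",
    "External environment",
]
RE_AIM_DIMENSIONS = ["Reach", "Effectiveness", "Adoption", "Implementation", "Maintenance"]


@dataclass
class LearningCycle:
    """Hypothetical tracking record for one LHS learning cycle (not part of PRISM itself)."""
    name: str
    phase: str  # eg, "planning", "implementation", or "sustainment/evaluation"
    context_notes: dict = field(default_factory=lambda: {d: "" for d in PRISM_CONTEXT_DOMAINS})
    outcome_measures: dict = field(default_factory=lambda: {d: [] for d in RE_AIM_DIMENSIONS})

    def unaddressed(self):
        """List PRISM domains and RE-AIM dimensions that have no content yet."""
        missing = [d for d, v in self.context_notes.items() if not v]
        missing += [d for d, v in self.outcome_measures.items() if not v]
        return missing


# Example: a heart failure CDS learning cycle with one planned Reach measure.
cycle = LearningCycle(name="Heart failure CDS", phase="planning")
cycle.outcome_measures["Reach"].append("% of eligible patients whose clinician saw the CDS alert")
print(cycle.unaddressed())
```

A structure like this simply makes visible which PRISM domains and RE-AIM outcomes a team has or has not yet addressed for a given learning cycle.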

To illustrate the challenges and potential solutions of applying IS TMFs to LHS learning cycles and using EHR data, we present retrospective case studies of 3 applications of PRISM to different LHS projects across diverse health care settings. These case studies were selected based on the authors’ experiences to represent various types of projects and settings, as well as different approaches to applying PRISM and addressing key issues within LHSs. We focus on identifying challenges and solutions related to (1) operationalizing and applying PRISM within the EHR-embedded LHS context; (2) pragmatically adapting PRISM and its RE-AIM outcomes based on available resources, expectations regarding speed, and data availability; and (3) effectively utilizing EHR data. The solutions provided aim to offer guidance and direction on addressing these challenges and improving the practical application of PRISM and other TMFs and IS methods to EHR-driven LHS.

To facilitate a systematic evaluation of how PRISM was applied in each case study and to minimize recall bias, we first adapted our previously published framework for a fully mature LHS to illustrate where and how PRISM and its associated methods can be applied (Figure 1) [10]. The original framework was designed to be agnostic to any specific IS TMF, so we modified it by overlaying PRISM and highlighting where and how it integrates with the broader LHS framework. In this figure, PRISM guides and informs key aspects and activities of an LHS, including representativeness and equity of perspectives and outcomes; achieving local relevance and external validity or generalizability; rapidity of change and impact; and designing for sustainability. This adapted figure of a fully mature LHS was used to stimulate recall and identify challenges and solutions when applying PRISM and using EHR data. For each case study, we thematically reflected on the challenges and both actual and potential solutions at each phase of the implementation continuum (ie, preimplementation or planning, implementation, and sustainment or evaluation) [28]. This reflection was conducted both inductively to identify new themes and deductively by considering preidentified themes. We also identified aspects of the EHR that pose barriers to applying PRISM and other IS TMFs, which are crucial to address in order to achieve the aspirational goals of a high-functioning LHS.

Table 1. PRISM’sa contextual domains.
PRISM context domain and description
Patient and organizational characteristics
  • The characteristics, priorities, and needs of the setting, including those affected by or involved in the intervention, are crucial to consider when designing the intervention.
  • LHSsb should design learning cycles to align with the priorities and values of the setting.
Patient and organizational perspectives of the intervention
  • The setting’s view of the intervention, including the perspectives of those directly and indirectly affected, influences its uptake and impact.
  • LHSs should prioritize learning cycles that address local gaps in ways that are both relevant and acceptable to the setting.
Implementation and sustainability infrastructure
  • The time, staff, and money required to feasibly implement and maintain the intervention are crucial considerations. This also includes alignment with existing processes, norms, and priorities to promote sustainability.
  • LHSs must consider the available resources and supporting infrastructure to ensure sustainability and design learning cycles with lasting effects.
External environment
  • This includes clinical practice guidelines, policies or regulations, reimbursement mechanisms, and other incentives such as national benchmarking.
  • LHSs are influenced by contextual factors outside their health setting and consider published literature when prioritizing and designing learning cycles.

aPRISM: Practical, Robust Implementation and Sustainability Model.

bLHS: learning health system.

Table 2. PRISM’sa RE-AIMb outcome dimensions.
RE-AIM outcome dimensionc and description
Reach
  • Who was intended to benefit and who participated or was exposed to the intervention?
  • Representativeness and equity of reach
  • LHSsd should assess what proportion of the target group was impacted and consider the characteristics of those affected to ensure representativeness and equity in reach.
Effectiveness
  • What was the most important benefit you were trying to achieve and what were the negative outcomes (eg, safety issues)?
  • Include quality of life and equity of outcomes.
  • LHSs should evaluate the impact from various perspectives and compare the characteristics of those who were positively and negatively affected.
Adoption
  • Where was the intervention applied and who applied it, and who declined?
  • LHSs should examine how and why uptake or use varies among different users and settings.
Implementation
  • How consistently was the intervention delivered? (Fidelity)
  • How was it adapted?
  • How much did it cost?
  • Equity and representativeness (subgroup effects) across implementation outcomes
  • LHSs should consider how and why adaptations are made based on different contextual influences, as well as the costs associated with initial and ongoing implementation. This information is crucial for making informed decisions about local sustainability and scalability.
Maintenance
  • How long was the intervention sustained and how long are the results sustained?
  • Equity and representativeness of maintenance
  • LHSs should plan for and assess sustainability and contextual drivers.

aPRISM: Practical, Robust Implementation and Sustainability Model.

bRE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance.

cIssues of proportion and representativeness of participants compared with all those eligible or those who decline are important and relevant across all RE-AIM dimensions.

dLHS: learning health system.

Figure 1. PRISM-adapted conceptual model of a fully mature learning health system. Notes: the orange denotes ways PRISM helps guide and inform key components and activities of a fully mature LHS, including alignment of a project with the context internal and external to the health system to ensure local relevance and external validity; consideration of inclusive precision health data to promote equity; engagement and integration of diverse partner perspectives to assess and align a project with the context in ways that promote equity and sustainability; iterative assessments and adaptations across all stages of a project (preimplementation planning, implementation, and evaluation/sustainment) to increase speed, effectiveness, and sustainability; use of pragmatic RE-AIM outcomes that consider equity and issues important to partners; and adapting how PRISM is applied based on available timelines and resources.

Case Studies

Overview

Below, we provide a brief description of each LHS-based project where PRISM was applied. Table 3 offers an overview of the different features of these case studies.

Table 3. Overview of the key differentiating features of the case studies.
EHRa-embedded CDSb for heart failure [29]
  • Problem addressed: Gaps in guideline-concordant prescribing of beta-blocker medications for patients with heart failure
  • Setting: The EHR of 28 primary care clinics in a large integrated health care system
  • Target audience: Primary care clinicians
  • Intervention and implementation strategies: A CDS tool was integrated into clinical EHR workflows to alert clinicians and recommend initiating a beta-blocker medication during patient visits
  • How PRISMg was used: For the planning and sustainment/evaluation phases
EHR-based dashboard for LUSc [30]
  • Problem addressed: Address the heightened need for accurate bedside chest imaging during the COVID-19 pandemic by implementing lung ultrasound among hospitalists
  • Setting: A quaternary care academic medical center
  • Target audience: Hospitalists
  • Intervention and implementation strategies: Hospitalist training in lung ultrasound and the use of iterative PRISM to implement lung ultrasound in the management of patients with COVID-19
  • How PRISM was used: Iteratively in the planning and implementation phases
FQHCd-led social needs screening and response equity study
  • Problem addressed: Define a multipartner implementation plan for social risk screening and response equity to address the CMSe mandate and emerging Medicaid ACOf equity scores
  • Setting: Partnership of multiple FQHCs
  • Target audience: Researchers and health center/ACO leaders and staff
  • Intervention and implementation strategies: Collaboration across FQHCs to develop a plan using the iPRISMh webtool iteratively as a strategy to examine social risk screening and response equity in ISi research
  • How PRISM was used: Iteratively in the planning phase for cocreation

aEHR: electronic health record.

bCDS: clinical decision support.

cLUS: lung ultrasound.

dFQHC: federally qualified health center.

eCMS: Centers for Medicare and Medicaid Services.

fACO: accountable care organization.

gPRISM: Practical, Robust Implementation and Sustainability Model.

hiPRISM: Iterative Practical, Robust Implementation and Sustainability Model.

iIS: implementation science.

Case Study 1: EHR-Embedded Clinical Decision Support for Heart Failure

PRISM was used to design and evaluate a clinical decision support (CDS) tool that recommended prescribing a beta-blocker for primary care clinicians treating patients with heart failure with reduced ejection fraction [29,31]. PRISM guided the systematic assessment of context, which included the following:

  • focus groups with patients to understand their treatment preferences and needs;
  • iterative, multilevel user-centered design and testing procedures, including focus groups and semistructured interviews with clinicians and clinician leaders; and
  • meetings with executive-level informatics and operational leaders, and governance groups.

Based on this iterative partner engagement process, the CDS tool was designed to align with the context while also addressing the technical and data limitations of the EHR. The process also involved identifying pragmatic RE-AIM outcomes that were important and relevant from various perspectives.

A total of 28 primary care clinics were cluster-randomized to either the new, contextually customized CDS tool or an active control group for 6 months. PRISM was used to guide a mixed methods evaluation and to identify adaptations that would promote sustainability and expand the scale of the CDS tool. Specifically, adoption and effectiveness were quantitatively assessed using EHR data. Clinicians were interviewed to determine whether either of the CDS tools should be continued and to understand what changes were needed to optimize adoption and effectiveness within their workflows. Based on the evaluation, the customized CDS tool was found to be more effective, and clinicians expressed a preference for its continued use. As a result, all clinics have since transitioned to the customized CDS tool, and its effectiveness has been sustained [32]. It was also determined that the CDS tool needs to include additional evidence-based medications for heart failure and should be expanded to cardiology clinics. Plans are underway to broaden the tool’s scope to include these medications and extend its use to cardiology clinics.
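As a rough illustration of how adoption and effectiveness might be summarized from EHR data in a cluster-randomized comparison like this one, the sketch below aggregates hypothetical clinic-level measures by arm. All field names and values are invented, and a formal evaluation would use methods that account for clustering rather than simple arm means.

```python
from statistics import mean

# Hypothetical clinic-level rows derived from an EHR extract (all names and values invented).
# adoption = proportion of CDS alerts acted on; effectiveness = proportion of eligible patients
# with heart failure prescribed a beta-blocker during the trial period.
clinics = [
    {"clinic_id": 1, "arm": "customized CDS", "adoption": 0.42, "effectiveness": 0.61},
    {"clinic_id": 2, "arm": "customized CDS", "adoption": 0.37, "effectiveness": 0.58},
    {"clinic_id": 3, "arm": "active control", "adoption": 0.21, "effectiveness": 0.49},
    {"clinic_id": 4, "arm": "active control", "adoption": 0.25, "effectiveness": 0.52},
]


def arm_summary(rows, arm):
    """Average clinic-level RE-AIM measures within one randomization arm."""
    subset = [r for r in rows if r["arm"] == arm]
    return {
        "clinics": len(subset),
        "mean_adoption": round(mean(r["adoption"] for r in subset), 3),
        "mean_effectiveness": round(mean(r["effectiveness"] for r in subset), 3),
    }


for arm in ("customized CDS", "active control"):
    print(arm, arm_summary(clinics, arm))
# A formal evaluation would account for clustering (eg, mixed-effects models),
# not the simple arm means shown here.
```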

Case Study 2: EHR-Based Dashboard for Lung Ultrasound

In this pilot implementation study, an iterative assessment process was used to ensure high-fidelity and equitable implementation of lung ultrasound for patients hospitalized with COVID-19 [30]. This was achieved through a recurring mixed methods evaluation of prioritized PRISM constructs, including context and progress on outcomes. The evaluation incorporated qualitative interviews and an operational dashboard that displayed prioritized RE-AIM outcomes using real-time EHR data.

Specifically, PRISM was used to assess context by guiding the interview questions posed to multilevel partners during the planning and implementation phases of this study. The questions were designed to identify and characterize barriers to implementation and to uncover potential strategies for overcoming these barriers. The contextual data collected were then used by implementers to inform iterative adaptations to implementation strategies, with the goal of improving PRISM’s RE-AIM outcomes. Additionally, prioritized outcomes were iteratively assessed quantitatively using an operational dashboard that displayed the representativeness and extent of reach and adoption of lung ultrasound.
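The following minimal sketch illustrates the kind of computation such a dashboard might perform: reach stratified by month and by patient subgroup so the team can screen for emerging disparities. The records, field names, and groupings are hypothetical and stand in for a real-time EHR feed.

```python
from collections import defaultdict

# Hypothetical rows pulled from the EHR, one per patient hospitalized with COVID-19;
# field names, groups, and values are invented for illustration.
patients = [
    {"month": "2020-07", "race_ethnicity": "Hispanic", "received_lus": True},
    {"month": "2020-07", "race_ethnicity": "White", "received_lus": False},
    {"month": "2020-08", "race_ethnicity": "Black", "received_lus": True},
    {"month": "2020-08", "race_ethnicity": "White", "received_lus": True},
]


def reach_by(rows, key):
    """Reach (patients scanned / eligible patients), stratified by a grouping key."""
    eligible, scanned = defaultdict(int), defaultdict(int)
    for r in rows:
        eligible[r[key]] += 1
        scanned[r[key]] += int(r["received_lus"])
    return {group: (scanned[group], eligible[group]) for group in eligible}


# Trend reach over time and screen for emerging disparities across subgroups.
print("Reach by month:", reach_by(patients, "month"))
print("Reach by race/ethnicity:", reach_by(patients, "race_ethnicity"))
```

Reporting both the numerator and denominator per stratum, as here, supports the biweekly review of reach, adoption, and representativeness described above.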

The implementation team met every 2 weeks to review outcomes displayed on the dashboard and the qualitative interview data. They assessed barriers to progress in reach and adoption and screened for evidence of emerging disparities in implementation. Based on these assessments, the team collaboratively decided which implementation strategies to deploy, adapt, or discontinue. This iterative PRISM approach, combined with an operational dashboard, offered a low-burden method for monitoring progress and disparities in implementation, as well as for making timely, data-driven adjustments to implementation strategies. This approach has since been expanded to support the equitable implementation of additional evidence-based applications of lung ultrasound (eg, management of heart failure) and other point-of-care ultrasound applications within the same health system.

Case Study 3: Social Needs Data Within Federally Qualified Health Centers

In this ongoing project, PRISM is being used to support cross-institutional partnership engagement and the cocreation of an LHS project focused on social risk screening and response equity across multiple federally qualified health centers (FQHCs). Building on long-standing collaborative research by an FQHC in Springfield, Massachusetts, the project brings together partners from a model B Medicaid Accountable Care Organization comprising multiple FQHCs in Massachusetts, collaborators from the Massachusetts Primary Care Association, and research partners from Harvard’s Implementation Science Center for Cancer Control Equity. The focus of the project was on mandated universal social risk screening and response requirements for FQHCs [33,34]. The team used PRISM to assess each partner’s perspective on anticipated RE-AIM outcomes and to identify contextual issues of importance.

To operationalize PRISM, the team used the Iterative Practical, Robust Implementation and Sustainability Model (iPRISM) webtool [17] during the preimplementation planning phase. The iPRISM webtool includes 21 assessment questions designed to systematically guide individuals or teams through the process of assessing context and anticipated RE-AIM outcomes for projects, while also facilitating shared input or cocreation. The iPRISM webtool was developed to help implementers efficiently apply PRISM to various types of projects and support implementation teams from diverse backgrounds (eg, clinicians/researchers with and without IS expertise). Each partner completed the iPRISM webtool independently (n=6; responses from 1 partner were not linked because they selected a different stage/phase within the webtool form).

During 2 sequential debrief meetings, the partners reviewed their responses, which led to increased clarity on (1) the RE-AIM and PRISM factors with lower scores, highlighting areas for priority focus; and (2) the variation in scoring across different partner perspectives, which contributed to a better understanding of each FQHC and defined multilevel assessment opportunities. Examining and discussing the mean scores and ranges of responses for each item provided valuable insights into both aspects. The tangible outcomes from this process included the development of a cocreated set of specific aims for the project, a better understanding of each FQHC’s perspective and contextual issues, and consensus among the partners to apply for grant funding to support the ongoing work as an innovation in FQHC-led social care research.
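To illustrate how mean scores and ranges across partner responses can surface priority areas and divergent perspectives, the sketch below summarizes hypothetical ratings. The items, 1-5 scale, and thresholds are invented for illustration; the actual iPRISM webtool poses 21 assessment questions and performs this summarization itself.

```python
from statistics import mean

# Hypothetical partner ratings (1-5 scale) for a few iPRISM-style assessment items;
# the real webtool has 21 questions, and these item names and thresholds are invented.
responses = {
    "Fit with clinic workflow": [4, 4, 5, 3, 4],
    "Resources for sustainment": [2, 3, 2, 4, 2],
    "Anticipated reach": [5, 2, 4, 1, 3],
}

for item, scores in responses.items():
    avg, spread = mean(scores), max(scores) - min(scores)
    flags = []
    if avg < 3:  # low mean score: candidate priority area
        flags.append("low score: priority focus")
    if spread >= 3:  # wide range: partner perspectives differ
        flags.append("divergent perspectives: discuss at debrief")
    print(f"{item}: mean={avg:.1f}, range={spread}", "|", "; ".join(flags) or "aligned")
```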


Challenges, Solutions, and Future Directions

In each case study, we identified several challenges, potential solutions, and considerations for future research when using EHR data, PRISM, and other IS approaches. The detailed evaluation and findings from each case study are described in Multimedia Appendix 1 and are categorized into 6 overarching themes of challenges with corresponding solutions. These themes and solutions are described below and summarized in Table 4. Across these themes, there are interdependencies, which are expected from a systems science perspective. For instance, resource and time constraints can exacerbate EHR data limitations, and varying levels of IS expertise can complicate the appropriate application or adaptation of PRISM and IS methods. For each challenge, we discuss the issues, potential solutions, and future directions.

Table 4. Challenges, solutions, and future directions.
Theme of challenge, description of challenge, and potential solutions or ways to mitigate challenges and future directions. For each theme below, the bullets describing the challenge are listed first, followed by bullets giving potential solutions, ways to mitigate the challenge, and future directions.
Team science issues: LHSa and ISb involve diverse teams with different backgrounds and perspectives
  • Level setting of vocabulary and terminology that is new or has different meanings in different fields
  • Shared goals and understanding of the problem and project are important
  • Can be difficult to moderate and understand different perspectives and ensure openness in sharing different perspectives
  • Consistency in applying TMFc to assess context/outcomes across individuals of a team
  • Employ team science best practices [35], including the creation of a shared vision and understanding of each partner’s perspectives.
  • Use the iPRISMd webtool to facilitate the use of team science best practices; level set vocabulary, issues, and goals; identify and summarize different perspectives; and consistently apply a TMF across partners.
Limited/no IS experience: LHS teams may have limited IS expertise
  • Makes the application of IS difficult, which is a barrier to using these methods that aim to improve relevance, sustainability, scalability, and equity
  • Identify external expertise or a consultant
  • Leverage existing resources and tools that make IS more accessible (eg, iPRISM webtool; see Multimedia Appendix 2 for additional tools/resources)
  • Create additional resources to increase the accessibility of IS, including guidance on how to (1) feasibly and systematically anticipate, mitigate, and assess for unintended consequences including exacerbation of inequities, and (2) design for sustainability, including identifying and securing resources
  • Invest in capacity building of IS including training and resources aimed at implementer/practitioner education in addition to implementation scientist training
Data and technology limitations: LHS and IS methods are limited by the data available
  • LHSs often rely on EHRe data, but the value of these data is limited by gaps in completeness, accuracy, and equity, by documentation biases, or by information that is difficult to capture because it is unstructured
  • Contextual data are often not documented or accessible in structured formats, which frequently limits contextual assessments to qualitative analyses that can be constrained by partial or small samples
  • Reliance solely on quantitative or qualitative data limits a full picture of the context and impact or outcomes (eg, issues of actual vs stated and depth of understanding)
  • Accessibility and functionality of software and technology to manage and use data can limit use
  • Proactively consider potential data issues and the implications for a given project, and develop workarounds (eg, proxies) or strategies (eg, transparency in reporting) to mitigate the negative impact
  • Promote better/different data collection practices to ensure high-quality, unbiased, and inclusive data documented in standardized and structured ways
  • Develop more guidance on how to qualitatively assess context when not accessible via quantitative data sources
  • Engage diverse partners in the decision of what data are collected and how they are collected
  • Invest in capacity building of personnel skilled in:
  1. making timely/relevant data accessible to implementers/system leaders (eg, can build dashboards)
  2. using advanced analytics such as natural language processing to transform data from unstructured to structured formats
  • Invest in capacity building of software and technology that is more accessible and tailored to the needs of LHS including:
  1. transforming unstructured data into structured formats
  2. analyzing data and images
  3. intuitively conveying complex data in meaningful visualizations and other ways on demand
  4. using decision support and other tools that are more precise and able to nimbly embed within existing clinical and EHR workflows
  5. interoperability or integration across different EHRs and health systems
Time and resource constraints: LHS and IS methods must fit within available timelines and resources
  • Health system or other timelines may be fast and constrain LHS projects and IS methods
  • System-level and implementation team–level resources and time may limit data access; collection of quality, representative, and iterative quantitative and qualitative data to assess context that dynamically changed over time; type, intensity, and frequency of partner engagement and other contextual assessment methodology that can be conducted; and sustained support.
  • Evaluations of context and outcomes may be limited to
  1. what is discretely documented unless skilled staff is available to apply advanced analytic approaches (eg, natural language processing) to abstract unstructured data
  2. availability of skilled staff to query data and generate a report
  3. staff time to manually collect data
  • The availability of skilled staff to pragmatically design and evaluate projects can limit internal or external validity
  • Avoid the trap of perfect and aim for a minimum viable product when there is no anticipated harm
  • Collaborate to extend resource availability with:
  1. trainees who can contribute meaningfully while gaining valuable experience and knowledge
  2. methodologists who can provide needed skills while gaining authorship opportunities
  • Apply IS “designing for sustainability” principles to plan for sustainability and develop the supporting infrastructure from the beginning
  • Leverage advanced analytics (eg, natural language processing, machine learning) and data visualization platforms (eg, dashboards) to automate data collection, analysis, and reporting back to implementers and partners which can:
  1. make iterative assessments of context and outcomes feasible and sustainable
  2. increase the speed of positive impact or change when data are acted on
  3. improve overall efficiency
  • Invest in capacity building of personnel with the skills that can build low-burden means (eg, dashboards) that allow for evaluation of process and effectiveness outcomes in real time
Appropriateness of certain IS principles, outcomes, and TMF constructs: IS needs to adapt to each situation
  • Some aspects may not be relevant or perceived as important
  • Some aspects may not apply or some outcomes may not be addressed because of:
  1. resource or data constraints (eg, limit iterative assessments of context or adaptations, deter costing analyses)
  2. expectations around speed (eg, system priorities or patient safety issues require quick action)
  3. anticipated benefits or needs preclude resource allocation for certain evaluations (eg, cost, unintended consequences)
  4. difficulty measuring certain outcomes (eg, rare clinical outcomes or time to event, denominators unavailable to assess representativeness) or establishing reasonable causality (eg, unable to control for other influences)
  • Application of IS can be challenging when the
  1. intervention is mandated and expected to change but is not evidence based and yet there is a need to (1) design for future and ongoing sustainability and (2) get buy-in from key partners
  2. evidence for an intervention varies by contextual situation (eg, all CDSf interventions are not equal)
  • Create best practices and guidance on how/when to adapt IS principles, outcomes, and TMF constructs for diverse projects and situations
  • Avoid the trap of perfect and work within the constraints of the available timeline, data, and other resources
  • More guidance is needed on how to apply IS to interventions that are mandated without an established evidence base including how to:
  1. use an iterative LHS approach to evaluate effectiveness at intervals under different situations and inform intervention adaptations until it is evidence based (ie, effective)
  2. shift the incentive for partnership and buy-in from the strength of evidence to the requirement to implement
  3. use a “designing for sustainability” approach to assist in prioritizing resource allocation for a project that is likely to change as the evidence evolves
  • For interventions with effectiveness that varies by contextual situation, mixed methods evaluations and transparent reporting of contextual factors will provide clarity of the conditions needed for effectiveness
Representation and equity: LHS and IS need to proactively promote and assess for equity in perspectives and outcomes
  • Partner engagement that does not represent the spectrum of perspectives:
  1. biases a project toward certain perspectives and priorities
  2. stymies equity
  • Incomplete access to accurate data limits the ability to
  1. evaluate the equitable impact of projects
  2. assess for unintended consequences such as exacerbation of inequities
  • Aim to represent the perspectives of partners engaged at each stage of the project (planning, implementation, and evaluation), rather than just an average perspective across all stages.
  • Use the iPRISM webtool [36] and other tools to systematically capture different perspectives
  • Consider less traditional data sources and methods such as crowdsourcing and social media to expand the representation and inclusion of diverse data types
  • Proactively consider potential inequities inherent within quantitative or qualitative data sources and any potential unintended consequences
  • Transparently report perspectives engaged and completeness of data evaluated to guard against lack of diversity in perspectives that inform a project

aLHS: learning health system.

bIS: implementation science.

cTMF: theories, models, and frameworks.

diPRISM: Iterative Practical, Robust Implementation and Sustainability Model.

eEHR: electronic health record.

fCDS: clinical decision support.

Challenge 1: Team Science

Team science refers to the cross-disciplinary collaboration necessary to address scientific questions and challenges [37]. Both LHSs and IS require multilevel engagement from partners with diverse roles, backgrounds, and perspectives, which introduces complexities. Representatives from various partner groups (eg, patients, community members, clinicians, leaders, nonclinical staff, and researchers from different fields) bring unique histories, perspectives, terminologies, biases, assumptions, and knowledge. Communication issues or differences in perspective are sometimes recognized early, but they can also remain unnoticed for extended periods.

These team science challenges are manageable and should be addressed during the planning phase and throughout the project [35]. Taking the time to understand and respect each partner’s perspective and developing a shared vision and vocabulary is essential for team effectiveness and efficiency. It is also important to continually emphasize that different perspectives are not only beneficial but also expected [35]. The use of tools such as the iPRISM webtool [17,36] can systematically capture the diverse perspectives on teams and summarize the mean and distribution of scores, which can be used to focus team discussions on areas with lower mean scores (indicating areas for improvement) or where scores vary and perspectives differ. In the future, greater use and availability of tools such as the iPRISM webtool are recommended to facilitate team science principles within LHS and when using IS, including other TMFs [38,39] beyond PRISM.

In the social needs example, the diverse partner team included researchers and executive-level leaders with a range of IS expertise. The iPRISM webtool offered a grounded and shared entry point and opportunity to contextualize each partner’s perspective and expertise by providing (1) a guided process assessment, (2) a framework to understand the team’s similarities and differences, and (3) a shared language. This deepened the team’s understanding of the range of potential themes related to social risk screening and response to be addressed, as well as the various contexts in which they need to be considered.

Challenge 2: Limited or No Implementation Science Experience

In an LHS, and within health systems more generally, there may be variable or no IS expertise, which can preclude the application of IS altogether or lead to inconsistent or incomplete implementation. Inconsistent application may result in replication issues or incomplete assessments of context or project alignment, ultimately affecting project outcomes and sustainability. Understanding how to apply IS principles and TMFs such as PRISM can be challenging without training, and it is often impractical to train all team members.

One potential solution is identifying external IS expertise, which may be feasible through existing consultation services [40-42]. Utilizing resources and tools that make IS more accessible to individuals and teams by simplifying its application can also help. Tools such as the iPRISM webtool [17] and other resources [43-50], which can increase the accessibility of IS, are described in Multimedia Appendix 2. In our case studies, we identified specific aspects of applying PRISM and IS generally that would benefit from greater guidance and tools, such as how to feasibly and systematically anticipate, assess for, and mitigate unintended consequences—including those that can exacerbate or create inequities—and how to design for sustainability, including identifying and securing resources. Capacity-building efforts are also needed to train implementation researchers and practitioners [51].

In the social needs example, the iPRISM webtool was used successfully to standardize the application of IS across a team that had variable IS experience, including some members who had no IS experience.
At the time of the heart failure and lung ultrasound examples, the lead researchers were being mentored through an institutional implementation science K12 program and are now independent implementation scientists supporting others’ LHS projects.

Challenge 3: Data and Technology Limitations

Timely and feasible access to complete, accurate, and actionable quantitative and qualitative data is a universal challenge for LHS and IS [6,10,52-54]. These challenges limit outcome evaluations, contextual assessments, and the capacity to rapidly and strategically design interventions and programs that are equitable and optimally fit within workflows [55-58]. Often considered together under the umbrella of “informatics,” access to the needed technology is also a challenge. Even when technologies are accessible, they do not always have the functionality needed to seamlessly embed within clinical workflows. Furthermore, there may not be enough skilled staff to use or configure the technology to achieve the goals of delivering the “right” information at the “right” time, to the “right” person, in the “right” format, and through the “right” channel [59].

When faced with data and technology limitations, it is important to first recognize the potential issues and how they might impact a project, and then explore workarounds or strategies to monitor for potential downstream consequences. Workarounds to address data access issues include creating proxies or estimates and conducting sensitivity or subgroup analyses, which may not be precise but can provide valuable insights and aid in understanding. Addressing technology functional limitations often requires adopting a “good enough” mindset, provided the benefits are expected to outweigh potential sacrifices in user experience and no harm is anticipated [60,61]. Other strategies include transparently reporting the data and acknowledging technology limitations, allowing the audience to make informed interpretations of the findings. The current reality may compel LHS projects to operate within existing constraints, but it is crucial to advocate for the goals of truly inclusive precision health [62], which requires comprehensive integration of data to support holistic care decisions. Guidance is needed to help current LHS teams optimize their projects and evaluations within these constraints. However, it is also important to challenge and push against current limitations when necessary. For example, when available data are known to produce biased information or fail to include critically important patient or contextual factors, the health system may need to add essential data elements or utilize novel methods [63,64] to achieve impactful and equitable results [56,62,65,66].

In the lung ultrasound example, reach was trended month to month as a prioritized outcome to monitor the progress of implementation. However, presenting reach simply as the percentage of eligible patients who received lung ultrasound was somewhat misleading because the denominator, the number of patients hospitalized with COVID-19, fluctuated greatly with each surge of the pandemic. In this case, it was more transparent and less misleading to present reach with the numerator and denominator visible rather than a percentage alone. The heart failure example illustrates ways to work within the functional constraints of technology that is good enough while proactively considering potential unintended consequences.
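As a small illustration of the reach-reporting point from the lung ultrasound example, the sketch below prints the numerator and denominator alongside the percentage; the monthly counts shown are invented.

```python
# Hypothetical monthly counts: lung ultrasounds performed (numerator) and patients
# hospitalized with COVID-19 (denominator), which swings sharply between surges.
monthly_reach = [
    ("2020-11", 18, 210),  # surge month: large denominator
    ("2021-06", 9, 25),    # lull: small denominator inflates the percentage
]

for month, scanned, eligible in monthly_reach:
    pct = 100 * scanned / eligible
    # Showing n/N alongside the percentage keeps an apparent jump in reach from being
    # misread when the denominator has simply collapsed between surges.
    print(f"{month}: reach {scanned}/{eligible} ({pct:.0f}%)")
```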

Challenge 4: Time and Resource Constraints

In most projects, restrictions on time, availability of skilled personnel, and other resources can slow data access or preclude it entirely. Additionally, the time required from the implementation team and participants can limit the breadth of perspectives engaged and the frequency of qualitative assessments. These limitations can negatively impact the equity and sustainability of projects.

To make progress, LHS teams often need to adapt evaluation plans and partner engagement methods to fit within available time and resource constraints. Finding partners and collaborators who can generate win-win situations and extend resources is also crucial for overcoming these limitations. There is also a growing availability of consultant-type services to support LHS initiatives, offer specific methodological expertise, or connect projects with needed resources [67,68]. Advances in artificial intelligence, such as natural language processing and machine learning, can enhance efficiency and reduce resource usage by automating data collection and analysis. Although these advancements are increasingly present within LHS, improving accessibility to these skills, resources, and software for automation is essential for further enhancing efficiency. Additionally, while the application of artificial intelligence approaches can offer significant benefits, it is important to exercise caution and carefully balance and vet automated processes [69,70]. To further improve efficiency, creating and disseminating “how-to” or implementation guides and recommendations [42,71-73] could help reduce the resources needed to understand, adapt, or apply specific technologies, data, or methodologies.
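As one hedged example of low-cost automation in this spirit, the sketch below uses simple keyword patterns to flag possible social-needs mentions in free-text notes. The note text and patterns are invented, and a production LHS would rely on validated NLP tools and clinical review rather than regular expressions alone.

```python
import re

# Invented note snippets and keyword patterns; a production LHS would use validated
# NLP tools and clinical review rather than simple regular expressions.
SOCIAL_NEED_PATTERNS = {
    "housing": re.compile(r"\b(homeless|housing insecur|evict)", re.IGNORECASE),
    "food": re.compile(r"\b(food insecur|skipping meals|food bank)", re.IGNORECASE),
    "transportation": re.compile(r"\b(no transportation|missed the bus|needs a ride)", re.IGNORECASE),
}


def flag_social_needs(note_text):
    """Return the social-need categories whose patterns appear in a free-text note."""
    return [need for need, pattern in SOCIAL_NEED_PATTERNS.items() if pattern.search(note_text)]


note = "Patient reports food insecurity and is worried about eviction next month."
print(flag_social_needs(note))  # -> ['housing', 'food']
```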

In the lung ultrasound example, limited time available to the implementers between iterative PRISM cycles meant the qualitative interview data collected were not analyzed as systematically as is ideal. In the social needs example, a partner was not able to engage in all iPRISM webtool planning and contextual assessment activities due to time constraints; measures were therefore taken to ensure they were able to engage via asynchronous methods, provide summary updates on progress, and gain full team consensus at various points in the process. In the heart failure example, limited resources prevented iterative assessment and adaptation, which might otherwise have led to more impactful outcomes.

Challenge 5: Appropriateness of IS Principles, Outcomes, and TMF Constructs

Not all aspects of IS apply to every LHS situation for a variety of reasons. In some cases, a TMF construct or IS method may not align with a project’s goals, may not fit within resource or timeline constraints, or may need to be adapted to suit the situation [74]. Additionally, when implementing interventions without an established evidence base, flexibility is required in applying IS methods and partner engagement strategies. For instance, the LHS may be implementing a clinical guideline recommendation based on poor-quality or low-strength evidence, or following a new regulatory mandate for an intervention that has yet to demonstrate effectiveness [75].

IS is intended to be practical and pragmatic, and its principles, outcomes, and TMF constructs should be considered as a guide and adapted to the situation at hand, focusing on what is feasible and relevant for the context [74]. When changes are made, it is important to document and report the adaptations along with the rationale to facilitate future scalability. For projects without an established evidence base, some common IS strategies, including how and which partners are engaged, may need to be adjusted. Further, when a project’s evidence base is uncertain or may vary based on contextual conditions (eg, the effectiveness of a CDS alert varying by clinical situation and design), iterative IS and LHS approaches can be leveraged to develop the evidence base and understand the necessary conditions for success.

To increase the uptake of IS, it is essential to promote awareness that IS should be adapted to fit the specific needs and context of each project [76,77]. Current misconceptions that IS cannot be adapted may inhibit its application. Providing guidance on how to adapt TMFs and IS methods, supported by case examples, can help address these misconceptions and enhance the accessibility and use of IS. Additionally, interactive tools such as the iPRISM webtool, which dynamically guides implementers through the process of adapting IS methods for their specific project, could further facilitate this adaptation.

In the heart failure example, the adoption measure was modified from the original definition to be relevant to the situation at hand and to still facilitate the collection of important implementation outcomes.
In the social needs example, the intervention was not yet evidence based, which changed aspects of IS partner engagement. Specifically, partner buy-in shifted from a shared understanding of effectiveness to a shared incentive to meet the mandate, with a common interest in contributing to the development of an evidence base. Across all 3 examples, none were able to assess all of PRISM’s RE-AIM outcomes or evaluate the cost of implementation, owing to data and resource constraints as well as the need to focus efforts on mission-aligned outcomes amid substantial competing priorities.

Challenge 6: Representation, Representativeness, and Equity

Limitations in the representativeness of documented data or in the range of partners engaged can impede the ability to design equitable solutions. This may stem from difficulties in assessing equity of outcomes and having a limited number of partners to strategically address existing disparities [27,62,78]. Such limitations could exacerbate or create new inequities without the capability to use data to identify and resolve these issues. IS methods, including PRISM, promote representativeness in data and partner engagement, which may not always be achievable within existing constraints. Therefore, when planning LHS or learning cycles, equity should be clearly defined and prioritized from the outset [79].

To the extent feasible, inclusive use of data and engagement of partners across the spectrum of perspectives—not just the average or majority perspective—is important for promoting equity within LHS [27]. Proactively considering the potential unintended consequences of using different types of data is also key to mitigating inequities. When possible, integrating data sources beyond the EHR (eg, social media, patient and staff satisfaction, community forums, community partner data) and using systems science approaches that include patient-reported outcomes and other social determinants and behavioral data can aid in more comprehensive consideration of the data needed to promote health equity [80]. There is a clear need for health systems to access a more inclusive integration of reliable, structured data.

Across all examples, none were able to gain the breadth of partner perspectives that is ideal to sufficiently assess the representativeness of outcomes, but they did what was feasible. For instance, the representativeness (eg, gender, race, age) of clinicians who adopted the CDS or lung ultrasound was not assessed because these data are stored outside of the EHR and inaccessible to those evaluating this type of LHS work.

In many ways, EHRs have enabled the visionary idea of an LHS to become a reality for many health systems. Yet, as highlighted in our case studies, this reliance on EHRs also limits their potential. These case examples demonstrate how IS—when applied in practical, accessible, and adaptable ways—can help LHSs navigate the challenges posed by EHRs while also addressing other crucial factors, such as team science. We highlight aspects of EHR-based LHSs that can complicate the application of IS, notably limitations in the type and completeness of available data. To enable LHS to practically apply IS, our case studies illustrate how an IS framework and its methods (PRISM) can be adapted to drive meaningful change. While IS can sometimes appear overly academic, complex, or inflexible, we emphasize that IS should be tailored to fit specific situations. We encourage others to utilize existing tools and resources to make IS more accessible and practical for their needs.

A cross-cutting key take-home message for LHSs broadly, and particularly when applying IS to EHR-based LHS, is to “do what you can with what you have while proactively anticipating and mitigating unintended consequences and harm” [60,81]. Historically, health care has often pursued perfection, which has led to rigidity in applying IS methods and utilizing EHR data and technology. This mindset can significantly delay or impede progress and is at odds with the visionary goals of LHS, which emphasize practical, relevant, and rapid learning cycles. Perfection is neither realistic nor attainable, and while striving for a perfect solution, health systems, priorities, and innovations evolve quickly, rendering solutions obsolete before they are even implemented [82,83].

In our case studies, we also identified areas for future development to enhance the accessibility of IS and the utility of EHR data and technology for LHSs. First, there is a need for more user-friendly tools and resources to guide the use and adaptation of IS TMFs and methods across various types of projects and situations. Such resources should provide guidance on simplifying the application and adaptation of TMFs, taking into account relevance, data, and resource constraints. They should also address designing for sustainability, equity, and generalizability, including for mandated projects that lack an evidence base. Additionally, these resources should help systematically anticipate and mitigate unintended consequences, including those that could potentially misinform future policy. Second, we reinforce the decade-long call for more inclusive integration of accessible, high-quality data essential for achieving precision health goals [84,85]. Change is needed to ensure that LHSs have equitable, collaborative, and agreed-upon access to a comprehensive range of data—such as mental, physical, and behavioral health information; social determinants; environmental risks; patient preferences; and genomic data—that drive equitable health care outcomes and are crucial for making informed, patient-centered, and personalized health care decisions [84,85].

Since the original call for LHS in 2007 [7], significant progress has been made, with a growing number of functional LHS [10,86]. EHRs provide foundational infrastructure and data that make LHS possible, and IS methods can help both new and existing LHSs achieve their goals of equitable, sustainable, reproducible, and relevant knowledge generation and translation. However, to foster the growth of new LHSs and support existing LHSs in achieving the aspirational goals of a fully mature, equitable, and sustainable LHS [10], there is a clear need for greater access to inclusive data and more guidance on the practical application of IS methods.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Examples of challenges and solutions to applying PRISM to EHR-based LHS research, categorized according to whether the issue primarily stems from PRISM or the EHR.

DOCX File, 22 KB

Multimedia Appendix 2

Additional tools and resources for guidance on how to apply IS.

DOCX File, 16 KB

  1. Adler-Milstein J, Jha A. HITECH Act drove large gains in hospital electronic health record adoption. Health Aff (Millwood). Aug 01, 2017;36(8):1416-1422. [CrossRef] [Medline]
  2. National trends in hospital and physician adoption of electronic health records. Office of the National Coordinator for Health Information Technology. URL: https://www.healthit.gov/data/quickstats/national-trends-hospital-and-physician-adoption-electronic-health-records [accessed 2024-09-19]
  3. European Commission: Directorate-General for Communications Networks, Content and Technology, Valverde-Albacete J, Folkvord F, Lupiáñez-Villanueva F, Hocking L, Devaux A, et al. Benchmarking deployment of eHealth among general practitioners (2018): final report. Publications Office of the European Union. 2018. URL: https://op.europa.eu/en/publication-detail/-/publication/d1286ce7-5c05-11e9-9c52-01aa75ed71a1/language-en [accessed 2024-09-18]
  4. Woldemariam M, Jimma W. Adoption of electronic health record systems to enhance the quality of healthcare in low-income countries: a systematic review. BMJ Health Care Inform. Jun 2023;30(1):e100704. [FREE Full text] [CrossRef] [Medline]
  5. Kim Y, Jung K, Park Y, Shin D, Cho S, Yoon D, et al. Rate of electronic health record adoption in South Korea: a nation-wide survey. Int J Med Inform. May 2017;101:100-107. [CrossRef] [Medline]
  6. Etheredge LM. Rapid learning: a breakthrough agenda. Health Aff (Millwood). Jul 2014;33(7):1155-1162. [CrossRef] [Medline]
  7. Etheredge LM. A rapid-learning health system. Health Aff (Millwood). Jan 2007;26(2):w107-w118. [CrossRef] [Medline]
  8. Nundy S, Cooper L, Mate K. The quintuple aim for health care improvement: a new imperative to advance health equity. JAMA. Feb 08, 2022;327(6):521-522. [CrossRef] [Medline]
  9. Committee on the Learning Health Care System in America, Institute of Medicine. Smith M, Saunders R, Stuckhardt L, McGinnis JM, editors. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, DC. National Academies Press (US); May 10, 2013.
  10. Trinkley K, Ho P, Glasgow R, Huebschmann A. How dissemination and implementation science can contribute to the advancement of learning health systems. Acad Med. Oct 01, 2022;97(10):1447-1458. [FREE Full text] [CrossRef] [Medline]
  11. Chambers D, Feero W, Khoury M. Convergence of implementation science, precision medicine, and the learning health care system: a new model for biomedical research. JAMA. May 10, 2016;315(18):1941-1942. [FREE Full text] [CrossRef] [Medline]
  12. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice (Second Edition). Oxford. Oxford University Press; Nov 23, 2017.
  13. Foley T, Vale L. A framework for understanding, designing, developing and evaluating learning health systems. Learn Health Syst. Jan 2023;7(1):e10315. [FREE Full text] [CrossRef] [Medline]
  14. Allen C, Coleman K, Mettert K, Lewis C, Westbrook E, Lozano P. A roadmap to operationalize and evaluate impact in a learning health system. Learn Health Syst. Oct 2021;5(4):e10258. [FREE Full text] [CrossRef] [Medline]
  15. Nilsen P, editor. Implementation Science: Theory and Application. New York, NY. Routledge; May 13, 2024.
  16. Easterling D, Perry AC, Woodside R, Patel T, Gesell SB. Clarifying the concept of a learning health system for healthcare delivery organizations: implications from a qualitative analysis of the scientific literature. Learn Health Syst. Apr 2022;6(2):e10287. [FREE Full text] [CrossRef] [Medline]
  17. Trinkley KE, Glasgow RE, D'Mello S, Fort MP, Ford B, Rabin BA. The iPRISM webtool: an interactive tool to pragmatically guide the iterative use of the Practical, Robust Implementation and Sustainability Model in public health and clinical settings. Implement Sci Commun. Sep 19, 2023;4(1):116. [FREE Full text] [CrossRef] [Medline]
  18. Beidas R, Dorsey S, Lewis C, Lyon A, Powell B, Purtle J, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. Aug 13, 2022;17(1):55. [FREE Full text] [CrossRef] [Medline]
  19. Boyd A, Gonzalez-Guarda R, Lawrence K, Patil C, Ezenwa M, O'Brien EC, et al. Potential bias and lack of generalizability in electronic health record data: reflections on health equity from the National Institutes of Health Pragmatic Trials Collaboratory. J Am Med Inform Assoc. Aug 18, 2023;30(9):1561-1566. [FREE Full text] [CrossRef] [Medline]
  20. Richesson R, Marsolo K, Douthit B, Staman K, Ho P, Dailey D, et al. Enhancing the use of EHR systems for pragmatic embedded research: lessons from the NIH Health Care Systems Research Collaboratory. J Am Med Inform Assoc. Nov 25, 2021;28(12):2626-2640. [FREE Full text] [CrossRef] [Medline]
  21. Damschroder LJ, Knighton AJ, Griese E, Greene SM, Lozano P, Kilbourne AM, et al. Recommendations for strengthening the role of embedded researchers to accelerate implementation in health systems: findings from a state-of-the-art (SOTA) conference workgroup. Healthc (Amst). Jun 2021;8 Suppl 1(Suppl 1):100455. [FREE Full text] [CrossRef] [Medline]
  22. Tabak R, Khoong E, Chambers D, Brownson R. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. Sep 2012;43(3):337-350. [FREE Full text] [CrossRef] [Medline]
  23. Rabin B. Dissemination Implementation. URL: https://dissemination-implementation.org/ [accessed 2024-09-19]
  24. Feldstein AC, Glasgow RE. A Practical, Robust Implementation and Sustainability Model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. Apr 2008;34(4):228-243. [CrossRef] [Medline]
  25. Rabin B, Cakici J, Golden C, Estabrooks P, Glasgow R, Gaglio B. A citation analysis and scoping systematic review of the operationalization of the Practical, Robust Implementation and Sustainability Model (PRISM). Implement Sci. Sep 24, 2022;17(1):62. [FREE Full text] [CrossRef] [Medline]
  26. Glasgow R, Harden S, Gaglio B, Rabin B, Smith M, Porter G, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64. [FREE Full text] [CrossRef] [Medline]
  27. Fort MP, Manson SM, Glasgow RE. Applying an equity lens to assess context and implementation in public health and health services research and practice using the PRISM framework. Front Health Serv. 2023;3:1139788. [FREE Full text] [CrossRef] [Medline]
  28. Goodrich D, Miake-Lye I, Braganza M, Wawrin N, Kilbourne A. The QUERI Roadmap for Implementation and Quality Improvement. Washington, DC. Department of Veterans Affairs (US); 2020.
  29. Trinkley K, Kroehl M, Kahn M, Allen L, Bennett T, Hale G, et al. Applying clinical decision support design best practices with the practical robust implementation and sustainability model versus reliance on commercially available clinical decision support tools: randomized controlled trial. JMIR Med Inform. Mar 22, 2021;9(3):e24359. [FREE Full text] [CrossRef] [Medline]
  30. Maw A, Morris M, Glasgow R, Barnard J, Ho P, Ortiz-Lopez C, et al. Using Iterative RE-AIM to enhance hospitalist adoption of lung ultrasound in the management of patients with COVID-19: an implementation pilot study. Implement Sci Commun. Aug 12, 2022;3(1):89. [CrossRef] [Medline]
  31. Trinkley K, Kahn M, Bennett T, Glasgow R, Haugen H, Kao D, et al. Integrating the Practical Robust Implementation and Sustainability Model with best practices in clinical decision support design: implementation science approach. J Med Internet Res. Oct 29, 2020;22(10):e19676. [FREE Full text] [CrossRef] [Medline]
  32. Trinkley KE, Wright G, Allen LA, Bennett TD, Glasgow RE, Hale G, et al. Sustained effect of clinical decision support for heart failure: a natural experiment using implementation science. Appl Clin Inform. Oct 2023;14(5):822-832. [CrossRef] [Medline]
  33. Cole M, Nguyen K, Byhoff E, Murray G. Screening for social risk at federally qualified health centers: a national study. Am J Prev Med. May 2022;62(5):670-678. [FREE Full text] [CrossRef] [Medline]
  34. Billioux A, Verlander K, Anthony S, Alley D. Standardized screening for health-related social needs in clinical settings: the accountable health communities screening tool. NAM Perspectives. May 30, 2017;7(5):1-9. [CrossRef]
  35. Patient-Centered Outcomes Research Institute (PCORI). Best practices in multi-stakeholder team science. PCORI. URL: https://research-teams.pcori.org/best-practices [accessed 2024-09-19]
  36. iPRISM Webtool. URL: https://prismtool.org/ [accessed 2024-09-26]
  37. Little MM, St Hill CA, Ware KB, Swanoski MT, Chapman SA, Lutfiyya MN, et al. Team science as interprofessional collaborative research practice: a systematic review of the science of team science literature. J Investig Med. Jan 2017;65(1):15-22. [FREE Full text] [CrossRef] [Medline]
  38. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. Aug 07, 2009;4:50. [CrossRef] [Medline]
  39. Moullin J, Dickson K, Stadnick N, Rabin B, Aarons G. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. Jan 05, 2019;14(1):1. [FREE Full text] [CrossRef] [Medline]
  40. Dissemination and Implementation Science Program. University of Colorado Anschutz Medical Campus. URL: https://medschool.cuanschutz.edu/accords/cores-and-programs/dissemination-implementation-science-program [accessed 2024-09-18]
  41. Dissemination and Implementation Science Center (DISC). Altman Clinical and Translational Research Institute UC San Diego. URL: https://actri.ucsd.edu/centers-services/portfolio/disc/index.html [accessed 2024-09-18]
  42. Resources: Implementation Science and Practice. The Center for Implementation. URL: https://thecenterforimplementation.com/toolbox [accessed 2024-09-18]
  43. Coyle K, Carcone A, Butame S, Pooler-Burgess M, Chang J, Naar S. Adapting the self-assessment of contextual fit scale for implementation of evidence-based practices in adolescent HIV settings. Implement Sci Commun. Oct 22, 2022;3(1):115. [FREE Full text] [CrossRef] [Medline]
  44. Robinson C, Damschroder L. A pragmatic context assessment tool (pCAT): using a think aloud method to develop an assessment of contextual barriers to change. Implement Sci Commun. Jan 11, 2023;4(1):3. [FREE Full text] [CrossRef] [Medline]
  45. Palinkas L, Spear S, Mendon S, Villamar J, Reynolds C, Green C, et al. Conceptualizing and measuring sustainability of prevention programs, policies, and practices. Transl Behav Med. Feb 03, 2020;10(1):136-145. [FREE Full text] [CrossRef] [Medline]
  46. Malone S, Prewitt K, Hackett R, Lin J, McKay V, Walsh-Bailey C, et al. The Clinical Sustainability Assessment Tool: measuring organizational capacity to promote sustainability in healthcare. Implement Sci Commun. Jul 17, 2021;2(1):77. [FREE Full text] [CrossRef] [Medline]
  47. Schell S, Luke D, Schooley M, Elliott M, Herbers S, Mueller N, et al. Public health program capacity for sustainability: a new framework. Implement Sci. Feb 01, 2013;8:15. [FREE Full text] [CrossRef] [Medline]
  48. The Hexagon – an exploration tool. National Implementation Research Network. URL: https://implementation.fpg.unc.edu/resource/the-hexagon-an-exploration-tool/ [accessed 2024-09-18]
  49. Hull L, Goulding L, Khadjesari Z, Davis R, Healey A, Bakolis I, et al. Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide. Implement Sci. Aug 14, 2019;14(1):80. [FREE Full text] [CrossRef] [Medline]
  50. Weiner B, Sherr K, Lewis CC, editors. Practical Implementation Science. New York, NY. Springer Publishing; Mar 28, 2022.
  51. Huebschmann AG, Johnston S, Davis R, Kwan BM, Geng E, Haire-Joshu D, et al. Promoting rigor and sustainment in implementation science capacity building programs: a multi-method study. Implement Res Pract. 2022;3:26334895221146261. [FREE Full text] [CrossRef] [Medline]
  52. Borsky AE, Savitz LA, Bindman AB, Mossburg S, Thompson L. AHRQ series on improving translation of evidence: perceived value of translational products by the AHRQ EPC learning health systems panel. Jt Comm J Qual Patient Saf. Nov 2019;45(11):772-778. [FREE Full text] [CrossRef] [Medline]
  53. Slutsky JR. Moving closer to a rapid-learning health care system. Health Aff (Millwood). 2007;26(2):w122-w124. [CrossRef] [Medline]
  54. Gould M, Sharp A, Nguyen H, Hahn E, Mittman B, Shen E, et al. Embedded research in the learning healthcare system: ongoing challenges and recommendations for researchers, clinicians, and health system leaders. J Gen Intern Med. Dec 2020;35(12):3675-3680. [FREE Full text] [CrossRef] [Medline]
  55. Wu AW, Snyder C, Clancy CM, Steinwachs DM. Adding the patient perspective to comparative effectiveness research. Health Aff (Millwood). Oct 2010;29(10):1863-1871. [CrossRef] [Medline]
  56. Glasgow RE, Kaplan RM, Ockene JK, Fisher EB, Emmons KM. Patient-reported measures of psychosocial issues and health behavior should be added to electronic health records. Health Aff (Millwood). Mar 2012;31(3):497-504. [CrossRef] [Medline]
  57. Nerenz DR, Austin JM, Deutscher D, Maddox KEJ, Nuccio EJ, Teigland C, et al. Adjusting quality measures for social risk factors can promote equity in health care. Health Aff (Millwood). Apr 2021;40(4):637-644. [CrossRef] [Medline]
  58. Lavallee DC, Chenok KE, Love RM, Petersen C, Holve E, Segal CD, et al. Incorporating patient-reported outcomes into health care to engage patients and enhance care. Health Aff (Millwood). Apr 2016;35(4):575-582. [CrossRef] [Medline]
  59. Osheroff J, Teich J, Levick D, Saldana L, Velasco F, Sittig D. Improving Outcomes With Clinical Decision Support: An Implementer's Guide (Second Edition). Chicago, IL. Healthcare Information and Management Systems Society (HIMSS); 2012.
  60. Tolf S, Nyström ME, Tishelman C, Brommels M, Hansson J. Agile, a guiding principle for health care improvement? Int J Health Care Qual Assur. 2015;28(5):468-493. [CrossRef] [Medline]
  61. Krebs K, Milani L. Translating pharmacogenomics into clinical decisions: do not let the perfect be the enemy of the good. Hum Genomics. Aug 27, 2019;13(1):39. [FREE Full text] [CrossRef] [Medline]
  62. Glasgow R, Kwan B, Matlock D. Realizing the full potential of precision health: the need to include patient-reported health behavior, mental health, social determinants, and patient preferences data. J Clin Transl Sci. Jun 2018;2(3):183-185. [FREE Full text] [CrossRef] [Medline]
  63. Haneuse S, Daniels M. A general framework for considering selection bias in EHR-based studies: what data are observed and why? EGEMS (Wash DC). 2016;4(1):1203. [FREE Full text] [CrossRef] [Medline]
  64. Haneuse S, Arterburn D, Daniels MJ. Assessing missing data assumptions in EHR-based studies: a complex and underappreciated task. JAMA Netw Open. Feb 01, 2021;4(2):e210184. [FREE Full text] [CrossRef] [Medline]
  65. Nelson TA, Anderson B, Bian J, Boyd AD, Burton SV, Davis K, et al. Planning for patient-reported outcome implementation: development of decision tools and practical experience across four clinics. J Clin Transl Sci. Apr 06, 2020;4(6):498-507. [FREE Full text] [CrossRef] [Medline]
  66. Glasgow R, Huebschmann A, Krist A, Degruy FV. An adaptive, contextual, technology-aided support (ACTS) system for chronic illness self-management. Milbank Q. Sep 2019;97(3):669-691. [FREE Full text] [CrossRef] [Medline]
  67. McDonald PL, Van Der Wees P, Weaver GC, Harwood K, Phillips JR, Corcoran M. Learning health systems from an academic perspective: establishing a collaboratory within a school of medicine and health sciences. Med Educ Online. Dec 2021;26(1):1917038. [CrossRef] [Medline]
  68. Learning health sciences. Department of Learning Health Sciences, University of Michigan Medical School. URL: https://medicine.umich.edu/dept/learning-health-sciences [accessed 2024-09-18]
  69. Feehan M, Owen L, McKinnon I, DeAngelis MM. Artificial intelligence, heuristic biases, and the optimization of health outcomes: cautionary optimism. J Clin Med. Nov 14, 2021;10(22):5284. [FREE Full text] [CrossRef] [Medline]
  70. Cabitza F, Rasoini R, Gensini G. Unintended consequences of machine learning in medicine. JAMA. Aug 08, 2017;318(6):517-518. [CrossRef] [Medline]
  71. Soong C, Burry L, Cho HJ, Gathecha E, Kisuule F, Tannenbaum C, et al. An implementation guide to promote sleep and reduce sedative-hypnotic initiation for noncritically ill inpatients. JAMA Intern Med. Jul 01, 2019;179(7):965-972. [CrossRef] [Medline]
  72. Trecartin KW, Wolfe RE. Emergency department observation implementation guide. J Am Coll Emerg Physicians Open. Aug 2023;4(4):e13013. [FREE Full text] [CrossRef] [Medline]
  73. Draffan E, Danger C, Banes D. Reflections on building a multi-country AAC implementation guide. Stud Health Technol Inform. Aug 23, 2023;306:181-187. [CrossRef] [Medline]
  74. Glasgow R, Estabrooks P. Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Prev Chronic Dis. Jan 04, 2018;15:E02. [FREE Full text] [CrossRef] [Medline]
  75. Brownson RC, Shelton RC, Geng EH, Glasgow RE. Revisiting concepts of evidence in implementation science. Implement Sci. Apr 12, 2022;17(1):26. [FREE Full text] [CrossRef] [Medline]
  76. Quinn AK, Neta G, Sturke R, Olopade CO, Pollard SL, Sherr K, et al. Adapting and operationalizing the RE-AIM framework for implementation science in environmental health: clean fuel cooking programs in low resource countries. Front Public Health. 2019;7:389. [FREE Full text] [CrossRef] [Medline]
  77. Soicher RN, Becker-Blease KA, Bostwick KCP. Adapting implementation science for higher education research: the systematic study of implementing evidence-based practices in college classrooms. Cogn Res Princ Implic. Nov 05, 2020;5(1):54. [FREE Full text] [CrossRef] [Medline]
  78. Shelton R, Chambers D, Glasgow R. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8:134. [FREE Full text] [CrossRef] [Medline]
  79. Lee-Foon NK, Reid RJ, Brown A. Fairness for whom? Learning health systems' approach to equity in healthcare. Healthc Policy. Nov 2023;19(2):15-20. [FREE Full text] [CrossRef] [Medline]
  80. Bierman AS, Mistry KB. Commentary: Achieving health equity - the role of learning health systems. Healthc Policy. Nov 2023;19(2):21-27. [FREE Full text] [CrossRef] [Medline]
  81. Palakshappa D, Miller D, Rosenthal G. Advancing the learning health system by incorporating social determinants. Am J Manag Care. Jan 01, 2020;26(1):e4-e6. [FREE Full text] [CrossRef] [Medline]
  82. McGowan H, Shipley C. The Adaptation Advantage: Let Go, Learn Fast, and Thrive in the Future of Work. Hoboken, NJ. John Wiley and Sons Inc; 2020.
  83. Chambers D, Norton W. The adaptome: advancing the science of intervention adaptation. Am J Prev Med. Oct 2016;51(4 Suppl 2):S124-S131. [FREE Full text] [CrossRef] [Medline]
  84. Hood L, Friend S. Predictive, personalized, preventive, participatory (P4) cancer medicine. Nat Rev Clin Oncol. Mar 2011;8(3):184-187. [CrossRef] [Medline]
  85. Khoury M, Gwinn M, Glasgow R, Kramer B. A population approach to precision medicine. Am J Prev Med. Jun 2012;42(6):639-645. [FREE Full text] [CrossRef] [Medline]
  86. Lannon C, Schuler C, Seid M, Provost L, Fuller S, Purcell D, et al. A maturity grid assessment tool for learning networks. Learn Health Syst. Apr 2021;5(2):e10232. [FREE Full text] [CrossRef] [Medline]


CDS: clinical decision support
EHR: electronic health record
FQHC: federally qualified health center
iPRISM: Iterative Practical, Robust Implementation and Sustainability Model
IS: implementation science
LHS: learning health system
PRISM: Practical, Robust Implementation and Sustainability Model
RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance
TMF: theories, models, and frameworks


Edited by G Tsafnat; submitted 13.12.23; peer-reviewed by S Greene, T Watterson; comments to author 28.02.24; revised version received 17.05.24; accepted 24.08.24; published 07.10.24.

Copyright

©Katy E Trinkley, Anna M Maw, Cristina Huebner Torres, Amy G Huebschmann, Russell E Glasgow. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 07.10.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.