
Identifying existing approaches used to evaluate the sustainability of evidence-based interventions in healthcare: an integrative review



There is limited evidence on how to evaluate the sustainability of evidence-based interventions (EBIs) for healthcare improvement. Through an integrative review, we aimed to identify approaches used to evaluate the sustainability of EBIs and the sustainability outcomes reported.


Following Whittemore and Knafl’s methodological process: (1) problem identification; (2) literature search; (3) data evaluation; (4) data analysis; and (5) presentation, a comprehensive search strategy was applied across five databases. Included studies were not restricted by research design but had to evaluate the sustainability of an EBI in a healthcare context. We assessed the methodological quality of studies using the Mixed Methods Appraisal Tool.


Of 18,783 articles retrieved, 64 met the inclusion criteria. Qualitative designs were most commonly used for evaluation (48%), with individual interviews as the predominant data collection method. Timing of data collection varied widely, with post-intervention data collection most frequent (89%). Of the 64 studies, 44% used a framework, 26% used a model, 11% used a tool, 5% used an instrument, and 14% used theory as their primary approach to evaluate sustainability. Most studies (77%) did not measure sustainability outcomes; rather, they focused on sustainability determinants.


It is unclear which approach or approaches are most effective for evaluating sustainability, and which measures and outcomes are most commonly used. There is a disconnect between evaluating the factors that may shape sustainability and the outcome measures employed to assess it. Our review offers methodological recommendations for sustainability evaluation research and highlights the importance of understanding the mechanisms of sustainability to advance the field.



The translation of evidence-based interventions (EBIs) to healthcare practices takes an average of 17 years [1]. As a result, the field of implementation science (IS) seeks to understand ways to improve the implementation and dissemination of EBIs in healthcare [2]. As IS has matured, researchers have recognized that implementation, which often requires substantial resources, is meaningless without long-term sustainability efforts [3]. We draw upon a comprehensive definition of sustainability by Moore et al. [4], who define sustainability under five constructs: (1) occurring after a defined period of time, (2) where the intervention and/or implementation strategies continue to be delivered, (3) behavior change is maintained, (4) the program and behavior change may evolve or adapt, while (5) continuing to produce benefits for individuals/systems. Policy-makers and other stakeholders are increasingly concerned with the long-term impact of such investments in EBIs [5]. Sustainability is a key outcome of the implementation process [5] and a priority topic for IS. Yet, our understanding of how to evaluate the sustainability of an EBI in healthcare remains limited in implementation research.

Recent synthesis efforts have focused on (a) identifying sustainability approaches (i.e., theories, models, frameworks), (b) how sustainability approaches have been used [6, 7], (c) identifying sustainability determinants [6,7,8,9] or sustainability strategies [10], or (d) methods of sustainability measurement [11]. Despite these efforts, there is no comprehensive, synthesized account of how to evaluate the sustainability of EBIs. Such work is critical to inform sustainability planning, implementation, and evaluation of EBIs in healthcare.


The aim of this integrative review was to identify and synthesize approaches that have been used to evaluate the sustainability of EBIs in healthcare. We aimed to enhance our understanding of sustainability definitions, research designs, data collection methods, timing, measures, and sustainability outcomes used for sustainability evaluation studies.


We conducted an integrative review that followed Whittemore and Knafl’s [12] five-stage methodological process: (1) problem identification; (2) literature search; (3) data evaluation; (4) data analysis (data reduction, display, comparison, and conclusions); and (5) presentation. Integrative reviews can include diverse data sources and do not restrict the type of study design. Furthermore, integrative reviews take a more flexible approach to analysis compared to systematic reviews, combining both quantitative and qualitative data if there are similarities [12,13,14].


Literature search

We conducted a systematic database search using comprehensive strategies, including diverse data sources and methodologies, to advance the understanding of sustainability evaluation as it relates to health EBIs. In December 2018 and July 2020, we searched the following databases: Ovid MEDLINE(R) and Epub Ahead of Print, In-Process and Other Non-Indexed Citations and Daily (1946 to current); OVID Embase (1974 to current); EBSCOhost CINAHL Plus with Full-text (1937 to current); Wiley Cochrane Library (inception to current). A health research librarian conducted the search in consultation with the research team. We combined terms representing sustainability with terms representing knowledge translation or knowledge implementation of healthcare programs and terms related to evaluation or measurement. An agreed-upon set of terms allowed us to exclude as many irrelevant studies as possible without eliminating relevant ones. For example, the terms excluded studies representing environmental sustainability, patient institutionalization, and animal studies from the primary set of results. Results were limited to the English language and to academic journals (when the interface permitted). We also used a snowball approach to manually search reference lists of relevant systematic reviews to identify additional relevant sustainability evaluation studies.

The initial database search in December 2018 generated a total of 13,613 records. We identified 5399 duplicate records from this batch, leaving 8214 records for title/abstract screening. An update of the search was performed in July 2020 using the same original databases and search strategies. We found 5170 new items from the updated search and removed a further 2718 duplicate records, leaving 2452 items remaining. Full search details can be found in Additional file 1. See Additional file 2 for the completed PRISMA checklist.

Inclusion and exclusion criteria

We applied the inclusion and exclusion criteria (Table 1) during screening. We included studies that focused on implementation, dissemination, impact, uptake, scale and spread, or testing and monitoring, but studies had to have an independent sustainability evaluation component.

Table 1 Inclusion and exclusion criteria

Data extraction, analysis, and synthesis

We used Endnote X7 as the management system for this review. After removing duplicates, we conducted a two-stage screening process of the citations retrieved from our database searches. In the first screening stage, one reviewer (RF) independently screened the abstracts and titles of all the citations retrieved from the database searches. A second reviewer (AB) independently screened a randomly selected 10% of all titles and abstracts to verify selection for inclusion or exclusion. In the second stage, two reviewers (RF and AB) independently screened all full-text articles that had passed first-stage screening. We discussed any differences in screening at team meetings and refined our inclusion and exclusion criteria to reflect these discussions.

The two reviewers independently extracted the following variables: (1) study design, (2) evaluation type (independent versus composite), (3) sustainability definition and terms used, (4) type and name of theoretical approach, (5) purpose of approach use, (6) data collection methods, (7) timing of evaluation data collection (e.g., pre- and/or post-implementation of the intervention), (8) reported sustainability measures, and (9) reported sustainability outcomes.

Theoretical approach used to evaluate sustainability

During extraction, we applied Nilsen’s five categories of theoretical approaches [14] used in implementation science (Table 2) to identify the primary theoretical approach that each study used to evaluate sustainability. We also included tools and instruments as additional accepted approaches to sustainability evaluation. It should be noted that Nilsen uses the term theoretical approach as a broad concept, which includes theories as one of many approaches. We use the term ‘theoretical approach’ to describe all approaches to sustainability evaluation, including models, theories, frameworks, tools, and instruments. If the author did not explicitly state that the primary theoretical approach was based on a theory, model, or framework, or if multiple theoretical approaches were used, the approach given the most emphasis was deemed primary. We also extracted measures used to evaluate sustainability as reported by the authors of the included studies.

Table 2 Categories of theoretical approaches used in implementation science

Sustainability measures and outcomes

We extracted whether an included study evaluated (a) sustainability determinants and (b) sustainability outcomes and how they measured these variables. We defined sustainability determinants as correlates and predictors of sustainability (organizational, contextual, and strategies) and sustainability outcomes as the subsequent impact (healthcare improvement or public health outcomes) of sustained intervention use [5]. To further unpack sustainability outcomes, we extracted and synthesized nine sustainability outcomes across the 64 included studies (Table 3) [6, 15, 16].

Table 3 Sustainability outcomes identified in included studies [6]

Data analysis followed the methodological steps outlined by Whittemore and Knafl [12] which included data reduction, data display, data comparison, and drawing conclusions and verifications. We compared, grouped, and synthesized each study by these variables. Evidence tables were created to summarize and describe the studies included in this review.

Quality appraisal

We used the Mixed Methods Appraisal Tool (MMAT) [17] to assess the methodological quality of the included studies. Two reviewers independently completed MMAT assessments and compared scoring. The MMAT [17] appraises quality by study design type: quantitative, qualitative, or mixed empirical methods. We classified studies combining qualitative and quantitative methods as mixed-methods, whereas studies using two or more qualitative methods were classified as multi-methods. The criteria are specific to each type of study, with five domains apportioned to qualitative studies and quantitative studies subdivided into randomized controlled, non-randomized, and descriptive studies. Each study is assigned an overall quality score, with asterisks representing the appraisal result: scores range from 20% (*) when one criterion is met to 100% (*****) when all criteria are met. Studies were not excluded based upon MMAT [17] ratings; the purpose of the appraisals was to gauge the quality of the research on this topic.


Of the 18,783 records identified through database searching, 64 studies were included in our review. Figure 1 depicts our search, screening, and selection results using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram [18]. Our results are presented under five headings according to our research aims. A full citation list for the 64 studies can be found in Additional file 3.

Fig. 1 PRISMA 2020 flow diagram of search results

Data collection methods

Of the included studies, 48% (n = 31) were qualitative, making this the most frequently used research design [19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49]. One quarter (25%; n = 16) used a mixed-methods design [50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65], followed by 14% (n = 9) that used a multi-methods design [66,67,68,69,70,71,72,73,74]. Only 12% (n = 8) of the included studies used a quantitative research design [75,76,77,78,79,80,81,82]. Of the 31 qualitative studies, 94% (n = 29) used interviews to collect evaluation data [19,20,21,22, 24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40, 42,43,44,45,46,47,48,49]; however, only 59% (n = 17) of those reported interviews as the sole data collection method [20, 21, 24,25,26,27, 29, 30, 32, 33, 37, 38, 40, 43, 45, 47, 48]. The two remaining qualitative studies used an onsite inspection and assessment tool [41] and steering-committee meeting minutes [23]; one further qualitative study combined interviews with workshop sessions and field notes [49]. Of the 16 mixed-methods studies [50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65], 88% (n = 14) used interviews, although never as the only method of data collection [50,51,52, 55,56,57,58,59,60,61,62,63,64,65]. Interviews were accompanied by surveys or questionnaires in 79% (n = 11) of these 14 studies [50,51,52, 55, 58,59,60,61,62, 64, 65]. Interviews were also the most common method of data collection in the nine multi-methods studies [66,67,68,69,70,71,72,73,74], with 66% (n = 6) combining interviews with another method [67,68,69,70, 72, 73]. Surveys or questionnaires accompanied the interviews in 22% (n = 2) of these studies [69, 73]. All eight quantitative studies (100%) administered a survey or questionnaire to collect sustainability evaluation data [75,76,77,78,79,80,81,82] (Table 4).

Table 4 Data collection method of included studies

Sustainability definitions and terms in evaluation studies

Of the 64 included evaluation studies, 61% (n = 39) provided a clear definition for sustainability [19,20,21, 27,28,29,30,31,32,33,34,35,36, 39, 42,43,44,45,46,47,48,49,50, 53, 54, 56, 58, 60,61,62, 64, 65, 67, 71, 73, 74, 76, 77, 80, 81] and 39% (n = 25) did not explicitly define sustainability [22,23,24,25,26, 30, 37, 38, 40, 41, 51, 52, 55, 57, 59, 63, 66, 68,69,70, 72, 75, 78, 79, 82]. Of the 39 studies with a clear definition, 66% (n = 26) drew upon one empirical source to define sustainability [20, 28, 29, 31,32,33,34,35,36, 39, 42,43,44, 46, 47, 50, 53, 54, 56, 60, 64, 65, 73, 74, 77, 80], 26% (n = 10) drew upon multiple sources [19, 27, 45, 48, 49, 58, 62, 71, 76, 81], and 8% (n = 3) [21, 61, 67] developed their own definitions for sustainability. The sources of sustainability definitions used in the included studies are detailed in Table 5. The most frequently reported terms used to describe sustainability were continuation, maintenance, integration, routinization, normalization, and institutionalization.

Table 5 Sustainability definition sources

Theoretical approaches used to evaluate sustainability

Of the 64 studies, 44% (n = 28) reported that they used a framework as their primary theoretical approach to evaluate sustainability [19,20,21, 26, 29,30,31, 34,35,36, 39, 43,44,45,46,47,48,49,50,51,52, 55, 57, 59, 61, 63, 75, 82]. The next most common theoretical approach was a model, used in 26% (n = 17) of included studies [22,23,24, 27, 32, 33, 37, 38, 54, 56, 58, 64, 66, 72, 74, 76, 78]. A tool was the primary theoretical approach in 11% (7/64) of the included studies [60, 62, 68, 69, 71, 80, 81]. Only 5% (3/64) of included studies used an instrument, making this the least common theoretical approach used [65, 67, 77]. Theory was used as the primary theoretical approach to evaluate sustainability in 14% (9/64) of studies [25, 28, 40,41,42, 53, 70, 73, 79] (Table 6).
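The distribution of primary theoretical approaches can be checked against the study total with simple arithmetic. As an illustrative sketch (counts taken from the text above; not source data), the reported percentages follow from dividing each category count by the 64 included studies:

```python
# Sanity check of the reported distribution of primary theoretical
# approaches across the 64 included studies (counts as stated in the text).
counts = {
    "framework": 28,
    "model": 17,
    "tool": 7,
    "instrument": 3,
    "theory": 9,
}

total = sum(counts.values())
# Every included study is assigned exactly one primary approach.
assert total == 64

for approach, n in counts.items():
    print(f"{approach:<10} n = {n:>2}  ({100 * n / total:.1f}%)")
```

Note that 17/64 is 26.6%, reported in the text as 26%; rounding each share to the nearest whole percent would sum to 101%, a common artifact of reporting rounded percentages.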

Table 6 Primary theoretical approaches used to evaluate sustainability

Of the 28 studies that used a framework, 82% (n = 23) used a single framework [19,20,21, 26, 29,30,31, 34,35,36, 39, 43, 46,47,48,49,50, 57, 59, 61, 63, 75, 82], while 14% (n = 4) used a combination of frameworks [45, 51, 52, 55]. The remaining 4% (n = 1) developed their own framework [44] to evaluate sustainability. A wide range of frameworks were used to evaluate sustainability; the Consolidated Framework for Implementation Research (CFIR) [83] was used most frequently (n = 5) [20, 30, 36, 61, 82], followed by the Promoting Action on Research Implementation in Health Services (PARiHS) Framework (n = 3) [26, 46, 63]. A total of 17% (n = 11) of studies used a combination of theoretical approaches, as opposed to a single theoretical approach [38, 45, 51, 52, 54,55,56, 58, 66, 74, 78].

Of the 17 studies that used a model, 41% (n = 7) used a single model [24, 27, 32, 33, 37, 64, 72]. In contrast, 18% (n = 3) developed their own model [22, 23, 76], and 6% (n = 1) used a combination of models with a theory for data collection purposes [56]. The National Health Service Sustainability Model (NHS SM) [84] was combined with Normalization Process Theory (NPT) [85] to inform a realist evaluation of sustainability in one study (6%), the only study to combine a single model with a single theory [38]. The remaining 29% (n = 5) used a model combined with a tool for data collection purposes [54, 58, 66, 74, 78]. The NHS SM [84] was the most frequently used model to evaluate sustainability (35%; n = 6) [32, 54, 58, 74, 76, 78]. Of the six studies that used the NHS SM [84], five also used it as the basis for a data collection survey [54, 58, 74, 76, 78]. One study that used the NHS SM [84] as its primary evaluation approach also drew upon the Theoretical Domains Framework [86] to develop its interview guide [58].

The Program Sustainability Assessment Tool (PSAT) [87] was the most frequently reported tool among the seven studies using a tool as the primary theoretical approach to evaluate sustainability (71%; n = 5) [60, 62, 69, 71, 81]. Of the 64 included studies, 5% (n = 3) used an instrument as their primary theoretical approach to evaluate sustainability [65, 67, 77]. Instruments used include the Technology Adoption Readiness Scale (TARS) [77] (n = 1) [77], Individual Placement and Support Fidelity Scale (IPS-25) [88] (n = 1) [67] and an adapted version of the Level of Institutionalization (LoIn) [89] Scales (n = 1) [65].

A total of 14% (n = 9) used a theory-informed process as their primary theoretical approach to evaluate sustainability, with 89% (n = 8) [25, 28, 40,41,42, 70, 73, 79] of those drawing on NPT [85] and 11% (n = 1) [53] drawing on Diffusion of Innovations Theory [90]. All the approaches are outlined in Table 7.

Table 7 Primary approach for evaluation of sustainability

The ways in which the selected sustainability approaches were applied in the included evaluation studies fell into three categories: (1) data collection (construct measures and outcomes), (2) data analysis (to examine and interpret data in relation to sustainability); and (3) a combination of data collection and analysis.

Research design and methodological quality

The research designs and MMAT [17] quality appraisal scores of the included studies are presented in Table 8. The scale ranges from 100% (highest quality) to 0% (lowest quality); the included studies ranged from 40 to 100%. More than half (59%; n = 38) of the included studies received a quality appraisal score of 100%, indicating high methodological quality, and none received a score of 0 or 20%. This is especially notable because mixed-methods studies were scored using both the qualitative and quantitative descriptive categories and assigned the lower of the two scores, on the rationale that a study can only be as strong as its weakest component [17].

Table 8 Study design and MMAT score of 64 included studies

Reported timing of evaluation

Of the 64 included studies, 67% (n = 43) [20, 22, 24, 25, 28, 31, 34, 35, 39,40,41,42, 44, 45, 48,49,50,51,52,53,54,55,56,57,58,59, 63, 65,66,67,68,69,70,71,72,73,74,75,76, 79,80,81,82] had a clear timing for evaluation and 33% (n = 21) [19, 21, 23, 26, 27, 29, 30, 32, 33, 36,37,38, 43, 46, 47, 60,61,62, 64, 77, 78] had unclear timing. Of the 43 studies with clear timing, 42% (n = 18) evaluated sustainability at a single time point [20, 24, 28, 34, 39,40,41, 44, 45, 48, 51, 53, 66, 67, 73, 80,81,82], and the remaining 58% (n = 25) evaluated sustainability at multiple time points [22, 25, 31, 35, 42, 49, 50, 52, 54,55,56,57,58,59, 63, 65, 68,69,70,71,72, 74,75,76, 79]. The majority of studies (63%; n = 40) collected data post-intervention only [20, 22, 24, 25, 28, 31, 34, 35, 40,41,42, 44, 45, 48,49,50,51,52,53,54,55,56,57, 59, 63, 65,66,67,68,69, 71,72,73,74,75,76, 79,80,81,82]. Evaluation timing and data collection time points are provided in Table 9.

Table 9 Reported timing of evaluation

Reported sustainability outcomes

We extracted and synthesized nine sustainability outcomes across the 64 included studies [6].

The majority of included studies (88%; n = 56) reported one or more evaluated outcomes of sustainability [19,20,21,22,23,24,25,26,27,28,29,30,31,32,33, 35,36,37,38,39, 41,42,43,44,45,46, 48,49,50,51,52,53, 55,56,57, 59,60,61,62,63,64,65, 67, 68, 70,71,72,73,74, 76,77,78,79,80,81,82]. Half of these (50%; n = 28) were qualitative in design [19,20,21,22,23,24,25,26,27,28,29,30,31,32, 35,36,37,38,39, 41,42,43,44,45,46, 48, 55, 65]. One quarter (25%; n = 14) used a mixed-methods research design [33, 49,50,51,52,53, 56, 57, 59,60,61,62,63, 82], 13% (n = 7) used a multi-method design [67, 68, 70,71,72,73,74], and 13% (n = 7) used a quantitative research design [64, 76,77,78,79,80,81]. Of the 8 studies that did not report any sustainability outcomes, 25% (n = 2) used a mixed-methods design [54, 58], 25% (n = 2) used a multi-method design [66, 69], 38% (n = 3) used a qualitative design [34, 40, 47], and the final 12% (n = 1) used a quantitative design [75].

Whether or not benefits continued for patients, staff, and stakeholders was the most commonly evaluated outcome of sustainability (n = 45). The most frequently reported sustained benefits were improved health outcomes and improved quality of care. For example, Blanchet et al. reported that eye care was not available prior to the implementation of the EBI, and therefore the sustained benefit to patients has been substantial [53]. Furthermore, Campbell et al. reported an absolute increase in long-term smoking cessation as an outcome of EBI sustainability [27]. Continuation of initiative activities or components of the intervention was an evaluated outcome of EBI sustainability in 36 of the included studies. Examples of sustained EBI activities include the continuation of HIV rapid testing [72] and continued use of the intervention guidebook [41]. Continuation of initiative activities or components differs from the maintenance of policies and procedures, which was reported in only 31 of the included studies. For example, Spassiani et al. reported continued EBI activities, such as providing nutritious food at gatherings and hosting community outings on evenings and weekends, as an outcome of sustainability, but reported no formal policy or procedure change [46]. Conversely, Kennedy et al. reported sustained changes in nutrition policies as well as sustained activities such as giving healthy snacks and distributing Healthy Habit Questionnaires [59].

Maintenance of relationships, partnerships, or networks was a common outcome of sustainability, reported in 34 of the included studies. The next most frequently reported outcomes were the capacity built within staff, stakeholders, and communities (n = 29) and adaptations made in response to new evidence or contextual influences (n = 29). Examples of sustained increased capacity include hiring new staff to help deliver EBI activities [24], funding a new electronic medical record system [24], and regular training [23].

Sustained increased awareness of the issue and replication or scale-up of the initiative were each reported as an outcome of sustainability in 18 of the included studies. While increased attention to and awareness of the problem or issue was a relatively frequent outcome of sustainability, it was not always beneficial. For example, increased attention around the provision of HIV treatment resulted in system capacity overload as an increased number of people sought treatment [49].

Gaining further funds to continue the initiative and maintain improvements was the least reported outcome (n = 16). Funding of EBIs is often focused on implementation efforts and is rarely permanent. Figure 2 depicts the distribution of reported sustainability outcomes. The outcomes reported in each of the 64 included studies can be found in Additional file 4.

Fig. 2 Sustainability outcomes measured


As evaluation is necessary to determine effectiveness, this review aimed to improve our understanding of how the sustainability of EBIs in healthcare is evaluated. A focus on evaluation is what differentiates this review from recent syntheses published on approaches to sustainability research. We also need to understand how, if, and why EBIs work (or not) in certain contexts to enable replication, sustainability, spread, and scale [105]. Therefore, we provide evidence on theoretical approaches used for evaluation, including how and when they have been used, and offer new guidance and synthesis on the combination of approaches for evaluating the sustainability of EBIs. Primary research comparing the theoretical approaches used in sustainability evaluation research, or identifying which determinants are most pertinent to sustainability evaluation, is non-existent; the similarities and differences between these theoretical approaches remain unknown. Such evidence would be highly beneficial to healthcare leaders who need guidance on which theoretical approach to select when evaluating the sustainability of an EBI implemented in clinical environments rather than under a research effectiveness trial design. It would also be useful for researchers, enabling them to inform healthcare leaders with actionable evidence.

While evidence on theoretical approaches exists, we provide further insight by reporting when sustainability evaluations were performed (timing), and what methods, measures, and outcomes were reported in evaluations of the sustainability of EBIs in healthcare. We found 64 studies in the peer-reviewed literature that used a theoretical approach to evaluate the sustainability of EBIs in healthcare. Our synthesis indicated a breadth of theoretical approaches and constructs considered for evaluating the sustainability of EBIs in healthcare, consistent with other recent synthesis work [6,7,8,9,10]. A recent scoping review and theory analysis [8] found 37 sustainability determinant factors, grouped into seven themes: (1) characteristics of the innovation/EBP; (2) adopter/user factors influencing sustained use; (3) leadership and management influences/factors; (4) inner-context (practice setting/organization) factors where EBPs are delivered; (5) inner processes/infrastructure factors that support the EBPs (e.g., processes, methods, systems, structures, or strategies); (6) outer-context or broader system factors; and (7) outcome descriptions without defined factors. These themes are similar to the work of Lennox et al. [9], who found six themes of sustainability constructs that aligned with the five domains associated with effective implementation outlined in the CFIR [83]: (1) intervention characteristics; (2) outer setting; (3) inner setting; (4) characteristics of individuals; and (5) process [9].

Despite these scientific advancements on sustainability determinants, there is a lack of guidance on how to select the most appropriate theoretical approach to evaluate the sustainability of EBIs in healthcare. Interestingly, our review provides insight into combinations of theoretical approaches (e.g., a theory and a tool) used to evaluate sustainability. We identified eleven studies (17% of those included) that used a combination of theoretical approaches, as opposed to a single theoretical approach [38, 45, 51, 52, 54,55,56, 58, 66, 74, 78], with the most common combination being a single model with a single tool (n = 5) [54, 58, 66, 74, 78]. Some theoretical approaches originated from implementation science (e.g., CFIR [83], RE-AIM [93], PARiHS [92]), where sustainability is viewed as an outcome of implementation, whereas other theoretical approaches were specific to sustainability and encompass the process of sustainability and/or the factors that influence it (e.g., NHS SM [84], DSF [95]).

Most evaluations in this integrative review applied determinant theoretical approaches, which focus on predictors of sustainability (organizational, contextual, human, and process-related), but did not link these determinants to, or measure, patient- or system-level outcomes such as sustained patient or staff benefit [6]. For the studies that did measure any of the nine sustainability outcomes, there was a lack of correlation between the outcome (e.g., maintenance of policy) and long-term impact on patient or system outcomes. A review by Lennox et al. indicated that only 21% of studies reported any information on sustainability outcomes [6].

Most sustainability evaluations included in this review used qualitative research designs (48%), with interviews as the most common data collection method. This finding is consistent with the work of Lennox [6], who reported that 59% of their included studies used qualitative methods. While qualitative research designs gather rich detail on potential determinants of sustainability (e.g., context) and participants' perspectives on the sustainability process, qualitative designs alone cannot measure the mechanisms and outcomes of sustainability. Researchers must also consider a mixed-methods approach for sustainability evaluation. Mixed-methods research designs allow a phenomenon to be studied from different perspectives and provide a richer picture by combining insights from qualitative and quantitative data [56]. Combining quantitative and qualitative approaches yields a better understanding of complex phenomena than either approach alone [57]. Our findings also highlight a significant knowledge gap on the timing of evaluation in sustainability research: there is no guidance on this matter. Almost all included studies collected sustainability data post-intervention, without any pre-intervention or during-intervention data collection. Evaluating sustainability pre-intervention can help to better understand the contextual factors that may hinder or facilitate the likelihood of sustaining a particular intervention in a specific context. In our review, pre-intervention evaluations were conducted in only 6% (n = 4) of included studies [25, 59, 64, 68]. Of these, half (n = 2) used a mixed-methods research design, combining semi-structured interviews with quantitative surveys and reviews [59, 64]. One study used a multi-method design including group interviews and health facility assessments [68], and the final qualitative study relied solely on interviews [25].
Timing of sustainability evaluation should be considered in relation to what was being implemented (i.e., an EBI that has been proven to be effective) and how it is being implemented (i.e., hybrid type III research design).

Of the 43 studies with clear evaluation timing, there was no clear pattern of data collection time points; only 25 had multiple data collection time points. Multiple time points are necessary where feasible to adjust for the adaptation of the intervention and context over time. Measuring outcomes at multiple time points over a more extended period is also important to determine continued benefit and impact on patient care and service delivery. Such evidence would also support the sustainability of the EBI in practice [35].

Based on the findings of our review, we offer some key methodological guidance for evaluations of the sustainability of EBIs in healthcare. First, we recommend, where feasible, using a combination of approaches to evaluate the sustainability of EBIs. A combination of approaches that can evaluate both sustainability determinants and outcomes will facilitate our understanding of the linkages between determinants and patient- or system-level outcomes. Second, we recommend a mixed-methods approach for sustainability evaluation, as mixed-methods research designs can provide a better understanding of complex phenomena. Third, we recommend evaluating sustainability at multiple time points, including pre-intervention, in order to understand the evolution of sustainability over time. Finally, we recommend future research to understand the mechanisms of sustainability and thereby advance the field. Our review indicates that these mechanisms have not yet been identified. There is evidence on determinants of sustainability and on outcomes of sustainability, but there is a knowledge gap on how, why, and under what contexts certain determinants lead to specific outcomes. Mechanisms are the underlying entities, processes, or structures that operate in particular contexts to generate outcomes of interest [106]. They offer causal pathways for understanding how, why, and under what contexts a determinant of sustainability does or does not achieve its intended effect. This knowledge will advance researchers' and health system implementers' ability to design, develop, and implement strategies that directly target sustainability determinants and outcomes.


Limitations

We included only peer-reviewed primary studies published in English. Although this review entailed a comprehensive search of the published literature and rigorous review methods, we recognize the possibility of incomplete retrieval of relevant research; for example, all gray literature was excluded.


Conclusions

Our review has highlighted areas that require further research and the need for methodological guidance for sustainability evaluations of EBIs in healthcare. Advancing our understanding in this area would enable better-designed and tailored strategies for sustainability, thereby contributing to the success of sustainability efforts. This work adds to existing syntheses on sustainability approaches, specifically for evaluation research, and points to ways forward for advancing this field.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.



Abbreviations

Consolidated Framework for Implementation Research


Dynamic Sustainability Framework


Evidence-based intervention


Evidence Based Practice


Individual Placement and Support Fidelity Scale


Implementation Science



Level of Institutionalization Scales


Mixed Methods Appraisal Tool


The National Health Service Sustainability Model


Normalization Process Theory


Promoting Action on Research Implementation in Health Services Framework


Preferred Reporting Items for Systematic Reviews and Meta-analyses


Program Sustainability Assessment Tool


Reach, Effectiveness, Adoption, Implementation, and Maintenance


Technology Adaption Readiness Scale


References

1. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;9(1):65–70.

2. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1–3.

3. Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7.

4. Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):110.

5. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10(1):1–13.

6. Lennox L, Linwood-Amor A, Maher L, Reed J. Making change last? Exploring the value of sustainability approaches in healthcare: a scoping review. Health Res Policy Syst. 2020;18(1):120.

7. Braithwaite J, Ludlow K, Testa L, Herkes J, Augustsson H, Lamprell G, et al. Built to last? The sustainability of healthcare system improvements, programmes and interventions: a systematic integrative review. BMJ Open. 2020;10(6):e036453.

8. Nadalin Penno L, Davies B, Graham ID, Backman C, MacDonald I, Bain J, et al. Identifying relevant concepts and factors for the sustainability of evidence-based practices within acute care contexts: a systematic review and theory analysis of selected sustainability frameworks. Implement Sci. 2019;14(1):1–16.

9. Lennox L, Maher L, Reed J. Navigating the sustainability landscape: a systematic review of sustainability approaches in healthcare. Implement Sci. 2018;13(1):1–17.

10. Moullin JC, Sklar M, Green A, Dickson KS, Stadnick NA, Reeder K, et al. Advancing the pragmatic measurement of sustainment: a narrative review of measures. Implement Sci Commun. 2020;1(1):1–18.

11. Whittemore R. Combining evidence in nursing research: methods and implications. Nurs Res. 2005;54(1):56–62.

12. Whittemore R, Knafl K. The integrative review: updated methodology. J Adv Nurs. 2005;52(5):546–53.

13. Deschênes MF, Goudreau J. Addressing the development of both knowledge and clinical reasoning in nursing through the perspective of script concordance: an integrative literature review. J Nurs Educ Pract. 2017;7(12):28.

14. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):1–13.

15. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059.

16. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005;26(3):320–47.

17. Hong Q, Pluye P, Fàbregues S, Bartlett G, Boardman F, Cargo M, et al. Mixed Methods Appraisal Tool (MMAT). Canadian Intellectual Property Office, Industry Canada; 2018.

18. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372.

19. Abimbola S, Patel B, Peiris D, Patel A, Harris M, Usherwood T, et al. The NASSS framework for ex post theorisation of technology-supported change in healthcare: worked example of the TORPEDO programme. BMC Med. 2019;17(1):233.

20. Ament SMC, Gillissen F, Moser A, Maessen JMC, Dirksen CD, von Meyenfeldt MF, et al. Factors associated with sustainability of 2 quality improvement programs after achieving early implementation success: a qualitative case study. J Eval Clin Pract. 2017;23(6):1135–43.

21. Azeredo TB, Oliveira MA, Santos-Pinto CDB, Miranda ES, Osorio-de-Castro CGS. Sustainability of ARV provision in developing countries: challenging a framework based on program history. Cien Saude Colet. 2017;22(8):2581–94.

22. Baloh J, Zhu X, Ward MM. What influences sustainment and nonsustainment of facilitation activities in implementation? Analysis of organizational factors in hospitals implementing TeamSTEPPS. Med Care Res Rev. 2021;78(2):146–56.

23. Belizán M, Bergh AM, Cilliers C, Pattinson RC, Voce A. Stages of change: a qualitative study on the implementation of a perinatal audit programme in South Africa. BMC Health Serv Res. 2011;11(1):1–12.

24. Bray P, Cummings DM, Wolf M, Massing MW, Reaves J. After the collaborative is over: what sustains quality improvement initiatives in primary care practices? Jt Comm J Qual Patient Saf. 2009;35(10):502–AP3.

25. Bridges J, May C, Fuller A, Griffiths P, Wigley W, Gould L, et al. Optimising impact and sustainability: a qualitative process evaluation of a complex intervention targeted at compassionate care. BMJ Qual Saf. 2017;26(12):970–7.

26. Butow P, Williams D, Thewes B, Tesson S, Sharpe L, Smith A, et al. A psychological intervention (ConquerFear) for treating fear of cancer recurrence: views of study therapists regarding sustainability. Psychooncology. 2019;28(3):533–9.

27. Campbell S, Pieters K, Mullen KA, Reece R, Reid RD. Examining sustainability in a hospital setting: case of smoking cessation. Implement Sci. 2011;6(1):1–11.

28. Carstensen K, Brostrøm Kousgaard M, Burau V. Sustaining an intervention for physical health promotion in community mental health services: a multisite case study. Health Soc Care Community. 2019;27(2):502–15.

29. Dharmayat KI, Tran T, Hardy V, Chirambo BG, Thompson MJ, Ide N, et al. Sustainability of “mHealth” interventions in sub-Saharan Africa: a stakeholder analysis of an electronic community case management project in Malawi. Malawi Med J. 2019;31(3):177–83.

30. Eakin MN, Ugbah L, Arnautovic T, Parker AM, Needham DM. Implementing and sustaining an early rehabilitation program in a medical intensive care unit: a qualitative analysis. J Crit Care. 2015;30(4):698–704.

31. El Bcheraoui C, Kamath AM, Dansereau E, Palmisano EB, Schaefer A, Hernandez B, et al. Results-based aid with lasting effects: sustainability in the Salud Mesoamérica Initiative. Glob Health. 2018;14(1):1–14.

32. Flynn R, Rotter T, Hartfield D, Newton AS, Scott SD. A realist evaluation to identify contexts and mechanisms that enabled and hindered implementation and had an effect on sustainability of a lean intervention in pediatric healthcare. BMC Health Serv Res. 2019;19(1):1–13.

33. Ford JH, Alagoz E, Dinauer S, Johnson KA, Pe-Romashko K, Gustafson DH. Successful organizational strategies to sustain use of A-CHESS: a mobile intervention for individuals with alcohol use disorders. J Med Internet Res. 2015;17(8).

34. Frykman M, von Thiele Schwarz U, Athlin ÅM, Hasson H, Mazzocato P. The work is never ending: uncovering teamwork sustainability using realistic evaluation. J Health Organ Manag. 2017;31(1):64–81.

35. Garst J, L’Heveder R, Siminerio LM, Motala AA, Gabbay RA, Chaney D, et al. Sustaining diabetes prevention and care interventions: a multiple case study of translational research projects. Diabetes Res Clin Pract. 2017;130:67–76.

36. Graham JR, Naylor PJ. Sustainability drivers of Canada’s most health-promoting hospital. Healthc Manage Forum. 2019;32(3):158–62.

37. Hovlid E, Bukve O, Haug K, Aslaksen AB, von Plessen C. Sustainability of healthcare improvement: what can we learn from learning theory? BMC Health Serv Res. 2012;12(1):235.

38. Kempen TGH, Gillespie U, Färdborg M, McIntosh J, Mair A, Stewart D. A case study of the implementation and sustainability of medication reviews in older patients by clinical pharmacists. Res Social Adm Pharm. 2019;15(11):1309–16.

39. Klinga C, Hasson H, Andreen Sachs M, Hansson J. Understanding the dynamics of sustainable change: a 20-year case study of integrated health and social care. BMC Health Serv Res. 2018;18(1):1–12.

40. Lillvis DF, Willison C, Noyes K. Normalizing inconvenience to promote childhood vaccination: a qualitative implementation evaluation of a novel Michigan program. BMC Health Serv Res. 2020;20(1):1–9.

41. Morden A, Brooks L, Jinks C, Porcheret M, Ong BN, Dziedzic K. Research “push”, long term-change, and general practice. J Health Organ Manag. 2015;29(7):798–821.

42. Nordmark S, Zingmark K, Lindberg I. Process evaluation of discharge planning implementation in healthcare using normalization process theory. BMC Med Inform Decis Mak. 2016;16(1):1–10.

43. Pomey MP, Clavel N, Amar C, Sabogale-Olarte JC, Sanmartin C, de Coster C, et al. Wait time management strategies for total joint replacement surgery: sustainability and unintended consequences. BMC Health Serv Res. 2017;17(1):629.

44. Rasschaert F, Decroo T, Remartinez D, Telfer B, Lessitala F, Biot M, et al. Sustainability of a community-based anti-retroviral care delivery model: a qualitative research study in Tete, Mozambique. J Int AIDS Soc. 2014;17(1).

45. Seppey M, Ridde V, Touré L, Coulibaly A. Donor-funded project’s sustainability assessment: a qualitative case study of a results-based financing pilot in Koulikoro region, Mali. Glob Health. 2017;13(1):86.

46. Spassiani NA, Meisner BA, Abou Chacra MS, Heller T, Hammel J. What is and isn’t working: factors involved in sustaining community-based health and participation initiatives for people ageing with intellectual and developmental disabilities. J Appl Res Intellect Disabil. 2019;32(6):1465–77.

47. Tabak RG, Duggan K, Smith C, et al. Assessing capacity for sustainability of effective programs and policies in local health departments. J Public Health Manag Pract. 2016;22(2):129–37.

48. Tomioka M, Braun KL. Examining sustainability factors for organizations that adopted Stanford’s chronic disease self-management program. Front Public Health. 2015;2.

49. Zakumumpa H, Kwiringira J, Rujumba J, Ssengooba F. Assessing the level of institutionalization of donor-funded anti-retroviral therapy (ART) programs in health facilities in Uganda: implications for program sustainability. Glob Health Action. 2018;11(1).

50. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, et al. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm Policy Ment Health. 2016;43(6):991–1008.

51. Allchin B, Weimand BM, O’Hanlon B, Goodyear M. Continued capacity: factors of importance for organizations to support continued Let’s Talk practice – a mixed-methods study. Int J Ment Health Nurs. 2020;29(6):1131–43.

52. Berendsen BAJ, Kremers SPJ, Savelberg HHCM, Schaper NC, Hendriks MRC. The implementation and sustainability of a combined lifestyle intervention in primary care: mixed method process evaluation. BMC Fam Pract. 2015;16(1):37.

53. Blanchet K, James P. Can international health programmes be sustained after the end of international funding: the case of eye care interventions in Ghana. BMC Health Serv Res. 2014;14:77.

54. Doyle C, Howe C, Woodcock T, Myron R, Phekoo K, McNicholas C, et al. Making change last: applying the NHS Institute for Innovation and Improvement sustainability model to healthcare improvement. Implement Sci. 2013;8(1):127.

55. Ford JH, Krahn D, Wise M, Oliver KA. Measuring sustainability within the Veterans Administration mental health system redesign initiative. Qual Manag Health Care. 2011;20(4):263–79.

56. Greenhalgh T, Macfarlane F, Barton-Sweeney C, Woodard F. “If we build it, will it stay?” A case study of the sustainability of whole-system change in London. Milbank Q. 2012;90(3):516–47.

57. Grow HMG, Hencz P, Verbovski MJ, Gregerson L, Liu LL, Dossett L, et al. Partnering for success and sustainability in community-based child obesity intervention. Fam Community Health. 2014;37(1):45–59.

58. Kastner M, Sayal R, Oliver D, Straus SE, Dolovich L. Sustainability and scalability of a volunteer-based primary care intervention (Health TAPESTRY): a mixed-methods analysis. BMC Health Serv Res. 2017;17(1):514.

59. Kennedy L, Pinkney S, Suleman S, Mâsse LC, Naylor PJ, Amed S. Propagating change: using RE-FRAME to scale and sustain a community-based childhood obesity prevention initiative. Int J Environ Res Public Health. 2019;16(5):736.

60. Nazar H, Nazar Z. Community pharmacy minor ailment services: pharmacy stakeholder perspectives on the factors affecting sustainability. Res Social Adm Pharm. 2019;15(3):292–302.

61. Palinkas LA, Spear SE, Mendon SJ, Villamar J, Reynolds C, Green CD, et al. Conceptualizing and measuring sustainability of prevention programs, policies, and practices. Transl Behav Med. 2020;10(1):136–45.

62. Stoll S, Janevic M, Lara M, Ramos-Valencia G, Stephens TB, Persky V, et al. A mixed-method application of the Program Sustainability Assessment Tool to evaluate the sustainability of 4 pediatric asthma care coordination programs. Prev Chronic Dis. 2015;12:150133.

63. Sving E, Fredriksson L, Mamhidir AG, Högman M, Gunningberg L. A multifaceted intervention for evidence-based pressure ulcer prevention: a 3 year follow-up. JBI Evid Implement. 2020;18(4):391–400.

64. Stolldorf DP, Mixon AS, Auerbach AD, Aylor AR, Shabbir H, Schnipper J, et al. Implementation and sustainability of a medication reconciliation toolkit: a mixed methods evaluation. Am J Health Syst Pharm. 2020;77(14):1135–43.

65. Zakumumpa H, Bennett S, Ssengooba F. Accounting for variations in ART program sustainability outcomes in health facilities in Uganda: a comparative case study analysis. BMC Health Serv Res. 2016;16(1):584.

66. Belostotsky V, Laing C, White DE. The sustainability of a quality improvement initiative. Healthc Manage Forum. 2020;33(5):195–9.

67. Bond GR, Drake RE, Becker DR, Noel VA. The IPS learning community: a longitudinal study of sustainment, quality, and outcome. Psychiatr Serv. 2016;67(8):864–9.

68. Healey J, Conlon CM, Malama K, Hobson R, Kaharuza F, Kekitiinwa A, et al. Sustainability and scale of the Saving Mothers, Giving Life approach in Uganda and Zambia. Glob Health Sci Pract. 2019;7(Suppl 1):S188–206.

69. Hunter SB, Han B, Slaughter ME, Godley SH, Garner BR. Associations between implementation characteristics and evidence-based practice sustainment: a study of the Adolescent Community Reinforcement Approach. Implement Sci. 2015;10(1):173.

70. Jones S, Hamilton S, Bell R, Araújo-Soares V, Glinianaia S, et al. What helped and hindered implementation of an intervention package to reduce smoking in pregnancy: process evaluation guided by normalization process theory. BMC Health Serv Res. 2019;19(1):1–14.

71. King DK, Gonzalez SJ, Hartje JA, Hanson BL, Edney C, Snell H, et al. Examining the sustainability potential of a multisite pilot to integrate alcohol screening and brief intervention within three primary care systems. Transl Behav Med. 2018;8(5):776–84.

72. Knapp H, Hagedorn H, Anaya HD. A five-year self-sustainability analysis of nurse-administered HIV rapid testing in Veterans Affairs primary care. Int J STD AIDS. 2014;25(12):837–43.

73. Kosse RC, Murray E, Bouvy ML, de Vries TW, Stevenson F, Koster ES. Potential normalization of an asthma mHealth intervention in community pharmacies: applying a theory-based framework. Res Social Adm Pharm. 2020;16(2):195–201.

74. van Heerden C, Maree C, van Rensburg ESJ. Strategies to sustain a quality improvement initiative in neonatal resuscitation. Afr J Prim Health Care Fam Med. 2016;8(2):1–10.

75. Carlfjord S, Lindberg M, Andersson A. Sustained use of a tool for lifestyle intervention implemented in primary health care: a 2-year follow-up. J Eval Clin Pract. 2013;19(2):327–34.

76. Curry SJ, Mermelstein RJ, Sporer AK. Sustainability of community-based youth smoking cessation programs: results from a 3-year follow-up. Health Promot Pract. 2016;17(6):845–52.

77. Finch TL, Mair FS, O’Donnell C, Murray E, May CR. From theory to “measurement” in complex interventions: methodological lessons from the development of an e-health normalisation instrument. BMC Med Res Methodol. 2012;12:1–16.

78. Kacholi G, Mahomed OH. Sustainability of quality improvement teams in selected regional referral hospitals in Tanzania. Int J Qual Health Care. 2020;32(4):259–65.

79. Lindholm LH, Koivukangas A, Lassila A, Kampman O. What is important for the sustained implementation of evidence-based brief psychotherapy interventions in psychiatric care? A quantitative evaluation of a real-world programme. Nord J Psychiatry. 2019;73(3):185–94.

80. Mahomed OH, Asmall S, Voce A. Sustainability of the integrated chronic disease management model at primary care clinics in South Africa. Afr J Prim Health Care Fam Med. 2016;8(1):1–7.

81. Smith ML, Durrett NK, Schneider EC, Byers IN, Shubert TE, Wilson AD, et al. Examination of sustainability indicators for fall prevention strategies in three states. Eval Program Plann. 2018;68:194–201.

82. Stolldorf DP, Fortune-Britt AG, Nieuwsma JA, Gierisch JM, Datta SK, Angel C, et al. Measuring sustainability of a grassroots program in a large integrated health care delivery system: the Warrior to Soul Mate program. J Mil Veteran Fam Health. 2018;4(2):81–90.

83. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):1–15.

84. Maher L, Gustafson D, Evans A. Sustainability model and guide. NHS Institute for Innovation and Improvement; 2007.

85. May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology. 2009;43(3):535–54.

86. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.

87. Luke DA, Calhoun A, Robichaux CB, Elliott MB, Moreland-Russell S. The Program Sustainability Assessment Tool: a new instrument for public health programs. Prev Chronic Dis. 2014;11:130184.

88. Bond GR, Peterson AE, Becker DR, et al. Validating the revised Individual Placement and Support Fidelity Scale (IPS-25). Psychiatr Serv. 2012;63:758–63.

89. Goodman RM, McLeroy KR, Steckler AB, Hoyle RH. Perspective: development of level of institutionalization scales for health promotion programs. Health Educ Behav. 1993;20(2):161–78.

90. Rogers EM, Singhal A, Quinlan MM. Diffusion of innovations. In: An integrated approach to communication theory and research. 3rd ed. New York: Free Press; 2019. p. 415–33.

91. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.

92. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7(3):149–58.

93. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

94. Greenhalgh T, Abimbola S. The NASSS framework: a synthesis of multiple theories of technology implementation. Stud Health Technol Inform. 2019;263:193–204.

95. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1).

96. Johnson J, Dakens L, Edwards P, Morse N. SwitchPoints: culture change on the fast track for business success; 2008.

97. Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13(1):87–108.

98. Pomey MP, Forest PG, Sanmartin C, DeCoster C, Drew M. Determinants of waiting time management for health services: a policy review and synthesis. Working paper. Montreal: University of Montreal; 2009.

99. Bergh A, Arsalo I, Malan AF, Patrick M, Pattinson RC, Phillips N. Measuring implementation progress in kangaroo mother care. Acta Paediatr. 2007;94(8):1102–8.

100. Mancini JA, Marek LI. Sustaining community-based programs for families: conceptualization and measurement. Fam Relat. 2004;53(4):339–47.

101. Kotter JP. The 8-step process for leading change; 2019.

102. Gruen RL, Elliott JH, Nolan ML, Lawton PD, Parkhill A, McLaren CJ, et al. Sustainability science: an integrated approach for health-programme planning. Lancet. 2008;372(9649):1579–89.

103. Crites GE, McNamara MC, Akl EA, Richardson WS, Umscheid CA, Nishikawa J. Evidence in the learning organization. Health Res Policy Syst. 2009;7(1):4.

104. Ipsos MORI. The Service Improvement Self Assessment Tool: promoting service improvement in the NHS; 2012.

105. Brewster L, Aveling EL, Martin G, Tarrant C, Dixon-Woods M, Barber N, et al. What to expect when you’re evaluating healthcare improvement: a concordat approach to managing collaboration and uncomfortable realities. BMJ Qual Saf. 2015;24(5):318–24.

106. Birken SA, Haines ER, Hwang S, Chambers DA, Bunger AC, Nilsen P. Advancing understanding and identifying strategies for sustaining evidence-based practices: a review of reviews. Implement Sci. 2020;15(1):1–13.



Acknowledgements

We acknowledge Lauren Dobson for her editorial support on this manuscript. The primary author acknowledges the Canadian Institutes of Health Research, the Women and Children’s Health Research Institute, and the Faculty of Nursing, University of Alberta, for their support of her post-doctoral research. SDS is funded through a Canada Research Chair and a Distinguished Researcher Award from the Stollery Science Lab program at the Women and Children’s Health Research Institute.


For this review, we did not prepare or register a protocol.


Not applicable.

Author information

Authors and Affiliations



RF conceived this integrative review as part of her postdoctoral research. BS and SDS are the co-supervisors of this postdoctoral research; they provided guidance on the conceptual development of this review and on the integrative review process, and contributed to the study conception and development. MK guided and performed the search strategies for this review. AB assisted in data screening and data extraction and in intellectual discussion around extraction and synthesis. RF drafted and edited the final manuscript. BS, AB, and SDS all participated in critically appraising and revising the intellectual content of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Rachel Flynn.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. 

Search strategy. The four databases searched on Dec 21, 2018, as well as an additional search in July 2020. The file depicts the dates of the searches and the strategy used, including all search terms in each database.

Additional file 2.

 PRISMA 2020 Checklist. The complete 2020 PRISMA checklist for new systematic reviews.

Additional file 3. 

Citation list of included studies. A citation list of all included studies (n=64). They are listed in alphabetical order, according to the Vancouver referencing style.

Additional file 4. 

Sustainability outcomes of included studies. A table of the sustainability outcomes reported in all of the included studies (n=64). The table depicts which of the 9 sustainability outcomes were reported in each study.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article


Cite this article

Flynn, R., Stevens, B., Bains, A. et al. Identifying existing approaches used to evaluate the sustainability of evidence-based interventions in healthcare: an integrative review. Syst Rev 11, 221 (2022).
