To what extent do site-based training, mentoring, and operational research improve district health system management and leadership in low- and middle-income countries: a systematic review protocol
© Belrhiti et al. 2016
Received: 21 January 2016
Accepted: 6 April 2016
Published: 27 April 2016
District health managers play a key role in the effectiveness of decentralized health systems in low- and middle-income countries. Inadequate management and leadership skills often hamper their ability to improve quality of care and the effectiveness of health service delivery. In response, significant investments have been made in capacity-building programmes based on site-based training, mentoring, and operational research. This systematic review aims to assess the effectiveness of site-based training, mentoring, and operational research (or action research) in improving district health system management and leadership. Our secondary objectives are to assess whether variations in the composition or intensity of the intervention influence its effectiveness and to identify enabling and constraining contexts and underlying mechanisms.
We will search the following databases: MEDLINE, PsycInfo, Cochrane Library, CRD database (DARE), Cochrane Effective Practice and Organisation of Care (EPOC) group, ISI Web of Science, Health Evidence.org, PDQ-Evidence, ERIC, EMBASE, and TRIP. Complementary searches will be performed (hand-searching of journals, and citation and reference tracking).
Studies that meet the following PICO (Population, Intervention, Comparison, Outcome) criteria will be included: P: professionals working at district health management level; I: site-based training with or without mentoring, or operational research; C: normal institutional arrangements; and O: district health management functions. We will include cluster randomized controlled trials, controlled before-and-after studies, interrupted time series analysis, quasi-experimental designs, and cohort and longitudinal studies. Qualitative research will be included to contextualize findings and identify barriers and facilitators.
Primary outcomes to be reported are district health management and leadership functions. We will assess risk of bias with the Cochrane Collaboration’s tools for randomized controlled trials (RCTs) and non-RCT studies, and with Critical Appraisal Skills Programme checklists for qualitative studies. We will assess the strength of recommendations with the GRADE tool for quantitative studies and the CERQual approach for qualitative studies. Quantitative studies will be synthesized through meta-analysis when appropriate. Best fit framework synthesis will be used to synthesize qualitative studies.
This protocol paper describes a systematic review assessing the effectiveness of site-based training (with or without mentoring programmes or operational research) on the improvement of district health system management and leadership.
Systematic review registration
Keywords: Site-based training; Mentoring; Operational research; Best fit framework synthesis; District health; Management; Leadership; Low- and middle-income countries
Description of the condition
Decentralization has been a common healthcare reform in low- and middle-income countries (LMICs) since the 1950s and early 1960s. By decentralization we mean the transfer of authority or delegation of power in public planning, management, and decision-making from the national level to sub-national levels [1–3].
A common form of decentralization in LMICs is deconcentration: the handing over of some administrative authority from the central level of government (for instance, a ministry of health) to the district level. Deconcentration in the health sector aims at establishing a local district management team with clearly defined administrative duties and a degree of discretion that enables local officials to manage a limited administrative area (e.g. a health district) without constant reference to ministry headquarters [1, 4].
However, inadequate leadership and management capacities of district health managers often hamper their ability to improve the quality, effectiveness, and efficiency of health service delivery, which in turn may contribute to a decreased use of healthcare services by the local population [5, 6].
Description of the intervention
There has been a significant investment in capacity-building programmes aiming at developing and maintaining essential competencies required for optimal public health and effective health service delivery [2, 6]. These capacity-building programmes are activities and processes that improve the ability of staff within organizations to carry out stated objectives.
Mentoring (M) is a flexible learning and teaching process that serves specific objectives of a health programme. Health management mentoring involves spending time with managers in their local environment to assist them in their day-to-day challenges. Mentorship is defined as the dynamic, reciprocal relationship in a work environment between an advanced career incumbent (mentor) and a beginner (mentee). It aims at promoting the development of both the mentor and mentee. Mentoring is recognized as a catalyst for facilitating career selection, advancement, and productivity [8, 9].
Site-based training (SBT) (also called in-service training) involves training current district health managers in their work settings. Such training may enable health managers to run health districts in an integrated manner through sound management of primary and secondary health services, team building, and supervision. SBT has been, and will remain, a significant investment in developing and maintaining essential competencies required for optimal public health in local district-level service settings.
Operational research (OR) is “the search for knowledge on interventions, strategies, or tools that can enhance the quality, effectiveness, or coverage of programmes in which the research is being done”.
Action research (AR) is “a method used for improving practice. It involves action, evaluation, and critical reflection and – based on the evidence gathered – changes in practice are then implemented”.
Both operational research and action research in this review are understood as a tool for capacity-building and as a close collaborative investigation between researchers and district health managers (DHM). In this paper, we will refer to both strategies as operational research.
Complex interventions are combinations of interacting and interdependent actions. The overall effect of the intervention could be higher or lower when interactions are synergistic or antagonistic, respectively. Our review will examine the effectiveness of the site-based training intervention alone (single intervention) versus site-based training combined with mentoring and/or operational research activities (multicomponent intervention), i.e. SBT+M+OR, SBT+OR, or SBT+M.
This review will identify the relationship between the intensity of the intervention (number of components) and the effect size. It will seek to reconcile conflicting evidence of the effectiveness of single intervention versus multi-component intervention [14, 15].
How the intervention might work
The World Health Organization (WHO) has defined four factors that enable the improvement of district health systems management: (1) there should be an adequate number of trained managers; (2) managers should have appropriate competencies; (3) there should be support systems to provide managers with the resources they need to carry out their responsibilities, including systems for planning and budgeting as well as human resource management; and (4) the environment in which the managers function should enable them to carry out their responsibilities.
Site-based training will improve DHM competencies, while mentoring will provide DHM with the supportive environment they need to carry out their day-to-day responsibilities (factors 2 and 4 of the WHO framework). Meanwhile, operational research fosters the ability of DHM to make sense of national policies, to translate them into operational terms, and to integrate them into their day-to-day practices. Operational research connects researchers with DHM in their work settings: researchers reflect on their ways of supporting managers, while managers identify work-related issues, analyse them, take action, and reflect on their actions [17, 18].
P: professionals working at district health management level
I: site-based training with or without mentoring AND/OR operational research
C: normal institutional arrangements
O: district health management functions (see Table 1)
District management and leadership functions
District management functions refer to the functions of budgeting and expenditure control, supervisory practices, staffing, planning and resource allocation, procurement of supplies, maintenance, local adaptation of national policies, revenue raising, and training.
Leadership functions stand for interagency coordination, intersectoral collaboration, strategic orientation, and staff alignment and motivation.
Managerial and leadership functions are what Gilson (2012) refers to as the meso-level health system functions.
Why it is important to do this review
A recent systematic review found no, or only low-quality, evidence of an effect of site-based training on the improvement of knowledge, competencies, and health outcomes. However, specific district management and leadership functions were not covered by that review.
This review will inform policymakers and educational institutions involved in site-based training and mentoring about the appropriate educational methods to use, the constraints to avoid, the key mechanisms that enhance the effectiveness of these interventions, and the contextual factors that improve the performance of district health management.
Our objective is to evaluate the available evidence on the effectiveness of site-based training, mentoring, and operational research in improving district health system management and leadership (see Table 1). We will specifically compare the site-based training intervention alone (single intervention) with site-based training combined with mentoring and/or operational research activities (multicomponent intervention).
We acknowledge that this intervention is multifaceted [14, 15] and complex in nature, as it is implemented in social systems characterized by human agency, uncertainty, and unpredictability. Hence, our secondary objectives are to identify the enabling or constraining contexts and the mechanisms underlying the effectiveness of this intervention. We will report broader generalizable trends across multiple settings and will devise a “best fit framework” that will help policymakers understand how and why the intervention works.
We acknowledge that we may not review other relevant outcomes due to practical considerations within a limited timeframe (see Additional file 1).
Criteria for considering studies for this review
The population to be included in this review concerns DHM. We define DHM as health officers involved in district health management who spend some of their time on management and/or administrative functions within the health district. These include district medical officers, nursing officers, health inspectors, administrators, counsellors from the district health committee, hospital representatives (hospital directors), district administrators, representatives of clinics, medical assistants, and local government promotion officers. Also included are health workers who carry out administrative tasks alongside their clinical practice, such as medical doctors, nurse practitioners, clinical officers, or primary healthcare officers.
Medical and nursing students will be excluded from the review. Community health workers are also excluded because they are not involved in the management of health districts.
The intervention is a complex, multifaceted intervention (i.e. a combination of different strategies) composed of site-based training, with or without mentoring and/or operational research, at district level. Site-based training is the principal component; mentoring and operational research are optional components. We will therefore also consider studies in which site-based training is accompanied by mentoring or operational research programmes.
Traditional in-class training, pre-service training, or medical education will be excluded. Training and mentoring focusing on vertical programmes will also be excluded because of our focus on health district systemic functioning.
Who delivers the intervention
Independent researchers, academics, or local managers might deliver the intervention.
Control groups are districts that are not the site of site-based training, mentoring, or operational research interventions and that deliver regular care.
Intermediate outcomes, as depicted in the logic model (red box in Fig. 1), are the major focus of this review. We have identified major outcomes that are crucial for the effective functioning of local health systems and validated them with content experts and end users of this review. These outcomes are grouped into two categories: district health management functions and leadership functions (see Table 1).
Mechanisms underlying the effect of the intervention
Enabling and constraining contextual factors
Comparison of effectiveness of single intervention versus multicomponent intervention
Types of study
In order to assess the effectiveness of site-based training, mentoring, and operational research interventions on the performance of district health system managers, we will rely on study designs capable of demonstrating a causal relationship, namely: cluster randomized controlled trials, controlled before-and-after studies (CBA), interrupted time series (ITS), quasi-experimental designs, cohort studies, and longitudinal studies.
We will also collect and extract information from qualitative studies linked to, or associated with, included studies. Qualitative research and economic evaluation studies will be used in the review to help contextualize findings and identify barriers and facilitators. Qualitative research will help to identify how the intervention might work (underlying mechanisms), within a particular local historical and institutional context.
Reviews and meta-analyses will be excluded, but eligible studies identified from within existing reviews will be included.
Only interventions that are delivered in district health systems in LMICs will be included in this review.
We will exclude studies carried out in high-income countries (HICs) because their district health system models differ substantially from those of LMICs in both functioning and staffing; evidence from HICs would therefore be hardly transferable to LMICs. Studies that have not been conducted within decentralized health district levels will also be excluded.
Our search strategy relies on three elements (population, intervention, and context) of the PICOC (Population, Intervention, Comparison, Outcome, Context) framework. We adopted a sensitive rather than a specific search strategy in order to scan the whole range of studies in the specific field of health systems research.
((((((((((((((“evaluation”[All Fields]) OR realist[All Fields]) OR “Program Evaluation”[Mesh]) OR “Evaluation Studies” [Publication Type]) OR “cohort studies”[MeSH Terms]) OR (“Outcome and Process Assessment (Health Care)”[Mesh])) OR Evalu*) OR interview) OR Qualitative) OR “Qualitative Research” [Mesh]) OR Randomized Controlled Trial [Publication Type]) OR Random*) OR Randomly [tiab]) OR Randomized controlled trial [pt]) OR Interrupted Time Series Analysis
Grey literature will not be reviewed.
Date of publication
English, French, Arabic and Spanish
Sources to be searched
(1) MEDLINE, (2) PsycInfo, (3) Cochrane Library, (4) CRD database (DARE), (5) Cochrane Effective Practice and Organisation of Care (EPOC) group, (6) ISI Web of Knowledge, (7) Health Evidence.org, (8) PDQ-Evidence, (9) ERIC, (10) EMBASE, (11) TRIP
We will review only published articles.
Reference tracking, citation tracking, hand-searching journals
Data collection and analysis
A team of two reviewers will select studies. EndNote will be used as reference management software. Title and abstract screening will be recorded in Microsoft Excel sheets, including the reasons for exclusion. Full-text articles will be obtained in cases of doubt. In cases of disagreement between the reviewers, a third reviewer will be consulted.
A kappa coefficient will be calculated to ensure that discordance does not affect the validity of the selection process. The review team includes two experts in evidence-informed decision-making. Since we are interested in implementation gaps, qualitative studies and process evaluations will be obtained and considered alongside the included studies. By process evaluation we mean research used to assess the fidelity, quality, and reach of implementation; to clarify causal mechanisms; and to identify contextual factors associated with variation in outcomes. Data from multiple reports of the same study will be collated.
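As an illustration of the planned agreement check, Cohen's kappa between two reviewers' screening decisions can be sketched as follows (a minimal example in Python; the function name and toy data are ours, not part of the protocol):

```python
def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies
    p_e = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Toy include/exclude decisions for four records
reviewer_1 = ["include", "include", "exclude", "exclude"]
reviewer_2 = ["include", "exclude", "exclude", "exclude"]
print(cohens_kappa(reviewer_1, reviewer_2))  # 0.5: moderate agreement
```

A kappa near 1 indicates near-perfect agreement; values well below that would trigger the third-reviewer arbitration described above.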
Two reviewers will carry out data extraction, with external validation by the mentors of the review. Additional searches for process evaluations and qualitative and implementation studies will be performed using citation searching to inform an understanding of the context, mechanisms, barriers, facilitators, cost, and sustainability of intervention implementation. Authors will be contacted if implementation data are not included in the published articles or in case of missing data. Relevant data will be collected for each type of intervention (site-based training, mentoring, operational research) according to the data extraction form presented below.
Data extraction form
Author names, journal, year
Unit of analysis
Type of intervention:
Professional (educational intervention)
Participants (profession, administrative position, level of training, clinical specialty, age, time since graduation)
Setting (location; country; district level, primary or secondary level; rural or urban area)
Duration of the programme
Underlying theory of change
Length of post-intervention follow-up
We will report the frequency and timing of outcome measurement. The length of post-intervention follow-up will also be collected. Duplicate or multiple reports of the same study will be assembled and compared for duplication, completeness, and possible contradictions. Data will be extracted using a Microsoft Excel spreadsheet form. EndNote and RevMan software will be used for data storage and analysis. Authors will be contacted by mail in case of missing data. Measures of effect to be reported are relative risk (RR) for dichotomous variables and categorical measures such as Likert scales, and means or changes in means over time for continuous data.
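To illustrate the dichotomous effect measure mentioned above: the relative risk is simply the event risk in the intervention arm divided by the risk in the control arm (a hypothetical helper for illustration; the protocol does not specify an implementation):

```python
def relative_risk(events_int, n_int, events_ctrl, n_ctrl):
    """RR = risk in intervention group / risk in control group."""
    risk_int = events_int / n_int
    risk_ctrl = events_ctrl / n_ctrl
    return risk_int / risk_ctrl

# 10/100 events under the intervention vs 20/100 under control
print(relative_risk(10, 100, 20, 100))  # 0.5: the intervention halves the risk
```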
Proposed quantitative data synthesis
We will use RevMan software to perform meta-analysis of quantitative data if similar outcomes and measurement scales were used. A test of heterogeneity (the I2 statistic) will be carried out to assess the appropriateness of meta-analysis and to inform the choice between fixed-effects and random-effects models. We will also explore whether the effect size differs between the single intervention (SBT) and the multicomponent intervention (SBT with mentoring and/or operational research). Meta-regression, using RevMan 5, could be used to examine conditional relationships between predictors (intensity of the intervention and other subgroup variables) and the magnitude of the effect size.
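The heterogeneity check and model choice described above can be sketched as follows: a minimal DerSimonian–Laird random-effects pooling with Cochran's Q and I², written for illustration only (RevMan computes these internally; function name and toy data are ours):

```python
import math

def random_effects_pool(effects, variances):
    """Pool study effect estimates: returns the DerSimonian-Laird
    random-effects estimate, its standard error, and I2 (%)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect estimate
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    # Between-study variance tau2 (method of moments)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, i2

# Two homogeneous toy studies: I2 = 0, so fixed and random effects coincide
pooled, se, i2 = random_effects_pool([0.5, 0.5], [0.1, 0.1])
print(round(pooled, 3), round(i2, 1))  # 0.5 0.0
```

With a high I² (substantial heterogeneity), tau² grows and the random-effects weights flatten, which is the rationale for preferring the random-effects model in that case.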
Proposed qualitative data synthesis
We will first use best fit framework (BFF) synthesis to synthesize qualitative evidence gathered from included studies. This methodology fits the analysis of organizational policies and procedures. It is also appropriate for research within a limited time frame, with specific questions, and with heterogeneous outcomes [24, 25]. The main purpose of the BFF is to describe and interpret what is happening in specific contexts. It goes beyond identifying insights from individual case studies to reporting broader generalizable trends across multiple settings. Therefore, BFF helps practitioners design suitable action plans that address system-wide issues.
We will identify relevant framework publications using truncated terms (theor*, concept*, framework*, or model*) in our reference management database of included studies. A supplementary search will be carried out in external databases using the BeHEMoTh (Behaviour of interest, Health context, Exclusions, Models or Theories) template (see search strategy in Appendix 2). Study data will be extracted against the concepts and subcategories of the framework.
Quality assessment strategy
For RCTs and non-RCT studies (CBA, ITS), the Cochrane Collaboration tool and the EPOC tools, respectively, will be used to assess risk of bias. Critical Appraisal Skills Programme (CASP) checklist tools will be used for qualitative studies. We will assess the quality of studies and categorize them as low risk of bias, unclear risk of bias, or high risk of bias. These categories will be assigned, respectively, where plausible bias is unlikely to alter confidence in the results, where plausible bias raises some doubt about the validity of the results, or where plausible bias seriously weakens confidence in the results.
In addition, the overall strength of recommendations will be assessed using the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) approach. The quality of the body of evidence will be categorized into four categories (high, moderate, low, and very low) based on the likelihood of specific risk of bias, publication bias, indirectness (surrogate outcomes, indirect comparisons, differences in population, differences in intervention), inconsistency, and imprecision.
We will identify risk of bias using specific criteria; estimate publication bias using funnel plots of study results; and assess the likelihood of inconsistency with criteria such as variation in point estimates, absence of minimal overlap of confidence intervals, statistical tests for heterogeneity, and I2 [35, 36].
For qualitative studies, the CASP tool will allow us to question the validity of the results, the quality of the analysis process, and the relevance for the local context. To ensure consistency with the GRADE approach, we will use the corresponding CERQual approach for the qualitative studies. This will allow us to categorize the evidence into the same four categories (high, moderate, low, and very low) based on the likely confidence in the findings (methodological limitations, publication bias, relevance, coherence, and adequacy of data).
This protocol was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols (PRISMA-P) statement for reporting systematic review protocols (see Additional file 2).
Site-based training is also called on-the-job training, hands-on training, and continuing education, which means training that takes place in workplace settings.
Nurse practitioners: nurses who are specially trained to assume an expanded role in providing medical care under the supervision of a physician
Clinical officer (formerly called medical assistant): a separate healthcare provider who has clinical duties (diagnosis and treatment of patients in primary healthcare; assessment, management, and transfer of surgical patients) and administrative duties at their respective health centres.
Community members who connect healthcare beneficiaries with providers in order to promote access to health
Prof. Bart Criel, Dr. Geneviève Michaux, and Prof. Bruno Marchal from the Institute of Tropical Medicine, Antwerp
Members of the RIPSEC consortium. RIPSEC is an EU-funded programme that aims at implementing a site-based training intervention, mentoring, and operational research in four health districts in the Democratic Republic of Congo.
Zakaria Belrhiti, National School of Public Health, PhD student at Institute of Tropical Medicine
Issam Bennis, National School of Public Health, PhD student at Institute of Tropical Medicine
Andrew Booth, University of Sheffield, and Roos Verstraeten, Institute of Tropical Medicine, Antwerp
Theory, theories, theoretical
Concept, concepts, and conceptual
Critical Appraisal Skills Programme (CASP). Qualitative Research Checklist. 2013 available at http://www.casp-uk.net/#!checklists/cb36
University of Sheffield
Institute of tropical medicine, Antwerp, Belgium
The list of LMICs is based on the 2016 fiscal report, World Bank: http://data.worldbank.org/about/country-and-lending-groups#Lower_middle_income. Nationality adjectives for irregular forms were added.
Behaviour of interest, Health context, Exclusions, Models or Theories
best fit framework
Controlled before-and-after studies
Critical Appraisal Skills Programme
Database of Abstracts of Reviews of Effects
district health managers
Cochrane Effective Practice and Organisation of Care
Education Resources Information Center
Evidence Informed Decision-Making in Health and Nutrition
Interrupted Time Series
low- and middle-income countries
“Pretty Darn Quick” evidence
Population, Intervention, Comparison, Outcome
Population, Intervention, Comparison, Outcome, Context
randomized controlled trials
World Health Organization
The authors acknowledge the excellent comments and critical review of the manuscript provided by Fiona Campbell. The authors also acknowledge the contributions of content experts Prof. Bart Criel and Dr. Geneviève Michaux.
Sources of support
Training in systematic review tools is funded by the Belgian Development Cooperation. This review is supported by the EVIDENT network in terms of training and mentorship.
This review will be used as a deliverable in the RIPSEC programme (an evidence-informed policy programme in the Democratic Republic of Congo) funded by the European Commission (EuropeAid/135178/C/ACT/Multi).
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
- Mills A, Vaughan JP, Smith DL, Tabibzadeh I. Health system decentralization: concepts, issues and country experience. Geneva, Switzerland: World Health Organization; 1990.
- Bossert T. Analyzing the decentralization of health systems in developing countries: decision space, innovation and performance. Soc Sci Med. 1998;47(10):1513–27.
- Bossert TJ, Beauvais JC. Decentralization of health systems in Ghana, Zambia, Uganda and the Philippines: a comparative analysis of decision space. Health Policy Plan. 2002;17(1):14–31.
- Rondinelli DA, Nellis JR. Assessing decentralization policies in developing countries: the case for cautious optimism. Dev Policy Rev. 1986;4:3–23.
- Rowe AK, de Savigny D, Lanata CF, Victora CG. How can we achieve and maintain high-quality performance of health workers in low-resource settings? Lancet. 2005;366(9490):1026–35.
- Waddington C. Towards better leadership and management in health. In: Making health systems work. Geneva, Switzerland: World Health Organization; 2007.
- Lafond AK, Brown L, Macintyre K. Mapping capacity in the health sector: a conceptual framework. Int J Health Plann Manag. 2002;17:3–22.
- Barnoya J, Monzon JC, Colditz GA. Increasing chronic disease research capacity in Guatemala through a mentoring program. Can J Public Health. 2013;104(5):e427–32.
- Edwards LJ, Moises A, Nzaramba M, Cassimo A, Silva L, Mauricio J, Wester CW, Vermund SH, Moon TD. Implementation of a health management mentoring program: year-1 evaluation of its impact on health system strengthening in Zambezia Province, Mozambique. Int J Health Policy Manag. 2015;4(6):353–61.
- De Brouwere V, Van Balen H. Hands-on training in health district management. World Health Forum. 1996;17(3):271–3.
- Rockers PC, Bärnighausen T. Interventions for hiring, retaining and training district health systems managers in low- and middle-income countries (review). Cochrane Database Syst Rev. 2013;(4):CD009035. doi:10.1002/14651858.CD009035.pub2.
- Zachariah R, Harries AD, Ishikawa N, Rieder HL, Bissell K, Laserson K, Massaquoi M, Herp MV, Reid T. Operational research in low-income countries: what, why, and how? Lancet Infect Dis. 2009;9:711–7.
- Koshy V. Action research for improving practice: a practical guide. London: Paul Chapman Publishing; 2005.
- Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement Sci. 2014;9(1):152.
- Squires JE, Valentine JC, Grimshaw JM. Systematic reviews of complex interventions: framing the review question. J Clin Epidemiol. 2013;66(11):1215–22.
- Gilson L, Elloker S, Olckers P, Lehmann U. Advancing the application of systems thinking in health: South African examples of a leadership of sensemaking for primary health care. Health Res Policy Syst. 2014;12(1):1–13.
- Lehmann U, Gilson L. Action learning for health system governance: the reward and challenge of co-production. Health Policy Plan. 2015;30(8):957–63.
- Bowling A. Mixed research approaches. In: Research methods in health: investigating health and health services. Oxford: Oxford University Press; 2009. p. 441–6.
- Prashanth NS, Marchal B, Kegels G, Criel B. Evaluation of capacity building program of district health managers in India: a contextualized theoretical framework. Front Public Health. 2014;2(89):1–14.
- Takeuchi R, Bolino MC, Lin CC. Too many motives? The interactive effects of multiple motives on organizational citizenship behavior. J Appl Psychol. 2015;100(4):1239–48.
- Gilson L. Health policy and systems research: a methodology reader. Geneva, Switzerland: Alliance for Health Policy and Systems Research, World Health Organization; 2012.
- Petticrew M, Roberts H. Systematic reviews in the social sciences. Malden, USA; Oxford, UK; Carlton, Australia: Blackwell Publishing; 2006.
- Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, Moore L, O’Cathain A, Tinati T, Wight D, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.
- Carroll C, Booth A, Cooper K. A worked example of “best fit” framework synthesis: a systematic review of views concerning the taking of some potential chemopreventive agents. BMC Med Res Methodol. 2011;11:29.
- Carroll C, Booth A, Leaviss J, Rick J. “Best fit” framework synthesis: refining the method. BMC Med Res Methodol. 2013;13(1):1–16.
- Dixon-Woods M. Using framework-based synthesis for conducting reviews of qualitative studies. BMC Med. 2011;9:39.
- Anderson LM, Petticrew M, Rehfuess E, Armstrong R, Ueffing E, Baker P, Francis D, Tugwell P. Using logic models to capture complexity in systematic reviews. Res Synth Methods. 2011;2:33–42.
- Baxter SK, Blank L, Woods HB, Payne N, Rimmer M, Goyder E. Using logic model methods in systematic review synthesis: describing complex pathways in referral management interventions. BMC Med Res Methodol. 2014;14(62):1471–2288.
- Baxter S, Killoran A, Kelly MP, Goyder E. Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews. Public Health. 2010;124(2):99–106.
- Higgins JP, Altman DG, Gøtzsche PC, Jüni P, Moher D, Oxman AD, Savović J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.
- Effective Practice and Organisation of Care (EPOC). Suggested risk of bias criteria for EPOC reviews. EPOC resources for review authors. Oslo: Norwegian Knowledge Centre for the Health Services; 2015. Available at: http://epoc.cochrane.org/sites/epoc.cochrane.org/files/uploads/14%20Suggested%20risk%20of%20bias%20criteria%20for%20EPOC.
- Critical Appraisal Skills Programme (CASP). Qualitative research checklist. 2013. Available at: http://www.casp-uk.net/#!checklists/cb36.
- Guyatt GH, Oxman AD, Vist G, Kunz R, Brozek J, Alonso-Coello P, Montori V, Akl EA, Djulbegovic B, Falck-Ytter Y, et al. GRADE guidelines: 4. Rating the quality of evidence—study limitations (risk of bias). J Clin Epidemiol. 2011;64(4):407–15.
- Guyatt GH, Oxman AD, Montori V, Vist G, Kunz R, Brozek J, Alonso-Coello P, Djulbegovic B, Atkins D, Falck-Ytter Y, et al. GRADE guidelines: 5. Rating the quality of evidence—publication bias. J Clin Epidemiol. 2011;64(12):1277–82.
- Guyatt GH, Oxman AD, Kunz R, Woodcock J, Brozek J, Helfand M, Alonso-Coello P, Falck-Ytter Y, Jaeschke R, Vist G, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303–10.
- Guyatt GH, Oxman AD, Kunz R, Woodcock J, Brozek J, Helfand M, Alonso-Coello P, Glasziou P, Jaeschke R, Akl EA, et al. GRADE guidelines: 7. Rating the quality of evidence—inconsistency. J Clin Epidemiol. 2011;64(12):1294–302.
- Lewin S, Glenton C, Munthe-Kaas H, Carlsen B, Colvin CJ, Gulmezoglu M, Noyes J, Booth A, Garside R, Rashidian A. Using qualitative evidence in decision making for health and social interventions: an approach to assess confidence in findings from qualitative evidence syntheses (GRADE-CERQual). PLoS Med. 2015;12(10):e1001895.
- Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4(1):1.