The framework presented in this paper is intended to enable researchers to present and contextualize evidence from systematic reviews and other sources of synthesized and quality-assessed evidence. The approach is designed to address the wide range of questions of interest to decision-makers, especially those commissioning services or managing service delivery and organization in primary or secondary care. As such, the framework attempts to go beyond the types of questions normally addressed in systematic reviews of effectiveness. We aim to build relations with decision-makers through initial face-to-face meetings, followed by continued contact (face-to-face or by email) to clarify the issue or question to be addressed. The use of the checklist (Figure 1) enables us to ensure that a common understanding of the question to be addressed is achieved. The checklist takes a broad approach to defining the question, especially for aspects other than effectiveness and cost-effectiveness. For example, there could be overlap between the concepts covered by questions 8, 9 and 11, but having three separate questions allows for differences in understanding of terms like ‘barriers’, ‘facilitators’ and ‘implementation’.
While the approach appears promising, our experience with producing briefings using this framework has been limited to date. Collection of feedback from decision-makers is an important part of the process and will help us to refine our approach further over time. We are aware of the need to develop the peer review system and possibly to involve a wider range of experts, particularly for assessing aspects other than clinical and cost-effectiveness.
As a matter of policy, we are currently limiting production of evidence briefings to questions brought to us by decision-makers, rather than proactively seeking to identify topical and important issues. This could be seen as either a strength or a weakness in our approach. Analyzing real problems in collaboration with those directly affected should mean that research evidence is more likely to be used and have an impact on decision-making. Systematic reviews suggest that the interaction of decision-makers and researchers promotes uptake of research evidence. On the other hand, engagement with decision-makers has historically been a challenge for this type of service. The challenge for us has been to generate enough topics initially to get the service off the ground, particularly at a time of change and uncertainty in the English NHS. Demand for the service is expected to increase once the transition to a system of clinically-led commissioning has occurred because there will be more commissioning bodies with varying levels of expertise and access to resources to support evidence-informed decision-making.
Services that synthesize systematic reviews with other research evidence and context-specific information to answer a specific question are defined as ‘policy briefs’ in the taxonomy developed by Lavis. These services differ from ours in being primarily aimed at government or regional level decision-making. Another point of difference is that in the model of policy briefs described by Lavis and the SUPPORT Collaboration, it appears that the people producing the brief identify possible options to solve the problem being addressed, even if they do not make recommendations. Our preferred approach is to evaluate solutions already under consideration by decision-makers.
Some new services of this type have started since we did the searches for our scoping review, demonstrating widespread current interest in optimizing the use of systematic reviews by decision-makers. This may be related to the increasing pressure to make the best use of limited resources for healthcare in both developed and developing countries. The closest parallel to our service has been developed by the Ottawa Hospital Research Institute, Canada, working with the Champlain Local Health Integration Network, a commissioner of healthcare services. This service, known as ‘Knowledge to Action’, had produced 16 systematic review-based evidence summaries at the time of writing. However, these summaries appear to be mainly overviews of the systematic review literature (including quality assessment using AMSTAR), with less emphasis on the consideration of context, potential cost impact, implementation and health equity that is integral to our framework. In Africa, rapid response evidence services for national-level decision-makers, based on systematic reviews where possible, have been set up in Uganda and Burkina Faso as part of the EVIPNet (Evidence-Informed Policy Network) program supported by the World Health Organization.
As noted previously, there have been few formal published evaluations of these services, although it is likely that service providers have gathered substantial amounts of information that is not in the public domain. This reinforces the need for us to evaluate the perceived usefulness and use of the briefings that we produce. The eating disorders briefing was evaluated by means of a brief questionnaire. More sophisticated approaches to evaluation could be developed, although it may be unrealistic to expect high response rates from NHS commissioners and clinicians.
Further developments could include incorporation of local data supplied to us by the organization requesting the briefing and extraction of data from the wide range of resources available through, for example, the NHS Information Centre. If we were to attempt the latter, systematic and transparent methods for searching and using the data would need to be developed.
Peer review is another area that we are seeking to develop further, and it is especially important in relation to adapting the briefings for wider audiences. As we decide how best to balance the need for rapid information to support decision-making against the time required for rigorous peer review, it may be that, as with Effective Health Care, we will be able to recruit a core group of responsive peer reviewers. Post-publication peer review is also possible, with readers invited to submit comments for response.
Given that we are going beyond the boundaries of standard systematic reviews/HTAs, or even of other evidence briefing services, it will be particularly important to determine whether potential impacts on equity and implementation can feasibly be assessed using the approach outlined in this framework, or whether further expert input will be required.