Metacognition and evidence analysis instruction: an educational framework and practical experience
© Parrott and Rubinstein. 2015
Received: 23 December 2014
Accepted: 14 August 2015
Published: 21 August 2015
The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught.
The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.
Keywords: Evidence analysis, Instruction, Critical thinking, Metacognition
Our goal in this article is to highlight an aspect of evidence analysis instruction that has been largely overlooked in the literature, namely, the role of metacognitive skills in the evidence analysis process. While research focuses on skills needed at the discrete steps in the evidence analysis process (e.g., searching, article assessment), skills needed to integrate the steps have gone largely unexamined. In this article, we present a conceptual framework we use to distinguish the different types of metacognitive skills within the evidence analysis instruction process and then provide two examples of how freely available online evidence analysis tools (Systematic Review Data Repository and OpenMeta[Analyst]) can be used to facilitate instruction in these metacognitive skills.
Metacognition and self-regulation in the process of evidence analysis
Recent research in evidence analysis instruction focuses more on comparing modes of delivery than on the relative benefits of instructional approaches grounded in alternative theories of adult learning, or “andragogy” [1, 2]. This is not to say that courses teaching evidence analysis skills ignore andragogical theory, only that the grounding of the instructional approaches is rarely made explicit. The lack of direction toward more effective approaches for different audiences may reflect an absence of evidence rather than concrete evidence that different approaches are more or less equivalent.
Our purpose here is to highlight one particular aspect of evidence analysis instruction which is, to the best of our knowledge, absent from the current literature: the role of metacognitive strategies in integrating the commonly defined steps of the evidence analysis process. We provide a framework that locates both the place and the importance of these metacognitive skills in evidence analysis instruction and then provide an example of how free online tools can be leveraged to focus on these higher-order skills in graduate healthcare education. We ground our approach in adult learning (“andragogical”) theory [5, 6], emphasizing the role of both metacognition and self-regulation as meta-level elements of critical thinking [7, 8] involved in evidence analysis (though a detailed explication of these cognitive processes is beyond the scope of this article). Our hope is to provide a basis for critique, elaboration, and future research on this aspect of evidence analysis instruction. We offer our experience not as an exemplar or a prescription, but as a worked, applied example that may provide a framework and rationale for further research in this area. For this report, no student-level data were gathered, and the Rutgers University IRB concluded that the regulatory definition of human subjects research was not met (ID 20140001119).
Where do metacognitive skills fit in evidence analysis?
A key aspect of competence in the evidence analysis process is the critical integration of a set of skills toward a particular goal (whether that goal is production of a systematic review or meta-analysis (SRMA) or application to patient care). Application of the evidence analysis process—whether in the production or consumption of evidence analysis products—specifically involves integrating the skills of question formulation, evidence identification, analysis, critical synthesis, and evaluation. Toward that integration, it is the higher-order cognitive skills, what we are calling meta-strategic skills, that regulate cognition (i.e., planning, monitoring, and evaluation) and provide a mechanism for judgment and decision-making in the learning process [11–13]. Since successful application of evidence analysis skills requires students not just to master the discrete skills but to integrate those skills toward a concrete goal, self-regulated attention to these meta-strategic skills is needed.
Synchronic evaluation: multiple tools, approaches, and platforms are available at each step of the evidence analysis process. Students should have the ability to identify and differentially evaluate which may be most appropriate for their purpose. For instance, which evaluation tool or which data extraction platform is going to be better for their purpose?
Diachronic evaluation: what is done at one step of the evidence analysis process (e.g., what data is extracted?) depends on the goals of later steps in the process (e.g., what type of analysis is planned?). Diachronic meta-strategic skills focus on teaching students how to look ahead to later steps to plan and execute activities at the current step of the process.
Iterative evaluation: unsatisfactory results at one step in the process (e.g., inadequate search results) may be a result of problems at an earlier step (e.g., poor question formulation). Students need to be able to assess the results of the evidence analysis at each step and evaluate how results in earlier steps may have led to current results.
We have found that using this framework helps to identify common problems in students’ evolving competence in the evidence analysis process toward production of an SRMA and provides a way to determine whether the problems result from inadequacy with a discrete evidence analysis skill or from a deficit in meta-strategic thinking (e.g., difficulty seeing how planned analyses shape the information extracted from studies).
Identifying explicit meta-strategic skills can be a first step toward recognizing common problems students face and, in our experience, assists in the mentoring process and in recommending additional student resources. What tools are available to support a focus on these higher-order meta-strategic skills within evidence analysis instruction, and how can they be used to facilitate the development of those skills? In the next section, we provide two examples of how a suite of evidence analysis tools can be used not simply to teach discrete evidence analysis skills but to facilitate the development of potentially observable meta-strategic skills that we believe are integral to integrating these skills into a coherent process toward a concrete goal.
Two brief examples: using evidence analysis tools to teach meta-strategic skills
Table 1 Examples of meta-strategic links and SRDR tool uses within the evidence analysis process

Three types of meta-strategic skill are distinguished: synchronic (strategies for weighing alternatives), diachronic (assessing current activities in light of downstream goals), and iterative (evaluating current results based on previous activities).

Formulate question/conceptual framework (logic model)
- Synchronic: discriminate among question types; relevance of the question to practice; identify the need for preliminary background reading; discriminate among question components (PICO, PICOT, PIOS, etc.)
- Diachronic: availability of evidence to answer the question; alternative data sources; will available study designs answer the question?
- Iterative: do too many or too few results indicate that the question was inadequately formulated? How will different methods of reporting outcomes affect the way the question can be answered? If current SRMAs exist, how does the question for this SRMA provide new insight?

Search and select sources
- Iterative: are search terms adequate to capture comparisons made at the analytic step?
- Tool use (SRDR): Abstrackr facilitates consensus among project members for source selection.

Extract and analyze
- Synchronic: alternative platforms or extraction tools and the basis for choosing among them; what methods of analyzing data are available, and what are their relative benefits?
- Diachronic: what design, sample, or intervention/exposure characteristics are necessary for later analyses or conclusions?
- Iterative: does the presence of common confounders suggest that the conceptual framework was misspecified? Do the outcome measures reported address the question asked? Are outcome measures commensurate?
- Tool use (SRDR): tabular structure scaffolds the analytic framework for data extraction and a priori subgroup analyses; the outcome definition wizard motivates planning for the type of analysis; customizable questions allow for revision of the logic model; customizable fields force planning at two levels: (1) the information to be gathered and (2) the structure of fields (multiple choice, numerical entry, free text).

Synthesize
- Synchronic: what methods of synthesis are available, and what are their relative benefits and drawbacks? What are alternative methods of reporting outcomes?
- Diachronic: how might the synthesis plan need to change in light of available data?
- Iterative: does observed heterogeneity suggest that important extraction categories were missed? Are sources of heterogeneity relevant for application identified?
- Tool use (SRDR, OMA): high heterogeneity may indicate important moderator conditions missed in data extraction; the OMA wizard helps students identify an appropriate method of meta-analysis; OMA facilitates post hoc exploration of sources of heterogeneity.

Interpret and apply
- Synchronic: what are the various threats to confidence in the findings?
- Diachronic: what aspects of the analyses condition the application of findings?
- Iterative: were patterns between outcomes and study characteristics identified and analyzed?
In the section below, we provide more details on two examples of how metacognitive skills can be integrated within the evidence analysis instruction process. We describe how the three meta-strategic skills (synchronic, diachronic, iterative) are implicated when formulating a conceptual framework and analyzing heterogeneity. As indicated in Table 1, metacognitive skills are implicated at other evidence analysis steps as well. So, the examples are meant to be illustrative, not comprehensive. Additionally, in order to increase the practical utility of our examples, we offer descriptions of our experience of the level of instructor involvement and the placement of the meta-strategic skill activities within the instruction sequence.
Our example uses two evidence analysis tools available from the Systematic Review Data Repository (SRDR) websites [16, 17]: the main website, which provides a free, customizable data extraction tool [18, 19], and OpenMeta[Analyst], a free downloadable meta-analysis program. While neither SRDR nor OpenMeta[Analyst] was deliberately designed as a tool for teaching inexperienced students how to create SRMAs, both incorporate user supports (e.g., tutorials, wizards) relevant to operating the applications, and both provide constructive scaffolding suitable not only for learning the process but also for teaching more general evidence analysis and, crucially, meta-strategic skills.
Example 1: developing a conceptual framework for data extraction
Choosing among possible options for fields and field structure also has a temporal dimension. Beyond merely helping students recognize the data entry options available to them at a particular point, we encourage them to consider which of these alternatives should be included in light of their initial question and ultimate analytic goals. Students are encouraged to consider, during their initial reading, what aspects of study design, treatment/exposure characteristics, and sample characteristics could reasonably affect the outcomes. They are then encouraged to propose a small set of a priori subgroup analyses and determine what information would need to be systematically gathered from the studies in order to carry out those analyses. Collecting data on potential effect modifiers or context contingencies rarely occurs to the students without explicit instruction, but is facilitated by the scaffolding of the SRDR extraction tool.
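To make concrete how field structure commits a student to particular downstream analyses, consider a minimal sketch of an extraction template. The field names, types, and validation logic below are hypothetical, invented for illustration; they are not SRDR's actual schema.

```python
# Hypothetical extraction template; field names and types are
# illustrative only, not SRDR's actual schema.
template = {
    "design":    {"type": "choice", "options": ["RCT", "cohort", "case-control"]},
    "dose_mg":   {"type": "numeric"},   # a planned a priori subgroup variable
    "n_treat":   {"type": "numeric"},
    "n_control": {"type": "numeric"},
    "notes":     {"type": "free_text"},
}

def validate(record, template):
    """Check one extracted record against the template's field structure."""
    for field, spec in template.items():
        value = record[field]
        if spec["type"] == "choice" and value not in spec["options"]:
            raise ValueError(f"{field}: {value!r} not among {spec['options']}")
        if spec["type"] == "numeric" and not isinstance(value, (int, float)):
            raise ValueError(f"{field}: expected a number, got {value!r}")
    return True

record = {"design": "RCT", "dose_mg": 50, "n_treat": 120,
          "n_control": 118, "notes": "8% dropout"}
validate(record, template)
```

Declaring "design" as a multiple-choice field guarantees that a planned subgroup analysis by study design will find clean categories; had it been free text, the analysis step would inherit inconsistent labels.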
Customizable questions allow for revision of the conceptual framework or logic model. For example, as students read more deeply, they often intuit patterns they did not anticipate with the initial question (e.g., differences based on measurement device, subgroup, or treatment or exposure characteristic). Potential confounders are identified that indicate a more nuanced or detailed logic model is warranted. This highlights the difficulty of formulating an adequate question and logic model without substantial exposure to the content area: the initial questions are not always the best questions to be asking.
Instructor involvement at this stage of the process is intensive.
Instructors provide refresher sessions on differences in study design and advise students on how the structure of fields will affect the form of downloaded data (and how this, in turn, will affect which analyses are feasible). Refresher instruction on differences in outcome measurement (categorical versus continuous versus time-to-event) and common statistical measures is often useful. Concrete examples of what different data structures look like in matrix format can help students see what they will be dealing with in coming sessions (e.g., showing how data in spreadsheet form can be sorted on various characteristics and thus offer different perspectives on the outcomes). Feedback on each student’s extraction template is facilitated by the collaborative features of the tool, which give instructors ready access to every student’s SRDR extraction template and make the student’s progress transparent.
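The spreadsheet-sorting example above can be sketched briefly in code. The study names, designs, and effect sizes below are invented for illustration, not drawn from any student project.

```python
import pandas as pd

# Invented extracted data in spreadsheet (matrix) form: one row per
# study, one column per extracted characteristic or outcome.
df = pd.DataFrame({
    "study":  ["A", "B", "C", "D"],
    "design": ["RCT", "cohort", "RCT", "cohort"],
    "effect": [0.42, 0.10, 0.38, 0.05],
})

# Sorting on a characteristic gives one perspective on the outcomes...
by_size = df.sort_values("effect", ascending=False)

# ...while grouping by the same characteristic previews a subgroup analysis.
by_design = df.groupby("design")["effect"].mean()
print(by_size, by_design, sep="\n")
```

The same sorted and grouped views let students see, before any formal analysis, that how a field was structured at extraction determines which of these perspectives are even possible.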
Specialists in the student’s field and content area are vital for helping students to understand and identify contingencies in treatment/exposure/diagnostic measurement that may affect the results of the study. Once the student has created an initial extraction framework, advisors provide detailed feedback using a rubric based on the structure of the SRDR tabular format. This allows the student to make targeted changes. Very few students create an adequate data extraction form on their first attempt.
We have found that it is helpful to first have students carry out a data extraction exercise on an article and extraction framework we provide. In this session, students focus on two skills: familiarizing themselves with the online platform (tabs, tools, resources) and study “interrogation” skills. This is followed by an exercise where students create their initial customized data extraction tool. At this step, the students are working on two key skills: (a) tying the creation of the data tool back to their initial logic model (i.e., the analytical framework) as well as forward to their planned analysis and (b) considering how to structure data entry to facilitate analyses. Because of the nature of diagnostic accuracy questions, and the substantially different methodological approach entailed, two separate “tracks” within the course offer tailored instruction for data extraction and framework construction for students asking diagnostic accuracy questions versus other types of question (e.g., treatment, prognosis, etc.).
Example 2: understanding heterogeneity
While students have been encouraged earlier to formulate a priori hypotheses regarding differences in results based on design, treatment, or sample characteristics, the results of their initial analyses often lead to post hoc exploration of other potential sources of heterogeneity and push them to consider how to explain differences among studies (e.g., one study may find very different results from the others) in the “Discussion” section of the paper. Students are encouraged to interpret their findings with respect to (a) suggesting explanations for studies that “do not fit,” (b) the level of heterogeneity found and its implications for confidence, and (c) identifying explanations that make sense of the results in light of the relationships between individual study characteristics/findings and patterns across studies (IOM standard 4.2.4) and in light of the relevance of individual studies to the “populations, comparisons, cointerventions, settings, and outcomes or measures of interest” (IOM standard 4.2.5).
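The heterogeneity statistics students interpret at this step can be sketched in a few lines. This is the standard inverse-variance arithmetic for Cochran's Q and I-squared, not OMA's internal code, and the effect sizes and variances are invented for illustration.

```python
# Fixed-effect inverse-variance pooling with Cochran's Q and I^2.
# Effect sizes (e.g., log odds ratios) and variances are invented.
effects   = [0.50, 0.45, 0.10, 0.55]
variances = [0.01, 0.02, 0.01, 0.02]

weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Cochran's Q: weighted squared deviations from the pooled estimate.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
dof = len(effects) - 1
# I^2: percentage of total variation attributable to between-study
# heterogeneity rather than chance.
i2 = max(0.0, (q - dof) / q) * 100

print(f"pooled = {pooled:.3f}, Q = {q:.2f}, I^2 = {i2:.0f}%")
```

Here the outlying third study (effect 0.10) pushes I^2 to roughly 73%, exactly the kind of result that should send a student back to the extraction categories in search of a moderator that sets that study apart.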
Students initially extract data in the format reported by authors. However, this format is not always commensurate with students’ specific research question and analytic goals. Thus, students are introduced to the notion that the measures needed for their meta-analysis can sometimes be computed or estimated, using a calculator available in OMA (or online calculators), from the information provided in the article to more closely fit their analytic needs.
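As one example of the kind of conversion such a calculator automates, a log odds ratio and its standard error can be recovered from reported 2x2 counts. The function and counts below are invented for illustration; this is the textbook arithmetic, not the code of any particular calculator.

```python
import math

def log_odds_ratio(a, b, c, d):
    """Log odds ratio and its standard error from 2x2 counts:
    a/b = events/non-events (treatment), c/d = events/non-events (control)."""
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return log_or, se

# Invented counts: 12/60 events in the treatment arm, 24/60 in control.
log_or, se = log_odds_ratio(a=12, b=48, c=24, d=36)
lo, hi = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
print(f"OR = {math.exp(log_or):.3f}, 95% CI {lo:.2f} to {hi:.2f}")
```

The same log-scale interval arithmetic underlies the forest plots students produce later; whether a given calculator applies a continuity correction for zero cells is one of the synchronic choices worth flagging to students.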
In our experience, instructor involvement outside course lectures and assignment grading is relatively light, as students become increasingly self-reliant with the content and mode of learning, and increasingly engage in conversations with their student colleagues in an interprofessional mix. Methodological instructors may be tapped for additional input on analysis or conversion of measures, especially when the student is drawing on observational studies. Because of statistical issues, students are discouraged from meta-regression, though this is permitted when the student has a stronger statistical background and is willing to do additional reading. Content advisors are occasionally asked for input regarding subgroup analyses.
Synthesis and interpretation are covered over the course of three sessions. The first introduces meta-analysis, and students complete an assignment with data provided to them (from a systematic review of treatment studies). The second session introduces subgroup meta-analysis and emphasizes the role of heterogeneity (and its analysis) in formulating more nuanced conclusions. Again, data are provided to the students, but this time on a diagnostic accuracy question where results are subgrouped by measurement device. This exposes students from all disciplines to meta-analytic procedures and statistical measures they might otherwise have little exposure to; in our experience, presenting students with varying methodological examples reinforces learning in a constructivist sense.
We offer the above conceptual framework and examples of meta-strategic instruction in action as a point of departure for critique, elaboration, and future research. Indeed, if we want to be able to empirically test whether an explicit focus on meta-strategic skills in evidence analysis instruction leads to better mastery of the discrete evidence analysis skills or a more nuanced approach to appraising the quality of systematic reviews and meta-analyses, then we must first have some sense of what these skills are and how they may come into play. In short, before we can test, we must first identify and measure the metacognitive processes involved in evidence analysis.
In our experience, encouraging students to attend critically to their thinking about the integrative process of evidence analysis has resulted in higher-quality performance, both in terms of discrete skills and the final product (a limited-scope SRMA). We recognize, however, that our approach may not fit every situation, and several questions remain to be explored: Can a focus on metacognitive skills be applied in other instructional settings, and to what degree? What student outcomes result from this meta-strategic focus? For what kinds of adult learners is this focus appropriate?
Abbreviations
PICO: question formulation heuristic comprising population, intervention, comparator, outcome
PICOT: question formulation heuristic comprising population, intervention, comparator, outcome, time (duration of data collection)
PIOS: question formulation heuristic comprising population, intervention, outcome, study design
SRDR: Systematic Review Data Repository
The authors would like to acknowledge the critical feedback of Joseph Lau and Nira Hadar on an early version of this manuscript. All deficiencies and errors remain those of the authors. No funding was received for this research.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
- Ilic D, Maloney S. Methods of teaching medical trainees evidence-based medicine: a systematic review. Med Educ. 2014;48:124–35.
- Horsley T, Hyde C, Santesso N, Parkes J, Milne R, Stewart R. Teaching critical appraisal skills in healthcare settings. Cochrane Database Syst Rev. 2011;(11):CD001270. doi:10.1002/14651858.CD001270.pub2.
- MacRae HM, Regehr G, McKenzie M, Henteleff H, Taylor M, Barkun J, et al. Teaching practicing surgeons critical appraisal skills with an Internet-based journal club: a randomized, controlled trial. Surgery. 2004;136:641–6.
- Taylor B, Kroth M. Andragogy’s transition into the future: meta-analysis of andragogy and its search for a measurable instrument. J Adult Educ. 2009;38:1–11.
- McGrath V. Reviewing the evidence on how adult students learn: an examination of Knowles’ model of andragogy. Ir J Adult Community Educ. 2009;2009:99–110.
- Chan S. Applications of andragogy in multi-disciplined teaching and learning. J Adult Educ. 2010;39:25–35.
- Mulnix JW. Thinking critically about critical thinking. Educ Philos Theory. 2012;44:464–79.
- Facione PA. Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Millbrae, CA: The California Academic Press; 1990.
- Barend EGR, Briner RB. Teaching evidence-based practice: lessons from the pioneers: an interview with Amanda Burls and Gordon Guyatt. Acad Manage Learn Educ. 2014;13:483.
- Kuhn D. A developmental model of critical thinking. Educ Res. 1999;28:16–26.
- Veenman M, Hout-Wolters B, Afflerbach P. Metacognition and learning: conceptual and methodological considerations. Metacognition Learn. 2006;1:3–14.
- Weber EU, Johnson EJ. Mindful judgment and decision making. Annu Rev Psychol. 2009;60:53–85.
- Schraw G, Crippen KJ, Hartley K. Promoting self-regulation in science education: metacognition as part of a broader perspective on learning. Res Sci Educ. 2006;36:111–39.
- Abrami PC, Bernard RM, Borokhovski E, Wade A, Surkes MA, Tamim R, et al. Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev Educ Res. 2008;78:1102–34.
- Schraw G. Promoting general metacognitive awareness. Instr Sci. 1998;26:113–25.
- Systematic Review Data Repository Training Site. http://srdr.training.ahrq.gov/. Accessed 7 August 2015.
- Systematic Review Data Repository. http://srdr.ahrq.gov/. Accessed 7 August 2015.
- Li T, Vedula S, Hadar N, Parkin C, Lau J, Dickersin K. Innovations in data collection, management, and archiving for systematic reviews. Ann Intern Med. 2015;162:287–94.
- Ip S, Hadar N, Keefe S, Parkin C, Iovin R, Balk EM, et al. A Web-based archive of systematic review data. Syst Rev. 2012;1:15.
- OpenMeta[Analyst]. http://www.cebm.brown.edu/openmeta/. Accessed 7 August 2015.
- Eden J, Levit L, Berg A, Morton S. Finding what works in health care: standards for systematic reviews. Washington, DC: The National Academies Press; 2011.