
Research on simulation in radiography education: a scoping review protocol

Abstract

Background

Today, health care students and staff have fewer opportunities to train their skills through direct patient contact. The World Health Organization therefore recommends learning about patient safety through hands-on experience and simulation. Simulation allows skills training in a controlled environment and has a positive effect on knowledge and skills, and even on patient-related outcomes. Reviews addressing the use of simulation across the different radiography specialties are lacking. Further knowledge on simulation in radiography education is needed to inform curriculum design and future research. The purpose of this scoping review is to explore, map, and summarize the extent, range, and nature of published research on simulation in radiography education.

Methods

We will follow the methodological framework for scoping reviews originally described by Arksey and O’Malley. We will search MEDLINE, Embase, Epistemonikos, The Cochrane Library, ERIC, Scopus, and sources of grey literature. A comprehensive search strategy for Ovid MEDLINE was developed in collaboration with a research librarian. A full electronic search of Ovid MEDLINE (1641 records, January 9, 2020) is provided as an example and will be used to adapt the search strategy to each database. Two review authors will independently screen all titles and abstracts and, in a second stage, all full-text publications. Next, they will extract data from each included study using a data extraction form informed by the aim of the study. A narrative account of all included studies will be presented. We will present a simple numerical analysis of the extent, nature, and distribution of the studies, and we will use content analysis to map the different simulation interventions and learning design elements reported. Any type of simulation intervention within all types of radiography specializations will be included. Our search strategy is not limited by language or date of publication.

Discussion

An overview of publications on simulation in radiography education across all radiography specialties will help to inform future research and will be useful for stakeholders within radiography education using simulation, both in the academic and clinical settings.

Systematic review registration

Open Science Framework (OSF). Submitted on October 18, 2020


Background

New technology and methods for diagnosis and treatment require that health personnel keep abreast of new practices [1]. Traditionally, clinical and communication skills were taught at the bedside [2]. However, in the clinical environment, it is challenging to make adequate observations, to provide feedback, and to find enough time for reflection and discussion [2]. In addition, changes in health care delivery have resulted in shorter hospital stays and reduced patient availability for learning [3]. Several other factors make clinical teaching challenging: patients may be too sick or unwilling to participate in teaching, and increasing efficiency demands leave less time for patient consultations [2]. The World Health Organization (WHO) recognizes that patient safety knowledge applies to all areas of practice and to all health care professions ([4], p. 22). To facilitate this, the WHO has provided a Patient Safety Curriculum Guide, which recommends learning about patient safety through hands-on experience and simulation ([4], p. 84). Simulation is an important pedagogical method widely used by the health care professions and may involve a range of learning activities [5]. Motola et al. highlight that simulation has the potential to improve skills and skill retention through training in a controlled environment. Results from systematic reviews show that simulation has a positive effect on learning knowledge and skills [6,7,8] and can potentially improve patient-related outcomes [6, 9,10,11,12,13].

Issenberg and Scalese [3] state that the aim of simulation is “to imitate real patients, anatomic regions, or clinical tasks or to mirror the real-life situations in which medical services are rendered.” Different types of simulators are used: part-task trainers, simulated patients, simulated environments, virtual reality and haptic systems, computer-based systems, and integrated simulators (instructor-driven or model-driven) [14]. Simulation is frequently described as high or low fidelity [15]. A simulation that offers complex and immersive scenarios with realistic feedback is described as high fidelity [16]; a static model or task trainer that feels less real to the learner and offers little or no responsiveness is referred to as low fidelity [17]. To achieve optimal and efficient use of resources when designing simulation-based activities, it is recommended to perform a needs assessment, define learning outcomes, design a scenario that provides the context for the simulation (including the levels of fidelity), ensure a facilitative approach, conduct pre-briefing, debriefing, and feedback/evaluation, make resources available for preparing the participants, and pilot test the simulation scenario before implementation [18].

Simulation is regarded as a highly suitable strategy for learning radiography [11, 19]. The use of simulation in radiography education has been shown to enhance radiographers’ perceptions of self-efficacy and critical thinking skills in image evaluation and patient assessment ([19], p.93). Professional practice in radiography is characterized by the use of advanced technologies and equipment for diagnostic purposes or for the treatment of medical conditions [20]. Important skills for simulation-based learning relate to positioning, exposure, physics, patient care, and quality assurance ([19], p.52). Students need opportunities to practice in a safe environment to ensure quality in the profession, and simulation offers the possibility for training without putting the patient at risk [3]. Simulation also offers the benefit of repeated practice of learning outcomes, which promotes cognitive recall and higher confidence with clinical tasks [21,22,23]. The term radiographer refers to “professional roles in the fields of diagnostic imaging, nuclear medicine, interventional radiology, and radiation therapy” ([24], p.20).

Simulation in radiography has previously been addressed in a literature review focusing on the simulation of conventional diagnostic radiography [11]. Most studies published after this review had small sample sizes and evaluated different aspects of simulation [5, 21, 25,26,27,28,29,30,31]. Several of these studies used mixed methods [27]. Topics covered include emotional preparedness when encountering open wounds [31] or clinical burns cases [29], confidence levels before and after simulation [30], and perceptions of learning in different high-fidelity computed tomography simulation environments [27]. Other experimental studies compared the use of virtual reality versus traditional placements [26], virtual reality against existing simulation techniques [25], and virtual reality against clinical role-play [21]. Simulation was also compared against traditional therapeutic radiography placements in a randomized controlled trial [5].

According to Lee et al. [27], reviews addressing the use of simulation across the different radiography specialties are lacking. Further knowledge on simulation in radiography education is needed to inform curriculum design and future research. The aim of this proposed scoping review is to explore, map, and summarize the extent, range, and nature of published research on simulation in radiography education. To achieve the aim of this review, we will:

  1. Explore the extent and range of simulation research conducted in radiography education (e.g., publication dates, volumes, yearly distributions, proportions, geographical locations)

  2. Explore research methods and designs used in research on simulation in radiography education (e.g., purposes, contexts, study populations, sample sizes, designs, and methods for data collection)

  3. Explore simulation interventions reported in research on simulation in radiography education

Methods

We will follow the methodological framework for scoping reviews originally described by Arksey and O’Malley [32] and later advanced by Levac et al. [33] and by Khalil et al. [34]. This framework consists of the following five stages: (1) identifying the research question by clarifying and linking the purpose and research question, (2) identifying the relevant studies using a three-step literature search in order to balance the breadth and comprehensiveness, (3) careful selection of the studies using a team approach, (4) charting the data in a tabular and narrative format, and (5) collating the results to identify the implications of the study findings for policy, practice, or research [34].

We drafted the protocol using the Preferred Reporting Items for Systematic Reviews and Meta-analysis Protocols (PRISMA-P) checklist (Additional file 1) [35]. For the scoping review, we plan to follow the newly developed reporting guideline for scoping reviews, the PRISMA Extension for Scoping Reviews (PRISMA-ScR) [36]. We have registered our protocol in the Open Science Framework. Due to the iterative nature of the scoping review methodology, changes to the protocol may occur; we will report any such changes.

Eligibility and exclusion criteria

We will include research publications that involve radiography students, faculty in radiography education, and/or clinicians, and publications that describe and/or evaluate any type of simulation intervention within any type of professional radiography specialization. All empirical and theoretical/conceptual peer-reviewed publications and grey literature that focus on simulation in radiography education will be considered for inclusion. We will exclude publications with non-research study designs (e.g., editorials, discussion/opinion papers, guidelines, letters, and non-systematic reviews). All empirical and conceptual publications must have a clearly stated abstract and aim. No language or year restrictions will be applied, nor any restrictions regarding publication status. In line with the Joanna Briggs Institute Reviewer’s Manual [37], the detailed inclusion criteria of this scoping review are specified as population, concept, context, and types of sources of evidence (Table 1).

Table 1 Study eligibility

Search strategy and information sources

We will search MEDLINE, Embase, Epistemonikos, The Cochrane Library, ERIC and Scopus. To identify grey literature, we will search OpenGrey and Google Scholar. We will search the reference lists and citations of included studies to identify additional, relevant references. The searches will be re-run just before the final analyses to retrieve further studies for inclusion.

We developed a comprehensive search strategy for Ovid MEDLINE in collaboration with an experienced research librarian. As an example, a full electronic search using search terms for simulation in radiography education in Ovid MEDLINE yielded 1641 records on January 9, 2020 (Additional file 2). We will adapt the Ovid MEDLINE search strategy to each database. As the search strategy for scoping reviews is considered an iterative process [32,33,34], we will evaluate the initial search results and any need for improvement during the review process. Records will be exported to EndNote X9 [38] for data management, removal of duplicates, and retrieval of full texts.

Study selection

For the selection of eligible studies, we will use the Rayyan screening tool [39]. Based on the inclusion criteria (Table 1), two review authors will independently screen the titles and abstracts of the retrieved studies. Two review authors will then independently assess the acquired full-text publications for eligibility. Any disagreements regarding eligibility will be resolved by discussion between the two review authors, with a third reviewer resolving disagreement if needed. If full-text articles are excluded, the reasons will be presented in an appendix. To ensure consistency and reliability in the study selection process, we will pilot the study selection on a random sample of 50 titles and abstracts from the literature search before formal screening starts. We will revise until we achieve a percentage agreement of > 80% across reviewers. The selection process will be illustrated in a flow chart, as recommended in the PRISMA statement [40].
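The > 80% agreement threshold amounts to a simple calculation. As a minimal sketch (the screening decisions below are hypothetical), percentage agreement between the two reviewers could be computed as:

```python
from typing import List

def percent_agreement(decisions_a: List[bool], decisions_b: List[bool]) -> float:
    """Share of pilot records on which both reviewers made the same
    include/exclude decision, expressed as a percentage."""
    if len(decisions_a) != len(decisions_b):
        raise ValueError("Both reviewers must screen the same records")
    matches = sum(a == b for a, b in zip(decisions_a, decisions_b))
    return 100 * matches / len(decisions_a)

# Hypothetical decisions on ten pilot records; True = include
reviewer_1 = [True, True, False, False, True, False, True, True, False, False]
reviewer_2 = [True, False, False, False, True, False, True, True, False, True]
print(f"{percent_agreement(reviewer_1, reviewer_2):.0f}% agreement")  # 8/10 -> 80%
```

A chance-corrected statistic such as Cohen’s kappa could be used instead; plain percentage agreement is shown here because it is the measure the protocol names.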

Data charting process

Based on the population, concept, context, and types of sources of evidence as outlined in Table 1, the research team will develop a data extraction form using spreadsheets. Prior to the full data extraction, we will pilot the data extraction form using a sample of 10 studies to determine agreement within the research team, and as such, this will be an iterative process. Two review authors will then independently read and extract data from each included study using this data extraction form. In line with the purpose of this scoping review, we plan to extract the following data:

  1. Population: study population (student, clinician, faculty), age, sex, level of education (year of study, undergraduate, postgraduate), inclusion and exclusion criteria, needs assessment (e.g., equipment, human resources), number of participants in intervention group/control group, sample size, and data about previous experience with simulation

  2. Concept: type of intervention (scenario/task/activity, facilitative approach (pre- and debriefing/feedback), manikin or standard patient intervention, virtual reality), overall aim of the simulation (learning outcomes), type of skills, assessment after training (formative or summative evaluation), pedagogical rationales, integrated into curriculum (yes/no), duration (h), fidelity (equipment, environmental and psychological fidelity), settings (educational or healthcare institution or others), comparator, type of outcomes (educational, patient), cost measures used (yes/no), and type of cost measures

  3. Context: type of institution performing the simulation and type of radiography specializations

  4. Types of sources of evidence: title, year of publication, volume, author, country, study objective/purpose, type of study, research method (design, number of study participants/sample size, data collection), results, and conclusions
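A spreadsheet-based extraction form of this kind can be sketched as a CSV template. The column names below are illustrative abbreviations of the fields listed above, not the actual form, which will be developed and piloted by the research team:

```python
import csv

# Illustrative column headers grouped by the four charting dimensions;
# the real form will be refined iteratively during piloting.
FIELDS = [
    # Population
    "study_population", "age", "sex", "education_level", "sample_size",
    # Concept
    "intervention_type", "learning_outcomes", "fidelity", "setting",
    "comparator", "outcome_type", "cost_measures",
    # Context
    "institution_type", "radiography_specialization",
    # Types of sources of evidence
    "title", "year", "authors", "country", "study_design", "results",
]

with open("extraction_form.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # One hypothetical row, as a reviewer would enter it during extraction
    writer.writerow({"title": "Example study", "year": "2019",
                     "radiography_specialization": "computed tomography"})
```

Because two reviewers extract data independently, each would fill a copy of the template, and the copies would then be compared to resolve discrepancies.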

Analysis of the evidence

In this review, we aim to present an overview and a narrative account of all included studies. We will present our results in two ways. First, we will quantitatively summarize the data related to the extent, nature, and distribution of the studies. This simple numerical analysis will provide an overview and point to significant knowledge gaps. Second, we will use content analysis [41] to map the different simulation interventions and learning design elements reported (e.g., teaching and learning activities, curriculum, pedagogical theory, assessment strategies, and learning outcomes). Reporting guidelines for interventions (e.g., TIDieR [42]) will be used to structure the presentation of the reported interventions. We will use Kirkpatrick’s four-level model [43] as a framework for analyzing the different learning outcomes reported.
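The simple numerical analysis could, for instance, tally included studies by year and by country. In this sketch, the records are hypothetical placeholders for the extracted data:

```python
from collections import Counter

# Hypothetical extracted records: (publication year, country, specialization)
records = [
    (2018, "Australia", "medical imaging"),
    (2019, "UK", "diagnostic radiography"),
    (2019, "UK", "therapeutic radiography"),
    (2020, "Australia", "computed tomography"),
]

# Frequency counts for the extent and distribution of studies
by_year = Counter(year for year, _, _ in records)
by_country = Counter(country for _, country, _ in records)

print("Publications per year:", dict(sorted(by_year.items())))
print("Publications per country:", dict(by_country))
```

The same counting approach extends directly to the other charted variables, such as study design or radiography specialization.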

Discussion

To the best of our knowledge, this will be the first scoping review identifying the research published on simulation in radiography education. This process will provide an overview of the current state of evidence in research about simulation in radiography education, and we will be able to identify in which research areas systematic reviews or primary research are needed.

The strength of this review is its transparent and reproducible procedure. In our protocol, we have presented a detailed description of the population, concept, and context, the data sources, the search strategy, and the data extraction and analysis. We will not limit the review to certain kinds of simulation or settings, because the radiography profession performs a wide range of clinical tasks, including diagnostic imaging and treatment procedures combined with patient-related care. We anticipate that this review will be useful for stakeholders within radiography education, in both academic and clinical settings. Our search strategy is broad, which may result in a high number of redundant records. The search terms may be changed or expanded during the search process due to the iterative method. In this scoping review, we will assess neither the impact nor the quality of the identified simulation interventions.

Ethics and dissemination

Ethical approval or consent to participate is not required, as the data in this study consist of published studies, not individual data from human or animal participants. The research project will be carried out in accordance with the Helsinki Declaration.

Funding

We will report the sources of funding for the included sources of evidence. For the proposed scoping review itself, there is no funding to declare.

Availability of data and materials

Not applicable, as this is a protocol for a scoping review. The search strategy and preliminary search results will be available as an additional file.

Abbreviations

WHO:

World Health Organization

JBI:

Joanna Briggs Institute

PRISMA-P:

Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols

PRISMA-ScR:

Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews

EFRS:

European Federation of Radiographer Societies

EQF:

European Qualifications Framework

References

  1. Issenberg SB, McGaghie WC, Petrusa ER. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27. https://doi.org/10.1080/01421590500046924.

  2. Ramani S, Leinster S. AMEE guide no. 34: teaching in the clinical environment. Med Teach. 2008;30(4):347–64.

  3. Issenberg SB, Scalese RJ. Simulation in health care education. Perspect Biol Med. 2008;51(1):31–46.

  4. World Health Organization. WHO Multi-professional Patient Safety Curriculum Guide. 2011. p. 272.

  5. Ketterer S-J, Callender J, Warren M, Al-Samarraie F, Ball B, Calder K-A, et al. Simulated versus traditional therapeutic radiography placements: a randomised controlled trial. Radiography. 2019.

  6. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–88.

  7. Cook DA, Hamstra SJ, Brydges R. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach. 2013;35. https://doi.org/10.3109/0142159x.2012.714886.

  8. Boet S, Bould MD, Fung L, Qosa H, Perrier L, Tavares W, et al. Transfer of learning and patient outcome in simulated crisis resource management: a systematic review. Canadian Journal of Anesthesia/Journal canadien d’anesthésie. 2014;61(6):571–82.

  9. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: a systematic review. J Gen Intern Med. 2013;28(8):1078–89. https://doi.org/10.1007/s11606-012-2264-5.

  10. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(Suppl):S42.

  11. Shiner N. Is there a role for simulation based education within conventional diagnostic radiography? A literature review. Radiography. 2018;24(3):262–71.

  12. McGaghie WC, Issenberg SB, Cohen MER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706.

  13. Kononowicz AA, Woodham L, Georg C, Edelbring S, Stathakarou N, Davies D, et al. Virtual patient simulations for health professional education. The Cochrane Library. 2016.

  14. Maran NJ, Glavin RJ. Low- to high-fidelity simulation: a continuum of medical education? Med Educ. 2003;37:22–8.

  15. Hamstra SJ, Brydges R, Hatala R, Zendejas B, Cook DA. Reconsidering fidelity in simulation-based training. Acad Med. 2014;89(3):387–92. https://doi.org/10.1097/acm.0000000000000130.

  16. Massoth C, Röder H, Ohlenburg H, Hessler M, Zarbock A, Pöpping DM, et al. High-fidelity is not superior to low-fidelity simulation but leads to overconfidence in medical students. BMC Med Educ. 2019;19(1):29.

  17. Brydges R, Carnahan H, Rose D, Rose L, Dubrowski A. Coordinating progressive levels of simulation fidelity to maximize educational benefit. Acad Med. 2010;85(5):806–12.

  18. INACSL Standards Committee. INACSL standards of best practice: Simulation℠ Simulation design. Clin Simul Nurs. 2016;12:S5–S12. https://doi.org/10.1016/j.ecns.2016.09.005.

  19. Chiu J, Inserra A, Kelly T, Mangione R, Morote E-S, Tatum S. Radiographer level of simulation training, critical thinking skills, self-efficacy, and clinical competence. ProQuest Dissertations Publishing; 2013.

  20. European Federation of Radiographer Societies. European Qualifications Framework (EQF) benchmarking document. 2018. https://api.efrs.eu/api/assets/publications/143. Accessed 20 Oct 2020.

  21. Sapkaroski D, Mundy M, Dimmock M. Virtual reality versus conventional clinical role-play for radiographic positioning training: a students’ perception study. Radiography. 2020;26(1):57–62.

  22. Boling B, Hardin-Pierce M. The effect of high-fidelity simulation on knowledge and confidence in critical care training: an integrative review. Nurse Educ Pract. 2016;16(1):287–93.

  23. Wibell CJ. Principles of learning: 7 principles to guide personalized, student-centered learning in the technology-enhanced, blended learning environment. 2011. https://principlesoflearning.wordpress.com/2011/06/01/principle-of-learning-4-practice/. Accessed 4 July 2011.

  24. European Federation of Radiographer Societies. European Qualifications Framework (EQF) benchmarking document. Utrecht, the Netherlands; 2018. https://api.efrs.eu/api/assets/publications/143. Accessed 20 Oct 2020.

  25. Sapkaroski D, Baird M, McInerney J, Dimmock MR. The implementation of a haptic feedback virtual reality simulation clinic with dynamic patient interaction and communication for medical imaging students. J Med Radiat Sci. 2018;65(3):218–25.

  26. Gunn T, Jones L, Bridge P, Rowntree P, Nissen L. The use of virtual reality simulation to improve technical skill in the undergraduate medical imaging student. Interact Learn Environ. 2018;26(5):613–20.

  27. Lee K, Baird M, Lewis S, McInerney J, Dimmock M. Computed tomography learning via high-fidelity simulation for undergraduate radiography students. Radiography. 2020;26(1):49–56.

  28. Harcus J, Snaith B. Expanding training capacity for radiographer reporting using simulation: evaluation of a pilot academy project. Radiography. 2019;25(4):288–93.

  29. Shiner N, Howard M. The use of simulation and moulage in undergraduate diagnostic radiography education: a burns scenario. Radiography. 2019;25(3):194–201.

  30. Kloc L, Ballor C, Boldt K, Curry R. Using scenario-based simulation to address affective behaviors in sonography students. J Diagn Med Sonogr. 2019;35(2):113–24.

  31. Shiner N. Can simulation impact on first year diagnostic radiography students’ emotional preparedness to encounter open wounds on their first clinical placement: a pilot study. Radiography. 2019;25(4):294–300.

  32. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

  33. Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5(1):69.

  34. Khalil H, Peters M, Godfrey CM, McInerney P, Soares CB, Parker D. An evidence-based approach to scoping reviews. Worldviews Evid Based Nurs. 2016;13(2):118–23.

  35. Moher D, Shamseer L, Clarke M. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4. https://doi.org/10.1186/2046-4053-4-1.

  36. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73. https://doi.org/10.7326/M18-0850.

  37. Peters MDJ, Godfrey C, McInerney P, Munn Z, Tricco AC, Khalil H. Chapter 11: Scoping reviews (2020 version). In: Aromataris E, Munn Z, editors. Joanna Briggs Institute Reviewer’s Manual. JBI; 2020.

  38. Web of Science Group. EndNote X9. 2020.

  39. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan: a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210.

  40. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–9.

  41. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15.

  42. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. https://doi.org/10.1136/bmj.g1687.

  43. Johnston S, Coyer FM, Nash R. Kirkpatrick’s evaluation of simulation and debriefing in health care education: a systematic review. J Nurs Educ. 2018;57(7):393–8.


Acknowledgements

We acknowledge the contribution to the search strategy and preliminary search result from the research librarians Gunhild Austrheim and Gøril Tvedten Jorem.


Author information


Contributions

All authors have contributed to developing this protocol, the review questions, and the review design. MV, NRO, and the research librarian designed the search strategy, which was reviewed by a second research librarian. MV and NRO jointly developed the data extraction framework. All authors read and approved the final protocol and manuscript.

Corresponding author

Correspondence to Mona Vestbøstad.

Ethics declarations

Ethics approval and consent to participate

This study consists of published studies and not individual data from human or animal participants.

Consent for publication

This study does not contain any individual person’s data.

Competing interests

There are no financial or non-financial competing interests to declare.


Supplementary Information

Additional file 1.

PRISMA-P checklist

Additional file 2.

Full electronic search in one database

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Vestbøstad, M., Karlgren, K. & Olsen, N.R. Research on simulation in radiography education: a scoping review protocol. Syst Rev 9, 263 (2020). https://doi.org/10.1186/s13643-020-01531-2


Keywords

  • Simulation
  • Radiography
  • Education