
Table 1 Characteristics of the instruments used to measure SA in HCPs and description of the included studies

From: Measuring situation awareness in health care providers: a systematic review of measurement properties using COSMIN methodology

| Instrument name | Type of measure | Number of subscales | Total items | Response options | Reference | Country | Study participants | Setting (clinical vs. simulation) | Measurement properties |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Situation Awareness Global Assessment Technique (SAGAT) | NR | 3 | 39 | NR | Dishman et al., 2020 [9] | USA | 49 nurse anesthetists; 7 experts | Simulation (scenario of induction of general anesthesia) | Content validity |
| | Open-ended questions | | 3 | Correct 1, Incorrect 0, Partially correct 0.5 | Gardner et al., 2017 [17] | USA | 43 medical students | Simulation (advanced cardiac life support scenarios) | Criterion validity |
| | Pen-and-paper version of the instrument | | 31 | Yes/No for level 1; 2 possible answers for level 3 | Lavoie et al., 2016 [18] | Canada | 234 nursing students; 15 critical care experts | Simulation (patient deterioration simulation scenario) | Content validity; Internal consistency |
| | Pen-and-paper version of the instrument | | 7 | Answers were based on factual aspects and expert opinion | Hogan et al., 2006 [19] | Canada | 16 surgeons and residents | Simulation (human patient simulator and trauma scenarios) | Content validity; Internal consistency; Convergent validity |
| Team resuscitation situation awareness tool | Observational checklist | 7 | 7 | 5-point Likert scale | O'Neill et al., 2018 [20] | Canada | 42 teams and 242 HCPs (physicians and nurses); 13 experts | Simulation (simulated pediatric resuscitation events) | Content validity; Inter-rater reliability; Criterion validity |
| Team Situation Awareness Global Assessment Technique (TSAGAT) | Observational checklist | 3 | 50 | 3-point Likert scale | Crozier et al., 2015 [21] | Canada | 12 HCPs (physicians, nurses and students); 2 independent raters | Simulation (trauma resuscitation scenarios using HPS) | Convergent validity; Known-groups validity; Inter-rater reliability |
| Situation awareness (SA) assessment tool | Observational checklist | 3 | 14 | NR | Frere et al., 2017 [8] | Ireland | 2 expert raters | Simulation (OSCE in 9 medical specialties) | Internal consistency; Inter-rater reliability |
| Non-Technical Skills for Surgeons tool (NOTSS) | Observational checklist | 4 | 12 | 4-point rating scale | Jung et al., 2020 [22] | Canada | 5 experts | Clinical (observing recordings of actual OR) | Known-groups validity; Inter-rater reliability |
| | | | | | Yule et al., 2018 [23] | UK-USA | 255 surgeons in 2 groups | Simulation (video-based simulated crisis scenario) | Structural validity; Internal consistency; Criterion validity |
| | | | | | Crossley et al., 2011 [24] | UK | 85 surgeons; 100 assessors | Clinical (OR) | Content validity; Structural validity; Internal consistency |
| Non-Technical Skills for Surgeons (NOTSS) tool | | 5 | 14 | | Yule et al., 2008 [25] | UK | 44 surgeons | Simulation (video-based simulated scenario) | Internal consistency; Inter-rater reliability |
| | | | | | Yule et al., 2006 [26] | UK | | Clinical (OR) | Development Study |
| Non-Technical Skills for Urological Surgeons (NoTSUS) | Observational checklist | 5 | 13 | 5-point Likert scale | Aydın et al., 2020 [27] | UK | 43 trainees and 19 specialists; 5 expert raters | Simulation (the full immersion simulation ‘Igloo’ environment) | Criterion validity; Inter-rater reliability |
| Anesthetists' Non-Technical Skills System (ANTS) | Observational checklist | 4 | 15 | 4-point rating scale | Fletcher et al., 2003 [28] | UK | 50 anesthetists | Simulation (simulated anesthetic scenarios) | Content validity; Internal consistency; Inter-rater reliability |
| Anesthetists' Non-Technical Skills System (ANTS) | Observational checklist | 4 | 15 | 4-point rating scale | Graham et al., 2010 [29] | Australia | 26 anesthetists | Clinical (videos of real-time and routine anesthesia) | Internal consistency; Inter-rater reliability |
| Anaesthetic Non-Technical Skills for Anaesthetic Practitioners System (ANTS-AP) | Observational checklist | 3 | 9 | 4-point rating scale | Rutherford et al., 2015 [30] | UK | 48 anesthetic practitioners | Simulation (simulated anesthetic scenarios in OR) | Content validity; Internal consistency; Reliability; Inter-rater reliability |
| Trauma Non-Technical Skills (T-NOTECHS) tool | Observational checklist | 5 | 5 | 5-point scale | van Maarseveen et al., 2020 [31] | Netherlands | 18 recorded videos of resuscitation teams; 3 assessors | Clinical (trauma center) | Reliability; Inter-rater reliability |
| | | | | | Steinemann et al., 2012 [32] | USA | 44 observations for simulated and 48 for actual resuscitations by 2–3 raters | Both clinical and simulation settings | Development Study; Inter-rater reliability |
| Oxford Non-Technical Skills scale (NOTECHS) | Observational checklist | 4 | 16 | 4-point rating scale | Mishra et al., 2009 [33] | UK | 65 OR teams; 2–3 expert raters | Clinical (OR) | Content validity; Reliability; Convergent validity; Inter-rater reliability |
| Oxford Non-Technical Skills scale (NOTECHS II) | | | | 8-point rating scale | Robertson et al., 2014 [34] | UK | 297 OR members | Clinical (OR) | Content validity; Known-groups validity; Inter-rater reliability |
| Interpersonal and Cognitive Assessment for Robotic Surgery rating system (ICARS) | Observational checklist | 4 | 28 | 5-point rating scale | Raison et al., 2017 [35] | UK | 16 expert surgeons; 73 surgeons | Simulation (ureterovesical anastomosis within a simulated OR) | Content validity; Internal consistency; Inter-rater reliability |
| Explicit professional oral communication tool (EPOC) | Observational checklist | 6 | 35 | NR | Kemper et al., 2013 [36] | Netherlands | 378 ED members; 1144 ICU members; 2 independent observers | Clinical (ED and ICU) | Measurement error; Inter-rater reliability |
| Scrub Practitioners’ List of Intraoperative Non-Technical Skills (SPLINTS) | Observational checklist | 3 | 9 | 4-point Likert scale | Loh et al., 2019 [37] | Singapore | 30 scrub nurses; 10 expert raters | Clinical (OR) | Content validity; Internal consistency; Reliability; Convergent validity; Inter-rater reliability |
| | | | | Binary scale | Mitchell et al., 2012 [38] | UK | 25 scrub nurses; 9 surgeons | NR | Development Study |
| Ottawa Global Rating Scale (GRS) | Observational checklist | 8 | 8 | 7-point rating scale | Kim et al., 2006 [39] | Canada | 59 medical residents; 3 raters | Simulation (ICU; ED; PACU) | Content validity; Internal consistency; Inter-rater reliability |

NR Not Reported, OR Operating Room, NTS Non-Technical Skills, ED Emergency Department, ICU Intensive Care Unit, OSCE Objective Structured Clinical Examination, N/A Not Applicable (skill not required for the given clinical setting), PACU Post-Anesthesia Care Unit