Use of social network analysis methods to study professional advice and performance among healthcare providers: a systematic review

Abstract

Background

Social network analysis quantifies and visualizes relationships between and among individuals or organizations, yet its applications in the health sector remain limited. This systematic review analyzes which social network methods have been used to study professional communication and performance among healthcare providers.

Methods

Ten databases were searched from 1990 through April 2016, yielding 5970 articles that were screened for inclusion by two independent reviewers, who also extracted data and critically appraised each study. Inclusion criteria were that the study examined professional communication among healthcare workers, used network methods, and measured patient outcomes. The search identified 10 systematic reviews, whose included articles were also screened. The citations of the final set of articles were screened both prospectively and retrospectively. We used narrative synthesis to summarize the findings.

Results

The six articles meeting our inclusion criteria covered distinct health sectors: one at the primary healthcare level and five at the tertiary level; five were conducted in the USA and one in Australia. Four studies looked at multidisciplinary healthcare workers, while two focused on nurses. Two studies used mixed methods, four used quantitative methods only, and one involved an experimental design. Four studies administered network surveys, one coded observations, and one used an existing survey to extract network data. Density and centrality were the most common network metrics, although one study did not calculate any network properties and only visualized the network. Four studies involved tests of significance, and two used modeling methods. Social network analysis software preferences were evenly split between ORA and UCINET. All articles meeting our criteria were published in the past 5 years, suggesting that this remains a nascent but emerging research area in clinical care. There was marked diversity across all six studies in terms of research questions, health sector area, patient outcomes, and network analysis methods.

Conclusion

Network methods are underutilized for understanding professional communication and performance among healthcare providers. The paucity of articles meeting our search criteria, the lack of studies in middle- and low-income contexts, the limited number in non-tertiary settings, and the scarcity of longitudinal or experimental designs and network interventions present clear research gaps.

Systematic review registration

PROSPERO CRD42015019328

Background

In 2015, the Millennium Development Goals (MDGs) expired after 15 years of galvanizing the global development community around health targets related to women, children, and HIV and AIDS. Their replacement, the Sustainable Development Goals (SDGs), broadens the global focus beyond health [1]. As such, the health sector will need to explore new ways to influence provider practice and scale up best practices to meet the outstanding MDG targets and improve health outcomes. Understanding and harnessing the power of existing professional advice networks among healthcare providers could assist in influencing provider practice and improving health outcomes in low- and middle-income countries (LMIC). Social network analysis focuses on studying relationships between and among individuals (or organizations) who are connected by one or more ties of interdependency, such as love, friendship, kinship, trust, collaboration, or communication [2]. Social network analysis (SNA) can lend insight into defining, measuring, and understanding these professional communication networks and therefore into designing effective network interventions to improve provider performance and, ultimately, health outcomes [3, 4].

SNA is defined as a means of mapping and exposing channels of communication and information flow, collaboration, and disconnection between people [5]. SNA is both a theory and a methodology that has generated a body of empirical research [6, 7]. One of these theories is diffusion of innovations, defined by Rogers as the “process by which an innovation is communicated through certain channels over time among the members of a social system” (p. 5) [8]. Rogers proposed that individuals go through several stages in deciding to “adopt” an innovation, a process influenced by the characteristics of innovations, specifically their complexity, trialability, observability, and the relative advantage conferred by the innovation [8, 9]. Individual adoption of an innovation over time can be expressed as a normal distribution, segmenting individuals into categories of innovativeness: innovators, early adopters, early majority, late majority, and laggards [8].
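As a minimal illustration of this segmentation, the sketch below classifies providers into Rogers’ five adopter categories using the conventional cut-points of one and two standard deviations around the mean time of adoption. It is a hypothetical example with made-up adoption times, not an analysis drawn from any of the studies discussed in this review.

```python
from statistics import mean, stdev

def adopter_category(t, mu, sigma):
    """Classify an adoption time t using Rogers' conventional cut-points
    (mean +/- 1 or 2 standard deviations of the time-of-adoption distribution)."""
    if t < mu - 2 * sigma:
        return "innovator"        # earliest ~2.5%
    if t < mu - sigma:
        return "early adopter"    # next ~13.5%
    if t < mu:
        return "early majority"   # next ~34%
    if t < mu + sigma:
        return "late majority"    # next ~34%
    return "laggard"              # final ~16%

# Hypothetical months-to-adoption for a group of providers
adoption_times = [2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 18]
mu, sigma = mean(adoption_times), stdev(adoption_times)
for t in adoption_times:
    print(t, adopter_category(t, mu, sigma))
```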

Professional behavior change among healthcare providers is often referred to as knowledge translation or transfer. We hypothesize that certain network structures and the presence of particular network roles within networks of healthcare providers can facilitate diffusion of innovations, or knowledge translation and transfer, particularly where the issue is a lack of provider information, and that this may in turn change practices and improve patient outcomes. Admittedly, this is a simplification, as relationships among healthcare providers are multiplex, and friendship or trust networks, rather than purely professional communication, may be more influential in changing provider behavior when there is informational ambiguity [10]. As such, it is important to consider both formal and informal professional communication in an attempt to mitigate this concern.

While it is not possible for this paper to include a comprehensive overview of social network analysis study designs, data collection, and data analysis methods, the key concepts are highlighted here and further explanations can be found elsewhere [2]. Social network analysis studies are defined primarily as either whole-network studies, which include all members of a group defined by a specified boundary, or ego-network studies, which capture the networks of selected individuals within a network. Hybrid models can combine elements of both approaches. Networks are characterized as either “directed,” indicating the orientation of the relationship (for example, if A influences B, the tie carries an arrowhead at B), or “undirected,” where the relationship either exists or does not and none of the ties have arrowheads. They are also either “valued,” capturing the intensity of the relations on a scale, or “unvalued,” whereby the relations are dichotomous. Network data can be captured through questionnaires, interviews, observations, existing records, diaries, or other methods [2]. Other ways of generating networks of providers include journal co-authorship lists, patient-sharing among providers, attendance at conferences, and participation in social media forums, to name a few.
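To make these distinctions concrete, the short sketch below builds a small directed, valued advice network in Python using the networkx library. The library choice and the provider names are illustrative assumptions, not tools or data taken from the studies reviewed here.

```python
import networkx as nx

# Directed graph: an arrow from A to B means "A seeks advice from B".
G = nx.DiGraph()

# Valued ties: 'weight' records the reported frequency of advice-seeking
# (e.g., 1 = rarely ... 5 = daily), making this a directed, valued network.
G.add_edge("nurse_A", "physician_B", weight=5)
G.add_edge("nurse_A", "nurse_C", weight=2)
G.add_edge("physician_B", "pharmacist_D", weight=3)
G.add_edge("nurse_C", "physician_B", weight=4)

# Dichotomizing the weights (tie present/absent) yields an unvalued network,
# and G.to_undirected() would give the undirected counterpart.
unvalued = nx.DiGraph((u, v) for u, v, d in G.edges(data=True))
print(G.number_of_nodes(), G.number_of_edges(), unvalued.number_of_edges())
```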

The choice of data analysis methods in social network analysis depends on how the data were collected and the research questions of interest. In SNA, visualizing data is both a means of presenting findings and a tool for generating additional findings. Quantification of network properties is subject to certain constraints because the unit of analysis is a relationship between actors (individuals or organizations) rather than independent observations. Thus, SNA requires analytical tools that do not rely on independence of observations or relations [2]. Analysis can be at the actor, subgroup, or network level. Common subgroup structures are dyads, triads, clusters, cliques, components, and bridges [11]. Many network metrics can be calculated, including degree, density, centrality, reachability, and distance; some can be calculated at the network level, the actor level, or both. Gesell et al. [12] recommend calculating isolates, degree, and reciprocity at the actor level and, at the network level, the presence of subgroups, density, centralization, transitivity, and cohesion as the metrics most likely to affect individual and group processes.
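Continuing the illustrative networkx sketch, the snippet below computes a few of the actor- and network-level metrics mentioned above (density, reciprocity, in-degree and betweenness centrality, and isolates) on a small hypothetical advice network. It shows how such metrics can be calculated in general-purpose software; it is not a reproduction of any included study’s analysis.

```python
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("nurse_A", "physician_B"), ("nurse_C", "physician_B"),
    ("physician_B", "pharmacist_D"), ("pharmacist_D", "physician_B"),
    ("nurse_A", "nurse_C"),
])

# Network-level metrics
print("density:", nx.density(G))          # ties present / ties possible
print("reciprocity:", nx.reciprocity(G))  # share of mutual ties

# Actor-level metrics (dicts keyed by node)
print("in-degree centrality:", nx.in_degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))

# Isolates would be providers with no ties at all
print("isolates:", list(nx.isolates(G)))
```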

A 2012 systematic review of SNA applications in healthcare settings concluded that SNA’s potential has been unrealized in the health sector, particularly because virtually all identified studies were simple network descriptions rather than studies of network interventions [5]. This review had a definition of a healthcare setting that excluded community-based health workers and interventions, a limitation particularly relevant in LMIC and global health.

The present systematic review builds on the Chambers et al. [5] review in the following ways: broadening the definition of “healthcare settings” to include community-based settings, expanding the databases and search terms, and updating the searches to include articles from 2011 to 2016. The focus of the review synthesis is also substantively different, looking specifically at SNA methods used to understand healthcare provider communication and performance. The primary research question this review sought to address is: what SNA methods have been used to study professional communication and performance among healthcare providers? Secondary research questions included:

  • Does professional communication improve health outcomes? What professional communication network properties are associated with health outcomes?

  • What methods have been used for which types of research questions?

  • What are the main limitations of the SNA methods?

  • What is the quality of these studies?

  • What is the quantity of SNA studies? What was the evolution over time?

  • To what extent has this research taken place in low- and middle-income countries?

  • To what extent has this research focused on community-based health providers?

Methods

Definitions

For any systematic review, it is critical to clarify the meaning of the terms that define the search strategy. For this review, we have operationally defined “healthcare providers,” “professional communication,” and “performance” as follows.

In this context, we defined “healthcare providers” as physicians, clinical officers, nurses, midwives, counselors, physician’s assistants, and others who provide health-related services to patients in formal medical environments. Additionally, community-based cadres such as community health workers, village health workers, traditional birth assistants, and others were also considered healthcare providers.

For our purposes, we defined “professional communication” as formal or informal professional advice-seeking or giving or discussion about hypothetical or actual work situations or patients. For example, studies exploring friendship networks of healthcare providers were not considered eligible, unless they also captured communication related to work situations or patient care and documented patient health outcomes.

We defined “performance” as a study including a patient health outcome. Studies that only considered “patient satisfaction” or healthcare provider “perceptions of performance” were not eligible for inclusion.

Search strategy

The search strategy focuses on the intersection of SNA and diffusion of innovations, the term used in the SNA community most relevant to professional communication related to knowledge sharing and transfer. Since health policy and health systems research often use “knowledge translation or transfer” language, the search strategy also includes the intersection between those terms and SNA. As a methodologically focused review, this review highlights the range of SNA methods applied.

To address the research questions, the systematic review focused on three concepts integral to the primary research question: (1) SNA, (2) diffusion of innovations, and (3) knowledge translation and transfer. The key terms for these concepts are shown in Additional file 1; truncated search terms were used to make the search inclusive.

Concept 1: social network analysis

The search strategy for the SNA concept was adapted from the Chambers et al. [5] scoping systematic review of social network analysis in healthcare settings. This was particularly helpful guidance, as a more recent SNA review by Cunningham et al. noted the challenge of “social network” yielding irrelevant social media or social support articles [13]. One change from the Chambers et al. review was an expansion of the list of SNA software packages, from the original four (UCINET, NetDraw, Pajek, and KrackPlot) to 56, guided by a chapter in the SAGE Handbook of Social Network Analysis [14]. Depending on the database, specific software packages (Blanche, InFlow, Jung, ORA, ORS, Pnet, Puck, UNISoN, SNAP, and STRUCTURE) were excluded as they yielded thousands of off-topic articles. See Additional file 2 for a list of exclusions by database. None of the excluded packages were among the SNA packages included in the previous review, and for the most part they are not the most commonly used SNA software packages. The one exception is ORA, an SNA software package that, for 6 of the 10 databases, returned thousands of articles that used odds ratios in their analysis. However, as this review still yielded two studies that used ORA, we do not feel that this negatively impacted the search.

Concept 2: diffusion of innovations

The search strategy for the diffusion of innovations concept was influenced by the original search strategy used as a starting point for a meta-narrative on Diffusion of Innovations in Health Service Organizations [15]. However, the focus on health service organizations was seen as potentially too limiting. Therefore, terms related to health service organizations were not included to let the review capture a broader range of studies. “Diffusion of innovations” is a phrase that is relatively new to health systems research. Consequently, the review used a third concept to ensure all relevant studies were captured, which corresponds to diffusion of information: knowledge translation and transfer.

Concept 3: knowledge translation and transfer

Knowledge translation and transfer (KT) are terms describing a relatively new discipline, which does not have an agreed-upon lexicon. A systematic study of KT terms used in 12 journals found inconsistent use of such terms: less than half of what the authors classified as “KT articles” used the presumed “KT terms,” leading the authors to describe the situation as a “Tower of Babel” [16]. The search strategy for this concept was developed by determining the common terms across six sources, including four systematic reviews [17,18,19,20] and two articles on knowledge translation “KT” or “K*” terms [16, 21]. A comprehensive listing of all 253 K* terms can be found in Additional file 3. An initial search using all the terms yielded over 6000 articles in MEDLINE, which led to a revision of the approach for this concept: priority terms for inclusion in the search strategy were those that appeared in more than one source.

The search strategies were then developed looking at the intersection of concept 1 with concept 2 and the intersection of concept 1 with concept 3. They were then adapted to each of the databases included in the review, including mapping the above terms to MeSH terms. Detailed search strategies for each of the 10 databases are available upon request—an example is included in Additional file 4.

MEDLINE, EMBASE, PsycINFO, CINAHL, Global Health, Social Policy and Practice, Health Management Information Consortium, and Web of Science were searched. Gray literature was searched via Popline, and the Cochrane Library was searched to identify other systematic reviews and relevant studies. Several websites were also searched, including those of the International Network for Social Network Analysis and the American Evaluation Association Social Network Analysis Technical Interest Group, as well as the International Sunbelt Social Networks Conference proceedings archives.

Articles were downloaded into EndNote X5.0.01, a bibliographic software package, and duplicates within and across databases were removed. All 5970 articles were then assessed against the study inclusion criteria through a three-stage review process. Two independent reviewers (KS and DW) screened titles, abstracts, and full-text articles; after each step, discrepancies were discussed and reconciled.

The articles included in the 10 SNA-related systematic reviews identified through the search strategies were also screened for inclusion in this review [5, 22,23,24,25,26,27,28,29].

The search strategies were originally executed from 1990 to January–March 2015 and then updated in April 2016 to capture articles published since the original search. All systematic reviews identified had their included articles screened. The final set of articles had their reference lists screened, and SCOPUS was used to conduct a prospective citation search. All articles were subjected to the three-stage screening process by two independent reviewers. The PRISMA flow chart (Fig. 1) reflects the combination of the searches and screenings conducted in 2015 and updated in 2016.

Fig. 1 PRISMA flow chart

The study protocol was registered with PROSPERO (DOI: 10.15124/CRD42015019328; URL: http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42015019328).

Study inclusion and exclusion criteria

A checklist was developed to guide each reviewer. A single “no” response to any of the questions below led to exclusion of the study from the systematic review:

  • Does the study use SNA methods?

  • Are the study subjects healthcare providers?

  • Is the communication/relationship of interest between healthcare providers?

  • Does the research focus on professional communication?

  • Is there some metric used for performance, defined as assessing patient outcomes?

Only English search terms were used, and included studies were limited to those published in English since 1990. This date was selected in part because 49 of the 52 articles included in the previous review by Chambers et al. [5] were published after 1990, and because modern SNA studies rely on software that has largely existed only since 1990.

This review excluded studies conducting SNA of patient-to-patient or patient-to-provider communication. Direct-to-consumer advertising and marketing studies, such as pharmaceutical companies marketing to potential patients, were also excluded. Publication or research networks, provider patient-sharing networks, provider friendship networks, and non-empirical research were excluded. Studies whose only measures of performance were “provider perceptions” or “patient satisfaction” were excluded as not providing an objectively measurable health outcome.

These exclusions were made on the basis that such studies were not thought to lend insight into the methods used to assess professional communication among healthcare workers and its association with patient outcomes.

Study quality assessment

Two tools were developed for critically appraising study quality, one for qualitative studies and the other for quantitative study designs. These tools were informed by STROBE, EPOC, CASP, SIGN, ENTREQ, COREQ, RATS, QARI, and NICE process and methods guidelines and checklists, and by seminal articles on the subject [30,31,32,33,34,35,36,37,38,39,40]. Systematic reviews of SNA identified through our search strategy were also consulted, as there is no standard tool for assessing the quality of SNA studies, and some of the content of existing checklists for other study methods does not apply to network studies [5, 13, 22,23,24,25,26,27,28,29]. However, the existing tools were useful starting points for assessing study quality. See Additional files 5 and 6 for the tools developed to assess qualitative and quantitative studies and Table 2 for the summary of study quality. As per Cochrane and SIGN guidance, studies were assessed as high, medium, or low quality, with no summary score produced and no quality threshold for inclusion in the review [36, 41]. Mixed-methods studies had both tools applied, and an overall study quality assessment was provided drawing on both tools’ assessments.

Selected studies were independently critically appraised using these tools by two individuals (KS and DW). Discrepancies were discussed until reconciled.

Data extraction strategy

A data extraction matrix was developed after reviewing data extraction tools used in relevant systematic reviews and consulting with an SNA and health expert [5, 13, 22,23,24,25,26,27,28,29]. The tool was pilot-tested and revised for greater clarity and specificity, with the final version covering 35 data points. Data were extracted independently by two individuals (KS and DW); results were compared, and discrepancies were discussed and resolved by consensus. See Additional file 7 for the tool and Tables 2, 3, 4, 5, and 6 for a subset of the data extracted.

Data synthesis and presentation

Narrative synthesis was used to describe studies included in the review, focusing on the SNA methods and metrics used [42].

Results

Our searches returned 5970 articles, which after double screening yielded six articles meeting our inclusion criteria [43,44,45,46,47,48]. Figure 1 documents the review process using a PRISMA flow chart.

Studies’ characteristics are described in Table 1. The studies were all recently published (none before 2011), mostly examined multidisciplinary healthcare providers (4 of 6), and were mostly conducted in the USA (5 of 6) in tertiary care facilities or their equivalent (5 of 6).

Table 1 Study characteristics

Below, we review findings based on each of our research questions.

Primary research question

What SNA methods have been used to study professional communication and performance among healthcare providers?

Tables 2, 3, 4, 5, and 6 contain extractions from the six studies. Table 2 provides an overview of the studies, Table 3 focuses on their SNA methods, Table 4 lists the SNA metrics used by each study, Table 5 looks at the association between the SNA metrics and patient outcomes, and Table 6 explores the relationship between research questions and SNA methods used. Key patterns are summarized below.

Table 2 Summary of the six studies included in this review
Table 3 Summary of studies’ social network analysis methods
Table 4 Summary of studies’ social network analysis metrics
Table 5 Analysis of studies’ SNA metrics and patient outcome findings
Table 6 Analysis of studies’ research questions and study methods used

All studies included in this review were exploratory in nature. All but one study [47] employed a cross-sectional study design looking at whole networks. Four studies could only examine one or two whole networks, listing this as a limitation to their study’s generalizability [43,44,45, 47]. Data collection tools were typically network surveys designed specifically for that study; however, one study coded observations [43] and another extracted data on healthcare worker communications from surveys of patients who attended emergency departments [46]. All but one study [47] visualized their networks, which is not surprising as one of the unique aspects of SNA methods and software is the ability to visualize networks. Software preferences leaned towards UCINET and ORA [49, 50], with Microsoft Excel and SPSS mentioned as supplementary tools. A wide range of network metrics was calculated, although density and centrality were the most common. See Table 4 for an overview of which studies calculated specific network metrics. There was a range in how these data were analyzed, with some studies integrating them into models and others using tests of significance.

Secondary research questions

What is the quantity of SNA studies? What was the evolution over time?

Six studies were identified, all published in the past 5 years and none before 2011. One study was published in each year from 2011 to 2013, and three were published in 2015. This evolution suggests increasing interest in this type of study; however, with only six studies, such an assessment may be premature.

To what extent has this research taken place in low- and middle-income countries?

Not a single study that met our search criteria was conducted in a low- or middle-income country. All studies took place in either the USA [43, 45,46,47,48] or Australia [44].

What is the quality of these studies?

The quality of the studies meeting our selection criteria was assessed using the tools in Additional files 5 and 6 and is summarized in Table 2. Applying the SIGN guidance for assigning these categories, none of the studies was of low quality; two were of acceptable quality and four of high quality.

What methods were used for which types of research questions?

Only two studies used mixed methods; the other four used quantitative methods only.

Interestingly, Alexander et al. [43] only visualized the communication networks and did not calculate SNA metrics. The authors cited this as a limitation of their design, which did not involve coding the observations in a way that would allow healthcare workers to be treated as individual nodes. Conversely, Lindberg et al. [47] did not include a visualization of the communication network.

Modeling and tests of significance were used when studies intended to measure the association of network properties with other factors. Table 2 lists the study objectives, research questions, study design, and data collection methods whereas Table 3 goes into detail regarding each study’s SNA methods. Table 4 summarizes the SNA metrics used by each study, and Table 6 looks at the link between research questions and study design.

What are the main limitations of the SNA methods?

Table 2 lists the limitations of each of the six studies as identified by our reviewers (some were also mentioned by the authors). While some of these limitations do not relate to the SNA methods, they provide insight into the challenges faced by studies using such methods. A general challenge of SNA methods is the need to clearly define the study boundary, which can limit sample size and therefore broader generalizability. Several studies noted this, identifying areas for further research such as broadening to other settings or repeating the study elsewhere given the limited sample size. The sample size limitation related less to the number of nodes than to the number and type of whole networks included.

The lack of longitudinal and experimental designs speaks to a broader challenge in the field as these are new areas for application of SNA methods, and the analytical tools and software are still in development. This limited the ability of SNA studies to address causal pathways.

Similarly, the limited integration of qualitative methods into the studies constrains the contextual understanding of the network properties quantified and visualized through the quantitative SNA methods.

One study, Alexander et al. [43], reported that their coding method limited the types of analyses that could be conducted; therefore, they did not analyze their SNA data beyond visualizing patterns.

To what extent has this research focused on community-based health providers?

None of the studies took place in a community-based setting. Only one study took place in a primary healthcare context [48]. The other five studies took place in tertiary level facilities including hospital units and specialist care facilities like hemodialysis centers or nursing homes [43,44,45,46,47].

What are the key findings of these SNA studies?

While this is a methodologically focused review, and therefore inherently less focused on the associations observed between network properties and the health outcomes measured, the natural next question is: what did these studies find? Their overall findings are discussed in Table 2, but to better understand any relationship between specific network metrics and patient health outcomes, we looked at the metrics captured in more than one of the studies and their reported association with patient outcomes in Table 5. Only two metrics, density and in-degree centrality, were reported in more than one study. For this analysis, all centrality metrics were collapsed into one category, although the actual centrality metrics used in each study are specified in Table 4.

Patient outcomes generally improved when healthcare worker communication was denser and more centralized, as measured by various centrality metrics. However, for both metrics, studies also reported no significant association with some patient outcomes; as such, more studies are needed to clarify these patterns.

The Effken study had one exception to the proposed relationship between centrality and patient outcomes: adverse drug events increased with betweenness centrality, possibly, the authors hypothesized, due to the presence of gatekeepers. Another patient outcome, symptom management, appears on the surface to have conflicting associations with centrality metrics; however, the authors suggest that, taken together, the correlation of this outcome with eigenvector centrality and of patient symptom management capacity with Simmelian ties (strong ties within cliques) could point to the importance of small-group communication [45]. The broader pattern of performance being linked to more centralized networks is generally supported by the SNA literature, although the debate continues [51]. Furthermore, patient outcomes may not necessarily be expected to be associated with healthcare provider centrality, as providers could be central for reasons other than the quality of care or professional advice they provide. Network density can provide more pathways for communication but, in its extreme, can reinforce insularity and limit external sources of information [51]. As such, a network diagnostic tool proposed an ideal network density of 0.15–0.50 [12].
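As a rough illustration of how such a diagnostic might be applied, the sketch below computes the density of a small hypothetical provider network and checks it against the 0.15–0.50 range suggested by the network diagnostic tool [12]. The network itself and the threshold check are illustrative assumptions only.

```python
import networkx as nx

# Hypothetical undirected communication network for an eight-person care team
team = nx.Graph()
team.add_edges_from([
    (1, 2), (1, 3), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (7, 8), (2, 5),
])

# Density = actual ties / possible ties; for an undirected graph, 2L / (n(n-1))
d = nx.density(team)
print(f"density = {d:.2f}")
if 0.15 <= d <= 0.50:
    print("within the 0.15-0.50 range proposed by Gesell et al. [12]")
else:
    print("outside the proposed range: possibly too sparse or too insular")
```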

There are definite limitations to this specific analysis. Through this process, it became clear that not every SNA metric calculated, nor its association with every outcome captured in a study, is published. Another complication is that not all of the metrics and results were truly comparable given the different data sources and analytical approaches. For example, one study used generalized linear mixed models (GLMMs), collapsing all data collected across teams of different professionals with either strong or weak ties for two types of communication networks (electronic and face-to-face), rather than looking at the in-degree centrality of a network and its association with the patient outcome of interest.

Discussion

The discussion focuses on the review’s two main research questions: the primary question, “What SNA methods have been used to study professional communication and performance among healthcare providers?” and the secondary question, “What methods were used for which types of research questions?”

What SNA methods have been used to study professional communication and performance among healthcare providers?

The majority (5 of 6) of the studies that met our selection criteria used a cross-sectional, observational study design. This posed challenges in addressing research questions about the association between provider communication networks and patient outcomes, as the timeframe of the patient outcome data and that of the networks captured were not always temporally aligned.

As other systematic reviews suggested, there remain few network intervention studies, a frontier opportunity for researchers [5, 26, 27]. See Additional file 8 for an overview of the other SNA and health systematic reviews identified through our search strategy and their recommendations for further research. The lone experimental study included in this review, Lindberg et al., did not use network data to design the intervention, so it does not qualify as a “network intervention” [47]. Network interventions and experimental, longitudinal study designs will allow for SNA methods to address causal pathways, a current limitation on how the methods are being applied [29].

One of the challenges facing researchers wanting to use SNA methods is the lack of validated SNA survey tools for use in the health sector, as highlighted by Creswick and Westbrook and by Perkins et al. [29, 44]. While this is only relevant for those interested in using sociometric survey methods, as more studies use SNA methods, we can anticipate that a set of tools or best practices for applying a range of SNA methods will emerge. The Perkins systematic review aimed to address one aspect of that gap by gathering all the name-generating tools found across the studies they reviewed [29]. However, this is only one step towards having more systematically validated tools and best practices available.

The most obvious pattern in study methods was that studies seeking to establish associations used more advanced statistical methods to test their hypotheses, whereas studies asking questions about processes used more qualitative methods. However, this observation is less about SNA methods and more about the relative strengths of qualitative versus quantitative research methods. The diversity of SNA analytical methods could also reflect the expertise of the individual researchers and the methods they were more comfortable using, rather than a clear advantage of one method over another for answering a given research question. That said, there are SNA methods, such as exponential random graph models, that can answer specific SNA questions that other SNA methods cannot address. These methods were not used in the studies meeting our search criteria.

It is important to note that while this review did not identify any studies conducted in LMIC, this does not mean SNA methods have never been used to study health in these contexts. A systematic review looked specifically at SNA applications in LMIC and found 17 articles from 10 health-related network studies; however, their focus was broad, and none of the studies met our criteria of focusing on healthcare provider communication and patient outcomes [29]. Instead, these studies, set in nine countries, looked primarily at patients or their households as the ego and used name generators to establish networks related to contraception use and family planning, mercury consumption (2 studies), HIV transmission (5 studies), and diarrheal disease transmission (3 studies) [29].

One of the issues with the way SNA methods have been applied in the health sector is the often artificial boundaries imposed by limiting studies to specific cadres, which do not reflect the actual care environments. Notably, four of the six studies looked at multidisciplinary teams, and one of the studies limited to a single cadre, Effken et al. [45], suggested that future studies include other providers in the care setting.

There was surprising variability across the studies in the network metrics calculated and used. Only two metrics, in-degree centrality and density, were reported in more than one study (4 and 3 of the 6 studies, respectively). The number of network metrics calculated per study ranged from 0 to 15, with most studies calculating 3 or 4. See Table 4 for a breakdown of which studies calculated which metrics.

What methods were used for which types of research questions?

With only six studies meeting our criteria, there are limits to identifying clear patterns in methods used to address types of research questions and study objectives. Table 6 focuses on the link between types of research questions and study methods. Research questions were classified as either descriptive, relational, or causal in nature [52]. Half of the studies included more than one type of research question. Those studies that included causal or relational research questions typically involved more robust quantitative analyses. Mixed methods were used in two studies: one only had causal research questions while the other had descriptive and relational research questions. Most study designs were observational and cross-sectional and had descriptive and relational research questions. As more studies are conducted over the coming years, these patterns will likely evolve and become more consistent.

While the focus of our review has been on these two research questions as applied to the six articles that met our search criteria, there is a range of SNA methods and metrics beyond those discussed here that could help answer research questions related to healthcare professional advice networks and performance, including, but not limited to, block modeling, core-periphery analysis, structural holes and bridges, cohesion, proximity, and prestige/prominence analyses.
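For readers wanting a starting point, the brief sketch below shows how two of these additional analyses, structural holes (Burt’s effective size and constraint) and a rough core-periphery view via k-core decomposition, could be run in networkx. This is an illustrative sketch on a stand-in network, not a method used by any of the reviewed studies.

```python
import networkx as nx

G = nx.karate_club_graph()  # stand-in network for illustration

# Structural holes: Burt's effective size and constraint per actor
eff = nx.effective_size(G)
con = nx.constraint(G)
print("most brokerage-rich actor:", max(eff, key=eff.get))

# Rough core-periphery view: k-core membership per actor
cores = nx.core_number(G)
core_actors = [n for n, k in cores.items() if k == max(cores.values())]
print("innermost core:", core_actors)
```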

Limitations of the review

This review looked at a very specific question and found that few researchers, albeit a growing number in recent years, have designed studies meeting these criteria. Our definition of performance as assessed by patient outcomes, rather than through proxy interim measures such as use of evidence-based tools and practices, restricted the studies that met our search criteria. This may have been particularly limiting for studies of community-based healthcare, which often takes the form of counseling, where outcomes such as patient satisfaction are more likely to be appropriate study outcomes than patient outcomes. Our definition of professional communication networks excluded studies of provider friendship networks or other types of ties between healthcare workers unless they explicitly captured professional communication. In theory, those networks may have embedded professional advice exchange that was not captured, analyzed, or presented in the paper. Another limitation is that we only looked at English-language publications; however, other systematic reviews of SNA studies share this limitation [13, 22, 25, 27,28,29]. Of those that included studies in other languages, Benton et al., which included Spanish- and Portuguese-language studies, found that 2 of 43 included studies were non-English and excluded a further 2 for language reasons [23]. Chambers et al. and Flodgren et al. [5, 26] did not impose language restrictions in their searches but did not identify studies published outside English-language journals, so this is unlikely to be a major source of bias. We limited our searches to studies published from 1990 onwards; given the emphasis on software packages in current SNA studies, we believed that few studies would have used relevant methods before 1990, as those software packages did not exist on widely accessible platforms.

Another limitation speaks to broader limitations of systematic reviews. The language used for social network analysis is vague and inconsistent, and it was challenging to devise search strategies that returned a manageable number of articles to screen yet were broad enough to capture all the ways in which researchers may have described an SNA study.

Conclusion

Five years after the Chambers et al. [5] review searched for articles, social network analysis methods continue to be underutilized in the health sector, particularly for examining healthcare provider communication and performance. Few studies do more than describe professional communication networks among healthcare providers, and of those that do, only a small subset (the six included here) measure performance using patient outcomes. This may reflect the broader challenge of accurately capturing patient outcome data, as many studies were excluded for using proxy measures such as patient satisfaction or use of an evidence-based practice. While a diverse set of methods was used across the six studies, clearer patterns in methods may emerge as more studies are conducted. The quality of these studies was either acceptable or high; however, their level of sophistication was relatively low, with an emphasis on cross-sectional study designs. This is unsurprising, as the network methods themselves and the software tools capable of dynamic and longitudinal network analyses are still developing. As longitudinal SNA analysis methods mature, other study designs and network interventions should become more common. All articles meeting the review criteria were published in the past 5 years, suggesting that this is a developing area of research.

One pattern that this review highlights is a trend towards studying multidisciplinary provider networks rather than focusing on a single cadre. Other methodological consistencies among the six studies included a preference for calculating specific network metrics: density and centrality. The limited number of articles meeting our search criteria, the glaring absence of studies in LMIC and other non-Western contexts, and the scarcity of studies in non-tertiary or community-based settings present clear research opportunities. Once more studies addressing healthcare provider communication and performance are published, it may be useful to revisit this analysis and draw conclusions on the SNA methods best placed to answer specific research questions within this space.

Abbreviations

ADE: Adverse drug event
AR-BSI: Access-related bloodstream infection
BSI: Bloodstream infection
CASP: Critical Appraisal Skills Programme
CDC: Centers for Disease Control and Prevention
CNA: Certified nursing assistant
COREQ: Consolidated criteria for reporting qualitative research
DOI: Digital Object Identifier
ED: Emergency department
ENTREQ: Enhancing transparency in reporting the synthesis of qualitative research
EPOC: Effective Practice and Organisation of Care
FGD: Focus group discussion
GLMM: Generalized linear mixed model
HCW: Healthcare worker
IT: Information technology
ITS: Information technology sophistication
LMIC: Low- and middle-income countries
LPN: Licensed practical nurse
MA: Medical assistant
NH: Nursing home
NHAMCS: National Hospital Ambulatory Medical Care Survey
ORA: Organization Risk Analyzer
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
PROSPERO: An international database of prospectively registered systematic reviews in health and social care, welfare, public health, education, crime, justice, and international development
QARI: Qualitative assessment and review instrument
RATS: Relevance of study question, appropriateness of qualitative method, transparency of procedures, and soundness of interpretive approach
RN: Registered nurse
SIGN: Scottish Intercollegiate Guidelines Network
SNA: Social network analysis
STROBE: Strengthening the Reporting of Observational Studies in Epidemiology

References

  1. Murray CJ. Shifting to sustainable development goals—implications for global health. N Engl J Med. 2015;373:1390–3.
  2. Wasserman S, Faust K. Social network analysis: methods and applications, vol. 8. Cambridge: Cambridge University Press; 1994.
  3. Valente TW. Network interventions. Science. 2012;337:49–53.
  4. Meltzer D, Chung J, Khalili P, Marlow E, Arora V, Schumock G, et al. Exploring the use of social network methods in designing healthcare quality improvement teams. Soc Sci Med. 2010;71:1119–30.
  5. Chambers D, Wilson P, Thompson C, Harden M. Social network analysis in healthcare settings: a systematic scoping review. PLoS One. 2012; doi:10.1371/journal.pone.0041911.
  6. Scott J, Carrington PJ, editors. The SAGE handbook of social network analysis. London: SAGE; 2011.
  7. Blanchet K, James P. How to do (or not to do)… a social network analysis in health systems research. Health Policy Plan. 2012;27:438–46.
  8. Rogers E. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
  9. Dearing JW. Evolution of diffusion and dissemination theory. J Public Health Manag Pract. 2008;14:99–108.
  10. Saint-Charles J, Mongeau P. Different relationships for coping with ambiguity and uncertainty in organizations. Soc Networks. 2009;31:33–9.
  11. Hanneman RA, Riddle M. Concepts and measures for basic network analysis. In: Scott J, Carrington PJ, editors. The SAGE handbook of social network analysis. Thousand Oaks: SAGE Publications; 2011. p. 340–69.
  12. Gesell SB, Barkin SL, Valente TW. Social network diagnostics: a tool for monitoring group interventions. Implement Sci. 2013; doi:10.1186/1748-5908-8-116.
  13. Cunningham FC, Braithwaite J. Bridges, brokers and boundary spanners in collaborative networks: a systematic review. BMC Health Serv Res. 2013;13:158.
  14. Huisman M, van Duijn MAJ. A reader’s guide for SNA software. In: Scott J, Carrington PJ, editors. The SAGE handbook of social network analysis. London: SAGE; 2011. p. 578–600.
  15. Greenhalgh T, Robert G, Bate P, Macfarlane F, Kyriakidou O. Diffusion of innovations in health service organisations. Oxford: Blackwell Publishing Ltd and BMJ Books; 2005.
  16. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5:16.
  17. Scott SD, Albrecht L, O'Leary K, Ball GD, Hartling L, Hofmeyer A, et al. Systematic review of knowledge translation strategies in the allied health professions. Implement Sci. 2012;7:70.
  18. LaRocca R, Yost J, Dobbins M, Ciliska D, Butt M. The effectiveness of knowledge translation strategies used in public health: a systematic review. BMC Public Health. 2012;12:751.
  19. Van Eerd D, Cole D, Keown K, Irvin E, Kramer D, Gibson J, et al. Report on knowledge transfer and exchange practices: a systematic review of the quality and types of instruments used to assess KTE implementation and impact. Toronto: Institute for Work & Health; 2011.
  20. Mitton C, Adair CE, McKenzie E, Patten SB, Waye PB. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85:729–68.
  21. Shaxson L, Bielak A, Ahmed I, Brien D, Conant B, Fisher C, et al. Concept paper & case studies: expanding our understanding of K* (Kt, KE, Ktt, KMb, KB, KM, etc.). In: K* Conference. Hamilton: United Nations University; 2012.
  22. Bae S-H, Nikolaev A, Seo JY, Castner J. Health care provider social network analysis: a systematic review. Nurs Outlook. 2015;63:566–84.
  23. Benton DC, Perez-Raya F, Fernandez-Fernandez MP, Gonzalez-Jurado MA. A systematic review of nurse-related social network analysis studies. Int Nurs Rev. 2015;62:321–39.
  24. Braithwaite J. Between-group behaviour in health care: gaps, edges, boundaries, disconnections, weak ties, spaces and holes: a systematic review. BMC Health Serv Res. 2010;10:330.
  25. Dunn AG, Westbrook JI. Interpreting social network metrics in healthcare organisations: a review and guide to validating small networks. Soc Sci Med. 2011;72:1064–8.
  26. Flodgren G, Parmelli E, Doumit G, Gattellari M, O'Brien MA, Grimshaw J, et al. Local opinion leaders: effects on professional practice and health care outcomes (review). Cochrane Database Syst Rev. 2011; doi:10.1002/14651858.CD000125.pub4.
  27. Long JC, Cunningham FC, Braithwaite J. Bridges, brokers and boundary spanners in collaborative networks: a systematic review. BMC Health Serv Res. 2013;13:158.
  28. Mitchell JI, Long JC, Braithwaite J, Brodaty H. Social-professional networks in long-term care settings with people with dementia: an approach to better care? A systematic review. J Am Med Dir Assoc. 2016; doi:10.1016/j.jamda.2015.11.015.
  29. Perkins JM, Subramanian SV, Christakis NA. Social networks and health: a systematic review of sociocentric network studies in low- and middle-income countries. Soc Sci Med. 2015;125:60–78.
  30. von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP, et al. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007;335:806–8.
  31. Ovretveit J, Gustafson D. Using research to inform quality programmes. BMJ. 2003;326:759–61.
  32. The Public Health Resource Unit. The critical skills appraisal programme: making sense of evidence. 2016. http://www.casp-uk.net/. Accessed 19 Jul 2016.
  33. Harbour R, Miller J. A new system for grading recommendations in evidence based guidelines. BMJ. 2001;323:334–6.
  34. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19:349–57.
  35. Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181.
  36. Scottish Intercollegiate Guidelines Network (SIGN). Critical appraisal of the medical literature support. 2016. http://www.sign.ac.uk/checklists-and-notes.html. Accessed 19 Jul 2016.
  37. Clark J. How to peer review a qualitative manuscript. In: Godlee F, Jefferson T, editors. Peer review in health sciences. 2nd ed. London: BMJ Books; 2003. p. 219–35.
  38. Joanna Briggs Institute. Joanna Briggs Institute reviewers’ manual: 2011 edition. Adelaide: Joanna Briggs Institute; 2011.
  39. National Institute for Health and Care Excellence (NICE). The social care guidance manual. London: NICE; 2016.
  40. Mays N, Pope C. Assessing quality in qualitative research. BMJ. 2000;320:50–2.
  41. Higgins JPT, Green S, editors. Cochrane handbook for systematic reviews of interventions version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011. http://training.cochrane.org/handbook. Accessed 19 Jul 2016.
  42. Popay J, Roberts HM, Sowden A, Petticrew M, Arai L, Rodgers M, et al. Guidance on the conduct of narrative synthesis in systematic reviews. Institute for Health Research; 2006.
  43. Alexander GL, Steege LM, Pasupathy KS, Wise K. Case studies of IT sophistication in nursing homes: a mixed method approach to examine communication strategies about pressure ulcer prevention practices. Int J Ind Ergon. 2015;49:156–66.
  44. Creswick N, Westbrook JI. Who do hospital physicians and nurses go to for advice about medications? A social network analysis and examination of prescribing error rates. J Patient Saf. 2015;11:152–9.
  45. Effken JA, Carley KM, Gephart S, Verran JA, Bianchi D, Reminga J, et al. Using ORA to explore the relationship of nursing unit communication to patient safety and quality outcomes. Int J Med Inform. 2011;80:507–17.
  46. Hossain L, Kit Guan DC. Modelling coordination in hospital emergency departments through social network analysis. Disasters. 2012;36:338–64.
  47. Lindberg C, Downham G, Buscell P, Jones E, Peterson P, Krebs V. Embracing collaboration: a novel strategy for reducing bloodstream infections in outpatient hemodialysis centers. Am J Infect Control. 2013;41:513–9.
  48. Mundt MP, Zakletskaia LI, Shoham DA, Tuan WJ, Carayon P. Together achieving more: primary care team communication and alcohol-related healthcare utilization and costs. Alcohol Clin Exp Res. 2015;39:2003–15.
  49. Borgatti S, Everett M, Freeman L. UCINET for Windows: software for social network analysis. Analytic Technologies; 2002.
  50. Carley K. ORA. Pittsburgh: Center for Computational Analysis of Social and Organizational Systems (CASOS), Institute for Software Research International, School of Computer Science, Carnegie Mellon University; 2001.
  51. Valente TW. Social networks and health: models, methods and applications. New York: Oxford University Press; 2010.
  52. Trochim WMK. Research methods knowledge base. 2006. http://www.socialresearchmethods.net/kb/resques.php. Accessed 19 Jul 2016.

Acknowledgements

Subject experts were consulted at various points during the development of the protocol. We thank Dr. Tom Valente and Dr. Jim Dearing for sharing their expertise in the areas of social network analysis and diffusion of innovations. We thank Dr. Justin Parkhurst and Professor Sir Andy Haines for sharing their expertise in health policy research and knowledge translation and transfer. We thank Professor Mark Petticrew and Jane Falconer for sharing their expertise in conducting systematic reviews. We thank Professor Val Curtis and Professor James Hargreaves who provided input through their roles on an independent advisory panel for the author’s (KS) DrPH Review.

Funding

The research was supported by IDEAS (Informed Decisions for Actions to improve maternal and newborn health; http://ideas.lshtm.ac.uk), which is funded through a grant from the Bill & Melinda Gates Foundation to the London School of Hygiene & Tropical Medicine (Gates Global Health grant number OPP1017031).

Availability of data and materials

Search strategies, tools, completed extraction and appraisal tools, and endnote libraries are available upon request.

Author information

Contributions

KS conceptualized the research questions, wrote the protocol, and registered the systematic review with PROSPERO. As primary reviewer, she screened titles, abstracts, and full-text articles; developed the extraction and critical appraisal tools; and extracted data using those tools. KS is the primary writer of the paper. As the secondary reviewer, DW screened titles, abstracts, and full-text articles; located articles as needed; supported development of the data extraction and critical appraisal tools; extracted data using those tools and discussed discrepancies with KS until reconciled; and reviewed drafts of the protocol prior to registration and drafts of the manuscript. KB reviewed the protocol, data extraction tool, critical appraisal tool, and drafts of the manuscript. BA supported the conceptualization of the research questions and reviewed drafts of the study protocol and the manuscript. JS guided the conceptualization of the research questions and reviewed the protocol, data extraction tool, critical appraisal tool, and drafts of the manuscript. All authors reviewed the final version of the review.

Corresponding author

Correspondence to Kate Sabot.

Ethics declarations

Ethics approval and consent to participate

Not applicable; the systematic review did not involve human participants.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Search strategy concepts and associated terms. (XLSX 12 kb)

Additional file 2:

SNA software search strategy for each database. (XLSX 10 kb)

Additional file 3:

K* search strategy development. (DOCX 107 kb)

Additional file 4:

MEDLINE search strategy. (DOCX 18 kb)

Additional file 5:

Critical appraisal tool for qualitative studies. (XLSX 13 kb)

Additional file 6:

Critical appraisal tool for quantitative studies (XLSX 15 kb)

Additional file 7:

Data extraction tool. (XLSX 11 kb)

Additional file 8:

Existing SNA systematic reviews. (XLSX 14 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Cite this article

Sabot, K., Wickremasinghe, D., Blanchet, K. et al. Use of social network analysis methods to study professional advice and performance among healthcare providers: a systematic review. Syst Rev 6, 208 (2017). https://doi.org/10.1186/s13643-017-0597-1
