
Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement and extensions: a scoping review

Systematic Reviews 2017; 6:263

https://doi.org/10.1186/s13643-017-0663-8

  • Received: 2 October 2017
  • Accepted: 8 December 2017
  • Published:

Abstract

Background

The PRISMA Statement is a reporting guideline designed to improve transparency of systematic reviews (SRs) and meta-analyses. Seven extensions to the PRISMA Statement have been published to address the reporting of different types or aspects of SRs, and another eight are in development. We performed a scoping review to map the research that has been conducted to evaluate the uptake and impact of the PRISMA Statement and extensions. We also synthesised studies evaluating how well SRs published after the PRISMA Statement was disseminated adhere to its recommendations.

Methods

We searched for meta-research studies indexed in MEDLINE® from inception to 31 July 2017, which investigated some component of the PRISMA Statement or extensions (e.g. SR adherence to PRISMA, journal endorsement of PRISMA). One author screened all records and classified the types of evidence available in the studies. We pooled data on SR adherence to individual PRISMA items across all SRs in the included studies and across SRs published after 2009 (the year PRISMA was disseminated).

Results

We included 100 meta-research studies. The most common type of evidence available was data on SR adherence to the PRISMA Statement, which has been evaluated in 57 studies that have assessed 6487 SRs. The pooled results of these studies suggest that reporting of many items in the PRISMA Statement is suboptimal, even in the 2382 SRs published after 2009 (where nine items were adhered to by fewer than 67% of SRs). Few meta-research studies have evaluated the adherence of SRs to the PRISMA extensions or strategies to increase adherence to the PRISMA Statement and extensions.

Conclusions

Many studies have evaluated how well SRs adhere to the PRISMA Statement, and the pooled results of these suggest that reporting of many items is suboptimal. An update of the PRISMA Statement, along with a toolkit of strategies to help journals endorse and implement the updated guideline, may improve the transparency of SRs.

Keywords

  • Reporting
  • Systematic reviews
  • Methodology
  • Quality

Background

Systematic reviews (SRs) and meta-analyses are an essential resource for healthcare decision-makers [1]. When conducted well, SRs can provide credible and timely data on a range of enquiries, such as which treatments are effective, ineffective or harmful; which tests accurately diagnose a condition; and which exposures are associated with health outcomes. However, the value of SRs depends on how well authors have reported what they did, and what they found. If such information is absent or ambiguous, readers cannot judge whether the results of the SR are robust to the methods used, cannot attempt to reproduce the findings and cannot interpret the findings accurately. This can contribute to the failure to implement the findings of SRs into clinical practice [2]. Therefore, transparent reporting of SRs should be considered critically important by authors of SRs [3, 4].

The transparency of SRs and meta-analyses of health research has been called into question on many occasions [5]. The first formal appraisal of SRs with a focus on medicine was performed by Cynthia Mulrow, who identified several poor reporting practices in a sample of 50 medical review articles published between June 1985 and June 1986 [6]. For example, clearly specified methods of identifying, selecting and appraising studies were available in one article only. Transparency was only slightly better in reviews published in 1996, with less than 25% of articles describing how evidence was identified, evaluated or synthesised [7]. In the last decade, transparency of SRs has certainly improved, yet a substantial amount of suboptimal reporting persists [8].

Improvements in the transparency of SRs in recent years may be attributed to the dissemination of reporting guidelines. Reporting guidelines provide evidence-based recommendations for authors on how to report their research methods and findings clearly [9]. In 1999, an international group of 30 epidemiologists, clinicians, statisticians, editors and researchers developed a reporting guideline for meta-analyses of randomised trials—the QUOROM (QUality Of Reporting Of Meta-analyses) Statement [10]. In 2005, a meeting was convened to update QUOROM to address several conceptual and practical advances in the methodology of SRs and to help overcome several shortcomings identified in an audit of SRs [3]. The guideline was renamed the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) Statement and published in 2009 [11]. It was accompanied by an explanation and elaboration document, which provided detailed guidance for each of the 27 included items, and examples of exemplary reporting [12].

According to citation data in Scopus®, the PRISMA Statement has had a very high uptake from the biomedical research community (Fig. 1). The checklist paper [11, 13–19] has been cited 19,402 times as of 31 July 2017, and the accompanying explanation and elaboration document [12, 20–23] received 5483 citations by this date. However, not all published SRs cite the guideline; for example, in a random sample of 119 non-Cochrane SRs of therapeutic interventions indexed in MEDLINE® in February 2014, 42 (35%) mentioned the use of the PRISMA Statement [8].
Fig. 1 Cumulative number of citations of the PRISMA Statement. Data obtained from Scopus® on 31 July 2017. E&E explanation and elaboration

Since its publication, seven extensions to the PRISMA Statement have been developed to facilitate reporting of different types or aspects of SRs (Table 1). These include the PRISMA-Equity extension [24–26], PRISMA for Abstracts of SRs [27], PRISMA extension for reporting SRs incorporating network meta-analysis [28], PRISMA for SRs and meta-analyses of individual participant data [29], PRISMA for SR protocols [30, 31], PRISMA harms checklist [32] and PRISMA extension for SRs of complex interventions [33, 34]. Citation counts for the PRISMA extensions are much lower than those of the PRISMA Statement, but they have not had the same amount of time to accrue citations (Fig. 2). Also, one should not expect the extensions to receive as many citations, as they are more restricted in scope, meaning that fewer SRs to which the extensions are applicable are published each year. The most cited extension is the checklist paper for PRISMA-P (for SR protocols) [30], which has received 683 citations since its publication in January 2015.
Table 1 Scope of the PRISMA Statement and published extensions

PRISMA (2009): Reports of systematic reviews and meta-analyses, primarily of randomised trials that evaluate health care interventions [11–23].

PRISMA-Equity (2012): Reports of systematic reviews and meta-analyses with a focus on health equity, defined as the absence of avoidable and unfair inequalities in health [24–26].

PRISMA-Abstracts (2013): Abstracts for all types of systematic reviews, but the emphasis is on systematic reviews of evaluations of interventions where one or more meta-analyses are conducted [27].

PRISMA-Network Meta-Analysis (2015): Reports of systematic reviews that address networks of multiple treatment comparisons [28].

PRISMA-Individual Participant Data (2015): Reports of systematic reviews and meta-analyses of individual participant data. Developed primarily for reviews of randomised trials, but many items apply to other contexts, including reviews of diagnosis and prognosis [29].

PRISMA-Protocols (2015): Protocols for systematic reviews and meta-analyses that summarise aggregate data from studies, particularly those which evaluate the effects of interventions [30, 31].

PRISMA-Harms (2016): Reports of systematic reviews and meta-analyses assessing adverse events (as either a primary or secondary outcome) that are reported in prospective interventional studies or observational studies (with or without a comparison group) [32].

PRISMA-Complex Interventions (2017): Reports of systematic reviews and meta-analyses of complex interventions. Complex interventions are defined as interventions that have ‘multiple components (intervention complexity) and complicated/multiple causal pathways, feedback loops, synergies and/or mediators and moderators of effect (pathway complexity)’ [33, 34].

Fig. 2 Cumulative number of citations of PRISMA extensions published before 2017. Data obtained from Scopus® on 31 July 2017. E&E explanation and elaboration, IPD individual participant data, NMA network meta-analysis

There are also eight PRISMA extensions that are in development (Table 2). These include extensions for SRs of newborn and child health research and for protocols of such SRs, for SRs of diagnostic test accuracy studies, for rapid reviews, for scoping reviews, for SR search methods, for SRs of traditional Chinese medicine interventions and for SRs of in vivo animal studies.
Table 2 Scope of the PRISMA extensions in development

PRISMA-Children (registered Nov 2014): Reports of systematic reviews and meta-analyses of randomised trials or observational studies of newborn and child health research [155].

PRISMA-Protocol for Children (registered Nov 2014): Protocols for systematic reviews and meta-analyses of randomised trials or observational studies of newborn and child health research [155].

PRISMA-Diagnostic Test Accuracy (registered Nov 2015): Reports of systematic reviews and meta-analyses of diagnostic test accuracy studies (i.e. studies of the ability of medical tests to detect a target condition) [156].

PRISMA-Rapid Reviews (registered Nov 2015): Reports of rapid reviews, including those with analogous terminology (e.g. rapid evidence synthesis, rapid knowledge synthesis) [157].

PRISMA-Scoping Reviews (registered Dec 2015): Reports of scoping reviews, which are used to map the concepts underpinning a research area and the main sources and types of evidence available [158].

PRISMA-Search (registered Feb 2016): Reports of literature searches in systematic reviews [159].

PRISMA-Traditional Chinese Medicine (registered Aug 2016): Reports of systematic reviews and meta-analyses of studies that evaluate Chinese herb medicine or moxibustion [160].

PRISMA-In Vivo Animal Studies (to be registered): Reports of systematic reviews and meta-analyses of in vivo animal studies (Manoj M. Lalu, personal communication, June 2017).

Registered PRISMA extensions were identified in the library of reporting guidelines available at the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network website (http://www.equator-network.org/library/) on 24 July 2017.

It is important to evaluate whether the PRISMA Statement and extensions have achieved what they are designed to do—improve the transparency of SRs. We are aware of two previous SRs that have investigated the adherence of SRs to the PRISMA Statement (i.e. the extent to which SRs comply with each item in the statement) [35, 36]. Another SR has examined whether transparency is better in SRs published in journals that endorse the PRISMA Statement (e.g. suggest its use in the journal instructions to authors or require that authors submit a PRISMA checklist accompanying their SR) [37]. However, to our knowledge, there has been no attempt to map what other research has been done on the uptake and impact of the PRISMA Statement and extensions. Also, there has been no attempt to synthesise studies evaluating adherence of SRs published after the PRISMA Statement was disseminated. Therefore, we aimed to address these gaps by conducting a scoping review of meta-research studies evaluating the PRISMA Statement and extensions.

Methods

We did not pre-register the methods of our scoping review, as we are unaware of any register for methodological research of this nature.

We considered articles to be eligible for inclusion if they were an empirical study of any design (e.g. randomised trial, cross-sectional analysis, before-after study), which investigated some component of the PRISMA Statement or extensions (e.g. how often PRISMA is referred to in journal instructions to authors) or which used the PRISMA Statement or one of the extensions for evaluative purposes (e.g. to assess how often SRs adhere to each PRISMA item). We included meta-research studies regardless of language or year of publication. We excluded commentaries, editorials or letters to the editor.

One author (MJP) searched for potentially relevant studies indexed in MEDLINE® from inception to 31 July 2017 (specifically, Ovid MEDLINE® Epub Ahead of Print, In-Process and Other Non-Indexed Citations; Ovid MEDLINE® Daily and Ovid MEDLINE and Versions®). The following search strategy was used to retrieve articles that included the term ‘PRISMA’ (abbreviated or spelled out in full) in the title or abstract of the article:
  1. ‘Preferred Reporting Items for Systematic Reviews and Meta-analyses’.ti,ab.
  2. PRISMA.ti,ab.
  3. 1 or 2

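The strategy above uses Ovid syntax. As a rough approximation (not part of the published methods), a similar retrieval can be run against PubMed through NCBI E-utilities, for example with Biopython; the exact record count will differ from that obtained in the Ovid MEDLINE® segments actually searched.

```python
# Rough PubMed approximation of the Ovid strategy above (illustrative only;
# not the search actually run for this review, which used Ovid MEDLINE).
from Bio import Entrez  # pip install biopython

Entrez.email = "you@example.org"  # placeholder; NCBI requires a contact email

query = (
    '"Preferred Reporting Items for Systematic Reviews and Meta-analyses"[Title/Abstract] '
    'OR PRISMA[Title/Abstract]'
)

# Restrict by publication date to mirror the 31 July 2017 cut-off
handle = Entrez.esearch(
    db="pubmed",
    term=query,
    datetype="pdat",
    mindate="1900/01/01",
    maxdate="2017/07/31",
    retmax=10000,
)
record = Entrez.read(handle)
handle.close()

print(f'Records retrieved: {record["Count"]}')
```
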
One author (MJP) screened all titles and abstracts, and any full-text articles retrieved, to determine eligibility. The same author recorded the types of evidence available in the included meta-research studies. Types of evidence were classified as:
  • data on SR adherence to the PRISMA Statement or extensions;

  • characteristics associated with SR adherence to PRISMA (e.g. journal endorsement, year of publication);

  • the frequency of journal instructions to authors referring to the PRISMA Statement or extensions;

  • other (e.g. frequency of SR authors who reported using the PRISMA Statement to guide their reporting).

To determine the influence of the PRISMA Statement on the transparency of SRs, we pooled the findings of meta-research studies evaluating how often SRs adhere to the PRISMA Statement. This updates a previous SR which included adherence studies published up to October 2014 [35]. One author (MJP) collected from each meta-research study the following data about the SRs evaluated: focus (e.g. therapeutic, diagnostic), clinical area, language, years of publication and frequencies of SRs adhering to each of the 27 PRISMA Statement items. In some cases, authors of meta-research studies recorded if a particular PRISMA item was fully reported or partially reported in each of the SRs evaluated. In such cases, we recorded only the number of SRs that fully reported the PRISMA item. One author (MJP) contacted study authors to request data on adherence to individual items if these data were not available in the published article (e.g. when study authors reported only the mean number of items that SRs adhered to).

We pooled data on SR adherence to individual PRISMA items across all SRs in the included studies. We noted items that fewer than two thirds (67%) of SRs adhered to and those that fewer than half (50%) of SRs adhered to. We also pooled data on SR adherence to individual PRISMA items in a subset of studies that evaluated SRs published after the PRISMA Statement was disseminated. For this analysis, we included studies that evaluated only SRs published in 2010 or later, or studies that reported data on a subgroup of SRs published in 2010 or later. We did not contact study authors for these subgroup data. We conducted all analyses in Microsoft Excel.
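Although the pooling was carried out in Excel, the calculation reduces to summing counts across studies and checking the two thresholds. The Python sketch below shows that arithmetic for a single PRISMA item; the study labels and counts are placeholders, not data from the included studies.

```python
# Minimal sketch of the pooling described above (illustrative only).
# Input: per-study counts of SRs adhering to one PRISMA item.
from dataclasses import dataclass

@dataclass
class StudyCounts:
    study: str
    adherent: int   # SRs in the study that fully reported the item
    assessed: int   # SRs in the study assessed against the item

def pooled_adherence(counts: list[StudyCounts]) -> float:
    """Pooled proportion of SRs adhering to one PRISMA item."""
    total_adherent = sum(c.adherent for c in counts)
    total_assessed = sum(c.assessed for c in counts)
    return total_adherent / total_assessed

def flag(proportion: float) -> str:
    """Apply the 67% and 50% thresholds used in the review."""
    if proportion < 0.50:
        return "adhered to by fewer than half of SRs"
    if proportion < 0.67:
        return "adhered to by fewer than two thirds of SRs"
    return "adequately reported"

# Hypothetical counts for one item (e.g. item 5, protocol and registration)
item5 = [
    StudyCounts("Study A", adherent=10, assessed=60),
    StudyCounts("Study B", adherent=25, assessed=90),
]
p = pooled_adherence(item5)
print(f"Item 5: {p:.0%} - {flag(p)}")
```
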

Results

Scoping review of meta-research studies

The search of MEDLINE® yielded 5001 citations (Fig. 3). After screening each title and abstract, we retrieved the full text of 170 articles. We excluded 70 of these articles, most of which were editorials or commentaries (reasons for exclusion are listed in Additional file 1: Table S1). One hundred meta-research studies met our inclusion criteria (listed in Additional file 2: Table S2). The studies were published between 2011 and 2017, and more than half were published in 2015 or later (n = 59). All of the studies were observational in design; there were 86 cross-sectional analyses, six uncontrolled before-after studies, four surveys of authors and four systematic reviews of meta-research studies.
Fig. 3 Flow diagram of identification, screening and inclusion of studies

We recorded 20 different types of evidence available across the included meta-research studies (Table 3). The most common type of evidence available was data on SR adherence to the PRISMA Statement, which was reported in 57/100 (57%) studies. Many of these 57 studies (n = 37 [65%]) also investigated characteristics associated with SR adherence to the PRISMA Statement, such as the type of journal, year of publication and article word count. The third most common type of evidence available was data on the frequency of journals referring to the PRISMA Statement or extensions in the instructions to authors (n = 18/100 [18%]).
Table 3 Types of evidence available in meta-research studies (n = 100) evaluating the PRISMA Statement or extensions

SR adherence to the PRISMA Statement or extensions
  • Data on SR adherence to the PRISMA Statement [38–94]: 57
  • Data on SR adherence to a particular item of the PRISMA Statement (e.g. searching item, risk of bias assessment item) [95–99]: 5
  • Data on SR abstract adherence to the PRISMA-Abstracts extension [74, 100, 101]: 3
  • Data on network meta-analysis adherence to the PRISMA-Network Meta-Analysis extension [102]: 1
  • Data on rapid review adherence to the PRISMA Statement [103]: 1
  • Data on SR adherence to draft versions of the PRISMA-Child and PRISMA-Protocols Child extensions [104]: 1
  • Data on SR adherence to reporting standards derived from the PRISMA Statement [8, 105, 106]: 3
  • Data on SR abstract adherence to items derived from the PRISMA Statement [107, 108]: 2
  • Data on individual participant data meta-analysis adherence to items derived from the PRISMA Statement [109]: 1

Characteristics associated with SR adherence to the PRISMA Statement
  • Association between journal endorsement of the PRISMA Statement and SR adherence to PRISMA [38, 39, 46, 71, 77, 81, 84, 94]: 8
  • Association between factors other than journal endorsement (e.g. type of journal, word count, year of publication) and SR adherence to PRISMA [38, 43, 44, 46–51, 54, 57–62, 64, 66, 68, 70, 71, 73–75, 77, 78, 80, 82–86, 88–90, 92, 94]: 37

Mention of the PRISMA Statement or extensions in journal instructions
  • Frequency of journals referring to the PRISMA Statement or extensions in the instructions to authors [46, 71, 77, 81, 84, 110–122]: 18
  • Frequency of journals referring to the PRISMA Statement or extensions in the instructions to peer reviewers [112, 123]: 2

Other
  • Frequency of SR authors who reported using the PRISMA Statement to guide reporting [8, 71, 114, 115, 124]: 5
  • Frequency of editors who are aware of the PRISMA Statement [111]: 1
  • Frequency of inappropriate citation of the PRISMA Statement by authors [125]: 1
  • Association between adherence to the PRISMA Statement and citation of SRs [126]: 1
  • Authors’ perceived barriers and facilitators to use of the PRISMA-Equity extension [127]: 1
  • Authors’ views on what items are most important to report in SRs [128–130]: 3
  • Systematic reviews of meta-research studies evaluating some component of the PRISMA Statement or extensions [35–37, 131]: 4

Values are the frequency of studies providing each type of evidence.

Few studies have evaluated how well SRs adhere to the PRISMA extensions; adherence to PRISMA for Abstracts and PRISMA for Network Meta-Analyses has been examined in three studies and one study, respectively (Table 3). Further, few studies have investigated whether the endorsement of the PRISMA Statement by journals was associated with adherence to PRISMA (n = 8/100 [8%]). We did not identify any studies that investigated whether journal endorsement of one of the PRISMA extensions was associated with SR adherence to the extension.

Evaluations of SR adherence to the PRISMA Statement

Of the 57 studies evaluating SR adherence to the PRISMA Statement [38–94], most were published between 2015 and 2017 (33/57 [58%]), focused on SRs of therapeutic interventions only (45/57 [79%]), evaluated non-Cochrane SRs only (34/57 [60%]) and evaluated SRs written in English only (39/57 [68%]) (Table 4). A total of 6487 SRs were evaluated across all studies; the median (interquartile range) number of SRs evaluated per study was 74 (44–144). The evaluated SRs were published between 1989 and 2016.
Table 4 Characteristics of 57 studies evaluating SR adherence to the PRISMA Statement

Year of study publication
  • 2011–2014: 24 (42%)
  • 2015–2017: 33 (58%)

Focus of SRs evaluated
  • Therapeutic interventions (treatment/prevention): 45 (79%)
  • Diagnostic: 4 (7%)
  • Mix (e.g. some therapeutic, some diagnostic): 6 (11%)
  • Not specified: 2 (4%)

Clinical area of SRs evaluated
  • Surgery: 14 (25%)
  • General medicine: 5 (9%)
  • Nursing: 5 (9%)
  • Complementary and alternative medicine: 4 (7%)
  • Other (specific clinical condition): 29 (51%)

Median number of SRs evaluated: 74 (44–144)

Median earliest year of publication of SRs evaluated: 2005 (2001–2009)

Median latest year of publication of SRs evaluated: 2013 (2011–2015)

Journal of SRs evaluated
  • Non-Cochrane only: 34 (60%)
  • Both Cochrane and non-Cochrane: 22 (39%)
  • Unclear: 2 (11%)

Language of SRs evaluated
  • English only: 39 (68%)
  • Chinese only: 9 (16%)
  • Portuguese only: 1 (2%)
  • English and LOE (less than 10% LOE): 6 (11%)
  • English and LOE (more than 40% LOE): 2 (4%)

Data given as number (percent) or median (interquartile range)

LOE language other than English, SR systematic review

All 57 studies assessed adherence to individual PRISMA items, with relevant data provided on request by authors of ten studies [39, 42, 43, 45, 66, 67, 69, 77, 79, 85]. By pooling the PRISMA adherence data across SRs in all 57 reports, we identified 11 items that fewer than 67% of SRs adhered to (Fig. 4; numerical data available in Additional file 3: Table S3). These include item 2 (structured summary), item 5 (methods: protocol and registration), item 8 (methods: search), item 11 (methods: data items), item 12 (methods: risk of bias in individual studies), item 15 (methods: risk of bias across studies), item 16 (methods: additional analyses), item 19 (results: risk of bias within studies), item 22 (results: risk of bias across studies), item 23 (results: additional analyses) and item 27 (funding). There were six items that fewer than 50% of SRs adhered to (items 5, 15, 16, 22, 23 and 27).
Fig. 4 Summary percentage across reports of SRs adhering to the PRISMA Statement

PRISMA adherence data for SRs published in 2010 or later (i.e. after the PRISMA Statement was published) were available in 27 studies [38, 39, 41, 42, 44, 46, 56, 60, 62, 68–79, 81–84, 92, 94], which evaluated 2382 SRs. The characteristics of these studies (i.e. focus, clinical area, language of SRs) were similar to those of the total set of studies. SR adherence to the PRISMA Statement was higher for nearly all items in this subset of recent SRs, compared with the adherence data across all SRs (Fig. 4; numerical data available in Additional file 3: Table S3). There were 12 items that more than 80% of SRs adhered to (items 1, 3, 4, 6, 7, 14, 17, 18, 20, 21, 24 and 26). However, lack of transparency remains an issue for many SRs. There were nine items that fewer than 67% of SRs adhered to (items 5, 8, 12, 15, 16, 19, 22, 23 and 27), and one item, item 5 (whether an SR protocol or registration number exists), was adhered to by only 21% of SRs.

Discussion

Our scoping review suggests that the PRISMA Statement and extensions have provided fertile ground for meta-research. Twenty different types of evidence were available across 100 meta-research studies. The most common type of evidence was data on SR adherence to the PRISMA Statement, which has been evaluated in 57 studies. The pooled results of these studies indicate that reporting of many items of the PRISMA Statement is suboptimal, even in those SRs published after its dissemination in 2009. Very few meta-research studies have evaluated SR adherence to the PRISMA extensions, but this is unsurprising given that most extensions were disseminated in 2015 or later. Few studies have tested strategies to increase adherence to the PRISMA Statement and extensions.

Strengths and limitations

There are several strengths of our research. To our knowledge, this is the first attempt to systematically map research conducted on the PRISMA Statement and extensions. Most of the included studies assessing SR adherence to the PRISMA Statement focused on a single clinical area; by pooling data across these studies, we obtained findings that are more generalisable. Also, we managed to obtain unpublished data from ten studies that had not reported data on adherence to each individual PRISMA item [39, 42, 43, 45, 66, 67, 69, 77, 79, 85].

A few limitations must be acknowledged. We included only meta-research articles indexed in one bibliographic database (MEDLINE®) and written in English. However, we do not see any reason why our findings would differ had other databases and meta-research articles in languages other than English been consulted. Screening of records and collection of data from articles were performed by one author only. It is therefore possible that we may have missed some relevant meta-research studies or made errors when recording the frequency of SRs adhering to the PRISMA Statement. We have uploaded all data collected to the Open Science Framework (https://osf.io/7x2mp/) so that interested readers can verify our data and replicate our results. Most of the SRs evaluated in the 57 studies investigating SR adherence to the PRISMA Statement were written in English, and it is possible that non-English language SRs may be less likely to adhere to PRISMA, if their authors were not confident in English. Our classification of types of evidence available in meta-research studies reflects what was reported; we did not contact study authors to enquire whether they conducted other analyses yet chose not to report the findings. We did not record the references of SRs evaluated in each study investigating SR adherence to the PRISMA Statement and so are unaware if some SRs appeared in more than one of the included meta-research studies. However, based on the information regarding the types of SRs (e.g. Cochrane or non-Cochrane), years of publication of SRs and clinical focus of SRs, we judged the number of overlapping SRs to be low.

We were unable to compare the reporting of SRs published after PRISMA was disseminated in 2009 with that of SRs published before 2009, because of how the included meta-research studies were designed and reported. Most studies (43 of 57) included some SRs published before 2009 and some published after 2009, but most did not report the number of SRs in each category. There were 14 studies that included only SRs published after 2009, 13 studies that provided subgroup data on SRs published after 2009 (although not all of these provided corresponding data for SRs published before 2009) and three studies that included only SRs published before 2009. Given that the data on PRISMA adherence in SRs published before 2009 were limited to a small subset of the included studies, we restricted our analyses of PRISMA adherence to all SRs (regardless of year of publication) and to SRs published after 2009. A formal before-after comparison was therefore not possible.

We focused on the PRISMA Statement and extensions, although we are aware of other reporting guidelines for SRs. These include the Methodological Expectations of Cochrane Intervention Reviews (MECIR) reporting standards [132, 133], the American Psychological Association Meta-Analysis Reporting Standards (MARS) [134], the ENTREQ Statement for syntheses of qualitative research [135], the RAMESES publication standards for realist syntheses [136] and meta-narrative reviews [137] and reporting guidance for describing interventions in SRs [138]. More research is needed to map the research conducted on these reporting guidelines.

Comparison with other studies

We are aware of two other syntheses of meta-research studies that have investigated the adherence of SRs to the PRISMA Statement [35, 36]. Samaan et al. [36] included three studies and Pussegoda et al. [35] included 13 studies. Both reached the same conclusion as us, that adherence to the PRISMA Statement is suboptimal; however, unlike our review, neither analysed reporting of SRs published after the PRISMA Statement was published. Another SR by Stevens et al. [37] synthesised the results of three studies exploring whether SR adherence to the PRISMA Statement is higher in journals which endorse the reporting guideline. In our scoping review, we identified an additional five studies that could be added to an update of that review. To our knowledge, ours is the only review which has mapped research conducted on the PRISMA extensions.

Implications of the findings

There are several possible reasons why adherence is better for some PRISMA items than others. The less complex an item, the easier it may be to report. For example, most of the 12 PRISMA items that were adhered to by more than 80% of SRs published in 2010 or later are relatively straightforward to report. These items include identifying the report as an SR or meta-analysis in the title, providing a rationale and objectives, presenting study characteristics and reporting conclusions. Several items in the PRISMA Statement comprise multiple components, which some systematic reviewers may fail to address fully (e.g. item 12 asks authors to ‘describe methods used for assessing risk of bias of individual studies (including specification of whether this was done at the study or outcome level), and how this information is to be used in any data synthesis’). Also, reporting of some items may depend on whether the journal facilitates reporting of that item (e.g. authors may be unable to present a full electronic search strategy (item 8) in journals that do not allow supplementary files). In addition, a majority of systematic reviewers and journal editors may not consider some items with low adherence sufficiently important to report. It would be useful to conduct surveys and interviews with systematic reviewers to explore the contributions of these potential barriers and facilitators to complete SR reporting.

To our knowledge, there have been no prospectively designed, controlled studies evaluating whether the PRISMA Statement or extensions are having their intended effect. This is surprising and reflects a very different evidential threshold from that required to introduce a drug into the marketplace. Instead, only a few cross-sectional or uncontrolled before-after studies have evaluated the impact of journal endorsement of the PRISMA Statement on the reporting of SRs. Of these eight studies [38, 39, 46, 71, 77, 81, 84, 94], six evaluated whether journals which ‘recommend’ or ‘encourage’ use of the PRISMA Statement in the journal instructions to authors publish SRs that are reported more completely. Two studies investigated whether reporting is clearer in journals that ask authors to submit a PRISMA checklist when submitting an SR. Both are rather low-intensity interventions that may not have the desired effect. For example, a recommendation in the instructions to authors can easily be missed by authors (some of whom will not even check the instructions), while a submitted PRISMA checklist may be ignored by peer reviewers and journal editors who face competing pressures on their time.

Researchers need to develop more efficient and intensive interventions to implement reporting guidelines such as the PRISMA Statement and extensions. We believe technology can play a valuable role in this regard. For example, StatReviewer software performs an automated review of the statistical and reporting integrity of scientific manuscripts (http://www.statreviewer.com/). Manuscripts can currently be checked against the following reporting guidelines: CONSORT 2010 [139], STROBE [140], STARD [141, 142], ARRIVE [143] and the Uniform Requirements for Medical Journals (http://www.icmje.org/recommendations/). StatReviewer is considering including PRISMA in its suite of reporting guidelines (D. Moher, personal communication). We also think rigorous evaluations of StatReviewer, in the form of randomised trials, are needed. Such evaluations could build upon the experiences of previous randomised trials evaluating web-based reporting guideline tools (e.g. WebCONSORT [144], COBWEB [145]).
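To make the idea of automated guideline checking concrete, the toy function below scans a manuscript's text for a few PRISMA-related signals (e.g. mention of a protocol registration or of a risk of bias assessment). This is only a keyword heuristic sketched for illustration; it is not how StatReviewer or any other existing tool works, and the item labels and patterns are our own simplification.

```python
# Toy illustration of automated reporting-guideline checking (not StatReviewer's
# actual logic): flag PRISMA-related signals missing from a manuscript's text.
import re

# Simplified, hypothetical patterns for a handful of PRISMA items
PRISMA_SIGNALS = {
    "Item 5 (protocol/registration)": r"\b(PROSPERO|protocol|registration number)\b",
    "Item 7 (information sources)": r"\b(MEDLINE|Embase|CENTRAL|Scopus|Web of Science)\b",
    "Item 8 (full search strategy)": r"\b(search strateg(y|ies)|search terms)\b",
    "Item 12 (risk of bias methods)": r"\brisk of bias\b",
    "Item 27 (funding)": r"\b(funding|funder|grant)\b",
}

def check_manuscript(text: str) -> dict[str, bool]:
    """Return, for each signal, whether the manuscript text mentions it."""
    return {
        item: bool(re.search(pattern, text, flags=re.IGNORECASE))
        for item, pattern in PRISMA_SIGNALS.items()
    }

if __name__ == "__main__":
    manuscript = "We searched MEDLINE and Embase and assessed risk of bias..."
    for item, present in check_manuscript(manuscript).items():
        status = "mentioned" if present else "possibly missing"
        print(f"{item}: {status}")
```
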

It is 12 years since the PRISMA group last met, and the PRISMA Statement has not been updated since its publication 8 years ago. We believe that an update is necessary to address the poor adherence to the guideline. An updating process will provide the opportunity to discuss how to rearrange the layout and rephrase the checklist items to increase clarity. It will also allow for potential new items to be considered, based on recent methodological developments affecting SR conduct and reporting. These developments include novel guidance on how to:
  • summarise findings when meta-analysis is not appropriate [146, 147];

  • report and synthesise intervention characteristics of included studies [138, 148];

  • use and interpret prediction intervals for random-effects meta-analyses [149, 150];

  • enhance reproducibility of meta-analytic results and share data collected [151, 152] and

  • report the methods and results of updated SRs [153] and living SRs [154].

In addition, developing a comprehensive research translation strategy to help journals endorse and implement the updated guideline may facilitate its use. Journal editors and researchers should work together to develop prospective (ideally randomised), controlled studies to provide robust evidence about the effect of the updated guideline on the transparency of SRs.

Conclusions

Many studies have evaluated how well SRs adhere to the PRISMA Statement, and the pooled result of these suggests that reporting of many items is suboptimal. Little research has been done to design and test strategies to increase adherence to the PRISMA Statement or extensions. An update of the PRISMA Statement, followed by a toolkit of strategies to help journals endorse and implement the updated guideline, may improve the transparency of SRs.

Abbreviations

CI: Confidence interval
E&E: Explanation and elaboration
EQUATOR: Enhancing the QUAlity and Transparency Of health Research
IPD: Individual participant data
NMA: Network meta-analysis
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
QUOROM: QUality Of Reporting Of Meta-analyses
SR: Systematic review

Declarations

Acknowledgments

We thank the following study authors who provided us with unpublished data on SR adherence to PRISMA: Shayden Bryce, Jared Campbell, Alexander Fowler, Richard McGee, Livia Puljak, Amanda Shen and Matt Vassar.

Funding

There was no direct funding for this study. MJP is supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535). DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in the study design, data collection and analysis, decision to publish or preparation of the manuscript.

Availability of data and materials

The data underlying this study are available on the Open Science Framework: https://osf.io/7x2mp/

Authors’ contributions

Both authors declare to meet the ICMJE conditions for authorship. MJP and DM conceived the study design. MJP collected the data and undertook the statistical analyses. MJP wrote the first draft of the article. DM contributed to the revisions of the article. Both authors approved the final version of the submitted article.

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

We have read the journal’s policy, and the authors of this manuscript have the following competing interests: DM is co-Editor in Chief, and MJP is an Associate Editor for Systematic Reviews, but neither had involvement in the peer review process or decision for publication. DM led the development of the PRISMA Statement. MJP and DM are leading the update of the PRISMA Statement. DM is an unpaid advisor to StatReviewer.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
School of Public Health and Preventive Medicine, Monash University, 553 St Kilda Road, Melbourne, VIC, 3004, Australia
(2)
Centre for Journalology and Canadian EQUATOR Centre, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, K1H 8L6, Canada
(3)
School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, K1H 8M5, Canada

References

  1. Agoritsas T, Vandvik PO, Neumann I, Rochwerg B, Jaeschke R, Hayward R, et al. Chapter 5: finding current best evidence. In: Guyatt G, Rennie D, Meade MO, Cook DJ, editors. Users’ guides to the medical literature: a manual for evidence-based clinical practice. 3rd ed. New York: McGraw-Hill; 2015. p. 29–50.Google Scholar
  2. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.PubMedView ArticleGoogle Scholar
  3. Moher D, Tetzlaff J, Tricco AC, Sampson M, Altman DG. Epidemiology and reporting characteristics of systematic reviews. PLoS Med. 2007;4:e78.PubMedPubMed CentralView ArticleGoogle Scholar
  4. Page MJ, Moher D. Mass production of systematic reviews and meta-analyses: an exercise in mega-silliness? Milbank Q. 2016;94(3):515–9.PubMedPubMed CentralView ArticleGoogle Scholar
  5. Ioannidis JP. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. Milbank Q. 2016;94(3):485–514.PubMedPubMed CentralView ArticleGoogle Scholar
  6. Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.PubMedView ArticleGoogle Scholar
  7. McAlister FA, Clark HD, van Walraven C, Straus SE, Lawson FM, Moher D, et al. The medical review article revisited: has the science improved? Ann Intern Med. 1999;131(12):947–51.PubMedView ArticleGoogle Scholar
  8. Page MJ, Shamseer L, Altman DG, Tetzlaff J, Sampson M, Tricco AC, et al. Epidemiology and reporting characteristics of systematic reviews of biomedical research: a cross-sectional study. PLoS Med. 2016;13(5):e1002028.PubMedPubMed CentralView ArticleGoogle Scholar
  9. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217.PubMedPubMed CentralView ArticleGoogle Scholar
  10. Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Qual Reporting Meta-Analyses Lancet. 1999;354(9193):1896–900.Google Scholar
  11. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.PubMedPubMed CentralView ArticleGoogle Scholar
  12. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol. 2009;62(10):e1–34.PubMedView ArticleGoogle Scholar
  13. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:b2535.PubMedPubMed CentralView ArticleGoogle Scholar
  14. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol. 2009;62(10):1006–12.PubMedView ArticleGoogle Scholar
  15. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–9. W64PubMedView ArticleGoogle Scholar
  16. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA Statement. Open Medicine. 2009;3(3):e123–30.PubMedPubMed CentralGoogle Scholar
  17. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Reprint—preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Phys Ther. 2009;89(9):873–80.PubMedGoogle Scholar
  18. Moher D, Liberati A, Tetzlaff J, Altman DG, Altman D, Antes G, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement (Chinese edition). J Chin Integrative Med. 2009;7(9):889–96.View ArticleGoogle Scholar
  19. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement.[erratum appears in Int J Surg. 2010;8(8):658]. Int J Surg. 2010;8(5):336–41.PubMedView ArticleGoogle Scholar
  20. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ. 2009;339:b2700.PubMedPubMed CentralView ArticleGoogle Scholar
  21. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009;151(4):W65–94.PubMedView ArticleGoogle Scholar
  22. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.PubMedPubMed CentralView ArticleGoogle Scholar
  23. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ital J Public Health. 2009;6(4):354–91.Google Scholar
  24. Welch V, Petticrew M, Tugwell P, Moher D, O'Neill J, Waters E, et al. PRISMA-Equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med. 2012;9(10):e1001333.PubMedPubMed CentralView ArticleGoogle Scholar
  25. Welch V, Petticrew M, Petkovic J, Moher D, Waters E, White H, et al. Extending the PRISMA statement to equity-focused systematic reviews (PRISMA-E 2012): explanation and elaboration. Int J Equity Health. 2015;14:92.PubMedPubMed CentralView ArticleGoogle Scholar
  26. Welch V, Petticrew M, Petkovic J, Moher D, Waters E, White H, et al. Extending the PRISMA statement to equity-focused systematic reviews (PRISMA-E 2012): explanation and elaboration. J Clin Epidemiol. 2016;70:68–89.PubMedView ArticleGoogle Scholar
  27. Beller EM, Glasziou PP, Altman DG, Hopewell S, Bastian H, Chalmers I, et al. PRISMA for abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med. 2013;10:e1001419.PubMedPubMed CentralView ArticleGoogle Scholar
  28. Hutton B, Salanti G, Caldwell DM, Chaimani A, Schmid CH, Cameron C, et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann Intern Med. 2015;162(11):777–84.PubMedView ArticleGoogle Scholar
  29. Stewart LA, Clarke M, Rovers M, Riley RD, Simmonds M, Stewart G, et al. Preferred reporting items for systematic review and meta-analyses of individual participant data: the PRISMA-IPD statement. JAMA. 2015;313(16):1657–65.PubMedView ArticleGoogle Scholar
  30. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1.PubMedPubMed CentralView ArticleGoogle Scholar
  31. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;349:g7647.View ArticleGoogle Scholar
  32. Zorzela L, Loke YK, Ioannidis JP, Golder S, Santaguida P, Altman DG, et al. PRISMA harms checklist: improving harms reporting in systematic reviews. BMJ. 2016;352:i157.PubMedView ArticleGoogle Scholar
  33. Guise JM, Butler ME, Chang C, Viswanathan M, Pigott T, Tugwell P, et al. AHRQ series on complex intervention systematic reviews—paper 6: PRISMA-CI extension statement and checklist. J Clin Epi. 2017;90:43-50.Google Scholar
  34. Guise JM, Butler M, Chang C, Viswanathan M, Pigott T, Tugwell P, et al. AHRQ series on complex intervention systematic reviews-paper 7: PRISMA-CI elaboration and explanation. J Clin Epi. 2017;90:51-58.Google Scholar
  35. Pussegoda K, Turner L, Garritty C, Mayhew A, Skidmore B, Stevens A, et al. Systematic review adherence to methodological or reporting quality. Systematic Reviews. 2017;6(1):131.PubMedPubMed CentralView ArticleGoogle Scholar
  36. Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, et al. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.PubMedPubMed CentralGoogle Scholar
  37. Stevens A, Shamseer L, Weinstein E, Yazdi F, Turner L, Thielman J, et al. Relation of completeness of reporting of health research to journals’ endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804.PubMedPubMed CentralView ArticleGoogle Scholar
  38. Adie S, Ma D, Harris IA, Naylor JM, Craig JC. Quality of conduct and reporting of meta-analyses of surgical interventions. Ann Surg. 2015;261(4):685–94.PubMedView ArticleGoogle Scholar
  39. Agha RA, Fowler AJ, Limb C, Whitehurst K, Coe R, Sagoo H, et al. Impact of the mandatory implementation of reporting guidelines on reporting quality in a surgical journal: a before and after study. Int J Surg. 2016;30:169–72.PubMedView ArticleGoogle Scholar
  40. Aguiar PM, Brito GD, Correr CJ, Lyra Junior DP, Storpirtis S. Exploring the quality of systematic reviews on pharmacist interventions in patients with diabetes: an overview. Ann Pharmacother. 2014;48(7):887–96.PubMedView ArticleGoogle Scholar
  41. Akhigbe T, Zolnourian A, Bulters D. Compliance of systematic reviews articles in brain arteriovenous malformation with PRISMA statement guidelines: review of literature. J Clin Neurosci. 2017;39:45–8.PubMedView ArticleGoogle Scholar
  42. Bryce S, Sloan E, Lee S, Ponsford J, Rossell S. Cognitive remediation in schizophrenia: a methodological appraisal of systematic reviews and meta-analyses. J Psychiatr Res. 2016;75:91–106.PubMedView ArticleGoogle Scholar
  43. Campbell JM, Kavanagh S, Kurmis R, Munn Z. Systematic reviews in burns care: poor quality and getting worse. J Burn Care Res. 2017;38(2):e552–e67.PubMedView ArticleGoogle Scholar
  44. Chapman SJ, Drake TM, Bolton WS, Barnard J, Bhangu A. Longitudinal analysis of reporting and quality of systematic reviews in high-impact surgical journals. Br J Surg. 2017;104(3):198–204.PubMedView ArticleGoogle Scholar
  45. Chong AB, Taylor M, Schubert G, Vassar M. Interventional radiology clinical practice guideline recommendations for neurovascular disorders are not based on high-quality systematic reviews. AJNR: Am J Neuroradiol. 2017;38(4):759–65.PubMedView ArticleGoogle Scholar
  46. Cullis PS, Gudlaugsdottir K, Andrews J. A systematic review of the quality of conduct and reporting of systematic reviews and meta-analyses in paediatric surgery. PLoS One. 2017;12(4):e0175213.PubMedPubMed CentralView ArticleGoogle Scholar
  47. DiSilvestro KJ, Tjoumakaris FP, Maltenfort MG, Spindler KP, Freedman KB. Systematic reviews in sports medicine. Am J Sports Med. 2016;44(2):533–8.PubMedView ArticleGoogle Scholar
  48. Evaniew N, van der Watt L, Bhandari M, Ghert M, Aleem I, Drew B, et al. Strategies to improve the credibility of meta-analyses in spine surgery: a systematic survey. Spine J. 2015;15(9):2066–76.PubMedView ArticleGoogle Scholar
  49. Fleming PS, Seehra J, Polychronopoulou A, Fedorowicz Z, Pandis N. A PRISMA assessment of the reporting quality of systematic reviews in orthodontics. Angle Orthod. 2013;83(1):158–63.PubMedView ArticleGoogle Scholar
  50. Gagnier JJ, Kellam PJ. Reporting and methodological quality of systematic reviews in the orthopaedic literature. J Bone Joint Surg (Am Vol). 2013;95(11):e771–7.Google Scholar
  51. Ge L, Wang JC, Li JL, Liang L, An N, Shi XT, et al. The assessment of the quality of reporting of systematic reviews/meta-analyses in diagnostic tests published by authors in China. PLoS One. 2014;9(1):e85908.PubMedPubMed CentralView ArticleGoogle Scholar
  52. Hammad TA, Neyarapally GA, Pinheiro SP, Iyasu S, Rochester G, Dal Pan G. Reporting of meta-analyses of randomized controlled trials with a focus on drug safety: an empirical assessment. Clin Trials. 2013;10(3):389–97.PubMedView ArticleGoogle Scholar
  53. Jin YH, Ma ET, Gao WJ, Hua W, Dou HY. Reporting and methodological quality of systematic reviews or meta-analyses in nursing field in China. Int J Nurs Pract. 2014;20(1):70–8.PubMedView ArticleGoogle Scholar
  54. Klimo P Jr, Thompson CJ, Ragel BT, Boop FA. Methodology and reporting of meta-analyses in the neurosurgical literature. J Neurosurg. 2014;120(4):796–810.PubMedView ArticleGoogle Scholar
  55. Kurz A, Evaniew N, Yeung M, Samuelsson K, Peterson D, Ayeni OR. Credibility and quality of meta-analyses addressing graft choice in anterior cruciate ligament reconstruction: a systematic review. Knee Surg Sports Traumatol Arthrosc. 2017;25(2):538–51.PubMedView ArticleGoogle Scholar
  56. Lee SY, Sagoo H, Whitehurst K, Wellstead G, Fowler AJ, Agha RA, et al. Compliance of systematic reviews in plastic surgery with the PRISMA Statement. JAMA Facial Plastic Surgery. 2016;18(2):101–5.PubMedView ArticleGoogle Scholar
  57. Li JL, Ge L, Ma JC, Zeng QL, Yao L, An N, et al. Quality of reporting of systematic reviews published in “evidence-based” Chinese journals. Syst Rev. 2014;3:58.PubMedPubMed CentralView ArticleGoogle Scholar
  58. Li X, Wang R, Shi X, Su J, Pan Y, Tian J, et al. Reporting characteristics and quality of systematic reviews of acupuncture analgesia. Pain Pract. 2017;13:13.Google Scholar
  59. Liu D, Jin J, Tian J, Yang K. Quality assessment and factor analysis of systematic reviews and meta-analyses of endoscopic ultrasound diagnosis. PLoS One. 2015;10(4):e0120911.PubMedPubMed CentralView ArticleGoogle Scholar
  60. Liu P, Qiu Y, Qian Y, Chen X, Wang Y, Cui J, et al. Quality of meta-analyses in major leading gastroenterology and hepatology journals: a systematic review. J Gastroenterol Hepatol. 2017;32(1):39–44.PubMedView ArticleGoogle Scholar
  61. Liu X, Kinzler M, Yuan J, He G, Zhang L. Low reporting quality of the meta-analyses in diagnostic pathology. Arch Pathol Lab Med. 2017;141(3):423–30.PubMedView ArticleGoogle Scholar
  62. Liu Y, Zhang R, Huang J, Zhao X, Liu D, Sun W, et al. Reporting quality of systematic reviews/meta-analyses of acupuncture. PLoS One. 2014;9(11):e113172.PubMedPubMed CentralView ArticleGoogle Scholar
  63. Luo J, Xu H, Yang G, Qiu Y, Liu J, Chen K. Oral Chinese proprietary medicine for angina pectoris: an overview of systematic reviews/meta-analyses. Complement Ther Med. 2014;22(4):787–800.PubMedView ArticleGoogle Scholar
  64. Ma B, Guo J, Qi G, Li H, Peng J, Zhang Y, et al. Epidemiology, quality and reporting characteristics of systematic reviews of traditional Chinese medicine interventions published in Chinese journals. PLoS One. 2011;6(5):e20185.PubMedPubMed CentralView ArticleGoogle Scholar
  65. Ma B, Qi GQ, Lin XT, Wang T, Chen ZM, Yang KH. Epidemiology, quality, and reporting characteristics of systematic reviews of acupuncture interventions published in Chinese journals. J Altern Complement Med. 2012;18(9):813–7.PubMedView ArticleGoogle Scholar
  66. Martins DE, Astur N, Kanas M, Ferretti M, Lenza M, Wajchenberg M. Quality assessment of systematic reviews for surgical treatment of low back pain: an overview. Spine J. 2016;16(5):667–75.PubMedView ArticleGoogle Scholar
  67. McGee RG, Craig JC, Rogerson TE, Webster AC. Systematic reviews of surgical procedures in children: quantity, coverage and quality. J Paediatr Child Health. 2013;49(4):319–24.PubMedView ArticleGoogle Scholar
  68. Nicolau I, Ling D, Tian L, Lienhardt C, Pai M. Methodological and reporting quality of systematic reviews on tuberculosis. Int J Tuberc Lung Dis. 2013;17(9):1160–9.PubMedView ArticleGoogle Scholar
  69. Nissen T, Wayant C, Wahlstrom A, Sinnett P, Fugate C, Herrington J, et al. Methodological quality, completeness of reporting and use of systematic reviews as evidence in clinical practice guidelines for paediatric overweight and obesity. Clin Obesity. 2017;7(1):34–45.View ArticleGoogle Scholar
  70. Padula RS, Pires RS, Alouche SR, Chiavegato LD, Lopes AD, Costa LO. Analysis of reporting of systematic reviews in physical therapy published in Portuguese. Rev Bras Fisioter. 2012;16(4):381–8.PubMedView ArticleGoogle Scholar
  71. Panic N, Leoncini E, de Belvis G, Ricciardi W, Boccia S. Evaluation of the endorsement of the preferred reporting items for systematic reviews and meta-analysis (PRISMA) statement on the quality of published systematic review and meta-analyses. PLoS One. 2013;8(12):e83138.PubMedPubMed CentralView ArticleGoogle Scholar
  72. Passon AM, Drabik A, Sawicki PT. Quality scores do not predict discrepant statistical significances among meta-analyses on different targets of glycemic control in type 2 diabetes. J Clin Epidemiol. 2013;66(12):1356–66.PubMedView ArticleGoogle Scholar
  73. Pastorino R, Milovanovic S, Stojanovic J, Efremov L, Amore R, Boccia S. Quality assessment of studies published in open access and subscription journals: results of a systematic evaluation. PLoS One. 2016;11(5):e0154217.PubMedPubMed CentralView ArticleGoogle Scholar
  74. Peters JP, Hooft L, Grolman W, Stegeman I. Reporting quality of systematic reviews and meta-analyses of otorhinolaryngologic articles based on the PRISMA Statement. PLoS One. 2015;10(8):e0136540.PubMedPubMed CentralView ArticleGoogle Scholar
  75. Pidgeon TE, Wellstead G, Sagoo H, Jafree DJ, Fowler AJ, Agha RA. An assessment of the compliance of systematic review articles published in craniofacial surgery with the PRISMA statement guidelines: a systematic review. Journal of craniomaxillofacial. Surgery. 2016;44(10):1522–30.Google Scholar
  76. Pinzon MC, Hayden DM, Ariel D, Bartosiak KA, Chiodo MV, Kosmidis K, et al. Are our publications failing the inspection?: a review of the publications in rectal cancer surgery between 2002 and 2012. Dis Colon Rectum. 2014;57(8):983–92.PubMedView ArticleGoogle Scholar
  77. Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;Google Scholar
  78. Rice DB, Kloda LA, Shrier I, Thombs BD. Reporting completeness and transparency of meta-analyses of depression screening tool accuracy: a comparison of meta-analyses published before and after the PRISMA statement. J Psychosom Res. 2016;87:57–69.PubMedView ArticleGoogle Scholar
  79. Scott J, Howard B, Sinnett P, Schiesel M, Baker J, Henderson P, et al. Variable methodological quality and use found in systematic reviews referenced in STEMI clinical practice guidelines. Am J Emerg Med. 2017;14:14.Google Scholar
  80. Shi C, Zhu L, Wang X, Qin C, Xu Q, Tian J. Epidemiology, methodological and reporting characteristics of systematic reviews of nursing interventions published in China. Int J Nurs Pract. 2014;20(6):689–700.
  81. Tam WW, Lo KK, Khalechelvam P. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study. BMJ Open. 2017;7(2):e013905.
  82. Tan WK, Wigley J, Shantikumar S. The reporting quality of systematic reviews and meta-analyses in vascular surgery needs improvement: a systematic review. Int J Surg. 2014;12(12):1262–5.
  83. Tian J, Zhang J, Ge L, Yang K, Song F. The methodological and reporting quality of systematic reviews from China and the USA are similar. J Clin Epidemiol. 2017;85:50–8.
  84. Tunis AS, McInnes MD, Hanna R, Esmail K. Association of study quality with completeness of reporting: have completeness of reporting and quality of systematic reviews and meta-analyses in major radiology journals changed since publication of the PRISMA statement? [erratum appears in Radiology. 2014 Jul;272(1):304]. Radiology. 2013;269(2):413–26.
  85. Wasiak J, Shen AY, Ware R, O'Donohoe TJ, Faggion CM Jr. Methodological quality and reporting of systematic reviews in hand and wrist pathology. J Hand Surg Eur Vol. 2017:1753193417712660.
  86. Wasiak J, Tyack Z, Ware R, Goodwin N, Faggion CM Jr. Poor methodological quality and reporting standards of systematic reviews in burn care management. Int Wound J. 2016;18:18.
  87. Weir CR, Staggers N, Laukert T. Reviewing the impact of computerized provider order entry on clinical outcomes: the quality of systematic reviews. Int J Med Inform. 2012;81(4):219–31.
  88. Willis BH, Quigley M. The assessment of the quality of reporting of meta-analyses in diagnostic research: a systematic review. BMC Med Res Methodol. 2011;11:163.
  89. Xiao Z, Zhang Y, Wang Y, Xu F. Quality assessment for systematic review/meta-analysis on antidepressant therapy published in Chinese journals. Int J Pharmacol. 2012;8(7):614–20.
  90. Yang M, Jiang L, Wang A, Xu G. Epidemiology characteristics, reporting characteristics, and methodological quality of systematic reviews and meta-analyses on traditional Chinese medicine nursing interventions published in Chinese journals. Int J Nurs Pract. 2017;23(1).
  91. Yang SL, Ying K, Wang F, Wang L, Ren XY, Yang QF. Methodological and reporting quality assessment for Chinese systematic reviews and meta-analysis in oral medicine. Shanghai Kou Qiang Yi Xue/Shanghai J Stomatol. 2015;24(4):505–10.
  92. Zhang H, Han J, Zhu YB, Lau WY, Schwartz ME, Xie GQ, et al. Reporting and methodological qualities of published surgical meta-analyses. J Clin Epidemiol. 2016;70:4–16.
  93. Zhang J, Wang J, Han L, Zhang F, Cao J, Ma Y. Epidemiology, quality, and reporting characteristics of systematic reviews and meta-analyses of nursing interventions published in Chinese journals. Nurs Outlook. 2015;63(4):446–55.e4.
  94. Zhu Y, Fan L, Zhang H, Wang M, Mei X, Hou J, et al. Is the best evidence good enough: quality assessment and factor analysis of meta-analyses on depression. PLoS One. 2016;11(6):e0157808.
  95. Atakpo P, Vassar M. Publication bias in dermatology systematic reviews and meta-analyses. J Dermatol Sci. 2016;82(2):69–74.
  96. Hedin RJ, Umberham BA, Detweiler BN, Kollmorgen L, Vassar M. Publication bias and nonreporting found in majority of systematic reviews and meta-analyses in anesthesiology journals. Anesth Analg. 2016;123(4):1018–25.
  97. Fleming PS, Koletsi D, Seehra J, Pandis N. Systematic reviews published in higher impact clinical journals were of higher quality. J Clin Epidemiol. 2014;67(7):754–9.
  98. Saltaji H, Ospina MB, Armijo-Olivo S, Agarwal S, Cummings GG, Amin M, et al. Evaluation of risk of bias assessment of trials in systematic reviews of oral health interventions, 1991–2014: a methodology study. J Am Dent Assoc. 2016;147(9):720–8.e1.
  99. Toews LC. Compliance of systematic reviews in veterinary journals with Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) literature search reporting guidelines. J Med Libr Assoc. 2017;105(3):233–9.
  100. Tsou AY, Treadwell JR. Quality and clarity in systematic review abstracts: an empirical study. Res Synth Methods. 2016;7(4):447–58.
  101. Bigna JJ, Um LN, Nansseu JR. A comparison of quality of abstracts of systematic reviews including meta-analysis of randomized controlled trials in high-impact general medicine journals before and after the publication of PRISMA extension for abstracts: a systematic review and meta-analysis. Syst Rev. 2016;5(1):174.
  102. Ge L, Tian JH, Li XX, Song F, Li L, Zhang J, et al. Epidemiology characteristics, methodological assessment and reporting of statistical analysis of network meta-analyses in the field of cancer. Sci Rep. 2016;6:37208.
  103. Kelly SE, Moher D, Clifford TJ. Quality of conduct and reporting in rapid reviews: an exploration of compliance with PRISMA and AMSTAR guidelines. Syst Rev. 2016;5:79.
  104. Farid-Kapadia M, Joachim KC, Balasingham C, Clyburne-Sherin A, Offringa M. Are child-centric aspects in newborn and child health systematic review and meta-analysis protocols and reports adequately reported?—two systematic reviews. Syst Rev. 2017;6(1):31.
  105. Turner L, Galipeau J, Garritty C, Manheimer E, Wieland LS, Yazdi F, et al. An evaluation of epidemiological and reporting characteristics of complementary and alternative medicine (CAM) systematic reviews (SRs). PLoS One. 2013;8(1):e53536.
  106. Gianola S, Gasparini M, Agostini M, Castellini G, Corbetta D, Gozzer P, et al. Survey of the reporting characteristics of systematic reviews in rehabilitation. Phys Ther. 2013;93(11):1456–66.
  107. Kiriakou J, Pandis N, Fleming PS, Madianos P, Polychronopoulou A. Reporting quality of systematic review abstracts in leading oral implantology journals. J Dent. 2013;41(12):1181–7.
  108. Seehra J, Fleming PS, Polychronopoulou A, Pandis N. Reporting completeness of abstracts of systematic reviews published in leading dental specialty journals. Eur J Oral Sci. 2013;121(2):57–62.
  109. Simmonds M, Stewart G, Stewart L. A decade of individual participant data meta-analyses: a review of current practice. Contemp Clin Trials. 2015;45(Pt A):76–83.
  110. Glujovsky D, Boggino C, Riestra B, Coscia A, Sueldo CE, Ciapponi A. Quality of reporting in infertility journals. Fertil Steril. 2015;103(1):236–41.
  111. Glujovsky D, Villanueva E, Reveiz L, Murasaki R. Adherence to research reporting guidelines in biomedical journals in Latin America and the Caribbean. Pan Am J Public Health. 2014;36(4):232–7.
  112. Hua F, Walsh T, Glenny AM, Worthington H. Surveys on reporting guideline usage in dental journals. J Dent Res. 2016;95(11):1207–13.
  113. Knuppel H, Metz C, Meerpohl JJ, Strech D. How psychiatry journals support the unbiased translation of clinical research. A cross-sectional study of editorial policies. PLoS One. 2013;8(10):e75995.
  114. Koch M, Riss P, Umek W, Hanzal E. The explicit mentioning of reporting guidelines in urogynecology journals in 2013: a bibliometric study. Neurourol Urodyn. 2016;35(3):412–6.
  115. Mannocci A, Saulle R, Colamesta V, D'Aguanno S, Giraldi G, Maffongelli E, et al. What is the impact of reporting guidelines on public health journals in Europe? The case of STROBE, CONSORT and PRISMA. J Public Health. 2015;37(4):737–40.
  116. Meerpohl JJ, Wolff RF, Antes G, von Elm E. Are pediatric open access journals promoting good publication practice? An analysis of author instructions. BMC Pediatr. 2011;11:27.
  117. Reveiz L, Villanueva E, Iko C, Simera I. Compliance with clinical trial registration and reporting guidelines by Latin American and Caribbean journals. Cad Saude Publica. 2013;29(6):1095–100.
  118. Sims MT, Henning NM, Wayant CC, Vassar M. Do emergency medicine journals promote trial registration and adherence to reporting guidelines? A survey of “instructions for authors”. Scand J Trauma Resusc Emerg Med. 2016;24(1):137.
  119. Smith TA, Kulatilake P, Brown LJ, Wigley J, Hameed W, Shantikumar S. Do surgery journals insist on reporting by CONSORT and PRISMA? A follow-up survey of ‘instructions to authors’. Ann Med Surg. 2015;4(1):17–21.
  120. Tao KM, Li XQ, Zhou QH, Moher D, Ling CQ, Yu WF. From QUOROM to PRISMA: a survey of high-impact medical journals’ instructions to authors and a review of systematic reviews in anesthesia literature. PLoS One. 2011;6(11):e27611.
  121. Toews I, Binder N, Wolff RF, Toprak G, von Elm E, Meerpohl JJ. Guidance in author instructions of hematology and oncology journals: a cross sectional and longitudinal study. PLoS One. 2017;12(4):e0176489.
  122. Wayant C, Smith C, Sims M, Vassar M. Hematology journals do not sufficiently adhere to reporting guidelines: a systematic review. J Thromb Haemost. 2017;15(4):608–17.
  123. Hirst A, Altman DG. Are peer reviewers encouraged to use reporting guidelines? A survey of 116 health research journals. PLoS One. 2012;7(4):e35621.
  124. Tsujimoto Y, Tsujimoto H, Kataoka Y, Kimachi M, Shimizu S, Ikenoue T, et al. Majority of systematic reviews published in high-impact journals neglected to register the protocols: a meta-epidemiological study. J Clin Epidemiol. 2017;84:54–60.
  125. Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.
  126. van der Pol CB, McInnes MD, Petrcich W, Tunis AS, Hanna RI. Is quality and completeness of reporting of systematic reviews and meta-analyses published in high impact radiology journals associated with citation rates? PLoS One. 2015;10(3):e0119892.
  127. Burford BJ, Welch V, Waters E, Tugwell P, Moher D, O'Neill J, et al. Testing the PRISMA-Equity 2012 reporting guideline: the perspectives of systematic review authors. PLoS One. 2013;8(10):e75122.
  128. Lee AW. Use of network meta-analysis in systematic reviews: a survey of authors. Syst Rev. 2016;5:8.
  129. Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods. 2014;5(2):98–115.
  130. Shi X, Wang X, Liu Y, Li X, Wei D, Zhao X, et al. A survey of evidence users about the information need of acupuncture clinical evidence. BMC Complement Altern Med. 2016;16(1):455.
  131. Pussegoda K, Turner L, Garritty C, Mayhew A, Skidmore B, Stevens A, et al. Identifying approaches for assessing methodological and reporting quality of systematic reviews: a descriptive study. Syst Rev. 2017;6(1):117.
  132. Churchill R, Lasserson T, Chandler J, Tovey D, Higgins JPT. Standards for the reporting of new Cochrane Intervention Reviews. In: Higgins JPT, Lasserson T, Chandler J, Tovey D, Churchill R, editors. Methodological Expectations of Cochrane Intervention Reviews. London: Cochrane; 2016.
  133. Chandler J, Lasserson T, Higgins JPT, Tovey D, Churchill R. Standards for the planning, conduct and reporting of updates of Cochrane Intervention Reviews. In: Higgins JPT, Lasserson T, Chandler J, Tovey D, Churchill R, editors. Methodological Expectations of Cochrane Intervention Reviews. London: Cochrane; 2016.
  134. APA Publications and Communications Board Working Group on Journal Article Reporting Standards. Reporting standards for research in psychology: why do we need them? What might they be? Am Psychol. 2008;63(9):839–51.
  135. Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181.
  136. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11:21.
  137. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: meta-narrative reviews. BMC Med. 2013;11:20.
  138. Hoffmann TC, Oxman AD, Ioannidis JP, Moher D, Lasserson TJ, Tovey DI, et al. Enhancing the usability of systematic reviews by improving the consideration and description of interventions. BMJ. 2017;358:j2998.
  139. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c332.
  140. von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007;335(7624):806–8.
  141. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, et al. The STARD statement for reporting studies of diagnostic accuracy: explanation and elaboration. Ann Intern Med. 2003;138(1):W1–12.
  142. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig L, et al. STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. BMJ. 2015;351:h5527.
  143. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol. 2010;8(6):e1000412.
  144. Hopewell S, Boutron I, Altman DG, Barbour G, Moher D, Montori V, et al. Impact of a web-based tool (WebCONSORT) to improve the reporting of randomised trials: results of a randomised controlled trial. BMC Med. 2016;14(1):199.
  145. Barnes C, Boutron I, Giraudeau B, Porcher R, Altman DG, Ravaud P. Impact of an online writing aid tool for writing a randomized trial report: the COBWEB (Consort-based WEB tool) randomized controlled trial. BMC Med. 2015;13:221.
  146. Thomson HJ, Thomas S. The effect direction plot: visual display of non-standardised effects across multiple outcome domains. Res Synth Methods. 2013;4(1):95–101.
  147. Harrison S, Jones HE, Martin RM, Lewis SJ, Higgins JPT. The albatross plot: a novel graphical tool for presenting results of diversely reported studies in a systematic review. Res Synth Methods. 2017;8(3):281–9.
  148. Glasziou PP, Chalmers I, Green S, Michie S. Intervention synthesis: a missing link between a systematic review and practical treatment(s). PLoS Med. 2014;11(8):e1001690.
  149. Riley RD, Higgins JP, Deeks JJ. Interpretation of random effects meta-analyses. BMJ. 2011;342:d549.
  150. IntHout J, Ioannidis JP, Rovers MM, Goeman JJ. Plea for routinely presenting prediction intervals in meta-analysis. BMJ Open. 2016;6(7):e010247.
  151. Li T, Vedula SS, Hadar N, Parkin C, Lau J, Dickersin K. Innovations in data collection, management, and archiving for systematic reviews. Ann Intern Med. 2015;162(4):287–94.
  152. Lakens D, Hilgard J, Staaks J. On the reproducibility of meta-analyses: six practical recommendations. BMC Psychol. 2016;4(1):24.
  153. Garner P, Hopewell S, Chandler J, MacLehose H, Schunemann HJ, Akl EA, et al. When and how to update systematic reviews: consensus and checklist. BMJ. 2016;354:i3507.
  154. Elliott JH, Turner T, Clavisi O, Thomas J, Higgins JP, Mavergames C, et al. Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap. PLoS Med. 2014;11(2):e1001603.
  155. Kapadia MZ, Askie L, Hartling L, Contopoulos-Ioannidis D, Bhutta ZA, Soll R, et al. PRISMA-Children (C) and PRISMA-Protocol for Children (P-C) extensions: a study protocol for the development of guidelines for the conduct and reporting of systematic reviews and meta-analyses of newborn and child health research. BMJ Open. 2016;6(4):e010270.
  156. McInnes M, Moher D, Bossuyt P. PRISMA-DTA: checklist for reporting of diagnostic test accuracy systematic reviews (registered 18 November 2015) http://www.equator-network.org/library/reporting-guidelines-under-development/#52 [Accessed 16 Aug 2017].
  157. Stevens A. PRISMA-RR 2017: an extension to PRISMA for rapid reviews (registered 4 November 2015) http://www.equator-network.org/library/reporting-guidelines-under-development/#51 [Accessed 16 Aug 2017].
  158. Tricco AC, Straus S, Moher D. Preferred Reporting Items for Systematic Reviews and Meta-Analysis extension for Scoping Reviews (PRISMA-ScR) (registered 18 December 2015) http://www.equator-network.org/library/reporting-guidelines-under-development/#55 [Accessed 16 Aug 2017].
  159. Rethlefsen M, Koffel J, Kirtley S. PRISMA-Search: guidelines for reporting systematic review literature searches (registered 17 February 2016) http://www.equator-network.org/library/reporting-guidelines-under-development/#57 [Accessed 16 Aug 2017].
  160. Bian Z. Preferred Reporting Items for Systematic Review and Meta-Analyses of traditional Chinese medicine: the PRISMA-TCM Statement (registered 18 August 2016) http://www.equator-network.org/library/reporting-guidelines-under-development/#65 [Accessed 16 Aug 2017].

Copyright

© The Author(s). 2018
