
How to update a living systematic review and keep it alive during a pandemic: a practical guide



The covid-19 pandemic has highlighted the role of living systematic reviews. The speed at which evidence was generated during the pandemic accentuated the challenges of managing high volumes of research literature.


In this article, we summarise the characteristics of ongoing living systematic reviews on covid-19, and we follow a life cycle approach to describe key steps in a living systematic review.


We identified 97 living systematic reviews on covid-19, published up to 7th November 2022, which focused mostly on the effects of pharmacological interventions (n = 46, 47%) or the prevalence of associated conditions or risk factors (n = 30, 31%). The scopes of several reviews overlapped considerably. Most living systematic reviews included both observational and randomised study designs (n = 45, 46%). Only one-third of the reviews had been updated at least once (n = 34, 35%). We address practical aspects of living systematic reviews, including how to judge whether to start a living systematic review, methods for study identification and selection, data extraction and evaluation, and we give recommendations at each step, drawing on our own experience. We also discuss when it is time to stop and how to publish updates.


Methods to improve the efficiency of searching, study selection, and data extraction using machine learning technologies are being developed; their performance and applicability, particularly for reviews based on observational study designs, should improve; and ways of publishing living systematic reviews and their updates will continue to evolve. Finally, knowing when to end a living systematic review is as important as knowing when to start.


A living systematic review is a systematic review, which is ‘continually updated, incorporating relevant new evidence as it becomes available’ [1]. Researchers are advised to take a living approach when the topic is a priority for decision-making, new evidence is emerging and changing quickly, and certainty in the existing evidence is low [1, 2]. The pandemic of coronavirus disease 2019 (covid-19), caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), fulfils these conditions in general, and many living systematic reviews addressing questions about SARS-CoV-2 and covid-19 were published during the pandemic [3,4,5,6,7,8,9,10].

The speed and sustained accumulation of published research about SARS-CoV-2 and covid-19 since the beginning of 2020 are unprecedented (Fig. 1). By 28 February 2022, more than 314,000 peer-reviewed articles and preprints had been published in five electronic literature databases [11]. After a rapid early surge, around 14,500 articles on SARS-CoV-2 and covid-19 have been published every month (mean publications from January 2021 until February 2022) [11]. Types of publication have changed over time [12], and the evidence itself is changing, for example, as new viral variants arise and new vaccines and treatments are developed.

Fig. 1

Monthly new records on SARS-CoV-2 or covid-19 from January 2020 to February 2022. Number of new records from five electronic databases (PubMed, Embase, PsycINFO, bioRxiv, and medRxiv)

The evolving evidence on covid-19 is being accompanied by changes in living systematic reviews, which were originally defined as an approach to updating an existing systematic review, not a methodology in itself [1]. Now, many authors describe their review as a living systematic review from the outset [3,4,5,6,7,8,9,10]. In these living systematic reviews, distinctions between approaches recommended for rapid reviews [13] and standard systematic reviews [1], which living systematic reviews are supposed to follow, are also becoming blurred. Rapid review methods include processes to speed up production, such as data extraction by a single reviewer, or limits on search dates or languages, even though some of these practices are judged to increase the risk of bias in systematic reviews [13]. Reviewers who have used these methods have described their study designs as a ‘living rapid review’ on the effectiveness of face masks [14], a ‘rapid living systematic review’ of rehabilitation for covid-19 patients [15], or just a ‘living systematic review’ of asymptomatic SARS-CoV-2 infection [10].

Guidance about methods for several aspects of the conduct, reporting, and publication of living systematic reviews is available [1, 16,17,18,19] or in development [20, 21]. The covid-19 pandemic has highlighted not only the methodological considerations when conducting living systematic reviews [19] but also the practical challenges of sustaining a workflow. These challenges include the need for urgent decision-making [22]; managing high volumes of research, especially on observational study designs; and judging whether to start a living systematic review, when it is time to stop, and how to publish updates [23]. To address these practical aspects of living systematic reviews, we use our own experience of doing a living systematic review on asymptomatic SARS-CoV-2 infections as a case study [24]. We also draw on covid-19-related living systematic reviews on changes in mental health in the general population, SARS-CoV-2 diagnostics, epidemiology of covid-19 in pregnancy, and effectiveness of treatments and vaccines, which cover a variety of methods used to conduct and manage reviews [3,4,5,6,7,8,9,10]. In this article, we summarise the characteristics of ongoing living systematic reviews on covid-19; we follow a life cycle approach to describe key steps in a living systematic review and give recommendations at each step.

The state of covid-19 evidence and living systematic reviews

To summarise the status of living systematic reviews about SARS-CoV-2 and covid-19, we searched titles of records in the World Health Organization COVID-19 Database [25] using the search term ‘living systematic review’ on 7th November 2022. We did not search for reviews hosted only on websites (Additional file 1). Of 861 hits, we found 97 unique studies described by the authors as living systematic reviews on covid-19 (Table 1 and Additional file 2). These living systematic reviews mainly addressed questions about the effects of pharmacological interventions (therapies and vaccines) (n = 46, 47%) and the prevalence of SARS-CoV-2 or covid-19-associated conditions or risk factors (n = 30, 31%). Twenty-eight (29%) living systematic reviews included only randomised controlled trials, 22 (23%) reviewed only observational studies, and 45 (46%) reviewed both observational studies and randomised controlled trials. There was considerable overlap in the scopes of some studies. For example, four reviews focused on the diagnostic accuracy of rapid antigen tests, four on the effectiveness of vaccines, three on long-term symptoms of covid-19, and two on transmission of SARS-CoV-2 from mother to child. The studied populations were mainly people with suspected or diagnosed covid-19 or long COVID (n = 61, 63%). Most reviews covered people of any age (n = 61, 63%), including studies of hospital patients (n = 23). Reviews of adults only (n = 34, 35%) included studies of hospital patients (n = 13) and healthcare workers (n = 5). Most living systematic reviews had published a protocol before the first publication of the review (n = 84, 87%), and most had at least one version published in a peer-reviewed journal (n = 95, 98%).

Table 1 Characteristics of living systematic reviews on covid-19, 1 January 2020 to 7 November 2022

Updating with new evidence is a core principle of living systematic reviews [1]. Of the living systematic reviews first published from January to June 2020, 7/9 (78%) had been updated at least once. Of those published from January to June 2021 (12 to 18 months before the date of our search), only 7/22 (32%) had been updated at least once, which might reflect the large workload associated with a living systematic review. No authors clearly stated that their living systematic review had ended.

The workload for a living systematic review depends on the review question, the eligible study designs, and the amount of underlying evidence. Review questions that do not restrict the search by study design, or that rely entirely on observational study designs, such as prevalence or aetiology questions, require more work at the early stages of the review than questions about interventions. One reason is that randomised trials are tagged in databases such as PubMed, validated search filters exist to find them, and reporting guidelines ensure that important items are reported. Indexing and reporting of observational study designs are less consistent, and authors may use different terms to describe the same approach [26]. A second reason is that search terms for questions about topics such as prevalence and aetiology tend to be less specific than those for interventions, generating more hits to screen and from which to extract data.

Establishing and updating a living systematic review

We summarise the steps in the life cycle of a living systematic review in Fig. 2. In Table 2, we summarise the methods that we used in our own living systematic review [10]. Frequently overlooked is the need to be realistic about the time needed to start and update the living systematic review and plan as many steps as possible in advance, taking into consideration that numbers of records to screen might continue to increase quickly, as has happened with covid-19 literature. Given the commitment required and the scale of the workload, reviewers should make sure that their living systematic review question has not already been addressed, or is being addressed, by searching the published literature, systematic review registries, such as the PROSPERO international register of systematic reviews [3,4,5, 9], or the Open Science Framework (OSF) [8, 10].

Fig. 2

Life cycle of a living systematic review

Table 2 Summary of a living systematic review of asymptomatic and presymptomatic SARS-CoV-2 infection

Setting up and managing a review team

The core team should coordinate the tasks of the review team, which includes anticipating and managing changes in workload, workflow, and team composition. At least one core team member should have sufficient programming skills to automate steps in the workflow where possible. In our case study, a core review team of seven became overwhelmed when the number of new hits to be screened increased (Table 2). Growth in the team of researchers can be seen when following updates of other living systematic reviews [8]. Crowdsourcing is a valuable tool for large reviews [17] and can be mutually beneficial for the volunteers [28], but the core team must weigh up the time spent training volunteers against the time saved. Some review teams have anticipated the workload and used crowdsourcing from the outset, increasing the size of the team to more than 100 volunteers to help with screening and data extraction [3], and other reviews mention the use of volunteers [4, 7]. We recruited twenty volunteers from April 2021, through the core team’s networks. All volunteers had previous experience with systematic reviews and agreed to spend at least 3 h per month working on eligibility assessment, data extraction, and/or risk-of-bias assessment. The core team provided online guidance materials (Additional file 3), individual feedback, and automated tools. Training for members of the crowd (Additional file 3) reduced potential disagreements in screening, extra work for the core team, and delays in the living systematic review.

Defining eligibility criteria for authorship is an essential task for the core team at the start of the review process; the policy should be agreed in advance with the whole review team, including crowdsourced members. In our review, levels of contribution and availability changed during and between updates. Team members who fulfilled the criteria for authorship were co-authors of the relevant publication. We created lists of contributorship [29] in which people whose contributions no longer fulfilled criteria for authorship had their contributions acknowledged separately.

Publishing a protocol

A protocol for a living systematic review is also a living document, which should reduce potential biases and avoid post hoc decisions [1, 18, 30]. Publishing a protocol on PROSPERO [3,4,5, 9, 15], on a preprint server, or in a public repository such as the OSF [8, 10] allows rapid sharing and updating of protocols. In a living systematic review, the review questions, scope, and types of evidence included might evolve over time, so authors should document and justify changes to the protocol before starting an update, including decisions about the frequency of updating and about stopping the review. When protocol changes are needed, authors should note that the scope of a living systematic review can only become narrower over time if the original search strategy is to remain unchanged. Over the seven versions of the protocol for our review of asymptomatic SARS-CoV-2 infection, the scope has narrowed [27]. In the first version, there were few publications, and we included study populations in any setting. After our third version, we reduced the number of studies for data extraction by excluding small studies reporting on single family contact investigations and studies of hospitalised people, who were more likely to be symptomatic.

Study identification

Automatic alerts from bibliographic databases can notify researchers when new records are available [17]. For complex reviews, researchers with sufficient programming skills can set up scripts to search and collect results from databases regularly, either through an application programme interface (API) (a software intermediary that communicates with websites from a third-party application) or by ‘web scraping’. Database aggregators are convenient, single sources for a topic of interest; information scientists develop, refine, automate, and update search strings in different electronic sources and de-duplicate the records. Database aggregators for covid-19 literature include the World Health Organization COVID-19 Database and the Cochrane COVID-19 Trials Register. We used the COAP living evidence database, a database aggregator [11], which we ran from March 2020 to March 2022 [10]. We scheduled an automated R script [31] to search COAP weekly, using the task scheduler cron. Each week, the automatic search uploaded 100–200 new records from the COAP database for our living systematic review. We searched preprint servers and included preprints if they fulfilled eligibility criteria. In each update, we checked the status of preprints to see if they had been published in peer-reviewed journals and re-extracted data if the content had changed.
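The weekly search-and-de-duplicate step can be sketched as follows. The authors' pipeline used an R script scheduled with cron; this Python version is only an illustration, and the API endpoint, query parameters, and record fields are hypothetical, not the COAP interface.

```python
"""Sketch of an automated weekly literature search: fetch records added
since the last run from a database API and de-duplicate them against
records already stored. Endpoint and fields are illustrative assumptions.
Scheduled weekly with cron, e.g.:  0 6 * * 1  python fetch.py"""
from __future__ import annotations
import json
from urllib import request

API_URL = "https://example.org/api/records"  # hypothetical endpoint


def fetch_new_records(since: str) -> list[dict]:
    """Query the database API for records added after a given date."""
    url = f"{API_URL}?added_after={since}&format=json"
    with request.urlopen(url) as resp:
        return json.load(resp)


def deduplicate(new: list[dict], known_dois: set[str]) -> list[dict]:
    """Keep only records whose DOI has not been seen in earlier searches;
    records without a DOI are kept for manual checking."""
    unseen = []
    for rec in new:
        doi = (rec.get("doi") or "").lower()
        if doi and doi in known_dois:
            continue  # already screened in an earlier week
        unseen.append(rec)
        if doi:
            known_dois.add(doi)
    return unseen
```

In practice, preprint servers and journal databases would feed the same de-duplication step, so a preprint and its later journal version can be matched and re-checked, as described above.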

Online databases for saving and managing records support a secure and efficient workflow. Living systematic reviewers are using tools such as Evidence for Policy and Practice Information (EPPI)-reviewer [32], Covidence [6, 33, 34], or Microsoft Excel to organise records. New records in our review (Table 2) are saved in a Research Electronic Data Capture (REDCap) database [35], a flexible and secure online system. A copy of the data is stored in a collaborative software repository [36].
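As an illustration of saving records programmatically, the sketch below assembles a call to REDCap's import-records API method. The instance URL, token, and record fields are hypothetical; the form parameters (`token`, `content='record'`, `format='json'`) follow REDCap's documented API, but should be checked against your own instance's API playground before use.

```python
"""Hedged sketch of pushing screened records into a REDCap project via
its API. URL, token, and field names are illustrative assumptions."""
from __future__ import annotations
import json
from urllib import parse, request

REDCAP_URL = "https://redcap.example.org/api/"  # hypothetical instance


def build_import_payload(token: str, records: list[dict]) -> dict:
    """Assemble the form fields for REDCap's 'import records' API method."""
    return {
        "token": token,        # project-specific API token
        "content": "record",
        "format": "json",
        "type": "flat",        # one row per record
        "data": json.dumps(records),
    }


def import_records(token: str, records: list[dict]) -> str:
    """POST the payload to the REDCap API and return its raw response."""
    data = parse.urlencode(build_import_payload(token, records)).encode()
    with request.urlopen(request.Request(REDCAP_URL, data=data)) as resp:
        return resp.read().decode()
```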

Study selection

Several software tools offer fast and user-friendly platforms to facilitate the screening of records [33]. Living systematic reviews on covid-19 have used REDCap surveys [3, 35], EPPI-reviewer [8, 32], and Covidence [6, 34]. These tools support multiple users, allocate tasks, record decisions, and produce automatic reports [33]. The open-source R package revtools [37] supports the screening of titles and abstracts and de-duplication. When specific features are desired, or if software licences are unaffordable, building a custom application using open-source software might be more suitable. We constructed password-protected R Shiny applications to support the selection process (Fig. 3) [10]. The core team allocated records to the reviewing team via REDCap [35]. The applications included features to allow the team to train a machine learning algorithm (see below).

Fig. 3

Annotated screenshot of the R Shiny application used for the selection process. The R Shiny application was developed for the reviewing team of the living systematic review on asymptomatic SARS-CoV-2 infections to screen and verify articles

Semiautomated machine learning tools for the selection process can reduce the volume of studies that needs to be screened manually [38]. However, the tools may not perform as well for observational studies as for RCTs, for which accepted reporting guidelines and terminology facilitate reliable identification of reports [38]. Wynants et al. built a custom classification model to speed up the selection process in their living systematic review of prognostic models for covid-19 [8]. They used the initial set of records that they screened to train an algorithm to recognise patterns in text to identify studies that are very unlikely to be relevant and automatically exclude them. For reporting the results of searching, selection, and inclusion, a specific flow chart for living systematic reviews allows a logical way of updating [39].
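The principle behind such classifiers can be illustrated with a toy, stdlib-only sketch: learn word log-odds from reviewers' earlier include/exclude decisions, then score new titles so that records scoring far below previously included ones can be deprioritised or routed for auto-exclusion. This is not Wynants et al.'s actual model; a production tool would use a proper text classifier and a validated exclusion threshold.

```python
"""Toy relevance scorer for title screening, trained on earlier manual
include/exclude decisions. Illustrative only; not a validated tool."""
from __future__ import annotations
import math
import re
from collections import Counter


def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z]+", text.lower())


def train(labelled: list[tuple[str, bool]]) -> dict[str, float]:
    """Learn smoothed word log-odds from (title, included?) decisions."""
    inc, exc = Counter(), Counter()
    for title, included in labelled:
        (inc if included else exc).update(tokenize(title))
    vocab = set(inc) | set(exc)
    n_inc = sum(inc.values()) + len(vocab)  # +1 smoothing per word
    n_exc = sum(exc.values()) + len(vocab)
    return {w: math.log((inc[w] + 1) / n_inc) - math.log((exc[w] + 1) / n_exc)
            for w in vocab}


def score(weights: dict[str, float], title: str) -> float:
    """Higher scores mean more similar to previously included records."""
    return sum(weights.get(w, 0.0) for w in tokenize(title))
```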

Data collection

Web applications may help to streamline manual data extraction by reviewers who are extracting information independently or verifying the information extracted by another reviewer, including the use of online forms [4, 7], REDCap surveys [3], or standardised prespecified extraction forms [6, 8, 9, 14]. None of these living systematic reviews mentioned the use of automated tools for data collection. We used R Shiny applications to facilitate both steps and save decisions in REDCap [10, 35]. Machine learning tools for data extraction exist, but very few are publicly available [40]. These tools face challenges with variations in wording, missing information, and adapting to subject areas on which they were not developed [40].

Data synthesis

Manual checks on included studies are still needed before starting data synthesis, especially when a large crowd has contributed to selection of studies and extraction of data and rapid processes have been put in place. Routine checks include making sure that data have not been included from a preprint and a published version of the same study. There are many statistical software packages for conducting quantitative data synthesis for living systematic reviews, including Stata [41] and R [4,5,6, 9, 10, 31]. The use of an API to communicate between an online database and the statistical software allows reviewers to import the latest data and update the analysis when new data are available. Reviewers can generate tables and figures automatically using statistical software (e.g. R Markdown [31], Stata [41]).
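The re-runnable analysis step can be sketched as a function that pools the latest study estimates whenever the record database delivers new rows. The authors used R; this stdlib-only Python sketch implements DerSimonian-Laird random-effects pooling, with inputs that are purely illustrative.

```python
"""Minimal DerSimonian-Laird random-effects meta-analysis, written so the
same function can be re-run on each update with the latest extracted
estimates and their variances (illustrative sketch, not the authors' code)."""
from __future__ import annotations
import math


def random_effects(estimates: list[float],
                   variances: list[float]) -> tuple[float, float, float]:
    """Return (pooled estimate, standard error, tau-squared)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird between-study variance
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if df > 0 and c > 0 else 0.0
    # Re-weight with tau-squared added to each within-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

Driven from an API-fed database as described above, such a function lets tables and figures be regenerated automatically with each update.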

There are issues associated with repeated updating of statistical analysis, which are particularly relevant in living systematic reviews. With each update, the analysis of data from RCTs or comparative effectiveness studies is more likely to generate a false statistically significant result [16]. Even when the aim of the living systematic review is to support a decision, e.g. to decide which intervention is more effective, statistical significance is rarely the only criterion guiding this decision. However, reviewers can employ methods that control the type I error if they want [16, 42].

A substantial proportion of living systematic reviews rely on observational studies (Table 1), in which levels of between-study heterogeneity are often high [43] and for which meta-analysis might not be appropriate. Although several living systematic reviews on covid-19 have conducted meta-analyses [4, 6], some did not, owing to high heterogeneity in included studies [10, 14]. In our living systematic review (Table 2), between-study heterogeneity has increased with each update, contrary to our expectation, and we could not explain most of the heterogeneity [10]. In the fifth and sixth versions of the review, we did not produce a summary estimate for the proportion of asymptomatic SARS-CoV-2 infections. Instead, we reported an interquartile range for the results from included studies and estimated a prediction interval [44] to show the range of values expected in a hypothetical future study.
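A prediction interval of this kind is computed from the pooled estimate, its standard error, and the between-study variance tau-squared. The sketch below uses a normal quantile for brevity; the formulation cited above uses a t quantile with k - 2 degrees of freedom, which widens the interval when few studies are included.

```python
"""Sketch of a prediction interval for a random-effects meta-analysis:
the range within which the effect in a new study is expected to fall.
Normal quantile used for simplicity (a t quantile is standard)."""
import math
from statistics import NormalDist


def prediction_interval(pooled: float, se: float, tau2: float,
                        level: float = 0.95) -> tuple[float, float]:
    """Interval half-width combines between-study variance (tau2)
    with the uncertainty of the pooled estimate (se squared)."""
    z = NormalDist().inv_cdf(1 - (1 - level) / 2)
    half = z * math.sqrt(tau2 + se ** 2)
    return pooled - half, pooled + half
```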

Publishing a living systematic review

Living systematic reviews should be published in a way that explicitly cross-references different versions of the report as updates of the same review [20]. These links are needed to make sure that readers have access to the most recent update, and that different versions of a living systematic review are not mistaken for redundant publications. Reviewers should consider contacting the editors of their target journal to find out whether they can submit a living systematic review and how the journal handles updates. Editors of online publications, print publications, and preprint servers use different methods and apply different rules about what they consider a ‘version of record’ [45], which refers to the version of an article that is considered final and is identified online with a digital object identifier (DOI). For living systematic reviews, the version of record is not defined consistently across journals [22]. Different publishers apply different rules to determine whether an update receives the same DOI as a previous version or a new DOI. This decision can depend on whether the journal editors consider an update minor or major. The BMJ assigns the same DOI to all versions of a living systematic review and adds a ‘reader’s note’ to the abstract, signalling the update number and how to find earlier updates [8]. Cochrane reviews have indexed updates for many years and assign a DOI that incorporates the same review number for all updates and includes an extension with the update number [4]. Newer online publishers, such as F1000, use the same principle as the Cochrane library and also include the version number in all article titles [40]. The Public Library of Science (PLOS) publishes minor updates as online comments to the earlier version and assigns a new DOI if the editors consider it a major update [10]. The Annals of Internal Medicine uses a similar approach, with minor updates published as letters [14, 46].
Preprints are not considered a version of record. The medRxiv server allows updates to a living systematic review to be uploaded under the same DOI (available in the history of the article) until the review is published in a peer-reviewed publication [24, 47]. After that, only a major update can be uploaded, and that version receives a new DOI, again, until published. In our case study (Table 2), we have published both preprints [24, 47] and peer-reviewed articles [10, 48].

Transparency is important when sharing results of living systematic reviews. Living systematic review and living guidelines teams who maintain dedicated websites can display updated results as soon as they are incorporated and include links to articles, protocols, and datasets using FAIR principles (findability, accessibility, interoperability, and reuse of digital assets) [3,4,5,6, 8, 10, 22].

Stopping a living systematic review

An important feature of a living systematic review is knowing when to stop, and the criteria for stopping should be part of the review protocol, updated if necessary. Covid-19 living systematic review teams have reported a predefined point at which they intend to stop: either a specific month [3, 4, 9] or when new evidence is unlikely to emerge [6]. In our review, we stated the following criteria for ending the review: when estimates are stable and unlikely to change or the question is no longer of importance [10]. Alternatively, publishers or available funding may determine the lifetime of a living review. The BMJ has set a duration of 2 years for a living systematic review, after which the editors and authors should assess the need for continuation [23].


The covid-19 pandemic has highlighted the importance of living systematic reviews and living evidence. The volume and speed of evidence generated during the covid-19 pandemic have exceeded expectations since the concept of living systematic reviews was first elaborated. Living systematic reviews have an intense workload, and it is especially important to avoid the research waste of reviews with overlapping scope, such as the four reviews on the effectiveness of vaccines that we found in our review of living systematic reviews. Living systematic reviews should be updated as new evidence becomes available, but several studies described as living systematic reviews on covid-19 have not been updated since publication of the first version (n = 63, 65%) [3,4,5, 9]. This highlights the difficulty of keeping a living systematic review alive. In this article, we have summarised the processes in a living systematic review as a life cycle (Fig. 2), described practical considerations at each step, and made recommendations (Table 3), drawing on a case study from our own experiences (Table 2). Methods to improve the efficiency of searching, study selection, and data extraction using machine learning technologies are being developed; their performance and applicability, particularly for reviews based on observational study designs, should improve; and ways of publishing living systematic reviews and their updates will continue to evolve. Finally, knowing when to end a living systematic review is as important as knowing when to start.

Table 3 Recommendations for conducting a living systematic review, by stage of the review life cycle

Availability of data and materials

All data generated or analysed during this study are included in this published article and its supplementary information files.



Abbreviations

API: Application programme interface
covid-19: Coronavirus disease 2019
DOI: Digital object identifier
EPPI: Evidence for Policy and Practice Information
FAIR: Findability, accessibility, interoperability, and reuse of digital assets
OSF: Open Science Framework
PLOS: Public Library of Science
REDCap: Research Electronic Data Capture
SARS-CoV-2: Severe acute respiratory syndrome coronavirus 2


  1. Elliott JH, Synnot A, Turner T, Simmonds M, Akl EA, McDonald S, et al. Living systematic review: 1. Introduction-the why, what, when, and how. J Clin Epidemiol. 2017;91:23–30.


  2. Garner P, Hopewell S, Chandler J, MacLehose H, Schunemann HJ, Akl EA, et al. When and how to update systematic reviews: consensus and checklist. BMJ. 2016;354:i3507.


  3. Salanti G, Cipriani A, Furukawa TA, Peter N, Tonia T, Papakonstantinou T, et al. An efficient way to assess the effect of COVID-19 on mental health in the general population. Lancet Psychiatry. 2021;8(5):e14–5.


  4. Ghosn L, Chaimani A, Evrenoglou T, Davidson M, Graña C, Schmucker C, et al. Interleukin-6 blocking agents for treating COVID-19: a living systematic review. Cochrane Database Syst Rev. 2021;3(3):CD013881.


  5. Allotey J, Stallings E, Bonet M, Yap M, Chatterjee S, Kew T, et al. Clinical manifestations, risk factors, and maternal and perinatal outcomes of coronavirus disease 2019 in pregnancy: living systematic review and meta-analysis. BMJ. 2020;370:m3320.


  6. Siemieniuk RA, Bartoszko JJ, Ge L, Zeraatkar D, Izcovich A, Kum E, et al. Drug treatments for covid-19: living systematic review and network meta-analysis. BMJ. 2020;370:m2980.


  7. Boutron I, Chaimani A, Devane D, Meerpohl JJ, Rada G, Hróbjartsson A, et al. Interventions for the prevention and treatment of COVID-19: a living mapping of research and living network meta-analysis. Cochrane Database Syst Rev. 2020;11:CD013769.


  8. Wynants L, Van Calster B, Collins GS, Riley RD, Heinze G, Schuit E, et al. Prediction models for diagnosis and prognosis of covid-19: systematic review and critical appraisal. BMJ. 2020;369:m1328.


  9. Korang SK, von Rohden E, Veroniki AA, Ong G, Ngalamika O, Siddiqui F, et al. Vaccines to prevent COVID-19: a living systematic review with trial sequential analysis and network meta-analysis of randomized controlled trials. PLOS ONE 2022;17(1):e0260733.


  10. Buitrago-Garcia D, Ipekci AM, Heron L, Imeri H, Araujo-Chaveron L, Arevalo-Rodriguez I, et al. Occurrence and transmission potential of asymptomatic and presymptomatic SARS-CoV-2 infections: update of a living systematic review and meta-analysis. PLOS Med. 2022;19(5):e1003987.


  11. Counotte M, Imeri H, Heron L, Ipekci AM, Low N. COVID-19 Open Access Project - Living Evidence on COVID-19. Accessed 13 Jan 2022.

  12. Ipekci AM, Buitrago-Garcia D, Meili KW, Krauer F, Prajapati N, Thapa S, et al. Outbreaks of publications about emerging infectious diseases: the case of SARS-CoV-2 and Zika virus. BMC Med Res Methodol. 2021;21(1):50.


  13. Pandor A, Kaltenthaler E, Martyn-St James M, Wong R, Cooper K, Dimairo M, et al. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR). J Clin Epidemiol. 2019;114:22–9.


  14. Chou R, Dana T, Jungbauer R, Weeks C, McDonagh MS. Masks for prevention of respiratory virus infections, including SARS-CoV-2, in health care and community settings : a living rapid review. Ann Intern Med. 2020;173(7):542–55.


  15. Negrini S, Ceravolo MG, Côté P, Arienti C. A systematic review that is “rapid” and “living”: a specific answer to the COVID-19 pandemic. J Clin Epidemiol. 2021;138:194–8.


  16. Simmonds M, Salanti G, McKenzie J, Elliott J, Living Systematic Reviews Network. Living systematic reviews: 3. Statistical methods for updating meta-analyses. J Clin Epidemiol. 2017;91:38–46.


  17. Thomas J, Noel-Storr A, Marshall I, Wallace B, McDonald S, Mavergames C, et al. Living systematic reviews: 2. Combining human and machine effort. J Clin Epidemiol. 2017;91:31–7.


  18. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.

  19. Iannizzi C, Dorando E, Burns J, Weibel S, Dooley C, Wakeford H, et al. Methodological challenges for living systematic reviews conducted during the COVID-19 pandemic: a concept paper. J Clin Epidemiol. 2022;141:82–9.


  20. Kahale L, Piechotta V, McKenzie J, Dorando E, Iannizzi C, Barker J, et al. Extension of the PRISMA 2020 statement for living systematic reviews (LSRs): protocol [version 2; peer review: 1 approved]. F1000Res. 2022;11:109.


  21. Bendersky J, Auladell-Rispau A, Urrutia G, Rojas-Reyes MX. Methods for developing and reporting living evidence synthesis. J Clin Epidemiol. 2022;152:89–100.


  22. Elliott J, Lawrence R, Minx JC, Oladapo OT, Ravaud P, Tendal Jeppesen B, et al. Decision makers need constantly updated evidence synthesis. Nature. 2021;600(7889):383–5.


  23. Macdonald H, Loder E, Abbasi K. Living systematic reviews at The BMJ. BMJ. 2020;370:m2925.


  24. Buitrago-Garcia D, Ipekci AM, Heron L, Imeri H, Araujo-Chaveron L, Arevalo-Rodriguez I, et al. Occurrence and transmission potential of asymptomatic and presymptomatic SARS-CoV-2 infections: update of a living systematic review and meta-analysis. medRxiv. 2022.

  25. WHO COVID-19 Research Database. Accessed 25 Nov 2022.

  26. Li L, Smith HE, Atun R, Tudor Car L. Search strategies to identify observational studies in MEDLINE and Embase. Cochrane Database Syst Rev. 2019;3(3):MR000041.

  27. The role of asymptomatic SARS-CoV-2 infections: rapid living systematic review and meta-analysis. Accessed 25 Nov 2022.

  28. Lee C, Thomas M, Ejaredar M, Kassam A, Whittle SL, Buchbinder R, et al. Crowdsourcing trainees in a living systematic review provided valuable experiential learning opportunities: a mixed-methods study. J Clin Epidemiol. 2022;147:142–50.

  29. Allen L, O’Connell A, Kiermer V. How can we ensure visibility and diversity in research contributions? How the Contributor Role Taxonomy (CRediT) is helping the shift from authorship to contributorship. Learned Publishing. 2019;32(1):71–4.

  30. Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page M, et al. Cochrane Handbook for Systematic Reviews of Interventions version 6.3 (updated February 2022). Cochrane; 2022. Accessed 25 Nov 2022.

  31. R Core Team. R: A language and environment for statistical computing. Accessed 25 Nov 2022.

  32. Thomas J, Graziosi S, Brunton J, Ghouze Z, O'Driscoll P, Bond M, et al. EPPI-Reviewer: advanced software for systematic reviews, maps and evidence synthesis. Accessed 25 Nov 2022.

  33. Harrison H, Griffin SJ, Kuhn I, Usher-Smith JA. Software tools to support title and abstract screening for systematic reviews in healthcare: an evaluation. BMC Med Res Methodol. 2020;20(1):7.

  34. Covidence systematic review software. Accessed 25 Nov 2022.

  35. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)–a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–81.

  36. Heron L. Analysis for version 5 of living systematic review on asymptomatic SARS-CoV-2 infections. Accessed 25 Nov 2022.

  37. Westgate MJ. revtools: an R package to support article screening for evidence synthesis. Res Synth Methods. 2019;10(4):606–14.

  38. Hamel C, Hersi M, Kelly SE, Tricco AC, Straus S, Wells G, et al. Guidance for using artificial intelligence for title and abstract screening while conducting knowledge syntheses. BMC Med Res Methodol. 2021;21(1):285.

  39. Kahale LA, Elkhoury R, El Mikati I, Pardo-Hernandez H, Khamis AM, Schunemann HJ, et al. Tailored PRISMA 2020 flow diagrams for living systematic reviews: a methodological survey and a proposal. F1000Res. 2021;10:192.

  40. Schmidt L, Olorisade BK, McGuinness LA, Thomas J, Higgins JPT. Data extraction methods for systematic review (semi)automation: a living systematic review. F1000Res. 2021;10:401.

  41. StataCorp. Stata Statistical Software. Accessed 25 Nov 2022.

  42. Nikolakopoulou A, Mavridis D, Egger M, Salanti G. Continuously updated network meta-analysis and statistical monitoring for timely decision-making. Stat Methods Med Res. 2018;27(5):1312–30.

  43. Hoffmann F, Eggers D, Pieper D, Zeeb H, Allers K. An observational study found large methodological heterogeneity in systematic reviews addressing prevalence and cumulative incidence. J Clin Epidemiol. 2020;119:92–9.

  44. Riley RD, Higgins JP, Deeks JJ. Interpretation of random effects meta-analyses. BMJ. 2011;342:d549.

  45. Hinchliffe L. The state of the version of record. Accessed 12 Apr 2022.

  46. Laine C, Taichman DB, Guallar E, Mulrow CD. Keeping up with emerging evidence in (almost) real time. Ann Intern Med. 2020;173(2):153–4.

  47. Buitrago-Garcia D, Egli-Gany D, Counotte MJ, Hossmann S, Imeri H, Ipekci MA, et al. Asymptomatic SARS-CoV-2 infections: a living systematic review and meta-analysis. medRxiv. 2020.

  48. Buitrago-Garcia D, Egli-Gany D, Counotte MJ, Hossmann S, Imeri H, Ipekci AM, et al. Occurrence and transmission potential of asymptomatic and presymptomatic SARS-CoV-2 infections: a living systematic review and meta-analysis. PLOS Med. 2020;17(9):e1003346.

Acknowledgements

Not applicable.


Funding

This study was supported by the European Union's Horizon 2020 research and innovation programme — project EpiPose (grant agreement number 101003688) and the Swiss National Science Foundation (project number 176233). DBG is funded by the Swiss Government Excellence Scholarship (2019.0774) and the Swiss School of Public Health Global P3HS. The funders had no role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Author information

Authors and Affiliations



Contributions

LH and NL developed the idea of the paper and wrote the original draft. LH, DBG, MI, RB, HI, MC, and NL collected data. LH, DBG, MI, HI, and MC were involved in data curation. LH, DBG, RB, and NL conducted the review of living systematic reviews within the paper. LH, HI, and MC were responsible for writing computer code, and LH prepared the tables and figures. NL supervised the project and acquired the funding. All authors reviewed and edited the paper, and all have approved this version of the paper.

Authors’ information

Not applicable.

Corresponding author

Correspondence to Nicola Low.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

NL declares that her institution has received funding from the Swiss National Science Foundation (176233) and the European Union Horizon 2020 research and innovation programme — project EpiPose (grant agreement number 101003688). The other authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Summary of methods used in a review of living systematic reviews on covid-19.

Additional file 2.

Living systematic reviews on covid-19 identified in the World Health Organization COVID-19 Database from January 2020 to February 2022, ordered alphabetically by first author.

Additional file 3.

Training materials for voluntary team members in the living systematic review.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit the Creative Commons website. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Heron, L., Buitrago-Garcia, D., Ipekci, A.M. et al. How to update a living systematic review and keep it alive during a pandemic: a practical guide. Syst Rev 12, 156 (2023).
