Enhancements to SRDR+, the Next-Level Replacement for SRDR

Bryant Smith | April 14, 2021


SRDR+ is a free, open source, online, collaborative platform for extracting systematic review data, archiving the data after the review is complete, and sharing the data with others (including other systematic reviewers, clinical guideline developers, healthcare professionals, patients, and health policy makers).

Over the last couple of years, we’ve made several improvements to SRDR+. These advancements were driven, in large part, by feedback from users who have worked with both SRDR+ and its predecessor (SRDR) about what they consider the newer platform’s main advantages. We summarize some of the main advancements below.

SRDR+ allows safe and secure data extraction for various types of systematic reviews
When data extractors are extracting data, they need not worry about losing unsaved extracted data. All data entered into SRDR+ forms are autosaved.


“The autosave feature on SRDR+ is a major advantage. On SRDR, there were times where I lost a page of outcomes due to a malfunction in the "Save" button. SRDR+ is also a lot faster in loading tabs and processing data than SRDR.”

Shivani Mehta, MPH candidate, Center for Evidence Synthesis in Health, Brown University School of Public Health


SRDR+ extraction forms can be tailored to meet the needs of various types of projects
We have made the data extraction form development capabilities in SRDR+ flexible so that the forms can be customized to various types of systematic reviews, such as those evaluating treatment effectiveness, diagnostic accuracy, and epidemiology. SRDR+ extraction forms support an array of question types, including text boxes, multiple-choice checkboxes, single-choice radio buttons, dropdown questions, and matrix formats. Entire sections of extraction forms can also now be added or removed.

We’ve also introduced data field dependencies: the platform can display subsequent questions to data extractors based on their responses to previous questions. Data field dependencies can be used for any question in any section of SRDR+ extraction forms.
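To make the idea concrete, here is a minimal sketch, in Python, of how a dependency rule of this kind could be represented and evaluated. The question IDs, options, and dictionary layout are illustrative assumptions, not SRDR+’s internal data model; the example mirrors the risk-of-bias "screener question" use case mentioned in the quote below.

```python
# Hypothetical sketch of a data field dependency: a follow-up question is
# shown only when the trigger question has a qualifying answer.
# Question IDs and structure are illustrative, not SRDR+'s internal model.

def should_show(question, answers):
    """Return True if all of the question's dependencies are satisfied."""
    for dep in question.get("depends_on", []):
        if answers.get(dep["question_id"]) not in dep["show_if_answer_in"]:
            return False
    return True

rob_tool_question = {
    "id": "rob_tool",
    "prompt": "Which risk-of-bias tool applies to this study?",
    "type": "radio",
    "options": ["RoB 2", "ROBINS-I"],
}
rob2_item = {
    "id": "rob2_randomization",
    "prompt": "Risk of bias arising from the randomization process",
    "type": "radio",
    "options": ["Low", "Some concerns", "High"],
    "depends_on": [{"question_id": "rob_tool", "show_if_answer_in": ["RoB 2"]}],
}

answers = {"rob_tool": "RoB 2"}
print(should_show(rob2_item, answers))  # True: the RoB 2 item is displayed
```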

“Building and customizing extraction forms on SRDR+ is much, much more intuitive than it was on SRDR. I love the ability to add extra columns and rows to any question type. Also, allowing dependencies among questions is a wish come true – provides the ability to streamline extractions (not all analysts are great at knowing whether a question sometimes applies/sometimes not), and also allows for the use of multiple risk of bias tools (with a screener question at the beginning of the ROB tab).”

James Scott Parrott, PhD, Professor, Rutgers University School of Professional Studies


The new and improved Data Comparison Tool in SRDR+ helps teams that conduct dual independent data extraction of studies and thus need those data compared and adjudicated
The Data Comparison Tool shows users discrepancies between data extracted by multiple team members side by side and allows users to resolve those discrepancies efficiently.
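For readers who have only ever done this step by hand, the following Python sketch shows the basic idea of a side-by-side comparison with discrepancy flags. The field names and dictionary structure are made up for illustration and do not reflect SRDR+’s implementation.

```python
# Minimal sketch of dual-extraction comparison (not SRDR+'s actual algorithm):
# line up two extractors' answers field by field and flag disagreements
# that need adjudication. Field names are placeholders.

def compare_extractions(extractor_a, extractor_b):
    """Return a side-by-side view of every field, flagging discrepancies."""
    rows = []
    for field in sorted(set(extractor_a) | set(extractor_b)):
        a, b = extractor_a.get(field), extractor_b.get(field)
        rows.append({"field": field, "a": a, "b": b, "discrepancy": a != b})
    return rows

a = {"sample_size": "120", "follow_up_weeks": "24", "mean_age": "54.2"}
b = {"sample_size": "120", "follow_up_weeks": "12", "mean_age": "54.2"}

for row in compare_extractions(a, b):
    flag = "ADJUDICATE" if row["discrepancy"] else "agree"
    print(f'{row["field"]}: {row["a"]} vs {row["b"]} -> {flag}')
```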

“The comparison tool is worth the price of admission! It used to take me ages with SRDR to try to consolidate the data offline after first exporting it to Microsoft Excel. Now, with the consolidation tool, much more streamlined. I love, love, love this tool!”

James Scott Parrott, PhD, Professor, Rutgers University School of Professional Studies


The Outcomes section of SRDR+ allows for clear definition of outcomes
The structured approach that SRDR+ uses to define outcomes aligns with the outcome definition structure used by study registries such as ClinicalTrials.gov. Defining outcomes on SRDR+ extraction forms clarifies the data extraction process, promotes consistency in how data are extracted, and helps to reduce bias and errors in the systematic review process.
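As a rough illustration of what "structured" means here, the sketch below models an outcome record with the kinds of fields that registries such as ClinicalTrials.gov capture (an outcome title, the specific measure, and time points). The class and field names are assumptions for illustration, not SRDR+’s schema.

```python
# Illustrative only: a structured outcome record with registry-style fields.
# Field names are assumptions, not SRDR+'s schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OutcomeDefinition:
    title: str                      # the outcome domain being assessed
    specific_measure: str           # the instrument or metric used
    time_points: List[str] = field(default_factory=list)

pain = OutcomeDefinition(
    title="Pain intensity",
    specific_measure="Visual analog scale (0-100 mm)",
    time_points=["6 weeks", "12 weeks"],
)
print(pain)
```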

“SRDR+ provides ample space and options to define and edit outcomes and their definitions, which is great!”

Monika Reddy Bhuma, BDS, MPH, Research Associate, Center for Evidence Synthesis in Health, Brown University School of Public Health


The Results section of SRDR+ Extraction Forms supports entry of outcome measure data for multiple time points and for comparisons within and between study arms
Data extractors can choose from pre-existing outcomes or add new outcomes. This helps promote consistency in the data extraction process across what are typically multi-person data extraction teams.

“The Between Arm Comparison section of SRDR+ allows for easy input of multiple comparisons, and more clearly distinguishes when two or more study arms are being compared.”

Wangnan Cao, PhD, Research Associate, Center for Evidence Synthesis in Health, Brown University School of Public Health


SRDR+ allows data export into Microsoft Excel or Google Sheets
Individual sections of SRDR+ data extraction forms (e.g., Design, Sample Characteristics, Outcomes, Results) are exported as separate sheets in the same Excel file or Google Sheets document for easy handling, linking, and formatting for analyses.
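If you analyze an export programmatically, a multi-sheet file is easy to work with. The sketch below assumes pandas is available and uses placeholder file, sheet, and column names (which may not match an actual SRDR+ export); it shows one way to load every sheet and join section-level data by study.

```python
# A rough sketch of working with a multi-sheet export; file name, sheet names,
# and column names below are placeholders, not guaranteed to match SRDR+.
import pandas as pd

# sheet_name=None loads every sheet into a dict of DataFrames keyed by sheet name
sheets = pd.read_excel("srdrplus_export.xlsx", sheet_name=None)

design = sheets["Design"]
results = sheets["Results"]

# Example: join study-level design fields onto result rows by a shared
# study ID column (an assumed column name).
merged = results.merge(design, on="Study ID", how="left")
print(merged.head())
```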

“I like the flexibility that SRDR+ provides for data exports. Having completed four meta-analyses using the long download formatting, I think that it works pretty well.”

James Scott Parrott, PhD, Professor, Rutgers University School of Professional Studies


If you haven’t already done so, please give SRDR+ a shot. If you have any questions, contact us on this blog, via email (srdr@ahrq.hhs.gov), or via our Twitter page!

Disclaimer: The Systematic Review Data Repository Plus is funded by the Agency for Healthcare Research and Quality (AHRQ). Authors of this blog post are solely responsible for its content, which does not necessarily represent the views of AHRQ or the U.S. Department of Health and Human Services (DHHS). Readers should not interpret any statement in this blog post as an official position of AHRQ or of DHHS.



Bryant Smith, MPH, CPH
SRDR Administrator
Brown Evidence-based Practice Center
Brown University School of Public Health




SRDR+ Presents New Importing Capability for FHIR-formatted Citation Files for Systematic Reviews

Ian Saldanha | March 5, 2021


Supported by AHRQ, the Systematic Review Data Repository Plus (SRDR+) is working towards eventually becoming the first FHIR-enabled platform for conducting and sharing the results of systematic reviews. As part of this work, handling citations of included studies in systematic reviews just got easier!

Study-related citation information (author names, article titles, page numbers, and other identifying information) is crucial for identifying studies and referencing them in systematic review reports.

Once researchers are ready to screen and extract data from eligible studies, they must first import the studies’ citations into a platform such as SRDR+. Citation files come in a variety of file formats.

For the purposes of screening and subsequent data extraction, SRDR+ can now import citations saved in the following file formats: Research Information Systems (.ris), comma-separated values (.csv), lists of PubMed IDs (.txt), EndNote files (.enw), and the newly supported HL7 FHIR file format.

The HL7 FHIR file format is based on the Fast Healthcare Interoperability Resources (FHIR) standard developed by HL7, which defines how healthcare information can be exchanged between different computer systems regardless of how the information is stored within those systems. Furthermore, healthcare information saved in the HL7 FHIR file format is secured in such a way that only those with a need and assigned privileges to access the information are able to do so.
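Because FHIR resources are plain JSON, citation files in this format are also easy to inspect before importing. The sketch below assumes a placeholder file name and a FHIR Bundle of Citation resources; the field paths used to pull out article titles are assumptions that may vary by FHIR version, not a definitive mapping of what SRDR+ expects.

```python
# A hedged sketch of peeking inside a FHIR-formatted citation file before
# importing it. FHIR resources are plain JSON, so the standard library is
# enough; the title field path is an assumption and may differ by FHIR version.
import json

with open("citations.fhir.json") as f:       # placeholder file name
    bundle = json.load(f)

# A FHIR Bundle wraps individual resources in an "entry" list.
for entry in bundle.get("entry", []):
    resource = entry.get("resource", {})
    if resource.get("resourceType") != "Citation":
        continue
    # In recent FHIR builds, article metadata sits under citedArtifact;
    # title is assumed to be a list of {"text": ...} objects.
    titles = resource.get("citedArtifact", {}).get("title", [])
    for title in titles:
        print(title.get("text", ""))
```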

We anticipate that the HL7 FHIR file format will eventually become widespread in the field of evidence synthesis.

PICO Portal is an example of a systematic review platform that now supports the HL7 FHIR file format. SRDR+ users will be happy to know that they can seamlessly import citation data from such platforms directly into SRDR+, where they can then continue working on their systematic reviews.

Give it a shot! Try importing citations using the HL7 FHIR file format and let us know how it goes. Contact us on this blog, via email (srdr@ahrq.hhs.gov), or via our Twitter page!

The Systematic Review Data Repository Plus is funded by the Agency for Healthcare Research and Quality (AHRQ). Authors of this blog post are solely responsible for its content, which does not necessarily represent the views of AHRQ or the U.S. Department of Health and Human Services (DHHS). Readers should not interpret any statement in this blog post as an official position of AHRQ or of DHHS.



Ian Saldanha, MBBS, MPH, PhD
SRDR Director
Assistant Professor, Brown Evidence-based Practice Center
Brown University School of Public Health




The newly updated Systematic Review Data Repository Plus (SRDR+)

Ian Saldanha | March 7, 2019


A lot of human capital and monetary resources are spent conducting systematic reviews. At the same time, there has been strong momentum towards promoting open science and data sharing. Technology can help us realize these ideals and improve the efficiency of the systematic review enterprise. A particularly inefficient systematic review step is data extraction from primary studies. Data extraction can be improved, for both doers and users of systematic reviews, through a data system that is robust, user-friendly, and amenable to sharing.

The Brown University Evidence-based Practice Center (Brown EPC) recently launched an updated version of the Systematic Review Data Repository. This new version, called SRDR+ (https://srdrplus.ahrq.gov), is a free, powerful, online, collaborative system for extracting and archiving study data during systematic reviews. To our knowledge, this is the only system of its kind that is free to anybody around the world. We therefore consider SRDR+ to be a community resource.

The new features of SRDR+



We have made various improvements to SRDR+’s functionality as well as its look and feel in response to advancements in systematic review methodology and technology. SRDR+ has the flexibility to meet the needs of various types of systematic reviews, such as those evaluating treatment effectiveness, diagnostic accuracy, and epidemiology.

SRDR+ also allows users to delete and/or add “tabs.” Tabs help organize questions related to various elements of a study, such as design, participant characteristics, and quality. In addition, users can click a button and load current versions of entire quality assessment tools with their specific items as well as instructions for filling out those items. Users can also delete specific items.

SRDR+ also makes it easier to define outcomes clearly. The new version offers a structured approach that is consistent with outcome definitions in study registries such as ClinicalTrials.gov. This can provide clarity about the data that need to be extracted, promote consistency in how those data are extracted, and help minimize bias in systematic reviews.

SRDR+’s new Data Comparison Tool is perhaps the most anticipated and exciting addition to SRDR+. This automated tool displays data extracted by multiple members of the team side-by-side and automatically flags discrepancies that need resolution. Anyone who’s done this process manually knows how incredibly time consuming it can be.

Another major efficiency gain is related to the complete revamp of SRDR+’s underlying open-source code. This revamped code enables considerably faster page-loading and saving.

More than just a data extraction tool



SRDR+ isn’t just a tool for organizing a systematic review’s data extraction process. SRDR+ also functions as a repository of previously extracted data. As of March 2019, the data extracted in more than 130 systematic reviews (for more than 13,000 primary studies) have been made public. This means that future systematic review teams working on similar topics can save countless hours of data extraction by re-using these data. This resource is especially invaluable for teams conducting updates of systematic reviews.

SRDR+ also allows users of systematic reviews, such as guideline developers, policy-makers, patients, and the general public, to access study data that might be relevant to their decision-making processes. A list of systematic review projects with data that are publicly available through SRDR+ can be found here: https://srdr.ahrq.gov/projects/published.

We encourage you to try the new version of SRDR+ here: https://srdrplus.ahrq.gov. To learn more or request a demonstration session, please email the SRDR+ team at srdr@ahrq.hhs.gov. You can also follow SRDR+ on Twitter (@SRDRPlus).

The Agency for Healthcare Research and Quality (AHRQ) has funded the development and maintenance of SRDR since 2012.



Ian Saldanha, MBBS, MPH, PhD
SRDR Director
Assistant Professor, Brown Evidence-based Practice Center
Brown University School of Public Health