Searching several databases helps ensure that your review is comprehensive. A well-cited guide on how to conduct a systematic review of medical research suggests, at a minimum, a combination of PubMed, Embase, Web of Science, and Google Scholar.1 Cochrane and Scopus are also common databases for biomedical systematic reviews.
PubMed, which is free, is the most common search database for biomedical and life sciences literature. PubMed searches for relevant words in the title and abstract but not in the full text. It includes citations indexed in Medline, uploaded by journals, and archived in PubMed Central. The distinctions between PubMed, Medline, and PubMed Central are explained here.
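PubMed searches can also be scripted through NCBI's E-utilities. The sketch below is a minimal example that runs a title/abstract query and returns matching PMIDs; the query string is a hypothetical placeholder, and you should check NCBI's E-utilities documentation and rate-limit guidance before adapting it to a real search.

```python
import requests

# NCBI E-utilities endpoint for searching PubMed (free, no subscription required).
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(term: str, retmax: int = 100) -> list[str]:
    """Return PMIDs for a PubMed query; the [tiab] tag restricts a term to title/abstract."""
    params = {
        "db": "pubmed",
        "term": term,
        "retmode": "json",
        "retmax": retmax,
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]

# Hypothetical title/abstract search combining two concepts.
pmids = search_pubmed('"atrial fibrillation"[tiab] AND ablation[tiab]')
print(f"{len(pmids)} PMIDs retrieved (first few: {pmids[:5]})")
```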
Embase is a subscription database maintained by Elsevier. It provides comprehensive indexing and tagging of the biomedical literature.
Web of Science is a subscription service run by Clarivate that indexes citations in several different science disciplines.
Google Scholar allows the use of Google search techniques but restricts results to academic literature, including journal pages, PubMed, university pages, and pre-print archives.
Cochrane contains many clinical trials, including some publications that are not indexed in PubMed. Cochrane can be searched without a subscription, but a subscription is necessary to download complete search results.
Scopus, maintained by Elsevier, is one of the largest abstract and citation databases of peer-reviewed literature. It includes scientific journals, books, and conference proceedings.
ClinicalTrials.gov allows you to check whether clinical trials have been registered or performed on your topic of interest.
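Registered trials can also be retrieved programmatically through the ClinicalTrials.gov REST API. The sketch below is a minimal example that assumes the v2 `studies` endpoint and its `query.cond` and `pageSize` parameters; verify the endpoint and parameter names against the current API documentation before relying on them.

```python
import requests

# ClinicalTrials.gov v2 API (assumed endpoint; confirm against the current docs).
API_URL = "https://clinicaltrials.gov/api/v2/studies"

params = {
    "query.cond": "atrial fibrillation",  # hypothetical condition of interest
    "pageSize": 20,
}
response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()
studies = response.json().get("studies", [])
print(f"Retrieved {len(studies)} registered studies for an initial look.")
```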
Discipline-specific databases are also available: PsycINFO and CINAHL can be used if the research question relates to psychiatry and psychology (PsycINFO) or to nursing and allied health (CINAHL).
PsycNet can also be used to search for social and behavioral science content.
To prepare for screening, gather study metadata in an organized way. Although searching or uploading searches to an AutoLit nest is an easy way to perform this step, downloading search metadata from the databases into a spreadsheet is also common. Most databases have an option to automatically download relevant information from search results.
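As a sketch of the spreadsheet approach, the snippet below combines exported result files from several databases into one table with pandas. The file names and the assumption that the exports share a common set of columns are hypothetical; in practice each database labels its export fields differently, so you would map them to a common schema first.

```python
import pandas as pd

# Hypothetical export files downloaded from each database's results page.
exports = {
    "pubmed": "pubmed_results.csv",
    "embase": "embase_results.csv",
    "wos": "web_of_science_results.csv",
}

frames = []
for source, path in exports.items():
    df = pd.read_csv(path)          # in practice, rename columns to a common schema here
    df["source"] = source           # track which database each record came from
    frames.append(df)

records = pd.concat(frames, ignore_index=True)
records.to_csv("combined_search_results.csv", index=False)
print(f"{len(records)} records collected across {len(exports)} databases")
```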
Collected metadata may include identifying information (such as DOI or PubMed ID, URL, author, and year) as well as information necessary for screening, such as title and abstract. If you are not using AutoLit in Nested Knowledge, a system should be implemented for removing duplicates, indicating screening status (i.e., included, excluded, or unscreened), providing exclusion reasons, and collecting full texts.
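If you manage screening outside AutoLit, a minimal bookkeeping scheme might look like the sketch below: deduplicate on DOI, fall back to normalized titles, then add columns for screening status, exclusion reason, and full-text location. The column names (DOI, Title) and file names are assumptions; match them to whatever your exports actually contain.

```python
import pandas as pd

# Assumes a combined export with (at least) DOI and Title columns;
# adjust the column names to match your actual export.
records = pd.read_csv("combined_search_results.csv")

# Normalize titles so trivially different duplicates collapse together.
records["title_key"] = (
    records["Title"].str.lower().str.replace(r"[^a-z0-9 ]", "", regex=True).str.strip()
)

# Drop exact DOI duplicates (keeping rows with no DOI), then fall back to title matches.
has_doi = records["DOI"].notna()
records = pd.concat(
    [records[has_doi].drop_duplicates(subset="DOI"), records[~has_doi]],
    ignore_index=True,
)
records = records.drop_duplicates(subset="title_key")

# Bookkeeping columns for the screening workflow.
records["screening_status"] = "unscreened"  # later: included / excluded
records["exclusion_reason"] = ""            # filled in only for excluded records
records["full_text_file"] = ""              # path or link to the retrieved PDF

records.drop(columns="title_key").to_csv("screening_tracker.csv", index=False)
```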