CER Demo Walkthrough: Aspiration Catheter

Welcome to the walkthrough of the Medical Device: Aspiration Catheter “Penumbra Indigo” demo Nest for Clinical Evaluation Reports (open in your original tab). In this walkthrough, we'll explain the core functionalities of Nested Knowledge through this Nest. We encourage you to work through the Nest as you follow the walkthrough. The Nest available to you is a copy of the original and may be freely modified, so roll up your sleeves and get your hands dirty!

This Nest demonstrates a previously completed review from a CER project, presenting the tools used at each stage to appraise the Penumbra Indigo aspiration catheter. The copy is only partially completed so that you can explore the site and try each step yourself.

Nest Home

You've landed on your demo Nest in AutoLit, and you're looking at the Nest Home page. This page includes a menu on the left of the page, the protocol in the center, and discussion about the Nest on the right. The menu includes links to all modules & configurations available to you in AutoLit. We'll now walk through these modules one by one (click the title in the menu to navigate to the corresponding module).

Literature Search

The Literature Search page allows import of studies into a nest and shows where studies were sourced. This review includes one API-based (automatic integration) search of PubMed. Hover over a search and click the “More” button to see greater detail, including when it was run and how the query is structured. Click “Run” to update the search at any time.

By clicking “Add Search,” you can choose between an automatic search (the kind present in this demo nest) and a file import from any database.
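Under the hood, an automatic PubMed search is conceptually just a query against PubMed's public E-utilities API. The sketch below is illustrative only — it is not Nested Knowledge's actual integration, and the query term is a placeholder:

import requests

# Illustrative only: a direct query to NCBI's public E-utilities API,
# which is the kind of source an automatic PubMed search draws from.
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(query, max_results=100):
    """Return a list of PMIDs matching the query."""
    response = requests.get(ESEARCH_URL, params={
        "db": "pubmed",
        "term": query,          # placeholder query, not the demo nest's actual search
        "retmax": max_results,
        "retmode": "json",
    })
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]

pmids = search_pubmed('"aspiration catheter" AND thrombectomy')
print(len(pmids), "records identified")

Re-running a search of this kind at a later date is what lets the nest pick up newly published records.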

Other Sources

Records may be imported through other means. Click the “Other Sources” menu item under “Literature Search” to view records that were individually added as expert recommendations; one study was imported into this Nest this way. Try importing the DOI or PMID of a study of interest using the “Add by Identifier” form on the right of the page. You can also add records manually, or upload records in bulk if you have their PDFs downloaded to your device; the software will parse out the relevant information to fill in the metadata.
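To give a sense of what an identifier lookup involves, bibliographic metadata can be resolved from a DOI via public registries such as Crossref. This is a conceptual sketch, not the platform's actual code; any valid DOI works in place of the one shown:

import requests

# Illustrative sketch: fetch bibliographic metadata for a DOI from Crossref.
def fetch_doi_metadata(doi):
    response = requests.get(f"https://api.crossref.org/works/{doi}")
    response.raise_for_status()
    record = response.json()["message"]
    return {
        "title": record.get("title", [""])[0],
        "journal": record.get("container-title", [""])[0],
        "year": record.get("issued", {}).get("date-parts", [[None]])[0][0],
        "authors": [f'{a.get("family", "")}, {a.get("given", "")}'
                    for a in record.get("author", [])],
    }

print(fetch_doi_metadata("10.1136/bmj.n71"))  # example: the PRISMA 2020 statement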

Note: all records uploaded to the nest by any means will be de-duplicated; that is, any duplicates will be automatically removed.
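Conceptually, de-duplication amounts to keeping one record per unique identifier (or normalized title when no identifier is available). A minimal sketch of that idea — the platform's actual matching is more sophisticated, and the records below are made up:

# Minimal illustration of de-duplication by DOI, falling back to a normalized title.
def normalize(title):
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    seen, unique = set(), []
    for record in records:
        key = record.get("doi") or normalize(record.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

records = [
    {"doi": "10.1000/demo.1", "title": "Aspiration thrombectomy outcomes"},
    {"doi": "10.1000/demo.1", "title": "Aspiration Thrombectomy Outcomes"},  # duplicate
    {"doi": None, "title": "A second, distinct study"},
]
print(len(deduplicate(records)), "unique records")  # -> 2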

Screening

Abstract Screening

Once studies are imported into a nest, they are “Screened” for relevance to the review in the Screening module. Since this nest was set up to comply with CER requirements, its Screening mode is set to Two Pass Screening. This means a single reviewer screens all abstracts first, then all full texts. First up, Abstract Screening:

This screening module displays abstracts that have yet to be screened, allowing you to decide whether each record should advance to the next stage (full text screening) or be excluded. You can either click “Advance” or select a specific exclusion reason from the drop-down menu and then click the exclude button. Try advancing one reference and excluding another. You may also skip studies you aren't yet sure about, or jump back to a prior study, using the buttons under the Navigation menu. The available exclusion reasons were configured under Configure Screening, found in the left-hand menu under Abstract Screening.

Abstract Highlighting

Why are study abstracts so colorful? We perform machine learning-based PICO annotation of abstracts using a model derived from RobotReviewer. To turn off PICO highlighting, toggle off the slide button in the legend just beneath the abstract text.

Abstract text may also be highlighted in different colors with User Keywords, which are configured in Configure Screening or when you click on Your Keywords.
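At its simplest, keyword highlighting just marks your configured terms wherever they appear in the abstract text. A toy sketch under that assumption (not the product's highlighting code; the keywords are placeholders):

import re

# Toy illustration of keyword highlighting: wrap configured user keywords
# wherever they occur in abstract text.
def highlight(text, keywords):
    pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
    return pattern.sub(lambda m: f"[{m.group(0)}]", text)

abstract = "Patients underwent aspiration thrombectomy with the Indigo system."
print(highlight(abstract, ["aspiration", "Indigo"]))
# -> Patients underwent [aspiration] thrombectomy with the [Indigo] system.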

Full Text Screening

Full Text Screening works and looks the same as Abstract Screening, but you will primarily work in the Full Text tab, and you will make decisions to “Include” a study rather than “Advance” it. Exclusion reasons stay the same.

Tagging

The Tagging module allows you to report data from included studies in the form of “tags.” There are two modes for Tagging: Standard and Form-based. Since this nest is configured to best meet CER requirements, it is in Form-based mode, which simply means Tagging is presented in a Q&A format for ease of collecting data. Learn more about Form-based Tagging.

Tag Hierarchy

Click the “Configure Tagging” menu item to get started. Tag hierarchies consist of tags (visualized as points) and relationships between them (visualized as connecting lines). The tag hierarchy in this review consists of a set of questions (root tags labelled Q) and answers (child tags) required for this section of the CER.
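For readers who think in data structures, a hierarchy of this kind is just a tree whose root tags are questions and whose children are the possible answers. A hypothetical miniature — the tag names below are invented for illustration and differ from the demo nest's own tags:

# Hypothetical miniature of a question-and-answer tag hierarchy.
tag_hierarchy = {
    "Q: What level of evidence does the study report?": [
        "Level I", "Level II", "Level III", "Level IV",
    ],
    "Q: Which device outcomes are reported?": [
        "Technical success", "Adverse events", "Mortality",
    ],
}

for question, answers in tag_hierarchy.items():
    print(question)
    for answer in answers:
        print("  -", answer)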

Tagging Module

Inside the Tagging module, tags are applied to studies by answering the questions, indicating that a concept is relevant to that study.

In the Tagging form, go through the questions one by one and select tags from the dropdown menus or fill out a text excerpt. You can highlight text, select an area, or simply copy and paste using the icons above the text. Applying a tag indicates that the data is present in the report, and applied tags are illustrated in Qualitative Synthesis.

Study Inspector

Study Inspector is the tool in AutoLit for reviewing and searching your extracted data. Each row in Study Inspector is a study, and the columns shown can be selected in the upper-left dropdown menu. The table can be narrowed by creating Filters, either from the Add Filter dropdown menu or, often faster, from the type-ahead search bar. For example, you can filter to studies that were included during Screening.
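The filtering itself is straightforward: each study carries attributes such as its screening status, and a filter keeps only the rows that match. A schematic example with made-up records:

# Schematic example of filtering studies by screening status (records are made up).
studies = [
    {"title": "Study A", "screening_status": "Included", "tags": ["Level IV"]},
    {"title": "Study B", "screening_status": "Excluded", "tags": []},
    {"title": "Study C", "screening_status": "Included", "tags": ["Adverse events"]},
]

included = [s for s in studies if s["screening_status"] == "Included"]
print([s["title"] for s in included])  # -> ['Study A', 'Study C']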

Synthesis

At this point, we've reviewed all the evidence gathered in AutoLit for this Nest. Now let's navigate to Synthesis Home to draw some conclusions from our evidence, by clicking the Synthesis menu heading.

PRISMA

Click the PRISMA button in the bottom left of the page to view a PRISMA 2020 flow diagram. The diagram is auto-populated based on searches imported and studies screened in AutoLit.
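The auto-population works because every count in a PRISMA 2020 diagram can be derived from the search and screening records already in the nest. A simplified sketch of that bookkeeping, with invented numbers:

# Simplified bookkeeping behind a PRISMA 2020 flow diagram; the numbers are invented.
records_identified = 120        # from all searches and other sources
duplicates_removed = 15
records_screened = records_identified - duplicates_removed
records_excluded_at_abstract = 70
full_texts_assessed = records_screened - records_excluded_at_abstract
full_texts_excluded = 20
studies_included = full_texts_assessed - full_texts_excluded

print(f"Screened: {records_screened}, assessed full text: {full_texts_assessed}, "
      f"included: {studies_included}")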

Qualitative Synthesis

Navigate back to Synthesis Home and click the Qualitative Synthesis box. Qualitative Synthesis (QLS) displays data gathered in the Tagging Module. Each slice in the sunburst diagram is a tag. Its width corresponds to how frequently it was applied. Its distance from the center corresponds to its depth in the hierarchy (distance between child and root tag). Click a slice to filter studies displayed to those where the tag was applied. Clicking multiple slices filters to studies with all the selected tags applied. The rightmost bar shows relevant studies (bottom) and some data about the tag (top), like its frequency, excerpts, and tags that were commonly applied with the selected tag.

For example, selecting the Level IV evidence tag shows that it was reported in one study. Click the rows of the study table to take a deep dive into the extracted data.
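The geometry of the sunburst follows from two quantities computed from the tag data: how many studies each tag was applied to (slice width) and how many steps the tag sits below its root question (distance from the center). A small illustration with made-up tag applications:

from collections import Counter

# Made-up tag applications: study -> tags applied. Tag names are illustrative only.
applications = {
    "Study A": ["Q: Level of evidence", "Level IV"],
    "Study B": ["Q: Level of evidence", "Level II", "Q: Outcomes", "Mortality"],
    "Study C": ["Q: Outcomes", "Mortality"],
}

# Slice width ~ how many studies the tag was applied to.
frequency = Counter(tag for tags in applications.values() for tag in tags)

# Distance from the center ~ depth below the root question (root = 0, answer = 1).
parents = {"Level IV": "Q: Level of evidence", "Level II": "Q: Level of evidence",
           "Mortality": "Q: Outcomes"}

def depth(tag):
    return 0 if tag not in parents else 1 + depth(parents[tag])

for tag, count in frequency.items():
    print(f"{tag}: applied in {count} studies, depth {depth(tag)}")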

Optional: Meta-analytical Extraction

You may notice that, in Synthesis, the Quantitative Synthesis option is disabled. Quantitative Synthesis is the output of our Meta-Analytical Extraction module, an optional step for users who wish to conduct a meta-analysis. This particular review did not include a meta-analysis, so the module is disabled.

Closing Remarks

You've now seen how a review compliant with CER guidelines can be completed and shared with the Nested Knowledge platform. We encourage you to head back to AutoLit and explore the variety of configuration options and the ever-growing feature set we didn't get to cover here. If you're feeling ambitious, start your own Nest from scratch!

Use this documentation to guide you through more complex topics, and as always, please reach out to our support team via email and make requests on Nolt.
