Dual Screening and Adjudication

Dual Screening is a quality-controlled screening process in which two users independently screen each article, and all screening decisions are then adjudicated by an Administrator. Note that this is different from Two-Pass Screening, where a user first reviews abstracts and then the full texts of advanced articles. You can, however, perform dual two-pass screening in our software.

The Admin adjudicates any disagreement between the original Screeners and sets the final determination for each study. For example, if Screener 1 includes a given study but Screener 2 excludes it for Reason 1, the Adjudicator will need to choose among Inclusion, Exclusion for Reason 1, or Exclusion for a different reason, such as Reason 2.

Only those with Admin privileges can serve as Adjudicators, but any user can serve as a Screener.

Configure Exclusion Reasons

You will need to configure Exclusion Reasons before screening underlying studies.

Configure Dual Screening

To configure Dual Screening in a nest, click the “Settings” link under Nest Home. Once there, scroll down to the Screening section and select the “Dual” option.

Once this is complete, a new “Adjudicate Screening” option will appear in the Nest Menu for all Admins.

Note: Toggling back from Dual Screening to Standard Screening (or switching to Two-Pass Screening) will ONLY save final adjudications, so all records without an adjudicated Include or Exclude decision will be reverted to Unscreened and all data associated with individual users' decisions will be lost!

Dual Screening Steps

1. Screen each study twice

Before Adjudication can take place, two independent users need to screen each underlying study using the same approach as Standard Screening Mode. AutoLit automatically queues each study to all users until two screening decisions have been made; the study is then sent forward for adjudication. If you want to view the full text, see the instructions on Full Text Upload.

In Dual Screening, it can be useful to view the number of prior reviewers for the current record. This count is displayed to the right of the Include button: 0 means no decisions have been made on the current record, 1 means one reviewer has made a decision, and so on.

However, in Dual modes, whether the other reviewer has uploaded the full text is hidden. This avoids bias, since knowing that the other user has uploaded the record's full text may influence your screening decision. You can still reveal the full text upload status, as well as the full text itself, by clicking “Show Anyways.” This action does not affect your screening decisions.
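As a conceptual illustration of the queueing behavior described above, the Python sketch below models a record that needs two independent screening decisions before it moves to adjudication, along with the prior-reviewer count shown next to the Include button. The Record class, its fields, and the decision strings are hypothetical and are not part of AutoLit's actual implementation or API.

from dataclasses import dataclass, field

@dataclass
class Record:
    ref_id: str
    # screener name -> "include" or an exclusion reason (illustrative encoding)
    decisions: dict = field(default_factory=dict)

    @property
    def prior_reviewers(self) -> int:
        # The count displayed to the right of the Include button (0, 1, or 2).
        return len(self.decisions)

    @property
    def ready_for_adjudication(self) -> bool:
        # A record leaves the screening queue once two decisions exist.
        return self.prior_reviewers >= 2

def add_decision(record: Record, screener: str, decision: str) -> None:
    # Each screener contributes at most one independent decision per record.
    if screener in record.decisions:
        raise ValueError(f"{screener} already screened {record.ref_id}")
    record.decisions[screener] = decision

# Example: the record is sent forward only after the second decision.
r = Record("example-record-1")
add_decision(r, "screener_1", "include")
print(r.prior_reviewers, r.ready_for_adjudication)   # 1 False
add_decision(r, "screener_2", "exclude: wrong population")
print(r.prior_reviewers, r.ready_for_adjudication)   # 2 True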

2. [OPTIONAL] Auto-Adjudicate

All studies that have undergone two screening decisions are sent forward for adjudication, and any study that is either Included by both Screeners or Excluded by both Screeners is eligible for Auto-Adjudication.

To Auto-Adjudicate all eligible studies, navigate to Adjudicate Screening and, in the upper right, select “Auto-adjudicate {x} studies.” This will automatically include all studies that both Screeners included, and exclude all studies that both Screeners excluded.

If both Screeners excluded a study but selected different Exclusion Reasons, Auto-Adjudication will apply only one of those reasons as the final Exclusion Reason.
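The auto-adjudication rules above can be summarized in a short sketch. The function below is hypothetical (the name auto_adjudicate and the decision strings are illustrative, not part of AutoLit): agreement on inclusion or on exclusion produces a final decision, while any disagreement is left for manual adjudication.

from typing import Optional

def auto_adjudicate(decision_1: str, decision_2: str) -> Optional[str]:
    # Each decision is either "include" or an exclusion reason string.
    if decision_1 == "include" and decision_2 == "include":
        return "include"        # both included -> auto-include
    if decision_1 != "include" and decision_2 != "include":
        return decision_1       # both excluded -> one reason becomes the final reason
    return None                 # disagreement -> manual adjudication by an Admin

print(auto_adjudicate("include", "include"))              # include
print(auto_adjudicate("no comparator", "wrong outcome"))  # no comparator
print(auto_adjudicate("include", "wrong outcome"))        # None (Admin must adjudicate)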

3. Adjudicate Disagreements

For any study that is not Auto-Adjudicated, an Admin will need to adjudicate manually in order to provide a final screening decision. The Admin can select the decision of Screener 1 or Screener 2 or, if both are incorrect, apply a different decision. Once adjudicated, each study is either excluded, or included and sent forward to Tagging.

Note: By default, the names of the reviewers are displayed alongside their decisions. To reduce bias, you can hide this information by enabling Blind Adjudication in Settings.

Kappa Statistics for Interrater Reliability

After you finish Dual Screening, you can view the Kappa statistics in Activity.
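The kappa statistic measures agreement between the two Screeners beyond what would be expected by chance. As a worked example, the sketch below computes the standard Cohen's kappa from two lists of include/exclude decisions; the exact statistic shown under Activity is produced by the application and may be calculated with additional detail.

def cohens_kappa(decisions_1, decisions_2):
    # Standard Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
    assert len(decisions_1) == len(decisions_2)
    n = len(decisions_1)
    labels = set(decisions_1) | set(decisions_2)

    # Observed agreement: proportion of records given the same label by both Screeners.
    p_o = sum(a == b for a, b in zip(decisions_1, decisions_2)) / n

    # Chance agreement: based on each Screener's own label frequencies.
    p_e = sum(
        (decisions_1.count(label) / n) * (decisions_2.count(label) / n)
        for label in labels
    )
    return (p_o - p_e) / (1 - p_e)

screener_1 = ["include", "include", "exclude", "exclude", "include", "exclude"]
screener_2 = ["include", "exclude", "exclude", "exclude", "include", "exclude"]
print(round(cohens_kappa(screener_1, screener_2), 3))  # 0.667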

Guidance on Dual Screening Best Practices

For guidance on best practices in Dual Screening, click here.
