The Kappa Statistic, or Cohen's Kappa, is a statistical measure of inter-rater reliability for categorical variables. These statistics are important because they indicate the extent to which the data collected in a study accurately represent the variables being measured. Learn more about Kappa Statistics here.
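For reference, Cohen's kappa compares the observed agreement between two reviewers with the agreement expected by chance. The short Python sketch below illustrates the calculation on a hypothetical set of include/exclude decisions; it is only an example of the underlying statistic, not how Nested Knowledge computes it internally.

```python
# Minimal sketch: Cohen's kappa for two reviewers' screening decisions.
# The decision lists below are hypothetical example data.
from sklearn.metrics import cohen_kappa_score

# Include/exclude decisions for the same 10 records (1 = include, 0 = exclude)
reviewer_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
reviewer_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

# kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
# and p_e is the agreement expected by chance.
kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"Cohen's kappa: {kappa:.2f}")
```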
Nested Knowledge provides this as a built-in analytic in Activity whenever Dual Screening mode is switched on and screening has been performed by multiple reviewers. To access it, click Activity, then Screening Agreement next to User Contributions (see below).
Clicking Screening Agreement opens a modal that provides live kappa scores as well as inter-reviewer reliability for each team member.
You can also compare statistics between specific reviewers using the dropdown: