EasyDIAg: A tool for easy determination of interrater agreement

Publications: Contribution to journal › Journal articles › Research › Peer-reviewed

Standard

EasyDIAg: A tool for easy determination of interrater agreement. / Holle, Henning; Rein, Robert.

In: Behavior Research Methods, Vol. 47, No. 3, 09.2015, pp. 837-847.

BibTeX

@article{a75b9c07ef144ef7a655acf66425ce66,
title = "EasyDIAg: A tool for easy determination of interrater agreement",
abstract = "Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training. ",
keywords = "Cohen's kappa, Toolbox, Coding, Annotation, Rater, Agreement",
author = "Henning Holle and Robert Rein",
year = "2015",
month = sep,
doi = "10.3758/s13428-014-0506-7",
language = "English",
volume = "47",
pages = "837--847",
journal = "Behavior research methods",
issn = "1554-351X",
publisher = "Psychonomic Society Inc.",
number = "3",

}
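The abstract above refers to chance-corrected agreement, with Cohen's kappa listed among the keywords. As a point of reference only, the sketch below is a generic illustration of that chance correction for a square confusion matrix; it is not code from the EasyDIAg toolbox, which, per the abstract, additionally solves the linking problem for timed-event data before such a matrix can be formed. The function name and example counts are illustrative assumptions. Here p_observed is the raw proportion of agreement and p_chance is the agreement expected from the raters' marginal distributions alone, so kappa = (p_observed - p_chance) / (1 - p_chance).

    import numpy as np

    def cohens_kappa(confusion):
        """Chance-corrected agreement (Cohen's kappa) for a square
        rater-by-rater confusion matrix of nominal categories.
        Illustrative only; not the EasyDIAg implementation."""
        confusion = np.asarray(confusion, dtype=float)
        total = confusion.sum()
        # Raw (observed) agreement: proportion of events on the diagonal
        p_observed = np.trace(confusion) / total
        # Chance agreement from the two raters' marginal distributions
        p_chance = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
        return (p_observed - p_chance) / (1.0 - p_chance)

    # Example: two raters, three categories (hypothetical counts)
    print(cohens_kappa([[20, 5, 0],
                        [4, 15, 1],
                        [1, 2, 12]]))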

RIS

TY - JOUR

T1 - EasyDIAg: A tool for easy determination of interrater agreement

AU - Holle, Henning

AU - Rein, Robert

PY - 2015/9

Y1 - 2015/9

N2 - Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.

AB - Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.

KW - Cohen's kappa

KW - Toolbox

KW - Coding

KW - Annotation

KW - Rater

KW - Agreement

U2 - 10.3758/s13428-014-0506-7

DO - 10.3758/s13428-014-0506-7

M3 - Journal articles

VL - 47

SP - 837

EP - 847

JO - Behavior research methods

JF - Behavior research methods

SN - 1554-351X

IS - 3

ER -

ID: 1446909