|                          | Classifier vs. Juror 01 | Classifier vs. Juror 02 | Classifier vs. Juror 03 | Classifier vs. Juror 04 | Classifier vs. Juror 05 |
|--------------------------|-------------------------|-------------------------|-------------------------|-------------------------|-------------------------|
| Agreement among jurors   | 78 (52.00%)             | 80 (53.33%)             | 86 (57.33%)             | 87 (58.00%)             | 86 (57.33%)             |
| Agreement by chance      | 54.80 (36.56%)          | 53.90 (36.60%)          | 56.80 (37.87%)          | 55.50 (37.01%)          | 55.80 (37.19%)          |
| Kappa                    | 0.243                   | 0.264                   | 0.313                   | 0.333                   | 0.321                   |
| Kappa standard error     | 0.059                   | 0.059                   | 0.061                   | 0.056                   | 0.059                   |
| 95% confidence interval  | 0.128 to 0.358          | 0.148 to 0.380          | 0.194 to 0.433          | 0.223 to 0.444          | 0.205 to 0.437          |
| Agreement considered     | Little                  | Little                  | Little                  | Little                  | Little                  |
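The statistics above follow the standard Cohen's kappa recipe: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement rate and p_e is the chance agreement rate, with the confidence interval taken as kappa ± 1.96 × SE under a normal approximation. A minimal sketch, assuming the reported percentages are the inputs (the helper names are illustrative, not from the source):

```python
def cohens_kappa(p_observed: float, p_chance: float) -> float:
    """Cohen's kappa: chance-corrected agreement, (p_o - p_e) / (1 - p_e)."""
    return (p_observed - p_chance) / (1.0 - p_chance)

def confidence_interval(kappa: float, se: float, z: float = 1.96) -> tuple:
    """Approximate 95% CI assuming the kappa estimate is roughly normal."""
    return (kappa - z * se, kappa + z * se)

# Classifier vs. Juror 01: 52.00% observed agreement, 36.56% expected by chance.
kappa = cohens_kappa(0.5200, 0.3656)
low, high = confidence_interval(kappa, 0.059)
print(round(kappa, 3))  # → 0.243, matching the first column
print(round(low, 3), round(high, 3))
```

Small rounding differences against the table (e.g. 0.359 vs. 0.358 for the upper bound) are expected, since the table was presumably computed from unrounded counts.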