Table 3 Inter-rater reliability

From: Development and validation of a classification and scoring system for the diagnosis of oral squamous cell carcinomas through confocal laser endomicroscopy

| Examiner | Fleiss' kappa | Expected agreement | Observed agreement |
|---|---|---|---|
| Experts (n = 3) | 0.730 | 0.506 | 0.867 |
| Non-experts (n = 3) | 0.814 | 0.507 | 0.909 |

Note: Inter-rater reliability of experts (subjective evaluation) and non-experts (using DOC-score for diagnosis) in CLE assessment
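
For context, Fleiss' kappa is derived from the observed and expected agreement as kappa = (P_obs − P_exp) / (1 − P_exp). The following minimal Python sketch (the helper `fleiss_kappa` is illustrative, not from the paper) reproduces the table's kappa values from its agreement columns; small discrepancies in the last digit are expected because the tabulated agreement values are themselves rounded:

```python
def fleiss_kappa(p_obs: float, p_exp: float) -> float:
    """Fleiss' kappa from mean observed and expected agreement."""
    return (p_obs - p_exp) / (1 - p_exp)

# Values taken from Table 3
print(round(fleiss_kappa(0.867, 0.506), 3))  # experts: 0.731 (table: 0.730, rounding)
print(round(fleiss_kappa(0.909, 0.507), 3))  # non-experts: 0.815 (table: 0.814, rounding)
```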