Analytics


Inter-Annotator Agreement Charts (IAA)

Inter-Annotator Agreement (IAA) charts help annotators, reviewers, and managers identify contradictions or disagreements within starred completions. When multiple annotators work on the same tasks, IAA charts measure how well the annotations created by different annotators align. They can also be used to identify outliers in the labeled data or to compare manual annotations with model predictions. To access the IAA charts, navigate to the third tab of the Analytics Dashboard of NER projects, called “Inter-Annotator Agreement”. Several charts appear on the screen with a default selection of annotators to compare; the dropdown boxes below each chart let you change which annotators are compared. For some charts, the underlying data can be downloaded in CSV format by clicking the download button at the bottom right corner of the chart.

Note: Only the Submitted and starred (Ground Truth) completions are used to render these charts.
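If you need the raw agreement numbers rather than the charts, the downloaded CSV can be post-processed offline. The sketch below is a minimal example, assuming a hypothetical export named iaa_export.csv with one row per chunk and one label column per annotator (these names are illustrative, not the actual export schema), that computes Cohen's kappa between two annotators:

```python
# Minimal sketch of the pairwise agreement the IAA charts visualize.
# The CSV file name and column names are hypothetical; adjust them to
# match the file actually downloaded from a chart's download button.
import csv
from collections import defaultdict

def cohens_kappa(pairs):
    """Compute Cohen's kappa for a list of (label_a, label_b) pairs."""
    n = len(pairs)
    if n == 0:
        return 0.0
    observed = sum(a == b for a, b in pairs) / n
    freq_a, freq_b = defaultdict(int), defaultdict(int)
    for a, b in pairs:
        freq_a[a] += 1
        freq_b[b] += 1
    expected = sum(freq_a[l] * freq_b.get(l, 0) for l in freq_a) / (n * n)
    if expected == 1.0:
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical export: one row per chunk, one label column per annotator.
pairs = []
with open("iaa_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        pairs.append((row["annotator_1_label"], row["annotator_2_label"]))

print(f"Cohen's kappa: {cohens_kappa(pairs):.3f}")
```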

  • Shows agreement with a Label-wise breakdown
  • Shows whether two Annotators agree on each annotated Chunk
  • Shows agreement between one Annotator and the pre-annotation results on each annotated Chunk
  • Shows the Label assigned to each Chunk by one Annotator, together with its context in the task
  • Shows the frequency of Chunk-Label pairs for one Annotator
  • Shows the frequency of Chunk-Annotator pairs for one Label (see the sketch after this list)
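As a rough illustration of what the two frequency charts summarize, the following sketch counts chunk-label pairs per annotator and chunk-annotator pairs per label from a hypothetical CSV export; the file name and column names (annotator, chunk, label) are assumptions for illustration, not the documented export format:

```python
# Hedged sketch reproducing the chunk/label frequency views offline.
# Column names ("annotator", "chunk", "label") are assumed, not the
# actual export schema.
import csv
from collections import Counter

chunk_label_by_annotator = Counter()   # (annotator, chunk, label) -> count
chunk_annotator_by_label = Counter()   # (label, chunk, annotator) -> count

with open("annotations_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        chunk_label_by_annotator[(row["annotator"], row["chunk"], row["label"])] += 1
        chunk_annotator_by_label[(row["label"], row["chunk"], row["annotator"])] += 1

# Most frequent chunk-label pairs for a single annotator, mirroring the chart.
annotator = "annotator_1"  # hypothetical annotator id
top = [(k, v) for k, v in chunk_label_by_annotator.items() if k[0] == annotator]
for (_, chunk, label), count in sorted(top, key=lambda kv: -kv[1])[:10]:
    print(f"{chunk!r} -> {label}: {count}")
```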