Inter-annotator Agreement (IAA)
Identifying annotation conflicts between annotators is key to a successful annotation project. With UBIAI, you can easily evaluate team performance using the inter-annotator agreement function, which lets you distribute the same documents to multiple annotators to evaluate annotation consistency and identify conflicts. UBIAI supports IAA for Named Entity Recognition, Relation, and Document Classification annotation. To initiate an IAA task:
Go to the IAA tab in the Collaboration menu
Click the "+" sign to add a new IAA task
Enter the name of the IAA task, select the annotators, add annotation guidelines, set a due date for the IAA task, and select the documents you want to distribute to the annotators:
You can track the IAA score in real time as team members start annotating. Simply click on "Task Details" to generate the IAA report.
The first section of the report gives you details about the project and the task progress for each annotator. You can add or remove annotators from an existing IAA task by clicking the "Add member to IAA task" button.
In the next section, you can check the number of annotators who have completed a specific document:
To measure the degree of agreement between annotators, UBIAI provides an average kappa score (Cohen's kappa) as well as an annotator-pair score that measures the reliability between two annotators. The higher the score, the higher the agreement between annotators:
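UBIAI computes these scores for you, but if you want a feel for what the kappa metric captures, here is a minimal sketch (not UBIAI's internal implementation) that computes Cohen's kappa for two annotators using scikit-learn; the annotator names and labels below are purely hypothetical:

```python
# Minimal illustration of Cohen's kappa for two annotators.
# Not UBIAI's internal code; the labels below are hypothetical.
from sklearn.metrics import cohen_kappa_score

# Entity labels assigned by two annotators to the same ten spans.
annotator_a = ["PERSON", "ORG", "O", "ORG", "PERSON", "O", "O", "LOC", "ORG", "O"]
annotator_b = ["PERSON", "ORG", "O", "PERSON", "PERSON", "O", "LOC", "LOC", "ORG", "O"]

# Cohen's kappa = (observed agreement - chance agreement) / (1 - chance agreement):
# 1.0 means perfect agreement, 0.0 means agreement no better than chance.
kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Cohen's kappa: {kappa:.2f}")
```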
Last but not least, you can check the IAA score per document for each pair of annotators by clicking on their IAA score. This opens a dialog that shows the score for each document between the two annotators and allows for more granular analysis:
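As a rough illustration of this per-document breakdown (again, a hypothetical sketch rather than UBIAI's own computation), you could compute a kappa score for one annotator pair document by document:

```python
# Hypothetical per-document kappa for a single annotator pair,
# mirroring the per-document scores shown in the dialog.
from sklearn.metrics import cohen_kappa_score

# Labels each annotator assigned to the same spans, keyed by document.
pair_labels = {
    "doc_1": (["ORG", "O", "PERSON", "O"], ["ORG", "O", "PERSON", "O"]),
    "doc_2": (["LOC", "ORG", "O", "O"], ["LOC", "O", "O", "ORG"]),
}

for doc_id, (labels_a, labels_b) in pair_labels.items():
    print(doc_id, round(cohen_kappa_score(labels_a, labels_b), 2))
```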
To review the conflicts, simply click on a document from the list and it will open the annotations of both annotators; any conflict will be highlighted in red, as shown below: