# Inter-annotator Agreement (IAA)

Identifying annotation conflicts between annotators is key to a successful annotation project. With UBIAI, you can easily evaluate team performance using the inter-annotator agreement function, which lets you distribute the same documents to multiple annotators, evaluate the consistency of their annotations, and identify conflicts. UBIAI supports IAA for Named Entity Recognition, Relation Extraction, and Document Classification. To initiate an IAA task:

1. Go to the IAA tab in the Collaboration menu.
2. Click the "+" sign to add a new IAA task.
3. Enter the name of the IAA task, select the annotators, add annotation guidelines, set a due date, and select the documents you want to distribute to the annotators:

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2Fpc5aPTiiXbmfKwkzi4Sm%2FIAA.png?alt=media&#x26;token=1f0f3c6a-6737-48c8-8691-4601afadbcf4" alt=""><figcaption></figcaption></figure>

4. You can track the IAA score in real time as team members annotate. Simply click on "Task Details" to generate the IAA report.
5. The first section of the report gives you the project details and the task progress for each annotator. You can add or remove annotators from an existing IAA task by clicking the "Add member to IAA task" button.

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2FWyd8YzV6eW40qrOVznJn%2FIAA_report1.png?alt=media&#x26;token=c142e651-95c5-4c54-9489-ce04bbeec66c" alt=""><figcaption></figcaption></figure>

6. The next section shows the number of annotators who have completed each document:

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2F6EUcmYqbms52CHpwoxQ8%2FIAA_report2.png?alt=media&#x26;token=f155c8e2-949b-45b3-9c81-85a9b0aa1624" alt=""><figcaption></figcaption></figure>

7. To measure the degree of agreement between annotators, UBIAI provides an average kappa score (Cohen's kappa) as well as a per-pair score that measures the reliability between two annotators. The higher the score, the stronger the agreement:

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2FSGLjDcZmakRuEVuAYMwq%2Fimage.png?alt=media&#x26;token=4ea281b7-fcd2-4adf-ac04-d6bfd466f87e" alt=""><figcaption></figcaption></figure>
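To build intuition for the kappa scores shown in the report, here is a minimal sketch of how Cohen's kappa is computed between two annotators' label sequences. This is an illustration of the metric itself, not UBIAI's internal implementation; the sample labels are hypothetical and assume one label per item.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators' label sequences.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance.
    """
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement, from each annotator's marginal label distribution.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_e = sum(freq_a[lab] * freq_b.get(lab, 0) for lab in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical entity labels from two annotators on the same six spans:
ann_1 = ["PER", "ORG", "ORG", "LOC", "PER", "ORG"]
ann_2 = ["PER", "ORG", "LOC", "LOC", "PER", "PER"]
print(cohens_kappa(ann_1, ann_2))  # 0.52
```

A kappa of 1.0 means perfect agreement, 0 means agreement no better than chance, and negative values mean systematic disagreement, which is why a higher pairwise score in the report indicates a more reliable annotator pair.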

Last but not least, you can check the IAA score per document for each pair of annotators by clicking on their IAA score. This opens a dialog that shows the score for each document between the two annotators and allows for more granular analysis:

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2FrArOb8znQ6KzCPhM6KSR%2FIAA_conflicts.png?alt=media&#x26;token=e65e4b14-3b3c-4879-a87d-2776fce36a7a" alt=""><figcaption></figcaption></figure>

To review the conflicts, simply click on a document from the list. This opens the annotations of both annotators side by side, with any conflicts highlighted in red as shown below:

<figure><img src="https://3073024999-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F0hqV0hLtifjaqlfkGT7n%2Fuploads%2Fk5sb2hZP8pghQ9tH52g6%2Fconflicts_ann.png?alt=media&#x26;token=4e507a7f-c1fe-46fd-8dc5-7b77dd61081a" alt=""><figcaption></figcaption></figure>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://ubiai.gitbook.io/ubiai-documentation/inter-annotator-agreement-iaa.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
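One practical detail the example above leaves implicit: the question must be URL-encoded before being passed as the `ask` query parameter. A minimal sketch, using only the Python standard library (the sample question is hypothetical):

```python
import urllib.parse

# Base URL of this documentation page, as given above.
base = "https://ubiai.gitbook.io/ubiai-documentation/inter-annotator-agreement-iaa.md"

# A specific, self-contained question in natural language (hypothetical example).
question = "What does a kappa score of 0.6 indicate about annotator agreement?"

# URL-encode the question before appending it as the `ask` parameter.
url = f"{base}?ask={urllib.parse.quote(question)}"
print(url)
# Performing an HTTP GET on this URL (e.g. with urllib.request.urlopen)
# returns a direct answer with relevant excerpts and sources.
```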
