Productivity

Analytics Charts

By default, the Analytics page is disabled for every project because computing the analytics charts is resource-intensive and can temporarily affect the responsiveness of the application, especially when triggered in parallel with other training or pre-annotation jobs. Users can, however, file a request to enable the Analytics page. The request is published on the Analytics Requests page, which is visible to all admin users, and any admin user can approve it. Once the request is approved, any team member can access the Analytics page.

A refresh button is present in the top-right corner of the Analytics page. The Analytics charts do not automatically reflect changes made by annotators (such as creating tasks or adding new completions). Use the refresh button to update the analytics with the latest changes.

Task Analytics

To access Task Analytics, navigate to the first tab of the Analytics Dashboard, called Tasks. The following blog post explains how to Improve Annotation Quality using Task Analytics in the Generative AI Lab.

Below are the charts included in the Tasks section.

Total number of tasks in the Project

Total number of tasks in the Project in the last 30 days

Breakdown of tasks in the Project by Status

Breakdown of tasks by author

Summary of task status for each annotator

Total number of label occurrences across all completions

Average number of label occurrences for each completion

Total number of label occurrences across all completions for each annotator

Total vs distinct count of labels across all completions

Average number of tokens by label

Total number of label occurrences that include numeric values


Team Productivity

To access Team Productivity charts, navigate to the second tab of the Analytics Dashboard, called Team Productivity. The following blog post explains how to Keep Track of Your Team Productivity in the Generative AI Lab.

Below are the charts included in the Team Productivity section.

Total number of completions in the Project

Total number of completions in the Project in the last 30 days

Total number of completions for each Annotator

Total number of completions submitted over time for each Annotator

Average time spent by each Annotator on each task

Total number of completions submitted over time


Inter-Annotator Agreement (IAA)

Starting from version 2.8.0, Inter-Annotator Agreement (IAA) charts allow the comparison of annotations produced by Annotators, Reviewers, or Managers.

Inter-Annotator Agreement charts can be used by Annotators, Reviewers, and Managers to identify contradictions or disagreements within the starred completions (Ground Truth). When multiple annotators work on the same tasks, IAA charts are handy for measuring how well the annotations created by different annotators align. IAA charts can also be used to identify outliers in the labeled data or to compare manual annotations with model predictions.
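For intuition, here is a minimal sketch of how chunk-level agreement between two annotators could be computed. This is an illustration, not the Lab's internal formula: each annotation is treated as a (start, end, label) span, and agreement is scored as the F1 overlap between the two annotators' span sets. The sample spans and labels below are hypothetical.

```python
# Illustrative chunk-level agreement between two annotators
# (assumption: exact match on span boundaries and label counts as agreement).

def chunk_agreement(ann_a, ann_b):
    """Return precision/recall/F1 of annotator B's spans against annotator A's."""
    set_a, set_b = set(ann_a), set(ann_b)
    matched = len(set_a & set_b)  # spans identical in boundaries and label
    precision = matched / len(set_b) if set_b else 0.0
    recall = matched / len(set_a) if set_a else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical starred (Ground Truth) completions from two annotators
annotator_1 = [(0, 12, "DRUG"), (20, 28, "DOSAGE")]
annotator_2 = [(0, 12, "DRUG"), (35, 42, "FREQUENCY")]

print(chunk_agreement(annotator_1, annotator_2))  # -> (0.5, 0.5, 0.5)
```

The same pairwise idea extends to comparing an annotator against model predictions, which is what the model-vs-annotator charts in this section visualize.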

To access IAA charts, navigate to the third tab of the Analytics Dashboard of NER projects, called Inter-Annotator Agreement. Several charts appear on the screen with a default selection of annotators to compare. The dropdown selectors in the top-left corner of each chart let you change the annotators being compared. For projects containing both NER and Assertion Status entities, another dropdown filters between NER labels and Assertion Status labels. The data generated for some charts can also be downloaded in CSV format by clicking the download button just below the dropdown selectors.

Note: Only the Submitted and starred (Ground Truth) completions are used to render these charts.

The following blog post explains how your team can Reach Consensus Faster by Using IAA Charts in the Generative AI Lab.

Below are the charts included in the Inter-Annotator Agreement section.

High-level IAA between annotators on all common tasks

IAA between annotators for each label on all common tasks

Comparison of annotations by annotator on each chunk

Comparison of annotations by model and annotator (Ground Truth) on each chunk

All chunks annotated by an annotator

Frequency of labels on chunks annotated by an annotator

Frequency of a label on chunks annotated by each annotator

Download data used for charts

CSV files for specific charts can be downloaded using the download button, which calls dedicated API endpoints: /api/projects/{project_name}/charts/{chart_type}/download_csv
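As a hedged sketch, the same endpoint can be called directly, for example from a script. The host, authentication scheme, project name, and chart_type value below are assumptions for illustration; adapt them to your deployment.

```python
# Sketch: download a chart's CSV via the documented endpoint.
import requests

BASE_URL = "http://localhost"      # assumption: your Generative AI Lab instance
TOKEN = "<access-token>"           # assumption: bearer-token authentication
project_name = "my_ner_project"    # hypothetical project name
chart_type = "iaa"                 # hypothetical chart_type value

resp = requests.get(
    f"{BASE_URL}/api/projects/{project_name}/charts/{chart_type}/download_csv",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Save the returned CSV locally
with open(f"{chart_type}_chart.csv", "wb") as f:
    f.write(resp.content)
```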

