Experiments quality

To view the quality of an experiment:

  1. Go to the experiments dashboard.
  2. Select the experiment from the Experiments panel.

Or:

  1. Select the last experiment's Statistics from the Experiments drop-down menu.
  2. Select the experiment from the Experiments panel.

The Statistics sub-tab—displayed by default—shows a set of statistical information about an experiment.

Information about the experiment

In the upper part of the tab, you can see the experiment details, such as:

  • Experiment name
  • Performance date and time
  • Experiment author
  • Engine type
  • Number of library documents
  • Number of analyzed documents
  • Number of validated documents
  • Training library
  • ML Model type (ML Engine only)

Check the quality

To visually check the experiment quality measurements, such as Precision, Recall, and F-measure, look at the charts in the middle part of the tab.

The quality measurements are displayed according to the following averaging metrics:

  • Micro Average
  • Macro Average
  • Sample Average
  • Weighted Average

The metric in focus is selected in the Settings window, Experiments tab. The specific metric icon is also displayed in the experiment card.
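
For reference, the following sketch (illustrative Python with made-up per-category counts, not the product's internal code) shows how the Micro, Macro and Weighted averages of precision are commonly computed. The Sample average, presumably computed per document and then averaged, is omitted because it needs per-document counts.

    # Illustrative sketch only, not the product's internal code: how the same
    # per-category counts can yield different averages. Counts are made up.
    counts = {
        "planets":   {"tp": 90, "fp": 10, "fn": 20},
        "phenomena": {"tp": 5,  "fp": 20, "fn": 5},
    }

    def precision(tp, fp):
        return tp / (tp + fp) if tp + fp else 0.0

    # Micro Average: pool all counts first, then compute the metric once.
    micro = precision(sum(c["tp"] for c in counts.values()),
                      sum(c["fp"] for c in counts.values()))

    # Macro Average: compute the metric per category, then take the plain mean.
    per_category = [precision(c["tp"], c["fp"]) for c in counts.values()]
    macro = sum(per_category) / len(per_category)

    # Weighted Average: like macro, but each category is weighted by its
    # support (assumed here to be TP + FN, the number of annotations of the category).
    supports = [c["tp"] + c["fn"] for c in counts.values()]
    weighted = sum(p * s for p, s in zip(per_category, supports)) / sum(supports)

    print(f"micro={micro:.2f} macro={macro:.2f} weighted={weighted:.2f}")
    # micro=0.76 macro=0.55 weighted=0.84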

Analytics on resources

In the Resources sub-tab, the following items are displayed for each category:

  • Cat (Number of hits)
  • Ann. (Number of annotations)
  • TP (True Positive)
  • FP (False Positive)
  • FN (False Negative)
  • Pre. (Precision)
  • Rec. (Recall)
  • F-mea. (F-Measure)
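
For reference, the sketch below (illustrative Python, not the product's code) shows how the Pre., Rec. and F-mea. columns are conventionally derived from the TP, FP and FN counts of a category.

    # Illustrative sketch: the usual way precision, recall and F-measure are
    # derived from the TP, FP and FN counts of a category (not the product's code).
    def category_metrics(tp: int, fp: int, fn: int) -> dict:
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f_measure = (2 * precision * recall / (precision + recall)
                     if precision + recall else 0.0)
        return {"precision": precision, "recall": recall, "f_measure": f_measure}

    # A category with 8 true positives, 2 false positives and 4 false negatives:
    print(category_metrics(tp=8, fp=2, fn=4))
    # precision = 8/10 = 0.80, recall = 8/12 ≈ 0.67, F-measure ≈ 0.73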

Sort the resources

To sort the categories based on a specific item, hover over a column header and select the arrow beside it.

The taxonomy is displayed in a tree structure by default if there are child nodes. Select the tree structure button to view the taxonomy as a table. To restore the tree structure, select the arrow beside Category until the tree structure button reappears.

To open the search bar, hover over the category and select Search.

You can then see the Documents statistics tab and the list of documents filtered by the category in focus.

Show the categories in Resources

To show the selected category in the Resources tab, hover over the category and select Show in resources.

Show info

To show additional information about a category, such as its broader category and the annotations, hover over the category and select Information about this category.

Worst categories

The Worst categories panel on the right lists several information boxes about the worst project categories:

  • Precision: Top 10 categories with a precision lower than 70%
  • Recall: Top 10 categories with a recall lower than 70%
  • Target coverage: Top 10 categories with a coverage of the library documents lower than 70%
  • False positive: Top 10 categories with a percentage of false positives higher than 30%

The target coverage is calculated as follows:

(True Positives + False Positives)/Number of documents

The percentage of false positives is calculated as follows:

False Positives/Number of documents
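
As a worked example, the sketch below (illustrative Python with hypothetical counts over a 100-document library, not the product's code) applies the two formulas above.

    # Illustrative sketch of the two ratios above; counts are hypothetical.
    def target_coverage(tp: int, fp: int, num_documents: int) -> float:
        # (True Positives + False Positives) / Number of documents
        return (tp + fp) / num_documents if num_documents else 0.0

    def false_positive_share(fp: int, num_documents: int) -> float:
        # False Positives / Number of documents
        return fp / num_documents if num_documents else 0.0

    # A category with 55 true positives and 10 false positives over 100 documents:
    print(target_coverage(tp=55, fp=10, num_documents=100))    # 0.65 -> 65% coverage
    print(false_positive_share(fp=10, num_documents=100))      # 0.1  -> 10% false positives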

Documents quality results

Documents with metrics

In the Documents sub-tab, select the Documents with metrics sub-panel.

For each document, the following items are displayed:

  • Document validated mark (if the document is validated)
  • Hits
  • Ann. (Number of annotations)
  • TP (True Positive)
  • FP (False Positive)
  • FN (False Negative)
  • Pre. (Precision)
  • Rec. (Recall)
  • F-mea. (F-Measure)

Sort the documents

To sort the documents based on a specific item, hover over a column header and select the arrow beside it.

Check the annotations or the hits in a document

To check the resulting categories in a document, select the expand icon.

Open a document and add annotations

To open a document in the detail view for the experiment in focus, hover over it and select Open document.

In the selected experiment, Exp. 46, the categories planets and phenomena are respectively recognized as a true positive and a false negative in the document Astro38.txt.

What you see is a record of the document as it was at the time of the selected experiment, before the latest one was performed.

In this view, annotation is not possible. To annotate, select Annotate documents from the toolbar beside the document name. The annotation view opens.

As you can see, in the latest experiment now in focus (Experiment 55) both categories are true positives. This means that, between Exp. 46 and Experiment 55, the category phenomena changed from a false negative to a true positive.
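
To make the comparison concrete, the sketch below (illustrative Python with assumed annotations, not the product's code) derives the TP/FP/FN status of each category in both experiments by comparing the document's annotations with the resulting categories.

    # Hedged sketch, not the product's code: deriving the TP/FP/FN status of
    # each category by comparing the document's annotations with the resulting
    # categories of each experiment. Names follow the example above.
    gold = {"planets", "phenomena"}                 # annotated categories of Astro38.txt

    results = {
        "Exp. 46":       {"planets"},               # phenomena not returned
        "Experiment 55": {"planets", "phenomena"},  # both returned
    }

    for experiment, predicted in results.items():
        status = {}
        for category in sorted(gold | predicted):
            if category in gold and category in predicted:
                status[category] = "TP"
            elif category in predicted:
                status[category] = "FP"
            else:
                status[category] = "FN"
        print(experiment, status)
    # Exp. 46: phenomena is a false negative; Experiment 55: both are true positives.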

An alternative way to reach this view, while in the experiments dashboard with an experiment in focus, is:

  1. In the Documents with metrics sub-panel, hover over a document.
  2. Select Annotate document.

Filter validated documents

To filter validated documents, select Only validated documents to set one of the filter criteria:

  • No filtering
  • Only validated documents
  • Only not validated documents

Not annotated documents

Select the Not annotated documents sub-panel. The list shows the non-annotated documents that get resulting categories from the experiments; they are displayed with a blue chip.

Annotate a document

To annotate a document, hover over the document of interest and select Annotate document, then follow the annotation procedure.

Open a document in detail view

To open a document in the detail view, hover over the document of interest and select Open document.

Once there, select Annotate documents to start annotating.

Filter documents with extractions

To filter documents with extractions, select Only documents with extractions to set one of the filter criteria:

  • No filtering
  • Only documents with extractions
  • Only documents with no extractions

Show resources

To show resources, select Show resources.

Delete the experiment

To delete the experiment, select Delete.

Edit the experiment name

To edit the experiment name, select Edit experiment name.

Select other experiments

To view a different experiment, select it from the Experiments panel.

View the model generated by the experiment in focus

Select the Model sub-tab.

Activity log

Select the Activity log sub-tab to check the logs.