Thresholds Statistics

The Thresholds Statistics view is where you configure how confident a model needs to be before its predictions are presented as suggestions during annotation. Each label can have its own threshold and default value.

When a model predicts a label, it also reports how confident it is — a number between 0 and 1. A prediction with confidence 0.95 means the model is fairly certain. A prediction with confidence 0.4 means it is guessing.

The confidence threshold you set here determines the boundary. Predictions at or above the threshold are shown as suggestions in the annotation view. Predictions below it are not surfaced — the annotator sees the default value instead.
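To make the decision rule concrete, here is a minimal Python sketch of the logic described above. The names (`Prediction`, `suggest_value`, the label value "scratch") are illustrative assumptions, not part of the product:

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    value: str         # the predicted option for a label
    confidence: float  # model confidence, between 0 and 1

def suggest_value(prediction, threshold, default):
    """Return the value pre-selected in the annotation view."""
    if prediction is not None and prediction.confidence >= threshold:
        return prediction.value  # confident enough: surface as a suggestion
    return default               # no prediction, or below threshold: show the default

# With a threshold of 0.8 and a default of "unknown":
print(suggest_value(Prediction("scratch", 0.95), 0.8, "unknown"))  # -> "scratch"
print(suggest_value(Prediction("scratch", 0.40), 0.8, "unknown"))  # -> "unknown"
print(suggest_value(None, 0.8, "unknown"))                         # -> "unknown"
```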

At the top of the page, a dropdown labeled Select Config lets you choose which annotation configuration to edit. Each configuration corresponds to a specific image position in your inspection setup, identified by its image label or image ID.

Once you select a configuration, the page shows each annotation label within it.

For each label you can adjust:

| Setting | Description |
| --- | --- |
| Label name | The human-readable name of this label, as defined in the label map. |
| Default value | The value pre-selected during annotation when no prediction exists or when the prediction falls below the threshold. Choose from the available options for this label. |
| Confidence threshold | A number between 0 and 1. Predictions with confidence at or above this value are shown as suggestions. Enter it directly or use the slider. |
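As a rough illustration of how these per-label settings fit together, the sketch below models them as a simple mapping. The label names, field names, and dict layout are assumptions for the example, not the product's actual schema:

```python
# Illustrative per-label settings mirroring the table above.
thresholds_config = {
    "scratch":  {"default_value": "no",      "confidence_threshold": 0.80},
    "dent":     {"default_value": "no",      "confidence_threshold": 0.65},
    "severity": {"default_value": "unknown", "confidence_threshold": 0.90},
}

# Thresholds must stay within [0, 1], matching the slider's range.
for label, settings in thresholds_config.items():
    assert 0.0 <= settings["confidence_threshold"] <= 1.0, label
```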
To update a configuration:

  1. Select the configuration you want to edit from the dropdown.
  2. Adjust default values and thresholds for each label as needed.
  3. Click Save at the bottom of the page.
  4. Confirm the result. A green notification appears if the save succeeded, or a red one if something went wrong.

Changes take effect immediately for all users. The next time someone opens the annotation view, they will see the updated thresholds and defaults.