Thresholds

The Thresholds view lets you set confidence thresholds for each annotation label. These thresholds determine the minimum confidence a model prediction needs before it is considered meaningful. Below the threshold, a prediction is treated as uncertain.

Every prediction the model makes comes with a confidence score — a number between 0 and 1 that indicates how certain the model is. A prediction with confidence 0.95 means the model is quite sure; a prediction at 0.35 means it is guessing.

Thresholds let you define where that line sits for each label. This is not about making the model more accurate — it is about deciding how much uncertainty you are willing to accept before acting on a prediction.
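To make the behavior concrete, here is a minimal sketch of per-label thresholding. The label names, thresholds, and function are illustrative assumptions, not part of the product's API: a prediction whose confidence meets its label's threshold is accepted, and anything below falls back to that label's default value.

```python
# Hypothetical sketch of per-label confidence thresholding.
# Labels, thresholds, and defaults below are illustrative only.

def apply_threshold(prediction, thresholds, default_values):
    """Return the predicted value if confidence meets the label's
    threshold; otherwise fall back to the label's default value."""
    label = prediction["label"]
    if prediction["confidence"] >= thresholds.get(label, 0.5):
        return prediction["value"]
    return default_values.get(label)

thresholds = {"damage": 0.8, "color": 0.6}      # per-label minimum confidence
default_values = {"damage": "unknown", "color": "unspecified"}

# Confidence 0.72 is below the 0.8 threshold for "damage",
# so the default value "unknown" is returned instead.
pred = {"label": "damage", "value": "scratch", "confidence": 0.72}
result = apply_threshold(pred, thresholds, default_values)
```

Raising a label's threshold trades coverage for certainty: fewer predictions are acted on, but those that are carry higher confidence.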

At the top of the page, a dropdown lets you select which annotation configuration to work with. Each configuration corresponds to a specific image label and image ID combination, and contains one or more annotation labels.

For each annotation label in the selected configuration, you will see:

| Control | Description |
| --- | --- |
| Label name | The annotation label, shown alongside its image label and image ID. |
| Default value | A dropdown to select the default annotation value for this label. |
| Confidence threshold slider | A slider from 0 to 1 that sets the minimum confidence level. |
| Confidence threshold input | A number field for fine-grained control over the threshold value. |

Click the Save button at the bottom of the page to persist your changes. A notification confirms whether the save succeeded or failed; changes are not applied until the save succeeds.