# Thresholds
The Thresholds view lets you set confidence thresholds for each annotation label. These thresholds determine the minimum confidence a model prediction needs before it is considered meaningful. Below the threshold, a prediction is treated as uncertain.
## Why thresholds matter

Every prediction the model makes comes with a confidence score: a number between 0 and 1 that indicates how certain the model is. A prediction with confidence 0.95 means the model is quite sure; a prediction at 0.35 means it is largely guessing.
Thresholds let you define where that line sits for each label. This is not about making the model more accurate — it is about deciding how much uncertainty you are willing to accept before acting on a prediction.
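As a rough sketch of the idea, a per-label threshold simply gates each prediction by its confidence score. The names below (`Prediction`, `thresholds`, `is_confident`) are illustrative assumptions, not part of any documented API:

```python
# Illustrative sketch only: shows how a per-label confidence threshold
# might separate "meaningful" predictions from uncertain ones.
from dataclasses import dataclass


@dataclass
class Prediction:
    label: str          # annotation label the prediction applies to
    value: str          # predicted annotation value
    confidence: float   # model confidence in [0, 1]


# Hypothetical per-label thresholds, as configured in the Thresholds view.
thresholds = {"defect": 0.8, "orientation": 0.5}


def is_confident(pred: Prediction) -> bool:
    """A prediction counts as meaningful only at or above its label's threshold."""
    return pred.confidence >= thresholds.get(pred.label, 0.5)


print(is_confident(Prediction("defect", "present", 0.95)))  # True
print(is_confident(Prediction("defect", "present", 0.35)))  # False: below 0.8
```

Raising a label's threshold trades coverage for certainty: fewer predictions pass, but those that do are ones the model is more sure about.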
## UI walkthrough

### Selecting an annotation configuration

At the top of the page, a dropdown lets you select which annotation configuration to work with. Each configuration corresponds to a specific image label and image ID combination and contains one or more annotation labels.
### Threshold controls

For each annotation label in the selected configuration, you will see:
| Control | Description |
|---|---|
| Label name | The annotation label, shown alongside its image label and image ID. |
| Default value | A dropdown to select the default annotation value for this label. |
| Confidence threshold slider | A slider from 0 to 1 that sets the minimum confidence level. |
| Confidence threshold input | A number field for fine-grained control over the threshold value. |
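One plausible way the default value and the threshold could interact, offered purely as an illustration and not as documented behavior, is that a below-threshold prediction falls back to the label's default annotation value:

```python
# Illustrative sketch only: `resolve_annotation` and the fallback-to-default
# behavior are assumptions for the sake of example, not a documented API.
def resolve_annotation(predicted_value: str,
                       confidence: float,
                       threshold: float,
                       default_value: str) -> str:
    """Accept the prediction when confident enough; otherwise use the default."""
    return predicted_value if confidence >= threshold else default_value


print(resolve_annotation("scratch", 0.92, 0.8, "none"))  # "scratch"
print(resolve_annotation("scratch", 0.41, 0.8, "none"))  # "none"
```

The slider and the number input control the same threshold value; the input simply allows more precise values than the slider steps.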
### Saving changes

Click the Save button at the bottom of the page to persist your changes. A notification confirms whether the save succeeded or failed.