Compare Model Performance

You have trained your model and are now ready to see how it performs. It is time to run a cycle of the Active model optimization workflow.

Encord Active workflow

Now you want to compare your model's performance from before using Encord (or maybe after running a number of data curation and label validation cycles). Active supports direct comparison of model prediction performance from within your Active Project.
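
The comparison itself happens in the Active UI, but it helps to picture what is being compared. The following is an illustrative sketch only (it is not the Encord API): it scores two prediction runs, a baseline and a post-curation run, against the same ground truth, which is the kind of before/after difference the Model Evaluation page surfaces.

```python
# Illustrative only: a rough "before vs after" comparison of two prediction runs.
# All data here is made up; Active computes richer metrics for you in the UI.
from typing import Dict, Tuple


def precision_recall(preds: Dict[str, str], truth: Dict[str, str], positive: str) -> Tuple[float, float]:
    tp = sum(1 for k, p in preds.items() if p == positive and truth[k] == positive)
    fp = sum(1 for k, p in preds.items() if p == positive and truth[k] != positive)
    fn = sum(1 for k, t in truth.items() if t == positive and preds.get(k) != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall


ground_truth = {"img1": "cat", "img2": "dog", "img3": "cat", "img4": "cat"}
baseline_run = {"img1": "cat", "img2": "cat", "img3": "dog", "img4": "cat"}  # before curation
curated_run = {"img1": "cat", "img2": "dog", "img3": "cat", "img4": "cat"}   # after curation

for name, run in [("baseline", baseline_run), ("post-curation", curated_run)]:
    p, r = precision_recall(run, ground_truth, positive="cat")
    print(f"{name}: precision={p:.2f} recall={r:.2f}")
```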

To compare your model's performance:

This process assumes you have already imported your model's predictions into Active at least twice.

  1. Log in to Encord.
    The Encord Homepage appears.

  2. Create a Workflow Project in Annotate.

  3. Click Active.
    The Active landing page appears.

  4. Import your Annotate Project.

  5. Click an Active Project.
    The Project opens on the Explorer.

  6. Click Model Evaluation.
    The Model Evaluation page appears with Summary displayed.

  7. Under Overview, select an entry from the Prediction Set dropdown.

  8. Under Overview, select an entry from the Compare against dropdown.

  9. Click through the various entries on the left side of the Model Evaluation page to view the comparison.

  10. Add more data and repeat the data curation, label validation, and model optimization cycles until the model reaches the performance level you require (a rough sketch of a stopping check follows this list).
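
Step 10 is an iterative loop rather than a single action. A minimal sketch of the stopping check, assuming you record one headline metric (for example, mAP) for each prediction set you import; the run names, scores, and target below are placeholders, not Encord values.

```python
# Hypothetical stopping check for the curation/validation/optimization loop.
# The run names, scores, and target are placeholders, not Encord values.
runs = {
    "baseline": 0.58,
    "after-curation-1": 0.66,
    "after-curation-2": 0.73,
}
TARGET_MAP = 0.75

latest_name, latest_score = list(runs.items())[-1]
if latest_score >= TARGET_MAP:
    print(f"{latest_name} reached {latest_score:.2f}; stop iterating.")
else:
    print(f"{latest_name} at {latest_score:.2f}; run another curation/validation cycle.")
```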

To compare your model's performance from scratch:

This process assumes you are just getting started with Encord and have not yet trained your model. You use Encord to prepare your data for annotation, annotate and label your data, validate your labels, and fix any label issues before training your model.

  1. Log in to Encord.
    The Encord Homepage appears.

  2. Create a Workflow Project in Annotate.

  3. Click Active.
    The Active landing page appears.

  4. Import your Annotate Project.

  5. Click an Active Project.
    The Project opens on the Explorer.

  6. Click Model Evaluation.
    The Model Evaluation page appears.

  7. Import a Prediction Set.

  8. [Perform data curation on your Project in Active](https://docs.encord.com/docs/active-use-cases#data-cleansingcuration).

  9. Send the Project to Annotate.

  10. Label and review your data in Annotate.

  11. Sync the Active Project with the updated Annotate Project.

  12. Perform label validation on your updated and synced Project.

  13. Send the Project to Annotate.

  14. Label and review your data in Annotate.

  15. Retrain your model using the curated and validated data and labels (see the SDK sketch after these steps).

  16. Click the Active Project.
    The Project opens on the Explorer.

  17. Click Model Evaluation.
    The Model Evaluation page appears.

  18. Import the updated Prediction Set.

  19. Under Overview, select an entry from the Prediction Set dropdown.

  20. Under Overview, select an entry from the Compare against dropdown.

  21. Click through the various entries on the left side of the Model Evaluation page to view the comparison.

  22. Add more data and repeat the data curation, label validation, and model optimization cycles until the model reaches the performance level you require.
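
Step 15 in this flow (retraining on the curated, validated labels) happens outside Encord. A minimal sketch, assuming the Encord Python SDK (`encord` on PyPI) and an SSH key registered to your account, that pulls the reviewed labels so you can feed them to your training code; the key path and project hash are placeholders, and method names may differ across SDK versions.

```python
# A sketch using the Encord Python SDK to pull reviewed labels for retraining.
# The SSH key path and project hash are placeholders; adapt the loop to build
# whatever training-set format your model expects.
from pathlib import Path

from encord import EncordUserClient

ssh_key = Path("~/.ssh/encord_private_key").expanduser().read_text()
user_client = EncordUserClient.create_with_ssh_private_key(ssh_key)

project = user_client.get_project("<project-hash>")

for label_row in project.list_label_rows_v2():
    label_row.initialise_labels()  # download the label content for this data unit
    objects = label_row.get_object_instances()
    print(f"{label_row.data_title}: {len(objects)} labelled objects")
```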