Manual QA

Watch the video, or follow the step-by-step guide below to learn how to create Manual QA projects.

Creating Manual QA projects

In the Annotate section of the navigation bar, select 'Projects'. Select the Manual QA tab to start creating a Manual QA Project.

  1. Enter a meaningful title and description. A clear title and description help keep your Projects organized.

If you are part of an Organization, you will see an optional project tags drop-down. Project tags are useful for categorizing your projects. Select as many tags as are relevant to your project.

  2. Attach one or more Datasets to the Project. Click the Attach dataset button and select the Datasets you want to add to the Project. You have the option to create a new Dataset.

  3. Attach an Ontology to the Project. Click the Attach ontology button and select the Ontology you want to add to the Project. You have the option to create a new Ontology.
  4. Click Create project to create the Project.

Roles and permissions

The following permissions apply across the project roles (Admin, Team Manager, Reviewer, Annotator, Annotator & Reviewer):

  • Attach / Detach datasets
  • Attach / Switch ontology
  • Delete the Project
  • Invite team members
  • Manage team permissions
  • Manage admins
  • Annotate & review tasks in the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system). Reviewers can review only; Annotators can annotate only.
  • Confirm annotations outside of the Task Management System
  • Control assignments and status in the Task Management System

Manual Quality Assurance

Manual quality assurance for annotation projects means that annotation tasks have to be reviewed before they can be marked as Complete.

You can set the following parameters for manual quality control in the Settings tab of your annotation project, shown in the screenshot below:

  • The percentage of labels that are to be manually reviewed.
  • Rules for distribution of review tasks.
  • Common rejection reasons that can be used to identify and systematize errors in your labels.
  • Reviewer to class and annotator mapping (e.g. label X with class Y should always be reviewed by reviewer Z).
  • Forwarding tasks that are rejected after a specific number of review cycles to expert review.

A. Sampling rate
B. Multi review assignment
C. Default rejection reasons
D. Reviewer mapping
E. Expert review

Sampling rate

Project administrators can dynamically change the sampling rate applied to submitted annotation tasks. The sampling rate determines the proportion of the submitted labels that a reviewer should review. This can be modified with the slider.

Sampling rates can also be configured by annotation type and annotator (e.g. class Y should have a sampling rate of 50%, class Z should have a sampling rate of 80%, annotator A should have a sampling rate of 70%, annotator B should have a sampling rate of 95%) by clicking the Configure button (this feature is only available to paying users).
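
As a rough worked example of how these rates translate into review volume (illustrative arithmetic only; the class names and counts below are assumptions, not Encord defaults):

```python
# Illustrative only: expected review volume under assumed per-class sampling rates.
submitted_labels = {"class_Y": 200, "class_Z": 100}   # hypothetical label counts
sampling_rates = {"class_Y": 0.5, "class_Z": 0.8}     # 50% and 80%, as in the example above

expected_reviews = {
    cls: round(count * sampling_rates[cls])
    for cls, count in submitted_labels.items()
}
print(expected_reviews)                 # {'class_Y': 100, 'class_Z': 80}
print(sum(expected_reviews.values()))   # roughly 180 labels routed to reviewers
```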


Multi review assignment

Annotation tasks with many labels across one data asset might get partitioned into review tasks that are distributed to different reviewers. Enabling multi review assignment means that all review tasks generated through the submission of one annotation task are assigned to the same reviewer.


Default rejection reasons

Default rejection reasons allow an admin to create default responses a reviewer can select when rejecting annotation tasks. Pressing the + New button and entering a response will save it for future reviews. Setting default rejection reasons can help you identify and systematize errors in your labels.


Reviewer mapping

You can configure rules that automatically assign specific reviewers to classes and annotators (e.g. label X with class Y should always be reviewed by reviewer Z). The setting can be configured by toggling the 'Reviewer mapping enabled' option.

Clicking the Configure button opens up a window where you can assign reviewers to specific annotators or classes. Assigning a reviewer to classes (objects or classifications) can be done under the Class mapping tab, and assigning a reviewer to annotators under the Annotator mapping tab. Any number of reviewers can be assigned to annotators and classes. One of them will be selected at a time for each task submitted.

ℹ️

Note

If an annotator is mapped to one or more reviewers, and they create labels whose classes are also mapped to a reviewer, the class mapping takes precedence over the annotator mapping.
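
A minimal sketch of that precedence rule (a hypothetical helper, not Encord's implementation): the class mapping is consulted first, and the annotator mapping is only used as a fallback.

```python
# Hypothetical illustration of reviewer-mapping precedence; not Encord's actual code.
import random

class_mapping = {"Car": ["reviewer_a@example.com"]}                           # class -> reviewers
annotator_mapping = {"annotator_1@example.com": ["reviewer_b@example.com"]}   # annotator -> reviewers

def pick_reviewer(label_class: str, annotator: str) -> str | None:
    # Class mapping takes precedence over annotator mapping.
    reviewers = class_mapping.get(label_class) or annotator_mapping.get(annotator)
    if not reviewers:
        return None                  # no mapping: fall back to the normal review queue
    return random.choice(reviewers)  # one of the mapped reviewers is selected per submitted task

print(pick_reviewer("Car", "annotator_1@example.com"))    # reviewer_a@example.com (class mapping wins)
print(pick_reviewer("Truck", "annotator_1@example.com"))  # reviewer_b@example.com (annotator fallback)
```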


Expert review

👍

Tip

Set up Expert review in the QA section of the Settings tab of your Manual QA project

Many industries and domains require years of training or experience to accurately recognize and classify examples — and an expert’s time can often be expensive or hard to schedule. In other cases, there may be additional requirements on your data quality assurance processes depending on the regulatory environment.

To help customers speed up their data annotation processes in these complex environments, Encord provides an expert review feature which empowers expert reviewers you designate to have an additional layer of oversight in the review process.

Expert reviews differ from normal reviews in the following ways:

  • Expert reviews are initiated following a normal review, not direct annotator submission.
    • Rules for forwarding to expert review do not clash with normal annotator or class reviewer mappings.
    • Sample rate for expert review is configured according to the expert review stage config, not normal annotation sample rates. Sample rates apply to the review judgments indicated in the expert review configuration.
  • Instance annotations rejected by an expert are permanently rejected instead of being returned to the annotator.

The expert review configuration looks as follows:

Set up an expert review configuration by specifying the parameters.

  1. After X reviews: Choose X such that after X cycles of submission and rejection by a normal reviewer, all rejected reviews are forwarded to expert review. This may sometimes be known as the review count threshold. Because 2 is the review count threshold in the above sample configuration, all reviews rejected for a second time will be sent to expert review.

  2. Expert reviewers: Choose the pool of possible expert reviewers. There is no requirement to designate a user as an expert reviewer, other than that they have at least Reviewer permissions in the project. Users can be made expert reviewers regardless of their placement within normal annotator or class reviewer mappings.

  3. Expert review stages: Stages, or iterations, indicate how review results are forwarded to expert review after each normal review. In the above sample configuration, 10% of all first reviews will be sent to expert review, and 50% of approved second reviews will be sent to expert review.

    • At each stage you configure the review iteration, the sampling rate, and the action for which to apply the sampling rate. Note that because all rejected reviews are always forwarded to expert review at the threshold count, you can only choose 'Approved' for the possible extra action when configuring the final stage.
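
A compact sketch of the forwarding logic described above, using the sample configuration (review count threshold of 2, 10% of first reviews, 50% of approved second reviews); this is an illustration of the rules, not Encord's implementation:

```python
# Illustrative model of expert-review forwarding; not Encord's implementation.
import random

REVIEW_COUNT_THRESHOLD = 2           # "After X reviews" in the sample configuration
STAGES = {                           # review iteration -> (action sampled, sampling rate)
    1: ("any", 0.10),                # 10% of all first reviews
    2: ("approved", 0.50),           # 50% of approved second reviews
}

def forward_to_expert(review_iteration: int, outcome: str) -> bool:
    # Rejections at or beyond the threshold are always forwarded.
    if outcome == "rejected" and review_iteration >= REVIEW_COUNT_THRESHOLD:
        return True
    action, rate = STAGES.get(review_iteration, (None, 0.0))
    if action == "any" or action == outcome:
        return random.random() < rate
    return False
```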

The above configuration can be visualized as follows:


Project dashboard

Video Tutorial - Monitoring annotation progress

Selecting a project from the list of annotation projects takes you to its 'Project dashboard'.

This is where you monitor and manage your project. For example, you can view your project's summary statistics, manage labeling tasks, view your team's productivity, train models and invite collaborators.

The dashboard is split into 7 tabs:

  • Summary: a high-level view of labeling and productivity statistics.
  • Explore: a quick way to explore the distribution of instances and labels across data assets in the project.
  • Labels: for managing all the project's labeling activity and tasks.
  • Performance: a more detailed view of your team's manual labeling and productivity.
  • Models: for administering models in this project.
  • Export: for exporting your labeling data.
  • Settings: editing project options, ontology, team collaborators, and other general project settings.

Access to each tab depends on the collaborator's project role (Annotator, Reviewer, Annotator + Reviewer, Team Manager, Admin).

Summary

Clicking an annotation project takes you to its Summary dashboard. This dashboard has 2 components and gives you a rich visual display of your project's progress at a high level.

  • A. Project task status overview: summary of your task statuses.
  • B. Instance label task status: summary of the labels in your tasks.

Project task status overview

Displays the number of annotation tasks that are in each state: Annotate, Review, or Completed.

  • Annotate: The task is ready to be annotated.
  • Review: The task is ready to be reviewed.
  • Completed: The task has been annotated and reviewed. There is no further action to be taken.

Instance label task status

Displays the number of labels / instances that have been created, and their assigned status.

  • Approved - The instance has been approved by a reviewer.
  • Returned for annotation - The instance has been returned to the annotator by the reviewer.
  • In review - The instance needs to be reviewed.

For a more comprehensive summary of how a task moves from annotation through instance review and full completion, reference the Status section below.


Explore

The Explore page provides interfaces to help you understand how your project's annotations are distributed amongst the data assets at both an instance and label level. It allows a deeper exploration through attributes on objects, as well as frame-level classifications.


Instance statistics

This section provides the total count of all instances across the datasets in your project.

  • Project total: Shows total instances (both objects and classifications) across the project by default. To get instance statistics for individual data files, click the drop-down to select a data file.
  • Select class: Shows the total instances for a particular class. This is a summary of how a given class is distributed across your project's data assets. The pie chart segments show a breakdown of how that class is split across the data assets.
  • Display timestamps: Flip the toggle to switch between frame numbers and timestamps for the labels.

Label statistics

This is a summary of how your labels are distributed across the project. The pie chart shows a breakdown of how many labels there are for a given class.

  • Project total: Shows the total number of labels across different datasets in the project. To get label stats for individual data files, click the drop-down to select a data file.
  • Objects: Click on the pie chart segment of a class to see the total number of labels and its attributes (sometimes called nested attributes) if available for that class.
  • Classifications: Shows the global classification at project or individual video level. For example, location, time of day, etc.

Quick definitions of classes, instances and labels

  • Class: Fundamental unit of the project's ontology. For example the ontology of a project annotating traffic videos could have classes such as Car, Truck, Bicycle, and so on. For more information on objects and classifications, see Ontologies Overview.
  • Instance: Specific occurrence of a class. Car(0) is an instance of the Car class, for example, it could be a specific black sedan. The single Car(0) instance can appear in a single frame or a range of frames. Therefore, instances may contain multiple labels across frames.
  • Label: A frame-specific annotation of an instance. For example, the annotation of Car(0) on frame 201 is a label.
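
One way to picture that hierarchy in miniature (purely illustrative data, not Encord's export format):

```python
# Illustrative hierarchy only; real exports use Encord's label format and hash identifiers.
ontology_class = "Car"            # class: defined once in the project's ontology

car_0 = {                         # instance: one specific car, shown as Car(0) in the editor
    "class": ontology_class,
    "instance": "Car(0)",
    "labels": {                   # labels: one annotation of Car(0) per frame it appears in
        200: {"bounding_box": [0.31, 0.42, 0.10, 0.08]},
        201: {"bounding_box": [0.33, 0.42, 0.10, 0.08]},
    },
}

print(len(car_0["labels"]))       # the single Car(0) instance carries 2 frame-level labels
```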

Labels

The Labels page is your gateway to annotating, reviewing, and auditing the labels made against all the datasets in your project. Access to each pane will depend on the user's project role. We briefly summarize the purpose of each tab, and the roles which can access each below.

Which panes are available depends on your project role (Annotator, Reviewer, Annotator + Reviewer, Team Manager, Admin); for example, the Data and Instances panes are only available to Team Managers and Admins.

The labels dashboard features the following tabs:


Activity

The Activity page allows you to quickly monitor annotation and review activity in your project by showing tasks and providing a summary interface. The status of reviewed labels inside each task can also be seen. Tasks are displayed in most recently edited order from top to bottom.

  • A. File, Search, & Reopen: The name of the specific data unit or data asset. This is the same as its name in the dataset it belongs to. Use the search box to filter the list by file name, and send tasks back to annotation using the 'Reopen' feature.
  • B. Dataset: The dataset the data asset belongs to.
  • C. Type: The type of the data, such as an image or video. For more on our supported data types, see our label editor documentation.
  • D. Status: The status of this task within the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system).
  • E. Frames: The number of frames in the data asset. For a DICOM series, this will be the number of slices.
  • F. Reviews: The number of instances that have been reviewed in this data asset.
  • G. Submitted: Indicates when the last submit action, whether for an annotation or review, was made against any of the labels in this data asset.
  • H. Submitted by: Who last submitted the annotations.
  • I. Reviewed by: Who submitted the most recent review.
  • J. Actions: Click the View link to open the label editor. Note: this feature is only available to Team Managers and Administrators as an extra method of reviewing submissions outside the TMS. We advise extra caution if you decide to edit the labels from this interface. If significant work needs to be done, we strongly recommend reopening the task to prevent possible errors from simultaneous edits.
  • K. Filter: Use the filter drop-down to only show tasks with the selected status. Only filtered results will be shown in the Label Editor.

File, Search, and Reopen

The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this will normally be the path under the bucket they are stored on.

Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching "fly" will return file names containing "flyover" and "flyaround."

The Reopen button allows Administrators and Team Managers to send tasks which are currently Completed or In review back to annotation. Select your target tasks using the checkboxes in the File column to select individual assets, or select the checkbox in the column header to select all tasks, and press the Reopen button to move all selected tasks back to the annotation stage. Tasks reopened in this way will have the status Returned in the Queue tab. No labels are lost by reopening a task. The 'Reopen' action is only applied to tasks which are both visible (i.e. not filtered out by the file search) and selected.

Status

This column shows the status of this task within the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system). The Activity pane only shows assets which have had some action done on them, and therefore only reflects tasks with the following statuses:

  • In review: The annotation task has been submitted but outstanding review tasks remain. In review task status is shown in blue.
  • Completed: The annotation task has been submitted and all reviews have been completed. Completed task status is shown in green.

For a comprehensive summary of the possible task states, see the status section of the Data tab, below.

Reviews

The 'Reviews' column shows a count of how many instances have been reviewed in a given data asset. Click the number to open a panel which shows the last review action taken on each instance, as well as who originally created the annotation and when. Note that unless the review was done by an 'Expert Reviewer', all reviewed annotations must be either 'Approved' or 'Deleted' before a task can be 'Completed.' Read more about the Expert Review feature here.


Queue

The Queue tab is where annotators and reviewers look to find their next task. The Start labeling and Start reviewing buttons visible throughout the project open the label editor with the next task in the queue according to the relevant task type.

The Queue tab can be used to assess the number of tasks assigned to you as an annotator or reviewer and therefore estimate your likely workload. Administrators and Team Managers can also use it to quickly verify the current assignments per team member, and change assignments as necessary.

  • A. File, Search, & Assign: The name of the specific data unit or data asset. This is the same as its name in the dataset it belongs to. Use the search box to filter the list by file name, and assign tasks to collaborators using the 'Assign' feature.
  • B. Dataset: The dataset the data asset belongs to.
  • C. Type: The type of the data, such as an image or video. For more on our supported data types, see our documentation here.
  • D. Status and Task: The status and category of this task within the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system).
  • E. Last Edited: When the task was last edited.
  • F. Reserved by: Who the task has been assigned to or reserved by.
  • G. Actions: Depending on your collaborator role, you can initiate or reassign the task.
  • H. Filter: Use the filter drop-down to only show tasks of the selected status. Only filtered results will be shown in the Label Editor.

File, Search, and Assign

The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this will normally be the path under the bucket they are stored on.

Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching "fly" will return file names containing "flyover" and "flyaround."

The 'Assign' button allows Administrators and Team Managers to allocate unassigned tasks to specific collaborators for annotation or review. Select your target tasks using the checkboxes in the File column to select individual assets, or select the checkbox in the column header to select all tasks, and press the 'Assign' button to open the task assignment popup.

Confirm the selected tasks are as intended, then select the target collaborator from the drop-down and press Assign. Tasks which have already been assigned to another collaborator, as indicated by the email in the 'Reserved by' column, cannot be reassigned until they have first been released.

Status and Task

The Queue tab only shows tasks which have remaining annotation or review work to be done within the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system). Therefore, the stage of the task within the TMS is understood by reading the Status and Task columns together.

The two types of tasks are 'Annotate' and 'Review' which can be in any of the following states:

  • Queued: The task is ready for annotation or review. For an annotation task to be 'Queued', it must not be assigned to a user, and must have no submitted labels. It may have been previously assigned to a user, but subsequently released before any annotations were submitted.
  • Assigned: The annotation or review task is assigned to a specific user.
  • Returned: The annotation task was previously submitted, and either 'reopened' after completion by a Team Manager or Administrator, or rejected by the reviewer.

Actions

There are two relevant actions that can be done on each task from the 'Queue' pane. Press 'Initiate' to open the label editor and proceed with annotation or review, depending on the task type.

Additionally, Administrators and Team Managers can click the three vertical dots to open the expanded menu, to access the 'Release task' function. Tasks must be explicitly released before they can be reassigned.


Data

The Data page gives a complete overview of all the data asset tasks in the project, regardless of their progress through the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system). Therefore, this is the first place Administrators and Team Managers should check if they want to confirm the status of a given task.

  • A. File & Search: The name of the specific data unit or data asset. This is the same as its name in the dataset it belongs to. Use the search box to filter the list by file name.
  • B. Dataset: The dataset the data asset belongs to.
  • C. Type: The type of the data, such as an image or video. For more on our supported data types, see our documentation here for more details.
  • D. Status: The status of this task within the Task Management System.
  • E. Frames: The total frames in this data asset. This will apply to videos, image sequences and DICOM. Images always only have 1 frame.
  • F. FPS: the frames per second of the data asset. This only applies for data of type video. Others will show a dash (-).
  • G. Created: When the task was created. Tasks are created when the dataset containing the data asset is attached to the project.
  • H. Last edited by: the last collaborator to edit the task in any capacity (such as annotate or review), and when.
  • I. Actions: The Data page allows users to view the task in the label editor, get a code snippet for using the SDK with this task, and confirm the edit actions via the Activity Log.
  • J. Filter by: Use the filter drop-down to view only tasks with the selected Status.

👍

Tip

Confused about the difference between image groups and image sequences? See our documentation here to learn about different data types in Encord.

File and Search

The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this will normally be the path under the bucket they are stored on.

Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching "fly" will return file names containing "flyover" and "flyaround."

Status

The data tab provides the most comprehensive overview of all the tasks associated with each data asset in a given project. As such, this is the first place to check to see the status of various tasks.

  • Queued: The task is ready for annotation. For a task to be 'Queued' it must not be assigned to a user, and have no submitted labels. A queued task may have been previously assigned to a user, but subsequently released before any annotations were submitted. Queued tasks are shown in light orange.
  • Assigned: An annotation task has been assigned to a specific user. Assigned tasks are shown in aqua green.
  • In review: The annotation task has been submitted but outstanding review tasks remain. In review task status is shown in blue.
  • Returned: The task was previously submitted, and either several of the annotations were rejected by the reviewer or it was 'reopened' after completion by a team manager or administrator.
  • Completed: The annotation task has been submitted and all reviews have been completed. Completed task status is shown in green.

Actions

Clicking View will drop you into the label editor to do a live audit of the annotations in this data asset. The Data tab is only visible to Administrators and Team Managers and so grants great power to view any data asset; however, appropriate care must be taken to ensure annotations are not simultaneously edited from the 'Queue' pane by an annotator or reviewer. Encord advises that you do NOT edit from the Data tab unless you have received confirmation that no one else is concurrently editing the asset.

🚧

Caution

In order to prevent any possible issues of annotator work being overwritten, it's critical that all annotations are made via the Queue tab of the Task Management System (https://docs.encord.com/docs/annotate-annotation-projects#task-management-system), and only the person assigned to the task makes annotations at any given time.

Other possible actions include 'API Details', which shows a popup with sample code you can use to get started with our SDK to access this particular data asset, often known as a label row in the SDK. Click 'Activity log' to see a popup with a graphical summary of add / edit / delete actions on this data asset indexed by annotator or ontology class. Click 'Display logs' in the lower right to show all actions in reverse chronological order.
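
For orientation, the snippet behind 'API Details' typically looks something like the sketch below, written against Encord's Python SDK. The project hash, key path, and file name are placeholders, and exact method names can change between SDK versions, so treat this as a sketch and prefer the generated snippet and the SDK reference:

```python
from pathlib import Path
from encord import EncordUserClient

# Authenticate with your SSH private key and open the project (hash is a placeholder).
user_client = EncordUserClient.create_with_ssh_private_key(
    Path("~/.ssh/encord_key").expanduser().read_text()
)
project = user_client.get_project("<project_hash>")

# Each data asset corresponds to a label row; fetch its labels and list its instances.
for label_row in project.list_label_rows_v2():
    if label_row.data_title == "my_video.mp4":   # assumed file name
        label_row.initialise_labels()
        for instance in label_row.get_object_instances():
            print(instance.object_hash)
```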


Instances

The Instances tab allows Administrators and Team Managers to search within the data to directly find specific instances. Recall that an annotation instance correlates to a unique instantiation of a specific ontology class in a data asset.

For example, if you have the 'Person' class in your ontology, the first instance of a 'Person' in a given data asset will be indicated in the interface as 'Person (0)', the second as 'Person (1)' and so on. Instances, therefore, can exist in multiple frames of a data asset, and indicate the same object. Use the Instances tab to search for specific instances of objects or classifications using their Identifier.

Instance identifiers are unique at the project scope, and can be found in any of the following ways:

  • From inside the label editor, by clicking on a particular instance, and then selecting 'Copy identifier' from the instance action menu.
  • From inside exported labels, where they are known as the objectHash or classificationHash as appropriate.
  • When uploading labels using the SDK, you may specify your own objectHash or classificationHash.

Once you have an identifier of interest, use the 'Search instance' interface to filter the instances by identifier to quickly find the instance you're interested in. This can be particularly handy when you want to visually confirm an annotation you may not have seen before, but for which you have the identifier.

After locating your instance of interest, click View from the 'Actions' column to jump straight to where the instance is first annotated.
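
To do the same lookup programmatically, here is a short sketch along the same lines as the earlier SDK example (the identifier is a placeholder, and method names should be checked against the SDK reference):

```python
# Assumes `project` from the earlier SDK sketch; finds where an instance is first annotated.
target_hash = "<objectHash copied from the editor or an export>"   # placeholder identifier

for label_row in project.list_label_rows_v2():
    label_row.initialise_labels()
    for instance in label_row.get_object_instances():
        if instance.object_hash == target_hash:
            frames = [annotation.frame for annotation in instance.get_annotations()]
            print(label_row.data_title, "first annotated on frame", min(frames))
```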


Performance

You can switch between a summary or detailed view of performance metrics using the toggle found at the top of the Performance tab. Note that the contents of the Details tab depend on the type of QA your project uses; Manual QA projects are documented below.


Performance - Summary

The Summary tab of the performance dashboard provides an overview of your team's manual labeling and productivity.

🚧

Caution

The Summary tab only displays actions taken in the Label Editor. Actions taken in the SDK will not be displayed.

Task actions over time

View the number of tasks in a project that have been approved, rejected, and submitted for review over a given period of time.

  • The height of a bar represents the total number of tasks.
  • The height of each color within a bar represents the number of approved, rejected, and submitted tasks.
  • A: Set the time period you would like to see displayed by selecting a range of dates.
  • B: The Hide days without any actions toggle removes all days at which no actions were taken from the view.
  • C: Download a CSV file of the data.
  • D: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team's performance.

Instance Label actions over time

View the number of instance label actions in a project that have been approved, rejected, and submitted for review over a given period of time.

  • A: Set the time period you would like to see displayed by selecting a range of dates.
  • B: Download a CSV file of the data.
  • C: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team's performance.

Within your specified time period, you can choose which dates to display by using the slider located beneath the graph.

Team collaborators

The 'Team collaborators' section shows the duration of time each project collaborator spent working on a given file.

A. 'Data file' displays session time collaborators spent working on individual files. 'Project' displays session time collaborators have spent working on the project.

B. Table entries can be filtered according to dates by clicking the range of dates, and selecting the start and end date of the period you would like to see table entries displayed for.

C. Table entries can be downloaded in CSV format by clicking the Download CSV button.

D. When lots of entries are present they will be split across a number of different pages. The number of table entries per table can be adjusted.


Performance - Details

The Details tab of the performance dashboard gives a more detailed view of your team's labeling and productivity. The details below are displayed for Manual QA projects.

🚧

Caution

The Details tab of the performance dashboard only shows information for labels created in the Label Editor. Labels submitted via the SDK will not be shown on the Details tab, including labels that were submitted using the SDK and later edited in the Label Editor.

👍

Tip

You can specify a range of dates, as well as whether statistics should be displayed for labels, or instances. More information on instances and labels can be found here.

Submissions chart

The submissions chart displays the number of submitted labels or instances over the specified time period. The chart can be filtered to show submissions for specific annotators or classes.

If you filter on both Annotators and Classes, the resulting chart shows the submission statistics for the selected annotators and the selected classes.

Reviews chart

The reviews chart displays the cumulative number of accepted and rejected labels or instances over the specified time period.

Annotators table

The annotators table displays all the relevant statistics for all annotators in a Project. It can be filtered on classes to show annotator statistics only for the selected classes.

  • User: The annotator's email.
  • Rejection rate: Percentage of their labels or instances that have been rejected in the review process.
  • Submitted labels / instances: Number of labels or instances that the annotator has submitted for review.
    • Repeated submissions are not counted.
  • Accepted labels / instances: Number of labels or instances that the annotator created that passed the review process.
  • Rejected labels / instances: Number of labels or instances that the annotator created that were rejected during the review process. Note that this can be higher than the number of submitted labels / instances, since a label or instance can be rejected multiple times during the review process but the submission will only be logged once (see the short illustration after this list).
  • Total session time: Time spent labeling.
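
To make the counting rule above concrete, here is a tiny illustration with hypothetical numbers for a single label that is rejected twice before being accepted:

```python
# Hypothetical review history for one label (illustrative only).
history = ["submitted", "rejected", "resubmitted", "rejected", "resubmitted", "accepted"]

submitted = 1                          # repeated submissions are not counted
rejected = history.count("rejected")   # 2: every rejection is logged
accepted = 1

print(submitted, rejected, accepted)   # 1 2 1 -> rejected can exceed submitted
```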

Reviewers table

  • User: The reviewer's email.
  • Rejection rate: Percentage of labels or instances that they rejected in the review process.
  • Accepted labels / instances: Number of labels or instances that the reviewer accepted.
  • Rejected labels / instances: Number of labels or instances that the reviewer rejected.
  • Total session time: Time spent reviewing.

Objects and classifications table

Each row in the objects and classifications table can be expanded to show statistics on attributes.

  • Class: The class name.
  • Rejection rate: Percentage of labels or instances rejected in the review process.
  • Reviewed labels / instances: Number of labels or instances of the class that have gone through the review process.
  • Accepted labels / instances: Number of labels or instances of the class that have passed the review process.
  • Rejected labels / instances: Number of labels or instances of the class that failed the review process.
  • Avg. time to annotate: Average time spent annotating this class.

Models

The Models page allows you to manage micro-models linked to your project. For information on models, please see our documentation here.


Export

ℹ️

Note

This tab is only visible to project Admins

Use the Export page to export your data. Please see our exporting data page to learn how to do this.


Settings

The Settings tab allows you to make modifications to your project using the following tabs:

  • Options - Copy a project, modify datasets, modify ontology, upload annotation instructions, modify project tags, QA settings.
  • Team - Manage collaborators on a project.
  • Danger zone - Delete your project.

Options

Copy a project

To copy a project, click the Copy project button in the Options section of the project's Settings. This opens the copy project window. From the Copy Project window, you can pick the various parts of your project you want to copy over into your new project.

1. Select copy options

Choose the parts of your project you want to copy.

You can copy any combination of the following assets:

  • Labels: this will copy the labels within videos and image sequences of your choice.
  • Models: this will copy all the models in your project along with their training logs.
  • Collaborators: copy all project users with their respective roles. Project admins are copied regardless of whether this is selected.
  • All datasets: all datasets will be copied, and new annotation tasks will be created for all videos and image sequences if their labels were not copied over (see next line).

👍

Tip

Confused about the difference between image groups and image sequences? See our documentation here to learn about different data types in Encord.

The new annotation project will use the same ontology as the original. This can be changed in the project settings if required.

  • If you don't want to copy labels, press Copy project. This will create the copy of your project, which you can then access in the Projects tab.

  • If you choose to copy over labels, you will be asked to select the data assets for which you would like labels copied over. To begin the process, press Next: configure labels. Continue to step 2. below.

2. Select labels to be copied

Select the data units with the labels that you want to copy into your new project.

Click Next to continue.

3. Configure labels

Select the statuses of the files you want copied over into your new project.

ℹ️

Note

When a project is copied, the task status will not be copied.
This means that all tasks will be Annotate tasks, and their status will be Queued.
All tasks will have to be re-assigned after being copied.

Click the Copy project button to complete the process.
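
If you prefer to copy a project programmatically, the Python SDK exposes a copy helper roughly along these lines; the exact signature is an assumption here, so verify it against the SDK reference before relying on it:

```python
# Sketch only: copy the current project via the SDK (assumes `project` from an SDK session,
# and that your SDK version provides copy_project with these keyword flags).
new_project_hash = project.copy_project(
    copy_datasets=True,        # reuse the attached datasets
    copy_collaborators=False,  # admins are copied regardless of this flag
    copy_models=False,
)
print("Copied project:", new_project_hash)
```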

Upload annotation instructions

Video Tutorial - Uploading annotator instructions
  • Click the Add instructions button to upload instructions for your annotators in PDF format.

  • To ensure the best possible results, provide as much detail as possible about what you would like annotated and how precise bounding boxes should be drawn. For example, instead of saying 'person', consider defining what should constitute a person for your annotators - only a full person? A torso? Or should any part of a person in a frame be labeled as a 'person'?

ℹ️

Note

The more specific your annotator instructions, the higher the chances that your annotators will perform well.

  • Once uploaded, annotation instructions will be accessible within the Label Editor.

Project tags

ℹ️

Note

Tags are created and managed on the Organization level. Once created they can be added to individual projects.

You can add tags to a Project if you are part of an Organization.

Project tags allow you to:

  • Flexibly categorize and group your Projects.

  • Filter your Projects.

Adding and removing tags

To add tags to your Projects, navigate to the Options tab in the Settings page and click the Project tags drop-down. Here you will see the available tags in your Organization. Click on a tag to add it to a Project. You can remove a tag from your Project by clicking the same tag again, or clicking the x button next to its name.

Filtering projects by tags

You can filter your Projects based on the tags they contain. To do so, click on the Projects tab in the Navigation bar, click the Filter by tags drop-down and select one or more Project tags. Only Projects with the selected tags will be displayed.

Edit project ontology

You can view or switch the Ontology attached to your Project.

ℹ️

Note

Changing the Ontology can render existing labels invalid and lead to data inconsistency.

  • Click the Switch ontology button to switch the ontology linked to your Project.
    The resulting pop-up allows you to choose an existing Ontology from a list, or create a new ontology for this project.

  • Click the View ontology button to view the details of the Ontology that is attached to the current Project.

Edit datasets attached to a project

The Datasets section allows you to attach or detach any number of Datasets to your Project. You must create a new Dataset in the Datasets section for it to become available in a project's settings.

Quality Assurance

The 'Quality' section allows you to configure the way that manual quality assurance is implemented for a given project.

  • The Sampling rate slider determines the percentage of labels that will be manually reviewed. Clicking Configure sampling rate allows you to set the sampling rate for each label type, or annotator separately.

  • The Multi review assignment enabled toggle will assign all labels created for a given task to the same reviewer.

  • Default rejection reasons allows you to add commonly used reasons for rejecting a label, to make them available to your reviewers and save time when reviewing tasks.

  • Toggle Reviewer mapping and click Configure reviewer mapping to assign classes, or labels made by specific annotators, to a particular reviewer.

  • Toggle Expert reviewer rule to enable expert review.


Team

To manage project collaborators, select the 'Team' pane in your project Settings.

Here you can invite collaborators to the project, and configure their roles.

Add collaborators

To invite collaborators from within your organization to the project:

  1. Click the + Invite collaborators button. This will open a new window where you can enter email addresses of the people you would like to invite.

  2. Select a user role for the collaborator you want to add by selecting an option from the list.

  3. Type the email address of the user you'd like to add and select the user from the list.

  4. Click the Add button to add the user with the specified role.

Add collaborators as a group

ℹ️

Note

To add collaborators as a group, your organization needs to have user groups. Navigate to our documentation on creating user groups for more information.

Collaborators can be added to a project as a group - which can save time as well as ensure that no individual is forgotten.

In the 'Groups' section of the page, click on Manage to make the 'Manage Groups' pop-up appear.

Click the 'Select group' drop-down and pick a group you would like to add as collaborators. After selecting a group, click the 'Select Role' drop-down to assign a role to the group of collaborators. Click Add to add the group.

The group you just added will appear under the 'Added groups' heading. Repeat the process if you'd like to add more groups with different roles to the project.

👍

Tip

To delete a group from the project, simply click the button next to the group name.

Change collaborator roles

Project admins can modify the different roles of collaborators, using the drop-down on the right.

You can assign the following roles to collaborators:

  • Annotator: annotators are responsible for labeling. This is the default role for all collaborators.
  • Reviewer: for reviewing labeled tasks.
  • Annotator & reviewer: a combination of annotator and reviewer.
  • Team manager: a team manager can assign tasks to other users, and add collaborators to the project.
  • Admin: gives this collaborator full administrative control over this project. Caution: this is an irreversible action.

Please confirm or cancel your selection when making a collaborator a project Admin.


Danger zone

You can delete your project by going to the Danger zone tab at the bottom of the menu, and clicking the red Delete project button, shown below.

🚧

Caution

Deleting your project does not delete the datasets in the project, but will delete the project's labels and ontology.


Task management system

The Task management system (TMS) is a system built to optimize labeling and quality control for all annotation and review tasks, allowing thousands of annotators, reviewers, team managers, and administrators to work concurrently on the same Manual QA project.

The task manager is enabled by default but can be switched on and off under the Options tab in project settings.

Tasks are generated when a dataset, or a set of datasets, is attached to a project, and are distributed automatically on a first in, first out basis: tasks that have been in the queue the longest are served first. The Task generation and Task distribution sections below describe this in detail.

Team managers and administrators can also assign tasks explicitly to individual annotators and reviewers. Once an annotation or review task is distributed to an annotator or reviewer, it is reserved by that individual, prohibiting other team members from accessing that task. Both annotation and review tasks are accessible in the Queue pane of the Labels tab.


Task generation

Annotation tasks are generated and added to the label queue when a dataset or a set of datasets is attached to a project and when a new data asset is added to attached datasets. Review tasks are generated and added to the review queue once an annotator submits an annotation task. Conversely, detaching a dataset will remove any associated annotation and review tasks, so you should exercise caution if you proceed.

By default, each data asset will be labeled once, and each label submitted for review will be reviewed once. You can create additional review tasks by clicking the + Add reviews button and following the steps in the window. You can reopen submitted annotation tasks if you wish to send the data asset back into the queue for further labeling by selecting the relevant assets and clicking the Reopen button.


Task distribution

Annotation and review tasks are distributed automatically using the first in-first out method (illustrated below) - tasks that have been in the queue the longest are served first. Once an annotator or reviewer clicks on the Start labeling or Start reviewing button, the next available free task in the queue is reserved by that individual, prohibiting other team members from accessing the task. Once the task is fetched, the annotator or reviewer is taken to the label editor to complete the task.

Project administrators and team managers can override the automated distribution of tasks by explicitly assigning tasks to individuals in the Queue pane of the Labels tab. Assignments can be done on a task-by-task basis or in bulk by selecting the relevant tasks and clicking the Assign button.

Tasks can be released by pressing the icon next to the task and clicking the Release task button. Reserved tasks do not have an expiry and remain assigned to an individual until the task is submitted, released, or skipped.
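
Conceptually, the distribution behaves like the minimal model below: a first in, first out queue with per-task reservations. This is an illustration of the behaviour described above, not Encord's actual implementation:

```python
# Conceptual FIFO queue with task reservation; illustrative only.
from collections import deque

queue = deque(["task_1", "task_2", "task_3"])   # oldest task first
reservations: dict[str, str] = {}               # task -> collaborator who reserved it

def start_labeling(collaborator: str) -> str | None:
    if not queue:
        return None
    task = queue.popleft()             # the longest-waiting task is served first
    reservations[task] = collaborator  # reserved until submitted, released, or skipped
    return task

def release(task: str) -> None:
    reservations.pop(task, None)
    queue.append(task)                 # released tasks become available again (position illustrative)

print(start_labeling("annotator_1@example.com"))   # task_1
```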

👍

Tip

Annotation tasks can be submitted programmatically using Encord's Python SDK.
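
A minimal sketch of creating labels programmatically with the Python SDK; the ontology class name and coordinates are assumptions, saving writes the labels, and the step that pushes the task through review depends on your project setup, so check the SDK reference:

```python
from encord.objects import Object
from encord.objects.coordinates import BoundingBoxCoordinates

# Assumes `project` and an initialised `label_row` from the earlier SDK sketch.
car_class = project.ontology_structure.get_child_by_title(title="Car", type_=Object)

car_instance = car_class.create_instance()
car_instance.set_for_frames(
    coordinates=BoundingBoxCoordinates(top_left_x=0.3, top_left_y=0.4, width=0.1, height=0.2),
    frames=0,
)
label_row.add_object_instance(car_instance)
label_row.save()  # persists the labels; submitting the task for review is a separate step
```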


Task completion

An annotation task is completed once all outstanding labels subject to review have been reviewed. Completed annotation tasks and annotation tasks currently in the review stage are visible in the Activity pane of the Labels tab.

Task Status

The Task Status indicates the status of a given task. A task's status evolves from Queued for annotation to In review and finally Complete. If labels are rejected, or the task is otherwise judged to need further annotation work, the status is marked as Returned. The most comprehensive view of task statuses is available to project Administrators and Team Managers in the Data tab of a Project's Labels dashboard.


Annotation

Tasks are labeled in the Label Editor. Click Submit to submit your labels for review.

Annotators can skip tasks by clicking the Skip button. If a task is skipped, the next available task is automatically displayed and assigned.

🚧

Caution

In order to prevent any possible issues of annotator work being overwritten, it's critical that all annotations are done via the Queue tab, and only the person assigned to the task makes annotations at any given time.


Review

Review tasks are completed in the Label Editor.

Review mode components:

  • A. Single label review toggle
  • B. Edit labels
  • C. Pending reviews pane
  • D. Completed reviews pane
  • E. Reject and Approve buttons
  • F. Approve and Reject all in frame buttons

ℹ️

Note

All labels are reviewed on an instance level. This means that if an instance is rejected on one frame, it will be rejected across all frames. This includes using the Approve all in frame and the Reject all in frame buttons.

'Pending' and 'Completed' review panes

All labels for review for a particular data asset assigned to the reviewer are automatically loaded into the 'Pending reviews' pane. Completed reviews are displayed in the 'Completed reviews' pane. You can click on specific objects to highlight them. Labels can be selected and then approved or rejected for a given instance or in bulk using the Reject and Approve buttons or the matching hotkeys, b for reject and n for approve.

'Single label review' toggle

You can enter the 'Single label review' mode by toggling the switch at the top. The single label review mode automatically highlights and hides other objects, allowing you to review and approve or reject a single label at a time and quickly browse through individual labels using the Up and Down keys on your keyboard.

ℹ️

Note

The reviewer is automatically taken to the next set of requested label reviews once all labels in a particular review task have been reviewed.

Edit labels

Reviewers can edit labels and make small adjustments without needing to return the entire set of labels to the annotator.
Press the Edit labels button and make any necessary changes before switching back to review mode.
Currently, only a subset of label edit operations are supported:

  • objects: moving the object or individual vertices
  • classifications: changing the classification value
  • objects and classifications: changing any attribute values

Approve/Reject all in frame buttons

In addition to being able to review all labels for a given instance, you can review labels grouped by frame as well.
For review workflows that focus on progressing through video by frame rather than by instance, use the Approve all in frame and Reject all in frame buttons.
Of course, you should be sure you want to apply that judgement to all labels in a given frame before using this feature!


Rejected labels

If a reviewer rejects a label during the review stage, it will be marked as Returned in the Queue pane of the Labels tab. By default, rejected annotation tasks are returned and assigned to the queue of the person who submitted the task.

Returned tasks are resolved in a purpose-built user interface in the Label Editor. Click the icon on the right-hand side of the screen to open the drawer containing rejected labels. Once the reviewer comments have been addressed, click the icon to mark the label as resolved.

Annotation tasks cannot be resubmitted until all issues have been marked as resolved. Once a task is resubmitted, the labels marked as resolved are sent back for an additional review. There is no limit on how many times a label can be rejected and sent back for correction.

Returned task label editor view

Missing labels

If a reviewer determines that a label is missing entirely, they can use the report missing labels feature to indicate labels are missing in a given frame or image. Missing label reports will be sent back to the annotator via the same queue as rejected labels.

Submit missing label report

Interactive Voting System: Your input matters! We've made our documentation interactive by incorporating a voting system. Every page welcomes your feedback, giving you the opportunity to 'like' or 'dislike' a page, thereby letting us know where we can continue improving our documentation.