Evaluation metrics for the object tracking project
Each group should produce code for automatic evaluation of the
system performance on the CAVIAR dataset.
The following four evaluation metrics should be implemented:
- Precision: Defined as sum(TP)/(sum(TP) + sum(FP)).
- Recall: Defined as sum(TP)/(sum(TP) + sum(FN)).
- Average TP overlap: Computed only over the true positives (with correct ID). Intersection-over-union is computed as described in the lecture.
- Identity switches: An identity switch occurs when the identity of the detection associated with a given ground truth bounding box changes from one frame to the next.
The terms above are defined as follows:
- True Positive (TP): A detection that has at least 20% overlap with the associated ground truth bounding box.
- False Positive (FP): A detection that has less than 20% overlap with the associated ground truth bounding box, or that has no associated ground truth bounding box.
- False Negative (FN): A ground truth bounding box that has no associated detection, or whose associated detection overlaps with it by less than 20%.
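The definitions above can be sketched in code. The following is a minimal, hedged example: it assumes boxes are given as (ul_x, ul_y, width, height) tuples (matching the CSV format described later) and that detections have already been associated with ground truth boxes; the association step itself is not shown.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (ul_x, ul_y, width, height)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Intersection rectangle; negative width/height means no overlap.
    iw = min(ax + aw, bx + bw) - max(ax, bx)
    ih = min(ay + ah, by + bh) - max(ay, by)
    if iw <= 0 or ih <= 0:
        return 0.0
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union


def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP), Recall = TP/(TP+FN), over summed counts."""
    precision = tp / (tp + fp) if tp + fp > 0 else 0.0
    recall = tp / (tp + fn) if tp + fn > 0 else 0.0
    return precision, recall
```

For example, two identical boxes give an IoU of 1.0, and half-overlapping equal-sized boxes give 1/3 (intersection 50, union 150). Whether the 20% overlap threshold refers to IoU or to another overlap measure should follow the definition given in the lecture.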
Sequences for evaluation
Each group is expected to test their system extensively over the suggested datasets. However, for easy comparison of results, we ask you to report your scores on the following sequences of the CAVIAR dataset:
- LeftBox
- Meet_Crowd
- Rest_InChair
External evaluation from CSV-files
Furthermore, each group is expected to produce, for each of the above sequences, a simple file containing the output of their system. Use the .csv file format: the file consists of a series of line entries, each structured as follows:
framenumber, objectID, ul_x, ul_y, width, height
Here, ul_x and ul_y are the x and y coordinates of the upper-left corner of the object bounding box. The origin (0,0) of the coordinate system is the top-left pixel of the image; the x axis points right and the y axis points down.
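Writing this format is straightforward with Python's standard csv module. The sketch below uses hypothetical tracker output values purely for illustration; it assumes plain comma-separated integer fields with one detection per line, as described above.

```python
import csv

# Hypothetical tracker output: (framenumber, objectID, ul_x, ul_y, width, height).
rows = [
    (1, 0, 120, 45, 32, 64),
    (1, 1, 200, 80, 28, 70),
    (2, 0, 122, 46, 32, 64),
]

# Write one line per detection in the required column order.
with open("LeftBox.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Reading the file back for evaluation:
with open("LeftBox.csv", newline="") as f:
    parsed = [tuple(int(v) for v in row) for row in csv.reader(f)]
```

Passing newline="" to open() is the documented way to let the csv module handle line endings consistently across platforms.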
Create a folder in the root of your repository, named Evaluation. In this folder, add a .csv file for each evaluation sequence. Use the naming convention <sequence_name>.csv, i.e. use filenames like LeftBox.csv etc.
Automatic evaluation results
- Automatic evaluation results 2019
- Automatic evaluation results 2018
Last updated: 2020-01-03