This repository 360Tracking has been created as part of a master's thesis at Brno University of Technology, Faculty of Information Technology. The thesis has been supervised by Doc. Ing. Martin Čadík, Ph.D.
Single object trackers can fail or produce false positive results when tracking in the equirectangular projection of 360° videos. Some failures have the same causes as in ordinary videos with a limited field of view (e.g. occlusion), but others are specific to equirectangular distortion. The tracked object can also cross the horizontal border of the equirectangular frame. Below you can see how the state-of-the-art tracker KYS produces false positives and fails.
The first approach addresses the border-crossing problem of single object tracking in equirectangular panoramas. It simply simulates a spherical rotation about the z-axis (yaw): the tracker predicts results in the shifted/rotated frame, and the predicted bounding box is then transformed back to the original frame.
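The yaw rotation can be simulated with a plain horizontal roll of the equirectangular frame, since one full image width corresponds to 360° of longitude. A minimal sketch (the function names are illustrative, not the repository's actual API):

```python
import numpy as np

def shift_frame(frame, yaw_deg):
    """Simulate a spherical rotation about the z-axis (yaw) by rolling
    the equirectangular frame horizontally; columns that leave one side
    of the panorama re-enter on the other."""
    w = frame.shape[1]
    shift_px = int(round(yaw_deg / 360.0 * w))
    return np.roll(frame, shift_px, axis=1)

def bbox_to_original(bbox, yaw_deg, frame_width):
    """Map an (x, y, w, h) box predicted in the shifted frame back to
    the original frame; x wraps around the horizontal border."""
    x, y, bw, bh = bbox
    shift_px = int(round(yaw_deg / 360.0 * frame_width))
    return ((x - shift_px) % frame_width, y, bw, bh)
```

Because the backward transform is taken modulo the frame width, a box tracked near the seam maps cleanly back even when it crosses the border.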
The second approach improves tracking in heavily distorted areas of the equirectangular projection and solves the border-crossing problem as well. It simulates a virtual camera that tries to keep the tracked object in the central area of a rectilinear projection. The tracker predicts results in the rectilinear/perspective projection (with an adaptive field of view from 90° to 144°), and these results are converted back to equirectangular (360°) coordinates.
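The backward conversion from the rectilinear view to equirectangular coordinates is an inverse gnomonic projection. The sketch below shows one common way to write it; the function name, argument layout, and sign conventions are assumptions for illustration, not the repository's implementation:

```python
import numpy as np

def nfov_point_to_equirect(u, v, center_lon, center_lat,
                           fov_deg, nfov_size, equi_size):
    """Map a pixel (u, v) of a rectilinear (gnomonic) view centred at
    (center_lon, center_lat) in radians back to equirectangular pixels."""
    wp, hp = nfov_size
    W, H = equi_size
    half = np.tan(np.radians(fov_deg) / 2.0)
    # Tangent-plane coordinates of the pixel; the view centre is (0, 0).
    x = (2.0 * u / wp - 1.0) * half
    y = (2.0 * v / hp - 1.0) * half * (hp / wp)
    rho = np.hypot(x, y)
    c = np.arctan(rho)
    sin_c, cos_c = np.sin(c), np.cos(c)
    if rho == 0:
        lat, lon = center_lat, center_lon
    else:
        # Inverse gnomonic projection onto the sphere.
        lat = np.arcsin(cos_c * np.sin(center_lat)
                        + y * sin_c * np.cos(center_lat) / rho)
        lon = center_lon + np.arctan2(
            x * sin_c,
            rho * np.cos(center_lat) * cos_c - y * sin_c * np.sin(center_lat))
    # Longitude wraps modulo 2*pi, which makes border crossing transparent.
    px = ((lon / (2 * np.pi) + 0.5) % 1.0) * W
    py = (0.5 - lat / np.pi) * H
    return px, py
```

With this mapping the centre of the perspective view always lands on the virtual camera's look-at point in the panorama, and points near the horizontal seam wrap automatically.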
We have evaluated 12 trackers on a manually created dataset composed of 21 equirectangular videos.
A new dataset with annotated objects in 360° equirectangular video has been created. You can see a demo of this dataset here or on YouTube. The dataset includes 21 video sequences with 9909 annotated (groundtruth) frames in total. Bounding boxes may overflow the horizontal borders of the equirectangular frames.
You can download the full dataset using this script, or download it manually as a zip or folder.
The videos used in this dataset have been taken from the following sources (datasets 1 and 2 have been reannotated):
Keng-Chi Liu, Yi-Ting Shen, Liang-Gee Chen. "Simple online and realtime tracking with spherical panoramic camera" (ICCE 2018) [Paper] [Dataset]
Mi Tzu-Wei and Yang Mau-Tsuen. "Comparison of Tracking Techniques on 360-Degree Videos" (2019) [Paper] [Dataset]
Afshin Taghavi Nasrabadi, Aliehsan Samiei, Anahita Mahzari, Ryan P. McMahan, Ravi Prakash, Mylène C. Q. Farias, and Marcelo M. Carvalho. "A taxonomy and dataset for 360° videos" (2019) [Paper] [Dataset]
Custom videos captured by Ricoh Theta SC.
The custom improvements of single object tracking in the equirectangular projection of 360° video have been evaluated with the following well-known and state-of-the-art trackers. The Python implementations of the selected trackers have been made publicly available by their authors or are part of the OpenCV extra modules.
- MIL [Paper] [OpenCV extra modules]
  B. Babenko, M. Yang and S. Belongie. "Visual Tracking with Online Multiple Instance Learning" (CVPR 2009)
- MEDIANFLOW [Paper] [OpenCV extra modules]
  Zdenek Kalal, Krystian Mikolajczyk, Jiri Matas. "Forward-Backward Error: Automatic Detection of Tracking Failures" (ICPR 2010)
- TLD [Paper] [OpenCV extra modules]
  Zdenek Kalal, Krystian Mikolajczyk, Jiri Matas. "Tracking-Learning-Detection" (TPAMI 2011)
- KCF [Paper] [OpenCV extra modules]
  João F. Henriques, Rui Caseiro, Pedro Martins, Jorge Batista. "High-Speed Tracking with Kernelized Correlation Filters." (TPAMI 2015)
- CSR-DCF / CSRT [Paper] [OpenCV extra modules]
  Alan Lukezic, Tomas Vojir, Luka Cehovin, Jiri Matas, Matej Kristan. "Discriminative Correlation Filter with Channel and Spatial Reliability." (CVPR 2017)
- KYS [Paper] [Official Code]
  Goutam Bhat, Martin Danelljan, Luc Van Gool, Radu Timofte. "Know Your Surroundings: Exploiting Scene Information for Object Tracking." (ECCV 2020)
- DiMP [Paper] [Official Code]
  Goutam Bhat, Martin Danelljan, Luc Van Gool, Radu Timofte. "Learning Discriminative Model Prediction for Tracking." (ICCV 2019)
- ATOM [Paper] [Official Code]
  Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg. "ATOM: Accurate Tracking by Overlap Maximization." (CVPR 2019)
- ECO [Paper] [Official Code]
  Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg. "ECO: Efficient Convolution Operators for Tracking." (CVPR 2017)
- DaSiamRPN [Paper] [Official Code]
  Zheng Zhu, Qiang Wang, Bo Li, Wei Wu, Junjie Yan, Weiming Hu. "Distractor-aware Siamese Networks for Visual Object Tracking." (ECCV 2018)
- Ocean [Paper] [Official Code]
  Zhipeng Zhang, Houwen Peng, Jianlong Fu, Bing Li, Weiming Hu. "Ocean: Object-aware Anchor-free Tracking" (ECCV 2020)
- SiamDW [Paper] [Official Code]
  Zhipeng Zhang, Houwen Peng. "Deeper and Wider Siamese Networks for Real-Time Visual Tracking." (CVPR 2019)
These 12 trackers have been evaluated on the custom dataset described above. The success plots are based on the Intersection over Union (IoU) metric, with AUC values shown in the legend; the precision plots are based on the center distance error. DEFAULT plots show tracker results without any improvement, BORDER plots show results of the equirectangular-rotation approach, and NFOV plots show results of the normal-field-of-view (rectilinear) approach.
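The two metrics behind the plots are straightforward to compute per frame. A minimal sketch follows; note it treats boxes as plain axis-aligned rectangles and ignores the wrap-around of boxes that overflow the horizontal border, which the evaluation on this dataset has to handle:

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection over Union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def center_error(box_a, box_b):
    """Euclidean distance between box centres (precision plot metric)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    return np.hypot((ax + aw / 2) - (bx + bw / 2),
                    (ay + ah / 2) - (by + bh / 2))

def success_auc(pred_boxes, gt_boxes, thresholds=np.linspace(0, 1, 21)):
    """Fraction of frames whose IoU exceeds each threshold; the mean
    over thresholds is the AUC value shown in the plot legend."""
    overlaps = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    return float(np.mean([(overlaps > t).mean() for t in thresholds]))
```

Sweeping the IoU threshold from 0 to 1 yields the success curve, and averaging it gives a single AUC number per tracker, which is what makes the DEFAULT/BORDER/NFOV variants directly comparable.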
git clone https://github.com/VitaAmbroz/360Tracking.git
Run the installation script to install all the dependencies. These dependencies should enable OpenCV tracking.
bash install.sh
Note: The install script has been tested on Ubuntu 18.04. Current releases of numpy, torch and matplotlib should also work. The implementation has also been tested on Windows 10.
Note 2: Make sure only "opencv-contrib-python" is installed on your platform. Errors can occur when both "opencv-contrib-python" and "opencv-python" are installed. The release opencv-contrib-python 4.5.1.48 is recommended, because some OpenCV trackers might not work in the newest releases.
cd code
# try OpenCV implementation of tracker CSRT
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4
# try CSRT with BORDER improvement
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4 -border
# try CSRT with NFOV improvement
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4 -nfov
You can also try the official Python implementations of the selected trackers.
The following command clones 3 repositories (pytracking, DaSiamRPN, TracKit) that include implementations of some state-of-the-art trackers.
git submodule update --init
- To enable ECO, ATOM, DiMP and KYS trackers (pytracking)
- To enable DaSiamRPN tracker (DaSiamRPN)
- To enable SiamDW and Ocean trackers (TracKit)
$360Tracking
|-- tech_report
|-- code
|-- annotation
|-- dataset
|-- dataset-demo
|-- results
|-- opencv_tracking
|-- boundingbox
|-- nfov
|-- modified_DaSiamRPN
|-- modified_pytracking
|-- modified_TracKit
|-- DaSiamRPN
|-- pytracking
|-- TracKit