Visual Object Tracking in Panoramic Video

This repository, 360Tracking, has been created as part of a master's thesis at the Brno University of Technology, Faculty of Information Technology. The thesis has been supervised by Doc. Ing. Martin Čadík, Ph.D.

Improvements to Single Object Tracking (SOT) in 360° Video

Single object trackers can fail or produce false positive results when tracking in the equirectangular projection of 360° videos. Some causes of failure are the same as in ordinary videos with a limited field of view (e.g. occlusion), but there are additional problems caused by equirectangular distortion. The tracked object can also cross the horizontal border of the equirectangular frame. Below you can see how the state-of-the-art KYS tracker produces false positive results and fails.

kys_default

Equirectangular rotation approach

This approach addresses the border-crossing problem of single object tracking in the equirectangular panorama. It simulates a spherical rotation about the z-axis (yaw): the tracker predicts results in the shifted/rotated frame, and the predicted bounding box is then transformed back to the original frame.

kys_border
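In the equirectangular projection, a yaw rotation reduces to a horizontal circular shift of the frame, and the backward transform is a modular shift of the box coordinates. A minimal sketch of this idea (illustrative function names, not the repository's actual code):

import numpy as np

def rotate_equirect_yaw(frame, shift_px):
    # Simulate a spherical yaw (z-axis) rotation by circularly shifting
    # the equirectangular frame horizontally by shift_px pixels.
    return np.roll(frame, shift_px, axis=1)

def bbox_to_original(bbox, shift_px, frame_width):
    # Map a bounding box (x, y, w, h) predicted in the shifted frame back
    # to the original frame; x wraps around the horizontal border.
    x, y, w, h = bbox
    return ((x - shift_px) % frame_width, y, w, h)

A natural choice of shift is one that moves the last known object position towards the horizontal center of the frame, e.g. shift_px = frame_width // 2 - int(prev_center_x), so the tracker never sees the object split across the border.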

Normal field of view (rectilinear) approach

The second approach can improve tracking in heavily distorted areas of the equirectangular projection and solves the border-crossing problem as well. It simulates a virtual camera that tries to keep the tracked object in the central area of a rectilinear projection. The tracker predicts results in the rectilinear/perspective projection (with an adaptive field of view from 90° to 144°), and these results are converted back to equirectangular (360°) coordinates.

kys_nfov
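The core of this approach is the mapping between equirectangular pixel coordinates and the image plane of the virtual rectilinear camera. A minimal sketch of the forward direction, assuming the standard gnomonic projection formulas (illustrative names, not the repository's actual code):

import math

def equirect_to_lonlat(u, v, width, height):
    # Equirectangular pixel (u, v) -> spherical longitude/latitude in radians.
    lon = (u / width - 0.5) * 2.0 * math.pi      # -pi .. pi
    lat = (0.5 - v / height) * math.pi           # -pi/2 .. pi/2
    return lon, lat

def lonlat_to_rectilinear(lon, lat, lon0, lat0):
    # Gnomonic projection of a spherical point onto the image plane of a
    # virtual camera looking at (lon0, lat0); returns normalized plane coords.
    cos_c = (math.sin(lat0) * math.sin(lat)
             + math.cos(lat0) * math.cos(lat) * math.cos(lon - lon0))
    x = math.cos(lat) * math.sin(lon - lon0) / cos_c
    y = (math.cos(lat0) * math.sin(lat)
         - math.sin(lat0) * math.cos(lat) * math.cos(lon - lon0)) / cos_c
    return x, y

The virtual camera center (lon0, lat0) would be updated from the last predicted position so that the object stays near the middle of the rectilinear view, with the field of view adapted within the 90° to 144° range mentioned above.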

Evaluation

We have evaluated 12 trackers on a manually created dataset composed of 21 equirectangular videos.

Dataset

A new dataset with annotated objects in 360° equirectangular video has been created. You can see a demo of this dataset here or on YouTube. The dataset includes 21 video sequences with a total of 9909 annotated (ground-truth) frames. Bounding boxes may overflow the horizontal border of the equirectangular frame (see the sketch below).
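Because a ground-truth box may extend past the right edge of the frame, code that works with the annotations has to treat such a box as wrapping around. One simple way to do this (an assumption about the handling, not the dataset's documented format) is to split the box into two non-overflowing parts:

def split_wrapped_bbox(x, y, w, h, frame_width):
    # Split a box (x, y, w, h) that may overflow the right border of an
    # equirectangular frame into one or two boxes lying inside the frame.
    if x + w <= frame_width:
        return [(x, y, w, h)]
    right_part = (x, y, frame_width - x, h)
    left_part = (0, y, (x + w) - frame_width, h)
    return [right_part, left_part]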

You can download the full dataset using this script, or you can download it manually as a zip archive or a folder.

The videos used in this dataset have been taken from the following sources (datasets 1 and 2 have been re-annotated):

  1. Keng-Chi Liu, Yi-Ting Shen, Liang-Gee Chen. "Simple online and realtime tracking with spherical panoramic camera" (ICCE 2018) [Paper] [Dataset]

  2. Mi Tzu-Wei and Yang Mau-Tsuen. "Comparison of Tracking Techniques on 360-Degree Videos" (2019) [Paper] [Dataset]

  3. Afshin Taghavi Nasrabadi, Aliehsan Samiei, Anahita Mahzari, Ryan P. McMahan, Ravi Prakash, Mylène C. Q. Farias, and Marcelo M. Carvalho. "A taxonomy and dataset for 360° videos" (2019) [Paper] [Dataset]

  4. Custom videos captured by Ricoh Theta SC.

Single object trackers

The custom improvements to single object tracking in the equirectangular projection of 360° video have been evaluated with the following well-known and state-of-the-art trackers. The Python implementations of the selected trackers have been made publicly available by their authors or are part of the OpenCV extra modules.

B. Babenko, M. Yang and S. Belongie. "Visual tracking with online Multiple Instance Learning" (CVPR 2009)

Zdenek Kalal, Krystian Mikolajczyk, Jiri Matas. "Forward-Backward Error: Automatic Detection of Tracking Failures" (ICPR 2010)

Zdenek Kalal, Krystian Mikolajczyk, Jiri Matas. "Tracking-Learning-Detection" (TPAMI 2011)

João F. Henriques, Rui Caseiro, Pedro Martins, Jorge Batista. "High-Speed Tracking with Kernelized Correlation Filters." (TPAMI 2015)

Alan Lukezic, Tomas Vojir, Luka Cehovin, Jiri Matas, Matej Kristan. "Discriminative Correlation Filter with Channel and Spatial Reliability." (CVPR 2017)

Goutam Bhat, Martin Danelljan, Luc Van Gool, Radu Timofte. "Know Your Surroundings: Exploiting Scene Information for Object Tracking." (ECCV 2020)

Goutam Bhat, Martin Danelljan, Luc Van Gool, Radu Timofte. "Learning Discriminative Model Prediction for Tracking." (ICCV 2019)

Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg. "ATOM: Accurate Tracking by Overlap Maximization." (CVPR 2019)

Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg. "ECO: Efficient Convolution Operators for Tracking." (CVPR 2017)

Zheng Zhu, Qiang Wang, Bo Li, Wei Wu, Junjie Yan, Weiming Hu. "Distractor-aware Siamese Networks for Visual Object Tracking." (ECCV 2018)

Zhipeng Zhang, Houwen Peng, Jianlong Fu, Bing Li, Weiming Hu. "Ocean: Object-aware Anchor-free Tracking." (ECCV 2020)

Zhipeng Zhang, Houwen Peng. "Deeper and Wider Siamese Networks for Real-Time Visual Tracking." (CVPR 2019)

Results

These 12 trackers have been evaluated on the custom dataset mentioned above. The success plots are based on the Intersection over Union (IoU) metric, with AUC values given in the legend, and the precision plots are based on the center error distance. DEFAULT plots show tracker results without any improvement, BORDER plots show results of the equirectangular rotation approach, and NFOV plots show results of the normal field of view (rectilinear) approach.

success_plots

success_plots
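For reference, both curves can be computed from per-frame IoU and center-error values in the usual OTB style. A minimal sketch, assuming the per-frame values are already available as NumPy arrays:

import numpy as np

def success_curve(ious, thresholds=np.linspace(0.0, 1.0, 21)):
    # Fraction of frames whose IoU exceeds each threshold; the mean of this
    # curve is the AUC value shown in the plot legend.
    curve = np.array([(ious > t).mean() for t in thresholds])
    return curve, curve.mean()

def precision_curve(center_errors, thresholds=np.arange(0, 51)):
    # Fraction of frames whose center distance (in pixels) is within each threshold.
    return np.array([(center_errors <= t).mean() for t in thresholds])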

Default installation (only OpenCV trackers)

Clone the Git repository

git clone https://github.com/VitaAmbroz/360Tracking.git

Install dependencies

Run the installation script to install all the dependencies needed for OpenCV tracking.

bash install.sh

Note: The install script has been tested on Ubuntu 18.04. You can probably use current releases of numpy, torch and matplotlib. The implementation has also been tested on Windows 10.

Note 2: Make sure that only "opencv-contrib-python" is installed on your platform. Errors can occur when both "opencv-contrib-python" and "opencv-python" are installed. The release opencv-contrib-python 4.5.1.48 is recommended, because some OpenCV trackers might not work in the newest releases.
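A quick sanity check that the contrib build is the one actually being imported (not part of the install script, just a suggestion):

import cv2

print(cv2.__version__)               # expected: 4.5.1
tracker = cv2.TrackerCSRT_create()   # only available with the contrib modules
print(type(tracker))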

Let's test it!

cd code

# try OpenCV implementation of tracker CSRT
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4

# try CSRT with BORDER improvement
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4 -border

# try CSRT with NFOV improvement
python run_opencv_tracking.py -t CSRT -v annotation/dataset-demo/demo-annotation/demo.mp4 -nfov
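Under the hood, these scripts follow the standard OpenCV tracking loop. A stripped-down sketch of such a loop (not the actual run_opencv_tracking.py code) looks roughly like this:

import cv2

video = cv2.VideoCapture("annotation/dataset-demo/demo-annotation/demo.mp4")
ok, frame = video.read()

# select the initial bounding box (x, y, w, h) by hand
bbox = cv2.selectROI("init", frame)

tracker = cv2.TrackerCSRT_create()
tracker.init(frame, bbox)

while True:
    ok, frame = video.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

video.release()
cv2.destroyAllWindows()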

Advanced installation

You can also try the official Python implementations of the selected trackers.

Clone the submodules

This command clones three submodule repositories (pytracking, DaSiamRPN, TracKit) that include implementations of several state-of-the-art trackers.

git submodule update --init  

Follow the instructions

  • To enable ECO, ATOM, DiMP and KYS trackers (pytracking)

    -> Follow the instructions here (see also modified code)

  • To enable DaSiamRPN tracker (DaSiamRPN)

    -> Follow the instructions here (see also modified code)

  • To enable SiamDW and Ocean trackers (TracKit)

    -> Follow the instructions here (see also modified code)

Directory structure

$360Tracking
|-- tech_report
|-- code
   |-- annotation
      |-- dataset
      |-- dataset-demo
      |-- results
   |-- opencv_tracking
   |-- boundingbox
   |-- nfov
   |-- modified_DaSiamRPN
   |-- modified_pytracking
   |-- modified_TracKit
   |-- DaSiamRPN
   |-- pytracking
   |-- TracKit