This repository contains code for performing foreground/background segmentation of the human hand in videos from an egocentric perspective, using pixel-level classification. This project is based on work by Cheng Li and Kris Kitani and is described in the publication below:
Li, Cheng, and Kris M. Kitani. "Pixel-level hand detection in ego-centric videos." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2013. (pdf)
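Here, "pixel-level" means that every pixel in a frame is labeled hand or background on its own. The toy snippet below is only an illustration of that idea (a fixed skin-color rule in YCrCb space); it is not the learned per-pixel model this project actually trains:

```cpp
// Illustration only: label each pixel independently with a crude
// skin-color rule. The real project learns a per-pixel classifier
// from labeled training images instead of using fixed thresholds.
#include <opencv2/opencv.hpp>

cv::Mat toyHandMask(const cv::Mat& bgrFrame) {
    cv::Mat ycrcb, mask;
    cv::cvtColor(bgrFrame, ycrcb, cv::COLOR_BGR2YCrCb);
    // Rough skin-tone bounds in YCrCb; 255 = hand candidate, 0 = background.
    cv::inRange(ycrcb, cv::Scalar(0, 133, 77), cv::Scalar(255, 173, 127), mask);
    return mask;
}
```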
Using this app requires training data. This project comes with sample training data, but you can create your own by labeling images using Kitani's 'Labeling Tool', which we have also ported to work with the latest Xcode IDE. You can find it here.
This project is ported to work in Apple's Xcode IDE, which can be installed from the Mac App Store here.
This project also uses the OpenCV library (version 2.4.13 or 2.4.12) for C++. There are two ways to install OpenCV on OS X; we recommend the first:
- Use Homebrew. From the terminal, run:

      brew update
      brew tap homebrew/science
      brew install opencv

  * Make sure OpenCV's dependencies are also installed; you can check with `brew info opencv`.
- Download OpenCV and build it with CMake.
  * This tutorial provides greater detail on setting up OpenCV with CMake and Xcode.
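If you want to confirm that OpenCV is visible to your compiler before opening the Xcode project, a quick check like the one below can help. This file is not part of the repository, and the compile command is just one possibility:

```cpp
// check_opencv.cpp -- prints the OpenCV version the compiler picks up.
// Build from the terminal with something like:
//   g++ check_opencv.cpp -o check_opencv `pkg-config --cflags --libs opencv`
#include <iostream>
#include <opencv2/core/version.hpp>

int main() {
    std::cout << "OpenCV version: " << CV_VERSION << std::endl;
    return 0;
}
```

If it prints 2.4.12 or 2.4.13, you should be ready for the steps below.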
To install the Hand Tracking project, complete the following steps.
- Clone this repository:

      git clone https://github.com/cmuartfab/handtrack.git

- Double click the `HandTracker` Xcode project (the file with the blue icon) to open it in Xcode.
- In Xcode, on the top level toolbar navigate to `File -> Add files to HandTracker`.
- When the file window pops up, press `/` to open the folder option. Type in `usr/local/lib` and hit Go.
- When in the `usr/local/lib` folder, select all of the `.dylib` files that start with `libopencv`.
- Before you click Add:
  - Make sure `Add to targets: Hand Tracker` is selected.
  - Make sure `Added folders: Create Groups` is selected.
- Click Add. You should see all the libopencv `.dylib` files in the HandTracker project folder.
- In Xcode, click on the `HandTracker` Xcode project to open the build settings.
- Under Targets in the left column, select HandTracker.
- Make sure `Library Search Paths` points to where OpenCV is installed on your machine.
  - If you used Homebrew, it should be in `usr/local/Cellar`.
- With Xcode open, on the top toolbar go to `Xcode -> Preferences -> Text Editing`. Check `Show Line Numbers`.
- In `main.cpp`, change lines 13 and 14 to the following to run training:

      bool TRAIN_MODEL = 1;
      bool TEST_MODEL = 0;

- In `main.cpp`, change line 36 to the absolute file path of the Hand Tracker project folder:

      string root = "ABSOLUTE PATH TO PROJECT FOLDER";

- Click Run to run the training model. When it completes, you should see `.xml` files in `models/` and `globfeat/`.
- In `main.cpp`, change lines 13 and 14 to the following to run testing:

      bool TRAIN_MODEL = 0;
      bool TEST_MODEL = 1;

- Click Run to run the testing model. This should open your laptop's webcam and shade your hands with a deep purple color in real time (see the sketch after this list for the general shape of such a loop).
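For orientation, here is a minimal sketch of that kind of real-time loop. This is not the project's `main.cpp`; the hand mask below is a placeholder standing in for the per-pixel predictions the trained model would produce for each frame:

```cpp
// Sketch of a webcam loop that tints masked pixels deep purple.
// Not the project's code: handMask is a placeholder for the
// per-pixel hand predictions a trained detector would output.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);                  // default laptop webcam
    if (!cap.isOpened()) return -1;

    cv::Mat frame;
    while (cap.read(frame)) {
        // Placeholder mask: all zeros here; the real detector fills this in.
        cv::Mat handMask(frame.size(), CV_8UC1, cv::Scalar(0));

        // Copy a solid purple image onto the frame wherever the mask is set.
        cv::Mat purple(frame.size(), frame.type(), cv::Scalar(128, 0, 128));
        purple.copyTo(frame, handMask);

        cv::imshow("HandTracker sketch", frame);
        if (cv::waitKey(1) == 27) break;      // press Esc to quit
    }
    return 0;
}
```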