Hand Tracker

This repository contains code for performing foreground/background segmentation of the human hand in videos from an egocentric perspective, using pixel-level classification. This project is based on work by Cheng Li and Kris M. Kitani, described in the publication below:

Li, Cheng, and Kris M. Kitani. "Pixel-level hand detection in ego-centric videos." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2013. (pdf)

Training Data

Using this app requires training data. This project comes with sample training data, but you can create your own by labeling images using Kitani's 'Labeling Tool', which we have also ported to work with the latest Xcode IDE. You can find it here.

Dependencies

IDE

This project is ported to work in Apple's Xcode IDE. Xcode can be installed from the Mac App Store here.

OpenCV on OS X

This project also uses the OpenCV library (version 2.4.13 or 2.4.12) for C++. There are two ways to install OpenCV on OS X; we recommend the first:

  1. Use Homebrew. From the terminal, run:
       brew update
       brew tap homebrew/science
       brew install opencv
     Make sure OpenCV's dependencies are also installed; you can check using `brew info opencv`.
  2. Download OpenCV and build it using CMake. This tutorial provides greater detail for setting up OpenCV with CMake and Xcode.

Project Installation

To install the Hand Tracking project, complete the following steps.

  1. Clone this repository: git clone https://github.com/cmuartfab/handtrack.git

  2. Double-click the HandTracker Xcode project (the file with the blue icon) to open it in Xcode.

  3. In Xcode, on the top-level toolbar, navigate to File -> Add Files to HandTracker.

  4. When the file window pops up, press / to open the go-to-folder option. Type /usr/local/lib and hit Go.

  5. When in /usr/local/lib, select all of the .dylib files that start with libopencv.

  6. Before you click add:

    • Make sure Add to targets: Hand Tracker is selected.
    • Make sure Added Folder: Create Groups is selected.
  7. Click Add. You should see all the libopencv .dylib files in the HandTracker project folder.

  8. In Xcode, click on the HandTracker Xcode project to open the build settings.

  9. Under targets on the left column, select HandTracker.

  10. Make sure Library Search Paths points to where OpenCV is installed on your machine.

    • If you used Homebrew, it should be in /usr/local/Cellar
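For reference, with a Homebrew install the Library Search Paths entry typically points at the keg's lib directory, something like the fragment below (the version directory is an assumption here and will differ depending on which OpenCV version Homebrew installed):

```
/usr/local/Cellar/opencv/2.4.13/lib
```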

Running the Project

  • With Xcode open, on the top toolbar go to Xcode -> Preferences -> Text Editing. Check Show Line Numbers.
  • In main.cpp, change lines 13 and 14 to the following to run training:
bool TRAIN_MODEL = 1;
bool TEST_MODEL  = 0;
  • In main.cpp, change line 36 to the absolute file path of the Hand Tracker project folder:
string root = "ABSOLUTE PATH TO PROJECT FOLDER"; 
  • Click Run to run the training model. When it completes, you should see .xml files in models/ and globfeat/
  • In main.cpp, change lines 13 and 14 to the following to run testing:
bool TRAIN_MODEL = 0;
bool TEST_MODEL  = 1;
  • Click Run to run the testing model. This should open your laptop's webcam and shade your hands with a deep purple color in real time.
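The two flags above simply gate which mode runs. As a minimal sketch of that control flow, assuming hypothetical stand-ins `trainModel()` and `testModel()` for the project's actual OpenCV-based routines (these names and return strings are illustrative, not the real API):

```cpp
#include <string>

// Hypothetical stand-in for the training routine: in the real project
// this reads labeled images and writes .xml files to models/ and globfeat/.
static std::string trainModel(const std::string& root) {
    return "trained models written under " + root + "/models";
}

// Hypothetical stand-in for the testing routine: in the real project
// this loads the trained models and segments hands in the webcam feed.
static std::string testModel(const std::string& root) {
    return "segmenting webcam frames with models under " + root + "/models";
}

// Mirrors the TRAIN_MODEL / TEST_MODEL flags on lines 13-14 of main.cpp:
// you enable one mode per run, with root set to the project folder path.
static std::string run(bool trainFlag, bool testFlag, const std::string& root) {
    if (trainFlag) return trainModel(root);
    if (testFlag)  return testModel(root);
    return "nothing to do";
}
```

Setting both flags to 1 in the same run would train first and is not what the steps above describe; flip exactly one flag per run as shown.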
