
Using the main function to train and test. #3

Closed
mon95 opened this issue Aug 15, 2017 · 7 comments

@mon95
Owner

mon95 commented Aug 15, 2017

Firstly, apologies for having put out code that is not ready to use out of the box (i.e., not a one-click solution). I have received a lot of queries on how to run the main method, so I've decided to share the steps here.

The code here was never meant to be a one-click solution to static gesture recognition; it was shared so that the methods used to train the models would be available. Either way, if you are interested in running it directly as opposed to using the methods in an independent program, you can try the following:

  1. Ignore lines 505 to 508. Change lines 509, 512 and 513 appropriately to include the correct list(s) of users (as per the data you have downloaded).

  2. Add a line:

```python
gs = GestureRecognizer('/path/to/dataset/')  # provide the correct path here
```

This constructor is different from the one we've used. The reason we used the other one was that we had already trained the models. Using this constructor means the models will be trained when you call the train method.

Then use the following as it is:

```python
gs.train(user_tr)
gs.save_model(name="your-model-name.pkl.gz", version="0.0.1", author='ss')
print "The GestureRecognizer is saved to disk"
```

Once it is trained (training may take a long time), your model will be saved to disk. From then on, you can simply load the model and use it to test (i.e., detect gestures) using the recognize_gesture() method.

For this,

```python
gs = GestureRecognizer.load_model(name="your-model-name.pkl.gz")  # automatic dict unpacking
gs.recognize_gesture()
```

Hope this helps!
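For what it's worth, the `save_model` / `load_model` flow above is essentially gzip-compressed pickling (hence the `.pkl.gz` extension). Here is a minimal, self-contained sketch of that pattern; the `model_state` dict and the helper names are hypothetical stand-ins, not the repository's actual internals:

```python
import gzip
import pickle

def save_model(obj, name):
    # Serialize the object and gzip-compress it, matching the .pkl.gz naming
    with gzip.open(name, "wb") as f:
        pickle.dump(obj, f)

def load_model(name):
    # Reverse of save_model: decompress, then unpickle
    with gzip.open(name, "rb") as f:
        return pickle.load(f)

# Hypothetical stand-in for a trained recognizer's state
model_state = {"version": "0.0.1", "author": "ss", "weights": [0.1, 0.2]}
save_model(model_state, "your-model-name.pkl.gz")
restored = load_model("your-model-name.pkl.gz")
print(restored["version"])  # prints 0.0.1
```

The gzip layer only changes how the bytes are stored on disk; the pickle payload (and any class-unpickling caveats) are the same as with a plain `.pkl` file.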

@goodmangu

Do you now have more datasets available?

@goodmangu

Can you show an example of how to use recognize_gesture()?

@KARTHICKRAJA0077

I have trained on the ASL MNIST dataset from Kaggle and got 95 percent accuracy, but when I tried it with your images I didn't get the proper output. Can you please help me sort this out?

@goodmangu

Hi Karthick, could you share the link to the asl mnist dataset?

@KARTHICKRAJA0077

KARTHICKRAJA0077 commented Feb 6, 2018 via email

@TusharAI

TusharAI commented Apr 2, 2020

I'm getting an error at
`le = loadClassifier('label_en (1).pkl')`
in the pipeline_final.ipynb file. How can I resolve it?
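Without the full traceback it's hard to be sure, but `'label_en (1).pkl'` looks like a duplicate-download filename (the browser appended ` (1)`), so a common cause is the path not matching what's actually on disk. Assuming `loadClassifier` wraps `pickle.load` (an assumption; the notebook's helper isn't shown here), a defensive version that fails with a clear message might look like:

```python
import os
import pickle

def load_classifier(path):
    # Check the path first so a typo fails clearly instead of
    # surfacing as an opaque unpickling/IO error
    if not os.path.exists(path):
        raise FileNotFoundError(
            "Classifier file not found: %r -- check the filename "
            "(e.g. 'label_en.pkl' vs 'label_en (1).pkl') and the "
            "working directory" % path
        )
    with open(path, "rb") as f:
        return pickle.load(f)

# Hypothetical round trip to demonstrate the loader
with open("label_en.pkl", "wb") as f:
    pickle.dump(["A", "B", "C"], f)
labels = load_classifier("label_en.pkl")
```

If the file does exist under the `(1)` name, either rename it to what the notebook expects or update the string in the notebook cell.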

@KARTHICKRAJA0077

KARTHICKRAJA0077 commented Apr 6, 2020 via email
