Reposted from: 爱可可-爱生活
CNNGestureRecognizer ver 2.0
Gesture recognition via a convolutional neural network (CNN), implemented in Keras + Theano + OpenCV
Key Requirements: Python 2.7.13, OpenCV 2.4.8, Keras 2.0.2, Theano 0.9.0
Suggestion: It is better to install Anaconda, as it takes care of most of the other packages and makes it easier to set up a virtual environment for working with multiple versions of key packages such as Python and OpenCV.
Repo contents
- trackgesture.py : The main launcher. It contains all the UI-option code and the OpenCV code to capture camera frames, and internally calls the interfaces in gestureCNN.py.
- gestureCNN.py : This file holds all the CNN-specific code: it creates the CNN model, loads the weight file (if the model is pretrained), trains the model using the image samples in ./imgfolder_b, and visualizes the feature maps at different layers of the NN (for a pretrained model) for a given input image from the ./imgs folder. A minimal sketch of such a model appears after this list.
- imgfolder_b : This folder contains all 4015 gesture images I took in order to train the model.
- imgs : An optional folder with a few sample images that can be used to visualize the feature maps at different layers. These are just a few sample images taken from imgfolder_b.
- ori_4015imgs_acc.png : A plot of training accuracy vs. validation accuracy after I trained the model.
- ori_4015imgs_loss.png : A plot of training loss vs. validation loss after training.
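For orientation, below is a minimal sketch of the kind of Keras Sequential CNN that gestureCNN.py builds. The layer sizes and the 200x200 grayscale, channels-first input are illustrative assumptions, not the repository's exact architecture.

```python
# Minimal sketch of a 5-class gesture CNN in Keras 2 (Theano backend).
# Layer sizes and the 200x200 grayscale input are illustrative assumptions.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

nb_classes = 5            # OK, PEACE, STOP, PUNCH, NOTHING
img_rows, img_cols = 200, 200

model = Sequential()
# channels-first shape (1, rows, cols) matches Theano's default image ordering
model.add(Conv2D(32, (3, 3), activation='relu',
                 input_shape=(1, img_rows, img_cols)))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(nb_classes, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adadelta',
              metrics=['accuracy'])
```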
Usage
$ KERAS_BACKEND=theano python trackgesture.py
We set KERAS_BACKEND here to switch the backend to Theano. If you have already configured Theano as the backend in keras.json, this is not needed; but if TensorFlow is set as the default backend, this step is required.
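To make the switch permanent, you can set the `backend` field in `~/.keras/keras.json` to `theano`. A quick sanity check (not part of the repo) to confirm which backend Keras is actually using:

```python
# Print the active Keras backend and image ordering.
from keras import backend as K

print(K.backend())            # expect 'theano'
print(K.image_data_format())  # 'channels_first' is the usual ordering with Theano
```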
Features
This application comes with a CNN model that recognizes up to 5 pretrained gestures (a hypothetical label-mapping sketch follows the list below):
- OK
- PEACE
- STOP
- PUNCH
- NOTHING (i.e. when none of the above gestures is being input)
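The network's softmax output assigns one probability to each of these classes; the predicted gesture is simply the class with the highest probability. The mapping below uses a hypothetical index order, which may differ from the one in gestureCNN.py:

```python
import numpy as np

# Hypothetical index-to-label mapping; the actual order in gestureCNN.py may differ.
GESTURES = ['OK', 'PEACE', 'STOP', 'PUNCH', 'NOTHING']

def label_from_probs(probs):
    """Return the gesture name with the highest predicted probability."""
    return GESTURES[int(np.argmax(probs))]
```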
This application provides the following functionality:
- Prediction : Lets the app guess the user's gesture against the pretrained gestures. The app can dump the prediction data to the console terminal or directly to a JSON file, which can be used to plot a real-time prediction bar chart (you can use my other project, LivePlot - https://github.com/asingh33/LivePlot). A rough sketch of this prediction path appears after this list.
- New Training : Lets the user retrain the NN model. The user can change the model architecture or add/remove gestures. The app has built-in options that let the user create new image samples for user-defined gestures if required.
- Visualization : Lets the user see the feature maps of different NN layers for a given input gesture image. It is interesting to see how the NN works and what it learns; see the feature-map sketch after this list.
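As a rough illustration of the prediction path (function and parameter names here are assumptions, not the repository's actual helpers): a region of interest is cropped from the camera frame, converted to the model's assumed 200x200 grayscale channels-first input, passed through `model.predict`, and the class probabilities are written to a JSON file that an external plotter such as LivePlot can read.

```python
import json

import cv2
import numpy as np

def predict_roi(model, frame, x, y, w, h, json_path='gesture_probs.json'):
    """Classify a region of interest of a BGR frame and dump the probabilities.

    Assumes a 200x200 grayscale, channels-first model input; adjust as needed.
    """
    roi = frame[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (200, 200))
    batch = gray.reshape(1, 1, 200, 200).astype('float32') / 255.0
    probs = model.predict(batch)[0]
    # Dump probabilities so an external tool can chart them in real time.
    with open(json_path, 'w') as f:
        json.dump([float(p) for p in probs], f)
    return probs
```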
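The feature-map visualization can be reproduced with a standard Keras pattern: build a backend function from the model's input tensor to a chosen layer's output and evaluate it on a single preprocessed image. A minimal sketch, where the layer index and input shape are assumptions:

```python
from keras import backend as K

def get_feature_maps(model, image, layer_index=1):
    """Return one layer's activations for a single preprocessed image.

    `image` is expected in the model's input shape, e.g. (1, 1, 200, 200)
    for a channels-first 200x200 grayscale input (an assumption here).
    """
    # Pass 0 for the learning phase so dropout behaves as at test time.
    get_output = K.function([model.layers[0].input, K.learning_phase()],
                            [model.layers[layer_index].output])
    return get_output([image, 0])[0]
```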
Demo
YouTube link - https://www.youtube.com/watch?v=CMs5cn65YK8
Link:
https://github.com/asingh33/CNNGestureRecognizer
Original post:
https://m.weibo.cn/1402400261/4128216208041805