
Perception

Android application for controlling an air hockey robot in real time using a convolutional neural network (a submodule of Deep Learning Air Hockey Robot)

How it works

The application infers the action the robot should take from the last 3 frames captured by the camera. The inferred action is then sent to the Arduino over Bluetooth LE, as sketched below.
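
A minimal sketch of this per-frame loop, assuming a JNI bridge into the Caffe2 model (`predictActions`) and an already-connected GATT characteristic. The names are illustrative, not the repository's actual API:

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;

import java.util.ArrayDeque;

/**
 * Sketch of the control loop: buffer the last 3 frames, run the CNN,
 * send the chosen action index to the Arduino as one byte over BLE.
 */
public class ControlLoop {

    private static final int FRAME_HISTORY = 3; // the network looks at the last 3 frames

    private final ArrayDeque<float[]> lastFrames = new ArrayDeque<>(FRAME_HISTORY);
    private final BluetoothGatt gatt;
    private final BluetoothGattCharacteristic actionCharacteristic;

    public ControlLoop(BluetoothGatt gatt, BluetoothGattCharacteristic actionCharacteristic) {
        this.gatt = gatt;
        this.actionCharacteristic = actionCharacteristic;
    }

    /** Called once per camera frame (already converted to an RGB float array). */
    public void onFrame(float[] rgbFrame) {
        if (lastFrames.size() == FRAME_HISTORY) {
            lastFrames.removeFirst();
        }
        lastFrames.addLast(rgbFrame);
        if (lastFrames.size() < FRAME_HISTORY) {
            return; // wait until 3 frames have been seen
        }

        // Run the CNN on the stacked frames and pick the highest-scoring action.
        float[] scores = predictActions(lastFrames.toArray(new float[0][]));
        int action = argmax(scores);

        // Send the action index to the Arduino as a single byte over BLE.
        actionCharacteristic.setValue(new byte[] { (byte) action });
        gatt.writeCharacteristic(actionCharacteristic);
    }

    private static int argmax(float[] values) {
        int best = 0;
        for (int i = 1; i < values.length; i++) {
            if (values[i] > values[best]) best = i;
        }
        return best;
    }

    // Hypothetical JNI entry point into the Caffe2 model.
    private native float[] predictActions(float[][] stackedFrames);
}
```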

Predictions are made by a convolutional neural network. The network is pretrained on labeled frames generated with Air Hockey Game Simulator, then trained via DDQN using gym-air-hockey as the environment. Finally, the model is converted from Keras to Caffe2 using the keras-to-caffe2 converter.
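
For background (a standard formulation, not code from this repository): DDQN differs from vanilla DQN in how the training target is built, selecting the greedy action with the online network θ and evaluating it with the target network θ⁻:

```latex
y_t = r_t + \gamma \, Q\big(s_{t+1},\; \operatorname*{arg\,max}_{a'} Q(s_{t+1}, a'; \theta);\; \theta^{-}\big)
```

where r_t is the reward from the environment and γ is the discount factor.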

Screenshots

Launched application

Menu used to connect to a BLE device

Ready to start

During the game (Top prediction is to go southeast (bottom right) ☺)

Prerequisites

Android Studio

Download

git clone https://github.com/arakhmat/perception

Build

Import the project into Android Studio; it will build automatically.

Limitations

  • The application was tested only on the Sony Xperia M4 Aqua. On other devices it will most likely not convert raw YUV data to an RGB image correctly, because the Xperia M4 Aqua's camera reports YUV data in an unusual format (see the conversion sketch below).
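
For reference, a minimal sketch of the standard `YUV_420_888` → RGB path on Android, assuming the common plane layout (full-resolution Y, chroma pixel stride 2, no row padding); devices that deviate from this layout, as the Xperia M4 Aqua reportedly does, need device-specific handling. The class and method names here are illustrative:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.media.Image;

import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

/** Standard-layout YUV_420_888 to RGB conversion via an NV21 repack. */
public final class YuvToRgb {

    public static Bitmap toBitmap(Image image) {
        int width = image.getWidth();
        int height = image.getHeight();

        ByteBuffer y = image.getPlanes()[0].getBuffer();
        ByteBuffer u = image.getPlanes()[1].getBuffer();
        ByteBuffer v = image.getPlanes()[2].getBuffer();

        // Repack the planes as NV21 (Y followed by interleaved VU), which
        // YuvImage understands. Assumes no row padding, which does not hold
        // on every device.
        byte[] nv21 = new byte[width * height * 3 / 2];
        y.get(nv21, 0, width * height);
        // With chroma pixel stride 2, the V buffer already contains the
        // interleaved VU sequence except for the final U sample.
        v.get(nv21, width * height, v.remaining());
        nv21[nv21.length - 1] = u.get(u.remaining() - 1);

        // Let the platform do the color conversion via a JPEG round-trip.
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
        byte[] jpeg = out.toByteArray();
        return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
    }

    private YuvToRgb() {}
}
```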

Acknowledgments