CSID-DGU/2017-2-CECD-ButterflyEffect-2

BattleWorms

Members

Contents

  1. Results
  2. Game Rules
  3. Development Diary
  4. OpenSource
  5. License

Results

1) Demo Environment & Demo video


2) Software architecture & Internal logic



Game Rules

1) Game participation

   

You can join the game by raising your hands for about a second. At that moment, the user's key color is stored as well. The picture on the right shows the user's core color.

2) How to move your worm

   

If you turn your head to the left, the worm turns left relative to its current direction, and vice versa.
Tip : You can boost your worm by raising your hands. Various strategies become possible with the booster!
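The steering described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the tilt signal, dead zone, turn rate, and boost factor are all assumed values.

```python
def steering_from_head(head_tilt_deg, turn_rate_deg=4.0, dead_zone_deg=10.0):
    """Map a head-tilt angle to a per-frame change in the worm's heading.

    head_tilt_deg: estimated head tilt (negative = tilted left).
    The thresholds here are illustrative, not the project's values.
    """
    if head_tilt_deg < -dead_zone_deg:   # head tilted left -> turn left
        return -turn_rate_deg
    if head_tilt_deg > dead_zone_deg:    # head tilted right -> turn right
        return turn_rate_deg
    return 0.0                           # keep the current heading


def speed(base=2.0, boost=1.5, hands_raised=False):
    """Raising both hands multiplies the worm's speed by a boost factor."""
    return base * boost if hands_raised else base
```

A dead zone around the neutral head position keeps the worm from jittering when the player holds still.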

Development Diary

1) Simple client-server face tracking program (17.10.05)

This is the first program we created for the project. It was designed to establish the overall framework of the project, which is as follows.

1. Send real-time image frames from the client to the server (UDP packets).
2. Send the values obtained by interpreting each frame with OpenPose back to the client (TCP packets).
3. Reflect the values in BattleWorms.

An AWS server is needed because OpenPose requires a high-performance machine and is difficult to run on a personal computer. In this first program, however, we used a simple face-detection module instead of OpenPose, since we had not yet analyzed it.
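The three steps above can be sketched roughly as below. The port, the length-prefixed framing, and the helper names are illustrative assumptions; the actual project uses its own C++ client and AWS server.

```python
import socket
import struct

SERVER = ("127.0.0.1", 9000)   # placeholder address, not the project's AWS host


def send_frame_udp(sock, frame_bytes):
    """Step 1: the client sends one encoded camera frame to the server (UDP)."""
    # prefix with a 4-byte length so the server can sanity-check the datagram
    sock.sendto(struct.pack(">I", len(frame_bytes)) + frame_bytes, SERVER)


def send_result_tcp(conn, theta):
    """Step 2: the server sends the interpreted value (e.g. a head angle) back (TCP)."""
    conn.sendall(struct.pack(">f", theta))


def apply_result(worm, data):
    """Step 3: the client reflects the received value in BattleWorms."""
    (theta,) = struct.unpack(">f", data)
    worm["theta"] = theta
    return worm
```

UDP suits the frame stream (a lost frame is harmless; the next one replaces it), while TCP suits the small result values, where loss would stall the worm.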


2) Openpose analysis (17.10.30)

The next step was to replace the simple face-detection module with OpenPose. The OpenPose tutorial provides three basic examples:

Running on webcam.
Running on video.
Running on images.

We decided to analyze these examples and build our program on top of them, but it was harder than we thought: there were no examples for a client-server structure. After much deliberation, we decided to analyze and use the third example, because we felt we could solve the problem by replacing the image file with the frame received from the client.


3) Completed BattleWorms for a single player (17.12.04)

   

We completed a version of BattleWorms that a single player can play. The client sends real-time camera frames; the server returns the coordinates of the detected person's skeleton on each frame; and the client analyzes those coordinates to move the worm. The worm consists of a head and a tail, and the head has an angle theta, so on every frame it moves in the direction of theta.
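The per-frame movement can be illustrated with a short sketch. The speed, the segment spacing, and the dict-based worm representation are assumptions for the example, not the project's data structures.

```python
import math


def step_worm(worm, dtheta, speed=2.0):
    """Advance the worm one frame: turn the head by dtheta, then move it
    along its heading while the tail segments follow."""
    worm["theta"] += dtheta
    head_x, head_y = worm["body"][0]
    new_head = (head_x + speed * math.cos(worm["theta"]),
                head_y + speed * math.sin(worm["theta"]))
    # each tail segment takes the place of the segment in front of it
    worm["body"] = [new_head] + worm["body"][:-1]
    return worm


worm = {"theta": 0.0, "body": [(0.0, 0.0), (-2.0, 0.0), (-4.0, 0.0)]}
step_worm(worm, 0.0)   # with theta = 0 the head moves straight along +x
```

Because the head advances every frame regardless of input, turning the head only changes theta; this matches the "moves in the direction of theta on every frame" behavior described above.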

We had several problems to tackle before extending the project to multiple players.

1. Inefficient image processing : we drew every tail segment directly on the camera frame.
2. The camera and the worms used the same frame : the camera and game views needed to be separated.
3. It was difficult to develop the intended UI.

We decided to solve the first problem with Unity, and we thought the second would be simpler if we used Android.


4) Changed our program architecture (18.02.22)

   

We changed the existing C++ client to an Android client for convenience in developing the view and integrating Unity.
To make it behave the same as the previous client, we focused on the following:

1. A socket module that performs the same operations as the C++ socket.
2. Sending Android camera frames at 10 fps, compressed into a JPEG byte array.
3. Developing BattleWorms in Unity and importing it into the Android app.

5) Completed BattleWorms for multiple players (18.03.18)

   

We changed the data structure to JSON to support multiple users.
We also developed a filter to extract only the actual players from the persons detected in each frame.

Filter : how we distinguish actual players from other people

We extract the real player from among the many people in a frame as follows:

1. The player's coordinates from the previous frame are used to determine the player in the new frame.
2. Check the detected persons near the previous player's position and call them candidates.
3. When (candidates.size == 1) : newPlayerPosition = candidate
4. When (candidates.size >= 2) : newPlayerPosition = candidates(mostSimilarColor)
5. When (candidates.size == 0) : newPlayerPosition = previousFramePlayerPosition
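The five steps above can be sketched like this. The search radius, the RGB color representation, and the squared-distance color metric are illustrative assumptions; the project stores a "key color" per player but does not specify these details.

```python
def color_distance(a, b):
    """Squared RGB distance between two stored key colors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def filter_player(prev_pos, prev_color, detections, radius=50.0):
    """Pick the actual player from all persons detected in the new frame.

    detections: list of (position, color) pairs, positions as (x, y).
    """
    def near(p):
        return (p[0] - prev_pos[0]) ** 2 + (p[1] - prev_pos[1]) ** 2 <= radius ** 2

    candidates = [(pos, col) for pos, col in detections if near(pos)]
    if len(candidates) == 1:          # exactly one person nearby: that's the player
        return candidates[0][0]
    if len(candidates) >= 2:          # several nearby: pick the most similar color
        return min(candidates, key=lambda c: color_distance(c[1], prev_color))[0]
    return prev_pos                   # nobody nearby: keep the previous position
```

Falling back to the previous position when no candidate is found keeps the worm stable through brief detection dropouts, at the cost of lagging if the player moves quickly while undetected.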

6) Adding game effects and enhancing filter accuracy (18.06.17)

   

  • We added game elements such as sound and graphics.
  • We improved our filter so that it functions normally even when there are many people in the background.

Thank you!

OpenSource

We built our program using OpenPose!

OpenPose is authored by Gines Hidalgo, Zhe Cao, Tomas Simon, Shih-En Wei, Hanbyul Joo, and Yaser Sheikh. Currently, it is being maintained by Gines Hidalgo and Bikramjot Hanzra. The original CVPR 2017 repo includes Matlab and Python versions, as well as the training code. The body pose estimation work is based on the original ECCV 2016 demo.

License

This program follows the licensing of OpenPose. It is freely available for non-commercial use and may be redistributed under those conditions. Please see the license for further details.

About

Motion Recognition System using Image Processing
