
LeMoN-AI

Let's teach that dude to dance.

What is the purpose?

LeMoN (Learning Motion Network) is a tiny experiment conducted by four Russian students. We are trying to teach an agent to dance in 3D space based on the music that is playing.

Ummm... why?

Well, that's easy. We are teaching an agent to behave in a 3D environment based on its surroundings. In this case the surroundings are described by music, but we could swap that for camera images or 3D scanner data. So, theoretically, the same architecture could later be applied to more complex problems, like picking up large objects based on their positions, or changing one's path based on obstacles along the way.

How?

We have found mocaps of people dancing, plus recordings of the music that was playing during each dance. (The data is not included, so write to any of the authors to get it.) In the future we will have to record our own mocaps with additional data to train our network.
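To give a feel for the setup, here is a rough sketch of how a dance clip could be paired with the music that was playing during it. The file layout, the librosa mel-spectrogram features, and the 30 fps mocap rate are purely illustrative assumptions, not our actual data format:

```python
# A minimal sketch of pairing one mocap clip with its audio (assumed formats).
import numpy as np
import librosa

def load_clip(mocap_path: str, audio_path: str, fps: float = 30.0):
    """Return (motion, audio_features) aligned frame-by-frame."""
    # Mocap: one pose vector per frame (e.g. joint rotations), shape (T, D).
    motion = np.load(mocap_path)

    # Audio: mel-spectrogram with a hop length chosen so that one
    # spectrogram frame corresponds to one mocap frame.
    wav, sr = librosa.load(audio_path, sr=None)
    hop = int(sr / fps)
    mel = librosa.feature.melspectrogram(y=wav, sr=sr, hop_length=hop, n_mels=64)
    audio = librosa.power_to_db(mel).T  # shape (T_audio, 64)

    # Truncate both streams to the shorter length so they stay in sync.
    t = min(len(motion), len(audio))
    return motion[:t], audio[:t]
```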

And what is your NN's architecture?

The exact architecture might have changed already, so check the NN's file for details. In short, though, each new movement is generated from the past movements and the music that has been playing for the last N milliseconds.
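To give a flavour of that idea, here is a tiny PyTorch sketch of a generator conditioned on past poses and a recent audio window. The GRU encoders, layer sizes, and pose dimensionality are illustrative assumptions, not the real network, so do check the NN's file for the actual thing:

```python
# A minimal sketch (assumed design): predict the next pose from a window of
# past poses and the audio features of the last N milliseconds of music.
import torch
import torch.nn as nn

class DanceGenerator(nn.Module):
    def __init__(self, pose_dim: int = 63, audio_dim: int = 64, hidden: int = 256):
        super().__init__()
        self.motion_enc = nn.GRU(pose_dim, hidden, batch_first=True)
        self.audio_enc = nn.GRU(audio_dim, hidden, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, pose_dim),  # the next pose
        )

    def forward(self, past_poses, audio_window):
        # past_poses:   (batch, T_motion, pose_dim) -- recent movements
        # audio_window: (batch, T_audio, audio_dim) -- music from the last N ms
        _, m = self.motion_enc(past_poses)
        _, a = self.audio_enc(audio_window)
        return self.head(torch.cat([m[-1], a[-1]], dim=-1))

# At generation time the predicted pose is appended to the history and the
# window slides forward, so the dance is produced autoregressively.
```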

Oh my god, I want it!

Well, that's great. Contact any of us for information on contributing or GIVING US MONEY... Em... Investing.

P.S. If you have mocap gear that you are willing to lend us for a day or so, we will owe you a beer.
