
Wall-E Robot

Build the Wall-E robot with cutting-edge AI tools.

You can talk to the robot, for example "What can you see on your right?", and it will turn right and speak out what it sees.

Dependencies/Features

Generally speaking, it can integrate any of the tools built with LangChain.
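
For illustration, here is a minimal hedged sketch (not the repo's actual code) of exposing a Python function to the agent as a LangChain tool; the helper name and its stubbed behavior are made up for the example:

from langchain.tools import Tool

def describe_view(direction: str) -> str:
    # Hypothetical helper: in the real robot this would turn the car,
    # grab a camera frame, and caption it with the BLIP model.
    return f"Looking {direction}: a desk and a window."

view_tool = Tool(
    name="describe_view",
    func=describe_view,
    description="Turn the robot and describe what the camera sees. "
                "Input is a direction such as 'left' or 'right'.",
)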

Set up the environment

Clone the code

git clone git@github.com:pjq/wall-e-robot.git 
pip install -r requirements.txt

Update config.json

{
  "cohere_api_key": "",
  "openai_api_key": "",
  "enable_openai": true,
  "openai_api_base_url": "",
  "enable_mps": true,
  "blip_model": "Salesforce/blip-image-captioning-base",
  "whisper_cpp_path":"",
  "edge_tts_enable": true,
  "car_api_url": "",
  "edge_tts_voice": "en-US-JennyNeural",
}
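
For reference, here is a minimal sketch of reading this config from Python (the loader function is illustrative, not the script's actual code):

import json

def load_config(path: str = "config.json") -> dict:
    # Keys such as "openai_api_key" and "edge_tts_voice"
    # come from the example above.
    with open(path) as f:
        return json.load(f)

config = load_config("myconfig.json")
print(config["edge_tts_voice"])  # e.g. "en-US-JennyNeural"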

How to start

python wall-e.py --config myconfig.json

Demo

Watch the video

Wall-E Robot Console

Car API Service

Several years ago, I built a smart car with a Raspberry Pi.

It exposes a RESTful API to control the car, which the LLM can call.

Basically, it needs to support the following parameters:

class CarAction:
    SUPPORTED_ACTIONS = ("stop", "left", "right", "up", "down")

    def __init__(self):
        self.action = ""    # one of SUPPORTED_ACTIONS
        self.duration = 0   # how long to apply the action
        self.speed = 0      # motor speed
        self.angle = 0      # steering angle
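
As a hedged sketch, a client call to the service could look like the following, assuming the API accepts a JSON body with these fields at the configured car_api_url (the exact route and schema are assumptions):

import requests

def send_car_action(car_api_url: str, action: str,
                    duration: int = 0, speed: int = 0, angle: int = 0) -> None:
    # Payload fields mirror the CarAction attributes above.
    assert action in ("stop", "left", "right", "up", "down")
    resp = requests.post(
        car_api_url,
        json={"action": action, "duration": duration,
              "speed": speed, "angle": angle},
        timeout=5,
    )
    resp.raise_for_status()

# For example: send_car_action(config["car_api_url"], "right", duration=1, speed=50)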


Troubleshooting

ImportError: cannot import name 'BlipProcessor' from 'transformers' (/Users/i329817/miniconda3/lib/python3.10/site-packages/transformers/__init__.py)

You can fix it by installing transformers from source:

pip install git+https://github.com/huggingface/transformers

Python environment

conda create --name wall-e python=3.10
conda activate wall-e
pip install -r requirements.txt
conda deactivate