Product mission: To prototype and test different arm-prosthetic inputs (EEG, EMG) and different physical mechanisms for arm/hand movements (grasping, pointing, ...), and to make the prosthetic more intuitive, cheaper, and easier to use than currently available prosthetics.
Users: People with physical disabilities, prosthetic researchers, medical institutions
User Stories: As a person with a physical disability who needs assistance from a robotic arm, I would like an arm that can help me grab objects in front of me and let me control how firmly I hold an item. A lower price would be even better.
MVP: An interface and a robotic arm that can open and close based on user input.
MVP User Stories: I, a researcher, will be able to collect input from EMG and/or EEG sensors and translate it into useful information for the development of robotic prosthetics.
Technologies: Prosthetic: 3D Printed arm, servo motors, fishing line
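Since the fingers are tendon-driven (fishing line wound by hobby servos), the control software ultimately has to turn a grip level into a servo pulse width. A minimal sketch of that mapping, assuming the common 1000-2000 µs hobby-servo pulse range; the exact end points, and the function name `grip_to_pulse_us`, are illustrative and would need calibration against the actual servo and rigging:

```python
def grip_to_pulse_us(grip: float, open_us: int = 1000, closed_us: int = 2000) -> int:
    """Map a normalized grip level (0.0 = fully open, 1.0 = fully closed)
    to a servo pulse width in microseconds that winds the tendon."""
    grip = min(max(grip, 0.0), 1.0)  # clamp out-of-range input
    return int(open_us + grip * (closed_us - open_us))
```

Clamping the input keeps a noisy or miscalibrated sensor reading from commanding the servo past its mechanical limits.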
Data Collection: MindFlex sensor, electrodes with different sensitivities and filtering, comparison of EEG vs. EMG, Arduino, Raspberry Pi
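To translate a raw EMG stream into the MVP's open/close command, one common approach is a rectified moving-average envelope followed by a hysteresis threshold. A minimal sketch, assuming samples arrive as zero-centered integers from an Arduino ADC; the window size, thresholds, and function names are made-up values for illustration, not measured parameters:

```python
from collections import deque

def emg_envelope(samples, window=50):
    """Rectified moving-average envelope of a zero-centered EMG stream."""
    buf = deque(maxlen=window)
    envelope = []
    for s in samples:
        buf.append(abs(s))               # rectify
        envelope.append(sum(buf) / len(buf))  # smooth
    return envelope

def grip_commands(envelope, close_thresh=200.0, open_thresh=100.0):
    """Hysteresis: close the hand above close_thresh, reopen only
    once the envelope drops below open_thresh (prevents chatter)."""
    closed = False
    commands = []
    for level in envelope:
        if not closed and level > close_thresh:
            closed = True
        elif closed and level < open_thresh:
            closed = False
        commands.append("CLOSE" if closed else "OPEN")
    return commands
```

Using two thresholds instead of one keeps the hand from rapidly toggling when the muscle signal hovers near a single cutoff.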
Software: MATLAB EEG/EMG packages; many open-source projects on GitHub; code available in C++, Python, and MATLAB
Setup of development environment:
Code management: GitHub
Documentation: GitHub Wiki (preferred)
Issue management: GitHub Issues