
Brain-4ce - An EEG-based Brain-Computer Interface

Brain 4ce Logo
Where thoughts become motion.

Designed by BU ECE 2023 Senior Design Capstone Team 4

Team Photo
From left to right: Brendan Shortall, Mitchell Gilmore, Dayanna De La Torres, Jonathan Mikalov, Alexander Johnson


Table of Contents

  • Abstract
  • Getting Started
      • Current State of the Project
      • Lessons Learned
  • Bug Fixes


Abstract

The ability to manipulate objects using only your mind is no longer a far-fetched idea. While it may sound like something out of science fiction, our team has turned this into a reality. We have created a headset that uses electroencephalogram (EEG) technology to quickly and accurately record a user's neural activity. Our state-of-the-art deep learning algorithm then decodes these signals into predefined motor-imagery tasks.

This breakthrough technology is integrated with a computer game that provides the user with an environment to navigate using their mind. With the Brain-4ce (pronounced "Brain Force") headset, users can perform four different actions: rotating clockwise, rotating counterclockwise, moving forward, and moving backward. Each action corresponds to a physical motion that the user imagines themselves doing.

Our team was inspired to develop this project by our aspiration to bridge the digital divide for those who are physically disabled. With the Brain-4ce headset, users will be able to operate a computer, a wheelchair, and virtually anything else using nothing but their thoughts! The possibilities are endless, and we are excited to see what the future holds.

Getting Started

Current State of the Project

As it currently stands, our project has three major components:

  1. A working EEG data capture headset that interfaces with a Python application to monitor and record brain activity in real time.
  2. A convolutional neural network (CNN), trained on the recorded brain activity, that classifies the motor-imagery tasks performed by a user wearing the headset (see the sketch after this list).
  3. 2D and 3D video games developed in Python to demonstrate the efficacy of the neural network's brain activity classification.
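As a concrete illustration of component 2, here is a minimal sketch of what a motor-imagery CNN can look like in PyTorch. The channel count, window length, layer sizes, and class labels below are assumptions for illustration; this is not the network actually trained for this project.

```python
# Illustrative motor-imagery CNN sketch. The electrode count, window
# length, and layer sizes are assumptions, not the architecture used
# in this repository.
import torch
import torch.nn as nn

N_CHANNELS = 8     # assumed number of EEG electrodes
N_SAMPLES = 250    # assumed samples per classification window
N_CLASSES = 4      # rotate CW, rotate CCW, forward, backward

class MotorImageryCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution along each channel's time series
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # Spatial convolution across all electrodes at once
            nn.Conv2d(16, 32, kernel_size=(N_CHANNELS, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (N_SAMPLES // 4), N_CLASSES)

    def forward(self, x):
        # x: (batch, 1, channels, samples)
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = MotorImageryCNN()
window = torch.randn(1, 1, N_CHANNELS, N_SAMPLES)  # one synthetic EEG window
logits = model(window)
print(logits.argmax(dim=1))  # predicted motor-imagery class
```

The temporal-then-spatial convolution split is a common pattern for EEG classifiers: the first layer learns frequency-like filters on each electrode's signal, and the second learns how to weight the electrodes against one another.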

The software components communicate in real time over sockets, with multithreading keeping data capture, classification, and gameplay running in parallel.
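As a rough sketch of that pattern, the snippet below pushes classifier output to a game client over a localhost TCP socket, with the listener running on its own thread so neither side blocks the other. The port number and the one-byte command protocol are illustrative assumptions, not the protocol this repository actually uses.

```python
# Sketch of the real-time socket link between classifier and game.
# The port and one-byte commands are assumptions for illustration.
import socket
import threading

HOST, PORT = "127.0.0.1", 5555  # assumed localhost endpoint
ready = threading.Event()

def serve_predictions():
    """Classifier side: push each predicted command to the game."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                 # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            for command in (b"F", b"B", b"L", b"R"):  # stand-ins for CNN output
                conn.sendall(command)

server = threading.Thread(target=serve_predictions, daemon=True)
server.start()
ready.wait()

# Game side: read commands as they arrive so gameplay reacts in real time.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    while (data := cli.recv(1)):    # recv returns b"" once the server closes
        print("game received command:", data.decode())
server.join()
```

In the actual system, each game would run the receiving loop on a background thread so rendering never stalls while waiting on the classifier.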

Lessons Learned

  • Developing on Apple Silicon, and ARM-based processors in general, is tricky for this project. Certain libraries, such as the ChipKIT Core library for compiling firmware onto the microcontroller, lack proper support for ARM architecture. We recommend performing all software development on an x64-based Windows machine. Some development is possible on Intel-based macOS and Linux, but it will probably save you a lot of headache to stick to Windows if possible.
  • The microcontroller appears to have USB functionality built-in, which might make the USB to UART conversion unnecessary. This may be something to examine in the documentation during a future continuation of this project.
  • The ADS1299 chip from Texas Instruments remains difficult to obtain as of Q2 2023 due to the ongoing chip shortage. We tried using the ADS1296 chip in its place, but this was unsuccessful because of differing startup and common-mode rejection configurations between the chips.
  • If future boards are manufactured, it may be worth paying to have them machine-assembled. Assembling the boards by hand takes a long time and requires substantial electrical rules checking to ensure there are no shorts or open connections between components.
  • Obtaining good skin contact for all electrodes is crucial to ensure proper reading of brain activity.
  • The OpenBCI Forum and the OpenBCI documentation are great places to turn for troubleshooting help.

Bug Fixes
