Commit 378643c ("some changes")
Dhruva Shaw committed Dec 12, 2024 (1 parent: 92df898)
Showing 2 changed files with 111 additions and 3 deletions.

File: _projects/mcba.md (56 additions, 2 deletions)
title: Mind Controlled Bionic Arm with Sense of Touch
description: Imagine a prosthetic arm that functions like your natural arm. You wear a headband, and when you think about moving the arm, the control signal from your mind reaches the prosthetic, which responds accordingly, just like your real arm!
tags: Bionic Arm Robotics Biotechnology Mind Control Prosthetics
giscus_comments: true
citation: true
img: /assets/img/mcba_logo.jpeg
date: 2024-12-12
featured: true
authors:
name: Lovely Professional University

bibliography: papers.bib

# Optionally, you can add a table of contents to your post.
# NOTES:
toc: true
---

## Abstract

Advancements in bionic technology are transforming the possibilities for restoring hand function in individuals with amputations or paralysis. This paper introduces a cost-effective bionic arm design that leverages mind-controlled functionality and integrates a sense of touch to replicate natural hand movements. The system uses a non-invasive EEG-based control mechanism, enabling users to operate the arm with brain signals that are processed into PWM commands driving the arm's servo motors. Additionally, the design incorporates a touch sensor (tactile feedback) in the gripper, providing sensory feedback that enhances user safety and dexterity.
The proposed bionic arm prioritizes three essential features:
1. Integrated Sensory Feedback: Providing users with a tactile experience that mimics the sense of touch, with signals relayed directly to the brain. This capability is crucial for safe object manipulation and for preventing injuries.
2. Mind-Control Potential: Harnessing EEG signals for seamless, thought-driven operation.
3. Non-Invasive Nature: Ensuring user comfort by avoiding invasive surgical procedures.
This novel approach aims to deliver an intuitive, natural, and efficient solution for restoring complex hand functions.
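As a rough illustration of the final step in this signal path, the sketch below maps a target gripper angle to a standard hobby-servo PWM pulse (1 to 2 ms at 50 Hz). The class-to-angle mapping is an assumption added for illustration, not a detail from the design:

```python
# Hypothetical mapping from a decoded EEG class to a hobby-servo PWM pulse.
# Standard hobby servos expect a 1-2 ms pulse repeated at 50 Hz.
SERVO_MIN_US, SERVO_MAX_US = 1000, 2000

def angle_to_pulse_us(angle_deg):
    """Map a 0-180 degree servo angle to a PWM pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    return SERVO_MIN_US + (SERVO_MAX_US - SERVO_MIN_US) * angle_deg / 180.0

# Assumed, illustrative mapping: a "close left hand" prediction fully
# closes the gripper, "Rest" opens it.
GRIP_ANGLES = {"CLH": 180.0, "Rest": 0.0}
pulse_for_close = angle_to_pulse_us(GRIP_ANGLES["CLH"])  # 2000.0 us
```

On real hardware this pulse width would be fed to a PWM peripheral or servo driver board; the function above only captures the arithmetic.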

---

## Methodology
### 1. Data Collection and Dataset Overview
The model development utilized a publicly available EEG dataset comprising data from 60 volunteers performing 8 distinct activities [3]. The dataset includes a total of 8,680 four-second EEG recordings, collected using 16 dry electrodes configured according to the international 10-10 system [3].
• Electrode Configuration: Monopolar configuration, where each electrode's potential was measured relative to neutral electrodes placed on both earlobes (ground references).
• Signal Sampling: EEG signals were sampled at 125 Hz and preprocessed using:
- A bandpass filter (5–50 Hz) to isolate relevant frequencies [3].
- A notch filter (60 Hz) to remove powerline interference [3].
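The band-pass and notch stages above can be sketched in Python with SciPy. The dataset's exact filter implementations are not specified here, so the filter order and notch quality factor are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 125  # EEG sampling rate in Hz, as in the dataset

def preprocess_eeg(signal, fs=FS):
    """Band-pass 5-50 Hz, then notch out 60 Hz powerline interference."""
    # 4th-order Butterworth band-pass (order is an assumption)
    b, a = butter(4, [5, 50], btype="bandpass", fs=fs)
    x = filtfilt(b, a, signal)
    # 60 Hz notch; Q=30 gives a narrow stopband (Q is an assumption)
    bn, an = iirnotch(60, Q=30, fs=fs)
    return filtfilt(bn, an, x)
```

Note that 60 Hz sits just below the 62.5 Hz Nyquist limit at this sampling rate, so both the band-pass roll-off and the notch contribute to suppressing it.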

### 2. Data Preprocessing
The dataset, originally provided in CSV format, underwent a comprehensive preprocessing workflow:
• The data was split into individual CSV files for each of the 16 channels, resulting in an increase from 74,441 files to 1,191,056 files.
• Each individual channel's EEG data was converted into audio signals and saved in .wav format, allowing the brain signals to be audibly analyzed.
• The entire preprocessing workflow was implemented in Python to ensure scalability and accuracy.
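A minimal sketch of the per-channel audio conversion, assuming the common approach of peak-normalizing each channel to 16-bit PCM before writing a mono WAV at the 125 Hz EEG sampling rate (the original pipeline's exact scaling is not stated, so the normalization is an assumption):

```python
import wave
import numpy as np

def eeg_channel_to_wav(samples, path, fs=125):
    """Scale one EEG channel to int16 PCM and write it as a mono WAV file."""
    s = np.asarray(samples, dtype=np.float64)
    peak = np.max(np.abs(s)) or 1.0       # avoid division by zero on flat signals
    pcm = (s / peak * 32767).astype(np.int16)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)                  # mono: one electrode per file
        w.setsampwidth(2)                  # 2 bytes = int16
        w.setframerate(fs)                 # EEG sampling rate, 125 Hz
        w.writeframes(pcm.tobytes())
```

Each four-second recording then becomes a 500-frame WAV file per channel, which is what makes the signals "audibly analyzable" and compatible with audio tooling downstream.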
The dataset captured brainwave signals corresponding to the following activities:
1) BEO (Baseline with Eyes Open): One-time recording at the beginning of each run [3].
2) CLH (Closing Left Hand): Five recordings per run [3].
3) CRH (Closing Right Hand): Five recordings per run [3].
4) DLF (Dorsal Flexion of Left Foot): Five recordings per run [3].
5) PLF (Plantar Flexion of Left Foot): Five recordings per run [3].
6) DRF (Dorsal Flexion of Right Foot): Five recordings per run [3].
7) PRF (Plantar Flexion of Right Foot): Five recordings per run [3].
8) Rest: Recorded between each task to capture the resting state [3] [4].

### 3. Feature Extraction and Classification
Feature extraction and activity classification were performed using transfer learning with YamNet [5], a deep neural network model.
• Audio Representation: Audio files were imported into MATLAB using an Audio Datastore [6]. Mel-spectrograms, a time-frequency representation of the audio signals, were extracted using the yamnetPreprocess function [7] [8].
• Dataset Split: The data was divided into training (70%), validation (20%), and testing (10%) sets.
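The 70/20/10 split can be sketched as a shuffled index partition. The actual split tooling (e.g. MATLAB datastore utilities) is not shown in the text, so this is an illustrative Python version:

```python
import numpy as np

def split_indices(n, seed=0):
    """Shuffle n sample indices and partition them 70/20/10 into train/val/test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_train = round(0.7 * n)
    n_val = round(0.2 * n)
    return (idx[:n_train],
            idx[n_train:n_train + n_val],
            idx[n_train + n_val:])

# For the 8,680 recordings in the dataset:
train, val, test = split_indices(8680)
```

A fixed seed keeps the partition reproducible across runs, which matters when comparing training configurations.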
Transfer Learning with YamNet [5] [8]:
- The pre-trained YamNet model (86 layers) was adapted for an 8-class classification task:
-> The initial layers of YamNet [5] were frozen to retain previously learned representations [8].
-> A new classification layer was added to the model [8].
- Training details:
-> Learning Rate: Initial rate of 3e-4, with an exponential learning rate decay schedule [8].
-> Mini-Batch Size: 128 samples per batch.
-> Validation: Performed every 651 iterations.
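The pipeline itself is in MATLAB, but the core transfer-learning idea (frozen pretrained layers feeding a newly trained classification head, with a 3e-4 exponentially decaying learning rate) can be illustrated with a toy NumPy version. The random "frozen trunk" and synthetic batch below stand in for YamNet and the real spectrogram data; none of the toy dimensions come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the frozen YamNet trunk: a fixed random projection + ReLU.
# Its weights are never updated, mirroring the frozen initial layers.
W_frozen = rng.normal(size=(64, 32))

def features(x):
    return np.maximum(x @ W_frozen, 0.0)

# New trainable classification head for the 8 activity classes.
n_classes = 8
W_head = np.zeros((32, n_classes))
b_head = np.zeros(n_classes)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# One synthetic mini-batch of 128 samples (matching the mini-batch size).
X = rng.normal(size=(128, 64))
y = rng.integers(0, n_classes, size=128)
Y = np.eye(n_classes)[y]

losses = []
for step in range(200):
    lr = 3e-4 * 0.99 ** step               # exponential learning-rate decay
    F = features(X)                        # frozen trunk: forward pass only
    P = softmax(F @ W_head + b_head)
    losses.append(-np.log(P[np.arange(128), y] + 1e-12).mean())
    G = (P - Y) / 128                      # cross-entropy gradient wrt logits
    W_head -= lr * F.T @ G                 # only the new head is updated
    b_head -= lr * G.sum(axis=0)
```

Only `W_head` and `b_head` receive gradient updates; the trunk acts purely as a fixed feature extractor, which is the essence of the frozen-layer strategy described above.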

### 4. Robotic Arm Design and Simulation
A 3-Degree-of-Freedom (DOF) robotic arm was designed using MATLAB Simulink and Simscape toolboxes. To ensure robust validation:
• A virtual environment was developed in Simulink, simulating the interactions between the trained AI models and the robotic arm.
• The simulations served as a testbed to evaluate the system's performance before real-world integration.
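As a minimal sketch of the kinematics such a Simulink model computes, assuming a planar arm with illustrative link lengths (the real arm's geometry is not given here), the forward kinematics of a 3-DOF chain can be written as:

```python
import numpy as np

def fk_planar_3dof(thetas, lengths=(0.10, 0.10, 0.08)):
    """Forward kinematics of a planar 3-DOF arm.

    thetas: three joint angles in radians; lengths: link lengths in meters
    (illustrative values). Returns the base, joint, and end-effector positions.
    """
    pts = [np.zeros(2)]
    angle, pos = 0.0, np.zeros(2)
    for theta, length in zip(thetas, lengths):
        angle += theta                         # joint angles accumulate along the chain
        pos = pos + length * np.array([np.cos(angle), np.sin(angle)])
        pts.append(pos)
    return np.array(pts)                       # shape (4, 2)
```

In the virtual testbed, the classifier's predicted activity would drive the joint angles, and positions like these let the simulation check reachability and motion before any hardware exists.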

### 5. Project Progress and Future Directions
Completed Tasks:
1. AI Model Development: Successfully trained models to classify human activities based on EEG signals.
2. Robotic Arm Design: Designed a functional 3-DOF robotic arm with simulated controls.
3. Virtual Simulation: Validated AI-robotic arm interactions in a virtual environment.

Future Directions:
1. Hardware Integration: Implement the developed AI models into physical robotic hardware for real-world testing.
2. Real-Time EEG Acquisition: Develop a system for real-time EEG data acquisition and activity classification.
3. Tactile Feedback System: Integrate tactile sensors with the robotic arm for real-world sensory feedback, complemented by Simulink-based simulations.
File: assets/bibliography/papers.bib (55 additions, 1 deletion)
---
---
@misc{nprnews,
url={https://www.npr.org/sections/health-shots/2021/05/20/998725924/a-sense-of-touch-boosts-speed-accuracy-of-mind-controlled-robotic-arm},
journal={NPR},
year={2021},
month={May},
title={Scientists Bring The Sense Of Touch To A Robotic Arm}
}

@misc{transferlearning_matlab,
url={https://in.mathworks.com/help/audio/ug/transfer-learning-with-pretrained-audio-networks.html},
journal={Mathworks.com},
year={2024},
title={Transfer Learning with Pretrained Audio Networks}
}

@misc{classify_sounds_using_yamnet_2021,
url={https://in.mathworks.com/help/audio/ref/yamnetpreprocess.html},
journal={Mathworks.com},
year={2021},
title={yamnetPreprocess}
}

@misc{audio_datastore,
url={https://in.mathworks.com/help/audio/ref/audiodatastore.html},
journal={Mathworks.com},
year={2021},
title={audioDatastore}
}

@misc{yamnet_github,
url={https://github.com/tensorflow/models/tree/master/research/audioset/yamnet},
journal={GitHub},
year={2024},
title={YAMNet},
author={Google and Ellis, Dan and Plakal, Manoj}
}

@article{asanza_2023,
title={MILimbEEG: An EEG Signals Dataset based on Upper and Lower Limb Task During the Execution of Motor and Motorimagery Tasks},
volume={2},
url={https://data.mendeley.com/datasets/x8psbz3f6x/2},
DOI={https://doi.org/10.17632/x8psbz3f6x.2},
journal={Mendeley Data},
author={Asanza, Victor and Montoya, Daniel and Lorente-Leyva, Leandro Leonardo and Peluffo-Ordóñez, Diego Hernán and González, Kléber},
year={2023},
month={July}
}

@misc{https://doi.org/10.5524/100295,
doi = {10.5524/100295},
url = {http://gigadb.org/dataset/100295},
author = {Cho, Hohyun and Ahn, Minkyu and Ahn, Sangtae and Kwon, Moonyoung and Jun, Sung Chan},
keywords = {ElectroEncephaloGraphy (EEG), motor imagery, brain computer interface, performance variation, subject-to-subject transfer},
language = {en},
title = {Supporting data for "EEG datasets for motor imagery brain computer interface"},
publisher = {GigaScience Database},
year = {2017},
copyright = {CC0 1.0 Universal}
}
