CONTROLLING A NON-INVASIVE UPPER-LIMB PROSTHETIC DEVICE VIA A MACHINE-LEARNING-ASSISTED BRAIN-COMPUTER INTERFACE

Abstract

The goal of our research was to improve the accessibility of current upper-limb prostheses. We aimed to maintain the non-invasive nature of electroencephalography (EEG), use affordable materials and resources, and match the accuracy and control of conventional prostheses while improving training methods. We began by designing a method of data collection that uses a 3D-printed headset with dry electrodes to record brain signal data through EEG software. We then analyzed the signals, applied preprocessing to reduce noise, and used machine learning (ML) models to classify EEG signals with respect to specific actions such as the opening and closing of a hand. Finally, we constructed a 3D-printed hand, actuated by servos driven by an Arduino, to demonstrate the physical actions interpreted through analysis, and we leveraged novel techniques to build a virtual reality (VR) environment to serve as a tool for prosthetic rehabilitation. We successfully met the goals set for data collection and prosthetic arm actuation. Additionally, we created a functional action-prediction algorithm, though it did not reach the desired accuracy. Overall, we achieved our primary goal of collecting brain signal data, analyzing that data through an algorithm, and actuating a prosthetic arm with actions interpreted from the brain signals, all in real time. Moving forward, there is room to increase the accessibility and quality of prostheses through further development of non-invasive brain-computer interface (BCI) based technology for 3D-printed prostheses and VR prosthetic rehabilitation environments.
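As a rough illustration of the pipeline summarized above (filter the raw EEG, extract features, classify the intended action, and forward the result to the actuator), the Python sketch below trains a simple classifier on synthetic data. The 250 Hz sampling rate, the 8-30 Hz band-pass, the log band-power features, and the linear discriminant classifier are all illustrative assumptions; the abstract does not specify which preprocessing steps or ML models were used.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 250  # assumed EEG sampling rate in Hz

    def preprocess(window):
        # Band-pass 8-30 Hz (mu/beta rhythms) to suppress slow drift
        # and high-frequency noise; a stand-in for the paper's preprocessing
        b, a = butter(4, [8 / (FS / 2), 30 / (FS / 2)], btype="band")
        return filtfilt(b, a, window, axis=1)

    def features(window):
        # Log band power per channel: a simple, widely used motor-imagery feature
        return np.log(np.var(window, axis=1))

    # Synthetic stand-in data: 40 one-second windows from 8 channels,
    # labeled 0 = hand open, 1 = hand close (real data would come from the headset)
    rng = np.random.default_rng(0)
    windows = rng.standard_normal((40, 8, FS))
    labels = rng.integers(0, 2, size=40)

    X = np.array([features(preprocess(w)) for w in windows])
    clf = LinearDiscriminantAnalysis().fit(X, labels)

    # In the real-time loop, each incoming window would be classified the same
    # way and the predicted action sent to the Arduino (e.g., over a serial
    # link) to drive the hand's servos; here we just print the decision.
    new_window = rng.standard_normal((8, FS))
    action = clf.predict([features(preprocess(new_window))])[0]
    print("close hand" if action == 1 else "open hand")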
