Analysis of Motor Imagery Data from EEG Device to Move Prosthetic Hands by using Deep Learning Classification

Agung Shamsuddin Saragih, Hadyan Nasran Basyiri, Muhammad Yusuf Raihan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Controlling an artificial hand with the mind is a dream for many people who have lost a limb. Brain-Computer Interface (BCI) technology is expected to make this possible by conveying commands and responses to and from the brain as information in a control system. However, the complexity of the EEG signal makes this difficult to realize. A deep learning-based classification model is expected to provide a solution by classifying the hand movements imagined by the user as input to an electric artificial hand control system. The main aim of this study is to classify EEG signals from the human brain in real time, using a non-invasive EEG headset, for two different hand operations: rest and grip. The OpenBCI Ultracortex Mark IV headset was used in this study. This study proposes a solution for classifying rest and grip hand movements by exploiting a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN) to learn the electroencephalogram (EEG) time-series information. EEG signals were recorded from one healthy subject at specific scalp locations: F3, Fz, F4, FC1, FC2, C3, Cz, C4. A wide range of time-domain features was extracted from the EEG signals and used to train an LSTM and a CNN to perform the classification task. The headset also captures artefacts such as limb movement, heartbeat, eye blinks, and more. Raw EEG from the headset was processed for event detection and filtered with a Butterworth bandpass filter to separate the signal data into new datasets containing the alpha band, the beta band, and both ranges. The results of this study indicate that the CNN-based model achieves an accuracy of up to 95.45% in classifying the two types of hand movement, while the LSTM-based model achieves 93.64%.
Detected events were then used to trigger control signals to a microcontroller-driven prosthetic hand.
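The preprocessing step described in the abstract — Butterworth bandpass filtering of raw EEG into alpha-band, beta-band, and combined datasets — can be sketched as follows. The filter order, sampling rate, and exact band edges are illustrative assumptions; the abstract does not specify the values the authors used.

```python
# Hedged sketch of the Butterworth bandpass step from the abstract:
# isolating assumed alpha (8-13 Hz) and beta (13-30 Hz) EEG bands.
# Filter order and sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(data, low_hz, high_hz, fs, order=4):
    """Zero-phase Butterworth bandpass filter over the last axis."""
    b, a = butter(order, [low_hz, high_hz], btype="band", fs=fs)
    return filtfilt(b, a, data)

fs = 250  # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# Synthetic single channel: a 10 Hz "alpha" component plus 50 Hz mains noise
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

alpha = bandpass(raw, 8, 13, fs)   # alpha-band dataset
beta = bandpass(raw, 13, 30, fs)   # beta-band dataset
both = bandpass(raw, 8, 30, fs)    # combined alpha+beta dataset
```

Zero-phase filtering (`filtfilt`) is used here so the band-limited signal stays time-aligned with the raw recording, which matters when the filtered data is later segmented into labeled windows for the classifiers.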

Original language: English
Title of host publication: 6th Biomedical Engineering's Recent Progress in Biomaterials, Drugs Development, and Medical Devices
Subtitle of host publication: Proceedings of the 6th International Symposium of Biomedical Engineering, ISBE 2021
Editors: Siti Fauziyah Rahman, Ahmad Zakiyuddin, Yudan Whulanza, Nurul Intan
Publisher: American Institute of Physics Inc.
ISBN (Electronic): 9780735443716
Publication status: Published - 16 Aug 2022
Event: 6th International Symposium of Biomedical Engineering's Recent Progress in Biomaterials, Drugs Development, and Medical Devices, ISBE 2021 - Depok, Virtual, Indonesia
Duration: 7 Jul 2021 - 8 Jul 2021

Publication series

Name: AIP Conference Proceedings
Volume: 2537
ISSN (Print): 0094-243X
ISSN (Electronic): 1551-7616

Conference

Conference: 6th International Symposium of Biomedical Engineering's Recent Progress in Biomaterials, Drugs Development, and Medical Devices, ISBE 2021
Country/Territory: Indonesia
City: Depok, Virtual
Period: 7/07/21 - 8/07/21

Keywords

  • Brain-Computer Interface (BCI)
  • Convolutional Neural Network (CNN)
  • electroencephalography (EEG)
  • Long Short-Term Memory (LSTM)
  • prosthetic hand control
