TY - GEN
T1 - Analysis of Motor Imagery Data from EEG Device to Move Prosthetic Hands by using Deep Learning Classification
AU - Saragih, Agung Shamsuddin
AU - Basyiri, Hadyan Nasran
AU - Raihan, Muhammad Yusuf
N1 - Funding Information:
This initial research work was supported by the Faculty of Engineering Universitas Indonesia, Direktorat Riset dan Pengabdian Masyarakat (DRPM) Universitas Indonesia, under PUTI Prosiding 2020 Program No. NKB-3740/UN2.RST/HKP.05.00/2020.
Publisher Copyright:
© 2022 American Institute of Physics Inc. All rights reserved.
PY - 2022/8/16
Y1 - 2022/8/16
N2 - Controlling an artificial hand with the mind is a dream for many people who have lost their limbs. Brain-Computer Interface (BCI) technology is expected to make this possible by conveying commands and responses to and from the brain as information in a control system. However, the complexity of the EEG signal makes this a challenge to realize. A deep learning-based classification model is proposed as a solution for classifying the hand movements imagined by the user as an input to an electric artificial hand control system. The main aim of this study is to classify EEG signals from the human brain in real time using a non-invasive EEG headset for two different hand operations: rest and grip. The OpenBCI Ultracortex Mark IV headset was used in this study. This study proposes a solution for the classification of rest and grip hand movements by exploiting a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN) to learn the electroencephalogram (EEG) time-series information. EEG signals were recorded from 1 healthy subject at specific locations on the scalp, at points F3, Fz, F4, FC1, FC2, C3, Cz, C3. A wide range of time-domain features were extracted from the EEG signals and used to train an LSTM and a CNN to perform the classification task. The headset captures brain waves that include artefacts such as limb movement, heartbeat, eye blinks, and more. Raw EEG from the headset was processed for event detection and filtered using Butterworth bandpass filtering to separate the signal data into new datasets containing the alpha band, the beta band, and both ranges. The results of this study indicate that the classification model using the CNN technique achieves an accuracy of up to 95.45% for the classification of the two types of hand movements, while the LSTM technique achieves an accuracy of 93.64%. Detected events were then used to trigger control signals to a prosthetic hand controlled by a microcontroller.
AB - Controlling an artificial hand with the mind is a dream for many people who have lost their limbs. Brain-Computer Interface (BCI) technology is expected to make this possible by conveying commands and responses to and from the brain as information in a control system. However, the complexity of the EEG signal makes this a challenge to realize. A deep learning-based classification model is proposed as a solution for classifying the hand movements imagined by the user as an input to an electric artificial hand control system. The main aim of this study is to classify EEG signals from the human brain in real time using a non-invasive EEG headset for two different hand operations: rest and grip. The OpenBCI Ultracortex Mark IV headset was used in this study. This study proposes a solution for the classification of rest and grip hand movements by exploiting a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN) to learn the electroencephalogram (EEG) time-series information. EEG signals were recorded from 1 healthy subject at specific locations on the scalp, at points F3, Fz, F4, FC1, FC2, C3, Cz, C3. A wide range of time-domain features were extracted from the EEG signals and used to train an LSTM and a CNN to perform the classification task. The headset captures brain waves that include artefacts such as limb movement, heartbeat, eye blinks, and more. Raw EEG from the headset was processed for event detection and filtered using Butterworth bandpass filtering to separate the signal data into new datasets containing the alpha band, the beta band, and both ranges. The results of this study indicate that the classification model using the CNN technique achieves an accuracy of up to 95.45% for the classification of the two types of hand movements, while the LSTM technique achieves an accuracy of 93.64%. Detected events were then used to trigger control signals to a prosthetic hand controlled by a microcontroller.
KW - Brain-Computer Interface (BCI)
KW - Convolutional Neural Network (CNN)
KW - electroencephalography (EEG)
KW - Long Short-Term Memory (LSTM)
KW - prosthetic hand control
UR - http://www.scopus.com/inward/record.url?scp=85138276708&partnerID=8YFLogxK
U2 - 10.1063/5.0098178
DO - 10.1063/5.0098178
M3 - Conference contribution
AN - SCOPUS:85138276708
T3 - AIP Conference Proceedings
BT - 6th Biomedical Engineering's Recent Progress in Biomaterials, Drugs Development, and Medical Devices
A2 - Rahman, Siti Fauziyah
A2 - Zakiyuddin, Ahmad
A2 - Whulanza, Yudan
A2 - Intan, Nurul
PB - American Institute of Physics Inc.
T2 - 6th International Symposium of Biomedical Engineering's Recent Progress in Biomaterials, Drugs Development, and Medical Devices, ISBE 2021
Y2 - 7 July 2021 through 8 July 2021
ER -