Title: IMPLEMENTATION OF MULTIMODAL INTERFACE FOR HUMAN–COMPUTER INTERACTION SYSTEM TO CONTROL A ROBOTIC WHEELCHAIR

Authors: Elena Petrunina; Elmin Bayramov; Denis Pecherskij
DOI: 10.5593/sgem2023/2.1
ISSN: 1314-2704
Language: English
Volume: 23
Issue: 2.1
Editors:
• Prof. DSc. Oleksandr Trofymchuk, UKRAINE
• Prof. Dr. hab. oec. Baiba Rivza, LATVIA
Research on new robot control systems is currently underway, driven by the need for practical, user-friendly means of transport adapted to people with physical disabilities. Such devices allow users to operate external equipment through EEG recordings of brain activity and through eye movements.
People with disabilities can often still use their gaze, at least partially, to control and communicate with an assistive device. However, eye-tracker-based interface technology faces the problem of involuntary eye movements that lead to unwanted item selection (the Midas-Touch problem). This issue can be addressed by developing multimodal and hybrid control interfaces.
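To illustrate how a multimodal interface can suppress the Midas-Touch problem, the following minimal Python sketch gates a gaze-dwell selection on a separate confirmation signal. This is not the authors' implementation: the dwell time, the confirmation threshold, and the select_command helper are illustrative assumptions.

# Hypothetical sketch: gaze dwell nominates a command, but it is only
# issued after a second modality (e.g. a BCI intention classifier)
# confirms it. All names and thresholds are illustrative assumptions.
DWELL_SECONDS = 1.0          # gaze must rest on a target this long
CONFIRM_THRESHOLD = 0.8      # required confidence of the BCI confirmation

def select_command(gaze_samples, bci_confidence):
    """gaze_samples: list of (timestamp, target_id) fixations, oldest first;
    bci_confidence: probability that the user intends the selection."""
    if not gaze_samples:
        return None
    target = gaze_samples[-1][1]
    # Dwell check: how long has gaze stayed on the same target?
    dwell_start = gaze_samples[-1][0]
    for t, tid in reversed(gaze_samples):
        if tid != target:
            break
        dwell_start = t
    dwell = gaze_samples[-1][0] - dwell_start
    # Issue the command only when both modalities agree.
    if dwell >= DWELL_SECONDS and bci_confidence >= CONFIRM_THRESHOLD:
        return target
    return None

# Example: 1.2 s of fixation on "forward", confirmed by the BCI.
samples = [(0.0, "forward"), (0.6, "forward"), (1.2, "forward")]
print(select_command(samples, bci_confidence=0.9))  # "forward"

A brief glance at a target, or a dwell without BCI confirmation, returns None, so involuntary fixations do not trigger wheelchair commands.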
Brain-computer interfaces translate patterns of brain activity, including recognized motor imagery patterns, into commands for controlling interactive applications. This study therefore proposes a multimodal architecture for a gaze-controlled wheelchair for people with mobility impairments, in which gaze provides the control signal and a brain-computer interface confirms the user's intention.
In this study, a hybrid model was proposed to classify EEG motor imagery signals together with eye-tracker signals for the control system, using a neural network architecture consisting of a pre-trained convolutional neural network and a gated recurrent unit. The performance of the adopted approach is evaluated on a multiclass motor imagery dataset and the corresponding gaze swipes, and the resulting classification performance is reported.
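A minimal sketch of what such a hybrid classifier could look like, assuming a PyTorch implementation, input tensors of shape (batch, 1, electrodes, time samples), and four imagery classes. The paper does not specify its pre-trained CNN, so a generic temporal-spatial convolutional front end stands in for it here; all layer sizes are assumptions.

# Hypothetical CNN + GRU classifier for multiclass motor imagery EEG
# (a stand-in for the paper's unspecified architecture).
import torch
import torch.nn as nn

class CNNGRUClassifier(nn.Module):
    def __init__(self, n_channels=22, n_classes=4, hidden=64):
        super().__init__()
        # Convolutional front end: temporal filtering, then a spatial
        # convolution across electrodes - a common layout for EEG encoders.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
        )
        # The GRU consumes the CNN feature map as a time sequence.
        self.gru = nn.GRU(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                  # x: (batch, 1, channels, time)
        f = self.cnn(x)                    # (batch, 32, 1, time')
        f = f.squeeze(2).permute(0, 2, 1)  # (batch, time', 32)
        _, h = self.gru(f)                 # final hidden state: (1, batch, hidden)
        return self.head(h[-1])            # class logits

# Example: a batch of 8 trials, 22 electrodes, 500 time samples.
logits = CNNGRUClassifier()(torch.randn(8, 1, 22, 500))
print(logits.shape)  # torch.Size([8, 4])

The design choice this illustrates: the CNN learns per-window spectral-spatial features, while the GRU models how those features evolve over the trial before the final classification.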
Type: conference
Proceedings: Proceedings of 23rd International Multidisciplinary Scientific GeoConference SGEM 2023
Conference: 23rd International Multidisciplinary Scientific GeoConference SGEM 2023, 03 - 09 July, 2023
Document type: Proceedings Paper
Publisher: STEF92 Technology
Series: International Multidisciplinary Scientific GeoConference SGEM
Sponsors: SWS Scholarly Society; Acad Sci Czech Republ; Latvian Acad Sci; Polish Acad Sci; Russian Acad Sci; Serbian Acad Sci and Arts; Natl Acad Sci Ukraine; Natl Acad Sci Armenia; Sci Council Japan; European Acad Sci, Arts and Letters; Acad Fine Arts Zagreb Croatia; Croatian Acad Sci and Arts; Acad Sci Moldova; Montenegrin Acad Sci and Arts; Georgian Acad Sci; Acad Fine Arts and Design Bratislava; Russian Acad Arts; Turkish Acad Sci.
Pages: 81-88
Conference dates: 03 - 09 July, 2023
website
9091
Keywords: multimodal interface, brain-computer interface, human–computer interaction, people with disabilities