Hybrid Brain Computer Interface for Robot Control
2pm
Room 5562 (Lifts 27-28), 5/F Academic Building, HKUST

Supporting the following United Nations Sustainable Development Goals:

Examination Committee

Prof Jianan QU, ECE/HKUST (Chairperson)
Prof Bertram E SHI, ECE/HKUST (Thesis Supervisor)
Prof Weichuan YU, ECE/HKUST


Abstract

Brain Computer Interfaces (BCIs) use information about brain activity for communication and control applications. Pure BCIs often suffer from inaccurate control, long training periods, and other shortcomings. Hybrid systems that incorporate additional sensing modalities for better robot control have been studied and developed to address these problems.
 
We designed a hybrid BCI system that gathers neural, behavioral and contextual cues and combines them according to a statistical model. The different cues are integrated to predict the subject’s intention. Using environmental information extracted by a camera-robot system, the decision derived by the hybrid BCI is used to control a robot to perform goal-oriented tasks.
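As a rough illustration of this kind of cue integration (a minimal sketch only; the command set, cue distributions, and conditional-independence assumption here are hypothetical, not taken from the thesis), a Bayesian fusion step might combine a context-derived prior with per-cue likelihoods over candidate motion commands:

```python
import numpy as np

# Hypothetical candidate low-level motion commands (illustrative only).
COMMANDS = ["left", "right", "forward", "stop"]

def fuse_cues(prior, neural_lik, gaze_lik):
    """Combine a contextual prior with neural and gaze likelihoods.

    Assumes the cues are conditionally independent given the intended
    command, so the posterior is proportional to their product.
    """
    posterior = np.asarray(prior) * np.asarray(neural_lik) * np.asarray(gaze_lik)
    return posterior / posterior.sum()  # normalize to a probability distribution

# Example: context slightly favors "forward", the neural decoder is noisy,
# but gaze strongly suggests "left" (all numbers are made up).
prior      = [0.2, 0.2, 0.4, 0.2]
neural_lik = [0.3, 0.25, 0.25, 0.2]
gaze_lik   = [0.7, 0.1, 0.15, 0.05]

posterior = fuse_cues(prior, neural_lik, gaze_lik)
best = COMMANDS[int(np.argmax(posterior))]
```

Under these made-up numbers, the strong gaze evidence outweighs the contextual prior and the fused decision favors "left"; the actual model in the thesis may weight and structure the cues differently.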
 
Experiments are performed both in a virtual test environment and with a real robot. The hybrid model improves task completion accuracy, reduces task completion time, and yields better predictions of the subject’s intent. In comparison to other hybrid BCI systems, where gaze information is used to select high-level control tasks, in our system gaze information modulates the probability that different low-level motion commands will be selected. This enables subjects to exert more precise control over the motion trajectory.
 

Keywords: Hybrid Brain Computer Interface, contextual cues, Bayesian inference, intent

Speakers / Performers:
Mr Xujiong DONG
Language
English