
Brain-Machine Interface (Brain-Computer Interface)

  • A low-cost EEG headset-based Hybrid BCI for real-time humanoid navigation / exploration and recognition

This work describes a hybrid brain-computer interface (BCI) technique that combines the P300 potential, the steady-state visually evoked potential (SSVEP), and event-related desynchronization (ERD) to solve a complex multi-task problem consisting of humanoid robot navigation and control together with object recognition. Furthermore, the technique is implemented using a low-cost BCI system together with existing BCI algorithms, a combination that promotes the feasibility of real-world applications. Our approach enables subjects to control the navigation and exploration of a humanoid robot and to recognize a desired object among candidates. This study aims to demonstrate the possibility of using a hybrid BCI based on a low-cost system to perform a realistic and complex task. It also aims to show that a simple image processing technique, combined with BCI, can further simplify such complex tasks. An important implication for future work is that hybridizing simple BCI protocols provides extended controllability to carry out complicated tasks even with a low-cost system.
Controlling by thought alone: 'the avatar era' (MBC Newsdesk) [LINK]

    Related publications
    1. B Choi, S Jo, A low-cost EEG system-based hybrid brain-computer interface for humanoid robot navigation and recognition, PLoS ONE 8(9), 2013. [LINK]
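The hybrid scheme above can be illustrated with a minimal sketch. The exact protocol-to-command mapping is an assumption for illustration (the paper's actual assignment of P300, SSVEP, and ERD to sub-tasks may differ); classifier outputs are assumed to arrive as already-decoded labels.

```python
# Hypothetical fusion of three BCI protocol outputs into humanoid commands.
# Assumed roles: ERD toggles between navigation and recognition modes,
# SSVEP steers the robot, and P300 selects an object among candidates.

def hybrid_bci_command(erd_active, ssvep_target, p300_choice, mode):
    """Translate per-protocol classifier outputs into one robot command.

    erd_active:   bool, motor-imagery (ERD) detection toggles the task mode
    ssvep_target: 'left', 'right', 'forward', or None (flicker-frequency class)
    p300_choice:  index of the attended object candidate, or None
    mode:         current mode, 'navigate' or 'recognize'
    Returns (new_mode, command) where command is a (verb, argument) pair.
    """
    if erd_active:  # an ERD event switches between navigation and recognition
        mode = 'recognize' if mode == 'navigate' else 'navigate'
        return mode, ('switch_mode', mode)
    if mode == 'navigate' and ssvep_target is not None:
        return mode, ('move', ssvep_target)          # SSVEP class steers
    if mode == 'recognize' and p300_choice is not None:
        return mode, ('select_object', p300_choice)  # P300 picks a candidate
    return mode, ('idle', None)
```

Separating mode switching from within-mode selection is one way a small set of binary or few-class BCI protocols can be combined to cover a larger command space.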

  • Brain-actuated humanoid

The brain-computer interface (BCI) is a novel control interface that translates human intentions into appropriate motion commands for robotic systems. The aim of this study is to apply an asynchronous direct-control system for humanoid robot navigation using an electroencephalography (EEG)-based active BCI. The experimental procedure consists of offline training, online feedback testing, and real-time control sessions. Amplitude features are extracted from the EEG using power spectral analysis, and informative feature components are selected based on the Fisher ratio. Two classifiers are hierarchically structured to identify human intentions and trained to build an asynchronous BCI system. For the performance test, five healthy subjects used their EEG to steer a humanoid robot to a target goal in an indoor maze, based on real-time images from a camera on the head of the robot. The experimental results showed that the subjects successfully controlled the humanoid robot in the indoor maze and reached the goal using the proposed asynchronous EEG-based active BCI system.

    Related publications
    1. Y Chae, J Jeong, S Jo, Toward Brain-Actuated Humanoid Robots: Asynchronous Direct-Control Using An EEG-Based BCI, IEEE Transactions on Robotics, 28(5):1131-1144, 2012. [LINK] [PDF]
2. Y Chae, J Jeong, S Jo, Noninvasive brain-computer interface-based control of humanoid navigation, Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2011. [LINK] [PDF]
3. Y Chae, S Jo, J Jeong, Brain-Actuated Humanoid Robot Navigation Control Using Asynchronous Brain-Computer Interface, 5th International IEEE EMBS Conference on Neural Engineering, 2011. [PDF]
4. B-G Shin, T Kim, S Jo, Non-invasive brain signal interface for a wheelchair navigation, ICCAS, 2010. [PDF]
5. J Park, K-E Kim, S Jo, A POMDP approach to P300-based brain-computer interfaces, Proc. of the 14th International Conference on Intelligent User Interfaces (IUI), 2010. [PDF]
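The feature-extraction steps named above (power spectral features, Fisher-ratio feature selection) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the FFT-based band-power estimate, the frequency bands, and the data shapes are all assumptions.

```python
import numpy as np

def bandpower(signal, fs, band):
    """Average spectral power of a 1-D signal within `band` (Hz),
    estimated from the squared magnitude of the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def fisher_ratio(features, labels):
    """Fisher ratio per feature for a two-class problem:
    (mean0 - mean1)^2 / (var0 + var1). Higher = more discriminative.

    features: (n_trials, n_features) array; labels: array of 0s and 1s.
    """
    c0 = features[labels == 0]
    c1 = features[labels == 1]
    num = (c0.mean(axis=0) - c1.mean(axis=0)) ** 2
    den = c0.var(axis=0) + c1.var(axis=0)
    return num / den

def select_features(features, labels, k):
    """Indices of the k features with the largest Fisher ratio."""
    return np.argsort(fisher_ratio(features, labels))[::-1][:k]
```

In a pipeline of this shape, each trial's feature vector would hold band powers per channel and frequency band, and the selected components would feed the hierarchical classifiers (e.g., an idle-versus-intent stage followed by an intent-discrimination stage).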

  • Hybrid BCI-based control of quadcopter flight

We propose a wearable hybrid interface in which eye movements and mental concentration directly control a quadcopter in three-dimensional space. This noninvasive, low-cost interface addresses limitations of previous work by enabling users to complete complicated tasks in a constrained environment where only visual feedback is provided. Combining the two inputs augments the number of control commands, enabling the flying robot to travel in eight different directions within the physical environment. Five human subjects participated in experiments to test the feasibility of the hybrid interface. A front-view camera on the hull of the quadcopter provided the only visual feedback to each remote subject on a laptop display. Based on this feedback, the subjects used the interface to navigate along pre-set target locations in the air. Flight performance was evaluated by comparison with a keyboard-based interface. We demonstrate the applicability of the hybrid interface for exploring and interacting with a three-dimensional physical space through a flying robot.

[KAIST Leads AI] Professor Sungho Jo of the School of Computing researches technology for controlling drones with the brain [LINK]
'Brainwave kickoff' at the World Cup opening match [LINK]

    Related publications
    1. B Kim, M Kim, S Jo, Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking, Computers in Biology and Medicine, 51:82-92, 2014. Selected as an Honorable Mention Paper, Computers in Biology and Medicine, 2014. [LINK] [PDF]