Project Overview
We aim to implement a Brain-Computer Interface (BCI) using the noise obtained during electroencephalography (EEG) measurement. Blinking, eye movement, jaw movement, and teeth clenching are typical biosignal artifacts, and we use them as gestures. We propose an optimal electrode placement and hardware configuration for this implementation and present a dynamic time warping (DTW)-based algorithm for distinguishing the recorded waveforms in real time. Through a survey of 116 individuals, we determined the level of demand for each gesture in an earbud usage environment and created a demo application that presents the derived user scenarios at a glance with EEG and gesture visualization. The results of this study can be applied to actual earbuds and are expected to extend to other devices, such as XR headsets, and to various healthcare scenarios.
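
The overview names DTW as the matching step but does not show it; the sketch below is a minimal illustration of nearest-template DTW classification under assumed conditions (single-channel windows, one pre-recorded template per gesture). The function names `dtw_distance` and `classify_gesture`, the `templates` dictionary, and the synthetic signals are all hypothetical, and a real-time system would likely use a banded or windowed DTW for speed.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance
    between two 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_gesture(window, templates):
    """Label a buffered EEG window with the nearest-template gesture."""
    return min(templates, key=lambda g: dtw_distance(window, templates[g]))

# Hypothetical usage: one synthetic template per gesture.
templates = {
    "blink": np.sin(np.linspace(0, np.pi, 50)),
    "jaw_clench": np.concatenate([np.ones(25), -np.ones(25)]),
}
window = np.sin(np.linspace(0, np.pi, 60))  # stand-in for a live EEG buffer
print(classify_gesture(window, templates))  # -> "blink"
```

Because DTW warps the time axis, the same gesture performed faster or slower still matches its template, which is why it suits variable-speed artifacts like blinks and jaw clenches.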