We propose a novel metric for quantitatively evaluating ocular artifact (OA) removal methods on real electroencephalogram (EEG) data. For real EEG, existing metrics measure only the amount of artifact removed; our metric measures how likely a given method is to distort the underlying EEG. We used the new metric to evaluate two existing OA removal algorithms that use the electro-oculogram (EOG) as a reference signal. Combining a previous metric with our new metric revealed a trade-off between how well an algorithm removes OAs and how likely it is to distort the underlying EEG. These algorithms require a reference EOG signal, yet for certain applications (e.g., a brain-computer interface, or BCI) it is preferable or necessary to avoid attaching electrodes around the eyes. We therefore also used various combinations of up to 55 EEG channels to estimate the EOG reference, and applied the metric to compare estimated vs. measured EOG. Our initial results showed that using EOG estimated from as few as 4 EEG electrodes increased the likelihood of distorting the EEG from 14% to 19% and from 21% to 23% for the two algorithms, respectively. For some applications (e.g., BCI), this slight reduction in performance may be acceptable in order to avoid using EOG electrodes.
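The abstract does not specify the two OA removal algorithms, but EOG-reference methods of this kind are commonly based on regression: the least-squares projection of the EEG onto the EOG reference channels is subtracted from each EEG channel. The sketch below illustrates that generic idea only; the function name and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def remove_oa_regression(eeg, eog):
    """Generic regression-based ocular artifact removal (illustrative sketch).

    eeg : array of shape (n_samples, n_eeg_channels)
    eog : array of shape (n_samples, n_eog_channels), the reference signal
          (measured around the eyes, or estimated from frontal EEG channels).

    Returns the EEG with the least-squares projection onto the EOG
    reference subtracted, i.e. eeg - eog @ b where b minimizes
    ||eeg - eog @ b||^2 in the least-squares sense.
    """
    # Propagation coefficients from each EOG channel to each EEG channel
    b, *_ = np.linalg.lstsq(eog, eeg, rcond=None)
    # Subtract the estimated ocular contribution from every EEG channel
    return eeg - eog @ b
```

Note the trade-off the abstract quantifies: because regression removes everything in the EEG that is linearly correlated with the reference, any genuine brain activity that leaks into (or is shared with) the EOG channels is distorted as well.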