%0 Journal Article
%T Decoding spatial attention with EEG and virtual acoustic space
%A Kaan E. Raif
%A Sarah C. Determan
%A Yan Gai
%A Yue Dong
%J Physiological Reports
%D 2017
%R 10.14814/phy2.13512
%X Decoding spatial attention based on brain signals has wide applications in brain–computer interface (BCI). Previous BCI systems mostly relied on visual patterns or auditory stimulation (e.g., loudspeakers) to evoke synchronous brain signals. Such stimulation protocols have difficulty covering a large range of spatial locations. The present study explored the possibility of using virtual acoustic space and a visual-auditory matching paradigm to overcome this issue. The technique has the flexibility of generating sound stimulation from virtually any spatial location. Brain signals of eight human subjects were obtained with a 32-channel electroencephalogram (EEG). Two amplitude-modulated noise or speech sentences carrying distinct spatial information were presented concurrently. Each sound source was tagged with a unique modulation phase so that the phase of the recorded EEG signals indicated the sound being attended to. The phase-tagged sound was further filtered with head-related transfer functions to create the sense of virtual space. Subjects were required to pay attention to the sound source that best matched the location of a visual target. For all the subjects, the phase of a single sound could be accurately reflected over the majority of electrodes based on EEG responses of 90 s or less. The electrodes providing significant decoding performance on auditory attention were fewer and may require longer EEG responses. The reliability and efficiency of decoding with a single electrode varied across subjects. Overall, the virtual acoustic space protocol has the potential to be used in practical BCI systems.
%K Auditory attention
%K brain–computer interface
%K EEG
%K phase
%K spatial attention
%U https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5704085/