Auditory-GAN: deep learning framework for improved auditory spatial attention detection

PeerJ Computer Science
Introduction

  • We introduce a novel auditory-GAN system comprising two modules: the SSF module and an end-to-end AD-GAN classifier. The SSF module extracts spatial feature maps by capturing the topographic specificity of alpha power from EEG signals, while the AD-GAN network mitigates the need for extensive training data by synthesizing augmented versions of the original EEG data. A minimal sketch of this two-stage pipeline is given after this list.

  • We implemented and validated our model, comparing its performance with other state-of-the-art auditory attention decoders. The results confirm the effectiveness of our proposed approach.
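The sketch below is illustrative only: it assumes a 64-channel EEG segment, a 128 Hz sampling rate, and the conventional 8–13 Hz alpha band, and the helper name alpha_power_features is hypothetical rather than the article's implementation. It shows the kind of per-channel alpha-power feature the SSF stage captures before the AD-GAN classifier consumes it.

```python
# Illustrative sketch (hypothetical helper, not the article's code): the SSF
# stage summarizes alpha-band power per EEG channel; the resulting spatial
# features are what the AD-GAN classifier, trained on real plus GAN-augmented
# samples, would consume.
import numpy as np

def alpha_power_features(eeg: np.ndarray, fs: float = 128.0,
                         band: tuple = (8.0, 13.0)) -> np.ndarray:
    """Per-channel alpha-band power for an (n_channels, n_samples) EEG segment."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[:, mask].mean(axis=1)  # one alpha-power value per channel

# Example: a 64-channel, one-second segment of synthetic "EEG"
segment = np.random.randn(64, 128)
features = alpha_power_features(segment)   # shape (64,); fed to the classifier
print(features.shape)
```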

Methodology

where $P_{data}$ denotes the real data distribution, $P(n)$ is the prior distribution over the noise vector $n$, $E_{x \sim P_{data}}$ is the expectation over $x$ drawn from the real data distribution $P_{data}$, and $E_{n \sim P(n)}$ is the expectation over $n$ sampled from the noise prior.
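This notation corresponds to the standard GAN minimax objective; the article's own equation is not reproduced here, but for reference the standard form reads

\[
\min_{G}\max_{D} V(D, G) = E_{x \sim P_{data}}\big[\log D(x)\big] + E_{n \sim P(n)}\big[\log\big(1 - D(G(n))\big)\big].
\]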

Dataset and experimental settings

EEG dataset and data preparation

where TP, FP, and FN represent true positives, false positives, and false negatives, respectively.
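The metric formulas themselves are not reproduced above; assuming the standard definitions of precision, recall, and F1-score built from these counts, they can be computed as in the following sketch:

```python
# Standard precision/recall/F1 definitions assumed; not copied from the article.
def precision_recall_f1(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp) if (tp + fp) else 0.0   # TP / (TP + FP)
    recall = tp / (tp + fn) if (tp + fn) else 0.0       # TP / (TP + FN)
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

print(precision_recall_f1(tp=90, fp=10, fn=5))  # (0.9, 0.947..., 0.923...)
```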

Results and Discussion

where $\bar{X}$ represents the corrupted version of the original image $X$ of size $x_1 \times x_2 \times 3$, and randn() denotes the Gaussian noise function. The noise levels are set to n = 0.5 and m = 1, 3, 5, 7, 9, 11. The noise-robustness test is performed by adding noise at these levels to the input EEG images to contaminate them; pixel values in the corrupted images that overstep the valid range are clipped to the bounds [0, 255]. The box plots given in Table 12 show the strength of the developed auditory-GAN against the different Gaussian noise levels. The contaminated EEG test images are then fed to the AD-GAN model, and the output accuracy is observed. The results in Table 13 reveal only a trivial decrease in accuracy across the various noise levels, demonstrating the strong robustness of AD-GAN against input noise perturbations.
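The corruption equation itself is not reproduced above. A minimal sketch, assuming the common additive form $\bar{X} = X + n \cdot m \cdot \mathrm{randn}(x_1, x_2, 3)$ followed by clipping to [0, 255] (an assumption about the exact scaling, not the article's verbatim equation), is:

```python
import numpy as np

def corrupt_eeg_image(X: np.ndarray, n: float = 0.5, m: int = 1) -> np.ndarray:
    """Additive Gaussian corruption (assumed form), clipped to valid pixel range."""
    noise = n * m * np.random.randn(*X.shape)   # randn(x1, x2, 3) scaled by n*m
    return np.clip(X.astype(np.float64) + noise, 0, 255)

# Sweep the noise levels used in the robustness test
X = np.random.randint(0, 256, size=(32, 32, 3))
for m in (1, 3, 5, 7, 9, 11):
    X_noisy = corrupt_eeg_image(X, n=0.5, m=m)  # then fed to the AD-GAN model
```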

Conclusion

Supplemental Information

Python code of Auditory GAN.

DOI: 10.7717/peerj-cs.2394/supp-1

Additional Information and Declarations

Competing Interests

The authors declare that they have no competing interests.

Author Contributions

Tasleem Kausar conceived and designed the experiments, prepared figures and/or tables, and approved the final draft.

Yun Lu performed the experiments, prepared figures and/or tables, and approved the final draft.

Muhammad Awais Asghar analyzed the data, prepared figures and/or tables, and approved the final draft.

Adeeba Kausar performed the computation work, authored or reviewed drafts of the article, and approved the final draft.

Siqi Cai analyzed the data, authored or reviewed drafts of the article, and approved the final draft.

Saeed Ahmed performed the computation work, authored or reviewed drafts of the article, and approved the final draft.

Ahmad Almogren analyzed the data, authored or reviewed drafts of the article, and approved the final draft.

Data Availability

The following information was supplied regarding data availability:

The code is available at Zenodo: tasleem-hello. (2024). tasleem-hello/Auditory-GAN-: AD-Gan (1.0). Zenodo. https://doi.org/10.5281/zenodo.13334755.

The data is available at Zenodo: Das, N., Francart, T., & Bertrand, A. (2019). Auditory Attention Detection Dataset KULeuven (2.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.4004271.

Funding

This work was supported by the National Natural Science Foundation of China under Grant 62176102; the Joint Fund of Basic and Applied Basic Research Fund of Guangdong Province under Grant No. 2020A1515110498 and Grant No. 2020A1515140109; the Fund aimed at Improving Scientific Research Capability of Key Construction Disciplines in Guangdong Province, specifically for the project “Light-Weight Federal Learning Paradigm and its Application,” under Grant 2022ZDJS058; and the Professorial and Doctoral Scientific Research Foundation of Huizhou University under Grant 2020JB058. There was no additional external funding received for this study. The funders had a role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
