3.4.4. Attention Networks
Attention Networks are designed to prioritize certain parts of the input data, enabling more focused, context-aware analysis of complex tasks. Instead of treating all input data equally, Attention Networks assign different weights to different data segments, guiding the network to prioritize the most informative parts. This is especially important where context and detail matter, such as in language translation. In emotion recognition, the attention mechanism helps the model focus on the specific channels or time points that are most indicative of the emotional state.
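The weighting described above can be sketched with a minimal additive-attention pooling step over per-channel EEG features. This is an illustrative toy example, not any specific published architecture; the channel count, feature dimension, and randomly initialized parameters are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 32 EEG channels, each summarized by a 16-dim feature vector.
n_channels, feat_dim = 32, 16
features = rng.standard_normal((n_channels, feat_dim))

# Learnable parameters (randomly initialized here purely for illustration).
W = rng.standard_normal((feat_dim, feat_dim)) * 0.1
v = rng.standard_normal(feat_dim) * 0.1

# Additive attention: score each channel, then softmax-normalize so the
# weights sum to 1 and emphasize the most informative channels.
scores = np.tanh(features @ W) @ v          # shape: (n_channels,)
weights = np.exp(scores - scores.max())
weights /= weights.sum()                    # softmax over channels

# The weighted sum pools per-channel features into one context vector,
# which a downstream classifier would consume.
context = weights @ features                # shape: (feat_dim,)

print(context.shape)
```

In a trained network, `W` and `v` would be learned jointly with the classifier, so the weights come to reflect which channels carry emotion-relevant information rather than random scores.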
Qiao et al. combined spatial and channel attention mechanisms with a Generative Adversarial Network (GAN) to enhance feature extraction and address the issue of weak EEG signals. Sartipi et al. developed a hybrid model that combines spatiotemporal encoding with recurrent attention blocks, while also integrating graph smoothing to reduce noise and clarify signals for datasets like DEAP and DREAMER. Liu et al. proposed a semi-supervised model, the Consistency Regularization Enhanced Graph Attention Network (CR-GAT), which utilizes feature graph construction and consistency regularization to boost performance, especially when labeled EEG data is limited.
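To make the idea of channel attention concrete, here is a generic squeeze-and-excitation-style gate applied to a raw EEG segment. This is a hedged sketch of the general technique, not the architecture of any of the works cited above; the segment shape, reduction ratio, and random weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical EEG segment: 32 channels x 128 time samples.
n_channels, n_samples = 32, 128
x = rng.standard_normal((n_channels, n_samples))

# Squeeze: summarize each channel by its mean activation over time.
z = x.mean(axis=1)                                   # shape: (n_channels,)

# Excitation: two small dense layers (random weights for illustration)
# produce a per-channel gate in (0, 1) via a sigmoid.
r = 4                                                # reduction ratio (assumed)
W1 = rng.standard_normal((n_channels, n_channels // r)) * 0.1
W2 = rng.standard_normal((n_channels // r, n_channels)) * 0.1
hidden = np.maximum(z @ W1, 0.0)                     # ReLU
gate = 1.0 / (1.0 + np.exp(-(hidden @ W2)))          # shape: (n_channels,)

# Re-weight each channel's time series by its gate value, so uninformative
# channels are suppressed before further feature extraction.
x_att = gate[:, None] * x                            # shape: (n_channels, n_samples)

print(x_att.shape)
```

In practice the gate weights are trained end to end, so channels that consistently help the emotion classifier receive gates near 1 while noisy channels are attenuated toward 0.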