3.3.2. Random Forest

Random Forest is built on decision trees. Rather than relying on a single tree, it trains multiple decision trees and combines their predictions; the final prediction is the majority vote across the trees. This makes the ensemble more accurate and less likely to overfit, a common problem with single decision trees, which can become too focused on specific details of the training data.
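The majority-vote idea above can be sketched briefly in code. The following is a minimal illustration using scikit-learn on a synthetic binary dataset (the library choice and the dataset are assumptions for demonstration only, not the setup of any study cited in this section):

```python
# Minimal sketch: Random Forest vs. a single decision tree on synthetic data.
# Assumes scikit-learn is installed; dataset and parameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem (stand-in for real ECG/EEG features).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single tree, prone to overfitting specific details of the training data.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# An ensemble of 100 trees whose combined vote gives the final prediction.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print(f"single tree accuracy:   {tree.score(X_te, y_te):.3f}")
print(f"random forest accuracy: {forest.score(X_te, y_te):.3f}")
```

On most runs the forest's held-out accuracy matches or exceeds the single tree's, reflecting the variance reduction that comes from combining many trees.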

Vásquez-Ucho et al. built a system to detect stress from ECG data, in which Random Forest reached 82.4% accuracy. Pisipati and Nandy classified emotions from EEG signals recorded while participants listened to music; with Random Forest they reported 99.94% accuracy on the DREAMER dataset, although this result may not generalize because only 5 of the 23 DREAMER participants were used. Alharbi and Alotaibi, classifying gender from EEG data during emotional states, reported only a 7% error rate with Random Forest, showing that it can handle complex EEG data effectively. Finally, Bhattacharyya et al. used Random Forest together with a sparse autoencoder in a multivariate-multiscale emotion recognition framework that combined multiscale features, reaching 94.4% accuracy on the SEED and DREAMER datasets, which underlines Random Forest's strength with multi-channel EEG signals.