Recognition and Misclassification Patterns of Basic Emotional Facial Expressions: An Eye-Tracking Study in Young Healthy Adults

Date

2025

Publisher

MDPI

Open Access Color

GOLD

Green Open Access

Yes

Publicly Funded

No

Abstract

Accurate recognition of basic facial emotions is well documented, yet the mechanisms of misclassification and their relation to gaze allocation remain under-reported. The present study used a within-subjects eye-tracking design to examine both accurate and inaccurate recognition of five basic emotions (anger, disgust, fear, happiness, and sadness) in healthy young adults. Fifty participants (twenty-four women) completed a forced-choice categorization task with 10 stimuli (female/male poser × emotion). A remote eye tracker (60 Hz) recorded fixations mapped to eye, nose, and mouth areas of interest (AOIs). The analyses combined accuracy and decision-time statistics with heatmap comparisons of misclassified versus accurate trials within the same image. Overall accuracy was 87.8% (439/500). Misclassification patterns depended on the target emotion but not on participant gender. The male fear expression was most often misclassified (typically as disgust), the female sadness expression was frequently labeled as fear or disgust, and disgust was the most common incorrect response overall. For accurate trials, decision time showed main effects of emotion (p < 0.001) and participant gender (p = 0.033): happiness was categorized fastest and anger slowest, and women responded faster overall, with particularly fast responses for sadness. The AOI results revealed strong main effects and an AOI × emotion interaction (p < 0.001): the eyes received the most fixations, but fear drew relatively more mouth sampling and sadness more nose sampling. Crucially, heatmaps showed an upper-face bias (eye AOI) in inaccurate trials, whereas accurate trials retained eye sampling while adding nose and mouth AOI coverage aligned with diagnostic cues. These findings indicate that scanpath strategy, in addition to information availability, underpins success and failure in basic-emotion recognition, with implications for theory, targeted training, and affective technologies.
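The AOI analysis described above (mapping each fixation to an eye, nose, or mouth region and counting fixations per region per trial) can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the AOI names mirror the abstract, but the rectangular pixel coordinates and the helper functions `assign_aoi` and `aoi_fixation_counts` are hypothetical examples.

```python
# Hypothetical AOI boxes in stimulus pixel coordinates: (x_min, y_min, x_max, y_max).
# The regions (eyes, nose, mouth) follow the abstract; the numbers are invented.
AOIS = {
    "eyes":  (100, 80, 300, 140),
    "nose":  (160, 140, 240, 210),
    "mouth": (140, 210, 260, 270),
}

def assign_aoi(x, y, aois=AOIS):
    """Return the name of the AOI containing fixation (x, y), or None if outside all AOIs."""
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def aoi_fixation_counts(fixations, aois=AOIS):
    """Count fixations per AOI for one trial; fixations outside every AOI are ignored."""
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        name = assign_aoi(x, y, aois)
        if name is not None:
            counts[name] += 1
    return counts

if __name__ == "__main__":
    # One simulated trial: four on-face fixations plus one off-face fixation.
    trial = [(150, 100), (200, 120), (200, 180), (200, 250), (10, 10)]
    print(aoi_fixation_counts(trial))  # eyes: 2, nose: 1, mouth: 1; (10, 10) is discarded
```

Per-trial counts like these, aggregated across participants and emotions, would feed the AOI × emotion analysis the abstract reports (e.g., relatively more mouth sampling for fear and more nose sampling for sadness).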

Keywords

Gender-Difference, Face, Article

WoS Q

Q1

Scopus Q

Q3
Source

Journal of Eye Movement Research

Volume

18

Issue

5

Start Page

53
