1- K. N. Toosi University of Technology
Abstract:
Attention, as one of the key cognitive processes, plays a central role in daily activities, learning, and human-environment interaction. Accurately and objectively assessing individuals' attention levels, especially in dynamic, real-world situations, has long been a challenge. Traditional methods, such as self-report questionnaires or paper-based tests, often fail to capture momentary fluctuations in attention or the impact of environmental factors. Aiming to provide a precise and efficient alternative, this study analyzed eye and hand movement patterns within the framework of the Trail-Making Test. Data from 42 healthy participants were recorded while they performed the test; their eye and hand movements were measured using eye-tracking technology and mouse movement tracking. Features such as saccades, fixations, blinks, and mouse movement speed were extracted, and a Random Forest model was then trained on these features to predict attention levels. The model achieved a coefficient of determination (R²) of 0.72, demonstrating its ability to predict attention levels accurately. These findings confirm that eye and hand movement patterns can serve as reliable indicators for attention assessment, and that applying machine learning techniques to eye and hand movement data offers a viable approach for evaluating attention in real-world settings. Beyond its scientific and research significance, this approach has practical applications in various fields, including education, clinical psychology, and the design of human-computer interaction systems.
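The modeling step described in the abstract can be illustrated with a minimal sketch. The column names, file layout, and hyperparameters below are assumptions for illustration only; the paper's actual feature definitions and training setup are not specified here. Under those assumptions, a scikit-learn Random Forest regressor could be trained on per-trial oculomotor and mouse features and scored with R² roughly as follows:

```python
# Minimal sketch, assuming a hypothetical CSV of per-trial features
# extracted from eye-tracking and mouse logs. Column names, file name,
# and hyperparameters are illustrative, not the authors' pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical features in the spirit of those named in the abstract:
# saccades, fixations, blinks, and mouse movement speed.
FEATURES = [
    "saccade_count",
    "mean_fixation_duration_ms",
    "blink_rate_per_min",
    "mean_mouse_speed_px_s",
]

df = pd.read_csv("trail_making_features.csv")  # hypothetical file
X = df[FEATURES]
y = df["attention_score"]  # hypothetical attention-level target

# Hold out a test split so R² reflects generalization, not fit.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Coefficient of determination on the held-out trials.
print(f"R^2: {r2_score(y_test, model.predict(X_test)):.2f}")
```

A Random Forest is a natural choice for this kind of tabular, mixed-scale feature set, since it requires little feature scaling and captures nonlinear interactions between oculomotor and motor features.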
Type of Article: Research paper | Subject: Special
Received: 2025/02/25 | Accepted: 2025/08/21 | ePublished ahead of print: 2025/11/24



Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.