PriMera Scientific Engineering (ISSN: 2834-2550)

Research Study

Volume 8 Issue 2

EnganchAI: Preliminary Work on Real-Time Classroom Engagement Analysis Using Computer Vision

Mauricio Figueroa Colarte*, Claudio Valdivia Parra, Jose Pablo Casas, Alison Bottinelli Thomassen, Vicente Rivas Urrutia and Cristian Molina Pedernera

February 03, 2026

DOI: 10.56831/PSEN-08-249

Abstract

Student engagement is essential for academic achievement and emotional well-being, encompassing behavioral, emotional, and cognitive dimensions. In face-to-face education, fostering engagement helps create learning environments that are both effective—meeting curricular goals—and affective—nurturing motivation and belonging. Technology-assisted strategies are especially valuable for helping teachers detect disengagement in real time and personalize their responses. These adaptive actions support greater student focus and commitment, enabling a teaching process that integrates both effectiveness and emotional connection.

EnganchAI, derived from the fusion of “Engagement” and “AI” (Artificial Intelligence), introduces a proof of concept (PoC) for real-time student engagement analysis in physical classrooms using computer vision technologies. Designed for low-resource environments, the platform employs a YOLO-based model trained on custom-labeled datasets to process live video feeds, classifying engagement into four levels (Engaged, Bored, Frustrated, and Confused). By providing actionable insights, EnganchAI enables educators to adapt teaching strategies dynamically, fostering more effective and affective learning experiences.
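The per-frame output of such a detector can be reduced to a classroom-level summary that a teacher dashboard could display. The sketch below is illustrative only: the class list matches the four engagement levels named above, but the detection format, the confidence threshold, and the function name are assumptions, not details from the paper.

```python
# Illustrative sketch: aggregating YOLO-style detections (one per student
# face) into per-frame engagement counts. The detection tuple format
# (class_id, confidence) and the 0.5 threshold are assumptions.

from collections import Counter

ENGAGEMENT_CLASSES = ["Engaged", "Bored", "Frustrated", "Confused"]

def summarize_frame(detections, conf_threshold=0.5):
    """detections: list of (class_id, confidence) pairs for one video frame.
    Returns the count of students in each engagement state, keeping only
    detections at or above the confidence threshold."""
    counts = Counter({name: 0 for name in ENGAGEMENT_CLASSES})
    for class_id, conf in detections:
        if conf >= conf_threshold:
            counts[ENGAGEMENT_CLASSES[class_id]] += 1
    return dict(counts)

# Example: three faces detected; the third falls below the threshold.
frame = [(0, 0.91), (1, 0.62), (0, 0.40)]
print(summarize_frame(frame))
# → {'Engaged': 1, 'Bored': 1, 'Frustrated': 0, 'Confused': 0}
```

A running aggregate of these counts over a sliding time window is one plausible way to surface the "actionable insights" described above without storing raw video, consistent with the confidentiality protocols mentioned later in the abstract.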

The PoC was tested both in controlled environments and in real classroom scenarios, ensuring accurate performance measurements and validation of its real-time capabilities. These trials were conducted under strict confidentiality protocols to protect personal data and ensure compliance with privacy regulations. The platform demonstrated promising results, achieving a mean average precision (mAP) of 70.25% and an inference response time under 2 seconds. Future iterations will focus on refining datasets, enhancing model accuracy, and expanding functionalities to support broader adoption.

Keywords: Engagement Analysis; Artificial Intelligence; Affective Learning; Computer Vision; Classroom Technology
