
Computer Software Accurately Predicts Student Test Performance

Study shows automatic recognition of facial expressions can track student engagement in real time

By Doug Ramsey

Student engagement levels are tracked in real time by the automatic system for recognizing facial expressions. Photo copyright 2014 IEEE; all rights reserved

Computer scientists have developed a technology that uses facial expression recognition to detect how engaged students are during a class and to predict how well they will do in it. The team, led by scientists at the University of California, San Diego and Emotient, a San Diego-based provider of facial expression recognition technology, showed that the system detected students’ level of engagement in real time as accurately as human observers did. The team also included researchers from Virginia Commonwealth University and Virginia State University.

The early online version of the paper, “The Faces of Engagement: Automatic Recognition of Student Engagement,” appeared today in the journal IEEE Transactions on Affective Computing.

“Automatic recognition of student engagement could revolutionize education by increasing understanding of when and why students get disengaged,” said Dr. Jacob Whitehill, a researcher in the Machine Perception Lab at UC San Diego’s Qualcomm Institute and a co-founder of Emotient. “Automatic engagement detection provides an opportunity for educators to adjust their curriculum for higher impact, either in real time or in subsequent lessons. Automatic engagement detection could be a valuable asset for developing adaptive educational games, improving intelligent tutoring systems and tailoring massive open online courses, or MOOCs.”

Whitehill (Ph.D., ’12) recently received his doctorate from the Computer Science and Engineering department of UC San Diego’s Jacobs School of Engineering.

In the study, the researchers trained an automatic detector that measures how engaged a student appears in webcam video recorded while the student completes cognitive skills training on an iPad®. The detector uses automatic expression recognition technology to analyze students’ facial expressions frame by frame and estimate their level of engagement.
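To make the frame-by-frame approach concrete, here is a minimal sketch, not the authors’ system or Emotient’s technology. It assumes facial expression features (for example, expression or action-unit intensities) have already been extracted for each video frame, uses synthetic data, and fits a simple scikit-learn ridge regressor that scores each frame and then averages the scores over a clip.

# Illustrative sketch only: per-frame expression features -> engagement estimate.
# All data here is synthetic; feature extraction is assumed to happen upstream.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training set: 500 frames x 20 expression features, each frame
# paired with a human-annotated engagement score on a 1-4 scale.
frame_features = rng.normal(size=(500, 20))
engagement_labels = np.clip(
    2.5 + 0.8 * frame_features[:, 0] + rng.normal(scale=0.3, size=500), 1, 4)

# Fit a simple per-frame regressor (a stand-in for the learned detector).
model = Ridge(alpha=1.0).fit(frame_features, engagement_labels)

# At run time: score every frame of a new webcam clip, then summarize the clip.
new_clip = rng.normal(size=(120, 20))      # 120 frames of extracted features
frame_scores = model.predict(new_clip)     # per-frame engagement estimates
print(f"Clip-level engagement estimate: {frame_scores.mean():.2f}")

In practice, per-frame predictions like these can be aggregated over any window of interest, which is what allows engagement to be tracked continuously rather than only at the end of a session.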

Recent Computer Science and Engineering alumnus Jacob Whitehill (Ph.D., ’12) now works for Emotient, Inc., with his advisor Javier Movellan, co-director of the Machine Perception Lab at UC San Diego.

“This study is one of the most thorough to date in the application of computer vision and machine learning technologies for automatic student engagement detection,” said Javier Movellan, co-director of the Machine Perception Lab at UC San Diego and Emotient co-founder and lead researcher. “The possibilities for its application in education and beyond are tremendous. By understanding what parts of a lecture, conversation, game, advertisement or promotion produced different levels of engagement, an individual or business can obtain valuable feedback to fine-tune the material to something more impactful.”

In addition to Movellan and Whitehill, the study’s authors include Zewelanji Serpell, a professor of developmental psychology at Virginia Commonwealth University, as well as Yi-Ching Lin and Aysha Foster from the Department of Psychology at Virginia State University.

Emotient was founded by a team of six Ph.D.s from UC San Diego, who are the foremost experts in applying machine learning, computer vision and cognitive science to facial behavioral analysis. Its proprietary technology sets the industry standard for accuracy and real-time delivery of facial expression data and analysis. Emotient’s facial expression technology is currently available as an API for Fortune 500 companies within consumer packaged goods, retail, healthcare, education and other industries.
