Computers reading student emotions

Jul 24, 2008

“Computer software is too rules-based. It can’t really adapt to students. You lose too much if you rely on computer-based learning.”

This is one piece of pushback we hear against the vision of student-centric learning technology that we put forth in Disrupting Class.

We have many responses, of course, but one of them is that technology always starts at the low end of performance and, over time, predictably improves to handle more complicated tasks and jobs. One might not be able to envision how it will improve or what it will be able to do, but we can say with certainty that it will improve.

For example, what if, in the future, computers could read student emotions and adjust accordingly? Sound far-fetched? It might not be.

Many emerging cognitive tutors can already read student emotions. According to a recent article in eSchool News, University of Massachusetts researchers received a grant of $890,419 in June from the National Center for Education Research to advance technology that uses sensors to detect student emotions.

How does it work? According to the article:

“The tutoring program uses sensors placed in a student’s seat, in the computer mouse, and on a student’s wrist to detect arousal through skin conductance, a common measure for stress response. Conductance gives researchers a clear picture of the subject’s nervous-system activity. The program also will use cameras to detect smiles and facial expressions that connote negative feelings, such as anxiety or frustration.”
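To make the idea concrete, here is a minimal, purely illustrative sketch of how a tutor might combine two such signals — a skin-conductance reading and a facial-expression label — into an emotional state it can act on. Every name, threshold, and label below is an assumption for illustration, not a detail of the UMass system:

```python
# Illustrative sketch only: combining two assumed sensor signals
# (skin conductance + a camera-derived expression label) into a
# coarse emotional state a tutoring program could respond to.

def detect_emotion(skin_conductance_uS, facial_expression):
    """Classify a student's state from two hypothetical signals.

    skin_conductance_uS: skin conductance in microsiemens; higher
        readings generally indicate greater nervous-system arousal.
    facial_expression: a label from an assumed camera pipeline,
        e.g. "smile", "frown", or "neutral".
    """
    aroused = skin_conductance_uS > 5.0  # assumed arousal threshold

    if aroused and facial_expression == "frown":
        return "frustrated"   # high arousal + negative expression
    if aroused and facial_expression == "smile":
        return "engaged"      # high arousal + positive expression
    if aroused:
        return "anxious"      # high arousal, ambiguous expression
    return "calm"             # low arousal

# A tutor could then adapt: slow the lesson down on "frustrated",
# or advance to harder material on "engaged".
print(detect_emotion(7.2, "frown"))    # frustrated
print(detect_emotion(2.1, "neutral"))  # calm
```

A real system would, of course, use trained statistical models rather than fixed thresholds, but the basic loop — sense, classify, adapt — is the same.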

Just imagine the possibilities here. What other research efforts and products like this are out there? How might this revolutionize learning?

– Michael B. Horn

Michael is a co-founder and distinguished fellow at the Clayton Christensen Institute. He currently serves as Chairman of the Clayton Christensen Institute and works as a senior strategist at Guild Education.