A blank stare on a student’s face is not an uncommon sight in a classroom. But does it signal confusion, or mask boredom because the pupil already knows the material?
A rise in artificial intelligence and cognitive computing is creating a new workforce of robots, simulating human thought and transforming industries. At the apex of this emerging tech is a new field of sensory technology known as emotive computing.
This is not about teaching robots to have emotions. Rather, it is about teaching them to recognize human emotions from observable signals, and then to react appropriately to how the person appears to be feeling. Robots may actually be more useful than humans in this role, as they are not clouded by emotion themselves, instead using intelligent technology to detect hidden responses.
So how could this new field of technology find its way into the classroom? Can it help teachers improve their care of students with extra needs and tailor teaching to suit them?
Facial recognition to measure understanding
Budget constraints in public schools have driven widespread teacher layoffs and crowded classrooms. Teachers, already pressed for time, must then cater to a wider range of learning needs in each class. As a result, children who are struggling in lessons often go unnoticed, and problems may only surface when a child flags an issue.
In the last three years, there has been an emergence of new businesses pioneering facial recognition technology in the classroom. Companies like SensorStar Labs use cameras to capture student responses, which feed into algorithms to identify if their attention is wandering. The system, called EngageSense, measures smiles, frowns and audio to classify student engagement.
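SensorStar Labs has not published how EngageSense works internally, but the basic idea it describes — combining per-frame expression scores and an audio level into a coarse engagement label — can be sketched in a few lines. Everything below is illustrative: the feature names, weights and thresholds are assumptions, not the company’s actual model.

```python
# Illustrative sketch only: combine hypothetical per-frame expression
# scores (e.g. from an off-the-shelf facial-expression model) with an
# audio activity level to produce a coarse engagement label.

def classify_engagement(smile: float, frown: float, audio_level: float) -> str:
    """All inputs are assumed to be normalized to the 0..1 range."""
    # Weighted sum of signals; the weights here are made up for illustration.
    score = 0.5 * smile - 0.3 * frown + 0.2 * audio_level
    if score > 0.3:
        return "engaged"
    if score < 0.0:
        return "disengaged"
    return "neutral"

print(classify_engagement(smile=0.8, frown=0.1, audio_level=0.5))  # engaged
```

A real system would derive the expression scores from video frames and learn the weights from labeled classroom data rather than hand-tuning them.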
Psychologist Paul Ekman has taken this to a whole new level, cataloging more than 5,000 facial movements to help identify human emotions. His research is powering new companies like Emotient Inc, Affectiva Inc and Eyeris, each using a combination of psychology and data-mining to detect micro-expressions and classify human reactions.
So far this technology has focused on aiding federal law enforcement and market research, but San Diego researchers are also trialing it in healthcare, to measure children’s pain levels after surgery.
Applying this in the classroom means teachers can gather more in-depth data to measure understanding. This can be used on a one-to-one level but also to assess class engagement as a whole, in response to varying teaching methods, informing teachers where additional support may be required.
From emotive to affective computing
The next stage after detecting facial cues is affective computing, which is the measurement and interpretation of human responses, or “affects.” Canadian software startup NuraLogix has developed a proprietary technology for detecting hidden emotions, called Transdermal Optical Imaging, with a camera that is able to measure facial blood flow information and determine emotions where visual cues are not obvious.
The MIT Media Lab is currently running research programs into a range of affective studies, from conducting electroencephalogram (EEG) tests of electrical brain activity in response to music arousal, to measuring task performance and providing computer mediation to individuals with autism spectrum disorder (ASD).
The MIT Affective Computing team is developing the world’s first wearable affective technology: a social-emotional intelligence prosthetic that detects human affects in children with autism in real time. The device uses a small camera and analyzes facial expressions and head movements to infer the cognitive-affective state of the child. A similar tool, the ‘galvactivator’, measures the wearer’s skin conductivity to deduce how excited a person is. The “glove-like device” maps physiological arousal using visuals such as LEDs. By using this form of visual feedback, a bright class could literally glow.
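The galvactivator’s mapping — skin conductance in, LED brightness out — amounts to scaling a physiological signal into a display range. A minimal sketch of that idea follows; the conductance bounds and 8-bit brightness scale are assumptions for illustration, not the device’s actual calibration.

```python
def conductance_to_brightness(microsiemens: float,
                              low: float = 1.0,
                              high: float = 20.0) -> int:
    """Map a skin-conductance reading (in µS) onto an 8-bit LED brightness.

    The low/high calibration bounds are illustrative; a real wearable
    would calibrate them per wearer, since baseline conductance varies.
    """
    clamped = max(low, min(high, microsiemens))   # keep reading in range
    fraction = (clamped - low) / (high - low)     # normalize to 0..1
    return round(fraction * 255)                  # scale to 0..255

print(conductance_to_brightness(20.0))  # maximum arousal -> 255
```

In hardware, the returned value would drive an LED via pulse-width modulation, so higher arousal literally shines brighter.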
Machine learning to tailor student classwork
Can this form of sensory technology help computers to adapt coursework and teaching methodologies to match student performance indicators?
TechCrunch contributor Roshan Choxi has explored the rise of online platforms tailoring one-on-one mentorship and mastery learning for improved student performance, effectively addressing Benjamin Bloom’s famous “2 Sigma Problem” with real-time measurement and customized learning. Deep learning systems powered by personal human response data can sort and organize relevant classroom content, recommend additional exercises, and customize course materials and pace based on individual capability and changing curriculum requirements.
Researchers at North Carolina State University have developed software to adapt online tutorials through the use of cameras that can monitor and analyze the facial expressions of students working on computers. Until now, most advances in affective computing tools for educators have been limited to academia. However, within recent years companies like Intel have begun to use this technology to identify student expression—be this frustration, excitement or boredom—and automatically adjust content and environments tuned to their capability and learning styles.
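Neither the North Carolina State software nor Intel’s systems are public, but the control loop they describe — read a detected affective state, then adjust content difficulty — is simple to outline. The sketch below is hypothetical: the affect labels, difficulty scale and adjustment rules are assumptions, not any vendor’s implementation.

```python
# Hypothetical control loop: adjust tutorial difficulty from a detected
# affective state. The states and rules here are illustrative assumptions.

ADJUSTMENTS = {
    "frustrated": -1,  # step down: offer easier material or a hint
    "bored": +1,       # step up: the student is likely under-challenged
    "engaged": 0,      # stay the course
}

def next_difficulty(current: int, affect: str,
                    minimum: int = 1, maximum: int = 5) -> int:
    """Return the next difficulty level, clamped to the allowed range."""
    delta = ADJUSTMENTS.get(affect, 0)  # unknown states leave difficulty unchanged
    return max(minimum, min(maximum, current + delta))

level = 3
for affect in ["bored", "bored", "frustrated"]:
    level = next_difficulty(level, affect)
print(level)  # 3 -> 4 -> 5 -> 4
```

The point of the sketch is the feedback loop itself: the affect detector closes the loop that, in a one-on-one tutoring session, a human teacher closes by watching the student’s face.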
Armed with this data, emotionally intelligent computing systems such as Emoshape’s Emotional Processing Unit (EPU II) can analyze sentiment and respond with appropriate expressions. Being able to tap information in this way helps on an individual level, enabling educators to deliver highly personalized content that motivates children.
Artificial intelligence and big data have already transformed major consumer industries, from ecommerce to transportation, finance and even healthcare. They are already making headway into education, despite legitimate concerns over how technologists can prove efficacy and safeguard student privacy.
These technologies will not replace the teacher. Instead, they will support teachers under pressure, detecting stress cues to enable smarter, more tailored learning approaches. Artificial intelligence may not be as dreary or dystopian as we feared. Perhaps in this new world, a combination of human and computing intelligence will deliver a higher quality education, where human responses and emotions are pivotal.