New machine learning algorithm can identify the facial expression a person is looking at based on neural activity
Researchers from The Ohio State University have identified the brain region that allows people to recognize facial expressions: the posterior superior temporal sulcus (pSTS), located on the right side of the brain behind the ear. Using functional magnetic resonance imaging (fMRI), the researchers saw this region activate when participants looked at images of people making different facial expressions.

Even more interesting is the pattern they found: each type of facial expression in the images evoked a distinct pattern of neural activity in the pSTS. For example, one pattern corresponded to a furrowed brow while another corresponded to a smile. “That suggests that our brains decode facial expressions by adding up sets of key muscle movements in the face of the person we are looking at,” said Aleix Martinez, a cognitive scientist and professor of electrical and computer engineering at Ohio State.
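The idea of mapping distinct activity patterns back to expressions can be illustrated with a minimal sketch. This is not the study's actual pipeline; it assumes synthetic voxel data and uses a simple nearest-centroid decoder as a stand-in for the machine learning algorithm, where each expression has a characteristic (made-up) activity pattern and new noisy trials are matched to the most correlated one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each facial expression evokes a characteristic
# activity pattern across a set of pSTS voxels; observed trials are
# noisy versions of those patterns. All data here is synthetic.
n_voxels = 50
expressions = ["happy", "sad", "fearful", "angry", "surprised", "disgusted"]
prototypes = {e: rng.normal(size=n_voxels) for e in expressions}

def simulate_trial(expression, noise=0.5):
    """Return a noisy voxel-activity pattern for one viewing trial."""
    return prototypes[expression] + rng.normal(scale=noise, size=n_voxels)

def decode(pattern):
    """Nearest-centroid decoding: pick the expression whose prototype
    correlates most strongly with the observed pattern."""
    corrs = {e: np.corrcoef(pattern, p)[0, 1] for e, p in prototypes.items()}
    return max(corrs, key=corrs.get)

# Decode a batch of simulated trials and measure accuracy.
trials = [(e, simulate_trial(e)) for e in expressions for _ in range(20)]
accuracy = np.mean([decode(x) == e for e, x in trials])
```

With clearly separated patterns, the decoder recovers the viewed expression well above chance, which is the core claim of the study: the pattern of brain activity carries enough information to identify what expression the person is looking at.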