New Brain-inspired Computer Can Tell a Sad Image from a Happy One

Neuroscientists have combined machine learning and neuroscience to create a neural network that can identify emotions.
Loukia Papadopoulos

University of Colorado Boulder neuroscientists have combined machine learning and neuroscience to create a brain-inspired computer that can tell the difference between sad and happy images. 

Recognizing the emotions in images

"Machine learning technology is getting really good at recognizing the content of images -- of deciphering what kind of object it is," said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. "We wanted to ask: Could it do the same with emotions? The answer is yes."

The experiment is an important development for "neural networks," computer systems modeled after the human brain. It also suggests that what we see could have a greater impact on our emotions than we might think.

"A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. "We found that the visual cortex itself also plays an important role in the processing and perception of emotion."

For their study, the researchers took AlexNet, a neural network designed to enable computers to recognize objects, and, drawing on previous emotion research, retooled it to predict how a person would feel when viewing a given image. They dubbed the new network EmoNet and proceeded to show it 25,000 images.
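
The article doesn't include the authors' code, but retooling a pretrained object-recognition network for a new task is a standard transfer-learning move. Below is a minimal, hypothetical PyTorch sketch of the idea: AlexNet's final 1,000-way object classifier is swapped for a 20-way emotion output, matching the study's 20 categories. The training data, loss, and optimizer settings are illustrative assumptions, not EmoNet's actual recipe.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 20  # the study's 20 emotion categories

# Start from AlexNet pretrained for object recognition on ImageNet.
model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)

# Freeze the convolutional feature extractor; only the new head learns.
for param in model.features.parameters():
    param.requires_grad = False

# Swap the final 1,000-way object layer for a 20-way emotion layer.
model.classifier[6] = nn.Linear(4096, NUM_EMOTIONS)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)

# One illustrative training step on stand-in data (real training would
# use a large set of emotion-labeled images, e.g., the 25,000 here).
images = torch.randn(8, 3, 224, 224)           # dummy batch of images
labels = torch.randint(0, NUM_EMOTIONS, (8,))  # dummy emotion labels

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```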

The computer was then asked to sort them into 20 categories such as craving, sexual desire, horror, awe, and surprise. The program proved better at recognizing some emotions than others.

It could accurately and consistently categorize 11 of the emotion types. Craving and sexual desire, for instance, were identified with more than 95 percent accuracy.

However, more nuanced discrete emotions such as confusion, awe, and surprise were harder to pinpoint. Even so, EmoNet proved very reliable at rating the intensity of images.

It was also rather good at classifying brief movie clips. When asked to sort them into romantic comedies, action films, or horror movies, it was correct 75 percent of the time.

Same neural network patterns for human and computer

The researchers then recruited 18 human subjects and used a functional magnetic resonance imaging (fMRI) machine to measure their brain activity while they viewed the same 112 images shown to EmoNet. Strikingly, the patterns of activity were similar in human and computer.

"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said Kragel.

In the end, the researchers believe their work could be applied to improve human-computer interactions and help advance emotion research. For now, however, the research underscores the importance of being mindful of what you are exposed to.

"What you see and what your surroundings are can make a big difference in your emotional life," added Kragel.

The study is published in the journal Science Advances.
