AI Generates Personally Attractive Portraits by Reading Brain Waves
Differing conceptions of what "beauty" is have always been a challenging topic, but one thing is for sure: what we find attractive in other people's faces is stored in our minds, with cultural and psychological factors playing an unconscious role in our personal tastes.
Now, researchers at the University of Helsinki and the University of Copenhagen have used electroencephalography (EEG) measurements to teach an AI our subjective notions of what makes faces attractive, according to findings published in IEEE Transactions on Affective Computing.
The experiment, which was conducted with 30 volunteers, "worked a bit like the dating app Tinder," explained senior researcher Michiel Spapé from the Department of Psychology and Logopedics, University of Helsinki.
Combining computer science and psychology
The researchers used a generative adversarial network (GAN) to create hundreds of artificial portraits. One by one, the images were shown to the volunteers, who paid attention to the faces they found attractive while wearing elastic caps fitted with electrodes to measure their brain activity. When participants found a face attractive, they simply had to look at it instead of swiping right.
The measured neural activity was then analyzed and fed back to the GAN to interpret the brain responses in terms of how attractive the viewer considered each face. "A brain-computer interface such as this is able to interpret users' opinions on the attractiveness of a range of images," said Academy Research Fellow and Associate Professor Tuukka Ruotsalo, who heads the project.
"By interpreting their views, the AI model interpreting brain responses and the generative neural network modeling the face images can together produce an entirely new face image by combining what a particular person finds attractive."
The newly generated faces, tailored to each participant, were tested in a double-blind procedure against matched controls. The new images matched the subjects' preferences with an accuracy of over 80 percent.
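The pipeline described above — score each viewed face from the brain response, then combine the latent codes of the highest-scoring faces into a new, personalized image — can be sketched as follows. This is a minimal illustration, not the authors' actual method: the dimensions, the random data, and the linear stand-in for a trained EEG classifier are all hypothetical, and the GAN's generator step is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 portraits, each represented by a 512-dim GAN
# latent vector, plus one EEG feature vector recorded per viewing.
# All dimensions and values here are illustrative, not from the study.
n_images, latent_dim, eeg_dim = 200, 512, 64
latents = rng.normal(size=(n_images, latent_dim))
eeg_features = rng.normal(size=(n_images, eeg_dim))

# Stand-in for a trained EEG decoder: a weight vector mapping EEG
# features to an "attractiveness" score. In practice this would be
# learned from labelled brain responses (e.g. a linear classifier).
decoder_weights = rng.normal(size=eeg_dim)
scores = eeg_features @ decoder_weights  # higher = stronger response

# Personalized latent code: average the latents of the faces that
# evoked the strongest responses, then decode with the GAN generator.
top = np.argsort(scores)[-10:]               # ten strongest responses
personal_latent = latents[top].mean(axis=0)  # combined latent code
# new_face = generator(personal_latent)      # GAN decode step (not shown)

print(personal_latent.shape)
```

The key design idea this sketch captures is that the EEG side only has to produce a scalar preference signal per image; the generative side does the heavy lifting of turning a point in latent space back into a plausible face.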
"The study demonstrates that we are capable of generating images that match personal preference by connecting an artificial neural network to brain responses. Succeeding in assessing attractiveness is especially significant, as this is such a poignant, psychological property of the stimuli. Computer vision has thus far been very successful at categorizing images based on objective patterns. By bringing in brain responses to the mix, we show it is possible to detect and generate images based on psychological properties, like personal taste," Spapé further explained.
The study is admittedly small; however, it suggests that refined AI systems are becoming better at understanding what makes people tick. The researchers state that the approach could potentially be geared toward detecting stereotypes or implicit bias, while deepening its understanding of individual differences.