Although certain celebrities are widely considered to be nice-looking, beauty does still ultimately lie in the eye of the beholder. A new AI-based system is able to ascertain which features are found most attractive by individual people, and then create faces combining those qualities.
Led by Assoc. Prof. Tuukka Ruotsalo, scientists from the University of Helsinki and the University of Copenhagen started by getting a generative adversarial network (GAN) to produce hundreds of lifelike computer-generated portraits.
Those facial images were then shown on a computer screen, one at a time, to a total of 30 test subjects. Each person was instructed to focus their attention on the faces they found most attractive, while the electrical activity of their brain was recorded using electroencephalography (EEG).
Machine learning-based algorithms subsequently determined which faces produced the greatest amount of activity for each person, then established which traits those faces had in common. Based on that data, the neural network then proceeded to produce new faces that combined those traits.
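The broad strokes of that pipeline can be sketched in a few lines. Everything here is illustrative, not the team's actual code: the latent dimension, the stand-in generator, and the per-face EEG scores are all hypothetical; the one idea taken from the article is scoring shown faces by brain response, keeping the strongest, and combining their traits in the generator's latent space.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8  # hypothetical; real GANs use much larger latent spaces


def generate_face(latent):
    """Stand-in for a trained GAN generator mapping a latent vector to a face."""
    return np.tanh(latent)  # placeholder for the real generator network


# Hypothetical data: latent vectors of the 100 faces shown to one subject,
# plus a per-face score from the EEG classifier (higher = stronger
# attraction-related brain activity).
latents = rng.normal(size=(100, LATENT_DIM))
eeg_scores = rng.uniform(size=100)

# Keep the faces whose EEG responses were strongest...
top = latents[eeg_scores > np.quantile(eeg_scores, 0.8)]

# ...then combine their shared traits by averaging in latent space and
# feeding the result back through the generator to make a new face.
combined = top.mean(axis=0)
new_face = generate_face(combined)
```

Averaging in latent space is a common way to blend traits of GAN outputs, which makes it a plausible mechanism for "combining" the attractive features, though the paper's exact method may differ.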
In a double-blind experiment, those new faces were then shown to each test subject, along with images of many other faces. Eighty-seven percent of the time, the individual selected the new faces as being amongst the most attractive – a figure the researchers expect to rise as the technology is developed further.
It is hoped that the team’s findings could ultimately be used to help computer systems understand subjective preferences, and perhaps also to identify people’s unconscious attitudes.
“The study demonstrates that we are capable of generating images that match personal preference by connecting an artificial neural network to brain responses,” says senior researcher Michiel Spapé. “Computer vision has thus far been very successful at categorizing images based on objective patterns. By bringing in brain responses to the mix, we show it is possible to detect and generate images based on psychological properties, like personal taste.”
A paper on the research was recently published in the journal IEEE Transactions on Affective Computing.
Source: University of Helsinki