Japanese Android Child's Face Can Now Convey Human Feelings
The Japanese have always loved their robots; there is no doubt about that. Recently, however, a group of researchers at Osaka University has taken this love to the next level.
Their child android face, called Affetto, is capable of making smooth, swift movements that aim to achieve child-like interaction. The team has developed a method to identify and quantitatively evaluate the facial movements of the android head.
The first generation of this android robot head was published in 2011. Now, with the second generation, the team has taken a great leap forward.
The head has been upgraded so that its expressions are more child-like and convey feelings more effectively. It therefore has a wider and deeper range of emotion, which can lead to better interaction with humans.
Affetto’s facial expressions
The researchers were comprehensive in studying Affetto's facial expressions, beginning with an investigation of 116 different facial points.
These points let them measure and understand the face's movements in three dimensions. The facial points they identified are underpinned by deformation units.
Each deformation unit creates a specific facial contortion, such as lowering or raising an eyelid or making similar movements with the lips, to render the face more human-like.
Finally, the team applied a mathematical model to quantify the surface motion patterns produced by these units. This allowed them to adjust the deformation units so that Affetto's face produces precise facial surface motions.
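The article does not spell out the underlying mathematics, but the general idea of relating deformation-unit activations to measured surface motion can be sketched as a simple linear model. The sketch below is purely illustrative: the unit count, the displacement basis `B`, and the least-squares fit are assumptions for this example, not the authors' published method.

```python
import numpy as np

# Toy sketch (assumed, not the published model): each of P facial
# points is tracked in 3D, and each of U deformation units contributes
# a fixed displacement pattern. Stacking those patterns as columns of
# a basis matrix B gives a linear surface-motion model d = B @ a,
# where a holds the unit activations and d the point displacements.

rng = np.random.default_rng(0)

P = 116   # facial points tracked on the android's face (from the study)
U = 12    # number of deformation units -- an assumed figure
B = rng.normal(size=(3 * P, U))         # per-unit displacement patterns
a_true = rng.uniform(0.0, 1.0, size=U)  # hidden "true" activations
d_meas = B @ a_true + rng.normal(scale=0.01, size=3 * P)  # noisy capture

# Recover the activations that best reproduce the measured motion
# (ordinary least squares, one plausible way to quantify the data).
a_fit, *_ = np.linalg.lstsq(B, d_meas, rcond=None)
print(np.round(a_fit - a_true, 3))      # per-unit fitting error
```

Under a model like this, commanding a target expression reduces to choosing activations whose predicted surface motion matches the desired one, which is one way to read the researchers' success in adjusting the deformation units.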
However, improving the synthetic skin and applying force evenly across it proved challenging.
"Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms. Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning," said Hisashi Ishihara, the first author of the study.
Previous attempts at android faces
The present version of the child android face is designed to show the expressions of a one- to two-year-old child. It is being used to study the early stages of human social development.
As mentioned above, the android face was first revealed in 2011. There have been many similar attempts in the past in which researchers tried to understand the depth of interaction possible between people and child robots.
However, those earlier robots lacked a realistic child's face, which hurt the interaction immensely and kept researchers from understanding its intricacies.
In many cases, people could not talk to the robots as naturally as they would to other humans. With this new child-like android head, the researchers hope to overcome that challenge and move the human-robot relationship forward.
The researchers' aim is for these robots to share their emotions with human caregivers, an excellent step for improving caregiving services and human-robot interaction alike.
We can only wait to see what the future holds.
Via: Osaka University