When non-human things such as paintings or robots are made to resemble humans, the more faithful the likeness, the more people tend to like them. Beyond a certain level of resemblance, however, the object instead provokes discomfort and unease. This is called the uncanny valley. The uncanny valley has long been a challenge in developing human-like robots, and a recent study investigating why the phenomenon occurs is drawing attention.
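To make the shape of that relationship concrete, here is a minimal Python sketch that plots a hypothetical uncanny valley curve. The functional form and all constants are invented purely for illustration; they are not measurements and do not come from the study.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical illustration of the uncanny valley: affinity rises with
# human-likeness, dips sharply just short of full human-likeness, then
# recovers. The curve shape and constants here are invented for
# illustration only; they are not taken from the study.
likeness = np.linspace(0.0, 1.0, 200)  # 0 = machine-like, 1 = fully human
baseline = likeness                     # affinity grows with likeness
valley = 0.9 * np.exp(-((likeness - 0.85) ** 2) / (2 * 0.05 ** 2))
affinity = baseline - valley            # subtract a dip near likeness ~0.85

plt.plot(likeness, affinity)
plt.xlabel("human-likeness")
plt.ylabel("affinity (arbitrary units)")
plt.title("Hypothetical uncanny valley curve")
plt.show()
```

The key feature is the dip just before full human-likeness: affinity climbs steadily, drops abruptly for almost-human stimuli, then recovers once the likeness becomes convincing.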
As technology improves, lifelike human-shaped robots and 3D CG models are being created. However, human-like objects made this way elicit disgust or displeasure when they fall into the uncanny valley. One expert pointed out that resemblance to human form and behavior can cut both ways: the closer the resemblance, the greater the risk of triggering the uncanny valley.
A study recently published in the Journal of Neuroscience was conducted by neuroscientists and psychologists in Britain and Germany. Identifying the brain mechanisms at work when the uncanny valley occurs could be a first step toward improving people's responses to realistic human-like robots and CG characters.
The team says the uncanny valley is an interesting phenomenon for neuroscience: it implies the existence of a neural mechanism that judges whether a given sensory input, such as a picture of a robot, feels human or non-human.
To investigate the neural mechanisms behind the uncanny valley, the research team ran two tests on 21 subjects while recording their brain activity with fMRI.
In the first test, subjects were shown a series of pictures of humans and robots and asked to rate how likable each was and how human it felt. In the second test, subjects were asked to decide which of the humans and robots in the photos they would allow to choose a gift on their behalf. By measuring brain activity during the two tests, the team attempted to identify which brain regions produce the uncanny valley sensation.
The results showed that some areas near the visual cortex, which processes visual information, produce brain signals encoding human-likeness. Further activity linked to the uncanny valley was observed in parts of the prefrontal cortex. Previous studies have identified the prefrontal cortex as an area housing a valuation system that judges all kinds of stimuli; for example, it represents the reward value of social stimuli such as a pleasant feeling.
The study showed that two parts of the medial prefrontal cortex play an important role in the uncanny valley. One appears to convert the human-likeness signal into a detection signal indicating that a human has been found. The other is said to integrate the human-likeness signal with the evaluation of likability. Through these two functions, when we look at something that resembles a human, the brain checks whether it is actually a human and connects that judgment directly to the evaluation of likability.
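One way to picture this two-stage account is a toy model in which a sigmoid "human detector" reads the human-likeness signal and a valuation stage penalizes stimuli the detector cannot classify cleanly. The sketch below assumes these hypothetical functions and weights; none of them come from the paper, and it is only meant to show how such a mechanism could produce a dip for almost-but-not-quite-human inputs.

```python
import numpy as np

def human_detection(likeness, threshold=0.8, sharpness=20.0):
    """Toy 'human detector': a sigmoid that flips from 0 to 1 as the
    human-likeness signal crosses a threshold. Purely illustrative;
    the study does not specify this functional form."""
    return 1.0 / (1.0 + np.exp(-sharpness * (likeness - threshold)))

def likability(likeness, w_likeness=1.0, w_conflict=1.5):
    """Toy integration stage: likability grows with human-likeness but
    is penalized when the detector is maximally uncertain (output near
    0.5), i.e. for stimuli that are almost but not quite human. The
    weights are invented for illustration."""
    detect = human_detection(likeness)
    conflict = 4.0 * detect * (1.0 - detect)  # peaks when detector output is 0.5
    return w_likeness * likeness - w_conflict * conflict

# Likability rises with likeness, collapses near the detector's decision
# boundary, and recovers for convincingly human stimuli.
for x in (0.2, 0.6, 0.8, 0.95):
    print(f"likeness={x:.2f}  detect={human_detection(x):.2f}  "
          f"likability={likability(x):+.2f}")
```

Under these made-up parameters, likability climbs from machine-like to moderately human stimuli, turns sharply negative right at the detector's threshold, and recovers once the stimulus is classified confidently as human, mirroring the valley shape described above.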
The medial prefrontal cortex was active when subjects considered whether the robot in a picture should be allowed to choose a gift for them. On the other hand, when subjects rejected gifts from artificially created human-like agents such as robots, the amygdala, which governs emotional reactions, was active. There were also individual differences in how strongly subjects accepted or rejected the robots.
The findings of this study could influence the design of robots that look more like humans. The research team explains that the valuation signals generated in these brain regions can be changed by social experience. The team also says this is the first study to show individual differences in the strength of the uncanny valley effect: some people react to it strongly, while others barely react at all. This means there is no single robot design that every user will love, nor one that will frighten every user. Related information can be found here.