Techrecipe

How far can AI read human emotions?

However advanced AI has become in recent years, recognizing human emotions remains difficult for computers. Experts argue that emotion-recognition AI has a fundamental flaw, and a browser game developed by a research team at the University of Cambridge was created to demonstrate why.

The Emojify Project watches the user's facial expressions through a PC webcam and tries to interpret the emotions behind them. The AI classifies expressions into six emotions: joy, sadness, fear, surprise, disgust, and anger.
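
To make the setup concrete, the sketch below shows roughly how such a webcam pipeline can be put together. It is an illustration only, not the Emojify Project's own code: it assumes the open-source fer package (whose pretrained classifier outputs a similar, though not identical, set of expression labels) and OpenCV for camera capture.

```python
# A minimal sketch of webcam-based expression classification, similar in
# spirit to what the Emojify Project does. Assumes "pip install fer opencv-python";
# this is NOT the project's actual code.
import cv2
from fer import FER

def read_emotion_from_webcam() -> None:
    detector = FER()            # pretrained facial-expression classifier
    cap = cv2.VideoCapture(0)   # default PC webcam
    ok, frame = cap.read()      # grab a single frame
    cap.release()
    if not ok:
        raise RuntimeError("Could not read a frame from the webcam")

    # Each result contains a face bounding box and a probability per expression label.
    results = detector.detect_emotions(frame)
    if not results:
        print("No face detected")
        return
    emotions = results[0]["emotions"]
    # Pick the highest-scoring label -- the "smile means happy" logic the
    # researchers criticise: the model only sees the surface expression.
    label = max(emotions, key=emotions.get)
    print(f"Predicted expression: {label} ({emotions[label]:.2f})")

if __name__ == "__main__":
    read_emotion_from_webcam()
```

The final step is the crux of the researchers' argument: the model can only report which trained expression label the face most resembles, not what the person actually feels.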

However, the Emojify AI does not identify emotions accurately; even when the user deliberately exaggerates their expressions, it struggles to recognize them. That shortcoming is precisely the point the AI was built to make.

AI that guesses emotions from facial expressions essentially assumes people are being sincere: when someone smiles, it is read as a happy expression. Human expressions are not that simple, though. Almost everyone has at some point pretended to be calm while suppressing anger. A 2019 review by the American Psychological Association (APA) reported that human emotions cannot be reliably inferred from facial expressions alone. Even in this browser game, the player is asked to act out the six emotions in sequence, but what they actually feel inside does not necessarily correspond to those six expressions.

In another mini-game, the user is asked to tell the difference between a wink and a blink. What the machine actually reads, however, may be nothing more than closed eyes, or eyelids shutting because of dust in the eye, which shows how hard the distinction is to make. Humans can still pick up, even unconsciously, on surrounding context and gestures and the emotions and personality embedded in them, but expression-recognition technology alone cannot yet do so.

Today, reading emotions from facial expressions is treated as an important signal in facial-recognition applications. Examples include scoring candidates' employability in hiring interviews, flagging potential terrorists, and judging whether an employee is doing their job well. Facial-recognition AI also shows differences in accuracy across racial groups, and emotion-recognition systems have been reported to assign more negative emotions to Black faces.

Through this project, the research team hopes to explain the shortcomings of AI emotion recognition and spark discussion about its use. The team says the goal is to promote public understanding of these technologies and to give more citizens a voice in how they are developed and used. Related information can be found here.