Smartphones can now be unlocked with face authentication. We live in a world where faces not only identify individuals but are also used to discern their emotions: systems read facial expressions and guess what a person is thinking. It may sound useful, but is it safe? AI Now, an AI research institute at New York University, has released a report warning of the dangers of facial recognition technology.
The danger the report highlights is that emotion recognition data can heavily influence decisions made about people in a variety of fields. It would be alarming, for example, if readings of an individual's facial expressions affected whether they passed a job interview, what they paid for insurance, or what grades they received in school.
Most troubling of all, emotion recognition is not yet accurate. The report cites several cases illustrating this. A system developed by Sound Intelligence, used in schools and hospitals to detect agitation, has reportedly flagged a simple cough as a sign of agitation. When Face++ and the Microsoft Face API were used to analyze photos of NBA players, black players were consistently scored as more aggressive than other players. Systems introduced in Chinese schools to track student concentration have also proved unreliable: study postures differ from person to person, so a student who merely looks attentive while not concentrating at all, or one who concentrates while glancing sideways, is misjudged, because the system rewards the outward appearance of concentration rather than concentration itself.
The report concludes that emotion recognition products lack scientific basis and accuracy. On that basis, it advises governments and corporations to refrain from using them until the scientific understanding of emotion perception has deepened and the technology has matured, and urges the industry to take racial and gender bias far more seriously.
The report also stresses the rights of the people whose data is collected. Citing the Illinois Biometric Information Privacy Act (BIPA), it urges strengthening the right to seek redress when biometric data is collected and used without consent. It further argues that employees should have the right to refuse when monitoring tools are deployed against them, and that companies should disclose when they develop surveillance tools for government use.
Nor is emotion recognition some distant prospect; it is already in use. Disney tracks viewers' facial expressions to gauge reactions to its movies, and in 2016 Apple acquired Emotient, a startup that developed an emotion recognition tool called Facet. In China, facial recognition technology has already been deployed against Muslims and Hong Kong protesters. What happens when emotion recognition is layered on top of that? Facial expressions are anything but simple. We should remind ourselves of the dangers that come from placing too much confidence in AI. Related information can be found here.