Microsoft abandons tools for recognizing emotions, gender, and age

On June 21, Microsoft announced that it will no longer sell technologies designed to determine a person's gender, age, and emotions. The company took this step in part because of criticism from scientists and activists who argued that such software could be "biased and unreliable," the NYT reports.

Microsoft will gradually phase out its artificial intelligence tools for recognizing emotions, gender, and age. New customers will not get access to these features, and existing ones will lose it within a year. The company took this step in part because of criticism from scientists and activists who argued that such software could be "biased, unreliable or invasive." Tools for analyzing age, gender, and emotions can be useful for interpreting images for people with visual impairments. However, "there are a huge number of cultural, geographical and individual differences in the way we express ourselves," says Natasha Crampton, Microsoft's chief responsible AI officer. In particular, she added, the system's so-called gender classifier was binary, "and this does not correspond to the values" of Microsoft.

The changes are part of Microsoft's broader effort to exercise tighter control over its artificial intelligence products. In particular, the company is now guided by its 27-page "Responsible AI Standard," which stipulates that AI must not negatively affect society.

Microsoft will also change the terms of use for its facial recognition feature. Those who want to use it must apply for access and explain exactly how they will use the tool. Uber, for example, uses the software in its app to verify that a driver's face matches the ID on their account.

AI-based tools are regularly criticized. For example, in 2020 researchers found that speech-to-text programs developed by Microsoft, Apple, Google, IBM, and Amazon were less accurate for Black speakers. Microsoft's system performed best, but it still misidentified 15% of words for white speakers and 27% for Black speakers.

Earlier, DOU reported on a Google engineer who believes that the AI chatbot system he has been working on since the fall of 2021 can think and express its own opinions and feelings like a seven-year-old child. The company examined the employee's claims and suspended him from work for violating its confidentiality policy.