A report published by the American Civil Liberties Union (ACLU) indicates that Amazon's facial recognition system, Rekognition, falsely matched 28 members of Congress with arrest mugshots. Making matters worse, nearly 39 percent of the false matches were people of color, even though they make up a far smaller share of Congress.
In response to the publication, Amazon told The New York Times that the ACLU failed to use the software at the recommended settings. Amazon recommends a confidence threshold of 95 percent for law enforcement use, but the ACLU ran its test at only 80 percent, the default setting.
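To see why the threshold matters, here is a minimal sketch of how a similarity cutoff filters candidate matches. The scores and names are invented for illustration; the filtering logic mirrors the effect of Rekognition's `SimilarityThreshold` parameter, but this is not a call to the actual AWS API.

```python
# Hypothetical face-match results with invented similarity scores (0-100).
# Real Rekognition responses contain similar per-match confidence values.
matches = [
    {"name": "Person A", "similarity": 96.2},
    {"name": "Person B", "similarity": 88.7},
    {"name": "Person C", "similarity": 81.3},
    {"name": "Person D", "similarity": 74.9},
]

def filter_matches(candidates, threshold):
    """Keep only candidates at or above the similarity threshold,
    mimicking the effect of a SimilarityThreshold setting."""
    return [m for m in candidates if m["similarity"] >= threshold]

# At the 80 percent setting the ACLU used, three faces count as matches;
# at the 95 percent setting Amazon recommends for law enforcement, only one does.
print(len(filter_matches(matches, 80)))  # 3
print(len(filter_matches(matches, 95)))  # 1
```

The point of the comparison: lowering the threshold does not make the model more accurate, it only makes it more permissive, which inflates the number of (possibly false) matches returned.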
Amazon also urges users to treat Rekognition's output as a starting point and to confirm its matches through human review, maintaining that it is up to the end user to deploy any software in a responsible and safe manner.
While the ACLU's test may look like a false alarm, no law prevents any organization, or even the police, from setting the confidence threshold lower than recommended. Moreover, there is no US legislation governing the use of facial recognition software at all. For this very reason, Amazon has faced severe backlash for selling the technology to law enforcement agencies.
It is essential that the stakeholders involved, be they governments, law enforcement agencies, or technology companies, address why the algorithms they develop perform worse on darker skin tones. Otherwise, misidentification by Amazon's AI could have serious consequences.