NY1: Facial recognition tech draws new criticism amid MSG controversy

Walking into the local bodega, passing through the subway turnstiles and going to a show at Madison Square Garden are all regular New York City occurrences. They’re also situations in which New Yorkers’ faces could end up on a computer screen or get scanned for biometrics - all without their consent.

In recent weeks, Madison Square Garden owner James Dolan’s use of facial recognition at the venue has drawn new scrutiny to the technology. Dolan has been using the technology to ban attorneys involved in litigation against him from entering the arena.

It’s not just privacy or accountability issues, however, that are raising red flags for critics of the technology. Opponents say there are racial discrepancies - not only in where this technology is used, but in the rate at which it misidentifies people of color.

Now, activists and lawmakers are trying to sound the alarm on the technology’s potential to cause harm, particularly to brown and Black New Yorkers, before its use becomes even more widespread.

Brad Hoylman-Sigal, a Democrat, is working on legislation in Albany that would put a pause on the use of facial recognition technology by law enforcement until usage guidelines are established, including a framework for the disposal of the collected images and data.

“With facial recognition technology, essentially everyone is a suspect,” Hoylman-Sigal said in an interview with NY1. “When you walk into a venue like Madison Square Garden, your face is scanned. Not just your face, though, I should add also your expression, your feelings, some facial recognition technology can actually lip read.”

“Where is it used after, say, Jim Dolan screens you as you walk into a Knicks game?” he asked. “That really is concerning to me and other public officials.”

‘Automating the bias’

The NYPD, for its part, has deployed facial recognition technology since at least 2011. But details on how that technology is used remain murky, and its accuracy has been called into question by critics.

Researchers have found that the technology is more likely to misidentify people of color, trans people, and women than middle-aged white men.

One 2019 study titled “Racial Faces in the Wild: Reducing Racial Bias by Information Maximization Adaptation Network” found that four major commercial facial recognition tools and four state-of-the-art facial recognition tools misidentified white faces in pair matching 10% of the time, while Black and Asian faces were misidentified nearly twice as often.

A study by the National Institute of Standards and Technology, also released in 2019, indicated higher error rates for pair matching of Black and Asian faces, with error rates up to 100 times higher for Black and Asian people than for white people on certain facial recognition systems.

And a Massachusetts Institute of Technology study in 2018 found the rate of error was highest when it came to dark-skinned Black women. “Essentially, for those women, the system might as well have been guessing gender at random,” MIT said in a news release about the study.

Opponents argue that in a city that has a history of targeting Muslims after the Sept. 11, 2001 terror attacks and using racist law enforcement practices like “stop and frisk,” the use of facial recognition technology is essentially a way to codify prejudice. In 2013, a federal judge ruled that the use of “stop and frisk” was racially discriminatory and unconstitutional under the Fourth and 14th amendments.

New Yorkers living in areas at greater risk of stop-and-frisk policing are more likely to be exposed to facial recognition technology, according to research conducted by Amnesty International.

“When we think about the average human who has a certain level of bias, we can think about how creating technology - especially facial recognition technology - is a man-made creation and so it’s automating the bias that is already there,” Attiya Latif, a NYC-based staff organizer with Amnesty International, said.