Facial recognition is real-life ‘Black Mirror’ stuff, Ocasio-Cortez says

Credit to Author: Lisa Vaas | Date: Fri, 17 Jan 2020 10:59:27 +0000

During a House hearing on Wednesday, Rep. Alexandria Ocasio-Cortez said that the spread of surveillance via ubiquitous facial recognition is like something out of the tech dystopia TV show “Black Mirror.”

This is some real-life “Black Mirror” stuff that we’re seeing here.

Call this episode “Surveil Them While They’re Obliviously Playing With Puppy Dog Filters.”

Wednesday’s was the third hearing on the topic for the House Oversight and Reform Committee, which is working on legislation to address concerns about the increasingly pervasive technology. In Wednesday’s hearing, Ocasio-Cortez called out the technology’s hidden dangers – one of which is that people don’t really understand how widespread it is.

At one point, Ocasio-Cortez asked Meredith Whittaker – co-founder and co-director of New York University’s AI Now Institute, who had noted in the hearing that facial recognition is a potential tool of authoritarian regimes – to remind the committee of some of the common ways that companies collect our facial recognition data.

Whittaker responded with a laundry list: she said that companies scrape our biometric data from sites like Flickr and Wikipedia, and collect it via platforms with “massive networked market reach,” such as Facebook.

Ocasio-Cortez: So if you’ve ever posted a photo of yourself to Facebook, then that could be used in a facial recognition database?

Whittaker: Absolutely – by Facebook and potentially others.

Ocasio-Cortez: Could using a Snapchat or Instagram filter help hone an algorithm for facial recognition?

Whittaker: Absolutely.

Ocasio-Cortez: Can surveillance camera footage that you don’t even know is being taken of you be used for facial recognition?

Whittaker: Yes, and cameras are being designed for that purpose now.

This is a problem, the New York representative suggested:

People think they’re going to put on a cute filter and have puppy dog ears, and not realize that that data’s being collected by a corporation or the state, depending on what country you’re in, in order to … surveil you, potentially for the rest of your life.

Whittaker’s response: Yes. And no, average consumers aren’t aware of how companies are collecting and storing their facial recognition data.

It’s “Black and brown Americans” who suffer the most from the ubiquity of this error-prone technology, Ocasio-Cortez said, bringing up a point from a previous hearing in May 2019: that the technology has the highest error rates for non-Caucasians.

Problems in facial recognition technology

At the May 2019 hearing, Joy Buolamwini, founder of the Algorithmic Justice League (AJL) – a nonprofit that works to illuminate the social implications and harms of artificial intelligence (AI) – had testified about how failures of facial analysis technologies have had “real and dire consequences” for people’s lives, including in critical areas such as law enforcement, housing, employment, and access to government services.

Buolamwini founded the AJL after experiencing such failure firsthand, when facial analysis software failed to detect her dark-skinned face until she put on a white mask. Such failures have been attributed to the lack of diversity within the population of engineers who create facial analysis algorithms. In other words, facial recognition achieves its highest accuracy rate when used with white male faces.
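For a concrete sense of what “highest error rates for non-Caucasians” means in practice, audits like Buolamwini’s compare a system’s error rate across demographic groups. Below is a minimal sketch of that kind of per-group tally in Python; the records, group labels, and numbers are made up purely for illustration:

```python
from collections import defaultdict

# Hypothetical audit records: (demographic_group, predicted_label, true_label).
# Real audits, such as Buolamwini's, use thousands of labeled face images.
results = [
    ("lighter-skinned male", "match", "match"),
    ("lighter-skinned male", "match", "match"),
    ("darker-skinned female", "no-match", "match"),
    ("darker-skinned female", "match", "match"),
    ("darker-skinned female", "no-match", "match"),
]

errors = defaultdict(int)
totals = defaultdict(int)

for group, predicted, actual in results:
    totals[group] += 1
    if predicted != actual:
        errors[group] += 1

# Report the per-group error rate; large gaps between groups are
# exactly the disparity the hearings were about.
for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.0%} error rate over {totals[group]} samples")
```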

Here’s Buolamwini in the May hearing:

If you have a case where we’re thinking about putting, let’s say, facial recognition technology on police body cams, in a situation where you already have racial bias, that can be used to confirm [such bias].

In Wednesday’s hearing, Ocasio-Cortez said that the worst implications are that a computer algorithm will suggest that a Black person has likely committed a crime when they are, in fact, innocent.

Because facial recognition is being used without our consent or knowledge, she suggested, we may be mistakenly accused of a crime and have no idea that the technology has been used as the basis for the accusation.

That’s right, the AI Now Institute’s Whittaker said, and there’s evidence that the use of facial recognition is often not disclosed. That lack of disclosure is compounded by our “broken criminal justice system,” Ocasio-Cortez said, where people often aren’t allowed to access the evidence used against them.

Case in point: the case of Willie Lynch, of Jacksonville, Florida. A year ago, Lynch asked to see photos of other potential suspects after being arrested for allegedly selling $50 worth of crack to undercover cops. The police investigation had relied on facial recognition: the cops had taken poor-quality smartphone photos of the drug dealer and sent them to a facial recognition technology expert, who matched them to Lynch.

Despite Lynch’s constitutional right to see the evidence used against him, a state appellate court decided that he had no legal right to see the other matches returned by the facial recognition software that helped put him behind bars, even though the algorithm had expressed only one star of confidence that it had generated the correct match.

From the American Civil Liberties Union’s (ACLU’s) writeup of the case:

Because the officers only identified him based on the results of the face recognition program, looking into how the face recognition algorithm functioned and whether errors or irregularities in the procedure tainted the officers’ ID was critical to his case. But when Lynch asked for the other booking photos the program returned as possible matches, both the government and the court refused. Their refusal violated the Constitution, which requires prosecutors to disclose information favorable to a defendant.
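For context on what “other booking photos the program returned as possible matches” means technically: face recognition systems typically convert each face into a numeric embedding and return a ranked list of candidates by similarity, each with a confidence score, rather than a single definitive answer. Here is a minimal sketch of such a pipeline using the open-source face_recognition library; the file names are hypothetical, and this is illustrative only, not the software used in the Lynch case:

```python
# Illustrative only: the actual system used in the Lynch case is not public.
# Uses the open-source face_recognition library (pip install face_recognition).
import face_recognition

# Hypothetical probe photo (e.g., a low-quality smartphone shot) and a
# gallery of booking photos to search against.
probe = face_recognition.load_image_file("probe.jpg")
gallery_paths = ["booking_001.jpg", "booking_002.jpg", "booking_003.jpg"]

probe_encodings = face_recognition.face_encodings(probe)
if not probe_encodings:
    raise SystemExit("No face found in the probe photo")
probe_encoding = probe_encodings[0]

# Encode every gallery face, skipping photos where no face is detected.
gallery = []
for path in gallery_paths:
    encodings = face_recognition.face_encodings(
        face_recognition.load_image_file(path))
    if encodings:
        gallery.append((path, encodings[0]))

# face_distance returns one distance per gallery face; lower means more similar.
distances = face_recognition.face_distance(
    [enc for _, enc in gallery], probe_encoding)

# The system produces a ranked list of candidates, not one answer; this is
# why the other, lower-ranked matches were relevant evidence in Lynch's case.
for (path, _), dist in sorted(zip(gallery, distances), key=lambda p: p[1]):
    print(f"{path}: distance {dist:.3f}")
```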

Ocasio-Cortez, in Wednesday’s hearing:

These technologies are almost automating injustices, both in our criminal justice system but also automating biases that compound on the lack of diversity in Silicon Valley, as well.

C-SPAN has full coverage of the three hours of testimony given in Wednesday’s hearing.
