Minority Report: San Francisco Bans Facial Recognition Tech Over Potential Bias, Privacy Concerns
Privacy advocates in the Bay Area have cause for celebration after San Francisco became the
first municipality in the United States to pass an ordinance barring the city’s use of facial recognition technology because of the “propensity [of] facial recognition technology [to] endanger civil rights and civil liberties” and “exacerbate racial injustice.”
The ordinance prohibits the police and other San Francisco governmental agencies from using facial recognition technology for any purpose with an exception for “inadvertent or unintentional” access to or use of the technology. The ordinance also requires city departments seeking to acquire other kinds of surveillance technology, or entering into agreements to receive information from non-city owned surveillance technologies, to obtain Board of Supervisors approval and submit a “Surveillance Impact Report.” Importantly, the ordinance does not prohibit private citizens — including businesses — from using such technologies.
Oakland and Berkeley are
considering similar bans, as is Somerville, Massachusetts. Other cities like Chicago and Detroit have moved in the opposite direction, implementing widespread real-time video surveillance as a means of crime prevention.1 New York and Orlando are considering similar roll-outs.2
These rollouts may be cause for concern, as the potential for abuse of facial recognition technology is significant. Indeed, a report from Georgetown Law’s Center on Privacy & Technology noted that “[f]ace recognition is less accurate than fingerprinting,” and “without specialized training, human users make the wrong decision about a match half the time.”3 Moreover, errors and biases in facial recognition technology may disproportionately affect persons of color.4 This is in part because the image databases used by law enforcement are disproportionately composed of African Americans and other minority groups, and also because facial recognition software misidentifies people of color at higher rates than whites.5
Beyond the inherent flaws in current facial recognition technology, there are opportunities for misuse by law enforcement. For example, another report from Georgetown Law’s Center on Privacy & Technology describes how the NYPD’s Facial Identification Section (FIS) “
got creative” when surveillance images of a man caught on video stealing from a store returned no matches after detectives ran them through the face recognition algorithm. Noting that the man on the video bore a passing resemblance to the actor Woody Harrelson, an FIS officer located an online photo of the celebrity and submitted it for possible matches in place of the surveillance images, and the search returned a match. The match, however, was to the actor, not to the actual suspect caught on tape. Law enforcement agencies around the country have likewise used police sketches, either hand drawn or computer generated, to search for matches in lieu of actual photos, despite compelling evidence that such sketches return accurate results less than 10% of the time.6
The potential for such abuses underscores the need for more local, state, and federal regulation in the space. Currently, there are no federal laws that govern the use of facial recognition technology. Policy advocates at the Electronic Frontier Foundation have argued that regulations could be modeled on existing laws governing other technologies, like the Wiretap Act or the Video Privacy Protection Act, and have emphasized that any regulation should follow certain best practices, such as clear rules for data collection and limits on what types of data may be stored and retained, and for how long.7 With few national rules in place, how police departments use facial recognition technologies is, for the time being, still a local concern.
Visit our website to learn more about V&E’s Government Investigations & White Collar Criminal Defense practice. For more information, please contact Vinson & Elkins lawyers Jessica Heim or Michael Hoosier.
1 Clare Garvie & Laura M. Moy, America Under Watch, Georgetown Law Center on Privacy & Technology, May 16, 2019, https://www.americaunderwatch.com/.
2 See id.
3 Clare Garvie, Alvaro Bedoya & Jonathan Frankle, The Perpetual Line-Up: Unregulated Police Face Recognition in America, Georgetown Law Center on Privacy & Technology, Oct. 18, 2016, https://www.perpetuallineup.org/.
4 See Jennifer Lynch, Face Off: Law Enforcement Use of Face Recognition Technology, Electronic Frontier Foundation, Feb. 12, 2018, https://www.eff.org/wp/law-enforcement-use-face-recognition#_idTextAnchor044.
6 Clare Garvie, Garbage In, Garbage Out: Face Recognition on Flawed Data, Georgetown Law Center on Privacy & Technology, May 16, 2019, https://www.flawedfacedata.com/.
7 See Face Off: Law Enforcement Use of Face Recognition Technology, supra n. 4.