Facial Recognition Technology: First and Fourth Amendment Implications
On October 18, 2016, the Georgetown Law Center on Privacy & Technology released a report on the use of facial recognition technology by law enforcement agencies throughout the country. Clare Garvie et al., The Perpetual Line-Up: Unregulated Police Face Recognition in America 1 (2016), https://www.perpetuallineup.org/sites/default/files/2016-12/The%20Perpetual%20Line-Up%20-%20Center%20on%20Privacy%20and%20Technology%20at%20Georgetown%20Law%20-%20121616.pdf. According to the report, over 117 million American adults are subject to face scanning programs because their photographs appear in a law enforcement database. Id. The report highlights the risks of such programs and calls for legislative oversight at the state and federal levels. See id. at 1–6.
Face recognition is the automated process of comparing two images of faces to determine whether they represent the same individual. Id. at 9. Once a face is detected, the algorithm extracts features from the face, such as eye position or skin texture, that can be numerically quantified. Id. The algorithm compares pairs of faces and issues a numerical score reflecting the similarity of their features. Id. The problem, however, is that such programs do not produce binary “yes” or “no” answers. Id. Instead, the algorithm identifies more likely or less likely matches, which law enforcement agencies use as candidates for further investigation. Id.
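The score-and-rank process the report describes can be sketched in code. The following Python example is a hypothetical illustration only, not the algorithm any agency actually uses: the feature vectors, gallery entries, and cosine-similarity metric are assumptions chosen to show why the system returns ranked candidates rather than a binary "yes" or "no."

```python
import math

def similarity(a, b):
    # Cosine similarity between two numeric feature vectors
    # (stand-ins for quantified features such as eye position
    # or skin texture). Higher scores mean more similar faces.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_candidates(probe, gallery, top_n=3):
    # The system does not answer "yes" or "no"; it scores every
    # enrolled face against the probe image and returns the most
    # likely matches as candidates for further investigation.
    scored = [(name, similarity(probe, feats)) for name, feats in gallery.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_n]

# Hypothetical feature vectors "extracted" from enrolled photos.
gallery = {
    "enrollee_a": [0.90, 0.10, 0.40],
    "enrollee_b": [0.20, 0.80, 0.30],
    "enrollee_c": [0.88, 0.15, 0.35],
}
probe = [0.91, 0.12, 0.38]  # features from the unknown face

for name, score in rank_candidates(probe, gallery):
    print(f"{name}: {score:.3f}")
```

Note that every enrollee receives a score, including people who look nothing like the probe; a human analyst must decide where the candidate list stops being useful, which is part of why the report treats these outputs as investigative leads rather than identifications.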
Historically, law enforcement biometric databases, such as fingerprint and DNA databases, have been exclusively or primarily populated by criminal or forensic samples. Id. at 20. By law, the Federal Bureau of Investigation’s (FBI) national DNA database, known as the National DNA Index System, is almost exclusively composed of DNA profiles related to criminal arrests or investigations. See id.; 42 U.S.C. § 14132(b) (2012). The FBI’s face recognition unit, FACE Services, departs from this trend by utilizing sixteen states’ driver’s license databases, American passport photos, and photos from visa applications. Garvie et al., supra, at 20. Such a database is primarily made up of law-abiding Americans. See id.
The report analyzes how many agencies use face recognition software, how often they use it, and the risk level of those uses. See id. at 23. Additionally, the report highlights the measures that agencies apply to protect First Amendment rights and guard against racial bias. See id. The report estimates that “one in four of all American state and local law enforcement agencies can run face recognition searches of their own databases, run those searches on another agency’s face recognition system, or have the option to access such a system.” Id. at 25. Major police departments are exploring real-time face recognition on live surveillance camera video, which lets police continuously scan the faces of pedestrians walking by a street surveillance camera. Id. at 2. From August 2011 to December 2015, the FBI face recognition unit ran 214,920 face recognition searches, including 118,490 searches of its own database and 36,420 searches against the sixteen states’ driver’s license and mug shot databases. Id. at 25. According to the report, “FBI face recognition searches of state driver’s license photos are almost six times more common than federal court-ordered wiretaps.” Id.
While the report recognizes the benefits and effectiveness of such programs in aiding police investigations, the authors are concerned about the lack of oversight to control potential abuses. Id. at 1. Only nine of the fifty-two responsive agencies (17%) and the FBI face recognition unit expressly indicated that they audit their employees’ use of the face recognition system for improper use. Id. at 60. Maryland’s face recognition program, the Maryland Image Repository System, was established in 2011 and comprises the license photos of over two million residents; the system has never been audited. Id. at 4. Further, some of the largest agencies with the most advanced systems are often the least transparent. Id. at 58. Only one agency, the San Diego Association of Governments, received legislative approval. Id. at 4.
A main concern for the authors of the report is the potential impact such programs could have on minority communities. Id. at 3. Research on facial recognition programs suggests that these programs exhibit signs of racial bias. See id. at 53. They have been found to perform more poorly on African American faces than on other races, which can make it more likely that a system will misidentify an innocent African American person as a suspect. See id. African Americans are also disproportionately likely to be arrested and, thus, to appear in mug shot databases. See id. Therefore, systems that use mug shot photos will be more likely to flag an African American face than a Caucasian one. See id. Additionally, the lack of oversight allows agencies to put these programs to suspect uses. See id. at 2. In Maricopa County, Arizona, for example, the sheriff’s office downloaded every Honduran driver’s license photo and mug shot into its face recognition database. Id.
The authors also express concerns about potential chilling effects that facial recognition programs may have on First Amendment rights. See id. at 42. The Supreme Court has previously noted that there exists a “vital relationship between freedom to associate and privacy in one’s associations.” NAACP v. Alabama, 357 U.S. 449, 462 (1958). However, when considering whether military surveillance of public meetings had an “inhibiting effect” on the expression of First Amendment rights, the Supreme Court held that without a showing of past or immediate danger of direct injury, surveillance does not produce such an effect. See Laird v. Tatum, 408 U.S. 1, 10–13 (1972). Despite the mixed guidance from the Supreme Court, major federal and state law enforcement agencies have recognized the threat that face recognition presents to free speech. Garvie et al., supra, at 43. The Department of Homeland Security, the FBI, and multiple state agencies recognized in a 2011 Privacy Impact Assessment that “surveillance has the potential to make people feel extremely uncomfortable, cause people to alter their behavior, and lead to self-censorship and inhibition.” Int’l Just. & Pub. Safety Network, Privacy Impact Assessment: Report for the Utilization of Facial Recognition Technologies to Identify Subjects in the Field 2 (2011), https://www.eff.org/files/2013/11/07/09_-_facial_recognition_pia_report_final_v2_2.pdf. Yet, of the fifty-two agencies studied in the report, only one agency expressly prohibits its officers from using face recognition to track individuals engaging in political, religious, or other protected free speech. Garvie et al., supra, at 44.
The growing use of facial recognition programs has also raised many questions about how these programs fit within the Fourth Amendment’s protection against unreasonable searches and seizures. See id. at 33. In Katz v. United States, the Supreme Court held that “the Fourth Amendment protects people, not places.” 389 U.S. 347, 351 (1967). Justice Harlan’s concurring opinion laid out a “reasonable expectation of privacy” test to determine whether a Fourth Amendment search has occurred. Id. at 361 (Harlan, J., concurring). The Supreme Court, however, has never formally recognized a reasonable expectation of privacy in public conduct. See United States v. Jones, 565 U.S. 400, 411 (2012) (“This Court has to date not deviated from the understanding that mere visual observation does not constitute a search.”). Yet, in recent cases, the Supreme Court has highlighted the transformational nature of twenty-first century surveillance technology and rejected simplistic comparisons of modern technology to older policing practices. See Riley v. California, 134 S. Ct. 2473, 2488 (2014) (rejecting the government’s contention that a search of an arrestee’s smartphone is “materially indistinguishable” from a search of a person’s pocket upon arrest). It is unclear whether the Court would treat face recognition as tantamount to “mere visual observation.” See Jones, 565 U.S. at 412.
The report concludes by setting forth a list of recommendations for Congress and state legislatures to pass to regulate law enforcement face recognition. Garvie et al., supra, at 62. The authors recommend that face recognition searches should be conditioned on an individualized suspicion of criminal conduct and searches of license photos should be limited to investigations of serious offenses. See id. at 62–63. The recommendations also call for increased reporting and internal auditing requirements and an outright prohibition on using face recognition to track people on the basis of their race, ethnicity, and religious or political views. See id. at 64–65.
The report has caught the attention of media outlets and civil rights groups. See Daniel Victor, Study Urges Tougher Oversight for Police Use of Facial Recognition, N.Y. Times (Oct. 18, 2016), http://www.nytimes.com/2016/10/19/us/a-virtual-lineups-of-average-citizens-created-by-software.html?_r=0; Kaveh Waddell, Half of American Adults Are in Police Facial-Recognition Databases, Atlantic (Oct. 19, 2016), http://www.theatlantic.com/technology/archive/2016/10/half-of-american-adults-are-in-police-facial-recognition-databases/504560/. The American Civil Liberties Union, along with the Leadership Conference on Civil and Human Rights and fifty other interest groups, sent a letter to the Department of Justice urging it to investigate the increasing use of face recognition technology. Steven Wildberger, ACLU: DOJ Must Investigate Police Use of Facial Recognition Technology, Jurist (Oct. 19, 2016, 9:40 AM), http://www.jurist.org/paperchase/2016/10/aclu-doj-must-investigate-police-use-of-facial-recognition-technology.php. Time will tell whether Congress or the Court takes the first step to address the issues laid out in the report.
*Ashley Triplett is a second-year law student at the University of Baltimore School of Law, where she is a member of the Royal Graham Shannonhouse III Honor Society. In the summer of 2016, Ashley worked as a Summer Associate for Miles & Stockbridge, P.C., where she will return for the summer of 2017. Ashley was recently named a Production Editor for Volume 47 of the University of Baltimore Law Review.