[Image: face scanning in a crowd]

Why are Universities objecting to face recognition systems?

David Ackerman, Chief Scientist at Princeton Identity

Published on March 10, 2020

Face recognition has gotten very good lately. Not perfect, but clever improvements made over the last five or so years have increased accuracy a lot. [1] So why have so many University students, faculty and administrators roundly rejected face recognition’s promise of safety and convenience? [2] And what alternatives might avoid such blow-back? (Hint: an opt-in, non-surveillance security system.)

Let’s start by listing some of the published objections to face recognition systems and surveillance systems in general that come from the University sector.

  • Nobody likes the idea of being tracked by a system that they don’t even know exists. And one that can identify them … that’s worse. [3]
  • The unchecked use of a face recognition surveillance system represents a threat to privacy as well as freedom to associate. And it takes time to establish and enforce guidelines to prevent abusive data collection. [4, 5, 6]
  • Futuristic uses of face recognition in surveillance systems might aim to assess a subject’s intent. [7] Think ‘Minority Report.’ Predicting intent is an unproven technology and antithetical to our system of innocent until proven guilty.
  • Independent testing of face recognition has demonstrated demographic bias. [8, 9] While such bias might not be fundamental, until algorithms are trained on demographically balanced data sets, face recognition will operate with higher error rates on women and minorities. [10, 11] (The sketch after this list shows what a per-group error-rate gap looks like.)
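
To make that concrete, here is a minimal sketch, in the spirit of evaluations like NIST's [8], of comparing false non-match rates across demographic groups at a single decision threshold. The scores, group labels, and threshold below are invented for illustration only.

```python
# Sketch: per-group false non-match rate (FNMR) at a fixed threshold.
# All scores and group labels are hypothetical, for illustration only.

genuine_scores = {  # similarity scores for genuine (same-person) comparisons
    "group_a": [0.91, 0.88, 0.95, 0.72, 0.90],
    "group_b": [0.84, 0.69, 0.77, 0.92, 0.66],
}

THRESHOLD = 0.80  # accept a match at or above this score


def fnmr(scores, threshold):
    """Fraction of genuine comparisons wrongly rejected."""
    return sum(1 for s in scores if s < threshold) / len(scores)


for group, scores in genuine_scores.items():
    print(f"{group}: FNMR = {fnmr(scores, THRESHOLD):.0%}")
```

If one group's FNMR is consistently higher at the same threshold, that group is falsely rejected more often; this kind of per-group comparison is what the independent testing cited above quantifies on real data.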

These objections dampen enthusiasm for face recognition’s promise of secure, card-less access to buildings, labs and athletic facilities; streamlined transactions at campus stores and dining halls; and remote attendance systems designed to monitor class participation and improve student retention.

The forceful University push-back, despite the benefits face recognition systems promise, appears to come from a lack of consent. A system that works autonomously and impersonally from a distance (sometimes a large distance) does not ask permission to track you, name you, name those who spend time with you, or attempt to figure out what you are thinking or feeling. Mounting evidence shows that University students, faculty and administrators value personal freedom over convenience and reject systems that threaten that freedom.

That said, what about iris recognition, another standoff biometric method for identifying people? Maybe there aren’t enough iris systems deployed to generate University resistance yet. But commercialized iris systems work from a few feet (about 1 m) or less and are therefore useless for surveillance tracking and identification at a distance. Users are always aware that they are engaging iris readers because the readers require a degree of compliance (open your eyes, look here) and therefore carry implicit consent. Iris recognition is inherently an opt-in/volunteer biometric method. Finally, iris recognition has shown no demographic bias. The very nature of iris recognition shields it from the objections to face recognition currently voiced by Universities. At the same time, iris recognition systems, including multimodal, short-range iris-plus-face systems, offer non-contact, secure, accurate recognition for the same University applications that would be served by face recognition.
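
Read "opt-in" in system terms: identification is attempted only against people who have voluntarily enrolled, via a 1:1 check of a claimed identity, rather than a 1:N sweep of everyone who walks past a camera. Below is a minimal sketch of that consent-first flow; the class, the method names, and the exact-match comparison are hypothetical stand-ins, not any vendor's actual API.

```python
# Minimal sketch of an opt-in access-control flow. A presented iris
# template is compared only against the identity the user claims,
# and only if that user chose to enroll. Names are hypothetical.

from dataclasses import dataclass, field


def templates_match(a: bytes, b: bytes) -> bool:
    # Stand-in for a real iris matcher (e.g., Hamming distance
    # between iris codes); exact equality is illustration only.
    return a == b


@dataclass
class EnrolledUser:
    user_id: str
    iris_template: bytes  # captured with the user's explicit consent


@dataclass
class OptInAccessSystem:
    enrolled: dict = field(default_factory=dict)  # user_id -> EnrolledUser

    def enroll(self, user_id: str, iris_template: bytes) -> None:
        """Runs only when a user volunteers; nothing is captured otherwise."""
        self.enrolled[user_id] = EnrolledUser(user_id, iris_template)

    def authenticate(self, claimed_id: str, live_template: bytes) -> bool:
        """1:1 verification of a claimed identity -- no database sweep,
        no identification of passers-by."""
        user = self.enrolled.get(claimed_id)
        if user is None:
            return False  # not enrolled: simply denied, never tracked
        return templates_match(user.iris_template, live_template)
```

Because the reader needs the user's cooperation to capture an iris image at all, the consent step is enforced by physics as much as by software; the sketch just makes the 1:1, enrolled-only matching policy explicit.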



[1] NIST, "NIST Evaluation Shows Advance in Face Recognition Software’s Capabilities," 30 Nov 2018. [Online]. Available: https://www.nist.gov/news-events/news/2018/11/nist-evaluation-shows-advance-face-recognition-softwares-capabilities.

[2] S. Samuel, "Is your college using facial recognition on you? Check this scorecard," Vox.com, 29 Jan 2020. [Online]. Available: https://www.vox.com/2020/1/29/21112212/facial-recognition-college-campus-scorecard.

[3] M. Andrejevic and N. Selwyn, "Facial recognition technology in schools: critical questions and concerns," Learning, Media and Technology, 5 Nov 2019. [Online]. Available: https://doi.org/10.1080/17439884.2020.1686014.

[4] C. Garvie and L. Moy, "America Under Watch: Face Surveillance in the United States," Georgetown Law Center on Privacy & Technology, 16 May 2019. [Online]. Available: https://www.americaunderwatch.com/#footnote32_l7gcr1q.

[5] ACLU, "What's wrong with public video surveillance?," 2020. [Online]. Available: https://www.aclu.org/other/whats-wrong-public-video-surveillance.

[6] M. Feeney, "Keep Facial Recognition Away From Body Cameras," Cato Institute, 13 Apr 2018. [Online]. Available: https://www.cato.org/blog/keep-facial-recognition-away-body-cameras.

[7] S. Chinoy, "The Racist History Behind Facial Recognition," The New York Times, 10 Jul 2019. [Online]. Available: https://www.nytimes.com/2019/07/10/opinion/facial-recognition-race.html.

[8] NIST, "NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software," 19 Dec 2019. [Online]. Available: https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software.

[9] B. F. Klare, M. J. Burge, J. C. Klontz, R. W. Vorder Bruegge and A. K. Jain, "Face Recognition Performance: Role of Demographic Information," IEEE Trans. on Information Forensics and Security, vol. 7, no. 6, pp. 1789-1801, Dec 2012. [Online]. Available: https://apps.dtic.mil/dtic/tr/fulltext/u2/a556941.pdf.

[10] C. Garvie and J. Frankle, "Facial-Recognition Software Might Have a Racial Bias Problem," The Atlantic, 7 Apr 2016. [Online]. Available: https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/.

[11] A. Chabria, "Facial recognition software mistook 1 in 5 California lawmakers for criminals, says ACLU," Los Angeles Times, 13 Aug 2019. [Online]. Available: https://www.latimes.com/california/story/2019-08-12/facial-recognition-software-mistook-1-in-5-california-lawmakers-for-criminals-says-aclu.