Utilizing Facial Recognition on a University Campus
The scenario enlists students as taskforce members making informed decisions on the safe return to university operations during a pandemic. The taskforce is asked to provide a recommendation on a proposed facial recognition system.
Objective
The objective of this case study discussion is to introduce students to facial recognition technology (FRT) and ethical considerations for its use. The case provides students the opportunity to deliberate on and understand how implementing facial recognition on a college campus can introduce risks, including surveillance. The case discussion is set up as a role-play in which students are assigned different roles as members of a committee that is investigating whether FRT should be used for COVID detection. This case study will be most useful for introductory courses but can also be used in advanced courses, as there is significant potential within the narrative to embed more technical aspects of how FRT works.
Role-Play Scenario Narrative and Description of Roles
Trisha Brown is the chief of the Safety and Emergency Management (SEM) office at Andrew Hamilton University (AHU), a large suburban university. She is responsible for the safety and security of all students, faculty, and staff on campus. In recent months, her responsibilities have suddenly shifted from the regular aspects of the work – the police force, traffic safety, fire drills – towards keeping the university functional and safe during a pandemic. She and members of her office have been working round the clock to ensure that the campus is ready to open for the new semester.
While they work towards this goal, she is also engaged in planning for the future of the campus, and as part of this effort she is looking at technological solutions for the problem at hand. She has realized she must be prepared for the eventuality that a vaccine will take some time to develop, that it might not be as effective as it needs to be, and that not everyone may agree to be vaccinated. She has assembled a taskforce with members from across the university and has worked hard to ensure that all constituents are well represented. On the recommendation of this taskforce, to keep decision makers informed and to track the health of anyone who is on campus, she has championed an app where users can upload their health information daily.
One member of this taskforce, a staff member of the Information Technology Software and Services (ITSS) group, has recently approached her with another innovative technology that can address one of the shortcomings of the app: users have to proactively submit their information, and there is no way to capture their health status automatically. Even though the university is thinking of taking people's temperatures as they enter buildings, this approach will require manpower, and it might be too late if someone has already been on campus for a while and has interacted with others. The new technology uses facial recognition to identify when someone is on campus and then quickly looks them up in the app database to check whether they have entered their information. If not, they get a notification on their phone, and security is alerted to their presence and their location on campus.
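For courses that want to embed the more technical aspects mentioned in the objective, the proposed workflow can be illustrated with a short sketch. The sketch below is purely hypothetical: the names (match_face, HEALTH_APP_DB, send_phone_notification, alert_security) are illustrative placeholders and do not correspond to any real campus system or library, and the face-matching step itself is left as a stub.

```python
# Hypothetical sketch of the proposed FRT check-in workflow described above.
# All identifiers are illustrative placeholders, not a real system or API.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Person:
    person_id: str
    name: str
    phone: str


# Stand-in for the health app database: person_id -> date of last check-in.
HEALTH_APP_DB: dict = {}


def match_face(camera_frame: bytes) -> Optional[Person]:
    """Placeholder for the face recognition step.

    A real system would compare a face embedding extracted from the camera
    frame against an enrollment database and return the matched person, if any.
    """
    return None


def send_phone_notification(phone: str, message: str) -> None:
    print(f"[notify {phone}] {message}")


def alert_security(person: Person, location: str) -> None:
    print(f"[security] {person.name} has no check-in today; seen at {location}")


def handle_camera_frame(frame: bytes, today: date) -> None:
    person = match_face(frame)
    if person is None:
        return  # no match; how unmatched faces are handled is itself an ethical question
    if HEALTH_APP_DB.get(person.person_id) == today:
        return  # health information already submitted today
    # Otherwise, follow the proposal: notify the person and alert campus security.
    send_phone_notification(person.phone, "Please submit today's health check-in.")
    alert_security(person, location="building entrance camera")
```

Even this skeletal version makes some of the taskforce's concerns concrete: the enrollment database, the location logging, and the automatic alert to security are all design choices with privacy implications.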
Although Trisha appreciates the power and possible usefulness of this technology, she is wary of the privacy, bias, and discrimination issues that she has read about. In particular, she is unsure how to balance the pros and cons of this solution, especially since facial recognition technology and solutions for analysis are changing fast. She knows that many factors must be examined in depth before a sound decision can be reached about this potential solution. Consequently, she has asked her taskforce members to learn more about the use of facial recognition and then report back to her with their personal recommendations. She has further instructed the taskforce to reach a consensus on their recommendation so that she can move forward with making this decision.
Each of you has been assigned one of these roles as a member of Trisha's taskforce, and today you are meeting to present your personal recommendation and reasoning. As a team, you should discuss the viability of using an FRT-based application on campus and come to a consensus regarding your recommendation to Trisha.
Taskforce members (roles for the discussion)
Steve Smith is a vice president in the Information Technology Software and Services (ITSS) group at AHU and has recently moved to AHU after a successful career in industry. In his last job as chief technology officer (CTO) of a small company, he successfully led the migration of their legacy software to a cloud-based solution. He is an unabashed technology optimist who believes that IT can solve almost any organizational problem and that, once a solution has been implemented, any problems associated with it can be addressed. No new technology, he is quick to point out, comes without some downsides that must be overcome.
Courtney Jones is an undergraduate student in organizational psychology and the vice president of the AHU student organization. As part of her position, and because of her interest in the wellness and wellbeing of fellow students, she represents students' welfare on this taskforce. Courtney is a frequent user of social media and has used it effectively to drum up support for causes she believes in, and she has been vocal about the safety of women on and around campus. She recently led a major campaign against bullying on social media and also campaigned for the COVID app when it was launched.
Trevor Jackson is a professor in the Department of History and a member of the faculty senate. He represents faculty on this taskforce, is dedicated to AHU, and serves on many other committees as well. As a historian, he often takes a long-term perspective on issues and is often circumspect about technology-based solutions, especially when he thinks there are other, and in his opinion simpler, ways of solving a problem. When the COVID app was being rolled out, he was the one who pushed for self-reported data entry by the user rather than some form of automatic data collection. He is often preoccupied with issues of surveillance and the new limits to privacy introduced by technology.
Keith Hampton is an associate vice president in the Provost's office at AHU and on the team that looks at student admissions and retention. He is worried that, if the university gets a reputation for neglecting student safety, it might hurt admissions. He made sure that the admissions office publicized the COVID app and reassured students and their parents that AHU was taking all the necessary steps. He thinks facial recognition software would make a huge impact in terms of publicity and would put AHU on the map when it comes to using technology to ensure safety during COVID. However, he is worried about where the funds for the technology and the cameras will come from, and he wonders whether he will have to spend time getting permission from students to use this data.
Gloria Espinosa is a senior director in the Office of Equity and Inclusiveness (OEI) at AHU. In her role, she leads a range of efforts, such as transfer agreements with community colleges, outreach in K-12 schools, and summer camps for kids, all of which advance AHU's mission to admit and support a broad range of students. She is naturally inclined to be skeptical of any effort that might undermine inclusiveness on campus, including technology-driven projects. She raised questions about access to smartphones and data plans when the COVID app was introduced. She is worried that a facial recognition-based solution to COVID detection and prevention might introduce other unintended problems with grave consequences for students and faculty.
Amelie Montaigne is the director of FaceAware, a nonprofit that works in the field of facial recognition with both government and industry. She is consulting for the taskforce pro bono. She is a renowned expert on the topic of facial recognition and was responsible for creating one of the first deployable facial recognition applications, based on an algorithm she wrote, which she later sold to a large company. She has been a proponent of facial recognition and has seen the technology grow by leaps and bounds over the past decade. She is cognizant of problems with FRT, especially security risks and algorithmic bias, but she believes it is not the technology itself but how it is applied that matters.
Cases in this collection are also available on the Mason Tech Ethics website.
Role-play Instructions
1. Each student is assigned a role a week before the discussion.
2. The student assigned to the role of Trisha Brown serves as the moderator and leads the conversation based on the script below.
3. The script below is provided to guide the discussion, but you should leave room for the conversation to flow naturally and allow everyone to contribute.
Script for the Role-play
1. What role are you playing in the role-play group discussion? Please state the name and title, and describe the role in your own words (a couple of sentences).
[to be answered by each group member individually and in sequence]
2. From the perspective of your role, what is your recommendation for Trisha regarding the use of FRT?
[to be answered by each group member individually and in sequence]
3. From the perspective of your role, are there alternative solutions you would like to present to Trisha? Why do you think the approach you suggest is good and what are the main barriers to this approach?
[to be answered by each group member individually and in sequence]
4. What is your overall group recommendation to Trisha?
[open discussion, anyone can chime in]
One way to ensure students are prepared for the discussion is to assign a few questions from the script as a pre-discussion assignment (short answers). Similarly, to ensure students reflect on the discussion, they can be assigned the last question from the script as a post-discussion exercise. They can also be asked specifically about ethical concepts or concerns related to FRT that have been introduced through the readings.
Extra Assignment - Concept Mapping
Draw a concept map to depict your group's decision. It should include different aspects of the technology, applications, stakeholders, and/or other considerations from your discussion. The map should have 10-12 concepts or items and should convey how they are related. You can use any medium to create it, ideally uploading it as a JPEG; you can take a screenshot, or draw it on paper, take a picture, and upload the image.
Resources to help with concept maps
https://learningcenter.unc.edu/tips-and-tools/using-concept-maps/ (see Example 3)
https://en.wikipedia.org/wiki/Concept_map
FRT Code of Ethics, Frameworks and Guidelines
• Ethical Framework for FRT Submitted by the ACLU to the NTIA Multistakeholder Process on Facial Recognition Technology: https://www.ntia.doc.gov/files/ntia/publications/aclu_an_ethical_framework_for_face_recognition.pdf
• A. K. Roundtree, "Facial Recognition Technology Codes of Ethics: Content Analysis and Review," 2022 IEEE International Professional Communication Conference (ProComm), Limerick, Ireland, 2022, pp. 211-220, doi: 10.1109/ProComm53155.2022.00045. https://ieeexplore.ieee.org/document/9881633
• Center for Strategic and International Studies (CSIS) report on “Facial Recognition Technology: Responsible Use Principles and the Legislative Landscape”: https://www.csis.org/analysis/facial-recognition-technology-responsible-use-principles-and-legislative-landscape
Background Readings, Videos, and Other Resources
• Wicker, S. & Ghosh, D. (2020). Reading in the Panopticon: Your Kindle May Be Spying on You, But You Can't Be Sure. Communications of the ACM, Vol. 63 No. 5, Pages 68-73.
• Lanchester, J. (2017). You are the product. London Review of Books. Vol. 39, No. 6.
• ACLU Resource Page on FRT: https://www.aclu.org/issues/privacy-technology/surveillance-technologies/face-recognition-technology
• Kade Crockford – What you need to know about face surveillance (2019): https://www.ted.com/talks/kade_crockford_what_you_need_to_know_about_face_surveillance?language=en
• Alessandro Acquisti – What will a future without secrets look like? (2013): https://www.ted.com/talks/alessandro_acquisti_what_will_a_future_without_secrets_look_like
• Glenn Greenwald – Why privacy matters (2014): https://www.ted.com/talks/glenn_greenwald_why_privacy_matters
• UK Government Centre for Data Ethics and Innovation (CDEI) independent report, "Snapshot Paper – Facial Recognition Technology," May 2020:
https://www.gov.uk/government/publications/cdei-publishes-briefing-paper-on-facial-recognition-technology/snapshot-paper-facial-recognition-technology
https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/905267/Facial_Recognition_Technology_Snapshot_UPDATED.pdf
Authorship and Project Information and Acknowledgements
The scenarios and roles were conceptualized and written by Aditya Johri. Feedback was provided by Ashish Hingle, Huzefa Rangwala, and Alex Monea, who also collaborated on initial implementation and empirical research. This work is partly supported by U.S. National Science Foundation Awards# 1937950, 2335636, 1954556; USDA/NIFA Award# 2021-67021-35329. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding agencies. The research study associated with the project was approved by the Institutional Review Board at George Mason University.