Authors: Dr Nicola Whitehead (University of Wales Trinity Saint David); Professor Sarah Hitt (NMITE); Emma Crichton (Engineers Without Borders UK); Dr Sarah Junaid (Aston University); Professor Mike Sutcliffe (TEDI-London); Isobel Grimley (Engineering Professors' Council).
Topic: Development and use of a facial recognition system.
Engineering disciplines: Data, Electronics, Computer science, AI.
Ethical issues: Diversity, Bias, Privacy, Transparency.
Professional situations: Rigour, Informed consent, Misuse of data, Conflicts with leadership / management.
Educational level: Advanced.
Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action.
Learning and teaching notes:
This case involves an engineer hired to manage the development and installation of a facial recognition project at a building used by university students, businesses and the public. It incorporates a variety of components including law and policy, stakeholder and risk analysis, and both macro- and micro-ethical elements. This example is UK-based; however, the instructor can adapt the content to better fit the laws and regulations surrounding facial recognition technology in other countries, if this would be beneficial.
This case study addresses two of AHEP 4's themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.
This case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two (focusing on the wider ethical context of the case) and Part three (focusing on the potential actions the engineer could take) develop and complicate the concepts presented in Part one to provide for additional learning. The case study allows teachers the option to stop at multiple points for questions and / or activities as desired.
Learners have the opportunity to:
- apply their ethical judgement to a case study relating to privacy and consent;
- judge the societal impact of a technical solution to a complex problem;
- make and justify an ethical decision;
- analyse risks associated with micro-ethical and macro-ethical concerns;
- communicate these risks and judgements to both technical and non-technical audiences.
Teachers have the opportunity to:
- highlight a range of ethical considerations within the scope of a complex engineering project;
- introduce methods for risk analysis and ethical decision-making;
- evaluate critical thinking, argumentation, and communication skills;
- provide an opportunity for reflection.
Learning and teaching resources:
- RAEng/Engineering Council Statement of Ethical Principles
- Data Protection Act (2018)
- Towards global code of ethics for AI research
- Ethical concerns about facial recognition systems
- Facial recognition technology: fundamental rights considerations in the context of law enforcement
- 7-Step guide to ethical decision making
- One Trust Data Guidance
- The King's Cross CCTV Problem
- MPs call for halt to police’s use of live facial recognition
- Facial recognition in King’s Cross prompts call for new laws
- Data regulator probes King’s Cross facial recognition tech
- King’s Cross facial recognition plans revealed by letter
- ICO statement: live facial recognition technology in King’s Cross
- King’s Cross statement on facial recognition
- The Facewatch ICO complaint
Summary:
Metropolitan Technical University (MTU), based in the UK, has an urban campus and many of its buildings are located in the city centre. A new student housing development in this area will be shared by MTU, a local college, and medical residents doing short rotations at the local hospital. The building has a public café on the ground floor and a couple of classrooms used by the university.
The housing development sits alongside a common route for parades and protests. In the wake of demonstrations by Extinction Rebellion and Black Lives Matter, students have raised concerns to the property manager about safety. Despite an existing system of CCTV cameras and swipe cards, the university decides to install an enhanced security system, built around facial recognition technology, that would enable access to the building and cross-reference captured images with crime databases. To comply with GDPR, building residents will be required to give explicit consent before the system is implemented. Visitors without a student ID (such as café customers) will be buzzed in, but their image will be captured and cross-referenced before entry. A side benefit of the system is that MTU's department of Artificial Intelligence Research will help install and maintain the system, and will study how it works in order to make improvements.
Dilemma – Part one:
You are an engineer who has been hired by MTU to take charge of the facial recognition system installation project, including setting policies and getting the system operational. With your background in AI engineering, you are expected to act as a technical advisor to MTU and liaise with the Facilities, Security and Computing departments to ensure a smooth deployment. This is the first time you have worked on a project that involves image capture, so as part of your preparation you need to do some preliminary research into the best practices, guidance, and regulations that apply.
Optional STOP for questions and activities:
1. Discussion: What are the legal issues relating to image capture? Images allow for the identification of living persons and are therefore considered personal data under GDPR and the Data Protection Act (2018).
2. Discussion: Sharing data is a legally and ethically complex field. Is it appropriate to share images captured with the police? If not the police, then whose crime database will you use? Is it acceptable to share the data with the Artificial Intelligence Research group? Why, or why not?
3. Discussion: Under GDPR, individuals must normally consent to their personal data being processed. How should consent be handled in this case?
4. Discussion: Does the fact that the building will accommodate students from three different institutions (MTU, the local college, and the hospital) complicate these issues? Are regulations related to students' captured images different from those related to public image capture?
5. Activity: Undertake a technical activity that relates to how facial recognition systems are engineered (a possible starting point is sketched below).
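As a concrete starting point for this activity, the sketch below shows the one-to-one matching step at the core of many facial recognition systems. It is a minimal illustration, assuming the open-source face_recognition Python library is installed; the file names and the 0.6 tolerance are placeholder assumptions, not recommendations.

```python
# Minimal sketch of one-to-one face matching, assuming the open-source
# `face_recognition` library. File names and tolerance are illustrative.
import face_recognition

# A stored reference image (e.g. a resident's enrolment photo) and a
# newly captured frame from the door camera (both hypothetical files).
reference_image = face_recognition.load_image_file("enrolment_photo.jpg")
captured_image = face_recognition.load_image_file("door_camera_frame.jpg")

# Each call returns one 128-dimensional encoding per detected face.
reference_encodings = face_recognition.face_encodings(reference_image)
captured_encodings = face_recognition.face_encodings(captured_image)

if reference_encodings and captured_encodings:
    # Euclidean distance between encodings: lower means more similar.
    distance = face_recognition.face_distance(
        [reference_encodings[0]], captured_encodings[0]
    )[0]
    # The tolerance trades false accepts against false rejects; choosing
    # it is an engineering decision with direct ethical consequences.
    MATCH_TOLERANCE = 0.6
    print(f"distance={distance:.3f}, match={distance <= MATCH_TOLERANCE}")
else:
    print("No face detected in one of the images.")
```

Students could vary the tolerance and discuss who bears the cost of each kind of error.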
Dilemma – Part two:
The project has kicked off, and one of its deliverables is to establish the policies and safeguards that will govern the system. You convened a meeting of project stakeholders to determine what rules need to be built into the system's software and presented a list of questions to help you make technical decisions. The questions you asked were:
- Should students be able to opt in or out of image capture?
- Should visitors be told that their image will be captured?
- What happens if a student living in the housing development decides that they no longer wish to take part in the image recognition project? (One way of handling such a withdrawal is sketched below.)
What you had thought would be a quick meeting to agree basic principles turned out to be very lengthy and complex. You were surprised at the variety of perspectives and how heated the discussions became. The discussions raised some questions in your own mind as to the risks of the facial recognition system.
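The withdrawal question above has a direct technical counterpart: the system must record explicit consent and be able to delete a resident's biometric data on request. The sketch below is a minimal, hypothetical illustration of such a consent register; all names and fields are invented, and any real design would need legal review against GDPR and the Data Protection Act (2018).

```python
# A minimal, hypothetical consent register. All names and fields are
# invented for illustration; a real design would need legal review.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    resident_id: str
    granted_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None

class ConsentRegister:
    def __init__(self):
        self._records = {}    # resident_id -> ConsentRecord
        self._encodings = {}  # resident_id -> stored biometric template

    def grant(self, resident_id, encoding):
        # Consent must be explicit and recorded before any biometric
        # data is stored.
        self._records[resident_id] = ConsentRecord(
            resident_id, granted_at=datetime.now(timezone.utc))
        self._encodings[resident_id] = encoding

    def withdraw(self, resident_id):
        # Withdrawing must be as easy as granting, and the biometric
        # template must actually be deleted, not merely flagged.
        record = self._records.get(resident_id)
        if record:
            record.withdrawn_at = datetime.now(timezone.utc)
        self._encodings.pop(resident_id, None)

    def has_consent(self, resident_id):
        record = self._records.get(resident_id)
        return bool(record and record.granted_at and not record.withdrawn_at)

# A resident who withdraws would fall back to the existing swipe-card
# access described in the summary.
register = ConsentRegister()
register.grant("student-001", b"...biometric template...")
register.withdraw("student-001")
print(register.has_consent("student-001"))  # False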
Optional STOP for questions and activities:
The following activities focus on macro-ethics: understanding the wider ethical context of projects like the facial recognition system.
1. Activity: Stakeholder mapping – Who are all the stakeholders and what might their positions and perspectives be? Is there a difference between the priorities of the different stakeholders?
2. Activity: There are many different values competing for priority here. Identify these values, discuss and debate how they should be weighed in the context of the project.
3. Activity: Risks can be understood as objective and / or subjective. Research the difference between these two types of risk, and identify which type(s) of risks exist related to the project.
4. Discussion: Which groups or individuals are potentially harmed by the technology and which potentially benefit? How should we go about setting priorities when there are competing harms and benefits?
5. Discussion: Does the technology used treat everyone from your stakeholders' list equally? Should the needs of society as a whole outweigh the needs of the individual?
6. Activity: Make and defend an argument as to the appropriateness of installing and using the system.
7. Discussion: What responsibilities do engineers have in developing these technologies?
Dilemma – Part three:
A few days later, you were forwarded a screenshot of a social media post that heavily criticised the proposed facial recognition system. It was unclear where the post had originated, but it had clearly been shared and promoted among both students and the public, raising concerns about privacy and transparency. Your boss believes this outcry endangers the project and has requested that you make a public statement on behalf of MTU, reaffirming its commitment to installing the system.
You share the concerns but have been employed to complete the project. You understand that suggesting it should be abandoned would most likely risk your job. What will you tell your boss? How will you prepare your public statement?
Optional STOP for questions and activities:
Micro-ethics concerns individuals and their responses to specific situations. The following steps are intended to help students develop their ability to practise moral analysis by considering the problem in a structured way and to work towards possible solutions that they can analyse critically.
1. Discussion: What are the problems here?
- You are an employee of MTU and have a responsibility to represent its interests. However, you can see that the university's actions create significant problems relating to privacy and consent and may be ethically or legally questionable.
2. Discussion: What are the possible courses of action you can take as an employee?
Students can be prompted to consider what different approaches they might adopt, such as the following, but can also develop their own possible responses.
- You could take the university line and refuse to consider any compromise. After all, you have a duty of care towards the students.
- You could act as a whistleblower and contact the Information Commissioner's Office, or the press, with the university's plans.
- You could look for changes in the hardware setup for the system. Can the cameras be placed so that they only capture people coming into the building without recording anyone else?
- You could look for changes in the software setup for the system. What level of accuracy is needed to declare a match between the captured image and the reference image before the doors will open? (The trade-off involved is sketched after this list.)
- You could look to make changes in the data management processes. How long will the data be stored? Which database(s) will images be checked against? What are the data security implications of implementing this system?
- Are there other alternatives available to you?
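On the software option above, the accuracy question can be made concrete: any match threshold trades false accepts (strangers admitted) against false rejects (legitimate residents locked out). The sketch below illustrates the trade-off with invented distance scores that carry no empirical weight.

```python
# Illustrating the match-threshold trade-off with invented distance
# scores (lower distance = more similar faces). Numbers are made up.
import numpy as np

genuine = np.array([0.31, 0.35, 0.38, 0.42, 0.45, 0.50])   # same person
impostor = np.array([0.48, 0.55, 0.58, 0.61, 0.66, 0.72])  # different people

for threshold in (0.40, 0.50, 0.60):
    far = np.mean(impostor <= threshold)  # false accepts: strangers admitted
    frr = np.mean(genuine > threshold)    # false rejects: residents locked out
    print(f"threshold={threshold:.2f}  FAR={far:.0%}  FRR={frr:.0%}")
```

Error rates can also differ across demographic groups, which is where the bias concern in this case arises: a threshold that works well on average may still fail particular residents disproportionately.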
3. Discussion: Which is the best approach and why? – Interrogate the pros and cons of each possible course of action, including the ethical, practical, cost, local-relationship, and reputational implications. Students should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options. Students may wish to consider this from other perspectives, such as:
- What would the best outcome be if cost were no object?
- What course of action would be taken if a different perspective were chosen as the priority? For example, if personal privacy were the main priority, what action would be taken, compared with the action taken if the cost to the university were the main priority?
- What are the wider implications of the use of image recognition in public spaces and how can these be mitigated?
- Are there any other technologies that would solve the security problem without the ethical implications?
- What are the possible solutions open to you?
- Are there any short-term solutions versus longer-term solutions?
4. Activity: Public Communication – Students can practise writing a press release, giving an interview, or making a public statement about the case and the decision that they make.
5. Activity: Reflection – Students can reflect on how this case study has enabled them to see the situation from different angles. Has it motivated them to understand the ethical concerns and to come to an acceptable conclusion?
Enhancements:
An enhancement for this case study can be found here.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors' Council or the Toolkit sponsors and supporters.