Author: Martin Griffin (Knight Piésold Consulting, United Kingdom). 

Keywords: Equity; Equality, diversity and inclusion (EDI); Collaboration; Bias; Social responsibility; Design. 

Who is this article for? This article should be read by educators at all levels in higher education who wish to integrate social sustainability, EDI, and ethics into the engineering and design curriculum or module design. It will also help to prepare students with the integrated skill sets that employers are looking for. 



No engineer is an island; it is not good for an engineer to act in isolation. Rather, engineers need to be part of a welcoming community in order to thrive. How an engineering professional interacts with other engineers and with non-engineers is essential for building a culture of collaboration, creating environments where engineers can form meaningful bonds with one another and feel comfortable communicating openly. This requires recognising and understanding how unconscious bias and privilege can create divides and foster negative (toxic) professional environments, and being committed to establishing standards of conduct and addressing issues related to EDI. There is a great need to advocate for fellow engineers, providing places to belong and empowering them to thrive in their chosen profession and career pathways. This includes people who are part of one or more underrepresented groups that have been historically, persistently, and systemically marginalised in society on the basis of identity, such as race, colour, religion, marital status, family status, disability, sex, sexual orientation, gender identity, and age. 

The Royal Academy of Engineering and EngineeringUK (2018) frequently publish reports on the demographics of engineers and the skills shortage in the workforce. These reports highlight the under-representation of people from ethnic minority groups, those with a disability or impairment, and those who are LGBTQ+. In addition, the Institution of Engineering and Technology (IET) recently reported that only 9% of businesses take specific action to bring underrepresented groups into their workforces. 

Engineering and technology are for everyone. It is morally right to ensure that everyone has equal opportunities and by doing so we can improve our world, shape our future, and solve complex global challenges. In order to accomplish these moral imperatives, we need to include a diversity of talent and knowledge. Furthermore, in the UK we still face a nationwide skills shortage threatening our industry. To address this and ensure the sustainability of our industry we must support equal opportunities for all and be truly inclusive. 


The three values: 

The three values of EDI are timeless and should be embedded in the way engineering professionals act, starting with recognition that unfair treatment of others exists. This unfair treatment may take the form of bullying, harassment, discrimination (direct or indirect), victimisation, microaggressions, gaslighting, bias and inequity. An engineer’s role must also include advocating for the support of others in this regard. Each of the three values is very different, but all three together are essential to create opportunities for engineers to grow and thrive, and for a productive and creative engineering community to flourish. 

Equity encourages fair processes, treatment, and opportunities for everyone, resulting in a level playing field for all. It acknowledges that oppressive systems have created varied circumstances for different engineers. By valuing equity, engineers commit to fairly redistributing resources and power to address inequalities that systems have intentionally or unintentionally created, diminishing the impact of such circumstances and ensuring equitable opportunities. Equality relates to ensuring engineers and groups are treated fairly and have access to equal opportunities. It should be emphasised that equity is not the same as equality: in the simplest terms, equality means ‘sameness’ and equity means ‘fairness’. Thus, equality has become synonymous with treating everyone the same, whereas equity means ‘more for those who need it’ — levelling the playing field. 

Diversity refers to how varied a particular environment is, be it an engineering consultancy, a funded academic research team, an interdisciplinary joint venture designing part of a national megaproject, and so on. Diversity involves professional openness and conscientiousness towards varied social interactions, and therefore also intentional representation of, and collaboration with, people of different demographic characteristics, identities, and experiences. Engineers should feel welcome to be their full selves without the need to mask, able to contribute and bring fresh perspectives wherever they are present. 

Inclusion refers to a state of conscious belonging, meaning all are respected, empowered, and valued. Inclusivity should therefore be ingrained in an engineer’s daily operations and surrounding culture, so that everyone feels comfortable being their authentic self. Inclusion involves broad representation across roles, levels (grades) and the aforementioned demographic characteristics, recognising who is and is not in the room and the valuable perspectives and experiences they can bring. Inclusion also means ensuring all engineers feel valued and supported, so that the benefits for creativity, innovation, decision making and problem solving are realised. 


Incorporating EDI in engineering education:

It is not possible to place EDI in a box and open it only occasionally, such as during annual awareness weeks or an induction-week module. It is a lifestyle and a conscious choice, and it needs to be embedded in an engineer’s values, approach and behaviours. Making EDI an integral part of engineering ethics education will not involve an abstract ethical theory of EDI but rather a case-based approach. Teaching EDI within engineering ethics through case studies helps students consider their philosophy of technology, recognise the positive and negative impacts of technology, imagine ethical conduct, and then apply these insights to engineering situations. Moreover, when such cases resonate with students, they are likely to remember the lessons learned from them. Several case studies in the Ethics Toolkit that reference EDI concerns are listed at the end of this article. 

Good contemporary examples should be presented alongside case studies to demonstrate why EDI ought to be embedded into a professional engineer’s life. The Hamilton Commission report addresses the need to raise awareness, highlight the issues faced, and accelerate the inclusion of Black people, focusing on all aspects of UK motorsport, including engineering. Engineers Without Borders UK addresses the importance of gender inclusivity in engineering design and how user-centred practices support it. Creating accessible solutions for everyone, including disabled people, is seen in the ongoing development of Microsoft’s Accessibility Technology & Tools. BP has launched a global framework for action to help it stay on track and make positive progress. The further benefits EDI brings to design and delivery in construction engineering are demonstrated by Mott MacDonald. 

Inclusive Engineering (similar to the principles of Universal Design) ensures that engineering products and services are accessible to and inclusive of all users. Inclusive Engineering solutions aim to be as free as possible from discrimination and bias, and their use will help develop creative and enlightened engineers. Ethical responsibility is key to all aspects of engineering work, but at the design phase it is even more important, as we can literally design biases and discrimination into our technological solutions, amplifying existing biases. Recommended guidance for the engineering design process is provided in PAS 6463:2022, a new standard on designing the built environment for our neurodiverse society. With the right design and management, it is possible to eliminate, reduce or adjust potentially negative impacts to create places where everyone can flourish equally. 

It is vital to recognise that achieving true equality, diversity, and inclusion is complex and cannot be ‘fixed’ quickly. An engineer must participate in active learning and go on a six-step journey of self-awareness, from ‘not listening’, ‘unaware’, ‘passive’, ‘curious’, and ‘ally’ to ‘advocate’. A ‘not listening’ attitude involves shaming the unaware, speaking on behalf of others, invalidating others, behaving clumsily, and being bigoted, prejudiced, antagonistic and unwilling to listen and learn. Cultivating an ‘ally’ attitude means being informed and committed, routinely and proactively championing inclusion by challenging accepted norms, and taking sustained action to make positive change. It is for this reason that the values of EDI should be part of an engineering professional’s ongoing lifestyle if they are to have any real and lasting effect on engineering environments. 

The values of EDI therefore need to influence how an engineering professional thinks, acts, includes others and seeks collaborative input. The concept of engineering is far more important than any individual engineer, and sometimes engineers need to facilitate opportunities for other voices to be heard. This requires respect and empathy to create trusted relationships, alongside self-awareness and self-development. Sometimes it means stepping back so that other engineers can step forward. 


Resources and support: 

Specific organisations representing protected characteristics, such as InterEngineering, aim to connect, inform and empower LGBTQ+ engineers. Likewise, the Women’s Engineering Society (WES) and the Association for Black and Minority Ethnic Engineers (AFBE-UK) provide support and promote higher achievement in education and engineering. These organisations are partnered with the Royal Academy of Engineering to highlight unheard voices, raise awareness of the barriers faced by minority groups, and maximise impact. Many other umbrella groups, for instance EqualEngineers, also raise awareness of underrepresented groups, such as neurodivergent engineers, by documenting case studies, undertaking surveys, holding regular careers events and annual conferences, and more. 

There is evidence to support the widely accepted view that supporting and managing EDI is a crucial element in increasing productivity and staff satisfaction. Diverse experiences and perspectives bring about diversity of thought which leads to innovation. It allows everybody to be authentic at work and provides the opportunity for diverse voices to be heard. Consequently, implementing EDI has proven to increase performance, growth, and innovation, as well as improvements in health, safety and wellbeing. EDI will therefore help to prepare students with the fundamental attitudes that are needed as practitioners and human beings.  

Finally, engineering with EDI embedded into a professional engineer’s lifestyle will make a difference to those most in need. In a globalised world it will put us in a good position to bring innovation and creativity to some of the biggest challenges we face together. Equitable, diverse and inclusive engineering must be at the heart of finding sustainable solutions to help shape a bright future for all. 



Resources in the Ethics Toolkit that link to EDI: 

Additional resources: 


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Case enhancement: Facial recognition for access and monitoring

Activity: Prompts to facilitate discussion activities. 

Author: Sarah Jayne Hitt, Ph.D. SFHEA (NMITE, Edinburgh Napier University).



There are several points in this case during which an educator can facilitate a class discussion about relevant issues. Below are prompts for discussion questions and activities that can be used. These correspond with the stopping points outlined in the case. Each prompt could take up as little or as much time as the educator wishes, depending on where they want the focus of the discussion to be. The discussion prompts for Dilemma Part three are already well developed in the case study, so this enhancement focuses on expanding the prompts in Parts one and two.


Dilemma Part one – Discussion prompts:

1. Legal Issues. Give students ten minutes, individually or in groups, to do some online research on GDPR and the Data Protection Act 2018. In either small groups or as a large class, discuss the following prompts. You can explain that even if a person is not an expert in the law, it is important to try to understand the legal context; indeed, an engineer is likely to have to interpret law and policy in their work. These questions invite critical thinking and informed consideration, but they do not necessarily have “right” answers and are suggestions that can help get a conversation started.

a. Are legal policies clear about how images of living persons should be managed when they are collected by technology of this kind?

b. What aspects of these laws might an engineer designing or deploying this system need to be aware of?

c. Do you think these laws are relevant when almost everyone walking around has a digital camera connected to the internet?

d. How could engineers help address legal or policy gaps through design choices?

2. Sharing Data. Before entering into a verbal discussion, either pass out the suggested questions listed in the case study on a worksheet or project on a screen. Have students spend five or ten minutes jotting down their personal responses. To understand the complexity of the issue, students could even create a quick mind map to show how different entities (police, security company, university, research group, etc.) interact on this issue. After the students spend some time in this personal reflection, educators could ask them to pair/share—turn to the person next to them and share what they wrote down. After about five minutes of this, each pair could amalgamate with another pair, with the educator giving them the prompt to report back to the full class on where they agree or disagree about the issues and why.

3. GDPR Consent. Before discussing this case particularly, ask students to describe a situation in which they had to give GDPR consent. Did they understand what they were doing, what the implications of consent are, and why? How did they feel about the process? Do they think it’s an appropriate system? This could be done as a large group, small group, or through individual reflection. Then turn the attention to this case and describe the change of perspective required here. Now, instead of being the person who is asked for consent, you are the person requiring consent. Engineers are not lawyers, but they are often responsible for delivering legally compliant systems. If you were the engineer in charge in this case, what steps might you take to ensure consent is handled appropriately? This question could be answered in small groups, and then each group could report back to the larger class and a discussion could follow the report-backs.

4. Institutional Complexity. The questions listed in the case study relate to the fact that the building in which the facial recognition system will be used accommodates many different stakeholders. To help students with these questions, educators could divide the class into small groups, with each group representing one of the institutions or stakeholder groups (college, hospital, MTU, students, patients, public, etc.). Have each group investigate whether regulations related to captured images are different for their stakeholders, and debate if they should be different. What considerations will the engineer in the case have to account for related to that group? The findings can then be discussed as a large class.


Dilemma Part two – Discussion prompts:

The following questions relate to macroethical concerns, which means that the focus is on wider ethical contexts such as fairness, equality, responsibility, and implications.

1. Benefits and Burdens. To prepare to discuss the questions listed in the case study, students could make a chart of potential harms and potential benefits of the facial recognition system. They could do this individually, in pairs or small groups, or as a large class. Educators should encourage them to think deeply and broadly on this topic, and not just focus on the immediate, short-term implications. Once this chart is made, the questions listed in the case study could be discussed as a group, and students asked to weigh up these burdens and benefits. How did they make the choices as to when a burden should outweigh a benefit or vice versa?

2. Equality and Utility. To address the questions listed in the case study, students could do some preliminary individual or small group research on the accuracy of facial recognition systems for various population groups. The questions could then be discussed in pairs, small groups, or as a large class.

3. Engineer Responsibility. Engineers are experts that have much more specific technical knowledge and understanding than the general public. Indeed, the vast majority of people have no idea how a facial recognition system works and what the legal requirements are related to it, even if they are asked to give their consent. Does an engineer therefore have more of a responsibility to make people aware and reassure them? Or is an engineer just fulfilling their duty by doing what their boss says and making the system work? What could be problematic about taking either of those approaches?



Case enhancement: Developing a school chatbot for student support services

Activity: Stakeholder mapping to elicit value assumptions and motivations.

Author: Karin Rudolph (Collective Intelligence).



This enhancement is for an activity found in point 5 of the Summary section of the case study.

What is stakeholder mapping?

What is a stakeholder?

Mapping out stakeholders will help you to:

  1. Identify the stakeholders you need to collaborate with to ensure the success of the project.
  2. Understand the different perspectives and points of view people have and how these experiences can have an impact on your project or product.
  3. Map out a wide range of people, groups or individuals that can affect and be affected by the project.


Stakeholder mapping:

The stakeholder mapping activity is a group exercise that provides students with the opportunity to discuss ethical and societal issues related to the School Chatbot case study. We recommend doing this activity in small groups of 6-8 students per table.





To carry out this activity, you will need the following resources:

1. Sticky notes (or digital notes if online).

2. A big piece of paper or digital board (Jamboard, Miro if online) divided into four categories:

3. Markers and pencils.


The activity:


Board One

List of stakeholders:

Below is a list of the stakeholders involved in the Chatbot project. Put each stakeholder on a sticky note and add it to the stakeholder map according to its level of influence and interest in the project. 

Top tip: use a different colour for each set of stakeholders.

School Chatbot – List of Stakeholders:





Each quadrant represents the following:

Board Two

Motivations, assumptions, ethical and societal risks:


1. A big piece of paper or digital board (Jamboard, Miro if online) divided into four categories:

2. Sticky notes (or digital notes if online).

3. Markers and pencils.

The activity:


Board Two

The Board Two activity can be done in two different ways:

Option 1:

You can use some guiding questions to direct the discussion. For example:

Option 2:

We have already written some assumptions, motivations and ethical/societal risks. You can add these as notes on a table and ask students to place them according to each category: stakeholders, motivations, assumptions, and ethical and societal risks. 



Potential ethical and societal risks:

Move and match: 





Ask students to choose 2–4 sticky notes and explain why they think these are important ethical/societal risks. 


Potential future activity:

A more advanced activity could involve a group discussion where students are asked to think about some mitigation strategies to minimise these risks.



The decisions engineers make on a daily basis can have significant consequences for underrepresented and disadvantaged groups in society. Prof Dawn Bonfield, Visiting Professor of Inclusive Engineering at Aston University, Royal Society Entrepreneur in Residence at King’s College London and a member of the EPC’s Engineering Ethics Advisory Group explains…

In the recent ethics report published by the RAEng (1) you might have noticed the explicit references, in an ethics context, to the societal and social justice implications of our engineering solutions that can lead to biased or discriminatory outcomes for different groups of people. This prioritisation of inclusive outcomes is a welcome expansion of the conventional focus of engineering ethics, which is often rooted in issues such as safety, corruption, and competence.

Reference was made in the first page of the report to the use of crash test dummies that have been designed to represent male drivers, leaving women (and pregnant women in particular) at greater risk in car accidents; the potential for algorithms and internet search engines to influence our thoughts on the world; issues arising from facial recognition technology failing to accurately identify those from Black, Asian and Ethnic Minority communities; and the use of artificial intelligence systems that will make safety-critical, legal, and other life changing decisions, which are often based on historical and biased datasets. You can further explore some of the issues with facial recognition technology in one of the ethics case studies produced by the EPC for their RAEng-supported Engineering Ethics Toolkit.

These are all examples of how, as engineers, we can inadvertently create solutions that are biased against minoritised groups of people if we are not careful. This generally occurs as a direct result of the fact that these groups of people are poorly represented in the engineering sector, and so their inputs are missing in the specification, design, and testing of new technologies (2). 

But even before we get to a truly diverse engineering workforce, all engineers must be mindful of the ways in which the decisions they take can be discriminatory or can promulgate bias. In situations like the ones mentioned above it is relatively easy to spot the opportunity for discrimination, but in other cases it can be much more difficult. For example, there are ethical implications associated with the sort of ducting that gets chosen for a new building, where one material causes more pollution to socially and economically disadvantaged populations than another. It is in cases like this that a little more thought is required to spot whether the outcomes of these decisions are inclusive and ethical, or not.

Recently, the Covid-19 pandemic has shown us very clearly the ethical implications of our built environment decisions and designs, where people living in densely populated and overcrowded urban areas with minimal access to outdoor space have had significantly worse health outcomes than those with access to outdoor and green spaces. Inclusive design of the built environment is now a growing and recognised area of our engineering work; as well as the more obvious examples of ensuring equitable access for disabled people, it also recognises that public spaces should be equitable and accessible to all communities. Everybody needs to see themselves represented in these environments and feel able to use them safely and fully. These are issues of ethics and inclusion, as well as social justice and equality, and the requirement we have as engineers to consider all of these perspectives as the creators of our future world must be a part of our systems engineering mindset. Several of the EPC’s ethics case studies focus on responsibility, equity, and stakeholder engagement, such as the Ageing Pipeline and its Impact on Local Communities case. 

Systems thinking, design, and iterative approaches, together with a focus on the whole life cycle of a product (including maintenance, repair, deconstruction, and end-of-life decommissioning), require true stakeholder engagement. This means that inclusive outcomes can be considered at the very start of projects, rather than as an afterthought, when changes are much more difficult and costly to integrate. The strengthening of the Social Value Act (3), which requires people who commission public services to explicitly evaluate how they can secure wider social, economic and environmental benefits, also puts emphasis on ensuring the outcomes of any procurement are inclusive and ethical. Similarly, the Sustainable Development Goals ethos of Leave No One Behind (4) requires that outcomes are considered from all perspectives, and that solutions take all of the goals into account in a balanced way rather than in silos. The EPC’s ethics case study on Business Growth Models allows engineering students to explore many of these issues. 

Designing with the gender perspective in mind, especially in parts of the world where women have very different societal roles based on culture, stereotypes, local norms, and religion, is key to ensuring that the differences and disadvantages that women face are not exacerbated. Understanding these differences is the first step in addressing them, and in many cases, technology can act as a real enabler in situations where women have limited access to traditional education, information, and independence. For example, the widespread use of microfinance in many parts of Africa – a technology not aimed specifically at women – is nevertheless giving women much better access to loans and financial independence than the traditional banking structures did, which women are not always able to access easily. Other examples include understanding the need for sanitation facilities in public spaces such as schools, government offices, transportation hubs and health clinics, without which women’s access to these facilities becomes restricted and their participation curtailed (5).

Another ethical issue comes into play here too. Do we design just to remove bias and discrimination, or do we design to reverse historical bias and discrimination? For example, women have traditionally worked in certain sectors such as care giving roles, and not in sectors like engineering and technology. Algorithmic decision-making tools can use this historical data to preferentially show stereotypical job opportunities based on past trends and evidence, which could foreseeably prevent women from being targeted for engineering related roles. Adapting these tools to make these job opportunities open to all in an equitable way is one thing, but what if we decided to preferentially show engineering roles to women and caring roles to men – a kind of social engineering, if you will? What are the ethics of this, and would that be going too far to remove biases? I will leave you to think about this one yourselves!  If you would like to write a case study about it, we are currently looking for contributors to the toolkit!

The decisions we make daily as engineers have consequences for individuals and communities that have not always been understood or considered in the past, but by understanding the need for inclusive outcomes for all stakeholders, we also ensure that our solutions are ethical and that we leave no one behind. The ethics case studies in the EPC’s recently launched Engineering Ethics Toolkit reveal the ethical concepts that underpin our everyday activities and what lies behind those decisions; resources like this should be used to ensure ethical decision making is integrated throughout an engineer’s education and continuing professional development. 

This blog is also available here.



  1. RAEng Ethics Report
  2. website
  3. Social Value Act
  4. Sustainable Development Goals ethos of Leave No One Behind
  5. Towards Vision Website ‘Gender Perspective in Engineering’


Dawn Bonfield MBE CEng FIMMM FICE HonFIStructE FWES is Visiting Professor of Inclusive Engineering at Aston University and Royal Society Entrepreneur in Residence at King’s College London.




Authors: Dr Nicola Whitehead (University of Wales Trinity Saint David); Professor Sarah Hitt (NMITE); Emma Crichton (Engineers Without Borders UK); Dr Sarah Junaid (Aston University); Professor Mike Sutcliffe (TEDI-London); Isobel Grimley (Engineering Professors’ Council).

Topic: Development and use of a facial recognition system. 

Engineering disciplines: Data, Electronics, Computer science, AI.

Ethical issues: Diversity, Bias, Privacy, Transparency.

Professional situations: Rigour, Informed consent, Misuse of data, Conflicts with leadership / management.

Educational level: Advanced. 

Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action. 


Learning and teaching notes: 

This case involves an engineer hired to manage the development and installation of a facial recognition project at a building used by university students, businesses and the public. It incorporates a variety of components including law and policy, stakeholder and risk analysis, and both macro- and micro-ethical elements. This example is UK-based: however, the instructor can adapt the content to better fit the laws and regulations surrounding facial recognition technology in other countries, if this would be beneficial.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this study to AHEP outcomes specific to a programme under these themes, access AHEP4 here and navigate to pages 30-31 and 35-37.

This case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two (focusing on the wider ethical context of the case) and Part three (focusing on the potential actions the engineer could take) develop and complicate the concepts presented in Part one to provide additional learning. The case study allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to: 


Learning and teaching resources:



Metropolitan Technical University (MTU), based in the UK, has an urban campus and many of its buildings are located in the city centre. A new student housing development in this area will be shared by MTU, a local college, and medical residents doing short rotations at the local hospital. The building has a public café on the ground floor and a couple of classrooms used by the university. 

The housing development sits alongside a common route for parades and protests. In the wake of demonstrations by Extinction Rebellion and Black Lives Matter, students have raised concerns to the property manager about safety. Despite an existing system of CCTV cameras and swipe cards, the university decides to install an enhanced security system, built around facial recognition technology that would enable access to the building and cross-reference with crime databases. To comply with GDPR, building residents will be required to give explicit consent before the system is implemented. Visitors without a student ID (such as café customers) will be buzzed in, but their image will be captured and cross-referenced before entry. A side benefit of the system is that MTU’s department of Artificial Intelligence Research will help with the installation and maintenance, as well as studying how it works, in order to make improvements. 
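The access rules described above can be sketched as simple decision logic. This is an illustrative sketch only: the function, record fields, and database names are hypothetical and not part of any real system, and whose crime database is used remains one of the case study's open questions.

```python
def admit(person, consent_register, crime_db):
    """Return (admitted, reason) for a person at the door.

    `person` is a dict with hypothetical fields:
      'face_id' - identifier returned by the face-matching service, or None.
    `consent_register` maps face_id -> True for residents who gave
    explicit GDPR consent; `crime_db` is a set of flagged identifiers.
    """
    face_id = person.get("face_id")
    if face_id is not None and consent_register.get(face_id):
        # Consenting resident: facial recognition grants entry directly.
        return True, "resident (consented)"
    # Visitor path: image is captured and cross-referenced before entry.
    if face_id in crime_db:
        return False, "flagged by cross-reference"
    return True, "visitor (buzzed in, image retained)"

# Example usage with invented identifiers
consents = {"r-001": True}
flagged = {"x-999"}
print(admit({"face_id": "r-001"}, consents, flagged))  # resident admitted
print(admit({"face_id": "x-999"}, consents, flagged))  # refused entry
print(admit({"face_id": None}, consents, flagged))     # visitor buzzed in
```

Even this toy version makes visible where the ethically loaded decisions sit: who populates the consent register, which database `crime_db` actually is, and what happens to the visitor's retained image.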


Dilemma – Part one:

You are an engineer who has been hired by MTU to take charge of the facial recognition system installation project, including setting policies and getting the system operational. With your background in AI engineering, you are expected to act as a technical advisor to MTU and liaise with the Facilities, Security and Computing departments to ensure a smooth deployment. This is the first time you have worked on a project that involves image capture. So as part of your preparation for the project, you need to do some preliminary research as to what best practices, guidance, and regulations apply.


Optional STOP for questions and activities: 

1. Discussion: What are the legal issues relating to image capture? Images allow for the identification of living persons and are therefore considered personal data under GDPR and the Data Protection Act (2018).

2. Discussion: Sharing data is a legally and ethically complex field. Is it appropriate to share images captured with the police? If not the police, then whose crime database will you use? Is it acceptable to share the data with the Artificial Intelligence Research group? Why, or why not?

3. Discussion: Under GDPR, individuals must normally consent to their personal data being processed. How should consent be handled in this case?

4. Discussion: Does the fact that the building will accommodate students from three different institutions (MTU, the local college, and the hospital) complicate these issues? Are regulations related to students’ captured images different than those related to public image capture?

5. Activity: Undertake a technical activity that relates to how facial recognition systems are engineered.
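For the technical activity above, one core building block of most facial recognition systems is comparing face embeddings — fixed-length vectors produced by a neural network — using a similarity measure and a tuned threshold. The sketch below assumes pre-computed embeddings; the vectors, dimensions, and threshold are illustrative, not taken from any real system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, enrolled, threshold=0.8):
    """Declare a match when similarity exceeds the threshold.

    In practice the threshold trades off false accepts against false
    rejects -- and error rates can differ across demographic groups,
    which is central to the bias questions this case raises.
    """
    return cosine_similarity(probe, enrolled) >= threshold

# Illustrative 4-dimensional embeddings (real systems use 128+ dimensions).
enrolled = [0.1, 0.9, 0.3, 0.4]
same_person = [0.12, 0.88, 0.31, 0.42]   # small differences: match
different = [0.9, 0.1, 0.7, 0.1]         # dissimilar: no match

print(is_match(same_person, enrolled))   # True
print(is_match(different, enrolled))     # False
```

A useful discussion point is that the threshold is an engineering choice with ethical weight: lowering it admits more impostors, raising it locks out more legitimate residents, and neither error is distributed evenly across the population.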


Dilemma – Part two:

The project has kicked off, and one of its deliverables is to establish the policies and safeguards that will govern the system. You convened a meeting of project stakeholders to determine what rules need to be built into the system’s software and presented a list of questions to help you make technical decisions. The questions you asked were:

What you had thought would be a quick meeting to agree basic principles turned out to be very lengthy and complex. You were surprised at the variety of perspectives and how heated the discussions became. The discussions raised some questions in your own mind as to the risks of the facial recognition system.


Optional STOP for questions and activities:

The following activities focus on macro-ethics, which seeks to understand the wider ethical contexts of projects like the facial recognition system.

1. Activity: Stakeholder mapping – Who are all the stakeholders and what might their positions and perspectives be? Is there a difference between the priorities of the different stakeholders?

2. Activity: There are many different values competing for priority here. Identify these values, discuss and debate how they should be weighed in the context of the project.

3. Activity: Risks can be understood as objective and / or subjective. Research the difference between these two types of risk, and identify which type(s) of risks exist related to the project.

4. Discussion: Which groups or individuals are potentially harmed by the technology and which potentially benefit? How should we go about setting priorities when there are competing harms and benefits?

5. Discussion: Does the technology used treat everyone from your stakeholders’ list equally? Should the needs of society as a whole outweigh the needs of the individual?

6. Activity: Make and defend an argument as to the appropriateness of installing and using the system.

7. Discussion: What responsibilities do engineers have in developing these technologies?


Dilemma – Part three:

A few days later, you were forwarded a screenshot of a social media post that heavily criticised the proposed facial recognition system. It was unclear where the post had originated, but it had clearly been shared and promoted among both students and the public, raising concerns about privacy and transparency. Your boss believes this outcry endangers the project and has requested that you make a public statement on behalf of MTU, reaffirming its commitment to installing the system.

You share the concerns, but have been employed to complete the project. You understand that suggesting it should be abandoned would most likely risk your job. What will you tell your boss? How will you prepare your public statement?


Optional STOP for questions and activities:

Micro-ethics concerns individuals and their responses to specific situations. The following steps are intended to help students develop their ability to practise moral analysis by considering the problem in a structured way and work towards possible solutions that they can analyse critically.

1. Discussion: What are the problems here?

2. Discussion: What are the possible courses of action you can take as an employee?

Students can be prompted to consider what different approaches they might adopt, such as the following, but can also develop their own possible responses.

3. Discussion: Which is the best approach and why? – Interrogate the pros and cons of each possible course of action, including the ethical, practical, cost, local-relationship, and reputational implications. Students should decide on their own preferred course of action and explain why its balance of pros and cons is preferable to the other options. The students may wish to consider this from other perspectives, such as:

4. Activity: Public Communication – Students can practise writing a press release, giving an interview, or making a public statement about the case and the decision that they make.

5. Activity: Reflection – Students can reflect on how this case study has enabled them to see the situation from different angles. Has it motivated them to understand the ethical concerns and to come to an acceptable conclusion?



An enhancement for this case study can be found here.


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Professor Thomas Lennerfors (Uppsala University); Claire Donovan (Royal Academy of Engineering); Isobel Grimley (Engineering Professors’ Council).

Topic:  Developing customised algorithms for student support.

Engineering disciplines: Computing, AI, Data.

Ethical issues: Bias, Social responsibility, Risk, Privacy.

Professional situations: Informed consent, Public health and safety, Conflicts with leadership / management, Legal implications.

Educational level: Beginner.

Educational aim: Develop ethical sensitivity. Ethical sensitivity is the broad cognisance of ethical issues and the ability to see how these might affect others.


Learning and teaching notes:

This case study involves the employees of a small software start-up that is creating a customised student support chatbot for a Sixth Form college. The employees come from different backgrounds and have different perspectives on the motivations behind their work, which leads to some interpersonal conflict. The team must also identify the ethical issues and competing values that arise in the course of developing their algorithm.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts which build in complexity and navigate between personal, professional, and societal contexts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. Pre-reading ‘Ethics of Care and Justice’ is recommended, though not required, for engaging with Part two. The case allows teachers the option to stop at multiple points for questions and / or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:


Learning and teaching resources:



Exaba is a small, three-person software startup. Like many small businesses, it has been struggling with finances during the pandemic. The company began selling its services across a variety of industry sectors but is now trying to expand by developing software solutions for the growing education technology sector.

Ivan, Exaba’s founder and CEO, was thrilled to be contracted by a growing local Sixth Form College in North West England, NorthStar Academy, to create a chatbot that will optimise student support services. These services include ensuring student safety and wellbeing, study skills advice, careers guidance, counselling, and identifying the need for, and implementing, extra learning support. It is such a large project that Ivan has been able to bring in Yusuf, a university student on placement from a computer systems programme, to help Nadja, Exaba’s only full-time software engineer. Ivan views the chatbot contract not only as a financial windfall that can help get the company back on track, but as the first project in a new product-development revenue stream.

Nadja and Yusuf have been working closely with NorthStar Academy’s Principal, Nicola, to create ‘Alice’, the custom student-support chatbot, and to ensure that she is designed appropriately and is fit for purpose. Nicola has seen growing evidence that chatbots can identify when students are struggling with a range of issues from attendance to anxiety. She has also seen that they can be useful in helping administrators understand what students need, how to help them more quickly, and where to invest more resources to make support most effective.


Optional STOP for questions and activities:

1. Discussion: What moral or ethical issues might be at stake or arise in the course of this project?

2. Discussion: What professional or legal standards might apply to the development of Alice?

3. Discussion: What design choices might Nadja and Yusuf have to consider as they build the chatbot software in order for it to conform to those standards?

4. Discussion: Is there anything risky about giving cognitive chatbots human names in general, or a female name specifically?

5. Activity: Undertake stakeholder mapping to elicit value assumptions and motivations.

6. Activity: Research any codes of ethics that might apply to AI in education, or policies / laws that apply to controlling and processing student data.

7. Activity: View the following TED talk and have a discussion on gender in digital assistants: Siri and Alexa are AI Built for the Past by Emily Liu.


Dilemma – Part one:

After undertaking work to ensure GDPR compliance through transparency, consent, and anonymisation of the data harvested by interactions with Alice, Nadja and Yusuf are now working on building the initial data set that the chatbot will call upon to provide student support. The chatbot’s information to students can only be as good as the existing data it has available to draw from. To enable this, Nicola has agreed to provide Exaba with NorthStar Academy’s existing student databases that span many years and cover both past and present students. While this data – including demographics, academic performances, and interactions with support services – is anonymised, Yusuf has begun to feel uncomfortable. One day, when the entire team was together discussing technical challenges, Yusuf said “I wonder what previous students would think if they found out that we were using all this information about them, without their permission?”

Ivan pointed out, “Nicola told us it was okay to use. They’re the data controllers, so it’s their responsibility to resolve that concern, not ours. We can’t tell them what to do with their own data. All we need to be worried about is making sure the data processing is done appropriately.”

Nadja added, “Plus, if we don’t use an existing data set, Alice will have to learn from scratch, meaning she won’t be as effective at the start. Wouldn’t it be better for our chatbot to be as intelligent and helpful as possible right away? Otherwise, she could put existing students at a disadvantage.”

Yusuf fell silent, figuring that he didn’t know as much as Ivan and Nadja. Since he was just on a placement, he felt that it wasn’t his place to push the issue any further with full-time staff.
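The anonymisation the team relies on is often, in practice, pseudonymisation: direct identifiers are replaced (for example by salted hashes) while the rest of the record is kept. Below is a minimal sketch with hypothetical field names. Note that under GDPR pseudonymised data is still personal data, and the remaining attributes can re-identify individuals — which is part of why Yusuf's question has force.

```python
import hashlib

def pseudonymise(record, salt, direct_identifiers=("name", "student_id", "email")):
    """Replace direct identifiers with salted hashes; keep other fields.

    This does NOT guarantee anonymity: remaining attributes (postcode,
    course, support history) can still single out an individual.
    """
    out = {}
    for key, value in record.items():
        if key in direct_identifiers:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[key] = digest[:12]  # truncated for readability
        else:
            out[key] = value
    return out

# Invented example record
record = {"name": "A. Student", "student_id": "S1234",
          "postcode": "M1 1AA", "support_visits": 7}
print(pseudonymise(record, salt="per-project-secret"))
```

Because the same salt maps the same student to the same pseudonym across the multi-year databases, records remain linkable over time — useful for training Alice, but exactly the property that makes "it's anonymised" a weaker reassurance than it sounds.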


Optional STOP for questions and activities:

1. Discussion: Expand upon Yusuf’s feelings of discomfort. What values or principles is this emotion drawing on?

2. Discussion: Do you agree with Yusuf’s perspective, or with Ivan’s and Nadja’s? Why?

3. Discussion: Does / should Yusuf have the right to voice any concerns or objections to his employer?

4. Discussion: Do / should previous NorthStar students have the right to control what the academy does with their data? To what extent, and for how long?

5. Discussion: Is there / should there be a difference between how data about children is used and that of adults? Why?

6. Discussion: Should a business, like Exaba, ever challenge its client, like NorthStar Academy, about taking potentially unethical actions?

7. Technical activity: Undertake a technical activity such as creating a process flow diagram, pieces of code and UI / UX design that either obscure or reinforce consent.

8. Activity: Undertake argument mapping to diagram and expand on the reasoning and evidence used by Yusuf, Nadja, and Ivan in their arguments.

9. Activity: Apply ethical theories to those arguments.  

10. Discussion: What ethical principles are at stake? Are there potentially any conflicts or contradictions arising from those principles?
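For the technical activity on consent design, the sketch below shows how a single default value in hypothetical UI code can obscure or reinforce consent. Under GDPR, consent must be a freely given, unambiguous, affirmative act, so a pre-ticked box is not valid consent.

```python
def build_consent_form(pre_checked):
    """Return the initial state of a consent checkbox (hypothetical UI)."""
    return {"label": "I agree to my data being used to train Alice",
            "checked": pre_checked}

def consent_given(form, user_toggled):
    """Valid consent requires an affirmative act by the user.

    A box the user actively ticked counts; a box that arrived
    pre-ticked and was merely left alone does not.
    """
    return user_toggled and not form["checked"]

opt_in = build_consent_form(pre_checked=False)   # compliant default
opt_out = build_consent_form(pre_checked=True)   # dark pattern

print(consent_given(opt_in, user_toggled=True))   # True: user ticked the box
print(consent_given(opt_out, user_toggled=False)) # False: pre-ticked box
```

The point for discussion is that the ethically decisive line here is a one-word default (`pre_checked`), buried in code that a reviewer might never look at.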


Dilemma – Part two:

Nicola, too, was under pressure. The academy’s Board had hired her as Principal to improve NorthStar’s rankings in the school performance table, to get the college’s finances back on track, and to support the government’s efforts at ‘levelling up’. This is why one of Nicola’s main specifications for Alice is that she be able to flag students at risk of not completing their qualifications. Exaba will have to develop an algorithm that can determine what those risk factors are.

In a brainstorming session Nadja began listing some ideas on the whiteboard. “Ethnic background, family income, low marks, students who fit that profile from the past and ultimately dropped out, students who engaged with support services a lot, students with health conditions . . .”

“Wait, wait, wait,” Yusuf said. “This feels a little bit like profiling to me. You know, like we think kids from certain neighbourhoods are unlikely to succeed so we’re building this thing to almost reinforce that they don’t.”

“The opposite is true!” Ivan exclaimed. “This algorithm will HELP exactly those students.”

“I can see how that’s the intention,” Yusuf acknowledged. “But I’ve had so many friends and neighbours experience well-intentioned but not appropriate advice from mentors and counsellors who think the only solution is for everyone to complete qualifications and go to university. This is not the best path for everybody!”

Nadja had been listening carefully. “There is something to what Yusuf is saying: Is it right to nudge students to stay in a programme that’s actually not a best fit for them? Could Alice potentially give guidance that is contrary to what a personal tutor, who knows the student personally, might advise? I don’t know if that’s the sort of algorithm we should develop.”

At this point Ivan got really frustrated with his employees: “This is the proprietary algorithm that’s going to save this company!” he shouted. “Never mind the rights and wrongs of it. Think of the business potential, not to mention all the schools and students this is going to help. The last thing I need is a mutiny from my team. We have the client’s needs to think about, and that’s it.”
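Yusuf's worry about profiling can be made concrete. Even if the team drops ‘ethnic background’ from Nadja's whiteboard list, a correlated feature such as postcode can act as a proxy for it. The toy data below is entirely invented (no real figures) and simply illustrates how flagging by a proxy can reproduce the disparity of the excluded attribute.

```python
# Toy, invented data: each student has a postcode and a protected
# attribute that the team has decided NOT to feed into the algorithm.
students = [
    {"postcode": "N1", "group": "A", "flagged": True},
    {"postcode": "N1", "group": "A", "flagged": True},
    {"postcode": "N1", "group": "A", "flagged": False},
    {"postcode": "S2", "group": "B", "flagged": False},
    {"postcode": "S2", "group": "B", "flagged": False},
    {"postcode": "S2", "group": "B", "flagged": True},
]

def flag_rate(rows, key, value):
    """Share of rows with rows[key] == value that were flagged 'at risk'."""
    subset = [r for r in rows if r[key] == value]
    return sum(r["flagged"] for r in subset) / len(subset)

# The protected attribute was excluded, yet in this toy data postcode
# predicts group perfectly, so flagging by postcode reproduces exactly
# the same disparity between groups.
print(flag_rate(students, "postcode", "N1"))
print(flag_rate(students, "group", "A"))
```

This is one technical framing of the disagreement between Yusuf and Ivan: removing a sensitive feature from the input list does not, by itself, remove it from the algorithm's behaviour.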


Optional STOP for questions and activities:

1. Activity: Compare an approach to this case through the ethics of care versus the ethics of justice. What different factors come into play? How should these be weighed? Might one approach lead to a better course of action than another? Why?

2. Discussion: What technical solutions, if any, could help mitigate Yusuf and Nadja’s concerns?

3. Activity: Imagine that Ivan agrees that this is a serious enough concern that they need to address it with Nicola. Role play a conversation between Ivan and Nicola.

4. Activity: Undertake a classroom debate on whether or not Alice has the potential to reinforce negative stereotypes. Variations include alley debate, stand where you stand, adopt and support opposite instinct.



An enhancement for this case study can be found here.


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.
