Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Professor Thomas Lennerfors (Uppsala University); Claire Donovan (Royal Academy of Engineering); Isobel Grimley (Engineering Professors’ Council).

Topic: Developing customised algorithms for student support.

Engineering disciplines: Computing, AI, Data.

Ethical issues: Bias, Social responsibility, Risk, Privacy.

Professional situations: Informed consent, Public health and safety, Conflicts with leadership / management, Legal implications.

Educational level: Beginner.

Educational aim: Develop ethical sensitivity. Ethical sensitivity is the broad cognisance of ethical issues and the ability to see how these might affect others.

 

Learning and teaching notes:

This case study involves the employees of a small software start-up that is creating a customised student support chatbot for a Sixth Form college. The employees come from different backgrounds and have different perspectives on the motivations behind their work, which leads to some interpersonal conflict. The team must also identify the ethical issues and competing values that arise in the course of developing their algorithm.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts which build in complexity and navigate between personal, professional, and societal contexts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. Pre-reading ‘Ethics of Care and Justice’ is recommended, though not required, for engaging with Part two. The case allows teachers the option to stop at multiple points for questions and / or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Exaba is a small, three-person software startup. Like many small businesses, it has struggled financially during the pandemic. The company began by selling its services across a variety of industry sectors but is now trying to expand by developing software solutions for the growing education technology sector.

Ivan, Exaba’s founder and CEO, was thrilled to be contracted by a growing local Sixth Form College in North West England, NorthStar Academy, to create a chatbot that will optimise student support services. These services include ensuring student safety and wellbeing, study skills advice, careers guidance, counselling, and identifying the need for, and implementing, extra learning support. It is such a large project that Ivan has been able to bring in Yusuf, a university student on placement from a computer systems programme, to help Nadja, Exaba’s only full-time software engineer. Ivan views the chatbot contract not only as a financial windfall that can help get the company back on track, but also as the first project in a new product-development revenue stream.

Nadja and Yusuf have been working closely with NorthStar Academy’s Principal, Nicola, to create ‘Alice’, the custom student-support chatbot, and to ensure that she is designed appropriately and is fit for purpose. Nicola has seen growing evidence that chatbots can identify when students are struggling with a range of issues, from attendance to anxiety. She has also seen that they can be useful in helping administrators understand what students need, how to help them more quickly, and where to invest more resources to make support most effective.

 

Optional STOP for questions and activities:

1. Discussion: What moral or ethical issues might be at stake or arise in the course of this project?

2. Discussion: What professional or legal standards might apply to the development of Alice?

3. Discussion: What design choices might Nadja and Yusuf have to consider as they build the chatbot software in order for it to conform to those standards?

4. Discussion: Is there anything risky about giving cognitive chatbots human names in general, or a female name specifically?

5. Activity: Undertake stakeholder mapping to elicit value assumptions and motivations.

6. Activity: Research any codes of ethics that might apply to AI in education, or policies / laws that apply to controlling and processing student data.

7. Activity: View the following TED talk and have a discussion on gender in digital assistants: Siri and Alexa are AI Built for the Past by Emily Liu.

 

Dilemma – Part one:

After undertaking work to ensure GDPR compliance through transparency, consent, and anonymisation of the data harvested by interactions with Alice, Nadja and Yusuf are now working on building the initial data set that the chatbot will call upon to provide student support. The information the chatbot gives to students can only be as good as the existing data it has available to draw on. To enable this, Nicola has agreed to provide Exaba with NorthStar Academy’s existing student databases, which span many years and cover both past and present students. While this data – including demographics, academic performance, and interactions with support services – is anonymised, Yusuf has begun to feel uncomfortable. One day, when the entire team was together discussing technical challenges, Yusuf said, “I wonder what previous students would think if they found out that we were using all this information about them, without their permission?”

Ivan pointed out, “Nicola told us it was okay to use. They’re the data controllers, so it’s their responsibility to resolve that concern, not ours. We can’t tell them what to do with their own data. All we need to be worried about is making sure the data processing is done appropriately.”

Nadja added, “Plus, if we don’t use an existing data set, Alice will have to learn from scratch, meaning she won’t be as effective at the start. Wouldn’t it be better for our chatbot to be as intelligent and helpful as possible right away? Otherwise, she could put existing students at a disadvantage.”

Yusuf fell silent, figuring that he didn’t know as much as Ivan and Nadja. Since he was just on a placement, he felt that it wasn’t his place to push the issue any further with full-time staff.
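To make the data-handling questions in this dilemma concrete, teachers may wish to show a small illustration of what ‘anonymised’ student data can look like in practice. The sketch below is written in Python with pandas; the column names, salt, and records are entirely hypothetical and are not drawn from the case, and it is a minimal illustration of pseudonymisation rather than a complete GDPR-compliant pipeline or a description of how Exaba works.

```python
# Minimal pseudonymisation sketch with hypothetical column names and records.
# NOTE: this is NOT full anonymisation - quasi-identifiers such as postcode or
# date of birth can still allow re-identification when combined.
import hashlib

import pandas as pd

# Hypothetical student records of the kind described in the case.
records = pd.DataFrame({
    "student_id": ["S1001", "S1002", "S1003"],
    "name": ["A. Example", "B. Example", "C. Example"],
    "postcode": ["M1 1AA", "M2 2BB", "M3 3CC"],
    "attendance_pct": [92, 74, 88],
    "support_visits": [1, 6, 2],
})

SALT = "replace-with-a-secret-value"  # would be kept out of version control


def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a salted, one-way hash."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]


prepared = records.copy()
prepared["student_id"] = prepared["student_id"].map(pseudonymise)
prepared = prepared.drop(columns=["name"])            # drop direct identifiers
prepared["postcode"] = prepared["postcode"].str[:2]   # coarsen a quasi-identifier

print(prepared)
```

Even this toy example shows why Yusuf’s unease is not settled by anonymisation alone: removing names does not answer the question of whether past students would expect their records to be reused to train a new system.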

 

Optional STOP for questions and activities:

1. Discussion: Expand upon Yusuf’s feelings of discomfort. What values or principles is this emotion drawing on?

2. Discussion: Do you agree with Yusuf’s perspective, or with Ivan’s and Nadja’s? Why?

3. Discussion: Does / should Yusuf have the right to voice any concerns or objections to his employer?

4. Discussion: Do / should previous NorthStar students have the right to control what the academy does with their data? To what extent, and for how long?

5. Discussion: Is there / should there be a difference between how data about children is used and that of adults? Why?

6. Discussion: Should a business, like Exaba, ever challenge its client, like NorthStar Academy, about taking potentially unethical actions?

7. Technical activity: Create a process flow diagram, pieces of code, or UI / UX designs that either obscure or reinforce consent.

8. Activity: Undertake argument mapping to diagram and expand on the reasoning and evidence used by Yusuf, Nadja, and Ivan in their arguments.

9. Activity: Apply ethical theories to those arguments.  

10. Discussion: What ethical principles are at stake? Are there potentially any conflicts or contradictions arising from those principles?

 

Dilemma – Part two:

Nicola, too, was under pressure. The academy’s Board had hired her as Principal to improve NorthStar’s rankings in the school performance tables, to get the college’s finances back on track, and to support the government’s ‘levelling up’ efforts. This is why one of Nicola’s main specifications for Alice is that she be able to flag students at risk of not completing their qualifications. Exaba will have to develop an algorithm that can determine what those risk factors are.

In a brainstorming session Nadja began listing some ideas on the whiteboard. “Ethnic background, family income, low marks, students who fit that profile from the past and ultimately dropped out, students who engaged with support services a lot, students with health conditions . . .”

“Wait, wait, wait,” Yusuf said. “This feels a little bit like profiling to me. You know, like we think kids from certain neighbourhoods are unlikely to succeed so we’re building this thing to almost reinforce that they don’t.”

“The opposite is true!” Ivan exclaimed. “This algorithm will HELP exactly those students.”

“I can see how that’s the intention,” Yusuf acknowledged. “But I’ve had so many friends and neighbours experience well-intentioned but not appropriate advice from mentors and counsellors who think the only solution is for everyone to complete qualifications and go to university. This is not the best path for everybody!”

Nadja had been listening carefully. “There is something to what Yusuf is saying: Is it right to nudge students to stay in a programme that’s actually not a best fit for them? Could Alice potentially give guidance that is contrary to what a personal tutor, who knows the student personally, might advise? I don’t know if that’s the sort of algorithm we should develop.”

At this point Ivan got really frustrated with his employees: “This is the proprietary algorithm that’s going to save this company!” he shouted. “Never mind the rights and wrongs of it. Think of the business potential, not to mention all the schools and students this is going to help. The last thing I need is a mutiny from my team. We have the client’s needs to think about, and that’s it.”
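At its core, the ‘at risk’ flag Ivan wants is a classifier trained on historical outcomes. The sketch below (Python with scikit-learn, using synthetic data and hypothetical feature names) is intended only to make Nadja’s whiteboard list concrete for discussion; it is not a suggestion of how Exaba would or should build Alice. It fits a simple logistic regression twice, once with and once without a sensitive attribute, so that learners can see how directly a feature-selection choice shapes what the model learns.

```python
# Illustrative "at-risk" classifier with synthetic data and hypothetical
# features. Including a sensitive attribute (here, family income band) bakes
# historical patterns into the predictions - the profiling risk Yusuf raises.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

students = pd.DataFrame({
    "avg_mark": rng.normal(60, 12, n),
    "attendance_pct": rng.normal(85, 10, n),
    "support_visits": rng.poisson(2, n),
    "family_income_band": rng.integers(1, 6, n),  # sensitive attribute
})

# Synthetic "dropped out" labels, loosely tied to marks and attendance.
risk = 1 / (1 + np.exp(0.08 * (students["avg_mark"] - 55)
                       + 0.05 * (students["attendance_pct"] - 80)))
students["dropped_out"] = rng.random(n) < risk

with_sensitive = ["avg_mark", "attendance_pct", "support_visits",
                  "family_income_band"]
without_sensitive = ["avg_mark", "attendance_pct", "support_visits"]

for features in (with_sensitive, without_sensitive):
    model = LogisticRegression(max_iter=1000)
    model.fit(students[features], students["dropped_out"])
    print(dict(zip(features, model.coef_[0].round(3))))
```

Dropping the sensitive column is not a complete fix, because features such as attendance or support-service use can act as proxies for it; the point is simply that the ethical question lives inside ordinary-looking engineering decisions.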

 

Optional STOP for questions and activities:

1. Activity: Compare an approach to this case through the ethics of care versus the ethics of justice. What different factors come into play? How should these be weighed? Might one approach lead to a better course of action than another? Why?

2. Discussion: What technical solutions, if any, could help mitigate Yusuf and Nadja’s concerns?

3. Activity: Imagine that Ivan agrees that this is a serious enough concern that they need to address it with Nicola. Role play a conversation between Ivan and Nicola.

4. Activity: Undertake a classroom debate on whether or not Alice has the potential to reinforce negative stereotypes. Variations include alley debate, stand where you stand, and adopt and support opposite instinct.

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Raffaella Ocone OBE FREng FRSE (Heriot-Watt University); Professor Thomas Lennerfors (Uppsala University); Professor Sarah Hitt SFHEA (NMITE); Isobel Grimley (Engineering Professors’ Council).

Topic: Soil carbon sequestration and Solar geoengineering.

Engineering disciplines: Chemical engineering; Energy and Environmental engineering.

Ethical issues: Respect for the environment; Social responsibility; Risk.

Professional situations: Public health and safety, Communication.

Educational level: Beginner.

Educational aim: To develop ethical awareness. Ethical awareness is when an individual determines that a single situation has moral implications and can be considered from an ethical point of view.

 

Learning and teaching notes:

This case involves a dilemma that most engineering students will have to face at least once in their careers: which job offer to accept. This study allows students to consider how personal values affect professional decisions. The ethical aspect of this dilemma comes from weighing competing moral goods – that is, evaluating what might be the better choice between two ethically acceptable options. In addition, the case offers students an introduction to ethical principles underpinning EU environmental law, and a chance to debate ethical aspects surrounding emerging technologies. Finally, the case invites consideration of the injustices inherent in proposed solutions to climate change.

This case study addresses two AHEP 4 themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and / or activities, as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Olivia is a first-generation university student who grew up on a farm in rural Wales and was often frustrated by living in such a remote environment. After achieving excellent A levels in maths and sciences, she took up a place on a chemical engineering course in London.

Olivia became passionate about sustainability and thrived during her placements with companies that were working on innovative climate solutions. One of the most formative events for her was COP26 in Glasgow. There, she attended debates and negotiations that contributed to new global agreements aimed at limiting global warming to 1.5°C. Following this experience, Olivia has been looking for jobs that would allow her to work on the front line of combating climate change.

 

Dilemma – Part one:

Olivia has received two job offers. One is a very well-paid position at CarGro, a small firm not far from her family farm. This company works on chemical analysis for soil carbon storage – the ability of soil’s organic matter to sequester carbon-rich compounds and therefore offset atmospheric CO2.

The other offer is for an entry-level position at EnSol, a company developing the feasibility of stratospheric aerosol injection. This technology aims to mimic the effect that volcanic eruptions have on the atmosphere when they eject particles into the stratosphere that reflect sunlight and subsequently cool the planet. EnSol is a start-up located in Bristol that has connections with other European companies working on complementary technologies.

While considering these two offers, Olivia recalls an ethics lesson she had in an engineering design class. This lesson examined the ethical implications of projects that engineers choose to work on. The example used was of a biomedical engineer who had to decide whether to work on cancer cures or cancer prevention, and which was more ethically impactful. Olivia knows that both CarGro and EnSol have the potential to mitigate climate change, but she wonders if one might be better than the other. In addition, she has her own goals and motivations to consider: does she really want to work near her parents again, no matter how well-paid that job is?

 

Optional STOP for questions and activities: 

1. Discussion: Personal values – what personal values will Olivia have to weigh in order to decide which job offer to accept? 

2. Activity: research the climate mitigation potential of soil carbon sequestration (SCS) and stratospheric aerosol injection (SAI).

3. Discussion: Professional values – based on the research, which company is doing the work that Olivia might feel is most ethically impactful? Make an argument for both companies.

4. Discussion: Wider impact – what impact does the work of these two companies have? Consider this on local, regional, and global scales. Who benefits from their work, and who does not?

5. Discussion: Technical integration – undertake a technical activity in the areas of chemical engineering, energy and / or environmental engineering related to the climate mitigation potential of SCS and SAI.

 

Dilemma – Part two:

To help her with the decision, Olivia talks with three of her former professors. The first is Professor Carrera, whom Olivia accompanied to COP26. Professor Carrera specialises in technology policy, and tells Olivia about the precautionary principle, a core component of EU environmental law. This principle is designed to help governments make decisions when outcomes are uncertain.

The second is Professor Adams, Olivia’s favourite chemical engineering professor, who got her excited about emerging technologies in the area of climate change mitigation. Professor Adams emphasises the opportunity that EnSol provides to work on cutting-edge research and development – “the sort of technology that might make you rich, as well!”

Finally, Olivia speaks to Professor Liu, an expert in engineering ethics. Professor Liu’s latest book on social responsibility in engineering argues that many climate change mitigation technologies are inequitable because they unfairly benefit rich countries and have the potential to be risky and burdensome to poorer ones.

Based on these conversations, Olivia decides to ask the hiring managers at CarGro and EnSol some follow-up questions. Knowing she is about to make these phone calls, both her mother and her best friend Owen (who has already secured a job in Bristol) have messaged her with contradictory advice. What does Olivia ask on the calls to CarGro and EnSol to help her make a decision? Ultimately, which job should Olivia take?

 

Optional STOP for questions and activities:

1. Activity and discussion: Research the precautionary principle – what have been the potentially positive and negative aspects of its effect on EU policy decisions related to the environment?

2. Activity: Identify the risks and benefits of SCS and SAI for different communities.

3. Activity: Map the arguments of the three professors. Whose perspective might be the most persuasive to Olivia, and why?

4. Activity: Rehearse and role play phone calls with both companies.

5. Activity: Debate which position Olivia should take.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Mike Sutcliffe (TEDI-London); Professor Mike Bramhall (TEDI-London); Prof Sarah Hitt SFHEA (NMITE); Johnny Rich (Engineering Professors’ Council); Professor Dawn Bonfield MBE (Aston University); Professor Chike Oduoza (University of Wolverhampton); Steven Kerry (Rolls-Royce); Isobel Grimley (Engineering Professors’ Council).

Topic: Smart meters for responsible everyday energy use.

Engineering disciplines: Electrical engineering

Ethical issues: Integrity, Transparency, Social responsibility, Respect for the environment, Respect for the law

Professional situations: Communication, Privacy, Sustainability

Educational level: Beginner

Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action. 

 

Learning and teaching notes:

This case is an example of ‘everyday ethics’. A professional engineer must give advice to a friend about whether or not they should install a smart meter. It addresses issues of ethical and environmental responsibility as well as public policy, financial burdens and data privacy. The case helps to uncover values that underlie assumptions that people hold about the environment and its connection to human life and services. It also highlights the way that those values inform everyday decision-making.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in three parts that build in complexity. If desired, a teacher can use Part one in isolation, but Parts two and three develop and complicate the concepts presented in Part one in order to provide additional learning. The case allows teachers the opportunity to stop at various points to pose questions and / or set activities.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary – Part one:

Sam and Alex have been friends since childhood. As they have grown older, they have discovered that they hold very different political and social beliefs, but they never let these differences of opinion get in the way of a long and important friendship. In fact, they often test their own ideas against each other in bantering sessions, knowing that they are built on a foundation of respect.

Sam works as an accountant and Alex has become an environmental engineer. Perhaps naturally, Alex often asks Sam for financial advice, while Sam depends on Alex for expert information related to sustainability and the environment. One day, knowing that Alex is knowledgeable about the renewable energy industry and very conscious of the impact of energy use at home, Sam messages Alex to say that they are getting pressure from their energy company to install a smart meter.

Sam has been told that smart metering is free, brings immediate benefits to customers by helping them take control of their energy usage, and is a key enabler of the transition away from fossil fuel use and towards the delivery of net zero emissions by 2050. Smart meters give consumers near real-time information on energy use and the associated cost, enabling them to better manage their energy use, save money, and reduce emissions. A further benefit is that Sam could charge their electric car far more cheaply using a smart meter on an overnight tariff.

Yet Sam has also read that smart meters ‘go dumb’ if customers switch providers, and that, as a pre-payment customer, switching may not even be an option with a smart meter. In addition, Sam suspects that, despite claims that the smart meter roll-out is free, the charge is simply being passed on to customers through their energy bills. Alex tries to give Sam as much good information as possible, but the conversation ends with the decision unresolved.

 

Optional STOP for questions and activities: 

1. Discussion and activity: Personal values – We know that Sam and Alex have different ideas and opinions about many things. This probably stems from a difference in how they prioritise values. For instance, valuing transparency over efficiency, or sustainability over convenience. Using this values activity as a prompt, what personal values might be competing in this particular case?

2. Discussion and activity: Everyday ethics – Consider what values are involved in your everyday choices, decisions, and actions. Write a reflective essay on three events in the past week that, upon further analysis, have ethical components.

3. Discussion: Professional values – Does Alex, as an environmental engineer, have a responsibility to advocate installing smart meters? If so, does he have more responsibility than a non-engineer to advocate for this action? Why, or why not?

4. Discussion: Wider impact – Are there broader ethical issues at stake here?

5. Activity: Role-play a conversation between Sam and Alex that includes what advice should be given and what the response might be.

 

Dilemma – Part two:

After getting more technical information from Alex, Sam realises that, with a smart meter, data on the household’s energy usage would be collected every 30 minutes. This is something they had not anticipated, and they ask a number of questions about the implications. Furthermore, while Sam has so far compared tariffs and costs as the main way to choose an energy provider, Alex points out that different providers use different energy sources, such as wind, gas, nuclear, coal, and solar. Sam is on a tight budget, but Alex explains that the cheaper solution is not necessarily the most environmentally responsible choice. Sam is frustrated: now there is something else to consider besides whether or not to install the smart meter.
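For teachers who want to anchor the activities that follow in the data itself, the sketch below (Python with pandas, using made-up half-hourly readings and hypothetical tariff rates, not real supplier prices) shows what a single day of smart meter data might look like and how an overnight two-rate tariff can compare with a flat rate for a household that charges an electric car at night.

```python
# One made-up day of half-hourly smart meter readings and a toy tariff
# comparison. All rates and consumption figures are hypothetical.
import pandas as pd

index = pd.date_range("2024-01-15", periods=48, freq="30min")
usage = pd.Series(0.15, index=index)                       # background load, kWh
usage.loc["2024-01-15 17:00":"2024-01-15 21:00"] = 0.6     # evening peak
usage.loc["2024-01-15 00:30":"2024-01-15 04:30"] += 3.5    # overnight EV charging

flat_rate = 0.28                       # £/kWh, hypothetical
offpeak_rate, peak_rate = 0.09, 0.32   # hypothetical two-rate tariff

offpeak_use = usage.between_time("00:30", "04:30").sum()
total_use = usage.sum()

flat_cost = total_use * flat_rate
two_rate_cost = offpeak_use * offpeak_rate + (total_use - offpeak_use) * peak_rate
print(f"Daily use {total_use:.1f} kWh: flat £{flat_cost:.2f}, two-rate £{two_rate_cost:.2f}")
```

The same half-hourly granularity that makes the cheaper tariff possible also makes routines, occupancy, and unusual loads visible in the usage profile, which is what gives the privacy questions below, and the disclosure in Part three, their bite.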

 

Optional STOP for questions and activities:  

1. Activity: Technical integration – Undertake an electrical engineering technical activity related to smart meters and the data that they collect.

2. Activity: Research what happens with the data collected by a smart meter. Who can access this data and how is privacy protected? How does this data inform progress towards the energy transition from fossil fuels?

3. Activity: Research different energy companies and their approach to responsible energy sourcing and use. How do these companies communicate that approach to the public? Which company would you recommend to your friend and why?

4. Activity: Cost-benefit analysis – Sometimes the ethical choice is the more expensive choice. How do you balance short- and long-term benefits in this case? When, if ever, would it be ethically right to choose energy from non-renewable sources? How would this choice differ if the context being considered was different? For example, students could think about responsible energy use in industrialised economies versus the developing world and energy justice.

 

Dilemma – Part three:

Following this exchange with Sam, Alex becomes aware that one of the main obstacles in the energy transition concerns communication with the public. Ideally, Alex wants to persuade family and other friends to make more responsible choices; however, it is clear that there are many more factors involved than can be seen at a glance, including the kinds of pressure that companies and the government put on consumers. Alex begins to reflect on how policy drives what engineers think and do, and joins a new government network on Engineering in Policy.

Alex and Sam meet up a little while later, and Sam announces that yes, a smart meter has been installed. At first Alex is relieved, but then Sam lets it slip that they are planning to grow marijuana in their London home. Sam asks whether this spike in energy use will be picked up as abnormal by a smart meter and whether this would lead to them being found out.

 

Optional STOP for questions and activities:  

1. Discussion: Personal values – What are the ethics involved in trying to persuade others to make similar choices to you?

2. Discussion and activity: Legal responsibility – What should Alex say or do about Sam’s disclosure? Role-play a conversation between Sam and Alex.

3. Discussion: Professional responsibility – What role should engineers play in setting and developing public policy on energy?

4. Activity: Energy footprint – Research which industries use the most energy and, on a smaller scale, which home appliances use the most energy.

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.
