Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot-Watt University); Professor Thomas Lennerfors (Uppsala University); Claire Donovan (Royal Academy of Engineering); Isobel Grimley (Engineering Professors’ Council).
Topic: Developing customised algorithms for student support.
Engineering disciplines: Computing, AI, Data.
Ethical issues: Bias, Social responsibility, Risk, Privacy.
Professional situations: Informed consent, Public health and safety, Conflicts with leadership / management, Legal implications.
Educational level: Beginner.
Educational aim: Develop ethical sensitivity. Ethical sensitivity is the broad cognisance of ethical issues and the ability to see how these might affect others.
Learning and teaching notes:
This case study involves the employees of a small software start-up that is creating a customised student support chatbot for a Sixth Form college. The employees come from different backgrounds and have different perspectives on the motivations behind their work, which leads to some interpersonal conflict. The team must also identify the ethical issues and competing values that arise in the course of developing their algorithm.
This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.
The dilemma in this case is presented in two parts which build in complexity and navigate between personal, professional, and societal contexts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. Pre-reading ‘Ethics of Care and Justice’ is recommended, though not required, for engaging with Part two. The case allows teachers the option to stop at multiple points for questions and / or activities as desired.
Learners have the opportunity to:
- identify ethical and legal issues related to emerging technologies;
- apply codes of ethics to an engineering ethics dilemma;
- consider different perspectives on an ethical issue and what values inform those perspectives;
- practise professional communication related to ethical dilemmas.
Teachers have the opportunity to:
- introduce ethics of care and ethics of justice;
- integrate technical content for developing software and algorithms;
- highlight strategies to deal with conflicts between management, clients, and employees;
- explore wider contexts and implications of engineering technologies;
- informally evaluate students’ critical thinking and communication skills.
Learning and teaching resources:
- RAEng/Engineering Council Statement of Ethical Principles
- The ethical framework for AI in education
- Princeton Dialogues on AI and ethics
- Negotiating the ethics of care and justice
Summary:
Exaba is a small, three-person software startup. Like many small businesses, it has been struggling with finances during the pandemic. The company began selling its services across a variety of industry sectors but is now trying to expand by developing software solutions for the growing education technology sector.
Ivan, Exaba’s founder and CEO, was thrilled to be contracted by a growing local Sixth Form College in North West England, NorthStar Academy, to create a chatbot that will optimise student support services. These services include ensuring student safety and wellbeing, study skills advice, careers guidance, counselling, and identifying the need for, and implementing, extra learning support. It is such a large project that Ivan has been able to bring in Yusuf, a university student on placement from a computer systems programme, to help Nadja, Exaba’s only full-time software engineer. Ivan views the chatbot contract not only as a financial windfall that can help get the company back on track, but also as the first project in a new product-development revenue stream.
Nadja and Yusuf have been working closely with NorthStar Academy’s Principal, Nicola, to create ‘Alice’, the custom student-support chatbot, and to ensure that she is designed appropriately and is fit for purpose. Nicola has seen growing evidence that chatbots can identify when students are struggling with a range of issues, from attendance to anxiety. She has also seen that they can be useful in helping administrators understand what students need, how to help them more quickly, and where to invest more resources to make support most effective.
Optional STOP for questions and activities:
1. Discussion: What moral or ethical issues might be at stake or arise in the course of this project?
2. Discussion: What professional or legal standards might apply to the development of Alice?
3. Discussion: What design choices might Nadja and Yusuf have to consider as they build the chatbot software in order for it to conform to those standards?
4. Discussion: Is there anything risky about giving cognitive chatbots human names in general, or a female name specifically?
5. Activity: Undertake stakeholder mapping to elicit value assumptions and motivations.
6. Activity: Research any codes of ethics that might apply to AI in education, or policies / laws that apply to controlling and processing student data.
7. Activity: View the following TED talk and have a discussion on gender in digital assistants: Siri and Alexa are AI Built for the Past by Emily Liu.
Dilemma – Part one:
After undertaking work to ensure GDPR compliance through transparency, consent, and anonymisation of the data harvested from interactions with Alice, Nadja and Yusuf are now working on building the initial data set that the chatbot will call upon to provide student support. The chatbot’s guidance to students can only be as good as the existing data it has available to draw from. To enable this, Nicola has agreed to provide Exaba with NorthStar Academy’s existing student databases, which span many years and cover both past and present students. While this data – including demographics, academic performance, and interactions with support services – is anonymised, Yusuf has begun to feel uncomfortable. One day, when the entire team was together discussing technical challenges, Yusuf said, “I wonder what previous students would think if they found out that we were using all this information about them, without their permission?”
Ivan pointed out, “Nicola told us it was okay to use. They’re the data controllers, so it’s their responsibility to resolve that concern, not ours. We can’t tell them what to do with their own data. All we need to be worried about is making sure the data processing is done appropriately.”
Nadja added, “Plus, if we don’t use an existing data set, Alice will have to learn from scratch, meaning she won’t be as effective at the start. Wouldn’t it be better for our chatbot to be as intelligent and helpful as possible right away? Otherwise, she could put existing students at a disadvantage.”
Yusuf fell silent, figuring that he didn’t know as much as Ivan and Nadja. Since he was just on a placement, he felt that it wasn’t his place to push the issue any further with full-time staff.
Optional STOP for questions and activities:
1. Discussion: Expand upon Yusuf’s feelings of discomfort. What values or principles is this emotion drawing on?
2. Discussion: Do you agree with Yusuf’s perspective, or with Ivan’s and Nadja’s? Why?
3. Discussion: Does / should Yusuf have the right to voice any concerns or objections to his employer?
4. Discussion: Do / should previous NorthStar students have the right to control what the academy does with their data? To what extent, and for how long?
5. Discussion: Is there / should there be a difference between how data about children is used and how data about adults is used? Why?
6. Discussion: Should a business, like Exaba, ever challenge its client, like NorthStar Academy, about taking potentially unethical actions?
7. Technical activity: Create a process flow diagram, pieces of code, or UI / UX designs that either obscure or reinforce consent (a brief illustrative sketch follows this list).
8. Activity: Undertake argument mapping to diagram and expand on the reasoning and evidence used by Yusuf, Nadja, and Ivan in their arguments.
9. Activity: Apply ethical theories to those arguments.
10. Discussion: What ethical principles are at stake? Are there potentially any conflicts or contradictions arising from those principles?
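For the technical activity above (question 7), below is a minimal illustrative sketch of how consent logic can be either obscured or reinforced in code. It is a hypothetical example only: the language (Python), the function names, and the consent field are assumptions made for illustration, not part of Exaba’s actual design.

```python
# Hypothetical sketch: illustrative names only, not Exaba's actual code.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StudentRecord:
    student_id: str
    consent: Optional[bool]  # True = consented, False = refused, None = never asked

def include_opt_out(records: List[StudentRecord]) -> List[StudentRecord]:
    """Obscures consent: a record is used unless the student actively refused,
    so silence (None) is quietly treated as agreement."""
    return [r for r in records if r.consent is not False]

def include_opt_in(records: List[StudentRecord]) -> List[StudentRecord]:
    """Reinforces consent: a record is used only where an explicit 'yes' was recorded."""
    return [r for r in records if r.consent is True]

if __name__ == "__main__":
    cohort = [
        StudentRecord("A01", True),   # current student, gave consent
        StudentRecord("A02", None),   # former student, never asked
        StudentRecord("A03", False),  # refused consent
    ]
    print(len(include_opt_out(cohort)))  # 2 - silence counts as consent
    print(len(include_opt_in(cohort)))   # 1 - only explicit consent counts
```

The same contrast can be mapped onto UI / UX choices, for example a pre-ticked consent checkbox versus an unticked one that the student must actively select.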
Dilemma – Part two:
Nicola, too, was under pressure. The academy’s Board had hired her as Principal to improve NorthStar’s rankings in the school performance table, to get the college’s finances back on track, and to support the government’s efforts at ‘levelling up’. This is why one of Nicola’s main specifications for Alice is that she be able to flag students at risk of not completing their qualifications. Exaba will have to develop an algorithm that can determine what those risk factors are.
In a brainstorming session Nadja began listing some ideas on the whiteboard. “Ethnic background, family income, low marks, students who fit that profile from the past and ultimately dropped out, students who engaged with support services a lot, students with health conditions . . .”
“Wait, wait, wait,” Yusuf said. “This feels a little bit like profiling to me. You know, like we think kids from certain neighbourhoods are unlikely to succeed so we’re building this thing to almost reinforce that they don’t.”
“The opposite is true!” Ivan exclaimed. “This algorithm will HELP exactly those students.”
“I can see how that’s the intention,” Yusuf acknowledged. “But I’ve had so many friends and neighbours experience well-intentioned but not appropriate advice from mentors and counsellors who think the only solution is for everyone to complete qualifications and go to university. This is not the best path for everybody!”
Nadja had been listening carefully. “There is something to what Yusuf is saying: Is it right to nudge students to stay in a programme that’s actually not the best fit for them? Could Alice potentially give guidance that is contrary to what a personal tutor, who knows the student personally, might advise? I don’t know if that’s the sort of algorithm we should develop.”
At this point Ivan got really frustrated with his employees: “This is the proprietary algorithm that’s going to save this company!” he shouted. “Never mind the rights and wrongs of it. Think of the business potential, not to mention all the schools and students this is going to help. The last thing I need is a mutiny from my team. We have the client’s needs to think about, and that’s it.”
Optional STOP for questions and activities:
1. Activity: Compare an approach to this case through the ethics of care versus the ethics of justice. What different factors come into play? How should these be weighed? Might one approach lead to a better course of action than another? Why?
2. Discussion: What technical solutions, if any, could help mitigate Yusuf and Nadja’s concerns? (A brief illustrative sketch follows this list.)
3. Activity: Imagine that Ivan agrees that this is a serious enough concern that the team needs to address it with Nicola. Role play a conversation between Ivan and Nicola.
4. Activity: Undertake a classroom debate on whether or not Alice has the potential to reinforce negative stereotypes. Variations include an alley debate, ‘stand where you stand’, and ‘adopt and support the opposite instinct’.
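For discussion question 2 above, below is a minimal illustrative sketch of two technical mitigations often discussed for risk-flagging systems: excluding protected attributes from the model’s inputs, and auditing flag rates across groups before deployment. The attribute names and data are hypothetical, not Exaba’s actual design, and neither measure resolves the underlying ethical questions on its own.

```python
# Hypothetical sketch: illustrative attribute names and data, not Exaba's actual code.
from typing import Dict, List

PROTECTED_ATTRIBUTES = {"ethnic_background", "family_income", "health_condition"}

def strip_protected(feature_row: Dict[str, object]) -> Dict[str, object]:
    """Mitigation 1: exclude protected attributes from the model's inputs.
    (Proxies such as postcode can still leak the same information.)"""
    return {k: v for k, v in feature_row.items() if k not in PROTECTED_ATTRIBUTES}

def flag_rate_by_group(flags: List[bool], groups: List[str]) -> Dict[str, float]:
    """Mitigation 2: audit the system by comparing how often each group is
    flagged as 'at risk', so disparities are visible before deployment."""
    totals: Dict[str, int] = {}
    flagged: Dict[str, int] = {}
    for is_flagged, group in zip(flags, groups):
        totals[group] = totals.get(group, 0) + 1
        flagged[group] = flagged.get(group, 0) + (1 if is_flagged else 0)
    return {g: flagged[g] / totals[g] for g in totals}

if __name__ == "__main__":
    print(strip_protected({"attendance": 0.72, "ethnic_background": "X"}))
    # {'attendance': 0.72}
    print(flag_rate_by_group([True, False, True, True], ["A", "A", "B", "B"]))
    # {'A': 0.5, 'B': 1.0}
```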
Enhancements:
An enhancement for this case study can be found here.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.