Case enhancement: Developing a school chatbot for student support services

Activity: Stakeholder mapping to elicit value assumptions and motivations.

Author: Karin Rudolph (Collective Intelligence).

 

Overview:

This enhancement is for an activity found in point 5 of the Summary section of the case study.

What is stakeholder mapping?

What is a stakeholder?

Mapping out stakeholders will help you to:

  1. Identify the stakeholders you need to collaborate with to ensure the success of the project.
  2. Understand the different perspectives people have and how their experiences can have an impact on your project or product.
  3. Map out a wide range of people, groups or individuals that can affect and be affected by the project.

 

Stakeholder mapping:

The stakeholder mapping activity is a group exercise that provides students with the opportunity to discuss ethical and societal issues related to the School Chatbot case study. We recommend doing this activity in small groups of 6-8 students per table.

 

Resources:

 

Materials:

To carry out this activity, you will need the following resources:

1. Sticky notes (or digital notes if online).

2. A big piece of paper or digital board (Jamboard, Miro if online) divided into four categories:

3. Markers and pencils.

 

The activity:

 

Board One

List of stakeholders:

Below is a list of the stakeholders involved in the Chatbot project. Put each stakeholder on a sticky note and add them to the stakeholder map, according to their level of influence and interest in the project.

Top tip: use a different colour for each set of stakeholders.

School Chatbot – List of Stakeholders:

 

Placement:

 

Guidance:

Each quadrant represents the following:

Board Two

Motivations, assumptions, ethical and societal risks:

Materials:

1. A big piece of paper or digital board (Jamboard, Miro if online) divided into four categories:

2. Sticky notes (or digital notes if online).

3. Markers and pencils.

The activity:

 

Board Two

The Board Two activity can be done in two different ways:

Option 1:

You can use some guiding questions to direct the discussion. For example:

Option 2:

We have already written some assumptions, motivations and ethical/societal risks. You can add these as notes on a table and ask students to place them according to each category: stakeholders, motivations, assumptions, and ethical and societal risks.

Motivations:

Assumptions:

Potential ethical and societal risks:

Move and match: 

 

 

 

Reflection:

Ask students to choose 2-4 sticky notes and explain why they think these are important ethical/societal risks.

 

Potential future activity:

A more advanced activity could involve a group discussion where students are asked to think about some mitigation strategies to minimise these risks.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Theme: Research, Collaborating with industry for teaching and learning, Knowledge exchange

Author: Prof Balbir Barn (Middlesex University), Prof Tony Clark (Aston University), Vinay Kulkarni (TCS) and Dr Souvik Barat (TCS)

Keywords: Digital Twin, Model Driven Engineering, Inclusive Innovation

Abstract: Researchers at Middlesex University initiated a collaboration in 2011 with Tata Consultancy Services Research in India based on their research on lightweight methods for enterprise modelling. Since 2014, that initial introduction has developed into a sustained and ongoing collaborative research programme in programming languages and environments to support model based decision making in complex and uncertain scenarios. The research programme has supported annual sabbatical visits to the TCS research labs in India; a PhD studentship; and regular workshop/advanced tutorials at international conferences. The continuing programme is an example of industry based research problems driving academic collaboration in an international context that has led to over 30 research outputs, an Impact Case Study submitted to REF2021, a TCS software product and the establishment of the London Digital Twin Research Centre at Middlesex.

 

Introduction

This case study describes the outcomes of an ongoing collaboration between Middlesex University and Tata Consultancy Services Research, India’s premier software research centre. The collaboration, initiated in 2011, was triggered by a research paper published by Clark, Barn and Oussena [3]. The research proposed a precise, lightweight framework for Enterprise Architecture that views an organisation as an engine that executes in terms of hierarchically decomposed communicating components. Following a visit to the TCS Research Labs (TRDDC) in Pune, India, a joint research programme between TCS and Middlesex was established to further the notion of the “Model Driven Organisation”. A key feature of the collaboration was the notion of inclusive innovation, from problem location to shared mutual benefits. The research programme has supported annual sabbatical visits to the TCS research labs in India; a PhD studentship; and regular workshops/advanced tutorials at international conferences. The continuing programme is an example of industry-based research problems driving academic collaboration in an international context that has led to over 30 research outputs, an Impact Case Study submitted to REF2021, a TCS software product and the establishment of the London Digital Twin Research Centre at Middlesex.

Systemising a model for collaboration

In 2011, developing a strong, sustained and inclusive model of collaboration with industry was seen as an important element of reputation-building activities for Middlesex University as it set out to establish an overseas campus in India. The goal was that Middlesex should be seen to deliver impact both through project outcomes and as value to the geographical setting of the collaboration. Thus, in 2011, two senior academics, Prof. Balbir Barn and Prof. Tony Clark, embarked on a visit to India’s leading IT research centres, including the Tata Research and Development Centre (TRDDC), IBM Research, Microsoft Research, Accenture Research, HCL Research, Infosys, Cognizant and others. At these visits, the senior academics were able to showcase Middlesex Computer Science research activities, leading to two memoranda of cooperation with Accenture and TRDDC. Middlesex CS had also decided to establish a strong presence at India’s premier software engineering conference (ISEC) through research papers, tutorials, and the organising of workshops aimed at capacity building of Indian academia (value in the process).

Further meetings in 2012 at ISEC with Vinay Kulkarni, chief scientist at TRDDC, led to the idea of collaboration around the notion of the “Model Driven Organisation”, where an enterprise can be represented symbolically by a model that draws its information/data from a range of software artefacts used by the enterprise in its daily operations. Executives are then able to use this model representation as a decision-making aid.

The collaboration was seen as a shared vision that would be beneficial to both partners (TRDDC and MDX), so at the outset we agreed to make our joint research publicly available, with both partners retaining the option to productise any research outputs. This collaboration can also be seen as a model for inclusive innovation, in that the research roadmap references a problem from the “wild”, key stakeholders are engaged equally from research problem formulation through to research publications, and there are mutual benefits.

The collaboration also developed a way of working that was critical to its subsequent success. TRDDC supported travel and subsistence for Barn and Clark to visit its research labs in Pune on annual two-week “mini-sabbaticals”. These visits, which have run annually since 2012 (pausing only due to COVID-19), are linked to the ISEC conference, where papers, tutorials and workshops have been regularly presented. There has been a strong focus on the development of young academics in India at this conference, further establishing the impact of our inclusive innovation approach by generating value in the setting. While the primary interaction is with the TRDDC Software Engineering Laboratory, seminars and other research exploration opportunities are made possible by meetings with other laboratories (such as Psychology). Some of the annual meetings have been supplemented by further meetings at Middlesex. Each annual visit is an intensive research meeting from which emerges the research plan for the year, alongside a publication and impact plan. Very early on, we recognised the potential for an impact case study for the periodic research evaluation exercise conducted in the UK.

 

Figure 1: Research Roadmap

 

Outcomes

The collaboration has proved to be singularly successful in delivering concrete outcomes. Our regularly updated research roadmap (see Figure 1) has evolved from our initial concept of the Model Driven Organisation, through a practical language (ESL) and execution environment for enterprise simulation, to advances in methodologies for digital twin design.

Along the way, a TCS Research Scientist, Souvik Barat, completed a doctoral study in the design of a modelling language to support enterprise decision making. This language later contributed to Dr Barat’s design of a sociotechnical digital twin of the City of Pune, used to support non-pharmaceutical interventions during the COVID-19 pandemic.

The ESL language (led by Prof. Tony Clark), developed through the collaboration as a TRL-5 prototype, formed the basis of TCS’s TwinX™ software product and is now used by TCS consulting.

The collaborative research programme has generated over 30 research publications at leading computing conferences and in journals. Representative publications are listed [2,4,5,6]. The team has also generated impact and knowledge transfer through the production of advanced tutorials and workshops at conferences. The collaboration has also produced an edited book [7].

Recognising the importance of outcomes to the two respective organisations, the research has contributed to executing the research strategy of TCS Research (see strategy document) and has led directly to an impact case study submitted to REF2021.

Further value derived from our inclusive innovation approach has led to developing research publication preparation skills at TCS, and even wider social impact through the pandemic planning activities in Pune City [1]. See the video: https://www.youtube.com/watch?v=x48G7-bOvPY

In 2019, as our research work had steadily shifted towards digital twin technologies, Middlesex established the London Digital Twin Research Centre (LDTRC). The centre combines software engineering research with cyber-physical systems and telecommunications research, providing a means of showcasing a range of externally funded digital twin research projects. The focus of the centre has been brought to the attention of EPSRC, and it holds regular business-facing workshops.

Lessons learnt

Developing a strategic collaboration requires investment from universities, a spirit that places collaboration rather than competition at its heart, and a willingness from academics to look for long-term benefit. Two senior academics spent three weeks touring Indian IT research labs with no guarantee of success. Hence, alignment with university strategy is critical.

Systemising this model of cooperation should be considered a strategic objective of UK Research and Innovation. A recognition that such success can be found in all our universities is imperative. While the EPSRC and RAE have “visiting academic-industrial collaborator” schemes, these could generate much greater outcomes if they were smaller in scale and genuinely accessible to all academics at all institutions.

References

  1. Barat, S., Parchure, R., Darak, S., Kulkarni, V., Paranjape, A., Gajrani, M. and Yadav, A., 2021. An Agent-Based Digital Twin for Exploring Localized Non-pharmaceutical Interventions to Control COVID-19 Pandemic. Transactions of the Indian National Academy of Engineering, 6(2), pp. 323-353.
  2. Barat, S., Kulkarni, V., Clark, T. and Barn, B., 2019. An Actor Based Simulation Driven Digital Twin for Analyzing Complex Business Systems. In Proceedings of the 2019 Winter Simulation Conference, Maryland, USA. (doi:10.1109/WSC40007.2019.9004694)
  3. Clark, T., Barn, B.S. and Oussena, S., 2011, February. LEAP: a precise lightweight framework for enterprise architecture. In Proceedings of the 4th India Software Engineering Conference (pp. 85-94). ACM. (doi:10.1145/1953355.1953366)
  4. Clark, T., Kulkarni, V., Barn, B., France, R., Frank, U. and Turk, D., 2014, January. Towards the model driven organization. In 2014 47th Hawaii International Conference on System Sciences (pp. 4817-4826). IEEE. (doi:10.1109/HICSS.2014.591)
  5. Clark, T., Kulkarni, V., Barat, S. and Barn, B., 2017, June. ESL: an actor-based platform for developing emergent behaviour organisation simulations. In International Conference on Practical Applications of Agents and Multi-Agent Systems (pp. 311-315). Springer, Cham. (doi:10.1007/978-3-319-59930-4_27)
  6. Kulkarni, V., Barat, S., Clark, T. and Barn, B., 2015, September. Toward overcoming accidental complexity in organisational decision-making. In 2015 ACM/IEEE 18th International Conference on Model Driven Engineering Languages and Systems (MODELS) (pp. 368-377). IEEE. (doi:10.1109/MODELS.2015.7338268)
  7. Kulkarni, V., Reddy, S., Clark, T. and Barn, B.S., eds., 2020. Advanced Digital Architectures for Model-Driven Adaptive Enterprises. Hershey, PA: IGI Global. (doi:10.4018/978-1-7998-0108-5)

 


 

Theme: Research, Knowledge exchange

Authors: Dr Matteo Ceriotti (University of Glasgow), Niven Payne (Fujitsu UK), Giulia Viavattene (University of Glasgow), Ellen Devereux (Fujitsu UK), Dr David Snelling (Fujitsu UK) and Matthew Nuckley (Fujitsu UK)

Keywords: Space, Debris Removal, Sustainability, Optimisation

Abstract: A partnership between the University of Glasgow, Fujitsu UK, Astroscale and Amazon Web Services was established in response to a UK Space Agency call on Active Debris Removal mission design. This is the process of de-orbiting space debris objects from low Earth orbit with a dedicated spacecraft. The consortium brought together different but complementary expertise and tools to develop an algorithm (using machine learning and quantum-inspired computing) to design multiple-debris removal missions, able to select feasible sequences of debris objects among millions of permutations in a fraction of the time of previous methods, and with better performance in terms of time and propellant required.

 

Overview

Space and its services have quietly become part of everyone’s daily life. Things like mapping, geolocation, telecommunication services and weather forecasting all depend on space assets. The continued and increasing exploration and exploitation of space depends heavily on sustainability: defunct satellites and other spacecraft and launcher parts have become part of the space debris population, or “junk”, increasing the threat of collision for current and future missions. There are 34,000 objects larger than 10 cm, and 130 million smaller than 1 cm, including non-operational satellites, upper stage rocket bodies, satellite parts, etc. Most of these objects are in the low Earth orbit region (below 1000 km), which is where most satellites operate.

Designing new satellites for demise prevents the creation of further debris. Active debris removal (ADR) aims to dispose of debris objects that are currently in orbit. ADR requires a “chaser” spacecraft to grapple a “non-cooperative” target and transfer it to an orbit low enough that it will de-orbit and burn up in the atmosphere in a relatively short amount of time.

The idea

Many ADR missions would be required to make a substantial contribution to diminishing the debris population. The business challenge was to investigate how we could make space debris removal missions more commercially viable. This project investigated the feasibility, viability and design of removal and disposal of multiple debris objects using a single chaser spacecraft. The mission scenario involves a spacecraft that transfers to the orbit of one or more objects, captures it (or them), and then transfers to a lower orbit for release and disposal. At low altitude, atmospheric drag will cause the object to fall and burn up in the atmosphere relatively quickly. In the meantime, the chaser spacecraft will transfer to another object (or set of objects) and continue the mission.

 

The problem

With millions of pieces of space junk, there are trillions of permutations of ADR missions between these objects that would need to be investigated to remove even a few of them efficiently. Since orbital transfers have no analytical closed-form solutions, an optimisation strategy must be used to solve trajectory design problems, which is generally computationally demanding.

Our solution

The aim of this project was to make space debris removal missions more commercially viable through a new solution that allows fast mission planning. First, an Artificial Neural Network (ANN) is trained to quickly predict the cost of orbital transfer to, and disposal of, a range of debris objects. Then, this information is used to plan a mission of four captures from candidate debris targets using Fujitsu’s quantum-inspired optimisation technology, the Digital Annealer (DA), by formulating the problem as a quadratic unconstrained binary optimisation. We used Astroscale’s mission planning data and expertise, and ran the algorithms on the Amazon Web Services (AWS) SageMaker platform. For technical details on our approach, the reader is referred to the publications below.
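The target-selection step can be illustrated with a toy example. The sketch below is only a minimal illustration of the QUBO mechanics, not the project's actual formulation: the transfer costs are hypothetical stand-ins for ANN predictions, and a brute-force search stands in for the proprietary Digital Annealer.

```python
import itertools
import numpy as np

# Toy QUBO for debris-target selection: x_i = 1 if object i is captured.
# Transfer costs (stand-ins for ANN predictions) sit on the diagonal; a
# quadratic penalty enforces capturing exactly k objects. All values are
# illustrative, not mission data.
costs = np.array([3.0, 1.0, 4.0, 1.5, 2.0])   # hypothetical cost per object
n, k, P = len(costs), 2, 10.0                 # pick k objects; penalty weight P

# Energy: sum(c_i x_i) + P*(sum(x_i) - k)^2. Using x_i^2 = x_i for binary
# variables, this maps (up to a constant) to x^T Q x with:
Q = np.diag(costs + P * (1 - 2 * k))          # linear terms on the diagonal
Q = Q + P * (np.ones((n, n)) - np.eye(n))     # pairwise penalty off-diagonal

# Brute force over all 2^n bitstrings (an annealer searches this space
# heuristically for much larger n).
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(best)  # → (0, 1, 0, 1, 0): the two cheapest objects
```

In the real problem the off-diagonal terms would also encode sequence-dependent transfer costs between debris objects rather than a uniform penalty; this sketch only shows how a selection constraint and per-object costs fold into a single QUBO matrix.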

Outcomes

In a test scenario, we showed that our solution produced a 25% faster mission using 18% less propellant when compared to an expert’s attempt to plan the mission under the same assumptions; the solution was also found 170,000 times faster than with current methods based on an expert’s work.

Partnership

The project involved the partnership of four institutions, with areas of contributions described in the following diagram:

We believe the key to the success of the partnership was the different but complementary areas of expertise, tools, and contributions each partner brought to the project. It may be easier to rely on an existing network of contacts, often with similar areas of expertise. However, this project shows that the additional effort of creating a new partnership can have great benefits that outweigh the initial difficulties.

Project set up

An initial contact between Fujitsu and UofG defined the original idea of the project, combining the existing expertise on discrete optimisation (Fujitsu) and multi-body space missions (UofG). The team was strengthened by expertise in active space debris removal (Astroscale) and cloud computing (AWS). The project proposal was funded by the United Kingdom Space Agency (UKSA), for a duration of four months, from September 2020 to January 2021.

Due to the ongoing global pandemic, the project was run entirely online, with weekly meetings on Microsoft Teams. Fujitsu, as team lead, was responsible for planning and scheduling of tasks, as well as integration of code and reporting.

Lessons learned and reflections

Reactivity in preparing a project proposal was fundamental: the very first contact between the partners was made at the end of July 2020, the proposal was submitted in mid-August, and the project officially kicked off in September.

Given the short timeframe, it was important to conceive a project proposal that fitted the scope of the funder but also matched the available expertise and personnel. It was also critical to frame the business challenge in the proposal.

From the point of view of the academic team, and again given the short window between notification of successful application and start of the project, these factors were crucial for the success of the project:

A PhD student in the research group was the best candidate for the project: at the cost of taking a leave-of-absence from the PhD studentship, the project constituted a unique experience with industrial collaboration, enriched their CV through a ground-breaking project, added a conference and a journal paper to their track record, and eventually opened new areas of investigation for the rest of the PhD studentship.

Before the pandemic, it would probably have been unthinkable, or at least not very credible, to deliver a project with new partners entirely remotely without any in-person meeting; however, remote working turned out to be an enabler for this project, allowing the team to maximise time on actual development and save on travel costs.

Further information

G. Viavattene, E. Devereux, D. Snelling, N. Payne, S. Wokes, M. Ceriotti, Design of multiple space debris removal missions using machine learning, Acta Astronautica, 193 (2022) 277-286. DOI: 10.1016/j.actaastro.2021.12.051

D. Snelling, E. Devereux, N. Payne, M. Nuckley, G. Viavattene, M. Ceriotti, S. Wokes, G. Di Mauro, H. Brettle, Innovation in planning space debris removal missions using artificial intelligence and quantum-inspired computing, 8th European Conference on Space Debris, ESA/ESOC, Darmstadt, Germany (Virtual Conference), 2021.

 


Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Johnny Rich (Engineering Professors’ Council); Dr Matthew Studley (University of the West of England, Bristol); Dr Nik Whitehead (University of Wales Trinity Saint David); Dr Darian Meacham (Maastricht University); Professor Mike Bramhall (TEDI-London); Isobel Grimley (Engineering Professors’ Council).

Topic: Data security of smart technologies.

Engineering disciplines: Electronics, Data, Mechatronics.

Ethical issues: Autonomy, Dignity, Privacy, Confidentiality.

Professional situations: Communication, Honesty, Transparency, Informed consent.

Educational level: Intermediate.

Educational aim: Practise ethical analysis. Ethical analysis is a process whereby ethical issues are defined and affected parties and consequences are identified so that relevant moral principles can be applied to a situation in order to determine possible courses of action.

 

Learning and teaching notes:

This case involves a software engineer who has discovered a potential data breach in a smart home community. The engineer must decide whether or not to report the breach, and then whether to alert and advise the residents. In doing so, considerations of the relevant legal, ethical, and professional responsibilities need to be weighed. The case also addresses communication in cases of uncertainty as well as macro-ethical concerns related to ubiquitous and interconnected digital technology.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners will have the opportunity to:

Teachers will have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Smart homes have been called “the road to independent living”. They have the potential to increase the autonomy and safety of older people and people with disabilities. In a smart home, the internet of things (IoT) is coupled with advanced sensors, chatbots and digital assistants. This combination enables residents to be connected with both family members and health and local services, so that if there are problems, there can be a quick response.

Ferndale is a community of smart homes. It has been developed at considerable cost and investment as a pilot project to demonstrate the potential for better and more affordable care of older people and people with disabilities. The residents have a range of capabilities and all are over the age of 70. Most live alone in their home. Some residents are supported to live independently through: reminders to take their medication; prompts to complete health and fitness exercises; help completing online shopping orders; and detection of falls and trips throughout the house. The continuous assessment of habits, diet and routines allows the technology to build models that may help to predict future negative health outcomes, such as the onset of dementia or issues related to dietary deficiencies. The functionality of many smart home features depends on a reliable and secure internet connection.

 

Dilemma – Part one:

You are the software engineer responsible for the integrity of Ferndale’s system. During a routine inspection you discover several indicators suggesting a data breach may have occurred via some of the smart appliances, many of which have cameras and are voice-activated. Through the IoT, these appliances are also connected to Amazon Ring home security products – these ultimately link to Amazon, including supplying financial information and details about purchases.

 

Optional STOP for questions and activities: 

1. Activity: Technical analysis – Before the ethical questions can be considered, the students might consider a number of immediate technical questions that will help inform the discussion on ethical issues. A sample data set or similar technical problem could be used for this analysis. For example:

2. Activity: Identify legal and ethical issues. The students should reflect on what might be the immediate ethical concerns of this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

3. Activity: Determine the wider ethical context. Students should consider what wider moral issues are raised by this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

 

Dilemma – Part two:

You send an email to Ferndale’s manager about the potential breach, emphasising that the implications are possibly quite serious. She replies immediately, asking that you do not reveal anything to anyone until you are absolutely certain about what has happened. You email back that it may take some time to determine if the software security has been compromised and if so, what the extent of the breach has been. She replies explaining that she doesn’t want to cause a panic if there is nothing to actually worry about and says “What you don’t know won’t hurt you.” How do you respond?     

 

Optional STOP for questions and activities: 

1. Discussion: Professional values – What guidance is given by codes of ethics such as the Royal Academy of Engineering/Engineering Council’s Statement of Ethical Principles or the Association for Computing Machinery Code of Ethics?

2. Activity: Map possible courses of action. The students should think about the possible actions they might take. They can be prompted to articulate different approaches that could be adopted, such as the following, but also develop their own alternative responses.

3. Activity: Hold a debate on which is the best approach and why. The students should interrogate the pros and cons of each possible course of action including the ethical, technical, and financial implications. They should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options.

4. Activity: Role-play a conversation between the engineer and the manager, or a conversation between the engineer and a resident.

5. Discussion: consider the following questions:

6. Activity: Change perspectives. Imagine that you are the child of one of Ferndale’s residents and that you get word of the potential data security breach. What would you hope the managers and engineers would do?

7. Activity: Write a proposal on how the system might be improved to stop this happening in the future or to mitigate unavoidable risks. To inform the proposal, the students should also explore the guidance of what might be best practice in this area. For example, in this instance, they may decide on a series of steps.

 


Authors: Dr Nicola Whitehead (University of Wales Trinity Saint David); Professor Sarah Hitt (NMITE); Emma Crichton (Engineers Without Borders UK); Dr Sarah Junaid (Aston University); Professor Mike Sutcliffe (TEDI-London), Isobel Grimley (Engineering Professors’ Council).

Topic: Development and use of a facial recognition system. 

Engineering disciplines: Data, Electronics, Computer science, AI.

Ethical issues: Diversity, Bias, Privacy, Transparency.

Professional situations: Rigour, Informed consent, Misuse of data, Conflicts with leadership / management.

Educational level: Advanced. 

Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action. 

 

Learning and teaching notes: 

This case involves an engineer hired to manage the development and installation of a facial recognition project at a building used by university students, businesses and the public. It incorporates a variety of components including law and policy, stakeholder and risk analysis, and both macro- and micro-ethical elements. This example is UK-based: however, the instructor can adapt the content to better fit the laws and regulations surrounding facial recognition technology in other countries, if this would be beneficial.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

This case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two (focusing on the wider ethical context of the case) and Part three (focusing on the potential actions the engineer could take) develop and complicate the concepts presented in Part one to provide for additional learning. The case study allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to: 

 

Learning and teaching resources:

 

Summary: 

Metropolitan Technical University (MTU), based in the UK, has an urban campus and many of its buildings are located in the city centre. A new student housing development in this area will be shared by MTU, a local college, and medical residents doing short rotations at the local hospital. The building has a public café on the ground floor and a couple of classrooms used by the university. 

The housing development sits alongside a common route for parades and protests. In the wake of demonstrations by Extinction Rebellion and Black Lives Matter, students have raised concerns to the property manager about safety. Despite an existing system of CCTV cameras and swipe cards, the university decides to install an enhanced security system, built around facial recognition technology that would enable access to the building and cross-reference with crime databases. To comply with GDPR, building residents will be required to give explicit consent before the system is implemented. Visitors without a student ID (such as café customers) will be buzzed in, but their image will be captured and cross-referenced before entry. A side benefit of the system is that MTU’s department of Artificial Intelligence Research will help with the installation and maintenance, as well as studying how it works, in order to make improvements. 

 

Dilemma – Part one:

You are an engineer who has been hired by MTU to take charge of the facial recognition system installation project, including setting policies and getting the system operational. With your background in AI engineering, you are expected to act as a technical advisor to MTU and liaise with the Facilities, Security and Computing departments to ensure a smooth deployment. This is the first time you have worked on a project that involves image capture. So as part of your preparation for the project, you need to do some preliminary research as to what best practices, guidance, and regulations apply.

 

Optional STOP for questions and activities: 

1. Discussion: What are the legal issues relating to image capture? Images allow for the identification of living persons and are therefore considered personal data under GDPR and the Data Protection Act (2018).

2. Discussion: Sharing data is a legally and ethically complex field. Is it appropriate to share images captured with the police? If not the police, then whose crime database will you use? Is it acceptable to share the data with the Artificial Intelligence Research group? Why, or why not?

3. Discussion: Under GDPR, individuals must normally consent to their personal data being processed. How should consent be handled in this case?

4. Discussion: Does the fact that the building will accommodate students from three different institutions (MTU, the local college, and the hospital) complicate these issues? Are regulations related to students’ captured images different than those related to public image capture?

5. Activity: Undertake a technical activity that relates to how facial recognition systems are engineered.
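As a starting point for the technical activity above, students can explore how modern facial recognition systems compare faces: each face image is reduced to a numerical embedding vector, and two images are judged to match when their embeddings are sufficiently similar. The sketch below is illustrative only — the three-element vectors and the 0.8 threshold are made up for demonstration, not taken from any real system:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(embedding_a, embedding_b, threshold=0.8):
    """Declare a match when similarity exceeds a tuned threshold.

    Where this threshold is set trades false accepts (admitting the
    wrong person) against false rejects (locking residents out) --
    an engineering decision with direct ethical weight.
    """
    return cosine_similarity(embedding_a, embedding_b) >= threshold

# Illustrative embeddings (real systems use vectors of 128+ dimensions)
enrolled = [0.9, 0.1, 0.3]
probe_same = [0.88, 0.12, 0.31]
probe_other = [0.1, 0.9, -0.4]

print(is_match(enrolled, probe_same))   # similar vectors match
print(is_match(enrolled, probe_other))  # dissimilar vectors do not
```

Discussing who chooses the threshold, and how error rates may differ across demographic groups, connects this technical detail back to the case's ethical issues of bias and diversity.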

 

Dilemma – Part two:

The project has kicked off, and one of its deliverables is to establish the policies and safeguards that will govern the system. You convened a meeting of project stakeholders to determine what rules need to be built into the system’s software and presented a list of questions to help you make technical decisions. The questions you asked were:

What you had thought would be a quick meeting to agree basic principles turned out to be very lengthy and complex. You were surprised at the variety of perspectives and how heated the discussions became. The discussions raised some questions in your own mind as to the risks of the facial recognition system.

 

Optional STOP for questions and activities:

The following activities focus on macro-ethics, which seeks to understand the wider ethical contexts of projects like the facial recognition system.

1. Activity: Stakeholder mapping – Who are all the stakeholders and what might their positions and perspectives be? Is there a difference between the priorities of the different stakeholders?

2. Activity: There are many different values competing for priority here. Identify these values, discuss and debate how they should be weighed in the context of the project.

3. Activity: Risks can be understood as objective and / or subjective. Research the difference between these two types of risk, and identify which type(s) of risks exist related to the project.

4. Discussion: Which groups or individuals are potentially harmed by the technology and which potentially benefit? How should we go about setting priorities when there are competing harms and benefits?

5. Discussion: Does the technology used treat everyone from your stakeholders’ list equally? Should the needs of society as a whole outweigh the needs of the individual?

6. Activity: Make and defend an argument as to the appropriateness of installing and using the system.

7. Discussion: What responsibilities do engineers have in developing these technologies?

 

Dilemma – Part three:

A few days later, you were forwarded a screenshot of a social media post that heavily criticised the proposed facial recognition system. It was unclear where the post had originated, but it had clearly been shared and promoted among both students and the public, raising concerns about privacy and transparency. Your boss believes this outcry endangers the project and has requested that you make a public statement on behalf of MTU, reaffirming its commitment to installing the system.

You share the concerns but have been employed to complete the project. You understand that suggesting it should be abandoned would most likely risk your job. What will you tell your boss? How will you prepare your public statement?

 

Optional STOP for questions and activities:

Micro-ethics concerns individuals and their responses to specific situations. The following steps are intended to help students develop their ability to practise moral analysis by considering the problem in a structured way and work towards possible solutions that they can analyse critically.

1. Discussion: What are the problems here?

2. Discussion: What are the possible courses of action you can take as an employee?

Students can be prompted to consider what different approaches they might adopt, such as the following, but can also develop their own possible responses.

3. Discussion: Which is the best approach and why? – Interrogate the pros and cons of each possible course of action including the ethical, practical, cost, local relationship and the reputational damage implications. Students should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options. The students may wish to consider this from other perspectives, such as: 

4. Activity: Public Communication – Students can practise writing a press release, giving an interview, or making a public statement about the case and the decision that they make.

5. Activity: Reflection – Students can reflect on how this case study has enabled them to see the situation from different angles. Has it motivated them to understand the ethical concerns and to come to an acceptable conclusion?

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Professor Thomas Lennerfors (Uppsala University); Claire Donovan (Royal Academy of Engineering); Isobel Grimley (Engineering Professors’ Council).

Topic: Developing customised algorithms for student support.

Engineering disciplines: Computing, AI, Data.

Ethical issues: Bias, Social responsibility, Risk, Privacy.

Professional situations: Informed consent, Public health and safety, Conflicts with leadership / management, Legal implications.

Educational level: Beginner.

Educational aim: Develop ethical sensitivity. Ethical sensitivity is the broad cognisance of ethical issues and the ability to see how these might affect others.

 

Learning and teaching notes:

This case study involves the employees of a small software start-up that is creating a customised student support chatbot for a Sixth Form college. The employees come from different backgrounds and have different perspectives on the motivations behind their work, which leads to some interpersonal conflict. The team must also identify the ethical issues and competing values that arise in the course of developing their algorithm.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts which build in complexity and navigate between personal, professional, and societal contexts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. Pre-reading ‘Ethics of Care and Justice’ is recommended, though not required, for engaging with Part two. The case allows teachers the option to stop at multiple points for questions and / or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Exaba is a small, three-person software startup. Like many small businesses, it has been struggling with finances during the pandemic. The company began selling its services across a variety of industry sectors but is now trying to expand by developing software solutions for the growing education technology sector.

Ivan, Exaba’s founder and CEO, was thrilled to be contracted by a growing local Sixth Form College in North West England, NorthStar Academy, to create a chatbot that will optimise student support services. These services include ensuring student safety and wellbeing, study skills advice, careers guidance, counselling, and identifying the need for, and implementing, extra learning support. It is such a large project that Ivan has been able to bring in Yusuf, a university student on placement from a computer systems programme, to help Nadja, Exaba’s only full-time software engineer. Ivan views the chatbot contract as not only a financial windfall that can help get the company back on track, but as the first project in a new product-development revenue stream.

Nadja and Yusuf have been working closely with NorthStar Academy’s Principal, Nicola, to create ‘Alice’, the custom student-support chatbot, and to ensure that she is designed appropriately and is fit for purpose. Nicola has seen growing evidence that chatbots can identify when students are struggling with a range of issues from attendance to anxiety. She has also seen that they can be useful in helping administrators understand what students need, how to help them more quickly, and where to invest more resources to make support most effective.

 

Optional STOP for questions and activities:

1. Discussion: What moral or ethical issues might be at stake or arise in the course of this project?

2. Discussion: What professional or legal standards might apply to the development of Alice?

3. Discussion: What design choices might Nadja and Yusuf have to consider as they build the chatbot software in order for it to conform to those standards?

4. Discussion: Is there anything risky about giving cognitive chatbots human names in general, or a female name specifically?

5. Activity: Undertake stakeholder mapping to elicit value assumptions and motivations.

6. Activity: Research any codes of ethics that might apply to AI in education, or policies / laws that apply to controlling and processing student data.

7. Activity: View the following TED talk and have a discussion on gender in digital assistants: Siri and Alexa are AI Built for the Past by Emily Liu.

 

Dilemma – Part one:

After undertaking work to ensure GDPR compliance through transparency, consent, and anonymisation of the data harvested by interactions with Alice, Nadja and Yusuf are now working on building the initial data set that the chatbot will call upon to provide student support. The chatbot’s information to students can only be as good as the existing data it has available to draw from. To enable this, Nicola has agreed to provide Exaba with NorthStar Academy’s existing student databases that span many years and cover both past and present students. While this data – including demographics, academic performances, and interactions with support services – is anonymised, Yusuf has begun to feel uncomfortable. One day, when the entire team was together discussing technical challenges, Yusuf said “I wonder what previous students would think if they found out that we were using all this information about them, without their permission?”

Ivan pointed out, “Nicola told us it was okay to use. They’re the data controllers, so it’s their responsibility to resolve that concern, not ours. We can’t tell them what to do with their own data. All we need to be worried about is making sure the data processing is done appropriately.”

Nadja added, “Plus, if we don’t use an existing data set, Alice will have to learn from scratch, meaning she won’t be as effective at the start. Wouldn’t it be better for our chatbot to be as intelligent and helpful as possible right away? Otherwise, she could put existing students at a disadvantage.”

Yusuf fell silent, figuring that he didn’t know as much as Ivan and Nadja. Since he was just on a placement, he felt that it wasn’t his place to push the issue any further with full-time staff.

 

Optional STOP for questions and activities:

1. Discussion: Expand upon Yusuf’s feelings of discomfort. What values or principles is this emotion drawing on?

2. Discussion: Do you agree with Yusuf’s perspective, or with Ivan’s and Nadja’s? Why?

3. Discussion: Does / should Yusuf have the right to voice any concerns or objections to his employer?

4. Discussion: Do / should previous NorthStar students have the right to control what the academy does with their data? To what extent, and for how long?

5. Discussion: Is there / should there be a difference between how data about children and data about adults is used? Why?

6. Discussion: Should a business, like Exaba, ever challenge its client, like NorthStar Academy, about taking potentially unethical actions?

7. Technical activity: Undertake a technical activity such as creating a process flow diagram, pieces of code and UI / UX design that either obscure or reinforce consent.

8. Activity: Undertake argument mapping to diagram and expand on the reasoning and evidence used by Yusuf, Nadja, and Ivan in their arguments.

9. Activity: Apply ethical theories to those arguments.  

10. Discussion: What ethical principles are at stake? Are there potentially any conflicts or contradictions arising from those principles?
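For the technical activity on code that obscures or reinforces consent (item 7 above), a small sketch may help frame the contrast. Here processing is refused unless explicit, recorded consent exists for the specific purpose — GDPR's purpose limitation principle. The class and function names are hypothetical, invented for illustration rather than taken from any real compliance library:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set = field(default_factory=set)  # purposes explicitly agreed to

class ConsentError(Exception):
    pass

def process_interaction(record: dict, consent: ConsentRecord, purpose: str):
    """Reinforces consent: refuses to touch data unless the subject
    explicitly opted in to this specific purpose.  Code that instead
    defaulted to 'allowed' when no record is found would be an example
    of design that obscures consent."""
    if purpose not in consent.purposes:
        raise ConsentError(f"No consent recorded for purpose '{purpose}'")
    # ... actual processing of the interaction would happen here ...
    return {"subject": consent.subject_id, "purpose": purpose, "data": record}

consent = ConsentRecord(subject_id="student-42", purposes={"support-chat"})
ok = process_interaction({"msg": "help"}, consent, "support-chat")
try:
    process_interaction({"msg": "help"}, consent, "model-training")
except ConsentError as e:
    print(e)  # processing for an unconsented purpose is refused
```

Students can compare this "deny by default" design with an "allow by default" variant and discuss which failure mode each one produces.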

 

Dilemma – Part two:

Nicola, too, was under pressure. The academy’s Board had hired her as Principal to improve NorthStar’s rankings in the school performance table, to get the college’s finances back on track, and to support the government’s efforts at ‘levelling up’. This is why one of Nicola’s main specifications for Alice is that she be able to flag students at risk of not completing their qualifications. Exaba will have to develop an algorithm that can determine what those risk factors are.

In a brainstorming session Nadja began listing some ideas on the whiteboard. “Ethnic background, family income, low marks, students who fit that profile from the past and ultimately dropped out, students who engaged with support services a lot, students with health conditions . . .”

“Wait, wait, wait,” Yusuf said. “This feels a little bit like profiling to me. You know, like we think kids from certain neighbourhoods are unlikely to succeed so we’re building this thing to almost reinforce that they don’t.”

“The opposite is true!” Ivan exclaimed. “This algorithm will HELP exactly those students.”

“I can see how that’s the intention,” Yusuf acknowledged. “But I’ve had so many friends and neighbours experience well-intentioned but not appropriate advice from mentors and counsellors who think the only solution is for everyone to complete qualifications and go to university. This is not the best path for everybody!”

Nadja had been listening carefully. “There is something to what Yusuf is saying: Is it right to nudge students to stay in a programme that’s actually not a best fit for them? Could Alice potentially give guidance that is contrary to what a personal tutor, who knows the student personally, might advise? I don’t know if that’s the sort of algorithm we should develop.”

At this point Ivan got really frustrated with his employees: “This is the proprietary algorithm that’s going to save this company!” he shouted. “Never mind the rights and wrongs of it. Think of the business potential, not to mention all the schools and students this is going to help. The last thing I need is a mutiny from my team. We have the client’s needs to think about, and that’s it.”

 

Optional STOP for questions and activities:

1. Activity: Compare an approach to this case through the ethics of care versus the ethics of justice. What different factors come into play? How should these be weighed? Might one approach lead to a better course of action than another? Why?

2. Discussion: What technical solutions, if any, could help mitigate Yusuf and Nadja’s concerns?

3. Activity: Imagine that Ivan agrees that this is a serious enough concern that they need to address it with Nicola. Role play a conversation between Ivan and Nicola.

4. Activity: Undertake a classroom debate on whether or not Alice has the potential to reinforce negative stereotypes. Variations include alley debate, stand where you stand, and adopt and support opposite instinct.
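For the discussion of technical mitigations above, one concrete option students can explore is a disparate-impact check: before deploying a risk-flagging rule, compare the rate at which it flags different groups. The sketch below uses made-up anonymised records, and the 0.8 "four-fifths" threshold mentioned in the comment is a common rule of thumb from US employment-selection practice, not a UK legal standard:

```python
def flag_rate(students, predicate):
    """Fraction of a group flagged by a risk rule."""
    flagged = sum(1 for s in students if predicate(s))
    return flagged / len(students)

def disparate_impact_ratio(group_a, group_b, predicate):
    """Ratio of flag rates between two groups; values far below 1.0
    suggest the rule burdens one group disproportionately."""
    rate_a = flag_rate(group_a, predicate)
    rate_b = flag_rate(group_b, predicate)
    return rate_a / rate_b if rate_b else float("inf")

# Hypothetical anonymised records: (attendance %, prior support contacts)
group_a = [(62, 5), (70, 3), (55, 6), (90, 0)]
group_b = [(88, 1), (92, 0), (85, 1), (70, 2)]

at_risk = lambda s: s[0] < 75  # candidate rule: flag low attendance

ratio = disparate_impact_ratio(group_b, group_a, at_risk)
print(f"flag-rate ratio: {ratio:.2f}")  # well below 0.8 -> review the rule
```

An audit like this does not settle the ethical questions Yusuf raises — a rule can pass the check and still drive inappropriate advice — but it gives the team a measurable trigger for revisiting the algorithm with Nicola.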

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.
