Authors: Ahmet Omurtag (Nottingham Trent University); Andrei Dragomir (National University of Singapore / University of Houston).

Topic: Data security of smart technologies.

Engineering disciplines: Electronics; Data; Biomedical engineering.

Ethical issues: Autonomy; Dignity; Privacy; Confidentiality.

Professional situations: Communication; Honesty; Transparency; Informed consent; Misuse of data.

Educational level: Advanced.

Educational aim: Practising ethical analysis: engaging in a process by which ethical issues are defined and affected parties and consequences are identified, so that relevant moral principles can be applied to a situation in order to determine possible courses of action.

 

Learning and teaching notes:

This case involves Aziza, a biomedical engineer working for Neuraltrix, a hypothetical company that develops brain-computer interfaces (BCIs) for specialised applications. Aziza has always been curious about the brain and enthusiastic about using cutting-edge technologies to help people in their daily lives. Her team has designed a BCI that can measure brain activity non-invasively and, by applying machine learning algorithms, assess the job-related proficiency and expertise level of a person. She is leading the deployment of the new system in hospitals and medical schools, where it will be used to evaluate candidates for consultant positions. As she responds to requests to extend and use the BCI-based system in unforeseen ways, she finds herself compelled to weigh various ethical, legal, and professional responsibilities.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in three parts. If desired, a teacher can use the Summary and Part one in isolation, but Parts two and three develop and complicate the concepts presented in the Summary and Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

Legal regulations:

Professional organisations:

Philanthropic organisations:

Journal articles:

Educational institutions:

 

Summary:

Brain-computer interfaces (BCIs) detect brain activity and utilise advanced signal analysis to identify features in the data that may be relevant to specific applications. These features might provide information about people’s thoughts and intentions or about their psychological traits or potential disorders, and may be interpreted for various purposes such as for medical diagnosis, for providing real-time feedback, or for interacting with external devices such as a computer. Some current non-invasive BCIs employ unobtrusive electroencephalography headsets or even optical (near-infrared) sensors to detect brain function and can be safe and convenient to use.

Evidence shows that the brains of people with specialised expertise have identifiable functional characteristics. Biomedical technology may soon translate this knowledge into BCIs that can be used for objectively assessing professional skills. Researchers already know that neural signals contain features linked to levels of expertise, which may enable the assessment of job applicants or candidates for promotion or certification.

BCI technology could benefit people by improving the match between individuals and their jobs and by enabling better, more nuanced career support. However, a BCI has access to additional information that may be sensitive or even troubling. For example, it could reveal a person’s health status (such as epilepsy or stroke), or it may suggest psychological traits ranging from unconscious racial bias to psychopathy. Someone sensitive about their privacy may be reluctant to consent to wearing a BCI.

In everyday life, we show what is on our minds through language and behaviour, which are normally under our control and which provide a buffer of privacy. BCIs with direct access to the brain and an increasing capability to decode its activity may breach this buffer. Information collected by BCIs could be of interest not only to employers who will decide whether to hire and invest in a new employee, but also to health insurers, advertising agencies, or governments.

 

Optional STOP for questions and activities:

1. Activity: Risks of brain activity decoding – Identify the physical, ethical, and social difficulties that could result from the use of devices that have the ability to directly access the brain and decipher some of its psychological content such as thoughts, beliefs, and emotions.

2. Activity: Regulatory oversight – Investigate which organisations and regulatory bodies currently monitor and are responsible for the safe and ethical use of BCIs.

3. Activity: Technical integration – Investigate how BCIs work to translate brain activity into interpretable data.

 

Dilemma – Part one:

After Neuraltrix deployed its BCI and it had been in use for a year in several hospitals, its lead developer Aziza joined the customer support team. While remaining proud and supportive of the technology, she had misgivings about some of its unexpected ramifications. She received the following requests from people and institutions for system modifications or for data sharing:

1. A hospital asked Neuraltrix for a technical modification that would allow the HR department to send data to their clinical neurophysiologists for “further analysis,” claiming that this might benefit people by potentially revealing a medical abnormality that might otherwise be missed.

2. An Artificial Intelligence research group partnering with Neuraltrix requested access to the data to improve their signal analysis algorithms.

3. A private health insurance company requested that Neuraltrix provide access to the scan of someone who had applied for insurance coverage; they stated that they had a right to examine the scan, just as life insurance companies are allowed to perform health checks on potential customers.

4. An advertising agency asked Neuraltrix for access to its data in order to fine-tune their customer behavioural prediction algorithms.

5. A government agency demanded access to the data to investigate a suspected case of “radicalisation”.

6. A prosecutor asked for access to the scan of a specific person who had recently been the defendant in an assault case; the prosecutor was gathering evidence of potential aggressive tendencies.

7. A defence attorney requested data because they were gathering potentially exonerating evidence, to prove that the defendant’s autonomy had been compromised by their brain states, following a line of argument known as “My brain made me do it.”

 

Optional STOP for questions and activities: 

1. Activity: Identify legal issues – Students could research what laws or regulations apply to each case and consider various ways in which Neuraltrix could lawfully meet some of the above requests while rejecting others, and how their responses should be communicated within the company and to the requestor.

2. Activity: Identify ethical issues – Students could reflect on what might be the immediate ethical concerns related to sharing the data as requested.

3. Activity: Discussion or Reflection – Possible prompts:

 

Dilemma – Part two:

The Neuraltrix BCI has an interface which allows users to provide informed consent before being scanned. The biomedical engineer who developed the system was informed about a customer complaint stating that the user had felt pressured to provide consent because the scan was part of a job interview. The complaint also stated that the user had not been aware of the extent of the information gleaned from their brain, and that they would not have provided consent had they been made aware of it.

 

Optional STOP for questions and activities: 

1. Activity: Technical analysis – Students might try to determine if it is possible to design the BCI consent system and/or consent process to eliminate the difficulties cited in the complaint. Could the device be designed to automatically detect sensitive psychological content or allow the subject to stop the scan or retroactively erase the recording?

2. Activity: Determine the broader societal impact and the wider ethical context – Students should consider what issues are raised by the widespread availability of brain scans. This could be done in small groups or a larger classroom discussion.

Possible prompts:

 

Dilemma – Part three:

Neuraltrix is about to launch an updated version of its BCI, which moves all data processing and storage to the cloud to facilitate interactive and mobile applications. The upgrade has attracted investors and a major deal is about to be signed. The board is requesting fast deployment from the management team, and Aziza faces pressure from her managers to run the final security checks and go live with the cloud version. During these checks, Aziza discovers a critical security flaw that could be exploited once the BCI runs in the cloud, risking breaches of the database and the algorithms. Her managers believe this can be fixed after launch and ask her to begin deployment and address the security issue afterwards.

 

Optional STOP for questions and activities: 

1. Activity: Students should consider whether it is advisable for Aziza to follow the requests from her managers and the Neuraltrix board, or to halt deployment of the new version, which may put the investment deal and possibly the future of the company at risk, and discuss the possible consequences of each course of action.

2. Activity: Apply an analysis based on “Duty-Ethics” and “Rights Ethics.” This could be done in small groups (with groups arguing the management position and the engineer’s position, respectively) or a larger classroom discussion. A tabulation approach with detailed pros and cons is recommended.

3. Activity: Apply a similar analysis as above based on the principles of “Act-Utilitarianism” and “Rule-Utilitarianism.”

Possible prompts:

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

 

Authors: Dr Yujia Zhai (University of Hertfordshire); Associate Professor Scarlett Xiao (University of Hertfordshire). 

Topic: Data security of industrial robots.  

Disciplines: Robotics; Data; Internet of Things. 

Ethical issues: Safety; Health; Privacy; Transparency. 

Professional situations: Rigour; Informed consent; Misuse of data. 

Educational level: Intermediate. 

Educational aim: Gaining ethical knowledge. Knowing the sets of rules, theories, concepts, frameworks, and statements of duty, rights, or obligations that inform ethical attitudes, behaviours, and practices. 

 

Learning and teaching notes: 

This case study involves an engineer hired to develop and install an Industrial Internet of Things (IIoT) online machine monitoring system for a manufacturing company. The developments include designing the infrastructure of hardware and software, writing the operation manuals and setting policies. The project incorporates a variety of ethical components including law and policy, stakeholders, and risk analysis. 

This case study addresses three of the themes from the Accreditation of Higher Education Programmes fourth edition (AHEP4): Design and Innovation (significant technical and intellectual challenges commensurate with the level of study), the Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools, and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37. 

The dilemma in this case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two and Part three develop and complicate the concepts presented in Part one to provide for additional learning. The case study allows teachers the option to stop at multiple points for questions and/or activities as desired. 

Learners have the opportunity to: 

Teachers have the opportunity to:  

 

Learning and teaching resources: 

Professional organisations: 

Legal regulations: 

UN agency: 

Educational resource: 

Government sites: 

Educational institutions: 

 

Summary: 

IIoT is a new technology that can provide accurate condition monitoring and predict component wear rates to optimise machine performance, thereby improving the machining precision of workpieces and reducing production costs. 

Oxconn is a company that produces auto parts. The robotic manipulators and other automation machines on the production line have been developed at considerable cost, and regular production line maintenance is essential to ensure their effective operation. The current maintenance scheme is based on routine check tests, which are neither reliable nor efficient. Therefore, Oxconn has decided to install an IIoT-based machine condition monitoring system. To achieve fast responses to any machine operation issues, the machine condition data collected in real time will be transferred to a cloud server for analysis, decision making, and predictive maintenance in the future. 
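The kind of condition monitoring described above can be sketched in a few lines of Python. This is a minimal illustration only: the vibration readings, threshold, and window size are invented for the example, not values from Oxconn or any real IIoT platform.

```python
from collections import deque

class ConditionMonitor:
    """Flags a machine for maintenance when the rolling mean of a
    sensor reading (e.g. vibration RMS) exceeds a threshold."""

    def __init__(self, threshold: float, window: int = 10):
        self.threshold = threshold
        self.readings = deque(maxlen=window)  # keeps only the last `window` samples

    def update(self, reading: float) -> bool:
        """Add one real-time reading; return True if maintenance is due."""
        self.readings.append(reading)
        mean = sum(self.readings) / len(self.readings)
        return mean > self.threshold

# Illustrative use: steady wear pushes the rolling mean past the threshold.
monitor = ConditionMonitor(threshold=0.8, window=5)
for vibration in [0.5, 0.6, 0.7, 0.9, 1.0, 1.1]:
    due = monitor.update(vibration)
print("maintenance due:", due)
```

A production system would of course stream such readings to the cloud for richer analysis, but the same threshold-on-trend idea underlies simple predictive maintenance.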

 

Dilemma – Part one – Data protection on customers’ machines:

You are a leading engineer who has been hired by Oxconn to take charge of the project on the IIoT-based machine monitoring system, including designing the infrastructure of hardware and software, writing the operation manuals, setting policies, and getting the system up and running. With your background in robotic engineering and automation, you are expected to act as a technical advisor to Oxconn and liaise with the Facilities, Security, Operation, and Maintenance departments to ensure a smooth deployment. This is the first time you have worked on a project that involves real-time data collection, so as part of your preparation you need to do some preliminary research into what best practices, guidance, and regulations apply. 

 

Optional STOP for questions and activities: 

1. Discussion: What are the legal issues relating to machine condition monitoring? Machines’ real-time data allows for the identification of production status in a factory and is therefore considered commercial data under GDPR and the Data Protection Act (2018). Are there rules specifically for IIoT, or are they the same no matter what technology is being used? Should IIoT regulations differ in any way? Why? 

2. Discussion: Sharing data is a legally and ethically complex field. Are there any stakeholders with which the data could be shared? For instance, is it acceptable to share the data with an artificial intelligence research group or with the public? Why, or why not? 

3. Discussion: Under GDPR, individuals must normally consent to their personal data being processed. For machine condition data, how should consent be handled in this case? 

4. Discussion: What ethical codes relate to data security and privacy in an IIoT scenario?  

5. Activity: Undertake a technical activity that relates to how IIoT-based machine monitoring systems are engineered. 

6. Discussion: Based on your understanding of how IIoT-based machine monitoring systems are engineered, consider what additional risks, and what kind of risks (such as financial or operational), Oxconn might incur if depending on an entirely cloud-based system. How might these risks be mitigated from a technical and non-technical perspective? 

 

Dilemma – Part two – Computer networks security issue brought by online monitoring systems:

The project has kicked off, and a senior manager requests that a user interface (UI) be established specifically for the senior management team (SMT). Through this UI, the SMT members can access all the real-time data via their computers or mobiles and obtain the analysis results provided by artificial intelligence technology. You realise this raises the risk of internal operating systems being accessed via the external information interface and networks. As part of your preparation, you need to investigate what platforms can be used and what risk analysis must be undertaken during implementation. 

 

Optional STOP for questions and activities: 

The following activities focus on macro-ethics. They address the wider ethical contexts of projects like the industrial data acquisition system. 

1. Activity: Explore different manufacturers and their approaches to safety for both machines and operators. 

2. Activity: Technical integration – Undertake a technical activity related to automation engineering and information engineering. 

3. Activity: Research what happens with the data collected by IIoT. Who can access this data and how can the data analysis module manipulate the data?  

4. Activity: Develop a risk management register, taking into consideration the findings from Activity 3 as well as the need to put in place data security protocols and relevant training for the SMT. 

5. Discussion/activity: Use information in the Ethical Risk Assessment guide to help students consider how ethical issues are related to the risks they have just identified. 

6. Discussion: In addition to cost-benefit analysis, how can the ethical factors be considered in designing the data analysis module? 

7. Activity: Debate the appropriateness of installing and using the system for the SMT. 

8. Discussion: What responsibilities do engineers have in developing these technologies? 

 

Dilemma – Part three – Security breach and legal responsibility: 

At the beginning of operation, the IIoT system with AI algorithms improved the efficiency of the production lines by automatically updating the parameters in robot operation and product recipes. Recently, however, efficiency degradation was observed, and after investigation there were suspicions that the rules and data in the AI algorithms had been subtly changed. Developers, contractors, operators, technicians, and managers were all brought in to find out what was going on. 

 

Optional STOP for questions and activities: 

1. Discussion: If there has been an illegal hack of the system, what might be the motive of cyber criminals?   

2. Discussion: What are the impacts on company business? How could the impact of cyber-attacks on businesses be minimised?

3. Discussion: How could threats that come from internal employees, vendors, contractors or partners be prevented?

4. Discussion: When a security breach happens, what are the legal responsibilities for developers, contractors, operators, technicians and managers? 

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Case enhancement: Facial recognition for access and monitoring

Activity: Prompts to facilitate discussion activities. 

Author: Sarah Jayne Hitt, Ph.D. SFHEA (NMITE, Edinburgh Napier University).

 

Overview:

There are several points in this case during which an educator can facilitate a class discussion about relevant issues. Below are prompts for discussion questions and activities that can be used. These correspond with the stopping points outlined in the case. Each prompt could take up as little or as much time as the educator wishes, depending on where they want the focus of the discussion to be. The discussion prompts for Dilemma Part three are already well developed in the case study, so this enhancement focuses on expanding the prompts in Parts one and two.

 

Dilemma Part one – Discussion prompts:

1. Legal Issues. Give students ten minutes to individually or in groups do some online research on GDPR and the Data Protection Act (2018). In either small groups or as a large class, discuss the following prompts. You can explain that even if a person is not an expert in the law, it is important to try to understand the legal context. Indeed, an engineer is likely to have to interpret law and policy in their work. These questions invite critical thinking and informed consideration, but they do not necessarily have “right” answers and are suggestions that can help get a conversation started.

a. Are legal policies clear about how images of living persons should be managed when they are collected by technology of this kind?

b. What aspects of these laws might an engineer designing or deploying this system need to be aware of?

c. Do you think these laws are relevant when almost everyone walking around has a digital camera connected to the internet?

d. How could engineers help address legal or policy gaps through design choices?

2. Sharing Data. Before entering into a verbal discussion, either pass out the suggested questions listed in the case study on a worksheet or project them on a screen. Have students spend five or ten minutes jotting down their personal responses. To understand the complexity of the issue, students could even create a quick mind map to show how different entities (police, security company, university, research group, etc.) interact on this issue. After the students spend some time in this personal reflection, educators could ask them to pair/share: turn to the person next to them and share what they wrote down. After about five minutes of this, each pair could amalgamate with another pair, with the educator giving them the prompt to report back to the full class on where they agree or disagree about the issues and why.

3. GDPR Consent. Before discussing this case particularly, ask students to describe a situation in which they had to give GDPR consent. Did they understand what they were doing, what the implications of consent are, and why? How did they feel about the process? Do they think it’s an appropriate system? This could be done as a large group, small group, or through individual reflection. Then turn the attention to this case and describe the change of perspective required here. Now instead of being the person who is asked for consent, you are the person requiring consent. Engineers are not lawyers, but they are often responsible for delivering legally compliant systems. If you were the engineer in charge in this case, what steps might you take to ensure consent is handled appropriately? This question could be answered in small groups, and then each group could report back to the larger class and a discussion could follow the report-backs.

4. Institutional Complexity. The questions listed in the case study relate to the fact that the building in which the facial recognition system will be used accommodates many different stakeholders. To help students with these questions, educators could divide the class into small groups, with each group representing one of the institutions or stakeholder groups (college, hospital, MTU, students, patients, public, etc.). Have each group investigate whether regulations related to captured images are different for their stakeholders, and debate if they should be different. What considerations will the engineer in the case have to account for related to that group? The findings can then be discussed as a large class.

 

Dilemma Part two – Discussion prompts:

The following questions relate to macroethical concerns, which means that the focus is on wider ethical contexts such as fairness, equality, responsibility, and implications.

1. Benefits and Burdens. To prepare to discuss the questions listed in the case study, students could make a chart of potential harms and potential benefits of the facial recognition system. They could do this individually, in pairs or small groups, or as a large class. Educators should encourage them to think deeply and broadly on this topic, and not just focus on the immediate, short-term implications. Once this chart is made, the questions listed in the case study could be discussed as a group, and students asked to weigh up these burdens and benefits. How did they make the choices as to when a burden should outweigh a benefit or vice versa?

2. Equality and Utility. To address the questions listed in the case study, students could do some preliminary individual or small group research on the accuracy of facial recognition systems for various population groups. The questions could then be discussed in pairs, small groups, or as a large class.

3. Engineer Responsibility. Engineers are experts who have much more specific technical knowledge and understanding than the general public. Indeed, the vast majority of people have no idea how a facial recognition system works or what the legal requirements related to it are, even if they are asked to give their consent. Does an engineer therefore have more of a responsibility to make people aware and reassure them? Or is an engineer just fulfilling their duty by doing what their boss says and making the system work? What could be problematic about taking either of those approaches?

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Case Enhancement: Choosing to install a smart meter

Activity: Technical integration – Practical investigation of electrical energy.

Author: Mr Neil Rogers (Independent Scholar).

 

Overview:

This enhancement is for an activity found in the Dilemma Part two, Point 1 section of the case: “Technical integration – Undertake an electrical engineering technical activity related to smart meters and the data that they collect.”

This activity involves practical tasks requiring the learner to measure parameters to enable electrical energy to be calculated in two different scenarios and then relate this to domestic energy consumption. This activity will give technical context to this case study as well as partly address two AHEP themes:

This activity is in three parts. To fully grasp the concept of electrical energy and contextualise what could otherwise remain remote and abstract to the learner, it is expected that all three parts should be completed (even though slight modifications to the equipment list are acceptable).

Learners are required to have basic (level 2) science knowledge as well as familiarity with the multimeters and power supplies used by the institution.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Suggested pre-reading:

To prepare for these practical activities, teachers may want to explain, or assign students to pre-read, articles relating to electrical circuit theory with respect to:

 

Learning and teaching resources:

 

Activity: Practical investigation of electrical energy:

Task A: Comparing the energy consumed by incandescent bulbs with LEDs.

1. Power in a circuit.

By connecting the bulbs and LEDs in turn to the PSU with a meter in series:

a. Compare the wattage of the two devices.

b. On interpretation of their data sheets compare their luminous intensities.

c. Equate the quantity of each device needed to achieve a similar luminous intensity of approximately 600 lumens (a typical household bulb equivalent).

d. Now equate the wattages required to achieve this luminous intensity for the two devices.

 

2. Energy = Power x Time.

The units used by the energy providers are kWh:

a. Assuming the devices are on for 6 hours/day and 365 days/year, calculate the energy consumption in kWh for the two devices.

b. Now calculate the comparative annual cost, assuming 1 kWh = 27p (update this rate as needed).
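As a cross-check on the hand calculation, the kWh and annual-cost arithmetic can be sketched in Python. The 60 W incandescent and 8 W LED figures are illustrative assumptions for roughly 600 lumens, not values taken from any specific data sheet; substitute your own measurements.

```python
# Energy (kWh) = power (kW) x time (h); cost = energy x unit rate.
HOURS_PER_DAY = 6
DAYS_PER_YEAR = 365
UNIT_RATE_GBP = 0.27  # 27p per kWh; update to the current rate

def annual_energy_kwh(power_watts: float) -> float:
    """Annual energy consumption in kWh for a device on 6 hours/day."""
    return power_watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR

def annual_cost_gbp(power_watts: float) -> float:
    """Annual running cost in pounds at the assumed unit rate."""
    return annual_energy_kwh(power_watts) * UNIT_RATE_GBP

# Illustrative wattages for ~600 lumens; check your own data sheets.
for name, watts in [("Incandescent", 60), ("LED", 8)]:
    print(f"{name}: {annual_energy_kwh(watts):.1f} kWh/year, "
          f"£{annual_cost_gbp(watts):.2f}/year")
```

Running this with the assumed figures makes the scale of the difference between the two technologies immediately visible.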

 

3.  Wider implications.

a. Are there any cost-benefit considerations not covered?

b. How might your findings affect consumer behaviour in ways that could either negatively or positively impact sustainability?

c. Are there any ethical factors to be considered when choosing LED lightbulbs? For instance, you might investigate minerals and materials used for manufacturing and processing and how they are extracted, or end-of-life disposal issues, or fairness of costs (both relating to production and use).

 

Task B: Using a plug-in power meter.

1. Connect the power meter to a dishwasher or washing machine, run a short 15 or 30 minute cycle, and record the energy used in kWh.

2. Connect the power meter to a half-filled kettle and turn it on, noting the instantaneous power (in watts) and the time taken. Then calculate the energy used and compare it with the power meter reading.

3. Connect the power meter to a fan heater and measure the instantaneous power. Now calculate the daily energy consumption in kWh for a fan heater that is on for 6 hours/day.

4. An appreciation of the consumption of electrical energy over a 24-hour period (in kWh) is key. What are the dangers of relying on instantaneous power readings from a smart meter?

 

Task C: Calculation of typical domestic electrical energy consumption.

1. Using the list of items in Appendix A, calculate the typical electrical energy usage/day for a typical household.

2. Now compare the electrical energy costs per day and per year for these three suppliers, considering how suppliers source their energy (i.e. renewable vs fossil fuels vs nuclear etc).

 

Supplier   Standing charge / day   Cost per kWh   Cost / day   Cost / year
A          48p                     28p
B          45p                     31p
C          51p                     27p
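The blank columns in the table follow mechanically once a daily consumption figure is known. A minimal sketch, assuming an illustrative household usage of 8 kWh/day (substitute the figure you calculate in point 1 of Task C):

```python
# Daily cost = standing charge + usage x unit rate; annual = daily x 365.
DAILY_USAGE_KWH = 8.0  # assumed household usage; replace with the Task C result

suppliers = {  # supplier: (standing charge in p/day, unit rate in p/kWh)
    "A": (48, 28),
    "B": (45, 31),
    "C": (51, 27),
}

for name, (standing_p, rate_p) in suppliers.items():
    daily_p = standing_p + DAILY_USAGE_KWH * rate_p   # pence per day
    print(f"Supplier {name}: {daily_p:.0f}p/day, "
          f"£{daily_p * 365 / 100:.2f}/year")
```

Note how the ranking of suppliers can change with usage: a higher standing charge matters less to a heavy user, while a higher unit rate matters more.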

 

3. Does it matter that data is collected every 30 minutes by your energy supplier? What implications might changing the collection times have?

4. With reference to Sam growing marijuana in the case, how do you think this will show up in his energy bill?

 

Appendix A: Household electrical devices power consumption:

Typical power consumption of electrical devices on standby (in watts):

Wi-Fi router                   10
TV & set top box               20
Radios & alarms                10
Dishwasher                      5
Washing machine                 5
Cooker & heat-ring controls    10
Gaming devices                 10
Laptops x2                     10

 

Typical consumption of electrical devices when active (in watts), assuming gas central heating:

TV & set top box (assume 5 hours / day)        120
Dishwasher (assume 2 cycles / week)            use calculated value
Washing machine (assume 2 cycles / week)       use calculated value
Cooking (oven, microwave etc., 1 hour / day)   1000
Gaming devices (1 hour / day)                  100
Laptop (1 hour / day)                          70
Kettle (3 times / day)                         use calculated value
Heating water pump (2 hours / day)             150
Electric shower (8 mins / day)                 8000
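Point 1 of Task C can be sketched as a simple sum of standby and active contributions. The dishwasher, washing machine, and kettle energies are left as a placeholder because they come from your own Task B measurements; everything else uses the Appendix A figures.

```python
# Daily energy (kWh) = standby load x 24 h + sum(active power x hours), all / 1000.
standby_watts = 10 + 20 + 10 + 5 + 5 + 10 + 10 + 10  # Appendix A standby loads: 80 W

active = [  # (power in W, hours per day), from Appendix A
    (120, 5),      # TV & set top box
    (1000, 1),     # cooking (oven, microwave etc.)
    (100, 1),      # gaming devices
    (70, 1),       # laptop
    (150, 2),      # heating water pump
    (8000, 8/60),  # electric shower, 8 minutes
]

# Placeholder for energy measured directly in Task B (kWh/day):
# dishwasher + washing machine + kettle. Replace with your measurements.
measured_kwh = 0.0

standby_kwh = standby_watts * 24 / 1000
active_kwh = sum(watts * hours for watts, hours in active) / 1000
total = standby_kwh + active_kwh + measured_kwh
print(f"Typical daily consumption: {total:.1f} kWh")
```

Even before adding the measured appliances, the sketch shows that standby loads alone contribute nearly 2 kWh/day, which is a useful talking point for the smart meter discussion.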

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Johnny Rich (Engineering Professors’ Council); Dr Matthew Studley (University of the West of England, Bristol); Dr Nik Whitehead (University of Wales Trinity Saint David); Dr Darian Meacham (Maastricht University); Professor Mike Bramhall (TEDI-London); Isobel Grimley (Engineering Professors’ Council).

Topic: Data security of smart technologies.

Engineering disciplines: Electronics, Data, Mechatronics.

Ethical issues: Autonomy, Dignity, Privacy, Confidentiality.

Professional situations: Communication, Honesty, Transparency, Informed consent.

Educational level: Intermediate.

Educational aim: Practise ethical analysis. Ethical analysis is a process whereby ethical issues are defined and affected parties and consequences are identified so that relevant moral principles can be applied to a situation in order to determine possible courses of action.

 

Learning and teaching notes:

This case involves a software engineer who has discovered a potential data breach in a smart home community. The engineer must decide whether or not to report the breach, and then whether to alert and advise the residents. In doing so, considerations of the relevant legal, ethical, and professional responsibilities need to be weighed. The case also addresses communication in cases of uncertainty as well as macro-ethical concerns related to ubiquitous and interconnected digital technology.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners will have the opportunity to:

Teachers will have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Smart homes have been called “the road to independent living”. They have the potential to increase the autonomy and safety of older people and people with disabilities. In a smart home, the internet of things (IoT) is coupled with advanced sensors, chatbots and digital assistants. This combination enables residents to be connected with both family members and health and local services, so that if there are problems, there can be a quick response.

Ferndale is a community of smart homes, developed at considerable cost as a pilot project to demonstrate the potential for better and more affordable care of older people and people with disabilities. The residents have a range of capabilities and all are over the age of 70. Most live alone. Some residents are supported to live independently through reminders to take their medication, prompts to complete health and fitness exercises, help completing online shopping orders, and the detection of falls and trips throughout the house. The continuous assessment of habits, diet and routines allows the technology to build models that may help to predict future negative health outcomes, such as the onset of dementia or issues related to dietary deficiencies. The functionality of many smart home features depends on a reliable and secure internet connection.

 

Dilemma – Part one:

You are the software engineer responsible for the integrity of Ferndale’s system. During a routine inspection you discover several indicators suggesting that a data breach may have occurred via some of the smart appliances, many of which have cameras and are voice-activated. Through the IoT, these appliances are also connected to Amazon Ring home security products, which ultimately link back to Amazon itself, carrying financial information and details about purchases.

 

Optional STOP for questions and activities: 

1. Activity: Technical analysis – Before the ethical questions can be considered, the students might consider a number of immediate technical questions that will help inform the discussion on ethical issues. A sample data set or similar technical problem could be used for this analysis. For example:
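For the technical-analysis activity, one possible starting point is a traffic-log review. The sketch below is entirely hypothetical — the device names, hosts, byte counts and approved-host list are invented for the exercise — and flags appliances that contact unapproved hosts or upload unusually large volumes of data, the kind of indicator the engineer in the case might have noticed.

```python
# Hypothetical sketch for the technical-analysis activity: flag smart-home
# devices whose outbound traffic goes to hosts outside an approved list,
# or whose upload volume is unusually large. All values are invented.

APPROVED_HOSTS = {"updates.vendor.example", "ring.example.com"}

traffic_log = [
    {"device": "kettle-07",   "host": "updates.vendor.example", "bytes": 12_000},
    {"device": "doorbell-02", "host": "ring.example.com",       "bytes": 80_000},
    {"device": "fridge-03",   "host": "203.0.113.9",            "bytes": 4_500_000},
]

def suspicious(entry, byte_threshold=1_000_000):
    """Flag traffic to unapproved hosts or unusually large uploads."""
    return entry["host"] not in APPROVED_HOSTS or entry["bytes"] > byte_threshold

flagged = [e["device"] for e in traffic_log if suspicious(e)]
print("Devices to investigate:", flagged)
```

Students could discuss what further evidence would be needed before such an indicator justifies calling the event a "breach" — a point that matters directly in Dilemma Part two.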

2. Activity: Identify legal and ethical issues. The students should reflect on what might be the immediate ethical concerns of this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

3. Activity: Determine the wider ethical context. Students should consider what wider moral issues are raised by this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

 

Dilemma – Part two:

You send an email to Ferndale’s manager about the potential breach, emphasising that the implications are possibly quite serious. She replies immediately, asking that you do not reveal anything to anyone until you are absolutely certain about what has happened. You email back that it may take some time to determine if the software security has been compromised and if so, what the extent of the breach has been. She replies explaining that she doesn’t want to cause a panic if there is nothing to actually worry about and says “What you don’t know won’t hurt you.” How do you respond?     

 

Optional STOP for questions and activities: 

1. Discussion: Professional values – What guidance is given by codes of ethics such as the Royal Academy of Engineering/Engineering Council’s Statement of Ethical Principles or the Association for Computing Machinery Code of Ethics?

2. Activity: Map possible courses of action. The students should think about the possible actions they might take. They can be prompted to articulate different approaches that could be adopted, such as the following, but also develop their own alternative responses.

3. Activity: Hold a debate on which is the best approach and why. The students should interrogate the pros and cons of each possible course of action including the ethical, technical, and financial implications. They should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options.

4. Activity: Role-play a conversation between the engineer and the manager, or a conversation between the engineer and a resident.

5. Discussion: consider the following questions:

6. Activity: Change perspectives. Imagine that you are the child of one of Ferndale’s residents and that you get word of the potential data security breach. What would you hope the managers and engineers would do?

7. Activity: Write a proposal on how the system might be improved to stop this happening in the future or to mitigate unavoidable risks. To inform the proposal, the students should also explore the guidance of what might be best practice in this area. For example, in this instance, they may decide on a series of steps.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Dr Nicola Whitehead (University of Wales Trinity Saint David); Professor Sarah Hitt (NMITE); Emma Crichton (Engineers Without Borders UK); Dr Sarah Junaid (Aston University); Professor Mike Sutcliffe (TEDI-London), Isobel Grimley (Engineering Professors’ Council).

Topic: Development and use of a facial recognition system. 

Engineering disciplines: Data, Electronics, Computer science, AI.

Ethical issues: Diversity, Bias, Privacy, Transparency.

Professional situations: Rigour, Informed consent, Misuse of data, Conflicts with leadership / management.

Educational level: Advanced. 

Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action. 

 

Learning and teaching notes: 

This case involves an engineer hired to manage the development and installation of a facial recognition project at a building used by university students, businesses and the public. It incorporates a variety of components including law and policy, stakeholder and risk analysis, and both macro- and micro-ethical elements. This example is UK-based; however, the instructor can adapt the content to fit the laws and regulations surrounding facial recognition technology in other countries, if this would be beneficial.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this study to AHEP outcomes specific to a programme under these themes, access AHEP4 here and navigate to pages 30-31 and 35-37.

This case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two (focusing on the wider ethical context of the case) and Part three (focusing on the potential actions the engineer could take) develop and complicate the concepts presented in Part one to provide for additional learning. The case study allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to: 

 

Learning and teaching resources:

 

Summary: 

Metropolitan Technical University (MTU), based in the UK, has an urban campus and many of its buildings are located in the city centre. A new student housing development in this area will be shared by MTU, a local college, and medical residents doing short rotations at the local hospital. The building has a public café on the ground floor and a couple of classrooms used by the university. 

The housing development sits alongside a common route for parades and protests. In the wake of demonstrations by Extinction Rebellion and Black Lives Matter, students have raised concerns to the property manager about safety. Despite an existing system of CCTV cameras and swipe cards, the university decides to install an enhanced security system built around facial recognition technology, which would control access to the building and cross-reference captured images with crime databases. To comply with GDPR, building residents will be required to give explicit consent before the system is implemented. Visitors without a student ID (such as café customers) will be buzzed in, but their image will be captured and cross-referenced before entry. A side benefit is that MTU’s department of Artificial Intelligence Research will help with installation and maintenance, and will study how the system works in order to make improvements.

 

Dilemma – Part one:

You are an engineer who has been hired by MTU to take charge of the facial recognition system installation project, including setting policies and getting the system operational. With your background in AI engineering, you are expected to act as a technical advisor to MTU and liaise with the Facilities, Security and Computing departments to ensure a smooth deployment. This is the first time you have worked on a project that involves image capture. So as part of your preparation for the project, you need to do some preliminary research as to what best practices, guidance, and regulations apply.

 

Optional STOP for questions and activities: 

1. Discussion: What are the legal issues relating to image capture? Images allow for the identification of living persons and are therefore considered personal data under GDPR and the Data Protection Act (2018).

2. Discussion: Sharing data is a legally and ethically complex field. Is it appropriate to share images captured with the police? If not the police, then whose crime database will you use? Is it acceptable to share the data with the Artificial Intelligence Research group? Why, or why not?

3. Discussion: Under GDPR, individuals must normally consent to their personal data being processed. How should consent be handled in this case?

4. Discussion: Does the fact that the building will accommodate students from three different institutions (MTU, the local college, and the hospital) complicate these issues? Are regulations related to students’ captured images different than those related to public image capture?

5. Activity: Undertake a technical activity that relates to how facial recognition systems are engineered.
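As one option for this activity: many facial recognition systems reduce each face image to a fixed-length embedding vector and declare a match when the cosine similarity between a probe embedding and an enrolled one exceeds a threshold. The toy sketch below (the vectors and threshold are invented for illustration) shows that matching step.

```python
# Toy illustration of the matching step in a facial recognition system:
# faces are reduced to fixed-length embedding vectors and compared by
# cosine similarity against a threshold. Vectors here are invented.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_match(probe, enrolled, threshold=0.8):
    """True if the probe embedding is close enough to an enrolled one."""
    return cosine_similarity(probe, enrolled) >= threshold

enrolled_resident = [0.9, 0.1, 0.4]
visitor = [0.1, 0.9, 0.2]

print(is_match(enrolled_resident, enrolled_resident))  # same embedding
print(is_match(visitor, enrolled_resident))
```

The threshold is an engineering choice with ethical weight: lowering it reduces missed matches but increases false matches, and published evaluations have found that error rates can differ across demographic groups — a direct link to the bias discussions later in this case.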

 

Dilemma – Part two:

The project has kicked off, and one of its deliverables is to establish the policies and safeguards that will govern the system. You convened a meeting of project stakeholders to determine what rules need to be built into the system’s software and presented a list of questions to help you make technical decisions. The questions you asked were:

What you had thought would be a quick meeting to agree basic principles turned out to be very lengthy and complex. You were surprised at the variety of perspectives and how heated the discussions became. The discussions raised some questions in your own mind as to the risks of the facial recognition system.

 

Optional STOP for questions and activities:

The following activities focus on macro-ethics. This seeks to understand the wider ethical contexts of projects like the facial recognition system.

1. Activity: Stakeholder mapping – Who are all the stakeholders and what might their positions and perspectives be? Is there a difference between the priorities of the different stakeholders?

2. Activity: There are many different values competing for priority here. Identify these values, discuss and debate how they should be weighed in the context of the project.

3. Activity: Risks can be understood as objective and / or subjective. Research the difference between these two types of risk, and identify which type(s) of risks exist related to the project.

4. Discussion: Which groups or individuals are potentially harmed by the technology and which potentially benefit? How should we go about setting priorities when there are competing harms and benefits?

5. Discussion: Does the technology used treat everyone from your stakeholders’ list equally? Should the needs of society as a whole outweigh the needs of the individual?

6. Activity: Make and defend an argument as to the appropriateness of installing and using the system.

7. Discussion: What responsibilities do engineers have in developing these technologies?

 

Dilemma – Part three:

A few days later, you were forwarded a screenshot of a social media post that heavily criticised the proposed facial recognition system. It was unclear where the post had originated, but it had clearly been shared and promoted among both students and the public, raising concerns about privacy and transparency. Your boss believes this outcry endangers the project and has requested that you make a public statement on behalf of MTU, reaffirming its commitment to installing the system.

You share the concerns, but have been employed to complete the project. You understand that suggesting it should be abandoned would most likely risk your job. What will you tell your boss? How will you prepare your public statement?

 

Optional STOP for questions and activities:

Micro-ethics concerns individuals and their responses to specific situations. The following steps are intended to help students develop their ability to practise moral analysis by considering the problem in a structured way and work towards possible solutions that they can analyse critically.

1. Discussion: What are the problems here?

2. Discussion: What are the possible courses of action you can take as an employee?

Students can be prompted to consider what different approaches they might adopt, such as the following, but can also develop their own possible responses.

3. Discussion: Which is the best approach and why? – Interrogate the pros and cons of each possible course of action including the ethical, practical, cost, local relationship and the reputational damage implications. Students should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options. The students may wish to consider this from other perspectives, such as: 

4. Activity: Public Communication – Students can practise writing a press release, giving an interview, or making a public statement about the case and the decision that they make.

5. Activity: Reflection – Students can reflect on how this case study has enabled them to see the situation from different angles. Has it motivated them to understand the ethical concerns and to come to an acceptable conclusion?

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Professor Thomas Lennerfors (Uppsala University); Claire Donovan (Royal Academy of Engineering); Isobel Grimley (Engineering Professors’ Council).

Topic:  Developing customised algorithms for student support.

Engineering disciplines: Computing, AI, Data.

Ethical issues: Bias, Social responsibility, Risk, Privacy.

Professional situations: Informed consent, Public health and safety, Conflicts with leadership / management, Legal implications.

Educational level: Beginner.

Educational aim: Develop ethical sensitivity. Ethical sensitivity is the broad cognisance of ethical issues and the ability to see how these might affect others.

 

Learning and teaching notes:

This case study involves the employees of a small software start-up that is creating a customised student support chatbot for a Sixth Form college. The employees come from different backgrounds and have different perspectives on the motivations behind their work, which leads to some interpersonal conflict. The team must also identify the ethical issues and competing values that arise in the course of developing their algorithm.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts which build in complexity and navigate between personal, professional, and societal contexts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. Pre-reading ‘Ethics of Care and Justice’ is recommended, though not required, for engaging with Part two. The case allows teachers the option to stop at multiple points for questions and / or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Exaba is a small, three-person software startup. Like many small businesses, it has struggled with finances during the pandemic. The company began by selling its services across a variety of industry sectors but is now trying to expand by developing software solutions for the growing education technology sector.

Ivan, Exaba’s founder and CEO, was thrilled to be contracted by NorthStar Academy, a growing Sixth Form college in North West England, to create a chatbot that will optimise student support services. These services include ensuring student safety and wellbeing, study skills advice, careers guidance, counselling, and identifying the need for, and implementing, extra learning support. It is such a large project that Ivan has been able to bring in Yusuf, a university student on placement from a computer systems programme, to help Nadja, Exaba’s only full-time software engineer. Ivan views the chatbot contract not only as a financial windfall that can help get the company back on track, but as the first project in a new product-development revenue stream.

Nadja and Yusuf have been working closely with NorthStar Academy’s Principal, Nicola, to create ‘Alice’, the custom student-support chatbot, and to ensure that she is designed appropriately and is fit for purpose. Nicola has seen growing evidence that chatbots can identify when students are struggling with a range of issues, from attendance to anxiety. She has also seen that they can help administrators understand what students need, how to help them more quickly, and where to invest resources to make support most effective.

 

Optional STOP for questions and activities:

1. Discussion: What moral or ethical issues might be at stake or arise in the course of this project?

2. Discussion: What professional or legal standards might apply to the development of Alice?

3. Discussion: What design choices might Nadja and Yusuf have to consider as they build the chatbot software in order for it to conform to those standards?

4. Discussion: Is there anything risky about giving cognitive chatbots human names in general, or a female name specifically?

5. Activity: Undertake stakeholder mapping to elicit value assumptions and motivations.

6. Activity: Research any codes of ethics that might apply to AI in education, or policies / laws that apply to controlling and processing student data.

7. Activity: View the following TED talk and have a discussion on gender in digital assistants: Siri and Alexa are AI Built for the Past by Emily Liu.

 

Dilemma – Part one:

After undertaking work to ensure GDPR compliance through transparency, consent, and anonymisation of the data harvested by interactions with Alice, Nadja and Yusuf are now working on building the initial data set that the chatbot will call upon to provide student support. The chatbot’s information to students can only be as good as the existing data it has available to draw from. To enable this, Nicola has agreed to provide Exaba with NorthStar Academy’s existing student databases that span many years and cover both past and present students. While this data – including demographics, academic performances, and interactions with support services – is anonymised, Yusuf has begun to feel uncomfortable. One day, when the entire team was together discussing technical challenges, Yusuf said “I wonder what previous students would think if they found out that we were using all this information about them, without their permission?”

Ivan pointed out, “Nicola told us it was okay to use. They’re the data controllers, so it’s their responsibility to resolve that concern, not ours. We can’t tell them what to do with their own data. All we need to be worried about is making sure the data processing is done appropriately.”

Nadja added, “Plus, if we don’t use an existing data set, Alice will have to learn from scratch, meaning she won’t be as effective at the start. Wouldn’t it be better for our chatbot to be as intelligent and helpful as possible right away? Otherwise, she could put existing students at a disadvantage.”

Yusuf fell silent, figuring that he didn’t know as much as Ivan and Nadja. Since he was just on a placement, he felt that it wasn’t his place to push the issue any further with full-time staff.

 

Optional STOP for questions and activities:

1. Discussion: Expand upon Yusuf’s feelings of discomfort. What values or principles is this emotion drawing on?

2. Discussion: Do you agree with Yusuf’s perspective, or with Ivan’s and Nadja’s? Why?

3. Discussion: Does / should Yusuf have the right to voice any concerns or objections to his employer?

4. Discussion: Do / should previous NorthStar students have the right to control what the academy does with their data? To what extent, and for how long?

5. Discussion: Is there / should there be a difference between how data about children is used and that of adults? Why?

6. Discussion: Should a business, like Exaba, ever challenge its client, like NorthStar Academy, about taking potentially unethical actions?

7. Technical activity: Undertake a technical activity such as creating a process flow diagram, pieces of code and UI / UX design that either obscure or reinforce consent.
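For the technical activity above, here is one minimal, hypothetical illustration of how code-level defaults can obscure or reinforce consent (the field names are invented): a pre-ticked sharing box treats silence as consent, while an unticked, opt-in default requires the clear affirmative act that GDPR expects.

```python
# Hypothetical sketch for the technical activity: two consent-form defaults.
# A pre-ticked box (opt-out) obscures consent; an unticked box requiring an
# explicit action (opt-in) reinforces it. Field names are invented.

def build_consent_form(opt_in: bool):
    """Return form defaults; GDPR consent must be a clear affirmative act."""
    return {
        "share_data_with_controller": not opt_in,  # pre-ticked if opt-out
        "requires_explicit_action": opt_in,
    }

dark_pattern = build_consent_form(opt_in=False)   # obscures consent
good_practice = build_consent_form(opt_in=True)   # reinforces consent
```

Students could extend the sketch with UI mock-ups, or with logging that records when and how consent was given — evidence a data controller would need to demonstrate compliance.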

8. Activity: Undertake argument mapping to diagram and expand on the reasoning and evidence used by Yusuf, Nadja, and Ivan in their arguments.

9. Activity: Apply ethical theories to those arguments.  

10. Discussion: What ethical principles are at stake? Are there potentially any conflicts or contradictions arising from those principles?

 

Dilemma – Part two:

Nicola, too, was under pressure. The academy’s Board had hired her as Principal to improve NorthStar’s rankings in the school performance table, to get the college’s finances back on track, and to support the government’s efforts at ‘levelling up’. This is why one of Nicola’s main specifications for Alice is that she be able to flag students at risk of not completing their qualifications. Exaba will have to develop an algorithm that can determine what those risk factors are.

In a brainstorming session Nadja began listing some ideas on the whiteboard. “Ethnic background, family income, low marks, students who fit that profile from the past and ultimately dropped out, students who engaged with support services a lot, students with health conditions . . .”

“Wait, wait, wait,” Yusuf said. “This feels a little bit like profiling to me. You know, like we think kids from certain neighbourhoods are unlikely to succeed so we’re building this thing to almost reinforce that they don’t.”

“The opposite is true!” Ivan exclaimed. “This algorithm will HELP exactly those students.”

“I can see how that’s the intention,” Yusuf acknowledged. “But I’ve had so many friends and neighbours experience well-intentioned but not appropriate advice from mentors and counsellors who think the only solution is for everyone to complete qualifications and go to university. This is not the best path for everybody!”

Nadja had been listening carefully. “There is something to what Yusuf is saying: Is it right to nudge students to stay in a programme that’s actually not a best fit for them? Could Alice potentially give guidance that is contrary to what a personal tutor, who knows the student personally, might advise? I don’t know if that’s the sort of algorithm we should develop.”

At this point Ivan got really frustrated with his employees: “This is the proprietary algorithm that’s going to save this company!” he shouted. “Never mind the rights and wrongs of it. Think of the business potential, not to mention all the schools and students this is going to help. The last thing I need is a mutiny from my team. We have the client’s needs to think about, and that’s it.”

 

Optional STOP for questions and activities:

1. Activity: Compare an approach to this case through the ethics of care versus the ethics of justice. What different factors come into play? How should these be weighed? Might one approach lead to a better course of action than another? Why?

2. Discussion: What technical solutions, if any, could help mitigate Yusuf and Nadja’s concerns?

3. Activity: Imagine that Ivan agrees that this is a serious enough concern that they need to address it with Nicola. Role play a conversation between Ivan and Nicola.

4. Activity: Undertake a classroom debate on whether or not Alice has the potential to reinforce negative stereotypes. Variations include alley debate, stand where you stand, adopt and support opposite instinct.
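Relating to the discussion of technical mitigations: a common first instinct is to drop protected attributes from the model's inputs, but correlated proxies (such as postcode) can reintroduce the same bias. The hypothetical sketch below (field names and records are invented) drops protected fields and flags possible proxies rather than silently using them.

```python
# Hypothetical sketch related to the mitigation discussion: remove protected
# attributes from a risk model's inputs, while flagging correlated proxies
# (e.g. postcode) that could reintroduce the same bias. Values are invented.

PROTECTED = {"ethnicity", "family_income"}
POSSIBLE_PROXIES = {"postcode"}

student = {
    "ethnicity": "X", "family_income": 18_000,
    "postcode": "M1", "attendance": 0.72, "avg_mark": 54,
}

def model_features(record):
    """Drop protected attributes; flag remaining fields that may act as proxies."""
    features = {k: v for k, v in record.items() if k not in PROTECTED}
    proxies = sorted(set(features) & POSSIBLE_PROXIES)
    return features, proxies

features, proxy_warnings = model_features(student)
print(features, "proxy warning:", proxy_warnings)
```

This can seed a discussion of why removing inputs is not the same as removing bias, and of what governance (auditing outcomes across groups, human review of flags) Exaba might propose to Nicola.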

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Mike Sutcliffe (TEDI-London); Professor Mike Bramhall (TEDI-London); Prof Sarah Hitt SFHEA (NMITE); Johnny Rich (Engineering Professors’ Council); Professor Dawn Bonfield MBE (Aston University); Professor Chike Oduoza (University of Wolverhampton); Steven Kerry (Rolls-Royce); Isobel Grimley (Engineering Professors’ Council).

Topic: Smart meters for responsible everyday energy use.

Engineering disciplines: Electrical engineering

Ethical issues: Integrity, Transparency, Social responsibility, Respect for the environment, Respect for the law

Professional situations: Communication, Privacy, Sustainability

Educational level: Beginner

Educational aim: To encourage ethical motivation. Ethical motivation occurs when a person is moved by a moral judgement, or when a moral judgement is a spur to a course of action. 

 

Learning and teaching notes:

This case is an example of ‘everyday ethics’. A professional engineer must give advice to a friend about whether or not they should install a smart meter. It addresses issues of ethical and environmental responsibility as well as public policy, financial burdens and data privacy. The case helps to uncover values that underlie assumptions that people hold about the environment and its connection to human life and services. It also highlights the way that those values inform everyday decision-making.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in three parts that build in complexity. If desired, a teacher can use Part one in isolation, but Parts two and three develop and complicate the concepts presented in Part one in order to provide additional learning. The case allows teachers the opportunity to stop at various points to pose questions and/or set activities.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

 

Summary – Part one:

Sam and Alex have been friends since childhood. As they have grown older, they have discovered that they hold very different political and social beliefs, but they never let these differences of opinion get in the way of a long and important friendship. In fact, they often test their own ideas against each other in bantering sessions, knowing that they are built on a foundation of respect.

Sam works as an accountant and Alex has become an environmental engineer. Perhaps naturally, Alex often asks Sam for financial advice, while Sam depends on Alex for expert information related to sustainability and the environment. One day, knowing that Alex is knowledgeable about the renewable energy industry and very conscious of the impact of energy use at home, Sam messages Alex to say he is getting pressure from his energy company to install a smart meter.

Sam has been told that smart metering is free, brings immediate benefits to customers by helping them take control of their energy usage, and is a key enabler of the transition away from fossil fuel use towards the delivery of net zero emissions by 2050. Smart meters give consumers near real-time information on energy use and the associated cost, enabling them to better manage their energy use, save money and reduce emissions. A further benefit is that Sam could charge an electric car far more cheaply using a smart meter on an overnight tariff.

Yet Sam has also read that smart meters ‘go dumb’ if customers switch providers, and that, as a pre-payment customer, this option may not be available to them with a smart meter. In addition, Sam suspects that, despite claims that the smart meter rollout is free, the cost is simply being passed on to customers through their energy bills instead. Alex tries to give Sam as much good information as possible, but the conversation ends with the decision unresolved.

 

Optional STOP for questions and activities: 

1. Discussion and activity: Personal values – We know that Sam and Alex have different ideas and opinions about many things. This probably stems from a difference in how they prioritise values. For instance, valuing transparency over efficiency, or sustainability over convenience. Using this values activity as a prompt, what personal values might be competing in this particular case?

2. Discussion and activity: Everyday ethics – Consider what values are involved in your everyday choices, decisions, and actions. Write a reflective essay on three events in the past week that, upon further analysis, have ethical components.

3. Discussion: Professional values – Does Alex, as an environmental engineer, have a responsibility to advocate installing smart meters? If so, does he have more responsibility than a non-engineer to advocate for this action? Why, or why not?

4. Discussion: Wider impact – Are there broader ethical issues at stake here?

5. Activity: Role-play a conversation between Sam and Alex that includes what advice should be given and what the response might be.

 

Dilemma – Part two:

After getting more technical information from Alex, Sam realises that, with a smart meter, data on the household’s energy usage would be collected every 30 minutes. This is something they had not anticipated, and they ask a number of questions about its implications. Furthermore, while Sam has so far compared providers mainly on tariffs and costs, Alex points out that different providers use different energy sources, such as wind, gas, nuclear, coal, and solar. Sam is on a tight budget, but Alex explains that the cheapest option is not necessarily the most environmentally responsible choice. Sam is frustrated: now there is something else to consider besides whether or not to install the smart meter.

 

Optional STOP for questions and activities:  

1. Activity: Technical integration – Undertake an electrical engineering technical activity related to smart meters and the data that they collect.

2. Activity: Research what happens with the data collected by a smart meter. Who can access this data and how is privacy protected? How does this data inform progress towards the energy transition from fossil fuels?

3. Activity: Research different energy companies and their approach to responsible energy sourcing and use. How do these companies communicate that approach to the public? Which company would you recommend to your friend and why?

4. Activity: Cost-benefit analysis – Sometimes the ethical choice is the more expensive choice. How do you balance short- and long-term benefits in this case? When, if ever, would it be ethically right to choose energy from non-renewable sources? How would this choice differ if the context being considered was different? For example, students could think about responsible energy use in industrialised economies versus the developing world and energy justice.
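A starting point for the technical activity above is to simulate the kind of data a smart meter records: 48 half-hourly readings per day, which a time-of-use tariff turns into a bill. The readings and two-rate tariff below are invented illustrative values, not real consumption data or supplier rates.

```python
# Sketch of a day of half-hourly smart meter readings (48 slots) and
# how a hypothetical two-rate time-of-use tariff prices them.

def slot_rate(slot: int) -> float:
    """Hypothetical tariff: slots 0-13 (00:00-07:00) are off-peak at
    £0.08/kWh; all other slots are peak at £0.30/kWh."""
    return 0.08 if slot < 14 else 0.30

# 48 half-hourly consumption readings in kWh (toy profile: low
# overnight baseload, moderate daytime use, higher evening use).
readings = [0.1] * 14 + [0.4] * 24 + [0.8] * 10

daily_kwh = sum(readings)
daily_cost = sum(kwh * slot_rate(slot) for slot, kwh in enumerate(readings))
print(f"Daily usage: {daily_kwh:.1f} kWh, cost £{daily_cost:.2f}")
```

The same half-hourly granularity that enables accurate time-of-use billing is also what makes the privacy questions in Activity 2 real: a day of readings reveals when a household wakes, cooks, and goes out.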

 

Dilemma – Part three:

Following this exchange with Sam, Alex becomes aware that one of the main obstacles in the energy transition is communication with the public. Ideally, Alex wants to persuade family and other friends to make more responsible choices; however, it is clear that there are many more factors involved than can be seen at first glance, including the kinds of pressure put on consumers by companies and the government. Alex begins to reflect on how policy shapes what engineers think and do, and joins a new government network on Engineering in Policy.

Alex and Sam meet up a little while later, and Sam announces that yes, a smart meter has been installed. At first Alex is relieved, but then Sam lets it slip that they are planning to grow marijuana in their London home. Sam asks whether this spike in energy use will be picked up as abnormal by a smart meter and whether this would lead to them being found out.
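Sam’s question about whether a spike would look “abnormal” can be illustrated (though not answered) with a toy baseline comparison. Real supplier analytics are far more sophisticated, and every number below is invented; the sketch only shows why a sustained jump in consumption is easy to flag against a household’s own history.

```python
# Toy anomaly check: does a new daily reading stand out against a
# household's historical baseline? All figures are hypothetical.

import statistics

# Invented daily usage history in kWh: a stable baseline week.
history = [9.5, 10.2, 9.8, 10.0, 9.7, 10.1, 9.9]
new_reading = 25.0  # e.g. high-power lamps running around the clock

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (new_reading - mean) / stdev  # standard deviations from baseline

verdict = "anomalous" if z > 3 else "normal"
print(f"Baseline {mean:.1f} ± {stdev:.2f} kWh/day; new day {new_reading} kWh")
print(f"z-score {z:.1f} -> {verdict}")
```

Whether anyone is entitled to run such a check on household data, and what they may do with the result, is exactly the confidentiality and misuse-of-data question the activities below explore.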

 

Optional STOP for questions and activities:  

1. Discussion: Personal values – What are the ethics involved in trying to persuade others to make similar choices to you?

2. Discussion and activity: Legal responsibility – What should Alex say or do about Sam’s disclosure? Role-play a conversation between Sam and Alex.

3. Discussion: Professional responsibility – What role should engineers play in setting and developing public policy on energy?

4. Activity: Energy footprint – Research which industries use the most energy and, on a smaller scale, which home appliances use the most energy.

 

Enhancements:

An enhancement for this case study can be found here.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.
