Authors: Ahmet Omurtag (Nottingham Trent University); Andrei Dragomir (National University of Singapore / University of Houston).

Topic: Data security of smart technologies.

Engineering disciplines: Electronics; Data; Biomedical engineering.

Ethical issues: Autonomy; Dignity; Privacy; Confidentiality.

Professional situations: Communication; Honesty; Transparency; Informed consent; Misuse of data.

Educational level: Advanced.

Educational aim: Practising ethical analysis: engaging in a process by which ethical issues are defined and affected parties and consequences are identified, so that relevant moral principles can be applied to a situation in order to determine possible courses of action.

 

Learning and teaching notes:

This case involves Aziza, a biomedical engineer working for Neuraltrix, a hypothetical company that develops brain-computer interfaces (BCIs) for specialised applications. Aziza has always been curious about the brain and enthusiastic about using cutting-edge technologies to help people in their daily lives. Her team has designed a BCI that can measure brain activity non-invasively and, by applying machine learning algorithms, assess the job-related proficiency and expertise level of a person. She is leading the deployment of the new system in hospitals and medical schools, where it will be used to evaluate candidates being considered for consultant positions. As she does so, and as she responds to requests to extend and use the BCI-based system in unforeseen ways, she finds herself compelled to weigh various ethical, legal, and professional responsibilities.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in three parts. If desired, a teacher can use the Summary and Part one in isolation, but Parts two and three develop and complicate the concepts presented in the Summary and Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners have the opportunity to:

Teachers have the opportunity to:

 

Learning and teaching resources:

Legal regulations:

Professional organisations:

Philanthropic organisations:

Journal articles:

Educational institutions:

 

Summary:

Brain-computer interfaces (BCIs) detect brain activity and utilise advanced signal analysis to identify features in the data that may be relevant to specific applications. These features might provide information about people’s thoughts and intentions or about their psychological traits or potential disorders, and may be interpreted for various purposes, such as medical diagnosis, real-time feedback, or interaction with external devices such as a computer. Some current non-invasive BCIs employ unobtrusive electroencephalography headsets or even optical (near-infrared) sensors to detect brain function and can be safe and convenient to use.

Evidence shows that the brains of people with specialised expertise have identifiable functional characteristics. Biomedical technology may soon translate this knowledge into BCIs that can be used to assess professional skills objectively. Researchers already know that neural signals contain features linked to levels of expertise, which may enable the assessment of job applicants or of candidates for promotion or certification.
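The case study does not describe Neuraltrix’s algorithms, but a minimal sketch can make the general idea concrete. The Python example below, using NumPy and scikit-learn on synthetic data, extracts simple frequency-band power features from an EEG epoch and trains a classifier to estimate the probability that a recording comes from an “expert”. The sampling rate, band definitions, and the assumption that experts show stronger alpha power are illustrative only.

```python
# Illustrative sketch only: NOT the Neuraltrix pipeline. Synthetic EEG, an
# assumed sampling rate and band definitions, and a toy "experts have stronger
# alpha" rule are used purely to show the shape of a feature-extraction and
# classification pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 256                                            # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch: np.ndarray) -> np.ndarray:
    """Mean spectral power in each frequency band for one single-channel epoch."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

rng = np.random.default_rng(0)

def fake_epoch(expert: bool) -> np.ndarray:
    """Two seconds of synthetic EEG; 'experts' get a stronger 10 Hz component."""
    t = np.arange(2 * FS) / FS
    return (1.5 if expert else 0.5) * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)

X = np.array([band_powers(fake_epoch(e)) for e in [True] * 50 + [False] * 50])
y = np.array([1] * 50 + [0] * 50)                   # 1 = expert, 0 = novice

clf = LogisticRegression().fit(X, y)
new_scan = band_powers(fake_epoch(True))
print("Estimated P(expert):", clf.predict_proba([new_scan])[0, 1])
```

Even a toy pipeline like this makes clear why the data are sensitive: the same kinds of features that correlate with expertise could, in principle, correlate with health conditions or psychological traits.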

BCI technology could benefit people by improving the match between individuals and their jobs and by allowing better and more nuanced career support. However, a BCI also has access to additional information that may be sensitive or even troubling. For example, it could reveal a person’s health status (such as epilepsy or stroke), or it may suggest psychological traits ranging from unconscious racial bias to psychopathy. Someone sensitive about their privacy may be reluctant to consent to wearing a BCI.

In everyday life, we show what is on our minds through language and behaviour, which are normally under our control and which provide a buffer of privacy. BCIs with direct access to the brain and increasing capability to decode its activity may breach this buffer. Information collected by BCIs could be of interest not only to employers deciding whether to hire and invest in a new employee, but also to health insurers, advertising agencies, or governments.

 

Optional STOP for questions and activities:

1. Activity: Risks of brain activity decoding – Identify the physical, ethical, and social difficulties that could result from the use of devices that have the ability to directly access the brain and decipher some of its psychological content such as thoughts, beliefs, and emotions.

2. Activity: Regulatory oversight – Investigate which organisations and regulatory bodies currently monitor and are responsible for the safe and ethical use of BCIs.

3. Activity: Technical integration – Investigate how BCIs work to translate brain activity into interpretable data.

 

Dilemma – Part one:

A year after Neuraltrix deployed its BCI in several hospitals, its lead developer, Aziza, joined the customer support team. While remaining proud and supportive of the technology, she had misgivings about some of its unexpected ramifications. She received the following requests from people and institutions for system modifications or for data sharing:

1. A hospital asked Neuraltrix for a technical modification that would allow the HR department to send data to their clinical neurophysiologists for “further analysis,” claiming that this might benefit people by potentially revealing a medical abnormality that might otherwise be missed.

2. An Artificial Intelligence research group partnering with Neuraltrix requested access to the data to improve their signal analysis algorithms.

3. A private health insurance company requested that Neuraltrix provide access to the scan of someone who had applied for insurance coverage, stating that it had a right to examine the scan just as life insurers are allowed to perform health checks on prospective customers.

4. An advertising agency asked Neuraltrix for access to its data in order to fine-tune the agency’s customer behaviour prediction algorithms.

5. A government agency demanded access to the data to investigate a suspected case of “radicalisation”.

6. A prosecutor asked for access to the scan of a specific person who had recently been the defendant in an assault case; the prosecutor was gathering evidence of potential aggressive tendencies.

7. A defence attorney requested data because they were gathering potentially exonerating evidence, to prove that the defendant’s autonomy had been compromised by their brain states, following a line of argument known as “My brain made me do it.”

 

Optional STOP for questions and activities: 

1. Activity: Identify legal issues – Students could research what laws or regulations apply to each case and consider various ways in which Neuraltrix could lawfully meet some of the above requests while rejecting others, and how their responses should be communicated within the company and to the requestor.

2. Activity: Identify ethical issues – Students could reflect on what might be the immediate ethical concerns related to sharing the data as requested.

3. Activity: Discussion or Reflection – Possible prompts:

 

Dilemma – Part two:

The Neuraltrix BCI has an interface that allows users to provide informed consent before being scanned. The biomedical engineer who developed the system was informed about a customer complaint stating that the user had felt pressured to provide consent because the scan was part of a job interview. The complaint also stated that the user had not been aware of the extent of the information gleaned from their brain, and that they would not have provided consent had they been made aware of it.
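The case study does not specify how the Neuraltrix consent interface works. As a purely hypothetical sketch, the Python structure below shows one way a consent record could make the scope of data use explicit and revocable, which is exactly what the complaint says was missing; all purpose names are invented for illustration.

```python
# Hypothetical sketch of a scoped, revocable consent record for a BCI scan.
# The case study does not describe Neuraltrix's actual consent interface; the
# purpose names below are invented to illustrate the idea.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    subject_id: str
    # Each purpose must be agreed to separately, so the subject sees the full
    # extent of what may be inferred from the recording before the scan starts.
    purposes: dict = field(default_factory=lambda: {
        "proficiency_assessment": False,
        "incidental_health_findings": False,
        "algorithm_research": False,
    })
    revoked: bool = False
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def grant(self, purpose: str) -> None:
        if purpose not in self.purposes:
            raise KeyError(f"Unknown purpose: {purpose}")
        self.purposes[purpose] = True

    def revoke_all(self) -> None:
        """Subject withdraws consent; downstream systems must stop processing."""
        self.revoked = True

    def allows(self, purpose: str) -> bool:
        return not self.revoked and self.purposes.get(purpose, False)

# Example: a candidate consents only to the proficiency assessment.
record = ConsentRecord(subject_id="candidate-001")
record.grant("proficiency_assessment")
print(record.allows("incidental_health_findings"))   # False
```

Separating purposes in this way forces the interface to tell the candidate, before the scan, what may be inferred from the recording, and gives them a mechanism to withdraw consent later.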

 

Optional STOP for questions and activities: 

1. Activity: Technical analysis – Students might try to determine whether it is possible to design the BCI consent system and/or consent process to eliminate the difficulties cited in the complaint. Could the device be designed to automatically detect sensitive psychological content, or to allow the subject to stop the scan or retroactively erase the recording?

2. Activity: Determine the broader societal impact and the wider ethical context – Students should consider what issues are raised by the widespread availability of brain scans. This could be done in small groups or a larger classroom discussion.

Possible prompts:

 

Dilemma – Part three:

Neuraltrix is about to launch an updated version of its BCI, in which all data processing and storage move to the cloud to facilitate interactive and mobile applications. The upgrade has attracted investors and a major deal is about to be signed. The board is requesting fast deployment from the management team, and Aziza faces pressure from her managers to run the final security checks and go live with the cloud version. During these checks, Aziza discovers a critical security issue that could be exploited once the BCI runs in the cloud, risking breaches of the database and of the algorithms. Managers believe this can be fixed after launch and ask her to start the deployment and resolve the security issue afterwards.

 

Optional STOP for questions and activities: 

1. Activity: Students should consider whether it is advisable for Aziza to follow the requests from her managers and the Neuraltrix board, or to halt deployment of the new version, which may put the investment deal, and possibly the future of the company, at risk. They should discuss the possible consequences of each course of action.

2. Activity: Apply an analysis based on “Duty-Ethics” and “Rights Ethics.” This could be done in small groups (who would argue for the management and engineering positions, respectively) or in a larger classroom discussion. A tabulation approach with detailed pros and cons is recommended.

3. Activity: Apply a similar analysis to the above, based on the principles of “Act-Utilitarianism” and “Rule-Utilitarianism.”

Possible prompts:

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

 

Authors: Dr Yujia Zhai (University of Hertfordshire); Associate Professor Scarlett Xiao (University of Hertfordshire). 

Topic: Data security of industrial robots.  

Disciplines: Robotics; Data; Internet of Things. 

Ethical issues: Safety; Health; Privacy; Transparency. 

Professional situations: Rigour; Informed consent; Misuse of data. 

Educational level: Intermediate. 

Educational aim: Gaining ethical knowledge. Knowing the sets of rules, theories, concepts, frameworks, and statements of duty, rights, or obligations that inform ethical attitudes, behaviours, and practices. 

 

Learning and teaching notes: 

This case study involves an engineer hired to develop and install an Industrial Internet of Things (IIoT) online machine monitoring system for a manufacturing company. The developments include designing the infrastructure of hardware and software, writing the operation manuals and setting policies. The project incorporates a variety of ethical components including law and policy, stakeholders, and risk analysis. 

This case study addresses three of the themes from the Accreditation of Higher Education Programmes fourth edition (AHEP4): Design and Innovation (significant technical and intellectual challenges commensurate with the level of study), the Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools, and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37. 

The dilemma in this case is presented in three parts. If desired, a teacher can use Part one in isolation, but Part two and Part three develop and complicate the concepts presented in Part one to provide for additional learning. The case study allows teachers the option to stop at multiple points for questions and/or activities as desired. 

Learners have the opportunity to: 

Teachers have the opportunity to:  

 

Learning and teaching resources: 

Professional organisations: 

Legal regulations: 

UN agency: 

Educational resource: 

Government sites: 

 Educational institutions: 

 

Summary:  

The IIoT is a new technology that can provide accurate condition monitoring and predict component wear rates in order to optimise machine performance, thereby improving the machining precision of workpieces and reducing production costs.

Oxconn is a company that produces auto parts. The robotic manipulators and other automated machines on its production line have been developed at considerable cost, and regular maintenance is essential to ensure the line’s effective operation. The current maintenance scheme is based on routine checks, which are neither reliable nor efficient. Oxconn has therefore decided to install an IIoT-based machine condition monitoring system. To achieve fast responses to any machine operation issues, machine condition data collected in real time will be transferred to a cloud server for analysis, decision making and, in future, predictive maintenance.
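The case study does not say what analysis the cloud service will perform. As a minimal illustration of the predictive-maintenance idea, the Python sketch below fits a straight-line trend to a monitored wear indicator (for example, a bearing vibration level) and estimates how long remains before it crosses a maintenance threshold; the readings, sampling interval, and threshold are invented.

```python
# Illustrative sketch (not Oxconn's actual system): estimate when a monitored
# wear indicator, e.g. bearing vibration RMS, will cross a maintenance
# threshold by fitting a straight-line trend to recent readings.
def remaining_useful_hours(readings, interval_hours, threshold):
    """readings: wear-indicator values sampled every `interval_hours` hours."""
    n = len(readings)
    xs = [i * interval_hours for i in range(n)]
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return float("inf")            # no upward wear trend detected
    return max(0.0, (threshold - readings[-1]) / slope)

# Example: hourly vibration readings drifting upwards towards a threshold of 6.0.
vibration = [3.1, 3.2, 3.4, 3.5, 3.7, 3.9, 4.0, 4.2]
print(f"Schedule maintenance in ~{remaining_useful_hours(vibration, 1, 6.0):.0f} hours")
```

A real system would use far richer models, but even this sketch shows why continuous, trustworthy data matters: a corrupted or tampered feed would silently distort the maintenance schedule.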

 

Dilemma – Part one – Data protection on customers’ machines:

You are a leading engineer who has been hired by Oxconn to take charge of the project on the IIoT-based machine monitoring system, including designing the infrastructure of hardware and software, writing the operation manuals, setting policies, and getting the system up and running. With your background in robotic engineering and automation, you are expected to act as a technical advisor to Oxconn and liaise with the Facilities, Security, Operation, and Maintenance departments to ensure a smooth deployment. This is the first time you have worked on a project that involves real-time data collection, so as part of your preparation you need to do some preliminary research into which best practices, guidance, and regulations apply.

 

Optional STOP for questions and activities: 

1. Discussion: What are the legal issues relating to machine condition monitoring? Real-time machine data reveals the production status of a factory and is therefore commercially sensitive; where it can be linked to identifiable individuals, such as machine operators, it may also fall within the scope of the UK GDPR and the Data Protection Act 2018. Are there rules specifically for the IIoT, or do the same rules apply no matter what technology is being used? Should IIoT regulations differ in any way? Why? 

2. Discussion: Sharing data is a legally and ethically complex field. Are there any stakeholders with which the data could be shared? For instance, is it acceptable to share the data with an artificial intelligence research group or with the public? Why, or why not? 

3. Discussion: Under the GDPR, processing personal data normally requires a lawful basis, such as consent. For machine condition data, how should consent be handled in this case? 

4. Discussion: What ethical codes relate to data security and privacy in an IIoT scenario?  

5. Activity: Undertake a technical activity that relates to how IIoT-based machine monitoring systems are engineered. 

6. Discussion: Based on your understanding of how IIoT-based machine monitoring systems are engineered, consider what additional risks, and what kinds of risks (such as financial or operational), Oxconn might incur if it depends on an entirely cloud-based system. How might these risks be mitigated from a technical and non-technical perspective? 

 

Dilemma – Part two – Computer networks security issue brought by online monitoring systems:

The project has kicked off and a senior manager requests that a user interface (UI) be established specifically for the senior management team (SMT). Through this UI, the SMT members can access all the real-time data via their computers or mobiles and obtain the analysis results provided by artificial intelligence technology. You realise that this raises the risk of internal operating systems being accessed via external interfaces and networks. So, as part of your preparation, you need to investigate which platforms could be used and what risk analysis must be carried out during implementation.
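One way to reason about that risk is to sketch how access from an external UI could be restricted. The Python example below illustrates simple role-based access control; the role names and data categories are hypothetical, not taken from the case study.

```python
# Hypothetical sketch of role-based access control for the proposed SMT user
# interface. Role names and data categories are illustrative; the point is that
# an externally reachable UI should expose only the minimum data each role needs.
ROLE_PERMISSIONS = {
    "smt":         {"kpi_summary", "alerts"},                       # aggregated views only
    "maintenance": {"kpi_summary", "alerts", "raw_telemetry"},
    "developer":   {"kpi_summary", "alerts", "raw_telemetry", "model_parameters"},
}

def authorise(role: str, resource: str) -> bool:
    """Return True only if the role is explicitly granted the resource."""
    return resource in ROLE_PERMISSIONS.get(role, set())

def fetch(role: str, resource: str) -> str:
    if not authorise(role, resource):
        raise PermissionError(f"Role '{role}' may not access '{resource}'")
    # ...query the monitoring database here...
    return f"<{resource} data>"

print(fetch("smt", "kpi_summary"))          # allowed
try:
    fetch("smt", "raw_telemetry")           # denied: not in the SMT permission set
except PermissionError as err:
    print(err)
```

The design question for discussion is which data the SMT genuinely needs through an externally reachable interface, and which should remain inside the plant network.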

 

Optional STOP for questions and activities: 

The following activities focus on macro-ethics. They address the wider ethical contexts of projects like the industrial data acquisition system. 

1. Activity: Explore different manufacturers and their approaches to safety for both machines and operators. 

2. Activity: Technical integration – Undertake a technical activity related to automation engineering and information engineering. 

3. Activity: Research what happens with the data collected by IIoT. Who can access this data and how can the data analysis module manipulate the data?  

4. Activity: Develop a risk management register, taking into account the findings from Activity 3 as well as the need to put data security protocols and relevant training for the SMT in place. 

5. Discussion/activity: Use information in the Ethical Risk Assessment guide to help students consider how ethical issues are related to the risks they have just identified. 

6. Discussion: In addition to cost-benefit analysis, how can the ethical factors be considered in designing the data analysis module? 

7. Activity: Debate the appropriateness of installing and using the system for the SMT. 

8. Discussion: What responsibilities do engineers have in developing these technologies? 

 

Dilemma – Part three – Security breach and legal responsibility: 

At the beginning of operation, the IIoT system and its AI algorithms improved the efficiency of the production lines by automatically updating robot operating parameters and product recipes. Recently, however, a degradation in efficiency was observed and, after investigation, there were suspicions that the rules and data in the AI algorithms had been subtly changed. Developers, contractors, operators, technicians and managers were all brought in to find out what had happened.

 

Optional STOP for questions and activities: 

1. Discussion: If there has been an illegal hack of the system, what might be the motive of cyber criminals?   

2. Discussion: What are the impacts on the company’s business? How could the impact of cyber-attacks on businesses be minimised?

3. Discussion: How could threats that come from internal employees, vendors, contractors or partners be prevented?

4. Discussion: When a security breach happens, what are the legal responsibilities for developers, contractors, operators, technicians and managers? 

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.

Authors: Professor Sarah Hitt SFHEA (NMITE); Professor Raffaella Ocone OBE FREng FRSE (Heriot Watt University); Johnny Rich (Engineering Professors’ Council); Dr Matthew Studley (University of the West of England, Bristol); Dr Nik Whitehead (University of Wales Trinity Saint David); Dr Darian Meacham (Maastricht University); Professor Mike Bramhall (TEDI-London); Isobel Grimley (Engineering Professors’ Council).

Topic: Data security of smart technologies.

Engineering disciplines: Electronics, Data, Mechatronics.

Ethical issues: Autonomy, Dignity, Privacy, Confidentiality.

Professional situations: Communication, Honesty, Transparency, Informed consent.

Educational level: Intermediate.

Educational aim: Practise ethical analysis. Ethical analysis is a process whereby ethical issues are defined and affected parties and consequences are identified so that relevant moral principles can be applied to a situation in order to determine possible courses of action.

 

Learning and teaching notes:

This case involves a software engineer who has discovered a potential data breach in a smart home community. The engineer must decide whether or not to report the breach, and then whether to alert and advise the residents. In doing so, considerations of the relevant legal, ethical, and professional responsibilities need to be weighed. The case also addresses communication in cases of uncertainty as well as macro-ethical concerns related to ubiquitous and interconnected digital technology.

This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.

The dilemma in this case is presented in two parts. If desired, a teacher can use Part one in isolation, but Part two develops and complicates the concepts presented in Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.

Learners will have the opportunity to:

Teachers will have the opportunity to:

 

Learning and teaching resources:

 

Summary:

Smart homes have been called “the road to independent living”. They have the potential to increase the autonomy and safety of older people and people with disabilities. In a smart home, the internet of things (IoT) is coupled with advanced sensors, chatbots and digital assistants. This combination enables residents to be connected with both family members and health and local services, so that if there are problems, there can be a quick response.

Ferndale is a community of smart homes. It has been developed at considerable cost and investment as a pilot project to demonstrate the potential for better and more affordable care of older people and people with disabilities. The residents have a range of capabilities and all are over the age of 70. Most live alone in their homes. Some residents are supported to live independently through reminders to take their medication, prompts to complete health and fitness exercises, help completing online shopping orders, and the detection of falls and trips throughout the house. The continuous assessment of habits, diet and routines allows the technology to build models that may help to predict future negative health outcomes, such as the onset of dementia or issues related to dietary deficiencies. The functionality of many smart home features depends on a reliable and secure internet connection.
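The case study does not describe Ferndale’s software, but a minimal sketch, assuming Python and invented activity counts, illustrates the kind of routine-deviation check such a model might perform: flagging a day whose activity falls well below a resident’s recent baseline.

```python
# Illustrative sketch (not Ferndale's actual software): flag a day whose
# activity count falls well below the resident's recent baseline, the sort of
# deviation that might prompt a welfare check.
from statistics import mean, stdev

def is_unusual(history, today, z_threshold=2.0):
    """history: daily activity counts for recent days; today: today's count."""
    baseline, spread = mean(history), stdev(history)
    if spread == 0:
        return today != baseline
    return (baseline - today) / spread > z_threshold

kitchen_visits = [14, 15, 13, 16, 14, 15, 13]   # last week's daily counts (invented)
print(is_unusual(kitchen_visits, today=4))       # True: well below baseline
print(is_unusual(kitchen_visits, today=14))      # False: within normal routine
```

The same mechanism that enables a timely welfare check also means the system holds a continuous behavioural record of each resident, which is why its security matters.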

 

Dilemma – Part one:

You are the software engineer responsible for the integrity of Ferndale’s system. During a routine inspection you discover several indicators suggesting a data breach may have occurred via some of the smart appliances, many of which have cameras and are voice-activated. Through the IoT, these appliances are also connected to Amazon Ring home security products, which ultimately link back to Amazon accounts holding financial information and details about purchases.
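The case study does not state what the indicators were. As a purely illustrative sketch, the Python snippet below shows one simple check an engineer might run over appliance network logs: flagging outbound connections to destinations outside an expected allow-list. The log format, host names, and allow-list are invented.

```python
# Hypothetical sketch of one check the engineer might run: scan appliance
# network logs for outbound connections to destinations outside an expected
# allow-list. The log format, host names, and allow-list are invented for
# illustration; the case study does not specify the actual breach indicators.
EXPECTED_HOSTS = {"ring.com", "amazon.com", "ferndale-care.example"}

def suspicious_connections(log_lines):
    """Each line: '<timestamp> <device_id> <destination_host> <bytes_sent>'."""
    flagged = []
    for line in log_lines:
        timestamp, device, host, sent = line.split()
        if not any(host == h or host.endswith("." + h) for h in EXPECTED_HOSTS):
            flagged.append((timestamp, device, host, int(sent)))
    return flagged

sample_log = [
    "2024-03-01T02:14Z fridge-07 api.ring.com 1832",
    "2024-03-01T02:15Z fridge-07 unknown-host.example.net 920411",
]
for event in suspicious_connections(sample_log):
    print("Investigate:", event)
```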

 

Optional STOP for questions and activities: 

1. Activity: Technical analysis – Before the ethical questions are addressed, the students might consider a number of immediate technical questions that will help inform the discussion. A sample data set or similar technical problem (such as the log-scanning sketch above) could be used for this analysis. For example:

2. Activity: Identify legal and ethical issues. The students should reflect on what might be the immediate ethical concerns of this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

3. Activity: Determine the wider ethical context. Students should consider what wider moral issues are raised by this situation. This could be done in small groups or a larger classroom discussion.

Possible prompts:

 

Dilemma – Part two:

You send an email to Ferndale’s manager about the potential breach, emphasising that the implications are possibly quite serious. She replies immediately, asking that you do not reveal anything to anyone until you are absolutely certain about what has happened. You email back that it may take some time to determine if the software security has been compromised and if so, what the extent of the breach has been. She replies explaining that she doesn’t want to cause a panic if there is nothing to actually worry about and says “What you don’t know won’t hurt you.” How do you respond?     

 

Optional STOP for questions and activities: 

1. Discussion: Professional values – What guidance is given by codes of ethics such as the Royal Academy of Engineering/Engineering Council’s Statement of Ethical Principles or the Association for Computing Machinery Code of Ethics?

2. Activity: Map possible courses of action. The students should think about the possible actions they might take. They can be prompted to articulate different approaches that could be adopted, such as the following, but also develop their own alternative responses.

3. Activity: Hold a debate on which is the best approach and why. The students should interrogate the pros and cons of each possible course of action including the ethical, technical, and financial implications. They should decide on their own preferred course of action and explain why the balance of pros and cons is preferable to other options.

4. Activity: Role-play a conversation between the engineer and the manager, or a conversation between the engineer and a resident.

5. Discussion: consider the following questions:

6. Activity: Change perspectives. Imagine that you are the child of one of Ferndale’s residents and that you get word of the potential data security breach. What would you hope the managers and engineers would do?

7. Activity: Write a proposal on how the system might be improved to stop this happening in the future, or to mitigate unavoidable risks. To inform the proposal, the students should also explore guidance on what might be best practice in this area. For example, in this instance, they may decide on a series of steps.

 

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Any views, thoughts, and opinions expressed herein are solely that of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.
