Authors: Ahmet Omurtag (Nottingham Trent University); Andrei Dragomir (National University of Singapore / University of Houston).
Topic: Data security of smart technologies.
Engineering disciplines: Electronics; Data; Biomedical engineering.
Ethical issues: Autonomy; Dignity; Privacy; Confidentiality.
Professional situations: Communication; Honesty; Transparency; Informed consent; Misuse of data.
Educational level: Advanced.
Educational aim:Â Practising Ethical Analysis: engaging in a process by which ethical issues are defined, affected parties and consequences are identified, so that relevant moral principles can be applied to a situation in order to determine possible courses of action.
Learning and teaching notes:
This case involves Aziza, a biomedical engineer working for Neuraltrix, a hypothetical company that develops brain-computer interfaces (BCIs) for specialised applications. Aziza has always been curious about the brain and enthusiastic about using cutting-edge technologies to help people in their daily lives. Her team has designed a BCI that can measure brain activity non-invasively and, by applying machine learning algorithms, assess a person's job-related proficiency and level of expertise. She is leading the deployment of the new system in hospitals and medical schools, where it will be used to evaluate candidates for consultant positions. In doing so, and in responding to requests to extend and use the BCI-based system in unforeseen ways, she finds herself compelled to weigh various ethical, legal, and professional responsibilities.
This case study addresses two of AHEP 4’s themes: The Engineer and Society (acknowledging that engineering activity can have a significant societal impact) and Engineering Practice (the practical application of engineering concepts, tools and professional skills). To map this case study to AHEP outcomes specific to a programme under these themes, access AHEP 4 here and navigate to pages 30-31 and 35-37.
The dilemma in this case is presented in three parts. If desired, a teacher can use the Summary and Part one in isolation, but Parts two and three develop and complicate the concepts presented in the Summary and Part one to provide for additional learning. The case allows teachers the option to stop at multiple points for questions and/or activities as desired.
Learners have the opportunity to:
- analyse the ethical dimensions of an engineering situation;
- identify professional responsibilities of engineers in an ethical dilemma;
- determine and defend a course of action in response to an ethical dilemma;
- practise professional communication;
- debate viable solutions to an ethical dilemma.
Teachers have the opportunity to:
- highlight professional codes of ethics and their relevance to engineering situations;
- address approaches to resolve interpersonal and/or professional conflict;
- integrate technical content on software and/or cybersecurity;
- informally evaluate students’ critical thinking and communication skills.
Learning and teaching resources:
Professional organisations:
- International Neuroethics Society resources
- RAEng/Engineering Council Statement of Ethical Principles
Journal articles:
- Brain measurements can discriminate surgery skill
- Act and rule utilitarianism
- Engineering Ethics by Charles E. Harris
Summary:
Brain-computer interfaces (BCIs) detect brain activity and use advanced signal analysis to identify features in the data that may be relevant to specific applications. These features might provide information about people’s thoughts and intentions, or about their psychological traits or potential disorders, and may be interpreted for various purposes such as medical diagnosis, real-time feedback, or interaction with external devices such as a computer. Some current non-invasive BCIs employ unobtrusive electroencephalography (EEG) headsets or even optical (near-infrared) sensors to detect brain function, and can be safe and convenient to use.
Evidence shows that the brains of people with specialised expertise have identifiable functional characteristics. Biomedical technology may soon translate this knowledge into BCIs that can be used to assess professional skills objectively. Researchers already know that neural signals contain features linked to levels of expertise, which may enable the assessment of job applicants or of candidates for promotion or certification.
BCI technology could potentially benefit people by improving the match between individuals and their jobs, and by allowing better and more nuanced career support. However, the BCI has access to additional information that may be sensitive or even troubling. For example, it could reveal a person’s health status (such as epilepsy or stroke), or it may suggest psychological traits ranging from unconscious racial bias to psychopathy. Someone sensitive about their privacy may be reluctant to consent to wearing a BCI.
In everyday life, we show what is on our minds through language and behaviour, which are normally under our control and so provide a buffer of privacy. BCIs, with direct access to the brain and an increasing capability to decode its activity, may breach this buffer. Information collected by BCIs could be of interest not only to employers deciding whether to hire and invest in a new employee, but also to health insurers, advertising agencies, or governments.
Optional STOP for questions and activities:
1. Activity: Risks of brain activity decoding – Identify the physical, ethical, and social difficulties that could result from the use of devices that have the ability to directly access the brain and decipher some of its psychological content such as thoughts, beliefs, and emotions.
2. Activity: Regulatory oversight – Investigate which organisations and regulatory bodies currently monitor and are responsible for the safe and ethical use of BCIs.
3. Activity: Technical integration – Investigate how BCIs work to translate brain activity into interpretable data (a minimal illustrative sketch follows this list).
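For teachers who wish to integrate technical content alongside the activities above, the sketch below illustrates, in Python, one common way brain activity is translated into interpretable data: EEG epochs are reduced to band-power features and passed to a classifier that estimates an "expertise" label. This is not the Neuraltrix system (which is hypothetical); the sampling rate, frequency bands, synthetic data, and labels are all assumptions chosen purely for illustration.

```python
# Minimal, illustrative BCI analysis sketch: EEG epochs -> band-power
# features -> classifier. All parameters and the synthetic data below are
# hypothetical; a real system would use curated recordings and validated
# features rather than random noise.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epochs):
    """epochs: (n_epochs, n_channels, n_samples) -> (n_epochs, n_channels * n_bands)."""
    feats = []
    for epoch in epochs:
        freqs, psd = welch(epoch, fs=FS, nperseg=FS)   # power spectral density per channel
        row = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            row.extend(psd[:, mask].mean(axis=1))      # mean power in each band
        feats.append(row)
    return np.array(feats)

# Synthetic stand-in data: 40 epochs, 8 channels, 2 seconds each,
# labelled 0 = "novice", 1 = "expert" (purely illustrative).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 8, 2 * FS))
labels = rng.integers(0, 2, size=40)

X = band_power_features(epochs)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

Students could then discuss how far such features warrant claims about "expertise", and what else the same signals might reveal about the person being scanned.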
Dilemma – Part one:
After the Neuraltrix BCI had been deployed and in use for a year in several hospitals, its lead developer, Aziza, became part of the customer support team. While remaining proud and supportive of the technology, she had misgivings about some of its unexpected ramifications. She received the following requests from people and institutions for system modifications or for data sharing:
1. A hospital asked Neuraltrix for a technical modification that would allow the HR department to send data to their clinical neurophysiologists for “further analysis,” claiming that this might benefit people by potentially revealing a medical abnormality that might otherwise be missed.
2. An Artificial Intelligence research group partnering with Neuraltrix requested access to the data to improve their signal analysis algorithms.
3. A private health insurance company requested that Neuraltrix provide access to the scan of someone who had applied for insurance coverage; they claimed a right to examine the scan, just as life insurers are allowed to require health checks of potential customers.
4. An advertising agency asked Neuraltrix for access to its data in order to fine-tune their customer behavioural prediction algorithms.
5. A government agency demanded access to the data to investigate a suspected case of “radicalisation”.
6. A prosecutor asked for access to the scan of a specific person because she had recently been the defendant in an assault case; the prosecutor was gathering evidence of potential aggressive tendencies.
7. A defence attorney requested data because they were gathering potentially exonerating evidence, to prove that the defendant’s autonomy had been compromised by their brain states, following a line of argument known as “My brain made me do it.”
Optional STOP for questions and activities:
1. Activity: Identify legal issues – Students could research what laws or regulations apply to each case and consider various ways in which Neuraltrix could lawfully meet some of the above requests while rejecting others, and how their responses should be communicated within the company and to the requestor.
2. Activity: Identify ethical issues – Students could reflect on what might be the immediate ethical concerns related to sharing the data as requested.
3. Activity: Discussion or Reflection – Possible prompts:
- Do you, as a biomedical engineer, have any duty to the people who have been scanned? Do you have more or less of a responsibility to these people or to Neuraltrix?
- If you find that a fellow employee has already shared the data without telling others, how should you act? Should you worry that revealing this employee’s actions might cause distress or create distrust in the integrity of the entire system? Is there anyone else you should inform? Are there any risks you may be able to mitigate immediately?
- Do you think the reasons and justifications given for the data requests listed above are legitimate?
- Who owns the data collected by the BCI? Should it be protected? How, and for how long? Who should maintain it?
Dilemma – Part two:
The Neuraltrix BCI has an interface which allows users to provide informed consent before being scanned. The biomedical engineer who developed the system was informed about a customer complaint which stated that the user had felt pressured to provide consent because the scan was part of a job interview. The complaint also stated that the user had not been aware of the extent of the information gleaned from their brain, and that they would not have provided consent had they been made aware of it.
Optional STOP for questions and activities:
1. Activity: Technical analysis – Students might try to determine whether the BCI consent system and/or consent process could be designed to eliminate the difficulties cited in the complaint. Could the device be designed to automatically detect sensitive psychological content, or to allow the subject to stop the scan or retroactively erase the recording? (A minimal sketch of one possible consent-and-erasure interface appears after the prompts below.)
2. Activity: Determine the broader societal impact and the wider ethical context – Students should consider what issues are raised by the widespread availability of brain scans. This could be done in small groups or a larger classroom discussion.
Possible prompts:
- On the one hand, human assessors can be subject to bias and inconsistencies and, from this point of view, algorithmic assessment leaving human assessors out of the loop may be viewed as progress. On the other hand, some “black-box” algorithms used by the BCI have been criticised for opacity, hidden biases, and the difficulty of scrutinising their decisions. If a user is dissatisfied with the BCI-enhanced assessment, should they be able to opt out of it?
- If use of the Neuraltrix BCI became widespread, do you believe that humans could eventually irreversibly lose their assessment skills? Compare this with the potential loss of map-reading skills due to the easy access to Satellite Navigation systems.
- Can we dispense with human opinion and make assessment processes entirely “objective”?
- “Goodhart’s law,” named after the economist Charles Goodhart, is often summarised as: when a measure becomes a target, it ceases to be a good measure, because people find ways to manipulate it. Would the Neuraltrix BCI create new opportunities for candidates to “game” the assessment, and how might they do it?
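Relating to the technical analysis activity above, the sketch below shows one way a consent record with explicit purposes, revocation, and retroactive erasure might be represented in software. It is a minimal illustration under assumed class and field names, not a description of the Neuraltrix system, and it deliberately leaves open the harder question of whether consent given during a job interview is genuinely voluntary.

```python
# Hypothetical sketch of consent with explicit purposes, revocation and
# erasure. Names are illustrative; a real system would also need secure
# storage, audit logging and verified deletion across backups.
from __future__ import annotations
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set[str] = field(default_factory=set)   # e.g. {"proficiency_assessment"}
    granted_at: datetime | None = None
    revoked_at: datetime | None = None

    def grant(self, purposes):
        self.purposes = set(purposes)
        self.granted_at = datetime.now(timezone.utc)
        self.revoked_at = None

    def revoke(self):
        """Subject withdraws consent; downstream processing must stop."""
        self.revoked_at = datetime.now(timezone.utc)

    def permits(self, purpose: str) -> bool:
        return (
            self.granted_at is not None
            and self.revoked_at is None
            and purpose in self.purposes
        )

class ScanStore:
    """Toy in-memory store showing consent checks and retroactive erasure."""
    def __init__(self):
        self._scans, self._consents = {}, {}

    def record_scan(self, consent: ConsentRecord, scan_id: str, data: bytes):
        if not consent.permits("proficiency_assessment"):
            raise PermissionError("no valid consent for this purpose")
        self._scans[scan_id] = data
        self._consents[scan_id] = consent

    def share(self, scan_id: str, purpose: str) -> bytes:
        if not self._consents[scan_id].permits(purpose):
            raise PermissionError(f"consent does not cover purpose: {purpose}")
        return self._scans[scan_id]

    def erase(self, scan_id: str):
        """Honour a retroactive erasure request."""
        self._scans.pop(scan_id, None)
        self._consents.pop(scan_id, None)
```

Students could critique what such an interface still fails to address, for example pressure to consent, secure deletion across backups, or data that has already been shared with third parties.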
Dilemma – Part three:
Neuraltrix is about to launch an updated version of its BCI, in which all data processing and storage are moved to the cloud to facilitate interactive and mobile applications. The upgrade has attracted investors and a major deal is about to be signed. The board is requesting fast deployment from the management team, and Aziza faces pressure from her managers to run the final security checks and go live with the cloud version. During these checks, Aziza discovers a critical security vulnerability that could be exploited once the BCI runs in the cloud, risking breaches of the database and the algorithms. Her managers believe this can be fixed after launch and ask her to start deployment and identify solutions to the security issue afterwards.
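As one way of making the "final security checks" concrete, the sketch below shows a hypothetical pre-deployment gate that refuses to go live while unresolved critical or high-severity findings remain. The report file name, format, and severity labels are assumptions for illustration only and do not describe any real scanning tool or the Neuraltrix pipeline.

```python
# Hypothetical pre-deployment gate: block the launch if the security scan
# report contains any unresolved critical or high-severity finding.
import json
import sys

BLOCKING_SEVERITIES = {"critical", "high"}

def release_allowed(report_path: str) -> bool:
    with open(report_path) as f:
        findings = json.load(f)          # assumed: list of {"id", "severity", "resolved"}
    blockers = [
        item for item in findings
        if item["severity"].lower() in BLOCKING_SEVERITIES and not item.get("resolved", False)
    ]
    for b in blockers:
        print(f"blocking finding {b['id']}: severity={b['severity']}")
    return not blockers

if __name__ == "__main__":
    ok = release_allowed("security_report.json")
    sys.exit(0 if ok else 1)   # non-zero exit halts the deployment pipeline
```

Students could discuss who in the organisation should have the authority to override such a gate, and under what conditions, before considering the activities below.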
Optional STOP for questions and activities:
1. Activity: Students should consider whether it is advisable for Aziza to follow the requests from the managers and the Neuraltrix board, or to halt deployment of the new version, which may put at risk the investment deal and possibly the future of the company, and discuss the possible consequences of each course of action.
2. Activity: Apply an analysis based on “Duty-Ethics” and “Rights Ethics.” This could be done in small groups (arguing for the management's position and the engineer's position, respectively) or in a larger classroom discussion. A tabulation approach with detailed pros and cons is recommended.
3. Activity: Apply a similar analysis as above based on the principles of “Act-Utilitarianism” and “Rule-Utilitarianism.”
Possible prompts:
- Should you, as a biomedical engineer, follow company rules and go ahead with the managers’ requests, or put deployment on hold until the security issue is fixed, risking the future of the company (and possibly your job)?
- The principle of act utilitarianism, as advocated by John Stuart Mill, focuses on individual actions rather than on rules: actions should be judged according to whether they produce the best outcome in a particular situation. Should the Neuraltrix management be guided by this principle, or rather by a cost-benefit approach?
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Any views, thoughts, and opinions expressed herein are solely those of the author(s) and do not necessarily reflect the views, opinions, policies, or position of the Engineering Professors’ Council or the Toolkit sponsors and supporters.