Neurotech Has Its Own Ethical Considerations Outside the AI Ethics Landscape
The growing field of AI ethics is becoming a robust discussion, one that spans varied ideas across technology policy and regulation.
Yet as those important conversations unfold, IBM researchers Sara E. Berger and Francesca Rossi (both AI Ethics global leaders at IBM's Thomas J. Watson Research Center in New York) have co-authored an article for the Association for Computing Machinery (ACM) emphasizing the need to expand the focus of AI ethics to incorporate key aspects of neurotechnological ethics.
Let's quickly define these important terms:
- neurotechnology (or neurotech for short): "the assembly of methods and instruments that enable a direct connection of technical components with the nervous system" (Müller & Rotter, 2017). Neurotechnologies fall into three categories:
  - neurosensing (such as electrocorticography and functional magnetic resonance imaging, or fMRI)
  - neuromodulating (such as deep brain stimulation, or DBS, and transcranial magnetic stimulation, or TMS)
  - combinatorial (a combination of both sensing and modulating)
- neurodata: data directly representing the function of the human brain (Hallinan et al., 2014).
More ethical considerations worth noting
- It's not just about "data" (or "Big Data") ... it's also about our "neurodata";
- It's not just about "safety" but also our subjective well-being (generally defined as the health, safety, happiness, and comfort of individuals and/or communities); and lastly,
- It's not just about "control" and "access" but also our human autonomy and human agency.
Berger, S., & Rossi, F. (2023). AI and Neurotechnology: Learning from AI Ethics to Address an Expanded Ethics Landscape. Communications of the ACM, 66(3), 58-68.
Hallinan, D., Schütz, P., Friedewald, M., & De Hert, P. (2014). Neurodata and neuroprivacy: Data protection outdated?. Surveillance & Society, 12(1), 55-72.
Müller, O., & Rotter, S. (2017). Neurotechnology: Current developments and ethical issues. Frontiers in Systems Neuroscience, 11, 93.