Neuroethics attempts to tackle the social, moral, and ethical problems raised by the developing field of neurotechnology, from systemic issues such as discrimination and inequality to more abstract concerns like the undermining of agency and human identity. Questions of who has access to gathered brain data are essential, because access determines how that data can be used in different spaces. For a multitude of reasons, informed consent in its usual form, the standard “terms of usage,” will not adequately inform and protect the rights of all users, and information that users want to keep private may be gleaned as a result. Automating highly important roles, such as those of health providers, also creates room for misuse: providers become responsible for data entry and for opaque algorithmic diagnostics built on technology they are not well versed in.

Disagreements may arise over definitions of brain health, acceptable brain thresholds, and the normal range of human motor and cognitive behavior. How exact should measurements be? How objective are brain thresholds? Algorithmic bias in AI/ML systems, which depends on their training data, is especially problematic when such systems diagnose mental conditions or are used in legal settings. Lie detection raises its own concerns: unwilling participation and self-incrimination, distinguishing lying from confused memories, and false positives in which a stimulus evokes a response for unrelated reasons. How will we know with certainty that such a machine is accurate before it comes into use? Predicting future neurological conditions can itself lead to depression, as people are left feeling hopeless in the face of circumstances they cannot control.

Neuromarketing, propaganda, political campaigns, and neuroprofiling (as with Facebook) enable personal manipulation that is both more effective and more discreet than ever before. Market forces drive development while regulation and regulatory capacity lag behind: legislation is slow moving and not tech-savvy. Companies’ incentives are not aligned with the public’s; they will promote greater use of their products at the expense of truth if they can get away with it. Most advancements come from private companies and military investment rather than public general research, so those actors shape the technology and the conversation around it, including its ethical framing. Higher salaries drain talented researchers into these more morally compromised positions, and the same companies hold advocacy initiatives and sponsor chairs and departments at universities and research institutions to shape the dialogue. Companies must decide whether to participate in unethical projects and whether to allow their products to be used in such situations; the knowledge disparity between companies and everyone else places much of the responsibility and burden on them to act ethically.

Reading mental states raises further problems. Even if we could perceive others’ thoughts, those thoughts are subjective; interpreting them as objective or absolute is a mistake, because they must be understood in the whole context in which they occur, that is, a person’s entire life experiences. In one study, researchers constructed a game and recorded the brain activity of its players; the signals could be processed to elicit details about bank PIN numbers and related private information without the players knowing, by detecting P300 waves evoked by hidden cues (a minimal sketch of this kind of event-related analysis follows below). This bears on cognitive liberty: if people became sure that measurements of their brain might reveal their mental contents, they might refrain from having candid and revealing thoughts at all.

Self-conception would suffer in a similar way: one could no longer feel free to reflect upon values, decisions, or propositions without the threat of consequences, so reasoned opinion would deteriorate. Biological determinism is another risk: using imaging to predict future behavior invites discrimination on that basis, particularly in legal risk assessment, and inner speech decoding raises the question of how a person is to be judged. Neurostimulation and neuromodulation bring their own issues: the expanding use of deep brain stimulation (DBS) for conditions that are not well understood calls for a rigorous approval process. Cognitive enhancement could reinforce inequality, or it could reduce existing differences among people; meanwhile, inequality across the world persists, with mental disorders remaining largely underdiagnosed and undertreated in low- and middle-income countries.

Coordinating large-scale research is a further challenge. These projects require vast computing power, large and expensive platforms, and large amounts of data or many participants. Transparency, open-access data platforms, and common standards are needed, yet there is little funding or incentive to provide them. A global consensus on neuroethics is also needed, otherwise research and technology will simply flow across borders, and cross-cultural variations in values must be taken into account. Overclaiming can undermine confidence in neurotechnology by misrepresenting it and failing to deliver, raising undue hopes and concerns alike. Ethics research is necessary to avoid the delay fallacy, and we should keep asking why we want a given neurotechnology now. Finally, users need the ability to retract actions mediated by brain-controlled devices, particularly speech prostheses, to avoid unwanted expression.
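The PIN study mentioned above hinges on the P300, an event-related potential that appears a few hundred milliseconds after a stimulus a person recognizes as meaningful. As a rough illustration only, the sketch below shows how such a response could in principle be read out of a single EEG channel by simple epoch averaging; the sampling rate, time windows, and all names here are assumptions made for this example, not details taken from the study.

```python
import numpy as np

# Hypothetical illustration of P300 readout by epoch averaging.
# Sampling rate, windows, and channel choice are assumptions for this
# sketch, not parameters reported by the study described above.

FS = 256                                         # EEG sampling rate in Hz (assumed)
EPOCH = int(0.8 * FS)                            # keep 800 ms after each stimulus
BASELINE = int(0.1 * FS)                         # first 100 ms used as baseline
P300_WIN = slice(int(0.25 * FS), int(0.5 * FS))  # 250-500 ms post-stimulus

def average_erp(eeg, onsets):
    """Average single-channel EEG epochs time-locked to stimulus onsets."""
    epochs = np.stack([eeg[t:t + EPOCH] for t in onsets if t + EPOCH <= len(eeg)])
    baseline = epochs[:, :BASELINE].mean(axis=1, keepdims=True)
    return (epochs - baseline).mean(axis=0)      # event-related potential

def p300_contrast(eeg, probe_onsets, irrelevant_onsets):
    """Mean 250-500 ms amplitude difference between probe and irrelevant cues.
    A markedly positive value suggests the probe stimuli were recognized."""
    erp_probe = average_erp(eeg, probe_onsets)
    erp_irrelevant = average_erp(eeg, irrelevant_onsets)
    return erp_probe[P300_WIN].mean() - erp_irrelevant[P300_WIN].mean()

# Synthetic demo: 60 s of noise standing in for one parietal channel.
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, 60 * FS)
score = p300_contrast(eeg,
                      probe_onsets=[FS * i for i in range(1, 40, 2)],
                      irrelevant_onsets=[FS * i for i in range(2, 40, 2)])
print(f"P300 amplitude contrast: {score:.3f} (arbitrary units)")
```

In a setup like the game described above, the “probe” cues would correspond to the hidden items of interest and the “irrelevant” cues to neutral controls; a reliable contrast between the two is what could betray recognition without the player’s awareness.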
While interfering with the brain can help manage the symptoms of certain neurological disorders, it also has the potential to cause irreversible damage to the patient. For example, new technologies are being designed to genetically modify neurons, and existing forms of neural stimulation can change neuron form and function. Researchers have already observed the onset of depressive disorders in patients who had no mental health issues prior to deep brain stimulation. These observations raise many ethical concerns, such as the possibility that neurotechnology could alter a person’s self-consciousness and identity. To tackle this issue, researchers must establish guidelines for evaluating the “sameness” of a person over time. Along with ethical concerns over identity and personhood, the ability to read people’s brain signals opens the possibility for ill-intentioned parties to acquire highly sensitive information. As such, maintaining the security and privacy of brain data is of utmost importance in the use of readout neurotechnology.
As neurotechnology develops and its applications become clearer, it is important to consider how far man and machine should be allowed to fuse, in order to prevent unnecessary complications.