Deliberations on the societal implementation of Neurotechnology

The issue of ethical and legal obstruction has long been a problem to overcome in the world of computer science. In a field that pushes the boundaries of what's possible like no other, at times it seems as though the only things holding our technology back are the ethical and legal dilemmas it raises.

The article that is quoted and referenced throughout is:
https://www.technologynetworks.com/neuroscience/articles/privacy-in-the-brain-the-ethics-of-neurotechnology-353075

This article discusses the issues surrounding neurotechnology, and how we can be sure that technology that is literally integrated into our bodies and minds can be safe and legitimate.
To begin to understand some of the issues in this article, we must first understand what exactly neurotechnology is. To give a dictionary definition, neurotechnology is, “the assembly of methods and instruments that enable a direct connection of technical components with the nervous system.”
But what does that actually mean? Well, neurotechnology is technology that utilises equipment such as electrodes and intelligent prostheses to create a connection between a powerful computer or node and the nervous system in our bodies.
The first developments in neurotechnology were made way back in 1965; Giles Brindley, a University of Cambridge physiologist, produced a brain implant that could wirelessly stimulate the visual cortex. This was developed as a visual prosthesis and the phosphenes that the implant generated (irregular flashes of light that appear in the visual field) enabled the user to identify a few letters of the alphabet. While being of little practical use at the time, Brindley’s experiments established the moral basis for neurotechnology — these devices’ worth would surely be undeniable if they could bring sight to the blind and voice to the voiceless.
This revolutionary idea, which has been in the works for decades but is only now beginning to be taken seriously by the masses, has a wealth of applications, from medicine to practical daily use, and with the money and brains of people like Elon Musk as the driving force behind it, we can be sure it is here to stay.

But how does it work? Electrodes, the main channel of communication between the nodes and the brain, can be placed directly on the scalp in the form of electrode caps that pick up the electrical fields generated by the active brain. This is termed 'non-invasive', as the electrodes do not actually penetrate the body.
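To make the non-invasive setup a little more concrete, here is a minimal sketch of the kind of signal processing an electrode cap's output might go through. It uses a synthetic signal and NumPy/SciPy purely for illustration; the sampling rate, frequency band and amplitudes are assumptions of mine, not values from the article.

```python
# Illustrative sketch only: estimating alpha-band power from a simulated,
# EEG-like signal, the sort of measurement a non-invasive electrode cap provides.
import numpy as np
from scipy.signal import welch

fs = 250                      # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)  # 10 seconds of samples

# Synthetic "brain signal": a 10 Hz alpha rhythm buried in noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

# Estimate the power spectrum and integrate over the alpha band (8-12 Hz).
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = np.trapz(psd[alpha], freqs[alpha])

print(f"Estimated alpha-band power: {alpha_power:.2e}")
```

Real systems do far more than this (artifact rejection, multiple channels, calibration), but the basic idea is the same: the cap records tiny voltage fluctuations, and software turns them into features a computer can act on.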
Current research aims to stabilise these technologies and implement them, across a wide range of uses, as easily and seamlessly as possible. There are still many details to iron out, and it is especially important that everything works smoothly and without fault.
Neurotechnology can be very useful and, when paired with pharmaceutical or medical aids, can be incredibly effective in treating many ailments and diseases. However, interventions in the brain can irreversibly alter a patient's personality and character. This is intended in the treatment of certain affective disorders.
However, changes to personality can also be an unintended side effect of brain intervention, and this raises issues of legal responsibility. A foundational part of criminal law is automatism: if an action can be proved to be involuntary, there are insufficient grounds to find the defendant guilty.
If automatism can be proved, the actus reus (the physical element of a crime) is negated, and the defendant is acquitted (released and unable to be re-tried for the same crime).
So, if someone who is using neurotechnology murders someone, they could argue under the insanity defence that this is an example of insane automatism, as the mens rea (the internal aspect of a crime, i.e. the intent) is in question, and therefore they would avoid jail time and instead be sent for a mandatory stay in a psychiatric hospital.
But is that fair? Seeing as it would be near impossible to prove that a malfunction or side effect was the direct cause of the murder, rather than any of a million other reasons, it is certainly ambiguous. However, ambiguity favours the defendant, so there is a high chance an insanity defence would be enough to avoid jail time, as long as the defendant doesn't mind an indefinite stay in a psychiatric ward.
An example of its use would be in patients suffering from ALS, who are almost completely paralyzed in the later stages of the disease. These patients are sometimes only able to communicate using their eyelids, or by voluntary changes in their brain activity. They are still capable of controlling certain aspects of their brain activity and, relying on neurotechnology for decoding, can respond to yes-or-no questions. With practice, they can operate a computerized machine not dissimilar to a typewriter and compose sentences. A toy version of that decoding step is sketched below.
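The sketch below reduces the decoding step described above to a toy threshold rule. Real brain-computer-interface spellers use classifiers calibrated per patient on recorded brain signals; the feature values and threshold here are hypothetical placeholders, invented only to show the shape of the idea.

```python
# Illustrative sketch only: mapping a single brain-signal feature to a
# yes/no answer, as a stand-in for the decoding used with ALS patients.
import numpy as np

def decode_yes_no(alpha_power: float, threshold: float = 1.0e-10) -> str:
    """Map one band-power feature to a yes/no answer.

    In a real system the threshold would be learned during a calibration
    session for each patient; the value here is a placeholder.
    """
    return "yes" if alpha_power > threshold else "no"

# Hypothetical per-trial feature values, not real measurements.
trials = np.array([2.3e-10, 0.4e-10, 1.8e-10])
print([decode_yes_no(p) for p in trials])  # -> ['yes', 'no', 'yes']
```

Stringing many such binary decisions together is what lets a patient spell out sentences on the typewriter-like machine mentioned above.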

It is this direct communication with the brain that raises most of the issues around it. Many people voice concerns over privacy, worried about the lack of security and the incredibly harmful consequences should a network be compromised and a virus introduced. Instead of a computer crashing and ceasing to function, it would be our entire consciousness.
Neurotechnology also raises ethical questions with regard to our mind, body and soul, complex philosophical concepts with many competing definitions. The definition usually draws on those values we describe as 'personal', not in the privacy sense, but in terms of one's personhood. These include self-consciousness, responsibility, planning of one's individual future, and similar ideas. Integrity and dignity of a person are the most relevant criteria for the ethical evaluation of technological interventions. Neurotechnological interventions are not ethically acceptable if a patient's continued personhood is at risk.
Currently, one of the main ethical concerns is that the addition of neurotechnology, especially character-altering implants, redefines one's personhood, and could give rise to a whole spectrum of ethical, legal and moral issues.
Legally, if a person is not a "person", why should they comply with our human laws? We don't hold animals to our current laws, so why should we hold 'people' who aren't 'human' to them?
The law dictates that even after neurotechnological intervention, a human is a human, and is obviously still punishable under the same laws and moral codes. The question isn't whether a human with neurotechnological implants is still a person, but rather whether their personal identity and personhood can be brought into question by the sudden and somewhat drastic changes in personality and character that can occur as an unwarranted side effect.
It is in these cases that a person’s ethical character is under question.
Can we really blame them for their choices? In the past, this hasn't been a question worth asking, as of course we can blame a person for their own personal choices and decisions. However, now the line is more blurred, as the implantation of neurotechnology, and the differences in character and morals that it can provoke, provide an arguable defence on this issue.
