Rise of Neurotechnology: Defend Against Brain Hackers Before It's Too Late

Advances in neurotechnology "put the freedom of the mind at risk," and necessitate the establishment of binding human rights laws to protect individual rights, researchers at the Swiss University of Basel have suggested.

Thoughtcrime, the nightmarish neologism popularized by George Orwell's seminal novel 1984, is the criminal act of holding unspoken beliefs or doubts that oppose or question authority.

In the fictional Airstrip One, the Thought Police could only detect thoughtcrime by rigorously monitoring the population's outward actions and statements for the slightest indication of dissent or disloyalty, every minute of every day — they had no way of knowing what, if any, recalcitrant views remained unspoken. The private thoughts of the public went unobserved.


Fast forward to 2017, and technological advances mean machines can feasibly know the contents of an individual's mind — and by extension, the privacy of one's brain is under threat.

Scientists at the University of Nebraska have developed a device that can infer an individual's political persuasion. Facebook's Building 8 project aims to develop an application that lets people type just by thinking. Brain imaging technology could be rolled out in courts within the next decade. Consumer firms use "neuromarketing" techniques to probe consumer thinking and structure bespoke campaigns.

Swiss-based ethicists Marcello Ienca and Roberto Andorno, writing in the journal Life Sciences, Society and Policy, view the burgeoning of such neurological applications as a positive development offering "unprecedented opportunities," and do not agonize over neurotechnology becoming "intricately embedded in our everyday life."

However, the pair are extremely concerned about the degree to which such tech is susceptible to abuse, both from without and within — from "malicious brain-hacking" and from "hazardous uses of medical neurotechnology."

If a neurodevice were successfully hacked, a third party could effectively eavesdrop on an individual's thoughts, cause physical and psychological damage, and even delete or steal memories or ideas. There are also ethical and legal concerns over the protection of the data these devices generate.


As a result, they believe the idea of mental integrity needs to be redefined, and have proposed four new human rights — the rights to cognitive liberty, mental privacy, mental integrity and psychological continuity. They warn current techniques are already so sophisticated that people's minds might be read or interfered with without their knowledge. Such intrusions need not even involve coercion, merely "unauthorized modifications" of a person's "psychological continuity."

If adopted, these rights could, for example, protect individuals from enforced technological enhancement — in November 2016, US military scientists reported that a procedure called transcranial direct current stimulation (tDCS) boosted the mental skills of personnel, and there are suggestions it could in time become obligatory for members of the armed forces.

​"The mind is considered to be the last refuge of personal freedom and self-determination, but advances in neural engineering, brain imaging and neurotechnology put the freedom of the mind at risk. Our proposed laws would give people the right to refuse coercive and invasive neurotechnology, protect the privacy of data collected by neurotechnology, and protect the physical and psychological aspects of the mind from damage by the misuse of neurotechnology," the authors write.

Presently, international human rights law does not mention neuroscience, although advances in biomedicine, such as those concerning human genetics, have often become entangled with the law. The authors acknowledge that despite seismic developments in neurotechnology, it is still perhaps premature to worry about mental hackers infiltrating people's minds and making off with their bank details. Still, they believe it is best to start thinking about these eventualities now, and to ensure protections are in place before such things can and do happen, rather than after. As they make clear, humans cannot afford for there to be a lag before security measures are implemented.

"Science-fiction can teach us a lot about the potential threat of technology. Neurotechnology featured in famous stories has in some cases already become a reality, while others are inching ever closer, or exist as military and commercial prototypes. We need to be prepared to deal with the impact these technologies will have on our personal freedom. It's always too early to assess a technology until it's suddenly too late," the authors concluded.

The researchers' suggestions are likely not to fall on deaf ears. Many of the firms involved in neurotechnology are extremely sensitive about the ethical implications of their work.

In unveiling Building 8, Facebook was quick to stress the division's products would not invade an individual's thoughts — a concern that is heightened in Facebook's case, given the existing privacy issues surrounding the social network. Moreover, the company has pledged to assemble an independent Ethical, Legal and Social Implications panel to oversee its developments, while institutional review boards will ensure test subjects aren't abused and research is conducted as safely as possible.
