Do We Have a Right to Mental Privacy and Cognitive Liberty? Consequences for customers and developers of tech products.

In his impressive article “Do We Have a Right to Mental Privacy and Cognitive Liberty?” (Scientific American, 2017), the Switzerland-based scientist Marcello Ienca describes the challenges privacy faces in light of new technologies that reach into neurological processes, into the brain itself.
All new technologies ignite fears. Just as scientists once warned, at the introduction of the railway, that speeds above 30 km/h would be deadly for humans, they warn today that travel at near light speed would be fatal. Despite all techno-optimism, skeptics help us recognize flaws in time and detect dangers before they become life-threatening.

New technologies that intrude into privacy
Ienca describes the moral and ethical risks of new technologies that reach into the deepest corners of privacy. The old assurance that “thoughts are free, no one can detect them,” long taken for granted, no longer holds. He cites various new methods and devices that intrude on this “inner privacy” without the knowledge or consent of the affected person. Ienca points to military applications, use in courtrooms, and predatory marketing studies: “With the growing availability of Internet-connected consumer-grade brain-computer interfaces, more and more individuals are becoming users of neurodevices.”

Ienca also points to the possible misuse of neurological devices for “brainjacking”. He therefore calls for a “reconceptualization of the right to mental integrity”, which he sees anchored in Article 3 of the EU’s Charter of Fundamental Rights, “as a right to mental health [which] should not only protect from mental illness but also from illicit and harmful manipulations of people’s neural activity through the misuse of neurotechnology.”

The political dead-end street
To protect individuals from losing their “right to psychological continuity [which] might preserve people’s personal identity and the continuity of their mental life from unconsented external alteration by third parties,” he cites an initiative of the European Parliament calling for a global ban on research “which seeks to apply knowledge of the chemical, electrical, (…) or other functioning of the human brain to the development of weapons which might enable any form of manipulation of human beings.”

Curiosity trumps morals
Shielding individuals, or humanity as a whole, from the misuse of technology by banning research has never worked in human history. It has never been possible to keep humans from following their curiosity, whether for good or for evil. Those who want to stop research to protect us from the evil uses unwillingly inhibit the research for the good ones as well.

The popular game in which children hold their hands in front of their eyes and exclaim excitedly, “You can’t see me!”, does not work in the world of adults. The more promising, tangible way is to address the alleged dangers as early as possible and to fight misuse relentlessly, following the old game of cops and robbers.

Enhanced privacy and morality have a price
For the technology world, this means developing products that meet customers’ demands. If customers want products that respect their privacy, the tech world should offer them. Ad-based business models date from the early period of the Internet, when customers were not willing to pay subscription fees. As the Internet evolves, business models evolve too, and those customers who regard enhanced privacy as part of their lifestyle and morality will be willing to pay a premium for it.
