Medical and non-medical chips are entering our lives despite the controversy surrounding them. Unfortunately, many people willing to participate in such programs overlook the civil-liberties and privacy issues that must be addressed by system designers, innovators, regulators, and legislators.
James Scott, a Senior Fellow at the Institute for Critical Infrastructure Technology (ICIT), tackles this subject in his recent paper on implantable devices. The paper is a comprehensive overview of the current state of medical and non-medical implants and the cybersecurity and privacy issues stemming from their use.
By 2025, advances in embedded technologies will outpace the augmentations displayed in media franchises like Mission Impossible, Kingsman, Blade Runner, and Ghost in the Shell.
The researcher foresees a much broader deployment of these systems in the near future. What he finds most important is that system designers and manufacturers pay close attention to security-by-design while implants are still at an early stage. In his words, failure to do so will end in the failure to mitigate the onslaught of privacy and security harms muddling mankind’s evolution.
While significant research is focused on the technical functionality of these devices, little concern is spared for the potential security risks or privacy harms that might emerge from the ubiquitous adoption of “uberveillance” systems that lack even fundamental security-by-design.
Thus, we are facing the need for “responsible regulatory legislation that does not pander to the whims of metadata curators and data brokers and that mandates security-by-design” more than ever.
The Risks Posed by Veillance Devices
First of all, let’s delve into the meaning of the veillance concept. As explained by Scott, the term is a broad concept that includes both surveillance (oversight) and sousveillance (undersight), as well as dataveillance and uberveillance. Wearable technology invites fears of surveillance and sousveillance, he says, and it is nearly impossible to disagree with him.
Two things should be noted here: surveillance led to sousveillance, and uberveillance led to dataveillance.
Eventually, data brokers and other interested parties will exploit big-data psychographic and demographic algorithms to predict our daily activities and influence our behaviors and perspectives. This scenario will end in the absolute monetization of the psychological profile, despite the potential security and privacy risks. Not only data brokers but also adversaries will leverage our private profiles:
Adversaries will similarly leverage exfiltrated information in multivector influence operations that undermine democracy, precisely target critical infrastructure personnel, and incite divisions in American society.
The Potential Risks of Wearables
In short, Scott believes that wearables are a gateway technology that lulls consumers into accepting never-ending monitoring and collection of biological and behavioral data. The collected data is capitalized on and monetized by the companies that developed the devices. Associated third parties and hackers also form part of the exploitation chain for this sensitive data.
The Potential Risks of Implants
Medical implants are often welcomed by users who see them as something necessary that will improve the quality of their lives. However, everything that is seen as positive or helpful in these implants will be leveraged by malicious actors if security is poor or, worse, non-existent.
Even though transhumanists such as Kevin Warwick have proven that implants like NFC chips, sensors, and neural devices are possible, easy to deploy, and have viable applications, they have failed to consider the commercialization of those technologies and the cascading security and privacy implications of the devices’ increasing ubiquity and pervasiveness, the researcher concludes.