Human augmentation tech requires dual-use oversight

Research into various human augmentation technologies is progressing with little regard for the ethical consequences, despite its clear dual use nature for both medical and military applications, says Drone Wars UK.

Encompassing everything from exoskeletons, robotic prostheses and bionic eye implants to brain-computer interfaces, neurochemical enhancement drugs and thought-controlled weapon systems, human augmentation can be defined as any medical or biological intervention designed to improve the performance of a person’s senses, motor functions or cognitive capabilities beyond what is medically necessary.

Underpinned by an array of other technologies such as artificial intelligence (AI), sensors, robotics and various data processing methods, human augmentation is considered “dual use”, meaning it can just as easily be deployed for legitimate medical uses as it can for lethal military purposes.

Set up in 2010 to undertake research and advocacy work around drones and other military technologies, Drone Wars said that while human augmentation technologies are still in the very early stages of development, the resources and attention being diverted to them by military bodies around the world – including the UK Ministry of Defence (MoD) – means there is now an urgent need for wider discussion, scrutiny and regulation.

“As the pace of development accelerates in these fields it seems inevitable that existing legal frameworks will be outpaced as new technologies create scenarios that they were never intended to address,” Drone Wars said in its May 2023 Cyborg dawn? report.

“The difficulties are compounded by the dual use nature of human augmentation, where applications with legitimate medical uses could equally be used to further the use of remote lethal military force.


“There is currently considerable discussion about the dangers of ‘killer robot’ autonomous weapon systems, but it is also time to start discussing how to control human enhancement and cyborg technologies which military planners have determined will also be developed,” it said.

Speaking during a webinar on 29 November, the report’s author, Peter Burt, said UK bodies such as the MoD’s Defence and Security Accelerator (DASA) are already funding the development of human augmentation prototypes, highlighting that this is something the government is taking seriously.

He added that use of these human augmentation technologies will not be limited to the battlefield, and could also be used by domestic policing or security forces to manage and surveil their own populations.

“Cyborg technologies would also be invaluable to clandestine surveillance, and operators will be able to seamlessly move through crowded streets, capturing intelligence, targeting conversations, and other information that would be digitally stored for later analysis and interpretation,” he said.

Burt said there are also “many, many [ethical] unknowns” around human augmentation in the military, such as what happens if a human and computer become so interlinked that the human effectively becomes the weapon, or at least part of the weapon.

MoD thinking

In terms of UK government thinking on the issue, Burt noted the MoD’s Development, Concepts and Doctrine Centre (DCDC) collaborated with the German Bundeswehr Office for Defence Planning in 2021 to produce a report called Human augmentation – The dawn of a new paradigm.

In it, the MoD identified four core technologies it believes will be integral to human augmentation in the future – genetic engineering, bioinformatics, brain interfaces and pharmaceuticals – and likened the human body to a “platform” for new technologies.


It also indicated that ethical considerations may be trumped by other concerns around, for example, national security.

“Defence cannot wait for ethical views to change before exploiting human augmentation; it must join the conversation now to ensure it is at the leading edge of this field,” it said, adding that there may also be a “moral obligation to augment people” when it can promote well-being or protect against novel threats such as new virus strains.

“The need for human augmentation may ultimately be dictated by national interest,” it said. “Countries may need to develop and use human augmentation or risk surrendering influence, prosperity and security to those who will.”

It further added that “the future of human augmentation should not … be decided by ethicists or public opinion”, and that governments will instead “need to develop a clear policy position that maximises the use of human augmentation in support of prosperity, safety and security, without undermining our values.”

It concluded that “international dialogue and regulation of human augmentation technologies will be critical”.

Managing dual use

Commenting on the dual use nature of human augmentation during the same webinar, Ben Taylor-Green – awarded a DPhil in early 2023 for his work on brain-computer interface unmanned aerial vehicle (BCIUAV) technology – said BCIUAV use cases are posited both as an “assistive technology for severely paralysed patients”, such as those with motor neurone disease, and as a thought-controlled “neuroweapon”.

He added that, based on his extensive review of the scientific literature around BCIUAVs, “most of the research is not done in the knowledge of, or in immediate proximity to … the risk of weapons innovation. By which, I mean that most [scientific] papers do not declare military funding or association, and nor do they describe any weaponised uses.”


However, Taylor-Green was clear that, among researchers of this technology for non-military purposes, “there is no evidence” indicating serious ongoing discussions about “the moral risks and material realities of contributing to weapons innovation” indirectly.

He suggested there needs to be serious engagement in the communities of scientists, engineers and others working on human augmentation-related technologies, so there can be “earnest” conversations about the ethics of conducting research that can so clearly be weaponised.

Drone Wars itself has suggested that governments can begin regulating by controlling specific use cases of neurotechnology, on the basis that it potentially poses the highest risk of any technology under the banner of human augmentation. It has also suggested setting up a research monitoring network for any human augmentation technologies with dual use potential, to act as an “early warning system” around the creation of neuroweapons.
