Greetings, and welcome to a brief news update on the third day of a weeklong immersion program for 30 participants from 20 countries, held alongside an open public forum with up to 1,000 onsite and online participants. Wednesday's sessions at the Maison de la Paix, home to the Graduate Institute and the Geneva Centre for Security Policy (GCSP), focused on the intersection of diplomacy with global health and "human augmentation," including ways of preventing the misuse of emerging technologies and the case for neurorights to protect against devices that can decode information from our brains. The skill-building sessions also examined the dual use of emerging technologies. The day began with the perspectives of Marie-Laure Salles, the Graduate Institute's director, and Thomas Greminger, GCSP's director and former secretary general of the Organization for Security and Co-operation in Europe; both spoke at length about their careers and professional insights during a background session.
The growing uses of powerful neurotechnologies "imply that a large amount of mental data can be potentially accessed and misused by third parties – other individuals, employers, the state," said University of Zurich Associate Professor Roberto Andorno, who coordinates the university's PhD program in biomedical ethics and law. "So, all this sounds like science fiction, but the fact is all this is happening now or will happen soon if no measures are taken," said Andorno, comparing the human brain to a medieval stone fortress (pictured in a slide) protecting the last refuge of personal freedom. "Even if this is done with the apparent consent of the worker or the student, I think it's really doubtful that their consent is really free."

He suggested a need either for globally accepted new "neurorights" for cognitive liberty – rights to mental privacy, mental integrity and psychological continuity, the last of which guards against unconsented personality or identity changes – or for a widespread reinterpretation of existing human rights to keep up with the emerging risks. "It's clear," he added, "at least some international standards are needed."

Stephanie Herrmann, an international staff attorney with Washington-based Perseus Strategies, said "the export and sale of brain data is going to be a new market." While the "right to agency" is the best-protected neuroright and people's identity the worst-protected, she said, overall it is the more "passive" brain interactions that are "less protected" under international human rights law, particularly given the rapid advancement of devices that can "communicate directly with the brain."
At a session on preventing the misuse of emerging technologies, Ricardo Chavarriaga, head of the Swiss office of the Confederation of Laboratories for AI Research in Europe, said that after first identifying the probability of "foreseeable scenarios," policy experts can start considering the individual and societal risks. What is needed is "an integrated toolkit" of neurotechnology governance internationally, said Chavarriaga, also a senior researcher at Zurich University of Applied Sciences (ZHAW) and an associate fellow at GCSP. "An anticipatory stance is required to not get surprised in the future."

Weaknesses of AI include issues such as "catastrophic forgetting" – a loss of ability on previously learned tasks after being trained on a new one – and misuses such as the manipulation of facial recognition or "deepfake" images and videos, said Sandra Scott-Hayward, an associate professor with the School of Electronics, Electrical Engineering and Computer Science at Queen's University Belfast.
Turning to engineering for humanitarian action and global health, a session sponsored by ETH Zurich, EPFL and the ICRC identified risks in areas such as digital supply chains, surveillance and data, and cybersecurity. "The high reliance on the digital supply chain is a challenge," said Grégoire Castella, head of the humanitarian division at EPFL's EssentialTech Centre. Thao Ton-That Whelan, an ICRC geodata analyst, said his organization focuses on regions where "available census data and population maps are not available."

And at a session on how digital technologies can support peacekeeping and other peace efforts even in the face of uncertainty, Andreas Hirblinger, a postdoctoral researcher at the Graduate Institute's Centre on Conflict, Development and Peacebuilding (CCDP), emphasized the need to make realistic assessments of where global governance is headed, envision potential malevolent uses, and take into account technology's impact on future conflict dynamics.

Another session, on "info wars" and the ethics of crisis communication, looked at tensions that came to light during the COVID-19 pandemic in particular. "It seems to me to be a system where it's harder to make corrections based on evidence," said bioethicist Nikola Biller-Andorno, professor and director of the University of Zurich's Institute of Biomedical Ethics. "Bringing, I think – in a challenging situation such as this pandemic was – bringing the kind of accumulating ever-changing scientific evidence together with policy decisions was a challenge here."
The value of evidence and authenticity in another form – building trusted relationships to help span the boundaries between science and policymaking – surfaced a day earlier at discussions organized by the Geneva Science-Policy Interface (GSPI). In short: relationships matter. Maxime Stauffer, a senior science-policy officer with GSPI, said it is not enough for scientists to rely on evidence to prompt the creation of science-based policies; they also must network, collaborate and partner with other trusted researchers, experts and policymakers. "Information is essentially not enough. We can have the best information possible. We can have the best evidence possible, the best synthesis done directly for policymakers – it is not sufficient to actually have policy change in light of the evidence," he said. "Policy change essentially does not respond proportionally to the information we bombard policymakers with."

And translating the complexity of scientific information into digestible information for policymaking "is necessary but it's not sufficient" by itself, according to Moira Faul, executive director of NORRAG (Network for international policies and cooperation in education and training), a Graduate Institute associate program. "It's about people, not concepts. If you don't have those relationships, and you're not building those relationships, then you're not going to be able to do the job that you're setting out for yourself."
Panel discussion organised in the framework of the first Science Diplomacy Week, staged by the Geneva Coalition on Anticipatory Science and Diplomacy. As CO…