Geneva Science and Diplomacy Anticipator


Polarity thinking – the ability to hold two or more opposing ideas in mind and see the value or benefit in each – is more useful for problem-solving than a narrower focus that pits risks against opportunities. And awareness of fearful prospects can open opportunities for action.

Those were among the chief messages of Wednesday’s sessions, which included a series of scenarios about the exponential growth of emerging AI-assisted technologies and their security implications and applications, such as killer robots, deepfake videos and the potential for new bioattacks.

“Is your challenge a problem to solve or a polarity that needs to be leveraged?” asked Christina Orisich, deputy executive director of the Geneva Centre for Security Policy, who described today’s global climate as an age of polycrises and interconnected challenges that tend to sap our brainpower. “In science diplomacy it is important to understand what polarities are involved and to develop a polarity thinking mindset with a both/and lens, as not all challenges are problems to be solved; some are polarities to be leveraged.”

Emily Munro, GCSP’s head of strategic anticipation, ran an interactive session on strategic foresight and facilitated a science diplomacy exercise built around four future scenarios of global governance. The session showed how foresight informs both future-oriented discussion and present action, and why trends must be distinguished from weak signals.

GESDA’s cohort of 30 participants in Geneva Science Diplomacy Week 2024 broke into small groups to identify unresolvable tensions around science, along with tasks that involve factors such as the potential for individual rewards, planning and promoting stability. What approach works for diplomats? “People, relationships and networks,” Orisich said, adding that confidence and the ability to learn or unlearn leadership skills are important. “If you focus on the positive, it will really help you go up to the greater focus.”

Participants, who had requested more discussion of potential AI governance and military frameworks of thinking, joined a private lunch talk with GCSP’s director, Ambassador Thomas Greminger, a former secretary general of the Organization for Security and Cooperation in Europe (OSCE), who shared lessons from a career focused on multilateralism.

Jean-Marc Rickli, GCSP’s head of global and emerging risks, and Federico Mantellassi, a GCSP research and project officer, oversaw a group exercise on the world’s efforts to create global governance for lethal autonomous weapons systems. Participants were divided into two working groups: policymakers role-playing as scientists, and scientists role-playing as policymakers. The groups debated an outright ban and the ethics of machines making life-and-death decisions beyond the realm of human intervention.

“We’re entering an era in which there is the possibility that artificial intelligence and autonomous weapons will be able to make life and death decisions over which humans will lose control. Our whole global justice system is designed to hold human beings accountable. These weapons can create an accountability gap,” Mantellassi told the group.


‘Making you aware’

The group of scientists role-playing as policymakers voted 11-4 for an outright ban to prevent dual military and civilian uses; the four dissenters favored more regulation instead. There was no support for unrestricted use. The group’s vote would not have been enough to trigger any action, however, because the real-world negotiations are consensus-based.

“I feel like there is just as much uncertainty with the technology about autonomous systems as there is about AI,” said Hunadi Millicent Mokgalaka, a public and private sector specialist for South Africa’s Council for Scientific and Industrial Research. “If we’re going to use the data to decide whether or not to kill a person, I don’t see how we can trust the data.”

The group of policymakers role-playing as scientists unanimously agreed to ban only those applications and technologies that lack clear procedures, frameworks or guidelines for overseeing their development and testing. The ban could be lifted, however, if such procedures, frameworks or guidelines were later put in place.

“We were not sure that the benefits outweighed the risks,” said Ian Horuzhiy, an economics and international security expert for Ukraine’s Ministry of Digital Transformation. “So we were leaning towards a precautionary approach.”

Rickli said a minority of individuals and groups harbor genuinely malicious intent; because it is impossible to negotiate with such people, care must be taken to ensure they do not gain access to certain technologies. He emphasized the security implications of technologies that grow exponentially, since the associated risks can grow in similar fashion.

“Our blind spot comes from the fact that we have lived in a linear world. But today’s changes are exponential,” he said. “By dropping the cost of access to technology, we’re mathematically increasing the number of users. It’s good in the area of consumer technology, but in the area of security, it’s not.”

The risks include those stemming from 3D-printed plastic guns, gene editing, biohacking, and facial recognition used in mass surveillance and social media algorithms. The quality and quantity of deepfake videos, first used by the porn industry, have surged. In recent weeks, pop star Katy Perry was falsely pictured at the annual Met Gala in New York, and a political consultant sent AI-generated robocalls mimicking U.S. President Joe Biden’s voice.

Other technologies carry similar risks. “Increasingly your brain is becoming the next battlefield. We’re entering a new era that increasingly will allow mind reading. The next phase is mind writing,” said Rickli.

But with polarity thinking and awareness, risks can turn to opportunities.

“It’s not about fear-mongering. The purpose of this talk is to make you think how we can make technologies more secure,” he said. “This presentation is making you aware, so that now you are much more critical. It’s about making young people more aware and making them more critical about the ways they use these technologies.”

All photos by Michael Chiribau, UNITAR Division for Multilateral Diplomacy  

Story by John Heilprin