Humanitarian drones and other oxymorons

Aid-by-drone, what’s not to like? Nick Dowson explores the ethical questions raised by the use of experimental tech in disasters

The Iraq war brought the oxymoronic ‘surgical strike’. Now military technologies have given us another odd coupling: the ‘humanitarian drone’.

It’s no joke: the repurposing of unmanned aerial vehicles (UAVs) for delivery of emergency aid is just one of the innovations being hailed for a new ‘cyber humanitarianism’, along with artificial intelligence (AI), big data and others.

UAVs entrench the remote management of crises, reinforcing a ‘bunkering’ tendency that disconnects aid workers from those they purport to serve

Of course, technological advances will, and already do, bring many benefits. Last year direct cash transfers to Somalis – many made via mobile payment transfers, or electronic vouchers – were credited with averting a famine threatening over six million people.

But many of these new technologies come with their own agenda and moral conundrums. Proponents say drones will soon be an effective way to deliver goods; trials are currently under way. But they also ‘often risk becoming a form of experimentation on populations in the Global South,’ warns Kristin Sandvik, a research professor at the Peace Research Institute Oslo.

With airspace and personal data much more heavily regulated in the Global North, trialling drones in crisis zones gives companies a way to test their technology – and to look good while doing so.

But even put to ‘humanitarian’ uses, these technologies can cause harm. As well as blurring the line between humanitarian and military responses – and occasionally crashing into civilians – UAVs entrench the remote management of crises, reinforcing a ‘bunkering’ tendency that disconnects aid workers from those they purport to serve, making them less accountable.

And here’s a paradox: while drones help provide surveillance from the skies that is potentially highly useful for humanitarians, the same absence of ‘good on-the-ground intelligence’ that is used as a rationale for drone use is also an ‘insurmountable obstacle’ for their use in providing ‘surgically precise’ aid, argue Sandvik and fellow researcher Kjersti Lohne.

Humanitarian AI, with its reliance on often opaque algorithms, risks being another form of experimentation without consent. One algorithm developed through machine learning to allocate refugees across resettlement locations is claimed by its researchers to have increased employment prospects for their (virtual) refugees by between 40 and 70 per cent. But by treating refugees as bits of data, this approach embeds in code migrants’ exclusion from decisions and policymaking, stripping them of agency.

Rather than understanding beneficiaries as complex beings with emotional needs – with a desire, for example, to live close to family members – it replaces them with blind numbers and technology for technology’s sake: one of the algorithm’s ‘discoveries’ in Switzerland was that French-speaking refugees did better when placed in French cantons. Common sense, you might say.

It gets darker. When the Ebola epidemic hit West Africa in 2014, both public and private organizations pushed for access to mobile records to track the spread of the disease. This turned out to be ‘both illegal and ineffective’, say Sandvik and Lohne: Ebola, after all, ‘requires the exchange of fluids to transmit – a behaviour that isn’t represented in call detail records’.

Rather than understanding beneficiaries as complex beings with emotional needs it replaces them with blind numbers and technology for technology’s sake

Not only was highly sensitive personal data – covering location and communication history – of millions of people disclosed, but time and resources that could have been used to save lives were diverted.

An unstated assumption behind many proposed technologies is that they are ‘designed for life outside the state’, says refugee studies researcher Tom Scott-Smith. Individualized solutions, like the water-purifying LifeStraw, which provides a clean personal water supply, dispense with the need for social institutions.

Meanwhile, providing technology offers a beachhead into the sector for private companies, and helps shift the humanitarian relationship from solidarity to consumption.

Ultimately, technology alone will not solve humanity’s ills. As Nick Guttmann, Head of Humanitarian Response at Christian Aid, argues, while cash technologies have proven useful in Somalia and elsewhere, detailed needs assessments and community engagement remain invaluable.

‘No innovation without representation’ should be the slogan, Scott-Smith argues. ‘If humanitarians are going to innovate on behalf of others, they should make sure there is maximum involvement.’

Nick Dowson is a writer and investigative journalist who has covered topics including health, technology and power, housing, transport and the environment.

This is a lengthened version of the article originally published in the April edition of New Internationalist, on humanitarianism.