Aid experts fear 'Cambridge Analytica moment' over big data

NEW ORLEANS (Thomson Reuters Foundation) - Are you HIV-positive? Gay? Frustrated with government assistance after a disaster? Aid workers are increasingly asking such questions of the people they help, hoping to better pinpoint their needs by gathering and analyzing data on them.

But at a conference of development experts in New Orleans this week, some have questioned whether data collection in developing countries is being handled appropriately.

Ulrich Mans, co-founder of HumanityX, a Dutch group that helps aid agencies implement digital innovation, said rapid growth in big data required aid agencies to become more “tech-savvy” and aware of the risks.

But that change is happening only slowly, he said.

“I would wish it was going faster,” he told the Thomson Reuters Foundation.

Some agencies are worried that their expanding stores of data may be vulnerable to online theft.

As a blueprint to avoid missteps, Mans pointed to a set of principles on privacy and data protection drafted by the United Nations Global Pulse, a data and development initiative.

In a report, it noted that the amount of digital data available globally was projected to increase by a staggering 40 percent per year.

The concerns raised at the international conference on evaluating resilience reflect sector-wide introspection amid a surge in digital data influencing the delivery of aid.

One development worker, who asked to remain anonymous because he was not authorized to speak to the press, warned that data sets are being sent, unprotected, by email “more often than you think”.

Some larger organizations are starting to bring in cyber-security experts, but that has yet to trickle down to people collecting and sharing data on the ground, he added.

“I’m really hoping that we don’t have our Cambridge Analytica moment,” he said, referring to a scandal that erupted earlier this year over a British political consulting firm that used data obtained from Facebook accounts without permission.

A 2017 report by the International Data Responsibility Group, a global network, warned that, if leaked, data about persecuted groups “could be used to target these individuals”.

Data that could hint at one ethnic minority group resenting another - for instance, in Myanmar’s Rohingya crisis - could prove dangerous in the wrong hands, said Lindsey Jones, a researcher with the London-based Overseas Development Institute.

More than 700,000 Rohingya refugees fled into Bangladesh from western Myanmar’s Rakhine state, U.N. agencies say, after Rohingya insurgent attacks on Myanmar security forces in August 2017 triggered a sweeping military crackdown.

Xavier Vollenweider of the Flowminder Foundation, a Swedish non-profit that provides data to strengthen public health and development in poor countries, said his organization was well aware of the privacy risks.

He works on a pilot project in western Nepal that uses mobile phone records to help identify households facing hunger.

How often and where calls are made, as well as their length, are among the signals that can indicate when families are struggling financially, he said. The information the group receives includes a caller’s location in relation to network towers.

“These are super-sensitive data,” said Vollenweider.

The pilot project’s 500 local participants have “trusted us” with it, he added.

For security, Flowminder anonymizes the data shared by the cellphone provider in real time, as agreed with the community.

It also encrypts and stores data on computers that never leave the company’s building, said Vollenweider.
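The article does not describe Flowminder’s actual pipeline, but keyed-hash pseudonymization is one common technique for this kind of real-time anonymization: phone numbers are replaced with irreversible identifiers so records can still be linked for analysis without exposing who made the call. The sketch below is purely illustrative; the field names, the key handling, and the `pseudonymize` helper are assumptions, not Flowminder’s code.

```python
import hashlib
import hmac

# Hypothetical secret key; in a real deployment it would live inside the
# telecom provider's secure environment and never be shared with analysts.
SECRET_KEY = b"operator-held-secret"

def pseudonymize(subscriber_id: str) -> str:
    """Replace a phone number with a keyed SHA-256 hash. The same number
    always maps to the same token (so records stay linkable), but the
    original number cannot be recovered without the key."""
    return hmac.new(SECRET_KEY, subscriber_id.encode(), hashlib.sha256).hexdigest()

def anonymize_record(record: dict) -> dict:
    """Strip direct identifiers from a call-detail record, keeping only the
    coarse fields an analyst needs: tower ID, timestamp, call duration."""
    return {
        "caller": pseudonymize(record["caller"]),
        "tower_id": record["tower_id"],      # location only at tower granularity
        "timestamp": record["timestamp"],
        "duration_s": record["duration_s"],
    }

raw = {
    "caller": "+977-9800000000",
    "tower_id": "TOWER-042",
    "timestamp": "2018-05-01T09:30:00Z",
    "duration_s": 85,
}
safe = anonymize_record(raw)
```

Because the hash is keyed, an attacker who steals the anonymized data set cannot simply hash every possible phone number to reverse the mapping; that would require the operator’s secret as well.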

“Ten years ago, you would have gone to a telecommunications firm with a big hard drive, and you left (with it) the next day,” he said.