The Virus of Misinformation: How Not to Use Technology in a Pandemic

by Rahul Chandran | Apr 9, 2020


With Michelle Finneran Dennedy and Faine Greenwood

Last Friday, the Secretary-General of the United Nations said: “Our common enemy is a virus, but our enemy is also a growing surge of misinformation.”

The day before, The Economist revealed that the NHS was working with Palantir, a secretive data-analytics firm, to use its Foundry technology, "which analyses records to deliver a 'single source of truth'."

The day before that, a viral 'heat-map' of Florida cellphones and their travel paths showed how travel could spread COVID-19. To us (and others), it showed something else: the speed with which technology is being embraced as a tracking tool is cause for serious alarm.

All of us have worked at the intersection of technology, disasters, and the challenges of scaled responses. We have supported communities and organisations responding around the world, through multiple disasters and disease outbreaks, and want to share some of those hard-won lessons and insights – before more money is wasted and more lives are put at risk.

Asking the right questions

First, before you start doing anything with technology, consider these three basic questions:

1. What is the outcome the community wants to accomplish – are we building technology for its own sake, or solving something that the people who do the work would themselves call a problem?

2. How does technology help us achieve this outcome – is there a specific information or workflow problem that technology solves?

3. Might the cost of developing this technology – in resources and co-ordination – be better applied elsewhere?

Then, if you’ve decided that technology is useful (hint: a hackathon is generally not useful), adopt a ‘minimum viable data’ approach, print “When in doubt, don’t” and “It’s not your information” on a large sign, and then carefully consider three further questions:

1. What is the least privacy-invasive approach we could take?

2. Have we considered all the risks that our approach creates, and modelled the worst-case scenarios?

3. Do the people we are trying to help understand what they are giving up and agree that no.1 justifies no.2? 

Then, and only then, with those answers in hand, should you reach out to people who actually work on the frontlines of what you are trying to help, and ask for a conversation. Because they are busy, and this is ugly. 
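To make the ‘minimum viable data’ idea concrete, here is a minimal, hypothetical sketch (our own illustration, not a design from any real deployment): instead of storing individual movement traces, keep only coarse grid-cell headcounts, and suppress any cell containing fewer than K people so no small group can be singled out. The names, threshold, and grid size are all illustrative assumptions.

```python
# Hypothetical 'minimum viable data' aggregation step.
# K and CELL_DEG are illustrative values, not recommendations.

K = 5            # suppression threshold: cells with < K people are dropped
CELL_DEG = 0.1   # grid resolution in degrees (~11 km) -- coarse by design

def aggregate(records):
    """records: iterable of (user_id, lat, lon) tuples.

    Returns {cell: distinct-user count}, dropping sparse cells.
    Raw identifiers and exact coordinates never leave this function.
    """
    users_per_cell = {}
    for user_id, lat, lon in records:
        # Snap each point to a coarse grid cell, discarding precision.
        cell = (int(lat // CELL_DEG), int(lon // CELL_DEG))
        users_per_cell.setdefault(cell, set()).add(user_id)
    # Publish only the aggregate counts, suppressing small cells.
    return {cell: len(users)
            for cell, users in users_per_cell.items()
            if len(users) >= K}
```

The point of the sketch is the shape of the pipeline, not the numbers: the published output answers the operational question (“where are crowds forming?”) while individual trajectories are never stored or shared.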

Innovations: useful or U.S.E.L.E.S.S?

There are too many credulous repetitions of technological claims, which we classify as the Unverified Saving Everyone’s LivES Spin (USELESS, because we like acronyms). We’ve all seen this countless times – in the Philippines after Typhoon Haiyan/Yolanda, during the Ebola crisis, and more. But far more dangerous is the casual surrender of hard-fought privacy and civil liberties to unproven tracking technologies, in the absence of rigorous safeguards.

Most of us would trade these innovations for a robust data-management infrastructure and data-protection training, coupled with basic technology resources that would allow us to serve communities in the way they want to be served. We don’t claim to be perfect, or anywhere near good enough at that – there’s a lot of room for improvement. 

But we can also confidently say that we know emergency response and technology well. And we know that for technological solutions to scale and work, they need manual labour, personal relationships, and multi-year commitments that allow for pivots and failures (the idea of a book-delivery company pivoting to diapers, platforms, groceries, and then web services brings laughter from public-sector innovators). We know, sadly, that the value of a clever idea or app in the absence of these rounds off to zero. Always.

Stop ‘disrupting’, start listening

Technology has done wonders. Most of those wonders emerge from a deep knowledge of markets and an understanding of their consequences. Casual interventions have a long history of doing dramatic harm – Facebook on misinformation, or Airbnb on rental markets. Where technology has little knowledge, it would do much better to set aside its ego, stop ‘disrupting’, and place its significant resources – fiscal, human, and intellectual – in support of those who know. Pay attention to the deep power dynamics in play, and how these enable exclusion and violence.

Please: don’t hold a hackathon. Don’t put a map online showing where people are who need help or offer help. Don’t share people’s data without their consent and understanding. But most of all, ask. We’re here, we’ve failed, and we’ve learned. We want to work with you, but you need to work with us. 

Co-authors

Michelle Finneran Dennedy is the CEO of Drumwave and the former Vice President and Chief Privacy Officer at Cisco.

Faine Greenwood is an independent expert on technology and aid, a former researcher at the Harvard Humanitarian Initiative, and co-author of The Signal Code.

Rahul Chandran writes here in his personal capacity.

Author

  • Rahul Chandran was the Executive Director of the Global Alliance for Humanitarian Innovation, and has previously worked at the intersection of peacekeeping, humanitarian, and development efforts across the globe, as well as systems reform and strategic planning efforts at the United Nations. Prior to this, he was the Deputy Director of the Center on International Cooperation, the Director of the Afghanistan Reconstruction Programme, and an advisor to the United Nations Foundation, the OECD, the Clinton Global Initiative, and numerous other institutions. He has previously worked in civil rights, in documentary film, and on a number of start-ups. He writes here in his personal capacity.

