Aarathi Krishnan: 5 ethical principles for digitizing humanitarian aid

Recorded at: December 01, 2021
Event: TEDWomen 2021
Duration (min:sec): 11:37
Video Type: TED Stage Talk
Words per minute: 141.35 (very slow)
Readability (FK): 33.86 (very difficult)
Speaker: Aarathi Krishnan

Official TED page for this talk

Synopsis

Over the last decade, humanitarian organizations have digitized many of their systems, from registering refugees with biometric IDs to transporting cargo via drones. This has helped deliver aid around the world, but it's also brought new risks to the people it's meant to protect. Tech and human rights ethicist Aarathi Krishnan points to the dangers of digitization -- like sensitive data getting into the hands of the wrong people -- and lays out five ethical principles to help inform humanitarian tech innovation.

00:04 Sociologist Zeynep Tufekci once said that history is full of massive examples of harm caused by people with great power who felt that just because they felt themselves to have good intentions, that they could not cause harm.
00:24 In 2017, Rohingya refugees started to flee Myanmar into Bangladesh due to a crackdown by the Myanmar military, an act that the UN subsequently described as carrying genocidal intent.
00:39 As they started to arrive in camps, they had to register for a range of services.
00:45 One of these was registering for a government-backed digital biometric identification card.
00:51 They weren't actually given the option to opt out.
00:56 In 2021, Human Rights Watch accused international humanitarian agencies of sharing improperly collected information about Rohingya refugees with the Myanmar government without appropriate consent.
01:11 The information shared didn't just contain biometrics.
01:15 It contained information about family makeup, relatives overseas, where they were originally from.
01:23 Fearing retaliation by the Myanmar government, some went into hiding.
01:29 Targeted identification of persecuted peoples has long been a tactic of genocidal regimes.
01:37 But now that data is digitized, it is faster to access, quicker to scale and more readily shared.
01:45 This was a failure on a multitude of fronts: institutional, governance, moral.
01:52 I have spent 15 years of my career working in humanitarian aid.
01:56 From Rwanda to Afghanistan.
01:59 What is humanitarian aid, you might ask?
02:01 In its simplest terms, it's the provision of emergency care to those that need it the most at desperate times.
02:08 Post-disaster, during a crisis. Food, water, shelter.
02:14 I have worked within very large humanitarian organizations, from leading multicountry global programs to designing drone innovations for disaster management across small island states.
02:29 I have sat with communities in the most fragile of contexts, where conversations about the future are the first ones they've ever had.
02:40 And I have designed global strategies to prepare humanitarian organizations for these same futures.
02:47 And the one thing I can say is that humanitarians, we have embraced digitalization at an incredible speed over the last decade, moving from tents and water cans, which we still use, by the way, to AI, big data, drones, biometrics.
03:06 These might seem relevant, logical, needed, even sexy to technology enthusiasts.
03:13 But what this actually amounts to is the deployment of untested technologies on vulnerable populations without appropriate consent.
03:23 And this gives me pause.
03:26 I pause because the agonies we are facing today as a global humanity didn't just happen overnight.
03:33 They happened as a result of our shared history of colonialism. Humanitarian technology innovations are inherently colonial, often designed for, and in the supposed good of, groups of people seen as outside of technology themselves, people who are often not legitimately recognized as being able to provide their own solutions.
03:58 And so, as a humanitarian myself, I ask this question: in our quest to do good in the world, how can we ensure that we do not lock people into future harm, future indebtedness and future inequity as a result of these actions?
04:17 It is why I now study the ethics of humanitarian tech innovation.
04:21 And this isn't just an intellectually curious pursuit.
04:26 It's a deeply personal one.
04:29 It is driven by the belief that it is often people who look like me, who come from the communities I come from, historically excluded and marginalized, who are spoken for and denied a voice in the choices available to us for our future.
04:47 I stand here on the shoulders of all those that have come before me, and in obligation to all of those that will come after me, to say to you that good intentions alone do not prevent harm, and good intentions alone can cause harm.
05:06 I'm often asked, what do I see ahead of us in this next 21st century?
05:11 And if I had to sum it up: deep uncertainty, a dying planet, distrust, pain.
05:20 And in times of great volatility, we as human beings, we yearn for a balm.
05:26 And digital futures are exactly that, a balm.
05:30 We look at them in all of their possibility, as if they could soothe all that ails us, like a logical inevitability.
05:39 In recent years, reports have started to flag the new types of risks emerging from technology innovations.
05:48 One of these is how data collected on vulnerable individuals can actually be used against them in retaliation, posing risk not just to them, but to their families and their communities.
06:05 We saw these risks become a truth with the Rohingya.
06:09 And very, very recently, in August 2021, as Afghanistan fell to the Taliban, it also came to light that biometric data collected on Afghans by the US military and the Afghan government, and used by a variety of actors, was now in the hands of the Taliban.
06:29 Journalists' houses were searched.
06:32 Afghans desperately raced against time to erase their digital history online.
06:38 Technologies of empowerment then become technologies of disempowerment.
06:45 It is because these technologies are designed on a certain set of societal assumptions, embedded in markets and then filtered through capitalist considerations.
06:56 But technologies created in one context and then parachuted into another will always fail, because they are based on assumptions of how people lead their lives.
07:08 And whilst here, you and I may be relatively comfortable providing a fingertip scan to perhaps go to the movies, we cannot extrapolate that out to the level of safety one would feel while standing in line, having to give up that little bit of data about themselves in order to access food rations.
07:31 Humanitarians assume that technology will liberate humanity, but without any due consideration of the issues of power, exploitation and harm that can occur in the process.
07:46 Instead, we rush to solutionizing, a form of magical thinking that assumes that just by deploying shiny solutions, we can solve the problem in front of us without any real analysis of underlying realities.
08:03 These are tools at the end of the day, and tools, like a chef's knife, in the hands of some create a beautiful meal, and in the hands of others, devastation.
08:17 So how do we ensure that we do not design the inequities of our past into our digital futures?
08:26 And I want to be clear about one thing.
08:28 I'm not anti-tech. I am anti-dumb tech.
08:31 (Laughter) (Applause)
08:38 The limited imaginings of the few should not colonize the radical re-imaginings of the many.
08:45 So how then do we ensure that we design an ethical baseline, so that the liberation that this promises is not just for a privileged few, but for all of us?
08:59 There are a few examples that can point to a way forward.
09:03 I love the work of Indigenous AI, which, instead of drawing from Western values and philosophies, draws from Indigenous protocols and values to embed into AI code.
09:15 I also really love the work of Nia Tero, an Indigenous co-led organization that works with Indigenous communities to map their own well-being and territories, as opposed to other people coming in to do it on their behalf.
09:29 I've learned a lot from the Satellite Sentinel Project back in 2010, which is a slightly different example.
09:36 The project started essentially to map atrocities through remote sensing technologies, satellites, in order to be able to predict and potentially prevent them.
09:48 Now the project wound down after a few years, for a variety of reasons, one of which was that it couldn't actually generate action.
09:57 But the second, and probably the most important, was that the team realized they were operating without an ethical net.
10:07 And without ethical guidelines in place, it was an open question whether what they were doing was helpful or harmful.
10:19 And so they decided to wind down before creating harm.
10:24 In the absence of legally binding ethical frameworks to guide our work, I have been working on a range of ethical principles to help inform humanitarian tech innovation, and I'd like to put forward a few of these here for you today.
10:41 One: Ask.
10:43 Which groups of humans will be harmed by this and when?
10:48 Assess: Who does this solution actually benefit?
10:53 Interrogate: Was appropriate consent obtained from the end users?
11:00 Consider: What must we gracefully exit out of to be fit for these futures?
11:07 And imagine: What future good might we foreclose if we implemented this action today?
11:16 We are accountable for the futures that we create.
11:20 We cannot absolve ourselves of the responsibilities and accountabilities of our actions if our actions actually cause harm to those that we purport to protect and serve.
11:32 Another world is absolutely, radically possible.
11:37 Thank you.
11:39 (Applause)