Stephanie Busari: How fake news does real harm

Recorded at: February 11, 2017
Event: TEDLagos Ideas Search
Duration (min:sec): 06:13
Video type: TED Stage Talk
Words per minute: 164.85 (slow)
Readability (FK): 61.53 (easy)
Speaker: Stephanie Busari
Description: Nigerian journalist

On April 14, 2014, the terrorist organization Boko Haram kidnapped more than 200 schoolgirls from the town of Chibok, Nigeria. Around the world, the crime became epitomized by the slogan #BringBackOurGirls -- but in Nigeria, government officials called the crime a hoax, confusing and delaying efforts to rescue the girls. In this powerful talk, journalist Stephanie Busari points to the Chibok tragedy to explain the deadly danger of fake news and what we can do to stop it.

00:12 I want to tell you a story about a girl.
00:16 But I can't tell you her real name.
00:18 So let's just call her Hadiza.
00:21 Hadiza is 20.
00:23 She's shy,
00:24 but she has a beautiful smile that lights up her face.
00:28 But she's in constant pain.
00:32 And she will likely be on medication for the rest of her life.
00:36 Do you want to know why?
00:39 Hadiza is a Chibok girl,
00:42 and on April 14, 2014, she was kidnapped
00:45 by Boko Haram terrorists.
00:48 She managed to escape, though,
00:50 by jumping off the truck that was carrying the girls.
00:54 But when she landed, she broke both her legs,
00:57 and she had to crawl on her tummy to hide in the bushes.
01:00 She told me she was terrified that Boko Haram would come back for her.
01:05 She was one of 57 girls who would escape by jumping off trucks that day.
01:10 This story, quite rightly, caused ripples
01:12 around the world.
01:14 People like Michelle Obama, Malala and others
01:17 lent their voices in protest,
01:19 and at about the same time -- I was living in London at the time --
01:22 I was sent from London to Abuja to cover the World Economic Forum
01:27 that Nigeria was hosting for the first time.
01:30 But when we arrived, it was clear that there was only one story in town.
01:35 We put the government under pressure.
01:37 We asked tough questions about what they were doing
01:40 to bring these girls back.
01:42 Understandably,
01:43 they weren't too happy with our line of questioning,
01:46 and let's just say we received our fair share of "alternative facts."
01:50 (Laughter)
01:53 Influential Nigerians were telling us at the time
01:56 that we were naïve,
01:58 we didn't understand the political situation in Nigeria.
02:02 But they also told us
02:04 that the story of the Chibok girls
02:07 was a hoax.
02:10 Sadly, this hoax narrative has persisted,
02:12 and there are still people in Nigeria today
02:15 who believe that the Chibok girls were never kidnapped.
02:18 Yet I was talking to people like these --
02:22 devastated parents,
02:23 who told us that on the day Boko Haram kidnapped their daughters,
02:28 they ran into the Sambisa Forest after the trucks carrying their daughters.
02:32 They were armed with machetes, but they were forced to turn back
02:36 because Boko Haram had guns.
02:39 For two years, inevitably, the news agenda moved on,
02:42 and for two years,
02:44 we didn't hear much about the Chibok girls.
02:47 Everyone presumed they were dead.
02:50 But in April last year,
02:52 I was able to obtain this video.
02:54 This is a still from the video
02:56 that Boko Haram filmed as a proof of life,
03:00 and through a source, I obtained this video.
03:03 But before I could publish it,
03:05 I had to travel to the northeast of Nigeria
03:08 to talk to the parents, to verify it.
03:11 I didn't have to wait too long for confirmation.
03:15 One of the mothers, when she watched the video, told me
03:18 that if she could have reached into the laptop
03:21 and pulled out her child from the laptop,
03:25 she would have done so.
03:28 For those of you who are parents, like myself, in the audience,
03:31 you can only imagine the anguish
03:34 that that mother felt.
03:37 This video would go on to kick-start negotiation talks with Boko Haram.
03:43 And a Nigerian senator told me that because of this video
03:47 they entered into those talks,
03:50 because they had long presumed that the Chibok girls were dead.
03:54 Twenty-one girls were freed in October last year.
03:59 Sadly, nearly 200 of them still remain missing.
04:03 I must confess that I have not been a dispassionate observer
04:07 covering this story.
04:08 I am furious when I think about the wasted opportunities
04:14 to rescue these girls.
04:15 I am furious when I think about what the parents have told me,
04:19 that if these were daughters of the rich and the powerful,
04:21 they would have been found much earlier.
04:26 And I am furious
04:28 that the hoax narrative,
04:30 I firmly believe,
04:31 caused a delay;
04:34 it was part of the reason for the delay in their return.
04:38 This illustrates to me the deadly danger of fake news.
04:43 So what can we do about it?
04:45 There are some very smart people,
04:47 smart engineers at Google and Facebook,
04:50 who are trying to use technology to stop the spread of fake news.
04:55 But beyond that, I think everybody here -- you and I --
04:59 we have a role to play in that.
05:02 We are the ones who share the content.
05:04 We are the ones who share the stories online.
05:07 In this day and age, we're all publishers,
05:10 and we have responsibility.
05:12 In my job as a journalist,
05:15 I check, I verify.
05:17 I trust my gut, but I ask tough questions.
05:21 Why is this person telling me this story?
05:24 What do they have to gain by sharing this information?
05:27 Do they have a hidden agenda?
05:30 I really believe that we must all start to ask tougher questions
05:36 of information that we discover online.
05:41 Research shows that some of us don't even read beyond headlines
05:47 before we share stories.
05:49 Who here has done that?
05:51 I know I have.
05:54 But what if
05:57 we stopped taking information that we discover at face value?
06:02 What if we stop to think about the consequence
06:05 of the information that we pass on
06:08 and its potential to incite violence or hatred?
06:12 What if we stop to think about the real-life consequences
06:16 of the information that we share?
06:19 Thank you very much for listening.
06:21 (Applause)