Recorded at | March 18, 2014 |
---|---|
Event | TED2014 |
Duration (min:sec) | 34:47 |
Video Type | TED Stage Talk |
Words per minute | 176.92 (medium) |
Readability (FK) | 51.95 (medium) |
Speaker | Edward Snowden |
Country | United States of America, Russia |
Occupation | whistleblower, dissident |
Description | American whistleblower and former NSA contractor (born 1983) |
Synopsis
Appearing by telepresence robot, Edward Snowden speaks at TED2014 about surveillance and Internet freedom. The right to data privacy, he suggests, is not a partisan issue, but requires a fundamental rethink of the role of the internet in our lives — and the laws that protect it. "Your rights matter," he says, "because you never know when you're going to need them." Chris Anderson interviews, with special guest Tim Berners-Lee.
1 | 00:13 | Chris Anderson: The rights of citizens, | ||
2 | 00:15 | the future of the Internet. | ||
3 | 00:17 | So I would like to welcome to the TED stage | ||
4 | 00:20 | the man behind those revelations, | ||
5 | 00:23 | Ed Snowden. | ||
6 | 00:25 | (Applause) | ||
7 | 00:29 | Ed is in a remote location somewhere in Russia | ||
8 | 00:33 | controlling this bot from his laptop, | ||
9 | 00:36 | so he can see what the bot can see. | ||
10 | 00:40 | Ed, welcome to the TED stage. | ||
11 | 00:42 | What can you see, as a matter of fact? | ||
12 | 00:45 | Edward Snowden: Ha, I can see everyone. | ||
13 | 00:47 | This is amazing. | ||
14 | 00:49 | (Laughter) | ||
15 | 00:53 | CA: Ed, some questions for you. | ||
16 | 00:56 | You've been called many things | ||
17 | 00:57 | in the last few months. | ||
18 | 00:59 | You've been called a whistleblower, a traitor, | ||
19 | 01:04 | a hero. | ||
20 | 01:05 | What words would you describe yourself with? | ||
21 | 01:09 | ES: You know, everybody who is involved | ||
22 | 01:12 | with this debate | ||
23 | 01:13 | has been struggling over me and my personality | ||
24 | 01:16 | and how to describe me. | ||
25 | 01:19 | But when I think about it, | ||
26 | 01:21 | this isn't the question that we should be struggling with. | ||
27 | 01:24 | Who I am really doesn't matter at all. | ||
28 | 01:28 | If I'm the worst person in the world, | ||
29 | 01:30 | you can hate me and move on. | ||
30 | 01:32 | What really matters here are the issues. | ||
31 | 01:35 | What really matters here is the kind of government we want, | ||
32 | 01:38 | the kind of Internet we want, | ||
33 | 01:39 | the kind of relationship between people | ||
34 | 01:42 | and societies. | ||
35 | 01:43 | And that's what I'm hoping the debate will move towards, | ||
36 | 01:46 | and we've seen that increasing over time. | ||
37 | 01:48 | If I had to describe myself, | ||
38 | 01:50 | I wouldn't use words like "hero." | ||
39 | 01:52 | I wouldn't use "patriot," and I wouldn't use "traitor." | ||
40 | 01:54 | I'd say I'm an American and I'm a citizen, | ||
41 | 01:57 | just like everyone else. | ||
42 | 01:59 | CA: So just to give some context | ||
43 | 02:01 | for those who don't know the whole story -- | ||
44 | 02:03 | (Applause) — | ||
45 | 02:07 | this time a year ago, you were stationed in Hawaii | ||
46 | 02:11 | working as a consultant to the NSA. | ||
47 | 02:14 | As a sysadmin, you had access | ||
48 | 02:16 | to their systems, | ||
49 | 02:18 | and you began revealing certain classified documents | ||
50 | 02:23 | to some handpicked journalists | ||
51 | 02:26 | leading the way to June's revelations. | ||
52 | 02:27 | Now, what propelled you to do this? | ||
53 | 02:33 | ES: You know, | ||
54 | 02:36 | when I was sitting in Hawaii, | ||
55 | 02:38 | and the years before, when I was working in the intelligence community, | ||
56 | 02:40 | I saw a lot of things that had disturbed me. | ||
57 | 02:44 | We do a lot of good things in the intelligence community, | ||
58 | 02:47 | things that need to be done, | ||
59 | 02:49 | and things that help everyone. | ||
60 | 02:51 | But there are also things that go too far. | ||
61 | 02:53 | There are things that shouldn't be done, | ||
62 | 02:55 | and decisions that were being made in secret | ||
63 | 02:58 | without the public's awareness, | ||
64 | 02:59 | without the public's consent, | ||
65 | 03:01 | and without even our representatives in government | ||
66 | 03:04 | having knowledge of these programs. | ||
67 | 03:08 | When I really came to struggle with these issues, | ||
68 | 03:12 | I thought to myself, | ||
69 | 03:14 | how can I do this in the most responsible way, | ||
70 | 03:17 | that maximizes the public benefit | ||
71 | 03:20 | while minimizing the risks? | ||
72 | 03:23 | And out of all the solutions that I could come up with, | ||
73 | 03:26 | out of going to Congress, | ||
74 | 03:28 | when there were no laws, | ||
75 | 03:29 | there were no legal protections | ||
76 | 03:31 | for a private employee, | ||
77 | 03:33 | a contractor in intelligence like myself, | ||
78 | 03:36 | there was a risk that I would be buried along with the information | ||
79 | 03:40 | and the public would never find out. | ||
80 | 03:42 | But the First Amendment of the United States Constitution | ||
81 | 03:45 | guarantees us a free press for a reason, | ||
82 | 03:48 | and that's to enable an adversarial press, | ||
83 | 03:52 | to challenge the government, | ||
84 | 03:53 | but also to work together with the government, | ||
85 | 03:56 | to have a dialogue and debate about how we can | ||
86 | 03:58 | inform the public about matters of vital importance | ||
87 | 04:04 | without putting our national security at risk. | ||
88 | 04:07 | And by working with journalists, | ||
89 | 04:09 | by giving all of my information | ||
90 | 04:11 | back to the American people, | ||
91 | 04:13 | rather than trusting myself to make | ||
92 | 04:15 | the decisions about publication, | ||
93 | 04:18 | we've had a robust debate | ||
94 | 04:20 | with a deep investment by the government | ||
95 | 04:23 | that I think has resulted in a benefit for everyone. | ||
96 | 04:28 | And the risks that have been threatened, | ||
97 | 04:32 | the risks that have been played up | ||
98 | 04:34 | by the government | ||
99 | 04:36 | have never materialized. | ||
100 | 04:37 | We've never seen any evidence | ||
101 | 04:39 | of even a single instance of specific harm, | ||
102 | 04:43 | and because of that, | ||
103 | 04:44 | I'm comfortable with the decisions that I made. | ||
104 | 04:46 | CA: So let me show the audience | ||
105 | 04:49 | a couple of examples of what you revealed. | ||
106 | 04:51 | If we could have a slide up, and Ed, | ||
107 | 04:53 | I don't know whether you can see, | ||
108 | 04:55 | the slides are here. | ||
109 | 04:56 | This is a slide of the PRISM program, | ||
110 | 04:58 | and maybe you could tell the audience | ||
111 | 05:01 | what that was that was revealed. | ||
112 | 05:03 | ES: The best way to understand PRISM, | ||
113 | 05:06 | because there's been a little bit of controversy, | ||
114 | 05:07 | is to first talk about what PRISM isn't. | ||
115 | 05:11 | Much of the debate in the U.S. has been about metadata. | ||
116 | 05:14 | They've said it's just metadata, it's just metadata, | ||
117 | 05:16 | and they're talking about a specific legal authority | ||
118 | 05:19 | called Section 215 of the Patriot Act. | ||
119 | 05:22 | That allows sort of a warrantless wiretapping, | ||
120 | 05:25 | mass surveillance of the entire country's | ||
121 | 05:27 | phone records, things like that -- | ||
122 | 05:30 | who you're talking to, | ||
123 | 05:31 | when you're talking to them, | ||
124 | 05:33 | where you traveled. | ||
125 | 05:34 | These are all metadata events. | ||
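The metadata/content distinction Snowden draws here can be made concrete. Below is a minimal sketch (the record fields, names, and values are hypothetical illustrations, not drawn from any NSA document): even with no message content at all, a handful of call-metadata records already reveals who contacted whom, when, and roughly where from.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallRecord:
    """Metadata only: no audio, no words -- yet still revealing."""
    caller: str
    callee: str
    start: datetime
    duration_s: int
    cell_tower: str  # coarse location of the caller

# Two hypothetical records for one person.
records = [
    CallRecord("alice", "oncology-clinic", datetime(2014, 3, 1, 9, 5), 780, "tower-12"),
    CallRecord("alice", "insurance-co", datetime(2014, 3, 1, 10, 0), 420, "tower-12"),
]

# Without reading a single word of any call, the pattern alone
# (who, when, in what order) suggests a private medical event.
contacts = {r.callee for r in records}
assert "oncology-clinic" in contacts
```

This is the sense in which "just metadata" understates what Section 215 collection exposes.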
126 | 05:37 | PRISM is about content. | ||
127 | 05:40 | It's a program through which the government could | ||
128 | 05:42 | compel corporate America, | ||
129 | 05:44 | it could deputize corporate America | ||
130 | 05:48 | to do its dirty work for the NSA. | ||
131 | 05:51 | And even though some of these companies did resist, | ||
132 | 05:54 | even though some of them -- | ||
133 | 05:56 | I believe Yahoo was one of them — | ||
134 | 05:57 | challenged them in court, they all lost, | ||
135 | 06:00 | because it was never tried by an open court. | ||
136 | 06:03 | They were only tried by a secret court. | ||
137 | 06:06 | And something that we've seen, | ||
138 | 06:07 | something about the PRISM program that's very concerning to me is, | ||
139 | 06:10 | there's been a talking point in the U.S. government | ||
140 | 06:12 | where they've said 15 federal judges | ||
141 | 06:16 | have reviewed these programs and found them to be lawful, | ||
142 | 06:18 | but what they don't tell you | ||
143 | 06:21 | is those are secret judges | ||
144 | 06:24 | in a secret court | ||
145 | 06:26 | based on secret interpretations of law | ||
146 | 06:29 | that have considered 34,000 warrant requests | ||
147 | 06:33 | over 33 years, | ||
148 | 06:35 | and in 33 years only rejected | ||
149 | 06:38 | 11 government requests. | ||
150 | 06:41 | These aren't the people that we want deciding | ||
151 | 06:43 | what the role of corporate America | ||
152 | 06:45 | in a free and open Internet should be. | ||
153 | 06:48 | CA: Now, this slide that we're showing here | ||
154 | 06:50 | shows the dates in which | ||
155 | 06:52 | different technology companies, Internet companies, | ||
156 | 06:55 | are alleged to have joined the program, | ||
157 | 06:57 | and where data collection began from them. | ||
158 | 07:00 | Now, they have denied collaborating with the NSA. | ||
159 | 07:05 | How was that data collected by the NSA? | ||
160 | 07:10 | ES: Right. So the NSA's own slides | ||
161 | 07:13 | refer to it as direct access. | ||
162 | 07:16 | What that means to an actual NSA analyst, | ||
163 | 07:19 | someone like me who was working as an intelligence analyst | ||
164 | 07:22 | targeting Chinese cyber-hackers, | ||
165 | 07:24 | things like that, in Hawaii, | ||
166 | 07:26 | is the provenance of that data | ||
167 | 07:28 | is directly from their servers. | ||
168 | 07:30 | It doesn't mean | ||
169 | 07:32 | that there's a group of company representatives | ||
170 | 07:35 | sitting in a smoky room with the NSA | ||
171 | 07:38 | palling around and making back-room deals | ||
172 | 07:40 | about how they're going to give this stuff away. | ||
173 | 07:42 | Now each company handles it in different ways. | ||
174 | 07:44 | Some are responsible. | ||
175 | 07:46 | Some are somewhat less responsible. | ||
176 | 07:48 | But the bottom line is, when we talk about | ||
177 | 07:49 | how this information is given, | ||
178 | 07:53 | it's coming from the companies themselves. | ||
179 | 07:55 | It's not stolen from the lines. | ||
180 | 07:58 | But there's an important thing to remember here: | ||
181 | 08:00 | even though companies pushed back, | ||
182 | 08:02 | even though companies demanded, | ||
183 | 08:04 | hey, let's do this through a warrant process, | ||
184 | 08:06 | let's do this | ||
185 | 08:08 | where we actually have some sort of legal review, | ||
186 | 08:11 | some sort of basis for handing over | ||
187 | 08:13 | these users' data, | ||
188 | 08:15 | we saw stories in the Washington Post last year | ||
189 | 08:17 | that weren't as well reported as the PRISM story | ||
190 | 08:20 | that said the NSA broke in | ||
191 | 08:23 | to the data center communications | ||
192 | 08:25 | between Google to itself | ||
193 | 08:27 | and Yahoo to itself. | ||
194 | 08:29 | So even these companies that are cooperating | ||
195 | 08:31 | in at least a compelled but hopefully lawful manner | ||
196 | 08:34 | with the NSA, | ||
197 | 08:36 | the NSA isn't satisfied with that, | ||
198 | 08:39 | and because of that, we need our companies | ||
199 | 08:41 | to work very hard | ||
200 | 08:44 | to guarantee that they're going to represent | ||
201 | 08:47 | the interests of the user, and also advocate | ||
202 | 08:49 | for the rights of the users. | ||
203 | 08:51 | And I think over the last year, | ||
204 | 08:52 | we've seen the companies that are named | ||
205 | 08:54 | on the PRISM slides | ||
206 | 08:55 | take great strides to do that, | ||
207 | 08:57 | and I encourage them to continue. | ||
208 | 09:00 | CA: What more should they do? | ||
209 | 09:02 | ES: The biggest thing that an Internet company | ||
210 | 09:06 | in America can do today, right now, | ||
211 | 09:09 | without consulting with lawyers, | ||
212 | 09:10 | to protect the rights of users worldwide, | ||
213 | 09:14 | is to enable SSL web encryption | ||
214 | 09:19 | on every page you visit. | ||
215 | 09:21 | The reason this matters is today, | ||
216 | 09:24 | if you go to look at a copy of "1984" on Amazon.com, | ||
217 | 09:29 | the NSA can see a record of that, | ||
218 | 09:32 | the Russian intelligence service can see a record of that, | ||
219 | 09:34 | the Chinese service can see a record of that, | ||
220 | 09:37 | the French service, the German service, | ||
221 | 09:39 | the services of Andorra. | ||
222 | 09:40 | They can all see it because it's unencrypted. | ||
223 | 09:43 | The world's library is Amazon.com, | ||
224 | 09:47 | but not only do they not support encryption by default, | ||
225 | 09:49 | you cannot choose to use encryption | ||
226 | 09:52 | when browsing through books. | ||
227 | 09:53 | This is something that we need to change, | ||
228 | 09:55 | not just for Amazon, I don't mean to single them out, | ||
229 | 09:57 | but they're a great example. | ||
230 | 09:58 | All companies need to move | ||
231 | 10:00 | to an encrypted browsing habit by default | ||
232 | 10:03 | for all users who haven't taken any action | ||
233 | 10:06 | or picked any special methods on their own. | ||
234 | 10:08 | That'll increase the privacy and the rights | ||
235 | 10:10 | that people enjoy worldwide. | ||
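Mechanically, the "encryption by default" Snowden asks for here is what sites later adopted as HTTPS-by-default: every plain-HTTP request is answered with a redirect to its HTTPS equivalent before any content is served, so the page being read never crosses the wire in cleartext. A minimal sketch of that rewrite rule (the function and the bookstore URL are illustrative assumptions, not any company's actual implementation):

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url: str) -> str:
    """Rewrite a plain-HTTP URL to HTTPS, leaving HTTPS URLs untouched.

    This is the core of an encrypt-by-default policy: a server answers
    every http:// request with a redirect to this rewritten URL.
    """
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

# A hypothetical bookstore request: with this policy, the page (and hence
# the title being browsed) is only ever served over an encrypted channel.
assert force_https("http://bookstore.example/1984") == "https://bookstore.example/1984"
assert force_https("https://bookstore.example/1984") == "https://bookstore.example/1984"
```

In practice sites pair this redirect with the `Strict-Transport-Security` header so browsers skip the insecure first request entirely.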
236 | 10:13 | CA: Ed, come with me to this part of the stage. | ||
237 | 10:16 | I want to show you the next slide here. (Applause) | ||
238 | 10:19 | This is a program called Boundless Informant. | ||
239 | 10:21 | What is that? | ||
240 | 10:23 | ES: So, I've got to give credit to the NSA | ||
241 | 10:25 | for using appropriate names on this. | ||
242 | 10:28 | This is one of my favorite NSA cryptonyms. | ||
243 | 10:31 | Boundless Informant | ||
244 | 10:33 | is a program that the NSA hid from Congress. | ||
245 | 10:36 | The NSA was previously asked by Congress, | ||
246 | 10:38 | was there any ability that they had | ||
247 | 10:40 | to even give a rough ballpark estimate | ||
248 | 10:44 | of the amount of American communications | ||
249 | 10:46 | that were being intercepted. | ||
250 | 10:49 | They said no. They said, we don't track those stats, | ||
251 | 10:52 | and we can't track those stats. | ||
252 | 10:53 | We can't tell you how many communications | ||
253 | 10:56 | we're intercepting around the world, | ||
254 | 10:58 | because to tell you that would be | ||
255 | 10:59 | to invade your privacy. | ||
256 | 11:02 | Now, I really appreciate that sentiment from them, | ||
257 | 11:05 | but the reality, when you look at this slide is, | ||
258 | 11:07 | not only do they have the capability, | ||
259 | 11:08 | the capability already exists. | ||
260 | 11:11 | It's already in place. | ||
261 | 11:13 | The NSA has its own internal data format | ||
262 | 11:16 | that tracks both ends of a communication, | ||
263 | 11:20 | and if it says, | ||
264 | 11:21 | this communication came from America, | ||
265 | 11:23 | they can tell Congress how many of those communications | ||
266 | 11:26 | they have today, right now. | ||
267 | 11:28 | And what Boundless Informant tells us | ||
268 | 11:31 | is more communications are being intercepted | ||
269 | 11:34 | in America about Americans | ||
270 | 11:37 | than there are in Russia about Russians. | ||
271 | 11:40 | I'm not sure that's what an intelligence agency | ||
272 | 11:42 | should be aiming for. | ||
273 | 11:44 | CA: Ed, there was a story broken in the Washington Post, | ||
274 | 11:47 | again from your data. | ||
275 | 11:49 | The headline says, | ||
276 | 11:50 | "NSA broke privacy rules | ||
277 | 11:52 | thousands of times per year." | ||
278 | 11:54 | Tell us about that. | ||
279 | 11:55 | ES: We also heard in Congressional testimony last year, | ||
280 | 11:58 | it was an amazing thing for someone like me | ||
281 | 12:00 | who came from the NSA | ||
282 | 12:02 | and who's seen the actual internal documents, | ||
283 | 12:04 | knows what's in them, | ||
284 | 12:07 | to see officials testifying under oath | ||
285 | 12:09 | that there had been no abuses, | ||
286 | 12:11 | that there had been no violations of the NSA's rules, | ||
287 | 12:15 | when we knew this story was coming. | ||
288 | 12:18 | But what's especially interesting about this, | ||
289 | 12:20 | about the fact that the NSA has violated | ||
290 | 12:22 | their own rules, their own laws | ||
291 | 12:24 | thousands of times in a single year, | ||
292 | 12:27 | including one event by itself, | ||
293 | 12:30 | one event out of those 2,776, | ||
294 | 12:35 | that affected more than 3,000 people. | ||
295 | 12:37 | In another event, they intercepted | ||
296 | 12:39 | all the calls in Washington, D.C., by accident. | ||
297 | 12:43 | What's amazing about this, | ||
298 | 12:45 | this report, that didn't get that much attention, | ||
299 | 12:47 | is the fact that not only were there 2,776 abuses, | ||
300 | 12:52 | the chairman of the Senate Intelligence Committee, | ||
301 | 12:54 | Dianne Feinstein, had not seen this report | ||
302 | 12:58 | until the Washington Post contacted her | ||
303 | 13:02 | asking for comment on the report. | ||
304 | 13:04 | And she then requested a copy from the NSA | ||
305 | 13:06 | and received it, | ||
306 | 13:08 | but had never seen this before that. | ||
307 | 13:10 | What does that say about the state of oversight | ||
308 | 13:12 | in American intelligence | ||
309 | 13:14 | when the chairman of the Senate Intelligence Committee | ||
310 | 13:16 | has no idea that the rules are being broken | ||
311 | 13:19 | thousands of times every year? | ||
312 | 13:21 | CA: Ed, one response to this whole debate is this: | ||
313 | 13:24 | Why should we care about | ||
314 | 13:27 | all this surveillance, honestly? | ||
315 | 13:29 | I mean, look, if you've done nothing wrong, | ||
316 | 13:31 | you've got nothing to worry about. | ||
317 | 13:34 | What's wrong with that point of view? | ||
318 | 13:36 | ES: Well, so the first thing is, | ||
319 | 13:37 | you're giving up your rights. | ||
320 | 13:39 | You're saying hey, you know, | ||
321 | 13:41 | I don't think I'm going to need them, | ||
322 | 13:43 | so I'm just going to trust that, you know, | ||
323 | 13:45 | let's get rid of them, it doesn't really matter, | ||
324 | 13:48 | these guys are going to do the right thing. | ||
325 | 13:50 | Your rights matter | ||
326 | 13:51 | because you never know when you're going to need them. | ||
327 | 13:54 | Beyond that, it's a part of our cultural identity, | ||
328 | 13:57 | not just in America, | ||
329 | 13:59 | but in Western societies | ||
330 | 14:00 | and in democratic societies around the world. | ||
331 | 14:03 | People should be able to pick up the phone | ||
332 | 14:06 | and to call their family, | ||
333 | 14:07 | people should be able to send a text message | ||
334 | 14:09 | to their loved ones, | ||
335 | 14:10 | people should be able to buy a book online, | ||
336 | 14:13 | they should be able to travel by train, | ||
337 | 14:14 | they should be able to buy an airline ticket | ||
338 | 14:17 | without wondering about how these events | ||
339 | 14:18 | are going to look to an agent of the government, | ||
340 | 14:22 | possibly not even your government | ||
341 | 14:25 | years in the future, | ||
342 | 14:26 | how they're going to be misinterpreted | ||
343 | 14:28 | and what they're going to think your intentions were. | ||
344 | 14:31 | We have a right to privacy. | ||
345 | 14:33 | We require warrants to be based on probable cause | ||
346 | 14:37 | or some kind of individualized suspicion | ||
347 | 14:39 | because we recognize that trusting anybody, | ||
348 | 14:44 | any government authority, | ||
349 | 14:45 | with the entirety of human communications | ||
350 | 14:48 | in secret and without oversight | ||
351 | 14:51 | is simply too great a temptation to be ignored. | ||
352 | 14:56 | CA: Some people are furious at what you've done. | ||
353 | 14:58 | I heard a quote recently from Dick Cheney | ||
354 | 15:01 | who said that Julian Assange was a flea bite, | ||
355 | 15:07 | Edward Snowden is the lion that bit the head off the dog. | ||
356 | 15:10 | He thinks you've committed | ||
357 | 15:12 | one of the worst acts of betrayal | ||
358 | 15:14 | in American history. | ||
359 | 15:16 | What would you say to people who think that? | ||
360 | 15:22 | ES: Dick Cheney's really something else. | ||
361 | 15:25 | (Laughter) (Applause) | ||
362 | 15:32 | Thank you. (Laughter) | ||
363 | 15:37 | I think it's amazing, because at the time | ||
364 | 15:39 | Julian Assange was doing some of his greatest work, | ||
365 | 15:43 | Dick Cheney was saying | ||
366 | 15:45 | he was going to end governments worldwide, | ||
367 | 15:47 | the skies were going to ignite | ||
368 | 15:50 | and the seas were going to boil off, | ||
369 | 15:52 | and now he's saying it's a flea bite. | ||
370 | 15:54 | So we should be suspicious about the same sort of | ||
371 | 15:57 | overblown claims of damage to national security | ||
372 | 16:01 | from these kind of officials. | ||
373 | 16:03 | But let's assume that these people really believe this. | ||
374 | 16:09 | I would argue that they have kind of | ||
375 | 16:12 | a narrow conception of national security. | ||
376 | 16:16 | The prerogatives of people like Dick Cheney | ||
377 | 16:19 | do not keep the nation safe. | ||
378 | 16:22 | The public interest is not always the same | ||
379 | 16:26 | as the national interest. | ||
380 | 16:29 | Going to war with people who are not our enemy | ||
381 | 16:33 | in places that are not a threat | ||
382 | 16:35 | doesn't make us safe, | ||
383 | 16:37 | and that applies whether it's in Iraq | ||
384 | 16:39 | or on the Internet. | ||
385 | 16:41 | The Internet is not the enemy. | ||
386 | 16:42 | Our economy is not the enemy. | ||
387 | 16:44 | American businesses, Chinese businesses, | ||
388 | 16:47 | and any other company out there | ||
389 | 16:51 | is a part of our society. | ||
390 | 16:54 | It's a part of our interconnected world. | ||
391 | 16:56 | There are ties of fraternity that bond us together, | ||
392 | 17:00 | and if we destroy these bonds | ||
393 | 17:03 | by undermining the standards, the security, | ||
394 | 17:06 | the manner of behavior, | ||
395 | 17:09 | that nations and citizens all around the world | ||
396 | 17:12 | expect us to abide by. | ||
397 | 17:14 | CA: But it's alleged that you've stolen | ||
398 | 17:18 | 1.7 million documents. | ||
399 | 17:20 | It seems only a few hundred of them | ||
400 | 17:22 | have been shared with journalists so far. | ||
401 | 17:25 | Are there more revelations to come? | ||
402 | 17:28 | ES: There are absolutely more revelations to come. | ||
403 | 17:30 | I don't think there's any question | ||
404 | 17:33 | that some of the most important reporting | ||
405 | 17:37 | to be done is yet to come. | ||
406 | 17:42 | CA: Come here, because I want to ask you | ||
407 | 17:44 | about this particular revelation. | ||
408 | 17:46 | Come and take a look at this. | ||
409 | 17:49 | I mean, this is a story which I think for a lot of the techies in this room | ||
410 | 17:52 | is the single most shocking thing | ||
411 | 17:54 | that they have heard in the last few months. | ||
412 | 17:56 | It's about a program called "Bullrun." | ||
413 | 17:59 | Can you explain what that is? | ||
414 | 18:02 | ES: So Bullrun, and this is again | ||
415 | 18:04 | where we've got to thank the NSA for their candor, | ||
416 | 18:11 | this is a program named after a Civil War battle. | ||
417 | 18:16 | The British counterpart is called Edgehill, | ||
418 | 18:17 | which is a U.K. civil war battle. | ||
419 | 18:19 | And the reason that I believe they're named this way | ||
420 | 18:21 | is because they target our own infrastructure. | ||
421 | 18:25 | They're programs through which the NSA | ||
422 | 18:27 | intentionally misleads corporate partners. | ||
423 | 18:31 | They tell corporate partners that these | ||
424 | 18:33 | are safe standards. | ||
425 | 18:35 | They say hey, we need to work with you | ||
426 | 18:37 | to secure your systems, | ||
427 | 18:41 | but in reality, they're giving bad advice | ||
428 | 18:44 | to these companies that makes them | ||
429 | 18:45 | degrade the security of their services. | ||
430 | 18:47 | They're building in backdoors that not only | ||
431 | 18:50 | the NSA can exploit, | ||
432 | 18:52 | but anyone else who has time and money | ||
433 | 18:55 | to research and find it | ||
434 | 18:57 | can then use to let themselves in | ||
435 | 18:59 | to the world's communications. | ||
436 | 19:01 | And this is really dangerous, | ||
437 | 19:03 | because if we lose a single standard, | ||
438 | 19:07 | if we lose the trust of something like SSL, | ||
439 | 19:10 | which was specifically targeted | ||
440 | 19:11 | by the Bullrun program, | ||
441 | 19:13 | we will live a less safe world overall. | ||
442 | 19:16 | We won't be able to access our banks | ||
443 | 19:18 | and we won't be able to access commerce | ||
444 | 19:23 | without worrying about people monitoring those communications | ||
445 | 19:26 | or subverting them for their own ends. | ||
446 | 19:28 | CA: And do those same decisions also potentially | ||
447 | 19:32 | open America up to cyberattacks | ||
448 | 19:35 | from other sources? | ||
449 | 19:39 | ES: Absolutely. | ||
450 | 19:41 | One of the problems, | ||
451 | 19:43 | one of the dangerous legacies | ||
452 | 19:46 | that we've seen in the post-9/11 era, | ||
453 | 19:49 | is that the NSA has traditionally worn two hats. | ||
454 | 19:54 | They've been in charge of offensive operations, | ||
455 | 19:56 | that is hacking, | ||
456 | 19:57 | but they've also been in charge of defensive operations, | ||
457 | 19:59 | and traditionally they've always prioritized | ||
458 | 20:02 | defense over offense | ||
459 | 20:03 | based on the principle | ||
460 | 20:05 | that American secrets are simply worth more. | ||
461 | 20:07 | If we hack a Chinese business | ||
462 | 20:10 | and steal their secrets, | ||
463 | 20:11 | if we hack a government office in Berlin | ||
464 | 20:13 | and steal their secrets, | ||
465 | 20:15 | that has less value to the American people | ||
466 | 20:19 | than making sure that the Chinese | ||
467 | 20:21 | can't get access to our secrets. | ||
468 | 20:24 | So by reducing the security of our communications, | ||
469 | 20:28 | they're not only putting the world at risk, | ||
470 | 20:30 | they're putting America at risk in a fundamental way, | ||
471 | 20:32 | because intellectual property is the basis, | ||
472 | 20:35 | the foundation of our economy, | ||
473 | 20:37 | and if we put that at risk through weak security, | ||
474 | 20:39 | we're going to be paying for it for years. | ||
475 | 20:41 | CA: But they've made a calculation | ||
476 | 20:42 | that it was worth doing this | ||
477 | 20:44 | as part of America's defense against terrorism. | ||
478 | 20:48 | Surely that makes it a price worth paying. | ||
479 | 20:51 | ES: Well, when you look at the results | ||
480 | 20:55 | of these programs in stopping terrorism, | ||
481 | 20:58 | you will see that that's unfounded, | ||
482 | 21:01 | and you don't have to take my word for it, | ||
483 | 21:03 | because we've had the first open court, | ||
484 | 21:07 | the first federal court that's reviewed this, | ||
485 | 21:09 | outside the secrecy arrangement, | ||
486 | 21:12 | called these programs Orwellian | ||
487 | 21:14 | and likely unconstitutional. | ||
488 | 21:16 | Congress, who has access | ||
489 | 21:19 | to be briefed on these things, | ||
490 | 21:21 | and now has the desire to be, | ||
491 | 21:23 | has produced bills to reform it, | ||
492 | 21:26 | and two independent White House panels | ||
493 | 21:29 | who reviewed all of the classified evidence | ||
494 | 21:31 | said these programs have never stopped | ||
495 | 21:34 | a single terrorist attack | ||
496 | 21:35 | that was imminent in the United States. | ||
497 | 21:39 | So is it really terrorism that we're stopping? | ||
498 | 21:42 | Do these programs have any value at all? | ||
499 | 21:44 | I say no, and all three branches | ||
500 | 21:47 | of the American government say no as well. | ||
501 | 21:49 | CA: I mean, do you think there's a deeper motivation | ||
502 | 21:51 | for them than the war against terrorism? | ||
503 | 21:54 | ES: I'm sorry, I couldn't hear you, say again? | ||
504 | 21:56 | CA: Sorry. Do you think there's a deeper motivation | ||
505 | 21:59 | for them other than the war against terrorism? | ||
506 | 22:02 | ES: Yeah. The bottom line is that terrorism | ||
507 | 22:05 | has always been what we in the intelligence world | ||
508 | 22:07 | would call a cover for action. | ||
509 | 22:11 | Terrorism is something that provokes | ||
510 | 22:13 | an emotional response that allows people | ||
511 | 22:15 | to rationalize authorizing powers and programs | ||
512 | 22:19 | that they wouldn't give otherwise. | ||
513 | 22:22 | The Bullrun and Edgehill-type programs, | ||
514 | 22:24 | the NSA asked for these authorities | ||
515 | 22:26 | back in the 1990s. | ||
516 | 22:28 | They asked the FBI to go to Congress and make the case. | ||
517 | 22:31 | The FBI went to Congress and did make the case. | ||
518 | 22:33 | But Congress and the American people said no. | ||
519 | 22:35 | They said, it's not worth the risk to our economy. | ||
520 | 22:38 | They said it's worth too much damage | ||
521 | 22:40 | to our society to justify the gains. | ||
522 | 22:43 | But what we saw is, in the post-9/11 era, | ||
523 | 22:47 | they used secrecy and they used the justification of terrorism | ||
524 | 22:50 | to start these programs in secret | ||
525 | 22:52 | without asking Congress, | ||
526 | 22:54 | without asking the American people, | ||
527 | 22:56 | and it's that kind of government behind closed doors | ||
528 | 22:59 | that we need to guard ourselves against, | ||
529 | 23:01 | because it makes us less safe, | ||
530 | 23:02 | and it offers no value. | ||
531 | 23:04 | CA: Okay, come with me here for a sec, | ||
532 | 23:06 | because I've got a more personal question for you. | ||
533 | 23:08 | Speaking of terror, | ||
534 | 23:11 | most people would find the situation you're in right now | ||
535 | 23:15 | in Russia pretty terrifying. | ||
536 | 23:19 | You obviously heard what happened, | ||
537 | 23:22 | the treatment that Bradley Manning got, | ||
538 | 23:24 | Chelsea Manning as she now is, | ||
539 | 23:27 | and there was a story in Buzzfeed saying that | ||
540 | 23:29 | there are people in the intelligence community | ||
541 | 23:31 | who want you dead. | ||
542 | 23:33 | How are you coping with this? | ||
543 | 23:35 | How are you coping with the fear? | ||
544 | 23:37 | ES: It's no mystery | ||
545 | 23:40 | that there are governments out there that want to see me dead. | ||
546 | 23:46 | I've made clear again and again and again | ||
547 | 23:49 | that I go to sleep every morning | ||
548 | 23:52 | thinking about what I can do for the American people. | ||
549 | 23:57 | I don't want to harm my government. | ||
550 | 24:00 | I want to help my government, | ||
551 | 24:03 | but the fact that they are willing to | ||
552 | 24:07 | completely ignore due process, | ||
553 | 24:09 | they're willing to declare guilt | ||
554 | 24:12 | without ever seeing a trial, | ||
555 | 24:15 | these are things that we need to work against | ||
556 | 24:18 | as a society, and say hey, this is not appropriate. | ||
557 | 24:21 | We shouldn't be threatening dissidents. | ||
558 | 24:23 | We shouldn't be criminalizing journalism. | ||
559 | 24:26 | And whatever part I can do to see that end, | ||
560 | 24:30 | I'm happy to do despite the risks. | ||
561 | 24:33 | CA: So I'd actually like to get some feedback | ||
562 | 24:34 | from the audience here, | ||
563 | 24:35 | because I know there's widely differing reactions | ||
564 | 24:38 | to Edward Snowden. | ||
565 | 24:39 | Suppose you had the following two choices, right? | ||
566 | 24:42 | You could view what he did | ||
567 | 24:45 | as fundamentally a reckless act | ||
568 | 24:46 | that has endangered America | ||
569 | 24:50 | or you could view it as fundamentally a heroic act | ||
570 | 24:53 | that will work toward America's and the world's | ||
571 | 24:57 | long-term good? | ||
572 | 24:58 | Those are the two choices I'll give you. | ||
573 | 25:01 | I'm curious to see who's willing to vote with | ||
574 | 25:04 | the first of those, | ||
575 | 25:05 | that this was a reckless act? | ||
576 | 25:08 | There are some hands going up. | ||
577 | 25:10 | Some hands going up. | ||
578 | 25:11 | It's hard to put your hand up | ||
579 | 25:13 | when the man is standing right here, | ||
580 | 25:15 | but I see them. | ||
581 | 25:16 | ES: I can see you. (Laughter) | ||
582 | 25:19 | CA: And who goes with the second choice, | ||
583 | 25:21 | the fundamentally heroic act? | ||
584 | 25:23 | (Applause) (Cheers) | ||
585 | 25:26 | And I think it's true to say that there are a lot of people | ||
586 | 25:28 | who didn't show a hand and I think | ||
587 | 25:31 | are still thinking this through, | ||
588 | 25:32 | because it seems to me that the debate around you | ||
589 | 25:36 | doesn't split along traditional political lines. | ||
590 | 25:39 | It's not left or right, it's not really about | ||
591 | 25:41 | pro-government, libertarian, or not just that. | ||
592 | 25:45 | Part of it is almost a generational issue. | ||
593 | 25:48 | You're part of a generation that grew up | ||
594 | 25:50 | with the Internet, and it seems as if | ||
595 | 25:53 | you become offended at almost a visceral level | ||
596 | 25:56 | when you see something done | ||
597 | 25:57 | that you think will harm the Internet. | ||
598 | 25:59 | Is there some truth to that? | ||
599 | 26:03 | ES: It is. I think it's very true. | ||
600 | 26:08 | This is not a left or right issue. | ||
601 | 26:11 | Our basic freedoms, and when I say our, | ||
602 | 26:13 | I don't just mean Americans, | ||
603 | 26:15 | I mean people around the world, | ||
604 | 26:17 | it's not a partisan issue. | ||
605 | 26:19 | These are things that all people believe, | ||
606 | 26:21 | and it's up to all of us to protect them, | ||
607 | 26:24 | and to people who have seen and enjoyed | ||
608 | 26:27 | a free and open Internet, | ||
609 | 26:28 | it's up to us to preserve that liberty | ||
610 | 26:32 | for the next generation to enjoy, | ||
611 | 26:34 | and if we don't change things, | ||
612 | 26:35 | if we don't stand up to make the changes | ||
613 | 26:39 | we need to do to keep the Internet safe, | ||
614 | 26:42 | not just for us but for everyone, | ||
615 | 26:45 | we're going to lose that, | ||
616 | 26:46 | and that would be a tremendous loss, | ||
617 | 26:47 | not just for us, but for the world. | ||
618 | 26:50 | CA: Well, I have heard similar language recently | ||
619 | 26:52 | from the founder of the world wide web, | ||
620 | 26:54 | who I actually think is with us, Sir Tim Berners-Lee. | ||
621 | 26:58 | Tim, actually, would you like to come up and say, | ||
622 | 27:01 | do we have a microphone for Tim? | ||
623 | 27:03 | (Applause) | ||
624 | 27:05 | Tim, good to see you. Come up there. | ||
625 | 27:12 | Which camp are you in, by the way, | ||
626 | 27:15 | traitor, hero? I have a theory on this, but -- | ||
627 | 27:18 | Tim Berners-Lee: I've given much longer | ||
628 | 27:21 | answers to that question, but hero, | ||
629 | 27:24 | if I have to make the choice between the two. | ||
630 | 27:27 | CA: And Ed, I think you've read | ||
631 | 27:31 | the proposal that Sir Tim has talked about | ||
632 | 27:33 | about a new Magna Carta to take back the Internet. | ||
633 | 27:36 | Is that something that makes sense? | ||
634 | 27:38 | ES: Absolutely. I mean, my generation, I grew up | ||
635 | 27:41 | not just thinking about the Internet, | ||
636 | 27:43 | but I grew up in the Internet, | ||
637 | 27:46 | and although I never expected to have the chance | ||
638 | 27:50 | to defend it in such a direct and practical manner | ||
639 | 27:56 | and to embody it in this unusual, | ||
640 | 28:00 | almost avatar manner, | ||
641 | 28:02 | I think there's something poetic about the fact that | ||
642 | 28:05 | one of the sons of the Internet | ||
643 | 28:07 | has actually become close to the Internet | ||
644 | 28:10 | as a result of their political expression. | ||
645 | 28:12 | And I believe that a Magna Carta for the Internet | ||
646 | 28:16 | is exactly what we need. | ||
647 | 28:18 | We need to encode our values | ||
648 | 28:21 | not just in writing but in the structure of the Internet, | ||
649 | 28:25 | and it's something that I hope, | ||
650 | 28:27 | I invite everyone in the audience, | ||
651 | 28:29 | not just here in Vancouver but around the world, | ||
652 | 28:33 | to join and participate in. | ||
653 | 28:35 | CA: Do you have a question for Ed? | ||
654 | 28:37 | TBL: Well, two questions, | ||
655 | 28:39 | a general question — | ||
656 | 28:40 | CA: Ed, can you still hear us? | ||
657 | 28:42 | ES: Yes, I can hear you. CA: Oh, he's back. | ||
658 | 28:46 | TBL: The wiretap on your line | ||
659 | 28:47 | got a little interfered with for a moment. | ||
660 | 28:49 | (Laughter) | ||
661 | 28:51 | ES: It's a little bit of an NSA problem. | ||
662 | 28:53 | TBL: So, from the 25 years, | ||
663 | 28:57 | stepping back and thinking, | ||
664 | 29:00 | what would you think would be | ||
665 | 29:02 | the best that we could achieve | ||
666 | 29:04 | from all the discussions that we have | ||
667 | 29:06 | about the web we want? | ||
668 | 29:09 | ES: When we think about it | ||
669 | 29:12 | in terms of how far we can go, | ||
670 | 29:15 | I think that's a question that's really only limited | ||
671 | 29:18 | by what we're willing to put into it. | ||
672 | 29:20 | I think the Internet that we've enjoyed in the past | ||
673 | 29:23 | has been exactly what we as not just a nation | ||
674 | 29:29 | but as a people around the world need, | ||
675 | 29:32 | and by cooperating, by engaging not just | ||
676 | 29:36 | the technical parts of society, | ||
677 | 29:38 | but as you said, the users, | ||
678 | 29:40 | the people around the world who contribute | ||
679 | 29:43 | through the Internet, through social media, | ||
680 | 29:45 | who just check the weather, | ||
681 | 29:47 | who rely on it every day as a part of their life, | ||
682 | 29:49 | to champion that. | ||
683 | 29:52 | We'll get not just the Internet we've had, | ||
684 | 29:55 | but a better Internet, a better now, | ||
685 | 29:58 | something that we can use to build a future | ||
686 | 30:02 | that'll be better not just than what we hoped for | ||
687 | 30:05 | but anything that we could have imagined. | ||
688 | 30:07 | CA: It's 30 years ago that TED was founded, in 1984. | ||
689 | 30:13 | A lot of the conversation since then has been | ||
690 | 30:15 | along the lines that | ||
691 | 30:17 | actually George Orwell got it wrong. | ||
692 | 30:18 | It's not Big Brother watching us. | ||
693 | 30:20 | We, through the power of the web, | ||
694 | 30:22 | and transparency, are now watching Big Brother. | ||
695 | 30:24 | Your revelations kind of drove a stake | ||
696 | 30:26 | through the heart of that rather optimistic view, | ||
697 | 30:30 | but you still believe there's a way of doing something | ||
698 | 30:34 | about that. | ||
699 | 30:35 | And you do too. | ||
700 | 30:37 | ES: Right, so there is an argument to be made | ||
701 | 30:43 | that the powers of Big Brother have increased enormously. | ||
702 | 30:47 | There was a recent legal article at Yale | ||
703 | 30:51 | that established something called the Bankston-Soltani Principle, | ||
704 | 30:55 | which is that our expectation of privacy is violated | ||
705 | 31:00 | when the capabilities of government surveillance | ||
706 | 31:02 | have become cheaper by an order of magnitude, | ||
707 | 31:05 | and each time that occurs, we need to revisit | ||
708 | 31:08 | and rebalance our privacy rights. | ||
709 | 31:11 | Now, that hasn't happened since | ||
710 | 31:13 | the government's surveillance powers | ||
711 | 31:15 | have increased by several orders of magnitude, | ||
712 | 31:18 | and that's why we're in the problem that we're in today, | ||
713 | 31:21 | but there is still hope, | ||
714 | 31:25 | because the power of individuals | ||
715 | 31:27 | has also been increased by technology. | ||
716 | 31:30 | I am living proof | ||
717 | 31:32 | that an individual can go head to head | ||
718 | 31:34 | against the most powerful adversaries | ||
719 | 31:36 | and the most powerful intelligence agencies | ||
720 | 31:38 | around the world and win, | ||
721 | 31:42 | and I think that's something | ||
722 | 31:44 | that we need to take hope from, | ||
723 | 31:46 | and we need to build on | ||
724 | 31:47 | to make it accessible not just to technical experts | ||
725 | 31:50 | but to ordinary citizens around the world. | ||
726 | 31:52 | Journalism is not a crime, | ||
727 | 31:54 | communication is not a crime, | ||
728 | 31:56 | and we should not be monitored in our everyday activities. | ||
729 | 31:59 | CA: I'm not quite sure how you shake the hand of a bot, | ||
730 | 32:01 | but I imagine it's, this is the hand right here. TBL: That'll come very soon. | ||
731 | 32:07 | ES: Nice to meet you, | ||
732 | 32:08 | and I hope my beam looks as nice | ||
733 | 32:10 | as my view of you guys does. | ||
734 | 32:13 | CA: Thank you, Tim. | ||
735 | 32:16 | (Applause) | ||
736 | 32:21 | I mean, The New York Times recently called for an amnesty for you. | ||
737 | 32:25 | Would you welcome the chance to come back to America? | ||
738 | 32:30 | ES: Absolutely. There's really no question, | ||
739 | 32:33 | the principles that have been the foundation | ||
740 | 32:36 | of this project | ||
741 | 32:38 | have been the public interest | ||
742 | 32:42 | and the principles that underlie | ||
743 | 32:45 | the journalistic establishment in the United States | ||
744 | 32:49 | and around the world, | ||
745 | 32:51 | and I think if the press is now saying, | ||
746 | 32:56 | we support this, | ||
747 | 32:58 | this is something that needed to happen, | ||
748 | 33:00 | that's a powerful argument, but it's not the final argument, | ||
749 | 33:03 | and I think that's something the public should decide. | ||
750 | 33:06 | But at the same time, | ||
751 | 33:07 | the government has hinted that they want | ||
752 | 33:09 | some kind of deal, | ||
753 | 33:11 | that they want me to compromise | ||
754 | 33:13 | the journalists with which I've been working, | ||
755 | 33:15 | to come back, | ||
756 | 33:16 | and I want to make it very clear | ||
757 | 33:19 | that I did not do this to be safe. | ||
758 | 33:22 | I did this to do what was right, | ||
759 | 33:24 | and I'm not going to stop my work | ||
760 | 33:26 | in the public interest | ||
761 | 33:28 | just to benefit myself. | ||
762 | 33:30 | (Applause) | ||
763 | 33:36 | CA: In the meantime, | ||
764 | 33:38 | courtesy of the Internet and this technology, | ||
765 | 33:42 | you're here, back in North America, | ||
766 | 33:44 | not quite the U.S., Canada, in this form. | ||
767 | 33:48 | I'm curious, how does that feel? | ||
768 | 33:52 | ES: Canada is different than what I expected. | ||
769 | 33:55 | It's a lot warmer. | ||
770 | 33:57 | (Laughter) | ||
771 | 34:02 | CA: At TED, the mission is "ideas worth spreading." | ||
772 | 34:06 | If you could encapsulate it in a single idea, | ||
773 | 34:08 | what is your idea worth spreading | ||
774 | 34:10 | right now at this moment? | ||
775 | 34:14 | ES: I would say the last year has been a reminder | ||
776 | 34:18 | that democracy may die behind closed doors, | ||
777 | 34:21 | but we as individuals are born | ||
778 | 34:23 | behind those same closed doors, | ||
779 | 34:26 | and we don't have to give up | ||
780 | 34:28 | our privacy to have good government. | ||
781 | 34:32 | We don't have to give up our liberty | ||
782 | 34:34 | to have security. | ||
783 | 34:35 | And I think by working together | ||
784 | 34:38 | we can have both open government | ||
785 | 34:41 | and private lives, | ||
786 | 34:42 | and I look forward to working with everyone | ||
787 | 34:44 | around the world to see that happen. | ||
788 | 34:47 | Thank you very much. | ||
789 | 34:48 | CA: Ed, thank you. | ||
790 | 34:50 | (Applause) |