Edward Snowden: Here's how we take back the Internet

Recorded at: March 18, 2014
Event: TED2014
Duration (min:sec): 34:47
Video Type: TED Stage Talk
Words per minute: 176.92 (medium)
Readability (FK): 51.95 (medium)
Speaker: Edward Snowden
Country: United States of America; Russia
Occupation: whistleblower, dissident
Description: American whistleblower and former NSA contractor (born 1983)


Synopsis

Appearing by telepresence robot, Edward Snowden speaks at TED2014 about surveillance and Internet freedom. The right to data privacy, he suggests, is not a partisan issue, but requires a fundamental rethink of the role of the internet in our lives — and the laws that protect it. "Your rights matter," he says, "because you never know when you're going to need them." Chris Anderson interviews, with special guest Tim Berners-Lee.

Transcript
00:13 Chris Anderson: The rights of citizens, the future of the Internet. So I would like to welcome to the TED stage the man behind those revelations, Ed Snowden. (Applause)

00:29 Ed is in a remote location somewhere in Russia controlling this bot from his laptop, so he can see what the bot can see. Ed, welcome to the TED stage. What can you see, as a matter of fact?

00:45 Edward Snowden: Ha, I can see everyone. This is amazing. (Laughter)
00:53 CA: Ed, some questions for you. You've been called many things in the last few months. You've been called a whistleblower, a traitor, a hero. What words would you describe yourself with?

01:09 ES: You know, everybody who is involved with this debate has been struggling over me and my personality and how to describe me. But when I think about it, this isn't the question that we should be struggling with. Who I am really doesn't matter at all. If I'm the worst person in the world, you can hate me and move on. What really matters here are the issues. What really matters here is the kind of government we want, the kind of Internet we want, the kind of relationship between people and societies. And that's what I'm hoping the debate will move towards, and we've seen that increasing over time. If I had to describe myself, I wouldn't use words like "hero." I wouldn't use "patriot," and I wouldn't use "traitor." I'd say I'm an American and I'm a citizen, just like everyone else.
01:59 CA: So just to give some context for those who don't know the whole story — (Applause) — this time a year ago, you were stationed in Hawaii working as a consultant to the NSA. As a sysadmin, you had access to their systems, and you began revealing certain classified documents to some handpicked journalists, leading the way to June's revelations. Now, what propelled you to do this?

02:33 ES: You know, when I was sitting in Hawaii, and the years before, when I was working in the intelligence community, I saw a lot of things that disturbed me. We do a lot of good things in the intelligence community, things that need to be done, and things that help everyone. But there are also things that go too far. There are things that shouldn't be done, and decisions that were being made in secret without the public's awareness, without the public's consent, and without even our representatives in government having knowledge of these programs.

03:08 When I really came to struggle with these issues, I thought to myself, how can I do this in the most responsible way, one that maximizes the public benefit while minimizing the risks? And out of all the solutions that I could come up with, out of going to Congress, when there were no laws, there were no legal protections for a private employee, a contractor in intelligence like myself, there was a risk that I would be buried along with the information and the public would never find out. But the First Amendment of the United States Constitution guarantees us a free press for a reason, and that's to enable an adversarial press, to challenge the government, but also to work together with the government, to have a dialogue and debate about how we can inform the public about matters of vital importance without putting our national security at risk. And by working with journalists, by giving all of my information back to the American people, rather than trusting myself to make the decisions about publication, we've had a robust debate with a deep investment by the government that I think has resulted in a benefit for everyone. And the risks that have been threatened, the risks that have been played up by the government have never materialized. We've never seen any evidence of even a single instance of specific harm, and because of that, I'm comfortable with the decisions that I made.
04:46 CA: So let me show the audience a couple of examples of what you revealed. If we could have a slide up, and Ed, I don't know whether you can see, the slides are here. This is a slide of the PRISM program, and maybe you could tell the audience what that was that was revealed.

05:03 ES: The best way to understand PRISM, because there's been a little bit of controversy, is to first talk about what PRISM isn't. Much of the debate in the U.S. has been about metadata. They've said it's just metadata, it's just metadata, and they're talking about a specific legal authority called Section 215 of the Patriot Act. That allows sort of a warrantless wiretapping, mass surveillance of the entire country's phone records, things like that: who you're talking to, when you're talking to them, where you traveled. These are all metadata events. PRISM is about content. It's a program through which the government could compel corporate America, it could deputize corporate America, to do its dirty work for the NSA. And even though some of these companies did resist, even though some of them — I believe Yahoo was one of them — challenged them in court, they all lost, because it was never tried by an open court. They were only tried by a secret court.

06:06 And something that we've seen, something about the PRISM program that's very concerning to me is, there's been a talking point in the U.S. government where they've said 15 federal judges have reviewed these programs and found them to be lawful, but what they don't tell you is those are secret judges in a secret court, based on secret interpretations of law, that has considered 34,000 warrant requests over 33 years, and in 33 years only rejected 11 government requests. These aren't the people that we want deciding what the role of corporate America in a free and open Internet should be.
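To make the metadata/content distinction concrete, here is a toy sketch (my illustration, not anything drawn from the leaked slides; every field and number is hypothetical) of what a Section 215-style record for a single phone call could contain, versus the PRISM-class content of the same call:

    # A made-up "just metadata" record for one phone call.
    call_metadata = {
        "caller": "+1-202-555-0100",
        "callee": "+1-808-555-0199",
        "start": "2013-05-20T14:02:11Z",   # when you talked
        "duration_seconds": 1880,          # for how long
        "cell_tower": "HNL-0342",          # implies where you were
    }

    # PRISM-class collection reaches the content itself, not just the envelope.
    call_content = "<the recorded audio or message body of the conversation>"

    # The envelope alone already answers: who, when, how long, and from where.
    print(", ".join(sorted(call_metadata)))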
06:48 CA: Now, this slide that we're showing here shows the dates on which different technology companies, Internet companies, are alleged to have joined the program, and where data collection began from them. Now, they have denied collaborating with the NSA. How was that data collected by the NSA?

07:10 ES: Right. So the NSA's own slides refer to it as direct access. What that means to an actual NSA analyst, someone like me who was working as an intelligence analyst targeting Chinese cyber-hackers, things like that, in Hawaii, is that the provenance of that data is directly from their servers. It doesn't mean that there's a group of company representatives sitting in a smoky room with the NSA, palling around and making back-room deals about how they're going to give this stuff away. Now, each company handles it in different ways. Some are responsible. Some are somewhat less responsible. But the bottom line is, when we talk about how this information is given, it's coming from the companies themselves. It's not stolen from the lines.

08:00 But there's an important thing to remember here: even though companies pushed back, even though companies demanded, hey, let's do this through a warrant process, let's do this where we actually have some sort of legal review, some sort of basis for handing over these users' data, we saw stories in the Washington Post last year, not as well reported as the PRISM story, that said the NSA broke into the data center communications between Google to itself and Yahoo to itself. So even with these companies that are cooperating in at least a compelled but hopefully lawful manner with the NSA, the NSA isn't satisfied, and because of that, we need our companies to work very hard to guarantee that they're going to represent the interests of the user, and also advocate for the rights of the users. And I think over the last year, we've seen the companies that are named on the PRISM slides take great strides to do that, and I encourage them to continue.
09:00 CA: What more should they do?

09:02 ES: The biggest thing that an Internet company in America can do today, right now, without consulting with lawyers, to protect the rights of users worldwide, is to enable SSL web encryption on every page you visit. The reason this matters is that today, if you go to look at a copy of "1984" on Amazon.com, the NSA can see a record of that, the Russian intelligence service can see a record of that, the Chinese service can see a record of that, the French service, the German service, the services of Andorra. They can all see it because it's unencrypted. The world's library is Amazon.com, but not only do they not support encryption by default, you cannot choose to use encryption when browsing through books. This is something that we need to change, not just for Amazon, I don't mean to single them out, but they're a great example. All companies need to move to encrypted browsing by default for all users who haven't taken any action or picked any special methods on their own. That'll increase the privacy and the rights that people enjoy worldwide.
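"Encryption by default" here just means serving every page over HTTPS with no opt-in step by the user. A minimal sketch of the server side of that policy, using only the Python standard library (my illustration, with a hypothetical site at example.com, not anyone's actual configuration):

    # Refuse to serve content in the clear: answer every plain-HTTP request
    # with a permanent redirect to the HTTPS equivalent of the same URL.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RedirectToHTTPS(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(301)  # moved permanently
            self.send_header("Location", "https://example.com" + self.path)
            self.end_headers()

    if __name__ == "__main__":
        # The real site would serve pages only over TLS on port 443;
        # this listener exists solely to bounce cleartext traffic there.
        HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()

In practice sites pair this redirect with a Strict-Transport-Security header on the HTTPS responses, so browsers stop attempting cleartext connections at all.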
10:13 CA: Ed, come with me to this part of the stage. I want to show you the next slide here. (Applause) This is a program called Boundless Informant. What is that?

10:23 ES: So, I've got to give credit to the NSA for using appropriate names on this. This is one of my favorite NSA cryptonyms. Boundless Informant is a program that the NSA hid from Congress. The NSA was previously asked by Congress whether they had any ability to even give a rough ballpark estimate of the amount of American communications that were being intercepted. They said no. They said, we don't track those stats, and we can't track those stats. We can't tell you how many communications we're intercepting around the world, because to tell you that would be to invade your privacy.

11:02 Now, I really appreciate that sentiment from them, but the reality, when you look at this slide, is that not only do they have the capability, the capability already exists. It's already in place. The NSA has its own internal data format that tracks both ends of a communication, and if it says, this communication came from America, they can tell Congress how many of those communications they have today, right now. And what Boundless Informant tells us is that more communications are being intercepted in America about Americans than there are in Russia about Russians. I'm not sure that's what an intelligence agency should be aiming for.
11:44 CA: Ed, there was a story broken in the Washington Post, again from your data. The headline says, "NSA broke privacy rules thousands of times per year." Tell us about that.

11:55 ES: We also heard in Congressional testimony last year (and it was an amazing thing for someone like me, who came from the NSA, who's seen the actual internal documents and knows what's in them) officials testifying under oath that there had been no abuses, that there had been no violations of the NSA's rules, when we knew this story was coming.

12:18 But what's especially interesting about this is the fact that the NSA has violated their own rules, their own laws, thousands of times in a single year, including one event by itself, one event out of those 2,776, that affected more than 3,000 people. In another event, they intercepted all the calls in Washington, D.C., by accident. What's amazing about this report, which didn't get that much attention, is the fact that not only were there 2,776 abuses, but the chairman of the Senate Intelligence Committee, Dianne Feinstein, had not seen this report until the Washington Post contacted her asking for comment on it. She then requested a copy from the NSA and received it, but had never seen it before that. What does that say about the state of oversight in American intelligence when the chairman of the Senate Intelligence Committee has no idea that the rules are being broken thousands of times every year?
13:21 CA: Ed, one response to this whole debate is this: Why should we care about all this surveillance, honestly? I mean, look, if you've done nothing wrong, you've got nothing to worry about. What's wrong with that point of view?

13:36 ES: Well, so the first thing is, you're giving up your rights. You're saying hey, you know, I don't think I'm going to need them, so I'm just going to trust that, you know, let's get rid of them, it doesn't really matter, these guys are going to do the right thing. Your rights matter because you never know when you're going to need them.

13:54 Beyond that, it's a part of our cultural identity, not just in America, but in Western societies and in democratic societies around the world. People should be able to pick up the phone and to call their family, people should be able to send a text message to their loved ones, people should be able to buy a book online, they should be able to travel by train, they should be able to buy an airline ticket without wondering about how these events are going to look to an agent of the government, possibly not even your government years in the future, how they're going to be misinterpreted and what they're going to think your intentions were. We have a right to privacy. We require warrants to be based on probable cause or some kind of individualized suspicion because we recognize that trusting anybody, any government authority, with the entirety of human communications in secret and without oversight is simply too great a temptation to be ignored.
14:56 CA: Some people are furious at what you've done. I heard a quote recently from Dick Cheney who said that Julian Assange was a flea bite; Edward Snowden is the lion that bit the head off the dog. He thinks you've committed one of the worst acts of betrayal in American history. What would you say to people who think that?

15:22 ES: Dick Cheney's really something else. (Laughter) (Applause) Thank you. (Laughter) I think it's amazing, because at the time Julian Assange was doing some of his greatest work, Dick Cheney was saying he was going to end governments worldwide, the skies were going to ignite and the seas were going to boil off, and now he's saying it's a flea bite. So we should be suspicious about the same sort of overblown claims of damage to national security from these kinds of officials.

16:03 But let's assume that these people really believe this. I would argue that they have kind of a narrow conception of national security. The prerogatives of people like Dick Cheney do not keep the nation safe. The public interest is not always the same as the national interest. Going to war with people who are not our enemy in places that are not a threat doesn't make us safe, and that applies whether it's in Iraq or on the Internet. The Internet is not the enemy. Our economy is not the enemy. American businesses, Chinese businesses, and any other company out there are a part of our society. They're a part of our interconnected world. There are ties of fraternity that bond us together, and we destroy these bonds when we undermine the standards, the security, the manner of behavior that nations and citizens all around the world expect us to abide by.
17:14 CA: But it's alleged that you've stolen 1.7 million documents. It seems only a few hundred of them have been shared with journalists so far. Are there more revelations to come?

17:28 ES: There are absolutely more revelations to come. I don't think there's any question that some of the most important reporting to be done is yet to come.
17:42 CA: Come here, because I want to ask you about this particular revelation. Come and take a look at this. I mean, this is a story which I think for a lot of the techies in this room is the single most shocking thing that they have heard in the last few months. It's about a program called "Bullrun." Can you explain what that is?

18:02 ES: So Bullrun, and this is again where we've got to thank the NSA for their candor, is a program named after a Civil War battle. The British counterpart is called Edgehill, named after a battle in the English Civil War. And the reason that I believe they're named this way is because they target our own infrastructure. They're programs through which the NSA intentionally misleads corporate partners. They tell corporate partners that these are safe standards. They say hey, we need to work with you to secure your systems, but in reality, they're giving bad advice to these companies that makes them degrade the security of their services. They're building in backdoors that not only the NSA can exploit, but anyone else who has the time and money to research and find them can then use to let themselves in to the world's communications.

19:01 And this is really dangerous, because if we lose a single standard, if we lose the trust of something like SSL, which was specifically targeted by the Bullrun program, we will live in a less safe world overall. We won't be able to access our banks and we won't be able to access commerce without worrying about people monitoring those communications or subverting them for their own ends.
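A small sketch of why that trust matters, using nothing beyond the Python standard library (example.com stands in for any site): application code simply inherits whatever the TLS stack and its standards provide, so a weakened standard silently weakens every program built on top of it.

    import socket
    import ssl

    # Modern defaults: certificate verification and hostname checking on.
    ctx = ssl.create_default_context()

    with socket.create_connection(("example.com", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
            # If the underlying standard or its parameters had been quietly
            # sabotaged upstream, this call would still succeed, and the user
            # would have no visible signal that the channel is weaker.
            print(tls.version(), tls.cipher())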
19:28 CA: And do those same decisions also potentially open America up to cyberattacks from other sources?

19:39 ES: Absolutely. One of the problems, one of the dangerous legacies that we've seen in the post-9/11 era, is that the NSA has traditionally worn two hats. They've been in charge of offensive operations, that is, hacking, but they've also been in charge of defensive operations, and traditionally they've always prioritized defense over offense based on the principle that American secrets are simply worth more. If we hack a Chinese business and steal their secrets, if we hack a government office in Berlin and steal their secrets, that has less value to the American people than making sure that the Chinese can't get access to our secrets.

20:24 So by reducing the security of our communications, they're not only putting the world at risk, they're putting America at risk in a fundamental way, because intellectual property is the basis, the foundation of our economy, and if we put that at risk through weak security, we're going to be paying for it for years.
20:41 CA: But they've made a calculation that it was worth doing this as part of America's defense against terrorism. Surely that makes it a price worth paying.

20:51 ES: Well, when you look at the results of these programs in stopping terrorism, you will see that that's unfounded, and you don't have to take my word for it, because we've had the first open court, the first federal court that's reviewed this outside the secrecy arrangement, call these programs Orwellian and likely unconstitutional. Congress, who has access to be briefed on these things, and now has the desire to be, has produced bills to reform it, and two independent White House panels who reviewed all of the classified evidence said these programs have never stopped a single terrorist attack that was imminent in the United States. So is it really terrorism that we're stopping? Do these programs have any value at all? I say no, and all three branches of the American government say no as well.
21:49 CA: I mean, do you think there's a deeper motivation for them than the war against terrorism?

21:54 ES: I'm sorry, I couldn't hear you, say again?

21:56 CA: Sorry. Do you think there's a deeper motivation for them other than the war against terrorism?

22:02 ES: Yeah. The bottom line is that terrorism has always been what we in the intelligence world would call a cover for action. Terrorism is something that provokes an emotional response that allows people to rationalize authorizing powers and programs that they wouldn't give otherwise. As for the Bullrun and Edgehill-type programs, the NSA asked for these authorities back in the 1990s. They asked the FBI to go to Congress and make the case. The FBI went to Congress and did make the case. But Congress and the American people said no. They said, it's not worth the risk to our economy. They said it would do too much damage to our society to justify the gains.

22:43 But what we saw is, in the post-9/11 era, they used secrecy and they used the justification of terrorism to start these programs in secret without asking Congress, without asking the American people, and it's that kind of government behind closed doors that we need to guard ourselves against, because it makes us less safe, and it offers no value.
23:04 CA: Okay, come with me here for a sec, because I've got a more personal question for you. Speaking of terror, most people would find the situation you're in right now in Russia pretty terrifying. You obviously heard what happened, what treatment Bradley Manning got, Chelsea Manning as she now is, and there was a story in Buzzfeed saying that there are people in the intelligence community who want you dead. How are you coping with this? How are you coping with the fear?

23:37 ES: It's no mystery that there are governments out there that want to see me dead. I've made clear again and again and again that I go to sleep every morning thinking about what I can do for the American people. I don't want to harm my government. I want to help my government, but the fact that they are willing to completely ignore due process, they're willing to declare guilt without ever seeing a trial, these are things that we need to work against as a society and say, hey, this is not appropriate. We shouldn't be threatening dissidents. We shouldn't be criminalizing journalism. And whatever part I can do to see that end, I'm happy to do despite the risks.
24:33 CA: So I'd actually like to get some feedback from the audience here, because I know there are widely differing reactions to Edward Snowden. Suppose you had the following two choices, right? You could view what he did as fundamentally a reckless act that has endangered America, or you could view it as fundamentally a heroic act that will work towards America's and the world's long-term good. Those are the two choices I'll give you. I'm curious to see who's willing to vote with the first of those, that this was a reckless act? There are some hands going up. Some hands going up. It's hard to put your hand up when the man is standing right here, but I see them.

25:16 ES: I can see you. (Laughter)

25:19 CA: And who goes with the second choice, the fundamentally heroic act? (Applause) (Cheers)

25:26 And I think it's true to say that there are a lot of people who didn't show a hand and I think are still thinking this through, because it seems to me that the debate around you doesn't split along traditional political lines. It's not left or right, it's not really about pro-government or libertarian, or not just that. Part of it is almost a generational issue. You're part of a generation that grew up with the Internet, and it seems as if you become offended at almost a visceral level when you see something done that you think will harm the Internet. Is there some truth to that?

26:03 ES: There is. I think it's very true. This is not a left or right issue. Our basic freedoms, and when I say our, I don't just mean Americans, I mean people around the world, are not a partisan issue. These are things that all people believe, and it's up to all of us to protect them, and to people who have seen and enjoyed a free and open Internet, it's up to us to preserve that liberty for the next generation to enjoy, and if we don't change things, if we don't stand up to make the changes we need to keep the Internet safe, not just for us but for everyone, we're going to lose that, and that would be a tremendous loss, not just for us, but for the world.
26:50 CA: Well, I have heard similar language recently from the founder of the world wide web, who I actually think is with us, Sir Tim Berners-Lee. Tim, actually, would you like to come up and say, do we have a microphone for Tim? (Applause) Tim, good to see you. Come up there. Which camp are you in, by the way, traitor, hero? I have a theory on this, but —

27:18 Tim Berners-Lee: I've given much longer answers to that question, but hero, if I have to make the choice between the two.

27:27 CA: And Ed, I think you've read the proposal that Sir Tim has talked about, about a new Magna Carta to take back the Internet. Is that something that makes sense?

27:38 ES: Absolutely. I mean, my generation, I grew up not just thinking about the Internet, but I grew up in the Internet, and although I never expected to have the chance to defend it in such a direct and practical manner and to embody it in this unusual, almost avatar manner, I think there's something poetic about the fact that one of the sons of the Internet has actually become close to the Internet as a result of their political expression. And I believe that a Magna Carta for the Internet is exactly what we need. We need to encode our values not just in writing but in the structure of the Internet, and it's something that I hope, I invite everyone in the audience, not just here in Vancouver but around the world, to join and participate in.
28:35 CA: Do you have a question for Ed?

28:37 TBL: Well, two questions, a general question —

28:40 CA: Ed, can you still hear us?

28:42 ES: Yes, I can hear you. CA: Oh, he's back.

28:46 TBL: The wiretap on your line got a little interfered with for a moment. (Laughter)

28:51 ES: It's a little bit of an NSA problem.

28:53 TBL: So, from the 25 years, stepping back and thinking, what would you think would be the best that we could achieve from all the discussions that we have about the web we want?

29:09 ES: When we think about it in terms of how far we can go, I think that's a question that's really only limited by what we're willing to put into it. I think the Internet that we've enjoyed in the past has been exactly what we, as not just a nation but as a people around the world, need, and by cooperating, by engaging not just the technical parts of society, but, as you said, the users, the people around the world who contribute through the Internet, through social media, who just check the weather, who rely on it every day as a part of their life, to champion that, we'll get not just the Internet we've had, but a better Internet, a better now, something that we can use to build a future that'll be better not just than what we hoped for but anything that we could have imagined.
30:07 CA: It's 30 years ago that TED was founded, in 1984. A lot of the conversation since then has been along the lines that actually George Orwell got it wrong. It's not Big Brother watching us. We, through the power of the web and transparency, are now watching Big Brother. Your revelations kind of drove a stake through the heart of that rather optimistic view, but you still believe there's a way of doing something about that. And you do too.

30:37 ES: Right, so there is an argument to be made that the powers of Big Brother have increased enormously. There was a recent legal article at Yale that established something called the Bankston-Soltani Principle, which is that our expectation of privacy is violated when the capabilities of government surveillance have become cheaper by an order of magnitude, and each time that occurs, we need to revisit and rebalance our privacy rights.
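The principle can be read as a simple numeric trigger. A sketch of that reading, with made-up placeholder costs (the article's actual estimates are not reproduced here):

    import math

    # Hypothetical cost to follow one person for one hour, in dollars.
    cost_per_hour = {
        "five-agent foot surveillance": 250.00,
        "single covert car pursuit": 100.00,
        "cell-site location records": 5.00,
        "GPS tracker on a vehicle": 0.35,
    }

    baseline = cost_per_hour["five-agent foot surveillance"]
    for technique, cost in cost_per_hour.items():
        magnitudes = math.log10(baseline / cost)  # how many 10x drops
        verdict = "revisit privacy rules" if magnitudes >= 1 else "comparable cost"
        print(f"{technique:30s} ${cost:7.2f}/h  {magnitudes:4.1f} OoM  {verdict}")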
31:11 Now, that hasn't happened, even though the government's surveillance powers have increased by several orders of magnitude, and that's why we're in the problem that we're in today. But there is still hope, because the power of individuals has also been increased by technology. I am living proof that an individual can go head to head against the most powerful adversaries and the most powerful intelligence agencies around the world and win, and I think that's something that we need to take hope from, and we need to build on to make it accessible not just to technical experts but to ordinary citizens around the world. Journalism is not a crime, communication is not a crime, and we should not be monitored in our everyday activities.
31:59 CA: I'm not quite sure how you shake the hand of a bot, but I imagine it's, this is the hand right here. TBL: That'll come very soon.

32:07 ES: Nice to meet you, and I hope my beam looks as nice as my view of you guys does.

32:13 CA: Thank you, Tim. (Applause)
32:21 I mean, The New York Times recently called for an amnesty for you. Would you welcome the chance to come back to America?

32:30 ES: Absolutely. There's really no question, the principles that have been the foundation of this project have been the public interest and the principles that underlie the journalistic establishment in the United States and around the world, and I think if the press is now saying, we support this, this is something that needed to happen, that's a powerful argument, but it's not the final argument, and I think that's something the public should decide.

33:06 But at the same time, the government has hinted that they want some kind of deal, that they want me to compromise the journalists with whom I've been working, to come back, and I want to make it very clear that I did not do this to be safe. I did this to do what was right, and I'm not going to stop my work in the public interest just to benefit myself. (Applause)
33:36 CA: In the meantime, courtesy of the Internet and this technology, you're here, back in North America, not quite the U.S., Canada, in this form. I'm curious, how does that feel?

33:52 ES: Canada is different than what I expected. It's a lot warmer. (Laughter)
34:02 CA: At TED, the mission is "ideas worth spreading." If you could encapsulate it in a single idea, what is your idea worth spreading right now at this moment?

34:14 ES: I would say the last year has been a reminder that democracy may die behind closed doors, but we as individuals are born behind those same closed doors, and we don't have to give up our privacy to have good government. We don't have to give up our liberty to have security. And I think by working together we can have both open government and private lives, and I look forward to working with everyone around the world to see that happen. Thank you very much.

34:48 CA: Ed, thank you. (Applause)