Recorded at | May 14, 2011 |
---|---|
Event | TEDxSiliconValley |
Duration (min:sec) | 15:56 |
Video Type | TEDx Talk |
Words per minute | 196.62 (fast) |
Readability (FK) | 72.7 (very easy) |
Speaker | Damon Horowitz |
Description | philosopher, entrepreneur |
Synopsis
Damon Horowitz reviews the enormous new powers that technology gives us: to know more -- and more about each other -- than ever before. Drawing the audience into a philosophical discussion, Horowitz invites us to pay new attention to the basic philosophy -- the ethical principles -- behind the burst of invention remaking our world. Where's the moral operating system that allows us to make sense of it?
1 | 00:15 | Power. | ||
2 | 00:17 | That is the word that comes to mind. | ||
3 | 00:19 | We're the new technologists. | ||
4 | 00:21 | We have a lot of data, so we have a lot of power. | ||
5 | 00:24 | How much power do we have? | ||
6 | 00:26 | Scene from a movie: "Apocalypse Now" -- great movie. | ||
7 | 00:29 | We've got to get our hero, Captain Willard, to the mouth of the Nung River so he can go pursue Colonel Kurtz. | ||
8 | 00:34 | The way we're going to do this is fly him in and drop him off. | ||
9 | 00:36 | So the scene: the sky is filled with this fleet of helicopters carrying him in. | ||
10 | 00:41 | And there's this loud, thrilling music in the background, this wild music. | ||
11 | 00:45 | ♫ Dum da ta da dum ♫ ♫ Dum da ta da dum ♫ ♫ Da ta da da ♫ That's a lot of power. | ||
12 | 00:54 | That's the kind of power I feel in this room. | ||
13 | 00:56 | That's the kind of power we have because of all of the data that we have. | ||
14 | 01:00 | Let's take an example. | ||
15 | 01:02 | What can we do with just one person's data? | ||
16 | 01:07 | What can we do with that guy's data? | ||
17 | 01:11 | I can look at your financial records. | ||
18 | 01:13 | I can tell if you pay your bills on time. | ||
19 | 01:15 | I know if you're good to give a loan to. | ||
20 | 01:17 | I can look at your medical records; I can see if your pump is still pumping -- see if you're good to offer insurance to. | ||
21 | 01:23 | I can look at your clicking patterns. | ||
22 | 01:25 | When you come to my website, I actually know what you're going to do already because I've seen you visit millions of websites before. | ||
23 | 01:30 | And I'm sorry to tell you, you're like a poker player, you have a tell. | ||
24 | 01:34 | I can tell with data analysis what you're going to do before you even do it. | ||
25 | 01:38 | I know what you like. I know who you are, and that's even before I look at your mail or your phone. | ||
26 | 01:45 | Those are the kinds of things we can do with the data that we have. | ||
27 | 01:50 | But I'm not actually here to talk about what we can do. | ||
28 | 01:56 | I'm here to talk about what we should do. | ||
29 | 02:00 | What's the right thing to do? | ||
30 | 02:04 | Now I see some puzzled looks like, "Why are you asking us what's the right thing to do? | ||
31 | 02:09 | We're just building this stuff. Somebody else is using it." | ||
32 | 02:12 | Fair enough. | ||
33 | 02:15 | But it brings me back. | ||
34 | 02:17 | I think about World War II -- some of our great technologists then, some of our great physicists, studying nuclear fission and fusion -- just nuclear stuff. | ||
35 | 02:27 | We gather together these physicists in Los Alamos to see what they'll build. | ||
36 | 02:33 | We want the people building the technology thinking about what we should be doing with the technology. | ||
37 | 02:41 | So what should we be doing with that guy's data? | ||
38 | 02:44 | Should we be collecting it, gathering it, so we can make his online experience better? | ||
39 | 02:49 | So we can make money? | ||
40 | 02:51 | So we can protect ourselves if he was up to no good? | ||
41 | 02:55 | Or should we respect his privacy, protect his dignity and leave him alone? | ||
42 | 03:02 | Which one is it? | ||
43 | 03:05 | How should we figure it out? | ||
44 | 03:07 | I know: crowdsource. Let's crowdsource this. | ||
45 | 03:11 | So to get people warmed up, let's start with an easy question -- something I'm sure everybody here has an opinion about: iPhone versus Android. | ||
46 | 03:21 | Let's do a show of hands -- iPhone. | ||
47 | 03:24 | Uh huh. | ||
48 | 03:26 | Android. | ||
49 | 03:29 | You'd think with a bunch of smart people we wouldn't be such suckers just for the pretty phones. | ||
50 | 03:33 | (Laughter) | ||
51 | 03:35 | Next question, a little bit harder. | ||
52 | 03:39 | Should we be collecting all of that guy's data to make his experiences better and to protect ourselves in case he's up to no good? | ||
53 | 03:46 | Or should we leave him alone? | ||
54 | 03:48 | Collect his data. | ||
55 | 03:53 | Leave him alone. | ||
56 | 03:56 | You're safe. It's fine. | ||
57 | 03:58 | (Laughter) | ||
58 | 04:00 | Okay, last question -- harder question -- when trying to evaluate what we should do in this case, should we use a Kantian deontological moral framework, or should we use a Millian consequentialist one? | ||
59 | 04:19 | Kant. Mill. | ||
60 | 04:25 | Not as many votes. | ||
61 | 04:27 | (Laughter) | ||
62 | 04:30 | Yeah, that's a terrifying result. | ||
63 | 04:34 | Terrifying, because we have stronger opinions about our hand-held devices than about the moral framework we should use to guide our decisions. | ||
64 | 04:44 | How do we know what to do with all the power we have if we don't have a moral framework? | ||
65 | 04:50 | We know more about mobile operating systems, but what we really need is a moral operating system. | ||
66 | 04:58 | What's a moral operating system? | ||
67 | 05:00 | We all know right and wrong, right? | ||
68 | 05:02 | You feel good when you do something right, you feel bad when you do something wrong. | ||
69 | 05:06 | Our parents teach us that: praise with the good, scold with the bad. | ||
70 | 05:09 | But how do we figure out what's right and wrong? | ||
71 | 05:12 | And from day to day, we have the techniques that we use. | ||
72 | 05:15 | Maybe we just follow our gut. | ||
73 | 05:18 | Maybe we take a vote -- we crowdsource. | ||
74 | 05:21 | Or maybe we punt -- ask the legal department, see what they say. | ||
75 | 05:26 | In other words, it's kind of random, kind of ad hoc, how we figure out what we should do. | ||
76 | 05:33 | And maybe, if we want to be on surer footing, what we really want is a moral framework that will help guide us there, that will tell us what kinds of things are right and wrong in the first place, and how we would know in a given situation what to do. | ||
77 | 05:46 | So let's get a moral framework. | ||
78 | 05:48 | We're numbers people, living by numbers. | ||
79 | 05:51 | How can we use numbers as the basis for a moral framework? | ||
80 | 05:56 | I know a guy who did exactly that. | ||
81 | 05:59 | A brilliant guy -- he's been dead 2,500 years. | ||
82 | 06:05 | Plato, that's right. | ||
83 | 06:07 | Remember him -- old philosopher? | ||
84 | 06:09 | You were sleeping during that class. | ||
85 | 06:12 | And Plato, he had a lot of the same concerns that we did. | ||
86 | 06:14 | He was worried about right and wrong. | ||
87 | 06:16 | He wanted to know what is just. | ||
88 | 06:18 | But he was worried that all we seem to be doing is trading opinions about this. | ||
89 | 06:22 | He says something's just. She says something else is just. | ||
90 | 06:25 | It's kind of convincing when he talks and when she talks too. | ||
91 | 06:27 | I'm just going back and forth; I'm not getting anywhere. | ||
92 | 06:29 | I don't want opinions; I want knowledge. | ||
93 | 06:32 | I want to know the truth about justice -- like we have truths in math. | ||
94 | 06:38 | In math, we know the objective facts. | ||
95 | 06:41 | Take a number, any number -- two. | ||
96 | 06:43 | Favorite number. I love that number. | ||
97 | 06:45 | There are truths about two. | ||
98 | 06:47 | If you've got two of something, you add two more, you get four. | ||
99 | 06:51 | That's true no matter what thing you're talking about. | ||
100 | 06:53 | It's an objective truth about the form of two, the abstract form. | ||
101 | 06:57 | When you have two of anything -- two eyes, two ears, two noses, just two protrusions -- those all partake of the form of two. | ||
102 | 07:04 | They all participate in the truths that two has. | ||
103 | 07:08 | They all have two-ness in them. | ||
104 | 07:10 | And therefore, it's not a matter of opinion. | ||
105 | 07:13 | What if, Plato thought, ethics was like math? | ||
106 | 07:17 | What if there were a pure form of justice? | ||
107 | 07:20 | What if there are truths about justice, and you could just look around in this world and see which things participated, partook of that form of justice? | ||
108 | 07:29 | Then you would know what was really just and what wasn't. | ||
109 | 07:32 | It wouldn't be a matter of just opinion or just appearances. | ||
110 | 07:37 | That's a stunning vision. | ||
111 | 07:39 | I mean, think about that. How grand. How ambitious. | ||
112 | 07:42 | That's as ambitious as we are. | ||
113 | 07:44 | He wants to solve ethics. | ||
114 | 07:46 | He wants objective truths. | ||
115 | 07:48 | If you think that way, you have a Platonist moral framework. | ||
116 | 07:54 | If you don't think that way, well, you have a lot of company in the history of Western philosophy, because the tidy idea, you know, people criticized it. | ||
117 | 08:01 | Aristotle, in particular, he was not amused. | ||
118 | 08:04 | He thought it was impractical. | ||
119 | 08:07 | Aristotle said, "We should seek only so much precision in each subject as that subject allows." | ||
120 | 08:13 | Aristotle thought ethics wasn't a lot like math. | ||
121 | 08:16 | He thought ethics was a matter of making decisions in the here-and-now using our best judgment to find the right path. | ||
122 | 08:23 | If you think that, Plato's not your guy. | ||
123 | 08:25 | But don't give up. | ||
124 | 08:27 | Maybe there's another way that we can use numbers as the basis of our moral framework. | ||
125 | 08:33 | How about this: What if in any situation you could just calculate, look at the choices, measure out which one's better and know what to do? | ||
126 | 08:43 | That sound familiar? | ||
127 | 08:45 | That's a utilitarian moral framework. | ||
128 | 08:48 | John Stuart Mill was a great advocate of this -- nice guy besides -- and only been dead 200 years. | ||
129 | 08:54 | So basis of utilitarianism -- | ||
130 | 08:56 | I'm sure you're familiar at least. | ||
131 | 08:58 | The three people who voted for Mill before are familiar with this. | ||
132 | 09:00 | But here's the way it works. | ||
133 | 09:02 | What if morals, what if what makes something moral is just a matter of if it maximizes pleasure and minimizes pain? | ||
134 | 09:09 | It's something intrinsic to the act. | ||
135 | 09:12 | It's not like its relation to some abstract form. | ||
136 | 09:14 | It's just a matter of the consequences. | ||
137 | 09:16 | You just look at the consequences and see if, overall, it's for the good or for the worse. | ||
138 | 09:20 | That would be simple. Then we know what to do. | ||
139 | 09:22 | Let's take an example. | ||
140 | 09:24 | Suppose I go up and I say, "I'm going to take your phone." | ||
141 | 09:28 | Not just because it rang earlier, but I'm going to take it because I made a little calculation. | ||
142 | 09:33 | I thought, that guy looks suspicious. | ||
143 | 09:36 | And what if he's been sending little messages to Bin Laden's hideout -- or whoever took over after Bin Laden -- and he's actually like a terrorist, a sleeper cell. | ||
144 | 09:44 | I'm going to find that out, and when I find that out, I'm going to prevent a huge amount of damage that he could cause. | ||
145 | 09:50 | That has a very high utility to prevent that damage. | ||
146 | 09:53 | And compared to the little pain that it's going to cause -- because it's going to be embarrassing when I'm looking on his phone and seeing that he has a Farmville problem and that whole bit -- that's overwhelmed by the value of looking at the phone. | ||
147 | 10:05 | If you feel that way, that's a utilitarian choice. | ||
148 | 10:10 | But maybe you don't feel that way either. | ||
149 | 10:13 | Maybe you think, it's his phone. | ||
150 | 10:15 | It's wrong to take his phone because he's a person and he has rights and he has dignity, and we can't just interfere with that. | ||
151 | 10:23 | He has autonomy. | ||
152 | 10:25 | It doesn't matter what the calculations are. | ||
153 | 10:27 | There are things that are intrinsically wrong -- like lying is wrong, like torturing innocent children is wrong. | ||
154 | 10:35 | Kant was very good on this point, and he said it a little better than I'll say it. | ||
155 | 10:40 | He said we should use our reason to figure out the rules by which we should guide our conduct, and then it is our duty to follow those rules. | ||
156 | 10:48 | It's not a matter of calculation. | ||
157 | 10:51 | So let's stop. | ||
158 | 10:53 | We're right in the thick of it, this philosophical thicket. | ||
159 | 10:56 | And this goes on for thousands of years, because these are hard questions, and I've only got 15 minutes. | ||
160 | 11:03 | So let's cut to the chase. | ||
161 | 11:05 | How should we be making our decisions? | ||
162 | 11:09 | Is it Plato, is it Aristotle, is it Kant, is it Mill? | ||
163 | 11:12 | What should we be doing? What's the answer? | ||
164 | 11:14 | What's the formula that we can use in any situation to determine what we should do, whether we should use that guy's data or not? | ||
165 | 11:21 | What's the formula? | ||
166 | 11:25 | There's not a formula. | ||
167 | 11:29 | There's not a simple answer. | ||
168 | 11:31 | Ethics is hard. | ||
169 | 11:34 | Ethics requires thinking. | ||
170 | 11:38 | And that's uncomfortable. | ||
171 | 11:40 | I know; I spent a lot of my career in artificial intelligence, trying to build machines that could do some of this thinking for us, that could give us answers. | ||
172 | 11:49 | But they can't. | ||
173 | 11:51 | You can't just take human thinking and put it into a machine. | ||
174 | 11:55 | We're the ones who have to do it. | ||
175 | 11:58 | Happily, we're not machines, and we can do it. | ||
176 | 12:01 | Not only can we think, we must. | ||
177 | 12:05 | Hannah Arendt said, "The sad truth is that most evil done in this world is not done by people who choose to be evil. | ||
178 | 12:15 | It arises from not thinking." | ||
179 | 12:18 | That's what she called the "banality of evil." | ||
180 | 12:22 | And the response to that is that we demand the exercise of thinking from every sane person. | ||
181 | 12:29 | So let's do that. Let's think. | ||
182 | 12:31 | In fact, let's start right now. | ||
183 | 12:34 | Every person in this room do this: think of the last time you had a decision to make where you were worried about doing the right thing, where you wondered, "What should I be doing?" Bring that to mind, and now reflect on that and say, "How did I come to that decision? | ||
184 | 12:51 | What did I do? Did I follow my gut? | ||
185 | 12:54 | Did I have somebody vote on it? Or did I punt to legal?" | ||
186 | 12:56 | Or now we have a few more choices. | ||
187 | 12:59 | "Did I evaluate what would be the highest pleasure like Mill would? | ||
188 | 13:03 | Or like Kant, did I use reason to figure out what was intrinsically right?" | ||
189 | 13:06 | Think about it. Really bring it to mind. This is important. | ||
190 | 13:09 | It is so important we are going to spend 30 seconds of valuable TEDTalk time doing nothing but thinking about this. | ||
191 | 13:15 | Are you ready? Go. | ||
192 | 13:33 | Stop. Good work. | ||
193 | 13:36 | What you just did, that's the first step towards taking responsibility for what we should do with all of our power. | ||
194 | 13:45 | Now the next step -- try this. | ||
195 | 13:49 | Go find a friend and explain to them how you made that decision. | ||
196 | 13:53 | Not right now. Wait till I finish talking. | ||
197 | 13:55 | Do it over lunch. | ||
198 | 13:57 | And don't just find another technologist friend; find somebody different than you. | ||
199 | 14:02 | Find an artist or a writer -- or, heaven forbid, find a philosopher and talk to them. | ||
200 | 14:07 | In fact, find somebody from the humanities. | ||
201 | 14:09 | Why? Because they think about problems differently than we do as technologists. | ||
202 | 14:13 | Just a few days ago, right across the street from here, there were hundreds of people gathered together. | ||
203 | 14:18 | It was technologists and humanists at that big BiblioTech Conference. | ||
204 | 14:22 | And they gathered together because the technologists wanted to learn what it would be like to think from a humanities perspective. | ||
205 | 14:29 | You have someone from Google talking to someone who does comparative literature. | ||
206 | 14:33 | You're thinking about the relevance of 17th century French theater -- how does that bear upon venture capital? | ||
207 | 14:38 | Well that's interesting. That's a different way of thinking. | ||
208 | 14:41 | And when you think in that way, you become more sensitive to the human considerations, which are crucial to making ethical decisions. | ||
209 | 14:49 | So imagine that right now you went and you found your musician friend. | ||
210 | 14:53 | And you're telling him what we're talking about, about our whole data revolution and all this -- maybe even hum a few bars of our theme music. | ||
211 | 15:00 | ♫ Dum ta da da dum dum ta da da dum ♫ | ||
212 | 15:03 | Well, your musician friend will stop you and say, "You know, the theme music for your data revolution, that's an opera, that's Wagner. | ||
213 | 15:11 | It's based on Norse legend. | ||
214 | 15:13 | It's gods and mythical creatures fighting over magical jewelry." | ||
215 | 15:19 | That's interesting. | ||
216 | 15:22 | Now it's also a beautiful opera, and we're moved by that opera. | ||
217 | 15:28 | We're moved because it's about the battle between good and evil, about right and wrong. | ||
218 | 15:34 | And we care about right and wrong. | ||
219 | 15:36 | We care what happens in that opera. | ||
220 | 15:39 | We care what happens in "Apocalypse Now." | ||
221 | 15:42 | And we certainly care what happens with our technologies. | ||
222 | 15:46 | We have so much power today, it is up to us to figure out what to do, and that's the good news. | ||
223 | 15:53 | We're the ones writing this opera. | ||
224 | 15:56 | This is our movie. | ||
225 | 15:58 | We figure out what will happen with this technology. | ||
226 | 16:01 | We determine how this will all end. | ||
227 | 16:04 | Thank you. | ||
228 | 16:06 | (Applause) |