Recorded at | October 22, 2020 |
---|---|
Event | TED Salon Dell Technologies |
Duration (min:sec) | 14:34 |
Video Type | TED Salon Talk (partner) |
Words per minute | 197.92 (fast)
Readability (FK) | 52.09 (medium)
Speaker | Genevieve Bell |
Country | Australia |
Occupation | anthropologist |
Description | Australian anthropologist |
Synopsis
Artificial intelligence is all around us ... and the future will only bring more of it. How can we ensure the AI systems we build are responsible, safe and sustainable? Ethical AI expert Genevieve Bell shares six framing questions to broaden our understanding of future technology -- and create the next generation of critical thinkers and doers.
1 | 00:13 | Let me tell you a story about artificial intelligence. | ||
2 | 00:16 | There's a building in Sydney at 1 Bligh Street. | ||
3 | 00:19 | It houses lots of government departments and busy people. | ||
4 | 00:23 | From the outside, it looks like something out of American science fiction: all gleaming glass and curved lines, and a piece of orange sculpture. | ||
5 | 00:31 | On the inside, it has excellent coffee on the ground floor and my favorite lifts in Sydney. | ||
6 | 00:36 | They're beautiful; they look almost alive. | ||
7 | 00:40 | And it turns out I'm fascinated with lifts. | ||
8 | 00:42 | For lots of reasons. | ||
9 | 00:43 | But because lifts are one of the places you can see the future. | ||
10 | 00:46 | In the 21st century, lifts are interesting because they're one of the first places that AI will touch you without you even knowing it happened. | ||
11 | 00:54 | In many buildings all around the world, the lifts are running a set of algorithms. | ||
12 | 01:00 | A form of proto-artificial intelligence. | ||
13 | 01:03 | That means before you even walk up to the lift to press the button, it's anticipated you being there. | ||
14 | 01:09 | It's already rearranging all the carriages. | ||
15 | 01:12 | Always going down, to save energy, and to know where the traffic is going to be. | ||
16 | 01:16 | By the time you've actually pressed the button, you're already part of an entire system that's making sense of people and the environment and the building and the built world. | ||
17 | 01:25 | I know when we talk about AI, we often talk about a world of robots. | ||
18 | 01:29 | It's easy for our imaginations to be occupied with science fiction, well, over the last 100 years. | ||
19 | 01:35 | I say AI and you think "The Terminator." | ||
20 | 01:38 | Somewhere, for us, making the connection between AI and the built world, that's a harder story to tell. | ||
21 | 01:45 | But the reality is AI is already everywhere around us. | ||
22 | 01:49 | And in many places. | ||
23 | 01:50 | It's in buildings and in systems. | ||
24 | 01:53 | More than 200 years of industrialization suggest that AI will find its way to systems-level scale relatively easily. | ||
25 | 02:00 | After all, one telling of that history suggests that all you have to do is find a technology, achieve scale and revolution will follow. | ||
26 | 02:07 | The story of mechanization, automation and digitization all point to the role of technology and its importance. | ||
27 | 02:16 | Those stories of technological transformation make scale seem, well, normal. | ||
28 | 02:21 | Or expected. | ||
29 | 02:23 | And stable. | ||
30 | 02:24 | And sometimes even predictable. | ||
31 | 02:27 | But it also puts the focus squarely on technology and technology change. | ||
32 | 02:31 | But I believe that scaling a technology and building a system requires something more. | ||
33 | 02:38 | We founded the 3Ai Institute at the Australian National University in September 2017. | ||
34 | 02:44 | It has one deceptively simple mission: to establish a new branch of engineering to take AI safely, sustainably and responsibly to scale. | ||
35 | 02:53 | But how do you build a new branch of engineering in the 21st century? | ||
36 | 02:56 | Well, we're teaching it into existence through an experimental education program. | ||
37 | 03:01 | We're researching it into existence with locations as diverse as Shakespeare's birthplace, the Great Barrier Reef, not to mention one of Australia's largest autonomous mines. | ||
38 | 03:11 | And we're theorizing it into existence, paying attention to the complexities of cybernetic systems. | ||
39 | 03:18 | We're working to build something new and something useful. | ||
40 | 03:21 | Something to create the next generation of critical thinkers and critical doers. | ||
41 | 03:25 | And we're doing all of that through a richer understanding of AI's many pasts and many stories. | ||
42 | 03:31 | And by working collaboratively and collectively through teaching and research and engagement, and by focusing as much on the framing of the questions as the solving of the problems. | ||
43 | 03:43 | We're not making a single AI, we're making the possibilities for many. | ||
44 | 03:48 | And we're actively working to decolonize our imaginations and to build a curriculum and a pedagogy that leaves room for a range of different conversations and possibilities. | ||
45 | 03:58 | We are making and remaking. | ||
46 | 04:00 | And I know we're always a work in progress. | ||
47 | 04:04 | But here's a little glimpse into how we're approaching that problem of scaling a future. | ||
48 | 04:09 | We start by making sure we're grounded in our own history. | ||
49 | 04:13 | In December of 2018, I took myself up to the town of Brewarrina on the New South Wales-Queensland border. | ||
50 | 04:19 | This place was a meeting place for Aboriginal people, for different groups, to gather, have ceremonies, meet, to be together. | ||
51 | 04:26 | There, on the Barwon River, there's a set of fish weirs that are one of the oldest and largest systems of Aboriginal fish traps in Australia. | ||
52 | 04:34 | This system is comprised of 1.8 kilometers of stone walls shaped like a series of fishnets with the "U's" pointing down the river, allowing fish to be trapped at different heights of the water. | ||
53 | 04:44 | There are also fish-holding pens with different-height walls for storage, designed to change the way the water moves and to be able to store big fish and little fish and to keep those fish in cool, clear running water. | ||
54 | 04:56 | This fish-trap system was a way to ensure that you could feed people as they gathered there in a place that was both a meeting of rivers and a meeting of cultures. | ||
55 | 05:04 | It isn't about the rocks or even the traps per se. | ||
56 | 05:08 | It is about the system that those traps created. | ||
57 | 05:11 | One that involves technical knowledge, cultural knowledge and ecological knowledge. | ||
58 | 05:16 | This system is old. | ||
59 | 05:18 | Some archaeologists think it's as old as 40,000 years. | ||
60 | 05:21 | The last time we have its recorded use is in the 1910s. | ||
61 | 05:26 | It's had remarkable longevity and incredible scale. | ||
62 | 05:29 | And it's an inspiration to me. | ||
63 | 05:32 | And a photo of the weir is on our walls here at the Institute, to remind us of the promise and the challenge of building something meaningful. | ||
64 | 05:39 | And to remind us that we're building systems in a place where people have built systems and sustained those same systems for generations. | ||
65 | 05:46 | It isn't just our history, it's our legacy as we seek to establish a new branch of engineering. | ||
66 | 05:52 | To build on that legacy and our sense of purpose, I think we need a clear framework for asking questions about the future. | ||
67 | 05:59 | Questions for which there aren't ready or easy answers. | ||
68 | 06:03 | Here, the point is the asking of the questions. | ||
69 | 06:06 | We believe you need to go beyond the traditional approach of problem-solving, to the more complicated one of question asking and question framing. | ||
70 | 06:15 | Because in so doing, you open up all kinds of new possibilities and new challenges. | ||
71 | 06:20 | For me, right now, there are six big questions that frame our approach for taking AI safely, sustainably and responsibly to scale. | ||
72 | 06:28 | Questions about autonomy, agency, assurance, indicators, interfaces and intentionality. | ||
73 | 06:36 | The first question we ask is a simple one. | ||
74 | 06:39 | Is the system autonomous? | ||
75 | 06:41 | Think back to that lift on Bligh Street. | ||
76 | 06:43 | The reality is, one day, that lift may be autonomous. | ||
77 | 06:46 | Which is to say it will be able to act without being told to act. | ||
78 | 06:50 | But it isn't fully autonomous, right? | ||
79 | 06:52 | It can't leave that Bligh Street building and wander down to Circular Quay for a beer. | ||
80 | 06:58 | It goes up and down, that's all. | ||
81 | 07:00 | But it does it by itself. | ||
82 | 07:02 | It's autonomous in that sense. | ||
83 | 07:05 | The second question we ask: does this system have agency? | ||
84 | 07:10 | Does this system have controls and limits that live somewhere that prevent it from doing certain kinds of things under certain conditions? | ||
85 | 07:19 | The reality with lifts, that's absolutely the case. | ||
86 | 07:22 | Think of any lift you've been in. | ||
87 | 07:24 | There's a red key slot in the elevator carriage that an emergency services person can stick a key into and override the whole system. | ||
88 | 07:31 | But what happens when that system is AI-driven? | ||
89 | 07:34 | Where does the key live? | ||
90 | 07:35 | Is it a physical key, is it a digital key? | ||
91 | 07:37 | Who gets to use it? | ||
92 | 07:39 | Is that the emergency services people? | ||
93 | 07:40 | And how would you know if that was happening? | ||
94 | 07:43 | How would all of that be manifested to you in the lift? | ||
95 | 07:47 | The third question we ask is: How do we think about assurance? | ||
96 | 07:51 | How do we think about all of its pieces: safety, security, trust, risk, liability, manageability, explicability, ethics, public policy, law, regulation? | ||
97 | 08:01 | And how would we tell you that the system was safe and functioning? | ||
98 | 08:06 | The fourth question we ask is: What would be our interfaces with these AI-driven systems? | ||
99 | 08:11 | Will we talk to them? | ||
100 | 08:12 | Will they talk to us, will they talk to each other? | ||
101 | 08:14 | And what will it mean to have a series of technologies we've known, for some of us, all our lives, now suddenly behave in entirely different ways? | ||
102 | 08:21 | Lifts, cars, the electrical grid, traffic lights, things in your home. | ||
103 | 08:27 | The fifth question for these AI-driven systems: What will the indicators be to show that they're working well? | ||
104 | 08:33 | Two hundred years of the industrial revolution tells us that the two most important ways to think about a good system are productivity and efficiency. | ||
105 | 08:41 | In the 21st century, you might want to expand that just a little bit. | ||
106 | 08:45 | Is the system sustainable, is it safe, is it responsible? | ||
107 | 08:48 | Who gets to judge those things for us? | ||
108 | 08:51 | Users of the systems would want to understand how these things are regulated, managed and built. | ||
109 | 08:57 | And then there's the final, perhaps most critical question that you need to ask of these new AI systems. | ||
110 | 09:03 | What's its intent? | ||
111 | 09:05 | What's the system designed to do and who said that was a good idea? | ||
112 | 09:09 | Or put another way, what is the world that this system is building, how is that world imagined, and what is its relationship to the world we live in today? | ||
113 | 09:18 | Who gets to be part of that conversation? | ||
114 | 09:20 | Who gets to articulate it? | ||
115 | 09:22 | How does it get framed and imagined? | ||
116 | 09:26 | There are no simple answers to these questions. | ||
117 | 09:29 | Instead, they frame what's possible and what we need to imagine, design, build, regulate and even decommission. | ||
118 | 09:37 | They point us in the right directions and help us on a path to establish a new branch of engineering. | ||
119 | 09:42 | But critical questions aren't enough. | ||
120 | 09:46 | You also need a way of holding all those questions together. | ||
121 | 09:50 | For us at the Institute, we're also really interested in how to think about AI as a system, and where and how to draw the boundaries of that system. | ||
122 | 09:59 | And those feel like especially important things right now. | ||
123 | 10:03 | Here, we're influenced by the work that was started way back in the 1940s. | ||
124 | 10:07 | In 1944, along with anthropologists Gregory Bateson and Margaret Mead, mathematician Norbert Wiener convened a series of conversations that would become known as the Macy Conferences on Cybernetics. | ||
125 | 10:18 | Ultimately, between 1946 and 1953, ten conferences were held under the banner of cybernetics. | ||
126 | 10:25 | As defined by Norbert Wiener, cybernetics sought to "develop a language and techniques that will enable us to indeed attack the problem of control and communication in advanced computing technologies." | ||
127 | 10:38 | Cybernetics argued persuasively that one had to think about the relationship between humans, computers and the broader ecological world. | ||
128 | 10:46 | You had to think about them as a holistic system. | ||
129 | 10:49 | Participants in the Macy Conferences were concerned with how the mind worked, with ideas about intelligence and learning, and about the role of technology in our future. | ||
130 | 10:57 | Sadly, the conversations that started with the Macy Conferences are often forgotten when the talk is about AI. | ||
131 | 11:03 | But for me, there's something really important to reclaim here about the idea of a system that has to accommodate culture, technology and the environment. | ||
132 | 11:13 | At the Institute, that sort of systems thinking is core to our work. | ||
133 | 11:17 | Over the last three years, a whole collection of amazing people have joined me here on this crazy journey to do this work. | ||
134 | 11:24 | Our staff includes anthropologists, systems and environmental engineers, and computer scientists, as well as a nuclear physicist, an award-winning photojournalist, and at least one policy and standards expert. | ||
135 | 11:37 | It's a heady mix. | ||
136 | 11:39 | And the range of experience and expertise is powerful, as are the conflicts and the challenges. | ||
137 | 11:45 | Being diverse requires a constant willingness to find ways to hold people in conversation. | ||
138 | 11:50 | And to dwell just a little bit with the conflict. | ||
139 | 11:53 | We also worked out early that the way to build a new way of doing things would require a commitment to bringing others along on that same journey with us. | ||
140 | 12:02 | So we opened our doors to an education program very quickly, and we launched our first master's program in 2018. | ||
141 | 12:08 | Since then, we've had two cohorts of master's students and one cohort of PhD students. | ||
142 | 12:13 | Our students come from all over the world and all over life. | ||
143 | 12:16 | Australia, New Zealand, Nigeria, Nepal, Mexico, India, the United States. | ||
144 | 12:22 | And they range in age from 23 to 60. | ||
145 | 12:24 | They variously had backgrounds in maths and music, policy and performance, systems and standards, architecture and arts. | ||
146 | 12:33 | Before they joined us at the Institute, they ran companies, they worked for government, served in the army, taught high school, and managed arts organizations. | ||
147 | 12:42 | They were adventurers and committed to each other, and to building something new. | ||
148 | 12:47 | And really, what more could you ask for? | ||
149 | 12:50 | Because although I've spent 20 years in Silicon Valley and I know the stories about the lone inventor and the hero's journey, I also know the reality. | ||
150 | 12:58 | That it's never just a hero's journey. | ||
151 | 13:00 | It's always a collection of people who have a shared sense of purpose who can change the world. | ||
152 | 13:06 | So where do you start? | ||
153 | 13:09 | Well, I think you start where you stand. | ||
154 | 13:12 | And for me, that means I want to acknowledge the traditional owners of the land upon which I'm standing. | ||
155 | 13:16 | The Ngunnawal and Ngambri people, this is their land, never ceded, always sacred. | ||
156 | 13:21 | And I pay my respects to the elders, past and present, of this place. | ||
157 | 13:25 | I also acknowledge that we're gathering today in many other places, and I pay my respects to the traditional owners and elders of all those places too. | ||
158 | 13:33 | It means a lot to me to get to say those words and to dwell on what they mean and signal. | ||
159 | 13:38 | And to remember that we live in a country that has been continuously occupied for at least 60,000 years. | ||
160 | 13:44 | Aboriginal people built worlds here, they built social systems, they built technologies. | ||
161 | 13:49 | They built ways to manage this place and to manage it remarkably over a protracted period of time. | ||
162 | 13:55 | And every moment any one of us stands on a stage as Australians, here or abroad, we carry with us a privilege and a responsibility because of that history. | ||
163 | 14:04 | And it's not just a history. | ||
164 | 14:05 | It's also an incredibly rich set of resources, worldviews and knowledge. | ||
165 | 14:10 | And it should run through all of our bones and it should be the story we always tell. | ||
166 | 14:15 | Ultimately, it's about thinking differently, asking different kinds of questions, looking holistically at the world and the systems, and finding other people who want to be on that journey with you. | ||
167 | 14:26 | Because for me, the only way to actually think about the future and scale is to always be doing it collectively. | ||
168 | 14:33 | And because for me, the notion of humans in it together is one of the ways we get to think about things that are responsible, safe and ultimately, sustainable. | ||
169 | 14:45 | Thank you. | ||