
Talk MP3 + bilingual transcript: Can artificial intelligence be biased too?

https://online2.tingclass.net/lesson/shi0529/10000/10387/tedyp174.mp3

The TingClass (聽力課堂) TED audio column offers the MP3 audio of TED talks together with English-Chinese bilingual transcripts for English learners. The main content of this article is "Talk MP3 + bilingual transcript: Can artificial intelligence be biased too?" We hope you enjoy it!

【Speaker and introduction】Kriti Sharma

Artificial intelligence scientist Kriti Sharma creates AI technology to help tackle some of the toughest social challenges of our time, from domestic violence to sexual health and inequality.

【Talk topic】How to keep human bias out of AI

【English and Chinese subtitles】

Translator: psjmz mz  Reviewer: Jin Ge

00:13

How many decisions have been made about you today, or this week or this year, by artificial intelligence? I build AI for a living so, full disclosure, I'm kind of a nerd. And because I'm kind of a nerd, whenever some new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message freaking out about the future.

你今天,這周,或今年有多少?zèng)Q定是人工智能(AI)做出的?我靠創(chuàng)建AI為生,所以,坦白說,我是個(gè)技術(shù)狂。因?yàn)槲宜闶莻€(gè)技術(shù)狂,每當(dāng)有關(guān)于人工智能要搶走我們的工作這樣的新聞報(bào)道出來,或者機(jī)器人獲得了一個(gè)國(guó)家的公民身份時(shí),我就成了對(duì)未來感到擔(dān)憂的朋友和關(guān)注者發(fā)消息的對(duì)象。

00:46

We see this everywhere. This media panic that our robot overlords are taking over. We could blame Hollywood for that. But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first. So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

這種事情隨處可見。媒體擔(dān)心機(jī)器人正在接管人類的統(tǒng)治。我們可以為此譴責(zé)好萊�塢。但現(xiàn)實(shí)中,這不是我們應(yīng)該關(guān)注的問題。人工智能還有一個(gè)更緊迫的危機(jī),一個(gè)更大的風(fēng)險(xiǎn),需要我們首先應(yīng)對(duì)。所以我們?cè)倩氐竭@個(gè)問題:今天你有多少?zèng)Q定是由人工智能做出的?其中有多少?zèng)Q定是基于你的性別,種族或者背景?

01:25

Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or in my case, Indian marriage bureaus.

算法一直在被用來判斷我們是誰,我們想要什么。在座的人里有些女性知道我在說什么,如果你有上千次被要求看完YouTube上那些懷孕測(cè)試廣告,或者你在臉書的信息流中刷到過生育診所的廣告?;蛘呦裎矣龅降那闆r:印度婚姻介紹所的廣告。

01:50

(Laughter)

(笑聲)

01:52

But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next. I wonder how you'd feel about someone who thought things like this: "A black or Latino person is less likely than a white person to pay off their loan on time." "A person called John makes a better programmer than a person called Mary." "A black man is more likely to be a repeat offender than a white man." You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right? These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans. AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review. But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age. How is that happening?

但人工智能不僅被用來決定我們想要買什么產(chǎn)品,或者我們接下來想刷哪部劇。我想知道你會(huì)怎么看這樣想的人:“黑人或拉丁美洲人比白人更不可能按時(shí)還貸?!薄懊屑s翰的人編程能力要比叫瑪麗的人好?!薄昂谌吮劝兹烁锌赡艹蔀閼T犯?!蹦憧赡茉谙?,“哇,這聽起來像是一個(gè)有嚴(yán)重性別歧視和種族歧視的人?!睂?duì)吧?這些都是人工智能近期做出的真實(shí)決定,基于它從我們?nèi)祟惿砩蠈W(xué)習(xí)到的偏見。人工智能被用來幫助決定你是否能夠得到面試機(jī)會(huì);你應(yīng)該為車險(xiǎn)支付多少費(fèi)用;你的信用分?jǐn)?shù)有多好;甚至你在年度績(jī)效評(píng)估中應(yīng)該得到怎樣的評(píng)分。但這些決定都是通過它對(duì)我們的身份、種族、性別和年齡的假設(shè)過濾出來的。為什么會(huì)這樣?

03:11

Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates. Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision. That's not it.

想象一下人工智能正在幫助一個(gè)人事主管尋找公司下一位科技領(lǐng)袖。目前為止,主管雇傭的大部分是男性。所以人工智能學(xué)到男人比女人更有可能成為程序員,也就更容易做出這樣的判斷:男人比女人更擅長(zhǎng)編程。我們通過人工智能強(qiáng)化了自己的偏見?,F(xiàn)在,它正在篩選掉女性候選人。等等,如果人類招聘主管這樣做,我們會(huì)很憤怒,不允許這樣的事情發(fā)生。這種性別歧視讓人難以接受。然而,不知怎么的,人工智能已經(jīng)凌駕于法律之上,因?yàn)槭菣C(jī)器做的決定。這還沒完。
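A minimal, hypothetical sketch of the feedback loop just described (not from the talk itself), assuming a toy hiring history: a naive "model" that only estimates hire rates per gender from a biased record turns the manager's past behaviour into a screening rule.

```python
# Hypothetical illustration only -- toy data, not any real hiring system.
from collections import Counter

# Assumed history: the manager has mostly hired men so far.
history = [
    ("male", "hired"), ("male", "hired"), ("male", "hired"),
    ("male", "hired"), ("female", "rejected"), ("female", "rejected"),
]

# "Training": estimate the hire rate per gender from the biased record.
hired = Counter(g for g, outcome in history if outcome == "hired")
totals = Counter(g for g, _ in history)
hire_rate = {g: hired[g] / totals[g] for g in totals}

def screen(gender: str, threshold: float = 0.5) -> bool:
    """Pass a candidate through only if the learned hire rate clears the bar."""
    return hire_rate.get(gender, 0.0) >= threshold

print(hire_rate)         # {'male': 1.0, 'female': 0.0}
print(screen("female"))  # False -- yesterday's bias has become today's rule
```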

04:00

We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female. They are designed to be our obedient servants, turning your lights on and off, ordering your shopping. You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, Salesforce Einstein or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.

我們也在強(qiáng)化我們與人工智能互動(dòng)中的偏見。你們使用Siri,Alexa或者Cortana這樣的語音助手有多頻繁?它們有兩點(diǎn)是相同的:第一點(diǎn),它們總是搞錯(cuò)我的名字,第二點(diǎn),它們都有女性特征。它們都被設(shè)計(jì)成順從我們的仆人,開燈關(guān)燈,下單購(gòu)買商品。也有男性的人工智能,但他們傾向于擁有更高的權(quán)力,比如IBM的Watson可以做出商業(yè)決定,還有Salesforce的Einstein,或者機(jī)器人律師ROSS。所以可憐的機(jī)器人,即便是它們也沒能逃脫工作中的性別歧視。

04:43

(Laughter)

(笑聲)

04:44

Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of CEO. The algorithm shows them results of mostly men. And now, they Google personal assistant. As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant. Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of 1950s "Mad Men" secretary. Yay!

想想這兩者如何結(jié)合在一起,又會(huì)影響一個(gè)在當(dāng)今人工智能世界中長(zhǎng)大的孩子。比如他們正在為學(xué)校的一個(gè)項(xiàng)目做一些研究,他們?cè)诠雀枭纤阉髁薈EO的照片。算法向他們展示的大部分是男性。他們又搜索了個(gè)人助手。你可以猜到,它顯示的大部分是女性。然后他們想放點(diǎn)音樂,也許想點(diǎn)些吃的,而現(xiàn)在,他們正對(duì)著一位順從的女聲助手發(fā)號(hào)施令。我們中一些最聰明的人創(chuàng)建了今天的這個(gè)技術(shù)。他們可以用任何他們想要的方式創(chuàng)造技術(shù)。然而,他們卻選擇了上世紀(jì)50年代《廣告狂人》的秘書風(fēng)格。是的,你沒聽錯(cuò)!

05:36

But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics to AI. So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from. I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned.

但還好,不用擔(dān)心,我不會(huì)以告訴你們我們都在走向由性別歧視、種族主義的機(jī)器統(tǒng)治的世界來結(jié)束這場(chǎng)演講。人工智能的好處是,一切都在我們的控制中。我們得告訴人工智能正確的價(jià)值觀,道德觀。所以有三件事我們可以做。第一,我們能夠意識(shí)到自己的偏見和我們身邊機(jī)器的偏見。第二,我們可以確保打造這個(gè)技術(shù)的是背景多樣的團(tuán)隊(duì)。第三,我們必須讓它從豐富的經(jīng)驗(yàn)中學(xué)習(xí)。我可以從我個(gè)人的經(jīng)驗(yàn)來說明前兩點(diǎn)。當(dāng)你在科技行業(yè)工作,并且看起來不像馬克·扎克伯格或埃隆·馬斯克,你的生活會(huì)有點(diǎn)困難,你的能力會(huì)受到質(zhì)疑。

06:27

Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found, when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?" So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender. You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done. And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet, I had to hide my gender in order for my work to be taken seriously.

這只是一個(gè)例子。跟大部分開發(fā)者一樣,我經(jīng)常參加在線科技論壇,分享我的知識(shí)幫助別人。我發(fā)現(xiàn),當(dāng)我用自己的照片,自己的名字登錄時(shí),我傾向于得到這樣的問題或評(píng)論:“你為什么覺得自己有資格談?wù)撊斯ぶ悄??”“你為什么覺得你了解機(jī)器學(xué)習(xí)?”所以,我創(chuàng)建了新的資料頁,這次,我沒有選擇自己的照片,而是選擇了一只帶著噴氣背包的貓。并選擇了一個(gè)無法體現(xiàn)我性別的名字。你能夠大概猜到會(huì)怎么樣,對(duì)吧?于是這次,我不再收到任何居高臨下的評(píng)論,我能夠?qū)P陌压ぷ髯鐾辍_@感覺太糟糕了,伙計(jì)們。我從15歲起就在構(gòu)建機(jī)器人,我有計(jì)算機(jī)科學(xué)領(lǐng)域的幾個(gè)學(xué)位,然而,我不得不隱藏我的性別以讓我的工作被嚴(yán)肅對(duì)待。

07:31

So, what's going on here? Are men just better at technology than women? Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more than men. So this is not about the talent. This is about an elitism in AI that says a programmer needs to look like a certain person. What we really need to do to make AI better is bring people from all kinds of backgrounds. We need people who can write and tell stories to help us create personalities of AI. We need people who can solve problems. We need people who face different challenges and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it. Because, when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.

這是怎么回事呢?男性在科技領(lǐng)域就是強(qiáng)于女性嗎?另一個(gè)研究發(fā)現(xiàn),當(dāng)女性程序員在平臺(tái)上隱藏性別時(shí),像我這樣,她們的代碼被接受的比例比男性高4%。所以這跟能力無關(guān)。這是人工智能領(lǐng)域的精英主義,即程序員看起來得像具備某個(gè)特征的人。讓人工智能變得更好,我們需要切實(shí)地把來自不同背景的人集合到一起。我們需要能夠書寫和講故事的人來幫助我們創(chuàng)建人工智能的個(gè)性。我們需要能夠解決問題的人。我們需要能應(yīng)對(duì)不同挑戰(zhàn)的人,我們需要有人告訴我們什么是真正需要解決的問題,幫助我們找到用技術(shù)解決問題的方法。因?yàn)椋?dāng)不同背景的人走到一起時(shí),當(dāng)我們以正確的方式做事情時(shí),就有無限的可能。

08:38

And that's what I want to end by talking to you about. Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve. So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better. Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get diagnosis on her phone, instead? Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise alarm, get financial and legal advice. These are all real examples of projects that people, including myself, are working on right now, using AI.

這就是我最后想和你們討論的。減少種族歧視的機(jī)器人,減少奪走我們工作的機(jī)器——更多專注于技術(shù)究竟能實(shí)現(xiàn)什么。是的,人工智能世界中,科技世界中的一些能量是關(guān)于你在流媒體中看到的廣告。但更多是朝著讓世界更美好的方向前進(jìn)。想想剛果民主共和國(guó)的一位孕婦,需要走17小時(shí)才能到最近的農(nóng)村產(chǎn)前診所進(jìn)行產(chǎn)檢。如果她在手機(jī)上就能得到診斷會(huì)怎樣呢?或者想象一下人工智能能為1/3面臨家庭暴力的南非女性做什么。如果大聲說出來不安全的話,她們可以通過一個(gè)人工智能服務(wù)來報(bào)警,獲得財(cái)務(wù)和法律咨詢。這些都是包括我在內(nèi),正在使用人工智能的人所做的項(xiàng)目中的真實(shí)案例。

09:45

So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.

我確信在未來的幾天里,又會(huì)有一個(gè)關(guān)于生存威脅的新聞報(bào)道,說機(jī)器人將要接管一切,來?yè)屪吣銈兊墓ぷ鳌?

09:54

(Laughter)

(笑聲)

09:55

And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology. This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get-go. We need people of different genders, races, sexualities and backgrounds. We need women to be the makers and not just the machines who do the makers' bidding. We need to think very carefully what we teach machines, what data we give them, so they don't just repeat our own past mistakes. So I hope I leave you thinking about two things. First, I hope you leave thinking about bias today. And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, that you think and remember that the same technology is assuming that a black man will reoffend. Or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

當(dāng)這樣的事情發(fā)生時(shí),我知道我會(huì)收到同樣對(duì)未來表示擔(dān)憂的信息。但我對(duì)這個(gè)技術(shù)極為樂觀。這是我們重新讓世界變得更平等的機(jī)會(huì)。但要做到這一點(diǎn),我們需要在一開始就以正確的方式構(gòu)建它。我們需要不同性別,種族,性取向和背景的人。我們需要女性成為創(chuàng)造者,而不僅僅是聽從創(chuàng)造者命令的機(jī)器。我們需要仔細(xì)思考我們教給機(jī)器的東西,我們給它們什么數(shù)據(jù),這樣它們就不會(huì)只是重復(fù)我們過去的錯(cuò)誤。所以我希望我留給你們兩個(gè)思考。首先,我希望你們思考當(dāng)今社會(huì)中的偏見。下次當(dāng)你滾動(dòng)刷到認(rèn)為你對(duì)生育診所或者網(wǎng)上投注站有興趣的廣告時(shí),這會(huì)讓你回想起同樣的技術(shù)也在假定黑人會(huì)重復(fù)犯罪?;蛘吲愿赡艹蔀閭€(gè)人助理而非CEO。我希望那會(huì)提醒你,我們需要對(duì)此有所行動(dòng)。

11:20

And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg, you can look like me. And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. And for us all to get education about this phenomenal technology in the future. Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

第二,我希望你們考慮一下這個(gè)事實(shí):你不需要長(zhǎng)成特定的樣子,也不需要有特定的工程或技術(shù)背景,才能去創(chuàng)建人工智能,而人工智能將成為我們未來的一股非凡力量。你不需要看起來像馬克·扎克伯格,你可以看起來像我。我們這個(gè)房間里的所有人都有責(zé)任去說服政府和公司為每個(gè)人創(chuàng)建人工智能技術(shù),包括那些邊緣情況。也讓我們所有人都能在未來接受有關(guān)這項(xiàng)非凡技術(shù)的教育。因?yàn)槿绻覀冏龅搅诉@一點(diǎn),我們也只是剛剛觸及人工智能所能實(shí)現(xiàn)的皮毛。

12:05

Thank you.

謝謝。

12:06

(Applause)

(鼓掌)
