
The Social Dilemma (监视资本主义:智能陷阱) [2020] Movie Transcript: Reading, Download, and Word Statistics

Film title: 监视资本主义:智能陷阱
English title: The Social Dilemma
Year: 2020

Time English Chinese
[00:19] ‎(“进入凡人生活的一切强大之物 ‎无不具有弊端”)
[00:24] ‎(——索福克勒斯)
[00:31] Why don’t you go ahead? Sit down and see if you can get comfy. ‎好 你直接开始吧? ‎好 坐下 能否舒适地坐着
[00:37] You good? All right. Yeah. ‎还好吗? 好了 ‎是
[00:43] Take one, marker. ‎第一镜 打板
[00:46] Wanna start by introducing yourself? 要我先从自我介绍开始吗?
[00:50] Hello, world. Bailey. Take three. ‎世界你好 贝利 第三镜
[00:53] You good? This is the worst part, man. 可以了吗? ‎ 这是我最讨厌的部分 兄弟
[00:56] I don’t like this. ‎我不喜欢这样
[00:59] I worked at Facebook in 2011 and 2012. ‎我2011年到2012年间在脸书工作
[01:02] I was one of the really early employees at Instagram. ‎我是Instagram非常早期的员工
[01:05] I worked at, uh, Google, uh, YouTube. ‎我曾任职于谷歌、YouTube
[01:08] Apple, Google, Twitter, Palm. ‎苹果、谷歌、推特、Palm
[01:12] I helped start Mozilla Labs and switched over to the Firefox side. ‎我帮助创办了Mozilla Labs ‎后来转到了火狐那边
[01:15] Are we rolling? Everybody? ‎在拍吗?大家…
[01:18] Great. ‎很好
[01:21] I worked at Twitter. ‎我曾在推特工作
[01:23] My last job there 我在推特的 ‎
[01:24] was the senior vice president of engineering. 最后一份职位 是工程高级副总裁
[01:27] I was the president of Pinterest. ‎我曾是Pinterest的总裁
[01:29] Before that, um, I was the… the director of monetization ‎在那之前 我在脸书做了五年的
[01:32] at Facebook for five years. ‎盈利总监
[01:34] While at Twitter, I spent a number of years running their developer platform, ‎在推特期间 我用了几年时间 ‎运营他们的开发者平台
[01:38] and then I became head of consumer product. ‎然后我成为了消费者产品负责人
[01:40] I was the co-inventor of Google Drive, Gmail Chat, ‎我曾合作发明 ‎谷歌硬盘、谷歌邮箱聊天
[01:44] Facebook Pages, and the Facebook like button. ‎脸书网页和脸书“赞”按钮
[01:47] Yeah. This is… This is why I spent, like, eight months ‎对 所以我才花了八个月的时间
[01:50] talking back and forth with lawyers. ‎和律师反复周旋
[01:54] This freaks me out. ‎真的让我很崩溃
[01:58] When I was there, ‎我在那里任职的时候
[01:59] I always felt like, fundamentally, it was a force for good. 一直觉得 ‎总体上 这是积极的力量
[02:03] I don’t know if I feel that way anymore. ‎我现在不确定 自己是否还这样想了
[02:05] I left Google in June 2017, uh, due to ethical concerns. ‎出于对道德伦理的担心 ‎我于2017年6月离开谷歌
[02:10] And… And not just at Google but within the industry at large. ‎不仅是担心谷歌的道德伦理 ‎而是对整个产业
[02:14] I’m very concerned. ‎我很担心
[02:16] I’m very concerned. ‎非常担心
[02:19] It’s easy today to lose sight of the fact ‎如今 人们很容易忽略一个事实
[02:21] that these tools actually have created some wonderful things in the world. ‎这些工具 ‎其实为世界创造了一些美好的东西
[02:27] They’ve reunited lost family members. They’ve found organ donors. ‎它们让失去联系的家人重聚 ‎找器官捐赠者
[02:32] I mean, there were meaningful, systemic changes happening ‎它们为整个世界带来了
[02:36] around the world because of these platforms ‎有意义的、系统的变革
[02:39] that were positive! ‎这是这些平台积极的一面
[02:40] I think we were naive about the flip side of that coin. ‎我认为我们当时对硬币的另一面 ‎想得太天真了
[02:45] Yeah, these things, you release them, and they take on a life of their own. ‎对 这些东西 只要你发布 ‎它们自己就会存活
[02:48] And how they’re used is pretty different than how you expected. ‎它们被使用的方式 ‎与你当初的预期大相径庭
[02:52] Nobody, I deeply believe, ever intended any of these consequences. ‎我深信 这些负面的后果 ‎不是任何人刻意为之
[02:56] There’s no one bad guy. No. Absolutely not. ‎没有任何一个坏人 没有 绝对没有
[03:01] So, then, what’s the… what’s the problem? ‎那么问题在哪里?
[03:09] Is there a problem, and what is the problem? ‎是否有问题?问题在哪里?
[03:17] Yeah, it is hard to give a single, succinct… ‎是 很难给出一个单一的、简明的…
[03:20] I’m trying to touch on many different problems. ‎我在试着谈论很多不同的问题
[03:22] What is the problem? ‎问题在哪里?
[03:28] ‎NETFLIX 原创纪录片
[03:33] Despite facing mounting criticism, ‎虽然面对越来越多的质疑
[03:35] the so-called Big Tech names are getting bigger. ‎这些所谓的大型技术公司却越发壮大
[03:37] The entire tech industry is under a new level of scrutiny. ‎整个技术产业 ‎正在面临全新维度的审查
[03:41] And a new study sheds light on the link ‎一项新的研究初步揭示了心理健康
[03:43] between mental health and social media use. ‎与社交媒体使用之间的联系
[03:46] Here to talk about the latest research… ‎来谈谈最新的研究…
[03:48] …is going on that gets no coverage at all. ‎…正在发生 却完全没有得到报道
[03:51] Tens of millions of Americans are hopelessly addicted ‎数千万美国人无望地
[03:54] to their electronic devices. ‎玩电子设备成瘾
[03:56] It’s exacerbated by the fact ‎因为技术 现在你真的能把自己
[03:58] that you can literally isolate yourself now ‎隔离在一个泡泡内 周围都是
[03:59] ‎(你的朋友在照片中圈出了你)
[04:00] in a bubble, thanks to our technology. ‎与你观点相似的人 ‎瘾性就更加恶化了
[04:02] Fake news is becoming more advanced ‎虚假新闻变得更发达了
[04:04] and threatening societies around the world. ‎威胁着全世界的社会
[04:06] We weren’t expecting any of this when we created Twitter over 12 years ago. ‎我们12年多以前创造推特的时候 ‎根本没有想到这些
[04:10] White House officials say they have no reason to believe ‎白宫官员说 他们没有理由相信
[04:12] the Russian cyberattacks will stop. ‎俄罗斯网络攻击会停止
[04:14] YouTube is being forced to concentrate on cleansing the site. ‎YouTube被迫专注于清理网站
[04:18] TikTok, if you talk to any tween out there… ‎如果你和十几岁的孩子去聊
[04:21] …there’s no chance they’ll delete this thing… ‎他们是绝对不可能删除抖音的…
[04:24] Hey, Isla, can you get the table ready, please? ‎喂 艾拉 可以帮忙摆桌子吗?
[04:26] There’s a question about whether social media ‎有一个严肃的问题
[04:28] is making your child depressed. ‎社交媒体是否让您的孩子抑郁
[04:30] Isla, can you set the table, please? ‎艾拉 可以帮忙摆桌子吗?
[04:32] These cosmetic procedures are becoming so popular with teens, ‎整容在青少年中已经十分受欢迎
[04:35] plastic surgeons have coined a new syndrome for it, ‎整容医生甚至造出了一种新病征
[04:37] “Snapchat dysmorphia,” with young patients wanting surgery ‎“图片分享畸形征” ‎指的是年轻的患者想做整容手术
[04:40] so they can look more like they do in filtered selfies. ‎让自己看上去 ‎更接近加了滤镜的自拍中的样子
[04:43] Still don’t see why you let her have that thing. ‎我还是不明白 ‎你为什么让她用那东西
[04:45] What was I supposed to do? ‎我能怎么办?
[04:47] I mean, every other kid in her class had one. ‎我是说 她们班上的其他孩子全都有
[04:50] She’s only 11. ‎她才11岁
[04:51] Cass, no one’s forcing you to get one. ‎卡桑 没人强迫你用
[04:53] You can stay disconnected as long as you want. ‎你想和别人切断联系多久 都随你
[04:55] Hey, I’m connected without a cell phone, okay? I’m on the Internet right now. ‎喂 我没有手机也不会和别人 ‎切断联系 好吗?我现在就在网上
[04:59] Also, that isn’t even actual connection. It’s just a load of sh ‎而且 这根本不是实际的联系 ‎全都是没用的…
[05:03] Surveillance capitalism has come to shape ‎监视资本主义 ‎已经形成了我们的政治和文化
[05:05] our politics and culture in ways many people don’t perceive. ‎而很多人根本没有察觉
[05:07] ISIS inspired followers online, ‎伊斯兰国煽动网上的关注者
[05:10] and now white supremacists are doing the same. ‎现在 白人至上主义也在这样做
[05:12] Recently in India, ‎最近在印度
[05:14] Internet lynch mobs have killed a dozen people, including these five… ‎网络暴民害死了十几个人 ‎包括这五个…
[05:17] It’s not just fake news; it’s fake news with consequences. ‎虚假新闻不只是虚假新闻 ‎是有后果的
[05:20] How do you handle an epidemic in the age of fake news? ‎在虚假新闻的时代 ‎你要怎么解决传染病?
[05:24] Can you get the coronavirus by eating Chinese food? ‎吃中餐会感染新冠病毒吗?
[05:27] We have gone from the information age into the disinformation age. ‎我们已经从信息时代 ‎过渡到了虚假信息时代
[05:32] Our democracy is under assault. ‎我们的民主受到了攻击
[05:34] What I said was, “I think the tools ‎我说的是:“我认为如今 ‎
[05:37] that have been created today are starting 创造出的工具正在开始
[05:39] to erode the social fabric of how society works.” ‎侵蚀社会正常运转的社交纽带”
[06:00] Aza does welcoming remarks. We play the video. ‎阿扎致欢迎辞 然后我们播放视频
[06:04] And then, “Ladies and gentlemen, Tristan Harris.” ‎然后说 “女士们先生们 ‎有请特里斯坦·哈里斯”
[06:07] Right. Great. ‎ 好 ‎ 很好
[06:08] So, I come up, and… ‎就是说 我上来 然后…
[06:12] ‎(人道 技术的新议程)
[06:13] basically say, “Thank you all for coming.” Um… ‎基本上就是说“感谢各位的到来”
[06:17] So, today, I wanna talk about a new agenda for technology. ‎今天 我想聊聊技术的一个新议程
[06:22] And why we wanna do that is because if you ask people, ‎以及我们为什么要这样做 ‎因为如果你问人们
[06:25] “What’s wrong in the tech industry right now?” ‎“如今的技术产业怎么了?”
[06:28] there’s a cacophony of grievances and scandals, ‎你会听到一片嘈杂的不满和丑闻
[06:31] and “They stole our data.” And there’s tech addiction. ‎“他们盗用了我们的数据” ‎还有技术成瘾问题
[06:33] And there’s fake news. And there’s polarization ‎有虚假新闻问题 有两极分化问题
[06:36] and some elections that are getting hacked. ‎有些竞选过程被黑客操控的问题
[06:38] But is there something that is beneath all these problems ‎但是这些问题的背后 ‎是否有一个原因
[06:41] that’s causing all these things to happen at once? ‎导致这些问题在同时发生?
[06:46] Does this feel good? Very good. Yeah. ‎ 感觉还行吗? ‎ 非常好 好
[06:50] I’m just trying to… Like, I want people to see… ‎我只是想… 我想让人们看到…
[06:53] Like, there’s a problem happening in the tech industry, ‎在技术产业 正面临着一个问题
[06:55] and it doesn’t have a name, ‎这个问题连名字都没有 ‎
[06:56] and it has to do with one source, like, one… 这个问题有一个源头…
[07:05] When you look around you, it feels like the world is going crazy. ‎环顾你身边 ‎感觉这个世界在逐渐疯狂
[07:12] You have to ask yourself, like, “Is this normal? ‎你要问自己 这是正常的吗?
[07:16] Or have we all fallen under some kind of spell?” ‎还是我们都中了什么魔咒?
[07:22] ‎(特里斯坦·哈里斯 ‎谷歌前设计道德伦理学家)
[07:25] ‎(人道技术中心 合作创始人)
[07:27] I wish more people could understand how this works ‎我希望更多的人能够理解它的原理
[07:30] because it shouldn’t be something that only the tech industry knows. ‎因为它不应该 ‎只被技术产业的业内人士知道
[07:34] It should be something that everybody knows. ‎应该让所有人都知道
[07:41] Bye. 拜
[07:47] Hello! Hi. ‎ 你好! ‎ 嗨!
[07:48] Tristan. Nice to meet you. It’s Tris-tan, right? ‎ 特里斯坦 幸会 ‎ 特里斯坦?
[07:50] Yes. Awesome. Cool. ‎ 对 ‎ 太好了 好
[07:53] Tristan Harris is a former design ethicist for Google ‎特里斯坦·哈里斯 ‎是谷歌前设计道德伦理学家
[07:56] and has been called the closest thing Silicon Valley has to a conscience. ‎被称为硅谷最接近良知的人物
[07:59] He’s asking tech ‎他呼吁技术产业
[08:00] to bring what he calls “ethical design” to its products. ‎在产品中引进 ‎被他称为“道德伦理设计”的要素
[08:04] It’s rare for a tech insider to be so blunt, ‎搞技术的业内人士 ‎极少如此直言不讳
[08:06] but Tristan Harris believes someone needs to be. ‎特里斯坦·哈里斯相信 ‎总有人要这样
[08:11] When I was at Google, ‎我在谷歌工作的时候
[08:12] I was on the Gmail team, and I just started getting burnt out ‎我在谷歌邮箱团队 ‎我就开始觉得很疲惫
[08:16] ’cause we’d had so many conversations about… ‎因为我们讨论了很多…
[08:19] you know, what the inbox should look like and what color it should be, and… ‎收件箱应该长什么样 ‎应该是什么颜色
[08:23] And I, you know, felt personally addicted to e-mail, ‎我自己感觉对邮件成瘾
[08:26] and I found it fascinating ‎我觉得有趣的是
[08:27] there was no one at Gmail working on making it less addictive. ‎在谷歌邮箱工作的人 ‎没有一个想把它做得不那么致瘾
[08:31] And I was like, “Is anybody else thinking about this? ‎我想:“别人想过这个问题吗?
[08:34] I haven’t heard anybody talk about this.” ‎我 没听谁谈论过这个问题”
[08:36] And I was feeling this frustration… ‎我对技术产业
[08:39] …with the tech industry, overall, 整体感到沮丧
[08:41] that we’d kind of, like, lost our way. ‎感觉我们有点迷路了
[08:46] You know, I really struggled to try and figure out ‎我真的很努力地去尝试 想办法
[08:49] how, from the inside, we could change it. ‎怎样能从行业内部改变这个问题
[08:55] And that was when I decided to make a presentation, ‎就在这个时候 我决定做一次展示
[08:58] kind of a call to arms. ‎算是号召大家吧
[09:00] Every day, I went home and I worked on it for a couple hours every single night. ‎每天我回到家 每一个晚上 ‎都要花几个小时去做这件事
[09:06] It basically just said, you know, ‎我的呼吁是
[09:08] never before in history have 50 designers 历史上从来没有过50个
[09:12] 20- to 35-year-old white guys in California ‎20到35岁之间的加州白人设计师
[09:15] made decisions that would have an impact on two billion people. ‎做出一个能影响20亿人的决定
[09:21] Two billion people will have thoughts that they didn’t intend to have ‎20亿人将会拥有 ‎他们从来不曾预料的想法
[09:24] because a designer at Google said, “This is how notifications work ‎只因为一个谷歌的设计师说 ‎“你每天早上醒来
[09:28] on that screen that you wake up to in the morning.” ‎屏幕上的通知就是这样工作的”
[09:31] And we have a moral responsibility, as Google, for solving this problem. ‎我们作为谷歌 ‎有解决这个问题的道德责任
[09:36] And I sent this presentation ‎我把这个展示
[09:37] to about 15, 20 of my closest colleagues at Google, ‎发给了在谷歌 ‎与我关系最近的15到20个同事
[09:41] and I was very nervous about it. I wasn’t sure how it was gonna land. ‎我很紧张 我不知道他们会怎样想
[09:46] When I went to work the next day, ‎我第二天去上班的时候
[09:48] most of the laptops had the presentation open. ‎多数的电脑上都开着这个展示
[09:52] Later that day, there was, like, 400 simultaneous viewers, ‎那天下午 有400个人同时观看
[09:54] so it just kept growing and growing. ‎看到的人越来越多
[09:56] I got e-mails from all around the company. I mean, people in every department saying, ‎我收到整个公司同事发来的各种邮件 ‎每一个部门的人都说
[10:00] “I totally agree.” “I see this affecting my kids.” ‎“我太同意了 ‎我看到这个问题正在影响我的孩子
[10:02] “I see this affecting the people around me.” ‎我看到这个问题正在影响我身边的人
[10:05] “We have to do something about this.” ‎我们应该做点什么 来解决这个问题”
[10:07] It felt like I was sort of launching a revolution or something like that. ‎我感觉自己 ‎好像开启了一场革命之类的
[10:11] Later, I found out Larry Page had been notified about this presentation ‎后来 我才知道拉里·佩奇 ‎那一天在三个不同会议
[10:15] in three separate meetings that day. ‎都被人告知这个展示的存在
[10:17] And so, it created this kind of cultural moment ‎于是这个展示 ‎创造了这个文化性的时刻
[10:20] that Google needed to take seriously. ‎谷歌需要认真对待
[10:26] And then… nothing. ‎然后…杳无音讯了
[10:34] Everyone in 2006… ‎2006年 所有人…
[10:37] including all of us at Facebook, ‎包括我们在脸书的所有人
[10:39] just had total admiration for Google and what Google had built, ‎超级羡慕谷歌 ‎羡慕谷歌所创建的一切
[10:43] which was this incredibly useful service ‎超级实用的服务
[10:47] that did, far as we could tell, lots of goodness for the world, ‎据我们当时所知 ‎为世界带来了很多好处
[10:51] and they built this parallel money machine. ‎他们建立了一个平行的造钱机器
[10:55] We had such envy for that, and it seemed so elegant to us… ‎我们超级羡慕嫉妒谷歌 ‎在我们看来太优雅了…
[11:00] and so perfect. ‎太完美了
[11:02] Facebook had been around for about two years, ‎脸书当时才成立大概两年
[11:05] um, and I was hired to come in and figure out ‎我被雇到脸书
[11:08] what the business model was gonna be for the company. ‎ 去找出公司未来要走怎样的商业模式
[11:10] I was the director of monetization. The point was, like, ‎我当时是盈利总监 重点就是
[11:13] “You’re the person who’s gonna figure out how this thing monetizes.” ‎“你是要去想出 ‎这个东西怎样盈利的人”
[11:17] And there were a lot of people who did a lot of the work, ‎当时很多人做了很多工作
[11:19] but I was clearly one of the people who was pointing towards… ‎但我明显是其中一个指向…
[11:26] “Well, we have to make money, A… ‎首先 我们必须要赚钱
[11:29] and I think this advertising model is probably the most elegant way. ‎我认为这个广告模式 ‎可能是最优雅的方式
[11:42] Uh-oh. What’s this video Mom just sent us? ‎妈妈刚给我们发的什么视频?
[11:44] Oh, that’s from a talk show, but that’s pretty good. ‎一个脱口秀 不过挺不错的 ‎
[11:46] Guy’s kind of a genius. 那个人还算是个天才
[11:47] He’s talking all about deleting social media, which you gotta do. ‎他在谈论删除社交媒体 ‎你们真应该这样做
[11:50] I might have to start blocking her e-mails. ‎我可能要开始屏蔽她的邮件了
[11:52] I don’t even know what she’s talking about, man. ‎讲真 我都不知道她在说什么 ‎天啊
[11:54] She’s worse than I am. 她比我还严重
[11:56] No, she only uses it for recipes. Right, and work. ‎ 不 她只用来找菜谱 ‎ 对 还有工作
[11:58] And workout videos. And to check up on us. ‎ 还有健身视频 ‎ 还看我们在做什么
[12:00] And everyone else she’s ever met in her entire life. ‎还看她这辈子遇到过的每一个人 ‎在做什么
[12:04] If you are scrolling through your social media feed ‎如果你一边往下划着 ‎你的社交媒体推送
[12:07] while you’re watchin’ us, you need to put the damn phone down and listen up ‎一边看着我们 ‎你需要把你该死的手机放下 听好
[12:11] ’cause our next guest has written an incredible book ‎因为我们的下一位嘉宾 ‎写了一本优秀的书
[12:14] about how much it’s wrecking our lives. ‎书的内容是社交媒体 ‎多大程度上破坏了我们的生活
[12:18] Please welcome author ‎掌声有请
[12:19] of Ten Arguments for Deleting Your Social Media Accounts Right Now… 《立刻删除 ‎你社交媒体的十个论点》作者
[12:24] Uh-huh. …Jaron Lanier. ‎贾伦·拉尼尔
[12:27] Companies like Google and Facebook are some of the wealthiest ‎谷歌、脸书这样的公司是有史以来
[12:31] and most successful of all time. ‎最富有、最成功的几个公司
[12:33] Uh, they have relatively few employees. ‎他们的员工数量相对较少
[12:36] They just have this giant computer that rakes in money, right? Uh… ‎他们只有一个大电脑 在那里摇钱
[12:41] Now, what are they being paid for? ‎问题是 别人为什么给他们钱呢? ‎
[12:43] That’s a really important question. 这是一个非常重要的问题
[12:47] So, I’ve been an investor in technology for 35 years. ‎我做了35年的技术产业投资者
[12:51] The first 50 years of Silicon Valley, the industry made products ‎硅谷的前50年 行业制造产品…
[12:54] hardware, software ‎硬件、软件
[12:55] sold ’em to customers. Nice, simple business. 卖给顾客 ‎简单良好的商业模式
[12:58] For the last ten years, the biggest companies in Silicon Valley ‎过去十年 硅谷最大的公司
[13:01] have been in the business of selling their users. ‎一直涉足贩卖他们用户的勾当
[13:03] It’s a little even trite to say now, ‎现在这样说 有点陈词滥调
[13:05] but… because we don’t pay for the products that we use, ‎但因为我们不为使用这些产品付钱
[13:09] advertisers pay for the products that we use. ‎广告商为我们使用的产品付钱
[13:12] Advertisers are the customers. ‎广告商是顾客
[13:14] We’re the thing being sold. ‎我们是被销售的商品
[13:16] The classic saying is: ‎经典的说法是
[13:17] “If you’re not paying for the product, then you are the product.” ‎“如果你没有花钱买产品 ‎那你就是被卖的产品”
[13:23] A lot of people think, you know, “Oh, well, Google’s just a search box, ‎很多人想:“谷歌只是一个搜索框
[13:27] and Facebook’s just a place to see what my friends are doing ‎脸书只是一个看我朋友们在做什么
[13:29] and see their photos.” ‎看他们照片的地方”
[13:31] But what they don’t realize is they’re competing for your attention. ‎但他们没有意识到的是 ‎他们在竞争你的关注
[13:36] So, you know, Facebook, Snapchat, Twitter, Instagram, YouTube, ‎脸书、阅后即焚图片分享、推特 ‎Instagram、YouTube
[13:41] companies like this, their business model is to keep people engaged on the screen. ‎这种公司 他们的商业模式 ‎是让人们的注意力持续吸引在屏幕上
[13:46] Let’s figure out how to get as much of this person’s attention ‎我们来想办法 怎样尽最大可能
[13:49] as we possibly can. ‎获得这个人的注意力
[13:51] How much time can we get you to spend? ‎我们能让你在上面花多少时间?
[13:53] How much of your life can we get you to give to us? ‎我们能让你给我们分出 ‎你人生的多少时间?
[13:58] When you think about how some of these companies work, ‎当你去想 这些公司是怎样运作的
[14:01] it starts to make sense. ‎就能开始想通了
[14:03] There are all these services on the Internet that we think of as free, ‎网络上有过各种服务 ‎我们都认为是免费的
[14:06] but they’re not free. They’re paid for by advertisers. ‎但它们并不是免费的 ‎是广告商在付钱
[14:09] Why do advertisers pay those companies? ‎广告商为什么给这些公司付钱?
[14:11] They pay in exchange for showing their ads to us. ‎它们付钱 交换给你展示广告
[14:14] We’re the product. Our attention is the product being sold to advertisers. ‎我们是产品 我们的关注 ‎就是卖给广告商的产品
[14:18] That’s a little too simplistic. ‎这样说 过于简单化了
[14:20] It’s the gradual, slight, imperceptible change ‎产品其实是我们行为和认知的
[14:23] in your own behavior and perception that is the product. ‎逐渐的、一点一点的 ‎我们未察觉到的变化
[14:26] ‎(行为和认知的 变化)
[14:27] And that is the product. It’s the only possible product. ‎这才是产品 是唯一可能的产品
[14:30] There’s nothing else on the table that could possibly be called the product. ‎这其中 没有任何东西 ‎能再被称为产品了
[14:34] That’s the only thing there is for them to make money from. ‎这是他们能拿来赚钱的唯一东西
[14:37] Changing what you do, ‎改变你做的事
[14:39] how you think, who you are. ‎你的思维模式 改变你这个人
[14:42] It’s a gradual change. It’s slight. ‎这是一种逐渐的变化 非常轻微
[14:45] If you can go to somebody and you say, “Give me $10 million, ‎如果你去找一个人 ‎你说:“给我一千万美元
[14:49] and I will change the world one percent in the direction you want it to change…” ‎我会让世界往你希望的方向改变1%…”
[14:54] It’s the world! That can be incredible, and that’s worth a lot of money. ‎是整个世界!这就很神奇 值很多钱
[14:59] Okay. ‎好
[15:00] This is what every business has always  dreamt of: ‎这是每种商业都一直梦想的
[15:04] to have a guarantee that if it places an ad, it will be successful. ‎就是投放一个广告 ‎有一定能够成功的保证
[15:11] That’s their business. ‎这就是他们的生意 ‎
[15:12] They sell certainty. 他们卖的是确定性
[15:14] ‎(确定性)
[15:14] In order to be successful in that business, ‎为了在这个生意中成功
[15:17] you have to have great predictions. ‎你必须要有优秀的预判能力
[15:20] Great predictions begin with one imperative: ‎优秀的预判能力始于一个必要条件
[15:20] ‎(优秀的预判能力)
[15:25] you need a lot of data. ‎你需要很多数据
[15:27] ‎(数据)
[15:29] Many people call this surveillance capitalism, ‎很多人把它称作监视资本主义
[15:31] capitalism profiting off of the infinite tracking ‎资本主义利用大型技术公司 ‎对每个人去的每一个地方
[15:34] of everywhere everyone goes by large technology companies ‎进行无限追踪获利
[15:38] whose business model is to make sure ‎大型技术公司的商业模式
[15:40] that advertisers are as successful as possible. ‎是保证广告商能尽最大可能成功
[15:42] This is a new kind of marketplace now. ‎这是现在的一种新市场
[15:45] It’s a marketplace that never existed before. ‎这种市场 以前从未出现过
[15:48] And it’s a marketplace that trades exclusively in human futures. ‎这个市场交易的 只有人类期货
[15:56] Just like there are markets that trade in pork belly futures or oil futures. ‎就像交易五花肉期货 ‎和石油期货的市场
[16:02] We now have markets that trade in human futures at scale, ‎我们现在有了 ‎交易大范围人类期货的市场
[16:08] and those markets have produced the trillions of dollars ‎这些市场创造了万亿美元
[16:14] that have made the Internet companies the richest companies ‎让网络公司成为了人类历史上
[16:19] in the history of humanity. ‎最富有的公司
[16:27] What I want people to know is that everything they’re doing online ‎我想让人们知道的是 ‎他们在网上做的一切
[16:31] is being watched, is being tracked, is being measured. ‎都被监控着 被追踪着 被评估着
[16:35] Every single action you take is carefully monitored and recorded. ‎你所做出的每一个行为 ‎都被小心翼翼地监控着、记录着
[16:39] Exactly what image you stop and look at, for how long you look at it. ‎具体到你停在哪一张图片上看了 ‎你看了多久
[16:43] Oh, yeah, seriously, for how long you look at it. ‎是的 真的 看了多久都记录了
[16:45] ‎(纳维亚 参与时间)
[16:47] ‎(瑞恩 参与时间)
[16:49] ‎(雷恩 参与时间)
[16:50] They know when people are lonely. ‎人们孤独的时候 他们知道 ‎
[16:52] They know when people are depressed. 人们抑郁的时候 他们知道
[16:53] They know when people are looking at photos of your ex romantic partners. ‎人们看前任爱侣的时候 他们知道
[16:57] They know what you’re doing late at night. They know the entire thing. ‎你深夜在做什么 他们知道 ‎他们全都知道
[17:01] Whether you’re an introvert or an extrovert, ‎你是内向还是外向
[17:03] or what kind of neuroses you have, what your personality type is like. ‎你有哪种神经症 ‎你的性格是哪种类型
[17:08] They have more information about us ‎他们所掌握的我们的信息
[17:11] than has ever been imagined in human history. ‎超越人类历史上所有的想象
[17:14] It is unprecedented. ‎这是史无前例的
[17:18] And so, all of this data that we’re… that we’re just pouring out all the time ‎所有这些我们不经意间 ‎不断流露出的数据
[17:22] is being fed into these systems that have almost no human supervision ‎都被输入到这些系统中 ‎几乎不用人类看管
[17:27] and that are making better and better and better and better predictions ‎会做出越来越好的预判
[17:30] about what we’re gonna do and… and who we are. ‎预判出我们要做什么 ‎我们是怎样的人
[17:34] ‎(为您推荐)
[17:36] People have the misconception it’s our data being sold. ‎很多人有一种误解 认为被卖掉的 ‎是我们的数据
[17:40] It’s not in Facebook’s business interest to give up the data. ‎交出这些数据 ‎并不符合脸书的商业利益
[17:45] What do they do with that data? ‎他们用这些数据做什么呢?
[17:51] They build models that predict our actions, ‎他们做出预判我们行为的模型
[17:54] and whoever has the best model wins. ‎拥有最优秀模型的公司就赢了
[18:02] His scrolling speed is slowing. ‎他向下滑的速度正在变慢
[18:04] Nearing the end of his average session length. ‎接近他平均单次使用时长的末尾
[18:06] Decreasing ad load. ‎减少广告加载
[18:07] Pull back on friends and family. ‎把朋友和家人弄回来
[18:09] On the other side of the screen, ‎在屏幕的另一端
[18:11] it’s almost as if they had this avatar, voodoo-doll-like model of us. ‎他们就好像拥有一个 ‎我们的巫毒娃娃化身一样
[18:16] All of the things we’ve ever done, ‎我们做过的所有事情
[18:18] all the clicks we’ve ever made, ‎点击过的每一个地方
[18:19] all the videos we’ve watched, all the likes, ‎我们看过的所有视频 ‎点赞过的所有内容
[18:21] that all gets brought back into building a more and more accurate model. ‎这些数据都会被返回去 ‎用来建造一个越来越精准的模型
[18:25] The model, once you have it, ‎一旦有了这个模型
[18:27] you can predict the kinds of things that person does. ‎就能预判这个人做怎样的事
[18:29] Right, let me just test. ‎好 让我测试一下
[18:32] Where you’ll go. I can predict  what kind of videos ‎你将要去哪里 ‎我能预判出 ‎
[18:35] will keep you watching. 你会继续看什么样的视频
[18:36] I can predict what kinds of emotions tend to trigger you. ‎我能预判 ‎什么样的情感更能让你产生共鸣
[18:39] Yes, perfect. ‎好 完美
[18:41] The most epic fails of the year. ‎年度最悲壮失败
[18:43] ‎(悲壮失败)
[18:48] Perfect. That worked. Following with another video. ‎ 完美 有效果 ‎ 接着另一个视频
[18:51] Beautiful. Let’s squeeze in a sneaker ad before it starts. ‎漂亮 在它开始之前 ‎我们插进去一个运动鞋广告
[18:56] At a lot of technology companies, ‎很多这种技术公司
[18:58] there’s three main goals. 有三个主要目标
[18:59] There’s the engagement goal: ‎有一个参与度目标 ‎
[19:01] to drive up your usage, to keep you scrolling. 增加你的使用 让你一直滑动屏幕
[19:04] There’s the growth goal: ‎有一个增长目标
[19:06] to keep you coming back and inviting as many friends ‎让你不断回来 尽可能多地邀请朋友
[19:08] and getting them to invite more friends. ‎让他们再邀请更多的朋友
[19:11] And then there’s the advertising goal: ‎还有一个广告目标 ‎
[19:13] to make sure that, as all that’s happening, 确保一切按照预期发展
[19:15] we’re making as much money as possible from advertising. ‎我们尽量多地从广告上挣钱
[19:19] Each of these goals are powered by algorithms ‎每一个目标都由一个算法驱动
[19:22] whose job is to figure out what to show you ‎算法的作用是找出 给你展示什么
[19:24] to keep those numbers going up. ‎让数据上涨
[19:26] We often talked about, at Facebook, this idea ‎我们在脸书 经常聊到这个想法
[19:30] of being able to just dial that as needed. ‎能够按照我们的需要调控
[19:34] And, you know, we talked about having Mark have those dials. ‎我们聊过 让马克来调控
[19:41] “Hey, I want more users in Korea today.” ‎“喂 我今天想让韩国的用户增加”
[19:45] “Turn the dial.” ‎开始调控
[19:47] “Let’s dial up the ads a little bit.” ‎“我们调控提高一点广告”
[19:49] “Dial up monetization, just slightly.” ‎“调控提高一点盈利”
[19:52] And so, that happ ‎所以说…
[19:55] I mean, at all of these companies, there is that level of precision. ‎所有这些公司 ‎都能做到这种程度的精准
[19:59] Dude, how… I don’t know how I didn’t get carded. ‎ 兄弟 怎么… ‎ 我不知道自己怎么没吃到黄牌
[20:02] That ref just, like, sucked or something. You got literally all the way… ‎ 那个裁判真是逊 ‎ 真的是一直…
[20:05] That’s Rebecca. Go talk to her. I know who it is. ‎ 那是瑞贝卡 去和她说话 ‎ 我知道那是谁
[20:08] Dude, yo, go talk to her. I’m workin’ on it. ‎ 兄弟 去啊 去跟她说话 ‎ 我正在努力
[20:10] His calendar says he’s on a break right now. We should be live. ‎他的日历显示 他现在正在休息 ‎我们应该是在线的
[20:14] Want me to nudge him? ‎要我给他发一个窗口抖动吗?
[20:17] Yeah, nudge away. ‎好 抖吧
[20:21] “Your friend Tyler just joined. Say hi with a wave.” ‎“您的好友泰勒刚刚加入了 ‎去挥手打个招呼吧”
[20:26] Come on, Ben. ‎快啊 兄弟
[20:27] Send a wave. ‎发一个挥手
[20:29] You’re not… Go talk to her, dude. ‎你都没有… 去和她说话 兄弟
[20:31] ‎您的好友泰勒刚刚加入了! ‎挥手打个招呼吧
[20:36] ‎(联系人网络)
[20:38] New link! All right, we’re on. ‎新联系人!好 连上了
[20:40] Follow that up with a post from User 079044238820, Rebecca. ‎接下来推送 ‎用户079044238820瑞贝卡的发帖
[20:46] Good idea. GPS coordinates indicate that they’re in close proximity. ‎好主意 卫星定位坐标显示 ‎他们距离很近
[20:52] ‎(瑞贝卡 找到了我的灵魂伴侣 ‎#闺蜜#呲溜呲溜#好朋友)
[20:55] He’s primed for an ad. ‎他容易受到广告影响
[20:57] Auction time. ‎拍卖时间
[20:58] ‎(广告预览 深度衰落发蜡)
[21:00] Sold! To Deep Fade hair wax. ‎卖了!给深度衰落发蜡
[21:03] We had 468 interested bidders. We sold Ben at 3.262 cents for an impression. ‎我们有468个感兴趣的竞标者 ‎我们以每次展示3.262美分的价格卖出了本
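A minimal Python sketch of the real-time ad auction dramatized above, assuming a simple second-price rule; the advertiser names, bid range, and auction rule are illustrative assumptions, not details from the film.

import random

def run_ad_auction(bids):
    # Second-price auction: the highest bidder wins the single ad
    # impression but pays the runner-up's bid (a common RTB rule).
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price_paid = ranked[1][1]
    return winner, price_paid

# 468 hypothetical advertisers bidding a few cents for one user's attention.
bids = {"advertiser_%d" % i: random.uniform(0.01, 0.05) for i in range(468)}
winner, price = run_ad_auction(bids)
print("Sold one impression to %s at %.3f cents" % (winner, price * 100))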
[21:17] We’ve created a world ‎我们创造了一个世界 ‎
[21:18] in which online connection has become primary, 这个世界中 在线联系变成了主体
[21:22] especially for younger generations. ‎尤其是对年轻一代
[21:23] And yet, in that world, any time two people connect, ‎然而在这个世界 每次两个人联系
[21:29] the only way it’s financed is through a sneaky third person ‎唯一能提供经济支持的 ‎是通过一个鬼祟的第三方
[21:33] who’s paying to manipulate those two people. ‎有人给第三方钱 去操纵这两个人
[21:36] So, we’ve created an entire global generation of people ‎所以 我们创造了全球的一整代人
[21:39] who are raised within a context where the very meaning of communication, ‎他们成长的背景中 交流的意义
[21:44] the very meaning of culture, is manipulation. ‎文化的意义 就是操纵
[21:47] We’ve put deceit and sneakiness ‎我们所做的每一件事的中心
[21:49] at the absolute center of everything we do. ‎都加入了欺骗和鬼祟
[21:56] ‎(“任何足够先进的技术 ‎都极其类似于魔术”)
[22:01] ‎(——亚瑟·C·克拉克)
[22:05] Grab the… Okay. ‎ 拿起另一个… ‎ 好
[22:07] Where’s it help to hold it? Great. ‎ 放在哪里才有用? ‎ 这样很好
[22:09] Here? Yeah. ‎ 这里? ‎ 好
[22:10] How does this come across on camera if I were to do, like, this move ‎这会怎样出现在摄影机中 ‎如果我要做…
[22:13] We can Like that? ‎ 其实 我们可以… ‎ 就这样?
[22:15] What? Yeah. ‎ 什么? ‎ 对
[22:17] Do that again. Exactly. Yeah. ‎ 再来一次? ‎ 没错 好
[22:19] Yeah. No, it’s probably not… ‎对 不是 可能不是…
[22:20] Like… yeah. ‎这样… 对
[22:22] I mean, this one is less… ‎这个就稍微没那么…
[22:29] Larissa’s, like, actually freaking out over here. ‎拉丽莎在那边已经烦死了
[22:34] Is that good? ‎可以了吗?
[22:35] ‎(魔术!)
[22:37] I was, like, five years old when I learned how to do magic. ‎我从五岁开始学习变魔术
[22:41] And I could fool adults, fully grown adults with, like,  PhDs. ‎我可以骗过成年人 ‎有博士学位的完全成熟的成年人
[22:55] Magicians were almost like the first neuroscientists ‎魔术师几乎像是最早期的神经学家
[22:57] and psychologists. ‎和心理学家
[22:59] Like, they were the ones who first understood ‎他们是最先明白
[23:02] how people’s minds work. ‎人们思想工作原理的人
[23:04] They just, in real time, are testing lots and lots of stuff on people. ‎他们在人们身上 ‎实时测试着很多很多的东西
[23:09] A magician understands something, ‎魔术师懂一些事
[23:11] some part of your mind that we’re not aware of. ‎你思想中 ‎你自己都没意识到的某一部分
[23:14] That’s what makes the illusion work. ‎这是让幻觉起作用的关键
[23:16] Doctors, lawyers, people who know how to build 747s or nuclear missiles, ‎医生、律师 ‎知道怎样构造747飞机或者核弹的人
[23:20] they don’t know more about how their own mind is vulnerable. ‎他们并不会比别人更了解 ‎自己的思想有多么脆弱
[23:24] That’s a separate discipline. ‎因为这是一个完全不同的学科领域
[23:26] And it’s a discipline that applies to all human beings. ‎这个学科领域 对所有人类都适用
[23:29] ‎(斯坦福大学)
[23:30] From that perspective, you can have a very different understanding ‎从这个角度来说 ‎你就能对技术做了什么
[23:34] of what technology is doing. ‎有一个不同的理解
[23:36] When I was at the Stanford Persuasive Technology Lab, ‎我在斯坦福劝服技术实验室的时候 ‎
[23:39] this is what we learned. 这就是我们所学到的
[23:41] How could you use everything we know ‎怎样利用我们知道的一切心理学知识
[23:43] about the psychology of what persuades people ‎什么东西能劝服人们
[23:45] and build that into technology? ‎把这个运用到技术中?
[23:48] Now, many of you in the audience are geniuses already. ‎在座观众中的很多人 已经是天才了
[23:50] I think that’s true, but my goal is to turn you into a behavior change genius. ‎我认为这是事实 但我的目标是 ‎让你们变成行为改变的天才
[23:56] There are many prominent Silicon Valley figures who went through that class ‎很多著名的硅谷人物 ‎都上过这个课
[24:01] key growth figures at Facebook and Uber and… and other companies ‎脸书、优步和其他公司的 ‎业绩增长关键人物
[24:05] and learned how to make technology more persuasive, ‎学习怎样让技术更能劝服人们
[24:09] Tristan being one. ‎特里斯坦就是其中一个
[24:12] Persuasive technology is just sort of design ‎劝服性技术 可以说是极端应用的
[24:14] intentionally applied to the extreme, ‎刻意设计
[24:16] where we really want to modify someone’s behavior. ‎我们真的想去修改一个人的行为
[24:18] We want them to take this action. ‎我们想让他们这样做
[24:20] We want them to keep doing this with their finger. ‎我们想让他们继续用手指这样做
[24:23] You pull down and you refresh, it’s gonna be a new thing at the top. ‎你往下拉 刷新 最上面就是新的内容
[24:26] Pull down and refresh again, it’s new. Every single time. ‎再下拉 再刷新 又是新的 ‎每一次都是
[24:28] Which, in psychology, we call a positive intermittent reinforcement. ‎在心理学上 我们称之为“正向间歇强化”
[24:33] You don’t know when you’re gonna get it or if you’re gonna get something, ‎你不知道什么时候能刷到 ‎或者你是否能刷到什么
[24:37] which operates just like the slot machines in Vegas. ‎它的原理就像是赌城的老虎机
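A toy Python sketch of the pull-to-refresh mechanic just described, where an unpredictable reward schedule keeps the user pulling; the 30% reward probability is an arbitrary assumption for illustration.

import random

def pull_to_refresh(reward_probability=0.3):
    # Variable reward: each refresh MIGHT surface something new, and that
    # uncertainty is exactly what makes the gesture compulsive.
    if random.random() < reward_probability:
        return "new post at the top of the feed"
    return "nothing new this time"

# The user never knows which pull will pay off, so they keep pulling.
for attempt in range(1, 6):
    print("refresh %d: %s" % (attempt, pull_to_refresh()))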
[24:40] It’s not enough that you use the product consciously, ‎你有意识地使用产品 还远远不够
[24:42] I wanna dig down deeper into the brain stem ‎我想深度侵入你的大脑根部
[24:44] and implant, inside of you, an unconscious habit ‎在你脑中植入一个无意识的习惯
[24:47] so that you are being programmed at a deeper level. ‎让你在更深的层次被编程
[24:50] You don’t even realize it. ‎你自己都没有意识到
[24:52] A man, James Marshall… ‎一个叫做詹姆斯·马歇尔的人
[24:54] Every time you see it there on the counter, ‎每次你看到它放在台面上
[24:56] and you just look at it, and you know if you reach over, ‎你看一眼 你知道如果你过去
[24:59] it just might have something for you, ‎它可能有东西给你
[25:01] so you play that slot machine to see what you got, right? ‎于是你就玩了一下老虎机 ‎看你能得到什么 对吧?
[25:03] That’s not by accident. That’s a design technique. ‎这不是偶然 这是设计好的手段
[25:06] He brings a golden nugget to an officer in the army in San Francisco. ‎他把一个金块 ‎给了旧金山军队的一个军官
[25:12] Mind you, the… the population of San Francisco was only… ‎别忘了 旧金山的人口数量 ‎当时只有…
[25:15] Another example is photo tagging. ‎另一个例子是照片圈人
[25:19] So, if you get an e-mail ‎如果你收到一封邮件 ‎
[25:21] that says your friend just tagged you in a photo, 说你朋友刚刚在一张照片中圈出了你
[25:24] of course you’re going to click on that e mail and look at the photo. ‎你当然会点击那封邮件 看一下照片
[25:29] It’s not something you can just decide to ignore. ‎这不是你能选择忽略的事情
[25:32] This is deep seated, like, ‎他们所利用的 ‎
[25:34] human personality that they’re tapping into. 是根植于人类本性中的东西
[25:36] What you should be asking yourself is: ‎你应该问自己的是
[25:38] “Why doesn’t that e-mail contain the photo in it? ‎“这封邮件中 ‎为什么没把照片放进来?
[25:40] It would be a lot easier to see the photo.” ‎这样 看照片就会容易很多 ”
[25:42] When Facebook found that feature, they just dialed the hell out of that ‎当脸书发现这个功能之后 ‎他们可真是尽情调控
[25:46] because they said, “This is gonna be a great way to grow activity. ‎因为他们说 ‎“这将是增长积极性的绝好方式
[25:48] Let’s just get people tagging each other in photos all day long.” ‎我们让大家整天在照片中互相圈吧”
[25:56] ‎(本:至少我们中 ‎有一个人拍得很好)
[25:59] He commented. ‎他评论了
[26:00] Nice. ‎很好
[26:01] Okay, Rebecca received it, and she is responding. ‎瑞贝卡收到了 她正在回复
[26:04] All right, let Ben know that she’s typing so we don’t lose him. ‎好 让本知道她在输入 别让他下线了
[26:07] Activating ellipsis. ‎激活省略号
[26:09] ‎(至少我们中有一个人拍得很好 ‎…)
[26:19] Great, she posted. ‎太好了 她发布了
[26:21] He’s commenting on her comment about his comment on her post. ‎他在评论 他给她发帖的评论的评论
[26:25] Hold on, he stopped typing. ‎等一下 他停止输入了
[26:26] Let’s autofill. ‎我们来自动填入
[26:28] Emojis. He loves emojis. ‎表情 他喜欢使用表情
[26:31] ‎(自动完成 参与度)
[26:33] He went with fire. ‎他选择了“火辣”表情
[26:34] I was rootin’ for eggplant. ‎我还押他会用茄子呢
[26:38] There’s an entire discipline and field called “growth hacking.” ‎有一整个学科领域 叫做“增长黑客”
[26:42] Teams of engineers whose job is to hack people’s psychology ‎无数工程师团队 ‎他们的工作就是黑入人们的心理
[26:47] so they can get more growth. ‎让他们拥有更多的增长量
[26:48] They can get more user sign ups, more engagement. ‎他们能得到更多的用户注册 ‎更高的参与度
[26:51] They can get you to invite more people. ‎能让你邀请更多人
[26:52] After all the testing, all the iterating, all of this stuff, ‎各种测试、迭代 种种操作之后
[26:56] you know the single biggest thing we realized? ‎知道我们发现最重要现象是什么吗?
[26:57] Get any individual to seven friends in ten days. ‎让任何一个用户在十天内加到七个好友
[26:59] ‎(查马斯·帕里哈皮提亚 ‎脸书前增长副总裁)
[27:01] That was it. ‎就是这个
[27:02] Chamath was the head of growth at Facebook early on, ‎查马斯是脸书早期的增长负责人
[27:05] and he’s very well known in the tech industry ‎他在技术领域非常知名
[27:08] for pioneering a lot of the growth tactics ‎因为他首创了很多增长手段
[27:11] that were used to grow Facebook at incredible speed. ‎这些手段的使用 ‎让脸书用户飞速增长
[27:14] And those growth tactics have then become the standard playbook for Silicon Valley. ‎后来那些增长手段 ‎变成了硅谷的标准战术
[27:18] They were used at Uber and at a bunch of other companies. ‎在优步使用了 ‎在很多其他公司也使用了
[27:21] One of the things that he pioneered was the use of scientific A/B testing ‎他首创的一个东西 ‎就是对功能上的小变化
[27:27] of small feature changes. ‎使用科学A/B测试
[27:29] Companies like Google and Facebook ‎谷歌和脸书这种公司
[27:31] would roll out lots of little, tiny experiments ‎会推出很多小实验
[27:34] that they were constantly doing on users. ‎他们不断在用户身上测试
[27:36] And over time, by running these constant experiments, ‎随着时间发展 ‎不停运行这些实验之后
[27:39] you… you develop the most optimal way ‎就能发展出能让用户 ‎做你想让他们做的事
[27:43] to get users to do what you want them to do. ‎的最佳方式
[27:45] It’s… It’s manipulation. ‎这就是操纵
[27:47] Uh, you’re making me feel like a lab rat. ‎你让我感觉自己像是实验室的小白鼠
[27:49] You are a lab rat. We’re all lab rats. ‎你就是实验室的小白鼠 ‎我们所有人都是
[27:52] And it’s not like we’re lab rats for developing a cure for cancer. ‎我们和开发治愈癌症药物的 ‎实验室小白鼠又不一样
[27:55] It’s not like they’re trying to benefit us. ‎这些实验的最终获益人 不是我们
[27:58] Right? We’re just zombies, and they want us to look at more ads ‎对吧?我们就像僵尸一样 ‎他们想让我们看更多的广告
[28:01] so they can make more  money. ‎让他们挣更多的钱
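A minimal Python sketch of the constant A/B experiments described above: users are split at random between two variants of a small feature change, and the variant that holds attention longer ships. The engagement model and every number here are invented for illustration.

import random

def simulated_session_minutes(variant):
    # Hypothetical engagement model: variant "B" (say, a brighter
    # notification badge) holds users slightly longer on average.
    base = max(0.0, random.gauss(20.0, 5.0))
    return base * (1.08 if variant == "B" else 1.0)

def ab_test(num_users=10000):
    # Randomly assign each user, then compare mean session length.
    totals = {"A": 0.0, "B": 0.0}
    counts = {"A": 0, "B": 0}
    for _ in range(num_users):
        variant = random.choice(["A", "B"])
        totals[variant] += simulated_session_minutes(variant)
        counts[variant] += 1
    return {v: totals[v] / counts[v] for v in totals}

print(ab_test())  # ship whichever variant keeps people scrolling longer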
[28:03] Facebook conducted ‎脸书做了一个实验 ‎
[28:05] what they called “massive-scale contagion experiments.” 他们称之为“大规模传染实验”
[28:08] Okay. ‎好吧
[28:09] How do we use subliminal cues on the Facebook pages ‎我们怎样用脸书页面上的潜意识信号
[28:13] to get more people to go vote in the midterm elections? ‎来让更多人在中期选举中投票?
[28:17] And they discovered that they were able to do that. ‎他们发现 他们能做到
[28:20] One thing they concluded is that we now know ‎他们得出的一个结论是 ‎现在我们知道
[28:24] we can affect real world behavior and emotions ‎我们能影响现实世界中的行为和情感
[28:28] without ever triggering the user’s awareness. ‎而根本不用触发用户的意识
[28:33] They are completely clueless. ‎他们自己完全不知道
[28:38] We’re pointing these engines of AI back at ourselves ‎我们将这些人工智能引擎 ‎指回到我们身上
[28:42] to reverse engineer what elicits responses from us. ‎来逆向工程出 是什么在引发我们的反应
[28:47] Almost like you’re stimulating nerve cells on a spider ‎就像你刺激蜘蛛身上的神经细胞
[28:49] to see what causes its legs to respond. ‎来看是什么引起它的腿反应
[28:51] So, it really is this kind of prison experiment ‎就是这种监狱实验
[28:54] where we’re just, you know, roping people into the matrix, ‎我们在实验中 捆绑人们进入矩阵
[28:56] and we’re just harvesting all this money and… and data from all their activity ‎我们从他们的行为中获取金钱和数据 ‎
[29:00] to profit from. 用他们的行为牟利
[29:01] And we’re not even aware that it’s happening. ‎我们甚至都不知道 发生了这些
[29:04] So, we want to psychologically figure out how to manipulate you as fast as possible ‎我们想在心理学上弄清楚 ‎怎样以最快的速度操纵你
[29:07] and then give you back that dopamine hit. ‎然后把多巴胺的快感回馈给你
[29:10] We did that brilliantly at Facebook. ‎我们在脸书做得非常出色
[29:12] Instagram has done it. WhatsApp has done it. ‎Instagram也这样做了 ‎WhatsApp也这样做了
[29:15] You know, Snapchat has done it. Twitter has done it. ‎阅后即焚图片分享也这样做了 ‎推特也这样做了
[29:17] I mean, it’s exactly the kind of thing ‎这种东西正是
[29:19] that a… that a hacker like myself would come up with ‎我这种黑客能想出来的
[29:22] because you’re exploiting a vulnerability in… in human psychology. ‎因为你在利用人类心理中的脆弱挣钱
[29:27] ‎(肖恩·帕克 脸书前总裁)
[29:27] And I just… I think that we… ‎我只是想
[29:29] you know, the inventors, creators… 我们这些发明者、创造者…
[29:33] uh, you know, and it’s me, it’s Mark, it’s the… ‎有我 有马克 有…
[29:37] you know, Kevin Systrom at Instagram… It’s all of these people… ‎Instagram的凯文·斯特罗姆 ‎所有这些人…
[29:40] um, understood this consciously, and we did it anyway. ‎意识里非常清楚 但我们依然利用了
[29:50] No one got upset when bicycles showed up. ‎自行车问世的时候 没有人不满
[29:55] Right? Like, if everyone’s starting to go around on bicycles, ‎对吧?所有人都开始用自行车出行
[29:58] no one said, “Oh, my God, we’ve just ruined society. ‎没有人说 ‎“天啊 我们刚刚毁掉了社会
[30:01] Like, bicycles are affecting people. ‎因为自行车能影响人
[30:03] They’re pulling people away from their kids. ‎拉远了他们和孩子间的距离
[30:05] They’re ruining the fabric of democracy. People can’t tell what’s true.” ‎他们在毁掉民主的结构 ‎人们无法判断真假了”
[30:08] Like, we never said any of that stuff about a bicycle. ‎对于自行车 ‎我们从来没有说过这种话
[30:12] If something is a tool, it genuinely is just sitting there, ‎如果一个东西是工具 ‎它就只是真的待在那里
[30:16] waiting patiently. ‎耐心等待
[30:19] If something is not a tool, it’s demanding things from you. ‎如果一个东西不是工具 ‎它会在你身上有所求
[30:22] It’s seducing you. It’s manipulating you. It wants things from you. ‎引诱你、操纵你 想从你身上获利
[30:26] And we’ve moved away from having a tools based technology environment ‎我们已经走过了 ‎以工具为基础的技术环境
[30:31] to an addiction and manipulation based technology environment. ‎来到了以致瘾和操纵 ‎为基础的技术环境
[30:34] That’s what’s changed. ‎这是技术环境的改变
[30:35] Social media isn’t a tool that’s just waiting to be used. ‎社交媒体 ‎不是原地等在那里被使用的工具
[30:39] It has its own goals, and it has its own means of pursuing them ‎它有自己的目标 ‎有自己的办法去实现这些目标
[30:43] by using your psychology against you. ‎利用你的心理 来对付你
[30:49] ‎(“只有两个行业 ‎把他们的客户叫做‘使用者’
[30:52] ‎非法毒品和软件”)
[30:55] ‎(——爱德华·塔夫特)
[30:57] Rewind a few years ago, I was the… ‎回想几年之前
[31:00] I was the president of Pinterest. ‎我是Pinterest的总经理
[31:03] I was coming home, ‎我回到家
[31:05] and I couldn’t get off my phone once I got home, 到家之后就无法放下手机
[31:08] despite having two young kids who needed my love and attention. ‎虽然我有两个小孩子 需要我的关爱
[31:12] I was in the pantry, you know, typing away on an e-mail ‎我在食物储藏室里打字回邮件
[31:15] or sometimes looking at Pinterest. ‎有时候会看Pinterest
[31:18] I thought, “God, this is classic irony. ‎我想:“天啊 这真是典型的讽刺
[31:19] I am going to work during the day ‎我白天去工作
[31:22] and building something that then I am falling prey to.” ‎打造一个后来连我自己 ‎都沦为其猎物的东西”
[31:26] And I couldn’t… I mean, some of those moments, I couldn’t help myself. ‎我无法… 有时候 ‎我真的情不自禁使用
[31:32] The one that I’m… I’m most prone to is Twitter. ‎我最无法摆脱的是推特
[31:36] Uh, used to be Reddit. ‎以前无法摆脱的是Reddit
[31:38] I actually had to write myself software to break my addiction to reading Reddit. ‎我后来不得不给自己写程序 ‎来切断我阅读Reddit的瘾
[31:45] I’m probably most addicted to my e-mail. ‎我最成瘾的 可能是邮件
[31:47] I mean, really. I mean, I… I feel it. ‎真的 我是认真的 我自己能感觉到
[31:52] Well, I mean, it’s sort of… it’s interesting ‎这很有趣
[31:55] that knowing what was going on behind the curtain, ‎我很清楚 这一切的幕后发生着什么
[31:58] I still wasn’t able to control my usage. ‎我还是无法控制自己去使用
[32:01] So, that’s a little scary. ‎这就有点可怕了
[32:03] Even knowing how these tricks work, I’m still susceptible to them. ‎即便知道这些手段的原理 ‎我还是容易受到它们影响
[32:07] I’ll still pick up the phone, and 20 minutes will disappear. ‎我拿起手机 ‎20分钟就不知不觉过去了
[32:12] Do you check your smartphone before you pee in the morning ‎你晨起排尿之前 ‎会看一眼智能手机吗?
[32:15] or while you’re peeing in the morning? ‎或者晨起排尿过程中 会看吗?
[32:17] ‘Cause those are the only two choices. ‎因为只有这两个选择
[32:19] I tried through willpower, just pure willpower… ‎我试过用意志力克制 纯意志力…
[32:23] “I’ll put down my phone, I’ll leave my phone in the car when I get home.” ‎“我要放下手机 ‎我到家之后 要把手机丢在车里”
[32:26] I think I told myself a thousand times, a thousand different days, ‎我大概在一千个不同的日子里 ‎告诉过自己一千次
[32:30] “I am not gonna bring my phone to the bedroom,” ‎“我不要把手机带到卧室”
[32:32] and then 9:00 p.m. rolls around. ‎然后晚上九点到了
[32:34] “Well, I wanna bring my phone in the bedroom.” ‎“哎 我想把手机带进卧室”
[32:37] And so, that was sort of… ‎这就有点…
[32:39] Willpower was kind of attempt one, ‎意志力算是第一种尝试
[32:41] and then attempt two was, you know, brute force. ‎第二种尝试就是来硬的 用蛮力
[32:44] Introducing the Kitchen Safe. The Kitchen Safe is a revolutionary, ‎隆重介绍“厨房保险箱” ‎“厨房保险箱”
[32:48] new, time-locking container that helps you fight temptation. 是革命性的全新发明 ‎帮你战胜诱惑的 ‎时间锁保鲜盒
[32:51] All David has to do is place those temptations in the Kitchen Safe. ‎大卫只需要把各种诱惑 ‎放进这个“厨房保险箱”
[32:57] Next, he rotates the dial to set the timer. ‎接下来 他转动旋钮 设置计时器
[33:01] And, finally, he presses the dial to activate the lock. ‎最后 他按下旋钮 激活锁定
[33:04] The Kitchen Safe is great… ‎“厨房保险箱”超级好
[33:05] We have that, don’t we? ‎我们家有 是吧?
[33:06] …video games, credit cards, and cell phones. ‎…电子游戏、信用卡、手机
[33:08] Yeah, we do. ‎对 有
[33:09] Once the Kitchen Safe is locked, it cannot be opened ‎“厨房保险箱”一旦上锁 ‎直到计时器归零之前
[33:12] until the timer reaches zero. 没办法打开
[33:13] So, here’s the thing. ‎问题是
[33:15] Social media is a drug. 社交媒体就是一种毒品
[33:17] I mean, we have a basic biological imperative ‎我们有着基本的生物学欲望
[33:20] to connect with other people. ‎去和别人联系
[33:23] That directly affects the release of dopamine in the reward pathway. ‎这直接影响着 ‎奖赏通路中的多巴胺释放
[33:28] Millions of years of evolution, um, are behind that system ‎这个机制背后 是几百万年的进化
[33:32] to get us to come together and live in communities, ‎让我们聚在一起 群居生活
[33:35] to find mates, to propagate our species. ‎找到伴侣 繁殖我们的物种
[33:38] So, there’s no doubt that a vehicle like social media, ‎所以 毫无疑问 社交媒体这种载体
[33:41] which optimizes this connection between people, ‎它会优化人们之间的联系
[33:45] is going to have the potential for addiction. ‎自然会有致瘾的可能性
[33:52] Mmm! Dad, stop! ‎爸 停!
[33:55] I have, like, 1,000 more snips to send before dinner. ‎我在晚饭前 还有上千个消息要回
[33:58] Snips? I don’t know what a snip is. ‎消息?
[33:59] ‎我不知道消息是什么
[34:00] Mm, that smells good, baby. All right. Thank you. ‎ 闻起来好香 宝贝 ‎ 谢谢
[34:03] I was, um, thinking we could use all five senses ‎我想 我们可以用所有五官
[34:05] to enjoy our dinner tonight. ‎来享受今晚的晚餐
[34:07] So, I decided that we’re not gonna have any cell phones at the table tonight. ‎所以我决定 ‎今晚的餐桌上不能使用手机
[34:11] So, turn ’em in. ‎好了 交上来
[34:13] Really? Yep. ‎ 真的吗? ‎ 是
[34:15] All right. Thank you. Ben? ‎ 好吧 ‎ 谢谢 本?
[34:18] Okay. Mom, the phone pirate. ‎ 好 ‎ 妈妈是手机海盗
[34:21] Got it. Mom! ‎ 拿走了 ‎ 妈!
[34:22] So, they will be safe in here until after dinner… ‎晚餐结束前 ‎这些手机会安全地放在这里
[34:27] and everyone can just chill out. ‎所有人可以安静待着了
[34:30] Okay? 好吗?
[34:47] Can I just see who it is? No. ‎ 我能看一眼是谁吗? ‎ 不行
[34:54] Just gonna go get another fork. ‎我去再拿一个叉子
[34:58] Thank you. ‎谢谢
[35:04] Honey, you can’t open that. ‎宝贝 不能打开
[35:06] I locked it for an hour, so just leave it alone. ‎我锁上了一个小时 别动了
[35:11] So, what should we talk about? ‎我们要聊点什么?
[35:13] Well, we could talk ‎我们可以聊聊
[35:14] about the, uh, Extreme Center wackos I drove by today. 我今天开车 ‎身边经过的极端中心政党疯子
[35:17] Please, Frank. What? ‎ 算了 弗兰科 ‎ 怎么了?
[35:18] I don’t wanna talk about politics. ‎我不想聊政治
[35:20] What’s wrong with the Extreme Center? See? He doesn’t even get it. ‎ 极端中心怎么了? ‎ 看吧?他都没明白
[35:23] It depends on who you ask. ‎取决于你问谁
[35:24] It’s like asking, “What’s wrong with propaganda?” ‎这就像是你问:“政治鼓吹怎么了?”
[35:28] Isla! ‎艾拉!
[35:32] Oh, my God. ‎天啊
[35:36] Do you want me to… Yeah. ‎ 需要我去… ‎ 嗯
[35:41] I… I’m worried about my kids. ‎我很担心我的孩子们
[35:44] And if you have kids, I’m worried about your kids. ‎等你们有了孩子 ‎我还会担心你们的孩子
[35:46] Armed with all the knowledge that I have and all of the experience, ‎虽然我有着各种知识储备 各种经验
[35:50] I am fighting my kids about the time ‎我还是和孩子们争论
[35:52] that they spend on phones and on the computer. ‎他们使用手机和电脑的时间
[35:54] I will say to my son, “How many hours do you think you’re spending on your phone?” ‎我会对我儿子说 ‎“你觉得自己会在手机上花多久?”
[35:58] He’ll be like, “It’s, like, half an hour. It’s half an hour, tops.” ‎他会说 ‎“也就半小时吧 最多半小时了”
[36:01] I’d say upwards of an hour, hour and a half. ‎我觉得比一小时多一点 一个半小时
[36:04] I looked at his screen report a couple weeks ago. ‎几周之前看了他的屏幕使用报告
[36:06] Three hours and 45 minutes. That… ‎ 是3小时45分 ‎ 那不是…
[36:11] I don’t think that’s… No. Per day, on average? ‎我觉得没有… 平均每天?
[36:13] Yeah. Should I go get it right now? ‎ 对 ‎ 我现在去拿来吗?
[36:15] There’s not a day that goes by that I don’t remind my kids ‎每一天 我都要提醒我的孩子们
[36:19] about the pleasure pain balance, ‎愉悦和痛苦的平衡
[36:21] about dopamine deficit states, ‎多巴胺短缺的状态
[36:24] about the risk of addiction. 上瘾的风险
[36:26] Moment of truth. ‎ 对 ‎ 来揭晓真相
[36:27] Two hours, 50 minutes per day. ‎每天2小时50分
[36:29] Let’s see. Actually, I’ve been using a lot today. ‎ 我们看看 ‎ 其实 我今天用了很多
[36:31] Last seven days. That’s probably why. ‎ 过去七天 ‎ 可能这就是原因
[36:33] Instagram, six hours, 13 minutes. Okay, so my Instagram’s worse. ‎Instagram 6小时13分 ‎好吧 我使用Instagram是最严重的
[36:39] My screen’s completely shattered. ‎我的屏幕彻底碎了
[36:42] Thanks, Cass. 谢谢你 卡桑
[36:44] What do you mean, “Thanks, Cass”? ‎“谢谢你 卡桑”是什么意思?
[36:46] You keep freaking Mom out about our phones when it’s not really a problem. ‎你一直让妈妈担心我们的手机问题 ‎但其实这根本不是问题
[36:49] We don’t need our phones to eat dinner! ‎我们吃晚餐不需要手机
[36:51] I get what you’re saying. It’s just not that big a deal. It’s not. ‎我明白你说的 ‎但这又不是什么大事 没什么啊
[36:56] If it’s not that big a deal, don’t use it for a week. ‎不是什么大事 那就一周别用手机
[37:01] Yeah. Yeah, actually, if you can put that thing away for, like, a whole week… ‎对 ‎对 其实 如果你能把那东西 ‎收起来一整周…
[37:07] I will buy you a new screen. ‎我就给你买一个新的屏幕
[37:10] Like, starting now? Starting now. ‎ 从现在开始吗? ‎ 现在开始
[37:15] Okay. You got a deal. Okay. ‎好 成交
[37:16] Okay, you gotta leave it here, though, buddy. ‎好 不过你要放在这里 小朋友
[37:19] All right, I’m plugging it in. ‎好 我把它放进去
[37:22] Let the record show… I’m backing away. ‎请大家作证… 我退后了
[37:25] Okay. ‎好
[37:27] You’re on the clock. One week. ‎ 计时开始了 ‎ 一周
[37:29] Oh, my… ‎天啊…
[37:31] Think he can do it? ‎你觉得他能做到吗?
[37:33] I don’t know. We’ll see. ‎不知道 走着瞧
[37:35] Just eat, okay? ‎你吃饭吧 好吗?
[37:44] Good family dinner! ‎美好的家庭晚餐!
[37:47] These technology products were not designed ‎这些技术产品不是由
[37:49] by child psychologists who are trying to protect and nurture children. ‎努力保护和培育孩子的 ‎儿童心理学家设计的
[37:53] They were just designing to make these algorithms ‎它们的设计 是让这些算法
[37:56] that were really good at recommending the next video to you ‎非常擅于给你推荐下一个视频
[37:58] or really good at getting you to take a photo with a filter on it. ‎非常擅于让你拍照加滤镜
[38:03] ‎(两个赞)
[38:13] ‎(确定删除吗?否)
[38:15] ‎(是)
[38:16] It’s not just that it’s controlling ‎这些东西不仅在控制 ‎
[38:18] where they spend their attention. 他们把注意力花在哪里
[38:21] Especially social media starts to dig deeper and deeper down into the brain stem ‎尤其是社交媒体越来越深入大脑根部
[38:26] and take over kids’ sense of self worth and identity. ‎夺走孩子们的判断力 ‎自我价值和身份
[38:29] ‎(美化我)
[38:42] ‎(莉莉:可爱!)
[38:43] ‎(索薇娅:天啊 好美)
[38:44] ‎(奥利维亚:你太美了)
[38:46] ‎(阿瓦:你把耳朵P大了吗?)
[38:48] ‎(哈哈)
[38:52] We evolved to care about whether other people in our tribe… ‎我们进化出 ‎在意我们社群中的其他人…
[38:55] ‎(布里亚纳:漂亮!)
[38:56] think well of us or not ’cause it matters. ‎…是否对我们有好印象的机制 ‎因为这很重要
[38:59] But were we evolved to be aware of what 10,000 people think of us? ‎但我们的进化 需要我们在意 ‎一万个人怎么看我们吗?
[39:04] We were not evolved to have social approval being dosed to us ‎我们的进化 ‎不需要每隔五分钟
[39:08] every five minutes. ‎就获得一次社交认可
[39:10] That was not at all what we were built to experience. ‎这根本不是我们需要去体验的
[39:15] We curate our lives around this perceived sense of perfection ‎我们围绕这种感知到的完美感 ‎来精心打造自己的生活
[39:20] because we get rewarded in these short term signals ‎因为爱心、点赞、竖起大拇指 ‎这些短期的信号
[39:23] hearts, likes, thumbs up ‎给我们奖赏
[39:25] and we conflate that with value, and we conflate it with truth. ‎我们把它与价值混为一谈 ‎与真相混为一谈
[39:29] And instead, what it really is is fake, brittle popularity… ‎而它实际上是虚假的、脆弱的人气…
[39:33] that’s short-term and that leaves you even more, and admit it, ‎这是短期的 承认吧 这让你比之前更加
[39:37] vacant and empty before you did it. ‎空虚和空洞
[39:41] Because then it forces you into this vicious cycle ‎因为这样 ‎它会将你逼入这样一个恶性循环
[39:43] where you’re like, “What’s the next thing I need to do now?  ‘Cause I need it back.” ‎你会想:“我接下来要做什么? ‎因为我还想要这种感觉”
[39:48] Think about that compounded by two billion people, ‎想一下 这种效应在20亿人身上叠加放大
[39:50] and then think about how people react then to the perceptions of others. ‎然后想一下 之后人们 ‎会怎样回应别人对自己的看法
[39:54] It’s just a… It’s really bad. ‎真的… 真的很恶劣
[39:56] It’s really, really bad. ‎真的太恶劣了
[40:00] There has been a gigantic increase ‎美国青少年群体中
[40:03] in depression and anxiety for American teenagers ‎出现了大幅增长的抑郁和焦虑
[40:06] which began right around… between 2011 and 2013. ‎大概就在2011年到2013年开始的
[40:11] The number of teenage girls out of 100,000 in this country ‎这个国家中 每十万名少女中
[40:15] who were admitted to a hospital every year ‎每年因为割腕或者自残
[40:17] because they cut themselves or otherwise harmed themselves, ‎进医院接受治疗的人数
[40:20] that number was pretty stable until around 2010, 2011, ‎在2010年到2011年是非常平稳的
[40:24] and then it begins going way up. ‎在那之后 直线上升
[40:28] It’s up 62 percent for older teen girls. ‎大一点的少女中 增加了62%
[40:32] ‎(美国非致命性自残住院人数)
[40:33] It’s up 189 percent for the preteen girls. That’s nearly triple. ‎进入青春期前的少女 增加了189% ‎将近三倍了
[40:40] Even more horrifying, we see the same pattern with suicide. ‎更可怕的是 ‎自杀也呈现出相同的趋势
[40:43] ‎(美国自杀率 ‎每百万女孩死亡人数)
[40:44] The older teen girls, 15 to 19 years old, ‎大一点的少女 15到19岁
[40:47] they’re up 70 percent,compared to the first decade of this century. ‎与本世纪初相比 增长了70%
[40:52] The preteen girls, who have very low rates to begin with, ‎青春期前的少女 ‎最开始的比率非常低
[40:55] they are up 151 percent. ‎现在增长了151%
[40:58] And that pattern points to social media. ‎这个增长模式 指向了社交媒体
[41:01] ‎(2009年手机上的社交媒体数量)
[41:04] Gen Z, the kids born after 1996 or so, ‎Z代人 ‎1996年之后那会儿出生的孩子们
[41:07] those kids are the first generation in history ‎那些孩子们是历史上第一代
[41:10] that got on social media in middle school. ‎在初中开始使用社交媒体的
[41:15] How do they spend their time? ‎她们的时间花在了哪里呢?
[41:19] They come home from school, and they’re on their devices. ‎她们放学回家 就拿起手机
[41:24] A whole generation is more anxious, more fragile, more depressed. ‎整个一代人都更加焦虑 ‎更加脆弱、更加抑郁
[41:30] They’re much less comfortable taking risks. 她们更不愿意冒险
[41:34] The rates at which they get driver’s licenses have been dropping. ‎她们拿到驾照的比率下降了
[41:38] The number who have ever gone out on a date ‎出去约会过的人数
[41:41] or had any kind of romantic interaction is dropping rapidly. ‎有过任何形式浪漫互动的人数骤减
[41:47] This is a real change in a generation. ‎整个一代人 有了真正的改变
[41:53] And remember, for every one of these, for every hospital admission, ‎别忘了 这些人中的每一个 ‎每一个住院的人
[41:57] there’s a family that is traumatized and horrified. ‎背后都有一个受伤的、惊恐的家庭
[42:00] “My God, what is happening to our kids?” ‎“天啊 我的孩子们怎么了?”
[42:19] It’s plain as day to me. ‎在我看来 问题很显而易见
[42:22] These services are killing people… and causing people to kill themselves. ‎这些服务正在杀人 ‎也在导致人们自杀
[42:29] I don’t know any parent who says, “Yeah, I really want my kids to be growing up ‎我不认识哪个家长会说 ‎“是 我希望我的孩子们 成长过程中
[42:33] feeling manipulated by tech designers, uh, ‎感觉被技术设计师操控
[42:36] manipulating their attention, making it impossible to do their homework, ‎操控他们的注意力 ‎让他们无法完成作业
[42:39] making them compare themselves to unrealistic standards of beauty.” ‎让他们将自己 ‎和不切实际的审美标准相对比”
[42:42] Like, no one wants that. ‎没有人希望那样
[42:45] No one does. ‎没有一个人
[42:46] We… We used to have these protections. ‎我们以前有一些保护措施
[42:48] When children watched Saturday morning cartoons, ‎小孩子们观看周六早间动画片的时候
[42:51] we cared about protecting children. ‎我们关心保护儿童
[42:52] We would say, “You can’t advertise to these age children in these ways.” ‎我们会说:“你不能这样 ‎给这个年龄段的孩子看广告”
[42:57] But then you take YouTube for Kids, ‎然后有了YouTube儿童频道
[42:58] and it gobbles up that entire portion of the attention economy, ‎吞掉了注意力经济中的那一整块
[43:02] and now all kids are exposed to YouTube for Kids. ‎现在所有的孩子 ‎都能看YouTube儿童频道
[43:04] And all those protections and all those regulations are gone. ‎所有的保护措施 ‎所有的管理规定都不见了
[43:18] We’re training and conditioning a whole new generation of people… ‎我们在训练、驯化整个新的一代人…
[43:23] that when we are uncomfortable or lonely or uncertain or afraid, ‎我们不自在、孤独、不确定或害怕时
[43:29] we have a digital pacifier for ourselves ‎我们有一个给自己的数字安抚奶嘴
[43:32] that is kind of atrophying our own ability to deal with that. ‎这有点让我们 ‎自己处理这些情绪的能力退化了
[43:53] Photoshop didn’t have 1,000 engineers ‎Photoshop屏幕的另一端 ‎
[43:55] on the other side of the screen, using notifications, using your friends, ‎没有一千个工程师 利用通知 利用你的朋友
[43:59] using AI to predict what’s gonna perfectly addict you, or hook you, ‎用人工智能去预判 ‎什么能完美地让你上瘾 引诱你
[44:02] or manipulate you, or allow advertisers ‎或者操纵你 或者允许广告商
[44:04] to test 60,000 variations of text or colors to figure out ‎去测试六万种不同的文本或颜色
[44:06] ‎(芝加哥反垄断技术会议)
[44:08] what’s the perfect manipulation of your mind. ‎…以此来找到 ‎怎样能完美地操纵你的思想
[44:11] This is a totally new species of power and influence. ‎这是一种全新的力量和影响
[44:16] I… I would say, again, the methods used ‎我想再一次强调 他们使用的
[44:19] to play on people’s ability to be addicted or to be influenced ‎利用人们易成瘾、易受影响特点的手段
[44:22] may be different this time, and they probably are different. ‎或许这一次是不同的 ‎或许他们是不同的
[44:25] They were different when newspapers came in and the printing press came in, ‎报纸问世时 印刷媒体问世时 ‎与现在很不同
[44:28] and they were different when television came in, ‎电视问世的时候 与现在很不同
[44:31] and you had three major networks and… ‎当时有三大电视网…
[44:34] At the time. At the time. That’s what I’m saying. ‎ 在当时 ‎ 在当时 我说的就是这个意思
[44:36] But I’m saying the idea that there’s a new level ‎但我在说 这是一个全新的层次
[44:38] and that new level has happened so many times before. ‎这个新的层次 以前也发生过很多次
[44:42] I mean, this is just the latest new level that we’ve seen. ‎这只是我们看见的 最新的一个层次
[44:45] There’s this narrative that, you know, “We’ll just adapt to it. ‎有这样一种说法:“我们去适应它
[44:48] We’ll learn how to live with these devices, ‎我们要学着与这些设备共存
[44:51] just like we’ve learned how to live with everything else.” ‎就像我们学着 ‎和其他所有事物共存一样”
[44:53] And what this misses is there’s something distinctly new here. ‎但这个说法没有注意到的是 ‎有些东西明显是全新的
[44:57] Perhaps the most dangerous piece of all this is the fact ‎或许这其中最危险的是
[45:00] that it’s driven by technology that’s advancing exponentially. ‎这是由技术驱动的 ‎在成指数地向前发展
[45:04] ‎(计算机处理能力)
[45:05] Roughly, if you say from, like, the 1960s to today, ‎大体上 ‎如果你看从20世纪60年代至今
[45:09] processing power has gone up about a trillion times. ‎计算机处理能力增长了万亿倍
[45:13] Nothing else that we have has improved at anything near that rate. ‎我们身边没有任何其他东西 ‎以这个速率增长
[45:18] Like, cars are, you know, roughly twice as fast. ‎比如 汽车速度基本上才实现翻倍
[45:22] And almost everything else is negligible. ‎几乎所有其他的东西都显得微不足道
[45:25] And perhaps most importantly, ‎或许最重要的是
[45:27] our human our physiology, our brains have evolved not at all. ‎我们人类… 我们的生理 ‎我们的大脑 根本没有丝毫进化
[45:31] ‎(没有手机的时间)
[45:37] Human beings, at a mind and body and sort of physical level, ‎人类的思想、身体和体质
[45:41] are not gonna fundamentally change. ‎基本上不会改变了
[45:47] I know, but they… ‎我知道 但是他们…
[45:56] We can do genetic engineering and develop new kinds of human beings, ‎我们可以搞基因工程 ‎在未来开发新的人类物种
[46:01] but realistically speaking, you’re living inside of hardware, a brain, ‎但是现实来讲 ‎你生活在大脑这个硬件下
[46:05] that was, like, millions of years old, ‎已经存在几百万年了
[46:07] and then there’s this screen, and then on the opposite side of the screen, ‎然后出现了这样一个屏幕 ‎在屏幕的另一端
[46:10] there’s these thousands of engineers and supercomputers ‎有上千名工程师和超级计算机
[46:13] that have goals that are different than your goals, ‎有着与你不同的目标
[46:16] and so, who’s gonna win in that game? Who’s gonna win? ‎那么 这个游戏谁能赢呢?谁会赢?
[46:25] How are we losing? ‎我们怎么会输?
[46:27] I don’t know. Where is he? This is not normal. ‎ 我不知道 ‎ 他在哪里?这太不正常了
[46:29] Did I overwhelm him with friends and family content? ‎我给他推送朋友和家人的内容太多 ‎他烦了吗?
[46:32] Probably. Well, maybe it was all the ads. ‎ 或许吧 ‎ 或许是广告太多了
[46:34] No. Something’s very wrong. Let’s switch to resurrection mode. ‎不 一定出了严重的问题 ‎我们切换到复苏模式吧
[46:39] When you think of AI, you know, an AI’s gonna ruin the world, ‎当你想到人工智能 ‎人工智能会毁掉世界
[46:44] and you see, like, a Terminator, and you see Arnold Schwarzenegger. ‎你会看到《终结者》 看到施瓦辛格…
[46:47] I’ll be back. ‎我会回来的
[46:48] You see drones, and you think, like, ‎…你会看到无人机 你觉得 ‎
[46:51] “Oh, we’re gonna kill people with AI.” “人工智能会杀人的”
[46:53] And what people miss is that AI already runs today’s world right now. ‎人们忽略的是 人工智能 ‎现在已经在运营着当今世界了
[46:59] Even talking about “an AI” is just a metaphor. ‎甚至谈论“人工智能”都只是暗喻
[47:03] At these companies like… like Google, there’s just massive, massive rooms, ‎在谷歌这种公司 有超级大的房间
[47:10] some of them underground, some of them underwater, ‎有些在地下 有些在水下
[47:13] of just computers. ‎房间里全是电脑
[47:14] Tons and tons of computers, as far as the eye can see. ‎无数的电脑 连绵不绝
[47:18] They’re deeply interconnected with each other ‎它们互相之间在内部深度连接
[47:20] and running extremely complicated programs, ‎在运行着极其复杂的程序
[47:23] sending information back and forth between each other all the time. ‎始终不停地在彼此之间交换信息
[47:26] And they’ll be running many different programs, 它们会运行很多不同的程序
[47:28] many different products on those same machines. ‎在同样的机器上 有不同的产品
[47:31] Some of those things could be described as simple algorithms, ‎有些东西可以被描述为简单算法
[47:33] some could be described as algorithms ‎有些算法太过复杂
[47:35] that are so complicated, you would call them intelligence. ‎就可以被称为“智能”
[47:40] I like to say that algorithms are opinions embedded in code… ‎我想说 算法是内嵌在代码中的观点…
[47:45] and that algorithms are not objective. ‎算法并不是客观的
[47:48] Algorithms are optimized to some definition of success. ‎算法被某种成功的定义优化
[47:52] So, if you can imagine, ‎所以 如果你能想象
[47:53] if a… if a commercial enterprise builds an algorithm ‎如果一家商业公司
[47:57] to their definition of success, ‎按照自己对成功的定义构建算法
[47:59] it’s a commercial interest. ‎那这个定义就是商业利益
[48:01] It’s usually profit. ‎通常都有利润
[48:03] You are giving the computer the goal state, “I want this outcome,” ‎你给电脑一个目标 ‎说“我想要这个结果”
[48:07] and then the computer itself is learning how to do it. ‎电脑自己去学习 怎样实现
[48:10] That’s where the term “machine learning” comes from. ‎这是“机器学习”概念的由来
[48:12] And so, every day, it gets slightly better ‎所以 每一天 都会更好一点
[48:14] at picking the right posts in the right order ‎以正确的顺序挑出正确的帖子
[48:17] so that you spend longer and longer in that product. ‎让你在这个产品上花的时间越来越多
[48:19] And no one really understands what they’re doing ‎没人能真正明白 为了实现这个目标
[48:22] in order to achieve that goal. ‎他们在做什么
[48:23] The algorithm has a mind of its own, so even though a person writes it, ‎算法有着自己的思想 虽然是人写的
[48:28] it’s written in a way ‎它的写法决定了
[48:30] that you kind of build the machine, and then the machine changes itself. ‎你搭建起这个机器 然后机器会自我改变
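A toy version of that feedback loop makes the point concrete: the program below ranks posts by weighted features, measures simulated session time, and keeps any random tweak to its own weights that grows that time. Real systems use large-scale gradient training rather than this hill climbing, and every feature name and number here is an invented assumption.

```python
import random

# Toy "the machine changes itself" loop: keep whatever weight tweak
# increases session time. The features, the fake user model, and all
# numbers are illustrative assumptions.
FEATURES = ["recency", "outrage", "friend_affinity", "novelty"]
weights = {f: random.random() for f in FEATURES}

def score(post: dict) -> float:
    return sum(weights[f] * post[f] for f in FEATURES)

def session_time(feed: list) -> float:
    # Stand-in for the user: pretend outrage-heavy feeds hold attention.
    return sum(p["outrage"] * 2 + p["friend_affinity"] for p in feed[:10])

def training_step(posts: list) -> None:
    global weights
    baseline = session_time(sorted(posts, key=score, reverse=True))
    trial = dict(weights)
    trial[random.choice(FEATURES)] += random.uniform(-0.1, 0.1)
    old, weights = weights, trial
    if session_time(sorted(posts, key=score, reverse=True)) < baseline:
        weights = old  # revert tweaks that don't grow watch time

posts = [{f: random.random() for f in FEATURES} for _ in range(50)]
for _ in range(1_000):
    training_step(posts)
```

After a thousand steps the weights encode a strategy nobody wrote down, which is exactly the sense in which "no one really understands what they're doing."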
[48:35] There’s only a handful of people at these companies, ‎在这些公司里 只有寥寥几人
[48:37] at Facebook and Twitter and other companies… ‎在脸书、推特和其他公司
[48:40] There’s only a few people who understand how those systems work, ‎只有几个人能明白 ‎这些系统的工作原理
[48:43] and even they don’t necessarily fully understand ‎就连他们也未必能完全理解
[48:46] what’s gonna happen with a particular piece of content. ‎某一条特定的内容 会发生什么
[48:49] So, as humans, we’ve almost lost control over these systems. ‎作为人类 我们几乎 ‎已经失去了对这些系统的控制
[48:55] Because they’re controlling, you know, the information that we see, ‎因为是它们在控制我们看到的信息
[48:59] they’re controlling us more than we’re controlling them. ‎更多的是它们在控制我们 ‎而不是我们控制它们
[49:02] Cross referencing him ‎在他的地理区域
[49:04] against comparables in his geographic zone. 对他和可以对比的人 ‎进行交叉参照
[49:07] His psychometric  doppelgangers. ‎他的心理测定相似者
[49:09] There are 13,694 people behaving just like him in his region. ‎在他的地区 ‎有13694个和他行为相似的人
[49:13] What’s trending with them? We need something actually good ‎ 他们中盛行什么? ‎ 我们需要真实的好东西
[49:16] for a proper resurrection, ‎才能进行有效的复苏
[49:17] given that the typical stuff isn’t working. ‎因为平常的那些东西已经不起作用了
[49:20] Not even that cute girl from school. ‎学校那个可爱的姑娘都没用了
[49:22] My analysis shows that going political with Extreme Center content ‎我的分析显示 ‎用极端中心内容搞政治
[49:25] has a 62.3 percent chance of long term engagement. ‎有62.3%的几率能够获得长期参与
[49:28] That’s not bad. ‎还不错
[49:29] It’s not good enough to lead with. ‎想用它引导 还不太够
[49:32] Okay, okay, so we’ve tried notifying him about tagged photos, ‎好 所以我们已经试过 ‎通知给他圈人照片
[49:35] invitations, current events, even a direct message from Rebecca. ‎邀请、时事、甚至是瑞贝卡的私信
[49:39] But what about User 01265923010? ‎但是用户01265923010呢?
[49:42] Yeah, Ben loved all of her posts. ‎是 本点赞了她所有的发帖
[49:44] For months and, like, literally all of them, and then nothing. ‎几个月的所有发帖 真的点了所有 ‎然后就没有然后了
[49:47] I calculate a 92.3 percent chance of resurrection ‎我算出了通知安娜的内容
[49:50] with a notification about Ana. ‎会有92.3%的复苏概率
[49:53] (New relationship) ‎(新感情)
[49:56] And her new friend. ‎还有她的新朋友
[49:58] (Time without a phone) ‎(没有手机的时间)
[50:24] (Your ex is in a new relationship!) ‎(你的前女友有了新感情!)
[50:25] Oh, you gotta be kiddin’ me. ‎不是吧
[50:35] Okay. ‎好吧
[50:37] (Ana and Luis are now in a relationship) ‎(安娜与路易斯正在热恋)
[50:38] What? ‎什么?
[50:41] Bam! We’re back! ‎当!我们回来了!
[50:42] Let’s get back to making money, boys. ‎我们继续挣钱 兄弟们
[50:44] Yes, and connecting Ben with the entire world. ‎好 让本和整个世界联系起来
[50:46] I’m giving him access to all the information he might like. ‎我给他看所有他可能喜欢的信息
[50:49] Hey, do you guys ever wonder if, you know, like, the feed is good for Ben? ‎你们是否想过 ‎这些推送对本是好的吗?
[50:57] No. No. ‎ 没想过 ‎ 没有
[51:17] I put a spell on you   ‎我在你身上下了咒语
[51:25] ‘Cause you’re mine   ‎因为你是我的
[51:34] You better stop the things you do ‎你最好停止你做的事情
[51:41] I ain’t lyin’ ‎我不骗你
[51:42] (A/B test: Extreme Center) ‎(A/B测试 极端中心)
[51:44] No, I ain’t lyin’ ‎不 我不骗你
[51:49] You know I can’t stand it ‎你知道我无法忍受
[51:53] You’re runnin’ around   ‎你在四处奔跑
[51:55] You know better, Daddy   ‎爸爸 你更清楚
[51:58] I can’t stand it ‘Cause you put me down ‎我无法忍受 因为你将我放下
[52:03] Yeah, yeah ‎耶
[52:06] I put a spell on you   ‎我在你身上下了咒语
[52:12] Because you’re mine ‎因为你是我的
[52:18] You’re mine ‎你是我的
[52:20] So, imagine you’re on Facebook… ‎想象一下 你在用脸书…
[52:24] and you’re effectively playing against this artificial intelligence ‎你的对手是人工智能
[52:29] that knows everything about you, ‎它知道你的一切
[52:31] can anticipate your next move, and you know literally nothing about it, ‎能够预测你未来的举动 ‎你对它却一无所知
[52:34] except that there are cat videos and birthdays on it. ‎只知道上面有猫咪视频和生日提醒
[52:37] That’s not a fair fight. ‎这根本不是公平的竞争
[52:41] Ben and Jerry, it’s time to go, bud! ‎本与杰瑞 该走了 孩子!
[52:51] Ben? ‎本?
[53:02] Ben. Mm. ‎本
[53:05] Come on. ‎快点
[53:07] School time. ‎该上学了
[53:08] Let’s go. 我们走
[53:13] (Center for Humane Technology) ‎(人道技术中心)
[53:31] How you doing today? Oh, I’m… I’m nervous. ‎ 你今天怎么样? ‎ 我很紧张
[53:33] Are ya? Yeah. ‎ 你紧张吗? ‎ 是啊
[53:37] We were all looking for the moment ‎我们一直都在等待这样一个时刻
[53:39] when technology would overwhelm human strengths and intelligence. ‎当技术会超越人类力量和智慧
[53:43] When is it gonna cross the singularity, replace our jobs, be smarter than humans? ‎它什么时候会跨越奇点 取代我们的工作 变得比人类更聪明?
[53:48] But there’s this much earlier moment… ‎但有更早的时刻…
[53:50] when technology exceeds and overwhelms human weaknesses. ‎技术超越人类的弱点时
[53:57] This point being crossed is at the root of addiction, ‎跨越这个临界点 正是上瘾、
[54:02] polarization, radicalization, outrage-ification, ‎两极分化、激进化、激化愤怒、
[54:04] vanity-ification, the entire thing. ‎激化虚荣这一切的根源
[54:07] This is overpowering human nature, ‎它在压制人类天性
[54:10] and this is checkmate on humanity. ‎这是对人类的将死一击
[54:30] I’m sorry. ‎很抱歉
[54:41] One of the ways I try to get people to understand ‎我努力让人们明白
[54:45] just how wrong feeds from places like Facebook are ‎脸书这种地方的推送 ‎有多错误的一种方式
[54:49] is to think about the Wikipedia. ‎是让他们去想想维基百科
[54:51] (New tab) ‎(新标签页)
[54:52] When you go to a page, you’re seeing the same thing as other people. ‎当你打开一个维基百科网页 ‎你和别人看到的东西是一样的
[54:55] (Wikipedia, the free encyclopedia) ‎(维基百科 自由的百科全书)
[54:56] So, it’s one of the few things online that we at least hold in common. ‎所以 这是网络上少有的 ‎我们统一共享的东西
[55:00] Now, just imagine for a second that Wikipedia said, ‎现在 想象一下 维基百科说
[55:03] “We’re gonna give each person a different customized definition, ‎“我们要给每一个人不同的个性化定义
[55:07] and we’re gonna be paid by people for that.” ‎有人给我们钱 让我们这样做”
[55:09] So, Wikipedia would be spying on you. Wikipedia would calculate, ‎维基百科就会监视你 会计算
[55:13] “What’s the thing I can do to get this person to change a little bit ‎“我要做什么 才能代表一些商业利益
[55:17] on behalf of some commercial interest?” Right? ‎让这个人产生一点改变?” 对吧?
[55:19] And then it would change the entry. ‎然后就会改变整个词条
[55:22] Can you imagine that? Well, you should be able to, ‎你能想象吗? ‎你应该能够想象得到
[55:24] ’cause that’s exactly what’s happening on Facebook. ‎因为在脸书页面上 就是这样的
[55:26] It’s exactly what’s happening in your YouTube feed. ‎你的YouTube推送 就是这样的
[55:29] When you go to Google and type in “Climate change is,” ‎当你登录谷歌 输入“气候变化是…”
[55:31] you’re going to see different results depending on where you live. ‎你会看到 根据你所居住的地区不同 ‎会出现不同的结果
[55:35] (Climate change is) ‎(气候变化是)
[55:36] In certain cities, you’re gonna see it  autocomplete ‎在某些城市 你会看到自动完成…
[55:38] with “climate change is a hoax.” ‎“气候变化是一场骗局”
[55:40] In other cases, you’re gonna see ‎在其他地方 你将会看到
[55:42] “climate change is causing the destruction of nature.” ‎“气候变化是对自然的破坏”
[55:44] And that’s a function not of what the truth is about climate change, ‎而这并不取决于气候变化的真相是什么
[55:48] but about where you happen to be Googling from ‎而是你在哪里进行谷歌搜索
[55:51] and the particular things Google knows about your interests. ‎以及谷歌对你个人兴趣的了解
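A sketch of the mechanism being described: completions ranked by what is popular near the searcher, not by any notion of truth. The regions and suggestion strings below are invented purely to mirror the example.

```python
# Illustrative location-keyed autocomplete; the data is invented to
# mirror the example above, not pulled from any real search engine.
SUGGESTIONS_BY_REGION = {
    "city_a": ["climate change is a hoax"],
    "city_b": ["climate change is causing the destruction of nature"],
}

def autocomplete(prefix: str, user_region: str) -> list[str]:
    """Return regionally popular completions for the typed prefix."""
    return [s for s in SUGGESTIONS_BY_REGION.get(user_region, [])
            if s.startswith(prefix)]

print(autocomplete("climate change is", "city_a"))  # the hoax completion
print(autocomplete("climate change is", "city_b"))  # the destruction completion
```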
[55:54] Even two friends who are so close to each other, ‎即便是两个非常亲近的朋友
[55:58] who have almost the exact same set of friends, ‎他们两个有着完全相同的朋友圈子
[56:00] they think, you know, “I’m going to news feeds on Facebook. ‎他们会认为 ‎“我们会看到脸书上的新推送
[56:02] I’ll see the exact same set of updates.” ‎会看到完全相同的更新” ‎
[56:05] But it’s not like that at all. 但事实远非如此
[56:06] They see completely different worlds ‎他们会看到完全不同的世界
[56:08] because they’re based on these computers calculating ‎因为这些是基于计算机的计算
[56:10] what’s perfect for each of them. ‎对每一个人来说 怎样最完美
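In code, the "two friends, two worlds" effect is nothing more than re-ranking one shared candidate pool against each user's inferred interest profile. A minimal sketch, with invented profiles and posts:

```python
# Minimal per-user feed ranking: same posts in, different worlds out.
# Topics, scores, and profiles are invented for illustration.
posts = [
    {"id": 1, "topic": "gardening"},
    {"id": 2, "topic": "conspiracy"},
    {"id": 3, "topic": "soccer"},
    {"id": 4, "topic": "politics"},
]

def personalized_feed(interests: dict, k: int = 2) -> list[int]:
    ranked = sorted(posts, key=lambda p: interests.get(p["topic"], 0.0),
                    reverse=True)
    return [p["id"] for p in ranked[:k]]

friend_a = {"gardening": 0.9, "soccer": 0.7}
friend_b = {"conspiracy": 0.8, "politics": 0.6}
print(personalized_feed(friend_a))  # [1, 3]
print(personalized_feed(friend_b))  # [2, 4]
```

The two friends share a platform and a friend list, yet not a single post overlaps between their feeds.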
[56:12] (Live) ‎(直播中)
[56:14] The way to think about it is it’s 2.7 billion Truman Shows. ‎想象这件事的一个方式是 ‎这是27亿人的《楚门的世界》
[56:18] Each person has their own reality, with their own… ‎每一个人都有自己的现实 自己的…
[56:22] facts. ‎事实
[56:23] Why do you think that, uh, Truman has never come close ‎你觉得楚门为什么到现在都 ‎从来没有接近
[56:27] to discovering the true nature of his world until now? ‎发现他所在世界的真实本质?
[56:31] We accept the reality of the world with which we’re presented. ‎我们接受了 ‎呈现在我们面前的世界就是现实
[56:34] It’s as simple as that. ‎就是这么简单
[56:35] (Live) ‎(直播中)
[56:36] Over time, you have the false sense that everyone agrees with you, ‎随着时间推移 你会有一种错觉 ‎觉得每一个人都认同你
[56:41] because everyone in your news feed sounds just like you. ‎因为给你推送的新闻中 ‎每个人都和你极其相似
[56:44] And that once you’re in that state, it turns out you’re easily manipulated, ‎一旦你达到了这种状态 ‎你就很容易被操纵了
[56:49] the same way you would be manipulated by a magician. ‎和你被魔术师操纵 是同样的方式
[56:51] A magician shows you a card trick and says, “Pick a card, any card.” ‎魔术师给你看纸牌魔术 跟你说 ‎“选一张牌 哪张都行”
[56:55] What you don’t realize was that they’ve done a set up, ‎你没有意识到的是 ‎他们早就给你设好陷阱了
[56:58] so you pick the card they want you to pick. ‎于是你选的那张牌 ‎是他们想让你选的
[57:00] And that’s how Facebook works. Facebook sits there and says, ‎这就是脸书的工作原理 ‎脸书坐在那里说
[57:03] “Hey, you pick your friends. You pick the links that you follow.” ‎“喂 选择你的朋友 ‎选择你关注的联系人”
[57:06] But that’s all nonsense. It’s just like the magician. ‎根本是胡扯 它就跟魔术师一样
[57:08] Facebook is in charge of your news feed. ‎脸书负责给你进行新闻推送
[57:11] We all simply are operating on a different set of facts. ‎我们都不过是 ‎在基于不同的一系列事实行事
[57:14] When that happens at scale, ‎当大范围发生时
[57:16] you’re no longer able to reckon with or even consume information ‎你就再也无法考虑甚至消化
[57:20] that contradicts with that world view that you’ve created. ‎与你所创造的世界观相悖的信息了
[57:23] That means we aren’t actually being objective, ‎那就意味着 ‎我们其实不是客观、
[57:26] constructive individuals. 有建设性的个体
[57:28] Open up your eyes, don’t believe the lies! Open up… ‎睁大你的眼睛 别相信谎言!睁大…
[57:32] And then you look over at the other side, ‎然后你扫了一眼另一边
[57:35] and you start to think, “How can those people be so stupid? ‎你开始想 ‎“这些人怎么会如此愚蠢?”
[57:38] Look at all of this information that I’m constantly seeing. ‎看看这些我一直看到的信息
[57:42] How are they not seeing that same information?” ‎他们怎么会看不到相同的信息?
[57:44] And the answer is, “They’re not seeing that same information.” ‎问题的答案是 ‎“他们没有看到相同的信息”
[57:47] Open up your eyes, don’t believe the lies! ‎睁大你的眼睛 别相信谎言!
[57:52] What are Republicans like? People that don’t have a clue. ‎共和党人什么样? ‎愚昧无知
[57:55] The Democrat Party is a crime syndicate, not a real political party. ‎民主党就是一个犯罪团伙 ‎不是真正的政治党派
[57:59] A huge new Pew Research Center study of 10,000 American adults ‎皮尤研究中心一项全新的大型研究 ‎对一万名美国成年人进行调查
[58:03] finds us more divided than ever, ‎发现我们比任何时候都要分裂
[58:05] with  personal and political polarization at a 20 year high. ‎个人和政治两极分化达到20年来最高
[58:11] You have more than a third of Republicans saying ‎有超过三分之一的共和党人说
[58:14] the Democratic Party is a threat to the nation, ‎民主党是对这个国家的威胁
[58:16] more than a quarter of Democrats saying the same thing about the Republicans. ‎民主党超过四分之一的人 ‎也这样说共和党
[58:20] So many of the problems that we’re discussing, ‎我们讨论的很多问题
[58:22] like, around political polarization ‎比如政治两极分化
[58:24] exist in spades on cable television. ‎在有线电视上大量存在
[58:28] The media has this exact same problem, ‎媒体也有着同样的问题
[58:31] where their business model, by and large, ‎整体上来说 他们的商业模式
[58:33] is that they’re selling our attention to advertisers. ‎是把我们的关注出售给广告商
[58:35] And the Internet is just a new, even more efficient way to do that. ‎网络只是一个新的、更有效率的 ‎实现方式罢了
[58:40] At YouTube, I was working on YouTube recommendations. ‎我曾在YouTube的工作 ‎是研究YouTube推荐
[58:44] It worries me that an algorithm that I worked on ‎让我担心的是 我研究的一个算法
[58:47] is actually increasing polarization in society. ‎增加了社会中的两极分化
[58:50] But from the point of view of watch time, ‎但从观看时间来看
[58:53] this polarization is extremely efficient at keeping people online. ‎这种两极分化 ‎在让人们持续在线观看上 极其有效
[58:58] The only reason these teachers are teaching this stuff ‎这些老师教这些东西的唯一原因
[59:00] is ’cause they’re getting paid to. ‎是有人给他们钱 让他们教
[59:02] It’s absolutely absurd. Hey, Benji. ‎ 太扯了 ‎ 喂 本杰
[59:04] No soccer practice today? ‎今天没有足球训练吗?
[59:06] Oh, there is. I’m just catching up on some news stuff. ‎有 我只想看看今天的新闻
[59:08] Do research. Anything that sways from the Extreme Center ‎去查查资料 任何偏离极端中心的内容…
[59:11] Wouldn’t exactly call the stuff that you’re watching news. ‎我可不会把你看的这东西称作新闻
[59:15] You’re always talking about how messed up everything is. So are they. ‎你总是在说一切有多混乱 他们说的也一样
[59:19] But that stuff is just propaganda. ‎但那东西只是政治鼓吹
[59:21] Neither is true. It’s all about what makes sense. ‎没有一个是真的 ‎全都是让你觉得合理
[59:24] Ben, I’m serious. That stuff is bad for you. ‎本 我很严肃 这些东西对你有害
[59:27] You should go to soccer practice. Mm. ‎你应该去足球训练
[59:35] I share this stuff because I care. ‎我分享这个东西 是因为我在意
[59:37] I care that you are being misled, and it’s not okay. All right? ‎我在意你被误导 这是不对的 好吗?
[59:41] People think the algorithm is designed ‎人们认为算法的设计
[59:43] to give them what they really want, only it’s not. ‎是给他们真正想要的 但其实不然
[59:46] The algorithm is actually trying to find a few rabbit holes that are very powerful, ‎算法其实是在试图 ‎找到几个非常强大的兔子洞
[59:52] trying to find which rabbit hole is the closest to your interest. ‎试图找到哪一个兔子洞 ‎最贴近你的兴趣
[59:56] And then if you start watching one of those videos, ‎然后 如果你开始观看其中一个视频
[59:59] then it will recommend it over and over again. ‎它就会不停继续推荐
[1:00:02] It’s not like anybody wants this to happen. ‎这种情况 并不是有人刻意为之
[1:00:05] It’s just that this is what the recommendation system is doing. ‎只是推荐系统一直在做而已
[1:00:07] So much so that Kyrie Irving, the famous basketball player, ‎以至于著名篮球运动员凯里·欧文
[1:00:11] uh, said he believed the Earth was flat, and he apologized later ‎说他相信地球是平的 后来又道歉
[1:00:14] because he blamed it on a YouTube rabbit hole. ‎因为他把责任推给 ‎YouTube的一个兔子洞
[1:00:16] You know, like, you click the YouTube click ‎你点击 YouTube视频
[1:00:18] and it goes, like, how deep the rabbit hole goes. ‎它会继续 这个兔子洞能有多深
[1:00:21] When he later came on to NPR to say, ‎他后来在国家公共广播电台上说
[1:00:23] “I’m sorry for believing this. I didn’t want to mislead people,” ‎“很抱歉我相信了这个 ‎我无意误导人们”
[1:00:26] a bunch of students in a classroom were interviewed saying, ‎有人采访了教室中的一群学生 ‎他们说
[1:00:28] “The round Earthers got to him.” ‎“相信地球是圆的人肯定找他谈了”
[1:00:31] The flat Earth conspiracy theory was recommended ‎地球平面阴谋论被算法
[1:00:34] hundreds of millions of times by the algorithm. ‎推荐了几亿次
[1:00:37] It’s easy to think that it’s just a few stupid people who get convinced, ‎很容易去想 ‎只有几个愚蠢的人被说服罢了
[1:00:43] but the algorithm is getting smarter and smarter every day. ‎但是算法每一天都在变得更聪明
[1:00:46] So, today, they are convincing the people that the Earth is flat, ‎今天 它们说服人们相信 地球是平的
[1:00:50] but tomorrow, they will be convincing you of something that’s false. ‎但是明天 它们就会说服你相信 ‎一个完全虚假的事情
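The rabbit-hole dynamic described above reduces to a simple pattern: match the user to the nearest high-retention content cluster, then keep serving that cluster. The clusters and two-dimensional taste vectors below are invented purely for illustration.

```python
import math

# Illustrative rabbit-hole matcher; clusters and vectors are invented.
RABBIT_HOLES = {
    "flat_earth": [0.9, 0.1],
    "fitness":    [0.1, 0.9],
}

def closest_hole(user_vector: list[float]) -> str:
    return min(RABBIT_HOLES,
               key=lambda h: math.dist(RABBIT_HOLES[h], user_vector))

def next_recommendations(user_vector: list[float], n: int = 5) -> list[str]:
    hole = closest_hole(user_vector)
    # once matched, the same cluster is recommended "over and over again"
    return [f"{hole}_video_{i}" for i in range(n)]

print(next_recommendations([0.8, 0.2]))  # five flat-earth videos
```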
[1:00:54] On November 7th, the hashtag “Pizzagate” was born. ‎11月7日 话题标签‘披萨门’诞生了
[1:00:57] Pizzagate… ‎披萨门…
[1:01:00] Oh, boy. ‎天啊
[1:01:03] I still am not 100 percent sure how this originally came about, ‎我还是不能百分之百确定 ‎这个最初是从哪里来的
[1:01:06] but the idea that ordering a pizza meant ordering a trafficked person. ‎但是订披萨等于订购一个被贩卖的人 这种想法
[1:01:12] As the groups got bigger on Facebook, ‎由于脸书上的多个小组越来越大
[1:01:15] Facebook’s recommendation engine started suggesting to regular users ‎脸书推荐引擎开始建议普通用户
[1:01:20] that they join Pizzagate groups. ‎让他们加入披萨门小组
[1:01:21] So, if a user was, for example, anti vaccine or believed in chemtrails ‎所以 如果一个用户反对疫苗 ‎或者相信飞机喷洒重金属阴谋论
[1:01:27] or had indicated to Facebook’s algorithms in some way ‎或者对脸书的算法表示过
[1:01:30] that they were prone to belief in conspiracy theories, ‎他们易于相信阴谋论
[1:01:33] Facebook’s recommendation engine would serve them Pizzagate groups. ‎脸书的推荐引擎 就会向他们推送披萨门小组
[1:01:36] Eventually, this culminated in a man showing up with a gun, ‎最终 这件事达到高潮 ‎一名男子携枪出现
[1:01:41] deciding that he was gonna go liberate the children from the basement ‎决定他要给披萨店地下室的 ‎那些孩子们自由
[1:01:44] of the pizza place that did not have a basement. ‎而这个披萨店根本没有地下室
[1:01:46] What were you doing? ‎ 你当时在这个地方做什么? ‎
[1:01:48] Making sure there was nothing there. 确保这里什么都没有
[1:01:50] Regarding? Pedophile ring. 关于什么? ‎ 恋童癖团伙
[1:01:52] What? Pedophile ring. ‎ 什么?‎ 恋童癖团伙
[1:01:54] He’s talking about Pizzagate. ‎披萨门 他在说披萨门
[1:01:56] This is an example of a conspiracy theory ‎这是阴谋论在所有社交媒体上
[1:02:00] that was propagated across all social networks. ‎到处传播的一个例子
[1:02:03] The social network’s own recommendation engine ‎社交网络自己的推荐引擎
[1:02:06] is voluntarily serving this up to people ‎自愿把这个东西推送给
[1:02:08] who had never searched for the term “Pizzagate” in their life. ‎这辈子从来没有搜索过 ‎“披萨门”的人们
[1:02:10] (Pizzagate: Democrats’ and pedophiles’ deep-dish pizza) ‎(披萨门 ‎民主党和恋童癖的深盘披萨)
[1:02:12] There’s a study, an MIT study, ‎有一个研究 麻省理工的研究
[1:02:14] that fake news on Twitter spreads six times faster than true news. ‎说推特上传播的虚假新闻 ‎比真实新闻传播速度快六倍
[1:02:19] What is that world gonna look like ‎当一方比另一方
[1:02:21] when one has a six times advantage to the other one? ‎有六倍的优势时 那个世界会是什么样?
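Compounding is what makes a six-fold per-hop edge so lopsided; a two-line model makes it visible (the hop counts and the clean 6x multiplier are simplifying assumptions taken from the study's headline figure):

```python
# Toy branching model: a false story reaches 6x as many people per
# sharing hop as a true one. The clean 6x multiplier is an assumption.
def reach(per_hop: float, hops: int) -> float:
    return per_hop ** hops

for hops in (1, 3, 5):
    print(hops, reach(6, hops) / reach(1, hops))  # 6x, 216x, 7776x
```

After only five sharing hops, the false story's audience advantage has compounded from six-fold to several thousand-fold.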
[1:02:25] You can imagine these things are sort of like… ‎你可以想象 这些事情有点…
[1:02:27] they… they tilt the floor of… of human behavior. ‎将人类行为的基础平面倾斜了
[1:02:31] They make some behavior harder and some easier. ‎让一些行为更难 让一些行为更容易
[1:02:34] And you’re always free to walk up the hill, ‎你总是可以自由地走上山坡
[1:02:37] but fewer people do, ‎但是这样做的人越来越少
[1:02:38] and so, at scale, at society’s scale, you really are just tilting the floor ‎所以大范围内 在整个社会范围内 ‎你就是将基础平面倾斜了
[1:02:43] and changing what billions of people think and do. ‎改变了数十亿人的想法和行为
[1:02:46] We’ve created a system that biases towards false information. ‎我们创造了一个 ‎偏心于虚假消息的体系
[1:02:52] Not because we want to, ‎并不是因为我们想这样做
[1:02:54] but because false information makes the companies more money ‎而是因为虚假信息比真实信息 ‎能让各个公司
[1:02:59] than the truth. The truth is boring. ‎赚到更多钱 真实信息比较无聊
[1:03:01] It’s a disinformation for profit business model. ‎这是一个 ‎利用虚假信息牟利的商业模式
[1:03:04] You make money the more you allow unregulated messages ‎允许未受监管的信息传送给更多的人
[1:03:08] to reach anyone for the best price. ‎卖出最好的价钱 以此来赚钱
[1:03:11] Because climate change? Yeah. ‎因为气候变化?对
[1:03:14] It’s a hoax. Yeah, it’s real. That’s the point. ‎这是骗局 对 是真的 这才是重点
[1:03:16] The more they talk about it and the more they divide us, ‎他们谈论这件事情越多 ‎就越会将我们分化
[1:03:20] the more they have the power, the more… ‎他们越有力量 就越有控制权
[1:03:22] Facebook has trillions of these news feed posts. ‎脸书有万亿个新闻推送贴
[1:03:26] They can’t know what’s real or what’s true… ‎他们无法知道哪些是真的 ‎哪些是事实…
[1:03:29] which is why this conversation is so critical right now. ‎所以当下这个议题 才如此重要
[1:03:33] It’s not just COVID 19 that’s spreading fast. ‎传播迅速的 不只是新冠病毒
[1:03:37] There’s a flow of misinformation online about the virus. ‎网上有关于这个病毒的大量虚假信息
[1:03:40] The notion drinking water ‎认为多喝水
[1:03:41] will flush coronavirus from your system ‎就能把新冠病毒从体内冲走的说法
[1:03:43] is one of several myths about the virus circulating on social media. ‎是社交媒体上流传的关于该病毒的几种谣言之一
[1:03:47] The government planned this event, created the virus, ‎这是政府计划的事件 ‎创造了这个病毒
[1:03:50] and had a simulation of how the countries would react. ‎来模拟世界各国将会如何应对
[1:03:53] Coronavirus is a… a hoax. ‎新冠病毒是一场骗局
[1:03:56] SARS, coronavirus. ‎非典 新冠病毒
[1:03:58] And look at when it was made. 2018. ‎看看这是什么时候制造的 2018年
[1:04:01] I think the US government started this shit. ‎我觉得是美国政府开始的这场闹剧
[1:04:04] Nobody is sick. Nobody is sick. Nobody knows anybody who’s sick. ‎根本没有人生病 没人生病
[1:04:07] ‎没有人认识哪个人真正生病了
[1:04:09] Maybe the government is using the coronavirus as an excuse ‎或许是政府在利用新冠病毒当借口
[1:04:13] to get everyone to stay inside because something else is happening. ‎让所有人留在家 ‎因为有其他的事情要发生
[1:04:15] Coronavirus is not killing people, ‎新冠病毒不会杀人
[1:04:18] it’s the 5G radiation that they’re pumping out. ‎害死人的其实是他们不断发射的5G辐射
[1:04:21] (5G towers cut down and burned) ‎(5G信号塔被割倒焚烧)
[1:04:22] We’re being bombarded with rumors. ‎我们被谣言轰炸
[1:04:25] People are blowing up actual physical cell phone towers. ‎人们去毁掉了真实的手机信号塔
[1:04:28] We see Russia and China spreading rumors and conspiracy theories. ‎我们看到俄罗斯和中国 ‎传播谣言和阴谋论
[1:04:32] This morning, panic and protest in Ukraine as… ‎今天早上 乌克兰的恐慌与抗议…
[1:04:35] People have no idea what’s true, and now it’s a matter of life and death. ‎人们不知道什么是真相 ‎现在已经闹出人命了
[1:04:39] Those sources that are spreading coronavirus misinformation ‎那些传播新冠病毒虚假信息的来源
[1:04:42] have amassed something like 52 million engagements. ‎积累了大约5200万次互动
[1:04:45] You’re saying that silver solution would be effective. ‎你是说 胶银溶液会有效
[1:04:50] Well, let’s say it hasn’t been tested on this strain of the coronavirus, but… ‎这么说吧 虽然还没在这种新冠病毒毒株上测试过 但是…
[1:04:54] What we’re seeing with COVID is just an extreme version ‎我们看到的新冠病毒 ‎只是发生在我们信息生态系统中的
[1:04:57] of what’s happening across our information ecosystem. ‎一个极端案例
[1:05:00] (Whitmer / Hitler) ‎(惠特默 希特勒)
[1:05:00] Social media amplifies exponential gossip and exponential hearsay ‎社交媒体放大了增长迅速的谣言 ‎和增长迅速的道听途说
[1:05:05] to the point that we don’t know what’s true, ‎以至于我们都不知道 什么是真了
[1:05:07] no matter what issue we care about. ‎不管我们关注的是什么问题
[1:05:15] He discovers this. 他发现了这个
[1:05:19] Ben. 本
[1:05:26] Are you still on the team? Mm hmm. ‎你还在队里吗?
[1:05:30] Okay, well, I’m gonna get a snack before practice ‎好 如果你想来 我在训练之前 ‎先去吃点零食 你要来吗?
[1:05:32] if you… wanna come. 先去吃点零食 你要来吗?
[1:05:35] Hm? 嗯?
[1:05:36] You know, never mind. ‎当我没说
[1:05:45] Nine out of ten people are dissatisfied right now. ‎当前 十个人当中 有九个人不满意
[1:05:47] The EC is like any political movement in history, when you think about it. ‎你仔细想想 极端中心和历史上 ‎任何政治运动无异
[1:05:50] We are standing up, and we are… we are standing up to this noise. ‎我们要站起来反抗 ‎我们要站起来反抗这种杂音
[1:05:54] You are my people. I trust you guys. ‎你是我的人民 我相信你们
[1:05:59] The Extreme Center content is brilliant. He absolutely loves it. ‎ 极端中心的内容好棒 ‎ 他非常喜欢
[1:06:02] Running an auction. ‎运行一个拍卖
[1:06:04] 840 bidders. He sold for 4.35 cents to a weapons manufacturer. ‎840个竞标人 他以4.35美分 卖给了一个武器生产商
[1:06:08] Let’s promote some of these events. ‎我们来宣传一下这些活动
[1:06:10] Upcoming rallies in his geographic zone later this week. ‎这星期晚些时候 ‎将在这片地理区域发生的群众集会
[1:06:13] I’ve got a new vlogger lined up, too. ‎新的视频博主也安排好了
[1:06:17] And… and, honestly, I’m telling you, I’m willing to do whatever it takes. ‎说实话 我告诉你 ‎我愿意付出任何代价
[1:06:23] And I mean whatever. ‎我说 任何代价
[1:06:32] Subscribe… Ben? ‎ 订阅… ‎ 本?
[1:06:33] …and also come back because I’m telling you, yo… ‎…记得回来 因为我告诉你们…
[1:06:35] …I got some real big things comin’. ‎我后面会有大事件
[1:06:38] Some real big things. ‎非常大的事件
[1:06:40] One of the problems with Facebook is that, as a tool of persuasion, ‎脸书的一个问题是 ‎作为一个有劝说性质的工具
[1:06:45] it may be the greatest thing ever created. ‎它或许是史上最伟大的发明
[1:06:48] Now, imagine what that means in the hands of a dictator or an authoritarian. ‎现在 你来想象一下 落在独裁者 ‎或者集权主义者手中 会怎样
[1:06:53] If you want to control the population of your country, ‎如果你想控制你们国家的人民
[1:06:57] there has never been a tool as effective as Facebook. ‎从未有过像脸书这样有效的工具
[1:07:04] Some of the most troubling implications ‎最令人不安的影响之一是
[1:07:07] of governments and other bad actors weaponizing social media, ‎一些政府和其他不良人士 ‎把社交媒体当作武器
[1:07:11] um, is that it has led to real, offline harm. ‎导致了真实的线下伤害
[1:07:13] I think the most prominent example ‎我认为最明显的案例
[1:07:15] that’s gotten a lot of press is what’s happened in Myanmar. ‎广泛被媒体关注的 ‎是缅甸发生的事情
[1:07:17] (Office of the President of Myanmar) ‎(缅甸总统办公室)
[1:07:19] In Myanmar, when people think of the Internet, ‎在缅甸 人们想到网络时
[1:07:21] what they are thinking about is Facebook. ‎他们想到的 是脸书
[1:07:22] And what often happens is when people buy their cell phone, ‎经常发生的事情是 人们买了手机
[1:07:26] the cell phone shop owner will actually preload Facebook on there for them ‎手机店主会提前帮他们下载好脸书
[1:07:30] and open an account for them. ‎帮他们开好账户
[1:07:31] And so when people get their phone, the first thing they open ‎于是人们拿到手机之后 ‎第一个打开的应用
[1:07:34] and the only thing they know how to open is Facebook. ‎他们唯一知道怎样打开的 就是脸书
[1:07:38] Well, a new bombshell investigation exposes Facebook’s growing struggle ‎一个新的震惊调查显示 ‎脸书日益增长的
[1:07:41] to tackle hate speech in Myanmar. ‎对抗缅甸仇恨言论的难题
[1:07:43] (Stop killing Muslims) ‎(停止杀害穆斯林)
[1:07:46] Facebook really gave the military and other bad actors ‎脸书真的给军人和其他不良人士
[1:07:49] a new way to manipulate public opinion ‎一种控制公众言论的新手段
[1:07:51] and to help incite violence against the Rohingya Muslims ‎并协助煽动 ‎针对罗兴亚族穆斯林的暴力
[1:07:55] that included mass killings, ‎包括大屠杀
[1:07:58] burning of entire villages, ‎焚烧整个村庄
[1:07:59] mass rape, and other serious crimes against humanity ‎违反人道主义的 ‎大规模强奸和其他严重犯罪行为
[1:08:03] that have now led ‎已经导致
[1:08:05] to 700,000  Rohingya Muslims having to flee the country. 七十万罗兴亚族穆斯林 ‎逃出这个国家
[1:08:11] It’s not that highly motivated propagandists ‎并不是说动机强烈的鼓吹者
[1:08:14] haven’t existed before. 以前从未存在过
[1:08:16] It’s that the platforms make it possible ‎只是这个平台实现了
[1:08:19] to spread manipulative narratives with phenomenal ease, ‎让操纵性言论传播变得异常容易
[1:08:23] and without very much money. ‎也不用花多少钱
[1:08:25] If I want to manipulate an election, ‎如果我想操纵竞选
[1:08:27] I can now go into a conspiracy theory group on Facebook, ‎我现在可以去脸书上的 ‎一个阴谋论小组
[1:08:30] and I can find 100 people ‎可以找到一百个人
[1:08:32] who believe that the Earth is completely flat ‎他们深信地球是平的
[1:08:34] and think it’s all this conspiracy theory that we landed on the moon, ‎认为我们登月 完全是阴谋论
[1:08:37] and I can tell Facebook, “Give me 1,000 users who look like that.” ‎我可以告诉脸书 ‎“给我推荐一千个这种用户”
[1:08:42] Facebook will happily send me thousands of users that look like them ‎脸书会非常开心地发给我 ‎几千个这种用户
[1:08:46] that I can now hit with more conspiracy theories. ‎我现在可以给他们讲更多的阴谋论
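The "give me 1,000 users who look like that" request is, mechanically, a similarity search. A minimal lookalike-audience sketch, assuming behavior is summarized as vectors and cosine similarity is the matching rule (both assumptions; real targeting systems are far more elaborate):

```python
import math
import random

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalikes(seeds: list[list[float]], users: dict, n: int = 1000) -> list[str]:
    """Average the seed believers into one profile, rank everyone by similarity."""
    dims = len(seeds[0])
    centroid = [sum(v[i] for v in seeds) / len(seeds) for i in range(dims)]
    # a real system would exclude the seed users themselves
    return sorted(users, key=lambda u: cosine(users[u], centroid),
                  reverse=True)[:n]

users = {f"user_{i}": [random.random(), random.random()] for i in range(10_000)}
seed_vectors = [users[f"user_{i}"] for i in range(100)]
audience = lookalikes(seed_vectors, users)  # the 1,000 closest matches
```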
[1:08:50] Sold for 3.4 cents an impression. ‎每次展示卖出3.4美分
[1:08:53] New EC video to promote. Another ad teed up. ‎推广新的极端中心 ‎再安排一个广告
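The AIs' auction chatter reflects a real mechanism: every impression is auctioned among many bidders in milliseconds. A generic second-price sketch (the textbook-standard auction design; the bidder count and prices are invented to echo the scene):

```python
import random

def run_auction(bids: dict) -> tuple[str, float]:
    """Winner pays just above the runner-up's bid (second-price rule)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner, bids[runner_up] + 0.0001

bids = {f"bidder_{i}": random.uniform(0.01, 0.05) for i in range(840)}
winner, price = run_auction(bids)
print(winner, f"pays ${price:.4f} for one impression of Ben's attention")
```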
[1:08:58] Algorithms and manipulative politicians ‎算法和操纵人的政治家
[1:09:01] are becoming so expert ‎在学习如何激发我们的方面
[1:09:02] at learning how to trigger us, ‎变得非常专业
[1:09:04] getting so good at creating fake news that we absorb as if it were reality, ‎非常擅长制造虚假新闻 让我们把它们当作现实来接受
[1:09:08] and confusing us into believing those lies. ‎给我们造成混乱 ‎让我们相信这些谎言
[1:09:10] It’s as though we have less and less control ‎我们似乎对自己是怎样的人 ‎
[1:09:12] over who we are and what we believe. 自己的信仰 有越来越少的控制权
[1:09:31] …so they can pick sides. ‎…来让他们选择站队
[1:09:32] There’s lies here, and there’s lies over there. ‎到处都是谎言
[1:09:34] So they can keep the power, ‎这样他们就能保持住权力
[1:09:36] so they can control everything. ‎这样他们就能控制一切
[1:09:40] They can control our minds, ‎他们可以控制我们的思想 ‎
[1:09:42] so that they can keep their secrets. 这样他们就可以保守他们的秘密
[1:09:44] (Question the truth) ‎(质疑真相)
[1:09:46] (CDC admits covering up vaccine/autism link) ‎(疾控中心承认掩盖疫苗/自闭症)
[1:09:48] Imagine a world where no one believes anything true. ‎想象一个没有人相信任何真相的世界
[1:09:50] (Vaccines are not for everyone; our genes are the proof) ‎(疫苗不普适所有人 ‎我们的基因就是证据)
[1:09:52] Everyone believes the government’s lying to them. ‎所有人都相信 政府在骗他们
[1:09:56] Everything is a conspiracy theory. ‎一切都是阴谋论
[1:09:58] “I shouldn’t trust anyone. I hate the other side.” ‎“我不应该相信任何人 ‎我痛恨对立面”
[1:10:01] That’s where all this is heading. ‎一切正在向这个方向发展
[1:10:02] The political earthquakes in Europe continue to rumble. ‎欧洲的政治地震 余震不止
[1:10:06] This time, in Italy and Spain. ‎这一次 轮到了意大利和西班牙
[1:10:08] Overall, Europe’s traditional, centrist coalition lost its majority ‎整体上来说 欧洲传统的 ‎中间派联合政府失去了大多数人支持
[1:10:12] while far right and far left populist parties made gains. ‎同时极左和极右民粹主义政党 ‎获得更多支持
[1:10:17] (Center) ‎(中心)
[1:10:19] Back up. ‎退后
[1:10:21] Okay, let’s go. ‎好 我们走
[1:10:28] These accounts were deliberately, specifically attempting ‎这些账户专门故意试图
[1:10:31] to sow political discord in Hong Kong. ‎散播香港政治纷争信息
[1:10:38] All right, Ben. ‎好 本
[1:10:42] What does it look like to be a country ‎生活在一个全部信息来自于脸书
[1:10:45] whose entire diet is Facebook and social media? ‎和社交媒体的国家 是什么感觉?
[1:10:48] Democracy crumbled quickly. ‎民主迅速崩溃
[1:10:50] Six months. ‎六个月
[1:10:51] After that chaos in Chicago, ‎芝加哥的混乱发生后
[1:10:53] violent clashes between protesters and supporters… ‎抗议者和支持者之间的暴力冲突…
[1:10:58] Democracy is facing a crisis of confidence. ‎民主正面临着信心危机
[1:11:01] What we’re seeing is a global assault on democracy. ‎我们看到的 是对全球民主的攻击
[1:11:04] (Extreme Center) ‎(极端中心)
[1:11:05] Most of the countries that are targeted are countries ‎多数目标国家 都是
[1:11:08] that run democratic elections. ‎进行民主选举的国家
[1:11:10] This is happening at scale. ‎大范围发生
[1:11:12] By state actors, by people with millions of dollars saying, ‎国家行动者、家财万贯的富翁说
[1:11:15] “I wanna destabilize Kenya. I wanna destabilize Cameroon. ‎“我想让肯尼亚动摇 ‎我想让喀麦隆动摇
[1:11:18] Oh, Angola? That only costs this much.” ‎安哥拉?只要这么一点钱”
[1:11:20] An extraordinary election took place Sunday in Brazil. ‎巴西上周日举行了一场特别的选举
[1:11:23] With a campaign that’s been powered by social media. ‎选举动员是社交媒体驱动的
[1:11:31] We in the tech industry have created the tools ‎我们技术产业的人创造了
[1:11:34] to destabilize and erode the fabric of society ‎动摇和侵蚀社会结构的工具
[1:11:37] in every country,  all at once, everywhere. ‎所有国家都在同时发生 ‎世界各地都在发生
[1:11:40] You have this in Germany, Spain, France, Brazil, Australia. ‎德国、西班牙、法国 ‎巴西、澳大利亚都有发生
[1:11:44] Some of the most “developed nations” in the world ‎一些世界上最发达的国家
[1:11:47] are now imploding on each other, ‎如今正在相继内爆
[1:11:49] and what do they have in common? ‎他们有什么共同点?
[1:11:51] Knowing what you know now, ‎基于你当前的了解
[1:11:53] do you believe Facebook impacted the results of the 2016 election? ‎你相信脸书 ‎影响了2016年大选的结果吗?
[1:11:56] Oh, that’s… that is hard. ‎这个问题好难回答
[1:11:58] You know,  it’s… the… ‎你知道 这个…
[1:12:01] the reality is, well, there were so many different forces at play. ‎事实是 ‎有很多不同的力量在产生影响
[1:12:04] Representatives from Facebook, Twitter, and Google are back on Capitol Hill ‎脸书、推特和谷歌的代表们 ‎回到国会山
[1:12:07] for a second day of testimony ‎对俄罗斯干预2016年大选问题
[1:12:09] about Russia’s interference in the 2016 election. ‎进行第二天的证词发言
[1:12:12] The manipulation by third parties is not a hack. ‎第三方的操纵并不是黑客入侵
[1:12:18] Right? The Russians didn’t hack Facebook. ‎对吧?俄罗斯没有黑入脸书
[1:12:21] What they did was they used the tools that Facebook created ‎他们所做的是 利用脸书 ‎
[1:12:25] for legitimate advertisers and legitimate users, 为合法广告商与合法用户创造的工具
[1:12:27] and they applied it to a nefarious purpose. ‎用到了罪恶的用途中
[1:12:32] It’s like remote control warfare. ‎就像是远程控制的战争
[1:12:34] One country can manipulate another one ‎一个国家可以操纵另一个国家
[1:12:36] without actually invading its physical borders. ‎都不用真正入侵实体边境
[1:12:39] We’re seeing violent images. It appears to be a dumpster ‎我们看到这些暴力的画面
[1:12:42] being pushed around… ‎这是一个被推来推去的垃圾箱…
[1:12:43] But it wasn’t about who you wanted to vote for. ‎但问题不是你想投票给谁
[1:12:46] It was about sowing total chaos and division in society. ‎问题是在社会中散播混乱和分歧
[1:12:50] Now, this was in Huntington Beach. A march… ‎这是在霍廷顿海滩市的示威…
[1:12:53] It’s about making two sides ‎问题是制造了两个对立面
[1:12:54] who couldn’t hear each other anymore, ‎丝毫不再听取对方的观点
[1:12:56] who didn’t want to hear each other anymore, ‎不再想听对方的观点
[1:12:58] who didn’t trust each other anymore. ‎不再相信对方
[1:12:59] This is a city where hatred was laid bare ‎这是仇恨被暴露出
[1:13:03] and transformed into racial violence. ‎并转化成种族暴力的城市
[1:13:05] (Tensions in Virginia: three killed on day of violence) ‎(弗吉尼亚紧张局势 ‎暴力当天致三人遇害)
[1:13:20] Ben! ‎本!
[1:13:21] Cassandra! ‎卡桑德拉!
[1:13:22] Cass! Ben! ‎ 卡桑! ‎ 本!
[1:13:23] Come here! Come here! ‎过来!
[1:13:27] Arms up. Arms up. Get down on your knees. Now, down. ‎举起手 膝盖跪地 快 跪下
[1:13:36] Calm Ben! ‎ 冷静…… ‎ 本!
[1:13:37] Hey! Hands up! ‎喂!手举起来!
[1:13:39] Turn around. On the ground.  On the ground! ‎转过去 趴地上
[1:13:56] Do we want this system for sale to the highest bidder? ‎我们希望这个系统 ‎售卖出最高的竞价吗?
[1:14:01] For democracy to be completely for sale, where you can reach any mind you want, ‎完全出售民主 ‎你可以控制任何你想控制的思想
[1:14:05] target a lie to that specific population, and create culture wars? ‎向特定人群投放谎言 制造文化战争?
[1:14:09] Do we want that? ‎我们希望这样吗?
[1:14:14] We are a nation of people… ‎我们这个国家的人民…
[1:14:16] that no longer speak to each other. ‎不再和彼此说话了
[1:14:19] We are a nation of people who have stopped being friends with people ‎我们这个国家的人民 ‎不再和彼此交友了
[1:14:23] because of who they voted for in the last election. ‎只因为他们在上一次竞选中投票的人
[1:14:25] We are a nation of people who have isolated ourselves ‎我们这个国家的人民孤立了自己
[1:14:28] to only watch channels that tell us that we’re right. ‎只看认同我们的那些频道
[1:14:32] My message here today is that tribalism is ruining us. ‎我今天想传达的信息是 ‎部落主义正在毁掉我们
[1:14:37] It is tearing our country apart. ‎它正在撕裂我们这个国家
[1:14:40] It is no way for sane adults to act. ‎正常的成年人 不可能这样做
[1:14:43] If everyone’s entitled to their own facts, ‎如果每个人都有权执着于自己的真相
[1:14:45] there’s really no need for compromise, no need for people to come together. ‎就真没有必要妥协 ‎没有必要让人们团结了
[1:14:49] In fact, there’s really no need for people to interact. ‎事实上 真的没有必要让人们互动
[1:14:52] We need to have… ‎我们需要
[1:14:53] some shared understanding of reality. Otherwise, we aren’t a country. ‎对现实有一些共同的理解 不然 我们就不是一个国家了
[1:14:58] So, uh, long term, the solution here is to build more AI tools ‎(Mr. Zuckerberg 扎克伯格先生)
[1:14:59] ‎所以长期来看 解决办法 ‎是建造更多的人工智能工具
[1:15:03] that find patterns of people using the services that no real person would do. ‎找出使用这些服务时 真人不会有的行为模式
[1:15:08] We are allowing the technologists to frame this as a problem ‎我们允许技术专家 把这个当做一个
[1:15:11] that they’re equipped to solve. ‎他们有能力解决的问题呈现
[1:15:15] That is… That’s a lie. ‎这是骗人的
[1:15:17] People talk about AI as if it will know truth. ‎人们谈论人工智能 ‎好像人工智能知道真理一样
[1:15:21] AI’s not gonna solve these problems. ‎人工智能无法解决这些问题
[1:15:24] AI cannot solve the problem of fake news. ‎人工智能无法解决虚假新闻的问题
[1:15:28] Google doesn’t have the option of saying, ‎谷歌没有选择去说
[1:15:31] “Oh, is this conspiracy? Is this truth?” Because they don’t know what truth is. ‎“这是阴谋论?这是真相吗?”
[1:15:34] ‎因为它们不知道 真相是什么
[1:15:36] They don’t have a… ‎它们没有…
[1:15:37] They don’t have a proxy for truth that’s better than a click. 它们没有比“点击”更好的衡量真相的依据
[1:15:41] If we don’t agree on what is true ‎如果我们不同意真相
[1:15:45] or that there is such a thing as truth, ‎或者不同意存在真相
[1:15:48] we’re toast. ‎我们就完蛋了
[1:15:49] This is the problem beneath other problems ‎这是其他问题之下的问题
[1:15:52] because if we can’t agree on what’s true, ‎因为如果我们不能认同真相
[1:15:55] then we can’t navigate out of any of our problems. ‎那我们就无法找到 ‎我们任何一个问题的解决方法
[1:16:05] We should suggest Flat Earth Football Club. ‎我们应该建议他 ‎关注平面地球足球俱乐部
[1:16:07] Don’t show him sports updates. He doesn’t engage. ‎别再给他展示运动消息了 ‎他不感兴趣
[1:16:39] A lot of people in Silicon Valley subscribe to some kind of theory ‎硅谷的很多人相信一种理论
[1:16:42] that we’re building some global super brain, ‎我们正在建造一些全球的超级大脑
[1:16:45] and all of our users are just interchangeable little neurons, ‎我们所有的用户 都只是可以互换的小小神经元
[1:16:48] no one of which is important. ‎他们一点都不重要
[1:16:50] And it subjugates people into this weird role ‎它让人们服从于这样一个奇怪的角色
[1:16:53] where you’re just, like, this little computing element ‎你就像是一个小的编程元素
[1:16:56] that we’re programming through our behavior manipulation ‎我们通过我们的行为操纵去 编程
[1:16:58] for the service of this giant brain, and you don’t matter. ‎为了服务于这个巨型大脑 ‎你根本不重要
[1:17:02] You’re not gonna get paid. You’re not gonna get acknowledged. ‎不会给你报酬 也不会给你认可
[1:17:04] You don’t have self determination. ‎你没有自主权
[1:17:06] We’ll sneakily just manipulate you because you’re a computing node, ‎我们会鬼祟地操纵你 ‎因为你是编程中的结点
[1:17:09] so we need to program you  ’cause that’s what you do with computing nodes. ‎所以我们需要将你编程 ‎因为我们就要这样对待编程中的结点
[1:17:20] Oh, man. ‎天啊
[1:17:21] When you think about technology and it being an existential threat, ‎当你想到技术 ‎技术是一种人类存亡的威胁
[1:17:25] you know, that’s a big claim, and… ‎这个指控很严重…
[1:17:29] it’s easy to then, in your mind, think, “Okay, so, there I am with the phone… ‎然后你的脑中就会很容易想 ‎“好 我正在拿着手机
[1:17:35] scrolling, clicking, using it. ‎互动、点击、使用
[1:17:37] Like, where’s the existential threat? ‎人类存亡的威胁在哪里?
[1:17:40] Okay, there’s the supercomputer. ‎好 有一个超级电脑
[1:17:41] The other side of the screen, pointed at my brain, ‎在屏幕的另一端 正指向我的大脑
[1:17:44] got me to watch one more video. Where’s the existential threat?” ‎让我再看一个视频 ‎人类存亡的威胁在哪里?”
[1:17:54] It’s not about the technology being the existential threat. ‎技术并不是人类存亡的威胁
[1:18:02] (Persuasive technology: US Senate hearing) ‎(劝服性技术 美国参议院听证会)
[1:18:03] It’s the technology’s ability ‎是技术能够把
[1:18:06] to bring out the worst in society… ‎社会中最坏的东西带出来的能力
[1:18:09] …and the worst in society being the existential threat. ‎社会中最坏的东西 ‎才是人类存亡的威胁
[1:18:13] (United States Senate) ‎(美国参议院)
[1:18:18] If technology creates… ‎如果技术创造了
[1:18:21] mass chaos, 公众混乱
[1:18:23] outrage, incivility, ‎愤怒、无礼
[1:18:24] lack of trust in each other, ‎彼此缺乏信任
[1:18:27] loneliness, alienation, more polarization, ‎孤独、疏远、更加两极分化
[1:18:30] more election hacking, more populism, ‎更多的大选操纵、更多的民粹主义
[1:18:33] more distraction and inability to focus on the real issues… ‎让人更加分散注意力 ‎无法集中在真正的问题上…
[1:18:37] that’s just society. ‎那只是社会
[1:18:40] And now society is incapable of healing itself ‎现在社会无法自愈
[1:18:46] and just devolving into a kind of chaos. ‎只会沦为一种混乱
[1:18:51] This affects everyone, even if you don’t use these products. ‎这影响着每一个人 ‎即使你不使用这些产品
[1:18:55] These things have become digital Frankensteins ‎这些事情变成了数码的科学怪人
[1:18:57] that are terraforming the world in their image, ‎正按照自己的形象改造着这个世界
[1:19:00] whether it’s the mental health of children ‎不论是儿童的心理健康
[1:19:01] or our politics and our political discourse, ‎还是我们的政治 我们的政治演说
[1:19:04] without taking responsibility for taking over the public square. ‎而不用因为控制公众舆论 ‎承担责任
[1:19:07] So, again, it comes back to And who do you think’s responsible? ‎ 所以 还是要回到… ‎ 你觉得这一切怪谁?
[1:19:10] I think we have to have the platforms be responsible ‎我认为我们必须让平台负起责任
[1:19:13] for when they take over election advertising, ‎因为他们接管大选广告的时候
[1:19:15] they’re responsible for protecting elections. ‎就要负责保护大选
[1:19:17] When they take over mental health of kids or Saturday morning, ‎当他们接管儿童心理健康 ‎或是儿童频道的时候
[1:19:20] they’re responsible for protecting Saturday morning. ‎他们就有责任保护好儿童频道
[1:19:23] The race to keep people’s attention isn’t going away. ‎保持人们关注的竞争不会结束
[1:19:28] Our technology’s gonna become more integrated into our lives, not less. ‎我们的技术会在我们生活中更加集成 ‎而不会减少
[1:19:31] The AIs are gonna get better at predicting what keeps us on the screen, ‎人工智能会更加擅长预判 ‎什么内容能让我们持续盯着屏幕
[1:19:34] not worse at predicting what keeps us on the screen. ‎而不是做出更差的预判
[1:19:38] I… I am 62 years old, ‎我已经62岁了
[1:19:42] getting older every minute, the more this conversation goes on… ‎随着这个对话继续进行 ‎我每一分钟都在变老
[1:19:44] …but… but I will tell you that, um… ‎但我会告诉你
[1:19:48] I’m probably gonna be dead and gone, and I’ll probably be thankful for it, ‎到时候我可能已经死了 不在了 ‎但我可能会为此感恩
[1:19:52] when all this shit comes to fruition. ‎因为当这些恐怖的东西结出恶果
[1:19:54] Because… Because I think that this scares me to death. ‎我觉得能吓死我
[1:20:00] Do… Do you… Do you see it the same way? ‎你也这样看吗?
[1:20:03] Or am I overreacting to a situation that I don’t know enough about? ‎还是我对一个我不够了解的情况 ‎过度反应了?
[1:20:09] What are you most worried about? ‎你最担心什么?
[1:20:13] I think, in the… in the shortest time horizon… ‎我认为在最短的时间范围内…
[1:20:19] civil war. ‎是内战
[1:20:24] If we go down the current status quo for, let’s say, another 20 years… ‎如果现在的常态继续下去 ‎我们说再过20年
[1:20:31] we probably destroy our civilization through willful ignorance. ‎我们很可能会因为故意无知 ‎毁掉我们的文明
[1:20:34] We probably fail to meet the challenge of climate change. ‎我们或许会无法应对气候变化的挑战
[1:20:38] We probably degrade the world’s democracies ‎我们或许会瓦解世界的民主
[1:20:42] so that they fall into some sort of bizarre autocratic dysfunction. ‎最终衰落成一种奇怪的独裁机能障碍
[1:20:46] We probably ruin the global economy. ‎我们或许会毁掉全球经济
[1:20:48] Uh, we probably, um, don’t survive. ‎我们或许会无法存活
[1:20:52] You know, I… I really do view it as existential. ‎我真的把它看做 ‎人类生死存亡的大问题
[1:21:02] Is this the last generation of people ‎这会是知道在这种幻象发生之前
[1:21:05] that are gonna know what it was like before this illusion took place? ‎世界是什么样的最后一代人吗?
[1:21:11] Like, how do you wake up from the matrix when you don’t know you’re in the matrix? ‎如果你不知道自己在矩阵中 ‎你要怎么从矩阵中醒来?
[1:21:17] (“Whether it is to be utopia or oblivion will be a touch-and-go relay race… ‎(“不论是乌托邦还是毁灭 ‎都是一场一触即发的接力赛…)
[1:21:23] …right up to the final moment…”, Buckminster Fuller) ‎(直接通往最后一刻…” ‎——巴克敏斯特·富勒)
[1:21:27] A lot of what we’re saying sounds like it’s just this… ‎你知道 我们说的很多话 ‎听起来像是…
[1:21:31] one sided doom and gloom. ‎片面的悲观
[1:21:33] Like, “Oh, my God, technology’s just ruining the world ‎“天啊 技术正在毁灭世界
[1:21:36] and it’s ruining kids,” ‎正在毁灭孩子们”
[1:21:38] and it’s like… “No.” ‎不是这样的
[1:21:40] It’s confusing because it’s simultaneous utopia…and dystopia. ‎这很令人困惑 因为它同时是乌托邦 也是反乌托邦
[1:21:45] Like, I could hit a button on my phone, and a car shows up in 30 seconds, ‎我可以在手机按一个按钮 ‎30秒后就能出现一辆车
[1:21:50] and I can go exactly where I need to go. ‎我就可以去想去的任何地方
[1:21:52] That is magic. That’s amazing. ‎这简直是魔法 太神奇了
[1:21:56] When we were making the like button, ‎我们制作“点赞”按钮的时候
[1:21:57] our entire motivation was, “Can we spread positivity and love in the world?” ‎我们全部的动机是 “我们可以 ‎在世界中传播积极和爱吗?”
[1:22:01] The idea that, fast forward to today, and teens would be getting depressed ‎时间快进到当下 ‎青少年会因为没有得到足够多的点赞
[1:22:05] when they don’t have enough likes, ‎而抑郁 这个想法
[1:22:06] or it could be leading to political polarization ‎或者会导致政治两极分化
[1:22:08] was nowhere on our radar. ‎当时是完全无法想象的
[1:22:09] I don’t think these guys set out to be evil. ‎我不认为这些人 ‎最开始的目标是邪恶的
[1:22:13] It’s just the business model that has a problem. ‎只是这个商业模式有问题
[1:22:15] You could shut down the service and destroy whatever it is ‎你可以关掉服务 毁掉不管这是什么
[1:22:20] $20 billion of shareholder value and get sued and… ‎200亿美元的股东利益 被起诉…
[1:22:24] But you can’t, in practice, put the genie back in the bottle. ‎但现实是 覆水难收
[1:22:27] You can make some tweaks, but at the end of the day, ‎你可以做出一些小的调整 但是最终
[1:22:30] you’ve gotta grow revenue and usage, quarter over quarter. It’s… ‎你要增加收益和使用 ‎每一个季度都要增加
[1:22:34] The bigger it gets, the harder it is for anyone to change. ‎规模做得越大 越难让任何人改变
[1:22:38] What I see is a bunch of people who are trapped by a business model, ‎我看到的是一群被困住的人 ‎被商业模式
[1:22:43] an economic incentive, and shareholder pressure ‎经济奖励和股东压力困住
[1:22:46] that makes it almost impossible to do something else. ‎几乎无法做其他的任何事情
[1:22:49] I think we need to accept that it’s okay ‎我认为我们应该接受
[1:22:51] for companies to be focused on making money. ‎公司专注于挣钱 是合情合理的
[1:22:53] What’s not okay is when there’s no regulation, no rules, ‎不合情合理的是 ‎当没有监管、没有规定、
[1:22:55] and no competition, 没有竞争
[1:22:56] and the companies are acting as sort of  de facto governments. ‎公司在充当实际政府部门
[1:23:00] And then they’re saying, “Well, we can regulate ourselves.” ‎然后他们说“我们可以监管自己”
[1:23:03] I mean, that’s just a lie. That’s just ridiculous. ‎这肯定是骗人的 怎么可能呢
[1:23:06] Financial incentives kind of run the world, ‎可以说 经济激励驱动着这个世界
[1:23:08] so any solution to this problem ‎所以这个问题的任何解决方案
[1:23:12] has to realign the financial incentives. ‎都必须重新调整经济激励
[1:23:16] There’s no fiscal reason for these companies to change. ‎这些公司没有需要改变的财政理由
[1:23:18] And that is why I think we need regulation. ‎所以我才认为 我们需要监管
[1:23:21] The phone company has tons of sensitive data about you, ‎手机公司有无数的关于你的敏感数据
[1:23:24] and we have a lot of laws that make sure they don’t do the wrong things. ‎我们有很多法律去保证 ‎他们不会利用这些数据做错事
[1:23:27] We have almost no laws around digital privacy, for example. ‎在数码隐私上 我们几乎没有立法
[1:23:31] We could tax data collection and processing ‎我们可以对数据收集和处理收税
[1:23:34] the same way that you, for example, pay your water bill ‎原理等同于你交水费
[1:23:37] by monitoring the amount of water that you use. ‎监控你自己的用水量
[1:23:39] You tax these companies on the data assets that they have. ‎让这些公司因为持有的数据资产交税
[1:23:43] It gives them a fiscal reason ‎就能给他们一个财政理由
[1:23:44] to not acquire every piece of data on the planet. ‎不去获取地球上每一条数据
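The proposal is literally a metering scheme; a sketch with invented placeholder rates:

```python
# Hypothetical metered "data tax": bill on data collected and processed,
# the way a water meter bills usage. Both rates are invented placeholders.
RATE_PER_GB_COLLECTED = 0.05   # dollars per GB ingested (assumption)
RATE_PER_GB_PROCESSED = 0.01   # dollars per GB processed (assumption)

def data_tax(gb_collected: float, gb_processed: float) -> float:
    return (gb_collected * RATE_PER_GB_COLLECTED
            + gb_processed * RATE_PER_GB_PROCESSED)

# Tax bill for hoarding 1 PB and processing it five times over:
print(f"${data_tax(1_000_000, 5_000_000):,.2f}")
```

Under such a scheme every additional byte retained has a marginal cost, which is exactly the fiscal reason to change that the speaker says is missing today.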
[1:23:47] The law runs way behind on these things, ‎立法在这方面太落后了
[1:23:50] but what I know is the current situation exists not for the protection of users, ‎但据我所知 当前状况的存在 ‎不是为了保护用户
[1:23:55] but for the protection of the rights and privileges ‎而是为了保护这些巨型的
[1:23:58] of these gigantic, incredibly wealthy companies. ‎超级富有公司的权利和特权
[1:24:02] Are we always gonna defer to the richest, most powerful people? ‎我们要一直听从最有钱 ‎最有权力的人吗?
[1:24:05] Or are we ever gonna say, ‎还是我们要说
[1:24:07] “You know, there are times when there is a national interest. ‎“有时候 确实是有国家利益
[1:24:12] There are times when the interests of people, of users, ‎有时候 人民的利益 用户利益
[1:24:15] is actually more important ‎其实比一个
[1:24:18] than the profits of somebody who’s already a billionaire”? ‎已经是亿万富翁的人的利益 ‎更加重要?”
[1:24:21] These markets undermine democracy, and they undermine freedom, ‎这些市场削弱了民主 削弱了自由
[1:24:26] and they should be outlawed. ‎它们应该被立法取缔
[1:24:29] This is not a radical proposal. ‎这不是激进的提议
[1:24:31] There are other markets that we outlaw. ‎还有其他一些被我们立法取缔的市场
[1:24:34] We outlaw markets in human organs. ‎我们取缔人体器官交易市场
[1:24:37] We outlaw markets in human slaves. ‎我们取缔奴隶买卖市场
[1:24:39] Because they have inevitable destructive consequences. ‎因为它们都有不可避免的破坏性后果
[1:24:44] We live in a world ‎我们生活的世界
[1:24:45] in which a tree is worth more, financially, dead than alive, ‎死去的树比活着的树更有经济价值
[1:24:50] in a world in which a whale is worth more dead than alive. ‎这个世界 死去的鲸 ‎比活着的鲸更有价值
[1:24:53] For so long as our economy works in that way ‎只要我们的经济这样运转
[1:24:56] and corporations go unregulated, ‎公司不受监管
[1:24:58] they’re going to continue to destroy trees, ‎它们就会继续破坏树木
[1:25:00] to kill whales, ‎继续捕杀鲸
[1:25:01] to mine the earth, and to continue to pull oil out of the ground, ‎在地球上挖矿 从地下抽石油
[1:25:06] even though we know it is destroying the planet ‎虽然我们知道 这样做会破坏地球
[1:25:08] and we know that it’s going to leave a worse world for future generations. ‎我们知道 这样做会为未来几代人 ‎留下一个更不堪的世界
[1:25:12] This is short term thinking ‎这是目光短浅
[1:25:13] based on this religion of profit at all costs, ‎为了利益牺牲一切的信仰
[1:25:16] as if somehow, magically, each corporation acting in its selfish interest ‎指望每个 ‎只顾自己私利的公司会突然神奇地
[1:25:20] is going to produce the best result. ‎去产生最好的结果
[1:25:22] This has been affecting the environment for a long time. ‎这已经影响环境很久了
[1:25:24] What’s frightening, and what hopefully is the last straw ‎恐怖的是 希望这是 ‎压倒骆驼的最后一根稻草
[1:25:27] that will make us wake up as a civilization ‎让我们作为文明的种族 去幡然醒悟
[1:25:29] to how flawed this theory has been in the first place ‎这个理论最初就有很多缺点
[1:25:31] is to see that now we’re the tree, we’re the whale. ‎让我们看到 我们现在就是树 就是鲸
[1:25:35] Our attention can be mined. ‎我们的关注就是被挖掘的矿产
[1:25:37] We are more profitable to a corporation ‎如果我们花时间盯着一个屏幕
[1:25:39] if we’re spending time staring at a screen, ‎盯着一个广告 对公司而言
[1:25:41] staring at an ad, ‎比我们用这个时间
[1:25:43] than if we’re spending that time living our life in a rich way. ‎过自己丰富的生活 更加有利可图
[1:25:45] And so, we’re seeing the results of that. ‎我们看到了这样的后果
[1:25:47] We’re seeing corporations using powerful artificial intelligence ‎我们看到公司利用强大的人工智能
[1:25:50] to outsmart us and figure out how to pull our attention ‎凌驾在我们的智能之上 ‎研究怎样拉拢我们的关注
[1:25:53] toward the things they want us to look at, ‎让我们去看 他们想让我们看的东西
[1:25:55] rather than the things that are most consistent ‎而不是让我们看 ‎与我们目标、价值观
[1:25:57] with our goals and our values and our lives. ‎与我们的生活最为一致的东西
[1:26:02] (Steve Jobs, today’s speaker) ‎(史蒂夫·乔布斯 今日演讲者)
[1:26:05] What a computer is, ‎电脑对我来说 是…
[1:26:06] is it’s the most remarkable tool that we’ve ever come up with. ‎是我们人类历史上 最神奇的发明
[1:26:11] And it’s the equivalent of a bicycle for our minds. ‎相当于是我们思想的自行车
[1:26:15] The idea of humane technology, that’s where Silicon Valley got its start. ‎人道技术的创想 是硅谷最初的目标
[1:26:21] And we’ve lost sight of it because it became the cool thing to do, ‎我们已经背离了这个目标 ‎因为这样做比较酷
[1:26:25] as opposed to the right thing to do. ‎而不是这样做比较正确
[1:26:27] The Internet was, like, a weird, wacky place. ‎网络就是一个奇怪的、可笑的地方
[1:26:29] It was experimental. ‎它是实验性的
[1:26:31] Creative things happened on the Internet, and certainly, they do still, ‎网络上发生着有创意的事情 ‎当然现在也有发生
[1:26:34] but, like, it just feels like this, like, giant mall. ‎但是感觉像一个巨大的商场
[1:26:38] You know, it’s just like, “God, there’s gotta be… ‎就是“天啊 ‎
[1:26:42] there’s gotta be more to it than that.” 肯定不止表面上这么简单”
[1:26:46] I guess I’m just an optimist. ‎我想我只是一个乐观主义者
[1:26:48] ‘Cause I think we can change what social media looks like and means. ‎因为我认为 ‎我们可以改变社交媒体的样子和方式
[1:26:54] The way the technology works is not a law of physics. ‎技术的工作方式不是物理学定律 ‎
[1:26:56] It is not set in stone. 它不是一成不变的
[1:26:58] These are choices that human beings like myself have been making. ‎这些都是像我这样的人类 ‎做出的选择
[1:27:02] And human beings can change those technologies. ‎人类可以改变这些技术
[1:27:06] And the question now is whether or not we’re willing to admit ‎现在的问题是 我们是否愿意承认
[1:27:10] that those bad outcomes are coming directly as a product of our work. ‎这些后果 是我们杰作的直接产物
[1:27:21] It’s that we built these things, and we have a responsibility to change it. ‎这些东西是我们建立起来的 ‎我们有责任去改变它们
[1:27:37] The attention extraction model ‎提取关注模型 ‎
[1:27:38] is not how we want to treat human beings. 不是我们想对待人类的方式
[1:27:45] Is it just me or… ‎只有我这样想吗?还是…
[1:27:49] Poor sucker. ‎可悲的人
[1:27:51] The fabric of a healthy society ‎一个健康社会的结构
[1:27:53] depends on us getting off this corrosive business model. ‎要依靠我们脱离这种 ‎有破坏性的商业模型
[1:28:04] We can demand that these products be designed humanely. ‎我们可以要求 ‎这些产品进行人道设计
[1:28:09] We can demand to not be treated as an extractable resource. ‎我们可以要求 ‎不被当做可以提取的资源对待
[1:28:15] The intention could be: “How do we make the world better?” ‎我们的动机可以是 ‎“我们怎样让这个世界变得更好?”
[1:28:20] Throughout history, ‎在整个人类历史中
[1:28:21] every single time something’s gotten better, ‎每一次有事物变得更好
[1:28:23] it’s because somebody has come along to say, ‎都是因为有人站出来说
[1:28:26] “This is stupid. We can do better.” ‎“这太蠢了 我们可以做得更好”
[1:28:29] Like, it’s the critics that drive improvement. ‎是批判者驱动着改进
[1:28:33] It’s the critics who are the true optimists. ‎批判者才是真正的乐观主义者
[1:28:37] Hello. ‎你好
[1:28:46] I mean, it seems kind of crazy, right? ‎感觉有点疯狂 是吧?
[1:28:47] It’s like the fundamental way that this stuff is designed… ‎这东西的设计基本方式
[1:28:52] isn’t going in a good direction. ‎就不会朝着好的方向发展
[1:28:55] Like, the entire thing. ‎整个社交媒体
[1:28:56] So, it sounds crazy to say we need to change all that, ‎我说要改变这一切 ‎听起来有点疯狂
[1:29:01] but that’s what we need to do. ‎但我们需要这样做
[1:29:04] Think we’re gonna get there? ‎你觉得这一天能实现吗?
[1:29:07] We have to. ‎必须要实现
[1:29:20] Um, it seems like you’re very optimistic. ‎你似乎非常乐观
[1:29:26] Is that how I sound? ‎听起来是这样吗?
[1:29:27] Yeah, I mean… ‎是 我是说…
[1:29:28] I can’t believe you keep saying that, because I’m like, “Really? ‎我无法相信你一直这样说 因为我在想:“真的吗?
I feel like we’re headed toward dystopia. 我感觉我们正走向反乌托邦
[1:29:33] I feel like we’re on the fast track to dystopia, ‎我感觉我们正在飞速驶向反乌托邦
[1:29:35] and it’s gonna take a miracle to get us out of it.” ‎需要一个奇迹 才能让我们摆脱它”
[1:29:37] And that miracle is, of course, collective will. ‎这个奇迹当然是集体意识
[1:29:41] I am optimistic that we’re going to figure it out, ‎可我是乐观主义者 ‎我们一定会有办法解决
[1:29:44] but I think it’s gonna take a long time. ‎但我认为 可能会用很久的时间
[1:29:47] Because not everybody recognizes that this is a problem. ‎因为不是所有的人都意识到了 ‎这是一个问题
[1:29:50] I think one of the big failures in technology today ‎我认为当今技术最大的一个失败
[1:29:55] is a real failure of leadership, ‎是领导力的真正失败
[1:29:58] of, like, people coming out and having these open conversations ‎人们站出来 去公开讨论
[1:30:02] about things that… not just what went well, but what isn’t perfect ‎不仅是哪些地方进行得好 ‎还应该讨论哪里不完美
[1:30:05] so that someone can come in and build something new. ‎才能让有人介入 构建一些新的东西
[1:30:08] At the end of the day, you know, ‎最终 你知道
[1:30:10] this machine isn’t gonna turn around until there’s massive public pressure. ‎在有足够的公众压力之前 ‎这台机器是绝对不会回头的
[1:30:14] By having these conversations and… and voicing your opinion, ‎通过这些对话 发出你的声音
[1:30:18] in some cases through these very technologies, ‎在一些情况下 通过某些特定的技术
[1:30:21] we can start to change the tide. We can start to change the conversation. ‎我们可以开始改变趋势 ‎我们可以开始改变对话
[1:30:24] It might sound strange, but it’s my world. It’s my community. ‎听起来可能有点奇怪 ‎但这是我的世界 是我生活的环境
[1:30:27] I don’t hate them. I don’t wanna do any harm to Google or Facebook. ‎我不恨他们 ‎我不想伤害谷歌或者脸书
[1:30:29] I just want to reform them so they don’t destroy the world. You know? ‎我只是想改革它们 ‎别让他们毁了世界 你知道吗?
[1:30:32] I’ve uninstalled a ton of apps from my phone ‎我在手机上卸载了很多程序
[1:30:35] that I felt were just wasting my time. ‎我感觉那些都是浪费时间
[1:30:37] All the social media apps, all the news apps, ‎所有的社交媒体程序 所有的新程序
[1:30:40] and I’ve turned off notifications ‎我关掉了通知
[1:30:42] on anything that was vibrating my leg with information ‎所有那些让我手机震动的通知
[1:30:45] that wasn’t timely and important to me right now. ‎不够及时 对现在我来说 ‎并不重要的信息
[1:30:49] It’s for the same reason I don’t keep cookies in my pocket. ‎也正是因为同样的理由 ‎我兜里不放饼干
[1:30:51] Reduce the number of notifications you get. ‎减少你收到的通知数量
[1:30:53] Turn off notifications. ‎关掉通知
[1:30:54] Turning off all notifications. ‎关掉所有应用的通知
[1:30:56] I’m not using Google anymore, I’m using Qwant, ‎我已经不再用谷歌了 ‎我用Qwant搜索引擎
[1:30:58] which doesn’t store your search history. ‎这个引擎不会存储你的搜索历史
[1:31:01] Never accept a video recommended to you on YouTube. ‎永远不要接受 ‎YouTube上给你推荐的视频
[1:31:04] Always choose. That’s another way to fight. ‎永远自己去选择 ‎这是另一个抗争的方式
[1:31:07] There are tons of Chrome extensions that remove recommendations. ‎谷歌浏览器有大量扩展程序 可以移除推荐内容
[1:31:12] You’re recommending something to undo what you made. ‎你在推荐一个能撤销你的作品的东西
[1:31:15] Yep. ‎对
[1:31:16] Before you share, fact check, consider the source, do that extra Google. ‎在你分享之前 查找一下事实 ‎思考一下信息来源 谷歌搜索一下
[1:31:21] If it seems like it’s something designed to really push your emotional buttons, ‎如果这个东西感觉像是 ‎以触发你的情感按钮为目标
[1:31:25] like, it probably is. ‎很可能确实是
[1:31:26] Essentially, you vote with your clicks. ‎基本可以说 你用点击去投票
[1:31:29] If you click on clickbait, ‎如果你点击了钓鱼链接
[1:31:30] you’re creating a financial incentive that perpetuates this existing system. ‎你就是在创造一个经济奖励 ‎延续这个已经存在的体系
[1:31:33] Make sure that you get lots of different kinds of information ‎在你的生活中 一定要获得
[1:31:37] in your own life. ‎各种不同的信息
[1:31:37] I follow people on Twitter that I disagree with ‎我会在推特上关注我不认同的人
[1:31:41] because I want to be exposed to different points of view. ‎因为我想看到不同的观点
[1:31:44] Notice that many people in the tech industry ‎要知道 技术行业中的很多人
[1:31:46] don’t give these devices to their own children. ‎不会把这些设备给他们自己的小孩用
[1:31:49] My kids don’t use social media at all. ‎我的孩子们完全不使用社交媒体
[1:31:51] Is that a rule, or is that a… ‎这是规定 还是…
[1:31:53] That’s a rule. ‎家规
[1:31:55] We are zealots about it. ‎我们对它很狂热
[1:31:57] We’re… We’re crazy. ‎我们很疯狂
[1:31:59] And we don’t let our kids have really any screen time. ‎我们不会让我们的孩子 ‎拥有任何看屏幕的时间
[1:32:05] I’ve worked out what I think are three simple rules, um, ‎我想出了 ‎我自己认为的三个简单原则
[1:32:08] that make life a lot easier for families and that are justified by the research. ‎能让生活对家人来说更容易 ‎这是经过研究验证的
[1:32:12] So, the first rule is all devices out of the bedroom ‎第一个原则是 ‎在每晚的固定时间 所有设备
[1:32:15] at a fixed time every night. ‎不能进入卧室
[1:32:17] Whatever the time is, half an hour before bedtime, all devices out. ‎不管是什么时间 睡前半小时 ‎所有设备全都拿出去
[1:32:20] The second rule is no social media until high school. ‎第二个原则是 ‎高中之前禁止使用社交媒体
[1:32:24] Personally, I think the age should be 16. ‎我个人认为 这个年龄应该是16岁
[1:32:26] Middle school’s hard enough. Keep it out until high school. ‎初中已经够难了 上高中之前别用了
[1:32:29] And the third rule is work out a time budget with your kid. ‎第三个原则是 ‎和你的孩子研究出一个时间预算
[1:32:33] And if you talk with them and say, ‎如果你和他们聊 去说
[1:32:34] “Well, how many hours a day do you wanna spend on your device? ‎“你每天想在你的设备上花多少时间
[1:32:38] What do you think is a good amount?” ‎你觉得适合的时长是多少”
[1:32:39] they’ll often say something pretty reasonable. ‎他们通常会说出一个很合理的时长
[1:32:42] Well, look, I know perfectly well ‎看 我非常清楚
[1:32:44] that I’m not gonna get everybody to delete their social media accounts, ‎我无法让所有人删除社交媒体账号
[1:32:48] but I think I can get a few. ‎但我想我可以让几个人这样做
[1:32:50] And just getting a few people to delete their accounts matters a lot, ‎让几个人删除账号 ‎就已经能产生很大影响了
[1:32:54] and the reason why is that that creates the space for a conversation ‎理由是 这样能创造一个对话的空间
[1:32:58] because I want there to be enough people out in the society ‎因为我想让社会中有足够的人
[1:33:00] who are free of the manipulation engines to have a societal conversation ‎他们摆脱了操纵引擎的控制 能够展开一场社会对话
[1:33:05] that isn’t bounded by the manipulation engines. ‎没有受到操纵引擎的牵制
[1:33:07] So, do it! Get out of the system. ‎这样做吧!退出这个体系
[1:33:10] Yeah, delete. Get off the stupid stuff. ‎对 删掉 下线这个愚蠢的东西
[1:33:13] The world’s beautiful. Look. Look, it’s great out there. ‎世界很美丽 你们看 ‎外面的世界很美好
[1:33:18] (Follow us on social media!) ‎(在社交媒体上关注我们!)
[1:33:20] (Just kidding) ‎(开玩笑的)
[1:33:21] (Let’s talk about how to fix it: TheSocialDilemma.com) ‎(我们来聊聊怎样解决这个问题 ‎登录TheSocialDilemma.com)