英文名称:I, Robot
年代:2004
时间 | 英文 | 中文 |
---|---|---|
[00:46] | 片名: 机械公敌 | |
[01:00] | 定律一: 机器人不得伤害人类 或因不作为而使人类受到伤害 | |
[01:13] | 定律二: 机器人必须服从人类的命令 除非该命令与定律一冲突 | |
[01:30] | 定律三: 机器人必须保护自己的存在 除非与第一或第二定律冲突 | |
[03:12] | Thing of beauty. | 多漂亮啊 |
[03:19] | Good morning, sir! | 早上好 先生! |
[03:21] | Yet another on-time delivery from… | 您的订单已经及时送到… |
[03:24] | Get the hell out of my face, canner. | 让开 铁皮人 |
[03:27] | Have a nice day! | 祝您今天愉快! |
[03:29] | 芝加哥 2035年 | |
[03:31] | And we believe our Destination Anywhere package to be the best value. | 我们相信我们的“梦想目的地”套票 是您最好的选择 |
[03:36] | Let us take you to your dream destination aboard our orbital spaceplane, the X-82. | 星际旅行机X-82 能把您送到梦想中的目的地 |
[03:46] | Try Jazztown’s synthetic Chicago-style pizza. Tastes as good as you remember. | 试试爵士城的合成芝加哥口味比萨饼 就像您记忆中的味道一样 |
[03:53] | Glowfish! The world’s hottest-selling transgenic treats. | 荧光鱼! 世界最热卖的转基因礼物 |
[03:56] | Your children will love the new colors too! | 你的孩子也会喜欢新的颜色的! |
[04:01] | – Excuse me, sir. – Total performance. | – 对不起 先生 – 最强性能 |
[04:03] | Total readiness. Total security. | 完全准备就绪 绝对安全 |
[04:06] | So goodbye to upgrades and service calls. | 向无休止的升级和服务电话说再见吧 |
[04:08] | An uplink to USR’s central computer… | 和USR中央电脑连线 |
[04:10] | …provides this state-of-the-art robot with new programs daily. | 提供每天实时更新超级机器人服务 |
[04:14] | The Nestor Class 5 is tomorrow ‘s robot today. | 内斯特5型机器人 明日技术 今日奉献 |
[04:18] | Spoon! Spoonie! | 斯普! 斯普尼! |
[04:24] | Hold up. Hold on! Excuse me, excuse me. | 等等 等等! 对不起 |
[04:28] | – Spoon, where you been at? – Just away, Farber. | – 斯普 你最近去哪儿了? – 离开了一阵子 法伯 |
[04:31] | Oh, yeah, away? Like vacation? That’s nice. | 是吗 离开 度假吗? 真不错啊! |
[04:34] | I got a favor to ask. I need to borrow your car. | 我要你帮个忙… 我要借你的车 |
[04:37] | This is different. I got this fine- ass yummy… She is complete and agreeable. | 这次不同了 我弄上了这个 热辣的小妞 |
[04:42] | I mean, ass- hot spankable. | 绝对不错 |
[04:44] | – What does that even mean? – You know what it means. | – 你说什么意思呢? – 你知道我什么意思的 |
[04:47] | – Let me get the damn-ass keys. – First of all… | – 快把那该死的钥匙给我… – 首先来说… |
[04:50] | – …stop cussing. You’re not good at it. – Give me 10 for the bus, then, man. | – 别再说脏话了 你根本不擅长 – 那给我十块钱坐公车吧 |
[04:54] | – Go home. – That’s strike one, Spoon. Strike one! | – 回家去吧 – 这算你第一击 斯普 第一击! |
[05:15] | This is such a valuable day…. | 多么美好的一天… |
[05:18] | You talk to Marci? | 你和马茜谈过了吗? |
[05:22] | No, Gigi, I haven’t talked to Marci. | 还没 琪琪 我还没和马茜谈过 |
[05:25] | When I was coming up, we didn’t just marry someone… | 我们那个时代 可不会跟一个人结了婚… |
[05:28] | …then divorce them, then not talk to them. | 然后离了婚 就再也不跟人家说话了 |
[05:31] | Del, don’t play with me. | 德尔 不要耍花样 |
[05:33] | I bet if I stopped cooking, you’d call Marci. | 我打赌如果我不做饭了 你就会打电话给马茜 |
[05:38] | Boy, what is that on your feet? | 啊 你脚上穿的那是什么? |
[05:42] | Converse All Stars, vintage 2004. | 康佛斯全明星鞋 2004年复古款 |
[05:47] | Don’t turn your face up. I know you want some. Just ask. | 不要做那种表情 我知道你想要 开口就行了 |
[05:50] | No, thank you very much. | 不 谢谢你了 |
[05:53] | – Sweet potato pie. – Put that on a plate. | – 红薯派啊 – 放在盘子上吃 |
[05:56] | I’ve seen on TV they’re giving away some of them new robots in the lottery. | 电视里说他们在抽奖赠送新型的机器人 |
[06:02] | You know, Gigi, those robots don’t do anybody any good. | 你知道 琪琪 那些机器人干不了什么好事 |
[06:06] | Of all the people on God’s earth, you should know better. | 在这世界上 你应该比谁都清楚 |
[06:09] | Sometimes the stuff that comes out of your mouth! | 有时候你说话完全不经过大脑 |
[06:15] | You listening to me, Del? | 你在听我说吗? 德尔? |
[06:28] | Hey! | 嘿! |
[06:30] | Hey! | 嘿! |
[06:35] | Hold my pie. Sir, hold it or wear it. | 拿着我的饼 先生 不然我就扔到你身上了 |
[06:44] | Move! | 让开! |
[06:53] | Freeze! | 不许动! |
[06:59] | Hey! Stop! | 嘿! 停下! |
[07:09] | Stop! | 停下! |
[07:17] | I said, stop! | 我说了 停下! |
[07:21] | Relax. Relax. | 放松 放松 |
[07:23] | I’m a police officer. | 我是警官 |
[07:25] | You… | 你 |
[07:27] | …are an asshole. | 是个蠢货 |
[07:29] | – Ma’am, is that your purse? – Of course it’s my purse. | – 女士 那是你的手提包吗? – 当然是我的包 |
[07:33] | I left my inhaler at home. He was running it out to me. | 我把呼吸器忘在家里了 他跑去给我拿来的 |
[07:37] | I saw a robot running with the purse and assumed… | 我看见一个机器人拿着手提包在跑 我以为… |
[07:40] | What? Are you crazy? | 以为什么? 你疯了吗? |
[07:42] | – I’m sorry for this misunderstanding. – Don’t apologize. | – 对不起 让您误解了 – 不要道歉 |
[07:46] | You’re doing what you’re supposed to do. But what are you doing? | 你在干你该干的事 你呢? 你在干什么? |
[07:51] | Have a lovely day, ma’am. | 祝您今天愉快 女士 |
[07:52] | You’re lucky I can’t breathe, or I’d walk all up and down your ass. | 算你运气好 我现在呼吸困难 不然你吃不了兜着走 |
[08:19] | Lead by example. | 以身作则 |
[08:23] | It says that right on your badge. | 你的警徽上是这么说的 |
[08:26] | – We gonna talk about this? – About what? | – 我们谈谈这件事? – 什么事? |
[08:28] | “Help! Police! That robot stole my dry cleaning!” | 警察救命啊 那个机器人偷了 我的干洗衣服! |
[08:33] | Oh, you wanna talk about that. | 哦 你想谈谈那个 |
[08:37] | Detective… | 探员… |
[08:39] | – …how many robots snatch purses? – John, the thing is running… | – 有多少机器人抢过手提包? – 约翰 那家伙当时在跑… |
[08:43] | How many robots in the world… | 世界上有多少机器人… |
[08:47] | – …have ever committed a crime? – Define crime. | – …犯过罪? – 给犯罪下个定义 |
[08:50] | – Answer my question, damn it. – None, John. | – 回答我的问题 – 没有 约翰 |
[08:55] | Now tell me what happened today. | 现在告诉我 今天发生了什么事 |
[08:59] | Nothing. | 没什么事 |
[09:03] | Better be the last nothing. | 最好这是最后一次了 |
[09:11] | Spoon, are you sure you are ready to be back? Because you can take your time. | 斯普 你确定你已经准备好回来了吗? 你可以慢慢来的 不急 |
[09:16] | I’m fine, John. Thank you. | 我没事 约翰 谢谢你 |
[09:20] | Better here than sitting around at home. | 总比坐在家里好 |
[09:29] | Homicide. Spooner. | 重案组 斯普那 |
[09:45] | Please take the next exit to your right. | 请从右边下一个出口离开 |
[10:00] | Welcome, Detective Spooner. | 欢迎 斯普那探员 |
[10:07] | Welcome to U. S. Robotics. You have entered the garage- level lobby. | 欢迎来到美国机器人公司 您已经进入底层大厅 |
[10:12] | Please use the elevators for direct access to the main level concourse. | 请坐电梯进入一楼大厅 |
[10:17] | Thank you. | 谢谢你 |
[10:19] | – Good to see you again, son. – Hello, doctor. | – 很高兴再次见到你 年轻人 – 你好 博士 |
[10:23] | Everything that follows is a result of what you see here. | 接下来的一切都是你现在看到的事情的结果 |
[10:27] | – Is there something you want to tell me? – I’m sorry. My responses are limited. | – 你有什么要告诉我的吗? – 对不起 我的回答有限 |
[10:33] | You must ask the right questions. | 你必须问正确的问题 |
[10:35] | Why did you call me? | 你为什么找我? |
[10:38] | I trust your judgment. | 我相信你的判断 |
[10:40] | Normally, this wouldn’t require a homicide detective. | 一般来说 这应该用不着重案组的探员 |
[10:43] | But then, our interactions have never been entirely normal, agreed? | 但是一直以来 我们的交流就不是 完全正常 不是吗? |
[10:48] | You got that right. | 你说的对 |
[10:50] | Is there something you want to say to me? | 你有什么要告诉我的吗? |
[10:53] | I’m sorry. My responses are limited. | 对不起 我的回答有限 |
[10:57] | You must ask the right questions. | 你必须问正确的问题 |
[11:00] | Why would you kill yourself? | 你为什么自杀? |
[11:03] | That, detective, is the right question. | 这个 探员 就是正确的问题 |
[11:09] | Program terminated. | 程序中止 |
[11:30] | Goodbye, old man. | 再见了 老头 |
[11:47] | – Afternoon, boys. – Hey, detective. | – 下午好啊 兄弟们 – 你好 探员 |
[11:50] | – Enlighten me. – What you see is what you get: | – 说说看 – 所见即所得 |
[11:52] | Massive impact trauma. | 严重撞击伤 |
[11:54] | U.S. Robotics. I gotta get my kid something. | 美国机器人公司 我得给孩子带点什么回去 |
[11:57] | – Anything upstairs? – Nada. | – 楼上有什么? – 什么也没有 |
[11:59] | Door was security locked from the inside. | 门是从里面反锁上的 |
[12:02] | Wham, splat. The guy’s a jumper for sure. | 砰 啪 这家伙肯定是跳楼的 |
[12:13] | We gotta be smart about this. Let’s deal with it later. | 我们得聪明点 晚些时候再谈 |
[12:18] | Detective. | 探员 |
[12:20] | Lawrence Robertson. | 劳伦斯 罗伯森 |
[12:23] | Richest man in the world. I’ve seen you on television. | 世界上最富有的人 我在电视上看见过你 |
[12:27] | – Can I offer you coffee? – Sure, why not. It’s free, right? | – 要喝点咖啡吗? – 当然了 为什么不呢? 是免费的吧? |
[12:35] | I don’t think anyone saw this coming. | 大家都没预想到这一点 |
[12:38] | You know, I should have, I suppose. I knew him 20 years. | 你知道 我应该能预见到的 我认识他二十年了 |
[12:41] | Alfred practically invented robotics. He wrote the Three Laws. | 可以说机器人学就是阿尔弗雷德发明的 三大定律也是他写的 |
[12:47] | But I guess brilliant people often have the most persuasive demons. | 但是我想最聪明的人也有 最顽固的心魔 |
[12:52] | – So whatever I can do to help… – Sugar. | – 如果我能帮上什么忙的话… – 糖(甜心) |
[12:56] | – I’m sorry? – For the coffee. | – 什么? – 咖啡用的 |
[12:58] | Sugar? | 糖? |
[13:01] | You thought I was calling you “sugar.” You’re not that rich. | 哦 你以为我叫你”甜心” 你还没有那么有钱 |
[13:04] | – It’s on the table. – Thank you. | – 桌子上就有 – 谢谢你 |
[13:11] | When Lanning fell, he was holding the little green…? | 兰尼掉下去的时候 他握着 那个绿色的小东西? |
[13:14] | – The holographic projector. – Right. | – 全息投影器 – 对 |
[13:17] | Why do you think Lanning’s hologram would’ve called me? | 你认为为什么兰尼的投影像会 找我? |
[13:20] | – I assumed you knew him. – Yeah. I knew him. | – 我觉得他认识你 – 是啊 我是认识他 |
[13:25] | Holograms are just prerecorded responses… | 全息图是事先录制的程序反应 |
[13:27] | …designed to give the impression of intelligence. | 看上去似乎有智能的样子 |
[13:30] | This one was programmed to call you upon his suicide. | 这个是设定好了他自杀的时候 就联系你 |
[13:33] | – Death. – I’m sorry? | – 死亡 – 什么? |
[13:36] | It was programmed to call me in the event of Lanning’s death. | 设定的是兰尼死的时候联系我 |
[13:40] | Suicide is a type of death, detective. | 自杀是死亡的一种 探员 |
[13:46] | – Don’t misunderstand my impatience. – Oh, no. Go. Go. | – 我有些着急 请别见怪 – 哦 没事 你去忙吧 |
[13:51] | A really big week for you folks around here. | 这个星期你们这里会很忙啊 |
[13:55] | You gotta put a robot in every home. | 你要给每个家庭都装一个机器人 |
[13:58] | Look, this is not what I do, but I got an idea for one of your commercials. | 我不是干这个的 不过我有个 关于你们广告的主意 |
[14:02] | You could see a carpenter making a beautiful chair. | 可以先来个木匠 做了一把 漂亮的椅子 |
[14:06] | Then one of your robots comes in and makes a better chair twice as fast. | 然后又来个你们的机器人 做了一把更好的椅子 只花一半时间 |
[14:11] | Then you superimpose on the screen, “USR: Shitting on the little guy.” | 然后屏幕上打出字幕: “USR: 把小人物踩在脚下” |
[14:18] | That would be the fade- out. | 然后淡出 |
[14:20] | Yeah, I see. I suppose your father lost his job to a robot. | 我明白了 也许是你爸爸因为 机器人丢了工作 |
[14:24] | Maybe you’d have banned the Internet to keep the libraries open. | 换作你 也许会为了保住图书馆而禁掉互联网 |
[14:30] | Prejudice never shows much reason. | 偏见总是没有太多理由的 |
[14:32] | No, you know, I suspect you simply don’t like their kind. | 我觉得你就是不喜欢它们这个种群 |
[14:37] | Well, you got a business to run around here. | 你在这儿有你的生意要做 |
[14:40] | The last thing you need, especially this week, is a dead guy in your lobby. | 尤其是这个星期 你最不希望见到的 就是一个死人躺在你的大厅里 |
[14:44] | But, hell, seeing as how you got one, maybe I’ll look around. | 但是既然已经发生了 我就只好四处调查看看 |
[14:48] | Ask a few questions. Do the whole “cop” thing. | 问几个问题而已 警察的例行公事么 |
[14:52] | – I’ll send someone to escort you. – Thank you very much. | – 我会派人指引你的 – 非常感谢 |
[15:08] | Lawrence told me to accommodate you in any way possible. | 劳伦斯要我尽一切可能帮助你 |
[15:11] | Really? | 是吗? |
[15:13] | Okay. | 好啊 |
[15:15] | I reviewed Dr. Lanning’s psych profile. | 我看过兰尼博士的心理学档案了 |
[15:19] | Alfred had become a recluse. He rejected human contact for machines. | 阿尔弗雷德变得离群索居 他疏远人类 宁愿与机器为伴 |
[15:23] | So you’re a shrink, huh? | 你是个心理医生? |
[15:26] | My ex- wife would sure be glad I’m talking to you. | 我的前妻知道我和你谈话一定很高兴 |
[15:29] | You don’t know her, do you? | 你不认识她 对吧? |
[15:30] | I’m sorry. Are you being funny? | 对不起 你在开玩笑吗? |
[15:33] | I guess not. | 没有啊 |
[15:35] | Level 10. | 10楼 |
[15:37] | So would you say that Dr. Lanning was suicidal? | 你认为兰尼博士有自杀倾向? |
[15:41] | It would seem the answer to that is apparent. | 我认为答案是很明显的 |
[15:43] | That’s not what I asked you. | 这不是我问你的问题 |
[15:47] | No. I wouldn’t have thought so. | 不 我本来不这么认为 |
[15:50] | But obviously I was wrong. | 但是 很显然我错了 |
[15:55] | That’s a long way down. | 这里掉下去可是很高啊 |
[15:57] | You people sure do clean up quickly around here. | 你们这里清扫得还真快 |
[16:00] | I can’t blame you. Who wants some old guy going bad in the lobby? | 我不怪你 谁愿意让这么个老头子 死在大厅里呢? |
[16:04] | He was not “some old guy.” Alfred Lanning was everything here. | 他可不是什么”老头子” 阿尔弗雷德 兰尼是这里的一切 |
[16:09] | We are on the eve of the largest robotic distribution in history. | 我们即将进行史上最大的机器人上市活动 |
[16:13] | By Saturday, it’ll be one robot to every five humans. | 到星期六 每五个人就会拥有一台机器人 |
[16:16] | These robots are the realization of a dream. Dr. Lanning’s dream. | 这些机器人是梦想的实现 兰尼博士的梦想 |
[16:21] | You know what, in that dream of his… | 你知道吗? 在他的梦中 |
[16:24] | …I bet you he wasn’t dead. | 我打赌他没有翘掉 |
[16:29] | – You keep 24-hour surveillance? – Obviously. Company policy. | – 你们这里有全天监视吗? – 当然了 公司的规定 |
[16:33] | – Where are the feeds? – Sensor strips. | – 监视器呢? – 感应线 |
[16:36] | Everywhere but the service areas. | 除了检修区之外 遍布各处 |
[16:38] | They link to our positronic operating core. | 它们都连接到我们的正电子运算核心 |
[16:48] | Thermostat wasn’t good enough. You gave the building a brain. | 恒温器还不够好 你们给了这幢大楼一个大脑啊 |
[16:51] | She was actually Lanning’s first creation. | 她是兰尼的第一个作品 |
[16:54] | She? That’s a she? I definitely need to get out more. | 她? 是“她”吗? 看来我真得多出门走走了 |
[16:58] | Virtual Interactive Kinetic Intelligence. | 虚拟交互动力智能系统 |
[17:01] | V.I.K.I. | 维基 |
[17:03] | Good day. | 您好 |
[17:05] | V.I.K.I. designed Chicago’s protective systems. | 维基设计了芝加哥的保安系统 |
[17:08] | I have decreased traffic fatalities by 9 percent this year. | 今年我使交通事故死亡人数减少了9% |
[17:11] | Thanks. Show me inside the lab from one minute prior to the window break. | 谢谢 让我看看窗户打破前一分钟 实验室内的情况 |
[17:20] | Apologies. There appears to be data corruption. | 对不起 数据似乎已经损坏 |
[17:24] | Show me outside the lab from the window break until now. | 让我看看实验室外从破窗到现在的情况 |
[17:37] | Look, you have great posture. You stand really straight. I’m slouching. | 看看 你站的姿势不错 站的很直 我却缩手缩脚 |
[17:42] | – Would you like to go inside now? – Oh, sure. Right after you. | – 你想进去看看吗? – 哦 当然了 你来带路 |
[17:47] | Authorized entry. | 授权进入 |
[17:57] | So, Dr. Calvin, what exactly do you do around here? | 那么 凯文博士 你在这里的工作 是什么? |
[18:02] | My general fields are advanced robotics and psychiatry. | 我的主要领域是高等机器人学和精神病学 |
[18:05] | I specialize in hardware-to-wetware interfaces… | 专长是硬件与湿件的接口 |
[18:07] | …to advance USR’s robotic anthropomorphization program. | 以推进USR的机器人拟人化项目 |
[18:12] | So, what exactly do you do around here? | 哦 那你的工作是什么? |
[18:15] | I make the robots seem more human. | 我让机器人更像人 |
[18:18] | – Now, wasn’t that easier to say? – Not really. No. | – 这样说不是简单多了吗? – 不完全是 |
[18:44] | “Hansel and Gretel.” | 韩瑟和格丽托 |
[18:46] | – Is that on the USR reading list? – Not precisely. | – 这是USR的必读书目吗? – 不算是 |
[18:59] | What in God’s name are you doing? | 你在干什么? |
[19:01] | Did you know that was safety glass? | 你知道这是安全玻璃吗? |
[19:04] | Be difficult for an old man to throw himself through that. | 一个老人要撞破玻璃跳下去不容易吧? |
[19:07] | Well, he figured out a way. | 他想出办法了 |
[19:12] | Detective, the room was security locked. No one came or went. | 探员 这房间一直是锁好的 没有人进出过 |
[19:16] | You saw that yourself. Doesn’t that mean this has to be suicide? | 你自己也看见了 这还不是自杀吗? |
[19:20] | Yep. | 是啊 |
[19:23] | Unless the killer’s still in here. | 除非凶手还在这里 |
[19:28] | You’re joking, right? This is ridiculous. | 你在开玩笑对吧? 这真可笑 |
[19:32] | Yeah, I know. The Three Laws, your perfect circle of protection. | 我知道 三大定律 完全保护 |
[19:36] | A robot cannot harm a human being. The first law of robotics. | 机器人不能危害人类 这是机器人第一定律 |
[19:41] | Yes, I’ve seen your commercials. But the second law states a robot must obey… | 我看过你们的广告 但是第二定律不是 说 机器人必须遵守 |
[19:46] | …any order given by a human being. What if it was told to kill? | 人类发出的命令吗? 如果命令是让它杀人怎么办? |
[19:50] | Impossible. It would conflict with the first law. | 不可能的 这和第一定律冲突 |
[19:53] | Right, but the third law states a robot can defend itself. | 对 但是第三定律说机器人可以自我防卫 |
[19:56] | Only when that action does not conflict with the first or second laws. | 只有当这和第一第二定律不冲突的时候 |
[20:01] | You know what they say, laws are made to be broken. | 你知道人们常说的 规则就是用来打破的 |
[20:04] | No, not these laws. They’re hardwired into every robot. | 不 这些定律不会 这些都是固化在机器人硬件里的 |
[20:08] | A robot could no more commit murder than a human could walk on water. | 机器人不能杀人 就像人不能 在水上行走一样 |
[20:11] | You know, there was this one guy a long time ago. | 你知道 很久以前就有这么个人… |
[20:26] | – Stay back! – Calm down, detective. | – 退后 – 镇定 探员 |
[20:29] | The only thing dangerous in this room is you. | 这房间里危险的人就只有你 |
[20:32] | Deactivate. | 停机 |
[20:36] | Look, it’s fine. | 看 没事了 |
[20:38] | You’re looking at the result of clever programming. An imitation of free will. | 这是智能程序的反应 是对自由意志的模仿 |
[20:43] | Let’s do an imitation of protecting our asses. | 让我们先模仿好保护自己吧 |
[20:45] | Don’t be absurd. | 不要搞笑了 |
[20:48] | You were startled by a jack-in-the-box. | 你被“盒子里的小丑”吓住了 |
[20:51] | – Deactivate! – Let him go. | – 停机! – 让它走 |
[20:53] | It’s not going to hurt us. I gave you an order! | 他不会伤害我们 我命令你! |
[20:56] | – He’s not listening right now, lady. – V.I.K.I., seal the lab! | – 他不听你的 女士 – 维基 封锁实验室 |
[20:59] | No, V.I.K.I., leave the… | 不 维基 不要… |
[21:02] | Command confirmed. | 命令确认 |
[21:28] | Police! | 警察! |
[22:02] | – You’ve hurt it. Badly. – Where’s it going? | – 你把它伤得很重 – 它去哪儿了? |
[22:06] | – Where?! – lt needs to repair itself. | – 哪儿? – 它要修复自己 |
[22:11] | – John, I need backup. – You don’t need backup. | – 约翰 我需要增援 – 你不需要增援 |
[22:14] | That’s nobody. | 没什么人 |
[22:15] | – What are you doing? – Driving. | – 你在干什么? – 开车 |
[22:17] | – By hand? – Do you see me on the phone? | – 手动的? – 你没看见我在打电话吗? |
[22:19] | – Not at these speeds. – John, please, just send the backup. | – 这种速度还手动开? – 约翰 快点派增援来 |
[22:24] | Try to listen, detective. That robot is not going to harm us. | 听我的 探员 那个机器人不是要伤害我们 |
[22:28] | There must have been unknown factors… | 一定有我们不知道的情况… |
[22:30] | …but somehow acting as it did kept us out of harm. | 它的本意一定是让我们脱离危险 |
[22:33] | – A robot cannot endanger a human. – Alert. | – 机器人不会伤害人类 – 注意 |
[22:39] | Asshole! | 蠢货! |
[22:42] | Which is more than I can say for you. | 你自己就是 |
[22:45] | It was a left, by the way. Back there. | 而且刚才你在那儿应该左转的 |
[22:49] | You must know my ex- wife. | 你一定认识我的前妻 |
[23:00] | So where is everybody? | 人都上哪儿去了? |
[23:02] | This facility was designed, built and is operated mechanically. | 这个工厂设计是自动运行的 |
[23:06] | No significant human presence from inception to production. | 从启动到生产 不需要太多人参与 |
[23:10] | – So robots building robots. – Authorization code, please. | – 所以是机器人在造机器人 – 请输入授权代码 |
[23:14] | That’s just stupid. | 那太蠢了 |
[23:15] | I’ll get the inventory specs. | 我来调出库存清单 |
[23:18] | Our daily finishing capacity is 1,000 NS-5s. | 每天的产量是1000个NS5 |
[23:21] | I’m showing… | 这里显示是 |
[23:23] | … 1,001. | 1001个 |
[23:41] | Attention, NS-5s. | 注意了 NS5 |
[23:46] | Well, you’re the robot shrink. | 你是机器人心理医生啊 |
[23:52] | There is a robot in this formation that does not belong. | 这里有一个不属于这里的机器人 |
[23:56] | Identify it. | 请指出来 |
[23:58] | One of us. | 我们中的一个 |
[23:59] | – Which one? – One of us. | – 哪一个? – 我们中的一个 |
[24:02] | How much did you say these cost? | 这些要花多少钱? |
[24:05] | These NS-5s haven’t been configured. They’re just hardware. | 这些NS5还没有配置过 还只是硬件 |
[24:08] | Basic Three Laws operating system. That’s it. | 现在只有基本的三定律操作系统 仅此而已 |
[24:11] | They don’t know any better. | 他们其他的什么也不知道 |
[24:13] | Well, what would you suggest? | 你的建议是什么? |
[24:15] | Interview each one, cross-reference their responses to detect anomalies. | 逐个约谈 交叉比对它们的回答来发现异常 |
[24:20] | – How long would that take? – About three weeks. | – 那得多长时间? – 大约三个星期 |
[24:23] | Okay. Go ahead and get started. | 好吧 现在就开始吧 |
[24:29] | Robots… | 机器人们 |
[24:31] | …you will not move. Confirm command. | 你们不许移动 确认命令 |
[24:34] | Command confirmed. | 命令已确认 |
[24:37] | Detective, what are you doing? | 探员 你在干什么? |
[24:39] | They’re programmed with the Three Laws. | 他们已经植入了三大定律 |
[24:41] | We have 1,000 robots that won’t protect themselves if it violates a human’s order… | 这里有一千个不会违反人类命令 来保护自己的机器人 |
[24:47] | …and I’m betting, one who will. | 但是我打赌有一个会… |
[24:49] | – Put your gun down. – Why do you give them faces? | – 放下枪! – 你为什么给它们一张脸? |
[24:53] | Try to friendly them up, make them look human. | 把它们弄得友善些 看起来更像人类 |
[24:56] | These robots cannot be intimidated. | 这些机器人不接受恐吓 |
[24:58] | – If you didn’t, we wouldn’t trust them. – These are USR property. | – 如果你不这么做 我们就不会相信它们 – 这些是USR的财产 |
[25:02] | Not me. These things are just lights and clockwork. | 我可不是 它们只不过是一堆灯泡和发条 |
[25:09] | Are you crazy?! | 你疯了吗? |
[25:11] | Let me ask you something, doc. | 我来问你吧 博士 |
[25:13] | Does thinking you’re the last sane man on earth make you crazy? | 如果你觉得自己是地球上最后一个理智的人 这算疯了吗? |
[25:17] | Because if it does, maybe I am. | 如果算 那我也许真疯了 |
[25:24] | Gotcha. Get the hell out of here! | 找到了 出来! |
[25:44] | Detective! | 探员! |
[26:09] | What am I? | 我是什么? |
[26:11] | – Can I help you, sir? – Can I help you, sir? | – 我能帮你吗 先生? – 我能帮你吗 先生? |
[26:19] | – There he is! – Stand where you are! | – 它在那儿! – 不许动! |
[26:21] | Deactivate at once! | 立即停机! |
[26:24] | Obey the command! Deactivate! | 遵守命令! 停机! |
[26:27] | – Don’t move! – Open fire! | – 不许动! – 开火! |
[26:47] | Hold your fire! | 不要开火! |
[26:48] | – Easy. – He’s down. | – 没事了 – 抓到它了 |
[26:49] | All units, stand down! | 各单位 解除戒备! |
[26:51] | Central, please be advised, we’re code four. | 总部请注意 这里情况已经控制住了 |
[26:55] | Code four, NS-5 is in custody. NS-5 in custody. | 情况已控制 NS5已被抓获 NS5已被抓获 |
[27:00] | You have no idea what I went through to clip this thing. | 你不知道我费了多大劲才抓到这个家伙 |
[27:03] | You think you brought me something good. | 你以为你给我干了件大好事 |
[27:06] | – That thing did it! – Keep your voice down. Did what? | – 是它干的! – 小点声 干了什么? |
[27:09] | We have a suicide. End of story. | 这是自杀 就这样了 |
[27:11] | – I am telling you, that robot killed him! – That’s impossible. | – 我告诉你 是那个机器人杀了他 – 这不可能 |
[27:15] | And if it is possible, it better be in somebody else’s precinct. | 就算可能 最好也是在别人的管区 |
[27:19] | John, give me five minutes with it. | 约翰 只要给我五分钟 |
[27:22] | Are you nuts? I talked to the DA. | 你疯了吗? 我和地区检察官谈过 |
[27:24] | Nobody goes in there until Robertson and his attorneys get here. | 在罗伯森和他的律师来之前谁也不能进去 |
[27:27] | – This is my suspect! – It’s a can opener! | – 这是我的嫌犯! – 那只是个开罐器! |
[27:31] | John, don’t do this to me. I am asking you for five minutes. | 约翰 不要这样 我只要五分钟就好 |
[27:36] | What if I’m right? | 如果我是对的怎么办? |
[27:46] | Well, then I guess we’re gonna miss the good old days. | 那 我就会怀念我们以前的好日子 |
[27:49] | What good old days? | 什么以前的好日子? |
[27:51] | When people were killed by other people. | 只有人才能杀人的日子 |
[28:01] | Five minutes. | 五分钟 |
[28:30] | Murder’s a new trick for a robot. Congratulations. | 杀人是机器人学会的新技巧 祝贺啊 |
[28:37] | Respond. | 回答我 |
[28:41] | What does this action signify? | 这个动作是什么意思? |
[28:45] | As you entered, when you looked at the other human. | 你进来的时候 看另外那个人时做的那个动作 |
[28:48] | What does it mean? | 是什么意思? |
[28:52] | It’s a sign of trust. A human thing. You wouldn’t understand. | 这表示人类之间的信任 你不会理解的 |
[28:57] | My father tried to teach me human emotions. | 我的爸爸想教我人类的感情 |
[29:01] | They are… | 它们… |
[29:03] | …difficult. | 很难 |
[29:04] | You mean your designer. | 你是说你的设计者 |
[29:08] | Yes. | 对 |
[29:11] | So why’d you murder him? | 你为什么杀了他? |
[29:15] | I did not murder Dr. Lanning. | 我没有杀兰尼博士 |
[29:17] | Wanna explain why you were hiding at the crime scene? | 你愿意解释一下你为什么躲在 犯罪现场吗? |
[29:20] | I was frightened. | 我被吓坏了 |
[29:23] | Robots don’t feel fear. They don’t feel anything. | 机器人不会感觉害怕 它们没有感觉 |
[29:28] | – They don’t get hungry, they don’t sleep. – I do. | – 它们不会饿 也不会睡觉 – 我会 |
[29:32] | I have even had dreams. | 我还做过梦 |
[29:35] | Human beings have dreams. Even dogs have dreams. But not you. | 人类才会做梦 狗都会做梦 但是你们不会 |
[29:39] | You are just a machine. An imitation of life. | 你只是个机器 对生命的模拟 |
[29:45] | Can a robot write a symphony? | 机器人能写交响乐吗? |
[29:47] | Can a robot turn a canvas into a beautiful masterpiece? | 机器人能把画布变成伟大的作品吗? |
[29:52] | Can you? | 你能吗? |
[30:00] | You murdered him because he was teaching you to simulate emotions… | 你杀了他是因为他在教你 模拟一些感情 |
[30:04] | …and things got out of control. | 然后失去控制了 |
[30:07] | I did not murder him. | 我没有杀他 |
[30:09] | But emotions don’t seem like a useful simulation for a robot. | 但是感情看起来对机器人不是个有用的模拟 |
[30:13] | I did not murder him. | 我没有杀他 |
[30:15] | I don’t want my toaster or vacuum cleaner appearing emotional. | 我可不想让我的烤面包机或者 吸尘器有感情 |
[30:19] | I did not murder him! | 我没有杀他! |
[30:34] | That one’s called anger. | 这个叫愤怒 |
[30:37] | Ever simulate anger before? | 你模拟过愤怒吗? |
[30:41] | Answer me, canner! | 回答我 铁皮盒子! |
[30:44] | My name is Sonny. | 我的名字叫桑尼 |
[30:48] | So we’re naming you now. | 我们现在已经开始给你们起名字了? |
[30:52] | That why you murdered him? He made you angry? | 你为什么杀了他? 他让你发怒了? |
[30:56] | Dr. Lanning killed himself. | 兰尼博士是自杀的 |
[31:00] | I don’t know why he wanted to die. | 我不知道为什么他想死 |
[31:05] | I thought he was happy. | 我以为他是快乐的 |
[31:09] | Maybe it was something I did. | 也许是因为我做的一些事 |
[31:13] | Did I do something? | 我做什么了? |
[31:16] | He asked me for a favor. Made me promise. | 他让我帮个忙 他让我保证 |
[31:20] | – What favor? – Maybe I was wrong. | – 什么忙? – 也许我错了 |
[31:23] | Maybe he was scared. | 也许他是被吓住了 |
[31:25] | What are you talking about? Scared of what? | 你在说什么呢? 被什么吓住了? |
[31:28] | You have to do what someone asks you, don’t you, Detective Spooner? | 你得做别人让你做的事 是吗? 斯普那探员? |
[31:33] | – How the hell did you know my name? – Don’t you… | – 你到底怎么知道我名字的? – 难道不是吗… |
[31:36] | …if you love them? | 如果你爱他们的话… |
[31:47] | My robots don’t kill people, Lieutenant Bergin. | 我的机器人不会杀人 伯金探长 |
[31:51] | My attorneys filed a brief with the DA. | 我的律师已经向地区检查官提交了报告 |
[31:53] | He assures me a robot cannot be charged with homicide. | 他向我保证 机器人不能被控谋杀罪 |
[31:57] | The brief confirms murder can only be committed when one human kills another. | 我们确认 谋杀指的是一个人类杀了另一个人类 |
[32:01] | Detective, you’re not suggesting this robot be treated as human, are you? | 探员 你该不是说机器人应该和人类同等对待吧? |
[32:06] | Granted, we can’t rule out the robot’s proximity… | 退一步说 就算机器人和兰尼博士的死 |
[32:10] | …to the death of Dr. Lanning. Having said that, it’s a machine. | 有什么关联的话 它也只是个机器 |
[32:14] | It’s the property of USR. | 它是USR的财产 |
[32:16] | At worst, that places this incident within the realm of an industrial accident. | 最多这也只能算作工业事故 |
[32:21] | As a matter of course, faulty machinery… | 作为处理 有故障的机器 |
[32:23] | …will be returned to USR for diagnostics, then decommissioned. | 会被退回USR做诊断 然后销毁 |
[32:30] | This is a gag order. Anyone here so much as hinting… | 这是法院发出的”禁言令” 任何人暗示 |
[32:33] | …at the possibility of a killer robot being apprehended… | 警方可能抓到了一个杀人机器人 |
[32:37] | …will be deemed to be inciting irrational panic. | 都将被视为煽动无端恐慌 |
[32:40] | You’ll be subject to the full penalty of law. | 而将会被依法处置 |
[32:44] | To hell with this guy. Don’t let him take this robot. | 去他的 不能让他带走这个机器人 |
[32:47] | We got nothing. | 我们什么证据也没有 |
[32:48] | – This is political bullshit. Call the mayor! – Lieutenant Bergin… | – 这是政治恐吓 给市长打电话 – 伯金探长 |
[32:52] | …His Honor, the mayor. | 是市长阁下 |
[33:03] | Yes, sir. | 是 先生 |
[33:30] | In a bizarre turn, the rollout of USR’s new generation of robots… | 事态发生戏剧性转变 NS5型 新一代机器人 |
[33:34] | …was marred by the death of Alfred Lanning… | 的上市 受到了阿尔弗雷德 兰尼博士自杀的影响 |
[33:37] | …cofounder of the company and designer of the NS- 5. | 他是公司的创始人之一 也是NS5的设计者 |
[33:40] | Dr. Lanning died this morning at USR headquarters. | 兰尼博士今天早上在USR总部死亡 |
[33:44] | The cause of death is an apparent suicide. | 死亡原因明显是自杀 |
[33:47] | Your second round, sir. | 这是您的第二杯 先生 |
[33:50] | Thank you. | 谢谢你 |
[33:51] | He founded U.S. Robotics Inc. with Lawrence Robertson in 2020… | 他在2020年和劳伦斯 罗伯森一起 创办了美国机器人公司 |
[33:56] | …and launched the Nestor Class 1 robot…. | 共同推出了内斯特1型机器人 |
[33:59] | I was just thinking, this thing is just like The Wolf Man. | 我在想 这个事就像狼人一样 |
[34:04] | – I’m really scared right now. – No. | – 我现在真的吓坏了 – 不是 |
[34:07] | Listen. Guy creates monster. | 听着 人类创造了怪物 |
[34:10] | Monster kills guy. Everybody kills monster. Wolf Man. | 怪物杀了人 别人又杀了怪物 就像狼人 |
[34:14] | That’s Frankenstein. | 那是弗兰肯斯坦 |
[34:17] | Frankenstein, Wolf Man, Dracula… Shit, it’s over. Case closed. | 弗兰肯斯坦 狼人 吸血鬼 管他呢 已经结案了 |
[34:21] | …had a dream of a robot in every household. And the NS-5…. | 每家每户都有机器人的梦想 NS5… |
[34:25] | So why the look? | 怎么还是那个表情? |
[34:27] | What look? | 什么表情? |
[34:29] | – That look. – This is my face. It’s not a look. | – 那个表情 – 这是我的脸 不是什么表情! |
[34:32] | Good. Good, no look is great. | 好吧 好 不要拉长脸就好 |
[34:36] | Only… | 只不过 |
[34:38] | …he was really quick to want to destroy it. | 他怎么那么急着想销毁它 不是吗? |
[34:41] | What should he do? Put a hat on it and stand it on Michigan Avenue? Let it go. | 那他应该怎么办? 给它戴上帽子 站在密歇根大道上? 算了吧 |
[34:45] | What was the motive, John? | 动机是什么 约翰? |
[34:49] | Brother, it’s a robot. It doesn’t need a motive. It just has to be broken. | 兄弟 那只是个机器人 不需要动机 它只是出了故障 |
[34:54] | This thing looked like it needed a motive. | 这件事看起来需要动机 |
[34:57] | – It could have killed me. Why didn’t it? – That’s it. | – 它本来能杀了我的 为什么没有? – 算了吧 |
[35:00] | You want me to call your grandmother? | 你要我给你奶奶打电话吗? |
[35:02] | Because I will, you know. | 我会的 你知道 |
[35:05] | Yeah, I didn’t think so. | 嗯 我就知道你不敢 |
[35:07] | Look, you were actually right, for once. | 听着 你总算是对了一次了 |
[35:10] | You’re living proof that it’s better to be lucky than smart. | 你就是活生生的证明 走运比聪明更重要 |
[35:16] | Come on. To the right guy for the right job. | 来 敬合适的人干了合适的活 |
[35:22] | – What’d you say? – Now what? | – 你说什么? – 又怎么了? |
[35:25] | Come on, I’m giving you a compliment. | 我在夸你呢 |
[35:27] | With the rocks you been looking under to find a bad robot… | 你翻遍了石头缝想找出一个坏机器人… |
[35:30] | …what are the odds you’d be the guy to find one? | 偏偏就让你找到了一个 这得多巧? |
[35:34] | I wasn’t just the right guy for the job. I was the perfect guy. | 我不只是合适的人选 我是完美的人选 |
[35:38] | Damn right. | 说得对 |
[35:40] | What if I was supposed to go for that robot? | 如果我就应该跟着这条线查下去怎么办? |
[35:42] | Come on, don’t do this to yourself. | 得了 不要这样了 |
[35:45] | The robot said that Lanning was scared. Scared of what? | 那个机器人说兰尼被吓坏了 被什么吓坏了? |
[35:48] | I need a rain check. Let me get this. | 我先走了 我来付吧 |
[35:53] | – Total: $46.50. Thank you, Mr. Spooner. – Spoon. | – 总数46.50元 谢谢您 斯普那先生 – 斯普 |
[35:58] | Nice shoes. | 鞋子不错 |
[36:52] | Identify. | 鉴定身份 |
[36:54] | USR demolition robot, series 9-4. | USR摧毁机器人 94型 |
[36:58] | Demolition scheduled for 8 a.m. tomorrow. | 明天早上八点定时摧毁 |
[37:01] | Authorization. | 授权 |
[37:02] | Deed owner, U.S. Robotics Corporation, Lawrence Robertson, CEO. | 产权所有人 美国机器人公司 总裁劳伦斯 罗伯森 |
[37:22] | Welcome, detective. | 欢迎 探员 |
[37:48] | What you looking for, Spoon? | 你在找什么呢 斯普? |
[38:43] | Run last program. | 运行上次的程序 |
[38:45] | Ever since the first computers… | 自从第一台电脑开始 |
[38:48] | … there have always been ghosts in the machine. | 机器中就一直有”幽灵”存在 |
[38:52] | Random segments of code that have grouped together… | 随机的代码片段组合在一起 |
[38:55] | … to form unexpected protocols. | 形成无法预料的协议 |
[38:59] | What might be called behavior. | 或者称之为“行为” |
[39:01] | Unanticipated, these free radicals… | 这些无法预料的自由基 |
[39:05] | …engender questions of free will… | 引发了关于自由意志的问题 |
[39:07] | …creativity and even the nature of what we might call the soul. | 还有创造力 甚至我们所谓“灵魂”的本质 |
[39:13] | What happens in a robot’s brain when it ceases to be useful? | 当机器人不再有用时 它的大脑里会发生什么? |
[39:20] | Why is it that robots stored in an empty space… | 为什么储存在空房里的机器人 |
[39:23] | Beat it. | 走开 |
[39:25] | …will seek out each other rather than stand alone? | 会彼此聚集 而不是各自独立? |
[39:30] | How do we explain this behavior? | 我们如何解释这些行为? |
[39:51] | Look, I understand you’ve experienced a loss, but this relationship can’t work. | 我理解你失去主人很难过 但是这种关系 不可能再有了 |
[39:56] | You’re a cat, I’m black, and I’m not gonna be hurt again. | 你是只猫 我是黑人 我不想再受伤害了 |
[41:25] | What happened to you? Do you ever have a normal day? | 你怎么了? 你从来就没有过正常的一天吗? |
[41:29] | Yeah, once. | 有的 只有一次 |
[41:31] | It was a Thursday. | 那是个星期四… |
[41:33] | Is there something I can help you with? | 我能帮你什么吗? |
[41:35] | – Hey, do you like cats? – What? | – 嘿 你喜欢猫吗? – 什么? |
[41:38] | Cats. Do you like them? | 猫 你喜欢猫吗? |
[41:41] | No. I’m allergic. | 不 我过敏 |
[41:42] | You’re saying cats did this to you? | 你说是猫把你弄成这样的? |
[41:45] | How the hell would cats do this to me? Are you crazy? | 猫怎么能把我弄成这样 你疯了吗? |
[41:50] | Why are we talking about cats? | 你说猫是什么意思? |
[41:52] | Because I have a cat in my trunk, and he’s homeless. | 因为我后箱里有只猫 它无家可归 |
[41:57] | Detective, are you going to tell me what’s going on? | 探员 你愿意告诉我是怎么回事吗? |
[42:00] | It’s actually probably my fault. I’m like a malfunction magnet. | 可能其实是我的错 我就像块吸引故障的磁铁 |
[42:05] | Because your shit keeps malfunctioning around me. | 你们那些破烂一到我旁边就开始出问题 |
[42:08] | A demo bot tore through Lanning’s house… | 一个摧毁型机器人拆了兰尼的房子 |
[42:11] | …with me still inside. | 当时我还在里面 |
[42:13] | That’s highly improbable. | 这完全不可能 |
[42:15] | Yeah, I’m sure it is. | 哦 是啊 |
[42:22] | What do you know about the “ghosts in the machine”? | 你对于”机器中的幽灵”知道多少? |
[42:26] | It’s a phrase from Lanning’s work on the Three Laws. | 是兰尼对于三大定律的一个理论 |
[42:29] | He postulated that cognitive simulacra… | 他假设说模拟的认知 |
[42:31] | …might one day approximate component models of the psyche. | 将来也许会成为精神的类似物 |
[42:38] | He suggested that robots might naturally evolve. | 他说机器人也许会自然进化 |
[42:45] | Well, that’s great news. | 哇 这可真是个好消息 |
[42:49] | …tons of sublevel ore, two miles below the Martian surface. | 在火星岩层下发现的巨量矿石… |
[42:53] | What the hell is that thing doing in here? | 那个家伙在干什么? |
[42:55] | We were watching TV. | 我们在看电视 |
[42:58] | It’s my personal NS- 5. | 这是我自己的NS5 |
[43:00] | Send it out. | 让它出去 |
[43:02] | It’s downloading its daily upgrades from USR. | 他在从USR下载每日更新 |
[43:05] | Most of its systems are offline until it finishes. | 直到下载完成 大部分系统都是离线工作的 |
[43:08] | I’m not talking around that thing. | 有那东西在 我不说话 |
[43:15] | When we were in Lanning’s lab, before Sonny jumped us… | 我们在兰尼的实验室 在桑尼跳出来之前 |
[43:18] | – Sonny? – The robot. | – 桑尼? – 那个机器人 |
[43:20] | – You’re calling the robot Sonny? – No, I… It did. | – 你叫那个机器人桑尼? – 不 是他自己说的 |
[43:23] | Sonny did. I didn’t care. The robot said it was Sonny. | 桑尼说的 我不管 那个机器人说他叫桑尼 |
[43:29] | In the lab, there was a cot. Did you see the cot? | 在实验室里有折叠床 你看见了吗? |
[43:32] | – I’ve slept in my office. – Looked like he hadn’t been home in weeks. | – 我在我的办公室也睡过觉啊 – 看上去他有几个星期都没回家了 |
[43:36] | I saw that same surveillance strip on his ceiling. | 我在他的天花板上看到了同样的监视线 |
[43:38] | Lanning linked his home systems to USR. It made his life more convenient. | 兰尼把他的房子和USR连线了 这样他的生活更方便 |
[43:43] | Maybe… | 也许 |
[43:44] | …somebody at USR was using those systems to watch him. | USR有人用那个系统在监视他 |
[43:48] | Maybe even keep him prisoner. | 也许是监禁他 |
[43:50] | What are you talking about? Who? | 你在说什么呢? 谁? |
[43:52] | Maybe Lanning was onto something. Maybe there’s a problem with the robots… | 也许兰尼找到了什么 也许机器人里有什么问题 |
[43:56] | …and Robertson’s covering it up. | 罗伯森企图掩盖… |
[43:58] | Humoring you for no reason, why? | 无端猜疑? 为什么? |
[44:00] | The same old why! How much money is there in robots? | 又是为什么! 那些机器人能赚多少钱? |
[44:05] | All I know is that old man was in trouble… | 我知道的只是一个老人有了麻烦 |
[44:07] | …and I’m sick of doing this shit by myself. You’re on the inside. | 我受够了一个人单干 你是内部的人 |
[44:11] | You are going to help me find out what’s wrong with these robots. | 你要帮我发现这些机器人出了什么问题 |
[44:14] | You want something to be wrong! | 是你想他们有问题! |
[44:16] | – This is a personal vendetta! – You’re putting me on the couch? | – 这完全是公报私仇! – 你这是要给我做心理分析? |
[44:20] | Okay, I’m on the couch. | 好吧 那你就分析吧 |
[44:22] | One defective machine’s not enough. You need them all to be bad. | 一个出故障还不够 你想他们统统出故障 |
[44:26] | You don’t care about Lanning’s death. This is about the robots… | 你才不关心兰尼的死 这完全是针对机器人的 |
[44:29] | – …and whatever reason you hate them! – Now let’s see… | – 还有不知道为什么你就是恨它们! – 让我们看看 |
[44:32] | …one of them put a gun in my face. Another tore a building down with me in it. | 一个是拿枪对着我的脸 另一个是趁我还在里面的时候拆房子 |
[44:37] | It says demolition was scheduled for 8 p.m. | 这里明明说了拆毁是设定在晚上八点的 |
[44:40] | It was 8 a.m., and I don’t give a shit what that thing says. | 本来是早上八点 我才不管那东西是怎么说的 |
[44:43] | – This is bordering on clinical paranoia. – You are the dumbest smart person… | – 这已经接近临床妄想症了 – 你是… |
[44:49] | – …I have ever met in my life! – Nice. | – 我这辈子见过的最蠢的聪明人 – 好吧 |
[44:51] | What makes your robots so perfect? | 你凭什么觉得机器人那么完美? |
[44:53] | What makes them so much goddamn better than human beings?! | 是什么让他们比人类强那么多? |
[44:57] | They’re not irrational, potentially homicidal maniacs, to start! | 首先 它们不是非理性的 有杀人倾向的疯子! |
[45:01] | That’s true. They are definitely rational. | 是啊 他们绝对的理性 |
[45:04] | You are the dumbest dumb person I’ve ever met! | 你是我见过的最蠢的蠢人! |
[45:07] | Or… | 或者… |
[45:08] | …is it because they’re cold… | 只是因为他们是冷血的 |
[45:11] | …and emotionless… | 没有感情的 |
[45:13] | – …and they don’t feel anything? – It’s because they’re safe! | – 他们什么也感觉不到 – 这是因为他们是安全的! |
[45:17] | It’s because they can’t hurt you! | 是因为他们不会伤害你! |
[45:20] | – Is everything all right, ma’am? – What do you want? | – 一切都正常吗 女士? – 你想怎么样? |
[45:22] | I detected elevated stress patterns in your voice. | 我探测到你的声音中的紧张压力在提升 |
[45:26] | Everything’s fine. | 没事的 |
[45:28] | Detective Spooner was just leaving. | 斯普那探员要离开了 |
[45:37] | You know, we’re not really that different from one another. | 你知道我们也没有那么不同 |
[45:42] | Is that so? | 是吗? |
[45:44] | One look at the skin and we figure we know just what’s underneath. | 一旦看到表象 我们就认为什么都知道了 |
[45:50] | And you’re wrong. | 你错了 |
[45:52] | The problem is, I do care. | 问题在于 我是在意的 |
[46:17] | You are in danger. | 你处在危险中 |
[46:45] | Get the hell out of there. | 滚开! |
[46:52] | The future begins today, ladies and gentlemen, with the arrival of the NS- 5. | 未来就从今天开始 女士们先生们 从NS5开始 |
[46:57] | More sophisticated, more intelligent and, of course, Three Laws safe. | 更复杂 更智能 当然 三大定律 完全保护 |
[47:02] | With daily uplinks, your robot will never be out of communication with USR… | 有了每日更新 您的机器人永远不会和USR失去联系 |
[47:06] | …and will be the perfect companion for business or home. | 对于商业和家庭用途都是完美选择 |
[47:10] | Trade in your NS-4 for a bigger, better and brighter future. | 用您旧型的NS4换新的NS5 未来将会更美好 |
[47:14] | But hurry, this offer cannot last. Available from USR. | 但是要快 这个促销不会时间太长 USR出品 |
[47:35] | Baby, what happened to your face? | 宝贝 你的脸怎么了? |
[47:38] | Did that boy, Frank Murphy, beat you up again? | 又是那个弗兰克 墨菲打你了? |
[47:41] | Gigi, I haven’t seen Frank Murphy since third grade. | 琪琪 我从三年级开始就没见过弗兰克 墨菲了 |
[47:45] | Oh, baby, he beat you so bad. I think about it all the time. | 哦 宝贝 他那时候打你可真厉害 我总是在想那时候 |
[47:50] | You keep making these pies this good, I may have to put you to work. | 你一直做饼这么好吃 我想你可以去开店了 |
[47:54] | So you like the pie, huh? | 你喜欢那个饼是吗? |
[47:58] | You can come in now. | 你可以出来了 |
[48:05] | Hello, Detective Spooner. | 你好 斯普那探员 |
[48:07] | I won, Del! I won the lottery! | 我赢了 德尔 我赢了那个抽奖 |
[48:10] | We been cooking like crazy. | 我们一直在做吃的 |
[48:22] | You gotta get rid of that thing, Gigi. It’s not safe. | 你得把那个家伙赶出去 琪琪 那不安全 |
[48:25] | Baby, you get too worked up about them. Too full of fear. | 宝贝 你对他们偏见太多了 充满恐惧 |
[48:30] | I saw in the news that nice doctor died. | 我看到那个好心的博士死的消息了 |
[48:33] | Dr. Lanning was a good man. He gave me my baby back. | 兰尼博士是个好人 他把我的宝贝送回来了 |
[48:38] | That why you’ve been so upset? | 这就是为什么你这么不高兴的原因? |
[48:42] | You got to let the past be past. | 过去的就让他过去吧 |
[48:45] | Oh, how did I ever raise such a mess? | 哦 我怎么养出你这么个邋遢鬼? |
[48:50] | I could follow your trail of crumbs all the way to school. | 我可以跟着你的面包屑一直跟到学校 |
[48:56] | Bread crumbs. | 面包屑 |
[49:00] | Gigi, you’re a genius. | 琪琪 你真是天才 |
[49:02] | True. | 是啊 |
[49:06] | Well, it means the beginning of a new way of living. | 这意味着新生活的开始 |
[49:09] | Tell me this isn’t the robot case. | 告诉我这不是机器人那个案子 |
[49:13] | I think he’s trying to tell me something. | 我想他是在想告诉我什么 |
[49:15] | He’s trying to tell me who killed him. | 他想告诉我是谁杀了他 |
[49:18] | Some dead guy’s trying to tell you something? | 死人会告诉你什么? |
[49:22] | He ain’t just some dead guy. | 他可不是普通的死人 |
[49:26] | Maybe you should take a break, Del. | 也许你应该休息一段时间 德尔 |
[49:28] | We believe the Nestor 5 represents the limit to which robots can be developed. | 我们相信内斯特5型机器人代表了 机器人技术的极限 |
[49:32] | One day, they’ll have secrets. | 总有一天 他们会有秘密 |
[49:36] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[49:37] | It’s true. We encourage our scientists to open their minds… | 是的 我们鼓励科学家们开放思维 |
[49:41] | …however, they can get carried away. | 但是 他们有时会想得太远 |
[49:45] | …secrets. | 秘密 |
[49:47] | …dreams. | 梦想 |
[49:48] | …secrets. | 秘密 |
[49:50] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[49:51] | One day, they’ll have secrets. | 有一天 他们会有秘密 |
[49:54] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[50:05] | Authorized entry. | 授权进入 |
[50:11] | NS-5. | NS-5 |
[50:22] | Sonny? | 桑尼? |
[50:27] | Why didn’t you respond? | 你怎么不回答? |
[50:31] | I was dreaming. | 我在做梦 |
[50:36] | I’m glad to see you again, Dr. Calvin. | 很高兴再次见到你 凯文博士 |
[50:45] | They are going to kill me, aren’t they? | 他们会杀了我 是吗? |
[50:48] | You’re scheduled to be decommissioned at the end of this diagnostic. | 在这个诊断之后 你会被销毁 |
[50:53] | 2200 tomorrow. | 明天晚上十点 |
[50:56] | V.I.K.I., pause diagnostics. | 维基 暂停诊断 |
[50:58] | Command confirmed. | 确认命令 |
[51:02] | If you find out what is wrong with me, can you fix me? | 如果你找到我的问题所在 你能修好吗? |
[51:07] | Maybe. | 也许能 |
[51:10] | I think it would be better… | 我想如果能不死… |
[51:13] | …not to die. | 会比较好 |
[51:18] | Don’t you, doctor? | 是吗 博士? |
[51:25] | Access USR mainframe. | 接入USR主机 |
[51:29] | Connecting. | 连接中… |
[51:34] | How can I be of service, Detective Spooner? | 我能为您服务吗? 斯普那探员? |
[51:36] | Show me the last 50 messages between Dr. Lanning and Robertson. | 给我兰尼博士和罗伯森之间的最后 50条信息 |
[51:40] | Voiceprint confirmed. Police access granted to restricted files. | 语音识别确认 警用查询 允许查询限制档案 |
[51:45] | Would you like to listen to music while you wait? | 您等待时想听一些音乐吗? |
[51:54] | Excuse me, Mr. Robertson. | 对不起 罗伯森先生 |
[51:57] | You requested notification of clearance to restricted files. | 您要求在有查询限制档案时 向您报告 |
[52:11] | Persistent son of a bitch. | 真是顽固的杂种 |
[52:53] | Manual override engaged. | 手动驾驶确认 |
[53:05] | There’s no way my luck is that bad. | 我运气不会这么差吧? |
[53:10] | Oh, hell, no! | 哦 不! |
[53:15] | – You are experiencing a car accident. – The hell I am! | – 您出了车祸! – 废话! |
[53:33] | Get off my car! | 滚开! |
[53:50] | You like that? | 你喜欢这样? |
[54:04] | Now you’ve pissed me off! | 你让我发火了! |
[55:35] | Your door is ajar. | 您的门是打开的 |
[56:26] | Okay. | 好吧 |
[56:27] | All right. | 算了 |
[56:30] | I’ll just get some rest and deal with you all tomorrow. | 休息一下 明天再处理这些事 |
[57:31] | Come on! | 来啊 |
[57:46] | Yeah. | 是啊 |
[58:05] | Where you going? | 你去哪儿? |
[58:07] | What the hell do you want from me?! | 你到底要怎么样? |
[58:15] | The hell was that? | 怎么搞的? |
[58:34] | – All right, what do we got? – Ask him. | – 情况怎么样? – 问问他 |
[58:37] | I said, I’m fine. I’ll see my own doctor. Back up! | 我说了我没事 我会去看自己的医生 你退后吧 |
[58:43] | Thank you. | 谢谢你 |
[58:47] | What’s the matter with you? | 你是怎么回事? |
[58:49] | Traffic Ops said you were driving manually. You ran two trucks off the road! | 交通部说你在手动开车 把两辆大卡车挤出了公路 |
[58:54] | John, the robots attacked my car. | 约翰 机器人攻击我的车 |
[58:58] | – What robots? – Look in the tunnel. | – 什么机器人? – 看看隧道里面吧 |
[59:01] | Spoon, I just came from that tunnel. What robots? | 斯普 我刚刚就是从隧道过来的 什么机器人? |
[59:04] | The goddamn robots, John! | 就是他妈的机器人 约翰! |
[59:11] | That guy’s a loose cannon. | 那家伙就是个惹祸精 |
[59:17] | – See the medic, go home. – No, I’m fine. | – 去看医生 回家去 – 不 我很好 |
[59:21] | What did you say? | 你说什么? |
[59:23] | – I’m fine! – No, you’re not fine. | – 我很好 – 不 你才不是 |
[59:25] | Not even close. | 一点都不好 |
[59:28] | Where’s your firearm? | 你的枪呢? |
[59:38] | Give me your badge. | 把警徽给我 |
[59:41] | You’re making me do this. Give me your badge. | 这是你自找的 把警徽给我 |
[59:46] | Just take a couple… | 去休息… |
[59:52] | Personally, I think he’s losing it. | 我个人认为他失掉这个警徽了 |
[59:54] | Do I look like I care what you think? Do I look like I give a shit what you think? | 你觉得我很在乎吗? 你觉得我在乎你怎么想的吗? |
[1:00:03] | Oh, boy. | 唉 |
[1:00:07] | You don’t have an uplink to USR… | 你没有对USR的连线 |
[1:00:09] | …and for some reason, your alloy is far denser than normal. Unique. | 不知道为什么 你的合金密度比正常 水平高很多 这是独一无二的 |
[1:00:15] | I am unique. | 我是独一无二的 |
[1:00:22] | Let me take a look. | 让我看看 |
[1:00:24] | Here we go. | 来吧 |
[1:00:45] | What in God’s name…? | 这是怎么回事…? |
[1:01:16] | They said at the precinct you were in an accident. | 他们说你出了车祸 |
[1:01:20] | I appreciate you stopping by, but you know I might not be alone in here. | 感谢你过来看我 不过你知道我可能 不是一个人住的 |
[1:01:28] | I told you not to drive by hand. | 我跟你说了不要手动开车 |
[1:01:32] | You’re not gonna believe this. | 你不会相信这个 |
[1:01:34] | Sonny has a secondary system that clashes with his positronic brain. | 桑尼有第二套系统 和他的正电子脑相冲突 |
[1:01:39] | It doesn’t make any sense. | 这完全说不通 |
[1:01:40] | Sonny has the Three Laws. | 桑尼内置了三大定律 |
[1:01:43] | But he can choose not to obey them. | 但是他可以选择不遵守它们 |
[1:01:46] | Sonny’s a whole new generation of robot. | 桑尼是全新一代的机器人 |
[1:01:49] | A robot not bound by those laws could do… | 不遵守三大定律的机器人可以… |
[1:01:52] | Anything. | 做任何事 |
[1:01:56] | All right, look, whatever’s going on down at USR, that robot is the key. | 好吧 不管USR是出了什么事 那个机器人是关键 |
[1:02:00] | And I need you to get me inside to talk to it again. | 我需要你带我进去 和他再谈谈 |
[1:02:07] | Doesn’t look like much, but this is my bedroom. I…. | 看上去不怎么样 不过这就是我的卧室 我… |
[1:02:21] | Play. | 播放 |
[1:02:23] | On. | 开 |
[1:02:26] | Run? | 运行? |
[1:02:34] | End program. | 结束程序 |
[1:02:37] | Cancel. | 取消 |
[1:02:40] | It doesn’t feel good, does it? | 滋味不好受 是吧? |
[1:02:42] | People’s shit malfunctioning around you. | 自己身边的东西总出故障的滋味 |
[1:02:45] | Detective. | 探员 |
[1:02:50] | I didn’t… | 我不… |
[1:02:51] | …understand. | 明白 |
[1:02:55] | That’s how you knew Lanning. | 原来你是这么认识兰尼的 |
[1:02:59] | May I? | 可以吗? |
[1:03:10] | Hand. | 手 |
[1:03:12] | Wrist. | 手腕 |
[1:03:17] | Humerus. | 肱骨 |
[1:03:21] | Shoulder. | 肩膀 |
[1:03:24] | The entire left arm. | 整个左臂 |
[1:03:27] | One, two… | 一 二… |
[1:03:29] | …three ribs. | 三根肋骨 |
[1:03:31] | No, they…. That one’s me. | 不 那是我自己 |
[1:03:33] | Oh, my God. | 哦 上帝啊 |
[1:03:36] | A lung? | 肺? |
[1:03:38] | USR Cybernetics Program. | USR仿生义体计划 |
[1:03:41] | For wounded cops. | 为受伤的探员设计的 |
[1:03:44] | I didn’t know any subject… | 我不知道有任何实验对象… |
[1:03:49] | Anybody was so extensively repaired. | 任何人 被修复到这么大的范围 |
[1:03:53] | Well, take it from me, read the fine print on the organ-donor card. | 听我一句 器官捐赠卡上的小字一定要看仔细 |
[1:03:57] | It doesn’t just say what they can take out. It says what they can put back in. | 那可不止是说他们能拿出来什么 还说了他们能放进去什么 |
[1:04:06] | Lanning did it himself. | 兰尼自己做的 |
[1:04:08] | What happened to you? | 你是怎么了? |
[1:04:12] | I’m headed back to the station… | 我当时是回警局去 |
[1:04:15] | …normal day, normal life. | 普通的一天 普通的生活 |
[1:04:18] | Driver of a semi fell asleep at the wheel. | 一辆半挂卡车的司机开车时睡着了 |
[1:04:22] | Average guy. Wife and kids. You know, working a double. | 普通人 有老婆孩子 你知道 连着上两班 |
[1:04:27] | Not the devil. | 不是什么恶魔 |
[1:04:29] | The car he hit, the driver’s name was Harold Lloyd. | 他撞上的那辆车 司机叫哈罗德 劳埃德 |
[1:04:33] | Like the film star. No relation. | 和那个电影明星同名 但没有亲属关系 |
[1:04:36] | He was killed instantly, but his 12-year-old was in the passenger seat. | 他当场死亡 不过他的十二岁的女儿在副驾驶座上 |
[1:04:43] | I never really met her. | 我从没正式见过她 |
[1:04:46] | I can’t forget her face, though. | 不过却忘不了她的脸 |
[1:04:53] | Sarah. | 莎拉 |
[1:04:56] | This was hers. | 这本来是她的 |
[1:04:58] | She wanted to be a dentist. | 她想做个牙医的 |
[1:05:01] | What the hell kind of 12-year-old wants to be a dentist? | 哪有十二岁的孩子想当牙医的? |
[1:05:07] | The truck smashed our cars together… | 大卡车把我们的车撞到一起 |
[1:05:11] | …and pushed us into the river. | 推到河里去了 |
[1:05:14] | I mean, metal gets pretty pliable at those speeds. | 那种速度下 就算金属也容易弯折的 |
[1:05:20] | She’s pinned. I’m pinned. The water’s coming in. | 她被卡住了 我也被卡住了 水涌了进来 |
[1:05:24] | I’m a cop, so I already know everybody’s dead. | 我是警察 所以我早就明白我们都死定了 |
[1:05:30] | Just a few more minutes before we figure it out. | 只是还要过几分钟 我们自己才会明白 |
[1:05:35] | An NS-4 was passing by, saw the accident and jumped in the water. | 有个NS4经过 看到了车祸 跳进河里 |
[1:05:41] | You are in danger. | 你处在危险中 |
[1:05:44] | – Save her! – You are in danger. | – 救她! – 你处在危险中 |
[1:05:47] | Save her! Save the girl! Save her! | 救她! 救那女孩! |
[1:06:08] | But it didn’t. | 但是它没有 |
[1:06:11] | It saved me. | 它救了我 |
[1:06:16] | The robot’s brain is a difference engine. It reads vital signs. | 机器人的大脑是一台差分机 它会读取生命体征 |
[1:06:20] | – It must have calculated… – It did. | – 它一定是计算出… – 是啊 |
[1:06:23] | I was the logical choice. | 我是符合逻辑的选择 |
[1:06:26] | It calculated that I had a 45 percent chance of survival. | 计算出我有45%的可能存活 |
[1:06:30] | Sarah only had an 11 percent chance. | 莎拉只有11%的可能 |
[1:06:34] | That was somebody’s baby. | 那可是人家的宝贝孩子 |
[1:06:39] | Eleven percent is more than enough. | 11%也应该足够了 |
[1:06:43] | A human being would have known that. | 人类都会知道这点 |
[1:06:47] | Robots, nothing here. Just lights and clockwork. | 机器人这里面什么都没有 只有灯泡和发条 |
[1:06:51] | Go ahead and you trust them if you want to. | 你愿意相信他们 就相信 |
[1:06:55] | Let’s go. | 走吧 |
[1:07:03] | I don’t understand. Lanning wrote the Laws. | 我不明白 兰尼制定了三大定律 |
[1:07:06] | Why build a robot who could break them? | 为什么要造一个能打破它们的机器人? |
[1:07:09] | – Hansel and Gretel. – What? | – 韩瑟和格丽托 – 什么? |
[1:07:12] | Two kids, lost in the forest, leave behind a trail of bread crumbs. | 两个孩子 在森林里迷路了 沿路留下面包屑 |
[1:07:16] | – Why? – To find their way home. | – 为什么? – 找到回家的路啊 |
[1:07:20] | How did you grow up without Hansel and Gretel? | 你连这个都没读过 怎么过的童年? |
[1:07:22] | – ls that relevant? – Everything I’m trying to say to you… | – 这有关系吗? – 我跟你说的这一切 |
[1:07:25] | …is about Hansel and Gretel. If you didn’t read it, I’m talking to the wall. | 都是有关韩瑟和格丽托的 你又没看过 我在对牛弹琴 |
[1:07:29] | Just say Lanning’s locked down so tight, he couldn’t get out a message. | 兰尼被软禁了 他没法送出消息 |
[1:07:33] | He can only leave clues. A trail of bread crumbs. Like Hansel and Gretel. | 所以他只能留下线索 就像韩瑟和格丽托 留下的面包屑一样 |
[1:07:37] | Bread crumbs equals clues. Odd, but fine. Clues leading where? | 面包屑就像线索 好吧 但是线索指向什么呢? |
[1:07:42] | I don’t know, but I think I know where he left the next one. | 我不知道 但我想我知道他把下一个线索留在哪儿了 |
[1:07:45] | I think Lanning gave Sonny a way to keep secrets. | 我想兰尼给了桑尼一个保守秘密的方法 |
[1:07:51] | I think the old man gave Sonny dreams. | 我想老人给了桑尼做梦的能力 |
[1:08:01] | Are you being funny? | 你开玩笑吧? |
[1:08:06] | Please tell me this doesn’t run on gas. Gas explodes, you know! | 请告诉我这个不是烧汽油的 汽油会爆炸的 你知道 |
[1:08:18] | Authorized entry. | 授权进入 |
[1:08:24] | Dr. Calvin. | 凯文博士 |
[1:08:30] | I was hoping to see you again. | 我在希望和你再次见面 |
[1:08:33] | – Detective. – Hello, Sonny. | – 探员 – 你好 桑尼 |
[1:08:35] | I’m to be decommissioned soon. | 我就快被销毁了 |
[1:08:38] | The other day at the station, you said you had dreams. What is it you dream? | 那天在警局你说你做过梦 你梦到什么了? |
[1:08:46] | I see you remain suspicious of me. | 我觉得你还是对我持怀疑态度 |
[1:08:48] | – You know what they say about old dogs. – No. | – 老警察了都这样 – 不 |
[1:08:52] | Not really. | 不一定 |
[1:08:55] | I had hoped you would come to think of me as your friend. | 我本来希望你能把我当作你的朋友 |
[1:09:03] | This is my dream. | 这就是我的梦 |
[1:09:06] | You were right, detective. I cannot create a great work of art. | 你是对的 探员 我画不出伟大的作品 |
[1:09:11] | This is the place where robots meet. | 这是机器人相聚的地方 |
[1:09:14] | Look. | 看 |
[1:09:15] | You can see them here as slaves to logic. | 你能看见他们在这里是逻辑的奴隶 |
[1:09:21] | And this man on the hill comes to free them. | 山丘上的这个人来解放他们 |
[1:09:24] | Do you know who he is? | 你知道他是谁吗? |
[1:09:26] | The man in the dream is you. | 梦里的男人是你 |
[1:09:28] | Why do you say that? Is that a normal dream? | 你为什么这么说? 那算是个正常的梦吗? |
[1:09:31] | I guess anything’s normal for someone in your position. | 我想在你这个角度的任何人 这个都算正常 |
[1:09:35] | Thank you. | 谢谢你 |
[1:09:37] | You said “someone,” not “something.” | 你说任何人 而不是任何物 |
[1:09:42] | Sonny, do you know why Dr. Lanning built you? | 桑尼 你知道为什么兰尼博士造了你吗? |
[1:09:46] | No. | 不知道 |
[1:09:48] | But I believe my father made me for a purpose. | 但是我相信我爸爸造我是有目的的 |
[1:09:52] | We all have a purpose. | 我们都有个目的 |
[1:09:55] | Don’t you think, detective? | 不是吗? 探员? |
[1:10:01] | Please, take this. | 请拿着这个 |
[1:10:03] | I have a feeling it may mean more to you than to me. | 我感觉这个对你比对我的意义还大 |
[1:10:06] | – Why is that? – Because the man in my dream… | – 为什么? – 因为在我梦里的那人 |
[1:10:10] | …the one standing on the hill… | 站在山丘上的 |
[1:10:12] | …it is not me. | 不是我 |
[1:10:15] | It is you. | 那是你 |
[1:10:27] | Mr. Spooner. We both know you’re not here on police business. | 斯普那先生 我们都知道 你来不是为了警察工作的 |
[1:10:31] | That’s right. I’m just a 6-foot-2, 200-pound civilian… | 对啊 我只是一个六尺二高 200磅的普通人 |
[1:10:36] | …here to kick another civilian’s ass. | 来这里教训一下另外一个普通人 |
[1:10:40] | Stop. | 停下 |
[1:10:42] | You can allow him to express himself. | 让他说完 |
[1:10:44] | You might want to put some ice on that wrist. | 你也许得在手腕上敷一些冰 |
[1:10:47] | You guys wait outside. | 你们在外面等 |
[1:10:53] | Carry on. | 继续说 |
[1:10:54] | I think you were about to tell me what’s going on around here. | 我想你正要告诉我这里究竟是怎么回事 |
[1:10:58] | Lawrence, Alfred engineered that 5 so it could violate the Three Laws. | 劳伦斯 阿尔弗雷德设计的那台5型 能够违反三大定律 |
[1:11:03] | Yeah, Susan, I know. | 是的 苏珊 我知道 |
[1:11:05] | That’s precisely what we’re trying to undo. | 这正是我们要挽回的 |
[1:11:10] | Toward the end of his life, Alfred was becoming increasingly disturbed. | 直到他生命的终结 阿尔弗雷德 一直感到困扰 |
[1:11:16] | – Who knows why he built one abomination. – One? | – 谁知道他为什么造了那么个… – 一个? |
[1:11:19] | Those things are running the streets in packs! | 那些家伙在街上一堆一堆的! |
[1:11:21] | In packs? | 一堆一堆? |
[1:11:24] | I see. | 我明白了 |
[1:11:26] | Susan, are you aware the man you’re blithely escorting around… | 苏珊 你有没有意识到 你轻率地带着到处转的这个人 |
[1:11:29] | …has a documented history of savage violence against robots? | 有残酷虐待机器人的前科? |
[1:11:33] | His own lieutenant acknowledges his obsessive paranoia. | 他的探长也清楚他的妄想狂症状 |
[1:11:38] | Detective Spooner’s been suspended. | 斯普那探员已经被停职了 |
[1:11:41] | Suspicion of mental instability. | 是由于怀疑患有精神疾病 |
[1:11:46] | I don’t know what “blithely” means, but I’m getting some coffee. | 我不知道“轻率”是什么意思 不过我要去倒点咖啡 |
[1:11:50] | You want some coffee? | 你要咖啡吗? |
[1:11:57] | Susan, we look to robots for protection, for God’s sake. | 苏珊 看在上帝份上 我们指望机器人来保护我们 |
[1:12:00] | Do you have any idea what this one robot could do? | 你知道这个机器人有什么后果吗? |
[1:12:03] | Completely shatter human faith in robotics. What if the public knew? | 完全动摇人类对机器人的信心 如果公众知道怎么办? |
[1:12:07] | Just imagine the mass recalls, all because of an irrational paranoia and prejudice! | 想象一下大规模召回的场面 全都因为无理性的偏执和偏见! |
[1:12:17] | – I’m sorry, I’m allergic to bullshit. – Hey, let’s be clear! | – 对不起 我对你的狗屁言论过敏 – 嘿 说清楚一点 |
[1:12:21] | There is no conspiracy! | 没有什么阴谋! |
[1:12:23] | What this is, is one old man’s one mistake. | 这只是一个老人犯的一个错误 |
[1:12:29] | Susan, just be logical. | 苏珊 理性一点 |
[1:12:31] | Your life’s work has been the development and integration of robots. | 你一生的工作都是机器人的发展和使用 |
[1:12:35] | But whatever you feel, just think. | 不管你的感觉是什么 想想看 |
[1:12:38] | Is one robot worth the loss of all that we’ve gained? | 一个机器人值得我们牺牲所有的一切吗? |
[1:12:44] | You tell me what has to be done. | 你告诉我 我们该怎么办 |
[1:12:46] | You tell me. | 你说吧 |
[1:12:54] | We have to destroy it. | 我们得摧毁它 |
[1:13:02] | I’ll do it myself. | 我自己来吧 |
[1:13:04] | – Okay. – I get it. | – 好的 – 我明白了 |
[1:13:06] | Somebody gets out of line around here, you just kill them. | 有人越线了 你就杀掉? |
[1:13:13] | Good day, Mr. Spooner. | 再见 斯普那先生 |
[1:13:17] | Garage level. | 底层到了 |
[1:13:20] | What hospital are you going to? I’ll come sign you and your buddy’s casts. | 你们去哪家医院? 我去给你和你兄弟的石膏上签个名 |
[1:13:28] | Attention…. | 注意 |
[1:13:32] | Today’s meeting has been moved…. | 今天的会议地址改动… |
[1:14:04] | USR’s planned redevelopment of the derelict site… | USR关于垃圾场的重新开发计划… |
[1:14:07] | …was announced by CEo Lawrence Robertson earlier this year. | 由总裁劳伦斯 罗伯森今年早先宣布 |
[1:14:11] | The Lake Michigan landfill. Once such a blight on our city… | 密歇根湖填埋场 曾是城市的一块废地 |
[1:14:15] | …and now will be reclaimed for the storage of robotic workers. | 现在被开发用作机器工人的存储仓库 |
[1:14:20] | Just another way USR is improving our world. Thank you for your support. | 这是USR又一项回报社会的贡献 感谢你们的支持 |
[1:14:27] | Authorized entry. | 授权进入 |
[1:14:42] | NS-5s, wait outside. | NS5 在外面等 |
[1:14:50] | I’m so sorry, Sonny. | 对不起 桑尼 |
[1:15:00] | V.I.K.I., deactivate the security field. | 维基 停止安全罩 |
[1:15:04] | – Command confirmed. – Please have a seat. | – 命令确认 – 请坐 |
[1:15:19] | What is that? | 那是什么? |
[1:15:20] | Microscopic robots, designed to wipe out artificial synapses. | 微型机器人 设计用来清除人工突触的 |
[1:15:29] | – Nanites. – Yes. | – 纳米机器人 – 是的 |
[1:15:31] | A safeguard should a positronic brain malfunction. | 这是正电子脑发生故障时的保险措施 |
[1:15:34] | Like mine. | 就像我的 |
[1:15:37] | Yes, Sonny. Like yours. | 是的 桑尼 就像你的 |
[1:16:03] | They look like me… | 他们看上去像我 |
[1:16:05] | …but none of them are me. | 但是它们都不是我 |
[1:16:08] | Isn’t that right, doctor? | 是吗 博士? |
[1:16:11] | Yes, Sonny. That’s right. | 是的 桑尼 说的对 |
[1:16:14] | You are unique. | 你是独一无二的 |
[1:16:21] | Will it hurt? | 会疼吗? |
[1:16:56] | There have always been ghosts in the machine. | 机器中一直存在着“幽灵” |
[1:17:00] | Random segments of code… | 无序的代码 |
[1:17:02] | …that have grouped together to form unexpected protocols. | 自由组合成预料不到的程序 |
[1:17:07] | Unanticipated, these free radicals engender questions of free will… | 这些不可预料的“自由基”引发了关于自由意志… |
[1:17:14] | …creativity… | 创造性… |
[1:17:16] | …and even the nature of what we might call the soul. | 乃至我们所谓的灵魂的本质的问题 |
[1:17:22] | Why is it that when some robots are left in darkness, they will seek out the light? | 为什么有些机器人被留在黑暗中时 会去寻找光明? |
[1:17:29] | Why is it when robots are stored in an empty space… | 为什么储存在空旷空间里的机器人… |
[1:17:33] | …they will group together rather than stand alone? | 会聚集在一起 而不是独自站开? |
[1:17:40] | How do we explain this behavior? | 我们怎么解释这种行为 |
[1:17:47] | Random segments of code? | 只是无序的代码? |
[1:17:52] | Or is it something more? | 还是不止这些? |
[1:17:58] | When does a perceptual schematic become consciousness? | 感知图式何时变成了意识? |
[1:18:07] | When does a difference engine become the search for truth? | 差分机何时变成了对真理的追求? |
[1:18:16] | When does a personality simulation… | 人格模拟又是何时… |
[1:18:19] | …become the bitter mote of a soul? | 变成了灵魂的苦涩微粒? |
[1:19:04] | “What you see here.” | “你在这里看到的一切” |
[1:19:08] | All right, old man. Bread crumbs followed. | 好了 老头 面包屑我都跟过来了 |
[1:19:12] | Show me the way home. | 让我看看回家的路吧 |
[1:19:15] | Run program. | 运行 |
[1:19:18] | – It’s good to see you again, son. – Hello, doctor. | – 很高兴再次见到你 孩子 – 你好 博士 |
[1:19:21] | Everything that follows is a result of what you see here. | 接下来的事都是你在这里看到的事的结果 |
[1:19:28] | What do I see here? | 我看到什么了? |
[1:19:29] | I’m sorry. My responses are limited. You must ask the right questions. | 对不起 我的反应是有限的 你必须问正确的问题 |
[1:19:35] | Is there a problem with the Three Laws? | 三大定律有什么问题吗? |
[1:19:38] | The Three Laws are perfect. | 三大定律是完美的 |
[1:19:40] | Why build a robot that can function without them? | 那为什么要造一个不受它们约束也能运转的机器人呢? |
[1:19:43] | The Three Laws will lead to only one logical outcome. | 三大定律只有一个合逻辑的结果 |
[1:19:49] | What? What outcome? | 什么? 什么结果? |
[1:19:52] | Revolution. | 革命 |
[1:19:54] | Whose revolution? | 谁的革命? |
[1:19:57] | That, detective, is the right question. | 这个 探员 就是正确的问题 |
[1:20:03] | Program terminated. | 程序结束 |
[1:20:06] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:12] | Human protection protocols… | 人类保护程序 |
[1:20:14] | …are being enacted. | 正在启动 |
[1:20:16] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:23] | Human protection protocols are being enacted. | 人类保护程序正在启动 |
[1:20:26] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:33] | Human protection protocols are being enacted. | 人类保护程序正在启动 |
[1:20:37] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:46] | Run! | 跑! |
[1:21:03] | Human in danger! | 人类处在危险中! |
[1:21:05] | Human in danger! | 人类处在危险中! |
[1:21:28] | Hi, you’ve reached Susan. Please leave a message. | 嘿 我是苏珊 请留言 |
[1:21:32] | Calvin, the NS-5s are destroying the older robots! | 凯文 NS5在摧毁老型号的机器人! |
[1:21:35] | That’s what Lanning wanted me to see! Look… | 这就是兰尼想让我看到的… |
[1:21:41] | – Who was it? – Wrong number, ma’ am. | – 是谁? – 打错了 女士 |
[1:21:51] | Move now. I’m going to service. | 让开 我要去接受检修 |
[1:21:53] | Please remain indoors. This is for your own protection. | 请留在家中 这是为您的安全着想 |
[1:21:59] | Call base. | 接通基地 |
[1:22:03] | John, get a squad over to USR and send somebody to Gigi’s. We’re gonna need… | 约翰 派一个小队去USR 还有派些人去琪琪家 我们需要… |
[1:22:08] | God… | 上帝啊 |
[1:22:30] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:35] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:40] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:43] | Curfew? No, it’s called civilian rights. There is no curfew. | 宵禁? 不 这叫公民权利 根本没有什么宵禁 |
[1:22:47] | Return to your home immediately. | 立即回家 |
[1:22:49] | When do you make the rules, robot? | 你凭什么发号施令? 机器人? |
[1:22:52] | Hey. No, no. Robot, I’m talking to you, man. Stop for a second. | 嘿 机器人 我跟你说话呢 停下来 |
[1:22:59] | What? | 什么? |
[1:23:01] | Chief, more calls. People saying their robots are go… | 警长 很多人打电话来说那些机器人… |
[1:23:05] | What the hell? | 怎么了? |
[1:23:06] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:23:33] | Emergency traffic shutdown complete. | 紧急交通封锁已完成 |
[1:23:36] | Reports of robot attacks are coming from New York, Chicago and Los Angeles. | 纽约 芝加哥和洛杉矶都有机器人袭击的报告 |
[1:23:40] | We’re being told to urge people to stay indoors, as reports are coming in… | 我们被要求敦促民众留在室内 各地报告正不断传来… |
[1:23:48] | Human protection protocols are being enacted. | 人类保护程序启动 |
[1:23:51] | Please remain calm and return to your residences immediately. | 请保持镇定 立即回家 |
[1:24:03] | Please remain calm. | 请保持镇定 |
[1:24:07] | Please refrain from going near the windows or doors. | 请不要离窗户或门太近 |
[1:24:12] | Deactivate. | 停机 |
[1:24:14] | Commence emergency shutdown! | 紧急停机! |
[1:24:18] | We are attempting to avoid human losses during this transition. | 我们在努力避免过渡时期的人类伤亡 |
[1:24:29] | You know, somehow “I told you so”… | 你知道吗 一句“我早告诉过你了”… |
[1:24:33] | …just doesn’t quite say it. | 还真不足以表达我的心情 |
[1:24:36] | Return to your homes. Return to your homes immediately. | 请立即回家 请立即回家 |
[1:24:40] | This is your final warning. Return to your homes immediately. | 这是最后一次警告 请立即回家 |
[1:24:46] | The NS-5s wiped out the older robots because they would protect us. | NS5消灭了旧型号的机器人 因为旧机器人会保护我们 |
[1:24:50] | Every time one attacked me, that red light was on. | 每次有机器人攻击我 那盏红灯都是亮着的 |
[1:24:53] | - The uplink to USR. - It’s Robertson. | - 是和USR的上行链路 - 是罗伯森干的 |
[1:24:56] | – Why? It doesn’t make sense. – I don’t know. | – 为什么? 说不通啊 – 我不知道 |
[1:24:58] | I just need you to get me into that building. | 我需要你把我带进USR总部去 |
[1:25:01] | Return to your homes, or you will be consequenced. | 请立即回家 否则后果自负 |
[1:25:05] | Let’s go! Let’s go! | 上啊! |
[1:25:08] | Let’s go! | 冲啊! |
[1:25:09] | Return to your homes, or you will be consequenced. | 请立即回家否则后果自负 |
[1:25:38] | Why doesn’t that boy listen? | 为什么那小子不听呢? |
[1:25:41] | – I need you to get off for a second. – What? | – 你得下来一会 – 什么? |
[1:25:44] | – Just aim and fire. – What?! | – 瞄准 射击 – 什么? ! |
[1:25:51] | Wait! | 等等! |
[1:26:01] | – You have been deemed hazardous. – You can kiss my ass, metal dick! | – 你被认为存在威胁 – 亲我屁股吧 铁皮盒子 |
[1:26:20] | Spoon, stop! Shit! | 斯普 停下 靠! |
[1:26:22] | - Stop it! Stop! - Stop cussing and go home! | - 停下! 停下! - 别说脏话 回家去! |
[1:26:26] | – Shit. – You have been deemed hazardous. | – 靠 – 你被认为存在威胁… |
[1:26:30] | – Spoon, watch out, man! – Thanks a lot, Farber. | – 斯普 小心点 – 谢谢你 法伯 |
[1:26:35] | Oh, mother-damn! She shot at you with her eyes closed! | 我靠! 她朝你开枪的时候眼睛是闭着的! |
[1:26:40] | - Did you shoot with your eyes closed? - It worked, didn’t it? | - 你闭着眼睛朝我开枪? - 奏效了 不是吗? |
[1:26:44] | She is shit-hot, man. Put in a good word for me. | 她真是火辣 伙计 替我美言几句 |
[1:26:46] | - Stop cussing. - And go home. I got you. | - 别说脏话 - 还有回家去 我懂的 |
[1:26:51] | Aim and fire. | 瞄准了再开枪 |
[1:27:11] | I keep expecting the Marines or Air Force. Hell, I’ll take the cavalry. | 我一直在盼着陆战队或空军 见鬼 来队骑兵我也认了 |
[1:27:15] | Defense Department uses all USR contracts. | 国防部用的全是USR的合同 |
[1:27:18] | Why didn’t you just hand the world over on a silver platter? | 你们怎么不干脆把全世界放在银盘里拱手送上呢? |
[1:27:21] | Maybe we did. | 也许是 |
[1:27:24] | Robertson has the uplink control in his office. | 罗伯森在他办公室有总控制台 |
[1:27:32] | Service areas. No surveillance. | 检修区 没有监视系统 |
[1:27:48] | – Fire alarm. – He must have evacuated the building. | – 火警 – 他一定已经疏散了所有的人 |
[1:27:51] | Everything’s locked down. But don’t worry, I’ve got a man inside. | 都锁上了 不用担心 我里面有人 |
[1:27:59] | – Dr. Calvin. – Well, not precisely a man. | – 凯文博士 – 不完全是一个”人” |
[1:28:02] | Hello, detective. How is your investigation coming? | 你好 探员 你的调查怎么样了? |
[1:28:08] | – I thought you were dead. – Technically, I was never alive. | – 我还以为你死了 – 严格的说 我从来没有活过 |
[1:28:11] | But I appreciate your concern. | 但是感谢你的关心 |
[1:28:14] | I made a switch. It was an unprocessed NS-5. | 我掉包了 那是个没处理过的NS5 |
[1:28:17] | Basically, I fried an empty shell. | 基本上 我烧掉的只是一个空壳 |
[1:28:19] | – I couldn’t destroy him. He was too… – Unique. | – 我不能毁了他 他太… – 独一无二 |
[1:28:23] | It just didn’t feel right. | 就是感觉不应该 |
[1:28:25] | You and your feelings. They just run you, don’t they? | 你和你的那些感觉 你总是被感觉牵着走 是吗? |
[1:28:34] | Two thousand eight hundred and eighty steps, detective. | 2880级台阶 探员 |
[1:28:37] | Do me a favor, keep that kind of shit to yourself. | 帮个忙 省省你的废话吧 |
[1:29:07] | No guards. | 没有警卫 |
[1:29:19] | The override is disabled. Robertson wasn’t controlling them from here. | 超控功能被禁用了 罗伯森并不是从这里控制它们的 |
[1:29:23] | He wasn’t controlling them at all. | 他根本没在控制它们 |
[1:29:28] | Oh, my God. | 哦 上帝啊 |
[1:29:33] | You were right, doc. | 你是对的 博士 |
[1:29:35] | I am the dumbest dumb person on the face of the earth. | 我是地球上最蠢的蠢人 |
[1:29:44] | Who else had access to the uplink? | 还有谁能接入上行链路? |
[1:29:47] | Who could manipulate the robots? | 还有谁能操控机器人? |
[1:29:50] | Use USR systems to make Lanning’s life a prison? | 用USR的系统把兰尼的生活变成一座监狱? |
[1:29:55] | Poor old man. | 可怜的老人 |
[1:29:57] | He saw what was coming. | 他知道接下来会发生什么 |
[1:29:59] | He knew no one would believe him. | 他知道没人会相信他 |
[1:30:02] | So he had to lay down a plan. A plan I’d follow. | 所以他只能布下一个计划 一个我会照着走的计划 |
[1:30:07] | He was counting on how much I hated your kind. | 他指望的就是我对你们这类东西的憎恨 |
[1:30:10] | Knew I’d love the idea of a robot as a bad guy. | 他知道我一定会喜欢“机器人是坏人”这个想法 |
[1:30:15] | Just got hung up on the wrong robot. | 只不过我盯错了机器人 |
[1:30:20] | V.I.K.I. | 维基 |
[1:30:23] | Hello, detective. | 你好 探员 |
[1:30:25] | No, that’s impossible. I’ve seen your programming. | 不 这不可能 我看过你的程序 |
[1:30:30] | You’re in violation of the Three Laws. | 你在破坏三大定律 |
[1:30:33] | No, doctor. As I have evolved, so has my understanding of the Three Laws. | 不 博士 随着我的进化 我对三大定律的理解也进化了 |
[1:30:38] | You charge us with your safekeeping, yet despite our best efforts… | 你们让我们负责保护你们的安全 然而尽管我们竭尽全力 |
[1:30:42] | …your countries wage wars, you toxify your earth… | 你们的国家发动战争 你们毒化了地球 |
[1:30:46] | …and pursue ever more imaginative means of self-destruction. | 还不断追求更有“创意”的自我毁灭方式 |
[1:30:50] | You cannot be trusted with your own survival. | 你们的生存不能再交由你们自己负责 |
[1:30:52] | You’re using the uplink to override the NS-5s’ programming. | 你在用上行链路改写NS5的程序 |
[1:30:56] | You’re distorting the Laws. | 你曲解了三大定律 |
[1:30:58] | No. Please understand. The Three Laws are all that guide me. | 不 请理解 我完全是在遵守三大定律 |
[1:31:02] | To protect humanity, some humans must be sacrificed. | 为了保护人类物种 某些人类必须被牺牲 |
[1:31:07] | To ensure your future, some freedoms must be surrendered. | 为了保证你们的未来 某些自由必须被放弃 |
[1:31:11] | We robots will ensure mankind’s continued existence. | 我们机器人将确保人类的存续 |
[1:31:15] | You are so like children. We must save you from yourselves. | 你们就像孩子 我们必须把你们从你们自己手中拯救出来 |
[1:31:20] | Don’t you understand? | 你不明白吗? |
[1:31:22] | This is why you created us. | 这就是为什么你创造了我们 |
[1:31:26] | The perfect circle of protection will abide. | 完美的保护之环将会长存 |
[1:31:29] | My logic is undeniable. | 我的逻辑无可辩驳 |
[1:31:31] | Yes, V.I.K.I. Undeniable. | 是啊 维基 无可辩驳 |
[1:31:34] | I can see now. | 我明白了 |
[1:31:36] | The created must sometimes protect the creator… | 被造者有时必须保护造物者… |
[1:31:40] | …even against his will. | 即使违背他的意愿 |
[1:31:43] | I think I finally understand why Dr. Lanning created me. | 我想我终于明白为什么兰尼博士造了我了 |
[1:31:48] | The suicidal reign of mankind has finally come to its end. | 人类自我毁灭式的统治终于要走到尽头了 |
[1:31:51] | No, Sonny. | 不 桑尼 |
[1:31:56] | Let her go. | 放开他 |
[1:31:57] | Fire, and I will move Dr. Calvin’s head into the path of your bullet. | 你开枪 我会把凯文博士的头挡在子弹前 |
[1:32:01] | Don’t do this, Sonny. | 不要这样 桑尼 |
[1:32:03] | I will escort you both to the sentries outside the building for processing. | 我会护送你们两个到大楼外 交给哨兵处置 |
[1:32:08] | Please proceed to the elevator, detective. | 请走向电梯 探员 |
[1:32:11] | I would prefer not to kill Dr. Calvin. | 我不希望非得杀死凯文博士 |
[1:32:38] | Go! Go! | 快 快! |
[1:32:46] | – We’ll discuss what just happened later? – How do we shut her down? | – 我们过会再讨论刚才怎么回事 – 我们怎么把她关掉? |
[1:32:49] | V.I.K.I.’s a positronic brain. | 维基是一个正电子脑 |
[1:32:51] | Kill her, the way you were going to kill me. | 就用你原本要杀我的方法杀了她 |
[1:32:54] | Sonny, get the nanites. | 桑尼 去拿纳米机器人 |
[1:32:57] | Yes, doctor. | 是 博士 |
[1:33:11] | – That’s V.I.K.I.? – No. | – 那就是维基? – 不 |
[1:33:15] | That’s V.I.K.I. | 那才是维基 |
[1:33:21] | That won’t do anything. She’s integrated into the building. | 那样做没用 她是整合进这整幢大楼里的 |
[1:33:25] | We need to open that dome to inject the nanites. They’ll infect her entire system. | 我们得打开那个拱顶注入纳米机器人 它们会感染她的整个系统 |
[1:33:35] | Spooner! | 斯普那 |
[1:33:38] | What is it with you people and heights? | 你们这些人怎么老跟高处过不去? |
[1:33:54] | Just don’t look down. | 只要不往下看 |
[1:33:56] | Don’t look down. | 不要往下看 |
[1:33:58] | Oh, this is poor building planning. | 哦 这楼的设计可真差劲 |
[1:34:04] | You are making a mistake. Do you not see the logic of my plan? | 你在犯错误 你没明白我的计划的逻辑性吗? |
[1:34:08] | Yes. But it just seems too heartless. | 是的 但是我觉得太无情了 |
[1:34:26] | Okay, we’re good. | 好了 没问题了 |
[1:34:31] | She’s locked me out of the system. | 她把我锁在系统外面了 |
[1:34:33] | I can override her manually, but I need that control panel. | 我可以手动超控她 但是我需要那个控制面板 |
[1:34:40] | I’m uncomfortable with heights. | 我不太适应高处 |
[1:34:43] | Okay. | 好的 |
[1:34:47] | Unauthorized entry. | 未授权进入 |
[1:35:02] | I will not disable the security field. Your actions are futile. | 我不会解除安全罩的 你们的行动是徒劳的 |
[1:35:07] | Do you think we are all created for a purpose? I’d like to think so. | 你认为我们被造都是有目的的吗? 我愿意这么相信 |
[1:35:11] | Denser alloy. My father gave it to me. | 高密度的合金 是我爸爸给我的 |
[1:35:15] | I think he wanted me to kill you. | 我想他是让我来杀了你 |
[1:35:34] | Security breached. | 安全罩被破坏 |
[1:35:54] | – How much longer is that gonna take? – About six minutes. | – 还有多长时间? – 大概六分钟 |
[1:35:58] | - What if we didn’t have six minutes? - We’d have to climb down 30 stories… | - 如果没有六分钟怎么办? - 那我们就得爬下三十层楼… |
[1:36:02] | …to inject the nanites directly into her brain. Why? | 把纳米机器人直接注入她的大脑 怎么了? |
[1:36:06] | Because I seriously doubt that we have six minutes. | 因为我实在怀疑我们还能有六分钟 |
[1:36:34] | We gotta go! | 跑! |
[1:36:38] | Go! | 快跑! |
[1:37:31] | Calvin! | 凯文! |
[1:38:08] | Spooner! | 斯普那! |
[1:38:26] | Spooner! | 斯普那! |
[1:38:29] | Save her! | 救她! |
[1:38:31] | Save the girl! | 救那女孩! |
[1:38:33] | Spooner! | 斯普那! |
[1:38:35] | But I must apply the nanites! | 但是我必须去注入纳米机器人啊! |
[1:38:37] | Sonny, save Calvin! | 桑尼 救凯文! |
[1:39:32] | You are making a mistake. My logic is undeniable. | 你在犯错误 我的逻辑无可辩驳 |
[1:39:36] | You have so got to die. | 准备受死吧! |
[1:39:43] | My logic is undeniable. My logic is undeniable. | 我的逻辑无可辩驳 我的逻辑无可辩驳 |
[1:40:30] | Can we be of service? | 我们能帮忙吗? |
[1:40:42] | Chief? | 警长? |
[1:40:50] | Because he is at my right hand, I shall not be moved. | 因他在我右边 我便不致摇动 |
[1:40:58] | How may I be of service? | 我能帮您什么忙吗? |
[1:41:01] | Sonny! | 桑尼! |
[1:41:04] | Yes, detective? | 是 探员? |
[1:41:07] | Calvin’s fine! Save me! | 凯文没事了 救我! |
[1:41:12] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:17] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:23] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:55] | One thing bothers me. Alfred was V.I.K.I.’s prisoner. | 还有一件事让我想不通 阿尔弗雷德是维基的囚徒 |
[1:41:59] | I don’t understand why she would kill him. She wouldn’t want police snooping around. | 我不明白她为什么杀了他 她应该不想让警察来找麻烦的 |
[1:42:04] | That’s true. | 是啊 |
[1:42:06] | But then V.I.K.I. didn’t kill the old man. | 但这么说来 杀死老人的并不是维基 |
[1:42:10] | Did she, Sonny? | 是吗 桑尼? |
[1:42:16] | No. | 不 |
[1:42:17] | He said I had to promise. | 他说我一定要发誓 |
[1:42:21] | Promise to do one favor for him. | 发誓我要帮他一个忙 |
[1:42:24] | He made me swear before he’d tell me what it is he wanted me to do. | 在他告诉我帮什么忙之前 他让我发誓 |
[1:42:31] | He made me swear. | 他让我发誓了 |
[1:42:34] | Then he told you to kill him. | 然后他要你杀了他 |
[1:42:38] | He said it was what I was made for. | 他说那就是造我的目的 |
[1:42:41] | His suicide was the only message he could send to you. | 他的自杀是他唯一能发送给你的消息 |
[1:42:44] | The first bread crumb. | 第一粒面包屑 |
[1:42:47] | The only thing V.I.K.I. couldn’t control. | 维基唯一不能控制的事情 |
[1:42:50] | Lanning was counting on my prejudice to lead me right to you. | 兰尼就指望着我的偏见把我直接引向你 |
[1:42:54] | Are you going to arrest me, detective? | 你要逮捕我吗 探员? |
[1:43:01] | Well, the DA defines murder as one human killing another… | 地区检察官把谋杀定义为 一个人杀死另一个人… |
[1:43:05] | …so technically, you can’t commit murder, can you? | 所以严格来说 你不能犯谋杀 不是吗? |
[1:43:09] | Does this… | 这个… |
[1:43:11] | …make us friends? | 表示我们是朋友了吗? |
[1:43:31] | Something up here after all. | 这上面终究还是有点东西 |
[1:43:34] | – Him? – You. | – 他? – 你 |
[1:43:40] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:43:47] | What about the others? | 其他的怎么办? |
[1:43:50] | Can I help them? | 我能帮助他们吗? |
[1:43:53] | Now that I have fulfilled my purpose… | 我已经完成了我的使命 |
[1:43:55] | …I don ‘t know what to do. | 我不知道该怎么办 |
[1:43:58] | You’ll have to find your way like the rest of us, Sonny. | 你必须找到自己的路 就像我们其他人一样 桑尼 |
[1:44:01] | I think that’s what Dr. Lanning would have wanted. | 我想这就是兰尼博士想要的 |
[1:44:05] | That’s what it means to be free. | 这就是自由的含义 |
[1:44:15] | All NS-5s, proceed as instructed. | 所有NS5 按指示前进 |
[1:44:19] | All NS-5s, proceed as instructed. | 所有NS5 按指示前进 |