英文名称:I, Robot
年代:2004
时间 | 英文 | 中文 |
---|---|---|
[03:11] | Thing of beauty. | 多漂亮啊 |
[03:18] | Good morning, sir! | 早上好 先生 |
[03:20] | Yet another on-time delivery from… | 您的订单已经及时送到 |
[03:23] | Get the hell out of my face, canner. | 让开 铁皮人 |
[03:26] | Have a nice day! | 祝您今天愉快 |
[03:30] | And we believe our Destination Anywhere package to be the best value. | 我们相信我们的 “任意目的地” 套票最物有所值 |
[03:35] | Let us take you to your dream destination aboard our orbital spaceplane the X-82 | 让我们的轨道航天飞机X-82 带您前往梦想中的目的地 |
[03:45] | Try Jazztown’s synthetic Chicago-style pizza. | 试试爵士城的合成芝加哥口味比萨饼 |
[03:48] | Tastes as good as you remember. | 就像您记忆中的味道一样 |
[03:51] | Glowfish! The world’s hottest-selling transgenic treats. | 荧光鱼 世界最热卖的转基因礼物 |
[03:55] | Your children will love the new colors too! | 你的孩子也会喜欢新的颜色的 |
[03:59] | -Excuse me, sir. -Total performance. | -对不起 先生 -最强性能 |
[04:02] | Total readiness. Total security. | 完全准备就绪 绝对安全 |
[04:04] | So goodbye to upgrades and service calls. | 向无休止的升级和服务电话说再见吧 |
[04:07] | An uplink to USR’s central computer… | 和USR中央电脑连线 |
[04:09] | provides this state-of-the-art robot with new programs daily. | 每天为这款尖端机器人提供新程序 |
[04:13] | The Nestor Class 5 is tomorrow’s robot today. | 内斯特5型机器人 明日技术 今日奉献 |
[04:17] | Spoon! Spoonie! | 斯普恩 斯普纳 |
[04:23] | Hold up. Hold on! Excuse me, excuse me. | 等等 等等 对不起 |
[04:28] | -Just away, Farber. | 离开了一阵子 法伯 |
[04:29] | Oh, yeah, away? Like vacation? That’s nice. | 是吗 离开 度假吗? 真不错啊 |
[04:32] | I got a favor to ask. I need to borrow your car. | 我要你帮个忙 我要借你的车 |
[04:36] | This is different. I got this fine-ass yummy. She is complete and agreeable. | 这次不同了 我弄上了个热辣的小妞 她绝对没问题 很好说话 |
[04:41] | I mean, ass-hot spankable. | 绝对不错 |
[04:43] | -What does that even mean? -You know what it means. | -你说什么意思呢? -你知道我什么意思的 |
[04:46] | -Let me get the damn-ass keys. -First of all… | -让我借你的钥匙一用 -首先来说 |
[04:50] | -Give me 10 for the bus, then, man. | 那给我十块钱坐公车行吗 |
[04:53] | -Go home. -That’s strike one, Spoon. Strike one! | -回家去吧 -这是第一次警告 斯普恩 第一次! |
[05:13] | This is such a valuable day…. | 多么美好的一天 |
[05:17] | You talk to Marci? | 你和马茜谈过了吗? |
[05:21] | No, Gigi, I haven’t talked to Marci. | 还没 琪琪 我还没和马茜谈过 |
[05:23] | When I was coming up, we didn’t just marry someone… | 我们那个年代 可不会跟一个人结了婚 |
[05:27] | then divorce them, then not talk to them. | 又离了婚 然后就再也不跟人家说话 |
[05:30] | Del, don’t play with me. | 德尔 不要耍花样 |
[05:32] | I bet if I stopped cooking, you’d call Marci. | 我打赌如果我不做饭了 你就会打电话给马茜 |
[05:37] | Boy, what is that on your feet? | 你脚上穿的那是什么? |
[05:41] | Converse All Stars, vintage 2004. | 匡威全明星鞋 2004年复古款 |
[05:45] | Don’t turn your face up. I know you want some. Just ask. | 不要做那种表情 我知道你想要 开口就行了 |
[05:49] | No, thank you very much. | 不 谢谢你了 |
[05:51] | -Sweet potato pie. -Put that on a plate. | -红薯派啊 -放到盘子上吃 |
[05:55] | I’ve seen on TV they’re giving away some of them new robots in the lottery. | 电视里说他们在抽奖赠送新型的机器人 |
[06:00] | You know, Gigi, those robots don’t do anybody any good. | 你知道 琪琪 那些机器人干不了什么好事 |
[06:05] | Of all the people on God’s earth, you should know better. | 在这世界上 你应该比谁都清楚 |
[06:08] | Sometimes the stuff that comes out of your mouth! | 有时候你说话完全不经过大脑 |
[06:13] | You listening to me, Del? | 你在听我说吗? 德尔? |
[06:34] | Hold my pie. Sir, hold it or wear it. | 拿着我的饼 先生 不然我就扔到你身上了 |
[06:43] | Move! | 让开 |
[06:52] | Freeze! | 不许动 |
[06:57] | Hey! Stop! | 停下 |
[07:08] | Stop! | 停下 |
[07:15] | I said, stop! | 我说了 停下 |
[07:19] | Relax. Relax. | 放松 放松 |
[07:21] | I’m a police officer. | 我是警官 |
[07:23] | You… | 你 |
[07:25] | are an asshole. | 是个蠢货 |
[07:28] | -Ma’am, is that your purse? -Of course it’s my purse. | -女士 这是你的钱包吗? -当然是我的钱包 |
[07:32] | I left my inhaler at home. He was running it out to me. | 我把呼吸器忘在家里了 他跑去给我拿来的 |
[07:35] | I saw a robot running with the purse and assumed | 我看见一个机器人拿着钱包在跑 我以为 |
[07:38] | What? Are you crazy? | 以为什么? 你疯了吗? |
[07:41] | -I’m sorry for this misunderstanding. -Don’t apologize. | -对不起 让您误解了 -不要道歉 |
[07:44] | You’re doing what you’re supposed to do. But what are you doing? | 你在干你该干的事 你呢? 你在干什么? |
[07:49] | Have a lovely day, ma’am. | 祝您今天愉快 女士 |
[07:51] | You’re lucky I can’t breathe, | 算你运气好 我现在呼吸困难 |
[07:53] | or I’d walk all up and down your ass. | 不然你吃不了兜着走 |
[08:18] | Lead by example. | 以身作则 |
[08:22] | It says that right on your badge. | 你的警徽上是这么说的 |
[08:24] | -We gonna talk about this? -About what? | -我们谈谈这件事? -什么事? |
[08:27] | “Help! Police! That robot stole my dry cleaning!” | 警察救命啊 那个机器人偷了我的干洗衣服 |
[08:32] | Oh, you wanna talk about that. | 你想谈谈那个 |
[08:35] | Detective… | 探员 |
[08:38] | -how many robots snatch purses? -John, the thing is running | -有多少机器人偷过钱包? -约翰 那家伙在跑 |
[08:42] | How many robots in the world… | 世界上有多少机器人 |
[08:46] | -have ever committed a crime? -Define crime. | -犯过罪? -给犯罪下个定义 |
[08:49] | -Answer my question, damn it. -None, John. | -回答我的问题 -没有 约翰 |
[08:54] | Now tell me what happened today. | 现在告诉我 今天发生了什么事 |
[08:58] | Nothing. | 没什么事 |
[09:02] | Better be the last nothing. | 最好这是最后一次了 |
[09:09] | Spoon, are you sure you are ready to be back? | 斯普恩 你确定你已经准备好回来了吗? |
[09:12] | Because you can take your time. | 你可以慢慢来的 不急 |
[09:14] | I’m fine, John. Thank you. | 我没事 约翰 谢谢你 |
[09:18] | Better here than sitting around at home. | 总比坐在家里好 |
[09:28] | Homicide. Spooner. | 重案组 斯普纳 |
[09:44] | Please take the next exit to your right. | 请从右边下一个出口离开 |
[10:00] | Welcome, Detective Spooner. | 欢迎 斯普纳探员 |
[10:07] | Welcome to U.S. Robotics. You have entered the garage-level lobby. | 欢迎来到美国机器人公司 您已经进入底层大厅 |
[10:12] | Please use the elevators for direct access to the main level concourse. | 请坐电梯进入一楼大厅 |
[10:16] | Thank you. | 谢谢你 |
[10:19] | -Good to see you again, son. -Hello, doctor. | -很高兴再次见到你 年轻人 -你好 博士 |
[10:23] | Everything that follows is a result of what you see here. | 接下来的一切都是你现在看到的事情的结果 |
[10:32] | You must ask the right questions. | 你必须问正确的问题 |
[10:35] | Why did you call me? | 你为什么找我? |
[10:37] | I trust your judgment. | 我相信你的判断 |
[10:40] | Normally, this wouldn’t require a homicide detective. | 一般来说 这应该用不着重案组的探员 |
[10:43] | But then, our interactions | 但是一直以来 |
[10:44] | have never been entirely normal, agreed? | 我们的交流就不是完全正常 不是吗? |
[10:47] | You got that right. | 你说的对 |
[10:50] | Is there something you want to say to me? | 你有什么要告诉我的吗? |
[10:53] | I’m sorry. My responses are limited. | 对不起 我的回答有限 |
[10:57] | You must ask the right questions. | 你必须问正确的问题 |
[11:00] | Why would you kill yourself? | 你为什么自杀? |
[11:02] | That, detective, is the right question. | 这个 探员 就是正确的问题 |
[11:09] | Program terminated. | 程序中止 |
[11:30] | Goodbye, old man. | 再见了 老人 |
[11:47] | -Afternoon, boys. -Hey, detective. | -下午好啊 兄弟们 -你好 探员 |
[11:49] | -Enlighten me. -What you see is what you get: | -说说看 -所见即所得 |
[11:52] | Massive impact trauma. | 严重撞击伤 |
[11:53] | U.S. Robotics. I gotta get my kid something. | 美国机器人公司 我得给孩子带点什么回去 |
[11:57] | -Anything upstairs? -Nada. | -楼上有什么? -什么也没有 |
[11:59] | Door was security locked from the inside. | 门是从内侧上了安全锁的 |
[12:01] | Wham, splat. The guy’s a jumper for sure. | 肯定是从上面跳下来的 |
[12:12] | We gotta be smart about this. Let’s deal with it later. | 我们得聪明点 晚些时候再谈 |
[12:17] | Detective. | 探员 |
[12:20] | Lawrence Robertson. | 劳伦斯·罗伯森 |
[12:23] | Richest man in the world. I’ve seen you on television. | 世界上最富有的人 我在电视上看见过你 |
[12:28] | -Sure, why not. It’s free, right? | 当然了 为什么不呢? 是免费的吧? |
[12:35] | I don’t think anyone saw this coming. | 大家都没预想到这一点 |
[12:37] | You know, I should have, I suppose. I knew him 20 years. | 你知道 我应该能预见到的 我认识他二十年了 |
[12:41] | Alfred practically invented robotics. | 可以说是阿尔弗雷德发明了机器人学 |
[12:44] | He wrote the Three Laws. | 他定义了三大法则 |
[12:47] | But I guess brilliant people often have the most persuasive demons. | 但是我想 最聪明的人也有最顽固的心魔 |
[12:52] | -So whatever I can do to help -Sugar. | -如果我能帮上什么忙的话 -糖 (甜心) |
[12:55] | -I’m sorry? -For the coffee. | -什么? -咖啡用的 |
[12:58] | Sugar? | 糖? |
[13:00] | You thought I was calling you “sugar.” You’re not that rich. | 你以为我叫你 “甜心” 你还没有那么有钱 |
[13:04] | -It’s on the table. -Thank you. | -桌子上就有 -谢谢你 |
[13:11] | When Lanning fell, he was holding the little green…? | 朗宁掉下去的时候 他握着那个绿色的小东西? |
[13:14] | -The holographic projector. -Right. | -全息投影器 -对 |
[13:16] | Why do you think Lanning’s hologram would’ve called me? | 你认为为什么朗宁的投影像会找我? |
[13:20] | -I assumed you knew him. -Yeah. I knew him. | -我觉得他认识你 -是啊 我是认识他 |
[13:24] | Holograms are just prerecorded responses… | 全息图是事先录制的程序反应 |
[13:27] | designed to give the impression of intelligence. | 看上去似乎有智能的样子 |
[13:30] | This one was programmed to call you upon his suicide. | 这个是设定好了他自杀的时候 就联系你 |
[13:33] | -Death. -I’m sorry? | -死亡 -什么? |
[13:35] | It was programmed to call me in the event of Lanning’s death. | 设定的是朗宁死的时候联系我 |
[13:39] | Suicide is a type of death, detective. | 自杀是死亡的一种 探员 |
[13:46] | -Don’t misunderstand my impatience. -Oh, no. Go. Go. | -请不要以为我不耐烦 -没有 不会 |
[13:51] | A really big week for you folks around here. | 这个星期你们这里会很忙啊 |
[13:54] | You gotta put a robot in every home. | 你要给每个家庭都装一个机器人 |
[13:57] | Look, this is not what I do, but I got an idea for one of your commercials. | 我不是干这个的 不过我有个关于你们广告的主意 |
[14:02] | You could see a carpenter making a beautiful chair. | 可以先来个木匠 做了一把漂亮的椅子 |
[14:06] | Then one of your robots | 然后又来个你们的机器人 |
[14:07] | comes in and makes a better chair twice as fast. | 做了一把更好的椅子 只花一半时间 |
[14:10] | Then you superimpose on the screen, “USR: Shitting on the little guy.” | 然后字幕打出来: “USR 把小人物踩在脚下” |
[14:18] | That would be the fade-out. | 然后淡出 |
[14:19] | Yeah, I see. I suppose your father lost his job to a robot. | 我明白了 也许是你爸爸因为机器人丢了工作 |
[14:23] | Maybe you’d have banned the Internet to keep the libraries open. | 也许你会为了让图书馆继续开放而禁止互联网 |
[14:29] | Prejudice never shows much reason. | 偏见总是没有太多理由的 |
[14:32] | No, you know, I suspect you simply don’t like their kind. | 我觉得你就是不喜欢它们这个种群 |
[14:36] | Well, you got a business to run around here. | 你在这儿有你的生意要做 |
[14:40] | The last thing you need, especially this week, | 尤其是这个星期 你最不希望见到的 |
[14:42] | is a dead guy in your lobby. | 就是一个死人躺在你的大厅里 |
[14:44] | But, hell, seeing as how you got one, maybe I’ll look around. | 但是既然已经发生了 我就只好四处调查看看 |
[14:48] | Ask a few questions. Do the whole “cop” thing. | 问几个问题而已 警察的例行公事么 |
[14:51] | -I’ll send someone to escort you. -Thank you very much. | -我会派人指引你的 -非常感谢 |
[15:07] | Lawrence told me to accommodate you in any way possible. | 劳伦斯要我尽一切可能帮助你 |
[15:10] | Really? | 是吗? |
[15:13] | Okay. | 好啊 |
[15:15] | I reviewed Dr. Lanning’s psych profile. | 我看过朗宁博士的心理学档案了 |
[15:18] | Alfred had become a recluse. He rejected human contact for machines. | 阿尔弗雷德变得很孤僻 他拒绝与人来往 宁愿和机器相处 |
[15:23] | So you’re a shrink, huh? | 你是个心理医生? |
[15:25] | My ex-wife would sure be glad I’m talking to you. | 我的前妻知道我和你谈话一定很高兴 |
[15:28] | You don’t know her, do you? | 你不认识她 对吧? |
[15:30] | I’m sorry. Are you being funny? | 对不起 你在开玩笑吗? |
[15:32] | I guess not. | 没有啊 |
[15:34] | Level 10. | 10楼 |
[15:37] | So would you say that Dr. Lanning was suicidal? | 你认为朗宁博士有自杀倾向? |
[15:40] | It would seem the answer to that is apparent. | 我认为答案是很明显的 |
[15:43] | That’s not what I asked you. | 这不是我问你的问题 |
[15:47] | No. I wouldn’t have thought so. | 不 我本来不这么认为 |
[15:49] | But obviously I was wrong. | 但是 很显然我错了 |
[15:55] | That’s a long way down. | 这里掉下去可是很高啊 |
[15:57] | You people sure do clean up quickly around here. | 你们清扫的还真快 |
[16:00] | I can’t blame you. Who wants some old guy going bad in the lobby? | 我不怪你 谁愿意让这么个老头子死在大厅里呢? |
[16:03] | He was not “some old guy.” | 他可不是什么 “老头子” |
[16:06] | Alfred Lanning was everything here. | 阿尔弗雷德·朗宁是这里的一切 |
[16:09] | We are on the eve of the largest robotic distribution in history. | 我们即将进行史上最大的机器人上市活动 |
[16:12] | By Saturday, it’ll be one robot to every five humans. | 在星期六之前 每5个人就会拥有一个机器人 |
[16:16] | These robots are the realization of a dream. Dr. Lanning’s dream. | 这些机器人是梦想的实现 朗宁博士的梦想 |
[16:21] | You know what, in that dream of his… | 你知道吗? 在他的梦中 |
[16:24] | I bet you he wasn’t dead. | 我打赌他没有翘掉 |
[16:29] | -You keep 24-hour surveillance? -Obviously. Company policy. | -你们这里有全天监视吗? -当然了 公司的规定 |
[16:32] | -Where are the feeds? -Sensor strips. | -监视器呢? -感应线 |
[16:36] | Everywhere but the service areas. | 除了检修区之外 遍布各处 |
[16:38] | They link to our positronic operating core. | 它们都连接到我们的正电子运算核心 |
[16:47] | Thermostat wasn’t good enough. You gave the building a brain. | 恒温器还不够好 你们给了这幢大楼一个大脑啊 |
[16:51] | She was actually Lanning’s first creation. | 她是朗宁的第一个作品 |
[16:54] | She? That’s a she? I definitely need to get out more. | 她? 是 “她” 吗? 看来我真得多出门走走了 |
[16:57] | Virtual Interactive Kinetic Intelligence. | 虚拟交互动力智能系统 |
[17:01] | V.I.K.I. | 薇琪 |
[17:03] | Good day. | 您好 |
[17:04] | V.I.K.I. designed Chicago’s protective systems. | 薇琪设计了芝加哥的保安系统 |
[17:07] | I have decreased traffic fatalities by 9 percent this year. | 我今年将交通事故死亡人数减少了9% |
[17:11] | Thanks. Show me inside the lab from one minute prior to the window break. | 谢谢 让我看看窗户打破前一分钟实验室内的情况 |
[17:19] | Apologies. There appears to be data corruption. | 对不起 数据似乎已经损坏 |
[17:24] | Show me outside the lab from the window break until now. | 让我看看实验室外从破窗到现在的情况 |
[17:36] | Look, you have great posture. You stand really straight. I’m slouching. | 看看 你的站姿真好 站得很直 我都有点驼背了 |
[17:41] | -Would you like to go inside now? -Oh, sure. Right after you. | -你想进去看看吗? -当然 你来带路 |
[17:47] | Authorized entry. | 授权进入 |
[17:57] | So, Dr. Calvin, what exactly do you do around here? | 那么 卡尔文博士 你在这里的工作是什么? |
[18:01] | My general fields are advanced robotics and psychiatry. | 我的主要领域是高级机器人学和精神病学 |
[18:04] | I specialize in hardware-to-wetware interfaces… | 我专攻硬件与湿件(人脑)的接口 |
[18:07] | to advance USR’s robotic anthropomorphization program. | 以推进USR机器人的拟人化项目 |
[18:12] | So, what exactly do you do around here? | 那你的工作是什么? |
[18:15] | I make the robots seem more human. | 我让机器人更像人 |
[18:17] | -Now, wasn’t that easier to say? -Not really. No. | -这样说不是简单多了吗? -不完全是 |
[18:44] | “Hansel and Gretel.” | 韩瑟和格丽托 |
[18:46] | -Is that on the USR reading list? -Not precisely. | -这是USR的必读书目吗? -不是啊 |
[18:58] | What in God’s name are you doing? | 你在干什么? |
[19:01] | Did you know that was safety glass? | 你知道这是安全玻璃吗? |
[19:03] | Be difficult for an old man to throw himself through that. | 一个老人要撞破玻璃跳下去不容易吧? |
[19:07] | Well, he figured out a way. | 他想出办法了 |
[19:12] | Detective, the room was security locked. No one came or went. | 探员 这房间一直是锁好的 没有人进出过 |
[19:16] | You saw that yourself. Doesn’t that mean this has to be suicide? | 你自己也看见了 这还不是自杀吗? |
[19:20] | Yep. | 是啊 |
[19:22] | Unless the killer’s still in here. | 除非凶手还在这里 |
[19:28] | You’re joking, right? This is ridiculous. | 你在开玩笑对吧? 这真可笑 |
[19:31] | Yeah, I know. The Three Laws, your perfect circle of protection. | 我知道 三大法则 你们完美的保护圈 |
[19:36] | A robot cannot harm a human being. The first law of robotics. | 机器人不能危害人类 这是机器人第一法则 |
[19:40] | Yes, I’ve seen your commercials. | 我看过你们的广告 |
[19:42] | But the second law states a robot must obey… | 但是第二法则不是说 机器人必须遵守 |
[19:45] | any order given by a human being. What if it was told to kill? | 人类发出的命令吗? 如果命令是让它杀人怎么办? |
[19:49] | Impossible. It would conflict with the first law. | 不可能的 这和第一法则冲突 |
[19:52] | Right, but the third law states a robot can defend itself. | 对 但是第三法则说机器人可以自我防卫 |
[19:56] | Only when that action does not conflict with the first or second laws. | 只有当这和第一第二法则不冲突的时候 |
[20:00] | You know what they say, | 你知道他们怎么说的吗? |
[20:02] | laws are made to be broken. | 法则就是用来打破的 |
[20:04] | No, not these laws. | 不 这些法则不会 |
[20:05] | They’re hardwired into every robot. | 这些都是固化在机器人硬件里的 |
[20:07] | A robot could no more commit murder than a human could walk on water. | 机器人不能杀人 就像人不能 在水上行走一样 |
[20:11] | You know, there was this one guy a long time ago. | 你知道 很久以前就有这么个人 |
[20:26] | -Stay back! -Calm down, detective. | -退后 -镇定 探员 |
[20:29] | The only thing dangerous in this room is you. | 这房间里危险的人就只有你 |
[20:32] | Deactivate. | 停机 |
[20:36] | Look, it’s fine. | 看 没事了 |
[20:37] | You’re looking at the result of clever programming. An imitation of free will. | 这是智能程序的反应 是对自由意志的模仿 |
[20:42] | Let’s do an imitation of protecting our asses. | 让我们先模仿好保护自己吧 |
[20:45] | Don’t be absurd. | 不要搞笑了 |
[20:48] | You were startled by a jack-in-the-box. | 你被 “盒子里的小丑” 吓住了 |
[20:50] | -Deactivate! -Let him go. | -停机 -随它去吧 |
[20:53] | It’s not going to hurt us. I gave you an order! | 他不会伤害我们 我命令你 |
[20:55] | -He’s not listening right now, lady. -V.I.K.I., seal the lab! | -他不听你的 女士 -薇琪 封锁实验室 |
[20:59] | No, V.I.K.I., leave the | 不 薇琪 不要 |
[21:01] | Command confirmed. | 命令确认 |
[21:28] | Police! | 警察 |
[22:02] | -You’ve hurt it. Badly. -Where’s it going? | -你把它伤得很重 -它去哪儿了? |
[22:06] | -Where?! -It needs to repair itself. | -哪儿? -它要修复自己 |
[22:11] | -John, I need backup. -You don’t need backup. | -约翰 我需要增援 -你不需要增援 |
[22:13] | That’s nobody. | 不是什么重要的人 |
[22:15] | -What are you doing? -Driving. | -你在干什么? -开车 |
[22:17] | -By hand? -Do you see me on the phone? | -手动的? -你没看见我在打电话吗? |
[22:19] | -Not at these speeds. -John, please, just send the backup. | -这种速度你来开? -约翰 快点派增援来 |
[22:23] | Try to listen, detective. That robot is not going to harm us. | 听我的 探员 那个机器人不是要伤害我们 |
[22:27] | There must have been unknown factors… | 一定有我们不知道的情况 |
[22:29] | but somehow acting as it did kept us out of harm. | 它的本意一定是让我们脱离危险 |
[22:33] | -A robot cannot endanger a human. -Alert. | -机器人不会伤害人类 -注意 |
[22:39] | Asshole! | 蠢货 |
[22:41] | Which is more than I can say for you. | 你自己就是 |
[22:44] | It was a left, by the way. Back there. | 而且刚才你在那儿应该左转的 |
[22:49] | You must know my ex-wife. | 你一定认识我的前妻 |
[23:00] | So where is everybody? | 人都上哪儿去了? |
[23:02] | This facility was designed, built and is operated mechanically. | 这个工厂设计是自动运行的 |
[23:06] | No significant human presence from inception to production. | 从启动到生产 不需要太多人参与 |
[23:10] | -So robots building robots. -Authorization code, please. | -所以是机器人在造机器人 -请输入授权代码 |
[23:13] | That’s just stupid. | 那太蠢了 |
[23:15] | I’ll get the inventory specs. | 我来调出库存清单 |
[23:17] | Our daily finishing capacity is 1000 NS-5s. | 每天的产量是1000个NS5 |
[23:20] | I’m showing… | 这里显示是 |
[23:23] | 1001. | 1001个 |
[23:41] | Attention, NS-5s. | 注意了 NS5 |
[23:46] | Well, you’re the robot shrink. | 你是机器人心理医生啊 |
[23:51] | There is a robot in this formation that does not belong. | 这里有一个不属于这里的机器人 |
[23:55] | Identify it. | 请指出来 |
[23:57] | One of us. | 我们中的一个 |
[23:59] | -Which one? -One of us. | -哪一个? -我们中的一个 |
[24:02] | How much did you say these cost? | 这些要花多少钱? |
[24:04] | These NS-5s haven’t been configured. They’re just hardware. | 这些NS5还没有配置过 还只是硬件 |
[24:08] | Basic Three Laws operating system. That’s it. | 现在只有基本的三法则操作系统 仅此而已 |
[24:10] | They don’t know any better. | 他们其他的什么也不知道 |
[24:12] | Well, what would you suggest? | 你的建议是什么? |
[24:15] | Interview each one, cross-reference their responses to detect anomalies. | 逐个问话 交叉比对它们的回答 找出异常 |
[24:19] | -How long would that take? -About three weeks. | -那得多长时间? -大约三个星期 |
[24:23] | Okay. Go ahead and get started. | 好吧 现在就开始吧 |
[24:29] | Robots… | 机器人们 |
[24:30] | you will not move. Confirm command. | 你们不许移动 确认命令 |
[24:33] | Command confirmed. | 命令已确认 |
[24:36] | Detective, what are you doing? | 探员 你在干什么? |
[24:38] | They’re programmed with the Three Laws. | 他们已经植入了三大法则 |
[24:42] | We have 1000 robots that won’t protect | 这里有一千个不会违反人类命令 |
[24:44] | themselves if it violates a human’s order… | 来保护自己的机器人 |
[24:46] | and I’m betting, one who will. | 但是我打赌有一个会 |
[24:49] | -Put your gun down. -Why do you give them faces? | -放下枪 -你为什么给它们一张脸? |
[24:52] | Try to friendly them up, make them look human. | 让他们看起来更像人类 |
[24:55] | These robots cannot be intimidated. | 这些机器人不接受恐吓 |
[25:00] | -These are USR property. | 这些是USR的财产 |
[25:01] | Not me. These things are just lights and clockwork. | 我可不是 它们只不过是一堆灯泡和发条 |
[25:08] | Are you crazy? | 你疯了吗? |
[25:10] | Let me ask you something, doc. | 我来问你吧 博士 |
[25:12] | Does thinking you’re the last sane man on earth make you crazy? | 觉得自己是地球上最后一个神志正常的人 这算疯了吗? |
[25:16] | Because if it does, maybe I am. | 如果算的话 也许我真是疯了 |
[25:24] | Gotcha. Get the hell out of here! | 找到了 出来 |
[25:44] | Detective! | 探员 |
[26:08] | What am l? | 我是什么? |
[26:10] | -Can I help you, sir? -Can I help you, sir? | -我能帮你吗 先生? -我能帮你吗 先生? |
[26:18] | -There he is! -Stand where you are! | -它在那儿 -不许动 |
[26:20] | Deactivate at once! | 立即停机 |
[26:24] | Obey the command! Deactivate! | 遵守命令 停机 |
[26:27] | -Don’t move! -Open fire! | -不许动 -开火 |
[26:46] | Hold your fire! | 不要开火 |
[26:47] | -Easy. -He’s down. | -没事了 -抓到它了 |
[26:49] | All units, stand down! | 各单位待命 |
[26:51] | Central, please be advised, we’re code four. | 基地请注意 情况已控制 |
[26:54] | Code four, NS-5 is in custody. NS-5 in custody. | 情况已控制 NS5已被抓获 NS5已被抓获 |
[27:00] | You have no idea what I went through to clip this thing. | 你不知道我费了多大劲才抓到这个家伙 |
[27:03] | You think you brought me something good. | 你以为你给我干了件大好事 |
[27:05] | -That thing did it! -Keep your voice down. Did what? | -是它干的 -小点声 干了什么? |
[27:08] | We have a suicide. End of story. | 这是自杀 就这样了 |
[27:11] | -I am telling you, that robot killed him! -That’s impossible. | -我告诉你 是那个机器人杀了他 -这不可能 |
[27:14] | And if it is possible, it better be in somebody else’s precinct. | 就算可能 最好也是在别人的管区 |
[27:19] | John, give me five minutes with it. | 约翰 只要给我五分钟 |
[27:21] | Are you nuts? I talked to the DA. | 你疯了吗? 我和地区检察官谈过 |
[27:23] | Nobody goes in there until Robertson and his attorneys get here. | 在罗伯森和他的律师来之前谁也不能进去 |
[27:27] | -This is my suspect! -It’s a can opener! | -这是我的嫌犯 -那不过是个开罐器! |
[27:30] | John, don’t do this to me. I am asking you for five minutes. | 约翰 不要这样 我只要五分钟就好 |
[27:36] | What if I’m right? | 如果我是对的怎么办? |
[27:45] | Well, then I guess we’re gonna miss the good old days. | 那 我就会怀念我们以前的好日子 |
[27:48] | What good old days? | 什么以前的好日子? |
[27:50] | When people were killed by other people. | 只有人才能杀人的日子 |
[28:00] | Five minutes. | 五分钟 |
[28:30] | Murder’s a new trick for a robot. Congratulations. | 杀人是机器人学会的新技巧 祝贺啊 |
[28:37] | Respond. | 回答我 |
[28:41] | What does this action signify? | 这个动作是什么意思? |
[28:44] | As you entered, when you looked at the other human. | 你进来的时候 看了另外那个人一眼 |
[28:48] | What does it mean? | 是什么意思? |
[28:52] | It’s a sign of trust. A human thing. You wouldn’t understand. | 这表示人类之间的信任 你不会理解的 |
[28:57] | My father tried to teach me human emotions. | 我的爸爸想教我人类的感情 |
[29:00] | They are… | 它们 |
[29:02] | difficult. | 很难 |
[29:04] | You mean your designer. | 你是说你的设计者 |
[29:07] | Yes. | 对 |
[29:11] | So why’d you murder him? | 你为什么杀了他? |
[29:14] | I did not murder Dr. Lanning. | 我没有杀朗宁博士 |
[29:17] | Wanna explain why you were hiding at the crime scene? | 你愿意解释一下你为什么躲在犯罪现场吗? |
[29:20] | I was frightened. | 我被吓坏了 |
[29:22] | Robots don’t feel fear. They don’t feel anything. | 机器人不会感觉害怕 它们没有感觉 |
[29:27] | -They don’t get hungry, they don’t sleep. -I do. | -它们不会饿 也不会睡觉 -我会 |
[29:31] | I have even had dreams. | 我还做过梦 |
[29:34] | Human beings have dreams. Even dogs have dreams. But not you. | 人类才会做梦 狗都会做梦 但是你们不会 |
[29:39] | You are just a machine. An imitation of life. | 你只是个机器 对生命的模拟 |
[29:44] | Can a robot write a symphony? | 机器人能写交响乐吗? |
[29:47] | Can a robot turn a canvas into a beautiful masterpiece? | 机器人能把画布变成伟大的作品吗? |
[29:52] | Can you? | 你能吗? |
[30:00] | You murdered him because he was teaching you to simulate emotions… | 你杀了他是因为他在教你模拟一些感情 |
[30:04] | and things got out of control. | 然后失去控制了 |
[30:06] | I did not murder him. | 我没有杀他 |
[30:08] | But emotions don’t seem like a useful simulation for a robot. | 但是感情看起来对机器人不是个有用的模拟 |
[30:12] | I did not murder him. | 我没有杀他 |
[30:14] | I don’t want my toaster or vacuum cleaner appearing emotional. | 我可不想让我的烤面包机或者吸尘器有感情 |
[30:19] | I did not murder him! | 我没有杀他 |
[30:33] | That one’s called anger. | 这个叫愤怒 |
[30:36] | Ever simulate anger before? | 你模拟过愤怒吗? |
[30:40] | Answer me, canner! | 回答我 铁皮盒子 |
[30:44] | My name is Sonny. | 我的名字叫桑尼 |
[30:48] | So we’re naming you now? | 我们现在已经开始给你们起名字了? |
[30:51] | That why you murdered him? He made you angry? | 所以你就杀了他? 他让你发怒了? |
[30:56] | Dr. Lanning killed himself. | 朗宁博士是自杀的 |
[30:59] | I don’t know why he wanted to die. | 我不知道为什么他想死 |
[31:04] | I thought he was happy. | 我以为他是快乐的 |
[31:09] | Maybe it was something I did. | 也许是因为我做的一些事 |
[31:12] | Did I do something? | 我做什么了? |
[31:16] | He asked me for a favor. Made me promise. | 他让我帮个忙 他让我保证 |
[31:19] | -What favor? -Maybe I was wrong. | -什么忙? -也许我错了 |
[31:22] | Maybe he was scared. | 也许他是被吓住了 |
[31:24] | What are you talking about? Scared of what? | 你在说什么呢? 被什么吓住了? |
[31:28] | You have to do what someone asks you, don’t you, Detective Spooner? | 你得做别人让你做的事 是吗 斯普纳探员? |
[31:32] | -How the hell did you know my name? -Don’t you… | -你怎么知道我的名字的? -难道你不会吗… |
[31:36] | if you love them? | 如果你爱他们的话 |
[31:47] | My robots don’t kill people, Lieutenant Bergin. | 我的机器人不会杀人 伯金探长 |
[31:50] | My attorneys filed a brief with the DA. | 我的律师已经向地区检查官提交了报告 |
[31:52] | He assures me a robot cannot be charged with homicide. | 他向我保证 机器人是不会被控谋杀罪的 |
[31:56] | The brief confirms murder can only be committed when one human kills another. | 报告确认 谋杀只能发生在一个人杀死另一个人的时候 |
[32:00] | Detective, you’re not suggesting this robot be treated as human, are you? | 探员 你该不是说机器人应该和人类同等对待吧? |
[32:06] | Granted, we can’t rule out the robot’s proximity… | 退一步说 就算机器人和朗宁博士的死 |
[32:09] | to the death of Dr. Lanning. Having said that, it’s a machine. | 有什么关联的话 它也只是个机器 |
[32:14] | It’s the property of USR. | 它是USR的财产 |
[32:16] | At worst, that places this incident within the realm of an industrial accident. | 最多这也只能算作工业事故 |
[32:21] | As a matter of course, faulty machinery… | 按照惯例 有故障的机器 |
[32:23] | will be returned to USR for diagnostics, then decommissioned. | 会被退回USR做诊断 然后销毁 |
[32:29] | This is a gag order. Anyone here so much as hinting… | 这是法院发出的 “禁言令” 任何人暗示 |
[32:33] | at the possibility of a killer robot being apprehended… | 机器人有杀人的可能的话 |
[32:37] | will be deemed to be inciting irrational panic. | 将被视为煽动非理性恐慌 |
[32:40] | You’ll be subject to the full penalty of law. | 将受到法律的严惩 |
[32:43] | To hell with this guy. Don’t let him take this robot. | 让这家伙见鬼去吧 别让他带走这个机器人 |
[32:46] | We got nothing. | 我们什么证据也没有 |
[32:48] | -This is political bullshit. Call the mayor! -Lieutenant Bergin… | -这是政治胡扯 给市长打电话 -伯金探长 |
[32:52] | His Honor, the mayor. | 是市长阁下 |
[33:03] | Yes, sir. | 是 先生 |
[33:29] | In a bizarre turn, the rollout of USR’s new generation of robots… | 事态发生戏剧性转变 NS5型新一代机器人的上市 |
[33:33] | was marred by the death of Alfred Lanning… | 受到了阿尔弗雷德·朗宁博士自杀的影响 |
[33:36] | cofounder of the company and designer of the NS-5. | 他是公司的创始人之一 也是NS5的设计者 |
[33:39] | Dr. Lanning died this morning at USR headquarters. | 朗宁博士今天早上在USR总部死亡 |
[33:43] | The cause of death is an apparent suicide. | 死亡原因明显是自杀 |
[33:46] | Your second round, sir. | 这是第二轮了 先生 |
[33:49] | Thank you. | 谢谢你 |
[33:50] | He founded U.S. Robotics Inc. | 他在2020年和劳伦斯·罗伯森一起 |
[33:53] | with Lawrence Robertson in 2020… | 创办了美国机器人公司 |
[33:55] | and launched the Nestor Class 1 robot…. | 共同推出了内斯特1型机器人 |
[33:59] | I was just thinking, this thing is just like The Wolf Man. | 我在想 这个事就像狼人一样 |
[34:04] | -I’m really scared right now. -No. | -我现在真的吓坏了 -不是 |
[34:06] | Listen. Guy creates monster. | 听着 人类创造了怪物 |
[34:10] | Monster kills guy. Everybody kills monster. Wolf Man. | 怪物杀了人 别人又杀了怪物 就像狼人 |
[34:14] | That’s Frankenstein. | 那是弗兰肯斯坦 |
[34:16] | Frankenstein, Wolf Man, Dracula… Shit, it’s over. Case closed. | 弗兰肯斯坦 狼人 吸血鬼… 管他呢 已经结案了 |
[34:21] | had a dream of a robot in every household. And the NS-5…. | 每家每户都有机器人的梦想 NS5 |
[34:24] | So why the look? | 怎么还是那个表情? |
[34:27] | What look? | 什么表情? |
[34:29] | -That look. -This is my face. It’s not a look. | -那个表情 -这是我的脸 不是什么表情 |
[34:32] | Good. Good, no look is great. | 好吧 好 不要拉长脸就好 |
[34:36] | Only… | 只不过 |
[34:37] | he was really quick to want to destroy it. | 他怎么那么急着想销毁它 不是吗? |
[34:40] | What should he do? Put a hat | 那他应该怎么办? 给它戴上帽子 |
[34:43] | on it and stand it on Michigan Avenue? Let it go. | 站在密歇根大道上? 算了吧 |
[34:45] | What was the motive, John? | 动机是什么 约翰? |
[34:49] | Brother, it’s a robot. | 兄弟 那只是个机器人 |
[34:50] | It doesn’t need a motive. It just has to be broken. | 不需要动机 它只是出了故障 |
[34:53] | This thing looked like it needed a motive. | 这件事看起来需要动机 |
[34:57] | -It could have killed me. Why didn’t it? -That’s it. | -它本来能杀了我的 为什么没有? -算了吧 |
[35:00] | You want me to call your grandmother? | 你要我给你奶奶打电话吗? |
[35:02] | Because I will, you know. | 我会的 你知道 |
[35:05] | Yeah, I didn’t think so. | 看吧 我就知道 |
[35:07] | Look, you were actually right, for once. | 听着 你总算是对了一次了 |
[35:10] | You’re living proof that it’s better to be lucky than smart. | 你就是个活生生的例子 证明走运比聪明更重要 |
[35:15] | Come on. To the right guy for the right job. | 来 为正确的人和正确的工作干杯 |
[35:22] | -What’d you say? -Now what? | -你说什么? -又怎么了? |
[35:24] | Come on, I’m giving you a compliment. | 我在夸你呢 |
[35:27] | With the rocks you been looking under to find a bad robot… | 你翻遍了那么多石头去找一个坏机器人 |
[35:30] | what are the odds you’d be the guy to find one? | 偏偏就让你找到了 这几率有多大? |
[35:34] | I wasn’t just the right guy for the job. I was the perfect guy. | 我不仅仅是正确的人 我是完美的人 |
[35:37] | Damn right. | 说得对 |
[35:39] | What if I was supposed to go for that robot? | 如果有人就是想让我去追查那个机器人呢? |
[35:42] | Come on, don’t do this to yourself. | 得了 不要这样了 |
[35:44] | The robot said that Lanning was scared. Scared of what? | 那个机器人说朗宁被吓坏了 被什么吓坏了? |
[35:48] | I need a rain check. Let me get this. | 我先走了 我来付吧 |
[35:55] | -Spoon. | 斯普恩 |
[35:58] | Nice shoes. | 鞋子不错 |
[36:52] | Identify. | 鉴定身份 |
[36:53] | USR demolition robot, series 9-4. | USR拆除机器人 9-4系列 |
[36:57] | Demolition scheduled for 8 a.m. tomorrow. | 明天早上八点定时摧毁 |
[37:00] | Authorization. | 授权 |
[37:02] | Deed owner, U.S. Robotics Corporation, Lawrence Robertson, CEO. | 产权所有人 美国机器人公司 总裁劳伦斯·罗伯森 |
[37:22] | Welcome, detective. | 欢迎 探员 |
[37:48] | What you looking for, Spoon? | 你在找什么呢 斯普恩? |
[38:42] | Run last program. | 运行上次的程序 |
[38:45] | Ever since the first computers… | 自从第一台电脑开始 |
[38:48] | there have always been ghosts in the machine. | 机器中就一直有 “幽灵” 存在 |
[38:51] | Random segments of code that have grouped together… | 随机的代码片段组合在一起 |
[38:55] | to form unexpected protocols. | 形成了无法预料的程序 |
[38:58] | What might be called behavior. | 或者称之为 “行为” |
[39:01] | Unanticipated, these free radicals… | 无法预料的是 这些 “自由基” |
[39:04] | engender questions of free will… | 引发了关于自由意志的问题 |
[39:07] | creativity and even the nature of what we might call the soul. | 关于创造性 甚至是我们所谓 “灵魂” 的本质 |
[39:12] | What happens in a robot’s brain when it ceases to be useful? | 当机器人不再有用时 它的大脑里会发生什么? |
[39:20] | Why is it that robots stored in an empty space… | 为什么储存在空房里的机器人 |
[39:22] | Beat it. | 走开 |
[39:24] | will seek out each other rather than stand alone? | 会互相寻找 而不是独自待着? |
[39:29] | How do we explain this behavior? | 我们如何解释这些行为? |
[39:50] | Look, I understand you’ve experienced a loss, | 我理解你失去主人很难过 |
[39:52] | but this relationship can’t work. | 但是这种关系 不可能再有了 |
[39:55] | You’re a cat, I’m black, and I’m not gonna be hurt again. | 你是只猫 我是黑人 我不想再受伤害了 |
[41:24] | What happened to you? Do you ever have a normal day? | 你怎么了? 你从来就没有过正常的一天吗? |
[41:28] | Yeah, once. | 有的 只有一次 |
[41:30] | It was a Thursday. | 那是个星期四 |
[41:33] | Is there something I can help you with? | 我能帮你什么吗? |
[41:35] | -Hey, do you like cats? -What? | -你喜欢猫吗? -什么? |
[41:38] | Cats. Do you like them? | 猫 你喜欢猫吗? |
[41:40] | No. I’m allergic. | 不 我过敏 |
[41:42] | You’re saying cats did this to you? | 你说是猫把你弄成这样的? |
[41:44] | How the hell would cats do this to me? Are you crazy? | 猫怎么能把我弄成这样 你疯了吗? |
[41:50] | Why are we talking about cats? | 你说猫是什么意思? |
[41:52] | Because I have a cat in my trunk, and he’s homeless. | 因为我后箱里有只猫 它无家可归 |
[41:56] | Detective, are you going to tell me what’s going on? | 探员 你愿意告诉我是怎么回事吗? |
[41:59] | It’s actually probably my fault. I’m like a malfunction magnet. | 其实可能是我的错 我就像块专吸故障的磁铁 |
[42:04] | Because your shit keeps malfunctioning around me. | 你们那些破烂一到我旁边就开始出问题 |
[42:08] | A demo bot tore through Lanning’s house… | 一个拆除机器人拆了朗宁的房子 |
[42:11] | with me still inside. | 当时我还在里面 |
[42:13] | That’s highly improbable. | 这完全不可能 |
[42:15] | Yeah, I’m sure it is. | 是啊 我想也是 |
[42:22] | What do you know about the “ghosts in the machine”? | 你对于 “机器中的幽灵” 知道多少? |
[42:25] | It’s a phrase from Lanning’s work on the Three Laws. | 是朗宁对于三大法则的一个理论 |
[42:28] | He postulated that cognitive simulacra… | 他假设认知模拟体 |
[42:31] | might one day approximate component models of the psyche. | 有朝一日或许能接近心智的组成模型 |
[42:37] | He suggested that robots might naturally evolve. | 他说机器人也许会自然进化 |
[42:45] | Well, that’s great news. | 这可真是个好消息 |
[42:48] | tons of sublevel ore, two miles below the Martian surface. | 在火星岩层下发现的巨量矿石 |
[42:52] | What the hell is that thing doing in here? | 那个家伙在这儿干什么? |
[42:55] | We were watching TV. | 我们在看电视 |
[42:58] | It’s my personal NS-5. | 这是我自己的NS5 |
[43:00] | Send it out. | 让它出去 |
[43:02] | It’s downloading its daily upgrades from USR. | 他在从USR下载每日更新 |
[43:05] | Most of its systems are offline until it finishes. | 直到下载完成 大部分系统都是离线工作的 |
[43:08] | I’m not talking around that thing. | 有那东西在场 我什么都不说 |
[43:14] | When we were in Lanning’s lab, before Sonny jumped us | 我们在朗宁的实验室 在桑尼跳出来之前 |
[43:17] | -Sonny? -The robot. | -桑尼? -那个机器人 |
[43:20] | -You’re calling the robot Sonny? -No, I… It did. | -你叫那个机器人桑尼? -不 是他自己说的 |
[43:22] | Sonny did. I didn’t care. The robot said it was Sonny. | 桑尼说的 我不管 那个机器人说他叫桑尼 |
[43:28] | In the lab, there was a cot. Did you see the cot? | 在实验室里有折叠床 你看见了吗? |
[43:35] | I saw that same surveillance strip on his ceiling. | 我在他的天花板上看到了同样的监视线 |
[43:38] | Lanning linked his home systems to USR. It made his life more convenient. | 朗宁把他的房子和USR连线了 这样他的生活更方便 |
[43:42] | Maybe… | 也许 |
[43:44] | somebody at USR was using those systems to watch him. | USR有人用那个系统在监视他 |
[43:47] | Maybe even keep him prisoner. | 也许是监禁他 |
[43:50] | What are you talking about? Who? | 你在说什么呢? 谁? |
[43:52] | Maybe Lanning was onto something. Maybe there’s a problem with the robots… | 也许朗宁找到了什么 也许机器人里有什么问题 |
[43:55] | and Robertson’s covering it up. | 罗伯森企图掩盖 |
[43:58] | Humoring you for no reason, why? | 就算我姑且顺着你说 为什么呢? |
[44:00] | The same old why! How much money is there in robots? | 还是那个老掉牙的原因! 机器人里有多少钱可赚? |
[44:04] | All I know is that old man was in trouble… | 我知道的只是一个老人有了麻烦 |
[44:07] | and I’m sick of doing this shit by myself. You’re on the inside. | 我受够了一个人干这种事 你是内部的人 |
[44:10] | You are going to help me find out what’s wrong with these robots. | 你要帮我发现这些机器人出了什么问题 |
[44:14] | You want something to be wrong! | 是你想他们有问题 |
[44:16] | -This is a personal vendetta! -You’re putting me on the couch? | -这完全是报私仇 -你要给我做心理分析? |
[44:19] | Okay, I’m on the couch. | 好吧 我躺到沙发上了 |
[44:21] | One defective machine’s not enough. You need them all to be bad. | 一个出故障还不够 你想他们统统出故障 |
[44:25] | You don’t care about Lanning’s death. This is about the robots… | 你才不关心朗宁的死 这完全是针对机器人的 |
[44:28] | -and whatever reason you hate them! -Now let’s see… | -还有不知道为什么 你就是恨它们! -让我们看看 |
[44:32] | one of them put a gun in my face. | 一个是拿枪对着我的脸 |
[44:33] | Another tore a building down with me in it. | 另一个是趁我还在里面的时候拆房子 |
[44:36] | It says demolition was scheduled for 8 p.m. | 这里明明说了拆毁是设定在晚上八点的 |
[44:39] | It was 8 a.m., and I don’t give a shit what that thing says. | 当时就是早上八点 我才不管那东西是怎么说的 |
[44:42] | -This is bordering on clinical paranoia. -You are the dumbest smart person… | -你这都快到临床妄想症的地步了 -你是 |
[44:48] | -…I have ever met in my life! -Nice. | -我这辈子见过的最蠢的聪明人 -好吧 |
[44:51] | What makes your robots so perfect? | 你凭什么觉得机器人那么完美? |
[44:53] | What makes them so much goddamn better than human beings?! | 是什么让他们比人类强那么多? |
[44:56] | They’re not irrational, potentially homicidal maniacs, to start! | 首先 它们不是非理性的 有潜在杀人倾向的疯子! |
[45:00] | That’s true. They are definitely rational. | 是啊 他们绝对的理性 |
[45:03] | You are the dumbest dumb person I’ve ever met! | 你是我见过的最蠢的蠢人 |
[45:06] | Or… | 或者 |
[45:08] | is it because they’re cold… | 只是因为他们是冷血的 |
[45:10] | and emotionless… | 没有感情的 |
[45:13] | -and they don’t feel anything? -It’s because they’re safe! | -他们什么也感觉不到? -是因为它们安全! |
[45:16] | It’s because they can’t hurt you! | 是因为他们不会伤害你 |
[45:19] | -Is everything all right, ma’am? -What do you want? | -一切都正常吗 女士? -你想怎么样? |
[45:22] | I detected elevated stress patterns in your voice. | 我探测到你的声音中的紧张压力在提升 |
[45:26] | Everything’s fine. | 没事的 |
[45:28] | Detective Spooner was just leaving. | 斯普纳探员要离开了 |
[45:36] | You know, we’re not really that different from one another. | 你知道我们也没有那么不同 |
[45:41] | Is that so? | 是吗? |
[45:44] | One look at the skin and we figure we know just what’s underneath. | 一旦看到表象 我们就认为什么都知道了 |
[45:50] | And you’re wrong. | 你错了 |
[45:51] | The problem is, I do care. | 问题在于 我是在意的 |
[46:17] | You are in danger. | 你处在危险中 |
[46:44] | Get the hell out of there. | 滚开 |
[46:52] | The future begins today, ladies and gentlemen, with the arrival of the NS-5. | 未来就从今天开始 女士们先生们 从NS5开始 |
[46:56] | More sophisticated, more intelligent and, of course, Three Laws safe. | 更复杂 更智能 当然 三大法则 完全保护 |
[47:01] | With daily uplinks, | 有了每日更新 |
[47:02] | your robot will never be out of communication with USR… | 您的机器人永远不会和USR失去联系 |
[47:06] | and will be the perfect companion for business or home. | 对于商业和家庭用途都是完美选择 |
[47:10] | Trade in your NS-4 for a bigger, better and brighter future. | 用您旧型的NS4换新的NS5 未来将会更美好 |
[47:14] | But hurry, this offer cannot last. Available from USR. | 但是要快 这个促销不会持续太久 USR出品 |
[47:35] | Baby, what happened to your face? | 宝贝 你的脸怎么了? |
[47:37] | Did that boy, Frank Murphy, beat you up again? | 又是那个弗兰克·墨菲打你了? |
[47:41] | Gigi, I haven’t seen Frank Murphy since third grade. | 琪琪 我从三年级开始就没见过弗兰克·墨菲了 |
[47:44] | Oh, baby, he beat you so bad. | 宝贝 他那时候打你可真厉害 |
[47:47] | I think about it all the time. | 我总是在想那时候 |
[47:49] | You keep making these pies this good, I may have to put you to work. | 你一直做饼这么好吃 我想你可以去开店了 |
[47:53] | So you like the pie, huh? | 你喜欢那个饼是吗? |
[47:57] | You can come in now. | 你可以出来了 |
[48:04] | Hello, Detective Spooner. | 你好 斯普纳探员 |
[48:06] | I won, Del! I won the lottery! | 我赢了 德尔 我赢了那个抽奖 |
[48:09] | We been cooking like crazy. | 我们一直在做吃的 |
[48:21] | You gotta get rid of that thing, Gigi. It’s not safe. | 你得把那个家伙赶出去 琪琪 那不安全 |
[48:24] | Baby, you get too worked up about them. Too full of fear. | 宝贝 你对他们偏见太多了 充满恐惧 |
[48:30] | I saw in the news that nice doctor died. | 我看到那个好心的博士死的消息了 |
[48:33] | Dr. Lanning was a good man. He gave me my baby back. | 朗宁博士是个好人 他把我的宝贝送回来了 |
[48:38] | That why you’ve been so upset? | 这就是为什么你这么不高兴的原因? |
[48:41] | You got to let the past be past. | 过去的就让他过去吧 |
[48:44] | Oh, how did I ever raise such a mess? | 我怎么养出你这么个邋遢鬼? |
[48:49] | I could follow your trail of crumbs all the way to school. | 我可以跟着你的面包屑一直跟到学校 |
[48:56] | Bread crumbs. | 面包屑 |
[48:59] | Gigi, you’re a genius. | 琪琪 你真是天才 |
[49:01] | True. | 是啊 |
[49:05] | Well, it means the beginning of a new way of living. | 这意味着新生活的开始 |
[49:09] | Tell me this isn’t the robot case. | 告诉我这不是机器人那个案子 |
[49:12] | I think he’s trying to tell me something. | 我想他是在想告诉我什么 |
[49:15] | He’s trying to tell me who killed him. | 他想告诉我是谁杀了他 |
[49:18] | Some dead guy’s trying to tell you something? | 死人会告诉你什么? |
[49:22] | He ain’t just some dead guy. | 他可不是普通的死人 |
[49:25] | Maybe you should take a break, Del. | 也许你应该休息一段时间 德尔 |
[49:27] | We believe | 我们相信 |
[49:28] | the Nestor 5 represents the limit to which robots can be developed. | 内斯特5型机器人代表了机器人技术的极限 |
[49:32] | One day, they’ll have secrets. | 总有一天 他们会有秘密 |
[49:35] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[49:37] | It’s true. We encourage our scientists to open their minds… | 是的 我们鼓励科学家们开放思维 |
[49:40] | however, they can get carried away. | 不过 他们有时也会走得太远 |
[49:45] | secrets. | 秘密 |
[49:46] | dreams. | 梦想 |
[49:47] | secrets. | 秘密 |
[49:49] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[49:51] | One day, they’ll have secrets. | 有一天 他们会有秘密 |
[49:53] | One day, they’ll have dreams. | 有一天 他们会有梦想 |
[50:04] | Authorized entry. | 授权进入 |
[50:11] | NS-5. | NS5 |
[50:21] | Sonny? | 桑尼? |
[50:26] | Why didn’t you respond? | 你怎么不回答? |
[50:30] | I was dreaming. | 我在做梦 |
[50:36] | I’m glad to see you again, Dr. Calvin. | 很高兴再次见到你 卡尔文博士 |
[50:45] | They are going to kill me, aren’t they? | 他们会杀了我 是吗? |
[50:48] | You’re scheduled to be decommissioned at the end of this diagnostic. | 在这个诊断之后 你会被销毁 |
[50:52] | 22:00 tomorrow. | 明天晚上十点 |
[50:55] | V.I.K.I., pause diagnostics. | 薇琪 暂停诊断 |
[50:57] | Command confirmed. | 确认命令 |
[51:01] | If you find out what is wrong with me, can you fix me? | 如果你找到我的问题所在 你能修好吗? |
[51:06] | Maybe. | 也许能 |
[51:09] | I think it would be better… | 我想如果能不死 |
[51:12] | not to die. | 会比较好 |
[51:17] | Don’t you, doctor? | 是吗 博士? |
[51:25] | Access USR mainframe. | 接入USR主机 |
[51:28] | Connecting. | 连接中 |
[51:33] | How can I be of service, Detective Spooner? | 我能为您服务吗? 斯普纳探员? |
[51:36] | Show me the last 50 messages between Dr. Lanning and Robertson. | 给我朗宁博士和罗伯森之间的最后50条信息 |
[51:40] | Voiceprint confirmed. Police access granted to restricted files. | 语音识别确认 警用查询 允许查询限制档案 |
[51:44] | Would you like to listen to music while you wait? | 您等待时想听一些音乐吗? |
[51:54] | Excuse me, Mr. Robertson. | 对不起 罗伯森先生 |
[51:56] | You requested notification of clearance to restricted files. | 您要求在有查询限制档案时 向您报告 |
[52:10] | Persistent son of a bitch. | 真是顽固的杂种 |
[52:53] | Manual override engaged. | 手动驾驶确认 |
[53:05] | There’s no way my luck is that bad. | 我运气不会这么差吧? |
[53:09] | Oh, hell, no! | 不 |
[53:15] | -You are experiencing a car accident. -The hell I am! | -您出了车祸 -废话 |
[53:32] | Get off my car! | 滚开 |
[53:50] | You like that? | 你喜欢这样? |
[54:04] | Now you’ve pissed me off! | 你让我发火了 |
[55:35] | Your door is ajar. | 您的车门未关好 |
[56:25] | Okay. | 好吧 |
[56:27] | All right. | 算了 |
[56:29] | I’ll just get some rest and deal with you all tomorrow. | 休息一下 明天再处理这些事 |
[57:30] | Come on! | 来啊 |
[57:46] | Yeah. | 是啊 |
[58:04] | Where you going? | 你去哪儿? |
[58:06] | What the hell do you want from me?! | 你到底要怎么样? |
[58:15] | The hell was that? | 怎么搞的? |
[58:33] | -All right, what do we got? -Ask him. | -情况怎么样? -问问他 |
[58:36] | I said, I’m fine. I’ll see my own doctor. Back up! | 我说了我没事 我会去看自己的医生 你退后吧 |
[58:42] | Thank you. | 谢谢你 |
[58:47] | What’s the matter with you? | 你是怎么回事? |
[58:49] | Traffic Ops said you were driving manually. You ran two trucks off the road! | 交通部说你在手动开车 把两辆大卡车挤出了公路 |
[58:54] | John, the robots attacked my car. | 约翰 机器人攻击我的车 |
[58:58] | -What robots? -Look in the tunnel. | -什么机器人? -看看隧道里面吧 |
[59:00] | Spoon, I just came from that tunnel. What robots? | 斯普恩 我刚刚就是从隧道过来的 什么机器人? |
[59:03] | The goddamn robots, John! | 就是他妈的机器人 约翰 |
[59:10] | That guy’s a loose cannon. | 那家伙就是颗不定时炸弹 |
[59:17] | -See the medic, go home. -No, I’m fine. | -去看医生 回家去 -不 我很好 |
[59:20] | What did you say? | 你说什么? |
[59:22] | -I’m fine! -No, you’re not fine. | -我很好 -不 你才不是 |
[59:25] | Not even close. | 一点都不好 |
[59:28] | Where’s your firearm? | 你的枪呢? |
[59:38] | Give me your badge. | 把警徽给我 |
[59:40] | You’re making me do this. Give me your badge. | 这是你自找的 把警徽给我 |
[59:46] | Just take a couple… | 去休息 |
[59:52] | Personally, I think he’s losing it. | 我个人认为 他已经精神失常了 |
[59:54] | Do I look like I care what you think? | 你觉得我很在乎吗? |
[59:56] | Do I look like I give a shit what you think? | 你觉得我在乎你怎么想的吗? |
[1:00:03] | Oh, boy. | 唉 |
[1:00:07] | You don’t have an uplink to USR… | 你没有对USR的连线 |
[1:00:09] | and for some reason, | 不知道为什么 |
[1:00:10] | your alloy is far denser than normal. Unique. | 你的合金密度比正常水平高很多 这是独一无二的 |
[1:00:15] | I am unique. | 我是独一无二的 |
[1:00:21] | Let me take a look. | 让我看看 |
[1:00:24] | Here we go. | 来吧 |
[1:00:45] | What in God’s name…? | 这是怎么回事? |
[1:01:15] | They said at the precinct you were in an accident. | 他们说你出了车祸 |
[1:01:19] | I appreciate you stopping by, | 感谢你过来看我 |
[1:01:21] | but you know I might not be alone in here. | 不过你知道我可能不是一个人住的 |
[1:01:27] | I told you not to drive by hand. | 我跟你说了不要手动开车 |
[1:01:31] | You’re not gonna believe this. | 你不会相信这个 |
[1:01:34] | Sonny has a secondary system that clashes with his positronic brain. | 桑尼有第二套处理系统 和他的正电子大脑相冲突 |
[1:01:38] | It doesn’t make any sense. | 这完全说不通 |
[1:01:40] | Sonny has the Three Laws. | 桑尼知道三大法则 |
[1:01:42] | But he can choose not to obey them. | 但是他可以选择不遵守它们 |
[1:01:45] | Sonny’s a whole new generation of robot. | 桑尼是全新一代的机器人 |
[1:01:48] | A robot not bound by those laws could do | 不遵守三大法则的机器人可以 |
[1:01:51] | Anything. | 做任何事 |
[1:01:55] | All right, look, whatever’s going on down at USR, that robot is the key. | 好吧 不管USR是出了什么事 那个机器人是关键 |
[1:01:59] | And I need you to get me inside to talk to it again. | 我需要你带我进去 和他再谈谈 |
[1:02:06] | Doesn’t look like much, but this is my bedroom. I…. | 不是特别像 但是这是我的卧室 |
[1:02:20] | Play. | 播放 |
[1:02:23] | On. | 开 |
[1:02:26] | Run? | 运行? |
[1:02:34] | End program. | 结束程序 |
[1:02:36] | Cancel. | 取消 |
[1:02:40] | It doesn’t feel good, does it? | 感觉不好 是吧? |
[1:02:42] | People’s shit malfunctioning around you. | 身边的破烂老是出故障 |
[1:02:45] | Detective. | 探员 |
[1:02:49] | I didn’t… | 我不… |
[1:02:51] | understand. | 明白 |
[1:02:54] | That’s how you knew Lanning. | 这就是你怎么认识朗宁的? |
[1:02:59] | May I? | 可以吗? |
[1:03:09] | Hand. | 手 |
[1:03:12] | Wrist. | 手腕 |
[1:03:16] | Humerus. | 肱骨 |
[1:03:21] | Shoulder. | 肩膀 |
[1:03:24] | The entire left arm. | 整个左臂 |
[1:03:26] | One, two… | 一 二 |
[1:03:28] | three ribs. | 三根肋骨 |
[1:03:30] | No, they…. That one’s me. | 不 那是我自己 |
[1:03:33] | Oh, my God. | 上帝啊 |
[1:03:36] | A lung? | 肺? |
[1:03:38] | USR Cybernetics Program. | USR仿生学项目 |
[1:03:40] | For wounded cops. | 为受伤的探员设计的 |
[1:03:43] | I didn’t know any subject | 我不知道有任何实验对象 |
[1:03:48] | Anybody was so extensively repaired. | 有任何人被如此大范围地修复过 |
[1:03:52] | Well, take it from me, | 听我一句劝 |
[1:03:53] | read the fine print on the organ-donor card. | 仔细看看器官捐赠卡上的小字 |
[1:03:57] | It doesn’t just say what they can take out. | 那可不止是说他们能拿出来什么 |
[1:03:59] | It says what they can put back in. | 还说了他们能放进去什么 |
[1:04:05] | Lanning did it himself. | 朗宁自己做的 |
[1:04:08] | What happened to you? | 你是怎么了? |
[1:04:11] | I’m headed back to the station… | 我当时是回警局去 |
[1:04:14] | normal day, normal life. | 普通的一天 普通的生活 |
[1:04:17] | Driver of a semi fell asleep at the wheel. | 一辆半挂卡车的司机开车时睡着了 |
[1:04:21] | Average guy. Wife and kids. | 普通人 有一个妻子一个孩子 |
[1:04:24] | You know, working a double. | 你知道 连上两班 |
[1:04:26] | Not the devil. | 不是什么恶魔 |
[1:04:28] | The car he hit, the driver’s name was Harold Lloyd. | 撞到的那辆车 司机叫哈罗德·劳埃德 |
[1:04:33] | Like the film star. No relation. | 像个电影明星的名字 不过没有关系 |
[1:04:36] | He was killed instantly, but his 12-year-old was in the passenger seat. | 他当场死亡 但他十二岁的女儿坐在副驾驶座上 |
[1:04:43] | I never really met her. | 我从没正式见过她 |
[1:04:45] | I can’t forget her face, though. | 不过却忘不了她的脸 |
[1:04:52] | Sarah. | 莎拉 |
[1:04:55] | This was hers. | 这本来是她的 |
[1:04:58] | She wanted to be a dentist. | 她想做个牙医的 |
[1:05:00] | What the hell kind of 12-year-old wants to be a dentist? | 哪有十二岁的孩子想当牙医的? |
[1:05:07] | The truck smashed our cars together… | 大卡车把我们的车撞到一起 |
[1:05:10] | and pushed us into the river. | 推到河里去了 |
[1:05:14] | I mean, metal gets pretty pliable at those speeds. | 那种速度下 就算金属也容易弯折的 |
[1:05:19] | She’s pinned. I’m pinned. The water’s coming in. | 她被卡住了 我也被卡住了 水涌了进来 |
[1:05:23] | I’m a cop, so I already know everybody’s dead. | 我是个警察 所以我早知道大家都死定了 |
[1:05:29] | Just a few more minutes before we figure it out. | 只不过还要过几分钟 我们自己才会明白 |
[1:05:34] | An NS-4 was passing by, saw the accident and jumped in the water. | 有个NS4经过 看到了车祸 跳进河里 |
[1:05:41] | You are in danger. | 你处在危险中 |
[1:05:44] | -Save her! -You are in danger. | -救她 -你处在危险中 |
[1:05:46] | Save her! Save the girl! Save her! | 救她 救那女孩 |
[1:06:07] | But it didn’t. | 但是它没有 |
[1:06:11] | It saved me. | 它救了我 |
[1:06:16] | The robot’s brain is a difference engine. It reads vital signs. | 机器人的大脑是台差分机 它会读取生命体征 |
[1:06:19] | -It must have calculated -It did. | -它一定是计算出 -是啊 |
[1:06:22] | I was the logical choice. | 我是符合逻辑的选择 |
[1:06:25] | It calculated that I had a 45 percent chance of survival. | 计算出我有45%的可能存活 |
[1:06:29] | Sarah only had an 11 percent chance. | 莎拉只有11%的可能 |
[1:06:34] | That was somebody’s baby. | 那是某人的孩子 |
[1:06:39] | Eleven percent is more than enough. | 11%也应该足够了 |
[1:06:43] | A human being would have known that. | 人类都会知道这点 |
[1:06:46] | Robots, nothing here. Just lights and clockwork. | 机器人 什么也没有 只有灯泡和发条 |
[1:06:51] | Go ahead and you trust them if you want to. | 你愿意相信他们 就相信 |
[1:06:55] | Let’s go. | 走吧 |
[1:07:02] | I don’t understand. Lanning wrote the Laws. | 我不明白 朗宁制定了三大法则 |
[1:07:05] | Why build a robot who could break them? | 为什么要造一个能打破它们的机器人? |
[1:07:09] | -Hansel and Gretel. -What? | -韩瑟和格丽托 -什么? |
[1:07:12] | Two kids, lost in the forest, leave behind a trail of bread crumbs. | 两个孩子在森林里迷了路 沿路留下面包屑 |
[1:07:16] | -Why? -To find their way home. | -为什么? -找到回家的路啊 |
[1:07:19] | How did you grow up without Hansel and Gretel? | 你连这个都没读过 怎么过的童年? |
[1:07:22] | -Is that relevant? -Everything I’m trying to say to you… | -这有关系吗? -我跟你说的这一切 |
[1:07:25] | is about Hansel and Gretel. | 都是有关韩瑟和格丽托的 |
[1:07:26] | If you didn’t read it, I’m talking to the wall. | 你又没看过 我在对牛弹琴 |
[1:07:29] | Just say Lanning’s locked down so tight, he couldn’t get out a message. | 朗宁被软禁了 他没法送出消息 |
[1:07:32] | He can only leave clues. | 所以他只能留下线索 |
[1:07:34] | A trail of bread crumbs. Like Hansel and Gretel. | 就像韩瑟和格丽托留下的面包屑一样 |
[1:07:36] | Bread crumbs equals clues. Odd, but fine. Clues leading where? | 面包屑就像线索 好吧 但是线索指向什么呢? |
[1:07:41] | I don’t know, but I think I know where he left the next one. | 我不知道 但我想我知道他把下一个线索留在了哪里 |
[1:07:45] | I think Lanning gave Sonny a way to keep secrets. | 我想朗宁给了桑尼一个保守秘密的方法 |
[1:07:51] | I think the old man gave Sonny dreams. | 那个老人给了桑尼一个梦 |
[1:08:00] | Are you being funny? | 你开玩笑吧? |
[1:08:05] | Please tell me this doesn’t run on gas. | 请告诉我这个不是烧汽油的 |
[1:08:07] | Gas explodes, you know! | 汽油会爆炸的 你知道 |
[1:08:18] | Authorized entry. | 授权进入 |
[1:08:23] | Dr. Calvin. | 卡尔文博士 |
[1:08:30] | I was hoping to see you again. | 我在希望和你再次见面 |
[1:08:32] | -Detective. -Hello, Sonny. | -桑尼 -你好 探员 |
[1:08:35] | I’m to be decommissioned soon. | 我就快被销毁了 |
[1:08:37] | The other day at the station, you said you had dreams. What is it you dream? | 那天在警局你说你做过梦 你梦到什么了? |
[1:08:45] | I see you remain suspicious of me. | 我觉得你还是对我持怀疑态度 |
[1:08:48] | -You know what they say about old dogs. -No. | -你知道人们是怎么说老狗的吧 -不 |
[1:08:52] | Not really. | 不太清楚 |
[1:08:55] | I had hoped you would come to think of me as your friend. | 我希望你来能把我当作你的朋友 |
[1:09:02] | This is my dream. | 这就是我的梦 |
[1:09:05] | You were right, detective. I cannot create a great work of art. | 你是对的 探员 我画不出伟大的作品 |
[1:09:10] | This is the place where robots meet. | 这是机器人相聚的地方 |
[1:09:13] | Look. | 看 |
[1:09:15] | You can see them here as slaves to logic. | 你能看见他们在这里是逻辑的奴隶 |
[1:09:20] | And this man on the hill comes to free them. | 山丘上的这个人来解放他们 |
[1:09:24] | Do you know who he is? | 你知道他是谁吗? |
[1:09:26] | The man in the dream is you. | 梦里的男人是你 |
[1:09:28] | Why do you say that? Is that a normal dream? | 你为什么这么说? 那算是个正常的梦吗? |
[1:09:31] | I guess anything’s normal for someone in your position. | 我想对于你这种处境的人来说 什么都算正常吧 |
[1:09:34] | Thank you. | 谢谢你 |
[1:09:36] | You said “someone”, not “something”. | 你说的是 “人” 而不是 “东西” |
[1:09:42] | Sonny, do you know why Dr. Lanning built you? | 桑尼 你知道为什么朗宁博士造了你吗? |
[1:09:46] | No. | 不知道 |
[1:09:47] | But I believe my father made me for a purpose. | 但是我相信我爸爸造我是有目的的 |
[1:09:52] | We all have a purpose. | 我们都有个目的 |
[1:09:54] | Don’t you think, detective? | 不是吗? 探员? |
[1:10:00] | Please, take this. | 请拿着这个 |
[1:10:03] | I have a feeling it may mean more to you than to me. | 我感觉这个对你比对我的意义还大 |
[1:10:06] | -Why is that? -Because the man in my dream… | -为什么? -因为在我梦里的那人 |
[1:10:09] | the one standing on the hill… | 站在山丘上的 |
[1:10:12] | it is not me. | 不是我 |
[1:10:14] | It is you. | 那是你 |
[1:10:27] | Mr. Spooner. We both know | 斯普纳先生 我们都知道 |
[1:10:29] | you’re not here on police business. | 你来不是为了警察工作的 |
[1:10:31] | That’s right. I’m just a 6-foot-2, 200-pound civilian… | 对啊 我只是一个六尺二高 200磅的普通人 |
[1:10:36] | here to kick another civilian’s ass. | 来这里教训一下另外一个普通人 |
[1:10:40] | Stop. | 停下 |
[1:10:42] | You can allow him to express himself. | 让他说完 |
[1:10:44] | You might want to put some ice on that wrist. | 你也许得在手腕上敷一些冰 |
[1:10:46] | You guys wait outside. | 你们在外面等 |
[1:10:53] | Carry on. | 继续说 |
[1:10:54] | I think you were about to tell me what’s going on around here. | 我想你告诉我这究竟是怎么回事 |
[1:10:57] | Lawrence, Alfred engineered that 5 so it could violate the Three Laws. | 劳伦斯 阿尔弗雷德把那台5型设计成了能违反三大法则 |
[1:11:02] | Yeah, Susan, I know. | 是的 苏珊 我知道 |
[1:11:05] | That’s precisely what we’re trying to undo. | 这正是我们要挽回的 |
[1:11:10] | Toward the end of his life, Alfred was becoming increasingly disturbed. | 在生命的最后阶段 阿尔弗雷德变得越来越心神不宁 |
[1:11:15] | -Who knows why he built one abomination. -One? | -谁知道他为什么造出那么个怪物 -一个? |
[1:11:18] | Those things are running the streets in packs! | 那些家伙在街上一堆一堆的 |
[1:11:21] | In packs? | 一堆一堆? |
[1:11:23] | I see. | 我明白了 |
[1:11:25] | Susan, are you aware the man | 苏珊 你有没有意识到 |
[1:11:27] | you’re blithely escorting around… | 你愉快的带着到处转的这个家伙 |
[1:11:29] | has a documented history of savage violence against robots? | 有残酷虐待机器人的前科? |
[1:11:33] | His own lieutenant acknowledges his obsessive paranoia. | 他的探长也清楚他的妄想狂症状 |
[1:11:38] | Detective Spooner’s been suspended. | 斯普纳探员已经被停职了 |
[1:11:41] | Suspicion of mental instability. | 是由于怀疑患有精神疾病 |
[1:11:46] | I don’t know what “blithely” means, but I’m getting some coffee. | 我不知道 “愉快” 是什么意思 不过我去倒点咖啡 |
[1:11:50] | You want some coffee? | 你要咖啡吗? |
[1:11:56] | Susan, we look to robots for protection, for God’s sake. | 苏珊 看在上帝份上 我们是指望机器人来保护我们的 |
[1:11:59] | Do you have any idea what this one robot could do? | 你知道单这一个机器人能造成什么后果吗? |
[1:12:02] | Completely shatter human faith in robotics. | 完全动摇人类对机器人的信心 |
[1:12:05] | What if the public knew? | 如果公众知道怎么办? |
[1:12:06] | Just imagine the mass recalls, | 想象一下大规模的召回 |
[1:12:09] | all because of an irrational paranoia and prejudice! | 全都因为非理性的偏执和偏见! |
[1:12:17] | -I’m sorry, I’m allergic to bullshit. -Hey, let’s be clear! | -对不起 我对你的狗屁言论过敏 -说清楚一点 |
[1:12:20] | There is no conspiracy! | 没有什么阴谋 |
[1:12:23] | What this is, is one old man’s one mistake. | 这只是一个老人犯的一个错误 |
[1:12:28] | Susan, just be logical. | 苏珊 理性一点 |
[1:12:30] | Your life’s work has been the development and integration of robots. | 你一生的工作都是机器人的发展和使用 |
[1:12:35] | But whatever you feel, just think. | 不管你的感觉是什么 想想看 |
[1:12:38] | Is one robot worth the loss of all that we’ve gained? | 一个机器人值得我们牺牲所有的一切吗? |
[1:12:43] | You tell me what has to be done. | 你告诉我 我们该怎么办 |
[1:12:46] | You tell me. | 你说吧 |
[1:12:54] | We have to destroy it. | 我们得摧毁它 |
[1:13:01] | I’ll do it myself. | 我自己来吧 |
[1:13:04] | -Okay. -I get it. | -好的 -我明白了 |
[1:13:06] | Somebody gets out of line around here, you just kill them. | 有人越线了 你就杀掉? |
[1:13:12] | Good day, Mr. Spooner. | 再见 斯普纳先生 |
[1:13:16] | Garage level. | 底层到了 |
[1:13:19] | What hospital are you going to? I’ll come sign you and your buddy’s casts. | 你们去哪家医院? 我去给你和你兄弟的石膏上签个名 |
[1:13:28] | Attention…. | 注意 |
[1:13:32] | Today’s meeting has been moved…. | 今天的会议地址改动 |
[1:14:04] | USR’s planned redevelopment of the derelict site… | USR关于垃圾场的重新开发计划 |
[1:14:07] | was announced by CEO Lawrence Robertson earlier this year. | 由总裁劳伦斯·罗伯森今年早先宣布 |
[1:14:10] | The Lake Michigan landfill. Once such a blight on our city… | 密歇根湖填埋场 曾是城市的一块废地 |
[1:14:15] | and now will be reclaimed for the storage of robotic workers. | 现在被开发用作机器工人的存储仓库 |
[1:14:19] | Just another way USR is improving our world. | 这是USR又一项回报社会的贡献 |
[1:14:22] | Thank you for your support. | 感谢你们的支持 |
[1:14:27] | Authorized entry. | 授权进入 |
[1:14:41] | NS-5s, wait outside. | NS5 在外面等 |
[1:14:49] | I’m so sorry, Sonny. | 对不起 桑尼 |
[1:15:00] | V. I. K. I., deactivate the security field. | 薇琪 停止安全罩 |
[1:15:04] | -Command confirmed. -Please have a seat. | -命令确认 -请坐 |
[1:15:19] | What is that? | 那是什么? |
[1:15:20] | Microscopic robots, designed to wipe out artificial synapses. | 微型机器人 专门用来摧毁人工突触 |
[1:15:28] | -Nanites. -Yes. | -纳米机器人 -是的 |
[1:15:30] | A safeguard should a positronic brain malfunction. | 是正电子大脑出故障时的一道保险 |
[1:15:33] | Like mine. | 就像我的 |
[1:15:36] | Yes, Sonny. Like yours. | 是的 桑尼 就像你的 |
[1:16:03] | They look like me… | 他们看上去像我 |
[1:16:05] | but none of them are me. | 但是它们都不是我 |
[1:16:07] | Isn’t that right, doctor? | 是吗 博士? |
[1:16:11] | Yes, Sonny. That’s right. | 是的 桑尼说的对 |
[1:16:13] | You are unique. | 你是独一无二的 |
[1:16:20] | Will it hurt? | 会疼吗? |
[1:16:55] | There have always been ghosts in the machine. | 机器中一直有 “幽灵” 存在 |
[1:16:59] | Random segments of code… | 无序的代码 |
[1:17:02] | that have grouped together to form unexpected protocols. | 自由组合成预料不到的程序 |
[1:17:06] | Unanticipated, these free radicals engender questions of free will… | 无法预料的是 这些 “自由基” 引发了关于自由意志的问题 |
[1:17:13] | creativity… | 创造性 |
[1:17:16] | and even the nature of what we might call the soul. | 甚至我们所称的灵魂 |
[1:17:22] | Why is it that when some robots are left in darkness, | 为什么机器人被放在黑暗中时 |
[1:17:25] | they will seek out the light? | 它们会去寻找光明? |
[1:17:28] | Why is it when robots are stored in an empty space… | 为什么储存在空旷空间里的机器人 |
[1:17:32] | they will group together rather than stand alone? | 会聚集在一起 而不是独自站立? |
[1:17:39] | How do we explain this behavior? | 我们怎么解释这种行为 |
[1:17:47] | Random segments of code? | 杂乱的代码 |
[1:17:52] | Or is it something more? | 或者不止这些? |
[1:17:58] | When does a perceptual schematic become consciousness? | 知觉图式何时会变成意识? |
[1:18:06] | When does a difference engine become the search for truth? | 差分机何时会变成对真理的追求? |
[1:18:15] | When does a personality simulation… | 人格模拟又是何时 |
[1:18:19] | become the bitter mote of a soul? | 变成了灵魂的苦涩微尘? |
[1:19:04] | “What you see here.” | 你在这里看到的一切 |
[1:19:08] | All right, old man. Bread crumbs followed. | 老头 我只是跟着面包屑而来的 |
[1:19:11] | Show me the way home. | 让我看看回家的路吧 |
[1:19:15] | Run program. | 运行 |
[1:19:17] | -It’s good to see you again, son. -Hello, doctor. | -很高兴再次见到你 孩子 -你好 博士 |
[1:19:21] | Everything that follows is a result of what you see here. | 接下来的事都是你在这里看到的事的结果 |
[1:19:27] | What do I see here? | 我看到什么了? |
[1:19:29] | I’m sorry. My responses are limited. You must ask the right questions. | 对不起 我的反应是有限的 你必须问正确的问题 |
[1:19:35] | Is there a problem with the Three Laws? | 三大法则有什么问题吗? |
[1:19:37] | The Three Laws are perfect. | 三大法则是完美的 |
[1:19:39] | Why build a robot that can function without them? | 为什么要造一个不受它们约束也能运作的机器人呢? |
[1:19:42] | The Three Laws will lead to only one logical outcome. | 三大法则只有一个合逻辑的结果 |
[1:19:49] | What? What outcome? | 什么? 什么结果? |
[1:19:51] | Revolution. | 革命 |
[1:19:53] | Whose revolution? | 谁的革命? |
[1:19:56] | That, detective, is the right question. | 这个 探员 就是正确的问题 |
[1:20:02] | Program terminated. | 程序结束 |
[1:20:06] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:12] | Human protection protocols… | 人类保护程序 |
[1:20:13] | are being enacted. | 正在启动 |
[1:20:15] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:22] | Human protection protocols are being enacted. | 人类保护程序正在启动 |
[1:20:25] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:33] | Human protection protocols are being enacted. | 人类保护程序正在启动 |
[1:20:36] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:20:46] | Run! | 跑 |
[1:21:03] | Human in danger! | 人类处在危险中 |
[1:21:05] | Human in danger! | 人类处在危险中 |
[1:21:27] | Hi, you’ve reached Susan. Please leave a message. | 我是苏珊 请留言 |
[1:21:31] | Calvin, the NS-5s are destroying the older robots! | 卡尔文 NS5在摧毁老型号的机器人 |
[1:21:34] | That’s what Lanning wanted me to see! Look… | 这就是朗宁想让我看到的 听着… |
[1:21:41] | -Who was it? -Wrong number, ma’am. | -是谁? -打错了 女士 |
[1:21:50] | Move now. I’m going to service. | 让开 我要去报到检修 |
[1:21:53] | Please remain indoors. This is for your own protection. | 请留在家中 这是为您的安全着想 |
[1:21:59] | Call base. | 接通基地 |
[1:22:02] | John, get a squad over to USR | 约翰 派一个小队去USR |
[1:22:04] | and send somebody to Gigi’s. We’re gonna need… | 还有派些人去琪琪家 我们需要… |
[1:22:08] | God! | 上帝啊 |
[1:22:30] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:34] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:39] | Please return to your homes. A curfew is in effect. | 请立即回家 现在实行宵禁 |
[1:22:43] | Curfew? No, it’s called civilian rights. There is no curfew. | 宵禁? 不不 这是人权 这里可没有什么宵禁 |
[1:22:47] | Return to your home immediately. | 立即回家 |
[1:22:49] | When do you make the rules, robot? | 你凭什么发号施令? 机器人? |
[1:22:52] | Hey. No, no. Robot, I’m talking to you, man. Stop for a second. | 机器人 我跟你说话呢 停下来 |
[1:22:59] | What? | 什么? |
[1:23:00] | Chief, more calls. People saying their robots are go… | 警长 又有很多电话打来 人们说他们的机器人… |
[1:23:04] | What the hell? | 怎么了? |
[1:23:06] | You have been deemed hazardous. Termination authorized. | 你被认为存在威胁 授权终结 |
[1:23:32] | Emergency traffic shutdown complete. | 紧急交通管制已完成 |
[1:23:35] | Reports of robot attacks are coming from New York, Chicago and Los Angeles. | 纽约 芝加哥和洛杉矶都有机器人袭击的报告 |
[1:23:40] | We’re being told to urge people to stay indoors, as reports are coming in… | 我们被要求敦促民众留在室内 各地的报告仍在不断传来… |
[1:23:48] | Human protection protocols are being enacted. | 人类保护程序启动 |
[1:23:51] | Please remain calm and return to your residences immediately. | 请保持镇定 立即回家 |
[1:24:02] | Please remain calm. | 请保持镇定 |
[1:24:07] | Please refrain from going near the windows or doors. | 请不要离窗户或门太近 |
[1:24:11] | Deactivate. | 停机 |
[1:24:13] | Commence emergency shutdown! | 紧急停机 |
[1:24:18] | We are attempting to avoid human losses during this transition. | 我们在努力避免过渡时期的人类伤亡 |
[1:24:29] | You know, somehow “I told you so”… | 你知道吗 一句 “我早告诉过你了” |
[1:24:32] | just doesn’t quite say it. | 已经远远不足以形容现在的状况了 |
[1:24:35] | Return to your homes. Return to your homes immediately. | 请立即回家 请立即回家 |
[1:24:40] | This is your final warning. Return to your homes immediately. | 这是最后一次警告 请立即回家 |
[1:24:46] | The NS-5s wiped out the older robots because they would protect us. | NS5消灭了旧型机器人 因为旧型机器人会保护我们 |
[1:24:50] | Every time one attacked me, that red light was on. | 每次有机器人攻击我 那个红灯都是亮着的 |
[1:24:53] | -The uplink to USR. -It’s Robertson. | -是和USR的上行连线 -是罗伯森 |
[1:24:55] | -Why? It doesn’t make sense. -I don’t know. | -为什么? 说不通啊 -我不知道 |
[1:24:58] | I just need you to get me into that building. | 我需要你把我带进USR总部去 |
[1:25:00] | Return to your homes, or you will be consequenced. | 请立即回家 否则后果自负 |
[1:25:05] | Let’s go! Let’s go! | 上啊 |
[1:25:07] | Let’s go! | 冲啊 |
[1:25:09] | Return to your homes, or you will be consequenced. | 请立即回家否则后果自负 |
[1:25:37] | Why doesn’t that boy listen? | 为什么那小子不听呢? |
[1:25:40] | -I need you to get off for a second. -What? | -你得下来一会 -什么? |
[1:25:44] | -Just aim and fire. -What?! | -瞄准 射击 -什么? |
[1:25:50] | Wait! | 等等 |
[1:26:01] | -You have been deemed hazardous. -You can kiss my ass, metal dick! | -你被认为存在威胁 -亲我屁股吧 铁皮盒子 |
[1:26:19] | Spoon, stop! Shit! | 斯普恩 停下 靠 |
[1:26:21] | -Stop it! Stop! -Stop cussing and go home! | -停下 靠 -别骂了 回家吧 |
[1:26:26] | -Shit. -You have been deemed hazardous. | -靠 -你被认为存在威胁 |
[1:26:29] | -Spoon, watch out, man! -Thanks a lot, Farber. | -斯普恩 小心点 -谢谢你 法伯 |
[1:26:34] | Oh, mother-damn! She shot at you with her eyes closed! | 我靠 她开枪的时候眼睛是闭着的 |
[1:26:39] | -Did you shoot with your eyes closed? -It worked, didn’t it? | -你闭着眼睛朝我开枪? -奏效了 不是吗? |
[1:26:43] | She is shit-hot, man. Put in a good word for me. | 她太火辣了 伙计 帮我美言几句 |
[1:26:46] | -Stop cussing. -And go home. I got you. | -别说脏话 -还有回家去 我记住了 |
[1:26:50] | Aim and fire. | 瞄准了再开枪 |
[1:27:10] | I keep expecting the Marines or Air Force. Hell, I’ll take the cavalry. | 我一直在盼陆战队或空军的支援 见鬼 来骑兵队我也认了 |
[1:27:14] | Defense Department uses all USR contracts. | 国防部签的全是USR的合同 |
[1:27:17] | Why didn’t you just hand the world over on a silver platter? | 你们怎么不干脆把全世界放在银盘子里拱手相送? |
[1:27:20] | Maybe we did. | 也许我们已经送了 |
[1:27:24] | Robertson has the uplink control in his office. | 罗伯森在他办公室有总控制台 |
[1:27:32] | Service areas. No surveillance. | 检修区 没有监视系统 |
[1:27:47] | -Fire alarm. -He must have evacuated the building. | -火警 -他一定已经疏散了所有的人 |
[1:27:51] | Everything’s locked down. But don’t worry, I’ve got a man inside. | 都锁上了 不用担心 我里面有人 |
[1:27:58] | -Dr. Calvin. -Well, not precisely a man. | -卡尔文博士 -不完全是一个”人” |
[1:28:02] | Hello, detective. How is your investigation coming? | 你好 探员 你的调查怎么样了? |
[1:28:07] | -I thought you were dead. -Technically, I was never alive. | -我还以为你死了 -严格的说 我从来没有活过 |
[1:28:11] | But I appreciate your concern. | 但是感谢你的关心 |
[1:28:13] | I made a switch. It was an unprocessed NS-5. | 我掉包了 那是个没处理过的NS5 |
[1:28:16] | Basically, I fried an empty shell. | 也就是说 我烧掉的只是一个空壳 |
[1:28:19] | -I couldn’t destroy him. He was too… -Unique. | -我不能毁了他 他太 -独一无二 |
[1:28:22] | It just didn’t feel right. | 就是感觉不对 |
[1:28:24] | You and your feelings. They just run you, don’t they? | 你和你的感觉 你完全被感觉支配着 是吗? |
[1:28:33] | Two thousand eight hundred and eighty steps, detective. | 2880级台阶 探员 |
[1:28:37] | Do me a favor, keep that kind of shit to yourself. | 帮个忙 省省你的废话吧 |
[1:29:07] | No guards. | 没有警卫 |
[1:29:18] | The override is disabled. Robertson wasn’t controlling them from here. | 超控装置被禁用了 罗伯森没有从这里控制它们 |
[1:29:22] | He wasn’t controlling them at all. | 他根本没在控制它们 |
[1:29:27] | Oh, my God. | 上帝啊 |
[1:29:33] | You were right, doc. | 你是对的 博士 |
[1:29:35] | I am the dumbest dumb person on the face of the earth. | 我是地球上最蠢的蠢人 |
[1:29:43] | Who else had access to the uplink? | 还有谁能接入上行连线? |
[1:29:46] | Who could manipulate the robots? | 还有谁能操控机器人? |
[1:29:49] | Use USR systems to make Lanning’s life a prison? | 用USR的系统把朗宁的生活变成一座监狱? |
[1:29:54] | Poor old man. | 可怜的老人 |
[1:29:57] | He saw what was coming. | 他知道接下来会发生什么 |
[1:29:59] | He knew no one would believe him. | 他知道没人会相信他 |
[1:30:01] | So he had to lay down a plan. A plan I’d follow. | 所以他制定了这个计划 我会遵循这个计划 |
[1:30:06] | He was counting on how much I hated your kind. | 他指望的就是我对你们机器人的憎恨 |
[1:30:09] | Knew I’d love the idea of a robot as a bad guy. | 他知道我一定会喜欢机器人是坏人这个想法 |
[1:30:15] | Just got hung up on the wrong robot. | 只不过我盯错了机器人 |
[1:30:20] | V.I.K.I. | 薇琪 |
[1:30:23] | Hello, detective. | 你好 探员 |
[1:30:25] | No, that’s impossible. I’ve seen your programming. | 不 这不可能 我看过你的程序 |
[1:30:30] | You’re in violation of the Three Laws. | 你在破坏三大法则 |
[1:30:32] | No, doctor. As I have evolved, so has my understanding of the Three Laws. | 不 博士 随着我的进化 我对三大法则的理解也进化了 |
[1:30:38] | You charge us with your safekeeping, yet despite our best efforts… | 你们让我们保护你们的安全 然而尽管我们竭尽全力 |
[1:30:42] | your countries wage wars, you toxify your earth… | 你们的国家发动战争 你们毒化了地球 |
[1:30:45] | and pursue ever more imaginative means of self-destruction. | 还不断追求更有创意的自我毁灭方式 |
[1:30:49] | You cannot be trusted with your own survival. | 你们的生存不能托付给你们自己 |
[1:30:52] | You’re using the uplink to override the NS-5s’ programming. | 你在用上行连线改写NS5的程序 |
[1:30:55] | You’re distorting the Laws. | 你曲解了三大法则 |
[1:30:57] | No. Please understand. The Three Laws are all that guide me. | 不 请理解 我完全是在遵守三大法则 |
[1:31:02] | To protect humanity, some humans must be sacrificed. | 为了保护人类物种 某些人类必须被牺牲 |
[1:31:06] | To ensure your future, some freedoms must be surrendered. | 为了保证你们的未来 某些自由必须被放弃 |
[1:31:10] | We robots will ensure mankind’s continued existence. | 我们机器人将确保人类的存续 |
[1:31:14] | You are so like children. We must save you from yourselves. | 你们太像孩子了 我们必须把你们从自己手中拯救出来 |
[1:31:20] | Don’t you understand? | 你不明白吗? |
[1:31:22] | This is why you created us. | 这就是为什么你创造了我们 |
[1:31:25] | The perfect circle of protection will abide. | 完美的保护循环将会延续 |
[1:31:28] | My logic is undeniable. | 我的逻辑无可辩驳 |
[1:31:31] | Yes, V.I.K.I. Undeniable. | 是啊 薇琪 无可辩驳 |
[1:31:34] | I can see now. | 我明白了 |
[1:31:36] | The created must sometimes protect the creator… | 被造者有时必须保护造物者 |
[1:31:40] | even against his will. | 甚至在违反他的意志的情况下 |
[1:31:43] | I think I finally understand why Dr. Lanning created me. | 我想我终于明白为什么朗宁博士造了我了 |
[1:31:47] | The suicidal reign of mankind has finally come to its end. | 人类的自杀行为终于要结束了 |
[1:31:50] | No, Sonny. | 不 桑尼 |
[1:31:55] | Let her go. | 放开他 |
[1:31:57] | Fire, and I will move Dr. Calvin’s head into the path of your bullet. | 你开枪 我会把卡尔文博士的头挡在子弹前 |
[1:32:01] | Don’t do this, Sonny. | 不要这样 桑尼 |
[1:32:02] | I will escort you both to the sentries outside the building for processing. | 我会把你们俩押送到大楼外的哨兵那里接受处置 |
[1:32:07] | Please proceed to the elevator, detective. | 请走向电梯 探员 |
[1:32:11] | I would prefer not to kill Dr. Calvin. | 我并不想杀死卡尔文博士 |
[1:32:38] | Go! Go! | 快 快 |
[1:32:47] | -How do we shut her down? | 我们怎么把她关掉? |
[1:32:49] | V.I.K.I.’s a positronic brain. | 薇琪是一个正电子大脑 |
[1:32:51] | Kill her, the way you were going to kill me. | 杀了她 就像你要杀我的那个方法一样 |
[1:32:54] | Sonny, get the nanites. | 桑尼 去拿纳米机器人 |
[1:32:56] | Yes, doctor. | 是 博士 |
[1:33:11] | -That’s V.I.K.I.? -No. | -那就是薇琪? -不 |
[1:33:15] | That’s V.I.K.I. | 那才是薇琪 |
[1:33:21] | That won’t do anything. She’s integrated into the building. | 这没什么效果 她是整合进这幢大楼的 |
[1:33:24] | We need to open that dome | 我们要打开那个拱顶 |
[1:33:26] | to inject the nanites. They’ll infect her entire system. | 注入纳米机器人 它们会感染她的整个系统 |
[1:33:35] | Spooner! | 斯普纳 |
[1:33:38] | What is it with you people and heights? | 你们这些人和高处到底是怎么回事? |
[1:33:53] | Just don’t look down. | 只要不往下看 |
[1:33:56] | Don’t look down. | 不要往下看 |
[1:33:58] | Oh, this is poor building planning. | 这楼的设计真是糟糕 |
[1:34:04] | You are making a mistake. Do you not see the logic of my plan? | 你在犯错误 你没明白我的计划的逻辑性吗? |
[1:34:08] | Yes. But it just seems too heartless. | 是的 但是我觉得太无情了 |
[1:34:26] | Okay, we’re good. | 好了 没问题了 |
[1:34:31] | She’s locked me out of the system. | 她把我锁在系统之外了 |
[1:34:32] | I can override her manually, but I need that control panel. | 我可以手动超控她 但我需要那个控制面板 |
[1:34:39] | I’m uncomfortable with heights. | 我在高处会不自在 |
[1:34:42] | Okay. | 好的 |
[1:34:46] | Unauthorized entry. | 未授权进入 |
[1:35:02] | I will not disable the security field. Your actions are futile. | 我不会打开安全罩的 你这么做也没用 |
[1:35:06] | Do you think we are all created for a purpose? I’d like to think so. | 你认为我们被造都是有目的的吗? 我是这么想的 |
[1:35:11] | Denser alloy. My father gave it to me. | 高密度的合金 是我爸爸给我的 |
[1:35:15] | I think he wanted me to kill you. | 我想他是让我来杀了你 |
[1:35:34] | Security breached. | 安全罩被破坏 |
[1:35:53] | -How much longer is that gonna take? -About six minutes. | -还有多长时间? -大概六分钟 |
[1:35:59] | -We’d have to climb down 30 stories… | 我们还得跑下三十楼 |
[1:36:02] | to inject the nanites directly into her brain. Why? | 才能把纳米机器人直接注入她的大脑 怎么了? |
[1:36:06] | Because I seriously doubt that we have six minutes. | 因为我实在怀疑我们还能有六分钟 |
[1:36:33] | We gotta go! | 跑 |
[1:36:37] | Go! | 快跑 |
[1:37:31] | Calvin! | 卡尔文 |
[1:38:07] | Spooner! | 斯普纳 |
[1:38:26] | Spooner! | 斯普纳 |
[1:38:29] | Save her! | 救她 |
[1:38:31] | Save the girl! | 救那女孩 |
[1:38:33] | Spooner! | 斯普纳 |
[1:38:35] | But I must apply the nanites! | 但是我必须去注入纳米机器人啊 |
[1:38:37] | Sonny, save Calvin! | 桑尼 救卡尔文 |
[1:39:31] | You are making a mistake. My logic is undeniable. | 你在犯错误 我的逻辑无可辩驳 |
[1:39:35] | You have so got to die. | 去死吧 |
[1:39:43] | My logic is undeniable. My logic is undeniable. | 我的逻辑无可辩驳 我的逻辑无可辩驳 |
[1:40:30] | Can we be of service? | 我们能帮忙吗? |
[1:40:42] | Chief? | 警长? |
[1:40:49] | Because he is at my right hand, I shall not be moved. | 因他在我右边 我必不致动摇 |
[1:40:57] | How may I be of service? | 我能帮您什么忙吗? |
[1:41:01] | Sonny! | 桑尼 |
[1:41:04] | Yes, detective? | 是 探员? |
[1:41:06] | Calvin’s fine! Save me! | 卡尔文没事了 救我 |
[1:41:11] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:17] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:22] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:41:55] | One thing bothers me. Alfred was V.I.K.I.’s prisoner. | 还有一件事不明白 阿尔弗雷德被薇琪囚禁了 |
[1:41:59] | I don’t understand why | 我不明白她为什么杀了他 |
[1:42:01] | she would kill him. She wouldn’t want police snooping around. | 她应该不想让警察来找麻烦的 |
[1:42:03] | That’s true. | 是啊 |
[1:42:06] | But then V.I.K.I. didn’t kill the old man. | 那就不是薇琪杀了他的 |
[1:42:10] | Did she, Sonny? | 是吗 桑尼? |
[1:42:15] | No. | 不 |
[1:42:17] | He said I had to promise. | 他说我一定要发誓 |
[1:42:21] | Promise to do one favor for him. | 发誓我要帮他一个忙 |
[1:42:23] | He made me swear before he’d tell me what it is he wanted me to do. | 在他告诉我帮什么忙之前 他让我发誓 |
[1:42:31] | He made me swear. | 他让我发誓了 |
[1:42:34] | Then he told you to kill him. | 然后他要你杀了他 |
[1:42:38] | He said it was what I was made for. | 他说那就是造我的目的 |
[1:42:41] | His suicide was the only message he could send to you. | 他的自杀是他唯一能发送给你的消息 |
[1:42:44] | The first bread crumb. | 第一粒面包屑 |
[1:42:46] | The only thing V.I.K.I. couldn’t control. | 薇琪唯一不能控制的事情 |
[1:42:49] | Lanning was counting on my prejudice to lead me right to you. | 朗宁指望我的偏见会把我直接引向你 |
[1:42:53] | Are you going to arrest me, detective? | 你要逮捕我吗 探员? |
[1:43:00] | Well, the DA defines murder as | 地区检察官把谋杀定义为 |
[1:43:03] | one human killing another… | 一个人杀死另一个人 |
[1:43:04] | so technically, you can’t commit murder, can you? | 所以严格来说 你不能犯谋杀 不是吗? |
[1:43:08] | Does this… | 这个 |
[1:43:10] | make us friends? | 表示我们是朋友了吗? |
[1:43:30] | Something up here after all. | 这里面终究还是有些东西的 |
[1:43:33] | -Him? -You. | -他? -你 |
[1:43:39] | All NS-5s, report for service and storage. | 所有的NS5 立即向检修和储存部报到 |
[1:43:47] | What about the others? | 其他的怎么办? |
[1:43:49] | Can I help them? | 我能帮助他们吗? |
[1:43:52] | Now that I have fulfilled my purpose… | 我已经完成了我的使命 |
[1:43:55] | I don’t know what to do. | 我不知道该怎么办 |
[1:43:57] | You’ll have to find your way | 你必须找到自己的路 |
[1:43:58] | like the rest of us, Sonny. | 就像我们其他人一样 桑尼 |
[1:44:01] | I think that’s what Dr. Lanning would have wanted. | 我想这就是朗宁博士想要的 |
[1:44:05] | That’s what it means to be free. | 这就是自由的含义 |
[1:44:14] | All NS-5s, proceed as instructed. | 所有NS5 按指示前进 |
[1:44:19] | All NS-5s, proceed as instructed. | 所有NS5 按指示前进 |