Very soon, unimaginably powerful technologies will remake our lives. This could have dangerous consequences, especially because
Question
Very soon, unimaginably powerful technologies will remake our lives. This could have dangerous consequences, especially because we may not even understand the basic science underlying them. There’s a growing gap between our technological capability and our underlying scientific understanding. We can do very clever things with the technology of the future without necessarily understanding some of the science underneath, and that is very dangerous.
The technologies that are particularly dangerous over the next hundred years are nanotechnology, artificial intelligence and biotechnology. The benefits they will bring are beyond doubt but they are potentially very dangerous. In the field of artificial intelligence there are prototype designs for something that might be 50,000 million times smarter than the human brain by the year 2010. The only thing not feasible in the film Terminator is that the people win. If you’re fighting against technology that is that much smarter than you, you probably will not win. We’ve all heard of the grey goo problem that self-replicating nanotech devices might keep on replicating until the world has been reduced to sticky goo, and certainly in biotechnology, we’ve really got a big problem because it’s converging with nanotechnology. Once you start mixing nanotech with organisms and you start feeding nanotech-enabled bacteria, we can go much further than the Borg in Star Trek, and those superhuman organisms might not like us very much.
We are in a world now where science and commerce are increasingly bedfellows. The development of technology is happening in the context of global free trade regimes that treat technological diffusion, embedded within commerce, as an intrinsic good. We should prepare for new and unfamiliar forms of argument around emerging technologies.
It can be inferred from the text that the author______.
Options
A. thinks people overestimate the capabilities of technology
B. is not optimistic that artificial intelligence will always be used positively
C. thinks that we should take science fiction movies more seriously
D. believes artificial intelligence is the greatest threat we face technologically
Answer
B
Explanation
This is an attitude-inference question. In the second paragraph, the author states that the benefits these technologies bring are beyond doubt but that they are potentially very dangerous, and then goes on to analyze the specific risks posed by artificial intelligence, nanotechnology and biotechnology. It can therefore be inferred that the author is not optimistic that artificial intelligence will always be used positively, which is option B.
When reposting, please cite the original source: https://kaotiyun.com/show/Pb74777K
Postgraduate Entrance Exam English (I)