Source type
Other journal
DOI
https://doi.org/10.1016/j.futures.2015.07.009
Taking superintelligence seriously: Superintelligence: Paths, dangers, strategies by Nick Bostrom (Book Review)
Release date
2015
Publisher
Elsevier
Publication date
2015-09-01
Journal
Futures
ISSN
0016-3287
EISSN
1873-6378
Publication year
2015
Volume
72
Pages
32-35
Abstract
A new book by Nick Bostrom, Superintelligence: Paths, Dangers, Strategies, is reviewed. Superintelligence explores the future of artificial intelligence and related technologies and the risks they may pose to human civilization. The book ably demonstrates the potential for serious thinking aimed at the long-term future. Bostrom succeeds in arguing that the development of superintelligent machines will, if not properly managed, create catastrophic risks to humanity. The book falls short in some respects, and some sections are more compelling and novel than others. Overall, however, Bostrom’s book succeeds in demolishing the “null hypothesis” according to which the possibility and risks of superintelligence can continue to be ignored, and is a must-read for those interested in the long-term future of humanity.
NSTL subject area
Emerging technologies
NSTL think tank topic
Artificial intelligence
Source keywords
Existential risk; Artificial intelligence; Superintelligence; Responsible innovation
NSTL classification number
33; 21
Source think tank
Consortium for Science, Policy, and Outcomes (United States)
Copyright information
Copyright © 2015 Elsevier Ltd. All rights reserved.
Access
Open access
NSTL resource type
Journal article
NSTL unique identifier
JA202304210000008ZK
Processing institution
Accession number
CJ20230511JA000060