If anyone builds it, everyone dies : (Record no. 209543)

MARC details
000 -LEADER
fixed length control field 01932nam a22002897a 4500
003 - CONTROL NUMBER IDENTIFIER
control field IIITD
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20260203124044.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 260124b |||||||| |||| 00| 0 eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9781847928931
040 ## - CATALOGING SOURCE
Original cataloging agency IIITD
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 006.301
Item number YUD-I
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Yudkowsky, Eliezer
245 ## - TITLE STATEMENT
Title If anyone builds it, everyone dies :
Remainder of title the case against superintelligent AI
Statement of responsibility, etc by Eliezer Yudkowsky and Nate Soares
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT)
Place of publication, distribution, etc London :
Name of publisher, distributor, etc Penguin,
Date of publication, distribution, etc © 2025
300 ## - PHYSICAL DESCRIPTION
Extent xii, 259 p. ;
Dimensions 24 cm.
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc Includes bibliographical references and index.
505 ## - FORMATTED CONTENTS NOTE
Title Introduction: Hard calls and easy calls
505 ## - FORMATTED CONTENTS NOTE
Title Part I: Nonhuman minds
505 ## - FORMATTED CONTENTS NOTE
Title Part II: One extinction scenario
505 ## - FORMATTED CONTENTS NOTE
Title Part III: Facing the challenge
520 ## - SUMMARY, ETC.
Summary, etc AI is the greatest threat to our existence that we have ever faced. The scramble to create superhuman AI has put us on the path to extinction - but it's not too late to change course. Two pioneering researchers in the field, Eliezer Yudkowsky and Nate Soares, explain why artificial superintelligence would be a global suicide bomb and call for an immediate halt to its development. The technology may be complex but the facts are simple: companies and countries are in a race to build machines that will be smarter than any person, and the world is devastatingly unprepared for what will come next. Could a machine superintelligence wipe out our entire species? Would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares explore the theory and the evidence, present one possible extinction scenario and explain what it would take for humanity to survive. The world is racing to build something truly new - and if anyone builds it, everyone dies.
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Artificial Intelligence
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Technological forecasting
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Human beings -- Extinction -- Forecasting
700 ## - ADDED ENTRY--PERSONAL NAME
Personal name Soares, Nate
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type Books
Source of classification or shelving scheme Dewey Decimal Classification
Holdings
Source of classification or shelving scheme Dewey Decimal Classification
Collection code Computer Science and Engineering
Home library IIITD
Current library IIITD
Shelving location General Stacks
Date acquired 24/01/2026
Bill No. XIP4-75479
Bill Date 2026-01-19
Cost, normal purchase price 607
PO No. IIITD/LIC/BS/2025/AMZ/17
PO Date 2026-01-19
Full call number 006.301 YUD-I
Barcode 013691
Date last seen 24/01/2026
Cost, replacement price 999
Price effective from 24/01/2026
Vendor/Supplier Amazon.in
Koha item type Books
© 2024 IIIT-Delhi, library@iiitd.ac.in