If Anyone Builds It, Everyone Dies (Little, Brown)
Friday, 23 January 2026
In this book, subtitled "Why Superhuman AI Would Kill Us All", Eliezer Yudkowsky and Nate Soares argue that sufficiently smart AIs will develop goals of their own that put them in conflict with us, and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn't even be close. Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive.
Author: Eliezer Yudkowsky and Nate Soares

Book Watch is I Programmer's listing of new books, compiled using publishers' publicity material. It is not to be read as a review; we do not provide an independent assessment here. Some, but by no means all, of the books in Book Watch are eventually reviewed.

To have new titles included in Book Watch, contact BookWatch@i-programmer.info. Follow @bookwatchiprog on Twitter or subscribe to I Programmer's Books RSS feed for each day's new addition to Book Watch and for new reviews.
