%% ### #todo - [ ] https://www.lesswrong.com/posts/kcKrE9mzEHrdqtDpE/the-case-for-ensuring-that-powerful-ais-are-controlled %%

Two definitions offered by Suleyman (2023, p. 14):

>**Containment**: The ability to monitor, curtail, control, and potentially even close down technologies.

>**The Containment Problem**: Technology's predisposition to diffuse widely in waves and to have emergent impacts that are impossible to predict or control, including negative and unforeseen consequences.

Suleyman views containment as a "first, critical step" toward building safe technology — necessary but on its own insufficient.

%%What would it look like to contain AI?%%

At the core of Suleyman's argument is the thesis that containment is likely *not* possible, but that it *must* be possible for humanity to survive and enjoy a flourishing future.

Reasons why containment of technology is hard in general:

- **[[The fundamental driver of technological progress is human needs|The demand for technology is insatiable]]**. On both the individual and the collective level, new and improved technologies are the key strategy for meeting goals and needs, for surviving and thriving. No level of resistance to technology is sufficient to overpower this demand. [[Technology tends to improve and proliferate]].
- **Technology can't be uninvented**. There's an asymmetry here: our civilization has a large capacity to invent new technologies, but once invented, they can't be "un-invented."^[See Nick Bostrom's [Vulnerable World Hypothesis](https://nickbostrom.com/papers/vulnerable.pdf)] Technologies are ideas, and there's no guaranteed way to eliminate ideas or stop them from spreading.
- **Technology is inherently difficult to control**. Once introduced to the world, its consequences are complex and unpredictable. People find ways to use any given technology that its maker didn't expect or even intend. The more capable and broadly useful a technology, the larger this problem becomes.
Reasons why containment of the [[The "Coming Wave" is an emerging cluster of technologies centered on AI and synthetic biology|Coming Wave]] in particular will be hard:

- [[Four features of the coming wave that make containment challenging – asymmetry, hyper-evolution, omni-use, and autonomy]]
- [[Fragility amplifiers increase the vulnerability of systems and compound the challenges and risks of emerging technologies]]
- [[Containment of emerging technologies requires strong nation-states and cohesive societies at a time when they are fragile and divided]]
- [[There is insufficient clarity and conviction that advanced AI poses an existential risk]]

There are historical exceptions, such as the **containment of nuclear weapons** since the Cold War. But this isn't a very hopeful story either: it centers on a technology that was incredibly costly and difficult to build, that faced strong international containment efforts fueled in part by its obvious lethal potential, and whose containment succeeded thanks to a series of events where pure luck saved the day.

---

Topics:

- [[The AI Revolution and the Tapestry of Tomorrow (Index)]]

Related notes:

- [[Suleyman suggests ten steps to contain the coming wave]]