Human extinction
Human extinction is the hypothetical end of the human species, due either to natural causes, such as population decline from sub-replacement fertility, an asteroid impact, or large-scale volcanism, or to anthropogenic destruction (self-extinction). For the latter, some of the many possible contributors include climate change, global nuclear annihilation, biological warfare, and ecological collapse. Other scenarios center on emerging technologies, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots.
Nuclear holocaust
A nuclear holocaust, also known as a nuclear apocalypse, nuclear Armageddon, or atomic holocaust, is a theoretical scenario where the mass detonation of nuclear weapons causes globally widespread destruction and radioactive fallout. Such a scenario envisages large parts of the Earth becoming uninhabitable due to the effects of nuclear warfare, potentially causing the collapse of civilization and, in the worst case, the extinction of humanity or even the termination of all biological life on Earth.
Global governance
Global governance refers to institutions that coordinate the behavior of transnational actors, facilitate cooperation, resolve disputes, and alleviate collective action problems. Global governance broadly entails making, monitoring, and enforcing rules. Within global governance, a variety of types of actors – not just states – exercise power. Governance is thus broader than government. Global governance began in the mid-19th century. It became particularly prominent in the aftermath of World War I, and more so after the end of World War II.
Societal collapse
Societal collapse (also known as civilizational collapse) is the fall of a complex human society characterized by the loss of cultural identity and of social complexity as an adaptive system, the downfall of government, and the rise of violence. Possible causes of a societal collapse include natural catastrophe, war, pestilence, famine, economic collapse, population decline or overshoot, mass migration, and sabotage by rival civilizations. A collapsed society may revert to a more primitive state, be absorbed into a stronger society, or completely disappear.
Existential risk from artificial general intelligence
Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could result in human extinction or another irreversible global catastrophe. One argument goes as follows: The human species currently dominates other species because the human brain possesses distinctive capabilities other animals lack. If AI were to surpass humanity in general intelligence and become superintelligent, then it could become difficult or impossible to control.
Doomsday argument
The Doomsday argument (DA), or Carter catastrophe, is a probabilistic argument that claims to predict the future population of the human species based on an estimation of the number of humans born to date. The Doomsday argument was originally proposed by the astrophysicist Brandon Carter in 1983, leading to the initial name of the Carter catastrophe. The argument was subsequently championed by the philosopher John A. Leslie and has since been independently conceived by J. Richard Gott and Holger Bech Nielsen.
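The core of the argument can be sketched numerically. If one's birth rank is assumed to be uniformly distributed among all humans who will ever live, then with confidence c the total number of births cannot exceed the births so far divided by (1 − c). A minimal sketch, where the figure of roughly 60 billion humans born to date is an illustrative assumption, not a claim from the text:

```python
def doomsday_upper_bound(births_so_far, confidence=0.95):
    """Gott-style bound: if our birth rank is uniform over all humans ever
    born, then with probability `confidence` we are not among the first
    (1 - confidence) fraction, so total births N < births_so_far / (1 - confidence)."""
    return births_so_far / (1.0 - confidence)

# With an assumed ~60 billion births to date, the 95% bound is ~1.2 trillion:
bound = doomsday_upper_bound(60e9)
print(f"95% upper bound on total births: {bound:.3g}")
```

The controversial step is the uniformity assumption itself, which is exactly what critics of the argument dispute.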
Futures studies
Futures studies, futures research, futurism or futurology is the systematic, interdisciplinary and holistic study of social and technological advancement, and other environmental trends, often for the purpose of exploring how people will live and work in the future. Predictive techniques, such as forecasting, can be applied, but contemporary futures studies scholars emphasize the importance of systematically exploring alternatives. In general, it can be considered as a branch of the social sciences and an extension to the field of history.
Survivalism
Survivalism is a social movement of individuals or groups (called survivalists or preppers) who proactively prepare for emergencies such as natural disasters, as well as for disruptions to the social order (civil disorder) caused by political or economic crises. Preparations may anticipate short-term or long-term scenarios, on scales ranging from personal adversity, to local disruption of services, to international or global catastrophe.
Space colonization
Space colonization (also called space settlement or extraterrestrial colonization) is the use of outer space or celestial bodies other than Earth for permanent habitation or as extraterrestrial territory. The inhabitation and territorial use of extraterrestrial space has been proposed, for example, for space settlements or extraterrestrial mining enterprises. To date, no permanent space settlement has been established, only temporary space habitats, nor has any extraterrestrial territory or land been legally claimed.
Fermi paradox
The Fermi paradox is the discrepancy between the lack of conclusive evidence of advanced extraterrestrial life and the apparently high likelihood of its existence. As a 2015 article put it, "If life is so easy, someone from somewhere must have come calling by now." Italian-American physicist Enrico Fermi's name is associated with the paradox because of a casual conversation in the summer of 1950 with fellow physicists Edward Teller, Herbert York, and Emil Konopinski.
Elon Musk
Elon Reeve Musk (born June 28, 1971) is a business magnate and investor. Musk is the founder, chairman, CEO and chief technology officer of SpaceX; angel investor, CEO and product architect of Tesla, Inc.; owner, chairman and CTO of X Corp.; founder of the Boring Company; a co-founder of Neuralink and OpenAI; and the president of the Musk Foundation. He is the wealthiest person in the world, with an estimated net worth of US$217 billion according to the Bloomberg Billionaires Index, and $219 billion according to Forbes, primarily from his ownership stakes in Tesla and SpaceX.
Superintelligence
A superintelligence is a hypothetical agent that possesses intelligence far surpassing that of the brightest and most gifted human minds. "Superintelligence" may also refer to a property of problem-solving systems (e.g., superintelligent language translators or engineering assistants) whether or not these high-level intellectual competencies are embodied in agents that act in the world. A superintelligence may or may not be created by an intelligence explosion and associated with a technological singularity.
Future of Earth
The biological and geological future of Earth can be extrapolated based on the estimated effects of several long-term influences. These include the chemistry at Earth's surface, the cooling rate of the planet's interior, the gravitational interactions with other objects in the Solar System, and a steady increase in the Sun's luminosity. An uncertain factor is the pervasive influence of technology introduced by humans, such as climate engineering, which could cause significant changes to the planet.
Planetary boundaries
Planetary boundaries are a framework to describe limits to the impacts of human activities on the Earth system. Beyond these limits, the environment may no longer be able to self-regulate. This would mean the Earth system would leave the period of stability of the Holocene, in which human society developed. Crossing a planetary boundary comes at the risk of abrupt environmental change. The framework is based on scientific evidence that human actions, especially those of industrialized societies since the Industrial Revolution, have become the main driver of global environmental change.
Bulletin of the Atomic Scientists
The Bulletin of the Atomic Scientists is a nonprofit organization concerning science and global security issues resulting from accelerating technological advances that have negative consequences for humanity. The Bulletin publishes content at both a free-access website and a bi-monthly, nontechnical academic journal. The organization has been publishing continuously since 1945, when it was founded by Albert Einstein and former Manhattan Project scientists as the Bulletin of the Atomic Scientists of Chicago immediately following the atomic bombings of Hiroshima and Nagasaki.
Technological singularity
The technological singularity, or simply the singularity, is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization. According to the most popular version of the singularity hypothesis, I. J. Good's intelligence explosion model, an upgradable intelligent agent will eventually enter a "runaway reaction" of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an "explosion" in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.
Asteroid impact avoidance
Asteroid impact avoidance comprises the methods by which near-Earth objects (NEOs) on a potential collision course with Earth could be diverted away, preventing destructive impact events. An impact by a sufficiently large asteroid or other NEO would cause, depending on its impact location, massive tsunamis or multiple firestorms, and an impact winter caused by the sunlight-blocking effect of large quantities of pulverized rock dust and other debris placed into the stratosphere.
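A back-of-envelope sketch illustrates why deflection methods rely on early warning: a small velocity change Δv applied t years before impact shifts the object's arrival position by roughly Δv × t. The numbers below (1 cm/s, 20 years of lead time) are illustrative assumptions, not figures from the text, and the straight-line estimate ignores the orbital-mechanics amplification a real deflection would enjoy:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

def miss_distance_km(delta_v_m_s, lead_time_years):
    """Straight-line displacement of the arrival point: (Δv in km/s) × (lead time in s)."""
    return (delta_v_m_s / 1000.0) * (lead_time_years * SECONDS_PER_YEAR)

# An assumed nudge of 1 cm/s applied 20 years out drifts the arrival
# point by roughly 6300 km, comparable to Earth's ~6371 km radius.
print(f"{miss_distance_km(0.01, 20):.0f} km")
```

Halving the lead time halves the displacement, which is why late detection sharply narrows the set of feasible deflection options.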
Human overpopulation
Human overpopulation (or human population overshoot) describes a concern that human populations may become too large to be sustained by their environment or resources in the long term. The topic is usually discussed in the context of world population, though it may concern individual nations, regions, and cities. Since 1804, the global human population has increased from 1 billion to 8 billion due to medical advancements and improved agricultural productivity. Annual world population growth peaked at about 2.1% in the late 1960s and has declined since.
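The 1-billion-to-8-billion figures above imply a modest average growth rate when compounded over two centuries. A quick sketch, taking 2022 as the approximate year the 8 billion mark was reached (an assumption for illustration):

```python
# Average annual growth rate implied by 1 billion (1804) -> 8 billion (~2022),
# computed as a compound annual growth rate over the intervening years.
years = 2022 - 1804  # 218 years
avg_rate = (8.0 / 1.0) ** (1.0 / years) - 1.0
print(f"average annual growth over the period: {avg_rate:.2%}")
```

The result is just under 1% per year on average, well below the late-1960s peak, since growth was much slower in the 19th century than in the 20th.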