
The 21st century

The 21st century began with the United States as the sole superpower following the collapse of the Soviet Union, and with China emerging as a potential rival. Debates raged over what should be done about fossil fuel pollution, after a century of rapid industrial expansion. With the Cold War over and terrorism linked to Islamic fundamentalism on the rise, the West and its allies turned their attention to the Middle East.

Digital technology – in its early stages of mainstream use in the 1980s and 1990s – became widely adopted across most of the world, though the stress and anti-social behaviour attributed to overuse of mobile phones, the Internet and related technologies remained a subject of debate. Over 1.5 billion people worldwide used the Internet by the end of the first decade, and over 4 billion (more than half the world's population) used mobile phones.

A global financial downturn resulted in serious and long-lasting consequences for much of the world. Growing divides between rich and poor, along with increasing awareness of surveillance and privacy intrusion, exacerbated public resentment of governments and elites. Social media and other technologies exposed corruption in political and economic systems, while facilitating the organisation of movements and protests. At the same time, however, concerns grew over the spread of misinformation and the erosion of objective truth.

A new set of crises would emerge in the 2020s. The COVID-19 pandemic led to many millions of deaths, and severe disruption to society and travel until the introduction of vaccines, with after-effects including supply chain disruption and a surge in inflation. The Russian invasion of Ukraine in 2022 added further to the economic and political troubles of this period.

Despite the now alarmingly obvious impacts of climate change, progress on the issue remained slow, falling far short of the action needed. The world exceeded 1.5°C of global warming in the 2030s and 2°C in the 2040s. Governments now faced the looming problem of climate refugees. However, unexpectedly rapid advances in solar, wind, and other renewables led to a decarbonisation of electricity across most of the developed world, with developing countries not far behind. This trend could also be seen in electric cars and other vehicles, which supplanted traditional combustion engines.

Many other fields of science and technology – from genetics, to nanotechnology, robotics, and quantum computing – underwent similarly exponential growth, providing new ways for humanity to tackle global problems. This included, for example, the emergence of cellular agriculture, which began a long-term decline in the need for sprawling arable land. Meanwhile, an entire industry of carbon capture and utilisation allowed historical pollutants to be safely stored and later repurposed, initially at small scales and eventually at the gigaton level. Other developments, such as vertical farming and genetically modified crops, helped to ease food shortages.

However, global warming continued to be a major issue for the rest of the century, with poorer regions struggling to adapt. Mass displacement of people moved to the front and centre of international political negotiations, as heatwaves, floods, wildfires, and other impacts accelerated. Rising sea levels, a relatively minor concern in earlier decades, began to pose enormous challenges for coastal regions.

Of all the emerging technologies of the 21st century, artificial intelligence would undoubtedly prove to have the greatest impact. By the 2020s, it had already reached a level of sophistication considered human-like – whether in chatbots and other language models, game-playing programs such as AlphaGo, image and video synthesis from user prompts, or the automation of laborious research tasks.

However, even greater developments followed by mid-century, leading to profound questions over the nature of consciousness and being. With AI now seemingly on the verge of sentience, humanity struggled to come to terms with its creation. In laboratory research settings, the most advanced humanoid androids took on an eerily lifelike appearance, while programs running on the fastest supercomputers demonstrated a level of insight that surpassed the greatest minds on earth.

Outside these cutting-edge environments, in the everyday world, AI made its presence felt in ways that improved aspects of society while also disrupting the lives of millions. For instance, the automation of many traditional job roles, which now included white-collar occupations as well as manual labour, led to increased technological unemployment and growing demand for welfare reform, such as guaranteed basic incomes. Anti-work sentiment reached the mainstream as people felt hopeless and threatened, not only by this rapidly changing employment landscape, but also by broader global economic volatility and worsening environmental conditions in much of the world. New diseases added to the sense of doom, with antibiotic resistance now a grave concern for public health, alongside the risk of pandemics emerging from animal reservoirs and other sources.

Demographic changes presented yet another challenge in the 21st century. Declining fertility rates and longer lifespans resulted in a shrinking ratio of workers to retirees. The situation became so acute that many countries began offering newly developed rejuvenation therapies to their citizens for free, both to reduce the burden on healthcare and to encourage senior citizens back into the workforce. Governments also provided incentives for couples to have more children. Alongside this, immigration became even more contentious, with an urgent need to attract foreign workers whilst managing the now enormous numbers of climate refugees.

In addition to advances in reversing biological aging, other notable milestones of the mid-21st century included humans landing on Mars, and the first detection of a biosignature from exoplanet data.

As global warming approached 2.5°C, the global order threatened to unravel completely. Nationalism reached feverish levels, with nuclear brinkmanship and a breakdown in consensus on the route forward. Mass evacuations had become commonplace, particularly in low-lying coastal regions lacking sufficient defences against sea level rise. Heatwaves made some regions uninhabitable. The economic damages from climate change now surpassed half a trillion dollars every year – a ten-fold increase compared to the start of the century.

Managing and reversing this catastrophe now required a full-time, WW2-scale mobilisation of people, resources, and technology. With fossil fuels nearing the end of their phase-out, net zero pledges shifted to 'net negative' and the removal of historical pollutants that remained in the atmosphere, on land, and in the oceans. In addition to carbon sequestration and direct air capture, other efforts included true megaprojects – on a scale never witnessed before – such as geoengineering and even attempts to prevent glaciers from sliding into the ocean (the latter achieved by drilling boreholes in Antarctica, Greenland, and elsewhere, then pumping water from beneath the ice onto the surface).

Amid this turbulent time, robots had kept the wheels of society turning in many ways. Some of the more advanced models appeared so lifelike and so intelligent that they effectively represented a new segment of the population, gradually gaining legal status and rights of personhood. In many societies, however, realistic humanoid androids remained in the position of slaves or indentured servants.

To better understand and interact with AI, humans began merging with technology in ever more sophisticated ways. Highly compact devices became implantable and integrated within the human body – even the brain itself – able to monitor and treat disease, enhance the senses, and provide entertainment or communication in ways that simply were not possible before. This new era of bodily enhancement, known as transhumanism, achieved mainstream popularity during the late 21st century, a trend that continued into the 22nd century and beyond.