Interstellar exploration is becoming common
By the mid-22nd century, a wide variety of probes to neighbouring star systems have successfully reached their destinations.* The fastest
of these can now achieve a significant fraction of light speed, requiring
only a few decades of travel time. By way of comparison, space probes of a century earlier – such as the Voyager missions – will take many thousands of years to reach the stars.
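The contrast can be checked with simple arithmetic. A minimal sketch, assuming Proxima Centauri at roughly 4.24 light-years, a Voyager-class cruise speed of about 17 km/s, and a future probe at 10% of light speed:

```python
C_KM_S = 299_792.458          # speed of light in km/s
LY_KM = 9.4607e12             # kilometres in one light-year
DIST_LY = 4.24                # distance to Proxima Centauri (assumed)

def travel_years(speed_km_s: float, distance_ly: float = DIST_LY) -> float:
    """Idealised constant-speed travel time in years (no acceleration phase)."""
    seconds = distance_ly * LY_KM / speed_km_s
    return seconds / (365.25 * 24 * 3600)

print(f"Voyager-class probe (17 km/s): {travel_years(17):,.0f} years")
print(f"Probe at 10% of light speed:   {travel_years(0.1 * C_KM_S):.1f} years")
```

The first figure comes out near 75,000 years, the second at roughly 42 years, matching the "many thousands of years" versus "a few decades" comparison above.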
A number of different engine systems are being utilised – from antimatter to nuclear pulse
propulsion, along with other, more experimental methods. Each craft is equipped with powerful
AI, heavily automated systems and robots/androids. Protection
from incoming debris is offered by cone-shaped force fields projected
from the front of each craft. This streamlined shape causes objects like asteroids to drift by without causing any damage.
After journeying for trillions of miles, the majority of probes successfully
rendezvous with their destinations. These return a treasure trove of
data and visual information on extrasolar planets. In addition, images are taken of Earth and the Solar System viewed from light years away, providing a new perspective of humanity and its place in the universe.
Holodeck-style environments are becoming possible
The concept of virtual reality had been explored as far back as the 1930s, when Stanley G. Weinbaum wrote his short story, Pygmalion's Spectacles. Published in Wonder Stories – an American science fiction magazine – this described a "mask" with holographic recording of experiences including smell, taste and touch.*
In the 1950s, Morton Heilig wrote of an "Experience Theatre" that could encompass all the senses in an effective manner, thus drawing the viewer into the onscreen activity. He built a prototype mechanical device called the Sensorama in 1962, which displayed five short films while engaging multiple senses (sight, sound, smell and touch). Around this time, engineer and inventor Douglas Engelbart began using computer screens as both input and output devices.
Later in the 20th century, the term "virtual reality" was popularised by Jaron Lanier. A major pioneer of the field, he founded VPL Research in 1985, which developed and built some of the seminal "goggles and gloves" systems of that decade.
A more advanced concept was depicted in TV shows like Star Trek: The Next Generation (1987-1994) and movies like The Matrix (1999). These introduced the idea of simulated realities that were convincing enough to be indistinguishable from the real world.
It was not until the late 2010s that virtual reality became a truly mainstream consumer technology.* By then, exponential advances in computing power had solved many issues hindering previous, cruder attempts at VR – such as cost, weight/bulkiness, pixel resolution and screen latency. It was possible to combine these headsets with circular or hexagonal treadmills, offering users the ability to walk in a seamless open world.* The Internet also enabled participants from around the globe to compete and engage with each other in massively multiplayer online role-playing games.
While clearly a huge improvement over earlier generations of hardware, these devices would pale into insignificance when compared to full immersion virtual reality (FIVR). As computers became ever smaller and more compact, made possible via new materials such as graphene, they were beginning to integrate with the human body in ways hitherto impossible. Their components had shrunk by orders of magnitude, following trends like Moore's Law. Machines that filled entire rooms in the 1970s had become smartphones by 2010 and the size of blood cells by the 2030s.* This occurred in parallel with accurate models of the brain, establishing a basic roadmap of neurological processes.* Full immersion virtual reality leveraged these advances to create microscopic devices able to record and mimic patterns of thought, directly inside the brain. Tens of billions of these "nanobots" could be programmed to function simultaneously, like a miniature Internet, the end result being that sensory information was now reproducible through software. In other words – vision, hearing, smell, taste and touch could be temporarily substituted by a computer program, allowing users to experience virtual environments with sufficient detail to match the real world. First demonstrated in laboratory settings and military training environments, FIVR was commercialised in subsequent decades and became one of the 21st century's defining technologies.
Not everyone was amenable to having nanoscale machines inserted into their brains, however. In any case, full immersion VR provided only a superficial imitation of real life – it could not replicate every subatomic particle, for example, or the countless quantum events occurring at any given moment in time and space. Accounting for these phenomena would require a level of computing on a different scale entirely.
Lattice Quantum Chromodynamics (LQCD) was a promising field in the late 20th and early 21st centuries. This allowed researchers to simulate objects and processes in near-perfect detail, using resolutions based on the fundamental physical laws. By the 2010s, for example, individual proton masses could be determined at error margins close to one percent. During the 2020s, exascale computing helped to further refine the nuclear forces and uncover exotic "new physics" beyond the Standard Model.
Smaller and smaller pixelations were being applied to greater and greater volumes of space-time, as supercomputers later reached the zettascale, yottascale and beyond. By the 2070s, it was possible to simulate a complete virus with absolute accuracy down to the smallest known quantum level.* Blood cells, bacteria and other living structures followed as this technique approached the macroscale. In the early 22nd century, mind transfer became feasible for mainstream use, whole-brain scans now sufficiently perfected. Another milestone was passed by 2140, with a cubic metre of space-time being accurately simulated.**
These four-dimensional lattice grids were, in effect, miniature universes – fully programmable and controllable. When combined with artificial intelligence, matter contained within their boundaries could be used to recreate virtually anything in real time and real resolution. Spatial extents continued to grow, reaching tens of metres. Although highly convincing VR had been around for over a century, achieving this level of detail at these scales had been impossible until now. By 2150, perfect simulations can be generated in room-sized environments without any requirement for on-person hardware.
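The scale of the challenge can be illustrated with a back-of-envelope count of lattice sites. The femtometre (nuclear-scale) spacing below is purely an illustrative assumption, not a figure from the text:

```python
def lattice_sites(volume_m3: float, spacing_m: float) -> float:
    """Number of sites in a cubic lattice covering the given volume."""
    per_side = volume_m3 ** (1 / 3) / spacing_m
    return per_side ** 3

# A virus-sized cube (~100 nm per side) at femtometre spacing:
virus = lattice_sites((100e-9) ** 3, 1e-15)
# One cubic metre at the same spacing:
cubic_metre = lattice_sites(1.0, 1e-15)
print(f"virus-sized volume: ~{virus:.1e} sites")
print(f"one cubic metre:    ~{cubic_metre:.1e} sites")
```

Going from a virus to a cubic metre multiplies the site count by a factor of around 10^21, which is why each step up in volume demanded another leap in computing power.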
As virtual reality advances still further, entire worlds are constructed using the smallest quantum units for building blocks. This opens up some profound opportunities in the 23rd century. For example, artificial planet Earths can have their parameters altered slightly – gravity, mass, temperature and so on – then fast-forwarded billions of years to compare the outcomes. Intelligent species evolving on these virtual worlds may be entirely unaware that they are part of a giant simulation.
An observer from the previous century, walking through a newly developed
city of 2150, would be struck by the sense of cleanliness and order.
The air would smell fresh and pure, as if they were in pre-industrial countryside. Roads and pavements would be immaculate: made of special
materials that cleaned themselves, absorbed garbage and could self-repair
in the event of damage. Building surfaces, windows and roofs would be
completely resistant to dirt, bacteria, weather, graffiti and vandalism.
These same coatings would be applied to public transport, cars and other
vehicles. Everything would appear brand new, shiny and in perfect condition
at all times. Greenery would feature heavily in this city, alongside spectacular fountains, sculptures and other beautification.
Telegraph poles, signs, bollards and other visual "clutter" that
once festooned the streets have disappeared. Lighting is achieved
more discreetly, using a combination of self-illuminating walls and
surfaces, antigravity and other features designed to hide these eyesores,
maximising pedestrian space and aesthetics. Electricity is passed wirelessly
from building to building. Room temperature superconductors – implanted
in the ground – allow the rapid movement of vehicles without the need
for tracks, wheels, overhead cables or other such components. Cars
and trains simply drift along silently, riding on electromagnetic currents.
Maps are obsolete – all information is beamed into a person's
visual cortex. They merely have to "think" of a particular
building, street or route to be given information about it.
They would also notice their increased personal space, and the relative quiet
of areas that, in earlier times, would have bustled with cars, people
and movement. In some places, robots tending to manual duties might
outnumber humans. This is partly a result of the reduction
in the world's population. However, it is also because citizens of today
spend the majority of their time in virtual environments. These offer practically everything a person
needs in terms of knowledge, communication and interaction – often
at speeds much greater than real time. Limited only by a person's
imagination, they can provide richer and more stimulating experiences
than just about anything in the physical world.
On the rare occasions when a person ventures outside, they are likely to spend
little time on foot. Almost all services and material needs can be obtained
within the home, or practically on their doorstep – whether it be food,
medical assistance, or even replacement body parts and physical upgrades. Social
gatherings in the real world are infrequent, usually reserved
for "special" occasions such as funerals, for novelty value,
or the small number of situations where VR is impractical.
Crime is almost non-existent in these hi-tech cities. Surveillance is everywhere:
recording every footstep of your journey in perfect detail and identifying
who you are, from the moment you enter a public area. Even your internal
biological state can be monitored – such as neural activity and pulse
– giving clues as to your immediate intentions. Police can be summoned
instantly, with robotic officers appearing to 'grow' out of the ground
through the use of blended claytronics and nanobots, embedded into the
buildings and roads. This is so much faster and more efficient that,
in most cities, dispatching law enforcement to a crime scene in physical
vehicles has become obsolete.
Although safe and clean, some of these hi-tech districts might appear rather
sterile to an observer from the previous century. They would lack the
grit, noise and character which defined cities in past times. One way
that urban designers are overcoming this problem is through the use
of dynamic surfaces. These create physical environments that are interactive.
Certain building façades, for instance, can change their appearance
to match the tastes of the observer. This can be achieved via augmented
reality (which only the individual is aware of), claytronic surfaces
and holographic projections (which everybody can see), or a combination
of the two. A bland glass and steel building could suddenly morph into
a classical style, with Corinthian columns and marble floors; or it
could change to a red brick texture, depending on the mood or situation.
Total solar eclipse in London
A rare total eclipse takes place in Britain this year, with parts of London
experiencing totality.* The last time this
occurred was in 1715; the next will be in 2600 AD.
Mass extinctions are levelling off
A century has passed since the peak in global extinction rates* and biodiversity has now stabilised. With previous food chains having collapsed, the world's
fauna is dominated by the hardiest and most adaptable lifeforms – such as rats, cockroaches and canines – while plant life has seen a marked increase in the proportion of weeds.
Across much of the world lie abandoned cities and decaying infrastructure surrounded
by vast, polluted wastelands. Small pockets of biodiversity
can still be found – but many of these are contained within artificial
environments, protected and sealed from conditions outside.
Much of humanity has fled to higher and lower latitudes while efforts continue to resolve the climate crisis.
The world's first bicentenarians
Some people who were born in the 1960s are still alive and well in today's
world. Life expectancy had been increasing at a rate of 0.2 years per
year at the turn of the 21st century. This incremental progress meant
that by the time they were 80, these people could expect to live an
additional decade on top of their original lifespan.
However, the rate of increase itself had been accelerating, due to major breakthroughs
in medicine and healthcare, combined with better education and lifestyle
choices. This created a "stepping stone", allowing people
to buy time for the treatments available later in the century – which
included being able to halt the aging process altogether.*
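The "stepping stone" arithmetic above can be sketched with an illustrative model, assuming a constant gain of 0.2 years of life expectancy per calendar year:

```python
GAIN_PER_YEAR = 0.2   # years of life expectancy gained per calendar year

def extra_years(years_elapsed: float, gain: float = GAIN_PER_YEAR) -> float:
    """Total life expectancy added over a span, at a fixed annual gain."""
    return years_elapsed * gain

# Someone in this cohort reaching age 80 around the 2040s has lived
# through roughly 50 years of such gains since the year 2000:
print(extra_years(50))  # 10.0 – the "additional decade"
```

In reality the annual gain was itself rising, so this constant-rate figure is a lower bound on the lifespan extension described above.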
Antimatter power plants are widespread
A century after the global deployment of fusion, new forms of power production
are becoming necessary in order to cope with the ongoing rise in
energy demands on Earth and elsewhere. A new generation
of power plants is becoming available, capable of harnessing the energy
released in matter/antimatter collisions. The reactions involved yield
roughly 1,000 times more energy per unit of fuel than nuclear fission
and 300 times more than nuclear fusion.*
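These ratios can be roughly cross-checked against commonly cited energy densities; the fission and fusion values below are order-of-magnitude assumptions:

```python
C = 299_792_458.0      # speed of light, m/s

# Energy released per kilogram of fuel (approximate, commonly cited values):
annihilation = C ** 2  # full matter/antimatter conversion, E = mc^2 (~9.0e16 J)
fission_u235 = 8.2e13  # uranium-235 fission
fusion_dt = 3.4e14     # deuterium-tritium fusion

print(f"annihilation vs fission: ~{annihilation / fission_u235:.0f}x")
print(f"annihilation vs fusion:  ~{annihilation / fusion_dt:.0f}x")
```

The ratios come out near 1,100x and 260x respectively – consistent, to within rounding, with the figures quoted above.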
The civilian expansion into the solar system – and the increasing ease of
access to space technology – has led to the emergence of a new and deadly
form of terrorism. This involves the sabotage or hijacking of spacecraft,
for use in the purposeful redirection of asteroids towards Earth, Mars
and the Moon.*
Colonies in the outer solar system are also being targeted. These are
particularly vulnerable, since they tend to lack the orbital infrastructure
and defences necessary to deflect these huge incoming objects. At least
one major colony around Jupiter is devastated during this time.
In addition to religious extremists, there is a growing anarcho-primitivist movement.
This consists of small underground cults opposed to the increasing dominance
of AI in the running of world affairs. They deplore what they see as
forced, unnatural changes and technologies sweeping humanity – instead
favouring a return to more traditional lifestyles and cultures. They
are prepared to resort to whatever means necessary to achieve this.*
Nitrous oxide (N2O) has fallen to pre-industrial levels
Nitrous oxide (N2O) is a naturally occurring gas emitted by bacteria in the soils and oceans, forming part of the Earth's nitrogen cycle. It was first synthesised by English natural philosopher and chemist Joseph Priestley in 1772. From the Industrial Revolution onwards, human activities began to significantly increase the amount of N2O in the atmosphere. By the early 21st century, about 40% of total emissions were man-made.
By far the largest anthropogenic source (80%) was from agriculture and the use of synthetic fertilizers to improve soil, as well as the breakdown of nitrogen in livestock manure and urine. Industrial sources included production of chemicals such as nylon, internal combustion engines for transport and oxidizers in rocketry. Known as "laughing gas" due to its euphoric effects, it was also used in surgery and dentistry for anaesthetics and analgesics.
Nitrous oxide was found to be a powerful greenhouse gas – the third most important after carbon dioxide and methane. While not as abundant in the atmosphere as carbon dioxide (CO2), it had almost 300 times the heat-trapping ability per molecule and caused roughly 10 percent of global warming. After the banning of chlorofluorocarbons (CFCs) in the 1980s, it also became the most important substance in stratospheric ozone depletion.*
By the mid-21st century, the effects of global warming had become very serious.* While most efforts were focussed on mitigating CO2, attempts were made to address the imbalance of other greenhouse gases, including N2O. There was no "silver bullet" for this. Instead, it would take a combination of substantial improvements in agricultural efficiency, reduced emissions in transportation and industrial sectors, along with changes in dietary habits towards less per capita meat consumption in the developed world. While many technologies and innovations were already available in earlier decades, these targets were unfortunately difficult to achieve – due to additional costs and the absence of political will for implementation. It was only during the catastrophic events in the second half of the century that sufficient efforts and financial resources were directed towards the problem.
With a lifespan of 114 years,* man-made N2O proved difficult to stabilise and remained in the atmosphere well into the 22nd century. By 2190, it has fallen to around 270 parts per billion (ppb), its pre-industrial level.** As well as halting the impact of global warming and ozone damage, other benefits include better overall air quality, reduced loss of biodiversity in eutrophied aquatic and terrestrial ecosystems, and wider economic gains.
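Why the gas lingers can be illustrated with a first-order decay model, assuming emissions return to pre-industrial equilibrium and the excess decays with the 114-year lifetime; the starting concentration of ~340 ppb is an assumption:

```python
import math

LIFETIME = 114.0         # atmospheric lifetime of N2O, years
C_PREINDUSTRIAL = 270.0  # pre-industrial concentration, ppb

def n2o_ppb(c_start: float, years: float) -> float:
    """Excess over the pre-industrial level decays exponentially."""
    excess = c_start - C_PREINDUSTRIAL
    return C_PREINDUSTRIAL + excess * math.exp(-years / LIFETIME)

# Assuming ~340 ppb at the point emissions are curbed:
for t in (0, 50, 114, 228):
    print(f"after {t:>3} years: {n2o_ppb(340.0, t):.1f} ppb")
```

Even after one full lifetime, roughly a third of the excess remains, which is why concentrations only approach the pre-industrial baseline well over a century after emissions are curbed.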