28th October 2018
Future "superhumans" predicted in Stephen Hawking's final essays
In a newly published, posthumous book, Stephen Hawking leaves us with his final thoughts on the universe's biggest questions. He predicts the future of humanity, including a race of "superhumans" that could emerge as people choose to upgrade themselves.
Professor Stephen Hawking, who died in March at the age of 76, was among the most renowned scientists of the modern era.
As a physicist, his works included gravitational singularity theorems in the framework of general relativity and the prediction that black holes emit radiation – often called Hawking radiation. He was the first to set out a theory of cosmology explained by a union of the general theory of relativity and quantum mechanics. He was a vigorous supporter of the many-worlds interpretation of quantum mechanics.
As an author, Hawking achieved commercial success with several works of popular science. His best-known book, A Brief History of Time, appeared on the British Sunday Times best-seller list for a record-breaking 237 weeks.
This month sees the release of Hawking's final book – Brief Answers to the Big Questions – in which he examines some of the universe's greatest mysteries and promotes the view that science is vitally important to solving problems here on Earth. Left unfinished at the time of his death, the book was completed by his academic colleagues, family and the Stephen Hawking Estate. The hardback edition is 256 pages in length and draws upon his many essays, lectures and keynote speeches. The publisher describes it as "a selection of [Hawking's] most profound, accessible, and timely reflections from his personal archive."
The book is structured around ten big questions that Hawking attempts to answer.
The book discusses many of today's greatest challenges. The biggest threats to our world, according to Hawking, include asteroid collisions, climate change and nuclear war. Within the next 1,000 years, a calamity may "cripple Earth", he writes. Artificial intelligence could develop a will of its own, one that conflicts with ours.
"In short, the advent of super-intelligent AI would be either the best or the worst thing ever to happen to humanity," Hawking says. "The real risk with AI isn't malice but competence. Super-intelligent AI will be extremely good at accomplishing its goals. If those goals aren't aligned with ours, we're in trouble."
Alongside the rise of powerful AI, he believes that a genetically modified race of superhumans will eventually come to dominate the world: "I am sure that during this century, people will discover how to modify both intelligence and instincts such as aggression," he speculates. "Laws will probably be passed against genetic engineering with humans. But some people won't be able to resist the temptation to improve human characteristics, such as memory, resistance to disease and length of life.
"Once such superhumans appear, there will be significant political problems with unimproved humans, who won't be able to compete," he continues. "Presumably, they will die out, or become unimportant. Instead, there will be a race of self-designing beings, who are improving at an ever-increasing rate. If the human race manages to redesign itself, it will probably spread out and colonise other planets and stars."
The risks of failure are mounting, however. Education and science are "in danger now more than ever before." He urges young people "to look up at the stars and not down at your feet. Try to make sense of what you see, and wonder about what makes the universe exist ... It matters that you don't give up. Unleash your imagination. Shape the future."
Dr. Marcelo Gleiser, reviewing the book for National Public Radio (NPR), writes: "Stephen Hawking is one of those rare luminaries whose life symbolises the best humanity has to offer ... [His book is one] every thinking person worried about humanity's future should read ... If there is a unifying theme across the book, it is Hawking's deep faith in science's ability to solve humanity's biggest problems ... His answers to the big questions illustrate his belief in the rationality of nature and in our ability to uncover all its secrets.
"Although Hawking touches on the origin of the universe, the physics of black holes and some of his other favourite topics, his main concern in this book is not physics. It's humanity and its collective future. Focusing his attention on three related questions – the future of our planet, colonisation of other planets and the rise of artificial intelligence – he charts his strategy to save us from ourselves. Only science, Hawking argues, can save us from our mistakes ... Hawking believes that humanity's evolutionary mission is to spread through the galaxy as a sort of cosmic gardener, sowing life along the way. He believes we will develop a positive relation with intelligent machines and that, together, we will redesign the current fate of the world and our species."
We are giving away free copies of this book to our readers!
For a chance of winning, simply tweet a link to this competition page, making sure to include "@future_timeline" in your tweet.
We will select three winners at random in 10 days' time.