
Limit to Humanity's Ability to Weather Crises

2 replies to this topic




  • Members
  • 1,850 posts
  • Location: Canada

The recent apocalypse has me wondering, just how good are humans at dealing with civilization-ending phenomena?


As we've watched most of the world slowly shut down in response to this virus, with thousands dying every day and economies on the brink of collapse, I've often found myself irked by the repeated statement that a vaccine is 16-18 months away. We're faced with potentially one of the greatest crises of the past century, and with all of our best minds and the full support of the most powerful states on this planet, the best we can do to come up with a vaccine is a year and a half? I know everyone involved is working very hard, but surely we're capable of better than this? Millions could be dead by then.


So that makes me ask the question, what is the limit to humanity's ability to deal with an apocalyptic event? If we were truly faced with extinction, say from a much deadlier virus, or an asteroid, how fast could we move if we pooled everything we had together and worked 24/7 to stop the threat?  Do you think nations would share resources and forget their petty squabbles?



    2020 is here; I still suck

  • Members
  • 1,939 posts
  • Location: Georgia
We came up with a vaccine in a matter of weeks. It may have been less, actually; I'd need to check. Either way, the point is that the 1+ year figure is not the time needed to come up with a vaccine, but rather the time needed to verify the efficacy and safety of said vaccine.

The growth of computation is doubly exponential.




  • Members
  • 1,638 posts

Speaking of vaccines:

Outdated polio vaccine causes new cases of disease

Dozens of polio outbreaks in the past few years have been linked to a polio vaccine that many nations abandoned in 2016.


I think there are lots of challenges we've been good at addressing. Confronting some of them, however, will probably be like the sinking of the Titanic: from the moment the ship struck the iceberg, it took 2 hours and 40 minutes to sink. It must have been agonizing for the people on board. They knew it was going down into ice-cold waters, and they were completely powerless to stop it.

The worst crises will be those with an exponentially growing threat, because such a threat looks safe in the beginning, so people won't see the need to act; and by the time they finally see what's in store for them, it's too late.
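A toy sketch of why exponential threats feel safe for so long (all numbers made up for illustration, not epidemiological estimates):

```python
# Illustrative only: a threat that doubles every 3 days, against a fixed
# response capacity. The specific numbers are invented for the example.
capacity = 1_000_000  # hypothetical limit (hospital beds, supplies, etc.)
cases = 100
day = 0
while cases < capacity:
    day += 3
    cases *= 2

# For most of the six weeks the counts look small and manageable;
# the system goes from under half-full to overwhelmed in a single doubling.
print(day, cases)  # 42 1638400
```

The point the example makes: at day 36 the hypothetical system is still under half capacity, and six days later it has blown past the limit, which is exactly the "looks safe, then suddenly too late" dynamic.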
The government does happen to have experts who specialize in planning against so-called "tail risks":

But my mind was floating. It was in this space of what's the risks of having this person run this thing? Then when I started interviewing people in the government it became clear this was a perfectly legitimate way to view the government. At its very basic level the government's job is to keep us safe. It does lots of other things. You can view the government in lots of other ways. You can view it as a service provider. You can view it as an employer, but at its very basic level its job is to keep us safe.

When you wander around it you start to appreciate how many things there are to keep us safe from. The things that pop to mind when you're thinking of that are very vivid things, vivid risks, nuclear terrorist attacks or pandemics or natural disasters, all of which the government is responsible for defending us from.

But if you poke around some more and think about things a bit more broadly, there are all these other risks that are equally existential, like the risk of inequality getting so bad that the society crumbles and there is revolution, or the risk that we don't invest our agricultural science sufficiently, so that we don't have a food supply in 30 years. So on and so forth. So I'm kicking around these thoughts.

At the same time, the book I'd written before "The Fifth Risk" was all about the way human beings have trouble evaluating risk. And it was about two psychologists, Amos Tversky and Danny Kahneman, and their work on this subject. One of the insights that drops out of Kahneman and Tversky's work is that people don't assess risk well, but in addition, they're not really sensitive to shifts in extreme risks. So if you take something that has a one in a million chance of happening, and you shift it to a one in 100,000 chance of happening, people don't register that as, "Oh, it's 10 times more likely."

You could think of the government as this manager of a basket of one in a million risks, but they've got a million of them, and you can think of Trump and his neglect, mismanagement, ignorance, et cetera, as a machine for ratcheting up the likelihood of a lot of unlikely bad things happening.
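The arithmetic behind the quote's two observations is worth making concrete, using the figures it gives (a sketch, nothing more):

```python
# The quote's first point: a shift from 1-in-a-million to 1-in-100,000.
p_before = 1 / 1_000_000   # one in a million
p_after = 1 / 100_000      # one in a hundred thousand
print(p_after / p_before)  # ten times more likely, yet both "feel" like zero

# The second point: a basket of a million independent 1-in-a-million risks.
# The chance that at least one of them happens is not small at all.
n = 1_000_000
p_any = 1 - (1 - p_before) ** n
print(round(p_any, 3))  # ~0.632, i.e. about 1 - 1/e: closer to a coin flip
```

The independence assumption is mine, for simplicity; correlated risks would change the exact number but not the qualitative point that a portfolio of tiny risks is collectively large.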
