
Things that could kill all humans


41 replies to this topic

#1
Vivian

    Member

  • Members
  • 25 posts

I've read an article that lists the things that could kill us all, and which of them are most likely to happen. The top killers are:

1. AI, with a 10% chance of killing us all in the next 100 years

2. Global pandemic and nuclear war, each with a 5% chance in 100 years

3. Climate change, with a 5% chance in 200 years.

 

https://www.sciencea...the-most-likely

 

I don't know exactly how they reached those numbers, but I don't think they take into account the new technologies that we might develop during the next 100 years. So, what do you think? Do we have, or might we soon have, the technologies to survive these events? In the AI scenario, the foe would basically be technology itself, so we would have to find ways to control it.

 

Overall, according to those figures, we have roughly a 20% chance of going extinct in the next 100 years, and I think that, if we can survive this century, we are much less likely to face extinction.
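Just to sanity-check that "roughly 20%" figure, here is a minimal back-of-the-envelope sketch. It assumes, naively, that the article's individual risks are independent and that the climate figure roughly halves over 100 years; the numbers are only the ones quoted above.

    # Naive combination of the article's per-risk probabilities over 100 years,
    # treating the risks as independent.
    risks = {
        "AI": 0.10,
        "global pandemic": 0.05,
        "nuclear war": 0.05,
        "climate change": 0.025,  # 5% over 200 years, crudely halved for 100 years
    }

    p_no_extinction = 1.0
    for p in risks.values():
        p_no_extinction *= 1.0 - p

    print(f"combined risk over 100 years: {1.0 - p_no_extinction:.1%}")  # ~20.8%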



#2
Jakob

    Fenny-Eyed Slubber-Yuck

  • Members
  • 5,221 posts
  • Location: In the Basket of Deplorables

The chances of human extinction are less than 1% in the next 100 years. Mass die-offs and a collapse of civilization are more likely, but still less than 5% in my opinion. We are adaptable.



#3
Erowind

    Psychonaut, Aspiring Mathematician and Anarchist

  • Members
  • 523 posts
  • Location: Inside some convoluted formula I don't actually understand.

I agree that existential threats should be taken much more seriously than they are. It's kind of like we're collectively using a flamethrower to light a birthday cake next to a gas pump: a lot of things we do are overkill, and if we just did them in a different way, everything would be perfectly fine.


Current status: slaving away for the math gods of Pythagoras VII.


#4
Vivian

    Member

  • Members
  • 25 posts

Yeah, I also think that we are very adaptable: we can already grow vegetables in greenhouses in deserts, and grow them underground... Even in pandemics, if things got really bad, everyone could wear surgical masks. Many people would die, but society and humankind would still survive.

 

I don't think we do many overkill things that could end society. We already have the technologies to survive climate change, nuclear bombs are just sitting in storage... we just have to control AI, and if we do, technology will progress really, really fast.


  • Jakob likes this

#5
_SputnicK_

    Member

  • Members
  • 61 posts
  • Location: USA

I am surprised this article left out the potential dangers of self-replicating nano-technology: if we assume that nano-bots could replicate themselves at the atomic level, then there is no reason to believe that they couldn't exponentially increase their harvesting size until the entire earth is desolated. 

 

It might sound like science fiction, but this is something that really might be possible in a world capable of atomic-scale manipulation.


Artificial intelligence will reach human levels by around 2029.

Follow that out further to, say, 2045, we will have multiplied the intelligence, the human biological machine intelligence of our civilization a billion-fold.
-Ray Kurzweil


#6
Jakob

    Fenny-Eyed Slubber-Yuck

  • Members
  • 5,221 posts
  • Location: In the Basket of Deplorables

I am surprised this article left out the potential dangers of self-replicating nano-technology: if we assume that nano-bots could replicate themselves at the atomic level, then there is no reason to believe that they couldn't exponentially increase their harvesting size until the entire earth is desolated. 

 

It might sound like science fiction, but this is something that really might be possible in an atomic-scale 3D printing world.

The danger is not real. We would have to actively try to destroy the world in order to do so with nanoweapons (as with anything else).



#7
_SputnicK_

    Member

  • Members
  • 61 posts
  • Location: USA

 

I am surprised this article left out the potential dangers of self-replicating nano-technology: if we assume that nano-bots could replicate themselves at the atomic level, then there is no reason to believe that they couldn't exponentially increase their harvesting size until the entire earth is desolated. 

 

It might sound like science fiction, but this is something that really might be possible in an atomic-scale 3D printing world.

The danger is not real. We would have to actively try to destroy the world in order to do so with nanoweapons (as with anything else).

 

The threat of warmongers and terrorists with the wrong technology in their hands is exactly why the possibility of a genetically engineered super virus is taken seriously, and why there is real talk of nuclear war between North Korea and the US. If you're suggesting that self-replicating nanotechnology is not a threat simply because it requires malicious intent, then you might need to reconsider what actually constitutes a serious threat.


Artificial intelligence will reach human levels by around 2029.

Follow that out further to, say, 2045, we will have multiplied the intelligence, the human biological machine intelligence of our civilization a billion-fold.
-Ray Kurzweil


#8
Vivian

    Member

  • Members
  • 25 posts

Self-replicating nanobots would be a really dangerous weapon. Whoever got the nanobots first would win the war. If terrorists got them, the world would be left with only the terrorists. We shouldn't build them without building a counter to them.

About the nuclear war with North Korea: Trump just posted a message on Twitter that made North Koreans think he was starting a war, and they began training to attack the USA. I'm not American, I'm Brazilian, but I was afraid of a world war as soon as Trump became US president. The risk of nuclear war increases too much when we have two jerks with too much power. I hope Trump is no longer in power by the time we get nanobots.

We here in Brazil choose too many bad politicians, but at least we don't have nuclear weapons, and we don't have many people wanting to kill us all.



#9
Whereas

    Member

  • Members
  • 469 posts

As one of my professors used to say: "All of humankind's problems could be solved by a mid-sized asteroid."

Anyway, there is a report that came out last year that pegs our odds of going extinct in the next 100 years at 9.5%. That means you're more likely to die in a human extinction event than in a car accident...
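For what it's worth, the car-accident comparison roughly checks out arithmetically. A quick sketch below; the ~1% lifetime odds of dying in a crash is an assumed ballpark figure (commonly cited for the US), not something taken from the report.

    # Compare the report's 100-year extinction probability with an assumed
    # ~1-in-100 lifetime risk of dying in a car crash (ballpark US figure).
    p_extinction_100yr = 0.095    # figure quoted from the report above
    p_car_crash_lifetime = 0.01   # assumption, roughly 1 in 100

    print(p_extinction_100yr / p_car_crash_lifetime)  # ~9.5x higher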


If you're wrong, how would you know it?


#10
Jakob

    Fenny-Eyed Slubber-Yuck

  • Members
  • 5,221 posts
  • Location: In the Basket of Deplorables

 

 

I am surprised this article left out the potential dangers of self-replicating nano-technology: if we assume that nano-bots could replicate themselves at the atomic level, then there is no reason to believe that they couldn't exponentially increase their harvesting size until the entire earth is desolated. 

 

It might sound like science fiction, but this is something that really might be possible in an atomic-scale 3D printing world.

The danger is not real. We would have to actively try to destroy the world in order to do so with nanoweapons (as with anything else).

 

The threat of warmongers and terrorists with the wrong technology in their hands is exactly why the possibility of a genetically engineered super virus is taken seriously, and why there is real talk of nuclear war between North Korea and the US. If you're suggesting that self-replicating nanotechnology is not a threat simply because it requires malicious intent, then you might need to reconsider what actually constitutes a serious threat.

 

You're missing the point. It is all but impossible to make self-replicating nanoweapons that are so prolific as to pose a threat to the world. If they existed, then the technology for defenses would almost certainly exist too.

 

Also, nobody wants to destroy the fucking world.



#11
Alislaws

    Democratic Socialist Materialist

  • Members
  • 716 posts
  • Location: London

 

"Also, nobody with the intelligence and resources to manufacture super advanced nano-weapons wants to destroy the fucking world."

Clarified, because there are always idiots and crazy people!

 

Anyway, I don't think global warming is a threat to humanity as a whole. Absolute worst case, if it leads to huge food shortages or an ice age or something, it could kill off 90% of us, but we'd survive as a species.

 

Pandemic risk is actually a big one, with antibiotics being massively overused and the risk of stuff like bird flu making the jump to pigs, where we could get a combo of the 60%-lethal, low-transmission bird flu with the highly infectious, low-mortality swine flu, and then something like half the world's population is probably done for.
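To make the arithmetic behind that scenario explicit (a rough sketch only: the attack rate is an assumption for illustration, and the 60% figure is the bird-flu fatality rate mentioned above):

    # deaths = population * attack rate * case fatality rate
    population = 7.4e9     # roughly the current world population
    attack_rate = 0.8      # assumed fraction of people infected (illustrative)
    fatality_rate = 0.6    # H5N1-like fatality rate quoted above

    deaths = population * attack_rate * fatality_rate
    print(f"{deaths:.2e}")  # ~3.55e9, i.e. roughly half the world's population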

 

Again though, that's crash of civilization territory, not extinction territory. 


  • Jakob likes this

#12
Jakob

    Fenny-Eyed Slubber-Yuck

  • Members
  • 5,221 posts
  • Location: In the Basket of Deplorables

 

Clarified, because there are always idiots and crazy people!

Fair point.

 

 

Anyway, I don't think global warming is a threat to humanity as a whole. Absolute worst case, if it leads to huge food shortages or an ice age or something, it could kill off 90% of us, but we'd survive as a species.

 

Pandemic risk is actually a big one, with antibiotics being massively overused and the risk of stuff like bird flu making the jump to pigs, where we could get a combo of the 60%-lethal, low-transmission bird flu with the highly infectious, low-mortality swine flu, and then something like half the world's population is probably done for.

 

Again though, that's crash of civilization territory, not extinction territory.

Absolutely. Climate change may kill millions, but nowhere near billions even in the worst-case scenario. Antibiotic-resistant bacteria are probably the most significant threat, but even then, it seems unlikely that everyone dies.

 

I think a lot of people don't get that human extinction means that Everyone. Dies. As in, literally everyone. It's hard to imagine a disaster so grave that one breeding population of humans couldn't ride it out in a bunker somewhere.



#13
Vivian

    Member

  • Members
  • 25 posts

I don't think civilization can end either, as long as we have 3D printers and the internet. If we can save the internet, we can access all the knowledge we built up over the previous years. We will soon be able to build anything with a 3D printer, so we could actually build everything again.

 

I know that bacteria are becoming resistant, but antibiotics aren't the only weapons we have against bacteria; we have vaccines too. And we can avoid getting sick with masks, condoms, etc. Also, it's very difficult to kill a large population with a virus or bacterium, because if a population is big enough, the chances of there being resistant individuals increase. Also, the viruses and bacteria that spread the most are the ones that don't kill their hosts too fast, so in a big population the virus becomes more and more "friendly" before it infects everyone. And we are at 7.4 billion.
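The "resistant individuals" point is easy to put numbers on. A minimal sketch, assuming resistance is a rare trait that each person carries independently with some small probability p; the value of p here is purely illustrative:

    # Chance that a population of size N contains at least one person with
    # pre-existing resistance, if each person is resistant with probability p.
    N = 7_400_000_000    # ~7.4 billion people
    p = 1e-6             # assumed, purely illustrative, frequency of resistance

    expected_resistant = N * p           # ~7,400 people on average
    p_at_least_one = 1 - (1 - p) ** N    # effectively 1.0
    print(expected_resistant, p_at_least_one)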

 

We've never seen a virus or bacterium that can kill all humans.

 

HIV has a fatality rate of about 99% without treatment, but with treatment life expectancy is almost the same as the general population's, and we know how to prevent it.

Ebola has up to a 90% fatality rate.

Bird flu has a 60% fatality rate.

 

So, even if everyone got sick with AIDS and didn't have treatment, there would still be 70 million people left. That is more than chimps and all the other great apes put together. With the internet, everyone can learn to do what is needed to survive. With 3D printers, everything gets even easier.

 

Even terrorists wouldn't build a virus to kill everyone without a vaccine or something.

 

A global warming catastrophe wouldn't happen too fast, and people would have time to adapt somewhat.


  • Jakob likes this

#14
rennerpetey

    To infinity, and beyond

  • Members
  • 175 posts
  • Location: Lost in the Delta Quadrant

 

Even terrorists wouldn't build a virus to kill everyone without a vaccine or something.

Don't be so sure about that.


Pope Francis said that atheists are still eligible to go to heaven, to return the favor, atheists said that popes are still eligible to go into a void of nothingness.


#15
Jakob

    Fenny-Eyed Slubber-Yuck

  • Members
  • 5,221 posts
  • Location: In the Basket of Deplorables

 

 

Even terrorists wouldn't build a virus to kill everyone without a vaccine or something.

Don't be so sure about that.

 

Terrorists have political goals; they don't just shoot people and blow things up because they like the noise.


  • Yuli Ban likes this


#16
Vivian

    Member

  • Members
  • 25 posts

Even if Islamic State managed to make a lethal virus, they would also develop a vaccine, because they would have to leave some Muslims alive, or people who could become Muslims. They would say something like "See those dead people? They weren't Muslims; you have to be a Muslim to get the vaccine!" This would give us a chance to fight back: send spies in to take the vaccine, or capture some of them and torture them into saying where the vaccines are.

 

On the topic of antibiotic-resistant bacteria: they are mostly Staphylococcus aureus, Streptococcus, M. tuberculosis, E. coli, and a bunch of bacteria that only kill weakened people and/or don't spread too easily. Most lethal viruses have far higher fatality rates.

 

https://www.livescie...s-on-earth.html

 

https://lucbourne.sc...erous-bacteria/

 

Bacteria can do damage for sure, but they won't kill us all.



#17
Alislaws

    Democratic Socialist Materialist

  • Members
  • 716 posts
  • Location: London

While some terrorists are willing to die for their cause, they are usually fairly stupid. The guys running the show are not the ones who end up with explosives attached to them. 

 

Even the smartest terrorists are unlikely to be able to research or engineer cutting-edge nano- or bio-weapons. In general, a lifetime of scientific education does not lead neatly to the sort of unthinking fanaticism needed to spend years working very hard to wipe out humanity.

 

If anyone has any examples of competent scientists working to literally end the world/humanity, please let me know, because that sounds super interesting!

 

If some sort of plague or virus could be engineered, or deliberately evolved or something, then you might get some terrorist leader locking himself in a bunker and having his idiots release it in some populated places.

 

We are much more likely to get a naturally evolved plague that we can't stop in time than terrorists are to manage it. If you spend a few minutes thinking about how hard it actually is to protect every part of a country, when all that is needed for a terrorist attack is a car and a willingness to drive on the pavement (sidewalk), you'll realize that terrorism is not actually very difficult.



#18
Vivian

    Member

  • Members
  • 25 posts

Yes, it's easy to commit terrorism, so labs that are bioengineering viruses have to have very good security, protocols for what to do with the area if terrorists break in, and alarms that send messages to people far away who can give orders.

 

As for AI, even if they are superintelligent, their motivations should always be determined by humans. They shouldn't be able to come up with a new motivation out of their own circuits.



#19
Whereas

    Member

  • Members
  • 469 posts

Isn't the IQ required for an individual to succeed in wiping out humankind continuously falling due to technological advancements? As in most things, defending is harder than attacking, and it's not unlikely that for a period of years or perhaps decades in the near future it will be easy to manufacture a gene-edited super-plague (which is a fairly simple task from a high-level point of view) while we still won't have any good way of countering it (it takes us at least half a year to produce a new vaccine today, and we don't yet have proper medical nanobots either). AI advancements are going to be of greater use to bad actors here, since they'll probably be one of the main reasons the "required IQ" for this keeps falling in the near future. (And gods help us should an actual AGI figure out that we're standing in the way of it producing more paper clips...)

I'd also like to pour some rain on the "we're not extinct if it only wipes out 90% of us" reasoning. If modern civilization collapses (which wouldn't require a 90% death rate - 50% would be *more* than enough), then we're *done*. Sure, some hunter-gatherer tribes may survive, but what big picture would they be in then? Scientific and technological knowledge would be lost and gone, infrastructure would crumble, the resources that allowed a low-tech civilization to grow into a high-tech civilization would have been used up, and global warming (along with the badly broken nitrogen cycle and the wildlife and crop losses from the ongoing sixth mass extinction) would still continue for a few decades, further transforming vast swathes of the world into a very human-inhospitable environment. Even *if* it were possible to recover from that, you'd probably at least be looking at an additional "lost" millennium, in which time some other bad thing could happen without us being a Type I or II civilization yet. A supervolcano, a gamma-ray burst, and eventually the same thing that wiped out global civilization the first time. The Great Filter may yet be right in front of us.


If you're wrong, how would you know it?


#20
Alislaws

    Democratic Socialist Materialist

  • Members
  • 716 posts
  • Location: London

I'd also like to pour some rain on the "we're not extinct if it only wipes out 90% of us" reasoning. If modern civilization collapses (which wouldn't require a 90% death rate - 50% would be *more* than enough), then we're *done*. Sure, some hunter-gatherer tribes may survive, but what big picture would they be in then? Scientific and technological knowledge would be lost and gone, infrastructure would crumble, the resources that allowed a low-tech civilization to grow into a high-tech civilization would have been used up, and global warming (along with the badly broken nitrogen cycle and the wildlife and crop losses from the ongoing sixth mass extinction) would still continue for a few decades, further transforming vast swathes of the world into a very human-inhospitable environment. Even *if* it were possible to recover from that, you'd probably at least be looking at an additional "lost" millennium, in which time some other bad thing could happen without us being a Type I or II civilization yet. A supervolcano, a gamma-ray burst, and eventually the same thing that wiped out global civilization the first time. The Great Filter may yet be right in front of us.

 

We wouldn't drop straight back to hunter-gatherer tribes in one generation. The fraction of the population left would contain at least some people who understand the scientific method, which, coupled with all the remains of today's civilization (all the buildings, old machines, books, etc.), means we would be very unlikely to plunge back to a medieval tech level.

 

Also, however bad global warming got, it would hardly be a big issue if 50-90% of the world's population were dead; the remaining people could all easily fit on, and feed themselves from, land far enough north or south of the equator to grow crops just as we do now. We'd also have most of the industrial apparatus and resources needed to support 7 billion people, and only (for example) 2 billion people to support with it. So average standards of living could rise above today's very quickly once everyone stops killing each other.

 

So unless some sort of virus removed everyone's ability to read, we'd probably recover. 


  • Jakob likes this



