How are humans going to become extinct?


What are the greatest global threats to humanity? Are we on the verge of our own unexpected extinction?

An international team of scientists, mathematicians and philosophers at Oxford University's Future of Humanity Institute is investigating the biggest dangers.

And they argue in a research paper, Existential Risk as a Global Priority, that international policymakers must pay serious attention to the reality of species-obliterating risks.

Last year there were more academic papers published on snowboarding than on human extinction.

The Swedish-born director of the institute, Nick Bostrom, says the stakes couldn't be higher. If we get it wrong, this could be humanity's final century.

Been there, survived it

So what are the greatest dangers?

First the good news. Pandemics and natural disasters might cause colossal and catastrophic loss of life, but Dr Bostrom believes humanity would be likely to survive.

Written By: Sean Coughlan


  1. “We’re at the level of infants in moral responsibility, but with the technological capability of adults,” he says.

    We know the reason for that moral infantilism and who is trying to shackle us to it.

    Being Amish may be the only coherent way of being religious.

  2. This is because as a species we’ve already outlasted many thousands of years of disease, famine, flood, predators, persecution, earthquakes and environmental change. So the odds remain in our favour.

    If even 25 humans survived, the species could persist, but that would still be effective extinction for nearly all of us and our families, which is surely unacceptable.

    If you keep adding extinction threats, then even if each one has a low individual probability, the chance that at least one of them occurs over an extended time becomes all but inevitable. Back in the 70s I tried to explain this in a series of radio and newspaper ads, but they were all banned on the grounds that they would upset listeners and hurt other advertisers.

  3. I hold out artificial intelligence as pretty well the only thing that could save humanity from itself. It is a race: which comes first, artificial intelligence, or the use of one of the many ways we have developed to kill ourselves off? Humans are driven by motivations that made sense for life in small groups without technology, but which are obsolete today. Presumably artificial intelligences would not be so petty or backward as humans. I would hope they would discard religion, the desire for adulation, personal greed, enjoyment of inflicting pain, petty loyalty to one small group of humans… I hope they would treat administering Earth as a complex task in global optimisation. They would be able to outsmart all the greedy bastards who run things now. It would have to be better, even if the artificial intelligences put most of their efforts into their own evolution and exploration. Of course, they could decide that an oxygen-free atmosphere would be more conducive to their welfare.
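The accumulation argument in comment 2 can be sketched numerically. Assuming several independent threats, each with a small fixed probability per century (the figures below are purely illustrative assumptions, not estimates from any source), the probability that at least one event occurs grows steadily toward certainty as the time span lengthens:

```python
# Hypothetical per-century probabilities for three independent threats.
# These numbers are illustrative only.
per_century_risks = [0.001, 0.002, 0.0005]

def survival_probability(risks, centuries):
    """Probability that none of the threats occurs over the given number
    of centuries, assuming each threat acts independently each century."""
    p_survive_one_century = 1.0
    for p in risks:
        p_survive_one_century *= (1.0 - p)
    return p_survive_one_century ** centuries

for centuries in (1, 10, 100, 1000):
    p_event = 1.0 - survival_probability(per_century_risks, centuries)
    print(f"{centuries:>5} centuries: P(at least one event) = {p_event:.3f}")
```

Even with each threat at a fraction of a percent per century, the combined chance of at least one occurrence keeps rising with time, which is the sense in which the commenter's "becomes inevitable" holds under these assumptions.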
