The Possible End Of Mankind - Oxford Institute Forecasts

In a forthcoming paper entitled "Existential Risk Prevention as the Most Important Task for Humanity," a team of mathematicians, philosophers and scientists at Oxford University's Future of Humanity Institute says there is ever-increasing evidence that the human race's reliance on technology could, in fact, lead to its demise. The group argues that we face a real risk to our own existence: not a slow decline in some distant, theoretical future, but extinction within the next century.

According to the institute's director, Dr Nick Bostrom, the chances of pandemics or natural disasters wiping out our species within such a narrow timeframe are extremely small. "The Earth will remain habitable for at least another billion years. Civilization began only a few thousand years ago. If we do not destroy mankind, these few thousand years may be only a tiny fraction of the whole of civilized human history," he writes. This accords with NASA's 2012 series of answers to frequently asked questions about the "end of the world."

Instead, it is the unknown factors behind innovative technologies that Bostrom says pose the greatest risk going forward. Synthetic biology, nanotechnology and machine intelligence have contributed greatly to our quality of life, but they also threaten our future. Any of them could become our own worst enemy, if they aren't already; Bostrom calls them "threats we have no track record of surviving."

Seán O'Heigeartaigh, a geneticist at the institute, draws an analogy with the algorithms used in automated stock market trading. These mathematical formulas can have direct and destructive consequences for real economies and real people. "We are developing things that could go wrong in a profound way," he told the BBC in a recent interview.

Dr Bostrom says there is a real gap between the speed of technological advance and our understanding of its implications. The team stresses the need for international policymakers to pay serious attention to the reality of species-obliterating risks. According to Dr Bostrom, the stakes couldn't be higher, and if we get it wrong, this could be humanity's final century.