Stephen Hawking's Final Warnings: AI, Superhumans, Asteroids
Stephen Hawking was undeniably one of the greatest minds in recent history, and his passing left a gaping hole in the academic and scientific communities. He didn't leave the world bereft, however: he left behind a corpus of essays and articles for us to munch on for months, even years to come. Some of those, though, are less uplifting than others. The late, great physicist left warnings about the dangers human civilization faces in the near future, including everyone's favorite scapegoat, AI.
To be clear, Hawking doesn't say that artificial intelligence will develop ill will toward humanity. It will, instead, simply be very good and very efficient at carrying out its objectives, even if those objectives are in conflict with ours and with our welfare. He proposes regulating AI ASAP.
But we don't need to look to artificial lifeforms for danger. Our own ingenuity might be our undoing. Hawking believes that, sooner or later, humans will start tinkering with their own genes and create a new generation of superhumans able to survive nuclear war and climate catastrophe, rendering "regular" humans obsolete and nearly extinct. We're likely to see political and social problems that have so far only been hinted at in comics and movies.
That "forced evolution", however, might be necessary if we are to survive what Hawking labels as the two biggest threats to our planet: an asteroid the size of which killed the dinosaurs and climate change that would leave us with Venusian temperatures of 250C. The former we have no defense against. The latter will be our own fault.
We might find salvation in the stars, both through our own travels and with the help of intelligent life elsewhere in space. Quartz notes that Hawking did believe we are not alone. We just haven't found anyone yet, perhaps because we've overlooked what intelligent life might look like beyond our own.