My mother is worried about robots. Stephen Hawking is too, although his worry concerns robots in combination with artificial intelligence (AI). People were also worried during the Industrial Revolution (around 1800) and after the introduction of computers (from 1938 onwards). However, both mechanisation and automation resulted in a substantial increase in prosperity. Are the current concerns about robots and AI any different?
Prof Stephen Hawking, possibly the world’s leading scientist, has warned that robots in combination with AI could end mankind. The same theme can be found in famous movies like 2001: A Space Odyssey, Battlestar Galactica, The Matrix, Terminator and Transformers. In short, from (science) fiction to non-fiction.
I suspect that Stephen Hawking’s worries about AI ultimately come down to the expected impossibility of adhering to the Three Laws of Robotics, outlined in 1942 by the famous sci-fi author and scientist Prof Isaac Asimov (a toy sketch of their priority ordering follows the list):
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
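The three laws form a strict priority ordering: the First Law overrides the Second, which in turn overrides the Third. Purely as an illustration (this is my own hypothetical sketch, not Asimov’s text or Hawking’s argument), the Python snippet below encodes that ordering; the hard part in practice is of course deciding what counts as harm at all, which the boolean flags here simply assume away.

```python
from dataclasses import dataclass


@dataclass
class Action:
    """A hypothetical candidate action a robot is weighing up."""
    description: str
    harms_human: bool        # carrying it out would injure a human
    allows_human_harm: bool  # *not* acting would let a human come to harm
    ordered_by_human: bool   # a human has ordered it
    endangers_robot: bool    # it threatens the robot's own existence


def permitted(action: Action) -> bool:
    """Check the Three Laws in strict priority order (toy model)."""
    # First Law: never injure a human, nor allow harm through inaction.
    if action.harms_human or action.allows_human_harm:
        return False
    # Second Law: obey human orders unless they conflict with the First Law.
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, but only where the higher laws are silent.
    return not action.endangers_robot
```

Even in this toy form the difficulty is visible: the flags presuppose that the robot can already judge whether an action harms a human, and that judgement, rather than the ordering of the laws, is precisely where the trouble lies.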
Prof Hawking says the primitive forms of artificial intelligence developed so far have already proved very useful, but he fears the consequences of creating something that can match or surpass humans. “It would take off on its own, and re-design itself at an ever increasing rate,” he said. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (source: BBC News, 2 Dec 2014)
Global communication tools, like the internet, would be required for AI robots to combat humans; without these they would merely operate as mavericks. The same applies to mankind and its armies, though. As long as unlimited energy storage is out of reach, machines will need a fresh energy supply, just as humans will always require food and water. Planes, vehicles and weapons that are part of a (communication) network will prove very vulnerable. Access to communication and energy (including food and water) will ultimately be decisive; weapons far less so.
When we disregard friendly and helpful AI robots (Transformers), time travel (Terminator) and a saviour (The Matrix), only running and hiding remain, as shown in Battlestar Galactica. A gloomy prospect. After ample consideration, I conclude that Stephen Hawking is probably right. No surprise there. Curiosity killed the cat; greed and a lack of ethics will be the demise of man and mankind.
The 2nd Commandment reads: Thou shalt not make unto thee any graven image.