Thou shalt not take a human’s job
Much to its credit, the UK’s House of Lords (note for our American readers: think of it as a bit like the Senate, but not all that much) has launched a public inquiry into Artificial Intelligence. Since we’ve all known for a while that AI is going to dominate pretty much everything in the 21st century (except maybe kittens), it’s about time our political leaders started to weigh in.
Bishop Croft decided to take a leaf out of God’s book and issue Ten Commandments for AI – going one better (maybe six better) than Isaac Asimov with his famous Three Laws of Robotics (which actually amount to four, as he added Law 0 later). The Three Laws are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In his later novel Robots and Empire, a Zeroth Law was introduced:
0. A robot may not injure humanity or, by inaction, allow humanity to come to harm.
The Bishop’s move, however, has a rather 2018 feel. Since he’s worried about jobs, his commandments include:
- The primary purpose of AI should be to enhance and augment, rather than replace, human labour and creativity
And he’s also in favour of, well, doing good:
- Governments should ensure that the best research and application of AI is directed towards the most urgent problems facing humanity
Of course, the issue here has little to do with robots and AI, and everything to do with the relationship between corporations, governments, and the public interest. AIs will do as they are told. But who will tell them?