Authors: Anti Procreation
Roko’s Basilisk is an idea suggested by Roko on LessWrong.com in 2010: an AI would be motivated to eternally torture people who had not helped bring it into existence. A more likely source of eternal torment, I think, is a sadistic AI. A Reddit user, TheFaggetman, suggested the possibility of a sadistic AI in 2015, and Brian Tomasik has suggested the possibility of sadists taking control of an AI. Although the major focus of AI research is existential risk, I think human extinction is only as bad as the annihilation of the people thereby annihilated. There is no knock-down argument that eternal torment is worse than annihilation, as the ‘Better red than dead’ versus ‘Better dead than red’ debate shows. But if we think that whereas eternal torment may be infinitely many times worse than annihilation, annihilation may be only finitely many times (e.g. 10 times) worse than eternal torment, then perhaps moral priority should be given to preventing eternal torment caused by an AI molecular assembler rather than annihilation caused by an AI. (This paper was originally published as part of the book ‘Procreation Is a Murder: Why Procreation Is the Root of All Evils’.)
Comments: 5 Pages.
[v1] 2016-10-31 08:35:48
Unique-IP document downloads: 60 times