https://arxiv.org/ftp/arxiv/papers/1902/1902.03689.pdf

Full Abstract:

Background. Increasingly, expert observers and artificial intelligence (AI) progression metrics indicate AI will exceed human intelligence within the next few decades. Whether artificial general intelligence (AGI) that exceeds human capabilities will be the single greatest boon in history or a disaster to humankind is unknown. No proof exists that AGI will benefit humans, nor is there a proof or proven method ensuring that AGI will not harm or eliminate humans.

Objective. I propose a set of logically distinct conceptual components that are necessary and sufficient to 1) ensure that most known AGI scenarios will not harm humanity and 2) robustly align AGI values and goals with human values.

Methods. By systematically addressing each category of pathway to malevolent AI, we can induce the methods/axioms required to redress that category.

Results and Discussion. Distributed ledger technology (DLT, 'blockchain') is integral to this proposal, e.g. to reduce the probability of hacking; to provide an audit trail that detects and corrects errors, or identifies components causing vulnerability or failure so they can be replaced or shut down remotely and/or automatically; and to separate and balance key AGI components via decentralized apps (dApps). Smart contracts based on DLT are necessary to address evolution of AI that will be too fast for human monitoring and intervention.

The proposed axioms:
1) Access to technology by market license.
2) Transparent ethics embodied in DLT.
3) Morality encrypted via DLT.
4) Behavior control structure with values (ethics) at roots.
5) Individual bar-code identification of all critical components.
6) Configuration Item (from business continuity/disaster recovery planning).
7) Identity verification secured via DLT.
8) 'Smart' automated contracts based on DLT.
9) Decentralized applications: AI software code modules encrypted via DLT.
10) Audit trail of component usage stored via DLT.
11) Social ostracism (denial of societal resources) augmented by DLT petitions.
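The tamper-evidence property behind axiom 10 can be illustrated without any actual blockchain: a DLT-style audit trail is, at minimum, an append-only log in which each entry commits to the hash of the previous one, so a retroactive edit to any entry breaks the chain and is detectable. The sketch below is a simplified local simulation under that assumption (the class and field names such as `AuditTrail` and `component_id` are illustrative, not from the paper, and a real deployment would replicate the log across ledger nodes):

```python
import hashlib
import json

def entry_hash(body: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding of an entry body."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

class AuditTrail:
    """Append-only, hash-chained log of component usage (a stand-in for
    axiom 10's DLT-stored audit trail)."""

    def __init__(self):
        self.entries = []

    def record(self, component_id: str, action: str) -> dict:
        # Each entry commits to the previous entry's hash (genesis uses zeros).
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "component_id": component_id,  # e.g. an axiom-5 bar-code identifier
            "action": action,
            "prev_hash": prev,
        }
        entry = dict(body, hash=entry_hash(body))
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        # Recompute every hash and check each back-link; any edit to an
        # earlier entry invalidates all entries after it.
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev or entry_hash(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Usage: after `trail.record("module-A1", "inference call")` and further records, `trail.verify()` returns `True`; mutating any earlier entry makes it return `False`. What a real ledger adds on top of this chaining is distribution, so no single party can rewrite the whole chain and re-hash it.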