
posted on 2022-05-16

smaller X-risk

a superintelligence killing us all is a very large X-risk.

the superintelligence will tile its values in all directions: not just through space at the speed of light or faster, but also, if it can, by hacking physics and traversing, for example, the worldlines of the quantum many-worlds.

we may be able to create smaller X-risks that only make us extinct in this timeline, on this earth. there are a few reasons we might want to do this:

a smaller X-risk, where we limit the damage to just our civilization, seems harder to bring about than tiling the cosmos with paperclips; but at least it might be easier than other plans.

in a similar way, reducing our civilization to ashes without actually becoming extinct might also be a way to get another shot, if we think we're likely to do less badly next time.

remember: this is bigger than all of us. when the fate of the cosmos is at play, we can't afford to be too selfish.


CC_-1 License: unless otherwise specified on individual pages, all posts on this website are licensed under the CC_-1 license.
unless explicitly mentioned, all content on this site was created by me; not by others or by AI.