i previously wrote a post about how a superintelligence with a suboptimal decision system likely loses to alien superintelligences that are more optimal, at the scale of cosmic wars between superints.
i don't think this is necessarily true: maybe physics does look like a funny graph à la wolfram, and then maybe we can carve out pieces of space that keep growing but are causally isolated from the rest of the universe; in that case, whether a given causally isolated bubble ever has to encounter an alien superint depends purely on whether it decides to generate alien-bearing space, which is easy enough to avoid.
unless otherwise specified on individual pages, all posts on this website are licensed under the CC0 1.0 license.
unless explicitly mentioned, all content on this site was created by me, not by others or by AI.