in The Sequences, Eliezer Yudkowsky describes rationality as the art of systematically improving the accuracy of one's beliefs (epistemic rationality) and of systematically achieving one's values (instrumental rationality).
now, personally, i intrinsically value a bunch of things, but having accurate beliefs isn't necessarily one of them; for me, rationality is an instrumental value in that it helps me achieve my other values better.
in general, i value people being able to do whatever they want, so they shouldn't have to form accurate beliefs if they don't care to. in fact, forming inaccurate beliefs is a great source of culture, and culture is something i do personally intrinsically value.
but we live in the era of liberal democracies, where society requires people to form accurate beliefs, because people are the ones directing society through elections. i see the need for people to be rationalists as an unfortunate necessity; hopefully one we can be rid of when we reach a topia where human decisions are no longer the pillar of civilization.
not, of course, that there's anything wrong with any individual or even group choosing to intrinsically value rationality. the part i care about is that it be a choice.
unless otherwise specified on individual pages, all posts on this website are licensed under the CC0 1.0 license.
unless explicitly mentioned, all content on this site was created by me; not by others, nor by AI.