let's say aligned superintelligence gives us utopia. let's even say compute is infinite. what might still be scarce? one possible answer relates to moral patients: even with theoretically limitless material capabilities, under aligned superintelligence there are some things you're not allowed to do. for example, it generally shouldn't allow you to create moral patients who suffer without their consent.
one general scarce resource, then, is getting a moral patient to be involved with something, and especially getting a specific moral patient to be involved with something. maybe i knock on someone's door to ask them to cook me a meal, because i care that it has been cooked by a real person. if they're up for it, then maybe they cook me a meal and go back to whatever they were doing. but another way this could work is that what i'm negotiating is a future, yet-to-be-created fork of that person: a fork who would only retroactively consent to exist and cook me a meal if, before the forking, i had offered something the original person would consider a fair trade for what i'm asking.
and then, we get into the question of: do i get to create a fork of someone without the original's consent, if the fork consents? how much does a moral patient have intellectual property rights over beings that are like it in most ways, except for consenting to be forked? i've no idea what the answer to this is; my long-standing strong philosophical opposition to intellectual property does not necessarily carry over to this in any straightforward way. so, i'll leave it as an open question, for now.