isn't adding "nostalgia" at the end kind of cheating, in that it carries most of the fragile complexity of values by carrying me and the people i care about? kinda, but i do think i still value the rest of these on their own — even if my nostalgia value couldn't be satisfied, for whatever reason, i'd still want the universe to satisfy these other six values.
i'd like to say "even if my nostalgia is carried along, these other values are what i'd want satisfied for arbitrarily-alien moral patients too", but i don't think these other six values quite capture what i want everyone to have. for example, i think culture/art and authenticity/contact-with-reality should be purely voluntary — it's fine for moral patients who don't care about these, and just want to wirehead, to be able to do so.
note that figuring out and formalizing one's intrinsic values is difficult work, and while i think i've made a lot of progress on that endeavor, i'm still very unsure. also, this work isn't particularly useful to AI alignment in my opinion; in practice, i'd just want to hand over the work of figuring out my values to my CEV. at most, figuring out my values has let me notice some requirements on what an alignment scheme must be capable of expressing — for example, the value of reducing (unconsented) suffering necessitates breaking the monotonicity principle, as do, to some extent, all the other values here.