That's another paper for my "to-read" list...
I'm currently the sole "NO" here (Edit: Still the sole NO. The majority is against me. Hmmm). Some people will know my reasoning, but for those of you who don't:
I'm not a realist about metaethics. If you put a gun to my head and forced me to narrow that down, I'd say I was an emotivist, with some caveats. I have lots of random intuitions about what I should be doing, and I try to bring those to some sort of reflective equilibrium (I think that's one of the areas in which I depart from normal emotivists). My intuitions do not collapse down into "prevent suffering, cause happiness". They probably don't even collapse into "maximize the satisfaction of preferences". They don't even collapse down into perfect altruism - no matter how I try to force them.
Hence, I remain largely (>50%) selfish. I don't care as much about animals as I do about humans. I care about people close to me more than I care about strangers. Of course, I still care, and I can do math - so I still want to stop wild animal suffering etc. And of course I dance a fine line between what's in accordance with my ethics and what's just needed for me to be psychologically healthy...
But anyway, that's a long-winded way of saying: "I'm not a utilitarian in the pure sense, therefore I don't endorse utilitronium shockwaves, as there are configurations of the universe I would regard as having higher value than that."
So no. I don't endorse it on a visceral or rational level. If presented with this genie, I would work really hard (or pay other people to work really hard) on a way of getting my intuitions into a coherent state - e.g. by getting an Oracle AI to take my brain state and cohere it or something. If the genie said I had ten minutes to answer - I'd probably reel off something about the current preferences of all beings which have preferences, with weighting toward the strength of those preferences and completely banning torture and some other stuff - though that would probably go horribly wrong.
Alan: I seem to recall you're an emotivist as well? I struggle to understand how humans, acting only on their own intuitions, can end up endorsing utilitronium shockwaves. Can you give me a run-down of the factors you think led your intuitions in this direction? I kinda get how realists about metaethics might like it, but - bleh!
World domination is such an ugly phrase. I prefer to call it world optimization.