Anonymous 08/13/2019 (Tue) 07:06:26 No.5506 del
>>5505
Certain primate heuristics are the direct cause of most things we do, but they're not usually considered goals, and they tend to conflict with other primate heuristics in some way.
The terminal goal of EA is roughly to increase global utility as much as possible, i.e. utilitarianism.
A more complicated but not much more useful explanation is that EA communities explicitly value utilitarianism, and that doing more utilitarian things is a way to gain social status in those communities.
So if you want to get all technical and reductionist, it's caused by the drive for social status, which the group is collectively harnessing to increase global utility. But that way of framing things has limited use; in most contexts it's better to skip the parts with the words "status" and "signaling".

But that's getting away from the topic of the thread.
The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming; it's that it (presumably) makes the people involved happier. Suppressing primate heuristics is instrumental, not the goal.
The reason EA is considered good by a lot of rationalists is, cynically, that it lets you appear rational and ethical, or, less cynically, that it increases global happiness.