Anonymous 08/13/2019 (Tue) 11:16:34 No.6286
EA is worse than average because its legible philosophy preselects for people who are bad at game theory, which at second order selects for people who are good at game theory and happy to exploit them. Altruism is another word for playing CooperateBot in the prisoner's dilemma, and when you have a bunch of CooperateBots in one place, not just sociopathic DefectBots but also otherwise cool PrudentBots will come in to consume the surplus.
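To make the dynamic concrete, here is a minimal Python sketch (my own toy, not anyone's formalism): the standard PD payoffs, the three bots, and where the surplus goes. The PrudentBot here is a crude identity check rather than the proof-based agent from the literature, but the payoff flow is the point.

PAYOFF = {  # (my_move, their_move) -> my payoff, standard T=5, R=3, P=1, S=0
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def cooperate_bot(opponent):
    return "C"  # cooperates unconditionally, no matter who it faces

def defect_bot(opponent):
    return "D"  # defects unconditionally

def prudent_bot(opponent):
    # Toy stand-in: cooperate with agents that would punish defection,
    # exploit anyone known to cooperate unconditionally.
    return "D" if opponent is cooperate_bot else "C"

def play(a, b):
    return PAYOFF[(a(b), b(a))], PAYOFF[(b(a), a(b))]

print(play(cooperate_bot, cooperate_bot))  # (3, 3): the surplus
print(play(prudent_bot, prudent_bot))      # (3, 3): conditional cooperators still get it
print(play(prudent_bot, cooperate_bot))    # (5, 0): the surplus gets consumed
print(play(defect_bot, cooperate_bot))     # (5, 0): ditto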
"You'd ruin your shoes to cooperate locally in a context where the community will ensure reciprocity, therefore you should send money to far off places which will be spent by the fargroup, who is overwhelmingly unlikely to reciprocate." Uh, obviously not? This is a bad argument in general before we look at any of the specifics about the people we're subsidizing. And making it publicly is a beacon which attracts the sharks.
"But if everyone cooperated every turn, everyone would be better off over the long haul!" That is not causally entangled with your trading partner's decision function. And saying that out loud is going to attract the sort of people who want to value pump you. And lo and behold, EA is filled with sociopaths.
It is deeply ironic that one subculture over, MIRI spends part of its time formalizing when and under what conditions an agent should cooperate, in both the IPD and the one-shot. Protip: the answer isn't CooperateBot, and it requires conditioning your response on your trading partner.
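A hedged sketch of what "conditioning your response on your trading partner" buys you, in the iterated game only (the MIRI one-shot results are about program equilibrium with proof-based agents, which this toy doesn't capture): a strategy that mirrors its partner keeps the cooperative surplus with cooperators and stops leaking it to defectors after one round.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def iterate(strat_a, strat_b, rounds=100):
    # Each strategy sees only the opponent's history of moves.
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

cooperate_bot = lambda opp: "C"   # unconditional altruist
defect_bot    = lambda opp: "D"   # unconditional exploiter
# Conditional cooperator: open with C, then mirror the partner's last move.
conditional   = lambda opp: "C" if not opp or opp[-1] == "C" else "D"

print(iterate(cooperate_bot, defect_bot))   # (0, 500): pure surplus transfer
print(iterate(conditional, defect_bot))     # (99, 104): exploited once, then parity
print(iterate(conditional, cooperate_bot))  # (300, 300): full cooperation preserved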