/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?


Anonymous 08/13/2019 (Tue) 07:05:36 No. 5503
Polyamory = good, because it's just your obsolete primate programming that makes you upset about Tyrone fucking your gf
EA = good, because your enlightened primate programming makes you care about maximising the welfare of starving n*****s in Africa.
Am I missing something here, or is there a contradiction?


Anonymous 08/13/2019 (Tue) 07:05:54 No. 5504
Some of your primate heuristics are useful, some of them aren't. Obeying them or getting rid of them is always instrumental, never a terminal goal.
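To make that concrete, here's a toy Python sketch (names and numbers made up, not anyone's actual decision theory): the terminal goal is a fixed utility function, and a heuristic gets obeyed only when obeying it scores better than overriding it.

# Toy sketch: utility is terminal, heuristics are instrumental.
def choose(candidates, utility):
    """Pick whichever candidate action scores highest on the terminal goal."""
    return max(candidates, key=utility)

def follow_heuristic(heuristic_action, override_action, utility):
    # Obeying the heuristic is never the goal; it wins only if it
    # happens to serve the terminal utility better than overriding it.
    return choose([heuristic_action, override_action], utility)

# Made-up payoffs: jealousy says "object", the terminal goal says otherwise.
utility = {"object": 1, "let_it_go": 3}.get
print(follow_heuristic("object", "let_it_go", utility))  # -> "let_it_go"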


Anonymous 08/13/2019 (Tue) 07:06:10 No. 5505
>>5504
Some but not all? What are some terminal goals that are not primate heuristics?


Anonymous 08/13/2019 (Tue) 07:06:26 No. 5506
>>5505
Certain primate heuristics are the direct cause of most things we do, but they're not usually considered goals, and they tend to conflict with other primate heuristics in some way.
The terminal goal of EA is roughly to increase global utility as much as possible, i.e. utilitarianism.
A more complicated but not much more useful explanation is that EA communities explicitly value utilitarianism, and doing things that are more utilitarian is a way to increase social status in those communities.
So if you want to get all technical and reductionist, it's caused by the drive for social status, which the group is collectively harnessing for the purpose of increasing global utility. But that way of framing things has limited use. In most contexts it's useful to skip all the parts with the words "status" and "signaling".

But that's getting away from the topic of the thread.
The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming; it's that it (presumably) makes the people involved happier. Suppressing primate heuristics is instrumental, not the goal.
The reason EA is considered good by a lot of rationalists is, cynically, that it lets you appear rational and ethical, or, less cynically, that it increases global happiness.
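If you want "increase global utility as much as possible" spelled out, here's a minimal sketch (invented names and payoffs, obviously not EA's actual model): pick the action whose outcome has the highest summed utility over everyone.

# Toy utilitarian criterion: maximize total utility across the population.
def global_utility(outcome, utility_functions):
    return sum(u(outcome) for u in utility_functions)

def best_action(actions, outcome_of, utility_functions):
    return max(actions, key=lambda a: global_utility(outcome_of(a), utility_functions))

# Two people, two actions, invented payoffs.
people = [lambda o: o["happiness"], lambda o: 2 * o["happiness"]]
outcomes = {"donate": {"happiness": 5}, "hoard": {"happiness": 1}}
print(best_action(outcomes, outcomes.get, people))  # -> "donate"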


Anonymous 08/13/2019 (Tue) 07:06:42 No. 5507
>>5506
So the end goal of EA is universe tiled in hedonium? Why do rationalists still seem to be averse to wireheading themselves?


Anonymous 08/13/2019 (Tue) 07:06:58 No. 5508
>>5507
Some utilitarians would be on board with that. Some utilitarians think that doesn't maximize utility (e.g. preference utilitarians) or even happiness (e.g. certain objections about unique brain states).
There exist EA people who would be on board with that. I don't know how many. I would think people who think tiling the universe with hedonium is good wouldn't object to wireheading either.
There also exist EA people who aren't utilitarians, even though the overall sensibilities of EA are pretty utilitarian. I don't know how many.
Most of EA has much more mundane goals. Preventing malaria or improving the nutritional intake of poor rural laborers with low-cost interventions is good whether you subscribe to preference utilitarianism, negative utilitarianism, or hedonic utilitarianism.
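To illustrate that with made-up numbers: hedonic, preference, and negative utilitarians aggregate welfare differently, but all three rank the cheap intervention above doing nothing.

# Invented per-person tuples: (happiness, preferences_satisfied, suffering),
# for a two-person population before and after e.g. a bed-net program.
before = [(2, 3, 8), (5, 6, 2)]
after = [(6, 7, 1), (5, 6, 2)]

hedonic = lambda pop: sum(h for h, p, s in pop)      # total happiness
preference = lambda pop: sum(p for h, p, s in pop)   # satisfied preferences
negative = lambda pop: -sum(s for h, p, s in pop)    # only suffering counts

for name, view in [("hedonic", hedonic), ("preference", preference), ("negative", negative)]:
    print(name, "prefers the intervention:", view(after) > view(before))
# All three print True; the theories diverge over hedonium, not over bed nets.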


Anonymous 08/13/2019 (Tue) 07:07:14 No. 5509
>>5503
Tyrone wouldn't stoop to fucking the sort of girl who goes poly.


Anonymous 08/13/2019 (Tue) 07:07:30 No. 5510
>>5504
>Obeying them or getting rid of them is always instrumental, never a terminal goal.
Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?

>>5506
>The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming, it's that it (presumably) makes the people involved happier.
So it's not good because it suppresses the (((obsolete))) primate desire to pair bond, sure, but it is good because it achieves the primate desire to experience happiness. Really makes me think. And what I'm thinking is that you haven't thought this through all the way. You never answered the other anon's question, "What are some terminal goals that are not primate heuristics?"

>>5509
Tyrone will copulate with whomever he damn well pleases.


Anonymous 08/13/2019 (Tue) 07:07:47 No. 5511
>>5510
>Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?
In the explicit justifications rationalists give, like the ones the OP seems to be criticizing, it's an instrumental goal. In the polyamory case it's not "this is primate programming so you should get rid of it" but "this is primate programming, so it's okay to get rid of it to achieve X".
>You never answered the other anon's question, "What are some terminal goals that are not primate heuristics?"
That's what the first paragraph was about. Explicitly stated terminal goals are not always primate heuristics. Direct causes are primate heuristics, but talking in terms of those is usually not useful.


Anonymous 08/13/2019 (Tue) 07:23:31 No. 5561
EA, or any other form of altruism in general, is a consequence of idiocy.


