Saturday, July 28, 2018

[Speculation] Inducing anti-paranoia

From http://slatestarcodex.com/2017/10/02/different-worlds/, this interesting observation:

Paranoia is a common symptom of various psychiatric disorders – most famously schizophrenia, but also paranoid personality disorder, delusional disorder, sometimes bipolar disorder. You can also get it from abusing certain drugs – marijuana, LSD, cocaine, and even prescription drugs like Adderall and Ritalin. The fun thing about paranoia is how gradual it is. Sure, if you abuse every single drug at once you'll think the CIA is after you with their mind-lasers. But if you just take a little more Adderall than you were supposed to, you'll be 1% paranoid. You'll have a very mild tendency to interpret ambiguous social signals just a little bit more negatively than usual. If a friend leaves without saying goodbye, and you would normally think "Oh, I guess she had a train to catch", instead you think "Hm, I wonder what she meant by that". There are a bunch of good stimulant abuse cases in the literature that present as "patient's boss said she was unusually standoffish and wanted her to get psychiatric evaluation", show up in the office as "well of course I'm standoffish, everyone in my office excludes me from everything and is rude in a thousand little ways throughout the day", and end up as "cut your Adderall dosage in half, please".

Hmmm. This raises a few questions in my mind:

(1) Would it be possible to deliberately induce paranoia in one's self through pure psychology, by looking extra-hard for ambiguities that can be interpreted as negative social cues?

(2) Would it be possible to do the opposite, and deliberately not notice possible negative cues if they are ambiguous?

(3) What would be the pros and cons of each approach, and what would be the likely long-term outcomes?

Prima facie it seems reasonable to believe that anti-paranoia (i.e. optimism) would lead to other people having more positive experiences with you, since people like being liked; in that sense it would probably be a self-fulfilling prophecy. It might also lead to you being more easily exploited by those who genuinely have bad intentions: con men, narcissists, and the like.

In the short term, interpreting ambiguity in a neutral or positive sense ("I guess she had a train to catch") could lead you to overlook some negative social signals and cause some awkwardness (which you might or might not notice). But if the signal is persistent (e.g. if you have terrible body odor and people avoid you), sooner or later someone will signal you in a way that you will notice. As long as you don't retroactively internalize the prior ambiguities--as long as you respond only to the signal that you actually get--there shouldn't be much harm done, to you or to them. And you'll avoid the psychic stress and relationship harm that comes from imputing negative social cues where none were intended.

I'm reminded of the saying that "happiness is a dominant strategy," and a mild degree of anti-paranoia seems likely to be a healthy strategy too, for yourself and for other people. I wonder if that's why Down syndrome kids have a reputation for making their families happy? Anecdotally, they seem largely immune to negative social cues, and they don't seem to give off negative social cues either.

I wonder if I can induce temporary, mild anti-paranoia in myself as an experiment. Or, if I already have it, if I can induce some more.

-Max

P.S. Would it be beneficial to reify an anti-paranoid attitude as a social contract? "If you have something negative to say to me you'd better say it loud and clear or I won't notice." It's worth considering the likely consequences.

--
If I esteem mankind to be in error, shall I bear them down? No. I will lift them up, and in their own way too, if I cannot persuade them my way is better; and I will not seek to compel any man to believe as I do, only by the force of reasoning, for truth will cut its own way.

"Thou shalt love thy wife with all thy heart, and shalt cleave unto her and none else."

Re: Inferential distance

The conclusion seems to hint at "talk more carefully," but what's actually necessary is to "listen more carefully before you talk." If a given person's attention is a nonrenewable resource, you need to make sure you understand where they're coming from before you start trying to engage with them.

If argumentum ad populum is fundamental to my mode of understanding the universe, wouldn't it be nice if you knew that about me BEFORE launching into a pitch based on carefully defined terms and logical implications? Either you know an argument which is going to persuade me ("recently, more scientists are concluding XYZ") or you don't, but if you do, you don't want to inoculate me by feeding me an unpersuasive argument first ("you already believe ABC; ABC -> XYZ; therefore XYZ"). And if you don't know an argument which is going to persuade me, wouldn't you at least like to know beforehand that you're wasting your time?

Listen first, then talk. (At least if you are interested in persuasion, as opposed to e.g. killing time.)

On Sat, Jul 28, 2018 at 11:50 AM, Maximilian Wilson <wilson.max@gmail.com> wrote:
[This is one reason I find conversations with certain people frustrating. E.g. on D&D message boards. Things that seem obvious to me are not always obvious to them, and they demand "proof" of trivialities, like the relative worthlessness of Improved Critical. Anyway, "inferential distance" seems like a useful concept. -Max]
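[For what it's worth, the Improved Critical claim is easy to check with a back-of-the-envelope calculation. Under standard d20 assumptions (the feat doubles a weapon's critical threat range, and a threat must be confirmed by a second attack roll), doubling a longsword's 19-20 threat range buys you only a single-digit percentage increase in expected damage. The numbers below are illustrative, not from any particular rulebook table:]

```python
def expected_damage(hit_chance, avg_damage, threat_range, crit_mult=2):
    """Expected damage per attack for a d20-style weapon.

    threat_range: number of d20 faces that threaten a critical
    (e.g. 2 for a 19-20 weapon). A threat only deals bonus damage
    if it is confirmed by a second roll that also hits.
    """
    # A natural roll can't threaten unless it would also hit.
    threat_chance = min(threat_range / 20, hit_chance)
    crit_bonus = threat_chance * hit_chance * (crit_mult - 1) * avg_damage
    return hit_chance * avg_damage + crit_bonus

# Longsword (19-20/x2), 60% chance to hit, 10 average damage:
base = expected_damage(0.60, 10, threat_range=2)  # 6.6 expected damage
imp = expected_damage(0.60, 10, threat_range=4)   # 7.2 with Improved Critical
print(f"{imp / base - 1:.1%} more damage")        # → 9.1% more damage
```

[About a 9% damage boost for an entire feat slot, which is roughly the point being made above.]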

https://medium.com/@ThingMaker/idea-inoculation-inferential-distance-848836a07a5b

Excerpt:

Inferential distance is the gap between [your hypotheses and world model], and [my hypotheses and world model]. It's just how far out we have to reach to one another in order to understand one another.

If you and I grew up in the same town, went to the same schools, have the same color of skin, have parents in the same economic bracket who attended the same social functions, and both ended up reading Less Wrong together, the odds are that the inferential distance between us for any given set of thoughts is pretty small. If I want to communicate some new insight to you, I don't have to reach out very far. I understand which parts will be leaps of faith for you, and which prerequisites you have versus which you don't — I can lean on a shared vocabulary and shared experiences and a shared understanding of how the world works. In short, I'm unlikely to be surprised by which parts of the explanation are easy and which parts you're going to struggle with.

If, on the other hand, I'm teleported back in time to the deck of the Santa Maria with the imperative to change Christopher Columbus's mind about a few things or all of humanity dies in a bleak and hopeless future, there's a lot less of that common context. Even assuming magical translation, Christopher Columbus and I are simply not going to understand each other. Things that are obviously true to one of us will seem confusing and false and badly in need of justification, and conclusions that seem to obviously follow for one of us will be gigantic leaps of faith for the other.

It's right in the name — inferential distance. It's not about the "what" so much as it is about the "how" — how you infer new conclusions from a given set of information. When there's a large inferential distance between you and someone else, you don't just disagree on the object level, you also often disagree about what counts as evidence, what counts as logic, and what counts as self-evident truth.



--
Be pretty if you are,
Be witty if you can,
But be cheerful if it kills you.

If you're so evil, eat this kitten!
