Thinking
- Aim to get an accurate picture of reality, even when that's unpleasant.
- Be self-aware about what you know and what you don't know. Aim to stay close to the humility sweet spot.
- See things as they are, not as you wish they were (Scout Mindset).
- For each subject you think you know, ask the following questions:
- How could I be wrong about this? In general, be less sure about what you know than intuition implies.
- What evidence would convince me I'm wrong?
- We use the same term - “no evidence” - to mean:
- This thing is super plausible, and honestly very likely true, but we haven't checked yet, so we can't be sure.
- We have hard-and-fast evidence that this is false, stop repeating this easily debunked lie.
- Be specific. Ask yourself the question, "What's an example of that?" Or more bluntly, "Can I be more specific?"
- Run your brain in debug mode so you understand why you're thinking that way. The brain hasn't changed much in the last few thousand years and was built for a different world.
- Believing you're rational makes it easier to fool yourself by mistaking your intuitions for rational decisions.
- Stress test your ideas/assumptions/beliefs with experiments and facts as many times as possible.
- Anything you know or do could be wrong. You get less dumb by saying things and getting feedback. We all have crony beliefs. From time to time, do a self-audit and figure out which ideas you've come to hold sacred and remind yourself that they're just ideas.
- Many beliefs are held because there is a social and tribal benefit to holding them, not necessarily because they're true.
- A great way to do that is to bet on everything where you can or will find out the answer. Even if you're only testing yourself against one other person, it's a way of calibrating yourself to avoid both overconfidence and under-confidence, which will serve you in good stead emotionally when you try to do inadequacy reasoning. It'll also force you to make falsifiable predictions.
- A tool to assign a percentage to a belief is the equivalent bet test.
- Instead of thinking "I'm sure X is fake!", try to think in terms of probabilities. E.g., "I think there's a 90% chance X is fake." Instead of thinking in terms of changing your mind, think in terms of updating your probabilities (see the sketch below). This mindset makes it easier to remember that it's not a question of winning or losing, but of being as accurate as possible. A "probability update" is less emotionally devastating than "I said X, but actually ~X, so I was wrong".
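  A minimal sketch of what "updating a probability" means in practice, using Bayes' rule; the prior, likelihoods, and the 90%-fake example are illustrative numbers, not anything canonical:

  ```python
  def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
      """Return P(hypothesis | evidence) given a prior and two likelihoods."""
      numerator = prior * p_evidence_if_true
      denominator = numerator + (1 - prior) * p_evidence_if_false
      return numerator / denominator

  # "I think there's a 90% chance X is fake."
  p_fake = 0.90

  # New evidence arrives that is 3x more likely if X is real than if X is fake.
  p_fake = bayes_update(p_fake, p_evidence_if_true=0.2, p_evidence_if_false=0.6)
  print(f"Updated P(fake) = {p_fake:.2f}")  # ~0.75: an update, not a defeat
  ```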
- You can try things to find out which ideas are right or wrong. It requires asking "What else would be true if this thing were true?" or "What would be different depending on whether X versus Y were true?".
- Knowledge decays. Things you learned in the past might not be true today (Pluto's status as a planet, feathered dinosaurs, the world's population count, …). Facts decay over time until they are no longer facts, or no longer complete (a toy decay model below).
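  A toy way to picture this, borrowing the "half-life of facts" framing (the 45-year half-life below is purely illustrative):

  $$F(t) = F_0 \cdot 2^{-t/t_{1/2}}$$

  With $t_{1/2} = 45$ years, roughly half of a field's current "facts" would be overturned or refined within 45 years, and about three quarters within 90.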
- Don't fully trust Science (or History), as it is not perfect. Studies can rest on incorrect assumptions (inherited from other studies), suffer from experimental issues, or be manipulated by external actors (e.g., tobacco companies paying for favorable studies).
- Avoiding stupidity is easier than seeking brilliance. Think backward so that you can avoid failures.
- Research before judging! We do not know what we don't know. Gather as much context as you can before making any final statement.
- Even "absolute" truths look different from different points of view, and everyone is doing the best they can; these differences are opportunities for you to help and to learn more about the world.
- Think in distributions instead of magic answers. The world is analog, not digital; continuous, not discrete. Nuance is everywhere (see the sketch below).
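  A minimal sketch of distribution-thinking, in the spirit of Guesstimate: instead of a single "magic answer" for a project estimate, propagate ranges; every number here is made up for illustration:

  ```python
  import random

  N = 100_000
  totals = []
  for _ in range(N):
      # Each input is a range (a distribution), not a point estimate.
      design = random.uniform(2, 6)    # weeks
      build = random.uniform(4, 12)    # weeks
      review = random.uniform(1, 3)    # weeks
      totals.append(design + build + review)

  totals.sort()
  print(f"median   ~ {totals[N // 2]:.1f} weeks")
  print(f"90th pct ~ {totals[int(N * 0.9)]:.1f} weeks")  # plan for this, not the median
  ```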
- Real people are complex and flawed, full of faults and biases. Each turn of events is mired in potential positives and potential negatives, which is a mess to sort out.
- Fundamental Attribution Error: we attribute people's behavior to their personality, not their situation.
- Digitizing an analog view will result in some loss of information. In that world, everything is good or bad, everyone is smart or ignorant, ones and zeros. Mistrust simple comparisons.
- You need a view of both the micro and the macro, the forest and the trees — and how both perspectives slot together.
- Local Validity: some argument steps are allowed and some aren't (e.g., the Non-Central Fallacy), independently of whether they arrive at an answer you agree with.
- People can fool you by saying they saw things that they didn't see, telling you some things they know but not others or by using flawed steps when drawing conclusions. When you try to make an argument come out with a particular answer, you can fool yourself in the same way.
- Assume good faith. Trust the other person to believe things that make sense to them, things you might have ended up believing had you been exposed to the same stimuli, and assume they are genuinely trying to find the truth.
- When you see something odd or something that doesn't fit with what you'd ordinarily expect, notice it and promote it to conscious attention.
- Notice when your mind is flinching away from a thought and flag that area as requiring more deliberate exploration.
- Notice your internal state (cognitive and emotional).
- Notice when you are in a failure mode, and step out. For example:
- Motivated Reasoning or Soldier Mindset:
- You are fighting to make sure an argument wins.
- You are fighting to make another argument lose.
- You are incentivized to believe something, or not to notice something, because of social or financial rewards or because it'd be physically inconvenient/annoying.
- You are offended/angered/defensive/agitated.
- You are afraid you'll lose something important if you lose a belief.
- You are arguing about definitions of words instead of ideas.
- You are confused or surprised. Treat this as a red flag that something about your models is wrong.
- Notice if someone else seems to be in one of the above failure modes.
- Tactfully disagree in a way that arouses curiosity rather than defensiveness.
- Leave your colleague a line of retreat.
- Be prejudiced in favor of tolerating dissent.
- Socially reward people who change their mind.
- The real costs aren't always what is shown. Costs and values are often made of multiple parts. Beware of repeated costs: they add up (a quick worked example below).
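  A quick worked example of how repeated costs accumulate; the subscription price and horizon are made-up numbers:

  ```python
  monthly_cost = 15  # dollars per month, illustrative subscription
  years = 5
  total = monthly_cost * 12 * years
  print(f"${total} over {years} years")  # $900: a small recurring cost becomes a large one
  ```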
- Take into account second and third order effects.
- Do your philosophical thinking in advance (cached thoughts), so you can concentrate on explaining well. Above all, practice staying within the one-inferential-step bound.
- Think for yourself about "wise" or important or emotionally fraught topics rather than letting your brain complete the pattern. If you don't stop at the first answer, and cast out replies that seem vaguely unsatisfactory, in time your thoughts will form a coherent whole, flowing from the single source of yourself, rather than being fragmentary repetitions of other people's conclusions.
- Sometimes inferential distances can be very large. You need a willingness to entertain and explore ideas before deciding that they are wrong. The other person might be in a self-consistent equilibrium (e.g., a creationist worldview where each belief supports the others), where changing a single view wouldn't work; you'd have to address the whole set of views. A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. The same applies when working with a group, or even with yourself: change your mind a little at a time.
- You can't reason someone out of a notion that they didn't reason themselves into.
- There's a distinction between tacit knowledge and explicit knowledge:
- Tacit knowledge is like the knowledge that you use to ride a bicycle—it's complex, experiential, intuitive, hard to put into words. There is knowledge experts have, but cannot explain or write down.
- Explicit knowledge is clear and concrete and transferable and (at least somewhat) objectively verifiable. How you ride a bicycle is tacit, but the fact that you can ride a bicycle is explicit. It's a binary fact that can be completely and compactly transferred through words, and that is verifiable through experiment.
- An event or fact is common knowledge among a group of people if everyone knows it, everyone knows that everyone knows it, everyone knows that everyone knows that everyone knows it, and so on (formalized in the sketch below).
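  One compact way to write this down, as a sketch in standard epistemic-logic notation (the notation is an assumption of this note, not part of the original list), where $E\,p$ means "everyone in the group knows $p$":

  $$C\,p \;=\; E\,p \wedge E\,E\,p \wedge E\,E\,E\,p \wedge \dots \;=\; \bigwedge_{n \ge 1} E^{n} p$$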
- You do not think about things the same way as everyone else.
- You may approach something analytically while others approach it intuitively — and both styles can yield the same end results!
- Humans think in very different styles, related to how they use their senses while thinking. For example, some people see images during a conversation for each concept, others "feel" concepts in their body, others have explicit models that they update, and many have some combination. Also, some people can't imagine in images, and others can't store faces. It's very strange that we enter adult life without a shared understanding of this.
- Don't model the minds inside other people's brains as exactly the same as your own mind. Humans lack insight into their own minds and what is common among everyone or unusually specific to a few.
- We're all biased to our own personal history. Your personal experiences make up maybe 0.00000001% of what's happened in the world but maybe 80% of how you think the world works.
- When thinking about any question, imagine yourself considering a similar question, under circumstances that would bias you the opposite direction. If you stick with your opinion, it's probably honest; if you'd change your opinion in the counterfactual, you probably had it because of bias.
- Counterfactual tests to improve rationality:
- Status Quo Test: If you're defending the status quo, imagine that the opposite was the status quo. Would you be tempted to switch to what you have now?
- Conformity Test: Imagine that some common, universally agreed-upon idea or practice was rare and unusual; would you still want to follow it? If not, you might be motivated by conformity bias.
- The Selective Skeptic Test: How credible would you consider the same evidence if it supported the other side?
- Predictive processing lends confidence to the admission that bias is possible, and hope that there's something other than bias we can latch onto as a guide. It provides a convincing framework for figuring out what's going on at all levels of cognition.
- An estimate is better than a guess. A measurement is better than an estimate.
- All points of view have complex context, much of which is predetermined by chance of birth, biology, and environment. There's no such thing as "I only believe (x) because of (y)." Our brains like simple, binary thinking, but real life constantly challenges that impulse.
- Experiments usually contain mistakes. As the experimental methods around a topic improve, the measured effect often shrinks, suggesting it was never there to begin with. To find truth, improve the way you measure it!
- Cognitive ease makes us more likely to believe things that are familiar to us. Cognitive strain helps us avoid the pitfall of jumping to the intuitive but wrong answer. Both ways are useful in different situations, the key is to identify where to flow or fight against the cognitive ease.
- Saying "that's a good point" doesn't lose the argument; it wins trust. Acknowledging a valid observation is a display of respect. It signals that you're listening with an open mind, and it motivates them to follow suit. You don't have to agree on everything to agree on something.
- Every time you say "that's a good point", it gets easier for you to acknowledge good points in the future. Same happens when you say "I was wrong".
Resources
- LessWrong - A community dedicated to improving reasoning and decision-making.
- Rationality Checklist - A checklist of rationality habits for personal use, so you can keep a wish-list of habits and see which ones you're acquiring.
- Kialo - Tool to explore debates.
- Arguman - An argument analysis platform.
- Guesstimate - A spreadsheet for things that aren't certain.
- Metaculus - A community dedicated to generating accurate predictions about future real-world events by aggregating the collective wisdom, insight, and intelligence of its participants.
- Rationality skill tree.
- Center For Applied Rationality Handbook