Putting Mental Models to Practice
Created
Jan 4, 2021 07:46 AM
Media Type
Articles
Lesson Type
Learning
Project
Commoncog Almanack
Property
This summary was largely done for my own note-taking and may contain errors :) Sharing it in case it adds value to others. All images were created by Cedric Chin at Commoncog.

Context

Source URL:
 
Why is it important: This can help me put the mental models I learn from reading into practice

Keywords

Mental Models, Learning

Backdrop – Why Cedric Initially Thought That Mental Models Sucked

Mental models used by practitioners can be useful, but only if you have the necessary mental architecture to process them
Don't read anything by non-practitioners. Instead, read from the source material of practitioners in fields you inhabit, copy their actions, climb their skill trees, and reflect through trial and error.
 
If mental models are the key to success, why is it that reading and understanding these mental models aren’t sufficient? Where does experience fit into this? Why do we assume that a practitioner’s lifetime of experience gives them an edge, even after they have codified all their mental models?
The answer, of course, is that practice matters. Knowing something in theory and knowing something in practice aren't the same thing. ... mental models are divided into two types: mental models that are explicit knowledge and mental models that are tacit knowledge.
Explicit models lend themselves to codification, and consist of lenses for seeing the world — know-what frameworks (facts) and know-why frameworks (science). But there also exist mental models that are nearly impossible to communicate. We call this tacit knowledge. These mental models are the ones that I believe best explain a practitioner’s success.

Framework for putting mental models to practice

Main framework:
  1. Use intelligent trial and error in service of solving problems. This means two sub-approaches: first, using the field of instrumental rationality to get more efficient at trial and error. Second, using a meta-skill I call ‘skill extraction’ to extract approaches from practitioners in your field.
  2. Concurrently use the two techniques known for building expertise (deliberate practice and perceptual exposure) to build skills in order to get at more difficult problems.
  3. Periodically attempt to generalise from what you have learnt during the above steps into explicit mental models.
 

More details

  • "Adopting Munger’s prescription to read widely and pick elementary findings from the science didn’t cause me to get substantially better in pursuit of my chosen goals."
  • Mental models may work well for beginner or intermediate practitioners, but not for those who already have the explicit knowledge and now need tacit knowledge
  • They may also work better in fields like investing or finance, where your task is to have lenses for evaluating opportunities rather than to invent or operate, and where tight feedback loops let you learn quickly
  • To get benefit from mental models, keep these two principles in mind:
      1. Let reality be the teacher. Reality is a higher resolution teacher than words on a page, or instructions from a practitioner
      2. When it comes to practice, one should pay attention to actual practitioners. This is because their approaches have been tested by reality. A second order implication of this is that if you tell me something you have actually done — I will pay close attention!
 

Rationality and biases

On rationality/biases: this post (part 2) essentially lays down the epistemological framework for other posts
"Cognitive biases cause people to make choices that are most obviously irrational, but not most importantly irrational... Since cognitive biases are the primary focus of research into rationality, rationality tests mostly measure how good you are at avoiding them... LessWrong readers tend to be fairly good at avoiding cognitive biases... But there are a whole series of much more important irrationalities that LWers suffer from. (Let's call them "practical biases" as opposed to "cognitive biases," even though both are ultimately practical and cognitive.)"

Better trial and error

As long as you have trial and error iterations in which you don't risk ruin (i.e., you can continue to play iterated games) AND you learn from your mistakes and get better each time, you can eventually reach a phenomenal outcome
 
Things to avoid with trial and error (sounds a lot like stochastic gradient descent):
  1. Don't blow up (avoid risk of ruin)
  2. Don't select trials randomly or suboptimally. Instead, search for relevant approaches and fully reflect on your failures to figure out what to vary in future trials
  3. Don't irrationally repeat the same trial over and over again, expecting different results
  4. Don't think that solvable problems are unsolvable and stop prematurely
  5. Be efficient. Once you've figured out what works and generalised your approach, you can apply it to other similar scenarios and reach an optimal solution quicker (instead of starting a trial from scratch again)
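The five points above read a lot like a guarded optimisation loop. As a toy sketch of my own (not from the article; every name and threshold here is made up), intelligent trial and error might look like:

```python
import random

def intelligent_trial_and_error(try_approach, approaches, budget=100, ruin_threshold=-10):
    """Toy sketch of the trial-and-error principles above.

    try_approach(a) -> score; higher is better, negative means a loss.
    """
    capital = 0
    tried = set()           # don't irrationally repeat identical trials
    best = None
    for _ in range(budget):
        candidates = [a for a in approaches if a not in tried]
        if not candidates:  # search space exhausted, not "unsolvable"
            break
        approach = random.choice(candidates)
        tried.add(approach)
        score = try_approach(approach)
        capital += score
        if capital <= ruin_threshold:   # don't blow up: stop before ruin
            break
        if best is None or score > best[1]:
            best = (approach, score)    # reflect on results, keep what worked
    return best
```

The real difficulty, as the text notes, isn't the loop itself but managing your own psychology while running it.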
 
This is easy to think about. But hard to implement. Managing your own psychology and exogenous factors is really, really hard
 
Alternatively, there's also Baron's search-inference framework: frame the problem properly, then search for possibilities, evaluation criteria, and evidence, and make inferences from them. Its three components (illustrated with the example of choosing a university course):
  1. Possibilities are possible answers to the original question. In this case they are the course options you may take.
  2. Evaluation criteria (or ‘goals’, as Baron originally calls them) are the criteria by which you evaluate the possibilities. You have three goals in the above example: you want an interesting course, you want to learn something about modern history, and you want to keep your work load manageable.
  3. Evidence consists of any belief or potential belief that helps you determine the extent to which a possibility achieves some goal. In this example, the evidence consists of your friend’s report that the course was interesting and the work load was heavy. At the end of the example, you resolved to find your friend Sam for more evidence about the work load on the second course.
 
You want to search through the solutions to a problem. But “search has negative utility”: the more time you spend analysing a given decision, the more negative utility you incur because of diminishing returns. Since search is costly, people often terminate it and reach a conclusion far earlier than they should
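The search-inference components can be made concrete with a small scoring sketch. This is my own toy illustration of Baron's idea, not his formalism; the course names, goal weights, and evidence numbers are invented:

```python
# Toy search-inference sketch: score each possibility against weighted goals,
# with evidence expressed as goal-satisfaction estimates in [0, 1].
goals = {"interesting": 0.5, "modern_history": 0.3, "manageable_load": 0.2}

# Evidence: e.g. a friend's report that Course A was interesting but heavy.
possibilities = {
    "Course A": {"interesting": 0.9, "modern_history": 0.8, "manageable_load": 0.2},
    "Course B": {"interesting": 0.6, "modern_history": 0.9, "manageable_load": 0.7},
}

def evaluate(evidence):
    # Weighted sum of how well the evidence says each goal is met.
    return sum(goals[g] * evidence[g] for g in goals)

best = max(possibilities, key=lambda p: evaluate(possibilities[p]))
```

Gathering more evidence (asking Sam about the work load) would update the numbers, which is exactly why search has a cost: each update is a little more time spent before deciding.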
 

Expert Decision Making

A lot of experts don't use the Search-Inference Framework. Instead, they use Recognition Primed Decision Making — mostly as a result of tacit expertise
Because expert intuition is often portrayed as ‘magical’, we ignore it and turn to more rational, deliberative modes of decision making. We do not believe that intuition can be trained, or replicated. We think that rational choice analysis is the answer to everything, and that amassing a large collection of mental models in service of the search-inference framework is the ‘best’ way to make decisions.
 
BUT, it's the trial and error model that leads to recognition. Once you have enough tacit knowledge acquired through trial and error, these kinds of decision frameworks automatically pop up in your head
Experts do fall back on a search-inference layer when the situation is unfamiliar. But even that search is informed by tacit knowledge, because they can quickly identify and eliminate alternatives that have no way of working!

Skill Extraction

Gary Klein has a 4-fold strategy for developing expertise-driven decision making:
  1. First, identify discrete decision points in one’s field of work. Each of these decision points represents a discrete area of improvement you will now train deliberately for.
  2. Second, whenever possible, look for ways to do trial and error in the course of doing. Look for quick actions that you may use to test aspects of your domain-specific mental models. This is, of course, not always possible. Which leads us to —
  3. Third, run simulations where you cannot learn from doing.
  4. Fourth, because opportunities for experiences are relatively rare, you should maximise the amount of learning you can get out of each.
 
Skill extraction is mostly about point 1 above: identifying discrete decision points quickly. Klein has summarized this in the Critical Decision Method, which goes as follows: (all of these involve asking an expert decision maker about a situation)
  1. The first step is to find a good story to probe. The story might be non-routine, where a novice might have faltered. Or it could be about a routine event that involves making perceptual judgments and decisions that would prove challenging to a less experienced practitioner.
  2. After such a story is located, ask for a front-to-back narrative of the event in question. As this occurs, keep track of the state of knowledge and how it changes as events occur.
  3. The third pass is to probe for their thought processes. At this point Klein’s team will usually ask what a person noticed (cues) when changing an assessment of the situation and what alternate goals might have existed at that point. If your practitioner chose a course of action, ask them what other actions were possible, whether they considered any of them, and if so, what the factors were that favoured that option.
  4. If time permits, Klein will then do a fourth and final pass. This time, at each decision point, he would ask for mistakes that a novice could make. For example: “If I were the one making the decision, if by some fluke of events I got pressed into service during this emergency, would I see this the same way you did? What mistakes could I make? Why would I make them?”
 
Of course, getting a verbal representation of this knowledge isn’t enough. In order to use it, you have to put it to practice. The goal of such knowledge is to guide the construction of tacit mental models of your own.
 

Advice Seeking and Decision Improvement Protocol

Dalio’s application of this idea is the following advice-seeking protocol:
  1. If you’re talking to a more believable person, suppress your instinct to debate and instead ask questions to understand their approach. This is far more effective in getting to the truth than wasting time debating.
  2. You’re only allowed to debate someone who has roughly equal believability compared to you.
  3. If you’re dealing with someone with lower believability, spend the minimum amount of time to see if they have objections that you’d not considered before. Otherwise, don’t spend that much time on them.
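Dalio's protocol is essentially a three-way branch on relative believability. A minimal sketch, assuming believability can be scored on a 0-to-1 scale (the function name, scale, and tolerance are my own, not Dalio's):

```python
def advice_seeking_mode(their_believability, my_believability, tolerance=0.1):
    """Return how to engage, per the three-rule protocol summarised above."""
    diff = their_believability - my_believability
    if diff > tolerance:
        # They are meaningfully more believable: ask, don't debate.
        return "ask questions to understand their reasoning"
    if diff < -tolerance:
        # They are meaningfully less believable: a quick check for novel objections.
        return "briefly check for objections you haven't considered"
    # Roughly equal believability: debate is allowed.
    return "debate as rough equals"
```

The numeric scale is only a device to make the branching explicit; Dalio judges believability from track record, not a score.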
 
Periodically review decisions that you’ve made with the following list of questions:
  1. What was the timeline? Write down the key judgments and decisions that were made as the incident unfolded.
  2. Circle the tough decisions in this project or episode. For each one, ask the following questions:
      1. Why was this difficult?
      2. How were you interpreting the situation? In hindsight, what are the cues and patterns you should have been picking up?
      3. Why did you pick the course of action you adopted?
      4. In hindsight, should you have considered or selected a different course of action?
 