Have you ever noticed how differently we approach buying a car versus choosing what to watch on Netflix? One might involve spreadsheets, research, and asking friends for advice. The other? We might just click on whatever catches our eye. I argue a lot of our thinking is more like the second than the first. This argument was originally made to me by Katja Grace, but if you don’t like it it’s probably my fault.
When people take something seriously – like their weight, their children's education, or a major purchase – they may become amateur researchers. They may dig through studies, compare options, and carefully weigh advice from trusted sources. But when it comes to leisure activities? Personally, I'm much more likely to go with the flow. Pick something up and see how it feels. I could do a cost-benefit analysis of an hour of television, but instead I'll start watching and see how it feels.
When you do your most important thinking do you take it seriously, or not?
Tl;dr:
This post sets out a theory, not a fait accompli
Most of our thinking, even on important topics, resembles casual Netflix browsing more than careful car-buying research.
We often default to quick, inattentive thinking rather than careful deliberation.
This pattern is broad – from Twitter discussions to Democratic Party decision-making
I look at the evidence for and against
I give a number of theories for this, e.g. Kahneman's idea that deliberate thinking is taxing, or Caplan's theory that we invest mental energy only when we actually think we will benefit
I give a number of suggestions, e.g. doing less thinking but doing it more deliberately, or using LLMs for feedback
Theory:
Much thinking is inattentive and error-prone, without updating, even on very important topics.
Longform articles, reports, government decision-making, Twitter interactions. My theory is that in all these places, most thinking is more like "getting it done" than careful, sober-minded decision-making.
Evidence:
What evidence is there of this?
Consider what Twitter (X) is like. People are constantly making quick, bad arguments. I was surprised how few people seemed to know that Trump had set up fake slates of electors (even among Kamala supporters). There is a lack of seriousness to a lot of discussion there, including my own. What % of people's time is spent in discussions where they might genuinely change their mind? 5%? 1%?
Next, consider Philip Tetlock's work on forecasting. His team of superforecasters managed to beat a team of intelligence experts who had secret information. Part of that was the skill of his forecasters, but he also attributes their success to their attitude of constantly wanting to get better (what he calls "perpetual beta", I think). In some sense, at least, the forecasters were striving harder for the right answer than the well-paid intelligence experts with secret info.
Even some quite serious spaces seem to be missing serious thinking on lots of topics. I am surprised by the lack of deep tech or parenting discussion on LessWrong. And many topics on the EA forum seem under-discussed - individual profiles of other philanthropies, personal estimates of charities, animal welfare estimates, etc.
What about evidence against the theory?
Some people seem capable of lots of careful thinking, even in throwaway moments. On Twitter, it's people like Stefan Schubert, Joshua Blake or Katja Grace, but I have met tens of people like this.
This paper suggests to me that there is far more focus on preferences than I might imagine. It asks people to score, on a 7-point scale, questions like "It is important that the [art/charity/investments] I choose reflects my personal tastes or values." Across 100 university respondents, the answers are less objective than I would expect, particularly for charity, but also for investments and medical treatments. This suggests that in other decisions where thinking seems poor, people may instead be attempting to satisfy alternative preferences that I cannot see.
Having thought about the above more, I think “accuracy isn’t a top priority” is a better theory than the one expressed here, but if I don’t publish this now it will probably be months.
But mostly I am thinking about my own experience.
On Twitter, I fire off quick replies to tweets where I know the "strong" counterpoint. I have passing conversations where I dismiss potentially valid evidence because I can't process it properly. I take positions without really considering the full picture. In a footnote, I describe these behaviours with more introspection.[1]
There are rare times when I actually think carefully. Sometimes my friend Charles,[2] a well-calibrated forecaster, will challenge my view: I know he's often right when we disagree, so I'll examine which of my considerations might be wrong.[3] Or if I'm with Katja,[4] who is practically unable to take sides, I feel very self-conscious when I'm just trying to hold a position.
Having noticed this in myself, I see this kind of behaviour in others. On Twitter I don't expect most people to change their minds when engaging. I do not expect many experts and communities to have well-calibrated answers to the questions they claim as a top priority.
In short, the world makes more sense if many people are engaging in these behaviours.
If the above is true, what is going on?
First some mechanistic theories:
Daniel Kahneman's "Thinking, Fast and Slow" suggests deliberate thinking is tiring and our brains avoid doing it where possible. It's much easier to do heuristic-based thinking (which he calls "system 1").
Bryan Caplan offers another perspective in "The Myth of the Rational Voter." He suggests we approach political thinking through a lens of personal consequence and personal convenience. When we truly believe that our vote matters, we think carefully about it. But for much thinking – from economic policy to company norms – most people correctly believe their vote won't change anything. So despite the topics' importance, we don't invest the mental energy to think them through properly.
A related signalling approach might be put forward by Robin Hanson. The imaginary Hanson in my brain says something like “People are doing serious thinking, but on topics of what matters for them, and for most people that’s status, not the topic at hand”. Under this view, people do care deeply, but they care about staying in their tribe, pleasing their boss, not rocking the boat. And on aggregate this leads to a lot of bad thinking.
Next, I find the notion of "aliveness" useful.[5] I somehow find it much more alive/tasty/desirable to have a quick spar with someone than to slowly figure stuff out. To compare the strength of our quips. I am a creature searching for sex, calories and rock and roll, and this is no different. Somehow I find simple thinking much more attractive than being careful.
Finally there is some mix of the principal-agent problem and how we aggregate preferences. Most of us are trying to appease many people who aren’t closely connected to the work we do. It is both hard to know what they would want and hard to actually ensure that our incentives are towards doing that thing. Politicians might want to do a good job, but it’s hard to know what that looks like and their incentives are often towards not upsetting their voters.
And next some theories on why I personally don’t try harder:
Most prominently, I think I identify as someone who "does thinking". And that accounts for a lot of bad behaviour. I'm a thoughtful person: how could I be engaged in performative or low-quality thinking? If there is some reason that I might not be thinking carefully, it butts up against this identity, and that alone gives it reason to be ignored.
Next there is busyness. I often have a long to-do list of things, but I enjoy a political back and forth. Much thinking gets relegated to a sort of "smart-casual" mode: I pretend it's work but take an attitude of distraction. But in turn this means that I don't really focus on the matter at hand. And if I might change my mind and don't want to… suddenly that's a good time to get back to work.
Third is status. I have a pretty large Twitter account at this point. I get lots of responses to my tweets. And so whenever I don't want to respond to one, I can easily say "Oh, I can't respond to everyone!"[6]
All three of these are stories I tell myself that justify treating thinking as leisure. Whenever I might change my mind, I can instead tell one of these stories and avoid it. I'm a thoughtful person! I'm too busy! I can't respond to everyone!
Times when I manage serious thinking
First, when my life or energy depends on it. If I really want a job or to impress someone, suddenly good thinking becomes easier. I am sure that if I had a child who was sick with an unknown disease, I would take it very seriously indeed.
Beyond that, I think a key insight here is that I have to make space for non-leisure thinking. If I want to think about something carefully, I might have to set time aside for it. I've heard Benjamin Todd books time in a hotel to write, locking himself in a room away from distractions. That's a far cry from Twitter.
I can also improve my incentives. When I am forecasting, that's sort of political too. But I know that I want a good score (or to make money on bets, etc.). Somehow I have really learned that I need to take time, write down considerations, and so on. Wobbly thinking doesn't cut it here. I guess the same is true of blog writing, which is why I'm trying to blog more and tweet less.
Assuming it’s true, how can we avoid inattentive thinking on important topics?
Ask an LLM. Ask "what are the basic steps for doing X?" and submit your draft for its feedback. LLMs are like a wise, careful friend who wants to spend a lot of time on your work. They will read a whole draft in seconds and find errors and inconsistencies, all for the cost of a $20-a-month subscription. I recommend it.[7]
Notice inattention. I am a pretty inattentive person. A woman I dated used to hate when I was on my phone when we hung out. But I justified it as part of my ADHD. Well, I was wrong. Even if it wasn't my fault,[8] I could do something to fix it. Similarly, I did not create this complex world, but I still have to interact with it as it is, and that requires focus and care. I have started to notice when I want to be inattentive. Often I am anxious and want something to smear my consciousness across so as to avoid noticing the pain. I find noticing valuable.
Do less, better thinking. For me, a big issue is that my thinking is smeared across so many topics. For minutes or even seconds each. And then I get tired and snippy. I’ve been trying to write more longform, but that necessarily means I cover fewer issues. But that’s okay, most of what Twitter cares about from week to week doesn’t matter.
Create space for the most important thinking. If I really care about thinking something through, I might take time off and create a calm, low-stress environment. What is the most important thing in your life? Have you taken a day off work to consider how to better orient to it? If not, why not?
Write something for friends. I might write a blog (like this one!) and share it with friends to try and get their view on it. I want to respect my friends’ time so I’ll try to make sure that the document is carefully written.
Notice when I am not doing these things. These days I try to use these as triggers. If I can't be bothered to get some time alone or to write a discussion up, do I actually want to think about it? Or do I just want an argument?
Where do I see this in practice?
I criticise these organisations as my friends.[9]
LessWrong and rationalists more generally seem to lack focus on building a track record. It has been clear to them for decades that with forecasting and prediction markets one can show who tends to be good at thinking and who isn't. Why isn't there a publicly accessible track record for top rationalists? Is Eliezer Yudkowsky good at predicting things? He's made a few good trades on Manifold, but doesn't the rationalist diaspora deserve a better system than this? What about Hanson, or Zvi? I expect Habryka to argue that the LessWrong review is more accurate than a forecasting track record, but I disagree.
More broadly, rationalists seem to me not focused enough on geopolitics and investing. Are we good at making global plays around AI? I don't know, but I would be much more confident if I knew how good we were at predicting other important shifting global problems. Where is the rationalist view on Taiwan?[10] I think a fair criticism here is "Nathan, be the change you want to see", but still, why aren't we there already?[11]
Likewise, in EA I am surprised how hard it is to find numbers for many things, let alone medians of several respected people's estimates on said topics. Is there an easy way to see animal welfare ranges, cost-effectiveness numbers, comparable cause prioritisations, website views? Not that I've seen. It is to Open Philanthropy's credit that most of their funding is easily searchable and that websites like OpenBook can exist. But this seems to be the exception rather than the rule.
I found the EA posture towards SBF pretty unserious too. I figured that someone was watching the billionaires more closely than I was with the questions I was putting on Manifold. But it seemed that no one was coordinating. We saw how that turned out. More generally, while rationalism might focus too much on new mechanisms and hence achieve too little, sometimes it seems EA wants to change the world without changing itself.
I don’t read Progress Studies closely enough to know, but it feels a bit like they haven’t really thought about AI risk, perhaps because they don’t want that vibe or because they feel EA has it covered. This feels pretty unserious, notably because progress on benefits vs costs can be asymmetric (and we should want it to be).
I was heartened to see the Democrats dump Biden, who was probably going to lose. But they took far too long to do so and replaced him with Harris, who wasn’t much better. Somehow the leadership race for the most powerful country on earth became a self-congratulatory mass deception. I would have voted for Harris, but I do not feel bad that Democrats were punished for this.
In conclusion:
If I say that there is a kind of thinking I do when I watch television, you hopefully know what I mean. Unfocused, unserious, unimportant. The question is whether much of my thinking looks like that or like something more deliberate. And if the former, perhaps I should change.
Often when reading Twitter I sort of open many browser tabs in my brain. I think it would be better if I sat and read one thing, but I just scroll, smearing my focus over many things. I am already in a sort of semi-focused state.
And so to jerk me out of that into writing replies, which I find tiresome, something has to feel particularly good, like using an argument I think is clever, correcting someone with a strong reply, getting into a discussion about something I think is important.
But I rarely have the energy to actually follow up with new research or actual thinking. If I did I probably would have read something more thoroughly earlier.
Charles Dillon, if you’re curious.
Sorry.
I write more on my feelings on Katja here:
I don't know where this is from but it comes in my sort of meditation and spiritual circles. "What is alive to you?" is a surprisingly good prompt.
This is kind of snotty and I don't endorse it, but I think I probably do think it. If you recoil from thoughts like this, I recommend asking if you too have them (though maybe you are just a kinder person than me!)
Though they are not friends. They are black boxes. I would not come to rely on them for emotional regulation. Future models may have their own goals that are different to mine.
It was probably my fault.
This page is kind of an overview, but it was written by me https://www.lesswrong.com/tag/china
I have some weak theory that the problem here is that Rationalists like transparency more than they like accuracy. Often I would prefer the cryptic pronouncement of someone at the top of Manifold's leaderboard to a 10,000-word piece about some LessWronger's introspection.
Who cares how good Eliezer, Robin, or Zvi are at forecasting when you can just as well ask Manifold and Metaculus?