I’m not sure I’m convinced that the story is “power changes people” rather than “lots of people were never really wedded to any principles in the first place, and just go with whatever so-called ‘beliefs’ feel right to them in the moment”. Like, right-wingers tended to be in favor of free trade, but then they liked Trump, and Trump said free trade is bad, so now they're against it. Likewise, the left wing flipped on free trade in the opposite direction, along with the FBI and other things. Everybody flips on executive power and states' rights all the time depending on who’s in office. Most of these changes are not examples of people changing their minds *upon personally gaining power*. Instead they’re about people responding to their microenvironment, including adopting the beliefs of people they admire, motivated reasoning, and so on.
I haven't read The Power Broker, but I got a second-hand summary from my spouse. An interesting bit: supposedly, Robert Moses was a principled reformer before he had power, and super corrupt after. I was confused by this, and then it hit me! Before he had power, being a principled reformer was advantageous for increasing his power, so he motivated-reasoning'ed himself into believing in reform, or at least saying he did. After he had power, being corrupt was advantageous for increasing his power, so he motivated-reasoning'ed himself into the reverse.
My hunch is that Elon Musk is in that category too. Maybe Marc Andreessen as well, but I haven’t been paying much attention to him.
IMO SBF had a personality disorder, of a type that’s generally anticorrelated with being a person of principle - https://www.spencergreenberg.com/2023/11/who-is-sam-bankman-fried-sbf-really-and-how-could-he-have-done-what-he-did-three-theories-and-a-lot-of-evidence/
Your example of Dustin Moskovitz calling Tesla the next Enron is, I think, Bulverism? Maybe he's right, beats me.
(When I saw the title I thought you were going to talk about psychedelics, which are IMO quite a high-stakes irreversible rolling of the dice with respect to your long-term values.)
I think I’m a bit less cynical than this, but yes maybe these people cared about their principles less than they claimed to.
I am not sure if, "right wingers," switched to become in favor of tariffs, or if that's just a strategy that was used to win the recent election, and independent voters were convinced that this is a good strategy, among a hugely complex other number of factors.
That being said, the crux of what you are saying is true, in that, there is in-group bias, it's a known phenomenon. People may anchor their viewpoints and decisions toward an in-group, or a leader, or a high performing subset or individual, depending upon the context.
When I was reading Young's article, I took the SBF example to be merely negative well known counterpoint to support the overall thesis, "going down a particular radicalization pipeline is an investment strategy."
You absolutely can choose to hold unconventional beliefs, whether in the political domain or a practical know-how domain, and there is nothing essentially wrong with that, because those unconventional beliefs may in fact be correct, or often correct, despite in-group bias. For example, you might think that the best way to fix rust on the outside of a car is with some kind of liquid cement rather than welding. The vast majority of experts, engineers, and mechanics might find this offensive and a horrible practice, because welding is the only way to create a durable fix. However, one day you might find yourself on an island or in the middle of the jungle where there's no welding equipment, but there is a crate full of liquid cement, and you might become a hero by quickly fixing a bunch of vehicles the "wrong" way and allowing people to get to medical care without the vehicles breaking down, despite it clearly being the wrong practice in many other instances. However, the stronger you hold the conviction on [Liquid Cement Good] or [Liquid Cement Bad], in either direction, the worse the likely outcomes in the actual universe across different scenarios.
Then apply this to your hypothetical self with a lot of power and influence: the higher the conviction you have for an extreme idea, the more consequential it becomes. Presuming you are not an individual with enormous power, which describes the vast majority of us, it simply means a lot of people might find you to be a dick, and you start to self-isolate and become unhappy. Again, it doesn't mean that the unconventional idea is fundamentally incorrect; it just means it's risky, and it is a decision to take on that risk. The risk may be social, or, as the title suggests, moral, or both.
In my view an alternative title to this article could be "trying to build a unique perspective on the mechanics of the radicalization pipeline." Again, radical beliefs are not fundamentally harmful in and of themselves; it's OK to have radical beliefs. It's the degree to which the radical beliefs manifest into something else that is the problem.
Masterful and I don't know how you don't have more followers.