Discussion about this post

Lovkush Agarwal

Maybe you implicitly cover it in one of the above points, but your tool that helps identify cruxes or disagreements is something I like. I saw its use directly during the discussion/argument around whether EA should distance itself from Manifund events / rationality.

Tim Duffy

I really like that discourse mapping! I also liked the idea behind the AI welfare week slider, but I thought it had a pretty important flaw. Footnote 2 was something like "a priority means we should be spending >=5% of EA resources on it". Because that was in a footnote, I think some people accounted for that specific definition and some did not. It was also not clear to me whether that means ramping up to 5% immediately, being above 5% in the medium term, or something else.
