
I think you're far too high (by roughly 10x) on the probability of AI killing or enslaving us all, conditional on it being able to. There's a really high opportunity cost to doing so. As an intuition pump: for which of your goals does the optimal/most efficient/most likely-to-succeed path to achieving them (ignoring moral constraints) involve world domination? For a fuller version of this argument: https://rootsofprogress.org/power-seeking-ai

The rest seems reasonable to me on a first reading.
