There's a mistake I've made a couple of times without internalizing the lesson as fast as I'd like. Moreover, it wasn't even a failure to generalize; it was basically a failure to have a single update stick about a single situation.
The particular example was me saying, roughly:
Look, I'm 60%+ on "Alignment is quite hard, in a way that's unlikely to be solved without a 6+ year pause." I can imagine believing it was lower, but it feels crazy to me to think it's lower than something like 15%. And even at 15%, it's still horrendously irresponsible to handle AI takeoff by rushing forward and winging it, rather than "everybody stop, and actually give yourselves time to think."
The error mode here is something like: "I was imagining what I'd think if you slid this one belief slider from ~60%+ to 15%, without imagining all the other beliefs that would [...]
---
First published:
September 2nd, 2025
Source:
https://www.lesswrong.com/posts/vJrP7nbnJTwk4fcbk/simulating-the-rest-of-the-political-disagreement
---
Narrated by TYPE III AUDIO.