Eliezer Yudkowsky
There are two concepts from Eliezer Yudkowsky’s writings that I have found independently useful as structures for understanding thought. I want to lay them out here in basic form, as I hope to explore them more in future notes.
A lot of my thoughts on this topic are taken from Eliezer Yudkowsky’s writings. I’ve added my own thoughts and am trying to shape the ideas in a way that makes them easier to wrap my head around, but I would not claim much originality here.
The idea that Societal optimization power changes with time implies that it may not be time itself that changes things for the better, but rather Optimization power in versus optimized product out. Eliezer Yudkowsky uses this to argue that we should not assume humanity’s history of positive change will carry forward through an AI revolution, since we are not sure how such a revolution will affect Optimization power in versus optimized product out.
Eliezer Yudkowsky gives a couple of examples of flawed ways we value human life. ^[Rationality, From A to Z#^1239ba]
Eliezer Yudkowsky calls this “mutual information” and “symmetric probability flow”. ^[Rationality, From A to Z#^27ecf7]
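For reference, the standard information-theoretic definition of mutual information (this is textbook material, not a quote from the book) makes the symmetry explicit:

$$
I(X;Y) \;=\; \sum_{x,\,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;=\; I(Y;X)
$$

Because the expression is symmetric in $X$ and $Y$, evidence flows both ways: if learning $X$ tells you something about $Y$, then learning $Y$ tells you exactly as much (in bits) about $X$.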
Good point by Eliezer Yudkowsky about how what we feel to be reality is really our beliefs about reality. This is why Beliefs of beliefs just feel like beliefs.
In Rationality, From A to Z, Eliezer Yudkowsky often describes how rationality (more or less what I call Probabilistic Thinking) can be like a martial art. It is an interesting comparison, though I don’t know if I agree with all parts of it.