The idea that societal optimization power changes over time implies that it may not be the passage of time itself that changes things for the better, but rather the relationship between optimization power in and optimized product out. Eliezer Yudkowsky uses this to argue that we should not naively extrapolate humanity's history of positive change through an AI revolution, since we cannot be sure how such a revolution would alter that relationship between optimization power in and optimized product out.