Evidence is symmetric
By Bayes’ Theorem, if two phenomena are not independent, then each is evidence about the other, and their evidential influence on each other is symmetric.
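A one-line derivation makes the claim precise. Writing Bayes’ Theorem in both directions shows that the factor by which B updates A equals the factor by which A updates B:

$$
\frac{P(A \mid B)}{P(A)} \;=\; \frac{P(A \cap B)}{P(A)\,P(B)} \;=\; \frac{P(B \mid A)}{P(B)}
$$

So B raises the probability of A exactly when A raises the probability of B, and by the same likelihood ratio.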
Eliezer Yudkowsky calls this “mutual information” and contrasts the “symmetrical probability flows” of the Bayesian with the “sharp lurches of logic”. ^[ Rationality, From A to Z#^27ecf7]
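Concretely, the mutual information between Y and Z is (standard definition, not taken from the quoted passage):

$$
I(Y;Z) \;=\; \sum_{y,\,z} P(y,z)\,\log \frac{P(y,z)}{P(y)\,P(z)} \;=\; I(Z;Y)
$$

The expression is unchanged when Y and Z swap roles, which is the symmetry the quote below points to.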
Considering the black raven example from both this Bayesian inductive perspective and the traditional deductive perspective illustrates the difference discussed in [[Full Logic|Inductive vs Deductive Logic]]. A numerical sketch of the raven case follows.
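A minimal sketch of the symmetry, assuming a made-up joint distribution over ravenhood and blackness (the specific probabilities are illustrative, not from the source):

```python
# Toy joint distribution over (kind, color) to check the symmetry numerically.
P = {
    ("raven", "black"): 0.01,       # all ravens are black: P(black | raven) = 1
    ("raven", "not_black"): 0.00,
    ("other", "black"): 0.19,       # plenty of non-ravens are also black
    ("other", "not_black"): 0.80,
}

def marginal(event, index):
    """Sum the joint probability over all outcomes matching `event` at `index`."""
    return sum(p for outcome, p in P.items() if outcome[index] == event)

p_raven = marginal("raven", 0)                          # 0.01
p_black = marginal("black", 1)                          # 0.20
p_black_given_raven = P[("raven", "black")] / p_raven   # 1.0 (the deductive rule)
p_raven_given_black = P[("raven", "black")] / p_black   # 0.05

# Deduction only licenses Raven(x) => Black(x), yet the probability flow is
# symmetric: observing black raises P(raven) from 0.01 to 0.05, by the same
# 5x likelihood ratio with which being a raven raises P(black).
print(round(p_black_given_raven / p_black, 6))   # 5.0
print(round(p_raven_given_black / p_raven, 6))   # 5.0
```

Observing Black(x) does not let you conclude Raven(x), but it is still evidence for it, by exactly the factor predicted by the derivation above.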
References
“(I digress here to remark that the symmetry of the expression for the mutual information shows that Y must tell us as much about Z, on average, as Z tells us about Y. I leave it as an exercise to the reader to reconcile this with anything they were taught in logic class about how, if all ravens are black, being allowed to reason Raven(x) ⇒ Black(x) doesn’t mean you’re allowed to reason Black(x) ⇒ Raven(x). How different seem the symmetrical probability flows of the Bayesian, from the sharp lurches of logic—even though the latter is just a degenerate case of the former.)” (Eliezer Yudkowsky, Rationality)
^27ecf7
This is an example of [[Full Logic|Inductive vs Deductive Logic]], and of [[Evidence is symmetric]].