Evidence is symmetric

By Bayes’ Theorem, if two phenomena are not independent, then their evidential influence on each other is symmetric: observing either one shifts the probability of the other.
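
A minimal sketch of the symmetry, writing the two phenomena as events $A$ and $B$:

$$
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
\qquad\Longrightarrow\qquad
\frac{P(A \mid B)}{P(A)} = \frac{P(B \mid A)}{P(B)}
$$

So $P(A \mid B) > P(A)$ exactly when $P(B \mid A) > P(B)$: if observing $B$ is evidence for $A$, then observing $A$ is evidence for $B$.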

Eliezer Yudkowsky calls this “mutual information” and “symmetric probability flow”. ^[ Rationality, From A to Z#^27ecf7]

Take an object about which we have no prior information. If we observe that it is very likely a raven, the probability that it is black increases significantly.

Take the same kind of object, again with no prior information. If we instead observe that it is very likely black, the probability that it is a raven increases slightly.

The two examples above have symmetric probability effects on each other. The technical symmetry: the ratio of posterior to prior is the same in both directions, P(black | raven) / P(black) = P(raven | black) / P(raven), so both updates point the same way; only the absolute shifts differ, because the priors differ.
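
A quick numerical sketch of that asymmetry in magnitude. The probabilities below are illustrative assumptions (not real raven statistics), and the observation is treated as certain for simplicity:

```python
# Illustrative, assumed probabilities -- not real raven statistics.
p_raven = 0.001              # prior: P(object is a raven)
p_black = 0.20               # prior: P(object is black)
p_black_given_raven = 0.99   # P(black | raven)

# Bayes' theorem: P(raven | black) = P(black | raven) * P(raven) / P(black)
p_raven_given_black = p_black_given_raven * p_raven / p_black

# Both posteriors rise above their priors (same direction of update)...
print(f"P(black | raven) = {p_black_given_raven:.4f}  vs prior P(black) = {p_black:.4f}")
print(f"P(raven | black) = {p_raven_given_black:.4f}  vs prior P(raven) = {p_raven:.4f}")

# ...and the posterior/prior ratio is identical in both directions,
# even though the absolute jumps differ (0.20 -> 0.99 vs 0.001 -> ~0.005).
print(f"P(black | raven) / P(black) = {p_black_given_raven / p_black:.2f}")
print(f"P(raven | black) / P(raven) = {p_raven_given_black / p_raven:.2f}")
```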

Considering the black raven example from both this Bayesian inductive perspective and the traditional deductive perspective clarifies the differences covered in [[Full Logic|Inductive vs Deductive Logic]].