Peirce’s 1870 “Logic of Relatives” • Comment 11.23


Peirce’s description of logical conjunction and conditional probability via the logic of relatives and the mathematics of relations is critical to understanding the relationship between logic and measurement, in effect, between the qualitative and quantitative aspects of inquiry.  To root that connection firmly in mind, I will try to sum up as succinctly as possible, in more current notation, the lesson we ought to take away from Peirce’s last “number of” example, since I know the account I have given so far may appear to have wandered widely.

NOF 4.3

So if men are just as apt to be black as things in general,

[\mathrm{m,}][\mathrm{b}] ~=~ [\mathrm{m,}\mathrm{b}],

where the difference between [\mathrm{m}] and [\mathrm{m,}] must not be overlooked.

(Peirce, CP 3.76)

Viewed in different lights, the formula [\mathrm{m,}\mathrm{b}] = [\mathrm{m,}][\mathrm{b}] presents itself as an aimed arrow, fair sampling, or statistical independence condition.  The concept of independence was illustrated in the previous installment by means of a case where independence fails.  The details of that counterexample are summarized below.

\text{Figure 54. Bigraph Product}~ M,B

The condition that “men are just as apt to be black as things in general” is expressed in terms of conditional probabilities as \mathrm{P}(\mathrm{b}|\mathrm{m}) = \mathrm{P}(\mathrm{b}), which means that the probability of the event \mathrm{b} given the event \mathrm{m} is equal to the unconditional probability of the event \mathrm{b}.

In the Othello example it is enough to observe that \mathrm{P}(\mathrm{b}|\mathrm{m}) = \tfrac{1}{4} while \mathrm{P}(\mathrm{b}) = \tfrac{1}{7} in order to recognize the bias or dependency of the sampling map.
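To make the arithmetic concrete, here is a minimal Python sketch computing those two probabilities under uniform sampling from a finite universe.  The cast list is an assumption supplied only for illustration, chosen to reproduce the counts behind \mathrm{P}(\mathrm{b}|\mathrm{m}) = \tfrac{1}{4} and \mathrm{P}(\mathrm{b}) = \tfrac{1}{7}; the helper function prob is likewise hypothetical.

```python
from fractions import Fraction

# Hypothetical universe of discourse for the Othello example:
# seven individuals, four of whom are men, one of whom is black.
universe = {"Bianca", "Cassio", "Desdemona", "Emilia", "Iago", "Othello", "Roderigo"}
men      = {"Cassio", "Iago", "Othello", "Roderigo"}
black    = {"Othello"}

def prob(event, condition=None):
    """Probability of event under uniform sampling, optionally conditioned
    by restricting the sample space to the conditioning set."""
    space = universe if condition is None else condition
    return Fraction(len(event & space), len(space))

print(prob(black))        # P(b)   = 1/7
print(prob(black, men))   # P(b|m) = 1/4
```

Conditioning by restricting the sample space to the men gives the same value as the quotient \mathrm{P}(\mathrm{b}\mathrm{m}) / \mathrm{P}(\mathrm{m}), so the mismatch between \tfrac{1}{4} and \tfrac{1}{7} is the failure of independence in miniature.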

The reduction of a conditional probability to an absolute probability, as \mathrm{P}(A|Z) = \mathrm{P}(A), is one of the ways we come to recognize the condition of independence, \mathrm{P}(AZ) = \mathrm{P}(A)\mathrm{P}(Z), via the definition of conditional probability, \mathrm{P}(A|Z) = \displaystyle{\mathrm{P}(AZ) \over \mathrm{P}(Z)}.

By way of recalling the derivation, the definition of conditional probability plus the independence condition yields the following sequence of equations.

\mathrm{P}(A|Z) = \displaystyle{\mathrm{P}(AZ) \over \mathrm{P}(Z)} = \displaystyle{\mathrm{P}(A)\mathrm{P}(Z) \over \mathrm{P}(Z)} = \mathrm{P}(A).
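As a quick numerical check of that derivation, the following sketch computes \mathrm{P}(A|Z) from the definition and compares the two forms of the independence condition.  The numbers in the first case are hypothetical, chosen to satisfy the product condition; the second case uses the Othello figures above, where A = \mathrm{b} and Z = \mathrm{m}.

```python
from fractions import Fraction

def conditional(p_az, p_z):
    """P(A|Z) computed from the definition P(AZ)/P(Z)."""
    return p_az / p_z

# Independent case (hypothetical numbers): P(AZ) = P(A)P(Z),
# so the conditional probability collapses to the absolute one.
p_a, p_z, p_az = Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)
print(conditional(p_az, p_z) == p_a)   # True: P(A|Z) = P(A)

# Othello case: P(A) = 1/7, P(Z) = 4/7, P(AZ) = 1/7.
p_a, p_z, p_az = Fraction(1, 7), Fraction(4, 7), Fraction(1, 7)
print(conditional(p_az, p_z))          # 1/4, not equal to P(A) = 1/7
print(p_az == p_a * p_z)               # False: the product condition fails too
```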

As Hamlet discovered, there’s a lot to be learned from turning a crank.

Resources

cc: Cybernetics • Ontolog Forum • Structural Modeling • Systems Science
cc: FB | Peirce Matters • Laws of Form (1) (2) • Peirce List (1) (2) (3) (4) (5) (6) (7)
