Pragmatic Semiotic Information • 9

Information Recapped

Reflection on the inverse relation between uncertainty and information led us to define the information capacity of a communication channel as the average uncertainty reduction on receiving a sign, adopting the acronym “auroras” as a mnemonic for the definition.

To see how channel capacity is computed in a concrete case let’s return to the scene of uncertainty shown in Figure 5.

Figure 5. Scene of Uncertainty

For the sake of illustration, let's assume we are dealing with the observational type of uncertainty and operating under the descriptive reading of signs, where the reception of a sign says something about what's true of our situation.  Then we have the following cases.

  • On receiving the message “A” the additive measure of uncertainty is reduced from \log 5 to \log 3, so the net reduction is (\log 5 - \log 3).
  • On receiving the message “B” the additive measure of uncertainty is reduced from \log 5 to \log 2, so the net reduction is (\log 5 - \log 2).
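
Measured in bits, that is, with logarithms taken to base 2, the two reductions come to roughly the following values.

\log 5 - \log 3 \approx 2.322 - 1.585 = 0.737

\log 5 - \log 2 \approx 2.322 - 1.000 = 1.322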

The average uncertainty reduction per sign of the language is computed by taking a weighted average of the reductions occurring in the channel, where the weight of each reduction is the number of options or outcomes falling under the associated sign.

The uncertainty reduction (\log 5 - \log 3) is assigned a weight of 3.

The uncertainty reduction (\log 5 - \log 2) is assigned a weight of 2.

Finally, the weighted average of the two reductions is computed as follows.

{1 \over {2 + 3}}(3(\log 5 - \log 3) + 2(\log 5 - \log 2))
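
Evaluated in bits, again with base-2 logarithms, the weighted average works out to roughly the following value.

{1 \over 5}(3(0.737) + 2(1.322)) \approx {1 \over 5}(2.211 + 2.644) \approx 0.971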

Extracting the pattern of calculation yields the following worksheet for computing the capacity of a two‑symbol channel with frequencies partitioned as n = k_1 + k_2.

Capacity of a channel {“A”, “B”} bearing the odds of 60 “A” to 40 “B”

\begin{array}{lcl}
  & = & \quad {1 \over n}(k_1(\log n - \log k_1) + k_2(\log n - \log k_2))  \\[4pt]
  & = & \quad {k_1 \over n}(\log n - \log k_1) + {k_2 \over n}(\log n - \log k_2)  \\[4pt]
  & = & \quad - {k_1 \over n}(\log k_1 - \log n) - {k_2 \over n}(\log k_2 - \log n)  \\[4pt]
  & = & \quad - {k_1 \over n}(\log {k_1 \over n}) - {k_2 \over n}(\log {k_2 \over n})  \\[4pt]
  & = & \quad - (p_1 \log p_1 + p_2 \log p_2)  \\[4pt]
  & = & \quad - (0.6 \log 0.6 + 0.4 \log 0.4)  \\[4pt]
  & = & \quad 0.971
\end{array}

In other words, the capacity of the channel, computed with logarithms to base 2, is slightly under 1 bit.  That makes intuitive sense inasmuch as 3 against 2 is a near-even split of 5, and the measure of channel capacity, otherwise known as the entropy, is designed to attain its maximum of 1 bit when a two-way partition is split 50-50, that is, when the distribution is uniform.
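
As a cross-check on the arithmetic, a minimal Python sketch of the two-symbol entropy formula derived above, assuming base-2 logarithms and using a function name chosen purely for illustration, runs along the following lines.

    from math import log2

    def channel_capacity(k1, k2):
        # Entropy of a two-symbol channel whose signs cover
        # k1 and k2 of the n = k1 + k2 options, measured in bits.
        n = k1 + k2
        p1, p2 = k1 / n, k2 / n
        return -(p1 * log2(p1) + p2 * log2(p2))

    print(channel_capacity(3, 2))   # about 0.971 bits for the 3 : 2 split
    print(channel_capacity(1, 1))   # exactly 1 bit at the 50-50 maximum

The last two lines make the closing point concrete: the 3 : 2 split falls just short of 1 bit, while an even split attains the maximum.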
