Ideas Demand Expression Always

Re: Quomodocumque

What came of my morning meditation today —

There is something about an idea that demands to be communicated.

But what of bad ideas, fixed ideas, ideology?

And again, is there really any such thing as a bad idea in Plato’s Heaven?

Or is it only the bad expression of a good idea that leads Humanity astray?

I will have to think on it more, anon.

Posted in Meditation

Anamnesis, Maieusis, Monadology, Semeiosis

  • Anamnesis • Learning = Recollection
  • Maieusis • Teaching = Midwifery
  • Monadology • Communication = Pre-Established Harmony
  • Semeiosis • Meaning = Interpretation
Posted in Anamnesis, Maieusis, Monadology, Semeiosis, Semiosis, Symbolism

The Lambda Point • 1

A note on the title.  It comes from long-ago discussions with Harvey Davis, one of my math professors at Michigan State.  I remember telling him of my interest in the place where algebra, geometry, and logic meet, and he quipped, “Ah yes, the lambda point”, punning on the triple point of phase transitions among gaseous, liquid, and solid states.

Re: Cathy O’Neil

One of the insights coming out of C.S. Peirce’s work on logic, informing the development of his logical graphs, is that negative logical relations are more fundamental than positive logical relations, since the right set of negative relations can generate all possible logical relations, but no set of purely positive relations can do all that.  That is the gist of it, put very roughly, modulo the right definitions of positive and negative relations, of course.

We see this theme exhibited in the generative power of the NAND and NNOR operators for propositional calculus which Peirce discovered early on and dubbed the amphecks.
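As a quick check of that claim, here is a bit of Python scratch work of my own, not part of the original discussion: starting from the two coordinate projections, closure under NAND alone, or NNOR alone, reaches all 16 Boolean functions on two variables, while closure under the purely positive connectives AND and OR reaches only 4 of them.

# A quick check of the claim above (my own scratch work): closure of the
# coordinate projections u and v under NAND alone, or NNOR alone, yields all
# 16 Boolean functions on two variables; closure under AND and OR yields 4.
from itertools import product

POINTS = list(product((0, 1), repeat=2))              # the points of B^2

u = tuple(p[0] for p in POINTS)                       # projection onto u
v = tuple(p[1] for p in POINTS)                       # projection onto v

def nand(f, g): return tuple(1 - (a & b) for a, b in zip(f, g))
def nnor(f, g): return tuple(1 - (a | b) for a, b in zip(f, g))
def con(f, g):  return tuple(a & b for a, b in zip(f, g))        # AND
def dis(f, g):  return tuple(a | b for a, b in zip(f, g))        # OR

def close(generators, ops):
    """Close a set of truth tables under the given binary operations."""
    closed = set(generators)
    while True:
        new = {op(f, g) for op in ops for f in closed for g in closed}
        if new <= closed:
            return closed
        closed |= new

print(len(close({u, v}, [nand])))        # 16 : NAND generates everything
print(len(close({u, v}, [nnor])))        # 16 : so does NNOR
print(len(close({u, v}, [con, dis])))    #  4 : the positive connectives don't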

Posted in Algebra, Amphecks, Boolean Algebra, C.S. Peirce, Cactus Graphs, Geometry, Graph Theory, Lambda Point, Logic, Logical Graphs, Mathematics, Minimal Negation Operators, Peirce, Propositional Calculus, Topology

How To Succeed In Proof Business Without Really Trying

Re: R.J. Lipton • Surely You Are Joking?

Comment 1

Even at the mailroom entry point of propositional calculus, there is a qualitative difference between insight proofs and routine proofs.  Human beings can do either sort, as a rule, but routinizing insight is notoriously difficult, so the clerical routines have always been the ones that lend themselves to the canonical brands of canned mechanical proofs.

Just by way of a very choice example, consider the Praeclarum Theorema (Splendid Theorem) noted by Leibniz, as presented in cactus syntax here:
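The cactus figure itself is not reproduced here; in conventional notation the Praeclarum Theorema reads as follows.

((a \Rightarrow b) \land (d \Rightarrow c)) \Rightarrow ((a \land d) \Rightarrow (b \land c))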

I’ll discuss different ways of proving this in the comments that follow.

Comment 2

The proof given via the link above is the sort that a human, all too human, was able to find without much trouble.  You can see that it exhibits a capacity for global pattern recognition and analogical pattern matching — manifestly aided by the use of graphical syntax — that marks the human knack for finding proofs.  When I first set to work developing a Simple Propositional Logic Engine (SPLE) those were the aptitudes I naturally sought to emulate.  Alas, I lacked the metaptitude for that.

Comment 3

For my next proof of the Praeclarum Theorema I give an example of a routine proof, the sort of proof that a machine with all its blinkers on can be trained to derive simply by following its nose, demanding as little insight as possible and exploiting the barest modicum of tightly reined-in look-ahead.
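The routine proof in question proceeds by rewriting logical graphs, which I won't reproduce here, but the following Python sketch of mine captures the flavor of a follow-your-nose procedure: split on one variable at a time and check that every branch closes, with no global pattern matching anywhere.

# A rough analogue in code of a "routine" proof: recursive case analysis
# (Shannon expansion) on one variable at a time, with no insight and no
# look-ahead beyond evaluating fully grounded branches.  My own sketch,
# not the cactus-graph rewrite system used in the proof referred to above.
# Formulas are nested tuples such as ('and', f, g) over ('var', name).

def variables(f):
    """Set of variable names occurring in formula f."""
    if f[0] == 'var':
        return {f[1]}
    if f[0] == 'const':
        return set()
    return set().union(*(variables(g) for g in f[1:]))

def substitute(f, name, value):
    """Replace every occurrence of the named variable by a constant."""
    if f[0] == 'const':
        return f
    if f[0] == 'var':
        return ('const', value) if f[1] == name else f
    return (f[0],) + tuple(substitute(g, name, value) for g in f[1:])

def evaluate(f):
    """Evaluate a formula that contains no variables."""
    op = f[0]
    if op == 'const':   return f[1]
    if op == 'not':     return not evaluate(f[1])
    if op == 'and':     return evaluate(f[1]) and evaluate(f[2])
    if op == 'or':      return evaluate(f[1]) or evaluate(f[2])
    if op == 'implies': return (not evaluate(f[1])) or evaluate(f[2])
    raise ValueError(f"unknown operator {op!r}")

def is_tautology(f):
    """Split on any remaining variable; both branches must prove out."""
    vs = variables(f)
    if not vs:
        return evaluate(f)
    x = min(vs)                 # pick a variable by name, no cleverness
    return (is_tautology(substitute(f, x, False)) and
            is_tautology(substitute(f, x, True)))

# Praeclarum Theorema: ((a => b) and (d => c)) => ((a and d) => (b and c))
a, b, c, d = (('var', n) for n in 'abcd')
praeclarum = ('implies',
              ('and', ('implies', a, b), ('implies', d, c)),
              ('implies', ('and', a, d), ('and', b, c)))
print(is_tautology(praeclarum))   # True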

Posted in Algorithms, Animata, Artificial Intelligence, Automatic Theorem Proving, Boolean Algebra, Boolean Functions, C.S. Peirce, Cactus Graphs, Computational Complexity, Graph Theory, Logic, Logical Graphs, Minimal Negation Operators, Model Theory, Peirce, Praeclarum Theorema, Proof Theory, Propositional Calculus, Visualization

What Is A Theorem That A Human May Prove It?

Re: Gil Kalai • Why Is Mathematics Possible? • Tim Gowers’ Take On The Matter

Comment 1

To the extent that mathematics has to do with reasoning about possible existence, or inference from pure hypothesis, a line of thinking going back to Aristotle and developed greatly by C.S. Peirce may have some bearing on the question of “Why Mathematics Is Possible”.  In that line of thought, hypothesis formation is treated as a case of “abductive” inference, whose job in science generally is to supply suitable raw materials for deduction and induction to develop and test.  In that light, a large part of our original question becomes, as Peirce once expressed it —

Is there cause to believe “we can trust to the human mind’s having such a power of guessing right that before very many hypotheses shall have been tried, intelligent guessing may be expected to lead us to the one which will support all tests, leaving the vast majority of possible hypotheses unexamined”?  (Peirce, Collected Papers, CP 6.530).

The question may fit the situation in mathematics slightly better if we modify the word “hypothesis” to say “proof”.

Comment 2

I copied out a more substantial excerpt from Peirce’s paper here:

The question of naturalness arises in many areas, from AI and cognitive science to logic and the philosophy of science, most often under the heading of “Natural Kinds”.  Given a universe of discourse X, the lattice of “All Kinds” would be its power set, and we want to know what portion of that ordering makes up the Natural Kinds, the concepts or hypotheses that are worth considering in practice.
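To fix the scale of the problem with a toy example of my own: a universe of discourse with just 3 elements already gives a lattice of 8 kinds and 27 inclusion relations among them, and the numbers grow exponentially from there.

# Toy illustration: for a universe of discourse X, the lattice of "all kinds"
# is the power set of X ordered by inclusion.  Here |X| = 3, so there are
# 2**3 = 8 kinds and 27 inclusion pairs; the natural kinds would be some
# distinguished fragment of this ordering.
from itertools import combinations

X = {'a', 'b', 'c'}                     # toy universe of discourse

all_kinds = [set(S) for r in range(len(X) + 1)
                    for S in combinations(sorted(X), r)]

print(len(all_kinds))                   # 8 kinds in all
print(sum(1 for s in all_kinds for t in all_kinds if s <= t))
# 27 ordered pairs s <= t in the inclusion order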

To the same purpose, Peirce employs the criterion of “admissible hypotheses that seem the simplest to the human mind”.

Comment 3

The following project report outlines the three types of inference — Abductive, Deductive, and Inductive — as treated by Aristotle and Peirce, at least insofar as these patterns of reasoning can be analyzed in syllogistic forms.  I did this work by way of exploring how a propositional logic engine might be used to assist in scientific inquiry.

It looks a bit cobbled together to my eyes today and probably could use a rewrite, but I did put a lot of work into the diagrams and remain rather pleased with those.

References

Well, more like allusions, really …

  • McCulloch, Warren S. (1961), “What Is a Number that a Man May Know It, and a Man, that He May Know a Number?”, Ninth Alfred Korzybski Memorial Lecture, General Semantics Bulletin, Numbers 26 and 27, pp. 7–18, Institute of General Semantics, Lakeville, CT.  Reprinted in Embodiments of Mind, pp. 1–18.  Online (1) (2).
  • McCulloch, Warren S. (1965), Embodiments of Mind, MIT Press, Cambridge, MA.
Posted in Abduction, Analogy, Aristotle, C.S. Peirce, Conjecture, Deduction, Epistemology, Hypothesis, Induction, Inquiry, Logic, Logic of Science, Mathematics, Peirce, Proof Theory, Retroduction, Theorem Proving, Warren S. McCulloch

C.S. Peirce • The Proper Treatment of Hypotheses

Selection from C.S. Peirce, “Hume On Miracles” (1901), CP 6.522–547

530.   Now the testing of a hypothesis is usually more or less costly. Not infrequently the whole life’s labor of a number of able men is required to disprove a single hypothesis and get rid of it. Meantime the number of possible hypotheses concerning the truth or falsity of which we really know nothing, or next to nothing, may be very great. In questions of physics there is sometimes an infinite multitude of such possible hypotheses. The question of economy is clearly a very grave one.

In very many questions, the situation before us is this: We shall do better to abandon the whole attempt to learn the truth, however urgent may be our need of ascertaining it, unless we can trust to the human mind’s having such a power of guessing right that before very many hypotheses shall have been tried, intelligent guessing may be expected to lead us to the one which will support all tests, leaving the vast majority of possible hypotheses unexamined. Of course, it will be understood that in the testing process itself there need be no such assumption of mysterious guessing-powers. It is only in selecting the hypothesis to be tested that we are to be guided by that assumption.

531.   If we subject the hypothesis, that the human mind has such a power in some degree, to inductive tests, we find that there are two classes of subjects in regard to which such an instinctive scent for the truth seems to be proved. One of these is in regard to the general modes of action of mechanical forces, including the doctrine of geometry; the other is in regard to the ways in which human beings and some quadrupeds think and feel. In fact, the two great branches of human science, physics and psychics, are but developments of that guessing-instinct under the corrective action of induction.

532.   In those subjects, we may, with great confidence, follow the rule that that one of all admissible hypotheses which seems the simplest to the human mind ought to be taken up for examination first. Perhaps we cannot do better than to extend this rule to all subjects where a very simple hypothesis is at all admissible.

This rule has another advantage, which is that the simplest hypotheses are those of which the consequences are most readily deduced and compared with observation; so that, if they are wrong, they can be eliminated at less expense than any others.

Notes

Wiener, Selected Writings

  • Chapter 18. Letters to Samuel P. Langley, and “Hume on Miracles and Laws of Nature” (pp. 275–321).

Essential Peirce 2(a)(b)

  • MS 869, untitled, marked “H[ume] on M[iracles]”. Probably composed toward the end of April 1901 as a working document toward the next one. Published in CP 6.522–547.
  • MS 692, “The Proper Treatment of Hypotheses : a Preliminary Chapter, toward an Examination of Hume’s Argument against Miracles, in its Logic and in its History”. This was the second paper Peirce sent to Langley, who received it on May 13, 1901. Peirce wanted it to be the first of three chapters. Langley rejected the paper and the plan on May 18. Published in Carolyn Eisele’s Historical Perspectives 2:890–904.

References

  • Peirce, C.S., Collected Papers of Charles Sanders Peirce, vols. 1–6, Charles Hartshorne and Paul Weiss (eds.), vols. 7–8, Arthur W. Burks (ed.), Harvard University Press, Cambridge, MA, 1931–1935, 1958. Volume 6 : Scientific Metaphysics, 1935.
  • Wiener, Philip P. (ed.), Charles S. Peirce : Selected Writings, Dover Publications, New York, NY, 1966. Originally published as Values in a Universe of Chance, Doubleday, 1958.
Posted in Abduction, Hypothesis, Inquiry, Logic of Science, Peirce, References, Retroduction, Sources

Moneytheism

Re: Cathy O’Neil • Profit as Proxy for Value

There is a deep and pervasive analogy between systems of commerce and systems of communication, turning on their near-universal use of symbola (images, media, proxies, signs, symbols, tokens, etc.) to stand for pragmata (objects, objectives, the things we really care about — or would really care about if we examined our values in practice thoroughly enough).

Both types of sign-using systems are prey to the same sort of dysfunction or functional disease — it sets in when their users confuse signs with objects so badly that signs become ends instead of means.

There is a vast literature on this topic, once you think to go looking for it.  And it’s a perennial theme in fable and fiction.

Posted in Commerce, Communication, Economics, Moneytheism, Semiotics, Sign Relations

Fourier Transforms of Boolean Functions • 2

Re: R.J. Lipton and K.W. Regan • Twin Primes Are Useful

Note.  Just another sheet of scratch paper, exploring possible alternatives to the Fourier transforms in the previous post.  As a rule, I like to keep Boolean problems in Boolean spaces, partly for aesthetic reasons and partly from a sense that it doesn’t reduce the computational complexity of Boolean problems to replace them with integer or real number problems.  I’ll begin by copying the previous post as a template and gradually transform it as I proceed.

Begin with a survey of concrete examples, perhaps in tabular form.

Notation

Boolean domain {\mathbb{B} = \{0, 1\}}.

Boolean function on {k} variables {f : \mathbb{B}^k \to \mathbb{B}}.

Boolean coordinate projections {\mathcal{X} = \{ x_1, \ldots, x_k \}},
where {x_j : \mathbb{B}^k \to \mathbb{B}} such that {x_j : (x_1, \ldots, x_j, \ldots, x_k) \mapsto x_j}.

Minimal negation operator {\nu_k : \mathbb{B}^k \to \mathbb{B}}. In contexts where the meaning is clear, {\nu_k (x_1, \ldots, x_k)} may be written as {\nu (x_1, \ldots, x_k)} or even, using a different style of parentheses, as \texttt{(} x_1 \texttt{,} \ldots \texttt{,} x_k \texttt{)}.
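Here is the notation transcribed into a bit of Python scratch work of my own, mainly so the tables below can be checked mechanically.  The reading of the minimal negation operator used here, true exactly when exactly one argument is false, agrees with the tables that follow.

# The notation above, transcribed for later checking: the Boolean domain B,
# the points of B^k, the coordinate projections x_j, and the minimal negation
# operator nu, read here as true exactly when exactly one argument is false.
from itertools import product

B = (0, 1)

def points(k):
    """The points of B^k, listed as k-tuples."""
    return list(product(B, repeat=k))

def projection(j):
    """The coordinate projection x_j : B^k -> B (1-indexed in j)."""
    return lambda x: x[j - 1]

def nu(*xs):
    """Minimal negation operator: 1 iff exactly one argument is 0."""
    return int(sum(1 for x in xs if x == 0) == 1)

print([nu(x) for x in B])                   # [1, 0] : nu(x) is negation
print([nu(u, v) for u, v in points(2)])     # [0, 1, 1, 0] : nu(u, v) is XOR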

{k = 2}

For ease of reading formulas, let {x = (x_1, x_2) = (u, v)}.

Identify {x \in \mathbb{B}^2} with the corresponding singular proposition {x : \mathbb{B}^2 \to \mathbb{B}}.

Try some other bases, but with addition as in {\mathbb{F}_2 = \text{GF}(2)}.

Observation. The propositions {f_7, f_{11}, f_{13}, f_{14}} are pairwise orthogonal.

Let {\mathcal{G} = \{ f_7, f_{11}, f_{13}, f_{14} \}}. I’m thinking of calling these the cosingular or fenestral propositions. (I would have called them the lacunary or umbral propositions but those terms already have established meanings in mathematics.) On third thought, I think I’ll call them crenular propositions, a crenel being a notch at the top of a structure.
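A scratch check of the observation, taking the inner product to be the GF(2) sum of pointwise products suggested above:

# Scratch check of the observation: the four propositions f_7, f_11, f_13,
# f_14 (NAND, u => v, v => u, OR) are pairwise orthogonal when the inner
# product is the GF(2) sum of the pointwise products over B^2.
from itertools import product, combinations

POINTS = list(product((0, 1), repeat=2))              # (u, v) in B^2

f7  = lambda u, v: 1 - (u & v)                        # (u v)     NAND
f11 = lambda u, v: 1 - (u & (1 - v))                  # (u (v))   u => v
f13 = lambda u, v: 1 - ((1 - u) & v)                  # ((u) v)   v => u
f14 = lambda u, v: 1 - ((1 - u) & (1 - v))            # ((u)(v))  u or v

G = {'f7': f7, 'f11': f11, 'f13': f13, 'f14': f14}

def inner(f, g):
    """GF(2) inner product: sum of pointwise products, mod 2."""
    return sum(f(u, v) * g(u, v) for u, v in POINTS) % 2

for (name1, f), (name2, g) in combinations(G.items(), 2):
    print(name1, name2, inner(f, g))                  # every pair gives 0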

Definitions

Fourier coefficient of {f} on {g}

{\displaystyle \hat{f}(g) = \sum_{x \in \mathbb{B}^2} f(x) \cdot g(x)}

Fourier expansion of {f}

{\displaystyle f(x) = \sum_{g \in \mathcal{G}} \hat{f}(g) \cdot g(x)}
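A brute-force check of my own that these definitions behave as intended over GF(2): every one of the 16 Boolean functions on two variables is recovered from its coefficients on \mathcal{G}, which also reproduces the coefficient columns in the tables below.

# Brute-force check, over GF(2), that every Boolean function f on two
# variables is recovered from its coefficients on G = {f_7, f_11, f_13, f_14}
# by the expansion f(x) = sum_g fhat(g) * g(x), all sums taken mod 2.
from itertools import product

POINTS = list(product((0, 1), repeat=2))                  # (u, v) in B^2

def func(bits):
    """Boolean function from its value tuple at (0,0),(0,1),(1,0),(1,1)."""
    table = dict(zip(POINTS, bits))
    return lambda u, v: table[(u, v)]

f7, f11, f13, f14 = (func(t) for t in
                     [(1, 1, 1, 0),      # f_7  = NAND(u, v)
                      (1, 1, 0, 1),      # f_11 = u => v
                      (1, 0, 1, 1),      # f_13 = v => u
                      (0, 1, 1, 1)])     # f_14 = u or v
G = [f7, f11, f13, f14]

def fhat(f, g):
    """Fourier coefficient of f on g, mod 2."""
    return sum(f(u, v) * g(u, v) for u, v in POINTS) % 2

def expansion(f):
    """Reassemble f from its coefficients on G, mod 2."""
    coeffs = [fhat(f, g) for g in G]
    return lambda u, v: sum(c * g(u, v) for c, g in zip(coeffs, G)) % 2

for bits in product((0, 1), repeat=4):                    # all 16 functions
    f = func(bits)
    assert all(expansion(f)(u, v) == f(u, v) for u, v in POINTS)
print("expansion recovers all 16 Boolean functions on two variables")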

Tables

\begin{array}{|c||*{4}{c}|}
\multicolumn{5}{c}{\text{Table 2.1. Values of}~ g(x)} \\[4pt]
\hline
g & f_{8} & f_{4} & f_{2} & f_{1} \\
  & \texttt{ } u \texttt{  } v \texttt{ }
  & \texttt{ } u \texttt{ (} v \texttt{)}
  & \texttt{(} u \texttt{) } v \texttt{ }
  & \texttt{(} u \texttt{)(} v \texttt{)} \\
\hline\hline
f_{7}  & 0 & 1 & 1 & 1 \\
f_{11} & 1 & 0 & 1 & 1 \\
f_{13} & 1 & 1 & 0 & 1 \\
f_{14} & 1 & 1 & 1 & 0 \\
\hline
\end{array}

\begin{array}{|*{9}{c|}}
\multicolumn{9}{c}{\text{Table 2.2. Fourier Coefficients of Boolean Functions on Two Variables}} \\[4pt]
\hline
\text{~~~~~~~~} & \text{~~~~~~~~} & & \text{~~~~~~~~} & \text{~~~~~~~~} &
\text{~~~~~~~~~} & \text{~~~~~~~~} & \text{~~~~~~~~} & \text{~~~~~~~~~} \\
L_1 & L_2 && L_3 & L_4 &
\hat{f}(f_{7}) & \hat{f}(f_{11}) & \hat{f}(f_{13}) & \hat{f}(f_{14}) \\
~ & ~ & ~ & ~ & ~ & ~ & ~ & ~ & ~ \\
\hline
&& u = & 1~1~0~0 &&&&& \\
&& v = & 1~0~1~0 &&&&& \\
\hline
f_{0}  & f_{0000} && 0~0~0~0 & (~)      & 0 & 0 & 0 & 0 \\
f_{1}  & f_{0001} && 0~0~0~1 & (u)(v)   & 1 & 1 & 1 & 0 \\
f_{2}  & f_{0010} && 0~0~1~0 & (u)~v~   & 1 & 1 & 0 & 1 \\
f_{3}  & f_{0011} && 0~0~1~1 & (u)      & 0 & 0 & 1 & 1 \\
f_{4}  & f_{0100} && 0~1~0~0 & ~u~(v)   & 1 & 0 & 1 & 1 \\
f_{5}  & f_{0101} && 0~1~0~1 & (v)      & 0 & 1 & 0 & 1 \\
f_{6}  & f_{0110} && 0~1~1~0 & (u,~v)   & 0 & 1 & 1 & 0 \\
f_{7}  & f_{0111} && 0~1~1~1 & (u~~v)   & 1 & 0 & 0 & 0 \\
\hline
f_{8}  & f_{1000} && 1~0~0~0 & ~u~~v~   & 0 & 1 & 1 & 1 \\
f_{9}  & f_{1001} && 1~0~0~1 & ((u,~v)) & 1 & 0 & 0 & 1 \\
f_{10} & f_{1010} && 1~0~1~0 & v        & 1 & 0 & 1 & 0 \\
f_{11} & f_{1011} && 1~0~1~1 & (~u~(v)) & 0 & 1 & 0 & 0 \\
f_{12} & f_{1100} && 1~1~0~0 & u        & 1 & 1 & 0 & 0 \\
f_{13} & f_{1101} && 1~1~0~1 & ((u)~v~) & 0 & 0 & 1 & 0 \\
f_{14} & f_{1110} && 1~1~1~0 & ((u)(v)) & 0 & 0 & 0 & 1 \\
f_{15} & f_{1111} && 1~1~1~1 & ((~))    & 1 & 1 & 1 & 1 \\
\hline
\end{array}

\begin{array}{|*{9}{c|}}
\multicolumn{9}{c}{\text{Table 2.3. Fourier Coefficients of Boolean Functions on Two Variables}} \\[4pt]
\hline
\text{~~~~~~~~} & \text{~~~~~~~~} & & \text{~~~~~~~~} & \text{~~~~~~~~} &
\text{~~~~~~~~~} & \text{~~~~~~~~} & \text{~~~~~~~~} & \text{~~~~~~~~~} \\
L_1 & L_2 && L_3 & L_4 &
\hat{f}(f_{7}) & \hat{f}(f_{11}) & \hat{f}(f_{13}) & \hat{f}(f_{14}) \\
~ & ~ & ~ & ~ & ~ & ~ & ~ & ~ & ~ \\
\hline
&& u = & 1~1~0~0 &&&&& \\
&& v = & 1~0~1~0 &&&&& \\
\hline
f_{0}  & f_{0000} && 0~0~0~0 & (~)      & 0 & 0 & 0 & 0 \\
\hline
f_{1}  & f_{0001} && 0~0~0~1 & (u)(v)   & 1 & 1 & 1 & 0 \\
f_{2}  & f_{0010} && 0~0~1~0 & (u)~v~   & 1 & 1 & 0 & 1 \\
f_{4}  & f_{0100} && 0~1~0~0 & ~u~(v)   & 1 & 0 & 1 & 1 \\
f_{8}  & f_{1000} && 1~0~0~0 & ~u~~v~   & 0 & 1 & 1 & 1 \\
\hline
f_{3}  & f_{0011} && 0~0~1~1 & (u)      & 0 & 0 & 1 & 1 \\
f_{12} & f_{1100} && 1~1~0~0 & u        & 1 & 1 & 0 & 0 \\
\hline
f_{6}  & f_{0110} && 0~1~1~0 & (u,~v)   & 0 & 1 & 1 & 0 \\
f_{9}  & f_{1001} && 1~0~0~1 & ((u,~v)) & 1 & 0 & 0 & 1 \\
\hline
f_{5}  & f_{0101} && 0~1~0~1 & (v)      & 0 & 1 & 0 & 1 \\
f_{10} & f_{1010} && 1~0~1~0 & v        & 1 & 0 & 1 & 0 \\
\hline
f_{7}  & f_{0111} && 0~1~1~1 & (u~~v)   & 1 & 0 & 0 & 0 \\
f_{11} & f_{1011} && 1~0~1~1 & (~u~(v)) & 0 & 1 & 0 & 0 \\
f_{13} & f_{1101} && 1~1~0~1 & ((u)~v~) & 0 & 0 & 1 & 0 \\
f_{14} & f_{1110} && 1~1~1~0 & ((u)(v)) & 0 & 0 & 0 & 1 \\
\hline
f_{15} & f_{1111} && 1~1~1~1 & ((~))    & 1 & 1 & 1 & 1 \\
\hline
\end{array}

Notes

References

  • 21 May 2013 • Twin Primes Are Useful
  • 08 Nov 2012 • The Power Of Guessing
  • 05 Jan 2011 • Fourier Complexity Of Symmetric Boolean Functions
  • 19 Nov 2010 • Is Complexity Theory On The Brink?
  • 18 Sep 2009 • Why Believe That P=NP Is Impossible?
  • 04 Jun 2009 • The Junta Problem
Posted in Boolean Functions, Computational Complexity, Fourier Transforms, Harmonic Analysis, Logic, Mathematics, Propositional Calculus

Wherefore Aught?

Re: R.J. Lipton and K.W. Regan • Why Is There Something?

Here is another one of those eternally recurring ideas echoed inimitably by C.S. Peirce in his sketch of a Cosmogonic Philosophy.

It would suppose that in the beginning,—infinitely remote,—there was a chaos of unpersonalized feeling, which being without connection or regularity would properly be without existence.  This feeling, sporting here and there in pure arbitrariness, would have started the germ of a generalizing tendency.  Its other sportings would be evanescent, but this would have a growing virtue.  Thus, the tendency to habit would be started;  and from this with the other principles of evolution all the regularities of the universe would be evolved.  At any time, however, an element of pure chance survives and will remain until the world becomes an absolutely perfect, rational, and symmetrical system, in which mind is at last crystallized in the infinitely distant future.  (Peirce, 1890/2010, p. 110).

The above quotation is taken from one of several discussions where Peirce introduces his idea that natural laws themselves evolve.  That idea has enjoyed yet another revival in recent days, notably by Lee Smolin in Time Reborn.

Reference

Charles S. Peirce (30 August 1890), “The Architecture of Theories”, pp. 98–110 in Peirce Edition Project (2010), Writings of Charles S. Peirce : A Chronological Edition, Volume 8, 1890–1892, Indiana University Press, Bloomington, IN. Published version, The Monist, vol. 1, no. 2 (January 1891), pp. 161–176.

Posted in C.S. Peirce, Cosmogony, Evolution, Existence, Natural Law, Peirce, Philosophy, References, Sources

Special Classes of Propositions

Adapted from Differential Propositional Calculus • Special Classes of Propositions

A basic proposition, coordinate proposition, or simple proposition in the universe of discourse \mathcal{X}^\bullet = \lbrack x_1, \ldots, x_k \rbrack is one of the propositions in the set \mathcal{X} = \lbrace x_1, \ldots, x_k \rbrace.

Among the 2^{2^k} propositions in \lbrack x_1, \ldots, x_k \rbrack are several families of 2^k propositions each that take on special forms with respect to the logical basis \lbrace x_1, \ldots, x_k \rbrace. Three of these families are especially prominent in the present context, the linear, the positive, and the singular propositions. Each family is naturally parameterized by the coordinate k-tuples in \mathbb{B}^k and falls into k + 1 ranks, with a binomial coefficient \dbinom{k}{j} giving the number of propositions that have rank or weight j.

  • The linear propositions, \lbrace \ell : \mathbb{B}^k \to \mathbb{B} \rbrace = (\mathbb{B}^k \xrightarrow{\ell} \mathbb{B}), may be written as sums:

    \begin{array}{llll}  \displaystyle\sum_{i=1}^k e_i ~=~ e_1 + \ldots + e_k &  \text{where} &  \left\{ \begin{matrix} e_i = x_i \\ \text{or} \\ e_i = 0 \end{matrix} \right\} &  \text{for}~ i=1 ~\text{to}~ k.  \end{array}

  • The positive propositions, \lbrace p : \mathbb{B}^k \to \mathbb{B} \rbrace = (\mathbb{B}^k \xrightarrow{p} \mathbb{B}), may be written as products:

    \begin{array}{llll}  \displaystyle\prod_{i=1}^k e_i ~=~ e_1 \cdot \ldots \cdot e_k &  \text{where} &  \left\{ \begin{matrix} e_i = x_i \\ \text{or} \\ e_i = 1 \end{matrix} \right\} &  \text{for}~ i=1 ~\text{to}~ k.  \end{array}

  • The singular propositions, \lbrace \mathbf{x} : \mathbb{B}^k \to \mathbb{B} \rbrace = (\mathbb{B}^k \xrightarrow{s} \mathbb{B}), may be written as products:

    \begin{array}{llll}  \displaystyle\prod_{i=1}^k e_i ~=~ e_1 \cdot \ldots \cdot e_k &  \text{where} &  \left\{ \begin{matrix} e_i = x_i \\ \text{or} \\ e_i = \texttt{(}x_i\texttt{)} \end{matrix} \right\} &  \text{for}~ i=1 ~\text{to}~ k.  \end{array}

In each case the rank j ranges from 0 to k and counts the number of positive appearances of the coordinate propositions x_1, \ldots, x_k in the resulting expression. For example, for k = 3 the linear proposition of rank 0 is 0, the positive proposition of rank 0 is 1, and the singular proposition of rank 0 is \texttt{(} x_1 \texttt{)(} x_2 \texttt{)(} x_3 \texttt{)}.
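Here is a quick enumeration of my own, following the three definitions above, confirming the counts and rank distribution for k = 3.

# Enumerate the linear, positive, and singular propositions on B^3, following
# the definitions above, and check that each family has 2^k members whose
# ranks are distributed as the binomial coefficients 1, 3, 3, 1 for k = 3.
from itertools import combinations, product
from math import comb

k = 3
points = list(product((0, 1), repeat=k))
subsets = [set(S) for r in range(k + 1) for S in combinations(range(k), r)]

def table(h):
    """Value tuple of a proposition h : B^k -> B over all points of B^k."""
    return tuple(h(x) for x in points)

# Each proposition is parameterized by the set S of coordinates appearing
# positively; the rank is |S|.
linear   = {frozenset(S): table(lambda x, S=S: sum(x[i] for i in S) % 2)
            for S in subsets}
positive = {frozenset(S): table(lambda x, S=S: int(all(x[i] for i in S)))
            for S in subsets}
singular = {frozenset(S): table(lambda x, S=S: int(all(
                x[i] == (1 if i in S else 0) for i in range(k))))
            for S in subsets}

for name, fam in [('linear', linear), ('positive', positive),
                  ('singular', singular)]:
    ranks = [sum(1 for S in fam if len(S) == r) for r in range(k + 1)]
    print(name, len(set(fam.values())), ranks)
# each family: 8 distinct propositions, ranks distributed as [1, 3, 3, 1]
print([comb(k, r) for r in range(k + 1)])     # [1, 3, 3, 1]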

The basic propositions x_i : \mathbb{B}^k \to \mathbb{B} are both linear and positive. So these two kinds of propositions, the linear and the positive, may be viewed as two different ways of generalizing the class of basic propositions.

Finally, it is important to note that all of the above distinctions are relative to the choice of a particular logical basis \mathcal{X} = \lbrace x_1, \ldots, x_k \rbrace. For example, a singular proposition with respect to the basis \mathcal{X} will not remain singular if \mathcal{X} is extended by a number of new and independent features. Even if one keeps to the original set of pairwise options \lbrace x_i \rbrace \cup \lbrace \texttt{(} x_i \texttt{)} \rbrace to pick out a new basis, the sets of linear propositions and positive propositions are both determined by the choice of basic propositions, and this whole determination is tantamount to the purely conventional choice of a cell as origin.

Posted in Boolean Functions, Computational Complexity, Differential Logic, Equational Inference, Functional Logic, Indication, Logic, Logical Graphs, Mathematics, Minimal Negation Operators, Propositional Calculus, Visualization