## Fourier Transforms of Boolean Functions

The problem is concretely about Boolean functions ${f}$ of ${k}$ variables, and seems not to involve prime numbers at all. For any subset ${S}$ of the coordinates, the corresponding Fourier coefficient is given by:

$\displaystyle \hat{f}(S) = \frac{1}{2^k} \sum_{x \in \mathbb{Z}_2^k} f(x)\chi_S(x)$

where ${\chi_S(x)}$ is ${-1}$ if ${\sum_{i \in S} x_i}$ is odd, and ${+1}$ otherwise.

Need to play around with this concept a while …
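
As a first bit of play, the definition above can be checked by brute force, summing over all $2^k$ points of $\mathbb{Z}_2^k.$ A minimal Python sketch (the function name and the choice of a $\{0, 1\}$-valued parity example are mine, purely for illustration):

```python
from itertools import product

def fourier_coefficient(f, k, S):
    """Compute f-hat(S) = (1/2^k) * sum over x in Z_2^k of f(x) * chi_S(x),
    where chi_S(x) = -1 if sum of x_i for i in S is odd, and +1 otherwise."""
    total = 0
    for x in product((0, 1), repeat=k):
        chi = -1 if sum(x[i] for i in S) % 2 == 1 else 1
        total += f(x) * chi
    return total / 2**k

# Example: the parity function on 2 variables, with values in {0, 1}.
parity = lambda x: (x[0] + x[1]) % 2
print(fourier_coefficient(parity, 2, set()))    # 0.5
print(fourier_coefficient(parity, 2, {0, 1}))   # -0.5
```

With $f$ valued in $\{0, 1\}$ rather than $\{-1, +1\},$ the coefficient at $S$ measures the correlation of $f$ with the parity on $S,$ up to normalization.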

## Gaussian Coefficients

The Gaussian coefficient, also known as the q-binomial coefficient, is notated as $\text{Gauss}(n, k)_q$ and given by the following formula:

$\text{Gauss}(n, k)_q = \frac{(q^n - 1)(q^{n-1} - 1) \ldots (q^{n-k+1} - 1)}{(q^k - 1)(q^{k-1} - 1) \ldots (q - 1)}$
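
The product formula translates directly into code. A minimal sketch (integer division is safe here because the quotient is always a whole number):

```python
def gauss(n, k, q):
    """Gaussian (q-binomial) coefficient by the product formula above."""
    num = 1
    den = 1
    for j in range(k):
        num *= q**(n - j) - 1
        den *= q**(k - j) - 1
    return num // den

# Gauss(4, 2)_2 counts the 2-dimensional subspaces of a 4-dimensional
# vector space over the field with 2 elements.
print(gauss(4, 2, 2))  # 35
```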

The ordinary generating function for selecting at most one positive integer is:

$\displaystyle \frac{1}{1 - q} = 1 + q + q^2 + q^3 + \ldots$

The ordinary generating function for selecting exactly one positive integer is:

$\displaystyle \frac{q}{1 - q} = q + q^2 + q^3 + q^4 + \ldots$

The ordinary generating function for selecting exactly one positive integer ≥ n is:

$\displaystyle \frac{q^n}{1 - q} = q^n + q^{n+1} + q^{n+2} + q^{n+3} + \ldots$

The ordinary generating function for selecting exactly one positive integer < n is:

$\displaystyle \frac{q}{1 - q} - \frac{q^n}{1 - q} = q + q^2 + \ldots + q^{n-2} + q^{n-1}$

$\displaystyle \frac{q - q^n}{1 - q} = q + q^2 + \ldots + q^{n-2} + q^{n-1}$

$\displaystyle \frac{q^n - q}{q - 1} = q + q^2 + \ldots + q^{n-2} + q^{n-1}$

The ordinary generating function for selecting at most one positive integer < n is:

$\displaystyle \frac{q^n - 1}{q - 1} = 1 + q + q^2 + \ldots + q^{n-2} + q^{n-1}$

The ordinary generating function for selecting at most one positive multiple of k is:

$\displaystyle \frac{1}{1 - q^k} = 1 + q^k + q^{2k} + q^{3k} + \ldots$
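
The finite closed forms above can be sanity-checked numerically by evaluating both sides at a few integer values of $q.$ A small Python sketch, assuming nothing beyond the identities as stated:

```python
# Numeric check of the finite geometric series identities,
# evaluating both sides at several integer values of q.
def geom(n, q):
    """1 + q + q^2 + ... + q^(n-1)."""
    return sum(q**i for i in range(n))

for q in (2, 3, 10):
    for n in (1, 2, 5, 8):
        # (q^n - 1)/(q - 1) = 1 + q + ... + q^(n-1)
        assert (q**n - 1) // (q - 1) == geom(n, q)
        # (q^n - q)/(q - 1) = q + q^2 + ... + q^(n-1)
        assert (q**n - q) // (q - 1) == geom(n, q) - 1
print("identities verified")
```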

1. Jon Awbrey says:

We have come to a critical point in the arc of democratic societies, where the idea that money can regulate itself, evangelized by the Church of the Invisible Hand, has tipped its hand to the delusion that money itself can regulate society.

Laßt uns rechnen (let us calculate) …

OneTwoThree

2. Jon Awbrey says:

The well-known capacity that thoughts have — as doctors have discovered — for dissolving and dispersing those hard lumps of deep, ingrowing, morbidly entangled conflict that arise out of gloomy regions of the self probably rests on nothing other than their social and worldly nature, which links the individual being with other people and things; but unfortunately what gives them their power of healing seems to be the same as what diminishes the quality of personal experience in them.

3. Jon Awbrey says:

$e^{i\pi}$

Benjamin Peirce apparently liked this mathematical synonym for the additive inverse of 1 so much that he introduced three special symbols for $e, i, \pi$ — ones that enable $e^{i\pi}$ to be written in a single cursive ligature, as shown here.

4. Jon Awbrey says:

5. Jon Awbrey says:

6. Jon Awbrey says:

$\text{Gauss}(n, k)_q = \frac{(q^n - 1)(q^{n-1} - 1) \ldots (q^{n-k+1} - 1)}{(q^k - 1)(q^{k-1} - 1) \ldots (q - 1)}$

7. Jon Awbrey says:

This reminds me of some things I used to think about — I always loved playing with generating functions and I can remember a time in the 80s when I was very pleased with myself for working out the q-analogue of integration by parts — but it will probably take me a while to warm up those old gray cells today.

Let me first see if I can get LaTeX to work in these comment boxes …

The Gaussian coefficient, also known as the q-binomial coefficient, is notated as $\text{Gauss}(n, k)_q$ and given by the following formula:

$\text{Gauss}(n, k)_q = \frac{(q^n - 1)(q^{n-1} - 1) \ldots (q^{n-k+1} - 1)}{(q^k - 1)(q^{k-1} - 1) \ldots (q - 1)}$

8. Jon Awbrey says:

Groups like $Z_2^m$ acting on the space of $2^{2^m}$ boolean functions on $m$ variables come up in my explorations of differential logic.

Here’s one indication of a context where these groups come up —

Here’s a discussion of the case where $m=2$

9. Jon Awbrey says:

Notes

10. Jon Awbrey says:

All Learning Is But Recollection
All Leaning Is But Reïnclination

And they will lean that way forever

I lean that way myself, inclined to believe
All leaning inclines to preserve the swerve.

If $\exists \frac{\sum}{\varnothing \nu}$
Then the Shadow falls in a moat
Between the castle of invention
And the undiscovered country.

If $\exists \frac{\bigodot}{\varnothing \nu}$
Then the Shadow falls in a moat
Between the castle of invention
And the undiscovered country.

11. Jon Awbrey says:

Anamnesis
Learning = Recollection
Maieusis
Teaching = Midwifery
Communication = Pre-Established Harmony
Symbology
Meaning = Interpretation

12. Jon Awbrey says:

These are the forms of time,
which imitates eternity and
revolves according to a law
of number.

Plato, “Timaeus”, 38 A
Benjamin Jowett (trans.)

13. Jon Awbrey says:

Try 9001 and 9002 = 〈 and 〉

Or 12296 and 12297 = 〈 and 〉

14. Jon Awbrey says:

&lang; &rang; = &‍#10216; &‍#10217; = ⟨ ⟩

Other possibilities short of using LaTeX —

&‍#9001; &‍#9002; = 〈 〉

&‍#12296; &‍#12297; = 〈 〉

15. Jon Awbrey says:

If we are thinking about records of a fixed finite length $k$ and a fixed signature $X_1, \ldots, X_k$ then a relational data base is a finite subset $D$ of a $k$-dimensional coordinate space $X = X_1 \times \ldots \times X_k = \prod_{j=1}^k X_j.$

Given a non-empty subset $J$ of the indices $K = [1, k],$ we can take the projection $\text{proj}_J$ of $D$ on the subspace $X_J = \prod_{j \in J} X_j.$

Saying that “a query is likely to use only a few columns” amounts to saying that most of the time we can get by with the help of our small dimension projections. This is akin to a very old idea, having its ancestor in Descartes’ suggestion that “we should never attend to more than one or two” dimensions at a time.

16. Jon Awbrey says:

Just a thought, more loose than lucid most likely —

There is another kind of “discrete logarithm” that I used to call the “vector logarithm” of a positive integer $n.$ Consider the prime factorization of $n$ and write the exponents of the primes in order as a coordinate tuple $(e_1, \ldots, e_k, 0, 0, 0, \ldots),$ where $e_j = 0$ for any prime $p_j$ not dividing $n$ and where the exponents are all $0$ after some finite point. Then multiplying two positive integers maps to adding the corresponding vectors.
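
Here is a rough Python sketch of that vector logarithm (the name `vector_log` and the cutoff of ten primes are mine, purely for illustration):

```python
def vector_log(n, k=10):
    """Exponent vector (e_1, ..., e_k) of n over the first k primes."""
    primes = []
    c = 2
    while len(primes) < k:          # generate the first k primes
        if all(c % p for p in primes):
            primes.append(c)
        c += 1
    exps = []
    for p in primes:                # read off the exponent of each prime
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        exps.append(e)
    return exps

# Multiplication of integers maps to addition of vectors:
a, b = 12, 45          # 12 = 2^2 * 3, 45 = 3^2 * 5
va, vb = vector_log(a), vector_log(b)
print([x + y for x, y in zip(va, vb)] == vector_log(a * b))  # True
```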

17. Jon Awbrey says:
Charters? ...... ☐ Y ☐ N
Common Core? ... ☐ Y ☐ N
Disruption? .... ☐ Y ☐ N
Technology? .... ☐ Y ☐ N

18. Jon Awbrey says:

## Emeticon

http://s583.photobucket.com/user/metasonix/media/vomit.gif.html

[URL=http://s583.photobucket.com/user/metasonix/media/vomit.gif.html][IMG]http://i583.photobucket.com/albums/ss273/metasonix/vomit.gif[/IMG][/URL]

19. Jon Awbrey says:

The following tag would normally force invisible borders on a table:

<table border="0" style="border-width:0">

But WordPress still leaves a light border line on top of each cell, so you have to add the following parameter to each table datum:

<td style="border-top:1px solid white">

20. Jon Awbrey says:

Sign relations are just a special case of triadic relations, in much the same way that groups and group actions are special types of triadic relations. It’s a bit of a complication that we participate in sign relations whenever we try to talk about anything else, but it still makes sense to try and tease the separate issues apart as much as we can.

As far as relations in general go, relative terms are often expressed by slotted frames like “brother of __”, “divisor of __”, or “sum of __ and __”. Peirce referred to these kinds of incomplete expressions as rhemes or rhemata and Frege used the adjective ungesättigt or unsaturated to convey more or less the same idea.

Switching the focus to sign relations, it’s a fair question to ask what sorts of objects might be denoted by pieces of code like “brother of __”, “divisor of __”, or “sum of __ and __”. And while we’re at it, what is this thing called denotation, anyway?

21. Jon Awbrey says:

It may help to clarify the relationship between logical relatives and mathematical relations. The word relative as used in logic is short for relative term — as such it refers to a piece of language that is used to denote a formal object. So what kind of object is that? The way things work in mathematics, we are free to make up a formal object that corresponds directly to the term, so long as we can form a consistent theory of it, but in our case it is probably easier to relate the relative term to the kinds of relations we already know and love in mathematics and database theory.

In those contexts a relation is just a set of ordered tuples and, if you are a fan of strong typing like I am, such a set is always set in a specific setting, namely, it’s a subset of a specific Cartesian product.

Peirce wrote ${k}$-tuples ${(x_1, x_2, \ldots, x_{k-1}, x_k)}$ in the form ${x_1 \colon\! x_2 \colon\! \ldots \colon\! x_{k-1} \colon\! x_k}$ and referred to them as elementary ${k}$-adic relatives. He expressed a set of ${k}$-tuples as a “logical sum” or “logical aggregate”, what we would call a logical disjunction of these elementary relatives, and he frequently regarded them as being arranged in the form of ${k}$-dimensional arrays.

Time for some concrete examples, which I will give in the next comment …

22. Jon Awbrey says:

Table 1 shows the first few ordered pairs in the relation on positive integers that corresponds to the relative term, “divisor of”. Thus, the ordered pair ${i \text{:} j}$ appears in the relation if and only if ${i|j}.$

$\begin{array}{|c||*{11}{c}|} \multicolumn{12}{c}{\text{Table 1. Elementary Relatives for the ``Divisor Of'' Relation}} \\[4pt] \hline i|j &1&2&3&4&5&6&7&8&9&10&\ldots \\ \hline\hline 1&1\!:\!1&1\!:\!2&1\!:\!3&1\!:\!4&1\!:\!5&1\!:\!6&1\!:\!7&1\!:\!8&1\!:\!9&1\!:\!10&\dots \\ 2&&2\!:\!2&&2\!:\!4&&2\!:\!6&&2\!:\!8&&2\!:\!10&\dots \\ 3&&&3\!:\!3&&&3\!:\!6&&&3\!:\!9&&\dots \\ 4&&&&4\!:\!4&&&&4\!:\!8&&&\dots \\ 5&&&&&5\!:\!5&&&&&5\!:\!10&\dots \\ 6&&&&&&6\!:\!6&&&&&\dots \\ 7&&&&&&&7\!:\!7&&&&\dots \\ 8&&&&&&&&8\!:\!8&&&\dots \\ 9&&&&&&&&&9\!:\!9&&\dots \\ 10&&&&&&&&&&10\!:\!10&\dots \\ \ldots&\ldots&\ldots&\ldots&\ldots&\ldots& \ldots&\ldots&\ldots&\ldots&\ldots&\ldots \\ \hline \end{array}$

Table 2 shows the same information in the form of a logical matrix. This has a coefficient of ${1}$ in row ${i}$ and column ${j}$ when ${i|j}$, otherwise it has a coefficient of ${0}.$ (The zero entries have been omitted here for ease of reading.)

$\begin{array}{|c||*{11}{c}|} \multicolumn{12}{c}{\text{Table 2. Logical Matrix for the ``Divisor Of'' Relation}} \\[4pt] \hline i|j &1&2&3&4&5&6&7&8&9&10&\ldots \\ \hline\hline 1&1&1&1&1&1&1&1&1&1&1&\dots \\ 2& &1& &1& &1& &1& &1&\dots \\ 3& & &1& & &1& & &1& &\dots \\ 4& & & &1& & & &1& & &\dots \\ 5& & & & &1& & & & &1&\dots \\ 6& & & & & &1& & & & &\dots \\ 7& & & & & & &1& & & &\dots \\ 8& & & & & & & &1& & &\dots \\ 9& & & & & & & & &1& &\dots \\ 10&& & & & & & & & &1&\dots \\ \ldots&\ldots&\ldots&\ldots&\ldots&\ldots& \ldots&\ldots&\ldots&\ldots&\ldots&\ldots \\ \hline \end{array}$

In much the same way that matrices in linear algebra represent linear transformations, these logical arrays and matrices represent logical transformations.
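
For concreteness, the logical matrix of Table 2 is easy to generate. A small Python sketch, truncating the relation to the first ten positive integers:

```python
N = 10  # upper bound for a finite sample of the relation

# Logical matrix for "divisor of": entry (i, j) is 1 when i divides j.
D = [[1 if j % i == 0 else 0 for j in range(1, N + 1)]
     for i in range(1, N + 1)]

# The row for i = 2 matches Table 2: ones appear in the even columns.
print(D[1])  # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
```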

To be continued …

23. Jon Awbrey says:

Ð ð

${\daleth ~ \eth}$

${\mathfrak{D ~ d}}$

${\mathfrak{F ~ f} ~ \digamma}$

24. Jon Awbrey says:

Arrays like the ones sampled above supply a way to understand the difference between a relation and its associated relative terms. To make a long story short, we could say that a relative term is a relation plus an indication of its intended application. But explaining what that means will naturally take a longer story.

In his first paper on the “Logic of Relatives” (1870) Peirce treated sets — in other words, the extensions of concepts — as logical aggregates or logical sums. He wrote a plus sign with a comma ${(+\!\!,)}$ to indicate an operation of logical addition, abstractly equivalent to inclusive “or”, that was used to form these sums.

For example, letting ${t}$ be the concept of the first ten positive integers, it can be expressed as the following logical sum.

${t ~=~ 1 ~+\!\!,~ 2 ~+\!\!,~ 3 ~+\!\!,~ 4 ~+\!\!,~ 5 ~+\!\!,~ 6 ~+\!\!,~ 7 ~+\!\!,~ 8 ~+\!\!,~ 9 ~+\!\!,~ 10}$

Relations, as sets of tuples, can also be expressed as logical sums.

For example, letting ${\mathfrak{D}}$ be the divisibility relation on positive integers, it is possible to think of ${\mathfrak{D}}$ as a logical sum that begins in the following way.

${\mathfrak{D} ~=~ 1\text{:}1 ~+\!\!,~ 1\text{:}2 ~+\!\!,~ 2\text{:}2 ~+\!\!,~ 1\text{:}3 ~+\!\!,~ 3\text{:}3 ~+\!\!,~ 1\text{:}4 ~+\!\!,~ 2\text{:}4 ~+\!\!,~ 4\text{:}4 ~+\!\!,~ 1\text{:}5 ~+\!\!,~ 5\text{:}5 ~+\!\!,~ \ldots}$

It should be apparent that this is only a form of expression, not a definition, since it takes a prior concept of divisibility to say what ordered pairs appear in the sum, but it’s a reformulation that has its uses, nonetheless.

To be continued …

25. Jon Awbrey says:

Define operations on the elementary relatives that obey the following rules:

${(i : j) ~ j = i},$

${i ~ (i : j) = j},$

$(i : j) (k : \ell) = \left\{\begin{array}{cl} (i : \ell) & \text{if}~ j = k, \\ 0 & \text{if}~ j \neq k. \end{array}\right.$

Extending these rules in the usual distributive fashion to sums of elementary monadic and dyadic relatives allows us to define relative multiplications of the following forms:

$\mathit{a}\mathit{b},$ where $\mathit{a}$ and $\mathit{b}$ are 2-adic relatives,

$\mathit{a}\mathrm{b},$ where $\mathit{a}$ is 2-adic and $\mathrm{b}$ is 1-adic,

$\mathrm{a}\mathit{b},$ where $\mathrm{a}$ is 1-adic and $\mathit{b}$ is 2-adic.

For example, expressed in terms of coefficients, the relative product $\mathit{a}\mathit{b}$ of 2-adic relatives $\mathit{a}$ and $\mathit{b}$ is given by the following formula:

$\displaystyle (\mathit{a}\mathit{b})_{ij} = \sum_{m}^{,} \mathit{a}_{im}\mathit{b}_{mj}.$

This will of course remind everyone of the formula for multiplying matrices in linear algebra, but I have affixed a comma atop the summation symbol to remind us that the logical sum is the inclusive disjunction $(\lor)$ — that Peirce wrote as $(+\!\!,)$ — and not the exclusive disjunction that corresponds to the linear algebraic sum $(+).$
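
In code, this relative product is just matrix multiplication over the boolean semiring, with conjunction for products and inclusive disjunction for sums. A sketch, rebuilding the divisibility matrix from the earlier comment so the snippet stands alone:

```python
def relative_product(a, b):
    """Boolean matrix product: (ab)_ij = OR over m of (a_im AND b_mj)."""
    n = len(a)
    return [[int(any(a[i][m] and b[m][j] for m in range(n)))
             for j in range(n)]
            for i in range(n)]

N = 10
D = [[int(j % i == 0) for j in range(1, N + 1)] for i in range(1, N + 1)]

# Divisibility is reflexive and transitive, so composing the relation
# with itself gives it back (on this finite sample).
print(relative_product(D, D) == D)  # True
```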

To be continued …

26. Jon Awbrey says:

To the extent that mathematics has to do with reasoning about possible existence, or inference from pure hypothesis, a line of thinking going back to Aristotle and developed greatly by C.S. Peirce may have some bearing on the question of How and Why Mathematics is Possible. In that line of thought, hypothesis formation is treated as a case of “abductive” inference, whose job in science generally is to supply suitable raw materials for deduction and induction to develop and test. In that light, a large part of our original question becomes, as Peirce once expressed it —

Is it reasonable to believe that “we can trust to the human mind’s having such a power of guessing right that before very many hypotheses shall have been tried, intelligent guessing may be expected to lead us to the one which will support all tests, leaving the vast majority of possible hypotheses unexamined”? (Peirce, Collected Papers, CP 6.530).

The question may fit the situation in mathematics slightly better if we modify the word hypothesis to say proof.

27. Jon Awbrey says:

The Jug of Punch

Bein’ on the twenty-third of June,
As I sat weaving all at my loom,
Bein’ on the twenty-third of June,
As I sat weaving all at my loom,
I heard a thrush, singing on yon bush,
And the song she sang was The Jug of Punch.

What more pleasure can a boy desire,
Than sitting down beside the fire?
What more pleasure can a boy desire,
Than sitting down beside the fire?
And in his hand a jug of punch,
And on his knee a tidy wench.

When I am dead and left in my mould,
At my head and feet place a flowing bowl,
When I am dead and left in my mould,
At my head and feet place a flowing bowl,
And every young man that passes by,
He can have a drink and remember I.

28. Jon Awbrey says:

According to my understanding of it, the so-called Liar Paradox is just the most simple-minded of fallacies, involving nothing more mysterious than the acceptance of a false assumption, from which anyone can prove anything at all.

Let us contemplate one of the shapes in which the putative Liar Paradox is commonly cast:

Somebody writes down:

1. Statement 1 is false.

Then you are led to reason: If Statement 1 is false then by the principle that permits the substitution of equals in a true statement to obtain yet another true statement, you can derive the result:

“Statement 1 is false” is false. Ergo, Statement 1 is true, and so on, and so on, ad nauseam infinitum.

Where did you go wrong? Where were you misled?

Just here, to wit, where it is writ:

1. Statement 1 is false.

What is this really saying? Well, it’s the same as writing:

Statement 1. Statement 1 is false.

And what the heck does this dot.comment say? It is inducing you to accept this identity:

“Statement 1” = “Statement 1 is false”.

That appears to be a purely syntactic indexing, the sort of thing you are led to believe that you can do arbitrarily, with logical impunity. But you cannot, for syntactic identity implies logical equivalence, and that is liable to find itself constrained by iron bands of logical law.

And you cannot, not with logical impunity, assume the result of this transmutation, which would be as much as to say this:

“Statement 1” = “Negation of Statement 1”

And this my friends, call it “Statement 0”, is purely and simply a false statement, with no hint of paradox about it.

Statement 0 was slipped into your drink before you were even starting to think. A bit before you were led to substitute you should have examined more carefully the site proposed for the substitution!

For the principle that you rushed to use does not permit you to substitute unequals into a statement that is false to begin with, not just in the first place, but even before, in the zeroth place of argument, as it were, and still expect to come up with a truth.

Now let that be the end of that.

• cg says:

Of course, the solution to the conundrum is that statement 1 is sometimes false. If “statement 1 is false” is equivalent to “statement 1 is always false”, then it is an example of where it is false.

29. Jon Awbrey says:

For the moment, I am viewing these questions merely as matters of classical propositional logic, even just Boolean formulas.

If we have a Boolean formula like ${\lnot (x \land y) = \lnot x \lor \lnot y},$ then we do not know whether ${x}$ and ${y}$ are true or false, but we do know that the formula as a whole is true, because we adopted axioms beforehand to make it so.

In that perspective, a form like “1. Statement 1 is true” is just a way of expressing the formula “Statement 1 = (Statement 1 = true)”, which has the form ${x = (x = 1)},$ which is true on the adopted axioms.
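
Reading $=$ as logical equivalence, the claim that ${x = (x = 1)}$ holds on the adopted axioms can be checked by a two-line truth table:

```python
# Truth-table check that x <-> (x <-> 1) is a tautology over B = {0, 1},
# reading "=" as logical equivalence.
iff = lambda p, q: int(p == q)
print(all(iff(x, iff(x, 1)) == 1 for x in (0, 1)))  # True
```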

30. Jon Awbrey says:

FYSMI (funny you should mention it), but I was thinking about Dirac holes just the other day, in connection with an inquiry into Fourier Transforms of Boolean Functions.

I have there a notion of singular propositions, which are Boolean functions that pick out single cells in their given universe of discourse, and I needed a handy name for the complements of these. I suppose I could have been gutsy and called them “Awbrey holes”, but it turns out that a long ago near-namesake of mine is already credited with the discovery of something else entirely under very nearly that name. So I finally settled on crenular propositions.

31. Jon Awbrey says:

Someone might enjoy looking at the complexity of Boolean functions as expressed in terms of minimal negation operators. Such expressions have graph-theoretic representations in a species of cactus graphs called painted and rooted cacti, as illustrated here:

I know I once made a table of more or less canonical cactus expressions for the 256 Boolean functions on 3 variables, but I will have to look for that later.

32. Jon Awbrey says:

Peirce’s “Pickwickian” paragraph comes to mind, said Jon Perennially —

Two things here are all-important to assure oneself of and to remember. The first is that a person is not absolutely an individual. His thoughts are what he is “saying to himself”, that is, is saying to that other self that is just coming into life in the flow of time. When one reasons, it is that critical self that one is trying to persuade; and all thought whatsoever is a sign, and is mostly of the nature of language. The second thing to remember is that the man’s circle of society (however widely or narrowly this phrase may be understood), is a sort of loosely compacted person, in some respects of higher rank than the person of an individual organism. It is these two things alone that render it possible for you — but only in the abstract, and in a Pickwickian sense — to distinguish between absolute truth and what you do not doubt.

— C.S. Peirce, Collected Papers, CP 5.421.

Charles Sanders Peirce, “What Pragmatism Is”, The Monist, Volume 15, 1905, 161–181. Reprinted in the Collected Papers, CP 5.411–437.

33. Jon Awbrey says:

The One Thing Needful

‘Now, what I want is, Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life. Plant nothing else, and root out everything else. You can only form the minds of reasoning animals upon Facts: nothing else will ever be of any service to them. This is the principle on which I bring up my own children, and this is the principle on which I bring up these children. Stick to Facts, sir!’

One of the first things we learn in systems theory and engineering is that meaningless measures are always the easiest to make and to game. The one thing needful for a meaningful measure is to ask — and to keep on asking — the eminently practical question, “What is the purpose of this system?”

What is the purpose of an educational system? What is the purpose of an economic system? What is the purpose of a governmental system? Take your eyes off those prizes and you lose sight of all.

34. Jon Awbrey says:

Benjamin Peirce liked Euler’s Formula $e^{i\pi} + 1= 0$ so much that he introduced three special symbols for $e, i, \pi$ — ones that enable $e^{i\pi}$ to be written in a single cursive ligature, as shown in this note.
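
As a floating-point footnote, Euler's Formula can be checked numerically in any language with complex arithmetic:

```python
import cmath

# e^(i*pi) + 1 vanishes up to floating-point rounding error.
z = cmath.exp(1j * cmath.pi) + 1
print(abs(z) < 1e-15)  # True
```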

35. Jon Awbrey says:
36. Jon Awbrey says:

hypotheses non fingerprinto —

But I’m captivated by the fingerprints of finite partial functions ${f : \mathbb{N}^+ \to \mathbb{N}^+}.$

37. Jon Awbrey says:

Men loven of propre kinde newfangelnesse,
As briddes doon that men in cages fede.

— Geoffrey Chaucer • “The Squire’s Tale”

38. Jon Awbrey says:

It sometimes helps to think of a set of boolean functions as a higher order boolean function $h : (\mathbb{B}^n \to \mathbb{B}) \to \mathbb{B}$ and to view such functions as a type of generalized quantifier.
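
For instance, the universal and existential quantifiers arise as the higher order functions that accept, respectively, the tautologies and the satisfiable functions. A quick Python sketch (function names mine):

```python
from itertools import product

# A set of boolean functions viewed as h : (B^n -> B) -> B.
def forall(f, n):
    """1 iff f is true at every point of B^n (accepts the tautologies)."""
    return int(all(f(x) for x in product((0, 1), repeat=n)))

def exists(f, n):
    """1 iff f is true at some point of B^n (accepts the satisfiable)."""
    return int(any(f(x) for x in product((0, 1), repeat=n)))

top = lambda x: 1
xor = lambda x: (x[0] + x[1]) % 2
print(forall(top, 2), forall(xor, 2), exists(xor, 2))  # 1 0 1
```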

39. Jon Awbrey says:

&permil; → ‰

40. Jon Awbrey says:

Synchronicity Rules❢

I just started reworking an old exposition of mine on Cook’s Theorem, where I borrowed the Parity Function example from Wilf (1986), Algorithms and Complexity, and translated it into the cactus graph syntax for propositional calculus that I developed as an extension of Peirce’s logical graphs.

By way of providing a simple illustration of Cook’s Theorem, namely, that Propositional Satisfiability is NP-Complete, I will describe one way to translate finite approximations of turing machines into propositional expressions, using the cactus language syntax for propositional calculus that I will describe in more detail as we proceed.

41. Jon Awbrey says:

In the Realm of Riffs and Rotes

$23 = \text{p}_9^1 = \text{p}_{\text{p}_\text{p}^\text{p}}$

$512 = \text{p}_1^9 = \text{p}^{\text{p}_\text{p}^\text{p}}$

There are pictures of 23 and 512 here.

42. Jon Awbrey says:

### Source Copy

A proof predicate has the form ${P(f,c)}$ and says that ${c}$ is a valid proof (in a given formal system) of the formula ${f}$. This is the logical analogue of checking the validity of a computation ${c}$ by a particular machine. A provability predicate then has the form ${Pv(f) = (\exists c)P(f,c)}$.

The weird fact, which applies to the same natural strong formal systems ${F}$ that Kurt Gödel’s famous incompleteness theorems hold for, is that there are statements ${f}$ such that ${F}$ proves ${Pv(f)}$, but does not prove ${f}$ itself.

### Transcription

A proof predicate has the form ${P(f, c)}$ and says that ${c}$ is a valid proof (in a given formal system) of the formula ${f}.$ This is the logical analogue of checking the validity of a computation ${c}$ by a particular machine. A provability predicate then has the form ${Pv(f) = (\exists c)P(f, c)}.$

The weird fact, which applies to the same natural strong formal systems ${F}$ that Kurt Gödel’s famous incompleteness theorems hold for, is that there are statements ${f}$ such that ${F}$ proves ${Pv(f)},$ but does not prove ${f}$ itself.

43. Jon Awbrey says:

Draw the Riff and Rote for $153 = 3^2 \cdot 17 = \text{p}_2^2 \text{p}_7^1 = \text{p}_\text{p}^\text{p} \text{p}_{\text{p}_{\text{p}^\text{p}}}$

• Jon Awbrey says:

Seven P's upon my forehead he inscribed
with the point of his sword, and said: “See that thou wash
these wounds, when thou art within.”

— Dante • Purgatorio 09.112–114

44. Jon Awbrey says:

He had drifted into the very heart of the world. From him to the distant beloved was as far as to the next tree.

45. Jon Awbrey says:

Re: “Are there more good cases of isomorphism to study?”

Just off the top of my head, as Data says, there are a couple of examples that come to mind.

Sign Relations. In computational settings, a sign relation $L$ is a triadic relation of the form $L \subseteq O \times S \times I,$ where $O$ is a set of formal objects under consideration and $S$ and $I$ are two formal languages used to denote those objects. It is common practice to cut one’s teeth on the special case $S = I$ before moving on to more solid diets.

Cactus Graphs. In particular, a variant of cactus graphs known (by me, anyway) as painted and rooted cacti (PARCs) affords a very efficient graphical syntax for propositional calculus.

46. Jon Awbrey says:

Minimal Negation Operators and Painted Cacti

Let $\mathbb{B} = \{ 0, 1 \}.$

The mathematical objects of penultimate interest are the boolean functions $f : \mathbb{B}^n \to \mathbb{B}$ for $n \in \mathbb{N}.$

A minimal negation operator $\nu_k$ for $k \in \mathbb{N}$ is a boolean function $\nu_k : \mathbb{B}^k \to \mathbb{B}$ defined as follows:

$\nu_0 = 0.$

$\nu_k (x_1, \ldots, x_k) = 1$ if and only if exactly one of the arguments $x_j$ equals $0.$

The first few of these operators are already enough to generate all boolean functions $f : \mathbb{B}^n \to \mathbb{B}$ via functional composition but the rest of the family is worth keeping around for many practical purposes.

In most contexts $\nu (x_1, \ldots, x_k)$ may be written for $\nu_k (x_1, \ldots, x_k)$ since the number of arguments determines the rank of the operator. In some contexts even the letter $\nu$ may be omitted, writing just the argument list $(x_1, \ldots, x_k),$ in which case it helps to use a distinctive typeface for the list delimiters, as $\texttt{(} x_1 \texttt{,} \ldots \texttt{,} x_k \texttt{)}.$

A logical conjunction of $k$ arguments can be expressed in terms of minimal negation operators as $\nu_{k+1} (x_1, x_2, \ldots, x_{k-1}, x_k, 0)$ and this is conveniently abbreviated as a concatenation of arguments $x_1 x_2 \ldots x_{k-1} x_k.$

To be continued …

The species of cactus graphs we want here are all constructed from a single family of logical operators $\{ \nu_k : k \in \mathbb{N} \}$ called minimal negation operators. The operator $\nu_k$ is a boolean function $\nu_k : \mathbb{B}^k \to \mathbb{B}$ such that $\nu_k (x_1, \ldots, x_k) = 1$ just in case exactly one of the arguments $x_j$ equals $0.$
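
A brute-force check of the definitions in this comment, including the conjunction identity $\nu_{k+1} (x_1, \ldots, x_k, 0) = x_1 \land \ldots \land x_k$ (a Python sketch, with `nu` as my name for the operator):

```python
from itertools import product

def nu(*args):
    """Minimal negation operator: 1 iff exactly one argument equals 0."""
    return int(sum(1 for a in args if a == 0) == 1)

# nu_1(x) is plain negation; nu_2(x, y) is exclusive disjunction.
assert all(nu(x) == 1 - x for x in (0, 1))

# Conjunction identity: nu_{k+1}(x_1, ..., x_k, 0) = x_1 AND ... AND x_k.
k = 3
ok = all(nu(*x, 0) == int(all(x)) for x in product((0, 1), repeat=k))
print(ok)  # True
```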

47. Jon Awbrey says:

The importance of preparing individuals for their role as citizens in a democratic society is well documented. However, the reverse assertion is less broadly understood. That is, a democratic environment, in which dialogue and critical thinking are prized, is not only facilitative of but vital to the full development of intelligence. Philosopher Hilary Putnam (1992) refers to what he calls the epistemological justification of democracy, which he attributes to John Dewey: “The claim, then, is this: Democracy is not just one form of social life among other workable forms of social life; it is the precondition for the full application of intelligence to the solution of social problems” (Putnam, Renewing Philosophy, p. 180).

Awbrey, S.M., and Scott, D.K. (August 1993), “Educating Critical Thinkers for a Democratic Society”, in Critical Thinking : The Reform of Education and the New Global Economic Realities, Thirteenth Annual International Conference of The Center for Critical Thinking, Rohnert, CA. ERIC Document ED4703251. Online.

48. Jon Awbrey says:

Aristotle understood that the affective is the basis of the cognitive.

Words spoken are symbols or signs (symbola) of affections or impressions (pathemata) of the soul (psyche); written words are the signs of words spoken. As writing, so also is speech not the same for all races of men. But the mental affections themselves, of which these words are primarily signs (semeia), are the same for the whole of mankind, as are also the objects (pragmata) of which those affections are representations or likenesses, images, copies (homoiomata). (Aristotle, De Interp. i. 16a4).

49. Jon Awbrey says:

Let me get some notational matters out of the way before continuing.

I use $\mathbb{B}$ for a generic 2-point set, usually $\{ 0, 1 \}$ and usually but not always interpreted for logic so that $0 = \text{false}$ and $1 = \text{true}.$ I use “teletype” parentheses $\texttt{(} \ldots \texttt{)}$ for negation, so that $\texttt{(} x \texttt{)} = \lnot x$ for $x ~\text{in}~ \mathbb{B}.$ Later on I’ll be using teletype format lists $\texttt{(} x_1 \texttt{,} \ldots \texttt{,} x_k \texttt{)}$ for minimal negation operators.

50. Jon Awbrey says:

As long as we’re reading $x$ as a boolean variable $(x \in \mathbb{B})$ the equation $x = \texttt{(} x \texttt{)}$ is not paradoxical but simply false. As an algebraic structure $\mathbb{B}$ can be extended in many ways but that leaves open the question of whether those extensions have any application to logic.

On the other hand, the assignment statement $x := \texttt{(} x \texttt{)}$ makes perfect sense in computational contexts. The effect of the assignment operation on the value of the variable $x$ is commonly expressed in time series notation as $x' = \texttt{(} x \texttt{)}$ and the same change is expressed even more succinctly by defining $\mathrm{d}x = x' - x$ and writing $\mathrm{d}x = 1.$

Now suppose we are observing the time evolution of a system $X$ with a boolean state variable $x : X \to \mathbb{B}$ and what we observe is the following time series:

$\begin{array}{c|c} t & x \\ \hline 0 & 0 \\ 1 & 1 \\ 2 & 0 \\ 3 & 1 \\ 4 & 0 \\ 5 & 1 \\ 6 & 0 \\ 7 & 1 \\ 8 & 0 \\ 9 & 1 \\ \ldots & \ldots \end{array}$

Computing the first differences we get:

$\begin{array}{c|cc} t & x & \mathrm{d}x \\ \hline 0 & 0 & 1 \\ 1 & 1 & 1 \\ 2 & 0 & 1 \\ 3 & 1 & 1 \\ 4 & 0 & 1 \\ 5 & 1 & 1 \\ 6 & 0 & 1 \\ 7 & 1 & 1 \\ 8 & 0 & 1 \\ 9 & 1 & 1 \\ \ldots & \ldots & \ldots \end{array}$

Computing the second differences we get:

$\begin{array}{c|cccc} t & x & \mathrm{d}x & \mathrm{d}^2 x & \ldots \\ \hline 0 & 0 & 1 & 0 & \ldots \\ 1 & 1 & 1 & 0 & \ldots \\ 2 & 0 & 1 & 0 & \ldots \\ 3 & 1 & 1 & 0 & \ldots \\ 4 & 0 & 1 & 0 & \ldots \\ 5 & 1 & 1 & 0 & \ldots \\ 6 & 0 & 1 & 0 & \ldots \\ 7 & 1 & 1 & 0 & \ldots \\ 8 & 0 & 1 & 0 & \ldots \\ 9 & 1 & 1 & 0 & \ldots \\ \ldots & \ldots & \ldots & \ldots & \ldots \end{array}$

This leads to thinking of the system $X$ as having an extended state $(x, \mathrm{d}x, \mathrm{d}^2 x, \ldots, \mathrm{d}^k x),$ and this additional language gives us the facility of describing state transitions in terms of the various orders of differences. For example, the rule $x' = \texttt{(} x \texttt{)}$ can now be expressed by the rule $\mathrm{d}x = 1.$
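
The two difference tables above can be reproduced in a few lines, taking the difference $x' - x$ mod 2, which over $\mathbb{B}$ is the same as exclusive or:

```python
# Finite differences of a boolean time series, with dx = x' - x taken
# mod 2 (i.e., XOR), reproducing the tables above.
x = [t % 2 for t in range(10)]                # 0, 1, 0, 1, ...
dx = [(b - a) % 2 for a, b in zip(x, x[1:])]
d2x = [(b - a) % 2 for a, b in zip(dx, dx[1:])]
print(dx)   # all 1's
print(d2x)  # all 0's
```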

The following article has a few more examples along these lines:

51. Jon Awbrey says:

.∙◦° Fishfool Thinking Ru(l)es The Day °◦∙.
♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒
♒♒♒ ⫸⫷⫸ . ◦ ° ♒♋♒ ° ◦ . ⫷⫸⫷ ♒♒♒
♒ ♋ ♒ ♋ ♒ ♋ ♒ ♋ ♒ ♋ ♒ ♋ ♒ ♋ ♒ ♋ ♒
♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒♒

52. Jon Awbrey says:

To my way of thinking, boolean universes $(\mathbb{B}^n, \mathbb{B}^n \to \mathbb{B})$ are some of the most fascinating and useful spaces around, so anything that encourages their exploration is a good thing.

I doubt if there is any such thing as a perfect calculus or syntactic system for articulating their structure but some are decidedly better than others and any improvement we find makes a positive difference in the order of practical problems we can solve in the time we have at hand.

In that perspective, it seems to me that too fixed a focus on P versus NP leads to a condition of tunnel vision that obstructs the broader exploration of those spaces and the families of calculi for working with them.

53. Jon Awbrey says:

Questions about a suitable analogue of differential calculus for boolean spaces keep popping (or pipping) up. Having spent a fair amount of time exploring the most likely analogies between real spaces like $\mathbb{R}^n, (\mathbb{R}^n \to \mathbb{R}), (\mathbb{R}^n \to \mathbb{R}^m), ~\ldots$  and the corresponding boolean spaces $\mathbb{B}^n, (\mathbb{B}^n \to \mathbb{B}), (\mathbb{B}^n \to \mathbb{B}^m), ~\ldots,$  where $\mathbb{B} = \{0, 1\},$ I can’t say I’ve gotten all that far, but some of my first few, halting steps are recorded here:

54. Jon Awbrey says:

$\begin{array}{l} \text{This} ~ \mathbb{B}\text{urro} ~ \mathbb{B}\text{alks} ~@~ \mathbb{B}\text{ridge}, \\ \text{Moonwalking} ~ \mathbb{B}\mathrm{ack} : \mathbb{N} \to \mathbb{B}, \\ \text{Where} ~ \mathbb{B}\text{'n'} ~ \mathbb{B}\text{e a} ~ \mathbb{B}\text{it} ~ \mathbb{B}\text{izzare}. \end{array}$

55. Jon Awbrey says:

Character is revealed by action.          ~~ Aristotle
Action is discrete.                       ~~ Planck
----------------------------------------------------------
The better part of valour is discretion.  ~~ Shakespeare

56. Jon Awbrey says:

We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.

57. Jon Awbrey says:

$\begin{array}{*{13}{r}} A\spadesuit & J\clubsuit & Q\heartsuit & K\diamondsuit & 5\spadesuit & 6\spadesuit & 7\spadesuit & 2\diamondsuit & 3\clubsuit & 4\heartsuit & 8\diamondsuit & 9\clubsuit & 10\heartsuit \\ 2\heartsuit & A\heartsuit & 6\clubsuit & 10\spadesuit & J\diamondsuit & Q\diamondsuit & 4\diamondsuit & 9\heartsuit & 8\heartsuit & K\spadesuit & 7\clubsuit & 3\spadesuit & 5\clubsuit \\ 3\diamondsuit & 8\spadesuit & A\diamondsuit & 7\heartsuit & 2\clubsuit & K\clubsuit & Q\clubsuit & J\spadesuit & 10\diamondsuit & 9\diamondsuit & 6\heartsuit & 5\heartsuit & 4\spadesuit \\ 4\clubsuit & 5\diamondsuit & 9\spadesuit & A\clubsuit & K\heartsuit & 3\heartsuit & J\heartsuit & 10\clubsuit & Q\spadesuit & 8\clubsuit & 2\spadesuit & 7\diamondsuit & 6\diamondsuit \end{array}$

58. Jon Awbrey says:

I can’t remember when I first started playing with Gödel codings of graph-theoretic structures, which arose in logical and computational settings, but I remember being egged on in that direction by Martin Gardner’s 1976 column on Catalan numbers, planted plane trees, polygon dissections, etc.

Codings being injections from a combinatorial species $\mathcal {S}$ to integers, either non-negative integers $\mathbb{N}$ or positive integers $\mathcal{N},$ I was especially interested in codings that were also surjective, thereby revealing something about the target domain of arithmetic.

The most interesting bijection I found was between positive integers $\mathcal{N}$ and finite partial functions from $\mathcal{N}$ to $\mathcal{N}.$ All of this comes straight out of prime factorizations. That type of bijection may remind some people of Dana Scott’s $D_\infty.$ Corresponding to the positive integers there arose two species of graphical structures, which I dubbed riffs and rotes.
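One natural reading of that bijection can be sketched as follows (my own construction, offered as an illustration rather than Awbrey's exact riff encoding): map each positive integer $n$ to the finite partial function taking the index $i$ of each prime $p_i$ dividing $n$ to the exponent of $p_i$ in $n.$

```python
def primes():
    """Yield 2, 3, 5, 7, ... by trial division against earlier primes."""
    found = []
    k = 1
    while True:
        k += 1
        if all(k % p for p in found):
            found.append(k)
            yield k

def to_pf(n):
    """n >= 1  ->  {1-based prime index : positive exponent}."""
    f = {}
    for i, p in enumerate(primes(), start=1):
        if n == 1:
            break
        e = 0
        while n % p == 0:
            n, e = n // p, e + 1
        if e:
            f[i] = e
    return f

def from_pf(f):
    """Inverse of to_pf: rebuild n from the finite partial function."""
    n = 1
    for i, p in enumerate(primes(), start=1):
        n *= p ** f.get(i, 0)
        if not f or i >= max(f):
            break
    return n

print(to_pf(360))   # 360 = 2^3 * 3^2 * 5
```

Since every positive exponent pattern occurs for exactly one $n,$ the map is a bijection, with $1$ corresponding to the empty partial function.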

59. Jon Awbrey says:

I always have pictures like this $\downarrow$ in my head.

$\frown\frown\frown\frown\frown\frown\frown\frown\frown\frown\frown$

60. Jon Awbrey says:

I remember having a similar discussion a number of times in the early days of object-oriented programming, or at least when the “fadding crowd” first latched onto it.

We might picture the dyadic relation between Objects and Processes as a rectangular matrix with an entry of $1$ indicating where Process $p, q, r, s, t, u, \ldots$ has meaningful application to Object $a, b, c, d, e, f, \ldots$

$\begin{array}{c|ccccccc} \cdot & p & q & r & s & t & u & \ldots \\ \hline \\ a & & & 1 & 1 & & & \ldots \\ b & & 1 & 1 & 1 & & & \ldots \\ c & 1 & 1 & 1 & 1 & 1 & 1 & \ldots \\ d & & 1 & 1 & 1 & 1 & & \ldots \\ e & & & 1 & 1 & 1 & & \ldots \\ f & & & 1 & 1 & & & \ldots \\ \ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots&\ldots \end{array}$

Then the process orientation amounts to slicing the matrix along columns while the object orientation amounts to slicing it along rows.

But a more general orientation might consider the possibility that the tuples naturally cluster in different ways, partitioning the space into shapes more general than vertical or horizontal stripes.

61. Jon Awbrey says:

On Boole’s Ark all cases come two by two. Here’s a sketch of the Case Analysis-Synthesis Theorem (CAST) that weathers any deluge:

Case Analysis-Synthesis Theorem

62. Jon Awbrey says:

The most striking example of a “Primitive Insight Proof” (PIP❢) known to me is the Dawes–Utting proof of the Double Negation Theorem from the CSP–GSB axioms for propositional logic. There is a graphically illustrated discussion here. I cannot guess what order of insight it took to find this proof — for me it would have involved a whole lot of random search through the space of possible proofs, and that’s even if I got the notion to look for one.

There is of course a much deeper order of insight into the mathematical form of logical reasoning that it took C.S. Peirce to arrive at his maximally elegant 4-axiom set.

63. Jon Awbrey says:

The insight that it takes to find a succinct axiom set for a theoretical domain falls under the heading of abductive or retroductive reasoning, a knack as yet refractory to computational attack, but once we’ve lucked on a select-enough set of axioms we can develop theorems that afford us a more navigable course through the subject.

For example, back on the range of propositional calculus, it takes but a few pivotal theorems and the lever of mathematical induction to derive the Case Analysis-Synthesis Theorem (CAST), which provides a bridge between proof-theoretic methods that demand a modicum of insight and model-theoretic methods that can be run routinely.

64. Jon Awbrey says:

One’s answer, or at least one’s initial response, to that question will turn on how one feels about formal realities. As I understand it, reality is that which persists in thumping us on the head until we get what it’s trying to tell us. Are there formal realities, forms that drive us in that way?

Discussions like these tend to begin by supposing we can form a distinction between external and internal. That is a formal hypothesis, not yet borne out as a formal reality. Are there formal realities that drive us to recognize them, to pick them out of a crowd of formal possibilities?

65. Jon Awbrey says:

Thanks for all that. My snippet was from the B.S. Miller rendering, and I did get an inkling while reading it of an Eleatic influence on the translator. Your recent mention of Arjuna sent me reeling back to some readings and writings I was immersed in 20 years ago. A few days’ digging turned up hard and soft copies of a WinWord mutilation of a MacWord document that unfortunately lost all the graphics and half the formatting, but LibreOffice was able to export a MediaWiki text that I could paste up on one of my wikis. Travel is coming up, so it may be another couple of weeks before I can LaTeX what needs to be LaTeXed, but here is the link for future reference:

66. Jon Awbrey says:

67. Jon Awbrey says:

Skew   =   Agonic   =   $\not\angle$

Chiaroscuro   =   $\chi\!\!\rightarrow\!\!\not\angle\rho$

68. Jon Awbrey says:

As I read him, Peirce began with a quest to understand how science works, which required him to examine how symbolic mediations inform inquiry, which in turn required him to develop the logic of relatives beyond its bare beginnings in De Morgan. There are therefore intimate links, which I am still trying to understand, among his respective theories of inquiry, signs, and relations.

There’s a bit on the relation between interpretation and inquiry here and a bit more on the three types of inference — abduction, deduction, induction — here.

69. Jon Awbrey says:

There is a deep and pervasive analogy between systems of commerce and systems of communication, turning on their near-universal use of symbola (images, media, proxies, signs, symbols, tokens, etc.) to stand for pragmata (objects, objective values, the things we really care about, or would really care about if we examined our values in practice thoroughly enough).

Both types of sign-using systems are prey to the same sort of dysfunction or functional disease — it sets in when their users confuse signs with objects so badly that signs become ends instead of means.

There is a vast literature on this topic, once you think to go looking for it. And it’s a perennial theme in fable and fiction.

☞ Recycling a comment on Cathy O’Neil’s blog from two years ago …

70. Jon Awbrey says:

My brother James is a social anthropologist who wrote a dissertation with constant reference to Weber and we used to have long discussions about the routinization of charisma. I came to it from the direction of Meno’s question whether virtue can be taught. So let us add virtuoso, adjective and substantive, to the krater before us.

71. Jon Awbrey says:

From what I’ve seen, Peirce’s brand of pragmatism, as an application of the closure cum representation principle known as the pragmatic maxim and incorporating realism about generals, is a sturdier stage for mathematical performance than the derivative styles of neo-pragmatism that Gene Halton aptly described as “fragmatism”.

72. Jon Awbrey says:

Immanuel Kant discussed the correspondence theory of truth in the following manner:

Truth is said to consist in the agreement of knowledge with the object.  According to this mere verbal definition, then, my knowledge, in order to be true, must agree with the object.  Now, I can only compare the object with my knowledge by this means, namely, by taking knowledge of it.  My knowledge, then, is to be verified by itself, which is far from being sufficient for truth.  For as the object is external to me, and the knowledge is in me, I can only judge whether my knowledge of the object agrees with my knowledge of the object.  Such a circle in explanation was called by the ancients Diallelos.  And the logicians were accused of this fallacy by the sceptics, who remarked that this account of truth was as if a man before a judicial tribunal should make a statement, and appeal in support of it to a witness whom no one knows, but who defends his own credibility by saying that the man who had called him as a witness is an honourable man.  (Kant, 45)

Kant, Immanuel (1800), Introduction to Logic. Reprinted, Thomas Kingsmill Abbott (trans.), Dennis Sweet (intro.), Barnes and Noble, New York, NY, 2005.

73. Jon Awbrey says:

There are bits of ambiguity in the use of words like empirical and external.

If by empirical we mean based on experience, then it brings to mind the maxim of a famous intuitionist (whose name I’ve misplaced for the moment, maybe Brouwer?) — There are no non-experienced truths.

When it comes to external, I cannot say how to define that mathematically, but if we replace our criterion of objectitude by independent or universal then those are concepts about which mathematics has definite things to say.

74. Jon Awbrey says:

You can still get the old editor by going to:

https://yourblogname.wordpress.com/wp-admin/post-new.php.

75. Jon Awbrey says:

Réseau
Réseaux
Rousseau

Social compacts come and go
And may converge one day
To one that comes
And never goes

So reap what you réseau

Manifold • Atlas of Charts
Intersecting circles of competence
Overlapping neighborhoods of expertise
Communities of inquiry as social networks
Some networks are more compact than others

76. Jon Awbrey says:

### The 12 Latin Squares of Order 3

$\begin{pmatrix} a & b & c \\ b & c & a \\ c & a & b \end{pmatrix} \begin{pmatrix} a & b & c \\ c & a & b \\ b & c & a \end{pmatrix} \begin{pmatrix} b & c & a \\ a & b & c \\ c & a & b \end{pmatrix} \begin{pmatrix} b & c & a \\ c & a & b \\ a & b & c \end{pmatrix} \begin{pmatrix} c & a & b \\ a & b & c \\ b & c & a \end{pmatrix} \begin{pmatrix} c & a & b \\ b & c & a \\ a & b & c \end{pmatrix} \\[10pt] \begin{pmatrix} a & c & b \\ b & a & c \\ c & b & a \end{pmatrix} \begin{pmatrix} a & c & b \\ c & b & a \\ b & a & c \end{pmatrix} \begin{pmatrix} b & a & c \\ a & c & b \\ c & b & a \end{pmatrix} \begin{pmatrix} b & a & c \\ c & b & a \\ a & c & b \end{pmatrix} \begin{pmatrix} c & b & a \\ a & c & b \\ b & a & c \end{pmatrix} \begin{pmatrix} c & b & a \\ b & a & c \\ a & c & b \end{pmatrix}$
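The count of 12 can be confirmed by brute force. A quick sketch (my own code): generate all triples of permuted rows and keep those in which every symbol appears once per column.

```python
from itertools import permutations

rows = list(permutations("abc"))    # the 6 candidate rows

# A square is Latin iff each of its 3 columns contains a, b, c exactly once.
squares = [
    (r0, r1, r2)
    for r0 in rows for r1 in rows for r2 in rows
    if all({r0[j], r1[j], r2[j]} == set("abc") for j in range(3))
]
print(len(squares))   # 12, matching the display above
```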

77. Jon Awbrey says:

To read the writing on the walls, I musta, since I did.

So there must be desires we don’t know we have until we bear their fruits.

78. Jon Awbrey says:

Crazy Old Guy Syndrome
$\mathbb{COGS} \therefore \sum$

Related discussion at MathBabe

79. Jon Awbrey says:

Version 1

As always, it’s a matter of whether our models are adequate to the thing itself, the phenomenon before us.

When it comes to a network that has the capacity to inquire into itself, the way our universe inquires into itself, the main thing lacking in almost all our network models has always been a logical capacity that is adequate to the task.

Version 2

As always, it’s a matter of whether our models are adequate to the thing itself, the phenomenon before us.

If the universe is a network that has the capacity to inquire into itself, the way our universe appears to do through us, then what order of logical capacity is up to that task?

For my part, I don’t think the common run of network models that we are seeing today have enough “logic” in them to do the job. Just to be cute about it, they need more nots in their ether.

80. Jon Awbrey says:

Happy Vita Nova❢  You may wish eventually to look into the way that networks, or graphs, can be used to do logic.  Doing that requires getting beneath purely positive connections to the varieties of negation that can be used to generate all the rest.  Peirce was a pioneer in this pursuit, as evidenced by his logical graphs.

81. Jon Awbrey says:

The Alfred E. Neumann Computer —
Maybe it’s a madder of bisociative alge〈bra|s
Less in the way of error-correction and
More in the way of punkreation codes

82. Jon Awbrey says:

I used to think about the heap problem a lot when I was programming and I decided the heap quits being a heap as soon as you remove one grain because then it becomes two heaps.

The Pascal sorting of the sorites played on moves between heaps and stacks, but I’ve forgotten the details of that particular epiphany.  The whole-system-theoretic point is clear enough though — the system as a whole makes a discrete transition from one state of organization to another.

83. Jon Awbrey says:

One classical tradition views logic as a normative science, the one whose object is truth.  This puts it on a par with ethics, whose object is justice or morality in action, and aesthetics, whose object is beauty or the admirable in itself.

The pragmatic spin on this line of thinking views logic, ethics, and aesthetics as a concentric series of normative sciences, each a subdiscipline of the next.  Logic tells us how we ought to conduct our reasoning in order to achieve the goals of reasoning in general.  Thus logic is a special application of ethics.  Ethics tells us how we ought to conduct our activities in general in order to achieve the good appropriate to each enterprise.  What makes the difference between a normative science and a prescriptive dogma is whether this telling is based on actual inquiry into the relationship of conduct to result, or not.

Here’s a bit more I wrote on this a long time ago in a galaxy not far away —

84. Jon Awbrey says:

Version 2

Making reality our friend is necessary to survival and finding good descriptions of reality is the better part of doing that, so I don’t think we have any less interest in truth than the Ancients.  From what I remember, Plato had specific objections to specific styles of art, not to art in general.  There is even a Pythagorean tradition that reads The Republic as a metaphorical treatise on music theory, one that serves as a canon for achieving harmony in human affairs.  Truth in fiction and myth is a matter of interpretation and — come to think of it — that’s not essentially different from truth in more literal forms of expression.

Version 3

Making reality our friend is necessary to survival and finding good descriptions of reality is the better part of doing that, so I shouldn’t imagine we have any less interest in truth than the Ancients.  From what I remember, Plato had specific objections to specific styles of art, not to art in general.  There is even a Pythagorean tradition that interprets The Republic as a metaphorical treatise on music theory, no doubt serving incidentally as a canon of harmony in human affairs.  Truth in fiction and myth is a matter of interpretation and, come to think of it, that is not essentially different from truth in more literal forms of expression.

85. Jon Awbrey says:

These are the forms of time,
which imitates eternity and
revolves according to a law
of number.

Plato • Timaeus • 38 A
Benjamin Jowett (trans.)

It is clear from Aristotle and even Plato in places that the good of reasoning from fair samples and freely chosen examples was bound up with notions of probability, which in the Greek idiom meant likeness, likelihood, and likely stories, in effect, how much the passing image could tell us of the original idea.

86. Jon Awbrey says:

Re: Michael Harris • Are Your Colleagues Zombies?

Comment 1

There are many things that could be discussed in this connection, but coming from a perspective informed by Peirce on the nature of inquiry and the whole tradition augured by Freud and Jung on the nature of the unconscious makes for a slightly shifted view of things compared, say, to the pet puzzles of analytic philosophy and rationalistic cognitive psychology.

Comment 2

Let me just ramble a bit and scribble a list of free associative questions that came to mind as I perused your post and sampled a few of its links.

There is almost always in the back of my mind a question about how the species of mathematical inquiry fits within the genus of inquiry in general.

That raises a question about the nature of inquiry. Do machines or zombies — unsouled creatures — inquire or question at all? Is awareness or consciousness necessary to inquiry? Inquiry in general? Mathematical inquiry as a special case?

Comment 3

One of the ideas we get from Peirce is that inquiry begins with the irritation of doubt (IOD) and ends with the fixation of belief (FOB). This fits nicely in the frame of our zombie flick for a couple of reasons: (1) it harks back to Aristotle’s idea that the cognitive is derivative of the affective, (2) it reminds me of what my middle to high school biology texts always enumerated as a defining attribute of living things, their irritability.

87. Jon Awbrey says:

Re: John Baez • The Internal Model Principle

Comment 1

Ashby’s book was my own first introduction to cybernetics and I recently returned to his discussion of regulation games in connection with some issues in Peirce’s logic of science or “theory of inquiry”.

In that context it seems like the formula $\rho \subset [\psi^{-1}(G)]\phi$ would have to be saying that the Regulator’s choices are a subset given by applying that portion of the game matrix with goal values in the body to the Disturber’s input.

Comment 2

There’s a far-ranging discussion that could take off from this point — touching on the links among analogical reasoning, arrows and morphisms, cybernetic images, iconic representations, mental models, systems simulations, etc., and just how categorically or not those functions are necessary to intelligent agency, all of which questions have enjoyed large and overlapping literatures for a long time now — but I’m not sure how much of that you meant to open up.

88. Jon Awbrey says:

$\texttt{((} \text{Desire} \texttt{,} \text{Law} \texttt{))}$

mno

89. Jon Awbrey says:

So many modes of mathematical thought,
So many are learned, so few are taught.
There are streams that flow beneath the sea,
There are waves that crash upon the strand,
Lateral thoughts that spread and meander —
Who knows what springs run under the sand?

90. Jon Awbrey says:

There are many modes of mathematical thought.  The way I see it they all play their part.  We have the byways of lateral thinking.  We have that “laser-like focus on one topic”.  At MathOverFlow they prefer the latter to the exclusion of the lateral.  Their logo paints a picture of overflow but they color mostly inside the box.

91. Jon Awbrey says:

Up till now quantification theory has been based on the assumption of individual variables ranging over universal collections of perfectly determinate elements.  Merely to write down quantified formulas like $\forall_{x \in X} f(x)$ and $\exists_{x \in X} f(x)$ involves a subscription to such notions, as shown by the membership relations invoked in their indices.

92. Jon Awbrey says:

$2017 = p_{306} = p_{2 \cdot 9 \cdot 17} = p_{p_1^1 p_2^2 p_7^1} = p_{p_1^1 p_2^2 p_{p_4^1}^1} = p_{p_1^1 p_2^2 p_{p_{p_1^2}^1}^1} = p_{p p_p^p p_{p_{p^p}}}$
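The tower above can be checked in a few lines (my own sketch, with primes 1-indexed so $p_1 = 2$): $2017$ is the 306th prime, $306 = 2 \cdot 3^2 \cdot 17 = p_1^1 p_2^2 p_7^1,$ and the remaining index $7 = p_4$ with $4 = p_1^2.$

```python
def nth_prime(n):
    """Return the n-th prime by trial division, 1-indexed: nth_prime(1) == 2."""
    count, k = 0, 1
    while count < n:
        k += 1
        if all(k % d for d in range(2, int(k ** 0.5) + 1)):
            count += 1
    return k

assert nth_prime(306) == 2017
# 306 = p_1^1 * p_2^2 * p_7^1 = 2 * 3^2 * 17
assert 306 == nth_prime(1) * nth_prime(2) ** 2 * nth_prime(7)
# and 7 = p_4, 4 = p_1^2, closing the tower
assert 7 == nth_prime(4) and 4 == nth_prime(1) ** 2
print("2017 is the 306th prime")
```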

93. Jon Awbrey says:

The book that struck the deepest chord with me was To Mock a Mockingbird : And Other Logic Puzzles Including an Amazing Adventure in Combinatory Logic, Alfred A. Knopf, New York, NY, 1985.

I once attended a conference at Michigan State on “Creativity in Logic and Math” or some such theme and there was another conference going on down the hall on Birdcalls — seriously — complete with sound effects all afternoon.  It made me wonder a little …

At any rate, I found much study there —

94. Jon Awbrey says:

Contrapositive or modus tollens arguments are very common in mathematics. Since it’s Comedy Hour, I can’t help thinking of Chrysippus, who is said to have died laughing, and his dog — not the one tied to a cart, the one chasing a rabbit or stag or whatever.

Let me resort to a Peircean usage of $(X)$ for $\lnot X$ and $XY$ for $X \land Y.$

Then $X \Rightarrow Y$ is written $(X (Y)).$
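A quick truth-table check (my own sketch) confirms that the Peircean form $(X(Y)),$ read as "not ($X$ and not $Y$)", agrees with $X \Rightarrow Y$ in all four cases:

```python
def implies(x, y):
    """Material implication X => Y."""
    return (not x) or y

def peirce_form(x, y):
    """(X (Y)) : the negation of the conjunction of X with the negation of Y."""
    return not (x and not y)

for x in (False, True):
    for y in (False, True):
        assert peirce_form(x, y) == implies(x, y)
print("(X (Y)) matches X => Y on all four cases")
```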

95. Jon Awbrey says:

From a functional point of view it was a step backward when we passed from Peirce’s $\sum$ and $\prod$ to the present $\exists$ and $\forall.$  There’s a rough indication of what I mean at the following location:

96. Jon Awbrey says:

C.S. Peirce is one who recognized the constitutional independence of mathematical inquiry, finding at its core a mode of operation tantamount to observation and more primitive than logic itself.  Here is one place where he expressed that idea.

Normative science rests largely on phenomenology and on mathematics;
metaphysics on phenomenology and on normative science.

— Charles Sanders Peirce, Collected Papers, CP 1.186 (1903)
Syllabus : Classification of Sciences (CP 1.180–202, G-1903-2b)

97. Jon Awbrey says:

Just a tangential association with respect to 2.2.2.  I have been exploring questions related to pivotal variables (“Differences that Make a Difference” or “Difference In ⟹ Difference Out”) by means of logical analogues to partial and total differentials.

For example, letting $\mathbb{B} = \{ 0, 1 \},$ the partial differential operator $\partial_{x_i}$ sends a function $f : \mathbb{B}^k \to \mathbb{B}$ with fiber $F = f^{-1}(1) \subseteq \mathbb{B}^k$ to a function $g = \partial_{x_i}f$ whose fiber $G = g^{-1}(1) \subseteq \mathbb{B}^k$ consists of all the places where a change in the value of $x_i$ makes a change in the value of $f.$
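Following that definition, a minimal sketch (my own code, not from the comment): $(\partial_{x_i} f)(x) = f(x) \oplus f(x \,\text{with bit}\, i \,\text{flipped}),$ whose fiber of $1$ is exactly the set of points where toggling $x_i$ toggles $f.$

```python
from itertools import product

def partial_diff(f, i):
    """Return g(x) = f(x) XOR f(x with coordinate i flipped)."""
    def g(x):
        y = list(x)
        y[i] ^= 1
        return f(x) ^ f(tuple(y))
    return g

# Example: f(x0, x1) = x0 AND x1.  A change in x0 changes f exactly when x1 = 1.
f = lambda x: x[0] & x[1]
g = partial_diff(f, 0)
fiber = [x for x in product((0, 1), repeat=2) if g(x) == 1]
print(fiber)   # [(0, 1), (1, 1)] -- precisely the points with x1 = 1
```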

98. Jon Awbrey says:

If you don’t mind using model-theoretic language, there is $A \models p$ for the fact that sentence $p$ is true in model $A,$ where $A$ is defined as a subset of the relevant set $\mathcal{S}$ of simple sentences.

Cf: My notes on Chang and Keisler (1973) • (4)(8)

99. Jon Awbrey says:

Venn diagrams make for very iconic representations of their universes of discourse.  That is one of the main sources of their intuitive utility and also the main source of their logical limitations — they begin to exceed our human capacity for visualization once we climb to 4 or 5 circles (Boolean variables) or so.

Peirce’s logical graphs at the Alpha level (propositional calculus) are somewhat iconic but far less so than Venn diagrams.  They are more properly regarded as symbolic representations, in a way that exceeds the logical capacities of icons.  That is the source of their considerably greater power as a symbolic calculus.

Here’s a primer on all that:

100. Jon Awbrey says:

C.S. Peirce put forth the idea that what he called “the laws of information” were key to solving “the puzzle of the validity of scientific inference” and thus to understanding the “logic of science”.  See my notes on his notorious formula:

Information = Comprehension × Extension

101. Jon Awbrey says:

(1 + ⓪ + ① + ② + ③ + ④ + ⑤ + ⑥ + ⑦ + ⑧ + ⑨ + Ⓐ + Ⓑ + Ⓒ + Ⓓ + Ⓔ + Ⓕ)³

102. Jon Awbrey says:

Measurement is an extension of perception.  Measurement gives us data about an object system the way perception gives us percepts, which we may consider just a species of data.

If we ask when we first became self-conscious about this whole process of perception and measurement, I don’t know, but Aristotle broke ground in a very articulate way with his treatise On Interpretation.  Sense data are impressions on the mind and they have their consensual, communicable derivatives in spoken and written signs.  This triple interaction among objects, ideas, and signs is the cornerstone of our contemporary theories of signs, commonly known as semiotics.

103. Jon Awbrey says:

In many applications a predicate is a function from a universe of discourse $X$ to a binary value in $\mathbb{B} = \{0, 1\},$ that is, a characteristic function or indicator function $f : X \to \mathbb{B},$ and $f^{-1}(1),$ the fiber of $1$ under $f,$ is the set of elements denoted or indicated by the predicate.  That is the semantics, anyway.  As far as syntax goes, there are many formal languages whose syntactic expressions serve as names for those functions and nominally speaking one may call those names predicates.
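A small illustration of that semantics (mine, not from the comment): a predicate as an indicator function $f : X \to \mathbb{B},$ with $f^{-1}(1)$ the set of elements the predicate denotes.

```python
X = range(10)                                 # a toy universe of discourse
is_even = lambda x: 1 if x % 2 == 0 else 0    # characteristic function f : X -> B
fiber = {x for x in X if is_even(x) == 1}     # f^{-1}(1), the denoted elements
print(sorted(fiber))   # [0, 2, 4, 6, 8]
```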

104. Jon Awbrey says:

Here’s two lines of inquiry I suspect intersect —

Meet you at the corner …

105. Jon Awbrey says:

Going back to Aristotle:

Words spoken are symbols or signs (symbola) of affections or impressions (pathemata) of the soul (psyche);  written words are the signs of words spoken.  As writing, so also is speech not the same for all races of men.  But the mental affections themselves, of which these words are primarily signs (semeia), are the same for the whole of mankind, as are also the objects (pragmata) of which those affections are representations or likenesses, images, copies (homoiomata).  (Aristotle, De Interp. i. 16a4).

From a Peircean semiotic perspective we can distinguish an object domain and a semiotic plane, so we can have three types of type/token relations:  (1) within the object domain, (2) between objects and signs, (3) within the semiotic plane.  We could subtilize further but this much is enough for a start.

Type/token relations of type (1) are very common in mathematics and go back to the origins of mathematical thought.  These days computer science is rife with them.  I’ve seen a lot of confusion about this in Peircean circles as it’s not always grasped that type/token relations are not always all about signs.  It can help to speak of types versus instances or instantiations instead.

Aristotle covers type/token relations of types (2) and (3) in De Interp., the latter since he recognizes signs of signs in the clause, “written words are the signs of words spoken”.

106. Jon Awbrey says:

I remember a time I was working on a dissertation proposal and having trouble communicating its main points to my advisor.  And then it occurred to me that Peirce’s theory of sign relations was the very thing needed to capture the problematics of that communication situation.

107. Jon Awbrey says:

I think the underlying issue is whether we want our connectives to be truth-functional or whether we are seeking some sort of “relevance logic”. In the first case the supposed “content” of a proposition, e.g., “Mayo can fly” is irrelevant, only its truth-value enters into the truth-functional conditional. And the truth value of “Mayo can fly” is further irrelevant to the truth of “p ⇒ Mayo can fly” if p is false.
