Thursday, May 13, 2004
Thanks to something Steven Den Beste posted, I spent a long time this evening wandering around Amritas' blog, reading his critiques of Chomsky. I had not previously come across any details of what it is Chomsky has been up to; all I had was the general impression that he is held in contempt by all right-thinking persons, who are identified as right-thinking by the fact that they hold Chomsky in contempt. So now I have a general impression of what the fighting is about, and a basic understanding of the particular brand of snake oil Chomsky has been selling.
I found myself reminded of something out of atomic logic. And here we hit a digression: since I only suppose this blog has readers, I can just as easily suppose they need that term explained. Atomic logic is the name (well, one of the names, and the name that was fashionable when I went to college) for that part of formal logic that deals with "atoms" -- what a programmer would call boolean variables -- and the operators that work upon them in expressions. Atoms are either true or false; the operators that work upon them likewise give a result of either true or false. These operators are:
(Visual Basic programmers will recognize most of the operator names. I chose the VB forms, because they're easiest to type.)
- AND -- the value of the expression "A AND B" is true only when both A and B are true
- OR -- the value of the expression "A OR B" is true when either A or B (or both) is true
- NOT -- the value of the expression "NOT A" is true only when A is false
- XOR -- the value of the expression "A XOR B" is true when either A or B (but not both) is true
- IMP -- the value of the expression "A IMP B" is true when either A is false or B is true (or both)
- EQV -- the value of the expression "A EQV B" is true when A and B are either both true or both false
- NAND -- the value of the expression "A NAND B" is true when either A or B (or both) is false
- NOR -- the value of the expression "A NOR B" is true only when both A and B are false.
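The list above can be checked mechanically; there are only four input combinations. Here is a short sketch in Python (chosen over VB only because it runs anywhere; nothing about this is language-specific) that writes each operator out as a function, straight from the definitions above, and prints the combined truth table:

```python
from itertools import product

# The eight operators, transcribed directly from the definitions above.
# NOT is unary; its second argument is ignored so all entries share one shape.
OPS = {
    "AND":  lambda a, b: a and b,
    "OR":   lambda a, b: a or b,
    "NOT":  lambda a, b: not a,
    "XOR":  lambda a, b: a != b,
    "IMP":  lambda a, b: (not a) or b,
    "EQV":  lambda a, b: a == b,
    "NAND": lambda a, b: not (a and b),
    "NOR":  lambda a, b: not (a or b),
}

# One row per input combination, all operators at once.
for a, b in product([True, False], repeat=2):
    print(a, b, {name: op(a, b) for name, op in OPS.items()})
```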
Now, out of the list above, which would you say are "basic" operators, and which "derived"? A layman would object to the formal definition of OR, since the word as used in ordinary speech means XOR instead; he would choose AND, XOR, and NOT as the basic operations, and derive the rest:
- A OR B :== (A XOR B) XOR (A AND B)
- A EQV B :== (A AND B) XOR ((NOT A) AND (NOT B))
- A IMP B :== (NOT A) XOR (A AND B)
- A NAND B :== NOT (A AND B)
- A NOR B :== (NOT A) AND (NOT B)
A person more comfortable with the field would choose AND, OR, and NOT:
- A XOR B :== (A OR B) AND (NOT (A AND B))
- A EQV B :== (A AND B) OR ((NOT A) AND (NOT B))
- A IMP B :== (NOT A) OR B
- A NAND B :== NOT (A AND B)
- A NOR B :== NOT (A OR B)
But this doesn't take into account a really arcane trick. All the other operators can be defined in terms of NAND, as follows:
- NOT A :== A NAND A
- A AND B :== NOT (A NAND B) :== (A NAND B) NAND (A NAND B)
- A OR B :== (NOT A) NAND (NOT B) :== (A NAND A) NAND (B NAND B)
-- and so on. We can pull exactly the same trick with NOR:
- NOT A :== A NOR A
- A OR B :== NOT (A NOR B) :== (A NOR B) NOR (A NOR B)
- A AND B :== (NOT A) NOR (NOT B) :== (A NOR A) NOR (B NOR B)
-- and so on.
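Both tricks are easy to verify by brute force, since there are only four input combinations. A short Python check of every identity in the NAND and NOR lists above:

```python
from itertools import product

def nand(a, b): return not (a and b)
def nor(a, b):  return not (a or b)

# Check each NAND-only and NOR-only definition against the ordinary
# operators, for all four input combinations.
for a, b in product([True, False], repeat=2):
    # NAND alone:
    assert nand(a, a) == (not a)
    assert nand(nand(a, b), nand(a, b)) == (a and b)
    assert nand(nand(a, a), nand(b, b)) == (a or b)
    # NOR alone:
    assert nor(a, a) == (not a)
    assert nor(nor(a, b), nor(a, b)) == (a or b)
    assert nor(nor(a, a), nor(b, b)) == (a and b)

print("all NAND-only and NOR-only identities hold")
```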
As far as I know, this trick has only one real-world application, which is in the construction of integrated circuits. The actual nitty-gritty of putting together NAND and NOR gates is a lot simpler (and the gates work a lot faster) than any of the others, so this trick is used all over the place in microprocessors. Other than that, NAND and NOR are pretty much useless; there is no expression in which they can be used that cannot be made clearer by not using them.
This didn't stop one of my professors in college from jumping all over this trick. It showed that NAND and NOR were more "basic," you see. He even assigned an ugly problem on the final: Write an equivalent to this ordinary expression using only the NAND operator. And we did it, too. Something that started out as an easily understood half-of-a-line became an impassable four-line thicket of up-arrows (the glyph for NAND) and parentheses. We had our revenge by the same act, though, since the idiot no doubt went blind trying to grade them.
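The professor's exercise can even be mechanized. Here is a hypothetical sketch: expressions are nested tuples (a representation I'm inventing here, not anything from the original problem), and to_nand() applies the three NAND identities above recursively. The output thickets up just as quickly as the hand-worked version did:

```python
# Rewrite an AND/OR/NOT expression using only NAND. Expressions are nested
# tuples like ("AND", "A", "B") or ("NOT", x); bare strings are atoms.
# The tuple representation is my own choice for this illustration.

def to_nand(e):
    if isinstance(e, str):                     # a bare atom: A, B, ...
        return e
    op = e[0]
    if op == "NOT":
        x = to_nand(e[1])
        return ("NAND", x, x)                  # NOT A :== A NAND A
    x, y = to_nand(e[1]), to_nand(e[2])
    if op == "AND":
        t = ("NAND", x, y)
        return ("NAND", t, t)                  # (A NAND B) NAND (A NAND B)
    if op == "OR":
        return ("NAND", ("NAND", x, x), ("NAND", y, y))
    raise ValueError("unknown operator: " + op)

# (NOT A) OR B -- i.e. A IMP B -- rendered NAND-only:
print(to_nand(("OR", ("NOT", "A"), "B")))
```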
Now from what I saw on Amritas' board, the prototypical Chomskyite would go right along with said idiot professor, concluding that either NAND or NOR (take your pick) is the actual internal mechanism (the "deep structure") of all thought, and that all the other operators are merely derivative ("surface structure"). Whereas the truth is self-evidently the opposite: AND, XOR, and NOT are the workhorses of everyday thought, and all the others are derivative; the OR and IMP operators, as defined, actually contradict the equivalent everyday usage, and NAND and NOR contribute nothing to understanding, but instead obstruct it.
On another tack, I was reminded of an argument I had once on the internet with a woman who absolutely insisted that brown was not a color. She had the artist's "insider" perspective, you see: One finds brown on a color wheel by starting out with red, or orange, or yellow, and decreasing the luminance, so there is no such thing as brown; there is only dark red, or dark orange, or dark yellow. It made no difference to her that perception doesn't work that way, that it requires an effort of will to see the underlying similarity of what ordinary perception insists on treating as qualitatively different, and more than a little self-deception to say that the qualitative difference doesn't matter.
Back to the immediate subject. Chomsky is full of shit. I say this with all the confidence of long introspection on the underlying problem (and with no other authority), and, if Amritas' depiction of the field is anything to go by, it probably helps that I have no training in it. The actual structure of thought, if we must have it in those terms, is an amorphous digraph interconnected to a fare-thee-well, where both the vertices and the edges have any number of qualities, but nothing is in any particular order. Grammar is not a necessary component of thought itself; it arises from the requirement of representing this graph somehow in a linear form (since words must be uttered one after another), and there is no one "right" or "best" or "fundamental" way to do this. The syntactical system, whichever one it is, picks out enough of the highlights to communicate, not the entirety of the graph, but enough (the "meaning") so that it can be reconstructed in the listener's mind (again, in no particular order); the wholly internal art of resupplying the bits and pieces that didn't get sent, thus pulling reasonable certainty out of uncertain materials, is what goes by the name of "understanding."
(I've played around with the idea of constructing a computer model along these lines, but actually doing it would require a lot more time than I have to spend on it. Mostly it would have to do with deciding what ought to be a vertex, and what an edge, and what all the possible attributes (qualities) of each would be, and growing the whole thing in much the same order that a child learns to speak. The end result, I suspect, would be a picture of language, and of the basic knowledge required to understand language, that would show only slight echoes of what grammarians have been maintaining for centuries; in particular, I suspect that the "parts of speech" actually required to describe and run a working language would bear very little resemblance to the old categories of nouns, verbs, adjectives, adverbs, and so on.)
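For what it's worth, the bare bones of such a structure take only a few lines to sketch. Everything here (the class names, the particular qualities, the example sentence) is invented for illustration, and makes no claim about what a working model would actually need:

```python
# A minimal sketch of the structure described above: an unordered graph
# whose vertices and edges each carry an open-ended bag of qualities.
# All names and attributes here are hypothetical, chosen for illustration.

class Vertex:
    def __init__(self, **qualities):
        self.qualities = qualities

class Edge:
    def __init__(self, a, b, **qualities):
        self.a, self.b = a, b
        self.qualities = qualities

# "dog bites man", stored with no inherent ordering among the parts:
dog = Vertex(kind="creature", familiar=True)
man = Vertex(kind="creature", upright=True)
bite = Edge(dog, man, action="bite", direction="a_to_b")

# Only utterance forces a linear order; the graph itself has none.
print(bite.a.qualities, bite.qualities, bite.b.qualities)
```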
Also: If Chomsky's universal "deep structure" idea is right, and if, as he has it, it mirrors the syntactical patterns of English, then English ought to be one of the easiest languages for foreigners to learn. Whereas we know it to be one of the hardest.
For the more forensically inclined