I have no quarrel with the idea that a language user has a mental ‘lexicon’. But I am impatient that no linguist has satisfactorily defined the lexicon’s logical content or its physical implementation. This piece fearlessly tackles the logical aspect.
Words and rules
Empirical evidence suggests that language knowledge is a collection of:
- phonological words – P
- meanings – M
- categories – C
- syntactic relations – R
Here these are considered to be concepts. This usage is a bit wider than in plain English but it’s appropriate because concepts are grouped into propositions:
- words – P / C / M (= ‘P means M when used as C’)
- rules – CX / R / CY (= ‘a word as CX and another as CY may be joined as R’)
Thus the mental lexicon gives the semantics for individual words, and the syntax for pairs of words. If the syntax were given for individual words, complicated logic would be required to form a junction. Instead a junction is formed thus:
The shaded triangles are propositions permanently stored in the lexicon. The unshaded triangle M2 / R / M3 is the proposition delivered to cognition when activated by phonological input. If P2 is kissed and P3 is Lucy, KISS / PATIENT / LUCY is delivered. (Small caps denote a concept in the mind. For the meaning of PATIENT in this context, try Wikipedia on ‘thematic relation’.)
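The mechanism can be sketched in code. This is only an illustrative model, assuming Python tuples for the stored propositions; the category labels `V_PAST_TRANS` and `N_PROPER` are invented placeholders, not claims about the real inventory of Cs.

```python
# Stored 'word' propositions: P / C / M ('P means M when used as C').
# Category labels are invented placeholders.
words = [
    ("kissed", "V_PAST_TRANS", "KISS"),  # P2 / C2 / M2
    ("Lucy", "N_PROPER", "LUCY"),        # P3 / C3 / M3
]

# Stored 'rule' propositions: CX / R / CY.
rules = [
    ("V_PAST_TRANS", "PATIENT", "N_PROPER"),  # C2 / R / C3
]

def junctions(p_first, p_second):
    """Deliver every M / R / M proposition activated by a pair of phonological words."""
    delivered = []
    for (pa, ca, ma) in words:
        if pa != p_first:
            continue
        for (pb, cb, mb) in words:
            if pb != p_second:
                continue
            # a junction fires for each rule whose categories match
            for (cx, r, cy) in rules:
                if (cx, cy) == (ca, cb):
                    delivered.append((ma, r, mb))
    return delivered

print(junctions("kissed", "Lucy"))  # [('KISS', 'PATIENT', 'LUCY')]
```

Note that no logic 'computes' the relation: the matching categories simply select a pre-stored rule proposition, which is what the diagram is meant to convey.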
This is a simplification. The two morphemes kiss and -ed form a junction delivering KISS / TENSE / PAST to cognition. The lexicon must also cover phenomena like prosodic stress in order to deliver sentence meaning reliably; for example, distinguishing (1) as the answer to Who did John kiss? rather than to What did John do to Lucy? or to Who kissed Lucy?
Perhaps language knowledge is unconscious because Cs are content-less nodes that exist simply to link words and rules. Certainly the C / R / M propositions suggested by the diagram have no cognitive effect; all other concepts must carry something more in order to have one.
Being content-less would mean that ‘category’ is not tied to traditional ideas about parts of speech.
The C in a particular rule might be unique to a word (for example kicked as in kicked the bucket) or shared by a large number of words (kicked in other contexts, along with many other past-tense monotransitive verbs). Having Cs ranging from very specific to very general means an infant needs to acquire just enough to deal with regular and irregular formations. It might also allow semantic congruency of word pairs to be part of language knowledge.
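The spread from word-specific to very general Cs can be sketched with the same kind of toy lexicon. Everything here is an illustrative convenience: the labels are invented, ‘the bucket’ is treated as a single phonological unit, and the idiomatic meaning DIE is attached to the verb’s word proposition only to keep the sketch small; the argument above does not settle how non-compositional meaning is actually delivered.

```python
# Toy lexicon: 'kicked' has one word proposition with a C unique to the
# idiom and one with a C shared by many past-tense monotransitive verbs.
words = [
    ("kicked", "V_KICK_IDIOM", "DIE"),       # word-specific C
    ("kicked", "V_PAST_TRANS", "KICK"),      # very general C
    ("hugged", "V_PAST_TRANS", "HUG"),
    ("the bucket", "N_BUCKET_IDIOM", "IDIOM_OBJECT"),
    ("the bucket", "N_PHRASE", "BUCKET"),
]
rules = [
    ("V_KICK_IDIOM", "IDIOM", "N_BUCKET_IDIOM"),  # fires for one word pair only
    ("V_PAST_TRANS", "PATIENT", "N_PHRASE"),      # fires for many word pairs
]

def junctions(p_first, p_second):
    return [
        (ma, r, mb)
        for (pa, ca, ma) in words if pa == p_first
        for (pb, cb, mb) in words if pb == p_second
        for (cx, r, cy) in rules if (cx, cy) == (ca, cb)
    ]

# Both the idiomatic and the literal reading are activated:
print(junctions("kicked", "the bucket"))
# [('DIE', 'IDIOM', 'IDIOM_OBJECT'), ('KICK', 'PATIENT', 'BUCKET')]
```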
The existence of Cs is hard to deny as they enable any phonological word to have alternative meanings:
and any meaning to be expressed by alternative phonological words:
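Both directions of this many-to-many relationship can be sketched with the same toy representation. The entries (bank with two meanings; RIVERSIDE expressible by two words) are stock textbook examples, not drawn from the discussion above.

```python
# Word propositions as P / C / M triples; the shared C nodes are what let
# one P link to several Ms and one M link to several Ps.
word_props = [
    ("bank", "N_INSTITUTION", "FINANCIAL_BANK"),  # one P, two Ms
    ("bank", "N_LANDFORM", "RIVERSIDE"),
    ("shore", "N_LANDFORM_2", "RIVERSIDE"),       # one M, two Ps
]

def meanings_of(p):
    """All meanings a phonological word can deliver."""
    return [m for (pp, c, m) in word_props if pp == p]

def words_for(m):
    """All phonological words that can express a meaning."""
    return [p for (p, c, mm) in word_props if mm == m]

print(meanings_of("bank"))     # ['FINANCIAL_BANK', 'RIVERSIDE']
print(words_for("RIVERSIDE"))  # ['bank', 'shore']
```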
No junction can include another junction, which should make the approach congenial to dependency grammarians.
Without higher-level constituents, phrase-structure grammarians will be uncomfortable. If you’re one, please don’t just say ‘Constituency is axiomatic!’ but look for a flaw in the argumentation.
Unlike shaded propositions, an M / R / M would not be language knowledge – something that works automatically and unconsciously. As something consciously know-able it may be new, or recent (confirming something from earlier in the discourse but perhaps volatile), or permanent. Presumably knowledge becomes permanent as a result of repetition. It can be inferred that all knowledge, not just language knowledge, is held as propositions in the same mental architecture.
The triangles have much in common with the boxes in LS3. One difference is that the combining of categories is permanently stored and therefore doesn’t need program logic to be invoked in real time.
Another difference is that, to allow a many-to-many relationship between phonological word and meaning, the P, C and M in a ‘word’ proposition must be nodes in a network, not free-standing items.
A given concept can participate in any number of different triangles that are properly constituted (a P / R / M would be improper, for example). Three-way grouping gives a very large number of triangles and an unimaginably large (but finite) number of P / M / C / R / C / M / P junctions. Infinity only applies to sentences because, as will be seen, junctions occur in series. But it can be argued that every possible junction of an idiolect is pre-stored awaiting activation; it does not need to be computed from lexical ‘features’.
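The finiteness claim can be sketched: given the stored word and rule propositions, every properly constituted P / M / C / R / C / M / P junction of a (toy) idiolect can be enumerated up front, with nothing computed from lexical ‘features’. The entries and labels are illustrative placeholders.

```python
# Toy idiolect: two past-tense transitive verbs, two proper names, one rule.
words = [
    ("kissed", "V_PAST_TRANS", "KISS"),
    ("hugged", "V_PAST_TRANS", "HUG"),
    ("Lucy", "N_PROPER", "LUCY"),
    ("John", "N_PROPER", "JOHN"),
]
rules = [("V_PAST_TRANS", "PATIENT", "N_PROPER")]

# Enumerate every properly constituted junction in advance.
all_junctions = [
    (pa, ma, ca, r, cb, mb, pb)            # P / M / C / R / C / M / P
    for (pa, ca, ma) in words
    for (pb, cb, mb) in words
    for (cx, r, cy) in rules
    if (cx, cy) == (ca, cb)
]
print(len(all_junctions))  # 4: each of two verbs paired with each of two names
```

A realistic lexicon makes this set astronomically large, but it remains finite; only sentences, built from junctions in series, are unbounded.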
There you go
Working out the P / M / C / R / C / M / P diagram was a slog but worthwhile. It can model any and every junction. It can cope with the many-to-many relationship between Ps and Ms. And it can cope with any irregularities.
Next time we’ll see how it is applied serially to cover a whole sentence.