Network Grammar’s sentence analyses are quite subtle. A casual reader may feel that too much must be taken on trust. Completing the discussion of WHO-interrogatives will be even more demanding. It’s therefore appropriate first to summarise NG’s underlying assumptions. These are what must be trusted. From them the analyses follow straightforwardly.
A network holds an individual’s knowledge. The network’s nodes accommodate concepts. A proposition is formed by linking three concepts. A concept may participate in many propositions, so a node is typically connected to many others. However, a node has no local content, although it does form the locus for a unique concept. That concept is invoked by progression through the many paths that fan out from the node.
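These assumptions might be sketched in code. In the sketch below the node labels and all class and field names are illustrative, not part of NG; the point is only that a node is content-free and gains its identity from the propositions it participates in:

```python
from collections import defaultdict

class Network:
    """A node carries no local content; it is identified here only by a
    label for readability. A proposition links three nodes, and a node
    may take part in many propositions, so each node fans out widely."""
    def __init__(self):
        self.propositions = []         # each is a 3-tuple of node labels
        self.links = defaultdict(set)  # node -> nodes reachable from it

    def add_proposition(self, a, b, c):
        self.propositions.append((a, b, c))
        for x in (a, b, c):
            for y in (a, b, c):
                if x != y:
                    self.links[x].add(y)

net = Network()
net.add_proposition("NERO", "HASP", "MAD")
net.add_proposition("NERO", "AGENT", "GIVE")
# NERO now participates in two propositions and fans out to four nodes:
print(sorted(net.links["NERO"]))   # ['AGENT', 'GIVE', 'HASP', 'MAD']
```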
Language knowledge is held as concepts and propositions. A word is a proposition formed from three concepts – phonological word, category and meaning. A rule is a proposition formed from two category concepts and a relation concept. Substituting the corresponding meanings for the two categories in a rule forms a proposition delivered to cognition. Each pair of words in a sentence – each junction – results in the delivery of one or more of these propositions.
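As a rough illustration – the words, categories and relation names below are invented for the sketch, not drawn from NG’s actual lexicon – a word and a rule can each be represented as a triple, and delivery as substitution of meanings for categories:

```python
# A word links phonological word, category and meaning:
WORDS = [("nero", "N", "NERO"), ("mad", "A", "MAD")]
# A rule links two category concepts and a relation concept:
RULES = [("N", "HASP", "A")]

def deliver(word1, word2):
    """Substitute the two words' meanings for the two categories in a
    matching rule, giving a proposition deliverable to cognition."""
    for (c1, rel, c2) in RULES:
        if word1[1] == c1 and word2[1] == c2:
            return (word1[2], rel, word2[2])
    return None   # no such rule exists in the lexicon

print(deliver(("nero", "N", "NERO"), ("mad", "A", "MAD")))
# ('NERO', 'HASP', 'MAD')
```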
The possible junctions in a sentence are reviewed in a simple sequence that follows from assuming the sentence is processed word by word in a single pass, left to right. At any particular phonological word PN, it is only possible to consider pairings of PN with preceding phonological words in the sentence. These possibilities are tried in a right-to-left sequence, broadly:
Each of these can represent more than one possibility because one phonological word may have several meanings: the several propositions needed must each have a distinct category. Therefore the possibilities actually tried are in the form:
In this form, the pairing is valid if such a rule exists in the lexicon.
It’s often the case that the earlier word is committed to one particular category because that word has already been identified as taking part in a junction. If so, and PN has, say, three alternative meanings, then three possibilities can be tried:
Of course, it’s also possible that the earlier word is not yet committed and so more possibilities can be tried.
Most pairings don’t exist as a rule in the lexicon, or are unavailable because a word can only occur once in a sentence as a dependent (although any number of times as a parent). Obviously, when a valid pairing is identified, the backwards scan stops.
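The scan might be sketched as follows. The representation – words as lists of candidate (category, meaning) readings, rules as (category, relation, category) triples – and all the names are illustrative assumptions, not NG’s actual machinery:

```python
def find_junction(new_index, words, rules, used_dependents):
    """words[new_index] is the incoming word PN. Scan the preceding
    words right to left, trying each category pairing against the
    lexicon's rules, and stop at the first valid pairing. A word
    already used as a dependent is unavailable in that role."""
    for i in range(new_index - 1, -1, -1):
        for ce, me in words[i]:               # earlier word's readings
            for cn, mn in words[new_index]:   # incoming word's readings
                for c1, rel, c2 in rules:
                    # earlier word as parent, incoming as dependent:
                    if (ce, cn) == (c1, c2) and new_index not in used_dependents:
                        used_dependents.add(new_index)
                        return (me, rel, mn)
                    # incoming as parent, earlier word as dependent:
                    if (cn, ce) == (c1, c2) and i not in used_dependents:
                        used_dependents.add(i)
                        return (mn, rel, me)
    return None

rules = [("V", "AGENT", "N")]                         # illustrative
words = [[("N", "NERO")],                             # 'Nero', already seen
         [("V", "GIVE"), ("N", "GIFT")]]              # ambiguous incoming word
print(find_junction(1, words, rules, set()))
# ('GIVE', 'AGENT', 'NERO')
```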
Up to this point the assumptions are logical and simple, if not actually proven. Let’s call these primary assumptions. Beyond this point, secondary assumptions are needed to reconcile the facts of language with the primary assumptions. These cannot be defended by appealing to common sense or to Ockham’s Razor. Furthermore, they may be subject to revision as LanguidSlog analyses more and more types of sentence.
Progression through the network is achieved by spreading activation. Any phonological word is assumed, for arithmetical convenience, to bring six units of activation.
Typically the propositions delivered to cognition for a sentence are fewer than the phonological words in that sentence, so LanguidSlog doesn’t need to explain how activation is provided for progression through language knowledge. However, further progression through the network is generally divergent, and activation must be provided to sustain a growing number of paths. No explanation of that is offered here.
There are many situations in language where the proposition(s) for a junction can’t be fully formed until after the words in the junction have been passed. In such a case the possible alternatives are all created but the available activation is divided between them, each then having insufficient activation for delivery. The situation is eventually resolved by pushing all the activation on to one or other of the propositions.
Sometimes resolution doesn’t occur until sentence-end, when a more-strongly activated proposition wins against one less-strongly activated. It follows that activation may initially be split asymmetrically across alternative propositions.
Sometimes selection is needed between propositions that have been given the same partial activation. Here it’s necessary to assume a gradual decay of activation, so that the proposition activated more recently wins against the one activated longer ago.
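Under these secondary assumptions a numerical sketch is possible. The six-unit budget is taken from above; the decay factor, the equal split and all the names are purely illustrative:

```python
ACTIVATION = 6.0   # units brought by each phonological word (by assumption)
DECAY = 0.9        # illustrative per-step decay factor

class Candidate:
    """A not-yet-delivered proposition carrying part of the activation."""
    def __init__(self, proposition, activation):
        self.proposition = proposition
        self.activation = activation

    def decay(self):
        self.activation *= DECAY

# The six units are divided between two alternatives (here equally):
earlier = Candidate(("NERO", "HASP", "MAD"), ACTIVATION / 2)
earlier.decay()    # decays while subsequent words are processed
later = Candidate(("NERO", "AGENT", "GIVE"), ACTIVATION / 2)

# Neither alternative alone has the full six units needed for delivery;
# at resolution the more recently activated proposition wins:
winner = max((earlier, later), key=lambda c: c.activation)
print(winner.proposition)   # ('NERO', 'AGENT', 'GIVE')
```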
Incomplete propositions try to combine to enable something to be delivered to cognition.
The simplest case is where two propositions are incomplete because each has one null concept. They share one substantive concept but their third concepts differ. Each of these replaces the null in the other proposition, giving a single complete proposition. (The example shown in LS12 was NERO / HASP / (null) and (null) / HASP / MAD giving NERO / HASP / MAD.)
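The combination might be sketched as follows, representing a null concept as None; the guard against clashing substantive concepts is an assumption of the sketch:

```python
def combine(p, q):
    """Combine two propositions that each have one null (None) concept:
    each null is filled from the other proposition, giving a single
    complete proposition, provided the substantive concepts agree."""
    merged = []
    for a, b in zip(p, q):
        if a is not None and b is not None and a != b:
            return None   # clashing substantive concepts: no combination
        merged.append(a if a is not None else b)
    return tuple(merged) if None not in merged else None

# The LS12 example:
print(combine(("NERO", "HASP", None), (None, "HASP", "MAD")))
# ('NERO', 'HASP', 'MAD')
```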
Another case is where two propositions each have a null concept but neither of their other two concepts is shared. These can’t be combined into a single proposition, but they may be concatenated so that they logically form one. When their nulls are joined, the resulting node becomes substantive and the whole is deliverable. (LS12 shows this is how GIVE / AGENT / NERO is in effect formed from Nero is giving.)
A proposition may have a full set of concepts but lack full activation. If so, it will not yet have been delivered. The proposition cannot combine but it may be displaced by a proposition from an incoming junction. A special case is where the displacement occurs at sentence-end.
Displacement means that one or other of the concepts in the proposition changes, but the proposition typically retains its part-activation. Often the result complements another existing proposition with all the same concepts, the combined activation being sufficient for delivery.
Even where the incoming and existing propositions initially received the same activation, it is the incoming that wins and the existing that is displaced. As already mentioned, this can be a case of stronger versus weaker, the existing proposition having had longer to decay.
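Displacement might be sketched like this; the threshold value, the dictionary representation and the placeholder concept X are all illustrative assumptions of the sketch:

```python
THRESHOLD = 6.0    # illustrative: activation needed for delivery

def displace(existing, position, new_concept, pool):
    """Replace one concept of a part-activated proposition (it has lost
    to the incoming junction) while retaining its part-activation; if
    the result has all the same concepts as a proposition already in
    the pool, their activations combine and may suffice for delivery."""
    concepts = list(existing["concepts"])
    concepts[position] = new_concept
    concepts = tuple(concepts)
    for other in pool:
        if other["concepts"] == concepts:
            other["activation"] += existing["activation"]
            other["delivered"] = other["activation"] >= THRESHOLD
            return other
    return {"concepts": concepts,
            "activation": existing["activation"], "delivered": False}

# A part-activated proposition already in the network:
pool = [{"concepts": ("GIVE", "AGENT", "NERO"), "activation": 3.0,
         "delivered": False}]
# An existing proposition whose third concept (placeholder X) is
# displaced by NERO at sentence-end:
existing = {"concepts": ("GIVE", "AGENT", "X"), "activation": 3.0,
            "delivered": False}
result = displace(existing, 2, "NERO", pool)
print(result["delivered"], result["activation"])   # True 6.0
```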
Who is X giving? is next week’s problem.