So far Network Grammar has focused exclusively on comprehension. This piece looks briefly at production. It concludes that NG’s progression-through-network approach can represent not only how the hearer’s brain processes a sentence but also how the speaker previously constructed the sentence.
The same mental network, as described in NG7 onwards, is used for production and for comprehension. Both processes rely on the same stored language knowledge – for syntax, P / C / M words and C / R / C rules.
This means that the paths through the network for progressing from phonological words through to conceptual propositions are followed in reverse to get from propositions to words.
These assumptions are not self-evidently true, but anything else would be more complicated. We’ll use them until they present insuperable problems.
The assumptions are bold but some problems immediately follow.
How does the unsequenced bundle of propositions get turned into a sequenced set of (word)__(word) junctions?
How do multiple propositions form one junction? For example GIVE / TENSE / PAST and GIVE / AGENT / NERO are needed to derive Nero__gave.
Many propositions may be active in cognition at one time. How is the scope of one sentence limited to what can be accommodated in the syntax?
How is it that the same set of propositions can produce different sentences? For example, arguments may be differently arranged.
(5) Nero is giving Olivia to Poppaea
(20) Nero is giving Poppaea Olivia
(11) Olivia is given to Poppaea by Nero
(12) Poppaea is given Olivia by Nero
And adjuncts may be differently arranged.
(5e) Nero is giving Olivia gladly to Poppaea
(5f) Nero is giving Olivia to Poppaea gladly
Does the whole set of propositions need to be in place before the sentence is constructed? Or can the speaker add things while the process is going on? Could that be why variants of (5) can have gladly included at the beginning or at the end?
A crucial point is that, for a bundle of propositions, the junctions activated are not independent of each other. Each must share one phonological word with another junction; in some cases each of a junction’s words is shared with a different junction.
(1) John kissed Lucy
The conceptual propositions that create sentence (1) are KISS / AGENT / JOHN, KISS / PATIENT / LUCY and KISS / TENSE / PAST. The junctions activated from these are John__kiss, kiss__Lucy and kiss__ed.
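This proposition-to-junction step can be sketched in code. To be clear, this is my illustration, not NG’s actual machinery: the relation-to-junction mapping (agent before the verb, patient after, PAST as an affix) and the small lexicon are assumptions made for this example only.

```python
# Hedged sketch: deriving sentence (1)'s junctions from its conceptual
# propositions. The relation-to-junction mapping and the lexicon are
# assumptions for illustration, not part of NG itself.

LEXICON = {"KISS": "kiss", "JOHN": "John", "LUCY": "Lucy"}

def junctions(propositions):
    """Map HEAD / RELATION / DEP propositions to (left, right) word junctions."""
    result = []
    for head, relation, dep in propositions:
        if relation == "AGENT":                      # agent precedes the verb
            result.append((LEXICON[dep], LEXICON[head]))
        elif relation == "PATIENT":                  # patient follows the verb
            result.append((LEXICON[head], LEXICON[dep]))
        elif relation == "TENSE" and dep == "PAST":  # affix junction
            result.append((LEXICON[head], "ed"))
    return result

props = [("KISS", "AGENT", "JOHN"),
         ("KISS", "PATIENT", "LUCY"),
         ("KISS", "TENSE", "PAST")]
print(junctions(props))
# [('John', 'kiss'), ('kiss', 'Lucy'), ('kiss', 'ed')]
```

Each proposition contributes exactly one junction here; the harder case raised above, where several propositions jointly yield one junction (as with Nero__gave), is not captured by this simple mapping.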
This is more elaborate than has been shown in LanguidSlog up to now because I’ve stubbornly ignored verb tense and aspect. The reason was that dealing with the irregular verb give is more complicated (and I’ll not try untangling it here). However, give provided so many useful examples because it allows all three arguments to be semantically similar; it allows both to-dative and double-object forms; and it allows up or in as a particle.
These three junctions can only be arranged one way – as long as we assume that there is a type of junction (like kiss__ed) where the two words are inseparable, the dependent being an affix to the parent. Thus John kiss Lucy ed is not possible.
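The claim that the junctions dictate a unique order can be made concrete. In the sketch below, which is my own reading rather than NG’s, an ordinary junction means “left word precedes right word” and an affix junction fuses its two words into one inseparable unit before ordering happens; that fusion is exactly what rules out John kiss Lucy ed.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hedged sketch: ordinary junctions read as precedence constraints,
# affix junctions read as fusing their words into one unit.
# Both readings are assumptions for illustration.

def linearise(junctions, affixes):
    fused = {head: head + affix for head, affix in affixes}  # kiss + ed -> kissed
    name = lambda w: fused.get(w, w)
    graph = {}                                   # node -> set of predecessors
    for left, right in junctions:
        graph.setdefault(name(right), set()).add(name(left))
        graph.setdefault(name(left), set())
    return list(TopologicalSorter(graph).static_order())

order = linearise([("John", "kiss"), ("kiss", "Lucy")], affixes=[("kiss", "ed")])
print(" ".join(order))   # John kissed Lucy
```

Because kiss and ed are fused before the precedence constraints are applied, the affix can never be stranded at the end of the sentence; the two remaining junctions then admit only one ordering.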
What has to be shown is that the position of every word in a sentence is attributable to the propositions in the bundle – and perhaps to one or other of the propositions becoming active in cognition after the sentence-forming process starts (see comments above on adjuncts).
There is not space to do that here. We’ll have to rely on the sort of reasoning that LanguidSlog used to get started: there is no ghost-in-the-machine, so the junctions themselves must dictate the order in which they are presented. Admittedly, looking back through the blog, many sentences – perhaps nearly all of them – involve junctions that suffice to identify the propositions giving the meaning, yet do not include everything that is needed when the process runs the other way.
Let’s look at a simple example.
The patient in sentence (1) can be topicalised.
(2) Lucy John kissed
The conceptual propositions that drive production must still include KISS / AGENT / JOHN, KISS / PATIENT / LUCY and KISS / TENSE / PAST, but two other things happen. One is that Lucy__kiss is invoked instead of kiss__Lucy. The other is that something ensures that Lucy is positioned before John.
There must be an additional proposition. Let’s assume this is TOPIC / CHANGE TO / LUCY. I’ve not given this much thought. A comprehensive approach to sentence nuance – through modality etc – may need a different approach but we can at least be confident that only one more proposition is needed to produce (2) instead of (1).
The mechanism depends on differentiating C (category) concepts. Sentence (1) has:
The M / R / M proposition is shaded because, in production, this must pre-exist. Sentence (2) has:
Usually the Lucy word has CX. But only the Lucy / CY / lucy word is available for TOPIC / CHANGE TO / LUCY:
There is still a problem. How is the position of Lucy in front of John ensured?
It would be foolish to stipulate that the words in a junction like Lucy__kiss must be separated by something else. A ghost-in-the-machine would be required. Instead we have to rely on the principle that everything is achieved through the junctions created for what the speaker is communicating.
One possibility is that, for sentence (2), there is another junction – Lucy__John. This prevents John Lucy kissed, giving instead the word order the hearer needs to pick up the speaker’s meaning correctly.
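Under the same precedence reading of junctions used above (my assumption, not NG doctrine), the extra Lucy__John junction is exactly what pins the order down. With kiss__ed fused as kissed, a sketch:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hedged sketch: junctions for sentence (2), read as precedence
# constraints (an assumption for illustration). kiss__ed is already
# fused as "kissed"; the extra Lucy__John junction forces Lucy first.
junctions = [("Lucy", "kissed"),   # Lucy__kiss (invoked instead of kiss__Lucy)
             ("John", "kissed"),   # John__kiss
             ("Lucy", "John")]     # the proposed extra junction

graph = {}                         # node -> set of predecessors
for left, right in junctions:
    graph.setdefault(right, set()).add(left)
    graph.setdefault(left, set())

print(" ".join(TopologicalSorter(graph).static_order()))   # Lucy John kissed
```

Without the third junction, nothing constrains Lucy and John relative to each other – both orders would satisfy the remaining constraints – which is precisely why John Lucy kissed needs to be blocked by something.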
Of course, this junction can be ignored for comprehension – at least as LanguidSlog has dealt with this construction previously. It certainly couldn’t identify the thematic role of the first NP if the verb is ditransitive.
(13) Olivia Nero is giving to Poppaea
(14) Poppaea Nero is giving Olivia
At first glance the Lucy__John junction looks impossible to achieve without a ghost. The problem deserves more thought than I’ve given it so far. However my provisional idea is that the topic-change proposition is not actually TOPIC / CHANGE TO / LUCY but JOHN / TOPIC REPLACED BY / LUCY – which assumes that by default the sentence subject is the topic.
Is that it?
Having spent fifty thousand words on comprehension, devoting only one thousand to production looks parsimonious. However I am confident that this piece is a solid foundation for further work.
Next week, the penultimate piece will float some wild ideas about computable meaning.