NG4. How could a sentence structure be formed?

[Image: Mortice and tenon - sentence structure]

LanguidSlog 3 outlined the items needed in a sentence structure.  It was rather long but not too difficult to follow, I hope.  This one is much shorter but nonetheless may require some thought.  It looks more closely at how junctions are formed in real time, concluding that the story is a bit implausible.

Structure recap

LS3 showed the structure of ‘John kissed Lucy’ being formed thus:

[Diagram: John kissed Lucy (junctions)]

Remember, the word-items are part of the hearer’s permanent language knowledge.  It is the junction-items – c, e and f – that must be created to form a structure for the sentence.  The issue is about how relevant parts of permanent words get into transient junctions.

The way the diagram is drawn implies that ‘semantic’ material is copied from permanent to transient items, and then perhaps from one transient to another.

Efficiency

Exactly how the meaning of a word is held in the mental architecture is not known.  However it must be distributed – enabling parts to be reused for other purposes – rather than localised.  Otherwise the subtlety with which humans can conceptualise could only be achieved in an absurdly inefficient way.

If that’s not clear, consider a taxonomy – say, ANIMAL / MAMMAL / DOG / LABRADOODLE.  Each of these concepts has all the properties of the one to its left, plus some more specific properties.  If, for example, the LABRADOODLE concept were freestanding and actually included all the properties of DOG, those properties would need to have been copied from DOG – which would itself contain all the properties of MAMMAL, ditto ANIMAL.  And a further copy would be needed for each and every other type of pooch.

It’s safe to assume that ‘specific properties’ of a concept at one level are somehow picked up by all subordinate concepts without copying.  That’s why ‘meaning…must be distributed’.
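
To make that concrete, here is a minimal sketch in Python – purely illustrative, since nothing about the mental architecture is claimed to be code – of how level-specific properties can be picked up by subordinate concepts without copying:

    class Concept:
        """One level in a taxonomy; stores only its level-specific properties."""
        def __init__(self, parent=None, **properties):
            self.parent = parent
            self.properties = properties

        def lookup(self, name):
            # Walk up the taxonomy until the property is found -- nothing
            # is ever copied down into subordinate concepts.
            node = self
            while node is not None:
                if name in node.properties:
                    return node.properties[name]
                node = node.parent
            raise KeyError(name)

    ANIMAL = Concept(alive=True)
    MAMMAL = Concept(ANIMAL, warm_blooded=True)
    DOG = Concept(MAMMAL, barks=True)
    LABRADOODLE = Concept(DOG, coat="curly")

    print(LABRADOODLE.lookup("warm_blooded"))  # True, found via MAMMAL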

But ‘distributed’ means that copying from one item in the structure to another would be absurdly inefficient.

In contrast, with computer architecture, a software designer could simply load a transient item with pointers leading back to the appropriate permanent items and avoid any need to copy.
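
In that spirit, a small sketch of the pointer alternative (illustrative names only; this is not claimed to be how NG, or the brain, actually implements junctions):

    # Permanent word-items: part of the hearer's language knowledge.
    LEXICON = {
        "John":   {"category": "noun", "concept": "JOHN"},
        "kissed": {"category": "verb", "concept": "KISS"},
        "Lucy":   {"category": "noun", "concept": "LUCY"},
    }

    class Junction:
        """A transient item holding pointers to permanent items, not copies."""
        def __init__(self, left, relation, right):
            self.left = LEXICON[left]      # a reference back into the lexicon
            self.relation = relation
            self.right = LEXICON[right]    # likewise: nothing is copied

    c = Junction("John", "AGENT", "kissed")
    f = Junction("kissed", "PATIENT", "Lucy")

    # Both junctions share the single permanent item for 'kissed'.
    assert c.right is f.left

The transient items stay small because the ‘semantic’ material never leaves the permanent items.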

Where next?

For LS3, apparently reasonable assumptions were made about how structure must be held for participation in sentence processing.  However the questionable efficiency of that processing suggests there is something wrong.  Can that be sorted when we think about mental architecture in the next piece?

Mr Nice-Guy

 

18 comments

  1. SH says:

    I look forward to the next post! In particular whether you believe the human brain to be behaving in such a way as to increase efficiency and reduce redundancy. That seems an implied belief here, but to me is by no means self-evident…
    Or do you simply mean that inefficiency could only be tolerated below an ‘absurd’ level?

  2. Carsten says:

    I agree, a pretty good point! Especially as many brain cells are, to my knowledge, considered “unused” by (some) cognitive scientists. I can’t say whether this claim holds or whether it is related, but it seems the brain has enough capacity to handle things rather inefficiently.
    Going back to linguistics specifically, I think this is related to the observation that frequently used morphological inflections seem to be stored as they are (‘I WAS at home.’) while for less frequently used structures, rules have to be applied in order to dynamically generate the output (‘I CREPT the hell out of my reader’).

  3. Mr Nice-Guy says:

    Thanks, SH. Thanks, Carsten.
    No, I don’t believe the human brain to be behaving in such a way as to increase efficiency. But I do believe the brain would have evolved so as to have these performance characteristics.

    SH’s closing question seems to introduce ideas about how the brain behaves and I can’t usefully answer it. Evolution delivers best performance, not tolerable performance. Available brain cells must therefore hold conceptual knowledge in the most efficient way.

    I do believe that we are able to conceptualise, say, “1955 VW Beetle in blue” as distinct from “1955 VW Beetle in red” – without all the details about wheels, engines etc being repeated in separate concepts. Hence my assertion about concepts being distributed.
    OK, you might quibble about whether these are unitary concepts. But ‘1955 VW Beetle in blue’ is one noun phrase and the concept, unitary or otherwise, must be distributed.

    Carsten’s second para … Yes, in an upcoming LanguidSlog piece you’ll see that I’m OK with that ‘observation’ about how language knowledge is stored. But efficiency wasn’t a major issue while drafting that piece.

    • SH says:

      I disagree with “Evolution delivers best performance, not tolerable performance. Available brain cells must therefore hold conceptual knowledge in the most efficient way.” Evolution trends towards local maxima for overall fitness, but certainly doesn’t deliver best possible performance for a given behaviour. I look forward to the next post!

      • Mr Nice-Guy says:

        Thanks, SH. Keep them coming!

        In my last response I was thinking about the evolution of mental architecture generally – or at least across higher mammals. That I assumed to be ‘best’. The extent to which the architecture is implemented varies because each species has an optimum set of characteristics for its niche. So, lots of the architecture for you but rather less for your dog. But your dog smells better.

        You seemed to suggest that an ‘absurdly inefficient’ architecture might be ‘tolerated’. I hadn’t intended to imply that efficiency (in this context) is scalar. My ‘absurd’ meant the idea of localized concepts is plainly untenable.

        Consider “John has a 1955 VW Beetle in blue”. The Chomskyan orthodoxy is that a constituent [has [… VW …]] is formed; this can then participate in higher-level constituents. LanguidSlog 4 is sceptical – unless there were a localized concept for the NP “1955 VW Beetle in blue”.

        Do you think that could be possible?

  4. SH says:

    I’m not intending to defend Chomsky’s stuff, for all that I wish it were true (damn pesky human beings always ruin nice neat logical systems)!

  5. KA says:

    I am not sure I understand the point.

    You seem to take offense at the idea that a previously established relation has to be forgotten in the process of parsing the sentence.

    The reason you think this is objectionable, I think, is because it is only possible with addressable storage and you believe, wrongly in my opinion, that the human mind doesn’t have such a thing. We have discussed this issue in my comments to post 1 and I will leave it to one side.

    Instead I will tackle the question whether this forgetting of an established relation results from using tree structures or whether it results from the particular coding you chose. I would claim that the latter is the case. You chose an encoding which makes nodes, together with their downward edges, into things. But is that the way syntacticians view trees? I don’t think so. Trees are graphs (acyclic, directed, etc.) with a transitive partial ordering (precedence), a transitive partial ordering (dominance), category labels on the nodes (‘colors’ in mathematician speak), and the proviso that any two nodes either stand in the dominance relation or in the precedence relation. (There are a few more conditions for a well-formed tree, like the non-tangling condition, but let’s not go there right now…) If structure in this sense is to play a role in processing, then these relations will have to be recovered during processing.

    Let me reiterate what I understand from your post: You seem to take offense at the idea that a particular junction needs to be forgotten in the process of parsing ‘John kissed Lucy’. Suppose that instead of your notation where parts of the tree picture are treated as ‘items’ we simply try to recover the relations defined by the tree. Will your criticism still hold?

    A parser like that will encounter ‘John’ and ‘kissed’ and maybe write something like this to its memory:

    Nodes: 1, 2, 3, 4, 5

    Content: <1, John>, <2, NP>, <3, S>, <4, VP>, <5, kissed>

    Dominance: <3, 2>, <2, 1>, <3, 1>, <3, 4>, <3, 5>, <4, 5>

    Precedence: <1, 5>, <1, 4>, <2, 4>, <2, 5>

    It is (deliberately) clunky to give the full extent of the dominance and precedence relations. We could agree to keep a more minimal representation at any time, one that is sufficient to derive the full relations as the transitive closure of the set.

    Now the word ‘Lucy’ comes in and the parser notices that it has to be part of the VP so amends as follows, without any deletions.

    Nodes: 1, 2, 3, 4, 5, 6, 7

    Content: <1, John>, <2, NP>, <3, S>, <4, VP>, <5, kissed>, <6, NP>, <7, Lucy>

    Dominance: <3, 2>, <2, 1>, <3, 1>, <3, 4>, <3, 5>, <4, 5>, <3, 6>, <3, 7>, <4, 6>, <6, 7>

    Precedence: <1, 5, 7>

    So, I have parsed this without having to forget any information along the way, simply by recoding your trees. (I agree that I have made blatant use of data structures and pointers. As you know, I am happy to grant your point about addressable storage: I think the human mind has it.) I’m trying to understand whether there is a different point about the forgetting of relations lurking here and whether my recoding of the process has addressed your issue. If not, what is it (addressable storage to the side)?
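
    For concreteness, here is a minimal runnable transcription of that encoding in Python – an illustration only, not anyone’s actual parser – which makes the add-only character explicit:

        # Parse state after 'John kissed': nothing but sets of relations.
        nodes = {1, 2, 3, 4, 5}
        content = {(1, "John"), (2, "NP"), (3, "S"), (4, "VP"), (5, "kissed")}
        dominance = {(3, 2), (2, 1), (3, 1), (3, 4), (3, 5), (4, 5)}
        precedence = {(1, 5), (1, 4), (2, 4), (2, 5)}

        # 'Lucy' comes in: amend by set union -- no deletions anywhere.
        nodes |= {6, 7}
        content |= {(6, "NP"), (7, "Lucy")}
        dominance |= {(3, 6), (3, 7), (4, 6), (6, 7)}
        precedence |= {(5, 7)}   # minimal; <1, 7> etc. follow by transitivity

        def transitive_closure(rel):
            """Recover the full relation from a minimal representation."""
            closure = set(rel)
            while True:
                step = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
                if step <= closure:
                    return closure
                closure |= step

        assert (4, 7) in transitive_closure(dominance)   # the VP dominates 'Lucy'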

    • Mr Nice-Guy says:

      Thanks, KA.

      You’ve forced me to re-read LS4 – and LS3 and LS5 also. I don’t seem to have been too fussed about deletion of the initial John__kissed junction. So I’m relaxed about conceding that your tactics would work for sentence (1).

      But where the tactics don’t work I would indeed be arguing ‘no addressability!’ So you are tempting me to challenge you to reveal the set of rewrite rules that would allow every sentence to be managed in this way.

      Please don’t forget that I’ve already asked what you think is in the lexicon and whether you think there is a stored program.

      The question about the stored program urgently needs clarification because of your use of ‘parser…will encounter’, ‘write…to its memory’ and ‘parser notices that [Lucy] has to be part of the VP’.

      I hope you’re now able to read on and continue your excellent commentary.

      And I’d like to ask a final question, for which the context is LS2. Do you think that structure gives meaning directly? Or is structure ‘parsed’ to give meaning in some other form?

      Best regards.

  6. KA says:

    Thanks for your comments. This is getting fun! I’ll continue to sidestep your questions about the totality of rules that will make parsing the way I describe it possible, what is in the lexicon, whether I think there is a stored program, etc. and continue the ‘John kissed Lucy’ challenge.

    You seem to be running on the idea that a given item can be related to its leftward context as soon as it is encountered, with at most a finite amount of ambiguity. The amount of ambiguity must be determinable by looking backwards! (That’s how I understand the way you deal with to-datives and ditransitives a few entries down the road.) If I am wrong, please correct me.

    Here is why that won’t work: In addition to ‘John kissed Lucy’ there is also the sentence ‘John kissed Lucy and Freddy.’ So assuming that we had

    Nodes: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11

    Content: <1, John>, <2, NP>, <3, S>, <4, VP>, <5, kissed>, <6, NP>, <7, Lucy>, <8, NP>, <9, BoolP>, <10, and>, <11, Freddy>

    Dominance: <3, 2>, <2, 1>, <3, 1>, <3, 4>, <3, 5>, <4, 5>, <3, 6>, <3, 7>, <4, 6>, <6, 7>, <4, 8>, <8, 6>, <8, 9>, <9, 10>, <9, 11> (abbreviated – compute the transitive closure to get the full dominance relation)

    Precedence: <1, 5, 7, 10, 11> (abbreviated to precedence of terminals)

    What has happened in this step is that the NP ‘Lucy’ has moved down a step when ‘and Freddy’ came in so that [NP Lucy ] is no longer the sister of the verb. However, the dominance relations have been maintained as before. [VP kissed [NP Lucy] ] became [VP kissed [NP [NP Lucy] [BoolP and [NP Freddy ]]]].
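
    In the set terms of my earlier sketch (again purely illustrative Python), the amendment is a pure union; the fact <4, 6> stays in the set and stays true, since the VP still dominates NP 6, now via NP 8:

        # Dominance after 'John kissed Lucy' (node numbers as above).
        dominance = {(3, 2), (2, 1), (3, 1), (3, 4), (3, 5), (4, 5),
                     (3, 6), (3, 7), (4, 6), (6, 7)}

        # 'and Freddy' comes in: NP 6 is re-mothered under NP 8 by adding
        # relations only -- nothing previously recorded is deleted.
        dominance |= {(4, 8), (8, 6), (8, 9), (9, 10), (9, 11)}

        assert (4, 6) in dominance and (8, 6) in dominance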

    (Digression: ‘and’ creates a bit of a stumbling block, actually. When ‘and’ comes in, the parser does not know whether we will have sentence-level, VP-level, or NP-level coordination: ‘John kissed Lucy and {Freddy kissed Mary | danced with her | Freddy}’. I don’t know how to deal with ‘and’ in my scheme of things without finite branching, bounded by the height of the parse tree – but I don’t see how you can deal with ‘and’ without branching either. I guess that for you this is more embarrassing because, while finite, this type of branching is unbounded: in sentences with more structural depth there are many more opportunities to attach ‘and’. This property, of course, gives rise to ambiguities. They are very frequent in the wild though rarely noticed by language users, because context helps. Anyway, ‘and’ creates a contained problem because the number of ambiguities depends on the left context of the parser – on what has happened before – and this, of course, is in line with your thinking: the parser’s actions are determined by what comes before. So let us set ‘and’ aside, though I would like to hear your opinion about this problem – or a pointer to the blog entry where you discuss it.)

    So here comes the real problem. The sentence could continue as follows: ‘John kissed Lucy and Freddy’s daughter’. This is relevant to you, because the parser now has to realize that creating the patient relation between ‘kiss’ and ‘Lucy and Freddy’ was a mistake. John does not kiss Lucy and Freddy. He kisses their daughter. Backtracking, of course, is anathema to your single parse left-right way of doing things.

    In phrase structure terms this means that what has to happen is that the NP ‘Lucy and Freddy’ moves from the complement position of the verb to the possessor position within the actual complement of the verb: [PossP [NP Lucy and Freddy] [Poss’ ‘s [NP daughter]]]. My schema of encoding trees accomplishes this without so much as having to blink. We simply add to the representation we had for ‘John kissed Lucy and Freddy’ the new relations needed:

    Nodes: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16

    Content: <1, John>, <2, NP>, <3, S>, <4, VP>, <5, kissed>, <6, NP>, <7, Lucy>, <8, NP>, <9, BoolP>, <10, and>, <11, Freddy>, <12, PossP>, <13, Poss’>, <14, ’s>, <15, NP>, <16, daughter>

    Dominance: <3, 2>, <2, 1>, <3, 1>, <3, 4>, <3, 5>, <4, 5>, <3, 6>, <3, 7>, <4, 6>, <6, 7>, <4, 8>, <8, 6>, <8, 9>, <9, 10>, <9, 11>, <4, 12>, <12, 8>, <12, 13>, <13, 14>, <13, 15>, <15, 16> (abbreviated – compute the transitive closure to get the full dominance relation)

    Precedence: <1, 5, 7, 10, 11, 14, 16> (abbreviated to precedence of terminals)

    Of course, the sentence could continue as follows: ‘John kissed Lucy and Freddy’s daughter’s best friend.’ We’d have to revise the relations again: It is not the daughter that is in the patient relation with ‘kiss’ but the best friend. And that can continue as follows: ‘John kissed Lucy and Freddy’s daughter’s best friend’s room-mate.’ And so on and so forth for ever more, since possessors are recursive in English. If you don’t like the way I have split up the noun and the possessive, consider the following variant of the argument: the sentence could have continued with ‘John kissed Lucy and Freddy’s son.’ Which could have continued as ‘John kissed Lucy and Freddy’s son and daughter’s best friend.’ Which could have continued as ‘John kissed Lucy and Freddy’s son and daughter’s best friend and classmate’s room-mate’ etc.

    This seems to me to be a problem for your approach if I have understood anything. At the moment when the word Lucy is encountered, there is an unbounded FORWARD looking ambiguity here. Can you explain to me how you propose to deal with this without looping? You can’t put all possibilities in the table when you encounter ‘Lucy’ because there are unboundedly many of them because of the recursive property of possessors. Hm…

    As ever, looking forward to your reply.

    • Mr Nice-Guy says:

      Thanks yet again, KA. When do you sleep?

      Your ‘a given item can be related as soon as it is encountered to its leftward context’ is not always true. The item may have to wait until something appropriate appears to the right. Trivially that’s the case with the sentence-initial word.

      I’ll say YES to the rest of that para but I don’t like ‘looking backwards’.

      Coverage of coordination will start in LS33 and continue for several weeks. That should address everything in your comment before ‘the real problem’.

      In my network grammar, coordination involves the delivery of propositions in the form (for example) APPLE / COORD / BANANA from the string ‘apples and bananas’. But it’s not possible to form LUCY(acc) / COORD / FREDDY(gen). Inhibiting incompatible word-pairs is needed for quite simple sentences; for example, distinguishing the relations in ‘John gave apples bananas cherries damsons and…’ from those in ‘John gave Mary bananas Olivia damsons and…’.
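
      A toy sketch of that inhibition – entirely illustrative, and not how NG is actually realised – might be:

          # Toy illustration: a COORD proposition forms only when the
          # two words' case concepts are compatible.
          def compatible(a, b):
              return a["case"] == b["case"]

          APPLE  = {"concept": "APPLE",  "case": "acc"}
          BANANA = {"concept": "BANANA", "case": "acc"}
          LUCY   = {"concept": "LUCY",   "case": "acc"}
          FREDDY = {"concept": "FREDDY", "case": "gen"}

          for left, right in [(APPLE, BANANA), (LUCY, FREDDY)]:
              if compatible(left, right):
                  print(left["concept"], "/ COORD /", right["concept"])
              else:
                  print("inhibited:", left["concept"], "+", right["concept"])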

      I haven’t had the time to work through your sentences but I’m pretty sure they’ll analyse OK. NG solutions aren’t always obvious immediately. Sometimes I have to invent a new ‘mechanism’; for example, I recently decided it’s possible to change the C (category) concept for a word when it occurs in certain types of junction. But I’ve never been tempted to abandon the one-pass, left-to-right principle. It’s really satisfying working with such severe constraints.

      Interestingly, you don’t mention the ambiguity in ‘John kissed Lucy and Freddy’s daughter’. You assume that there is one act of kissing, not two. When I was at school we were told that the one-act meaning should be achieved by ‘John kissed Lucy’s and Freddy’s daughter’. However what you wrote is indeed current usage so the issue is what happens in human sentence processing and how the ‘fully automatic’ result may be mediated by the hearer’s existing cognitive state.

      For your sentence my hunch is that the ‘two act’ meaning is delivered. KISS / PATIENT / LUCY is delivered but abandoned if in conflict with prior knowledge. KISS / PATIENT / FREDDY’S DAUGHTER conveys enough meaning because the hearer already has LUCY / PARTNER / FREDDY.

      If that’s generally true for unconscious ambiguity, sentence processing must over-deliver in such cases. Under-delivery would require all sorts of complicated logic and – horror! – a stored program.

      What do you think?

  7. KA says:

    Dear NG, thanks, as ever for your response.

    The reason I didn’t comment further on additional ambiguities introduced by ‘and’ (the two-kissings reading) was that I had decided to put ambiguities introduced by ‘and’ aside except in the digression. The only reason I had ‘and’ in the examples in the first place was to garden-path your parser. It sees a noun (phrase – I know, you don’t like those) that looks nominative/accusative and decides to link it up as the object with the verb. Then comes ‘and’, then another noun (phrase) and then the possessive marker, and whoops, the parser has to backtrack.

    Why was I doing this? In your argument in these early posts you seem to criticize two distinct points. The use of addressable storage and the necessity to backtrack. I explained in my comments to your first post that I think that our brains have addressable storage and if neurobiologists tell us that they can’t find it, well, then they’ll have to keep looking! It’s got to be there.

    The other issue seemed to have to do with backtracking, with the parser making a commitment and then taking it back. I argued that your examples are unsuitable to show the necessity of this backtracking because I could avoid it by re-coding trees in a different and more fundamental way. Then I introduced the ‘Lucy and Freddy’s daughter’ example to show that your parser will also make commitments that it has to take back.

    I could have based the argument on the verb ‘see’ instead, but the problem there seems to involve only a smaller number of forward-looking ambiguities:

    John saw Lucy.
    John saw Lucy had left.

    Again, in the first sentence ‘Lucy’ is the object of ‘see’ but in the second it is not. (‘See’ is interesting, because it allows many many different kinds of complementation.) Obviously, John doesn’t need to see Lucy to see that she has left. Therefore, if the parser commits to creating a relation between ‘see’ and ‘Lucy’ at the point where the string is ‘John saw Lucy’, then this commitment will have to be taken back when further material comes in because there is no seeing relation between John and Lucy in the final sentence.

    I picked possessors because they are recursive and so the problem is harder. And I added co-ordination to make the garden-path nature of the problem more obvious.

    In the end it seems to me that everybody is even and there is no argument so far.

    You seem totally unfazed by my arguments, so you must think that you still have presented an argument that shows the inferiority of phrase structure. I don’t see it. Can you tell me what it is?

    (The following passage in your reply made me grin. “When I was at school we were told that the one-act meaning should be achieved by ‘John kissed Lucy’s and Freddy’s daughter’. However what you wrote is indeed current usage” You attribute the reading that I focus on to *current* usage. At the same time you say that you were told to achieve a particular reading with a particular structure. Why do you think they told you that? Probably, because you kids didn’t follow the rule to begin with. When kids follow a rule automatically, they don’t have to be told about it. As in: people will tell you not to dangle your prepositions but they never tell you not to dangle your articles. Why? People dangle prepositions but they don’t dangle articles. So, your comment actually indicates that current usage and the usage of the time when you were in school agree and that, possibly, your teachers tried to drill it out of you. I thank you for your anecdote in support of the view that the one-kissing interpretation is, indeed, there.)

    As ever, KA

    • Mr Nice-Guy says:

      Thanks, KA. Yes, I am unfazed.

      ‘John kissed Lucy and Freddy’s daughter’ is ambiguous. OK, the two-act reading can be ensured prosodically (emphasis on ‘and’) and the one-act reading can be ensured as in my anecdote. NG can deal with these variants without ‘backtracking’.

      Without such tricks, the sentence is ambiguous but unexceptionable. The hearer/reader unconsciously disambiguates it – one way or other. I have suggested a plausible account of that.

      Do you doubt that the sentence is ambiguous? Written, it clearly is. Spoken? Well, we’d need to experiment on native speakers to be sure. If it is ambiguous, your theory must also explain how the hearer/reader disambiguates.

      By the way, I didn’t respond to the point, in your earlier comment, about recursion. This is not a problem for me. It should be obvious that while my rules-of-combination don’t allow (verb)__(genitive noun) junctions, they do allow (genitive)__(genitive) junctions.

      Let’s move on to ‘John saw Lucy had left’. First of all, my work hasn’t got to subordinate clauses yet, so the following is a bit raw.

      What I have already shown is how the role of a word may be left uncertain until subsequent words enable one or other option to be taken. For example, in ‘Nero gave Poppaea…’, POPPAEA may be THEME or GOAL; resolution depends on whether an NP or a PP follows. What’s different in your sentence is that the relation between SEE and LUCY may be THEME or COMPLEMENTISED BY; sentence-end or another verb resolves the thing one way or the other. (Yes, it’s a bit more complicated than that. The COMPLEMENTISED BY junction ought to be with the subordinate verb, not its subject. But I haven’t worked out the neatest way to allow for the complementiser to be either present or absent.)
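
      A rough sketch of that intermediate uncertainty (illustrative only – the real machinery is junction-based, not a function):

          # Illustrative only: POPPAEA's role stays open until the next
          # constituent arrives and resolves it one way or the other.
          def resolve(pending_noun, next_constituent):
              if next_constituent == "NP":    # 'Nero gave Poppaea a gift'
                  return (pending_noun, "GOAL")
              if next_constituent == "PP":    # 'Nero gave Poppaea to the lions'
                  return (pending_noun, "THEME")
              return (pending_noun, None)     # still unresolved

          print(resolve("POPPAEA", "NP"))    # ('POPPAEA', 'GOAL')
          print(resolve("POPPAEA", "PP"))    # ('POPPAEA', 'THEME')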

      You ask why I think phrase structure is inferior. It’s fair to say that I have presented some arguments. I’ll allude to just one of them: What’s the relationship between phrase structure and meaning? In the mind of a real language-user, not to a generative linguist.

      Best regards.

      • KA says:

        hm…. I think we are talking past each other. You say: “‘John kissed Lucy and Freddy’s daughter’ is ambiguous. OK, the two-act reading can be ensured prosodically (emphasis on ‘and’) and the one-act reading can be ensured as in my anecdote. NG can deal with these variants without ‘backtracking’.”

        We are talking past each other because my point is not about the ambiguity of the resulting sentence at all. Remember that I didn’t even mention this ambiguity originally. My point has to do with the incremental parsing of the sentence on the one-event reading. Your walk-through of the parse for ‘John kissed Lucy’ suggests that the parser, at the point when it reads the word ‘Lucy’, has no choice but to create the patient relation between ‘kiss’ and ‘Lucy’. Your presentation leaves no doubt that that is what happens. That’s fine for ‘John kissed Lucy’. But my example has the same prefix as that sentence. However, then the words ‘and’ and ‘Freddy’s’ (or maybe the three words ‘and’, ‘Freddy’ and ‘’s’) come in and, on the single-event reading – disambiguated in whatever way – the parser now has to retract its commitment to treating Lucy as the patient of kissing. The parser has backtracked. I thought you didn’t want that.

        Let me say it again. I am not interested in the two-event reading. It does not require undoing the patient relation between ‘kiss’ and ‘Lucy’ and therefore doesn’t serve my purpose of creating difficulties for your parser.

        So can you please step me through – step by incremental step – the parse of ‘John kissed Lucy and Freddy’s daughter’? My claim is that what you end up doing will either (a) be inconsistent with the way you presented the parse process for ‘John kissed Lucy’, or (b) involve backtracking, or both.

        Regarding your purported argument about syntactic structure not being relevant to semantics, I take it that the alternative to ‘constituent structure’ is ‘linear order.’ (I can sense a third alternative floating around: dependency. At the level of abstraction of this discussion, I don’t see why dependencies would count as alternatives to constituent structures. They seem mostly a different notation. You can transfer dependency structures to constituent structures by assigning a functional head to every different type of dependency, ‘color’ as mathematicians would have it. All of Cinque’s adverbial hierarchy can be recoded that way, for example.)

        If those two are the alternatives – ‘linear order’ vs. ‘constituent structure = dependency structure’ – then the example that we are discussing shows the relevance of syntactic structure to interpretation. The patient role is assigned by ‘kiss’ to its sister constituent, not to the next noun in its rightward context: [kissed Lucy] vs. [kissed [[Lucy [and Freddy]] [’s daughter]]].

        I’m sure I have missed something fundamental in what you think the argument is.

        As ever, KA

  8. Mr Nice-Guy says:

    KA, this epic correspondence is hugely useful for LanguidSlog. Thank you!

    The reason for persisting with ambiguity as the focus for the discussion is this. The ‘parser’ can deliver only one reading. If the other reading is what the speaker intended, the hearer’s cognition (downstream of language processing, that is) has to make the difference. In my last-but-one response I suggested the two-act reading would be easy for cognition to override. I’m therefore content to say that it’s the two-act reading that my NG delivers. And besides, the string ‘John kissed Lucy…’ will be intended as KISS / THEME / LUCY more often than not.

    In the discussion of ‘John saw Lucy had left’ in my last response, I briefly reminded you of how NG deals with intermediate uncertainty. I didn’t concoct a similar story for ‘John kissed Lucy and Freddy’s daughter’ to deliver the one-act reading: it would be complicated (for something far less frequent than alternating ditransitives); and it would still require an explanation of what happens if the speaker intended the two-act reading.

    In summary, I don’t deliver the one-act meaning, or contradict what I said elsewhere about ‘John kissed Lucy’, or back-track. You need to find a sentence beginning ‘John kissed Lucy…’ in which LUCY simply cannot be THEME. I can’t think of one quickly … but I’ll try again on my 10km walk tomorrow.

    Re ‘[my] purported argument about syntactic structure not being relevant to semantics…’, I’m sorry but can you remind me of the context, please.

    Best regards.

  9. KA says:

    Dear Mr NG,

    Thanks as ever for your thoughts on this ‘epic’ conversation.

    We are still discussing the following sentence

    “John kissed Lucy and Freddy’s daughter.”

    You offer this response to the challenge of the one-event reading that this sentence has: “The ‘parser’ can deliver only one reading. If the other reading is what the speaker intended, the hearer’s cognition (downstream of language processing, that is) has to make the difference. In my last-but-one response I suggested the two-act reading would be easy for cognition to override. I’m therefore content to say that it’s the two-act reading that my NG delivers.”

    From your perspective, then, the sentence is unambiguous. If the reading delivered by the parser is not the intended interpretation, then the interpretation delivered by the parser is overwritten by cognitive processes downstream of language processing. ‘Downstream’ can only mean that the actual syntactic form of the utterance is invisible to the downstream processing: the sequence of words (and its – dare I say it – structure) is not visible to downstream processes. I.e., what plays a role in the possibility of doing this overriding is unconnected to linguistic form and has to do only with content. I think you have committed yourself to that position.

    The downstream overriding will have to accomplish the following. It will have to turn a meaning representation that says “John is the agent of a kissing event that involved Lucy as the patient and John is the agent of a kissing event that involved Freddy’s daughter as the patient” into a representation that says “John is the agent in a kissing event that involved as the patient the daughter whose parents are Lucy and Freddy.”

    From the position that I think you are committed to, it follows that any sentence expressing the two-event reading can be overwritten with equal ease by downstream cognition, since that downstream module has access only to the meaning, not to the form, of the utterance. That means that any sentence with the two-event reading will be subject to overwriting – since it is content, and content only, not form, that is invoked in overwriting. This prediction turns out to be wrong. The following examples all describe the two-event reading, but not a single one of them can be overwritten by an interpretation where there was only one kissing event involving the daughter whose parents are Lucy and Freddy as the patient.

    John kissed Lucy and the daughter of Freddy
    John kissed Lucy and John kissed the daughter of Freddy
    John is the agent of a kissing event that involved Lucy as the patient and John is the agent of a kissing event that involved Freddy’s daughter as the patient

    Your prediction failed.

    Phrase structure on the other hand has an easy time explaining all of this. Among all the sentences we are looking at only the initial one contains the string ‘Lucy and Freddy’s daughter’, a string which is syntactically ambiguous between [[Lucy and Freddy]’s daughter] and [Lucy and [Freddy’s daughter]].

    An analysis with syntactic structure predicts that the original example is ambiguous and that the three new examples are not ambiguous in that way.

    We can drive the point home by observing the following examples:

    Lucy and Freddy’s daughter were kissed by John.
    Lucy and Freddy’s daughter was kissed by John.

    Both are unambiguous, so, again, the thesis that the one-event reading can be created from the two-event reading by overwriting fails. But we already knew that. The syntactic theory that attributes a structural ambiguity to the original example correctly associates the sentences above with the readings. The noun phrase involved in the one-event reading is [[Lucy and Freddy]’s daughter], a noun phrase with a single head ‘daughter’ in the singular. Thus we get singular agreement. The noun phrase in the two-event reading is [Lucy and [Freddy’s daughter]], a co-ordination of two singular NPs (‘Lucy’, ‘Freddy’s daughter’), which therefore triggers plural agreement on the auxiliary.
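
    The agreement facts can even be computed from the structures directly; a toy Python illustration (my encoding, invented for this comment, not a claim about any real grammar):

        # Toy encoding: agreement is read off the head of the subject NP.
        def number(np):
            if np[0] == "COORD":    # co-ordination of NPs -> plural
                return "plural"
            if np[0] == "POSS":     # possessive: agree with the head noun
                return number(np[2])
            return np[2]            # a bare noun carries its own number

        one_event = ("POSS", ("COORD", "Lucy", "Freddy"), ("N", "daughter", "singular"))
        two_event = ("COORD", "Lucy", ("POSS", "Freddy", ("N", "daughter", "singular")))

        print(number(one_event))   # singular -> 'was kissed by John'
        print(number(two_event))   # plural   -> 'were kissed by John'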

    Conclusions: Downstream overwriting is not responsible for the one-event reading. The one-event reading arises because of a syntactic ambiguity in the string ‘Lucy and Freddy’s daughter.’ Analyzing this ambiguous string with the tools of phrase structure produces the correct result both in terms of interpretation and in terms of agreement.

    It’s time to quote what you said in your penultimate response to my queries: “You ask why I think phrase structure is inferior. It’s fair to say that I have presented some arguments. I’ll allude to just one of them: What’s the relationship between phrase structure and meaning? In the mind of a real language-user, not to a generative linguist.”
    I think the examples discussed in this post show clearly that phrase structure matters for interpretation ‘in the mind of a real language-user.’ It’s facts like these that convince generative grammarians that the types of theories of meaning used in such theories are of the correct kind and necessary.

    (As a final aside, you promised in your previous response that on your 10k run you would think about the question whether there are any sentences that start with ‘John kissed Lucy’ and where Lucy CANNOT be the patient in the kissing event. I don’t know how successful you have been, but I can offer a very similar prefix for which it is easy to find such continuations: “John kissed the queen.” We can continue with “John kissed the queen of England’s butler.” In this sentence, the queen of England remains unkissed.)

    Looking forward to your response.

    • Mr Nice-Guy says:

      Great stuff – thanks!

      You quote back at me ‘The ‘parser’ can deliver only one reading. If the other reading is what the speaker intended, the hearer’s cognition (downstream of language processing, that is) has to make the difference’. I’m sorry if I’ve missed something in what you’ve written but don’t my words apply also under your approach?

      I think so. Your next para (‘From your perspective…’) applies to us both. And the one after that (‘The downstream overriding…’) applies in principle to both although the details are different. I’m tempted to skip the next para too, saying ‘If I’ve got a problem, you must share it’. But let’s see if we can solve the problem – saving both of us from ignominy.

      So let’s pick up from ‘From the position…’. Aren’t you looking rather narrowly, rather syntactically at the sentence? I said ‘downstream of language processing’ to emphasise that the rules-of-the-game are different: what I assume happens is that the conceptual propositions that language processing delivers resonate with or conflict with or simply add to the existing cognitive state of the hearer. That state will include long-term knowledge, current stuff from the discourse and current stuff from other channels.

      Thus, from the contentious sentence, proposition KISS / PATIENT / LUCY might conflict with LUCY / HAS PROPERTY / LONG DEAD. The incoming proposition is swamped (for want of a better word) because overall a coherent – and correct – meaning is cognised by the hearer. In your two-act sentences ‘John kissed Lucy and…’ (also the two passives later on) the same thing would not happen. From these sentences, if KISS / PATIENT / LUCY were ignored, cognition would only get KISS / AGENT / JOHN which is vacuous. The existence of LUCY / PARTNER / FREDDY doesn’t help at all. I suggest that KISS / PATIENT / LUCY can’t be ignored. It startles the hearer (assuming that context makes the kissing recent), swamping anything then delivered about the daughter.
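
      To caricature the idea in code (nothing more than a caricature – ‘swamping’ is a cognitive process, not a filter function):

          # Caricature of 'swamping': an incoming proposition that conflicts
          # with the hearer's existing cognitive state is dropped downstream.
          PRIOR = {("LUCY", "HAS PROPERTY", "LONG DEAD"),
                   ("LUCY", "PARTNER", "FREDDY")}

          def conflicts(prop, knowledge):
              subj, rel, obj = prop
              # Toy rule: a patient known to be long dead cannot have been kissed.
              return rel == "PATIENT" and (obj, "HAS PROPERTY", "LONG DEAD") in knowledge

          DELIVERED = [("KISS", "AGENT", "JOHN"),
                       ("KISS", "PATIENT", "LUCY"),
                       ("KISS", "PATIENT", "FREDDY'S DAUGHTER")]

          cognised = [p for p in DELIVERED if not conflicts(p, PRIOR)]
          print(cognised)   # KISS / PATIENT / LUCY has been swamped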

      As you say, no overwriting – but the prediction of overwriting for all two-act sentences was yours, not mine. To make myself as clear as possible: NG says the two-act reading will be cognised unless there is a conflict. I’m assuming for this discussion that the sentences intending one-act and two-act are phonologically indistinguishable.

      (Actually I suspect that there is a phonological distinction between ‘Lucy and Freddy’ as an ad hoc pairing and ‘Lucy’n’Freddy’ which can be learned as a permanent pairing and lexicalised as a single item. That is why ‘Lucy’s and Freddy’s’ can be replaced by ‘Lucy and Freddy’s’. Surely someone has done the psycholing on this sort of thing? I haven’t raised phonology before because it might have given the impression that I’m finding it hard to evade your challenges. But no – not yet anyway.)

      At this point in your comment I get a bit confused – particularly ‘An analysis with syntactic structure predicts that the original example is ambiguous’. Do you mean a pencil-and-paper analysis by a syntactician, or an automatic mental process? If the latter, your statement suggests the sort of ghost-in-the-machine I am uncomfortable with. What would the ‘ghost’ do with the prediction?

      Also ‘The one-event reading arises because of a syntactic ambiguity…’. Please explain. (Again apologies if you’ve covered it before but I just don’t get it.)

      Regarding the ‘relationship between phrase structure and meaning’, your comments miss the point. Of course generativists can show correlation between their depictions of sentences and what they already know, as language users, to be the meaning of those sentences. What I was asking was about how the structure (that you believe the ‘parser’ builds) becomes – or actually is – the meaning in the hearer’s mind.

      Regarding ‘the queen of England’ … Yes, that will do. You’ll see from the blog that I’ve done nothing on qualifiers and determiners: I just have bare ‘Nero, ‘Poppaea’ etc. So forgive me if this is again a bit raw. In your latest sentence, ‘the queen’ is the start of an NP string. There may be no problem if the NG analysis starts with KISS / (null) / QUEEN and then the succeeding words build up the correct set of propositions to convey the correct relationships between the concepts. And also nothing can be delivered to cognition until the end of the NP is signalled (in this case by end-of-sentence). So, no back-tracking.

      You’ll agree that it would have been too confusing to introduce this mechanism early in the LanguidSlog saga. I hope you don’t think I’m clutching at straws. Syntax presents problems and I have to propose a solution within the constraints I’ve adopted. The tactics adopted for this particular problem are not exceptional in any way.

      Phew …
