Edward Gibson's new book has arrived on my desk – 350 pages on cognitively-oriented dependency syntax from MIT Press! mitpress.mit.edu/978026255357... For me, this is an exciting development, as I came to love dependency syntax in the 1980s, through European authors such as Tesnière and Mel'čuk.
I may be a *little* biased but this 📘 is GREAT! If you ever found language structure interesting, but were turned off by implausible and overly complicated accounts, this book is 4U: a simple and empirically grounded account of the syntax of natural lgs. A must-read for lang researchers+aficionados!
When you find errors in the book, please let me know, so I can correct them for a future edition.
Chapter 10: The rational-inference or “noisy-channel” approach to communication in language. See, e.g., Levy (2008) and Gibson et al. (2013).
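The core idea can be sketched in a few lines of Python. This is a toy illustration, not the book's actual model: the comprehender infers the intended sentence s from a perceived string p by Bayes' rule, P(s | p) ∝ P(s) × P(p | s). The prior values and the simple edit-based noise model are my own illustrative assumptions; the example sentence pair is of the kind used in Gibson et al. (2013).

```python
# Toy noisy-channel inference (a sketch, not the book's actual model):
# the comprehender infers the intended sentence s from the perceived
# string p via Bayes' rule, P(s | p) ∝ P(s) * P(p | s).
# The prior and the edit-based noise model below are illustrative.

def edit_distance(a, b):
    """Word-level Levenshtein distance between two sentences."""
    a, b = a.split(), b.split()
    d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
         for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[-1][-1]

def noisy_channel_posterior(perceived, prior, noise_rate=0.1):
    """P(intended | perceived): each edit costs a factor of noise_rate."""
    scores = {s: p * noise_rate ** edit_distance(s, perceived)
              for s, p in prior.items()}
    z = sum(scores.values())
    return {s: v / z for s, v in scores.items()}

# An implausible literal string gets reinterpreted as a plausible
# near-neighbor (one inserted word away): the plausible sentence's
# much higher prior outweighs the one-edit noise cost.
prior = {
    "the mother gave the candle to the daughter": 0.99,  # plausible
    "the mother gave the candle the daughter": 0.01,     # implausible
}
posterior = noisy_channel_posterior(
    "the mother gave the candle the daughter", prior)
```

With these assumed numbers, the posterior favors the plausible "to the daughter" reading even though the literal string was perceived exactly.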
Chapter 9: Language and thought: Language is independent of thought. See Fedorenko et al (2024, Nature).
Chapters 7 and 8: Comparison with other frameworks. Prominently, Chomsky's arguments for the impossibility of learning depend on the existence of transformations in the grammar, which he believed were necessitated by, e.g., the auxiliary verb system in English.
But Sag et al. (2020) show that a transformational account of auxiliary verbs doesn't even make the right predictions. So I adopt a dependency-grammar version of what Sag et al. argue for (with no transformations).
Chapter 6: A case study of legalese. Part of what makes legalese so hard to understand is its long-distance dependencies.
Chapter 5: Cross-linguistic evidence on word order, showing that harmonic word orders (those with consistent head-dependent directions, and hence shorter average dependency lengths) are more common across the world's grammars. This summarizes work in Futrell, Levy & Gibson (2020, Language).
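The dependency-length measure behind these results is easy to compute. A minimal sketch, where each word stores the index of its head (the parse and sentence below are my own illustrative example, not data from the book):

```python
# Average dependency length of a sentence, computed from each word's
# head index (0-based; the root is marked -1). Shorter averages are
# what harmonic (consistently head-initial or head-final) orders tend
# to produce. The example parse is illustrative.

def avg_dependency_length(heads):
    """Mean linear distance between each non-root word and its head."""
    lengths = [abs(i - h) for i, h in enumerate(heads) if h != -1]
    return sum(lengths) / len(lengths)

# "the dog chased the cat":
# the->dog, dog->chased, chased = root, the->cat, cat->chased
heads = [1, 2, -1, 4, 2]
print(avg_dependency_length(heads))  # 1.25
```

Averaging this quantity over a corpus, and comparing attested orders with counterfactual reorderings, is the kind of comparison the chapter summarizes.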
Chapter 4: Evidence from language production and comprehension showing that people are sensitive to head-dependent distance. This started with my own work on the dependency locality theory (DLT) in Gibson (1998, Cognition).
Chapter 2: Preliminaries from cognitive science, about human language and methods to investigate it.
Chapters 1 and 3: Dependency grammar, a formalism independently discovered by Tesnière (1959), Hays (1964), Hudson (1984), and Mel'čuk (1988). Each of these researchers wanted the simplest grammar formalism that could cover all grammatical phenomena in human languages.
I speculate that LLMs, like children, learn a dependency-grammar representation of the input they are exposed to, leading to their impressive syntactic competence. I see the DG representation as a computational level of explanation, with the LLM providing an algorithmic level of explanation.
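Part of the formalism's simplicity is that a dependency analysis is just a function from each word to its head. A minimal sketch (the encoding convention and example parses are my own, not the book's notation):

```python
# A minimal dependency-grammar representation: each word stores the
# index of its head, with the root marked -1. The checker verifies
# the basic well-formedness conditions a dependency analysis must
# satisfy: exactly one root, and every word connected to it (no
# cycles). The example parses are illustrative.

def is_valid_tree(heads):
    """True iff `heads` encodes a single-rooted, acyclic tree."""
    if heads.count(-1) != 1:
        return False
    root = heads.index(-1)
    for start in range(len(heads)):
        seen, node = set(), start
        while node != root:
            if node in seen or not (0 <= node < len(heads)):
                return False
            seen.add(node)
            node = heads[node]
    return True

# "dogs chase cats": chase is the root; dogs and cats depend on it.
print(is_valid_tree([1, -1, 1]))  # valid tree
print(is_valid_tree([1, 2, 0]))   # no word marked as root: invalid
```

The same array-of-heads encoding is what standard dependency treebanks use (one head index per word), which is part of why the formalism is so easy to work with computationally.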
If you want a hard copy, you can buy it here:
www.penguinrandomhouse.com/books/798036...
Try the code MITP30 to get 30% off
or READMIT20 to get 20% off (if the first code doesn’t work anymore)
Please write a review on Amazon.com or Goodreads
New book! I have written *Syntax: A cognitive approach*, published by MIT Press.
This is open access; MIT Press will post a link soon, but until then, the book is available on my website:
tedlab.mit.edu/tedlab_websi...