
Recursion Isn’t Necessary for Human Language Processing: NEAR (Non-iterative Explicit Alternatives Rule) Grammars are Superior

Published in Minds and Machines.

Abstract

Language sciences have long maintained a close and supposedly necessary coupling between the infinite productivity of the human language faculty and recursive grammars. Because of the formal equivalence between recursion and non-recursive iteration, recursion, in the technical sense, is never a necessary component of a generative grammar. Contrary to some assertions, this equivalence extends to both center-embedded relative clauses and hierarchical parse trees. Inspection of language usage suggests that recursive rule components in fact contribute very little, and likely nothing significant, to linguistic creativity. Beyond this, if the productivity of human language is taken to be neither rigidly bounded nor infinite, then the need for any sort of iteration in generative grammars vanishes, and iteration can be replaced with a Non-iterative Explicit Alternatives Rule (NEAR) grammar. The knock-on effects of dispensing with recursive (or any iterative) grammar components are threefold: language diversity can be grounded simply in rule and lexicon combinatorics, with no potentially infinite dimensions derived from recursive or iterative rule components; the oddity of a vast, multiply infinite competence set of ‘grammatical but unacceptable’ productions disappears; and the development of a language faculty based on rules that eschew iterative components requires no appeal to ‘special’ mechanisms. On the broader front of the search for the mechanisms of mind, our analysis applies equally to proposals that take a recursive basis for mind to explain humanity’s great leap forward.


Figures 1–5 (images not included in this preview).


Notes

  1. Mithun quotes only the latter part of the full definition offered by Pinker and Jackendoff: “Recursion refers to a procedure that calls itself, or to a constituent that contains a constituent of the same kind” (p. 203). Although Pinker and Jackendoff do not themselves use the term “recursive structure,” it appears that Mithun interprets the “or” clause as meaning that another way to view recursion is in terms of a specific type of constituent structure.

  2. It is generally believed by computer scientists (although perhaps not indisputably proven) that in programming there is an absolute equivalence between recursion and loop iteration, i.e., what can be specified recursively can alternatively be specified iteratively, and vice versa. In human language the claimed recursive structures are limited. The complex, multi-stack, mutually recursive structures found in software have not been postulated for language. Consequently, a claim of equivalence is sound in the language domain: any recursively formulated grammar rule can be rewritten as an iterative equivalent.
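The equivalence claimed in this note can be illustrated with a toy example (ours, not the authors’): a tail-recursive possessive rule, NP → NP ’s N, rewritten as a loop over the same alternatives. This is the simple case; a center-embedded rule would require an explicit stack in the iterative version, but the equivalence still holds.

```python
# Toy illustration of the recursion/iteration equivalence discussed in
# note 2: stacked possessives generated by the rule NP -> NP "'s" N.

def np_recursive(nouns):
    # Recursive formulation: an NP contains a smaller NP.
    if len(nouns) == 1:
        return nouns[0]
    return np_recursive(nouns[:-1]) + "'s " + nouns[-1]

def np_iterative(nouns):
    # Iterative reformulation: a loop produces exactly the same strings.
    phrase = nouns[0]
    for noun in nouns[1:]:
        phrase += "'s " + noun
    return phrase

words = ["John", "brother", "dog"]
assert np_recursive(words) == np_iterative(words) == "John's brother's dog"
```

Both formulations generate the identical set of strings, which is the sense of equivalence the note relies on.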

  3. This common analogy contains a crucial weakness: the natural numbers contain only one dimension of infinity. Hence they can be strictly ordered with respect to size. Natural languages are not limited in this way (they contain, for example, a dimension of length infinity and one of structural diversity) and hence sentences cannot be simply ordered. It is precisely this difference that is being overlooked.

  4. This stricture does not apply to the so-called formal languages, such as those used to define grammars. The potential for confusion stems from the use of the word ‘language’ for two very different classes of phenomena: formal languages, which may include all sorts of extra apparatus, such as explicit looping constructs; and natural languages, which are primarily sets of strings of words (or perhaps sounds).

  5. Perfors et al. (2010) embrace a Bayesian approach that trades off simplicity against overgeneration. Because the evaluation metric for simplicity they use appears to count only the number of rules, ignoring processing costs (an important hidden aspect of recursive rules), they are more reluctant than we are to jettison recursion completely.

  6. A simple example of a recursive grammar that requires a restriction on rule ordering is one that defines the natural numbers 1, 2, 3, …: r1: <number> is <number> + 1; r2: <number> is 1. When recognizing a candidate <number>, r2 must be checked first; otherwise, with a potential <number> of 1, r1 generates the infinite regress 0, −1, −2, etc.
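A hypothetical recognizer (ours, for illustration only) makes the ordering point concrete: the base case must be tried before the recursive rule, or recognizing the value 1 regresses through 0, −1, −2, … without terminating.

```python
# Hypothetical recognizer for the two-rule number grammar of note 6.
# The base case is tried before the recursive rule; reversing that order
# (reducing 1 to 0 before checking whether it *is* 1) never terminates.

def is_number(n):
    if n == 1:                 # r2: <number> is 1  (base case, tried first)
        return True
    return is_number(n - 1)    # r1: <number> is <number> + 1

assert is_number(4)   # 4 -> 3 -> 2 -> 1: terminates at the base case
```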

  7. This aspect of theory of mind (viz., the ability to understand what is going on in the minds of others) conjures up recursive-looking structures, “I know that he knows that I know….”, that may also have very constrained levels of embedding, even in the best chess or poker players.

  8. We touched on this symbolic requirement earlier when, for example, we noted that a recursive grammar must have a non-terminal to recurse upon whereas an equivalent iterative grammar need not.

References

  • Chomsky, N. (1956). Three models for the description of language. IRE Transactions on Information Theory, 3, 113–124.

  • Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press.

  • Chomsky, N. (1995). The minimalist program. Cambridge, MA: MIT Press.

  • Christiansen, M. H., & Chater, N. (1999). Toward a connectionist model of recursion in human linguistic performance. Cognitive Science, 23, 157–205.

  • Christiansen, M. H., & MacDonald, M. C. (2009). A usage-based approach to recursion in sentence processing. Language Learning, 59, 126–161.

  • Corballis, M. C. (2007). Recursion, language, and starlings. Cognitive Science, 31, 697–704.

  • Corballis, M. C. (2011). The recursive mind: The origins of language, thought, and civilization. Princeton, NJ: Princeton University Press.

  • Elman, J. L. (1990). Finding structure in time. Cognitive Science, 14, 179–211.

  • Epstein, S., & Hornstein, N. (2004). Letter on ‘the future of language’. Language, 81, 3–6.

  • Evans, N., & Levinson, S. (2009). The myth of language universals: Language diversity and its importance for cognitive science. Behavioral and Brain Sciences, 32(5), 429–448.

  • Everett, D. (2005). Cultural constraints on grammar and cognition in Pirahã: Another look at the design features of human language. Current Anthropology, 46(4), 621–646.

  • Gibson, E. (1998). Linguistic complexity: Locality of syntactic dependencies. Cognition, 68, 1–76.

  • Gibson, E., & Thomas, J. (1996). The processing complexity of English center-embedded and self-embedded structures. In C. Schutze (Ed.), Proceedings of the NELS 26 sentence processing workshop (pp. 45–71). Cambridge, MA: MIT Press.

  • Gibson, E., & Thomas, J. (1999). Memory limitations and structural forgetting: The perception of complex ungrammatical sentences as grammatical. Language and Cognitive Processes, 14, 225–248.

  • Harder, P. (2010). Over the top—recursion as a functional option. In H. van der Hulst (Ed.), Recursion and human language (pp. 233–244). New York, NY: De Gruyter Mouton.

  • Harel, D. (2000). Computers Ltd.: What they really can’t do. Oxford: Oxford University Press.

  • Hauser, M. D., Chomsky, N., & Fitch, W. T. (2002). The faculty of language: What is it, who has it, and how did it evolve? Science, 298, 1569–1579.

  • Huddleston, R. (1976). An introduction to English transformational syntax. London: Longman.

  • Jackendoff, R., & Pinker, S. (2005). The nature of the language faculty and its implications for evolution of language (Reply to Fitch, Hauser, and Chomsky). Cognition, 97, 211–225.

  • Just, M. A., & Carpenter, P. A. (1992). A capacity theory of comprehension: Individual differences in working memory. Psychological Review, 99, 122–149.

  • Karlsson, F. (2010). Recursion and iteration. In H. van der Hulst (Ed.), Recursion and human language (pp. 43–67). New York, NY: De Gruyter Mouton.

  • Katz, J. J. (1978). Effability and translation. In F. Guenthner & M. Guenthner-Reutter (Eds.), Meaning and translation: Philosophical and linguistic approaches (pp. 191–234). London: Duckworth.

  • Kimball, J. (1973). Seven principles of surface structure parsing in natural language. Cognition, 2, 15–47.

  • Kinsella, A. (2010). Was recursion the key step in the evolution of the human language faculty? In H. van der Hulst (Ed.), Recursion and human language (pp. 179–191). New York, NY: De Gruyter Mouton.

  • Langacker, R. W. (1973). Language and its structure (2nd ed.). New York, NY: Harcourt Brace Jovanovich.

  • Lasnik, H. (2000). Syntactic structures revisited: Contemporary lectures on classic transformational theory. Cambridge, MA: MIT Press.

  • Laury, R., & Ono, T. (2010). Recursion in conversation: What speakers of Finnish and Japanese know how to do. In H. van der Hulst (Ed.), Recursion and human language (pp. 69–91). New York, NY: De Gruyter Mouton.

  • Marcus, M. (1980). A theory of syntactic recognition for natural language. Cambridge, MA: MIT Press.

  • Mithun, M. (2010). The fluidity of recursion and its implications. In H. van der Hulst (Ed.), Recursion and human language (pp. 18–41). New York, NY: De Gruyter Mouton.

  • Perfors, A., Tenenbaum, J., Gibson, E., & Regier, T. (2010). How recursive is language? A Bayesian exploration. In H. van der Hulst (Ed.), Recursion and human language (pp. 159–177). New York, NY: De Gruyter Mouton.

  • Pinker, S., & Jackendoff, R. (2005). The faculty of language: What’s special about it? Cognition, 95, 201–236.

  • Pullum, G., & Scholz, B. C. (2010). Recursion and the infinitude claim. In H. van der Hulst (Ed.), Recursion and human language (pp. 113–137). New York, NY: De Gruyter Mouton.

  • Sakel, J., & Stapert, E. (2010). Pirahã—In need of recursive syntax? In H. van der Hulst (Ed.), Recursion and human language (pp. 3–16). New York, NY: De Gruyter Mouton.

  • Schank, R., & Wilks, Y. (1974). The goals of linguistic theory revisited. Lingua, 34, 301–326.

  • Stabler, E. P. (1994). The finite connectivity of linguistic structure. In C. Clifton, L. Frazier, & K. Rayner (Eds.), Perspectives on sentence processing (pp. 303–336). Hillsdale, NJ: Lawrence Erlbaum.

  • Stabler, E. (1999). Formal grammars. In R. A. Wilson & F. C. Keil (Eds.), The MIT encyclopedia of the cognitive sciences (pp. 320–322). Cambridge, MA: MIT Press.

  • Tiede, H.-J., & Stoute, L. N. (2010). Recursion, infinity, and modeling. In H. van der Hulst (Ed.), Recursion and human language (pp. 147–158). New York, NY: De Gruyter Mouton.

  • van der Hulst, H. (Ed.). (2010). Recursion and human language. New York, NY: De Gruyter Mouton.

  • Verhagen, A. (2010). What do you think is the proper place of recursion? Conceptual and empirical issues. In H. van der Hulst (Ed.), Recursion and human language (pp. 93–110). New York, NY: De Gruyter Mouton.

  • Yang, C. (2006). The infinite gift. New York, NY: Scribner.


Author information


Corresponding author

Correspondence to Kenneth R. Paap.

Appendix: Building a Parse-Tree Using the Recursive Grammar Parsing Algorithm

The following narrative makes it easier to understand how the algorithm builds the tree structure. Start with the main procedure, Streebuild, and Sentence 1. The first line establishes node 1 as the root of the tree and labels it “S”. Next, the procedure Npbuild(prefix, np) is called with no prefix and the np parameter taking the value “The cat”; this branches left and establishes node 2 as “The cat”. Processing returns to Streebuild and the recursive procedure Thatclbuild is called.

The “then” branch of Thatclbuild establishes trunk node 3 and labels it “thatcl”, then calls Npbuild with “that” and “the dog”, which branches left and establishes “that the dog” as node 4. Thatclbuild now calls itself recursively: because “that” heads the remaining word-string fragment, the “then” branch establishes trunk node 5 and labels it “thatcl”, then calls Npbuild with “that” and “John”, which branches left and establishes “that John” as node 6. Thatclbuild recursively calls itself once more, but the “that”s in Sentence 1 are now exhausted, so processing immediately returns via the “else” branch to complete the previous, still-incomplete recursive call. That call now invokes Vpbuild with “saw”, which branches right and establishes “saw” as node 7. With this recursive call complete, control returns to the previous incomplete call, i.e., back up to node 3, which calls Vpbuild with “chased”; this branches right and establishes “chased” as node 8. The last call to Thatclbuild is now complete, so control finally returns to Streebuild, where the closing call to Vpbuild with the vp “bit the mouse” branches right and establishes “bit the mouse” as node 9.
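The walk-through above can be sketched in code. This is a minimal reconstruction from the narrative, not the authors’ implementation: the procedure names (Streebuild, Npbuild, Thatclbuild, Vpbuild) and the node numbering follow the text, while the data structures, the tokenization of Sentence 1, and the list-of-fragments argument are our assumptions for illustration.

```python
# Reconstruction of the recursive tree-building procedure described in the
# appendix narrative. Nodes are recorded in creation order as
# (node_number, label) pairs, matching the numbering in the text.

nodes = []  # filled in creation order: node 1 first, node 9 last

def new_node(label):
    nodes.append((len(nodes) + 1, label))
    return len(nodes)

def npbuild(prefix, np):
    # Branch left: establish a noun-phrase node, optionally headed by "that".
    return new_node((prefix + " " + np).strip())

def vpbuild(vp):
    # Branch right: establish a verb-phrase node.
    return new_node(vp)

def thatclbuild(fragments):
    # fragments: list of (np, vp) pairs for the remaining "that" clauses.
    if fragments:                    # "then" branch: a "that" clause remains
        np, vp = fragments[0]
        new_node("thatcl")           # trunk node
        npbuild("that", np)          # left branch
        thatclbuild(fragments[1:])   # recursive call on the rest
        vpbuild(vp)                  # right branch, on return from recursion
    # "else" branch: no "that" remains, simply return

def streebuild():
    new_node("S")                    # node 1: root
    npbuild("", "The cat")           # node 2
    thatclbuild([("the dog", "chased"), ("John", "saw")])  # nodes 3-8
    vpbuild("bit the mouse")         # node 9

streebuild()
```

Running `streebuild()` leaves `nodes` holding the nine nodes in the order the narrative establishes them: “saw” is created before “chased” because each verb phrase is attached only as its recursive call unwinds.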


Cite this article

Paap, K.R., Partridge, D. Recursion Isn’t Necessary for Human Language Processing: NEAR (Non-iterative Explicit Alternatives Rule) Grammars are Superior. Minds & Machines 24, 389–414 (2014). https://doi.org/10.1007/s11023-014-9341-y

