
Computers Aren’t Syntax All the Way Down or Content All the Way Up


Abstract

This paper argues that the idea of a computer is unique. Calculators and analog computers are not different ideas about computers, and nature does not compute by itself. Computers, once clearly defined in all their terms and mechanisms rather than enumerated by behavioral examples, can be more than instrumental tools in science, and more than a source of analogies and taxonomies in philosophy. They can help us understand semantic content and its relation to form. This is possible because they have the potential to do more than calculators, which are computers designed not to learn. Today’s computers are not designed to learn; rather, they are designed to support learning; therefore, any theory of content tested by currently existing computers must be empirical rather than formal in nature. If computers are someday designed to learn, we will see a change in roles, requiring an empirical theory about the Turing architecture’s content, using the primitives of learning machines. This way of thinking, which I call the intensional view of computers, avoids the problems of analogies between minds and computers. It focuses on the constitutive properties of computers: for example, it shows clearly how they can help us avoid the infinite regress in interpretation, and how we can clarify the terms of the suggested mechanisms to facilitate a useful debate. Within the intensional view, syntax and content in the context of computers become two ends of physically realizing correspondence problems in various domains.


Notes

  1. The test compares two medium-independent verbal exchanges, one with a machine and one with a human; a human questioner decides which is which.

  2. The combinators S and K are respectively the lambda terms \(\lambda x\lambda y\lambda z.xz(yz)\) and \(\lambda x\lambda y.x\).
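
     The same definitions can be written directly as curried functions. A minimal sketch in Python (the names and the derived identity combinator are standard; the rendering as Python lambdas is my own illustration):

     ```python
     # The S and K combinators as curried functions.
     S = lambda x: lambda y: lambda z: x(z)(y(z))  # λxλyλz.xz(yz)
     K = lambda x: lambda y: x                     # λxλy.x

     # S and K alone are combinatorially complete; for example,
     # the identity combinator is I = SKK:
     I = S(K)(K)

     print(I(42))        # 42
     print(K("a")("b"))  # a
     ```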

  3. A compiler is a program that translates from one programming language to another; typically, the target language is a lower-level language than the source language. The final stage of compiling is called code generation, in which all intermediate representations of the levels of translation are dispensed with, and the code is left solely in the primitive instruction set of the intended architecture. There is no syntactic translation at run-time of a compiled program; there is just execution.
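
     As an illustration (a toy sketch of my own, not any particular compiler), code generation can be pictured as a one-time translation of an expression tree into a primitive stack-machine instruction set; afterwards there is only execution of those instructions:

     ```python
     # Toy code generator: an expression tree is translated once into
     # primitive stack-machine instructions. Running the program afterwards
     # involves no syntactic translation, only execution.

     def codegen(expr):
         """Compile a nested tuple like ('+', 2, ('*', 3, 4)) to instructions."""
         if isinstance(expr, (int, float)):
             return [("PUSH", expr)]
         op, left, right = expr
         return codegen(left) + codegen(right) + [(op, None)]

     def run(program):
         """Execute the primitive instruction set on a stack."""
         stack = []
         ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
         for instr, arg in program:
             if instr == "PUSH":
                 stack.append(arg)
             else:
                 b, a = stack.pop(), stack.pop()
                 stack.append(ops[instr](a, b))
         return stack[0]

     program = codegen(("+", 2, ("*", 3, 4)))
     print(program)       # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('*', None), ('+', None)]
     print(run(program))  # 14
     ```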

  4. Take f to be the representation relation. It can be, for example, \(f(\text{ low } \text{ voltage })=0\), meaning we map the physical property of low voltage to bit 0. Decoding and encoding are different uses of a representation: decoding is essentially using f; encoding is using \(f^{-1}\). For example, when we type ‘a’ in a word processor, we encode it as whatever physical property—some voltage levels—is assigned to it down below. Therefore good representations are needed technologically, both in digital and analog computers, to be sure about the physical component.
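
     The representation relation can be sketched as a pair of maps (the voltage names are the footnote’s example; the function names are my own illustration):

     ```python
     # A representation relation f maps physical properties to symbols;
     # decoding uses f, encoding uses its inverse.
     f = {"low voltage": 0, "high voltage": 1}       # decode: physical -> bit
     f_inv = {bit: phys for phys, bit in f.items()}  # encode: bit -> physical

     def decode(physical_state):
         return f[physical_state]

     def encode(bit):
         return f_inv[bit]

     # Committing to a symbol means committing to some physical property:
     print(encode(0))          # low voltage
     print(decode(encode(1)))  # 1  (round trip: f(f^-1(x)) = x)
     ```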

  5. The authors note that the compute cycle is the reverse of the experiment cycle in, for example, physics. In physics we let the abstract level do its work, then encode its result in some physical object to see whether the theory predicted its presence, location, etc. correctly, as in Fig. 2a. The physical level does not do the work; its work is predicted. In the case of computers, the physical layer does the work, so it is not a simulation, and we try to predict its result with an algorithm or an analogy (the implicitly-computes relation of Fig. 1), as in Fig. 2b. Notice that what allows the physical layer to carry out its work is a series of translations of, say, the averaging algorithm or equations into the terms of the computer. We compare its physical work with the prediction at the abstract level, i.e. with an algorithm or equation.

    Notice also that a computer scientist seeking an explanation has to reverse the modeling relation too, after it is established, because it only serves to establish the correctness of the algorithm, assuming the underlying physical machinery was confirmed. What we do with the corrected algorithm afterwards is much like the diagram in Fig. 2a. Therefore physics and computer science may differ in how they use the model–theory–technology cycles, but it is clear that what amounts to theory in physics and in computer science is based on the same process of thinking.

  6. One implication of this result is that nothing computes in nature unless we map it to a computational problem in our thinking. Pancomputationalism and born-again computationalism seem to be forms of analogical (if not romantic) reasoning. ACM’s (2012) centenary celebration of Turing by all the living Turing-award winners makes the point quite clear: “This development [algorithmic thinking outside computer science] is an exquisite unintended consequence of the fact that there is latent computation underlying each of these phenomena [cells, brains, market, universe, etc.], or the ways in which science studies them.” [emphasis added]

    Nature-inspired computing is not to be confused with the ‘nature computes’ movement. Personally, I would not lose too much sleep if some problems turn out not to be amenable to computationalist understanding. Discovery by method has its limits.

  7. Algorithmic complexity theory is based on Turing’s notion of ‘next’: the next step, the next state, the next input. Problem size in the theory is measured with this concept. It is not a physical concept, but it is obviously physically realizable; see Bozşahin (2016) for more philosophical implications.

  8. Problems in P have polynomial-time solutions, where the polynomial is in the problem size in the sense of footnote 7. For problems in NP, a given solution can be checked with polynomial effort. Whether P and NP are the same is an open problem in computer science, with serious implications for economics, the social sciences and the natural sciences; see Fortnow (2013) for an entertaining coverage of these aspects.

  9. To avoid a cryptic mathematical exposition of the problem, I follow the authors’ informal description: The Steiner Tree Problem is equivalent in real life to building a road system of minimum length for n towns, possibly making intersections outside of towns. If the number of vertices that can be formed outside the towns (called Steiner vertices) is not known, the problem is NP-hard. If the question is whether there is a solution of length at most m, then it is NP-complete. We are discussing the latter problem.
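
     A minimal sketch of why the decision version is easy to check, in the sense of footnote 8 (the coordinates and the helper name are my own illustration, not from the paper): verifying a candidate road network needs only a length sum and a connectivity pass, both polynomial in the input size:

     ```python
     import math
     from collections import defaultdict

     def verify_steiner(towns, edges, m):
         """Polynomial-time check of a candidate solution: do the proposed
         roads connect all towns with total length <= m?
         Points are (x, y) pairs; edges are pairs of points."""
         total = sum(math.dist(a, b) for a, b in edges)
         if total > m:
             return False
         # Breadth-first search over the road network from one town.
         graph = defaultdict(list)
         for a, b in edges:
             graph[a].append(b)
             graph[b].append(a)
         seen, frontier = {towns[0]}, [towns[0]]
         while frontier:
             p = frontier.pop()
             for q in graph[p]:
                 if q not in seen:
                     seen.add(q)
                     frontier.append(q)
         return all(t in seen for t in towns)

     # Three towns joined through one hypothetical Steiner vertex at the origin:
     towns = [(1, 0), (-1, 0), (0, 1)]
     edges = [((1, 0), (0, 0)), ((-1, 0), (0, 0)), ((0, 1), (0, 0))]
     print(verify_steiner(towns, edges, m=3.5))  # True (total length 3.0)
     ```

     Finding such a network is the hard part; checking one, as above, is easy. That asymmetry is exactly what NP-completeness of the decision version expresses.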

  10. For example, winning at chess every time, in a maximum of n moves from any starting move, would be as easy as testing for checkmate on the board. If the opponent also knows the algorithm, then winning the game reduces to who starts it. Even a non-player can carry out the test (the checkmate condition) without understanding the game, if the rules are given on a piece of paper.

  11. For semantic content, we can take the perspective of the Language of Thought (LOT) hypothesis of Fodor (1975), which requires primitives (or core concepts) to support it, or of its rival, map theory, which states that we steer our cognitive functions by a system of belief maps rather than by the individuated sentential statements of LOT. Something has to support the system of belief and the maps. “The brain can do this” is no more an empirical theory than Turing’s “being suitably programmed” was for intelligence and understanding. A good place to start, for both, is to give them a process ontology. Although map-theorists shun the computer analogy, the intensional view is not an analogy but a constitutive principle, so it might also help those theorists.

  12. See also Searle’s objection to the original replies in the (1980) article, and the volume of Preston and Bishop (2002), which contains more new responses. Although Searle also contributed to the volume, he was not given the opportunity to respond to the articles (Preston 2002:46).

  13. Rogers (1959):115 provided the first Chinese-Room-like argument, in which the input–output materials are not Chinese symbols but numbers. He was interested in the computation of number-theoretic functions by a human. He placed a man in a room, equipped with a finite set of instructions for computing numbers, who output them for checking. He did not suggest a test to decide between man and machine; his test was the ability to compute any number-theoretic function in a finite amount of time. He assumed that the person inside the room is inexhaustible.

  14. Pace Copeland (2002), who considers super-Turing computing to be possible, we cannot compute and violate the laws of physics at the same time; see Cockshott et al. (2012) for discussion.

  15. The Robot Reply suggested that a computer with a body would be like a child learning language.

  16. I first saw the use of this mythological ‘world turtle’ idea, in the sense closest to the current discussion, in Ross (1967), who attributes it to William James. Its significance for the intensional view is that Ross recalls it with a “bull’s eye relevance to the study of syntax.”

  17. Without this assumption, we would not be far from linguistic relativism of the Sapir variety, in which we would think in a language. Although there are many cases in which linguistic terminology is deeply cultural (for example, the basicness of color terms), this is not a causal link from language to thought, because we have seen tragic cases of thought without language, or at least of quite unexpected inferential ability (unexpected, that is, if language caused thought) in the light of little (almost no) linguistic exposure up to puberty. One such case is Genie (Fromkin et al. 1974; Curtiss et al. 1974). Assuming that language is an expression of thought seems to avoid these problems. In the current argument, this assumption is not only on better empirical ground, but also a necessary consequence of thinking that language is a computational mechanism. Syntactic semantics, on the other hand, requires it as an extra assumption; but let me stress that nobody is denying the role of compositional semantics in syntax. The question is whether syntax alone can cause semantics.

  18. Ford (2011):70, who defended Searle’s views against Rapaport, is less apathetic, but still analogical in thinking about computers: “if we can get a computer to have meaningful conscious experiences—the road to natural language acquisition and understanding would be clear (as far as Searle is concerned).”

  19. Keller’s case is different from that of a child who is deaf or blind in the critical period of acquisition. A child in these circumstances can gain some access to meanings out there on his or her own initiative, and relate them to forms. In fact, blind children create form differences for the semantic distinction between look and see, although they cannot experience visual looking or visual seeing. Deaf or hearing children who are born to deaf parents acquire their sign language in the normal time course of language acquisition. Blind children follow a normal course too, as long as they are exposed to language; see Gleitman and Elissa (1995) for a summary.

References

  • Aaronson, S. (2005). Guest column: NP-complete problems and physical reality. ACM SIGACT News, 36(1), 30–52.

  • Aaronson, S. (2013). Why philosophers should care about computational complexity. In B. J. Copeland, C. J. Posy, & O. Shagrir (Eds.), Computability: Turing, Gödel, Church, and beyond. Cambridge: MIT Press.

  • Abend, O., Kwiatkowski, T., Smith, N., Goldwater, S., & Steedman, M. (2017). Bootstrapping language acquisition. Cognition, 164, 116–143.

  • ACM. (2012). ACM Turing centenary celebration. Association for Computing Machinery, June 15–16, San Francisco. http://turing100.acm.org/.

  • Bickhard, M. H. (1996). Troubles with computationalism. In W. O’Donohue & R. Kitchener (Eds.), Philosophy of psychology (pp. 173–183). London: Sage.

  • Block, N. (1978). Troubles with functionalism. In C. W. Savage (Ed.), Minnesota studies in the philosophy of science. Minneapolis: University of Minnesota Press.

  • Bozşahin, C. (2016). What is a computational constraint? In V. C. Müller (Ed.), Computing and philosophy. Synthese Library 375 (pp. 3–16). Heidelberg: Springer.

  • Bringsjord, S., & Taylor, J. (2005). An argument for \(P=NP\). arXiv:cs/0406056.

  • Bryant, P. E. (1974). Perception and understanding in young children. New York: Basic Books.

  • Burgin, M. (2001). How we know what technology can do. Communications of the ACM, 44(11), 82–88.

  • Cariani, P. (1998). Epistemic autonomy through adaptive sensing. In Intelligent control (ISIC), held jointly with the IEEE international symposium on computational intelligence in robotics and automation (CIRA) and intelligent systems and semiotics (ISAS) (pp. 718–723).

  • Cockshott, P., Mackenzie, L. M., & Michaelson, G. (2012). Computation and its limits. Oxford: Oxford University Press.

  • Copeland, B. J. (2002). Hypercomputation. Minds and Machines, 12(4), 461–502.

  • Copeland, B. J., & Shagrir, O. (2011). Do accelerating Turing machines compute the uncomputable? Minds and Machines, 21(2), 221–239.

  • Curtiss, S., Fromkin, V., Krashen, S., Rigler, D., & Rigler, M. (1974). The linguistic development of Genie. Language, 50(3), 528–554.

  • Dennett, D. C. (1971). Intentional systems. The Journal of Philosophy, 68(4), 87–106.

  • Dennett, D. C. (1991). Consciousness explained. New York: Little, Brown & Co.

  • Dewdney, A. K. (1984). On the spaghetti computer and other analog gadgets for problem solving. Scientific American, 250(6), 19–26.

  • Fodor, J. (1975). The language of thought. Cambridge, MA: Harvard University Press.

  • Ford, J. (2011). Helen Keller was never in a Chinese Room. Minds and Machines, 21(1), 57–72.

  • Fortnow, L. (2013). The golden ticket: P, NP, and the search for the impossible. Princeton: Princeton University Press.

  • Fromkin, V., Krashen, S., Curtiss, S., Rigler, D., & Rigler, M. (1974). The development of language in Genie: A case of language acquisition beyond the “critical period”. Brain and Language, 1(1), 81–107.

  • Gandy, R. (1980). Church’s thesis and principles for mechanisms. Studies in Logic and the Foundations of Mathematics, 101, 123–148.

  • Gleitman, L. R., & Elissa, L. N. (1995). The invention of language by children: Environmental and biological influences on the acquisition of language. In L. R. Gleitman & M. Liberman (Eds.), Language: An invitation to cognitive science (2nd ed., pp. 1–24). Cambridge, MA: MIT Press.

  • Graham, P. (1994). On Lisp. Englewood Cliffs, NJ: Prentice Hall.

  • Horsman, C., Stepney, S., Wagner, R. C., & Kendon, V. (2013). When does a physical system compute? Proceedings of the Royal Society A, 470, 20140182.

  • Hoyte, D. (2008). Let over lambda. HCSW and Hoytech.

  • Jay, B., & Given-Wilson, T. (2011). A combinatory account of internal structure. The Journal of Symbolic Logic, 76(3), 807–826.

  • Keller, H. (1905). The story of my life. Garden City, NY: Doubleday.

  • Knuth, D. E. (1973). Sorting and searching. The art of computer programming, Vol. 3. Reading, MA: Addison-Wesley.

  • Knuth, D. E. (1996). Selected papers on computer science. Cambridge: Cambridge University Press.

  • Knuth, D. E. (2014). Twenty questions for Donald Knuth. http://www.informit.com/articles/article.aspx?p=2213858. Accessed 1 June 2017.

  • Lenneberg, E. H. (1967). The biological foundations of language. New York: Wiley.

  • Lewis, H. R., & Papadimitriou, C. H. (1998). Elements of the theory of computation (2nd ed.). New Jersey: Prentice-Hall.

  • Mills, J. W. (2008). The nature of the extended analog computer. Physica D: Nonlinear Phenomena, 237(9), 1235–1256.

  • Newell, A., & Simon, H. (1976). Computer science as empirical inquiry: Symbols and search. Communications of the ACM, 19(3), 113–126.

  • Pask, G. (1968). Colloquy of mobiles. London: ICA.

  • Piccinini, G. (2008). Computers. Pacific Philosophical Quarterly, 89, 32–73.

  • Pitowsky, I. (1990). The physical Church thesis and physical computational complexity. Iyyun: The Jerusalem Philosophical Quarterly, 39, 81–99.

  • Preston, J. (2002). Introduction. In Preston and Bishop (2002).

  • Preston, J., & Bishop, M. (Eds.). (2002). Views into the Chinese room: New essays on Searle and artificial intelligence. Oxford: Oxford University Press.

  • Rapaport, W. J. (1988). Syntactic semantics: Foundations of computational natural-language understanding. In J. H. Fetzer (Ed.), Aspects of artificial intelligence (pp. 81–131). Holland: Kluwer.

  • Rapaport, W. J. (2006). How Helen Keller used syntactic semantics to escape from a Chinese Room. Minds and Machines, 16(4), 381–436.

  • Rapaport, W. J. (2011). Yes, she was! Minds and Machines, 21(1), 3–17.

  • Rogers, H., Jr. (1959). The present theory of Turing machine computability. Journal of the Society for Industrial and Applied Mathematics, 7(1), 114–130.

  • Ross, J. R. (1967). Constraints on variables in syntax. Ph.D. dissertation, MIT. Published as Ross 1986.

  • Ross, J. R. (1986). Infinite syntax! Norwood, NJ: Ablex.

  • Rubel, L. A. (1993). The extended analog computer. Advances in Applied Mathematics, 14(1), 39–50.

  • Searle, J. R. (1980). Minds, brains and programs. The Behavioral and Brain Sciences, 3, 417–424.

  • Searle, J. R. (1990). Is the brain’s mind a digital computer? Proceedings of the American Philosophical Association, 64(3), 21–37.

  • Searle, J. R. (2001). Chinese Room argument. In R. A. Wilson & F. C. Keil (Eds.), The MIT encyclopedia of the cognitive sciences (pp. 115–116). Cambridge, MA: MIT Press.

  • Searle, J. R. (2002). Twenty-one years in the Chinese Room. In Preston and Bishop (2002).

  • Shagrir, O. (1999). What is computer science about? The Monist, 82(1), 131–149.

  • Simon, H. (1969). The sciences of the artificial. Cambridge: MIT Press.

  • Turing, A. M. (1936). On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society, 42 (Series 2), 230–265.

  • Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59, 433–460.


Acknowledgements

I thank five reviewers of Minds and Machines for their comments, which significantly improved the paper. Thanks also to Halit Oğuztüzün and Umut Özge for feedback on an earlier draft, which led to some new sections, and to Vincent Nunney for last minute help with some pieces of text. All errors and misunderstandings are mine.

Author information


Corresponding author

Correspondence to Cem Bozşahin.

Additional information

The paper is dedicated to my Arizona State classmate Jonathan Mills, 1952–2016, whose work led me to the rethinking that eventually became this paper, and who taught me one day in class what it means to be a computer science graduate student.


Cite this article

Bozşahin, C. Computers Aren’t Syntax All the Way Down or Content All the Way Up. Minds & Machines 28, 543–567 (2018). https://doi.org/10.1007/s11023-018-9469-2
