In this paper we probe the limits of the computational method in economics. This method involves modeling individual behavior and economic processes in terms of constrained optimization. In neoclassical economics human behavior is explained entirely computationally. Alternative paradigms include the evolutionary and the complexity-based approaches that model behavior and processes as non-optimizing or boundedly rational. But many of the models used in 'complex-evolutionary economics' are cellular automata or their equivalents. This means that neoclassical economics and complex-evolutionary economics are both committed to a computational vision of the economy. A highly complex computational economy can evolve and self-organize, but it also displays computational universality, which means that many problems are not decidable. The inherent limits of computability become evident. This paper proposes incorporating a particular (constructive) non-computability into our view of economic behavior and processes. The paper defines constructively non-computational behavior, discusses its origins in Roger Penrose's writings, and provides an application of this concept to the question of realistic counterfactuals in economic models.
The most elementary way to think about Mathematica is as an enhanced calculator: a calculator that does not only numerical computation but also algebraic computation and graphics. Mathematica can function much like a standard calculator: you type in a question, you get back an answer. But Mathematica goes further than an ordinary calculator. You can type in questions that require answers that are longer than a calculator can handle. For example, Mathematica can give you the numerical value of π to a hundred decimal places, or the exact result of a numerical calculation as complicated as 3^100 (see Fig. 1).
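For readers without Mathematica, the two computations the abstract mentions can be sketched in plain Python: exact big-integer arithmetic is built in, and the pi routine below follows the series recipe given in the Python `decimal` documentation. This is an illustrative stand-in, not how Mathematica itself computes.

```python
from decimal import Decimal, getcontext

def pi(places: int = 100) -> Decimal:
    """Compute pi to `places` digits after the decimal point.

    Uses the unity-fraction series recipe from the Python `decimal`
    module documentation, with a few guard digits for rounding.
    """
    getcontext().prec = places + 4          # guard digits for the loop
    three = Decimal(3)
    lasts, t, s, n, na, d, da = 0, three, 3, 1, 0, 0, 24
    while s != lasts:                       # iterate until s stabilizes
        lasts = s
        n, na = n + na, na + 8
        d, da = d + da, da + 32
        t = (t * n) / d
        s += t
    getcontext().prec = places + 1          # "3" plus `places` decimals
    return +s                               # unary + re-rounds to prec

print(pi(100))    # pi to a hundred decimal places
print(3 ** 100)   # exact 48-digit integer, no overflow
```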
Over a period of more than twenty years, Sybil Wolfram gave lectures at Oxford University on Philosophical Logic, a major component of most of the undergraduate degree programmes. She herself had been introduced to the subject by Peter Strawson, and saw herself as working very much within the Strawsonian tradition. Central to this tradition, which began with Strawson's seminal attack on Russell's theory of descriptions in ‘On Referring' (1950), is the distinction between a sentence and what is said by a sentence − Strawson initially called the latter a use of a sentence, and sometimes a proposition, but his most frequent term for what is said, which Wolfram consistently adopts, is the statement expressed. The force of the distinction is clearly illustrated in ‘On Referring', which uses it to undermine the common assumption that any sentence must be either true, or false, or meaningless. Russell had argued on this basis that a sentence such as ‘The King of France is bald' (which is clearly neither true nor meaningless) must be false, but Strawson points out that if we distinguish between the sentence itself and the statement that it expresses (on some occasion of use), we can quite easily combine the admission that the sentence is meaningful − for it can in appropriate circumstances be used to express true and false statements − with the claim that nevertheless if the circumstances are ‘inappropriate' (in particular, when there is no current King of France), the sentence can fail to express a statement that is either true or false. On this picture, therefore, it is sentences that are meaningful, but statements that are the primary bearers of truth.
Sir Roger Penrose, retired professor of mathematics at the University of Oxford and collaborator with Stephen Hawking on black hole theory, has written 'a complete guide to the laws of the universe' called The Road to Reality. His publisher calls it the most important and ambitious work of science for a generation. Penrose caused a furore in the world of consciousness studies with his 1989 book The Emperor's New Mind, which conjectured a new mechanism for consciousness and kept a faithful band of researchers busy for a decade with models based on microtubules and the like. Sadly, the idea fizzled out. The title of the 2002 Tucson 'Toward a Science of Consciousness' conference poetry slam winner was: Microtubules - my ass!
Mind–body dualism has rarely been an issue in the generative study of mind; Chomsky himself has long claimed it to be incoherent and unformulable. We first present and defend this negative argument but then suggest that the generative enterprise may license a rather novel and internalist view of the mind and its place in nature, different from all of (i) the commonly assumed functionalist metaphysics of generative linguistics, (ii) physicalism, and (iii) Chomsky’s negative stance. Our argument departs from the empirical observation that the linguistic mind gives rise to hierarchies of semantic complexity that we argue (only) follow from constraints of an essentially mathematical kind. We assume that the faculty of language tightly correlates with the mathematical capacity both formally and in evolution, the latter plausibly arising as an abstraction from the former, as a kind of specialized output. On this basis, and since the semantic hierarchies in question are mirrored in the syntactic complexity of the expressions involved, we posit the existence of a higher-dimensional syntax structured on the model of the hierarchy of numbers, in order to explain the semantic facts in question. If so, syntax does not have a physicalist interpretation any more than the hierarchy of number-theoretic spaces does.
A traditional view maintains that thought, while expressed in language, is non-linguistic in nature and occurs in non-linguistic beings as well. I assess this view against current theories of the evolutionary design of human grammar. I argue that even if some forms of human thought are shared with non-human animals, a residue remains that characterizes a unique way in which human thought is organized as a system. I explore the hypothesis that the cause of this difference is a grammatical way of structuring semantic information, and I present evidence that the organization of grammar precisely reflects the organization of a specific mode of thought apparently distinctive of humans. Since there appears to be no known non-grammatical structuring principle for the relevant mode of thought, I suggest that grammar is that principle, with no independent ‘Language of Thought' needed.
1. Pohlers and The Problem. I first met Wolfram Pohlers at a workshop on proof theory organized by Walter Felscher that was held in Tübingen in early April, 1973. Among others at that workshop relevant to the work surveyed here were Kurt Schütte, Wolfram’s teacher in Munich, and Wolfram’s fellow student Wilfried Buchholz. This is not meant to slight in the least the many other fine logicians who participated there. In Tübingen I gave a couple of survey lectures on results and problems in proof theory that had been occupying much of my attention during the previous decade. The following was the central problem that I emphasized there: The need for an ordinally informative, conceptually clear, proof-theoretic reduction of classical theories of iterated arithmetical inductive definitions to corresponding constructive systems. As will be explained below, meeting that need would be significant for the then ongoing efforts at establishing the constructive foundation for and proof-theoretic ordinal analysis of certain impredicative subsystems of classical analysis. I also spoke in Tübingen about...
Contemporary arguments for forms of psycho-physical dualism standardly depart from phenomenal aspects of consciousness ('what it is like' to have some particular conscious experience). Conceptual aspects of conscious experience, as opposed to phenomenal or visual/perceptual ones, are often taken to be within the scope of functionalist, reductionist, or physicalist theories. I argue that the particular conceptual structure of human consciousness makes this asymmetry unmotivated. The argument for a form of dualism defended here proceeds from the empirical premise that conceptual structure in a linguistic creature like us is a combinatorial and compositional system that implicates a distinction between simple and complex, or 'atomic' and 'molecular' concepts. The argument is that conceptual atoms, qua atoms, are irreducible to anything else. If so, and if the atoms are essentially semantic, a form of dualism follows: though positively inviting naturalistic inquiry into the semantic and mental aspects of nature, it requires that we look at the mental as a primitive domain of nature. Schematically, then, the argument is as follows: (1) Human consciousness/thought is conceptually structured. (2) The human conceptual system is a 'particulate' system at a syntactic and semantic level of representation (the notion of a 'particulate' system is developed in Section 2). (3) This implies the existence of conceptual 'particles', concepts that have no further semantic decomposition ('atoms'). (4) A conceptual atom cannot be explained in terms of anything that does not involve its own intrinsic properties (Section 3). (5) Physicalism as normally conceived is inconsistent with (3) and (4) (Section 4).
Computers today are not only calculation tools: they are directly (inter)acting in the physical world, which itself may be conceived of as a universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physical embedded and networked devices, computing goes beyond the Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based and concurrent. The main criterion of success of a computation is not its termination, but the adequacy of its response, its speed, generality and flexibility; adaptability, and tolerance to noise, error, faults, and damage. Interactive computing is a generalization of Turing computing, and it calls for new conceptualizations (Goldin, Wegner). In the info-computationalist framework, with computation seen as information processing, natural computation appears as the most suitable paradigm of computation, and information semantics requires logical pluralism.
Traditionally, Ancient Mesopotamian epistemic practices resulting in the vast corpus of cuneiform ‘lexical lists’ and other, similarly formatted treatises have been conceptualized as “Listenwissenschaft” in Assyriology. Introduced by the German Assyriologist Wolfram v. Soden in 1936, this concept has also been utilized in other disciplines of the Humanities as a terminological means to describe epistemic activity allegedly inferior to ‘Western’ modes of analytical and hypotactic scientific reasoning. Building on the exemplary evidence of a bilingual list of cuneiform compound graphemes from the early 2nd millennium BCE as well as on recent conceptualizations of ‘epistemic cultures’ and the instrumental function of material ‘representations’ in the context of epistemic practices, the present paper attempts to replace the essentialistic and teleological concept of an Ancient Mesopotamian “Listenwissenschaft” with a new epistemological model describing the underlying epistemic practices as highly adaptive, non-linear epistemic practices comparable to what has been described as ‘practices with »epistemic things«’ in recent epistemology and practice theory.
This paper is an edited form of a letter written by the two authors (in the name of Tarski) to Wolfram Schwabhäuser around 1978. It contains extended remarks about Tarski's system of foundations for Euclidean geometry, in particular its distinctive features, its historical evolution, the history of specific axioms, the questions of independence of axioms and primitive notions, and versions of the system suitable for the development of 1-dimensional geometry.
I argue that the implementation of the Dummettian program of an ‘anti-realist' semantics requires quite different conceptions of the technical meaning-theoretic terms used than those presupposed by Dummett. Starting from obvious incoherences in an attempt to conceive truth conditions as assertibility conditions, I argue that for anti-realist purposes non-epistemic semantic notions are more usefully kept apart from epistemic ones rather than being reduced to them. Embedding an anti-realist theory of meaning in Martin-Löf's Intuitionistic Type Theory (ITT) takes care, however, of many notorious problems that have arisen in trying to specify suitable intuitionistic notions of semantic value, truth-conditions, and validity, taking into account the so-called ‘defeasibility of evidence' for assertions in empirical discourses.
Carnap took the content of a particular sentence or set of sentences to consist in the set of the consequences of the sentence or set. This claim equates meaning with inferential role, but it restricts the inferences to deductive or explicative ones. Here I reject a recent proposal by Robert Brandom, where inductive or ampliative inferences are also meant to confer contents on expressions. I argue that if Brandom's inferentialist picture is upheld, and both explicative and ampliative inferences confer meaning, one consequence of this is that the content of a sentence is to be read off from our ways of rationally altering our beliefs. Meaning and content then are largely concepts of pragmatics, with no clear theoretical interest. My critique affects certain aspects of Dummett's meaning-theoretic picture too, and the discussion also links up with the development of ‘dynamic semantics'.
Internalism is an explanatory strategy that makes the internal structure and constitution of the organism a basis for the investigation of its external function and the ways in which it is embedded in an environment. It is opposed to an externalist explanatory strategy, which takes its departure from observations about external function and mind-environment interactions, and infers and rationalizes internal organismic structure from that. This paper addresses the origins of truth, a basic ingredient in the human conceptual scheme. I suggest the necessity of pursuing an internalist line of explanation for it, as adopted in the biolinguistic program and generative grammar at large. According to this view, the concept of truth is a presupposition for the way language is used in relation to the world, rather than a function of that use or a consequence of language-world relations.
1. The Physical Church-Turing Thesis. Physicists often interpret the Church-Turing Thesis as saying something about the scope and limitations of physical computing machines. Although this was not the intention of Church or Turing, the Physical Church-Turing Thesis is interesting in its own right. Consider, for example, Wolfram’s formulation: One can expect in fact that universal computers are as powerful in their computational capabilities as any physically realizable system can be, that they can simulate any physical system... No physically implementable procedure could then shortcut a computationally irreducible process. (Wolfram 1985) Wolfram’s thesis consists of two parts: (a) Any physical system can be simulated (to any degree of approximation) by a universal Turing machine. (b) Complexity bounds on Turing machine simulations have physical significance. For example, suppose that the computation of the minimum energy of some system of n particles takes at least exponentially (in n) many steps. Then the relaxation time of the actual physical system to its minimum energy state will also take exponential time.
In a recent paper, Kit Fine offers a reconstruction of Cantor's theory of ordinals. It avoids certain mentalistic overtones in it through both a non-standard ontology and a non-standard notion of abstraction. I argue that this reconstruction misses an essential constructive and computational content of Cantor's theory, which I in turn reconstruct using Martin-Löf's theory of types. Throughout, I emphasize Kantian themes in Cantor's epistemology, and I also argue, as against Michael Hallett's interpretation, for the need for a constructive understanding of Cantorian ‘existence principles’.
From Clouds to Corsair: Kierkegaard, Aristophanes, and Socrates -- The pure fool and the knight of faith: Wolfram's Parzival and the stages of existence -- From romantic aesthete to Christian analogue: Don Quixote's sallies in Kierkegaard's authorship -- Saying not quite "everything just as it is": Shakespeare on life's way -- "Sorrow's changeling": irony, humor, and laughter in Kierkegaard and Carlyle.
In this book leading scholars from every relevant field report on all aspects of compositionality, the notion that the meaning of an expression can be derived from its parts. Understanding how compositionality works is a central element of syntactic and semantic analysis and a challenge for models of cognition. It is a key concept in linguistics and philosophy and in the cognitive sciences more generally, and is without question one of the most exciting fields in the study of language and mind. The authors of this book report critically on lines of research in different disciplines, revealing the connections between them and highlighting current problems and opportunities. The force and justification of compositionality have long been contentious. First proposed by Frege as the notion that the meaning of an expression is generally determined by the meaning and syntax of its components, it has since been deployed as a constraint on the relation between theories of syntax and semantics, as a means of analysis, and more recently as underlying the structures of representational systems, such as computer programs and neural architectures. The Oxford Handbook of Compositionality explores these and many other dimensions of this challenging field. It will appeal to researchers and advanced students in linguistics and philosophy and to everyone concerned with the study of language and cognition including those working in neuroscience, computational science, and bio-informatics.
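The Fregean principle the handbook examines can be made concrete with a toy interpreter: a minimal Python sketch (all names here are illustrative, not from the book) in which the meaning of a complex expression is computed solely from the meanings of its parts and their mode of combination.

```python
from dataclasses import dataclass
from typing import Union

# Toy syntax: literals plus two modes of combination.
@dataclass
class Lit:
    value: int

@dataclass
class Add:
    left: 'Expr'
    right: 'Expr'

@dataclass
class Mul:
    left: 'Expr'
    right: 'Expr'

Expr = Union[Lit, Add, Mul]

def meaning(e: Expr) -> int:
    """Compositional semantics: the meaning of a whole is a function
    of the meanings of its immediate parts and nothing else."""
    if isinstance(e, Lit):
        return e.value
    if isinstance(e, Add):
        return meaning(e.left) + meaning(e.right)
    if isinstance(e, Mul):
        return meaning(e.left) * meaning(e.right)
    raise TypeError(f"unknown expression: {e!r}")

# (2 + 3) * 4 — the meaning is determined bottom-up from the parts.
print(meaning(Mul(Add(Lit(2), Lit(3)), Lit(4))))  # → 20
```

The point of the sketch is that `meaning` never inspects context: swap in any subexpression with the same meaning and the meaning of the whole is unchanged, which is the substitution property compositionality is meant to guarantee.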
Renaud Barbaras, La vie lacunaire [Thomas Vercruysse, p. 324] • Wolfram Hogrebe, Der implizite Mensch [Federica Ceranovi, p. 334] • Emmanuel Alloa, Das durchscheinende Bild [Maria Teresa Costa, p. 344] • Alexander R. Galloway, The Interface Effect [Angela Maiello, p. 346] • Francisco José Ramos, La significación del lenguaje poético [Michele Gardini, p. 348] • Alessandro Arbo, Entendre comme. Wittgenstein et l’esthétique musicale [Leonardo V. Distaso, p. 351].
The paper is meant as a survey of issues in computational complexity from the standpoint of its relevance to social research. Moreover, it hints at the threads that lead to computer science from mathematical logic and from philosophical questions about the limits and the power both of mathematics and the human mind. In particular, the paper addresses Turing's idea of the oracle, considering its impact on computational (i.e., simulation-based) economics, sociology, etc. An oracle is meant as a device capable of finding the values of uncomputable functions. Such an idealized entity is exemplified by the human mind's procedure of recognizing the truth of the Gödelian sentence, of identifying uncomputable numbers through Turing's diagonal procedure, etc. Since such procedures are strictly defined and are as reliable as any calculations, they deserve to be called computation as well. From computation in the strict sense, defined as a purely algorithmic (mechanical) process, one distinguishes them with the term "hypercomputation". Now the following questions arise. - Are there undecidable problems (i.e., not decidable with appropriate algorithms) in social research, as there are (according to what is reported, esp. by S. Wolfram) in the natural sciences? An answer in the affirmative would impose limitations on computer simulations (as entirely relying on algorithms). - If there are, then we have the next question: can such problems be addressed with hypercomputational procedures? - How would such hypercomputational procedures be related to analog computation (coextensive, overlapping, etc.)? Another set of issues is stated in terms of the tractability of decidable problems, that is, the efficiency of algorithms needed for solutions. Regarded as inefficient are those algorithms which require more resources (time, memory, etc.) than will be available in the foreseeable future.
In this context, one discusses methods of organizing computational processes efficiently so as to overcome the scarcity of resources; thus parallel, distributed, interactive, etc. computing are used as remedies. The paper claims, hinting at F. Hayek's ideas, that in some social systems (e.g., the stock exchange, and the free market in general) such an efficient organization of their computational activities spontaneously evolves. And this is the main source of their advantages over central economic planning (as defended by O. Lange). Noticing this analogy (in terms of complexity theory) between Hayek's point and the current discussion of the efficiency of algorithms is what may count as the original contribution of the present paper.
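The diagonal procedure invoked in the abstract can be shown concretely. In the sketch below (a standard textbook construction, with illustrative names), a finite table stands in for an infinite enumeration of binary expansions; flipping the diagonal produces a row that provably differs from every row in the table, which is the core of the argument that the real numbers, and hence the uncomputable ones among them, cannot all be enumerated.

```python
def diagonalize(table):
    """Cantor/Turing diagonalization: flip the n-th bit of the n-th row.

    The result differs from row n at position n, so it cannot equal
    any row of the (square) table passed in.
    """
    return [1 - table[n][n] for n in range(len(table))]

# A finite stand-in for an enumeration of binary sequences.
table = [
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 0],
]
anti = diagonalize(table)
print(anti)  # differs from row n at column n, so it is not in the table
assert all(anti[n] != table[n][n] for n in range(len(table)))
```

Applied to an enumeration of all algorithmically computable sequences, the same construction yields a well-defined sequence that no algorithm computes, which is the sense in which the procedure "identifies" an uncomputable number.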
On the one hand, means of transport can be considered as media which shape the perception of space; on the other, they can be considered as milieus which produce certain forms of social interaction. In order to relate both perspectives to each other, the present contribution outlines a topology of vehicles, drawing upon contemporary French literature set in cities. Their detailed representation of certain means of transport shows that literary texts not only decipher modes of spatial perception that are specific to certain vehicles, but also devise new ways of using vehicles through their involvement with an outdated culture of transport.
Evolutionary pressure selects for the most efficient way of information processing by the brain. This is achieved by focussing neuronal processing onto essential environmental objects, by using focussing devices as pointers to different objects rather than reestablishing new representations, and by using external storage bound to internal representations by pointers. Would external storage increase the capacity of cognitive processing?
Peer review is a widely accepted instrument for raising the quality of science. Peer review limits the enormous unstructured influx of information and the sheer amount of dubious data, which in its absence would plunge science into chaos. In particular, peer review offers the benefit of eliminating papers that suffer from poor craftsmanship or methodological shortcomings, especially in the experimental sciences. However, we believe that peer review is not always appropriate for the evaluation of controversial hypothetical science. We argue that the process of peer review can be prone to bias towards ideas that affirm the prior convictions of reviewers and against innovation and radical new ideas. Innovative hypotheses are thus highly vulnerable to being “filtered out” or made to accord with conventional wisdom by the peer review process. Consequently, having introduced peer review, the Elsevier journal Medical Hypotheses may be unable to continue its tradition as a radical journal allowing discussion of improbable or unconventional ideas. Hence we conclude by asking the publisher to consider re-introducing the system of editorial review to Medical Hypotheses.