To explore the relation between mathematical models and reality, four different domains of reality are distinguished: observer-independent reality (to which there is no direct access), personal reality, social reality and mathematical/formal reality. The concepts of personal and social reality are strongly inspired by constructivist ideas. Mathematical reality is social as well, but constructed as an autonomous system in order to make absolute agreement possible. The essential problem of mathematical modelling is that within mathematics there is agreement about ‘truth’, but the assignment of mathematics to informal reality is not itself formally analysable: it depends on social and personal construction processes, and on these levels absolute agreement cannot be expected. Starting from this point of view, the repercussions of mathematical reality on social and personal reality, the historical development of mathematical modelling, and the role, use and interpretation of mathematical models in scientific practice are discussed.
The genetic code appeared on Earth with the first cells. The codes of cultural evolution arrived almost four billion years later. These are the only codes that are recognized by modern biology. In this book, however, Marcello Barbieri explains that there are many more organic codes in nature, and their appearance not only took place throughout the history of life but marked the major steps of that history. A code establishes a correspondence between two independent 'worlds', and the codemaker is a third party between those 'worlds'. Therefore the cell can be thought of as a trinity of genotype, phenotype and ribotype. The ancestral ribotypes were the agents which gave rise to the first cells. The book goes on to explain how organic codes and organic memories can be used to shed new light on the problems encountered in cell signalling, epigenesis, embryonic development, and the evolution of language.
'Mass terms', words like water, rice and traffic, have proved very difficult to accommodate in any theory of meaning since, unlike count nouns such as house or dog, they cannot be viewed as part of a logical set and differ in their grammatical properties. In this study, motivated by the need to design a computer program for understanding natural language utterances incorporating mass terms, Harry Bunt provides a thorough analysis of the problem and offers an original and detailed solution. An extension of classical set theory, Ensemble Theory, is defined, and this provides the conceptual basis of a framework for the analysis of natural language meaning which Dr Bunt calls Two-level model-theoretic semantics. The validity of the framework is convincingly demonstrated by the formal analysis of a fragment of English including sentences with quantified and modified mass terms. Separate chapters of the book are devoted to an axiomatic definition of Ensemble Theory and a detailed discussion of its status as a mathematical formalism.
We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice versa. The semantics is compositional, variable-free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the ‘usual’ first order formula translating the sentence holds. The examples include negation, universal quantifiers and relative pronouns.
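As an illustration of how functional and vector expressions can correspond (a toy example in the spirit of compositional vector semantics, not the paper's actual translation algorithm), a transitive verb can be interpreted as a matrix, that is, a bilinear map on noun vectors, so that the value of a sentence is computed by contraction:

```latex
% Illustrative sketch only (not the paper's algorithm): noun meanings as
% vectors, a transitive verb as a matrix, i.e. a bilinear map; the
% one-dimensional sentence space is an assumption of this gloss.
\[
  \llbracket \textit{John likes Mary} \rrbracket
  \;=\; \vec{j}^{\,\top} L \, \vec{m}
  \;=\; \sum_{i,k} j_i \, L_{ik} \, m_k ,
  \qquad \vec{j}, \vec{m} \in \mathbb{R}^{n},\ L \in \mathbb{R}^{n \times n}.
\]
% With one-hot noun vectors and L the 0/1 matrix of the relation, this
% agrees with the first-order reading likes(John, Mary).
```

Note that the vector side is variable-free while the functional side carries explicit variables: this is the kind of correspondence a translation algorithm between the two semantics has to mediate.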
The present paper argues that ‘mature mathematical formalisms’ play a central role in achieving representation via scientific models. A close discussion of two contemporary accounts of how mathematical models apply—the DDI account (according to which representation depends on the successful interplay of denotation, demonstration and interpretation) and the ‘matching model’ account—reveals shortcomings of each, which, it is argued, suggests that scientific representation may be ineliminably heterogeneous in character. In order to achieve a degree of unification that is compatible with successful representation, scientists often rely on the existence of a ‘mature mathematical formalism’, where the latter refers to a—mathematically formulated and physically interpreted—notational system of locally applicable rules that derive from (but need not be reducible to) fundamental theory. As mathematical formalisms undergo a process of elaboration, enrichment, and entrenchment, they come to embody theoretical, ontological, and methodological commitments and assumptions. Since these are enshrined in the formalism itself, they are no longer readily obvious to either the novice or the proficient user. At the same time as formalisms constrain what may be represented, they also function as inferential and interpretative resources.
Mathematical models of tumour invasion appear as interesting tools for connecting the information extracted from medical imaging techniques and the large amount of data collected at the cellular and molecular levels. Most of the recent studies have used stochastic models of cell translocation for the comparison of computer simulations with histological solid tumour sections in order to discriminate and characterise expansive growth and active cell movements during host tissue invasion. This paper describes how a deterministic approach based on reaction-diffusion models and their generalisation in the mechano-chemical framework developed in the study of biological morphogenesis can be an alternative for analysing tumour morphological patterns. We support these considerations by reviewing two studies. In the first example, successful comparison of simulated brain tumour growth with a time sequence of computerised tomography (CT) scans leads to a quantification of the clinical parameters describing the invasion process and the therapy. The second example considers minimal hypotheses relating cell motility and cell traction forces. Using this model, we can simulate the bifurcation from a homogeneous distribution of cells at the tumour surface toward a nonhomogeneous density pattern which could characterise a pre-invasive stage at the tumour-host tissue interface.
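For concreteness, the reaction-diffusion approach mentioned here is typified by equations of the following form (a representative model combining motility and proliferation; the reviewed studies may use more elaborate mechano-chemical variants, and the notation here is generic):

```latex
% Generic reaction-diffusion model of tumour cell density c(x,t):
% spatial spread by (possibly tissue-dependent) diffusion plus net
% logistic proliferation. Parameter names are assumptions of this sketch.
\[
  \frac{\partial c}{\partial t}
  \;=\;
  \underbrace{\nabla \cdot \big( D(\mathbf{x})\, \nabla c \big)}_{\text{motility (diffusion)}}
  \;+\;
  \underbrace{\rho\, c \Big( 1 - \frac{c}{K} \Big)}_{\text{proliferation}}
\]
```

Fitting the motility coefficient D and the proliferation rate ρ to a time sequence of images is the kind of procedure that yields the quantified clinical parameters referred to above.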
Vector-based models of word meaning have become increasingly popular in cognitive science. The appeal of these models lies in their ability to represent meaning simply by using distributional information under the assumption that words occurring within similar contexts are semantically similar. Despite their widespread use, vector-based models are typically directed at representing words in isolation, and methods for constructing representations for phrases or sentences have received little attention in the literature. This is in marked contrast to experimental evidence (e.g., in sentential priming) suggesting that semantic similarity is more complex than simply a relation between isolated words. This article proposes a framework for representing the meaning of word combinations in vector space. Central to our approach is vector composition, which we operationalize in terms of additive and multiplicative functions. Under this framework, we introduce a wide range of composition models that we evaluate empirically on a phrase similarity task.
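A minimal sketch of the two composition functions named here, using made-up toy vectors (the numbers are hypothetical, chosen only to make the contrast visible):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: the standard measure of distributional similarity."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def additive(u, v):
    """Additive composition: the phrase vector is the sum of the word vectors."""
    return u + v

def multiplicative(u, v):
    """Multiplicative composition: element-wise product, which picks out
    the contexts the two words share."""
    return u * v

# Hypothetical distributional vectors over four contexts (toy numbers).
horse  = np.array([4.0, 1.0, 0.0, 2.0])
run    = np.array([3.0, 0.0, 1.0, 2.0])
gallop = np.array([3.5, 0.5, 0.5, 2.0])

for name, f in [("additive", additive), ("multiplicative", multiplicative)]:
    phrase = f(horse, run)
    print(name, cosine(phrase, gallop))
```

The design difference is what gets evaluated empirically: addition lets every dimension of both words contribute to the phrase, while multiplication acts as an intersective filter on shared contexts.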
In this essay I argue against I. Bernard Cohen's influential account of Newton's methodology in the Principia: the 'Newtonian Style'. The crux of Cohen's account is the successive adaptation of 'mental constructs' through comparisons with nature. In Cohen's view there is a direct dynamic between the mental constructs and physical systems. I argue that his account is essentially hypothetical-deductive, which is at odds with Newton's rejection of the hypothetical-deductive method. An adequate account of Newton's methodology needs to show how Newton's method proceeds differently from the hypothetical-deductive method. In the constructive part I argue for my own account, which is model based: it focuses on how Newton constructed his models in Book I of the Principia. I will show that Newton understood Book I as an exercise in determining the mathematical consequences of certain force functions. The growing complexity of Newton's models is a result of exploring increasingly complex force functions (intra-theoretical dynamics) rather than a successive comparison with nature (extra-theoretical dynamics). Nature did not enter the scene here. This intra-theoretical dynamics is related to the 'autonomy of the models'.
There are presently two leading foreign policy decision-making paradigms in vogue. The first is based on the classical or rational model originally posited by von Neumann and Morgenstern to explain microeconomic decisions. The second is based on the cybernetic perspective whose groundwork was laid by Herbert Simon in his early research on bounded rationality. In this paper we introduce a third perspective — the poliheuristic theory of decision-making — as an alternative to the rational actor and cybernetic paradigms in international relations. This theory is drawn in large part from research on heuristics done in experimental cognitive psychology. According to the poliheuristic theory, policy makers use poly (many) heuristics while focusing on a very narrow range of options and dimensions when making decisions. Among them, the political dimension is noncompensatory. The paper also delineates the mathematical formulations of the three decision-making models.
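The noncompensatory role of the political dimension is commonly formalized as a two-stage process. The following sketch assumes hypothetical options, scores, weights and threshold, purely for illustration:

```python
# A minimal sketch of a poliheuristic two-stage choice (illustrative only).
# Stage 1 applies the noncompensatory political dimension: any option
# scoring below a threshold on it is eliminated, no matter how well it
# does elsewhere. Stage 2 picks among the survivors by a compensatory
# weighted sum (a rational-actor rule).
options = {
    "use_force": {"political": 0.2, "military": 0.9, "economic": 0.6},
    "sanctions": {"political": 0.7, "military": 0.4, "economic": 0.3},
    "negotiate": {"political": 0.8, "military": 0.3, "economic": 0.7},
}
weights = {"political": 0.5, "military": 0.3, "economic": 0.2}
POLITICAL_THRESHOLD = 0.5  # assumed cutoff

# Stage 1: noncompensatory elimination on the political dimension.
viable = {name: dims for name, dims in options.items()
          if dims["political"] >= POLITICAL_THRESHOLD}

# Stage 2: compensatory evaluation of the remaining options.
def score(dims):
    return sum(weights[d] * dims[d] for d in weights)

best = max(viable, key=lambda name: score(viable[name]))
print(best)  # -> "negotiate" with these made-up numbers
```

Note how "use_force" is screened out in stage 1 despite its high military score: no strength on other dimensions can compensate for failure on the political one.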
An influential position in the philosophy of biology claims that there are no biological laws, since any apparently biological generalization is either too accidental, fact-like or contingent to be named a law, or is simply reducible to physical laws that regulate electrical and chemical interactions taking place between merely physical systems. In the following I will stress a neglected aspect of the debate that emerges directly from the growing importance of mathematical models of biological phenomena. My main aim is to defend, as well as reinforce, the view that there are indeed laws also in biology, and that their difference in stability, contingency or resilience with respect to physical laws is one of degree, and not of kind. In order to reach this goal, in the next sections I will advance the following two arguments in favor of the existence of biological laws, both of which are meant to stress the similarity between physical and biological laws.
The paper discusses how systems biology is working toward complex accounts that integrate explanation in terms of mechanisms and explanation by mathematical models—which some philosophers have viewed as rival models of explanation. Systems biology is an integrative approach, and it strongly relies on mathematical modeling. Philosophical accounts of mechanisms capture integrative explanations in the sense of multilevel and multifield explanations, yet accounts of mechanistic explanation (as the analysis of a whole in terms of its structural parts and their qualitative interactions) have failed to address how a mathematical model could contribute to such explanations. I discuss how mathematical equations can be explanatorily relevant. Several cases from systems biology are discussed to illustrate the interplay between mechanistic research and mathematical modeling, and I point to questions about qualitative phenomena (rather than the explanation of quantitative details), where quantitative models are still indispensable to the explanation. Systems biology shows that a broader philosophical conception of mechanisms is needed, which takes into account functional-dynamical aspects, interaction in complex networks with feedback loops, system-wide functional properties such as distributed functionality and robustness, and a mechanism’s ability to respond to perturbations (beyond its actual operation). I offer general conclusions for philosophical accounts of explanation.
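To illustrate the kind of case at issue, where a quantitative model is needed to explain a qualitative, system-wide property, consider a toy negative-feedback mechanism (hypothetical, not one of the paper's cases): the equations, rather than the list of parts alone, show why the system robustly returns to its steady state after a perturbation.

```python
# Toy negative-feedback loop: x activates y, y represses x.
# Euler integration; all parameters are hypothetical.
def simulate(t_end=100.0, dt=0.01, perturb_at=50.0, perturb=0.0):
    k1, d1, k2, d2 = 1.0, 0.5, 0.5, 0.5
    x, y = 0.5, 0.5
    for step in range(int(t_end / dt)):
        if abs(step * dt - perturb_at) < dt / 2:
            x += perturb  # transient external perturbation of x
        dx = k1 / (1.0 + y) - d1 * x   # production of x, repressed by y
        dy = k2 * x - d2 * y           # production of y, activated by x
        x, y = x + dt * dx, y + dt * dy
    return round(x, 3), round(y, 3)

print(simulate(perturb=0.0))  # unperturbed steady state, ~(1.0, 1.0)
print(simulate(perturb=2.0))  # the same state is recovered: robustness
```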
The dominant approach to analyzing the meaning of natural language sentences that express mathematical knowledge relies on a referential, formal semantics. Below, I discuss an argument against this approach and in favour of an internalist, conceptual, intensional alternative. The proposed shift in analytic method offers several benefits, including a novel perspective on what is required to track mathematical content, and hence on the Benacerraf dilemma. The new perspective also promises to facilitate discussion between philosophers of mathematics and cognitive scientists working on topics of common interest.
In this commentary on Napoletani et al. (Found Sci 16:1–20, 2011), we argue that the approach the authors adopt suggests that neural nets are mathematical techniques rather than models of cognitive processing, that the general approach dates as far back as Ptolemy, and that applied mathematics is more than simply applying results from pure mathematics.
The problem of whether the Lambek Calculus is complete with respect to (w.r.t.) relational semantics has been raised several times, cf. van Benthem (1989a) and van Benthem (1991). In this paper, we show that the answer is in the affirmative. More precisely, we will prove that the version of the Lambek Calculus which does not use the empty sequence is strongly complete w.r.t. those relational Kripke-models where the set of possible worlds, W, is a transitive binary relation, while the version of the Lambek Calculus where we admit the empty sequence as the antecedent of a sequent is strongly complete w.r.t. those relational models where W = U × U for some set U. We will also look into extendability of this completeness result to various fragments of Girard's Linear Logic as suggested in van Benthem (1991), p. 235, and investigate the connection between the Lambek Calculus and language models.
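For reference, the relational interpretation at issue is the standard one: each type is interpreted as a binary relation, product as relational composition, and the two slashes as its residuals (generic notation, stated as background):

```latex
\[
\llbracket A \cdot B \rrbracket \;=\; \{(x,z) : \exists y\,\big((x,y) \in \llbracket A \rrbracket \wedge (y,z) \in \llbracket B \rrbracket\big)\},
\]
\[
\llbracket A \backslash B \rrbracket \;=\; \{(y,z) : \forall x\,\big((x,y) \in \llbracket A \rrbracket \rightarrow (x,z) \in \llbracket B \rrbracket\big)\},
\]
\[
\llbracket B / A \rrbracket \;=\; \{(x,y) : \forall z\,\big((y,z) \in \llbracket A \rrbracket \rightarrow (x,z) \in \llbracket B \rrbracket\big)\},
\]
```

with all types interpreted as subrelations of the transitive relation W (or of U × U in the variant admitting empty antecedents).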
A plurality of axiomatic systems can be interpreted as referring to one and the same mathematical object. In this paper we examine the relationship between axiomatic systems and their models, the relationships among the various axiomatic systems that refer to the same model, and the role of an intelligent user of an axiomatic system. We ask whether these relationships and this role can themselves be formalized.
We develop a semantics for independence logic with respect to what we will call general models. We then introduce a simpler entailment semantics for the same logic, and we reduce the validity problem in the former to the validity problem in the latter. Then we build a proof system for independence logic and prove its soundness and completeness with respect to entailment semantics.
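For orientation, the characteristic addition of independence logic to first-order logic is the independence atom, evaluated on teams (sets of assignments); its standard team-semantics clause, due to Grädel and Väänänen, reads:

```latex
% X is a team (a set of assignments); this is the standard clause for the
% independence atom, stated here as background to the abstract.
\[
X \models \vec{y} \mathbin{\perp_{\vec{x}}} \vec{z}
\iff
\forall s, s' \in X\;\big( s(\vec{x}) = s'(\vec{x})
\;\Rightarrow\;
\exists s'' \in X :\; s''(\vec{x}\vec{y}) = s(\vec{x}\vec{y})
\;\wedge\; s''(\vec{z}) = s'(\vec{z}) \big).
\]
```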
Ever since the early decades of this century, there have emerged a number of competing schools of ecology that have attempted to weave the concepts underlying natural resource management and natural-historical traditions into a formal theoretical framework. It was widely believed that the discovery of the fundamental mechanisms underlying ecological phenomena would allow ecologists to articulate mathematically rigorous statements whose validity was not predicated on contingent factors. The formulation of such statements would elevate ecology to the standing of a rigorous scientific discipline on a par with physics. However, there was no agreement as to the fundamental units of ecology. Systems ecologists sought to identify the fundamental organization that tied the physical and biological components of ecosystems into an irreducible unit: the ecosystem was their fundamental unit. Population ecologists sought, instead, to identify the biological mechanisms regulating the abundance and distribution of plant and animal species: to these ecologists, the individual organism was the fundamental unit of ecology, and the physical environment was nothing more than a stage upon which the play of individuals in perennial competition took place. As Joel Hagen has pointed out, the two schools were thus divided by fundamentally different and irreconcilable assumptions about the nature of ecosystems. Notwithstanding these divisive efforts to elevate the image of ecology, the discipline remained in the shadows of American academia until the mid-1960s, when systems ecologists succeeded in projecting ecology onto the national scene. They did so by seeking closer involvement with practical problems: they argued before Congress that their approach to the theoretical problems of ecology was uniquely suited to the solution of the impending “environmental crisis.” With the establishment of the International Biological Program, they succeeded in attracting unprecedented levels of funding for systems ecology research. Theoretical population ecologists, on the other hand, found themselves consigned to the outer regions of this new institutional landscape. The systems ecologists' successful capture of the limelight and the purse brought the divisions between them and population ecologists into sharper relief — hence the hardening of the division of ecology observed by Hagen. I have argued that the population biologist Richard Levins, prompted by these institutional developments, sought to challenge the social position of systems ecology, and to assert the intellectual priority of theoretical population ecology. He attempted to do so by articulating a nontrivial and rather carefully thought out classification of ecological models that led to the disqualification of systems analysis as a legitimate approach to the study of ecological phenomena. I have suggested that — ultimately — Levins's case against systems analysis in ecology rested on the view that an aspiration to realism and prediction was incompatible with an interest in theoretical issues, a concern that he equated with the search for generality.
He sought to reinforce this argument by exploiting the fact that systems ecologists had staked their future on the provision of technical solutions to the problems of the “environmental crisis”: he associated systems ecologists' aspiration to realism and precision with a concern for practical issues, trading on the widely accepted view that practical imperatives are incompatible with the aims of scientific inquiry. These are plausible, but nonetheless questionable, claims which have now become an integral part of ecological knowledge. And finally, I hope to have shown how even the most abstract levels of scientific argument are shaped by political considerations, and how discussions of the conceptual development of modern ecology might benefit from a greater consideration of its historical and social dimensions.
This introduction to mathematical logic starts with propositional calculus and first-order logic. Topics covered include syntax, semantics, soundness, completeness, independence, normal forms, vertical paths through negation normal formulas, compactness, Smullyan's Unifying Principle, natural deduction, cut-elimination, semantic tableaux, Skolemization, Herbrand's Theorem, unification, duality, interpolation, and definability. The last three chapters of the book provide an introduction to type theory (higher-order logic). It is shown how various mathematical concepts can be formalized in this very expressive formal language. This expressive notation facilitates proofs of the classical incompleteness and undecidability theorems which are very elegant and easy to understand. The discussion of semantics makes clear the important distinction between standard and nonstandard models which is so important in understanding puzzling phenomena such as the incompleteness theorems and Skolem's Paradox about countable models of set theory. Some of the numerous exercises require giving formal proofs. A computer program called ETPS, which is available from the web, facilitates doing and checking such exercises. Audience: This volume will be of interest to mathematicians, computer scientists, and philosophers in universities, as well as to computer scientists in industry who wish to use higher-order logic for hardware and software specification and verification.
The process of constructing mathematical models is examined and a case made that the construction process is an integral part of the justification for the model. The role of heuristics in testing and modifying models is described and some consequences for scientific methodology are drawn out. Three different ways of constructing the same model are detailed to demonstrate the claims made here.
Tarskian model theory is almost universally understood as a formal counterpart of the preformal notion of semantics, of the “linkage between words and things”. The widespread opinion is that to account for the semantics of natural language is to furnish its set-theoretic interpretation in a suitable model structure, as exemplified by Montague 1974.
The preceding theory represents, I believe, a large improvement over conceptual graph theories of analogy. In particular, it is possible for analogical reasoning to be flexible or ‘creative’ on this approach, an aspect of analogy that is not accounted for in conceptual graph theories. I also believe that searching by constraint violations is a more reasonable way to organize memory search than to look for properties of conceptual hierarchies. Proof of this last point, however, awaits a more detailed classification of the constraints that figure in analogical reasoning.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest. We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally-recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon.
It therefore contains mini-summaries of: biological visual processing, especially the retinocortical and ventral ('what') parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs. experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book's website. Its central contributions are the discovery of several algorithmic similarities between vision and semantics, the support of these claims by means of simulations, and the packaging of all of this in a coherent theoretical framework.
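The two-layer LVQ setup described above can be sketched in a few lines; the following toy simulation is illustrative only (Python rather than the book's MATLAB, with an assumed correlation coding and made-up sizes): 'all'-type inputs are constructed maximally correlated, 'no'-type inputs maximally anticorrelated, and one prototype per class is trained with the LVQ1 rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_example(op):
    """Correlation coding (an assumption of this sketch): for 'all' the two
    halves of the input are copies (maximally correlated); for 'no' the
    second half is sign-flipped (maximally anticorrelated)."""
    part = rng.random(4)  # non-negative toy activations
    other = part if op == "all" else -part
    return np.concatenate([part, other])

labels = ["all", "no"]
# Initialize each class prototype from one example of that class.
prototypes = {op: make_example(op) for op in labels}

def classify(x):
    """Competitive (selective) stage: the nearest prototype wins."""
    return min(labels, key=lambda p: np.linalg.norm(x - prototypes[p]))

# LVQ1 training: move the winning prototype toward the input if its class
# matches the input's class, away from it otherwise.
lr = 0.05
for _ in range(2000):
    op = labels[rng.integers(2)]
    x = make_example(op)
    winner = classify(x)
    sign = 1.0 if winner == op else -1.0
    prototypes[winner] += sign * lr * (x - prototypes[winner])

# Fresh examples: the prototypes now separate the two correlation classes.
correct = sum(classify(make_example(op)) == op
              for op in labels for _ in range(100))
print(correct, "out of 200")
```

The competitive stage is selective (a nearest-prototype winner), in the way the book's analogy assigns to simple cells, while the class read-out generalizes over instances, as complex cells do.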
Employing the theory of Birkhoff polarities as a model of model theory yields an inductively defined dual structure which is a formalization of semantics and which allows for simple proofs of some new results for model theory.
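For readers unfamiliar with the term, a Birkhoff polarity is the Galois connection induced by an arbitrary binary relation, and its stable sets form a complete lattice; this is the standard dual structure the abstract alludes to (generic notation, stated as background rather than as the paper's own construction):

```latex
\[
\text{Given } I \subseteq G \times M:\qquad
X^{\triangleright} = \{\, m \in M : \forall g \in X,\; g \mathrel{I} m \,\},
\qquad
Y^{\triangleleft} = \{\, g \in G : \forall m \in Y,\; g \mathrel{I} m \,\},
\]
\[
X \subseteq Y^{\triangleleft} \iff Y \subseteq X^{\triangleright},
\qquad
\{\, X \subseteq G : X = X^{\triangleright\triangleleft} \,\} \text{ forms a complete lattice.}
\]
```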
The use of mathematical models to support decision making is proliferating in both the public and private sectors. Advances in computer technology and greater opportunities to learn the appropriate techniques are extending modeling capabilities to more and more people. As powerful decision aids, models can be both beneficial and harmful. At present, few safeguards exist to prevent model builders or users from deliberately, carelessly, or recklessly manipulating data to further their own ends. Perhaps more importantly, few people understand or appreciate that harm can be caused when builders or users fail to recognize the values and assumptions on which a model is based or fail to take into account all the groups who would be affected by a model's results. This volume provides a setting for a dialogue about ethics and shows the need to continue and define a vocabulary for exploring ethical concerns. It will become increasingly important for model builders and users to have a clear and strong code of ethics to guide them in making the ethical decisions they surely will have to face.
This paper is concerned with scientific reasoning in the engineering sciences. Engineering sciences aim at explaining, predicting and describing physical phenomena occurring in technological devices. The focus of this paper is on mathematical description. These mathematical descriptions are important to computer-aided engineering or design programs (CAE and CAD). The first part of this paper explains why a traditional view, according to which scientific laws explain and predict phenomena and processes, is problematic. In the second part, the reasons for these methodological difficulties are analyzed. Ludwig Prandtl’s method of integrating a theoretical and empirical approach is used as an example of good scientific practice. Based on this analysis, a distinction is made between different types of laws that play a role in constructing mathematical descriptions of phenomena. A central assumption in understanding research methodology is that, instead of scientific laws, knowledge of capacities and mechanisms is primary in the engineering sciences. Another important aspect of the methodology of the engineering sciences is that, in explaining a phenomenon or process, spatial regions are distinguished in which distinct physical behaviours occur. The mechanisms in distinct spatial regions are represented in a so-called diagrammatic model. The construction of a mathematical description of the phenomenon or process is based on this diagrammatic model.
One of the important challenges in the philosophy of mathematics is to account for the semantics of sentences that express mathematical propositions while simultaneously explaining our access to their contents. This is Benacerraf’s Dilemma. In this dissertation, I argue that cognitive science furnishes new tools by means of which we can make progress on this problem. The foundation of the solution, I argue, must be an ontologically realist, albeit non-platonist, conception of mathematical reality. The semantic portion of the problem can be addressed by accepting a Chomskyan conception of natural languages and a matching internalist, mentalist and nativist view of semantics. A helpful perspective on the epistemic aspect of the puzzle can be gained by translating Kurt Gödel’s neo-Kantian conception of the nature of mathematics and its objects into modern, cognitive terms.
This book presents a detailed analysis of three ancient models of spatial magnitude, time, and local motion. The Aristotelian model is presented as an application of the ancient, geometrically orthodox conception of extension to the physical world. The other two models, which represent departures from mathematical orthodoxy, are a "quantum" model of spatial magnitude, and a Stoic model, according to which limit entities such as points, edges, and surfaces do not exist in (physical) reality. The book is unique in its discussion of these ancient models within the context of later philosophical, scientific, and mathematical developments.
Two approaches for defining common knowledge coexist in the literature: the infinite iteration definition and the circular or fixed point one. In particular, an original modelization of the fixed point definition was proposed by Barwise (1989) in the context of a non-well-founded set theory, and the infinite iteration approach has been technically analyzed within multi-modal epistemic logic using neighbourhood semantics by Lismont (1993). This paper exhibits a relation between these two ways of modelling common knowledge, which seem at first quite different.
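The two definitions being related can be stated side by side (standard notation; E is the "everybody knows" operator, the conjunction of the individual knowledge operators):

```latex
\[
E\varphi \;:=\; \bigwedge_{i \in N} K_i \varphi,
\qquad
\underbrace{C\varphi \;\equiv\; \bigwedge_{n \geq 1} E^{n}\varphi}_{\text{infinite iteration}},
\qquad
\underbrace{C\varphi \;\leftrightarrow\; E(\varphi \wedge C\varphi)}_{\text{fixed point (greatest solution)}}.
\]
```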
Ruchkin et al.'s view of working memory as activated long-term memory is more compatible with language processing than models such as Baddeley's, but it raises questions about individual differences in working memory and the validity of domain-general capacity estimates. Does it make sense to refer to someone as having low working memory capacity if capacity depends on particular knowledge structures tapped by the task?
In this paper we prove strong completeness of axiomatic extensions of first-order strict core fuzzy logics with the so-called quasi-witnessed axioms with respect to quasi-witnessed models. As a consequence we obtain strong completeness of Product Predicate Logic with respect to quasi-witnessed models, already proven by M.C. Laskowski and S. Malekpour. Finally we study similar problems for expansions with Δ, define Δ-quasi-witnessed axioms and prove that any axiomatic extension of a first-order strict core fuzzy logic, expanded with Δ and the Δ-quasi-witnessed axioms, is complete with respect to Δ-quasi-witnessed models.
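For orientation (standard definitions, not specific to this paper): in Product Logic the strong conjunction is the product t-norm with its residuum as implication, quantifiers are evaluated by suprema and infima of truth values, and a model is witnessed when these are attained; quasi-witnessed, roughly, when each supremum is attained and each infimum is either attained or equal to 0 (this last gloss is an assumption of this note):

```latex
\[
x * y = x \cdot y,
\qquad
x \Rightarrow y =
\begin{cases}
1 & \text{if } x \le y,\\[2pt]
y/x & \text{otherwise,}
\end{cases}
\]
\[
\|\exists x\,\varphi\|^{M}_{v} = \sup_{a \in M} \|\varphi(a)\|^{M}_{v},
\qquad
\|\forall x\,\varphi\|^{M}_{v} = \inf_{a \in M} \|\varphi(a)\|^{M}_{v}.
\]
```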