Logic and Computation is concerned with techniques for formal theorem-proving, with particular reference to Cambridge LCF (Logic for Computable Functions). Cambridge LCF is a computer program for reasoning about computation. It combines methods of mathematical logic with domain theory, the basis of the denotational approach to specifying the meaning of statements in a programming language. This book consists of two parts. Part I outlines the mathematical preliminaries: elementary logic and domain theory. They are explained at an intuitive level, giving references to more advanced reading. Part II provides enough detail to serve as a reference manual for Cambridge LCF. It will also be a useful guide for implementors of other programs based on the LCF approach.
Caplan & Waters argue that the processing resources used for sentence comprehension are not drawn from an undifferentiated verbal working memory resource. This commentary cites data from normal aging to support this position. Still lacking in theory development is a specification of the transient memory representations necessary for interpretive and post-interpretive operations.
This paper gives an overview of the common approach to quantification and generalised quantification in formal linguistics and philosophy of language. We point out how this usual general framework represents a departure from empirical linguistic data. We briefly sketch a different idea for proof theory which is closer to the language itself than standard approaches in many respects. We stress the importance of Hilbert's operators: the epsilon-operator for existential and the tau-operator for universal quantification. Indeed, these operators are helpful in the construction of a semantic representation which is close to natural language, in particular with quantified noun phrases as individual terms. We also define guidelines for the design of proof rules corresponding to generalized quantifiers.
The paper has three parts. First, a survey and analysis is given of the structure of individual rights in the recent EU Directive on data protection. It is argued that at the core of this structure is an unexplicated notion of what the data subject can 'reasonably expect' concerning the further processing of information about him or herself. In the second part of the paper it is argued that theories of privacy popular among philosophers are not able to shed much light on the issues treated in the Directive, which are, arguably, among the central problems pertaining to the protection of individual rights in the information society. In the third part of the paper, some suggestions are made for a richer philosophical theory of data protection and privacy. It is argued that this account is better suited to the task of characterizing the central issues raised by the Directive.
Pickering & Garrod's (P&G's) integrated model of production and comprehension includes no explicit role for nonlinguistic cognitive processes. Yet, how domain-general cognitive functions contribute to language processing has become clearer with well-specified theories and supporting data. We therefore believe that their account can benefit by incorporating functions like working memory and cognitive control into a unified model of language processing.
This article explores the ways in which data centre operators are currently reconfiguring the systems of energy and heat supply in European capitals, replacing conventional forms of heating with data-driven heat production, and becoming important energy suppliers. Taking as an empirical object the heat generated from server halls, the article traces the expanding phenomenon of 'waste heat recycling' and charts the ways in which data centre operators in Stockholm and Paris direct waste heat through metropolitan district heating systems and urban homes, and valorise it. Drawing on new materialisms, infrastructure studies and classical theory of production and destruction of value in capitalism, the article outlines two modes in which this process happens, namely infrastructural convergence and decentralisation of the data centre. These modes arguably help data centre operators convert big data from a source of value online into a raw material that needs to flow in the network irrespective of meaning. In this conversion process, the article argues, a new commodity is in a process of formation, that of computation traffic. Altogether, data-driven heat production is suggested to raise the importance of certain data-processing nodes in Northern Europe, simultaneously intervening in the global politics of access, while neutralising external criticism towards big data by making urban life literally dependent on power from data streams.
Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., & Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., & Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., & Spelke, L. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, and (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language Acquisition, 11(4), 197-218]. In doing so, this work ties together research on the acquisition of the number vocabulary with a growing body of work on the development of quantification and sentence processing abilities in young children [Geurts, 2003; Lidz, J., & Musolino, J. (2002). Children's command of quantification. Cognition, 84, 113-154; Musolino, J., & Lidz, J. (2003). The scope of isomorphism: Turning adults into children. Language Acquisition, 11(4), 277-291; Trueswell, J., Sekerina, I., Hilland, N., & Logrip, M. (1999). The kindergarten-path effect: Studying on-line sentence processing in young children. Cognition, 73, 89-134; Noveck, I. (2001). When children are more logical than adults: Experimental investigations of scalar implicature. Cognition, 78, 165-188; Noveck, I., Guelminger, R., Georgieff, N., & Labruyere, N. (2007). What autism can tell us about every ... not sentences. Journal of Semantics, 24(1), 73-90]. On a more general level, this work confirms the importance of integrating formal and developmental perspectives [Musolino, 2004], this time by highlighting the explanatory power of linguistically-based models of language acquisition and by showing that the complex structure postulated by linguists has important implications for developmental accounts of the number vocabulary.
The automatic verification of large parts of mathematics has been an aim of many mathematicians from Leibniz to Hilbert. While Gödel's first incompleteness theorem showed that no computer program could automatically prove certain true theorems in mathematics, the advent of electronic computers and sophisticated software means in practice there are many quite effective systems for automated reasoning that can be used for checking mathematical proofs. This book describes the use of a computer program to check the proofs of several celebrated theorems in metamathematics, including those of Gödel and Church-Rosser. The computer verification using the Boyer-Moore theorem prover yields precise and rigorous proofs of these difficult theorems. It also demonstrates the range and power of automated proof checking technology. The mechanization of metamathematics itself has important implications for automated reasoning, because metatheorems can be applied as labor-saving devices to simplify proof construction.
We present an account of processing capacity in the ACT-R theory. At the symbolic level, the number of chunks in the current goal provides a measure of relational complexity. At the subsymbolic level, limits on spreading activation, measured by the attentional parameter W, provide a theory of processing capacity, which has been applied to performance, learning, and individual differences data.
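The activation calculation behind the W parameter can be made concrete. The sketch below is a minimal illustration of ACT-R's standard activation equation, A_i = B_i + sum_j W_j S_ji, with the source activation W divided equally among the chunks in the current goal; the numeric values are hypothetical, not taken from the commentary.

```python
# Minimal sketch of ACT-R spreading activation (values are hypothetical).

W_TOTAL = 1.0  # the attentional parameter W from the abstract

def activation(base_level, source_strengths, W=W_TOTAL):
    """A_i = B_i + sum_j W_j * S_ji, where W_j = W / n is shared
    equally among the n source chunks in the current goal."""
    n = len(source_strengths)
    if n == 0:
        return base_level
    w_j = W / n  # more chunks in the goal -> less activation per source
    return base_level + sum(w_j * s for s in source_strengths)

# A chunk with base level 0.3 receiving activation from two goal sources:
print(activation(0.3, [1.5, 0.8]))  # 0.3 + 0.5*1.5 + 0.5*0.8 = 1.45
```

Capacity limits fall out of the division by n: as the goal holds more chunks, each source spreads less activation, which is how the theory ties relational complexity to performance.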
The azimuthal processing of 3D full-azimuth full-offset P-P reflection seismic data can enable better imaging, thus yielding improved estimates of structure, lithology, porosity, pore fluids, in situ stress, and aligned porosity that flows fluids. In the past 34 years, the oil and gas industry has significantly advanced in the use of seismic azimuthal anisotropy, in particular, to gain information concerning unequal horizontal stresses and/or vertically aligned fractures, and possibly more importantly, to improve the prestack imaging, especially in complex structure. The important development stages during the past 40 years were enabled by industry advancements in acquisition, processing, theory, and interpretation. The most important techniques to emerge were PP amplitude variation with angle and azimuth (AVAz) and orthorhombic imaging. These techniques addressed the complications due to wave propagation in birefringent media. PP AVAz, now the industry standard for vertically aligned fracture characterization, is accompanied by a near-angle azimuthal amplitude variation when aligned connected porosity that flows fluids is present. Birefringence is present with unequal horizontal stresses and/or vertically aligned fractures that flow fluids. I have focused on the field-data documentation of the relationships among azimuthal P-P reflection data, S-wave birefringence, and hydrocarbon production. With increases and improvements in acquisition and processing, plus today's powerful versatile interpretation platforms, continual advances beyond orthorhombic into monoclinic and triclinic symmetries are to be expected. The use of 3D azimuthal seismic data for time-lapse monitoring of the in situ stress field, fracture populations, and pore fluids, as rocks undergo production processes and at plate boundaries where stresses change, offers great potential to benefit not just the oil and gas industry but all of humanity.
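For readers unfamiliar with AVAz, the reflectivity form usually fitted in this kind of work is Rüger's approximation for a horizontally transversely isotropic (HTI) medium; this is a standard textbook formulation, not an equation quoted from the abstract:

\[
R(\theta,\phi) \approx A + \left[ B^{\mathrm{iso}} + B^{\mathrm{ani}} \cos^{2}(\phi - \phi_{\mathrm{sym}}) \right] \sin^{2}\theta ,
\]

where \(\theta\) is the incidence angle, \(\phi\) the acquisition azimuth, and \(\phi_{\mathrm{sym}}\) the azimuth of the symmetry axis; the azimuthally varying gradient term \(B^{\mathrm{ani}}\) is what carries the fracture and stress information discussed above.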
We used Chinese prenominal relative clauses (RCs) to test the predictions of two competing accounts of sentence comprehension difficulty: the experience-based account of Levy and the Dependency Locality Theory (DLT). Given that in Chinese RCs a classifier and/or a passive marker BEI can be added to the sentence-initial position, we manipulated the presence/absence of classifiers and the presence/absence of BEI, such that BEI sentences were passivized subject-extracted RCs, and no-BEI sentences were standard object-extracted RCs. We conducted two self-paced reading experiments, using the same critical stimuli but somewhat different filler items. Reading time patterns from both experiments showed facilitative effects of BEI within and beyond RC regions, and delayed facilitative effects of classifiers, suggesting that cues that occur before a clear signal of an upcoming RC can help Chinese comprehenders to anticipate RC structures. The data patterns are not predicted by the DLT, but they are consistent with the predictions of experience-based theories.
This introduction to the basic ideas of structural proof theory contains a thorough discussion and comparison of various types of formalization of first-order logic. Examples are given of several areas of application, namely: the metamathematics of pure first-order logic (intuitionistic as well as classical); the theory of logic programming; category theory; modal logic; linear logic; first-order arithmetic and second-order logic. In each case the aim is to illustrate the methods in relatively simple situations and then apply them elsewhere in much more complex settings. There are numerous exercises throughout the text. In general, the only prerequisite is a standard course in first-order logic, making the book ideal for graduate students and beginning researchers in mathematical logic, theoretical computer science and artificial intelligence. For the new edition, many sections have been rewritten to improve clarity, new sections have been added on cut elimination, and solutions to selected exercises have been included.
Edge computing networks have demanding real-time and continuity requirements, and their active defence, a complex and data-driven problem, is attracting growing attention. Due to the dual constraints of the hybrid nature of edge computing networks and the uncertainty of new attack features, implementing active defence measures such as detection, evasion, trapping, and control is essential for the security protection of edge computing networks with high real-time and continuity requirements. The basic idea of safe active defence is to make the defence gain greater than the attack loss. To counter new attacks with uncertain features, introduced by the ubiquitous transmission network in the edge computing network, this paper investigates attack behaviour and presents an attack-defence mechanism based on game theory. Building on the idea of dynamic intrusion detection, we apply game theory to edge computing networks and propose a data-driven, mimicry intrusion detection technique based on a game model, called GLIDE. The participants' payoffs and the methods for computing utility under different deployment strategies are analysed in detail. Based on a proof of the Nash equilibrium condition in the model, the adversarial dynamic game relationship is characterized, and the optimal deployment strategy of the multiredundancy edge computing terminal intrusion detection service is obtained by solving for the game's equilibrium point. As a result, the probability with which the edge computing network detects attacks is improved, and the cost of intrusion detection is reduced.
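As an illustration of the equilibrium computation involved, here is a minimal sketch of a two-strategy attack-defence game solved for its mixed-strategy Nash equilibrium. The payoff matrices and strategy labels are hypothetical; GLIDE's actual model, payoffs, and multiredundancy deployment space are richer than this 2x2 toy.

```python
# A toy 2x2 attack-defence game (all payoff numbers are hypothetical).
# A[i][j]: defender payoff; B[i][j]: attacker payoff,
# for defender strategy i (0 = monitor, 1 = idle)
# and attacker strategy j (0 = attack, 1 = hold off).

def mixed_nash_2x2(A, B):
    """Fully mixed Nash equilibrium: each player randomizes so that
    the *opponent* is indifferent between their two strategies."""
    denom_p = B[0][0] - B[0][1] - B[1][0] + B[1][1]
    denom_q = A[0][0] - A[0][1] - A[1][0] + A[1][1]
    if denom_p == 0 or denom_q == 0:
        raise ValueError("degenerate game: no fully mixed equilibrium")
    p = (B[1][1] - B[1][0]) / denom_p  # P(defender monitors)
    q = (A[1][1] - A[0][1]) / denom_q  # P(attacker attacks)
    return p, q

A = [[2, -1], [-2, 0]]  # defender: detection pays, monitoring costs
B = [[-2, 1], [2, 0]]   # attacker: getting caught vs. successful intrusion
print(mixed_nash_2x2(A, B))  # (0.4, 0.2)
```

The deployment probability p plays the role of the detection-service deployment strategy: at equilibrium, the defender monitors just often enough that attacking yields the attacker no advantage.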
Callous unemotional (CU) traits (i.e., a lack of empathy/remorse and poverty of emotion) that co-occur with childhood antisocial behaviour are believed to be the developmental precursor to psychopathy in adulthood. An increasing volume of evidence supports two distinct variants of CU traits/psychopathy, known as primary and secondary. Primary variants are thought to show core deficits in emotional reactivity (e.g., attenuated autonomic activity), whereas secondary variants present with high levels of anxiety, and this may be reflected in increased emotional sensitivity to negative stimuli. Aims: The current study is the first of its kind to examine the role of anxiety in modulating emotional processing as indexed by heart rate (HR) in incarcerated boys with CU traits. Methods: HR was recorded continuously while 205 adolescents (aged 14 to 18 years) completed an emotional pictures dot-probe task. The task consisted of four blocks of 18 trials, beginning with a 500 ms central fixation cross, a 250 ms picture pair presentation (i.e., neutral, positive, negative valence), followed by a probe appearing in either the top or bottom picture location until response, and lastly a 2000 ms inter-trial interval recovery period. Four groups were formed on the basis of median-split scores on the Revised Children's Manifest Anxiety Scale and the Inventory of Callous Unemotional Traits, reflecting a primary variant (high CU/low anxiety), secondary variant (high CU/high anxiety), and two nonpsychopathic groups (low CU/high anxiety and low CU/low anxiety). Results: The HR data indicated relative HR deceleration during the picture-probe period, regardless of group. Additionally, compared to all other groups, HR deceleration was greatest for the high CU/high anxiety secondary group and smallest for the high CU/low anxiety primary group during negative stimuli. Conclusions: This result is thought to reflect differences between CU/psychopathy variants in attentional orienting to distressing stimuli, consistent with theory.
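The four-group design follows directly from the two median splits; a pandas sketch of that assignment step is below. The scores, column names, and tie-handling convention are hypothetical; the paper's exact scoring of the RCMAS and ICU instruments is not reproduced.

```python
# Sketch of the median-split group assignment (data are invented;
# "rcmas" = anxiety score, "icu" = callous-unemotional traits score).
import pandas as pd

df = pd.DataFrame({
    "rcmas": [12, 25, 8, 30, 18, 22],
    "icu":   [40, 15, 55, 60, 20, 35],
})

hi_cu  = df["icu"]   >= df["icu"].median()    # tie-handling is assumed
hi_anx = df["rcmas"] >= df["rcmas"].median()

df["group"] = "low CU / low anxiety"
df.loc[~hi_cu &  hi_anx, "group"] = "low CU / high anxiety"
df.loc[ hi_cu & ~hi_anx, "group"] = "primary (high CU / low anxiety)"
df.loc[ hi_cu &  hi_anx, "group"] = "secondary (high CU / high anxiety)"
print(df)
```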
Based on cloud computing and statistical theory, this paper proposes an analysis method for big data from the film and television industry. The method builds on the open-source Hadoop cloud platform, combining the MapReduce distributed programming model, the HDFS distributed file storage system, and other key cloud computing technologies. To cope with the industry's varied data processing needs, association analysis, cluster analysis, factor analysis, and a combined K-means and association analysis training model were applied to model, process, and analyze the full data on films and TV series. The film and television data of recent years are analyzed with respect to film type, producer, production region, investment, box office, audience rating, network score, audience group, and other factors. Building on the study of how each attribute of a film or TV drama affects box office and audience rating, the work is directed at prediction for the film and television industry and continually validates and improves the algorithm model.
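To make the clustering step concrete, here is a minimal K-means sketch on film-level features. The feature values, the choice of two clusters, and the scikit-learn pipeline are all illustrative assumptions; the paper's full pipeline also includes association and factor analysis running on Hadoop.

```python
# Toy K-means clustering of films (all numbers are invented).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# columns: investment ($M), box office ($M), network score (0-10)
films = np.array([
    [120, 850, 7.9],
    [ 15,  40, 6.1],
    [200, 300, 5.2],
    [  8,  95, 8.4],
    [150, 900, 8.1],
    [ 20,  12, 4.8],
])

X = StandardScaler().fit_transform(films)   # put features on one scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster id per film, e.g. blockbusters vs. the rest
```

In a Hadoop deployment, the same assignment and update steps would be expressed as MapReduce jobs over HDFS partitions rather than a single in-memory array.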
Structural proof theory is a branch of logic that studies the general structure and properties of logical and mathematical proofs. This book is both a concise introduction to the central results and methods of structural proof theory, and a work of research that will be of interest to specialists. The book is designed to be used by students of philosophy, mathematics and computer science. The book contains a wealth of results on proof-theoretical systems, including extensions of such systems from logic to mathematics, and on the connection between the two main forms of structural proof theory - natural deduction and sequent calculus. The authors emphasize the computational content of logical results. A special feature of the volume is a computerized system for developing proofs interactively, downloadable from the web and regularly updated.
Grodzinsky's theory of agrammatic sentence processing fails to account for crucial empirical facts. In contrast to his predictions, the data show that there are (1) degrees of severity and (2) problems with sentences that do not require movement, and that (3) under the right task circumstances, full-fledged syntactic trees are constructed.
Video gaming, specifically action video gaming, seems to improve a range of cognitive functions. The basis for these improvements may be attentional control in conjunction with reward-related learning to amplify the execution of goal-relevant actions while suppressing goal-irrelevant actions. Given that EEG alpha power reflects inhibitory processing, a core component of attentional control, it might represent the electrophysiological substrate of cognitive improvement in video gaming. The aim of this study was to test whether non-video gamers (NVGs), non-action video gamers (NAVGs) and action video gamers (AVGs) exhibit differences in EEG alpha power, and whether this might account for differences in visual information processing as operationalized by the theory of visual attention (TVA). Forty male volunteers performed a visual short-term memory paradigm where they memorized shape stimuli depicted on circular stimulus displays at six different exposure durations while their EEGs were recorded. Accuracy data were analyzed using TVA algorithms. There was a positive correlation between the extent of post-stimulus EEG alpha power attenuation and speed of information processing across all participants. Moreover, both EEG alpha power attenuation and speed of information processing were modulated by an interaction between group affiliation and time on task, indicating that video gamers showed larger EEG alpha power attenuations and faster information processing over time than NVGs, with AVGs displaying the largest increase. An additional regression analysis affirmed this observation. From this we concluded that EEG alpha power might be a promising neural substrate for explaining cognitive improvement in video gaming.
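The "speed of information processing" parameter comes from fitting TVA's exponential accumulation function to accuracy as a function of exposure duration. The sketch below fits the single-item form of that curve; the accuracy numbers are invented, and a real TVA fit estimates further parameters (e.g. storage capacity K and attentional weights) by maximum likelihood.

```python
# Fit TVA-style exponential accumulation: P(report) = 1 - exp(-v*(t - t0)).
# v = processing speed, t0 = perceptual threshold. Data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def tva_accuracy(t, v, t0):
    # clip keeps accuracy at 0 for exposures below the threshold t0
    return 1.0 - np.exp(-v * np.clip(t - t0, 0.0, None))

exposures = np.array([0.01, 0.02, 0.04, 0.08, 0.15, 0.20])  # seconds
accuracy  = np.array([0.05, 0.25, 0.55, 0.80, 0.95, 0.97])

(v, t0), _ = curve_fit(tva_accuracy, exposures, accuracy, p0=[20.0, 0.005])
print(f"processing speed v = {v:.1f} items/s, threshold t0 = {t0*1000:.1f} ms")
```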
Traditional approaches to human information processing tend to deal with perception and action planning in isolation, so that an adequate account of the perception-action interface is still missing. On the perceptual side, the dominant cognitive view largely underestimates, and thus fails to account for, the impact of action-related processes on both the processing of perceptual information and on perceptual learning. On the action side, most approaches conceive of action planning as a mere continuation of stimulus processing, thus failing to account for the goal-directedness of even the simplest reaction in an experimental task. We propose a new framework for a more adequate theoretical treatment of perception and action planning, in which perceptual contents and action plans are coded in a common representational medium by feature codes with distal reference. Perceived events (perceptions) and to-be-produced events (actions) are equally represented by integrated, task-tuned networks of feature codes – cognitive structures we call event codes. We give an overview of evidence from a wide variety of empirical domains, such as spatial stimulus-response compatibility, sensorimotor synchronization, and ideomotor action, showing that our main assumptions are well supported by the data. Key Words: action planning; binding; common coding; event coding; feature integration; perception; perception-action interface.
Proof-theoretical notions and techniques, developed on the basis of sentential/symbolic representations of formal proofs, are applied to Euler diagrams. A translation of an Euler diagrammatic system into a natural deduction system is given, and the soundness and faithfulness of the translation are proved. Some consequences of the translation are discussed in view of the notion of free ride, which is mainly discussed in the literature of cognitive science as an account of the inferential efficacy of diagrams. The translation enables us to formalize and analyze free ride in terms of proof theory. The notion of normal form of Euler diagrammatic proofs is investigated, and a normalization theorem is proved. Some consequences of the theorem are further discussed: in particular, an analysis of the structure of normal diagrammatic proofs; a diagrammatic counterpart of the usual subformula property; and a characterization of diagrammatic proofs compared with natural deduction proofs.
Proof Theory of Modal Logic is devoted to a thorough study of proof systems for modal logics, that is, logics of necessity, possibility, knowledge, belief, time, computations etc. It contains many new technical results and presentations of novel proof procedures. The volume is of immense importance for the interdisciplinary fields of logic, knowledge representation, and automated deduction.
There is a growing interest in aggregating more biomedical and patient data into large health data sets for research and public benefits. However, collecting and processing patient data raises new ethical issues regarding patients' rights, social justice and trust in public institutions. The aim of this empirical study is to gain an in-depth understanding of the awareness of possible ethical risks and corresponding obligations among those who are involved in projects using patient data, i.e. healthcare professionals, regulators and policy makers. We used a qualitative design to examine Swiss healthcare stakeholders' experiences and perceptions of ethical challenges with regard to patient data in real-life settings where clinical registries are sponsored, created and/or used. Semi-structured interviews were carried out with 22 participants between July 2014 and January 2015. The interviews were audio-recorded, transcribed, coded and analysed using a thematic method derived from Grounded Theory. All interviewees were concerned as a matter of priority with the need for legal and operating norms for the collection and use of data, whereas less interest was shown in issues regarding patient agency, the need for reciprocity, and shared governance in the management and use of clinical registries' patient data. This observed asymmetry highlights a possible tension between public and research interests on the one hand, and the recognition of patients' rights and citizens' involvement on the other. The advocation of further health-related data sharing on the grounds of research and public interest, without due regard for the perspective of patients and donors, could run the risk of fostering distrust towards healthcare data collections. Ultimately, this could diminish the expected social benefits. However, rather than setting patient rights against public interest, new ethical approaches could strengthen both concurrently. On a normative level, this study thus provides material from which to develop further ethical reflection towards a more cooperative approach involving patients and citizens in the governance of their health-related big data.
Purpose: Principled discussions on the economic value of data are frequently pursued through metaphors. This study aims to explore three influential metaphors for talking about the economic value of data: data are the new oil, data as infrastructure and data as an asset. Design/methodology/approach: With the help of conceptual metaphor theory, various meanings surrounding the three metaphors are explored. Meanings clarified or hidden through various metaphors are identified. Specific emphasis is placed on the economic value of ownership of data. Findings: In discussions on data as an economic resource, the three different metaphors are used for separate purposes. The most used metaphor, data are the new oil, communicates that ownership of data could lead to great wealth. However, with data as infrastructure, data have no intrinsic value. Therefore, profits generated from data resources belong to those processing the data, not those owning it. The data as an asset metaphor can be used to convince organizational leadership that they own data of great value. Originality/value: This is the first scholarly investigation of metaphors communicating the economic value of data. More studies in this area appear urgent, given the power of such metaphors, as well as the increasing importance of data in economics.
The consideration of careful reasoning can be traced to Aristotle and earlier authors. The possibility of rigorous rules for drawing conclusions can certainly be traced to the Middle Ages, when types of syllogism were studied. Shortly after the introduction of computers, the audacious scientist naturally envisioned the automation of sound reasoning: reasoning in which conclusions that are drawn follow logically and inevitably from the given hypotheses. Did the idea spring from the intent to emulate Sherlock Holmes and Mr. Spock (of Star Trek) in fiction and Hilbert and Tarski and other great minds in nonfiction? Each of them applied logical reasoning to answer questions, solve problems, and find proofs. But can such logical reasoning be fully automated? Can a single computer program be designed to offer sufficient power in the cited contexts? Indeed, while the use of computers was quickly accepted for numerical calculations and data processing, intense skepticism persisted, even in the early 1960s, regarding the ability of computers to apply effective reasoning. The following simple (but perhaps deceptive) example provides a taste of the type of argument that might have been used to support this skepticism.
The paper contains a proof-theoretic investigation of extensions of Kripke-Platek set theory, KP, which accommodate first-order reflection. Ordinal analyses for such theories are obtained by devising cut elimination procedures for infinitary calculi of ramified set theory with Πn reflection rules. This leads to consistency proofs for the theories KP + Πn reflection using a small amount of arithmetic and the well-foundedness of a certain ordinal system with respect to primitive recursive descending sequences. Regarding future work, we intend to avail ourselves of these new cut elimination techniques to attain an ordinal analysis of Π^1_2 comprehension by approaching Π^1_2 comprehension through transfinite levels of reflection.
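For orientation, the first-order reflection scheme at issue can be stated as follows; this is the standard formulation of set-theoretic Πn-reflection, not text quoted from the paper:

\[
\forall \vec{x}\,\bigl[\,\theta(\vec{x}) \;\rightarrow\; \exists z\,\bigl(\mathrm{Trans}(z) \,\wedge\, \vec{x}\in z \,\wedge\, \theta^{z}(\vec{x})\bigr)\,\bigr]
\qquad \text{for every } \Pi_{n}\text{ formula } \theta,
\]

where \(\theta^{z}\) denotes \(\theta\) with all unbounded quantifiers relativized to the transitive set \(z\).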
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on the axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux, as used for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
Progress in philosophy is difficult to achieve because our methods are evidentially and rhetorically weak. In the last two decades, experimental philosophers have begun to employ the methods of the social sciences to address philosophical questions. However, the adequacy of these methods has been called into question by repeated failures of replication. Experimental philosophers need to incorporate more robust methods to achieve a multi-modal perspective. In this chapter, we describe and showcase cutting-edge methods for data-mining and visualization. Big data is a useful investigatory tool for moral psychology, and it fits well with the Ramsification method the first author advances in a series of recent papers. The guiding insight of these papers is that we can infer the meaning and structure of concepts from patterns of assertions and inferential associations in natural language.
Currently, production and comprehension are regarded as quite distinct in accounts of language processing. In rejecting this dichotomy, we instead assert that producing and understanding are interwoven, and that this interweaving is what enables people to predict themselves and each other. We start by noting that production and comprehension are forms of action and action perception. We then consider the evidence for interweaving in action, action perception, and joint action, and explain such evidence in terms of prediction. Specifically, we assume that actors construct forward models of their actions before they execute those actions, and that perceivers of others' actions covertly imitate those actions, then construct forward models of those actions. We use these accounts of action, action perception, and joint action to develop accounts of production, comprehension, and interactive language. Importantly, they incorporate well-defined levels of linguistic representation. We show how speakers and comprehenders use covert imitation and forward modeling to make predictions at these levels of representation, how they interweave production and comprehension processes, and how they use these predictions to monitor the upcoming utterances. We show how these accounts explain a range of behavioral and neuroscientific data on language processing and discuss some of the implications of our proposal.
Admissible rules of a logic are those rules under which the set of theorems of the logic is closed. In this paper, a Gentzen-style framework is introduced for analytic proof systems that derive admissible rules of non-classical logics. While Gentzen systems for derivability treat sequents as basic objects, for admissibility, the basic objects are sequent rules. Proof systems are defined here for admissible rules of classes of modal logics, including K4, S4, and GL, and also Intuitionistic Logic IPC. With minor restrictions, proof search in these systems terminates, giving decision procedures for admissibility in the logics.
The axiomatic presentation of modal systems and the standard formulations of natural deduction and sequent calculus for modal logic are reviewed, together with the difficulties that emerge with these approaches. Generalizations of standard proof systems are then presented. These include, among others, display calculi, hypersequents, and labelled systems, with the latter surveyed from a closer perspective.
Paraconsistent Weak Kleene Logic (PWK) is the 3-valued propositional logic defined on the weak Kleene tables and with two designated values. Most of the existing proof systems for PWK are characterised by the presence of linguistic restrictions on some of their rules. This feature can be seen as a shortcoming. We provide a cut-free calculus for PWK that is devoid of such provisos. Moreover, we introduce a Priest-style tableaux calculus for PWK.
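The semantics referred to here is easy to make executable. The sketch below encodes the weak Kleene tables, where the third value 'e' is infectious, with both 't' and 'e' designated, and brute-forces validity by enumerating valuations. It illustrates the logic's behaviour, not the paper's proof systems.

```python
# PWK semantics: weak Kleene tables with designated values {'t', 'e'}.
from itertools import product

T, E, F = "t", "e", "f"
DESIGNATED = {T, E}

def neg(a):
    return E if a == E else (F if a == T else T)

def conj(a, b):
    if E in (a, b):  # weak tables: 'e' is infectious
        return E
    return T if (a, b) == (T, T) else F

def disj(a, b):
    if E in (a, b):
        return E
    return F if (a, b) == (F, F) else T

def valid(premises, conclusion, nvars):
    """premises/conclusion: functions from a valuation tuple to a value."""
    for v in product([T, E, F], repeat=nvars):
        if all(p(v) in DESIGNATED for p in premises) \
                and conclusion(v) not in DESIGNATED:
            return False
    return True

# Paraconsistency: p, not-p does not entail q (take v(p) = 'e'):
print(valid([lambda v: v[0], lambda v: neg(v[0])], lambda v: v[1], 2))  # False
# Excluded middle still holds, since 'e' is designated:
print(valid([], lambda v: disj(v[0], neg(v[0])), 1))                    # True
# Conjunction elimination fails (v = ('f', 'e') is a countermodel):
print(valid([lambda v: conj(v[0], v[1])], lambda v: v[0], 2))           # False
```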
In the first part we show why ordinals and ordinal notations are naturally connected with proof-theoretic research. We introduce the program of ordinal analysis. The second part gives examples of applications of ordinal analysis.
This volume contains articles covering a broad spectrum of proof theory, with an emphasis on its mathematical aspects. The articles should not only be interesting to specialists in proof theory, but should also be accessible to a diverse audience, including logicians, mathematicians, computer scientists and philosophers. Many of the central topics of proof theory are treated in self-contained expository articles, covered in great detail and depth. The chapters are arranged so that the two introductory articles come first; these are then followed by articles from core classical areas of proof theory; the handbook concludes with articles that deal with topics closely related to computer science.
Proof-theoretic methods are developed for subsystems of Johansson's logic obtained by extending the positive fragment of intuitionistic logic with weak negations. These methods are exploited to establish properties of the logical systems. In particular, cut-free complete sequent calculi are introduced and used to provide a proof of the fact that the systems satisfy the Craig interpolation property. Alternative versions of the calculi are later obtained by means of an appropriate loop-checking history mechanism. Termination of the new calculi is proved, and used to conclude that the considered logical systems are PSPACE-complete.
This paper takes a critical look at theory-free, statistical methodologies for processing and interpreting data taken from respondents answering a set of dichotomous (yes-no) questions. The basic issue concerns to what extent theoretical conclusions based on such analyses are invariant under a class of "informationally equivalent" question transformations. First the notion of Boolean equivalence of two question sets is discussed. Then Lazarsfeld's latent structure analysis is considered in detail. It is discovered that the best fitting latent model depends on which one of the many informationally equivalent question sets is used. This fact raises a number of methodological problems and pitfalls with latent structure analysis. Related problems with other methodologies are briefly discussed.
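The notion of informational equivalence can be illustrated with a tiny executable example: recode a two-question answer pattern by a componentwise Boolean transformation and check that the recoding is a bijection, so the two question sets carry exactly the same information. The particular transform is my own illustration, not one drawn from the paper.

```python
# An "informationally equivalent" recoding of dichotomous answer patterns:
# a componentwise Boolean transformation that is invertible.
from itertools import product

def transform(answers):
    q1, q2 = answers
    return (q1 ^ q2, q2)  # new question 1 = "did q1 and q2 disagree?"

patterns = list(product([0, 1], repeat=2))
images = {transform(p) for p in patterns}
# Bijective on {0,1}^2, hence no information is gained or lost:
print(len(images) == len(patterns))  # True
```

The paper's worry is then concrete: a latent structure model fitted to the recoded items need not match the model fitted to the originals, even though nothing informational has changed.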
This paper deals with a proof theory for a theory T22 of recursively Mahlo ordinals in the form of Π2-reflection on Π2-reflecting ordinals, using a subsystem Od of the system O of ordinal diagrams in Arai [353]. This paper is the first published one in which a proof-theoretic analysis à la Gentzen-Takeuti of recursively large ordinals is expounded.
The foundations of mathematics are divided into proof theory and set theory. Proof theory tries to justify the world of the infinite mind from the standpoint of the finite mind. Set theory tries to know more and more of the world of the infinite mind. The development of the two subjects is discussed, including a new proof of the accessibility of ordinal diagrams. Finally, the world of large cardinals appears when we go slightly beyond Girard's categorical approach to proof theory.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen's negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall's connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
Opposition theory suggests that binary oppositions underlie basic cognitive and linguistic processes. However, opposition theory has never been implemented in a computational cognitive-semiotics model. In this paper, we present a simple model of metaphor identification that relies on opposition theory. An algorithm instantiating the model has been tested on a data set of 100 phrases comprising adjective-noun pairs, in which approximately half represent metaphorical language use and the rest literal language use. The algorithm achieved 89% accuracy in metaphor identification and illustrates the relevance of opposition theory for modelling metaphor processing.
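A toy sketch in the spirit of the model is below: flag an adjective-noun pair as metaphorical when the adjective's opposition axis clashes with the noun's literal domains. The lexicon entries and the decision rule are my own invention; the paper's actual algorithm and its 100-phrase data set are not reproduced here.

```python
# Toy opposition-based metaphor identifier (lexicon is hand-crafted).
OPPOSITION_AXIS = {        # adjective -> the binary opposition it lives on
    "sweet": "taste", "bitter": "taste",
    "bright": "light", "dark": "light",
}
LITERAL_DOMAINS = {        # noun -> domains it literally supports
    "victory": {"event"}, "lemon": {"taste", "object"},
    "idea": {"abstract"}, "room": {"light", "object"},
}

def is_metaphor(adj, noun):
    # metaphorical if the adjective's opposition axis is not a
    # literal domain of the noun
    return OPPOSITION_AXIS[adj] not in LITERAL_DOMAINS[noun]

print(is_metaphor("sweet", "victory"))  # True  (metaphorical)
print(is_metaphor("sweet", "lemon"))    # False (literal)
```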