Default logic is one of the most popular and successful formalisms for non-monotonic reasoning. In 2002, Bonatti and Olivetti introduced several sequent calculi for credulous and skeptical reasoning in propositional default logic. In this paper we examine these calculi from a proof-complexity perspective. In particular, we show that the calculus for credulous reasoning obeys almost the same bounds on proof size as Gentzen's system LK. Hence proving lower bounds for credulous reasoning will be as hard as proving lower bounds for LK. On the other hand, we show an exponential lower bound on proof size in Bonatti and Olivetti's enhanced calculus for skeptical default reasoning.
Many proposals for logic-based formalisations of argumentation consider an argument as a pair (Φ,α), where the support Φ is understood as a minimal consistent subset of a given knowledge base which has to entail the claim α. In case the arguments are given in the full language of classical propositional logic, reasoning in such frameworks becomes a computationally costly task. For instance, the problem of deciding whether there exists a support for a given claim has been shown to be Σ₂ᵖ-complete. In order to better understand the sources of complexity (and to identify tractable fragments), we focus on arguments given over formulae in which the allowed connectives are taken from certain sets of Boolean functions. We provide a complexity classification for four different decision problems (existence of a support, checking the validity of an argument, relevance and dispensability) with respect to all possible sets of Boolean functions. Moreover, we make use of a general schema for enumerating all arguments to show that certain restricted fragments permit enumeration with polynomial delay. Finally, we also give a classification in terms of counting complexity.
Conflicts have arisen between communities and operators of confined animal feeding operations as farms have grown larger in order to maintain their competitiveness. These conflicts have been difficult to resolve because measuring and allocating the benefits and costs of livestock production is difficult. This paper demonstrates a policy tool for promoting compromise whereby the community sees reduced negative impacts from livestock while continuing to benefit from livestock jobs, taxes, and related economic activity. Public economic benefits and public economic costs of confined animal feeding operations are estimated for every farm and affected house in Craven County, North Carolina. The results show public economic benefits of $5.7 million and public economic costs of $2.2 million, but the ratio of benefits to costs for individual farm-house pairs varies in important ways across the 26 hog farms in Craven County.
What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? The processes that occur along the way are so complex that any attempt to understand development necessitates a multi-disciplinary approach, integrating data from cognitive studies, computational work, and neuroimaging - an approach till now seldom taken in the study of child development.

Neuroconstructivism is a major new two-volume publication that seeks to redress this imbalance, presenting an integrative new framework for considering development. In the first volume, the authors review up-to-date findings from neurobiology, brain imaging, child development, and computer and robotic modelling to consider why children's thinking develops the way it does. They propose a new synthesis of development that is based on five key principles found to operate at many levels of description. They use these principles to explain what causes a number of key developmental phenomena, including infants' interactions with objects, early social cognitive interactions, and the causes of dyslexia. The "neuroconstructivist" framework also shows how developmental disorders do not arise from selective damage to the normal cognitive system, but instead arise from developmental processes that operate under atypical constraints. How these principles work is illustrated in several case studies ranging from perceptual to social and reading development. Finally, the authors use neuroimaging, behavioural analyses, computational simulations and robotic models to provide a way of understanding the mechanisms and processes that cause development to occur.
The Reception of Derrida explores the cross-cultural reception of Derrida's work, specifically how that work, in all its diversity, has come to be identified with the word deconstruction. In response to this cultural and academic phenomenon, the book examines how Derrida's own understanding of translation and inheritance illuminates the 'translation and transformation' of his works. Positioned against the misreadings of deconstruction, the book traces the relationship between Derrida's concern with the ethico-political dimension of deconstruction and an authorial legacy. This timely new study is the first book to consider the cultural reception of Derrida's works, and its accessible language and structure help to make this a benchmark amongst introductory Derrida studies.
It is often assumed that similar domain-specific behavioural impairments found in cases of adult brain damage and developmental disorders correspond to similar underlying causes, and can serve as convergent evidence for the modular structure of the normal adult cognitive system. We argue that this correspondence is contingent on an unsupported assumption that atypical development can produce selective deficits while the rest of the system develops normally (Residual Normality), and that this assumption tends to bias data collection in the field. Based on a review of connectionist models of acquired and developmental disorders in the domains of reading and past tense, as well as on new simulations, we explore the computational viability of Residual Normality and the potential role of development in producing behavioural deficits. Simulations demonstrate that damage to a developmental model can produce very different effects depending on whether it occurs prior to or following the training process. Because developmental disorders typically involve damage prior to learning, we conclude that the developmental process is a key component of the explanation of end-state impairments in such disorders. Further simulations demonstrate that in simple connectionist learning systems, the assumption of Residual Normality is undermined by processes of compensation or alteration elsewhere in the system. We outline the precise computational conditions required for Residual Normality to hold in development, and suggest that in many cases it is an unlikely hypothesis. We conclude that in developmental disorders, inferences from behavioural deficits to underlying structure crucially depend on developmental conditions, and that the process of ontogenetic development cannot be ignored in constructing models of developmental disorders. Key words: acquired and developmental disorders; connectionist models; modularity; past tense; reading.
In response to our target article, many of the commentators concentrated on our notion of Residual Normality. In our response, we focus on the questions raised by this idea. However, we also examine broader issues concerning the importance of incorporating a realistic theory of the process of development into explanations of developmental deficits.
What makes us conscious? Many theories that attempt to answer this question have appeared recently in the context of widespread interest about consciousness in the cognitive neurosciences. Most of these proposals are formulated in terms of the information processing conducted by the brain. In this overview, we survey and contrast these models. We first delineate several notions of consciousness, addressing what it is that the various models are attempting to explain. Next, we describe a conceptual landscape that addresses how the theories attempt to explain consciousness. We then situate each of several representative models in this landscape and indicate which aspect of consciousness they try to explain. We conclude that the search for the neural correlates of consciousness should be usefully complemented by a search for the computational correlates of consciousness.
We address two points in this commentary. First, we question the extent to which O'Brien & Opie have established that the classical approach is unable to support a viable vehicle theory of consciousness. Second, assuming that connectionism does have the resources to support a vehicle theory, we explore how the activity of the units of a PDP network might sum together to form phenomenal experience (PE).