What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? Neuroconstructivism is a pioneering two-volume work that sets out a whole new framework for considering the complex topic of development, integrating data from cognitive studies, computational work, and neuroimaging.
What makes us conscious? Many theories that attempt to answer this question have appeared recently in the context of widespread interest in consciousness in the cognitive neurosciences. Most of these proposals are formulated in terms of the information processing conducted by the brain. In this overview, we survey and contrast these models. We first delineate several notions of consciousness, addressing what it is that the various models are attempting to explain. Next, we describe a conceptual landscape that addresses how the theories attempt to explain consciousness. We then situate each of several representative models in this landscape and indicate which aspect of consciousness they try to explain. We conclude that the search for the neural correlates of consciousness could usefully be complemented by a search for the computational correlates of consciousness.
It is often assumed that similar domain-specific behavioural impairments found in cases of adult brain damage and developmental disorders correspond to similar underlying causes, and can serve as convergent evidence for the modular structure of the normal adult cognitive system. We argue that this correspondence is contingent on an unsupported assumption that atypical development can produce selective deficits while the rest of the system develops normally (Residual Normality), and that this assumption tends to bias data collection in the field. Based on a review of connectionist models of acquired and developmental disorders in the domains of reading and past tense, as well as on new simulations, we explore the computational viability of Residual Normality and the potential role of development in producing behavioural deficits. Simulations demonstrate that damage to a developmental model can produce very different effects depending on whether it occurs prior to or following the training process. Because developmental disorders typically involve damage prior to learning, we conclude that the developmental process is a key component of the explanation of endstate impairments in such disorders. Further simulations demonstrate that in simple connectionist learning systems, the assumption of Residual Normality is undermined by processes of compensation or alteration elsewhere in the system. We outline the precise computational conditions required for Residual Normality to hold in development, and suggest that in many cases it is an unlikely hypothesis. We conclude that in developmental disorders, inferences from behavioural deficits to underlying structure crucially depend on developmental conditions, and that the process of ontogenetic development cannot be ignored in constructing models of developmental disorders. Key Words: Acquired and developmental disorders; connectionist models; modularity; past tense; reading.
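The contrast between damage before and after training can be made concrete with a minimal sketch. The toy network below is illustrative only (a tiny linear two-layer network on an invented four-pattern task, not the reading or past-tense models reviewed in the abstract): lesioning half the hidden units before training lets the remaining units compensate during learning, while the same lesion applied to an intact, fully trained network degrades a solution that was distributed across all units.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: map 4 orthogonal input patterns onto 4 target patterns.
X = np.eye(4)
Y = np.eye(4)[:, ::-1]
HIDDEN = 8
intact = np.ones(HIDDEN)
lesion = np.array([1.0] * 4 + [0.0] * 4)  # mask that silences half the hidden units

def init():
    return rng.normal(0, 0.5, (4, HIDDEN)), rng.normal(0, 0.5, (HIDDEN, 4))

def error(W1, W2, mask):
    return float(np.mean((((X @ W1) * mask) @ W2 - Y) ** 2))

def train(W1, W2, mask, epochs=4000, lr=0.05):
    for _ in range(epochs):
        H = (X @ W1) * mask              # lesioned units stay silent throughout
        E = H @ W2 - Y
        dW1 = X.T @ ((E @ W2.T) * mask)  # no gradient flows into silenced units
        dW2 = H.T @ E
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2

# "Developmental" damage: the lesion is present before and during learning,
# so the remaining units pick up the whole mapping.
W1, W2 = train(*init(), lesion)
err_dev = error(W1, W2, lesion)

# "Acquired" damage: the same lesion applied after training an intact network.
W1, W2 = train(*init(), intact)
err_acq = error(W1, W2, lesion)

# Typically err_dev is near zero while err_acq is substantially larger.
```

The same lesion thus produces very different endstate behaviour depending on whether it precedes or follows learning, which is the point of the pre- versus post-training simulations described above.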
Neuroconstructivism: How the Brain Constructs Cognition proposes a unifying framework for the study of cognitive development that brings together (1) constructivism (which views development as the progressive elaboration of increasingly complex structures), (2) cognitive neuroscience (which aims to understand the neural mechanisms underlying behavior), and (3) computational modeling (which proposes formal and explicit specifications of information processing). The guiding principle of our approach is context dependence, within and (in contrast to Marr [1982]) between levels of organization. We propose that three mechanisms guide the emergence of representations: competition, cooperation, and chronotopy, which themselves allow for two central processes: proactivity and progressive specialization. We suggest that the main outcome of development is partial representations, distributed across distinct functional circuits. This framework is derived by examining development at the level of single neurons, brain systems, and whole organisms. We use the terms encellment, embrainment, and embodiment to describe the higher-level contextual influences that act at each of these levels of organization. To illustrate these mechanisms in operation we provide case studies in early visual perception, infant habituation, phonological development, and object representations in infancy. Three further case studies are concerned with interactions between levels of explanation: social development, atypical development and, within that, developmental dyslexia. We conclude that cognitive development arises from a dynamic, contextual change in embodied neural structures leading to partial representations across multiple brain regions and timescales, in response to a proactively specified physical and social environment.
Conflicts have arisen between communities and operators of confined animal feeding operations as farms have become bigger in order to maintain their competitiveness. These conflicts have been difficult to resolve because measuring and allocating the benefits and costs of livestock production is difficult. This paper demonstrates a policy tool for promoting compromise whereby the community gets reduced negative impacts from livestock while continuing to benefit from livestock jobs, taxes, and related economic activity. Public economic benefits and public economic costs of confined animal feeding operations are estimated for every farm and affected house in Craven County, North Carolina. The results show public economic benefits of $5.7 million and public economic costs of $2.2 million, but that the ratio of benefits to costs for individual farm-house pairs varies in important ways across the 26 hog farms in Craven County.
Many proposals for logic-based formalisations of argumentation consider an argument as a pair (Φ,α), where the support Φ is understood as a minimal consistent subset of a given knowledge base which has to entail the claim α. When the arguments are given in the full language of classical propositional logic, reasoning in such frameworks becomes a computationally costly task. For instance, the problem of deciding whether there exists a support for a given claim has been shown to be Σ₂ᵖ-complete. In order to better understand the sources of complexity (and to identify tractable fragments), we focus on arguments given over formulæ in which the allowed connectives are taken from certain sets of Boolean functions. We provide a complexity classification for four different decision problems (existence of a support, checking the validity of an argument, relevance and dispensability) with respect to all possible sets of Boolean functions. Moreover, we make use of a general schema for enumerating all arguments to show that certain restricted fragments permit enumeration with polynomial delay. Finally, we give a classification also in terms of counting complexity.
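To make the decision problems concrete, here is a small brute-force sketch (exponential in the size of the knowledge base, which is exactly the cost that motivates the classification above; the formulas, variable names, and knowledge base are invented for illustration and are not from the paper): a support for a claim α is a minimal consistent subset of the knowledge base that entails α.

```python
from itertools import combinations, product

VARS = ("p", "q", "r")

def assignments():
    # all truth assignments over VARS
    for bits in product((False, True), repeat=len(VARS)):
        yield dict(zip(VARS, bits))

def consistent(phi):
    # some assignment satisfies every formula in phi
    return any(all(f(a) for f in phi) for a in assignments())

def entails(phi, alpha):
    # every model of phi is a model of alpha
    return all(alpha(a) for a in assignments() if all(f(a) for f in phi))

def supports(kb, alpha):
    # minimal consistent subsets of kb entailing alpha, found by increasing size
    found = []
    for k in range(len(kb) + 1):
        for phi in combinations(kb, k):
            if any(set(s) <= set(phi) for s in found):
                continue  # phi contains a smaller support, so it is not minimal
            if consistent(phi) and entails(phi, alpha):
                found.append(phi)
    return found

# Illustrative knowledge base: p, p -> q, not p, r
kb = [
    lambda a: a["p"],
    lambda a: (not a["p"]) or a["q"],
    lambda a: not a["p"],
    lambda a: a["r"],
]
claim = lambda a: a["q"]
print(len(supports(kb, claim)))  # 1: the single support {p, p -> q}
```

Note that the inner consistency and entailment checks already iterate over all assignments, and the outer loop over subsets adds another exponential factor; the paper's fragments restrict the allowed connectives precisely to tame this cost.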
We address two points in this commentary. First, we question the extent to which O'Brien & Opie have established that the classical approach is unable to support a viable vehicle theory of consciousness. Second, assuming that connectionism does have the resources to support a vehicle theory, we explore how the activity of the units of a PDP network might sum together to form phenomenal experience (PE).
In response to our target article, many of the commentators concentrated on our notion of Residual Normality. In our response, we focus on the questions raised by this idea. However, we also examine broader issues concerning the importance of incorporating a realistic theory of the process of development into explanations of developmental deficits.
Default logic is one of the most popular and successful formalisms for non-monotonic reasoning. In 2002, Bonatti and Olivetti introduced several sequent calculi for credulous and skeptical reasoning in propositional default logic. In this paper we examine these calculi from a proof-complexity perspective. In particular, we show that the calculus for credulous reasoning obeys almost the same bounds on the proof size as Gentzen’s system LK. Hence proving lower bounds for credulous reasoning will be as hard as proving lower bounds for LK. On the other hand, we show an exponential lower bound to the proof size in Bonatti and Olivetti’s enhanced calculus for skeptical default reasoning.
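The two reasoning modes can be illustrated with a toy, literal-only implementation of default logic (a brute-force guess-and-check over candidate extensions, nothing like the sequent calculi analysed in the paper; the Nixon-diamond example is standard textbook material): credulous reasoning asks whether a literal lies in some extension, skeptical reasoning whether it lies in every extension.

```python
from itertools import combinations

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def extensions(W, defaults):
    # defaults: (prerequisites, justifications, conclusion), all over literals
    exts = set()
    for k in range(len(defaults) + 1):
        for chosen in combinations(defaults, k):
            E = frozenset(W) | {c for _, _, c in chosen}  # candidate extension
            # rebuild the candidate by firing applicable defaults from W:
            # prerequisites must already be derived, justifications must be
            # consistent with the guessed extension E
            facts, changed = set(W), True
            while changed:
                changed = False
                for pre, just, concl in defaults:
                    ok = pre <= facts and all(neg(j) not in E for j in just)
                    if ok and concl not in facts:
                        facts.add(concl)
                        changed = True
            if facts == set(E):  # fixed point: E is exactly what W generates
                exts.add(frozenset(E))
    return exts

# Nixon diamond: Quakers are by default pacifists, Republicans by default not.
W = {"quaker", "republican"}
defaults = [
    ({"quaker"}, {"pacifist"}, "pacifist"),
    ({"republican"}, {"~pacifist"}, "~pacifist"),
]
exts = extensions(W, defaults)
credulous = any("pacifist" in E for E in exts)  # holds in some extension
skeptical = all("pacifist" in E for E in exts)  # holds in every extension
print(len(exts), credulous, skeptical)  # 2 True False
```

The example has two extensions, one containing "pacifist" and one containing "~pacifist", so "pacifist" follows credulously but not skeptically; the asymmetry between the two modes is what the differing proof-size bounds in the paper reflect.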
What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? The processes that occur along the way are so complex that any attempt to understand development necessitates a multi-disciplinary approach, integrating data from cognitive studies, computational work, and neuroimaging - an approach till now seldom taken in the study of child development. Neuroconstructivism is a major new two-volume publication that seeks to redress this balance, presenting an integrative new framework for considering development. In the first volume, the authors review up-to-date findings from neurobiology, brain imaging, child development, computer and robotic modelling to consider why children's thinking develops the way it does. They propose a new synthesis of development that is based on five key principles found to operate at many levels of description. They use these principles to explain what causes a number of key developmental phenomena, including infants' interacting with objects, early social cognitive interactions, and the causes of dyslexia. The "neuroconstructivist" framework also shows how developmental disorders do not arise from selective damage to the normal cognitive system, but instead arise from developmental processes that operate under atypical constraints. How these principles work is illustrated in several case studies ranging from perceptual to social and reading development. Finally, the authors use neuroimaging, behavioural analyses, computational simulations and robotic models to provide a way of understanding the mechanisms and processes that cause development to occur.
We investigate the application of Courcelle’s theorem and the logspace version of Elberfeld et al. in the context of non-monotonic reasoning. Here we formalize the implication problem for propositional sets of formulas, the extension existence problem for default logic, the expansion existence problem for autoepistemic logic, the circumscriptive inference problem, as well as the abduction problem in monadic second order logic and thereby obtain fixed-parameter time and space efficient algorithms for these problems. On the other hand, we exhibit, for each of the above problems, families of instances of a very simple structure that, for a wide range of different parameterizations, do not have efficient fixed-parameter algorithms under standard complexity assumptions.
Drawing on a global array of case studies - Malaysia, Saudi Arabia, Pakistan, Turkey, and Islamic education in the United States - this volume shows how the discourse concerning educational technology in the Islamic world has emphasized neoliberal and neofundamentalist themes, and argues that the design and implementation of educational technologies in schools would be better accomplished by taking a culturally grounded approach. This approach would be rooted in the context and local needs of learners, with implications and possible application in the Muslim world and beyond.
We argue that there are no such things as literal categories in human cognition. Instead, we argue that there are merely temporary coalescences of dimensions of similarity, which are brought together by context in order to create the similarity structure in mental representations appropriate for the task at hand. Fodor contends that context‐sensitive cognition cannot be realised by current computational theories of mind. We address this challenge by describing a simple computational implementation that exhibits internal knowledge representations whose similarity structure alters fluidly depending on context. We explicate the processing properties that support this function and illustrate with two more complex models, one applied to the development of semantic knowledge, the second to the processing of simple metaphorical comparisons. The models firstly demonstrate how phenomena that seem problematic for literal categorisation resolve to particular cases of the contextual modulation of mental representations; and secondly prompt a new perspective on the relation between language and thought: language affords the strategic control of context on semantic knowledge, allowing information to be brought to bear in a given situation that might otherwise not be available to influence processing. This may explain one way in which human thought is creative, and distinctive from animal cognition.
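The core claim, that context reshapes which items count as similar, can be sketched without any learning at all (the items, features, and context weights below are invented for illustration; the article's actual simulations are trained connectionist models whose internal representations shift with context): weighting the same feature vectors by a context vector changes the resulting similarity structure.

```python
import math

FEATURES = ("red", "round", "small", "edible")

items = {
    "cherry":     (1, 1, 1, 1),
    "ball":       (1, 1, 1, 0),
    "strawberry": (1, 0, 1, 1),
}

contexts = {
    # a play context foregrounds shape and size; a food context, edibility
    "play": (0.2, 1.0, 1.0, 0.1),
    "food": (0.2, 0.1, 0.2, 1.0),
}

def similarity(a, b, ctx):
    # cosine similarity after weighting each feature by the context
    wa = [x * w for x, w in zip(items[a], contexts[ctx])]
    wb = [x * w for x, w in zip(items[b], contexts[ctx])]
    dot = sum(x * y for x, y in zip(wa, wb))
    na = math.sqrt(sum(x * x for x in wa))
    nb = math.sqrt(sum(x * x for x in wb))
    return dot / (na * nb)

# In a play context a cherry resembles a ball; in a food context, a strawberry.
assert similarity("cherry", "ball", "play") > similarity("cherry", "strawberry", "play")
assert similarity("cherry", "strawberry", "food") > similarity("cherry", "ball", "food")
```

No fixed partition of these items into literal categories captures both orderings; the "category" a cherry belongs to is a temporary coalescence of whichever dimensions the context foregrounds, which is the contextual modulation the abstract describes.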