According to the Generality Constraint, mental states with conceptual content must be capable of recombining in certain systematic ways. Drawing on empirical evidence from cognitive science, I argue that so-called analogue magnitude states violate this recombinability condition and thus have nonconceptual content. I further argue that this result has two significant consequences: it demonstrates that nonconceptual content seeps beyond perception and infiltrates cognition; and it shows that whether mental states have nonconceptual content is largely an empirical matter determined by the structure of the neural representations underlying them.
Andrea Westlund's account of love involves lovers becoming a Plural Subject, mirroring Margaret Gilbert's Plural Subject Theory. However, while for Gilbert the creation of a plural will involves individuals jointly committing to pool their wills, with the plural will directly normatively constraining those individuals, Westlund, in contrast, sees the creation of a plural will as a continual process, thus rejecting the possibility of such direct normative constraint. This rejection appears to be required to explain the flexibility that allows for a central place for reciprocity in loving relationships. However, this paper argues against the existence of such flexibility and presents instead the case that variance in the normative pain of rebelling against the collective will can be accommodated by replacing Gilbert's notion of all-or-nothing pooling of wills with an account that sees wills as becoming entangled through levels of identification with the plural subject.
In his book Mind and World, John McDowell grapples with the problem that the world must and yet seemingly cannot constrain our empirical thought. I first argue that McDowell’s proposed solution to the problem throws him onto the horns of his own, intractable dilemma, and thus fails to solve the problem of rational constraint by the world. Next, I will argue that Wilfrid Sellars, in a series of articles written in the 1950s and 60s, provides the tools to solve the dilemma McDowell sets before us. We will see how, borrowing from Sellars and certain neo-Sellarsians, we can solve the problem of rational constraint by perception without resorting to a McDowellian quasi-enchantment of the world.
I develop a variant of the constraint interpretation of the emergence of purely physical (non-biological) entities, focusing on the principle of the non-derivability of actual physical states from possible physical states (physical laws) alone. While this is a necessary condition for any account of emergence, it is not sufficient, for it becomes trivial if not extended to types of constraint that specifically constitute physical entities, namely, those that individuate and differentiate them. Because physical organizations with these features are in fact interdependent sets of such constraints, and because such constraints on physical laws cannot themselves be derived from physical laws, physical organization is emergent. These two complementary types of constraint are components of a complete non-reductive physicalism, comprising a non-reductive materialism and a non-reductive formalism.
We present a rendering of some common grammatical formalisms in terms of evolving algebras. Though our main concern in this paper is on constraint-based formalisms, we also discuss the more basic case of context-free grammars. Our aim throughout is to highlight the use of evolving algebras as a specification tool to obtain grammar formalisms.
We observe a number of connections between recent developments in the study of constraint satisfaction problems, irredundant axiomatisation and the study of topological quasivarieties. Several restricted forms of a conjecture of Clark, Davey, Jackson and Pitkethly are solved: for example we show that if, for a finite relational structure M, the class of M-colourable structures has no finite axiomatisation in first order logic, then there is no set (even infinite) of first order sentences characterising the continuously M-colourable structures amongst compact totally disconnected relational structures. We also refute a rather old conjecture of Gorbunov by presenting a finite structure with an infinite irredundant quasi-identity basis.
Domain constraint, the requirement that analogues be selected from "the same category," inheres in the popular saying "you can't compare apples and oranges" and the textbook principle "the greater the number of shared properties, the stronger the argument from analogy." I identify roles of domains in biological, linguistic, and legal analogy, supporting the account of law with a computer word search of judicial decisions. I argue that the category treatments within these disciplines cannot be exported to general informal logic, where the relevance of properties, not their number, must be the logically prior criterion for evaluating analogical arguments.
Allen's famous interval-relations constraint propagation algorithm was intended for linear time. Its 13 primitive relations define all the possible mutual locations of two intervals on the time-axis. In this paper an application of the algorithm to non-linear time is suggested. First, a new primitive relation is added. It is called excludes, since an occurrence of one event in a certain course of events excludes an occurrence of the other event in that course. Next, new composition rules for relations between intervals are presented: some of the old rules are extended by the relation excludes, and entirely new ones are formulated for composing the relation excludes with the other relations. Four different composition tables are considered. The choice of a composition table depends on whether time is branching or not, and whether intervals can contain non-collinear subintervals or not.
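To make the 13 primitive relations concrete, here is a minimal sketch, not taken from the paper, that classifies two intervals on a linear time-axis into one of Allen's relations; the function name, the relation labels, and the endpoint encoding are illustrative assumptions.

```python
# Classify two intervals (s1, e1) and (s2, e2) on a linear time-axis
# into one of Allen's 13 primitive relations.  Assumes s < e for both.
def allen_relation(s1, e1, s2, e2):
    if e1 < s2:
        return "before"
    if e2 < s1:
        return "after"
    if e1 == s2:
        return "meets"
    if e2 == s1:
        return "met-by"
    if s1 == s2 and e1 == e2:
        return "equals"
    if s1 == s2:                       # shared start, different ends
        return "starts" if e1 < e2 else "started-by"
    if e1 == e2:                       # shared end, different starts
        return "finishes" if s1 > s2 else "finished-by"
    if s2 < s1 and e1 < e2:            # first interval strictly inside
        return "during"
    if s1 < s2 and e2 < e1:            # second interval strictly inside
        return "contains"
    if s1 < s2 < e1 < e2:              # proper overlap, first begins earlier
        return "overlaps"
    return "overlapped-by"             # proper overlap, second begins earlier
```

On a linear time-axis exactly one branch fires for any pair of proper intervals; the paper's contribution is precisely what happens when this exhaustiveness fails, i.e., when two events can live on different branches and the extra relation excludes is needed.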
The SRL (speciate re-entrant logic) of King (1989) is a sound, complete and decidable logic designed specifically to support formalisms for the HPSG (head-driven phrase structure grammar) of Pollard and Sag (1994). The SRL notion of modellability in a signature is particularly important for HPSG, and the present paper modifies an elegant method due to Blackburn and Spaan (1993) in order to prove that modellability in each computable signature is Π⁰₁, that modellability in some finite signature is Π⁰₁-hard (hence not decidable), and that modellability in some finite signature is decidable.
We prove an exponential lower bound on the size of proofs in the proof system operating with ordered binary decision diagrams (OBDDs) introduced by Atserias, Kolaitis and Vardi. In fact, the lower bound applies to semantic derivations operating with sets defined by OBDDs. We do not assume any particular format of proofs or ordering of variables, and the hard formulas are in CNF. We utilize (somewhat indirectly) feasible interpolation. We also define a proof system combining resolution and the OBDD proof system.
It is widely mooted that a plausible computational cognitive model should involve both symbolic and connectionist components. However, sound principles for combining these components within a hybrid system are currently lacking; the design of such systems is often ad hoc. In an attempt to ameliorate this we provide a framework of types of hybrid systems and constraints therein, within which to explore the issues. In particular, we suggest the use of system independent constraints, whose source lies in general considerations about cognitive systems, rather than in particular technological or task-based considerations. We illustrate this through a detailed examination of an interruptibility constraint: handling interruptions is a fundamental facet of cognition in a dynamic world. Aspects of interruptions are delineated, as are their precise expression in symbolic and connectionist systems. We illustrate the interaction of the various constraints from interruptibility in the different types of hybrid systems. The picture that emerges of the relationship between the connectionist and the symbolic within a hybrid system provides for sufficient flexibility and complexity to suggest interesting general implications for cognition, thus vindicating the utility of the framework.
The so-called "adaptationism" of mainstream evolutionary biology has been criticized from a variety of sources. One, which has received relatively little philosophical attention, is developmental biology. Developmental constraints are said to be neglected by adaptationists. This paper explores the divergent methodological and explanatory interests that separate mainstream evolutionary biology from its embryological and developmental critics. It will focus on the concept of constraint itself; even this central concept is understood differently by the two sides of the dispute.
The concept of developmental constraint was at the heart of developmental approaches to evolution of the 1980s. While this idea was widely used to criticize neo-Darwinian evolutionary theory, critique does not yield an alternative framework that offers evolutionary explanations. In current Evo-devo the concept of constraint is of minor importance, whereas notions such as evolvability are at the center of attention. The latter clearly defines an explanatory agenda for evolutionary research, so that one could view the historical shift from ‘developmental constraint’ towards ‘evolvability’ as the move from a concept that is a mere tool of criticism to a concept that establishes a positive explanatory project. However, by taking a look at how the concept of constraint was employed in the 1980s, I argue that developmental constraint was not just seen as restricting possibilities (‘constraining’), but also as facilitating morphological change in several ways. Accounting for macroevolutionary transformation and the origin of novel form was an aim of these developmental approaches to evolution. Thus, the concept of developmental constraint was part of a positive explanatory agenda long before the advent of Evo-devo as a genuine scientific discipline. In the 1980s, despite the lack of a clear disciplinary identity, this concept coordinated research among paleontologists, morphologists, and developmentally inclined evolutionary biologists. I discuss the different functions that scientific concepts can have, highlighting that instead of classifying or explaining natural phenomena, concepts such as ‘developmental constraint’ and ‘evolvability’ are more important in setting explanatory agendas so as to provide intellectual coherence to scientific approaches. The essay concludes with a puzzle about how to conceptually distinguish evolvability and selection.
Whether certain objects compose a whole at a given time does not seem to depend on anything other than the character of those objects and the relations between them. This observation suggests a far-reaching constraint on theories of composition. One version of the constraint has been explicitly adopted by van Inwagen and rules out his own answer to the composition question. The constraint also rules out the other well-known moderate answers that have so far been proposed.
Although most of the contemporary debates around subjectivity are framed by a rejection of the metaphysical subject, more time needs to be spent developing the implications of abandoning the meta-physics of constraint. Doing so provides the key to approaching our pressing problem that concerns freedom, and only once invisible, ideal "constraints" have been adequately understood will all of the contemporary puzzlement that concerns intentional resistance to power be assuaged. While Sartre does not solve the problem of freedom bequeathed to us by Foucault, it is clear that he struggled with similar issues, and that his work sheds important light on the issue of ideal constraint. Once more, on Sartre's second view, power and freedom are not mutually exclusive, and in this he advances over much contemporary liberal thought. Thus, on the approach of what would be Sartre's hundredth birthday, I invite others to take this opportune moment to reevaluate the early work of this once shining philosophical star, only recently and perhaps prematurely eclipsed by anti-humanism, and recognize that now, more than ever, Sartre's thought is relevant to our very pressing concerns.
This paper is concerned with a quality space model as an account of the intelligibility of explanation. I argue that descriptions of causal or functional roles (Chalmers; Levine, 2001) are not the only basis for intelligible explanations. If we accept that phenomenal concepts refer directly, not via descriptions of causal or functional roles, then it is difficult to find role fillers for the described causal roles. This constitutes a vagueness constraint on the intelligibility of explanation. Thus, I propose to use quality space models to develop a systematic way of studying different modalities of perception and feelings, e.g., visual and auditory perception, pain, and emotion, that can reveal some structural relations among these modalities. It might turn out that topological explanation can be more intelligible than causal explanation in this case. I discuss two accounts of a quality space for color vision (Clark, 2000; Rosenthal, 2010) and propose how to construct a quality space for pain. Daniel Kostic is Associated Researcher at Berlin School of Mind and Brain.
According to the view that Peacocke elaborates in A Study of Concepts (1992), a concept can be individuated by providing the conditions a thinker must satisfy in order to possess that concept. Hence possession conditions for concepts should be specifiable in a way that respects a non-circularity constraint. In a more recent paper, “Implicit Conceptions, Understanding and Rationality” (1998a), Peacocke argues against his former view, in the light of the phenomenon of rationally accepting principles which do not follow from what the thinker antecedently accepts. In this paper I defend the view of the book from his more recent criticisms, claiming that the noncircularity constraint should be respected, and that Peacocke's more recent insights could be accommodated in the framework of his former theory of concepts.
Robert Adams, in Finite and Infinite Goods: A Framework for Ethics, suggests a moral constraint on our obedience to God's commands: if a purportedly divine command seems abhorrently evil, then we should infer that it is not really God so commanding. I suggest that in light of his commitments to God as the standard of goodness, to the transcendence of God, and to a critical stance towards ethics, Adams should be willing to consider the possibility of a good God commanding us to do something that seems abhorrently evil to us, but really is good according to His transcendent goodness. I suggest that the ought-to-is moral constraint that Adams advocates is only appropriate when we are not certain that it is God giving the command, and that an is-to-ought constraint based on psychological certainty should be the ultimate constraint on our obedience to purportedly divine commands. This constraint advocates that if one is certain upon reflection that a command is from God, then one should obey that command, regardless of how evil it seems. After responding to several objections to this psychological constraint, I offer my own qualification, according to which it is appropriate to disobey a command that one is certain is from God if one cannot conceive that the command is good. Finally, I offer some reason to think that, contrary to Adams's assertions, the project of considering how to react to a purportedly divine command that also seems abhorrently evil is worth both philosophic and spiritual energy.
Introduction -- Instrumental rationality -- Social order -- Deontic constraint -- Intentional states -- Preference noncognitivism -- A naturalistic perspective -- Transcendental necessity -- Weakness of will -- Normative ethics.
Those who endorse the Psychological Continuity Approach (PCA) to analyzing personal identity need to impose a non-branching constraint to get the intuitively correct result that in the case of fission, one person becomes two. With the help of Brueckner's (2005) discussion, it is shown here that the sort of non-branching clause that allows proponents of PCA to provide sufficient conditions for being the same person actually runs contrary to the very spirit of their theory. The problem is first presented in connection with perdurantist versions of PCA. The difficulty is then shown to apply to endurantist versions as well.
This paper offers a new definition of "adaptationism". An evolutionary account is adaptationist, it is suggested, if it allows for multiple independent origins for the same function -- i.e., if it violates the "Unique Origin Constraint". While this account captures much of the position Gould and Lewontin intended to stigmatize, it leaves it open that adaptationist accounts may sometimes be appropriate. However, there are many important cases, including that of human rationality, in which it is not.
Immoralists hold that in at least some cases, moral flaws in artworks can increase their aesthetic value. They deny what I call the valence constraint: the view that any effect that an artwork’s moral value has on its aesthetic merit must have the same valence. The immoralist offers three arguments against the valence constraint. In this paper I argue that these arguments fail, and that this failure reveals something deep and interesting about the relationship between cognitive and moral value. In the final section I offer a positive argument for the valence constraint.
I argue that we should not adopt categorial restrictions on the significance of syntactically well-formed strings. Even syntactically well-formed but semantically absurd strings, such as ‘Life is but a walking shadow’ and ‘Caesar is a prime number’, can express thoughts; and competent thinkers both can and ought to be able to grasp such thoughts. A more specific way of putting this claim is that Gareth Evans’ Generality Constraint should be viewed as a fully general constraint on concept possession and propositional thought, even though Evans himself accepted only a categorially-restricted version of the Constraint. I establish this by arguing, first, that even well-formed but semantically cross-categorial strings often do possess substantive inferential roles; second, that hearers exploit these inferential roles in interpreting such strings metaphorically; and third, that there is no good reason to deny truth-conditions to strings with inferential roles.
We now know of a number of ways of developing real analysis on a basis of abstraction principles and second-order logic. One, outlined by Shapiro in his contribution to this volume, mimics Dedekind in identifying the reals with cuts in the series of rationals under their natural order. The result is an essentially structuralist conception of the reals. An earlier approach, developed by Hale in his "Reals by Abstraction" program, differs by placing additional emphasis upon what I here term Frege's Constraint, that a satisfactory foundation for any branch of mathematics should somehow so explain its basic concepts that their applications are immediate. This paper is concerned with the meaning of and motivation for this constraint. Structuralism has to represent the application of a mathematical theory as always posterior to the understanding of it, turning upon the appreciation of structural affinities between the structure it concerns and a domain to which it is to be applied. There is, therefore, a case that Frege's Constraint has bite whenever there is a standing body of informal mathematical knowledge grounded in direct reflection upon sample, or schematic, applications of the concepts of the theory in question. It is argued that this condition is satisfied by simple arithmetic and geometry, but that in view of the gap between its basic concepts (of continuity and of the nature of the distinctions among the individual reals) and their empirical applications, it is doubtful that Frege's Constraint should be imposed on a neo-Fregean construction of analysis.
This paper argues that there is a general constraint on the evolution of culture. This constraint – what I am calling the Fundamental Constraint – must be satisfied in order for a cultural system to be adaptive. The Fundamental Constraint is this: for culture to be adaptive there must be a positive correlation between the fitness of cultural variants and their fitness impact on the organisms adopting those variants. Two ways of satisfying the Fundamental Constraint are introduced, structural solutions and evaluative solutions. Because of the limitations on these solutions, this constraint helps explain why there is not more culture in nature, why the culture that does exist has the form it has, and why complex, cumulative culture is restricted to the human species.
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation values of certain functions with their empirical averages. There are, however, various other ways in which one can construct constraints from empirical data, which makes the maximum entropy principle lead to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory and investigates several alternative choices. The choice of a constraint rule is also shown to be of crucial importance to the debate on the question whether there is a conflict between the methods of inference based on maximum entropy and Bayesian conditionalization.
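The usual constraint rule described here, which equates expectation values with empirical averages, can be illustrated with the familiar dice example: find the maximum-entropy distribution on {1,...,6} whose mean equals an observed average m. A minimal sketch follows; the function name and the bisection approach are assumptions made for illustration, not taken from the paper.

```python
import math

# Maximum-entropy distribution on outcomes 1..6 subject to the usual
# constraint rule: the expected value must equal the empirical average m.
# The solution has exponential form p_i proportional to exp(lam * i);
# we find lam by bisection, since the resulting mean increases with lam.
def maxent_die(m, tol=1e-12):
    def mean(lam):
        w = [math.exp(lam * i) for i in range(1, 7)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, 7), w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < m:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in range(1, 7)]
    z = sum(w)
    return [wi / z for wi in w]
```

With m = 3.5 the rule returns the uniform distribution; with m = 4.5 it skews the weights toward high faces. The paper's point is that equating expectations with averages is only one of several rules for turning the same data into a constraint, and different rules yield different assignments.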
Behavioral scientists studied behavior; cognitive scientists study what generates behavior. Cognitive science is hence theoretical behaviorism (or behaviorism is experimental cognitivism). Behavior is data for a cognitive theorist. What counts as a theory of behavior? In this paper, a methodological constraint on theory construction -- "neoconstructivism" -- will be proposed (by analogy with constructivism in mathematics): Cognitive theory must be computable; given an encoding of the input to a behaving system, a theory must be able to compute (an encoding of) its outputs. It is a mistake to conclude, however, that this constraint requires cognitive theory to be computational, or that it follows from this that cognition is computation.
The everyday virtue of civility functions as a constraint upon informal social pressures. Can civility also be understood, as John Rawls has proposed, as a distinctively political constraint? I contrast Rawls's project of constraining the political with Mill's of constraining both the social and the political, and explore Rawls's account of the relation between the two. I argue that Rawls's political duty of civility rests on the assumption that the political is peculiarly coercive; ignores the social enforcement of morality; and implausibly has civility apply to motives in acting, rather than to actions.
The FASB in its Conceptual Framework has set high principles in the ethics of standard-setting in accounting. This paper concentrates on what the FASB calls the cost/benefit constraint, i.e., the commitment to setting an accounting standard only when the benefits of the standard exceed the costs of that standard to all stakeholders. This constraint is supposed to take precedence over other concerns, such as neutrality (freedom from bias) of accounting information. The major conclusion of this paper is that a conflict exists between the FASB's commitment and its practice. There is no evidence that the FASB has always made a cost/benefit judgement with respect to proposed standards. In the cases when such a judgement is made, the FASB discounts social costs; therefore, it is not considering costs to all stakeholders. At the same time that it discounts social costs, the FASB seems to have an undue concern for standards that do not increase the volatility of net income. The Conceptual Framework explicitly defines costs as the costs to society as a whole.
In an earlier paper I argued that Alvin Plantinga's defence of pure experiential theism (a theism epistemically based on religious experience) against the evidential problem of evil is inappropriately circular. Eric Snider rejects my argument claiming first that I do not get Plantinga's thought right. Second, he rejects a key principle my argument relies on, viz. the 'independence constraint on neutralizers'. Finally, he offers an alternative to the independence constraint which allows the pure experiential theist to deal successfully with the evidential problem of evil. In this paper I argue that: (a) I have correctly characterized Plantinga's argument; and (b) that Snider's proposed counter-example to the independence constraint fails. Finally, I argue (c) that Snider's proposed alternative to the independence constraint is not a plausible epistemic principle.
I discuss Philip Pettit’s argument that appreciation is not a proper response to value because it fails to satisfy the non-iteration constraint, according to which, where V is a value and R is a response to value, R-ing V must not be distinct from R-ing R-ing V. After motivating the non-iteration constraint and conceding that appreciation fails to satisfy the constraint, I argue that the consequentialist’s preferred response to value, promotion, also violates the constraint, leaving Pettit with a dilemma: if he insists on the constraint, then promotion is not a proper response to value; if he does not insist on the constraint, then his argument against appreciation as a proper response to value fails.
Ronald Dworkin argues on the basis of a theory of well-being that critical paternalism is self-defeating. People must endorse their lives if they are to benefit. This is the endorsement constraint and this paper rejects it. For certain kinds of important mistakes that people can make in their lives, the endorsement constraint is either incredible or too narrow to rule out as much paternalism as Dworkin wants. The endorsement constraint cannot be interpreted to give sensible judgements when people change their minds about the value of their lives. And the main argument for the endorsement constraint, which is based on the value of integrity, does not support Dworkin's anti-paternalism.
Judith Jarvis Thomson concludes “A Defense of Abortion” with a discussion of samaritanism. Whereas her rights-based arguments demonstrate the moral permissibility of virtually all abortions, this new consideration of samaritanism provides grounds for morally objecting to certain abortions that are otherwise morally permissible given strictly rights-based considerations. I argue, first, that this samaritanism constraint on the moral permissibility of abortion involves an appeal to virtue-theoretical considerations. I then show why this hybridization of rights-based considerations and virtue-theoretical considerations has advantages over responses to the moral status of abortion that are either exclusively rights-based, or else exclusively virtue-theoretical. I conclude by offering some thoughts on how to utilize this hybrid strategy outside of Thomson’s particular context, as well as why we might generally favor such a strategy in our moral reasoning.
There is a widespread belief that, conceptually, justice cannot require what we cannot achieve. This belief is sometimes used by defenders of so-called ‘non-ideal theories of justice’ to criticise so-called ‘ideal theories of justice’. I refer to this claim as ‘the feasibility constraint on the concept of justice’ and argue against it. I point to its various implausible implications and contend that a willingness to apply the label ‘unjust’ to some regrettable situations that we cannot fix is going to enhance the action-guiding potential of a conception of justice, by providing an aspirational ideal. This is possible on the condition that, at all times, we cannot specify with certainty the limits of what is feasible for us collectively. The rejection of the feasibility constraint entails that there can be injustice without perpetrators; this is a theoretical price worth paying.
Recent work has shown that preschool-aged children and adults understand freedom of choice regardless of culture, but that adults across cultures differ in perceiving social obligations as constraints on action. To investigate the development of these cultural differences and universalities, we interviewed school-aged children (4–11) in Nepal and the United States regarding beliefs about people's freedom of choice and constraint to follow preferences, perform impossible acts, and break social obligations. Children across cultures and ages universally endorsed the choice to follow preferences but not to perform impossible acts. Age and culture effects also emerged: Young children in both cultures viewed social obligations as constraints on action, but American children did so less as they aged. These findings suggest that while basic notions of free choice are universal, recognitions of social obligations as constraints on action may be culturally learned.
The study of biomechanics most often takes a classic adaptationist approach, examining the functional abilities of organisms in relation to what is allowed by physical parameters. This approach generally assumes strong selection and is less concerned with evolutionary stochasticity in determining the presence of biological traits. It is equally important, however, to consider the importance of constraint in determining the form of organisms. If selection is relatively weak compared to stochastic events, then the observed forms in living systems can be taken not as those shapes that were strongly selected for, so much as those forms that do not violate physical rules and therefore persist. Using the problem of maximum animal size as a case study for this alternative biomechanical philosophy, I demonstrate one example of how biomechanical approaches can be used to study constraint and consider the concept of absent forms. This alternative mindset and approach produces a complementary system to the traditional form and function approach in biomechanics. The two philosophies can be used in conjunction to better understand biological systems. I focus particularly on the maximum size of flying animals, as they are a heavily constrained class of system that has also been shaped by substantial stochasticity.
This paper presents the Constraint Language for Lambda Structures (CLLS), a first-order language for semantic underspecification that conservatively extends dominance constraints. It is interpreted over lambda structures, tree-like structures that encode λ-terms. Based on CLLS, we present an underspecified, uniform analysis of scope, ellipsis, anaphora, and their interactions. CLLS solves a variable capturing problem that is omnipresent in scope underspecification and can be processed efficiently.
Palmer's “isomorphism constraint” presupposes the logical possibility of two qualitatively disparate sets of sensory experiences exhibiting the same relationships. Two arguments are presented to demonstrate that, because such a state of affairs cannot be coherently specified, its occurrence is not logically possible. The prospects for behavioral and biological science are better than Palmer suggests; those for functionalism are worse.
The constraint formalism of classical mechanics is extended to field theories with gauge groups. Explicit examples of Klein-Gordon and Maxwell fields are presented. The symmetry properties of the Maxwell fields have the unexpected feature in this formalism of forming a first-class algebra which is not Lie, a situation already encountered in the general theory of relativity.
Duncan MacIntosh has argued that David Gauthier's notion of a constrained maximization disposition faces a dilemma. For if such a disposition is revocable, it is no longer rational come the time to act on it, and so acting on it is not (as Gauthier argues) rational; but if it is not revocable, acting on it is not voluntary. This paper is a response to MacIntosh's dilemma. I introduce an account of rational intention of a type which has become increasingly and independently prominent in the literature, and argue that, on this account, rational and voluntary constraint is possible.
I distinguish between being cognisant and being able to perform intelligent operations. The former, but not the latter, minimally involves the capacity to make adequate judgements about one's relation to objects in the environment. The referential nature of cognisance entails that the mental states of cognisant systems must be inter-related holistically, such that an individual thought becomes possible because of its relation to a system of potential thoughts. I use Gareth Evans' 'Generality Constraint' as a means of describing how the reference and holism of mental states in cognisant systems are mutually dependent. Next, I describe attempts to deny the relevance of holism and reference by positing a mentalese. These attempts fail because the meanings of symbols are underdetermined, with there being no principled means of distinguishing between the mental tokening of a symbol and its disambiguation. I argue that the connectionist meta-theory does not encounter this problem because it is able to encompass the holism of the mental. Recent attempts to show that symbol-processing theories of thought must be preferred to connectionist theories do not work. Despite appearances to the contrary, the Generality Constraint favours connectionist not symbol-processing theories.
The Newell Test is an important step in advancing our understanding of cognition. One critical constraint is missing from this test: A cognitive architecture must be self-contained. ACT-R and connectionism fail on this account. I present an alternative proposal, called Distributed Adaptive Control (DAC), and expose it to the Newell Test with the goal of achieving a clearer specification of the different constraints and their relationships, as proposed by Anderson & Lebiere (A&L).
If conduct must be wrongful in order to be justifiably criminalised, how should its wrongfulness be established? I examine a conception of wrongfulness put forward by A. P. Simester, which makes wrongfulness turn on whether the reasons favouring the performance of an action are, all things considered, defeated by the reasons against its performance. I argue that such a view can only generate appropriate substantive constraints in the context of criminalisation if it can distinguish between the sorts of reasons that a verdict of wrongfulness, as a concept distinct from stupidity or selfishness, should attend to, and the sorts of reasons it should leave out. Assuming that this conception of wrongfulness should operate as a constraint on criminalisation in a liberal-democratic state, the only reasons it should include are other-regarding reasons. What matters is whether the agent commits an other-regarding wrong. This conception of wrongfulness helps us further to resolve fundamental questions concerning mala prohibita and the legitimate reach of any duty to obey the law.
The continuous spontaneous localization (CSL) model modifies Schrödinger's equation so that the collapse of the state vector is described as a physical process (a special interaction of particles with a universal fluctuating field). A consequence of the model is that an electron in an atom should occasionally get "spontaneously" knocked out of the atom. The CSL ionization rate for the 1s electrons in the Ge atom is calculated and compared with an experimental upper limit for the rate of "spontaneously" generated x-ray pulses in Ge. This gives, for the first time, an experimental constraint on the parameters which characterize this model (the GRW parameters and the relative collapse rate of electrons and nucleons). It is concluded that the values assigned to the GRW parameters by GRW may be maintained only if the coupling of electrons to the fluctuating field is 0.35% or less of the coupling of nucleons, suggestive of a mass-proportional (and therefore gravitational) collapse mechanism. For other allowed values of the GRW parameters, it is still argued that nucleons should collapse more rapidly than electrons.