In 1952, Heinrich Scholz published a question in The Journal of Symbolic Logic asking for a characterization of spectra, i.e., sets of natural numbers that are the cardinalities of finite models of first-order sentences. Günter Asser in turn asked whether the complement of a spectrum is always a spectrum. These innocent questions turned out to be seminal for the development of finite model theory and descriptive complexity. In this paper we survey developments over the last 50-odd years pertaining to the spectrum problem. Our presentation follows conceptual developments rather than chronological order. Originally a number-theoretic problem, it has been approached by means of recursion theory, resource-bounded complexity theory, classification by complexity of the defining sentences, and finally by means of structural graph theory. Although Scholz's question was answered in various ways, Asser's question remains open.
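To make the definition above concrete (this toy example is my own illustration, not from the survey): the first-order sentence saying that a binary relation R is a perfect matching (irreflexive, symmetric, and every element related to exactly one other) has as its spectrum exactly the even positive integers, since a finite structure satisfies it precisely when its domain can be split into pairs. A brute-force sketch in Python; the function name and the tiny search bound are assumptions for illustration only:

```python
from itertools import product

def has_matching_model(n):
    """Does some binary relation R on {0,...,n-1} satisfy the first-order
    sentence 'R is a perfect matching', i.e. R is irreflexive, symmetric,
    and every element is R-related to exactly one element?"""
    pairs = [(i, j) for i in range(n) for j in range(n)]
    # Enumerate every binary relation on the n-element domain.
    for bits in product([0, 1], repeat=len(pairs)):
        R = {p for p, b in zip(pairs, bits) if b}
        if any((i, i) in R for i in range(n)):
            continue  # fails irreflexivity
        if any((j, i) not in R for (i, j) in R):
            continue  # fails symmetry
        if all(sum(((i, j) in R) for j in range(n)) == 1 for i in range(n)):
            return True  # found a model of cardinality n
    return False

# The spectrum restricted to small cardinalities: only even sizes appear.
spectrum = [n for n in range(1, 5) if has_matching_model(n)]
print(spectrum)  # → [2, 4]
```

Real spectra can be far richer (for instance, the spectrum of the field axioms is the set of prime powers), which is part of what makes Asser's complementation question difficult.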
Autism spectrum disorders (ASDs) are an issue of growing public health significance. This set of neurodevelopmental disorders, which includes autistic disorder, Asperger syndrome, and pervasive developmental disorder not otherwise specified (PDD-NOS), is characterized by abnormalities in one or more of the following domains: language use, reciprocal social interactions, and/or a pattern of restricted interests or stereotyped behaviors. Prevalence estimates for ASDs have been increasing over the past few decades, with estimates at ~5/10,000 in the 1960s, and current estimates as high as 1/88 (Newschaffer et al. 2007; CDC 2012). While ASDs encompass a wide range of phenotypes and degrees of severity, the disorders …
In this paper, I present a challenge for Michael McKenna’s conversational theory of moral responsibility. On his view, to be a responsible agent is to be able to engage in a type of moral conversation. I argue that individuals with autism spectrum disorder present a considerable problem for the conversational theory because empirical evidence on the disorder seems to suggest that there are individuals in the world who meet all of the conditions for responsible agency that the theory lays out but who are nevertheless not responsible agents. Attending to the moral psychology of such individuals will, I think, help shed light on an important gap in the conversational theory.
Functionalism, a philosophical theory, has empirical consequences. Functionalism predicts that where systematic transformations of sensory input occur and are followed by behavioral accommodation in which normal function of the organism is restored such that the causes and effects of the subject's psychological states return to those of the period prior to the transformation, there will be a return of qualia or subjective experiences to those present prior to the transformation. A transformation of this type that has long been of philosophical interest is the possibility of an inverted spectrum. Hilary Putnam argues that the physical possibility of acquired spectrum inversion refutes functionalism. I argue, however, that in the absence of empirical results no a priori arguments against functionalism, such as Putnam's, can be cogent. I sketch an experimental situation which would produce acquired spectrum inversion. The mere existence of qualia inversion would constitute no refutation of functionalism; only its persistence after behavioral accommodation to the inversion would properly count against functionalism. The cumulative empirical evidence from experiments on image inversion suggests that the results of actual spectrum inversion would confirm rather than refute functionalism.
We numerically solve the functional differential equations (FDEs) of 2-particle electrodynamics, using the full electrodynamic force obtained from the retarded Liénard–Wiechert potentials and the Lorentz force law. In contrast, the usual formulation uses only the Coulomb force (scalar potential), reducing the electrodynamic 2-body problem to a system of ordinary differential equations (ODEs). The ODE formulation is mathematically suspect since FDEs and ODEs are known to be incompatible; however, the Coulomb approximation to the full electrodynamic force has been believed to be adequate for physics. We can now test this long-standing belief by comparing the FDE solution with the ODE solution, in the historically interesting case of the classical hydrogen atom. The solutions differ. A key qualitative difference is that the full force involves a ‘delay’ torque. Our existing code is inadequate to calculate the detailed interaction of the delay torque with radiative damping. However, a symbolic calculation provides conditions under which the delay torque approximately balances (3rd order) radiative damping. Thus, further investigations are required, and the earlier conclusion that radiative damping makes the classical hydrogen atom unstable was premature. Solutions of FDEs naturally exhibit an infinite spectrum of discrete frequencies. The conclusion is that (a) the Coulomb force is not a valid approximation to the full electrodynamic force, so that (b) the n-body interaction needs to be reformulated in various current contexts such as molecular dynamics.
Michael Tye’s considered position on visual experience combines representationalism with externalism about color, so when considering spectrum inversion, he needs a principled reason to claim that a person with inverted color vision is seeing things incorrectly. Tye’s responses to the problem of the inverted spectrum (2000, in: Consciousness, color, and content, The MIT Press, Cambridge, MA, and 2002a, in: Chalmers (ed.) Philosophy of mind: classical and contemporary readings, Oxford University Press, Oxford) rely on a teleological approach to the evolution of vision to secure the grounds upon which people with normal color vision can be justly called ‘right’ and those with inverted color vision can be called ‘wrong’. I demonstrate that since the inverted spectrum thought experiment requires that both sorts of vision be behaviorally indistinguishable, no biologically acceptable concept of teleology will allow Tye to draw the distinction he needs.
I argue that the inverted spectrum hypothesis is not a possibility we should take seriously. The principal reason is that if someone's qualia were inverted in the specified manner, there is reason to believe the phenomenal difference would manifest itself in behaviour. This is so for two reasons. First, I suggest that qualia, including phenomenal colours, are partly constituted by an affective component which would be inverted along with the connected qualia. The resulting affective inversions will, given the intimate connections that exist between emotions and behaviour, likely manifest themselves in behaviour, in which case the underlying phenomenal differences can be functionally captured. Second, I argue that other sense modalities lack the structural features necessary for undetectable inversion, which, because of their analogy with colour qualia, weakens the plausibility of such an inversion in the original case of vision.
It is known that various complexity-theoretical problems can be translated into some special spectra problems. Thus, questions about complexity classes are translated into questions about the expressive power of some languages. In this paper we investigate the spectra of some logics with Henkin quantifiers in the empty vocabulary.
Our account of the problem of the classical limit of quantum mechanics involves two elements. The first is self-induced decoherence, conceived as a process that depends on the intrinsic dynamics of a closed quantum system governed by a Hamiltonian with continuous spectrum; the study of decoherence is addressed by means of a formalism used to give meaning to the van Hove states with diagonal singularities. The second element is macroscopicity, represented by the limit $\hbar \rightarrow 0$: when the macroscopic limit is applied to the Wigner transformation of the diagonal state resulting from decoherence, the description of the quantum system becomes equivalent to the description of an ensemble of classical trajectories on phase space weighted by their corresponding probabilities.
Because humans cannot know one another’s minds directly, every form of communication is a solution to the same basic problem: how can privately held information be made publicly accessible through manipulations of the physical environment? Language is by far the best studied response to this challenge. But there is a diversity of non-linguistic strategies for representation with external signs as well, from facial expressions and fog horns to chronological graphs and architectural renderings. The general thesis of this dissertation is that there is an impressively wide spectrum of conventional systems of representation, corresponding to the many ways that the problem of communication can be solved, and that these systems can be described and explained using the tools of contemporary mathematical semantics. As a partial corrective to the countervailing norm, this work concentrates on the class of systems arguably most different from language: those governing the interpretation of pictorial images. Such representations dominate practical communication: witness the proliferation of maps, road signs, newspaper photographs, scientific illustrations, television shows, engineering drawings, and even the fleeting imagery of manual gesture. I argue that systems of depiction and languages embody parallel technologies of communication. Both are based on semantics: systematic and conventional mappings from signs to representational content. But I also provide evidence that these semantics are profoundly divergent. Whereas the semantics of languages are based on arbitrary associations of signs and denotations, the semantics of systems of depiction are based on rules of geometrical transformation. Drawing on recent research in computer graphics and computational vision, I go on to develop a precise theory of pictorial semantics. This in turn facilitates a detailed comparison of iconic, image-based representation and symbolic, language-based representation.
A consequence of these conclusions is that the traditional, language-centric conception of semantics must be overhauled to allow for a more general semantic theory, one which countenances the wide variety of interpretive mechanisms actually at work in human communication.
We face an epistemic problem in competently judging some types of experience. The problem arises when an experience either defies our efforts to assess its quality, such as a traumatic event, or compromises our abilities to assess quality in general, such as starvation. In the latter type of case, the competent judge problem is actually a paradox since the experience undermines our competence to judge at the same time that it gives us competence to judge it against other experiences. The problem is pressing because it arises for experiences at the more extreme ends of the spectrum, which are precisely the experiences we most want to judge competently. It also has implications for how we approach some practical ethical problems, such as solitary confinement. The paper explores a range of cases and explains why efforts to escape the competent judge problem may prove fruitless.
We present a solution to the ghost problem in fourth order derivative theories. In particular we study the Pais–Uhlenbeck fourth order oscillator model, a model which serves as a prototype for theories which are based on second plus fourth order derivative actions. Via a Dirac constraint method quantization we construct the appropriate quantum-mechanical Hamiltonian and Hilbert space for the system. We find that while the second-quantized Fock space of the general Pais–Uhlenbeck model does indeed contain the negative norm energy eigenstates which are characteristic of higher derivative theories, in the limit in which we switch off the second order action, such ghost states are found to move off shell, and the spectrum of asymptotic in and out S-matrix states of the resulting pure fourth order theory is found to be completely devoid of states with either negative energy or negative norm. We confirm these results by quantizing the Pais–Uhlenbeck theory via path integration and by constructing the associated first-quantized wave mechanics, and show that the disappearance of the would-be ghosts from the energy eigenspectrum in the pure fourth order limit is required by a hidden symmetry that the pure fourth order theory is unexpectedly found to possess. The occurrence of on-shell ghosts is thus seen not to be a shortcoming of pure fourth order theories per se, but rather one which only arises when fourth and second order theories are coupled to each other.
Larry Temkin and Stuart Rachels have argued that the “_ is better than _” relation need not be transitive. In support of this claim, they have presented several spectrum cases towards which our actual preferences appear not to be transitive. In this paper I examine one of them, and explain that there are several solutions we may give to the problem of what is the best global option within the spectrum. I point out that these solutions do not depend on whether we reject or accept the transitivity of the “_ is better than _” relation. This reduces the strength of the challenge that spectrum cases pose to transitivity in axiology.
Proponents of the hard problem of consciousness argue that the zombie and inverted spectrum thought experiments demonstrate that consciousness cannot be physical. They present scenarios designed to demonstrate that it is conceivable that a physical replica of someone can have radically different or no conscious experiences, that such an experience-less replica is possible and therefore that materialism is false. I will argue that once one understands the limitations that the physics of this world puts on cognitive systems, zombies and the inverted spectrum are not conceivable.
This paper addresses the question of how scientists determine which type of hypothesis is most suitable for tackling a particular problem by examining the historical case of the anomalous β spectrum in early nuclear physics, a puzzle that occasioned the most diverse hypotheses amongst physicists at the time. It is shown that such determinations are most often implicitly informed by scientists' individual perspectives on the structural relations between the various elements of the theory and the problem at hand. In addition to this main result, it is suggested that Wolfgang Pauli's neutrino idea may well have been an adaptation of Ernst Rutherford's original and older neutron idea, which would provide evidence that the adaptation of older ideas is a more common practice than is often thought.
We investigate theories of initial segments of the standard models for arithmetic. It is easy to see that if the ordering relation is definable in the standard model, then decidability results can be transferred from the infinite model to the finite models. By contrast, we show that the Σ₂-theory of multiplication is undecidable in finite models. We show that this result is optimal by proving that the Σ₁-theory of multiplication and order is decidable in finite models as well as in the standard model. We show also that the exponentiation function is definable in finite models by a formula of arithmetic with multiplication, and that one can define in finite models the arithmetic of addition and multiplication with the concatenation operation. We also consider the spectrum problem. We show that the spectrum of arithmetic with multiplication and arithmetic with exponentiation is strictly contained in the spectrum of arithmetic with addition and multiplication.
Abstract: In this article, the logic and functions of character-trait ascriptions in ethics and epistemology are compared, and two major problems, the "generality problem" for virtue epistemologies and the "global trait problem" for virtue ethics, are shown to be far more similar in structure than is commonly acknowledged. I suggest a way to put the generality problem to work by making full and explicit use of a sliding scale, a "narrow-broad spectrum of trait ascription," and by accounting for the various uses of it in an inquiry-pragmatist account. In virtue theories informed by inquiry pragmatism, the agential habits and abilities deemed salient in explanations/evaluations of agents in particular cases, and the determination of what relevant domains and conditions an agent's habit or ability is reliably efficacious in, are determined by pragmatic concerns related to our evaluative epistemic practices.
The classical two-body system with Lorentz-invariant Coulomb work function V = -k/ρ is solved in 3+1 dimensions using the manifestly covariant Hamiltonian mechanics of Stückelberg. Particular solutions for the reduced motion are obtained which correspond to bound attractive, unbound attractive, and repulsive scattering motion. A lack of perihelion precession is found in the bound attractive orbit, and the semiclassical hydrogen spectrum subsequently contains no fine structure corrections. It is argued that this prediction is indicative of the correct classical special relativistic two-body theory.
Think of the color spectrum, spread out before you. You can identify the different colors with ease. But if you are asked to indicate the point at which one color ends and the next begins, you are at a loss. "There is no such point", is a natural thought: one color just shades gradually into the next.
Autism and Asperger syndrome are psychiatric conditions diagnosed primarily on the basis of deficits and problems in social behaviour, interaction, and communication. At present the explanation of these behavioural features is dominated by three cognitive models. However, it is a characteristic of each of these models that they explain only a subset of the overall features. The aim of this paper is to suggest an alternative conceptual theory of autism and Asperger syndrome that unites the current three models. Thus, the aim is to situate the existing models as special cases of the one being proposed here. This alternative conceptualisation draws heavily on distinctions and ideas present in the philosophy of science, most especially in the area of critical realism. Central to the theory is the idea that the core “problem” in autism and Asperger syndrome concerns ontological depth. More specifically, people with these conditions find it difficult to cope with phenomena characterised by depth, open systems, and high internal relationality.
In nature, scent is important for man primarily as a marker of food and sexual attractiveness; it polarizes its objects into those of life and those of decay and death. Scent, just like touch and taste, exists only until subject and object become opposed to each other; it is the sphere where the body is included in the material world, and the flesh of the world is incrusted into the body. Aesthetics in its anthropological meaning is limited to a bodily perceptible dimension. The development of such categories as the sublime, the tragic, and the comic is not possible here. The creation of compound aromas, including those which do not exist in nature, cannot overstep the limits of the beautiful–ugly opposition. There is no contradiction, of the kind that nourishes the comic, the tragic, or the sublime, in the blending of body with world by means of scent. The spectrum of the aesthetic in the sphere of aromas extends in the plane from the beautiful to the ugly: fragrant, heady, amber, garlic, carpic, putrid, hideous, stinking.
Plant protection problems are simulated by a system of ordinary differential equations with given initial conditions. The sensitivity and resistance of pathogen subpopulations to fungicide mixtures, fungicide weathering, plant growth, etc. are taken into consideration. The system of equations is solved numerically for each set of initial conditions and parameters of the disease and fungicide applications. Optimization algorithms were investigated and a computer program was developed for optimization of these solutions. 14 typical cases of the disease were simulated and optimized in order to determine optimal fungicide treatments. The optimized strategy for fungicide application differs considerably from the commonly used method and seems to be an important new principle in plant protection. The approach developed in this study may be useful for a wide spectrum of purposes in the simulation of leaf diseases. It may also help the biologist to decrease or pinpoint experimental work and analyze its results, and is promising for plant disease control.
Processing of facial expressions goes beyond simple pattern recognition. To elucidate this problem, Niedenthal et al. offer a model that identifies multiple embodied and disembodied routes for expression processing, and spell out conditions triggering use of different routes. I elaborate on this model by discussing recent research on emotional recognition in individuals with autism, who can use multiple routes of emotion processing, and consequently can show atypical and typical patterns of embodied simulation and mimicry.
The most comprehensive text in its field, this anthology includes 74 articles in 9 areas of philosophy of religion: The Concept of God; Traditional Arguments for the Existence of God; Religious Experience; The Problem of Evil; Miracles; Death and Immortality; Faith and Reason; Science, Religion, and Evolution; and Religious Pluralism. The arrangement of the articles and the introductions which accompany them help students place the readings in their historical or contemporary context and ensure that students can be exposed to a spectrum of viewpoints.
Chalmers has argued for a form of property dualism on the basis of the concept of a zombie and the concept of the inverted spectrum. He asserts that these concepts show that the facts about consciousness, such as experience or qualia, are really further facts about our world, over and above the physical facts. He claims that they are the hard part of the mind-body issue. He also claims that consciousness is a fundamental feature of the world like mass, charge, etc. He says that consciousness does not logically supervene on the physical and all current attempts to assert an identity between consciousness and the physical are just as non-reductive as his dualism. They are simply correlations and are part of the problem of the explanatory gap. In this paper, three examples of strong identities between a sensation or a quale and a physiological process are presented, which overcome these problems. They explain the identity in an a priori manner and they show that consciousness or sensations logically supervene on the physical, in that it is logically impossible to have P and not to have Q. In each case, the sensation was predicted and entailed by the physical. The inverted spectrum problem for consciousness is overcome and explained by a striking asymmetry in colour space. It is concluded that, as some physical properties realize some sensations or qualia, human zombies are not metaphysically possible and the explanatory gap is bridged in these cases. Thus, the hard problem is overcome in these instances.
The spectrum of a relation R on a computable structure is the set of Turing degrees of the image of R under all isomorphisms between that structure and any other computable structure. The relation R is intrinsically computably enumerable if its image under all such isomorphisms is c.e. We prove that any computable partially ordered set is isomorphic to the spectrum of an intrinsically c.e. relation on a computable structure. Moreover, the isomorphism can be constructed in such a way that the image of the minimum element of the partially ordered set is computable. This solves the spectrum problem. The theorem and modifications of its proof produce computably categorical structures whose expansions by a finite number of constants are not computably categorical and, indeed, ones whose expansions can have any finite number of computable isomorphism types. They also provide examples of computably categorical structures that remain computably categorical under expansions by constants but have no Scott family.
This is terribly hard, Thouless, I'm sorry. I have thought over all this for years. … It is now as if we had ploughed furrows in different parts of a field. There is a lot left to do. Judging from their writings, most contemporary analytic philosophers have not been persuaded that “the inverted spectrum problem” is – as Wittgenstein maintained – really a conceptual puzzle calling for dissolution, rather than a straight problem calling for a solution. In this paper, I present Wittgenstein's view as clearly and persuasively as I can, contrasting it with the views of Sidney Shoemaker and Ned Block, two of his more prominent critics. I conclude with a look at Frank Jackson's well-known Knowledge Argument, which, if successful, would demonstrate the futility of looking for a physicalist solution to the inverted spectrum and related philosophical problems. My goal is to combat what I take to be the common and unfortunate failure – among both physicalistically inclined philosophers, including Shoemaker and Block, and anti-physicalists, such as Jackson – to appreciate the force of Wittgenstein's arguments.
The “demarcation problem,” the issue of how to separate science from pseudoscience, has been around since fall 1919—at least according to Karl Popper’s (1957) recollection of when he first started thinking about it. In Popper’s mind, the demarcation problem was intimately linked with one of the most vexing issues in philosophy of science, David Hume’s problem of induction (Vickers 2010) and, in particular, Hume’s contention that induction cannot be logically justified by appealing to the fact that “it works,” as that in itself is an inductive argument, thereby potentially plunging the philosopher straight into the abyss of a viciously circular argument.
One of the reasons why most of us feel puzzled about the problem of abortion is that we want, and do not want, to allow to the unborn child the rights that belong to adults and children. When we think of a baby about to be born it seems absurd to think that the next few minutes or even hours could make so radical a difference to its status; yet as we go back in the life of the fetus we are more and more reluctant to say that this is a human being and must be treated as such. No doubt this is the deepest source of our dilemma, but it is not the only one. For we are also confused about the general question of what we may and may not do where the interests of human beings conflict. We have strong intuitions about certain cases; saying, for instance, that it is all right to raise the level of education in our country, though statistics allow us to predict that a rise in the suicide rate will follow, while it is not all right to kill the feeble-minded to aid cancer research. It is not easy, however, to see the principles involved, and one way of throwing light on the abortion issue will be by setting up parallels involving adults or children once born. So we will be able to isolate the “equal rights” issue and should be able to make some advance...
Ever since Socrates, philosophers have been in the business of asking questions of the type “What is X?” The point has not always been to actually find out what X is, but rather to explore how we think about X, to bring up to the surface wrong ways of thinking about it, and hopefully in the process to achieve an increasingly better understanding of the matter at hand. In the early part of the twentieth century one of the most ambitious philosophers of science, Karl Popper, asked that very question in the specific case in which X = science. Popper termed this the “demarcation problem,” the quest for what distinguishes science from nonscience and pseudoscience (and, presumably, also the latter two from each other).
J.L. Mackie’s version of the logical problem of evil is a failure, as even he came to recognize. Contrary to current mythology, however, its failure was not established by Alvin Plantinga’s Free Will Defense. That’s because a defense is successful only if it is not reasonable to refrain from believing any of the claims that constitute it, but it is reasonable to refrain from believing the central claim of Plantinga’s Free Will Defense, namely the claim that, possibly, every essence suffers from transworld depravity.
I resolve the major challenge to an Expressivist theory of the meaning of normative discourse: the Frege–Geach Problem. Drawing on considerations from the semantics of directive language (e.g., imperatives), I argue that, although certain forms of Expressivism (like Gibbard’s) do run into at least one version of the Problem, it is reasonably clear that there is a version of Expressivism that does not.
The philosophical study of consciousness is chock full of thought experiments: John Searle’s Chinese Room, David Chalmers’ Philosophical Zombies, Frank Jackson’s Mary’s Room, and Thomas Nagel’s ‘What is it like to be a bat?’ among others. Many of these experiments and the endless discussions that follow them are predicated on what Chalmers famously referred to as the ‘hard’ problem of consciousness: for him, it is ‘easy’ to figure out how the brain is capable of perception, information integration, attention, reporting on mental states, etc., even though this is far from being accomplished at the moment. What is ‘hard’, claims the man of the p-zombies, is to account for phenomenal experience, or what philosophers usually call ‘qualia’: the ‘what is it like’, first-person quality of consciousness.
Here I discuss some theistic responses to the problem of animal pain and suffering with special attention to Michael Murray’s presentation in Nature Red in Tooth and Claw. The neo-Cartesian defenses he describes are reviewed, along with the appeal to nomic regularity and Murray’s emphasis on the progression of the universe from chaos to order. It is argued that, despite these efforts to prove otherwise, the problem of animal suffering remains a serious threat to the belief that an all-powerful, all-knowing, and all-good creator exists.
In this paper, I argue that there is a kind of evil, namely, the unequal distribution of natural endowments, or natural inequality, which presents theists with a new evidential problem of evil. The problem of natural inequality is a new evidential problem of evil not only because, to the best of my knowledge, it has not yet been discussed in the literature, but also because available theodicies, such as the free will defense and the soul-making defense, are not adequate responses in the face of this particular evil, or so I argue.
This book maintains that our conception of consciousness and cognition begins with and depends upon a few fundamental errors. Thau elucidates these errors by discussing three important philosophical puzzles - Spectrum Inversion, Frege's Puzzle, and Black-and-White Mary - each of which concerns some aspect of either consciousness or cognition. He argues that it has gone unnoticed that each of these puzzles presents the very same problem, and that bringing this commonality to light also reveals the errors in our natural conception of consciousness and cognition.
In its original form, Nozick’s experience machine serves as a potent counterexample to a simplistic form of hedonism. The pleasurable life offered by the experience machine, it seems safe to say, lacks the requisite depth that many of us find necessary to lead a genuinely worthwhile life. Among other things, the experience machine offers no opportunities to establish meaningful relationships, or to engage in long-term artistic, intellectual, or political projects that survive one’s death. This intuitive objection finds some support in recent research regarding the psychological effects of phenomena such as video games or social media use. After a brief discussion of these problems, I will consider a variation of the experience machine in which many of these deficits are remedied. In particular, I’ll explore the consequences of creating a virtual world populated with strongly intelligent AIs with whom users could interact, and that could be engineered to survive the user’s death. The presence of these agents would allow for the cultivation of morally significant relationships, and the world’s long-term persistence would help ground possibilities for a meaningful, purposeful life in a way that Nozick’s original experience machine could not. While the creation of such a world is obviously beyond the scope of current technology, it represents a natural extension of the existing virtual worlds provided by current video games, and it provides a plausible “ideal case” toward which future virtual worlds will move. While this improved experience machine would seem to represent progress over Nozick’s original, I will argue that it raises a number of new problems stemming from the fact that the world was created to provide a maximally satisfying and meaningful life for the intended user. This, in turn, raises problems analogous in some ways to the problem(s) of evil faced by theists.
In particular, I will suggest that it is precisely those features that would make a world most attractive to potential users—the fact that the AIs are genuinely moral agents whose well-being the user can significantly impact—that render its creation morally problematic, since they require that the AIs inhabiting the world be subject to unnecessary suffering. I will survey the main lines of response to the traditional problem of evil, and will argue that they are irrelevant to this modified case. I will close by considering what constraints on the future creation of virtual worlds, if any, might serve to allay the concerns identified in the previous discussion. I will argue that, insofar as the creation of such worlds would allow us to meet morally valuable purposes that could not be easily met otherwise, we would be unwise to prohibit it altogether. However, if our processes of creation are to be justified, they must take account of the interests of the moral agents that would come to exist as the result of our world creation.
This is an opinionated overview of the Frege-Geach problem, in both its historical and contemporary guises. Covers Higher-order Attitude approaches, Tree-tying, Gibbard-style solutions, and Schroeder's recent A-type expressivist solution.
We can classify theories of consciousness along two dimensions. First, a theory might be physicalist or dualist. Second, a theory might endorse any of these three views regarding causal relations between phenomenal properties (properties that characterize states of our consciousness) and physical properties: nomism (the two kinds of property interact through deterministic laws), acausalism (they do not causally interact), and anomalism (they interact but not through deterministic laws). In this paper, I explore anomalous dualism, a combination of views that has not previously been explored (as far as I know). I suggest that a kind of anomalous dualism, nonreductive anomalous panpsychism, promises to offer the best overall answer to two pressing issues for dualist views, the problem of mental causation and the mapping problem (the problem of predicting mind-body associations).
My primary aim is to defend a nonreductive solution to the problem of action. I argue that when you are performing an overt bodily action, you are playing an irreducible causal role in bringing about, sustaining, and controlling the movements of your body, a causal role best understood as an instance of agent causation. Thus, the solution that I defend employs a notion of agent causation, though emphatically not in defence of an account of free will, as is the case with most theories of agent causation. Rather, I argue that the notion of agent causation introduced here best explains how it is that you are making your body move during an action, thereby providing a satisfactory solution to the problem of action.
Moral non-cognitivists hope to explain the nature of moral agreement and disagreement as agreement and disagreement in non-cognitive attitudes. In doing so, they take on the task of identifying the relevant attitudes, distinguishing the non-cognitive attitudes corresponding to judgements of moral wrongness, for example, from attitudes involved in aesthetic disapproval or the sports fan’s disapproval of her team’s performance. We begin this paper by showing that there is a simple recipe for generating apparent counterexamples to any informative specification of the moral attitudes. This may appear to be a lethal objection to non-cognitivism, but a similar recipe challenges attempts by non-cognitivism’s competitors to specify the conditions underwriting the contrast between genuine and merely apparent moral disagreement. Because of its generality, this specification problem requires a systematic response, which, we argue, is most easily available for the non-cognitivist. Building on premisses congenial to the non-cognitivist tradition, we make the following claims: (1) In paradigmatic cases, wrongness-judgements constitute a certain complex but functionally unified state, and paradigmatic wrongness-judgements form a functional kind, preserved by homeostatic mechanisms. (2) Because of the practical function of such judgements, we should expect judges’ intuitive understanding of agreement and disagreement to be accommodating, treating states departing from the paradigm in various ways as wrongness-judgements. (3) This explains the intuitive judgements required by the counterexample-generating recipe, and more generally why various kinds of amoralists are seen as making genuine wrongness-judgements.