I have been arguing, for almost thirty years now, that emotions have been unduly neglected in philosophy. Back in the seventies, it was an argument that attracted little sympathy. I have also been arguing that emotions are ripe for philosophical analysis, a view that, as evidenced by the Manchester 2001 conference and a large number of excellent publications, has now become mainstream. My own analysis of emotion, first published in 1973, challenged the sharp divide between emotions and rationality, insisted that we reject the established notion that the emotions are involuntary, and argued, in a brief slogan, that ‘emotions are judgments.’ Since then, although the specific term ‘judgment’ has come under considerable fire and my voluntarist thesis continues to attract incredulity, the general approach I took to emotions has been widely accepted in both philosophy and the social sciences. When Paul Griffiths attacked what he misleadingly characterized as ‘propositional attitude’ theories of emotion as the enemy of all that was true and scientifically worthy, I knew that we had made it. Such ferocious abuse is surely a sign that we had shifted, in Kuhnian terms, from being revolutionary to becoming the ‘normal’ paradigm. The current counter-revolution of affect programmes and neuro-reductionism says a lot about who we are and how far we have come.
In the biological realm, a complete explanation of a trait seems to include an explanation in terms of function. It is natural to ask of some trait, "What is its function?" or "What purpose in the organism does the particular trait serve?" or "What is the goal of its activity?" There are several views concerning the appropriate definition of function for biological matters. Two popular views of function with respect to living things are Cummins' organizational account and the Griffiths/Godfrey-Smith modern history account. Whereas Cummins argues that a trait functions so as to contribute to the general organization of some organism's present structure, Griffiths and Godfrey-Smith argue that a trait functions because of its fitness with respect to the organism's recent evolutionary history. In this paper, I show how these accounts can be made compatible and can complement one another. Given that structure, organization, operational flexibility, function, and evolutionary history are all factors to be considered in an organism's makeup, we should expect that the traits of an organism function the way they do both because such traits presently contribute to the overall organization of the organism (Cummins) and because they were selected for in the organism's species' recent ancestry (Griffiths/Godfrey-Smith).
Robert MacArthur's mathematical ecology is often regarded as ahistorical and has been criticized by historically oriented ecologists and philosophers for ignoring the importance of history. I clarify and defend his approach, especially his use of simple mathematical models to explain patterns in data and to generate predictions that stimulate empirical research. First I argue that it is misleading to call his approach ahistorical because it is not against historical explanation. Next I distinguish three kinds of criticism of his approach and argue that his approach is compatible with the first two of them. Finally, I argue that the third kind of criticism, advanced by Kim Sterelny and Paul Griffiths, is largely irrelevant to MacArthur's approach.
We consider the informal concept of "computability" or "effective calculability" and two of the formalisms commonly used to define it, "(Turing) computability" and "(general) recursiveness". We consider their origin, exact technical definition, concepts, history, general English meanings, how they became fixed in their present roles, how they were first and are now used, their impact on nonspecialists, how their use will affect the future content of the subject of computability theory, and its connection to other related areas. After a careful historical and conceptual analysis of computability and recursion we make several recommendations in §7 about preserving the intensional differences between the concepts of "computability" and "recursion". Specifically we recommend that: the term "recursive" should no longer carry the additional meaning of "computable" or "decidable"; functions defined using Turing machines, register machines, or their variants should be called "computable" rather than "recursive"; we should distinguish the intensional difference between Church's Thesis and Turing's Thesis, and use the latter particularly in dealing with mechanistic questions; and the name of the subject should be "Computability Theory", or simply Computability, rather than "Recursive Function Theory".
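To make the intensional distinction above concrete, here is a minimal Python sketch (the function names are mine, purely illustrative): the same function given once by a definition by recursion and once as an explicitly mechanical, step-by-step procedure. The two agree extensionally; the recommendation is about keeping the concepts "recursive" and "computable" apart.

```python
# The same function, factorial, under the two concepts discussed above.
# Extensionally identical; intensionally, one is a definition by
# recursion, the other a mechanical computation (register-machine style).

def fact_by_recursion(n: int) -> int:
    """Definition by recursion: f(0) = 1, f(n+1) = (n+1) * f(n)."""
    return 1 if n == 0 else n * fact_by_recursion(n - 1)

def fact_by_machine(n: int) -> int:
    """An explicitly mechanical procedure: update registers in a loop."""
    acc = 1
    while n > 0:          # each pass is one 'machine' step
        acc, n = acc * n, n - 1
    return acc

assert all(fact_by_recursion(k) == fact_by_machine(k) for k in range(10))
```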
We begin with the history of the discovery of computability in the 1930s, the roles of Gödel, Church, and Turing, and the formalisms of recursive functions and Turing automatic machines (a-machines). To whom did Gödel credit the definition of a computable function? We present Turing's notion [1939, §4] of an oracle machine (o-machine) and Post's development of it in [1944, §11], [1948], and finally Kleene-Post [1954] into its present form. A number of topics arose from Turing functionals, including continuous functionals on Cantor space and online computations. Almost all the results in theoretical computability use relative reducibility and o-machines rather than a-machines, and most computing processes in the real world are potentially online or interactive. Therefore, we argue that Turing o-machines, relative computability, and online computing are the most important concepts in the subject, more so than Turing a-machines and standard computable functions, since the latter are special cases of the former and are presented first only for pedagogical clarity to beginning students. At the end, in §10–§13, we consider three displacements in computability theory and the historical reasons they occurred. Several brief conclusions are drawn in §14.
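As a rough illustration of the oracle-machine idea emphasized above, the following Python sketch models an oracle as a membership-query callable that a computation may consult during its run; the example predicate and all names are my own, not from the paper.

```python
from typing import Callable

Oracle = Callable[[int], bool]   # answers membership queries "is n in A?"

def parity_below(n: int, oracle: Oracle) -> bool:
    """Decide, relative to the oracle set A, whether |{k <= n : k in A}|
    is even -- computable outright only if A itself is computable."""
    count = sum(1 for k in range(n + 1) if oracle(k))  # oracle queries
    return count % 2 == 0

# With a computable oracle the whole procedure is an ordinary a-machine
# computation; with a noncomputable A it is only A-computable, i.e.
# Turing reducible to A.
evens: Oracle = lambda k: k % 2 == 0
print(parity_below(10, evens))   # True: members are 0, 2, 4, 6, 8, 10
```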
Let $M$ be a smooth, compact manifold of dimension $n \geq 5$ and sectional curvature $|K| \leq 1$. Let $\mathrm{Met}(M) = \mathrm{Riem}(M)/\mathrm{Diff}(M)$ be the space of Riemannian metrics on $M$ modulo isometries. Nabutovsky and Weinberger studied the connected components of sublevel sets (and local minima) for certain functions on $\mathrm{Met}(M)$ such as the diameter. They showed that for every Turing machine $T_e$, $e \in \omega$, there is a sequence (uniformly effective in $e$) of homology $n$-spheres $\{P^e_k\}_{k \in \omega}$ which are also hypersurfaces, such that $P^e_k$ is diffeomorphic to the standard $n$-sphere $S^n$ (denoted $P^e_k \approx_{\mathrm{diff}} S^n$) iff $T_e$ halts on input $k$, and in this case the connected sum $N^e_k = M \sharp P^e_k \approx_{\mathrm{diff}} M$, so $N^e_k \in \mathrm{Met}(M)$, and $N^e_k$ is associated with a local minimum of the diameter function on $\mathrm{Met}(M)$ whose depth is roughly equal to the settling time $\sigma_e(k)$ of $T_e$ on input $k$. Soare constructed a sequence $\{A_i\}_{i \in \omega}$ of c.e. sets so that for all $i$ the settling time of the associated Turing machine for $A_i$ dominates that for $A_{i+1}$, even when the latter is composed with an arbitrary computable function. From this, Nabutovsky and Weinberger showed that the basins exhibit a "fractal"-like behavior, with extremely big basins, and very much smaller basins coming off them, and so on. This reveals what Nabutovsky and Weinberger describe in their paper on fractals as "the astonishing richness of the space of Riemannian metrics on a smooth manifold, up to reparametrization." From the point of view of logic and computability, the Nabutovsky-Weinberger results are especially interesting because: (1) they use c.e. sets to prove structural complexity of the geometry and topology, not merely undecidability results as in the word problem for groups, Hilbert's Tenth Problem, or most other applications; (2) they use nontrivial information about c.e. sets, the Soare sequence $\{A_i\}_{i \in \omega}$ above, not merely Gödel's c.e. noncomputable set $K$ of the 1930s; and (3) without using computability theory there is no known proof that local minima exist even for simple manifolds like the torus $T^5$ (see §).
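Since the argument above leans on the settling-time function $\sigma_e(k)$, here is a hedged Python sketch of that notion under a simplified machine model of my own devising; the true $\sigma_e$ is not computable, so the sketch imposes an explicit step bound.

```python
from typing import Callable, Optional

State = tuple                                   # a machine configuration
Machine = Callable[[State], Optional[State]]    # one step; None = halted

def settling_time(step: Machine, start: State, bound: int) -> Optional[int]:
    """Number of steps until the machine halts on this input, or None if
    it has not halted within `bound` steps. (sigma_e(k) itself is not
    computable, hence the bound.)"""
    state = start
    for t in range(bound):
        state = step(state)
        if state is None:
            return t + 1
    return None

# Toy machine: count a single register down to zero, then halt.
countdown: Machine = lambda s: None if s[0] == 0 else (s[0] - 1,)
print(settling_time(countdown, (5,), bound=100))  # 6
```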
It is argued that while quantum mechanics contains nonlocal or entangled states, the instantaneous or nonlocal influences sometimes thought to be present due to violations of Bell inequalities in fact arise from mistaken attempts to apply classical concepts and introduce probabilities in a manner inconsistent with the Hilbert space structure of standard quantum mechanics. Instead, Einstein locality is a valid quantum principle: objective properties of individual quantum systems do not change when something is done to another noninteracting system. There is no reason to suspect any conflict between quantum theory and special relativity.
Debates over vaccine mandates raise intense emotions, as reflected in the current controversy over whether to mandate the vaccine against human papillomavirus (HPV), the virus that can cause cervical cancer. Public health ethics so far has failed to facilitate meaningful dialogue between the opposing sides. When stripped of its emotional charge, the debate can be framed as a contest between competing ethical values. This framework can be conceptualized graphically as a conflict between autonomy on the one hand, which militates against government intrusion, and beneficence, utilitarianism, justice, and nonmaleficence on the other, which may lend support to intervention. When applied to the HPV vaccine, this framework would support a mandate based on utilitarianism, if certain conditions are met and if herd immunity is a realistic objective.
This article is an accompaniment to Anthony Freeman’s review of Views into the Chinese Room, reflecting on some pertinent outstanding questions about the Chinese room argument (CRA). Although there is general agreement in the artificial intelligence community that the CRA is somehow wrong, debate continues on exactly why and how it is wrong. Is there a killer counter-argument and, if so, what is it? One remarkable fact is that the CRA is prototypically a thought experiment, yet it has been very little discussed from the perspective of thought experiments in general. Here, I argue that the CRA fails as a thought experiment because it commits the fallacy of undersupposing, i.e., it leaves too many details to be filled in by the audience. Since different commentators will often fill in details differently, leading to different opinions of what constitutes a decisive counter, the result is 21-plus years of inconclusive debate.
Stapp’s counterfactual argument for quantum nonlocality based upon a Hardy entangled state is shown to be flawed. While he has correctly analyzed a particular framework using the method of consistent histories, there are alternative frameworks which do not support his argument. The framework dependence of quantum counterfactual arguments, with analogs in classical counterfactuals, vitiates the claim that nonlocal (superluminal) influences exist in the quantum world. Instead it shows that counterfactual arguments are of limited use for analyzing these questions.
The psychological concept of illusion is defined as a process involving an interaction of logical and empirical considerations. Common usage suggests that an illusion is a discrepancy between one's awareness and some stimulus. Following preliminary definitions of classes of stimuli, five definitions of illusion are considered, based upon the possible discrepancies between awareness and a stimulus. It is found that each of these definitions fails to make important distinctions, even to the point of equating all illusory and perceptual phenomena. This dilemma is resolved by redefining illusion without reference to truth or falsity, but relative to the functioning of a given perceptual system under different conditions. The definition accepted as best is 'a discrepancy between one's perceptions of an object or event observed under different conditions'. Conditions may differ in terms of stimulus exposure, stimulus context, or experiential context. The philosophical and psychological implications of accepting a definition of illusion not based on a discrepancy between awareness and a stimulus are discussed.
It is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space), and treating quantum time development as a stochastic process. The histories approach in turn gives rise to some conceptual difficulties, in particular the correct choice of a framework (probabilistic sample space) or family of histories, and these are discussed. The central issue is that the principle of unicity, the idea that there is a unique single true description of the world, is incompatible with our current understanding of quantum mechanics.
This important new text establishes a framework for discussing, understanding, and ultimately making sound decisions about these ethical challenges.
John Searle’s Chinese room argument (CRA) is a celebrated thought experiment designed to refute the hypothesis, popular among artificial intelligence scientists and philosophers of mind, that “the appropriately programmed computer really is a mind”. Since its publication in 1980, the CRA has evoked an enormous amount of debate about its implications for machine intelligence, the functionalist philosophy of mind, theories of consciousness, etc. Although the general consensus among commentators is that the CRA is flawed, and notwithstanding the popularity of the systems reply in some quarters, there is remarkably little agreement on exactly how and why it is flawed. A newcomer to the controversy could be forgiven for thinking that the bewildering collection of diverse replies to Searle betrays a tendency to unprincipled, ad hoc argumentation and, thereby, a weakness in the opposition’s case. In this paper, treating the CRA as a prototypical example of a ‘destructive’ thought experiment, I attempt to set it in a logical framework, which allows us to systematise and classify the various objections. Since thought experiments are always posed in narrative form, formal logic by itself cannot fully capture the controversy. On the contrary, much also hinges on how one translates between the informal everyday language in which the CRA was initially framed and formal logic and, in particular, on the specific conception of possibility that one reads into the logical formalism.
Any attempt to introduce probabilities into quantum mechanics faces difficulties due to the mathematical structure of Hilbert space, as reflected in Birkhoff and von Neumann's proposal for a quantum logic. The (consistent or decoherent) histories solution is provided by its single framework rule, an approach that includes conventional (Copenhagen) quantum theory as a special case. Mermin's Ithaca interpretation addresses the same problem by defining probabilities which make no reference to a sample space or event algebra (“correlations without correlata”). But this leads to severe conceptual difficulties, which almost inevitably couple quantum theory to unresolved problems of human consciousness. Using histories allows a sharper quantum description than is possible with a density matrix, suggesting that the latter provides an ensemble rather than an irreducible single-system description as claimed by Mermin. The histories approach satisfies the first five of Mermin's desiderata for a good interpretation of quantum mechanics, including Einstein locality, but the Ithaca interpretation seems to have difficulty with the first (independence of observers) and the third (describing individual systems).
It is shown that for every nonzero r.e. degree c there is a linear ordering of degree c which is not isomorphic to any recursive linear ordering. It follows that there is a linear ordering of low degree which is not isomorphic to any recursive linear ordering. It is shown further that there is a linear ordering L such that L is not isomorphic to any recursive linear ordering, and L together with its ‘infinitely far apart’ relation is of low degree. Finally, an analogue of the recursion theorem for recursive linear orderings is refuted.
This article is a response to various assertions made by B. d'Espagnat about the consistent histories approach to quantum mechanics. It is argued that the consistent histories interpretation allows for counterfactual definitions, does not imply that the future influences the past, is “realistic” according to d'Espagnat's own definition of that term, and provides a consistent substitute for classical logic in the quantum domain.
We announce and explain recent results on the computably enumerable (c.e.) sets, especially their definability properties (as sets in the spirit of Cantor), their automorphisms (in the spirit of Felix Klein's Erlanger Programm), their dynamic properties, expressed in terms of how quickly elements enter them relative to elements entering other sets, and the Martin Invariance Conjecture on their Turing degrees, i.e., their information content with respect to relative computability (Turing reducibility).
In the last five years there have been a number of results about the computable content of the prime, saturated, or homogeneous models of a complete decidable theory T in the spirit of Vaught's "Denumerable models of complete theories" combined with computability methods for degrees d ≤ 0′. First we recast older results by Goncharov, Peretyat'kin, and Millar in a more modern framework which we then apply. Then we survey recent results by Lange, "The degree spectra of homogeneous models," which generalize the older results and which include positive results on when a certain homogeneous model of T has an isomorphic copy of a given Turing degree. We then survey Lange's "A characterization of the 0-basis homogeneous bounding degrees" for negative results about when such a homogeneous model does not have such copies, generalizing negative results by Goncharov, Peretyat'kin, and Millar. Finally, we explain recent results by Csima, Harizanov, Hirschfeldt, and Soare, "Bounding homogeneous models," about degrees d that are homogeneous bounding, and explain their relation to the PA degrees.
It is shown that quantum mechanics is noncontextual if quantum properties are represented by subspaces of the quantum Hilbert space rather than by hidden variables. In particular, a measurement using an appropriately constructed apparatus can be shown to reveal the value of an observable A possessed by the measured system before the measurement took place, whatever other compatible observable B may be measured at the same time.
A set X of nonnegative integers is computably enumerable (c.e.), also called recursively enumerable (r.e.), if there is a computable method to list its elements. Let $\varepsilon$ denote the structure of the computably enumerable sets under inclusion, $\varepsilon = (\{W_e\}_{e\in \omega}, \subseteq)$. We previously exhibited a first order $\varepsilon$-definable property Q(X) such that Q(X) guarantees that X is not Turing complete (i.e., does not code complete information about c.e. sets). Here we show first that Q(X) implies that X has a certain "slowness" property whereby the elements must enter X slowly (under a certain precise complexity measure of speed of computation) even though X may have high information content. Second we prove that every X with this slowness property is computable in some member of any nontrivial orbit, namely for any noncomputable $A \in \varepsilon$ there exists B in the orbit of A such that $X \leq_T B$ under relative Turing computability ($\leq_T$). We produce B using the $\Delta^0_3$-automorphism method we introduced earlier.
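The opening definition above can be made concrete with a small Python sketch of dovetailed enumeration (the helper names and the toy φ are mine): elements may enter the listing at arbitrarily late stages, which is exactly the "speed of entry" that the slowness property constrains.

```python
from itertools import count
from typing import Callable, Iterator, Optional

def enumerate_ce(phi: Callable[[int, int], Optional[int]]) -> Iterator[int]:
    """Yield {n : phi(n) converges}, where phi(n, s) runs a partial
    computation on n for s steps, returning its value if it has halted
    by stage s and None otherwise (dovetailing over inputs and stages)."""
    seen = set()
    for s in count(1):                  # stage s
        for n in range(s):              # dovetail over inputs n < s
            if n not in seen and phi(n, s) is not None:
                seen.add(n)
                yield n                 # n enters the set at stage s

# Toy partial function: input n 'halts' only after n steps, so larger
# elements enter the enumeration later.
toy = lambda n, s: n if s > n else None
gen = enumerate_ce(toy)
print([next(gen) for _ in range(5)])    # [0, 1, 2, 3, 4]
```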
The traditional view of symbol grounding seeks to connect an a priori internal representation or ‘form’ to its external referent. But such a ‘form’ is usually itself systematically composed out of more primitive parts, so this view ignores its grounding in the physics of the world. Some previous work simulating multiple talking/listening agents has effectively taken this stance, and shown how a shared discrete speech code can emerge. Building on the earlier work of Oudeyer, we have extended his model to include a dispersive force intended to account broadly for a speaker’s motivation to increase auditory distinctiveness. New simulations show that vowel systems result that are more representative of the range seen in human languages. These simulations make many profound abstractions and assumptions. Relaxing these by including more physically and physiologically realistic mechanisms for talking and listening is seen as the key to replicating more complex and dynamic aspects of speech, such as consonant-vowel patterning.
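As a rough sketch of what such a dispersive force can look like, the Python fragment below, under my own simplifying assumptions (a 2-D normalised formant space and pairwise inverse-square repulsion), pushes vowel prototypes apart to increase their mutual distinctiveness; it illustrates the general mechanism only and is not the authors' or Oudeyer's actual model.

```python
import random

def disperse(vowels, rate=0.01, steps=500):
    """Nudge each prototype away from every other (inverse-square
    repulsion), clamped to the unit square of normalised formants."""
    for _ in range(steps):
        for i, (x, y) in enumerate(vowels):
            fx = fy = 0.0
            for j, (u, v) in enumerate(vowels):
                if i != j:
                    dx, dy = x - u, y - v
                    d2 = dx * dx + dy * dy + 1e-6   # avoid divide-by-zero
                    fx += dx / d2
                    fy += dy / d2
            vowels[i] = (min(1.0, max(0.0, x + rate * fx)),
                         min(1.0, max(0.0, y + rate * fy)))
    return vowels

random.seed(0)
start = [(random.random(), random.random()) for _ in range(5)]
print(disperse(start))   # prototypes spread out toward the periphery
```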
Post in 1944 began studying properties of a computably enumerable set A such as simple, h-simple, and hh-simple, with the intent of finding a property guaranteeing incompleteness of A. From the observations of Post and Myhill, attention focused by the 1950s on properties definable in the inclusion ordering of c.e. subsets of ω, namely $\mathcal{E} = (\{W_e\}_{e \in \omega}, \subseteq)$. In the 1950s and 1960s Tennenbaum, Martin, Yates, Sacks, Lachlan, Shoenfield and others produced a number of elegant results relating $\mathcal{E}$-definable properties of A, like maximal, hh-simple, and atomless, to the information content of A. Harrington and Soare gave an answer to Post's program for definable properties by producing an $\mathcal{E}$-definable property Q which guarantees that A is incomplete and noncomputable, but developed a new $\Delta^0_3$-automorphism method to prove certain other properties are not $\mathcal{E}$-definable. In this paper we introduce new $\mathcal{E}$-definable properties relating the $\mathcal{E}$-structure of A to $\deg(A)$, which answer some open questions. In contrast to Q, we exhibit here an $\mathcal{E}$-definable property T which allows such a rapid flow of elements into A that A must be complete, even though A may possess many other properties such as being promptly simple. We also present a related property NL which has a slower flow but fast enough to guarantee that A is not low, even though A may possess virtually all other related lowness properties and A may simultaneously be promptly simple.
We show, roughly speaking, that it requires ω iterations of the Turing jump to decode nontrivial information from Boolean algebras in an isomorphism invariant fashion. More precisely, if α is a recursive ordinal, $\mathcal{A}$ is a countable structure with finite signature, and $\mathbf{d}$ is a degree, we say that $\mathcal{A}$ has αth-jump degree $\mathbf{d}$ if $\mathbf{d}$ is the least degree which is the αth jump of some degree $\mathbf{c}$ such that there is an isomorphic copy of $\mathcal{A}$ with universe ω in which the functions and relations have degree at most $\mathbf{c}$. We show that every degree $\mathbf{d} \geq \mathbf{0}^{(\omega)}$ is the ωth-jump degree of a Boolean algebra, but that for $n < \omega$ no Boolean algebra has $n$th-jump degree $\mathbf{d} > \mathbf{0}^{(n)}$. The former result follows easily from work of L. Feiner. The proof of the latter result uses the forcing methods of J. Knight together with an analysis of various equivalences between Boolean algebras based on a study of their Stone spaces. A byproduct of the proof is a method for constructing Stone spaces with various prescribed properties.
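For reference, here is a compact LaTeX restatement of the jump-degree definition used above; this is my paraphrase of the standard definitions (successor stages only, limit stages omitted), not the paper's exact wording.

```latex
% Iterated jump (successor case; limits are handled by effective joins):
\[
  \mathbf{0}^{(0)} = \mathbf{0}, \qquad
  \mathbf{0}^{(\alpha+1)} = \bigl(\mathbf{0}^{(\alpha)}\bigr)'.
\]
% alpha-th jump degree of a countable structure A with universe omega:
\[
  \mathbf{d} \;=\; \min\bigl\{\, \mathbf{c}^{(\alpha)} \;:\;
      \mathcal{B} \cong \mathcal{A},\ \deg(\mathcal{B}) \le \mathbf{c} \,\bigr\},
\]
% i.e., d is the least alpha-th jump over all isomorphic copies B of A;
% the theorem above: every d >= 0^(omega) arises at alpha = omega for
% some Boolean algebra, while at finite n only 0^(n) occurs.
```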