I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a “Past Hypothesis” about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe’s entropy.
The Past Hypothesis is the claim that the Boltzmann entropy of the universe was extremely low when the universe began. Can we make sense of this claim when *classical* gravitation is included in the system? I first show that the standard rationale for not worrying about gravity is too quick. If the paper does nothing else, my hope is that it gets the problems induced by gravity the attention they deserve in the foundations of physics. I then try to make plausible a very weak claim: that there is a well-defined Boltzmann entropy that *can* increase in *some* interesting self-gravitating systems. More work is needed before we can say whether this claim answers the threat to the standard explanation of entropy increase.
The past hypothesis is that the entropy of the universe was very low in the distant past. It is put forward to explain the entropic arrow of time, but it has been suggested (e.g. [Penrose, R. (1989a). The emperor’s new mind. London: Vintage Books; Penrose, R. (1989b). Annals of the New York Academy of Sciences, 571, 249–264; Price, H. (1995). In S. F. Savitt (Ed.), Time’s arrows today. Cambridge: Cambridge University Press; Price, H. (1996). Time’s arrow and Archimedes’ point. Oxford: Oxford University Press; Price, H. (2004). In C. Hitchcock (Ed.), Contemporary debates in philosophy of science. Oxford: Blackwell]) that it is itself in need of explanation. It has also been suggested that cosmic inflation could provide the explanation, but Price (2004) raises a serious objection to this suggestion, which has otherwise received very little attention in the philosophical literature. Price points out that the standard inflationary explanation involves a double standard: although the evolution of the universe described by the inflationary model seems natural from the standard temporal perspective, it looks highly unnatural from the reversed temporal perspective. The main purpose of this paper is to propose a novel form of the inflationary explanation that avoids this objection. It is argued that the inflationary model would not involve a double standard (but would still explain the past hypothesis) if we construct the model with a global “boundary” condition instead of a conventional boundary condition: if we assume that the universe is as generic as possible overall, rather than as generic as possible at some given point (e.g. the Big Bang), as is assumed in the standard inflationary model. This novel form of the inflationary explanation is then compared with Price’s (1996) preferred explanation, a version of the so-called “Weyl hypothesis”.
In his recent book, Time and Chance, David Albert claims that by positing that there is a uniform probability distribution defined, on the standard measure, over the space of microscopic states that are compatible with both the current macrocondition of the world, and with what he calls the “past hypothesis”, we can explain the time asymmetry of all of the thermodynamic behavior in the world. The principal purpose of this paper is to dispute this claim. I argue that Albert's proposal fails in his stated goal—to show how to use the time‐reversible dynamics of Newtonian physics to “underwrite the actual content of our thermodynamic experience” (Albert 2000, 159). Albert's proposal can satisfactorily explain why the overall entropy of the universe as a whole is increasing, but it does not and cannot explain the increasing entropy of relatively small, relatively short‐lived systems in energetic isolation without making use of a principle that leads to reversibility objections.
Why is our knowledge of the past so much more ‘expansive’ (to pick a suitably vague term) than our knowledge of the future, and what is the best way to capture the difference(s) (i.e., in what sense is knowledge of the past more ‘expansive’)? One could reasonably approach these questions by giving necessary conditions for different kinds of knowledge, and showing how some were satisfied by certain propositions about the past, and not by corresponding propositions about the future. I take it that such is the approach of Chapter 6 of Time and Chance (T&C). Here’s another such proposal, similar to, but significantly different from, that of T&C; my purpose in this section is to highlight the differences, by showing how this account fails.
In recent work on the foundations of statistical mechanics and the arrow of time, Barry Loewer and David Albert have developed a view that defends both a best system account of laws and a physicalist fundamentalism. I argue that there is a tension between their account of laws, which emphasizes the pragmatic element in assessing the relative strength of different deductive systems, and their reductivism or fundamentalism. If we take the pragmatic dimension in their account seriously, then the laws of the special sciences should be part of our best explanatory system of the world, as well.
I defend what may loosely be called an eliminativist account of causation by showing how several of the main features of causation, namely asymmetry, transitivity, and necessitation (or sometimes probability-raising), arise from the combination of fundamental dynamical laws and a special constraint on the macroscopic structure of matter in the past. At the microscopic level, the causal features of necessitation and transitivity are grounded, but not the asymmetry. At the coarse-grained level of the macroscopic physics, the causal asymmetry is grounded, but not the necessitation or transitivity. Thus, at no single level of description does the physics justify the conditions that are taken to be constitutive of causation. Nevertheless, if we mix our reasoning about the microscopic and macroscopic descriptions, the structure provided by the dynamics and special initial conditions can justify the folk concept of causation to a significant extent. I explain why our causal concept works so well even though at bottom it is comprised of a patchwork of principles that don't mesh well.
Non-presentist A-theories of time (such as the growing block theory and the moving spotlight theory) seem unacceptable because they invite skepticism about whether one exists in the present. To avoid this absurd implication, Peter Forrest appeals to the "Past is Dead hypothesis," according to which only beings in the objective present are conscious. We know we're present because we know we're conscious, and only present beings can be conscious. I argue that the dead past hypothesis undercuts the main reason for preferring non-presentist A-theories to their presentist rivals, rivals which straightforwardly avoid skepticism about the present.
In the past, hypothesis testing in medicine has employed the paradigm of the repeatable experiment. In statistical hypothesis testing, an unbiased sample is drawn from a larger source population, and a calculated statistic is compared to a preassigned critical region, on the assumption that the comparison could be repeated an indefinite number of times. However, repeated experiments often cannot be performed on human beings, due to ethical or economic constraints. We describe a new paradigm for hypothesis testing which uses only rearrangements of data present within the observed data set. The token swap test, based on this new paradigm, is applied to three data sets from cardiovascular pathology, and computational experiments suggest that the token swap test satisfies the Neyman–Pearson condition.
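The rearrangement idea described in this abstract can be illustrated with a generic two-sample permutation test. This is only a sketch of rearrangement-based testing in general, not the authors' token swap test itself; the function name and the data are invented for illustration:

```python
import random

def permutation_test(group_a, group_b, num_resamples=10000, seed=0):
    """Estimate a two-sided p-value for a difference in means using
    only rearrangements of the observed data, with no appeal to a
    larger source population or repeated experiments."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a = len(group_a)
    extreme = 0
    for _ in range(num_resamples):
        rng.shuffle(pooled)  # one rearrangement of the observed data
        resampled = abs(sum(pooled[:n_a]) / n_a
                        - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if resampled >= observed:
            extreme += 1
    return extreme / num_resamples

# Invented data, chosen so the groups are clearly separated:
p = permutation_test([1.2, 0.9, 1.4, 1.1], [2.0, 2.3, 1.9, 2.1])
```

Because every resample merely relabels the observed values, the reference distribution is built from the data set itself, which is the feature the abstract highlights for settings where the experiment cannot be repeated.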
Of the many tasks undertaken in science, one is striking both in its scope and the epistemic difficulties it faces: the reconstruction of the deep past. Such reconstruction provides the resources to successfully explain puzzling extant traces, from fossils to radiation signatures, often in the absence of extensive and repeatable observations—the hallmark of good epistemic support. Yet good explanations do not come for free. Evidence can fail, in practice or in principle, to support one hypothesis over another (underdetermination). And when hypotheses do confront conflicting evidence, identifying the piece of theory to abandon can be notoriously difficult (testing holism). Good science, in any discipline, must overcome these challenges.
This paper first advances and discusses the hypothesis that so-called “iconic” or (for the auditory sphere) “echoic” memory is actually a form of perception of the past. Such perception is made possible by parallel inputs with differential delays which feed independently into the sensorium. This hypothesis goes well together with a set of related psychological and phenomenological facts, as for example: Sperling’s results about the visual sensory buffer, the facts that we seem to see movement and hear temporal Gestalts, and the fact that we sometimes seem to hear sounds only after they have stopped. In its simplest form, and formulated in the somewhat misleading information processing idiom, my hypothesis says that each one of a number of parallel input lines with different delays feeds into a spatially separate sensory unit. The set of such units then holds information about the immediate past in what one might call a “chronotopic” sensory map. This contrasts with the idea (common in sensory buffer theory) that the received sensory information is kept (while possibly decaying) in the same unit for some time after it occurred. The hypothesis also contradicts the theory that all sensory information passes through the same unit but is then successively passed through a unidirectional chain of separate units, where the past experiences then become represented (the shift register hypothesis). The main advantage of my theory, besides the natural explanations it offers for the above-mentioned kind of phenomena, is that it postulates a parallel – and therefore robust – rather than a serial mechanism for the registering of temporal information. It can of course easily be modified to fit more complex models of the sensory cerebral code(s) as well as of the chronotopic representation as such. In the second part of my poster, I advance a corresponding hypothesis for those motor commands which control brief movements.
On closer inspection, most so-called “ballistic” movements do not seem to be truly ballistic (in the sense in which the movement of a cannonball is so) since the brain must exert some kind of feedforward control over the later part of their trajectory. I suggest that this control is at least sometimes realized by means of differentially delayed output from a chronotopic representation of successive segments of the movement. Not only could this be a biologically natural way of ensuring efficient adaptability of the movement; the hypothesis also explains the not uncommon experience of “seeing the whole movement laid out in advance” when it is initiated.
What new implications does the dynamical hypothesis have for cognitive science? The short answer is: None. The _Behavior and Brain Sciences_ target article, “The dynamical hypothesis in cognitive science” by Tim Van Gelder, is basically an attack on traditional symbolic AI and differs very little from prior connectionist criticisms of it. For the past ten years, the connectionist community has been well aware of the necessity of using (and understanding) dynamically evolving, recurrent network models of cognition.
The Extended Mind Hypothesis (EMH) needs a defence of phenomenal externalism in order to be consistent with an indispensable condition for attributing extended beliefs, concerning the conscious past endorsement of information. However, it is difficult, if not impossible, to envisage such a defence. Proponents of the EMH are thus confronted with a difficult dilemma: they either accept absurd attributions of belief, and thus deflate EMH, or incorporate, for compatibility reasons, the conscious past endorsement condition for extended belief attribution, implying a seemingly unavailable defence of phenomenal externalism, and thus risk inconsistency within EMH. Either way, EMH is threatened.
According to the knowledge argument, physicalism fails because when physically omniscient Mary first sees red, her gain in phenomenal knowledge involves a gain in factual knowledge. Thus not all facts are physical facts. According to the ability hypothesis, the knowledge argument fails because Mary only acquires abilities to imagine, remember and recognise redness, and not new factual knowledge. I argue that reducing Mary’s new knowledge to abilities does not affect the issue of whether she also learns factually: I show that gaining specific new phenomenal knowledge is required for acquiring abilities of the relevant kind. Phenomenal knowledge being basic to abilities, and not vice versa, it is left an open question whether someone who acquires such abilities also learns something factual. The answer depends on whether the new phenomenal knowledge involved is factual. But this is the same question we wanted to settle when first considering the knowledge argument. The ability hypothesis, therefore, has offered us no dialectical progress with the knowledge argument, and is best forgotten.
Borrowing conceptual tools from Bergson, this essay asks after the shift in the temporality of life from Merleau-Ponty’s Phénoménologie de la perception to his later works. Although the Phénoménologie conceives life in terms of the field of presence of bodily action, later texts point to a life of invisible and immemorial dimensionality. By reconsidering Bergson, but also thereby revising his reading of Husserl, Merleau-Ponty develops a non-serial theory of time in the later works, one that acknowledges the verticality and irreducibility of the past. Life in the flesh relies on unconsciousness or forgetting, on an invisibility that structures its passage.
David Lewis (1983, 1988) and Laurence Nemirow (1980, 1990) claim that knowing what an experience is like is knowing-how, not knowing-that. They identify this know-how with the abilities to remember, imagine, and recognize experiences, and Lewis labels their view ‘the Ability Hypothesis’. The Ability Hypothesis has intrinsic interest. But Lewis and Nemirow devised it specifically to block certain anti-physicalist arguments due to Thomas Nagel (1974, 1986) and Frank Jackson (1982, 1986). Does it?
According to the Ability Hypothesis, knowing what it is like to have experience E is just having the ability to imagine or recognize or remember having experience E. I examine various versions of the Ability Hypothesis and point out that they all face serious objections. Then I propose a new version that is not vulnerable to these objections: knowing what it is like to experience E is having the ability to discriminate imagining or having experience E from imagining or having any other experience. I argue that if we replace the ability to imagine or recognize with the ability to discriminate, the Ability Hypothesis can be salvaged.
The Perceptual Hypothesis is that we sometimes see, and thereby have non-inferential knowledge of, others' mental features. The Perceptual Hypothesis opposes Inferentialism, which is the view that our knowledge of others' mental features is always inferential. The claim that some mental features are embodied is the claim that some mental features are realised by states or processes that extend beyond the brain. The view I discuss here is that the Perceptual Hypothesis is plausible if, but only if, the mental features it claims we see are suitably embodied. Call this Embodied Perception Theory. I argue that Embodied Perception Theory is false. It doesn't follow that the Perceptual Hypothesis is implausible. The considerations which serve to undermine Embodied Perception Theory serve equally to undermine the motivations for assuming that others' mental lives are always imperceptible.
Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis (IBH) in order to help map the spectrum of possible relations between social interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organization of interaction processes that characterize the dynamics of social engagement. The patterns and synergies of this self-organization help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea links interactive factors to more classical observational scenarios.
This paper introduces a new family of cases where agents are jointly morally responsible for outcomes over which they have no individual control, a family that resists standard ways of understanding outcome responsibility. First, the agents in these cases do not individually facilitate the outcomes and would not seem individually responsible for them if the other agents were replaced by non-agential causes. This undermines attempts to understand joint responsibility as overlapping individual responsibility; the responsibility in question is essentially joint. Second, the agents involved in these cases are not aware of each other's existence and do not form a social group. This undermines attempts to understand joint responsibility in terms of actual or possible joint action or joint intentions, or in terms of other social ties. Instead, it is argued that intuitions about joint responsibility are best understood given the Explanation Hypothesis, according to which a group of agents are seen as jointly responsible for outcomes that are suitably explained by their motivational structures: something bad happened because they didn’t care enough; something good happened because their dedication was extraordinary. One important consequence of the proposed account is that responsibility for outcomes of collective action is a deeply normative matter.
The ‘Knobe effect’ is the name given to the empirical finding that judgments about whether an action is intentional or not seem to depend on the moral valence of this action. To account for this phenomenon, Scaife and Webber have recently advanced the ‘Consideration Hypothesis’, according to which people’s ascriptions of intentionality are driven by whether they think the agent took the outcome into consideration when making his decision. In this paper, I examine Scaife and Webber’s hypothesis and conclude that it is supported neither by the existing literature nor by their own experiments, whose results I did not replicate, and that the ‘Consideration Hypothesis’ is not the best available account of the ‘Knobe Effect’.
Many philosophers and psychologists now argue that emotions play a vital role in reasoning. This paper explores one particular way of elucidating how emotions help reason which may be dubbed ‘the search hypothesis of emotion’. After outlining the search hypothesis of emotion and dispensing with a red herring that has marred previous statements of the hypothesis, I discuss two alternative readings of the search hypothesis. It is argued that the search hypothesis must be construed as an account of what emotions typically do, rather than as a definition of emotion. Even as an account of what emotions typically do, the search hypothesis can only be evaluated in the context of a specific theory of what emotions are. 1 Introduction 2 The search hypothesis of emotion 3 A red herring: the frame problem 4 The search problem 5 Two readings of the search hypothesis 6 Two final remarks 7 Conclusion.
Research on patients with damage to ventromedial frontal cortices suggests a key role for emotions in practical decision making. This field of investigation is often associated with Antonio Damasio’s Somatic Marker Hypothesis, a putative account of the mechanism by which autonomic tags guide decision making in typical individuals. Here we discuss two ‘myths’ surrounding the direction and interpretation of this research. First, it is often assumed that there is a single somatic marker hypothesis. As others have noted, however, Damasio’s ‘hypothesis’ admits of multiple interpretations (Colombetti, ; Dunn et al. ). Our analysis builds upon this point by characterizing decision making as a multi-stage process and identifying the various potential roles for somatic markers. The second myth is that the available evidence suggests a role for somatic markers in the core stages of decision making, i.e. during the generation, deliberation or evaluation of candidate options. To the contrary, we suggest that somatic markers most likely have a peripheral role, in the recognition of decision points, or in the motivation of action. This conclusion is based on an examination of the past 25 years of research conducted by Damasio and colleagues, focusing in particular on some early experiments that have been largely neglected by the critical literature.
This paper is about The Truthmaker Problem for Presentism. I spell out a solution to the problem that involves appealing to indeterministic laws of nature and branching semantics for past- and future-tensed sentences. Then I discuss a potential glitch for this solution, and propose a way to get around that glitch. Finally, I consider some likely objections to the view offered here, as well as replies to those objections.
This paper defends the claim that, in order to have a concept of time, subjects must have memories of particular events they once witnessed. Some patients with severe amnesia arguably still have a concept of time. Two possible explanations of their grasp of this concept are discussed. They take as their respective starting points abilities preserved in the patients in question: (1) the ability to retain factual information over time despite being unable to recall the past event or situation that information stems from, and (2) the ability to remember at least some past events or situations themselves (typically because retrograde amnesia is not complete). It is argued that a satisfactory explanation of what it is for subjects to have a concept of time must make reference to their having episodic memories such as those mentioned under (2). It is also shown how the question as to whether subjects have such memories, and thus whether they possess a concept of time, enters into our explanation of their actions.
Knowing one’s past thoughts and attitudes is a vital sort of self-knowledge. In the absence of memorial impressions to serve as evidence, we face a pressing question of how such self-knowledge is possible. Recently, philosophers of mind have argued that self-knowledge of past attitudes supervenes on rationality. I examine two kinds of argument for this supervenience claim, one from cognitive dynamics, and one from practical rationality, and reject both. I present an alternative account, on which knowledge of past attitudes is inferential knowledge, and depends upon contingent facts of one’s rationality and consistency. Failures of self-knowledge are better explained by the inferential account.
I argue that David Lewis’s attempt, in his ‘Counterfactual Dependence and Time’s Arrow’, to explain the fixity of the past in terms of counterfactual independence is unsuccessful. I point out that there is an ambiguity in the claim that the past is counterfactually independent of the present (or, more generally, that the earlier is counterfactually independent of the later), corresponding to two distinct theses about the relation between time and counterfactuals, both officially endorsed by Lewis. I argue that Lewis’s attempt is flawed for a variety of reasons, including the fact that his own theory about the evaluation of counterfactuals requires too many exceptions to the general rule that the past is counterfactually independent of the present. At the end of the paper, I consider a variant of Lewis’s strategy that attempts to explain the fixity of the past in terms of causal, rather than counterfactual, independence. I conclude that, although this variant avoids some of the objections that afflict Lewis’s account, it nevertheless seems to be incapable of giving a satisfactory explanation of the notion of the fixity of the past.
Against Russell’s skeptical conjecture, that the world and its entire population came into existence five minutes ago, it is argued that any one of the following is logically incompatible with the conjunction of the other two: ostensible memories of certain events, records of such events, and the non-occurrence of these same events. This conclusion is reached through a critical examination of (1) the arguments advanced by Norman Malcolm in trying to show that Russell’s “hypothesis” does not express a logical possibility, and (2) the counterarguments by which James W. Cornman tries to show that it does.
We identify a particular type of causal reasoning ability that we believe is required for the possession of episodic memories, as it is needed to give substance to the distinction between the past and the present. We also argue that the same causal reasoning ability is required for grasping the point that another person's appeal to particular past events can have in conversation. We connect this to claims in developmental psychology that participation in joint reminiscing plays a key role in memory development.
This paper challenges arguments that systematic patterns of intelligent behavior license the claim that representations must play a role in the cognitive system analogous to that played by syntactical structures in a computer program. In place of traditional computational models, I argue that research inspired by Dynamical Systems theory can support an alternative view of representations. My suggestion is that we treat linguistic and representational structures as providing complex multi-dimensional targets for the development of individual brains. This approach acknowledges the indispensability of the intentional or representational idiom in psychological explanation without locating representations in the brains of intelligent agents.
In this paper I examine Albert’s (2000) claim that the low entropy state of the early universe is sufficient to explain irreversible thermodynamic phenomena. In particular, I argue that conditionalising on the initial state of the universe does not have the explanatory power it is presumed to have. I present several arguments to the effect that Albert’s ‘past hypothesis’ alone cannot justify the belief in past non-equilibrium conditions or ground the veracity of records of the past.
Thanks to all the people who responded to my enquiry about the status of the Continuum Hypothesis. This is a really fascinating subject, which I could waste far too much time on. The following is a summary of some aspects of the feeling I got for the problems. This will be old hat to set theorists, and no doubt there are a couple of embarrassing misunderstandings, but it might be of some interest to non-professionals.
Through the philosophies of Bergson and Deleuze, my paper explores a different theory of time. I reconstitute Deleuze’s paradoxes of the past in Difference and Repetition and Bergsonism to reveal a theory of time in which the relation between past and present is one of coexistence rather than succession. The theory of memory implied here is a non-representational one. To elaborate this theory, I ask: what is the role of the “virtual image” in Bergson’s Matter and Memory? Far from representing the simple afterimage of a present perception, the “virtual image” carries multiple senses. Contracting the immediate past for the present, or expanding virtually to hold the whole of memory (and even the whole of the universe), the virtual image can form a bridge between the present and the non-representational past. This non-representational account of memory sheds light not only on the structure of time for Bergson, but also on his concepts of pure memory and virtuality. The rereading of memory also opens the way for Bergsonian intuition to play an intersubjective role; intuition becomes a means for navigating the resonances and dissonances that can be felt between different rhythms of becoming or planes of memory, which constitute different subjects.
Several theories claim that dreaming is a random by-product of REM sleep physiology and that it does not serve any natural function. Phenomenal dream content, however, is not as disorganized as such views imply. The form and content of dreams is not random but organized and selective: during dreaming, the brain constructs a complex model of the world in which certain types of elements, when compared to waking life, are underrepresented whereas others are overrepresented. Furthermore, dream content is consistently and powerfully modulated by certain types of waking experiences. On the basis of this evidence, I put forward the hypothesis that the biological function of dreaming is to simulate threatening events, and to rehearse threat perception and threat avoidance. To evaluate this hypothesis, we need to consider the original evolutionary context of dreaming and the possible traces it has left in the dream content of the present human population. In the ancestral environment human life was short and full of threats. Any behavioral advantage in dealing with highly dangerous events would have increased the probability of reproductive success. A dream-production mechanism that tends to select threatening waking events and simulate them over and over again in various combinations would have been valuable for the development and maintenance of threat-avoidance skills. Empirical evidence from normative dream content, children's dreams, recurrent dreams, nightmares, post-traumatic dreams, and the dreams of hunter-gatherers indicates that our dream-production mechanisms are in fact specialized in the simulation of threatening events, and thus provides support to the threat simulation hypothesis of the function of dreaming. Key Words: dream content; dream function; evolution of consciousness; evolutionary psychology; fear; implicit learning; nightmares; rehearsal; REM; sleep; threat perception.
Merleau-Ponty's reference to "a past which has never been present" at the end of "Le sentir" challenges the typical framework of the Phenomenology of Perception, with its primacy of perception and bodily field of presence. In light of this "original past," I propose a re-reading of the prepersonal as ground of perception that precedes the dichotomies of subject-object and activity-passivity. Merleau-Ponty searches in the Phenomenology for language to describe this ground, borrowing from multiple registers (notably Bergson, but also Husserl). This "sensory life" is a coexistence of sensing and sensible—bodily and worldly—rhythms. Perception is, then, not a natural given, but a temporal process of synchronization between rhythms. By drawing on Bergson, this can be described as a process in which virtual life is actualized into perceiving subject and object perceived. Significantly, this process involves non-coincidence or delay whereby sensory life is always already past for perception.
The centerpiece of the first volume of Michel Foucault’s History of Sexuality is the analysis of what Foucault terms the “repressive hypothesis,” the nearly universal assumption on the part of twentieth-century Westerners that we are the heirs to a Victorian legacy of sexual repression. The supreme irony of this belief, according to Foucault, is that the whole time that we have been announcing and denouncing our repressed, Victorian sexuality, discourses about sexuality have actually proliferated. Paradoxically, as Victorian as we allegedly are, we cannot stop talking about sex. Much of the analysis of the first volume of the History of Sexuality consists in an unmasking and debunking of the repressive hypothesis. This unmasking does not take the simple form of a counter-claim that we are not, in fact, repressed; rather, Foucault contends that understanding sexuality solely or even primarily in terms of repression is inaccurate and misleading. As he said in an interview published in 1983, “it is not a question of denying the existence of repression. It’s one of showing that repression is always a part of a much more complex political strategy regarding sexuality. Things are not merely repressed.”1 Foucault makes this extremely clear in the introduction to the History of Sexuality, Volume 1, when he writes.
The dynamical hypothesis is the claim that cognitive agents are dynamical systems. It stands opposed to the dominant computational hypothesis, the claim that cognitive agents are digital computers. This target article articulates the dynamical hypothesis and defends it as an open empirical alternative to the computational hypothesis. Carrying out these objectives requires extensive clarification of the conceptual terrain, with particular focus on the relation of dynamical systems to computers.
1. Common Sense Conception of Beliefs and Other Propositional Attitudes
2. What is the Language of Thought Hypothesis?
3. Status of LOTH
4. Scope of LOTH
5. Natural Language as Mentalese?
6. Nativism and LOTH
7. Naturalism and LOTH
It is often said that, according to common sense, there is a fundamental asymmetry between the past and future; namely, that the past is closed and the future is open. Eternalism in the ontology of time is often seen as conflicting with common sense on this point. Here I argue against the claim that common sense is committed to this fundamental asymmetry between the past and the future, on the grounds that facts about the past often depend on facts about the future.
This paper is about the open future response to fatalistic arguments. I first present a typical fatalistic argument and then spell out the open future response as a response to that argument. Then I raise the question of how the open future response can be independently justified. I consider some possible ways in which the response might be defended, and I try to show that none of these is a plausible, non-question-begging defense. Next I formulate what I take to be the only plausible, non-question-begging defense of the open future response. This defense involves both (i) the claim that the laws of nature are indeterministic and (ii) a certain version of the correspondence theory of truth. Finally, I argue that there is a very surprising consequence of justifying the open future response by making the defense in question, namely, that the past is sometimes open. Fatalism is the view that whatever will happen in the future is inevitable, due to certain considerations about truth and time. Fatalism, in turn, is normally taken to imply that there is no such thing as genuine, human free will. Suppose that I am an anti-fatalist. Suppose I believe that Joe Montana is free to choose what he will have for lunch tomorrow, and suppose I take this case to be a paradigmatic example of one involving both evitability and human free will. Now suppose that I meet a fatalist, who presents the following argument.
A new position in the philosophy of mind has recently appeared: the extended mind hypothesis (EMH). Some of its proponents think the EMH, which says that a subject's mental states can extend into the local environment, shows that internalism is false. I argue that this is wrong. The EMH does not refute internalism; in fact, it necessarily does not do so. The popular assumption that the EMH spells trouble for internalists is premised on a bad characterization of the internalist thesis—albeit one that most internalists have adhered to. I show that internalism is entirely compatible with the EMH. This view should prompt us to reconsider the characterization of internalism, and in conclusion I make some brief remarks about how that project might proceed.
The Linguistic Turn is the title of an influential anthology edited by Richard Rorty, published in 1967. In his introduction, Rorty explained: The purpose of the present volume is to provide materials for reflection on the most recent philosophical revolution, that of linguistic philosophy. I shall mean by “linguistic philosophy” the view that philosophical problems are problems which may be solved (or dissolved) either by reforming language, or by understanding more about the language we presently use. (1967: 3) ‘The linguistic turn’ has subsequently become a standard vague phrase for a diffuse event — some regard it as the event — in twentieth century philosophy, one not confined to signed-up linguistic philosophers in Rorty’s sense. For those who took the turn, language was somehow the central theme of philosophy. There is an increasingly widespread sense that the linguistic turn is past. In this essay I ask how far the turn has been, or should be, reversed.
Locke denied that ideas of secondary qualities resemble their causes. It has been suggested that Locke denied this because he accepted a mechanical corpuscular hypothesis about the constitution of objects. This paper shows that this and other usual explanations of Locke's denial are mistaken. Further, it suggests an alternative relationship between the scientific account and Locke's philosophical views, and finally it provides Locke's real justification for his claim that ideas of secondary qualities do not resemble their causes.
The Language of Thought Hypothesis (LOTH) is an empirical thesis about thought and thinking. For their explication, it postulates a physically realized system of representations that have a combinatorial syntax (and semantics) such that operations on representations are causally sensitive only to the syntactic properties of representations. According to LOTH, thought is, roughly, the tokening of a representation that has a syntactic (constituent) structure with an appropriate semantics. Thinking thus consists in syntactic operations defined over representations. Most of the arguments for LOTH derive their strength from their ability to explain certain empirical phenomena like productivity, systematicity of thought and thinking.
The purpose of this article is to explain why I believe that the Continuum Hypothesis (CH) is not a definite mathematical problem. My reason for that is that the concept of arbitrary set essential to its formulation is vague or underdetermined and there is no way to sharpen it without violating what it is supposed to be about. In addition, there is considerable circumstantial evidence to support the view that CH is not definite.
Narrative thinking has a very important role in our ordinary everyday lives: in our thinking about fiction, about the historical past, about how things might have been, and about our own past and our plans for the future. In this paper, which is part of a larger project, I will be focusing on just one kind of narrative thinking: the kind that we sometimes engage in when we think about, evaluate, and respond emotionally to, our own past lives from a perspective that is external to the remembered events. Being able to do this is an essential part of what it is to have a narrative sense of self. Sometimes, I will suggest, we fail to have such responses: we are not able to think and feel as we should about an episode in our lives. On such occasions, there is a gap in our narrative sense of self, a gap which opens up especially where the past is in some sense tragic or traumatic. The desire to close this gap is what I will call a desire for emotional closure.
The notion of memory storage, central to most contemporary theories of remembering, is challenged from a philosophical perspective as being contradictory and untenable. The paper criticizes this storage hypothesis as relying upon a linear explanation of time, an assumption which results in infinite regression, solipsism, and a failure to contact the real past. A model based on the phenomenological viewpoints of Edmund Husserl and Maurice Merleau-Ponty is offered as an alternative paradigm. Finally, a research method suggested by this descriptive approach to memory is presented and illustrated.
The Eternal Coin is a fair coin that has existed forever, and will exist forever, in a region causally isolated from you. It is tossed every day. How confident should you be that the Coin lands Heads today, conditional on (i) the hypothesis that it has landed Heads on every past day, or (ii) the hypothesis that it will land Heads on every future day? I argue for the extremely counterintuitive claim that the correct answer to both questions is 1.
I want to join Dummett in saying that the reality of the past (and, by analogy, the reality of the future) is an issue of realism versus anti-realism (Dummett 1969): If you affirm the reality of the past, you are a realist about the past. If you deny the reality of the past, you are an anti-realist about the past. (And likewise, in each case, for the future). It makes sense to think of these issues by analogy with realism about the external world, unobservable objects, mathematical objects, universals, and so on. These are all properly described as ontological issues.
The purpose of this paper is to defend what I call the action-oriented coding theory (ACT) of spatially contentful visual experience. Integral to ACT is the view that conscious visual experience and visually guided action make use of a common subject-relative or 'egocentric' frame of reference. Proponents of the influential two visual systems hypothesis (TVSH), however, have maintained on empirical grounds that this view is false (Milner & Goodale, 1995/2006; Clark, 1999; 2001; Campbell, 2002; Jacob & Jeannerod, 2003; Goodale & Milner, 2004). One main source of evidence for TVSH comes from behavioral studies of the comparative effects of size-contrast illusions on visual awareness and visuomotor action. This paper shows that not only is the evidence from illusion studies inconclusive, there is a better, ACT-friendly interpretation of the evidence that avoids serious theoretical difficulties faced by TVSH.
In historical claims for nativism, mathematics is a paradigmatic example of innate knowledge. Claims by contemporary developmental psychologists of elementary mathematical skills in human infants are a legacy of this. However, the connection between these skills and more formal mathematical concepts and methods remains unclear. This paper assesses the current debates surrounding nativism and mathematical knowledge by teasing them apart into two distinct claims. First, in what way does the experimental evidence from infants, nonhuman animals and neuropsychology support the nativist hypothesis? Second, granting that infants have some elementary mathematical skills, does this mean that such skills play an important role in the development of mathematical knowledge?
In Matter and Memory, Bergson examines the relationship between perception and memory, the status of consciousness in its relation to the brain, and more generally, a possible conjunction of matter and mind. Our reading focuses in particular on his understanding of the evanescent presence of the present and of its debt vis-à-vis the "unconscious" consciousness of a "virtual" past. We wish to show that the Bergsonian version of a critique of "the metaphysics of presence" is, for all that, an offshoot of a Platonic type of metaphysics. It is true that Bergson departs from traditional standpoints on the side of a self-sufficient and original present and a form of presence to which the transparency of consciousness would confer the character of immediate evidence. All the same, it can hardly be claimed that his rehabilitation of the past and the unconscious opens up new perspectives on how forgetting and death are bound up with the work of memory.
A sentence in the Resultative perfect licenses two inferences: (a) the occurrence of an event (b) the state caused by this event obtains at evaluation time. In this paper I show that this use of the perfect is subject to a large number of distributional restrictions that all serve to highlight the result inference at the expense of the event inference. Nevertheless, only the event inference determines the truth conditions of this use of the perfect, the result inference being a unique type of conventional implicature. I argue furthermore that, since the result state is singular, the event that causes it must also be singular, whereas the Experiential perfect is purely quantificational. But in out-of-the-blue contexts the past tense is also normally interpreted as singular. This leads to a certain amount of competition between the Resultative perfect and the past tense, and it is this competition, I suggest, that maintains the conventional (non-truth conditional) result state inference.
If reality is temporary, then reality changes, and if reality changes, the past has explanatory work to do, and it cannot do that work unless it is no longer real. This tells against the Moving Now Theory, the Growing Block Theory, and any form of Presentism that attempts to understand the past in terms of the present, including Tensed Properties Presentism and Tensed Facts Presentism. It tells in favor of a form of Presentism that allows us to appeal to unreal past facts. I suggest that Priorian Presentism, conjoined with a certain way of understanding the role played by tense operators, is one such view.
In this paper I discuss an unconventional form of presentism which, I claim, captures better than all other versions of the doctrine the fundamental notion underpinning it, namely, the notion that 'only what is present is real'. My proposal is to take this maxim as stating, not the rather uncontroversial view that past things are not real now, but the more radical idea that they never were. This rendition of presentism is, I argue, the only one that is neither trivial nor absurd. I examine this proposal by considering it against a sceptical hypothesis that bears similarities to it, viz., the hypothesis that the world was created five minutes ago. On this hypothesis, the past, all but five minutes of it, is unreal, in precisely the sense in which the presentism I discuss claims it is. I show that, assuming semantic externalism, this sceptical hypothesis cannot be sustained, but that a somewhat weaker hypothesis, the Creationist hypothesis that the world is 5,768 years old, cannot be refuted. Together, these conclusions enable a demarcation of those presentist intuitions that language and thought tolerate and those they do not.
Evolutionary psychologists tend to view the mind as a large collection of evolved, functionally specialized mechanisms, or modules. Cosmides and Tooby (1994) have presented four arguments in favor of this model of the mind: the engineering argument, the error argument, the poverty of the stimulus argument, and combinatorial explosion. Fodor (2000) has discussed each of these four arguments and rejected them all. In the present paper, we present and discuss the arguments for and against the massive modularity hypothesis. We conclude that Cosmides and Tooby's arguments have considerable force and are too easily dismissed by Fodor.
I begin with a characterization of neurolinguistic theories, trying to pinpoint some general properties that an account of brain/language relations should have. I then address specific criticisms made in the commentaries regarding the syntactic theory assumed in the target article, properties of the Trace Deletion Hypothesis (TDH) and the Tree-Pruning Hypothesis (TPH), other experimental results from aphasia, and findings from functional neuroimaging. Despite the criticism, the picture of the limited role of Broca's area remains unchanged.
This paper examines the justification for the hypothesis of extended cognition (HEC). HEC claims that human cognitive processes can, and often do, extend outside our heads to include objects in the environment. HEC has been justified by inference to the best explanation (IBE). Both advocates and critics of HEC claim that we should infer the truth value of HEC based on whether HEC makes a positive, or negative, explanatory contribution to cognitive science. I argue that IBE cannot play this epistemic role. A serious rival to HEC exists with a differing truth value, and this invalidates IBEs for both the truth and falsity of HEC. Explanatory value to cognitive science cannot be used as a guide to the truth value of HEC.
Sherri Roush () and I (, ) have each argued independently that the most significant challenge to scientific realism arises from our inability to consider the full range of serious alternatives to a given hypothesis we seek to test, but we diverge significantly concerning the range of cases in which this problem becomes acute. Here I argue against Roush's further suggestion that the atomic hypothesis represents a case in which scientific ingenuity has enabled us to overcome the problem, showing how her general strategy is undermined by evidence I have already offered in support of what I have called the 'problem of unconceived alternatives'. I then go on to show why her strategy will not generally (if ever) allow us to formulate and test exhaustive spaces of hypotheses in cases of fundamental scientific theorizing.
I’m going to argue for a set of restricted skeptical results: roughly put, we don’t know that fire engines are red, we don’t know that we sometimes have pains in our lower backs, we don’t know that John Rawls was kind, and we don’t even know that we believe any of those truths. However, people unfamiliar with philosophy and cognitive science do know all those things. The skeptical argument is traditional in form: here’s a skeptical hypothesis; you can’t epistemically neutralize it, you have to be able to neutralize it to know P; so you don’t know P. But the skeptical hypotheses I plug into it are “real, live” scientific-philosophical hypotheses often thought to be actually true, unlike any of the outrageous traditional skeptical hypotheses (e.g., ‘You’re a brain in a vat’). So I call the resulting skepticism Live Skepticism. Notably, the Live Skeptic’s argument goes through even if we adopt the clever anti-skeptical fixes thought up in recent years such as reliabilism, relevant alternatives theory, contextualism, and the rejection of epistemic closure. Furthermore, the scope of Live Skepticism is bizarre: although we don’t know the simple facts noted above, many of us do know that there are black holes and other amazing facts.
Thirty years ago, grounded cognition had roots in philosophy, perception, cognitive linguistics, psycholinguistics, cognitive psychology, and cognitive neuropsychology. During the next 20 years, grounded cognition continued developing in these areas, and it also took new forms in robotics, cognitive ecology, cognitive neuroscience, and developmental psychology. In the past 10 years, research on grounded cognition has grown rapidly, especially in cognitive neuroscience, social neuroscience, cognitive psychology, social psychology, and developmental psychology. Currently, grounded cognition appears to be achieving increased acceptance throughout cognitive science, shifting from relatively minor status to increasing importance. Nevertheless, researchers wonder whether grounded mechanisms lie at the heart of the cognitive system or are peripheral to classic symbolic mechanisms. Although grounded cognition is currently dominated by demonstration experiments in the absence of well-developed theories, the area is likely to become increasingly theory driven over the next 30 years. Another likely development is the increased incorporation of grounding mechanisms into cognitive architectures and into accounts of classic cognitive phenomena. As this incorporation occurs, much functionality of these architectures and phenomena is likely to remain, along with many original mechanisms. Future theories of grounded cognition are likely to be heavily influenced by both cognitive neuroscience and social neuroscience, and also by developmental science and robotics. Aspects from the three major perspectives in cognitive science—classic symbolic architectures, statistical/dynamical systems, and grounded cognition—will probably be integrated increasingly in future theories, each capturing indispensable aspects of intelligence.
The Narrative Practice Hypothesis (NPH) is a recently conceived, late entrant into the contest of trying to understand the basis of our mature folk psychological abilities, those involving our capacity to explain ourselves and comprehend others in terms of reasons. This paper aims to clarify its content, importance and scientific plausibility by: distinguishing its conceptual features from those of its rivals, articulating its philosophical significance, and commenting on its empirical prospects. I begin by clarifying the NPH's target explanandum and the challenge it presents to theory theory (TT), simulation theory (ST) and hybrid combinations of these theories. The NPH competes with them directly for the same explanatory space insofar as these theories purport to explain the core structural basis of our folk psychological (FP)-competence (those of the sort famously but not exclusively deployed in acts of third-personal mindreading).
Interpretive Archaeologies provides a forum for debate between varied approaches to studying the past. It reflects the profound shift in the direction of archaeological study in the last fifteen years. The book argues that archaeologists must understand their own subjective approaches to the material they study as well as recognize how past researchers imposed their value systems on the evidence they presented. The book's authors, drawn from Europe, North America, Asia and Australasia, represent many different strands of archaeology. They address the philosophical issues involved in interpretation and the origins of meaning in the evolution and emergence of "mind" in early hominids. They discuss the ways in which material culture is understood and presented in museums, and how the nature of history is itself in flux.
In an effort to account for our a priori knowledge of synthetic necessary truths, Kant proposes to extend the successful method used in mathematics and the natural sciences to metaphysics. In this paper, a uniform account of that method is proposed and the particular contribution of the ‘Copernican hypothesis’ to our knowledge of necessary truths is explained. It is argued that, though the necessity of the truths is in a way owing to the object's relation to our cognition, the truths we come to know are fully objective, expressing necessary relations between properties. Kant's distinction between ‘phenomena’ and ‘noumena’ is shown to serve to properly restrict the scope of the necessity claims so that they do express necessary connections between properties.
At first sight, homosexuality has little to do with reproduction. Nevertheless, many neo-Darwinian theoreticians think that human homosexuality may have had a procreative value, since it enabled the close kin of homosexuals to have more viable offspring than individuals lacking the support of homosexual siblings. In this article, however, we will defend an alternative hypothesis - originally put forward by Freud in "A phylogenetic phantasy" - namely that homosexuality evolved as a means to strengthen social bonds. Consequently, from an evolutionary point of view, homosexuality and heterosexuality have entirely distinct origins: there is no continuum from heterosexuality to homosexuality. Indeed, the natural history we propose shows that the intensity of the homosexual inclination has little or no predictive value with regard to the intensity of heterosexual tendencies. In fact, this may be a sound Darwinian way to understand sexual ambivalence. But if sexual ambivalence is a biological datum, one has to conclude that psychodynamic mechanisms are often needed in order to explain exclusive heterosexuality or exclusive homosexuality.
This essay selectively reviews, from an historical and philosophical perspective, the dopamine (DA) hypothesis of schizophrenia (DHS; Table 1 lists the abbreviations used in this essay). Our goal is not to adjudicate the validity of the theory—although we arrive at a generally skeptical conclusion—but to focus on the process whereby the DHS has evolved over time and been evaluated. Since its inception, the DHS has been the most prominent etiologic theory in psychiatry and is still referred to widely in current textbooks (e.g., Buchanan and Carpenter, Jr. 2005, 1336; Cohen 2003, 225; Gazzaniga 2004, 1257; Kandel et al. 2000, 1200). Understanding its origins and evolution should help to clarify the nature of modern ...
This paper presents two ideas in connection with the notion of empathic access to one's past, where this notion is understood as consisting of memories of one's past from the inside, plus a fundamental sympathy for those remembered states. The first idea is that having empathic access is a necessary condition for one's personal identity and survival. I give reasons to reject this view, one such reason being that it in effect blocks off the possibility of profound personal progress through radical change. The second idea is that empathy with one's past should, as a matter of necessity, be modeled on empathy with another person. I reject this two-state model, arguing for the alternative possibility of a one-state model, according to which one's thoughts and memories of one's past can become infused with one's present thoughts about and attitudes toward one's past.
The question of whether non-human animals are conscious is of fundamental importance. There are already good reasons to think that many are, based on evolutionary continuity and other considerations. However, the hypothesis is notoriously resistant to direct empirical test. Numerous studies have shown behaviour in animals analogous to consciously-produced human behaviour. Fewer probe whether the same mechanisms are in use. One promising line of evidence about consciousness in other animals derives from experiments on metamemory. A study by Hampton (Proc Natl Acad Sci USA 98(9):5359–5362, 2001) suggests that at least one rhesus macaque can use metamemory to predict whether it would itself succeed on a delayed matching-to-sample task. Since it is not plausible that mere meta-representation requires consciousness, Hampton’s study invites an important question: what kind of metamemory is good evidence for consciousness? This paper argues that if it were found that an animal had a memory trace which allowed it to use information about a past perceptual stimulus to inform a range of different behaviours, that would indeed be good evidence that the animal was conscious. That functional characterisation can be tested by investigating whether successful performance on one metamemory task transfers to a range of new tasks. The paper goes on to argue that thinking about animal consciousness in this way helps in formulating a more precise functional characterisation of the mechanisms of conscious awareness.
Language and Ontology: Linguistic Relativism (Sapir-Whorf Hypothesis) vs. Universal Grammar
Universal Ontology vs. Ontological Relativity
Semiotics and Ontology
Annotated Bibliography of John Deely. First part: 1965-1998
Annotated Bibliography of John Deely. Second part: 1999-2010
The Rediscovery of John Poinsot (John of St. Thomas).
The open future view is the common-sense view that there is an ontological difference between the past, the present, and the future in the sense that the past and the present are real, whereas the future is not yet a part of reality. In this paper we develop a theory in which the open future view is consistently combined with special relativity. Technically, the heart of our contribution is a logical conservativity result showing that, although the open future view is not definable inside the causal geometry of Minkowski space-time, it can be conservatively added to it.
In recent years evolutionary psychologists have developed and defended the Massive Modularity Hypothesis, which maintains that our cognitive architecture—including the part that subserves ‘central processing’ —is largely or perhaps even entirely composed of innate, domain-specific computational mechanisms or ‘modules’. In this paper I argue for two claims. First, I show that the two main arguments that evolutionary psychologists have offered for this general architectural thesis fail to provide us with any reason to prefer it to a competing picture of the mind which I call the Library Model of Cognition. Second, I argue that this alternative model is compatible with the central theoretical and methodological commitments of evolutionary psychology. Thus I argue that, at present, the endorsement of the Massive Modularity Hypothesis by evolutionary psychologists is both unwarranted and unmotivated.
Those familiar with contemporary continental philosophy know well the defenses Husserlians have offered of Husserl’s theory of inner time-consciousness against post-modernism’s deconstructive criticisms. As post-modernism gives way to Deleuzean post-structuralism, Deleuze’s Le bergsonisme has grown into the movement of Bergsonism. This movement, designed to present an alternative to phenomenology, challenges Husserlian phenomenology by criticizing the most “important… of all phenomenological problems.” Arguing that Husserl’s theory of time-consciousness detailed a linear succession of iterable instants in which the now internal to consciousness receives prejudicial favor, Bergsonism concludes that Husserl derived the past from the present and cannot account for the sense of the past, which differs in kind from the present. Consequently, everything on Husserl’s account remains present and his theory cannot accommodate time’s passage. In this paper, I renew the Husserlian defense of Husserl’s theory of time-consciousness in response to the recent movement of Deleuzean Bergsonism. Section one presents Bergsonism’s notion of the past in general and its critique of Husserl’s theory of time-consciousness. Section two presents a rejoinder to Bergsonism’s critique of Husserl, questioning (1) its understanding of the living-present as linearly extended, (2) its conflation of the living-present with Husserl’s early schema-apprehension interpretation, and (3) its failure to grasp Husserl’s revised understanding of primary memory as a result of (2). In conclusion, I suggest that Husserl’s theory of retention might articulate a notion of the past more consistent with Bergson than Bergsonism itself.
We describe a number of puzzling phenomena and use them as evidence for a hypothesis about why bodily continuity matters for personal identity. The phenomena all belong to a particular kind of symbolisation: each of them illustrates how an entity (object or person) sometimes acquires symbolic significance in virtue of a material link with the symbolised entity. Relics are the most obvious example of what happens here: they are cherished, desired or respected, not because of their intrinsic features, but because of their material link with some significant individual person. Crucial for the hypothesis we wish to defend is the fact that a human being can in some cases and for some others function as a relic of what she used to be; in these cases a human individual has a specific significance in virtue of a material link (bodily continuity) with her own past. We argue that this phenomenon can be extended and that the importance of bodily continuity for personal identity is constituted by the kind of symbolisation upon which the existence of relics is based.
This essay introduces the massive redeployment hypothesis, an account of the functional organization of the brain that centrally features the fact that brain areas are typically employed to support numerous functions. The central contribution of the essay is to outline a middle course between strict localization on the one hand, and holism on the other, in such a way as to account for the supporting data on both sides of the argument. The massive redeployment hypothesis is supported by case studies of redeployment, and compared and contrasted with other theories of the localization of function.
The purpose of this essay is to determine how we should construe the content of memories. First, I distinguish two features of memory that a construal of mnemic content should respect. These are the ‘attribution of pastness’ feature (a subject is inclined to believe of those events that she remembers that they happened in the past) and the ‘attribution of existence’ feature (a subject is inclined to believe that she existed at the time that those events that she remembers took place). Next, I distinguish two kinds of theories of memory, which I call ‘perceptual’ and ‘self-based’ theories. I argue that those theories that belong to the first kind but not the second one have trouble accommodating the attribution of existence. And theories that belong to the second kind but not the first one leave the attribution of pastness unexplained. I then discuss two different theories that are both perceptual and self-based, which I eventually reject. Finally, I propose a perceptual, self-based theory that can account for both the attribution of pastness and the attribution of past existence.
Recent work by Joshua Knobe has established that people are far more likely to describe bad but foreseen side effects as intentionally performed than good but foreseen side effects (this is sometimes called the ‘Knobe effect’ or the ‘side-effect effect’). Edouard Machery has proposed a novel explanation for this asymmetry: it results from construing the bad side effect as a cost that must be incurred to receive a benefit. In this paper, I argue that Machery’s ‘trade-off hypothesis’ is wrong. I do this by reproducing the asymmetry between judgments about good and bad side effects in cases that cannot plausibly be construed as trade-offs.
One major problem many hypotheses regarding the neural correlate of consciousness (NCC) face is what we might call “the why question”: why would this particular neural feature, rather than another, correlate with consciousness? The purpose of the present paper is to develop an NCC hypothesis that answers this question. The proposed hypothesis is inspired by the Cross-Order Integration (COI) theory of consciousness, according to which consciousness arises from the functional integration of a first-order representation of an external stimulus and a second-order representation of that first-order representation. The proposal comes in two steps. The first step concerns the “general shape” of the NCC and can be directly derived from COI theory. The second step is a concrete hypothesis that can be arrived at by combining the general shape with empirical considerations.
There is widespread agreement, even among those who accept the possibility of backward causation, that it is impossible to change the past. I argue that this agreement corresponds to a relatively uninteresting understanding of what changing the past amounts to. In one sense it is indeed impossible to change the past: in no possible world is an action performed which makes the past in that world different from the past in that world. In another sense, however, it may be possible to change the past: maybe in some possible world an action is performed which makes the past in that world different from the actual past. I argue that those who accept the possibility of backward causation are committed to accepting the possibility that the past changes in the latter sense.
Over the last four decades arguments for and against the claim that creative hypothesis formation is based on Darwinian ‘blind’ variation have been put forward. This paper offers a new and systematic route through this long-lasting debate. It distinguishes between undirected, random, and unjustified variation, to prevent widespread confusions regarding the meaning of undirected variation. These misunderstandings concern Lamarckism, equiprobability, developmental constraints, and creative hypothesis formation. The paper then introduces and develops the standard critique that creative hypothesis formation is guided rather than blind, integrating developments from contemporary research on creativity. On that basis, I discuss three compatibility arguments that have been used to answer the critique. These arguments do not deny guided variation but insist that an important analogy exists nonetheless. These compatibility arguments all fail, even though they do so for different reasons: trivialisation, conceptual confusion, and lack of evidence respectively. Revisiting the debate in this manner not only allows us to see where exactly a ‘Darwinian’ account of creative hypothesis formation goes wrong, but also to see that the debate is not about factual issues, but about the interpretation of these factual issues in Darwinian terms.
This article considers the question of the responsibility of present generations for injustices committed by previous ones. It asks whether the descendants of victims of past injustice have claims against the descendants of the perpetrators of injustice. Two modes of argument are examined: the individual responsibility approach, according to which descendants cannot have claims against other descendants, and the collective responsibility approach, according to which descendants do have strong claims. Both approaches are criticized, but for different failings. An alternative view, building on the individualist approach, is defended. This view argues that some people may have to bear responsibility for past injustice if lines of responsibility can clearly be drawn. This is most likely when certain kinds of corporate agents persist over generations, even after original members of such corporations have ceased to exist. Key words: responsibility, justice, injustice, aborigines, history.
Book description: The capacity to represent and think about time is one of the most fundamental and least understood aspects of human cognition and consciousness. This book throws new light on central issues in the study of the mind by uniting, for the first time, psychological and philosophical approaches dealing with the connection between temporal representation and memory. Fifteen specially written essays by leading psychologists and philosophers investigate the way in which time is represented in memory, and the role memory plays in our ability to reason about time. They offer insights into current theories of memory processes and of the mechanisms and cognitive abilities underlying temporal judgements, and draw out fundamental issues concerning the phenomenology and epistemology of memory and our understanding of time. The chapters are arranged into four sections, each focused on one area of current research: Keeping Track of Time, and Temporal Representation; Memory, Awareness and the Past; Memory and Experience; Knowledge and the Past: The Epistemology and Metaphysics of Time. A general introduction gives an overview of the topics discussed and makes explicit central themes which unify the different philosophical and psychological approaches.
According to John Haugeland, the capacity for “authentic intentionality” depends on a commitment to constitutive standards of objectivity. One of the consequences of Haugeland’s view is that a neurocomputational explanation cannot be adequate to understand “authentic intentionality”. This paper gives grounds to resist such a consequence. It provides the beginning of an account of authentic intentionality in terms of neurocomputational enabling conditions. It argues that the standards, which constitute the domain of objects that can be represented, reflect the statistical structure of the environments where brain sensory systems evolved and develop. The objection that I equivocate on what Haugeland means by “commitment to standards” is rebutted by introducing the notion of “florid, self-conscious representing”. If the hypothesis presented here is plausible, computational neuroscience would offer a promising framework for a better understanding of the conditions for meaningful representation.
This paper consists of two parts. In the first part, I give an in-depth comparison and analysis of the theories of Frank Ankersmit and Eelco Runia, in which I highlight their most important resemblances and differences. What both have in common is their notion of the presence of the past as a ‘presence in absence’. They differ, however, with respect to the character of this past and the role representation plays in making it present. In the second part, I argue that for both Ankersmit and Runia, the presence of the past is always the present of our past, which excludes the experience of the otherness of the past, and which opens both theories to the criticisms of being self-centered and nationalistic.
Recently, the experimental philosopher Joshua Knobe has shown that the folk are more inclined to describe side effects as intentional actions when they bring about bad results. Edouard Machery has offered an intriguing new explanation of Knobe’s work—the ‘trade-off hypothesis’—which denies that moral considerations explain folk applications of the concept of intentional action. We critique Machery’s hypothesis and offer empirical evidence against it. We also evaluate the current state of the debate concerning the concept of intentionality, and argue that, given the number of variables at play, any parsimonious account of the relevant data is implausible.
Skeptics claim that we know radically less than we think we do. For example, skeptics might claim that we have next to no knowledge of the past, the future, or other minds. Here we will consider the skeptical claim that we have next to no knowledge of the external world: the world of physical objects that we at least seem to perceive. One well-known argument in support of this claim appeals to the possibility of being a BIV: a brain in a vat. According to the BIV Hypothesis, you are a mere BIV without a normal body. This of course means, among other things, that you don’t have hands. The nerve endings of your brain are stimulated in a manner so sophisticated that the perfect illusion of a normal life is generated. Let’s distinguish between the…
I seek to clarify the notion of the fixity of the past appropriate to Pike’s regimentation of the argument for the incompatibility of God’s foreknowledge and human freedom. Also, I discuss Alvin Plantinga’s famous example of Paul and the Ant Colony in light of Pike’s argument.