A well-known ``overdetermination'' argument aims to show that the possibility of mental causes of physical events in a causally closed physical world and the possibility of causally relevant mental properties are both problematic. In the first part of this paper, I extend an identity reply that has been given to the first problem to a property-instance account of causal relata. In the second, I argue that mental types are composed of physical types and, as a consequence, both mental and physical types may be causally relevant with respect to the same physical effect, contrary to the overdetermination argument. In further sections, I argue that mental types have causal powers, consider some objections and reject an alternative version of part-whole physicalism. Throughout I assume that causal relata are tropes and property types are classes of tropes.
This paper develops the notion of a situated part structure and applies it to the semantics of the modifiers 'whole' and 'individual'. It argues that the ambiguity of 'whole' should be traced to two different conceptions of the part structure of objects being at play: one according to which the parts of an object are just the material parts and another, Aristotelian conception according to which the parts of an object include properties of form.
It is often held that according to Aristotle the city is a natural organism. One major reason for this organic interpretation is no doubt that Aristotle describes the relationship between the individual and the city as a part-whole relationship, seemingly the same relationship that holds between the parts of a natural organism and the organism itself. Moreover, some scholars (most notably Jonathan Barnes) believe this view of the city led Aristotle to accept an implicit totalitarianism. I argue, however, that an investigation of the various ways Aristotle describes parts and wholes reveals that for Aristotle the city has a unity (and thus a nature) quite different from that of a natural organism.
The centrality of the whole/part relation in mathematics is demonstrated through the presentation and analysis of examples from algebra, geometry, functional analysis, logic, topology and category theory.
This commentary on Michael Cahill’s Grading Arson argues that Cahill’s analysis inevitably leads to three possible conclusions. First, arson does not belong in criminal codes. Second, crimes of manner do not belong in criminal codes. And, third, the special part needs serious reconsideration. Although Cahill is reluctant to draw any of these conclusions, this commentary urges Cahill to embrace all three.
The House of Lords majority decision in Matthews v. Kent and Medway Towns Fire Authority overturns the narrow interpretation given to key aspects of the Part-Time Workers (Protection of Less Favourable Treatment) Regulations’ core comparator mechanism in the lower tribunals and the Court of Appeal. It is a contextually astute judgment, which recognises the reductionist implications of an overly narrow approach to establishing comparability for the purposes of a less favourable treatment claim on the grounds of part-time work. The positive aspect of this decision remains overshadowed, however, by the fact that this interpretation provides little consolation to the large majority of part-time women workers whose disadvantage and inequality remain outside the scope of the Regulations’ protection.
Accurately predicting other people's actions may involve two processes: internal real-time simulation (dynamic updating) and matching recently perceived action images (static matching). Using priming of body parts, this study aimed to differentiate the two processes. Specifically, participants played a motion-controlled video game with either their arms or legs. They then observed arm movements of a point-light actor, which were briefly occluded from view, followed by a static test pose. Participants judged whether this test pose depicted a coherent continuation of the previously seen action (i.e., “action prediction task”). Evidence of dynamic updating was obtained after compatible effector priming (i.e., arms), whereas incompatible effector priming (i.e., legs) indicated static matching. Together, the results support action prediction as engaging two distinct processes, dynamic simulation and static matching, and indicate that their relative contributions depend on contextual factors like compatibility of body parts involved in performed and observed action.
Since the 1930s, scientists studying the neurological disease scrapie had assumed that the infectious agent was a virus. By the mid 1960s, however, several unconventional properties had arisen that were difficult to reconcile with the standard viral model. Evidence for nucleic acid within the pathogen was lacking, and some researchers considered the possibility that the infectious agent consisted solely of protein. In 1982, Stanley Prusiner coined the term `prion' to emphasize the agent's proteinaceous nature. This infectious protein hypothesis was denounced by many scientists as `heretical'. This essay asks why the concept of an infectious protein was considered controversial. Some biologists justified their evaluation of this hypothesis on the grounds that an infectious protein contradicted the `central dogma of molecular biology'. Others referred to vague theoretical constraints such as molecular biology's `theoretical structure' or `framework'. Examination of the objections raised by researchers reveals exactly what generalizations were being challenged by a protein model of infection. This two-part survey of scrapie and prion research reaches several conclusions: (1) A theoretical framework is present in molecular biology, exerting its influence in hypothesis formation and evaluation; (2) This framework consists of several related, yet separable, generalizations or `elements', including Francis Crick's Central Dogma and Sequence Hypothesis, plus notions concerning infection, replication, protein synthesis, and protein folding; (3) The term `central dogma' has stretched beyond Crick's original 1958 definition to encompass at least two other `framework elements': replication and protein synthesis; and (4) From the study of scrapie and related diseases, biological information has been delineated into at least two classes: sequential and what I call `conformational'. In Part I of this essay, a brief review of the central dogma, as outlined by both Francis Crick
and James Watson, will be given. The developments in scrapie research from 1965 to 1972 will then be traced. This section will summarize many of the puzzling, non-viral-like properties of the scrapie agent. Alternative hypotheses to the viral explanation will also be presented, including early versions of a protein-only hypothesis. Part II of this essay will follow the developments in scrapie and prion research from the mid 1970s through 1991. The growing prominence of a protein-only model of infection will be balanced by continued objections from many researchers to a pathogen devoid of nucleic acid. These objections will help illuminate those generalizations in molecular biology that were indeed challenged by a protein-only model of infection.
In this second paper, I continue my discussion of the problem of reference for scientific realism. First, I consider a final objection to Kitcher's account of reference, which I generalise to other accounts of reference. Such accounts make attributions of reference by appeal to our pretheoretical intuitions about how true statements ought to be distributed among the scientific utterances of the past. I argue that in the cases that merit discussion, this strategy fails because our intuitions are unstable. The interesting cases are importantly borderline--it really isn't clear what we ought to say about how those terms referred. I conclude that in many relevant cases, our grounds for thinking that the theoretical terms of the past referred are matched by our grounds for thinking that they failed to refer, in such a way that deciding on either result is arbitrary and bad news for the realist. In response to this problem, in the second part of the paper I expand upon Field's (1973) account of partial reference to sketch a new way of thinking about the theoretical terms of the past--that they partially referred and partially failed to refer.
In this first part of a two-part paper, we describe efforts in the early decades of this century to restrict the extent of violations of the Second Law of thermodynamics that were brought to light by the rise of the kinetic theory and the identification of fluctuation phenomena. We show how these efforts mutated into Szilard's (1929) proposal that Maxwell's Demon is exorcised by proper attention to the entropy costs associated with the Demon's memory and information acquisition. In the second part we will argue that the information theoretic exorcisms of the Demon provide largely illusory benefits. Depending on the case, they either return a presupposition that can be had without information theoretic consideration or they postulate a broader connection between information and entropy than can be sustained.
In this second part of our two-part paper we review and analyse attempts since 1950 to use information theoretic notions to exorcise Maxwell's Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.
In Part I we saw that the works of Helmholtz, Hölder, Campbell and Stevens contain the main ingredients for the analysis of the conditions which make (fundamental) measurement possible, but that, so to speak, what is lacking in the work of the first three is to be found in the work of the last, and vice versa. The first tradition focuses on the conditions that an empirical qualitative system must satisfy in order to be numerically representable, but pays no attention to the relation between possible different representations. The second tradition focuses on the study of scale types and the mathematical properties of the transformations that characterize the scales, but says nothing about the empirical facts these scales represent and the nature of such representation. Thus, these two lines of research need to be appropriately integrated. In this Part II, we shall see how this integration is brought about in the foundational work of Suppes, the extensions and modifications which are generated around this work and the mature theory which results from all of this.
The long history of ergodic and quasi-ergodic hypotheses provides the best example of the attempt to supply non-probabilistic justifications for the use of statistical mechanics in describing mechanical systems. In this paper we reverse the terms of the problem. We aim to show that accepting a probabilistic foundation of elementary particle statistics dispenses with the need to resort to ambiguous non-probabilistic notions like that of (in)distinguishability. In the quantum case, starting from suitable probability conditions, it is possible to deduce elementary particle statistics in a unified way. Following our approach Maxwell-Boltzmann statistics can also be deduced, and this deduction clarifies its status. Thus our primary aim in this paper is to give a mathematically rigorous deduction of the probability of a state with given energy for a perfect gas in statistical equilibrium; that is, a deduction of the equilibrium distribution for a perfect gas. A crucial step in this deduction is the statement of a unified statistical theory based on clearly formulated probability conditions from which the particle statistics follows. We believe that such a deduction represents an important improvement in elementary particle statistics, and a step towards a probabilistic foundation of statistical mechanics. In this Part I we first present some history: we recall some results of Boltzmann and Brillouin that go in the direction we will follow. Then we present a number of probability results we shall use in Part II. Finally, we state a notion of entropy referring to probability distributions, and give a natural solution to Gibbs' paradox.
In the first part of this article I investigated the Popperian roots of Lakatos's Proofs and Refutations, which was an attempt to apply, and thereby to test, Popper's theory of knowledge in a field (mathematics) to which it had not primarily been intended to apply. While Popper's theory of knowledge stood up gloriously to this test, the new application gave rise to new insights into the heuristic of mathematical development, which necessitated further clarification and improvement of some Popperian methodological maxims. In the present part I analyze this second phase in the development of Lakatos's Popperian programme in mathematics, and its connection to the methodology of scientific research programmes.
Does the matter of the sensible world, for Plotinus as for Plato and Aristotle, exist without a cause of its existence? Long divided on the answer to that question, scholarly opinion now veers in favour of a derivation of matter from principles prior to matter, with disagreement limited to the details of the theory. What exactly is implied by the various passages of the Enneads where Plotinus writes of soul or physis in relation to `darkness' and `non-being', matter and form? In the pages that follow, I argue that the soul's `making' of a `non-being' that by implication is matter, in Enn. III 9.3, is logically antecedent to the `making' of `visible form' ascribed to physis in Enn. III 8.2. A detailed study of the context and the syntax of the latter passage shows that, contrary to an interpretation put forward recently in this Journal, the two `makings' cannot be the same.
The notions of part and whole play an important role for ontology and in many areas of the semantics of natural language. Both in philosophy and linguistic semantics, usually a particular notion of part structure is used, that of extensional mereology. This paper argues that such a notion is insufficient for ontology and, especially, for the semantic analysis of the relevant constructions of natural language. What is needed for the notion of part structure, in addition to an ordering among parts, is the notion of integrated whole.
In the present volume Proclus describes the 'creation' of the soul that animates the entire universe. This is not a literal creation, for Proclus argues that Plato means only to convey the eternal dependence of the World Soul upon higher causes. In his exegesis of Plato's text, Proclus addresses a range of issues in Pythagorean harmonic theory, as well as questions about the way in which the World Soul knows both forms and the visible reality that comprises its body. This part of Proclus' Commentary is particularly responsive to the interpretive tradition that precedes it. As a result, this volume is especially significant for the study of the Platonic tradition from the earliest commentators onwards.
In this three-part paper, my concern is to expound and defend a conception of science, close to Einstein's, which I call aim-oriented empiricism. I argue that aim-oriented empiricism has the following virtues. (i) It solves the problem of induction; (ii) it provides decisive reasons for rejecting van Fraassen's brilliantly defended but intuitively implausible constructive empiricism; (iii) it solves the problem of verisimilitude, the problem of explicating what it can mean to speak of scientific progress given that science advances from one false theory to another; (iv) it enables us to hold that appropriate scientific theories, even though false, can nevertheless legitimately be interpreted realistically, as providing us with genuine, even if only approximate, knowledge of unobservable physical entities; (v) it provides science with a rational, even though fallible and non-mechanical, method for the discovery of fundamental new theories in physics. In the third part of the paper I show that Einstein made essential use of aim-oriented empiricism in scientific practice in developing special and general relativity. I conclude by considering to what extent Einstein came explicitly to advocate aim-oriented empiricism in his later years.
In this paper I argue that aim-oriented empiricism provides decisive grounds for accepting scientific realism and rejecting instrumentalism. But it goes further than this. Aim-oriented empiricism implies that physicalism is a central part of current (conjectural) scientific knowledge. Furthermore, we can and need, I argue, to interpret fundamental physical theories as attributing necessitating physical properties to fundamental physical entities.
In this article (Part I), I first engage in some conceptual clarification of what the words "imagine," "imagining," and "imagination" can mean. Each has (i) a constructive sense, (ii) an attitudinal sense, and (iii) an imagistic sense. Keeping the senses straight in the course of cognitive theorizing is important for both psychology and philosophy. I then discuss the roles that perceptual memories, beliefs, and genre truth attitudes play in constructive imagination, or the capacity to generate novel representations that go well beyond what's prompted by one's immediate environment.
God, Free Will, and Time: The Free Will Offense Part II. Journal article by J. L. Schellenberg (Mount Saint Vincent University, 166 Bedford Highway, Halifax, NS B3M2J6, Canada). International Journal for Philosophy of Religion, pp. 1-10, DOI 10.1007/s11153-011-9328-z (Online ISSN 1572-8684, Print ISSN 0020-7047).
Recent work has defended “Euclidean” theories of set size, in which Cantor’s Principle (two sets have equally many elements if and only if there is a one-to-one correspondence between them) is abandoned in favor of the Part-Whole Principle (if A is a proper subset of B then A is smaller than B). It has also been suggested that Gödel’s argument for the unique correctness of Cantor’s Principle is inadequate. Here we see from simple examples, not that Euclidean theories of set size are wrong, but that they must be either very weak and narrow or largely arbitrary and misleading.
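The tension between the two principles can be made concrete with the classic example of the even naturals sitting inside the naturals. The following Python sketch is an illustration added here, not drawn from the paper itself: it checks, on a finite initial segment, that the map n -> 2n is a one-to-one correspondence (so Cantor's Principle would call the two sets equinumerous) even though the evens form a proper subset of the naturals (so the Part-Whole Principle would call them strictly smaller).

```python
def double(n):
    """The map n -> 2n, pairing each natural with a distinct even natural.

    Over the full infinite sets this is a one-to-one correspondence, which
    is exactly the condition in Cantor's Principle for 'equally many'."""
    return 2 * n

k = 10
naturals = list(range(k))                 # finite initial segment {0, ..., 9}
evens = [double(n) for n in naturals]

# Injectivity on the segment: no two naturals are sent to the same even.
assert len(set(evens)) == k

# Part-Whole Principle's premise: the evens are a *proper* subset of the
# naturals (here checked against the segment {0, ..., 2k-1}, which contains
# odd numbers such as 1 that are not in the image of the map).
assert set(evens) < set(range(2 * k))
```

On finite sets the two principles agree (a proper subset never admits a bijection onto its superset); only for infinite sets, where both assertions above hold of the full sets, do they come apart.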
Part I of this essay supports the anti-egalitarian conclusion that individuals may readily become entitled to substantially unequal extra-personal holdings by criticizing end-state and pattern theories of distributive justice and defending the historical entitlement doctrine of justice in holdings. Part II of this essay focuses on a second route to the anti-egalitarian conclusion. This route combines the self-ownership thesis with a contention that is especially advanced by G.A. Cohen. This is the contention that the anti-egalitarian conclusion can be inferred from the self-ownership thesis without the aid of additional controversial premises. Cohen advances this contention, not because he wants to support the anti-egalitarian conclusion, but rather because he wants to emphasize the need for one to reject the self-ownership thesis if one is to reject the anti-egalitarian conclusion. In Part II of this essay, I support this second route to the anti-egalitarian conclusion by reinforcing Cohen's special contention while rejecting his challenges to the self-ownership thesis. Cohen's special contention is reinforced by way of an explanation of why the redistributive state must trench upon some people's self-ownership rights. One important challenge to the self-ownership thesis is answered through the articulation of a new and improved Lockean proviso. Another challenge offered by Cohen is answered by arguing that the philosophical costs of denying the self-ownership thesis are as great as the self-ownership libertarian maintains. Thus, I defend both of the key elements of self-ownership libertarianism, the self-ownership thesis and the anti-egalitarian conclusion. Key Words: autonomy; distributive justice; egalitarianism; exploitation; Lockean proviso; self-ownership; slavery.
This two-part article offers a defense of a libertarian doctrine that centers on two propositions. The first is the self-ownership thesis according to which each individual possesses original moral rights over her own body, faculties, talents, and energies. The second is the anti-egalitarian conclusion that, through the exercise of these rights of self-ownership, individuals may readily become entitled to substantially unequal extra-personal holdings. The self-ownership thesis remains in the background during Part I of this essay, while the anti-egalitarian conclusion is supported in two ways. First, I offer a reconstruction of Robert Nozick's well-known `How Liberty Upsets Patterns' argument against all end-state and pattern theories of distributive justice; and I defend this reconstructed stance against what might (otherwise) seem to be telling criticisms. Second, I defend the two key principles of Nozickian historical entitlement theory (the principle of just transfer and the principle of just initial acquisition) against criticisms offered by G.A. Cohen. Part II will center on Cohen's contention that the crucial basis for the anti-egalitarian conclusion is the self-ownership thesis. There I argue that Cohen is correct to hold that he must reject the self-ownership thesis if he is to avoid the anti-egalitarian conclusion; but he is wrong to think that he has an adequate basis for rejecting this thesis. Thus, both elements in the libertarianism under consideration are vindicated. And, the self-ownership thesis plays a surprisingly direct role in vindicating the anti-egalitarian conclusion. Key Words: egalitarianism; historical entitlement; moral rights; self-ownership.
Many biologists and philosophers have worried that importing models of reasoning from the physical sciences obscures our understanding of reasoning in the life sciences. In this paper we discuss one example that partially validates this concern: part-whole reductive explanations. Biology and physics tend to incorporate different models of temporality in part-whole reductive explanations. This results from differential emphases on compositional and causal facets of reductive explanations, which have not been distinguished reliably in prior philosophical analyses. Keeping these two facets distinct facilitates the identification of two further aspects of reductive explanation: intrinsicality and fundamentality. Our account provides resources for discriminating between different types of reductive explanation and suggests a new approach to comprehending similarities and differences in the explanatory reasoning found in biology and physics.
I argue that it is intuitive and useful to think about composition in the light of the familiar functionalist distinction between role and occupant. This involves factoring the standard notion of parthood into two related notions: being a parthood slot and occupying a parthood slot. One thing is part of another just in case it fills one of that thing's parthood slots. This move opens room to rethink mereology in various ways, and, in particular, to see the mereological structure of a composite as potentially outreaching the individual entities that are its parts. I sketch one formal system that allows things to have individual entities as parts multiple times over. This is particularly useful to David Armstrong, given Lewis's charge that his structural universals must do exactly that. I close by reflecting upon the nature and point of formal mereology.
This is the first part of a two-part article in which we defend the thesis of Humean Supervenience about Laws of Nature (HS). According to this thesis, two possible worlds cannot differ on what is a law of nature unless they also differ on the Humean base. The Humean base is easy to characterize intuitively, but there is no consensus on how, precisely, it should be defined. Here in Part I, we present and motivate a characterization of the Humean base that, we argue, enables HS to capture what is really at stake in the debate, without taking on extraneous commitments.
In Part IX of "Dialogues Concerning Natural Religion", Demea advances an "a priori" argument for the existence of God: an argument of which Cleanthes and Philo then make a number of trenchant criticisms. These criticisms are acknowledged by all commentators to be Hume's own, and they are regarded by almost all commentators as being fatal to Demea's argument. I show that, on the contrary, Hume's main criticisms are all worthless, and that they even include an inconsistency of the most glaring kind.
In this paper I discuss the proposal that the law of torts exists to do justice, more specifically corrective justice, between the parties to a tort case. My aims include clarifying the proposal and defending it against some objections (as well as saving it from some defences that it could do without). Gradually the paper turns to a discussion of the rationale for doing corrective justice. I defend what I call the ‘continuity thesis’ according to which at least part of the rationale for doing corrective justice is to mitigate one’s wrongs, including one’s torts. I try to show how much of the law of torts this thesis helps to explain, but also what it leaves unexplained. In the process I show (what I will discuss in a later companion paper) that ‘corrective justice’ cannot be a complete answer to the question of what tort law is for. (shrink)
Part One of this essay considered familiar ways of characterizing deontology, which focus on the notions of the good and the right. Here we will take up alternative approaches, which stress the type of reasons for actions that are generated by deontological theories. Although some of these alternative conceptualizations of deontology also employ a distinction between the good and the right, all mark the basic contrast between deontology and teleology in terms of reasons to act.