The law tends to think that there is no difficulty about identifying humans. When someone is born, her name is entered into a statutory register. She is ‘X’ in the eyes of the law. At some point, ‘X’ will die and her name will be recorded in another register. If anyone suggested that the second X was not the same as the first, the suggestion would be met with bewilderment. During X's lifetime, the civil law assumes that the X who entered into a contract is the same person who breached it. The criminal law assumes that X, at the age of 80, is liable for criminal offences ‘she’ committed at the age of 18. This accords with the way we talk. ‘She's not herself today’, we say; or ‘When he killed his wife he wasn't in his right mind’. The intuition has high authority: ‘To thine own self be true’, urged Polonius. It sounds as if we believe in souls—immutable, core essences that constitute our real selves. Medicine conspires in the belief. If you become mentally ill, a psychiatrist will seek to get you back to your right mind. The Mental Capacity Act 2005 states that when a patient loses capacity the only lawful interventions will be interventions which are in that patient's best interests, and that in determining what those interests are the decision-maker must have ….
We examine metonymy at psycho- and neurolinguistic levels, seeking to adjudicate between two possible processing implementations. We compare highly conventionalized systematic metonymy to lesser-conventionalized circumstantial metonymy. Whereas these two metonymy types differ in terms of contextual demands, they each reveal a similar dependency between the named and intended conceptual entities. We reason that if each metonymy yields a distinct processing time course and a substantially non-overlapping preferential localization pattern, it would not only support a two-mechanism view but would suggest that conventionalization acts as a linguistic categorizer. By contrast, similar behavior in time course and localization would support a one-mechanism view and the inference that conventionalization acts instead as a modulator of contextual felicitousness, and that differences in interpretation introduced by conventionalization are of degree, not of kind. Results from three paradigms (self-paced reading, event-related potentials, and functional magnetic resonance imaging) reveal the following: no main effect of condition for either metonymy type immediately after the metonymy trigger, and a main effect only for circumstantial metonymy one word post-trigger; an N400 effect across metonymy types and a late positivity for circumstantial metonymy; and highly overlapping activation connecting the left ventrolateral prefrontal cortex and the left dorsolateral prefrontal cortex. Altogether, the pattern observed does not reach the threshold required to justify a two-mechanism system. Instead, the pattern is more naturally understood as resulting from the implementation of a generalized referential dependency mechanism, modulated by degree of context dependence/conventionalization, thus supporting architectures of language whereby “lexical” and “pragmatic” meaning relations are encoded along a cline of contextual underspecification.
Dualism argues that the mind is more than just the brain. It holds that there exist two very different realms, one mental and the other physical. Both are fundamental and neither can be reduced to the other - there are minds and there is a physical world. This book examines and defends the most famous dualist account of the mind, the Cartesian, which attributes the immaterial contents of the mind to an immaterial self. John Foster's new book exposes the inadequacies of the dominant materialist and reductionist accounts of the mind. In doing so he is in radical conflict with the current philosophical establishment. Ambitious and controversial, _The Immaterial Self_ is the most powerful and effective defence of Cartesian dualism since Descartes' own.
Philosophers of science have given considerable attention to the logic of completed scientific systems. In this 1958 book, Professor Hanson turns to an equally important but comparatively neglected subject, the philosophical aspects of research and discovery. He shows that there is a logical pattern in finding theories as much as in using established theories to make deductions and predictions, and he sets out the features of this pattern with the help of striking examples in the history of science.
John Foster presents a clear and powerful discussion of a range of topics relating to our understanding of the universe: induction, laws of nature, and the existence of God. He begins by developing a solution to the problem of induction - a solution whose key idea is that the regularities in the workings of nature that have held in our experience hitherto are to be explained by appeal to the controlling influence of laws, as forms of natural necessity. His second line of argument focuses on the issue of what we should take such necessitational laws to be, and whether we can even make sense of them at all. Having considered and rejected various alternatives, Foster puts forward his own proposal: the obtaining of a law consists in the causal imposing of a regularity on the universe as a regularity. With this causal account of laws in place, he is now equipped to offer an argument for theism. His claim is that natural regularities call for explanation, and that, whatever explanatory role we may initially assign to laws, the only plausible ultimate explanation is in terms of the agency of God. Finally, he argues that, once we accept the existence of God, we need to think of him as creating the universe by a method which imposes regularities on it in the relevant law-yielding way. In this new perspective, the original nomological-explanatory solution to the problem of induction becomes a theological-explanatory solution. The Divine Lawmaker is bold and original in its approach, and rich in argument. The issues on which it focuses are among the most important in the whole epistemological and metaphysical spectrum.
John Foster addresses the question: what is it to perceive a physical object? He rejects the view that we perceive such objects directly, and argues for a new version of the traditional empiricist account, which locates the immediate objects of perception in the mind. But this account seems to imply that we do not perceive physical objects at all. Foster offers a surprising solution, which involves embracing an idealist view of the physical world.
A World for Us aims to refute physical realism and establish in its place a form of idealism. Physical realism, in the sense in which John Foster understands it, takes the physical world to be something whose existence is both logically independent of the human mind and metaphysically fundamental. Foster identifies a number of problems for this realist view, but his main objection is that it does not accord the world the requisite empirical immanence. The form of idealism that he tries to establish in its place rejects the realist view in both its aspects. It takes the world to be something whose existence is ultimately constituted by facts about human sensory experience, or by some richer complex of non-physical facts in which such experiential facts centrally feature. Foster calls this phenomenalistic idealism. He tries to establish a specific version of such phenomenalistic idealism, in which the experiential facts that centrally feature in the constitutive creation of the world are ones that concern the organization of human sensory experience. The basic idea of this version is that, in the context of certain other constitutively relevant factors, this sensory organization creates the physical world by disposing things to appear systematically world-wise at the human empirical viewpoint. Chief among these other relevant factors is the role of God as the one who is responsible for the sensory organization and ordains the system of appearance it yields. It is this that gives the idealistically created world its objectivity and allows it to qualify as a real world.
The field of neuroimaging has reached a watershed. Brain imaging research has been the source of many advances in cognitive neuroscience and cognitive science over the last decade, but recent critiques and emerging trends are raising foundational issues of methodology, measurement, and theory. Indeed, concerns over interpretation of brain maps have created serious controversies in social neuroscience, and, more important, point to a larger set of issues that lie at the heart of the entire brain mapping enterprise. In this volume, leading scholars -- neuroimagers and philosophers of mind -- reexamine these central issues and explore current controversies that have arisen in cognitive science, cognitive neuroscience, computer science, and signal processing. The contributors address both statistical and dynamical analysis and modeling of neuroimaging data and interpretation, discussing localization, modularity, and neuroimagers' tacit assumptions about how these two phenomena are related; controversies over correlation of fMRI data and social attributions; and the standard inferential design approach in neuroimaging. Finally, the contributors take a more philosophical perspective, considering the nature of measurement in brain imaging, and offer a framework for novel neuroimaging data structures. Contributors: William Bechtel, Bharat Biswal, Matthew Brett, Martin Bunzl, Max Coltheart, Karl J. Friston, Joy J. Geng, Clark Glymour, Kalanit Grill-Spector, Stephen José Hanson, Trevor Harley, Gilbert Harman, James V. Haxby, Rik N. Henson, Nancy Kanwisher, Colin Klein, Richard Loosemore, Sébastien Meriaux, Chris Mole, Jeanette A. Mumford, Russell A. Poldrack, Jean-Baptiste Poline, Richard C. Richardson, Alexis Roche, Adina L. Roskies, Pia Rotshtein, Rebecca Saxe, Philipp Sterzer, Bertrand Thirion, Edward Vul.
Does general validity or real world validity better represent the intuitive notion of logical truth for sentential modal languages with an actuality connective? In Hanson (Philosophical Studies 130:436–459, 2006) I argued in favor of general validity, and I criticized the arguments of Zalta (Journal of Philosophy 85:57–74, 1988) for real world validity. But in Nelson and Zalta (Philosophical Studies 157:153–162, 2012) Michael Nelson and Edward Zalta criticize my arguments and claim to have established the superiority of real world validity. Section 1 of the present paper introduces the problem and sets out the basic issues. In Section 2 I consider three of Nelson and Zalta’s arguments and find all of them deficient. In Section 3 I note that Nelson and Zalta direct much of their criticism at a phrase (‘true at a world from the point of view of some distinct world as actual’) I used only inessentially in Hanson (Philosophical Studies 130:436–459, 2006), and that their account of the philosophical foundations of modal semantics leaves them ill-equipped to account for the plausibility of modal logics weaker than S5. Along the way I make several general suggestions for ways in which philosophical discussions of logical matters (especially, but not limited to, discussions of truth and logical truth for languages containing modal and indexical terms) might be facilitated and made more productive.
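As background, the contrast at issue can be stated in a few lines. The formulation below is a standard textbook-style gloss, and the example formula is a common illustration; neither is drawn verbatim from the paper.

```latex
% A model for a modal language with an actuality connective A supplies a set of
% worlds W and a designated actual world w_0, with:  M, w \models A\varphi  iff  M, w_0 \models \varphi.
\[
\varphi \text{ is generally valid} \iff \text{for every model } M \text{ and every } w \in W:\; M, w \models \varphi
\]
\[
\varphi \text{ is real-world valid} \iff \text{for every model } M:\; M, w_0 \models \varphi
\]
% Example: the schema  A\varphi \to \varphi  is real-world valid (at w_0, A\varphi and \varphi agree)
% but not generally valid, since at a non-actual world w, A\varphi can be true while \varphi is false.
```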
Søren Kierkegaard's Fear and Trembling is one of the most widely read works of Continental philosophy and the philosophy of religion. While several commentaries and critical editions exist, Jeffrey Hanson offers a distinctive approach to this crucial text. Hanson gives equal weight and attention to all three of Kierkegaard’s "problems," dealing with Fear and Trembling as part of the entire corpus of Kierkegaard's production and putting all parts into relation with each other. Additionally, he offers a distinctive analysis of the Abraham story and other biblical texts, giving particular attention to questions of poetics, language, and philosophy, especially as each relates to the aesthetic, the ethical, and the religious. Presented in a thoughtful, well-informed, and fresh manner, Hanson’s claims are original and edifying. This new reading of Kierkegaard will stimulate fruitful dialogue on well-traveled philosophical ground.
This is a reprint of Amartya Sen’s 1973 book on the measurement of inequality, plus an updated bibliography and index, and an annex by James Foster and Sen that summarizes and comments on the main developments since 1973. The book is superbly written and focuses on verbal discussion of the plausibility and significance of the conditions, theorems, and measures.
This discussion, featuring short comments by R. Melvin Keiser, Durwood Foster, Richard Gelwick and Donald Musser, grew out of articles in TAD 35:3 (2008-2009) on connections and disconnections between the thought of Polanyi and Tillich (featuring essays by Foster and Gelwick with a response from Musser). Keiser raises questions about perspectives articulated in the earlier articles and Foster, Gelwick and Musser respond here.
Arthur Diamond comments that "it is not clear how a donor distributes money through Hanson's market". Let me try again to be clear. Imagine David Levy were to seek funding for the regression he suggests in his comments, on the relative impact of sports versus science spending on aggregate productivity. Consider what might happen under three different funding institutions.
John Foster presents a penetrating investigation into the question: what is it to perceive a physical object? Is perceptual contact with a physical object, he asks, something fundamental, or does it break down into further factors? If the latter, what are these factors, and how do they combine to secure the contact? For most of the book, Foster addresses these questions in the framework of a realist view of the physical world. But the arguments which thereby unfold - arguments which undermine direct realism and establish a version of the sense-datum theory - lead to the conclusion that we do not perceive physical objects at all. The only way to avoid this conclusion is by abandoning physical realism for a form of idealism, and this is the option which Foster finally embraces. The Nature of Perception makes an important contribution to the ongoing debate: it sheds fresh light on the traditional issues, and breathes new life into positions which most current philosophers assume to be dead.
Environmental problems compel examination of three contrasting patterns of moral reasoning concerning the human relationship to nature: the currently implemented Progress Ethic, and the proposed alternatives of a Stewardship Ethic and Connection Ethic. But none of these deliver all they promise, whether in theory or practice or both, because all dubiously presume that moral reason is commensurate with nature, and that the value of natural entities is an intrinsic property. Matthew R. Foster argues that resolution of this crisis requires reaching beyond the limit of reason, and acknowledging value to be not a noun, but a verb about the incomparable relation of two entities.
Originally published in 1963, The Concept of the Positron forms a detailed analysis of quantum theory. Whilst it is not as well known as Professor Hanson's previous book, Patterns of Discovery, the text has many interesting aspects. In many ways it goes further than Hanson's earlier work in approaching the problems of theory competition and the rationality of science, topics that have since become central to the philosophy of science. It is also notable for a rigorous and forthright defence of the Copenhagen Interpretation. Taken together, the ideas presented in this book constitute a first-rate achievement in the history and philosophy of science. This paperback reissue comes with a new preface from Matthew Lund, Assistant Professor, Faculty of Philosophy and Religious Studies at Rowan University.
For a stable visual world, the colours of objects should appear the same under different lights. This property of colour constancy has been assumed to be fundamental to vision, and many experimental attempts have been made to quantify it. I contend here, however, that the usual methods of measurement are either too coarse or concentrate not on colour constancy itself, but on other, complementary aspects of scene perception. Whether colour constancy exists other than in nominal terms remains unclear.
In the first section, I consider what several logicians say informally about the notion of logical consequence. There is significant variation among these accounts, they are sometimes poorly explained, and some of them are clearly at odds with the usual technical definition. In the second section, I first argue that a certain kind of informal account—one that includes elements of necessity, generality, and apriority—is approximately correct. Next I refine this account and consider several important questions about it, including the appropriate characterization of necessity, the criterion for selecting logical constants, and the exact role of apriority. I argue, among other things, that there is no need to recognize a special logical sense of necessity and that the selection of terms to serve as logical constants is ultimately a pragmatic matter. In the third section, I consider whether the informal account I have presented and defended is adequately represented by the usual technical definition. I show that it is, and provably so, for certain limited ways of selecting logical constants. In the general case, however, there seems to be no way to be sure that the technical and informal accounts coincide.
The growing prominence of computers in contemporary life, often seemingly with minds of their own, invites rethinking the question of moral responsibility. If the moral responsibility for an act lies with the subject that carried it out, it follows that different concepts of the subject generate different views of moral responsibility. Some recent theorists have argued that actions are produced by composite, fluid subjects understood as extended agencies (cyborgs, actor networks). This view of the subject contrasts with methodological individualism: the idea that actions are produced only by human individuals. This essay compares two views of responsibility: moral individualism (the ethical twin of methodological individualism), and joint responsibility (associated with extended agency theory). It develops a view of what joint responsibility might look like, and considers the advantages it might bring relative to moral individualism as well as the objections that are sure to be raised against it.
A simple exogenous growth model gives conservative estimates of the economic implications of machine intelligence. Machines complement human labor when they become more productive at the jobs they perform, but machines also substitute for human labor by taking over human jobs. At first, expensive hardware and software do only the few jobs where computers have the strongest advantage over humans. Eventually, computers do most jobs. At first, complementary effects dominate, and human wages rise with computer productivity. But eventually substitution can dominate, making wages fall as fast as computer prices now do. An intelligence population explosion makes per-intelligence consumption fall this fast, while economic growth rates rise by an order of magnitude or more. These results are robust to automating incrementally, and to distinguishing hardware, software, and human capital from other forms of capital.
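As a rough intuition pump (not the paper's actual model), the toy sketch below contrasts a "complement" era, in which computers only do computer tasks and cheaper computing raises the marginal product of human labor, with a "substitute" era, in which machines can do human jobs and the wage gets pinned to the machine rental price. All functional forms and parameter values here are illustrative assumptions.

```python
# Toy illustration: how a falling computer price p can first raise and later
# depress wages. Functional forms are illustrative, not drawn from the paper.
#
# Complement era: Cobb-Douglas output Y = L^a * K^b * Z^(1-a-b), where K is computer
# capital rented elastically at price p and Z is fixed other capital. Renting K until
# its marginal product equals p gives K = b*Y/p, and the wage is w = a*Y/L.
#
# Substitute era: machines can do human jobs at efficiency E, so effective labor is
# L + E*K and Y = (L + E*K)^beta * Z^(1-beta). Renting machines until their marginal
# product equals p drives the marginal product of human labor down to p/E.

L, Z = 1.0, 1.0          # fixed human labor and other capital (illustrative)
a, b = 0.6, 0.3          # Cobb-Douglas shares in the complement era
beta, E = 0.7, 1.0       # labor share and machine efficiency in the substitute era

def wage_complement(p):
    # Solve Y = L^a * (b*Y/p)^b * Z^(1-a-b) for Y, then w = a*Y/L.
    Y = (L**a * b**b * Z**(1 - a - b) * p**(-b)) ** (1.0 / (1.0 - b))
    return a * Y / L

def wage_substitute(p):
    # Once machines are cheap enough to be adopted at all, competition forces the
    # marginal product of effective labor down to p/E, which is the human wage.
    w_no_machines = beta * L**(beta - 1) * Z**(1 - beta)
    return min(w_no_machines, p / E)

for p in [4.0, 2.0, 1.0, 0.5, 0.25, 0.125]:
    print(f"p={p:6.3f}  complement-era wage={wage_complement(p):6.3f}  "
          f"substitute-era wage={wage_substitute(p):6.3f}")
# The complement-era wage rises as p falls, while the substitute-era wage tracks p
# downward once machines are cheap enough to be used.
```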
Attempts to model interstellar colonization may seem hopelessly compromised by uncertainties regarding the technologies and preferences of advanced civilizations. If light speed limits travel speeds, however, then a selection effect may eventually determine frontier behavior. Making weak assumptions about colonization technology, we use this selection effect to predict colonists’ behavior, including which oases they colonize, how long they stay there, how many seeds they then launch, how fast and far those seeds fly, and how behavior changes with increasing congestion. This colonization model explains several astrophysical puzzles, predicting lone oases like ours, amid large quiet regions with vast unused resources.
If you might be living in a simulation, then, all else equal, you should care less about others, live more for today, make your world look more likely to become rich, expect to and try more to participate in pivotal events, be more entertaining and praiseworthy, and keep the famous people around you happier and more interested in you.
In Everett's many-worlds interpretation, quantum measurements are considered to be decoherence events. If so, then inexact decoherence may allow large worlds to mangle the memory of observers in small worlds, creating a cutoff in observable world size. Smaller worlds are mangled and so not observed. If this cutoff is much closer to the median measure size than to the median world size, the distribution of outcomes seen in unmangled worlds follows the Born rule. Thus deviations from exact decoherence can allow the Born rule to be derived via world counting, with a finite number of worlds and no new fundamental physics.
In practice, scoring rules elicit good probability estimates from individuals, while betting markets elicit good consensus estimates from groups. Market scoring rules combine these features, eliciting estimates from individuals or groups, with groups costing no more than individuals. Regarding a bet on one event given another event, only logarithmic versions preserve the probability of the given event. Logarithmic versions also preserve the conditional probabilities of other events, and so preserve conditional independence relations. Given logarithmic rules that elicit relative probabilities of base event pairs, it costs no more to elicit estimates on all combinations of these base events.
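For concreteness, here is a minimal sketch of a logarithmic market scoring rule in the standard cost-function formulation over a small joint-event space. The four-state example and parameter names are illustrative assumptions, not the paper's; the sketch numerically checks the property highlighted above, that a conditional bet on A given B leaves the probability of B unchanged.

```python
import math

# Logarithmic market scoring rule (LMSR), cost-function form.
# States are mutually exclusive outcomes; q[i] is the total quantity of
# outcome-i shares sold so far; b is the liquidity parameter.

def cost(q, b):
    # C(q) = b * ln( sum_i exp(q_i / b) )
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def prices(q, b):
    # Instantaneous price of outcome i, which is also the market probability:
    # p_i = exp(q_i / b) / sum_j exp(q_j / b)
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

# Four joint states over events A and B: (A&B, A&~B, ~A&B, ~A&~B)
b = 10.0
q = [0.0, 0.0, 0.0, 0.0]
print("initial probabilities:", prices(q, b))   # uniform: 0.25 each

# A conditional bet on "A given B": buy d shares of A&B and sell shares of ~A&B
# chosen so that exp(q_i/b) summed over the B-states is unchanged.
d = 5.0
s = math.exp(q[0] / b) + math.exp(q[2] / b)           # B-states' share of the sum
delta_notA_B = b * math.log(s - math.exp((q[0] + d) / b)) - q[2]

q_new = [q[0] + d, q[1], q[2] + delta_notA_B, q[3]]
p_old, p_new = prices(q, b), prices(q_new, b)

print("trade cost:", cost(q_new, b) - cost(q, b))      # ~0: a called-off bet
print("P(B) before:", p_old[0] + p_old[2], " after:", p_new[0] + p_new[2])
print("P(A|B) before:", p_old[0] / (p_old[0] + p_old[2]),
      " after:", p_new[0] / (p_new[0] + p_new[2]))
```

The trade costs nothing up front and pays off only in the B-states, so it is a bet on A that is called off if B fails; as the printed probabilities show, it moves P(A|B) while leaving P(B) fixed, which is the preservation property the abstract attributes to the logarithmic rule.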
The regularities in nature, simply by being regularities, call for explanation. There are only two ways in which we could, with any plausibility, try to explain them. One way would be to suppose that they are imposed on the world by God. The other would be to suppose that they reflect the presence of laws of nature, conceived of as forms of natural necessity. But the only way of making sense of the notion of a law of nature, thus conceived, is by construing a law as the causing of the associated regularity, and the only remotely plausible account of such causing would be in terms of the agency of God. So, by whichever route, we are led to the conclusion that the regularities are brought about by God. So the presence of the regularities in nature provides us with a strong case for accepting the existence of God.