John Foster presents a clear and powerful discussion of a range of topics relating to our understanding of the universe: induction, laws of nature, and the existence of God. He begins by developing a solution to the problem of induction - a solution whose key idea is that the regularities in the workings of nature that have held in our experience hitherto are to be explained by appeal to the controlling influence of laws, as forms of natural necessity. His second line of argument focuses on the issue of what we should take such necessitational laws to be, and whether we can even make sense of them at all. Having considered and rejected various alternatives, Foster puts forward his own proposal: the obtaining of a law consists in the causal imposing of a regularity on the universe as a regularity. With this causal account of laws in place, he is now equipped to offer an argument for theism. His claim is that natural regularities call for explanation, and that, whatever explanatory role we may initially assign to laws, the only plausible ultimate explanation is in terms of the agency of God. Finally, he argues that, once we accept the existence of God, we need to think of him as creating the universe by a method which imposes regularities on it in the relevant law-yielding way. In this new perspective, the original nomological-explanatory solution to the problem of induction becomes a theological-explanatory solution. The Divine Lawmaker is bold and original in its approach, and rich in argument. The issues on which it focuses are among the most important in the whole epistemological and metaphysical spectrum.
John Foster addresses the question: what is it to perceive a physical object? He rejects the view that we perceive such objects directly, and argues for a new version of the traditional empiricist account, which locates the immediate objects of perception in the mind. But this account seems to imply that we do not perceive physical objects at all. Foster offers a surprising solution, which involves embracing an idealist view of the physical world.
A World for Us aims to refute physical realism and establish in its place a form of idealism. Physical realism, in the sense in which John Foster understands it, takes the physical world to be something whose existence is both logically independent of the human mind and metaphysically fundamental. Foster identifies a number of problems for this realist view, but his main objection is that it does not accord the world the requisite empirical immanence. The form of idealism that he tries to establish in its place rejects the realist view in both its aspects. It takes the world to be something whose existence is ultimately constituted by facts about human sensory experience, or by some richer complex of non-physical facts in which such experiential facts centrally feature. Foster calls this phenomenalistic idealism. He tries to establish a specific version of such phenomenalistic idealism, in which the experiential facts that centrally feature in the constitutive creation of the world are ones that concern the organization of human sensory experience. The basic idea of this version is that, in the context of certain other constitutively relevant factors, this sensory organization creates the physical world by disposing things to appear systematically world-wise at the human empirical viewpoint. Chief among these other relevant factors is the role of God as the one who is responsible for the sensory organization and ordains the system of appearance it yields. It is this that gives the idealistically created world its objectivity and allows it to qualify as a real world.
Aproximaciones a la escuela francesa de epistemología. The problems that dominate epistemology can be placed in historical context as a form of philosophical rationality. Throughout its history, philosophy has presented itself as a discourse in which its various components (metaphysics, ontology, theory of knowledge, ethics, logic, etc.) appeared united in the mould of the "unity of knowledge." Within this unitary framework, one of the forms of philosophical knowledge usually holds a dominant position. The emphasis placed on the unity of philosophical knowledge, or on "the unity of human thought," is an inheritance that philosophical thought receives from its mythical-theological roots. Over the history of philosophy, this vision underwent a process of secularization whereby the dominant instance passed from theology to metaphysics, and from metaphysics to the theory of knowledge. Between the nineteenth and twentieth centuries this process witnessed a further shift, installing epistemology as the dominant instance of philosophical rationality. This succession should be seen as a consequence of the social functionalization of systems of belief (ideology), which causes them, at a certain point, to become an obstacle to the production of new knowledge. In order to develop, new knowledge is thus forced to bring about restructurings of the philosophical field, whether by replacing the dominant instance, by incorporating or creating new forms of philosophical knowledge (as in the case of epistemology), or through the relative marginalization of others. It is a complex process (neither linear nor one-to-one), in which one can nevertheless discern a schema of the temporal succession of the philosophical forms that dominate the supposed "unity of human thought" (philosophy). What we have just described is a slow process of substitution and replacement in the kind of guarantees that philosophy is required to produce. Some moments, such as the eclipse of the guarantees of faith that came with the rise of modern philosophy, might at first sight look like counterexamples to this conception of the evolution of philosophical knowledge. One might indeed believe that, with the constitution of autonomous spheres of discourse (theology, science, philosophy), a discourse of a different nature broke away from philosophical discourse: science. A closer look, however, reveals a different landscape, since this transformation was accompanied, first, by the appearance of a new dominant instance for the unification of philosophical knowledge: the search for a new kind of guarantees, those of the origin and foundation of knowledge, that is, those of the theory of knowledge, within which a further displacement finally took place with the constitution, at the end of the nineteenth century and the beginning of the twentieth, of the "philosophy of science" or epistemology. This model for conceptualizing the development of philosophical discourse has the advantage of allowing us to think about the relation that epistemology bears to the dominant instance of philosophical knowledge within which it develops: the theory of knowledge.
Starting from the relations that epistemology bears to the theme of the guarantees of knowledge, we can capture, in a heuristic schema that will be made more complex in various ways in this book, the difference between the characteristic theses of Anglo-Saxon epistemology and of French epistemology. According to this heuristic schema, the most characteristic feature of Anglo-Saxon epistemology is its subjection, for most of its development, to the theory of knowledge, which shows itself in the persistence of some aspects of the philosophy of representation and in the reproduction of the idealist opposition between subject and object as two poles whose harmony would have to be established, philosophically, in terms of truth. French epistemology, by contrast, set itself the task of studying the mechanisms by which knowledge is produced. Epistemology, from this perspective, was no longer seen primarily as the study of the foundations of scientific knowledge, but as the theory of the conditions and forms of scientific practice, and of the history of that practice as it appears in the different concrete sciences. Put another way, the contrast could be drawn by observing that while the Anglo-Saxons do philosophy of science as an extension of logic, the French do it as an extension of the history of science, that is, finding in history the laboratory of the epistemologist. Now, as we shall see, the field of French epistemology has sheltered a good number of debates that turn primarily on two tendencies in tension: one that emphasizes the autonomy of the epistemological, and one that stresses the social determination of thought. The essays in this book aim to problematize this and other axes, exploring the perspectives of the "classics" of the French school of epistemology (Bachelard, Canguilhem, Althusser, Foucault, etc.), the relations among them, and the dialogues that can be established between them and other currents of thought. CONTENTS: The epistemological break, from Bachelard to Balibar and Pêcheux, Pedro Karczmarczyk; The epistemological break according to Bachelard, Althusser and Badiou, Carlos Gassmann; Derridean visitations, Jazmín Anahí Acosta; Epistemology without a knowing subject: the overcoming, dissolution or subjection of subjectivity in Popper, Wittgenstein and Foucault, Silvia Rivera; The political torsion of the concept of truth in Michel Foucault, Manuel Cuervo Sola; Canguilhem and Foucault: from the biological norm to the political norm, Andrea Torrano; Psychology and ideology: Foucault, Canguilhem and Althusser, Matías Abeijón.
Dualism argues that the mind is more than just the brain. It holds that there exist two very different realms, one mental and the other physical. Both are fundamental and one cannot be reduced to the other - there are minds and there is a physical world. This book examines and defends the most famous dualist account of the mind, the Cartesian, which attributes the immaterial contents of the mind to an immaterial self. John Foster's new book exposes the inadequacies of the dominant materialist and reductionist accounts of the mind. In doing so he is in radical conflict with the current philosophical establishment. Ambitious and controversial, _The Immaterial Self_ is the most powerful and effective defence of Cartesian dualism since Descartes' own.
This is a reprint of Amartya Sen’s 1973 book on the measurement of inequality, plus an updated bibliography and index, and an annex by James Foster and Sen that summarizes and comments on the main developments since 1973. The book is superbly written and focuses on verbal discussion of the plausibility and significance of the conditions, theorems, and measures.
This discussion, featuring short comments by R. Melvin Keiser, Durwood Foster, Richard Gelwick and Donald Musser, grew out of articles in TAD 35:3 (2008-2009) on connections and disconnections between the thought of Polanyi and Tillich (featuring essays by Foster and Gelwick with a response from Musser). Keiser raises questions about perspectives articulated in the earlier articles and Foster, Gelwick and Musser respond here.
John Foster presents a penetrating investigation into the question: what is it to perceive a physical object? Is perceptual contact with a physical object, he asks, something fundamental, or does it break down into further factors? If the latter, what are these factors, and how do they combine to secure the contact? For most of the book, Foster addresses these questions in the framework of a realist view of the physical world. But the arguments which thereby unfold - arguments which undermine direct realism and establish a version of the sense-datum theory - lead to the conclusion that we do not perceive physical objects at all. The only way to avoid this conclusion is by abandoning physical realism for a form of idealism, and this is the option which Foster finally embraces. The Nature of Perception makes an important contribution to the ongoing debate: it sheds fresh light on the traditional issues, and breathes new life into positions which most current philosophers assume to be dead.
Environmental problems compel examination of three contrasting patterns of moral reasoning concerning the human relationship to nature: the currently implemented Progress Ethic, and the proposed alternatives of a Stewardship Ethic and Connection Ethic. But none of these deliver all they promise, whether in theory or practice or both, because all dubiously presume that moral reason is commensurate with nature, and that the value of natural entities is an intrinsic property. Matthew R. Foster argues that resolution of this crisis requires reaching beyond the limit of reason, and acknowledging value to be not a noun, but a verb about the incomparable relation of two entities.
For a stable visual world, the colours of objects should appear the same under different lights. This property of colour constancy has been assumed to be fundamental to vision, and many experimental attempts have been made to quantify it. I contend here, however, that the usual methods of measurement are either too coarse or concentrate not on colour constancy itself, but on other, complementary aspects of scene perception. Whether colour constancy exists other than in nominal terms remains unclear.
I want to examine a possible solution to the problem of induction - one which, as far as I know, has not been discussed elsewhere. The solution makes crucial use of the notion of objective natural necessity. For the purposes of this discussion, I shall assume that this notion is coherent. I am aware that this assumption is controversial, but I do not have space to examine the issue here.
The regularities in nature, simply by being regularities, call for explanation. There are only two ways in which we could, with any plausibility, try to explain them. One way would be to suppose that they are imposed on the world by God. The other would be to suppose that they reflect the presence of laws of nature, conceived of as forms of natural necessity. But the only way of making sense of the notion of a law of nature, thus conceived, is by construing a law as the causing of the associated regularity, and the only remotely plausible account of such causing would be in terms of the agency of God. So, by whichever route, we are led to the conclusion that the regularities are brought about by God. So the presence of the regularities in nature provides us with a strong case for accepting the existence of God.
This article reviews the use of implantable radiofrequency identification (RFID) tags in humans, focusing on the VeriChip (VeriChip Corporation, Delray Beach, FL) and the associated VeriMed patient identification system. In addition, various nonmedical applications for implanted RFID tags in humans have been proposed. The technology offers important health and nonhealth benefits, but raises ethical concerns, including privacy and the potential for coercive implantation of RFID tags in individuals. A national discussion is needed to identify the limits of acceptable use of implantable RFID tags in humans before their use becomes widespread and it becomes too late to prevent misuse of this useful but ethically problematic technology.
A key element of the distinction between explicit and implicit cognitive functioning is the presence or absence of conscious awareness. In this review, we consider the proposal that neuropsychological disorders can best be considered in terms of a decoupling between preserved implicit or unconscious processing and impaired explicit or conscious processing. Evidence for dissociations between implicit and explicit processes in blindsight, amnesia, object agnosia, prosopagnosia, hemi-neglect, and aphasia is examined. The implications of these findings for a) our understanding of a variety of neuropsychological disorders, b) the conceptualization of normal cognitive functioning, c) the neural basis of consciousness, and d) the clinical rehabilitation of brain-injured individuals are also discussed.
Strategies for protecting historically disadvantaged groups have been extensively debated in the context of genetic variation research, making this a useful starting point in examining the protection of social groups from harm resulting from biomedical research. We analyze research practices developed in response to concerns about the involvement of indigenous communities in studies of genetic variation and consider their potential application in other contexts. We highlight several conceptual ambiguities and practical challenges associated with the protection of group interests and argue that protectionist strategies developed in the context of genetic research will not be easily adapted to other types of research in which social groups are placed at risk. We suggest that it is this set of conceptual and practical issues that philosophers, ethicists, and others should focus on in their efforts to protect identifiable social groups from harm resulting from biomedical research.
This paper focuses on the concept of the reified and consensual universes in the theory of social representations, and the relationship between them. Having examined the different ways in which Moscovici discusses this concept, and the different ways in which these discussions have been interpreted, I will suggest that many of the criticisms levelled at this facet of social representations theory appear somewhat misplaced. However, it does seem that some aspects of the concept of the consensual and the reified universes are rather under-developed within the theory as a whole. It will be suggested that the recent addition of the notion of the representational project to the theory of social representations goes some way towards addressing some of these emergent issues, most notably with regards to the relationship between knowledges from different spheres. As an example, the area of representations of health and illness will be considered: drawing on work that takes the perspective of the sociology of scientific knowledge, it will be suggested that the theory of social representations needs to make the possibility of a bi-directional relationship between knowledge from the fields of scientific and common sense understanding more explicit.
There are three potential problems with using virtue theory to develop an environmental ethic. First, Aristotelian virtue theory is ratiocentric. Later philosophers have objected that Aristotle’s preference for reason creates a distorted picture of the human good. Overvaluing reason might well bias virtue theory against the value of non-rational beings. Second, virtue theory is egocentric. Hence, it is suited to developing a conception of the good life, but it is not suited to considering obligations to others. Third, virtue theory is notoriously bad at providing rules and procedures for resolving ethical questions about particular circumstances. But environmentalists need procedures for determining which of several conflicting values is most important. Virtue theory is not action guiding. I respond to each of these problems. I show that virtue theory is uniquely suited to answering ethical questions about nonhuman animals and the environment.
Marking the tercentenary of Berkeley's birth, this collection of previously unpublished essays covers such Berkeleian topics as: imagination, experience, and possibility; the argument against material substance; the physical world; idealism; science; the self; action and inaction; beauty; and the general good. Among the contributors are: Christopher Peacocke, Ernest Sosa, Margaret Wilson, C.C.W. Taylor, and J.O. Urmson.
I argue that the reflections on language in Adorno and Heidegger have their common root in a modernist problematic that dissected experience into ordinary experience, and transfiguring experiences that are beyond the capacity for expression of our language. I argue that Adorno’s solution to this problem is the more resolutely “modernist” one, in that Adorno is more rigorous about preserving the distinction between what can be said, and what strives for expression in language. After outlining the definitive statement of this problematic in Nietzsche’s early epistemological writings, I outline Heidegger’s solution and subsequently Adorno’s critique of Heidegger. Finally, I argue that situating Adorno within the modernist problem of language and expression is crucial for making sense of his philosophy as a form of critical theory.
This commentary seeks to place Farah's (1994) arguments in the historical context of ideas about mind-brain relationships. It further seeks to draw a conceptual parallel between the issues considered by Farah in her target article and questions which have concerned neuroscientists since the nineteenth century regarding the functional organization of the brain. Specific reference is made to the relationship between use of the concept of in cognitive neuropsychology and use of the concept of in neuroscience.
McTaggart raised a famed paradox regarding the transientist conception of time, the idea that the present moves into the future to overtake future events (or, alternatively, that future events move into the present) and past events recede further and further into the past as time goes on. Schlesinger has recently attempted an ingenious transientist solution to McTaggart's paradox. We will argue that Schlesinger's solution to McTaggart's paradox itself gives rise to a new, yet perfectly parallel, paradox which can only be resolved by abandoning the transientist view of time.
The paper builds an argument about empathy, kinesthesia, choreography, and power as they were constituted in early eighteenth century France. It examines the conditions under which one body could claim to know what another body was feeling, using two sets of documents – philosophical examinations of perception and kinesthesia by Condillac and notations of dances published by Feuillet. Reading these documents intertextually, I postulate a kind of corporeal episteme that grounds how the body is constructed. And I endeavor to situate this body within the colonial and expansionist politics of its historical moment.