This paper attempts to articulate certain inadequacies in the traditional way of categorizing Indian philosophy and explores alternative approaches, some of which are not otherwise explicitly seen in treatises on the history of Indian philosophies. By categorization, I mean the classification of Indian philosophy into two streams, traditionally called astika and nastika, or orthodox and heterodox systems. The schools of the astika and nastika darsanas are usually numbered at six and three respectively: Nyaya-Vaisesika, Sankhya-Yoga, and Purva and Uttara Mimamsa are identified as astika darsanas, while Carvaka, Buddhism, and Jainism are identified as nastika darsanas (6+3). It is my endeavor to critically analyze this usual astika-nastika distinction and 6+3 classification of Indian philosophy so as to uncover the rationale behind the categorization. This general consensus is contested in this paper. What is intended to support and strengthen such a critical analysis and exploration is a discussion of these systems of Indian philosophy within the general intellectual milieu of Indian cultural traditions, with their orientations, presuppositions, and preferences. In order to carry out such a task, I shall take recourse to the theories of different scholars, both traditional and modern, who approach and appropriate Indian philosophy from different perspectives, and their critical-creative approaches shall be scrutinized.
In this essay I attempt to show the limitations of analytic thinking and the kinds of dead ends into which such analyses may lead us in the philosophy of sport. As an alternative, I argue for a philosophy of complementation and compatibility in the face of what appear to be exclusive alternatives. This is a position that is sceptical of bifurcations and other simplified portrayals of reality but does not dismiss them entirely. A philosophy of complementation traffics in the realm of ambiguities, paradoxes, differences by degree, tendencies, mixtures, polarities, tensions, complexes, transitions and all other forms of messiness. I note that this position has been generated, in part, by work conducted in the empirical sciences and that complementation provides a paradigm that is useful across the academic disciplines. To show the ways in which analytic thinking leads to dead ends, I analyse the epistemological debate over 'broad internalism' engaged in by Russell (1999, 2004), Dixon (2003), Simon (2000, 2004) and Morgan (2004). Evidence for the claim that they reached a mostly unhelpful stalemate is based on the fact that they did not provide any third option and moreover that the analytic tools and ground rules they employ prevent its discovery. I suggest that all four authors are comfortable with the analytic tendency to bifurcate reality and require choices among exclusionary alternatives. I also claim that they treat reason as if it were generated by a 'mind from nowhere'. Philosophical anthropology, I suggest, provides much-needed somatic grounding that would rein in excessively optimistic views of reason (Dixon, Simon and Russell) or excessively plastic interpretations of mind (Morgan). It can also provide evidence that could help us understand why hominids (even modern ones) are so attracted to dichotomies and why we have so much trouble in reconciling apparent incompatibilities.
In a brilliant series of essays, the distinguished philosopher D. Z. Phillips explores the alternatives for faith after foundationalism. A significant exploration of post-foundationalist thought in its own right, Faith After Foundationalism is also an important evaluation and critique of the theological implications of the views of Alvin Plantinga, Richard Rorty, George Lindbeck, and Peter Berger. Phillips’s own position is that one must resist the philosopher’s tendency to turn religious mystery into epistemological mystery. To understand how religious concepts are formed is to understand that to speak of God as “beyond mortal telling” is not to confess a failure of language. God’s hiddenness is part of our concept of him—a reflection of the mystery of human life as it is lived. Faith After Foundationalism will be essential reading for philosophers of religion and theologians, as well as for students of contemporary epistemology.
Richard Levins has advocated the scientific merits of qualitative modeling throughout his career. He believed an excessive and uncritical focus on emulating the models used by physicists and maximizing quantitative precision was hindering biological theorizing in particular. Greater emphasis on qualitative properties of modeled systems would help counteract this tendency, and Levins subsequently developed one method of qualitative modeling, loop analysis, to study a wide variety of biological phenomena. Qualitative modeling has been criticized for being conceptually and methodologically problematic. As a clear example of a qualitative modeling method, loop analysis shows this criticism is indefensible. The method has, however, some serious limitations. This paper describes loop analysis and its limitations, and attempts to clarify the differences between quantitative and qualitative modeling, in content and objective. Loop analysis is but one of numerous types of qualitative analysis, so its limitations do not detract from the currently underappreciated and underdeveloped role qualitative modeling could have within science.
Jaegwon Kim’s exclusion argument is a general ontological argument, applicable to any properties deemed supervenient on a microproperty basis, including biological properties. It implies that the causal power of any higher-level property must be reducible to the subset of the causal powers of its lower-level properties. Moreover, as Kim’s recent version of the argument indicates, a higher-level property can be causally efficacious only to the extent of the efficacy of its micro-basis. In response, I argue that the ontology that aims to capture experimentally based explanations of metabolic control systems and morphogenetic systems must involve causally relevant contextual properties. Such an ontology challenges the exclusiveness of micro-based causal efficacy that grounds Kim’s reductionism, since configurations themselves are inherently causally efficacious constituents. I anticipate and respond to the reductionist’s objection that the nonreductionist ontology’s account of causes and inter-level causal relations is incoherent. I also argue that such an ontology is not open to Kim’s overdetermination objection.
Epistemic closure has been a central issue in epistemology over the last forty years. According to versions of the relevant alternatives and subjunctivist theories of knowledge, epistemic closure can fail: an agent who knows some propositions can fail to know a logical consequence of those propositions, even if the agent explicitly believes the consequence (having “competently deduced” it from the known propositions). In this sense, the claim that epistemic closure can fail must be distinguished from the fact that agents do not always believe, let alone know, the consequences of what they know—a fact that raises the “problem of logical omniscience” that has been central in epistemic logic.

This paper, part I of II, is a study of epistemic closure from the perspective of epistemic logic. First, I introduce models for epistemic logic, based on Lewis’s models for counterfactuals, that correspond closely to the pictures of the relevant alternatives and subjunctivist theories of knowledge in epistemology. Second, I give an exact characterization of the closure properties of knowledge according to these theories, as formalized. Finally, I consider the relation between closure and higher-order knowledge. The philosophical repercussions of these results and results from part II, which prompt a reassessment of the issue of closure in epistemology, are discussed further in companion papers.

As a contribution to modal logic, this paper demonstrates an alternative approach to proving modal completeness theorems, without the standard canonical model construction. By “modal decomposition” I obtain completeness and other results for two non-normal modal logics with respect to new semantics. One of these logics, dubbed the logic of ranked relevant alternatives, appears not to have been previously identified in the modal logic literature. More broadly, the paper presents epistemology as a rich area for logical study.
According to the Relevant Alternatives (RA) Theory of knowledge, knowing that something is the case involves ruling out (only) the relevant alternatives. The conception of knowledge in epistemic logic also involves the elimination of possibilities, but without an explicit distinction, among the possibilities consistent with an agent’s information, between those relevant possibilities that an agent must rule out in order to know and those remote, far-fetched or otherwise irrelevant possibilities. In this article, I propose formalizations of two versions of the RA theory. Doing so clarifies a famous debate in epistemology, pitting Fred Dretske against David Lewis, about whether the RA theorist should accept the principle that knowledge is closed under known implication, familiar as the K axiom in epistemic logic. Dretske’s case against closure under known implication leads to a study of other closure principles, while Lewis’s defense of closure by appeal to the claimed context sensitivity of knowledge attributions leads to a study of the dynamics of context. Having followed the first lead at length in other work, here I focus more on the second, especially on logical issues associated with developing a dynamic epistemic logic of context change over models for the RA theory.
At several points in his later writings Wittgenstein discusses imaginary forms of life and ways of thinking that appear queer or alien from our point of view; concepts so different from ours that those who think from within them seem to be alternatives to us. In this paper I argue that reflection on the notions of difference and possibility in play here shows that imaginary cases of alien conceptual schemes or forms of life such as those considered by Wittgenstein are not all cases of concepts that are entirely unintelligible for us; rather they may represent possible, albeit distant, ways of thinking for us. Such cases serve to aid imaginative reflection on our own case(s). By making us appreciate the possibility of the strange, they help us better to appreciate the arbitrariness of the familiar. I end the paper by considering what the implications of this reading of Wittgenstein's position might be for Donald Davidson's rejection of conceptual relativism.
Introduction -- Overview of the contemporary global context : life stories -- Data on poverty, hunger, and inequality in an age of globalization -- The goals and structure of this book -- Development theory and practice : an overview -- Origins of the concept of development -- Modernization theory -- Modernization theory and U.S. aid policy -- The impact of modernizationist development -- Structuralist economic theories -- Dependency theories -- Basic needs approach -- New international economic order -- Alternative development -- The impact of reformist thought on development policy -- Neoliberal resurgence and structural adjustment policies -- Current debates in development studies -- The failures of modernizationist development : a closer look -- The impacts of colonialism and slavery -- Post-WW II development policies and the third world debt crisis -- Consequences of debt and structural adjustment -- Responses to the debt crisis -- United States opposition to social change in the third world -- Summary of major structural influences on the third world -- Catholic social teaching and development -- CST prior to Pope John XXIII -- Early reflections on development : John XXIII and Vatican II -- The pivotal contributions of Paul VI, the Latin American bishops, and justice in the world -- John Paul II : the centrality of solidarity -- The social ethics of Benedict XVI -- Summary of Catholic social teaching on development issues -- Catholic social teaching and political economy : neoconservative and radical critiques -- Neoconservative reflections on CST -- Radical reflections on CST -- Evaluation of neoconservative, radical, and CST views -- Grassroots critics of development and neoliberal globalization -- Rejecting the quest for development -- Vandana Shiva : the violence of development and reductionist science -- Further issues in the development/globalization debates -- Reclaiming the commons : the positive visions of development critics -- Catholic social teaching, the radical tradition, and development critics -- Grassroots action and policy alternatives -- Grassroots organizations in the third world : an overview -- The impact of grassroots organizations -- Development policies : follow the NIC model -- Alternative development policies -- Differing visions : alternative development vs. regeneration -- Prospects for the adoption of alternative policies -- Re-envisioning Catholic social teaching -- The contributions of CST to the development debate -- Enhancing Catholic social teaching -- Structural analysis of capitalism -- Women, development, and CST -- CST, modernization, and cultural diversity -- CST and ecology -- CST, grassroots movements, and social struggle -- The church and social change -- Social criticism and pioneering creativity : how Christians can constructively address issues of development and globalization -- Education -- Lifestyle choices -- Responsible purchasing -- Responsible investment -- Organizing, activism, and aid provision -- Direct service/solidarity -- Responsible parenting -- Applying CST in the life of the church -- Concluding reflections -- Theological epilogue : the path of discipleship.
In this paper, I work through the possible contours of an anti-genocide based on a framework informed by the work of Giorgio Agamben. Such a framework posits the inherent need to circumvent sovereign power within any form of normative activism. To begin, I show how the nascent anti-genocide movement promotes an ideal in which 'Western' states, particularly the USA, accept the global responsibility to protect persecuted life beyond national boundaries. Using Agamben, I argue that this vision also entails an acceptance of a sovereign framework for the valuation of life, thus failing to confront the inherent power of the sovereign to condemn life in the first place. I then highlight the limitations that Agamben's ontology places on us in dealing with this inherent problem within the sovereign-subject relationship. By positing an alternative ontology, I suggest the possibility of establishing communities of solidarity that challenge the sovereign's self-ascribed role as the absolute valuator of life. Counter to Agamben, I argue that the basis for such communities could be a dedication to the universal sacredness of human life, which is maintained independently of, and in challenge to, sovereign power.
Human beings, even very young infants, and members of several other species, exhibit remarkable capacities for attending to and engaging with others. These basic capacities have been the subject of intense research in developmental psychology, cognitive psychology, comparative psychology, neuroscience, and philosophy of mind over the last several decades. Appropriately characterizing the exact level and nature of these abilities and what lies at their basis continues to prove a tricky business. The contributions to this special issue investigate whether and to what extent the exercise of such capacities counts as, or is best explained by, a genuine understanding of minds, where such understanding depends on the creatures in question possessing capacities for attributing a range of mental states and their contents in systematic ways. The question that takes center stage is: Do the capacities for attending to and engaging with others in question involve mindreading, or is this achieved by other means? In this editorial we will review the state of the debate between mindreading and alternative accounts of social cognition. The issue is organized as follows: the first two papers review the experimental literature on mindreading in primates (Bermúdez) and children (Low & Wang), and the kinds of arguments made for mindreading and alternative accounts of social cognition. The next set of papers (Hedger & Fabricius, Lurz & Krachun, Zawidzki, and de Bruin et al.) further critique the existing experimental data and defend various mindreading and non-mindreading accounts. The final set of papers address further issues raised by phenomenological (Jacob, Zahavi), enactive (Michael), and embodied (Spaulding) accounts of social cognition.
Scientists have shown that the practice of factory farming is an increasingly urgent danger to human health, the environment, and nonhuman animal welfare. For all these reasons, moral agents must consider alternatives. Vegetarian food production, humane food animal farming, and in-vitro meat production are all explored from a variety of ethical perspectives, especially utilitarian and rights-based viewpoints, all in the light of current U.S. and European initiatives in the public and private sectors. It is concluded that vegetarianism and potentially in-vitro meat production are the best-justified options.
Many political philosophers hold the Feasible Alternatives Principle (FAP): justice demands that we implement some reform of international institutions P only if P is feasible and P improves upon the status quo from the standpoint of justice. The FAP implies that any argument for a moral requirement to implement P must incorporate claims whose content pertains to the causal processes that explain the current state of affairs. Yet, philosophers routinely neglect the need to attend to actual causal processes. This undermines their arguments concerning moral requirements to reform international institutions. The upshot is that philosophers’ arguments must engage in causal analysis to a greater extent than is typical.

[Supplement: Handout available at http://db.tt/fyuVW3Xv]
The "non-identity argument" has been applied to reject the validity of claims for historic justice, often generating highly unintuitive conclusions. George Sher has suggested a solution to this problem, explaining the harm to descendants of historically wronged peoples as deriving not from the historic wrongs but from the failure to provide rectification to the previous generation for harm they suffered. That generation was likewise owed rectification for harm they suffered from failure to provide rectification to the generation preceding them. In this chain of injustices each failure to provide rectification to one is the source of wrongful harm to the next. Such chains form a "bridge" between the historic wrong and the harm suffered by living individuals. I call this approach the subsequent-wrong solution (SWS). I argue that bypassing the non-identity argument in this way is problematic. First, SWS cannot justify rectification in seemingly legitimate historic-justice claims, such as historic wrongs generating delayed harms that skip generations. Second, SWS justifies rectification for the wrong reasons, denying the essence of historic-justice claims: that past wrongs, for which original wrongdoers are responsible, harm descendants of original victims. Finally, SWS does not fully account for group membership's role in historic injustice, unable to distinguish between claims of descendants of historic victims and claims made by others with unrelated interests in the rectification of the previous generation. A supplementary solution is needed, focusing on the role of group harm and group membership. The plausibility of this approach, tying individual harm to group harm, derives from these three limitations of the subsequent-wrong solution. I give a rudimentary account of what such a solution would look like.
In various debates about science, appeal is made to the freedom of scientific research. A rationale in favor of this freedom is rarely offered. In this paper, two major arguments are reconstructed that promise to lend support to a principle of scientific freedom. According to the epistemological argument, freedom of research is required in order to organize the collective cognitive effort we call science efficiently. According to the political argument, scientific knowledge needs to be generated in ways that are independent of the major political powers because of the important role it plays for the citizens and their capacity to form well-informed political preferences. Both arguments are examined critically in order to identify their strengths and limitations. I argue that the scientific freedom established by both rests on a number of critical preconditions, and that the arguments’ force must be weighed against competing societal interests and values in each case of their application. Appeal to a principle of scientific freedom should therefore never mark the end, but rather the beginning of a public debate about the ends and means of science.
Elsewhere I have argued that the most significant threat to scientific realism arises from what I call the problem of unconceived alternatives: the repeated failure of past scientists and scientific communities to even conceive of alternatives to extant scientific theories, even when such alternatives were both (1) well-confirmed by the evidence available at the time and (2) sufficiently scientifically serious as to be actually embraced in the course of further investigation. In this paper I explore Francis Galton’s development and defense of his “stirp” theory of inheritance and conclude that this particular historical example offers impressive support for the challenge posed by the problem of unconceived alternatives while simultaneously showing how we can make that challenge deeper and sharper.
The original evidence advanced to support the Tensed-S Condition (TSC) and the Specified Subject Condition (SSC) in Chomsky's Conditions on Transformations is reconsidered and viable alternatives to these constraints are provided. It is shown that TSC and SSC, in some instances, lead to a loss of linguistically significant generalization. Satisfactory alternatives can account for the relevant range of data and provide a more general account of additional data. Finally, counterevidence to Subjacency and Superiority is adduced, but explicit alternatives to these conditions are not offered.
In the development field, one of the major shortcomings of mainstream development theories and models is their relative indifference toward environmental concerns. However, the worsening environmental catastrophes and the growing environmental consciousness led to the emergence of a new model of development known as "sustainable development." The proponents of sustainable development tend to explore the environmental costs of development activities, prescribe environment-friendly policies, suggest institutional and legal measures for environmental protection, and publicize the principles of sustainable development through international forums and publications. Despite this recognition of the environment-development relationship, the model of sustainable development suffers from certain serious shortcomings that need to be addressed. This article begins with a brief discussion of various forms of environmental challenges to development, followed by an analysis of how the model of sustainable development articulates the environment-development linkages in both practical and intellectual terms. The final section of the paper critically examines the major limitations of the model in dealing with the environmental question, and makes some suggestions in this regard.
As the world population is growing and government directives tell us to consume more fatty acids, the demand for fish is increasing. Due to declines in wild fish populations, we have come to rely more and more on aquaculture. Despite rapid expansion of aquaculture, this sector is still in a relatively early developmental stage. This means that this sector can still be steered in a favorable direction, which requires discussion about sustainability. If we want to avoid problems similar to the ones we have experienced with livestock farming, we need to generate knowledge of the biology, profitability, environmental aspects, consumer awareness, and product appreciation of particular fish species. However, the discussion about a sustainable aquaculture also raises the question of how we should treat fish. This moral question is regularly addressed as a problem of applied ethics with a focus on tailoring ethical principles to practical questions. In this article we do not deny the importance of the practical accounts, but we start from the fundamental question whether and why fish matter in our moral deliberations, i.e., from the discussion on moral status. We elaborate the distinction between moral considerability and moral significance in order to show both the importance and the limitations of the discussion about moral status for practical problems in aquaculture. We illustrate these points with a case study about the farming of a specific fish species, the African catfish.
The Human Genome Project (HGP) represents a massive merging of science and technology in the name of all humanity. While the disease aspects of HGP-generated data have received the greatest publicity and are the strongest rationale for the project, it should be remembered that the HGP has, as its goal, the sequencing of all 100,000 human genes and the accurate depiction of the ancestral and functional relationships among these genes. The HGP will thus be constructing the molecular taxonomic norm for humanity. Currently the HGP genomic baseline is almost exclusively skewed toward North Atlantic European lineages through the extensive use of the Centre d’Étude du Polymorphisme Humain (CEPH) data set. More recently, the HGP has shifted to the use of volunteer donors since adequate informed consent had not been secured from the CEPH families. No evidence exists that either the CEPH families or the current volunteers are the most appropriate demographic or evolutionary lineages for the functional genomic studies that will guide production of new DNA-based drugs, targeted therapeutics and gene-based diagnostics. The lack of scientific representativeness of the HGP is a serious impediment to its broad applicability. Yet this can be remedied, and five alternative sampling strategies are presented. In response to the current exclusionary design of the HGP, there is noteworthy caution and skepticism in the African American community concerning genetic studies. The Manifesto on Genomic Studies Among African Americans reflects both a desire to be systematically included in federally funded genomic studies and a desire to maintain some control over the interpretation and application of research results. Representative sampling in the HGP is seen as an international human rights issue with domestic ethical implications.
Researchers in medical education have extensively studied negative reactions to gross anatomy, sometimes grouped under the term the cadaver experience. Although there has been disagreement about the extent and importance of such phenomena, several attempts at curricular reform have been designed to humanize the student-cadaver encounter. However, some obvious sources linking gross anatomy and the humanities have been consistently overlooked. Such sources (from the history of art, the history of anatomy, and autobiographical and imaginative literature) not only bear witness to the cadaver experience for anatomists of the past, but also offer forgotten alternatives for placing present-day reactions in perspective. Former methods of teaching which used such material might serve as models for reintegrating the humanities into the study of gross anatomy as a possible humanizing force.
This article is the second one in a series dealing with mental health ethics in Cuba. It reports on ethical dilemmas, resources and limitations to their resolution, and recommendations for action. The data, obtained through individual interviews and focus groups with 28 professionals, indicate that Cubans experience dilemmas related to (a) the interests of clients, (b) their personal interests, and (c) the interest of the state. These conflicts are related to power differentials among (a) clients and professionals, (b) professionals from various disciplines, and (c) professionals and organizational authorities. Resources to solve ethical dilemmas include government support, ethics committees, and collegial dialogue. Limitations include minimal training in ethics, lack of safe space to discuss professional disagreements, and little tolerance for criticism. Recommendations to address ethical dilemmas include better training, implementation of a code of ethics, and provision of safe space to discuss ethical dilemmas. The findings are discussed in light of the role of power in applied ethics.
There is a wealth of experimental data showing that the way a problem is framed may have an effect on people's choices and decisions. Based on a semantic analysis of evaluative expressions like ‘good’, I propose a new explanation of such framing effects. The key idea is that our choices and decisions reveal a counterfactual systematicity: they carry information about the choices and decisions we would have made if the facts had been otherwise. It is these counterfactual alternatives that may diverge between otherwise equivalent versions of the same task, and thus explain the effects of framing.
Peter Stastny and Peter Lehmann's Alternatives beyond Psychiatry offers a comprehensive and up-to-date account of the alternatives to mainstream psychiatry that are being developed by service consumers and survivors across the world. As psychiatry moves into a new age less dominated by a biomedical paradigm, many of the approaches described in this book may be adopted by mainstream health services. This is a hugely readable and accessible book for professionals and consumers alike.
This paper examines the relationship between perceptual knowledge and discrimination in the light of the so-called ‘relevant alternatives’ intuition. It begins by outlining an intuitive relevant alternatives account of perceptual knowledge which incorporates the insight that there is a close connection between perceptual knowledge and the possession of relevant discriminatory abilities. It is argued, however, that in order to resolve certain problems that face this view, it is essential to recognise an important distinction between favouring and discriminating epistemic support that is often overlooked in the literature. This distinction complicates the story regarding how an alternative becomes relevant, and in doing so weakens the connection between perceptual knowledge and discrimination. The theory that results, however—what I term a ‘two-tiered’ relevant alternatives theory of perceptual knowledge—accommodates many of our intuitions about perceptual knowledge and so avoids the revisionism of some recent proposals in the epistemological literature.
Sherri Roush and I have each argued independently that the most significant challenge to scientific realism arises from our inability to consider the full range of serious alternatives to a given hypothesis we seek to test, but we diverge significantly concerning the range of cases in which this problem becomes acute. Here I argue against Roush's further suggestion that the atomic hypothesis represents a case in which scientific ingenuity has enabled us to overcome the problem, showing how her general strategy is undermined by evidence I have already offered in support of what I have called the 'problem of unconceived alternatives'. I then go on to show why her strategy will not generally (if ever) allow us to formulate and test exhaustive spaces of hypotheses in cases of fundamental scientific theorizing.
There are not enough solid organs available to meet the needs of patients with organ failure. Thousands of patients every year die on the waiting lists for transplantation. Yet there is one currently available, underutilized, potential source of organs. Many patients die in intensive care following withdrawal of life-sustaining treatment whose organs could be used to save the lives of others. At present the majority of these organs go to waste. In this paper we consider and evaluate a range of ways to improve the number and quality of organs available from this group of patients. Changes to consent arrangements (for example, conscription of organs after death) or changes to organ donation practice could dramatically increase the numbers of organs available, though they would conflict with currently accepted norms governing transplantation. We argue that one alternative, Organ Donation Euthanasia, would be a rational improvement over current practice regarding withdrawal of life support. It would give individuals the greatest chance of being able to help others with their organs after death. It would increase patient autonomy. It would reduce the chance of suffering during the dying process. We argue that patients should be given the choice of whether and how they would like to donate their organs in the event of withdrawal of life support in intensive care. Continuing current transplantation practice comes at the cost of death and prolonged organ failure. We should seriously consider all of the alternatives.
Traditionally, skeptics as well as their opponents have agreed that in order to know that p one must be able, by some preferred means, to rule out all the alternatives to p. Recently, however, some philosophers have attempted to avert skepticism not (merely) by weakening the preferred means but rather by articulating a subset of the alternatives to p — the so-called relevant alternatives — and insisting that knowledge that p requires only that we be able (by the preferred means) to rule out members of the set. In this paper I argue that a precise formulation of this new approach reveals it inadequate as a solution to skepticism.
The main characteristics of a philosophy meant as an activity which is not essentially different from science but deals with questions which go beyond the limits of present sciences are the following: 1) Philosophy is an investigation of the world. It is aimed at dealing with major issues and is justified only insofar as it deals with them. 2) Philosophy provides a global view; it is not limited to sectorial questions. So there cannot be a philosophy of mathematics alone, or physics alone, or biology alone, and so on. 3) Being an investigation of the world, philosophy aims at knowledge. Therefore questions about knowledge are central in philosophy. 4) Philosophy is continuous with the sciences. Its objectives are not essentially different from those of the sciences. 5) Philosophy makes use of the results of the sciences. This is not accessory to it; it is essential for its progress. 6) The method of philosophy is essentially the same as that of the sciences. 7) Philosophy seeks new knowledge. Seeking new knowledge is part of its deepest nature. 8) Philosophy seeks new discovery methods. Seeking new knowledge, it also seeks new methods to obtain it. 9) Philosophy tries unexplored routes and, by so doing, it may even give origin to new sciences. Its greatest value consists in this. 10) Philosophy makes use of the experience of philosophers of the past. For this may help us to understand where certain ideas lead, sparing us routes which have already proved fruitless. 11) A conclusive solution of philosophical problems is impossible. Their solutions are always provisional and are bound to be replaced sooner or later by others. Progress exists everywhere, even in philosophy. 12) Philosophy has no specific field of its own, nor specific techniques of its own. But because it moves on unexplored ground, it is at the same time always exposed to the risk of failure but also capable of surprising developments, originating new sciences.
I consider the implications of incommensurability for the assumption, in rational choice theory, that a rational agent’s preferences are complete. I argue that, contrary to appearances, the completeness assumption and the existence of incommensurability are compatible. Indeed, reflection on incommensurability suggests that one’s preferences should be complete over even the incommensurable alternatives one faces.
By deliberation we understand practical reasoning with an end in view of choosing some course of action. Integral to it is the agent's sense of alternative possibilities, that is, of two or more courses of action he presumes are open for him to undertake or not. Such acts may not actually be open in the sense that the deliberator would do them were he to so intend, but it is evident that he assumes each to be so. One deliberates only by taking it for granted that both performing and refraining from any of the acts under consideration are possible for one, and that which is to be selected is something entirely up to oneself. What is it for a course of action to be presumed as open, or for several courses of action to present themselves as a range of open alternatives? Answering these questions is essential for an understanding of deliberation and choice and, indeed, for the entire issue of free will and responsibility. According to one common view, a deliberator takes the considered options to be open only by assuming he is free to undertake any of them and, consequently, that whichever he does undertake is, as yet, a wholly undetermined matter. Built into the structure of deliberation, on this theory, is an indeterministic bias relative to which any deliberator with deterministic beliefs is either inconsistent or condemned to a fatalistic limbo. An unmistakable challenge is thereby posed: is there an alternative conception of the presuppositions underlying deliberation more congenial to a deterministic perspective yet adequate to the data? Convinced that there is, I develop a partial account of deliberation that, though highly similar to the aforementioned view, diverges at a critical juncture.
This paper discusses various problems of explanations by mechanisms. Two positions are distinguished: the narrow position claims that only explanations by mechanisms are acceptable. It is argued that this position leads to an infinite regress because the discovery of a mechanism must entail the search for other mechanisms, and so on. Another paradoxical consequence of this postulate is that every successful explanation by mechanisms is unsatisfactory because it generates new 'black box' explanations. The second, liberal, position that is advanced in this paper regards as meaningful not only explanations by mechanisms but also the discovery of bivariate correlations, as a first step towards an explanation by mechanisms. It is further argued that there is no contradiction between causal analysis and the explanation by mechanisms. Instead, explanations by mechanisms always presuppose the analysis of causal structures (but not vice versa). The final point is that an explanation by mechanisms is not inconsistent with the Hempel-Oppenheim scheme of explanation.
The incredible achievements of modern scientific theories lead most of us to embrace scientific realism: the view that our best theories offer us at least roughly accurate descriptions of otherwise inaccessible parts of the world like genes, atoms, and the big bang. In Exceeding Our Grasp, Stanford argues that careful attention to the history of scientific investigation invites a challenge to this view that is not well represented in contemporary debates about the nature of the scientific enterprise. The historical record of scientific inquiry, Stanford suggests, is characterized by what he calls the problem of unconceived alternatives. Past scientists have routinely failed even to conceive of alternatives to their own theories and lines of theoretical investigation, alternatives that were both well-confirmed by the evidence available at the time and sufficiently serious as to be ultimately accepted by later scientific communities. Stanford supports this claim with a detailed investigation of the mid-to-late 19th century theories of inheritance and generation proposed in turn by Charles Darwin, Francis Galton, and August Weismann. He goes on to argue that this historical pattern strongly suggests that there are equally well-confirmed and scientifically serious alternatives to our own best theories that remain currently unconceived. Moreover, this challenge is more serious than those rooted in either the so-called pessimistic induction or the underdetermination of theories by evidence, in part because existing realist responses to these latter challenges offer no relief from the problem of unconceived alternatives itself.
Stanford concludes by investigating what positive account of the spectacularly successful edifice of modern theoretical science remains open to us if we accept that our best scientific theories are powerful conceptual tools for accomplishing our practical goals, but abandon the view that the descriptions of the world around us that they offer are therefore even probably or approximately true.
Fred Dretske holds that if one knows something, one need not eliminate every alternative to it but only the relevant alternatives. Besides defending this view in "The Pragmatic Dimension of Knowledge" (Philosophical Studies, 40, 363–378, 1981), he makes some tentative suggestions about determining when an alternative is relevant. I discuss these suggestions and conclude that there are problems yet to be solved. I do not conclude that there are insoluble problems or that Dretske's approach is on the wrong track. It is, I believe, on the right track.
We present a model of a fundamental property of consciousness as the capacity of a system to opt among presented alternatives. Any system possessing this capacity is "conscious" in some degree, whether or not it has the higher capacity of reflecting on its opting. We argue that quantum systems, composed of microphysical particles, as studied by quantum mechanics, possess this quality in a protomental form. That is, such particles display the capacity to opt among alternatives, even though they lack the ability to experience or communicate their experiences. Human consciousness stands at the opposite end of the hierarchy of conscious life forms as the most sophisticated system of which we have direct acquaintance. We contend that it shares the common characteristic of a system capable of opting among alternatives. Because the fundamental property of consciousness is shared by human beings and the constituents of elementary matter in the universe, our model of consciousness can be considered as a modified form of panpsychism.
This paper analyzes Deborah Mayo's recent criticism of the use-novelty requirement. She claims that her severity criterion captures actual scientific practice better than use-novelty, and that use-novelty is not a necessary condition for severity. Even though there are certain cases in which evidence used in the construction of a hypothesis can nevertheless test that hypothesis severely, I do not think that her severity criterion fits better with our intuition about good tests than use-novelty. I argue for this by showing a parallelism in terms of severity between the confidence interval case and what she calls 'gellerization'. To account for the difference between these cases, we need to take into account certain additional considerations, such as a systematic neglect of relevant alternatives.
The purpose of this paper is to describe some limitations on scientific behaviorist and computational models of the mind. These limitations stem from the inability of either model to account for the integration of experience and behavior. Behaviorism fails to give an adequate account of felt experience, whereas the computational model cannot account for the integration of our behavior with the world. Both approaches attempt to deal with their limitations by denying that the domain outside their limits is a part of psychology. These attempts to turn the shortcomings of the two models into virtues would be more convincing if their limitations were not diametrically opposed. I will argue that in each case the limitations are too restrictive unless the theories are augmented by physiology.
Pathologies of Rational Choice Theory is a valuable survey and critique of research in the rational choice tradition, but one that slights that tradition's past and potential contributions to the study of politics. The authors rightly note limitations of rational choice theory but understate what it has to offer political scientists, for they fail to focus clearly on its essentials; adopt too narrow a basis for evaluating scholarship; and wrongly identify rational choice theory with the shortcomings of some scholarship that makes use of it.
In earlier work I have argued that the most substantial threat to scientific realism arises from the problem of unconceived alternatives: the repeated failure of past scientists and scientific communities to conceive of alternatives to extant scientific theories, even when such alternatives were both (1) well confirmed by the evidence available at the time and (2) sufficiently scientifically serious as to be later embraced by actual scientific communities. In this paper I explore Charles Darwin's development and defense of his ‘pangenesis’ theory of inheritance and conclude that this particular historical example offers impressive support for the challenge posed to realism by this problem of unconceived alternatives. Sections: Introduction; Darwin and pangenesis: the search for the material basis of generation and heredity; A crucial unconceived alternative: common-cause mechanisms of inheritance; Galton and common-cause inheritance; Conclusion.
Informed consent is recognized as a primary ethical requirement for conducting research involving humans. In investigations using human biological material, informed consent (IC) assumes a differentiated condition on account of the many future possibilities. This work presents suitable alternatives for IC regarding the storage and use of human biological material in research, according to new Brazilian regulations. Both norms – Resolution 441/11 of the National Health Council, approved on 12 May 2011, and Ordinance 2.201 (National Guidelines for Biorepositories and Biobanks of Human Biological Material for Research Purpose) of the Brazilian Ministry of Health, approved on 14 September 2011 – state that the consent of subjects for the collection, storage and use of samples stored in Biobanks is necessarily established by means of a Free and Informed Consent Form (ICF). In order to obtain individual and formal statements, this form should contain the following two mutually exclusive options: an explanation about the use of the stored material in each research study, or the need for new consent, or the waiver thereof, when the material is used for a new study. On the other hand, an ICF suitable for Biorepositories must be exclusive and related to a specific research project. Although Brazilian and international regulations identify the main aspects to be included in the IC, efforts are still necessary to improve the consent process, so that the document becomes a bond of trust between subject and researcher.
If, as it is usually understood, incommensurable theories must be compatible, then one need never choose between two such theories. But if theories were incompatible and incommensurable, one would have to choose between them. What if they are incompatible only outside the domain of observation? The fact that Darwin's biology can clash with Kelvin's physics (each with their respective auxiliary assumptions) regarding the age of the earth shows how commensurable theories may yet be incompatible. But it also shows that they need not be alternatives—i.e. that one may not be able to simply and satisfactorily replace the other in our world view. But standard examples of scientific revolutions consist of the replacement of one theory by another in one's world view. These alternative theories must therefore be more than merely incompatible—what do they share if not content? (That is, they must be commensurable.)
This study discusses the relationship between Green Chemistry and Environmental Sustainability as expressed in textbooks and articles on Green Chemistry authored by their promoters. It was found that although the Brundtland concept of Sustainable Development/Sustainability has been mentioned often by green chemists, a full analysis of that relationship was almost never attempted. In particular, green chemists have paid scarce attention to the bearing of the Second Law of thermodynamics on Environmental Sustainability and to the consequences of the limitations it imposes on Green Chemistry, which are discussed in this paper.
This paper focuses on one matter that poses a problem for both human judges and standard probability frameworks, namely the assumption of a unique (privileged) and complete partition of the state-space of possible events. This is tantamount to assuming that we know all possible outcomes or alternatives in advance of making a decision, but it is clear that there are many practical situations in prediction, diagnosis, and decision-making where such partitions are contestable and/or incomplete. The paper begins by surveying the impact of partitions on the choice of priors in formal probabilistic updating frameworks, and on human subjective probability judgements. That material is followed by an overview of strategies for dealing with partition dependence, including considerations of how a rational agent's preferences may determine the choice of a partition.
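The partition dependence of priors mentioned in this abstract can be made concrete with a minimal sketch (the function name and the colour example are illustrative assumptions, not drawn from the paper): under the principle of indifference, the prior assigned to the same event depends on how the state space is carved up.

```python
from fractions import Fraction

def indifference_prior(partition):
    """Assign equal probability to each cell of a partition
    (principle of indifference). Illustrative sketch only."""
    n = len(partition)
    return {cell: Fraction(1, n) for cell in partition}

# Coarse partition: the unknown object is red or not red.
coarse = indifference_prior(["red", "not-red"])
# Finer partition of the same state space.
fine = indifference_prior(["red", "blue", "green"])

print(coarse["red"])  # 1/2
print(fine["red"])    # 1/3
```

The same event ("the object is red") receives probability 1/2 under one partition and 1/3 under another, which is the kind of contestability the paper surveys.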
In earlier work I have argued that the most substantial threat to scientific realism arises from the problem of unconceived alternatives: the repeated failure of past scientists and scientific communities to conceive of alternatives to extant scientific theories, even when such alternatives were both (1) well confirmed by the evidence available at the time and (2) sufficiently scientifically serious as to be later embraced by actual scientific communities. In this paper I explore Charles Darwin's development and defense of his 'pangenesis' theory of inheritance and conclude that this particular historical example offers impressive support for the challenge posed to realism by this problem of unconceived alternatives.
Caplan & Waters propose a dedicated linguistic working memory to handle “interpretive” language comprehension, but there are data suggesting that more general working memory capacity can predict syntactic comprehension difficulty, and their claims depend on the existence of a principled distinction between “interpretive” and “post-interpretive” processes, which seems unlikely. Other conceptions of the source of individual differences also deserve consideration, since they offer more flexible explanations of the phenomena.
Overall, most of the reviewers agree that Principles of Brain Evolution was a welcome addition to the field, and kindly describe it as carefully researched and lucidly written. Thereafter, they note some gaps – principally, adaptive scenarios, microevolutionary studies, and computational models. I here admit to those deficiencies but explain why they exist and how they might be filled. In addition, one commentator criticizes my analysis of hominin brain evolution, and another finds my principle of “large equals well-connected” to be inconsistent with the data. I rebut those two critiques. Hopefully, this process of critique and counterpoint will stimulate some readers to pursue the mentioned thoughts and to engage in new research.
Tomasello et al. have not characterized the motivation underlying shared intentionality, and we hope to encourage research on this topic by offering comparative paradigms and specific empirical questions. Although we agree that nonhuman primates differ greatly from us in terms of shared intentionality, we caution against concluding that they lack all aspects of it before other empirical tools have been exhausted. In addition, identifying the conditions in which humans spontaneously engage in shared intentionality, and the conditions in which we fail, will more fully characterize this ability.
In a recent paper, Martin Hackl and I identified a variety of circumstances where scalar implicatures, questions, definite descriptions, and sentences with the focus particle only are absent or unacceptable (Fox and Hackl 2006, henceforth F&H). We argued that the relevant effect is one of maximization failure (MF): an application of a maximization operator to a set that cannot have the required maximal member. We derived MF from our hypothesis that the set of degrees relevant for the semantics of degree constructions is always dense (the Universal Density of Measurement, UDM). The goal of this paper is to present an apparent shortcoming of F&H and to argue that it is overcome once certain consequences of the proposal are shown to follow from more general properties of MF. Specifically, the apparent problem comes from evidence that the core generalizations argued for in F&H extend to areas for which an account in terms of density is unavailable. Nevertheless, I will argue that the account could still be right. Certain dense sets contain "too many alternatives" for there to be a maximal member, thus leading to MF. But, there are other sets that lead to the same predicament. My goal will be to characterize a general signature of MF in the hope that it could be used to determine the identity of alternatives in areas where their identity is not clear on independent grounds.
There is a widespread belief among intellectuals that the domain of philosophy shrinks as the domain of the special sciences expands, and that someday, science might swallow up philosophy entirely. Some philosophical naturalists think that this day may have already arrived. These naturalists believe that philosophy’s methodology should be the same as that of natural science; they imply that philosophy has no distinctive “armchair” methodology of its own.
As the worlds of economics, politics, culture, and communications face a growing wave of globalization that will likely continue, ethical challenges for journalists have also gone global. I propose a clear division between ethics codes for media owners, the public, and professional journalists and present a set of considerations and specific rules applicable only to the last group. In this article I advocate a universal code of journalistic ethics but point out problems and warn against dangers that have made the application of such codes difficult in the past. A universal code should consider the voluntary nature of such an endeavor, the cultural and economic differences in various journalistic traditions, and the problem of producing solutions acceptable to all involved.
A comparison is made of the traditional Loschmidt (reversibility) and Zermelo (recurrence) objections to Boltzmann's H-theorem, and its simplified variant in the Ehrenfests' 1912 wind-tree model. The little-cited 1896 (pre-recurrence) objection of Zermelo (similar to an 1889 argument due to Poincaré) is also analysed. Significant differences between the objections are highlighted, and several old and modern misconceptions concerning both them and the H-theorem are clarified. We give particular emphasis to the radical nature of Poincaré's and Zermelo's attack, and the importance of the shift in Boltzmann's thinking in response to the objections as a whole.
Naturalized epistemology—the recent attempt to transform the theory of knowledge into a branch of natural science—is often criticized for dispensing with the distinctively philosophical content of epistemology. In this dissertation, I argue that epistemologists are correct to reject naturalism, but that new arguments are needed to show why this is so. I establish my thesis first by evaluating two prominent varieties of naturalism—optimistic and pessimistic—and then by offering a proposal for how a new version of non-naturalistic epistemology must move forward. Optimistic naturalism attempts to use scientific methods to give positive answers to traditional epistemological questions. Epistemologists, for example, are urged to draw on psychology and evolutionary biology in order to show our beliefs are justified. I argue that this project fails. First, the naturalist’s thesis that theory is underdetermined by evidence poses difficulties for the optimist’s attempt to show that our beliefs are justified, even according to naturalized standards. Second, while critics usually contest naturalists’ logical right to use the concept of normative justification, I suggest that a deeper problem is with the naturalists’ use of the concept of belief. Naturalistic philosophy of mind, while perhaps acceptable for other purposes, does not deliver a concept of “belief” consistent with the constraints and needs of naturalized epistemology. Pessimistic naturalism—Quine’s project—takes it for granted that “belief” is problematic and logical justification elusive, and instead offers a pragmatic account of the development of our theory of the world.
This project, while deeply unsatisfactory to the traditional epistemologist, also faces the challenge of privileging scientific discourse over other pragmatically successful modes of discourse. Whatever its merits, we can undermine its motivation by challenging the underdetermination thesis it rests on. We can do this by appealing to facts about scientific practice that undermine the conception of confirmation driving the thesis, by appealing to other facts about scientific practice, and by challenging some philosophical preconceptions, in order to make room for a new brand of inductivist foundationalism.
There is a growing movement to increase access to palliative care by declaring it a human right. Calls for such a right—in the form of articles in the healthcare literature and pleas to the United Nations and World Health Organization—rarely define crucial concepts involved in such a declaration, in particular ‘palliative care’ and ‘human right’. This paper explores how such concepts might be more fully developed, the difficulties in using a human rights approach to promote palliative care, and the relevance of such an enterprise to public health ethics.
This paper begins with a discussion of the value of privacy, especially for medical records in an age of advancing technology. I then examine three alternative approaches to protection of medical records: reliance on governmental guidelines, the use of corporate self-regulation, and my own third hybrid view on how to maintain a presumption in favor of privacy with respect to medical information, safeguarding privacy as vigorously and comprehensively as possible, without sacrificing the benefits of new information technology in medicine. None of the three models I examine are unproblematic, yet it is crucial to weigh the strengths and weaknesses of these alternative approaches.
For decisions between many alternatives, the benchmark result is Hick's Law: that response time increases log-linearly with the number of choice alternatives. Even when Hick's Law is observed for response times, divergent results have been observed for error rates—sometimes error rates increase with the number of choice alternatives, and sometimes they are constant. We provide evidence from two experiments that error rates are mostly independent of the number of choice alternatives, unless context effects induce participants to trade speed for accuracy across conditions. Error rate data have previously been used to discriminate between competing theoretical accounts of Hick's Law, and our results question the validity of those conclusions. We show that a previously dismissed optimal observer model might provide a parsimonious account of both response time and error rate data. The model suggests that people approximate Bayesian inference in multi-alternative choice, except for some perceptual limitations.
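The log-linear relationship invoked in this abstract is conventionally written RT = a + b·log2(n + 1), where n is the number of choice alternatives. A minimal sketch of that prediction (the intercept a and slope b below are arbitrary illustrative values, not parameters from the paper):

```python
import math

def hick_rt(n, a=0.2, b=0.15):
    """Predicted mean response time (seconds) under Hick's Law:
    RT = a + b * log2(n + 1). a and b are illustrative parameters."""
    return a + b * math.log2(n + 1)

# Response time grows linearly in log2(n + 1):
# each doubling of (n + 1) adds roughly b seconds.
for n in (1, 3, 7, 15):
    print(n, round(hick_rt(n), 3))
```

Note that the law constrains response times only; as the abstract stresses, it makes no direct prediction about error rates, which is why divergent error-rate results can coexist with robust Hick's Law data.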
Self-regulation exists at the firm level, the industry level, and the business level of economic organization. Industry self-regulation has faced economic (free rider) and legal (antitrust) impediments to widespread implementation, although there exist examples of effective industry self-regulation, e.g., the securities industry and the SEC, advertising and the FTC. By instituting industry codes of conduct, national trade associations have been shown to be natural vehicles for self-regulation. While there has been long-standing general encouragement for establishing industry codes, adopting and enforcing conduct codes has been seriously circumscribed by restrictive Supreme Court decisions and FTC advisory opinions. One approach to clearing legal confusion is to petition the FTC to issue an industry guide on promulgating and enforcing trade association codes of conduct. Another strategy is to utilize a stakeholder approach to association ethics committee appointments that subsequently influence code creation and enforcement. Finally, a new concept of an industry code of conduct will consist of three subcodes: an economic code, an environmental code, and a socio-political code. Combined, these strategic approaches will offer new opportunities for effective nonmarket regulation.
Using the example of the treatment of menopause-related vegetative and emotional disturbances, the author examines the effectiveness of Ignatia amara-containing complex homeopathic remedies (IACCHR) as an alternative to placebo. Substantial improvement in psychological and psychosomatic symptoms was observed. Climacteric complaints diminished or disappeared completely in the majority of women (95.7% by patient evaluation and 96.2% by physician evaluation). Compared to standard pharmaceuticals, IACCHR treatment was better tolerated and a lower risk of side effects was observed. The results obtained in this work indicate the significant therapeutic potential of this group of treatments, which is in line with the therapeutic effect of the placebo. Nevertheless, the demonstration of specific effects in pharmacological tests disqualifies the investigated treatments from use in a clinical trial in place of a placebo.
Springer link: http://www.springer.com/philosophy/logic+and+philosophy+of+language/book/978-94-007-6090-5 This volume examines the limitations of mathematical logic and proposes a new approach to logic intended to overcome them. To this end, the book compares mathematical logic with earlier views of logic, both in the ancient and in the modern age, including those of Plato, Aristotle, Bacon, Descartes, Leibniz, and Kant. From the comparison it is apparent that a basic limitation of mathematical logic is that it narrows the scope of logic, confining it to the study of deduction, without providing tools for discovering anything new. As a result, mathematical logic has had little impact on scientific practice. Therefore, this volume proposes a view of logic according to which logic is intended, first of all, to provide rules of discovery, that is, non-deductive rules for finding hypotheses to solve problems. This is essential if logic is to play any relevant role in mathematics, science and even philosophy. To comply with this view of logic, this volume formulates several rules of discovery, such as induction, analogy, generalization, specialization, metaphor, metonymy, definition, and diagrams. A logic based on such rules is basically a logic of discovery, and involves a new view of the relation of logic to evolution, language, reason, method and knowledge, particularly mathematical knowledge. It also involves a new view of the relation of philosophy to knowledge. This book puts forward such new views, trying to reopen many doors that the founding fathers of mathematical logic had historically closed.
I argue that defenders of general duties of species preservation are faced with an impossible task. I distinguish derivative from non-derivative value and argue that the derivative value of species can yield only limited and contingent duties of preservation. There can be no general duty of species preservation unless all species have non-derivative value. Ongoing controversy over the 'species' notion has not deterred some from claiming settled authority for whatever notion appears most conducive to their favored account of species value. This is a mistake. The actual task is to state biologically plausible criteria for a 'species' notion and to make the case that these criteria demarcate something of moral value. I argue that the task is made impossible by the same basic biological facts that led Darwin to the view that species are "merely artificial combinations made for convenience."
In defending his interest-relative account of knowledge in Knowledge and Practical Interests (2005), Jason Stanley relies heavily on intuitions about several bank cases. We experimentally test the empirical claims that Stanley seems to make concerning our common-sense intuitions about these bank cases. Additionally, we test the empirical claims that Jonathan Schaffer seems to make in his critique of Stanley. We argue that our data impugn what both Stanley and Schaffer claim our intuitions about such cases are. To account for these results, one must develop a better conception of the connection between a subject's interests and her body of knowledge than those offered by Stanley and Schaffer.
In current debates about moral responsibility, it is common to differentiate two fundamentally different incompatibilist positions: Leeway Incompatibilism and Source Incompatibilism. The present paper argues that this is a bad dichotomy. Those forms of Leeway Incompatibilism that have no appeal to ‘origination’ or ‘ultimacy’ are problematic, which suggests that incompatibilists should prefer Source Incompatibilism. Two sub-classifications of Source Incompatibilism are then differentiated: Narrow Source Incompatibilism holds that alternative possibilities are outside the scope of what is required for moral responsibility, and Wide Source Incompatibilism holds that while ultimacy is most fundamental to moral responsibility, an agent meeting the ultimacy condition will also have alternative possibilities, thereby also satisfying an alternative possibilities condition. The present paper argues that the most promising incompatibilist positions will be versions of Wide Source Incompatibilism.
Global Economy, Global Justice explores a vital question that is suppressed in most economics texts: "what makes for a good economic outcome?" Neoclassical theory embraces the normative perspective of "welfarism" to assess economic outcomes. This volume demonstrates the fatal flaws of this perspective--flaws that stem from objectionable assumptions about human nature, society and science. Exposing these failures, the book obliterates the ethical foundations of global neoliberalism. George DeMartino probes heterodox economic traditions and philosophy in search of an ethically viable alternative to welfarism. Drawing on the work of Amartya Sen, DeMartino proposes the egalitarian principle of the "global harmonization of capabilities" to guide economics. This principle provides a basis for resisting oppression the world over while nevertheless demanding respect for cultural diversity. DeMartino puts this principle to work adjudicating contemporary debates over global policy regimes, and completes the book with a set of deeply egalitarian global policies for the year 2025. Global Economy, Global Justice's engaging prose will appeal to those seeking to understand the intersection between economics and political philosophy. Its focus on the normative foundations of contemporary policy disputes makes it unique in the literature on globalization.
For contemporary democratic theorists, democracy is largely a matter of deliberation. But the recent rise of deliberative democracy (in practice as well as theory) coincided with ever more prominent identity politics, sometimes in murderous form in deeply divided societies. This essay considers how deliberative democracy can process the toughest issues concerning mutually contradictory assertions of identity. After considering the alternative answers provided by agonists and consociational democrats, the author makes the case for a power-sharing state with attenuated sovereignty and a more engaged deliberative politics in a public sphere that is semidetached from the state and situated transnationally.
Common morality theory must confront apparent counterexamples from the history of morality, such as the widespread acceptance of slavery in prior eras, that suggest core norms have changed over time. A recent defense of common morality theory addresses this problem by drawing a distinction between the content of the norms of the common morality and the range of individuals to whom these norms apply. This distinction is successful in reconciling common morality theory with practices such as slavery, but only at the cost of underscoring the limits of common morality theory, in particular its inability to resolve disputes about the moral status of entities. Given that many controversies in bioethics center on the disputed status of various entities, such as embryos and nonhuman animals, this is an important limitation. Nonetheless, common morality theory still can be a useful resource in diminishing moral conflict on issues that do not involve disputes over moral status.
There is no satisfactory account for the general phenomenon of confabulation, for the following reasons: (1) confabulation occurs in a number of pathological and non-pathological conditions; (2) impairments giving rise to confabulation are likely to have different neural bases; and (3) there is no unique theory explaining the aetiology of confabulations. An epistemic approach to defining confabulation could solve all of these issues, by focusing on the surface features of the phenomenon. However, existing epistemic accounts are unable to offer sufficient conditions for confabulation and tend to emphasise only its epistemic disadvantages. In this paper, we argue that a satisfactory epistemic account of confabulation should also acknowledge those features which are (potentially) epistemically advantageous. For example, confabulation may allow subjects to exercise some control over their own cognitive life which is instrumental to the construction or preservation of their sense of self.
Explaining the transition from a signed to a spoken protolanguage is a major problem for all gestural theories. I suggest that Arbib's improved “beyond the mirror” hypothesis still leaves this core problem unsolved, and that Darwin's model of musical protolanguage provides a more compelling solution. Second, although I support Arbib's analytic theory of language origin, his claim that this transition is purely cultural seems unlikely, given its early, robust development in children.
Rational choice theory enjoys unprecedented popularity and influence in the behavioral and social sciences, but it generates intractable problems when applied to socially interactive decisions. In individual decisions, instrumental rationality is defined in terms of expected utility maximization. This becomes problematic in interactive decisions, when individuals have only partial control over the outcomes, because expected utility maximization is undefined in the absence of assumptions about how the other participants will behave. Game theory therefore incorporates not only rationality but also common knowledge assumptions, enabling players to anticipate their co-players' strategies. Under these assumptions, disparate anomalies emerge. Instrumental rationality, conventionally interpreted, fails to explain intuitively obvious features of human interaction, yields predictions starkly at variance with experimental findings, and breaks down completely in certain cases. In particular, focal point selection in pure coordination games is inexplicable, though it is easily achieved in practice; the intuitively compelling payoff-dominance principle lacks rational justification; rationality in social dilemmas is self-defeating; a key solution concept for cooperative coalition games is frequently inapplicable; and rational choice in certain sequential games generates contradictions. In experiments, human players behave more cooperatively and receive higher payoffs than strict rationality would permit. Orthodox conceptions of rationality are evidently internally deficient and inadequate for explaining human interaction. Psychological game theory, based on nonstandard assumptions, is required to solve these problems, and some suggestions along these lines have already been put forward.
Key Words: backward induction; Centipede game; common knowledge; cooperation; epistemic reasoning; game theory; payoff dominance; pure coordination game; rational choice theory; social dilemma.
The perceptual symbol approach to knowledge representation combines structured frames and dynamic imagery. The perceptual symbol approach provides a good account of the representation of scientific models, of some types of naive theories held by children and adults, and of certain reconstructive memory phenomena. The ontological status of perceptual symbols is unclear and this form of representation does not succeed in accounting for all forms of human knowledge.
In his recent paper on the symmetry problem Roni Katzir argues that the only relevant factor for the calculation of any Quantity implicature is syntactic structure. I first refute Katzir’s thesis with three examples that show that structural complexity is irrelevant to the calculation of some Quantity implicatures. I then argue that it is inadvisable to assume—as Katzir and others do—that exactly one factor is relevant to the calculation of any Quantity implicature.
We contrast three decision rules that extend Expected Utility to contexts where a convex set of probabilities is used to depict uncertainty: Γ-Maximin, Maximality, and E-admissibility. The rules extend Expected Utility theory in that they require an option to be inadmissible if there is another that carries greater expected utility for each probability in a (closed) convex set. If the convex set is a singleton, then each rule agrees with maximizing expected utility. We show that, even when the option set is convex, this pairwise comparison between acts may fail to identify those acts which are Bayes for some probability in a convex set that is not closed. This limitation affects two of the decision rules but not E-admissibility, which is not a pairwise decision rule. E-admissibility can be used to distinguish between two convex sets of probabilities that intersect all the same supporting hyperplanes.
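The three decision rules named in this abstract can be illustrated with a minimal sketch. The acts, payoffs, and the interval of probabilities below are hypothetical choices for illustration, not taken from the paper; the convex set is approximated by a finite grid, which behaves like a closed set, so Maximality and E-admissibility coincide here, whereas the paper's point about their divergence concerns convex sets that are not closed.

```python
# Illustrative sketch (hypothetical example): Gamma-Maximin, Maximality,
# and E-admissibility over a grid approximating the convex probability
# set {p : 0.3 <= P(state0) <= 0.7} for a two-state decision problem.

# Acts as payoff vectors: (utility in state 0, utility in state 1).
acts = {"safe": (1.0, 1.0), "risky": (0.0, 3.0), "dominated": (0.5, 0.9)}

# Finite grid standing in for the convex set of probabilities.
probs = [0.3 + 0.01 * i for i in range(41)]

def eu(act, p):
    """Expected utility of an act when P(state0) = p."""
    u0, u1 = act
    return p * u0 + (1 - p) * u1

# Gamma-Maximin: choose the act with the best worst-case expected utility.
gamma_maximin = max(acts, key=lambda a: min(eu(acts[a], p) for p in probs))

# Maximality: keep acts not strictly beaten at EVERY p by some other act.
maximal = [a for a in acts
           if not any(all(eu(acts[b], p) > eu(acts[a], p) for p in probs)
                      for b in acts if b != a)]

# E-admissibility: keep acts that are Bayes (maximize expected utility)
# for at least ONE probability in the set.
e_admissible = [a for a in acts
                if any(eu(acts[a], p) >= max(eu(acts[b], p) for b in acts)
                       for p in probs)]
```

With these payoffs, "safe" is the Γ-Maximin choice, while "safe" and "risky" are both maximal and both E-admissible; "dominated" is excluded by every rule, since "safe" beats it at every probability in the set.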
This paper develops a semantical, model-theoretic account of (logical) content, complementing the syntactically specified account of content developed in A New Theory of Content I, JPL 23: 596–620, 1994. Proofs of Completeness are given for both propositional and quantificational languages (without identity). Means for handling a quantificational language with identity are also explored. Finally, this new notion of content is compared, in respect of both logical properties and philosophical applications, to alternative partitions of the standard consequence class relation proposed by Stelzner, Schurz and Weingartner.
This paper argues that even the most extensively refined comparative cost/benefit analysis must be supplemented by other factors, irreducible to it, if we are to develop an adequate framework to guide policy decisions affecting technological design and innovation.
Testing the validity of knowledge requires formal expression of that knowledge. Formality of an expression is defined as the invariance, under changes of context, of the expression's meaning, i.e. the distinction which the expression represents. This encompasses both mathematical formalism and operational determination. The main advantages of formal expression are storability, universal communicability, and testability. They provide a selective edge in the Darwinian competition between ideas. However, formality can never be complete, as the context cannot be eliminated. Primitive terms, observation set-ups, and background conditions are inescapable parts of formal or operational definitions, that all refer to a context beyond the formal system. Heisenberg's Uncertainty Principle and Gödel's Theorem provide special cases of this more universal limitation principle. Context-dependent expressions, on the other hand, have the benefit of being more flexible, intuitive and direct, and putting less strain on memory. It is concluded that formality is not an absolute property, but a context-dependent one: different people will apply different amounts of formality in different situations or for different purposes. Some recent computational and empirical studies of formality and contexts illustrate the emerging scientific investigation of this dependence.
The micro-regional focus of bioregionalism is a small unit of physical space, typically a watershed region. In bioregional discourse, natural systems become metaphors for cultural coherence. However, when we look for laws embedded in the natural world, those that are found do not then reveal themselves as principles which apply to systems of culture. Further, within most individuals, the sense of regional identity spans several scales because our past narratives and present affiliations span several localities. Humans are not immersed in singular niches, nor is the bioregionalist an existential, primordial localist, for his or her choice has been crafted.
Judgements of the value or likelihood of a focal object or outcome have been shown to vary dramatically as a function of whether judgement is based on selective or comparative processing. This article explores the question of when selective versus comparative processing is likely, and demonstrates that as motivation and opportunity to process information carefully (operationalised as accountability and time pressure, respectively) decrease, the likelihood of selective processing increases. Moreover, we document how individuals manage to render judgements when in selective processing mode by relying on categorical knowledge.
Some scientists believe that although evolutionary theory is explanatory, it does not have, in contrast to the theories of physics, any predictive power. This raises the question of its testability. The analysis given shows that there are good reasons to claim the unpredictability of evolutionary events; nevertheless, the evolutionary theory has potential predictive power. It is argued that the difference between biology and physics lies not in the predictive power of the theories involved, but in the different weight which is lent to the forecasting of particular events in these sciences. A second source of confusion derives from the ambiguity of the term 'prediction'. In order to define 'prediction' for cases in which the term is used to refer to a part of testing procedure, the reference to the time-point "now" is quite irrelevant. Prediction of unknown observational data is sufficient for testing a hypothesis, but such prediction may or may not be identical with forecasting of future events. Different factors that may cause particular difficulties met by biologists in forecasting future events are analyzed subsequently in the second part of the paper. The conclusion is drawn that although particular cognitive situations limiting the ability of forecasting are very frequent in biological sciences, the claim about the peculiar logical status of biological theories is not thereby justified.
This volume contains eighteen essays by established and younger historians that examine non-democratic alternative political systems and ideologies--oligarchies, monarchies, mixed constitutions--along with diverse forms of communal and regional associations such as ethnoi, amphiktyonies, and confederacies. The papers, which span the length and breadth of the Hellenic world, highlight the immense political flexibility and diversity of ancient Greek civilization.
Certain features of perception – the quale red, for example, and other qualia – must be regarded as additions to the materialist neurophysiological picture of perception. The perception of three-dimensional volumetric objects can also be seen as a qualitative addition to the neurophysiological processes in the brain, possibly without additions to the information content.
Thomas, Bracken, and Timimi (2012) make an important contribution in critiquing the extent to which the profession of psychiatry can be so bureaucratic that patients are treated as problems to be solved in an ‘efficient’ assembly line fashion rather than as individual persons. The trouble with bureaucracies is that they promote a cold and impersonal accounting approach in which critical reflection on purposes is circumvented by decision-making algorithms (Zachar and Bartlett 2009). Psychotherapy treatment manuals definitely satisfy the bureaucratic instinct, and the fifteen-minute medication management session even more so (Harris 2011). Ideally, evidence-based medicine (EBM) should be used to promote the goals of ...
Climate policy decisions are decisions under uncertainty and are, therefore, based on a range of future climate scenarios, describing possible consequences of alternative policies. Accordingly, the methodology for setting up such a scenario range becomes pivotal in climate policy advice. The preferred methodology of the Intergovernmental Panel on Climate Change will be characterised as "modal verificationism"; it suffers from severe shortcomings which disqualify it for scientific policy advice. Modal falsificationism, as a more sound alternative, would radically alter the way the climate scenario range is set up. Climate science's inability to find robust upper bounds for future temperature rise in line with modal falsificationism does not disprove that methodology; rather, this very fact prescribes even more drastic efforts to curb CO2 emissions than currently proposed.
The Socratic method has a long history in teaching philosophy and mathematics, marked by such names as Karl Weierstraß, Leonard Nelson and Gustav Heckmann. Its basic idea is to encourage the participants of a learning group (of pupils, students, or practitioners) to work on a conceptual, ethical or psychological problem by their own collective intellectual effort, without a textual basis and without substantial help from the teacher, whose part it is mainly to enforce the rigid procedural rules designed to ensure a fruitful, diversified, open and consensus-oriented thought process. Several features of the Socratic procedure, especially in the canonical form given to it by Heckmann, are highly attractive for the teaching of medical ethics in small groups: the strategy of starting from relevant singular individual experiences, interpreting and cautiously generalizing them in a process of inter-subjective confrontation and confirmation, the duty of non-directivity on the part of the teacher in regard to the contents of the discussion, the necessity, on the part of the participants, to make explicit both their own thinking and the way they understand the thought of others, the strict separation of content level and meta level discussion and, not least, the wise use made of the emotional and motivational resources developing in the group process. Experience shows, however, that the canonical form of the Socratic group suffers from a number of drawbacks which may be overcome by loosening the rigidity of some of the rules. These concern mainly the injunction against substantial interventions on the part of the teacher and the insistence on consensus formation rooted in Leonard Nelson's Neo-Kantian Apriorism.
Olympism is among other things a peaceful philosophy. In practice, this means that the most important task for a researcher who studies the peace movement in the Olympic Games is to examine how peace movements have developed within the Games. The development of the peace movement can be verified by analyzing the torch relay, the opening ceremony, and the Olympic Truce Resolution in particular. The purpose of this paper is to evaluate the validity of these peace movements in recent Olympic Games and to clarify the problems arising from the implementation of the peace movement in the Olympic Games.