The irreversibility effect implies that a decision maker who neglects the prospect of receiving more complete information at later stages of a sequential decision problem will in certain cases too easily take an irreversible decision, as he ignores the existence of a positive option value in favour of reversible decisions. This option value represents the decision maker's flexibility to adapt subsequent decisions to the obtained information. In this paper we show that the economic models dealing with irreversibility as used in environmental and capital investment decision making can be extended to emergency response decisions that produce important irreversible effects. In particular, we concentrate on the decision whether or not to evacuate an industrial area threatened by a possible nuclear accident. We show in a simple two-period evacuation decision model that non-optimal conclusions may be drawn when evacuation is regarded as a 'now or never' decision. The robustness of these results is verified by means of a sensitivity analysis of the various model parameters. The importance of 'options thinking' in this decision context is illustrated in an example.
Parameters of a Bertalanffy-type temperature-dependent growth model are fitted using data from a population of stone loach (Barbatula barbatula). Length data for this population were collected at a lowland stream in the central part of the Netherlands over two periods, in 1990 and 2010. The estimation of the maximum length of a fully grown individual is given special attention, because it is in fact obtained by extrapolation over a large interval of the entire lifetime. It is concluded that this parameter should not be fixed beforehand at one value for the population at that location, due to varying conditions over the years.
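The abstract does not reproduce the fitted equations. As a minimal sketch, assuming the standard (temperature-independent) von Bertalanffy form L(t) = L_inf * (1 - exp(-k * (t - t0))), the asymptotic length L_inf can be estimated from length-at-age data; the data points, parameter ranges, and starting values below are illustrative inventions, not taken from the study.

```python
import math

def von_bertalanffy(t, L_inf, k, t0=0.0):
    """Length at age t under the von Bertalanffy growth curve."""
    return L_inf * (1.0 - math.exp(-k * (t - t0)))

def sse(params, data):
    """Sum of squared residuals of the curve against (age, length) pairs."""
    L_inf, k = params
    return sum((L - von_bertalanffy(t, L_inf, k)) ** 2 for t, L in data)

# Synthetic length-at-age data (age in years, length in cm), illustrative only
data = [(1, 4.1), (2, 7.0), (3, 9.1), (4, 10.6), (5, 11.6)]

# Crude grid search over (L_inf, k); a real analysis would use a proper
# nonlinear least-squares optimizer and confidence intervals.
best = min(((L_inf / 10, k / 100)
            for L_inf in range(100, 200)   # L_inf from 10.0 to 19.9 cm
            for k in range(10, 60)),       # k from 0.10 to 0.59 per year
           key=lambda p: sse(p, data))
```

Note that the fitted L_inf lies well above the largest observed length, which is exactly the extrapolation issue the abstract flags: the estimate depends on curvature in data covering only part of the lifetime, so treating L_inf as a fixed population constant across years is questionable.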
In the history of modern philosophy systematic connections were assumed to hold between the modal concepts of logical possibility and necessity and the concept of conceivability. However, in the eyes of many contemporary philosophers, insuperable objections face any attempt to analyze the modal concepts in terms of conceivability. It is important to keep in mind that a philosophical explanation of modality does not have to take the form of a reductive analysis. In this paper I attempt to provide a response-dependent account of the modal concepts in terms of conceivability along the lines of a nonreductive model of explanation.
A great deal of discussion in recent philosophy of language has centered on the idea that there might be hidden contextual parameters in our sentences. But relatively little attention has been paid to what those parameters themselves are like, beyond the assumption that they behave more or less like variables do in logic. My goal in this paper is to show this has been a mistake. I shall argue there are at least two very different sorts of contextual parameters. One is indeed basically like variables in logic, but the other is very different, and much more like overt referring expressions. This result is of interest in its own right, to those of us who are concerned to map out the details of the semantic and pragmatic workings of language. But it will have some wider morals as well. One of the important issues behind the debate over hidden parameters has been how we can posit hidden structure in language, and how far such structure can stray from the intuitive forms and contents speakers see in communication. I shall argue that one sort of hidden parameter is surprisingly close to the contents and forms speakers find intuitive, while another is more remote.
Moral response-dependent metaethical theories characterize moral properties in terms of the reactions of certain classes of individuals. Nick Zangwill has argued that such theories are flawed: they are unable to accommodate the motive of duty. That is, they are unable to provide a suitable reason for anyone to perform morally right actions simply because they are morally right. I argue that Zangwill ignores significant differences between various approvals, and various individuals, and that moral response-dependent theories can accommodate the motive of duty.
Mark Johnston claims the pragmatist theory of truth is inconsistent with the way we actually employ and talk about that concept. He is, however, sympathetic enough to attempt to rescue its respectable core using ‘response-dependence’, a revisionary form of which he advocates as a method for clarifying various philosophically significant concepts. But Johnston has misrepresented pragmatism; it does not require rescuing, and as I show here, his ‘missing explanation argument’ against pragmatism therefore fails. What Johnston and other critics including Putnam have overlooked is the distinctive nature of the pragmatist strategy, specifically, that it is non-reductive, a characteristic it shares with a more promising form of response-dependence: what Johnston calls ‘Descriptive Protagoreanism’ (DP). In this paper I offer a defence of pragmatism and show how it might be re-articulated as a form of DP.
Response moralism holds that audience reactions to works of fiction can be morally bad. This position appears implausible: How could it be bad to enjoy fictional suffering? It's just fiction; no one is harmed. My goal is to sketch the most compelling avenue of defense for the theory. I show both how and how not to defend response moralism. First I argue that Allan Hazlett's recent defense fails. Then I defend a Moorean suggestion for how to support the theory. Most important, I argue that the difficulties for the theory have not been fully appreciated. To this end, I present, but do not attempt to solve, four issues facing response moralism.
Nanotechnology: Considering the Complex Ethical, Legal, and Societal Issues with the Parameters of Human Performance. Linda MacDonald Glenn (Alden March Bioethics Institute, Albany Medical College, Albany, NY, USA) and Jeanann S. Boyce (Department of Computer Science and Business, Montgomery College, Takoma Park, MD, USA). NanoEthics, Volume 2, Number 3, pp. 265-275. DOI 10.1007/s11569-008-0047-6.
Although widely recognized as one of sociology's true classics, Max Weber's The Protestant Ethic and the Spirit of Capitalism has largely failed to influence the development of sociological theory in the United States. Because it has been read almost exclusively as a study of the "role of ideas" in economic development, its diverse and multifaceted theoretical contributions generally have been neglected. This study explicitly calls attention to The Protestant Ethic as a theoretical treatise by examining this classic in reference to four major debates in postwar sociological theory in the United States. Moreover, it demarcates an array of major parameters in American theorizing. The conclusion speculates upon the reasons for the strong opposition to The Protestant Ethic's theoretical lessons and argues that a style of theorizing unique to sociology in the United States has erected firm barriers against this classic text.
In 'Subjunctive Conditionals: Two Parameters vs. Three' Pavel Tichy articulates and defends a three-parameter account of counterfactuals. In the paper, he responds to a well-known objection against the validity of various forms of inference, in particular strengthening of the antecedent, contraposition, and hypothetical syllogism. In this paper, I argue that his response to the objection is inadequate. I then propose an alternative form of the three-parameter account of counterfactuals that avoids the objection in question.
Symbolic healing, that is, responding to meaningful experiences in positive ways, can facilitate human healing. This process partly engages consciousness and partly evades consciousness completely (sometimes it partakes of both simultaneously). This paper, presented as the Society for the Anthropology of Consciousness Distinguished Lecture at the 2011 AAA meeting in Montreal, reviews recent research on what is ordinarily (and unfortunately) called the "placebo effect." The author argues that language use should change, and that the relevant portions of what is often called the placebo effect should be referred to as the "meaning response."
For several subsystems of second order arithmetic T we show that the proof-theoretic strength of T + (bar rule) can be characterized in terms of T + (bar induction)□, where the latter scheme arises from the scheme of bar induction by restricting it to well-orderings with no parameters. In addition, we demonstrate that ACA₀⁺, ACA₀ + (bar rule) and ACA₀ + (bar induction)□ prove the same Π¹₁-sentences.
Biologists studying short-lived organisms have become aware of the need to recognize an explicit temporal extent of a population over a considerable time. In this article we outline the concept and the realm of populations with explicit spatial and temporal boundaries. We call such populations "temporally bounded populations". In the concept, time is of the same importance as space in terms of a dimension to which a population is restricted. Two parameters not available for populations that are only spatially defined characterise temporally bounded populations: total population size, which is the total number of individuals present within the temporal borders, and total residence time, which is the sum of the residence times of all individuals. We briefly review methods to estimate these parameters. We illustrate the concept for the large blue butterfly (Maculinea nausithous) and outline insights into ecological and conservation-relevant processes that cannot be gained without the use of the concept.
Locus equations contain an economical set of hidden (i.e., not directly observable in the data) parameters of speech that provide an elegant way of characterizing the ubiquitous context-dependent behaviors exhibited in speech acoustics. These hidden parameters can be effectively exploited to constrain the huge set of context-dependent speech model parameters currently in use in modern, mainstream speech recognition technology.
Recently, it has been argued that the phenomenon of direct transfer of intermediate metabolites between adjacent enzymes, also known as metabolic channelling, would not decrease the concentration of those intermediates in the bulk solution. However, this conclusion has been drawn by extrapolation from the results of simulations with a rather restricted set of parameters. We show that, for a number of kinetic cases, the existence of metabolic channelling can decrease the size of the soluble pool of intermediates. When the enzyme(s) downstream of the channel have a catalytic capacity that is large relative to the enzymes upstream of the channel, the decrease of concentration can be substantial (3 orders of magnitude).
If you seem to be able to do data assimilation with uncertain static parameters, then you are probably not working in environmental science. In this field, applications are often characterized by sensitive dependence on initial conditions and attracting sets in the state-space, which, taken together, can be a major challenge to numerical methods, leading to very peaky likelihood functions. Inherently stochastic models and uncertain static parameters increase the challenge.
This paper concerns modal logics of provability, the Gödel-Löb system GL and the Solovay logic S, respectively the smallest and the greatest representations of arithmetical theories in propositional logic. We prove that the decision problem for admissibility of rules (with or without parameters) in GL and S is decidable. Then we get a positive solution to Friedman's problem for GL and S. We also show that A. V. Kuznetsov's problem of the existence of a finite basis for admissible rules for GL and S has a negative solution. Afterwards we give an algorithm deciding the solvability of logical equations in GL and S and constructing some solutions.
An algorithm recognizing admissibility of inference rules in generalized form (rules of inference with parameters or metavariables) in the intuitionistic calculus H and, in particular, also in the usual form without parameters, is presented. This algorithm is obtained by means of special intuitionistic Kripke models, which are constructed for a given inference rule. Thus, in particular, a direct solution of Friedman's problem by intuitionistic techniques is found. As a corollary an algorithm for the recognition of the solvability of logical equations in H and for constructing some solutions for solvable equations is obtained. A semantic criterion for admissibility in H is constructed.
This paper offers an appraisal of Phillip Pettit’s approach to the problem how a finite set of examples can serve to represent a determinate rule, given that indefinitely many rules can be extrapolated from any such set. Negatively, I argue that Pettit’s so-called ethocentric theory of rule-following fails to deliver the solution to this problem that he sets out to provide. More constructively, I consider what further provisions are needed in order to advance Pettit’s distinctive general approach to the problem. I conclude that what is needed is a ‘no-priority’ account of rule-exemplification: that is, an account that (a) affirms the constitutive role of agents’ responses in the exemplification of rules but (b) denies the explanatory priority given to such responses in Pettit’s theory.
In 'Why Criminal Law: A Question of Content?', Douglas Husak argues that an analysis of the justifiability of the criminal law depends upon an analysis of the justifiability of state punishment. According to Husak, an adequate justification of state punishment both must show why the state is permitted to infringe valuable rights such as the right not to be punished and must respond to two distinct groups of persons who may demand a justification for the imposition of punishment, namely, individuals subjected to punishment and the society asked to support the institution of punishment. In this discussion, I analyse Husak's account of the right not to be punished with an eye to showing that the parameters of that right do not extend to the cases that would make it controversial. I also consider two other distinct groups of persons who have equal standing to alleged offenders and society to demand justification for the imposition of state punishment, namely, direct victims of crimes and criminal justice officials.
Advancing the reductionist conviction that biology must be in agreement with the assumptions of reductive physicalism (the upward hierarchy of causal powers, the upward fixing of facts concerning biological levels), A. Rosenberg argues that downward causation is ontologically incoherent and that it comes into play only when we are ignorant of the details of biological phenomena. Moreover, in his view, a careful look at relevant details of biological explanations will reveal the basic molecular level that characterizes biological systems, defined by wholly physical properties, e.g., geometrical structures of molecular aggregates (cells). In response, we argue that contrary to his expectations one cannot infer reductionist assumptions even from detailed biological explanations that invoke the molecular level, as interlevel causal reciprocity is essential to these explanations. Recent very detailed explanations that concern the structure and function of chromatin, the intricacies of the supposedly basic molecular level, demonstrate this. They show that what seem to be basic physical parameters extend into a more general biological context, thus rendering elusive the concepts of the basic level and causal hierarchy postulated by the reductionists. In fact, relevant phenomena are defined across levels by entangled, extended parameters. Nor can the biological context be explained away by basic physical parameters defining a molecular level shaped by evolution as a physical process. Reductionists claim otherwise only because they overlook the evolutionary significance of initial conditions best defined in terms of extended biological parameters. Perhaps the reductionist assumptions (as well as assumptions that postulate any particular levels as causally fundamental) cannot be inferred from biological explanations because biology aims at manipulating organisms rather than producing explanations that meet the coherence requirements of general ontological models.
Or possibly the assumptions of an ontology not based on the concept of causal powers stratified across levels can be inferred from biological explanations. The incoherence of downward causation is inevitable, given reductionist assumptions, but an ontological alternative might avoid this. We outline desiderata for the treatment of levels and properties that realize interlevel causation in such an ontology.
Newton's methodology is significantly richer than the hypothetico-deductive model. It is informed by a richer ideal of empirical success that requires not just accurate prediction but also accurate measurement of parameters by the predicted phenomena. It accepts theory-mediated measurements and theoretical propositions as guides to research. All of these enrichments are exemplified in the classical response to Mercury's perihelion problem. Contrary to Kuhn, Newton's method endorses the radical transition from his theory to Einstein's. The richer themes of Newton's method are strikingly realized in a challenge to general relativity from a new problem posed by Mercury's perihelion.
Although Fair Trade has been in existence for more than 40 years, discussion in the business and business ethics literature of this unique trading and campaigning movement between Southern producers and Northern buyers and consumers has been limited. This paper seeks to redress this deficit by providing a description of the characteristics of Fair Trade, including definitional issues, market size and segmentation and the key organizations. It discusses Fair Trade from Southern producer and Northern trader and consumer perspectives and highlights the key issues that currently face the Fair Trade movement. It then identifies an initial research agenda to be followed up in subsequent papers.
While theorists of cultural pluralism have generally supported tribal sovereignty to protect threatened Native cultures, they fail to address adequately cultural conflicts between Native and non-Native communities, especially when tribal sovereignty facilitates illiberal or undemocratic practices. In response, I draw on Jürgen Habermas' conceptions of discourse and the public sphere to develop a universalist approach to cultural pluralism, called the 'intercultural public sphere', which analyzes how cultures can engage in mutual learning and mutual criticism under fair conditions. This framework accommodates cultural diversity within formally universalistic parameters while avoiding four common criticisms of universalist approaches to cultural pluralism. But this framework differs from that of Habermas in two ways. First, it includes 'subaltern' publics, open only to members of cultural subgroups, in order to counter relations of 'cultural power'. Second, it admits 'strong' publics, democratic institutions with decision-making powers. Finally, I show how the subaltern, strong institutions of tribal sovereignty contribute to the fair discursive conditions required for mutual learning and mutual critique in an intercultural public sphere. Key Words: Habermas; Kymlicka; Native peoples; sovereignty; tribal.
Since the 1960s, a variety of new ways of addressing the challenges of diversity in American society have coalesced around the term "multiculturalism." In this article, we impose some clarity on the theoretical debates that surround divergent visions of difference. Rethinking multiculturalism from a sociological point of view, we propose a model that distinguishes between the social (associational) and cultural (moral) bases for social cohesion in the context of diversity. The framework allows us to identify three distinct types of multiculturalism and situate them in relation to assimilationism, the traditional American response to difference. We discuss the sociological parameters and characteristics of each of these forms, attending to the strength of social boundaries as well as to the source of social ties. We then use our model to clarify a number of conceptual tensions in the existing scholarly literature and offer some observations about the politics of recognition and redistribution, and the recent revival of assimilationist thought.
Ninety-one right brain-damaged patients with left neglect and 43 right brain-damaged patients without neglect were asked to extend horizontal segments, either left- or rightward, starting from their right or left endpoints, respectively. Earlier experiments based on similar tasks had shown, in left neglect patients, a tendency to overextend segments toward the left side. This seemingly paradoxical phenomenon was held to undermine current explanations of unilateral neglect. The results of the present extensive research demonstrate that contralesional overextension is also evident in most right brain-damaged patients without contralesional neglect. Furthermore, they show that in a minority of left neglect patients, the opposite behavior, i.e., right overextension can be found. The paper also reports the results of correlational analyses comprising the parameters of line-extension, line-bisection, and cancellation tasks, as well as the parameters relative to the Milner Landmark Task, by which a distinction is drawn between perceptual and response biases in unilateral neglect. A working hypothesis is then advanced about the brain dysfunction underlying neglect and an attempt is made at finding an explanation of neglect and the links between the mechanisms of space representation and consciousness through the study of the changes induced by unilateral brain lesions in the characteristics of space-coding neurons.
Abbreviations: C, control group; GN+91, full group of neglect patients; GN+27, group of neglect patients with relative left overextension; GN+14, group of neglect patients with relative right overextension; GN-43, full group of non-neglect patients; GN-9, group of non-neglect patients with relative left overextension; H canc, H cancellation task; LE, left extension; LE/RE, ratio of left-right extension; N+, neglect patients; N-, non-neglect patients; PB Land-M, perceptual bias on Landmark motor task; PB Land-V, perceptual bias on Landmark verbal task; RB Land-M, response bias on Landmark motor task; RB Land-V, response bias on Landmark verbal task; RE, right extension.
Cancer is a complex disease, necessitating research on many different levels; at the subcellular level to identify genes, proteins and signaling pathways associated with the disease; at the cellular level to identify, for example, cell-cell adhesion and communication mechanisms; at the tissue level to investigate disruption of homeostasis and interaction with the tissue of origin or settlement of metastasis; and finally at the systems level to explore its global impact, e.g. through the mechanism of cachexia. Mathematical models have been proposed to identify key mechanisms that underlie dynamics and events at every scale of interest, and increasing effort is now being paid to multi-scale models that bridge the different scales. With more biological data becoming available and with increased interdisciplinary efforts, theoretical models are becoming suitable tools to predict the origin and course of the disease. The ultimate aims of cancer models, however, are to enlighten our concept of the carcinogenesis process and to assist in the designing of treatment protocols that can reduce mortality and improve patient quality of life. Conventional treatment of cancer is surgery combined with radiotherapy or chemotherapy for localized tumors or systemic treatment of advanced cancers, respectively. Although radiation is widely used as treatment, most scheduling is based on empirical knowledge and less on the predictions of sophisticated growth dynamical models of treatment response. Part of the failure to translate modeling research to the clinic may stem from language barriers, exacerbated by often esoteric model renderings with inaccessible parameterization. Here we discuss some ideas for combining tractable dynamical tumor growth models with radiation response models using biologically accessible parameters to provide a more intuitive and exploitable framework for understanding the complexity of radiotherapy treatment and failure.
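As an illustration of the kind of tractable combination the authors advocate, the sketch below couples logistic tumor growth with the standard linear-quadratic (LQ) radiation survival model. The specific model choices and every parameter value (growth rate r, capacity K, radiosensitivities alpha and beta, the fractionation schedule) are illustrative assumptions, not taken from the paper.

```python
import math

def logistic_step(v, dt, r=0.1, K=1.0):
    """One Euler step of logistic tumor growth (normalized volume v)."""
    return v + dt * r * v * (1.0 - v / K)

def lq_survival(dose_gy, alpha=0.3, beta=0.03):
    """Linear-quadratic surviving fraction after a single dose (Gy)."""
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

# Simulate 30 days of growth with 2 Gy weekday fractions
# (a conventional-fractionation-style schedule)
v = 0.2
for day in range(30):
    for _ in range(24):                 # hourly Euler steps within the day
        v = logistic_step(v, 1.0 / 24)
    if day % 7 < 5:                     # irradiate on weekdays only
        v *= lq_survival(2.0)
```

Under these toy parameters the kill per fraction (surviving fraction of roughly 0.49 per 2 Gy) dominates daily regrowth, so the simulated volume shrinks; scheduling experiments then amount to changing the dose per fraction or the `day % 7` condition, which is the sort of biologically accessible parameterization the abstract calls for.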
Public schools are functionally provided through structural arrangements such as government funding, but public schools are achieved in substance, in part, through local governance. In this essay, Kathleen Knight Abowitz explains the bifocal nature of achieving public schools; that is, that schools are both subject to the unitary Public compact of constitutional principles as well as to the more local engagements with multiple publics. Knight Abowitz sketches this bifocal nature, exploring both the unitary ideal and its parameters, as well as the less understood forms of multiple, organic publics that come into being in response to localized problems in schools or districts. These publics often fail to realize their potential in the development of increased capacity for enhanced teaching and learning. The essay ultimately points to a practical application: that educational leadership of all types, and with some very specific kinds of habits and skills, is needed to help achieve public schools.
A general conceptual framework for large-scale neocortical dynamics based on data from many laboratories is applied to a variety of experimental designs, spatial scales, and brain states. Partly distinct, but interacting local processes (e.g., neural networks) arise from functional segregation. Global processes arise from functional integration and can facilitate (top down) synchronous activity in remote cell groups that function simultaneously at several different spatial scales. Simultaneous local processes may help drive (bottom up) macroscopic global dynamics observed with electroencephalography (EEG) or magnetoencephalography (MEG). A local/global dynamic theory that is consistent with EEG data and the proposed conceptual framework is outlined. This theory is neutral about properties of neural networks embedded in macroscopic fields, but its global component makes several qualitative and semiquantitative predictions about EEG measures of traveling and standing wave phenomena. A more general “metatheory” suggests what large-scale quantitative theories of neocortical dynamics may be like when more accurate treatment of local and nonlinear effects is achieved. The theory describes the dynamics of excitatory and inhibitory synaptic action fields. EEG and MEG provide large-scale estimates of modulation of these synaptic fields around background levels. Brain states are determined by neuromodulatory control parameters. Purely local states are dominated by local feedback gains and rise and decay times of postsynaptic potentials. Dominant local frequencies vary with brain region. Other states are purely global, with moderate to high coherence over large distances. Multiple global mode frequencies arise from a combination of delays in corticocortical axons and neocortical boundary conditions. Global frequencies are identical in all cortical regions, but most states involve dynamic interactions between local networks and the global system.
EEG frequencies may involve a “matching” of local resonant frequencies with one or more of the many, closely spaced global frequencies. Key Words: binding problem; cell assemblies; coherence; EEG; limit cycles; neocortical dynamics; pacemakers; phase locking; spatial scale; standing waves; synchronization. Footnote 1: The relationship between the synaptic action fields proposed in the target article and cell assemblies is clarified with Figure R1 (p. 416) of the Response. (This figure was not available to Commentators.)
The paper examines critically some recently published views by Ramsey on the contrast between ab initio and parametrized theories. I argue that, all things being equal, ab initio calculations are indeed regarded more highly in the physics and chemistry communities. A case study on density functional approaches in theoretical chemistry is presented in order to re‐examine the question of ab initio and parametrized approaches in a contemporary context.
Gupta-Belnap-style circular definitions use all real numbers as possible starting points of revision sequences. In that sense they are boldface definitions. We discuss lightface versions of circular definitions and boldface versions of inductive definitions.
We analyse the effect of regulatory T cells (Tregs) in the local control of immune responses by T cells. We obtain an explicit formula for the level of antigenic stimulation of T cells as a function of the concentration of T cells and the parameters of the model. The relation between the concentration of T cells and the antigenic stimulation of T cells exhibits a hysteresis, which unfolds for some parameter values. We study the appearance of autoimmunity from cross-reactivity between a pathogen and a self antigen, or from bystander proliferation. We also study an asymmetry in the death rates. With this asymmetry we show that the antigenic stimulation of the Tregs is able to control locally the population size of Tregs. Other effects of this asymmetry are a faster immune response and an improvement in the simulations of bystander proliferation. The rate of variation of the levels of antigenic stimulation determines whether the outcome is an immune response or whether Tregs are able to maintain control; this behavior is explained by the presence of a transcritical bifurcation for some tuning between the antigenic stimuli of T cells and Tregs.
The relevance of Wolfgang Köhler's psychoneural isomorphism principle to contemporary cognitive neuroscience is explored. Köhler's approach to the mind-body problem is interpreted as a response to the foundational crisis of psychology at the beginning of the twentieth century. Some aspects of his isomorphism doctrine are discussed, with a view to reaching an interpretation that is both historically accurate and pertinent to issues currently debated in the philosophy of psychology. The principle was meant to be empirically verifiable. Accordingly, some similarities between Köhler's approach and current neural network modeling are pointed out, and it is shown that some recent trends in the neurosciences are broadly compatible with Köhler's views on cortical functioning. Isomorphism is interpreted as a form of neuroreductionism constrained by bridging laws relating mental phenomena to macroscopic parameters of neural function. While isomorphism is probably valid for perceptual phenomena, its applicability to higher mental processes remains doubtful.
Any intelligent discussion of terrorism must have some way of identifying the phenomenon under scrutiny. Only then is it possible to devise criteria for describing a given action, agent, or organization as ‘terrorist’, to investigate the causes and objectives of terrorism, and to set parameters for a legitimate response to what some regard as a fundamental challenge to world peace. Scholars have long recognized these points, but the same is not true of more prominent forces shaping contemporary Western perceptions. In the United States, the mainstream media (newspapers, television, cinema), the independent “think tanks,” and the main sectors of the Government, have sponsored a public discourse about terrorism devoid of any serious inquiry into, or concern about, the nature, origins, and goals of terrorist actions. The rhetoric, with which they assail popular consciousness, deflects attention away from a critical examination of these issues, and thereby contributes to the increasing spiral of hatred and atrocity. This happens because the use of ‘terror’ and its cognates obscures the causes of political unrest and, consequently, impedes the development of rational policies for dealing with underlying grievances. The rhetoric of ‘terror’ is not always innocent; there are those who employ it deliberately in pursuit of specific political objectives. This is especially true in the popular discourse about political tensions in the Middle East, particularly, concerning the Israeli-Palestinian conflict. The result has been disastrous, not only for the lives and well-being of individual Palestinians and Israelis, but increasingly, for the entire world. Unless we diminish the use of emotive language in describing the circumstances, actions, and tendencies constituting this and similar such conflicts, we risk multiplying the amount of terrorist violence in the world and moving even further from just and peaceful resolutions of pressing political problems.
The difficulty of distinguishing rules from similarity in categorization stems from reliance on relatively simple manipulation-response designs and a style of modeling with abstract parameters, rather than on assessment of intervening and controlling mental states. This commentary proposes a strategy in which rules and similarity would be distinguished by their different roles in a theory interrelating reportable conscious contents in deliberative categorization.
What is biological complexity? How many sorts exist? Are there levels of complexity? How are they related to one another? How is complexity related to the emergence of new phenotypes? To try to get to grips with these questions, we consider the archetype of a complex biological system, Escherichia coli. We take the position that E. coli has been selected to survive adverse conditions and to grow in favourable ones and that many other complex systems undergo similar selection. We invoke (...) the concept of hyperstructures which constitute a level of organisation intermediate between macromolecules and cells. We also invoke a new concept, competitive coherence, to describe how phenotypes are created by a competition between maintaining a consistent story over time and creating a response that is coherent with respect to both internal and external conditions. We suggest how these concepts lead to parameters suitable for describing the rich form of complexity termed hypercomplexity and we propose a relationship between competitive coherence and emergence. (shrink)
idea of a mechanical balance, described the volume of exchange of various aggregated commodities, weighted by their price, balanced against the quantity of money in the economy, weighted by the money's rate of circulation. Another family of models addressed issues about the gold standard and bimetallism by thinking of quantities of gold and silver as liquids in different connected reservoirs representing, alternatively, bullion and minted coin, and the way the liquids/metal/currency in one reservoir will flow into others if (...) the level in one becomes higher than in another. Morgan sets out the ways in which Fisher developed these models in response to both theoretical and practical issues of the day. In the process we see how the activity of building models can address relations which are very imperfectly understood, revealing previously unappreciated causal interconnections. For example, Gresham's law is revealed as just one facet of a much more complex network of interconnected variables, and the models help to make clear the conditions under which this law does and does not apply. Also illustrated are ways in which such models can be illuminating about the underlying mechanisms even though the models in question involve extreme idealizations and are extremely limited in practical application because they require parameters which cannot be independently measured. In this respect these examples provide cases in which models would appear to facilitate theory development and articulation more than mediation between preexisting theory and the world. Different readers, because they will be looking for different things, will themselves offer widely different evaluations of these essays, individually and collectively. In my own evaluation this collection provides a wonderful resource of much needed detail for use in the effort of all interpreters of science to move beyond past problematic oversimplifications.
Accounts such as the positivist and semantic descriptions of theories can themselves be seen as highly idealized models, and as such do indeed bring out important features of the ways in which we theorize about the world. (shrink)
This work is a qualitative study of an organism's physiological adaptive response to stress. The experimental data were selected from a previous study leading to the conclusion that stress may be considered as a topological retraction within a vital space that must be more precisely defined. The experimental methodology uses rat poisoning by neurotoxins. The control parameter is the intensity of the toxic doses. Measured parameters are the animals' survival rate and the kinetics of cerebral acetylcholinesterase activity. The results, (...) when expressed as a function of the inverted doses, show a characteristic evolution. The pattern of the curve closely resembles a vortex profile. This analogy is studied more extensively in both the physical and biological domains. These findings help to clarify the concept of biological stress, which presents the same vectorial properties as hydrodynamic vorticity. In particular, the dissipation of stress and the dissipation of vorticity seem to obey the same laws. This observation is valid for both diffusion and convection processes. The decompensation phase of stress could be compared with the instability and turbulence in flows. Our approach in this paper is mainly to establish a general and phenomenological description of the stress response fitting experimental observations. (shrink)
The effect of damping on the behaviour of oscillations in the vicinity of bifurcations of nonlinear dynamical systems is investigated. Here, our primary focus is single degree-of-freedom conservative systems to which a small linear viscous energy dissipation has been added. Oscillators with saddle–node, pitchfork and transcritical bifurcations are shown analytically to exhibit several interesting characteristics in the free decay response near a bifurcation. A simple mechanical oscillator with a transcritical bifurcation is used to experimentally verify the analytical results. A transcritical (...) bifurcation was selected because it may be used to represent generic bifurcation behaviour. It is shown that the damping ratio can be used to predict a change in the stability with respect to changing system parameters. (shrink)
Freud finds it impossible to accept the existence of a Supreme Being because he thinks that there is no way to scientifically demonstrate or prove the existence of a being so defined. Consequently, Freud maintains that individuals who claim to have a religious experience of God suffer from a delusion. Such individuals remain in an infantile state of neurotic denial, fooling themselves about the reality of extramental existence. In contradistinction, Max Scheler, a student of Husserlian phenomenology, can accept the existence of (...) God because he finds that God, understood as the summum bonum, is the superlative value to which humanity can give assent in the religious act. Within the context of the religious act, an individual can come to discover or realize God. But this discovery is not made through a scientific demonstration or proof. Unlike Freud, Scheler shows that this discovery comes about via a phenomenological methodology which endorses a broader view of experience. Scheler ultimately makes the further claim that those individuals, like the scientist, who choose not to engage in the religious act are, in fact, involved in a delusional state. So, both thinkers claim that the other is in a delusional state. The task I undertake in this paper is to place these two thinkers into dialogue with one another in order to evaluate their specific methodologies. First, I explicate Freud’s view of religion. In doing so, I make explicit Freud’s empirical methodology and mechanistic materialism which is the root for his claim that God exists as an illusion or “wretched makeshift” of the neurotic unconscious mind. Next, I present the Schelerian response to Freud and positivistic science by making explicit the parameters of the religious act which recognizes God as the superlative value.
Finally, I assess the views of Freud and Scheler, and in so doing, show that Scheler’s phenomenological methodology, with its emphasis upon bracketing empirical presuppositions, has merit in that it broadens experience beyond merely what is scientifically observed. We see that Freud’s claim that all experience needs to be scientifically demonstrated is too narrow a view of experience. And so, by denying other types of possible epistemologies and methodologies, he and his followers of the empirical methodology of strict positivism involve themselves in a delusional state by not accepting these approaches to extramental reality. However, I maintain that the psychoanalytic method advocated by Freud has merit in that it can be a useful aid to a person involved in or seeking to be involved in the religious act. In the end I show how it is possible to view the empirical methodology of Freud and the phenomenological methodology of Scheler as coexisting and harmonizing with one another. (shrink)
Professor Franklin is correct to say that there are significant areas of agreement between his account of formal science (Franklin, 1994) and my critique of his account. We both agree that the domain-independence exhibited by the formal sciences is ontologically and epistemically interesting, and that the concept of ‘structure’ must be central in any analysis of domain-independence. We also agree that knowledge of the structural, relational properties of physical systems should count as empirical knowledge, and that it makes sense to (...) talk about an empirical ‘science’ of structure. Where we disagree is over the frequency of occasions where ‘practical certainty’ is actually attained: Franklin argues that practical certainty is not uncommon in the formal sciences, while I argue that there are barriers to practical certainty that Franklin fails to appreciate. In Franklin’s response, he briefly presents and offers criticisms of my two main arguments against his central thesis. Franklin asserts that the flaw in my first argument is that it does not appreciate that modern mathematical models exhibit structural stability, and thus that many of their predictions are robust under small variations in input or parameter values (and hence, insensitive to the inevitable gaps between estimates and actual values). Nevertheless, what I had in mind in using the term ‘realistic modelling situations’ were models of fairly complex natural systems, such as the dripping faucet, spruce budworm and ecosystem ecology examples. My objection is not that mathematical models of such systems cannot legitimately and accurately describe structural properties that are genuinely predicable of real-world systems, but simply that for... (shrink)
In the metabolic control theory, the control coefficient is a key parameter in quantifying the sensitivity of the flux towards an infinitesimal variation of enzyme activity. This concept does not apply as such to variations of enzyme concentrations whenever there are spatial, energy, or resource limitations in the cell. Due to the constraint on total enzyme concentration, the variation of the concentration of any given enzyme may affect the concentrations of other enzymes. To take into account these correlations between enzyme (...) concentrations, we propose the concept of "combined response coefficient". Its definition is similar to that of the control coefficient, but its mathematical expression is different. Its range of variation is from –∞ to +1, the null value corresponding to the optimum enzyme concentration, i.e. to concentrations that maximise the flux, and the negative values to concentrations beyond the optimum value. A summation property could be derived using a simple weighting of the combined response coefficients, the sum of the weighted coefficients being 0. (shrink)
Multi-level discrete models of genetic networks, or the more general piecewise affine differential models, provide qualitative information on the dynamics of the system, based on a small number of parameters (such as synthesis and degradation rates). Boolean models also provide qualitative information, but are based simply on the structure of interconnections. To explore the relationship between the two formalisms, a piecewise affine differential model and a Boolean model are compared, for the carbon starvation response network in E. coli. (...) The asymptotic dynamics of both models are shown to be quite similar. This study suggests new tools for analysis and reduction of biological networks. (shrink)
I argue that Meeker is mistaken in two crucial respects. First, contrary to both myself and Plantinga, he treats exclusivism as a theory about the relation between the religions, and then claims that it is superior to the pluralist theory. But he does not say what his exclusivist theory is. Second, he bases his claim of a fundamental self-contradiction in my pluralist position on a view which I disavow, namely that altruism is the core of religion. He omits the central (...) idea of a profound reorientation in response to the Real, of which altruism is a manifestation. (Published Online April 7 2006). (shrink)
Where the old objectivity question asked, Objectivity or relativism: which side are you on?, the new one refuses this choice, seeking instead to bypass widely recognized problems with the conceptual framework that restricts the choices to these two. It asks, How can the notion of objectivity be updated and made useful for contemporary knowledge-seeking projects? One response to this question is the strong objectivity program that draws on feminist standpoint epistemology to provide a kind of logic of discovery for maximizing (...) our ability to block ‘might makes right’ in the sciences. It does so by delinking the neutrality ideal from standards for maximizing objectivity, since neutrality is now widely recognized as not only not necessary, not only not helpful, but, worst of all, an obstacle to maximizing objectivity when knowledge-distorting interests and values have constituted a research project. Strong objectivity provides a method for correcting this kind of situation. However, standpoint approaches have their own limitations, which are quite different from the misreadings of them upon which most critics have tended to focus. Unfortunately, historically limited epistemologies and philosophies of science are all we get to choose from at this moment in history. (shrink)
At the April 2006 meeting of the Central Division of the American Philosophical Association, in an author-meets-critics session on Scott Soames' book _Reference and Description: The Case Against Two-Dimensionalism_, I presented a comment on Soames' book, "Scott Soames' Two-Dimensionalism". The other critic was Robert Stalnaker. Soames presented his response to critics. Below is a reply to Soames' response to me, for those who were at the session and interested others. Note that this response was mostly written before (...) the session, except for one or two paragraphs where the discussion in the session is mentioned. (shrink)
Doctrinally, a precedent is a case of the same or higher court that furnishes an authoritative rule for the determination of the case at hand, either because the facts are alike, or, if the facts are different, because the principle that governed the first case is applicable to the different facts. In this article I try to free precedent from the dominant doctrinal view by offering a more intuitive conception: that to be precedent means to be treated as precedent. Put (...) differently, I attempt to see precedent not as descriptive of a previous decision or rule but as the aggregate effect of responses to one or a series of court decisions in the legal community. I illustrate this viewpoint by developing a version of a response-dependent concept that incorporates our intuitive grasp of how precedent works. In the pursuit of this task I rely on the basic philosophical premises of response-dependence theory and on one paradigmatic response-dependent concept—the quality of being funny. (shrink)
A standard view of reference holds that a speaker's use of a name refers to a certain thing in virtue of the speaker's associating a condition with that use that singles the referent out. This view has been criticized by Saul Kripke as empirically inadequate. Recently, however, it has been argued that a version of the standard view, a _response-based theory of reference_, survives the charge of empirical inadequacy by allowing that associated conditions may be largely or even entirely implicit. (...) This paper argues that response-based theories of reference are prey to a variant of the empirical inadequacy objection, because they are ill-suited to accommodate the successful use of proper names by pre-school children. Further, I argue that there is reason to believe that normal adults are, by and large, no different from children with respect to how the referents of their names are determined. I conclude that speakers typically refer _positionally_: the referent of a use of a proper name is typically determined by aspects of the speaker's position, rather than by associated conditions present, however implicitly, in her psychology. (shrink)
Response to Mark Schroeder's Slaves of the Passions. Journal article, Jonathan Dancy, The University of Reading, Reading, UK. Philosophical Studies, DOI 10.1007/s11098-010-9656-3 (Online ISSN 1573-0883, Print ISSN 0031-8116).
Within much contemporary epistemology, Kant’s response to skepticism has come to be epitomized by an appeal to transcendental arguments. This form of argument is said to provide a distinctively Kantian way of dealing with the skeptic, by showing that what the skeptic questions is in fact a condition for her being able to raise that question in the first place, if she is to have language, thoughts, or experiences at all. In this way, it is hoped, the game played by (...) the skeptic can be turned against herself. At the same time, however, this appeal to transcendental arguments is also widely felt to show what is wrong with Kant’s response to skepticism: for, it is suggested, such arguments can only be made to work against the background of his transcendental idealism. As we shall see, what this doctrine amounts to is much disputed; but as with any form of idealism, the worry is that it means compromising the very realism and objectivity we want to defend against skepticism in the first place, so that the price for adopting this Kantian strategy appears too high—the cure of using transcendental arguments in conjunction with transcendental idealism is almost as bad as the disease. Faced with this difficulty, two kinds of response have been canvassed. On the first, it is accepted that transcendental arguments do require a commitment to the wider philosophical framework of transcendental idealism, but it is claimed that this framework can and should be defended against the suggestion that it is itself ‘quasi-skeptical’. On the second, transcendental idealism is indeed abandoned as wrongheaded, but it is held that Kant’s transcendental arguments can be made to... (shrink)
Bas van Fraassen claims that constructive empiricism strikes a balance between the empiricist's commitments to epistemic modesty -- that one's opinion should extend no further beyond the deliverances of experience than is necessary -- and to the rationality of science. In "Should the Empiricist be a Constructive Empiricist?" I argued that if the constructive empiricist follows through on her commitment to epistemic modesty she will find herself adopting a much more extreme position than van Fraassen suggests. Van Fraassen and Bradley (...) Monton have recently responded. My purpose here is to contest their response. The goal is not merely the rebuttal of a rebuttal; there is a lesson to learn concerning the realist/anti-realist dialectic generated by van Fraassen's view. (shrink)
On the self-locating response to the knowledge argument. Journal article, Daniel Stoljar, Philosophy Program, Research School of Social Sciences, The Australian National University, Canberra ACT, 0200 Australia. Philosophical Studies, DOI 10.1007/s11098-010-9612-2 (Online ISSN 1573-0883, Print ISSN 0031-8116).
Ned Block argues that the higher-order (HO) approach to explaining consciousness is ‘defunct’ because a prominent objection (the ‘misrepresentation objection’) exposes the view as ‘incoherent’. What’s more, a response to this objection that I’ve offered elsewhere (Weisberg 2010) fails because it ‘amounts to abusing the notion of what-it’s-like-ness’ (xxx). In this response, I wish to plead guilty as charged. Indeed, I will continue herein to abuse Block’s notion of what-it’s-like-ness. After doing so, I will argue that the HO approach accounts (...) for the sense of what-it’s-like-ness that matters in a theory of consciousness. I will also argue that the only incoherence present in the HO theory is that generated by embracing Block’s controversial notion of what-it’s-like-ness, something no theorist of any stripe ought to do. Block is famous for (among other things) having introduced the notion of ‘phenomenal consciousness’ into contemporary philosophy of mind (Block 1995). This term is widely employed in the philosophical literature and it even appears in the empirical literature. But widespread usage has brought about divergent interpretations of the term. We can distinguish a ‘moderate’ and a ‘zealous’ reading of ‘phenomenal consciousness’. On the moderate reading, ‘phenomenal consciousness’ just means ‘experience’. Many people have embraced this sense of the term and use it to roughly pick out conscious experience involving sensory quality (states like conscious visual experiences or conscious pains, for example). On the zealous reading, however, phenomenal consciousness is held to be ‘distinct from any cognitive, intentional, or functional property’ (Block 1995: 234). That is, any explanation of phenomenal consciousness in exclusively cognitive, intentional, or functional terms will fail to capture, without remainder, what is really distinctive about phenomenal consciousness.
Block, of course, is fully clear about embracing the zealous reading; indeed, his initial introduction of the notion is in those terms. The same ambiguity occurs with the much-used (and abused) idea of ‘what-it’s-like-ness’. (shrink)
Eric T. Olson has argued that those who hold that two material objects can exactly coincide at a moment of time, with one of these objects constituting the other, face an insuperable difficulty in accounting for the alleged differences between the objects, such as their being of different kinds and possessing different persistence-conditions. The differences, he suggests, are inexplicable, given that the objects in question are composed of the same particles related in precisely the same way. In response, I show (...) that the differences are not at all inexplicable once it is recognized that the conditions for a persisting object to be composed by certain particles at a moment of time must involve facts concerning other moments of time, and that the relevant facts are different for persisting objects of different kinds. Philosophers who neglect this sort of constraint on composition principles may be said to be victims of the 'cinematographic fallacy'. (shrink)
The radical empiricism of William James was first formally presented in his seminal papers of 1904, 'Does Consciousness Exist?' and 'A World of Pure Experience'. In James's view, pure experience was to serve as the source for psychology's primary data and radical empiricism was to launch an effective critique of experimentalism in psychology, a critique from which the problem of experimentalism within science could be addressed more broadly. This collection of papers presents James's formal statements on radical empiricism and a (...) representative sample of contemporary responses from psychologists and philosophers. With only a few exceptions, these responses indicate just how badly James was misread - psychologists ignoring the heart of James's message and philosophers transforming James's metaphysics into something quite unintelligible to the emerging generation of experimental psychologists. (shrink)
In this response, I reiterate my argument that price gouging undercuts the goal of equity in access to essential goods, whereas Zwolinski emphasizes the importance of the efficient provision of essential goods above all other goals. I agree that the efficient provision of essential goods is important, as I argue for the goal of equitable access to a sufficient share of the goods essential to living a minimally flourishing human life. However, efficiency is a means to this goal rather than the end (...) itself. Finally, I offer additional arguments against the non-worseness claim. (shrink)
The paper covers a range of topics of recent interest in relation to response-dependence: its characterisation in terms of 'basic equations', its application to areas such as ethics, colour theory and philosophy of mind, and the 'missing explanation' argument.
Coping with everyday life limits the extent of one’s scepticism. It is practically impossible to doubt the existence of the things with which one is immediately engaged and interacting. To doubt that, say, a door exists, is to step back from merely using the door (opening it) and to reflect on it in a detached, theoretical way. It is impossible to simultaneously act and live immersed in situation S while doubting that one is in S. Sceptical doubts—such as ‘Is this (...) really a door?’, ‘Am I really walking?’ — require a reflective withdrawal in thought from the situation at hand. Maintaining sceptical doubt while coping with everyday life requires a split consciousness, a bad faith, with one part of consciousness doubting the existence of things that the other part takes for granted. For this reason, a sustained lived sceptical doubt is sometimes thought to be impossible. In this article, I examine Wittgenstein's response to scepticism in "On Certainty". I argue that one of his responses is "the response based on action", which is (as other Wittgenstein interpreters have noted) a characteristically pragmatist response. I then evaluate the quality of this pragmatist response to scepticism, noting that actions just as much as representations are susceptible to misinterpretation. It is argued that despite the insights contained in it, Wittgenstein's contextualism about meaning is inadequate to rescue the Wittgensteinian response to scepticism. (shrink)
The purpose of this paper is to review critically Julian Savulescu's principle of 'Procreative Beneficence,' which holds that prospective parents are morally obligated to select, of the possible children they could have, those with the greatest chance of leading the best life. According to this principle, prospective parents are obliged to use the technique of pre-implantation genetic diagnosis (PGD) to select for the 'best' embryos, a decision that ought to be made based on the presence or absence of both disease (...) traits and non-disease traits such as intelligence. While several articles have been written in response to Savulescu's principle, none has systematically explored its philosophical underpinnings to demonstrate where it breaks down. In this paper I argue that the examples that Savulescu employs to support his theory in fact fail to justify it. He presents these examples as analogous to PGD, when in fact they differ from it in subtle but morally relevant ways. Specifically, Savulescu fails to acknowledge the fact that his examples evoke deontological and virtue ethics concerns that are absent in the context of PGD. These differences turn out to be crucial, so that, in the end, the analogies bear little support for his theory. Finally, I lay out the implications of this analysis for reproductive ethics. (shrink)
This essay is a rejoinder to comments on Uneasy Virtue made by Onora O'Neill, John Skorupski, and Michael Slote in this issue. In Uneasy Virtue I presented criticisms of traditional virtue theory. I also presented an alternative – a consequentialist account of virtue, one which is a form of ‘pure evaluational externalism’. This type of theory holds that the moral quality of character traits is determined by factors external to agency (e.g. consequences). All three commentators took exception to this account. (...) Therefore, the bulk of my response focuses on defending the externalist account of virtue presented in the final chapters of Uneasy Virtue. (shrink)
Jürgen Habermas' response to the European Union democratic deficit calls for a minimal threshold of democratic legislation through an explicit constitutional founding. He defends a model of freedom as autonomous self-determination by proposing to tie basic rights in the EU to a univocal form of European-wide popular sovereignty. Instead of constructing a common European political identity, I appeal to the novel democratic potential of institutions in the EU such as the Open Method of Coordination for mediating overlapping sovereignties in accord (...) with freedom as non-domination. The concluding example of basic rights to effective participation for immigrants and permanent minorities illustrates the strengths of Iris Young's and James Bohman's republican views of non-domination over Habermas' call for a European-wide collective willing. Key Words: James Bohman democratic deficit European Union freedom Jürgen Habermas non-domination Open Method of Coordination republicanism sovereignty Iris Young. (shrink)
This article suggests first that the concept of interpersonal recognition be understood in a multidimensional (as opposed to one-dimensional), practical (as opposed to symbolic), and strict (as opposed to broad) way. Second, it is argued that due recognition be seen as a reason-governed response to evaluative features, rather than all normativity and reasons being seen as generated by recognition. This can be called a response-model, or, more precisely, a value-based model of due recognition. A further suggestion is that there is (...) a systematic basis for distinguishing three dimensions of recognition, depending on whether recognition is given to someone qua a person, qua a certain kind of person, or qua a certain person. Finally, it is argued that recognition is a necessary condition of personhood, but whether it is of direct or indirect relevance depends on our theories of personhood (social vs. capacity-theory) and practical identity (dialogical definition model vs. feature-model). Despite the apparent opposition, it is shown that interpersonal recognition is both a response to value and a precondition of personhood. (shrink)
Michael Bergmann and Jan Cover summarize the essence of their paper as follows: “We argue that divine responsibility is sufficient for divine thankworthiness and consistent with the absence of divine freedom. We do this while insisting on the view that both freedom and responsibility are incompatible with causal determinism.” In this response I argue that while it makes sense for believers to be thankful that God exists, it makes no sense for them to thank him for doing the best act he can, given the circumstances.
I argue that there is a flaw in the way that response-dependence has been formulated in the literature, and this flawed formulation has been correctly attacked by Mark Johnston’s Missing Explanation Argument. Moving to a better formulation, which is analogous to the move from behaviourism to functionalism, avoids the Missing Explanation Argument.
Researchers working on children’s moral understanding maintain that the child’s capacity to distinguish morality from convention shows that children regard moral violations as objectively wrong (e.g. Nucci, L. (2001). Education in the moral domain. Cambridge: Cambridge University Press). However, one traditional way to cast the issue of objectivism is to focus not on conventionality, but on whether moral properties depend on our responses, as with properties like icky and fun. This paper argues that the moral/conventional task is inadequate for assessing whether children regard moral properties as response-dependent. Unfortunately, children’s understanding of response-dependent properties has been neglected in recent research. Two experiments are reported showing that children are more likely to treat properties like fun and icky as response-dependent than moral properties like good and bad. Hence, this helps support the claim that children are moral objectivists.
In this paper, Crispin Wright’s unified strategy against scepticism is put under pressure through an examination of the concept of entitlement. Wright’s characterisation of a generalised form of scepticism is first described, followed by an examination of the concept of entitlement and of the role played by presuppositions in his strategy. This will make manifest the transcendental structure of this response to scepticism. The paper ends with a discussion of the effectiveness of this transcendental strategy in providing a satisfying response to scepticism.
The leading Intelligent Design theorist William Dembski (Rowman & Littlefield, Lanham MD, 2002) argued that the first No Free Lunch theorem, first formulated by Wolpert and Macready (IEEE Trans Evol Comput 1: 67–82, 1997), renders Darwinian evolution impossible. In response, Dembski’s critics pointed out that the theorem is irrelevant to biological evolution. Meester (Biol Phil 24: 461–472, 2009) agrees with this conclusion, but still thinks that the theorem does apply to simulations of evolutionary processes. According to Meester, the theorem shows that simulations of Darwinian evolution, as these are typically set in advance by the programmer, are teleological and therefore non-Darwinian. Therefore, Meester argues, they are useless in showing how complex adaptations arise in the universe. Meester uses the term teleological inconsistently, however, and we argue that, no matter how we interpret the term, a Darwinian algorithm does not become non-Darwinian by simulation. We show that the NFL theorem is entirely irrelevant to this argument, and conclude that it does not pose a threat to the relevance of simulations of biological evolution.
This is my response to the critical commentaries by Hasker, McNaughton and Schellenberg on my tetralogy on Christian doctrine. I dispute the moral principles invoked by McNaughton and Schellenberg in criticism of my theodicy and theory of atonement. I claim, contrary to Hasker, that I have taken proper account of the ‘existential dimension’ of Christianity. I agree that whether it is rational to pursue the Christian way depends not only on how probable it is that the Christian creed is true and so that the way leads to the Christian goals, but (in part) on how strongly one wants those goals. Hasker is correct to say that I need to give arguments in favour of the historical claims of Christianity, and I outline how I hope to do that.
Stressing that the pronoun "I" picks out one and only one person in the world (i.e., me), I argue against Hunt (and other like-minded Rand commentators) that the supposed "hard case" of destructive people who do not care for their own lives poses no special difficulty for rational egoism. I conclude that the proper response to a terse objection like "What about suicide bombers?" is the equally terse assertion "But I don't want to get blown up."
The underdetermination of theory by data obtains when, inescapably, evidence is insufficient to allow scientists to decide responsibly between rival theories. One response to would-be underdetermination is to deny that the rival theories are distinct theories at all, insisting instead that they are just different formulations of the same underlying theory; we call this the identical rivals response. An argument adapted from John Norton suggests that the response is presumptively always appropriate, while another from Larry Laudan and Jarrett Leplin suggests that the response is never appropriate. Arguments from Einstein for the special and general theories of relativity may fruitfully be seen as instances of the identical rivals response; since Einstein’s arguments are generally accepted, the response is at least sometimes appropriate. But when is it appropriate? We attempt to steer a middle course between Norton’s view and that of Laudan and Leplin: the identical rivals response is appropriate when there is good reason for adopting a parsimonious ontology. Although in simple cases the identical rivals response need not involve any ontological difference between the theories, in actual scientific cases it typically requires treating apparent posits of the various theories as mere verbal ornaments or computational conveniences. Since these would-be posits are not now detectable, there is no perfectly reliable way to decide whether we should eliminate them or not. As such, there is no rule for deciding whether the identical rivals response is appropriate or not. Nevertheless, there are considerations that suggest for and against the response; we conclude by suggesting two of them.
In a recent article [Mertz 2001] in this journal I argued for the virtues of a realist ontology of relation instances (unit attributes). A major strength of this ontology is an assay of ontic ('material') predication that yields an account of individuation without the necessity of positing and defending 'bare particulars'. The crucial insight is that it is the unifying agency or combinatorial aspect of a relation instance as predicable that is for ontology the principium individuationis [Mertz 2002; 1996]. Or in short, what is ontically predicable, precisely as such, is the cause of individuation. As a preface to this positive doctrine I offered arguments against the coherence of bare particulars as defended in an article by J. P. Moreland. In a reply contained in this issue Moreland and Timothy Pickavance (hereafter M/P) propose to answer my objections. The response that follows provides reasons why, I contend, M/P have not succeeded in parrying my objections to bare particulars.
Wayne Norman and Chris MacDonald launch a strong attack against Triple Bottom Line or 3BL accounting in their article “Getting to the Bottom of ‘Triple Bottom Line’” (2004). This response suggests that, while limitations to 3BL accounting do exist, the critique of Norman and MacDonald is deeply flawed.
Experimental philosophers have recently questioned the use of intuitions as evidence in philosophical methods. J. R. Kuntz and J. R. C. Kuntz (2011) conduct an experiment suggesting that these critiques fail to be properly motivated because they fail to capture philosophers' preferred conceptions of intuition‐use. In this response, it is argued that while there are a series of worries about the design of this study, the data generated by Kuntz and Kuntz support, rather than undermine, the motivation for the experimentalist critiques of intuition they aim to criticize.
First, Allen Buchanan, in the version of his paper entitled 'Philosophy and public policy: a role for social moral epistemology' that he presented at the workshop on 'Philosophy and Public Policy' held at the British Academy in London on March 8th 2008, seems to imply that professional, academic philosophers have had little impact upon public policy. I mention an area where it can be argued in response that they have had a more benign, as well as a more widespread, influence on society than Buchanan acknowledges in that version of his paper: in legislation concerning animal welfare. Second, I question whether or not the liberal commitment to freedom of religion is compatible with the ethics of belief that Buchanan appears to advocate.
In this response to D. Z. Phillips's critique of my interpretation of Wittgenstein's view of magic and ritual, I counter Phillips's claim that I have misrepresented the Wittgensteinian view of ritual, consider the instrumentalist dimension of the Remarks on Frazer's Golden Bough, offer some objections to Phillips's expressivist view that a ritual ‘says itself’, and detect obscurantism in his approach to the study of religion.
Taddeo’s recent article, ‘Information Warfare: A Philosophical Perspective’ (Philos. Technol. 25:105–120, 2012) is a useful addition to the literature on information communications technologies (ICTs) and warfare. In this short response, I draw attention to two issues arising from the article. The first concerns the applicability of ‘information warfare’ terminology to current political and military discourse, on account of its relative lack of contemporary usage. The second engages with the political and ethical implications of treating ICT environments as a ‘domain’, with its ramifications for the pursuit of ‘dominion’, particularly through military action.
The traditional conception of response-dependence is inadequate because it cannot account for all intuitive cases of response-dependence. In particular, it is unable to account for the response-dependence of (aesthetic, moral, epistemic ...) values. I therefore propose to supplement the traditional conception with an alternative one. My claim is that only a combination of the two conceptions is able to account for all intuitive cases of response-dependence.
Abstract: The symmetry argument is an objection to the 'deprivation approach': the account of badness favored by nearly all philosophers who take death to be bad for the one who dies. Frederik Kaufman's recent response to the symmetry argument is a development of Thomas Nagel's suggestion that we could not have come into existence substantially earlier than we in fact did. In this paper, I aim to show that Kaufman's suggestion fails. I also consider several possible modifications of his theory, and argue that they are unsuccessful as well.
According to the proportionality objection to hell, infinite suffering is out of proportion to any wrong that finite human beings could commit and is hence unjust and inconsistent with God's moral perfection. The continuing-sin response concedes that eternal consignment to hell is out of proportion to the sins people commit during their earthly lives, but argues that people in hell continue to sin while in hell and, in this way, extend their consignment to hell ad infinitum. In this essay, I evaluate the continuing-sin response. In particular, I argue that whether there is a proportionality problem to begin with and whether the continuing-sin response succeeds as a response depends on the character of the suffering that is experienced in hell.
G. E. Moore famously offered a strikingly straightforward response to the radical sceptic which simply consisted of the claim that one could know, on the basis of one's knowledge that one has hands, that there exists an external world. In general, the Moorean response to scepticism maintains that we can know the denials of sceptical hypotheses on the basis of our knowledge of everyday propositions. In the recent literature two proposals have been put forward to try to accommodate, to varying extents, this Moorean thesis. On the one hand, there are those who endorse an externalist version of contextualism, such as Keith DeRose, who have claimed that there must be some contexts in which Moore is right. More radically still, Ernest Sosa has expanded on this externalist thesis by arguing that, contra DeRose's contextualism, Moore may be right in all contexts. In this paper I evaluate these claims and argue that, suitably modified, one can resurrect the main elements of the Moorean anti-sceptical thesis.
This paper offers an immanent interpretation of Kant's political teleology in the context of his response to Moses Mendelssohn in ‘Theory and Practice III’ concerning prospects of humankind's moral progress. The paper assesses the nature of Kant's response against his mature political philosophy in the Doctrine of Right. In ‘Theory and Practice III’ Kant's response to Mendelssohn remains incomplete: whilst insisting that individuals have a duty to contribute towards humankind's moral progress, Kant has no conclusive answer as to how individuals might act on that duty. ‘Theory and Practice III’ lacks a clear conception of the distinctness of political morality from the domain of virtue; Kant's resort to teleological argumentation is indicative of his lack of an account of instituting Right. The latter can be found in the Doctrine of Right, yet Kant's earlier teleological arguments contribute crucially to the development of his mature morality of Right. Key Words: inborn duty, moral progress, political teleology, principles of Right.
Alan Shewmon's article, ‘The brain and somatic integration: Insights into the standard biological rationale for equating brain death with death’ (2001), strikes at the heart of the standard justification for whole brain death criteria. The standard justification, which I call the standard paradigm, holds that the permanent loss of the functions of the entire brain marks the end of the integrative unity of the body. In my response to Shewmon's article, I first offer a brief summary of the standard paradigm and cite recent work by advocates of whole brain criteria who tenaciously cling to the standard paradigm despite increasing evidence showing that it has significant weaknesses. Second, I address Shewmon's case against the standard paradigm, arguing that he is successful in showing that whole brain dead patients have integrated organic unity. Finally, I discuss some minor problems with Shewmon's article, along with suggestions for further elaboration.
In response to Michael Bradley, I summarize my account of the criteria by which the various data of natural theology increase the probability of theism and together make it probable. I explain the sense in which a simpler theory leaves less to be explained, justify my claim that God’s perfect goodness is entailed by his other divine properties, and show that not merely is theism simpler than Bradley’s ’Epicurean hypothesis’, but that the ’mixed’ data of natural theology are more to be expected given theism than given the ’Epicurean hypothesis’.
In his recent article ‘Consciousness and Reduction’, Ausonio Marras argues that functional reduction must appeal to bridge laws and thus does not represent a genuine alternative to Nagelian reduction. In response, I first argue that even if functional reduction must use bridge laws, it still represents a genuine alternative to Nagelian reduction. Further, I argue that Marras does not succeed in showing that functional reduction must use bridge laws.
According to a Moorean response to skepticism, the standards for knowledge are invariantly comparatively low, and we can know across contexts all that we ordinarily take ourselves to know. It is incumbent upon the Moorean to defend his position by explaining how, in contexts in which S seems to lack knowledge, S can nevertheless have knowledge. The explanation proposed here relies on a warranted-assertability maneuver: Because we are warranted in asserting that S doesn’t know that p, it can seem that S does in fact lack that piece of knowledge. Moreover, this warranted-assertability maneuver is unique and better than similar maneuvers because it makes use of H. P. Grice’s general conversational rule of Quantity—“Do not make your contribution more informative than is required”—in explaining why we are warranted in asserting that S doesn’t know that p.
This article is a response to some of Philip Stratton-Lake’s criticisms of an earlier paper of mine in this journal, on the so-called ‘buck-passing’ account of goodness. Some elucidation is offered of the ‘wrong kind of reasons’ problem and of T. M. Scanlon’s view, and the question is raised of the role of goodness in the view outlined by Stratton-Lake.
The paper examines three tenets of Dancy’s meta-ethics, finds them incompatible, and proposes a response-dependentist (or response-dispositional) solution. The first tenet is the central importance of thick concepts and properties. The second is that such concepts essentially involve response(s) of observers, which Dancy interprets in a way that fits the pattern of context-dependent resultance: thick concepts are well suited for the particularist grounding of moral theory. However, and this is the third tenet, in his earlier paper (1986) Dancy forcefully argues against response-dispositional accounts of moral concepts and properties. The present paper argues that an anti-dispositional view is incompatible with the first two points concerning thick concepts. If thick concepts and properties are paramount and ubiquitous in moral thought and reality, and if they are essentially tied to human responses, then anti-dispositionalism is false. Dancy himself avoids obvious contradiction by characterizing thick items (concepts) differently from the usual characterization of response-dependent items. Actions that satisfy thick concepts do so in virtue of meriting a determinate response. The (non-reductionist) response-dependentist usually puts it slightly differently: such actions satisfy a given moral concept in virtue of eliciting a merited response. I have argued at length that this tenuous difference in formulation is too weak to support a relevant difference in rebus. If the argument is right, Dancy is implicitly committed to a kind of response-dependentism. Finally, the particularist should embrace thick concepts and properties, and reject anti-dispositionalism. However, this would bring back the analogy with color and other secondary qualities. Since there are ceteris paribus laws governing such properties, the analogy suggests that moral properties might also be best accounted for by a ceteris paribus, or hedged account, a compromise between traditional generalism and the particularism of Dancy’s variety.
We address the following issues raised by the commentators of our target article and book: (1) the problem of multiple perspectives; (2) how to define group selection; (3) distinguishing between the concepts of altruism and organism; (4) genetic versus cultural group selection; (5) the dark side of group selection; (6) the relationship between psychological and evolutionary altruism; (7) the question of whether the psychological questions can be answered; (8) psychological experiments. We thank the contributors for their commentaries, which provide a diverse agenda for future study of evolution and morality. Our response will follow the organization of our book, distinguishing between evolutionary issues that concern fitness effects and psychological issues that concern motives.