Are there any viable semiotic objections to commodification? A semiotic objection holds that even if there is no independent consequentialist or deontic objection to the marketing of a good—such as that it is exploitative or causes third-party harm—there remains a problem with what is said by participating in that market. Recent discussions of semiotic objections have suffered from a basic ambiguity in such talk. As Grice pointed out, there is a difference between saying that smoke on the horizon means fire, and saying that it means there will be war tomorrow. We could say that in the former case smoke indicates fire because of its causal connection with fire, while in the latter case smoke expresses a call to war because that is the non-natural meaning given to it by convention or by its place in a communicative practice. The recent defenses of semiotic objections presented by Anthony Booth, Jacob Sparks, and Mark Wells do not survive this distinction, as they either complain about non-semiotic facts that are indicated rather than expressed by markets, or they complain about semiotic features of markets, but these complaints inevitably collapse into weak consequentialist objections. But this result is not bad for anticommodificationists, as semiotic objections have dialectical disadvantages.
One intuition we have about critical discourse is that we can distinguish between aesthetic and non-aesthetic assertions. When we say that a composition has a quick tempo and makes much use of staccato, we are remarking upon non-aesthetic features of the work. When we say of the same composition that it is vibrant, we are, in some sense, referring to an aesthetic feature. How should we draw the line between the aesthetic and non-aesthetic features of a work, and what import does the distinction have? Frank Sibley has famously claimed that there is a way to draw a line between our aesthetic and non-aesthetic terms, and moreover that the existence of this distinction supports the existence of realistic aesthetic properties. The ensuing discussions of Sibley’s claim indicate that whatever is at stake here is of great significance to aesthetics.
Discrimination is typically understood to be a comparative phenomenon: S is discriminated against on the basis of trait T if she would not have been treated in the same way if she did not possess T. But the comparative test for discrimination may hide from view some important cases: associational discrimination and stereotype policing. These cases show more clearly what is true of discrimination in general: that it involves a vicarious wrong, that is, an action which wrongs someone other than the primary object of disregard. This suggests why philosophers disagree about whether discrimination is wrong because it causes harm or because it expresses disrespect. Paradigmatic cases of discrimination involve both harm and disrespect, but direct the harm and the disrespect towards different targets by treating the harmed individual as a mere representative of the targeted group. Discrimination is dehumanizing, and morally distinctive, precisely because of this fact.
To whom is a duty owed? Contractualism answers with an interest theory of direction. As such, it faces three challenges. The Conceptual Challenge requires acknowledgment that a duty is conceptually distinct from an interest. The Extensional Challenge requires an account of cases in which one who is owed a duty does not take an interest in the duty, or does not take as much of an interest as someone who is not owed the duty. The Positivist Challenge requires explanation of the great flexibility of law and other social practices in positing duties that do not reflect the landscape of moral interests. Contractualism can be shown to meet these challenges once we acknowledge the centrality of the idea of a generic interest. Focusing on generic interests also illuminates the distinctive form of respect involved in directedness.
Moral theories often issue general principles that explain our moral judgments in terms of underlying moral considerations. But it is unclear whether the general principles have an explanatory role beyond the underlying moral considerations. In order to avoid the redundancy of their principles, two-level theories issue principles that appear to generalize beyond the considerations that ground them. In doing so, the principles appear to overgeneralize. The problem is conspicuous in the case of contractualism, which proposes that moral principles are grounded in generic reasons that operate in only a subset of the cases covered by the relevant principle. Arguments that motivate the use of generic reasons on the basis of guidance, institutional necessity, or fairness are unsatisfactory. But the generality of moral principles can be justified in terms of the idea of a generic interest, which is an interest that all occupants of a standpoint have in virtue of what occupants of the standpoint as such typically take an interest in. The notion of a generic interest therefore acknowledges a distinction between having and taking an interest, and its use in moral theory is a way of giving expression to the significance of autonomy. This is reminiscent of the idea of a normative power; but the idea of a generic interest is less expensive and makes better sense of the way in which autonomy features in our moral lives.
Risk is implicit in economic development. When does a course of economic development ethically balance risk and likely benefit? This paper examines the view of risk we find in Amartya Sen’s work on development. It shows that Sen’s capabilities approach leads to a more sensitive understanding of risk than traditional utility theory. Sen’s approach also supplies the basis of an argument for risk aversion in interventions that affect economic development. Sen’s approach describes development as aiming at freedom. The paper shows how this claim can be made compatible with the social and relational basis of African communitarianism.
I analyze a recent exchange between Adam Elga and Julian Jonker concerning unsharp (or imprecise) credences and decision-making over them. Elga holds that unsharp credences are necessarily irrational; I agree with Jonker's reply that they can be rational as long as the agent switches to a nonlinear valuation. Through the lens of computational complexity theory, I then argue that even though nonlinear valuations can be rational, they come in general at the price of computational intractability, and that this problematizes their use in defining rationality. I conclude that the meaning of "rationality" may be philosophically vague.
Mentalizing is an important current topic, both in psychodynamic theory and in clinical practice. Remarkably, mentalizing has been explicitly related to religion or psychology of religion only to a limited extent. This article explores the relevance of the concept of mentalizing for psychology of religion by first describing mentalizing, its development, and its neuropsychological underpinnings. Second, to illustrate how the concept gives more insight into the psychology of religious phenomena, mentalizing is related to an almost universal religious practice, namely religious prayer. Empirical studies from different psychological subdisciplines are interpreted from the perspective of mentalizing. Finally, its relevance for the discipline of psychology of religion is discussed. In this way, the potential of the concept as both an explaining psychological mechanism and a bridging notion that overcomes differences between psychological subdisciplines is demonstrated.
Theoretical ethics includes both metaethics (the meaning of moral terms) and normative ethics (ethical theories and principles). Practical ethics involves making decisions about real, everyday ethical problems, like decisions about euthanasia, what we should eat, climate change, treatment of animals, and how we should live. It utilizes ethical theories, like utilitarianism and Kantianism, and principles, but more broadly a process of reflective equilibrium and consistency, to decide how to act and be.
Julian Wuerth offers a radically new interpretation of major themes in Kant's philosophy. He explores Kant's ontology of the mind, his transcendental idealism, his account of the mind's powers, and his theory of action, and goes on to develop an original, moral realist account of Kant's ethics.
This paper provides preliminary insights into the process of sense-making and developing meaning with regard to corporate social responsibility (CSR) within 18 Dutch companies. It is based upon a research project carried out within the framework of the Dutch National Research Programme on CSR. The paper questions how change agents promoting CSR within these companies made sense of the meaning of CSR. How did they use language (and other instruments) to stimulate and underpin the contextual essence of CSR? Why did they do that in this particular way? What were the consequences of this approach for shaping the process of CSR in their company? Did their efforts contribute to a new way of thinking and acting or was it merely putting old wine in new barrels? A preliminary conclusion is that change agents use above all linguistic artefacts (words and notions) and carry out practical projects while constructing meaning. Still, the meaning of meaning itself remains highly intangible, situational and personality related.
This book argues that correspondence theories of truth fail because the relation that holds between a true thought and a fact is that of identity, not correspondence. Facts are not complexes of worldly entities which make thoughts true; they are merely true thoughts. According to Julian Dodd, the resulting modest identity theory, while not defining truth, correctly diagnoses the failure of correspondence theories, and thereby prepares the ground for a defensible deflation of the concept of truth.
Introduction -- The type/token theory introduced -- Motivating the type/token theory : repeatability -- Nominalist approaches to the ontology of music -- Musical anti-realism -- The type/token theory elaborated -- Types I : abstract, unstructured, unchanging -- Types introduced and nominalism repelled -- Types as abstracta -- Types as unstructured entities -- Types as fixed and unchanging -- Types II : platonism -- Introduction : eternal existence and timelessness -- Types and properties -- The eternal existence of properties reconsidered -- Types and patterns -- Defending the type/token theory I -- Unstructuredness and analogical predication -- Musical works as fixed and unchanging -- Abstractness and audibility (again) -- Works and interpretations -- Conclusion and resumé -- Defending the type/token theory II : musical platonism -- Platonism it is : replies to Anderson and Levinson -- The existence conditions of works of music -- Composition as creative discovery -- The nature of the compositional process : replies to objections -- Composition and aesthetic appraisal : a reply to Levinson -- Composition and aesthetic appraisal : understanding, interpretation, and correctness -- Musical works as continuants : a theory rejected -- A theory introduced -- Explicating and motivating the continuant view -- The continuant view and repeatability -- Further objections to the continuant view -- Musical works as compositional actions : a critique -- Currie's action-type hypothesis -- Davies's performance theory -- Sonicism I : against instrumentalism -- Sonicism introduced -- Sonicism motivated : moderate empiricism -- Instrumentation : timbral sonicism introduced -- Scores -- Instrumentation, artistic properties, and aesthetic content -- Levinson's rejoinder -- Sonicism II : against contextualism -- Introduction : formulating contextualism -- Contextualist ontological proposals -- Levinson's doppelgänger thought-experiments -- Artistic, representational, and object-directed expressive properties -- Aesthetic and non-object-directed expressive properties -- Conclusion : the place of context.
Philosophy of Economics: A Contemporary Introduction is the first systematic textbook in the philosophy of economics. It introduces the epistemological, metaphysical and ethical problems that arise in economics, and presents detailed discussions of the solutions that have been offered. Throughout, philosophical issues are illustrated by and analysed in the context of concrete cases drawn from contemporary economics, the history of economic ideas, and actual economic events. This demonstrates the relevance of philosophy of economics both for the science of economics and for the economy. This text will provide an excellent introduction to the philosophy of economics for students and interested general readers alike.
According to the free energy principle, biological agents resist a tendency to disorder in their interactions with a dynamically changing environment by keeping themselves in sensory and physiological states that are expected given their embodiment and the niche they inhabit (Friston 2010, doi: 10.1038/nrn2787). Why would a biological agent that aims at minimising uncertainty in its encounters with the world ever be motivated to seek out novelty? Novelty for such an agent would arrive in the form of sensory and physiological states that are unexpected. Such an agent ought therefore to avoid novel and surprising interactions with the world, one might think. Yet humans and many other animals find play and other forms of novelty-seeking and exploration hugely rewarding. How can this be understood in frameworks for studying the mind that emphasise prediction error minimisation? This problem has been taken up in recent research concerned with epistemic action—actions an agent engages in to reduce uncertainty. However, that work leaves two questions unanswered, which it is the aim of our paper to address. First, no account has yet been given of why it should feel good to the agent to engage the world playfully and with curiosity. Second, an appeal is made to precision-estimation to explain epistemic action, yet it remains unclear how precision-weighting works in action more generally, or in active inference. We argue that an answer to both questions may lie in the bodily states of an agent that track the rate at which free energy is being reduced. The recent literature on the predictive brain has connected the valence of emotional experiences to the rate of change in the reduction of prediction error (Joffily and Coricelli 2013, doi: 10.1371/journal.pcbi.1003094; Van de Cruys, in Metzinger and Wiese (eds.), Philosophy and Predictive Processing, vol. 24, MIND Group, Frankfurt am Main, 2017, doi: 10.15502/9783958573253).
In this literature, valenced emotional experiences are hypothesised to be identical with changes in the rate at which prediction error is reduced. Experiences are negatively valenced when overall prediction error increases and are positively valenced when the sum of prediction errors decreases. We offer an ecological-enactive interpretation of the concept of valence and its connection to the rate of change of prediction error. We show how rate of change should be understood in terms of embodied states of affordance-related action readiness. We then go on to apply this ecological-enactive account of error dynamics to provide an answer to the first question we have raised: it may explain why it should feel good to an agent to be curious and playful. Our ecological-enactive account also allows us to show how error dynamics may provide an answer to the second question we have raised regarding how precision-weighting works in active inference. An agent that is sensitive to rates of error reduction can tune precision on the fly. We show how this ability to tune precision on the go can allow agents to develop skills for adapting better and better to the unexpected, and to search out opportunities for resolving uncertainty and progressing in their learning.
What is the correct concept behind measures of inflation? Does money cause business activity or is it the other way around? Shall we stimulate growth by raising aggregate demand or rather by lowering taxes and thereby providing incentives to produce? Policy-relevant questions such as these are of immediate and obvious importance to the welfare of societies. The standard approach in dealing with them is to build a model, based on economic theory, answer the question for the model world and then apply the results to economic phenomena outside. Data come in, if at all, only in testing a limited number of the model's consequences. Despite some critical voices, economic methodology too has by and large subscribed to a "theory first" approach to applied economics. Error in Economics systematically develops an alternative to the theory-based orthodoxy. It places the methodical study of evidence at the centre of the scientific enterprise and thus provides a foundation for a methodology of evidence-based economics. But the book does not stop at the truism that claims should be based on the best available evidence. Rather, detailed studies in the areas of measurement, causal inference and policy analysis show what it means for a claim to be evidence-based in the context of a concrete case. The examples discussed concern topics as diverse as consumer price indices, radio spectrum auctions, the transmission mechanism, natural experiments on minimum wages and the evaluation of counterfactuals for policy. Error in Economics is essential reading for economic methodologists, philosophers of science and anyone interested in how claims about socio-economic matters are validated.
This paper argues that a consideration of the problem of providing truthmakers for negative truths undermines truthmaker theory. Truthmaker theorists are presented with an uncomfortable dilemma. Either they must take up the challenge of providing truthmakers for negative truths, or else they must explain why negative truths are exceptions to the principle that every truth must have a truthmaker. The first horn is unattractive since the prospects of providing truthmakers for negative truths do not look good: neither absences, nor totality states of affairs, nor Graham Priest and J.C. Beall’s ‘polarities’ (Beall, 2000; Priest, 2000) are up to the job. The second horn, meanwhile, is problematic because restricting the truthmaker principle to atomic truths, or weakening it to the thesis that truth supervenes on being, undercuts truthmaker theory’s original motivation. The paper ends by arguing that truthmaker theory is, in any case, an under-motivated doctrine because the groundedness of truth can be explained without appeal to the truthmaker principle. This leaves us free to give the deflationary explanation of negative truths that common sense suggests.
This paper examines mathematical models in economics and observes that three mutually inconsistent hypotheses concerning models and explanation are widely held: (1) economic models are false; (2) economic models are nevertheless explanatory; and (3) only true accounts explain. Commentators have typically resolved the paradox by rejecting either one of these hypotheses. I will argue that none of the proposed resolutions work and conclude that therefore the paradox is genuine and likely to stay.
Taking some controversial claims philosopher Jason Brennan makes in his book Against Democracy as a starting point, this paper argues in favour of two theses: There Is No Such Thing as Superior Political Judgement; and There Is No Such Thing as Uncontroversial Social Scientific Knowledge. I conclude that social science experts need to be kept in check, not given more power.
In this beautifully written account, Julian Young provides the most comprehensive biography available today of the life and philosophy of the nineteenth-century German philosopher Friedrich Nietzsche. Young deals with the many puzzles created by the conjunction of Nietzsche's personal history and his work: why the son of a Lutheran pastor developed into the self-styled 'Antichrist'; why this archetypical Prussian came to loathe Bismarck's Prussia; and why this enemy of feminism preferred the company of feminist women. Setting Nietzsche's thought in the context of his times - the rise of Prussian militarism, anti-Semitism, Darwinian science, the 'Youth' and emancipationist movements, as well as the 'death of God' - Young emphasises the decisive influence of Plato and of Richard Wagner on Nietzsche's attempted reform of Western culture.
Heidegger's later philosophy has often been regarded as a lapse into unintelligible mysticism. While not ignoring its deep and difficult complexities, Julian Young's book explains in simple and straightforward language just what it is all about. It examines Heidegger's identification of loss of 'the gods', the violence of technology, and humanity's 'homelessness' as symptoms of the destitution of modernity, and his notion that overcoming 'oblivion of Being' is the essence of a turning to a post-destitute, genuinely post-modern existence. Young argues that Heidegger's conception of such an overcoming is profoundly fruitful with respect to the ancient quest to discover the nature of the good life. His book will be an invaluable resource for both students and scholars of Heidegger's works.
It may soon be possible to generate human organs inside of human-pig chimeras via a process called interspecies blastocyst complementation. This paper discusses what is arguably the central ethical concern raised by this potential source of transplantable organs: that farming human-pig chimeras for their organs risks perpetrating a serious moral wrong because the moral status of human-pig chimeras is uncertain, and potentially significant. Those who raise this concern usually take it to be unique to the creation of chimeric animals with ‘humanised’ brains. In this paper, we show that the same style of argument can be used to critique current uses of non-chimeric pigs in agriculture. This reveals an important tension between two common moral views: that farming human-pig chimeras for their organs is ethically concerning, and that farming non-chimeric pigs for food or research is ethically benign. At least one of these views stands in need of revision.
The publication of the first study to use gene editing techniques in human embryos (Liang et al., 2015) has drawn outrage from many in the scientific community. The prestigious scientific journals Nature and Science have published commentaries which call for this research to be strongly discouraged or halted altogether (Lanphier et al., 2015; Baltimore et al., 2015). We believe this should be questioned. There is a moral imperative to continue this research.
This paper presents arguments that challenge what I call the fact/value separability thesis: the idea, roughly, that factual judgements can be made independently of judgements of value. I will look at arguments to the effect that facts and values are entangled in the following areas of the scientific process in economics: theory development, economic concept formation, economic modelling, hypothesis testing, and hypothesis acceptance.
Advocates of paid living kidney donation frequently argue that kidney sellers would benefit from paid donation under a properly regulated kidney market. The poor outcomes experienced by participants in existing markets are often entirely attributed to harmful black-market practices. This article reviews the medical and anthropological literature on the physical, psychological, social, and financial harms experienced by vendors under Iran's regulated system of donor compensation and under black markets throughout the world, and argues that this body of research not only documents significant harms to vendors, but also provides reasons to believe that such harms would persist under a regulated system. This does not settle the question of whether or not a regulated market should be introduced, but it does strengthen the case against markets in kidneys while suggesting that those advocating such a system cannot appeal to the purported benefits to vendors to support their case.
Anti-realism is plagued by Fitch’s paradox: the remarkable result that if one accepts that all truths are knowable, minimal assumptions about the nature of knowledge entail that every truth is known. Dorothy Edgington suggests addressing this problem by understanding ‘p is knowable’ as a counterfactual claim, but her proposal must contend with a forceful objection by Timothy Williamson. I revisit Edgington’s basic idea and find that Williamson’s objection is obviated by a refined understanding of counterfactual knowability that is grounded in possible courses of inquiry. I arrive at a precise definition of knowability that is not just a technical avoidance of paradox, but is metaphysically sound and does justice to the anti-realist idea.
The smooth integration of the natural sciences with everyday lived experience is an important ambition of radical embodied cognitive science. In this paper we start from Koffka’s recommendation in his Principles of Gestalt Psychology that to realize this ambition psychology should be a “science of molar behaviour”. Molar behaviour refers to the purposeful behaviour of the whole organism directed at an environment that is meaningfully structured for the animal. Koffka made a sharp distinction between the “behavioural environment” and the “geographical environment”. We show how this distinction picks out the difference between the environment as perceived by an individual organism, and the shared publicly available environment. The ecological psychologist James Gibson was later critical of Koffka for inserting a private phenomenal reality in between animals and the shared environment. Gibson tried to make do with just the concept of affordances in his explanation of molar behaviour. We argue, however, that psychology as a science of molar behaviour will need to appeal both to the concept of shared publicly available affordances, and to that of the multiplicity of relevant affordances that invite an individual to act. A version of Koffka’s distinction between the two environments remains alive today in a distinction we have made between the field and landscape of affordances. Having distinguished the two environments, we go on to provide an account of how the two environments are related. Koffka suggested that the behavioural environment forms out of the causal interaction of the individual with a pre-existing, ready-made geographical environment. We argue that such an account of the relation between the two environments fails to do justice to the complex entanglement of the social with the material aspects of the geographical environment.
To better account for this sociomaterial reality of the geographical environment, we propose a process-perspective on our distinction between the landscape and field of affordances. While the two environments can be conceptually distinguished, we argue that they should also be viewed as standing in a relation of reciprocal and mutual dependence.
This paper contributes an analysis and formalisation of Damasio’s theory on core consciousness. Three important concepts in this theory are ‘emotion’, ‘feeling’ and ‘feeling a feeling’. In particular, a simulation model is described of the dynamics of basic mechanisms leading via emotion and feeling to core consciousness, and dynamic properties are formally specified that hold for these dynamics at a more global level. These properties have been automatically checked for the simulation model. Moreover, a formal analysis is made of relevant notions of representation used by Damasio. As part of this analysis, specifications of representation relations have been verified and confirmed against the simulation model.
This paper argues that, within the Western ‘classical’ tradition of performing works of music, there exists a performance value of authenticity that is distinct from that of complying with the instructions encoded in the work's score. This kind of authenticity—interpretive authenticity—is a matter of a performance's displaying an understanding of the performed work. In the course of explaining the nature of this norm, two further claims are defended: that the respective values of interpretive authenticity and score compliance can come into conflict; and that when this happens, compromising ideal score compliance for the sake of making the performance more interpretively authentic can make for a better performance.
Brain organoid research raises ethical challenges not seen in other forms of stem cell research. Given that brain organoids partially recapitulate the development of the human brain, it is plausible that brain organoids could one day attain consciousness and perhaps even higher cognitive abilities. Brain organoid research therefore raises difficult questions about these organoids' moral status – questions that currently fall outside the scope of existing regulations and guidelines. This paper shows how these gaps can be addressed. We outline a moral framework for brain organoid research that can address the relevant ethical concerns without unduly impeding this important area of research.
Arthur Schopenhauer was one of the greatest German writers and philosophers of the nineteenth century. His work influenced figures as diverse as Wagner, Freud and Nietzsche. Best known as a pessimist, he was one of the few philosophers read and admired by Wittgenstein. In this comprehensive introduction, Julian Young covers all the main aspects of Schopenhauer's philosophy. Beginning with an overview of Schopenhauer's life and work, he introduces the central aspects of his metaphysics fundamental to understanding his work as a whole: his philosophical idealism and debt to the philosophy of Kant; his attempt to answer the question of what the world is; his account of science; and in particular his idea that 'will' is the essence of all things. Julian Young then introduces and assesses Schopenhauer's aesthetics, which occupy a central place in his philosophy. He carefully examines Schopenhauer's theories of the sublime, artistic genius and music, before assessing his ethics of compassion, his arguments for pessimism and his account of 'salvation'. In the final chapter, he considers Schopenhauer's legacy and his influence on the thought of Nietzsche and Wittgenstein, making this an ideal starting point for those coming to Schopenhauer for the first time.
Francesco Guala once wrote that ‘The problem of extrapolation is a minor scandal in the philosophy of science’. This paper agrees with the statement, but for reasons different from Guala’s. The scandal is not, or not any longer, that the problem has been ignored in the philosophy of science. The scandal is that framing the problem as one of external validity encourages poor evidential reasoning. The aim of this paper is to propose an alternative—an alternative which constitutes much better evidential reasoning about target systems of interest, and which makes do without consideration of external validity.
What is John Cage's 4′33″? This paper disambiguates this question into three sub-questions concerning, respectively, the work's ontological nature, the art form to which it belongs, and the genre it is in. We shall see that the work's performances consist of silence, that it is a work of performance art, and that it belongs to the genre of conceptual art. Seeing the work in these ways helps us to understand it better, and promises to assuage somewhat the puzzlement and irritation of those who are at first resistant to its charms.
Two approaches to evidential reasoning compete in the biomedical and social sciences: the experimental and the pragmatist. Whereas experimentalism has received considerable philosophical analysis and support since the times of Bacon and Mill, pragmatism about evidence has been neither articulated nor defended. The overall aim is to fill this gap and develop a theory that articulates the latter. The main ideas of the theory will be illustrated and supported by a case study on the smoking/lung cancer controversy in the 1950s.