The paper expands on Ghoshal’s criticism of what management as a scientific discipline teaches and the effects on managerial and societal ethics. The main argument put forward is that the economisation of management has a detrimental effect on the practice of management and on society at large. The ideology of economism is described and analysed from an epistemological perspective. The paper argues that the economisation of management not only introduces the problems of economics (three are identified and discussed) but destroys the very essence of management, because economics is focused on efficiency whereas management should be focused on effectiveness. What is more, the basic axioms of mainstream economics stand in stark contrast to the philosophy of the Enlightenment and therefore endanger the foundations of Western societies. Management theory (via corporate governance) is the Trojan horse carrying economism into society.
This paper argues that the MBA, probably the most successful academic program of the last 50 years, negatively affects the theory and practice of management with regard to ethics through its pedagogy, structure, and its underlying epistemic assumptions. In particular I seek to demonstrate how the syllabus, the pedagogy and the epistemological assumptions of MBA programs together make managers/leaders unable and unwilling to deal with ethics. I also argue that while the what and the how play a very important role, it was only the emergence of a radical philosophical underpinning that has put management education on a negative trajectory. The paper thus examines MBA education from a meta-level perspective, connecting the pedagogical model with epistemological beliefs.
We present a model of the distribution of labour in science. Such models tend to rely on the mechanism of the invisible hand. Our analysis starts from the necessity of standards in distributed processes and the possibility of multiple standards in science. Invisible hand models turn out to have only limited scope because they are restricted to describing the atypical single-standard case. Our model is a generalisation of these models to J standards; single-standard models such as Kitcher’s are a limiting case. We introduce and formalise this model, demonstrate its dynamics and conclude that the conclusions commonly derived from invisible hand models about the distribution of labour in science are not robust against changes in the number of standards.
That certain paper bills have monetary value, that Vladimir Putin is the president of Russia, and that Prince Philip is the husband of Queen Elizabeth II: such facts are commonly called ‘institutional facts’ (IFF). IFF are, by definition, facts that exist by virtue of collective recognition. The standard view or tacit belief is that such facts really exist. In this paper we argue, however, that they really do not—they really are just well-established illusions. We confront realism about IFF with six criteria of existence, three established and three less so but highly intuitive. We argue that they all tell against the existence of IFF. An obvious objection to IFF non-realism is that since people’s behaviour clearly reflects the existence of IFF, denying their existence leaves an explanatory gap. We reject this argument by introducing a variant of the so-called ‘Thomas Theorem,’ which says that when people collectively recognize a fact as existing, they largely behave accordingly, regardless of whether that fact really exists or not.
Philosophy is often conceived in the Anglophone world today as a subject that focuses on questions in particular “core areas,” pre-eminently epistemology and metaphysics. This article argues that the contemporary conception is a new version of the scholastic “self-indulgence for the few” of which Dewey complained nearly a century ago. Philosophical questions evolve, and a first task for philosophers is to address issues that arise for their own times. The article suggests that a renewal of philosophy today should turn the contemporary conception inside out, attending to and developing further the valuable work being done on the supposed “periphery” and attending to the “core areas” only insofar as is necessary to address genuinely significant questions.
Between 1819 and 1832 Friedrich Schleiermacher gave lectures on the life of Jesus at the University of Berlin. The following article includes two partial editions, which document the introductory parts of the lectures from 1819/20 and 1829/30. Both are based on manuscripts written by Schleiermacher’s listeners. These two partial editions should be a useful addition to the new critical edition of Schleiermacher’s Vorlesungen über das Leben Jesu, published in 2018 by Walter Jaeschke, especially for exploring the development of Schleiermacher’s conceptual considerations.
Mountaineering is a dangerous activity. For many mountaineers, part of its very attraction is the risk, the thrill of danger. Yet mountaineers are often regarded as reckless or even irresponsible for risking their lives. In this paper, we offer a defence of risk-taking in mountaineering. Our discussion is organised around the fact that mountaineers and non-mountaineers often disagree about how risky mountaineering really is. We hope to cast some light on the nature of this disagreement – and to argue that mountaineering may actually be worthwhile because of the risks it involves. Section 1 introduces the disagreement and, in doing so, separates out several different notions of risk. Sections 2–4 then consider some explanations of the disagreement, showing how a variety of phenomena can skew people's risk judgements. Section 5 then surveys some recent statistics, to see whether these illuminate how risky mountaineering is. In light of these considerations, however, we suggest that the disagreement is best framed not simply in terms of how risky mountaineering is but whether the risks it does involve are justified. The remainder of the paper, sections 6–9, argues that risk-taking in mountaineering often is justified – and, moreover, that mountaineering can itself be justified by and because of the risks it involves.
If someone abstains from meat-eating for reasons of taste or personal economics, no moral or philosophical question arises. But when a vegetarian attempts to persuade others that they, too, should adopt his diet, then what he says requires philosophical attention. While a vegetarian might argue in any number of ways, this essay will be concerned only with the argument for a vegetarian diet resting on a moral objection to the rearing and killing of animals for the human table. The vegetarian, in this sense, does not merely require us to change or justify our eating habits, but to reconsider our attitudes and behaviour towards members of other species across a wide range of practices.
Is rhetoric just a new and trendy way to épater les bourgeois? Unfortunately, I think that the newfound interest of some economists in rhetoric, and particularly Donald McCloskey in his new book and subsequent responses to critics, gives that impression. After economists have worked so hard for the past five decades to learn their sums, differential calculus, real analysis, and topology, it is a fair bet that one could easily hector them about their woeful ignorance of the conjugation of Latin verbs or Aristotle's Six Elements of Tragedy. Moreover, it has certainly become an academic cliché that economists write as gracefully and felicitously as a hundred monkeys chained to broken typewriters. The fact that economists still trot out Keynes's prose in their defense is itself an index of the inarticulate desperation of an inarticulate profession.
Roger Crisp has inspired two important criticisms of Scanlon's buck-passing account of value. I defend buck-passing from the wrong kind of reasons criticism, and the reasons and the good objection. I support Rabinowicz and Rønnow-Rasmussen's dual role of reasons in refuting the wrong kind of reasons criticism, even where its authors claim it fails. Crisp's reasons and the good objection contends that the property of goodness is buck-passing in virtue of its formality. I argue that Crisp conflates general and formal properties, and that Scanlon is ambiguous about whether the formal property of a reason can stop the buck. Drawing from Wallace, I respond to Crisp's reasons and the good objection by developing an augmented buck-passing account of reasons and value, where the buck is passed consistently from the formal properties of both to the substantive properties of considerations and evaluative attitudes. I end by describing two unresolved problems for buck-passers.
Until the eighteenth century, Latin was the uncontested language of academic discourse, including theology. Regardless of their denominational affiliation, scholars all across Europe made use of Latin in both their publications and lectures. Then, due to the influence of various strands of post-Kantian philosophy, a change took place, at least in the German-speaking area. With recourse to classical German philosophy, many Catholic systematic theologians switched to their mother tongue and adopted the newly coined terms in order to express the same faith. In reaction to this transformative work the neo-scholastic movement came into existence. Its adherents stressed the Church’s tradition and, especially, its indebtedness to medieval thought. From the mid-nineteenth century onwards, partly supported by the Magisterium, various attempts were made to re-introduce Latin into dogmatics. This project was unsuccessful, however, because of changes to the Catholic world ushered in by the Second Vatican Council and also because of developments in German educational policy, which served to lower the status of Latin in schools.
Matthias Vogel challenges the belief, dominant in contemporary philosophy, that reason is determined solely by our discursive, linguistic abilities as communicative beings. In his view, the medium of language is not the only force of reason. Music, art, and other nonlinguistic forms of communication and understanding are also significant. Introducing an expansive theory of mind that accounts for highly sophisticated, penetrative media, Vogel advances a novel conception of rationality while freeing philosophy from its exclusive attachment to linguistics. Vogel's media of reason treats all kinds of understanding and thought, propositional and nonpropositional, as important to the processes and production of knowledge and thinking. By developing an account of rationality grounded in a new conception of media, he raises the profile of the prelinguistic and nonlinguistic dimensions of rationality and advances the Enlightenment project, buffering it against the postmodern critique that the movement fails to appreciate aesthetic experience. Guided by the work of Jürgen Habermas, Donald Davidson, and a range of media theorists, including Marshall McLuhan, Vogel rebuilds, if he does not remake, the relationship among various forms of media -- books, movies, newspapers, the Internet, and television -- while offering an original and exciting contribution to media theory.
Jonathan Glover and I, while not in such deep disagreement about the ethics of killing as to make all communication impossible, still disagree enough to make sustained confrontation worthwhile. At minimum, such confrontation should make it clear what are the most fundamental issues at stake in ethical arguments about various kinds of killing.
I develop a theory of counterfactuals about relative computability, i.e. counterfactuals such as 'If the validity problem were algorithmically decidable, then the halting problem would also be algorithmically decidable,' which is true, and 'If the validity problem were algorithmically decidable, then arithmetical truth would also be algorithmically decidable,' which is false. These counterfactuals are counterpossibles, i.e. they have metaphysically impossible antecedents. They thus pose a challenge to the orthodoxy about counterfactuals, which would treat them as uniformly true. What’s more, I argue that these counterpossibles don’t just appear in the periphery of relative computability theory but instead they play an ineliminable role in the development of the theory. Finally, I present and discuss a model theory for these counterfactuals that is a straightforward extension of the familiar comparative similarity models.
This brief opening for a special issue of Tradition and Discovery: The Polanyi Society Periodical on Philip Clayton’s thought and its connection with that of Michael Polanyi introduces Clayton’s essay and the responses by Martinez Hewlett, Gregory R. Peterson, Andy F. Sanders and Walter B. Gulick.
The following objection to the ‘ontological’ argument of St Anselm has a continuing importance. The argument begs the question by introducing into the first premise the name ‘God’. In order for something to be truly talked about, to have properties truly attributed to it—it has been said—it must exist; a statement containing a vacuous name must either be false, meaningless, or lacking in truth-value, if it is not a misleading formulation to be explained by paraphrase into other terms. In any case the question of the divine existence is begged.
It has become fashionable to try to prove the impossibility of there being a God. Findlay's celebrated ontological disproof has in the past quarter century given rise to vigorous controversy. More recently James Rachels has offered a moral argument intended to show that there could not be a being worthy of worship. In this paper I shall examine the position Rachels is arguing for in some detail. I shall endeavor to show that his argument is unsound and, more interestingly, that the genuine philosophical perplexity which motivates it can be dispelled without too much difficulty.
In a recent paper, Robert A. Oakes argues that a doctrine central to, and partially constitutive of, classical theism implies a certain sort of pantheism. The doctrine in question is a modal form of the claim that God conserves in existence the world of contingent things; alternatively, it is the view that all contingently existing things are necessarily continuously dependent upon God for their existence. And the variety of pantheism at stake is a modal form of the thesis that all contingent things are, in some sense, included within the being of God.
Whether the prefrontal cortex is part of the neural substrates of consciousness is currently debated. Against prefrontal theories of consciousness, many have argued that neural activity in the prefrontal cortex does not correlate with consciousness but with subjective reports. We defend prefrontal theories of consciousness against this argument. We surmise that the requirement for reports is not a satisfying explanation of the difference in neural activity between conscious and unconscious trials, and that prefrontal theories of consciousness come out of this debate unscathed.
Simulations are used in very different contexts and for very different purposes. An emerging development is the possibility of using simulations to obtain a more or less representative reproduction of organs or even entire persons. Such simulations are framed and discussed using the term ‘digital twin’. This paper unpacks and scrutinises the current use of such digital twins in medicine and the ideas embedded in this practice. First, the paper maps the different types of digital twins. A special focus is put on the concrete challenges inherent in the interactions between persons and their digital twin. Second, the paper addresses the questions of how far a digital twin can represent a person and what the consequences of this may be. Against the background of these two analytical steps, the paper formulates initial conditions under which digital twins can take on an ethically justifiable form of representation.
Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation, where the manufacturer/operator of the machine is in principle no longer capable of predicting the future machine behaviour, and thus cannot be held morally responsible or liable for it. Society must decide between not using this kind of machine any more (which is not a realistic option), or facing a responsibility gap, which cannot be bridged by traditional concepts of responsibility ascription.
This volume gathers eleven new and three previously published essays that take on questions of epistemic justification, responsibility, and virtue. It contains the best recent work in this area by major figures such as Ernest Sosa, Robert Audi, Alvin Goldman, and Susan Haack.
Opponents of consciousness in fish argue that fish do not feel pain because they do not have a neocortex, which is a necessary condition for feeling pain. A common counter-argument appeals to the multiple realizability of pain: while a neocortex might be necessary for feeling pain in humans, pain might be realized differently in fish. This paper argues, first, that it is impossible to find a criterion allowing us to demarcate between plausible and implausible cases of multiple realization of pain without running into a circular argument. Second, opponents of consciousness in fish cannot be provided with reasons to believe in the multiple realizability of pain. I conclude that the debate on the existence of pain in fish is impossible to settle by relying on the multiple realization argument.
Conditional structures lie at the heart of the sciences, humanities, and everyday reasoning. It is hence not surprising that conditional logics – logics specifically designed to account for natural language conditionals – are an active and interdisciplinary area. The present book gives a formal and a philosophical account of indicative and counterfactual conditionals in terms of Chellas-Segerberg semantics. For that purpose a range of topics are discussed such as Bennett’s arguments against truth value based semantics for indicative conditionals.
According to William Alston, we lack voluntary control over our propositional attitudes because we cannot believe intentionally, and we cannot believe intentionally because our will is not causally connected to belief formation. Against Alston, I argue that we can believe intentionally because our will is causally connected to belief formation. My defense of this claim is based on examples in which agents have reasons for and against believing p, deliberate on what attitude to take towards p, and subsequently acquire an attitude A towards p because they have decided to take attitude A. From the possibility of intentional belief, two conclusions follow. First, the kind of control we have over our propositional attitudes is direct; it is possible for us to believe at will. Second, the question of whether what we believe is under our control ultimately depends on whether our will itself is under our control. It is, therefore, a question of the metaphysics of free will.
According to a standard criticism, Robert Brandom's “normative pragmatics”, i.e. his attempt to explain normative statuses in terms of practical attitudes, faces a dilemma. If practical attitudes and their interactions are specified in purely non-normative terms, then they underdetermine normative statuses; but if normative terms are allowed into the account, then the account becomes viciously circular. This paper argues that there is no dilemma, because the feared circularity is not vicious. While normative claims do exhibit their respective authors' practical attitudes and thereby contribute towards establishing the normative statuses they are about, this circularity is not a flaw of Brandom's explanatory strategy but a feature of the social practice of which we theorists partake.
As for most measurement procedures in the course of their development, measures of consciousness face the problem of coordination, i.e., the problem of knowing whether a measurement procedure actually measures what it is intended to measure. I focus on the case of the Perceptual Awareness Scale to illustrate how ignoring this problem leads to ambiguous interpretations of subjective reports in consciousness science. In turn, I show that empirical results based on this measurement procedure might be systematically misinterpreted.
Eleven pairs of newly commissioned essays face off on opposite sides of fundamental problems in current theories of knowledge. Brings together fresh debates on eleven of the most controversial issues in epistemology. Questions addressed include: Is knowledge contextual? Can skepticism be refuted? Can beliefs be justified through coherence alone? Is justified belief responsible belief? Lively debate format sharply defines the issues, and paves the way for further discussion. Will serve as an accessible introduction to the major topics in contemporary epistemology, whilst also capturing the imagination of professional philosophers.
The scientific study of consciousness emerged as an organized field of research only a few decades ago. As empirical results have begun to enhance our understanding of consciousness, it is important to find out whether other factors, such as funding for consciousness research and status of consciousness scientists, provide a suitable environment for the field to grow and develop sustainably. We conducted an online survey on people’s views regarding various aspects of the scientific study of consciousness as a field of research. 249 participants completed the survey, among which 80% were in academia, and around 40% were experts in consciousness research. Topics covered include the progress made by the field, funding for consciousness research, job opportunities for consciousness researchers, and the scientific rigor of the work done by researchers in the field. The majority of respondents (78%) indicated that scientific research on consciousness has been making progress. However, most participants perceived obtaining funding and getting a job in the field of consciousness research as more difficult than in other subfields of neuroscience. Overall, work done in consciousness research was perceived to be less rigorous than other neuroscience subfields, but this perceived lack of rigor was not related to the perceived difficulty in finding jobs and obtaining funding. Lastly, we found that, overall, the global workspace theory was perceived to be the most promising (around 28% of respondents), while non-expert researchers most often chose the integrated information theory (IIT) as most promising (around 22% of non-experts). We believe the survey results provide an interesting picture of current opinions from scientists and researchers about the progress made and the challenges faced by consciousness research as an independent field. They will inspire collective reflection on the future directions regarding funding and job opportunities for the field.
The advance of science and human knowledge is impeded by misunderstandings of various statistics, insufficient reporting of findings, and the use of numerous standardized and non-standardized presentations of essentially identical information. Communication with journalists and the public is hindered by the failure to present statistics that are easy for non-scientists to interpret as well as by use of the word significant, which in scientific English does not carry the meaning of "important" or "large." This article promotes a new standard method for reporting two-group and two-variable statistics that can enhance the presentation of relevant information, increase understanding of findings, and replace the current presentations of two-group ANOVA, t-tests, correlations, chi-squares, and z-tests of proportions. A brief call to highly restrict the publication of risk ratios, odds ratios, and relative increase in risk percentages is also made, since these statistics appear to provide no useful scientific information regarding the magnitude of findings.
Epistemic deontology is the view that the concept of epistemic justification is deontological: a justified belief is, by definition, an epistemically permissible belief. I defend this view against the argument from doxastic involuntarism, according to which our doxastic attitudes are not under our voluntary control, and thus are not proper objects for deontological evaluation. I argue that, in order to assess this argument, we must distinguish between a compatibilist and a libertarian construal of the concept of voluntary control. If we endorse a compatibilist construal, it turns out that we enjoy voluntary control over our doxastic attitudes after all. If, on the other hand, we endorse a libertarian construal, the result is that, for our doxastic attitudes to be suitable objects of deontological evaluation, they need not be under our voluntary control.
Despite the bad reputation of the legal profession, law remains king in America. A highly diverse society relies on the laws to maintain a working sense of the dignity and inviolability of each individual. And a persistent element in contemporary debates is the fear that naturalistic theories of the human person will erode our belief that we have a dignity greater than that of other natural objects. Thus the endurance of the creation vs. evolution debate is due less to the arguments of creationists, or to the continued influence of the book of Genesis, than to the reading of the evidence provided by Phillip E. Johnson of the University of California, Berkeley, Law School.