This paper briefly reviews the theories that seek to explain the phenomenon of corporate charitable donations and then reviews the empirical issues that have arisen in previous studies in this area. The findings of an analysis of charitable donations data from the entire U.K. FTSE index for the years 1985–2000 are then reported. These findings include the observation of a time-related increase in charitable donations, which is compared with an earlier study to give a 24-year history of charitable donations in the U.K. The findings show little responsiveness of the monetary value of charitable donations to the economic performance of firms. An international comparison over time against U.S. trends is also reported; it shows that U.S. corporations have traditionally been more generous than U.K. firms, but that the trend in the U.S. is downwards. Membership of a U.K.-based "tithing" club (the PerCent Club) is shown to be associated with higher profit performance relative to non-members. Members' charitable contributions as a proportion of profit are shown to be higher than the FTSE mean, although short of the 0.5% target figure in "cash" terms. The paper concludes with a brief discussion of these findings in relation to the theoretical positions advanced for corporate philanthropy.
Between 1819 and 1832, Friedrich Schleiermacher lectured on the life of Jesus at the University of Berlin. The following article includes two partial editions, which document the introductory parts of the lectures of 1819/20 and 1829/30. Both are based on manuscripts written by Schleiermacher’s listeners. These two partial editions should be a useful addition to the new critical edition of Schleiermacher’s Vorlesungen über das Leben Jesu, published in 2018 by Walter Jaeschke, especially for exploring the development of Schleiermacher’s conceptual considerations.
Until the eighteenth century, Latin was the uncontested language of academic discourse, including theology. Regardless of their denominational affiliation, scholars all across Europe made use of Latin in both their publications and lectures. Then, due to the influence of various strands of post-Kantian philosophy, a change took place, at least in the German-speaking area. With recourse to classical German philosophy, many Catholic systematic theologians switched to their mother tongue and adopted the newly coined terms in order to express the same faith. In reaction to this transformative work, the neo-scholastic movement came into existence. Its adherents stressed the Church’s tradition and, especially, its indebtedness to medieval thought. From the mid-nineteenth century onwards, partly supported by the Magisterium, various attempts were made to re-introduce Latin into dogmatics. This project was unsuccessful, however, because of changes to the Catholic world ushered in by the Second Vatican Council and also because of developments in German educational policy, which lowered the status of Latin in schools.
Simulations are used in very different contexts and for very different purposes. An emerging development is the possibility of using simulations to obtain a more or less representative reproduction of organs or even entire persons. Such simulations are framed and discussed using the term ‘digital twin’. This paper unpacks and scrutinises the current use of such digital twins in medicine and the ideas embedded in this practice. First, the paper maps the different types of digital twins. A special focus is placed on the concrete challenges inherent in the interactions between persons and their digital twins. Second, the paper addresses the questions of how far a digital twin can represent a person and what the consequences of this may be. Against the background of these two analytical steps, the paper formulates initial conditions under which digital twins can take on an ethically justifiable form of representation.
A uniform construction for sequent calculi for finite-valued first-order logics with distribution quantifiers is exhibited. Completeness, cut-elimination and midsequent theorems are established. As an application, an analog of Herbrand’s theorem for the four-valued knowledge-representation logic of Belnap and Ginsberg is presented. It is indicated how this theorem can be used for reasoning about knowledge bases with incomplete and inconsistent information.
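Since the abstract's application concerns the four-valued knowledge-representation logic of Belnap and Ginsberg, a small illustration may help fix ideas. The following Python sketch uses the standard "told true"/"told false" encoding of the four values; it is a minimal reconstruction for orientation, not code or notation from the paper, and all identifiers are my own.

```python
# Minimal sketch of Belnap's four truth values, encoded as a pair of flags:
# has the knowledge base been told the proposition is true / told it is false?
from dataclasses import dataclass

@dataclass(frozen=True)
class B4:
    told_true: bool
    told_false: bool
    def __repr__(self):
        return {(False, False): "N", (True, False): "T",
                (False, True): "F", (True, True): "B"}[
                    (self.told_true, self.told_false)]

NONE_ = B4(False, False)  # no information
TRUE_ = B4(True, False)   # told only true
FALSE_ = B4(False, True)  # told only false
BOTH_ = B4(True, True)    # inconsistent information

def neg(a: B4) -> B4:
    # Negation swaps the "told true" and "told false" components.
    return B4(a.told_false, a.told_true)

def conj(a: B4, b: B4) -> B4:
    # Told true iff both conjuncts are told true; told false iff either is.
    return B4(a.told_true and b.told_true, a.told_false or b.told_false)

def disj(a: B4, b: B4) -> B4:
    return B4(a.told_true or b.told_true, a.told_false and b.told_false)

# A contradiction about one proposition does not trivialize the whole base:
print(conj(BOTH_, TRUE_))  # B -- the conflict stays contained
```

The last line shows the property relevant to the abstract: inconsistent information yields the value B rather than logical explosion, which is what makes such a logic suitable for knowledge bases with incomplete and inconsistent information.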
Matthias Vogel challenges the belief, dominant in contemporary philosophy, that reason is determined solely by our discursive, linguistic abilities as communicative beings. In his view, the medium of language is not the only force of reason. Music, art, and other nonlinguistic forms of communication and understanding are also significant. Introducing an expansive theory of mind that accounts for highly sophisticated, penetrative media, Vogel advances a novel conception of rationality while freeing philosophy from its exclusive attachment to linguistics. Vogel's Media of Reason treats all kinds of understanding and thought, propositional and nonpropositional, as important to the processes and production of knowledge and thinking. By developing an account of rationality grounded in a new conception of media, he raises the profile of the prelinguistic and nonlinguistic dimensions of rationality and advances the Enlightenment project, buffering it against the postmodern critique that the movement fails to appreciate aesthetic experience. Guided by the work of Jürgen Habermas, Donald Davidson, and a range of media theorists, including Marshall McLuhan, Vogel rebuilds, if he does not remake, the relationship among various forms of media -- books, movies, newspapers, the Internet, and television -- while offering an original and exciting contribution to media theory.
Conditional structures lie at the heart of the sciences, humanities, and everyday reasoning. It is hence not surprising that conditional logics – logics specifically designed to account for natural language conditionals – constitute an active and interdisciplinary area of research. The present book gives a formal and a philosophical account of indicative and counterfactual conditionals in terms of Chellas-Segerberg semantics. For that purpose, a range of topics is discussed, such as Bennett’s arguments against truth-value-based semantics for indicative conditionals.
The main claim of this paper is that notions of implementation based on an isomorphic correspondence between physical and computational states are not tenable. Rather, "implementation" has to be based on the notion of "bisimulation" in order to block unwanted implementation results and incorporate intuitions from computational practice. A formal definition of implementation is suggested, which satisfies theoretical and practical requirements and may also be used to make the functionalist notion of "physical realization" precise. The upshot of this new definition of implementation is that implementation can no longer distinguish isomorphic bisimilar from non-isomorphic bisimilar systems, thus driving a wedge between the notions of causal and computational complexity. While computationalism does not seem to be affected by this result, the consequences for functionalism are not clear and need further investigation.
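The abstract's central move, replacing state-to-state isomorphism with bisimulation, can be made concrete with a toy example. The following Python sketch implements the standard naive greatest-fixpoint check for strong bisimilarity between finite labelled transition systems; it is a textbook construction offered for illustration, not the paper's formal definition, and all identifiers are hypothetical.

```python
# Naive bisimilarity check: start from the full relation between the two
# state sets and repeatedly delete pairs that violate the back-and-forth
# condition, until a (greatest) fixpoint is reached.

def bisimilar(states1, states2, trans1, trans2, s0, t0):
    """trans1/trans2 map each state to a set of (label, successor) pairs."""
    rel = {(s, t) for s in states1 for t in states2}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            # Forth: every move of s must be matched by some move of t.
            forth = all(any(a == b and (s2, t2) in rel
                            for (b, t2) in trans2[t])
                        for (a, s2) in trans1[s])
            # Back: every move of t must be matched by some move of s.
            back = all(any(a == b and (s2, t2) in rel
                           for (a, s2) in trans1[s])
                       for (b, t2) in trans2[t])
            if not (forth and back):
                rel.discard((s, t))
                changed = True
    return (s0, t0) in rel

# A one-state 'a'-loop and a two-state 'a'-cycle are bisimilar but not
# isomorphic: exactly the kind of pair that, per the abstract, the new
# notion of implementation can no longer tell apart.
loop = {0: {("a", 0)}}
cycle = {0: {("a", 1)}, 1: {("a", 0)}}
print(bisimilar({0}, {0, 1}, loop, cycle, 0, 0))  # True
```

The final example illustrates the wedge the abstract describes: the two systems make exactly the same observations available (so they are computationally equivalent under bisimulation) despite differing in causal structure.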
I develop a theory of counterfactuals about relative computability, i.e. counterfactuals such as 'If the validity problem were algorithmically decidable, then the halting problem would also be algorithmically decidable,' which is true, and 'If the validity problem were algorithmically decidable, then arithmetical truth would also be algorithmically decidable,' which is false. These counterfactuals are counterpossibles, i.e. they have metaphysically impossible antecedents. They thus pose a challenge to the orthodoxy about counterfactuals, which would treat them as uniformly true. What’s more, I argue that these counterpossibles do not just appear at the periphery of relative computability theory but play an ineliminable role in the development of the theory. Finally, I present and discuss a model theory for these counterfactuals that is a straightforward extension of the familiar comparative similarity models.
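The computability-theoretic facts underwriting these two verdicts can be stated compactly; the following is a standard reconstruction in the usual notation for Turing reducibility, not necessarily the paper's own formalism. Writing HALT for the halting problem, VAL for first-order validity, and TA for true arithmetic:

\[
\mathrm{HALT} \equiv_T \mathrm{VAL}, \qquad \mathrm{TA} \not\le_T \mathrm{VAL}.
\]

HALT and VAL are both complete among the recursively enumerable problems, so each is decidable relative to the other, which is why the first counterfactual is true; TA, by Tarski's theorem, is not even arithmetically definable, whereas anything decidable relative to an r.e. set is, which is why the second is false.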
I distinguish between Old Contextualism, New Contextualism, and the Multiple Concepts Theory (MCT). I argue that Old Contextualism cannot handle the following three problems: (i) the disquotational paradox, (ii) upward pressure resistance, and (iii) the inability to avoid accepting skeptical conclusions. New Contextualism, in contrast, can avoid these problems. However, since New Contextualism appears to be a semanticized mirror image of MCT, it remains unclear whether it is in fact a genuine version of contextualism.
Whether the prefrontal cortex is part of the neural substrates of consciousness is currently debated. Against prefrontal theories of consciousness, many have argued that neural activity in the prefrontal cortex does not correlate with consciousness but with subjective reports. We defend prefrontal theories of consciousness against this argument. We contend that the requirement for reports is not a satisfactory explanation of the difference in neural activity between conscious and unconscious trials, and that prefrontal theories of consciousness come out of this debate unscathed.
Eleven pairs of newly commissioned essays face off on opposite sides of fundamental problems in current theories of knowledge. Brings together fresh debates on eleven of the most controversial issues in epistemology. Questions addressed include: Is knowledge contextual? Can skepticism be refuted? Can beliefs be justified through coherence alone? Is justified belief responsible belief? Lively debate format sharply defines the issues, and paves the way for further discussion. Will serve as an accessible introduction to the major topics in contemporary epistemology, whilst also capturing the imagination of professional philosophers.
Some proponents of the Integrated Information Theory of consciousness profess strong views on the Neural Correlates of Consciousness (NCC), namely that large swathes of the neocortex, the cerebellum, at least some sensory cortices, and the so-called limbic system are all inessential for any form of conscious experience. We argue that this connection is not incidental. Conflation between strong and weak versions of the theory has led these researchers to adopt definitions of the NCC that are inconsistent with their own previous definitions, inadvertently betraying the promises of an otherwise fruitful empirical endeavour.
Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms, and agent architectures, create a new situation, in which the manufacturer/operator is in principle no longer capable of predicting the machine's future behaviour, and thus cannot be held morally responsible or liable for it. Society must choose between no longer using this kind of machine (which is not a realistic option) and facing a responsibility gap, which cannot be bridged by traditional concepts of responsibility ascription.
I propose a version of foundationalism with the following distinctive features. First, it includes in the class of basic beliefs ordinary beliefs about physical objects. This makes it unrestricted. Second, it assigns the role of ultimate justifiers to A-states: states of being appeared to in various ways. Such states have propositional content, and are justifiers if they are presumptively reliable. The beliefs A-states justify are basic if they are non-inferential. In the last three sections of the paper, I defend this version of foundationalism against Sellars's famous anti-foundationalist dilemma, according to which sense-experiential states can't be justifiers if they lack propositional content, and can't terminate the justificatory regress if they have propositional content. I argue that the latter of these two claims is false. A-states can play the role of justifiers because they have propositional content, and they can terminate the justificatory regress because they themselves are capable neither of being true or false, nor of being justified or unjustified.
Opponents of consciousness in fish argue that fish do not feel pain because they lack a neocortex, which is a necessary condition for feeling pain. A common counter-argument appeals to the multiple realizability of pain: while a neocortex might be necessary for feeling pain in humans, pain might be realized differently in fish. This paper argues, first, that it is impossible to find a criterion that demarcates plausible from implausible cases of multiple realization of pain without running into a circular argument. Second, opponents of consciousness in fish cannot be given reasons to believe in the multiple realizability of pain. I conclude that the debate on the existence of pain in fish cannot be settled by relying on the multiple realization argument.
After briefly discussing the relevance of the notions of computation and implementation for cognitive science, I summarize some of the problems that have been found in their most common interpretations. In particular, I argue that standard notions of computation, together with a state-to-state correspondence view of implementation, cannot overcome the difficulties posed by Putnam's Realization Theorem and that, therefore, a different approach to implementation is required. The notion of the realization of a function, developed out of physical theories, is then introduced as a replacement for the notional pair computation–implementation. After gradual refinement, taking practical constraints into account, this notion gives rise to the notion of a digital system, which singles out physical systems that could actually be used, and possibly even built.
The paper explains in what sense the GRW matter density theory is a primitive ontology theory of quantum mechanics and why, thus conceived, the standard objections against the GRW formalism do not apply to GRWm. We consider the different options for conceiving the quantum state in GRWm and argue that dispositionalism is the most attractive one.
Business ethics educators strive to produce graduates who not only grasp the principles of ethical decision-making, but who can apply that education when faced with real-world challenges. However, this has proven especially difficult, as good intentions do not always translate into ethical awareness and action. Complementing a behavioral ethics approach with insights from social psychology, we developed an interventional class module, with both online and in-class elements, aimed at increasing students’ awareness of their own susceptibility to unconscious biases and, consequently, unethical behaviors. We deployed this intervention within a problem-based learning course, in which students completed real-world projects for actual business clients. Our results suggest that although students appeared universally aware of the importance of ethical issues in business and generally espoused intentions to act ethically, those who received the intervention were significantly more likely to recognize their own susceptibility to perpetuating unethical business behavior and to identify ethical issues specific to their real-world projects. These results have important implications for behavioral ethics pedagogy and provide a de-biasing interventional approach for bridging classroom knowledge with real-world skills.
As for most measurement procedures in the course of their development, measures of consciousness face the problem of coordination, i.e., the problem of knowing whether a measurement procedure actually measures what it is intended to measure. I focus on the case of the Perceptual Awareness Scale to illustrate how ignoring this problem leads to ambiguous interpretations of subjective reports in consciousness science. In turn, I show that empirical results based on this measurement procedure might be systematically misinterpreted.
This volume gathers eleven new and three previously unpublished essays that take on questions of epistemic justification, responsibility, and virtue. It contains the best recent work in this area by major figures such as Ernest Sosa, Robert Audi, Alvin Goldman, and Susan Haack.
According to William Alston, we lack voluntary control over our propositional attitudes because we cannot believe intentionally, and we cannot believe intentionally because our will is not causally connected to belief formation. Against Alston, I argue that we can believe intentionally because our will is causally connected to belief formation. My defense of this claim is based on examples in which agents have reasons for and against believing p, deliberate on what attitude to take towards p, and subsequently acquire an attitude A towards p because they have decided to take attitude A. From the possibility of intentional belief, two conclusions follow. First, the kind of control we have over our propositional attitudes is direct; it is possible for us to believe at will. Second, the question of whether what we believe is under our control ultimately depends on whether our will itself is under our control. It is, therefore, a question of the metaphysics of free will.
The scientific study of consciousness emerged as an organized field of research only a few decades ago. As empirical results have begun to enhance our understanding of consciousness, it is important to find out whether other factors, such as funding for consciousness research and the status of consciousness scientists, provide a suitable environment for the field to grow and develop sustainably. We conducted an online survey on people’s views regarding various aspects of the scientific study of consciousness as a field of research. 249 participants completed the survey, of whom 80% were in academia and around 40% were experts in consciousness research. Topics covered include the progress made by the field, funding for consciousness research, job opportunities for consciousness researchers, and the scientific rigor of the work done by researchers in the field. The majority of respondents (78%) indicated that scientific research on consciousness has been making progress. However, most participants perceived obtaining funding and getting a job in the field of consciousness research as more difficult than in other subfields of neuroscience. Overall, work done in consciousness research was perceived to be less rigorous than in other neuroscience subfields, but this perceived lack of rigor was not related to the perceived difficulty in finding jobs and obtaining funding. Lastly, we found that, overall, the global workspace theory was perceived to be the most promising (around 28%), while among non-expert researchers the integrated information theory (IIT) was the most frequent choice (around 22% of non-experts). We believe the survey results provide an interesting picture of current opinions from scientists and researchers about the progress made and the challenges faced by consciousness research as an independent field. They will inspire collective reflection on future directions regarding funding and job opportunities for the field.
Consciousness scientists have not reached consensus on two of the most central questions in their field: first, whether consciousness overflows reportability; second, the physical basis of consciousness. I review the scientific literature of the 19th century to provide evidence that disagreement on these questions has been a feature of the scientific study of consciousness for a long time. Based on this historical review, I hypothesize that a unifying explanation of disagreement on these questions, up to this day, is that scientific theories of consciousness are underdetermined by the evidence, namely, that they can be preserved “come what may” in the face of (seemingly) disconfirming evidence. Consciousness scientists may have to find a way of resolving this persistent underdetermination of theories of consciousness to make further progress.
Neo-logicism is, not least in the light of Frege’s logicist programme, an important topic in the current philosophy of mathematics. In this essay, I critically discuss a number of issues that I consider to be relevant for both Frege’s logicism and neo-logicism. I begin with a brief introduction to Wright’s neo-Fregean project and mention the main objections that he faces. In Sect. 2, I discuss the Julius Caesar problem and its possible Fregean and neo-Fregean solution. In Sect. 3, I raise what I take to be a central objection to the position of neo-logicism. In Sect. 4, I attempt to clarify how we should understand Frege’s stipulation that the two sides of an abstraction principle qua contextual definition of a term-forming operator shall be “gleichbedeutend”. In Sect. 5, I consider the options that Frege might have had to establish the analyticity of Hume’s Principle: the number that belongs to the concept F is equal to the number that belongs to the concept G if and only if F and G are equinumerous. Section 6 is devoted to Frege’s two criteria of thought identity. In Sects. 7 and 8, I defend the position of the neo-logicist against an alleged “knock-down argument”. In Sect. 9, I comment on Frege’s description of abstraction in Grundlagen, §64 and the use of the terms “recarving” and “reconceptualization” in the relevant literature on Fregean abstraction and neo-logicism. I argue that Fregean abstraction has nothing to do with the recarving of a sentence content or its decomposition in different ways. I conclude with remarks on global logicism versus local logicisms.
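For reference, Hume's Principle as stated in the abstract admits a standard second-order formalization; the rendering below is the conventional one from the neo-Fregean literature, not a quotation from the paper:

\[
\mathrm{HP}:\quad \forall F\,\forall G\,\bigl(\#F = \#G \;\leftrightarrow\; F \approx G\bigr),
\]

where \(\#\) is the term-forming operator "the number that belongs to the concept ..." and \(F \approx G\) (equinumerosity) abbreviates the second-order claim that some relation \(R\) correlates the \(F\)s and the \(G\)s one-to-one.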
This paper argues that economics, over the past 200 years, has become steadily more anti-philosophical, and that there are three stages in the development of economic thought. Adam Smith intended economics to be a descriptive social science, rooted in an understanding of the moral and psychological processes of an individual’s decision-making and its connection to society in general. Yet, immediately after Smith’s death, economists made a clean cut and invented a totally new discipline: they switched towards a physicalist understanding of human nature. Humans, like atoms, follow a natural law: they are driven by an emotion, namely selfishness. Thus economics became a ‘natural’ science. In the 20th century, the second reinterpretation removed all traces of humanity from the study of economics and declared economics to be a formal science like mathematics and logic. The actor in Phase 3 economics is homo economicus syntheticus, a postulate whose only connection to real humanity is the word homo. The paper asks what the results of this dramatic relocation are and why Phase 3 economics still claims descent from Smithian economics, despite the massive differences.
Epistemic deontology is the view that the concept of epistemic justification is deontological: a justified belief is, by definition, an epistemically permissible belief. I defend this view against the argument from doxastic involuntarism, according to which our doxastic attitudes are not under our voluntary control, and thus are not proper objects for deontological evaluation. I argue that, in order to assess this argument, we must distinguish between a compatibilist and a libertarian construal of the concept of voluntary control. If we endorse a compatibilist construal, it turns out that we enjoy voluntary control over our doxastic attitudes after all. If, on the other hand, we endorse a libertarian construal, the result is that, for our doxastic attitudes to be suitable objects of deontological evaluation, they need not be under our voluntary control.
Adam Smith is often falsely portrayed as having argued that radical selfishness is a force for the good and that this "invisible hand" is his market mechanism. This paper argues that Smith’s real market mechanism, the sympathy manoeuvre, is a viable alternative to Schumpeterian and mainstream models of innovation in economics and could also help build a firmer theoretical basis for other approaches such as Responsible Innovation. To Smith, all human activity was social and must be understood and explained in terms of the sentiments involved. Discovery, for instance, is driven by three sentiments; economic activities by sympathetic imagination, the need for exchange, the need to better one’s position in life, and the need for gratitude. Through sympathetic imagination, his famous model of the impartial spectator, Smith elegantly connects the individual and society. Smith’s innovation process is thus an exercise in social construction and not a destructive process based on radical selfishness. The paper argues that this social innovation process is a viable alternative to the extant approaches, which are essentially asocial and amoral or ideologically normative.
In this paper, I argue that the rejection of doxastic voluntarism is not as straightforward as its opponents take it to be. I begin with a critical examination of William Alston's defense of involuntarism and then focus on the question of whether belief is intentional.
According to a standard criticism, Robert Brandom's “normative pragmatics”, i.e. his attempt to explain normative statuses in terms of practical attitudes, faces a dilemma. If practical attitudes and their interactions are specified in purely non-normative terms, then they underdetermine normative statuses; but if normative terms are allowed into the account, then the account becomes viciously circular. This paper argues that there is no dilemma, because the feared circularity is not vicious. While normative claims do exhibit their respective authors' practical attitudes and thereby contribute towards establishing the normative statuses they are about, this circularity is not a defect of Brandom's explanatory strategy but a feature of the social practice in which we theorists partake.
In this paper, I examine Alston's arguments for doxastic involuntarism. Alston fails to distinguish (i) between volitional and executional lack of control, and (ii) between compatibilist and libertarian control. As a result, he fails to notice that, if one endorses a compatibilist notion of voluntary control, the outcome is a straightforward and compelling case for doxastic voluntarism. Advocates of involuntarism have recently argued that the compatibilist case for doxastic voluntarism can be blocked by pointing out that belief is never intentional. In response to this strategy, I distinguish between two types of intentionality and argue that belief is no less intentional than action is.
Consciousness is scientifically challenging to study because of its subjective aspect. This leads researchers to rely on report-based experimental paradigms in order to discover neural correlates of consciousness (NCCs). I argue that the reliance on reports has biased the search for NCCs, thus creating what I call 'methodological artefacts'. This paper has three main goals: first, to describe the measurement problem in consciousness science and argue that this problem led to the emergence of methodological artefacts; second, to provide a critical assessment of the NCCs put forward by the global neuronal workspace theory; and third, to provide the means of dissociating genuine NCCs from methodological artefacts.
Defined narrowly, epistemology is the study of knowledge and justified belief. As the study of knowledge, epistemology is concerned with the following questions: What are the necessary and sufficient conditions of knowledge? What are its sources? What is its structure, and what are its limits? As the study of justified belief, epistemology aims to answer questions such as: How are we to understand the concept of justification? What makes justified beliefs justified? Is justification internal or external to one's own mind? Understood more broadly, epistemology is about issues having to do with the creation and dissemination of knowledge in particular areas of inquiry. This article will provide a systematic overview of the problems that the questions above raise and focus in some depth on issues relating to the structure and the limits of knowledge and justification.
I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy to the case of Jean Perrin’s experimental work on the atomic hypothesis, disputing Stanford’s claim that the problem of unconceived alternatives invalidates a realist interpretation of this historical episode. Contents: 1 Stanford’s Argument from Unconceived Alternatives; 2 Previous Attempts to Undermine the Problem of Unconceived Alternatives (2.1 The plausibility of unconceived alternatives; 2.2 The distinctness of unconceived alternatives; 2.3 The induction from past to present); 3 Causal Knowledge as a Criterion for the Realist (3.1 How Chakravartty’s proposal differs from earlier causal strategies; 3.2 Causal realism and the detection/auxiliary distinction); 4 Causal Realism, Unconceived Alternatives, and the Atomic Hypothesis (4.1 Perrin and the philosophers: some initial observations; 4.2 Roush and Stanford on Perrin; 4.3 From Brownian motion to the reality of atoms; 4.4 What we know about atoms); 5 Conclusion.