It has been observed that whereas painters and musicians are likely to be embarrassed by references to the beauty of their work, mathematicians enjoy discussing the beauty of mathematics. Professional artists tend to stress the technical rather than the aesthetic aspects of their work; mathematicians, by contrast, are fond of passing judgment on the beauty of their favored pieces of mathematics. Even a cursory observation shows that the characteristics of mathematical beauty are at variance with those of artistic beauty. For example, courses in art appreciation are fairly common, yet it is unthinkable to find a course in mathematical beauty appreciation taught anywhere. The purpose of the present paper is to uncover the sense of the term 'beauty' as it is currently used by mathematicians.
We shall argue that the attempt carried out by certain philosophers in this century to parrot the language, the method, and the results of mathematics has harmed philosophy. Such an attempt results from a misunderstanding of both mathematics and philosophy, and has harmed both subjects.
In the paper it is argued that bridging the digital divide may cause a new ethical and social dilemma. Using Hardin's Tragedy of the Commons, we show that an improper opening and enlargement of the digital environment (Infosphere) is likely to produce a Tragedy of the Digital Commons (TDC). In the course of the analysis, we explain why Adar and Huberman's previous use of Hardin's Tragedy to interpret certain recent phenomena in the Infosphere (especially peer-to-peer communication) may not be entirely satisfactory. We then seek to provide an improved version of the TDC that avoids the possible shortcomings of their model. Next, we analyse some problems encountered by the application of classical ethics in the resolution of the TDC. In the conclusion, we outline the kind of work that will be required to develop an ethical approach that may bridge the digital divide but avoid the TDC.
The essay offers some observations on the three lectures given by US federal judge Guido Calabresi in Macerata in 2012. The lectures concern the organization of the federal courts of justice in the United States, the form and role of judicial opinions within a federal system, and the relationship that a judge firmly opposed to the death penalty maintains with it. According to Melis, these lectures convey, in an extraordinarily vivid and precise way, both the structure of the judicial system and, at the same time, the subjective experience of a judge.
Husserl’s Third Logical Investigation, ostensibly dealing with the phenomenology of wholes and parts, is actually meant to introduce the notion of Fundierung. This term is frequently used in the phenomenological literature, although little has been written about Fundierung itself since Husserl introduced it; Husserl himself, although he used the notion extensively, never again felt the need to reopen the discussion.
Modern society is challenged by a loss of efficiency in national governance systems, values, and lifestyles. Corporate social responsibility (CSR) discourse builds upon a conception of organizational legitimacy that does not appropriately reflect these changes. The problems arise from the a-political role of the corporation in the concepts of cognitive and pragmatic legitimacy, which are based on compliance with national law and on relatively homogeneous and stable societal expectations on the one hand, and on widely accepted rhetoric assuming that all members of society benefit from capitalist production on the other. We therefore propose a fundamental shift to moral legitimacy: from an output- and power-oriented approach to an input-related and discursive concept of legitimacy. This shift creates a new basis of legitimacy and involves organizations in processes of active justification vis-à-vis society rather than simply responding to the demands of powerful groups. We consider this a step towards the politicization of the corporation and attempt to re-embed the debate on corporate legitimacy into its broader context of political theory, while reflecting the recent turn from a liberal to a deliberative concept of democracy.
A plurality of axiomatic systems can be interpreted as referring to one and the same mathematical object. In this paper we examine the relationship between axiomatic systems and their models, the relationships among the various axiomatic systems that refer to the same model, and the role of an intelligent user of an axiomatic system. We ask whether these relationships and this role can themselves be formalized.
This article reports on the development of the managerial ethical profile (MEP) scale. The MEP scale is a multilevel, self-reporting scale measuring the perceived influence that different dimensions of common ethical frameworks have on managerial decision making. The MEP scale measures on eight subscales: economic egoism, reputational egoism, act utilitarianism, rule utilitarianism, virtue of self, virtue of others, act deontology, and rule deontology. Confirmatory factor analysis (CFA) was used to provide evidence of scale validity. Future research needs and the value of this measure for business ethics are discussed.
The 1927 Solvay conference was perhaps the most important meeting in the history of quantum theory. Contrary to popular belief, the interpretation of quantum theory was not settled at this conference, and no consensus was reached. Instead, a range of sharply conflicting views were presented and extensively discussed, including de Broglie's pilot-wave theory, Born and Heisenberg's quantum mechanics, and Schrödinger's wave mechanics. Today, there is no longer an established or dominant interpretation of quantum theory, so it is important to re-evaluate the historical sources and keep the interpretation debate open. This book contains a complete translation of the original proceedings, with background essays on the three main interpretations of quantum theory presented at the conference, and an extensive analysis of the lectures and discussions in the light of current research in the foundations of quantum theory. The proceedings contain much unexpected material, including extensive discussions of de Broglie's pilot-wave theory (which de Broglie presented for a many-body system), and a theory of 'quantum mechanics' apparently lacking in wave function collapse or fundamental time evolution. This book will be of interest to graduate students and researchers in physics and in the history and philosophy of quantum theory.
In this paper we show that the Gupta-Belnap systems S# and S* are Π^1_2. Since Kremer has independently established that they are Π^1_2-hard, this completely settles the problem of their complexity. The above-mentioned upper bound is established through a reduction to countable revision sequences that is inspired by, and makes use of, a construction due to McGee.
Like artists who fail to give an accurate description of how they work, and like scientists who believe in unrealistic philosophies of science, mathematicians subscribe to a concept of mathematical truth that runs contrary to the truth.
We present an axiomatic approach for a class of finite, extensive form games of perfect information that makes use of notions like “rationality at a node” and “knowledge at a node.” We distinguish between the game theorist's and the players' own “theory of the game.” The latter is a theory that is sufficient for each player to infer a certain sequence of moves, whereas the former is intended as a justification of such a sequence of moves. While in general the game theorist's theory of the game is not and need not be axiomatized, the players' theory must be an axiomatic one, since we model players as analogous to automatic theorem provers that play the game by inferring (or computing) a sequence of moves. We provide the players with an axiomatic theory sufficient to infer a solution for the game (in our case, the backwards induction equilibrium), and prove its consistency. We then inquire what happens when the theory of the game is augmented with information that a move outside the inferred solution has occurred. We show that a theory that is sufficient for the players to infer a solution and still remains consistent in the face of deviations must be modular. By this we mean that players have distributed knowledge of it. Finally, we show that whenever the theory of the game is group-knowledge (or common knowledge) among the players (i.e., it is the same at each node), a deviation from the solution gives rise to inconsistencies and therefore forces a revision of the theory at later nodes. By contrast, whenever a theory of the game is modular, a deviation from equilibrium play does not induce a revision of the theory.
In this paper we introduce three methods to approach philosophical problems informationally: Minimalism, the Method of Abstraction and Constructionism. Minimalism considers the specifications of the starting problems and systems that are tractable for a philosophical analysis. The Method of Abstraction describes the process of making explicit the level of abstraction at which a system is observed and investigated. Constructionism provides a series of principles that the investigation of the problem must fulfil once it has been fully characterised by the previous two methods. For each method, we also provide an application: the problem of visual perception, functionalism, and the Turing Test, respectively.
What makes trust such a powerful concept? Is it merely that in trust the whole range of social forces that we know play together? Or is it that trust involves a peculiar element beyond those we can account for? While trust is an attractive and evocative concept that has gained increasing popularity across the social sciences, it remains elusive, its many facets and applications obscuring a clear overall vision of its essence. In this book, Guido Möllering reviews a broad range of trust research and extracts three main perspectives adopted in the literature for understanding trust. Accordingly, trust is presented as a matter of reason, routine or reflexivity. While all these perspectives contribute something to our understanding of trust, Möllering shows that they imply, but cannot explain, ‘suspension’ – the leap of faith that is typical of trust. He therefore proposes a new direction in trust research that builds on existing perspectives but places the suspension of uncertainty and vulnerability at the heart of the concept of trust. Beyond a purely theoretical line of argument, the author discusses implications for empirical studies of trust and presents original case material that captures the experience of trust in terms of reason, routine, reflexivity and suspension. Möllering concludes by suggesting how the new approach can enhance the relevance of trust research and its contributions to broader research agendas concerning the constitution of positive expectations in the face of prevalent uncertainty and change at various levels in our economies and societies. The book is essential reading for anyone who wants to gain a thorough understanding of trust. It can serve as a general introduction for advanced students and scholars in the social sciences, especially in economics, sociology, psychology and management. For more experienced researchers, it is a challenging and provocative critique of the field and a new approach to understanding trust.
Many models of (un)ethical decision making assume that people decide rationally and are in principle able to evaluate their decisions from a moral point of view. However, people might behave unethically without being aware of it. They are ethically blind. Adopting a sensemaking approach, we argue that ethical blindness results from a complex interplay between individual sensemaking activities and context factors.
Tobacco companies have started to position themselves as good corporate citizens. The effort towards CSR engagement in the tobacco industry is heavily criticized by anti-tobacco NGOs; some opponents, such as the World Health Organization, have even categorically questioned the possibility of social responsibility in the tobacco industry. The paper will demonstrate that the deep distrust towards tobacco companies is linked to the lethal character of their products and the dubious behavior of their representatives in recent decades. As a result, tobacco companies are not in the CSR business in the strict sense. Key aspects of mainstream CSR theory and practice, such as corporate philanthropy, stakeholder collaboration, CSR reporting and self-regulation, are demonstrated to be ineffective or even counterproductive in the tobacco industry. Building upon the terminology used in the leadership literature, the paper proposes to differentiate between transactional and transformational CSR, arguing that tobacco companies can only operate on a transactional level. As a consequence, corporate responsibility in the tobacco industry is based upon a much thinner approach to CSR and has to be conceptualized with a focus on transactional integrity across the tobacco supply chain.
We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts, an open problem in the fields of AI and cognitive modelling. Our logic extends the logic of typicality ALC + T_R, based on the notion of rational closure, by inclusions p :: T(C) ⊑ D (“we have probability p that typical Cs are Ds”), coming from the distributed semantics of probabilistic Description Logics. Additionally, it embeds a set of cognitive heuristics for concept combination. We show that the complexity of reasoning in our logic is EXPTIME-complete, as for ALC.
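As an illustrative aside (the following example is ours, not drawn from the abstract), a typicality inclusion of the kind described might be written as:

```latex
% Hypothetical knowledge base fragment:
% "typical birds fly" holds with probability 0.9,
% while an exceptional subclass is typically non-flying.
0.9 :: \mathbf{T}(\mathit{Bird}) \sqsubseteq \mathit{Fly} \\
0.95 :: \mathbf{T}(\mathit{Penguin}) \sqsubseteq \lnot\mathit{Fly}
```

Because typicality is nonmonotonic, rational closure lets both inclusions coexist consistently: an exceptional subclass inherits typical properties of its superclass only where doing so is consistent.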
Interference phenomena are a well-known and crucial feature of quantum mechanics, the two-slit experiment providing a standard example. There are situations, however, in which interference effects are (artificially or spontaneously) suppressed. We shall need to make precise what this means, but the theory of decoherence is the study of (spontaneous) interactions between a system and its environment that lead to such suppression of interference. This study includes detailed modelling of system-environment interactions, derivation of equations (‘master equations’) for the (reduced) state of the system, discussion of time-scales, etc. A discussion of the concept of suppression of interference and a simplified survey of the theory is given in Section 2, emphasising features that will be relevant to the following discussion (and restricted to standard non-relativistic particle quantum mechanics). A partially overlapping field is that of decoherent histories, which proceeds from an abstract definition of loss of interference, but which we shall not be considering in any detail.
The debate about the appropriate standards for upstream corporate social responsibility of multinational corporations has been on the public and academic agenda for some three decades. The debate originally focused narrowly on “contract responsibility” of MNCs for monitoring of upstream contractors for “sweatshop” working conditions violating employee rights. The authors argue that the MNC upstream responsibility debate has shifted qualitatively over time to “full producer responsibility” involving an expansion from “contract responsibility” in three distinct dimensions. First, there is an expansion of scope from working conditions to human rights and social and environmental impacts broadly defined. Second, there is expansion in depth of this broader responsibility to the whole upstream supply chain without regard to contracting status. Upstream responsibility now includes all suppliers, including direct contractors and the chain of suppliers to such contractors. Finally, the change in CSR scope and depth has led to an evolution of CSR management practice.
Offering a solution to the skeptical puzzle is a central aim of Nozick's sensitivity account of knowledge. It is well-known that this account faces serious problems. However, because of its simplicity and its explanatory power, the sensitivity principle has remained attractive and has been subject to numerous modifications, leading to a variety of sensitivity accounts. I will object to these accounts, arguing that sensitivity accounts of knowledge face two problems. First, they deliver a far too heterogeneous picture of higher-level beliefs about the truth or falsity of one's own beliefs. Second, this problem carries over to bootstrapping and Moorean reasoning. Some beliefs formed via bootstrapping or Moorean reasoning are insensitive, but some closely related beliefs in even stronger propositions are sensitive. These heterogeneous results regarding sensitivity do not fit with our intuitions about bootstrapping and Moorean reasoning. Thus, neither Nozick's sensitivity account of knowledge nor any of its modified versions can provide the basis for an argument that bootstrapping and Moorean reasoning are flawed, or for an explanation of why they seem to be flawed.
In this paper we argue that Revision Rules, introduced by Anil Gupta and Nuel Belnap as a tool for the analysis of the concept of truth, also provide a useful tool for defining computable functions. This also makes good on Gupta and Belnap's claim that Revision Rules provide a general theory of definition, a claim for which they supply only the example of truth. In particular we show how Revision Rules arise naturally from relaxing and generalizing a classical construction due to Kleene, and indicate how they can be employed to reconstruct the class of the general recursive functions. We also point out how Revision Rules can be employed to access non-minimal fixed points of partially defined computing procedures.
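As an illustrative aside (a sketch of our own, not taken from the paper), the classical Kleene construction that the authors relax and generalize builds a partial function as the limit of successive approximations, starting from the everywhere-undefined function. The names `factorial_functional` and `kleene_fixed_point` below are purely illustrative:

```python
# Illustrative sketch of Kleene's least-fixed-point construction.
# A partial function on the naturals is modelled as a dict; each
# application of the defining functional extends the previous stage.

def factorial_functional(f):
    """One step of the defining functional F for factorial: extend the
    partial function f wherever the recursive definition already has
    enough information to produce a value."""
    g = dict(f)
    g[0] = 1                      # base case is always available
    for n, v in f.items():
        g[n + 1] = (n + 1) * v    # recursive case, where f(n) is defined
    return g

def kleene_fixed_point(functional, stages):
    """Iterate F from the empty partial function; the stages grow
    monotonically and approximate the least fixed point."""
    f = {}
    for _ in range(stages):
        f = functional(f)
    return f

fact = kleene_fixed_point(factorial_functional, 6)
# fact is factorial restricted to 0..5, e.g. fact[5] == 120
```

In the paper's terms, a revision rule likewise revises a hypothesis about a function's graph at each stage, but, unlike Kleene's construction, the stages need not grow monotonically, which is what opens access to non-minimal fixed points.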
This paper describes the Eunomos software, an advanced legal document and knowledge management system based on legislative XML and ontologies. We describe the challenges of legal research in an increasingly complex, multi-level and multi-lingual world, and how the Eunomos software helps users cut through the information overload to get the legal information they need in an organized and structured way, and to keep track of the state of the relevant law on any given topic. Using NLP tools to semi-automate the lower-skill tasks makes this ambitious project a realistic commercial prospect, as it helps keep costs down while allowing greater coverage. We describe the core system from workflow and technical perspectives, and discuss applications of the system for various user groups.
Past decades have witnessed the growing success of branding as a corporate activity as well as a rise in anti-brand activism. While appearing to be contradictory, both trends have emerged from common sources – the transition from industrial to post-industrial society, and the advent of globalization – the examination of which might lead to a socially grounded understanding of why brand success in the future is likely to demand more than superior product performance, placing increasing demands on corporations with regard to a broader envelope of socially responsible behavior. Directions for strategic and managerial options are suggested.
Keith DeRose’s solution to the skeptical problem is based on his indirect sensitivity account. Sensitivity is not a necessary condition for any kind of knowledge, as direct sensitivity accounts claim, but the insensitivity of our beliefs that the skeptical hypotheses are false explains why we tend to judge that we do not know them. The orthodox objection line against any kind of sensitivity account of knowledge is to present instances of insensitive beliefs that we still judge to constitute knowledge. This objection line offers counter-examples against the claim of direct sensitivity accounts that sensitivity is necessary for any kind of knowledge. These examples raise an easy problem for indirect sensitivity accounts that claim that there is only a tendency to judge that insensitive beliefs do not constitute knowledge, which still applies to our beliefs that the skeptical hypotheses are false. However, a careful analysis reveals that some of our beliefs that the skeptical hypotheses are false are sensitive; nevertheless, we still judge that we do not know them. Therefore, the fact that some of our beliefs that the skeptical hypotheses are false are insensitive cannot explain why we tend to judge that we do not know them. Hence, indirect sensitivity accounts cannot fulfill their purpose of explaining our intuitions about skepticism. This is the hard problem for indirect sensitivity accounts.
The decision for or against life-sustaining treatment now precedes half of all deaths in Europe. In everyday clinical practice it is often perceived as an ethical challenge, and legal uncertainties and questions about the correct procedure are widespread among clinicians. The Munich guideline on end-of-life decisions presented here is intended to reduce legal uncertainty, to sensitize hospital staff to the ethical dimension of treatment decisions at the end of life, and to promote ethically grounded decisions. From the perspective of organizational ethics, the guideline is meant to foster reflection and opinion formation on an ethically relevant set of issues and to contribute to quality assurance in patient care, and thereby also to patient satisfaction. The processes of guideline development, updating and implementation are presented. The guideline's essential content, including an account of the legal situation, definitions of central concepts and clarification of medical decision criteria, is presented by means of a decision algorithm. The content and the development process of the guideline are evaluated against the following organizational-ethics quality criteria: deliberative process, transparency of content, representative composition of the working group, and implementation and evaluation of the guideline. Institutions that wish to adopt this guideline should have it reviewed by an authorized body and adapt it to their institution's specific needs. A prior needs assessment is helpful for this purpose. The body should include all those who will subsequently work with the guideline.
In public political deliberation, people will err and lie in accordance with definite patterns. Such discourse failure results from behavior that is both instrumentally and epistemically rational. The deliberative practices of a liberal democracy cannot be improved so as to overcome the tendency for rational citizens to believe and say things at odds with reliable propositions of social science. The theory has several corollaries. One is that much contemporary political philosophy can be seen as an unsuccessful attempt to vindicate, on symbolic and moral grounds, the forms that discourse failure takes on in public political deliberation. Another is that deliberative practices cannot be rescued even on non-epistemic grounds, such as social peace, impartiality, participation, and equality. To alleviate discourse failure, this book proposes to reduce the scope of majoritarian politics and enlarge markets.
Background: Decisions on limiting life-sustaining treatment for patients in the vegetative state (VS) are emotionally and morally challenging. In Germany, doctors have to discuss, together with the legal surrogate (often a family member), whether the proposed treatment is in accordance with the patient's will. However, it is unknown whether family members of the patient in the VS actually base their decisions on the patient's wishes. Objective: To examine the role of advance directives, orally expressed wishes, or the presumed will of patients in a VS for family caregivers' decisions on life-sustaining treatment. Methods and sample: A qualitative interview study with 14 next of kin of patients in a VS in a long-term care setting was conducted; 13 participants were the patient's legal surrogates. Interviews were analysed according to qualitative content analysis. Results: The majority of family caregivers said that they were aware of previously expressed wishes of the patient that could be applied to the VS condition, but did not base their decisions primarily on these wishes. They gave three reasons for this: (a) the expectation of clinical improvement, (b) the caregivers' definition of life-sustaining treatments and (c) the moral obligation not to harm the patient. If the patient's wishes were not known or not revealed, the caregivers read a will to live into the patient's survival and non-verbal behaviour. Conclusions: Whether or not prior treatment wishes of patients in a VS are respected depends on their applicability, and also on the medical assumptions and moral attitudes of the surrogates. We recommend repeated communication, support for the caregivers and advance care planning.
Vogel argues that sensitivity accounts of knowledge are implausible because they entail that we cannot have any higher-level knowledge that our beliefs are true, not false. Becker and Salerno object that Vogel is mistaken because he does not formalize higher-level beliefs adequately. They claim that if formalized correctly, higher-level beliefs are sensitive, and can therefore constitute knowledge. However, these accounts do not consider the belief-forming method as sensitivity accounts require. If we take bootstrapping as the belief-forming method, as the discussed cases suggest, then we face a generality problem. Our higher-level beliefs as formalized by Becker and Salerno turn out to be sensitive according to a wide reading of bootstrapping, but insensitive according to a narrow reading. This particular generality problem does not arise for the alternative accounts of process reliabilism and basis-relative safety. Hence, sensitivity accounts not only deliver opposite results given different formalizations of higher-level beliefs, but also deliver opposite results for the same formalization, depending on how we interpret bootstrapping. Therefore, sensitivity accounts do not fail because they make higher-level knowledge impossible, as Vogel argues, and they do not succeed in allowing higher-level knowledge, as Becker and Salerno suggest. Rather, their problem is that they deliver far too heterogeneous results.
Nelson’s programme for a stochastic mechanics aims to derive the wave function and the Schrödinger equation from natural conditions on a diffusion process in configuration space. If successful, this programme might have some advantages over the better-known deterministic pilot-wave theory of de Broglie and Bohm. The essential points of Nelson’s strategy are reviewed, with particular emphasis on conceptual issues relating to the role of time symmetry. The main problem in Nelson’s approach is the lack of strict equivalence between the coupled Madelung equations and the Schrödinger equation. After a brief discussion, the paper concludes with a possible suggestion for trying to overcome this problem.