Representing the co-occurrence relations of an entire corpus as a graph makes it possible to study the organization of words in discourse by lexicometric means. This study reveals two complementary forms of organization of the lexicon. First, a hierarchical organization, which gives certain lemmas particular salience owing to their better positioning in the network of co-occurrence relations. Second, a modular organization, which shows that the co-occurrence links in the text partition the lexicon into classes of lemmas that co-occur with one another. Together, these two forms of lexical organization in texts provide a picture of the text's overall meaning, and make it possible to identify the variability of the lexical meaning of units as they are used in discourse.
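The method summarized above (a corpus-wide co-occurrence graph, with a lemma's salience read off from its position in the network) can be sketched in a few lines of stdlib Python. This is a minimal illustration only: the toy corpus, the sliding-window definition of co-occurrence, and weighted degree as the salience measure are assumptions for the sketch, not the study's actual procedure.

```python
from collections import Counter

def cooccurrence_edges(tokens, window=3):
    """Count how often two distinct lemmas appear within `window` tokens
    of each other, yielding an undirected, weighted co-occurrence graph."""
    edges = Counter()
    for i, lemma in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            if lemma != tokens[j]:
                edges[frozenset((lemma, tokens[j]))] += 1
    return edges

def salience(edges):
    """Hierarchical organization: a lemma's weighted degree serves as a
    crude salience score (better-positioned lemmas score higher)."""
    scores = Counter()
    for pair, weight in edges.items():
        for lemma in pair:
            scores[lemma] += weight
    return scores

# Toy corpus of lemmas (illustrative only).
corpus = "the cat sat on the mat the dog sat on the rug".split()
edges = cooccurrence_edges(corpus, window=3)
scores = salience(edges)
print(scores.most_common(3))
```

The modular organization described in the abstract would correspond to running a community-detection method (e.g. modularity maximization, as provided by a graph library such as networkx) on the same edge set, partitioning the lexicon into classes of mutually co-occurring lemmas.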
Matthias Vogel challenges the belief, dominant in contemporary philosophy, that reason is determined solely by our discursive, linguistic abilities as communicative beings. In his view, the medium of language is not the only force of reason. Music, art, and other nonlinguistic forms of communication and understanding are also significant. Introducing an expansive theory of mind that accounts for highly sophisticated, penetrative media, Vogel advances a novel conception of rationality while freeing philosophy from its exclusive attachment to linguistics. Vogel's media of reason treats all kinds of understanding and thought, propositional and nonpropositional, as important to the processes and production of knowledge and thinking. By developing an account of rationality grounded in a new conception of media, he raises the profile of the prelinguistic and nonlinguistic dimensions of rationality and advances the Enlightenment project, buffering it against the postmodern critique that the movement fails to appreciate aesthetic experience. Guided by the work of Jürgen Habermas, Donald Davidson, and a range of media theorists, including Marshall McLuhan, Vogel rebuilds, if he does not remake, the relationship among various forms of media -- books, movies, newspapers, the Internet, and television -- while offering an original and exciting contribution to media theory.
Between 1819 and 1832 Friedrich Schleiermacher gave lectures on the life of Jesus at the University of Berlin. The following article includes two partial editions, which document the introductory parts of the lectures from 1819/20 and 1829/30. Both are based on manuscripts written by Schleiermacher’s listeners. Especially as a means of exploring the development of Schleiermacher’s conceptual considerations, these two partial editions should be a useful addition to the new critical edition of Schleiermacher’s Vorlesungen über das Leben Jesu, published in 2018 by Walter Jaeschke.
Until the eighteenth century, Latin was the uncontested language of academic discourse, including theology. Regardless of their denominational affiliation, scholars all across Europe made use of Latin in both their publications and lectures. Then, due to the influence of various strands of post-Kantian philosophy, a change took place, at least in the German-speaking area. With recourse to classical German philosophy, many Catholic systematic theologians switched to their mother tongue and adopted the newly coined terms in order to express the same faith. In reaction to this transformative work the neo-scholastic movement came into existence. Its adherents stressed the Church’s tradition and, especially, its indebtedness to medieval thought. From the mid-nineteenth century onwards, partly supported by the Magisterium, various attempts were made to re-introduce Latin into dogmatics. This project was unsuccessful, however, because of changes to the Catholic world ushered in by the Second Vatican Council and also because of developments in German educational policy, which served to lower the status of Latin in schools.
Whether the prefrontal cortex is part of the neural substrates of consciousness is currently debated. Against prefrontal theories of consciousness, many have argued that neural activity in the prefrontal cortex does not correlate with consciousness but with subjective reports. We defend prefrontal theories of consciousness against this argument. We surmise that the requirement for reports is not a satisfying explanation of the difference in neural activity between conscious and unconscious trials, and that prefrontal theories of consciousness come out of this debate unscathed.
After briefly discussing the relevance of the notions of computation and implementation for cognitive science, I summarize some of the problems that have been found in their most common interpretations. In particular, I argue that standard notions of computation together with a state-to-state correspondence view of implementation cannot overcome the difficulties posed by Putnam's Realization Theorem and that, therefore, a different approach to implementation is required. The notion of the realization of a function, developed out of physical theories, is then introduced as a replacement for the notional pair computation-implementation. After gradual refinement, taking practical constraints into account, this notion gives rise to the notion of a digital system, which singles out physical systems that could actually be used, and possibly even built.
Opponents of consciousness in fish argue that fish do not feel pain because they do not have a neocortex, which is a necessary condition for feeling pain. A common counter-argument appeals to the multiple realizability of pain: while a neocortex might be necessary for feeling pain in humans, pain might be realized differently in fish. This paper argues, first, that it is impossible to find a criterion allowing us to demarcate between plausible and implausible cases of multiple realization of pain without running into a circular argument. Second, opponents of consciousness in fish cannot be provided with reasons to believe in the multiple realizability of pain. I conclude that the debate on the existence of pain in fish is impossible to settle by relying on the multiple realization argument.
Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation, in which the manufacturer/operator of the machine is in principle no longer capable of predicting the future machine behaviour, and thus cannot be held morally responsible or liable for it. Society must decide between not using this kind of machine any more (which is not a realistic option) and facing a responsibility gap, which cannot be bridged by traditional concepts of responsibility ascription.
According to William Alston, we lack voluntary control over our propositional attitudes because we cannot believe intentionally, and we cannot believe intentionally because our will is not causally connected to belief formation. Against Alston, I argue that we can believe intentionally because our will is causally connected to belief formation. My defense of this claim is based on examples in which agents have reasons for and against believing p, deliberate on what attitude to take towards p, and subsequently acquire an attitude A towards p because they have decided to take attitude A. From the possibility of intentional belief, two conclusions follow. First, the kind of control we have over our propositional attitudes is direct; it is possible for us to believe at will. Second, the question of whether what we believe is under our control ultimately depends on whether our will itself is under our control. It is, therefore, a question of the metaphysics of free will.
As for most measurement procedures in the course of their development, measures of consciousness face the problem of coordination, i.e., the problem of knowing whether a measurement procedure actually measures what it is intended to measure. I focus on the case of the Perceptual Awareness Scale to illustrate how ignoring this problem leads to ambiguous interpretations of subjective reports in consciousness science. In turn, I show that empirical results based on this measurement procedure might be systematically misinterpreted.
Eleven pairs of newly commissioned essays face off on opposite sides of fundamental problems in current theories of knowledge. Brings together fresh debates on eleven of the most controversial issues in epistemology. Questions addressed include: Is knowledge contextual? Can skepticism be refuted? Can beliefs be justified through coherence alone? Is justified belief responsible belief? Lively debate format sharply defines the issues, and paves the way for further discussion. Will serve as an accessible introduction to the major topics in contemporary epistemology, whilst also capturing the imagination of professional philosophers.
Simulations are used in very different contexts and for very different purposes. An emerging development is the possibility of using simulations to obtain a more or less representative reproduction of organs or even entire persons. Such simulations are framed and discussed using the term ‘digital twin’. This paper unpacks and scrutinises the current use of such digital twins in medicine and the ideas embedded in this practice. First, the paper maps the different types of digital twins. A special focus is put on the concrete challenges inherent in the interactions between persons and their digital twin. Second, the paper addresses the questions of how far a digital twin can represent a person and what the consequences of this may be. Against the background of these two analytical steps, the paper formulates initial conditions under which digital twins can take on an ethically justifiable form of representation.
Epistemic deontology is the view that the concept of epistemic justification is deontological: a justified belief is, by definition, an epistemically permissible belief. I defend this view against the argument from doxastic involuntarism, according to which our doxastic attitudes are not under our voluntary control, and thus are not proper objects for deontological evaluation. I argue that, in order to assess this argument, we must distinguish between a compatibilist and a libertarian construal of the concept of voluntary control. If we endorse a compatibilist construal, it turns out that we enjoy voluntary control over our doxastic attitudes after all. If, on the other hand, we endorse a libertarian construal, the result is that, for our doxastic attitudes to be suitable objects of deontological evaluation, they need not be under our voluntary control.
In this paper, I argue that the rejection of doxastic voluntarism is not as straightforward as its opponents take it to be. I begin with a critical examination of William Alston's defense of involuntarism and then focus on the question of whether belief is intentional.
Consciousness scientists have not reached consensus on two of the most central questions in their field: first, whether consciousness overflows reportability; second, the physical basis of consciousness. I review the scientific literature of the 19th century to provide evidence that disagreement on these questions has been a feature of the scientific study of consciousness for a long time. Based on this historical review, I hypothesize that a unifying explanation of disagreement on these questions, up to this day, is that scientific theories of consciousness are underdetermined by the evidence, namely, that they can be preserved “come what may” in the face of (seemingly) disconfirming evidence. Consciousness scientists may have to find a way of resolving the persistent underdetermination of theories of consciousness to make further progress.
This volume is dedicated to the life and work of Ernest Nagel, one of the most influential twentieth-century philosophers of science. Although Nagel has been largely forgotten by the history of philosophy of science community in recent years, this volume introduces his philosophy to a new generation of readers and highlights the merits and originality of his works. Best known in the history of philosophy as a major American representative of logical empiricism with some pragmatist and naturalist leanings, Nagel had interests and activities that went beyond these limits. His career was marked by a strong and determined intention of harmonizing the European scientific worldview of logical empiricism with American naturalism/pragmatism. His most famous and systematic treatise, The Structure of Science, appeared just one year before Thomas Kuhn’s even more renowned The Structure of Scientific Revolutions. As a reflection of Nagel’s interdisciplinary work, the contributing authors’ articles are connected both historically and systematically. The volume will appeal to students, mainly at the graduate level, and to academic scholars. Since the volume treats historical, philosophical, physical, social and general scientific questions, it will be of interest to historians and philosophers of science, epistemologists, social scientists, and anyone interested in the history of analytic philosophy and twentieth-century intellectual history.
In this paper, I examine Alston's arguments for doxastic involuntarism. Alston fails to distinguish (i) between volitional and executional lack of control, and (ii) between compatibilist and libertarian control. As a result, he fails to notice that, if one endorses a compatibilist notion of voluntary control, the outcome is a straightforward and compelling case for doxastic voluntarism. Advocates of involuntarism have recently argued that the compatibilist case for doxastic voluntarism can be blocked by pointing out that belief is never intentional. In response to this strategy, I distinguish between two types of intentionality and argue that belief is no less intentional than action is.
Defined narrowly, epistemology is the study of knowledge and justified belief. As the study of knowledge, epistemology is concerned with the following questions: What are the necessary and sufficient conditions of knowledge? What are its sources? What is its structure, and what are its limits? As the study of justified belief, epistemology aims to answer questions such as: How are we to understand the concept of justification? What makes justified beliefs justified? Is justification internal or external to one's own mind? Understood more broadly, epistemology is about issues having to do with the creation and dissemination of knowledge in particular areas of inquiry. This article will provide a systematic overview of the problems that the questions above raise and focus in some depth on issues relating to the structure and the limits of knowledge and justification.
The paper explains in what sense the GRW matter density theory is a primitive ontology theory of quantum mechanics and why, thus conceived, the standard objections against the GRW formalism do not apply to GRWm. We consider the different options for conceiving the quantum state in GRWm and argue that dispositionalism is the most attractive one.
Conditional structures lie at the heart of the sciences, the humanities, and everyday reasoning. It is hence not surprising that conditional logics – logics specifically designed to account for natural language conditionals – are an active and interdisciplinary area of research. The present book gives a formal and a philosophical account of indicative and counterfactual conditionals in terms of Chellas-Segerberg semantics. For that purpose a range of topics is discussed, such as Bennett’s arguments against truth-value-based semantics for indicative conditionals.
Defeaters can prevent a perceptual belief from being justified. For example, when you know that red light is shining at the table before you, you would typically not be justified in believing that the table is red. However, can defeaters also destroy a perceptual experience as a source of justification? If the answer is ‘no’, the red light defeater blocks doxastic justification without destroying propositional justification. You have some-things-considered, but not all-things-considered, justification for believing that the table is red. If the answer is ‘yes’, the red light defeater blocks doxastic justification by destroying propositional justification. You have neither all-things-considered nor some-things-considered justification for believing that the table is red. According to dogmatism, the justificational force of perceptual experiences is indestructible. According to conservatism about sense experience, a perceptual experience ceases to have justificational force if there is evidence against its reliability. Finally, according to meta-evidentialism, a perceptual experience is blocked from being a source of justification if there is no evidence of its reliability. I argue that, of these three theories, meta-evidentialism is the most plausible.
I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy to the case of Jean Perrin’s experimental work on the atomic hypothesis, disputing Stanford’s claim that the problem of unconceived alternatives invalidates a realist interpretation of this historical episode.
1 Stanford’s Argument from Unconceived Alternatives
2 Previous Attempts to Undermine the Problem of Unconceived Alternatives
2.1 The plausibility of unconceived alternatives
2.2 The distinctness of unconceived alternatives
2.3 The induction from past to present
3 Causal Knowledge as a Criterion for the Realist
3.1 How Chakravartty’s proposal differs from earlier causal strategies
3.2 Causal realism and the detection/auxiliary distinction
4 Causal Realism, Unconceived Alternatives, and the Atomic Hypothesis
4.1 Perrin and the philosophers: some initial observations
4.2 Roush and Stanford on Perrin
4.3 From Brownian motion to the reality of atoms
4.4 What we know about atoms
5 Conclusion
Corporate/collective moral responsibility is a thorny topic in business ethics, and this paper argues that this is due to a number of unacknowledged and connected epistemic issues. Firstly, CSR, Corporate Citizenship and many other research streams that are based on the assumption of collective and/or corporate moral responsibility are not compatible with Kantian ethics, consequentialism, or virtue ethics, because corporate/collective responsibility violates the axioms and central hypotheses of these research programmes. Secondly, in the absence of a sound theoretical moral philosophical foundation, business ethicists have based their ideas on legal and political epistemologies, yet still claim to be ethics-based. Thirdly, research is often driven by an intention to prove that a specific social goal is right, not by open and critical inquiry. Finally, today, corporate/collective moral responsibility is widely accepted as the Truth, as most researchers are unaware of any issues because they are untrained in philosophy. The paper identifies the confusion about the epistemic basis as a major impediment to delivering a thick concept of the role of corporations as moral agents. Thus, the paper does not argue against corporate or collective agency as such, but points out an obvious but forgotten paradox: corporate and collective personhood cannot, at the moment at least, be epistemologically grounded in the field in which business ethics claims to operate: moral philosophy.
Consciousness is scientifically challenging to study because of its subjective aspect. This leads researchers to rely on report-based experimental paradigms in order to discover neural correlates of consciousness (NCCs). I argue that the reliance on reports has biased the search for NCCs, thus creating what I call 'methodological artefacts'. This paper has three main goals: first, describe the measurement problem in consciousness science and argue that this problem led to the emergence of methodological artefacts. Second, provide a critical assessment of the NCCs put forward by the global neuronal workspace theory. Third, provide the means of dissociating genuine NCCs from methodological artefacts.
Making good decisions in extremely complex and difficult processes and situations has always been both a key task and a challenge in the clinic, and has led to a large number of clinical, legal and ethical routines, protocols and reflections intended to guarantee fair, participatory and up-to-date pathways for clinical decision-making. Nevertheless, the complexity of processes and physical phenomena, time and economic constraints, and not least further endeavours and achievements in medicine and healthcare continuously raise the need to evaluate and improve clinical decision-making. This article scrutinises whether and how clinical decision-making processes are challenged by the rise of so-called artificial intelligence-driven decision support systems (AI-DSS). In a first step, it analyses how the rise of AI-DSS will affect and transform the modes of interaction between different agents in the clinic. In a second step, we point out how these changing modes of interaction also imply shifts in the conditions of trustworthiness, epistemic challenges regarding transparency, the underlying normative concepts of agency and its embedding into concrete contexts of deployment and, finally, the consequences for ascriptions of responsibility. Third, we draw first conclusions for further steps towards a ‘meaningful human control’ of clinical AI-DSS.
Some proponents of the Integrated Information Theory of consciousness profess strong views on the Neural Correlates of Consciousness, namely that large swathes of the neocortex, the cerebellum, at least some sensory cortices, and the so-called limbic system are all not essential for any form of conscious experiences. We argue that this connection is not incidental. Conflation between strong and weak versions of the theory has led these researchers to adopt definitions of NCC that are inconsistent with their own previous definitions, inadvertently betraying the promises of an otherwise fruitful empirical endeavour.
This paper makes three points: First, empiricism as a stance is problematic unless criteria for evaluating the stance are provided. Second, Van Fraassen conceives of the empiricist stance as receiving its content, at least in part, from the rejection of metaphysics. But the rejection of metaphysics seems to presuppose for its justification the very empiricist doctrine Van Fraassen intends to replace with the empiricist stance. Third, while I agree with Van Fraassen’s endorsement of voluntarism, I raise doubts about the possibility of defending voluntarism without engaging in the kind of metaphysics Van Fraassen rejects.
This volume gathers eleven new and three previously unpublished essays that take on questions of epistemic justification, responsibility, and virtue. It contains the best recent work in this area by major figures such as Ernest Sosa, Robert Audi, Alvin Goldman, and Susan Haack.
In this study, Matthias Neuber investigates the relationship between the concept of realism and the logical empiricism of the Vienna Circle, one of the dominant currents of early twentieth-century German-language theoretical philosophy. This question has so far been treated only marginally in research on the history of philosophy. That is all the more surprising given that the more recent realism debate in the philosophy of science was itself decisively shaped by logical empiricism. The author goes a step further, however: in this volume he defends the thesis that within logical empiricism itself there were currents compatible with scientific realism. He thereby takes a position opposed to the mainstream interpretation of the twentieth-century realism debate in the philosophy of science, which understands scientific realism as a counter-programme to logical empiricism. With this historical study, Neuber delivers nothing less than a reassessment of the relationship between realism and logical empiricism. A work aimed in particular at researchers, but also at advanced students, in the field of the history of the philosophy of science.
When I take a sip from the coffee in my cup, I can taste that it is sweet. When I hold the cup with my hands, I can feel that it is hot. Why does the experience of feeling that the cup is hot give me justification for believing that the cup is hot? And why does the experience of tasting that the coffee is sweet give me justification for believing that the coffee is sweet? In general terms: Why is it that a sense experience that P is a source of justification—a reason—for believing that P? Call this the Question. I will discuss various answers to the Question, and defend the one I myself favor.
We investigate interpolation properties of many-valued propositional logics related to continuous t-norms. In case of failure of interpolation, we characterize the minimal interpolating extensions of the languages. For finite-valued logics, we count the number of interpolating extensions by Fibonacci sequences.
This paper seeks to address research governance by highlighting the notion of public accountability as a complementary tool for the establishment of an ethical resonance space for emerging technologies. Public accountability can render the development and design processes of emerging technologies transparent through practices of holding those in charge of research accountable for their actions, thereby fostering ethical engagement with their potential negative consequences or side-effects. Through practices such as parliamentary questions, audits, and open letters, emerging technologies could be effectively rendered transparent and opened up to broader levels of scrutiny and debate, thereby contributing to a greater adherence of emerging technologies to ethics and moral consensus. Fundamental democratic practices could thus not only lead to better informed choices in design and development processes, but also contribute to more morally substantive outcomes.
This paper argues that ceteris paribus (cp) laws exist based on a Lewisian best system analysis of lawhood (BSA). Furthermore, it shows that a BSA faces a second trivialization problem besides the one identified by Lewis. The first point concerns an argument against cp laws by Earman and Roberts. The second point aims to help make some assumptions of the BSA explicit. To address the second trivialization problem, a restriction in terms of natural logical constants is proposed that allows one to describe regularities, as specified by basic generics (e.g. ‘birds can fly’) and universals (e.g. ‘all birds can fly’). It is argued that cp laws rather than strict laws might be part of the best system of such a regularity-based BSA, since sets of cp laws can be both (a) simpler and (b) stronger when reconstructed as generic non-material conditionals. Yet, if sets of cp laws might be part of the best system of a BSA and thus qualify as proper laws of nature, it seems reasonable to conclude that at least some cp laws qualify as proper laws of nature.