In this paper, we employ Extended Cognition as a background for a series of thought experiments about privacy and commonly used information technology devices. Laptops and smart phones are now widely used devices, but current privacy standards do not adequately address the relationship between the owners of these devices and the information stored on them. Law enforcement treats laptops and smart phones as potential sources of information about criminal activity, but this treatment ignores the use of smart devices as extensions of users’ cognitive capability. In Philosophy of Mind, Extended Cognition is a metaphysical theory about the relationship between consciousness or cognitive activity and the various external tools or aids that agents employ in the service of cognition. Supporters of Extended Cognition argue that mental activity must be understood as taking place both within the brain and by way of tools such as a logician’s pen and paper, a mathematician’s calculator, or a writer’s word processing program. While Extended Cognition does not have universal support among philosophers of mind, the theory nevertheless describes how agents interact with their “smart devices.” We explore the implications of taking Extended Cognition seriously with regard to privacy concerns by way of a series of thought experiments. By comparing the expectations of privacy between a citizen and the government, between an employee and a corporate firm, and between citizens themselves, we show that expectations of privacy and injury are significantly affected by taking the cognitive role of smart devices into account.
Sophie Roux confronts the critique of “Sokalism” found in Yves Jeanneret’s La Querelle des imposteurs with the way in which Impostures intellectuelles draws the dividing line between “literary” and “scientific” scholars.
The article “D’une Affaire aux autres” by Josquin Debaz and Sophie Roux shows how difficult it is to delimit what is called “the Sokal Affair” and analyzes, through a survey of press articles that is as systematic as possible, the difference between the American affair and the French affair.
Goffi and Roux are interested in what makes some thought experiments work while others fail. They do not attempt to draw an a priori line between two types of thought experiments, but rather ask the following question: inasmuch as thought experiments are arguments, and notwithstanding the fact that some of them might involve the contemplation of an imaginary scenario, how is it that some of them work while others do not? Taking inspiration from a counterfactual thought experiment presented by Nicholas Rescher, they treat thought experiments as argumentative procedures resembling tests of consistency, which invite the experimenter to seek the weakest link in her body of beliefs. Equipped with this method, they examine two well-known successful thought experiments (Galileo’s two bodies strapped together, and Thomson’s violinist) and discuss Mach’s notion of thought experiments. They thus reach the hypothesis that successful thought experiments respect the following three conditions: they deal not with things but with beliefs; they mobilise a set of beliefs shared by the interlocutors; and this set of beliefs has a hierarchical structure. Using once again examples written at different periods and taken from various disciplines (Descartes’ receding bodies, Aristotle’s weaving shuttles), Goffi and Roux argue that each of these conditions is individually necessary for a thought experiment to work. They conclude with the limits and consequences of their approach.
Roux begins by exploring the texts in which the origins of the scientific notion of thought experiments are usually said to be found. Her general claim is simple: the emergence of the notion of thought experiments relies on a succession of misunderstandings and omissions. She then examines, in a more systematic perspective, the three characteristics of the broad category of thought experiments nowadays in circulation: thought experiments are counterfactual, they involve a concrete scenario and they have a well-delimited cognitive intention. Her aim in exploring these characteristics is twofold. Firstly, it is to show that each of these characteristics, considered individually, may be taken in a more or less strict sense, and consequently to explain the proliferation of thought experiments. Secondly, it is to suggest that the recent debates on thought experiments might have arisen because these three characteristics are not easily conciliated when they are considered together. Finally, in a third and last section, the nine essays of the introduced book are presented.
In this article I present two arguments from Brian Hebblethwaite for the conclusion that multiple incarnations are impossible, as well as the analyses of those arguments provided by three other thinkers: Oliver Crisp, Peter Kevern, and Robin Le Poidevin. I argue that both of Hebblethwaite's arguments are unsound.
“Prizeworthy Research?” Wilhelm Roux and His Program of Developmental Mechanics. The Nobel Prize in physiology or medicine is awarded annually to a maximum of three laureates. Not surprisingly, the number of nominees is much larger. Drawing on Nobel Prize nominations in the Nobel archives in Sweden, the core of this paper deals with the nomination letters for the physiologist Wilhelm Roux to discuss competition and some controversies among German physiologists around 1900 in this particular context. The paper elucidates the arguments brought forward to portray Roux as a scientist who had conferred “the greatest benefit to mankind” in the field of physiology or medicine; examines some other runners-up; and reconstructs why Roux as well as some of his peers were not awarded the Nobel Prize. On a more general level, we argue that an analysis of Nobel Prize nominations contributes to a broader history of excellence in science and medicine in the twentieth century.
Wilhelm Roux was a follower of mechanical philosophy in biological inquiry. His research strategy, focusing exclusively on physical and chemical laws, should be considered strictly scientific. Roux’s reductionist view of organisms as closed complexes, and of the physiological activities occurring within them, appears in a completely new light in modern biology, where the experimental method was introduced. Despite its faulty methodological assumptions and failed experimental attempts, Roux’s research programme was an inspiration for the vitalist Hans Driesch.
Essays on Wittgensteinian Themes Dedicated to Brian McGuinness Joachim Schulte, Göran Sundholm. PREFACE For thirty-five years the international community of philosophers have known Brian McGuinness as a major authority on the ...
Human beings are peculiar. In laboratory experiments, they often cooperate in one-shot prisoners’ dilemmas, they frequently offer 1/2 and reject low offers in the ultimatum game, and they often bid 1/2 in the game of divide-the-cake. All these behaviors are puzzling from the point of view of game theory. The first two are irrational, if utility is measured in a certain way. The last isn’t positively irrational, but it is no more rational than other possible actions, since there are infinitely many other Nash equilibria besides the one in which both players bid 1/2. At the same time, these behaviors seem to indicate that people are sometimes inclined to be cooperative, fair, and just. In his stimulating new book, Brian Skyrms sets himself the task of showing why these inclinations evolved, or how they might have evolved, under the pressure of natural selection. The goal is not to justify our ethical intuitions, but to explain why we have them.
Brian Z. Tamanaha has written extensively on realism in jurisprudence, but in his Realistic Theory of Law (2018), he uses "realism" in a commonplace way to ground a rough outline of legal history. While he refers to his method as genealogical, he does not acknowledge the complex tensions in the development of the philosophical use of that term from Nietzsche to Foucault, and the complex epistemological issues that separate them. While the book makes many interesting points, the methodological concerns outweigh them in the overall assessment of the value of the work.
As the author of Justice as Impartiality, I am not ashamed to admit that I was delighted by the liveliness of the discussion generated by it at the meeting on which this symposium is based. I am likewise grateful to the six authors for finding the book worthy of the careful attention that they have bestowed on it. Between them, the symposiasts take up many more points than I can cover in this response. I shall therefore focus on some themes that cluster round the contractual device that I associate with the notion of justice as impartiality. Is it necessary? If it is not necessary is it nevertheless useful? Within an overall contractual framework is the form of contract that I propose uniquely justifiable? And does the form of contract that I defend generate the implications that I claim for it?
In teaching jurisprudence, I typically distinguish between two different families of theories of adjudication—theories of how judges do or should decide cases. “Formalist” theories claim that the law is “rationally” determinate, that is, the class of legitimate legal reasons available for a judge to offer in support of his or her decision justifies one and only one outcome either in all cases or in some significant and contested range of cases; and adjudication is thus “autonomous” from other kinds of reasoning, that is, the judge can reach the required decision without recourse to nonlegal normative considerations of morality or political philosophy. I also note that “formalism” is sometimes associated with the idea that judicial decision-making involves nothing more than mechanical deduction on the model of the syllogism—Beccaria, for example, expresses such a view. I call the latter “Vulgar Formalism” to emphasize that it is not a view to which anyone today cares to subscribe.
William Hasker replies to my arguments against Social Trinitarianism, offers some criticism of my own view, and begins a sketch of another account of the Trinity. I reply with some defence of my own theory and some questions about his.
This paper explores one of the main sources of Nietzsche’s knowledge of physiology and considers its relevance for the philosophical study of history. Beginning in 1881, Nietzsche read Der Kampf der Theile im Organismus by Wilhelm Roux, which exposed him to a dysteleological account of organic development emphasising the excitative, assimilative and auto-regulative processes of the body. These processes mediate the effects of natural selection. His reading contributed to a physiological understanding of history that borrowed Roux’s description of physiological processes. This physiological description of history proceeded from the similarity between the body’s mediation of its milieu and history’s mediation of the past.
In "The Semantics of Singular Terms," Brian Loar described and criticized a "causal" theory of reference and offered a new "description" theory. It is argued that the particular causal theory described is not to be found in the papers by Donnellan and Kripke cited as evidence for it, and is a straw man. Further, prima facie, Loar's new description theory fails to meet Kripke's noncircularity condition. Should Loar attempt to meet it, his theory is likely to run foul of Kripke's usual "arguments from ignorance and error" against description theories.
In his recent article, ‘A Gift to Theology? Jean-Luc Marion's ‘Saturated Phenomena’ in Christological Perspective’, Brian Robinette has critiqued Marion's phenomenology for confining theology to a one-sided approach to Christology, one that stresses only the passive, mystical reception of Christ. To correct this imbalance, Robinette brings Marion into dialogue with those more active Christologies or ‘prophetical-ethical’ liberation theologies of Gustavo Gutierrez, Johann Baptist Metz and others that stress a life-praxis focused on confronting evil and suffering. In this essay I argue that Robinette has not fully developed the ‘logic’ of Marion's phenomenology of the ‘call and the gifted’, in which both a passive and an active element are operative. I explore more fully that very dynamic phenomenological process of the call-and-the-gifted as developed in Marion's work Being Given: Toward a Phenomenology of Givenness. Once viewed in Christological perspective, and especially in light of Christ's death and resurrection, Marion's phenomenology entails an ethical trope consistent with the mission of Christ as rendered in Scriptural revelation, and thus the gap between Marion's work and the prophetical-ethical theologies of Gutierrez and Baptist Metz becomes narrowed.
In ‘The ethics of belief and Christian faith as commitment to assumptions’, Rik Peels attacks the views that I advanced in ‘Christianity and the ethics of belief’. Here, I rebut his criticisms of the claim that it is wrong to believe without sufficient evidence, of the contention that Christians are committed to that claim, and of the notion that faith is not belief but commitment to assumptions in the hope of salvation. My original conclusions still stand.
In The Ant Trap, Brian Epstein proposes a bold new systematic strategy for developing social ontology. He explores the history and current state of the art and provides pointed critiques of leading theories in the field. His framework, encompassing frames that provide principles for grounding social facts, is developed in some detail across a variety of social practices and applied to revealing real-world as well as hypothetical examples. If Epstein's account holds, it should provide new directions and standards of inquiry in both social science and social philosophy.
Brian Trainor argues that the current hostility of political theorists towards the idea of the common good is in part due to the influence of Isaiah Berlin's concept of `value pluralism', or the incommensurability of basic human values. I agree with Trainor's opposition to the `agonistic' interpretation of pluralism, associated with thinkers like Chantal Mouffe. However, it is not the case that the only alternative to the pluralism-agonism thesis is the monist defence of a thick common good advocated by Trainor. Between these extremes there is a middle way that accepts the deep plurality of values in Berlin's sense, but recognizes a case for a thin conception of the common good, that is, a liberal political framework.
Brian Loar argues that we can account for the conceptual independence of coextensive terms purely psychologically, by appealing to conceptual rather than semantic differences between concepts, and that this leaves room for assuming that phenomenal and physical concepts can be coextensive on a posteriori grounds despite the fact that both sorts of concepts refer directly. I argue that Loar does not remove the mystery of the coextensiveness of those concepts because he does not offer any explanation of why they should be coextensive. Secondly, I argue that even if we grant that phenomenal and physical concepts can be coextensive on a posteriori grounds, we are committed to holding that there are two different and essential modes of presentation of phenomenal properties, the physical and the phenomenal, and that this precludes us from seeing phenomenal properties as essentially physical in an unrelativized sense.
I am grateful to Alan Madry and Joel Richeimer for their intelligent and stimulating critique of my article “Heidegger and the Theory of Adjudication.” It is the most interesting commentary I have seen on the paper, and I have learned much from it. It may facilitate discussion, and advance debate, to state with some clarity where exactly we agree and disagree. I leave to the footnotes discussion of certain minor points where Madry and Richeimer are guilty of some critical overreaching.
Brian Leiter and Peter Kail have delivered thoughtful critiques of my book, Nietzsche’s Naturalism: Philosophy and the Life Sciences in the Nineteenth Century. It is a great pleasure to respond to these critiques, since they raise some crucial issues with regard to Nietzsche’s understanding of naturalism and normativity. On the one hand, there are many areas of agreement: Nietzsche’s philosophical project is best understood along the lines of naturalism; developments in the nineteenth-century life sciences, broadly speaking, play a crucial role in the formation of Nietzsche’s naturalism; and Nietzsche’s relationship to both Darwin and Darwin’s neo-Kantian interpreters is more complex than generally assumed. On the...
Espen Hammer’s exceptionally fine book explores modern temporality, its problems and prospects. Hammer claims that how people experience time is a cultural/historical phenomenon, and that there is a peculiarly modern way of experiencing time as a series of present moments each indefinitely leading to the next in an ordered way. Time as measured by the clock is the paradigmatic instance of this sense of time. In this perspective time is quantifiable and forward-looking, and the present is dominated by the future. Hammer argues that this manner of experiencing time provides a way of living that brings with it not only the basis for great successes in technology, but also great costs—specifically, what he calls the problems of transience and of meaning. Hammer goes about his task by considering the ways some of the great modern philosophers have characterized present-day temporality and have responded to the problems he has identified. Specifically, he considers what Kant, Hegel, Schopenhauer, Nietzsche, Heidegger, Habermas, Bloch, and Adorno provide in response to our peculiarly modern predicaments. The book is remarkable for its clarity and perceptiveness, but in the process it simplifies the matters at hand in crucial places, fails to push its insights as far as it ought, and in the end promises more than it can deliver. In this it betrays a rationalist confidence in the power of reason that founders on what in many ways remains a mystery.
Brian Barry's Culture and Equality is probably the most powerful liberal egalitarian critique of multiculturalism addressing the pathologies of recognizing difference of ethnicity, religion, race, and culture. In this essay, I examine Barry's approach to the law, which underpins his theory of egalitarianism, to determine whether it is enough — as Barry thinks it is — to insist on either applying the same law for everyone so that exemptions are foreclosed in general, or repealing the law since the case for its existence is not justified. I find that Barry's effort is inadequate. Because the conditions for exemptions are not specified, exemptions are merely defensible, not just. Using the headscarf controversy in France to illustrate why Barry's approach backfires, I argue that enforcing the same law for all undermines the very politics of redistribution that Barry champions.
I take as my text proposition 4.0312 of the Tractatus: The possibility of propositions is based on the principle that objects have signs as their representatives. My fundamental idea is that the ‘logical constants’ are not representatives; that there can be no representatives of the logic of facts. Practically the same words occur in Wittgenstein's Notebook for 25 December 1914, where Miss Anscombe translates them: The possibility of the proposition is, of course, founded on the principle of signs as going proxy for objects. Thus in the proposition something has something else as its proxy. But there is also the common cement. My fundamental thought is that the logical constants are not proxies. That the logic of the fact cannot have anything as its proxy.
In the preface to his book God the Problem, Gordon Kaufman writes ‘Although the notion of God as agent seems presupposed by most contemporary theologians … Austin Farrer has been almost alone in trying to specify carefully and consistently just what this might be understood to mean.’.
In this interview Prof. Brian Leftow answers questions concerning the causes of the emergence of Analytic Philosophical Theology within the analytic tradition; the advantages of maintaining the traditional picture of perfect being theology with regard to divine attributes; his conception of the origin of necessary truths; the problem of evil; and the importance for universities of investing in research on philosophical theology.
The Philosophical Challenge from China, edited by Brian Bruya, undoubtedly occupies an important place in the discourse about what practices and authorities are relevant to Philosophy as an academic discipline. Its confident reorientation of philosophical relevance in the context of Anglophone academics will hopefully speak meaningfully to any remaining skeptics of the usefulness of Chinese philosophy. The intended audience of this effort, however, is shrinking, or, more accurately, those willing to be convinced are increasingly few, and what remains is simply and haplessly the staunch traditionalists of the so-called Western paradigm. This evokes the thought that anthologies that strive to show relevance...