_The Power of Religion in the Public Sphere_, co-edited by Eduardo Mendieta and Jonathan VanAntwerpen, represents a rare opportunity to experience a diverse group of preeminent philosophers confronting one pervasive contemporary concern: what role does, or should, religion play in our public lives? Reflecting on her recent work concerning state violence in Israel-Palestine, Judith Butler explores the potential of religious perspectives for renewing cultural and political criticism, while Jürgen Habermas, best known for his seminal conception of the public sphere, thinks through the ambiguous legacy of the concept of "the political" in contemporary theory. Charles Taylor argues for a radical redefinition of secularism, and Cornel West defends civil disobedience and emancipatory theology. Eduardo Mendieta and Jonathan VanAntwerpen detail the immense contribution of these philosophers to contemporary social and political theory, and an afterword by Craig Calhoun places these attempts to reconceive the significance of both religion and the secular in the context of contemporary national and international politics.
_Who Comes After the Subject?_ offers the most comprehensive overview to date of contemporary French thinking on the question of the "subject." Nineteen philosophers and critics offer diverse perspectives on the subject as it has manifested itself in our modern discourses: the subject of philosophy, of the State, of history, of psychoanalysis. Each contribution asks "What has become of the subject?" or "What has the subject become?" in the wake of its critiques and deconstructions.
Rejecting the Cut rule has been proposed as a strategy to avoid both the usual semantic paradoxes and the so-called v-Curry paradox. In this paper we consider whether a Cut-free theory is capable of accurately representing its own notion of validity. We claim that the standard rules governing the validity predicate are too weak for this purpose, and we show that although it is possible to strengthen these rules, the most obvious way of doing so brings with it a serious problem: an internalized version of Cut can be proved for a Curry-like sentence. We also evaluate a number of possible ways of escaping this difficulty.
This article develops a constructivist, non-metaphysical, non-essentialist conception of human dignity, drawing on Jeremy Waldron, Michael Rosen, Ernst Bloch, Jürgen Habermas and Axel Honneth. This constructivist conception of dignity is then related to the communicative or reflexive conception of freedom developed by discourse ethics. Finally, these two conceptions are shown to be foundational for the development and implementation of human rights.
We propose a new class of multiple contraction operations — the system of spheres-based multiple contractions — which are a generalization of Grove’s system of spheres-based (singleton) contractions to the case of contractions by (possibly non-singleton) sets of sentences. Furthermore, we show that this new class of functions is a subclass of the class of the partial meet multiple contractions.
An opinionated state-of-the-art paper on scientific explanation, offering analysis and discussion of the most relevant models and theories in the contemporary literature: the deductive-nomological model, the inductive-statistical and statistical-relevance models, the pragmatic theory of why-questions, the unifying theory of standard arguments, and the causal/non-causal counterfactual theory.
Many authors have considered the notions of paraconsistency and dialetheism to be intrinsically connected, in many cases to the extent of conflating the two phenomena. However, paraconsistency is a formal feature of some logics that consists in invalidating the rule of explosion, whereas dialetheism is a semantic/ontological position consisting in accepting true contradictions. In this paper, we argue against this connection and show that it is perfectly possible to adopt a paraconsistent logic and reject dialetheism, and, moreover, that there are examples of non-paraconsistent logics that can be interpreted in a dialetheic way.
In this article, we will present a number of technical results concerning Classical Logic, ST and related systems. Our main contribution consists in offering a novel identity criterion for logics in general and, therefore, for Classical Logic. In particular, we will first generalize the ST phenomenon, thereby obtaining a recursively defined hierarchy of strict-tolerant systems. Second, we will prove that the logics in this hierarchy are progressively more classical, although not entirely classical. We will claim that a logic is to be identified with an infinite sequence of consequence relations holding between increasingly complex relata: formulae, inferences, metainferences, and so on. As a result, the present proposal allows us to differentiate Classical Logic not only from ST but also from other systems sharing with it their valid metainferences. Finally, we show how these results have interesting consequences for some topics in the philosophical logic literature, among them the debate around Logical Pluralism, since the discussion concerning this topic is usually carried out employing a rivalry criterion for logics that will need to be modified in light of the present investigation, according to which two logics can be non-identical even if they share the same valid inferences.
When discussing Logical Pluralism, several critics argue that such an open-minded position is untenable. The key to this conclusion is that, given a number of widely accepted assumptions, the pluralist view collapses into Logical Monism. In this paper we show that the arguments usually employed to arrive at this conclusion do not work. The main reason for this is the existence of certain substructural logics which have the same set of valid inferences as Classical Logic, although they are, in a clear sense, non-identical to it. We argue that this phenomenon can be generalized, given the existence of logics which coincide with Classical Logic regarding a number of metainferential levels, although they are, again, clearly different systems. We claim this highlights the need to arrive at a more refined version of the Collapse Argument, which we discuss at the end of the paper.
From a scientific standpoint, the world is more prepared than ever to respond to infectious disease outbreaks; paradoxically, globalization and air travel, antimicrobial resistance, the threat of bioterrorism, and newly emerging pathogens driven by ecological, socioeconomic, and environmental factors have increased the risk of global epidemics.1,2,3 Following the 2002–2003 severe acute respiratory syndrome epidemic, efforts to build global emergency response capabilities to contain infectious disease outbreaks were put in place.4,5,6 But the recent H1N1, Ebola, and Zika global epidemics have shown unnecessary delays and insufficient coordination in response efforts.7,8,9,10 In a thoughtful and compelling essay,11 Thana C. de Campos argues that greater clarity in the definition of pandemics would probably result in more timely and effective emergency responses, and in pandemic preparedness. In her view, a central problem is that the definition of pandemics is based solely on disease transmission across several countries, and not on spread and severity together, which conflates two very different situations: emergency and nonemergency disease outbreaks. A greater emphasis on severity, such that pandemics are defined as severe and rapidly spreading infectious disease outbreaks, would make them "true global health emergencies," allowing for priority resource allocation and effective collective action in emergency response efforts. Sympathetic to the position taken by de Campos, here I highlight some of the challenges in the definition of severity during an infectious disease outbreak.
In some recent articles, Cobreros, Egré, Ripley, & van Rooij have defended the idea that abandoning transitivity may lead to a solution to the trouble caused by semantic paradoxes. For that purpose, they develop the Strict-Tolerant approach, which leads them to entertain a nontransitive theory of truth, where the structural rule of Cut is not generally valid. However, that Cut fails in general in the target theory of truth does not mean that there are not certain safe instances of Cut involving semantic notions. In this article we intend to meet the challenge of answering how to regain all the safe instances of Cut, in the language of the theory, making essential use of a unary recovery operator. To fulfill this goal, we will work within the so-called Goodship Project, which suggests that in order to have nontrivial naïve theories it is sufficient to formulate the corresponding self-referential sentences with suitable biconditionals. Nevertheless, a secondary aim of this article is to propose a novel way to carry this project out, showing that the biconditionals in question can be totally classical. In the context of this article, these biconditionals will be essentially used in expressing the self-referential sentences and, thus, as a collateral result of our work we will prove that none of the recoveries expected of the target theory can be nontrivially achieved if self-reference is expressed through identities.
"The essays in this book make it elegantly clear that there is a vigorous and rigorous Latin American philosophy... and that others dismiss it at their peril." —Mario Sáenz The ten essays in this lively anthology move beyond a purely historical consideration of Latin American philosophy to cover recent developments in political and social philosophy as well as innovations in the reception of key philosophical figures from the European Continental tradition. Topics such as indigenous philosophy, multiculturalism, the philosophy of race, democracy, postmodernity, the role of women, and the position of Latin America and Latin Americans in a global age are explored by notable philosophers from the region. An introduction by Eduardo Mendieta examines recent trends and points to the social, political, economic, and cultural conditions that have inspired the discipline. Latin American Philosophy brings English-speaking readers up to date with recent scholarship and points to promising new directions.
Adding a transparent truth predicate to a language completely governed by classical logic is not possible. The trouble, as is well-known, comes from paradoxes such as the Liar and Curry. Recently, Cobreros, Egré, Ripley and van Rooij have put forward an approach based on a non-transitive notion of consequence which is suitable to deal with semantic paradoxes while having a transparent truth predicate together with classical logic. Nevertheless, there are some interesting issues concerning the set of metainferences validated by this logic. In this paper, we show that this logic, once it is adequately understood, is weaker than classical logic. Moreover, the logic is in a way similar to the paraconsistent logic LP.
This paper defends the Quine-Putnam mathematical indispensability argument against two objections raised by Penelope Maddy. The objections concern scientific practices regarding the development of the atomic theory and the role of applied mathematics in the continuum and infinity. I present two alternative accounts of the atomic theory, by Stephen Brush and Alan Chalmers, and argue that these two accounts are consistent with Quine's theory of scientific confirmation. I advance some novel versions of the indispensability argument and argue that these new versions accommodate Maddy's history of the atomic theory. Counter-examples are provided regarding the role of the mathematical continuum and mathematical infinity in science.
In this paper, we present a non-trivial and expressively complete paraconsistent naïve theory of truth, as a step on the route towards semantic closure. We achieve this goal by expressing self-reference through a weak procedure that uses equivalences between expressions of the language, as opposed to a strong procedure that uses identities. Finally, we make some remarks regarding the sense in which the theory of truth discussed has a property closely related to functional completeness, and we present a sound and complete three-sided sequent calculus for this expressively rich theory.
In 1997, I introduced the concept and the phrase "bio art", originally in relation to my artwork "Time Capsule". This work approached the problem of wet interfaces and human hosting of digital memory through the implantation of a microchip. The work consisted of a microchip implant, seven sepia-toned photographs, a live television broadcast, a webcast, interactive telerobotic webscanning of the implant, a remote database intervention, and additional display elements, including an X-ray of the implant. While "bio art" is applicable to a large gamut of in vivo works that employ biological media, made by myself and others, in 1998 I started to employ the more focused term "transgenic art" (republished in Ars Electronica '99—Life Science, ed. Gerfried Stocker and Christine Schopf, 289–296) to describe a new art form based on the use of genetic engineering to create unique living beings. Art that manipulates or creates life must be pursued with great care, with acknowledgment of the complex issues it raises and, above all, with a commitment to respect, nurture, and love the life created. I have been creating and exhibiting a series of transgenic artworks since 1999. I have also been creating bio art that is not transgenic. The implications of this ongoing body of work have particular esthetic and social ramifications, crossing several disciplines and providing material for further reflection and dialog. What follows is an overview of these works, the issues they evoke, and the debates they have elicited.
Among the six postulates for theory contraction formulated and studied by Alchourrón, Gärdenfors and Makinson, the postulate of Recovery is the one that has provoked the most controversy. In this article we construct withdrawal functions that do not satisfy Recovery but try to preserve minimal change, and we relate these withdrawal functions to the AGM contraction functions.
In this note we shall argue that Milne's new effort does not refute Truthmaker Maximalism. According to Truthmaker Maximalism, every truth has a truthmaker. Milne has attempted to refute it using the following self-referential sentence M: This sentence has no truthmaker. Essential to his refutation is that M is like the Gödel sentence and unlike the Liar, and one way in which Milne supports this assimilation is through the claim that his proof is essentially object-level and not semantic. In Section 2, we shall argue that Milne is still begging the question against Truthmaker Maximalism. In Section 3, we shall argue that even assimilating M to the Liar does not force the truthmaker maximalist to maintain the 'dull option' that M does not express a proposition. There are other options open and, though they imply revising the logic in Milne's reasoning, this is not one of the possible revisions he considers. In Section 4, we shall suggest that Milne's proof requires an implicit appeal to semantic principles and notions. In Section 5, we shall point out that there are two important dissimilarities between M and the Gödel sentence. Section 6 is a brief summary and conclusion.
The 1985 paper by Carlos Alchourrón, Peter Gärdenfors, and David Makinson, "On the Logic of Theory Change: Partial Meet Contraction and Revision Functions", was the starting-point of a large and rapidly growing literature that employs formal models in the investigation of changes in belief states and databases. In this review, the first twenty-five years of this development are summarized. The topics covered include equivalent characterizations of AGM operations, extended representations of the belief states, change operators not included in the original framework, iterated change, applications of the model, its connections with other formal frameworks, computability of AGM operations, and criticism of the model.
Transhumanist thought on overpopulation usually invokes the welfare of present human beings and control over future generations, thus minimizing the need for and meaning of new births. Here we devise a framework for a more thorough screening of the relevant literature, to have a better appreciation of the issue of natality. We follow the lead of Hannah Arendt and Brent Waters in this respect. With three overlapping categories of words, headed by "natality," "birth," and "intergenerations," a large sample of books on transhumanism is scrutinized, showing the lack of sustained reflection on the issue. After this preliminary scrutiny, a possible defense of natality in the face of modern and transhumanist thought is marshaled, evoking a number of desirable human traits. One specific issue, the impact of modern values on natality, is further explored, reiterating that concerns about overpopulation and enhanced humans should keep in sight the natural cycle of birth and death.
To the surprise of many readers, Jürgen Habermas has recently made religion a major theme of his work. Emphasizing both religion's prominence in the contemporary public sphere and its potential contributions to critical thought, Habermas's engagement with religion has been controversial and exciting, putting much of his own work in fresh perspective and engaging key themes in philosophy, politics and social theory. Habermas argues that the once widely accepted hypothesis of progressive secularization fails to account for the multiple trajectories of modernization in the contemporary world. He calls attention to the contemporary significance of "postmetaphysical" thought and "postsecular" consciousness - even in Western societies that have embraced a rationalistic understanding of public reason. Edited by Craig Calhoun, Eduardo Mendieta, and Jonathan VanAntwerpen, _Habermas and Religion_ presents a series of original and sustained engagements with Habermas's writing on religion in the public sphere, featuring new work and critical reflections from leading philosophers, social and political theorists, and anthropologists. Contributors to the volume respond both to Habermas's ambitious and well-developed philosophical project and to his most recent work on religion. The book closes with an extended response from Habermas - itself a major statement from one of today's most important thinkers.
This article presents a comparison of Gotthold E. Lessing's theory of signs, as found in his Laocoön: An Essay on the Limits of Painting and Poetry, and Theodor W. Adorno's two essays on the relationship between music and painting. Our aim is to point out the decisive influence of German classical aesthetics on Adorno's post-war aesthetics. Specifically, we discuss how Lessing's rationalist theory functions as a framework for Adorno's dialectical assessment of the formal specificity of artistic media and their possibilities of convergence in the context of the 1960s avant-garde. In this context, we discuss the main implications of Adorno's famous lecture of 1966, Art and the Arts, which concerned the process of media convergence that intensified during the 1960s, as well as the concepts of "overlapping" between artistic media and of "pseudomorphosis".
I propose a deductive-nomological model for mathematical scientific explanation. In this regard, I modify Hempel's deductive-nomological model and test it against some recent paradigmatic examples of the mathematical explanation of empirical facts: the seven bridges of Königsberg, the North American synchronized cicadas, and Hénon-Heiles Hamiltonian systems. I argue that mathematical scientific explanations that invoke laws of nature are qualitative explanations, and ordinary scientific explanations that employ mathematics are quantitative explanations. I analyse the repercussions of this deductive-nomological model on causal explanations.
The aim of this paper is to show that it is not a good idea to have a theory of truth that is consistent but ω-inconsistent. In order to bring out this point, it is useful to consider a particular case: Yablo's Paradox. In theories of truth without standard models, the introduction of the truth-predicate to a first-order theory does not maintain the standard ontology. First, I exhibit some conceptual problems that follow from so introducing it. Second, I show that in second-order theories with standard semantics the same procedure yields a theory that does not have models. So, while having an ω-inconsistent theory is a bad thing, having an unsatisfiable theory of truth is actually worse. This casts doubt on whether the predicate in question is, after all, a truth-predicate for that language. Finally, I present some alternative ways of proving an inconsistency by adding plausible principles to certain theories of truth.
A theory of magnitudes involves criteria for their equivalence, comparison and addition. In this article we examine these aspects from an abstract viewpoint, by focusing on the so-called De Zolt's postulate in the theory of equivalence of plane polygons. We formulate an abstract version of this postulate and derive it from some selected principles for magnitudes. We also formulate and derive an abstract version of Euclid's Common Notion 5, and analyze its logical relation to the former proposition. These results prove to be relevant for the clarification of some key conceptual aspects of Hilbert's proof of De Zolt's postulate, in his classical Foundations of Geometry. Furthermore, our abstract treatment of this central proposition provides interesting insights for the development of a well-behaved theory of compatible magnitudes.
In different papers, Carnielli, W. & Rodrigues, A.; Carnielli, W., Coniglio, M. & Rodrigues, A.; and Rodrigues & Carnielli present two logics motivated by the idea of capturing contradictions as conflicting evidence. The first logic is called BLE and the second, a conservative extension of BLE, is named LETJ. Roughly, BLE and LETJ are two non-classical logics in which the Laws of Explosion and Excluded Middle are not admissible. LETJ is built on top of BLE. Moreover, LETJ is a Logic of Formal Inconsistency. This means that there is an operator that, roughly speaking, identifies a formula as having classical behavior. Both systems are motivated by the idea that there are different conditions for accepting or rejecting a sentence of our natural language. So, there are some special introduction and elimination rules in the theory that capture different conditions of use. Rodrigues & Carnielli's paper contains an interesting and challenging idea. According to them, BLE and LETJ are incompatible with dialetheia. It seems to show that these paraconsistent logics cannot be interpreted using truth-conditions that allow true contradictions. In short, BLE and LETJ talk about conflicting evidence while avoiding talk of gluts. I am going to argue against this point of view. Basically, I will first offer a new interpretation of BLE and LETJ that is compatible with dialetheia. The background of my position is to reject the one-canonical-interpretation thesis: the idea according to which a logical system has one standard interpretation. I will then show that there is no logical basis for fixing Rodrigues & Carnielli's interpretation as the canonical way to establish the content of the logical notions of BLE and LETJ. Furthermore, the system LETJ can be captured inside classical logic. I am also going to use this technical result to offer some further doubts about the one-canonical-interpretation thesis.
The paper outlines an interpretation of one of the most important and original contributions of David Hilbert's monograph Foundations of Geometry, namely his internal arithmetization of geometry. It is claimed that Hilbert's profound interest in the problem of the introduction of numbers into geometry responded to certain epistemological aims and methodological concerns that were fundamental to his early axiomatic investigations into the foundations of elementary geometry. In particular, it is shown that a central concern that motivated Hilbert's axiomatic investigations from very early on was the aim of providing an independent basis for geometry. Accordingly, these concerns about an independent grounding for elementary geometry determined very clear methodological constraints in the process of embedding it into a formal axiomatic system. It is argued that Hilbert not only sought to show that geometry could be considered a pure mathematical theory, once it was presented as a formal axiomatic system; he also aimed at showing that in the construction of such an axiomatic system one could proceed purely geometrically, avoiding concept formations borrowed from other mathematical disciplines like arithmetic or analysis.
This paper focuses on the extension of AGM that allows change for a belief base by a set of sentences instead of a single sentence. In [FH94], Fuhrmann and Hansson presented an axiomatic characterization of Multiple Contraction and a construction based on the AGM Partial Meet Contraction. We propose for their model another way to construct functions: Multiple Kernel Contraction, a modification of Kernel Contraction, proposed by Hansson [Han94] to construct classical AGM contractions and belief base contractions. This construction works out the unsolved problem pointed out by Hansson in [Han99, p. 369].
We introduce a constructive model of selective belief revision in which it is possible to accept only a part of the input information. A selective revision operator ∘ is defined by the equality K ∘ α = K * f(α), where * is an AGM revision operator and f a function, typically with the property ⊢ α → f(α). Axiomatic characterizations are provided for three variants of selective revision.
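Read operationally, the defining equality composes a transformation of the input with an ordinary revision. The following is a minimal sketch, not the paper's construction: it assumes a toy model in which belief states are plain sets of atomic sentences and revision simply removes conflicting beliefs before adding the input; the names `revise`, `selectively_revise`, and the `conflicts` table are all illustrative.

```python
# Toy model only: belief states are sets of atomic sentences, and
# "conflicts" maps each sentence to the beliefs it is incompatible
# with. These names are illustrative, not from the paper.

def revise(beliefs, sentence, conflicts):
    """Toy AGM-style revision K * a: drop beliefs that conflict with
    the input, then add the input."""
    return (beliefs - conflicts.get(sentence, set())) | {sentence}

def selectively_revise(beliefs, sentence, f, conflicts):
    """Selective revision K o a = K * f(a): first transform the input
    with f, accepting only the part of it that f selects."""
    return revise(beliefs, f(sentence), conflicts)

# Example: the agent does not fully accept "rain_and_wind", only its
# weaker consequence "rain", which displaces the conflicting "dry".
K = {"dry", "warm"}
weaken = lambda s: "rain"  # hypothetical f satisfying ⊢ a → f(a)
print(selectively_revise(K, "rain_and_wind", weaken, {"rain": {"dry"}}))
```

Here f weakens the input, which matches the stated property ⊢ α → f(α): the transformed sentence is a consequence of the original, so only part of the input information is accepted.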
This article engages with the concept of "cultural frontism," the alliance of resistance to the last Brazilian military dictatorship that united a large part of the members of the PCB, intellectuals, and artists during the first years of that regime. It also engages with a particular social construction of memory about the dictatorship, specifically the one forged by the written press in its double role as instigator of the coup and vehicle of resistance to the dictatorship. The Rio de Janeiro morning newspaper Correio da Manhã and one of its main editorialists at the time, Otto Maria Carpeaux, are presented here as representatives of that front, capable both of ratifying it and of indicating its limits.