This study reflects on the hypothesis that the crisis of political discourse in Latin American societies has caused citizens to withdraw toward the representations offered by the media, which have established new interpretive modes regarding the principal social problems.
Enrique Dussel's writings span the theology of liberation, critiques of discourse ethics, evaluations of Marx, Levinas, Habermas, and others, and, most importantly, the development of a philosophy written from the underside of Eurocentric modernist teleologies, an ethics of the impoverished, and the articulation of a unique Latin American theoretical perspective. This anthology of original articles by U.S. philosophers elucidating Dussel's thought offers critical analyses from a variety of perspectives, including feminist ones. Also included is an essay by Dussel that responds to these essays.
The aim of this paper is to explore the peculiar case of infectious logics, a group of systems obtained by generalizing the semantic behavior characteristic of the -fragment of the logics of nonsense, such as those due to Bochvar and Halldén, among others. Here we extend these logics with classical negations, and we furthermore show that some of the extended systems can properly be regarded as logics of formal inconsistency and logics of formal undeterminedness.
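The "infectious" behaviour the abstract describes can be illustrated with a small three-valued toy model. This is an illustrative sketch, not the paper's own formalism: `E` stands for the nonsense value of Bochvar-style logics, and the definition given for the added classical negation (true exactly on non-designated inputs, assuming only `T` is designated) is one natural choice, not necessarily the paper's.

```python
T, F, E = 't', 'f', 'e'  # true, false, and the infectious "nonsense" value

def neg(a):
    """Internal (Bochvar-style) negation: nonsense stays nonsense."""
    return E if a == E else (F if a == T else T)

def conj(a, b):
    """Conjunction: any nonsense input infects the output."""
    if E in (a, b):
        return E
    return T if (a, b) == (T, T) else F

def disj(a, b):
    """Disjunction: infectious as well -- even a true disjunct cannot rescue it."""
    if E in (a, b):
        return E
    return F if (a, b) == (F, F) else T

def neg_classical(a):
    """A classical negation over the three values: true iff the input is not
    designated (assuming only T is designated) -- one natural choice."""
    return T if a != T else F
```

The hallmark of infectiousness is visible in `disj(T, E)`, which yields `E`: unlike in strong Kleene logic, a true disjunct does not settle the disjunction. The classical negation, by contrast, always returns a classical value, which is what makes such extensions candidates for logics of formal inconsistency and undeterminedness.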
This article aims to review the standard objections to dualism and to argue that they either fail to convince someone committed to dualism or are flawed on independent grounds. I begin by presenting a taxonomy of metaphysical positions on concrete particulars as they relate to the dispute between materialists and dualists, and define substance dualism in particular. In the first section, several kinds of substance dualism are distinguished and the relevant varieties of this kind of dualism are selected. The remaining sections analyse the standard objections to substance dualism: that it is uninformative, has trouble accounting for soul individuation, causal pairing, and interaction, violates the laws of physics, is made implausible by the development of neuroscience, and postulates entities beyond necessity. I conclude that none of these objections is successful.
Decision Theory and Rationality offers a challenging new interpretation of a key theoretical tool in the human and social sciences. This accessible book argues, contrary to orthodoxy in politics, economics, and management science, that decision theory cannot provide a theory of rationality.
Roughly speaking, classical statistical physics is the branch of theoretical physics that aims to account for the thermal behaviour of macroscopic bodies in terms of a classical mechanical model of their microscopic constituents, with the help of probabilistic assumptions. Over the last century and a half, a fair number of approaches have been developed to meet this aim. This study of their foundations assesses their coherence and analyzes the motivations for their basic assumptions and the interpretations of their central concepts. The most outstanding foundational problems are the explanation of time-asymmetry in thermal behaviour, the relative autonomy of thermal phenomena from their microscopic underpinning, and the meaning of probability. A more or less historical survey is given of the work of Maxwell, Boltzmann, and Gibbs in statistical physics, and of the problems and objections to which their work gave rise. Next, we review some modern approaches to (i) equilibrium statistical mechanics, such as ergodic theory and the theory of the thermodynamic limit; and (ii) non-equilibrium statistical mechanics, as provided by Lanford's work on the Boltzmann equation, the so-called Bogolyubov-Born-Green-Kirkwood-Yvon approach, and stochastic approaches such as `coarse-graining' and the `open systems' approach. In all cases, we focus on the subtle interplay between probabilistic assumptions, dynamical assumptions, initial conditions, and the other ingredients used in these approaches.
The aim of this article is to analyse the relation between the second law of thermodynamics and the so-called arrow of time. For this purpose, a number of different aspects of this arrow of time are distinguished, in particular those of time-reversal (non-)invariance and of (ir)reversibility. Next, I review versions of the second law in the work of Carnot, Clausius, Kelvin, Planck, Gibbs, Carathéodory, and Lieb and Yngvason, and investigate their connection with these aspects of the arrow of time. It is shown that this connection varies a great deal across these formulations of the second law. According to the famous formulation by Planck, the second law expresses the irreversibility of natural processes. But in many other formulations irreversibility, or even time-reversal non-invariance, plays no role. I therefore argue for the view that the second law has nothing to do with the arrow of time.
The target article by Locke & Bogin (L&B) focuses on the evolution of language as a communicative tool. They neglect, however, that from infancy onwards humans have the ability to go beyond successful behaviour and to reflect upon language (and other domains of knowledge) as a problem space in its own right. This ability is not found in other species and may well be what makes humans unique.
This research extends previous findings related to the positive influence of company credibility on a social Cause–Brand Alliance’s (CBA) persuasion mechanism. This study analyzes the mediating role of two dimensions of company credibility (trustworthiness and expertise) with regard to the influence of altruistic attributions and two types of brand–cause fit (functional and image fit) on corporate social responsibility image. A structural equation model tests the proposed framework with a sample of 299 consumers, and the results suggest that (1) image fit and altruistic attribution are cues that consumers use to evaluate company trustworthiness when linking to a social cause; (2) functional fit significantly influences perceived company expertise but not trustworthiness; and (3) trustworthiness has more weight than expertise in judgments about corporate social responsibility.
Four-dimensionalism, and the stage theory version in particular, has been defended as the best solution for avoiding vagueness with regard to composition, persistence, and identity. Stage theory is highly problematic by itself, and the two views usually packaged with it, unrestricted composition and counterpart theory, are a heavy burden. However, dispensing with these two views, four-dimensionalism could avoid vague persistence by issuing a criterion that establishes sharp temporal boundaries for the existence of genuine entities (simples, molecules, and living organisms). This would avoid vague existence and vague identity, but in a way that is still compatible with endurantism. Nevertheless, a minimal (substantialist) four-dimensionalism, a worm-perdurantist ontology, would fit better with the unique way in which organisms persist: by retaining identity while undergoing intrinsic change.
The two main problems for the supervaluationist approach to vagueness are determining which precisifications are admissible and unlimited higher-order vagueness. By appealing to the use of vague terms by the competent linguistic community, the extension of a term can be sharply divided, making precise in which cases it applies definitely, in which it applies indefinitely, and in which it is indeterminate whether it applies. This produces two orders of vagueness, and as a result sorites arguments are blocked. Finally, it is asked which semantics best fits second-order vagueness. Vagueness is based on a determinate indeterminacy in the application of certain terms by native speakers.
This paper outlines a framework for temporal interpretation in Chinese, with a special focus on complement and relative clauses. It argues not only that Chinese has no morphological tense, but also that there is no need to resort to covert semantic features under a tense node in order to interpret time in Chinese. Instead, various factors, such as the information provided by default aspect, the tense-aspect particles, and pragmatic reasoning, determine the temporal interpretation of sentences. It is shown that aspectual markers in Chinese play the same role that tense plays in a tense language. This result implies that the Chinese phrase structure has AspP above VP but no TP above AspP.
Although argumentation plays an essential role in our lives, there is no integrated area of research on the psychology of argumentation. Instead, research on argumentation is conducted in a number of separate research communities that are spread across disciplines and have only limited interaction. With a view to bridging these different strands, we first distinguish between three meanings of the word 'argument': argument as a reason, argument as a structured sequence of reasons and claims, and argument as a social exchange. All three meanings are integral to a complete understanding of human reasoning and cognition. Cognitive psychological research on argumentation has focused mostly on the first and second of these meanings, so we present perspectives on argumentation from outside of cognitive psychology, which focus on the second and third. Specifically, we give an overview of the methods, goals, and disciplinary backgrounds of research on the production, analysis, and evaluation of arguments. Finally, in introducing the experimental studies included in this special issue, which were conducted by researchers from a range of theoretical backgrounds, we underline the breadth of argumentation research as well as stress opportunities for mutual awareness and integration.
In everyday situations, people regularly receive information from large groups of people and from single experts. Although lay opinions and expert opinions have been studied extensively in isolation, the present study examined the relationship between the two by asking how many laypeople are needed to counter an expert opinion. A Bayesian formalisation allowed this quantity to be prescribed. Participants were subsequently asked to assess how many laypeople are needed in different situations. The results demonstrate that people are sensitive to the relevant factors identified for determining how many lay opinions are required to counteract a single expert opinion. People's assessments were fairly well in line with Bayesian predictions.
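As a rough illustration of the kind of Bayesian formalisation at issue (the model and function name here are my own assumptions, not the study's): if each opinion is treated as conditionally independent evidence on a binary claim, each source contributes a log-likelihood-ratio "weight" determined by its accuracy, and one can count how many lay weights it takes to outweigh one expert weight.

```python
import math

def laypeople_to_counter_expert(p_lay, p_expert):
    """Smallest number of independent lay opinions whose combined Bayesian
    evidence (sum of log-likelihood ratios) at least matches one expert's.

    Illustrative model: binary claim, conditionally independent sources,
    each source simply 'right with probability p' (p must exceed 0.5 for
    the opinion to carry positive evidential weight).
    """
    weight = lambda p: math.log(p / (1 - p))  # evidence contributed per opinion
    return math.ceil(weight(p_expert) / weight(p_lay))

# e.g. laypeople right 60% of the time vs. an expert right 90% of the time
n = laypeople_to_counter_expert(0.6, 0.9)  # -> 6
```

On this toy model the answer depends only on the two accuracies, which is one concrete way in which "relevant factors" could determine the required number of lay opinions.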
This paper gives an analysis of the Chinese distributivity marker dou 'all', which can occur not only with definite plural NPs but also with NPs whose determiner is a quantifier word such as mei 'every' or dabufen-de 'most'. Besides normal distributive predicates, it can also occur with certain types of collective predicates. The difficulties of giving a compositional interpretation to constructions of these kinds are discussed in detail. I show that we can solve those difficulties if we treat dou as a generalized distributivity marker in the sense of Schwarzschild (1991, 1996), which distributes over the members of a plurality cover. Apart from the above topic, which is more narrowly a semantics topic, this paper also discusses some syntax-semantics interface issues related to the distribution of dou's associates.
In order to protect patients against medical paternalism, patients have been granted the right to respect for their autonomy. This right is operationalized first and foremost through the phenomenon of informed consent. If the patient withholds consent, medical treatment, including life-saving treatment, may not be provided. However, there is one proviso: the patient must be competent to realize his autonomy and reach a decision about his own care that reflects that autonomy. Since one of the most important patient rights hinges on the patient's competence, it is crucially important that patient decision-making incompetence be clearly defined and diagnosable with the greatest possible degree of sensitivity and, even more importantly, specificity. Unfortunately, the reality is quite different. There is little consensus in the scientific literature, and even less among clinicians and in the law, as to what competence exactly means, let alone how it can be diagnosed reliably. And yet patients are deemed incompetent on a daily basis, losing the right to respect for their autonomy. In this article, we set out to fill that hiatus by beginning at the very beginning: the literal meaning of the term competence. We suggest a generic definition of competence and derive four necessary conditions of competence. We then transpose this definition to the health care context and discuss patient decision-making competence.
It has been a longstanding problem to show how the irreversible behaviour of macroscopic systems can be reconciled with the time-reversal invariance of these same systems when considered from a microscopic point of view. A result by Lanford shows that, under certain conditions, the famous Boltzmann equation, describing the irreversible behaviour of a dilute gas, can be obtained from the time-reversal invariant Hamiltonian equations of motion for the hard spheres model. Here, we examine how and in what sense Lanford’s theorem succeeds in deriving this remarkable result. Many authors have expressed different views on the question of which ingredient in Lanford’s theorem is responsible for the emergence of irreversibility. We claim that these interpretations miss the target. In fact, we argue that there is no time-asymmetric ingredient at all.
The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited that also fulfills the reasonable part of the consistency assumptions.
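For a concrete sense of how the maximum entropy principle assigns probabilities from partial information, here is a small sketch (my own illustration, not from the paper): the maximum-entropy distribution on a die's six faces given only its mean has the exponential-family form p_i ∝ exp(λ·i), and λ can be found by bisection. With mean 3.5 the method recovers the uniform distribution, matching the classical principle of insufficient reason.

```python
import math

def maxent_die(target_mean, lo=-5.0, hi=5.0, tol=1e-10):
    """Maximum-entropy distribution on faces 1..6 subject to a fixed mean.
    The Lagrangian solution is exponential-family: p_i proportional to
    exp(lam * i); we solve for lam by bisection on the resulting mean."""
    faces = range(1, 7)

    def mean(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    # mean(lam) is strictly increasing in lam (its derivative is a variance)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(3.5)  # no information beyond the default mean -> uniform
```

Constraining the mean to 4.5 instead tilts the distribution exponentially toward the high faces, which is the characteristic way the principle uses only the given information and nothing more.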
Whereas there are many publications in which argumentation quality has been defined by argumentation theorists, considerably less research attention has been paid to lay people’s considerations regarding argument quality. Considerations about strong and weak argumentation are relevant because they can be compared with actual persuasive success. Argumentation theorists’ conceptions have to some extent been shown to be compatible with actual effectiveness, but for lay people such compatibility has yet to be determined. This study experimentally investigated lay people’s expectations about the persuasiveness of anecdotal, statistical, causal, and expert evidence, and compared these expectations with the actual persuasiveness of these evidence types. Dutch and French participants (N = 174) ranked four types of evidence in terms of their expected persuasiveness for eight different claims. Both cultural groups expected statistical evidence to be the most persuasive type of evidence to other people, followed by expert, causal, and, finally, anecdotal evidence. A comparison of these rankings with the results of Hornikx and Hoeken (Communication Monographs 74, 443–463, 2007, Study 1) on the actual persuasiveness of the same evidence types reveals that people’s expectations are generally accurate: how persuasive they expect evidence types to be often corresponds with the actual persuasiveness of those types.
I consider the problem of extending Reichenbach's principle of the common cause to more than two events, vis-à-vis an example posed by Bernstein. It is argued that the only reasonable extension of Reichenbach's principle stands in conflict with a recent proposal due to Horwich. I also discuss the prospects of the principle of the common cause in the light of these and other difficulties known in the literature, and argue that a more viable version of the principle is the one provided by Penrose and Percival (1962).
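For readers unfamiliar with Reichenbach's principle, its core quantitative content in the standard two-event case can be sketched in a few lines (an illustrative version of the textbook setup; the abstract's concern is precisely how to extend this beyond two events). A common cause C screens off A from B within each cell, raises the probability of each, and these conditions entail a positive correlation between A and B.

```python
def common_cause_correlation(pC, pA_C, pA_nC, pB_C, pB_nC):
    """Reichenbach's two-event setup: A and B are conditionally independent
    given C and given not-C (screening off), with P(A|C) > P(A|~C) and
    P(B|C) > P(B|~C). Returns (P(A&B), P(A)*P(B)); Reichenbach's conditions
    entail the former exceeds the latter, i.e. A and B are correlated."""
    pA = pC * pA_C + (1 - pC) * pA_nC          # total probability of A
    pB = pC * pB_C + (1 - pC) * pB_nC          # total probability of B
    pAB = pC * pA_C * pB_C + (1 - pC) * pA_nC * pB_nC  # screening off in each cell
    return pAB, pA * pB

pAB, pApB = common_cause_correlation(0.5, 0.8, 0.2, 0.8, 0.2)  # 0.34 vs 0.25
```

The extension problem the paper addresses arises because, for three or more correlated events, it is no longer obvious which of the pairwise and joint factorization conditions a single common cause should be required to satisfy.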
José Jorge Mendoza argues that the difficulty with resolving the issue of immigration is primarily a conflict over competing moral and political principles and is, at its core, a problem of philosophy. This book brings into dialogue various contemporary philosophical texts that deal with immigration to provide some normative guidance to immigration policy and reform.
This book chapter shows how the early Heidegger’s philosophy around the period of Being and Time can address some central questions of contemporary social ontology. After sketching “non-summative constructionism”, which is arguably the generic framework that underlies all forms of contemporary analytic social ontology, I lay out the early Heidegger’s conception of human social reality in terms of an extended argument. The Heidegger that emerges from this treatment is an acute phenomenologist of human social existence who emphasizes our engagement in norm-governed practices as the basis of social reality. I then defuse a common and understandable set of objections against invoking the early Heidegger as someone who can make any positive contribution to our understanding of social reality. Lastly, I explore the extent to which the early Heidegger’s philosophy provides insights regarding phenomena of collective intentionality by showing how the intelligibility of such phenomena traces back to individual agents’ common understanding of possible ways of understanding things and acting with one another. With the early Heidegger, I argue that this common understanding is the fundamental source and basis of collective intentionality, not the non-summative constructionism on which contemporary analytic social ontology has focused with much effort. The lesson about social ontology that we should learn from the early Heidegger is that there is a tight connection between the social constitution of the human individual and his or her capacity to perform actions or activities that instantiate collective intentionality.
This article seeks to trace the image Plato has of Heraclitus and to connect it with the argumentative structure of the Cratylus, in order to understand the textual needs to which the doctrine of perpetual flux responds, that is, the discussion regarding the correctness (ὀρθότης) of names. The inclusion of Heraclitus's testimony makes it possible to trace the alleged consolidation of the thesis regarding primary and secondary names as the axis of the separation between two levels of reality (one stable, the other changing) and of the theory of Ideas, that is, as the basis of the Platonic epistemology set forth in the mature dialogues.
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference, one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation values of certain functions with their empirical averages. There are, however, various other ways in which one can construct constraints from empirical data, and these make the maximum entropy principle lead to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory and investigates several alternative choices. The choice of a constraint rule is also shown to be of crucial importance to the debate on whether there is a conflict between the methods of inference based on maximum entropy and Bayesian conditionalization.
Helmuth Plessner’s Levels of Organic Life and the Human [Die Stufen des Organischen und der Mensch, 1928] is one of the founding texts of twentieth-century philosophical anthropology. It is argued that Plessner’s work demonstrates the fundamental indispensability of the qualitative humanities vis-à-vis the natural-scientific study of man. Plessner’s non-reductionist, emergentist naturalism allots complementary roles to the causal and functional investigations of the life sciences and the phenomenological and hermeneutic interpretation of the phenomenon of life in its successive levels and stages. Within this context, human agency can be understood as a higher-order property of organic life, which acts by selectively activating lower-level psychophysical powers. Plessner’s three ‘anthropological laws’ are used to situate the notion of practical self-understanding between two extremes: deterministic views that deny human freedom and responsibility, and views that ascribe an unrealistic degree of autonomy to human beings.