Ernesto Laclau and Chantal Mouffe’s post-Marxist analysis pushed Gramsci’s anti-determinism to its limits, embracing a post-structuralist, discourse-centred politics. Mouffe’s subsequent programme for radical democracy has sought a renewed democratic left project. While radical democracy’s post-structuralism enables important insights into political subjectivity and antagonism in contemporary democracies, it also weakens its own critical and strategic capacity. By recuperating its Gramscian heritage, radical democracy could be more theoretically and politically effective. In contrast to discourses operating in an entirely open and contingent political field, Gramscian theory offers a more realist – but non-determinist – account of the structural, enabling and constraining properties of ideologies. It also allows for a distinctive institutional space for society. Society is the site upon which political identities are articulated, and from which existing power relations are challenged. But a conception of society also points to the institutional limits to politics, notable by their absence in post-structuralism and radical democracy.
This book is a translation of W.V. Quine's Kant Lectures, given as a series at Stanford University in 1980. It provides a short and useful summary of Quine's philosophy. There are four lectures altogether: I. Prolegomena: Mind and its Place in Nature; II. Endolegomena: From Ostension to Quantification; III. Endolegomena loipa: The forked animal; and IV. Epilegomena: What's It all About? The Kant Lectures have been published to date only in Italian and German translation. The present book is filled out with the translator's critical Introduction, "The esoteric Quine?", a bibliography based on Quine's sources, and an Index for the volume.
Vanguard anti-narrativist Galen Strawson declares personal memory unimportant for self-constitution. But what if lapses of personal memory are sustained by a morally reprehensible amnesia about historical events, as happens in the work of W.G. Sebald? The importance of memory cannot be downplayed in such cases. Nevertheless, contrary to expectations, a concern for memory needn’t ally one with the narrativist position. Recovery of historical and personal memory results in self-dissolution and not self-unity or understanding in Sebald’s characters. In the end, Sebald shows how memory can be significant, even imperative, within a deeply anti-narrativist outlook on the self, memory, and history.
There has been a great deal of critical discussion of Harry Frankfurt’s argument against the Principle of Alternative Possibilities (PAP), almost all of which has focused on whether the Frankfurt-style examples, which are designed to be counterexamples to PAP, can be given a coherent formulation. Recently, however, David Widerker has argued that even if Frankfurt-style examples can be given a coherent formulation, there is reason to believe that an agent in those examples could never be morally blameworthy for what she has done. Therefore, such examples do not undermine a version of PAP restricted to blameworthiness. Widerker refers to his argument for this claim as the W-defense. I examine the W-defense in some detail, along with three recent replies to it by defenders of Frankfurt’s argument. I contend that each of these replies is problematic and, indeed, that two of them play directly into the hands of those seeking to defend PAP. I then develop my own reply to the W-defense by calling into question an assumption which is at the heart of that argument regarding the nature of moral blame.
There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton’s footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Second, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution.
In an unsung yet excellent paper, W.Z. Harvey set out to explain how both Maimonides and Spinoza have similarly problematic views on the nature of the knowledge of good and evil. In it, he proposed a solution to the problem. In the many decades since, debates surrounding this topic have flourished. In a recent paper, Joshua Parens draws a distinction between Spinoza and Maimonides that threatens to undermine Harvey’s solution to the problem. I will argue that, although Parens’ distinction forces us to revise Harvey’s contention, Harvey’s argument is still generally valid.
Laudisa (Found. Phys. 38:1110–1132, 2008) claims that experimental research on the class of non-local hidden-variable theories introduced by Leggett is misguided, because these theories are irrelevant for the foundations of quantum mechanics. I show that Laudisa’s arguments fail to establish the pessimistic conclusion he draws from them. In particular, it is not the case that Leggett-inspired research is based on a mistaken understanding of Bell’s theorem, nor that previous no-hidden-variable theorems already exclude Leggett’s models. Finally, I argue that the framework of Bohmian mechanics brings out the importance of Leggett tests, rather than proving their irrelevance, as Laudisa supposes.
W. T. Stace's argument concerning realism is presented, pointing out not that realism is false but only that there is absolutely no reason to consider it true, and therefore no reason for us to believe it. This is applied to the discussion of the question: how do we know that atoms exist? Reference is made to some of the most important known scientific answers, which are, in chronological order: (i) the law of definite proportions, or Proust's law, (ii) the kinetic theory of gases, (iii) Brownian motion, and (iv) scanning tunnelling microscope images.
In his 2013 Foundations of Physics paper, Mathias Egg claims to show that my critical arguments concerning the foundational significance of Leggett’s non-local theories are misguided. His main contention is that my argument ties Leggett’s original motivation for introducing this new class of theories too closely to the foundational significance of the theories themselves. Egg basically aims to show that, although it can be conceded that Leggett’s original motivation relies on a mistaken view of the original Bell theorem, the investigation of the Leggett theories does have a foundational meaning that can be dissociated from the view that Leggett himself has of them. As a reply to Egg, I would like to argue here that, even if we agree to disentangle the Leggett view from the fate of the Leggett theories, there is still room to dispute the foundational significance of the Leggett ‘non-local realistic’ research program.
It is shown here that Suarez (Found. Phys. 38:583, 2008) wrongly presents the assumptions behind Leggett’s inequalities, and their modified form used by Groeblacher et al. (Nature 446:871, 2007) for an experimental falsification of a certain class of non-local hidden variable models.
J. Schumpeter is a key, even seminal, figure in the study of technological innovation. Most economists who study technological innovation refer to Schumpeter and his pioneering role in introducing innovation into economic studies. However, despite having brought the concept of innovation into economic theory, Schumpeter provided few if any analyses of the process of innovation itself. This paper suggests that the origin of systematic studies of technological innovation owes its existence to the economist W. Rupert Maclaurin of MIT. In the 1940s and 1950s, Maclaurin developed Schumpeter’s ideas, analyzing technological innovation as a process composed of several stages or steps, and proposed a theory of technological innovation, later called the linear model of innovation. The paper also argues that Maclaurin constructed one of the first taxonomies for measuring technological innovation.
The rise of quantum information theory has lent new relevance to experimental tests for non-classicality, particularly in controversial cases such as adiabatic quantum computing with superconducting circuits. The Leggett-Garg inequality is a “Bell inequality in time” designed to indicate whether a single quantum system behaves in a macrorealistic fashion. Unfortunately, a violation of the inequality can only show that the system is either (i) non-macrorealistic or (ii) macrorealistic but subjected to a measurement technique that happens to disturb the system. The “clumsiness” loophole (ii) provides reliable refuge for the stubborn macrorealist, who can invoke it to brand recent experimental and theoretical work on the Leggett-Garg test inconclusive. Here, we present a revised Leggett-Garg protocol that permits one to conclude that a system is either (i) non-macrorealistic or (ii) macrorealistic but with the property that two seemingly non-invasive measurements can somehow collude and strongly disturb the system. By providing an explicit check of the invasiveness of the measurements, the protocol replaces the clumsiness loophole with a significantly smaller “collusion” loophole.
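For orientation, the standard three-time form of the Leggett-Garg inequality (a textbook statement, not the revised protocol proposed in the paper) can be written, for a dichotomic observable $Q(t) \in \{+1,-1\}$ measured at times $t_1 < t_2 < t_3$ with two-time correlators $C_{ij} = \langle Q(t_i)\,Q(t_j) \rangle$, as

\[
  K_3 \;=\; C_{21} + C_{32} - C_{31} \;\le\; 1 ,
\]

a bound that follows from macrorealism together with non-invasive measurability; quantum mechanics allows values of $K_3$ up to $3/2$ for a two-level system.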
It is shown that the before-before (or Suarez-Scarani) experiment refutes hidden variable models with a deterministic (“realistic”) nonlocal part, whereas experiments violating Leggett-type inequalities refute models with a biased random local part. Therefore the claim that Gröblacher et al. (Nature 446:871–875, 2007) present “an experimental test of nonlocal realism” is misleading, and Marek Żukowski’s (Found. Phys. 38:1070, 2008) comment misses the point. A new experiment is suggested.
This essay problematizes the contemporary relevance of the concept of the culture industry (Kulturindustrie) within Theodor W. Adorno's project of critical theory, aiming to show that the current limitations imposed on the debate derive more from the non-dialectical standpoint of those who assert its restricted scope than from the potency of the Frankfurt School's theorizing itself.
Leggett formulated an inequality that seems to generalize the Bell theorem to non-local hidden variable theories. The Leggett inequality is violated by quantum mechanics, as has been confirmed by experiment. However, a careful analysis reveals that the inequality in fact applies to a class of local theories. Contrary to what happens in the derivation of the Bell inequality, the hypothesis of outcome independence is not needed to derive the Leggett inequality.
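To make the hypothesis at issue explicit (in the standard decomposition of Bell locality, not quoted from the paper): Bell's factorizability condition

\[
  P(A,B \mid a,b,\lambda) \;=\; P(A \mid a,\lambda)\, P(B \mid b,\lambda)
\]

is equivalent to the conjunction of outcome independence, $P(A \mid a,b,B,\lambda) = P(A \mid a,b,\lambda)$, and parameter independence, $P(A \mid a,b,\lambda) = P(A \mid a,\lambda)$; the claim above is that a derivation of the Leggett inequality can dispense with the first of these conjuncts.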
W. H. Auden and Hannah Arendt belonged to a generation that experienced the catastrophic events of the mid-twentieth century, and they both sought to respond to the enormity of the novel phenomena they witnessed.
As one of the preeminent philosophers of the twentieth century, W. V. Quine made groundbreaking contributions to the philosophy of science, mathematical logic, and the philosophy of language. This collection of essays examines Quine's views, particularly his holism and naturalism, for their value to feminist theorizing today. Some contributors to this volume see Quine as severely challenging basic tenets of the logico-empiricist tradition in the philosophy of science—the analytic/synthetic distinction, verificationism, foundationalism—and accept various of his positions as potential resources for feminist critique. Other contributors regard Quine as an unrepentant empiricist and, unlike feminists who seek to use or extend his arguments, they interpret his positions as far less radical and more problematic. In particular, critics and advocates of Quine's arguments that the philosophy of science should be "naturalized"—understood and pursued as an enterprise continuous with the sciences proper—disagree deeply about whether such a naturalized philosophy is "philosophy enough." Central issues at stake in these disagreements reflect current questions of special interest to feminists and also bridge the analytic and postmodern traditions. They include questions about whether and how the philosophy of science, as a form of practice, is or can be normative as well as questions concerning the implications of Quine's philosophy of language for the transparency and stability of meaning. In representing feminist philosophy centrally engaged with the analytic tradition, this volume is important not only for what it contributes to the understanding of Quine and naturalized epistemology but also for what it accomplishes in working against restrictive conceptions of the place of feminism within the discipline. Aside from the editors, the contributors are Kathryn Pyne Addelson, Louise M. Antony, Richmond Campbell, Lorraine Code, Jane Duran, Maureen Linker, Phyllis Rooney, and Paul A. Roth.
In this paper, rejection systems for the “nonsense-logic” W and the k-valued implicational-negational sentential calculi of Sobociński are given. The systems considered consist of computable sets of rejected axioms and a single rejection rule: the rejection counterpart of the detachment rule.
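In the usual Łukasiewicz-style notation (with $\vdash$ for assertion and $\dashv$ for rejection), the rejection counterpart of detachment can be stated as follows (a standard formulation, not quoted from the paper):

\[
  \frac{\;\vdash A \rightarrow B \qquad \dashv B\;}{\dashv A}
\]

that is, if $A \rightarrow B$ is asserted and $B$ is rejected, then $A$ is rejected.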