This new and complete translation of Spinoza's famous 17th-century work fills an important gap, not only for all scholars of Spinoza, but also for everyone interested in the relationship between Western philosophy and religion, and the history of biblical exegesis.
We continue the investigation of Gregory trees and the Cantor Tree Property carried out by Hart and Kunen. We produce models of MA with the continuum arbitrarily large in which there are Gregory trees, and models in which there are no Gregory trees.
In a world that is becoming more ‘networked’ than ever, especially at the personal, everyday level (with, for example, digital media pervading our lives and the Internet of Things now on the rise), we increasingly need to account for ‘networked realities’. But are we as human beings actually well equipped, epistemologically speaking, to do so? Multiple approaches within the philosophy of technology suggest that our use of technologies is in the first instance oriented towards efficiency and the achievement of goals. We thereby neglect the actual systemic, networked nature of technology, and its wider impacts. With regard to the pressing issue of how to cross the ‘gap’ between these two ‘modes,’ the paper at hand engages with the work of Gregory Bateson, reading him as a philosopher of technology. Bateson’s notions of “conscious purpose” and “learning” offer excellent tools for understanding our predicament of living in a networked world while being partly unable to sufficiently grasp and come to terms with this situation. Moreover, as the article endeavors to demonstrate, Bateson’s thought is to be cast as a crucial addition to the body of theory being developed in the philosophy of technology.
It is generally agreed that the most influential philosophers in America are Charles S. Peirce, William James and John Dewey. James's fame came rather suddenly in the latter half of his life—roughly, from 1880 to 1910; it flourished with the appearance of his Principles of Psychology and shortly thereafter with his advocacy of pragmatism and radical empiricism. James was acclaimed in England and Europe as well as in America. Peirce, on the other hand, was almost entirely neglected; his work remained unknown to all but a few philosophers, and his chief acknowledgment was as a scientist and logician. His importance began to be recognized, and his immense researches and writings studied, some twenty-five years after his death. It was otherwise with Dewey. During his long lifetime his ideas not only engaged the reflections and critical discussions of philosophers; he also had a profound and contagious influence on education, the social sciences, aesthetics, and political theory and practice. In this respect his thought has reached a wider audience in America than that of either Peirce or James. In his day lawyers, labour leaders, scientists and several heads of state attested to the vitality of his wisdom.
Gregory Kavka's 'Toxin Puzzle' suggests that I cannot intend to perform a counter-preferential action A even if I have a strong self-interested reason to form this intention. The 'Rationalist Solution,' however, suggests that I can form this intention. For even though it is counter-preferential, A-ing is actually rational given that the intention behind it is rational. Two arguments are offered for this proposition that the rationality of the intention to A transfers to A-ing itself: the 'Self-Promise Argument' and David Gauthier's 'Rational Self-Interest Argument.' But both arguments – and therefore the Rationalist Solution – fail. The Self-Promise Argument fails because my intention to A does not constitute a promise to myself that I am obligated to honor. And Gauthier's Rational Self-Interest Argument fails to rule out the possibility of rational irrationality.
The sustained interdisciplinary debate about neovitalism between two Johns Hopkins University colleagues, philosopher Arthur O. Lovejoy and experimental geneticist H. S. Jennings, in the period 1911–1914, was the basis for their theoretical reconceptualization of scientific knowledge as contingent and necessarily incomplete in its account of nature. Their response to Hans Driesch’s neovitalist concept of entelechy, and his challenge to the continuity between biology and the inorganic sciences, resulted in a historically significant articulation of genetics and philosophy. This study traces the debate’s shift of problem-focus away from neovitalism’s threat to the unity of science – “organic autonomy,” as Lovejoy put it – and toward the potential for development of a nonmechanistic, nonrationalist theory of scientific knowledge. The result was a new pragmatist epistemology, based on Lovejoy’s and Jennings’s critiques of the inadequacy of pragmatism’s account of scientific knowledge. The first intellectual move, drawing on naturalism and pragmatism, was based on a reinterpretation of science as organized experience. The second, sparked by Henri Bergson’s theory of creative evolution, and drawing together elements of Dewey’s and James’s pragmatisms, produced a new account of the contingency and necessary incompleteness of scientific knowledge. Prompted by the neovitalists’ mix of a priori concepts and, in Driesch’s case, an adherence to empiricism, Lovejoy’s and Jennings’s developing pragmatist epistemologies of science explored the interrelation between rationalism and empiricism.
An intricate, long, and occasionally heated debate surrounds Boltzmann’s H-theorem (1872) and his combinatorial interpretation of the second law (1877). After almost a century of devoted and knowledgeable scholarship, there is still no agreement as to whether Boltzmann changed his view of the second law after Loschmidt’s 1876 reversibility argument or whether he had already held a probabilistic conception for some years by that point. In this paper, I argue that there was no abrupt statistical turn. In the first part, I discuss the development of Boltzmann’s research from 1868 to the formulation of the H-theorem. This reconstruction shows that Boltzmann adopted a pluralistic strategy based on the interplay between a kinetic and a combinatorial approach. Moreover, it shows that the extensive use of asymptotic conditions allowed Boltzmann to bracket the problem of exceptions. In the second part, I suggest that neither Loschmidt’s challenge nor Boltzmann’s response to it concerned the H-theorem. The close relation between the theorem and the reversibility argument is a consequence of later investigations on the subject.
Although Gregory Currie is often presented as a strong defender of empathic simulation as part of spectator engagement, this paper questions the importance of empathy in Currie's philosophy of film. Currie's account of the imagination is too propositional, and his account of a more sensuous and experiential kind of imagining is found wanting. While giving a convincing account of impersonal imagining in relation to fiction film, Currie does not sufficiently explain what empathy is, and what relation it has to other forms of imagining. Simulation is primarily defined as impersonal, and perhaps more importantly, as conceptual and propositional in Currie's writings. This is perhaps most evident in his critique of personal imagining, where imagining seeing or imagining being becomes a self-reflexive form of imagining in which the spectator also conceptualizes ‘I’ and ‘see’. This paper discusses the relation between personal imagining and empathy in Currie's account, and argues that he fails to show that empathy is of secondary importance for engagement in fiction film.
The following three related contributions jointly serve to lift up elements of the thought of the anthropologist Gregory Bateson that can be fruitfully compared with elements of Michael Polanyi’s thought. In a brief introduction, William Stillwell reviews Bateson’s life and developing interests. Stillwell also provides, in a creative dialog form akin to Bateson’s own dialogs, a short review article on Noel Charlton’s Understanding Gregory Bateson: Mind, Beauty and the Sacred Earth. The third piece is Jere Moorman’s short 1991 essay (now out of print) discussing Polanyi’s ideas about tacit knowing and their connection with Bateson’s ideas about the double bind.
This paper analyzes the feast days of the Orthodox Church from the point of view of St. Gregory of Nazianzus. Liturgical scholars raise questions about the relationships between past and future, anamnesis and mimesis, the sanctification of time and longing for the eschaton. Investigation of Gregory’s liturgical theology, which has had unparalleled influence in the Byzantine rite churches, shows that all of these are false dichotomies. Gregory’s two homilies on Pascha and his homilies on Christmas, Theophany, and Pentecost were preached throughout his public life. They show, in the feast days, anamnesis, in which the sacred events in Christ’s life are made present, and mimesis, the repetition of past events so as to arrive at the same future in God’s eternal kingdom. Patristics and liturgical scholars, however, have understood “mimesis” in different ways.
This article is an investigation of parallel themes in Heinrich Hertz's philosophy of science and Kant's theory of schemata, symbols and regulative ideas. It is argued that Hertz's "pictures" bear close similarities to Kantian "schemata"; that is, they are rules linking concepts to intuitions and provide them with their meaning. Kant's distinction between symbols and schemata is discussed and related to Hertz's three pictures of mechanics. It is argued that Hertz considered his own picture of mechanics (the "hidden mass" picture) as symbolic in a different way than the force and energy pictures. The final part of the article describes how Harald Høffding, soon after the publication of Hertz's Principles of Mechanics, developed a general theory of analogical reasoning, relying on the ideas of Hertz and Kant.
In this essay, I reconstruct H. Richard Niebuhr's interpretation of George Herbert Mead's account of the social constitution of the self. Specifically, I correct Niebuhr's interpretation, because it mischaracterizes Mead's understanding of social constitution as more dialogical than ecological. I also argue that Niebuhr's interpretation needs completing, because it fails to engage one of Mead's more significant notions, the I/me distinction within the self. By reconstructing Niebuhr's account of faith and responsibility as theologically self-constitutive through Mead's I/me distinction, I demonstrate Niebuhr's deep yet unacknowledged agreement with Mead: the self is constituted by its participation in multiple communities, but responds to them creatively by enduring the moral perplexity of competing communal claims. I conclude by initiating a constructive account of conscience that follows from this agreement. Conscience is more ecological than dialogical because it regards our creative participation in multiple ecologies of social roles oriented by patterns of responsive relations.
In relation to a thesis put forward by Marx Wartofsky, we seek to show that a historiography of mathematics requires an analysis of the ontology of the part of mathematics under scrutiny. Following Ian Hacking, we point out that in the history of mathematics the amount of contingency is larger than is usually thought. As a case study, we analyze the historians’ approach to interpreting James Gregory’s expression “ultimate terms” in his paper attempting to prove the irrationality of π. Here Gregory referred to the last or ultimate terms of a series. More broadly, we analyze the following questions: which modern framework is more appropriate for interpreting the procedures at work in texts from the early history of infinitesimal analysis, and, relatedly, what is a logical theory that is close to something early modern mathematicians could have used when studying infinite series and quadrature problems? We argue that what has been routinely viewed from the viewpoint of classical analysis as an example of an “unrigorous” practice in fact finds close procedural proxies in modern infinitesimal theories. We analyze a mix of social and religious reasons that led to the suppression of both the religious order of Gregory’s teacher, degli Angeli, and of Gregory’s books at Venice, in the late 1660s.
Why do we need government? A common view is that government is necessary to constrain people's conduct toward one another, because people are not sufficiently virtuous to exercise the requisite degree of control on their own. This view was expressed perspicuously, and artfully, by liberal thinker James Madison in The Federalist, number 51, where he wrote: “If men were angels, no government would be necessary.” Madison's idea is shared by writers ranging across the political spectrum. It finds clear expression in the Marxist view that the state will gradually wither away after a communist revolution, as unalienated “communist man” emerges. And it is implied by the libertarian view that government's only legitimate function is to control the unfortunate and immoral tendency of some individuals to violate the moral rights of others.
It is commonplace to suppose that the theory of individual rational choice is considerably less problematic than the theory of collective rational choice. In particular, it is often assumed by philosophers, economists, and other social scientists that an individual's choices among outcomes accurately reflect that individual's underlying preferences or values. Further, it is now well known that if an individual's choices among outcomes satisfy certain plausible axioms of rationality or consistency, that individual's choice-behavior can be interpreted as maximizing expected utility on a utility scale that is unique up to a linear transformation. Hence, there is, in principle, an empirically respectable method of measuring individuals' values and a single unified schema for explaining their actions as value maximizing.