The paper presents some essential heuristic and constructional elements of Free Process Theory (FPT), a non-Whiteheadian, monocategoreal framework. I begin with an analysis of our common sense concept of activities, which plays a crucial heuristic role in the development of the notion of a free process. I argue that an activity is not a type but a mode of occurrence, defined in terms of a network of inferences. The inferential space characterizing our concept of an activity entails that anything which is conceived of as occurring in the activity mode is a concrete, dynamic, non-particular individual. Such individuals, which I call free processes, may be used for the interpretation of much more than just common sense activities. I introduce the formal theory FPT, a mereology with a non-transitive part-relation, which contains a typology of processes based on the following five parameters, relating to: (a) patterns of possible spatial and temporal recurrence (automerity); (b) kinds of components (participant structure); (c) kinds of dynamic composition; (d) kinds of dynamic flow (dynamic shape); and (e) dynamic context. I show how these five evaluative dimensions for free processes can be used to define ontological correlates for various common sense categories, and to draw distinctions between various forms of agency (distributed, collective, reciprocal, entangled) and emergence (weak, strong, as autonomous system (Bickhard/Christensen)).
Weak and global supervenience are equivalent to strong supervenience for intrinsic properties. Moreover, weak and global supervenience relations are always mere parts of a more general underlying strong supervenience relation. Most appeals to global supervenience, though, involve spatio-temporally relational properties; but here too, global and strong supervenience are equivalent. Functionally, we can characterize merely weak and global supervenience as follows: for A to supervene on B requires that at all worlds an individual's A properties be a function of its B properties, where this function varies from world to world. But what are the…
Many public debates about the societal significance and impact of agriculture are usefully framed by Paul Thompson’s distinction between the “agrarian” and the “industrial vision.” The ongoing debate between these visions goes beyond academic philosophy and has direct effects on the political economy of agriculture by influencing the scope of rent-seeking activities that are undertaken primarily in the name of the agrarian vision. The existence of rent-seeking activities is shown to reflect the fact that the agrarian vision is not universally supported, which is certainly true of the industrial vision as well. The key argument of the present paper is that these two philosophical visions of agriculture are not radically incongruent. Rather, they share a common ground within which they are even mutually supportive. If agricultural policy making is oriented toward this common ground, it may reduce overall dissatisfaction with the resulting institutional regime of agricultural production. Such an agricultural policy may also stimulate the emergence of new business practices that not only enable efficient agricultural production but also minimize negative ecological impact and preserve cultural landscapes.
One of the major tasks of medical educators is to help maintain and increase trainee empathy for patients. Yet research suggests that during the course of medical training, empathy in medical students and residents decreases. Various exercises and more comprehensive paradigms have been introduced to promote empathy and other humanistic values, but with inadequate success. This paper argues that realizing the potential of medical education to promote empathy is not easy, for two reasons: (a) Medical students and residents have complex and mostly unresolved emotional responses to the universal human vulnerability to illness, disability, decay, and ultimately death that they must confront in the process of rendering patient care; (b) Modernist assumptions about the capacity to protect, control, and restore run deep in institutional cultures of mainstream biomedicine and can create barriers to empathic relationships. In the absence of appropriate discourses about how to emotionally manage distressing aspects of the human condition, it is likely that trainees will resort to coping mechanisms that result in distance and detachment. This paper suggests the need for an epistemological paradigm that helps trainees develop a tolerance for imperfection in self and others, and an acceptance of shared emotional vulnerability and suffering, while simultaneously honoring the existence of difference. Reducing the sense of anxiety and threat that is now reinforced by the dominant medical discourse in the presence of illness will enable trainees to learn to emotionally contain the suffering of their patients and themselves, thus providing a psychologically sound foundation for the development of true empathy.
A Scientific Integrity Consortium developed a set of recommended principles and best practices that can be used broadly across scientific disciplines as a mechanism for consensus on scientific integrity standards and to better equip scientists to operate in a rapidly changing research environment. The two principles that represent the umbrella under which scientific processes should operate are as follows: (1) Foster a culture of integrity in the scientific process. (2) Evidence-based policy interests may have legitimate roles to play in influencing aspects of the research process, but those roles should not interfere with scientific integrity. The nine best practices for instilling scientific integrity in the implementation of these two overarching principles are: (1) Require universal training in robust scientific methods, in the use of appropriate experimental design and statistics, and in responsible research practices for scientists at all levels, with the training content regularly updated and presented by qualified scientists. (2) Strengthen scientific integrity oversight and processes throughout the research continuum, with a focus on training in ethics and conduct. (3) Encourage reproducibility of research through transparency. (4) Strive to establish open science as the standard operating procedure throughout the scientific enterprise. (5) Develop and implement educational tools to teach communication skills that uphold scientific integrity. (6) Strive to identify ways to further strengthen the peer review process. (7) Encourage scientific journals to publish unanticipated findings that meet standards of quality and scientific integrity. (8) Seek harmonization and implementation among journals of rapid, consistent, and transparent processes for correction and/or retraction of published papers. (9) Design rigorous and comprehensive evaluation criteria that recognize and reward the highest standards of integrity in scientific research.
Werner Heisenberg's 1925 paper ‘Quantum-theoretical re-interpretation of kinematic and mechanical relations’ marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be ‘founded exclusively upon relationships between quantities which in principle are observable’. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of ‘observability’ along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.
The psycholinguistic literature has identified two syntactic adaptation effects in language production: rapidly decaying short-term priming and long-lasting adaptation. To explain both effects, we present an ACT-R model of syntactic priming based on a wide-coverage, lexicalized syntactic theory that explains priming as facilitation of lexical access. In this model, two well-established ACT-R mechanisms, base-level learning and spreading activation, account for long-term adaptation and short-term priming, respectively. Our model simulates incremental language production, and in a series of modeling studies we show that it accounts for (a) the inverse frequency interaction; (b) the absence of a decay in long-term priming; and (c) the cumulativity of long-term adaptation. The model also explains the lexical boost effect and the fact that it only applies to short-term priming. We also present corpus data that verify a prediction of the model, that is, that the lexical boost affects all lexical material, rather than just heads.
The book then discusses another group of issues ("whether it is, what it is, how and why it is"), which determined the argumentation and the axiomatic ordering of the sciences, and concludes with a demonstration on the basis of concrete ...
Kim argues that weak and global supervenience are too weak to guarantee any sort of dependency. Of the three original forms of supervenience, strong, weak, and global, each commonly wielded across all branches of philosophy, two are thus cast aside as uninteresting or useless. His arguments, however, fail to appreciate the strength of weak and global supervenience. I investigate what weak and global supervenience relations are functionally and how they relate to strong supervenience. For a large class of properties, weak and global supervenience are equivalent to strong supervenience. I then offer a series of arguments showing that it is precisely because of their strength, not their weakness, that both weak and global supervenience are useless in characterizing any dependencies of interest to philosophers.
Puzzles about persistence and change through time, i.e., about identity across time, have foundered on confusion about what it is for ‘two things’ to be ‘the same thing’ at a time. This is most directly seen in the dispute over whether material objects can occupy exactly the same place at the same time. This paper defends the possibility of such coincidence against several arguments to the contrary. Distinguishing a temporally relative from an absolute sense of ‘the same’, we see that the intuition, ‘this is only one thing’, and the dictum, ‘two things cannot occupy the same place at the same time’, are individuating things at a time rather than absolutely and are therefore compatible with coincidence. Several other objections philosophers have raised ride on this same ambiguity. Burke, originating what has become the most popular objection to coincidence, argues that if coincidence is possible there would be no explanation of how objects that are qualitatively the same at a time could belong to different sorts. But we can explain an object’s sort by appealing to its properties at other times. Burke’s argument to the contrary equivocates on different notions of ‘cross-time identity’ and ‘the statue’. From a largely negative series of arguments emerges a positive picture of what it means to say multiple things coincide and of why an object’s historical properties explain its sort rather than vice versa – in short, of how coincidence is possible.
The fission of a person involves what common sense describes as a single person surviving as two distinct people. Thus, say most metaphysicians, this paradox shows us that common sense is inconsistent with the transitivity of identity. Lewis’s theory of overlapping persons, buttressed with tensed identity, gives us one way to reconcile the common sense claims. Lewis’s account, however, implausibly says that reference to a person about to undergo fission is ambiguous. A better way to reconcile the claims of common sense, one that avoids this ambiguity, is to recognize branching persons, persons who have multiple pasts or futures.
In the 1960s, newborn screening programs tested for a single very rare but serious disorder. In recent years, thanks to the development of new screening technology, they have expanded into panels of tests; a federally sponsored expert group has recommended that states test for twenty-nine core disorders and twenty-five secondary disorders. By the standards used to decide whether to introduce new preventive health services into clinical use, the decision-making in newborn screening policy has been lax.
Freedom and the subject were guiding themes for Michel Foucault throughout his philosophical career. In this clear and comprehensive analysis of his thought, Johanna Oksala identifies the different interpretations of freedom in his philosophy and examines three major divisions of it: the archaeological, the genealogical, and the ethical. She shows convincingly that in order to appreciate Foucault's project fully we must understand his complex relationship to phenomenology, and she discusses Foucault's treatment of the body in relation to recent feminist work on this topic. Her sophisticated but lucid book illuminates the possibilities that Foucault's philosophy opens up for us in thinking about freedom.
Some scientists are happy to follow in the footsteps of others; some like to explore novel approaches. It is tempting to think that herein lies an epistemic division of labor conducive to overall scientific progress: the latter point the way to fruitful areas of research, and the former more fully explore those areas. Weisberg and Muldoon’s model, however, suggests that it would be best if all scientists explored novel approaches. I argue that this is due to implausible modeling choices, and I present an alternative ‘epistemic landscape’ model that demonstrates the alleged benefits from division of labor, with one restriction.
For those who think the statue and the piece of copper that compose it are distinct objects that coincide, there is a burden of explanation. After all, common sense says that different ordinary objects cannot occupy the same space at the same time. A common argument in favour of four-dimensionalism (or ‘perdurantism’ or ‘temporal parts theory’) is that it provides the resources for a superior explanation of this coincidence. This, however, is mistaken. Any explanatory work done by the four-dimensionalist notion of absolute parthood rests ultimately on notions equally available to the three-dimensionalist. Thus, a neutral explanation of coincidence is at least as good while avoiding commitment to temporal parts. Many thanks to David Christensen, Louis deRosset, Matti Eklund, and two anonymous referees for helpful comments.
This article argues that Lara Buchak’s risk-weighted expected utility theory fails to offer a true alternative to expected utility theory. Under commonly held assumptions about dynamic choice and the framing of decision problems, rational agents are guided by their attitudes to temporally extended courses of action. If so, REU theory makes approximately the same recommendations as expected utility theory. Being more permissive about dynamic choice or framing, however, undermines the theory’s claim to capturing a steady choice disposition in the face of risk. I argue that this poses a challenge to alternatives to expected utility theory more generally.
Ontic structural realists hold that structure is all there is, or at least all there is fundamentally. This thesis has proved to be puzzling: What exactly does it say about the relationship between objects and structures? In this article, I look at different ways of articulating ontic structural realism in terms of the relation between structures and objects. I show that objects cannot be reduced to structure, and argue that ontological dependence cannot be used to establish strong forms of structural realism. At the end, I show how a weaker, but controversial, form of structural realism can be articulated on the basis of ontological dependence.
General Process Theory (GPT) is a new (non-Whiteheadian) process ontology. According to GPT the domains of scientific inquiry and everyday practice consist of configurations of ‘goings-on’ or ‘dynamics’ that can be technically defined as concrete, dynamic, non-particular individuals called general processes. The paper offers a brief introduction to GPT in order to provide ontological foundations for research programs such as interactivism that centrally rely on the notions of ‘process,’ ‘interaction,’ and ‘emergence.’ I begin with an analysis of our common sense concept of activities, which plays a crucial heuristic role in the development of the notion of a general process. General processes are not individuated in terms of their location but in terms of ‘what they do,’ i.e., in terms of their dynamic relationships in the basic sense of one process being part of another. The formal framework of GPT is thus an extensional mereology, albeit a non-classical theory with a non-transitive part-relation. After a brief sketch of basic notions and strategies of the GPT-framework I show how the latter may be applied to distinguish between causal, mechanistic, functional, self-maintaining, and recursively self-maintaining interactions, all of which involve ‘emergent phenomena’ in various senses of the term.
A radical metaphysical theory typically comes packaged with a semantic theory that reconciles those radical claims with common sense. The metaphysical theory says what things exist and what their natures are, while the semantic theory specifies, in terms of these things, how we are to interpret everyday language. Thus may we “think with the learned, and speak with the vulgar.” This semantic accommodation of common sense, however, can end up undermining the very theory it is designed to protect. This paper is a case study, showing in detail how one popular version of temporal parts theory is self-undermining. This raises the specter that the problem generalizes to other metaphysical theories.
Realists about science tend to hold that our scientific theories aim for the truth, that our successful theories are at least partly true, and that the entities referred to by the theoretical terms of these theories exist. Antirealists about science deny one or more of these claims. A sizable minority of philosophers of science prefers not to take sides: they believe the realism debate to be fundamentally mistaken and seek to abstain from it altogether. In analogy with other realism debates I will call these philosophers quietists. In the philosophy of science quietism often takes a somewhat peculiar form, which I will call naturalistic quietism. In this paper I will characterize Maddy’s Second Philosophy as a form of naturalistic quietism, and show what the costs for making it feasible are.
Risk-weighted expected utility theory is motivated by small-world problems like the Allais paradox, but it is a grand-world theory by nature. And, at the grand-world level, its ability to handle the Allais paradox is dubious. The REU model described in Risk and Rationality turns out to be risk-seeking rather than risk-averse on one natural way of formulating the Allais gambles in the grand-world context. This result illustrates a general problem with the case for REU theory, we argue. There is a tension between the small-world thinking marshaled against standard expected utility theory, and the grand-world thinking inherent to the risk-weighted alternative.
Can different material objects have the same parts at all times at which they exist? This paper defends the possibility of such coincidence against the main argument to the contrary, the ‘Indiscernibility Argument’. According to this argument, the modal supervenes on the nonmodal, since, after all, the non-modal is what grounds the modal; hence, it would be utterly mysterious if two objects sharing all parts had different essential properties. The weakness of the argument becomes apparent once we understand how the modal is grounded in the nonmodal. By extending the ideas of combinatorialism so that we recombine haecceities as well as fundamental properties, we see how modal properties can be grounded in non-modal properties in a way that allows coincidence and yet also explains why there are differences in the modal properties of coinciding objects. Despite this, some de re modal facts are not grounded in the non-modal but instead are brute. However, although we cannot explain why a particular object has the basic modal properties it has, we can explain a closely related, semantic fact and, comparing the facts we can’t explain to more familiar brute facts, we understand why there should be no better explanation. As a result, we can see how coincidence is, after all, possible.
Are laws of nature necessary, and if so, are all laws of nature necessary in the same way? This question has played an important role in recent discussion of laws of nature. I argue that not all laws of nature are necessary in the same way: conservation laws are perhaps to be regarded as metaphysically necessary. This sheds light on both the modal character of conservation laws and the relationship between different varieties of necessity.
Ariane Fischer, David Woodruff, and Johanna Bockman have translated Karl Polanyi’s “Sozialistische Rechnungslegung” [“Socialist Accounting”] from 1922. In this article, Polanyi laid out his model of a future socialism, a world in which the economy is subordinated to society. Polanyi described the nature of this society and a kind of socialism to which he would remain committed his entire life. Accompanying the translation is the preface titled “Socialism and the embedded economy.” In the preface, Bockman explains the historical context of the article and its significance to the socialist calculation debate, the social sciences, and socialism more broadly. Based on her reading of the accounting and society that Polanyi offers here, Bockman argues that scholars have too narrowly used Polanyi’s work to support the Keynesian welfare state to the exclusion of other institutions, have too broadly used his work to study social institutions indiscriminately, and have not recognized that his work shares fundamental commonalities with and often unacknowledged distinctions from neoclassical economics.
In this article, we use content and cluster analysis on a global sample of 200 social entrepreneurial organizations to develop a typology of social entrepreneuring models. This typology is based on four possible forms of capital that can be leveraged: social, economic, human, and political. Furthermore, our findings reveal that these four social entrepreneuring models are associated with distinct logics of justification that may explain different ways of organizing across organizations. This study contributes to understanding social entrepreneurship as a field of practice and it describes avenues for theorizing about the different organizational approaches adopted by social entrepreneurs.
This paper argues that instrumental rationality is more permissive than expected utility theory. The most compelling instrumentalist argument in favour of separability, its core requirement, is that agents with non-separable preferences end up badly off by their own lights in some dynamic choice problems. I argue that once we focus on the question of whether agents’ attitudes to uncertain prospects help define their ends in their own right, or instead only assign instrumental value in virtue of the outcomes they may lead to, we see that the argument must fail. Either attitudes to prospects assign non-instrumental value in their own right, in which case we cannot establish the irrationality of the dynamic choice behaviour of agents with non-separable preferences. Or they don’t, in which case agents with non-separable preferences can avoid the problematic choice behaviour without adopting separable preferences.
In Risk and Rationality, Lara Buchak advertised REU theory as able to recover the modal preferences in the Allais paradox. But we pointed out that REU theory only applies in the “grand world” setting, where it actually struggles with the modal Allais preferences. Buchak offers two replies. Here we enumerate technical and philosophical problems they face.