Probability kinematics is studied in detail within the framework of elementary probability theory. The merits and demerits of Jeffrey's and Field's models are discussed. In particular, the principle of maximum relative entropy and other principles are used in an epistemic justification of generalized conditionals. A representation of conditionals in terms of Bayesian conditionals is worked out in the framework of external kinematics.
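For readers unfamiliar with the formalism, the following is the standard statement of Jeffrey's rule, supplied here for orientation (the notation is mine, not quoted from the paper). Given a partition $\{E_i\}$ whose new probabilities $Q(E_i)$ are fixed exogenously by experience, Jeffrey's model revises a prior $P$ to

$$ Q(A) \;=\; \sum_i P(A \mid E_i)\, Q(E_i), $$

which reduces to ordinary Bayesian conditioning when $Q(E_k) = 1$ for a single cell $E_k$. The relative entropy principle mentioned above selects, among the posteriors satisfying the constraints, the $Q$ closest to the prior as measured by the relative entropy functional $D(Q \,\|\, P) = \sum_i Q(E_i) \log\big(Q(E_i)/P(E_i)\big)$.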
The Ontology Summit 2012 explored the current and potential uses of ontology, its methods and paradigms, in big systems and big data: how ontology can be used to design, develop, and operate such systems. The systems addressed were not just software systems, although software systems are typically core and necessary components, but more complex systems that include multiple kinds and levels of human and community interaction with physical-software systems, systems of systems, and the socio-technical environments for those systems, which can include cultural, legal, and economic components. The focus themes used for this exploration were Big Systems Engineering, Big Data Challenge, and Large Scale Domain Applications, together with the cross-cutting aspects of Ontology Quality and Federation and Integration of Systems. The Ontology Summit 2012 consisted of over three months of intensive virtual collaborative elaboration of these issues in presentations, panels, and group email. The culmination of these activities was a face-to-face Symposium at the US National Institute of Standards and Technology (NIST), Gaithersburg, MD, USA, 12–13 April 2012. The primary product of this Ontology Summit is the communiqué reported here, but there are other products, some continuing as collaborative, more specifically focused analysis and modeling efforts aligned with various open standards activities. Behind all of these particular products, of course, is the real overriding purpose of the Ontology Summit 2012: the joint collaboration of three distinct communities, the ontology, systems engineering, and big systems stakeholder communities, who came together to address common problems, create common understanding, and propose common solutions.
For $n \geq 3$, define $T_n$ to be the theory of the generic $K_n$-free graph, where $K_n$ is the complete graph on $n$ vertices. We prove a graph-theoretic characterization of dividing in $T_n$ and use it to show that forking and dividing are the same for complete types. We then give an example of a forking and nondividing formula. Altogether, $T_n$ provides a counterexample to a question of Chernikov and Kaplan.
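For readers outside model theory, the standard definitions at issue are as follows (supplied here for context; they are not quoted from the paper). A formula $\varphi(x, a)$ divides over a set $A$ if there are $k < \omega$ and an $A$-indiscernible sequence $(a_i)_{i < \omega}$ with $a_0 = a$ such that $\{\varphi(x, a_i) : i < \omega\}$ is $k$-inconsistent; $\varphi(x, a)$ forks over $A$ if it implies a finite disjunction of formulas, each of which divides over $A$. Dividing thus always implies forking, and the question is when the converse holds.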
In 1984, Henson and Rubel [2] proved the following theorem: if $p(x_1, \ldots, x_n)$ is an exponential polynomial with coefficients in $\mathbb{C}$ with no zeroes in $\mathbb{C}^n$, then $p(x_1, \ldots, x_n) = e^{g(x_1, \ldots, x_n)}$, where $g(x_1, \ldots, x_n)$ is some exponential polynomial over $\mathbb{C}$. In this paper, I will prove the analog of this theorem for Zilber's pseudoexponential fields directly from the axioms. Furthermore, this proof relies only on the existential closedness axiom, without any reference to Schanuel's conjecture.
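A simple one-variable illustration of the Henson–Rubel dichotomy (my example, not taken from the paper): $p(x) = e^{e^x}$ is an exponential polynomial with no zeroes in $\mathbb{C}$, and indeed $p(x) = e^{g(x)}$ with $g(x) = e^x$ an exponential polynomial; by contrast, $q(x) = e^x + 1$ vanishes at $x = i\pi$, so it admits no such representation.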
This article reviews the recent literature on idealization, specifically idealization in the course of scientific modeling. We argue that idealization is not a unified concept and that there are three different types of idealization: Galilean, minimalist, and multiple-models idealization, each with its own justification. We explore the extent to which idealization is a permanent feature of scientific representation and discuss its implications for debates about scientific realism.
My purpose is to account for some oddities in what Kant did and did not say about "moral worth," and for another in what commentators tell us about his intent. The stone with which I hope to dispatch these several birds is, as one would expect a philosopher's stone to be, a distinction. I distinguish between two things Kant might have had in mind under the heading of moral worth. They come readily to mind when one both takes account of what he actually said about it and notices a fact which he did not seem to notice: namely, that dutiful action (action which, whatever its motive, fulfills a duty) can be overdetermined, and determined in particular by both respect for duty and some consortium of inclinations and prudence.
A hallmark of ecological research is dealing with complexity in the systems under investigation. One strategy is to diminish this complexity by constructing models and theories that are general. Alternatively, ecologists can constrain the scope of their generalizations to particular phenomena or types of systems. However, research employing the second strategy is often met with scathing criticism. I offer a theoretical argument in support of moderate generalizations in ecological research, based on the notions of interdependence and causal heterogeneity and their effect on the trade-off between generality and realism.
In this paper a new resolution to the gamer’s dilemma is presented. The first part of the paper is devoted to strictly formulating the dilemma, and the second to establishing its resolution. The proposed resolution, the grave resolution, aims to resolve not only the gamer’s dilemma, but also a wider set of analogous paradoxes – which together make up the paradox of treating wrongdoing lightly.
Invasion biology is a relatively young discipline which is important, interesting, and currently in turmoil. Biological invaders can threaten native ecosystems and global biodiversity; they can impose massive economic costs and even introduce diseases. Invasion biologists generally agree that being able to predict when and where an invasion will occur is essential for progress in their field. However, successful predictions of this type remain elusive. This has caused a rift: some researchers are pessimistic and believe that invasion biology has no future, whereas others are more optimistic and believe that the key to successful prediction is the creation of a general, unified theoretical framework which encompasses all invasion events. Although I agree that there is a future for invasion biology, extensive synthesis is not the way to better predictions. I argue that the causes of invasion phenomena are exceedingly complex and heterogeneous, hence it is impossible to make generalizations over particular events without sacrificing causal detail. However, this causal detail is just what is needed for the specific predictions which scientists wish to produce. Instead, I show that a limited type of synthesis is a more useful tool for generating successful predictions. An important implication of my view is that it points to a more pluralistic approach to invasion biology, where generalization and prediction are treated as important yet distinct research goals.
We introduce what we call the Emergent Model of forgiving, which is a process-based relational model conceptualizing forgiving as moral and normative repair in the wake of grave wrongs. In cases of grave wrongs, which shatter the victim's life, the Classical Model of transactional forgiveness falls short of illuminating how genuine forgiveness can be achieved. In a climate of persistent threat and distrust, expressions of remorse, rituals and gestures of apology, and acts of reparation are unable to secure the moral confidence and trust required for moral repair, much less for forgiveness. Without the rudiments of a shared moral world — a world in which, at the very least, the survivor's violation can be collectively recognized as a violation, and her moral status and authority collectively acknowledged and respected — expressions of remorse, gestures and rituals of apology, or promises of compensation have no authority as meaningful communicative acts with reparative significance. Accordingly, we argue that repair in the wake of traumatic violence involves 'world-building,' which supports the ability of survivors to move from despair to hope, from radical and disabling distrust to trust and engagement, and thus from impotence to effective agency. Our Emergent Model treats forgiveness as a slowly developing outcome of a series of changes in a person's relationship to the trauma and its aftermath, in which moral agency is regained. We argue that forgiveness after grave wrongs and world-shattering harm, when it occurs, emerges from other phenomena, such as cohabitation within a community, gestures of reconciliation, working on shared projects, and the development of trust. On this view, forgiveness is an emergent phenomenon; it entails taking and exercising normative power—coming to claim one's own moral authority in relation to oneself, one's assailant, and one's community. The processes that ultimately constitute forgiving are part and parcel of normative repair more broadly construed.
This paper addresses arguments that "separability" is an assumption of Bell's theorem, and that abandoning this assumption in our interpretation of quantum mechanics (a position sometimes referred to as "holism") will allow us to restore a satisfying locality principle. Separability here means that all events associated to the union of some set of disjoint regions are combinations of events associated to each region taken separately. In this article, it is shown that: (a) localised events can be consistently defined without implying separability; (b) the definition of Bell's locality condition does not rely on separability in any way; (c) the proof of Bell's theorem does not use separability as an assumption. If, inspired by considerations of non-separability, the assumptions of Bell's theorem are weakened, what remains no longer embodies the locality principle. Teller's argument for "relational holism" and Howard's arguments concerning separability are criticised in the light of these results. Howard's claim that Einstein grounded his arguments on the incompleteness of QM with a separability assumption is also challenged. Instead, Einstein is better interpreted as referring merely to the existence of localised events. Finally, it is argued that Bell rejected the idea that separability is an assumption of his theorem.
Prediction is an important aspect of scientific practice, because it helps us to confirm theories and effectively intervene on the systems we are investigating. In ecology, prediction is a controversial topic: even though the number of papers focusing on prediction is constantly increasing, many ecologists believe that the quality of ecological predictions is unacceptably low, in the sense that they are not sufficiently accurate sufficiently often. Moreover, ecologists disagree on how predictions can be improved. On one side are the 'theory-driven' ecologists, those who believe that ecology lacks a sufficiently strong theoretical framework. For them, more general theories will yield more accurate predictions. On the other are the 'applied' ecologists, whose research is focused on effective interventions on ecological systems. For them, deeper knowledge of the system in question is more important than background theory. The aim of this paper is to provide a philosophical examination of both sides of the debate: as there are strengths and weaknesses in both approaches to prediction, a pluralistic approach is best for the future of predictive ecology.
Many phenomena in the natural world are complex, so scientists study them through simplified and idealised models. Philosophers of science have sought to explain how these models relate to the world. On most accounts, models do not represent the world directly, but through target systems. However, our knowledge of target systems is incomplete. First, what is the process by which target systems come about? Second, what types of entity are they? I argue that the basic conception of target systems, on which other conceptions depend, is as parts of the world. I outline the process of target system specification and show that it is a crucial step in modelling. I also develop an account of target system evaluation, based on aptness. Paying close attention to target system specification and evaluation can help scientists minimise the frequency and extent of mistakes when using models to investigate phenomena in complex real-world systems.
Comparing Causality Principles. Joe Henson - 2005 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 36 (3): 519-543.
The traditional philosophy of science approach to prediction leaves little room for appreciating the value and potential of imprecise predictions. At best, they are considered a stepping stone to more precise predictions, while at worst they are viewed as detracting from the scientific quality of a discipline. The aim of this paper is to show that imprecise predictions are undervalued in philosophy of science. I review the conceptions of imprecise predictions and the main criticisms levelled against them: (i) that they cannot aid in model selection and improvement, and (ii) that they cannot support effective interventions in practical decision making. I will argue against both criticisms, showing that imprecise predictions have a circumscribed but important and legitimate place in the study of complex, heterogeneous systems. The argument is illustrated and supported by an example from conservation biology, where imprecise models were instrumental in saving the kōkako from extinction.
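To make the notion concrete, here is a minimal sketch of the difference between a precise point prediction and an imprecise interval prediction for a toy population model. The model, parameter values, and function names are illustrative assumptions of mine, not drawn from the paper or from the kōkako case.

```python
# A minimal sketch (not from the paper): a precise point prediction
# versus an imprecise interval prediction for a toy population model.
import numpy as np

rng = np.random.default_rng(42)

def project_population(n0, growth_rate, years=10):
    """Deterministic exponential projection of a population of size n0."""
    return n0 * (1 + growth_rate) ** years

# A precise prediction commits to a single estimated growth rate.
point_prediction = project_population(100, growth_rate=0.03)

# An imprecise prediction propagates uncertainty about the rate and
# reports a range rather than a single number; the range can still
# support a management decision (e.g., "the population stays viable").
rates = rng.normal(loc=0.03, scale=0.01, size=10_000)
projections = project_population(100, rates)
low, high = np.percentile(projections, [5, 95])

print(f"point prediction:    {point_prediction:.0f} individuals")
print(f"interval prediction: {low:.0f} to {high:.0f} individuals (90%)")
```

The interval is less informative than the point, but it is honest about causal heterogeneity and can still discriminate between management options, which is the circumscribed role the paper defends.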
Using data from a survey of certified organic or in-transition to organic vegetable and dairy producers in Canada, we seek to understand a farmer's decision to convert to organic production by exploring the motives, problems and challenges, and benefits of transition to organic. Results suggest that health and safety concerns and environmental issues are the predominant motives for conversion, while economic motives are of lesser importance. In contrast to the extant literature, results suggest that the motives underlying transition have not changed over time in Canada. Problems experienced during transition relate to lack of governmental and institutional support, negative pressure from other farmers and farm groups, and lack of physical and financial capital. Reduced exposure to chemicals and improved food quality were highly ranked benefits, while economic benefits were scored among the lowest of the listed benefits. To prosper, the Canadian organic sector must overcome fundamental marketing problems and challenges. Promulgation of the Canada Organic standard may help address some marketing issues by providing more information to consumers.
We consider extensions of Peano arithmetic suitable for doing some nonstandard analysis, in which there is a predicate N(x) for an elementary initial segment, along with axiom schemes approximating ω₁-saturation. We prove that such systems have the same proof-theoretic strength as their natural analogues in second-order arithmetic. We close by presenting an even stronger extension of Peano arithmetic, which is equivalent to ZF for arithmetic statements.
In a series of pre-registered studies, we explored (a) the difference between people's intuitions about indeterministic scenarios and their intuitions about deterministic scenarios, (b) the difference between people's intuitions about indeterministic scenarios and their intuitions about neurodeterministic scenarios (that is, scenarios where the determinism is described at the neurological level), (c) the difference between people's intuitions about neutral scenarios (e.g., walking a dog in the park) and their intuitions about negatively valenced scenarios (e.g., murdering a stranger), and (d) the difference between people's intuitions about free will and responsibility in response to first-person scenarios and third-person scenarios. We predicted that once we focused participants' attention on the two different abilities to do otherwise available to agents in indeterministic and deterministic scenarios, their intuitions would support natural incompatibilism—the view that laypersons judge that free will and moral responsibility are incompatible with determinism. This prediction was borne out by our findings.
It has been argued that ethical frameworks for data science often fail to foster ethical behavior, and that they can be difficult to implement due to their vague and ambiguous nature. In order to overcome these limitations of current ethical frameworks, we propose to integrate the analysis of the connections between technical choices and sociocultural factors into the data science process, and show how these connections have consequences for what data subjects can do, accomplish, and be. Using healthcare as an example, attention to sociocultural conversion factors relevant to health can help in navigating technical choices that require broader considerations of the sociotechnical system, such as metric tradeoffs in model validation, resulting in better ethical and technical choices. This approach promotes awareness of the ethical dimension of technical choices by data scientists and others, and can foster the cultivation of 'ethical skills' as integral to data science.
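As one concrete illustration of the kind of metric tradeoff in model validation mentioned above, the sketch below compares operating points for a hypothetical disease-screening classifier; the dataset, model, and target values are assumptions of mine for illustration, not material from the paper. The point is that where to sit on the precision-recall curve is a choice about what missed diagnoses and unnecessary follow-ups mean for data subjects, not a purely technical optimization.

```python
# A minimal sketch (not from the paper): precision-recall tradeoff when
# choosing a decision threshold for a hypothetical screening classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_curve
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a screening dataset; the class imbalance
# (10% positives) and all parameters are illustrative assumptions.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
precision, recall, thresholds = precision_recall_curve(y_test, probs)

# Two candidate operating points: prioritising recall (catch more true
# cases, at the cost of false alarms) versus prioritising precision
# (fewer false alarms, at the cost of missed cases). Which threshold to
# deploy is an ethical choice about consequences for data subjects.
for target_recall in (0.9, 0.5):
    idx = int(np.argmin(np.abs(recall[:-1] - target_recall)))
    print(f"threshold={thresholds[idx]:.2f}  "
          f"precision={precision[idx]:.2f}  recall={recall[idx]:.2f}")
```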
We argue that Brandon and Carson's (1996) "The Indeterministic Character of Evolutionary Theory" fails to identify any indeterminism that would require evolutionary theory to be a statistical or probabilistic theory. Specifically, we argue that (1) their demonstration of a mechanism by which quantum indeterminism might "percolate up" to the biological level is irrelevant; (2) their argument that natural selection is indeterministic because it is inextricably connected with drift fails to join the issue with determinism; and (3) their view that experimental methodology in botany assumes indeterminism is both false and incompatible with the commitment to discoverable causal mechanisms underlying biological processes. We remain convinced that the probabilism of the theory of evolution is epistemically, not ontologically, motivated.
This paper concludes the special issue of Agriculture and Human Values devoted to private governance of global agri-food systems. Rather than aiming to summarize the findings of the various papers that make up the issue, it highlights a number of cross-cutting issues relating to the increasing role of private governance. Key issues that are discussed include the legitimacy of private governance of agri-food systems and the scope for trade-offs between its various dimensions, private governance in a global context, and the motivation for firms to engage in governance. Throughout, the major focus is on unresolved issues and ongoing controversies, with the intention of stimulating further research in this area.