Law search is fundamental to legal reasoning and its articulation is an important challenge and open problem in the ongoing efforts to investigate legal reasoning as a formal process. This Article formulates a mathematical model that frames the behavioral and cognitive framework of law search as a sequential decision process. The model has two components: first, a model of the legal corpus as a search space and second, a model of the search process that is compatible with that environment. The search space has the structure of a “multi-network”—an interleaved structure of distinct networks—developed in earlier work. In this Article, we develop and formally describe three related models of the search process. We then implement these models on a subset of the corpus of U.S. Supreme Court opinions and assess their performance against two benchmark prediction tasks. The first is to predict the citations in a document from its semantic content. The second is to predict the search results generated by human users. For both benchmarks, all search models outperform a null model, with the learning-based model outperforming the other approaches. Our results indicate that, with additional work and refinement, machine law search may achieve human or near-human levels of performance.
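The first benchmark can be illustrated with a toy sketch, not drawn from the Article itself: rank candidate opinions by bag-of-words cosine similarity to a query document, then score the ranking by reciprocal rank against a null model that ranks at random. All opinion texts, names, and the "true citation" below are invented for illustration.

```python
# Hypothetical sketch of a citation-prediction benchmark: rank candidate
# opinions by cosine similarity of word counts, versus a random null model.
import math
import random
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = {  # invented stand-ins for opinion texts
    "marbury": "judicial review constitutional supremacy courts",
    "brown": "equal protection segregation schools education",
    "miranda": "custodial interrogation self incrimination warnings",
}
query = "constitutional review of courts and judicial supremacy"
true_citation = "marbury"  # the opinion the query document actually cites

vecs = {k: Counter(v.split()) for k, v in corpus.items()}
qvec = Counter(query.split())

# Semantic model: rank by similarity; null model: random order.
semantic_rank = sorted(corpus, key=lambda k: cosine(qvec, vecs[k]), reverse=True)
null_rank = random.sample(list(corpus), k=len(corpus))

# Reciprocal rank of the true citation under each ranking.
rr = lambda ranking: 1.0 / (ranking.index(true_citation) + 1)
print("semantic RR:", rr(semantic_rank))
print("null RR:", rr(null_rank))
```

Here the semantic model places the true citation first because the query shares vocabulary only with it; the null model's score varies run to run, which is the contrast the benchmark exploits.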
I argue against theories that attempt to reduce scientific representation to similarity or isomorphism. These reductive theories aim to radically naturalize the notion of representation, since they treat scientists' purposes and intentions as non-essential to representation. I distinguish between the means and the constituents of representation, and I argue that similarity and isomorphism are common but not universal means of representation. I then present four other arguments to show that similarity and isomorphism are not the constituents of scientific representation. I finish by looking at the prospects for weakened versions of these theories, and I argue that only those that abandon the aim to radically naturalize scientific representation are likely to be successful.
Science is popularly understood as being an ideal of impartial algorithmic objectivity that provides us with a realistic description of the world down to the last detail. The essays collected in this book—written by some of the leading experts in the field—challenge this popular image right at its heart, taking as their starting point that science trades not only in truth, but in fiction, too. With case studies that range from physics to economics and to biology, _Fictions in Science_ reveals that fictions are as ubiquitous in scientific narratives and practice as they are in any other human endeavor, including literature and art. Of course scientific activity, most prominently in the formal sciences, employs logically precise algorithmic thinking. However, the key to the predictive and technological success of the empirical sciences might well lie elsewhere—perhaps even in scientists’ extraordinary creative imagination instead. As these essays demonstrate, within the bounds of what is empirically possible, a scientist’s capacity for invention and creative thinking matches that of any writer or artist.
Depersonalization is characterised by a profound disruption of self-awareness, marked mainly by feelings of disembodiment and subjective emotional numbing. It has been proposed that depersonalization is caused by a fronto-limbic suppressive mechanism – presumably mediated via attention – which manifests subjectively as emotional numbing, and disables the process by which perception and cognition normally become emotionally coloured, giving rise to a subjective feeling of ‘unreality’. Our functional neuroimaging and psychophysiological studies support the above model and indicate that, compared with normal and clinical controls, DPD patients show increased prefrontal activation as well as reduced activation in insula/limbic-related areas to aversive, arousing emotional stimuli. Although a putative inhibitory mechanism on emotional processing might account for the emotional numbing and characteristic perceptual detachment, it is likely, as suggested by some studies, that parietal mechanisms underpin feelings of disembodiment and lack of agency.
This paper defends an inferential conception of scientific representation. It approaches the notion of representation in a deflationary spirit, and minimally characterizes the concept as it appears in science by means of two necessary conditions: its essential directionality and its capacity to allow surrogate reasoning and inference. The conception is defended by showing that it successfully meets the objections that make its competitors, such as isomorphism and similarity, untenable. In addition the inferential conception captures the objectivity of the cognitive representations used by science, it sheds light on their truth and completeness, and it explains the source of the analogy between scientific and artistic modes of representation.
Scientific representation is a currently booming topic, both in analytical philosophy and in history and philosophy of science. The analytical inquiry attempts to come to terms with the relation between theory and world; while historians and philosophers of science aim to develop an account of the practice of model building in the sciences. This article provides a review of recent work within both traditions, and ultimately argues for a practice-based account of the means employed by scientists to effectively achieve representation in the modelling sciences.
This paper defends the deflationary character of two recent views regarding scientific representation, namely RIG Hughes’ DDI model and the inferential conception. It is first argued that these views’ deflationism is akin to the homonymous position in discussions regarding the nature of truth. There, we are invited to consider the platitudes that the predicate “true” obeys at the level of practice, disregarding any deeper, or more substantive, account of its nature. More generally, for any concept X, a deflationary approach is then defined in opposition to a substantive approach, where a substantive approach to X is an analysis of X in terms of some property P, or relation R, accounting for and explaining the standard use of X. It then becomes possible to characterize a deflationary view of scientific representation in three distinct senses, namely: a “no-theory” view, a “minimalist” view, and a “use-based” view – in line with three standard deflationary responses in the philosophical literature on truth. It is then argued that both the DDI model and the inferential conception may be suitably understood in any of these three different senses. The application of these deflationary ‘hermeneutics’ moreover yields significant improvements on the DDI model, which bring it closer to the inferential conception. It is finally argued that what these approaches have in common – the key to any deflationary account of scientific representation – is the denial that scientific representation may be ultimately reduced to any substantive explanatory property of sources, or targets, or their relations.
In “The Toolbox of Science” (1995) together with Towfic Shomar we advocated a form of instrumentalism about scientific theories. We separately developed this view further in a number of subsequent works. Steven French, James Ladyman, Otavio Bueno and Newton Da Costa (FLBD) have since written at least eight papers and a book criticising our work. Here we defend ourselves. First we explain what we mean in denying that models derive from theory – and why their failure to do so should be lamented. Second we defend our use of the London model of superconductivity as an example. Third we point out both advantages and weaknesses of FLBD’s techniques in comparison to traditional Anglophone versions of the semantic conception. Fourth we show that FLBD’s version of the semantic conception has not been applied to our case study. We conclude by raising doubts about FLBD’s overall project.
This paper reviews four attempts throughout the history of quantum mechanics to explicitly employ dispositional notions in order to solve the quantum paradoxes, namely: Margenau’s latencies, Heisenberg’s potentialities, Maxwell’s propensitons, and the recent selective propensities interpretation of quantum mechanics. Difficulties and challenges are raised for all of them, and it is concluded that the selective propensities approach nicely encompasses the virtues of its predecessors. Finally, some strategies are discussed for reading dispositional notions into two other well-known interpretations of quantum mechanics, namely the GRW interpretation and Bohmian mechanics.
Social commerce, a subset of e-commerce, has emerged in part due to the popularity of social networking sites. Social commerce brings new challenges to marketing activities, and social commerce transactions, like e-commerce transactions, can be risky and cause losses of personal finances, time, and information privacy. This article examines ethical issues and consumer assessments of the risks of using an e-service, and how risk affects consumer evaluations and usage of Internet-based services and self-service technologies. Results from two surveys totaling 1024 consumers indicated that as usage risk concerns increased, the perceived usefulness (PU) of an e-service and the intention to use it decreased. Additionally, as usage risk concerns increased, the effect of subjective norm on PU and intention to use an e-service strengthened, and the effect of perceived ease of use on PU and intention to use an e-service weakened. These findings advance theory and contribute to the foundation for future research aimed at improving our understanding of how consumers evaluate new e-services, new commerce systems and settings, and self-service technologies in the social commerce era.
I review five explicit attempts throughout the history of quantum mechanics to invoke dispositional notions in order to solve the quantum paradoxes, namely: Margenau’s latencies, Heisenberg’s potentialities, Popper’s propensity interpretation of probability, Nick Maxwell’s propensitons, and the recent selective propensities interpretation of quantum mechanics. I raise difficulties and challenges for all of them, but conclude that the selective propensities approach nicely encompasses the virtues of its predecessors. I elaborate on some of the properties of the type of propensities that I claim to be successful for quantum mechanics, and finish by briefly sketching out ways in which similar notions can be read into some of the other well-known interpretations of quantum mechanics.
This paper argues for a broadly dispositionalist approach to the ontology of Bohmian mechanics (BM). It first distinguishes the ‘minimal’ and the ‘causal’ versions of Bohm’s theory, and then briefly reviews some of the claims advanced on behalf of the ‘causal’ version by its proponents. A number of ontological or interpretive accounts of the wave function in BM are then addressed in detail, including configuration space, multi-field, nomological, and dispositional approaches. The main objection to each account is reviewed, namely the ‘problem of perception’, the ‘problem of communication’, the ‘problem of temporal laws’, and the ‘problem of under-determination’. It is then shown that a version of dispositionalism overcomes the under-determination problem while providing neat solutions to the other three problems. A pragmatic argument is thus furnished for the use of dispositions in the interpretation of the theory more generally. The paper ends on a more speculative note by suggesting ways in which a dispositionalist interpretation of the wave function is in addition able to shed light upon some of the claims of the proponents of the causal version of BM.
The article turns to Deleuze and Guattari’s concept of state capitalism and their theorization of money and debt in their critique of capitalism to develop an analysis of the governmental management of the current crisis determined by ordo- and neoliberalism. The paper argues that analyses which fail to properly recognize the power of capital to determine both state apparatuses and economic policy thereby fail to grasp the real functioning of money, debt and the Euro in the crisis and end up unwittingly supporting liberalism. This is true of positions such as heterodox theory that, though critical of conventional and neoliberal political economy, nevertheless continue to uphold the state as an independent or mediating mechanism in relation to the power of capital. The neglect of the role which money plays in the strategies of capital to control both the creation of value and the functioning of the state is to be found even in Foucault’s genealogy of neoliberalism, a neglect which undermines his analysis of power. The paper highlights the implications of the standpoint of state capitalism for a more incisive analysis of the current crisis that reveals what is at stake for political struggles.
This paper expands on, and provides a qualified defence of, Arthur Fine's selective interactions solution to the measurement problem. Fine's approach must be understood against the background of the insolubility proof of the quantum measurement. I first defend the proof as an appropriate formal representation of the quantum measurement problem. The nature of selective interactions, and more generally selections, is then clarified, and three arguments in their favour are offered. First, selections provide the only known solution to the measurement problem that does not relinquish any of the explicit premises of the insolubility proofs. Second, unlike some no-collapse interpretations of quantum mechanics, selections suffer no difficulties with non-ideal measurements. Third, unlike most collapse interpretations, selections can be independently motivated by an appeal to quantum propensities.
1 Introduction
2 The problem of quantum measurement
  2.1 The ignorance interpretation of mixtures
  2.2 The eigenstate–eigenvalue link
  2.3 The quantum theory of measurement
3 The insolubility proof of the quantum measurement
  3.1 Some notation
  3.2 The transfer of probability condition (TPC)
  3.3 The occurrence of outcomes condition (OOC)
4 A defence of the insolubility proof
  4.1 Stein's critique
  4.2 Ignorance is not required
  4.3 The problem of quantum measurement is an idealisation
5 Selections
  5.1 Representing dispositional properties
  5.2 Selections solve the measurement problem
  5.3 Selections and ignorance
6 Non-ideal selections
  6.1 No-collapse interpretations and non-ideal measurements
  6.2 Exact and approximate measurements
  6.3 Selections for non-ideal interactions
  6.4 Approximate selections
  6.5 Implications for ignorance
7 Selective interactions test quantum propensities
  7.1 Equivalence classes as physical ‘aspects’: a critique
  7.2 Quantum dispositions
  7.3 Selections as a propensity modal interpretation
  7.4 A comparison with Popper's propensity interpretation
The aim of this article is to present Agrippa's triple problem (trilemma), which calls into question the possibility of attaining an epistemological justification of empirical knowledge. One possible reconstruction of the problem rests on the problem of the infinite regress – one of the modes of..
This paper outlines a genuinely pragmatist conception of propensity, and defends it against common objections to the propensity interpretation of probability, prominently Humphreys’ paradox. The paper reviews the paradox and identifies one of its key assumptions, the identity thesis, according to which propensities are probabilities. The identity thesis is also involved in empiricist propensity interpretations deriving from Popper’s influential original proposal, and makes such interpretations untenable. As an alternative, I urge a return to Charles Peirce’s original insights on probabilistic dispositions, and offer a reconstructed version of his pragmatist conception, which rejects the identity thesis.
I argue against an account of scientific representation suggested by the semantic, or structuralist, conception of scientific theories. Proponents of this conception often employ the term “model” to refer to bare “structures”, which naturally leads them to attempt to characterize the relation between models and reality as a purely structural one. I argue instead that scientific models are typically “representations”, in the pragmatist sense of the term: they are inherently intended for specific phenomena. Therefore in general scientific models are not (merely) structures. I then explore some consequences of this pragmatist account of representation, and argue that it sheds light upon the distinction between theories and models. I finish by briefly addressing some critical comments due to Bas Van Fraassen.
By weakening an inference rule satisfied by logic daC, we define a new paraconsistent logic, which is weaker than logic \ and G′3, enjoys properties presented in daC like the substitution theorem, and possesses a strong negation which makes it suitable to express intuitionism. Besides, daC′ helps to understand the relationships among other logics, in particular daC, \ and PH1.
This paper argues for a representational semantic conception (RSC) of scientific theories, which respects the bare claim of any semantic view, namely that theories can be characterised as sets of models. RSC must be sharply distinguished from structural versions that assume a further identity of ‘models’ and ‘structures’, which we reject. The practice-turn in the recent philosophical literature suggests instead that modelling must be understood in a deflationary spirit, in terms of the diverse representational practices in the sciences. These insights are applied to some mathematical models, thus showing that the mathematical sciences are not in principle counterexamples to RSC.
On a purely epistemic understanding of experimental realism, manipulation affords a particularly robust kind of causal warrant, which is – like any other warrant – defeasible. I defend a version of Nancy Cartwright’s inference to the most likely cause, and I conclude that this minimally epistemic version of experimental realism is a coherent, adequate and plausible epistemology for science.
This paper is divided into two parts. In part I, I argue against two attempts to naturalise the notion of scientific representation, by reducing it to isomorphism and similarity. I distinguish between the means and the constituents of representation, and I argue that isomorphism and similarity are common means of representation; but that they are not constituents of scientific representation. I look at the prospects for weakened versions of these theories, and I argue that only those that abandon the aim to naturalise scientific representation are likely to be successful. In part II of the paper, I present a deflationary conception of scientific representation, which minimally characterises it by means of two necessary conditions: representation is essentially intentional and it has the capacity to allow surrogate reasoning and inference. I then defend this conception by showing that it successfully meets the objections and difficulties that make its competitors, such as isomorphism and similarity, untenable. In addition the inferential conception explains the success of various means of representation in their appropriate domains, and it sheds light on the truth and accuracy of scientific representations.
Scientific representation is a booming field nowadays within the philosophy of science, with many papers published regularly on the topic every year, and several yearly conferences and workshops held on related topics. Historically, the topic originates in two different strands in 20th-century philosophy of science. One strand begins in the 1950s, with philosophical interest in the nature of scientific theories. As the received or “syntactic” view gave way to a “semantic” or “structural” conception, representation progressively took center stage. Yet there is another, older, strand that links representation to fin de siècle modeling debates, particularly in the emerging Bildtheorie of Boltzmann and Hertz, and to the ensuing discussion among philosophers thereafter. Both strands feed into present-day philosophical work on scientific representation. There are a number of different orthogonal questions that philosophers ask regarding representation. One set of questions concerns the nature of the representational relation between theories or models, on the one hand, and the real-world systems they purportedly represent, on the other. Such questions lie at the more metaphysical and abstract end of the spectrum—and they are often addressed with the abstract tools of the analytical metaphysician. They constitute what we may refer to as the “analytical inquiry” into representation. On the other hand, there are questions regarding the use that scientists put some representations to in practice—these are questions that are best addressed by means of some of the favorite tools of the philosopher of science, including descriptive analysis, illustration by means of case studies, induction, exemplification, inference from practice, etc., and are best referred to as the “practical inquiry” into representation.
The notion of representation invoked in such inquiries may be “deflationary” or “substantive”—depending on whether it construes representation as a primitive notion, or as susceptible to further reduction or analysis in terms of something else.
It is often assumed without argument that fictionalism in the philosophy of science contradicts scientific realism. This paper is a critical analysis of this assumption. The kind of fictionalism that is at present discussed in the philosophy of science is characterised, and distinguished from fictionalism in other areas. A distinction is then drawn between forms of fictional representation, and two competing accounts of fiction in science are discussed. I then outline explicitly what I take to be the argument for the incompatibility of scientific realism with fictionalism. I argue that some of its premises are unwarranted, and are moreover questionable from a fictionalist perspective. The conclusion is that fictionalism is neutral in the realism-antirealism debate, pulling neither in favour of nor against scientific realism.
Peter Milne and Neal Grossman have argued against Popper's propensity interpretation of quantum mechanics, by appeal to the two-slit experiment and to the distinction between mixtures and superpositions, respectively. In this paper I show that a different propensity interpretation successfully meets their objections. According to this interpretation, the possession of a quantum propensity by a quantum system is independent of the experimental set-ups designed to test it, even though its manifestations are not.
There has been an intense discussion, albeit largely an implicit one, concerning the inference of causal hypotheses from statistical correlations in quantum mechanics ever since John Bell’s first statement of his notorious theorem in 1966. As is well known, its focus has mainly been the so-called Einstein-Podolsky-Rosen (“EPR”) thought experiment, and the ensuing observed correlations in real EPR-like experiments. But although implicitly the discussion goes as far back as Bell’s work, it is only in the last two decades that it has become recognizably and explicitly a debate about causal inference in the quantum realm. The bulk of this paper is devoted to a review of three influential arguments in the philosophical literature, due to Bas Van Fraassen, Daniel Hausman and Huw Price, that aim to show that causal models for the EPR correlations are impossible. I contend that all these arguments are inconclusive since they contain premises or presuppositions that are false, unwarranted, or at least controversial. Five different common cause models are outlined that seem perfectly viable for the EPR correlations. These models are then employed to illustrate various difficulties with the premises and presuppositions underlying Van Fraassen’s, Hausman’s and Price’s arguments. In all these cases it is argued that the difficulties cut deep against these authors’ own theories of causation and causal inference. My conclusions are that causal models for the EPR correlations remain viable, that philosophical work is still required to assess their relative virtues, and that in any case the mere theoretical conceivability and empirical possibility of these models sheds doubt on Van Fraassen’s, Hausman’s and (important elements in) Price’s theories of causation and causal inference.
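The quantitative constraint behind this debate can be sketched numerically (this example is standard textbook material, not drawn from the paper): the singlet-state correlation E(a, b) = -cos(a - b) violates the CHSH bound |S| <= 2 that any local hidden-variable (common cause screening-off) model must satisfy, which is why causal models for the EPR correlations are so contested.

```python
# CHSH check: quantum singlet correlations exceed the local bound of 2.
import math

def E(a: float, b: float) -> float:
    """Quantum correlation for spin measurements at detector angles a, b."""
    return -math.cos(a - b)

# Standard angle choices that maximise the CHSH expression.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, above the local bound of 2
```

Any factorisable common cause model keeps |S| at or below 2, so proposals such as the five models the paper outlines must give up one of the assumptions of that factorisation rather than the correlations themselves.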
It is widely accepted in contemporary philosophy of science that the domain of application of a theory is typically larger than its explanatory covering power: theories can be applied to phenomena that they do not explain. I argue for an analogous thesis regarding the notion of empirical adequacy. A theory's domain of application is typically larger than its domain of empirical adequacy: theories are often applied to phenomena from which they receive no empirical confirmation.
I analyse critically what I regard as the most accomplished empiricist account of propensities, namely the long run propensity theory developed by Donald Gillies. Empiricist accounts are distinguished by their commitment to the ‘identity thesis’: the identification of propensities and objective probabilities. These theories are intended, in the tradition of Karl Popper’s influential proposal, to provide an interpretation of probability that renders probability statements directly testable by experiment. I argue that the commitment to the identity thesis leaves empiricist theories, including Gillies’ version, vulnerable to a variant of what is known as Humphreys’ paradox. I suggest that the tension may be resolved only by abandoning the identity thesis, and by adopting instead an understanding of propensities as explanatory properties of chancy objects.
This article provides a state of the art review of the philosophical literature on scientific representation. It first argues that the topic emerges historically mainly out of what may be called the modelling tradition. It then introduces a number of helpful analytical distinctions, and goes on to divide contemporary approaches to scientific representation into two distinct kinds, substantive and deflationary. Analogies with related discussions of artistic representation in aesthetics, and of the nature of truth in metaphysics are pursued. It is finally urged that the most promising approaches - and the ones most likely to feature prominently in future developments - are deflationary. In particular, a defence is provided of a genuinely inferential conception of representation.
Francisco Suárez claims that forms may be efficient causes. There is an action whose proximate efficient cause is a substantial form, namely natural resulting. A substantial form is also the principal efficient cause of the eduction of other forms, although it causes this through the substance's own accidents. Souls, insofar as they are substantial forms, share both features. However, they pose a new complexity because of the actions of which they are exclusively the principles, namely vital actions. This kind of action may seem no different from natural resulting, if vitality is mainly described as some sort of immanence. However, there is a Suarezian approach to vital actions that does not rely completely on immanence and therefore allows us to consider souls as efficient causes without turning their actions into natural ones.
This paper studies the topology of the Chilean mutual fund industry using network methods. Using the physical positions of the local equity portfolios managed during 2003.01-2017.4, we analyze their connectivity structure in both the mutual funds’ bipartite network and its one-mode projection. We estimate network measures to examine the potential effects on the topology arising from changes in the industrial environment and changes in the mutual funds’ investment strategies in their overlapped portfolios. Our main results show that changes in the bipartite network and its one-mode projection are correlated with variables related to funds’ investment strategies and with industry-specific variables. In consequence, these elements are a new potential source of disturbance in the financial network formed by stocks and mutual funds. We contribute to the existing literature by improving the understanding of the aggregate behavior of a financial sector which, despite its economic importance, has attracted little attention from a systemic risk perspective.
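The bipartite structure and its one-mode projection can be sketched minimally: funds are linked to the stocks they hold, and the projection links two funds whenever their portfolios overlap. The fund names and holdings below are invented for illustration, not the paper's data.

```python
# One-mode projection of a (hypothetical) fund-stock bipartite network:
# two funds are connected, with weight equal to their portfolio overlap.
from itertools import combinations

holdings = {  # fund -> set of stocks held (invented)
    "fund_A": {"COPEC", "SQM", "FALABELLA"},
    "fund_B": {"SQM", "CENCOSUD"},
    "fund_C": {"COPEC", "SQM"},
    "fund_D": {"ENEL"},
}

projection = {}
for f1, f2 in combinations(sorted(holdings), 2):
    overlap = holdings[f1] & holdings[f2]
    if overlap:
        projection[(f1, f2)] = len(overlap)  # edge weight = shared stocks

print(projection)
# fund_D shares no stock with the others, so it is isolated in the projection.
```

Changes in investment strategies show up in this picture as changes in the overlap weights, which is the kind of shift the paper's network measures are designed to track.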
Melatonin is a hormone with complex roles in the pathogenesis of autoimmune disorders. Over the years, it has become clear that melatonin may exacerbate some autoimmune conditions, whereas it alleviates others such as multiple sclerosis. Multiple sclerosis is an autoimmune disorder characterized by a dysregulated immune response directed against the central nervous system. Indeed, the balance between pathogenic CD4+ T cells secreting IFN‐γ (TH1) or IL‐17 (TH17), and FoxP3+ regulatory T cells and IL‐10+ type 1 regulatory T cells (Tr1 cells), is thought to play an important role in disease activity. Recent evidence suggests that melatonin ameliorates multiple sclerosis by controlling the balance between effector and regulatory cells, suggesting that melatonin‐triggered signaling pathways are potential targets for therapeutic intervention. Here, we review the available data on the effects of melatonin on immune processes relevant for MS and discuss its therapeutic potential.
I argue that the Causal Markov Condition (CMC) is in principle applicable to the Einstein–Podolsky–Rosen (EPR) correlations. This is in line with my defence in the past of the applicability of the Principle of Common Cause to quantum mechanics. I first review a contrary claim by Dan Hausman and Jim Woodward, who endeavour to preserve the CMC against a possible counterexample by asserting that the conditions for the application of the CMC are not met in the EPR experiment. In their view the CMC is inapplicable to the EPR correlations—i.e. it neither obtains nor fails. The view is grounded upon the non-separability of the quantum state, and the consequent unavailability of interventions. I urge that whether interventions are available in EPR—and why—is a complex and contextual question that does not have a unique or uniform answer. Instead, I argue that different combinations of causal hypotheses under test and interpretations of quantum mechanics yield different answers to the question.
This paper argues that if propensities are displayed in objective physical chances then the appropriate representation of these chances is as indexed probability functions. Two alternative formal models, or accounts, of the relation between propensity properties and their chancy or probabilistic manifestations, in terms of conditionals and conditional probability, are first reviewed. It is argued that both confront important objections, which are overcome by the account in terms of indexed probabilities. A number of further advantages of the indexed probability account are discussed, which suggest that it is promising as a general theory of objective physical chance. The paper ends with a discussion of the indexical character of the objective chances that are grounded in propensities.
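The contrast can be given a schematic sketch (this is an illustration of the general idea, not the paper's formalism): rather than a single conditional-probability function P(outcome | setup), a propensity is represented as a family of probability functions, each indexed by the chance setup that displays it. The setups and numbers below are invented.

```python
# Indexed probability functions: one probability measure per chance setup.
chances = {
    # index (chance setup) -> probability function over outcomes (invented)
    "fair_coin_toss": {"heads": 0.5, "tails": 0.5},
    "biased_coin_toss": {"heads": 0.7, "tails": 0.3},
}

def P(setup: str) -> dict:
    """Return the probability function indexed by the given setup."""
    return chances[setup]

# Each indexed function is a genuine probability measure over outcomes;
# there is no inverse "probability of the setup given an outcome", which
# is where conditional-probability accounts face Humphreys-style trouble.
for setup, dist in chances.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9

print(P("biased_coin_toss")["heads"])
```

The index is part of the identity of the function, not an event in its outcome space, which is one way to picture why the indexed account evades the objections raised against conditional-probability accounts.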
It is still a matter of controversy whether the Principle of the Common Cause (PCC) can be used as a basis for sound causal inference. It is thus to be expected that its application to quantum mechanics should be a correspondingly controversial issue. Indeed, the early 1990s saw a flurry of papers addressing just this issue in connection with the EPR correlations. Yet, that debate does not seem to have caught up with the most recent literature on causal inference generally, which has moved on to consider the virtues of a generalised PCC-inspired condition, the so-called Causal Markov Condition (CMC). In this paper we argue that the CMC is an appropriate benchmark for debating possible causal explanations of the EPR correlations. But we go on to take issue with some pronouncements on EPR by defenders of the CMC.
In the logic of theory change, the standard model is AGM, proposed by Alchourrón et al. (J Symb Log 50:510–530, 1985). This paper focuses on the extension of AGM that accounts for contractions of a theory by a set of sentences instead of only by a single sentence. Hansson (Theoria 55:114–132, 1989) and Fuhrmann and Hansson (J Logic Lang Inf 3:39–74, 1994) generalized partial meet contraction to the case of contractions by (possibly non-singleton) sets of sentences. In this paper we present the possible worlds semantics for partial meet multiple contractions.
This paper expands on, and provides a qualified defence of, Arthur Fine’s selective interactions solution to the measurement problem. Fine’s approach must be understood against the background of the insolubility proof of the quantum measurement problem. I first defend the proof as an appropriate formal representation of the quantum measurement problem. Then I clarify the nature of selective interactions, and more generally selections, and I go on to offer three arguments in their favour. First, selections provide the only known solution to the measurement problem that does not relinquish any of the premises of the insolubility proof. Second, unlike some no-collapse interpretations of quantum mechanics, selections suffer no difficulties with non-ideal measurements. Third, unlike most collapse interpretations, selections can be independently motivated by an appeal to quantum dispositions.
This article aims to articulate the critique that Peter Sloterdijk makes in Esferas I (1998) of the insufficient thematization of the problem of spatiality in Ser e tempo (Being and Time), Martin Heidegger’s work published in 1927, spelling out the solution Sloterdijk offers by recovering what, for him, lay latent in Heidegger’s work, namely the proper development of the question of spatiality. We turn first to the paragraphs in which Heidegger discusses spatiality in Ser e tempo, as groundwork for the critique that Sloterdijk builds in Esferas I. Next, we trace the theoretical influence that Heidegger received from the Estonian biologist Jakob von Uexküll and the theorist of history Wilhelm Dilthey concerning the notions of life and environing world, together with an exposition of the concept of factical life in Ontologia: Hermenêutica da facticidade. Finally, starting from the relation between life and existence, we turn to Sloterdijk’s work in order to understand what is involved in the concept of “sphere”, a path opened, according to Sloterdijk, by the rearticulation of the problems of life and spatiality in Heidegger’s early writings.
We propose a new class of multiple contraction operations — the system of spheres-based multiple contractions — which are a generalization of Grove’s system of spheres-based (singleton) contractions to the case of contractions by (possibly non-singleton) sets of sentences. Furthermore, we show that this new class of functions is a subclass of the class of the partial meet multiple contractions.
In an influential article published in 1982, Bas van Fraassen developed an argument against causal realism on the basis of an analysis of the Einstein-Podolsky-Rosen correlations of quantum mechanics. Several philosophers of science and experts in causal inference, including some causal realists like Wesley Salmon, have accepted van Fraassen’s argument, interpreting it as a proof that the quantum correlations cannot be given any causal model. In this paper I argue that van Fraassen’s article can also be interpreted as a good guide to the different causal models available for the EPR correlations, and their relative virtues. These models in turn give us insight into some of the unusual features that quantum propensities might have.