This article argues for a revised best system account of laws of nature. David Lewis’s original BSA has two main elements. On the one hand, there is the Humean base, which is the totality of particular matters of fact that obtain in the history of the universe. On the other hand, there is what I call the ‘nomic formula’, which is a particular operation that gets applied to the Humean base in order to output the laws of nature. My revised account focuses on this latter element of the view. Lewis conceives of the nomic formula as a balance of simplicity and strength, but I argue that this is a mistake. Instead, I motivate and develop a different proposal for the standards that figure into the nomic formula, and I suggest a rationale for why these should be the correct standards. Specifically, I argue that the nomic formula should be conceived as a collection of desiderata designed to generate principles that are predictively useful to creatures like us. The resulting view—which I call the ‘best predictive system’ account of laws—is thus able to explain why scientists are interested in discovering the laws, and it also gives rise to laws with the sorts of features that we find in actual scientific practice.

1 Introduction
2 The LOPP
3 A Problem with Lewis's Formula
4 A Pragmatic Account of the Nomic Formula
4.1 Informative dynamics
4.2 Wide applicability
4.3 Spatial locality
4.4 Temporal locality
4.5 Spatial, temporal, and rotational symmetries
4.6 Predictively useful properties
4.7 Simplicity
4.8 Recap
5 Conclusion
In this paper we apply the popular Best System Account of laws to typical eternal worlds – both classical eternal worlds and eternal worlds of the kind posited by popular contemporary cosmological theories. We show that, according to the Best System Account, such worlds will have no laws that meaningfully constrain boundary conditions. It’s generally thought that lawful constraints on boundary conditions are required to avoid skeptical arguments. Thus the lack of such laws given the Best System Account may seem like a severe problem for the view. We show, however, that at eternal worlds, lawful constraints on boundary conditions do little to help fend off skeptical worries. So with respect to handling these skeptical worries, the proponent of the Best System Account is no worse off than their competitors.
Sciences are often regarded as providing the best, or, ideally, exact, knowledge of the world, especially in providing laws of nature. Ilya Prigogine, who was awarded the Nobel Prize for his theory of non-equilibrium chemical processes—this being also an important attempt to bridge the gap between exact and non-exact sciences [mentioned in the Presentation Speech by Professor Stig Claesson (nobelprize.org, The Nobel Prize in Chemistry 1977)]—has had this ideal in mind when trying to formulate a new kind of science. Philosophers of science distinguish theory and reality, examining relations between these two. Nancy Cartwright’s distinction of fundamental and phenomenological laws, Rein Vihalemm’s conception of the peculiarity of the exact sciences, and Ronald Giere’s account of models in science and science as a set of models are deployed in this article to criticise the common view of science and analyse Ilya Prigogine’s view in particular. We will conclude that on a more abstract, philosophical level, Prigogine’s understanding of science doesn’t differ from the common understanding.
It is common to appeal to governing laws of nature in order to explain the existence of natural regularities. Classical theism, however, maintains the sovereignty thesis: everything distinct from God is created by him and is under his guidance and control. It follows from this that God must somehow be responsible for natural laws and regularities. Therefore, theists need an account of the relation between regularities, laws, and God. I examine competing accounts of laws of nature and conclude that dispositional essentialism provides the most satisfactory explanation of the relation between regularities, laws, and God.
Acceptance of Humean Supervenience and the reductive Humean analyses that entail it leads to a litany of inadequately explained conflicts with our intuitions regarding laws and possibilities. However, the non-reductive Humeanism developed here, on which law claims are understood as normative rather than fact stating, can accommodate those intuitions. Rational constraints on such norms provide a set of consistency relations that ground a semantics formulated in terms of factual-normative worlds, solving the Frege-Geach problem of construing unasserted contexts. This set of factual-normative worlds includes exactly the intuitive sets of nomologically possible worlds associated with each possible set of laws. The extension of the semantics to counterfactual and subjunctive conditionals is sketched. Potential objections involving subjectivity, mind-dependence, and non-factuality are discussed.
This article discusses the role of simplicity and the notion of a best balance of simplicity and strength within the best systems account (BSA) of laws of nature. The article explores whether there is anything in scientific practice that corresponds to the notion of simplicity or to the trade-off between simplicity and strength to which the BSA appeals. Various theoretical rationales for simplicity preferences and their bearing on the identification of laws are also explored. It is concluded that there are a number of issues about the role of simplicity within the BSA and its relation to strength that need to be addressed before the BSA can be regarded as an adequate account of laws.

1 Introduction
2 The Best Systems Account
3 The Trade-Off between Simplicity and Strength: Preliminary Considerations
4 Alternative Conceptions of the Relationship between Simplicity and Strength
5 Two Roles for Simplicity
6 Simplicity in the Best Systems Account: Curve-Fitting
7 Simplicity as a Corrective for Overfitting
8 Descriptive Simplicity in the Best Systems Account?
9 Simplicity as Due to Human Intellectual Limitations
10 Summary
11 Concluding Remarks
It is often said that the best system account of laws needs supplementing with a theory of perfectly natural properties. The ‘strength’ and ‘simplicity’ of a system are language-relative, and without a fixed vocabulary it is impossible to compare rival systems. Recently a number of philosophers have attempted to reformulate the BSA in an effort to avoid commitment to natural properties. I assess these proposals and argue that they are problematic as they stand. Nonetheless, I agree with their aim, and show that if simplicity is interpreted as ‘compression’, algorithmic information theory provides a framework for system comparison without the need for natural properties.
The idea of levels of organization plays a central role in the philosophy of the life sciences. In this article, I first examine the explanatory goals that have motivated accounts of levels of organization. I then show that the most state-of-the-art and scientifically plausible account of levels of organization, the account of levels of mechanism proposed by Bechtel and Craver, is fundamentally problematic. Finally, I argue that the explanatory goals can be reached by adopting a deflationary approach, where levels of organization give way to more well-defined and fundamental notions, such as scale and composition.
John Earman and John T. Roberts advocate a challenging and radical claim regarding the semantics of laws in the special sciences: the statistical account. According to this account, a typical special science law “asserts a certain precisely defined statistical relation among well-defined variables” and this statistical relation does not require being hedged by ceteris paribus conditions. In this paper, we raise two objections against the attempt to cash out the content of special science generalizations in statistical terms.
Among the cognitive capacities of evolved creatures is the capacity to represent. Theories in cognitive neuroscience typically explain our manifest representational capacities by positing internal representations, but there is little agreement about how these representations function, especially with the relatively recent proliferation of connectionist, dynamical, embodied, and enactive approaches to cognition. In this talk I sketch an account of the nature and function of representation in cognitive neuroscience that couples a realist construal of representational vehicles with a pragmatic account of mental content. I call the resulting package a deflationary account of mental representation and I argue that it avoids the problems that afflict competing accounts.
What is a law of nature? Traditionally, philosophical discussion of this question has been dominated by two prominent alternatives: David Lewis’s best-systems analysis, according to which a law is a regularity that serves as a theorem in our best axiomatization of the facts about the world, and the Dretske-Armstrong-Tooley analysis, which incorporates universals to distinguish laws from mere accidental generalizations. Marc Lange’s first book presents a provocative alternative to this tradition, providing a novel treatment of natural laws that should be of interest to those philosophers concerned with the analysis of lawhood, physical necessity, causation, inductive confirmation, counterfactual analysis, and explanation.
Humean interpretations claim that laws of nature merely summarize events. Non-Humean interpretations claim that laws force events to occur in certain patterns. First, I show that the Lewis/Ramsey account of lawhood, which claims that laws are axioms or theorems of the simplest strongest summary of events, provides the best Humean interpretation of laws. The strongest non-Humean account, the scientific essentialist position, grounds laws of nature in essential non-reducible dispositional properties held by natural kinds. The scientific essentialist account entails that laws are a posteriori necessary truths. After showing that these are the best Humean and non-Humean accounts, I demonstrate that the Lewis/Ramsey account is better equipped for interpreting dispositions and counterfactuals. One distinction between the two accounts is whether counterfactuals, whose antecedents are physically possible, sometimes require closest worlds with different laws than the laws of the base world. On the Lewis/Ramsey account non-legal worlds will sometimes be necessary: if laws are merely summaries of events that occur, then a world where the events are drastically different will often have different laws. The scientific essentialist, however, must demand that laws are the same in counterfactual reasoning because she grounds counterfactual reasoning in the essential dispositional properties of natural kinds. Recently, problems have developed for counterfactual analysis of dispositions due to finkish dispositions, mimicked dispositions, and masked dispositions. These difficulties have led some to abandon reductive accounts of dispositions. Doing so makes positions like scientific essentialism tenable. Yet, while scientific essentialism demands that dispositional properties cannot be reduced to categorical properties, the Humean has the opposite commitment. If dispositional properties are primitives in our ontology, then there is a stronger tie between events than Humeans admit. So, another major disagreement between these accounts is whether dispositions can be reduced. After examining why many attempts at reducing dispositions have failed, I offer one suggestion of how to reduce dispositions and demonstrate that keeping dispositional properties as primitives in our ontology is worse than the solution I offer.
In this paper we argue that there is a problem with the conjunction of David Lewis' account of counterfactual conditionals and his account of laws of nature. This is a pressing problem since both accounts are individually plausible and popular.
The key idea of the interventionist account of causation is that a variable A causes a variable B if and only if B would change if A were manipulated in the appropriate way. This paper raises two problems for Woodward's (2003) version of interventionism. The first is that the conditions it imposes are not sufficient for causation, because these conditions are also satisfied by non-causal relations of nomological dependence expressed in association laws. Such laws ground a relation of mutual manipulability that is incompatible with the asymmetry of causation. Several ways of defending the interventionist account are examined and found unsatisfying. The second problem is that it often seems to be impossible, in a model that contains variables linked by an association law, to satisfy the conditions imposed on interventions on such variables. Various ways to solve this second problem, most importantly the analysis of manipulability in terms of difference making, are examined. Given that none solves the problem, I conclude that the interventionist conditions are neither sufficient nor necessary for causation. It is suggested that they provide an analysis of nomological dependence, which may be supplemented with the notion of a causal process to yield an analysis of causation.
In The Really Hard Problem, Owen Flanagan maintains that accounting for meaning requires going beyond the resources of the physical, biological, social, and mind sciences. He notes that the religious myths and fantastical stories that once "funded" flourishing lives and made life meaningful have been epistemically discredited by science but nevertheless insists that meaning does exist and can be fully accounted for only in a form of systematic philosophical theorizing that is continuous with science and does not need to invoke myth. He sees such a mode of thought as a new, empirical-normative science, which he labels eudaimonistic scientia, and which evades the disenchantment produced by natural scientific accounts of meaning. I argue that such an empirical-normative science does not provide us with a scientific account of meaning but is itself simply another way of making sense of one's life that is open to scientific explanation. Such an explanation will be deflationary in the sense that it presumes no greater scheme of things for meaning beyond the span of human existence (collective and possibly individual) but not disenchanting in that it does not explain away the flourishing lives human persons and communities create for themselves.
This paper proposes a new deflationary reading of the metaphor of the "primitive sense of selfhood" in perception and proprioception, usually understood as an "experiential self-reference" that takes place before reflection and any use of concepts. As such, the paper is also a new defense of the old orthodox view that self-consciousness is a highly complex mental phenomenon that requires equally complex concepts. The author's defense is a clear case of inference to the best explanation. He argues that postulating an "experiential self-reference" to explain the "primitive sense of selfhood" is as much explanatory overkill as attributing perceptions to bacteria to explain the remarkably sophisticated ways in which they adapt, attune, and respond to their environments. This is what the author calls the trivialization of self-consciousness. The metaphor of the "primitive sense of selfhood" in perception and proprioception is far less extravagantly explained by what, based on Recanati, the author calls self-involvement without self-consciousness: there is no "experiential self-reference" because there is no self-reference in the first place. Rather than being articulated as a constituent of the contents of her/his perceptions or proprioception, the self/subject is the key element of the circumstance of evaluation of these selfless contents.
The merits of David Lewis’s Best System Account of natural law are frequently debated. But to my knowledge, the prospects for extending the BSA to cover meta-laws have never been examined. I shall identify two obstacles facing the most natural way of extending the BSA to cover meta-laws. The BSA’s fans should consider how these obstacles are to be overcome. Meta-laws are laws about laws. For example, Einstein’s special theory of relativity incorporates a meta-law: The content of the [special] relativity theory can … be summarized in one sentence: all natural laws must be so conditioned that they are covariant with respect to Lorentz transformations. [The special theory of relativity] is not a theory in the usual sense but is better regarded as a second-level theory, or a theory of theories that constrains first-level theories. The principle of relativity is an example of a symmetry principle: a principle requiring that the first-order laws be unchanged under a given transformation. Long before Einstein proposed the principle of relativity, other spacetime symmetries were widely believed to be meta-laws: that the first-order laws are covariant under arbitrary spatial displacements, temporal displacements and spatial rotations. These spacetime symmetries require the laws to treat all spatial locations and directions alike and all moments alike. For instance, symmetry under temporal displacements rules out a fundamental force law specifying that a given force declines with the inverse-square of the distance before a given moment but with the inverse-cube of the distance at and after that moment. Wigner characterizes such a symmetry principle as a ‘superprinciple’.
Self-deception poses serious difficulties for belief attribution because the behavior of the self-deceived is deeply conflicted: some of it supports the attribution of a certain belief, while some of it supports the contrary attribution. Theorists have resorted either to attributing both beliefs to the self-deceived, or to postulating an unconscious belief coupled with another kind of cognitive attitude. On the other hand, deflationary accounts of self-deception have attempted a more parsimonious solution: attributing only one, false belief to the subject. My aim in this paper is to critically examine this strategy and, subsequently, to suggest that its failure gives support to the neglected view that the self-deceived are not accurately describable as believing either of the relevant propositions.
On the relevance-theoretic approach outlined in this paper, linguistic metaphors are not a natural kind, and ‘metaphor’ is not a theoretically important notion in the study of verbal communication. Metaphorical interpretations are arrived at in exactly the same way as literal, loose and hyperbolic interpretations: there is no mechanism specific to metaphors, and no interesting generalisation that applies only to them. In this paper, we defend this approach in detail by showing how the same inferential procedure applies to utterances at both ends of the literal-loose-metaphorical continuum, and how both literal and metaphorical utterances may create poetic effects.
In recent writings Paul Horwich has pursued two related aims: (1) to show "how small a constraint is provided by compositionality", since "the compositionality of meaning imposes no constraint at all on how the meaning properties of words are constituted"; and (2) to present a deflationary alternative to the "Davidsonian truth-theoretic perspective". The paper has three sections: in section 1 I make some comments on compositionality, in section 2 I argue that Horwich does not succeed in achieving aim (1), and in section 3 I argue that he does not succeed in achieving aim (2).
This is a review of Craig Dilworth's The Metaphysics of Science (Dordrecht, Springer, 2007). The book propounds an immensely important idea. Science makes metaphysical presuppositions. Unfortunately, Dilworth ignores work that has been done on this issue which takes the matter much further than he does.
3 LIBERTARIANISM Now that we have discussed determinism and laws of nature, let us finally turn to libertarianism. Traditionally, libertarianism has been viewed as an incompatibilist theory of free will, as it requires the existence of real ...
This book propounds an immensely important idea. Science makes metaphysical presuppositions. I must, however, at once declare an interest. For well over thirty years I have myself been expounding and arguing for just this idea.
An oft-repeated claim is that there is information in some biological entity or process, most especially in genes. Some of these claims derive from the Central Dogma, population genetics, and the neo-Darwinian program. Others derive from attacks upon evolution, in an attempt to show that “information cannot be created” by natural selection. In this paper I will try to show that the term “information” is a homonym for a range of distinct notions, and that these notions are either of concrete properties, in which case they are little more than a periphrasis for correlation and causation, or of abstract properties, in which case they are observer-dependent. In short, if information is in the concrete world, it is causality. If it is abstract, it is in the head.
Kant's distinction between things in themselves and things as they appear, or appearances, is commonly attacked on the ground that it delivers a radical and incoherent ‘two world’ picture of what there is. I attempt to deflect this attack by questioning these terms of dismissal. Distinctions of the kind Kant draws on are in fact legion, and they make perfectly good sense. The way to make sense of them, however, is not by buying into a profligate ontology but by using some rather different tools – surprisingly enough, tools first developed in the area of aesthetics. Once this is done, much of what Kant says begins to look perfectly coherent. In the final part of the paper, I point out that none the less all is not well. Kant's Critical doctrines make it hard for us to accept Kant's own version of this otherwise coherent distinction.
The paper explores a deductive-nomological account of metaphysical explanation: some truths metaphysically explain, or ground, another truth just in case the laws of metaphysics determine the latter truth on the basis of the former. I develop and motivate a specific conception of metaphysical laws, on which they are general rules that regulate the existence and features of derivative entities. I propose an analysis of the notion of ‘determination via the laws’, based on a restricted form of logical entailment. I argue that the DN-account of ground can be defended against the well-known objections to the DN-approach to scientific explanation. The goal of the paper is to show that the DN-account of metaphysical explanation is a well-motivated and defensible theory.
It has been argued that the fundamental laws of physics do not face a ‘problem of provisos’ equivalent to that found in other scientific disciplines (Earman, Roberts and Smith 2002) and that there is only the appearance of exceptions to physical laws if they are confused with differential equations of evolution type (Smith 2002). In this paper I argue that even if this is true, fundamental laws in physics still pose a major challenge to standard Humean approaches to lawhood, as they are not in any obvious sense about regularities in behaviour. A Humean approach to physical laws with exceptions is possible, however, if we adopt a view of laws that takes them to be the algorithms in the algorithmic compressions of empirical data. When this is supplemented with a distinction between lossy and lossless compression, we can explain exceptions in terms of compression artefacts present in the application of the lossy laws.
This paper sketches a dispositionalist conception of laws and shows how the dispositionalist should respond to certain objections. The view that properties are essentially dispositional is able to provide an account of laws that avoids the problems that face the two views of laws (the regularity and the contingent nomic necessitation views) that regard properties as categorical and laws as contingent. I discuss and reject the objections that (i) this view makes laws necessary whereas they are contingent; (ii) this view cannot account for certain kinds of laws of nature and their properties.
That laws of nature play a vital role in explanation, prediction, and inductive inference is far clearer than the nature of the laws themselves. My hope here is to shed some light on the nature of natural laws by developing and defending the view that they involve genuine relations between properties. Such a position is suggested by Plato, and more recent versions have been sketched by several writers. But I am not happy with any of these accounts, not so much because they lack detail or engender minor difficulties, though they do, but because they share a quite fundamental defect. My goal here is to make this defect clear and, more importantly, to present a rather different version of this general conception of laws that avoids it. I begin by considering several features of natural laws and argue that these are best explained by the view that laws involve properties, that this involvement takes the form of a genuine relation between properties, and, finally, that the relation is a metaphysically necessary one. In the second section I start at the other end, and by reflecting on the nature of properties arrive at a similar account of natural laws. In the final section I develop this account in more detail, with emphasis on the nature of the relation between properties it invokes. Along the way several natural objections to the account are answered.
Fred Dretske, Michael Tooley, and David Armstrong accept a theory of governing laws of nature according to which laws are atomic states of affairs that necessitate corresponding natural regularities. Some philosophers object to the Dretske/Tooley/Armstrong theory on the grounds that there is no illuminating account of the necessary connection between governing law and natural regularity. In response, Michael Tooley has provided a reductive account of this necessary connection in his book Causation (1987). In this essay, I discuss an improved version of his account and argue that it fails. First, the account cannot be extended to explain the necessary connection between certain sorts of laws—namely, probabilistic laws and laws relating structural universals—and their corresponding regularities. Second, Tooley’s account succeeds only by (very subtly) incorporating primitive necessity elsewhere, so the problem of avoiding primitive necessity is merely relocated.
According to best systems accounts, laws of nature are generalizations in the best systematization of particular matters of fact. Metrics such as simplicity and strength determine which systematization is best, but these are notoriously language relative. For this reason, David Lewis proposed a constraint on languages of inquiry: all predicates must be natural. This constraint is sometimes interpreted as requiring us to know which natural properties are instantiated in our world prior to scientific theorizing. I argue that this interpretation is incorrect. I provide a better interpretation and show how it undercuts an influential epistemological objection to Lewis's best systems account of laws due to Bas van Fraassen.
There is a common argument form in the metaphysics of natural laws literature: a theory of natural law is attacked by offering a claim L as a law of scientific field F (physics, chemistry, biology, etc.), and from this law metaphysical implications contrary to the theory are drawn. Quite often however, L would not be regarded as a law by a scientist of F. Roberts' "measurability account of laws" offers a new and interesting way to more reliably identify the laws of a field F and to eliminate the L that are not laws of F from the literature.
Contemporary Humeans treat laws of nature as statements of exceptionless regularities that function as the axioms of the best deductive system. Such ‘Best System Accounts’ marry realism about laws with a denial of necessary connections among events. I argue that Hume’s predecessor, George Berkeley, offers a more sophisticated conception of laws, equally consistent with the absence of powers or necessary connections among events in the natural world. On this view, laws are not statements of regularities but the most general rules God follows in producing the world. Pace most commentators, I argue that Berkeley’s view is neither instrumentalist nor reductionist. More important, the Berkeleyan Best System can solve some of the problems afflicting its Humean rivals, including the problems of theory choice and Nancy Cartwright’s ‘facticity’ dilemma. Some of these solutions are available in the contemporary context, without any appeal to God. Berkeley’s account deserves to be taken seriously in its own right.
The better best system account (BBSA for short) is a variation on Lewis’s theory of laws. The difference from the latter is that the BBSA suggests that best system analyses can be executed for any fixed set of properties. This affords the possibility to launch system analyses separately for the set of biological properties yielding the set of biological laws, for chemical properties yielding chemical laws, and so on for the other special sciences. As such, the BBSA remains silent about possible interrelations between these freestanding sets of laws. In this paper, I explicate an emergence relation between them which preserves the autonomy or novelty of each special science’s laws but also shows their dependence: the autonomy of each level’s generalisations is given because nomicity is conferred on them system-intrinsically, while their dependence is established via their supervenience on lower-level laws. As will be shown, the autonomy of special science laws is further strengthened by their ceteris paribus character.
This paper proposes a revision of our understanding of causation that is designed to address what Hartry Field has suggested is the central problem in the metaphysics of causation today: reconciling Bertrand Russell’s arguments that the concept of causation can play no role in the advanced sciences with Nancy Cartwright’s arguments that causal concepts are essential to a scientific understanding of the world. The paper shows that Russell’s main argument is, ironically, very similar to an argument that Cartwright has put forward against the truth of universal laws of nature. The paper uses this insight to develop an account of causation that does justice to traditional views yet avoids the arguments of Russell. (shrink)