A scientific theory, in order to be accepted as a part of theoretical scientific knowledge, must satisfy both empirical and non-empirical requirements, the latter having to do with simplicity, unity, explanatory character, symmetry, beauty. No satisfactory, generally accepted account of such non-empirical requirements has so far been given. Here, a proposal is put forward which, it is claimed, makes a contribution towards solving the problem. This proposal concerns unity of physical theory. In order to satisfy the non-empirical requirement of unity, a physical theory must be such that the same laws govern all possible phenomena to which the theory applies. Eight increasingly demanding versions of this requirement are distinguished. Some implications for other non-empirical requirements, and for our understanding of science, are indicated.
There are two problems of simplicity. First, what does it mean to characterize a scientific theory as simple, unified or explanatory, given that a simple theory can always be made complex (and vice versa) by a change of terminology? Second, how is the preference in science for simple theories to be justified? In this paper I put forward a proposal as to how the first problem is to be solved. The more nearly the totality of fundamental physical theory exemplifies the metaphysical thesis that the universe has a unified dynamic structure, the simpler that totality of theory is. What matters is content, not form. This proposed solution may appear to be circular, but I argue that it is not. Towards the end of the paper I make a few remarks about the second, justificational problem of simplicity.
The idea that simplicity matters in science is as old as science itself, with the much-cited example of Ockham's Razor, 'entia non sunt multiplicanda praeter necessitatem': entities are not to be multiplied beyond necessity. A problem with Ockham's razor is that nearly everybody seems to accept it, but few are able to define its exact meaning and to make it operational in a non-arbitrary way. Using a multidisciplinary perspective including philosophers, mathematicians, econometricians and economists, this monograph examines simplicity by asking six questions: What is meant by simplicity? How is simplicity measured? Is there an optimum trade-off between simplicity and goodness-of-fit? What is the relation between simplicity and empirical modelling? What is the relation between simplicity and prediction? What is the connection between simplicity and convenience? The book concludes with reflections on simplicity by Nobel Laureates in Economics.
I discuss Hume's views about whether simplicity and generality are positive features of explanations. In criticizing Hobbes and others who base their systems of morality on self-interest, Hume diagnoses their errors as resulting from a "love of simplicity". These worries about whether simplicity is a positive feature of explanations emerge in Hume's thinking over time. But Hume does not completely reject the idea that it is good to seek simple explanations. What Hume thinks we need is good judgment about when we are going too far in our search for simple explanations. These worries about simplicity are not unique to Hume. We can see versions of them in the work of Hutcheson, Smith, and Reid.
In this article, I explain how and why different attempts to defend absolute divine simplicity fail. A proponent of absolute divine simplicity has to explain why different attributions do not presuppose a metaphysical complexity in God but just one super-property, why there is no difference between God and His super-property, and finally how an absolutely simple entity can be the truthmaker of different intrinsic predications. This does not necessarily lead to a rejection of divine simplicity, but it shows that we may consider another conception of divine simplicity compatible with some metaphysical complexity in God. (Yann Schmitt, International Journal for Philosophy of Religion, DOI 10.1007/s11153-012-9336-7.)
Simple assumptions represent a decisive reason to prefer one theory to another in everyday scientific praxis. But this praxis has little philosophical justification, since there exist many notions of simplicity, and those that can be defined precisely depend strongly on the language in which the theory is formulated. The language dependence is, to some extent, a natural feature, but it is also believed to be a fatal problem, because, according to a common general argument, the simplicity of a theory is always trivial in a suitably chosen language. This trivialization argument, however, is typically either applied to toy models of scientific theories or applied with little regard for the empirical content of the theory. This paper shows that the trivialization argument fails when one considers realistic theories and requires their empirical content to be preserved. In fact, the concepts that enable a very simple formulation are not, in general, necessarily measurable. Moreover, the inspection of a theory describing a chaotic billiard shows that precisely those concepts that naturally make the theory extremely simple are provably not measurable. This suggests that, whenever a theory possesses sufficiently complex consequences, the constraint of measurability prevents overly simple formulations in any language. This explains why scientists often regard their assessments of simplicity as largely unambiguous. In order to reveal a cultural bias in scientists' assessments, one should explicitly identify different characterizations of simplicity of the assumptions that lead to different theory selections. General arguments are not sufficient.
Despite the United States' economic abundance, "the good life" has proved elusive. Millions long for more time for friends and family, for reading or walking or relaxing. Instead our lives are frantic, hectic, and harried. In Graceful Simplicity, Jerome M. Segal, philosopher, political activist, and former staff member of the House Budget Committee, expands and deepens the contemporary discourse on simple living. He articulates his conception of a politics of simplicity, one rooted in beauty, peace of mind, appreciativeness, and generosity of spirit.
The paper corrects misrepresentations of Aquinas's understanding of divine simplicity, argues that the reasons he gives for divine simplicity are persuasive ones, and suggests how Aquinas's account of the Trinity can be used to explain how God can be said to exist necessarily. It gives an account of Aquinas's conception of form and individualised form, and shows how Plantinga's criticism of Aquinas's position on divine simplicity rests on a misunderstanding of Aquinas's notion of form. It describes and makes the case for Aquinas's argument that God must be absolutely simple because he is the uncaused cause of all effects, and any real composition in things constitutes an effect. It shows that Brian Davies is mistaken in claiming that Aquinas does not hold God's existence to be logically necessary. It applies Frege's conception of existence to Aquinas's account of God's simplicity and his psychological analogy for the Trinity, in order to explain how God's existence can coherently be said to be logically necessary.
Richard Dawkins has popularized an argument which, according to him, proves that there is almost certainly no God. It rests on the assumption that complex and statistically improbable things are more difficult to explain than those that are not, and that any explanatory mechanism that is called on to do the explaining must show how this complexity can be built up from simpler means, as it would be useless otherwise. In this paper, I first question what justifies the consideration of the designer's own complexity. I suggest a different understanding of both order and simplicity, one that becomes inevitable when one considers the psychological counterpart of information. I then assess what seems to be the inference engine of the proposal, the metaphor of biological organisms as either self-programmed machines or algorithms. I show how self-generated organized complexity would not sit well with our knowledge of both abduction and the theorems of information theory applied to genetics. I then turn to the positive side of Dawkins’ challenge, and I review some philosophers and their proposals for how the complexity of the world could be controlled from outside if one wanted to uphold a traditional understanding of God’s simplicity.
Children learn their native language by exposure to their linguistic and communicative environment, but apparently without requiring that their mistakes be corrected. Such learning from “positive evidence” has been viewed as raising “logical” problems for language acquisition. In particular, without correction, how is the child to recover from conjecturing an over-general grammar, which will be consistent with any sentence that the child hears? There have been many proposals concerning how this “logical problem” can be dissolved. In this study, we review recent formal results showing that the learner has sufficient data to learn successfully from positive evidence, if it favors the simplest encoding of the linguistic input. Results include the learnability of linguistic prediction, grammaticality judgments, language production, and form-meaning mappings. The simplicity approach can also be “scaled down” to analyze the learnability of specific linguistic constructions, and it is amenable to empirical testing as a framework for describing human language acquisition.
In a recent work, Popper claims to have solved the problem of induction. In this paper I argue that Popper fails both to solve the problem, and to formulate the problem properly. I argue, however, that there are aspects of Popper's approach which, when strengthened and developed, do provide a solution to at least an important part of the problem of induction, along somewhat Popperian lines. This proposed solution requires, and leads to, a new theory of the role of simplicity in science, which may have helpful implications for science itself, thus actually stimulating scientific progress.
I examine the central atheistic argument of Richard Dawkins’s book The God Delusion (“Dawkins’s Gambit”) and illustrate its failure. I further show that Dawkins’s Gambit is a fragment of a more comprehensive critique of theism found in David Hume’s Dialogues Concerning Natural Religion. Among the failings of Dawkins’s Gambit is that it is directed against a version of the God Hypothesis that few traditional monotheists hold. Hume’s critique is more challenging in that it targets versions of the God Hypothesis that are central to traditional monotheism. Theists and atheists should put away The God Delusion and pick up Hume’s Dialogues.
In this paper I explain several ways in which Descartes denied that the human soul or mind is composite and the role this idea played in his thought. The mind is whole in the whole and whole in the parts of the body because it has no parts. Unlike body, the mind is indivisible, and this is a different idea from the thought that mind and body are incorruptible. Descartes connects the immortality of the soul with its status as a substance and as incorruptible, rather than with its indivisibility.
The problem of simplicity involves three questions: How is the simplicity of a hypothesis to be measured? How is the use of simplicity as a guide to hypothesis choice to be justified? And how is simplicity related to other desirable features of hypotheses -- that is, how is simplicity to be traded off? The present paper explores these three questions, from a variety of viewpoints, including Bayesianism, likelihoodism, and the framework of predictive accuracy formulated by Akaike (1973). It may turn out that simplicity has no global justification -- that its justification varies from problem to problem.
Simplicity made difficult. (John MacFarlane, Philosophical Studies, DOI 10.1007/s11098-010-9626-9.)
The Fourth Lateran Council teaches that God is a “substantia seu natura simplex omnino”—an “altogether simple substance or nature”—and the First Vatican Council reiterated the teaching. The doctrine of divine simplicity is at the center of Thomas's..
In this article I assess the coherence of Jonathan Edwards's doctrine of divine simplicity as an instance of an actus purus account of perfect-being theology. Edwards's view is an idiosyncratic version of this doctrine. This is due to a number of factors, including his idealism and the Trinitarian context from which he developed his notion of simplicity. These complicating factors lead to a number of serious problems for his account, particularly with respect to the opera ad extra sunt indivisa principle. I conclude that Edwards sets out an interesting and subtle version of the doctrine, but one which appears mired in difficulties from which he is unable to extract himself.
There is a traditional theistic doctrine, known as the doctrine of divine simplicity, according to which God is an absolutely simple being, completely devoid of any metaphysical complexity. On the standard understanding of this doctrine—as epitomized in the work of philosophers such as Augustine, Anselm, and Aquinas—there are no distinctions to be drawn between God and his nature, goodness, power, or wisdom. On the contrary, God is identical with each of these things, along with anything else that can be predicated of him intrinsically.
According to many philosophical theologians, God is metaphysically simple: there is no real distinction among His attributes, or even between attribute and existence itself. Here, I consider only one argument against the simplicity thesis. Its proponents claim that simplicity is incompatible with God's having created another world, since simplicity entails that God is unchanging across possible worlds. For, they argue, different acts of creation involve different willings, which are distinct intrinsic states. I show that this is mistaken, by sketching an adequate account of reasons-guided activity that does not require distinct intrinsic states of willing corresponding to each possible act of creation.
According to the doctrine of divine simplicity, God is an absolutely simple being lacking any distinct metaphysical parts, properties, or constituents. Although this doctrine was once an essential part of traditional philosophical theology, it is now widely rejected as incoherent. In this paper, I develop an interpretation of the doctrine designed to resolve contemporary concerns about its coherence, as well as to show precisely what is required to make sense of divine simplicity.
Among theories which fit all of our data, we prefer the simpler over the more complex. Why? Surely not merely for practical convenience or aesthetic pleasure. But how could we be justified in this preference without knowing in advance that the world is more likely to be simple than complex? And isn't this a rather extravagant a priori assumption to make? I want to suggest some steps we can take toward reducing this embarrassment, by showing that the assumption which supports favouring simplicity is far more modest than it first seems.
Descartes famously endorsed the view that (CD) God freely created the eternal truths, such that He could have done otherwise than He did. This controversial doctrine is much discussed in recent secondary literature, yet Descartes’s actual arguments for CD have received very little attention. In this paper I focus on what many take to be a key Cartesian argument for CD: that divine simplicity entails the dependence of the eternal truths on the divine will. What makes this argument both important and interesting is that Descartes’s scholastic predecessors share the premise of divine simplicity but reject the CD conclusion. To properly understand Descartes, then, we must determine precisely where he diverges from his predecessors on the path from simplicity to CD. And when we do so we obtain a very surprising result: that despite many dramatic prima facie differences, there is no substantive difference between the relevant doctrines of Descartes and the scholastics. Or so I argue.
The article presents a new interpretation of Hume's treatment of personal identity, and his later rejection of it in the "Appendix" to the Treatise. Hume's project, on this interpretation, is to explain beliefs about persons that arise primarily within philosophical projects, not in everyday life. The belief in the identity and simplicity of the mind as a bundle of perceptions is an abstruse belief, not one held by the "vulgar", who rarely turn their minds on themselves so as to think of their perceptions. The author suggests that it is this philosophical observation of the mind that creates the problems that Hume finally acknowledges in the "Appendix." He is unable to explain why we believe that the perceptions by means of which we observe our minds while philosophizing are themselves part of our minds. This suggestion is then tested against seven criteria that any interpretation of the "Appendix" must meet.
In this paper we explore material simplicity, defined as the virtue disposing us to act appropriately within the sphere of our consumer decisions. Simplicity is a conscientious and restrained attitude toward material goods that typically includes (1) decreased consumption and (2) more conscious consumption; hence (3) greater deliberation regarding our consumer decisions; (4) a more focused life in general; and (5) a greater and more nuanced appreciation for other things besides material goods, and also for (6) material goods themselves. It is to be distinguished from simple-mindedness, a return to nature, or poverty. Simplicity overlaps with traditional virtues such as temperance, frugality, and wisdom, and sustains and enables traditional virtues such as justice and generosity. Simplicity is a virtue because it furthers human flourishing, both individual and social, and sustains nature’s ecological flourishing. For analytic purposes, we consider six areas in which simplicity can make important contributions: (1) basic individual flourishing, (2) basic societal flourishing, (3) individual freedom or autonomy, (4) the acquisition of knowledge, (5) living meaningfully, and (6) preserving and protecting nonhuman beings. The proven failure of materialism to secure subjective happiness or objective flourishing argues for the practice of voluntary simplicity and for the radical reform of modern consumer societies.
In this paper I first try to clarify the essential features of tropes, and then use the resulting analysis to address the problem of mental causation. As to the first step, I argue that tropes, besides being essentially particular and abstract, are simple, where such simplicity can be considered either from a phenomenal point of view or from a structural point of view. Once this feature is spelled out, the role tropes may play in solving the problem of mental causation is evaluated. It is argued that no solution based on the determinable/determinate relation is viable without begging the question as regards the individuating conditions of the related properties. Next, it is shown that Robb’s solution, much in the spirit of Davidson’s anomalous monism, entails abandoning the assumption that tropes are essentially simple, a consequence that I find unacceptable. My conclusion is that these entities are of no help in solving the problem of mental causation, and that a universalist approach should be preferred.
According to a doctrine widely held by medieval philosophers and theologians, whether in the Muslim or the Christian world, there are no metaphysical distinctions in God whatsoever. As a result of the compendious theorizing that has been done on this issue, the doctrine, usually called the doctrine of divine simplicity, has been accorded a prominent status in both Islamic and Christian philosophical theology. In Islamic philosophy some well-known philosophers, such as Ibn Sina (980–1037) and Mulla Sadra (1571–1640), developed this doctrine through a metaphysical approach. In this paper, following the historical order, I shall first concentrate on Ibn Sina’s view. Then I shall turn to the theory of divine simplicity of Thomas Aquinas (1225?–1274), as the most developed and comprehensive version of the medieval theories in the Christian world. Finally, I will return to Islamic philosophy and explore the more complicated and mature account of the doctrine as it was introduced by Mulla Sadra according to his own philosophical principles.
In this paper, I argue that a natural selection-based perspective gives reasons for thinking that the core of the ability to mindread cognitively complex mental states is subserved by a simulationist process—that is, that it relies on non-specialised mechanisms in the attributer’s cognitive architecture whose primary function is the generation of her own decisions and inferences. In more detail, I try to establish three conclusions. First, I try to make clearer what the dispute between simulationist and non-simulationist theories of mindreading fundamentally is about. Second, I try to make more precise an argument that is sometimes hinted at in support of the former: this ‘argument from simplicity’ suggests that, since natural selection disfavours building extra cognitive systems where this can be avoided, simulationist theories of mindreading are more in line with natural selection than their competitors. As stated, though, this argument overlooks the fact that building extra cognitive systems can also yield benefits: in particular, it can allow for the parallel processing of multiple problems, and it makes for the existence of backups for important elements of the organism’s mind. I therefore try to make this argument more precise by investigating whether these benefits also apply to the present case—and conclude negatively. My third aim in this paper is to use this discussion of mindreading as a means for exploring the promises and difficulties of evolutionary arguments in philosophy and psychology more generally.
The General Composition Question asks: what are the necessary and jointly sufficient conditions any xs and any y must satisfy in order for it to be true that those xs compose that y? Although this question has received little attention, there is an interesting and theoretically fruitful answer, namely Strong Composition as Identity (SCAI): necessarily, for any xs and any y, those xs compose y iff those xs are identical to y. SCAI is theoretically fruitful because, if it is true, then there is an answer to one of the most difficult and intractable questions of mereology (the Simple Question). In this paper, I introduce the Identity Account of Simplicity and argue that if SCAI is true then the Identity Account of Simplicity is as well. I consider an objection to the Identity Account of Simplicity. Ultimately, I find this objection unsuccessful.
In On the Nature and Existence of God, Richard Gale follows majority opinion in giving very short shrift to the doctrine of divine simplicity: in his view, there is no coherent expressible doctrine of divine simplicity. Rising to the implicit challenge, I argue that, contrary to what is widely believed, there is a coherently expressible doctrine of divine simplicity, though it is rather different from the views that are typically expressed by defenders of this doctrine. At the very least, I think that I manage to show that there are ways of understanding the doctrine of divine simplicity that have not yet been adequately examined.
The advent of formal definitions of the simplicity of a theory has important implications for model selection. But what is the best way to define simplicity? Forster and Sober advocate the use of Akaike's Information Criterion (AIC), a non-Bayesian formalisation of the notion of simplicity. This forms an important part of their wider attack on Bayesianism in the philosophy of science. We defend a Bayesian alternative: the simplicity of a theory is to be characterised in terms of Wallace's Minimum Message Length (MML). We show that AIC is inadequate for many statistical problems where MML performs well. Whereas MML is always defined, AIC can be undefined. Whereas MML is not known ever to be statistically inconsistent, AIC can be. Even when defined and consistent, AIC performs worse than MML on small sample sizes. MML is statistically invariant under 1-to-1 re-parametrisation, thus avoiding a common criticism of Bayesian approaches. We also show that MML provides answers to many of Forster's objections to Bayesianism. Hence an important part of the attack on Bayesianism fails.
The simplicity of a theory seems closely related to how well the theory summarizes individual data points. Think, for example, of classic curve-fitting. It is easy to get perfect data-fit with a “theory” that simply lists each point of data, but such a theory is maximally unsimple (for the data-fit). The simple theory suggests instead that there is one underlying curve that summarizes this data, and we usually prefer such a theory even at some expense in data-fit. In general, it seems, theorizing involves looking for regularities or patterns in our experience, and such regularities are interesting to us because they summarize how our experience goes. We could list all the ravens we’ve encountered, and their colors, or we could summarize..
Karl Popper equates simplicity with falsifiability. He develops his argument for this equation through a geometrical example. There is a flaw in his example, which undermines his claim that simplicity is falsifiability. I point out the flaw here.
In this paper I will argue that, in general, where the evidence supports two theories equally, the simpler theory is not more likely to be true and is not likely to be nearer the truth. In other words, simplicity does not tell us anything about model bias. Our preference for simpler theories (apart from their obvious pragmatic advantages) can be explained by the facts that humans are known to elaborate unsuccessful theories rather than attempt a thorough revision, and that a fixed set of data can only justify adjusting a certain number of parameters to a limited degree of precision. No extra tendency towards simplicity in the natural world is necessary to explain our preference for simpler theories. Thus Occam's razor eliminates itself (when interpreted in this form).
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The standard Bayesian folklore about factoring simplicity into the priors, and convergence theorems as a way of grounding their objectivity, are some of the myths that Earman's book does not address adequately. Review of John Earman, Bayes or Bust?, Cambridge, MA: MIT Press, 1992, £33.75 cloth.
Aquinas maintains that, although God created the universe, he could have created another or simply refrained from creating altogether. That Aquinas believes in divine free choice is uncontroversial. Yet doubts have been raised as to whether Thomas is entitled to this belief, given his claims concerning divine simplicity. According to simplicity, there is no potentiality in God, nor is there a distinction in God between God’s willing, His essence, and His necessary being. On the surface, it appears that these claims leave no room for divine free choice. I argue that attempts by Aquinas and a pair of his contemporary defenders to reconcile God’s freedom with God’s simplicity fail to resolve the problem. Nevertheless, I maintain that Aquinas provides the key to a resolution in his claim that while creatures are really related to God, God is not really related to creatures.
Treating the principle of charity as a non-empirical, foundational principle leads to insoluble problems of justification. I suggest instead treating semantic properties realistically, and semantic terms as theoretical terms. This allows us to apply ordinary scientific reasoning in meta-semantics. In particular, we can appeal to widespread verbal agreement as an empirical phenomenon, and we can make use of probabilistic reasoning as well as appeal to theoretical simplicity for reaching the conclusion that there is a high rate of agreement in belief between speakers who have a high rate of verbal agreement. Although this does not by itself imply that the beliefs agreed upon are generally true, it does imply that any single speaker who is party to such a verbal agreement is justified in taking the other speakers to have mostly true beliefs. She is so justified because of the fact that it is incoherent to take her own beliefs not to be mostly true. Indirectly, this justifies a version of the principle of charity as an empirically correct principle.
This chapter examines four solutions to the problem of many models, and finds some fault or limitation with all of them except the last. The first is the naïve empiricist view that the best model is the one that best fits the data. The second is based on Popper’s falsificationism. The third approach is to compare models on the basis of some kind of trade-off between fit and simplicity. The fourth is the most powerful: cross-validation testing.
Often, when people discuss the role of simplicity in science, they do not notice the trade-off between simplicity of ontology and simplicity of the theory that uses an ontology. Einstein appears to have been emphasising simplicity of ontology (basic elements), though he might have included theory as well (basic axioms/assumptions).
Explaining the connection, if any, between simplicity and truth is among the deepest problems facing the philosophy of science, statistics, and machine learning. Say that an efficient truth-finding method minimizes worst-case costs en route to converging to the true answer to a theory choice problem. Let the costs considered include the number of times a false answer is selected, the number of times opinion is reversed, and the times at which the reversals occur. It is demonstrated that (1) always choosing the simplest theory compatible with experience, and (2) hanging onto it while it remains simplest, is both necessary and sufficient for efficiency.
In this study, Andrew Radde-Gallwitz argues that Basil and Gregory develop an understanding of divine simplicity which does not require that God be identical with the properties of God, or that these be identical with one another. Their motivation is that they want to hold that we cannot, in all eternity, know God's essence, and yet that we have knowledge of God. Radde-Gallwitz argues that, for Basil and especially Gregory, in addition to our "conceptualizations" (epinoiai), we also have knowledge of propria, properties necessarily connected to God's essence. In the early chapters, Radde-Gallwitz surveys the background to the Cappadocians, beginning with the second century. He argues that in early Christianity the..
This paper presents a new explanation of how preferring the simplest theory compatible with experience assists one in finding the true answer to a scientific question when the answers are theories or models. Inquiry is portrayed as an unending game between science and nature in which the scientist aims to converge to the true theory on the basis of accumulating information. Simplicity is a topological invariant reflecting sequences of theory choices that nature can force an arbitrary, convergent scientist to produce. It is demonstrated that among the methods that converge to the truth in an empirical problem, the ones that do so with a minimum number of reversals of opinion prior to convergence are exactly the ones that prefer simple theories. The approach explains not only simplicity tastes in model selection, but aspects of theory testing and the unwillingness of natural science to break symmetries without a reason.
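The reversal-counting idea can be illustrated with a toy problem (the colour-counting setup, data stream, and function names here are invented for illustration, not the paper's formal topological framework): a learner who always conjectures the simplest hypothesis consistent with experience, namely that no unobserved colours exist, reverses opinion only when nature forces it, once per new colour.

```python
def ockham_conjectures(stream):
    """At each stage, conjecture the simplest answer consistent with the
    data seen so far: that no colours beyond those already observed will
    ever appear. Returns the sequence of conjectures (as frozensets)."""
    seen, conjectures = set(), []
    for item in stream:
        seen.add(item)
        conjectures.append(frozenset(seen))
    return conjectures

def count_retractions(conjectures):
    """A reversal of opinion occurs whenever the conjecture changes."""
    return sum(1 for a, b in zip(conjectures, conjectures[1:]) if a != b)

stream = ["red", "red", "blue", "red", "green", "blue"]
cs = ockham_conjectures(stream)
print(count_retractions(cs))  # 2: one at "blue", one at "green"
```

No method that converges to the true set of colours can avoid these forced reversals, which is the sense in which the Ockham strategy is worst-case efficient.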
Almost all commentators acknowledge that among the grounds on which scientists perform theory-choices are criteria of simplicity. In general, simplicity is regarded either as only a logico-empirical quality of a theory, diagnostic of the theory's future predictive success, or as a purely aesthetic or otherwise extra-empirical property of it. This paper attempts to demonstrate that the simplicity-criteria applied in scientific practice include both a logico-empirical and a quasi-aesthetic criterion: to conflate these in an account of scientists' theory-choice is to court confusion.
Nelson Goodman has constructed two theories of simplicity: one of predicates; one of hypotheses. I offer a simpler theory by generalization and abstraction from his. Generalization comes by dropping special conditions Goodman imposes on which unexcluded extensions count as complicating and which excluded extensions count as simplifying. Abstraction is achieved by counting only nonisomorphic models and subinterpretations. The new theory takes into account all the hypotheses of a theory in assessing its complexity, whether they were projected prior to, or result from, projection of a given hypothesis. It assigns simplicity post-projection priority over simplicity pre-projection. It better orders compound conditionals than does the theory of simplicity of hypotheses, and it does not inherit an anomaly of the theory of simplicity of predicates — its failure to order the ordering relations. Drop Goodman's special conditions, and the problems fall away with them.
Given at the 2007 Formal Epistemology Workshop at Carnegie Mellon, June 2nd. Good compression must track higher versus lower probability of inputs, and this is one way to approach how simplicity tracks truth.
Participants in the debate about whether simplicity is a guide to truth or merely pragmatically useful typically wrangle over two problems: (1) how to weigh simplicity against other virtues like strength and fitness and (2) whether there is a unique measure for simplicity that straps it to truth. I would like to put forth a third problem: (3) Even if problems (1) and (2) could be solved, it is far from clear whether the simplest theory out of an available class of competitors would always be the one closest to the truth.
The philosophical significance of the procedure of applying the Akaike Information Criterion (AIC) to curve-fitting problems is evaluated. The theoretical justification for using AIC (the so-called Akaike's theorem) is presented in a rigorous way, and its range of validity is assessed by presenting both instances in which it is valid and counter-examples in which it is invalid. The philosophical relevance of the justification that this result gives for making one particular choice between simple and complicated hypotheses is emphasized. In addition, recent claims that the methods based on Akaike's theorem are relevant to other philosophical problems associated with the notion of simplicity are presented and evaluated.
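The criterion under discussion has a compact computable form in the curve-fitting case. A minimal sketch, assuming Gaussian errors (under which AIC reduces to n·ln(RSS/n) + 2k for k fitted parameters) and using invented data:

```python
import numpy as np

def aic_poly(x, y, degree):
    """AIC of a least-squares polynomial fit, assuming Gaussian errors:
    AIC = n * ln(RSS / n) + 2k, with k = degree + 1 fitted coefficients."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    n, k = len(x), degree + 1
    return n * np.log(rss / n) + 2 * k

# Invented data from a quadratic law plus noise. The straight line (degree 1)
# underfits badly; the degree-8 fit buys little extra fit at a 2k penalty.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x - 0.5 * x ** 2 + rng.normal(0.0, 0.3, size=x.size)
scores = {d: aic_poly(x, y, d) for d in (1, 2, 8)}
```

On such data the quadratic typically receives the lowest AIC: the linear fit pays a large residual cost, while the high-degree fit pays the complexity penalty for a negligible gain in fit.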
Kant saw in an old argument a threat to his criticism of traditional rational psychology. He called this argument the Achilles of all dialectical inferences. What the Achilles purports to prove is that the unity of consciousness requires the simplicity of the soul. The argument proceeds from a distinction between two types of actions that are ascribable to a subject. For example, when we say that a school of fish moves, this movement can be explained by referring to the movements of the fish constituting that whole. Thus, “moving” is an action type that can be attributed to an aggregate. The second premise says that the action of the thinking I cannot be regarded as the concurrence of several things acting. Thus, any thinking self has to be a simple subject because the action of the thinking self cannot be an aggregate of several actions of different subjects constituting that self. In this paper, Kant’s criticism of the Achilles argument is investigated.
The doctrine of God’s absolute simplicity denies the possibility of real distinctions in God. It is, e.g., impossible that God have any kind of parts or any intrinsic accidental properties, or that there be real distinctions among God’s essential properties or between any of them and God himself. After showing that some of the counter-intuitive implications of the doctrine can readily be made sense of, the authors identify the apparent incompatibility of God’s simplicity and God’s free choice as a special difficulty and associate it with two others: the apparent incompatibilities between essential omnipotence and essential goodness, and between perfect goodness and moral goodness. Since all three of these difficulties are associated with a certain understanding of the nature of God’s will, the authors base their resolution of them on an account of will in general and of God’s will in particular, drawing on Aquinas’s theory of will. Taking creation as their paradigm of divine free choice, the authors develop a solution of the principal incompatibility based on three claims: (i) God’s acts of choice are both free and conditionally necessitated; (ii) the difference between absolutely and conditionally necessitated acts of will is not a real distinction in God; and (iii) the conditional necessity of God’s acts of will is compatible with contingency in the objects of those acts. The heart of their solution consists in their attempt to make sense of and support those claims. The authors extend their solution to cover the two associated apparent incompatibilities as well. The article concludes with observations on the importance of the doctrine of God’s absolute simplicity for resolving problems in religious morality and in the cosmological argument.
Two experiments were performed to explore the mechanisms of human 3D shape perception. In Experiment 1, the subjects’ performance in a shape constancy task in the presence of several cues (edges, binocular disparity, shading and texture) was tested. The results show that edges and binocular disparity, but not shading or texture, are important in 3D shape perception. Experiment 2 tested the effect of several simplicity constraints, such as symmetry and planarity, on subjects’ performance in a shape constancy task. The 3D shapes were represented by edges or vertices only. The results show that performance with or without binocular disparity is at chance level, unless the 3D shape is symmetric and/or its faces are planar. In both experiments, there was a correlation between the subjects’ performance with and without binocular disparity. Our study suggests that simplicity constraints, not depth cues, play the primary role in both monocular and binocular 3D shape perception. These results are consistent with our computational model of 3D shape recovery.
Intractability and optimality are two sides of one coin: optimal models are often intractable, that is, they tend to be excessively complex, or NP-hard. We explain the meaning of NP-hardness in detail and discuss how modern computer science circumvents intractability by introducing heuristics and shortcuts to optimality, often replacing optimality with sufficient sub-optimality. Since the principles of decision theory dictate balancing the cost of computation against gain in accuracy, statistical inference is currently being reshaped by a vigorous new trend: the science of simplicity. Simple models, as we show for specific cases, are not just tractable, they also tend to be robust. Robustness is the ability of a model to extract relevant information from data, disregarding noise. Recently, Gigerenzer, Todd and the ABC Research Group (1999) have put forward a collection of fast and frugal heuristics as simple, boundedly rational inference strategies used by the unaided mind in real-world inference problems. This collection of heuristics has suggestively been called the adaptive toolbox. In this paper we will focus on a comparison task in order to illustrate the simplicity and robustness of some of the heuristics in the adaptive toolbox in contrast to the intractability and the fragility of optimal solutions. We will concentrate on three important classes of models for comparison-based inference and, in each of these classes, search for models to be used as benchmarks to evaluate the performance of fast and frugal heuristics: lexicographic trees, linear models and Bayesian networks. Lexicographic trees are interesting because they are particularly simple models that have been used by humans throughout the centuries. Linear models have been traditionally used by cognitive psychologists as models for human inference, while Bayesian networks have only recently been introduced in statistics and computer science. Yet it is the Bayesian networks that are the best possible benchmarks for evaluating the fast and frugal heuristics, as we will show in this paper.
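The lexicographic trees benchmarked here work by inspecting cues one at a time. A minimal sketch in the spirit of the ABC group's "take-the-best" heuristic (the cue names and city data are invented for illustration):

```python
def take_the_best(obj_a, obj_b, cues):
    """Lexicographic comparison: inspect cues in order of validity and
    decide on the first cue that discriminates; guess if none does.
    Each object is a dict mapping cue name -> 1 (present) or 0 (absent)."""
    for cue in cues:  # cues assumed pre-ordered from most to least valid
        a, b = obj_a[cue], obj_b[cue]
        if a != b:
            return "A" if a > b else "B"
    return "guess"

# Hypothetical "which city is larger?" task with three binary cues.
cues = ["capital", "has_airport", "university"]
berlin = {"capital": 1, "has_airport": 1, "university": 1}
bonn = {"capital": 0, "has_airport": 1, "university": 1}
print(take_the_best(berlin, bonn, cues))  # -> "A"
```

The heuristic ignores all cues after the first discriminating one, which is exactly what makes it both frugal and, on many environments, robust relative to models that weigh every cue.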
In this paper I will argue that a false assumption drives the attraction of philosophers to a divine command theory of morality. Specifically, I suggest the idea that anything not created by God is independent of God is a misconception. The idea misleads us into thinking that our only choice in offering a theistic ground for morality is between making God bow to a standard independent of his will or God creating morality in revealing his will. Yet what is God is hardly independent of him, and in coupling a perfect being theology with the doctrine of divine simplicity we discover that God’s “reason” is God. Accordingly, obeying the truths of goodness that we humans speak of as contained in the divine wisdom hardly impugns the divine sovereignty. By modifying divine command ethics to give primacy to God’s love or justice, thinkers such as Robert M. Adams, Philip L. Quinn, and Edward J. Wierenga admit the repugnance of this picture in spite of their verbal allegiance to divine command ethics. Accordingly, they implicitly concede that basing morality on God’s sheer power should not be the preferred option for the Christian theist.
Recently Samuel Richmond, generalizing Nelson Goodman, has proposed a measure of the simplicity of a theory that takes into account not only the polymorphicity of its models but also their internal homogeneity. By this measure a theory is simple if small subsets of its models exhibit only a few distinct (i.e., non-isomorphic) structures. Richmond shows that his measure, unlike that given by Goodman's theory of simplicity of predicates, orders the order relations in an intuitively satisfactory manner. In this note I formalize his presentation and suggest an improvement designed to overcome certain technical difficulties.
Debates concerning the types of representations that aid reading acquisition have often been influenced by the relationship between measures of early phonological awareness (the ability to process speech sounds) and later reading ability. Here, a complementary approach is explored, analyzing how the functional utility of different representational units, such as whole words, bodies (letters representing the vowel and final consonants of a syllable), and graphemes (letters representing a phoneme), may change as the number of words that can be read gradually increases. Utility is measured by applying a Simplicity Principle to the problem of mapping from print to sound; that is, assuming that the “best” representational units for reading are those which allow the mapping from print to sounds to be encoded as efficiently as possible. Results indicate that when only a small number of words can be read, whole-word representations are most useful, whereas when many words can be read, graphemic representations have the highest utility.
I argue that Kant’s four Paralogistic conclusions concerning (a) substantiality; (b1) unity and (b2) immortality, in the famous “Achilles argument”; (c) personal identity; and (d) metaphysical idealism, in the first edition Critique of Pure Reason (1781), are all connected by being grounded in a common underlying rational principle, an a priori (universal and necessary) presupposition, namely, that both the mind and its essential attribute of thinking are immaterial and unextended, i.e., simple. Consequently, despite Kant’s predilection for architectonic divisions and separations, I show that in fact the simplicity assumption grounds all four Paralogisms and reinforces Kant’s corresponding commitments to the principles of continuity and coherence. Further, I maintain that Kant, under the influence of his earlier Leibnizian and subjective idealist leanings, continued to be guided in the first edition Critique, not only in the Paralogisms but also in certain sections of the Analytic, by emphasizing unconscious activities, which once more reinforced his commitments to a paradigm of the simplicity, unity, and identity of self-consciousness or apperception.
Arbitration is a preferred method for the resolution of international business disputes. As of yet, most publications on online arbitration deal with legal issues. In this paper, we present an online arbitration environment that we believe facilitates the participants in a meaningful way. Our assumption is that an ODR service should be easy to use (convenient), and at the same time provide meaningful support. More specifically we have paid attention to four criteria that we believe are important, viz. simplicity, awareness, orientation and timeliness. The online arbitration service is called GearBi.
Theoretical simplicity is difficult to characterize, and evidently can depend upon a number of distinct factors. One such desirable characteristic is that the laws of a theory have relatively few "counterinstances" whose accommodation requires the invocation of a ceteris paribus condition and ancillary explanation. It is argued that, when one theory is reduced to another, such that the laws of the second govern the behavior of the parts of the entities in the domain of the first, there is a characteristic gain in simplicity of the sort mentioned: while I see no way of quantitatively measuring the "amount" of defeasibility of the laws of a theory, microreduction can be shown to decrease that "amount."
Various formulations of the principle of simplicity in science are examined and rejected in favor of Goodman's proposal, the essence of which is to concentrate attention upon the predicates that form the extralogical basis of any given theory and to provide measures for comparing the relative structural simplicity of different sets of such predicates. The postulational basis of Goodman's method is set out and explained, together with some important amendments and additions, and a number of theorems are proved, with whose aid the simplest theory to account for a certain corpus of scientific phenomena is readily determinable.
Einstein proclaimed that we could discover true laws of nature by seeking those with the simplest mathematical formulation. He came to this viewpoint later in his life. In his early years and work he was quite hostile to this idea. Einstein did not develop his later Platonism from a priori reasoning or aesthetic considerations. He learned the canon of mathematical simplicity from his own experiences in the discovery of new theories, most importantly, his discovery of general relativity. Through his neglect of the canon, he realised that he delayed the completion of general relativity by three years and nearly lost priority in discovery of its gravitational field equations.
The authors try to show that many of the differences between Ross and themselves are only apparent, masking considerable agreement. Among the real disagreements, at least one is over the interpretation of Aquinas’s account of divine simplicity, but the most central disagreement consists in the authors’ claim that their concern was not with a distinction between the way God is and the way he might have been (as Ross suggests) but with the difference between the way God is necessarily and the way he is contingently. Finally, the authors argue that the concept of simplicity is indeed required for the solution of the problems discussed at the end of their original article.
It is not possible to dismiss the atomistic paradigm because the proposed elementary particles are too many (and, hence, it is claimed, they do not provide a simple account of nature) or because it is not possible to observe quarks in an isolated manner. The developments in particle physics have brought about radical changes to our notions of simplicity and observability, and in this paper we elaborate on these changes. As a result of these changes, the present situation in elementary particle physics justifies the claim that we have indeed reached a level of explanation where the constituent particles used to explain the various phenomena (quarks, leptons, gluons, and intermediate bosons), considered to delineate a particular level in the descriptive framework of physical phenomena and a specific stratum in the organization of nature, can be regarded as elementary.
Firstly, in this paper, we prove the equivalence of simplicity and the symmetry of forking. Secondly, we attempt to recover the definability part of stability theory within simplicity theory. In particular, using elimination of hyperimaginaries we prove that for any supersimple T, the canonical base of an amalgamation class P is the union of names of ψ-definitions of P, ψ ranging over stationary L-formulas in P. Also, we prove that the same is true with stable formulas for a 1-based theory having elimination of hyperimaginaries. For such a theory, the stable forking property holds, too.
This paper is intended to explore Jeffreys' proposal for the measurement of the simplicity of scientific laws. The first part is a sketch of Jeffreys' development of a view on simplicity, which will be followed by a discussion of what seem to be some rather crucial defects in the proposal as it stands. It will be suggested here that no plausible way of countering these defects seems available.
A well-known objection to divine simplicity holds that the doctrine is incompatible with God’s contingent knowledge. I set out the objection and reject two problematic solutions. I then argue that the objection is best answered by adopting an “extrinsic model of divine knowing” according to which God’s contingent knowledge, which varies across worlds, does not involve any intrinsic variation in God. Solutions along these lines have been suggested by others. This paper advances the discussion by developing and offering partial defenses of three such models.
This article attempts to determine how Leibniz might safeguard the simplicity of an individual substance (singular) while also retaining the view that causal powers (plural) are constitutive of said individual substance. I shall argue that causal powers are not to be understood as veritable parts of a substance in so far as such an account would render substances unnecessarily complex. Instead, my proposal is that sense can be made of Leibniz’s metaphysical picture by appeal to truthmakers. In order to develop my argument I critically examine (a) Leibniz’s revival of the scholastic notion of substantial form, (b) his theory of accidents, and (c) his account of metaphysical predication, and argue that an application of truthmaker theory can satisfy each in accordance with his simplicity requirement on individual substances.
Richard Swinburne has given a defense of arguments for the existence of God (and in particular of teleological arguments) in his book "The Existence of God" (1979/1991). This paper argues that such theistic arguments fail, and poses some general problems for theistic arguments. Swinburne's use of a principle of simplicity is not given adequate justification and, if justified, works against theism. There are adequate rebuttals to Swinburne's arguments that depend upon there being few particles of basic physics, universal laws of nature, a cogent cosmological argument, and temporal order in the universe. Theistic arguments falter on malleability, on going well beyond evidence, on anthropomorphism, on treating consistency as if it were evidence or explanation, on selective and inconsistent use of principles, and on a lack of any serious attempt to disprove hypotheses. All of this serves to support the conclusion suggested by Hume's posthumous theological writings that theistic arguments are so malleable, profligate, overreaching, equivocal, anthropomorphic, selective, inconsistent, and uncritical as to be inept.
In Simple Rules for a Complex World, Richard Epstein claims to be focusing on legal simplicity, and on the link between legal simplicity and a legal system less intrusive on individual liberty. It turns out, however, that Epstein's conception of simplicity is itself soaked with the substantive idea of individual liberty. The consequences of this are that the claim that legal simplicity brings individual liberty (and legal minimalism) becomes true by definition, and that Epstein avoids taking on the important and interesting questions of whether and when legal simplicity, more conventionally understood, produces less legal intrusiveness and thus, under Epstein's own conception, more liberty.
William Coperthwaite is a teacher, builder, designer, and writer who for many years has explored the possibilities of true simplicity on a homestead on the north coast of Maine. In the spirit of Henry David Thoreau, Emily Dickinson, and Helen and Scott Nearing, Coperthwaite has fashioned a livelihood of integrity and completeness—buying almost nothing, providing for his own needs, and serving as a guide and companion to hundreds of apprentices drawn to his unique way of being. A Handmade Life carries Coperthwaite’s ongoing experiments with hand tools, hand-grown and gathered food, and handmade shelter, clothing, and furnishings out into the world to challenge and inspire. His writing is both philosophical and practical, exploring themes of beauty, work, education, and design while giving instruction on the hand-crafting of the necessities of life. Richly illustrated with luminous color photographs by Peter Forbes, the book is a moving and inspirational testament to a new practice of old ways of life.
It is shown how simplicity in econometric modelling can be defended from three different methodological positions: a 'traditional scientific', a rhetorical and a hermeneutical one. Moreover, it is argued that the claim of methodological superiority by supporters of general-to-specific modelling is largely rhetoric. In practice there does not exist a viable alternative to simple modelling in empirical economics.
The most pedigreed line of thought about mind is the simplicity argument: that the unity of thinking entails the simplicity, immateriality, and immortality of soul. It is widely taken to be a rationalist argument, as opposed to an empiricist or peripatetic argument (see Mijuskovic, The Achilles of Rationalist Arguments), which was completely destroyed by Kant in the First Critique. In this paper it is argued that there is a conceptual connection between the downfall of the Aristotelian conception of soul as substantial form and the downfall of this argument in that in the downfall of the Aristotelian conception of soul it became acceptable to view the functional unity of a material system as constituting a genuine unity per se. This then undermined all philosophical motivation for the postulation of substantial forms. As a result, there was no longer reason for rooting the unity of apperception in the simplicity of a subsistence soul as opposed to some simply emergent power of thinking.
This paper argues that (1) Richard Swinburne’s general account of the simplicity of empirical hypotheses fails because it involves a deeply problematic notion of postulating a property, while there is a wide range of hypotheses where the assessment of simplicity rests entirely on the number and kinds of postulated properties, (2) Swinburne’s main argument in ’The Christian God’ for the simplicity of theism, the one based on considerations about pure limitless intentional power, is significantly weaker than he seems to believe. The paper does not draw a conclusion about whether theism is simple.
According to many philosophical theologians, God is metaphysically simple: there is no real distinction among His attributes or even between attribute and existence itself. Here, I consider only one argument against the simplicity thesis. Its proponents claim that simplicity is incompatible with God’s having created another world, since simplicity entails that God is unchanging across possible worlds. For, they argue, different acts of creation involve different willings, which are distinct intrinsic states. I show that this is mistaken, by sketching an adequate account of reasons-guided activity that does not require distinct intrinsic states of willing corresponding to each possible act of creation.
Predication is an indisputable part of our linguistic behavior. By contrast, the metaphysics of predication has been a matter of dispute ever since antiquity. According to Plato—or at least Platonism, the view that goes by Plato’s name in contemporary philosophy—the truths expressed by predications such as “Socrates is wise” are true because there is a subject of predication (e.g., Socrates), there is an abstract property or universal (e.g., wisdom), and the subject exemplifies the property. This view is supposed to be general, applying to all predications, whether the subject of predication is a person, a planet, or a property. Despite the controversy surrounding the metaphysics of predication, many theistic philosophers—including the majority of contemporary analytic theists—regard Platonism as extremely attractive. At the same time, however, such philosophers are also commonly attracted to a form of traditional theism that has at its core the thesis that God is an absolutely independent..
Mutual perceptual knowledge is a prevalent feature of our everyday lives, yet appears to be exceptionally difficult to characterise in an acceptable way. This paper argues for a renewed understanding of Stephen Schiffer’s iterative approach to mutual knowledge, according to which mutual knowledge requires an infinite number of overlapping, embedded mental states. It is argued that the charge of ‘psychological implausibility’ that normally accompanies discussion of this approach can be offset by identifying mutual knowledge, not with the infinite iterations themselves, but with the finite base which Schiffer proves is capable of generating those iterations. An understanding of this finite base as a primitive, relational property holding between two or more people, allows us to understand the iterations as an implicit and ‘harmless’ intrapersonal feature of what is an interpersonal phenomenon. The paper concludes by relating the account to joint attention in infant interaction.
For more than one decade, Andy Clark has defended the now-famous extended mind thesis, the idea that cognitive processes leak into the world. In this paper I analyse Clark’s theoretical justification for the thesis: explanatory simplicity. I argue that his way of justifying the thesis leads into contradiction, either at the level of propositional attitude ascriptions or at the theoretical level. I evaluate three possible strategies of dealing with this issue, concluding that they are all likely to fail and that therefore, as regards explanatory simplicity, the burden of proof is on Clark’s side. The paper divides into two main sections: in “Simplicity and Coherence”, I define the two concepts that are important in this context (simplicity and explanatory coherence). In “How to Cope with Coherence”, these two concepts are applied to the central thought experiment, the Inga/Otto case. It will be shown that justifying the extended mind thesis by reference to simplicity may cause trouble, because ‘extended’ behavioural descriptions are likely to yield rather complicated explanations.
Most scientists and philosophers of science recognize that, when it comes to accepting and rejecting theories in science, considerations that have to do with simplicity, unity, symmetry, elegance, beauty or explanatory power have an important role to play, in addition to empirical considerations. Until recently, however, no one has been able to give a satisfactory account of what simplicity (etc.) is, or how giving preference to simple theories is to be justified. But in the last few years, two different but related accounts have appeared, both of which address the above issues. On the one hand, James McAllister has argued that aesthetic criteria in science reflect scientists' judgements about what kind of theory is most likely to be empirically successful, based on the relative empirical success and failure of different kinds of theories in the past. Scientists employ what McAllister dubs "the aesthetic induction". On the other hand, I have argued that we need to see science as making a hierarchy of metaphysical assumptions about the comprehensibility and knowability of the universe, these assumptions asserting less and less as one ascends the hierarchy. One of the more substantial of these assumptions is that the universe is physically comprehensible. The key non-empirical feature a body of fundamental theories in physics must possess to be acceptable is unity. The better such a body of theory exemplifies the metaphysical thesis that the universe is physically comprehensible, in the sense that it has a unified dynamic structure, the more acceptable such a body of theory is, from this standpoint. This affects not just theoretical physics, but the whole of natural science. In this paper I compare and contrast, and try to assess impartially the relative merits of, these two views.
In ’The Coherence of Theism’ Richard Swinburne writes that a person cannot be omniscient and perfectly free. In ’The Existence of God’ Swinburne writes that God is a person who is omniscient and perfectly free. There is a straightforward reason why the two passages are not in tension, but recognition of this reason raises a problem for Swinburne’s argument in ’The Existence of God’ (the conclusion of which is that God likely exists). In this paper I present the problem for Swinburne’s argument. I then consider two potential responses and suggest that neither succeeds.
This article generalizes the explanationist account of inference to the best explanation (IBE). It draws a clear distinction between IBE and abduction and presents abduction as the first step of IBE. The second step amounts to the evaluation of explanatory power, which consists in the degree of explanatory virtues that a hypothesis exhibits. Moreover, even though coherence is the most often cited explanatory virtue, on pain of circularity, it should not be treated as one of the explanatory virtues. Rather, coherence should be equated with explanatory power and considered to be derivable from the other explanatory virtues: unification, explanatory depth and simplicity.