Truth, etc. is a wide-ranging study of ancient logic based upon the John Locke lectures given by the eminent philosopher Jonathan Barnes in Oxford. The book presupposes no knowledge of logic and no skill in ancient languages: all ancient texts are cited in English translation; and logical symbols and logical jargon are avoided so far as possible. Anyone interested in ancient philosophy, or in logic and its history, will find much to learn and enjoy here.
This anthology looks at the early sages of Western philosophy and science who paved the way for Plato and Aristotle and their successors. Democritus's atomic theory of matter, Zeno's dazzling "proofs" that motion is impossible, Pythagorean insights into mathematics, Heraclitus's haunting and enigmatic epigrams: all form part of a revolution in human thought that relied on reasoning, forged the first scientific vocabulary, and laid the foundations of Western philosophy. Jonathan Barnes has painstakingly brought together the surviving Presocratic fragments in their original contexts, utilizing the latest research and a major new papyrus of Empedocles. Translated and edited by Jonathan Barnes.
In the works of Sextus Empiricus, scepticism is presented in its most elaborate and challenging form. This book investigates - both from an exegetical and from a philosophical point of view - the chief argumentative forms which ancient scepticism developed. Thus the particular focus is on the Agrippan aspect of Sextus' Pyrrhonism. Barnes gives a lucid explanation and analysis of these arguments, both individually and as constituent parts of a sceptical system. For, taken together, these forms amount to a formidable and systematic challenge to any claim to knowledge or rational belief. The challenge had a great influence on the history of philosophy. And it has never been met. This study reflects the growing interest in ancient scepticism. Quotations from the ancient sources are all translated and Greek terms are explained. Notes on the ancient authors give a brief guide to the sources, both familiar and unfamiliar.
What is it to deceive someone? And how is it possible to deceive oneself? Does self-deception require that people be taken in by a deceitful strategy that they know is deceitful? The literature is divided between those who argue that self-deception is intentional and those who argue that it is non-intentional. In this study, Annette Barnes offers a challenge both to the standard characterization of other-deception and to current characterizations of self-deception, examining the available explanations and exploring such questions as the self-deceiver's false consciousness, bias, and the irrationality and objectionability of self-deception. She arrives at a non-intentional account of self-deception that is deeper and more complete than alternative non-intentional accounts and avoids the reduction of self-deceptive belief to wishful belief.
Are human freedom and choice exaggerated in recent social theory? Should agency be the central concept in sociology? In this penetrating and assured book, one of the leading commentators in the field asks where social theory is going. Barnes argues that social theory has taken a wrong turn in overstating individual freedom. The result is that the social contexts in which all individual actions are situated are dangerously under-theorized. Barnes calls for a form of social theory that recognizes that sociability is the essential characteristic of human life. It is our capacity to communicate with each other, and to plan for each other's welfare, that makes us truly human. Once this is allowed, notions of "agency", "freedom", and "choice" lose their connotations of free-floating individualism. Instead the embedded character of agency is starkly revealed. This is a model of well-informed and balanced analysis. It will be of interest to students of sociology, philosophy and social theory.
Barnes examines Plato's ideas about life through the "Allegory of the Cave". By engaging young children in discussion of the nature of selfhood, moral and political issues, and enlightenment, it offers any classroom an alternative to a dry session on philosophy.
The influence of Aristotle, the prince of philosophers, on the intellectual history of the West is second to none. In this book, Jonathan Barnes examines Aristotle's scientific researches, his discoveries in logic and his metaphysical theories, his work in psychology and in ethics and politics, and his ideas about art and poetry, placing his teachings in their historical context.
D. Miller's demonstrations of the language dependence of truthlikeness raise a profound problem for the claim that scientific progress is objective. In two recent papers (Barnes 1990, 1991) I argue that the objectivity of progress may be grounded on the claim that the aim of science is not merely truth but knowledge; progress thus construed is objective in an epistemic sense. In this paper I construct a new solution to Miller's problem grounded on the notion of "approximate causal explanation" which allows for linguistically invariant progress outside an epistemic context. I suggest that the notion of "approximate causal explanation" provides the resources for a more robust theory of progress than that provided by the notion of "approximate truth".
The Introduction to philosophy written by Porphyry at the end of the third century AD is the most successful work of its kind ever to have been published. It was translated into most respectable languages, and for a millennium and a half every student of philosophy read it as his first text in the subject. Porphyry's aim was modest: he intended to explain the meaning of five terms, 'genus', 'species', 'difference', 'property', and 'accident' - terms which he took to be important to Aristotelian logic and metaphysics, and hence to philosophy in general. Thus in principle the Introduction is simple and elementary. In fact, there are sometimes difficulties and doubts on the surface of the text - and beneath the surface there are frequent depths or profundities. The work raises, directly or indirectly, a number of perennial philosophical questions. In addition, the Introduction became, in Boethius's Latin translation, the point of reference for one of the longest-lasting of philosophical disputes - the dispute over the status of 'universals'. This book contains a new English translation of the Introduction, preceded by a study of the life and works of Porphyry, the purpose and nature of the Introduction, and the history of the text. It is accompanied by a discursive commentary whose primary aim is to analyse and assess the philosophical theses and arguments which the Introduction puts forward. (But there are also numerous notes of a more philological or historical turn.) The twentieth century turned away from Aristotelian logic, and the Introduction lost its position on the syllabus. Barnes does not argue that it should be put back in its old place; but his commentary - the first to be published in English, and the fullest to be published for a century - suggests that there is blood in the old man yet.

CLARENDON LATER ANCIENT PHILOSOPHERS
General Editors: Jonathan Barnes and A. A. Long

This series, which is modelled on the familiar Clarendon Aristotle and Clarendon Plato series, is designed to encourage philosophers and students of philosophy to explore the fertile terrain of later ancient philosophy. The texts range in date from the first century BC to the fifth century AD, and will cover all the parts and all the schools of philosophy. Each volume contains a substantial introduction, an English translation, and a critical commentary on the philosophical claims and arguments of the text. The translations aim primarily at accuracy and fidelity; but they are also readable and accompanied by notes on textual problems that affect the philosophical interpretation. No knowledge of Greek or Latin is assumed.
In this paper we aim to disentangle the thesis that the future is open from theses that often get associated or even conflated with it. In particular, we argue that the open future thesis is compatible with both the unrestricted principle of bivalence and determinism with respect to the laws of nature. We also argue that whether or not the future (and indeed the past) is open has no consequences as to the existence of (past and) future ontology.
The classical view of the relationship between necessity and apriority, defended by Leibniz and Kant, is that all necessary truths are known a priori. The classical view is now almost universally rejected, ever since Saul Kripke and Hilary Putnam discovered that there are necessary truths that are known only a posteriori. However, in recent years a new debate has emerged over the epistemology of these necessary a posteriori truths. According to one view – call it the neo-classical view – knowledge of a necessary truth always depends on at least one item of a priori knowledge. According to the rival view – call it the neo-empiricist view – our knowledge of necessity is sometimes broadly empirical. In this paper I present and defend an argument against the neo-empiricist view. I argue that knowledge of the necessity of a necessary truth could not be broadly empirical.
We discuss arguments against the thesis that the world itself can be vague. The first section of the paper distinguishes dialectically effective from ineffective arguments against metaphysical vagueness. The second section constructs an argument against metaphysical vagueness that promises to be of the dialectically effective sort: an argument against objects with vague parts. First, cases of vague parthood commit one to cases of vague identity. But we argue that Evans' famous argument against vague identity will not on its own enable one to complete the reductio in the present context. We provide a metaphysical premise that would complete the reductio, but note that it seems deniable. We conclude by drawing general morals from our case study.
Ancient philosophy is in a bad way. Like all other academic disciplines, it is crushed by the embrace of bureaucracy. Like other parts of philosophy, it is infected by faddishness. And in addition it suffers cruelly from the decline in classical philology. There is no cure for this disease.
In this paper I develop a framework for understanding ontic vagueness. The project of the paper is two-fold. I first outline a definitional account of ontic vagueness – one that I think is an improvement on previous attempts because it remains neutral on other, independent metaphysical issues. I then develop one potential manifestation of that basic definitional structure. This is a more robust (and much less neutral) account which gives a fully classical explication of ontic vagueness via modal concepts. The overarching aim is to systematically investigate the puzzling question of what exactly it could be for the world itself to be vague.
Many of us are tempted by the thought that the future is open, whereas the past is not. The future might unfold one way, or it might unfold another; but the past, having occurred, is now settled. In previous work we presented an account of what openness consists in: roughly, that the openness of the future is a matter of it being metaphysically indeterminate how things will turn out to be. We were previously concerned merely with presenting the view and exploring its consequences; we did not attempt to argue for it over rival accounts. That is what we will aim to do in this paper.
In this paper, I argue for a new way of characterizing ontological emergence. I appeal to recent discussions in meta-ontology regarding fundamentality and dependence, and show how emergence can be simply and straightforwardly characterized using these notions. I then argue that many of the standard problems for emergence do not apply to this account: given a clearly specified meta-ontological background, emergence becomes much easier to explicate. If my arguments are successful, they show both a helpful way of thinking about emergence and the potential utility of discussions in meta-ontology when applied to first-order metaphysics.
In this paper I develop a characterization of disability according to which disability is in no way a sub-optimal feature. I argue, however, that this conception of disability is compatible with the idea that having a disability is, at least in a restricted sense, a harm. I then go on to argue that construing disability in this way avoids many of the common objections levelled at accounts which claim that disability is not a negative feature.
Although science was once seen as the product of individual great men working in isolation, we now realize that, like any other creative activity, science is a highly social enterprise, influenced in subtle as well as obvious ways by the wider culture and values of its time. Scientific Knowledge is the first introduction to social studies of scientific knowledge. The authors, all noted for their contributions to science studies, have organized this book so that each chapter examines a key step in the process of doing science. Using case studies from cognitive science, physics, and biology to illustrate their descriptions and applications of the social study of science, they show how this approach provides a crucial perspective on how science is actually done. Scientific Knowledge will be of interest not only to those engaged in science studies, but also to anyone interested in the practice of science.
In this paper I argue that Gareth Evans’ famous proof of the impossibility of de re indeterminate identity fails on a counterpart-theoretic interpretation of the determinacy operators. I attempt to motivate a counterpart-theoretic reading of the determinacy operators and then show that, understood counterpart-theoretically, Evans’ argument is straightforwardly invalid.
Recent research has yielded an explosion of literature that establishes a strong connection between emotional and cognitive processes. Most notably, Antonio Damasio draws an intimate connection between emotion and cognition in practical decision making. Damasio presents a "somatic marker" hypothesis which explains how emotions are biologically indispensable to decisions. His research on patients with frontal lobe damage indicates that feelings normally accompany response options and operate as a biasing device to dictate choice. What Damasio's hypothesis lacks is a theoretical model of decision making which can advance the conceptual connection between emotional and cognitive decision making processes. In this paper we combine Damasio's somatic marker hypothesis with the coherence theory of decision put forward by Thagard and Millgram. The juxtaposition of Damasio's hypothesis with a cognitive theory of decision making leads to a new and better theory of emotional decisions.
Knowledge can be transmitted by a valid deductive inference. If I know that p, and I know that if p then q, then I can infer that q, and I can thereby come to know that q. What feature of a valid deductive inference enables it to transmit knowledge? In some cases, it is a proof of validity that grounds the transmission of knowledge. If the subject can prove that her inference follows a valid rule, then her inference transmits knowledge. However, this only pushes the question back to the inference that was made in this proof. What feature of that inference enables it to transmit knowledge? A vicious regress looms here. Every proof requires a valid inference, and every valid inference must follow at least one rule of inference. So every proof must follow at least one rule of inference. Therefore not every valid inference that transmits knowledge can acquire this power through a proof, on pain of vicious infinite regress. So it must be possible to transmit knowledge by making an inference that follows an underived rule. A deductive inference that follows an underived rule is what I will call a basic deductive inference. It must be possible to transmit knowledge by making a basic deductive inference. But how is this possible? What feature of a basic deductive inference gives it this power to transmit knowledge?
Hill and Levine offer alternative explanations of these conceivabilities, concluding that these conceivabilities are thereby defeated as evidence. However, this strategy fails because their explanations generalize to all conceivability judgments concerning phenomenal states. Consequently, one could defend absolutely any theory of phenomenal states against conceivability arguments in just this way. This result conflicts with too many of our common sense beliefs about the evidential value of conceivability with respect to phenomenal states. The general moral is that the application of such principles of explanatory defeat is neither simple nor straightforward.
We contend that empathy is best viewed as a kind of analogical thinking of the sort described in the multiconstraint theory of analogy proposed by Keith Holyoak and Paul Thagard (1995). Our account of empathy reveals the Theory-theory/Simulation theory debate to be based on a false assumption and formulated in terms too simple to capture the nature of mental state ascription. Empathy is always simulation, but may simultaneously include theory-application. By properly specifying the analogical processes of empathy and their constraints, we are able to show how the amount of theory needed to empathize is determined.
The quantitative problem of old evidence is the problem of how to measure the degree to which e confirms h for agent A at time t when A regards e as justified at t. Existing attempts to solve this problem have applied the e-difference approach, which compares A's probability for h at t with what probability A would assign h if A did not regard e as justified at t. The quantitative problem has been widely regarded as unsolvable primarily on the grounds that the e-difference approach suffers from intractable problems. Various philosophers have proposed that 'Bayesianism' should be rejected as a research strategy in confirmation theory in part because of the unsolvability of this problem. I develop a version of the e-difference approach which overcomes these problems and possesses various advantages (but also certain limitations). I develop an alternative 'theistic' approach which handles many cases that my development of the e-difference approach does not handle. I conclude with an assessment of the significance of the quantitative problem for Bayesianism and argue that this problem is misunderstood in so far as it is regarded as unsolvable, and in so far as it is regarded as a problem only for Bayesians.
In this paper I respond to Trenton Merricks's (2005) paper ‘Composition and Vagueness’. I argue that Merricks's paper faces the following difficulty: he claims to provide independent motivation for denying one of the premisses of the Lewis-Sider vagueness argument for unrestricted composition, but the alleged motivation he provides begs the question.
The miracle argument for scientific realism can be cast in two forms: according to the miraculous theory argument, realism is the only position which does not make the empirical successes of particular theories miraculous. According to the miraculous choice argument, realism is the only position which does not render the fact that empirically successful theories have been chosen a miracle. A vast literature discusses the miraculous theory argument, but the miraculous choice argument has been unjustifiably neglected. I raise two objections to Richard Boyd's defense of the latter: (1) we have no miracle-free account of the emergence of take-off theories and (2) the anti-realist can account for the non-miraculous choice of empirically successful theories by attributing mere empirical adequacy to background theory. I argue that the availability of extra-empirical criteria that are arguably truth-conducive but not theory-laden suffices to answer (1), and the unavailability of extra-empirical criteria that are conducive to empirical adequacy but not necessarily to truth (and are also not theory-laden) constitutes a reply to (2). The prospects for a realist victory are at least somewhat promising, on a controversial assumption about the rate at which empirically successful theories emerge.
This thesis is a systematic investigation of whether there might be conceptual room for the idea that the world itself might be vague, independently of how we describe it. This idea – the existence of so-called ontic vagueness – has generally been extremely unpopular in the literature; my thesis thus seeks to evaluate whether this ‘negative press’ is justified. I start by giving a working definition and semantics for ontic vagueness, and then attempt to show that there are no conclusive arguments that rule out vagueness of this kind. I subsequently establish what type of arguments I think would be most effective in establishing ontic vagueness and provide some arguments of this form. I then highlight a potential worry for this type of argument, but argue that it can be circumvented. Finally, I consider the main ways that the opponent of ontic vagueness would be likely to resist the arguments I have offered, and argue that these strategies of response are methodologically problematic. I conclude by claiming that ontic vagueness is a perfectly plausible ontological commitment.
While Sartre scholars cannot fairly be described as being opposed to science, they have, for the most part, stayed aloof. The field of psychology, of course, has been an exception. Sartre himself felt compelled to present his own existential psychoanalysis by marking the parallels and differences between his position and traditional approaches, particularly the Freudian. The same is true with respect to his concept of bad faith and of emotional behavior. Scholars have followed his lead with richly productive results. But we may note that the debate has centered on psychic and therapeutic issues, aspects of what Sartre called le vécu or lived experience, rather than on the findings of cognitive science or neuroscience. Although all existentialists and phenomenologists accept as a central tenet the fact that consciousness is embodied, there has been virtually no concern with the biological substratum. But the study of consciousness cannot be restricted within its own narrow confines—unlike, say, Greek grammar, which can be learned without reference to the rules of Arabic. At some point, there must be established an organic foundation for the behavior of the conscious organism.
In recent literature on vagueness, writers have noted that more ‘plentiful’ theories of properties – those that postulate genuine properties corresponding to the classically vague predicates like ‘bald’ and ‘heap’ – appear straightforwardly committed to ontic vagueness. In this paper, however, I will argue that worries of ontic vagueness are not specific to ‘plentiful’ accounts of properties. The classically ‘sparse’ theories of properties – Universals and tropes – will, I contend, be subject to similar difficulties.
The behavior of individuals currently living will generally have long-term consequences that affect the well-being of those who will come to live in the future. Intergenerational interdependencies of this nature raise difficult moral issues because only the current generation is in a position to decide on actions that will determine the nature of the world in which future generations will live. Although most are willing to attach some weight to the interests of future generations, many would argue that it is not necessary to treat these interests as equivalent to those of the current generation. A common approach in this context is to use a system of discounting to evaluate future benefits and harms. This paper assesses the logic of discounting, drawing on the writings of economists and philosophers. Much of the economic literature concerns the choice of an appropriate social discount rate. The social discount rate can be taken to reflect beliefs about the rights of future generations, a subject that has been extensively debated in the philosophic literature. The writings of both economists and philosophers concerned with the weight to attach to the interests of future generations are reviewed and evaluated in this paper and the implications for environmental policy are discussed.
This paper addresses a significant gap in the conceptualization of business ethics within different cultural influences. Though theoretical models of business ethics have recognized the importance of culture in ethical decision-making, few have examined how this influences ethical decision-making. Therefore, this paper develops propositions concerning the influence of various cultural dimensions on ethical decision-making using Hofstede's typology.
"Technoscience" is now most commonly used in academic work to refer to sets of activities wherein science and technology have become inextricably intermingled, or else have hybridized in some sense. What, though, do we understand by "science" and by "technology"? The use of these terms has varied greatly, but their current use presumes a society with extensive institutional and occupational differentiation. Only in that kind of context may science and technology be treated as "other" in relation to "the rest" of the social order; whether as differentiated sets of practices or as specialized institutional forms. References to "technoscience" may then be taken to imply a reversal of earlier processes of cultural and institutional differentiation and/or a recombination of separate bodies of practice and skill. Either way a move back to a less differentiated state is implied, which makes it surprising that we appear to have very few memories of technoscience in periods less culturally and institutionally differentiated than our own, with a lower level of technical and intellectual division of labor. However, the elusiveness of our memories of technoscience may be significant mainly for what it suggests about our ways of conceptualizing the past, rather than for any insight it offers into that past "itself." We tend to identify practices and networks of practices in terms of functions. And it may be because, at different times, different functions have been selected as constitutive of practices, that "technoscience" has come to be regarded as something especially characteristic of the present.
Ancient philosophers -- The history of philosophy -- Philosophy within quotation marks? -- Anglophone attitudes -- Brentano's Aristotle -- Heidegger in the cave -- 'There was an old person from Tyre' -- The Presocratics in context -- Argument in ancient philosophy -- Philosophy and dialectic -- Aristotle and the methods of ethics -- Metacommentary -- An introduction to Aspasius -- Parmenides and the Eleatic One -- Reason and necessity in Leucippus -- Plato's cyclical argument -- Death and the philosopher -- Aristotelian arithmetic -- The principle of plenitude -- 'Aristotle's opinion concerning destiny and what is up to us' -- 'Belief is up to us' -- The same again: the Stoics and eternal recurrence -- Bits and pieces -- Partial wholes -- 'Drei Sonnen sah ich ...': Syrianus and astronomy -- Immaterial causes.
Sider (Four-dimensionalism 2001; Philos Stud 114:135–146, 2003; Nous 43:557–567, 2009) has developed an influential argument against indeterminacy in existence. In what follows, I argue that the defender of metaphysical forms of indeterminate existence has a unique way of responding to Sider’s argument. The response I’ll offer is interesting not only for its applicability to Sider’s argument, but also for its broader implications; responding to Sider helps to show both how we should think about precisification in the context of metaphysical indeterminacy and how we should understand commitment to metaphysically indeterminate existence. And if I’m right that metaphysical indeterminacy can allow for indeterminate existence in a way that semantic indeterminacy can’t, indeterminate existence might actually give us a reason to accept metaphysical indeterminacy (rather than a reason to reject it, as is commonly assumed).
In recent years there has been a resurgence of interest in property dualism—the view that some mental properties are neither identical with, nor strongly supervenient on, physical properties. One of the principal objections to this view is that, according to natural science, the physical world is a causally closed system. So if mental properties are really distinct from physical properties, then it would seem that mental properties never really cause anything that happens in the physical world. Thus, dualism threatens to lead inexorably to epiphenomenalism. In this paper, I will argue that the only way for a property dualist to avoid epiphenomenalism is to deny that the human body is strictly identical with the sum of its microphysical parts. I will go on to argue that the only way to sustain such anti-reductionism about the human body is to embrace some sort of substance-hylomorphism.
This paper proposes a solution to David Miller's Minnesotan-Arizonan demonstration of the language dependence of truthlikeness (Miller 1974), along with Miller's first-order demonstration of the same (Miller 1978). It is assumed, with Peter Urbach, that the implication of these demonstrations is that the very notion of truthlikeness is intrinsically language dependent and thus non-objective. As such, truthlikeness cannot supply a basis for an objective account of scientific progress. I argue that, while Miller is correct in arguing that the number of true atomic sentences of a false theory is language dependent, the number of known sentences (under certain straightforward assumptions) is conserved by translation; degree of knowledge, unlike truthlikeness, is thus a linguistically invariant notion. It is concluded that the objectivity of scientific progress must be grounded on the fact (noted in Cohen 1980) that knowledge, not mere truth, is the aim of science.
Philip Kitcher has proposed a theory of explanation based on the notion of unification. Despite the genuine interest and power of the theory, I argue here that the theory suffers from a fatal deficiency: It is intrinsically unable to account for the asymmetric structure of explanation, and thus ultimately falls prey to a problem similar to the one which beset Hempel's D-N model. I conclude that Kitcher is wrong to claim that one can settle the issue of an argument's explanatory force merely on the basis of considerations about the unifying power of the argument pattern the argument instantiates.
University of Exeter, England

The central argument of this article is that there is no fact of the matter, no evidence, however tentative or questionable, that will serve adequately to identify actions "chosen" or "determined" for the purposes of sociological theory. This argument will be developed with reference to the two theorists of the greatest importance in advocating the sociological value of the concept of agency: Talcott Parsons, with his "voluntaristic theory of action," set the scene for the whole agency and structure debate in modern sociology, and Anthony Giddens, in his theory of structuration, provides the most comprehensive recent account. Both theorists put forward grounds and justifications for their use of the concepts of "choice" and "agency," but it will be argued here that in the last analysis, none of them has any sociological merit.
The theory of explanatory unification was first proposed by Friedman (1974) and developed by Kitcher (1981, 1989). The primary motivation for this theory, it seems to me, is the argument that this account of explanation is the only account that correctly describes the genesis of scientific understanding. Despite the apparent plausibility of Friedman's argument to this effect, however, I argue here that the unificationist thesis of understanding is false. The theory of explanatory unification as articulated by Friedman and Kitcher thus emerges as fundamentally misconceived.
What effect does witnessing other students cheat have on one's own cheating behavior? What roles do moral attitudes and neutralizing attitudes (justifications for behavior) play when deciding to cheat? The present research proposes a model of academic dishonesty which takes into account each of these variables. Findings from experimental (vignette) and survey methods determined that seeing others cheat increases cheating behavior by causing students to judge the behavior less morally reprehensible, not by making rationalization easier. Witnessing cheating also has unique effects, controlling for other variables.
I aim to show that one way of testing the mettle of a theory of scientific explanation is to inquire what that theory entails about the status of brute facts. Here I consider the nature of brute facts, and survey several contemporary accounts of explanation vis-à-vis this subject (the Friedman-Kitcher theory of explanatory unification, Humphreys' causal theory of explanation, and Lipton's notion of 'explanatory loveliness'). One problem with these accounts is that they seem to entail that brute facts represent a gap in scientific understanding. I argue that brute facts are non-mysterious and indeed are even explainable by the lights of Salmon's ontic conception of explanation (which I endorse here). The plausibility of various models of explanation, I suggest, depends to some extent on the tendency of their proponents to focus on certain examples of explananda - I ponder brute facts qua explananda here as a way of helping us to recognize this dependency.