What effect does witnessing other students cheat have on one's own cheating behavior? What roles do moral attitudes and neutralizing attitudes (justifications for behavior) play when deciding to cheat? The present research proposes a model of academic dishonesty which takes into account each of these variables. Findings from experimental (vignette) and survey methods determined that seeing others cheat increases cheating behavior by causing students to judge the behavior less morally reprehensible, not by making rationalization easier. Witnessing cheating also has unique effects, controlling for other variables.
Truth, etc. is a wide-ranging study of ancient logic based upon the John Locke lectures given by the eminent philosopher Jonathan Barnes in Oxford. The book presupposes no knowledge of logic and no skill in ancient languages: all ancient texts are cited in English translation; and logical symbols and logical jargon are avoided so far as possible. Anyone interested in ancient philosophy, or in logic and its history, will find much to learn and enjoy here.
This anthology looks at the early sages of Western philosophy and science who paved the way for Plato and Aristotle and their successors. Democritus's atomic theory of matter, Zeno's dazzling "proofs" that motion is impossible, Pythagorean insights into mathematics, Heraclitus's haunting and enigmatic epigrams - all form part of a revolution in human thought that relied on reasoning, forged the first scientific vocabulary, and laid the foundations of Western philosophy. Jonathan Barnes has painstakingly brought together the surviving Presocratic fragments in their original contexts, utilizing the latest research and a major new papyrus of Empedocles. Translated and edited by Jonathan Barnes.
In the works of Sextus Empiricus, scepticism is presented in its most elaborate and challenging form. This book investigates - both from an exegetical and from a philosophical point of view - the chief argumentative forms which ancient scepticism developed. Thus the particular focus is on the Agrippan aspect of Sextus' Pyrrhonism. Barnes gives a lucid explanation and analysis of these arguments, both individually and as constituent parts of a sceptical system. For, taken together, these forms amount to a formidable and systematic challenge to any claim to knowledge or rational belief. The challenge had a great influence on the history of philosophy. And it has never been met. This study reflects the growing interest in ancient scepticism. Quotations from the ancient sources are all translated and Greek terms are explained. Notes on the ancient authors give a brief guide to the sources, both familiar and unfamiliar.
What is it to deceive someone? And how is it possible to deceive oneself? Does self-deception require that people be taken in by a deceitful strategy that they know is deceitful? The literature is divided between those who argue that self-deception is intentional and those who argue that it is non-intentional. In this study, Annette Barnes offers a challenge both to the standard characterization of other-deception and to current characterizations of self-deception, examining the available explanations and exploring such questions as the self-deceiver's false consciousness, bias, and the irrationality and objectionability of self-deception. She arrives at a non-intentional account of self-deception that is deeper and more complete than alternative non-intentional accounts and avoids the reduction of self-deceptive belief to wishful belief.
Are human freedom and choice exaggerated in recent social theory? Should agency be the central concept in sociology? In this penetrating and assured book, one of the leading commentators in the field asks where social theory is going. Barnes argues that social theory has taken a wrong turn in overstating individual freedom. The result is that the social contexts in which all individual actions are situated are dangerously under-theorized. Barnes calls for a form of social theory that recognizes that sociability is the essential characteristic of human life. It is our capacity to communicate with each other, and to plan for each other’s welfare, that makes us truly human. Once this is allowed, notions of “agency”, “freedom”, and “choice” lose their connotation of free-floating individualism. Instead the embedded character of agency is starkly revealed. This is a model of well-informed and balanced analysis. It will be of interest to students of sociology, philosophy and social theory.
Barnes examines Plato’s ideas on life through the “Allegory of the Cave”. The allegory’s themes - the nature of selfhood, moral and political issues, and enlightenment - offer any classroom an engaging alternative to a dry philosophy session for young children.
The influence of Aristotle, the prince of philosophers, on the intellectual history of the West is second to none. In this book, Jonathan Barnes examines Aristotle's scientific researches, his discoveries in logic and his metaphysical theories, his work in psychology and in ethics and politics, and his ideas about art and poetry, placing his teachings in their historical context.
D. Miller's demonstrations of the language dependence of truthlikeness raise a profound problem for the claim that scientific progress is objective. In two recent papers (Barnes 1990, 1991) I argue that the objectivity of progress may be grounded on the claim that the aim of science is not merely truth but knowledge; progress thus construed is objective in an epistemic sense. In this paper I construct a new solution to Miller's problem grounded on the notion of "approximate causal explanation" which allows for linguistically invariant progress outside an epistemic context. I suggest that the notion of "approximate causal explanation" provides the resources for a more robust theory of progress than that provided by the notion of "approximate truth".
This brief paperback is designed for symbolic/formal logic courses. It features the tree method proof system developed by Jeffrey. The new edition contains many more examples and exercises and is reorganized for greater accessibility.
The Introduction to philosophy written by Porphyry at the end of the third century AD is the most successful work of its kind ever to have been published. It was translated into most respectable languages, and for a millennium and a half every student of philosophy read it as his first text in the subject. Porphyry's aim was modest: he intended to explain the meaning of five terms, 'genus', 'species', 'difference', 'property', and 'accident' - terms which he took to be important to Aristotelian logic and metaphysics, and hence to philosophy in general. Thus in principle the Introduction is simple and elementary. In fact, there are sometimes difficulties and doubts on the surface of the text - and beneath the surface there are frequent depths or profundities. The work raises, directly or indirectly, a number of perennial philosophical questions. In addition, the Introduction became, in Boethius's Latin translation, the point of reference for one of the longest-lasting of philosophical disputes - the dispute over the status of 'universals'. This book contains a new English translation of the Introduction, preceded by a study of the life and works of Porphyry, the purpose and nature of the Introduction, and the history of the text. It is accompanied by a discursive commentary the primary aim of which is to analyse and assess the philosophical theses and arguments which the Introduction puts forward. (But there are also numerous notes of a more philological or historical turn.) The twentieth century turned away from Aristotelian logic, and the Introduction lost its position on the syllabus. Barnes does not argue that it should be put back in its old place; but his commentary - the first to be published in English, and the fullest to be published for a century - suggests that there is blood in the old man yet.

CLARENDON LATER ANCIENT PHILOSOPHERS. General Editors: Jonathan Barnes and A. A. Long. This series, which is modelled on the familiar Clarendon Aristotle and Clarendon Plato series, is designed to encourage philosophers and students of philosophy to explore the fertile terrain of later ancient philosophy. The texts range in date from the first century BC to the fifth century AD, and will cover all the parts and all the schools of philosophy. Each volume contains a substantial introduction, an English translation, and a critical commentary on the philosophical claims and arguments of the text. The translations aim primarily at accuracy and fidelity; but they are also readable and accompanied by notes on textual problems that affect the philosophical interpretation. No knowledge of Greek or Latin is assumed.
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
In this paper we aim to disentangle the thesis that the future is open from theses that often get associated or even conflated with it. In particular, we argue that the open future thesis is compatible with both the unrestricted principle of bivalence and determinism with respect to the laws of nature. We also argue that whether or not the future (and indeed the past) is open has no consequences as to the existence of (past and) future ontology.
The classical view of the relationship between necessity and apriority, defended by Leibniz and Kant, is that all necessary truths are known a priori. The classical view is now almost universally rejected, ever since Saul Kripke and Hilary Putnam discovered that there are necessary truths that are known only a posteriori. However, in recent years a new debate has emerged over the epistemology of these necessary a posteriori truths. According to one view – call it the neo-classical view – knowledge of a necessary truth always depends on at least one item of a priori knowledge. According to the rival view – call it the neo-empiricist view – our knowledge of necessity is sometimes broadly empirical. In this paper I present and defend an argument against the neo-empiricist view. I argue that knowledge of the necessity of a necessary truth could not be broadly empirical.
Ancient philosophy is in a bad way. Like all other academic disciplines, it is crushed by the embrace of bureaucracy. Like other parts of philosophy, it is infected by faddishness. And in addition it suffers cruelly from the decline in classical philology. There is no cure for this disease.
In this paper I argue that Gareth Evans’ famous proof of the impossibility of de re indeterminate identity fails on a counterpart-theoretic interpretation of the determinacy operators. I attempt to motivate a counterpart-theoretic reading of the determinacy operators and then show that, understood counterpart-theoretically, Evans’ argument is straightforwardly invalid.
Recent research has yielded an explosion of literature that establishes a strong connection between emotional and cognitive processes. Most notably, Antonio Damasio draws an intimate connection between emotion and cognition in practical decision making. Damasio presents a "somatic marker" hypothesis which explains how emotions are biologically indispensable to decisions. His research on patients with frontal lobe damage indicates that feelings normally accompany response options and operate as a biasing device to dictate choice. What Damasio's hypothesis lacks is a theoretical model of decision making which can advance the conceptual connection between emotional and cognitive decision making processes. In this paper we combine Damasio's somatic marker hypothesis with the coherence theory of decision put forward by Thagard and Millgram. The juxtaposition of Damasio's hypothesis with a cognitive theory of decision making leads to a new and better theory of emotional decisions.
Knowledge can be transmitted by a valid deductive inference. If I know that p, and I know that if p then q, then I can infer that q, and I can thereby come to know that q. What feature of a valid deductive inference enables it to transmit knowledge? In some cases, it is a proof of validity that grounds the transmission of knowledge. If the subject can prove that her inference follows a valid rule, then her inference transmits knowledge. However, this only pushes the question back to the inference that was made in this proof. What feature of that inference enables it to transmit knowledge? A vicious regress looms here. Every proof requires a valid inference, and every valid inference must follow at least one rule of inference. So every proof must follow at least one rule of inference. Therefore not every valid inference that transmits knowledge can acquire this power through a proof, on pain of vicious infinite regress. So it must be possible to transmit knowledge by making an inference that follows an underived rule. A deductive inference that follows an underived rule is what I will call a basic deductive inference. It must be possible to transmit knowledge by making a basic deductive inference. But how is this possible? What feature of a basic deductive inference gives it this power to transmit knowledge?
Hill and Levine offer alternative explanations of these conceivabilities, concluding that these conceivabilities are thereby defeated as evidence. However, this strategy fails because their explanations generalize to all conceivability judgments concerning phenomenal states. Consequently, one could defend absolutely any theory of phenomenal states against conceivability arguments in just this way. This result conflicts with too many of our common sense beliefs about the evidential value of conceivability with respect to phenomenal states. The general moral is that the application of such principles of explanatory defeat is neither simple nor straightforward.
We contend that empathy is best viewed as a kind of analogical thinking of the sort described in the multiconstraint theory of analogy proposed by Keith Holyoak and Paul Thagard (1995). Our account of empathy reveals the Theory-theory/Simulation theory debate to be based on a false assumption and formulated in terms too simple to capture the nature of mental state ascription. Empathy is always simulation, but may simultaneously include theory-application. By properly specifying the analogical processes of empathy and their constraints, we are able to show how the amount of theory needed to empathize is determined.
The quantitative problem of old evidence is the problem of how to measure the degree to which e confirms h for agent A at time t when A regards e as justified at t. Existing attempts to solve this problem have applied the e-difference approach, which compares A's probability for h at t with what probability A would assign h if A did not regard e as justified at t. The quantitative problem has been widely regarded as unsolvable primarily on the grounds that the e-difference approach suffers from intractable problems. Various philosophers have proposed that 'Bayesianism' should be rejected as a research strategy in confirmation theory in part because of the unsolvability of this problem. I develop a version of the e-difference approach which overcomes these problems and possesses various advantages (but also certain limitations). I develop an alternative 'theistic' approach which handles many cases that my development of the e-difference approach does not handle. I conclude with an assessment of the significance of the quantitative problem for Bayesianism and argue that this problem is misunderstood in so far as it is regarded as unsolvable, and in so far as it is regarded as a problem only for Bayesians.
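As a toy illustration of the difference idea (a sketch only; the function and numbers are hypothetical and not Barnes's own formalism), the e-difference compares the agent's actual credence in h with the credence the agent would assign h without regarding e as justified:

```python
def e_difference(p_h_actual: float, p_h_counterfactual: float) -> float:
    """Toy e-difference measure: the boost that accepting e gives to h,
    computed as the agent's actual probability for h minus the probability
    the agent would assign h if e were not regarded as justified.
    Illustrative only; names and numbers are hypothetical."""
    return p_h_actual - p_h_counterfactual

# If accepting e moves the agent's credence in h from 0.3 to 0.8,
# the degree of confirmation on this toy measure is 0.5.
print(e_difference(0.8, 0.3))
```

The hard part, which this sketch deliberately leaves open, is specifying the counterfactual second argument when e is old evidence that the agent has already absorbed.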
In this paper I respond to Trenton Merricks's (2005) paper ‘Composition and Vagueness’. I argue that Merricks's paper faces the following difficulty: he claims to provide independent motivation for denying one of the premisses of the Lewis-Sider vagueness argument for unrestricted composition, but the alleged motivation he provides begs the question.
The miracle argument for scientific realism can be cast in two forms: according to the miraculous theory argument, realism is the only position which does not make the empirical successes of particular theories miraculous. According to the miraculous choice argument, realism is the only position which does not render the fact that empirically successful theories have been chosen a miracle. A vast literature discusses the miraculous theory argument, but the miraculous choice argument has been unjustifiably neglected. I raise two objections to Richard Boyd's defense of the latter: (1) we have no miracle-free account of the emergence of take-off theories and (2) the anti-realist can account for the non-miraculous choice of empirically successful theories by attributing mere empirical adequacy to background theory. I argue that the availability of extra-empirical criteria that are arguably truth-conducive but not theory-laden suffices to answer (1), and the unavailability of extra-empirical criteria that are conducive to empirical adequacy but not necessarily to truth (and are also not theory-laden) constitutes a reply to (2). The prospects for a realist victory are at least somewhat promising, on a controversial assumption about the rate at which empirically successful theories emerge.
While Sartre scholars cannot fairly be described as being opposed to science, they have, for the most part, stayed aloof. The field of psychology, of course, has been an exception. Sartre himself felt compelled to present his own existential psychoanalysis by marking the parallels and differences between his position and traditional approaches, particularly the Freudian. The same is true with respect to his concept of bad faith and of emotional behavior. Scholars have followed his lead with richly productive results. But we may note that the debate has centered on psychic and therapeutic issues, aspects of what Sartre called le vécu or lived experience, rather than on the findings of cognitive science or neuroscience. Although all existentialists and phenomenologists accept as a central tenet the fact that consciousness is embodied, there has been virtually no concern with the biological substratum. But the study of consciousness cannot be restricted within its own narrow confines—unlike, say, Greek grammar, which can be learned without reference to the rules of Arabic. At some point, there must be established an organic foundation for the behavior of the conscious organism.
In recent literature on vagueness, writers have noted that more ‘plentiful’ theories of properties – those that postulate genuine properties corresponding to the classically vague predicates like ‘bald’ and ‘heap’ – appear straightforwardly committed to ontic vagueness. In this paper, however, I will argue that worries of ontic vagueness are not specific to ‘plentiful’ accounts of properties. The classically ‘sparse’ theories of properties – Universals and tropes – will, I contend, be subject to similar difficulties.
"To some people, life is very simple . . . no shadings and grays, all blacks and whites. . . . Now, others of us find that good, bad, right, wrong, are many-sided, complex things. We try to see every side; but the more we see, the less sure we are."
This paper addresses a significant gap in the conceptualization of business ethics within different cultural influences. Though theoretical models of business ethics have recognized the importance of culture in ethical decision-making, few have examined how this influences ethical decision-making. Therefore, this paper develops propositions concerning the influence of various cultural dimensions on ethical decision-making using Hofstede's typology.
"Technoscience" is now most commonly used in academic work to refer to sets of activities wherein science and technology have become inextricably intermingled, or else have hybridized in some sense. What, though, do we understand by "science" and by "technology"? The use of these terms has varied greatly, but their current use presumes a society with extensive institutional and occupational differentiation. Only in that kind of context may science and technology be treated as "other" in relation to "the rest" of the social order, whether as differentiated sets of practices or as specialized institutional forms. References to "technoscience" may then be taken to imply a reversal of earlier processes of cultural and institutional differentiation and/or a recombination of separate bodies of practice and skill. Either way a move back to a less differentiated state is implied, which makes it surprising that we appear to have very few memories of technoscience in periods less culturally and institutionally differentiated than our own, with a lower level of technical and intellectual division of labor. However, the elusiveness of our memories of technoscience may be significant mainly for what it suggests about our ways of conceptualizing the past, rather than for any insight it offers into that past "itself." We tend to identify practices and networks of practices in terms of functions. And it may be because, at different times, different functions have been selected as constitutive of practices, that "technoscience" has come to be regarded as something especially characteristic of the present.
In recent years there has been a resurgence of interest in property dualism—the view that some mental properties are neither identical with, nor strongly supervenient on, physical properties. One of the principal objections to this view is that, according to natural science, the physical world is a causally closed system. So if mental properties are really distinct from physical properties, then it would seem that mental properties never really cause anything that happens in the physical world. Thus, dualism threatens to lead inexorably to epiphenomenalism. In this paper, I will argue that the only way for a property dualist to avoid epiphenomenalism is to deny that the human body is strictly identical with the sum of its microphysical parts. I will go on to argue that the only way to sustain such anti-reductionism about the human body is to embrace some sort of substance-hylomorphism.
This paper proposes a solution to David Miller's Minnesotan-Arizonan demonstration of the language dependence of truthlikeness (Miller 1974), along with Miller's first-order demonstration of the same (Miller 1978). It is assumed, with Peter Urbach, that the implication of these demonstrations is that the very notion of truthlikeness is intrinsically language dependent and thus non-objective. As such, truthlikeness cannot supply a basis for an objective account of scientific progress. I argue that, while Miller is correct in arguing that the number of true atomic sentences of a false theory is language dependent, the number of known sentences (under certain straightforward assumptions) is conserved by translation; degree of knowledge, unlike truthlikeness, is thus a linguistically invariant notion. It is concluded that the objectivity of scientific progress must be grounded on the fact (noted in Cohen 1980) that knowledge, not mere truth, is the aim of science.
Philip Kitcher has proposed a theory of explanation based on the notion of unification. Despite the genuine interest and power of the theory, I argue here that the theory suffers from a fatal deficiency: It is intrinsically unable to account for the asymmetric structure of explanation, and thus ultimately falls prey to a problem similar to the one which beset Hempel's D-N model. I conclude that Kitcher is wrong to claim that one can settle the issue of an argument's explanatory force merely on the basis of considerations about the unifying power of the argument pattern the argument instantiates.
From a point of view like de Finetti's, what is the judgmental reality underlying the objectivistic claim that a physical magnitude X determines the objective probability that a hypothesis H is true? When you have definite conditional judgmental probabilities for H given the various unknown values of X, a plausible answer is sufficiency, i.e., invariance of those conditional probabilities as your probability distribution over the values of X varies. A different answer, in terms of conditional exchangeability, is offered for use when such definite conditional probabilities are absent.
The approach to decision theory floated in my 1965 book is reviewed (I), challenged in various related ways (II–V) and defended, first ad hoc (II–IV) and then by a general argument of Ellery Eells's (VI). Finally, causal decision theory (in a version sketched in VII) is exhibited as a special case of my 1965 theory, according to the Eellsian argument.
Isaac Levi and I have different views of probability and decision making. Here, without addressing the merits, I will try to answer some questions recently asked by Levi (1985) about what my view is, and how it relates to his.
A pluralistic scientific method is one that incorporates a variety of points of view in scientific inquiry. This paper investigates one example of pluralistic method: the use of weighted averaging in probability estimation. I consider two methods of weight determination, one based on disjoint evidence possession and the other on track record. I argue that weighted averaging provides a rational procedure for probability estimation under certain conditions. I consider a strategy for calculating ‘mixed weights’ which incorporate mixed information about agent credibility. I address various objections to the weighted averaging technique and conclude that the technique is a promising one in various respects.
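To fix ideas, the pooling step itself is just a convex combination of the individual estimates. The following sketch (the weights and numbers are hypothetical, not drawn from the paper) shows track-record-style weighting:

```python
def weighted_average(estimates, weights):
    """Pool several agents' probability estimates for one hypothesis
    as a weighted (convex) average. Weights are assumed non-negative
    and are normalized to sum to one before pooling."""
    total = sum(weights)
    return sum(p * w for p, w in zip(estimates, weights)) / total

# Three agents estimate the probability of the same hypothesis; the
# third has the best track record and so receives double weight.
print(round(weighted_average([0.6, 0.7, 0.9], [1.0, 1.0, 2.0]), 3))  # → 0.775
```

Because the weights are normalized, the pooled estimate always lies between the smallest and largest individual estimates, which is one reason the procedure remains a probability.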
Logicism Lite counts number-theoretical laws as logical for the same sort of reason for which physical laws are counted as empirical: because of the character of the data they are responsible to. In the case of number theory these are the data verifying or falsifying the simplest equations, which Logicism Lite counts as true or false depending on the logical validity or invalidity of first-order argument forms in which no number-theoretical notation appears.
The present study examines the relationships between consumers' ethical beliefs and personality traits. Based on a survey of 295 undergraduate business students, the authors found that individuals with high needs for autonomy, innovation, and aggression, as well as individuals with a high propensity for taking risks, tend to have less ethical beliefs concerning possible consumer actions. Individuals with a high need for social desirability and individuals with a strong problem-solving coping style tend to have more ethical beliefs concerning possible consumer actions. The needs for achievement, affiliation, and complexity and an emotion-focused coping style were not significantly correlated with consumer ethical beliefs.
The commonly perceived tension between authentic moral and ethical action and action involving tolerance is held to be the illusory product of an unduly individualistic frame of thought. Moral and ethical actions are produced not by independent individuals but by participants in cultural traditions. And even the wholly routine continuation of a single homogeneous tradition must always and invariably involve mutual tolerance: participants must interact not as independent individuals but as tolerant members. Tolerance deserves recognition, accordingly, as a primary virtue, not merely compatible with authentic moral and ethical action, but required by it. An explicit rhetoric enjoining tolerance needs to be understood as performative discourse employed to change, or else to sustain, the systems of tolerances in which all cultures, whether simple or differentiated, homogeneous or diverse, unified or fragmented, invariably consist.
Sherrilyn Roush's Tracking Truth provides a sustained and ambitious development of the basic idea that knowledge is true belief that tracks the truth. In this essay, I provide a quick synopsis of Roush's book and offer a substantive discussion of her analysis of scientific evidence. Roush argues that, for e to serve as evidence for h, it should be easier to determine the truth value of e than it is to determine the truth value of h, an ideal she refers to as leverage. She defends a detailed method by which the value of p(h|e) is computed without direct information about p(h) but only using evidence about the value of p(e), from which the value of p(h) is derived. She presents an example of how to use her leverage method, which I argue involves a certain critical mistake. I show how the leveraging method can be used in a way that is sound. I conclude with a few remarks about the importance of distinguishing clearly between prior and posterior probabilities.
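One natural reconstruction of the leveraging idea (a sketch under my own assumptions; the function name and numbers are illustrative, not Roush's own notation) inverts the law of total probability, p(e) = p(e|h)p(h) + p(e|~h)(1 - p(h)), to recover p(h) from an estimate of p(e), and then applies Bayes's theorem:

```python
def posterior_by_leverage(p_e, p_e_given_h, p_e_given_not_h):
    """Sketch of leveraging: recover the prior p(h) from an estimated
    p(e) via the law of total probability, then compute the posterior
    p(h|e) by Bayes's theorem. Illustrative reconstruction only."""
    # p(e) = p(e|h)p(h) + p(e|~h)(1 - p(h))  =>  solve for p(h):
    p_h = (p_e - p_e_given_not_h) / (p_e_given_h - p_e_given_not_h)
    return p_e_given_h * p_h / p_e

# With p(e) = 0.5, p(e|h) = 0.9, and p(e|~h) = 0.1, the recovered
# prior is p(h) = 0.5 and the posterior is p(h|e) = 0.9.
print(posterior_by_leverage(0.5, 0.9, 0.1))
```

The sketch makes vivid why keeping prior and posterior apart matters: p(h) is an intermediate quantity recovered from p(e), while p(h|e) is the output, and conflating the two is exactly the kind of mistake the essay discusses.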
Predictivism asserts that where evidence E confirms theory T, E provides stronger support for T when E is predicted on the basis of T and then confirmed than when E is known before T's construction and 'used', in some sense, in the construction of T. Among the most interesting attempts to argue that predictivism is a true thesis (under certain conditions) is that of Patrick Maher (1988, 1990, 1993). The purpose of this paper is to investigate the nature of predictivism using Maher's analysis as a starting point. I briefly summarize Maher's primary argument and expand upon it; I explore related issues pertaining to the causal structure of empirical domains and the logic of discovery.