Explaining the complex history of art to an individual or group can be extremely difficult. Explaining how to approach works of art from the vast history of the subject can be nearly impossible. An entire industry of books aimed at surveying art from a Western or global view is employed each semester at colleges and universities. Straightforward and condensed survey books such as Gardner's Art Through the Ages or museum guidebooks such as The Art Institute of Chicago: The Essential Guide are directed at providing at least a workable knowledge of art in general or the art to be encountered in a museum's collection. However, it is rare to find a book that attempts to explain how to approach works of art in any …
Todd argues for the integration of science and religion to form a new paradigm for the third millennium. He counters both the arguments made by fundamentalist Christians against science and the rejection of religion by the New Atheists, in particular Richard Dawkins and his followers. Drawing on the work of scientists, psychologists, philosophers, and theologians, Todd challenges the materialistic reductionism of our age and offers an alternative grounded in the visionary work taking place in a wide array of disciplines, including Jungian archetypal psychology, quantum mechanics, evolutionary biology, epistemology, neuroscience and an incarnational theology implicit in the evolutionary process.
Computer simulation and philosophy of science Content Type Journal Article Pages 1-4 DOI 10.1007/s11016-011-9567-8 Authors Wendy S. Parker, Department of Philosophy, Ellis Hall 202, Ohio University, Athens, OH 45701, USA Journal Metascience Online ISSN 1467-9981 Print ISSN 0815-0796.
At the International Legal Ethics Conference IV held at Stanford Law School between 15 and 17 July 2010, one of the two opening plenary sessions consisted of a panel who debated the proposition that legal ethics should be mandatory in legal education. The panel included leading legal ethics academics from jurisdictions around the world—both those where legal ethics is a compulsory part of the law degree and those where it is not. It comprised Professors Andrew Boon, Brent Cotter, Christine Parker, Stephen L Pepper and Richard Wu, and was organised and chaired by Professor Kim Economides. This is an edited version of the panel's discussion. It provides a useful summary of the state of legal ethics teaching in the jurisdictions represented as well as a marshalling of the arguments for and against legal ethics as a required course in the university law degree.
What follows is a dialogue, in the Platonic sense, concerning the justifications for "business ethics" as a vehicle for asking questions about the values of modern business organisations. The protagonists are the authors, Gordon Pearson – a pragmatist and sceptic where business ethics is concerned – and Martin Parker – a sociologist and idealist who wishes to be able to ask ethical questions of business. By the end of the dialogue we come to no agreement on the necessity or justification for business ethics, but on the way discuss the uses of philosophy, the meanings of integrity and trust, McDonald's, a hypothetical torture manufacturer and various other matters.
A comprehensive and systematic reconstruction of the philosophy of Charles S. Peirce, perhaps America's most far-ranging and original philosopher, which reveals the unity of his complex and influential body of thought. We are still in the early stages of understanding the thought of C. S. Peirce (1839-1914). Although much good work has been done in isolated areas, relatively little considers the Peircean system as a whole. Peirce made it his life's work to construct a scientifically sophisticated and logically rigorous philosophical system, culminating in a realist epistemology and a metaphysical theory ("synechism") that postulates the connectedness of all things in a universal evolutionary process. In The Continuity of Peirce's Thought, Kelly Parker shows how the principle of continuity functions in phenomenology and semeiotics, the two most novel and important of Peirce's philosophical sciences, which mediate between mathematics and metaphysics. Parker argues that Peirce's concept of continuity is the central organizing theme of the entire Peircean philosophical corpus. He explains how Peirce's unique conception of the mathematical continuum shapes the broad sweep of his thought, extending from mathematics to metaphysics and to religion. He thus provides a convenient and useful overview of Peirce's philosophical system, situating it within the history of ideas and mapping interconnections among the diverse areas of Peirce's work. This challenging yet helpful book adopts an innovative approach to achieve the ambitious goal of more fully understanding the interrelationship of all the elements in the entire corpus of Peirce's writings. Given Peirce's importance in fields ranging from philosophy to mathematics to literary and cultural studies, this new book should appeal to all who seek a fuller, unified understanding of the career and overarching contributions of Peirce, one of the key figures in the American philosophical tradition.
In that Case Content Type Journal Article DOI 10.1007/s11673-010-9261-3 Authors Malcolm Parker, School of Medicine, University of Queensland, Brisbane, Australia Journal Journal of Bioethical Inquiry Online ISSN 1872-4353 Print ISSN 1176-7529.
In this article, Walter Parker brings structure and agency to the foreground of the current tumult of public schooling in the United States. He focuses on three structures that are serving as rules and resources for creative agency. These are a discourse of derision about failing schools, a broad mobilization of multiculturalism, and an enduring nationalism. Drawing on Anthony Giddens's structuration theory, Parker examines how these discourses figure in redefining school reform, redefining school curricula, and requiring schools once again to serve nationalistic purposes.
Phylloxera, ‘big science’ and the nature of scientific debate Content Type Journal Article Category Book Review Pages 1-3 DOI 10.1007/s11016-012-9668-z Authors Cain Todd, Department of Politics, Philosophy and Religion, County South, Lancaster University, Lancaster, LA1 4YL UK Journal Metascience Online ISSN 1467-9981 Print ISSN 0815-0796.
Republication: In That Case Content Type Journal Article DOI 10.1007/s11673-010-9264-0 Authors Malcolm Parker, School of Medicine, University of Queensland, Brisbane, Australia Journal Journal of Bioethical Inquiry Online ISSN 1872-4353 Print ISSN 1176-7529.
Potter et al.’s (1999) response to my ‘Against Relativism in Psychology, on Balance’ (Parker, 1999) neatly summarizes what they take a ‘critical realist’ position to be and how ‘relativists’ should defend themselves. Their response also illustrates why the version of critical realism I elaborated is more thoroughly critically relativist than Potter et al. assume and how their version of relativism actually rests on a rather uncritical subscription to realism.
There is no uniquely standard concept of an effectively decidable set of real numbers or real n-tuples. Here we consider three notions: decidability up to measure zero [M.W. Parker, Undecidability in Rn: Riddled basins, the KAM tori, and the stability of the solar system, Phil. Sci. 70(2) (2003) 359–382], which we abbreviate d.m.z.; recursive approximability [or r.a.; K.-I. Ko, Complexity Theory of Real Functions, Birkhäuser, Boston, 1991]; and decidability ignoring boundaries [d.i.b.; W.C. Myrvold, The decision problem for entanglement, in: R.S. Cohen et al. (Eds.), Potentiality, Entanglement, and Passion-at-a-Distance: Quantum Mechanical Studies for Abner Shimony, Vol. 2, Kluwer Academic Publishers, Great Britain, 1997, pp. 177–190]. Unlike some others in the literature, these notions apply not only to certain nice sets, but to general sets in Rn and other appropriate spaces. We consider some motivations for these concepts and the logical relations between them. It has been argued that d.m.z. is especially appropriate for physical applications, and on Rn with the standard measure, it is strictly stronger than r.a. [M.W. Parker, Undecidability in Rn: Riddled basins, the KAM tori, and the stability of the solar system, Phil. Sci. 70(2) (2003) 359–382]. Here we show that this is the only implication that holds among our three decidabilities in that setting. Under arbitrary measures, even this implication fails. Yet for intervals of non-zero length, and more generally, convex sets of non-zero measure, the three concepts are equivalent.
Psychology is meant to help people cope with the afflictions of modern society. But how useful is it? Ian Parker argues that current psychological practice has become part of the problem rather than the solution. Ideal for undergraduates, this book unravels the discipline to reveal the conformist assumptions that underlie its theory and practice. Psychology focuses on the happiness of "the individual." Yet it neglects the fact that personal experience depends on social and political surroundings. Parker argues that a new approach to psychology is needed. He offers an alternative vision, outlining how debates in the discipline can be linked to political practice and how it can become part of a wider progressive agenda. Parker's groundbreaking book is at the cutting edge of current thinking on the discipline and should be required reading in all psychology courses.
'The more we enquire, the less we can resolve,' wrote Johnson. Scepticism-a reasoned emphasis on the severe limitations of rationality-would seem to undermine the grounds of belief and action. But in some of the best eighteenth-century literature, a theoretically paralysing critique of the pretensions of reason, precept, and language went hand in hand with a vigorous intellectual, moral, and linguistic confidence. To realise philosophical scepticism as literature was effectively to transform it. Dr Parker traces the presence of this life-giving irony in works by Pope, Hume, Sterne, and Johnson, relates it more broadly to the social self-consciousness of eighteenth-century culture, and discusses its source in Locke and its inspiration in Montaigne. The argument serves as a reminder that radical scepticism is not the invention of the late twentieth century, and that its strategies and implications have never been more interestingly explored than in the eighteenth.
Shanachie and Norm Content Type Journal Article Category Case Studies Pages 1-2 DOI 10.1007/s11673-012-9356-0 Authors Malcolm Parker, School of Medicine, The University of Queensland, 288 Herston Road, Herston, QLD 4006, Australia Journal Journal of Bioethical Inquiry Online ISSN 1872-4353 Print ISSN 1176-7529.
Cross-sector social partnerships (CSSPs) can produce benefits at individual, organizational, sectoral and societal levels. In this article, we argue that the distribution of benefits depends in part on the cognitive frames held by partnership participants. Based on Selsky and Parker's (J Manage 31(6):849-873, 2005) review of CSSPs, we identify three analytic "platforms" for social partnerships — the resource-dependence platform, the social-issue platform, and the societal-sector platform. We situate platforms as prospective sensemaking devices that help project managers make sense of partnerships by calling attention to certain desired features or downplaying other features. We describe the three platforms and contrast them on factors that influence social benefit, including orientation, learning, and power. We provide illustrations of each platform and demonstrate how the choice of platform is consequential for practice, such as how a partnership project gets started, evolves and produces social benefits.
Insight, by F. H. Parker.--Why be uncritical about the life-world? By H. B. Veatch.--Homage to Saint Anselm, by R. Jordan.--Art and philosophy, by J. M. Anderson.--The phenomenon of world, by R. R. Ehman.--The life-world and its historical horizon, by C. O. Schrag.--The Lebenswelt as ground and as Leib in Husserl: somatology, psychology, sociology, by E. Paci.--Life-world and structures, by C. A. van Peursen.--The miser, by E. W. Straus.--Monetary value and personal value, by G. Schrader.--Individualisms, by W. L. McBride.--Sartre the individualist, by W. Desan.--The nature of social man, by M. Natanson.--The problem of the will and philosophical discourse, by P. Ricoeur.--Structuralism and humanism, by M. Dufrenne.--The illusion of monolinear time, by N. Lawrence.--Can grammar be thought? By J. M. Edie.--The existentialist critique of objectivity, by S. J. Todes and H. L. Dreyfus.--Bibliography (p. 391-400).
This paper argues that there is no genuine puzzle of ‘imaginative resistance’. In part 1 of the paper I argue that the imaginability of fictional propositions is relative to a range of different factors including the ‘thickness’ of certain concepts, and certain pre-theoretical and theoretical commitments. I suggest that those holding realist moral commitments may be more susceptible to resistance and inability than those holding non-realist commitments, and that it is such realist commitments that ultimately motivate the problem. However, I argue that the relativity of imaginability is not a particularly puzzling feature of imagination. In part 2, I claim that it is the so-called ‘alethic’ puzzle, concerning fictional truth, which generates a real puzzle about imaginative resistance. However, I argue that the alethic puzzle itself depends on certain realist assumptions about the nature of fictional truth which are implausible and should be rejected in favour of an interpretive view of fictional truth. Once this is done, I contend, it becomes evident that the supposed problem of imaginative resistance as it has hitherto been discussed in the literature is not puzzling at all.
There are several argumentative strategies for advancing the thesis that moral responsibility is incompatible with causal determinism. One prominent such strategy is to argue that agents who meet compatibilist conditions for moral responsibility can nevertheless be subject to responsibility-undermining manipulation. In this paper, I argue that incompatibilists advancing manipulation arguments against compatibilism have been shouldering an unnecessarily heavy dialectical burden. Traditional manipulation arguments present cases in which manipulated agents meet all compatibilist conditions for moral responsibility, but are (allegedly) not responsible for their behavior. I argue, however, that incompatibilists can make do with the more modest (and harder to resist) claim that the manipulation in question is mitigating with respect to moral responsibility. The focus solely on whether a manipulated agent is or is not morally responsible has, I believe, masked the full force of manipulation-style arguments against compatibilism. Here, I aim to unveil their real power.
A number of recent discussions comparing computer simulation and traditional experimentation have focused on the significance of “materiality.” I challenge several claims emerging from this work and suggest that computer simulation studies are material experiments in a straightforward sense. After discussing some of the implications of this material status for the epistemology of computer simulation, I consider the extent to which materiality (in a particular sense) is important when it comes to making justified inferences about target systems on the basis of experimental results.
This article identifies conditions under which robust predictive modeling results have special epistemic significance---related to truth, confidence, and security---and considers whether those conditions hold in the context of present-day climate modeling. The findings are disappointing. When today’s climate models agree that an interesting hypothesis about future climate change is true, it cannot be inferred---via the arguments considered here anyway---that the hypothesis is likely to be true or that scientists’ confidence in the hypothesis should be significantly increased or that a claim to have evidence for the hypothesis is now more secure.
Lloyd (2009) contends that climate models are confirmed by various instances of fit between their output and observational data. The present paper argues that what these instances of fit might confirm are not climate models themselves, but rather hypotheses about the adequacy of climate models for particular purposes. This required shift in thinking—from confirming climate models to confirming their adequacy-for-purpose—may sound trivial, but it is shown to complicate the evaluation of climate models considerably, both in principle and in practice.
With the question "What is 'discourse'?" as the starting point, this paper addresses ways of identifying particular discourses, and attends to how these discourses should be distinguished from texts. The emergence of discourse analysis within psychology, and the continuing influence of linguistic and post-structuralist ideas on practitioners, provide the basis on which discourse-analytic research can be developed fruitfully. This paper discusses the descriptive, analytic and educative functions of discourse analysis, and addresses the cultural and political questions which arise when discourse analysts reflect on their activity. Suggestions for an adequate definition of discourse are proposed and supported by seven criteria which should be adopted to identify discourses, and which attend to contradictions between and within them. Three additional criteria are then suggested to relate discourse analysis to wider political issues.
In his recent essay in the Philosophical Review, “Truth and Freedom,” Trenton Merricks contends (among other things) that the basic argument for the incompatibility of God's foreknowledge and human freedom is question-begging. He relies on a “truism” to the effect that truth depends on the world and not the other way around. The present essay argues that mere invocation of this truism does not establish that the basic argument for incompatibilism is question-begging. Further, it seeks to clarify important elements of the debate, including the fixity-of-the-past premise in the incompatibilist's argument and the Ockhamist response. It sketches some potential links between the issues here and recent work on ontological dependence, and it connects the issues raised by Merricks to important work that has appeared in (among other places) the Philosophical Review.
Any theoretician constructing a serious model of consciousness should carefully assess the details of empirical data generated in the neurosciences and psychology. A failure to account for those details may cast doubt on the adequacy of that model. This paper presents a case in point. Dennett and Kinsbourne's (Dennett, D., & Kinsbourne, M. (1992). Time and the observer: The where and when of consciousness in the brain. Behavioral and Brain Sciences, 15, 183-243) assault on the materialist version of the Cartesian Theater model of the mind relies significantly on the superiority of their Multiple Drafts model of consciousness as an explanation of the phenomenon of metacontrast. However, their description of metacontrast is, in important ways, inadequate. The result is that their explanation of how the Multiple Drafts model handles this phenomenon fails to account for the actual data. In this paper I offer a more complete description of metacontrast, show how Dennett and Kinsbourne's explanation fails, and argue that there are good theoretical reasons for choosing the so-called Stalinesque model over the so-called Orwellian model.
Central to Fischer and Ravizza's theory of moral responsibility is the concept of guidance control, which involves two conditions: (1) moderate reasons-responsiveness, and (2) mechanism ownership. We raise a worry for Fischer and Ravizza's account of (1). If an agent acts contrary to reasons which he could not recognize, this should lead us to conclude that he is not morally responsible for his behaviour; but according to Fischer and Ravizza's account, he satisfies the conditions for guidance control and is therefore morally responsible. We consider ways in which the account of guidance control might be mended.
Multiple drug resistant strains of HIV and continuing difficulties with vaccine development highlight the importance of psychological interventions which aim to influence the psychosocial and emotional factors empirically demonstrated to be significant predictors of immunity, illness progression and AIDS mortality in seropositive persons. Such data have profound implications for psychological interventions designed to modify psychosocial factors predictive of enhanced risk of exposure to HIV as well as the neuroendocrine and immune mechanisms mediating the impact of such factors on disease progression. Many of these factors can be construed as unconscious mental ones, and psychoanalytic self-psychology may be a useful framework for conceptualizing psychic and immune defence as well as bodily and self-integration in HIV infection. Although further prospective studies and cross-cultural validation of research are necessary, existing data suggest that psychoanalytic insights may be useful both in therapeutic interventions and evaluative research which would require an underlying epistemology of the complementarity of mind and matter.
My primary aim in this paper is to outline a quasi-realist theory of aesthetic judgement. Robert Hopkins has recently argued against the plausibility of this project because he claims that quasi-realism cannot explain a central component of any expressivist understanding of aesthetic judgements, namely their supposed ‘autonomy’. I argue against Hopkins’s claims by contending that Roger Scruton’s aesthetic attitude theory, centred on his account of the imagination, provides us with the means to develop a plausible quasi-realist account of aesthetic judgement. Finally, I respond to two recent attempts to discredit the validity of the notion of aesthetic autonomy. I claim that both fail adequately to address the underlying non-realist motivations and justifications for maintaining the principle.
Allan Franklin has identified a number of strategies that scientists use to build confidence in experimental results. This paper shows that Franklin's strategies have direct analogues in the context of computer simulation and then suggests that one of his strategies—the so-called 'Sherlock Holmes' strategy—deserves a privileged place within the epistemologies of experiment and simulation. In particular, it is argued that while the successful application of even several of Franklin's other strategies (or their analogues in simulation) may not be sufficient for justified belief in results, the successful application of a slightly elaborated version of the Sherlock Holmes strategy is sufficient.
How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics – simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data – that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program. Key Words: adaptive toolbox; bounded rationality; decision making; elimination models; environment structure; heuristics; ignorance-based reasoning; limited information search; robustness; satisficing; simplicity.
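The "one-reason decision making" described in the précis can be made concrete with a small sketch. The following Python fragment is an illustrative sketch only, not Gigerenzer et al.'s implementation: it mimics a Take The Best-style heuristic, where cues are searched in order of validity and the first cue that discriminates decides. The cue names, validity values, and city data are invented for illustration.

```python
# Illustrative sketch of one-reason decision making (Take The Best style).
# All cues, validities, and objects below are hypothetical examples.

def take_the_best(a, b, cues):
    """Choose between objects a and b using cues ordered by validity.

    `cues` is a list of (cue_function, validity) pairs; each cue_function
    maps an object to 1, 0, or None (unknown). Search stops at the first
    cue that discriminates -- a single reason decides.
    """
    for cue, _validity in sorted(cues, key=lambda c: c[1], reverse=True):
        va, vb = cue(a), cue(b)
        if va is not None and vb is not None and va != vb:
            return a if va > vb else b  # decide on this one cue; stop search
    return None  # no cue discriminates: the heuristic must guess

# Hypothetical example: which of two cities is larger?
cities = {
    "A": {"has_airport": 1, "capital": 1},
    "B": {"has_airport": 1, "capital": 0},
}
cues = [
    (lambda c: cities[c]["has_airport"], 0.7),
    (lambda c: cities[c]["capital"], 0.9),
]
# The "capital" cue (validity 0.9) is tried first and already discriminates,
# so "has_airport" is never consulted.
print(take_the_best("A", "B", cues))
```

The point of the sketch is frugality: the decision uses one cue and ignores the rest, which is why such heuristics can remain robust when generalizing to new data.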
We consider an approach to some philosophical problems that I call the Method of Conceptual Articulation: to recognize that a question may lack any determinate answer, and to re-engineer concepts so that the question acquires a definite answer in such a way as to serve the epistemic motivations behind the question. As a case study we examine “Galileo’s Paradox”, that the perfect square numbers seem to be at once as numerous as the whole numbers, by one-to-one correspondence, and yet less numerous, being a proper subset. I argue that Cantor resolved this paradox by a method at least close to that proposed—not by discovering the true nature of cardinal number, but by articulating several useful and appealing extensions of number to the infinite. Galileo was right to suggest that the concept of relative size did not apply to the infinite, for the concept he possessed did not. Nor was Bolzano simply wrong to reject Hume’s Principle (that one-to-one correspondence implies equal number) in the infinitary case, in favor of Euclid’s Common Notion 5 (that the whole is greater than the part), for the concept of cardinal number (in the sense of “number of elements”) was not clearly defined for infinite collections. Order extension theorems now suggest that a theory of cardinality upholding Euclid’s principle instead of Hume’s is possible. Cantor’s refinements of number are not the only ones possible, and they appear to have been shaped by motivations and fruitfulness, for they evolved in discernible stages correlated with emerging applications and results. Galileo, Bolzano, and Cantor shared interests in the particulate analysis of the continuum and in physical applications. Cantor’s concepts proved fruitful for those pursuits.
Finally, Gödel was mistaken to claim that Cantor’s concept of cardinality is forced on us; though Gödel gives an intuitively compelling argument, he ignores the fact that Euclid’s Common Notion is also intuitively compelling, and we are therefore forced to make a choice. The success of Cantor’s concept of cardinality lies not in its truth (for concepts are not true or false), nor its uniqueness (for it is not the only extension of number possible), but in its intuitive appeal, and most of all, its usefulness to the understanding.
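The two intuitions the abstract pits against each other can be exhibited directly. The following Python fragment is only an illustrative sketch (the cutoff n = 1000 is an arbitrary choice): under Hume's Principle the map k ↦ k² pairs the naturals with the squares exactly, while on any finite initial segment Euclid's Common Notion 5 pulls the other way, since the squares there form a proper subset.

```python
# Galileo's Paradox in miniature: the squares are "as many as" the naturals
# by one-to-one correspondence, yet "fewer" as a proper subset of a segment.
n = 1000  # arbitrary finite cutoff for illustration

naturals = set(range(1, n + 1))
squares = {k * k for k in range(1, n + 1)}

# Hume's Principle: k -> k*k is a bijection, so both sets have n elements,
# even though the squares thin out rapidly.
assert len(naturals) == len(squares) == n

# Euclid's Common Notion 5: within 1..n the squares are a proper subset
# (only floor(sqrt(n)) of them fit), so "the whole is greater than the part".
squares_below_n = {s for s in squares if s <= n}
assert squares_below_n < naturals  # strict subset comparison on sets
print(len(squares_below_n))       # floor(sqrt(1000)) = 31
```

Both assertions hold at once, which is precisely the tension that, on the abstract's account, Cantor resolved by articulating a refined concept of cardinality rather than by discovering which intuition was "true".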
The influence of direct-to-consumer advertising and physician promotions is examined in this study. We further examine some of the ethical issues which may arise when physicians accept promotional products from pharmaceutical companies. The data revealed that direct-to-consumer advertising is likely to increase the request rates of both the drug category and the drug brand choices, as well as the likelihood that those drugs will be prescribed by physicians. The data further revealed that the majority of responding physicians were either neutral or did not feel that accepting some types of gifts from pharmaceutical companies affected their ethical behaviors.
In this article I examine the status of putative aesthetic judgements in science and mathematics. I argue that if the judgements at issue are taken to be genuinely aesthetic they can be divided into two types, positing either a disjunction or connection between aesthetic and epistemic criteria in theory/proof assessment. I show that both types of claim face serious difficulties in explaining the purported role of aesthetic judgements in these areas. I claim that the best current explanation of this role, McAllister's 'aesthetic induction' model, fails to demonstrate that the judgements at issue are genuinely aesthetic. I argue that, in light of these considerations, there are strong reasons for suspecting that many, and perhaps all, of the supposedly aesthetic claims are not genuinely aesthetic but are in fact 'masked' epistemic assessments.
After showing how Deborah Mayo’s error-statistical philosophy of science might be applied to address important questions about the evidential status of computer simulation results, I argue that an error-statistical perspective offers an interesting new way of thinking about computer simulation models and has the potential to significantly improve the practice of simulation model evaluation. Though intended primarily as a contribution to the epistemology of simulation, the analysis also serves to fill in details of Mayo’s epistemology of experiment.
This article distinguishes two different senses of information-theoretic approaches to statistical mechanics that are often conflated in the literature: those relating to the thermodynamic cost of computational processes and those that offer an interpretation of statistical mechanics where the probabilities are treated as epistemic. This distinction is then investigated through Earman and Norton’s () ‘sound’ and ‘profound’ dilemma for information-theoretic exorcisms of Maxwell’s demon. It is argued that Earman and Norton fail to countenance a ‘sound’ information-theoretic interpretation and this paper describes how the latter inferential interpretations can escape the criticisms of Earman and Norton () and Norton () by adopting this ‘sound’ horn. This article considers a standard model of Maxwell’s demon to illustrate how one might adopt an information-theoretic approach to statistical mechanics without a reliance on Landauer’s principle, where the incompressibility of the probability distribution due to Liouville’s theorem is taken as the central feature of such an interpretation.
To study Earth’s climate, scientists now use a variety of computer simulation models. These models disagree in some of their assumptions about the climate system, yet they are used together as complementary resources for investigating future climatic change. This paper examines and defends this use of incompatible models. I argue that climate model pluralism results both from uncertainty concerning how to best represent the climate system and from difficulties faced in evaluating the relative merits of complex models. I describe how incompatible climate models are used together in ‘multi-model ensembles’ and explain why this practice is reasonable, given scientists’ inability to identify a ‘best’ model for predicting future climate. Finally, I characterize climate model pluralism as involving both an ontic competitive pluralism and a pragmatic integrative pluralism.
Dennett and Kinsbourne claim that visual masking provides a clear illustration that ‘there is really only a verbal difference’ between two versions of the Cartesian Theater model of the mind. This alleged lack of a distinction is both the crucial premise of their main argument against the Cartesian Theater and a motivator for accepting their own Multiple Drafts model. I argue that metacontrast reveals a difference between the two versions of the Cartesian Theater that meets criteria found in (Dennett and Kinsbourne ) for determining such a difference. This difference undermines the soundness of their argument against the Cartesian Theater, and exerts pressure on Dennett and Kinsbourne to offer a more detailed articulation of their model. Introduction Brief Explanation of Metacontrast Backward Visual Masking The Stalinesque and Orwellian Models of Metacontrast 3.1 Criteria for determining a difference A Difference That Makes a Difference 4.1 Skeptical hypothesis objection Other Objections and Replies 5.1 Straw person objection 5.2 Corroborative issues objection Conclusion