Computer Simulation and Philosophy of Science. Journal article by Wendy S. Parker (Department of Philosophy, Ellis Hall 202, Ohio University, Athens, OH 45701, USA). Metascience, pp. 1-4. DOI: 10.1007/s11016-011-9567-8. Online ISSN 1467-9981; Print ISSN 0815-0796.
At the International Legal Ethics Conference IV held at Stanford Law School between 15 and 17 July 2010, one of the two opening plenary sessions consisted of a panel that debated the proposition that legal ethics should be mandatory in legal education. The panel included leading legal ethics academics from jurisdictions around the world—both those where legal ethics is a compulsory part of the law degree and those where it is not. It comprised Professors Andrew Boon, Brent Cotter, Christine Parker, Stephen L Pepper and Richard Wu, and was organised and chaired by Professor Kim Economides. This is an edited version of the panel's discussion. It provides a useful summary of the state of legal ethics teaching in the jurisdictions represented as well as a marshalling of the arguments for and against legal ethics as a required course in the university law degree.
The Regress Argument is supposed to show that the language of thought hypothesis results in an infinite regress in its explanation of such things as learning, meaning, and understanding. Earlier (in Laurence & Margolis 1997) we argued that the Regress Argument doesn’t work and that even the language of thought’s supporters have given the Regress Argument far too much credit. In this paper, we respond to a critique of our earlier discussion.
What follows is a dialogue, in the Platonic sense, concerning the justifications for "business ethics" as a vehicle for asking questions about the values of modern business organisations. The protagonists are the authors, Gordon Pearson – a pragmatist and sceptic where business ethics is concerned – and Martin Parker – a sociologist and idealist who wishes to be able to ask ethical questions of business. By the end of the dialogue we come to no agreement on the necessity or justification for business ethics, but on the way discuss the uses of philosophy, the meanings of integrity and trust, McDonald's, a hypothetical torture manufacturer and various other matters.
A comprehensive and systematic reconstruction of the philosophy of Charles S. Peirce, perhaps America's most far-ranging and original philosopher, which reveals the unity of his complex and influential body of thought. We are still in the early stages of understanding the thought of C. S. Peirce (1839-1914). Although much good work has been done in isolated areas, relatively little considers the Peircean system as a whole. Peirce made it his life's work to construct a scientifically sophisticated and logically rigorous philosophical system, culminating in a realist epistemology and a metaphysical theory ("synechism") that postulates the connectedness of all things in a universal evolutionary process. In The Continuity of Peirce's Thought, Kelly Parker shows how the principle of continuity functions in phenomenology and semeiotics, the two most novel and important of Peirce's philosophical sciences, which mediate between mathematics and metaphysics. Parker argues that Peirce's concept of continuity is the central organizing theme of the entire Peircean philosophical corpus. He explains how Peirce's unique conception of the mathematical continuum shapes the broad sweep of his thought, extending from mathematics to metaphysics and religion. He thus provides a convenient and useful overview of Peirce's philosophical system, situating it within the history of ideas and mapping interconnections among the diverse areas of Peirce's work. This challenging yet helpful book adopts an innovative approach to achieve the ambitious goal of more fully understanding the interrelationship of all the elements in the entire corpus of Peirce's writings. Given Peirce's importance in fields ranging from philosophy to mathematics to literary and cultural studies, this new book should appeal to all who seek a fuller, unified understanding of the career and overarching contributions of Peirce, one of the key figures in the American philosophical tradition.
In That Case. Journal article by Malcolm Parker (School of Medicine, University of Queensland, Brisbane, Australia). Journal of Bioethical Inquiry. DOI: 10.1007/s11673-010-9261-3. Online ISSN 1872-4353; Print ISSN 1176-7529.
In this article, Walter Parker brings structure and agency to the foreground of the current tumult of public schooling in the United States. He focuses on three structures that are serving as rules and resources for creative agency. These are a discourse of derision about failing schools, a broad mobilization of multiculturalism, and an enduring nationalism. Drawing on Anthony Giddens's structuration theory, Parker examines how these discourses figure in redefining school reform, redefining school curricula, and requiring schools once again to serve nationalistic purposes.
Republication: In That Case. Journal article by Malcolm Parker (School of Medicine, University of Queensland, Brisbane, Australia). Journal of Bioethical Inquiry. DOI: 10.1007/s11673-010-9264-0. Online ISSN 1872-4353; Print ISSN 1176-7529.
Potter et al.’s (1999) response to my ‘Against Relativism in Psychology, on Balance’ (Parker, 1999) neatly summarizes what they take a ‘critical realist’ position to be and how ‘relativists’ should defend themselves. Their response also illustrates why the version of critical realism I elaborated is more thoroughly critically relativist than Potter et al. assume and how their version of relativism actually rests on a rather uncritical subscription to realism.
There is no uniquely standard concept of an effectively decidable set of real numbers or real n-tuples. Here we consider three notions: decidability up to measure zero [M.W. Parker, Undecidability in Rn: Riddled basins, the KAM tori, and the stability of the solar system, Phil. Sci. 70(2) (2003) 359–382], which we abbreviate d.m.z.; recursive approximability [or r.a.; K.-I. Ko, Complexity Theory of Real Functions, Birkhäuser, Boston, 1991]; and decidability ignoring boundaries [d.i.b.; W.C. Myrvold, The decision problem for entanglement, in: R.S. Cohen et al. (Eds.), Potentiality, Entanglement, and Passion-at-a-Distance: Quantum Mechanical Studies for Abner Shimony, Vol. 2, Kluwer Academic Publishers, Great Britain, 1997, pp. 177–190]. Unlike some others in the literature, these notions apply not only to certain nice sets, but to general sets in Rn and other appropriate spaces. We consider some motivations for these concepts and the logical relations between them. It has been argued that d.m.z. is especially appropriate for physical applications, and on Rn with the standard measure, it is strictly stronger than r.a. [M.W. Parker, Undecidability in Rn: Riddled basins, the KAM tori, and the stability of the solar system, Phil. Sci. 70(2) (2003) 359–382]. Here we show that this is the only implication that holds among our three decidabilities in that setting. Under arbitrary measures, even this implication fails. Yet for intervals of non-zero length, and more generally, convex sets of non-zero measure, the three concepts are equivalent.
Psychology is meant to help people cope with the afflictions of modern society. But how useful is it? Ian Parker argues that current psychological practice has become part of the problem rather than the solution. Ideal for undergraduates, this book unravels the discipline to reveal the conformist assumptions that underlie its theory and practice. Psychology focuses on the happiness of "the individual." Yet it neglects the fact that personal experience depends on social and political surroundings. Parker argues that a new approach to psychology is needed. He offers an alternative vision, outlining how debates in the discipline can be linked to political practice and how it can become part of a wider progressive agenda. Parker's groundbreaking book is at the cutting edge of current thinking on the discipline and should be required reading in all psychology courses.
'The more we enquire, the less we can resolve,' wrote Johnson. Scepticism, a reasoned emphasis on the severe limitations of rationality, would seem to undermine the grounds of belief and action. But in some of the best eighteenth-century literature, a theoretically paralysing critique of the pretensions of reason, precept, and language went hand in hand with a vigorous intellectual, moral, and linguistic confidence. To realise philosophical scepticism as literature was effectively to transform it. Dr Parker traces the presence of this life-giving irony in works by Pope, Hume, Sterne, and Johnson, relates it more broadly to the social self-consciousness of eighteenth-century culture, and discusses its source in Locke and its inspiration in Montaigne. The argument serves as a reminder that radical scepticism is not the invention of the late twentieth century, and that its strategies and implications have never been more interestingly explored than in the eighteenth.
Shanachie and Norm. Journal article (case study) by Malcolm Parker (School of Medicine, The University of Queensland, 288 Herston Road, Herston, QLD 4006, Australia). Journal of Bioethical Inquiry, pp. 1-2. DOI: 10.1007/s11673-012-9356-0. Online ISSN 1872-4353; Print ISSN 1176-7529.
Cross-sector social partnerships (CSSPs) can produce benefits at individual, organizational, sectoral and societal levels. In this article, we argue that the distribution of benefits depends in part on the cognitive frames held by partnership participants. Based on Selsky and Parker's (J Manage 31(6):849-873, 2005) review of CSSPs, we identify three analytic "platforms" for social partnerships — the resource-dependence platform, the social-issue platform, and the societal-sector platform. We situate platforms as prospective sensemaking devices that help project managers make sense of partnerships by calling attention to certain desired features or downplaying other features. We describe the three platforms and contrast them on factors that influence social benefit, including orientation, learning, and power. We provide illustrations of each platform and demonstrate how the choice of platform is consequential for practice, such as how a partnership project gets started, evolves and produces social benefits.
Insight, by F. H. Parker.--Why be uncritical about the life-world? By H. B. Veatch.--Homage to Saint Anselm, by R. Jordan.--Art and philosophy, by J. M. Anderson.--The phenomenon of world, by R. R. Ehman.--The life-world and its historical horizon, by C. O. Schrag.--The Lebenswelt as ground and as Leib in Husserl: somatology, psychology, sociology, by E. Paci.--Life-world and structures, by C. A. van Peursen.--The miser, by E. W. Straus.--Monetary value and personal value, by G. Schrader.--Individualisms, by W. L. McBride.--Sartre the individualist, by W. Desan.--The nature of social man, by M. Natanson.--The problem of the will and philosophical discourse, by P. Ricoeur.--Structuralism and humanism, by M. Dufrenne.--The illusion of monolinear time, by N. Lawrence.--Can grammar be thought? By J. M. Edie.--The existentialist critique of objectivity, by S. J. Todes and H. L. Dreyfus.--Bibliography (p. 391-400).
Noam Chomsky's Poverty of the Stimulus Argument is one of the most famous and controversial arguments in the study of language and the mind. Though widely endorsed by linguists, the argument has met with much resistance in philosophy. Unfortunately, philosophical critics have often failed to fully appreciate the power of the argument. In this paper, we provide a systematic presentation of the Poverty of the Stimulus Argument, clarifying its structure, content, and evidential base. We defend the argument against a variety of philosophical criticisms, new and old, and argue that the Poverty of the Stimulus Argument continues to deserve its guiding role in the study of language and the mind.
Conceptual analysis is undergoing a revival in philosophy, and much of the credit goes to Frank Jackson. Jackson argues that conceptual analysis is needed as an integral component of so-called serious metaphysics and that it also does explanatory work in accounting for such phenomena as categorization, meaning change, communication, and linguistic understanding. He even goes so far as to argue that opponents of conceptual analysis are implicitly committed to it in practice. We show that he is wrong on all of these points and that his case for conceptual analysis doesn't succeed.
What is a concept? Philosophers have given many different answers to this question, reflecting a wide variety of approaches to the study of mind and language. Nonetheless, at the most general level, there are two dominant frameworks in contemporary philosophy. One proposes that concepts are mental representations, while the other proposes that they are abstract objects. This paper looks at the differences between these two approaches, the prospects for combining them, and the issues that are involved in the dispute. We argue that powerful motivations have been offered in support of both frameworks. This suggests the possibility of combining the two. Unlike Frege, we hold that the resulting position is perfectly coherent and well worth considering. Nonetheless, we argue that it should be rejected along with the view that concepts are abstract objects.
A number of recent discussions comparing computer simulation and traditional experimentation have focused on the significance of "materiality." I challenge several claims emerging from this work and suggest that computer simulation studies are material experiments in a straightforward sense. After discussing some of the implications of this material status for the epistemology of computer simulation, I consider the extent to which materiality (in a particular sense) is important when it comes to making justified inferences about target systems on the basis of experimental results.
This entry provides an overview of theories of concepts that is organized around five philosophical issues: (1) the ontology of concepts, (2) the structure of concepts, (3) empiricism and nativism about concepts, (4) concepts and natural language, and (5) concepts and conceptual analysis.
Lloyd (2009) contends that climate models are confirmed by various instances of fit between their output and observational data. The present paper argues that what these instances of fit might confirm are not climate models themselves, but rather hypotheses about the adequacy of climate models for particular purposes. This required shift in thinking—from confirming climate models to confirming their adequacy-for-purpose—may sound trivial, but it is shown to complicate the evaluation of climate models considerably, both in principle and in practice.
Given the fundamental role that concepts play in theories of cognition, philosophers and cognitive scientists have a common interest in concepts. Nonetheless, there is a great deal of controversy regarding what kinds of things concepts are, how they are structured, and how they are acquired. This chapter offers a detailed high-level overview and critical evaluation of the main theories of concepts and their motivations. Taking into account the various challenges that each theory faces, the chapter also presents a novel approach to concepts that is organized around two ideas. The first is a pluralistic view of differing types of conceptual structure. The second is a model that treats concepts as atomic representations that are linked to various types of conceptual structures.
With the question "What is 'discourse'?" as the starting point, this paper addresses ways of identifying particular discourses, and attends to how these discourses should be distinguished from texts. The emergence of discourse analysis within psychology, and the continuing influence of linguistic and post-structuralist ideas on practitioners, provide the basis on which discourse-analytic research can be developed fruitfully. This paper discusses the descriptive, analytic and educative functions of discourse analysis, and addresses the cultural and political questions which arise when discourse analysts reflect on their activity. Suggestions for an adequate definition of discourse are proposed and supported by seven criteria which should be adopted to identify discourses, and which attend to contradictions between and within them. Three additional criteria are then suggested to relate discourse analysis to wider political issues.
Radical concept nativism is the thesis that virtually all lexical concepts are innate. Notoriously endorsed by Jerry Fodor (1975, 1981), radical concept nativism has had few supporters. However, it has proven difficult to say exactly what’s wrong with Fodor’s argument. We show that previous responses are inadequate on a number of grounds. Chief among these is that they typically do not achieve sufficient distance from Fodor’s dialectic, and, as a result, they do not illuminate the central question of how new primitive concepts are acquired. To achieve a fully satisfactory response to Fodor’s argument, one has to juxtapose questions about conceptual content with questions about cognitive development. To this end, we formulate a general schema for thinking about how concepts are acquired and then present a detailed illustration.
At least since W. V. O. Quine's famous critique of the analytic/synthetic distinction, philosophers have been deeply divided over whether there are any analytic truths. One line of thought suggests that the simple fact that people have 'intuitions of analyticity' might provide an independent argument for analyticities. If defenders of analyticity can explain these intuitions and opponents cannot, then perhaps there are analyticities after all. We argue that opponents of analyticity have some unexpected resources for explaining these intuitions and that, accordingly, the argument from intuition fails.
In an important recent discussion of analyticity, Paul Boghossian (1997) argues for the following three claims: (i) While Quine’s well-known arguments against analyticity do undermine one type of analyticity (what Boghossian calls metaphysical analyticity), they fail to undermine another type (what he calls epistemic analyticity). (ii) Epistemic analyticity explains the a prioricity of logic and perhaps even the a prioricity of conceptual truths.
The Language of Thought Hypothesis is often taken to have the fatal flaw that it generates an explanatory regress. The language of thought is invoked to explain certain features of natural language (e.g., that it is learned, understood, and is meaningful), but, according to the regress argument, the language of thought itself has these same features and hence no explanatory progress has been made. We argue that such arguments rely on the tacit assumption that the entire motivation for the language of thought consists in explaining the explanandum that allegedly generates the regress. But this tacit assumption is simply false. The Language of Thought Hypothesis is a cogent view and one with considerable explanatory advantages.
Hilary Putnam's Twin Earth thought experiment has come to have an enormous impact on contemporary philosophical thought. But while most of the discussion has taken place within the context of the philosophy of mind and language, Terence Horgan and Mark Timmons (H&T) have defended the intriguing suggestion that a variation on the original thought experiment has important consequences for ethics. In a series of papers, they've developed the idea of a Moral Twin Earth and have argued that its significance is that it has the resources to undermine naturalistic versions of moral realism. H&T don't hold back in their assessment. "Moral Twin...
Allan Franklin has identified a number of strategies that scientists use to build confidence in experimental results. This paper shows that Franklin's strategies have direct analogues in the context of computer simulation and then suggests that one of his strategies—the so-called 'Sherlock Holmes' strategy—deserves a privileged place within the epistemologies of experiment and simulation. In particular, it is argued that while the successful application of even several of Franklin's other strategies (or their analogues in simulation) may not be sufficient for justified belief in results, the successful application of a slightly elaborated version of the Sherlock Holmes strategy is sufficient.
This is the first volume of a projected three-volume set on the subject of innateness. The extent to which the mind is innate is one of the central questions in the human sciences, with important implications for many surrounding debates. By bringing together the top nativist scholars in philosophy, psychology, and allied disciplines these volumes provide a comprehensive assessment of nativist thought and a definitive reference point for future nativist inquiry. The Innate Mind: Structure and Content concerns the fundamental architecture of the mind, addressing such questions as: What capacities, processes, representations, biases, and connections are innate? How do these innate elements feed into a story about the development of our mature cognitive capacities, and which of them are shared with other members of the animal kingdom? The editors have provided an introduction giving some of the background to debates about innateness and introducing each of the subsequent essays, as well as a consolidated bibliography that will be a valuable reference resource for all those interested in this area. The volume will be of great importance to all researchers and students interested in the fundamental nature and powers of the human mind. Together, the three volumes in the series will provide the most intensive and richly cross-disciplinary investigation of nativism ever undertaken. They point the way toward a synthesis of nativist work that promises to provide a new understanding of our minds and their place in the natural order.
Strong nativist views about numerical concepts claim that human beings have at least some innate precise numerical representations. Weak nativist views claim only that humans, like other animals, possess an innate system for representing approximate numerical quantity. We present a new strong nativist model of the origins of numerical concepts and defend the strong nativist approach against recent cross-cultural studies that have been interpreted to show that precise numerical concepts are dependent on language and that they are restricted to speakers of languages with the right kind of structure.
We consider an approach to some philosophical problems that I call the Method of Conceptual Articulation: to recognize that a question may lack any determinate answer, and to re-engineer concepts so that the question acquires a definite answer in such a way as to serve the epistemic motivations behind the question. As a case study we examine “Galileo’s Paradox”, that the perfect square numbers seem to be at once as numerous as the whole numbers, by one-to-one correspondence, and yet less numerous, being a proper subset. I argue that Cantor resolved this paradox by a method at least close to that proposed—not by discovering the true nature of cardinal number, but by articulating several useful and appealing extensions of number to the infinite. Galileo was right to suggest that the concept of relative size did not apply to the infinite, for the concept he possessed did not. Nor was Bolzano simply wrong to reject Hume’s Principle (that one-to-one correspondence implies equal number) in the infinitary case, in favor of Euclid’s Common Notion 5 (that the whole is greater than the part), for the concept of cardinal number (in the sense of “number of elements”) was not clearly defined for infinite collections. Order extension theorems now suggest that a theory of cardinality upholding Euclid’s principle instead of Hume’s is possible. Cantor’s refinements of number are not the only ones possible, and they appear to have been shaped by motivations and fruitfulness, for they evolved in discernible stages correlated with emerging applications and results. Galileo, Bolzano, and Cantor shared interests in the particulate analysis of the continuum and in physical applications. Cantor’s concepts proved fruitful for those pursuits. Finally, Gödel was mistaken to claim that Cantor’s concept of cardinality is forced on us; though Gödel gives an intuitively compelling argument, he ignores the fact that Euclid’s Common Notion is also intuitively compelling, and we are therefore forced to make a choice. The success of Cantor’s concept of cardinality lies not in its truth (for concepts are not true or false), nor its uniqueness (for it is not the only extension of number possible), but in its intuitive appeal, and most of all, its usefulness to the understanding.
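The two size criteria at the heart of Galileo's Paradox can be made concrete in a few lines of Python (an illustrative sketch only, not from the paper; the helper name `squares_up_to` is ours):

```python
import math

def squares_up_to(n):
    """Perfect squares among 1..n (helper name chosen for this sketch)."""
    return [k for k in range(1, n + 1) if math.isqrt(k) ** 2 == k]

N = 100
squares = squares_up_to(N)

# Euclid's Common Notion 5 (the whole is greater than the part): within
# any initial segment {1..N}, the squares form a proper subset, so by
# inclusion there are "fewer" squares than naturals.
assert len(squares) < N

# Hume's Principle (one-to-one correspondence implies equal number):
# k -> k**2 pairs {1..m} exactly with the first m squares, so by
# correspondence there are "just as many" squares as naturals.
m = 10
pairs = [(k, k * k) for k in range(1, m + 1)]
assert [b for _, b in pairs] == squares_up_to(m * m)

# On finite sets the two criteria never conflict, because a bijection
# between finite sets forces equal cardinality; only for infinite
# collections do they pull apart, which is the choice point the paper
# locates in Cantor's work.
print(len(squares))  # 10 squares up to 100
```

The sketch shows why the paradox has no grip on finite collections: both criteria are computable there and always agree, so the concept of "same size" only becomes underdetermined when extended to the infinite.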
The influence of direct-to-consumer advertising and physician promotions is examined in this study. We further examine some of the ethical issues which may arise when physicians accept promotional products from pharmaceutical companies. The data revealed that direct-to-consumer advertising is likely to increase the request rates of both the drug category and the drug brand choices, as well as the likelihood that those drugs will be prescribed by physicians. The data further revealed that the majority of responding physicians were either neutral or did not feel that accepting some types of gifts from pharmaceutical companies affected their ethical behaviors.
After showing how Deborah Mayo’s error-statistical philosophy of science might be applied to address important questions about the evidential status of computer simulation results, I argue that an error-statistical perspective offers an interesting new way of thinking about computer simulation models and has the potential to significantly improve the practice of simulation model evaluation. Though intended primarily as a contribution to the epistemology of simulation, the analysis also serves to fill in details of Mayo’s epistemology of experiment.
To study Earth’s climate, scientists now use a variety of computer simulation models. These models disagree in some of their assumptions about the climate system, yet they are used together as complementary resources for investigating future climatic change. This paper examines and defends this use of incompatible models. I argue that climate model pluralism results both from uncertainty concerning how to best represent the climate system and from difficulties faced in evaluating the relative merits of complex models. I describe how incompatible climate models are used together in ‘multi-model ensembles’ and explain why this practice is reasonable, given scientists’ inability to identify a ‘best’ model for predicting future climate. Finally, I characterize climate model pluralism as involving both an ontic competitive pluralism and a pragmatic integrative pluralism.
One of the most important abilities we have as humans is the ability to think about number. In this chapter, we examine the question of whether there is an essential connection between language and number. We provide a careful examination of two prominent theories according to which concepts of the positive integers are dependent on language. The first of these claims that language creates the positive integers on the basis of an innate capacity to represent real numbers. The second claims that language’s function is to integrate contents from modules that humans share with other animals. We argue that neither model is successful.
This paper examines Boltzmann’s responses to the Loschmidt reversibility objection to the H-theorem, as presented in his Lectures on Gas Theory. I describe and evaluate two distinct conceptions of the assumption of molecular disorder found in this work, and contrast these notions with the Stosszahlansatz, as well as with the predominant contemporary conception of molecular disorder. Both these conceptions are assessed with respect to the reversibility objection. Finally, I interpret Boltzmann as claiming that a state of molecular disorder serves as a necessary condition for the application of probabilistic arguments. This in turn offers a way to bridge the conceptual gap between the H-theorem and his combinatorial argument.
In a survey of his views in the philosophy of mind, David Lewis criticizes much recent work in the field by attacking an imaginary opponent, Strawman. His case against Strawman focuses on four central theses which Lewis takes to be widely accepted among contemporary philosophers of mind. These theses concern (1) the language of thought hypothesis and its relation to folk psychology, (2) narrow content, (3) de se content, and (4) rationality. We respond to Lewis, arguing (among other things) that he underestimates Strawman’s theoretical resources in a variety of important ways.