The Minimum Information for Biological and Biomedical Investigations (MIBBI) project aims to foster the coordinated development of minimum-information checklists and provide a resource for those exploring the range of extant checklists.
The too-often unhappy 'marriage' of political theory and political science has long been a source of anguish for both partners. Should this troubled partnership be dissolved? Or might this marriage yet be saved? Ball answers the former question negatively and the latter affirmatively. Playing the part of therapist instead of theorist, he selectively recounts a number of episodes which estranged the partners and strained the marriage. And yet, he concludes that the conflicts were in hindsight more constructive than destructive, benefiting both partners in heretofore unexpected ways and perhaps paving a path toward reconciliation and rapprochement.
In this lively and entertaining book, Terence Ball maintains that 'classic' works in political theory continue to speak to us only if they are periodically re-read and reinterpreted from alternative perspectives. That, the author contends, is how these works became classics, and why they are regarded as such. Ball suggests a way of reading that is both 'pluralist' and 'problem-driven'--pluralist in that there is no one right way to read a text, and problem-driven in that the reinterpretation is motivated by problems that emerge while reading these texts. In addition, the subsequent readings and interpretations become more and more suffused with the interpretations of others. This tour de force, always entertaining and eclectic, focuses on the core problems surrounding many of the major thinkers. Was Machiavelli really amoral? Why did language matter so much to Hobbes--and why should it matter to us? Are the roots of the totalitarian state to be found in Rousseau? Were the utilitarians sexist in their view of the franchise? The author's aim is to show how a pluralist and problem-centered approach can shed new light on old and recent works in political theory, and on the controversies that continue over their meaning and significance. Written in a lively and accessible style, the book will provoke debate among students and scholars alike.
Torin Alter (2013) attempts to rescue phenomenal concepts and the knowledge argument from the critique of Ball 2009 by appealing to conceptual mastery. I show that Alter’s appeal fails, and describe general features of conceptual mastery that suggest that no such appeal could succeed.
Patterns are everywhere in nature - in the ranks of clouds in the sky, the stripes of an angelfish, the arrangement of petals in flowers. Where does this order and regularity come from? It creates itself. The patterns we see come from self-organization. Whether living or non-living, scientists have found that there is a pattern-forming tendency inherent in the basic structure and processes of nature, so that from a few simple themes, and the repetition of simple rules, endless beautiful variations can arise. Part of a trilogy of books exploring the science of patterns in nature, acclaimed science writer Philip Ball here looks at how shapes form. From soap bubbles to honeycombs, delicate shell patterns, and even the developing body parts of a complex animal like ourselves, he uncovers patterns in growth and form in all corners of the natural world, explains how these patterns are self-made, and why similar shapes and structures may be found in very different settings, orchestrated by nothing more than simple physical forces. This book will make you look at the world with fresh eyes, seeing order and form even in the places you'd least expect.
It has long been widely agreed that some concepts can be possessed only by those who have undergone a certain type of phenomenal experience. Orthodoxy among contemporary philosophers of mind has it that these phenomenal concepts provide the key to understanding many disputes between physicalists and their opponents, and in particular offer an explanation of Mary’s predicament in the situation exploited by Frank Jackson's knowledge argument. I reject the orthodox view; I deny that there are phenomenal concepts. My arguments exploit the sort of considerations that are typically used to motivate externalism about mental content. Although physicalists often appeal to phenomenal concepts to defend their view against the knowledge argument, I argue that this is a mistake. The knowledge argument depends on phenomenal concepts; if there are no phenomenal concepts, then the knowledge argument fails.
Physicalists about the mind are committed to claims about property identities. Following Kripke's well-known discussion, modal arguments have emerged as major threats to such claims. This paper argues that modal arguments can be resisted by adopting a counterpart theoretic account of modal claims, and in particular modal claims involving properties. Thus physicalists have a powerful motive to adopt non-Kripkean accounts of the metaphysics of modality and the semantics of modal expressions.
It is widely believed that Twin-Earth-style thought experiments show that the contents of a person's thoughts fail to supervene on her intrinsic properties. Several recent philosophers have made the further claim that Twin-Earth-style thought experiments produce metaphysically necessary conditions for the possession of certain concepts. I argue that the latter view is false, and produce counterexamples to several proposed conditions. My thesis is of particular interest because it undermines some attempts to show that externalism is incompatible with privileged access.
Certain puzzling cases have been discussed in the literature recently which appear to support the thought that knowledge can be obtained by way of deduction from a falsehood; moreover, these cases put pressure, prima facie, on the thesis of counter closure for knowledge. We argue that the cases do not involve knowledge from falsehood; despite appearances, the false beliefs in the cases in question are causally, and therefore epistemologically, incidental, and knowledge is achieved despite falsehood. We also show that the principle of counter closure, and the concomitant denial of knowledge from falsehood, is well motivated by considerations in epistemological theory--in particular, by the view that knowledge is first in the epistemological order of things.
Though modern non-cognitivists in ethics characteristically believe that values are irreducible to facts, they nevertheless believe that values are determined by facts, viz., those specified in functionalist, explanatory theories of the evolutionary origin of morality. The present paper probes the consistency of this position. The conventionalist theories of Hume and Harman are examined, and are seen not to establish a tight determinative reduction of values to facts. This result is illustrated by reference to recent theories of the sociobiological mechanisms involved in moral evolution. Though explanatory theories have linguistic implications, exaggerated in Harman's linguistic form of social relativism, there is also failure to establish the semantic reductionism which non-cognitivists reject under the rubric of ethical naturalism. It is concluded that explanatory forms of naturalism, the best of which is a functionalist-utilitarian account, are compatible with the fact/value distinction.
Stewart Cohen’s (1984) New Evil Demon argument raises familiar and widely discussed concerns for reliabilist accounts of epistemic justification. A now standard response to this argument, initiated by Alvin Goldman (1988) and Ernest Sosa (1993; 2001), involves distinguishing different notions of justification. Juan Comesaña (2002b; 2010) has recently and prominently claimed that his Indexical Reliabilism (IR) offers a novel solution in this tradition. We argue, however, that Comesaña’s proposal suffers serious difficulties from the perspective of the philosophy of language. More specifically, we show that the two readings of sentences involving the word ‘justified’ which are required for Comesaña’s solution to the problem are not recoverable within the two-dimensional framework of Robert Stalnaker (1999) to which he appeals. We then consider, and reject, an attempt to overcome this difficulty by appeal to a complication of the theory involving counterfactuals, and conclude the paper by sketching our own preferred solution to Cohen’s New Evil Demon.
This paper examines the study of computer-based performance monitoring (CBPM) in the workplace as an issue dominated by questions of ethics. Its central contention is that any investigation of ethical monitoring practice is inadequate if it simply applies best practice guidelines to any one context to indicate whether practice is, on balance, ethical or not. The broader social dynamics of access to procedural and distributive justice, examined through a fine-grained approach to the study of workplace social relations and workplace identity construction, are also important here. This has three implications, which are examined in the paper, and are as follows: First, that it is vital for any empirical investigation of the ethics of CBPM practice to take into account not only its compliance with preexisting best practice guidelines, but also the social relations which pervade the context of its application. Second, that this necessitates a particular epistemological treatment of CBPM as something whose effects are measurable and identifiable, as well as something which has a socially constructed meaning and is tropic in nature. Third, that existing debates against which this argument is set, which regard contrasting epistemologies and ontologies as incompatible, should be addressed, and an alternative introduced. Introducing situated knowledges (Haraway 1991) and material semiotic ontologies as such an alternative, the paper proceeds to analyse the ethics of a particular case of monitoring practice, Norco. Drawing on Marx (1998), the paper concludes that a fine-grained analysis of the social is vital if we are to understand fully the ethics of monitoring in the workplace.
Millians about proper names typically claim that it is knowable a priori that Hesperus is Phosphorus. We argue that they should claim instead that it is knowable only a posteriori that Hesperus is Hesperus, since the Kripke-Putnam epistemic arguments against descriptivism are special cases of Quinean arguments that nothing is knowable a priori, and Millians have no resources to resist the more general Quinean arguments.
This paper discusses some problems with the field of educational studies and considers the role of post-structuralist theory in shifting the study of education away from a 'technical rationalist' approach (as evidenced in the case of much research on educational management and school effectiveness) towards an 'intellectual intelligence' stance that stresses contingency, disidentification and risk-taking.
Van den Belt recently examined the notion that synthetic biology and the creation of ‘artificial’ organisms are examples of scientists ‘playing God’. Here I respond to some of the issues he raises, including some of his comments on my previous discussions of the value of the term ‘life’ as a scientific concept.
Gibbard's theory of rationality is evolutionary in terms of its result as well as its underpinning argument. The result is that judgments about what is rational are analyzed as being similar to judgments of morality — in view of what Darwin suggests concerning the latter. According to the Darwinian theory, moral judgments are based on sentiments which evolve to promote the survival and welfare of human societies. On Gibbard's theory, rationality judgments should be similarly regarded as expressing emotional attachments to behavioral norms which originate and function to coordinate social interaction. Consequently, Gibbard's theory of rationality might be used to illuminate Darwin's theory of morality, and vice versa. Additionally, as argued in the present essay, both can be further elaborated, and defended, by developing related themes in philosophical ethics: viz., connected with Hume and 20th-century emotivists. The main problem is that this general Darwinian approach faces widespread opposition nowadays, not only in ethics but in philosophy of science. The purpose of this essay is to analyze Gibbard's theory, critically and constructively, with emphasis on the pertinent commonalities in Darwin, Hume and the emotivists, while also critically addressing their common enemies. The pervasive methodological orientation is to relate this analysis to (philosophy of) science in general, and biological science in particular.
The Theory Theory (TT) versus Simulation Theory (ST) debate is primarily concerned with how we understand others’ mental states. Theory theorists claim we do this using rules that are akin to theoretical laws, whereas simulation theorists claim we use our own minds to imagine ourselves in another’s position. Theorists from both camps suggest a consideration of individuals with autism spectrum disorders (ASD) can help resolve the TT/ST debate (e.g., Baron-Cohen 1995; Carruthers 1996a; Goldman 2006). We present a three-part argument that such research has so far been inconclusive and that the prospects for studies of ASD to resolve the debate in the near future remain uncertain. First, we discuss evidence indicating that some individuals with ASD can perform effectively on tests of mental state understanding, which questions what ASD can tell us regarding theorising or simulation. Second, we claim that there is compelling evidence that domain-general mechanisms are implicated in mental state reasoning, which undermines how ASD might inform the TT/ST debate given that both theories appeal to domain-specific mindreading mechanisms. Third, we suggest that neuroscientific evidence for an assumed role of the mirror neuron system in autism also fails to arbitrate between TT and ST. We suggest that while the study of ASD may eventually provide a resolution to the TT/ST debate, it is also vital for researchers to examine the issues through other avenues, for example, by examining people’s everyday counterfactual reasoning with mental state scenarios.
In this article, I offer a new analysis of knowledge: knowledge, I claim, is normal belief. I begin with what I take to be the conceptual truth that knowledge is epistemically justified, or permissible, belief. I then argue that this in turn is simply doxastically normal belief, first clarifying what is meant by this claim, and then providing reasons to think that normal belief, so understood, must be true and safe from error, making it a good candidate for knowledge.
Contemporary theories of justice fail to recognize that the concepts constitutive of our political practices, including 'justice' itself, have historically mutable meanings. To recognize the fact of conceptual change entails an alteration in our understanding of justice between generations. Because there can be no transhistorical theory of justice, there can be no valid theory of intergenerational justice either, especially where the generations in question are distant ones having very different understandings of justice. The upshot is that an earlier generation cannot aspire to act justly toward a later distant generation whose members' understanding of justice differs radically from theirs. Conceptual change and incommensurability render the very idea of intergenerational justice incoherent. Even so, such radical relativism need not entail moral nihilism.
Using key writings in the sociology of consumption and consumerism and analyses of the nature of postmodern society, this paper considers how parents decide upon a secondary school and the nature of their engagement with the education market.
This paper develops and critiques the two-dimensionalist account of mental content developed by David Chalmers. I first explain Chalmers’s account and show that it resists some popular criticisms. I then argue that the main interest of two-dimensionalism lies in its accounts of cognitive significance and of the connection between conceivability and possibility. These accounts hinge on the claim that some thoughts have a primary intension that is necessarily true. In this respect, they are Carnapian, and subject to broadly Quinean attack. The remainder of the paper advances such an attack. I argue that there are possible thinkers who are willing to revise their beliefs in response to expert testimony (in a way made familiar by Burge’s famous cases), and that such thinkers will have no thoughts with necessary primary intensions. I even suggest that many actual humans may well be such thinkers. I go on to argue that these possible thinkers show that the two-dimensionalist accounts fail.
An experiment is reported examining dual-process models of belief bias in syllogistic reasoning using a problem complexity manipulation and an inspection-time method to monitor processing latencies for premises and conclusions. Endorsement rates indicated increased belief bias on complex problems, a finding that runs counter to the “belief-first” selective scrutiny model, but which is consistent with other theories, including “reasoning-first” and “parallel-process” models. Inspection-time data revealed a number of effects that, again, counted against the selective scrutiny model. The most striking inspection-time result was an interaction between logic and belief on premise-processing times, whereby belief–logic conflict problems promoted increased latencies relative to non-conflict problems. This finding challenges belief-first and reasoning-first models, but is directly predicted by parallel-process models, which assume that the outputs of simultaneous heuristic and analytic processing streams lead to an awareness of belief–logic conflicts that then require time-consuming resolution.