A peer instruction model was used whereby 78 residence dons (36 males, 42 females) provided instruction regarding academic integrity for 324 students (125 males, 196 females) under their supervision. Quantitative and qualitative analyses were conducted to assess survey responses from both the dons and students regarding presentation content, quality, and learning. Overall, dons consistently identified information-based slides about academic integrity as the most important material for the presentations, indicating that fundamental information was needed. Although student ratings of the usefulness of the presentations were middling, students did indicate knowledge gains. Both interest and personal value for academic integrity were highly predictive of positive evaluations of the presentations. Dons and students provided suggestions for improvement and identified more global concerns.
Perhaps the most salient feature of Rawls's theory of justice which at once attracts supporters and repels critics is its apparent egalitarian conclusion as to how economic goods are to be distributed. Indeed, many of Rawls's sympathizers may find this result intuitively appealing, and regard it as Rawls's enduring contribution to the topic of economic justice, despite technical deficiencies in Rawls's contractarian, decision-theoretic argument for it which occupy the bulk of the critical literature. Rawls himself, having proposed a “coherence” theory of justification in metaethics, must regard the claim that his distributive criterion “is a strongly egalitarian conception” as independently a part of the overarching moral argument. The alleged egalitarian impact of Rawls's theory is crucial again in normative ethics where Rawls is thought to have developed a major counter-theory to utilitarianism, one of the most popular criticisms of which has been its alleged inadequacy in handling questions of distributive justice. Utilitarians can argue, however, as Brandt recently has, that the diminishing marginal utility of money, along with ignorance of income-welfare curves, would require a utility-maximizing distribution to be substantially egalitarian. The challenge is therefore for Rawls to show that his theory yields an ethically preferable degree of equality.
Theories of economic justice are characteristically based on abstract ethical concerns often unrelated to practical distributive results. Two decades ago, Rawls's theory of justice began as a reaction against the alleged ‘sacrifices’ condoned by utilitarian theory. One variant of this objection is that utilitarianism permits gross inequalities, severe deprivations of individual liberty, or even the enslavement of society's least well-off individuals. There are, however, more subtle forms of the objection. In Rawls, it is often raised without any claim that utilitarianism does in fact imply such gross deprivations in actual real-world circumstances. A second variant hinges, rather, on the milder claim that utilitarianism could condone such deprivations or sacrifices in some possible world—the objection being that utilitarianism improperly makes justice contingent, or uncertain, in this way. A third, still more abstract, variant would be that utilitarianism is flawed—not because of any practical distributive result, actual or hypothetical, but in theory—due to the way it treats individuals' interests, or the ‘concept of persons’ it presupposes.
Torin Alter (2013) attempts to rescue phenomenal concepts and the knowledge argument from the critique of Ball 2009 by appealing to conceptual mastery. I show that Alter’s appeal fails, and describe general features of conceptual mastery that suggest that no such appeal could succeed.
In this lively and entertaining book, Terence Ball maintains that 'classic' works in political theory continue to speak to us only if they are periodically re-read and reinterpreted from alternative perspectives. That, the author contends, is how these works became classics, and why they are regarded as such. Ball suggests a way of reading that is both 'pluralist' and 'problem-driven'--pluralist in that there is no one right way to read a text, and problem-driven in that the reinterpretation is motivated by problems that emerge while reading these texts. In addition, the subsequent readings and interpretations become more and more suffused with the interpretations of others. This tour de force, always entertaining and eclectic, focuses on the core problems surrounding many of the major thinkers. Was Machiavelli really amoral? Why did language matter so much to Hobbes--and why should it matter to us? Are the roots of the totalitarian state to be found in Rousseau? Were the utilitarians sexist in their view of the franchise? The author's aim is to show how a pluralist and problem-centered approach can shed new light on old and recent works in political theory, and on the controversies that continue over their meaning and significance. Written in a lively and accessible style, the book will provoke debate among students and scholars alike.
The too-often unhappy 'marriage' of political theory and political science has long been a source of anguish for both partners. Should this troubled partnership be dissolved? Or might this marriage yet be saved? Ball answers the former question negatively and the latter affirmatively. Playing the part of therapist instead of theorist, he selectively recounts a number of episodes which estranged the partners and strained the marriage. And yet, he concludes that the conflicts were in hindsight more constructive than destructive, benefiting both partners in heretofore unexpected ways and perhaps paving a path toward reconciliation and rapprochement.
Philip Ball explores the science of the shapes we see in nature, revealing how, from the stripes of a zebra to the development of a snowflake or even a human embryo, there is a pattern-forming tendency in the basic processes of nature, and from a few simple themes, and the repetition of simple rules, endless beautiful variations can arise.
Certain puzzling cases have been discussed in the literature recently which appear to support the thought that knowledge can be obtained by way of deduction from a falsehood; moreover, these cases put pressure, prima facie, on the thesis of counter closure for knowledge. We argue that the cases do not involve knowledge from falsehood; despite appearances, the false beliefs in the cases in question are causally, and therefore epistemologically, incidental, and knowledge is achieved despite falsehood. We also show that the principle of counter closure, and the concomitant denial of knowledge from falsehood, are well motivated by considerations in epistemological theory--in particular, by the view that knowledge is first in the epistemological order of things.
Although widely studied in other domains, relatively little is known about the metacognitive processes that monitor and control behaviour during reasoning and decision-making. In this paper, we examined the conditions under which two fluency cues are used to monitor initial reasoning: answer fluency, or the speed with which the initial, intuitive answer is produced, and perceptual fluency, or the ease with which problems can be read. The first two experiments demonstrated that answer fluency reliably predicted Feeling of Rightness judgments to conditional inferences and base rate problems, which subsequently predicted the amount of deliberate processing as measured by thinking time and answer changes; answer fluency also predicted retrospective confidence judgments. Moreover, the effect of answer fluency on reasoning was independent from the effect of perceptual fluency, establishing that these are empirically independent constructs. In five experiments with a variety of reasoning problems similar to those of Alter et al., we found no effect of perceptual fluency on FOR, retrospective confidence or accuracy; however, we did observe that participants spent more time thinking about hard-to-read stimuli, although this additional time did not result in answer changes. In our final two experiments, we found that perceptual disfluency increased accuracy on the CRT, but only amongst participants of high cognitive ability. As Alter et al.’s samples were gathered from prestigious universities, collectively, the data to this point suggest that perceptual fluency prompts additional processing in general, but this processing may result in higher accuracy only for the most cognitively able.
It has long been widely agreed that some concepts can be possessed only by those who have undergone a certain type of phenomenal experience. Orthodoxy among contemporary philosophers of mind has it that these phenomenal concepts provide the key to understanding many disputes between physicalists and their opponents, and in particular offer an explanation of Mary’s predicament in the situation exploited by Frank Jackson's knowledge argument. I reject the orthodox view; I deny that there are phenomenal concepts. My arguments exploit the sort of considerations that are typically used to motivate externalism about mental content. Although physicalists often appeal to phenomenal concepts to defend their view against the knowledge argument, I argue that this is a mistake. The knowledge argument depends on phenomenal concepts; if there are no phenomenal concepts, then the knowledge argument fails.
We report an experiment investigating the “special-process” theory of insight problem solving, which claims that insight arises from non-conscious, non-reportable processes that enable problem re-structuring. We predicted that reducing opportunities for speech-based processing during insight problem solving should permit special processes to function more effectively and gain conscious awareness, thereby facilitating insight. We distracted speech-based processing by using either articulatory suppression or irrelevant speech, with findings for these conditions supporting the predicted insight facilitation effect relative to silent working or thinking aloud. The latter condition was included to investigate the currently contested effect of “verbal overshadowing” on insight, whereby thinking aloud is claimed to hinder the operation of special, non-reportable processes. Whilst verbal overshadowing was not evident in final solution rates, there was nevertheless support for verbal overshadowing up to and beyond..
(2013). Matching bias in syllogistic reasoning: Evidence for a dual-process account from response times and confidence ratings. Thinking & Reasoning, 19(1), 54-77. doi: 10.1080/13546783.2012.735622.
An experiment is reported examining dual-process models of belief bias in syllogistic reasoning using a problem complexity manipulation and an inspection-time method to monitor processing latencies for premises and conclusions. Endorsement rates indicated increased belief bias on complex problems, a finding that runs counter to the “belief-first” selective scrutiny model, but which is consistent with other theories, including “reasoning-first” and “parallel-process” models. Inspection-time data revealed a number of effects that, again, arbitrated against the selective scrutiny model. The most striking inspection-time result was an interaction between logic and belief on premise-processing times, whereby belief-logic conflict problems promoted increased latencies relative to non-conflict problems. This finding challenges belief-first and reasoning-first models, but is directly predicted by parallel-process models, which assume that the outputs of simultaneous heuristic and analytic processing streams lead to an awareness of belief-logic conflicts that then require time-consuming resolution.
Frank Hindriks has attempted to derive a variant of Timothy Williamson’s knowledge rule for assertion on the basis of a more fundamental belief expression analysis of that speech act. I show that his attempted derivation involves a crucial equivocation between two senses of ‘must,’ and therefore fails. I suggest two possible repairs; but I argue that even if they are successful, we should prefer Williamson’s fully general knowledge rule to Hindriks’s restricted moral norm.
Stewart Cohen’s New Evil Demon argument raises familiar and widely discussed concerns for reliabilist accounts of epistemic justification. A now standard response to this argument, initiated by Alvin Goldman and Ernest Sosa, involves distinguishing different notions of justification. Juan Comesaña has recently and prominently claimed that his Indexical Reliabilism (IR) offers a novel solution in this tradition. We argue, however, that Comesaña’s proposal suffers serious difficulties from the perspective of the philosophy of language. More specifically, we show that the two readings of sentences involving the word ‘justified’ which are required for Comesaña’s solution to the problem are not recoverable within the two-dimensional framework of Robert Stalnaker to which he appeals. We then consider, and reject, an attempt to overcome this difficulty by appeal to a complication of the theory involving counterfactuals, and conclude the paper by sketching our own preferred solution to Cohen’s New Evil Demon.
Physicalists about the mind are committed to claims about property identities. Following Kripke's well-known discussion, modal arguments have emerged as major threats to such claims. This paper argues that modal arguments can be resisted by adopting a counterpart theoretic account of modal claims, and in particular modal claims involving properties. Thus physicalists have a powerful motive to adopt non-Kripkean accounts of the metaphysics of modality and the semantics of modal expressions.
In this article, I offer a new analysis of knowledge: knowledge, I claim, is normal belief. I begin with what I take to be the conceptual truth that knowledge is epistemically justified, or permissible, belief. I then argue that this in turn is simply doxastically normal belief, first clarifying what is meant by this claim, and then providing reasons to think that normal belief, so understood, must be true and safe from error, making it a good candidate for knowledge.
This paper is based on case-study research in four English secondary schools. It explores the pressure placed on English and mathematics departments because their results are reported in annual performance tables. It examines how English and maths departments enact policies of achievement, the additional power and extra resources that the pressure to achieve brings, and the possibility of resistance.
Two experiments are reported that employed think-aloud methods to test predictions concerning relevance effects and rationalisation processes derivable from Evans' (1996) heuristic-analytic theory of the selection task. Evans' account proposes that card selections are triggered by relevance-determining heuristics, with analytic processing serving merely to rationalise heuristically cued decisions. As such, selected cards should be associated with more references to both their facing and their hidden sides than rejected cards, which are not subjected to analytic rationalisation. Experiment 1 used a standard selection-task paradigm, with negative components permuted through abstract conditional rules. Support was found for all heuristic-analytic predictions. This evidence was shown to be robust in Experiment 2, where "select, don't select" decisions were enforced for all cards. Both experiments also clarify the role played by secondary heuristics in cueing the consideration of hidden card values during rationalisation. We suggest that whilst Evans' heuristic-analytic model and Oaksford and Chater's (e.g., 2003) optimal data selection model can provide compelling accounts of our protocol findings, the mental models theory fares less well as an explanation of our full dataset.
In recent years there has been an upsurge of research aimed at removing the mystery from insight and creative problem solving. The present special issue reflects this expanding field. Overall the papers gathered here converge on a nuanced view of insight and creative thinking as arising from multiple processes that can yield surprising solutions through a mixture of “special” Type 1 processes and “routine” Type 2 processes.
Millians about proper names typically claim that it is knowable apriori that Hesperus is Phosphorus. We argue that they should claim instead that it is knowable only aposteriori that Hesperus is Hesperus, since the Kripke-Putnam epistemic arguments against descriptivism are special cases of Quinean arguments that nothing is knowable apriori, and Millians have no resources to resist the more general Quinean arguments.
We applaud many aspects of Elqayam & Evans' (E&E's) call for a descriptivist research programme in studying reasoning. Nevertheless, we contend that normative benchmarks are vital for understanding individual differences in performance. We argue that the presence of normative responses to particular problems by certain individuals should inspire researchers to look for converging evidence for analytic processing that may have a normative basis.
It is widely believed that Twin-Earth-style thought experiments show that the contents of a person's thoughts fail to supervene on her intrinsic properties. Several recent philosophers have made the further claim that Twin-Earth-style thought experiments produce metaphysically necessary conditions for the possession of certain concepts. I argue that the latter view is false, and produce counterexamples to several proposed conditions. My thesis is of particular interest because it undermines some attempts to show that externalism is incompatible with privileged access.
There are two views of the essences of speech acts: according to one view, they are natural kinds; according to the other, they are what I call normative kinds—kinds in the (possibly non-reductive) definition of which some normative term occurs. In this article I show that speech acts can be normative but also natural kinds by deriving Williamson's account of assertion, on which it is an act individuated, and constitutively governed, by a norm (the knowledge rule), from a consideration of the natural characteristics of normal cases of its performance.
The Theory Theory (TT) versus Simulation Theory (ST) debate is primarily concerned with how we understand others’ mental states. Theory theorists claim we do this using rules that are akin to theoretical laws, whereas simulation theorists claim we use our own minds to imagine ourselves in another’s position. Theorists from both camps suggest a consideration of individuals with autism spectrum disorders (ASD) can help resolve the TT/ST debate (e.g., Baron-Cohen 1995; Carruthers 1996a; Goldman 2006). We present a three-part argument that such research has so far been inconclusive and that the prospects for studies of ASD to resolve the debate in the near future remain uncertain. First, we discuss evidence indicating that some individuals with ASD can perform effectively on tests of mental state understanding, which questions what ASD can tell us regarding theorising or simulation. Second, we claim that there is compelling evidence that domain-general mechanisms are implicated in mental state reasoning, which undermines how ASD might inform the TT/ST debate given that both theories appeal to domain-specific mindreading mechanisms. Third, we suggest that neuroscientific evidence for an assumed role of the mirror neuron system in autism also fails to arbitrate between TT and ST. We suggest that while the study of ASD may eventually provide a resolution to the TT/ST debate, it is also vital for researchers to examine the issues through other avenues, for example, by examining people’s everyday counterfactual reasoning with mental state scenarios.
Laboratory-based studies of problem solving suggest that transfer of solution principles from an analogue to a target arises only minimally without the presence of directive hints. Recently, however, real-world studies indicate that experts frequently and spontaneously use analogies in domain-based problem solving. There is also some evidence that in certain circumstances domain novices can draw analogies designed to illustrate arguments. It is less clear, however, whether domain novices can invoke analogies in the sophisticated manner of experts to enable them to progress problem solving. In the current study groups of novices and experts tackled large-scale management problems. Spontaneous analogising was observed in both conditions, with no marked differences between expertise levels in the frequency, structure, or function of analogising. On average four analogies were generated by groups per hour, with significantly more relational mappings between analogue and target being produced than superficial object-and-attribute mappings. Analogising served two different purposes: problem solving (dominated by relational mappings), and illustration (which for novices was dominated by object-and-attribute mappings). Overall, our novices showed a sophistication in domain-based analogical reasoning that is usually only observed with experts, in addition to a sensitivity to the pragmatics of analogy use.
In this reply, we provide an analysis of Alter et al.'s response to our earlier paper. In that paper, we reported difficulty in replicating Alter, Oppenheimer, Epley, and Eyre’s main finding, namely that a sense of disfluency, produced by making stimuli difficult to perceive, increased accuracy on a variety of reasoning tasks. Alter, Oppenheimer, and Epley argue that we misunderstood the meaning of accuracy on these tasks, a claim that we reject. We argue and provide evidence that the tasks were not too difficult for our populations and point out that in many cases performance on our tasks was well above chance or on a par with Alter et al.’s participants. Finally, we reiterate our claim that the distinction between answer fluency and perceptual fluency is genuine, and argue that Thompson et al. provided evidence that these are distinct factors that have different downstream effects on cognitive processes.
Van den Belt recently examined the notion that synthetic biology and the creation of ‘artificial’ organisms are examples of scientists ‘playing God’. Here I respond to some of the issues he raises, including some of his comments on my previous discussions of the value of the term ‘life’ as a scientific concept.
Though modern non-cognitivists in ethics characteristically believe that values are irreducible to facts, they nevertheless believe that values are determined by facts, viz., those specified in functionalist, explanatory theories of the evolutionary origin of morality. The present paper probes the consistency of this position. The conventionalist theories of Hume and Harman are examined, and are seen not to establish a tight determinative reduction of values to facts. This result is illustrated by reference to recent theories of the sociobiological mechanisms involved in moral evolution. Though explanatory theories have linguistic implications, exaggerated in Harman's linguistic form of social relativism, there is also failure to establish the semantic reductionism which non-cognitivists reject under the rubric of ethical naturalism. It is concluded that explanatory forms of naturalism, the best of which is a functionalist-utilitarian account, are compatible with the fact/value distinction.
Cognitive enhancement is an increasingly discussed topic and policy suggestions have been put forward. We present here empirical data on the views of parents of children with and without cognitive disabilities. Analysis of the interviews revealed six primary overarching themes: meanings of health and treatment; the role of medicine; harm; the ‘good’ parent; normality and self-perception; and ability. Interestingly, none of the parents used the term ethics, and only one parent used the term moral, twice.
Since 1997 there has been a significant increase in the number and percentage of Kansas farmers who are women. Using Reskin and Roos’ model of “job queues and gender queues” I analyze changes in the agricultural industry in Kansas that resulted in more women becoming “principal farm operators” in the state. I find there are three changes largely responsible for women increasing their representation in the occupation: an increase in the demand for niche products, a decrease in the average farm size, and greater societal acceptance of women as farmers. This study adds to the growing literature on women principal farm operators in developed countries, and is among the first to explore why women are becoming a larger percentage of the occupation in the United States.
Contemporary bioethical theory relies upon the concept of informed consent to protect against abuses of patient autonomy. Due to the complexity of the informed consent process, however, many patients rely more on their trust in their health care providers than they do upon their own ability to decide whether or not to give informed consent. Reformation theologian John Calvin placed a strong emphasis on the decision-maker's duty to respect the trust that others repose in the decision-maker. In keeping with Calvin's concept of the duty to honor that trust, bioethics would do well to complement its current emphasis on informed consent with an emphasis on the protection of patient trust.