The interactivist approach to development generates a framework of types of constraints on what can be constructed. The four constraint types are based on: (1) what the constructed systems are about; (2) the representational relationship itself; (3) the nature of the systems being constructed; and (4) the process of construction itself. We give illustrations of each constraint type. Any developmental theory needs to acknowledge all four types of constraint; however, some current theories conflate different types of constraint, or rely on a single constraint type to explicate development. Such theories will be inherently unable to explain important aspects of development.
We assemble here in this time and place to discuss the thesis that conscious attention can provide knowledge of reference of perceptual demonstratives. I shall focus my commentary on what this claim means, and on the main argument for it found in the first five chapters of Reference and Consciousness. The middle term of that argument is an account of what attention does: what its job or function is. There is much that is admirable in this account, and I am confident that it will be the foundation, the launching-pad, for much future work on the subject. But in the end I will argue that Campbell's picture makes the mechanisms of attention too smart: smarter than they are, smarter than they could be. If we come to a more realistic appraisal of the skills and capacities of our subpersonal minions, the "knowledge of reference" which they yield will have to be taken down a notch or two. But first let us clarify what the argument is.
If, as Ned Block has argued, consciousness is a mongrel concept, then this collection resembles nothing so much as a visit to a dog pound, where one can hear all the varieties baying, at full volume. The experience is one of immersion in a voluminous excited cacophony, with much yipping and barking, some deep-throated growling, and other voices that can only be characterized as howling at the moon. What a time to be conscious! What a time to be conscious of being conscious!
A working hypothesis of computationalism is that Mind arises, not from the intrinsic nature of the causal properties of particular forms of matter, but from the organization of matter. If this hypothesis is correct, then a wide range of physical systems (e.g. optical, chemical, various hybrids, etc.) should support Mind, especially computers, since they have the capability to create/manipulate organizations of bits of arbitrary complexity and dynamics. In any particular computer, these bit patterns are quite physical, but their particular physicality is considered irrelevant (since they could be replaced by other physical substrata).
This paper reports laboratory data for games that are played only once. These games span the standard categories: static and dynamic games with complete and incomplete information. For each game, the treasure is a treatment in which behavior conforms nicely to predictions of the Nash equilibrium or relevant refinement. In each case, however, a change in the payoff structure produces a large inconsistency between theoretical predictions and observed behavior. These contradictions are generally consistent with simple intuition based on the interaction of payoff asymmetries and noisy introspection about others’ decisions.
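As a minimal illustration of the kind of prediction being tested, pure-strategy Nash equilibria of a bimatrix game can be enumerated with a best-response check. The payoff matrices below are invented for illustration and are not the paper's experimental games:

```python
from itertools import product

def pure_nash(payoffs_row, payoffs_col):
    """Return all pure-strategy Nash equilibria of a two-player game.

    payoffs_row[i][j] / payoffs_col[i][j] give the row and column
    players' payoffs when row plays i and column plays j.
    """
    n_rows, n_cols = len(payoffs_row), len(payoffs_row[0])
    equilibria = []
    for i, j in product(range(n_rows), range(n_cols)):
        # i must be a best response to j for the row player...
        row_best = all(payoffs_row[i][j] >= payoffs_row[k][j]
                       for k in range(n_rows))
        # ...and j a best response to i for the column player.
        col_best = all(payoffs_col[i][j] >= payoffs_col[i][m]
                       for m in range(n_cols))
        if row_best and col_best:
            equilibria.append((i, j))
    return equilibria

# A toy coordination game: both (0, 0) and (1, 1) are equilibria.
row = [[2, 0], [0, 1]]
col = [[2, 0], [0, 1]]
print(pure_nash(row, col))  # [(0, 0), (1, 1)]
```

The paper's point is precisely that small payoff changes can move observed behavior away from the equilibria such a check predicts.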
I propose a conceptual framework for emotions according to which they are best understood as the feedback mechanism a creature possesses in virtue of its function to learn. More specifically, emotions can be neatly modeled as a measure of harmony in a certain kind of constraint satisfaction problem. This measure can be used as error for weight adjustment (learning) in an unsupervised connectionist network.
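The harmony measure invoked here can be gestured at with a Smolensky-style harmony function over a connectionist network's state. The sketch below assumes a symmetric weight matrix over unit activations; it illustrates the general notion, not the paper's specific model of emotion:

```python
import numpy as np

def harmony(weights, activations):
    """Smolensky-style harmony of a network state: H = a^T W a.

    Higher harmony means the activation pattern better satisfies
    the soft constraints encoded in the (symmetric) weight matrix.
    """
    a = np.asarray(activations, dtype=float)
    return float(a @ weights @ a)

# Two units linked by a positive weight "want" to agree.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(harmony(W, [1, 1]))   # agreement: 2.0
print(harmony(W, [1, -1]))  # conflict: -2.0
```

On the abstract's proposal, something like the negative of this quantity would serve as the error signal driving weight adjustment.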
A significant portion of the world’s text is tagged by readers on social bookmarking websites. Credit attribution is an inherent problem in these corpora because most pages have multiple tags, but the tags do not always apply with equal specificity across the whole document. Solving the credit attribution problem requires associating each word in a document with the most appropriate tags and vice versa. This paper introduces Labeled LDA, a topic model that constrains Latent Dirichlet Allocation by defining a one-to-one correspondence between LDA’s latent topics and user tags. This allows Labeled LDA to directly learn word-tag correspondences. We demonstrate Labeled LDA’s improved expressiveness over traditional LDA with visualizations of a corpus of tagged web pages from del.icio.us. Labeled LDA outperforms SVMs by more than 3 to 1 when extracting tag-specific document snippets. As a multi-label text classifier, our model is competitive with a discriminative baseline on a variety of datasets.
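The constraint that distinguishes Labeled LDA from standard LDA, namely that a document's words may only be assigned to topics in its observed tag set, can be sketched as a restricted topic draw. Here `phi` (per-topic word probabilities) is assumed to be estimated elsewhere, e.g. by Gibbs sampling; the function is an illustration of the constraint, not the paper's implementation:

```python
import random

def sample_topic(word, doc_labels, phi):
    """Labeled-LDA-style constrained draw: a word's topic is
    sampled only from the document's observed tags, never from
    the full topic inventory.

    phi[t][word] is the (assumed pre-estimated) probability of
    `word` under topic t.
    """
    weights = [phi[t].get(word, 1e-9) for t in doc_labels]
    total = sum(weights)
    r = random.random() * total
    for label, w in zip(doc_labels, weights):
        r -= w
        if r <= 0:
            return label
    return doc_labels[-1]
```

Restricting the draw this way is what forces each topic to align one-to-one with a user tag, which in turn yields the word-tag correspondences the abstract describes.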
How can the development of ideas in a scientific field be studied over time? We apply unsupervised topic modeling to the ACL Anthology to analyze historical trends in the field of Computational Linguistics from 1978 to 2006. We induce topic clusters using Latent Dirichlet Allocation, and examine the strength of each topic over time. Our methods find trends in the field including the rise of probabilistic methods starting in 1988, a steady increase in applications, and a sharp decline of research in semantics and understanding between 1978 and 2001, possibly rising again after 2001. We also introduce a model of the diversity of ideas, topic entropy, using it to show that COLING is a more diverse conference than ACL, but that both conferences as well as EMNLP are becoming broader over time. Finally, we apply Jensen-Shannon divergence of topic distributions to show that all three conferences are converging in the topics they cover.
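Both measures named in the abstract, topic entropy and Jensen-Shannon divergence, have standard definitions that can be sketched directly. The example distributions below are illustrative, not the paper's data:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a topic distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence of two topic distributions:
    JSD(P, Q) = H(M) - (H(P) + H(Q)) / 2, with M = (P + Q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

uniform = [0.25] * 4                 # a maximally diverse venue
peaked = [0.85, 0.05, 0.05, 0.05]    # a narrowly focused one
print(entropy(uniform))              # 2.0 bits
print(js_divergence(uniform, uniform))  # 0.0: identical mixes
```

Higher entropy corresponds to a broader conference; a shrinking divergence between two venues' topic distributions is what the abstract reports as convergence.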
We describe an approach to textual inference that improves alignments at both the typed dependency level and at a deeper semantic level. We present a machine learning approach to alignment scoring, a stochastic search procedure, and a new tool that finds deeper semantic alignments, allowing rapid development of semantic features over the aligned graphs. Further, we describe a complementary semantic component based on natural logic, which shows an added gain of 3.13% accuracy on the RTE3 test set.
In this paper, I argue that although the behavior of adjectives in context poses a serious challenge to the principle of compositionality of content, in the end such considerations do not defeat the principle. The first two sections are devoted to the precise statement of the challenge; the rest of the paper presents a semantic analysis of a large class of adjectives that provides a satisfactory answer to it. In section 1, I formulate the context thesis, according to which the content of a complex expression depends on the context of its utterance only insofar as the contents of its constituents do. If the context thesis is false, the content of some complex expression is not compositionally determined. In section 2, using an example due to Charles Travis, I construct an objection to the context thesis based on the behavior of the adjective ‘green’. In sections 3 and 4, I look at some of the difficulties surrounding the semantics of ‘good’, which provide the motivation for the thesis that most adjectives are contextually incomplete one-place predicates. In section 5, I discuss how ‘green’ and other color adjectives can be treated within such a semantic theory. Since this theory is compatible with the context thesis, the objection against the compositionality of content loses its force.
It is an old charge against Locke that his commitment to a common substratum for the observable qualities of particular objects and his empiricist theory about the origin of ideas are inconsistent with one another. How could we have an idea of something in which observable qualities inhere if all our ideas are constructed from ideas of observable qualities? In this paper, I propose an interpretation of the crucial passages in Locke, according to which the idea of substratum is formed through an elaborate mental process which he calls “supposition.” It is the same process we use when we form the idea of infinity − another problematic idea for an empiricist. In the end, Locke was more liberal than most empiricists in subscribing to the existence of ideas far removed from experience, because he accepted supposition as a legitimate way of constructing new ideas.
This article explores the psychological literature on rationalization and connects it with contemporary questions about the role of in-house lawyers in ethical dilemmas. Using the case study of AWB Ltd, the exclusive marketer of Australian wheat exports overseas, it suggests that rationalizations were influential in the perpetuation by in-house lawyers of AWB's payment of kickbacks to the Iraqi regime. The article explores how lawyers' professional rationalizations can work together with commercial imperatives to prevent in-house lawyers from seeing ethical issues as those outside the organisation would see them. In particular, where lawyers over-identify with their client's commercial point of view and convince themselves that their role is primarily about providing 'technical' advice on commercial matters, wilful or unintended 'ethical blindness' can result. Lawyers can end up involved in or perpetuating serious misconduct by their client organizations.
It is quite probable that one will soon be able to use genetic engineering to select the gender of one’s child by directly manipulating the sex of an embryo. Some might think that this method would be a more ethical method of sex selection than present technologies such as preimplantation genetic diagnosis (PGD), since, unlike PGD, it does not need to create and destroy “wrong-gendered” embryos. This paper argues that those who object to present technologies on the ground that the embryo is a person are unlikely to be persuaded by this proposal, though for different reasons.
Sentences are often used by speakers to communicate thoughts about particular items. Call this de re communication. If a listener is to understand these uses, she must form interpretations of them that are sufficiently similar to the thoughts they express. This similarity between the thoughts on both sides should be anchored in some principled fashion in the content of the utterances. In this essay, I critically discuss a theory of de re communication and utterance content that Anne Bezuidenhout has recently developed in a series of articles.1 This theory, in the Relevance tradition of Sperber and Wilson,2 regards the significance of utterances as more pragmatic in nature than allowed by traditional accounts; further, it downplays logical considerations in explaining de re communication, choosing instead to emphasize its psychological character. Included among the implications of this approach is the rejection of what can be called common content, or utterance content that is held in common by speaker and listener. After describing this theory, I argue, first, that Bezuidenhout does not supply us with a sufficient reason to prefer her account of utterance content over more traditional alternatives, and second, that her account of de re communication supplies even more reason to reject the view of content to which she subscribes. In the end, it will be clear that she has no principled reason for rejecting common content. At bottom, her view and others like it fail because they flout the distinction between the logical and the psychological, thereby making it impossible for them to appreciate the roles that logical considerations play in utterance content and de re communication.
Consider the difference between reaching over to the desk to grab your copy of Kant’s first Critique and reaching over to grab some book or other. This is the difference between an action directed on a specific thing and an action directed on something, but no one thing in particular. In the first case, you will be successful only if you grab your copy of Kant—only one book will do; in the second, you will be successful if you grab a book, and here any book will do. This is a difference that is frequently displayed: many intentional actions are directed on things, and of these, a good many are directed on specific things. In speech, we mark this difference by saying that you have a particular thing in mind in the first case but not in the second. This establishes that we can get at the notion of having a particular thing in mind (IM) with the help of intentional action, but a full-blown analysis of IM should be grounded in an assessment of its role in all contexts where it applies. That there should be additional contexts beyond intentional action seems apparent from the language we use in applying IM and the range of cases in which we apply it. Attention to language reveals that we often talk about having things “in mind” without mentioning actions, such as when we say that we had a friend in mind just last week; we might even say that we had something in mind while denying that we acted, such as when we say that we had the friend’s birthday in mind but didn’t buy a card. Turning to the range of cases, note that we are willing to describe people as having some particular thing in mind when they are not acting, such as when we say that a student had a party in mind when they should have been concentrating on a lecture. These examples suggest that IM is applicable beyond the context of intentional action. In this essay, I supply an account of what it is to have a particular thing in mind.
I begin by arguing that, despite appearances, IM applies only within the context of intentional action. Any evidence that suggests otherwise depends on an incomplete appreciation of the role played by intentional action in examples such as those considered above.
Arguably, Hume's greatest single contribution to contemporary philosophy of science has been the problem of induction (1739). Before attempting its statement, we need to spend a few words identifying the subject matter of this corner of epistemology. At a first pass, induction concerns ampliative inferences drawn on the basis of evidence (presumably, evidence acquired more or less directly from experience)—that is, inferences whose conclusions are not (validly) entailed by the premises. Philosophers have historically drawn further distinctions, often appropriating the term “induction” to mark them; since we will not be concerned with the philosophical issues for which these distinctions are relevant, we will use the word “inductive” in a catch-all sense synonymous with “ampliative”. But we will follow the usual practice of choosing, as our paradigm example of inductive inferences, inferences about the future based on evidence drawn from the past and present. A further refinement is more important. Opinion typically comes in degrees, and this fact makes a great deal of difference to how we understand inductive inferences. For while it is often harmless to talk about the conclusions that can be rationally believed on the basis of some…
As a conscientious moral agent, a judge in a court of law often finds herself in a difficult position. She is confident that the law requires a certain result in the case before her, but she is at least as confident that this legally required result is unjust or otherwise morally objectionable. Consider some examples of cases in which a reasonable judge might consider herself to be in this position:
▪ The law of landlord and tenant can require a judge to evict an impoverished, elderly widow from her apartment for missing rent payments.1
▪ A student in a poor school district sues his state for providing a much lower caliber of education than students receive in wealthier districts. Binding legal precedent requires the judge to dismiss the student’s lawsuit.
▪ Binding precedents construing the Fourth Amendment require judges to exclude evidence obtained without a search warrant. As a result, a child molester is acquitted, and, predictably, strikes again.
When judges decide cases in courts of law, are they ethically obligated to apply the law correctly? Many people who think about legal systems believe so. The conviction that judges are “bound” by the law is common among lawyers, judges, legal scholars, and members of the general public. One of the most severe accusations one can make against a public official is that she has deviated from the law in her official capacity. The principle of judicial fidelity figures centrally in one of the most celebrated Western political values: the rule of law. This is an ideal which some Western powers, notably the United States, aspire to export on a global scale. The principle of judicial fidelity implies many basic norms of adjudication. These vary from one legal system to another, but in Anglo-American systems they include the following: trial judges must take all admissible evidence into account; judges must follow recognized sources of law, such as constitutions, legislation, and common-law rules; inferior courts must follow superior court rulings on matters of law; courts should give at least substantial weight even to “horizontal” precedent; et cetera. Limits of Legality is a scholarly monograph, in progress, that advances our understanding of the principle of judicial fidelity and defends a refined and unorthodox version of it. The book draws on my background as both a lawyer and a philosopher, addressing issues at the intersection of legal philosophy and ethical theory. It breaks new ground in the normative theory of adjudication – the branch of legal philosophy that concerns how judges in courts of law should decide cases. Mine is one of the first projects to apply the resources of contemporary normative ethics to central questions concerning the rule of law and judicial obligation. I model the normative presuppositions of existing theories of the rule of law in terms that take into account developments in ethical theory over the past two decades.
Several writers have argued recently that optimal rules of law authorize morally suboptimal decisions in certain cases.1 Larry Alexander calls these “gap cases.”2 Should judges in gap cases defer to legal rules or deviate from them? Philosophers known as “formalists” favor deference, while “particularists” favor deviation.
In the State of Bernstein, operating a motor vehicle on a suspended license is a misdemeanor, punishable by permanent loss of one’s license. Officer Krupke arrests everyone who does this, as Tony has. But Tony says, “Gee, Officer Krupke, can’t you bend the rules? I went to your high school, you know.” Tony’s using a euphemism. He’s really asking Krupke to break the rules. Is there, however, a non-euphemistic way to bend a rule of law, without breaking it? More precisely, can we ever bend a rule while still applying it, in some sense, or is this just doubletalk? Consider the case of Maria, whose driver’s license has also been suspended. Maria lives with her mother in a remote area, twenty miles from the nearest doctor. Maria’s mother comes down with a fever of one hundred two. It’s not life-threatening, but Maria wants to spare her mother suffering and hasten recovery, so she drives to the hospital herself, rather than waiting for an ambulance to make the trip out and back. Unfortunately for Maria, the statute contains no applicable exception. I stipulate that this statute is not unjust or otherwise defective, as written. Suppose there are conclusive reasons for legislators not to complicate the statute with exceptions broad enough to cover cases such as Maria’s. Writing such exceptions would encourage sub-optimal misapplication of the exception by judges and sub-optimal misconduct by legal subjects who would anticipate (rightly or not) judicial misapplication of the statutory exception. Nevertheless, most will agree that Maria has strong reasons to act as she does. Rare is the writer who insists that rules of law trump all other reasons that bear on legal subjects. Some will insist that we give Maria’s mother a condition more life-threatening before they’ll assent.
According to the standard positivist picture of law, each legal system contains a master rule that specifies criteria of legality for primary rules.1 A central debate in legal philosophy during the past twenty-five years has concerned the content of the master rule. Exclusive positivists (“exclusivists”) insist that the master rule can only make reference to social facts or sources: “pedigree” criteria.2 As Ronald Dworkin emphasizes, however, some rulings can’t be justified exclusively by reference to pedigreed legal norms.3 Judges sometimes exercise…
It is well known that, for example, the Continuum Hypothesis can’t be proved or disproved from the standard axioms of set theory or their familiar extensions (unless those axiom systems are themselves inconsistent). Some think it follows that CH has no determinate truth value; others insist that this conclusion is false, not because there is some objective world of sets in which CH is either true or false, but on logical grounds. Claims of indeterminacy have also been made on the basis of such considerations as the existence of non-standard models of arithmetic, with similar rejoinders. We’ll read some representative examples of the various positions and replies. (For background on second-order logic, see Stewart Shapiro.)
When a deed is done for Freedom, through the broad earth’s aching breast
Runs a thrill of joy prophetic, trembling on from East to West,
And the slave, where’er he cowers, feels the soul within him climb
To the awful verge of manhood, as the energy sublime
Of a century bursts full-blossomed on the thorny stem of Time.
In this course we will examine several philosophical puzzles concerning time. We all seem to experience time in a very fundamental and direct way. Yet once we begin to reflect on what time really is, it is easy to feel as puzzled as St Augustine was, who wrote: “If no one asks me, I know what [time] is. If I wish to explain it to him who asks me, I do not know.” The first set of issues we will discuss concern the question whether time is ‘real.’ Time appears to consist of past, present and future. But do the past and the future exist in the same way as the present or is only the present real? Does time ‘flow’? In what ways is time different from space? What would it be to ‘spatialize’ time? Next we will ask whether certain views of time imply that there can be no freedom of the will. One might worry that if facts about the future (including facts about what I will do tomorrow) already existed in the same way as facts about the present exist, then I could not be free to choose what I will do. After all, how can I be free to decide to skip class tomorrow, if it is ‘already’ a fact today that I will attend class? What, if anything, is the connection between various views of time and ‘fatalism’? The third topic we will discuss is time travel. First we will ask whether time travel is a conceptual possibility. As we will see, there are certain conceptual puzzles associated with the possibility of time travel. For example, one might think that if time travel is possible, then I should be able to travel back in time and kill my father before the date of my conception. But this scenario seems to lead to a contradiction. Some have taken considerations such as these to argue that the very idea of time travel is incoherent. Is this right? If not, why not? Then we will look at what the theory of relativity says about the nature of time in general and about the physical possibility of time travel more specifically.
Finally we will examine several issues concerning the asymmetry of time.
The essays in this book exhibit a commendably high level of scholarship. They are written by an accomplished group of thinkers (some of them well-known and well-established and some of them relatively new and worth keeping in view). All the essays are new to this book (except the two on rights). The book is well produced (I noted only a dropped note superscript in Gaus’s chapter and a missing ‘not’ on p.
This paper provides a unification-based implementation of Binding Theory (BT) for the English language in the framework of feature-based lexicalized tree-adjoining grammar (LTAG). The grammar presented here does not actually coindex any noun phrases; it merely outputs a set of constraints on co- and contraindexation that may later be processed by a separate anaphora resolution module. It improves on previous work by implementing the full BT rather than just Condition A. The main technical innovation consists in allowing lists to appear as values of semantic features.
In Knowledge and Its Limits, Timothy Williamson argues that knowledge is a purely mental state, that is, that it is never a complex state or condition comprising mental factors and non-mental, environmental factors. Three of his arguments are evaluated: arguments from (1) the non-analyzability of the concept of knowledge, (2) the “primeness” of knowledge, and (3) the (alleged) inability to satisfactorily specify the “internal” element involved in knowledge. None of these arguments succeeds. Moreover, consideration of the third argument points the way to a cogent argument that knowledge is not a purely mental state.
Viet Nam has experienced rapid social change over the last decade, with a remarkable decline in fertility to just below replacement level. The combination of fertility decline, son preference, antenatal sex determination using ultrasound and sex selective abortion are key factors driving increased sex ratios at birth in favour of boys in some Asian countries. Whether or not this is taking place in Viet Nam as well is the subject of heightened debate. In this paper, we analyse the nature and determinants of sex ratio at birth in Viet Nam, including a small family size norm, recent reinforcement by the Government of the "one-to-two child" family policy, traditional son preference, easy access to antenatal ultrasound screening and legal abortion, and an increase in the proportion of one-child families. In order to prevent an increased sex ratio at birth in Viet Nam, we argue for the relaxation of the one-to-two child family policy and a return to the policy of "small family size" as determined by families, in tandem with a comprehensive approach to promoting the value of women and girls in society, countering traditional gender roles, and raising public awareness of the negative social consequences of a high sex ratio at birth.
What exactly is a genetic disease? For a phrase one hears on a daily basis, there has been surprisingly little analysis of the underlying concept. Medical doctors seem perfectly willing to admit that the etiology of disease is typically complex, with a great many factors interacting to bring about a given condition. On such a view, descriptions of diseases like cancer as genetic seem at best highly simplistic, and at worst philosophically indefensible. On the other hand, there is clearly some practical value to be had by classifying diseases according to their predominant cause when this can be accomplished in a theoretically satisfactory manner. The question therefore becomes exactly how one should go about selecting a single causal factor among many to explain the presence of disease. When an attempt to defend such causal selection is made at all, the standard accounts offered (Koch’s postulates, Hill’s epidemiological criteria, manipulability) are all clearly inadequate. I propose, however, an epidemiological account of disease causation which walks the fine line between practical applicability and theoretical considerations of causal complexity and attempts to compromise between patient-centered and population-centered concepts of disease. The epidemiological account is the most basic framework consistent with our strongly held intuitions about the causal classification of disease, yet it avoids the difficulties encountered by its competitors.
The goal of this small book and accompanying DVD is to help you to have a better experience in your laboratory by getting you to step back and take a global look at what is involved in making progress in the laboratory.
Among the many philosophers who hold that causal facts1 are to be explained in terms of—or more ambitiously, shown to reduce to—facts about what happens, together with facts about the fundamental laws that govern what happens, the clear favorite is an approach that sees counterfactual dependence as the key to such explanation or reduction. The paradigm examples of causation, so advocates of this approach tell us, are examples in which events c and e—the cause and its effect—both occur, but: had c not occurred, e would not have occurred either. From this starting point ideas proliferate in a vast profusion. But the remarkable disparity among these ideas should not obscure their common foundation. Neither should the diversity of opinion about the prospects for a philosophical analysis of causation obscure their importance. For even those philosophers who see these prospects as dim—perhaps because they suffer post-Quinean queasiness at the thought of any analysis of any concept of interest—can often be heard to say such things as that causal relations among events are somehow “a matter of” the patterns of counterfactual dependence to be found in them. It was not always so. Thirty-odd years ago, so-called “regularity” analyses (so-called, presumably, because they traced back to Hume’s well-known analysis of causation as constant conjunction) ruled the day, with Mackie’s Cement of the Universe embodying a classic statement. But they fell on hard times, both because of internal problems—which we will review in due course—and because dramatic improvements in philosophical understanding of counterfactuals made possible the emergence of a serious and potent rival: a counterfactual analysis of causation resting on foundations firm enough to repel the kind of philosophical suspicion that had formerly warranted dismissal.
Edouard Machery's paper, ‘The Folk Concept of Intentional Action: Philosophical and Psychological Issues,’ puts forth an intriguing new hypothesis concerning recent work in experimental philosophy on the concept of intentional action. As opposed to other hypotheses in the literature, Machery's 'trade-off hypothesis' claims not to rely on moral considerations in explaining folk uses of the concept. In this paper, we critique Machery's hypothesis and offer empirical evidence to reject it. Finally, we evaluate the current state of the debate concerning the concept of intentional action, and motivate skepticism toward the plausibility of any parsimonious account of the relevant data.
Mature representations of number are built on a core system of numerical representation that connects to spatial representations in the form of a ‘mental number line’. The core number system is functional in early infancy, but little is known about the origins of the mapping of numbers onto space. Here we show that preverbal infants transfer the discrimination of an ordered series of numerosities to the discrimination of an ordered series of line lengths. Moreover, infants construct relationships between individual numbers and line lengths that vary positively, but not between numbers and lengths that vary inversely. These findings provide evidence for an early developing predisposition to relate representations of numerical magnitude and spatial length. A central foundation of mathematics, science and technology therefore emerges prior to experience with language, symbol systems, or measurement devices.
Nancy Cartwright’s most recent book, Hunting Causes and Using Them: Approaches to Philosophy and Economics (hereafter, HCUT), is a welcome and provocative addition to the current literature on causation. In HCUT, Cartwright further develops themes from her earlier work, especially Nature’s Capacities and their Measurement (1989) and The Dappled World (1999). One theme is that methodological issues having to do with inferring and applying claims about cause and effect must be considered in tandem with metaphysical questions about what causation is. And with regard to the latter issue, Cartwright insists that causation is not just one kind of thing but is instead a general category for various types of processes that often differ in important ways. From these two themes, it naturally follows that one should be skeptical that there is any method of causal inference that is applicable in all cases. Moreover, for any method, one ought to be very clear about the types of causal systems for which it is suited and, of equal importance, those for which it is not. Given Cartwright’s approach, such investigations will require careful attention to domain-specific detail about the nature of the causal processes of interest. Cartwright pursues these ideas in the context of critical examinations of current approaches to causation, including Bayes nets and several approaches proposed by econometricians. I am quite sympathetic to Cartwright’s overall perspective on causation, but I take issue with some of her characterizations of particular approaches and several of her specific claims about their limitations. I focus on Cartwright’s claims concerning methods of causal inference that rely on Bayes nets, which among the methods she discusses is the one I know best. First, I argue that Cartwright’s discussion of this topic is problematic insofar as it does not pay adequate attention to the distinct projects that might be pursued within a Bayes nets approach to causation.
This essay defends the view that inductive reasoning involves following inductive rules against objections that inductive rules are undesirable because they ignore background knowledge and unnecessary because Bayesianism is not an inductive rule. I propose that inductive rules be understood as sets of functions from data to hypotheses that are intended as solutions to inductive problems. According to this proposal, background knowledge is important in the application of inductive rules and Bayesianism qualifies as an inductive rule. Finally, I consider a Bayesian formulation of inductive skepticism suggested by Lange. I argue that while there is no good Bayesian reason for judging this inductive skeptic irrational, the approach I advocate indicates a straightforward reason not to be an inductive skeptic.
This paper is a contribution to the debate about eudaimonism started by Kashdan, Biswas-Diener, King, and Waterman in a previous issue of The Journal of Positive Psychology. We point out that one thing that is missing from this debate is an understanding of the problems with subjective theories of well-being that motivate a turn to objective theories. A better understanding of the rationale for objective theories helps us to see what is needed from a theory of well-being. We then argue that a suitably modified subjective theory can provide what is needed and that this is the theory that ought to be favored by psychologists. Keywords: well-being; happiness; hedonism; eudaimonia; subjective well-being; theory; values.
Increased knowledge of the gene–disease associations contributing to common cancer development raises the prospect of population stratification by genotype and other risk factors. Individual risk assessments could be used to target interventions such as screening, treatment and health education. Genotyping neonates, infants or young children as part of a systematic programme would improve coverage and uptake, and facilitate a screening package that maximises potential benefits and minimises harms including overdiagnosis. This paper explores the potential justifications and risks of genotyping children for genetic variants associated with common cancer development within a personalised screening programme. It identifies the ethical and legal principles that might guide population genotyping where the predictive value of the testing is modest and associated risks might arise in the future, and considers the standards required by population screening programme validity measures (such as the Wilson and Jungner criteria including cost-effectiveness and equitable access). These are distinguished from the normative principles underpinning predictive genetic testing of children for adult-onset diseases—namely, to make best-interests judgements and to preserve autonomy. While the case for population-based genotyping of neonates or young children has not yet been made, the justifications for this approach are likely to become increasingly compelling. A modified evaluative and normative framework should be developed, capturing elements from individualistic and population-based approaches. This should emphasise proper communication and genuine parental consent or informed choice, while recognising the challenges associated with making unsolicited approaches to an asymptomatic group. Such a framework would be strengthened by complementary empirical research.
Developments in the field of neuroscience, according to its proponents, offer the prospect of an enhanced understanding and treatment of addicted persons. Consequently, its advocates consider that improving public understanding of addiction neuroscience is a desirable aim. Those critical of neuroscientific approaches, however, charge that it is a totalising, reductive perspective—one that ignores other known causes in favour of neurobiological explanations. Sociologist Nikolas Rose has argued that neuroscience, and its associated technologies, are coming to dominate cultural models to the extent that 'we' increasingly understand ourselves as 'neurochemical selves'. Drawing on 55 qualitative interviews conducted with members of the Australian public residing in the Greater Brisbane area, we challenge both the 'expectational discourses' of neuroscientists and the criticisms of its detractors. Members of the public accepted multiple perspectives on the causes of addiction, including some elements of neurobiological explanations. Their discussions of addiction drew upon a broad range of philosophical, sociological, anthropological, psychological and neurobiological vocabularies, suggesting that they synthesised newer technical understandings, such as that offered by neuroscience, with older ones. Holding conceptual models that acknowledge the complexity of addiction aetiology into which new information is incorporated suggests that the impact of neuroscientific discourse in directing the public's beliefs about addiction is likely to be more limited than proponents or opponents of neuroscience expect.
Neuroscience research has improved our understanding of the long-term consequences of sports-related concussion, but ethical issues related to the prevention and management of concussion are an underdeveloped area of inquiry. This article exposes several examples of conflicts of interest that have arisen and been tolerated in the management of concussion in sport (particularly professional football codes) regarding the use of computerised neuropsychological (NP) tests for diagnosing concussion. Part 1 outlines how the recommendations of a series of global protocols for dealing with sports-related concussions (the 1st, 2nd and 3rd Consensus Statements on Concussion in Sport) have endorsed the use of NP testing. The development of these protocols has involved experts who have links with companies that sell computerised NP tests for concussion management. Part 2 describes how some professional football leagues—in particular the National Football League (NFL), the Australian Football League (AFL) and the National Rugby League (NRL)—have mandated specific NP testing products. They have done so on the basis of these international guidelines and by engaging experts who have conflicts of interest with NP testing companies. These decisions have also been taken despite evidence that casts doubt on the reliability and validity of NP tests when used in these ways.
Addiction is increasingly described as a "chronic and relapsing brain disease". The potential impact of the brain disease model on the treatment of addiction or addicted individuals' treatment behaviour remains uncertain. We conducted a qualitative study to examine: (i) the extent to which leading Australian addiction neuroscientists and clinicians accept the brain disease view of addiction; and (ii) their views on the likely impacts of this view on addicted individuals' beliefs and behaviour. Thirty-one Australian addiction neuroscientists and clinicians (10 females and 21 males; 16 with clinical experience and 15 with no clinical experience) took part in one-hour semi-structured interviews. Most addiction neuroscientists and clinicians did not uncritically support the use of the brain disease model of addiction. Most were cautious about the potential for adverse impacts on individuals' recovery and motivation to enter treatment. While some recognised the possibility that the brain disease model of addiction may provide a rationale for addicted persons to seek treatment and motivate behaviour change, Australian addiction neuroscientists and clinicians do not assume that messages about "diseased brains" will always lead to increased treatment-seeking and reduced drug use. Research is needed on how neuroscience research could be used in ways that optimise positive outcomes for addicted persons.
There is currently little empirical information about attitudes towards cognitive enhancement - the use of pharmaceutical drugs to enhance normal brain functioning. It is claimed this behaviour most commonly occurs in students to aid studying. We undertook a qualitative assessment of attitudes towards cognitive enhancement by conducting 19 semi-structured interviews with Australian university students. Most students considered cognitive enhancement to be unacceptable, in part because they believed it to be unethical, but there was a lack of consensus on whether it was similar to or different from steroid use in sport. There was support for awareness campaigns and for monitoring the use of pharmaceutical drugs for cognitive enhancement. An understanding of student attitudes towards cognitive enhancement is important in formulating future policy.
Impaired control over drug use is a defining characteristic of addiction in the major diagnostic systems. However, there is significant debate about the extent of this impairment. This qualitative study examines the extent to which leading Australian addiction neuroscientists and clinicians believe that addicted individuals have control over their drug use and are responsible for their behaviour. One-hour semi-structured interviews were conducted during 2009 and 2010 with 31 Australian addiction neuroscientists and clinicians (10 females and 21 males; 16 with clinical experience and 15 with no clinical experience). Although many addiction neuroscientists and clinicians described uncontrolled or compulsive drug use as characteristic of addiction, most were ambivalent about whether or not addicted people could be said to have no control over their drug use. Most believed that addicted individuals have fluctuating levels of impaired control over their drug use, but they nonetheless believed that addicted persons were responsible for their behaviour, including criminal behaviour engaged in to fund their drug use. Addiction was seen not as exculpating criminal behaviour but as a mitigating factor.
Photographs are commonly taken of children in medical and research contexts. With the increased availability of photographs through the internet, it is increasingly important to consider their potential for negative consequences and the nature of any consent obtained. In this research we explore the issues around photography in low-resource settings, in particular concentrating on the challenges in gaining informed consent.