Rare earth elements (REEs) have become increasingly important because of their relative scarcity, worldwide growth in demand, and China’s quasi-monopoly of this market. REEs are virtually non-substitutable, and they are essential for a variety of high-tech products and modern key technologies. This has raised serious concerns that China will misuse its dominant position to set export quotas in order to maximize its own profits at the expense of other rare earth user industries. In fact, export restrictions on REEs were the catalyst for the U.S. to lodge a formal complaint against China at the World Trade Organization in 2012. This paper analyzes possible wealth transfer effects by focusing on export quota announcements by China and the share price reactions of Chinese REE suppliers, U.S. REE users, and rest-of-the-world REE refiners. Overall, we find limited support for the view of a wealth transfer in connection with MOFCOM announcements, and only when we disentangle events before and after the initiation of the WTO trial, consistent with the trial triggering changes to China’s REE policy and its recent announcement that it would abolish quotas. We do find, however, that extreme REE price movements have a first-order effect on all companies in the REE industry, consistent with recent market trends to enable hedging against REE price volatility.
This paper examines the relationship between performance persistence and corporate governance. We document systematic differences in performance persistence across listed companies in China during 2001–2011, and empirically demonstrate that firms with better corporate governance show higher performance persistence. The results are robust over both the short and long terms. We also find that performance persistence is an important factor in refinancing, and that it can lower companies’ costs of borrowing. Overall, our findings offer important implications for business ethics, as we demonstrate how corporate governance can lower companies’ costs of debt.
Bernard Schweizer explores a hitherto neglected strain of religious rebellion. Misotheism, or hatred of God, is more radical than atheism: God-haters do not question God's existence, but instead deny his competence and goodness. Sifting through centuries of evidence and uncovering fascinating networks of influence among writers and thinkers as diverse as Friedrich Nietzsche, Zora Neale Hurston, and Philip Pullman, Schweizer reveals deep undercurrents of misotheism in many acclaimed works of literature and philosophy.
The relationship between abstract formal procedures and the activities of actual physical systems has proved to be surprisingly subtle and controversial, and there are a number of competing accounts of when a physical system can properly be said to implement a mathematical formalism and hence perform a computation. I defend an account wherein computational descriptions of physical systems are high-level normative interpretations motivated by our pragmatic concerns. Furthermore, the criteria of utility and success vary according to our diverse purposes and pragmatic goals. Hence there is no independent or uniform fact of the matter, and I advance the ‘anti-realist’ conclusion that computational descriptions of physical systems are not founded upon deep ontological distinctions, but rather upon interest-relative human conventions. Hence physical computation is a ‘conventional’ rather than a ‘natural’ kind.
Climate change assessments rely upon scenarios of socioeconomic developments to conceptualize alternative outcomes for global greenhouse gas emissions. These are used in conjunction with climate models to make projections of future climate. Specifically, the estimations of greenhouse gas emissions based on socioeconomic scenarios constrain climate models in their outcomes for temperature, precipitation, etc. Traditionally, the fundamental logic of the socioeconomic scenarios—that is, the logic that makes them plausible—is developed and prioritized using methods that are very subjective. This introduces a fundamental challenge for climate change assessment: the veracity of projections of future climate currently rests on subjective ground. We elaborate on these subjective aspects of scenarios in climate change research. We then consider an alternative method for developing scenarios, a systems dynamics approach called ‘Cross-Impact Balance’ (CIB) analysis. We discuss notions of ‘objective’ and ‘objectivity’ as criteria for distinguishing appropriate scenario methods for climate change research. We distinguish seven distinct meanings of ‘objective,’ and demonstrate that CIB analysis is more objective than traditional subjective approaches. However, we also consider criticisms concerning which of the seven meanings of ‘objective’ are appropriate for scenario work. Finally, we arrive at conclusions regarding which meanings of ‘objective’ and ‘objectivity’ are relevant for climate change research. Because scientific assessments uncover knowledge relevant to the responses of a real, independently existing climate system, the scenario methodologies employed in such studies must also uphold the seven meanings of ‘objective’ and ‘objectivity.’
The Computational Theory of Mind (CTM) holds that cognitive processes are essentially computational, and hence computation provides the scientific key to explaining mentality. The Representational Theory of Mind (RTM) holds that representational content is the key feature in distinguishing mental from non-mental systems. I argue that there is a deep incompatibility between these two theoretical frameworks, and that the acceptance of CTM provides strong grounds for rejecting RTM. The focal point of the incompatibility is the fact that representational content is extrinsic to formal procedures as such, and the intended interpretation of syntax makes no difference to the execution of an algorithm. So the unique 'content' postulated by RTM is superfluous to the formal procedures of CTM. And once these procedures are implemented in a physical mechanism, it is exclusively the causal properties of the physical mechanism that are responsible for all aspects of the system's behaviour. So once again, postulated content is rendered superfluous. To the extent that semantic content may appear to play a role in behaviour, it must be syntactically encoded within the system, and just as in a standard computational artefact, so too with the human mind/brain - it's pure syntax all the way down to the level of physical implementation. Hence 'content' is at most a convenient meta-level gloss, projected from the outside by human theorists, which itself can play no role in cognitive processing.
Opponents of the computational theory of mind (CTM) have held that the theory is devoid of explanatory content, since whatever computational procedures are said to account for our cognitive attributes will also be realized by a host of other ‘deviant’ physical systems, such as buckets of water and possibly even stones. Such ‘triviality’ claims rely on a simple mapping account (SMA) of physical implementation. Hence defenders of CTM traditionally attempt to block the trivialization critique by advocating additional constraints on the implementation relation. However, instead of attempting to ‘save’ CTM by constraining the account of physical implementation, I argue that the general form of the triviality argument is invalid. I provide a counterexample scenario, and show that SMA is in fact consistent with empirically rich and theoretically plausible versions of CTM. This move requires rejection of the computational sufficiency thesis, which I argue is scientifically unjustified in any case. By shifting the ‘burden of explanatory force’ away from the concept of physical implementation, and instead placing it on salient aspects of the target phenomenon to be explained, it’s possible to retain a maximally liberal and unfettered view of physical implementation, and at the same time defuse the triviality arguments that have motivated defenders of CTM to impose various theory-laden constraints on SMA.
The paper examines the status of conscious presentation with regard to mental content and intentional states. I argue that conscious presentation of mental content should be viewed on the model of a secondary quality, as a subjective effect of the microstructure of an underlying brain state. The brain state is in turn viewed as the instantiation of an abstract computational state, with the result that introspectively accessible content is interpreted as a presentation of the associated computational state realized by the brain. However, if the relation between consciousness and representational content is construed in this manner, then conscious presentation does not provide an adequate foundation for the claim that human mental states are intrinsically intentional. On this model, I argue that functionalism is able to account for (non-intrinsic) intentionality, but not for consciousness, which has implications for the computational paradigm, as well as for Searle's Chinese room thought experiment.
The relationship between the physical body and the conscious human mind has been a deeply problematic topic for centuries. Physicalism is the 'orthodox' metaphysical stance in contemporary Western thought, according to which reality is exclusively physical/material in nature. However, in the West, theoretical dissatisfaction with this type of approach has historically led to Cartesian-style dualism, wherein mind and body are thought to belong to distinct metaphysical realms. In the current discussion I compare and contrast this standard Western approach with an alternative form of dualism developed in the Sāṃkhya-Yoga philosophical tradition, where matter and pure consciousness are held to belong to distinct and independent realms, but where the mind is placed on the material side of the ontological divide. I argue that this model possesses a number of theoretical advantages over Cartesian-style dualism, and constitutes a compelling theoretical framework for re-conceptualizing the mind-body problem.
What is Life? This is the question asked by Denis Noble in this very personal and at times deeply lyrical book. Noble is a renowned physiologist and systems biologist, and he argues that the genome is not life itself: to understand what life is, we must view it at a variety of different levels, all interacting with each other in a complex web. It is that emergent web, full of feedback between levels, from the gene to the wider environment, that is life.
In evolutionary biology, changes in population structure are explained by citing trait fitness distributions. I distinguish three interpretations of fitness explanations—the Two-Factor Model, the Single-Factor Model, and the Statistical Interpretation—and argue for the last of these. These interpretations differ in their degrees of causal commitment. The first two hold that trait fitness distribution causes population change; trait fitness explanations, according to these interpretations, are causal explanations. The last maintains that trait fitness distribution correlates with population change but does not cause it. My defense of the Statistical Interpretation relies on a distinctive feature of causation: causes conform to the Sure Thing Principle. Trait fitness distributions, I argue, do not.
The systems T_N and T_M show that necessity can be consistently construed as a predicate of syntactical objects, if the expressive/deductive power of the system is deliberately engineered to reflect the power of the original object-language operator. The system T_N relies on salient limitations on the expressive power of the language L_N through the construction of a quotational hierarchy, while the system T_M relies on limiting the scope of the modal axiom schemas to the sublanguage L_M^∞+, which corresponds exactly with the restrictive hierarchy of L_N. The fact that L_M^∞+ is identical to the image of the metalinguistic mapping C+ from the normal operator system into L_M reveals that iterated operator modality is implicitly hierarchical, and that inconsistency is produced by applying the principles of the modal logic to formulas which have no natural analogues in the operator development. Thus the contradiction discovered by Montague can be diagnosed as the result of instantiating the axiom schemas with modally ungrounded formulas, and thereby adding radically new modal axioms to the predicate system. The predicate treatment of necessity differs significantly from that of the operator in that the cumulative models for the predicate system are strictly first-order. Possible worlds are not used as model-theoretic primitives; rather, alternate models are appealed to in order to specify the extension of N, which is semantically construed as a first-order predicate. In this manner, the intensional aspects of modality are built into the mode of specifying the particular set of objects which the denotation function assigns to N, rather than into the specification of the basic truth conditions for modal formulas. Intensional phenomena are thereby localised to the special requirements for determining the extension of a particular predicate, and this does not constitute a structural modification of the first-order models, but rather limits the relevant class of models to those which possess an appropriate denotation function.
What is Life? To answer this question, Denis Noble argues that we must look beyond the gene's eye view. For modern 'systems biology' considers life on a variety of levels, as an intricate web of feedback between gene, cell, organ, body, and environment. He shows how it is both a biologically rigorous and richly rewarding way of understanding life.
We distinguish dynamical and statistical interpretations of evolutionary theory. We argue that only the statistical interpretation preserves the presumed relation between natural selection and drift. On these grounds we claim that the dynamical conception of evolutionary theory as a theory of forces is mistaken. Selection and drift are not forces. Nor do selection and drift explanations appeal to the (sub-population-level) causes of population-level change. Instead they explain by appeal to the statistical structure of populations. We briefly discuss the implications of the statistical interpretation of selection for various debates within the philosophy of biology: the ‘explananda of selection’ debate and the ‘units of selection’ debate.
The paper begins by examining the original Turing Test (2T) and Searle’s antithetical Chinese Room Argument, which is intended to refute the 2T in particular, as well as any formal or abstract procedural theory of the mind in general. In the ensuing dispute between Searle and his critics, I argue that Searle’s ‘internalist’ strategy is unable to deflect Dennett’s combined robotic-systems reply and the allied Total Turing Test (3T). Many would hold that the 3T marks the culmination of the dialectic and, in principle, constitutes a fully adequate empirical standard for judging that an artifact is intelligent on a par with human beings. However, the paper carries the debate forward by arguing that the sociolinguistic factors highlighted in externalist views in the philosophy of language indicate the need for a fundamental shift in perspective in a Truly Total Turing Test (4T). It’s not enough to focus on Dennett’s individual robot viewed as a system; instead, we need to focus on an ongoing system of such artifacts. Hence a 4T should evaluate the general category of cognitive organization under investigation, rather than the performance of single specimens. From this comprehensive standpoint, the question is not whether an individual instance could simulate intelligent behavior within the context of a pre-existing sociolinguistic culture developed by the human cognitive type. Instead the key issue is whether the artificial cognitive type itself is capable of producing a comparable sociolinguistic medium.
The paper examines the nature of the behavioral evidence underlying attributions of intelligence in the case of human beings, and how this might be extended to other kinds of cognitive system, in the spirit of the original Turing Test. I consider Harnad's Total Turing Test, which involves successful performance of both linguistic and robotic behavior, and which is often thought to incorporate the very same range of empirical data that is available in the human case. However, I argue that the TTT is still too weak, because it only tests the capabilities of particular tokens within a preexisting context of intelligent behavior. What is needed is a test of the cognitive type, as manifested through a number of exemplary tokens, in order to confirm that the cognitive type is able to produce the context of intelligent behavior presupposed by tests such as the TT and TTT.
Recent work on conditional reasoning argues that denying the antecedent [DA] and affirming the consequent [AC] are defeasible but cogent patterns of argument, either because they are effective, rational, albeit heuristic applications of Bayesian probability, or because they are licensed by the principle of total evidence. Against this, we show that on any prevailing interpretation of indicative conditionals, the premises of DA and AC arguments do not license their conclusions without additional assumptions. The cogency of DA and AC inferences rather depends on contingent factors extrinsic to, and independent of, what is asserted by DA and AC arguments.
Denying the antecedent is an invalid form of reasoning that is typically identified and frowned upon as a formal fallacy. Contrary to arguments that it does not or at least should not occur, denying the antecedent is a legitimate and effective strategy for undermining a position. Since it is not a valid form of argument, it cannot prove that the position is false. But it can provide inductive evidence that this position is probably false. In this role, it is neither defective nor deceptive. Denying the antecedent provides inductive support for rejecting a claim as improbable.
The analysis of religious assertions in terms of a language game and the analysis in terms of eschatological verification are the two most notable defences today of the factual significance of religious language. But both of these approaches, I believe, are to be found wanting, not only on philosophical grounds, but especially on the grounds of faith. Neither of these approaches reflects ordinary faith, the faith of ordinary believers. And it is in terms of such ordinary faith that we can find the key to a more adequate answer to the challenge of verifiability.
Denis McManus presents a novel account of Martin Heidegger's early vision of our subjectivity and the world we inhabit. He explores key elements of Heidegger's philosophy, and argues that Heidegger's central claims identify genuine demands that must be met if we are to achieve the feat of thinking determinate thoughts about the world around us.
There are two competing interpretations of the modern synthesis theory of evolution: the dynamical (also known as ‘traditional’) and the statistical. The dynamical interpretation maintains that explanations offered under the auspices of the modern synthesis theory articulate the causes of evolution. It interprets selection and drift as causes of population change. The statistical interpretation holds that modern synthesis explanations merely cite the statistical structure of populations. This paper offers a defense of statisticalism. It argues that a change in trait frequencies in a population can be attributed only to selection or drift against the background of a particular statistical description of the population. The traditionalist supposition that selection and drift are description-independent causes of population change leads the dynamical interpretation into a dilemma: it must face a contradiction or accept the loss of explanatory power.
This article focuses on both daily forms of weakness of will as discussed in the philosophical debate and psychopathological phenomena as impairments of decision making. We argue that both descriptions of dysfunctional decision making can be organized within a common theoretical framework that divides the decision-making process into three different stages: option generation, option selection, and action initiation. We first discuss our theoretical framework, focusing on option generation as an aspect that has been neglected by previous models. In the main body of this article, we review how both philosophy and neuropsychiatry have provided accounts of dysfunction in each decision-making stage, as well as where these accounts can be integrated. We also discuss the neural underpinnings of dysfunction in the three different stages. We conclude by discussing advantages and limitations of our integrative approach.
What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? Neuroconstructivism is a pioneering two-volume work that sets out a whole new framework for considering the complex topic of development, integrating data from cognitive studies, computational work, and neuroimaging.
Introduction -- Landscape and longing -- Art and human nature -- What is art? -- But they don't have our concept of art -- Art and natural selection -- The uses of fiction -- Art and human self-domestication -- Intention, forgery, dada : three aesthetic problems -- The contingency of aesthetic values -- Greatness in the arts.
The standard account of denying the antecedent (DA) is that it is a deductively invalid form of argument, and that, in a conditional argument, to argue from the falsity of the antecedent to the falsity of the consequent is always fallacious. In this paper, we argue that DA is not always a fallacious argumentative strategy. Instead, there is a legitimate usage of DA according to which it is a defeasible argument against the acceptability of a claim. The dialectical effect of denying the antecedent is to shift the burden of proof back to the original proponent of a claim. We provide a model of this non-fallacious usage which is built upon pragmatic models of argumentation.
According to Aristotelian essentialism, the nature of an organism is constituted by a particular goal-directed disposition to produce an organism typical of its kind. This paper argues—against the prevailing orthodoxy—that essentialism of this sort is indispensable to evolutionary biology. The most powerful anti-essentialist arguments purport to show that the natures of organisms play no explanatory role in modern synthesis biology. I argue that recent evolutionary developmental biology provides compelling evidence to the contrary. Developmental biology shows that one must appeal to the capacities of organisms to explain what makes adaptive evolution adaptive. Moreover, the specific capacities in question are precisely those that, according to Aristotle, constitute the nature of an organism. Outline: 1. Essentialism (1.1 Aristotelian biological kinds); 2. Evolutionary anti-essentialism (2.1 Taxonomic anti-essentialism; 2.2 Explanatory anti-essentialism); 3. Adaptation (3.1 Stability; 3.2 Mutability; 3.3 Phenotypic plasticity and adaptive evolution); 4. The natures of organisms; 5. Conclusion.
Over the past fifteen years there has been a considerable amount of debate concerning what theoretical population dynamic models tell us about the nature of natural selection and drift. On the causal interpretation, these models describe the causes of population change. On the statistical interpretation, population dynamics models specify statistical parameters that explain, predict, and quantify changes in population structure, without identifying the causes of those changes. Selection and drift are part of a statistical description of population change; they are not discrete, apportionable causes. Our objective here is to provide a definitive statement of the statistical position, so as to allay some confusions in the current literature. We outline four commitments that are central to statisticalism: 1. Natural selection is a higher-order effect; 2. Trait fitness is primitive; 3. Modern Synthesis (MS) models are substrate neutral; 4. MS-selection and drift are model-relative.
An argumentative passage that might appear to be an instance of denying the antecedent will generally admit of an alternative interpretation, one on which the conditional contained by the passage is a preface to the argument rather than a premise of it. On this interpretation, which generally is the more charitable one, the conditional plays a certain dialectical role and, in some cases, a rhetorical role as well. Assuming only a very weak principle of exegetical charity, I consider what it would take to justify the less charitable interpretation. I then present evidence that those conditions are seldom met. Indeed, I was unable to find a single published argument that can justifiably be charged with denying the antecedent.
In a series of reports the United Nations Special Representative on the issue of Human Rights and Transnational Corporations has emphasized a tripartite framework regarding business and human rights that includes the state “duty to protect,” the TNC “responsibility to respect,” and “appropriate remedies” for human rights violations. This article examines the recent history of UN initiatives regarding business and human rights and places the tripartite framework in historical context. Three approaches to human rights are distinguished: moral, political, and legal. It is argued that the tripartite framework’s grounding of the responsibility of TNCs to respect human rights is properly understood as moral and not merely as a political or legal duty. A moral account of the duty of TNCs to respect basic human rights is defended and contrasted with a merely strategic approach. The main conclusion of the article is that only a moral account of the basic human rights duties of TNCs provides a sufficiently deep justification of “the corporate responsibility to respect human rights” feature of the tripartite framework.
This article applies the Kantian doctrine of respect for persons to the problem of sweatshops. We argue that multinational enterprises are properly regarded as responsible for the practices of their subcontractors and suppliers. We then argue that multinational enterprises have the following duties in their off-shore manufacturing facilities: to ensure that local labor laws are followed; to refrain from coercion; to meet minimum safety standards; and to provide a living wage for employees. Finally, we consider and reply to the objection that improving health and safety conditions and providing a living wage will cause greater harm than good.
This paper examines a doctrine which David Lewis has called 'Humean Supervenience' (hereafter 'HS'), and a problem which certain imaginary cases seem to generate for HS. They include rotating perfect spheres or discs, and flowing rivers, imagined as composed of matter which is perfectly homogeneous right down to the individual points. Before considering these examples, I shall introduce the doctrine they seem to challenge.
Denying Evolution aims at taking a fresh look at the evolution–creation controversy. It presents a truly “balanced” treatment, not in the sense of treating creationism as a legitimate scientific theory (it demonstrably is not), but in the sense of dividing the blame for the controversy equally between creationists and scientists—the former for subscribing to various forms of anti-intellectualism, the latter for discounting science education and presenting science as scientism to the public and the media. The central part of the book focuses on a series of creationist fallacies (aimed at showing errors of thought, not at deriding) and of mistakes by scientists and science educators. The last part of the book discusses long-term solutions to the problem, from better science teaching at all levels to the necessity of widespread understanding of how the brain works and why people have difficulties with critical thinking.
What is a logical constant? The question is addressed in the tradition of Tarski's definition of logical operations as operations which are invariant under permutation. The paper introduces a general setting in which invariance criteria for logical operations can be compared and argues for invariance under potential isomorphism as the most natural characterization of logical operations.
We review recent developments in ethical pluralism, ethical particularism, Kantian intuitionism, rights theory, and climate change ethics, and show the relevance of these developments in ethical theory to contemporary business ethics. This paper explains why pluralists think that ethical decisions should be guided by multiple standards and why particularists emphasize the crucial role of context in determining sound moral judgments. We explain why Kantian intuitionism emphasizes the discerning power of intuitive reason and seeks to integrate that with the comprehensiveness of Kant’s moral framework. And we show how human rights can be grounded in human agency, and explain the connections between human rights and climate change.
The need to create art is found in every human society, manifest in many different ways across many different cultures. Is this universal need rooted in our evolutionary past? The Art Instinct reveals that it is, combining evolutionary psychology with aesthetics to shed new light on fascinating questions about the nature of art.