In this paper, we explore how the application of technological tools has reshaped food production systems in ways that foster large-scale outbreaks of foodborne illness. Outbreaks of foodborne illness have received increasing attention in recent years, resulting in a growing awareness of the negative impacts associated with industrial food production. These trends indicate a need to examine the systemic causes of outbreaks and how they are being addressed. In this paper, we analyze outbreaks linked to ground beef and salad greens. These case studies are informed by personal interviews, site visits, and an extensive review of government documents and peer-reviewed literature. To explore these cases, we draw on actor-network theory and political economy to analyze the relationships between technological tools, the design of industrial production systems, and the emergence and spread of pathogenic bacteria. We also examine whether current responses to outbreaks represent reflexive change. Lastly, we use the myth of Prometheus to discuss ethical issues regarding the use of technology in food production. Our findings indicate that current tools and systems were designed with a narrow focus on economic efficiency, overlooking relationships with pathogenic bacteria and negative social impacts. In addition, we find that current responses to outbreaks do not represent reflexive change, and that a continued reliance on technological fixes to systemic problems may result in greater problems in the future. We argue that much can be learned from the myth of Prometheus. In particular, justice and reverence need to play a more significant role in guiding production decisions. Diana Stuart (Kellogg Biological Station and Department of Sociology, Michigan State University) and Michelle R. Worosz (Department of Agricultural Economics and Rural Sociology, Auburn University), Journal of Agricultural and Environmental Ethics, pp. 1-26, DOI 10.1007/s10806-011-9357-8.
Stuart, Jennie. Review of: Hands off not an option! The reminiscence museum, mirror of a humanistic care philosophy, by Professor Dr Hans Marcel Becker, assisted by Inez van den Dobbelsteen-Becker and Topsy Ros. Eburon Academic Publishers, Delft, 2011, 272 pp.
In his classic 1936 essay 'On the Concept of Logical Consequence', Alfred Tarski used the notion of satisfaction to give a semantic characterization of the logical properties. Tarski is generally credited with introducing the model-theoretic characterization of the logical properties familiar to us today. However, in his book The Concept of Logical Consequence, Etchemendy argues that Tarski's account is inadequate for quite a number of reasons, and is actually incompatible with the standard model-theoretic account. Many of his criticisms are meant to apply to the model-theoretic account as well. In this paper, I discuss the following four critical charges that Etchemendy makes against Tarski and his account of the logical properties: (1)(a) Tarski's account of logical consequence diverges from the standard model-theoretic account at points where the latter account gets it right. (b) Tarski's account cannot be brought into line with the model-theoretic account, because the two are fundamentally incompatible. (2) There are simple counterexamples (enumerated by Etchemendy) which show that Tarski's account is wrong. (3) Tarski committed a modal fallacy when arguing that his account captures our pre-theoretical concept of logical consequence, and so obscured an essential weakness of the account. (4) Tarski's account depends on there being a distinction between the logical terms and the non-logical terms of a language, but (according to Etchemendy) there are very simple (even first-order) languages for which no such distinction can be made. Etchemendy's critique raises historical and philosophical questions about important foundational work. However, Etchemendy is mistaken about each of these central criticisms. In the course of justifying that claim, I give a sustained explication and defense of Tarski's account. Moreover, since I will argue that Tarski's account and the model-theoretic account really do come to the same thing, my subsequent defense of Tarski's account against Etchemendy's other attacks doubles as a defense against criticisms that would apply equally to the familiar model-theoretic account of the logical properties.
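For reference, the standard model-theoretic account that the paper argues is equivalent to Tarski's can be stated in one line (a standard textbook formulation, not a quotation from the paper):

\[ \Gamma \models \sigma \quad\Longleftrightarrow\quad \text{every model of } \Gamma \text{ is also a model of } \sigma. \]

Tarski's own 1936 formulation runs through satisfaction: \(\sigma\) follows logically from \(\Gamma\) just in case every sequence of objects that satisfies the sentential functions obtained from \(\Gamma\) (by uniformly replacing its non-logical constants with variables) also satisfies the sentential function obtained from \(\sigma\).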
My aim in this paper is to go some way towards showing that the maintenance of hard and fast dichotomies, like those between mind and body, and the real and the virtual, is untenable, and that technological advance cannot occur without being cognisant of its reciprocal ethical implications. In their place I will present a softer enactivist ontology through which I examine the nature of our engagement with technology in general and with virtual realities in particular. This softer ontology is one to which I will commit Kant, and from which, I will show, certain critical moral and emotional consequences arise. It is my contention that Kant's logical subject is necessarily embedded in the world and that Kant, himself, would be content with this view as an expression of his inspired response to the "scandal to philosophy… that the existence of things outside us… must be accepted merely on faith" [Bxl]. In keeping with his arguments for the a priori framing of intuition, the a priori structuring of experience through the spontaneous application of the categories, the synthesis of the experiential manifold, and the necessity of a unity of apperception, I will present an enactivist account of agency in the world, and argue that it is our embodied and embedded kinaesthetic engagement in our world which makes possible the syntheses of apprehension, reproduction and recognition, and which, in turn, make possible the activity of the reproductive or creative imagination.
In this paper we consider the concept of a self-aware agent. In cognitive science agents are seen as embodied and interactively situated in worlds. We analyse the meanings attached to these terms in cognitive science and robotics, proposing a set of conditions for situatedness and embodiment, and examine the claim that internal representational schemas are largely unnecessary for intelligent behaviour in animats. We maintain that current situated and embodied animats cannot be ascribed even minimal self-awareness, and offer a six-point definition of embeddedness, constituting minimal conditions for the evolution of a sense of self. This leads to further analysis of the nature of embodiment and situatedness, and a consideration of whether virtual animats in virtual worlds could count as situated and embodied. We propose that self-aware agents must possess complex structures of self-directed goals; multi-modal sensory systems and a rich repertoire of interactions with their worlds. Finally, we argue that embedded agents will possess or evolve local co-ordinate systems, or points of view, relative to their current positions in space and time, and have a capacity to develop an egocentric space. None of these capabilities are possible without powerful internal representational capacities.
It is argued that, based on Kant's descriptive metaphysics, one can prescribe the necessary metaphysical underpinnings for the possibility of conscious experience in an artificial system. This project is developed by giving an account of the a priori concepts of the understanding in such a system. A specification and implementation of the nomological conditions for a conscious system allows one to know a priori that any system possessing this structure will be conscious; thus enabling us to avoid possible false indicators of consciousness like that offered in a behaviouristic analysis. This is an alternative approach to the bottom-up or top-down approaches adopted by, for example, CYC (Lenat and Feigenbaum 1992) and COG (Brooks 1994; Brooks and Stein 1993), neither of which, alone or in some hybrid form, has proved productive.
Machine consciousness exists already in organic systems and it is only a matter of time -- and some agreement -- before it will be realised in reverse-engineered organic systems and forward-engineered inorganic systems. The agreement must be over the preconditions that must first be met if the enterprise is to be successful, and it is these preconditions, for instance, being a socially-embedded, structurally-coupled and dynamic, goal-directed entity that organises its perceptual input and enacts its world through the application of both a cognitive and kinaesthetic imagination, that I shall concentrate on presenting in this paper. It will become clear that these preconditions will present engineers with a tall order, but not, I will argue, an impossible one. After all, we might agree with Freeman and Núñez's claim that the machine metaphor has restricted the expectations of the cognitive sciences (Freeman & Núñez, 1999); but it is a double-edged sword, since our limited expectations about machines also narrow the potential of our cognitive science.
The crux of this book is expressed in one short sentence from the Preface: 'Unity is a fundamental part of our experience, something that is crucial to its phenomenology' [p.xii], and the crux of this sentence is that the unity of consciousness is not a matter of phenomenal relations existing between distinct experiences – the received view [p.17], but the existence of relations between the contents of experiences – the one experience view [p.25ff]. In its simplest form Tye's claim is that: all our conscious states, whether visual, auditory, olfactory, tactual or gustatory, whether imagistic or emotional are experienced concurrently; they 'are phenomenologically unified ... [and] ... Phenomenological unity is a relation between qualities represented in experience, not between qualities of experiences' [p.36].
Alfred Tarski (1944) wrote that “the condition of the ‘essential richness’ of the metalanguage proves to be, not only necessary, but also sufficient for the construction of a satisfactory definition of truth.” But it has remained unclear what Tarski meant by an ‘essentially richer’ metalanguage. Moreover, DeVidi and Solomon (1999) have argued in this Journal that there is nothing that Tarski could have meant by that phrase which would make his pronouncement true. We develop an answer to the historical question of what Tarski meant by ‘essentially richer’ and pinpoint the general result that stands behind his essential richness claim. In defense of Tarski, we then show that each of the several arguments of DeVidi and Solomon are either moot or mistaken.
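The 'satisfactory definition of truth' at issue is one meeting Tarski's material adequacy condition, Convention T (the gloss and schema below are the standard textbook formulation, supplied for orientation rather than drawn from the paper): a definition of 'true-in-L', stated in the metalanguage, must entail every instance of

\[ X \text{ is true in } L \;\leftrightarrow\; p, \]

where \(p\) is replaced by a metalanguage translation of a sentence of \(L\) and \(X\) by a name of that sentence. Essential richness then concerns what expressive resources the metalanguage must have beyond those of the object language for such a definition to be constructible at all.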
The aim of this paper is to establish the logically necessary preconditions for the existence of self-awareness in an artificial or a natural agent. We examine the terms agent, situated, embodied, embedded, and representation, as employed ubiquitously in cognitive science, attempting to clarify their meaning and the limits of their use. We discuss the minimal conditions for an agent's environment constituting a 'world' and reject most, though not all, types of virtual world. We argue that to qualify as genuinely situated an agent should function in real time within the dynamic world we inhabit, or some close simulacrum of it. We show that embodied agents will possess or evolve local co-ordinate systems, or points of view, locating, identifying and interacting with objects relative to their current position in space-time, and we discuss various types of embodiment, arguing that most current situated and embodied systems are too limited to be candidates for even the most minimal claim to self-identity. We argue that a truly autonomous agent has to be active in its participation with the world, able to synthesise and order its internal representations from its own point of view, and to do this effectively the agent will have to be embedded. To this end we propose a six-point definition of embeddedness. Ultimately we argue for a philosophical-cum-cognitive science model of the self that satisfies essential elements of both sets of definitions of the term.
The problem with model-theoretic modal semantics is that it provides only the formal beginnings of an account of the semantics of modal languages. In the case of non-modal language, we bridge the gap between semantics and mere model theory by claiming that a sentence is true just in case it is true in an intended model. Truth in a model is given by the model theory, and an intended model is a model which has as domain the actual objects of discourse, and which relates these objects in an appropriate manner. However, the same strategy applied to the modal case seems to require an intended modal model whose domain includes mere possibilia. Building on recent work by Christopher Menzel (Noûs, 1990), I give an account of model-theoretic semantics for modal languages which does not require mere possibilia or intensional entities of any kind. Menzel has offered a representational account of model-theoretic modal semantics that accords with actualist scruples, since it does not require possibilia. However, Menzel's view is in the company of other actualists who seek to eliminate possible worlds, but whose accounts tolerate other sorts of abstract, intensional entities, such as possible states of affairs. Menzel's account crucially depends on the existence of properties and relations in intension.
I offer an interpretation of a familiar, but poorly understood portion of Tarski's work on truth – bringing to light a number of unnoticed aspects of Tarski's work. A serious misreading of this part of Tarski to be found in Scott Soames' Understanding Truth is treated in detail. Soames' reading vies with the textual evidence, and would make Tarski's position inconsistent in an unsubtle way. I show that Soames does not finally have a coherent interpretation of Tarski. This is unfortunate, since Soames ultimately arrogates to himself a key position that he has denied to Tarski and which is rightfully Tarski's own.
While the recent special issue of JCS on machine consciousness (Volume 14, Issue 7) was in preparation, a collection of papers on the same topic, entitled Artificial Consciousness and edited by Antonio Chella and Riccardo Manzotti, was published. The editors of the JCS special issue, Ron Chrisley, Robert Clowes and Steve Torrance, thought it would be a timely and productive move to have authors of papers in their collection review the papers in the Chella and Manzotti book, and include these reviews in the special issue of the journal. Eight of the JCS authors (plus Uziel Awret) volunteered to review one or more of the fifteen papers in Artificial Consciousness; these individual reviews were then collected together with a minimal amount of editing to produce a seamless chapter-by-chapter review of the entire book. Because the number and length of contributions to the JCS issue was greater than expected, the collective review of Artificial Consciousness had to be omitted, but here at last it is. Each paper's review is written by a single author, so any comments made may not reflect the opinions of all nine of the joint authors!
According to Timothy Williamson's epistemic view, vague predicates have precise extensions; we just don't know where their boundaries lie. It is a central challenge to his view to explain why we would be so ignorant, if precise borderlines were really there. He offers a novel argument to show that our insuperable ignorance 'is just what independently justified epistemic principles would lead one to expect'. This paper carefully formulates and critically examines Williamson's argument. It is shown that the argument does not explain our ignorance, and is not really apt for doing so. Williamson's unjustified commitment to a controversial and crucial assumption is noted. It is also argued in three different ways that his argument is, in any case, self-defeating – the same principles that drive the argument can be applied to undermine one of its premises. Along the way, Williamson's unstated commitment to a number of other controversial doctrines comes to light.
According to Nancy Cartwright, a causal law holds just when a certain probabilistic condition obtains in all test situations which in turn satisfy a set of background conditions. These background conditions are shown to be inconsistent and, on a separate account, logically incoherent. I offer a corrective reformulation which also incorporates a strategy for problems like Hesslow's thrombosis case. I also show that Cartwright's recent argument for modifying the condition to appeal to singular causes fails. Proposed modifications of the theory's probabilistic condition to handle effects with extreme probabilities (0 or 1) are found unsatisfactory. I propose a unified solution which also handles extreme causes. Undefined conditional probabilities give rise to three good, but non-equivalent, ways of formulating the theory. Various formulations appear in the literature. I give arguments to eliminate all but one candidate. Finally, I argue for a crucial new condition clause, and show how to extend the results beyond a simple probabilistic framework.
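The probabilistic condition under discussion is usually rendered as follows (a standard reconstruction of Cartwright's 'Causal Laws and Effective Strategies', 1979, not the paper's own corrective reformulation): C causes E just in case C raises the probability of E in every test situation, i.e.

\[ P(E \mid C \wedge K_j) \;>\; P(E \mid K_j) \quad \text{for every state description } K_j, \]

where the \(K_j\) are conjunctions fixing the presence or absence of all causal factors of \(E\) other than \(C\). Hesslow's thrombosis case is the stock difficulty: contraceptive pills cause thrombosis directly but prevent it indirectly, by preventing pregnancy, which is itself a cause of thrombosis, so the inequality can fail in some test situations despite a genuine causal law.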
The management literature is replete with studies on business ethics. Unfortunately, most of these studies have dealt exclusively with ethics in large businesses. Although a handful of studies can be found on small business ethics, none has paid attention to the issue of ethics in small minority businesses. Similarly, several studies on ethics have utilized the Wood et al. (1988) 16-vignette ethics scale, although reliability and validity issues associated with the scale have never been fully addressed. In this study, a purification (via content analysis) of the above mentioned scale was performed. Three reliable factors were extracted from the purified scale. They were used to investigate the ethics in small minority businesses. The study found an association between business ethics and demographic and company-related variables. In the case of age of respondents, findings ran counter to the usual relationship of age being positively related to ethical attitudes. The implications of these findings are also discussed.
Abduction is or subsumes a process of inference. It entertains possible hypotheses and it chooses hypotheses for further scrutiny. There is a large literature on various aspects of non-symbolic, subconscious abduction. There is also a very active research community working on the symbolic (logical) characterisation of abduction, which typically treats it as a form of hypothetico-deductive reasoning. In this paper we start to bridge the gap between the symbolic and sub-symbolic approaches to abduction. We are interested in benefiting from developments made by each community. In particular, we are interested in the ability of non-symbolic systems (neural networks) to learn from experience using efficient algorithms and to perform massively parallel computations of alternative abductive explanations. At the same time, we would like to benefit from the rigour and semantic clarity of symbolic logic. We present two approaches to dealing with abduction in neural networks. One of them uses Connectionist Modal Logic and a translation of Horn clauses into modal clauses to come up with a neural network ensemble that computes abductive explanations in a top-down fashion. The other combines neural-symbolic systems and abductive logic programming and proposes a neural architecture which performs a more systematic, bottom-up computation of alternative abductive explanations. Both approaches employ standard neural network architectures which are already known to be highly effective in practical learning applications. Differently from previous work in the area, our aim is to promote the integration of reasoning and learning in a way that the neural network provides the machinery for cognitive computation, inductive learning and hypothetical reasoning, while logic provides the rigour and explanation capability to the systems, facilitating the interaction with the outside world. Although it is left as future work to determine whether the structure of one of the proposed approaches is more amenable to learning than the other, we hope to have contributed to the development of the area by approaching it from the perspective of symbolic and sub-symbolic integration.
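As a point of reference for the symbolic side of this divide, the hypothetico-deductive core of abduction over propositional Horn clauses can be sketched in a few lines of Python (an illustrative toy only: the rule encoding and function names are mine, and the paper's actual machinery, Connectionist Modal Logic ensembles and the neural-symbolic architecture, is not represented here):

from itertools import combinations

def forward_chain(rules, facts):
    """Close a set of facts under Horn rules; each rule is (body, head)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return derived

def abduce(rules, abducibles, observation):
    """Return subset-minimal sets of abducibles that entail the observation."""
    explanations = []
    for k in range(len(abducibles) + 1):  # smallest hypothesis sets first
        for hypo in combinations(sorted(abducibles), k):
            if observation in forward_chain(rules, hypo):
                # keep only if no already-found explanation is a subset
                if not any(set(e) <= set(hypo) for e in explanations):
                    explanations.append(hypo)
    return explanations

# Toy theory: wet_grass <- rain; wet_grass <- sprinkler
rules = [(("rain",), "wet_grass"), (("sprinkler",), "wet_grass")]
print(abduce(rules, {"rain", "sprinkler"}, "wet_grass"))
# -> [('rain',), ('sprinkler',)]

The brute-force search returns both minimal explanations of the observation; the neural approaches described in the abstract aim instead to compute such alternative explanations in parallel and to learn the underlying rules from experience.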
This paper critiques a recent article in this journal in terms of its use of persuasive techniques. The central issue of the original article by Miles, Munilla and Covin and this paper is whether there should be a change in intellectual property rights to address the needs of impoverished people who are HIV positive or have full-blown AIDS and the countries that do not have the means to buy AIDS medication in the absence of subsidies. This paper argues that patents are state-sanctioned monopolies that worked effectively for nearly a century. However, new circumstances and a globally interdependent world represent a new environment calling for an adjustment in the conventional public policy premises underlying patents. Most of the meaning and complexity of this issue is lost to the persuasive techniques of the original article.
Representation of similarities is not sufficient for most visual tasks. The proposed framework collapses useful dimensions such as position and pose for the sake of naming the object. Collapsing these dimensions leaves no representation of the object itself, but only an internal name that cannot be meaningfully manipulated.
Bentham.--Coleridge.--M. de Tocqueville on democracy in America.--On liberty.--Utilitarianism.--From Considerations on representative government.--From An examination of Sir William Hamilton's philosophy, volume 1.--From Three essays on religion.--John Stuart Mill, a select bibliography (p. -530).
This essay argues, flouting paradox, that Mill was a utilitarian but not a consequentialist. First, it contends that there is logical space for a view that deserves to be called utilitarian despite its rejection of consequentialism; second, that this logical space is, in fact, occupied by John Stuart Mill. The key to understanding Mill's unorthodox utilitarianism and the role it plays in his moral philosophy is to appreciate his sentimentalist metaethics—especially his account of wrongness in terms of fitting guilt and resentment. Mill recognizes a fundamental moral asymmetry between the agent and others, which conflicts intractably with a presupposition of consequentialism. This allows him to differentiate three potentially conflicting evaluative spheres: morality, prudence, and aesthetics. This essay's account of Mill's utilitarianism coheres with his defense of individual liberty and his embrace of supererogation, both of which elude traditional interpretations.
The paper is a tribute to the late Stuart Hampshire's investigations of the ramifying role of intention in our conceptual scheme. It surveys the central argument of Thought and Action and the third chapter of Freedom of the Individual. Emphasis is placed upon Hampshire's constructive account of human agency and consequent description of the manner in which perception and action are interwoven. His analysis of the character of intentional action, self-knowledge and autonomy is described. Various lacunae in Hampshire's account are identified and an attempt is made to fill them in in a manner consistent with Hampshire's insights.
In Logical consequence: A defense of Tarski (Journal of Philosophical Logic, vol. 25, 1996, pp. 617-677), Greg Ray defends Tarski's account of logical consequence against the criticisms of John Etchemendy. While Ray's defense of Tarski is largely successful, his attempt to give a general proof that Tarskian consequence preserves truth fails. Analysis of this failure shows that de facto truth preservation is a very weak criterion of adequacy for a theory of logical consequence and should be replaced by a stronger absence-of-counterexamples criterion. It is argued that the latter criterion reflects the modal character of our intuitive concept of logical consequence, and it is shown that Tarskian consequence can be proved to satisfy this criterion for certain choices of logical constants. Finally, an apparent inconsistency in Ray's interpretation of Tarski's position on the modal status of the consequence relation is noted.
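Schematically, the contrast between the two adequacy criteria can be put as follows (my rendering of the distinction drawn in the abstract, not the author's own notation):

\[ \text{(TP)}\;\; \Gamma \vdash_{T} \sigma \;\Rightarrow\; \big(\text{all of } \Gamma \text{ true} \Rightarrow \sigma \text{ true}\big), \qquad \text{(AC)}\;\; \Gamma \vdash_{T} \sigma \;\Rightarrow\; \Box\big(\text{all of } \Gamma \text{ true} \Rightarrow \sigma \text{ true}\big). \]

(TP) demands only de facto truth preservation in the actual circumstances; (AC), the absence-of-counterexamples criterion, demands that no possible circumstance make the premises true and the conclusion false, which is what captures the modal character of the intuitive concept.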
Ray Monk and Anthony Palmer, (eds) Bertrand Russell and the Origins of Analytical Philosophy, Thoemmes Press, Bristol, 1996; pp. xvi + 383; Hans-Johann Glock, (ed.) The Rise of Analytic Philosophy, Blackwell, 1997; pp. xiv + 95; Matthias Schirn, (ed.) Frege: Importance and Legacy, Walter de Gruyter, Berlin, 1996; pp. x + 466; Stuart G. Shanker, (ed.) Philosophy of Science, Logic and Mathematics in the Twentieth Century, Routledge History of Philosophy Volume IX, Routledge, 1996; pp. xxxviii + 461; John Blackmore, (ed.) Ludwig Boltzmann: His Later Life and Philosophy, 1900-1906, Kluwer, Dordrecht, 1995; pp. xvi + 266.
Quentin Meillassoux: After finitude: an essay on the necessity of contingency, trans. Ray Brassier. London and New York: Continuum, 2008, 27.95 (hb); 19.95 (pb). Graham Harman, Quentin Meillassoux: Philosophy in the making, Edinburgh: Edinburgh University Press, 2011, viii and 247 pp., 110.00 (hb); 32.00 (pb). Reviewed by Clayton Crockett (University of Central Arkansas). International Journal for Philosophy of Religion, pp. 1-5, DOI 10.1007/s11153-012-9341-x.
John Stuart Mill regards economics as an inexact and separate science which employs a deductive method. This paper analyzes and restates Mill's views and considers whether they help one to understand philosophical peculiarities of contemporary microeconomic theory. The author concludes that it is philosophically enlightening to interpret microeconomics as an inexact and separate science, but that Mill's notion of a deductive method has only a little to contribute.
John Stuart Mill's concept of ethics was closely related to his firm belief in freedom. He was strictly a believer in each person bringing the greatest degree of happiness or good to the greatest number. This would be an individual act and in no way a forced action. One is free to act without coercion as long as no harm is brought to another person. Consequences must be considered carefully before acting and the act chosen must be the best of possible choices designed to bring about the most good. Mill is definitely a prime example of teleological ethics - an ethics of considering consequences, one which is notably different from Kant's concept of following a priori maxims or principles, regardless of consequences.
Auguste Comte's doctrine of the three phases through which sciences pass (the theological, the metaphysical, and the positive) allows us to explain what John Stuart Mill was attempting in his magnum opus, the System of Logic: namely, to move the science of logic to its terminal and 'positive' stage. Both Mill's startling account of deduction and his unremarked solution to the Humean problem of induction eliminate the notions of necessity or force—in this case, the 'logical must'—characteristic of a science's metaphysical stage. Mill's treatment had a further surprising payoff: his solution to the Problem of Necessity (what today we call the problem of determinism and freedom of the will).
Are We Spiritual Machines? as well as Ray Kurzweil for his response to my essay in that book and his willingness to take part in this discussion. My essay in that book was titled "Kurzweil's Impoverished Spirituality" and was essentially a stripped down version of a piece I had done for..
While historians of scientific method have recently called attention to the views of many of John Stuart Mill's contemporaries on the relation between probability and inductive inference, little if any note has been taken of Mill's own vigorous attack on the received "Laplacean" interpretation of probability in the first (1843) edition of the System of Logic. This paper examines the place of Mill's critique, both in the overall framework of his philosophy, and in the tradition of assessing the so-called "probability of causes". It also offers an account of why, in later editions of the work, Mill appears to adopt a much more sympathetic stance toward the received view.
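The "probability of causes" tradition at issue is inverse probability, which in modern notation is simply Bayes' theorem (the formula below is the standard modern statement, supplied for orientation rather than quoted from Mill or the paper):

\[ P(C_i \mid E) \;=\; \frac{P(E \mid C_i)\,P(C_i)}{\sum_j P(E \mid C_j)\,P(C_j)}, \]

where the \(C_j\) are mutually exclusive and exhaustive candidate causes of an observed effect \(E\). Mill's 1843 complaint targeted the Laplacean habit of feeding this machinery with equal prior probabilities \(P(C_j)\) grounded in mere ignorance rather than in experienced frequencies.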
Here are some of the most important discoveries in the history of medicine: blood circulation (1620s), vaccination (1790s), anesthesia (1840s), germ theory (1860s), X-rays (1895), vitamins (early 1900s), antibiotics (1920s-1930s), insulin (1920s), and oncogenes (1970s). This list is highly varied, as it includes basic medical knowledge such as Harvey's account of how the heart pumps blood, hypotheses about the causes of disease such as the germ theory, ideas about the treatments of diseases such as antibiotics, and medical instruments such as X-ray machines. The philosophy of medicine should be able to contribute to understanding of the nature of discoveries such as these. The great originators of the field of philosophy of science were all concerned with the nature of scientific discovery, including Francis Bacon (1960), William Whewell (1967), John Stuart Mill (1974), and Charles Peirce (1931-1958). The rise of logical positivism in the 1930s pushed discovery off the philosophical agenda, but the topic was revived through the work of philosophers such as Norwood Russell Hanson (1958), Thomas Nickles (1980), Lindley Darden (1991, 2006), and Nancy Nersessian (1984). Scientific discovery has also become an object of investigation for researchers in the fields of cognitive psychology and artificial intelligence, as seen in the work of Herbert Simon, Pat Langley, and others (Langley et al., 1987; Klahr, 2000). Today, scientific discovery is an interdisciplinary topic at the intersection of the philosophy, history, and psychology of science. The aim of this chapter is to identify patterns of discovery that illuminate some of the most important developments in the history of medicine. I have used a variety of sources to identify forty great medical discoveries (Adler, 2004; Friedman and Friedland, 1998; Science Channel, 2006; Strauss and Strauss, 2006).
The observed association between supernovae and gamma-ray bursts represents a cornerstone in our understanding of the nature of gamma-ray bursts. The collapsar model provides a theoretical framework for this connection. A key element is the launch of a bipolar jet (seen as a gamma-ray burst). The resulting hot cocoon disrupts the star, whereas the 56Ni produced gives rise to radioactive heating of the ejecta, seen as a supernova. In this discussion paper, I summarize the observational status of the supernova–gamma-ray burst connection in the context of the ‘engine’ picture of jet-driven supernovae and highlight SN 2012bz/GRB 120422A—with its luminous supernova but intermediate high-energy luminosity—as a possible transition object between low-luminosity and jet gamma-ray bursts. The jet channel for supernova explosions may provide new insights into supernova explosions in general.
The origin of gamma-ray bursts (GRBs) is one of the most interesting puzzles in recent astronomy. During the last decade a consensus has formed that long GRBs (LGRBs) arise from the collapse of massive stars, and that short GRBs (SGRBs) have a different origin, most likely neutron star mergers. A key ingredient of the collapsar model that explains how the collapse of massive stars produces a GRB is the emergence of a relativistic jet that penetrates the stellar envelope. The condition that the emerging jet penetrates the envelope imposes strong constraints on the system. Using these constraints we show the following. (i) Low-luminosity GRBs (llGRBs), a subpopulation of GRBs with very low luminosities (and other peculiar properties: single-peaked, smooth and soft), cannot be formed by collapsars. llGRBs must have a different origin (most likely a shock breakout). (ii) On the other hand, regular LGRBs must be formed by collapsars. (iii) While for BATSE the dividing line between collapsars and non-collapsars is indeed at approximately 2 s, the dividing line is different for other GRB detectors. In particular, most Swift bursts longer than 0.8 s are of a collapsar origin. This last result requires a revision of many conclusions concerning the origin of Swift SGRBs, which were based on the commonly used 2 s limit.
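The breakout constraint can be summarized in one relation (a schematic rendering in the spirit of the collapsar argument, not a formula quoted from the paper): the central engine must run for longer than the time \(t_{b}\) the jet needs to tunnel out of the stellar envelope, so the observed gamma-ray duration is roughly

\[ t_{\gamma} \;\approx\; t_{\rm engine} - t_{b}, \]

which makes collapsar bursts with \(t_{\gamma} \ll t_{b}\) exceedingly rare. The observed deficit of short-duration collapsar bursts is then what fixes the dividing line, detector by detector.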
Complete samples are the basis of any population study. To this end, we selected a complete subsample of Swift long bright gamma-ray bursts (GRBs). The sample, made up of 58 bursts, was selected by considering bursts with favourable observing conditions for ground-based follow-up observations and with the 15–150 keV 1 s peak flux above a flux threshold of 2.6 photons cm⁻² s⁻¹. This sample has a redshift completeness level higher than 90 per cent. Using this complete sample, we investigate the properties of long GRBs and their evolution with cosmic time, focusing in particular on the GRB luminosity function, the prompt emission spectral-energy correlations and the nature of dark bursts.
Bob B. He: Two-dimensional X-ray diffraction. Reviewed by George B. Kauffman (Department of Chemistry, California State University, Fresno). Foundations of Chemistry, pp. 1-2, DOI 10.1007/s10698-011-9135-8.
Utilitarianism is said to be incompatible with the defense of human rights, since the pursuit of the greatest good for the greatest number that utilitarianism prescribes may, on occasion, require overriding those rights. Nevertheless, it may be possible to offer a solution to this conflict by presenting a utilitarian doctrine, recognizable as such, that is broad enough to accommodate rights. The aim of the present work is to expound the doctrine of John Stuart Mill as a good example of how this task can be carried out.
Over the last fifty years, traditional farming has been replaced by industrial farming. Unlike traditional farming, industrial farming is abhorrently cruel to animals, environmentally destructive, awful for rural America, and wretched for human health. In this essay, I document those facts, explain why the industrial system has become dominant, and argue that we should boycott industrially produced meat. Also, I argue that we should not even kill animals humanely for food, given our uncertainty about which creatures possess a right to life. In practice, then, we should be vegetarians. To underscore the importance of these issues, I use statistics to show that industrial farming has caused more pain and suffering than the Holocaust.
Philosophers of chemistry, following the lead of physicists, have been slow to realize that molecular descriptions issuing from quantum mechanics in the absence of chemical theory are fatally flawed. In the wake of this realization, new topics have begun to unfold, including new metaphysical issues, new concerns about the philosophy of chemistry's place in the philosophy of science, and new accounts of how properties are observed, inferred, and presented. A recent collection of essays, Of Minds and Molecules: New Philosophical Perspectives on Chemistry, edited by Nalini Bhushan and Stuart Rosenfeld, reveals what some of these new issues are and suggests new directions for the philosophy of chemistry.
We consider the implications of a model for long-duration gamma-ray bursts in which the progenitor is spun up in a close binary by tidal interactions with a massive black-hole companion. We investigate a sample of such binaries produced by a binary population synthesis, and show that the model predicts several common features in the accretion on to the newly formed black hole. In all cases, the accretion rate declines as approximately t^(−5/3) until a break at a time of order 10⁴ s. The accretion rate declines steeply thereafter. Subsequently, there is flaring activity, with the flare peaking between 10⁴ and 10⁵ s, the peak time being correlated with the flare energy. We show that these times are set by the semi-major axis of the binary, and hence the process of tidal spin-up; furthermore, they are consistent with flares seen in the X-ray light curves of some long gamma-ray bursts.
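The predicted fallback behaviour is a broken power law (a schematic summary of the features stated in the abstract; the post-break slope is not given there and is left symbolic):

\[ \dot{M}(t) \;\propto\; \begin{cases} t^{-5/3}, & t \lesssim t_{\rm break} \sim 10^{4}\,\mathrm{s}, \\ t^{-\alpha}\ (\alpha > 5/3), & t \gtrsim t_{\rm break}, \end{cases} \]

with \(t_{\rm break}\) set by the binary's semi-major axis through the tidal spin-up process, followed by flaring that peaks between \(10^{4}\) and \(10^{5}\) s.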
My aim in this paper is to describe some of John Stuart Mill's views about property rights in land and some implications he drew for public policy. While Mill defends private ownership of land, he emphasizes the ways in which ownership of land is an anomaly that does not fit neatly into the usual views about private ownership. While most of Mill's discussion assumes the importance of maximizing the productivity of land, he anticipates contemporary environmentalists by also expressing concerns about excessive exploitation of land for productive use. I extrapolate from these remarks to suggest changes that Mill might have favored regarding ownership rights in a world in which people aimed to decrease productivity. And, I suggest, it is a virtue of utilitarianism that it so readily supports changes in important principles when circumstances change significantly.
In our quest for gamma-ray burst (GRB) progenitors, it is relevant to consider the progenitor evolution of normal supernovae (SNe). This is largely dominated by mass loss. We discuss the mass-loss rate for very massive stars up to 300 M⊙. These objects are in close proximity to the Eddington Γ limit. We describe the new concept of the transitional mass-loss rate, enabling us to calibrate wind mass loss. This allows us to consider the occurrence of pair-instability SNe in the local Universe. We also discuss luminous blue variables and their link to luminous SNe. Finally, we address the polarization properties of Wolf–Rayet (WR) stars, measuring their wind asphericities. We argue to have found a group of rotating WR stars that fulfil the required criteria to make long-duration GRBs.
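For orientation, the Eddington parameter referred to here is standardly defined (the textbook definition, not a formula taken from the paper) as the ratio of radiative acceleration on free electrons to gravity:

\[ \Gamma_{e} \;=\; \frac{\kappa_{e} L}{4\pi c\, G M}, \]

where \(\kappa_{e}\) is the electron-scattering opacity, \(L\) the stellar luminosity and \(M\) the stellar mass; \(\Gamma_{e} \to 1\) marks the classical Eddington limit, and proximity to it is what drives the extreme winds of very massive stars.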
The 'Art of Life' is John Stuart Mill's name for his account of practical reason. In this volume, eleven leading scholars elucidate this fundamental, but widely neglected, element of Mill's thought. Mill divides the Art of Life into three 'departments': 'Morality, Prudence or Policy, and Æsthetics'. In the volume's first section, Rex Martin, David Weinstein, Ben Eggleston, and Dale E. Miller investigate the relation between the departments of morality and prudence. Their papers ask whether Mill is a rule utilitarian and, if so, whether his practical philosophy must be incoherent. The second section contains papers by Jonathan Riley and Wendy Donner, who explore the relation between the departments of morality and aesthetics. They discuss issues ranging from supererogation to aesthetic pleasure and humanity's relationship with nature. The papers in the third section consider the Art of Life's axiological first principle, the principle of utility. Elijah Millgram contends that Mill's own life refutes his claim that the Art of Life has a single axiological first principle. Philip Kitcher maintains that Mill has a dynamic axiology requiring us to continually refine our conception of the good. In the final section, three papers address what it means to put the Art of Life into practice. Robert Haraldsson locates an 'Art of Ethics' in On Liberty that is in tension with the Art of Life. Nadia Urbinati plumbs the classical roots of Mill's view of the good life. Finally, Colin Heydt develops Mill's suggestion that we regard our own lives as works of art.
Stuart Macintyre, The Poor Relation: A History of Social Sciences in Australia. Reviewed by Henrika Kuklick (History and Sociology of Science, University of Pennsylvania). Minerva, vol. 49, no. 3, pp. 355-358, DOI 10.1007/s11024-011-9173-3.
'This is the most lucid and engaged account of Stuart Hall's work. Meticulously, and with an exemplary generosity, Helen Davis patiently unravels the threads of Hall's intellectual history. The result is a most useful and thoughtful book, which could prove to be indispensable for students of cultural studies' - Graeme Turner, University of Queensland. Understanding Stuart Hall traces the development of one of the most influential and respected figures within cultural studies. Focusing on Stuart Hall's writings over a period of nearly fifty years, this volume offers students and academics a cogent and exploratory route through complex and overlapping areas of analysis. In her critical assessment of Hall's most important contributions to academic and public debate, Davis shows the extent to which his analyses of race and ethnicity have been informed by early studies of Marxism, class and 'societies structured in dominance'. Davis offers fresh insight into the formation of one of the most prolific, charismatic and controversial intellectuals of his generation. Despite having been branded a 'cultural pessimist', Stuart Hall has long been associated with encouraging new, cutting-edge scholarship within the field. This volume concludes with a discussion of Hall's most recent political and academic interventions and his continuing commitment to innovation within the visual arts.
Stuart Kauffman: Steve is extremely bright, inventive. He thoroughly understands paleontology; he thoroughly understands evolutionary biology. He has performed an enormous service in getting people to think about punctuated equilibrium, because you see the process of stasis/sudden change, which is a puzzle. It's the cessation of change for long periods of time. Since you always have mutations, why don't things continue changing? You either have to say that the particular form is highly adapted, optimal, and exists in a stable environment, or you have to be very puzzled. Steve has been enormously important in that sense.
I reappraise in detail Hertz's cathode ray experiments. I show that, contrary to Buchwald's (1995) evaluation, the core experiment establishing the electrostatic properties of the rays was successfully replicated by Perrin (probably) and Thomson (certainly). Buchwald's discussion of 'current purification' is shown to be a red herring. My investigation of the origin of Buchwald's misinterpretation of this episode reveals that he was led astray by a focus on what Hertz 'could do' – his experimental resources. I argue that one should focus instead on what Hertz wanted to achieve – his experimental goals. Focusing on these goals, I find that his explicit and implicit requirements for a successful investigation of the rays' properties are met by Perrin and Thomson. Thus, even by Hertz's standards, they did indeed replicate his experiment.
A 10 kHz pulsed X-ray generator utilising a hot-cathode triode in conjunction with a new type of grid control device for controlling X-ray duration is described. The energy-storage condenser was charged up to 70 kV by a power supply, and the electric charges in the condenser were discharged to the X-ray tube repetitively by the grid control device. The maximum values of the grid voltage (negative value), the tube voltage, and the tube current were −1.5 kV, 70 kV, and 0.4 A, respectively. The duration of the flash X-ray pulse was primarily determined by the time constant of the grid control device and the cut-off voltage of thermoelectrons. The X-ray duration was controlled within a region of less than 1 ms; the X-ray intensity with a pulse width of 0.27 ms, a charged voltage of 70 kV, and a peak tube current of 0.4 A was 0.92 μC kg⁻¹ at 0.5 m per pulse. The maximum repetition rate was about 10 kHz, and the size of the focal spot was about 3.5×3.5 mm.
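The quoted figures imply rough per-pulse magnitudes (back-of-envelope estimates assuming a roughly rectangular pulse at the peak values; these are upper bounds, since both voltage and current sag as the condenser discharges):

\[ Q \;\lesssim\; I_{\rm peak}\,\Delta t = 0.4\,\mathrm{A} \times 0.27\,\mathrm{ms} \approx 0.11\,\mathrm{mC}, \qquad E \;\lesssim\; V_{0}\,Q \approx 70\,\mathrm{kV} \times 0.11\,\mathrm{mC} \approx 7.6\,\mathrm{J}, \]

so the storage condenser must hold at least a few joules per shot; sustained operation at the maximum 10 kHz rate would then demand average power of order tens of kilowatts, pointing to burst-mode rather than continuous use.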
In this essay, I explore John Stuart Mill's theory of government and its application to the issue of health care reform. In particular, I ask whether Mill's theory of government would justify or condemn the creation of a public health-insurance option. Although Mill's deep distrust of governmental authority would seem to align him with Republicans, Tea Partiers, libertarians, and others, who cast the public option as a "government takeover" of "our" health care system, I argue that Mill offers good reasons for seriously considering some form of government-operated health insurance. For Mill theorizes government as having a positive as well as a negative role to play in people's lives, and he explicitly endorses "public options" in different areas of life. According to his theory of government, a public health-insurance option would be just as long as it would meet the following two conditions: (1) it would not invade the "reserved territory" of individual liberty; and (2) "the case of expediency is strong." I argue that a public option would in fact meet both of these conditions, and that Mill would have likely endorsed it as an effective solution to the current health care crisis in the United States.