Throughout the biological and biomedical sciences, prescriptive ‘minimum information’ (MI) checklists specifying the key information to include when reporting experimental results are beginning to find favor with experimentalists, analysts, publishers and funders alike. Such checklists aim to ensure that methods, data, analyses and results are described to a level sufficient to support the unambiguous interpretation, sophisticated search, reanalysis and experimental corroboration and reuse of data sets, facilitating the extraction of maximum value from them. However, such checklists are usually developed independently by groups of representatives working within particular biologically or technologically delineated domains. Consequently, an overview of the full range of checklists can be difficult to establish without intensive searching, and even tracking the evolution of a single checklist may be a non-trivial exercise. Checklists are also inevitably partially redundant with respect to one another, and where they overlap is far from straightforward to determine. Furthermore, conflicts in scope and arbitrary decisions on wording and sub-structuring make integration difficult and inhibit their use in combination. Overall, these issues present significant difficulties for the users of checklists, especially those in areas such as systems biology, who routinely combine information from multiple biological domains and technology platforms. To address all of the above, we present MIBBI (Minimum Information for Biological and Biomedical Investigations), a web-based communal resource for such checklists, designed to act as a ‘one-stop shop’ for those exploring the range of extant checklist projects, to foster collaborative development, and ultimately to promote gradual integration of checklists.
Between the American Revolution and the Civil War many Americans professed to reject altogether the notion of adhering to tradition, perceiving it as a malign European influence. But by the beginning of the twentieth century, Americans had possibly become more tradition-minded than their European contemporaries. So argues Michael D. Clark in this incisive work of social and intellectual history. Challenging reigning assumptions, Clark maintains that in the period 1865 to 1942 Americans became more conscious of tradition as a social force, viewed it more positively, and used it more eclectically and selectively for personal and social enrichment. Clark expands upon the existing body of scholarly work by clearly distinguishing tradition from other ways of relating to the past and by studying particular traditions that shaped American thought and culture. He gives primary attention to five individuals who represent the growing influence of tradition in this period: the popular philosopher and historian John Fiske, representing Anglo-American tradition; the Virginia historians Philip Bruce and Lyon G. Tyler, representing a southern variation on a national tradition; the country's leading neo-Gothic architect, Ralph Adams Cram, representing a surprisingly pervasive medieval tradition; and the sociologist Charles H. Cooley, representing views on immigrant traditions. Clark examines how the idea of tradition, initially associated with family home and local town, was carried outward to the larger sphere and applied to promote American expansionism -- territorially, economically, and demographically. Tradition was also used as a weapon by well-entrenched social groups in struggles over power and values. It offered a sense of stability in a time of unsettling technological and sociological change.
Ironically, Clark shows, traditionalists of this era helped to create a modern sensibility by opposing the Victorian linear sense of history and employing instead historical cycles and typology as ways to bridge past and present. The American Discovery of Tradition, 1865--1942 describes a period when the social and intellectual forces of tradition and modernity converged in the United States. America was both a nation looking back, now having a memory of its own past, and a nation looking forward to future progress. Clark's book reveals the shaping of the modern American self and its distinctive attitude -- both positive and negative -- toward tradition.
Why should the work of the ancients and the medievals, so far as it relates to nature, still be of interest and an inspiration to us now? The contributions to this enlightening volume explore and uncover contemporary scholarship's debt to the classical and medieval past. Thinking About the Environment synthesizes religious thought and environmental theory to trace a trajectory from Mesopotamian mythology and classical and Hellenistic Greek, through classical Latin writers, to medieval Christian views of the natural world and our relationship with it. The work also offers medieval Arabic and Jewish views on humanity's inseparability from nature. The volume concludes with a study of the breakdown between science and value in contemporary ecological thought. Thinking About the Environment will be an invaluable sourcebook for those seeking to address environmental ethics from a historical perspective.
According to the extended cognition hypothesis (henceforth ExC), there are conditions under which thinking and thoughts (or more precisely, the material vehicles that realize thinking and thoughts) are spatially distributed over brain, body and world, in such a way that the external (beyond-the-skin) factors concerned are rightly accorded fully-paid-up cognitive status. According to functionalism in the philosophy of mind, “what makes something a mental state of a particular type does not depend on its internal constitution, but rather on the way it functions, or the role it plays, in the system of which it is a part” (Levin 2008). The respective fates of these two positions may not be independent of each other. The claim that ExC is in some way a form of, dependent on, entailed by, or at least commonly played out in terms of, functionalism is now pretty much part of the received view of things (see, e.g., Adams and Aizawa 2008; Clark and Chalmers 1998; Clark 2005, 2008, this volume a, b, forthcoming; Menary 2007; Rupert 2004; Sprevak manuscript; Wheeler forthcoming). Thus ExC might be mandated by the existence of functionally specified cognitive systems whose boundaries are located partly outside the skin. This is the position that Andy Clark has recently dubbed extended functionalism (Clark 2008, forthcoming; see also Wheeler forthcoming).
It is widely held in philosophy that knowing is not a state of mind. On this view, rather than knowledge itself constituting a mental state, when we know, we occupy a belief state that exhibits some additional non-mental characteristics. Fascinatingly, however, new empirical findings from cognitive neuroscience and experimental philosophy now offer direct, converging evidence that the brain can—and often does—treat knowledge as if it is a mental state in its own right. While some might be tempted to keep the metaphysics of epistemic states separate from the neurocognitive mechanics of our judgements about them, here I will argue that these empirical findings give us sufficient reason to conclude that knowledge is at least sometimes a mental state. The basis of this argument is the epistemological principle of neurocognitive parity—roughly, if the contents of a given judgement reflect the structure of knowledge, so do the neurocognitive mechanics that produced them. This principle, which I defend here, straightforwardly supports the inference from the empirical observation that the brain sometimes treats knowledge like a mental state to the epistemological conclusion that knowledge is at least sometimes a mental state. All told, the composite, belief-centric metaphysics of knowledge widely assumed in epistemology is almost certainly mistaken.
Much of contemporary epistemology proceeds on the assumption that tracking theories of knowledge, such as those of Dretske and Nozick, are dead. The word on the street is that Kripke and others killed these theories with their counterexamples, and that epistemology must move in a new direction as a result. In this paper we defend the tracking theories against purportedly deadly objections. We detect life in the tracking theories, despite what we perceive to be a premature burial.
_The Multicultural Imagination_ is a challenging inquiry into the complex interrelationship between our ideas about race and color and the unconscious. Michael Vannoy Adams takes a fresh look at the contributions of psychoanalysis to a question which affects every individual who tries to establish an effective personal identity in the context of their received 'racial' identity. Adams argues that 'race' is just as important as sex or any other content of the unconscious, drawing on clinical case material from contemporary patients for whom 'race' or color is a vitally significant social and political concern that impacts on them personally. He does not assume that racism or 'colorism' will simply vanish if we psychoanalyse them, but shows how a non-defensive ego and a self-image that is receptive to other-images can move us towards a more productive discourse of cultural differences. Wide-ranging in its references and scope, this is a book that provokes the reader - analyst or not - to confront personally those unconscious attitudes which stand in the way of authentic multicultural relationships.
Despite the ubiquity of knowledge attribution in human social cognition, its associated neural and cognitive mechanisms are poorly documented. A wealth of converging evidence in cognitive neuroscience has identified independent perspective-taking and inhibitory processes for belief attribution, but the extent to which these processes are shared by knowledge attribution isn't presently understood. Here, we present the findings of an EEG study designed to directly address this shortcoming. These findings suggest that belief attribution is not a component process in knowledge attribution, contra a standard attitude taken by philosophers. Instead, observed differences in P3b amplitude indicate that knowledge attribution doesn't recruit the strong self-perspective inhibition characteristic of belief attribution. However, both belief and knowledge attribution were observed to display a late slow wave widely associated with mental state attribution, indicating that knowledge attribution also shares in more general processing of others' mental states. These results provide a new perspective both on how we think about knowledge attribution and on Theory of Mind processes generally.
In a recent paper, Tristan Haze offers two examples that, he claims, are counterexamples to Nozick's Theory of Knowledge. Haze claims his examples work against Nozick's theory understood as relativized to belief forming methods M. We believe that they fail to be counterexamples to Nozick's theory. Since he aims the examples at tracking theories generally, we will also explain why they are not counterexamples to Dretske's Conclusive Reasons Theory of Knowledge.
The Multicultural Imagination is a challenging inquiry into the complex interrelationship between our ideas about race, color and the unconscious. Drawing on clinical case material, Michael Vannoy Adams argues that race is just as important as sex or any other content of the unconscious. He does not assume that racism will simply vanish if we psychoanalyze a patient, but shows how a non-defensive ego and a self-image that is receptive to other-images can move us towards a more productive discourse of cultural differences. The Multicultural Imagination provokes the reader--analyst or not--to confront personally those unconscious attitudes which stand in the way of authentic multicultural relationships.
Here I explore a new line of evidence for belief-credence dualism, the thesis that beliefs and credences are distinct and equally fundamental types of mental states. Despite considerable recent disagreement over this thesis, little attention has been paid in philosophy to differences in how our mindreading systems represent the beliefs and credences of others. Fascinatingly, the systems we rely on to accurately and efficiently track others’ mental states appear to function like belief-credence dualists: Credence is tracked like an emotional state, composed of both representational and affective content, whereas belief is tracked like a bare representational state with no affective component. I argue on a preliminary basis that, in this particular case, the mechanics of mentalizing likely pick out a genuine affective dimension to credence that is absent for belief, further strengthening the converging case for belief-credence dualism.
On the standard conception of risk, the degree to which an event is risky is a function of the probability of that event. Recently, Duncan Pritchard has challenged this view, proposing instead a modal account on which risk is conceived of in terms of modal ordering (2015). On this account, the degree of risk for any given event is a function of its modal distance from the actual world, not its likelihood. Pritchard's main motivation for this is that the probabilistic account cannot always explain our judgements about risk. In certain cases, equally probable events are not judged to be equally risky. Here I will argue that Pritchard's account succumbs to a similar problem. Put simply, there are cases in which judgements about risk decouple from both probability and modal ordering. Thus, if we want a theory of risk that can explain our judgements about risk, neither the probabilistic nor modal account is successful.
There is a widespread attitude in epistemology that, if you know on the basis of perception, then you couldn’t have been wrong as a matter of chance. Despite the apparent intuitive plausibility of this attitude, which I’ll refer to here as “stochastic infallibilism”, it fundamentally misunderstands the way that human perceptual systems actually work. Perhaps the most important lesson of signal detection theory (SDT) is that our percepts are inherently subject to random error, and here I’ll highlight some key empirical research that underscores this point. In doing so, it becomes clear that we are in fact quite willing to attribute knowledge to S that p even when S’s perceptual belief that p could have been randomly false. In short, perceptual processes can randomly fail, and perceptual knowledge is stochastically fallible. The narrow implication here is that any epistemological account that entails stochastic infallibilism, like safety, is simply untenable. More broadly, this myth of stochastic infallibilism provides a valuable illustration of the importance of integrating empirical findings into epistemological thinking.
The use of evidence in medicine is something we should continuously seek to improve. This book seeks to develop our understanding of evidence of mechanism in evaluating evidence in medicine, public health, and social care; and also offers tools to help implement improved assessment of evidence of mechanism in practice. In this way, the book offers a bridge between more theoretical and conceptual insights and worries about evidence of mechanism and practical means to fit the results into evidence assessment procedures.
This cutting-edge collection of essays showcases the work of some of the most influential theorists of the past thirty years as they grapple with the question of how literature should be treated in contemporary theory. The contributors challenge trends that have recently dominated the field--especially those that emphasize social and political issues over close reading and other analytic methods traditionally associated with literary criticism. Written especially for this collection, these essays argue for the importance of aesthetics, poetics, and aesthetic theory as they present new and stimulating perspectives on the directions which theory and criticism will take in the future. In addition to providing a selection of distinguished critics writing at their best, this collection is valuable because it represents a variety of fields and perspectives that are not usually found together in the same volume. Michael Clark's introduction provides a concise, cogent history of major developments and trends in literary theory from World War II to the present, making the entire volume essential reading for students and scholars of literature, literary theory, and philosophy.
Here I defend two counterexamples to Nozick’s truth-tracking theory of knowledge from an attack on them by Adams and Clarke. With respect to the first counterexample, Adams and Clarke make the error of judging that my belief counts as knowledge. More demonstrably, with respect to the second counterexample they make the error of thinking that, on Nozick’s method-relativized theory, the method M in question in any given case must be generally reliable.
It is all but universally accepted in epistemology that knowledge is factive: S knows that p only if p. The purpose of this thesis is to present an argument against the factivity of knowledge and in doing so develop a non-factive approach to the analysis of knowledge. The argument against factivity presented here rests largely on empirical evidence, especially extant research into visuomotor noise, which suggests that the beliefs that guide everyday motor action are not strictly true. However, as we still want to attribute knowledge on the basis of successful motor action, I argue that the best option is to replace factivity with a weaker constraint on knowledge, one on which certain false beliefs might still be known. In defence of this point, I develop the non-factive analysis of knowledge, which demonstrates that a non-factive constraint might do the same theoretical work as factivity.
This unique collection of essays on the late Pierre Hadot’s revolutionary approach to studying and practising philosophy traces the links between his work and that of thinkers from Wittgenstein to the French postmodernists. It shows how his secular spiritual exercises expand our horizons, enabling us to be in a fuller, more authentic way.
- Comprehensive treatment of a neglected theme: philosophy’s practical relevance in our lives
- Interdisciplinary analysis reflects the wide influence of Hadot’s thought
- Explores the links between Hadot’s ideas and those of a wealth of ancient and modern thinkers, including the French postmodernists
- Offers a practical ‘third way’ in philosophy beyond the dichotomy of Continental and analytical traditions
One of the most widely recognised intuitions about knowledge is that knowing precludes believing truly as a matter of luck. On Pritchard’s highly influential modal account of epistemic luck, luckily true beliefs are, roughly, those for which there are many close possible worlds in which the same belief formed in the same way is false. My aim is to introduce a new challenge to this account. Starting from the observation—as documented by a number of recent EEG studies—that our capacity to detect visual stimuli fluctuates with the phase of our neural oscillations, I argue that there can be very close possible worlds in which an actual-world detectable stimulus is undetectable. However, this doesn’t diminish our willingness to attribute knowledge in the case that the stimulus is detectable, even when undetectability would result in the same belief formed in the same way being false. As I will argue at length, the modal account appears unable to accommodate this result.
"I have entitled this book _For Love of the Imagination_. Long ago, I fell in love with the imagination. It was love at first sight. I have had a lifelong love affair with the imagination. I would love for others, through this book, to fall in love, as I once did, with the imagination." Michael Vannoy Adams, from the Preface. _For Love of the Imagination_ is a book about the imagination – about what and how images mean. Jungian psychoanalysis is an imaginal psychology – or what Michael Vannoy Adams calls "imaginology," the study of the imagination. What is so distinctive – and so valuable – about Jungian psychoanalysis is that it emphasizes images. _For Love of the Imagination_ is also a book about interdisciplinary applications of Jungian psychoanalysis. What enables these applications is that all disciplines include images of which they are more or less unconscious. Jungian psychoanalysis is in an enviable position to render these images conscious, to specify what and how they mean. On the contemporary scene, as a result of the digital revolution, there is no trendier word than "applications" – except, perhaps, the abbreviation "apps." In psychoanalysis, there is a "Freudian app" and a "Jungian app." The "Jungian app" is a technology of the imagination. This book applies Jungian psychoanalysis to images in a variety of disciplines. _For Love of the Imagination_ also includes the 2011 Moscow lectures on Jungian psychoanalysis. It will be essential reading for psychoanalysts, psychotherapists, students, and those with an interest in Jung.
The problem of historiographical evaluation is simply this: By what evaluative criteria might we say that certain works of historiography are better than others? One recently proposed solution to this problem comes by way of Kuukkanen’s postnarrativist philosophy of historiography. Kuukkanen argues that because many historiographically interesting statements lack truth-values, we cannot evaluate historiographical claims on a truth-functional basis. In the place of truth, Kuukkanen suggests that we evaluate historiographical claims in terms of justification. The problem with this proposal, as I will argue here, is that it isn’t at all clear what it means for a neither-true-nor-false claim to be justified. Moreover, this proposal also runs into trouble with the factivity of knowledge. The solution I propose here might be called “two-valued” postnarrativism, which retains Kuukkanen’s framework, except with a stricter ontology devoid of neither-true-nor-false historiographical statements. In arguing for this approach to historiographical evaluation, this paper will be structured in the following way: First, I’ll describe Kuukkanen’s postnarrativism in more detail, focusing especially on his account of historiographical evaluation. Next, I’ll introduce two problems that accompany this account, one originating from the factivity of knowledge and the other from trying to divorce justification from the concept of truth. Finally, I argue that not only might these problems be solved by simply committing to all historiographical claims being either true or false, but that Kuukkanen’s account is especially amenable to this.
There are brains in vats in the actual world. These “cerebral organoids” are roughly comparable to the brains of three-month-old foetuses, and conscious cerebral organoids seem only a matter of time. Philosophical interest in conscious cerebral organoids has thus far been limited to bioethics, and the purpose of this paper is to discuss cerebral organoids in an epistemological context. In doing so, I will argue that it is now clear that there are close possible worlds in which we are BIVs. Not only does this solidify our intuitive judgement that we cannot know that we are not BIVs, but it poses a fundamental problem for both the neo-Moorean antisceptical strategy, which purports to allow us to know that we aren’t BIVs, and the safety condition on knowledge itself. Accordingly, this case is especially instructive in illustrating just how epistemologically relevant empirical developments can be.
The knowledge-centric Theory of Mind research program suggested by Phillips et al. stands to gain significant value by embracing a neurocognitive approach that takes full advantage of techniques like fMRI and EEG. This neurocognitive approach has already begun providing important insights into the mechanisms of knowledge attribution, insights which support the claim that it is more basic than belief attribution.
The distinction between true belief and knowledge is one of the most fundamental in philosophy, and a remarkable effort has been dedicated to formulating the conditions on which true belief constitutes knowledge. For decades, much of this epistemological undertaking has been dominated by a single strategy, referred to here as the modal approach. Shared by many of the most widely influential constraints on knowledge, including the sensitivity, safety, and anti-luck/risk conditions, this approach rests on a key underlying assumption: that the modal profiles available to known and unknown beliefs are in some way asymmetrical. The first aim of this paper is to deconstruct this assumption, identifying its plausibility with the way in which epistemologists frequently conceptualize human perceptual systems as excluding certain varieties of close error under conditions conducive to knowledge acquisition. The second aim of this paper is to then argue that a neural phase phenomenon indicates that this conceptualization is quite likely mistaken. This argument builds on the previous introduction of this neural phase to the context of epistemology, expanding the use of neural phase cases beyond relatively narrow questions about epistemic luck to a much more expansive critique of the modal approach as a whole.
Quite likely the most sacrosanct principle in epistemology, it is near-universally accepted that knowledge is factive: knowing that p entails p. Recently, however, Bricker, Buckwalter, and Turri have all argued that we can and often do know approximations that are strictly speaking false. My goal with this paper is to advance this nascent non-factive project in two key ways. First, I provide a critical review of these recent arguments against the factivity of knowledge, allowing us to observe that these arguments mutually reinforce one another, offsetting their respective weaknesses and thereby offering the non-factive project a much stronger foundation than when these arguments were isolated. Next, I argue tentatively in favor of Bricker’s truthlikeness framework over the representational adequacy account favored by Buckwalter and Turri. Taken together, while none of this constitutes a knock-down argument against factivity, it does allow us to quiet some of the more immediate worries surrounding the non-factive project.
The volume includes twenty-five research papers presented by colleagues, former students, friends and admirers as gifts to John L. Bell to celebrate his 60th birthday. Like Bell’s own work, the contributions cross boundaries into several interrelated fields: the foundations of mathematics and logic, analytical philosophy, philosophy of science, philosophy of mathematics, decision theory and the foundations of economics. The contributions are new work by highly respected figures, several of whom are among the key figures in their fields. Most articles are contributions to current philosophical debates, but contributions also include some new mathematical results, important historical surveys, and a translation by Wilfrid Hodges of a key work of Arabic logic.
Tristan Haze claims we have made two mistakes in replying to his two attempted counter-examples to Tracking Theories of Knowledge. Here we respond to his two recent claims that we have made mistakes in our reply. We deny both of his claims.
Since Kripke's attack on Nozick's Tracking Theory of knowledge, there has been strong suspicion that tracking theories are false. We think that neither Kripke's arguments and examples nor other recent attacks in the literature show that the tracking theories are false. We cannot address all of these concerns here, but we will show why some of the most discussed examples from Kripke do not demonstrate that the tracking theories are false.
Democratic countries, such as Australia, face the dilemma of preserving public and national security without sacrificing fundamental freedoms. In the context where the rule of law is an underlying assumption of the constitutional framework, Emergency Powers in Australia provides a succinct analysis of the sorts of emergency which have been experienced in Australia and an evaluation of the legal weapons available to the authorities to cope with these emergencies. It analyses the scope of the defence power to determine the constitutionality of federal legislation to deal with wartime crises and the 'war' on terrorism, the extent of the executive power and its relationship to the prerogative, the deployment of the defence forces in aid of the civil power, the statutory frameworks regulating the responses to civil unrest, and natural disasters. The role of the courts when faced with challenges to the invocation of emergency powers is explained and analysed.
Multiple epistemological programs make use of intuitive judgments pertaining to an individual’s ability to gain knowledge from exclusively probabilistic/statistical information. This paper argues that these judgments likely form without deference to such information, instead being a function of the degree to which having knowledge is representative of an agent. Thus, these judgments fit the pattern of formation via a representativeness heuristic, like that famously described by Kahneman and Tversky to explain similar probabilistic judgments. Given this broad insensitivity to probabilistic/statistical information, it directly follows that these epistemic judgments are insensitive to a given agent’s epistemic status. From this, the paper concludes that, breaking with common epistemological practice, we cannot assume that such judgments are reliable.
ObjectiveBulimia nervosa and binge eating disorder are eating disorders characterized by recurrent binge eating episodes. Overlap exists between ED diagnostic groups, with BE episodes presenting one clinical feature that occurs transdiagnostically. Neuroimaging of the responses of those with BN and BED to disorder-specific stimuli, such as food, is not extensively investigated. Furthermore, to our knowledge, there have been no previous published studies examining the neural response of individuals currently experiencing binge eating, to low energy foods. Our objective was to examine (...) the neural responses to both low energy and high energy food images in three emotive categories in BN and BED participants.MethodsNineteen females with BN or BED, comprising the binge eating group, and 19 age-matched healthy control ’s completed thorough clinical assessment prior to functional MRI. Neural response to low energy and high energy foods and non-food images was compared between groups using whole-brain exploratory analyses, from which six regions of interest were then selected: frontal, occipital, temporal, and parietal lobes; insula and cingulate.ResultsIn response to low energy food images, the BEG demonstrated differential neural responses to all three low energy foods categories compared to HCs. Correlational analyses found a significant association between frequency of binge episodes and diminished temporal lobe and greater occipital lobe response. In response to high energy food images, compared to HC’s, the BEG demonstrated significantly decreased neural activity in response to all high energy food images. The HC’s had significantly greater neural activity in the limbic system, occipital lobe, temporal lobe, frontal lobe, and limbic system in response to high energy food images.ConclusionResults in the low energy food condition indicate that binge frequency may be related to increased aberrant neural responding. 
Furthermore, differences were found between groups in all ROIs except the insula. The neural response seen in the BEG to disgust food images may indicate disengagement with these particular stimuli. In the high energy food condition, results demonstrate that neural activity in BN and BED patients may decrease in response to high energy foods, suggesting disengagement with foods more consistent with those consumed during a binge eating episode.
Preface to the second edition -- Preface to the first edition -- Psycho-mythology : meschugge? -- Dreams and fantasies : manifestations of the mythological unconscious -- African-American dreaming and the "lion in the path" : racism and the cultural unconscious -- "Hapless" the Centaur : an archetypal image, amplification, and active imagination -- Pegasus and visionary experience : from the white winged horse to the "flying red horse" -- The bull, the labyrinth, and the Minotaur : from archaeology to "archetypology" (with an apology to Ariadne) -- Griffins, gold, and dinosaurs : mythology and "fantastic paleontology" -- Dreaming of a unicorn : a comparison of Lacanian and Jungian interpretation -- "Destiny" and the call to heroism : a dream of vocation and individuation -- List of publications by Michael Vannoy Adams.
_Paradoxes from A to Z, Third edition_ is the essential guide to paradoxes, and takes the reader on a lively tour of puzzles that have taxed thinkers from Zeno to Galileo, and from Lewis Carroll to Bertrand Russell. Michael Clark uncovers an array of conundrums, such as Achilles and the Tortoise, Theseus’ Ship, and the Prisoner’s Dilemma, taking in subjects as diverse as knowledge, science, art, and politics. Clark discusses each paradox in non-technical terms, considering its significance and looking at likely solutions. This third edition is revised throughout, and adds nine new paradoxes with important bearings on areas such as law, logic, ethics, and probability. _Paradoxes from A to Z, Third edition_ is an ideal starting point not just for those interested in philosophical puzzles and conundrums, but for anyone seeking to hone their thinking skills.
This innovative volume, by Michael Shapiro, is not about Adam Smith in the sense in which 'about' is usually understood, for it is neither a comprehensive explication of his views nor a careful tracing of their sources. Instead it is a confrontation. This is a book about modernity whose vehicle is a reading of Adam Smith—it is an enactment of the conviction that despite the contribution Smith made to creating and legitimating the conceptual space for modern, commercial, liberal, and democratic society, his views are inadequate for those who want an effective, politicized understanding of the present. Shapiro's ultimate goal in this examination is to 'exemplify a way of doing political theory—one that challenges some traditional ways of constructing and celebrating the political theory canon'.
The emerging neurocomputational vision of humans as embodied, ecologically embedded, social agents—who shape and are shaped by their environment—offers a golden opportunity to revisit and revise ideas about the physical and information-theoretic underpinnings of life, mind, and consciousness itself. In particular, the active inference framework (AIF) makes it possible to build bridges from computational neuroscience and robotics/AI to ecological psychology and phenomenology, revealing common underpinnings and overcoming key limitations. AIF is mechanistic without being reductive, staying fully grounded in a naturalistic and information-theoretic foundation, using the principle of free energy minimization. The latter provides a theoretical basis for a unified treatment of particles, organisms, and interactive machines, spanning from the inorganic to the organic, from non-life to life, and from natural to artificial agents. We provide a brief introduction to AIF, then explore its implications for evolutionary theory, ecological psychology, embodied phenomenology, and robotics/AI research. We conclude the paper by considering implications for machine consciousness.
Extended Cognition examines the way in which features of a subject's cognitive environment can become constituent parts of the cognitive process itself. This volume explores the epistemological ramifications of this idea, bringing together academics from a variety of different areas, to investigate the very idea of an extended epistemology.
This bibliography in two volumes, originally published in 1988, lists and describes works by and about Jacques Lacan published in French, English, and seven other languages including Japanese and Russian. It incorporates and corrects where necessary all information from earlier published bibliographies of Lacan’s work. Also included as background works are books and essays that discuss Lacan in the course of a more general study, as well as all relevant items in various bibliographic sources from many fields.
Strange inversions occur when things work in ways that turn received wisdom upside down. Hume offered a strangely inverted story about causation, and Darwin, about apparent design. Dennett suggests that a strange inversion also occurs when we project our own reactive complexes outward, painting our world with elusive properties like cuteness, sweetness, blueness, sexiness, funniness, and more. Such properties strike us as experiential causes, but they are really effects—a kind of shorthand for whole sets of reactive dispositions rooted in the nuts and bolts of human information processing. Understanding the nature and origins of that strange inversion, Dennett believes, is thus key to understanding the nature and origins of human experience itself. This paper examines this claim, paying special attention to recent formulations that link that strange inversion to the emerging vision of the brain as a Bayesian estimator, constantly seeking to predict the unfolding sensory barrage.
There is currently an explosion of interest in grounding. In this article we provide an overview of the debate so far. We begin by introducing the concept of grounding, before discussing several kinds of scepticism about the topic. We then identify a range of central questions in the theory of grounding and discuss competing answers to them that have emerged in the debate. We close by raising some questions that have been relatively neglected but which warrant further attention.
In this paper, we argue that revisionary theories about the nature and extent of Hume's scepticism are mistaken. We claim that the source of Hume's pervasive scepticism is his empiricism. As earlier readings of Hume's Treatise claim, Hume was a sceptic – and a radical one. Our position faces one enormous problem. How is it possible to square Hume's claims about normative reasoning with his radical scepticism? Despite the fact that Hume thinks that causal reasoning is irrational, he explicitly claims that one can and should make normative claims about beliefs being ‘reasonable’. We show that even though Hume thinks that our causal beliefs are rationally unjustified, there is nonetheless a ‘relative’ sense of justification available to Hume and that he relies on this ‘relative’ sense in those places where he makes normative claims about what we ought to believe.