The way in which securities are traded is very different from the idealized picture of a frictionless and self-equilibrating market offered by the typical finance textbook. Market Liquidity offers a more accurate and authoritative take on liquidity and price discovery. The authors start from the assumption that not everyone is present in the market at all times, and that even the limited number of participants who are have quite diverse information about the security's fundamentals. As a result, the order flow is a complex mix of information and noise, and a consensus price emerges only gradually over time as the trading process evolves and the participants interpret the actions of other traders. Thus a security's actual transaction price may deviate from its fundamental value, as it would be assessed by a fully informed set of investors. This book takes these deviations seriously, explaining why and how they emerge in the trading process and how they are eventually eliminated. The authors draw on a vast body of theoretical insights and empirical findings on security price formation that have accumulated over the last thirty years and have come to form a well-defined field within financial economics known as 'market microstructure.' Focusing on liquidity and price discovery, they analyze the tension between the two, pointing out that when price-relevant information reaches the market through trading pressure rather than through a public announcement, liquidity suffers. The book also confronts many puzzling phenomena in securities markets and uses the analytical tools and empirical methods of market microstructure to understand them: why liquidity changes over time; why large trades move prices up or down, and why these price changes are subsequently reversed; why we see concentration of securities trading; why some traders willingly disclose their intended trades while others hide them; and why we observe temporary deviations from arbitrage prices.
[opening paragraph]: Walter Freeman discusses with Jean Burns some of the issues relating to consciousness raised in his recent book. Burns: To understand consciousness we need to know its relationship to the brain, and to do that we need to know how the brain processes information. A lot of people think of brain processing in terms of individual neurons, and you're saying that brain processing should be understood in terms of dynamical states of populations?
This book offers a provocative, clear and rigorously argued account of the nature of perception and its role in the production of knowledge. Walter Hopp argues that perceptual experiences do not have conceptual content, and that what makes them play a distinctive epistemic role is not the features which they share with beliefs, but something that in fact sets them radically apart. He explains that the reason-giving relation between experiences and beliefs is what Edmund Husserl called 'fulfilment' - in which we find something to be as we think it to be. His book covers a wide range of central topics in contemporary philosophy of mind, epistemology and traditional phenomenology. It is essential reading for contemporary analytic philosophers of mind and phenomenologists alike.
"Local History/Global Designs" is one of the most important books in the historical humanities to have emerged since the end of the Cold War University. This is vintage Mignolo: packed with insights, breadth, and intellectual zeal.
This book examines John Locke's claims about the nature and workings of language. Walter Ott proposes an interpretation of Locke's thesis in which words signify ideas in the mind of the speaker, and argues that rather than employing such notions as sense or reference, Locke relies on an ancient tradition that understands signification as reliable indication. He then uses this interpretation to explain crucial areas of Locke's metaphysics and epistemology, including essence, abstraction, knowledge and mental representation. His discussion challenges many of the orthodox readings of Locke, and will be of interest to historians of philosophy and philosophers of language alike.
In this book, Alison Ross engages in a detailed study of Walter Benjamin’s concept of the image, exploring the significant shifts in Benjamin’s approach to the topic over the course of his career. Using Kant’s treatment of the topic of sensuous form in his aesthetics as a comparative reference, Ross argues that Benjamin’s thinking on the image undergoes a major shift between his 1924 essay on ‘Goethe’s Elective Affinities’, and his work on The Arcades Project from 1927 up until his death in 1940. The two periods of Benjamin’s writing share a conception of the image as a potent sensuous force able to provide a frame of existential meaning. In the earlier period this function attracts Benjamin’s critical attention, whereas in the later he mobilises it for revolutionary outcomes. The book gives a critical treatment of the shifting assumptions in Benjamin’s writing about the image that warrant this altered view. It draws on hermeneutic studies of meaning, scholarship in the history of religions and key texts from the modern history of aesthetics to track the reversals and contradictions in the meaning functions that Benjamin attaches to the image in the different periods of his thinking. Above all, it shows the relevance of a critical consideration of Benjamin’s writing on the image for scholarship in visual culture, critical theory, aesthetics and philosophy more broadly.
There has been a great deal of talk recently among historians of Christian reflection about the problem and the possibility of a ‘plurality of theologies’. Directives from such eminent spokesmen as Karl Rahner have underscored the need for a rationale by which to demonstrate that the presence of different orientations does not necessarily violate the unitary character of a Christian tradition. Other Catholic thinkers have offered arguments for ascribing a relative status to the ‘Thomistic style’ of theology, and cases have been made for the inclusion of additional schematic frameworks. Beyond all of this, there are elegant suggestions in the writings of Bernard Lonergan that there is sufficient theoretical, even metaphysical, basis to justify plurality in theology. The claim would seem to be that different theological orientations are expressive of distinct fields of vision which are not necessarily mutually exclusive.
What is the origin of the concept of a law of nature? How much does it owe to theology and metaphysics? To what extent do the laws of nature permit contingency? Are there exceptions to the laws of nature? Is it possible to give a reductive analysis of lawhood, or is it a primitive?

Twelve brand-new essays by an international team of leading philosophers take up these and other central questions on the laws of nature, whilst also examining some of the most important intuitions and assumptions that have guided the debate over laws of nature since the concept's invention in the seventeenth century.

Laws of Nature spans the history of philosophy and of science, contemporary metaphysics, and contemporary philosophy of science.

Contents:
1. Intuitions and Assumptions in the Debate over Laws of Nature, Walter Ott and Lydia Patton
2. Early Modern Roots of the Philosophical Concept of a Law of Nature, Helen Hattab
3. Laws of Nature and the Divine Order of Things: Descartes and Newton on Truth in Natural Philosophy, Mary Domski
4. Leges sive natura: Bacon, Spinoza, and a Forgotten Concept of Law, Walter Ott
5. Laws and Powers in the Frame of Nature, Stathis Psillos
6. Laws and Ideal Unity, Angela Breitenbach
7. Becoming Humean, John W. Carroll
8. A Perspectivalist Better Best System Account of Lawhood, Michela Massimi
9. Laws: An Invariance Based Account, James Woodward
10. How the Explanations of Natural Laws Make Some Reducible Physical Properties Natural and Explanatorily Powerful, Marc Lange
11. Laws and their Exceptions, Stephen Mumford
12. Are laws of nature consistent with contingency?, Nancy Cartwright and Pedro Merlussi
Few twentieth-century thinkers have proven as influential as Walter Benjamin, the German-Jewish philosopher and cultural and literary critic. Richard Wolin's book remains among the clearest and most insightful introductions to Benjamin's writings, offering a philosophically rich exposition of his complex relationship to Adorno, Brecht, Jewish Messianism, and Western Marxism. Wolin provides nuanced interpretations of Benjamin's widely studied writings on Baudelaire, historiography, and art in the age of mechanical reproduction. In a new Introduction written especially for this edition, Wolin discusses the unfinished _Arcades Project_, as well as recent tendencies in the reception of Benjamin's work and the relevance of his ideas to contemporary debates about modernity and postmodernity.
Walter Burley is the author of a treatise, entitled De primo et ultimo instanti, which is regarded as the most popular medieval work on the problem of assigning first and last instants of being to permanent things. In this paper, however, the author does not deal with this treatise directly. She looks instead at Burley’s Physics commentary to see how he applies the ideas presented in De primo et ultimo instanti to the solution of an Aristotelian puzzle about the ceasing to be of the present instant. In Burley’s interpretation, the relevant question raised by the puzzle is whether the present instant ceases to be when it is or when it is not. While Aristotle’s argument quickly dismisses the first alternative as absurd, Burley defends it by appealing to the ‘expositions’ of sentences about ceasing. Given that the sentence ‘this instant ceases to be’ has two expositions—‘this instant now is and immediately afterwards will not be’ or ‘this instant now is not and immediately beforehand was’—Burley maintains that the sentence is true in the first exposition but not in the second, so that an instant ceases to be when it is and not when it is not.
This article reviews concepts of, as well as neurocognitive and genetic studies on, empathy. Whereas cognitive empathy can be equated with affective theory of mind, that is, with mentalizing the emotions of others, affective empathy is about sharing emotions with others. The neural circuits underlying different forms of empathy do overlap but also involve rather specific brain areas for cognitive (ventromedial prefrontal cortex) and affective (anterior insula, midcingulate cortex, and possibly inferior frontal gyrus) empathy. Furthermore, behavioral and imaging genetic studies provide evidence for a genetic basis for empathy, indicating a possible role for oxytocin and dopamine as well as for a genetic risk variant for schizophrenia near the gene ZNF804A.
Are living organisms--as Descartes argued--just machines? Or is the nature of life such that it can never be fully explained by mechanistic models? In this thought-provoking and controversial book, eminent geophysicist Walter M. Elsasser argues that the behavior of living organisms cannot be reduced to physico-chemical causality. Suggesting that molecular biology today is at the same point as Newtonian physics on the eve of the quantum revolution, Elsasser lays the foundation for a theoretical biology that points the way toward a natural philosophy of organic life. Explicitly repudiating "vitalism" (the notion that the laws of nature need to be modified when applied to living organisms), Elsasser argues instead that the structural complexity of even a single living cell is "transcomputational"--that is, beyond the power of any imaginable system to compute. Beginning from this insight, Elsasser leads the reader through a step-by-step process that ultimately arrives at the conclusion that living and non-living matter are separated by "a no-man's land of irrationality." Trained in Germany as a physicist, Elsasser first pondered the implications of quantum mechanics for biology as early as 1951. The more closely he studied the inherent complexity of life, the more skeptical he became of the reductionist view of organisms as tiny machines. "An organism," he concluded, "is a source of causal chains which cannot be traced beyond a terminal point because they are lost in the unfathomable complexity of the organism." Like the physicist who works within the bounds of an unfathomable universe, Elsasser argues, the biologist must seek answers within a system that is no less unfathomable.