The paper sets out the challenges facing the Police in respect of the detection and prevention of the volume crime of burglary. A discussion of data mining and decision support technologies that have the potential to address these issues is undertaken and illustrated with reference to the authors’ work with three Police Services. The focus is upon the use of “soft” forensic evidence, which refers to modus operandi and the temporal and geographical features of the crime, rather than “hard” evidence such as DNA or fingerprint evidence. Three objectives underpin this paper. First, given the continuing expansion of forensic computing and its role in the emergent discipline of Crime Science, it is timely to present a review of existing methodologies and research. Second, it is important to extract some practical lessons concerning the application of computer science within this forensic domain. Finally, from the lessons to date, a set of conclusions will be advanced, including the need for multidisciplinary input to guide further developments in the design of such systems. The objectives are achieved by first considering the task performed by the intended users of such systems. The discussion proceeds by identifying the portions of these tasks for which automation would be both beneficial and feasible. The knowledge discovery from databases process is then described, starting with an examination of the data that police collect and the reasons for storing it. The discussion progresses to the development of crime matching and predictive knowledge, which are operationalised in decision support software. The paper concludes by arguing that computer science technologies which can support criminal investigations are wide-ranging and include geographical information systems displays, clustering and link analysis algorithms, and the more complex use of data mining technology for profiling crimes or offenders and matching and predicting crimes.
We also argue that knowledge from disciplines such as forensic psychology, criminology and statistics is essential to the efficient design of operationally valid systems.
One of the most important problems of modern philosophy concerns the place of subjectivity in a purely physical universe. Brian Loar was a major contributor to the discussion of this problem for over four decades. This volume brings together his most important and influential essays in the philosophy of language and of mind.
In this article I present two arguments from Brian Hebblethwaite for the conclusion that multiple incarnations are impossible, as well as the analyses of those arguments provided by three other thinkers: Oliver Crisp, Peter Kevern, and Robin Le Poidevin. I argue that both of Hebblethwaite's arguments are unsound.
Essays on Wittgensteinian Themes Dedicated to Brian McGuinness, edited by Joachim Schulte and Göran Sundholm. Preface: For thirty-five years the international community of philosophers has known Brian McGuinness as a major authority on the ...
William Hasker replies to my arguments against Social Trinitarianism, offers some criticism of my own view, and begins a sketch of another account of the Trinity. I reply with some defence of my own theory and some questions about his.
In "The Semantics of Singular Terms," Brian Loar described and criticized a "causal" theory of reference and offered a new "description" theory. It is argued that the particular causal theory described is not to be found in the papers by Donnellan and Kripke cited as evidence for it, and is a straw man. Further, prima facie, Loar's new description theory fails to meet Kripke's noncircularity condition. Should Loar attempt to meet it, his theory is likely to run foul of Kripke's usual "arguments from ignorance and error" against description theories.
In his recent article, ‘A Gift to Theology? Jean-Luc Marion's ‘Saturated Phenomena’ in Christological Perspective’, Brian Robinette has critiqued Marion's phenomenology for confining theology to a one-sided approach to Christology, one that stresses only the passive, mystical reception of Christ. To correct this imbalance, Robinette brings Marion into dialogue with those more active Christologies or ‘prophetical-ethical’ liberation theologies of Gustavo Gutierrez, Johann Baptist Metz and others that stress a life-praxis focused on confronting evil and suffering. In this essay I argue that Robinette has not fully developed the ‘logic’ of Marion's phenomenology of the ‘call and the gifted’, in which both a passive and an active element are operative. I explore more fully that very dynamic phenomenological process of the call-and-the-gifted as developed in Marion's work Being Given: Toward a Phenomenology of Givenness. Once viewed in Christological perspective, and especially in light of Christ's death and resurrection, Marion's phenomenology entails an ethical trope consistent with the mission of Christ as rendered in Scriptural revelation, and thus the gap between Marion's work and the prophetical-ethical theologies of Gutierrez and Baptist Metz becomes narrowed.
Human beings are peculiar. In laboratory experiments, they often cooperate in one-shot prisoners’ dilemmas, they frequently offer 1/2 and reject low offers in the ultimatum game, and they often bid 1/2 in the game of divide-the-cake. All these behaviors are puzzling from the point of view of game theory. The first two are irrational, if utility is measured in a certain way. The last isn’t positively irrational, but it is no more rational than other possible actions, since there are infinitely many other Nash equilibria besides the one in which both players bid 1/2. At the same time, these behaviors seem to indicate that people are sometimes inclined to be cooperative, fair, and just. In his stimulating new book, Brian Skyrms sets himself the task of showing why these inclinations evolved, or how they might have evolved, under the pressure of natural selection. The goal is not to justify our ethical intuitions, but to explain why we have them.
Brian Trainor argues that the current hostility of political theorists towards the idea of the common good is in part due to the influence of Isaiah Berlin's concept of `value pluralism', or the incommensurability of basic human values. I agree with Trainor's opposition to the `agonistic' interpretation of pluralism, associated with thinkers like Chantal Mouffe. However, it is not the case that the only alternative to the pluralism-agonism thesis is the monist defence of a thick common good advocated by Trainor. Between these extremes there is a middle way that accepts the deep plurality of values in Berlin's sense, but recognizes a case for a thin conception of the common good — that is, a liberal political framework.
In The Ant Trap, Brian Epstein proposes a bold new systematic strategy for developing social ontology. He explores the history and current state of the art and provides pointed critiques of leading theories in the field. His framework, encompassing frames that provide principles for grounding social facts, is developed in some detail across a variety of social practices and applied to revealing real-world as well as hypothetical examples. If Epstein's account holds, it should provide new directions and standards of inquiry in both social science and social philosophy.
In teaching jurisprudence, I typically distinguish between two different families of theories of adjudication—theories of how judges do or should decide cases. “Formalist” theories claim that the law is “rationally” determinate, that is, the class of legitimate legal reasons available for a judge to offer in support of his or her decision justifies one and only one outcome either in all cases or in some significant and contested range of cases; and adjudication is thus “autonomous” from other kinds of reasoning, that is, the judge can reach the required decision without recourse to nonlegal normative considerations of morality or political philosophy. I also note that “formalism” is sometimes associated with the idea that judicial decision-making involves nothing more than mechanical deduction on the model of the syllogism—Beccaria, for example, expresses such a view. I call the latter “Vulgar Formalism” to emphasize that it is not a view to which anyone today cares to subscribe.
Brian Loar argues that we can account for the conceptual independence of coextensive terms purely psychologically, by appealing to conceptual rather than semantic differences between concepts, and that this leaves room for assuming that phenomenal and physical concepts can be coextensive on a posteriori grounds despite the fact that both sorts of concepts refer directly. I argue that Loar does not remove the mystery of the coextensiveness of those concepts because he does not offer any explanation of why they should be coextensive. Secondly, I argue that even if we grant that phenomenal and physical concepts can be coextensive on a posteriori grounds, we are committed to holding that there are two different and essential modes of presentation of phenomenal properties, the physical and the phenomenal, and that this precludes us from seeing phenomenal properties as essentially physical in an unrelativized sense.
In 1919 the Animal Breeding Research Department was established in Edinburgh. This Department, later renamed the Institute of Animal Genetics, forged an international reputation, eventually becoming the centrepiece of a cluster of new genetics research units and institutions in Edinburgh after the Second World War. Yet despite its significance for institutionalising animal genetics research in the UK, the origins and development of the Department have not received as much scholarly attention as its importance warrants. This paper sheds new light on Edinburgh’s place in early British genetics by drawing upon recently catalogued archival sources including the papers of James Cossar Ewart, Regius Professor of Natural History at the University of Edinburgh between 1882 and 1927. Although presently a marginal figure in genetics historiography, Ewart established two sites for experimental animal breeding work between 1895 and 1911 and played a central role in the founding of Britain’s first genetics lectureship, also in 1911. These early efforts helped to secure government funding in 1913. However, a combination of the First World War, bureaucratic problems and Ewart’s personal ambitions delayed the creation of the Department and the appointment of its director by another six years. This paper charts the institutionalisation of animal breeding and genetics research in Edinburgh within the wider contexts of British genetics and agriculture in the early twentieth century.
Brian Barry's Culture and Equality is probably the most powerful liberal egalitarian critique of multiculturalism addressing the pathologies of recognizing difference of ethnicity, religion, race, and culture. In this essay, I examine Barry's approach to the law, which underpins his theory of egalitarianism, to determine whether it is enough — as Barry thinks it is — to insist on either applying the same law for everyone so that exemptions are foreclosed in general, or repealing the law since the case for its existence is not justified. I find that Barry's effort is inadequate. Because the conditions for exemptions are not specified, exemptions are merely defensible, not just. Using the headscarf controversy in France to illustrate why Barry's approach backfires, I argue that enforcing the same law for all undermines the very politics of redistribution that Barry champions.
The Philosophical Challenge from China, edited by Brian Bruya, undoubtedly occupies an important place in the discourse about what practices and authorities are relevant to Philosophy as an academic discipline. Its confident reorientation of philosophical relevance in the context of Anglophone academics will hopefully speak meaningfully to any remaining skeptics of the usefulness of Chinese philosophy. The intended audience of this effort, however, is shrinking, or, more accurately, those willing to be convinced are increasingly few, and those who remain are simply the staunch traditionalists of the so-called Western paradigm. This evokes the thought that anthologies that strive to show relevance...
Brian Leiter and Peter Kail have delivered thoughtful critiques of my book, Nietzsche’s Naturalism: Philosophy and the Life Sciences in the Nineteenth Century. It is a great pleasure to respond to these critiques, since they raise some crucial issues with regard to Nietzsche’s understanding of naturalism and normativity. On the one hand, there are many areas of agreement: Nietzsche’s philosophical project is best understood along the lines of naturalism; developments in the nineteenth-century life sciences, broadly speaking, play a crucial role in the formation of Nietzsche’s naturalism; and Nietzsche’s relationship to both Darwin and Darwin’s neo-Kantian interpreters is more complex than generally assumed. On the...
As the author of Justice as Impartiality, I am not ashamed to admit that I was delighted by the liveliness of the discussion generated by it at the meeting on which this symposium is based. I am likewise grateful to the six authors for finding the book worthy of the careful attention that they have bestowed on it. Between them, the symposiasts take up many more points than I can cover in this response. I shall therefore focus on some themes that cluster round the contractual device that I associate with the notion of justice as impartiality. Is it necessary? If it is not necessary, is it nevertheless useful? Within an overall contractual framework, is the form of contract that I propose uniquely justifiable? And does the form of contract that I defend generate the implications that I claim for it?
In a recent article Brian Leiter concluded that a useful normative theory of adjudication is impossible. A normative theory of adjudication would be a theory that, among other things, identified the moral and political norms that judges ought to follow in determining the law for any particular legal dispute. Leiter's elegant and subtle argument, stripped to its bones, runs as follows: Philosophers of law regard a correct normative theory of adjudication as being dependent upon an antecedent descriptive theory. The dependence here, as Leiter describes it, is of a very strong sort and unique among philosophical theories: Any normative theory, to be acceptable, cannot depart from the actual practice of judges and lawyers. Consequently, the content of the normative side of the theory is simply to “continue to do what you've been doing,” supplemented, perhaps, by Holmes's injunction to do it more self-consciously and explicitly.
I take as my text proposition 4.0312 of the Tractatus: The possibility of propositions is based on the principle that objects have signs as their representatives. My fundamental idea is that the ‘logical constants’ are not representatives; that there can be no representatives of the logic of facts. Practically the same words occur in Wittgenstein's Notebook for 25 December 1914, where Miss Anscombe translates them: The possibility of the proposition is, of course, founded on the principle of signs as going proxy for objects. Thus in the proposition something has something else as its proxy. But there is also the common cement. My fundamental thought is that the logical constants are not proxies. That the logic of the fact cannot have anything as its proxy.
In the preface to his book God the Problem, Gordon Kaufman writes ‘Although the notion of God as agent seems presupposed by most contemporary theologians … Austin Farrer has been almost alone in trying to specify carefully and consistently just what this might be understood to mean’.
In Reference and Consciousness, John Campbell attempts to make a case that what he calls ‘the Relational View’ of visual experience, a view that he champions, is superior to what he calls ‘the Representational View’. I argue that his attempt fails. In section 1, I spell out the two views. In section 2, I outline Campbell's case that the Relational View is superior to the Representational View and offer a diagnosis of where Campbell goes wrong. In section 3, I examine the case in detail and argue that it fails. Finally, in section 4, I mention two very well-known problems for the Relational View that are unresolved in the book.
Patrick O'Brian, the Aubrey-Maturin Series of twenty novels (Norton, 1970-1999). My appreciation written for WIRED magazine: "I re-read this extraordinary series of novels because of the depth of portrayal of the major and minor characters, but also because they teach me so much about what science and technology were like two centuries ago. O'Brian shows you the world-that-was through the eyes of a Tory naval captain (Jack Aubrey), at sea since the age of 12, working his way up to admiral, dealing with the height of 18th-century technology (sailing ships and celestial navigation). I identify more strongly with his liberally-educated, physician-scientist friend (Stephen Maturin), who went to medical school in Paris during the French Revolution. You see natural history turning into a biological science, bleeding-and-purging medicine starting to learn some physiology -- and, because Maturin is also an intelligence agent for the Admiralty, you see statecraft at work during the Napoleonic Wars. These books strongly remind you about what scientific ignorance and social conventions can do to your mindset, and how the future will likely judge us as well." -- William H. Calvin. You can get them all at once: The Complete Aubrey/Maturin Series (20 volumes). Depending on amazon.com's current discount, this works out to US$15-20 each (and in hardcover).
At first sight it would seem difficult to find two philosophers as different as Brian Barry and Richard Rorty. It is widely held that the former is one of the most forceful proponents of liberal universalism, whereas the latter is typically viewed as the quintessential relativist. In this essay, different usages of the term universalism are considered, and it is argued that Rorty's position is much closer to that of Barry than is generally supposed. Indeed, the article concludes by suggesting that it is Rorty who offers the less question-begging philosophical account of political liberalism.
Brian Rotman argues that (one) “mind” and (one) “god” are only conceivable, literally, because of (alphabetic) literacy, which allowed us to designate each of these ghosts as an incorporeal, speaker-independent “I” (or, in the case of infinity, a notional agent that goes on counting forever). I argue that to have a mind is to have the capacity to feel. No one can be sure which organisms feel, hence have minds, but it seems likely that one-celled organisms and plants do not, whereas animals do. So minds originated before humans and before language -- hence, a fortiori, before writing, whether alphabetic or ideographic.
Brian Leiter and Neil Sinhababu (eds), Nietzsche and Morality. Reviewed by Rainer Kattel (Tallinn University of Technology, Ehitajate tee 5, 19086 Tallinn, Estonia) in Ethical Theory and Moral Practice (Online ISSN 1572-8447, Print ISSN 1386-2820), DOI 10.1007/s10677-008-9134-6.