Children often refer to things ambiguously but learn not to by responding to clarification requests. We review and explore this learning process here. In Study 1, eighty-four 2- and 4-year-olds were tested for their ability to request stickers from either (a) a small array with one dissimilar distracter or (b) a large array containing similar distracters. When children made ambiguous requests, they received either general feedback or specific questions about which of two options they wanted. With training, children learned to produce more complex object descriptions and did so faster in the specific feedback condition. They also tended to provide more information when requesting stickers from large arrays. In Study 2, we varied only distracter similarity during training and then varied array size in a generalization test. Children found it harder to learn in this case. In the generalization test, 4-year-olds were more likely to provide information (a) when it was needed because distracters were similar to the target and (b) when the array size was greater (regardless of need for information). We discuss how clear cues to potential ambiguity are needed for children to learn to tailor their referring expressions to context and how several cues of heuristic value (e.g., more distracters > say more) can promote the efficiency of communication while language is developing. Finally, we consider whether it would be worthwhile drawing on the human learning process when developing algorithms for the production of referring expressions.
A wealth of human knowledge is acquired by attending to information provided by other people – but some people are more credible sources than others. In two experiments, we explored whether young children spontaneously keep track of an individual’s history of being accurate or inaccurate and use this information to facilitate subsequent learning. We found that 3- and 4-year-olds favor a previously accurate individual when learning new words and learning new object functions and applied the principle of mutual exclusivity to the newly learned words but not the newly learned functions. These findings expand upon previous research in a number of ways, most importantly by showing that (a) children spontaneously keep track of an individual’s history and use it to guide subsequent learning without any prompting, and (b) children’s sensitivity to others’ prior accuracy is not specific to the domain of language.
The communicative interactions of very young children almost always involve language (based on conventions), gesture (based on bodily deixis or iconicity) and directed gaze. In this study, ninety-six children (3;0 years) were asked to determine the location of a hidden toy by understanding a communicative act that contained none of these familiar means. A light-and-sound mechanism placed behind the hiding place and illuminated by a centrally placed switch was used to indicate the location of the toy. After a communicative training session, an experimenter pressed the switch either deliberately or accidentally, and with or without ostension (in the form of eye contact and child-directed speech). In no condition did she orient towards the hiding place. When the switch was pressed intentionally, children used the light-and-sound cue to find the toy - and tended to do so even in the absence of ostensive eye contact. When the experimenter pressed the switch accidentally, children searched randomly - demonstrating that they were tracking her communicative intent, and not merely choosing on the basis of salience. The absence of an effect of ostension contradicts research suggesting that ostension helps children to interpret the communicative intentions underlying unfamiliar signs. We explain this by concluding that while ostension may play a role in establishing a communicative interaction, it is not necessary for sustaining one; and that even with a highly novel communicative act - involving none of the means of communication on which children typically rely - three-year-olds can comprehend the communicative intentions behind an intentionally produced act.
In this rich and detailed study of early modern women's thought, Jacqueline Broad explores the complexity of women's responses to Cartesian philosophy and its intellectual legacy in England and Europe. She examines the work of thinkers such as Mary Astell, Elisabeth of Bohemia, Margaret Cavendish, Anne Conway and Damaris Masham, who were active participants in the intellectual life of their time and were also the respected colleagues of philosophers such as Descartes, Leibniz and Locke. She also illuminates the continuities between early modern women's thought and the anti-dualism of more recent feminist thinkers. The result is a more gender-balanced account of early modern thought than has hitherto been available. Broad's clear and accessible exploration of this still-unfamiliar area will have a strong appeal to both students and scholars in the history of philosophy, women's studies, and the history of ideas.
Recent findings suggest that infants are capable of distinguishing between different numbers of objects, and of performing simple arithmetical operations. But there is debate over whether these abilities result from capacities dedicated to numerical cognition, or whether infants succeed in such experiments through more general, non-numerical capacities, such as sensitivity to perceptual features or mechanisms of object tracking. We report here a study showing that 5-month-olds can determine the number of collective entities – moving groups of items – when non-numerical perceptual factors such as contour length, area, density, and others are strictly controlled. This suggests both that infants can represent number per se, and that their grasp of number is not limited to the domain of objects.
In search of salvation on the Stroganov estates -- Faith, family, and land after emancipation -- Youth : exemplars of rural socialism -- Elders : Christian ascetics in the Soviet countryside -- New risks and inequalities in the household sector -- Which khoziain? whose moral community? -- Society, culture, and the churching of Sepych -- Separating post-Soviet worlds? : priestly baptisms and priestless funerals.
The fundamental constants involved in the laws of physics that describe our universe are finely tuned for life, in the sense that if some of the constants had slightly different values, life could not exist. Some people hold that this provides evidence for the existence of God. I will present a probabilistic version of this fine-tuning argument which is stronger than all other versions in the literature. Nevertheless, I will show that one can have reasonable opinions such that the fine-tuning argument doesn't lead to an increase in one's probability for the existence of God. Section headings: The fine-tuning argument; Objective versus subjective probability; Observational selection effects; The problem of old evidence; Against the fine-tuning argument; Many universes.
A new method was devised to test object permanence in young infants. Five-month-old infants were habituated to a screen that moved back and forth through a 180-degree arc, in the manner of a drawbridge. After infants reached habituation, a box was centered behind the screen. Infants were shown two test events: a possible event and an impossible event. In the possible event, the screen stopped when it reached the occluded box; in the impossible event, the screen moved through the space occupied by the box. The results indicated that infants looked reliably longer at the impossible than at the possible event. This finding suggested that infants (1) understood that the box continued to exist, in its same location, after it was occluded by the screen, and (2) expected the screen to stop against the occluded box and were surprised, or puzzled, when it failed to do so. A control experiment in which the box was placed next to the screen provided support for this interpretation of the results. Together, the results of these experiments indicate that, contrary to Piaget’s (1954) claims, infants as young as 5 months of age understand that objects continue to exist when occluded. The results also indicate that 5-month-old infants realize that solid objects do not move through the space occupied by other solid objects.
In 1955, Goodman set out to 'dissolve' the problem of induction, that is, to argue that the old problem of induction is a mere pseudoproblem not worthy of serious philosophical attention. I will argue that, under naturalistic views of the reflective equilibrium method, it cannot provide a basis for a dissolution of the problem of induction. This is because naturalized reflective equilibrium is -- in a way to be explained -- itself an inductive method, and thus renders Goodman's dissolution viciously circular. This paper, then, examines how the old problem of induction crept back in while nobody was looking.
The quantitative problem of old evidence is the problem of how to measure the degree to which e confirms h for agent A at time t when A regards e as justified at t. Existing attempts to solve this problem have applied the e-difference approach, which compares A's probability for h at t with what probability A would assign h if A did not regard e as justified at t. The quantitative problem has been widely regarded as unsolvable primarily on the grounds that the e-difference approach suffers from intractable problems. Various philosophers have proposed that 'Bayesianism' should be rejected as a research strategy in confirmation theory in part because of the unsolvability of this problem. I develop a version of the e-difference approach which overcomes these problems and possesses various advantages (but also certain limitations). I develop an alternative 'theistic' approach which handles many cases that my development of the e-difference approach does not handle. I conclude with an assessment of the significance of the quantitative problem for Bayesianism and argue that this problem is misunderstood in so far as it is regarded as unsolvable, and in so far as it is regarded as a problem only for Bayesians.
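The e-difference idea can be put in a compact form. In notation not taken from the abstract itself (write P for A's credence function at t, and P⁻ for the counterfactual credence function A would have at t if A did not regard e as justified), the approach measures confirmation roughly as:

```latex
\mathrm{conf}_{A,t}(e, h) \;=\; P(h) \;-\; P^{-}(h)
```

so that e confirms h for A at t just in case the difference is positive. This is only a sketch of the general approach; the paper's own development adds refinements that this one-line formula does not capture.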
Hume has traditionally been understood as an inductive sceptic with positivist tendencies, reducing causation to regular succession and anticipating the modern distinctions between analytic and synthetic, deduction and induction. The dominant fashion in recent Hume scholarship is to reject all this, replacing the ‘Old Hume’ with various New alternatives. Here I aim to counter four of these revisionist readings, presenting instead a broadly traditional interpretation but with important nuances, based especially on Hume’s later works. He asked that we should treat these, notably the first Enquiry, as his authoritative philosophical statements, and with good reason.
In Bayesian epistemology, the concept of one proposition’s being evidence for another is explained along the following lines. Given a measure of degrees of confidence, con(...), that conforms to standard probability axioms: (EV) a proposition e is evidence for a proposition h iff con(h|e) is greater than con(h). (Con(h|e) is the degree of confidence in h given e, and is defined as con(h and e)/con(e).) Proposals along these lines, however, have been dogged by what Clark Glymour called the Problem of Old Evidence. (EV) apparently precludes a theory being confirmed by evidence that is already in. For if a potentially evidential proposition, e, is already known, then con(e)=1. One can be subjectively certain of propositions already known to be true. But by definition of con(h|e), where con(e)=1, con(h|e) will always be equal to, and hence never greater than, con(h). And (EV) does more than preclude one from confirming new theories on the basis of information already gathered. Suppose Q is some proposition of which we are now uncertain, but which is evidence for a scientific hypothesis P. That is, con(P|Q) is greater than con(P). If we now devise an experiment to test whether Q, perform the experiment, and become certain that Q, it will no longer count as evidence for P. Thus, if we accept (EV), gathering new evidence to support a theory actually has quite the opposite effect. Gathering the evidence destroys its quality as evidence.
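The collapse described in this passage can be made fully explicit using only the definition of conditional confidence given there. Whenever con(e) = 1:

```latex
con(h \mid e) \;=\; \frac{con(h \wedge e)}{con(e)} \;=\; \frac{con(h)}{1} \;=\; con(h)
```

The middle step holds because con(h) = con(h ∧ e) + con(h ∧ ¬e), and con(h ∧ ¬e) ≤ con(¬e) = 1 − con(e) = 0, so con(h ∧ e) = con(h). Hence con(h|e) can never exceed con(h) for known evidence, which is exactly the problem the passage describes.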
Scientific realism is a doctrine that was both in and out of fashion several times during the twentieth century. I begin by noting three presuppositions of a succinct characterization of scientific realism offered initially by the foremost critic in the latter part of the century, Bas van Fraassen. The first presupposition is that there is a fundamental distinction to be made between what is “empirical” and what is “theoretical”. The second presupposition is that a genuine scientific realism is committed to there being “a literally true story of what the world is like”. The third presupposition is that there are methods for justifying a belief in the empirical adequacy of a theory which do not also suffice to justify beliefs in its literal truth. Each of these presuppositions raises a number of problems, some of which are quite old and others rather newer. In each case, I briefly review some of the old problems and then elaborate the newer problems.
In this essay I sketch a philosophical argument for classical liberalism based on the requirements of public reason. I argue that we can develop a philosophical liberalism that, unlike so much recent philosophy, takes existing social facts and mores seriously while, at the same time, retaining the critical edge characteristic of the liberal tradition. I argue that once we develop such an account, we are led toward a vindication of “old” (qua classical) liberal morality—what Benjamin Constant called the “liberties of the moderns.” A core thesis of the paper is that a regime of individual rights is crucial to the project of public justification because it disperses moral authority to individuals, thus mitigating what I call the “burdens of justification.” Earlier versions of this essay were presented at the University of North Carolina, Chapel Hill, Philosophy Department workshop on the morality of capitalism, and at the conference on rights theory at the Murphy Institute, Tulane University. I am grateful for the comments of the participants; my special thanks to David Schmidtz, Julian Lamont, and Andrea Houchard for their useful written comments and suggestions.
This paper provides preliminary insights into the process of sense-making and developing meaning with regard to corporate social responsibility (CSR) within 18 Dutch companies. It is based upon a research project carried out within the framework of the Dutch National Research Programme on CSR. The paper questions how change agents promoting CSR within these companies made sense of the meaning of CSR. How did they use language (and other instruments) to stimulate and underpin the contextual essence of CSR? Why did they do that in this particular way? What were the consequences of this approach for shaping the process of CSR in their company? Did their efforts contribute to a new way of thinking and acting or was it merely putting old wine in new barrels? A preliminary conclusion is that change agents use above all linguistic artefacts (words and notions) and carry out practical projects while constructing meaning. Still, the meaning of meaning itself remains highly intangible, situational and personality related.
Symbols should be grounded, as has been argued before. But we insist that they should be grounded not only in subsymbolic activities, but also in the interaction between the agent and the world. The point is that concepts are not formed in isolation (from the world), in abstraction, or "objectively." They are formed in relation to the experience of agents, through their perceptual/motor apparatuses, in their world and linked to their goals and actions. This paper takes a detailed look at this relatively old issue, with a new perspective, aided by our work of computational cognitive model development. To further our understanding, we also go back in time to link up with earlier philosophical theories related to this issue. The result is an account that extends from computational mechanisms to philosophical abstractions.
In the late nineteenth century there were two very active lines of research in the field of formal logic. First, logicians (mostly in English-speaking countries) were engaged in formulating a generally traditional logic as an algebra, a part of mathematics; second, logicians (mostly on the continent) were busy building a non-traditional logic that could serve, not as a part of, but as the foundation of, mathematics. By the end of the First World War the former line had been pretty well abandoned while the second continued to expand. However, that old abandoned line, stretching from Aristotle, through the Scholastics and then Leibniz to the nineteenth century algebraists, had not been completely forgotten. One of those logicians who has recently worked on the restoration (and, importantly, the extension) of that line is Fred Sommers. His Term Logic preserves a number of traditional insights (especially involving the theory of logical syntax), while also enjoying a power to account for formal inference at least comparable to that of the standard logic now in place.
Collingwood has failed to make a significant impact in the history of twentieth century philosophy either because he has been dismissed as a dusty old idealist committed to the very metaphysics the analytical school was trying to leave behind, or because his later work has been interpreted as advocating the dissolution of philosophy into history. I argue that Collingwood's key philosophical works are a sustained attempt to defend the view that philosophy is an autonomous discipline with a distinctive domain of inquiry and that Collingwood's attempt to defend the autonomy of philosophy is intimately connected to his defence of intensional notions against the kind of meaning scepticism which came to prevail from the 1920s. I defend the philosophical claim that there is a third way between the idealist metaphysics with which Collingwood is often associated and the neo-empiricist agenda which characterised analytic philosophy in mid-century by defending the hermeneutic thesis that Collingwood's work is a sustained attempt to articulate a conception of philosophy as an epistemologically first science. Since there is a via media between the old metaphysics and the new empiricism there is no need to choose between a certain kind of armchair metaphysics and a scientifically informed ontology.
This article accepts the proposition that old people want to be treated with dignity and that statements about dignity point to ethical duties that, if not independent of rights, at least enhance rights in ethically important ways. In contexts of policy and law, dignity can certainly have a substantive as well as rhetorical function. However, the article questions whether the concept of dignity can provide practical guidance for choosing among alternative approaches to the care of old people. The article explores the paradoxical relationship between the apparent lack of specific content in many conceptions of dignity and the broad utility that dignity appears to have as a concept expressive of shared social understandings about the status of old people.
The split in our thinking between "masculine" and "feminine" is probably as old as language itself. Human beings seem to have a natural tendency to divide things into pairs: good/bad, light/dark, subject/object and so on. It is not surprising, then, that the male/female or masculine/feminine dichotomy is used to classify things other than men and women. Many languages actually classify all nouns as "masculine" or "feminine" (although not very consistently: for example, the Spanish masculine noun pollo means "chicken", while the feminine polla is slang for "penis"). This is perfectly natural; it is part of the way categorisation works in language. This does not, however, mean that it is right. It is probably unimportant whether a table or a chair is thought of as masculine or feminine. It may not even be very important these days whether we think of the sun as male and the moon as female (like the ancient Greeks) or vice versa (like most of the Germanic tribes). However, when we start associating abstract concepts like Reason or Nature with men and women, we run into serious difficulties.
The apparent tension between the moral codes of the Old and New Testaments constitutes a perennial problem for Christian ethics. Scholars who have taken this problem seriously have often done so in ways that presume sharp discontinuity between the Testaments. They then proceed to devise a system for identifying what is or is not relevant today, or what pertains to this or that particular social sphere. John Howard Yoder brings fresh perspectives to this perennial problem by refuting the presumption of intratestamental discontinuity. Throughout multiple scattered works on the Old Testament, Yoder offers a coherent and provocative narration that culminates in the way of Christ and establishes the ethical continuity of the entire biblical canon. This essay presents the basic parameters of Yoder's Old Testament narration, suggests points where revision is needed, and highlights several implications for social ethics.
In Bayes or Bust? John Earman quickly dismisses a possible resolution (or avoidance) of the problem of old evidence. In this note, I argue that his dismissal is premature, and that the proposed resolution (when charitably reconstructed) is reasonable.
This article considers the question of whether it is meaningful to speak of privacy rights in public spaces, and the possibility of such rights framing the basis for regulating or restricting the use of surveillance technologies such as closed circuit television (CCTV). In particular, it responds to a recent article by Jesper Ryberg that suggests that there is little difference between being watched by private individuals and CCTV cameras, and instead argues that state surveillance is qualitatively different from (and more problematic than) surveillance by ‘lonely old ladies’.
David Lewis (1980) proposed the Principal Principle (PP) and a “reformulation” which later on he called ‘OP’ (Old Principle). Reacting to his belief that these principles run into trouble, Lewis (1994) concluded that they should be replaced with the New Principle (NP). This conclusion left Lewis uneasy, because he thought that an inverse form of NP is “quite messy”, whereas an inverse form of OP, namely the simple and intuitive PP, is “the key to our concept of chance”. I argue that, even if OP should be discarded, PP need not be. Moreover, far from being messy, an inverse form of NP is a simple and intuitive Conditional Principle (CP). Finally, both PP and CP are special cases of a General Principle (GP); it follows that so are PP and NP, which are thus compatible rather than competing.
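For readers who want the first of these principles on the page: in Lewis's familiar formulation (with C a reasonable initial credence function, X the proposition that the chance at time t of A's holding equals x, and E any evidence admissible at t), the Principal Principle says:

```latex
C(A \mid X \wedge E) \;=\; x
```

This is the standard textbook statement of PP, not a quotation from the paper; the paper's own discussion concerns how PP, OP, NP and their inverse forms relate to one another.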
It is argued that the disciplinary identity of anatomy and physiology before 1800 is unknown to us due to the subsequent creation, success and historiographical dominance of a different discipline: experimental physiology. The first of these two papers deals with the identity of physiology from its revival in the 1530s, and demonstrates that it was a theoretical, not an experimental, discipline, achieved with the mind and the pen, not the hand and the knife. The physiological work of Jean Fernel, Albrecht von Haller and others is explored to prove this point. In conclusion this old physiological tradition is compared to the new experimental physiology, as practised by Francois Magendie and Pierre Flourens.
In this first issue of the new Erkenntnis, it seems fitting to recall at least briefly the character and the main achievements of its distinguished namesake and predecessor. The old Erkenntnis came into existence when Hans Reichenbach and Rudolf Carnap assumed the editorship of the Annalen der Philosophie and gave the journal its new title and its characteristic orientation; the first issue appeared in 1930. The journal was backed by the Gesellschaft für Empirische Philosophie in Berlin, in which Reichenbach, Walter Dubislav, and Kurt Grelling were the leading figures, and by the Verein Ernst Mach in Vienna, whose philosophical position was strongly influenced by that of the Vienna Circle; a brief account of these groups, and of several kindred schools and trends of scientific and philosophical thinking, was given by Otto Neurath in his 'Historische Anmerkungen' (Vol. 1, pp. 311-314). As Reichenbach noted in his introduction to the first issue, the editors of Erkenntnis were concerned to carry on philosophical inquiry in close consideration of the procedures and results of the various scientific disciplines: analysis of scientific research and its presuppositions was expected to yield insight into the character of all human knowledge, while at the same time, the objectivity and the progressive character of science inspired the conviction that philosophy need not remain an array of conflicting 'systems', but could attain to the status of objective knowledge. As a student in Berlin and Vienna during those years, I experienced vividly the exhilarating sense, shared by those close to those two philosophical groups, of being jointly engaged in a novel and challenging intellectual enterprise in which philosophical issues were dealt with 'scientifically' and philosophical claims were amenable to support or criticism by logically rigorous arguments.
The 'logical analyses' and 'rational reconstructions' set forth by adherents of this program often made extensive use of the concepts, methods, and symbolic apparatus of contemporary symbolic logic, whose importance for philosophy was the subject of Carnap's article, 'Die Alte und die Neue Logik', which appeared in the first issue.
In two experiments, we investigated whether 13-month-old infants expect agents to behave in a way consistent with information to which they have been exposed. Infants watched animations in which an animal was either provided information or prevented from gathering information about the actual location of an object. The animal then searched successfully or failed to retrieve it. Infants’ looking times suggest that they expected searches to be effective when—and only when—the agent had had access to the relevant information. This result supports the view that infants possess an incipient metarepresentational ability that permits them to attribute beliefs to agents. We discuss the viability of more conservative explanations and the relationship between this early ability and later forms of ‘theory of mind’ that appear only after children have become experienced verbal communicators.
This paper critically examines the argument structure of Fodor's theory of modularity. Fodor claims computational autonomy as the essential property of modular processing. This property has profound consequences, burdening modularity theory with corollaries of rigidity, non-plasticity, nativism, and the old Cartesian dualism of sensing and thinking. However, it is argued that Fodor's argument for computational autonomy is crucially dependent on yet another postulate of Fodor's theory, viz. his thesis of strong modularity, i.e. the view that functionally distinct modules must also have physical counterparts in the neural architecture of the brain. Yet, Fodor offers little or no independent support for this neurological speculation. Moreover, due to the cognitivist underpinnings of Fodor's theory his view of modules as 'mental organs' faces an untenable dilemma that is to be traced back to the earliest history of modern cognitive science, viz. to the rationalist-computationalist research program initiated by Descartes and Malebranche. The tension characteristic for the Cartesian program was one that arose between information correlation and information processing accounts of the transactions between body and mind. Similarly, the tension characteristic for Fodor's theory of modularity is one between a causal account of modules on the model of simple detection mechanisms, and an information processing account of modules on the model of vast and elaborate cognitive systems. It is argued that the resulting concept of a cognitive module Fodorian style constitutes an amalgam of incompatible desiderata that fails to stake out a natural kind for cognitive science.
As an alternative account, the final section shows connectionism to be capable of encompassing both Gibsonian and 'new look' accounts of cognitive achievements within one theoretical perspective, thus providing a fruitful interfield theory capable of combining the theoretical resources of the ecological approach with the indispensable theoretical complement provided by psychological processing accounts. This change of perspective would ultimately involve recasting the symbol-functionalist notion of cognitive function along bio-psychological lines.
Developmental research suggests that some of the mechanisms that underlie numerical cognition are present and functional in human infancy. To investigate these mechanisms and their developmental course, psychologists have turned to behavioral and electrophysiological methods using briefly presented displays. These methods, however, depend on the assumption that young infants can extract numerical information rapidly. Here we test this assumption and begin to investigate the speed of numerical processing in five-month-old infants. Infants successfully discriminated between arrays of 4 vs. 8 dots on the basis of number when a new array appeared every 2 s, but not when a new array appeared every 1.0 or 1.5 s. These results suggest alternative interpretations of past findings, provide constraints on the design of future experiments, and introduce a new method for probing infants’ enumeration process. Further experiments using this method provide initial evidence that infants’ enumeration mechanism operates in parallel and yields increasingly accurate numerical representations over time, as does the enumeration mechanism used by adults in symbolic and non-symbolic tasks.
Bai, Tongdong 白彤東, New Mission of an Old State: Classical Confucian Political Philosophy in a Contemporary and Comparative Context 舊邦新命: 古今中西參考下的古典儒家政治哲學. Journal article by Ellen Y. Zhang, Department of Religion and Philosophy, Hong Kong Baptist University, Kowloon Tong, Kowloon, Hong Kong. Dao, Volume 9, Number 4. DOI: 10.1007/s11712-010-9183-0. Online ISSN 1569-7274; Print ISSN 1540-3009.
The Heirs of Plato is the first book exclusively devoted to an in-depth study of the various directions in philosophy taken by Plato's followers in the first seventy years or so following his death in 347 BC--the period generally known as 'The Old Academy'. Speusippus, Xenocrates, and Polemon, the three successive heads of the Academy in this period, though personally devoted to the memory of Plato, were independent philosophers in their own right, and felt free to develop his heritage in individual directions. Dillon's clear and accessible book fills a significant gap in our understanding of Plato's immediate philosophical influence, and will be of great value to scholars and historians of ancient philosophy.
Critical examination of chapter 5 of Julia Annas' book _Platonic Ethics Old and New._ I first argue that she does not establish that Plato's ethics are independent of his metaphysics. I then suggest several ways in which the content of his ethics does depend on his metaphysics, with special attention paid to the discussion of the impact of theology on ethics in the _Laws_.
It is conventional to distinguish between an old liberalism, with a robust conception of private property and a limited role for government in the economy, and a new liberalism that permits government to override individual property rights in the pursuit of the general welfare. The New Deal is often taken to mark the dividing line between these two forms of liberal governance. But when we focus on property rights through the magnifying lens of Takings Clause jurisprudence, we find that the (...) movement away from strong property rights begins not with the New Deal but in the late 19th century, at what is normally taken to be the peak of constitutionally protected private property. The much-criticized decision in Kelo v. New London (2005) represents, not a break with past doctrine, but rather its logical consequence. Protecting individual property-holders against expansive state powers of eminent domain runs into a structural conundrum: while categorical restraints on state power limit government's ability to promote important public purposes, an explicitly purposive approach renders all limits on government power (including individual rights) vulnerable to an aggregative calculus. The most plausible response is a two-tier approach: respect for legally established categories in ordinary circumstances, regardless of their aggregate consequences, and consequentialism in circumstances of emergency, when the lives or basic wellbeing of citizens are at stake. Judged against this template, the consequentialism guiding modern takings clause jurisprudence in ordinary, non-emergency circumstances is hard to justify. (shrink)
New medical technologies provide us with new possibilities in health care and health care research. Depending on their degree of novelty, they may as well present us with a whole range of unforeseen normative challenges. Partly, this is due to a lack of appropriate norms to perceive and handle new technologies. This article investigates our ways of establishing such norms. We argue that in this respect analogies have at least two normative functions: they inform both our understanding and our conduct. (...) Furthermore, as these functions are intertwined and can blur moral debates, a functional investigation of analogies can be a fruitful part of ethical analysis. We argue that although analogies can be conservative, because they bring old concepts to bear upon new ones, there are at least three ways in which they can be creative. First, understandings of new technologies are quite different from the analogies that established them, and come to be analogies themselves. That is, the concepts may turn out to be quite different from the analogies that established them. Second, analogies transpose similarities from one area into another, where they previously had no bearing. Third, analogies tend to have a figurative function, bringing in something new and different from the content of the analogies. We use research-biobanking as a practical example in our investigations. (shrink)
There is a striking difference between the methodology of the young Einstein and that of the old. I argue that Einstein’s switch in the late 1910s from a moderate empiricism to an extreme rationalism should at least in part be understood against the background of his crushing personal and political experiences during the war years in Berlin. As a result of these experiences, Einstein started to put into practice what, drawing on Schopenhauer, he had preached for years, namely to use (...) science as his means of escaping from “the merely personal.” Whatever the exact sources of Einstein’s about-face, the older man has left us with a highly misleading picture of how the younger man achieved the successes that we still celebrate today. This has had a harmful influence on theoretical physics. If the young Einstein’s successes are any guide as to how successful theoretical physics is done, close adherence to general features of the empirical data is much more and mathematical elegance is much less important than the old Einstein wanted us to believe. (shrink)
In Doing What Comes Naturally, Stanley Fish argues on behalf of rhetoric and against philosophy. The latter assumes an independent reality that can be perceived without distortion and then reported in a transparent verbal medium. The former insists that this is impossible. As Fish acknowledges, this debate is a version of the “old quarrel” that has raged since the dialogues of Plato and the orations of the sophists. The present paper first examines how the Greek sophist Isocrates (...) actually formulated the terms of the debate. Then it turns to Plato in order to demonstrate that his treatment of the old quarrel is superior to Fish's postmodern update. (shrink)
This paper offers an answer to Glymour's ‘old evidence’ problem for Bayesian confirmation theory, and assesses some of the objections, in particular those recently aired by Chihara, that have been brought against that answer. The paper argues that these objections are easily dissolved, and goes on to show how the answer it proposes yields an intuitively satisfactory analysis of a problem recently discussed by Maher. Garber's, Niiniluoto's and others’ quite different answer to Glymour's problem is considered and rejected, and the (...) paper concludes with some brief reflections on the prediction/accommodation issue. (shrink)
The old idiot wanted, by himself, to account for what was lost or saved; but the new idiot wants the lost, the incomprehensible, and the absurd to be restored to him. This is most certainly not the same persona; a mutation has taken place. And yet a slender thread links the two idiots, as if the first had to lose reason so that the second rediscovers what the other, in winning it, had lost in advance.
Using explicit memory measures, Cowan predicts a new circumstance in which the central capacity limit of 4 chunks should obtain. Supporting results for such an experiment, using continuous old-new recognition, are described. With implicit memory measures, Cowan assumes that short-term repetition priming reflects the central capacity limit. I argue that this phenomenon instead reflects limits within individual perceptual processing modules.
Human cooperation is a key driving force behind the evolutionary success of our hominin lineage. At the proximate level, biologists and social scientists have identified other-regarding preferences – such as fairness based on egalitarian motives, and altruism – as likely candidates for fostering large-scale cooperation. A critical question concerns the ontogenetic origins of these constituents of cooperative behavior, as well as whether they emerge independently or in an interrelated fashion. The answer to this question will shed light on the interdisciplinary (...) debate regarding the significance of such preferences for explaining how humans become such cooperative beings. We investigated 15-month-old infants' sensitivity to fairness, and their altruistic behavior, assessed via infants' reactions to a third-party resource distribution task, and via a sharing task. Our results challenge current models of the development of fairness and altruism in two ways. First, in contrast to past work suggesting that fairness and altruism may not emerge until early to mid-childhood, 15-month-old infants are sensitive to fairness and can engage in altruistic sharing. Second, infants' degree of sensitivity to fairness as a third-party observer was related to whether they shared toys altruistically or selfishly, indicating that moral evaluations and prosocial behavior are heavily interconnected from early in development. Our results present the first evidence that the roots of a basic sense of fairness and altruism can be found in infancy, and that these other-regarding preferences develop in a parallel and interwoven fashion. These findings support arguments for an evolutionary basis – most likely in a dialectical manner including both biological and cultural mechanisms – of human egalitarianism, given the rapidly developing nature of other-regarding preferences and their role in the evolution of human-specific forms of cooperation.
Future work of this kind will help determine to what extent uniquely human sociality and morality depend on other-regarding preferences emerging early in life. (shrink)
Maxwell claimed that the electrostatic inverse square law could be deduced from Cavendish's spherical condenser experiment. This is true only if the accuracy claims made by Cavendish and Maxwell are ignored, for both used the inverse square law as a premise in their analyses of experimental accuracy. By so doing, they assumed the very law the accuracy of which the Cavendish experiment was supposed to test. This paper attempts to make rational sense of this apparently circular procedure and to relate (...) it to some variants of traditional problems concerning old and new evidence. (shrink)
INTERNATIONAL STUDIES IN THE PHILOSOPHY OF SCIENCE Vol. 10, number 2, 1996, pp. 127-140. R.M. Nugayev. Why did the new physics force out the old? Abstract. The aim of my paper is to demonstrate that special relativity and the early quantum theory were created within the same programme of statistical mechanics, thermodynamics and Maxwellian electrodynamics reconciliation. I’ll try to explain why classical mechanics and classical electrodynamics were “refuted” almost simultaneously or, in other words, why the quantum revolution and the (...) relativistic one both took place at the beginning of the 20th century. I’ll argue that the quantum and relativistic revolutions were simultaneous since they had a common origin – the clash between the mature theories of the second half of the 19th century that constituted the “body” of classical physics. The revolution’s most dramatic point was Einstein’s 1905 photon paper that laid the foundations of both special relativity and the old quantum theory. Hence the dialectic of the old theories is crucial for theory change. Later, classical physics was forced out by the joint development of quantum and relativistic subprogrammes. The title of my paper can be reformulated in Bruno Latour’s terms: The Einstein Revolution or Drawing Models Together. (shrink)
Perruchet & Vinter (P&V) ground their arguments in a view they call “the mentalistic tradition.” Here I point out that such a view has already been advocated by two old masters of psychological science, William James and James Gibson, as well as by the philosopher Merleau-Ponty. In fact, in the writings of these older thinkers, arguments very similar to those presented in the target article are found.
Nietzsche published for the public only the first three parts of Thus Spoke Zarathustra. This paper, in examining the “tragic wisdom” of that work, gives an account of why Nietzsche did not want his public to read Part IV. It shows the evolution in Nietzsche’s thought about tragic wisdom: beginning with The Birth of Tragedy, where satyric laughter is central to the wisdom of ancient Greek tragedy; through Parts I-III of Thus Spoke Zarathustra, where the significance of its major idea, (...) eternal recurrence, is the joy occasioned by experiencing that theory; to finally Part IV, where the pathos engendered by Zarathustra, who has aged to an ugly, old fool, is the sarcastic laughter that kills. (shrink)
There has been much speculation among intellectuals and philosophers about the qualitative changes in our habits of communication that have come with electronic technology - so much so that we have perhaps neglected the most obvious quantitative effect: without any doubt, human beings have never been obliged to communicate as frequently as is the case in our electronic present - with the unsurprising and well known consequence that we constantly feel "behind" in our electronic obligations to communicate. From a (pseudo-) (...) ethical point of view, the even more oppressive flip side of this phenomenon is one's need to be constantly "available," the result of which we all know: seminar discussions, religious services, or moments of erotic delight interrupted by ringing cellphones or by a constant anxiety that one needs to check one's e-mail. The main interest of this essay is to explore the existential consequences of this new - and enslaving - law of "universal availability." But this entire polemic is accompanied by the author's concession that his own - very subjective - reaction to electronic communication may well be the (legitimate) reaction of old age. (shrink)
Much research on cognitive development focuses either on early-emerging domain-specific knowledge or domain-general learning mechanisms. However, little research examines how these sources of knowledge interact. Previous research suggests that young infants can make inferences from samples to populations (Xu & Garcia, 2008) and 11- to 12.5-month-old infants can integrate psychological and physical knowledge in probabilistic reasoning (Teglas, Girotto, Gonzalez, & Bonatti, 2007; Xu & Denison, 2009). Here, we ask whether infants can integrate a physical constraint of immobility into a statistical (...) inference mechanism. Results from three experiments suggest that, first, infants were able to use domain-specific knowledge to override statistical information, reasoning that sometimes a physical constraint is more informative than probabilistic information. Second, we provide the first evidence that infants are capable of applying domain-specific knowledge in probabilistic reasoning by using a physical constraint to exclude one set of objects while computing probabilities over the remaining sets. (shrink)
The old quantum theory of black body radiation was manifestly logically inconsistent. It required the energies of electric resonators to be both quantized and continuous. To show that this manifest inconsistency was inessential to the theory's recovery of the Planck distribution law, I extract a subtheory free of this manifest inconsistency but from which Planck's law still follows.
Does (affirmative) judgement have a logical dual, negative judgement? Whether there is such a logical dualism was hotly debated at the beginning of the twentieth century. Frege argued in “Negation” (1918/9) that logic can dispense with negative judgement. Frege's arguments shaped the views of later generations of analytic philosophers, but they will not have convinced such opponents as Brentano or Windelband. These philosophers believed in negative judgement for psychological, not logical, reasons. Reinach's “On the Theory of Negative Judgement” (1911) spoke (...) to the concerns of these philosophers. While Frege took the distinction between affirmative and negative judgement to be logically redundant, Reinach argued that it is the result of confusing judgement with a different mental act. In this article, I present Reinach's arguments against the “old logical dualism” in context, analyse them and discuss Reinach's innovative use of the notion of focus in the theory of judgement. Recently, there has been a revival of the view that sentential negation is grounded in a prior mental act of rejection. In the final section, I argue that Reinach's analysis of rejection poses a challenge for the revivalists. (shrink)