Max Weber (1864-1920), generally known as a founder of modern social science, was concerned with political affairs throughout his life. The texts in this edition span his career and include his early inaugural lecture The Nation State and Economic Policy, Suffrage and Democracy in Germany, Parliament and Government in Germany under a New Political Order, Socialism, The Profession and Vocation of Politics, and an excerpt from his essay The Situation of Constitutional Democracy in Russia, as well as other shorter writings. Together they illustrate the development of his thinking on the fate of Germany and the nature of politics in the modern western state in an age of cultural 'disenchantment'. The introduction discusses the central themes of Weber's political thought, and a chronology, notes and an annotated bibliography place him in his political and intellectual context.
This article examines the role of experimental generalizations and physical laws in neuroscientific explanations, using Hodgkin and Huxley’s electrophysiological model from 1952 as a test case. I show that the fact that the model was partly fitted to experimental data did not affect its explanatory status, nor did the false mechanistic assumptions made by Hodgkin and Huxley. The model satisfies two important criteria of explanatory status: it contains invariant generalizations and it is modular (both in James Woodward’s sense). Further, I argue that there is a sense in which the explanatory heteronomy thesis holds true for this case.
While humanists have pondered the subject of love to the point of obsessiveness, philosophers have steadfastly ignored it. One might wonder whether the discipline of philosophy even recognizes love. The word philosophy means “love of wisdom,” but the absence of love from philosophical discourse is curiously glaring. So where did the love go? In The Erotic Phenomenon, Jean-Luc Marion asks this fundamental question of philosophy, while reviving inquiry into the concept of love itself. Marion begins his profound and personal book with a critique of Descartes’ equation of the ego’s ability to doubt with the certainty that one exists—“I think, therefore I am”—arguing that this is worse than vain. We encounter being, he says, when we first experience love: I am loved, therefore I am; and this love is the reason I care whether I exist or not. This philosophical base allows Marion to probe several manifestations of love and its variations, including carnal excitement, self-hate, lying and perversion, fidelity, the generation of children, and the love of God. Throughout, Marion stresses that all erotic phenomena, including sentimentality, pornography, and even boasts about one’s sexual conquests, stem not from the ego as popularly understood but instead from love. A thoroughly enlightening and captivating philosophical investigation of a strangely neglected subject, The Erotic Phenomenon is certain to initiate feverish new dialogue about the philosophical meanings of that most desirable and mysterious of all concepts—love.
In the third book in the trilogy that includes Reduction and Givenness and Being Given, Marion renews his argument for a phenomenology of givenness, with penetrating analyses of the phenomena of event, idol, flesh, and icon. Turning explicitly to hermeneutical dimensions of the debate, Marion masterfully draws together issues emerging from his close reading of Descartes and Pascal, Husserl and Heidegger, Levinas and Henry. Concluding with a revised version of his response to Derrida, In the Name: How to Avoid Speaking of It, Marion powerfully re-articulates the theological possibilities of phenomenology.
Jean-Luc Marion advances a controversial argument for a God free of all categories of Being. Taking a characteristically postmodern stance, Marion challenges a fundamental premise of both metaphysics and neo-Thomist theology: that God, before all else, must be. Rather, he locates a "God without Being" in the realm of agape, of Christian charity or love. This volume, the first translation into English of the work of this leading Catholic philosopher, offers a contemporary perspective on the nature of God. "An immensely thoughtful book. . . . It promises a rich harvest. Marion's highly original treatment of the idol and the icon, the Eucharist, boredom and vanity, conversion and prayer takes theological and philosophical discussions to a new level."--Norman Wirzba, Christian Century.
Does Descartes belong to metaphysics? What do we mean when we say "metaphysics"? These questions form the point of departure for Jean-Luc Marion's groundbreaking study of Cartesian thought. Analyses of Descartes' notion of the ego and his idea of God show that if Descartes represents the fullest example of metaphysics, he no less transgresses its limits. Writing as philosopher and historian of philosophy, Marion uses Heidegger's concept of metaphysics to interpret the Cartesian corpus--an interpretation strangely omitted from Heidegger's own history of philosophy. This interpretation complicates and deepens the Heideggerian concept of metaphysics, a concept that has dominated twentieth-century philosophy. Examinations of Descartes' predecessors (Aristotle, Augustine, Aquinas, and Suarez) and his successors (Leibniz, Spinoza, and Hegel) clarify the meaning of the Cartesian revolution in philosophy. Expertly translated by Jeffrey Kosky, this work will appeal to historians of philosophy, students of religion, and anyone interested in the genealogy of contemporary thought and its contradictions.
This pioneering book demonstrates the crucial importance of Wittgenstein's philosophy of mathematics to his philosophy as a whole. Marion traces the development of Wittgenstein's thinking in the context of the mathematical and philosophical work of the times, to make coherent sense of ideas that have too often been misunderstood because they have been presented in a disjointed and incomplete way. In particular, he illuminates the work of the neglected 'transitional period' between the Tractatus and the Investigations.
Marked sharply by its time and place (Paris in the 1970s), this early theological text by Jean-Luc Marion nevertheless maintains a strikingly deep resonance with his most recent, groundbreaking, and ever more widely discussed phenomenology. And while Marion will want to insist on a clear distinction between the theological and phenomenological projects, to read each in light of the other can prove illuminating for both the theological and the philosophical reader - and perhaps above all for the reader who wants to read in both directions at once, the reader concerned with those points of interplay and undecidability where theology and philosophy inform, provoke, and challenge one another in endlessly complex ways. In both his theological and his phenomenological projects, Marion's central effort to free the absolute or unconditional (be it theology's God or phenomenology's phenomenon) from the various limits and preconditions of human thought and language will imply a thoroughgoing critique of all metaphysics, and above all of the modern metaphysics centered on the active, spontaneous subject who occupies modern philosophy from Descartes through Hegel and Nietzsche.
In seven essays that draw from metaphysics, phenomenology, literature, Christological theology, and Biblical exegesis, Marion sketches several prolegomena to a future fuller thinking and saying of love’s paradoxical reasons, exploring evil, freedom, bedazzlement, and the loving gaze; crisis, absence, and knowing.
The title of this book echoes a phrase used by the Washington Post to describe the American attempt to kill Saddam Hussein at the start of the war against Iraq. Its theme is the notion of targeting (skopos) as the name of an intentional structure in which the subject tries to confirm its invulnerability by aiming to destroy a target. At the center of the first chapter is Odysseus’s killing of the suitors; the second concerns Carl Schmitt’s Roman Catholicism and Political Form; the third and fourth treat Freud’s “Thoughts for the Times on War and Death” and “The Man Moses and Monotheistic Religion.” Weber then traces the emergence of an alternative to targeting, first within military and strategic thinking itself (“Network Centered Warfare”), and then in Walter Benjamin’s readings of “Capitalism as Religion” and “Two Poems of Friedrich Hölderlin.”
The ICE-theory of technical functions. Book symposium, Metascience, pp. 1-22, DOI 10.1007/s11016-012-9642-9. Authors: E. Weber, Centre for Logic and Philosophy of Science, Ghent University (UGent), Blandijnberg 2, 9000 Gent, Belgium; T. A. C. Reydon, Institute of Philosophy, Leibniz University Hannover, Im Moore 21, 30167 Hannover, Germany; M. Boon, Department of Philosophy, University of Twente, P.O. Box 217, 7500 AE Enschede, The Netherlands; W. Houkes, Philosophy and Ethics, Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven, The Netherlands; P. E. Vermaas, Department of Philosophy, Delft University of Technology, Jaffalaan 5, 2628 BX Delft, The Netherlands. Online ISSN 1467-9981; Print ISSN 0815-0796.
Damage accumulation simulated previously (F. Gao and W.J. Weber, Phys. Rev. B 66, 024106 (2002)) has been used to study volume swelling of 3C-SiC, and to calculate the elastic constants, bulk and elastic moduli of the cascade-amorphized SiC. The swelling increases rapidly with dose at low-dose levels, but the rate of increase decreases dramatically at higher dose, with a saturation volume change of 8.2% for the cascade-amorphized state. The elastic constants in the cascade-amorphized SiC decrease by about 19, 29 and 46% for C11, C12 and C44, respectively, and by 23% for the bulk and elastic moduli. In order to understand defect annealing of damage accumulation, the stable Frenkel pairs created in low-energy events have been annealed at different temperatures, using molecular dynamics methods, to determine the time required for interstitials to recombine with vacancies. The results show that the low activation energies qualitatively overlap with experimental values determined for defect recovery below 300 K. Thus, the present results suggest that this experimental recovery stage is associated with the spontaneous recovery of Frenkel pairs.
Along with Husserl's Ideas and Heidegger's Being and Time, Being Given is one of the classic works of phenomenology in the twentieth century. Through readings of Kant, Husserl, Heidegger, Derrida, and twentieth-century French phenomenology (e.g., Merleau-Ponty, Levinas, and Henry), it ventures a bold and decisive reappraisal of phenomenology and its possibilities. Its author's most original work to date, the book pushes phenomenology to its limits in an attempt to redefine and recover the phenomenological ideal, which the author argues has never been realized in any of the historical phenomenologies. Against Husserl's reduction to consciousness and Heidegger's reduction to Dasein, the author proposes a third reduction to givenness, wherein phenomena appear unconditionally and show themselves from themselves at their own initiative. Being Given is the clearest, most systematic response to questions that have occupied its author for the better part of two decades. The book articulates a powerful set of concepts that should provoke new research in philosophy, religion, and art, as well as at the intersection of these disciplines. Some of the significant issues it treats include the phenomenological definition of the phenomenon, the redefinition of the gift in terms not of economy but of givenness, the nature of saturated phenomena, and the question “Who comes after the subject?” Throughout his consideration of these issues, the author carefully notes their significance for the increasingly popular fields of religious studies and philosophy of religion. Being Given is therefore indispensable reading for anyone interested in the question of the relation between the phenomenological and the theological in Marion and emergent French phenomenology.
Painting, according to Jean-Luc Marion, is a central topic of concern for philosophy, particularly phenomenology. For the question of painting is, at its heart, a question of visibility—of appearance. As such, the painting is a privileged case of the phenomenon; the painting becomes an index for investigating the conditions of appearance—or what Marion describes as “phenomenality” in general. In The Crossing of the Visible, Marion takes up just such a project. The natural outgrowth of his earlier reflections on icons, these four studies carefully consider the history of painting—from classical to contemporary—as a fund for phenomenological reflection on the conditions of (in)visibility. Ranging across artists from Raphael to Rothko, Caravaggio to Pollock, The Crossing of the Visible offers both a critique of contemporary accounts of the visual and a constructive alternative. According to Marion, the proper response to the “nihilism” of postmodernity is not iconoclasm, but rather a radically iconic account of the visual and the arts that opens them to the invisible.
This piece, included in the drift special issue of continent., was created as one step in a thread of inquiry. While each of the contributions to drift stands on its own, the project was an attempt to follow a line of theoretical inquiry as it passed through time and the postal service(s) from October 2012 until May 2013. This issue hosts two threads: between space & place and between intention & attention. The editors recommend that, to experience the drifting thought, attention be paid to the contributions as they entered into conversation one after another. This particular piece is from the BETWEEN SPACE & PLACE thread: April Vannini, Those Between the Common * Laura Dean & Jesse McClelland, Ballard: A Portrait of Placemaking * Amara Hark Weber, Crossroad * Isaac Linder & Berit Soli-Holt, The Call of the Wild: Terro(i)r Modulations * Ashley D. Hairston, Momma taught us to keep a clean house * Sean Smith, The Garage (Take One) * * * * The plains of the upper mid-west have changed substantially over the past 50 years, as farming technology and demographics shifted. What is left is a landscape covered with the shells of homes, farms, and towns melting into the earth. Those who remain do so as stubbornly as the folks who settled there 100 years ago. What becomes of the abandoned structures is a question that will only be settled with time. This collection of photography is not a document of abandonment but rather an exploration of what happens when space and place collide: the intersection between nature, home, dreams, and memory.
This research replicates Weber's 1995 study of a large financial services firm that found that ethical subclimates exist within multi-departmental organizations, are influenced by the function of the department and the stakeholders served, and are relatively stable over time. Relying upon theoretical models developed by Thompson (1967) and Victor and Cullen (1988), hypotheses are developed that predict the ethical subclimate decision-making dimensions and type for diverse departments within a large steel manufacturing firm and that these ethical subclimate types will be stable across the two periods of time when the data were collected. Employees were surveyed in 1995 and again in 1999 using Victor and Cullen's Ethical Climate Questionnaire. Response rates of 88 and 94 percent were achieved. Contrary to Weber's findings, our results imply that, in both samples, ethical subclimates may be determined by the strength of an organization's overall ethical climate, rather than the department's function. However, we did find support for Weber's earlier contention that these subclimates are relatively stable. Our results also suggest that differences may exist across industries, that is, when comparing a large steel manufacturer, as we did in our study, with a large financial services organization, as Weber did in his 1995 study.
Summary Based on Max Weber's concept of Kulturnation and Hans Blumenberg's project of metaphorology, this essay argues that modern nations follow distinct cultural programmes that are inherent to their national ideas. Each national idea is propagated by a particular biopolitical metaphor, which performs a transfer from practical or scientific ideas about how nature structures and organises life to cultural ideas about how human lives should be socially and politically organised. The essay examines the emergence of the principal metaphors of grafting in England (Great Britain), of regeneration and elective affinities in France, and of organic self-generation in Prussia (Germany). The fact that each nation claims for its particular national idea the status of a universal principle constitutes the intrinsic paradox of nationalism.
This paper proposes a basic revision of the understanding of teleology in the biological sciences. Since Kant, it has become customary to view purposiveness in organisms as a bias added by the observer; the recent notion of teleonomy expresses well this as-if character of natural purposes. Recent developments in science, however, such as the notion of self-organization (or complex systems) and the autopoiesis viewpoint, have made emergence and circular self-production central features of life. Contrary to an often superficial reading, Kant gives a multi-faceted account of the living, and anticipates this modern reading of the organism, even introducing the term self-organization for the first time. Our re-reading of Kant in this light is strengthened by a group of philosophers of biology, with Hans Jonas as the central figure, who put back on center stage an organism-centered view of the living, an autonomous center of concern capable of providing an interior perspective. Thus, what is present in nuce in Kant finds a convergent development in this current of philosophy of biology and in the scientific ideas around autopoiesis, two independent but parallel developments culminating in the 1970s. Instead of viewing meaning or value as artifacts or illusions, both agree on a new understanding of a form of immanent teleology as a truly biological feature, inevitably intertwined with the self-establishment of an identity which is the living process.
Going back at least to Duhem, there is a tradition of thinking that crucial experiments are impossible in science. I analyse Duhem's arguments and show that they are based on the excessively strong assumption that only deductive reasoning is permissible in experimental science. This opens the possibility that some principle of inductive inference could provide a sufficient reason for preferring one among a group of hypotheses on the basis of an appropriately controlled experiment. To be sure, there are analogues to Duhem's problems that pertain to inductive inference. Using a famous experiment from the history of molecular biology as an example, I show that an experimentalist version of inference to the best explanation (IBE) does a better job in handling these problems than other accounts of scientific inference. Furthermore, I introduce a concept of experimental mechanism and show that it can guide inferences from data within an IBE-based framework for induction. Contents: Introduction; Duhem on the Logic of Crucial Experiments; ‘The Most Beautiful Experiment in Biology’; Why Not Simple Elimination?; Severe Testing; An Experimentalist Version of IBE (6.1 Physiological and experimental mechanisms; 6.2 Explaining the data; 6.3 IBE and the problem of untested auxiliaries; 6.4 IBE-turtles all the way down); Van Fraassen's ‘Bad Lot’ Argument; IBE and Bayesianism; Conclusions.
This paper compares and contrasts two distinct techniques for measuring moral judgment: The Moral Judgment Interview and the Defining Issues Test. The theoretical foundations, accompanying advantages and limitations, as well as appropriate usage of these methodologies are discussed. Adaptation and use of the instruments for business ethics research is given special attention.
Charles S. Peirce’s theory of proper names bears helpful insights for how we might think about his understanding of persons. Persons, on his view, are continuities, not static objects. I argue that Peirce’s notion of the legisign, particularly proper names, sheds light on the habitual and conventional elements of what it means to be a person. In this paper, I begin with an account of what philosophers of language have said about proper names in order to distinguish Peirce’s theory of proper names from them. Then, I present Peirce’s semiotic theory of proper names, followed by some ways in which his theory can be applied to practical concerns, such as first impressions, name changing, identity, and temporary insanity.
In this paper, I present a summary of the philosophical relationship between Wittgenstein and Brouwer, taking as my point of departure Brouwer's lecture on March 10, 1928 in Vienna. I argue that, since Wittgenstein had at that stage not done serious philosophical work for years, if one is to understand the impact of that lecture on him, it is better to compare its content with the remarks on logic and mathematics in the Tractatus. I thus show that Wittgenstein's position in the Tractatus was already quite close to Brouwer's and that the points of divergence are the basis of Wittgenstein's later criticisms of intuitionism. Among the topics of comparison are the role of intuition in mathematics, rule following, choice sequences, the Law of Excluded Middle, and the primacy of arithmetic over logic.
It has been claimed that the intentional stance is necessary to individuate behavioral traits. This thesis, while clearly false, points to two interesting sets of problems concerning biological explanations of behavior: The first is a general problem in the philosophy of science: the theory-ladenness of observation. The second problem concerns the principles of trait individuation, which is a general problem in the philosophy of biology. After discussing some alternatives, I show that one way of individuating the behavioral traits of an organism is by a special use of the concept of biological function, as understood in an enriched causal role (not selected effect) sense. On this view, a behavioral trait is essentially a special kind of regularity, namely a regularity that is produced by some regulatory mechanism. Regulatory mechanisms always require goal states, which can only be provided by functional considerations. As an example from actual (as opposed to folk) science, I examine the case of social behavior in nematodes. I show that the attempt to explain this phenomenon actually transformed it. This supports the view that scientific explanation does not explain an explanandum phenomenon that is given prior to the explanation; rather, the explanandum is changed by the explanation. This means that there could be a plurality of stances that have some heuristic value initially, but which will be abandoned in favor of a functional characterization eventually.
In this paper I compare the roles that the explicit and implicit educational theories of John Dewey and John Rawls play in their political works to show that Rawls’s approach is skeletal and inappropriate for defenders of democracy. I also uphold Dewey’s belief that education is valuable in itself, not only derivatively, contra Rawls. Next, I address worries for any educational theory concerning problems of distributive justice. Finally, I defend Dewey’s commitment to democracy as a consequence of the demands of productive public inquiry and education.
Recognition that biological systems are stabilized far from equilibrium by self-organizing, informed, autocatalytic cycles and structures that dissipate unusable energy and matter has led to recent attempts to reformulate evolutionary theory. We hold that such insights are consistent with the broad development of the Darwinian Tradition and with the concept of natural selection. Biological systems are selected that are not only more efficient than competitors but also enhance the integrity of the web of energetic relations in which they are embedded. But the expansion of the informational phase space, upon which selection acts, is also guaranteed by the properties of open informational-energetic systems. This provides a directionality and irreversibility to evolutionary processes that are not reflected in current theory. For this thermodynamically-based program to progress, we believe that biological information should not be treated in isolation from energy flows, and that the ecological perspective must be given descriptive and explanatory primacy. Levels of the ecological hierarchy are relational parts of ecological systems in which there are stable, informed patterns of energy flow and entropic dissipation. Isomorphies between developmental patterns and ecological succession are revealing because they suggest that much of the encoded metabolic information in biological systems is internalized ecological information. The genealogical hierarchy, to the extent that its information content reflects internalized ecological information, can therefore be redescribed as an ecological hierarchy.
After sketching an argument for radical anti-realism that does not appeal to human limitations but to polynomial-time computability in its definition of feasibility, I revisit an argument by Wittgenstein on the surveyability of proofs, and then examine the consequences of its application to the notion of canonical proof in contemporary proof-theoretic semantics.
I examine different arguments that could be used to establish indeterminism of neurological processes. Even though scenarios where single events at the molecular level make the difference in the outcome of such processes are realistic, this falls short of establishing indeterminism, because it is not clear that these molecular events are subject to quantum mechanical uncertainty. Furthermore, attempts to argue for indeterminism autonomously (i.e., independently of quantum mechanics) fail, because both deterministic and indeterministic models can account for the empirically observed behavior of ion channels.
In the literature on scientific explanation two types of pluralism are very common. The first concerns the distinction between explanations of singular facts and explanations of laws: there is a consensus that they have a different structure. The second concerns the distinction between causal explanations and unification explanations: most people agree that both are useful and that their structure is different. In this article we argue for pluralism within the area of causal explanations: we claim that the structure of a causal explanation depends on the causal structure of the relevant fragment of the world and on the interests of the explainer.
The paper has two aims. First, to show that we need social mechanisms to establish the policy relevance of causal claims, even if it is possible to build a good argument for those claims without knowledge of mechanisms. Second, to show that although social scientists can, in principle, do without social mechanisms when they argue for causal claims, in real scientific practice contexts where they do not need mechanisms are very rare. Key words: social mechanisms; causal inference; social policy.
In this paper I argue that Martha Nussbaum's Aristotelian analysis of compassion and pity is faulty, largely because she fails to distinguish between (a) an emotion's basic constitutive conditions and the associated constitutive or intrinsic norms, (b) extrinsic normative conditions, for instance, instrumental and moral considerations, and (c) the causal conditions under which the emotion is most likely to be experienced. I also argue that her defense of compassion and pity as morally valuable emotions is inadequate because she treats a wide variety of objections as all stemming from a common commitment to a Stoic conception of the good. I argue that these objections can be construed as neutral between conceptions of the good. I conclude by arguing that, construed in this way, there are nonetheless plausible replies to these objections.
In this paper, elementary but hitherto overlooked connections are established between Wittgenstein's remarks on mathematics, written during his transitional period, and free-variable finitism. After giving a brief description of the Tractatus Logico-Philosophicus on quantifiers and generality, I present in the first section Wittgenstein's rejection of quantification theory and his account of general arithmetical propositions, to use modern jargon, as claims (as opposed to statements). As in Skolem's primitive recursive arithmetic and Goodstein's equational calculus, Wittgenstein represented generality by the use of free variables. This has the effect that the negation of unbounded universal and existential propositions cannot be expressed. This is claimed in the second section to be the basis for Wittgenstein's criticism of the universal validity of the law of excluded middle. In the last section, there is a brief discussion of Wittgenstein's remarks on real numbers. These show a preference, in line with finitism, for a recursive version of the continuum.
The supervenience and multiple realizability of biological properties have been invoked to support a disunified picture of the biological sciences. I argue that supervenience does not capture the relation between fitness and an organism's physical properties. The actual relation is one of causal dependence and is, therefore, amenable to causal explanation. A case from optimality theory is presented and interpreted as a microreductive explanation of fitness difference. Such microreductions can have considerable scope. Implications are discussed for reductive physicalism in evolutionary biology and for the unity of science.
The Darwinian concept of natural selection was conceived within a set of Newtonian background assumptions about systems dynamics. Mendelian genetics at first did not sit well with the gradualist assumptions of the Darwinian theory. Eventually, however, Mendelism and Darwinism were fused by reformulating natural selection in statistical terms. This reflected a shift to a more probabilistic set of background assumptions based upon Boltzmannian systems dynamics. Recent developments in molecular genetics and paleontology have put pressure on Darwinism once again. Current work on self-organizing systems may provide a stimulus not only for increased problem solving within the Darwinian tradition, especially with respect to origins of life, developmental genetics, phylogenetic pattern, and energy-flow ecology, but for deeper understanding of the very phenomenon of natural selection itself. Since self-organizational phenomena depend deeply on stochastic processes, self-organizational systems dynamics advance the probability revolution. In our view, natural selection is an emergent phenomenon of physical and chemical selection. These developments suggest that natural selection may be grounded in physical law more deeply than is allowed by advocates of the autonomy of biology, while still making it possible to deny, with autonomists, that evolutionary explanations can be modeled in terms of a deductive relationship between laws and cases. We explore the relationship between chance, self-organization, and selection as sources of order in biological systems in order to make these points.
In this article we criticize two recent articles that examine the relation between explanation and unification. Halonen and Hintikka (1999), on the one hand, claim that no unification is explanation. Schurz (1999), on the other hand, claims that all explanation is unification. We give counterexamples to both claims. We propose a pluralistic approach to the problem: explanation sometimes consists in unification, but in other cases different kinds of explanation (e.g., causal explanation) are required; and none of these kinds is more fundamental.
This paper asks whether statutory social insurance programs, which provide contributory tax-based income support to people with disabilities, are compatible with the disability rights movement's ideas. Central to the movement that led to the Americans with Disabilities Act is the insight that physical or mental conditions do not disable; barriers created by the environment or by social attitudes keep persons with physical or mental differences from participating in society as equals. The conflict between the civil rights approach and insurance seems apparent. A person takes out insurance to deal with tragedy, such as premature death, or damage, such as accidental harm to an automobile or home. Social insurance, for example, the United States Social Security old-age and disability programs, consists of government-run insurance to cover risks of advanced age and disability for which the private market has not provided affordable coverage. But the civil rights approach to disability posits that disability is not a risk, not tragedy, and not a damage or defect. Instead it is a maladaptation of society to human variation. This paper argues that a justification remains for social insurance under the civil rights approach to disability, and further suggests that expansion of social insurance for disability is both compatible with disability rights principles and supported by wise public policy.