A new perceptual paradigm is emerging that views the universe as a living system regenerated in its totality moment by moment. Scientific evidence for this view is presented, along with complementary insights from the world's spiritual traditions. The implications of this view of reality for our sense of identity, way of living, and evolutionary purpose are considered.
Philosophy long sought to set knowledge on a firm foundation, through derivation of indubitable truths by infallible rules. For want of such truths and rules, the enterprise foundered. Nevertheless, foundationalism's heirs continue their forebears' quest, seeking security against epistemic misfortune, while their detractors typically espouse unbridled coherentism or facile relativism. Maintaining that neither stance is tenable, Catherine Elgin devises a via media between the absolute and the arbitrary, reconceiving the nature, goals, and methods of epistemology. In Considered Judgment, she argues for a reconception that takes reflective equilibrium as the standard of rational acceptability. A system of thought is in reflective equilibrium when its components are reasonable in light of one another, and the account they comprise is reasonable in light of our antecedent convictions about the subject it concerns. Many epistemologists now concede that certainty is a chimerical goal. But they continue to accept the traditional conception of epistemology's problematic. Elgin suggests that in abandoning the quest for certainty we gain opportunities for a broader epistemological purview--one that comprehends the arts and does justice to the sciences. She contends that metaphor, fiction, emotion, and exemplification often advance understanding in science as well as in art. The range of epistemology is broader and more variegated than is usually recognized. Tenable systems of thought are neither absolute nor arbitrary. Although they afford no guarantees, they are good in the way of belief.
Predictions concerning the end of the world have proven less reliable than your broker’s recommendations or your fondest hopes. Whether you await the end fearfully or eagerly, you may rest assured that it will never come—not because the world is everlasting but because it has already ended, if indeed it ever began. But we need not mourn, for the world is indeed well lost, and with it the stultifying stereotypes of absolutism: the absurd notions of science as the effort to discover a unique, prepackaged, but unfortunately undiscoverable reality, and of truth as agreement with that inaccessible reality. All notions of pure givenness and unconditional necessity and of a single correct perspective and system of categories are lost as well. If there is no such thing as the world, what are we living in? The answer might be “A world” or, better, “Several worlds.” For to deny that there is any such thing as the world is no more to deny that there are worlds than to deny that there is any such thing as the number between two and seven is to deny that there are numbers between two and seven. The task of describing the world is as futile as the task of describing the number between two and seven. The world is lost once we appreciate a curious feature of certain pairs of seemingly contradictory statements: if either is true, both are. Although “The earth is in motion” and “The earth is at rest” apparently contradict each other, both are true. But from a contradiction, every statement follows. So unless we are prepared to acknowledge the truth of every statement, the appearance of contradiction in cases like these must somehow be dispelled. Nelson Goodman is professor emeritus of philosophy at Harvard University. He has written Of Mind and Other Matters, Ways of Worldmaking, Problems and Projects, Languages of Art, The Structure of Appearance, and Fact, Fiction, and Forecast. His most recent contribution to Critical Inquiry is “How Buildings Mean.” Catherine Z. Elgin is associate professor of philosophy at the University of North Carolina, Chapel Hill. She is the author of With Reference to Reference and is currently writing a book entitled Philosophy without Foundations.
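A minimal formal gloss of the explosion step invoked above (standard classical logic; the notation is mine, not the article's): from a pair of contradictory premises, disjunction introduction followed by disjunctive syllogism yields any statement \(Q\) whatsoever,
\[
P,\ \neg P \;\vdash\; Q, \qquad\text{since}\qquad P \vdash P \lor Q \quad\text{and}\quad \neg P,\ P \lor Q \vdash Q.
\]
This is why the apparent contradiction between “The earth is in motion” and “The earth is at rest” must be dispelled rather than embraced.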
"Systematizes and develops in a comprehensive study Nelson Goodman's philosophy of language. The Goodman-Elgin point of view is important and sophisticated, and deals with a number of issues, such as metaphor, ignored by most other theories." --John R. Perry, Stanford University.
If understanding is factive, the propositions that express an understanding are true. I argue that a factive conception of understanding is unduly restrictive. It neither reflects our practices in ascribing understanding nor does justice to contemporary science. For science uses idealizations and models that do not mirror the facts. Strictly speaking, they are false. By appeal to exemplification, I devise a more generous, flexible conception of understanding that accommodates science, reflects our practices, and shows a sufficient but not slavish sensitivity to the facts.
Truth is standardly considered a requirement on epistemic acceptability. But science and philosophy deploy models, idealizations and thought experiments that prescind from truth to achieve other cognitive ends. I argue that such felicitous falsehoods function as cognitively useful fictions. They are cognitively useful because they exemplify and afford epistemic access to features they share with the relevant facts. They are falsehoods in that they diverge from the facts. Nonetheless, they are true enough to serve their epistemic purposes. Theories that contain them have testable consequences, hence are factually defeasible.
This paper consists of four parts. Part 1 is an introduction. Part 2 evaluates arguments for the claim that there are no strict empirical laws in biology. I argue that there are two types of arguments for this claim: (1) Biological properties are multiply realized and they require complex processes. For this reason, it is almost impossible to formulate strict empirical laws in biology. (2) Generalizations in biology hold contingently but laws go beyond describing contingencies, so there cannot be strict laws in biology. I argue that both types of arguments fail. Part 3 considers some examples of biological laws in recent biological research and argues that they exemplify strict laws in biology. Part 4 considers the objection that the examples in part 3 may be strict laws but they are not distinctively biological laws. I argue that given a plausible account of what 'distinctively biological' means, such laws are distinctively biological.
Nancy Cartwright (1983, 1999) argues that (1) the fundamental laws of physics are true when and only when appropriate ceteris paribus modifiers are attached and that (2) ceteris paribus modifiers describe conditions that are almost never satisfied. She concludes that when the fundamental laws of physics are true, they don't apply in the real world, but only in highly idealized counterfactual situations. In this paper, we argue that (1) and (2) together with an assumption about contraposition entail the opposite conclusion — that the fundamental laws of physics do apply in the real world. Cartwright extracts from her thesis about the inapplicability of fundamental laws the conclusion that they cannot figure in covering-law explanations. We construct a different argument for a related conclusion — that forward-directed idealized dynamical laws cannot provide covering-law explanations that are causal. This argument is neutral on whether the assumption about contraposition is true. We then discuss Cartwright's simulacrum account of explanation, which seeks to describe how idealized laws can be explanatory.
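A minimal sketch of the contraposition step (my reconstruction under stated assumptions; the physical example is illustrative, not taken from the paper): a law of conditional form is classically equivalent to its contrapositive,
\[
(A \to B) \;\equiv\; (\neg B \to \neg A).
\]
If the idealized antecedent \(A\) (say, “no net force acts on the body”) is almost never satisfied in the real world, the contrapositive's antecedent \(\neg B\) (“the body does not move uniformly”) is satisfied nearly everywhere, so a true idealized law, read through its contrapositive, would apply to real situations after all.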
In this paper, I investigate the nature of a priori biological laws in connection with the idea that laws must be empirical. I argue that the epistemic functions of a priori biological laws in biology are the same as those of empirical laws in physics. Thus, the requirement that laws be empirical is idle in connection with how laws operate in science. This result presents a choice between sticking with an unmotivated philosophical requirement and taking the functional equivalence of laws seriously and modifying our philosophical account. I favor the latter.
Sober argues that some causal statements are a priori true and that a priori causal truths are central to explanations in the theory of natural selection. Lange and Rosenberg criticize Sober's argument. They concede that there are a priori causal truths, but maintain that those truths are only ‘minimally causal’. They also argue that explanations that are built around a priori causal truths are not causal explanations, properly speaking. Here we criticize both of Lange and Rosenberg's claims.
I argue that trustworthiness is an epistemic desideratum. It does not reduce to justified or reliable true belief, but figures in the reason why justified or reliable true beliefs are often valuable. Such beliefs can be precarious. If a belief's being justified requires that the evidence be just as we take it to be, then if we are off even by a little, the belief is unwarranted. Similarly for reliability. Although it satisfies the definition of knowledge, such a belief is not trustworthy. We ought not use it as a basis for inference or action and ought not give others to believe it. The trustworthiness of a belief, I urge, depends on its being backed by reasons—considerations that other members of the appropriate epistemic community cannot reasonably reject. Trustworthiness is intersubjective. It both depends on and contributes to the evolving cognitive values of an epistemic community.
Testimony consists in imparting information without supplying evidence or argument to back one's claims. To what extent does testimony convey epistemic warrant? C. J. A. Coady argues, on Davidsonian grounds, that (1) most testimony is true, hence (2) most testimony supplies warrant sufficient for knowledge. I appeal to Grice's maxims to undermine Coady's argument and to show that the matter is more complicated and context-sensitive than is standardly recognized. Informative exchanges take place within networks of shared, tacit assumptions that affect the scope and strength of our claims, and the level of warrant required for their responsible assertion. The maxims explain why different levels of warrant are transferred in different contexts.
Scientific realism holds that scientific representations are utterly objective. They describe the way the world is, independent of any point of view. In Scientific Representation, van Fraassen argues otherwise. If science is to afford an understanding of nature, it must be grounded in evidence. Since evidence is perspectival, science cannot vindicate its claims using only utterly objective representations. For science to do its epistemic job, it must involve perspectival representations. I explicate this argument and show its power.
I argue that the picture theory provides both a common referential base and a common logical syntax for languages embodying alternative conceptual schemes. I offer an analysis of depiction, explicating the Tractarian concepts of pictorial structure, pictorial relationship, and representational form. Significant failure of reference and the existence of languages with incompatible ontological commitments show that on the molar level depiction is not required for sense. Using three premises, taken to be axiomatic for Wittgenstein, I show that analysis leads to a base of elementary propositions which must depict in order to be significant. There, the relations between pictorial structure, pictorial relationship and representational form are such that reference is secured and conceptual relativity precluded.
The arts and the sciences perform many of the same cognitive functions, both serving to advance understanding. This paper explores some of the ways exemplification operates in the two fields. Both scientific experiments and works of art highlight, underscore, display, or convey some of their own features. They thereby focus attention on them, and make them available for examination and projection. Thus, the Michelson-Morley experiment exemplifies the constancy of the speed of light. Jackson Pollock's Number One exemplifies the viscosity of paint. Despite their similarities, science and art might seem to differ in their attitude toward facts. Science is said to adhere to facts; art, to be indifferent to them. Such, I urge, is not the case. Science, like art, often scorns fact to advance understanding through fiction. Thought experiments, I contend, are scientific fictions; literary and pictorial fictions, aesthetic thought experiments.
In this paper, I investigate the logical relation between two claims: (1) observations are theory-laden and (2) there is no empirical common ground upon which to evaluate successive scientific theories that belong to different paradigms. I first construct an argument in which (1) is the main premise and (2) is the conclusion. I argue that the term ‘theory-laden’ has three distinct senses: semantic, psychological and epistemic. If ‘theory-laden’ is understood in either the epistemic or the psychological sense, then the conclusion becomes a claim about people. If incommensurability is to be a claim about theories, then ‘theory-laden’ in the main premise should be understood in the semantic sense. I then argue that there is a further distinction to be drawn between the absolute and relative senses of theory-laden. The relative sense of theory-laden allows observations that are relatively neutral between the theories under examination. I conclude that the argument from theory-ladenness shows only that foundational empiricism is not a tenable philosophical position; it fails to show that no empirical test can decide between successive theories that belong to different paradigms.
I argue that constructive nominalism is preferable to scientific realism. Rather than reflecting without distortion the way the mind-independent world is, theories refract. They provide an understanding of the world as modulated by a particular theory. Truth is defined within a theoretical framework rather than outside of it. This does not undermine objectivity, for an assertion contains a reference to the framework in terms of which its truth is claimed.
I show that it follows from both externalist and internalist theories that stupid people may be in a better position to know than smart ones. This untoward consequence results from taking our epistemic goal to be accepting as many truths as possible and rejecting as many falsehoods as possible, combined with a recognition that the standard for acceptability cannot be set too high, else scepticism will prevail. After showing how causal, reliabilist, and coherentist theories devalue intelligence, I suggest that knowledge, as contemporary theories construe it, is not a particularly valuable cognitive achievement, and that we would do well to reopen epistemology to the study of cognitive excellences of all sorts.
A number of debates in philosophy of biology and psychology, as well as in their respective sciences, hinge on particular views about the relationship between genotypes and phenotypes. One such view is that the genotype-phenotype relationship is relatively straightforward, in the sense that a genome contains the 'genes for' the various traits that an organism exhibits. This leads to the assumption that if a particular set of traits is posited to be present in an organism, there must be a corresponding number of genes in that organism's genome to account for those traits. This assumption underlies what can be called the 'counting argument,' in which empirical estimates of the number of genes in a genome are used to support or refute particular hypotheses in philosophical debates about biology and psychology. In this paper, we assess the counting argument as it is used in discussions of the alleged massive modularity of the brain, and conclude that this argument cannot be upheld in light of recent philosophical work on gene concepts and empirical work on genome complexity. In doing so, we illustrate that there are those on both sides of the debate about massive modularity who rely on an incorrect view of gene concepts and the nature of the genotype-phenotype relationship.
Understanding, as I construe it, is holistic. It is a matter of how commitments mesh to form a mutually supportive, independently supported system of thought. It is advanced by bootstrapping. We start with what we think we know and build from there. This makes education continuous with what goes on at the cutting edge of inquiry. Methods, standards, categories and stances are as important as facts. So something like E. D. Hirsch’s list of facts every fourth grader should know is slightly silly. What makes for a good fourth grade education is not the set of facts the fourth grader knows, but the level of understanding she has achieved and the resources she can deploy to advance that understanding. Facts are part of the story, but so are fictions, methods, standards, and categories. A major part of understanding is recognizing what problems remain to be solved.
Jonathan Bennett (1974) maintains that Huckleberry Finn’s deliberations about whether to return Jim to slavery afford insight into the tension between sympathy and moral judgment; Miranda Fricker (2007) argues that the trial scene in To Kill a Mockingbird affords insight into the nature of testimonial injustice. Neither claims merely that the works prompt an attentive reader to think something new or to change her mind. Rather, they consider the reader cognitively better off for her encounters with the novels. Nor is her cognitive improvement restricted to acquiring new justified true beliefs about the works themselves. What the reader gleans is supposed to enhance her knowledge or understanding of the …
Cognitive advancement is not always a matter of acquiring new information. It often consists in reconfiguration--in reorganizing a domain so that hitherto overlooked or underemphasized features, patterns, opportunities, and resources come to light. Several modes of reconfiguration prominent in the arts--metaphor, fiction, exemplification, and perspective--play important roles in science as well. They do not perform the same roles as literal, descriptive, perspectiveless scientific truths. But to understand how science advances understanding, we need to appreciate the ineliminable cognitive contributions of non-literal, non-descriptive symbols.
The Structure of Appearance presents a phenomenalist system, constructing enduring visible objects out of qualia. Nevertheless Goodman does not espouse phenomenalism. This is not because he considers his system inadequate. Although details remain to be filled in, he considers his system viable. And he believes his constructional methods could readily yield extensions to other sensory realms. Why isn’t Goodman a phenomenalist? This paper suggests an answer that illuminates Goodman’s views about the nature and functions of constructional systems, the prospects of reductionism, and the character of epistemology. These non-standard views present attractive alternatives to currently popular positions.
Much ink has been spent on Popper's falsificationism. Why, then, am I writing another paper on this subject? This paper is neither a new kind of criticism nor a new kind of defense of falsificationism. Recent debate about the legitimacy of adaptationism among biologists centers on the question of whether Popper's falsificationism or Lakatos' methodology of scientific research programs (SRP) is adequate for understanding science. Stephen Jay Gould and Richard C. Lewontin (1979) argue that adaptationism is unfalsifiable since it easily invites ad hoc adjustments when it makes false predictions. William A. Mitchell and Thomas J. Valone (1990) argue that adaptationism is a research program, and that the charge of unfalsifiability does not apply to a research program. Although both sides make use of the theories of scientific methodology proposed by Karl R. Popper (1934, 1957, 1963, 1971) and Imre Lakatos (1965, 1974, 1978), the differences and the similarities between these philosophers are overlooked. The purpose of the present paper is to explicate the differences and the similarities between the two philosophies of science.
To understand a term or other symbol, I argue that it is generally neither necessary nor sufficient to assign it a unique determinate reference. Independent of and prior to investigation, it is frequently indeterminate not only whether a sentence is true, but also what its truth conditions are. Nelson Goodman's discussions of likeness of meaning are deployed to explain how this can be so.