This paper is concerned with the debate between substantival and relational theories of space-time, and discusses two difficulties that beset the relationalist: a difficulty posed by field theories, and another difficulty (discussed at greater length) called the problem of quantities. A main purpose of the paper is to argue that possibility cannot always be used as a surrogate for ontology, and in particular that there is no hope of using possibility to solve the problem of quantities.
The paper outlines a view of normativity that combines elements of relativism and expressivism, and applies it to normative concepts in epistemology. The result is a kind of epistemological anti-realism, which denies that epistemic norms can be (in any straightforward sense) correct or incorrect; it does allow some to be better than others, but takes this to be goal-relative and is skeptical of the existence of best norms. It discusses the circularity that arises from the fact that we need to use epistemic norms to gather the facts with which to evaluate epistemic norms; relatedly, it discusses how epistemic norms can rationally evolve. It concludes with some discussion of the impact of this view on "ground level" epistemology.
There are quite a few theses about logic that are in one way or another pluralist: they hold (i) that there is no uniquely correct logic, and (ii) that because of this, some or all debates about logic are illusory, or need to be somehow reconceived as not straightforwardly factual. Pluralist theses differ markedly over the reasons offered for there being no uniquely correct logic. Some such theses are more interesting than others, because they more radically affect how we are initially inclined to understand debates about logic. Can one find a pluralist thesis that is high on the interest scale, and also true?
The paper tries to spell out a connection between deductive logic and rationality, against Harman's arguments that there is no such connection, and also against the thought that any such connection would preclude rational change in logic. One might not need to connect logic to rationality if one could view logic as the science of what preserves truth by a certain kind of necessity (or by necessity plus logical form); but the paper points out a serious obstacle to any such view.
1. Background. At least from the time of the ancient Greeks, most philosophers have held that some of our knowledge is independent of experience, or “a priori”. Indeed, a major tenet of the rationalist tradition in philosophy was that a great deal of our knowledge had this character: even Kant, a critic of some of the overblown claims of rationalism, thought that the structure of space could be known a priori, as could many of the fundamental principles of physics; and Hegel is reputed to have claimed to have deduced on a priori grounds that the number of planets is exactly five.
There are many reasons why one might be tempted to reject certain instances of the law of excluded middle. And it is initially natural to take ‘reject’ to mean ‘deny’, that is, ‘assert the negation of’. But if we assert the negation of a disjunction, we certainly ought to assert the negation of each disjunct (since the disjunction is weaker than the disjuncts). So asserting…
Both in dealing with the semantic paradoxes and in dealing with vagueness and indeterminacy, there is some temptation to weaken classical logic: in particular, to restrict the law of excluded middle. The reasons for doing this are somewhat different in the two cases. In the case of the semantic paradoxes, a weakening of classical logic (presumably involving a restriction of excluded middle) is required if we are to preserve the naive theory of truth without inconsistency. In the case of vagueness and indeterminacy, there is no worry about inconsistency; but a central intuition is that we must reject the factual status of certain sentences, and it is hard to see how we can do that while claiming that the law of excluded middle applies to those sentences. So despite the different routes, we have a similar conclusion in the two cases.
1. Of what use is the concept of causation? Bertrand Russell [1912-13] argued that it is not useful: it is “a relic of a bygone age, surviving, like the monarchy, only because it is erroneously supposed to do no harm.” His argument for this was that the kind of physical theories that we have come to regard as fundamental leave no place for the notion of causation: not only does the word ‘cause’ not appear in the advanced sciences, but the laws that these sciences state are incompatible with causation as we normally understand it. But Nancy Cartwright has argued that abandoning the concept of causation would cripple science; her conclusion was based not on fundamental physics, but on more ordinary science such as the search for the causes of cancer. She argues that Russell was right that the fundamental theories of modern physics say nothing, even implicitly, about causation, and concludes on this basis that such theories are incomplete. It is with this cluster of issues that I will begin my discussion.
Are there questions for which 'there is no determinate fact of the matter' as to which answer is correct? Most of us think so, but there are serious difficulties in maintaining the view, and in explaining the idea of determinateness in a satisfactory manner. The paper argues that to overcome the difficulties, we need to reject the law of excluded middle; and it investigates the sense of 'rejection' that is involved. The paper also explores the logic that is required if we reject excluded middle, with special emphasis on the conditional. There is also discussion of higher order indeterminacy (in several different senses) and of penumbral connections; and there is a suggested definition of determinateness in terms of the conditional and a discussion of the extent to which the notion of determinateness is objective. And there are suggestions about a unified treatment of vagueness and the semantic paradoxes.
Discussion of Chapter 5 of Stephen Schiffer's "The Things We Mean", in which Schiffer advances two novel theses: 1. Vagueness (and indeterminacy more generally) is a psychological phenomenon; 2. It is indeterminate whether classical logic applies in situations where vagueness matters.
It is “the received wisdom” that any intuitively natural and consistent resolution of a class of semantic paradoxes immediately leads to other paradoxes just as bad as the first. This is often called the “revenge problem”. Some proponents of the received wisdom draw the conclusion that there is no hope of any natural treatment that puts all the paradoxes to rest: we must either live with the existence of paradoxes that we are unable to treat, or adopt artificial and ad hoc means to avoid them. Others (“dialetheists”) argue that we can put the paradoxes to rest, but only by licensing the acceptance of some contradictions (presumably in a paraconsistent logic that prevents the contradictions from spreading everywhere).
Bayesian decision theory can be viewed as the core of psychological theory for idealized agents. To get a complete psychological theory for such agents, you have to supplement it with input and output laws. On a Bayesian theory that employs strict conditionalization, the input laws are easy to give. On a Bayesian theory that employs Jeffrey conditionalization, there appears to be a considerable problem with giving the input laws. However, Jeffrey conditionalization can be reformulated so that the problem disappears, and in fact the reformulated version is more natural and easier to work with on independent grounds.
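For readers unfamiliar with the rule at issue, here is a minimal sketch of generic Jeffrey conditionalization (the standard textbook rule, not the reformulation the paper proposes; the function names and toy numbers are mine):

```python
# Jeffrey conditionalization: given a prior over worlds and new
# probabilities q_i assigned to the cells E_i of a partition, the
# posterior of any event A is  P'(A) = sum_i q_i * P(A | E_i).

def conditional(prior, a, e):
    """P(A | E), with events represented as sets of worlds."""
    p_e = sum(prior[w] for w in e)
    return sum(prior[w] for w in a & e) / p_e

def jeffrey_update(prior, cells_with_new_probs, a):
    """P'(A) after shifting the partition cells to the new q_i."""
    return sum(q * conditional(prior, a, e)
               for e, q in cells_with_new_probs)

# Toy example: four equiprobable worlds, partition {1,2} / {3,4};
# experience raises the probability of the first cell to 0.8.
prior = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
cells = [(frozenset({1, 2}), 0.8), (frozenset({3, 4}), 0.2)]
print(jeffrey_update(prior, cells, frozenset({1, 3})))  # 0.5
```

Strict conditionalization is the special case where one cell receives probability 1, which is why the input laws are easy to state there; the difficulty the paper discusses concerns saying lawfully where the intermediate q_i come from.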
Consider the following argument: (1) Bertrand Russell was old at age 3×10^18 nanoseconds (that’s about 95 years); (2) He wasn’t old at age 0 nanoseconds; (3) So there is a number N such that he was old at N nanoseconds and not old at k nanoseconds for any k less than N.
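Classically, (3) follows from (1) and (2) by the least-number principle; a sketch of the inference (my formalization, writing Old(n) for "old at n nanoseconds"):

```latex
\frac{\mathrm{Old}(3\times 10^{18}) \qquad \neg\,\mathrm{Old}(0)}
     {\exists N\,\bigl[\mathrm{Old}(N)\ \wedge\ \forall k\,\bigl(k < N \rightarrow \neg\,\mathrm{Old}(k)\bigr)\bigr]}
```

The puzzle is that the conclusion posits a sharp cutoff, a first nanosecond of oldness, which seems absurd for a vague predicate.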
Naive truth theory is, roughly, the theory of truth that in classical logic leads to well-known paradoxes (such as the Liar paradox and the Curry paradox). One response to these paradoxes is to weaken classical logic by restricting the law of excluded middle and introducing a conditional not defined from the other connectives in the usual way. In "New Grounds for Naive Truth Theory", Steve Yablo develops a new version of this response, and cites three respects in which he deems it superior to a version that I’ve advocated in several papers. I think he’s right that my version was non-optimal in some of these respects (one and a half of them, to be precise); however, Yablo’s own account seems to me to have some undesirable features as well. In this paper I will explore some variations on his account, and end up tentatively advocating a synthesis of his account and mine (one that is somewhat closer to mine than to his).
The paper shows how we can add a truth predicate to arithmetic (or formalized syntactic theory), and keep the usual truth schema Tr(A) ↔ A (understood as the conjunction of Tr(A) → A and A → Tr(A)). We also keep the full intersubstitutivity of Tr(A) with A in all contexts, even inside of an →. Keeping these things requires a weakening of classical logic; I suggest a logic based on the strong Kleene truth tables, but with → as an additional connective, and where the effect of classical logic is preserved in the arithmetic or formal syntax itself. Section 1 is an introduction to the problem and some of the difficulties that must be faced, in particular as to the logic of the →; Section 2 gives a construction of an arithmetically standard model of a truth theory; Section 3 investigates the logical laws that result from this; and Section 4 provides some philosophical commentary.
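As a quick illustration of the strong Kleene scheme mentioned above (a generic sketch, not the paper's construction; the names and the three-valued encoding are my own, with 0 = false, 1/2 = undefined, 1 = true):

```python
from fractions import Fraction

HALF = Fraction(1, 2)  # the "undefined" value

def k3_not(a):
    return 1 - a       # negation swaps true/false, fixes 1/2

def k3_and(a, b):
    return min(a, b)   # conjunction takes the minimum value

def k3_or(a, b):
    return max(a, b)   # disjunction takes the maximum value

# The conditional definable from these connectives ("not A, or B"):
def k3_material(a, b):
    return k3_or(k3_not(a), b)

# On this scheme even A -> A fails to come out true when A is
# undefined, which is one standard motivation for adding -> as a
# genuinely new connective rather than defining it as above.
print(k3_material(HALF, HALF))  # 1/2, not 1
```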
It might be thought that we could argue for the consistency of a mathematical theory T within T, by giving an inductive argument that all theorems of T are true and inferring consistency. By Gödel's second incompleteness theorem any such argument must break down, but just how it breaks down depends on the kind of theory of truth that is built into T. The paper surveys the possibilities, and suggests that some theories of truth give far more intuitive diagnoses of the breakdown than do others. The paper concludes with some morals about the nature of validity and about a possible alternative to the idea that mathematical theories are indefinitely extensible.
Tim Maudlin’s Truth and Paradox is terrific. In some sense its solution to the paradoxes is familiar—the book advocates an extension of what’s called the Kripke-Feferman theory (although the definition of validity it employs disguises this fact). Nonetheless, the perspective it casts on that solution is completely novel, and Maudlin uses this perspective to try to make the prima facie unattractive features of this solution seem palatable, indeed inescapable. Moreover, the book deals with many important issues that most writers on the paradoxes never deal with, including issues about the application of the Gödel theorems to powerful theories that include our theory of truth. The book includes intriguing excursions into general metaphysics, e.g. on the nature of logic, facts, vagueness, and much more; and it’s lucid and lively, a pleasure to read. It will interest a wide range of philosophers.
The paper offers a solution to the semantic paradoxes, one in which (1) we keep the unrestricted truth schema True(A) ↔ A, and (2) the object language can include its own metalanguage. Because of the first feature, classical logic must be restricted, but full classical reasoning applies in ordinary contexts, including standard set theory. The more general logic that replaces classical logic includes a principle of substitutivity of equivalents, which with the truth schema leads to the general intersubstitutivity of True(A) with A within the language. The logic is also shown to have the resources required to represent the way in which sentences (like the Liar sentence and the Curry sentence) that lead to paradox in classical logic are defective. We can in fact define a hierarchy of defectiveness predicates within the language. Contrary to claims that any solution to the paradoxes just breeds further paradoxes (revenge problems) involving defectiveness predicates, there is a general consistency/conservativeness proof that shows that talk of truth and the various levels of defectiveness can all be made coherent together within a single object language.
Presenting a selection of thirteen essays on various topics at the foundations of philosophy--one previously unpublished and eight accompanied by substantial new postscripts--this book offers outstanding insight on truth, meaning, and propositional attitudes; semantic indeterminacy and other kinds of "factual defectiveness;" and issues concerning objectivity, especially in mathematics and in epistemology. It will reward the attention of any philosopher interested in language, epistemology, or mathematics.
In Science Without Numbers, Hartry Field attempted to formulate a nominalist version of Newtonian physics, one free of ontic commitment to numbers, functions, or sets, yet sufficiently strong to have the standard platonist version as a conservative extension. However, when uses for abstract entities kept popping up like hydra heads, Field enriched his logic to avoid them. This paper reviews some of Field's attempts to deflate his ontology by inflating his logic.
A serious flaw in Hartry Field’s instrumental account of applied mathematics, namely that Field must overestimate the extent to which many of the structures of our mathematical theories are reflected in the physical world, underlies much of the criticism of this account. After reviewing some of this criticism, I illustrate through an examination of the prospects for extending Field’s account to classical equilibrium statistical mechanics how this flaw will prevent any significant extension of this account beyond field theories. I note in the conclusion that this diagnosis of Field’s program also points the way to modifications that may work.
Philosophy of mathematics for the last half-century has been dominated in one way or another by Quine’s indispensability argument. The argument alleges that our best scientific theory quantifies over, and thus commits us to, mathematical objects. In this paper, I present new considerations which undermine the most serious challenge to Quine’s argument, Hartry Field’s reformulation of Newtonian Gravitational Theory.
Numbers without Science opposes the Quine-Putnam indispensability argument, seeking to undermine the argument and reduce its profound influence. Philosophers rely on indispensability to justify mathematical knowledge using only empiricist epistemology. I argue that we need an independent account of our knowledge of mathematics. The indispensability argument, in broad form, consists of two premises. The major premise alleges that we are committed to mathematical objects if science requires them. The minor premise alleges that science in fact requires mathematical objects. The most common rejection of the argument denies its minor premise by introducing scientific theories which do not refer to mathematical objects. Hartry Field has shown how we can reformulate some physical theories without mathematical commitments. I argue that Field’s preference for intrinsic explanation, which underlies his reformulation, is ill-motivated, and that his resultant fictionalism suffers unacceptable consequences. I attack the major premise instead. I argue that Quine provides a mistaken criterion for ontic commitment. Our uses of mathematics in scientific theory are instrumental and do not commit us to mathematical objects. Furthermore, even if we accept Quine’s criterion for ontic commitment, the indispensability argument justifies only an anemic version of mathematics, and does not yield traditional mathematical objects. The first two chapters of the dissertation develop these results for Quine’s indispensability argument. In the third chapter, I apply my findings to other contemporary indispensabilists, specifically the structuralists Michael Resnik and Stewart Shapiro. In the fourth chapter, I show that indispensability arguments which do not rely on Quine’s holism, like that of Putnam, are even less successful. Also in Chapter 4, I show how Putnam’s work in the philosophy of mathematics is unified around the indispensability argument.
In the last chapter of the dissertation, I conclude that we need an account of mathematical knowledge which does not appeal to empirical science and which does not succumb to mysticism and speculation. Briefly, my strategy is to argue that any defensible solution to the demarcation problem of separating good scientific theories from bad ones will find mathematics to be good, if not empirical, science.
In this second paper, I continue my discussion of the problem of reference for scientific realism. First, I consider a final objection to Kitcher's account of reference, which I generalise to other accounts of reference. Such accounts make attributions of reference by appeal to our pretheoretical intuitions about how true statements ought to be distributed among the scientific utterances of the past. I argue that in the cases that merit discussion, this strategy fails because our intuitions are unstable. The interesting cases are importantly borderline--it really isn't clear what we ought to say about how those terms referred. I conclude that in many relevant cases, our grounds for thinking that the theoretical terms of the past referred are matched by our grounds for thinking that they failed to refer, in such a way that deciding on either result is arbitrary and bad news for the realist. In response to this problem, in the second part of the paper I expand upon Field's (1973) account of partial reference to sketch a new way of thinking about the theoretical terms of the past--that they partially referred and partially failed to refer.
Since the first volume appeared in 2005, the collection Controversies has brought together work in the field of argumentation, giving particular attention to pieces concerned with theoretical and practical problems of discursive controversy and confrontation. Authors such as P. Barrotta, M. Dascal, S. Frogel, H. Chang and D. Walton edited or wrote the volumes preceding the present one (volume six). F. H. van Eemeren and B. Garssen (the former having already edited, with P. Houtlosser, the second volume of the collection) are responsible for compiling and editing this collection. In this volume Van Eemeren and Garssen edit works they conceive as being akin to those elements which, in argumentation discourse, serve to resolve – or often to present – differences of opinion. However, it should be added that this is not a mere editing job, but rather the result of an intellectual collaboration between two international research groups dedicated to a common field – consisting, on the one hand, of controversies and, on the other, of argumentation.
Hartry Field's formulation of an epistemological argument against platonism is valid only if knowledge is constrained by a causal clause. Contrary to recent claims (e.g. in Liggins (2006), Liggins (2010)), Field's argument therefore fails by the very same criterion usually taken to discredit Benacerraf's earlier version.
H. B. D. Kettlewell's field experiments on industrial melanism in the peppered moth, Biston betularia, have become the best known demonstration of natural selection in action. I argue that textbook accounts routinely portray this research as an example of controlled experimentation, even though this is historically misleading. I examine how idealized accounts of Kettlewell's research have been used by professional biologists and biology teachers. I also respond to some criticisms by David Rudge of my earlier discussions of this case study, and I question Rudge's claims about the importance of purely observational studies for the eventual acceptance and popularization of Kettlewell's explanation for the evolution of industrial melanism.
In the present article, we provide a critical overview of the emerging field of ‘neuroeducation’, also frequently referred to as ‘mind, brain and education’ or ‘educational neuroscience’. We describe the growing energy behind linking education and neuroscience in an effort to improve learning and instruction. We explore reasons behind such drives for interdisciplinary research. Reviewing some of the key advances in neuroscientific studies that have come to bear on neuroeducation, we discuss recent evidence on the brain circuits underlying reading and mathematical abilities, as well as the potential to use neuroscience to design training programs for neurocognitive functions, such as working memory, that are expected to have effects on overall brain function. Throughout this review we describe how such research can enrich our understanding of the acquisition of academic skills. Furthermore, we discuss the potential for modern brain imaging methods to serve as diagnostic tools as well as measures of the effects of educational interventions. Throughout this discussion, we draw attention to limitations of the available evidence and propose future avenues for research. We also discuss the challenges that face this growing discipline. Specifically, we draw attention to unrealistic expectations for the immediate impact of neuroscience on education, methodological difficulties, and lack of interdisciplinary training, which results in poor communication between educators and neuroscientists. We point out that there should be bi-directional and reciprocal interactions between both disciplines of neuroscience and education, in which research originating from each of these traditions is considered to be compelling in its own right. While there are many obstacles that lie in the way of a productive field of neuroeducation, we contend that there is much reason to be optimistic and that the groundwork has been laid to advance this field in earnest.
Daniel Ansari (Numerical Cognition Laboratory, Department of Psychology, The University of Western Ontario, London, ON, Canada), Bert De Smedt (Parenting and Special Education Research Group, Katholieke Universiteit Leuven, Leuven, Belgium), and Roland H. Grabner (Institute for Behavioral Sciences, Swiss Federal Institute of Technology (ETH) Zurich, Zurich, Switzerland). Original paper, Neuroethics, pp. 1-13, DOI 10.1007/s12152-011-9119-3 (Online ISSN 1874-5504, Print ISSN 1874-5490).
Hartry Field's revised logic for the theory of truth in his new book, Saving Truth from Paradox, seeking to preserve Tarski's T-scheme, does not admit a full theory of negation. In response, Crispin Wright proposed that the negation of a proposition is the proposition saying that some proposition inconsistent with the first is true. For this to work, we have to show that this proposition is entailed by any proposition incompatible with the first, that is, that it is the weakest proposition incompatible with the proposition whose negation it should be. To show that his proposal gave a full intuitionist theory of negation, Wright appealed to two principles, about incompatibility and entailment, and using them Field formulated a paradox of validity (or more precisely, of inconsistency). The medieval mathematician, theologian and logician Thomas Bradwardine, writing in the fourteenth century, proposed a solution to the paradoxes of truth which does not require any revision of logic. The key principle behind Bradwardine's solution is a pluralist doctrine of meaning, or signification: propositions can mean more than they explicitly say. In particular, he proposed that signification is closed under entailment. In light of this, Bradwardine revised the truth-rules, in particular refining the T-scheme, so that a proposition is true only if everything that it signifies obtains. Thereby, he was able to show that any proposition which signifies that it itself is false also signifies that it is true, and consequently is false and not true. I show that Bradwardine's solution is also able to deal with Field's paradox and others of a similar nature. Hence Field's logical revisions are unnecessary to save truth from paradox.
In approaching Ch. 4 of Saving Truth from Paradox, it might be helpful first to revisit Curry’s original paper, and to revisit Łukasiewicz too, to provide more of the scene-setting that Field doesn’t himself fill in. So in §1 I’ll say something about Curry, in §2 we’ll look at what Łukasiewicz was up to in his original three-valued logic, and in §3 we’ll look at the move from a three-valued to a many-valued Łukasiewicz logic. In §4, I move on to announce a theorem by Hájek.
The following analysis demonstrates that G.H. Mead's understanding of human speech (what Mead often referred to as “the vocal gesture”) is remarkably consistent with today's interdisciplinary field that studies speech as a natural behavior with an evolutionary history. Mead seems to have captured major empirical and theoretical insights more than half a century before the contemporary field began to take shape. In that field the framework known as “Tinbergen's Four Questions,” developed in ecology to study naturally occurring behavior in nonhuman animals, has been an effective organizing framework for research on human speech. It is used in this paper to organize the comparison of Mead with contemporary scholars. The analysis concludes that Mead was, in a sense, “beyond” the Four Questions by recognizing the limitations of reductionist methods in understanding the nature of conscious phenomena, especially language. Mead's socially situated model of the nature of human speech makes him relevant to today's field where some see an undervaluation of the treatment of language as a social process.
Measure H is confusing to many people, because the scientific issues involved are complex, and few have the necessary scientific background to analyze them for themselves. When those of us of more advanced years were growing up, the university scientific community for the most part was independent and objective; today even the best universities are dependent upon multinational corporations for their funding. Many scientists even have to go out and fund-raise for major portions of their own salaries. Nowhere is the situation worse in this regard than in the field of biotechnology. And no modern day Jesus has yet driven the money-changers from the temple of science. Where else can we look for guidance?
Hartry Field has recently presented an original and interesting approach to the a priori. Its main theses are, first, that certain rules are empirically indefeasible and, second, that the reasonableness of these rules is not based on any factual property. After an introduction, Field’s approach is presented in section II. Section III examines his claims concerning empirical indefeasibility. It will be argued that his general argument for empirical indefeasibility fails along with the particular examples of rules he gives. Alternative ways of preserving empirical indefeasibility are suggested that are compatible with overdetermination under certain assumptions. In section IV, Field’s arguments for the nonfactuality of epistemological concepts, such as reasonableness, are found wanting. At the end, an alternative way of understanding the link between the epistemological concept in question and truth-conduciveness is proposed that preserves the factuality of the epistemological concept.
This classic collection of essays, first published in 1968, has had an enduring impact on academic and public debates about criminal responsibility and criminal punishment. Forty years on, its arguments are as powerful as ever. H.L.A. Hart offers an alternative to retributive thinking about criminal punishment that nevertheless preserves the central distinction between guilt and innocence. He also provides an account of criminal responsibility that links the distinction between guilt and innocence closely to the ideal of the rule of law, and thereby attempts to by-pass unnerving debates about free will and determinism. Always engaged with live issues of law and public policy, Hart makes difficult philosophical puzzles accessible and immediate to a wide range of readers. For this new edition, otherwise a reproduction of the original, John Gardner adds an introduction engaging critically with Hart's arguments, and explaining the continuing importance of Hart's ideas in spite of the intervening revival of retributive thinking in both academic and policy circles. Unavailable for ten years, the new edition of Punishment and Responsibility makes available again the central text in the field for a new generation of academics, students and professionals engaged in criminal justice and penal policy.
Experiments are described, using electroencephalography (EEG) and simple tests of performance, which support the hypothesis that collapse of a quantum field is of importance to the functioning of the brain. The theoretical basis of our experiments is derived from Penrose (1989) who suggested that conscious decision-making is a manifestation of the outcome of quantum computation in the brain involving collapse of some relevant wave function. He also proposed that collapse of any wave function depends on a gravitational criterion. As different brain areas are known to subserve different functions, we argue that `Penrose collapse' must occur in a particular brain area when performing a task that uses it. Further, taking an EEG from the area should amplify the gravitational prerequisite for collapse, so affecting task performance. There are no non-quantum theories which could lead one to expect that taking an EEG could directly affect task performance by subjects. The results of both pilot and main experiments indicated that task performance was indeed influenced by taking an EEG from relevant brain areas. Control experiments suggested that the influence was quantum mechanical in origin, and was not due to any experimental artefact. The results are statistically significant and merit attempts at replication in an independent laboratory, preferably with more sophisticated equipment than was available to us.
Considering Pragma-Dialectics honors the monumental contributions of one of the foremost international figures in current argumentation scholarship: Frans van Eemeren. The volume presents the research efforts of his colleagues and addresses how their work relates to the pragma-dialectical theory of argumentation with which van Eemeren’s name is so intimately connected. This tribute serves to highlight the varied approaches to the study of argumentation and is destined to inspire researchers to advance scholarship in the field far into the future. Replete with contributions from highly-esteemed academics in argumentation study, chapters in this volume address such topics as: *Pragma-dialectic versus epistemic theories of arguing and arguments; *Pragma-dialectics and self-advocacy in physician-patient interactions; *The pragma-dialectical analysis of the ad hominem family; *Rhetoric, dialectic, and the functions of argument; and *The semantics of reasonableness. As an exceptional volume and a fitting tribute, this work will be of interest to all argumentation scholars considering the astute insights and scholarly legacy of Frans van Eemeren.
If the general arguments concerning the involvement of variation and selection in explanations of "fit" are valid, then variation and selection explanations should be appropriate, or at least potentially appropriate, outside the paradigm historistic domains of biology and knowledge. In this discussion, I wish to indicate some potential roles for variation and selection in foundational physics, specifically in quantum field theory. I will not be attempting any full coherent ontology for quantum field theory: none currently exists, and none is likely for at least the short-term future. Instead, I wish to engage in some partially speculative interpretations of some interesting results in this area, with the aim of demonstrating that variation and selection notions might play a role even here. If variation and selection can survive in even as inhospitable and non-paradigmatic a terrain as foundational physics, then it can survive anywhere.