Why is the future so different from the past? Why does the past affect the future and not the other way round? The universe began with the Big Bang - will it end with a 'Big Crunch'? Now in paperback, this book presents an innovative and controversial view of time and contemporary physics. Price urges physicists, philosophers, and anyone who has ever pondered the paradoxes of time to look at the world from a fresh perspective, and throws fascinating new light on some of the great mysteries of the universe.
How do rational minds make contact with the world? The empiricist tradition sees a gap between mind and world, and takes sensory experience, fallible as it is, to provide our only bridge across that gap. In its crudest form, for example, the traditional idea is that our minds consult an inner realm of sensory experience, which provides us with evidence about the nature of external reality. Notoriously, however, it turns out to be far from clear that there is any viable conception of experience which allows it to do the job. The original problem is to show that thought is rationally constrained by external reality. If sensory experience is to provide the solution--in particular, if it is to provide the answer to sceptical challenges--it must therefore meet two criteria. First, it must itself be 'receptive'--i.e., appropriately constrained by external reality. Second, it must be the kind of thing that can enter into a logical or rational relationship with belief--it must already be 'conceptual', in other words. In arguing against the idea that anything could serve both roles, Wilfrid Sellars termed this conception of experience "the Myth of the Given".
This volume brings together fourteen major essays by one of contemporary philosophy's most challenging thinkers. Huw Price links themes from Quine, Carnap, Wittgenstein and Rorty, to craft a powerful critique of contemporary naturalistic metaphysics. He offers a new positive program for philosophy, cast from a pragmatist mould.
Can an agent deliberating about an action A hold a meaningful credence that she will do A? 'No', say some authors, for 'Deliberation Crowds Out Prediction' (DCOP). Others disagree, but we argue here that such disagreements are often terminological. We explain why DCOP holds in a Ramseyian operationalist model of credence, but show that it is trivial to extend this model so that DCOP fails. We then discuss a model due to Joyce, and show that Joyce's rejection of DCOP rests on terminological choices about terms such as 'intention', 'prediction', and 'belief'. Once these choices are in view, they reveal underlying agreement between Joyce and the DCOP-favouring tradition that descends from Ramsey. Joyce's Evidential Autonomy Thesis (EAT) is effectively DCOP, in different terminological clothing. Both principles rest on the so-called 'transparency' of first-person present-tensed reflection on one's own mental states.
In a recent paper, Richard Rorty begins by telling us why pragmatists such as himself are inclined to identify truth with justification: ‘Pragmatists think that if something makes no difference to practice, it should make no difference to philosophy. This conviction makes them suspicious of the distinction between justification and truth, for that distinction makes no difference to my decisions about what to do.’ (1995, p. 19) Rorty goes on to discuss the claim, defended by Crispin Wright, that truth is a normative constraint on assertion. He argues that this claim runs foul of this principle of no difference without a practical difference: ‘The need to justify our beliefs to ourselves and our fellow agents subjects us to norms, and obedience to these norms produces a behavioural pattern that we must detect in others before confidently attributing beliefs to them. But there seems to be no occasion to look for obedience to an additional norm – the commandment to seek the truth. For – to return to the pragmatist doubt with which I began – obedience to that commandment will produce no behaviour not produced by the need to offer justification.’ (1995, p. 26) Again, then, Rorty appeals to the claim that a commitment to a norm of truth rather than a norm of justification makes no behavioural difference. This is an empirical claim, testable in principle by comparing the behaviour of a community of realists (in Rorty’s sense) to that of a community of pragmatists. In my view, the experiment would show that the claim is unjustified, indeed false. I think that there is an important and widespread behavioural pattern that depends on the fact that speakers do take themselves to be subject to such an additional norm.
The difference between cause and effect seems obvious and crucial in ordinary life, yet missing from modern physics. Almost a century ago, Bertrand Russell called the law of causality 'a relic of a bygone age'. In this important collection 13 leading scholars revisit Russell's revolutionary conclusion, discussing one of the most significant and puzzling issues in contemporary thought.
In “A Subjectivist’s Guide to Objective Chance,” David Lewis says that he is “led to wonder whether anyone but a subjectivist is in a position to understand objective chance.” The present essay aims to motivate this same Lewisean attitude, and a similar degree of modest subjectivism, with respect to objective causation. The essay begins with Newcomb problems, which turn on an apparent tension between two principles of choice: roughly, a principle sensitive to the causal features of the relevant situation, and a principle sensitive only to evidential factors. Two-boxers give priority to causal beliefs, and one-boxers to evidential beliefs. The essay notes that a similar issue can arise when the modality in question is chance, rather than causation. In this case, the conflict is between decision rules based on credences guided solely by chances, and rules based on credences guided by other sorts of probabilistic evidence. Far from excluding cases of the latter kind, Lewis’s Principal Principle explicitly allows for them, in the form of the caveat that credences should follow beliefs about chances only in the absence of “inadmissible evidence.” The essay then exhibits a tension in Lewis’s views on these two matters, by presenting a class of decision problems—some of them themselves Newcomb problems—in which Lewis’s view of the relevance of inadmissible evidence seems in tension with his causal decision theory. It offers a diagnosis for this dilemma and proposes a remedy, based on an extension of a proposal due to Ned Hall and others from the case of chance to that of causation. The remedy suggests a new view of the relation between causal decision theory and evidential decision theory, namely, that they stand to each other much as chance stands to credence, being objective and subjective faces of the same practical coin. This has much the same metaphysical benefits as Lewis’s own view of chance and also throws interesting new light on Newcomb problems, providing an irenic resolution of the apparent disagreement between causal and evidential decision rules.
Pragmatists have traditionally been enemies of representationalism but friends of naturalism, when naturalism is understood to pertain to human subjects, in the sense of Hume and Nietzsche. In this volume Huw Price presents his distinctive version of this traditional combination, as delivered in his René Descartes Lectures at Tilburg University in 2008. Price contrasts his view with other contemporary forms of philosophical naturalism, comparing it with other pragmatist and neo-pragmatist views such as those of Robert Brandom and Simon Blackburn. Linking their different 'expressivist' programmes, Price argues for a radical global expressivism that combines key elements from both. With Paul Horwich and Michael Williams, Brandom and Blackburn respond to Price in new essays. Price replies in the closing essay, emphasising links between his views and those of Wilfrid Sellars. The volume will be of great interest to advanced students of philosophy of language and metaphysics.
There is a long-standing disagreement in the philosophy of probability and Bayesian decision theory about whether an agent can hold a meaningful credence about an upcoming action, while she deliberates about what to do. Can she believe that it is, say, 70% probable that she will do A, while she chooses whether to do A? No, say some philosophers, for Deliberation Crowds Out Prediction (DCOP), but others disagree. In this paper, we propose a valid core for DCOP, and identify terminological causes for some of the apparent disputes.
In defending so-called global expressivism I have often seen Carnap as an ally. Both Carnap’s rejection of “externalist” metaphysics and his implicit pluralism about linguistic frameworks seem grist for the global expressivist’s mill. André Carus argues for a third point of connection, via Carnap’s voluntarism. I note two reasons for thinking that this connection is not as close as Carus contends.
In this paper we defend the view that the ordinary notions of cause and effect have a direct and essential connection with our ability to intervene in the world as agents. This is a well-known but rather unpopular philosophical approach to causation, often called the manipulability theory. In the interests of brevity and accuracy, we prefer to call it the agency theory. Thus the central thesis of an agency account of causation is something like this: an event A is a cause of a distinct event B just in case bringing about the occurrence of A would be an effective means by which a free agent could bring about the occurrence of B. In our view the unpopularity of the agency approach to causation may be traced to two factors. The first is a failure to appreciate certain distinctive advantages that this approach has over its various rivals. We have drawn attention to some of these advantages elsewhere, and we summarize them below. However, the second and more important factor is the influence of a number of stock objections, objections that seem to have persuaded many philosophers that agency accounts face insuperable obstacles. In this paper we want to show that these objections have been vastly overrated. There are four main objections.
Causalists and Evidentialists can agree about the right course of action in an (apparent) Newcomb problem, if the causal facts are not as they initially seem. If declining $1,000 causes the Predictor to have placed $1m in the opaque box, CDT agrees with EDT that one-boxing is rational. This creates a difficulty for Causalists. We explain the problem with reference to Dummett's work on backward causation and Lewis's on chance and crystal balls. We show that the possibility that the causal facts might be properly judged to be non-standard in Newcomb problems leads to a dilemma for Causalism. One horn embraces a subjectivist understanding of causation, in a sense analogous to Lewis's own subjectivist conception of objective chance. In this case the analogy with chance reveals a terminological choice point, such that either (i) CDT is completely reconciled with EDT, or (ii) EDT takes precedence in the cases in which the two theories give different recommendations. The other horn of the dilemma rejects subjectivism, but now the analogy with chance suggests that it is simply mysterious why causation so construed should constrain rational action.
The best case for thinking that quantum mechanics is nonlocal rests on Bell's Theorem, and later results of the same kind. However, the correlations characteristic of Einstein–Podolsky–Rosen (EPR)–Bell (EPRB) experiments also arise in familiar cases elsewhere in quantum mechanics (QM), where the two measurements involved are timelike rather than spacelike separated; and in which the correlations are usually assumed to have a local causal explanation, requiring no action-at-a-distance (AAD). It is interesting to ask how this is possible, in the light of Bell's Theorem. We investigate this question, and present two options. Either (i) the new cases are nonlocal too, in which case AAD is more widespread in QM than has previously been appreciated (and does not depend on entanglement, as usually construed); or (ii) the means of avoiding AAD in the new cases extends in a natural way to EPRB, removing AAD in these cases too. There is a third option, viz., that the new cases are strongly disanalogous to EPRB. But this option requires an argument, so far missing, that the physical world breaks the symmetries which otherwise support the analogy. In the absence of such an argument, the orthodox combination of views—action-at-a-distance in EPRB, but local causality in its timelike analogue—is less well established than it is usually assumed to be. Contents: 1 Introduction; 1.1 Background; 1.2 Outline of the argument; 2 The Experiments; 2.1 Standard EPRB; 2.2 Sideways EPRB; 2.3 Comparing the experiments; 2.4 The need for beables; 3 The Symmetry Considerations; 3.1 The action symmetry; 3.2 Time-symmetry in SEPRB; 4 The Basic Trilemma; 4.1 An intuitive defence of Option III?; 5 Avoiding the Trilemma?; 6 The Classical Objection; 7 Defending Option III; 7.1 The free will argument; 7.2 Independence and consistency; 8 Entanglement and Epistemic Perspective.
One of the most striking features of causation is that causes typically precede their effects – the causal arrow is strongly aligned with the temporal arrow. Why should this be so? We offer an opinionated guide to this problem, and to the solutions currently on offer. We conclude that the most promising strategy is to begin with the de facto asymmetry of human deliberation, characterised in epistemic terms, and to build out from there. More than any rival, this subjectivist approach promises to demystify the asymmetry, temporal orientation, and deliberative relevance of causal judgements.
Is non-cognitivism compatible with minimalism about truth? A contemporary argument claims not, and therefore that moral realists, for example, should take heart from the popularity of semantic minimalism. The same is said to apply to cognitivism about other topics—conditionals, for example—for the argument depends only on the fact that ordinary usage applies the notions of truth and falsity to utterances of the kind in question. Given this much, minimalism about truth is said to leave no room for the view that the utterances concerned are non-cognitive in nature.
A number of writers have been attracted to the idea that some of the peculiarities of quantum theory might be manifestations of 'backward' or 'retro' causality, underlying the quantum description. This idea has been explored in the literature in two main ways: firstly in a variety of explicit models of quantum systems, and secondly at a conceptual level. This note introduces a third approach, intended to complement the other two. It describes a simple toy model, which, under a natural interpretation, shows how retrocausality can emerge from simple global constraints. The model is also useful in permitting a clear distinction between the kind of retrocausality likely to be of interest in QM, and a different kind of reverse causality, with which it is liable to be confused. The model is proposed in the hope that future elaborations might throw light on the potential of retrocausality to account for quantum phenomena.
Probabilistic accounts of causality have long had trouble with ‘spurious’ evidential correlations. Such correlations are also central to the case for causal decision theory—the argument that evidential decision theory is inadequate to cope with certain sorts of decision problem. However, there are now several strong defences of the evidential theory. Here I present what I regard as the best defence, and apply it to the probabilistic approach to causality. I argue that provided a probabilistic theory appeals to the notions of agency and effective strategy, it can avoid the problem of spurious causes. I show that such an appeal has other advantages; and argue that it is not illegitimate, even for a causal realist.
Since the late nineteenth century, physics has been puzzled by the time-asymmetry of thermodynamic phenomena in the light of the apparent T-symmetry of the underlying laws of mechanics. However, a compelling solution to this puzzle has proved elusive. In part, I argue, this can be attributed to a failure to distinguish two conceptions of the problem. According to one, the main focus of our attention is a time-asymmetric lawlike generalisation. According to the other, it is a particular fact about the early universe. This paper aims (i) to distinguish these two different conceptions of the time-asymmetric explanandum in thermodynamics; (ii) to argue in favour of the latter; and (iii) to show that whichever we choose, our rational expectations about the thermodynamic behaviour of the future must depend on what we know about the past: contrary to the common view, statistical arguments alone do not give us good reason to expect that entropy will always continue to increase.
[Abstract and PDF at the Pittsburgh PhilSci Archive] A slightly shorter version of this paper is to appear in a volume edited by Jonathan Barrett, Adrian Kent, David Wallace and Simon Saunders, containing papers presented at the Everett@50 conference in Oxford in July 2007, and the Many Worlds@50 meeting at the Perimeter Institute in September 2007. The paper is based on my talk at the latter meeting.
I distinguish three views, a defence of any one of which would go some way towards vindicating the view that there is something objective about the passage of time: the view that the present moment is objectively distinguished; the view that time has an objective direction – that it is an objective matter which of two non-simultaneous events is the earlier and which the later; the view that there is something objectively dynamic, flux-like, or "flow-like" about time. I argue that each of these views is not so much false as doubtfully coherent. In each case, it turns out to be hard to make sense of what the view could be, at least if it is to be non-trivial, and of use to a friend of objective passage. I conclude with some remarks about avenues that seem worth exploring in the philosophy of time, when we are done with trying to make sense of passage.
It has often been suggested that retrocausality offers a solution to some of the puzzles of quantum mechanics: e.g., that it allows a Lorentz-invariant explanation of Bell correlations, and other manifestations of quantum nonlocality, without action-at-a-distance. Some writers have argued that time-symmetry counts in favour of such a view, in the sense that retrocausality would be a natural consequence of a truly time-symmetric theory of the quantum world. Critics object that there is complete time-symmetry in classical physics, and yet no apparent retrocausality. Why should the quantum world be any different? This note throws some new light on these matters. I call attention to a respect in which quantum mechanics is different, under some assumptions about quantum ontology. Under these assumptions, the combination of time-symmetry without retrocausality is unavailable in quantum mechanics, for reasons intimately connected with the differences between classical and quantum physics (especially the role of discreteness in the latter). Not all interpretations of quantum mechanics share these assumptions, however, and in those that do not, time-symmetry does not entail retrocausality.
William James said that sometimes detailed philosophical argument is irrelevant. Once a current of thought is really under way, trying to oppose it with argument is like planting a stick in a river to try to alter its course: “round your obstacle flows the water and ‘gets there just the same’”. He thought pragmatism was such a river. There is a contemporary river that sometimes calls itself pragmatism, although other titles are probably better. At any rate it is the denial of differences, the celebration of the seamless web of language, the soothing away of distinctions, whether of primary versus secondary, fact versus value, description versus expression, or of any other significant kind. What is left is a smooth, undifferentiated view of language, sometimes a nuanced kind of anthropomorphism or “internal” realism, sometimes the view that no view is possible: minimalism, deflationism, quietism. Wittgenstein is often admired as a high priest of the movement. Planting a stick in this water is probably futile, but having done it before I shall do it again, and—who knows?—enough sticks may make a dam, and the waters of error may subside. (Blackburn, 1998a, 157).
In a famous paper in Noûs in 1979, John Perry points out that action depends on indexical beliefs. In addition to "third-person" information about her environment, an agent needs "first-person" information about where, when and who she is. This conclusion is widely interpreted as a reason for thinking that tensed claims cannot be translated without loss into untensed language; but not as a reason for realism about tensed facts. In another famous paper in the same volume of Noûs, Nancy Cartwright argues that action requires that agents represent their world in causal terms, rather than merely probabilistic terms: for, Cartwright argues, there's a distinction between effective and ineffective strategies that otherwise goes missing. This is widely taken as a reason for thinking that causal claims cannot be translated without loss into merely probabilistic claims; and also – in contrast to Perry's case – widely regarded as a reason for realism about causation. In this paper I ask whether the latter conclusion is compulsory, or whether, as in Perry's case, the need for causal beliefs might merely reflect some "situated" aspect of a decision-maker's perspective.
Proponents of causal decision theories argue that classical Bayesian decision theory (BDT) gives the wrong advice in certain types of cases, of which the clearest and commonest are the medical Newcomb problems. I defend BDT, invoking a familiar principle of statistical inference to show that in such cases a free agent cannot take the contemplated action to be probabilistically relevant to its causes (so that BDT gives the right answer). I argue that my defence does better than those of Ellery Eells and Richard Jeffrey; and that it applies, where necessary, to other types of Newcomb problem.
The best-known argument for Evidential Decision Theory (EDT) is the ‘Why ain’cha rich?’ challenge to rival Causal Decision Theory (CDT). The basis for this challenge is that in Newcomb-like situations, acts that conform to EDT may be known in advance to have a better return than acts that conform to CDT. Frank Arntzenius has recently proposed an ingenious counterargument, based on an example in which, he claims, it is predictable in advance that acts that conform to EDT will do less well than acts that conform to CDT. We raise two objections to Arntzenius’s example. We argue, first, that the example is subtly incoherent, in a way that undermines its effectiveness against EDT; and, second, that the example relies on calculating the average return over an inappropriate population of acts.
Concepts employed in folk descriptions of the world often turn out to be more perspectival than they seem at first sight, involving previously unrecognised sensitivity to the viewpoint or 'situation' of the user of the concept in question. Often, it is progress in science that reveals such perspectivity, and the deciding factor is that we realise that other creatures would apply the same concepts with different extension, in virtue of differences between their circumstances and ours. In this paper I argue that causal concepts are perspectival in this way, and describe the 'situation' on which they depend in terms of an abstract characterisation of the viewpoint of a deliberating agent. I argue that this approach makes better sense than rivals of the apparent asymmetry and temporal orientation of the causal relation.
In many physical systems, coupling forces provide a way of carrying the energy stored in adjacent harmonic oscillators from place to place, in the form of waves. The wave equations governing such phenomena are time-symmetric: they permit the opposite processes, in which energy arrives at a point in the form of incoming concentric waves, to be lost to some external system. But these processes seem rare in nature. What explains this temporal asymmetry, and how is it related to the thermodynamic asymmetry? This paper attempts to clarify these old issues, in the light of recent contributions. After brief introductory remarks (§1), the paper is in three main parts. §2 examines the so-called ‘Sommerfeld Radiation Condition’, arguing that its link to the observed asymmetry is much less direct than commonly supposed. §3 begins with Zeh's proposal to make the Sommerfeld condition an ingredient in an explanation of the observed asymmetry, and makes explicit a useful distinction between two ways in which the thermodynamic asymmetry might connect to the radiation asymmetry. §4 reviews a proposal I have defended in earlier work about the relation of the radiative asymmetry to that of thermodynamics, and defends it against recent objections by Zeh and Frisch. I also distinguish it from a recent proposal due to North. I agree with North that the observed asymmetry of radiation stems from the low entropy history, but argue that she mis-characterises the asymmetry, and hence misses a crucial element in a proper account of the role of the low entropy past.
For more than a century, physics has known of a puzzling conflict between the T-asymmetry of thermodynamic phenomena and the T-symmetry of the underlying microphysics on which these phenomena depend. This paper provides a guide to the current status of this puzzle, distinguishing the central issue from various issues with which it may be confused. It is shown that there are two competing conceptions of what is needed to resolve the puzzle of the thermodynamic asymmetry, which differ with respect to the number of distinct T-asymmetries they take to be manifest in the physical world. On the preferable one-asymmetry conception, the remaining puzzle concerns the ordered distribution of matter in the early universe. The puzzle of the thermodynamic arrow thus becomes a puzzle for cosmology.
holds for all central declarative sentences. According to deflationists, the key to an understanding of truth lies in an appreciation of the grammatical advantages of a predicate satisfying DS. As Paul Horwich puts it, “our truth predicate is merely a logical device enabling simple formulations of certain sorts of generalization.” (1996, p. 878; see also Horwich 1990).
The so-called Canberra Plan is a grandchild of the Ramsey-Carnap treatment of theoretical terms. In its original form, the Ramsey-Carnap approach provided a method for analysing the meaning of scientific terms, such as “electron”, “gene” and “quark”—terms whose meanings could plausibly be delineated by their roles within scientific theories. But in the hands of David Lewis (1970, 1972), the original approach begat a more ambitious descendant, generalised and extended in two distinct ways: first, Lewis applied the technique to analyse the meaning of terms introduced not just by explicit scientific theories, but also by implicit folk theories such as folk psychology; second, he supplemented the theory to provide an account of the way in which the referents of the analysed terms might be identified on the basis of empirical investigation.
Scientific naturalism is a metaphysical doctrine, a view about what there is, or what we ought to believe that there is. It maintains that natural science should be our guide in matters metaphysical: the ontology we should accept is the ontology that turns out to be required by science. Quine is often regarded as the doyen of scientific naturalists, though the supporting cast includes such giants as David Lewis and J. J. C. Smart.
It is often objected that the Everett interpretation of QM cannot make sense of quantum probabilities, in one or both of two ways: either it can’t make sense of probability at all, or it can’t explain why probability should be governed by the Born rule. David Deutsch has attempted to meet these objections. He argues not only that rational decision under uncertainty makes sense in the Everett interpretation, but also that under reasonable assumptions, the credences of a rational agent in an Everett world should be constrained by the Born rule. David Wallace has developed and defended Deutsch’s proposal, and greatly clarified its conceptual basis. In particular, he has stressed its reliance on the distinguishing symmetry of the Everett view, viz., that all possible outcomes of a quantum measurement are treated as equally real. The argument thus tries to make a virtue of what has usually been seen as the main obstacle to making sense of probability in the Everett world. In this note I outline some objections to the Deutsch-Wallace argument, and to related proposals by Hilary Greaves about the epistemology of Everettian QM. (In the latter case, my arguments include an appeal to an Everettian analogue of the Sleeping Beauty problem.) The common thread to these objections is that the symmetry in question remains a very significant obstacle to making sense of probability in the Everett interpretation.
Physics takes for granted that interacting physical systems with no common history are independent, before their interaction. This principle is time-asymmetric, for no such restriction applies to systems with no common future, after an interaction. The time-asymmetry is normally attributed to boundary conditions. I argue that there are two distinct independence principles of this kind at work in contemporary physics, one of which cannot be attributed to boundary conditions, and therefore conflicts with the assumed T (or CPT) symmetry of microphysics. I note that this may have interesting ramifications in quantum mechanics.
Wittgenstein is often thought to have challenged the view that assertion is an important theoretical category in a philosophical view of language. One of Wittgenstein’s main themes in the early sections of the Investigations is that philosophy misses important distinctions about the uses of language, distinctions hidden from us by ‘the uniform appearances of words.’ (1968, #11) As Wittgenstein goes on to say: It is like looking into the cabin of a locomotive. We see handles all looking more or less alike. (Naturally, since they are all supposed to be handled.) But one is the handle of a crank which can be moved continuously (it regulates the opening of a valve); another is the handle of a switch, which has only two effective positions, it is either off or on; a third is the handle of a brake-lever, the harder one pulls on it, the harder it brakes; a fourth, the handle of a pump: it has an effect only so long as it is moved to and fro. (1968, #12) Few contemporary philosophers share Wittgenstein’s evident familiarity with the cabin of a steam locomotive, and in general, most of us are increasingly remote from all but the most superficial understanding of the underlying functions of the tools on which we rely. So we are perhaps even more prone to the mistake that Wittgenstein thinks that philosophy makes with respect to language, that of regarding it as one tool rather than many: ‘Think of the tools in a tool-box: there is a hammer, pliers, a saw, a screw-driver, a rule, a glue-pot, glue, nails and screws.—The functions of words are as diverse as the functions of these objects.’ (1968, #11).