An Aristotelian Philosophy of Mathematics breaks the impasse between Platonist and nominalist views of mathematics. Neither a study of abstract objects nor a mere language or logic, mathematics is a science of real aspects of the world as much as biology is. For the first time, a philosophy of mathematics puts applied mathematics at the centre. Quantitative aspects of the world such as ratios of heights, and structural ones such as symmetry and continuity, are parts of the physical world and are objects of mathematics. Though some mathematical structures such as infinities may be too big to be realized in fact, all of them are capable of being realized. Informed by the author's background in both philosophy and mathematics, but keeping to simple examples, the book shows how infant perception of patterns is extended by visualization and proof to the vast edifice of modern pure and applied mathematical knowledge.
In many diagrams one seems to perceive necessity – one sees not only that something is so, but that it must be so. That conflicts with a certain empiricism largely taken for granted in contemporary philosophy, which believes perception is not capable of such feats. The reason for this belief is often thought well-summarized in Hume's maxim: ‘there are no necessary connections between distinct existences’. It is also thought that even if there were such necessities, perception is too passive or localized a faculty to register them. We defend the perception of necessity against such Humeanism, drawing on examples from mathematics.
A polemical account of Australian philosophy up to 2003, emphasising its unique aspects (such as commitment to realism) and the connections between philosophers' views and their lives. Topics include early idealism, the dominance of John Anderson in Sydney, the Orr case, Catholic scholasticism, Melbourne Wittgensteinianism, philosophy of science, the Sydney disturbances of the 1970s, Francofeminism, environmental philosophy, the philosophy of law and Mabo, ethics and Peter Singer. Realist theories especially praised are David Armstrong's on universals, David Stove's on logical probability and the ethical realism of Rai Gaita and Catholic philosophers. In addition to strict philosophy, the book treats non-religious moral traditions to train virtue, such as Freemasonry, civics education and the Greek and Roman classics.
Throughout history, almost all mathematicians, physicists and philosophers have been of the opinion that space and time are infinitely divisible. That is, it is usually believed that space and time do not consist of atoms, but that any piece of space and time of non-zero size, however small, can itself be divided into still smaller parts. This assumption is included in geometry, as in Euclid, and also in the Euclidean and non-Euclidean geometries used in modern physics. Of the few who have denied that space and time are infinitely divisible, the most notable are the ancient atomists, and Berkeley and Hume. All of these assert not only that space and time might be atomic, but that they must be. Infinite divisibility is, they say, impossible on purely conceptual grounds.
A problem for Aristotelian realist accounts of universals (neither Platonist nor nominalist) is the status of those universals that happen not to be realised in the physical (or any other) world. They perhaps include uninstantiated shades of blue and huge infinite cardinals. Should they be altogether excluded (as in D.M. Armstrong's theory of universals) or accorded some sort of reality? Surely truths about ratios are true even of ratios that are too big to be instantiated - what is the truthmaker of such truths? It is argued that Aristotelianism can answer the question, but only a semi-Platonist form of it.
How were reliable predictions made before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? The book examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. Also included are the problem of induction before Hume, design arguments for the existence of God, and theories on how to evaluate scientific and historical hypotheses. It is explained how Pascal and Fermat's work on chance arose out of legal thought on aleatory contracts. The book interprets pre-Pascalian unquantified probability in a generally objective Bayesian or logical probabilist sense.
Mathematicians often speak of conjectures as being confirmed by evidence that falls short of proof. For their own conjectures, evidence justifies further work in looking for a proof. Those conjectures of mathematics that have long resisted proof, such as Fermat's Last Theorem and the Riemann Hypothesis, have had to be considered in terms of the evidence for and against them. It is argued here that it is not adequate to describe the relation of evidence to hypothesis as `subjective', `heuristic' or `pragmatic', but that there must be an element of what it is rational to believe on the evidence, that is, of non-deductive logic.
Just before the Scientific Revolution, there was a "Mathematical Revolution", heavily based on geometrical and machine diagrams. The "faculty of imagination" (now called scientific visualization) was developed to allow 3D understanding of planetary motion, human anatomy and the workings of machines. 1543 saw the publication of the heavily geometrical work of Copernicus and Vesalius, as well as the first Italian translation of Euclid.
Modern philosophy of mathematics has been dominated by Platonism and nominalism, to the neglect of the Aristotelian realist option. Aristotelianism holds that mathematics studies certain real properties of the world – mathematics is neither about a disembodied world of “abstract objects”, as Platonism holds, nor is it merely a language of science, as nominalism holds. Aristotle’s theory that mathematics is the “science of quantity” is a good account of at least elementary mathematics: the ratio of two heights, for example, is a perceivable and measurable real relation between properties of physical things, a relation that can be shared by the ratio of two weights or two time intervals. Ratios are an example of continuous quantity; discrete quantities, such as whole numbers, are also realised as relations between a heap and a unit-making universal. For example, the relation between foliage and being-a-leaf is the number of leaves on a tree, a relation that may equal the relation between a heap of shoes and being-a-shoe. Modern higher mathematics, however, deals with some real properties that are not naturally seen as quantity, so that the “science of quantity” theory of mathematics needs supplementation. Symmetry, topology and similar structural properties are studied by mathematics, but are about pattern, structure or arrangement rather than quantity.
In What Science Knows, the Australian philosopher and mathematician James Franklin explains in captivating and straightforward prose how science works its magic. It offers a semipopular introduction to an objective Bayesian/logical probabilist account of scientific reasoning, arguing that inductive reasoning is logically justified (though actually existing science sometimes falls short). Its account of mathematics is Aristotelian realist.
In 1947 Donald Cary Williams claimed in The Ground of Induction to have solved the Humean problem of induction, by means of an adaptation of reasoning first advanced by Bernoulli in 1713. Later on David Stove defended and improved upon Williams’ argument in The Rationality of Induction (1986). We call this proposed solution of induction the ‘Williams-Stove sampling thesis’. There has been no lack of objections raised to the sampling thesis, and it has not been widely accepted. In our opinion, though, none of these objections has the slightest force, and, moreover, the sampling thesis is undoubtedly true. What we will argue in this paper is that one particular objection that has been raised on numerous occasions is misguided. This concerns the randomness of the sample on which the inductive extrapolation is based.
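The statistical core of the sampling thesis can be illustrated by a minimal simulation (the numbers here are hypothetical, not from the paper): in a large population with a fixed proportion of some trait, the overwhelming majority of large samples have nearly the same proportion as the population, which is what licenses the inductive step from observed sample to population.

```python
import random

random.seed(0)
# Hypothetical population of 10,000 in which 60% have the trait.
population = [1] * 6000 + [0] * 4000

def sample_matches(pop, n=1000, tol=0.05):
    """Is a random sample of size n within tol of the population proportion?"""
    s = random.sample(pop, n)
    return abs(sum(s) / n - 0.6) <= tol

# Count how many random samples are representative.
trials = 2000
close = sum(sample_matches(population) for _ in range(trials))
print(close / trials)  # nearly all samples match the population, close to 1.0
```

The point, in Bernoulli's spirit, is that no special assumption about the sampling mechanism is needed: representative samples simply vastly outnumber unrepresentative ones.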
The distinction between the discrete and the continuous lies at the heart of mathematics. Discrete mathematics (arithmetic, algebra, combinatorics, graph theory, cryptography, logic) has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology). The interaction between the two – for example in computer models of continuous systems such as fluid flow – is a central issue in the applicable mathematics of the last hundred years. This article explains the distinction and why it has proved to be one of the great organizing themes of mathematics.
The formal sciences - mathematical as opposed to natural sciences, such as operations research, statistics, theoretical computer science, systems engineering - appear to have achieved mathematically provable knowledge directly about the real world. It is argued that this appearance is correct.
The global/local contrast is ubiquitous in mathematics. This paper explains it with straightforward examples. It is possible to build a circular staircase that is rising at any point (locally) but impossible to build one that rises at all points and comes back to where it started (a global restriction). Differential equations describe the local structure of a process; their solution describes the global structure that results. The interplay between global and local structure is one of the great themes of mathematics, but rarely discussed explicitly.
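The staircase example can be put in one line of calculus (a standard observation, supplied here for illustration rather than taken from the paper). Write the height of the stair as a function z(θ) of the angle around the circle; then

```latex
\[
  z(2\pi) - z(0) \;=\; \int_0^{2\pi} z'(\theta)\, d\theta \;>\; 0
  \quad \text{whenever } z'(\theta) > 0 \text{ for all } \theta,
\]
% so the local condition (rising everywhere) is incompatible with the
% global condition of returning to the starting height, z(2\pi) = z(0).
```

The local facts (the sign of z′ at each point) jointly force a global fact (the net rise over the whole circuit).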
Dispositions, such as solubility, cannot be reduced to categorical properties, such as molecular structure, without some element of dispositionality remaining. Democritus did not reduce all properties to the geometry of atoms - he had to retain the rigidity of the atoms, that is, their disposition not to change shape when a force is applied. So dispositions-not-to, like rigidity, cannot be eliminated. Neither can dispositions-to, like solubility.
Aristotelian, or non-Platonist, realism holds that mathematics is a science of the real world, just as much as biology or sociology are. Where biology studies living things and sociology studies human social relations, mathematics studies the quantitative or structural aspects of things, such as ratios, or patterns, or complexity, or numerosity, or symmetry. Let us start with an example, as Aristotelians always prefer, an example that introduces the essential themes of the Aristotelian view of mathematics. A typical mathematical truth is that there are six different pairs in four objects (Figure 1: there are 6 different pairs in 4 objects). The objects may be of any kind, physical, mental or abstract. The mathematical statement does not refer to any properties of the objects, but only to patterning of the parts in the complex of the four objects. If that seems to us less a solid truth about the real world than the causation of flu by viruses, that may be simply due to our blindness about relations, or tendency to regard them as somehow less real than things and properties. But relations (for example, relations of equality between parts of a structure) are as real as colours or causes.
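The pairs-in-four truth can be checked mechanically; a minimal sketch (the four placeholder objects are arbitrary, since the count depends only on the arrangement, not on what the objects are):

```python
from itertools import combinations

# Any four objects will do: physical, mental or abstract.
objects = ["a", "b", "c", "d"]

# Enumerate every unordered pair that can be formed from them.
pairs = list(combinations(objects, 2))
print(len(pairs))  # 6
```

That the answer is the same for any four objects whatever is exactly the structural point of the example.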
The winning entry in David Stove's Competition to Find the Worst Argument in the World was: “We can know things only as they are related to us/insofar as they fall under our conceptual schemes, etc., so, we cannot know things as they are in themselves.” That argument underpins many recent relativisms, including postmodernism, post-Kuhnian sociological philosophy of science, cultural relativism, sociobiological versions of ethical relativism, and so on. All such arguments have the same form as ‘We have eyes, therefore we cannot see’, and are equally invalid.
Einstein, like most philosophers, thought that there cannot be mathematical truths which are both necessary and about reality. The article argues against this, starting with prima facie examples such as "It is impossible to tile my bathroom floor with regular pentagonal tiles." Replies are given to objections based on the supposedly purely logical or hypothetical nature of mathematics.
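The pentagonal-tile example rests on simple angle arithmetic (the standard edge-to-edge argument, spelled out here for illustration): whole tiles meeting at a point must have interior angles summing to exactly 360 degrees.

```python
# Interior angle of a regular n-gon is (n - 2) * 180 / n degrees.
def interior_angle(n):
    return (n - 2) * 180 / n

# A whole number of identical tiles fits around a vertex only if
# the interior angle divides 360 exactly.
for n in (3, 4, 5, 6):
    a = interior_angle(n)
    print(n, a, (360 % a) == 0)
# Triangles (60), squares (90) and hexagons (120) fit; the regular
# pentagon's 108 does not divide 360, so the tiling is impossible.
```

This is a necessary truth about any physical floor, which is the article's point against the view that mathematics cannot be both necessary and about reality.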
Explains Aristotle's views on the possibility of continuous variation between biological species. While the Porphyrean/Linnean classification of species by a tree suggests species are distributed discretely, Aristotle admitted continuous variation between species among lower life forms.
Defends the cosmological argument for the existence of God against Hume's criticisms. Hume objects that since a cause is before its effect, an eternal succession has no cause; but that would rule out by fiat the possibility of God's creating the world from eternity. Hume argues that once a cause is given for each of a collection of objects, there is no need to posit a cause of the whole collection; but that is to assume the universe to be a heap of things arbitrarily grouped rather than a whole arbitrarily divided.
According to Quine’s indispensability argument, we ought to believe in just those mathematical entities that we quantify over in our best scientific theories. Quine’s criterion of ontological commitment is part of the standard indispensability argument. However, we suggest that a new indispensability argument can be run using Armstrong’s criterion of ontological commitment rather than Quine’s. According to Armstrong’s criterion, ‘to be is to be a truthmaker (or part of one)’. We supplement this criterion with our own brand of metaphysics, 'Aristotelian realism', in order to identify the truthmakers of mathematics. We consider in particular as a case study the indispensability to physics of real analysis (the theory of the real numbers). We conclude that it is possible to run an indispensability argument without Quinean baggage.
The imperviousness of mathematical truth to anti-objectivist attacks has always heartened those who defend objectivism in other areas, such as ethics. It is argued that the parallel between mathematics and ethics is close and does support objectivist theories of ethics. The parallel depends on the foundational role of equality in both disciplines. Despite obvious differences in their subject matter, mathematics and ethics share a status as pure forms of knowledge, distinct from empirical sciences. A pure understanding of principles is possible because of the simplicity of the notion of equality, despite the different origins of our understanding of equality of objects in general and of the equality of the ethical worth of persons.
Pascal’s wager and Leibniz’s theory that this is the best of all possible worlds are latecomers in the Faith-and-Reason tradition. They have remained interlopers; they have never been taken as seriously as the older arguments for the existence of God and other themes related to faith and reason.
The classical arguments for scepticism about the external world are defended, especially the symmetry argument: that there is no reason to prefer the realist hypothesis to, say, the deceitful demon hypothesis. This argument is defended against the various standard objections, such as that the demon hypothesis is only a bare possibility, does not lead to pragmatic success, lacks coherence or simplicity, is ad hoc or parasitic, makes impossible demands for certainty, or contravenes some basic standards for a conceptual or linguistic scheme. Since the conclusion of the sceptical argument is not true, it is concluded that one can only escape the force of the argument through some large premise, such as an aptitude of the intellect for truth, if necessary divinely supported.
The late scholastics, from the fourteenth to the seventeenth centuries, contributed to many fields of knowledge other than philosophy. They developed a method of conceptual analysis that was very productive in those disciplines in which theory is relatively more important than empirical results. That includes mathematics, where the scholastics developed the analysis of continuous motion, which fed into the calculus, and the theory of risk and probability. The method came to the fore especially in the social sciences. In legal theory they developed, for example, the ethical analyses of the conditions of validity of contracts, and natural rights theory. In political theory, they introduced constitutionalism and the thought experiment of a “state of nature”. Their contributions to economics included concepts still regarded as basic, such as demand, capital, labour, and scarcity. Faculty psychology and semiotics are other areas of significance. In such disciplines, later developments rely crucially on scholastic concepts and vocabulary.
• It would be a moral disgrace for God (if he existed) to allow the many evils in the world, in the same way it would be for a parent to allow a nursery to be infested with criminals who abused the children.
• There is a contradiction in asserting all three of the propositions: God is perfectly good; God is perfectly powerful; evil exists (since if God wanted to remove the evils and could, he would).
• The religious believer has no hope of getting away with excuses that evil is not as bad as it seems, or that it is all a result of free will, and so on.
Piper avoids mentioning the best solution so far put forward to the problem of evil. It is Leibniz’s theory that God does not create a better world because there isn’t one — that is, that (contrary to appearances) if one part of the world were improved, the ramifications would result in it being worse elsewhere, and worse overall. It is a “bump in the carpet” theory: push evil down here, and it pops up over there. Leibniz put it by saying this is the “Best of All Possible Worlds”. That phrase was a public relations disaster for his theory, suggesting as it does that everything is perfectly fine as it is. He does not mean that, but only that designing worlds is a lot harder than it looks, and determining the amount of evil in the best one is no easy matter. Though humour is hardly appropriate to the subject matter, the point of Leibniz’s idea is contained in the old joke, “An optimist is someone who thinks this is the best of all possible worlds, and a pessimist thinks…
The abstract Latinate vocabulary of modern English, in which philosophy and science are done, is inherited from medieval scholastic Latin. Words like "nature", "art", "abstract", "probable", "contingent", are not native to English but entered it from scholastic translations around the 15th century. The vocabulary retains much though not all of its medieval meanings.
When a company raises its share price by sacking workers or polluting the environment, it is avoiding paying real costs. Accountancy, which quantifies certain rights, needs to combine with applied ethics to create a "computational casuistics" or "moral accountancy", which quantifies the rights and obligations of individuals and companies. Such quantification has proved successful already in environmental accounting, in health care allocation and in evaluating compensation payments. It is argued that many rights are measurable with sufficient accuracy to make them credible and legally actionable.
Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.
"Does torture work?" is a factual rather than ethical or legal question. But legal and ethical discussions of torture should be informed by knowledge of the answer to the factual question of the reliability of torture as an interrogation technique. The question as to whether torture works should be asked before that of its legal admissibility—if it is not useful to interrogators, there is no point considering its legality in court.
Stanford Encyclopedia article surveying the life and work of D.C. Williams, notably in defending realism in metaphysics in the mid-twentieth century and in justifying induction by the logic of statistical inference.
Replies to Kevin de Laplante’s ‘Certainty and Domain-Independence in the Sciences of Complexity’ (de Laplante, 1999), defending the thesis of J. Franklin, ‘The formal sciences discover the philosophers’ stone’, Studies in History and Philosophy of Science, 25 (1994), 513-33, that the sciences of complexity can combine certain knowledge with direct applicability to reality.
Both the traditional Aristotelian and modern symbolic approaches to logic have seen logic in terms of discrete symbol processing. Yet there are several kinds of argument whose validity depends on some topological notion of continuous variation, which is not well captured by discrete symbols. Examples include extrapolation and slippery slope arguments, sorites, fuzzy logic, and those involving closeness of possible worlds. It is argued that the natural first attempts to analyze these notions and explain their relation to reasoning fail, so that ignorance of their nature is profound.
The objective Bayesian view of proof (or logical probability, or evidential support) is explained and defended: that the relation of evidence to hypothesis (in legal trials, science etc) is a strictly logical one, comparable to deductive logic. This view is distinguished from the thesis, which had some popularity in law in the 1980s, that legal evidence ought to be evaluated using numerical probabilities and formulas. While numbers are not always useful, a central role is played in uncertain reasoning by the ‘proportional syllogism’, or argument from frequencies, such as ‘nearly all aeroplane flights arrive safely, so my flight is very likely to arrive safely’. Such arguments raise the ‘problem of the reference class’, arising from the fact that an individual case may be a member of many different classes in which frequencies differ. For example, if 15 per cent of swans are black and 60 per cent of fauna in the zoo is black, what should I think about the likelihood of a swan in the zoo being black? The nature of the problem is explained, and legal cases where it arises are given. It is explained how recent work in data mining on the relevance of features for prediction provides a solution to the reference class problem.
If Tahiti suggested to theorists comfortably at home in Europe thoughts of noble savages without clothes, those who paid for and went on voyages there were in pursuit of a quite opposite human ideal. Cook's voyage to observe the transit of Venus in 1769 symbolises the eighteenth century's commitment to numbers and accuracy, and its willingness to spend a lot of public money on acquiring them. The state supported the organisation of quantitative researches, employing surveyors and collecting statistics to..
Probabilistic inference from frequencies, such as "Most Quakers are pacifists; Nixon is a Quaker, so probably Nixon is a pacifist", suffers from the problem that an individual is typically a member of many "reference classes" (such as Quakers, Republicans, Californians, etc.) in which the frequency of the target attribute varies. How to choose the best class or combine the information? The article argues that the problem can be solved by the feature selection methods used in contemporary Big Data science: the correct reference class is that determined by the features relevant to the target, and relevance is measured by correlation (that is, a feature is relevant if it makes a difference to the frequency of the target).
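The proposal can be sketched in a few lines of code. The data below are invented for illustration, and relevance is simplified to "conditioning on the feature changes the target frequency" rather than a full correlation measure; the point is only the shape of the method, which is as the article proposes: keep the relevant features and read the frequency off the class they jointly determine.

```python
# Hypothetical data: each row is (is_quaker, is_republican, is_pacifist).
people = [
    (1, 0, 1), (1, 0, 1), (1, 0, 1), (1, 1, 0),
    (0, 1, 0), (0, 1, 0), (0, 1, 1), (0, 0, 1),
]

def freq(rows):
    """Frequency of the target attribute (pacifism) in a set of rows."""
    return sum(r[2] for r in rows) / len(rows)

def relevant(i, rows):
    """A feature is relevant if conditioning on it changes the target frequency."""
    have = [r for r in rows if r[i] == 1]
    lack = [r for r in rows if r[i] == 0]
    return bool(have and lack) and freq(have) != freq(lack)

# The reference class for an individual is fixed by all relevant features.
nixon = (1, 1)  # Quaker and Republican
features = [i for i in (0, 1) if relevant(i, people)]
ref_class = [r for r in people if all(r[i] == nixon[i] for i in features)]
print(freq(ref_class))
```

In this toy table both features are relevant, so the right reference class is Quaker Republicans, not Quakers or Republicans taken separately.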
Some courses achieve existence, some have existence thrust upon them. It is normally a struggle to create a course on the ethical or social aspects of science or mathematics. This is the story of one that was forced to exist by an unusual confluence of outside circumstances. In the mid 1990s, the University of New South Wales instituted a policy that all its … “to create Professional Issues and Ethics in Mathematics; but if you don’t do it, we will be.” I accepted. The gift of a greenfield site and a bulldozer is a happy occasion, undoubtedly. But what to do next? It seemed to me I should ensure the course satisfied these requirements: it should look good to students, to staff …
Powerful, technically complex international compliance regimes have developed recently in certain professions that deal with risk: banking (the Basel II regime), accountancy (IFRS) and the actuarial profession. The need to deal with major risks has acted as a strong driver of international co-operation to create enforceable international semilegal systems, as happened earlier in such fields as international health regulations. This regulation in technical fields contrasts with the failure of an international general-purpose political and legal regime to develop. We survey the new global regulatory systems in the actuarial, banking and accounting fields, with a view to showing how the need to deal reasonably with risk has resulted in an international de facto law solidly based on correct abstract principles of probability.
Stove's article, 'So you think you are a Darwinian?'[1] was essentially an advertisement for his book, Darwinian Fairytales.[2] The central argument of the book is that Darwin's theory, in both Darwin's and recent sociobiological versions, asserts many things about the human and other species that are known to be false, but protects itself from refutation by its logical complexity. A great number of ad hoc devices, he claims, are used to protect the theory. If co-operation is observed where the theory predicts competition, then competition is referred to the time of the cavemen, or is reinterpreted as competition between some hidden entities like genes or abstract entities like populations. In a characteristic sally, Stove writes of the sociobiologists' oscillation on the meaning of kin altruism: Any discussion of altruism with an inclusive fitness theorist is, in fact, exactly like dealing with a pair of balloons connected by a tube, one balloon being the belief that kin altruism is an illusion, the other being the belief that kin altruism is caused by shared genes. If a critic puts pressure on the illusion balloon - perhaps by ridiculing the selfish theory of human nature - air is forced into the causal balloon. There is then an increased production of earnest causal explanations of why we love our children, why hymenopteran workers look after their sisters, etc., etc. Then, if the critic puts pressure on the causal balloon - perhaps about the weakness of sibling altruism compared with parental, or the absence of sibling altruism in bacteria - then the illusion balloon is forced to expand. There will now be an increased production of cynical scurrilities about parents manipulating their babies for their own advantage, and vice versa, and in general, about the Hobbesian bad times that are had by all.
In this way critical pressure, applied to the theory of inclusive fitness at one point, can always be easily absorbed at another point, and the theory as a whole is never endangered.[3] Now, it is uncontroversial to assert that Darwinism is a logically complex theory, and that its relation to empirical evidence is distant and multi-faceted. One does not directly observe chance genetic variations leading to the development of new species, or even continuous variations in the fossil record, but must rely on subtle arguments to the best explanation, scaling up from varieties to species, and so on…
The philosophy of mathematics has largely abandoned foundational studies, but is still fixated on theorem proving, logic and number theory, and on whether mathematical knowledge is certain. That is not what mathematics looks like to, say, a knot theorist or an industrial mathematical modeller. The "computer revolution" shows that mathematics is a much more direct study of the world, especially its structural aspects.
Brains, unlike artificial neural nets, use symbols to summarise and reason about perceptual input. But unlike symbolic AI, they “ground” the symbols in the data: the symbols have meaning in terms of data, not just meaning imposed by the outside user. If neural nets could be made to grow their own symbols in the way that brains do, there would be a good prospect of combining neural networks and symbolic AI, in such a way as to combine the good features of each. The article argues that cluster analysis provides algorithms to perform the task, and that any solution to the task must be a form of cluster analysis.
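A toy version of the proposal: run a minimal k-means cluster analysis on made-up one-dimensional "perceptual" data that falls into two natural clumps. The learned cluster labels play the role of grounded symbols, since their meaning is fixed by the data rather than imposed from outside. (The data and the bare-bones algorithm are illustrative assumptions, not taken from the article.)

```python
import random

random.seed(1)

# Hypothetical 1-D "perceptual" data falling into two natural clumps.
data = [random.gauss(0.0, 0.5) for _ in range(50)] + \
       [random.gauss(5.0, 0.5) for _ in range(50)]

def kmeans(points, k=2, iters=20):
    """Minimal k-means: each learned cluster centre 'grounds' one symbol."""
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centre.
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        # Move each centre to the mean of its group (keep it if the group is empty).
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

centers = sorted(kmeans(data))
print(centers)  # one centre near each clump: each grounds one symbol
```

Labelling a new input with its nearest centre is then a discrete, symbol-like judgement whose content comes entirely from the statistics of the data.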
Defends the arguments for the irreducibility of dispositions to categorical properties in "Are dispositions reducible to categorical properties?" (Philosophical Quarterly 36, 1986) against the criticisms of D.M. Armstrong (Philosophical Quarterly 38, 1988).
The late twentieth century saw two long-term trends in popular thinking about ethics. One was an increase in relativist opinions, with the “generation of the Sixties” spearheading a general libertarianism, an insistence on toleration of diverse moral views (for “Who is to say what is right? – it’s only your opinion.”) The other trend was an increasing insistence on rights – the gross violations of rights in the killing fields of the mid-century prompted immense efforts in defence of the “inalienable” rights of the victims of dictators, of oppressed peoples, of refugees. The obvious incompatibility of those ethical stances, one anti-objectivist, the other objectivist in the extreme, proved no obstacle to their both being held passionately, often by the same people.
Though there is no international government, there are many international regimes that enact binding regulations on particular matters. They include the Basel II regime in banking, IFRS in accountancy, the FIRST computer incident response system, the WHO’s system for containing global epidemics and many others. They form in effect a very powerful international public sector based on technical expertise. Unlike the public services of nation states, they are almost free of accountability to any democratically elected body or to any legal system. Although by and large they have acted for good, the dangers of long-term unaccountability are illustrated by the travesties of justice perpetrated by the International Labour Organisation Administrative Tribunal.
In response to Eric Aarons' `Why Communism failed' (Dissent no. 4, 2001) it is argued that the present "capitalist" system is in fact so regulated as to be a hybrid of capitalist and socialist principles. It has some success in putting economic power into the hands of most people, though it needs restraint to cope with market failures.
…methods that have shown promise for improving extreme risk analysis, particularly for assessing the risks of invasive pests and pathogens associated with international trade. We describe the legally inspired regulatory regime for banks, where these methods have been brought to bear on extreme ‘operational risks’. We argue that an ‘advocacy model’ similar to that used in the Basel II compliance regime for bank operational risks and to a lesser extent in biosecurity import risk analyses is ideal for permitting the diversity of relevant evidence about invasive species to be presented and soundly evaluated. We recommend that the process be enhanced in ways that enable invasion ecology to make more explicit use of the methods found successful…