Most scientific models are not physical objects, and this raises important questions. What sort of entities are models, what is truth in a model, and how do we learn about models? In this paper I argue that models share important aspects with literary fiction, and that theories of fiction can therefore be brought to bear on these questions. In particular, I argue that the pretence theory as developed by Walton has the resources to answer these questions. I introduce this account, outline the answers that it offers, and develop a general picture of scientific modelling based on it.
It is now part and parcel of the official philosophical wisdom that models are essential to the acquisition and organisation of scientific knowledge. It is also generally accepted that most models represent their target systems in one way or another. But what does it mean for a model to represent its target system? I begin by introducing three conundrums that a theory of scientific representation has to come to terms with and then ask whether the semantic view of theories, currently the most widely accepted account of theories and models, provides adequate answers to them. After arguing in some detail that it does not, I conclude by pointing out in what direction a tenable account of scientific representation might be sought.
Computer simulations are an exciting tool that plays important roles in many scientific disciplines. This has attracted the attention of a number of philosophers of science. The main tenor in this literature is that computer simulations not only constitute interesting and powerful new science, but that they also raise a host of new philosophical issues. The protagonists in this debate claim nothing less than that simulations call into question our philosophical understanding of scientific ontology, the epistemology and semantics of models and theories, and the relation between experimentation and theorising, and they submit that simulations demand a fundamentally new philosophy of science in many respects. The aim of this paper is to evaluate these claims critically. Our conclusion will be sober: we argue that these claims are overblown and that simulations, far from demanding a new metaphysics, epistemology, semantics and methodology, raise few if any new philosophical problems. The philosophical problems that do come up in connection with simulations are not specific to simulations, and most of them are variants of problems that have been discussed in other contexts before.
Scientific discourse is rife with passages that appear to be ordinary descriptions of systems of interest in a particular discipline. Equally, the pages of textbooks and journals are filled with discussions of the properties and the behavior of those systems. Students of mechanics investigate at length the dynamical properties of a system consisting of two or three spinning spheres with homogeneous mass distributions gravitationally interacting only with each other. Population biologists study the evolution of one species procreating at a constant rate in an isolated ecosystem. And when studying the exchange of goods, economists consider a situation in which there are only two goods, two perfectly rational agents, no restrictions on available information, no transaction costs, no money, and dealings are done immediately. Their surface structure notwithstanding, no competent scientist would mistake descriptions of such systems for descriptions of an actual system: we know very well that there are no such systems. These descriptions are descriptions of a model-system, and scientists use model-systems to represent parts or aspects of the world they are interested in. Following common practice, I refer to those parts or aspects as target-systems. What are we to make of this? Is discourse about such models merely a picturesque and ultimately dispensable façon de parler? This was the view of some early twentieth-century philosophers. Duhem (1906) famously guarded against confusing model building with scientific theorizing and argued that model building has no real place in science beyond a minor heuristic role. The aim of science was, instead, to construct theories, with theories understood as classificatory or representative structures systematically presented and formulated in precise symbolic…
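To see what one of these descriptions looks like when written out, take the population-biology example: a species procreating at a constant rate in an isolated ecosystem is standardly captured by a simple growth law. The following is a minimal sketch in standard notation (N for population size, r for the constant per-capita rate); it is an illustration, not text from the paper:

    \[ \frac{dN}{dt} = rN, \qquad N(t) = N(0)\,e^{rt}. \]

No actual population satisfies this equation exactly, which is precisely the point: the equation describes a model-system, and it is the model-system, not any real ecosystem, of which the description is literally true.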
Everything you always wanted to know about structural realism but were afraid to ask. Roman Frigg (Department of Philosophy, Logic and Scientific Method, London School of Economics and Political Science) and Ioannis Votsis (Philosophisches Institut, Heinrich-Heine-Universität Düsseldorf). European Journal for Philosophy of Science 1(2): 227–276. DOI: 10.1007/s13194-011-0025-7.
The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution forecasts of climate during the 21st century using state-of-the-art global climate models. The aim of this paper is to introduce and analyse the methodology used and then urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on our ability, today, to make trustworthy, high-resolution predictions out to the end of this century.
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory’s time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
The sensitive dependence on initial conditions (SDIC) associated with nonlinear models imposes limitations on the models’ predictive power. We draw attention to an additional limitation that has been under-appreciated, namely structural model error (SME). A model has SME if the model dynamics differ from the dynamics of the target system. If a nonlinear model has only the slightest SME, then its ability to generate decision-relevant predictions is compromised. Given a perfect model, we can take the effects of SDIC into account by substituting probabilistic predictions for point predictions. This route is foreclosed in the case of SME.
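Although the abstract gives no example, the contrast between SDIC and SME is easy to exhibit in a toy nonlinear model. The Python sketch below is purely illustrative (the logistic map and the sizes of the perturbations are my choices, not the paper's): a tiny initial-condition error under the true dynamics and a tiny structural perturbation of the dynamics both wreck point predictions after a few dozen steps.

    # Toy contrast between SDIC and structural model error (SME).
    # The "target system" is the logistic map; the "model" carries a
    # small structural (cubic) perturbation. All values are illustrative.

    def target(x):
        return 4.0 * x * (1.0 - x)                      # assumed true dynamics

    def model(x):
        return 4.0 * x * (1.0 - x) * (1.0 - 0.001 * x)  # tiny SME, stays in [0, 1]

    x_true, x_perturbed, x_model = 0.3, 0.3 + 1e-9, 0.3
    for _ in range(50):
        x_true = target(x_true)            # true system
        x_perturbed = target(x_perturbed)  # SDIC: true dynamics, perturbed state
        x_model = model(x_model)           # SME: perturbed dynamics, true state

    # After 50 steps both errors are typically of order one: point forecasts fail.
    print(f"SDIC error: {abs(x_true - x_perturbed):.3f}")
    print(f"SME  error: {abs(x_true - x_model):.3f}")

The asymmetry the paper exploits is that an ensemble of initial conditions evolved under the true dynamics still yields a meaningful probability forecast despite SDIC, whereas no initial-condition ensemble corrects for an error in the dynamics themselves.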
On the face of it ‘deterministic chance’ is an oxymoron: an event is either chancy or deterministic, but not both. Nevertheless, the world is rife with events that seem to be exactly that: chancy and deterministic at once. Simple gambling devices like coins and dice are cases in point. On the one hand they are governed by deterministic laws – the laws of classical mechanics – and hence, given the initial condition of, say, a coin toss, it is determined whether it will land heads or tails. On the other hand, we commonly assign probabilities to the different outcomes of a coin toss, and doing so has proven successful in guiding our actions. The same dilemma also emerges in less mundane contexts. Classical statistical mechanics (which is still an important part of modern physics) assigns probabilities to the occurrence of certain events – for instance to the spreading of a gas that is originally confined to the left half of a container – but at the same time assumes that the relevant systems are deterministic. How can this apparent conflict be resolved?
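A deterministic ‘coin’ of the kind the puzzle concerns can be simulated directly. The sketch below is an illustrative toy (not drawn from the paper): every ‘toss’ is fixed by the initial condition via a deterministic chaotic map, yet the relative frequency of heads stabilises near 1/2.

    # A deterministic "coin": iterate the chaotic logistic map and call
    # the state heads when it lies above 1/2, tails otherwise.
    # Illustrative toy; the initial condition determines every outcome.

    x = 0.123456789        # arbitrary initial condition
    heads = 0
    n_tosses = 100_000
    for _ in range(n_tosses):
        x = 4.0 * x * (1.0 - x)   # deterministic evolution
        if x > 0.5:
            heads += 1

    print(f"relative frequency of heads: {heads / n_tosses:.3f}")  # ~0.5

Each outcome is determined, while the map's invariant measure underwrites the stable frequencies; structures of this sort are what Humean accounts of deterministic chance latch onto.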
Gases reach equilibrium when left to themselves. Why do they behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that such systems have to be ergodic. This answer has been criticised on different grounds and is now widely regarded as flawed. In this paper we argue that some of the main arguments against Boltzmann's answer, in particular arguments based on the KAM theorem and the Markus-Meyer theorem, are beside the point. We then argue that something close to Boltzmann's original proposal is true for gases: gases exhibit thermodynamic-like behaviour if they are epsilon-ergodic, i.e., ergodic on the entire accessible phase space except for a small region of measure epsilon. This answer is promising because there are good reasons to believe that the relevant systems in statistical mechanics are epsilon-ergodic.
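In symbols, the central notion can be put as follows (a sketch of the standard definition; Γ is the accessible phase space and μ the normalised invariant measure):

    \[ \text{the system is } \varepsilon\text{-ergodic iff there is an invariant set } \hat{\Gamma} \subseteq \Gamma \text{ with } \mu(\hat{\Gamma}) \ge 1 - \varepsilon \text{ on which the flow is ergodic.} \]

Ergodicity on Γ̂ means that for almost every initial condition in Γ̂ the infinite-time average of any integrable phase function equals its phase average; ordinary ergodicity is the special case ε = 0.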
GRW theory postulates a stochastic mechanism assuring that every so often the wave function of a quantum system is ‘hit’, which leaves it in a localised state. How are we to interpret the probabilities built into this mechanism? GRW theory is a firmly realist proposal, and it is therefore clear that these probabilities are objective probabilities (i.e. chances). A discussion of the major theories of chance leads us to the conclusion that GRW probabilities can be understood only as either single-case propensities or Humean objective chances. Although single-case propensities have some intuitive appeal in the context of GRW theory, on balance Humean objective chances seem preferable on conceptual grounds, because single-case propensities suffer from various well-known problems such as unlimited frequency tolerance and the lack of a rationalisation of the Principal Principle.
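For readers unfamiliar with the formalism, the ‘hits’ can be stated compactly (a standard textbook sketch, not quoted from the paper; the parameter values are the usual order-of-magnitude conventions). Each particle is subject to spontaneous localisations occurring as a Poisson process with rate λ ≈ 10⁻¹⁶ s⁻¹; a hit centred at x multiplies the wave function by a Gaussian of width σ ≈ 10⁻⁷ m and renormalises:

    \[ \psi \;\longmapsto\; \frac{L_x \psi}{\lVert L_x \psi \rVert}, \qquad L_x = (\pi\sigma^2)^{-3/4} \exp\!\left( -\frac{(\hat{q} - x)^2}{2\sigma^2} \right), \]

with the probability of a hit centred at x given by \( \lVert L_x \psi \rVert^2 \). It is these hit probabilities whose interpretation is at issue.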
The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used and then urge some caution. Given the acknowledged systematic, shared shortcomings in all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate change impacts, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy, high-resolution probabilistic projections out to the end of this century.
Climate change adaptation is largely a local matter, and adaptation planning can benefit from local climate change projections. Such projections are typically generated by accepting climate model outputs in a relatively uncritical way. We argue, based on the IPCC’s treatment of model outputs from the CMIP5 ensemble, that this approach is unwarranted and that subjective expert judgment should play a central role in the provision of local climate change projections intended to support decision-making.
A gas prepared in a non-equilibrium state will approach equilibrium and stay there. An influential contemporary approach to statistical mechanics explains this behaviour in terms of typicality. However, this explanation has been criticised as mysterious as long as no connection with the dynamics of the system is established. We take this criticism as our point of departure. Our central claim is that epsilon-ergodic Hamiltonians of gases are typical with respect to the Whitney topology. Because equilibrium states are typical, the desired conclusion follows: typical initial conditions approach equilibrium and stay there.
Various processes are often classified as both deterministic and random or chaotic. The main difficulty in analysing the randomness of such processes is the apparent tension between the notions of randomness and determinism: what type of randomness could exist in a deterministic process? Ergodic theory seems to offer a particularly promising theoretical tool for tackling this problem by positing a hierarchy, the so-called ‘ergodic hierarchy’ (EH), which is commonly assumed to provide a hierarchy of increasing degrees of randomness. However, that notion of randomness requires clarification. The mathematical definition of EH does not make explicit appeal to randomness; nor does the usual way of presenting EH involve a specification of the notion of randomness that is supposed to underlie the hierarchy. In this paper we argue that EH is best understood as a hierarchy of random behaviour if randomness is explicated in terms of unpredictability. We then show that, contrary to common wisdom, EH is useful in characterising the behaviour of Hamiltonian dynamical systems.
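The hierarchy itself can be stated compactly (a standard sketch; (X, μ, φ_t) is a measure-preserving dynamical system and A, B range over measurable sets):

    \[ \text{Bernoulli} \subset \text{Kolmogorov} \subset \text{strong mixing} \subset \text{weak mixing} \subset \text{ergodic}, \]

where, for instance, ergodicity is equivalent to \( \lim_{T\to\infty} \frac{1}{T}\int_0^T \mu(\phi_t A \cap B)\,dt = \mu(A)\mu(B) \) and strong mixing to \( \lim_{t\to\infty} \mu(\phi_t A \cap B) = \mu(A)\mu(B) \). Reading the decay of such correlations as unpredictability is what turns these inclusions into degrees of randomness.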
Consider a gas that is adiabatically isolated from its environment and confined to the left half of a container. Then remove the wall separating the two parts. The gas will immediately start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Thermodynamics (TD) characterizes this process in terms of an increase of thermodynamic entropy, which attains its maximum value at equilibrium. The second law of thermodynamics captures the irreversibility of this process by positing that in an isolated system such as the gas entropy cannot decrease. The aim of statistical mechanics (SM) is to explain the behavior of the gas and, in particular, its conformity with the second law in terms of the dynamical laws governing the individual molecules of which the gas is made up. In what follows these laws are assumed to be those of Hamiltonian classical mechanics. We should not, however, ask for an explanation of the second law literally construed. This law is a universal law and as such cannot be explained by a statistical theory. But this is not a problem because we…
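The two quantities this passage trades on can be written down explicitly (standard definitions, supplied here for orientation rather than quoted from the text): the second law's monotonicity claim for isolated systems, and the Boltzmann entropy SM associates with a macrostate M,

    \[ \Delta S_{TD} \ge 0, \qquad S_B(M) = k_B \log \mu(\Gamma_M), \]

where Γ_M is the region of phase space consisting of the micro-states that realise M and μ is the Lebesgue measure. The explanatory task is then to show that the dynamics carries the system from small macro-regions towards the dominant equilibrium region.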
Various scientific theories stand in a reductive relation to each other. In a recent article, we argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this article, we present a Bayesian analysis of how GNS bears on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence.
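The flavour of the Bayesian-network analysis can be conveyed with a deliberately crude toy example (all numbers invented for illustration; the paper's treatment is general). Before the reduction the two theories are probabilistically independent, so evidence bearing on the reduced theory leaves the reducing theory untouched; linking them changes this:

    from itertools import product

    # Toy illustration: T1 (reducing theory), T2 (reduced theory),
    # E (evidence bearing directly on T2). All values are invented.

    def posterior_t1(p_t2_given_t1, p_t2_given_not_t1):
        """P(T1 | E) by enumerating the joint distribution."""
        p_t1 = 0.5
        p_e_given_t2 = {True: 0.9, False: 0.2}
        num = den = 0.0
        for t1, t2 in product([True, False], repeat=2):
            p_t2_true = p_t2_given_t1 if t1 else p_t2_given_not_t1
            p = p_t1 if t1 else 1.0 - p_t1
            p *= p_t2_true if t2 else 1.0 - p_t2_true
            p *= p_e_given_t2[t2]
            den += p
            if t1:
                num += p
        return num / den

    print(posterior_t1(0.5, 0.5))  # pre-reduction: 0.5, E is irrelevant to T1
    print(posterior_t1(0.9, 0.3))  # post-reduction: ~0.67, E confirms T1 too

The qualitative moral matches the paper's: once the reduction links the theories, evidence for either confirms both.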
In two recent papers Barry Loewer (2001, 2004) has suggested interpreting probabilities in statistical mechanics as Humean chances in David Lewis’ (1994) sense. I first give a precise formulation of this proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting these probabilities epistemically.
In this paper we explore the constraints that our preferred account of scientific representation places on the ontology of scientific models. Pace the Direct Representation view associated with Arnon Levy and Adam Toon, we argue that scientific models should be thought of as imagined systems, and we clarify the relationship between imagination and representation.
The aim of this article is twofold. Recently, Lewis has presented an argument, now known as the "counting anomaly", that the spontaneous localization approach to quantum mechanics, suggested by Ghirardi, Rimini, and Weber, implies that arithmetic does not apply to ordinary macroscopic objects. I take this argument as the starting point for a discussion of the property structure of realist collapse interpretations of quantum mechanics in general. At the end of this discussion I present a proof that the composition principle, which holds true in standard quantum mechanics, fails in all realist collapse interpretations. On the basis of this result I reconsider the counting anomaly and show that what lies at the heart of the anomaly is a failure to appreciate the peculiarities of the property structure of such interpretations. Once this flaw is uncovered, the anomaly vanishes.
Models play a central role in the scientific endeavour. Among the many purposes they serve, representation is of great importance. Many models are representations of something else; they stand for, depict, or imitate a selected part of the external world (often referred to as the target system, parent system, original, or prototype). Well-known examples include the model of the solar system, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the MIT bag model of quark confinement, the Lorenz model of the atmosphere, the Lotka-Volterra model of the predator-prey interaction, and the hydraulic model of an economy, to mention just a few. All these models represent their target systems (or selected parts of them) in one way or another.
Boltzmannian statistical mechanics partitions the phase space of a system into macro-regions, and the largest of these is identified with equilibrium. What justifies this identification? Common answers focus on Boltzmann’s combinatorial argument, the Maxwell-Boltzmann distribution, and maximum entropy considerations. We argue that they fail and present a new answer. We characterise equilibrium as the macrostate in which a system spends most of its time and prove a new theorem establishing that equilibrium thus defined corresponds to the largest macro-region. Our derivation is completely general in that it does not rely on assumptions about a system’s dynamics or internal interactions.
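The new characterisation can be glossed formally (a sketch following the abstract's informal statement; φ_t is the flow and 1_{Γ_M} the indicator function of macrostate M's macro-region). For an initial condition x, the long-run fraction of time spent in M is

    \[ \overline{F}_M(x) = \lim_{T \to \infty} \frac{1}{T} \int_0^T \mathbf{1}_{\Gamma_M}\big(\phi_t(x)\big)\, dt, \]

and equilibrium is the macrostate maximising this fraction for the relevant initial conditions; the theorem then delivers that this macrostate's macro-region is the largest.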
Models are of central importance in many scientific contexts. The centrality of models such as the billiard ball model of a gas, the Bohr model of the atom, the MIT bag model of the nucleon, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, the double helix model of DNA, agent-based and evolutionary models in the social sciences, or general equilibrium models of markets in their respective domains are cases in point. Scientists spend a great deal of time building, testing, comparing and revising models, and much journal space is dedicated to introducing, applying and interpreting these valuable tools. In short, models are one of the principal instruments of modern science.

Philosophers are acknowledging the importance of models with increasing attention and are probing the assorted roles that models play in scientific practice. The result has been an incredible proliferation of model-types in the philosophical literature. Probing models, phenomenological models, computational models, developmental models, explanatory models, impoverished models, testing models, idealized models, theoretical models, scale models, heuristic models, caricature models, didactic models, fantasy models, toy models, imaginary models, mathematical models, substitute models, iconic models, formal models, analogue models and instrumental models are but some of the notions that are used to categorize models. While at first glance this abundance is overwhelming, it can quickly be brought under control by recognizing that these notions pertain to different problems that arise in connection with models. For example, models raise questions in semantics (what is the representational function that models perform?), ontology (what kind of things are models?), epistemology (how do we learn with models?), and, of course, in philosophy of science (how do models relate to theory? what are the implications of a model-based approach to science for the debates over scientific realism, reductionism, explanation and laws of nature?).
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
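As a preview, four of the chapter's central notions can be set side by side (standard definitions):

    \[ dS_{TD} = \frac{\delta Q_{rev}}{T}, \qquad H = -\sum_i p_i \log p_i, \qquad S_G = -k_B \int \rho \log \rho \, d\Gamma, \qquad S_B = k_B \log \mu(\Gamma_M), \]

the thermodynamic, Shannon, Gibbs and Boltzmann entropies respectively. The contrast already illustrates the chapter's observation: the Shannon and Gibbs entropies are defined over probability distributions, while the thermodynamic entropy is not, and the Boltzmann entropy is defined via the measure of a macro-region.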
Special issue. With contributions by Anouk Barberousse, Sara Franceschelli and Cyrille Imbert, Robert Batterman, Roman Frigg and Julian Reiss, Axel Gelfert, Till Grüne-Yanoff, Paul Humphreys, James Mattingly and Walter Warwick, Matthew Parker, Wendy Parker, Dirk Schlimm, and Eric Winsberg.
The so-called ergodic hierarchy (EH) is a central part of ergodic theory. It is a hierarchy of properties that dynamical systems can possess. Its five levels are ergodicity, weak mixing, strong mixing, Kolmogorov, and Bernoulli. Although EH is a mathematical theory, its concepts have been widely used in the foundations of statistical physics, accounts of randomness, and discussions about the nature of chaos. We introduce EH and discuss its applications in these fields.
Science provides us with representations of atoms, elementary particles, polymers, populations, genetic trees, economies, rational decisions, aeroplanes, earthquakes, forest fires, irrigation systems, and the world’s climate. It's through these representations that we learn about the world. This entry explores the various accounts of scientific representation, with a particular focus on how scientific models represent their target systems. As philosophers of science increasingly acknowledge the importance, if not the primacy, of scientific models as representational units of science, it's important to stress that how they represent plays a fundamental role in how we answer other questions in the philosophy of science. This entry begins by disentangling ‘the’ problem of scientific representation, and then critically evaluates the current options available in the literature.
An important contemporary version of Boltzmannian statistical mechanics explains the approach to equilibrium in terms of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognized as such and not clearly distinguished. This article identifies three different versions of typicality-based explanations of thermodynamic-like behavior and evaluates their respective successes. The conclusion is that the first two are unsuccessful because they fail to take the system's dynamics into account. The third, however, is promising. I give a precise formulation of the proposal and present an argument in support of its central contention.
This volume is a serious attempt to open up the subject of European philosophy of science to real thought, and to provide the structural basis for the interdisciplinary development of its specialist fields, but also to provoke reflection on the idea of ‘European philosophy of science’. These efforts should foster contemporaneous reflection on what might be meant by philosophy of science in Europe and European philosophy of science, and on how awareness of it could help philosophers interpret and motivate their research through a stronger collective identity. The overarching aim is to set the background for a collaborative project organising, systematising, and ultimately forging an identity for, European philosophy of science by creating research structures and developing research networks across Europe to promote its development.
In its most common use, the term ‘model’ refers to a simplified and stylised version of the so-called target system, the part or aspect of the world that we are interested in. For instance, in order to determine the orbit of a planet moving around the sun, we model the planet and the sun as perfect homogeneous spheres that gravitationally interact with each other but with nothing else in the universe, and then apply Newtonian mechanics to this system, which reveals that the planet moves on an elliptical orbit. Views diverge about what sort of entity such a model is. Those focussing on the formal aspects of models regard them either as equations or as set-theoretical structures, while those opposed to such an approach take them to be descriptions or abstract (yet non-mathematical) entities. A further question concerns the relation of models and theories. In some cases models can be derived from theory simply by specifying the relevant determinables in a theory’s general equations. But many models cannot be obtained from theory in this straightforward way, and some even involve assumptions that contradict the fundamental theory. The relation of models to their respective target systems is equally complex and fraught with controversy. Two influential proposals take the relation between a model and its target to be isomorphism or similarity, respectively. This, however, has been criticised as too restrictive, as many models do not seem to fit this mould.
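The orbit example in this passage can be checked numerically in a few lines. The sketch below is an illustration in invented units (GM = 1) rather than anything from the text; integrating Newtonian gravity for the model-planet yields stable, distinct perihelion and aphelion distances, i.e. an elliptical orbit.

    import math

    # Toy Kepler problem: a model-planet orbiting a fixed sun, GM = 1.
    # Leapfrog integration; illustrative units and parameters.

    GM, dt = 1.0, 0.001
    x, y = 1.0, 0.0        # initial position
    vx, vy = 0.0, 0.8      # sub-circular speed -> ellipse, not circle

    def accel(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -GM * x / r3, -GM * y / r3

    r_min, r_max = float("inf"), 0.0
    ax, ay = accel(x, y)
    for _ in range(50_000):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # kick
        x += dt * vx; y += dt * vy                 # drift
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # kick
        r = math.hypot(x, y)
        r_min, r_max = min(r_min, r), max(r_max, r)

    # Stable, distinct extremes over many revolutions: an ellipse.
    print(f"perihelion ~ {r_min:.3f}, aphelion ~ {r_max:.3f}")

Of course, the model-planet is not any actual planet; the computation explores the model-system, which is then used to represent its target.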
The received wisdom in statistical mechanics is that isolated systems, when left to themselves, approach equilibrium. But under what circumstances does an equilibrium state exist and an approach to equilibrium take place? In this paper we address these questions from the vantage point of the long-run fraction of time definition of Boltzmannian equilibrium that we developed in two recent papers. After a short summary of Boltzmannian statistical mechanics and our definition of equilibrium, we state an existence theorem which provides general criteria for the existence of an equilibrium state. We first show how the theorem works with a toy example, which allows us to illustrate its various elements in a simple setting. After a look at the ergodic programme, we discuss equilibria in a number of different gas systems: the ideal gas, the dilute gas, the Kac gas, the stadium gas, the mushroom gas and the multi-mushroom gas. In the conclusion we briefly summarise the main points and highlight open questions.
…has been studied extensively, little is known about how it is influenced by the osseous joint configuration. The aim of this study is to analyse the biomechanical influence of the … clinical data on the stability of the ankle joint. A two-dimensional … Based on lateral …
In Boltzmannian statistical mechanics macro-states supervene on micro-states. This leads to a partitioning of the state space of a system into regions of macroscopically indistinguishable micro-states. The largest of these regions is singled out as the equilibrium region of the system. What justifies this association? We review currently available answers to this question and find them wanting both for conceptual and for technical reasons. We propose a new conception of equilibrium and prove a mathematical theorem which establishes in full generality -- i.e. without making any assumptions about the system's dynamics or the nature of the interactions between its components -- that the equilibrium macro-region is the largest macro-region. We then turn to the question of the approach to equilibrium, for which no satisfactory general answer exists so far. In our account, this question is replaced by the question of when an equilibrium state exists. We prove another -- again fully general -- theorem providing necessary and sufficient conditions for the existence of an equilibrium state. This theorem changes the way in which the question of the approach to equilibrium should be discussed: rather than launching a search for a crucial factor, the focus should be on finding triplets of macro-variables, dynamical conditions, and effective state spaces that satisfy the conditions of the theorem.
This is the first of three parts of an introduction to the philosophy of climate science. This first part, on observing climate change, discusses definitions of climate and climate change, data sets and data models, and the detection and attribution of climate change.
Models are of central importance in many scientific contexts. The roles the MIT bag model of the nucleon, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, agent-based and evolutionary models of social interaction, or general equilibrium models of markets play in their respective domains are cases in point.
This is the second of three parts of an introduction to the philosophy of climate science. This second part, on modelling climate change, discusses climate modelling, the confirmation of climate models, the limits of climate projections, uncertainty, and model ensembles.
Background: Gait analysis after total ankle replacement and ankle arthrodesis is usually performed barefoot. However, this does not reflect reality. The purpose of this study was to compare patients barefoot and with footwear. Methods: We compared 126 patients with 35 healthy controls in three conditions. Minimum follow-up was 2 years. We used dynamic pedobarography and a light gate. Main outcome measures: relative midfoot index, forefoot maximal force, walking speed. Findings: The relative midfoot index decreased in all groups from barefoot to running shoes and again to rocker-bottom shoes. The forefoot maximal force increased wearing shoes, but there was no difference between running and rocker-bottom shoes. Walking speed increased by 0.06 m/s with footwear. Total ankle replacement and ankle arthrodesis were equal in running shoes, but both deviated from healthy controls. In rocker-bottom shoes this ranking remained the same, except that the relative midfoot index converged to similar values. Tibiotalocalcaneal arthrodesis was inferior in both shoes. Interpretation: Running shoes are beneficial, and the benefit is greater for fusions and replacements. Rocker-bottom shoes have little added benefit. Total ankle replacement and ankle arthrodesis were equal but inferior to healthy controls. Tibiotalocalcaneal arthrodesis has an inferior outcome.
Let us begin with a characteristic example. Consider a gas that is confined to the left half of a box. Now we remove the barrier separating the two halves of the box. As a result, the gas quickly disperses, and it continues to do so until it homogeneously fills the entire box. This is illustrated in Figure 1.
Why do systems prepared in a non-equilibrium state approach, and eventually reach, equilibrium? An important contemporary version of the Boltzmannian approach to statistical mechanics answers this question by an appeal to the notion of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognised as such, much less clearly distinguished, and we often find different arguments pursued side by side. The aim of this paper is to disentangle different versions of typicality-based explanations of thermodynamic behaviour and evaluate their respective success. My conclusion will be that the boldest version fails for technical reasons, while more prudent versions leave essential questions unanswered.
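Since several of the abstracts above turn on it, the core notion is worth stating once (a standard sketch, with μ the relevant normalised measure): a property P is typical among the states in Γ iff the states lacking P form a vanishingly small set,

    \[ \mu\big(\{ x \in \Gamma : x \text{ lacks } P \}\big) \le \varepsilon \quad \text{for some very small } \varepsilon \ge 0. \]

The versions of typicality-based explanation distinguished in the paper differ over what dynamical assumptions must be added to this measure-theoretic fact before the typicality of equilibrium explains why systems actually approach it.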