This paper looks at the nature of idealizations and representational structures appealed to in the context of the fractional quantum Hall effect, specifically with respect to the emergence of anyons and fractional statistics. Drawing on an analogy with the Aharonov–Bohm effect, it is suggested that the standard approach to these effects, the topological approach to fractional statistics, relies essentially on problematic idealizations that need to be revised in order for the theory to be explanatory. An alternative geometric approach is outlined and endorsed. Roles for idealizations in science, as well as consequences for the debate revolving around so-called essential idealizations, are discussed.
The spin-statistics connection is derived in a simple manner under the postulates that the original and the exchange wave functions are simply added, and that the azimuthal phase angle, which defines the orientation of the spin part of each single-particle spin-component eigenfunction in the plane normal to the spin-quantization axis, is exchanged along with the other parameters. The spin factor (−1)^{2s} belongs to the exchange wave function when this function is constructed so as to get the spinor ambiguity under control. This is achieved by effecting the exchange of the azimuthal angle by means of rotations and admitting only rotations in one sense. The procedure works in Galilean as well as in Lorentz-invariant quantum mechanics. Relativistic quantum field theory is not required.
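As a minimal sketch of the construction this abstract describes (the two-particle notation below is supplied here for illustration and is not the authors' own), adding the original and the exchange wave functions, with the spin factor attached to the exchange term, gives:

```latex
% Sketch: original plus exchange wave function, with the spin factor
% (-1)^{2s} carried by the exchange term (notation supplied here).
\Psi(1,2) \;=\; \psi(1,2) \;+\; (-1)^{2s}\,\psi(2,1)
% Integer s: symmetric under exchange (Bose-Einstein statistics).
% Half-integer s: antisymmetric (Fermi-Dirac statistics, Pauli exclusion).
```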
We present a new approach to the old problem of how to incorporate the role of the observer in statistics. We show classical probability theory to be inadequate for this task and take refuge in the epsilon-model, which is the only model known to us capable of handling situations between quantum and classical statistics. An example is worked out, and some problems are discussed concerning the new viewpoint that emanates from our approach.
This paper puts forward the hypothesis that the distinctive features of quantum statistics are exclusively determined by the nature of the properties it describes. In particular, all statistically relevant properties of identical quantum particles in many-particle systems are conjectured to be irreducible, ‘inherent’ properties only belonging to the whole system. This allows one to explain quantum statistics without endorsing the ‘Received View’ that particles are non-individuals, or postulating that quantum systems obey peculiar probability distributions, or assuming that there are primitive restrictions on the range of states accessible to such systems. With this, the need for an unambiguously metaphysical explanation of certain physical facts is acknowledged and satisfied.
Statistical evidence is crucial throughout disparate impact’s three-stage analysis: during (1) the plaintiff’s prima facie demonstration of a policy’s disparate impact; (2) the defendant’s job-related business necessity defense of the discriminatory policy; and (3) the plaintiff’s demonstration of an alternative policy without the same discriminatory impact. The circuit courts are split on a vital question about the “practical significance” of statistics at Stage 1: Are “small” impacts legally insignificant? For example, is an employment policy that causes a one percent disparate impact an appropriate policy for redress through disparate impact litigation? This circuit split calls for a comprehensive analysis of practical significance testing across disparate impact’s stages. Importantly, courts and commentators use “practical significance” ambiguously between two aspects of practical significance: the magnitude of an effect and confidence in statistical evidence. For example, at Stage 1 courts might ask whether statistical evidence supports a disparate impact (a confidence inquiry) and whether such an impact is large enough to be legally relevant (a magnitude inquiry). Disparate impact’s texts, purposes, and controlling interpretations are consistent with confidence inquiries at all three stages, but not magnitude inquiries. Specifically, magnitude inquiries are inappropriate at Stages 1 and 3: there is no discriminatory impact or reduction too small or subtle for the purposes of the disparate impact analysis. Magnitude inquiries are appropriate at Stage 2, when an employer defends a discriminatory policy on the basis of its job-related business necessity.
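To make the confidence/magnitude distinction concrete, here is a minimal sketch with hypothetical hiring numbers (none of the figures or thresholds below are taken from the paper), showing that a one-percentage-point disparity can be strongly supported by the statistics (a confidence inquiry) while remaining small in magnitude (a magnitude inquiry):

```python
import math

def two_prop_ztest(hits_a, n_a, hits_b, n_b):
    """Pooled two-proportion z-test: returns (z, two-sided p-value)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical selection rates: 50% for group A vs 49% for group B.
z, p = two_prop_ztest(25_000, 50_000, 24_500, 50_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # confidence inquiry: p well below 0.05
print(f"impact ratio = {(24_500 / 50_000) / (25_000 / 50_000):.2f}")  # magnitude inquiry: 0.98
```

With samples this large the disparity is statistically unambiguous, yet the selection-rate ratio of 0.98 illustrates why a separate magnitude inquiry would ask whether such an impact is large enough to be legally relevant.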
Bose–Einstein statistics may be characterized in terms of a multinomial distribution. From this characterization, an information-theoretic analysis is made of an Einstein–Podolsky–Rosen-like situation, using Shannon’s measure of entropy.
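For reference, the two standard formulas the abstract invokes, given here only in their generic forms (the paper's specific EPR-type analysis is not reproduced):

```latex
% Multinomial distribution over occupation numbers n_1 + ... + n_k = n:
P(n_1,\dots,n_k) = \frac{n!}{n_1!\cdots n_k!}\, p_1^{n_1}\cdots p_k^{n_k}
% Shannon's measure of entropy for a discrete distribution (p_i):
H = -\sum_i p_i \log p_i
```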
Error statistics (ES) is an important methodological view in philosophy of statistics and philosophy of science that can be applied to scientific experiments such as clinical trials. In this paper, I raise two potential issues for ES when it comes to guiding and explaining the early stopping of randomized controlled trials (RCTs): ES provides limited guidance in cases of early unfavorable trends due to the possibility of trend reversal, and ES is silent on how to prospectively control error rates in experiments requiring multiple interim analyses. The method of conditional power, together with a rationing principle for RCTs, can assist ES in addressing such issues.
In this article, the rotational invariance of entangled quantum states is investigated as a possible cause of the Pauli exclusion principle. First, it is shown that a certain class of rotationally invariant states can only occur in pairs. This is referred to as the coupling principle. This in turn suggests a natural classification of quantum systems into those that contain coupled states and those that do not. Surprisingly, it would seem that Fermi–Dirac statistics follows as a consequence of this coupling, while Bose–Einstein statistics follows by breaking it. In Sec. 5, the above approach is related to Pauli's original spin-statistics theorem, and finally, in the last two sections, a theoretical justification based on Clebsch–Gordan coefficients and the experimental evidence are presented, respectively.
The traditional standard quantum mechanics theory is unable to solve the spin–statistics problem, i.e. to justify the utterly important “Pauli Exclusion Principle”. A complete and straightforward solution of the spin–statistics problem is presented on the basis of the “conformal quantum geometrodynamics” theory. This theory provides a Weyl-gauge invariant formulation of the standard quantum mechanics and reproduces successfully all relevant quantum processes including the formulation of Dirac’s or Schrödinger’s equation, of Heisenberg’s uncertainty relations and of the nonlocal EPR correlations. When the conformal quantum geometrodynamics is applied to a system made of many identical particles with spin, an additional constant property of all elementary particles enters naturally into play: the “intrinsic helicity”. This property, not considered in the Standard Quantum Mechanics, determines the correct spin–statistics connection observed in Nature.
The spin-statistics connection has been proved for nonrelativistic quantum mechanics. The proof is extended here to the relativistic regime using the parametrized Dirac equation. A causality condition is not required.
After sketching the conflict between objectivists and subjectivists on the foundations of statistics, this paper discusses an issue facing statisticians of both schools, namely, model validation. Statistical models originate in the study of games of chance, and have been successfully applied in the physical and life sciences. However, there are basic problems in applying the models to social phenomena; some of the difficulties will be pointed out. Hooke's law will be contrasted with regression models for salary discrimination, the latter being a fairly typical application in the social sciences.
ABSTRACT. Probability and statistics play an important role in contemporary philosophy of causality. They are viewed as glasses through which we can see or detect causal relations. However, they may sometimes act as blinding glasses, as I will argue in this paper. In the 19th century, Francis Galton tried to statistically analyze hereditary phenomena. Although he was a far better statistician than Gregor Mendel, his biological theory turned out to be less fruitful. This was no sheer accident. His knowledge of statistics generated two explananda (unknown to Mendel) which in turn generated constraints for any theory of heredity. These constraints misguided Galton's search for the causal mechanism of inheritance. This is not just an interesting case for philosophers and historians of science. Notwithstanding the progress made by statistics, Galton's problem is still relevant today. In the final section, I briefly explore the implications for statistical techniques such as structural equation modelling.
It was shown in the early seventies that, in Local Quantum Theory (that is, the most general formulation of Quantum Field Theory, if we leave out only the unknown scenario of Quantum Gravity), the notion of Statistics can be grounded solely on the local observable quantities (without assuming either the commutation relations or even the existence of unobservable charged field operators); one finds that only the well known (para)statistics of Bose/Fermi type are allowed by the key principle of local commutativity of observables. In this framework it was possible to formulate and prove the Spin and Statistics Theorem purely on the basis of First Principles. In a subsequent stage it has been possible to prove the existence of a unique, canonical algebra of local field operators obeying ordinary Bose/Fermi commutation relations at spacelike separations. In this general guise the Spin–Statistics Theorem applies to Theories (on the four-dimensional Minkowski space) where only massive particles with finite mass degeneracy can occur. Here we describe the underlying simple basic ideas, and briefly mention the subsequent generalisations; eventually we comment on the possible validity of the Spin–Statistics Theorem in the presence of massless particles, or of violations of locality as expected in Quantum Gravity.
Statistics play a critical role in biological and clinical research. To promote logically consistent representation and classification of statistical entities, we have developed the Ontology of Biological and Clinical Statistics (OBCS). OBCS extends the Ontology of Biomedical Investigations (OBI), an OBO Foundry ontology supported by some 20 communities. Currently, OBCS contains 686 terms, including 381 classes imported from OBI and 147 classes specific to OBCS. The goal of this paper is to present OBCS for community critique and to describe a number of use cases designed to illustrate its potential applications. The OBCS project and source code are available at http://obcs.googlecode.com.
Are the categories used to study the social world and to act on it real or conventional? An empirical answer to that question is given by an analysis of the debates about the quality of statistics produced by the European National Institutes of Statistics in the 1990s. Six criteria of quality were then specified: relevance, accuracy, timeliness, accessibility, comparability and coherence. How do statisticians and users of statistics deal with the tension produced by their objects being both real (they exist before their measurement) and conventionally constructed (they are, in a way, created by these conventions)? In particular, the technical and sociological distinction between the criteria of relevance and accuracy implies a realistic interpretation, desired by users, but one that is nonetheless problematic.
On this occasion of honouring the achievement of Philip McShane, I would like to recall his earliest and, in my judgment, most important work, Randomness, Statistics and Emergence. In particular, I will recall how that work situated Lonergan’s important breakthrough on statistical method in relation to the major currents of thought on the subject, many of which remain influential still today.
A realistic physical axiomatic approach to relativistic quantum field theory is presented. Following the action principle of Schwinger, a covariant and general formulation is obtained. The correspondence principle is not invoked and the commutation relations are not postulated but deduced. The most important theorems, such as spin-statistics and CPT, are proved. The theory is constructed from the notion of basic field and system of basic fields. In comparison with other formulations, in our realistic approach fields are regarded as real things with symmetry properties. Finally, the general structure is contrasted with other formulations.
Within a geometric and algebraic framework, the structures which are related to the spin-statistics connection are discussed. A comparison with the Berry–Robbins approach is made. The underlying geometric structure constitutes an additional support for this approach. In our work, a geometric approach to quantum indistinguishability is introduced which allows the treatment of single-valuedness of wave functions in a global, model-independent way.
The angular momentum operators for a system of two spin-zero indistinguishable particles are constructed, using Isham’s Canonical Group Quantization method. This mathematically rigorous method provides a hint at the correct definition of (total) angular momentum operators, for arbitrary spin, in a system of indistinguishable particles. The connection with other configuration space approaches to spin-statistics is discussed, as well as the relevance of the obtained results in view of a possible alternative proof of the spin-statistics theorem.
The paper discusses the scope and influence of eugenics in defining the scientific programme of statistics and the impact of the evolution of biology on social scientists. It argues that eugenics was instrumental in providing a bridge between sciences, and therefore created both the impulse and the institutions necessary for the birth of modern statistics in its applications first to biology and then to the social sciences. Looking at the question from the point of view of the history of statistics and the social sciences, and mostly concentrating on evidence from the British debates, the paper discusses how these disciplines became emancipated from eugenics precisely because of the inspiration of biology. It also relates how social scientists were fascinated and perplexed by the innovations taking place in statistical theory and practice.
An argument is given to the effect that non-relativistic quantum particles can be understood as individual objects, in spite of the empirical evidence seemingly lending support to the opposite conclusion. Ways to understand quantum indistinguishability and quantum statistics in terms of individuals are indicated.
Quantum theories constructed on the noncommutative spacetime called the Groenewold–Moyal plane exhibit many interesting properties such as Lorentz and CPT noninvariance, causality violation and twisted statistics. We show that such violations lead to many striking features that may be tested experimentally. These theories predict Pauli forbidden transitions due to twisted statistics, anisotropies in the cosmic microwave background radiation due to correlations of observables in spacelike regions and Lorentz and CPT violations in scattering amplitudes.
Spin-statistics transmutation is the phenomenon occurring when a “dressing” transformation introduced for physical reasons (e.g. gauge invariance) modifies the “bare” spin and statistics of particles or fields. Historically, it first appeared in Quantum Mechanics and in the semiclassical approximation to Quantum Field Theory. After a brief historical introduction, we sketch how to describe such a phenomenon in Quantum Field Theory beyond the semiclassical approximation, using a path-integral formulation of Euclidean correlation functions, exemplifying with anyons, dyons and skyrmions.
Despite its phenomenal success since its inception in the early nineteen-nineties, the evidence-based medicine movement has not succeeded in shaking off an epistemological critique derived from the experiential or tacit dimensions of clinical reasoning about particular individuals. This critique claims that the evidence-based medicine model does not take account of tacit knowing as developed by the philosopher Michael Polanyi. However, the epistemology of evidence-based medicine is premised on the elimination of the tacit dimension from clinical judgment. This is demonstrated through analyzing the dichotomy between clinical and statistical intuition in evidence-based medicine’s epistemology of clinical reasoning. I argue that clinical epidemiology presents a more nuanced epistemological model for the application of statistical epidemiology to the clinical context. Polanyi’s theory of tacit knowing is compatible with the model of clinical reasoning associated with clinical epidemiology, but not evidence-based medicine.
The last two decades have seen a welcome proliferation of the collection and dissemination of data on social progress, as well as considered public debates rethinking existing standards of measuring the progress of societies. These efforts are to be welcomed. However, they are only a nascent step on a longer road to the improved measurement of social progress. In this paper, I focus on the central role that gender should take in future efforts to measure progress in securing human rights, with a particular focus on anti-poverty rights. I proceed in four parts. First, I argue that measurement of human rights achievements and human rights deficits is entailed by the recognition of human rights, and that adequate measurement of human rights must be genuinely gender-sensitive. Second, I argue that existing systems of information collection currently fail rights holders, especially women, by failing to adequately gather information on the degree to which their rights are secure. If my first two claims are correct, this failure represents a serious injustice, and in particular an injustice for women. Third, I make recommendations regarding changes to existing information collection that would generate gender-sensitive measures of anti-poverty rights. Fourth, I conclude by responding to various objections that have been raised regarding the rise of indicators to track human rights.
Utility theories, both Expected Utility and non-Expected Utility theories, offer numericalized representations of classical principles meant for the regulation of choice under conditions of risk: a type of formal representation that reduces the representation of risk to a single number. I shall refer to these as risk-numericalizing theories of decision. I shall argue that risk-numericalizing theories are not satisfactory answers to the question: “How do I take the means to my ends?” In other words, they are inadequate or incomplete as instrumental theories. They are inadequate because they are poor answers to the question of what it is for an option to be instrumental towards an end. To say it another way, they do not offer a sufficiently rich account of what it is for something to be a means toward an end.
Quantitative researchers often discuss research ethics as if specific ethical problems can be reduced to abstract normative logics. Such approaches overlook how values are embedded in every aspect of quantitative methods, including ‘observations,’ ‘facts,’ and notions of ‘objectivity.’ We describe how quantitative research practices, concepts, discourses, and their objects/subjects of study have always been value-laden, from the invention of statistics and probability (S&P) in the 1600s to their subsequent adoption as a logic made to appear as if it exists prior to, and separate from, ethics and values. This logic, which was embraced in the Academy of Management from the 1960s, casts management researchers as ethical agents who ought to know about a reality conceptualized as naturally existing in the image of statistics and probability, while overlooking that S&P logic and practices, which researchers made for themselves, have an appreciable role in making the world appear this way. We introduce a different way to conceptualize reality and ethics, wherein the process of scientific inquiry itself requires an examination of its own practices and commitments. Instead of resorting to decontextualized notions of ‘rigor’ and its ‘best practices,’ quantitative researchers can adopt more purposeful ways to reason about the ethics and relevance of their methods and their science. We end by considering implications for addressing ‘post truth’ and ‘alternative facts’ problems as collective concerns, wherein it is actually the pluralistic nature of description that makes defending a collectively valuable version of reality so important and urgent.
The concept of classical indistinguishability is analyzed and defended against a number of well-known criticisms, with particular attention to the Gibbs paradox. Granted that it is as much at home in classical as in quantum statistical mechanics, the question arises as to why indistinguishability, in quantum mechanics but not in classical mechanics, forces a change in statistics. The answer, illustrated with simple examples, is that the equilibrium measure on classical phase space is continuous, whilst on Hilbert space it is discrete. The relevance of names, or equivalently, properties stable in time that can be used as names, is also discussed.
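The continuous-versus-discrete point can be illustrated with a toy counting sketch (mine, not the paper's own example): two identical particles distributed over two one-particle states.

```python
from itertools import product
from collections import Counter

# Toy illustration: two identical particles, two one-particle states.
# Classically, the continuous equilibrium measure gives each *labelled*
# configuration equal weight, so even after identifying permuted
# configurations the "one particle in each state" case keeps twice the
# weight. On Hilbert space the measure is discrete: each symmetrised
# basis state counts once, which is the Bose-Einstein weighting.

labelled = list(product([0, 1], repeat=2))                # labelled configurations
classical = Counter(tuple(sorted(c)) for c in labelled)   # weights after identification
quantum = {occupation: 1 for occupation in classical}     # one unit per symmetric state

print("classical weights:", dict(classical))  # {(0, 0): 1, (0, 1): 2, (1, 1): 1}
print("quantum weights:  ", quantum)          # {(0, 0): 1, (0, 1): 1, (1, 1): 1}
```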
It has been recently debated whether there exists a so-called “easy road” to nominalism. In this essay, I attempt to fill a lacuna in the debate by making a connection with the literature on infinite and infinitesimal idealization in science through an example from mathematical physics that has been largely ignored by philosophers. Specifically, by appealing to John Norton’s distinction between idealization and approximation, I argue that the phenomenon of fractional quantum statistics bears negatively on Mary Leng’s proposed path to easy road nominalism, thereby partially defending Mark Colyvan’s claim that there is no easy road to nominalism.
During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher’s methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians’ tools and expertise into the station research programme. Fisher’s statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.
Suppose one hundred prisoners are in a yard under the supervision of a guard, and at some point, ninety-nine of them collectively kill the guard. If, after the fact, a prisoner is picked at random and tried, the probability of his guilt is 99%. But despite the high probability, the statistical chances, by themselves, seem insufficient to justify a conviction. The question is why. Two arguments are offered. The first, decision-theoretic argument shows that a conviction solely based on the statistics in the prisoner scenario is unacceptable so long as the goal of expected utility maximization is combined with fairness constraints. The second, risk-based argument shows that a conviction solely based on the statistics in the prisoner scenario lets the risk of mistaken conviction surge potentially too high. The same, by contrast, cannot be said of convictions solely based on DNA evidence or eyewitness testimony. A noteworthy feature of the two arguments in the paper is that they are not confined to criminal trials and can in fact be extended to civil trials.
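A back-of-the-envelope version of the numbers involved (a toy calculation, not the paper's decision-theoretic or risk-based arguments themselves):

```python
# Toy arithmetic for the prisoner scenario: 100 prisoners, 99 guilty.
prisoners, guilty = 100, 99
p_guilt = guilty / prisoners
print(f"P(guilt of a randomly tried prisoner) = {p_guilt:.2f}")  # 0.99

# If conviction rested solely on this statistic and the same policy were
# applied to every prisoner, one wrongful conviction would be certain:
print(f"wrongful convictions if all are convicted: {prisoners - guilty}")  # 1
```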
An important contribution to the foundations of probability theory, statistics and statistical physics has been made by E. T. Jaynes. The recent publication of his collected works provides an appropriate opportunity to attempt an assessment of this contribution.
Two experiments were conducted to examine adult learners' ability to extract multiple statistics in simultaneously presented visual and auditory input. Experiment 1 used a cross-situational learning paradigm to test whether English speakers were able to use co-occurrences to learn word-to-object mappings and concurrently form object categories based on the commonalities across training stimuli. Experiment 2 replicated the first experiment and further examined whether speakers of Mandarin, a language in which the final syllables of object names are more predictive of category membership than they are in English, were able to learn words and form object categories when trained with the same type of structures. The results indicate that both groups of learners successfully extracted multiple levels of co-occurrence and used them to learn words and object categories simultaneously. However, marked individual differences in performance were also found, suggesting possible interference and competition in processing the two concurrent streams of regularities.
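As a rough sketch of the co-occurrence bookkeeping a cross-situational learner is assumed to perform (the words, objects, and trial structure below are invented for illustration and are not the experiments' stimuli):

```python
from collections import defaultdict

# Invented trials: on each trial a set of novel words is heard while a set
# of objects is visible; no single trial disambiguates the mapping.
trials = [
    ({"bliket", "dofa"}, {"dog", "cup"}),
    ({"bliket", "wug"},  {"dog", "ball"}),
    ({"dofa",   "wug"},  {"cup", "ball"}),
]

# Tally word-object co-occurrences across trials.
cooc = defaultdict(int)
for words, objects in trials:
    for w in words:
        for o in objects:
            cooc[(w, o)] += 1

# A learner tracking these statistics maps each word to the object that
# co-occurred with it most often across situations.
all_words = {w for ws, _ in trials for w in ws}
mapping = {w: max((o for (w2, o) in cooc if w2 == w), key=lambda o: cooc[(w, o)])
           for w in all_words}
print(mapping)  # e.g. {'bliket': 'dog', 'dofa': 'cup', 'wug': 'ball'}
```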
The explanatory filter is a proposed method to detect design in nature with the aim of refuting Darwinian evolution. The explanatory filter borrows its logical structure from the theory of statistical hypothesis testing, but we argue that, when viewed within this context, the filter runs into serious trouble in any interesting biological application. Although the explanatory filter has been extensively criticized from many angles, we present the first rigorous criticism based on the theory of mathematical statistics.
An inductive logic is a system of inference that describes the relation between propositions on data, and propositions that extend beyond the data, such as predictions over future data, and general conclusions on all possible data. Statistics, on the other hand, is a mathematical discipline that describes procedures for deriving results about a population from sample data. These results include predictions on future samples, decisions on rejecting or accepting a hypothesis about the population, the determination of probability assignments over such hypotheses, the selection of a statistical model for studying the population, and so on. Both inductive logic and statistics are calculi for getting from the given data to propositions or results that transcend the data.