Given the common assumption that measurement plays an important role in the foundation of science, the paper analyzes the possibility that Measurement Science, and therefore measurement itself, can be properly founded. The realist and the representational positions are analyzed in this regard: the conclusion that such positions unavoidably lead to paradoxical situations opens the discussion for a new epistemology of measurement, whose characteristics and interpretation are sketched here but remain largely a matter of investigation.
Against the tradition, which has considered measurement able to produce pure data on physical systems, the unavoidable role played by the modeling activity in measurement is increasingly acknowledged, particularly with respect to the evaluation of measurement uncertainty. This paper characterizes measurement as a knowledge-based process and proposes a framework to understand the function of models in measurement and to systematically analyze their influence in the production of measurement results and their interpretation. To this aim, a general model of measurement is sketched, which gives the context to highlight the unavoidable, although sometimes implicit, presence of models in measurement and, finally, to propose some remarks on the relations between models and measurement uncertainty, complementarily classified as due to the idealization implied in the models and their realization in the experimental setup.
The paper introduces what is deemed the general epistemological problem of measurement: what characterizes measurement with respect to generic evaluation? It also analyzes the fundamental positions that have been maintained about this issue, thus presenting some sketches for a conceptual history of measurement. This characterization, in which three distinct standpoints are recognized, corresponding to a metaphysical, an anti-metaphysical, and a relativistic period, allows us to introduce and briefly discuss some general issues on the current epistemological status of measurement science.
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing the measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from the analysis of the inter-relations between calibration and measurement, the fundamental reasons for the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result to characterize measurement as a particular kind of evaluation.
This paper discusses a relational modeling of measurement which is complementary to the standard representational point of view: by focusing on the experimental character of the measurand-related comparison between objects, this modeling emphasizes the role of the measuring systems as the devices which operatively perform such a comparison. The non-idealities of the operation are formalized in terms of non-transitivity of the substitutability relation between measured objects, due to the uncertainty on the measurand value remaining after the measurement. The metrological structure of traceability is shown to be an effective solution to cope with the problem of the general non-transitivity of measurement results. A preliminary theory is introduced as a possible formalization for the presented model.
Measurement in soft systems generally cannot exploit physical sensors as data acquisition devices. The emphasis in this case is instead on how to choose the appropriate indicators and to combine their values so as to obtain an overall result, interpreted as the value of a property, i.e., the measurand, for the system under analysis. This paper aims at discussing the epistemological conditions of the claim that such a process is a measurement, and performance evaluation is the case introduced to support the analysis, performed in systematic comparison with the paradigm of measurement of physical quantities. Some background questions arising here are: – Are the chosen indicators appropriate performance indicators? – Do such indicators convey complete and non-redundant information on performance? – Does the chosen combination rule generate results suitably interpretable as performance values? And enlarging the focus: – Does the obtained value specifically convey information on the system under analysis, instead of some different entity (typically including the subject who is evaluating)? Operatively: would different subjects evaluate the same system in the same way? i.e., is the obtained information objective? – Does the obtained value convey information that is interpretable in the same way by different subjects? Operatively: would different subjects who have agreed on a decision procedure make the same decision from the same performance information? i.e., is the obtained information intersubjective? Any well-founded positive answers to these questions significantly support a structural interpretation of measurement encompassing both physical and soft measurement.
We investigate the implications of protective measurement for de Broglie-Bohm theory, mainly focusing on the interpretation of the wave function. It has been argued that the de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions. But this premise turns out to be wrong according to protective measurement; a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. Then in the de Broglie-Bohm theory both Ψ-field and Bohmian particle will have charge density distribution for a charged quantum system. This will result in the existence of an electrostatic self-interaction of the field and an electromagnetic interaction between the field and Bohmian particle, which not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Therefore, the de Broglie-Bohm theory as a realistic interpretation of quantum mechanics is problematic according to protective measurement. Lastly, we briefly discuss the possibility that the wave function is not a physical field but a description of some sort of ergodic motion (e.g. random discontinuous motion) of particles.
This article analyzes the implications of protective measurement for the meaning of the wave function. According to protective measurement, a charged quantum system has mass and charge density proportional to the modulus square of its wave function. It is shown that the mass and charge density is not real but effective, formed by the ergodic motion of a localized particle with the total mass and charge of the system. Moreover, it is argued that the ergodic motion is not continuous but discontinuous and random. This result suggests a new interpretation of the wave function, according to which the wave function is a description of random discontinuous motion of particles, and the modulus square of the wave function gives the probability density of the particles being in certain locations. It is shown that the suggested interpretation of the wave function disfavors the de Broglie-Bohm theory and the many-worlds interpretation but favors the dynamical collapse theories, and the random discontinuous motion of particles may provide an appropriate random source to collapse the wave function.
Measurement is a process aimed at acquiring and codifying information about properties of empirical entities. In this paper we provide an interpretation of such a process comparing it with what is nowadays considered the standard measurement theory, i.e., representational theory of measurement. It is maintained here that this theory has its own merits but it is incomplete and too abstract, its main weakness being the scant attention reserved to the empirical side of measurement, i.e., to measurement systems and to the ways in which the interactions of such systems with the entities under measurement provide a structure to an empirical domain. In particular it is claimed that (1) it is on the ground of the interaction with a measurement system that a partition can be induced on the domain of entities under measurement and that relations among such entities can be established, and that (2) it is the usage of measurement systems that guarantees a degree of objectivity and intersubjectivity to measurement results. As modeled in this paper, measurement systems link the abstract theory of measuring, as developed in representational terms, and the practice of measuring, as coded in standard documents such as the International Vocabulary of Metrology.
According to orthodox quantum mechanics, state vectors change in two incompatible ways: "deterministically" in accordance with Schroedinger's time-dependent equation, and probabilistically if and only if a measurement is made. It is argued here that the problem of measurement arises because the precise mutually exclusive conditions for these two types of transitions to occur are not specified within orthodox quantum mechanics. Fundamentally, this is due to an inevitable ambiguity in the notion of "measurement" itself. Hence, if the problem of measurement is to be resolved, a new, fully objective version of quantum mechanics needs to be developed which does not incorporate the notion of measurement in its basic postulates at all.
This work develops an epistemology of measurement, that is, an account of the conditions under which measurement and standardization methods produce knowledge as well as the nature, scope, and limits of this knowledge. I focus on three questions: (i) how is it possible to tell whether an instrument measures the quantity it is intended to? (ii) what do claims to measurement accuracy amount to, and how might such claims be justified? (iii) when is disagreement among instruments a sign of error, and when does it imply that instruments measure different quantities? Based on a series of case studies conducted in collaboration with the US National Institute of Standards and Technology (NIST), I argue for a model-based approach to the epistemology of physical measurement. To measure a physical quantity, I argue, is to estimate the value of a parameter in an idealized model of a physical process. Such estimation involves inference from the final state (‘indication’) of a process to the value range of a parameter (‘outcome’) in light of theoretical and statistical assumptions. Contrary to contemporary philosophical views, measurement outcomes cannot be obtained by mapping the structure of indications. Instead, measurement outcomes as well as claims to accuracy, error and quantity individuation can only be adjudicated relative to a choice of idealized modelling assumptions.
In the last few decades the role played by models and modeling activities has become a central topic in the scientific enterprise. In particular, it has been highlighted both that the development of models constitutes a crucial step for understanding the world and that the developed models operate as mediators between theories and the world. Such a perspective is exploited here to cope with the issue of whether error-based and uncertainty-based modeling of measurement are incompatible, and thus alternative to one another, as sometimes claimed nowadays. The crucial problem is whether assuming this standpoint implies definitely renouncing a role for truth and the related concepts, particularly accuracy, in measurement. It is argued here that the well-known objections against true values in measurement, which would lead to rejecting the concept of accuracy as non-operational, or to maintaining it as only qualitative, derive from an unclear distinction between three distinct processes: the metrological characterization of measuring systems, their calibration, and finally measurement. Under the hypotheses that (1) the concept of true value is related to the model of a measurement process, (2) the concept of uncertainty is related to the connection between such model and the world, and (3) accuracy is a property of measuring systems (and not of measurement results) and uncertainty is a property of measurement results (and not of measuring systems), not only the compatibility but actually the conjoint need of error-based and uncertainty-based modeling emerges.
Psychologists debate whether mental attributes can be quantified or whether they admit only qualitative comparisons of more and less. Their disagreement is not merely terminological, for it bears upon the permissibility of various statistical techniques. This article contributes to the discussion in two stages. First it explains how temperature, which was originally a qualitative concept, came to occupy its position as an unquestionably quantitative concept (§§1–4). Specifically, it lays out the circumstances in which thermometers, which register quantitative (or cardinal) differences, became distinguishable from thermoscopes, which register merely qualitative (or ordinal) differences. I argue that this distinction became possible thanks to the work of Joseph Black, ca. 1760. Second, the article contends that the model implicit in temperature’s quantitative status offers a better way for thinking about the quantitative status of mental attributes than models from measurement theory (§§5–6).
This book provides an introduction to measurement theory for non-specialists and puts measurement in the social and behavioural sciences on a firm mathematical foundation. Results are applied to such topics as measurement of utility, psychophysical scaling and decision-making about pollution, energy, transportation and health. The results and questions presented should be of interest to both students and practising mathematicians since the author sets forth an area of mathematics unfamiliar to most mathematicians, but which has many potentially significant applications.
The need for quantitative measurement represents a unifying bond that links all the physical, biological, and social sciences. Measurements of such disparate phenomena as subatomic masses, uncertainty, information, and human values share common features whose explication is central to the achievement of foundational work in any particular mathematical science as well as for the development of a coherent philosophy of science. This book presents a theory of measurement, one that is "abstract" in that it is concerned with highly general axiomatizations of empirical and qualitative settings and how these can be represented quantitatively. It was inspired by, and represents a generalization and extension of, the last major research work in this field, Foundations of Measurement Vol. I, by Krantz, Luce, Suppes, and Tversky published in 1971.
Measurement is fundamental to all the sciences, the behavioural and social as well as the physical, and in the latter its results provide our paradigms of 'objective fact'. But the basis and justification of measurement is not well understood and is often simply taken for granted. Henry Kyburg Jr proposes here an original, carefully worked out theory of the foundations of measurement, to show how quantities can be defined, why certain mathematical structures are appropriate to them and what meaning attaches to the results generated. Crucial to his approach is the notion of error: it cannot be eliminated entirely, and from its introduction and control, he argues, arises the very possibility of measurement. Professor Kyburg's approach emphasises the empirical process of making measurements. In developing it he discusses vital questions concerning the general connection between a scientific theory and the results which support it (or fail to).
We consider the problem of measurement using the Lindblad equation, which allows the introduction of time in the interaction between the measured system and the measurement apparatus. We use analytic results, valid for weak system-environment coupling, obtained for a two-level system in contact with a measurer (Markovian interaction) and a thermal bath (non-Markovian interaction), where the measured observable may or may not commute with the system-environment interaction. Analysing the behavior of the coherence, which tends to a value asymptotically close to zero, we obtain an expression for the time of measurement which depends only on the system-measurer coupling, and which does not depend on whether the observable commutes with the system-bath interaction. The behavior of the coherences in the case of strong system-environment coupling, found numerically, indicates that an increase in this coupling decreases the measurement time, thus allowing our expression to be considered the upper limit for the duration of the process.
It is argued that the so-called minimal statistical interpretation of quantum mechanics does not completely resolve the measurement problem in that this view is unable to show that quantum mechanics can dispense with classical physics when it comes to a treatment of the measuring interaction. It is suggested that the view that quantum mechanics applies to individual systems should not be too hastily abandoned, in that this view gives perhaps the best hope of leading to a version of quantum mechanics which does provide a complete solution to the measurement problem.
Ordinary measurement using a standard scale, such as a ruler or a standard set of weights, has two fundamental properties. First, the results are approximate, for example, within 0.1 g. Second, the resulting indistinguishability is transitive, rather than nontransitive, as in the standard psychological comparative judgments without a scale. Qualitative axioms are given for structures having the two properties mentioned. A representation theorem is then proved in terms of upper and lower measures.
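The contrast the abstract draws — transitive indistinguishability under a standard scale versus nontransitive indistinguishability in scale-free comparative judgments — can be illustrated with a small sketch (my own illustration, not from the paper; the function names, the 0.1 graduation, and the 0.1 threshold are assumed for the example):

```python
def scale_reading(x, step=0.1):
    """Reading on a standard scale graduated in `step` units."""
    return round(x / step)

def indist_scale(a, b, step=0.1):
    # Indistinguishable iff both fall in the same graduation cell.
    # Sharing a cell is an equivalence relation, hence transitive.
    return scale_reading(a, step) == scale_reading(b, step)

def indist_threshold(a, b, eps=0.1):
    # Comparative judgment without a scale: indistinguishable iff
    # the difference is below a threshold -- not transitive.
    return abs(a - b) <= eps

a, b, c = 0.30, 0.38, 0.46
# Threshold judgment: a ~ b and b ~ c, yet a and c are distinguishable.
print(indist_threshold(a, b), indist_threshold(b, c), indist_threshold(a, c))
```

The scale-based relation partitions the measured magnitudes into cells, which is why indistinguishability by a ruler or a weight set is transitive, while the threshold relation can chain small undetectable differences into a detectable one.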
We revisit quantum measurement when the apparatus is initially in a mixed state. We find that, in a particular restricted setup, the amount of entanglement between the system and the apparatus is given by the entropy increase of the system under the measurement transformation. We show that the information gained is equal to the amount of entanglement when a perfect measurement is performed. Based on the perfect measurement, we give an upper bound on quantum discord.
First-person data have been both condemned and hailed because of their alleged privacy. Critics argue that science must be based on public evidence: since first-person data are private, they should be banned from science. Apologists reply that first-person data are necessary for understanding the mind: since first-person data are private, scientists must be allowed to use private evidence. I argue that both views rest on a false premise. In psychology and neuroscience, the subjects issuing first-person reports and other sources of first-person data play the epistemic role of a (self-) measuring instrument. Data from measuring instruments are public and can be validated by public methods. Therefore, first-person data are as public as other scientific data: their use in science is legitimate, in accordance with standard scientific methodology.
In this paper we motivate and develop the analytic theory of measurement, in which autonomously specified algebras of quantities (together with the resources of mathematical analysis) are used as a unified mathematical framework for modeling (a) the time-dependent behavior of natural systems, (b) interactions between natural systems and measuring instruments, (c) error and uncertainty in measurement, and (d) the formal propositional language for describing and reasoning about measurement results. We also discuss how a celebrated theorem in analysis, known as Gelfand representation, guarantees that autonomously specified algebras of quantities can be interpreted as algebras of observables on a suitable state space. Such an interpretation is then used to support (i) a realist conception of quantities as objective characteristics of natural systems, and (ii) a realist conception of measurement results (evaluations of quantities) as determined by and descriptive of the states of a target natural system. As a way of motivating the analytic approach to measurement, we begin with a discussion of some serious philosophical and theoretical problems facing the well-known representational theory of measurement. We then explain why we consider the analytic approach, which avoids all these problems, to be far more attractive on both philosophical and theoretical grounds.
The paper presents an argument for treating certain types of computer simulation as having the same epistemic status as experimental measurement. While this may seem a rather counterintuitive view it becomes less so when one looks carefully at the role that models play in experimental activity, particularly measurement. I begin by discussing how models function as “measuring instruments” and go on to examine the ways in which simulation can be said to constitute an experimental activity. By focussing on the connections between models and their various functions, simulation and experiment one can begin to see similarities in the practices associated with each type of activity. Establishing the connections between simulation and particular types of modelling strategies and highlighting the ways in which those strategies are essential features of experimentation allows us to clarify the contexts in which we can legitimately call computer simulation a form of experimental measurement.
Extension is probably the most general natural property. Is it a fundamental property? Leibniz claimed the answer was no, and that the structureless intuition of extension concealed more fundamental properties and relations. This paper follows Leibniz's program through Herbart and Riemann to Grassmann and uses Grassmann's algebra of points to build up levels of extensions algebraically. Finally, the connection between extension and measurement is considered.
Abraham Stone recently has published an argument purporting to show that David Bohm's interpretation of quantum mechanics fails to solve the measurement problem. Stone's analysis is not correct, as he has failed to take account of the conditions under which the theorems he cites are proven. An explicit presentation of a Bohmian measurement illustrates the flaw in his reasoning.
There are two versions of the putative connection between consciousness and the measurement problem of quantum mechanics : consciousness as the cause of state vector reduction, and state vector reduction as the physical basis of consciousness. In this article, these controversial ideas are neither accepted uncritically, nor rejected from the outset in the name of some prejudice about objective knowledge. Instead, their origin is sought in our most cherished (but disputable) beliefs about the place of mind and consciousness in the world. It is first pointed out that these common beliefs about mind and consciousness arise from reification of situated first-person experience. Then, situatedness is shown to be a constitutive part of any exhaustive treatment of quantum measurements. It turns out that the alleged connection between consciousness and the measurement problem is a symptom of (i) the ineliminability of our being situated from the end-product of science, and (ii) our difficulty to express correctly this being situated.
This paper investigates what Newton could have meant in a now famous passage from De Gravitatione (hereafter “DeGrav”) that “space is as it were an emanative effect of God” (21). First I offer a careful examination of the four key passages within DeGrav that bear on this. I argue that the logic of Newton’s argument permits several interpretations (section I). Second I sketch four options: i) one approach associated with the Cambridge Platonist, Henry More, recently investigated by Dana Jalobeanu and Ed Slowik; ii) one traditional neo-Platonic approach; iii) a necessitarian approach associated with Howard Stein’s interpretation, recently reaffirmed by Andrew Janiak; iv) an approach connected with Bacon’s efforts to reformulate a useful notion of form and laws of nature. Hitherto only the first and third options have received scholarly attention. I offer arguments to treat Newtonian emanation as a species of Baconian formal causation and in this way to combine some of the most attractive elements of the first three options (section II). Finally in Section III, I suggest that the recent scholarly focus on emanation has obscured the importance of Newton’s very interesting claims about existence and measurement in the same passage(s).
Kripke has argued that definitions of units of measurements provide examples of statements that are both contingent and a priori. In this paper I argue that definitions of units of measurement are intended to be stipulations of what Kripke calls theoretical identities: a stipulation that two terms will have the same rigid designation. Hence such a definition is both a priori and necessary. The necessity arises because such definitions appeal to natural kind properties only, which on Kripke's account are necessary.
The aim of this essay is to distinguish and analyze several difficulties confronting attempts to reconcile the fundamental quantum mechanical dynamics with Born's rule. It is shown that many of the proposed accounts of measurement fail at least one of the problems. In particular, only collapse theories and hidden variables theories have a chance of succeeding, and, of the latter, the modal interpretations fail. Any real solution demands new physics.
This is a preliminary version of an article to appear in the forthcoming Ashgate Companion to the New Philosophy of Physics. In it, I aim to review, in a way accessible to foundationally interested physicists as well as physics-informed philosophers, just where we have got to in the quest for a solution to the measurement problem. I don't advocate any particular approach to the measurement problem (not here, at any rate!) but I do focus on the importance of decoherence theory to modern attempts to solve the measurement problem, and I am fairly sharply critical of some aspects of the "traditional" formulation of the problem.
Heterophenomenology is a third-person methodology proposed by Daniel Dennett for using first-person reports as scientific evidence. I argue that heterophenomenology can be improved by making six changes: (i) setting aside consciousness, (ii) including other sources of first-person data besides first-person reports, (iii) abandoning agnosticism as to the truth value of the reports in favor of the most plausible assumptions we can make about what can be learned from the data, (iv) interpreting first-person reports (and other first-person behaviors) directly in terms of target mental states rather than in terms of beliefs about them, (v) dropping any residual commitment to incorrigibility of first-person reports, and (vi) recognizing that third-person methodology does have positive effects on scientific practices. When these changes are made, heterophenomenology turns into the self-measurement methodology of first-person data that I have defended in previous papers.
The thesis that numbers are ratios of quantities has recently been advanced by a number of philosophers. While adequate as a definition of the natural numbers, it is not clear that this view suffices for our understanding of the reals. These require continuous quantity and relative to any such quantity an infinite number of additive relations exist. Hence, for any two magnitudes of a continuous quantity there exists no unique ratio. This problem is overcome by defining ratios, and hence real numbers, as binary relations between infinite standard sequences. This definition leads smoothly into a new definition of measurement consonant with the traditional view of measurement as the discovery or estimation of numerical relations. The traditional view is further strengthened by allowing that the additive relations internal to a quantity are distinct from relations observed in the behaviour of objects manifesting quantities. In this way the traditional theory can accommodate the theory of conjoint measurement. This is worth doing because the traditional theory has one great strength lacked by its rivals: measurement statements and quantitative laws are able to be understood literally. This paper was completed in 1990–91 while the author was a visiting scholar at the Irvine Research Unit in Mathematical Behavioral Sciences, University of California, Irvine. The author wishes to thank the Director, Professor R. D. Luce, for the generous provision of space and facilities within the Unit and for his critical comments upon some of the ideas expressed herein; Professor L. Narens, for his trenchant criticisms; and the University of Sydney, for granting Special Study Leave and financial assistance to make the visit possible.
In this article we discuss the ethical dilemmas facing performance evaluators and the "evaluatees" whose performances are measured in a business context. The concepts of role morality and common morality are used to develop a framework of behaviors that are normally seen as the moral responsibilities of these actors. This framework is used to analyze, based on four empirical situations, why the implementation of a performance measurement system has not been as effective as expected. It was concluded that, in these four cases, unethical behavior (i.e. deviations from the ethical behaviors identified in the framework) provided, at least to some extent, an explanation for the lower than expected effectiveness of the performance measurement procedures. At the end of the paper we present an agenda for further research through which the framework could be further developed and systematically applied to a broader set of cases.
A structural analogy is pointed out between a hermeneutically developed phenomenological description, based on Husserl, of the process of perceptual cognition on the one hand and quantum mechanical measurement on the other hand. In Husserl's analytic phase of the cognition process, the 'intentionality-structure' of the subject/object union prior to predication of a local object is an entangled symmetry-making state, and this entanglement is broken in the synthetic phase when the particular local object is constituted under the influence of an iota ('inner horizon') and the 'facticity' of the local world ('outer horizon'). Replacing 'perceptual cognition' by 'measurement' and 'subject' by 'expert subject using a measuring device' the analogy of a formal quantum structure is extended to the conscious structure of all empirical cognition. This is laid out in three theses: about perception, about classical measurement, and about quantum measurement. The results point to the need for research into the quantum structure of the physical embodiment of human cognition.
The decision-theoretic account of probability in the Everett or many-worlds interpretation, advanced by David Deutsch and David Wallace, is shown to be circular. Talk of probability in Everett presumes the existence of a preferred basis to identify measurement outcomes for the probabilities to range over. But the existence of a preferred basis can only be established by the process of decoherence, which is itself probabilistic.
The aim of this paper is to give a systematic account of the so-called “measurement problem” in the frame of the standard interpretation of quantum mechanics. It is argued that there is not one but five distinct formulations of this problem. Each of them depends on what is assumed to be a “satisfactory” description of the measurement process in the frame of the standard interpretation. Moreover, the paper points out that each of these formulations refers not to a unique problem, but to a set of sub-problems.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.
The transition from the traditional to the representational theory of measurement around the turn of the century was accompanied by little sustained criticism of the former. The most forceful critique was Bertrand Russell's 1897 Mind paper, On the relations of number and quantity. The traditional theory has it that real numbers unfold from the concept of continuous quantity. Russell's critique identified two serious problems for this theory: (1) can magnitudes of a continuous quantity be defined without infinite regress; and (2) can additive relations between such magnitudes exist if magnitudes are not divisible? The present paper shows how the traditional theory answers these questions and compares the traditional and representational theories as contributions to our understanding of the logic of application.
For nearly six decades, the conscious observer has played a central and essential rôle in quantum measurement theory. I outline some difficulties which the traditional account of measurement presents for material theories of mind before introducing a new development which promises to exorcise the ghost of consciousness from physics and relieve the cognitive scientist of the burden of explaining why certain material structures reduce wavefunctions by virtue of being conscious while others do not. The interactive decoherence of complex quantum systems reveals that the oddities and complexities of linear superposition and state vector reduction are irrelevant to computational aspects of the philosophy of mind and that many conclusions in related fields are ill founded.
The Representational Theory of Measurement conceives measurement as establishing homomorphisms from empirical relational structures into numerical relational structures, called models. There are two different approaches to deal with the justification of a model: an axiomatic and an empirical approach. The axiomatic approach verifies whether a given relational structure satisfies certain axioms to secure homomorphic mapping. The empirical approach conceives models to function as measuring instruments by transferring observations of an economic system into quantitative facts about that system. These facts are evaluated by their accuracy and precision. Precision is achieved by least squares methods and accuracy by calibration. For calibration, standards are needed. Then two strategies can be distinguished. One aims at estimating the invariant (structural) equations of the system. The other strategy is to use known stable facts about the system to adjust the model parameters. For this strategy, the requirement of models as homomorphic mappings has been dropped.
In this paper I defend mathematical nominalism by arguing that any reasonable account of scientific theories and scientific practice must make explicit the empirical non-mathematical grounds on which the application of mathematics is based. Once this is done, references to mathematical entities may be eliminated or explained away in terms of underlying empirical conditions. I provide evidence for this conclusion by presenting a detailed study of the applicability of mathematics to measurement. This study shows that mathematical nominalism may be regarded as a methodological approach to applicability, illuminating the use of mathematics in science.
This paper presents a novel semantic analysis of unit names (like pound and meter) and gradable adjectives (like tall, short and happy), inspired by measurement theory (Krantz et al. In Foundations of Measurement: Additive and Polynomial Representations, 1971). Based on measurement theory's four-way typology of measures, I claim that different adjectives are associated with different types of measures whose special characteristics, together with features of the relations denoted by unit names, explain the puzzling limited distribution of measure phrases, as well as unit-based comparisons between predicates (as in the table is longer than it is wide). All considered, my analyses support the view that the grammar of natural languages is sensitive to features of measurement theory.
The London and Bauer monograph occupies a central place in the debate concerning the quantum measurement problem. Gavroglu has previously noted the influence of Husserlian phenomenology on London's scientific work. However, he has not explored the full extent of this influence in the monograph itself. I begin this paper by outlining the important role played by the monograph in the debate. In effect, it acted as a kind of 'lens' through which the standard, or Copenhagen, 'solution' to the measurement problem came to be perceived and, as such, it was robustly criticized, most notably by Putnam and Shimony. I then spell out the Husserlian understanding of consciousness in order to illuminate the traces of this understanding within the London and Bauer text. This, in turn, yields a new perspective on this 'solution' to the measurement problem, one that I believe has not been articulated before and, furthermore, which is immune to the criticisms of Putnam and Shimony.
I examine recent arguments based on functionalism that claim to show that Bohm's theory fails to solve the measurement problem, or if it does so, it is only because it reduces to a form of the many-worlds theory. While these arguments reveal some interesting features of Bohm's theory, I contend that they do not undermine the distinctive Bohmian solution to the measurement problem.
The concept of measurement is fundamental to a whole range of different disciplines, including not only the natural and engineering sciences, but also laboratory medicine and certain branches of the social sciences. This being the case, the concept of measurement has a particular relevance to the development of top-level ontologies in the area of knowledge engineering. For this reason, the present paper is concerned with ontological aspects of measurement. We are searching for a list of concepts that are apt to characterize measurement methods in a general manner. To establish such means of characterization, we will primarily deal with the semantics of measurement values.
In the first section of this paper I review Measurement Theoretic Semantics – an approach to formal semantics modeled after the application of numbers in measurement, e.g., of length. In the second section it is argued that the measurement theoretic approach to semantics yields a novel, useful conception of propositions. In the third section the measurement theoretic view of propositions is compared with major other accounts of propositional content.
Newton characterizes the reasoning of Principia Mathematica as geometrical. He emulates classical geometry by displaying, in diagrams, the objects of his reasoning and comparisons between them. Examination of Newton's unpublished texts (and the views of his mentor, Isaac Barrow) shows that Newton conceives geometry as the science of measurement. On this view, all measurement ultimately involves the literal juxtaposition (the putting-together in space) of the item to be measured with a measure, whose dimensions serve as the standard of reference, so that all quantity (which is what measurement makes known) is ultimately related to spatial extension. I use this conception of Newton's project to explain the organization and proofs of the first theorems of mechanics to appear in the Principia (beginning in Sect. 2 of Book I). The placement of Kepler's rule of areas as the first proposition, and the manner in which Newton proves it, appear natural on the supposition that Newton seeks a measure, in the sense of a moveable spatial quantity, of time. I argue that Newton proceeds in this way so that his reasoning can have the ostensive certainty of geometry.
Non-collapse theories of quantum mechanics have the peculiar characteristic that, although their measurements produce definite results, their state vectors remain in a superposition of possible outcomes. David Albert has used this fact to show that the standard uncertainty relations can be violated if self-measurements are made. Bradley Monton, however, has held that Albert has not been careful enough in his treatment of self-measurement and that being more careful (considering mental state supervenience) implies no violation of the relations. In this paper, I will outline both Albert's proposal and Monton's objections. Then, I will show how the uncertainty relations can be violated after all (even after being as careful as Monton). Finally, I will discuss how finding a way around the objections allows us to learn more about what is and what is not possible in non-collapse theories of quantum mechanics.
This research examines business and psychology students’ attitude toward unethical behavior (measured at Time 1) and their propensity to engage in unethical behavior (measured at Time 1 and at Time 2, 4 weeks later) using a 15-item Unethical Behavior measure with five Factors: Abuse Resources, Not Whistle Blowing, Theft, Corruption, and Deception. Results suggested that male students had stronger unethical attitudes and had higher propensity to engage in unethical behavior than female students. Attitude at Time 1 predicted Propensity at Time 1 accurately for all five factors (concurrent validity): If students consider it to be unethical, then they are less likely to engage in that unethical behavior. Attitude at Time 1 predicted only Factor Abuse Resources for Propensity at Time 2. Propensity at Time 1 was significantly related to Propensity at Time 2. Attitude at Time 1, Propensity at Time 1, and Propensity at Time 2 had achieved configural and metric measurement invariance across major (business vs. psychology). Thus, researchers may have confidence in using these measures in future research.
When classical mechanics is seen as the short-wavelength limit of quantum mechanics (i.e., as the limit of geometrical optics), it becomes clear just how serious and all-pervasive the measurement problem is. This formulation also leads us into the Bohm theory. But this theory has drawbacks: its nonuniqueness, in particular, and its nonlocality. I argue that these both reflect an underlying problem concerning information, which is actually a deeper version of the measurement problem itself.
We investigate the thesis of Aharonov, Bergmann, and Lebowitz that time-symmetry holds in ensembles defined by both an initial and a final condition, called pre- and post-selected ensembles. We distinguish two senses of time symmetry and show that the first one, concerning forward directed and time reversed measurements, holds if the measurement process is ideal, but fails if the measurement process is non-ideal, i.e., violates Lüders's rule. The second kind of time symmetry, concerning the interchange of initial and final conditions, fails even in the case of ideal measurements. Bayes's theorem is used as a primary tool for calculating the relevant probabilities. We are critical of the concept that a pair of vectors in Hilbert space, characterizing the initial and final conditions, can be considered to constitute a generalized quantum state.
It has been argued, partly from the lack of any widely accepted solution to the measurement problem, and partly from recent results from quantum information theory, that measurement in quantum theory is best treated as a black box. However, there is a crucial difference between ‘having no account of measurement' and ‘having no solution to the measurement problem'. We know a lot about measurements. Taking into account this knowledge sheds light on quantum theory as a theory of information and computation. In particular, the scheme of ‘one-way quantum computation' takes on a new character in light of the role that reference frames play in actually carrying out any one-way quantum computation.
The integration of recent work on decoherence into a so-called modal interpretation offers a promising new approach to the measurement problem in quantum mechanics. In this paper I explain and develop this approach in the context of the interactive interpretation presented in Healey (1989). I begin by questioning a number of assumptions which are standardly made in setting up the measurement problem, and I conclude that no satisfactory solution can afford to ignore the influence of the environment. Further, I argue that there are good reasons to believe that on a modal interpretation environmental interactions rapidly ensure that a quantum-mechanically describable apparatus indeed records a definite result following a measurement interaction.
In a recent article, Wiseman has proposed the use of so-called weak measurements for the determination of the velocity of a quantum particle at a given position, and has shown that according to quantum mechanics the result of such a procedure is the Bohmian velocity of the particle. Although Bohmian mechanics is empirically equivalent to variants based on velocity formulas different from the Bohmian one, and although it has been proven that the velocity in Bohmian mechanics is not measurable, we argue here for the somewhat paradoxical conclusion that Wiseman’s weak measurement procedure indeed constitutes a genuine measurement of velocity in Bohmian mechanics. We reconcile the apparent contradictions and elaborate on some of the different senses of measurement at play here.
Modifications of current theories of ordinal, interval and extensive measurement are presented, which aim to accommodate the empirical fact that perfectly exact measurement is not possible (which is inconsistent with current theories). The modification consists in dropping the assumption that equality (in measure) is observable, but continuing to assume that inequality (greater or lesser) can be observed. The modifications are formulated mathematically, and the central problems of formal measurement theory--the existence and uniqueness of numerical measures consistent with data--are re-examined. Some results also are given on a problem which does not arise in current theories: namely that of determining limits of accuracy attainable on the basis of observations.
Hale proposes a neo-logicist definition of real numbers by abstraction as ratios defined on a complete ordered domain of quantities (magnitudes). I argue that Hale's definition faces insuperable epistemological and ontological difficulties. On the epistemological side, Hale is committed to an explanation of measurement applications of reals which conflicts with several theorems in measurement theory. On the ontological side, Hale commits himself to the necessary and a priori existence of at least one complete ordered domain of quantities, which is extremely implausible because science treats the logical structure of quantities as subject to experimentally and theoretically motivated refinements and revisions.
The metrology literature neglects a strong empirical measurement tradition in economics, which is different from the traditions as accounted for by the formalist representational theory of measurement. This empirical tradition comes closest to Mari's characterization of measurement in which he describes measurement results as informationally adequate to given goals. In economics, one has to deal with soft systems, which induces problems of invariance and of self-awareness. It will be shown that in the empirical economic measurement tradition both problems have been on the agenda for a long while, and that the proposed solutions to these problems provide clues for the directions in which one could develop a measurement theory that takes account of soft systems.
This paper examines some aspects of the grammar of measurement based on data from non-split and split measure phrase (MP) constructions in Japanese. I claim that the non-split MP construction involves measurement of individuals, while the split MP construction involves measurement of events as well as of individuals. This claim is based on the observation that, while both constructions are subject to some semantic restrictions in the nominal domain, only the split MP construction is sensitive to restrictions in the verbal domain (namely, incompatibility with single-occurrence events and with individual-level predicates, and (un)availability of collective readings). It is shown that these semantic restrictions can be explained by a uniform semantic constraint on the measure function, namely, Schwarzschild’s [(2002). The grammar of measurement. The Proceedings of Semantics and Linguistic Theory, 24, 241–306] monotonicity constraint. In particular, I argue that, in the two constructions at issue, the measure function is subject to the monotonicity constraint, and that we observe different semantic restrictions depending on whether the measure function applies to a nominal or a verbal domain.
This work examines whether the environmentally-induced decoherence approach in quantum mechanics brings us any closer to solving the measurement problem, and whether it contributes to the elimination of subjectivism in quantum theory. A distinction is made between 'collapse' and 'decoherence', so that an explanation for decoherence does not imply an explanation for collapse. After an overview of the measurement problem and of the open-systems paradigm, we argue that taking a partial trace is equivalent to applying the projection postulate. A criticism of Zurek's decoherence approach to measurements is also made, based on the restriction that he must impose on the interaction between apparatus and environment. We then analyze the element of subjectivity involved in establishing the boundary between system and environment, and criticize the incorporation of Everett's branching of memory records into the decoherence research program. Sticking to this program, we end by sketching a proposal for ‘environmentally-induced collapse’.
An appropriate characterization of property types is an important topic for measurement science. On the basis of a set-theoretic model of evaluation and measurement processes, the paper introduces the operative concept of property evaluation type, and discusses how property types are related to, and in fact can be derived from, property evaluation types, by finally analyzing the consequences of these distinctions for the concepts of ‘property’ used in the International Vocabulary of Metrology – Basic and General Concepts and Associated Terms (VIM3).
The notion of measurement plays a central role in human cognition. We measure people’s height, the weight of physical objects, the length of stretches of time, or the size of various collections of individuals. Measurements of height, weight, and the like are commonly thought of as mappings between objects and dense scales, while measurements of collections of individuals, as implemented for instance in counting, are assumed to involve discrete scales. It is also commonly assumed that natural language makes use of both types of scales and subsequently distinguishes between two types of measurements. This paper argues against the latter assumption. It argues that natural language semantics treats all measurements uniformly as mappings from objects (individuals or collections of individuals) to dense scales, hence the Universal Density of Measurement (UDM). If the arguments are successful, there are a variety of consequences for semantics and pragmatics, and more generally for the place of the linguistic system within an overall architecture of cognition.
This paper expands on, and provides a qualified defence of, Arthur Fine's selective interactions solution to the measurement problem. Fine's approach must be understood against the background of the insolubility proof of the quantum measurement. I first defend the proof as an appropriate formal representation of the quantum measurement problem. The nature of selective interactions, and more generally selections, is then clarified, and three arguments in their favour are offered. First, selections provide the only known solution to the measurement problem that does not relinquish any of the explicit premises of the insolubility proofs. Second, unlike some no-collapse interpretations of quantum mechanics, selections suffer no difficulties with non-ideal measurements. Third, unlike most collapse interpretations, selections can be independently motivated by an appeal to quantum propensities. Outline: 1 Introduction; 2 The problem of quantum measurement (2.1 The ignorance interpretation of mixtures; 2.2 The eigenstate–eigenvalue link; 2.3 The quantum theory of measurement); 3 The insolubility proof of the quantum measurement (3.1 Some notation; 3.2 The transfer of probability condition (TPC); 3.3 The occurrence of outcomes condition (OOC)); 4 A defence of the insolubility proof (4.1 Stein's critique; 4.2 Ignorance is not required; 4.3 The problem of quantum measurement is an idealisation); 5 Selections (5.1 Representing dispositional properties; 5.2 Selections solve the measurement problem; 5.3 Selections and ignorance); 6 Non-ideal selections (6.1 No-collapse interpretations and non-ideal measurements; 6.2 Exact and approximate measurements; 6.3 Selections for non-ideal interactions; 6.4 Approximate selections; 6.5 Implications for ignorance); 7 Selective interactions test quantum propensities (7.1 Equivalence classes as physical ‘aspects’: a critique; 7.2 Quantum dispositions; 7.3 Selections as a propensity modal interpretation; 7.4 A comparison with Popper's propensity interpretation).
A resolution of the quantum measurement problem would require one to explain how it is that we end up with determinate records at the end of our measurements. Metaphysical commitments typically do real work in such an explanation. Indeed, one should not be satisfied with one's metaphysical commitments unless one can provide some account of determinate measurement records. I will explain some of the problems in getting determinate records in relativistic quantum field theory and pay particular attention to the relationship between the measurement problem and a generalized version of Malament's theorem.
Customary discussions of quantum measurements are unrealistic, in the sense that they do not reflect what happens in most actual measurements even under ideal circumstances. Even theories of measurement which discard the projection postulate tend to retain two unrealistic assumptions of the von Neumann theory: that a measurement consists of a single physical interaction, and that the topic of every measurement is information wholly contained in the quantum state of the object of measurement. I suggest that these unrealistic assumptions originate from an overly literal interpretation of the operator formalism of quantum mechanics. I also suggest, following Park, that some issues can be clarified by distinguishing the sense of the term 'measurement' occurring in the quantum-mechanical operator formalism, and the sense of 'measurement' that refers to actual processes of gaining information about the physical world.
In the first two sections I present and motivate a formal semantics program that is modeled after the application of numbers in measurement (e.g., of length). Then, in the main part of the paper, I use the suggested framework to give an account of the semantics of necessity and possibility: (i) I show that the measurement theoretic framework is consistent with a robust (non-Quinean) view of modal logic, (ii) I give an account of the semantics of the modal notions within this framework, and (iii) I defend the suggested account against various objections.
This paper offers a new concept of the firm that aims at balancing the corporate economic, social, and environmental responsibilities and goes beyond the stakeholder approach. It intends to provide a conceptual and operationalizable basis to fairly assess corporate conduct from both inside and outside the companies. To a large extent these different responsibilities may overlap and reinforce each other. However, if they conflict, they should be clearly evaluated for their own sake and in terms of wealth creation. Only then can a balanced approach be realized. Section 1 briefly discusses some general aspects of the relationship between concepts and measurement. In Section 2, a concept of the firm is developed that is based on the notion of responsibility and balances economic, social and environmental responsibilities. According to these concepts, different ways of measuring corporate planning and performance are examined in Section 3, followed up by a summary and conclusions.
The relation between two systems of attitude ascription that capture all the empirically significant aspects of an agent's thought and speech may be analogous to that between two systems of magnitude ascription that are equivalent relative to a transformation of scale. If so, just as an object's weighing eight pounds doesn't relate that object to the number eight (for a different but equally good scale would use a different number), similarly an agent's believing that P need not relate her to P (for a different but equally adequate interpretive scheme could use a different proposition). In either case the only reality picked out by any system of ascription is what is common to all equivalent rivals. By emphasizing some contrasts between decision theory and belief-desire psychology, it is argued that if attitude ascription is appropriately analogous to measurement then not only is being related to a proposition an artifact of the system of representation chosen, so are belief and desire.
Extensive measurement theory is developed in terms of the ratio of two elements of an arbitrary (not necessarily Archimedean) extensive structure; this extensive ratio space is a special case of a more general structure called a ratio space. Ratio spaces possess a natural family of numerical scales (r-scales) which are definable in non-representational terms; the r-scales for an extensive ratio space thus constitute a family of numerical scales (extensive r-scales) for extensive structures which are defined in a non-representational manner. This is interpreted as involving a relational theory of quantity which contrasts in certain respects with the qualitative theory of quantity implicit in standard representational extensive measurement theory. The representational properties of extensive r-scales are investigated, and found to coincide with weak extensive measurement in the sense of Holman. This provides support for the thesis (developed in a separate paper) that weak extensive measurement is a more natural model of actual physical extensive scales than is the standard model using strong extensive measurement. Finally, the present apparatus is applied to slightly simplify the existing necessary and sufficient conditions for strong extensive measurement.
Corporate sustainability performance measurement systems (SPMS) have been the subject of a growing amount of research. However, there are many challenges and opportunities associated with the design, implementation, use, and evolution of these systems that have yet to be addressed. The purpose of this article is to identify future directions for research in the design, implementation, use, and evolution of corporate SPMS. A concise review of key literature published between 2000 and 2010 is presented. The literature review focuses on research conducted at both the individual corporation- and sector-levels. The review of published literature provides a basis for the identification of a structured set of 65 key research questions to guide future work. The research questions will be of interest to both practitioners and researchers in corporate sustainability performance measurement.
Decoherence results from the dissipative interaction between a quantum system and its environment. As the system and environment become entangled, the reduced density operator describing the system "decoheres" into a mixture (with the interference terms damped out). This formal result prompts some to exclaim that the measurement problem is solved. I will scrutinize this claim by examining how modal and relative-state interpretations can use decoherence. Although decoherence cannot rescue these interpretations from general metaphysical difficulties, decoherence may help these interpretations to pick out a preferred basis. I will explore whether decoherence solves nagging technical problems associated with selecting a preferred basis.
In the following article, we propose to show that following the general verificationist epistemic programme (its demand that the truth of our judgments be verifiable), the analysis of measurement on the one hand, and the classical positivist analysis of common-sense observation on the other, do not lead to the same conclusions. This is especially important, because the differences in conclusions concern the positivist theory/observation distinction. In particular, the analysis of measurement does not fully support this distinction. This fact might have important consequences for the problem of scientific realism and related ontological and epistemological problems in the philosophy of science.
To meet the growing demands for yet more efficient and safer traffic, traffic control is deployed in all modes of transportation. In maritime transportation, traffic control is performed by Vessel Traffic Services (VTS). This paper describes research focused on the measurement of VTS operator performance. The concept of situation awareness is introduced as a means to describe and quantify VTS operator performance. Situation awareness is tested in a full-scale interactive simulator. A scoring system for VTS operator performance accounts for the specific characteristics of VTS operator work, such as accuracy and relevance of information.
Axiomatizations of measurement systems usually require an axiom--called an Archimedean axiom--that allows quantities to be compared. This type of axiom has a different form from the other measurement axioms, and cannot--except in the most trivial cases--be empirically verified. In this paper, representation theorems for extensive measurement structures without Archimedean axioms are given. Such structures are represented in measurement spaces that are generalizations of the real number system. Furthermore, a precise description of "Archimedean axioms" is given and it is shown that in all interesting cases "Archimedean axioms" are independent of other measurement axioms.
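The abstract names the Archimedean axiom without stating it. As an illustrative sketch only (exact formulations vary across axiomatizations; Krantz et al., for instance, state it via standard sequences), one common form for a positive extensive structure is:

```latex
% Hedged illustration, not the paper's own formulation: for a positive
% extensive structure $\langle A, \succsim, \circ \rangle$, writing $na$
% for the $n$-fold concatenation $a \circ a \circ \cdots \circ a$,
\[
  \forall a, b \in A \;\; \exists n \in \mathbb{N} \;:\; na \succsim b .
\]
% Informally: enough copies of any quantity eventually exceed any other,
% which is what allows arbitrary quantities to be compared on one real
% scale. As the abstract notes, the existential quantifier over all of
% $\mathbb{N}$ is what resists empirical verification.
```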
Eino Kaila's thought occupies a curious position within the logical empiricist movement. Along with Hans Reichenbach, Herbert Feigl, and the early Moritz Schlick, Kaila advocates a realist approach towards science and the project of a “scientific world conception”. This realist approach was chiefly directed at both Kantianism and Poincaréan conventionalism. The case in point was the theory of measurement. According to Kaila, the foundations of physical reality are characterized by the existence of invariant systems of relations, which he called structures. In a certain sense, these invariant structures, he maintained, are constituted in the act of measuring. By “constitution”, however, Kaila meant neither the dependency of the objects of measurement on a priori concepts (or Kantian categories) nor their being effected by conventional stipulations in a Poincaréan sense. He held that invariant structures are, quite literally, real: they exist prior to and independently of our theoretical capacity. In executing measurements, we detect invariant structures, which are objectively determinable by laws of nature.
One of the major roadblocks in conducting Environmental Corporate Social Responsibility (ECSR) research is operationalization of the construct. Existing ECSR measurement tools either require primary data gathering or special subscriptions to proprietary databases that have limited replicability. We address this deficiency by developing a transparent ECSR measure, with an explicit coding scheme, that strictly relies on publicly available data. Our ECSR measure tests favorably for internal consistency and inter-rater reliability, as well as convergent and discriminant validity.
Quantum mechanics has sometimes been taken to be an empiricist (vs. realist) theory. I state the empiricist's argument, then outline a recently noticed type of measurement--protective measurement--that affords a good reply for the realist. This paper is a reply to scientific empiricism (about quantum mechanics), but is neither a refutation of that position, nor an argument in favor of scientific realism. Rather, my aim is to place realism and empiricism on an even score with regard to quantum theory.
It is widely accepted that one of the main objectives of government expenditure on health care is to generate health. Since health is a function of both length of life and quality of life, the quality-adjusted life-year (QALY) has been developed in an attempt to combine the value of these attributes into a single index number. The QALY approach – and particularly the decision rule that health care resources should be allocated so as to maximise the number of QALYs generated – has often been equated with the utilitarian philosophy of maximising 'the greatest happiness of the greatest number'. This paper considers the extent to which the measurement and aggregation of QALYs really is utilitarian by developing a new taxonomy in order to classify utilitarianism and the different aspects of the QALY approach. It is shown that the measurement of QALYs is consistent with a number of different moral positions and that QALYs do not have to be aggregated according to the maximisation rule. Therefore it is inappropriate to necessarily equate QALYs with utilitarianism. It is shown that much turns on what in principle the QALY represents and how in practice it can be operationalised. The paper highlights the category confusion that is often present here and suggests possible avenues for future theoretical and empirical research.
The attribution of beliefs and other propositional attitudes is best understood as a form of measurement, however counter-intuitive this may seem. Measurement theory does not require that the thing measured should be a magnitude, or that the calibration of the measuring instrument should be numerical. It only requires a homomorphism between the represented domain and the representing domain. On this basis, maps measure parts of the world, usually geographical locations, and 'belief' statements measure other parts of the world, namely people's aptitudes. Having outlined an argument for this view, I deal with an obvious objection to it: that self-attribution of belief cannot be an exercise in measurement, because we are all aware, from introspection, that our beliefs have an intrinsically semantic form. Subsequently, I turn to the philosophical and methodological ramifications of the measurement theoretic view. I argue, first, that it undermines at least one version of constructivism and, second, that it provides an effective alternative to the residually Cartesian philosophy that underpins much qualitative research. Like other anti-Cartesian strategies, belief-attribution-as-measurement implies that the objective world is far more knowable than the subjective one, and that reality is ontologically prior to meaning. I regard this result as both plausible and welcome.
The axiomatic approaches of quantum mechanics and relativity theory are compared with approaches in which the theories are thought to describe readings of certain measurement operations. The usual axioms are shown to correspond with classes of ideal measurements. The necessity of generalizing the formalisms of both quantum mechanics and relativity theory so as to encompass more realistic, nonideal measurements is discussed. It is argued that this generalization favours an empiricist interpretation of the mathematical formalisms over a realist one.
The measurement of force is based on a formal law of additivity, which characterizes the effects of two or more configurations on the equilibrium of a material point. The representing vectors (resultant forces) are additive over configurations. The existence of a tight interrelation between the force vector and the geometric space, in which motion is described, depends on observations of partial (directional) equilibria; an axiomatization of this interrelation yields a proof of part two of Newton's second law of motion. The present results (which were derived from a curious and deep isomorphism between force measurement and trichromatic color measurement) yield a kind of subunit, which needs to be incorporated into more complete axiomatizations of mechanics that would fulfill the Mach-Kirchhoff program.
Philosophical debate on the measurement problem of quantum mechanics has, for the most part, been confined to the non-relativistic version of the theory. Quantum field theory, the result of making quantum mechanics relativistic, yields a conceptual framework capable of dealing with the creation and annihilation of an indefinite number of particles in interaction with fields, i.e. quantum systems with an infinite number of degrees of freedom. I show that a solution to the standard measurement problem is available if we exploit the properties of the infinite quantum models available in this broader conceptual framework.
This paper defends a realist interpretation of theories and a modest realism concerning the existence of quantities as providing the best account both of the logic of quantity concepts and of scientific measurement practices. Various operationist analyses of measurement are shown to be inadequate accounts of measurement practices used by scientists. We argue, furthermore, that appeals to implicit definitions to provide meaning for theoretical terms over and above operational definitions fail because implicit definitions cannot generate the requisite descriptive content. The special case of establishing a temperature scale is examined to show that nonrealist accounts fail to provide insight into the theoretical connections that scientific laws postulate to hold among quantities.
This study investigates measurement invariance of the 17-item-4-factor Love of Money Scale (LOMS) (Rich, Motivator, Success, and Important) across gender and college major among university students in People’s Republic of China. Results revealed configural (factor structures) invariance across gender. Metric (factor loadings) invariance across gender was not achieved based on chi-square change, but was achieved based on the change in fit indices between unconstrained and constrained multi-group confirmatory factor analysis (MGCFA). Both configural invariance and metric invariance (chi-square change and fit indices change) were achieved across college major (law, sociology, and political science). Results of this study suggest that the Love of Money Scale, developed in the U.S., has achieved measurement invariance in this student sample in China. Future researchers will have some confidence in using this measure when they examine the love of money in Chinese management and organizational studies.
This paper explores the circumstances that influence whether managers in the public services manipulate the measurement information that is used to assess performance; and if they do, what level of deception they might use. The realistic evaluation approach is adopted. A Delphi survey and the collection of critical incidents through interviews are used to identify possible configurations of contexts–mechanisms–outcomes that provide possible explanations of information manipulation. A number of these configurations are discussed. In a later stage of the project these configurations will be further tested through another Delphi survey, with the intention of developing proposals for improved governance of performance measurement systems in the public services.
In Part I we saw that the works of Helmholtz, Hölder, Campbell and Stevens contain the main ingredients for the analysis of the conditions which make (fundamental) measurement possible, but, so to speak, that what is lacking in the work of the first three is to be found in the work of the last, and vice versa. The first tradition focuses on the conditions that an empirical qualitative system must satisfy in order to be numerically representable, but pays no attention to the relation between possible different representations. The second tradition focuses on the study of scale types and the mathematical properties of the transformations that characterize the scales, but says nothing about the empirical facts these scales represent and the nature of such representation. These two lines of research therefore need to be appropriately integrated. In this Part II, we shall see how this integration is brought about in the foundational work of Suppes, the extensions and modifications which are generated around this work and the mature theory which results from all of this.
This work examines whether the environmentally-induced decoherence approach in quantum mechanics brings us any closer to solving the measurement problem, and whether it contributes to the elimination of subjectivism in quantum theory. A distinction is made between 'collapse' and 'decoherence', so that an explanation for decoherence does not imply an explanation for collapse. After an overview of the measurement problem and of the open-systems paradigm, we argue that taking a partial trace is equivalent to applying the projection postulate. A criticism of Zurek's decoherence approach to measurements is also made, based on the restriction that he must impose on the interaction between apparatus and environment. We then analyze the element of subjectivity involved in establishing the boundary between system and environment, and criticize the incorporation of Everett's branching of memory records into the decoherence research program. Sticking to this program, we end by sketching a proposal for 'environmentally-induced collapse'.
Community Development Finance Institutions (CDFIs) are publicly funded organisations that provide small loans to people in financially underserved areas of the UK. Policy makers have repeatedly sought to understand and measure the performance of CDFIs to ensure the efficient use of public funds, but have struggled to identify an appropriate way of doing so. In this article, we empirically derive a framework that measures the performance of CDFIs through an analysis of their stakeholder relationships. Based on qualitative data from 20 English CDFIs, we develop a typology of CDFIs according to three dimensions: organisational structure, type of lending and type of market served. Following on from this, we derive several propositions that consider how these dimensions relate to the financial and social performance of CDFIs, and provide the basis for a performance measurement framework.
This note suggests that the exercise of measuring poverty in a society is greatly aided by clarity on precisely what one means by “the extent of poverty”. The latter concept may refer to the extent of poverty normalized for population size, or to the extent of poverty not so normalized. Absence of clarity on this distinction – which is both simple and non-trivial – could lead to rather straightforward problems of coherence and consistency in the measurement of poverty.
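The note's distinction between normalized and non-normalized "extent of poverty" can be made concrete with a small sketch (the data and function names here are illustrative inventions, not from the note itself): the headcount counts poor persons, while the headcount ratio normalizes that count by population size, and the two can rank societies in opposite orders.

```python
# Illustrative sketch of the note's distinction; the societies and the
# poverty line below are hypothetical, chosen only to show that the two
# notions of "extent of poverty" can disagree.

def headcount(incomes, poverty_line):
    """Non-normalized extent: the number of poor persons."""
    return sum(1 for y in incomes if y < poverty_line)

def headcount_ratio(incomes, poverty_line):
    """Normalized extent: the proportion of the population that is poor."""
    return headcount(incomes, poverty_line) / len(incomes)

# Hypothetical societies A and B, with poverty line z = 10.
society_a = [5, 8, 12, 20]                    # 2 poor out of 4
society_b = [5, 8, 9, 12, 20, 25, 30, 40]     # 3 poor out of 8

# B contains more poor persons than A (3 > 2), yet A has the higher
# poverty ratio (0.5 > 0.375): the two readings of "the extent of
# poverty" rank the societies differently.
print(headcount(society_a, 10), headcount_ratio(society_a, 10))  # 2 0.5
print(headcount(society_b, 10), headcount_ratio(society_b, 10))  # 3 0.375
```

This is precisely the coherence problem the note flags: a claim that "poverty is greater in B than in A" is ambiguous until one says which of the two measures is intended.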