This article analyzes the implications of protective measurement for the meaning of the wave function. According to protective measurement, a charged quantum system has mass and charge density proportional to the modulus square of its wave function. It is shown that the mass and charge density is not real but effective, formed by the ergodic motion of a localized particle with the total mass and charge of the system. Moreover, it is argued that the ergodic motion is not continuous but discontinuous and random. This result suggests a new interpretation of the wave function, according to which the wave function is a description of the random discontinuous motion of particles, and the modulus square of the wave function gives the probability density of the particles being in certain locations. It is shown that the suggested interpretation of the wave function disfavors the de Broglie-Bohm theory and the many-worlds interpretation but favors the dynamical collapse theories, and the random discontinuous motion of particles may provide an appropriate random source to collapse the wave function.
We investigate the implications of protective measurement for de Broglie-Bohm theory, mainly focusing on the interpretation of the wave function. It has been argued that the de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions. But this premise turns out to be wrong according to protective measurement: a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. Then in the de Broglie-Bohm theory both the Ψ-field and the Bohmian particle will have a charge density distribution for a charged quantum system. This will result in the existence of an electrostatic self-interaction of the field and an electromagnetic interaction between the field and the Bohmian particle, which not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Therefore, the de Broglie-Bohm theory as a realistic interpretation of quantum mechanics is problematic according to protective measurement. Lastly, we briefly discuss the possibility that the wave function is not a physical field but a description of some sort of ergodic motion (e.g. random discontinuous motion) of particles.
An analysis of the two routes through which one may disentangle a quantum system from a measuring apparatus, and hence protect the state vector of a single quantum system from being disturbed by the measurement, reveals several loopholes in the argument from protective measurement to the reality of the state vector of a single quantum system.
Quantum mechanics has sometimes been taken to be an empiricist (vs. realist) theory. I state the empiricist's argument, then outline a recently noticed type of measurement--protective measurement--that affords a good reply for the realist. This paper is a reply to scientific empiricism (about quantum mechanics), but is neither a refutation of that position nor an argument in favor of scientific realism. Rather, my aim is to place realism and empiricism on an even score in regard to quantum theory.
It is shown that the superposed wave function of a measuring device, in each branch of which there is a definite measurement result, does not correspond to many mutually unobservable but equally real worlds, as the superposed wave function can be observed in our world by protective measurement.
We investigate the meaning of the wave function by analyzing the mass and charge density distributions of a quantum system. According to protective measurement, a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. In a realistic interpretation, the wave function of a quantum system can be taken as a description of either a physical field or the ergodic motion of a particle. The essential difference between a field and the ergodic motion of a particle lies in the property of simultaneity: a field exists throughout space simultaneously, whereas the ergodic motion of a particle exists throughout space in a time-divided way. If the wave function is a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist gravitational and electrostatic self-interactions of its wave function. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Thus the wave function cannot be a description of a physical field but must be a description of the ergodic motion of a particle. For the latter there is only a localized particle with mass and charge at every instant, and thus there will not exist any self-interaction of the wave function. Which kind of ergodic motion of particles, then? It is argued that the classical ergodic models, which assume continuous motion of particles, cannot be consistent with quantum mechanics. Based on this negative result, we suggest that the wave function is a description of the quantum motion of particles, which is random and discontinuous in nature. On this interpretation, the square of the absolute value of the wave function not only gives the probability of the particle being found in certain locations, but also gives the probability of the particle being there.
We show that this new interpretation of the wave function provides a natural realistic alternative to the orthodox interpretation, and its implications for other realistic interpretations of quantum mechanics are also briefly discussed.
We investigate the validity of the field explanation of the wave function by analyzing the mass and charge density distributions of a quantum system. It is argued that a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. This is also a consequence of protective measurement. If the wave function is a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist a remarkable electrostatic self-interaction of its wave function, though the gravitational self-interaction is too weak to be detected at present. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Thus we conclude that the wave function cannot be a description of a physical field. In the second part of this paper, we further analyze the implications of these results for the main realistic interpretations of quantum mechanics, especially for de Broglie-Bohm theory. It has been argued that de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions, which turns out to be wrong according to the above results. For a charged quantum system, both the Ψ-field and the Bohmian particle have a charge density distribution. This then results in the existence of an electrostatic self-interaction of the field and an electromagnetic interaction between the field and the Bohmian particle, which contradicts both the predictions of quantum mechanics and experimental observations. Therefore, de Broglie-Bohm theory as a realistic interpretation of quantum mechanics is probably wrong. Lastly, we suggest that the wave function is a description of some sort of ergodic motion (e.g.
random discontinuous motion) of particles, and we also briefly analyze the implications of this suggestion for other realistic interpretations of quantum mechanics, including the many-worlds interpretation and dynamical collapse theories.
Based on an analysis of protective measurements, we show that the quantum state represents the physical state of a single quantum system. This result is more definite than the PBR theorem [Pusey, Barrett, and Rudolph, Nature Phys. 8, 475 (2012)].
We show that the physical meaning of the wave function can be derived based on the established parts of quantum mechanics. It turns out that the wave function represents the state of random discontinuous motion of particles, and its modulus square determines the probability density of the particles appearing in certain positions in space.
Protective measurement is a new measuring method introduced by Aharonov, Vaidman, and Anandan, with the aim of measuring the expectation value of an observable on a single quantum system, even if the system is initially not in an eigenstate of the measured observable. According to these authors, this feature of protective measurements favors a realistic interpretation of the wave function. These claims were challenged by Uffink. He argued that only observables that commute with the system's Hamiltonian can be protectively measured, and that an allegedly protective measurement of an observable that does not commute with the system's Hamiltonian does not actually measure this observable, but rather another related one that commutes with the system's Hamiltonian. In this paper we identify a number of unresolved issues in Uffink's proofs and argue that his alternative interpretation of what happens in a protective measurement has not been justified.
Protective measurement, which we have introduced recently, allows one to observe properties of the state of a single quantum system and even the Schrödinger wave itself. These measurements require a protection, sometimes due to an additional procedure and sometimes due to the potential of the system itself. The analysis of protective measurements is presented, and it is argued, contrary to recent claims, that they observe the quantum state and not the protective potential. Some other misunderstandings concerning our proposal are also clarified.
Protective measurement might be taken to put the last nail in the coffin of ensemble interpretations of the quantum state. My goal here is to show that even though ensemble interpretations face formidable obstacles, protective measurements don't lead to any additional difficulties. Rather, they provide us with a nice illustration of a conclusion for which we had considerable indirect evidence already, namely that quantum mechanics leads to a blurring of the distinction between the intrinsic properties of a system and the statistical properties of the ensemble of which it is a member. This conclusion goes for all realist interpretations of the quantum state, both the mainstream ones that take the wave function to be a real field and the more conjectural ones that take the wave function to describe our knowledge of an ensemble.
Protective measurement is a new measuring method introduced by Aharonov, Anandan and Vaidman. By a protective measurement, one can measure the expectation value of an observable on a single quantum system, even if the system is initially not in an eigenstate of the measured observable. This remarkable feature of protective measurements was challenged by Uffink. He argued that only observables that commute with the system's Hamiltonian can be protectively measured, and that a protective measurement of an observable that does not commute with the system's Hamiltonian does not actually measure that observable, but rather measures another related observable that commutes with the system's Hamiltonian. In this paper, we show that there are several errors in Uffink's arguments, and that his alternative interpretation of protective measurements is untenable.
Gao presents a critical reconsideration of a paper I wrote on the subject of protective measurement. Here, I take the occasion to reply to his objections. In particular, I retract my previous claim to have proven that in a protective measurement, the observable being measured on a system must commute with the system's Hamiltonian. However, I do maintain the viability of the interpretation I offered for protective measurements, as well as my analysis of a thought experiment proposed by Aharonov, Anandan and Vaidman, against Gao's objections.
It is a fundamental and widely accepted assumption that a measurement result exists universally, and in particular, that it exists for every observer, independently of whether the observer makes the measurement or knows the result. In this paper, we will argue that, based on an analysis of protective measurements, this assumption is rejected by the many-worlds interpretation of quantum mechanics, and that worlds, if they indeed exist according to the interpretation, can only exist relative to systems which are decoherent with respect to the measurement result.
In light of growing concerns about obesity, Winson (2004, Agriculture and Human Values 21(4): 299–312) calls for more research into the supermarket foodscape as a point of connection between consumers and food choice. In this study, we systematically examine the marketing of ready-to-eat breakfast cereals to children in Toronto, Ontario supermarkets. The supermarket cereal aisle is a relatively unstudied visual collage of competing brands, colors, spokes-characters, and incentives aimed at influencing consumer choice. We found that breakfast cereal products with higher-than-average levels of sugar, refined grains, and trans-fats are more likely to feature child-oriented marketing in the form of spokes-characters, themed cereal shapes/colors, and child incentives on cereal boxes. These forms of visual communication are consistent with a “health exploitive” pattern of targeted marketing to children in the supermarket setting. Only one aspect of visual communication is consistent with a “health protective” pattern of marketing to children—cereals shelved within reach of children aged 4–8 had less sugar per serving and were less likely to contain trans-fats than less reachable products. We discuss the implications of our findings for the measurement and regulation of marketing to children in North American supermarkets.
Several situations in which an empty wave causes an observable effect are reviewed. They include an experiment showing "surrealistic trajectories" proposed by Englert et al. and protective measurement of the density of the quantum state. Conditions for observable effects due to empty waves are derived. The possibility (in spite of the existence of these examples) of a minimalistic interpretation of Bohmian quantum mechanics, in which only Bohmian positions supervene on our experience, is discussed.
It is shown that the de Broglie-Bohm theory has a potential problem concerning the mass and charge distributions of a quantum system such as an electron. According to the de Broglie-Bohm theory, the mass and charge of an electron are localized in the position where its Bohmian particle is. However, protective measurement indicates that they are not localized in one position but distributed throughout space, and that the mass and charge density of the electron in each position is proportional to the modulus square of its wave function there.
We show that the de Broglie-Bohm theory is inconsistent with the established parts of quantum mechanics concerning its physical content. According to the de Broglie-Bohm theory, the mass and charge of an electron are localized in the position where its Bohmian particle is. However, protective measurement implies that they are not localized in one position but distributed throughout space, and that the mass and charge density of the electron in each position is proportional to the modulus square of its wave function there.
Measurement is a special type of evaluation that is more exact than either opinion or estimation. In the social sciences, in particular, most evaluations are not measures, but rather mixtures of opinion and estimation. Over-measurement represents anchoring to evaluations which are not measures. For an over-measured characteristic, single measures are used when instead a portfolio of possible measures should be used. There are three implications. First, measurements of characteristics which depend on the over-measured characteristic are biased. Secondly, decisions which depend on the over-measured characteristic are biased. Thirdly, over-measurement biases the measurement of uncertainty.
The paper introduces what is deemed the general epistemological problem of measurement: what characterizes measurement with respect to generic evaluation? It also analyzes the fundamental positions that have been maintained about this issue, thus presenting some sketches for a conceptual history of measurement. This characterization, in which three distinct standpoints are recognized, corresponding to a metaphysical, an anti-metaphysical, and a relativistic period, allows us to introduce and briefly discuss some general issues concerning the current epistemological status of measurement science.
Given the common assumption that measurement plays an important role in the foundation of science, the paper analyzes the possibility that Measurement Science, and therefore measurement itself, can be properly founded. The realist and the representational positions are analyzed in this regard: the conclusion that such positions unavoidably lead to paradoxical situations opens the discussion for a new epistemology of measurement, whose characteristics and interpretation are sketched here but remain largely a matter of investigation.
This paper discusses a relational modeling of measurement which is complementary to the standard representational point of view: by focusing on the experimental character of the measurand-related comparison between objects, this modeling emphasizes the role of the measuring systems as the devices which operatively perform such a comparison. The non-idealities of the operation are formalized in terms of non-transitivity of the substitutability relation between measured objects, due to the uncertainty on the measurand value remaining after the measurement. The metrological structure of traceability is shown to be an effective solution to cope with the problem of the general non-transitivity of measurement results. A preliminary theory is introduced as a possible formalization for the presented model.
Against the tradition that has considered measurement able to produce pure data on physical systems, the unavoidable role played by modeling activity in measurement is increasingly acknowledged, particularly with respect to the evaluation of measurement uncertainty. This paper characterizes measurement as a knowledge-based process and proposes a framework to understand the function of models in measurement and to systematically analyze their influence in the production of measurement results and their interpretation. To this aim, a general model of measurement is sketched, which gives the context to highlight the unavoidable, although sometimes implicit, presence of models in measurement and, finally, to propose some remarks on the relations between models and measurement uncertainty, complementarily classified as due to the idealization implied in the models and to their realization in the experimental setup.
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from an analysis of the inter-relations between calibration and measurement, the fundamental reasons for the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result that characterizes measurement as a particular kind of evaluation.
Measurement in soft systems generally cannot exploit physical sensors as data acquisition devices. The emphasis in this case is instead on how to choose appropriate indicators and to combine their values so as to obtain an overall result, interpreted as the value of a property, i.e., the measurand, for the system under analysis. This paper discusses the epistemological conditions for the claim that such a process is a measurement, with performance evaluation introduced as the case to support the analysis, performed in systematic comparison with the paradigm of measurement of physical quantities. Some background questions arising here are: – Are the chosen indicators appropriate performance indicators? – Do such indicators convey complete and non-redundant information on performance? – Does the chosen combination rule generate results suitably interpretable as performance values? And enlarging the focus: – Does the obtained value specifically convey information on the system under analysis, instead of some different entity (typically including the subject who is evaluating)? Operatively: would different subjects evaluate the same system in the same way? That is, is the obtained information objective? – Does the obtained value convey information that is interpretable in the same way by different subjects? Operatively: would different subjects who have agreed on a decision procedure make the same decision from the same performance information? That is, is the obtained information intersubjective? Any well-founded positive answers to these questions significantly support a structural interpretation of measurement encompassing both physical and soft measurement.
The philosophy of measurement studies the conceptual, ontological, epistemic, and technological conditions that make measurement possible and reliable. A new wave of philosophical scholarship has emerged in the last decade that emphasizes the material and historical dimensions of measurement and the relationships between measurement and theoretical modeling. This essay surveys these developments and contrasts them with earlier work on the semantics of quantity terms and the representational character of measurement. The conclusions highlight four characteristics of the emerging research program in philosophy of measurement: it is epistemological, coherentist, practice oriented, and model based.
The need for quantitative measurement represents a unifying bond that links all the physical, biological, and social sciences. Measurements of such disparate phenomena as subatomic masses, uncertainty, information, and human values share common features whose explication is central to the achievement of foundational work in any particular mathematical science as well as for the development of a coherent philosophy of science. This book presents a theory of measurement, one that is "abstract" in that it is concerned with highly general axiomatizations of empirical and qualitative settings and how these can be represented quantitatively. It was inspired by, and represents a generalization and extension of, the last major research work in this field, Foundations of Measurement Vol. I, by Krantz, Luce, Suppes, and Tversky published in 1971.
One of the major roadblocks in conducting Environmental Corporate Social Responsibility (ECSR) research is operationalization of the construct. Existing ECSR measurement tools either require primary data gathering or special subscriptions to proprietary databases that have limited replicability. We address this deficiency by developing a transparent ECSR measure, with an explicit coding scheme, that strictly relies on publicly available data. Our ECSR measure tests favorably for internal consistency and inter-rater reliability, as well as convergent and discriminant validity.
Measurement is fundamental to all the sciences, the behavioural and social as well as the physical, and in the latter its results provide our paradigms of 'objective fact'. But the basis and justification of measurement is not well understood and is often simply taken for granted. Henry Kyburg Jr proposes here an original, carefully worked out theory of the foundations of measurement, to show how quantities can be defined, why certain mathematical structures are appropriate to them and what meaning attaches to the results generated. Crucial to his approach is the notion of error: it cannot be eliminated entirely, and from its introduction and control, he argues, arises the very possibility of measurement. Professor Kyburg's approach emphasises the empirical process of making measurements. In developing it he discusses vital questions concerning the general connection between a scientific theory and the results which support it (or fail to).
The science of metrology characterizes the concept of precision in exceptionally loose and open terms. That is because the details of the concept must be filled in—what I call narrowing of the concept—in ways that are sensitive to the details of a particular measurement or measurement system and its use. Since these details can never be filled in completely, the concept of the actual precision of an instrument system must always retain some of the openness of its general characterization. The idea that there is something that counts as the actual precision of a measurement system must therefore always remain an idealization, a conclusion that would appear to hold very broadly for terms and the concepts they express.
This book provides an introduction to measurement theory for non-specialists and puts measurement in the social and behavioural sciences on a firm mathematical foundation. Results are applied to such topics as measurement of utility, psychophysical scaling and decision-making about pollution, energy, transportation and health. The results and questions presented should be of interest to both students and practising mathematicians since the author sets forth an area of mathematics unfamiliar to most mathematicians, but which has many potentially significant applications.
The first two sections of this paper investigate what Newton could have meant in a now famous passage from “De Gravitatione” (hereafter “DeGrav”) that “space is as it were an emanative effect of God.” First it offers a careful examination of the four key passages within DeGrav that bear on this. The paper shows that the internal logic of Newton’s argument permits several interpretations. In doing so, the paper calls attention to a Spinozistic strain in Newton’s thought. Second it sketches four interpretive options: (i) one approach is generic neo-Platonic; (ii) another approach is associated with the Cambridge Platonist, Henry More; a variant on this (ii*) emphasizes that Newton mixes Platonist and Epicurean themes; (iii) a necessitarian approach; (iv) an approach connected with Bacon’s efforts to reformulate a useful notion of form and laws of nature. Hitherto only the second and third options have received attention in scholarship on DeGrav. The paper offers new arguments to treat Newtonian emanation as a species of Baconian formal causation as articulated, especially, in the first few aphorisms of part two of Bacon’s New Organon. If we treat Newtonian emanation as a species of formal causation, then the necessitarian reading can be combined with most of the Platonist elements that others have discerned in DeGrav, especially Newton’s commitment to doctrines of different degrees of reality as well as the manner in which the first existing being ‘transfers’ its qualities to space (as a kind of causa-sui). This can clarify the conceptual relationship between space and its formal cause in Newton as well as Newton’s commitment to the spatial extended-ness of all existing beings. While the first two sections of this paper engage with existing scholarly controversies, in the final section the paper argues that the recent focus on emanation has obscured the importance of Newton’s very interesting claims about existence and measurement in DeGrav.
The paper argues that according to Newton, God and other entities have the same kind of quantities of existence; Newton is concerned with how measurement clarifies the way of being of entities. Newton is not claiming that measurement reveals all aspects of an entity. But if we measure something, then it exists as a magnitude in space and as a magnitude in time. This is why in DeGrav Newton’s conception of existence really helps to “lay truer foundations of the mechanical sciences.”
The aim of this paper is to give a systematic account of the so-called “measurement problem” within the framework of the standard interpretation of quantum mechanics. It is argued that there are not one but five distinct formulations of this problem. Each of them depends on what is assumed to be a “satisfactory” description of the measurement process within the framework of the standard interpretation. Moreover, the paper points out that each of these formulations refers not to a unique problem, but to a set of sub-problems.
This research examines business and psychology students’ attitude toward unethical behavior (measured at Time 1) and their propensity to engage in unethical behavior (measured at Time 1 and at Time 2, 4 weeks later) using a 15-item Unethical Behavior measure with five factors: Abuse Resources, Not Whistle Blowing, Theft, Corruption, and Deception. Results suggested that male students had stronger unethical attitudes and a higher propensity to engage in unethical behavior than female students. Attitude at Time 1 predicted Propensity at Time 1 accurately for all five factors (concurrent validity): if students consider a behavior to be unethical, they are less likely to engage in it. Attitude at Time 1 predicted only the Abuse Resources factor for Propensity at Time 2. Propensity at Time 1 was significantly related to Propensity at Time 2. Attitude at Time 1, Propensity at Time 1, and Propensity at Time 2 achieved configural and metric measurement invariance across major (business vs. psychology). Thus, researchers may have confidence in using these measures in future research.
Measurement is said to be the basis of the exact sciences, being the process of assigning numbers to matter (things or their attributes), thus making it possible to apply the mathematically formulated laws of nature to the empirical world. Mathematics and empiria are best accorded to each other in laboratory experiments, which function as what Nancy Cartwright calls a nomological machine: an arrangement generating (mathematical) regularities. On the basis of accounts of measurement errors and uncertainties, I will argue for two claims: 1) Both the fundamental laws of physics, corresponding to an ideal nomological machine, and phenomenological laws, corresponding to a material nomological machine, lie, being highly idealised relative to empirical reality; likewise, laboratory measurement data do not describe properties inherent in the world independently of human understanding of it. 2) Therefore the naive, representational view of measurement and experimentation should be replaced with a more pragmatic or practice-based view.
In the last few decades the role played by models and modeling activities has become a central topic in the scientific enterprise. In particular, it has been highlighted both that the development of models constitutes a crucial step for understanding the world and that the developed models operate as mediators between theories and the world. That perspective is exploited here to address the issue of whether error-based and uncertainty-based modeling of measurement are incompatible, and thus alternatives to one another, as is sometimes claimed nowadays. The crucial problem is whether assuming this standpoint implies definitively renouncing any role for truth and related concepts, particularly accuracy, in measurement. It is argued here that the well-known objections against true values in measurement, which would lead one to reject the concept of accuracy as non-operational, or to retain it as only qualitative, derive from the lack of a clear distinction between three distinct processes: the metrological characterization of measuring systems, their calibration, and finally measurement itself. Under the hypotheses that (1) the concept of true value is related to the model of a measurement process, (2) the concept of uncertainty is related to the connection between such a model and the world, and (3) accuracy is a property of measuring systems (and not of measurement results) and uncertainty is a property of measurement results (and not of measuring systems), not only the compatibility but actually the joint need of error-based and uncertainty-based modeling emerges.
Measurement is a process aimed at acquiring and codifying information about properties of empirical entities. In this paper we provide an interpretation of such a process, comparing it with what is nowadays considered the standard measurement theory, i.e., the representational theory of measurement. It is maintained here that this theory has its own merits but is incomplete and too abstract, its main weakness being the scant attention given to the empirical side of measurement, i.e., to measurement systems and to the ways in which the interactions of such systems with the entities under measurement provide a structure to an empirical domain. In particular it is claimed that (1) it is on the ground of the interaction with a measurement system that a partition can be induced on the domain of entities under measurement and that relations among such entities can be established, and that (2) it is the usage of measurement systems that guarantees a degree of objectivity and intersubjectivity to measurement results. As modeled in this paper, measurement systems link the abstract theory of measuring, as developed in representational terms, and the practice of measuring, as coded in standard documents such as the International Vocabulary of Metrology.
According to orthodox quantum mechanics, state vectors change in two incompatible ways: "deterministically" in accordance with Schroedinger's time-dependent equation, and probabilistically if and only if a measurement is made. It is argued here that the problem of measurement arises because the precise mutually exclusive conditions for these two types of transitions to occur are not specified within orthodox quantum mechanics. Fundamentally, this is due to an inevitable ambiguity in the notion of "measurement" itself. Hence, if the problem of measurement is to be resolved, a new, fully objective version of quantum mechanics needs to be developed which does not incorporate the notion of measurement in its basic postulates at all.
Corporate environmental performance (CEP) has been of fundamental interest in scholarly research during the last few decades. However, there is a great deal of disagreement pertaining to the definition, conceptualization, and adequate measurement of CEP. Our study addresses these issues and provides a methodologically rigorous and comprehensive examination of content validity and construct validity. By integrating the available literature on CEP, we derive a parsimonious definition and theoretically sound framework of the focal construct. Drawing on non-aggregated and publicly available data for a sample of 706 firm-years, we test the construct validity of this framework by means of factor analysis. Our results provide evidence for the multidimensional nature of the focal construct. By contrasting our findings with existing measurement approaches in empirical research, we emphasize several deficiencies with regard to the inferences and conclusions yielded in prior research. Future empirical and practically oriented studies can build on our findings and thus provide more stringent results.
Neoliberal precepts of the governance of academic science (deregulation, reification of markets, emphasis on competitive allocation processes) have been conflated with those of performance management (if you cannot measure it, you cannot manage it) into a single analytical and, consequently, a single programmatic worldview. As applied to the United States’ system of research universities, this conflation leads to two major divergences from relationships hypothesized in the governance-of-science literature. (1) The governance and financial structures supporting academic science in the United States’ system of higher education are sufficiently different from those found in many other OECD countries where these policies have been adopted to produce political pressures for an increase rather than a decrease in governmental control over university affairs. (2) The major impact upon academic science of performance measurement systems has come not externally from new government requirements but internally from the independent adoption of these techniques by universities, initially in the name of rational management and increasingly as devices to foster reputational enhancement. The overall thrust of the two trends in the U.S. has been less a shift, as experienced elsewhere, from bureaucratic to market modes of governance than the displacement of professional-collegial control by internal bureaucratic control.
This article develops a model-based account of the standardization of physical measurement, taking the contemporary standardization of time as its central case-study. To standardize the measurement of a quantity, I argue, is to legislate the mode of application of a quantity-concept to a collection of exemplary artefacts. Legislation involves an iterative exchange between top-down adjustments to theoretical and statistical models regulating the application of a concept, and bottom-up adjustments to material artefacts in light of remaining gaps. The model-based account clarifies the cognitive role of ad hoc corrections, arbitrary rules and seemingly circular inferences involved in contemporary timekeeping, and explains the stability of networks of standards better than its conventionalist and constructivist counterparts.
The social welfare functional approach to social choice theory fails to distinguish a genuine change in individual well-beings from a merely representational change due to the use of different measurement scales. A generalization of the concept of a social welfare functional is introduced that explicitly takes account of the scales that are used to measure well-beings so as to distinguish between these two kinds of changes. This generalization of the standard theoretical framework results in a more satisfactory formulation of welfarism, the doctrine that social alternatives are evaluated and socially ranked solely in terms of the well-beings of the relevant individuals. This scale-dependent form of welfarism is axiomatized using this framework. The implications of this approach for characterizing classes of social welfare orderings are also considered.
This paper introduces the reader to Meinong's work on the metaphysics of magnitudes and measurement in his Über die Bedeutung des Weber'schen Gesetzes. According to Russell himself, who wrote a review of Meinong's work on Weber's law for Mind, Meinong's theory of magnitudes deeply influenced Russell's theory of quantities in the Principles of Mathematics. The first and longest part of the paper discusses Meinong's analysis of magnitudes. According to Meinong, we must distinguish between divisible and indivisible magnitudes. He argues that relations of distance, or dissimilarity, are indivisible magnitudes that coincide with divisible magnitudes called "stretches". The second part of the paper is concerned with Meinong's account of measurement as a comparison of parts. According to Meinong, since measuring consists in comparing parts, only divisible magnitudes are directly measurable. Indivisible magnitudes can only be measured indirectly, by measuring the divisible stretches that coincide with them.
This paper challenges “traditional measurement-accuracy realism”, according to which there are in nature quantities of which concrete systems have definite values. An accurate measurement outcome is one that is close to the value of the quantity measured. For a measurement of the temperature of some water to be accurate in this sense requires that there be this temperature. But there isn’t. Not because there are no quantities “out there in nature” but because the term ‘the temperature of this water’ fails to refer, owing to idealization and failure of specificity in picking out concrete cases. The problems can be seen as an artifact of vagueness, and seeing them so facilitates applying Eran Tal’s robustness account of measurement accuracy to suggest an attractive way of understanding vagueness in terms of the function of idealization, a way that sidesteps the problems of higher-order vagueness and that shows how idealization provides a natural generalization of what it is to be vague.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.