Given the common assumption that measurement plays an important role in the foundation of science, the paper analyzes the possibility that Measurement Science, and therefore measurement itself, can be properly founded. The realist and the representational positions are analyzed in this regard; the conclusion, that such positions unavoidably lead to paradoxical situations, opens the discussion for a new epistemology of measurement, whose characteristics and interpretation are sketched here but remain largely a matter of investigation.
Against the tradition that has considered measurement able to produce pure data on physical systems, the unavoidable role played by modeling activity in measurement is increasingly acknowledged, particularly with respect to the evaluation of measurement uncertainty. This paper characterizes measurement as a knowledge-based process and proposes a framework to understand the function of models in measurement and to systematically analyze their influence on the production of measurement results and their interpretation. To this aim, a general model of measurement is sketched, which provides the context to highlight the unavoidable, although sometimes implicit, presence of models in measurement and, finally, to propose some remarks on the relations between models and measurement uncertainty, complementarily classified as due to the idealization implied in the models and to their realization in the experimental setup.
The paper introduces what is deemed the general epistemological problem of measurement: what characterizes measurement with respect to generic evaluation? It also analyzes the fundamental positions that have been maintained about this issue, thus presenting some sketches for a conceptual history of measurement. This characterization, in which three distinct standpoints are recognized, corresponding to a metaphysical, an anti-metaphysical, and a relativistic period, allows us to introduce and briefly discuss some general issues concerning the current epistemological status of measurement science.
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from the analysis of the inter-relations between calibration and measurement, the fundamental reasons for the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result for characterizing measurement as a particular kind of evaluation.
This paper discusses a relational modeling of measurement which is complementary to the standard representational point of view: by focusing on the experimental character of the measurand-related comparison between objects, this modeling emphasizes the role of measuring systems as the devices which operatively perform such a comparison. The non-idealities of the operation are formalized in terms of the non-transitivity of the substitutability relation between measured objects, due to the uncertainty about the measurand value that remains after the measurement. The metrological structure of traceability is shown to be an effective solution for coping with the general non-transitivity of measurement results. A preliminary theory is introduced as a possible formalization of the presented model.
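The non-transitivity at the heart of this model can be made concrete with a toy sketch (an illustration of mine, not the paper's formalization): a substitutability relation defined by a measurement-uncertainty tolerance is reflexive and symmetric but fails to be transitive.

```python
# Hypothetical illustration: with measurement uncertainty u, two objects
# count as substitutable when their measured values differ by at most u.
# The relation is reflexive and symmetric, but not transitive.

def substitutable(x: float, y: float, u: float = 1.0) -> bool:
    """Objects with measured values x and y are interchangeable within u."""
    return abs(x - y) <= u

a, b, c = 0.0, 0.8, 1.6  # measured values of three objects
print(substitutable(a, b))  # True:  |0.0 - 0.8| <= 1.0
print(substitutable(b, c))  # True:  |0.8 - 1.6| <= 1.0
print(substitutable(a, c))  # False: transitivity fails
```

Chains of pairwise-substitutable objects can thus drift arbitrarily far apart, which is the problem the traceability structure is designed to contain.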
Measurement in soft systems generally cannot exploit physical sensors as data acquisition devices. The emphasis in this case is instead on how to choose appropriate indicators and to combine their values so as to obtain an overall result, interpreted as the value of a property, i.e., the measurand, for the system under analysis. This paper aims at discussing the epistemological conditions for the claim that such a process is a measurement, and performance evaluation is the case introduced to support the analysis, performed in systematic comparison with the paradigm of measurement of physical quantities. Some background questions arising here are: – Are the chosen indicators appropriate performance indicators? – Do such indicators convey complete and non-redundant information on performance? – Does the chosen combination rule generate results suitably interpretable as performance values? And, enlarging the focus: – Does the obtained value specifically convey information on the system under analysis, rather than on some different entity (typically including the subject who is evaluating)? Operatively: would different subjects evaluate the same system in the same way, i.e., is the obtained information objective? – Does the obtained value convey information that is interpretable in the same way by different subjects? Operatively: would different subjects who have agreed on a decision procedure make the same decision from the same performance information, i.e., is the obtained information intersubjective? Any well-founded positive answers to these questions significantly support a structural interpretation of measurement encompassing both physical and soft measurement.
This article analyzes the implications of protective measurement for the meaning of the wave function. According to protective measurement, a charged quantum system has mass and charge density proportional to the modulus square of its wave function. It is shown that the mass and charge density is not real but effective, formed by the ergodic motion of a localized particle with the total mass and charge of the system. Moreover, it is argued that the ergodic motion is not continuous but discontinuous and random. This result suggests a new interpretation of the wave function, according to which the wave function is a description of random discontinuous motion of particles, and the modulus square of the wave function gives the probability density of the particles being in certain locations. It is shown that the suggested interpretation of the wave function disfavors the de Broglie-Bohm theory and the many-worlds interpretation but favors the dynamical collapse theories, and the random discontinuous motion of particles may provide an appropriate random source to collapse the wave function.
We investigate the implications of protective measurement for de Broglie-Bohm theory, mainly focusing on the interpretation of the wave function. It has been argued that the de Broglie-Bohm theory gives the same predictions as quantum mechanics by means of the quantum equilibrium hypothesis. However, this equivalence is based on the premise that the wave function, regarded as a Ψ-field, has no mass and charge density distributions. But this premise turns out to be wrong according to protective measurement; a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. Then in the de Broglie-Bohm theory both the Ψ-field and the Bohmian particle will have a charge density distribution for a charged quantum system. This will result in the existence of an electrostatic self-interaction of the field and an electromagnetic interaction between the field and the Bohmian particle, which not only violates the superposition principle of quantum mechanics but also contradicts experimental observations. Therefore, the de Broglie-Bohm theory as a realistic interpretation of quantum mechanics is problematic according to protective measurement. Lastly, we briefly discuss the possibility that the wave function is not a physical field but a description of some sort of ergodic motion (e.g. random discontinuous motion) of particles.
The first two sections of this paper investigate what Newton could have meant in a now famous passage from “De Gravitatione” (hereafter “DeGrav”) that “space is as it were an emanative effect of God.” First it offers a careful examination of the four key passages within DeGrav that bear on this. The paper shows that the internal logic of Newton’s argument permits several interpretations. In doing so, the paper calls attention to a Spinozistic strain in Newton’s thought. Second it sketches four interpretive options: (i) one approach is generic neo-Platonic; (ii) another approach is associated with the Cambridge Platonist, Henry More; a variant on this (ii*) emphasizes that Newton mixes Platonist and Epicurean themes; (iii) a necessitarian approach; (iv) an approach connected with Bacon’s efforts to reformulate a useful notion of form and laws of nature. Hitherto only the second and third options have received attention in scholarship on DeGrav. The paper offers new arguments to treat Newtonian emanation as a species of Baconian formal causation as articulated, especially, in the first few aphorisms of part two of Bacon’s New Organon. If we treat Newtonian emanation as a species of formal causation, then the necessitarian reading can be combined with most of the Platonist elements that others have discerned in DeGrav, especially Newton’s commitment to doctrines of different degrees of reality as well as the manner in which the first existing being ‘transfers’ its qualities to space (as a kind of causa-sui). This can clarify the conceptual relationship between space and its formal cause in Newton as well as Newton’s commitment to the spatial extended-ness of all existing beings. While the first two sections of this paper engage with existing scholarly controversies, in the final section the paper argues that the recent focus on emanation has obscured the importance of Newton’s very interesting claims about existence and measurement in “DeGrav”.
The paper argues that according to Newton God and other entities have the same kind of quantities of existence; Newton is concerned with how measurement clarifies the way of being of entities. Newton is not claiming that measurement reveals all aspects of an entity. But if we measure something then it exists as a magnitude in space and as a magnitude in time. This is why in DeGrav Newton’s conception of existence really helps to “lay truer foundations of the mechanical sciences.”
The philosophy of measurement studies the conceptual, ontological, epistemic, and technological conditions that make measurement possible and reliable. A new wave of philosophical scholarship has emerged in the last decade that emphasizes the material and historical dimensions of measurement and the relationships between measurement and theoretical modeling. This essay surveys these developments and contrasts them with earlier work on the semantics of quantity terms and the representational character of measurement. The conclusions highlight four characteristics of the emerging research program in philosophy of measurement: it is epistemological, coherentist, practice oriented, and model based.
This work develops an epistemology of measurement, that is, an account of the conditions under which measurement and standardization methods produce knowledge as well as the nature, scope, and limits of this knowledge. I focus on three questions: (i) how is it possible to tell whether an instrument measures the quantity it is intended to? (ii) what do claims to measurement accuracy amount to, and how might such claims be justified? (iii) when is disagreement among instruments a sign of error, and when does it imply that instruments measure different quantities? Based on a series of case studies conducted in collaboration with the US National Institute of Standards and Technology (NIST), I argue for a model-based approach to the epistemology of physical measurement. To measure a physical quantity, I argue, is to estimate the value of a parameter in an idealized model of a physical process. Such estimation involves inference from the final state (‘indication’) of a process to the value range of a parameter (‘outcome’) in light of theoretical and statistical assumptions. Contrary to contemporary philosophical views, measurement outcomes cannot be obtained by mapping the structure of indications. Instead, measurement outcomes as well as claims to accuracy, error and quantity individuation can only be adjudicated relative to a choice of idealized modelling assumptions.
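The model-based view sketched in this abstract, inference from an instrument's indication to the value range of a model parameter, can be illustrated with a hypothetical resistance thermometer (the model form, constants, and numbers below are my own illustrative assumptions, not the dissertation's or NIST's):

```python
# Idealized model of a resistance thermometer: R(T) = R0 * (1 + alpha * T).
# The instrument's *indication* is a resistance reading; the measurement
# *outcome* is the temperature value (with an interval) obtained by
# inverting the model, in light of its assumptions.

R0, ALPHA = 100.0, 0.0039  # assumed model constants (ohm, 1/degC)

def outcome(indication_ohm: float, u_ohm: float = 0.05):
    """Map an indication and its uncertainty to a temperature estimate."""
    t = (indication_ohm / R0 - 1.0) / ALPHA
    u_t = u_ohm / (R0 * ALPHA)  # uncertainty propagated through the linear model
    return t, u_t

t, u = outcome(103.9)
print(f"T = {t:.2f} +/- {u:.2f} degC")  # → T = 10.00 +/- 0.13 degC
```

The point of the sketch is that the number reported depends on the idealized model and its constants, not on the indication alone.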
According to orthodox quantum mechanics, state vectors change in two incompatible ways: "deterministically" in accordance with Schroedinger's time-dependent equation, and probabilistically if and only if a measurement is made. It is argued here that the problem of measurement arises because the precise mutually exclusive conditions for these two types of transitions to occur are not specified within orthodox quantum mechanics. Fundamentally, this is due to an inevitable ambiguity in the notion of "measurement" itself. Hence, if the problem of measurement is to be resolved, a new, fully objective version of quantum mechanics needs to be developed which does not incorporate the notion of measurement in its basic postulates at all.
Psychologists debate whether mental attributes can be quantified or whether they admit only qualitative comparisons of more and less. Their disagreement is not merely terminological, for it bears upon the permissibility of various statistical techniques. This article contributes to the discussion in two stages. First it explains how temperature, which was originally a qualitative concept, came to occupy its position as an unquestionably quantitative concept (§§1–4). Specifically, it lays out the circumstances in which thermometers, which register quantitative (or cardinal) differences, became distinguishable from thermoscopes, which register merely qualitative (or ordinal) differences. I argue that this distinction became possible thanks to the work of Joseph Black, ca. 1760. Second, the article contends that the model implicit in temperature’s quantitative status offers a better way for thinking about the quantitative status of mental attributes than models from measurement theory (§§5–6).
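The thermoscope/thermometer contrast can be sketched computationally (a toy of mine, not the article's analysis): ordinal information survives any monotone rescaling of the readings, while the cardinal information a thermometer adds, equality of differences, does not.

```python
# Hypothetical readings for three bodies, plus a monotone distortion of
# them. Order (thermoscope information) is preserved; equality of
# differences (thermometer information) is destroyed.

readings = {"a": 10.0, "b": 20.0, "c": 30.0}
rescaled = {k: v ** 2 for k, v in readings.items()}  # monotone rescaling

order_kept = (readings["a"] < readings["b"] < readings["c"]) and \
             (rescaled["a"] < rescaled["b"] < rescaled["c"])

equal_diffs_before = (readings["b"] - readings["a"]) == (readings["c"] - readings["b"])
equal_diffs_after = (rescaled["b"] - rescaled["a"]) == (rescaled["c"] - rescaled["b"])

print(order_kept, equal_diffs_before, equal_diffs_after)  # → True True False
```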
Measurement is a process aimed at acquiring and codifying information about properties of empirical entities. In this paper we provide an interpretation of such a process comparing it with what is nowadays considered the standard measurement theory, i.e., representational theory of measurement. It is maintained here that this theory has its own merits but it is incomplete and too abstract, its main weakness being the scant attention reserved to the empirical side of measurement, i.e., to measurement systems and to the ways in which the interactions of such systems with the entities under measurement provide a structure to an empirical domain. In particular it is claimed that (1) it is on the ground of the interaction with a measurement system that a partition can be induced on the domain of entities under measurement and that relations among such entities can be established, and that (2) it is the usage of measurement systems that guarantees a degree of objectivity and intersubjectivity to measurement results. As modeled in this paper, measurement systems link the abstract theory of measuring, as developed in representational terms, and the practice of measuring, as coded in standard documents such as the International Vocabulary of Metrology.
In the last few decades the role played by models and modeling activities has become a central topic in the scientific enterprise. In particular, it has been highlighted both that the development of models constitutes a crucial step for understanding the world and that the developed models operate as mediators between theories and the world. Such a perspective is exploited here to address the issue of whether error-based and uncertainty-based modeling of measurement are incompatible, and thus alternatives to one another, as is sometimes claimed nowadays. The crucial problem is whether assuming this standpoint implies definitively renouncing any role for truth and the related concepts, particularly accuracy, in measurement. It is argued here that the well-known objections against true values in measurement, which would lead to rejecting the concept of accuracy as non-operational, or to maintaining it as only qualitative, derive from an unclear distinction between three distinct processes: the metrological characterization of measuring systems, their calibration, and finally measurement. Under the hypotheses that (1) the concept of true value is related to the model of a measurement process, (2) the concept of uncertainty is related to the connection between such a model and the world, and (3) accuracy is a property of measuring systems (and not of measurement results) and uncertainty is a property of measurement results (and not of measuring systems), not only the compatibility but actually the conjoint need of error-based and uncertainty-based modeling emerges.
In this article we discuss the ethical dilemmas facing performance evaluators and the "evaluatees" whose performances are measured in a business context. The concepts of role morality and common morality are used to develop a framework of behaviors that are normally seen as the moral responsibilities of these actors. This framework is used to analyze, based on four empirical situations, why the implementation of a performance measurement system has not been as effective as expected. It was concluded that, in these four cases, unethical behavior (i.e. deviations from the ethical behaviors identified in the framework) provided, at least to some extent, an explanation for the lower than expected effectiveness of the performance measurement procedures. At the end of the paper we present an agenda for further research through which the framework could be further developed and systematically applied to a broader set of cases.
The aim of this paper is to give a systematic account of the so-called “measurement problem” in the frame of the standard interpretation of quantum mechanics. It is argued that there is not one but five distinct formulations of this problem. Each of them depends on what is assumed to be a “satisfactory” description of the measurement process in the frame of the standard interpretation. Moreover, the paper points out that each of these formulations refers not to a unique problem, but to a set of sub-problems.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.
This paper presents a novel semantic analysis of unit names (like pound and meter) and gradable adjectives (like tall, short and happy), inspired by measurement theory (Krantz et al. in Foundations of Measurement: Additive and Polynomial Representations, 1971). Based on measurement theory’s four-way typology of measures, I claim that different adjectives are associated with different types of measures whose special characteristics, together with features of the relations denoted by unit names, explain the puzzling limited distribution of measure phrases, as well as unit-based comparisons between predicates (as in the table is longer than it is wide). All considered, my analyses support the view that the grammar of natural languages is sensitive to features of measurement theory.
Newton characterizes the reasoning of Principia Mathematica as geometrical. He emulates classical geometry by displaying, in diagrams, the objects of his reasoning and comparisons between them. Examination of Newton's unpublished texts (and the views of his mentor, Isaac Barrow) shows that Newton conceives geometry as the science of measurement. On this view, all measurement ultimately involves the literal juxtaposition (the putting-together in space) of the item to be measured with a measure, whose dimensions serve as the standard of reference, so that all quantity (which is what measurement makes known) is ultimately related to spatial extension. I use this conception of Newton's project to explain the organization and proofs of the first theorems of mechanics to appear in the Principia (beginning in Sect. 2 of Book I). The placement of Kepler's rule of areas as the first proposition, and the manner in which Newton proves it, appear natural on the supposition that Newton seeks a measure, in the sense of a moveable spatial quantity, of time. I argue that Newton proceeds in this way so that his reasoning can have the ostensive certainty of geometry.
Measurement is said to be the basis of the exact sciences as the process of assigning numbers to matter (things or their attributes), thus making it possible to apply the mathematically formulated laws of nature to the empirical world. Mathematics and empiria are best accorded to each other in laboratory experiments, which function as what Nancy Cartwright calls a nomological machine: an arrangement generating (mathematical) regularities. On the basis of accounts of measurement errors and uncertainties, I will argue for two claims: 1) Both the fundamental laws of physics, corresponding to an ideal nomological machine, and phenomenological laws, corresponding to a material nomological machine, lie, being highly idealised relative to empirical reality; moreover, laboratory measurement data do not describe properties inherent in the world independently of human understanding of it. 2) Therefore the naive, representational view of measurement and experimentation should be replaced with a more pragmatic or practice-based view.
The concept of measurement is fundamental to a whole range of different disciplines, including not only the natural and engineering sciences, but also laboratory medicine and certain branches of the social sciences. This being the case, the concept of measurement has a particular relevance to the development of top-level ontologies in the area of knowledge engineering. For this reason, the present paper is concerned with ontological aspects of measurement. We are searching for a list of concepts that are apt to characterize measurement methods in a general manner. To establish such means of characterization, we will primarily deal with the semantics of measurement values.
This research examines business and psychology students’ attitude toward unethical behavior (measured at Time 1) and their propensity to engage in unethical behavior (measured at Time 1 and at Time 2, 4 weeks later) using a 15-item Unethical Behavior measure with five Factors: Abuse Resources, Not Whistle Blowing, Theft, Corruption, and Deception. Results suggested that male students had stronger unethical attitudes and had a higher propensity to engage in unethical behavior than female students. Attitude at Time 1 predicted Propensity at Time 1 accurately for all five factors (concurrent validity): if students consider a behavior to be unethical, then they are less likely to engage in it. Attitude at Time 1 predicted only Factor Abuse Resources for Propensity at Time 2. Propensity at Time 1 was significantly related to Propensity at Time 2. Attitude at Time 1, Propensity at Time 1, and Propensity at Time 2 achieved configural and metric measurement invariance across major (business vs. psychology). Thus, researchers may have confidence in using these measures in future research.
Eino Kaila's thought occupies a curious position within the logical empiricist movement. Along with Hans Reichenbach, Herbert Feigl, and the early Moritz Schlick, Kaila advocates a realist approach towards science and the project of a “scientific world conception”. This realist approach was chiefly directed at both Kantianism and Poincaréan conventionalism. The case in point was the theory of measurement. According to Kaila, the foundations of physical reality are characterized by the existence of invariant systems of relations, which he called structures. In a certain sense, these invariant structures, he maintained, are constituted in the act of measuring. By “constitution”, however, Kaila meant neither the dependency of the objects of measurement on a priori concepts (or Kantian categories) nor their being effected by conventional stipulations in a Poincaréan sense. He held that invariant structures are, quite literally, real: they exist prior to and independently of our theoretical capacity. By executing measurements, invariant structures are detected and objectively determinable by laws of nature.
One of the major roadblocks in conducting Environmental Corporate Social Responsibility (ECSR) research is operationalization of the construct. Existing ECSR measurement tools either require primary data gathering or special subscriptions to proprietary databases that have limited replicability. We address this deficiency by developing a transparent ECSR measure, with an explicit coding scheme, that strictly relies on publicly available data. Our ECSR measure tests favorably for internal consistency and inter-rater reliability, as well as convergent and discriminant validity.
This book provides an introduction to measurement theory for non-specialists and puts measurement in the social and behavioural sciences on a firm mathematical foundation. Results are applied to such topics as measurement of utility, psychophysical scaling and decision-making about pollution, energy, transportation and health. The results and questions presented should be of interest to both students and practising mathematicians since the author sets forth an area of mathematics unfamiliar to most mathematicians, but which has many potentially significant applications.
In the following article, we propose to show that following the general verificationist epistemic programme (its demand that the truth of our judgments be verifiable), the analysis of measurement on the one hand, and the classical positivist analysis of common-sense observation on the other, do not lead to the same conclusions. This is especially important because the differences in conclusions concern the positivist theory/observation distinction. In particular, the analysis of measurement does not fully support this distinction. This fact might have important consequences for the problem of scientific realism and related ontological and epistemological problems in the philosophy of science.
We consider the problem of measurement using the Lindblad equation, which allows the introduction of time in the interaction between the measured system and the measurement apparatus. We use analytic results, valid for weak system-environment coupling, obtained for a two-level system in contact with a measurer (Markovian interaction) and a thermal bath (non-Markovian interaction), where the measured observable may or may not commute with the system-environment interaction. Analysing the behavior of the coherence, which tends to a value asymptotically close to zero, we obtain an expression for the time of measurement which depends only on the system-measurer coupling, and which does not depend on whether the observable commutes with the system-bath interaction. The behavior of the coherences in the case of strong system-environment coupling, found numerically, indicates that an increase in this coupling decreases the measurement time, thus allowing our expression to be considered the upper limit for the duration of the process.
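The qualitative behavior described, coherence decaying asymptotically toward zero at a rate set by the system-measurer coupling, can be sketched as follows (the exponential form, the threshold, and the parameter values are illustrative assumptions of mine, not the paper's analytic expression):

```python
import math

# Toy model: the off-diagonal element (coherence) of a two-level system
# decays under a Markovian coupling g, and a "measurement time" is read
# off as the time at which the coherence falls below a threshold eps.

def coherence(t: float, g: float) -> float:
    """|rho_01(t)| under pure dephasing at rate g."""
    return math.exp(-g * t)

def measurement_time(g: float, eps: float = 1e-3) -> float:
    """Time at which coherence(t, g) first reaches eps."""
    return -math.log(eps) / g

g = 2.0
t_m = measurement_time(g)
# Stronger coupling gives a shorter measurement time, matching the
# trend the paper reports numerically for strong coupling:
assert measurement_time(2 * g) < t_m
print(round(coherence(t_m, g), 6))  # → 0.001
```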
Community Development Finance Institutions (CDFIs) are publicly funded organisations that provide small loans to people in financially underserved areas of the UK. Policy makers have repeatedly sought to understand and measure the performance of CDFIs to ensure the efficient use of public funds, but have struggled to identify an appropriate way of doing so. In this article, we empirically derive a framework that measures the performance of CDFIs through an analysis of their stakeholder relationships. Based on qualitative data from 20 English CDFIs, we develop a typology of CDFIs according to three dimensions: organisational structure, type of lending and type of market served. Following on from this, we derive several propositions that consider how these dimensions relate to the financial and social performance of CDFIs, and provide the basis for a performance measurement framework.
A geometric approach to quantum mechanics with unitary evolution and non-unitary collapse processes is developed. In this approach the Schrödinger evolution of a quantum system is a geodesic motion on the space of states of the system furnished with an appropriate Riemannian metric. The measuring device is modeled by a perturbation of the metric. The process of measurement is identified with a geodesic motion of the state of the system in the perturbed metric. Under the assumption of random fluctuations of the perturbed metric, the Born rule for probabilities of collapse is derived. The approach is applied to a two-level quantum system to obtain a simple geometric interpretation of quantum commutators, the uncertainty principle and Planck’s constant. In light of this, a lucid analysis of the double-slit experiment with collapse and an experiment on a pair of entangled particles is presented.
We revisit quantum measurement when the apparatus is initially in a mixed state. We find that, in a particular restricted setup, the amount of entanglement between the system and the apparatus is given by the entropy increase of the system under the measurement transformation. We show that the information gained is equal to the amount of entanglement when a perfect measurement is performed. Based on the perfect measurement, we give an upper bound on quantum discord.
The use of real clocks and measuring rods in quantum mechanics implies a natural loss of unitarity in the description of the theory. We briefly review this point and then discuss the implications it has for the measurement problem in quantum mechanics. The intrinsic loss of coherence makes it possible to circumvent some of the usual objections to treating the measurement process as due to environmental decoherence.
Measurement is fundamental to all the sciences, the behavioural and social as well as the physical, and in the latter its results provide our paradigms of 'objective fact'. But the basis and justification of measurement is not well understood and is often simply taken for granted. Henry Kyburg Jr proposes here an original, carefully worked out theory of the foundations of measurement, to show how quantities can be defined, why certain mathematical structures are appropriate to them, and what meaning attaches to the results generated. Crucial to his approach is the notion of error: it cannot be eliminated entirely, and from its introduction and control, he argues, arises the very possibility of measurement. Professor Kyburg's approach emphasises the empirical process of making measurements. In developing it he discusses vital questions concerning the general connection between a scientific theory and the results which support it (or fail to).
The need for quantitative measurement represents a unifying bond that links all the physical, biological, and social sciences. Measurements of such disparate phenomena as subatomic masses, uncertainty, information, and human values share common features whose explication is central to the achievement of foundational work in any particular mathematical science as well as for the development of a coherent philosophy of science. This book presents a theory of measurement, one that is "abstract" in that it is concerned with highly general axiomatizations of empirical and qualitative settings and how these can be represented quantitatively. It was inspired by, and represents a generalization and extension of, the last major research work in this field, Foundations of Measurement Vol. I, by Krantz, Luce, Suppes, and Tversky published in 1971.
Auditory scene analysis describes the ability to segregate relevant sounds out from the environment and to integrate them into a single sound stream using the characteristics of the sounds to determine whether or not they are related. This study aims to contrast task performances in objective threshold measurements of segregation and integration using identical stimuli, manipulating two variables known to influence streaming, inter-stimulus-interval (ISI) and frequency difference (Δf). For each measurement, one parameter (either ISI or Δf) was held constant while the other was altered in a staircase procedure. By using this paradigm, it is possible to test within-subject across multiple conditions, covering a wide Δf and ISI range in one testing session. The objective tasks were based on across-stream temporal judgments (facilitated by integration) and within-stream deviance detection (facilitated by segregation). Results show the objective integration task is well suited for combination with the staircase procedure, as it yields consistent threshold measurements for separate variations of ISI and Δf, as well as being significantly related to the subjective thresholds. The objective segregation task appears less suited to the staircase procedure. With the integration-based staircase paradigm, a comprehensive assessment of streaming thresholds can be obtained in a relatively short space of time. This permits efficient threshold measurements particularly in groups for which there is little prior knowledge on the relevant parameter space for streaming perception.
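The adaptive logic described here (hold one parameter fixed, step the other up or down depending on responses, and estimate the threshold from the reversal points) can be sketched generically. This is a minimal illustration of a standard 2-down/1-up staircase with an idealized deterministic listener; it is not the study's actual task, stimuli, or parameters, and all names are ours:

```python
def staircase(threshold, start, step, n_reversals=8):
    """Transformed 2-down/1-up staircase: after two consecutive correct
    responses the tracked level is lowered (task made harder); after each
    error it is raised (task made easier).  The threshold estimate is the
    mean level at the tracked reversals.  The 'listener' is a deterministic
    stand-in that responds correctly whenever level > threshold."""
    level, direction, correct_run = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        correct = level > threshold          # simulated response
        if correct:
            correct_run += 1
            if correct_run == 2:             # two correct -> step down
                correct_run = 0
                if direction == +1:          # direction change = reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:
            correct_run = 0
            if direction == -1:              # direction change = reversal
                reversals.append(level)
            direction = +1
            level += step                    # one error -> step up
    return sum(reversals) / len(reversals)

# The track descends, then oscillates around the failure point:
estimate = staircase(threshold=5, start=10, step=1)
```

With a real listener the response would be probabilistic, and the 2-down/1-up rule converges near the 70.7%-correct point on the psychometric function; the deterministic observer here simply makes the oscillation visible.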
Neoliberal precepts of the governance of academic science (deregulation, reification of markets, emphasis on competitive allocation processes) have been conflated with those of performance management ("if you cannot measure it, you cannot manage it") into a single analytical and, consequently, a single programmatic worldview. As applied to the United States’ system of research universities, this conflation leads to two major divergences from relationships hypothesized in the governance-of-science literature. (1) The governance and financial structures supporting academic science in the United States’ system of higher education are sufficiently different from those found in many other OECD countries where these policies have been adopted to produce political pressures for an increase rather than a decrease in governmental control over university affairs. (2) The major impact upon academic science of performance measurement systems has come not externally from new government requirements but internally from the independent adoption of these techniques by universities, initially in the name of rational management and increasingly as devices to foster reputational enhancement. The overall thrust of the two trends in the U.S. has been less a shift, as experienced elsewhere, from bureaucratic to market modes of governance than the displacement of professional-collegial control by internal bureaucratic control.
A modified Beltrametti-Cassinelli-Lahti model of the measurement apparatus that satisfies both the probability reproducibility condition and the objectification requirement is constructed. Only measurements on microsystems are considered. Cluster separability forms the basis for the first working hypothesis: the current version of quantum mechanics leaves open what happens to systems when they change their separation status. New rules that close this gap can therefore be added without disturbing the logic of quantum mechanics. The second working hypothesis is that registration apparatuses for microsystems must contain detectors and that their readings are signals from detectors. This implies that the separation status of a microsystem changes during both preparation and registration. A new rule that specifies what happens when these changes occur and that guarantees the objectification is formulated and discussed. A part of our result has certain similarities with ‘collapse of the wave function’.
The recently established universal uncertainty principle revealed that two nowhere commuting observables can be measured simultaneously in some state, whereas they have no joint probability distribution in any state. Thus, one measuring apparatus can simultaneously measure two observables that have no simultaneous reality. In order to reconcile this discrepancy, an approach based on quantum logic is proposed to establish the relation between quantum reality and measurement. We provide a language for speaking of values of observables independent of measurement, based on quantum logic, and we construct in this language the state-dependent notions of joint determinateness, value identity, and simultaneous measurability. This naturally provides a contextual interpretation, in which we can safely claim that one measuring apparatus measures one observable in one context and simultaneously measures another nowhere commuting observable in another, incompatible context.
I show that quantum theory is the only probabilistic framework that permits arbitrary processes to be emulated by sequences of local measurements. This supports the view that, contrary to conventional wisdom, measurement should not be regarded as a complex phenomenon in need of a dynamical explanation but rather as a primitive—and perhaps the only primitive—operation of the theory.
This paper proposes a new theory of quantum measurement: a state reduction theory in which reduction is to the elements of the number operator basis of a system, triggered by the occurrence of annihilation or creation (or lowering or raising) operators in the time evolution of the system. It is from these operator types that the acronym ‘LARC’ is derived. Reduction does not occur immediately after the trigger event; it occurs at some later time with probability P_t per unit time, where P_t is very small. Localisation of macroscopic objects occurs in the natural way: photons from an illumination field are reflected off a body and later absorbed by another body. Each possible absorption of a photon by a molecule in the second body generates annihilation and raising operators, which in turn trigger a probability per unit time P_t of a state reduction into the number operator basis for the photon field and the number operator basis of the electron orbitals of the molecule. Since all photons in the illumination field have come from the location of the first body, wherever that is, a single reduction leads to a reduction of the position state of the first body relative to the second, with a total probability of mP_t, where m is the number of photon absorption events. Unusually for a reduction theory, the LARC theory is naturally relativistic.
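The step from a per-event probability P_t to a total probability of roughly mP_t is the small-probability limit for independent triggers, which a quick simulation makes concrete. The independence assumption and all parameter values below are ours, purely for illustration of the arithmetic, not part of the LARC theory's formalism:

```python
import random

def reduction_this_step(m, p_t):
    """One unit time step: each of m absorption events independently
    triggers a reduction with probability p_t; a reduction occurs if
    at least one event fires."""
    return any(random.random() < p_t for _ in range(m))

random.seed(1)
m, p_t, trials = 50, 1e-3, 200_000
rate = sum(reduction_this_step(m, p_t) for _ in range(trials)) / trials
# Exact per-step probability: 1 - (1 - p_t)**m, which for small p_t
# is approximately m * p_t = 0.05.
```

For very small p_t the correction term is negligible, which is why the abstract can quote the total simply as mP_t.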
The polemical term “interaction-free measurement” (IFM) is analyzed in its interpretative nature. Two seminal works proposing the term are revisited and their underlying interpretations are assessed. The role played by nonlocal quantum correlations (entanglement) is formally discussed and some controversial conceptions in the original treatments are identified. As a result the term IFM is shown to be consistent neither with the standard interpretation of quantum mechanics nor with the lessons provided by the EPR debate.
Starting from an abstract setting for the Lüders-von Neumann quantum measurement process and its interpretation as a probability conditionalization rule in a non-Boolean event structure, the author derived a certain generalization of operator algebras in a preceding paper. This is an order-unit space with some specific properties. It becomes a Jordan operator algebra under a certain set of additional conditions, but does not possess a multiplication operation in the most general case. A major objective of the present paper is the search for examples of the structure mentioned above that do not stem from Jordan operator algebras; the first natural candidates are matrix algebras over the octonions and other nonassociative rings. Therefore, the case when a nonassociative commutative multiplication exists is studied without assuming that it satisfies the Jordan condition. The characteristics of the resulting algebra are analyzed. This includes the uniqueness of the spectral resolution as well as a criterion for its existence, subalgebras that are Jordan algebras, associative subalgebras, and more levels of compatibility than occur in standard quantum mechanics. The paper cannot provide the desired example, but it contributes to the search by identifying some typical differences between the potential examples and the Jordan operator algebras and by negative results concerning some first natural candidates. The possibility that no such example exists cannot be ruled out. However, this would result in an unexpected new characterization of Jordan operator algebras, which would have a significant impact on quantum axiomatics, since some customary axioms (e.g., power-associativity or the sum postulate for observables) might then turn out to be redundant.
In this study some conceptual developments in the theory of measurement are reframed with the use of a model-theoretic interpretation. The paradigms (physical, extended physical, and representational) which emerge in this process of reconceptualization are discussed. This discussion enables us to highlight a certain logic behind a series of developments in the theory of measurement and leads to a new, more general conceptualization.
Measurement invariance (MI) is a prerequisite for comparing latent variable scores across groups. The current paper introduces the concept of approximate measurement invariance, building on the work of Muthén and Asparouhov and their application of Bayesian Structural Equation Modeling (BSEM) in the software Mplus. They showed that with BSEM exact zero constraints can be replaced with approximate zeros to allow for minimal steps away from strict MI, still yielding a well-fitting model. This new opportunity enables researchers to make explicit trade-offs between the degree of MI on the one hand, and the degree of model fit on the other. Throughout the paper we discuss the topic of approximate MI, followed by an empirical illustration where the test for MI fails, but where allowing for approximate MI results in a well-fitting model. Using simulated data, we investigate in which situations approximate MI can be applied and when it leads to unbiased results. Both our empirical illustration and the simulation study show that approximate MI outperforms full or partial MI in detecting/recovering the true latent mean difference when there are (many) small differences in the intercepts and factor loadings across groups. In the discussion we provide a step-by-step guide indicating which type of MI is preferred in which situation. Our paper provides a first step in the new research area of (partial) approximate MI and shows that it can be a good alternative when strict MI leads to a badly fitting model and when partial MI cannot be applied.
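The core formal move behind approximate MI can be stated compactly. The following is a sketch of the BSEM idea; the prior variance is an analyst's choice (small values such as 0.01 are commonly used as defaults in this literature), and the notation is ours:

```latex
% Strict MI: loadings \lambda_{jg} and intercepts \nu_{jg} of item j
% are held exactly equal across groups g = 1, \dots, G:
\lambda_{jg} = \lambda_j, \qquad \nu_{jg} = \nu_j .
% Approximate MI (BSEM): the exact zero differences are replaced by
% zero-mean priors with a small variance \sigma^2:
\lambda_{jg} - \lambda_j \sim N(0, \sigma^2),
\qquad
\nu_{jg} - \nu_j \sim N(0, \sigma^2).
```

Shrinking σ² toward zero recovers strict MI, while a small positive σ² tolerates minor group differences without freeing the parameters entirely, which is exactly the fit-versus-invariance trade-off the paper discusses.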
Signal causality, the prohibition of superluminal information transmission, is the fundamental property shared by quantum measurement theory and relativity, and it is the key to understanding the connection between nonlocal measurement effects and elementary interactions. To prevent those effects from transmitting information between the generating and observing process, they must be induced by the kinds of entangling interactions that constitute measurements, as implied in the Projection Postulate. They must also be nondeterministic as reflected in the Born Probability Rule. The nondeterminism of entanglement-generating processes explains why the relevant types of information cannot be instantiated in elementary systems, and why the sequencing of nonlocal effects is, in principle, unobservable. This perspective suggests a simple hypothesis about nonlocal transfers of amplitude during entangling interactions, which yields straightforward experimental consequences.
It is argued that the so-called minimal statistical interpretation of quantum mechanics does not completely resolve the measurement problem, in that this view is unable to show that quantum mechanics can dispense with classical physics when it comes to a treatment of the measuring interaction. It is suggested that the view that quantum mechanics applies to individual systems should not be too hastily abandoned, in that this view gives perhaps the best hope of leading to a version of quantum mechanics which does provide a complete solution to the measurement problem.
A certain generalization of the mathematical formalism of quantum mechanics beyond operator algebras is considered. The approach is based on the concept of conditional probability and the interpretation of the Lüders-von Neumann quantum measurement as a probability conditionalization rule. A major result shows that the operator algebras must be replaced by order-unit spaces with some specific properties in the generalized approach, and it is analyzed under which conditions these order-unit spaces become Jordan algebras. An application of this result provides a characterization of the projection lattices in operator algebras.
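The Lüders-von Neumann rule invoked here, in its standard textbook form for projections E and F and a density operator ρ, reads as follows:

```latex
% Lüders rule: post-measurement state after a yes-outcome for
% the projection E (with tr(E\rho) > 0):
\rho \;\longmapsto\; \rho_E = \frac{E \rho E}{\mathrm{tr}(E \rho)} .
% Read as probability conditionalization: for a further projection F,
\mathbb{P}(F \mid E) = \mathrm{tr}(\rho_E F)
  = \frac{\mathrm{tr}(E \rho E \, F)}{\mathrm{tr}(E \rho)} .
```

When E and F commute this reduces to ordinary Bayesian conditionalization on a Boolean algebra of events; the generalized, non-Boolean reading of this rule is precisely the starting point of the paper's order-unit-space approach.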