This paper describes how the Electrical and Computer Engineering Department at South Dakota School of Mines and Technology has chosen to integrate ethics into its curriculum. All university freshman engineering students are introduced to ethics through the presentation of ethical dilemmas. During this exercise, students are required to argue both sides ('for' and 'against') of a hypothetical ethical engineering dilemma. This provides a setting for rich discussion, with the desired outcome that students learn to analyze a situation carefully before drawing conclusions. In the sophomore year, students are introduced to methods for applying the fundamental principles, the fundamental canons, and the suggested guidelines for use with the fundamental canons of ethics when deciding what action to take when confronted with an ethical dilemma. Seniors currently use the 'sophomore' method because the new sequencing is only now being phased in; next year the seniors will undertake a more in-depth analysis of ethical case studies.
This book traces the deployment of intermedial aesthetics in the works of early twentieth-century female performers. By destabilizing medial and genre boundaries, these women created compelling and meaningful performances that negotiated turn-of-the-century American social and cultural issues.
We argue that the intractable part of the measurement problem -- the 'big' measurement problem -- is a pseudo-problem that depends for its legitimacy on the acceptance of two dogmas. The first dogma is John Bell's assertion that measurement should never be introduced as a primitive process in a fundamental mechanical theory like classical or quantum mechanics, but should always be open to a complete analysis, in principle, of how the individual outcomes come about dynamically. The second dogma is the view that the quantum state has an ontological significance analogous to the significance of the classical state as the 'truthmaker' for propositions about the occurrence and non-occurrence of events, i.e., that the quantum state is a representation of physical reality. We show how both dogmas can be rejected in a realist information-theoretic interpretation of quantum mechanics as an alternative to the Everett interpretation. The Everettian, too, regards the 'big' measurement problem as a pseudo-problem, because the Everettian rejects the assumption that measurements have definite outcomes, in the sense that one particular outcome, as opposed to other possible outcomes, actually occurs in a quantum measurement process. By contrast with the Everettians, we accept that measurements have definite outcomes. By contrast with the Bohmians and the GRW 'collapse' theorists who add structure to the theory and propose dynamical solutions to the 'big' measurement problem, we take the problem to arise from the failure to see the significance of Hilbert space as a new kinematic framework for the physics of an indeterministic universe, in the sense that Hilbert space imposes kinematic objective probabilistic constraints on correlations between events.
A deterministic model that accounts for the statistical behavior of random samples of identical particles is presented. The model is based on some nonmeasurable distribution of spin values in all directions. The mathematical existence of such distributions is proved by set-theoretical techniques, and the relation between these distributions and observed frequencies is explored within an appropriate extension of probability theory. The relation between quantum mechanics and the model is specified. The model is shown to be consistent with known polarization phenomena and the existence of macroscopic magnetism. Finally...
We develop and defend the thesis that the Hilbert space formalism of quantum mechanics is a new theory of probability. The theory, like its classical counterpart, consists of an algebra of events and the probability measures defined on it. The construction proceeds in the following steps: (a) Axioms for the algebra of events are introduced following Birkhoff and von Neumann. All axioms, except the one that expresses the uncertainty principle, are shared with the classical event space. The only models for the set of axioms are lattices of subspaces of inner product spaces over a field K. (b) Another axiom, due to Soler, forces K to be the field of real or complex numbers, or the quaternions. We suggest a probabilistic reading of Soler's axiom. (c) Gleason's theorem fully characterizes the probability measures on the algebra of events, so that Born's rule is derived. (d) Gleason's theorem is equivalent to the existence of a certain finite set of rays, with a particular orthogonality graph (Wondergraph). Consequently, all aspects of quantum probability can be derived from rational probability assignments to finite "quantum gambles". (e) All experimental aspects of entanglement, the violation of Bell's inequality in particular, are explained as natural outcomes of the probabilistic structure. (f) We hypothesize that even in the absence of decoherence, macroscopic entanglement can very rarely be observed, and provide a precise conjecture to that effect. We also discuss the relation of the present approach to quantum logic, realism and truth, and the measurement problem.
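Step (c) can be illustrated numerically. The sketch below is a toy example, not from the paper: the state vector and measurement basis are hypothetical choices, and the point is only that a density operator induces a probability measure on projections via the Born rule p(E) = tr(ρE), as Gleason's theorem characterizes.

```python
import numpy as np

# Hypothetical single-qubit state vector (amplitudes chosen for illustration)
psi = np.array([np.sqrt(0.2), np.sqrt(0.8)])
rho = np.outer(psi, psi)  # density matrix |psi><psi|

# Projections onto the computational basis
projections = [np.outer(b, b) for b in np.eye(2)]

# Born rule: p(E) = tr(rho E); the values form a probability measure
probs = [float(np.trace(rho @ E)) for E in projections]
print(probs)  # close to [0.2, 0.8]; sums to 1
```

Any orthonormal basis would do: the measure assigns a probability to every projection, and Gleason's theorem says all such measures arise this way (in dimension three or higher).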
We develop a systematic approach to quantum probability as a theory of rational betting in quantum gambles. In these games of chance, the agent is betting in advance on the outcomes of several (finitely many) incompatible measurements. One of the measurements is subsequently chosen and performed, and the money placed on the other measurements is returned to the agent. We show how the rules of rational betting imply all the interesting features of quantum probability, even in such finite gambles. These include the uncertainty principle and the violation of Bell's inequality, among others. Quantum gambles are closely related to quantum logic and provide a new semantics for it. We conclude with a philosophical discussion on the interpretation of quantum mechanics.
We describe a possible physical device that computes a function that cannot be computed by a Turing machine. The device is physical in the sense that it is compatible with General Relativity. We discuss some objections, focusing on those which deny that the device is either a computer or computes a function that is not Turing computable. Finally, we argue that the existence of the device does not refute the Church–Turing thesis, but nevertheless may be a counterexample to Gandy's thesis.
Following a proposal of Vaidman (The Stanford Encyclopaedia of Philosophy, 2014; The Probable and the Improbable: Understanding Probability in Physics, Essays in Memory of Itamar Pitowsky, 2011), Sebens and Carroll have argued that in Everettian quantum theory, observers are uncertain, before they complete their observation, about which Everettian branch they are on. They argue further that this solves the problem of making sense of probabilities within Everettian quantum theory, even though the theory itself is deterministic. We note some problems with these arguments.
We argue that certain types of many minds (and many worlds) interpretations of quantum mechanics, e.g. Lockwood ([1996a]) and Deutsch, do not provide a coherent interpretation of the quantum mechanical probabilistic algorithm. By contrast, in Albert and Loewer's version of the many minds interpretation, there is a coherent interpretation of the quantum mechanical probabilities. We consider Albert and Loewer's probability interpretation in the context of Bell-type and GHZ-type states and argue that it implies a certain (weak) form of nonlocality. 1 Introduction; 2 Albert and Loewer's interpretation; 3 Probabilities in Lockwood's interpretation; 4 Sets of minds and their correlations; 5 Many minds and GHZ.
In the mid-nineteenth century George Boole formulated his 'conditions of possible experience'. These are equations and inequalities that the relative frequencies of events must satisfy. Some of Boole's conditions have been rediscovered in more recent years by physicists, including Bell inequalities, Clauser-Horne inequalities, and many others. In this paper, the nature of Boole's conditions and their relation to propositional logic is explained, and the puzzle associated with their violation by quantum frequencies is investigated in relation to a variety of approaches to the interpretation of quantum mechanics. * While preparing this paper for publication I have learnt of the untimely death of Professor J. S. Bell, and I wish to dedicate the paper to his memory. This research was undertaken while I spent a sabbatical leave at Wolfson College, and the History and Philosophy of Science Department at the University of Cambridge. I would like to thank Michael Redhead and Jeremy Butterfield for their hospitality and for helpful discussions. A first draft of this paper was distributed among the participants of the conference 'Einstein in Context', which was held in Israel in April 1990. I have benefited from the comments of many colleagues. I would like to thank in particular Arthur Fine, who enlightened me on the prism models, David Albert, Maya Bar-Hillel, Yemima Ben-Menachem, Mara Beller, Simon Saunders, and Mark Steiner. This research is partially supported by the Edelstein Center for the History and Philosophy of Science at the Hebrew University.
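A violation of one of Boole's conditions can be exhibited with a few lines of arithmetic. The sketch below uses the singlet-state correlation E(a, b) = -cos(a - b) and the standard CHSH angle choices (these specific angles are a textbook example, not taken from this paper); the classical bound, a Boole-type condition on relative frequencies, is |S| ≤ 2.

```python
import math

def singlet_correlation(a, b):
    # Quantum prediction for spin measurements on the singlet state,
    # along directions at angles a and b in the same plane.
    return -math.cos(a - b)

def chsh(a, a2, b, b2, corr):
    # The CHSH combination; any local theory satisfies |S| <= 2.
    return corr(a, b) - corr(a, b2) + corr(a2, b) + corr(a2, b2)

# Standard angle choices that maximize the quantum violation
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4, singlet_correlation)
print(abs(S))  # 2*sqrt(2), about 2.828, exceeding the Boole/CHSH bound of 2
```

The quantum value 2√2 (Tsirelson's bound) is exactly the kind of violation of a 'condition of possible experience' that the paper investigates.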
Why do we not see large macroscopic objects in entangled states? There are two ways to approach this question. The first is dynamic: the coupling of a large object to its environment causes any entanglement to decrease considerably. The second approach, which is discussed in this paper, puts the stress on the difficulty of observing a large-scale entanglement. As the number of particles n grows, we need an ever more precise knowledge of the state and an ever more carefully designed experiment in order to recognize entanglement. To develop this point we consider a family of observables, called witnesses, which are designed to detect entanglement. A witness W distinguishes all the separable (unentangled) states from some entangled states. If we normalize the witness W to satisfy tr(Wρ) ≤ 1 for all separable states ρ, then the efficiency of W depends on the size of its maximal eigenvalue in absolute value, that is, its operator norm ‖W‖. It is known that there are witnesses on the space of n qubits for which ‖W‖ is exponential in n. However, we conjecture that for a large majority of n-qubit witnesses, ‖W‖ = O(√(n log n)). Thus, in a nonideal measurement, which includes errors, the largest eigenvalue of a typical witness lies below the threshold of detection. We prove this conjecture for the family of extremal witnesses introduced by Werner and Wolf [Phys. Rev. A 64, 032112 (2001)].
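For a concrete sense of what a witness and its operator norm are, the sketch below builds the standard two-qubit witness for the Bell state |φ+⟩ = (|00⟩ + |11⟩)/√2. Note the sign convention used here (tr(Wρ) ≥ 0 for separable ρ, negative values flag entanglement) is the common textbook one and differs from the normalization in the abstract; the construction is illustrative, not the paper's family of witnesses.

```python
import numpy as np

# Bell state |phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
proj = np.outer(phi, phi)

# Textbook witness: W = I/2 - |phi+><phi+|
# tr(W rho) >= 0 for every separable rho; < 0 detects entanglement.
W = 0.5 * np.eye(4) - proj

val_sep = float(np.trace(W @ np.diag([0.0, 1.0, 0.0, 0.0])))  # product state |01>
val_ent = float(np.trace(W @ proj))                           # the Bell state itself
op_norm = float(np.linalg.norm(W, 2))  # largest eigenvalue in absolute value
print(val_sep, val_ent, op_norm)
```

Here the norm is 1/2; the paper's point is that for typical n-qubit witnesses the analogous norm grows too slowly with n for the negative value to survive realistic measurement error.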
A classical gas at equilibrium satisfies the locality condition if the correlations between local fluctuations at a pair of remote small regions diminish in the thermodynamic limit. The gas satisfies a strong locality condition if the local fluctuations at any number of remote locations have no (pair, triple, quadruple, ...) correlations among them in the thermodynamic limit. We prove that locality is equivalent to a certain factorizability condition on the distribution function. The analogous quantum condition fails in the case of a free Bose gas. Next we prove that strong locality is equivalent to the total factorizability of the distribution function, and thus (given Liouville's theorem) to the Maxwell-Boltzmann distribution for an ideal gas.
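The locality condition can be made vivid with a back-of-envelope calculation (an illustration of the limit, not the paper's proof). For an ideal classical gas, particles are placed independently and uniformly, so the counts in two disjoint regions are multinomially distributed, with covariance Cov(n_A, n_B) = -N p_A p_B where p = v/V. At fixed density and fixed region size, this covariance vanishes as the total volume grows:

```python
def count_covariance(density, region_vol, total_vol):
    # Ideal classical gas: N = density * V particles, each independently
    # and uniformly distributed. Counts in two disjoint regions of volume
    # region_vol are multinomial, so Cov(n_A, n_B) = -N * p^2, p = v/V.
    N = density * total_vol
    p = region_vol / total_vol
    return -N * p * p

# Thermodynamic limit: fixed density and region size, growing volume.
for V in (10.0, 100.0, 1000.0, 10000.0):
    print(V, count_covariance(1.0, 1.0, V))  # covariance scales like -1/V
```

The anticorrelation (a particle in A is not in B) decays like 1/V, which is the sense in which remote local fluctuations decouple in the thermodynamic limit.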
We consider the set of all matrices of the form p_ij = tr[W(E_i ⊗ F_j)], where E_i, F_j are projections on a Hilbert space H and W is some state on H ⊗ H. We derive the basic properties of this set, compare it with the classical range of probability, and note how its properties may be related to geometric measures of entanglement.
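A minimal numerical instance of such a matrix, assuming the simplest case H = C^2 with computational-basis projections and the Bell state as W (these particular choices are illustrative, not from the paper):

```python
import numpy as np

# One-qubit projections E_i (and F_j) onto the computational basis
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

# State W on H (x) H: the Bell state |phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
W = np.outer(phi, phi)

# The matrix p_ij = tr[W (E_i (x) F_j)]
p = np.array([[float(np.trace(W @ np.kron(Ei, Fj))) for Fj in E] for Ei in E])
print(p)  # perfectly correlated outcomes: diagonal 0.5, off-diagonal 0
```

The resulting matrix of joint probabilities is one point in the set the paper studies; varying W and the projections sweeps out the quantum range, to be compared with the classical polytope.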
Boltzmann’s approach to statistical mechanics is widely believed to be conceptually superior to Gibbs’ formulation. However, the microcanonical distribution often fails to behave as expected: the ergodicity of the motion relative to it can rarely be established for realistic systems; worse, it can often be proved to fail. Also, the approach involves idealizations that have little physical basis. Here we take Khinchin’s advice and propose a definition of equilibrium that is more realistic: the definition reflects the fact that the system is made of a great number of particles, and implies that all measurable macroscopic observables have steady values.
Semantic priming has long been recognized to reflect, along with automatic semantic mechanisms, the contribution of controlled strategies. However, previous theories of controlled priming were mostly qualitative, lacking common grounds with modern mathematical models of automatic priming based on neural networks. Recently, we introduced a novel attractor network model of automatic semantic priming with latching dynamics. Here, we extend this work to show how the same model can also account for important findings regarding controlled processes. Assuming the rate of semantic transitions in the network can be adapted using simple reinforcement learning, we show how basic findings attributed to controlled processes in priming can be achieved, including their dependency on stimulus onset asynchrony and relatedness proportion and their unique effect on associative, category-exemplar, mediated and backward prime-target relations. We discuss how our mechanism relates to the classic expectancy theory and how it can be further extended in future developments of the model.
1. The Physical Church-Turing Thesis. Physicists often interpret the Church-Turing Thesis as saying something about the scope and limitations of physical computing machines. Although this was not the intention of Church or Turing, the Physical Church-Turing Thesis is interesting in its own right. Consider, for example, Wolfram’s formulation: One can expect in fact that universal computers are as powerful in their computational capabilities as any physically realizable system can be, that they can simulate any physical system . . . No physically implementable procedure could then shortcut a computationally irreducible process. (Wolfram 1985) Wolfram’s thesis consists of two parts: (a) Any physical system can be simulated (to any degree of approximation) by a universal Turing machine; (b) Complexity bounds on Turing machine simulations have physical significance. For example, suppose that the computation of the minimum energy of some system of n particles takes at least exponentially (in n) many steps. Then the relaxation time of the actual physical system to its minimum energy state will also take exponential time.
This paper offers a new semantic theory of existentials (sentences of the form 'There be NP XP', where the NP is the pivot and the XP the coda) in which pivots are (second order) predicates and codas are modifiers. The theory retains the analysis of pivots as denoting generalized quantifiers (Barwise and Cooper 1981; Keenan 1987), but departs from previous analyses in analyzing codas as contextual modifiers on a par with temporal/locative frame adverbials. Existing analyses universally assume that pivots are arguments of some predicate, and that codas are main or secondary predicates. It is shown that these analyses cannot account for the behavior of codas with quantifiers and for the interaction of multiple codas, both of which receive a simple treatment on the proposed theory. The assimilation of codas to frame adverbials explains several semantic properties which have not been analyzed in the semantic literature, and that distinguish existentials from their copular counterparts. Furthermore, it highlights important properties of the semantics of modification and its relation to predication.
The Einstein-Podolsky-Rosen argument for the incompleteness of quantum mechanics involves two assumptions: one about locality and the other about when it is legitimate to infer the existence of an element-of-reality. Using one simple thought experiment, we argue that quantum predictions and the relativity of simultaneity require that both these assumptions fail, whether or not quantum mechanics is complete.
This paper explores an understudied and poorly understood phenomenon of morphological syncretism in which a morpheme otherwise used to mark the head of a possessive NP appears on words naming property concept (PC) states (states named by adjectives in languages with that lexical category; Dixon, Where have all the adjectives gone? And other essays in Semantics and Syntax, 1982) in predicative and attributive contexts. This phenomenon is found across a variety of unrelated languages. We examine its manifestation in Ulwa, an endangered Misumalpan language of Nicaragua, where diachronic evidence clearly shows that a single affix is involved. We propose an explanation for the syncretism based on an explicit syntactic and semantic analysis of the relevant constructions. On the proposed explanation, the syncretism arises out of a combination of semantic and morphosyntactic facts of Ulwa grammar. Specifically, we propose that the Ulwa pattern exemplifies a possessive strategy of predication. Intuitively, this strategy is a manifestation in grammar of the idiomatic equivalence between the property of being F and the property of having F-ness.
Kochen and Specker’s theorem can be seen as a consequence of Gleason’s theorem and logical compactness. Similar compactness arguments lead to stronger results about finite sets of rays in Hilbert space, which we also prove by a direct construction. Finally, we demonstrate that Gleason’s theorem itself has a constructive proof, based on a generic, finite, effectively generated set of rays, on which every quantum state can be approximated.