Bennett and Schumacher’s postselected quantum teleportation is a model of closed timelike curves (CTCs) that leads to results physically different from Deutsch’s model. We show that even a single qubit passing through a postselected CTC (P-CTC) is sufficient to do any postselected quantum measurement with certainty, and we discuss an important difference between “Deutschian” CTCs (D-CTCs) and P-CTCs in which the future existence of a P-CTC might affect the present outcome of an experiment. Then, based on a suggestion of Bennett and Smith, we explicitly show how a party assisted by P-CTCs can distinguish a set of linearly independent quantum states, and we prove that it is not possible for such a party to distinguish a set of linearly dependent states. The power of P-CTCs is thus weaker than that of D-CTCs because the Holevo bound still applies to circuits using them, regardless of their ability to conspire in violating the uncertainty principle. We then discuss how different notions of a quantum mixture that are indistinguishable in linear quantum mechanics lead to dramatically differing conclusions in a nonlinear quantum mechanics involving P-CTCs. Finally, we give explicit circuit constructions that can efficiently factor integers, efficiently solve any decision problem in the intersection of NP and coNP, and probabilistically solve any decision problem in NP. These circuits accomplish these tasks with just one qubit traveling back in time, and they exploit the ability of postselected closed timelike curves to create grandfather paradoxes for invalid answers.
Classical particles of the same kind are distinguishable: they can be labeled by their positions and follow different trajectories. This distinguishability affects the number of ways W a macrostate can be realized on the micro-level, and via S = k ln W this leads to a non-extensive expression for the entropy. This result is generally considered wrong because of its inconsistency with thermodynamics. It is sometimes concluded from this inconsistency, notoriously illustrated by the Gibbs paradox, that identical particles must be treated as indistinguishable after all; and even that quantum mechanics is indispensable for making sense of this. In this article we argue, by contrast, that the classical statistics of distinguishable particles and the resulting non-extensive entropy function are perfectly all right both from a theoretical and an experimental perspective. We remove the inconsistency with thermodynamics by pointing out that the entropy concept in statistical mechanics is not completely identical to the thermodynamical one. Finally, we observe that even identical quantum particles are in some cases distinguishable; and conclude that quantum mechanics is irrelevant to the Gibbs paradox.
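As a standard textbook illustration of the non-extensivity at issue (these formulas are not quoted from the abstract): for a classical ideal gas of N particles in volume V with thermal de Broglie wavelength λ, counting distinguishable microstates versus dividing W by N! gives

```latex
% Distinguishable counting (no division by N!) yields a non-extensive entropy:
S_{\mathrm{dist}} = N k \left[ \ln\!\frac{V}{\lambda^{3}} + \frac{3}{2} \right]
% Dividing W by N! and applying Stirling's approximation restores extensivity
% (the Sackur--Tetrode form):
S_{\mathrm{indist}} = N k \left[ \ln\!\frac{V}{N \lambda^{3}} + \frac{5}{2} \right]
```

Doubling N and V doubles the second expression but not the first; the abstract's contention is that the first, non-extensive expression is nonetheless theoretically and experimentally acceptable.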
The suggestion that particles of the same kind may be indistinguishable in a fundamental sense, even in a way that challenges traditional notions of individuality and identity, first came up in the context of classical statistical mechanics. In particular, the Gibbs paradox has sometimes been interpreted as a sign of the untenability of the classical concept of a particle and as a premonition that quantum theory is needed. This idea of a ‘quantum connection’ stubbornly persists in the literature, even though it has also been criticized frequently. Here we shall argue that although this criticism is justified, the proposed alternative solutions have often been wrong and have not put the paradox in its right perspective. In fact, the Gibbs paradox is unrelated to fundamental issues of particle identity; only distinguishability in a pragmatic sense plays a role, and in principle the paradox is always there as long as the concept of a particle applies at all. In line with this we show that the paradox survives even in quantum mechanics, in spite of the quantum mechanical symmetrization postulates.
Heyes's (1998) triangulation approach to distinguishing a “theory” of mind (ToM) from a “theory” of behavior (ToB) in chimpanzees fails. The ToB theorist can appeal to the explicit training sessions and analogical reasoning to explain/predict the chimpanzees' behaviors. An alternative triangulation experiment is sketched, demonstrating how the removal of such training sessions paves the way toward solving the distinguishability problem.
This paper continues an earlier work by considering in what sense and to what extent identical Bohmian-mechanical particles in many-particle systems can be considered indistinguishable. We conclude that while whether identical Bohmian-mechanical particles are considered to be “statistically (in)distinguishable” is a matter of theory choice underdetermined by logic and experiment, such particles are in any case “physically distinguishable.”
The Gibbs' Paradox is commonly explained by invoking some type of "principle of indistinguishability," which asserts that the interchange of identical particles is not a real physical event, i.e., is operationally meaningless. However, if this principle is to provide a satisfactory resolution of the Paradox, it must be operationally possible to determine whether, in fact, two given systems are identical or not. That is, the assertion that the Gibbs' Paradox is resolvable by an indistinguishability principle actually is an assertion that we can in principle possess a complete set of effective procedures for determining the identity or non-identity of arbitrary physical systems. We show that, in rather general situations, an assertion of this type is not well founded. It is further pointed out that a failure to recognize an incomplete set of "sameness criteria" can lead to serious blunders in physics and in biology.
We discuss two qualities of quantum systems: various correlations existing between their subsystems and the distinguishability of different quantum states. This is then applied to analysing quantum information processing. While quantum correlations, or entanglement, are clearly of paramount importance for efficient pure state manipulations, mixed states present a much richer arena and reveal a more subtle interplay between correlations and distinguishability. The current work explores a number of issues related to identifying the important ingredients needed for quantum information processing. We discuss the Deutsch-Jozsa algorithm, the Shor algorithm, the Grover algorithm and the power of a single qubit class of algorithms. In the latter, a quantity called discord is seen to be more important than entanglement. One section is dedicated to cluster states where entanglement is crucial, but its precise role is highly counter-intuitive. Here we see that the notion of distinguishability becomes a more useful concept.
Disagreements over the meaning of the thermodynamic entropy and how it should be defined in statistical mechanics have endured for well over a century. In an earlier paper, I showed that there were at least nine essential properties of entropy that are still under dispute among experts. In this paper, I examine the consequences of differing definitions of the thermodynamic entropy of macroscopic systems. Two proposed definitions of entropy in classical statistical mechanics are (1) defining entropy on the basis of probability theory (first suggested by Boltzmann in 1877), and (2) the traditional textbook definition in terms of a volume in phase space (also attributed to Boltzmann). The present paper demonstrates the consequences of each of these proposed definitions of entropy and argues in favor of a definition based on probabilities.
Tolerance spaces are sets equipped with a reflexive, symmetric, but not necessarily transitive, relation of indistinguishability, and are useful for describing vagueness based on error-prone measurements. We show that any tolerance space can be embedded in one generated by comparisons using prototypical objects. As a result, propositions definable on a tolerance space can be translated into propositions behaving classically.
In this paper I claim that perceptual discriminatory skills rely on a suitable type of environment as an enabling condition for their exercise. This is because of the constitutive connection between environment and perceptual discriminatory skills, inasmuch as this connection is construed from an ecological approach. The exercise of a discriminatory skill yields knowledge of affordances of objects, properties, or events in the surrounding environment. This is practical knowledge in the first-person perspective. An organism learns to perceive an object by becoming sensitized to its affordances. I call this position ecological disjunctivism. A corollary of this position is that a case of perception and its corresponding case of hallucination—which is similar to the former only in some respects—are different in nature. I show then how the distinguishability problem is addressed by ecological disjunctivism.
A crucial aspect of scientific realism is what we mean by “true”. In Luk’s theory and model of scientific study, a theory can be believed to be “true” but a model is only accurate. Therefore, what do we mean by a “true” theory in scientific realism? Here, we focus on exploring the notion of truth by some thought experiments and we come up with the idea that truth is related to what we mean by the same. This has repercussions for the repeatability of the experiments and the predictive power of scientific knowledge. Apart from sameness, we also found that truth is related to the granularity of the observation, the limit of detection, the distinguishability of the objects in theory, the simultaneous measurements of objects/processes, the consistencies of the theory and the one-to-one correspondence between terms/events and objects/processes, respectively. While there is no guarantee that we can arrive at the final “true” theory, we have a process/procedure, with more and more experiments together with our own ingenuity, to direct us towards such a “true” theory. For quantum mechanics, since a particle is also regarded as a wave, quantum mechanics cannot be considered as a true theory based on the correspondence theory of truth. Failing this, truth may be defined by the coherence theory of truth, which is similar to the coherence of beliefs. However, quantum mechanics may not be believed to be a true theory based on the coherence theory of truth because wave properties and particle properties may contradict. Further research is needed to address this problem if we want to regard quantum mechanics as a “true” theory.
Duncan Pritchard has recently defended a view he calls ‘epistemological disjunctivism’, largely inspired by John McDowell. I argue that Pritchard is right to associate the view with McDowell, and that McDowell’s ‘inference-blocking’ argument against the sceptic succeeds only if epistemological disjunctivism is accepted. However, Pritchard also recognises that epistemological disjunctivism appears to conflict with our belief that genuine and illusory experiences are indistinguishable (the ‘distinguishability problem’). Since the indistinguishability of experiences is the antecedent in the inference McDowell intends to block, I suggest that his argument rests on an inconsistent set of premises. In support of this, I show that Pritchard’s response to the distinguishability problem is incompatible with the conclusion of the ‘inference-blocking’ argument, and that the response available in McDowell’s work relies on a mistaken conception of fallibility. Either McDowell must deny the sceptic’s premise that perceptual experiences are indistinguishable, or he must give up his conclusion that perceptual warrant can be indefeasible.
Any interpretation of Hegel which stresses both his deep dependence on and radical revision of Kant must account for the nature of the difference between what Hegel calls a merely finite idealism and a so-called ’Absolute Idealism’. Such a clarification in turn depends on understanding Hegel’s claim to have preserved the distinguishability of intuition and concept, but to have insisted on their inseparability, or, to have defended their ’organic’ rather than ’mechanical’ relation. This is the main issue in this chapter, which invokes John McDowell’s notion of ’the unboundedness of the conceptual’ to clarify the issue, as well as noting a number of similar claims in Wittgenstein. The implications of Hegel’s view for metaphysics generally are explored.
This paper is a further consideration of Hemmo and Shenker’s ideas about the proper conceptual characterization of macrostates in statistical mechanics. We provide two formulations of how macrostates come about as elements of certain partitions of the system’s phase space imposed by the interaction between the system and an observer, and we show that these two formulations are mathematically equivalent. We also reflect on conceptual issues regarding the relationship of macrostates to distinguishability, thermodynamic regularity, observer dependence, and the general phenomenon of measurement.
This thesis analyses the ontological nature of quantum particles. In it I argue that quantum particles, despite their indistinguishability, are objects in much the same way as classical particles. This similarity provides an important point of continuity between classical and quantum physics. I consider two notions of indistinguishability, that of indiscernibility and permutation symmetry. I argue that neither sort of indistinguishability undermines the identity of quantum particles. I further argue that, when we understand indistinguishability in terms of permutation symmetry, classical particles are just as indistinguishable as quantum particles; for classical physics also possesses permutation symmetry.
It is only when mixing two or more pure substances along a reversible path that the entropy of the mixing can be made physically manifest. It is not, in this case, a mere mathematical artifact. This mixing requires a process of successive stages. In any finite number of stages, the external manifestation of the entropy change, as a definite and measurable quantity of heat, is a fully continuous function of the relevant variables. It is only at an infinite and unattainable limit that a non-uniform convergence occurs. And this occurs when considered in terms of the number of stages together with a distinguishability parameter appropriate to the particular device which is used to achieve reversibility. These considerations, which are of technological interest to chemical engineers, resolve a paradox derived in chemical theory called Gibbs' Paradox.
The purpose of this article is to introduce a class of distance-based iterated revision operators generated by minimizing the geodesic distance on a graph. Such operators correspond bijectively to metrics and have a simple finite presentation. As distance is generated by distinguishability, our framework is appropriate for modelling contexts where distance is generated by threshold, and therefore, when measurement is erroneous.
The introduction of statistical models represented by directed acyclic graphs (DAGs) has proved fruitful in the construction of expert systems, in allowing efficient updating algorithms that take advantage of conditional independence relations (Pearl, 1988, Lauritzen et al. 1993), and in inferring causal structure from conditional independence relations (Spirtes and Glymour, 1991, Spirtes, Glymour and Scheines, 1993, Pearl and Verma, 1991, Cooper, 1992). As a framework for representing the combination of causal and statistical hypotheses, DAG models have shed light on a number of issues in statistics ranging from Simpson’s Paradox to experimental design (Spirtes, Glymour and Scheines, 1993). The relations of DAGs with statistical constraints, and the equivalence and distinguishability properties of DAG models, are now well understood, and their characterization and computation involves three properties connecting graphical structure and probability distributions: (i) a local directed Markov property, (ii) a global directed Markov property, and (iii) factorizations of joint densities according to the structure of a graph (Lauritzen et al., 1990).
The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support.
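For concreteness (a standard formula consistent with the abstract's description, not quoted from it): Wootters' statistical distance between two pure quantum states reduces to the angle between them in Hilbert space,

```latex
% Statistical distance between pure states |psi> and |phi>:
d(\psi, \varphi) = \arccos \left| \langle \psi | \varphi \rangle \right|
```

so distinguishability is governed by the absolute value of the overlap (matrix element), with d = π/2 exactly when the states are orthogonal and hence perfectly distinguishable in a single measurement.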
When presented with a situation involving an agent’s choice between alternative actions, a moral oracle says what the agent is allowed to do. The oracle bases her advice on some moral theory, but the nature of that theory is not known by us. The moral oracle’s test consists in determining whether a series of questions to the oracle can be so constructed that her answers will reveal which of two given types of theories she adheres to. The test can be applied to moral theories in order to determine if they differ in their recommendations for action. Based on this test, a terminology is developed to specify different forms and degrees of distinguishability between moral theories, or types of theories, in terms of their recommendations for action. In conclusion, the test is applied to consequentialism and utilitarianism.
A possible mechanism of nonlinear quantum evolution is introduced and its implications for quantum communication are investigated. First, it is demonstrated that an appropriate combination of wavefunction collapse and the consciousness of observer may permit the observer to distinguish nonorthogonal quantum states in principle, and thus consciousness will introduce certain nonlinearity into quantum dynamics. Next, it is shown that the distinguishability of nonorthogonal states can be used to achieve quantum superluminal communication, by which information can be transmitted nonlocally and faster than the speed of light. Finally, the issue of apparent incompatibility between superluminal communication and special relativity is briefly addressed.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
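For reference (standard definitions from logical information theory; the quantum form is a common formulation and an assumption about this paper's notation): the logical entropy of a partition π with block probabilities p_B is the two-draw probability of drawing a distinction,

```latex
% Classical logical entropy of a partition with block probabilities p_B:
h(\pi) = \sum_{B \neq B'} p_B \, p_{B'} = 1 - \sum_{B} p_B^{2}
% Quantum analogue for a density matrix rho:
h(\rho) = 1 - \operatorname{tr}\!\left(\rho^{2}\right)
```

that is, the probability that two independent draws land in distinct blocks; the quantum version for a density matrix ρ is what the abstract's fundamental theorem relates to the eigenstates distinguished by a projective measurement.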
The quantum formalism of distinguishable, yet equivalent particles (with symmetric or antisymmetric wave functions) is here worked out. The result is an entirely explicit formulation of the way in which classical mechanics emerges from quantum mechanics for such particles. Distinguishability is achieved at the cost of dynamical precision; the two are, in fact, complementary.
The Gibbs Paradox is essentially a set of open questions as to how sameness of gases or fluids is to be treated in thermodynamics and statistical mechanics. They have a variety of answers, some restricted to quantum theory, some to classical theory. The solution offered here applies to both in equal measure, and is based on the concept of particle indistinguishability. Correctly understood, it is the elimination of sequence position as a labelling device, where sequences enter at the level of the tensor product of one-particle state spaces. In both cases it amounts to passing to the quotient space under permutations. ‘Distinguishability’, in the sense in which it is usually used in classical statistical mechanics, is a mathematically convenient, but physically muddled, fiction.
The long history of ergodic and quasi-ergodic hypotheses provides the best example of the attempt to supply non-probabilistic justifications for the use of statistical mechanics in describing mechanical systems. In this paper we reverse the terms of the problem. We aim to show that accepting a probabilistic foundation of elementary particle statistics dispenses with the need to resort to ambiguous non-probabilistic notions like that of (in)distinguishability. In the quantum case, starting from suitable probability conditions, it is possible to deduce elementary particle statistics in a unified way. Following our approach Maxwell-Boltzmann statistics can also be deduced, and this deduction clarifies its status. Thus our primary aim in this paper is to give a mathematically rigorous deduction of the probability of a state with given energy for a perfect gas in statistical equilibrium; that is, a deduction of the equilibrium distribution for a perfect gas. A crucial step in this deduction is the statement of a unified statistical theory based on clearly formulated probability conditions from which the particle statistics follows. We believe that such a deduction represents an important improvement in elementary particle statistics, and a step towards a probabilistic foundation of statistical mechanics. In this Part I we first present some history: we recall some results of Boltzmann and Brillouin that go in the direction we will follow. Then we present a number of probability results we shall use in Part II. Finally, we state a notion of entropy referring to probability distributions, and give a natural solution to Gibbs' paradox.
Stein has raised a fundamental problem for any attempt to characterize instrumentalism and realism as substantive alternatives. This is the distinguishability problem, which consists in the problem of developing a form of instrumentalism that is substantially different from a plausible realist alternative and the problem of showing that this form of instrumentalism does justice to actual scientific practice. Using Stein’s own discussion of Maxwell, I formulate instrumentalism and realism as a scientist’s attitudes toward models, where an attitude is understood to be a complex of the scientist’s belief and intention regarding models. Developing a case study of Benzer’s modeling practice, I show that each attitude can structure inquiry differently and argue that to understand certain aspects of scientific practice, such as the practice of genetic mapping in Benzer’s work, we sometimes need to appeal to the coexistence of these attitudes.
We show: (1) It is possible to produce the three familiar statistics without referring to the problem of distinguishability; (2) what really distinguishes elementary particles is the correlation existing among them; (3) correlations existing among quantum particles, positive for bosons and negative for fermions, are completely different in character.
I identify one neglected source of support for a Kripkean reading of Wittgenstein’s Philosophical Investigations: the analogy between rules and epistemic grounds and the existence of a Kripkean anti-privacy argument about epistemic grounds in On Certainty. This latter argument supports Kripke’s claims that the basic anti-privacy argument in the Investigations (a) poses a question about the distinguishability of certain first-person attributions with identical assertability conditions, (b) concludes that distinguishability is provided by third-person evaluability, and (c) is a general argument, not one about a specific kind of alleged rules.
It is shown here that the microcanonical ensemble for a system of noninteracting bosons and fermions contains a subensemble of state vectors for which all particles of the system are distinguishable. This “IQC” (inner quantum-classical) subensemble is therefore fully classical, except for a rather extreme quantization of particle momentum and position, which appears as the natural price that must be paid for distinguishability. The contribution of the IQC subensemble to the entropy is readily calculated, and the criterion for this to be a good approximation to the exact entropy is a logarithmically strengthened form of the usual criterion for the validity of classical statistics in terms of the thermal de Broglie wavelength and the average volume per particle. Thus, it becomes possible to derive the Maxwell-Boltzmann distribution directly from the ensemble in the classical limit, using fully classical reasoning about the distinguishability of particles. The entropy is additive—the N! factor of the Boltzmann count cancels out in the course of the calculation, and the “N! paradox” is thereby resolved. The method of “correct Boltzmann counting” and the lowest term of the Wigner-Kirkwood series for the partition function are seen to be partly based on the IQC subensemble, and their partly nonclassical nature is clarified. The clear separation in the full ensemble of classical and nonclassical components makes it possible to derive the classical statistics of indistinguishable particles from their quantum statistics in a controlled, explicit way. This is particularly important for nonequilibrium theory. The treatment of molecular collisions along too-literally classical lines turns out to require exorbitantly high temperatures, although there are suggestions of indirect ways in which classical nonequilibrium theory might be justified at ordinary temperatures.
The applicability of exact classical ergodic and mixing theory to systems at ordinary temperatures is called into question, although the general idea of coarse-graining is confirmed. The concepts on which the IQC idea is based are shown to give rise to a series development of thermostatistical quantities, starting with the distinguishable-particle approximation.
Two different concepts of distinguishability are often mixed up in attempts to derive in quantum mechanics the symmetry of the wave function from indistinguishability of identical particles. Some of these attempts are analyzed and shown to be defective. It is argued that, although identical particles should be considered as observationally indistinguishable in symmetric states, they may be considered to be conceptually distinguishable. These two notions of distinguishability have quite different physical origins, the former one being related to observations while the latter has to do with the preparation of the system.
Among scholars, how to interpret and evaluate Kant’s rejection of diabolical evil remains controversial. This article has two aims. First, I will examine all six forms of diabolical evil either discussed by Kant or implicitly contained in his texts, thereby demonstrating the reasons why each of these forms must be rejected within his framework. The conclusion of this text analysis is that the extremity of human evil for Kant is quasi-diabolical Willkür which does evil for the sake of self-assertion. Second, I will offer a moderate defense of Kant’s view of diabolical evil as a whole. On the one hand, the legitimacy of Kant’s rejection of both diabolical Wille and full-fledged diabolical Willkür can be confirmed within his theory of practical freedom. On the other hand, faced with the possibility of ‘occasionally doing evil qua evil’, a critical defense of Kant’s moral psychology can be established, i.e. a defense that casts doubt on the distinguishability between ‘doing evil for the sake of self-assertion’ and ‘doing evil qua evil’.
In this paper I consider one kind of vague linguistic expression: adjectives like tall, big, expensive. These are called gradable adjectives. The most well-known linguistic theories that account for them are the so-called degree-based theories. In this paper I present a formal model that accounts for vague gradable adjectives as an alternative to degree-based theories. The model is built on two basic ingredients: (i) comparison classes and (ii) granular partitions. (i) Comparison classes are introduced to account for the context-sensitivity of vague adjectives. The extension of the predicate being tall in the comparison class of men is different from its extension in the comparison class of children. (ii) We can look at the elements of a context under different standards of precision, each of them corresponding to a granular level of observation. The finer the level is, the more differences between the individuals are detected. Granular partitions are used to represent indistinguishability relations between objects with respect to the properties expressed by vague adjectives.
Uniformities describing the distinguishability of states and of observables are discussed in the context of general statistical theories and are shown to be related to distinguished subspaces of continuous observables and states, respectively. The usual formalism of quantum mechanics contains no such physical uniformity for states. Using recently developed tools of quantum harmonic analysis, a natural one-to-one correspondence between continuous subspaces of nonrelativistic quantum and classical mechanics is established, thus exhibiting a close interrelation between physical uniformities for quantum states and compactifications of phase space. General properties of the completions of the quantum state space with respect to these uniformities are discussed.
I aim to provide a satisfying response to radical scepticism, a view according to which our knowledge of the external world is impossible. In the first chapter I investigate the nature and the source of scepticism. Radical scepticism is motivated both by the closureRK-based and the underdeterminationRK-based sceptical arguments. Because these two sceptical arguments are logically independent, any satisfying anti-sceptical proposal must take both of them into consideration. Also, scepticism is a paradox, albeit a spurious one, so we need to provide a diagnosis as to why we are led into the paradox and why the alleged paradox misrepresents our epistemic standings. Hence, I advocate an obstacle-dissolving strategy for combating the sceptical problem. In chapter two, I discuss the anti-sceptical import of transcendental arguments. Although ambitious transcendental arguments are vulnerable to Stroud’s dilemma, I argue that modest transcendental arguments are promising. Modest transcendental arguments start from an undoubted psychological fact and then reveal some necessary theoretical commitments that we must make. Regarding these commitments, I submit that we are type II epistemically justified in believing them. Our commitments are type II justified in the sense that making these commitments can promote our epistemic goals, namely, the attainment of true beliefs and the avoidance of false beliefs. After that, in light of Cassam’s objection to transcendental arguments, I contend that a modest transcendental argument should be used as a stepping stone for a diagnostic anti-sceptical proposal. In chapter three, I develop a Davidsonian response to closureRK-based radical scepticism. This form of sceptical argument rests on the idea that there is no limitation on our acquisition of rationally grounded knowledge. I discuss Davidson’s theory of radical interpretation, the principle of charity and triangulation.
Crucially, he argues that the content of a knowledge-apt everyday belief is determined by its typical cause and other relevant beliefs. Further, among different propositional attitudes, belief is prior to doubt. What follows is that doubt must be local, because it must presume other content-determining beliefs. Also, I explore Davidson’s view on the concept of belief. On his view, in order to have a knowledge-apt belief, we must have the concept of knowledge-apt belief. We can command this concept by having the concept of objective truth. Objective truth requires that we are aware of and are capable of appreciating the possibility of a belief’s being true or false. And this possibility cannot be appreciated unless we have some related contentful beliefs to identify the content of the very belief. However, we are committed to, as opposed to believing, the proposition that the sceptical hypothesis does not obtain. It is impossible to appreciate the possibility of our fundamental commitments being false from our own perspective, because fundamental commitments specify the general cause of our beliefs. A change in this regard would cause a total change of the content of all beliefs, which leaves us no contentful belief at all to make this possibility intelligible. Therefore, the closureRK principle is not applicable to the evaluation of the sceptical hypothesis. Hence, we can retain the closureRK principle while evading the closureRK-based sceptical challenge. Unfortunately, the Davidsonian response cannot deal with the underdeterminationRK-based sceptical challenge, because it does not show that our rational support in the good case favours our everyday belief over its sceptical counterpart. In chapter four, I examine how epistemological disjunctivism can deal with underdeterminationRK-based radical scepticism. This form of sceptical argument assumes that our rational support provides at best inconclusive support for our beliefs. 
Therefore, a belief’s being rationally supported, whether in the good case or in the bad case, is compatible with the belief’s being false. Epistemological disjunctivism claims that in paradigm cases of perceptual knowledge, our rational support can be both factive and reflectively accessible. The factive rational support at issue is one’s propositional seeing. I discuss both McDowell’s and Pritchard’s proposals for motivating factive seeing, and I argue for epistemological disjunctivism against three prima facie objections, i.e., the distinguishability problem, the basis problem and the access problem. Once epistemological disjunctivism is shown to be a plausible view, I argue that underdeterminationRK-based radical scepticism can be dismissed. In particular, in the optimal case, factive rational support favours our everyday belief over the sceptical hypothesis. However, regarding closureRK-based radical scepticism, epistemological disjunctivism seems to license a robust answer. The ambitious answer is that, in the good case, we can after all know the denial of the sceptical hypothesis in virtue of possessing factive rational support. And it is the immodesty of this answer that renders this response unpalatable. In the last chapter, I propose a combined treatment of the sceptical problem. Although the Davidsonian response and the epistemological disjunctivist response can each only deal with one aspect of the sceptical problem, their views are in fact mutually supportive. On the one hand, the Davidsonian response, together with a Wittgensteinian insight, shows why rational support can only be provided in a local manner; on the other hand, epistemological disjunctivism reminds us that rational support can be factive in the good case. Putting these two points together allows us to answer the whole sceptical challenge in a uniform way. This combined proposal has three claims. 
First, our rational support can be both local and factive, so we can dismiss both sceptical arguments in one go. Second, the sceptical problem is a spurious paradox, so the combined treatment involves a diagnosis. This diagnosis starts from a modest transcendental argument which reveals some necessary commitments that we must make, and then proceeds to expose faulty assumptions in the sceptical paradox. Third, once the dubious assumptions are dislodged, we can evade the sceptical problem once and for all. In the end, we are offered a satisfying response to radical scepticism.
The purpose of this paper is to introduce a form of update based on the minimization of the geodesic distance on a graph. We provide a characterization of this class using set-theoretic operators and show that such operators bijectively correspond to geodesic metrics. As distance is generated by distinguishability, our framework is appropriate in contexts where distance is generated by a threshold, and therefore where measurement is erroneous.
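The paper's update operator is not spelled out in the abstract; as a minimal sketch of the underlying notion, the following Python computes geodesic (shortest-path) distance on an unweighted graph by breadth-first search and revises a set of candidate vertices by keeping those closest to a prior set. The names `geodesic_distance` and `update`, and the graph encoding, are illustrative assumptions, not taken from the paper.

```python
from collections import deque

def geodesic_distance(adj, source):
    """Breadth-first search: shortest-path length from `source`
    to every reachable vertex of an unweighted graph (the geodesic metric)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def update(adj, prior, evidence):
    """Revise `prior` (a set of vertices) by keeping the vertices of
    `evidence` that minimize geodesic distance to the prior."""
    dist = {}
    for s in prior:
        for v, d in geodesic_distance(adj, s).items():
            dist[v] = min(d, dist.get(v, float("inf")))
    best = min(dist[e] for e in evidence)
    return {e for e in evidence if dist[e] == best}
```

On a path graph 0–1–2–3, updating the prior {0} by the evidence {2, 3} keeps only vertex 2, the geodesically closest element of the evidence.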
Entropy in thermodynamics is an extensive quantity, whereas standard methods in statistical mechanics give rise to a non-extensive expression for the entropy. This discrepancy is often seen as a sign that basic formulas of statistical mechanics should be revised, either on the basis of quantum mechanics or on the basis of general and fundamental considerations about the distinguishability of particles. In this article we argue against this response. We show that both the extensive thermodynamic and the non-extensive statistical entropy are perfectly alright within their own fields of application. Changes in the statistical formulas that remove the discrepancy must be seen as motivated by pragmatic reasons rather than as justified by basic arguments about particle statistics.
We consider various effects that are encountered in matter wave interference experiments with massive nanoparticles. The text-book example of far-field interference at a grating is compared with diffraction into the dark field behind an opaque aperture, commonly designated as Poisson’s spot or the spot of Arago. Our estimates indicate that both phenomena may still be observed in a mass range exceeding present-day experiments by at least two orders of magnitude. They both require, however, the development of sufficiently cold, intense and coherent cluster beams. While the observation of Poisson’s spot offers the advantage of non-dispersiveness and a simple distinction between classical and quantum fringes in the absence of particle-wall interactions, van der Waals forces may severely limit the distinguishability between genuine quantum wave diffraction and classically explicable spots even for moderately polarizable objects and diffraction elements as thin as 100 nm.
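A grounded aside on the scale involved: the de Broglie wavelength λ = h/(mv) shrinks inversely with mass, which is why interference with massive clusters demands cold, slow, coherent beams. The helper below is an illustrative sketch only; the function name and the example parameters are assumptions, not taken from the paper.

```python
# de Broglie wavelength lambda = h / (m * v) for a massive cluster.
H = 6.62607015e-34        # Planck constant, J s (exact, SI 2019)
AMU = 1.66053906660e-27   # atomic mass unit, kg

def de_broglie_wavelength(mass_amu, velocity):
    """Wavelength in metres for a particle of given mass (in amu)
    moving at `velocity` (in m/s)."""
    return H / (mass_amu * AMU * velocity)

# A hypothetical 10^6 amu cluster at 100 m/s has a wavelength of a few
# femtometres, far smaller than the particle itself.
lam = de_broglie_wavelength(1e6, 100.0)
```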
The long history of ergodic and quasi-ergodic hypotheses provides the best example of the attempt to supply non-probabilistic justifications for the use of statistical mechanics in describing mechanical systems. In this paper we reverse the terms of the problem. We aim to show that accepting a probabilistic foundation of elementary particle statistics dispenses with the need to resort to ambiguous non-probabilistic notions like that of (in)distinguishability. In the quantum case, starting from suitable probability conditions, it is possible to deduce elementary particle statistics in a unified way. Following our approach, Maxwell-Boltzmann statistics can also be deduced, and this deduction clarifies its status. Thus our primary aim in this paper is to give a mathematically rigorous deduction of the probability of a state with given energy for a perfect gas in statistical equilibrium; that is, a deduction of the equilibrium distributions for a perfect gas. A crucial step in this deduction is the statement of a unified statistical theory based on clearly formulated probability conditions from which the particle statistics follows. We believe that such a deduction represents an important improvement in elementary particle statistics, and a step towards a probabilistic foundation of statistical mechanics. The present Part II is devoted to this deduction. Part I presented the necessary tools. After the deduction of the probability of a state with given energy for a system in statistical equilibrium, we will propose in the last section a simple model giving an ergodic interpretation of the equilibrium distributions.
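The paper's probabilistic deduction is not reproduced in the abstract, but the standard cell-occupation counts that separate the three particle statistics can be stated compactly: n distinguishable particles in g cells yield g^n microstates (Maxwell-Boltzmann), indistinguishable bosons yield C(n+g-1, n) (Bose-Einstein, by stars and bars), and fermions with at most one particle per cell yield C(g, n) (Fermi-Dirac). A minimal Python sketch of these textbook counts (the function names are illustrative):

```python
from math import comb

def microstates_mb(n, g):
    """Distinguishable particles: each of the n particles independently
    chooses one of g cells."""
    return g ** n

def microstates_be(n, g):
    """Indistinguishable bosons: multisets of size n drawn from g cells
    (stars and bars)."""
    return comb(n + g - 1, n)

def microstates_fd(n, g):
    """Indistinguishable fermions: at most one particle per cell."""
    return comb(g, n)
```

For example, 2 particles in 3 cells give 9 Maxwell-Boltzmann, 6 Bose-Einstein, and 3 Fermi-Dirac microstates.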
The present work is a systematic study of the nexus which holds together perception, motivation and existence in Husserl’s early writings—precisely those which are dated between 1898 and 1921. In Chapter I a historical and conceptual reconstruction of the genesis of what is termed ‘constitution problem’ is provided. After a thorough discussion about the distinction between real and intentional description, we elucidate the method of phenomenological reduction and show how the constitution problem relates to questions regarding transcendence and existence. Chapter II is concerned with a detailed presentation of Husserlian phenomenology of visual perception. We present Husserl’s theory of intentionality in the light of Husserlian mereology: first, we argue that Husserl conceives of intentionality as a property which entails a relation; secondly, we debate his critique of the theory of immanent objects and his solution to the problem of non-existent objects. After examining the perceptual act in all its essential components (i.e. quality, matter and sensations), we discuss the notorious ‘content–apprehension’ schema and study the manuscripts in which Husserl develops the notion of ‘perceptional’. Themes like the relationship between fulfillment and disappointment and the distinguishability of veridical and non-veridical perceptions are also taken into account. In Chapter III we consider what differentiates the outer perception from other kinds of perception. After making clear what Husserl means by ‘inner perception’ we debate the opposition between immanent and transcendent perception, first by using identity/manifold analysis and then by means of whole/part analysis. In this context we reject Husserl’s account of reflection as a perceptual act on both exegetical and theoretical grounds. Furthermore, we explain how Husserl tries to refute the ‘image theory’ and how he addresses the issue of the hidden profiles. 
The study of the microstructure of the outer perception allows us to explain in which sense this kind of perception is to be conceived as necessarily inadequate. Chapter IV is largely devoted to an attempt at systematizing Husserl’s theory of kinaesthesia as it appears in the Dingvorlesung. This sheds light on the structure of motivation and on the role which this latter plays in the constitution of a mundane object. In Chapter V we scrutinize Husserl’s conception of the possibility/reality dichotomy. In particular, we distinguish an ontological analysis of possibility from a phenomenological one and investigate the diverse concepts of ‘possibility’ (e.g. ideal, real, independent, modal) developed by Husserl. Finally, we introduce and debate Husserl’s (so-called) ‘exhibition principle’ and try to point out its ambiguities.