The research observed, in parallel and comparatively, a surveillance state’s use of communication and cyber networks with satellite applications for power-political and realpolitik purposes, in contrast to cybernetics driven by outer-space security and legitimate scientific purposes. The research adopted a psychoanalytic and psychosocial method to observe the organizational behaviors of the surveillance state, and a feedback method drawn from theoretical physics, astrochemistry, and cosmology for the contrast group of cybernetics. Military sociology and multilateral movements were adopted in the diagnostic studies and research on cybersecurity, and cross-channeling in communications was detected during the research. The paper addresses several key technicalities of security and privacy breaches, from personal devices to ontological networks and satellite applications, notably telecommunication service providers and carriers with differentiated spectrum. The paper discusses key moral and ethical risks posed by mal-adaptations in commercial devices that can corrupt democracy in subtle ways but on a mass scale. The research adopted an analytical-linguistics approach, informed by linguistic history, to break free from the artificial-intelligence-empowered pancomputationalism of the heterogeneous dictatorial semantic network, and the astronomical and cosmological research in information theory implies that noncomputable processes are the only defense strategy against the new technology-driven developments in pancomputationalism.
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical-mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: the single-particle case, the finite-particles case, and the infinite-particles case.
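The von Neumann entropy at issue is S(ρ) = −Tr(ρ ln ρ). A minimal numerical sketch (illustrative only; the function name and examples are my own) shows its value on a pure qubit state versus a maximally mixed one:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # convention: 0 * ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|
mixed = np.eye(2) / 2.0                    # maximally mixed qubit

print(von_neumann_entropy(pure))   # ~0.0
print(von_neumann_entropy(mixed))  # ~0.693 (= ln 2)
```

A pure state carries zero von Neumann entropy; the maximally mixed state attains the maximum ln 2 for a qubit, which is one reason the correspondence with thermodynamic entropy is debated for pure versus mixed states.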
The “universality” of critical phenomena is much discussed in philosophy of scientific explanation, idealizations and philosophy of physics. Lange and Reutlinger recently opposed Batterman concerning the role of some deliberate distortions in unifying a large class of phenomena, regardless of microscopic constitution. They argue for an essential explanatory role for “commonalities” rather than that of idealizations. Building on Batterman's insight, this article aims to show that assessing the differences between the universality of critical phenomena and two paradigmatic cases of “commonality strategy”—the ideal gas model and the harmonic oscillator model—is necessary to avoid the objections raised by Lange and Reutlinger. Taking these universal explanations as benchmarks for critical phenomena reveals the importance of the different roles played by analogies underlying the use of the models. A special combination of physical and formal analogies allows one to explain the epistemic autonomy of the universality of critical phenomena through an explicative loop.
Contextual emergence was originally proposed as an inter-level relation between different levels of description to describe an epistemic notion of emergence in physics. Here, we discuss the ontic extension of this relation to different domains or levels of physical reality, using the properties of temperature and molecular shape as detailed case studies. We emphasize the concepts of stability conditions and multiple realizability as key features of contextual emergence. Some broader implications of contextual emergence for the foundations of physics and the cognitive and neural sciences are given in the concluding discussion. Relevant facts about algebras of observables are found in the appendices, along with an abstract definition of Kubo-Martin-Schwinger states.
The numerous and diverse roles of theory reduction in science have been insufficiently explored in the philosophy literature on reduction. Part of the reason for this has been a lack of attention paid to reduction2 (successional reduction)—although I here argue that this sense of reduction is closer to reduction1 (explanatory reduction) than is commonly recognised, and I use an account of reduction that is neutral between the two. This paper draws attention to the utility—and incredible versatility—of theory reduction. A non-exhaustive list of various applications of reduction in science is presented, some of which are drawn from a particular case study: the current search for a new theory of fundamental physics. This case study is especially interesting because it employs both senses of reduction at once, and because of the huge weight being put on reduction by the different research groups involved; additionally, it presents some unique uses for reduction—revealing, I argue, the fact that reduction can be of specialised and unexpected service in particular scientific cases. The paper makes two other general findings: that the functions of reduction that are typically assumed to characterise the different forms of the relation may instead be understood as secondary consequences of some other roles; and that most of the roles that reduction plays in science can actually also be fulfilled by a weaker relation than (the typical understanding of) reduction.
I show explicitly how concerns about wave function collapse and ontology can be decoupled from the bulk of technical analysis necessary to recover localized, approximately Newtonian trajectories from quantum theory. In doing so, I demonstrate that the account of classical behavior provided by decoherence theory can be straightforwardly tailored to give accounts of classical behavior on multiple interpretations of quantum theory, including the Everett, de Broglie-Bohm and GRW interpretations. I further show that this interpretation-neutral, decoherence-based account conforms to a general view of inter-theoretic reduction in physics that I have elaborated elsewhere, which differs from the oversimplified and often ambiguous picture that treats reduction simply as a matter of taking limits. This interpretation-neutral account rests on a general three-pronged strategy for reduction between quantum and classical theories that combines decoherence, an appropriate form of Ehrenfest's Theorem, and a decoherence-compatible mechanism for collapse. It also incorporates a novel argument as to why branch-relative trajectories should be approximately Newtonian, which is based on a little-discussed extension of Ehrenfest's Theorem to open systems, rather than on the more commonly cited but less germane closed-systems version. In the Conclusion, I briefly suggest how the strategy for quantum-classical reduction described here might be extended to reduction between other classical and quantum theories, including classical and quantum field theory and classical and quantum gravity.
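For reference, the familiar closed-systems form of Ehrenfest's Theorem (the version this abstract contrasts with its open-systems extension) states, for a particle of mass $m$ in a potential $V$:

```latex
\begin{align}
\frac{d}{dt}\langle \hat{x} \rangle &= \frac{\langle \hat{p} \rangle}{m}, \\
\frac{d}{dt}\langle \hat{p} \rangle &= -\,\bigl\langle V'(\hat{x}) \bigr\rangle .
\end{align}
```

The expectation values follow Newton's equations only when $\langle V'(\hat{x})\rangle \approx V'(\langle \hat{x}\rangle)$, i.e. when the state stays well localized relative to variations of the potential; this is why the theorem alone does not suffice and must be combined with decoherence, as the abstract describes.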
A conventional wisdom about the progress of physics holds that successive theories wholly encompass the domains of their predecessors through a process that is often called reduction. While certain influential accounts of inter-theory reduction in physics take reduction to require a single "global" derivation of one theory's laws from those of another, I show that global reductions are not available in all cases where the conventional wisdom requires reduction to hold. However, I argue that a weaker "local" form of reduction, which defines reduction between theories in terms of a more fundamental notion of reduction between models of a single fixed system, is available in such cases and moreover suffices to uphold the conventional wisdom. To illustrate the sort of fixed-system, inter-model reduction that grounds inter-theoretic reduction on this picture, I specialize to a particular class of cases in which both models are dynamical systems. I show that reduction in these cases is underwritten by a mathematical relationship that follows the broad prescriptions of Nagel/Schaffner reduction, and support this claim with several examples. Moreover, I show that this broadly Nagelian analysis of inter-model reduction encompasses several cases that are sometimes cited as instances of the "physicist's" limit-based notion of reduction.
I distinguish two types of reduction within the context of quantum-classical relations, which I designate “formal” and “empirical”. Formal reduction holds or fails to hold solely by virtue of the mathematical relationship between two theories; it is therefore a two-place, a priori relation between theories. Empirical reduction requires one theory to encompass the range of physical behaviors that are well-modeled in another theory; in a certain sense, it is a three-place, a posteriori relation connecting the theories and the domain of physical reality that both serve to describe. Focusing on the relationship between classical and quantum mechanics, I argue that while certain formal results concerning singular limits have been taken to preclude the possibility of reduction between these theories, such results at most provide support for the claim that singular limits block reduction in the formal sense; little if any reason has been given for thinking that they block reduction in the empirical sense. I then briefly outline a strategy for empirical reduction that is suggested by work on decoherence theory, arguing that this sort of account remains a fully viable route to the empirical reduction of classical to quantum mechanics and is unaffected by such singular limits.
Supporters of the de Broglie-Bohm interpretation of quantum theory argue that because the theory, like classical mechanics, concerns the motions of point particles in 3D space, it is specially suited to recover classical behavior. I offer a novel account of classicality in dBB theory, if only to show that such an account falls out almost trivially from results developed in the largely interpretation-neutral context of decoherence theory. I then argue that this undermines any special claim that dBB theory is purported to have on the unification of the quantum and classical realms.
This volume presents twelve original essays on the metaphysics of science, with particular focus on the physics of chance and time. Experts in the field subject familiar approaches to searching critiques, and make bold new proposals in a number of key areas. Together, they set the agenda for future work on the subject.
I develop a variant of the constraint interpretation of the emergence of purely physical (non-biological) entities, focusing on the principle of the non-derivability of actual physical states from possible physical states (physical laws) alone. While this is a necessary condition for any account of emergence, it is not sufficient, for it becomes trivial if not extended to types of constraint that specifically constitute physical entities, namely, those that individuate and differentiate them. Because physical organizations with these features are in fact interdependent sets of such constraints, and because such constraints on physical laws cannot themselves be derived from physical laws, physical organization is emergent. These two complementary types of constraint are components of a complete non-reductive physicalism, comprising a non-reductive materialism and a non-reductive formalism.
The Emergic Cognitive Model (ECM) is a unified computational model of visual filling-in based on the Emergic Network architecture. The Emergic Network was designed to help realize systems undergoing continuous change. In this thesis, eight different filling-in phenomena are demonstrated under a regime of continuous eye movement (and under static eye conditions as well).

ECM indirectly demonstrates the power of unification inherent in Emergic Networks when cognition is decomposed according to finer-grained functions supporting change. These can interact to raise additional emergent behaviours via cognitive re-use, hence the Emergic prefix throughout. Nevertheless, the model is robust and parameter free. Differential re-use occurs in the nature of model interaction with a particular testing paradigm.

ECM has a novel decomposition due to the requirements of handling motion and of supporting unified modelling via finer functional grains. The breadth of phenomenal behaviour covered is largely to lend credence to our novel decomposition.

The Emergic Network architecture is a hybrid between classical connectionism and classical computationalism that facilitates the construction of unified cognitive models. It helps cut functionalism into finer grains distributed over space (by harnessing massive recurrence) and over time (by harnessing continuous change), yet simplifies by using standard computer code to focus on the interaction of information flows. Thus while the structure of the network looks neurocentric, the dynamics are best understood in flowcentric terms. Surprisingly, dynamic system analysis (as usually understood) is not involved. An Emergic Network is engineered much like straightforward software or hardware systems that deal with continuously varying inputs.
Ultimately, this thesis addresses the problem of reduction and induction over complex systems, and the Emergic Network architecture is merely a tool to assist in this epistemic endeavour.

ECM is strictly a sensory model and apart from perception, yet it is informed by phenomenology. It addresses the attribution problem of how much of a phenomenon is best explained at a sensory level of analysis, rather than at a perceptual one. As the causal information flows are stable under eye movement, we hypothesize that they are the locus of consciousness, howsoever it is ultimately realized.
This chapter unfolds a central philosophical problem of statistical mechanics. This problem lies in a clash between the Static Probabilities offered by statistical mechanics and the Dynamic Probabilities provided by classical or quantum mechanics. The chapter looks at the Boltzmann and Gibbs approaches in statistical mechanics and construes some of the great controversies in the field — for instance the Reversibility Paradox — as instances of this conflict. It furthermore argues that a response to this conflict is a critical choice that shapes one's understanding of statistical mechanics itself, namely, whether it is to be conceived as a special or fundamental science. The chapter details some of the pitfalls of the latter ‘globalist’ position and seeks defensible ground for a kind of ‘localist’ alternative.
Quantum electrodynamics presents intrinsic limitations in the description of physical processes that make it impossible to recover from it the type of description we have in classical electrodynamics. Hence one cannot consider classical electrodynamics as reducing to quantum electrodynamics and being recovered from it by some sort of limiting procedure. Quantum electrodynamics has to be seen not as a more fundamental theory, but as an upgrade of classical electrodynamics, which permits an extension of classical theory to the description of phenomena that, while being related to the conceptual framework of the classical theory, cannot be addressed from the classical theory.
The second half of the twentieth century offers distinct perspectives for the historian of science. The role of the State, the expansion of certain industries and the cultural engagement with science were all transformed. The foregrounding of certain strands of physical science in the public and administrative consciousness – nuclear physics and planetary science, for example – had a complement: the ‘backgrounding’ or institutional neglect of a number of other fields. My work in the history of the physical sciences has focused on this little-noticed intellectual terrain, and could be categorised into several types of case study that share distinct research questions, conceptual understandings and historiographical ramifications.

My focus is physical sciences that have been identified as peripheral, if categorised at all, by a previous generation of historians of physics. By this I do not mean peripheral in the geographic sense, but marginal, interstitial or boundary-crossing in the context of occupations, disciplines and professions. The types of case study investigated include (i) scientific instruments; (ii) emergent professions or would-be professions; and (iii) subject areas falling between academic science, industrial application and State interests.
This paper discusses the alleged reduction of Thermodynamics to Statistical Mechanics. It includes an historical discussion of J. Willard Gibbs' famous caution concerning the connections between thermodynamic properties and statistical mechanical properties---his so-called ``Thermodynamic Analogies.'' The reasons for Gibbs' caution are reconsidered in light of relatively recent work in statistical physics on the existence of the thermodynamic limit and the explanation of critical behavior using the renormalization group apparatus. A probabilistic understanding of the renormalization group arguments allows for a kind of unification of Gibbs' approach with contemporary understanding of the reduction problem.
I respond to Belot's argument and defend the view that sometimes `fundamental theories' are explanatorily inadequate and need to be supplemented with certain aspects of less fundamental `theories emeritus'.
Batterman has recently argued that fundamental theories are typically explanatorily inadequate, in that there exist physical phenomena whose explanation requires that the conceptual apparatus of a fundamental theory be supplemented by that of a less fundamental theory. This paper is an extended critical commentary on that argument: situating its importance, describing its structure, and developing a line of objection to it. The objection is that in the examples Batterman considers, the mathematics of the less fundamental theory is definable in terms of the mathematics of the fundamental theory and that only the latter need be given a physical interpretation---so we can view the desired explanation as drawing only upon resources internal to the more fundamental physical theory. (The paper also includes an appendix surveying some recent results on quantum chaos.)
One of the recurrent problems in the foundations of physics is to explain why we rarely observe certain phenomena that are allowed by our theories and laws. In thermodynamics, for example, the spontaneous approach towards equilibrium is ubiquitous, yet the time-reversal-invariant laws that presumably govern thermal behaviour at the microscopic level equally allow spontaneous departure from equilibrium to occur. Why are the former processes frequently observed while the latter are almost never reported? Another example comes from quantum mechanics, where the formalism, if considered complete and universally applicable, predicts the existence of macroscopic superpositions—monstrous Schrödinger cats—and these are never observed: while electrons and atoms enjoy the cloudiness of waves, macroscopic objects are always localized to definite positions.
The relation between micro-objects and macro-objects advocated by Kim is even more problematic than Ross & Spurrett (R&S) argue, for reasons rooted in physics. R&S's own ontological proposals are much more satisfactory from a physicist's viewpoint but may still be problematic. A satisfactory theory of macroscopic ontology must be as independent as possible of the details of microscopic physics.
Robert Batterman examines a form of scientific reasoning called asymptotic reasoning, arguing that it has important consequences for our understanding of the scientific process as a whole. He maintains that asymptotic reasoning is essential for explaining what physicists call universal behavior. With clarity and rigor, he simplifies complex questions about universal behavior, demonstrating a profound understanding of the underlying structures that ground them. This book introduces a valuable new method that is certain to fill explanatory gaps across disciplines.
1. It is natural to wonder what our multitude of successful physical theories tell us about the world—singly, and as a body. What are we to think when one theory tells us about a flat Newtonian spacetime, the next about a curved Lorentzian geometry, and we have hints of others, portraying discrete or higher-dimensional structures which look something like more familiar spacetimes in appropriate limits?
Part I of this trilogy, Historical and Scientific Setting, set out a general context for selecting a certain subclass of inter-theoretic relations as achieving appropriate explanatory and ontological unification – hence for properly being labelled reductive. Something of the complexity of these relations in real science was explored. The present article concentrates on the role which identity plays in structuring the reduction relation and so in achieving ontological and explanatory unification.
Any theory of reduction that goes only so far as carried in Parts I and II does only half the job. Prima facie at least, there are cases of would-be reduction which seem torn between two conflicting intuitions. On the one side there is a strong intuition that reduction is involved, and a strongly retentive reduction at that. On the other side it seems that the concepts at one level cross-classify those at the other level, so that there is no way to identify properties at one level with those at the other. There is evidence to suggest that there will be no unique mental state/neural state association that can be set up, because, e.g., many different parts of the nervous system are all capable of taking over ‘control’ of the one mental function. And it is alleged that infinitely many, worse: indefinitely many, different bio-chemo-physical states could correspond to the economic property ‘has a monetary system of economic exchange’; and similarly for the property ‘has just won a game of tennis’. Yet one doesn't want an economic system or a game of tennis to be some ghostly addition to the actual bio-chemo-physical processes and events involved. Similarly one hopes that neurophysiology allied with the rest of natural science will render human experience and behaviour explicable.
Four current accounts of theory reduction are presented, first informally and then formally: (1) an account of direct theory reduction that is based on the contributions of Nagel, Woodger, and Quine, (2) an indirect reduction paradigm due to Kemeny and Oppenheim, (3) an "isomorphic model" schema traceable to Suppes, and (4) a theory of reduction that is based on the work of Popper, Feyerabend, and Kuhn. Reference is made, in an attempt to choose between these schemas, to the explanation of physical optics by Maxwell's electromagnetic theory, and to the revisions of genetics necessitated by partial biochemical reductions of genetics. A more general reduction schema is proposed which: (1) yields as special cases the four reduction paradigms considered above, (2) seems to be in better accord with both the canons of logic and actual scientific practice, and (3) clarifies the problems of meaning variance and ontological reduction.
I defend three general claims concerning inter-theoretic reduction in physics. First, the popular notion that a superseded theory in physics is generally a simple limit of the theory that supersedes it paints an oversimplified picture of reductive relations in physics. Second, where reduction specifically between two dynamical systems models of a single system is concerned, reduction requires the existence of a particular sort of function from the state space of the low-level model to that of the high-level model that approximately commutes, in a specific sense, with the rules of dynamical evolution prescribed by the models. The third point addresses a tension between, on the one hand, the frequent need to take into account system-specific details in providing a full derivation of the high-level theory’s success in a particular context, and, on the other hand, a desire to understand the general mechanisms and results that underwrite reduction between two theories across a wide and disparate range of different systems; I suggest a reconciliation based on the use of partial proofs of reduction, designed to reveal these general mechanisms of reduction at work across a range of systems, while leaving certain gaps to be filled in on the basis of system-specific details. After discussing these points of general methodology, I go on to demonstrate their application to a number of particular inter-theory reductions in physics involving quantum theory. I consider three reductions: first, connecting classical mechanics and non-relativistic quantum mechanics; second, connecting classical electrodynamics and quantum electrodynamics; and third, connecting non-relativistic quantum mechanics and quantum electrodynamics. I approach these reductions from a realist perspective, and for this reason consider two realist interpretations of quantum theory - the Everett and Bohm theories - as potential bases for these reductions.
Nevertheless, many of the technical results concerning these reductions pertain also more generally to the bare, uninterpreted formalism of quantum theory. Throughout my analysis, I make the application of the general methodological claims of the thesis explicit, so as to provide concrete illustration of their validity.
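The commuting-map criterion in the second claim above can be illustrated with a toy sketch (the models, step size, and coupling constant here are entirely invented for illustration): a bridge map b from a fine-grained model's state space to a coarse model's state space approximately commutes with the two dynamics, in the sense that b(low(s)) ≈ high(b(s)).

```python
import numpy as np

DT, EPS = 0.01, 1e-3  # time step; weak coupling to the fast variable

def low_step(s):
    """Fine-grained model: oscillator (x, p) weakly coupled to a fast internal phase."""
    x, p, theta = s
    return np.array([x + DT * p,
                     p - DT * (x + EPS * np.sin(theta)),
                     theta + DT * 50.0])  # fast internal degree of freedom

def high_step(s):
    """Coarse model: a plain harmonic oscillator in (x, p)."""
    x, p = s
    return np.array([x + DT * p, p - DT * x])

def bridge(s):
    """Bridge map b: discard the internal phase, keeping (x, p)."""
    return s[:2]

s0 = np.array([1.0, 0.0, 0.3])
err = np.linalg.norm(bridge(low_step(s0)) - high_step(bridge(s0)))
print(err)  # O(EPS * DT): the diagram commutes only approximately
```

The residual error scales with the coupling EPS, so the coarse model "reduces" the fine one only approximately, mirroring the approximate commutation demanded in the abstract's second claim.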