We claim that, as it stands, the Deutsch–Wallace–Everett approach to quantum theory is conceptually incoherent. This charge is based upon the approach’s reliance upon decoherence arguments that conflict with its own fundamental precepts regarding probabilistic reasoning in two respects. This conceptual conflict obtains even if the decoherence arguments deployed are aimed merely towards the establishment of certain ‘emergent’ or ‘robust’ structures within the wave function: to be relevant to physical science, notions such as robustness must be empirically grounded, and, on our analysis, this grounding can plausibly be given only in precisely the probabilistic terms that lead to conceptual conflict. Thus, the incoherence problems presented necessitate either the provision of a new, non-probabilistic empirical grounding for the notions of robustness and emergence in the context of decoherence, or the abandonment of the Deutsch–Wallace–Everett programme for quantum theory.
Background: Obtaining informed consent for participation in genomic research in low-income settings presents specific ethical issues requiring attention. These include the challenges that arise when providing information about unfamiliar and technical research methods, the implications of complicated infrastructure and data sharing requirements, and the potential consequences of future research with samples and data. This study investigated researchers’ and participants’ parents’ experiences of a consent process and understandings of a genome-wide association study of malaria involving children aged five and under in Mali. It aimed to inform best practices in recruiting participants into genomic research. Methods: A qualitative rapid ethical assessment was undertaken. Fifty-five semi-structured interviews were conducted with the parents of research participants. An additional nine semi-structured interviews were conducted with senior research scientists, research assistants, and a member of an ethics committee. A focus group with five parents of research participants and direct observations of four consent processes were also conducted. French and translated English transcripts were descriptively and thematically coded using OpenCode software. Results: Participants’ parents in the MalariaGEN study had differing understandings of the causes of malaria, the rationale for collecting blood samples, the purposes of the study and the kinds of information the study would generate. Genomic aspects of the research, including the gene/environment interaction underlying susceptibility or resistance to severe malaria, proved particularly challenging to explain and understand. Conclusions: This study identifies a number of areas to be addressed in the design of consent processes for genomic research, some of which require careful ethical analysis. These include determining how much information should be provided about differing aspects of the research and how best to promote understandings of genomic research. We conclude that it is important to build capacity in the design and conduct of effective and appropriate consent processes for genomic research in low- and middle-income settings. Additionally, consideration should be given to the role of review committees and community consultation activities in protecting the interests of participants in genomic research.
Econophysics is a new and exciting cross-disciplinary research field that applies models and modelling techniques from statistical physics to economic systems. It is not, however, without its critics: prominent figures in more mainstream economic theory have criticized some elements of the methodology of econophysics. One of the main lines of criticism concerns the nature of the modelling assumptions and idealizations involved, and a particular target is the ‘kinetic exchange’ approach used to model the emergence of inequality within the distribution of individual monetary income. This article will consider such models in detail, and assess the warrant of these criticisms by drawing upon the philosophical literature on modelling and idealization. Our aim is to provide the first steps towards informed mediation of this important and interesting interdisciplinary debate, and our hope is to offer guidance with regard to both the practice of modelling inequality, and the inequality of modelling practice. 1 Introduction 1.1 Econophysics and its discontents 1.2 Against burglar economics 2 Modelling Inequality 2.1 Mainstream economic models for income distribution 2.2 Econophysics models for income distribution 3 Idealizations in Kinetic Exchange Models 3.1 Binary interactions 3.2 Conservation principles 3.3 Exchange dynamics 4 Fat Tails and Savings 5 Evaluation.
In recent decades, non-representational approaches to mental phenomena and cognition have been gaining traction in cognitive science and philosophy of mind. In these alternative approaches, mental representations either lose their central status or, in the most radical versions, are banned completely. While there is growing agreement that non-representational accounts may succeed in explaining some cognitive capacities, there is widespread skepticism about the possibility of giving non-representational accounts of cognitive capacities such as memory, imagination or abstract thought. In this paper, I will critically examine the view that there are fundamental limitations to non-representational explanations of cognition. Rather than challenging these arguments on general grounds, I will examine a set of human cognitive capacities that is generally thought to fall outside the scope of non-representational accounts, namely numerical cognition. After criticizing standard representational accounts of numerical cognition for their lack of explanatory power, I will argue that a non-representational approach inspired by radical enactivism offers the best hope for developing a genuinely naturalistic explanatory account of these cognitive capacities.
On one popular view, the general covariance of gravity implies that change is relational in a strong sense, such that all it is for a physical degree of freedom to change is for it to vary with regard to a second physical degree of freedom. At the quantum level, this view of change as relative variation leads to a fundamentally timeless formalism for quantum gravity. Here, we will show how one may avoid this acute ‘problem of time’. On our view, duration is still regarded as relative, but temporal succession is taken to be absolute. Following our approach, which is presented in more formal terms elsewhere, it is possible to conceive of a genuinely dynamical theory of quantum gravity within which time, in a substantive sense, remains. 1 Introduction 1.1 The problem of time 1.2 Our solution 2 Understanding Symmetry 2.1 Mechanics and representation 2.2 Freedom by degrees 2.3 Voluntary redundancy 3 Understanding Time 3.1 Change and order 3.2 Quantization and succession 4 Time and Gravitation 4.1 The two faces of classical gravity 4.2 Retaining succession in quantum gravity 5 Discussion 5.1 Related arguments 5.2 Concluding remarks.
The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper we examine the connection between these two much-discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, we consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, we consider the implications of various responses to the problems, with a particular focus upon the viability of a ‘deflationary’ attitude to the relationalist/substantivalist debate regarding the ontology of spacetime. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.
Disease prioritarianism is a principle that is often implicitly or explicitly employed in the realm of healthcare prioritization. This principle states that the healthcare system ought to prioritize the treatment of disease over any other problem. This article argues that disease prioritarianism ought to be rejected. Instead, we should adopt ‘the problem-oriented heuristic’ when making prioritizations in the healthcare system. According to this idea, we ought to focus on specific problems and whether or not it is possible and efficient to address them with medical means. This has radical implications for the extension of the healthcare system. First, getting rid of the binary disease/no-disease dichotomy implicit in disease prioritarianism would improve the ability of the healthcare system to address chronic conditions and disabilities that often defy easy classification. Second, the problem-oriented heuristic could empower medical practitioners to address social problems without the need to pathologize these conditions. Third, the problem-oriented heuristic makes clear that what we choose to treat is a normative consideration. Under this assumption, we can engage in a discussion on de-medicalization without distorting preconceptions. Fourth, this pragmatic and de-compartmentalizing approach should allow us to reconsider the term ‘efficiency’.
Symplectic reduction is a formal process through which degeneracy within the mathematical representations of physical systems displaying gauge symmetry can be controlled via the construction of a reduced phase space. Typically such reduced spaces provide us with a formalism for representing both instantaneous states and evolution uniquely, and for this reason can justifiably be afforded the status of fundamental dynamical arena, the otiose structure having been eliminated from the original phase space. Essential to the application of symplectic reduction is the precept that the first class constraints are the relevant gauge generators. This prescription becomes highly problematic for reparameterization invariant theories within which the Hamiltonian itself is a constraint; not least because it would seem to render prima facie distinct stages of a history physically identical and observable functions changeless. Here we will consider this problem of time within non-relativistic mechanical theory with a view to both more fully understanding the temporal structure of these timeless theories and better appreciating the corresponding issues in relativistic mechanics. For the case of non-relativistic reparameterization invariant theory, application of symplectic reduction will be demonstrated to be both unnecessary, since the degeneracy involved is benign, and inappropriate, since it leads to a trivial theory. With this anti-reductive position established, we will then examine two rival methodologies for consistently representing change and observable functions within the original phase space before evaluating the relevant philosophical implications. We will conclude with a preview of the case against symplectic reduction being applied to canonical general relativity.
Starting from a generalized Hamilton-Jacobi formalism, we develop a new framework for constructing observables and their evolution in theories invariant under global time reparametrizations. Our proposal relaxes the usual Dirac prescription for the observables of a totally constrained system and allows one to recover the influential partial and complete observables approach in a particular limit. Difficulties such as the non-unitary evolution of the complete observables in terms of certain partial observables are explained as a breakdown of this limit. Identification of our observables relies upon a physical distinction between gauge symmetries that exist at the level of histories and states, and those that exist at the level of histories and not states. This distinction resolves a tension in the literature concerning the physical interpretation of the partial observables and allows for a richer class of observables in the quantum theory. There is the potential for the application of our proposal to the quantization of gravity when understood in terms of the Shape Dynamics formalism.
We propose a solution to the problem of time for systems with a single global Hamiltonian constraint. Our solution stems from the observation that, for these theories, conventional gauge theory methods fail to capture the full classical dynamics of the system and must therefore be deemed inappropriate. We propose a new strategy for consistently quantizing systems with a relational notion of time that does capture the full classical dynamics of the system and allows for evolution parametrized by an equitable internal clock. This proposal contains the minimal temporal structure necessary to retain the ordering of events required to describe classical evolution. In the context of shape dynamics (an equivalent formulation of general relativity that is locally scale invariant and free of the local problem of time), our proposal can be shown to constitute a natural methodology for describing dynamical evolution in quantum gravity and to lead to a quantum theory analogous to the Dirac quantization of unimodular gravity.
A fundamental tenet of Paul Feyerabend’s pluralistic view of science has it that theory proliferation, that is, the availability of theoretical alternatives, is of crucial importance for the detection of anomalies in established theories. Paul Hoyningen-Huene calls this the Anomaly Importation Thesis, according to which anomalies are imported, as it were, into well-established theories from competing alternatives. This article pursues two major objectives: (a) to work out the systematic details of Feyerabend’s ideas on theory proliferation and anomaly import as they are presented in his early publications and his Against Method, and (b) to compare Feyerabend’s ideas on theory proliferation and anomaly import with corresponding features in Popper’s critical rationalist philosophy of science. As it turns out, neither the Principle of Proliferation nor the Anomaly Importation Thesis is necessarily incompatible with critical rationalism. In spite of Feyerabend’s general anti-Popperian attitude, I argue that theoretical pluralism can be seen as an advancement of critical rationalist philosophy and that critical rationalism provides good arguments for pluralism.
Embodied and extended cognition is a relatively new paradigm within cognitive science that challenges the basic tenet of classical cognitive science, viz. that cognition consists in building and manipulating internal representations. Some of the pioneers of embodied cognitive science have claimed that this new way of conceptualizing cognition puts pressure on epistemological and ontological realism. In this paper I will argue that such anti-realist conclusions do not follow from the basic assumptions of radical embodied cognitive science. Furthermore, I will show that one can develop a form of realism that reflects rather than just accommodates the core principles of non-representationalist embodied cognitive science.
The analysis of the temporal structure of canonical general relativity, and the connected interpretational questions with regard to the role of time within the theory, both rest upon the need to respect the fundamentally dual role of the Hamiltonian constraints found within the formalism. Any consistent philosophical approach towards the theory must pay dues to the role of these constraints in both generating dynamics, in the context of phase space, and generating unphysical symmetry transformations, in the context of a hypersurface embedded within a solution. A first denial of time, in terms of a position of reductive temporal relationalism, can be shown to be troubled by failure on the first count, and a second denial, in terms of Machian temporal relationalism, can be found to be hampered by failure on the second. A third denial of time, consistent with both of the Hamiltonian constraints’ roles, is constituted by the implementation of a scheme for constructing observables in terms of correlations, and leads to a radical Parmenidean timelessness. The motivation for and implications of each of these three denials are investigated.
In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical ‘dumb hole’, with the event horizon replaced by a sonic horizon. Since then an entire sub-field of ‘analogue gravity’ has been created. In 2016 Steinhauer reported the experimental observation of quantum Hawking radiation and its entanglement in a Bose-Einstein condensate analogue black hole. What can we learn from such analogue experiments? In particular, in what sense can they provide evidence of novel phenomena such as black hole Hawking radiation?
The professions have focused considerable attention on developing codes of conduct. Despite their efforts, there is considerable controversy regarding the propriety of professional codes of ethics. Many provisions of professional codes seem to exacerbate disputes between the profession and the public rather than providing a framework that satisfies the public's desire for moral behavior. After examining three professional codes, we divide the provisions of professional codes into those which urge professionals to avoid moral hazard, those which maintain professional courtesy, and those which serve the public interest. We note that whereas provisions urging the avoidance of moral hazard are uncontroversial, the public is suspicious of provisions protecting professional courtesy. Public interest provisions are controversial when the public and the profession disagree as to what is in the public interest. Based on these observations, we conclude with recommendations regarding the content of professional codes.
A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event. Neither can probabilistic risk analysis. This paper will argue that the approach referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that this implementation of the engineering safety strategy of safety barriers would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.
We introduce ‘model migration’ as a species of cross-disciplinary knowledge transfer whereby the representational function of a model is radically changed to allow application to a new disciplinary context. Controversies and confusions that often derive from this phenomenon will be illustrated in the context of econophysics and phylogeographic linguistics. Migration can be usefully contrasted with the concept of ‘imperialism’, which has been influentially discussed in the context of geographical economics. In particular, imperialism, unlike migration, relies upon extension of the original model via an expansion of the domain of phenomena it is taken to adequately describe. The success of imperialism thus requires expansion of the justificatory sanctioning of the original idealising assumptions to a new disciplinary context. Contrastingly, successful migration involves the radical representational re-interpretation of the original model, rather than its extension. Migration thus requires ‘re-sanctioning’ of new ‘counterpart idealisations’ to allow application to an entirely different class of phenomena. Whereas legitimate scientific imperialism should be based on the pursuit of some form of ontological unification, no such requirement is needed to legitimate the practice of model migration. The distinction between migration and imperialism will thus be shown to have significant normative as well as descriptive value.
The generalized theory of evolution suggests that evolutionary algorithms apply alike to biological processes and to cultural processes such as language. Variation, selection and reproduction constitute abstract and formal traits of complex, open and often self-regulating systems. Accepting this basic assumption provides us with a powerful background methodology for this investigation: explaining the emergence and proliferation of semantic patterns that become conventional. A teleosemantic theory of public (conventional) meaning (Millikan 1984; 2005) grounded in a generalized theory of evolution explains the proliferation of public language forms in terms of their adaptive proper function. It has also been suggested that the emergence of meaning can be formalized with game-theoretical tools (Skyrms 2010) within signaling systems of coordination. I want to show how closely related these approaches are, both in terms of explanandum and of outcomes. To put it in a nutshell: if the emergence of public meaning can be satisfyingly explained in terms of signaling games, then cultural evolutionary dynamics will serve as an adequate model to describe their proliferation. Public or conventional meaning (in contrast to personal meaning) can be fully understood in terms of its evolutionary function in a population of communicators. Furthermore, I want to argue that this understanding of conventional meaning could lead us to a strong semantic holism.
The View from Here is a study of our most fundamental attitudes toward the past. The book explores the dynamics of affirmation and regret, tracing the connections of each to our ongoing attachments. The focus is on situations in which our attachments commit us to affirming events or decisions that we know to have been unfortunate or regrettable.
In this article we argue for the existence of ‘analogue simulation’ as a novel form of scientific inference with the potential to be confirmatory. This notion is distinct from the modes of analogical reasoning detailed in the literature, and draws inspiration from fluid dynamical ‘dumb hole’ analogues to gravitational black holes. For that case, which is considered in detail, we defend the claim that the phenomenon of gravitational Hawking radiation could be confirmed in the case that its counterpart is detected within experiments conducted on diverse realizations of the analogue model. A prospectus is given for further potential cases of analogue simulation in contemporary science. 1 Introduction 2 Physical Background 2.1 Hawking radiation in semi-classical gravity 2.2 Modelling sound in fluids 2.3 The acoustic analogue model of Hawking radiation 3 Simulation and Analogy in Physical Theory 3.1 Analogical reasoning and analogue simulation 3.2 Confirmation via analogue simulation 3.3 Recapitulation 4 The Sound of Silence: Analogical Insights into Gravity 4.1 Experimental realization of analogue models 4.2 Universality and the Hawking effect 4.3 Confirmation of gravitational Hawking radiation 5 Prospectus.
Brain machine interface (BMI) technology makes direct communication between the brain and a machine possible by means of electrodes. This paper reviews the existing and emerging technologies in this field and offers a systematic inquiry into the relevant ethical problems that are likely to emerge in the following decades.
As we learn more about the human brain, novel biotechnological means to modulate human behaviour and emotional dispositions become possible. These technologies could be used to enhance our morality. Moral bioenhancement, an instance of human enhancement, alters a person’s dispositions, emotions or behaviour in order to make that person more moral. I will argue that moral bioenhancement could be carried out in three different ways. The first strategy, well known from science fiction, is behavioural enhancement. The second strategy, favoured by prominent defenders of moral bioenhancement, is emotional enhancement. The third strategy is the enhancement of moral dispositions, such as empathy and inequity aversion. I will argue that we ought to implement a combination of the second and third strategies. Furthermore, I will argue that the usual arguments against other instances of human enhancement do not apply to moral bioenhancement, or apply only to the first strategy, behavioural enhancement.
This article will explore a problem related to our moral obligations towards species. Although the re-creation of extinct animals has been discussed to some degree both in lay deliberations and by scientists, advocates tend to emphasize the technological and scientific value of such an endeavour, and the “coolness” factor. This article will provide an argument in favour of re-creation based on normative considerations. The environmentalist community generally accepts that it is wrong to exterminate species, for reasons beyond any instrumental value these species may have. It is often also claimed that humanity has a collective responsibility to either preserve or at least not exterminate species. These two beliefs are here assumed to be correct. The argument presented here takes these two ideas as its point of departure and places them in a deontological framework, from which it is argued that when humanity causes the extinction of a species, this is a moral transgression, entailing a residual obligation. Such an obligation implies a positive duty to mitigate any harm caused by our moral failure. In light of recent scientific progress in the field of genetic engineering, it will be argued that humanity has a prima facie obligation to re-create species whose extinction mankind may have caused, also known as de-extinction.
An intelligent machine surpassing human intelligence across a wide set of skills has been proposed as a possible existential catastrophe. Among those concerned about existential risk related to artificial intelligence, it is common to assume that AI will not only be very intelligent, but also be a general agent. This article explores the characteristics of machine agency, and what it would mean for a machine to become a general agent. In particular, it does so by articulating some important differences between belief and desire in the context of machine agency. One such difference is that while an agent can by itself acquire new beliefs through learning, desires need to be derived from preexisting desires or acquired with the help of an external influence. Such influence could be a human programmer or natural selection. We argue that to become a general agent, a machine needs productive desires, or desires that can direct behavior across multiple contexts. However, productive desires cannot sui generis be derived from non-productive desires. Thus, even though general agency in AI could in principle be created by human agents, general agency cannot be spontaneously produced by a non-general AI agent through an endogenous process. In conclusion, we argue that a common AI scenario, where general agency suddenly emerges in a non-general agent AI, such as DeepMind’s superintelligent board game AI AlphaZero, is not plausible.
Since the publication of Clark and Chalmers' Extended Mind paper, the central claims of that paper, viz. the thesis that cognitive processes and cognitive or mental states extend beyond the brain and body, have been vigorously debated within philosophy of mind and philosophy of cognitive science. Both defenders and detractors of these claims have since marshalled an impressive battery of arguments for and against “active externalism.” However, despite the amount of philosophical energy expended, this debate remains far from settled. We argue that this debate can be understood as answering two metaphysical questions. Yet prominent voices within the debate have assumed that there is a tight relationship between these two questions such that one question can be answered via the other. We defend an alternative ‘wide’ view, whereby mentality is understood as constituted by wide social and cultural factors. Our wide view entails that the two metaphysical questions are separate and should be kept distinct. This suggests that active externalism as understood by prominent voices within that debate requires dissolution, rather than solution. However, if the debate were instead understood as only focusing on the second of the two questions, then there could be a possible future for this debate.
In this paper we give a short semantic proof of strong normalization for full propositional classical natural deduction. The proof adapts the reducibility candidates introduced by J.-Y. Girard, as simplified for the classical case by M. Parigot.
We present a novel procedure to engage the public in ethical deliberations on the potential impacts of brain machine interface technology. We call this procedure a convergence seminar, a form of scenario-based group discussion that is founded on the idea of hypothetical retrospection. The theoretical background of this procedure and the results of five seminars are presented.
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision-theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which also provides a subjective implementation of the Born rule but derives it from empirical data rather than decision-theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.
Hamiltonian constraints feature in the canonical formulation of general relativity. Unlike typical constraints, they cannot be associated with a reduction procedure leading to a non-trivial reduced phase space, and this means that the physical interpretation of their quantum analogues is ambiguous. In particular, can we assume that “quantisation commutes with reduction” and treat the promotion of these constraints to operators annihilating the wave function, according to a Dirac-type procedure, as leading to a Hilbert space equivalent to that reached by quantisation of the problematic reduced space? If not, how should we interpret Hamiltonian constraints quantum mechanically? And on what basis do we assert that quantisation and reduction commute anyway? These questions will be refined and explored in the context of modern approaches to the quantisation of canonical general relativity.
We study the model theory of random variables using finitary integral logic. We prove the definability of certain probability-theoretic concepts, such as having F as distribution function, independence, and the martingale property. We then deduce Kolmogorov's existence theorem from the compactness theorem.
In 1990, J.-L. Krivine introduced the notion of storage operators. These are $\lambda$-terms which simulate call-by-value in the call-by-name strategy and can be used to model assignment instructions. J.-L. Krivine showed that, using the Gödel translation of classical into intuitionistic logic, there is a very simple second-order type for storage operators in the AF2 type system. In order to model control operators, J.-L. Krivine extended the system AF2 to classical logic. In his system the uniqueness of integer representation is lost, but he showed that storage operators typable in the system AF2 can be used to find the values of classical integers. In this paper, we present a new classical type system based on a logical system called mixed logic. We prove that in this system we can characterize, by types, the storage operators and the control operators.
A complex brain network, centered on the hippocampus, supports episodic memories throughout their lifetimes. Classically, upon memory encoding during active behavior, hippocampal activity is dominated by theta oscillations (6-10 Hz). During inactivity, hippocampal neurons burst synchronously, constituting sharp waves, which can propagate to other structures, theoretically supporting memory consolidation. This 'two-stage' model has been updated by new data from high-density electrophysiological recordings in animals that shed light on how information is encoded and exchanged between the hippocampus, neocortex and subcortical structures such as the striatum. Cell assemblies (tightly related groups of cells) discharge together and synchronize across brain structures, orchestrated by theta, sharp waves and slow oscillations, to encode information. This evolving dynamical schema is key to extending our understanding of memory processes.
The ontic structural realist stance is motivated by a desire to do philosophical justice to the success of science, whilst withstanding the metaphysical undermining generated by the various species of ontological underdetermination. We are, however, as yet in want of general principles to provide a scaffold for the explicit construction of structural ontologies. Here we will attempt to bridge this gap by utilizing the formal procedure of quantization as a guide to the ontic structure of modern physical theory. The example of non-relativistic particle mechanics will be considered and, for that case, it will be shown that a viable candidate for an ontic structural realist framework can be constituted in terms of the combination of a state space with Poisson bracket structure and a set of observables with Lie algebra structure. 1 Introduction 2 Formulation Underdetermination and Structural Ontologies 3 Quantization and Structural Ontologies 4 The Case of Non-relativistic Particle Mechanics 4.1 Formulation underdetermination 4.2 The classical ontology 4.3 Quantum theory and the generalized structural ontology 4.4 Interpretation of results 5 Conclusion and Prospects.
A large body of literature agrees that persons with schizophrenia suffer from a Theory of Mind (ToM) deficit. However, most empirical studies have focused on third-person, egocentric ToM, underestimating other facets of this complex cognitive skill. The aim of this research is to examine the ToM of persons with schizophrenia in its various aspects, to determine whether some components are more impaired than others. We developed a Theory of Mind Assessment Scale (Th.o.m.a.s.) and administered it to 22 persons with a DSM-IV diagnosis of schizophrenia and a matching control group. Th.o.m.a.s. is a semi-structured interview which allows a multi-component measurement of ToM. Both groups were also administered a few existing ToM tasks, and the schizophrenic subjects were administered the Positive and Negative Symptoms Scale and the WAIS-R. The schizophrenic persons performed worse than controls on all the ToM measurements; however, these deficits appeared to be differently distributed among different components of ToM. Our conclusion is that ToM deficits are not unitary in schizophrenia, which also testifies to the importance of a complete and articulated investigation of ToM.
Is philosophy continuous with science or does it have a distinctive domain of inquiry that differs from that of the special sciences? Collingwood claimed that philosophy has a distinctive subject matter and a distinctive method. Its distinctive subject matter is what he called the “absolute presuppositions” that govern the special sciences and its method consists in making these presuppositions explicit by showing that they are entailed by the questions asked in the special sciences. In this chapter the editors seek to provide a guide to the diverging interpretations of Collingwood’s claim that metaphysics is not the study of pure being but of the presuppositions that govern knowledge of reality. They argue that a reassessment of his contribution to philosophical methodology is timely in the light of the recent revival of interest in second-order questions concerning the role and character of philosophical analysis.
This study extends earlier findings on ethical leadership by testing the relationship of the seven dimensions of ethical leadership with job satisfaction, organizational commitment, job embeddedness, and cynicism. It uses time-lagged data from 585 employees in Pakistan. Using confirmatory factor analysis and structural equation modeling, the study supports the concept of multidimensional ethical leadership in an Eastern setting and indicates that the dimensions of people orientation, fairness, power sharing, ethical guidance, and role clarification are associated with the majority of outcomes, as suggested by social exchange theory. However, the concern for sustainability and integrity dimensions had a limited effect.
Conventional and well-established guidelines for the ethical conduct of clinical research are necessary but not sufficient for addressing research dilemmas related to public health research. There is a particular need for a public health ethics framework when, in the face of an epidemic, research is urgently needed to promote the common good. While there is limited experience in the use of a public health ethics framework, the value and potential of such an approach is increasingly being appreciated. Here we use two examples of adolescent women as potential candidates for participation in microbicide trials to illustrate how ethical decisions for public health research can be enhanced by drawing on both traditional research ethics guidance and the emerging framework for public health ethics.
Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.
We propose an operator constraint equation for the wavefunction of the Universe that admits genuine evolution. While the corresponding classical theory is equivalent to the canonical decomposition of General Relativity, the quantum theory contains an evolution equation distinct from standard Wheeler–DeWitt cosmology. Furthermore, the local symmetry principle—and corresponding observables—of the theory have a direct interpretation in terms of a conventional gauge theory, where the gauge symmetry group is that of spatial conformal diffeomorphisms (that preserve the spatial volume of the Universe). The global evolution is in terms of an arbitrary parameter that serves only as an unobservable label for successive states of the Universe. Our proposal follows unambiguously from a suggestion of York whereby the independently specifiable initial data in the action principle of General Relativity is given by a conformal geometry and the spatial average of the York time on the spacelike hypersurfaces that bound the variation. Remarkably, such a variational principle uniquely selects the form of the constraints of the theory so that we can establish a precise notion of both symmetry and evolution in quantum gravity.
In 1990, J.-L. Krivine introduced the notion of storage operators to simulate “call by value” in the “call by name” strategy. J.-L. Krivine showed that, using the Gödel translation of classical into intuitionistic logic, one can find a simple type for storage operators in the AF2 type system. This paper studies the ∀-positive types and the Gödel transformations of the TTR type system. Using syntactic methods, we generalize Krivine's theorem to these types and transformations. We give a proof of this result in the case of the type of recursive integers.
This paper uses the example of the COVID-19 pandemic to analyse the danger associated with insufficient epistemic pluralism in evidence-based public health policy. Drawing on certain elements in Paul Feyerabend’s political philosophy of science, it discusses reasons for implementing more pluralism as well as challenges to be tackled on the way forward.