Our aim in this paper is to take quite seriously Heinz Post's claim that the non-individuality and indiscernibility of quantum objects should be introduced right at the start, rather than imposed a posteriori through symmetry conditions. Using a different mathematical framework, namely quasi-set theory, we avoid working within a "label-tensor-product-vector-space formalism", to use Redhead and Teller's words, and obtain a more intuitive way of dealing with the formalism of quantum mechanics, although the underlying logic must be modified. This paper can thus be regarded as an attempt to follow and enlarge Heisenberg's suggestion that new phenomena require the formation of a new "closed" (that is, axiomatic) theory, one that also addresses the physical theory's underlying logic and mathematics.
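For reference, the kind of construction the quasi-set approach is meant to avoid: in the standard label-tensor-product formalism, two indistinguishable bosons are first assigned labels 1 and 2, and indistinguishability is then imposed a posteriori by symmetrizing the labeled state. This is a textbook illustration, not a formula from the paper itself.

```latex
% A posteriori symmetrization in the label-tensor-product formalism:
% particles first receive labels 1 and 2, and permutation symmetry is
% imposed afterwards on the labeled product state.
\[
  |\psi\rangle_{\mathrm{sym}}
    = \frac{1}{\sqrt{2}}
      \left( |a\rangle_{1} \otimes |b\rangle_{2}
           + |b\rangle_{1} \otimes |a\rangle_{2} \right)
\]
```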
As a follow-up to a large-scale ethics survey of neuroscientists whose research involves neuroimaging, brain stimulation and imaging genetics, we conducted focus groups and interviews to explore their sense of responsibility for integrating ethics into neuroimaging and their readiness to adopt new ethics strategies as part of their research. Safety, trust and virtue were key motivators for incorporating ethics into neuroimaging research. Managing incidental findings emerged as a predominant daily challenge for faculty, while student reports focused on the malleability of neuroimaging data and on scientific integrity. The most frequently cited barrier was the time and administrative burden associated with the ethics review process. Lack of scholarly training in ethics also emerged as a major barrier. Participants constructively offered remedies to these challenges: the development and dissemination of best practices, and standardized ethics review for minimally invasive neuroimaging protocols. Students in particular urged changes to curricula to include early, focused training in ethics.
Background: The research community has a mandate to discover effective treatments for neurodegenerative disorders. The ethics landscape surrounding this mandate is in a constant state of flux, and ongoing challenges place ever greater demands on investigators to be accountable to the public and to answer questions about the implications of their work for health care, society, and policy. Methods: We surveyed US-based investigators involved in neurodegenerative disease research about how they value ethics-related issues, what motivates them to give consideration to those issues, and the barriers to doing so. Using the NIH CRISP database we identified 1,034 researchers with relevant, active grants and invited them to complete an online questionnaire; we received 193 responses. We used exploratory factor analysis to transform individual survey questions into a smaller set of factors, and linear regression to understand the effect of key variables of interest on the factor scores. Results: Ethics-related issues clustered into two groups: research ethics and external influences. Heads of research groups viewed issues of research ethics as more important than the other respondents did. Concern about external influences was related to overall interest in ethics. Motivators clustered into five groups: ensuring public understanding, external forces, requirements, values, and press and public. Heads of research groups were more motivated than other respondents to ensure public understanding of research. Barriers clustered into four groups: lack of resources, administrative burden, relevance to the research, and lack of interest. Perceived lack of ethics resources was a particular barrier for investigators working in drug discovery. Conclusions: The data suggest that senior neuroscientists working in the field of neurodegeneration (ND), and in drug discovery specifically, are motivated to consider ethics issues related to their work, but that a perceived lack of ethics resources thwarts their efforts. Given that bioethics centres exist at more than 50% of the institutions where these respondents are based, the neuroscience and bioethics communities appear to be disconnected. Dedicated ethical, legal and social implications (ELSI) programs, such as those fully integrated into genetics and regenerative medicine, provide models for achieving meaningful partnerships not yet adequately realized for scholars and trainees interested in drug discovery for ND.
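A minimal sketch of the two-step analysis described in the Methods: exploratory factor analysis to compress survey items into factor scores, followed by per-factor linear regressions on respondent attributes. The data, item count, factor count, and predictor names below are hypothetical stand-ins, not the study's actual instrument; scikit-learn's FactorAnalysis serves as a generic stand-in for the authors' EFA procedure.

```python
# Sketch of the survey-analysis pipeline described above: reduce Likert-style
# survey items to a few latent factors, then regress each factor score on
# respondent attributes. Data, item count, and factor count are hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_respondents, n_items = 193, 20          # 193 matches the reported responses
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

# Exploratory factor analysis: collapse the items into, e.g., 4 latent factors.
fa = FactorAnalysis(n_components=4, random_state=0)
factor_scores = fa.fit_transform(responses)      # shape: (193, 4)

# Hypothetical respondent variables: head of research group, works in drug
# discovery (binary indicators).
X = rng.integers(0, 2, size=(n_respondents, 2)).astype(float)

# One regression per factor: which respondent variables predict the factor?
for k in range(factor_scores.shape[1]):
    model = LinearRegression().fit(X, factor_scores[:, k])
    print(f"factor {k}: coefficients {model.coef_}, intercept {model.intercept_:.3f}")
```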
George W. Bush is not only America’s president, but also its most prominent moralist. No other president in living memory has spoken so often about good and evil, right and wrong. […] But in what moral truths does the president believe? Considering how much the president says about ethics, it is surprising how little serious discussion there has been of the moral philosophy of George W. Bush.
The physics and metaphysics of quantum field theory. Book review by Federico Laudisa (Department of Human Sciences "R. Massa", University of Milan-Bicocca, Piazza Ateneo Nuovo 1, 20126 Milan, Italy). Metascience, pp. 1-3. DOI: 10.1007/s11016-011-9609-2. Online ISSN 1467-9981; Print ISSN 0815-0796.
International book reviews. By Federico Leoni (Università degli Studi di Milano, Milan, Italy). Continental Philosophy Review. DOI: 10.1007/s11007-010-9162-5. Online ISSN 1573-1103; Print ISSN 1387-2842.
The law of coherence helps us understand the physical force behind the increasing complexity of the evolutionary process, from quanta, to cells, to self-awareness and collective consciousness. The coherent electromagnetic field is the inner glue of every system, the "intelligent" energy-information communication that assures cooperative and synergic behavior among all the components of the system as a whole, allowing harmonious evolution and unity of consciousness. Neuropsychological experiments show that the different brain areas communicate with more or less coherence according to different states of consciousness: high values are correlated with states of psychophysical integrity and well-being, whereas low values are correlated with states of conflict and depression. If we extend these brain findings isomorphically, we obtain four main general states of coherence, ranging from disintegration to unity. This represents an important element in General System Theory for differentiating between inanimate and animate systems, and for understanding how billions of cells become a single living organism, and how billions of human beings might eventually generate a planetary consciousness. In this light, the resolution of the global ecosystem crisis requires a human transformation from a low to a highly coherent state of consciousness. The key to the entire process seems to be the coherent nature of consciousness.
The focus of this paper is the prima facie plausible view, expressed by the principle of Counter-Closure, that knowledge-yielding competent deductive inference must issue from known premises. I construct a case that arguably falsifies this principle and consider five available lines of response that might help retain Counter-Closure. I argue that three are problematic. Of the two remaining lines of response, the first relies on non-universal intuitions and forces one to view the case I construct as exhibiting a justified, true belief to which none of the usual diagnoses of knowledge failure in Gettier cases apply. The second line involves claiming that Fake Barns and its ilk are misdiagnosed by epistemological orthodoxy as Gettier cases. We are thus confronted by a trilemma: either the case I discuss undermines the first-blush plausible principle of Counter-Closure; or the case I discuss instantiates a novel kind of Gettier case; or a popular conception of a key range of alleged Gettier cases must be rejected. No matter which horn we choose, the case points to a philosophically curious conclusion.
On the basis of a mistaken interpretation of the Bell theorem, it has been repeatedly claimed in recent times that we are forced by experiments to drop any possible form of realism in the foundations of quantum mechanics. In this paper I defend the simple thesis that this claim cannot be consistently supported: the Bell theorem does not concern realism, and realism per se cannot be refuted by any quantum experiment. As a consequence, realism in quantum mechanics is not something that can simply be explained away once and for all on the basis of experiments, but rather something that must be conceptually characterized and discussed in terms of its foundational virtues and vices. To assess it, we must rely not on experimentation but on philosophical discussion: realism is not a phlogiston-like notion, despite the efforts of the contemporary quantum orthodoxy to conceive of it, in Russellian terms, as a relic of a bygone age.
I argue that DeRose's attributor contextualism cannot straightforwardly preserve the widespread view that, when a subject believes q solely on the basis of competent deduction from p, knowledge of q requires knowledge of p. I present a novel challenge to the compatibility of this widespread view with DeRose's contextualism, then argue that the tension can be resolved in only one of two ways: DeRose must either reject the widespread view or accept the existence of a range of contextualism-specific Gettier-style cases.
Relational quantum mechanics is an interpretation of quantum theory which discards the notions of the absolute state of a system, the absolute value of its physical quantities, and the absolute event. The theory describes only the way systems affect each other in the course of physical interactions. States and physical quantities always refer to the interaction, or the relation, between two systems. Nevertheless, the theory is assumed to be complete. The physical content of quantum theory is understood as expressing the net of relations connecting all the different physical systems.
The relations between the majority and minorities in a democracy have standardly been viewed as the main subject matter of toleration: the majority should refrain from using its dominant position to interfere with some minorities' practices or beliefs despite its dislike or disapproval of such practices or beliefs. Can the idea of toleration provide us with the necessary resources to understand and respond to the problems arising out of majority/minorities relations in a democracy? We reply in the negative and make two main claims: first, that resorting to toleration is not enough to make sense of the problems deriving from the unequal participation of minorities in society, and, second, that it risks sanctioning the asymmetric relation between the majority and minorities informed by the negative judgement of the former towards some belief or practice of the latter. We suggest resorting instead to the idea of equal opacity respect for persons: all persons should be treated equally as moral agents, in accordance with their equally possessing the capacity for self-legislation, and as if they were opaque to our judgement for all those properties of theirs which exceed moral agency. Looking at majority/minorities relations through such a lens enables us to understand (and appropriately respond to) what is problematic in such relations: the majority often fails to treat minorities as moral agents by failing to take their voices into account on an equal footing, by seeing them merely as recipients of certain provisions affecting them rather than as their authors, and by considering them as legitimately exposed to the majority's (negative) judgement. The purchase of our argument is illustrated by reference to two minorities whose treatment is paradigmatic of the problematic nature of majority/minorities relations across Europe: Muslims and Roma.
The principle of Counter-Closure embodies the widespread view that when a proposition is believed solely as the conclusion of a single-premise deduction, it can be known only if the premise is also known. I raise a problem for the compatibility of Jason Stanley's Interest-Relative Invariantism (IRI) with Counter-Closure. I explore the landscape of options that might help Stanley resolve this tension and argue that a trilemma confronts him: he must either (i) renounce a key intuition that lies at the foundation of his view; or (ii) admit into his epistemology an IRI-specific novel brand of Gettier case; or (iii) abandon Counter-Closure.
In the very days in which Albert Einstein was completing his formulation of the theory of general relativity, David Hilbert arrived at the same field equations by following a different path and different mathematical procedures. In this article, both routes to the same formal result are analyzed, together with the exchange of letters between the two scientists, highlighting their two different, but equally sharp, creative approaches.
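For reference, the field equations in question, at which both Einstein and Hilbert arrived in November 1915:

```latex
\[
  R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu}
    = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]
```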
The necessity of modeling the mental ingredients of norm compliance is a controversial issue within the study of norms. So far, the simulation-based study of norm emergence has shown a prevailing tendency to model norm conformity as thoughtless behavior, emerging from social learning and imitation rather than from specific, norm-related mental representations. In this paper the opposite stance is taken, namely a view of norms as hybrid, two-faceted phenomena with both a behavioral/social and an internal/mental side. Such a view is aimed at accounting for the difference between norms, on the one hand, and mere behavioral regularities (conventions), on the other. This paper, in particular, seeks to identify the internal ingredients required for that distinction, i.e., to model norms as distinct from mere conventions, defining norms as behaviors that spread to the extent that, and because, the corresponding commands and beliefs spread as well. After a brief presentation of a normative agent architecture, the results of agent-based simulations testing the impact of norm recognition and the role of normative beliefs in the emergence and innovation of social norms are presented and discussed. More specifically, the present work endeavours to show that a sudden external constraint (e.g. a barrier preventing agents from moving among social settings) facilitates norm innovation: under such a condition, agents provided with a module for telling what a norm is can generate new (social) norms by forming new normative beliefs, irrespective of the most frequent actions.
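A toy sketch of the contrast the abstract draws between imitation-driven conformity and conformity mediated by normative beliefs. All rules, thresholds and parameters below are illustrative assumptions, not the paper's actual agent architecture or simulation design.

```python
# Toy contrast between imitation-driven conformity and norm-based conformity,
# loosely inspired by the setup described above. All rules and parameters
# are illustrative assumptions, not the paper's model.
import random

random.seed(1)
ACTIONS = ["A", "B"]

class Agent:
    def __init__(self, recognizes_norms):
        self.action = random.choice(ACTIONS)
        self.recognizes_norms = recognizes_norms
        self.normative_belief = None      # an action believed to be "the norm"

    def step(self, neighbours):
        counts = {a: sum(n.action == a for n in neighbours) for a in ACTIONS}
        majority = max(counts, key=counts.get)
        if self.recognizes_norms:
            # Form a normative belief once an action clearly dominates, then
            # comply with it even if local frequencies later drift.
            if self.normative_belief is None and counts[majority] >= 0.8 * len(neighbours):
                self.normative_belief = majority
            self.action = self.normative_belief or majority
        else:
            self.action = majority        # pure imitation of the local majority

def run(recognizes_norms, steps=30, n=50):
    agents = [Agent(recognizes_norms) for _ in range(n)]
    for _ in range(steps):
        for agent in agents:
            agent.step(random.sample(agents, 10))
    return sum(agent.action == "A" for agent in agents)

print("imitators ending on A:       ", run(False))
print("norm-recognizers ending on A:", run(True))
```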
The status of a causal approach to EPR-Bell nonlocal correlations in terms of a counterfactual framework for causation is considered. It is argued that when the relativistic spacetime structure of the events is taken into due account, the adoption of this approach is best motivated by the assumption of a preferred frame of reference, an assumption that seems even more in need of justification than the causal theory itself.
In the context of stochastic hidden variable theories, Howard has argued that the role of separability (spatially separated systems possess distinct real states) has been underestimated. Howard claims that separability is equivalent to Jarrett's completeness: this equivalence would imply that the Bell theorem forces us to give up either separability or locality. Howard's claim, however, is shown to be ill-founded, since it is based on an implausible assumption. The necessity of sharply distinguishing separability and locality is emphasized: a quantitative formulation of separability, due to D'Espagnat, is reviewed and found unsatisfactory, in that it basically conflates separability and locality into a single notion. Finally, the possibility of an 'Einsteinian' nonseparable realism, envisaged by Shimony, is reviewed and also found implausible.
Ainslie advances Freud's and Skinner's theories of homunculi by basing their emergent complexity on the interaction of simple algorithms. The rules of competition and cooperation of these interests are underspecified, but they provide a new way of thinking about the basic elements of conditioning, particularly conditioned stimuli (CSs).
Over recent years, various semantics have been proposed for dealing with updates in the setting of logic programs. The availability of different semantics naturally raises the question of which are most adequate for modeling updates. A systematic approach to this question is to identify general principles against which such semantics can be evaluated. In this paper we motivate and introduce one such new principle: the refined extension principle. This principle is satisfied by the stable model semantics for (single) logic programs. It turns out that none of the existing semantics for logic program updates, even though they are generalisations of the stable model semantics, complies with this principle. For this reason, we define a refinement of the dynamic stable model semantics for Dynamic Logic Programs that complies with the principle.
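For readers unfamiliar with the baseline the paper builds on, below is a minimal, brute-force sketch of the stable model semantics for a single ground normal logic program, via the Gelfond-Lifschitz reduct. The rule encoding and the exhaustive enumeration are illustrative simplifications, not the paper's formal machinery (which concerns sequences of programs, i.e., Dynamic Logic Programs).

```python
# Brute-force computation of stable models for a ground normal logic program,
# illustrating the (single-program) stable model semantics against which the
# refined extension principle is calibrated. A rule is a triple
# (head, positive_body, negative_body); the encoding is a toy illustration.
from itertools import chain, combinations

def reduct(program, candidate):
    """Gelfond-Lifschitz reduct: drop every rule whose negative body
    intersects the candidate set, then erase the remaining negative
    literals, leaving a negation-free program."""
    return [(h, pos) for (h, pos, neg) in program
            if not (set(neg) & candidate)]

def least_model(positive_program):
    """Least model of a negation-free program, by naive fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in positive_program:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(program):
    atoms = {h for (h, _, _) in program} | \
            {a for (_, pos, neg) in program for a in chain(pos, neg)}
    candidates = chain.from_iterable(
        combinations(sorted(atoms), r) for r in range(len(atoms) + 1))
    return [set(c) for c in candidates
            if least_model(reduct(program, set(c))) == set(c)]

# p :- not q.    q :- not p.    -- two stable models: {p} and {q}
program = [("p", [], ["q"]), ("q", [], ["p"])]
print(stable_models(program))   # [{'p'}, {'q'}]
```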
According to a widespread view, the Bell theorem establishes the untenability of so-called 'local realism'. On the basis of this view, recent proposals by Leggett, Zeilinger and others have been developed, according to which it can be proved that even some non-local realistic theories have to be ruled out. As a consequence, within this view the Bell theorem allows one to establish that no reasonable form of realism, be it local or non-local, can be made compatible with the (experimentally tested) predictions of quantum mechanics. In the present paper it is argued that the Bell theorem has demonstrably nothing to do with 'realism' as defined by these authors and that, as a consequence, their conclusions about the foundational significance of the Bell theorem are unjustified.
The article offers an interpretation of the controversial and apparently unacceptable characterization of poetry that Plato develops in the Republic. The main objectives of the discussion are: to clarify the motivations behind this characterization, to disentangle the multiple and discontinuous arguments that compose it, and to critically evaluate its merits and its limits. It is concluded that not all of the positions Plato adopts with respect to poetry are untenable, and that where they are, the reasons why prove particularly illuminating.
The Bell 1964 theorem states that nonlocality is a necessary feature of hidden variable theories that reproduce the statistical predictions of quantum mechanics. In view of the no-go theorems for non-contextual hidden variable theories already in place by 1964, due to Gleason and to Bell himself, one is forced to acknowledge the contextual character of the hidden variable theories to which the Bell 1964 theorem refers. Both the mathematical and the physical justifications of this contextualism are reconsidered. Consequently, the role of contextualism in recent no-hidden-variables proofs, and the import of these proofs, are investigated. With reference to the physical intuition underlying contextualism, the possibility is considered that a context-dependence of individual measurement results is compatible with the context-independence of the statistics of measurement results.
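For reference, the constraint at issue, in the CHSH generalization of Bell's 1964 inequality: any local hidden variable theory must satisfy

```latex
\[
  \bigl| E(a,b) - E(a,b') \bigr| + \bigl| E(a',b) + E(a',b') \bigr| \le 2,
\]
```

whereas quantum mechanics predicts a value of \(2\sqrt{2}\) for suitably chosen measurement settings.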
The problem of the biology of money is twofold: it subsumes both the identification of behavioral mechanisms that account for the power of money as an incentive, and the elucidation of the phylogeny of such mechanisms. The drugs–tool distinction, as articulated by Lea & Webley (L&W) in their fascinating synthesis, is a welcome step toward their solution. Compared to the direct invocation of instinctual drives, however, conditioning processes provide a conceptually and empirically clearer road from evolution to money. (Published online April 5, 2006.)
Transplantation continues to push the frontiers of medicine into domains that summon forth troublesome ethical questions. Looming on the frontier today is human facial transplantation. We develop criteria that, we maintain, must be satisfied in order to undertake this as-yet-untried transplant procedure ethically. We draw on the criteria advanced by Dr. Francis Moore in the late 1980s for introducing innovative procedures in transplant surgery. In addition to these, we insist that human face transplantation meet all the ethical requirements usually applied to health care research. We summarize the achievements of transplant surgery to date, focusing in particular on the safety and efficacy of immunosuppressive medications. We also emphasize the importance of risk/benefit assessments that take into account the physical, aesthetic, psychological, and social dimensions of facial disfiguration, reconstruction, and transplantation. Finally, we maintain that the time has come to move facial transplantation research into the clinical phase.
Nowadays, driven by the belief that the world will change dramatically by 2050, governments all over the world are searching for new visions to inspire alternative forms of development. These visions are founded on the idea that current development paradigms are deeply unsustainable in terms of energy production and consumption, and on the fear that, should these paradigms remain unchallenged, the resilience of our ecosystems will soon be surpassed. The specter of ecological catastrophe (and of the consequent economic and political collapse) has become a dominant element in global development politics, particularly in relation to present and future models of urbanization. According to a number of institutions ...
Novelty is a key concept for understanding creativity. Evaluating a piece of artwork or other creation in terms of novelty requires comparisons with other works and consideration of the elements that have been reused in the creative process. Human beings perform this analysis intuitively, but in order to simulate it using computers, the objects to be compared and the similarity metrics to be used must be formalized and explicitly implemented. In this paper we present a study of the elements relevant to the assessment of novelty in computer-generated narratives. We focus on the domain of folk-tales, working with simple plots and basic narrative elements: events, characters, props and scenarios. Based on the empirical results of this study, we propose a set of computational metrics for the automatic assessment of novelty. Although oriented toward the implementation of our own story generation system, the measurement methodology we propose can easily be generalized to other creative systems.
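As an illustration of the kind of metric the paper aims to formalize, here is a toy novelty score in which each plot is reduced to sets of its basic narrative elements (events, characters, props and scenarios) and novelty is measured against the most similar plot in a reference corpus. The weighted-Jaccard formulation and the weights are assumptions made for this sketch, not the authors' actual metrics.

```python
# Toy novelty score for a generated folk-tale plot, compared against a corpus
# of prior plots. Each plot is reduced to sets of basic narrative elements;
# novelty is one minus the maximum weighted Jaccard similarity to any prior
# plot. Representation and weights are illustrative assumptions.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def similarity(plot, other, weights):
    return sum(w * jaccard(set(plot[k]), set(other[k]))
               for k, w in weights.items())

def novelty(plot, corpus, weights):
    return 1.0 - max(similarity(plot, other, weights) for other in corpus)

weights = {"events": 0.4, "characters": 0.3, "props": 0.15, "scenarios": 0.15}
corpus = [
    {"events": ["departure", "test", "reward"], "characters": ["hero", "donor"],
     "props": ["sword"], "scenarios": ["forest"]},
]
new_tale = {"events": ["departure", "deception", "flight"],
            "characters": ["hero", "villain"],
            "props": ["ring"], "scenarios": ["castle"]}
print(f"novelty: {novelty(new_tale, corpus, weights):.2f}")
```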
We explore the distinctive characteristics of Mexico's society, politics and history that shaped the establishment of genetics in Mexico as a new disciplinary field, one that began in the early 20th century and was consolidated and institutionalized in the second half. We identify three stages in the institutionalization of genetics in Mexico. The first stage is exemplified by Edmundo Taboada, who led a research program initiated during the Cárdenas government (1934-1940) directed primarily at improving the condition of small Mexican farmers. Taboada was the first Mexican postgraduate investigator in phytotechnology and phytopathology, trained at Cornell University and the University of Minnesota in 1932 and 1933, respectively. He was the first investigator to teach plant genetics at the National School of Agriculture, and in 1938 he wrote the first textbook of general genetics, Genetics Notes. Taboada's single most important contribution to genetics was the production of "stabilized" corn varieties. The extensive exile of Spanish intellectuals to Mexico after the end of Spain's Civil War (1936-1939) had a major influence on Mexican science and characterizes the second stage. The three main personalities contributing to Mexican genetics were Federico Bonet de Marco and Bibiano Fernández Osorio Tafall, at the National School of Biological Sciences, and José Luis de la Loma y Oteyza, at the Chapingo Agriculture School. The main contribution of the Spanish exiles to the introduction of genetics in Mexico concerned teaching: they introduced genetics as a distinctive discipline within the biology curriculum at several universities and wrote genetics textbooks and manuals. The third stage is identified with Alfonso León de Garay, who in 1960 founded the Genetics and Radiobiology Program within the National Commission of Nuclear Energy, itself founded in 1956. The Genetics and Radiobiology Program rapidly became a disciplinary program, for it embraced research, teaching, and the training of academics and technicians. The Mexican Genetics Society, created by de Garay in 1966, and the development of strains and cultures for genetics research were important activities. One of de Garay's key requirements was the compulsory training of the Program's scientists for at least one or two years in the best universities of the United States and Europe. De Garay's role in the development of Mexican genetics was fundamental: his broad vision encompassed the practice of genetics in all its manifestations.
The article first briefly analyzes the context that made possible García Lorca's search for a new poetic language during his stay in New York in 1929-1930. It then examines the concept of the flâneur, drawing on the reflections Walter Benjamin developed around Charles Baudelaire's experience of the urban and social transformations that Paris underwent in the mid-nineteenth century, during the Second Empire in France. Finally, it inquires into the possible configuration of the figure of the flâneur in Federico García Lorca's Poeta en Nueva York.
From the advent of general-purpose, Turing-complete machines, the relations of operators, programmers and users with computers can be observed as those of interconnected informational organisms (inforgs), and can accordingly be analysed with the method of levels of abstraction (LoAs) that arose within the philosophy of information (PI). In this paper, the epistemological levellism proposed by L. Floridi within the PI to deal with LoAs is formalised in constructive terms using category theory, so that information itself is treated in terms of structure-preserving functions rather than Cartesian products. The milestones in the history of modern computing are then analysed through constructive levellism to show how the growth of system complexity led to more and more information hiding.
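A minimal sketch of the constructive reading described above: a level of abstraction modelled as a typed collection of observables, with a structure-preserving abstraction map between levels standing in (very loosely) for the categorical formalization, and information hiding visible in what the map discards. The types and the storage example are illustrative assumptions.

```python
# Sketch: two levels of abstraction (LoAs) for the same system and a
# structure-preserving abstraction map between them, illustrating how moving
# up a level hides information. Types and example are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class BlockLevel:            # low-level LoA: raw storage observables
    blocks: tuple            # e.g. ((block_id, bytes_used), ...)

@dataclass(frozen=True)
class FileLevel:             # high-level LoA: what the user observes
    total_bytes: int

def abstract(state: BlockLevel) -> FileLevel:
    """Abstraction map between LoAs: preserves the observable 'how much is
    stored' while hiding the block layout (information hiding)."""
    return FileLevel(total_bytes=sum(used for _, used in state.blocks))

def write_block(state: BlockLevel, block_id: int, used: int) -> BlockLevel:
    return BlockLevel(blocks=state.blocks + ((block_id, used),))

# Structure preservation (a commuting square): abstracting after a low-level
# update agrees with the corresponding high-level update.
s0 = BlockLevel(blocks=((0, 512), (1, 128)))
s1 = write_block(s0, 2, 256)
assert abstract(s1).total_bytes == abstract(s0).total_bytes + 256
print(abstract(s0), "->", abstract(s1))
```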