This paper aims to show that to think of the artificial is to think, at the same time, of man, nature, culture and society, not as separate entities but as elements of one and the same system, since, in its field of action, the artificial articulates dimensions that are at once natural, human, cultural and social. We usually call artificial both the procedure through which we plan the realisation of something and the product of that plan: the artefact. The artefact incorporates, into the physical and inanimate dimensions of nature, dimensions proper to its producer. When the artificial imitates and reproduces certain aspects of nature, its action is directed towards improving man's life. In this perspective, we argue that the artificial cannot be viewed as a factor external to man and to man's social life.
In recent years, Reichenbach’s 1920 conception of the principles of coordination has attracted increased attention after Michael Friedman’s attempt to revive Reichenbach’s idea of a “relativized a priori”. This paper follows the origin and development of this idea in the framework of Reichenbach’s distinction between the axioms of coordination and the axioms of connection. It suggests a further differentiation among the coordinating axioms and accordingly proposes a different account of Reichenbach’s “relativized a priori”.
In the early 1920s, Hans Reichenbach and Kurt Lewin presented two topological accounts of time that appear to be interrelated in more than one respect. Despite their different approaches, their shared underlying idea is that time order is derived from specific structural properties of the world. In both works, moreover, the notion of genidentity, i.e., identity through or over time, plays a crucial role. Although it is well known that Reichenbach borrowed this notion from Lewin, not much has been written about their relationship, nor about the way Lewin implemented this notion in his own work in order to ground his topology. This paper examines these two early versions of the topology of time and traces the extent of Lewin's influence on Reichenbach's proposal.
A novel and versatile polarization-entanglement scheme is adopted to investigate the violation of EPR local realism for a non-maximally entangled two-photon system, following the recent nonlocality proof by Lucien Hardy. In this context, a sophisticated detection method allows direct determination of any element of physical reality (viz., one determined with probability equal to unity, in the words of Einstein, Podolsky and Rosen) for the pair system within complete measurements that are largely insensitive to detector quantum efficiencies and noise.
Recently the complexity of discursive practices has been widely acknowledged by the humanities and social sciences. In fact, to know anything is to know it in terms of one or more discourses. The "discursive turn" in psychology may be considered a new paradigm oriented to a proper study of (wo)man only if it is able to grasp the semiotic ground of psychic experience both as an "effort after meaning" and as a "struggle over meaning." In this sense the notion of "diatext" has been proposed as a contribution to working out a psychosemiotic approach to understanding how discursive practices assign subject-positions to the agents of each interlocution scenario.
Because Bertrand Russell adopted much of the logical symbolism of Peano, because Russell always had a high regard for the great Italian mathematician, and because Russell held the logicist thesis so strongly, many English-speaking mathematicians have been led to classify Peano as a logicist, or at least as a forerunner of the logicist school. An attempt is made here to deny this by showing that Peano's primary interest was in axiomatics, that he never used the mathematical logic he developed for the reduction of mathematical concepts to logical concepts, and that, instead, he denied the validity of such a reduction.
Review of Giovanni Sommaruga (ed.), Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information. Giuseppe Primiero (Centre for Logic and Philosophy of Science, University of Ghent, Blandijnberg 2, 9000 Ghent, Belgium). Minds and Machines, Volume 21, Number 1, pp. 119-122. DOI 10.1007/s11023-011-9228-0. Online ISSN 1572-8641, Print ISSN 0924-6495.
The foundation of mathematics is both a logico-formal issue and an epistemological one. By the first, we mean the explicitation and analysis of formal proof principles, which, largely a posteriori, ground proof on general deduction rules and schemata. By the second, we mean the investigation of the constitutive genesis of concepts and structures, which is the aim of this paper. This "genealogy of concepts", so dear to Riemann, Poincaré and Enriques among others, is necessary both in order to enrich the foundational analysis with an often disregarded aspect (the cognitive and historical constitution of mathematical structures) and because of the provable incompleteness of proof principles even in the analysis of deduction. For the purposes of our investigation, we will sketch a philosophical frame as well as some recent experimental studies on numerical cognition that support our claim about the cognitive origin and the constitutive role of mathematical intuition.
We propose a new interpretation of objective deterministic chances in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab or a subsystem of the universe), and quantifies (1) the difficulty of realizing one physical state given another, (2) the 'distance' (in terms of physical resources) from one physical state to another, and (3) the size of the set of time-complexity functions that are compatible with the physical resources required to reach one physical state from another.
Is any unified theory of brain function possible? Following a line of thought dating back to early cybernetics (see, e.g., Cordeschi, 2002), Clark (in press) has proposed action-oriented Hierarchical Predictive Coding (HPC) as the account to be pursued in the effort of gaining the "Grand Unified Theory of the Mind", or "painting the big picture," as Edelman (2012) put it. Such a line of thought is indeed appealing, but to be effectively pursued it should be confronted with experimental findings and explanatory capabilities (Edelman, 2012). The point we are making in this note is that a brain with predictive capabilities is certainly necessary to endow an agent situated in the environment with forethought or foresight, a crucial issue for outlining the unified account advocated by Clark. But the capacity for forethought is deeply entangled with the capacity for emotions, and when emotions are brought into the game, cognitive functions become part of a large-scale functional brain network. However, a consistent view of hierarchical organization in such large-scale functional networks has yet to emerge (Bressler and Menon, 2010), whilst heterarchical organization is likely to play a strategic role (Berntson et al., 2012). This raises the necessity of a multilevel approach that embraces causal relations across levels of explanation in either direction (bottom-up or top-down), endorsing mutual calibration of constructs across levels (Berntson et al., 2012), which, in turn, calls for a revised perspective on Marr's levels-of-analysis framework (Marr, 1982). In the following we highlight some drawbacks of Clark's proposal in addressing the above issues.
Various conceptual approaches to the notion of information can currently be traced in the literature on logic and formal epistemology. A main issue of disagreement is the attribution of truthfulness to informational data, the so-called Veridicality Thesis (Floridi 2005). The notion of Epistemic Constructive Information (Primiero 2007) is one of those rejecting VT. The present paper develops a formal framework for ECI. It extends the basic approach of Artemov's logic of proofs (Artemov 1994), representing an epistemic logic based on dependent justifications, where the definition of information relies on a strict distinction from factual truth. The definition obtained by comparison with a normal modal logic translates a constructive logic of "becoming informed": its distinction from the logic of "being informed", which internalizes truthfulness, is essential to a general evaluation of information with respect to truth. The formal disentanglement of these two logics, and the description of the modal version of the former as a weaker embedding into the latter, allows for a proper understanding of the Veridicality Thesis with respect to epistemic states defined in terms of information.
Our visual experience seems to suggest that no continuous curve can cover every point of the unit square, yet in the late nineteenth century Giuseppe Peano proved that such a curve exists. Examples like this, particularly in analysis (in the sense of the infinitesimal calculus), received much attention in the nineteenth century. They helped instigate what Hans Hahn called a "crisis of intuition", wherein visual reasoning in mathematics came to be regarded as epistemically problematic. Hahn described this "crisis" as follows: Mathematicians had for a long time made use of supposedly geometric evidence as a means of proof in much too naive and much too uncritical a way, till the unclarities and mistakes that arose as a result forced a turnabout. Geometrical intuition was now declared to be inadmissible as a means of proof... (p. 67) Avoiding geometrical evidence, Hahn continued, mathematicians aware of this crisis pursued what he called "logicization", "when the discipline requires nothing but purely logical fundamental concepts and propositions for its development." On this view, an epistemically ideal mathematics would minimize, or avoid altogether, appeals to visual representations. This would be a radical reformation of past practice, necessary, according to its advocates, for avoiding "unclarities and mistakes" like the one exposed by Peano.
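Peano's result is existential as stated, but the construction behind it is concrete: each finite iterate of a space-filling curve is an ordinary polygonal path, and the iterates converge to a continuous curve covering the whole square. As an illustrative sketch (not from any of the works discussed here), the following Python code generates the iterates of Hilbert's 1891 variant of Peano's construction, using the standard index-to-coordinate conversion; the function names are this note's own.

```python
def hilbert_curve(order):
    """Return the sequence of grid points visited by the order-n Hilbert curve
    on a 2**order x 2**order grid. Each iterate is a polygonal path; as the
    order grows, the iterates converge to a curve covering the unit square."""

    def d2xy(n, d):
        # Standard conversion from position d along the curve to (x, y).
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:          # rotate the quadrant when needed
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y

    n = 2 ** order
    return [d2xy(n, d) for d in range(n * n)]

points = hilbert_curve(3)        # 8 x 8 grid, 64 points
assert len(set(points)) == 64    # every grid cell is visited exactly once
# consecutive points are adjacent, so each iterate is a continuous path
assert all(abs(x1 - x2) + abs(y1 - y2) == 1
           for (x1, y1), (x2, y2) in zip(points, points[1:]))
```

The two assertions capture exactly what makes the limit counterintuitive: each finite iterate is an ordinary continuous path, yet it already meets every cell of an arbitrarily fine grid.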
In his classic introduction to the subject, Cognitive Therapy and the Emotional Disorders, Aaron Beck observes that "the philosophical underpinnings" of cognitive therapy's (CT) approach to the emotional disorders "go back thousands of years, certainly to the time of the Stoics, who considered man's conceptions (or misconceptions) of events rather than the events themselves as the key to his emotional upsets" (Beck 1976, 3). But beyond acknowledging that the Stoics anticipated the central insight of CT, Beck has very little to say about the philosophical underpinnings of CT, content, it would seem, for it to be an empirically grounded system of psychological principles and therapeutic methods. Yet even this little ...
Starting from a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then analyze it in the setting of coherence. We give a representation of the conjoined conditional and show that this new object is a conditional random quantity whose set of possible values normally contains the probabilities assessed for the two conditional events. We examine some cases of logical dependency, where the conjunction is a conditional event; moreover, we give the lower and upper bounds on the conjunction. We also examine an apparent paradox concerning stochastic independence, which can actually be explained in terms of uncorrelation. We briefly introduce the notions of disjunction and iterated conditioning and show that the usual probabilistic properties still hold.
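To illustrate the kind of lower and upper bounds at stake, here is a minimal numerical sketch (not the authors' code), assuming that the coherent prevision z of the conjunction of two conditional events assessed with probabilities x and y is constrained by the Fréchet-Hoeffding bounds, max(x + y - 1, 0) <= z <= min(x, y):

```python
def frechet_bounds(x, y):
    """Fréchet-Hoeffding bounds for the prevision z of a conjunction of two
    conditional events assessed with probabilities x and y. Any coherent
    assessment must satisfy lower <= z <= upper."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("probabilities must lie in [0, 1]")
    lower = max(x + y - 1.0, 0.0)
    upper = min(x, y)
    return lower, upper

# With P(A|H) = 0.75 and P(B|K) = 0.5, coherence constrains the
# prevision of the conjunction to the interval [0.25, 0.5]:
lo, hi = frechet_bounds(0.75, 0.5)
assert (lo, hi) == (0.25, 0.5)
```

The point of treating the conjunction as a conditional random quantity rather than an event is visible here: z need not equal any product of x and y; coherence only pins it to an interval.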
Two problems related to the biological identity of living beings are addressed: the who-problem (which biological properties make a living being unique and different from the others?) and the persistence-problem (what does it take for a living being to persist from one time to another?). They are discussed within a molecular-biology framework, which shows how epigenetics can provide good ground for plausible answers. That is, we propose an empirical solution to the who-problem and the persistence-problem on the basis of the new perspectives opened by a molecular understanding of epigenetic processes. In particular, concerning the former, we argue that any living being is the result of the epigenetic processes that have regulated the expression of its genome; concerning the latter, we defend the idea that the criterion for the persistence of its identity is to be found in the continuity of those epigenetic processes. We also address possible objections, in particular: (1) whether our approach has something to say at a metaphysical level; (2) how it could account for the passage from the two phenotypes of the parental gametes to the single phenotype of the zygote; (3) how it could account for the identity of derivatives of one living being that continue to live disjoined from that original living being; and (4) how it could account for higher mental functions.
In one of his last writings, Life: Experience and Science, Michel Foucault argued that twentieth-century French philosophy could be read as dividing itself into two divergent lines: on the one hand, we have a philosophical stream which takes individual experience as its point of departure, conceiving it as irreducible to science. On the other hand, we have an analysis of knowledge which takes into account the concrete productions of the mind, as found in science and human practices. In order to account for this division, Foucault opposed epistemologists such as Cavaillès and Canguilhem to phenomenologists such as Merleau-Ponty and Sartre but also, and more particularly, he opposed Poincaré to Bergson. The latter was presented by Foucault as the key figure of the "philosophy of experience" at the beginning of the twentieth century. Fifteen years later, in his Deleuze and in the Logics of Worlds, Alain Badiou again uses this dual structure in his interpretation of the past hundred years of French thought. He employs a series of oppositional couples: himself and Deleuze, Lautmann and Sartre, and, finally, Brunschvicg and Bergson. On the one hand a "mathematical Platonism" and on the other a "philosophy of vital interiority." This Manichean reading of philosophy, and the strategic use of the figure of Bergson, has itself a long tradition. It was also proposed by Althusser who, following Bachelard, opposed his standpoint to any form of "empiricism." Althusser developed his thought from a tradition of Marxist thinkers and ideologists, which included Politzer's and Nizan's critique of bourgeois philosophy and, even before that, neo-Kantians such as the philosophers of the Revue de métaphysique et de morale. The aim of this essay is to deconstruct, and to put into its precise context of production, this series of genealogies which entails the mobilization of Bergsonism and of the name "Bergson."
By doing so, I hope to weigh the importance of Bergsonism in twentieth-century French philosophy, in both its "positive" and its "negative" aspects. The essay will proceed regressively, taking into account figures such as Althusser, Badiou, Deleuze, Foucault, Canguilhem, Cavaillès, Sartre, Merleau-Ponty, but also Politzer, Brunschvicg and Alain. The conclusion of the essay is an attempt to read the "Bergson renaissance" in the light of new discoveries in genetics and the cognitive sciences, and to tie it to the renewal of studies in the history of French philosophy.
The value of the resting electroencephalogram (EEG) in revealing neural constituents of consciousness (NCC) was examined. We quantified the dynamic repertoire, duration and oscillatory type of EEG microstates during eyes-closed rest in relation to the degree of expression of clinical self-consciousness. For the NCC, a model was suggested that contrasted the normal state of consciousness, a severely disturbed state of consciousness, and a state without consciousness. Patients with disorders of consciousness were studied. Results suggested that the repertoire, duration and oscillatory type of EEG microstates in the resting condition were quantitatively related to the level of consciousness expression in brain-damaged patients and healthy conscious subjects. Specifically, results demonstrated that (a) a decreased number of EEG microstate types was associated with altered states of consciousness, (b) unawareness was associated with a lack of diversity in EEG alpha-rhythmic microstates, and (c) the probability of occurrence and duration of delta-, theta- and slow-alpha-rhythmic microstates were associated with unawareness, whereas the probability of occurrence and duration of fast-alpha-rhythmic microstates were associated with consciousness. In conclusion, resting EEG has potential value in revealing the NCC. This work may have implications for clinical care and medical-legal decisions in patients with disorders of consciousness.
There has recently been a considerable amount of research into the influence of eighteenth-century British philosophy, particularly the thinking of David Hume, on Continental philosophy and on Kant. The aim of this collection is to provide some of the key texts which illustrate the impact of Kant's thought, together with two important twentieth-century monographs on aspects of Kant's early reception and his influence on philosophical thought. Contents: Immanuel Kant in England 1793-1838, René Wellek, 328 pp; The Early Reception of Kant's Thought in England 1785-1805, Giuseppe Micheli, 114 pp; A General and Introductory View of Professor Kant's Principles, F. A. Nitsch, 234 pp; Text-Book to Kant (with a biographical sketch), James Hutchison Stirling, 576 pp; The Development from Kant to Hegel, Andrew Seth, 178 pp; Lectures on the Philosophy of Kant, Thomas Hill Green, 155 pp; On the Philosophy of Kant, Robert Adamson, 270 pp; A Sketch of Kant's Life and Writings, H. G. Henderson, 80 pp; Inquisitio Philosophica: An Examination of the Principles of Kant and Hamilton, M. P. W. Bolton, 286 pp; Philosophy of the Unconditioned, William Hamilton, 38 pp; On the Philosophy of Kant, Henry L. Mansel, 45 pp.
Contents: Introduction. Unfounding times: the idea and ideal of ancient history in Western historical thought (Alexandra Lianeri); Part I. Theorising Western Time: Concepts and Models: 1. Time's authority (François Hartog); 2. Exemplarity and anti-exemplarity in Early Modern Europe (Peter Burke); 3. Greek philosophy and Western history: a philosophy-centred temporality (Giuseppe Cambiano); 4. Historiography and political theology: Momigliano and the end of history (Howard Caygill); Part II. Ancient History and Modern Temporalities: 5. The making of a bourgeois antiquity: Wilhelm von Humboldt and Greek history (Stefan Rebenich); 6. Modern histories of Ancient Greece: genealogies, contexts and eighteenth-century narrative historiography (Giovanna Ceserani); 7. Acquiring (a) historicity: Greek history, temporalities and eurocentrism in the Sattelzeit (Kostas Vlassopoulos); 8. Herodotus and Thucydides in the view of nineteenth-century German historians (Ulrich Muhlack); 9. Monumentality and the meaning of the past in ancient and modern historiography (Neville Morley); Part III. Unfounding Time In and Through Ancient Historical Thought: 10. Thucydides and social change: between akribeia and universality (Rosalind Thomas); 11. Historia magistra vitae in Herodotus and Thucydides? The exemplary use of the past, and ancient and modern temporalities (Jonas Grethlein); 12. Repetition and exemplarity in historical thought: ancient Rome and the ghosts of modernity (Ellen O'Gorman); 13. Time and authority in the chronicle of Sulpicius Severus (Michael Williams); Part IV. Afterword: 14. Ancient history in the eighteenth century (Oswyn Murray); 15. Seeing in and through time (John Dunn).
The importance of contextual reasoning is emphasized by various researchers in AI. (A partial list includes John McCarthy and his group, R. V. Guha, Yoav Shoham, Giuseppe Attardi and Maria Simi, and Fausto Giunchiglia and his group.) Here, we survey the problem of formalizing context and explore what is needed for an acceptable account of this abstract notion.
The philosophy of mathematics has been accused of paying insufficient attention to mathematical practice. One way to cope with the problem, the one we will follow in this paper on extensive magnitudes, is to combine the "history of ideas" and the "philosophy of models" in a logical and epistemological perspective. The history of ideas allows the reconstruction of the theory of extensive magnitudes as a theory of ordered algebraic structures; the philosophy of models allows an investigation into the way epistemology might affect relevant mathematical notions. The article takes two historical examples as a starting point for the investigation of the role of numerical models in the construction of a system of non-Archimedean magnitudes. A brief exposition of the theories developed by Giuseppe Veronese and by Rodolfo Bettazzi at the end of the nineteenth century will throw new light on the role played by magnitudes and numbers in the development of the concept of a non-Archimedean order. Different ways of introducing non-Archimedean models will be compared and the influence of epistemological models will be evaluated. Particular attention will be devoted to the comparison between the models that oriented Veronese's and Bettazzi's works and the mathematical theories they developed, but also to the analysis of the way epistemological beliefs affected the concepts of continuity and measurement.
In this work I propose an analogy between Pythagoras's theorem and the logical-formal structure of Werner Heisenberg's "relations of uncertainty." The reasons that led me to propose this analogy stem from the following observation: often, when a problem of measurement precision arises in the exact sciences, it has been resolved by resorting to squaring. It seems to me that the aporias deriving from the uncertainty principle can likewise find a solution through this stratagem. In fact, if the first classic example of the argument is the solution of the incommensurability between the legs and the hypotenuse of a right triangle, one of the most recent cases is that represented by Heisenberg's uncertainty principle.
The default mode network (DMN) is consistently activated across a wide variety of self-related tasks, leading to the proposal that the DMN plays a role in self-related processing. Indeed, there is limited fMRI evidence that functional connectivity within the DMN may underlie the phenomenon referred to as self-awareness. At the same time, none of the known studies has explicitly investigated neuronal functional interactions among the brain areas that comprise the DMN as a function of loss of self-consciousness. To fill this gap, EEG operational synchrony analysis was performed in patients with severe brain injuries in vegetative and minimally conscious states to study the strength of DMN operational synchrony as a function of self-consciousness expression. We demonstrated that the strength of DMN EEG operational synchrony was smallest or even absent in patients in a vegetative state, intermediate in patients in a minimally conscious state, and highest in healthy, fully self-conscious subjects. Conversely, the decoupling of operations performed by the neuronal assemblies that comprise the DMN was highest in patients in a vegetative state, intermediate in patients in a minimally conscious state, and minimal in healthy, fully self-conscious subjects. The DMN's frontal EEG operational module showed the strongest decrease in operational synchrony strength as a function of self-consciousness loss, when compared with the DMN's posterior modules. Based on these results, it is suggested that the strength of DMN functional connectivity could mediate the strength of self-consciousness expression. The observed alterations occurred similarly across EEG alpha, beta1 and beta2 frequency oscillations. These results suggest that EEG operational synchrony within the DMN may provide an objective and accurate measure for assessing signs of self-(un)consciousness in these challenging patient populations. This method, therefore, may complement current diagnostic procedures for patients with severe brain injuries and, hence, the planning of a rational rehabilitation intervention.