This work stakes out the landscape for dynamicism on the basis of a radical dismissal of the information-processing paradigm that dominates the philosophy of cognitive science. In Section 2, after setting up the basic toolkit of a theory of minimal representationalism, I introduce the central tenets of dynamic systems theory (DST) by discussing recent research on the dynamics of embodiment (Thelen et al. [2001]) in the perseverative-reaching literature. I then review a recent proposal on the dynamics of representation, the dynamic field approach (Spencer and Schöner [2003]), according to which the alleged representational gap between DST and representational theories of cognition needs to be bridged in order to explain higher-order cognitive activity. In Section 3 I shall argue that Spencer and Schöner's attempt to bridge the representational gap may jeopardize the whole (antirepresentationalist) spirit of the DST project. To show why, I shall introduce the key concepts of "reliability of environment" and "primagenesis", and argue that DST can account for de-coupled, offline cognitive activity with no need to posit representational resources. Conclusions and directions for future research follow.
According to Ramsey (Representation Reconsidered, Cambridge University Press, New York, 2007), only classical cognitive science, with the related notions of input–output and structural representations, meets the job description challenge (the challenge to show that a certain structure or process serves a representational role at the subpersonal level). By contrast, connectionism and other nonclassical models, insofar as they exploit receptor and tacit notions of representation, are not genuinely representational. As a result, Ramsey submits, cognitive science is taking a U-turn from representationalism back to behaviourism, thus presupposing (1) that the emergence of cognitivism capitalized on the concept of representation, and (2) that the materialization of nonclassical cognitive science involves a return to some form of pre-cognitivist behaviourism. We argue against both (1) and (2) by questioning Ramsey's divide between classical and representational, versus nonclassical and nonrepresentational, cognitive models. For, firstly, connectionist and other nonclassical accounts have the resources to exploit the notion of a structural isomorphism, like classical accounts (the beefing-up strategy); and, secondly, insofar as input–output and structural representations refer to a cognitive agent, classical explanations fail to meet the job description challenge (the deflationary strategy). The two strategies work independently of each other: if the deflationary strategy succeeds, contra (1), cognitivism has failed to capitalize on the relevant concept of representation; if the beefing-up strategy is sound, contra (2), the return to a pre-cognitivist era is cancelled out.
In this paper I shall reply to two arguments that Stephen Stich (1990; 1991; 1996) has put forward against the thesis of eliminative materialism. In a nutshell, Stich argues (i) that the thesis of eliminative materialism, according to which propositional attitudes do not exist, is neither true nor false, and (ii) that even if it were true, it would be philosophically uninteresting. To support (i) and (ii), Stich relies on two premises: (a) that the job of a theory of reference is to make explicit the tacit theory of reference which underlies our intuitions about the notion of reference itself; and (b) that our intuitive notion of reference is a highly idiosyncratic one. In this paper I shall address Stich's anti-eliminativist claims (i) and (ii), and argue that even if we agreed with premises (a) and (b), that would lend no support whatsoever to (i) and (ii).
This paper consists of two parts. First, I shall consider two defences of Quine's polemical Thesis of the Inscrutability of Reference put forward by Hookway and Calvo Garzón, respectively. Then, I shall consider an extension of Quine's succinct behavioural criteria of Radical Translation suggested by Hintikka's Game-Theoretical Semantics. I shall argue that Hintikka's semantics suggests behavioural criteria which we can use to constrain perverse semantic theories. In particular, I shall try to show that whilst Hintikka's behavioural data tell against Hookway's proposal, they reveal, nonetheless, a reason why my proposed perverse semantic theory enjoys the same privileged status that a standard semantic theory is supposed to enjoy.
Crispin Wright has reshaped debates about realism by offering a new landscape of what is at stake in the discussions between realists and their opponents. Instead of arguing over whether a given discourse can be truth-apt, discussion should focus, Wright contends, on what kind of truth predicate a discourse can enjoy; namely, whether truth for a discourse can be 'robust' or merely 'minimal'. Wright's approach has important implications for Quine's well-known Thesis of the Inscrutability of Reference. The bulk of this paper will be devoted to showing that an argument involving minimalism about truth which Wright offers against the Inscrutability Thesis fails by reductio. By the end of the paper, we shall see how Wright's proposed frame of discussion for realism bears on the metaphysical status of semantic theories.
Pothos's revision of rules and similarity in the area of language reinforces the impression that the classicist/connectionist debate is in a blind alley. Under his continuum proposal, both hypotheses fall neatly within the information-processing paradigm. In my view, the paradigm shift that dynamic systems theory represents (Spencer & Thelen 2003) should be submitted to critical scrutiny. Specific formalizations of the Rules versus Similarity distinction may not lead to a form of unification under Generalized Context Models or connectionist networks.
I argue that a dynamical framing of the emulation theory of representation may be at odds with its articulation in Grush's information-processing terms. An architectural constraint implicit in the emulation theory may have consequences not envisaged in the target article. In my view, “how the emulator manages to implement the forward mapping” is pivotal with regard to whether we have an emulation theory of representation, as opposed to an emulation theory of (mere) applied forces.
Gareth Evans produced a powerful line of argument against Quine's well-known Thesis of the Inscrutability of Reference. In one part of his attack, Evans argued that, under certain conditions, structural simplicity may become truth-conducive for semantic theories. Being structurally more complex than the standard semantic theory, perverse semantic theories à la Quine are an easy prey for Evans' considerations. The bulk of the paper will be devoted to addressing Evans' criticism. By reviewing the classical/connectionist debate in cognitive science between a hypothetical sympathizer of "cognitive orthodoxy" and the friend of connectionism, I shall contend that the Quinean has nothing to fear from a classical reading of Evans' considerations.
Jerry Fodor and Ernest Lepore [(1992) Holism: A Shopper's Guide, Oxford: Blackwell; (1996) in R. McCauley (Ed.) The Churchlands and Their Critics, Cambridge: Blackwell] have launched a powerful attack against Paul Churchland's connectionist theory of semantics, also known as state space semantics. In one part of their attack, Fodor and Lepore argue that the architectural and functional idiosyncrasies of connectionist networks preclude us from articulating a notion of conceptual similarity applicable to state space semantics. Aarre Laakso and Gary Cottrell [(1998) in M. A. Gernsbacher & S. Derry (Eds) Proceedings of the 20th Annual Conference of the Cognitive Science Society, Mahwah, NJ: Erlbaum; Philosophical Psychology, 13, 47-76] have run a number of simulations on simple feedforward networks and applied a mathematical technique for measuring conceptual similarity in the representational spaces of those networks. Laakso and Cottrell contend that their results decisively refute Fodor and Lepore's criticisms. Paul Churchland [(1998) Journal of Philosophy, 95, 5-32] goes further: he uses Laakso and Cottrell's neurosimulations to argue that connectionism does furnish us with all we need to construct a robust theory of semantics and a robust theory of translation. In this paper I shall argue that whereas Laakso and Cottrell's neurocomputational results may provide us with a rebuttal of Fodor and Lepore's argument, Churchland's conclusion is far too optimistic. In particular, I shall try to show that connectionist modelling does not provide any objective criterion for achieving a one-to-one accurate translational mapping across networks.
In this commentary I explore nonclassical connectionism (NCC) as a coherent framework for evaluation in the spirit of the Newell Test. Focusing on knowledge integration, development, real-time performance, and flexible behavior, I argue that NCC's "within-theory rank ordering" would place subsymbolic modeling in a better position. Failure to adopt a symbolic level of thought cannot be interpreted as a weakness.
According to Carruthers, ants and bees have minds. This claim is to be understood realistically: we do not interpret the overt behaviour of ants and bees by ascribing beliefs and desires to them in an instrumental manner; rather, they possess minds in the relevant cognitive sense. In this paper, I propose to pave the way for a reductio against such a polemical view. In particular, I shall argue that if ants and bees have minds then, by the same token, plants have minds too. In my view, the problem has to do with Carruthers' underlying technical concept of cognitive architecture; a concept which, as I shall argue, can be called into question on both empirical and conceptual grounds.
In this research work a novel two-step system for anomaly detection is presented and tested on several real datasets. In the first step, the novel Exploratory Projection Pursuit algorithm, Beta Hebbian Learning, is applied to each dataset, either to reduce the dimensionality of the original dataset or to handle nonlinear datasets by generating a new subspace of the original dataset with lower, or even higher, dimensionality, selecting the right activation function. In the second step, Principal Component Analysis anomaly detection is applied to the new subspace to detect the anomalies and improve the classification capabilities. The new approach has been tested on several real datasets that differ in number of variables, number of samples and number of anomalies. In almost all cases, the novel approach obtained better results in terms of area under the curve, with similar standard deviation values. In terms of computational cost, the improvement is only remarkable when the complexity of the dataset, measured by the number of variables, is high.
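As an illustration, the following is a minimal sketch of such a two-step pipeline. Beta Hebbian Learning is not available in standard libraries, so FastICA stands in for the exploratory projection step; the synthetic dataset, component counts and threshold are illustrative assumptions, not the paper's settings.

```python
# Two-step sketch: project the data into a new subspace, then score
# anomalies by PCA reconstruction error. FastICA is a stand-in for BHL.
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))          # stand-in for a real dataset
X[:5] += 6.0                            # inject a few artificial anomalies

# Step 1: exploratory projection into a new subspace.
proj = FastICA(n_components=5, random_state=0)
Z = proj.fit_transform(X)

# Step 2: PCA-based anomaly detection via reconstruction error.
pca = PCA(n_components=3).fit(Z)
Z_hat = pca.inverse_transform(pca.transform(Z))
scores = np.linalg.norm(Z - Z_hat, axis=1)

threshold = np.percentile(scores, 99)   # flag the top 1% as anomalies
print(np.where(scores > threshold)[0])
```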
In this study, a hybrid model based on intelligent techniques is developed to predict the active power generated in a bioclimatic house by a low power wind turbine. Unlike other studies that predict the generated power from both the speed and the direction of the wind, the model developed in this paper uses only the wind speed, measured mainly at a weather station of the government meteorological agency. The wind speed is measured at different heights, in contrast with the usual approach, which uses the wind speed and direction measured by a weather station on the wind turbine nacelle. The prediction is performed 30 minutes ahead, which ensures that the Building Management System knows the energy the low power wind turbine will generate 30 minutes in advance, so it can adapt the consumption of the different equipment in the house to optimize power use. The main objective is to allow the Building Management System to optimize the use of energy, taking into account the predicted amount of energy that will be produced and the energy consumed in the house. The developed model uses a hybrid topology with four clusters to improve the prediction, achieving a Mean Absolute Error lower than 6.5% in a final test. To perform this test, part of the original dataset was isolated from the beginning of the training process so the model could be checked against data it had never seen, simulating the model receiving new data.
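A minimal sketch of the cluster-then-regress idea described above, under stated assumptions: KMeans plus one small neural regressor per cluster stand in for the paper's hybrid topology, and the wind-speed and power data are synthetic toy values, not the real measurements.

```python
# Hybrid topology sketch: cluster the operating conditions, then fit
# one regressor per cluster and route new samples accordingly.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
speeds = rng.uniform(0, 25, size=(1000, 3))   # wind speed at three heights
power = (speeds.mean(axis=1) ** 3) / 100      # toy power curve, 30 min ahead

clusters = KMeans(n_clusters=4, n_init=10, random_state=1).fit(speeds)
models = {}
for k in range(4):
    mask = clusters.labels_ == k
    models[k] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                             random_state=1).fit(speeds[mask], power[mask])

# Prediction: route a new sample to its cluster's regressor.
x_new = np.array([[8.0, 9.5, 11.0]])
k = clusters.predict(x_new)[0]
print(models[k].predict(x_new))
```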
We explore the intimate connection between spacetime geometry and electrodynamics. This link is already implicit in the constitutive relations between the field strengths and excitations, which are an essential part of the axiomatic structure of electromagnetism, clearly formulated via integration theory and differential forms. We review the foundations of classical electromagnetism based on charge and magnetic flux conservation, the Lorentz force and the constitutive relations. These relations introduce the conformal part of the metric and allow the study of electrodynamics for specific spacetime geometries. At the foundational level, we discuss the possibility of generalizing the vacuum constitutive relations, by relaxing the fixed conditions of homogeneity and isotropy, and by assuming that the symmetry properties of the electro-vacuum follow the spacetime isometries. The implications of this extension are briefly discussed in the context of the intimate connection between electromagnetism and the geometry of spacetime.
Contemporary literature distinguishes two ways to defend the claim that cognition is a matter of representations: one, cognition involves representation-hungry tasks; two, cognition involves a complex form of informational covariation between subcomponents of a system with an adaptive function. Each of these conceptions involves a different notion of representation and promotes a particular view of the architecture of cognition. Despite the differences, each of them aims to ground the claim that cognition is a matter of representations in architectural constraints. The objective of this article is twofold: first, it is argued that architectural constraints do not entail either of those two ways to defend the claim that cognition is a matter of representations; second, it is claimed that both notions of representation share an objectionable common element, namely the idea of a model that grounds the representational reading, which must be abandoned, in favor of a more economical explanation in terms of causal relations, in order to get a clear view of cognition.
Nowadays, the quality standards of higher education institutions pay special attention to the performance and evaluation of students. Hence, having a complete academic record for each student (number of attempts, average grade, and so on) plays a key role. In this context, the existence of missing data, which can arise for different reasons, adversely affects future analyses. The use of imputation techniques is therefore presented as a helpful tool to estimate the value of missing data. This work applies imputation techniques to the academic records of engineering students. More specifically, the performance of the multivariate imputation by chained equations (MICE) methodology, the adaptive assignation algorithm (AAA) based on multivariate adaptive regression splines, and a hybridization based on self-organising maps with Mahalanobis distances and the AAA algorithm are assessed and compared. The results show that, in general terms, the proposed methods perform successfully regardless of the number of missing values.
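By way of illustration, here is a minimal sketch of chained-equations imputation on a toy academic record. scikit-learn's IterativeImputer, which is inspired by MICE, stands in for the methods compared in the paper; the column layout (attempts, average grade, credits) is a hypothetical example.

```python
# Chained-equations imputation sketch on a tiny synthetic student record.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# columns: attempts, average grade, credits passed (toy values)
records = np.array([
    [1.0, 7.5, 60.0],
    [2.0, 6.0, np.nan],
    [np.nan, 8.2, 54.0],
    [3.0, np.nan, 48.0],
])

imputer = IterativeImputer(max_iter=10, random_state=0)
print(imputer.fit_transform(records))   # missing entries replaced by estimates
```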
This paper examines the resemantizations of the Venus myth in three Latin American poets, Julián del Casal, Rubén Darío and José Lezama Lima, taking into account intertextuality and their corresponding poetics.
This article seeks a rereading of the texts and problems in the work of the French philosopher Louis Althusser. The rereading follows two argumentative lines: (1) his aleatory materialism, and (2) his theory of language.
Aberrations introduced by atmospheric turbulence in large telescopes are compensated using adaptive optics systems, where the use of deformable mirrors and multiple sensors relies on complex control systems. Recently, the development of larger telescopes such as the E-ELT or TMT has created a computational challenge due to the increasing complexity of the new adaptive optics systems. The Complex Atmospheric Reconstructor based on Machine Learning (CARMEN) is an algorithm based on artificial neural networks, designed to compensate for atmospheric turbulence. In recent years, the use of GPUs has proved to be a great solution for speeding up the learning process of neural networks, and different frameworks have been created to ease their development. This paper presents the implementation of CARMEN in different multi-GPU frameworks, along with its development in a language originally designed for GPUs, such as CUDA. This implementation offers the best response in all the cases presented, although the advantage of using more than one GPU appears only in large networks.
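As a small illustration of the kind of GPU speed-up at stake, here is a minimal sketch of moving a neural network's training loop onto a GPU with PyTorch. CARMEN itself, its architecture and its multi-GPU setup are not reproduced; the layer sizes and data are placeholder assumptions.

```python
# Minimal GPU training-loop sketch: the same code runs on CPU or GPU,
# with the device selected at runtime.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
net = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64)).to(device)
opt = torch.optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(256, 64, device=device)   # stand-in sensor measurements
y = torch.randn(256, 64, device=device)   # stand-in reconstruction targets

for _ in range(100):                      # basic training loop
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()
print(loss.item())
```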
Cardinal Francisco de Solís Folch de Cardona died in Rome on 12 April 1775. After a long courtly and ecclesiastical career, he lived a comfortable life; his financial excesses, his lifestyle and his attention to the needy were the most salient features of his personality as a prelate. As in Rome, Seville celebrated solemn exequies in his memory, and a year later his heart was deposited in the Capuchin convent of Santa Rosalía, as he had wished. Before his last journey to Italy, in 1766, he signed a document setting out his last will, in which he named the dean and chapter of the cathedral of Seville as his universal heir and appointed all his executors, provoking among them a series of conflicts that proved difficult to resolve. The article concludes with an annex containing the transcription of a copy of the will of Cardinal Francisco de Solís Folch y Cardona, requested by his brother and executor, Alonso de Solís Folch de Cardona, fourth Duke of Montellano.
The use of batteries has become essential in our daily life: in electronic devices, electric vehicles and energy storage systems in general. As they play a key role in many devices, their design and implementation must follow a thorough test process that checks their features at different operating points. In this setting, the appearance of any kind of deviation from the expected operation must be detected. This research deals with real data registered during the testing phase of a lithium iron phosphate (LiFePO4) battery. The process is divided into four different working points, alternating charging, discharging and resting periods. This work proposes a hybrid classifier, based on one-class techniques, whose aim is to detect anomalous situations during the battery test. The faults are created by modifying the measured cell temperature by a slight ratio from its real value. A detailed analysis of the performance of each technique is presented. The average performance of the chosen classifier is successful.
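The following is a minimal sketch of the one-class idea applied to one working point: fit on normal cell temperatures, then flag readings deviated by a small ratio. The data, the 5% deviation ratio and the classifier hyperparameters are illustrative assumptions, not the paper's setup.

```python
# One-class anomaly detection sketch for a single battery working point.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
temps_normal = 25 + rng.normal(0, 0.5, size=(300, 1))  # normal charging phase

clf = OneClassSVM(nu=0.01, gamma="scale").fit(temps_normal)

temps_faulty = temps_normal[:10] * 1.05   # slight-ratio fault, as in the paper
print(clf.predict(temps_faulty))          # -1 marks anomalous samples
```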
Automatic control of physiological variables is one of the most active areas in biomedical engineering. This paper centres on predicting the evolution of analgesic variables in patients undergoing surgery. The proposal is based on the use of hybrid intelligent modelling methods. The study considers the Analgesia Nociception Index (ANI) to assess the patient's pain and remifentanil as the intravenous analgesic. The proposed model is able to make a one-step-ahead prediction of the remifentanil dose corresponding to the current state of the patient. The input information is the previous remifentanil dose, the ANI variable and the electromyogram signal. The modelling techniques used are Artificial Neural Networks and Support Vector Machines for Regression, combined with clustering methods. Both training and validation were done with a real dataset from different patients. The results obtained show the potential of this methodology to calculate the drug dose corresponding to a given analgesic state of the patient.
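A minimal sketch of the one-step-ahead regression described above, under stated assumptions: a single SVR stands in for the hybrid ANN/SVR-plus-clustering model, and the feature ranges and target relation are synthetic placeholders rather than clinical data.

```python
# One-step-ahead dose regression sketch: predict the next remifentanil
# dose from [previous dose, ANI, EMG].
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(0, 2, 500),     # previous dose
                     rng.uniform(0, 100, 500),   # ANI value
                     rng.uniform(0, 1, 500)])    # EMG signal
y = 0.5 * X[:, 0] + 0.01 * (50 - X[:, 1]) + rng.normal(0, 0.05, 500)

model = make_pipeline(StandardScaler(), SVR(C=1.0, epsilon=0.01)).fit(X, y)
print(model.predict(X[:3]))                      # next-step dose estimates
```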
The Schwarzschild solution has played a fundamental conceptual role in general relativity, and beyond, for instance regarding event horizons, spacetime singularities and aspects of quantum field theory in curved spacetimes. However, one still encounters misconceptions and a certain ambiguity regarding the Schwarzschild solution in the literature. By taking into account the point of view of an observer in the interior of the event horizon, one verifies that new conceptual difficulties arise. In this work, besides providing a very brief pedagogical review, we further analyze the interior Schwarzschild black hole solution. Firstly, by deducing the interior metric from time-dependent metric coefficients, the interior region is analyzed without the prejudices inherited from the exterior geometry. We also pay close attention to several cosmological interpretations, and briefly address some of the difficulties associated with spacetime singularities. Secondly, we deduce the conserved quantities of null and timelike geodesics, and discuss several particular cases in some detail. Thirdly, we examine the Eddington–Finkelstein and Kruskal coordinates directly from the interior solution. In concluding, it is important to emphasize that the interior structure of realistic black holes has not been satisfactorily determined, and is still open to considerable debate.
Closed-loop administration of propofol for the control of hypnosis in anesthesia has been shown to outperform manual administration in terms of drug consumption and post-operative recovery of patients. Unlike other systems, the success of this strategy lies in the availability of a feedback variable capable of quantifying the current hypnotic state of the patient. However, the appearance of anomalies during the anesthetic process may result in inaccurate actions of the automatic controller. These anomalies may come from the monitors, the syringe pumps, the actions of the surgeon or even from alterations in patients. They could produce adverse side effects that affect the patient's postoperative recovery and reduce patient safety in the operating room. Hence, anomaly detection techniques play a significant role in avoiding these undesirable situations. This work assesses different one-class intelligent techniques for detecting anomalies in patients undergoing general anesthesia. Due to the difficulty of obtaining real data from anomalous situations, artificial outliers are generated to check the performance of each classifier. The final model performs successfully.
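In the spirit of the validation scheme described above, here is a minimal sketch of checking a one-class classifier against artificial outliers. IsolationForest stands in for the techniques compared in the paper, and the signals and outlier ranges are synthetic assumptions.

```python
# Validate a one-class classifier with artificially generated outliers:
# fit on normal data, generate points outside its envelope, and measure
# the detection rate.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(4)
normal = rng.normal(0, 1, size=(500, 4))      # stand-in anesthesia signals
outliers = rng.uniform(4, 6, size=(25, 4))    # artificial anomalies

clf = IsolationForest(contamination=0.01, random_state=4).fit(normal)
pred = clf.predict(outliers)                  # -1 = detected anomaly
print((pred == -1).mean())                    # detection rate on outliers
```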
Student performance and its evaluation remain a serious challenge for education systems. Frequently, the recording and processing of students' scores in a specific curriculum have several flaws, for various reasons. In this context, the absence of some of the student scores undermines the efficiency of any future analysis carried out to reach conclusions. When this is the case, missing data imputation algorithms are needed. These algorithms are capable of substituting the missing data with predicted values with a high level of accuracy. This research presents the hybridization of an algorithm previously proposed by the authors, called the adaptive assignation algorithm, with the well-known multivariate imputation by chained equations technique. The results show how the suggested methodology outperforms both algorithms.
Nowadays, batteries play an important role in many different applications, such as energy storage, electro-mobility, consumer electronics and so on. All battery types share a common factor: their complexity, regardless of their nature, which is usually electrochemical. Several different tests are performed to check battery performance, and their behaviour is commonly predictable depending on the technology. The present research describes a hybrid intelligent system created to perform fault detection on a Lithium Iron Phosphate (LiFePO4) power cell, a type commonly used in electro-mobility applications. The approach is based on the behaviour of the cell temperatures for specific voltage and current values. Taking into account the operating range of a real system based on a LiFePO4 cell, a large set of operating points has been used to build the dataset. As a first step, the different behaviour zones have been obtained by clustering. Then, different regression techniques have been applied to each cluster: polynomial regression, artificial neural networks and support vector regression were the techniques combined to develop the proposed hybrid intelligent model. The intelligent system gives very good results over the operating range, detecting all the faults tested during validation.
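The following minimal sketch illustrates the hybrid scheme under stated assumptions: clustering of the voltage/current operating points, a polynomial regression of cell temperature per cluster, and a fault flag when the residual exceeds a tolerance. The toy temperature model, cluster count and tolerance are placeholders, not the paper's values.

```python
# Hybrid fault-detection sketch: cluster operating points, fit one
# polynomial temperature model per cluster, flag large residuals.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
VI = np.column_stack([rng.uniform(2.8, 3.6, 600),    # cell voltage (V)
                      rng.uniform(-10, 10, 600)])    # cell current (A)
temp = 25 + 0.05 * VI[:, 1] ** 2 + rng.normal(0, 0.2, 600)  # toy model

km = KMeans(n_clusters=3, n_init=10, random_state=5).fit(VI)
models = {k: make_pipeline(PolynomialFeatures(2),
                           LinearRegression()).fit(VI[km.labels_ == k],
                                                   temp[km.labels_ == k])
          for k in range(3)}

def is_fault(vi, measured_temp, tol=1.0):
    """Flag a fault when the measured temperature deviates from the model."""
    k = km.predict(vi.reshape(1, -1))[0]
    return abs(models[k].predict(vi.reshape(1, -1))[0] - measured_temp) > tol

print(is_fault(np.array([3.2, 5.0]), 40.0))   # clearly deviated reading
```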
This article explores the belonging of books M and N to the general program of Aristotle's Metaphysics. Books XIII and XIV have remained in the background of the Metaphysics, as a kind of editorial aggregate which can be dispensed with for the understanding of the Aristotelian proposal. This article, however, assumes a different starting point, which consists in integrating these books into the core of the proposal of the Metaphysics, understanding that they are key to completing the panorama of the general theory of substance, or ousiology, which guides all the metaphysical books. Thus, the discussion of numbers and principles in M and N concludes the question of substance as cause and principle, announced from the beginning of the Metaphysics and developed in its central books.