In La structure de l'apparence, Nelson Goodman sets out the principal philosophical themes that would make him a singular thinker: constructivism, nominalism, phenomenalism, and pluralism intersect in the elaboration of a body of thought as subtle as it is complex. The book offers a first French translation of a founding text of analytic philosophy.
What is fair? How and when can punishment be legitimate? Is there recompense for human suffering? How can we understand ideas about immortality or an afterlife in the context of critical thinking on the human condition? In this book L. E. Goodman presents the first general theory of justice in this century to make systematic use of the Jewish sources and to bring them into a philosophical dialogue with the leading ethical and political texts of the Western tradition. Goodman takes an ontological approach to questions of natural and human justice, developing a theory of community and of nonvindictive yet retributive punishment that is grounded in careful analysis of various Jewish sources: biblical, rabbinic, and philosophical. His exegesis of these sources allows Plato, Kant, and Rawls to join in a discourse with Spinoza and medieval rationalists, such as Saadiah and Maimonides, who speak in a very different idiom but address many of the same themes. Drawing on sources old and new, Jewish and non-Jewish, Goodman offers fresh perspectives on important moral and theological issues that will be of interest to both Jewish and secular philosophers.
If Bayesian Fundamentalism existed, Jones & Love's (J&L's) arguments would provide a necessary corrective. But it does not. Bayesian cognitive science is deeply concerned with characterizing algorithms and representations, and, ultimately, implementations in neural circuits; it pays close attention to environmental structure and the constraints of behavioral data, when available; and it rigorously compares multiple models, both within and across papers. J&L's recommendation of Bayesian Enlightenment corresponds to past, present, and, we hope, future practice in Bayesian cognitive science.
Tracing the course of thought, action, and expression in the golden age of Islamic civilization, L. E. Goodman's Islamic Humanism paints a vivid panorama that departs strikingly from the all too familiar image of Islamic dogma, authoritarianism, and militancy. Among the poets and philosophers, scientists and historians, ethicists and mystics of Islam, Goodman finds a warm and vital humanism, committed to the pursuit of knowledge and to the cosmopolitan values of generosity, tolerance, and understanding. Drawing on a wide range of writings, from love poetry to pietism, to satire, to history and metaphysics, and on to hunting, music and the dance, clothing, politics, and the marketplace, Goodman discloses the rich texture of classical Islamic civilization: its distinctive problematics and the space it left for the talents and creativity of the individual. His philosophic openness and easy familiarity place Islamic humanism securely in its larger context, revealing clearly what is of universal and abiding vitality and interest. In place of stereotypes, suspicions, and unease, Goodman sets out concrete and detailed expositions and explorations of Islamic thought and experience as seen through the eyes of the participants themselves. His engaged and sympathetic readings penetrate beneath the surface of the ancient texts to the humanistic values embraced by some of the greatest thinkers of Islam. As a result, Islamic Humanism does much more than remind us how much we owe to the intellectual achievements of Islamic civilization. The work is a significant contribution to Western understanding of Islam and to Islamic self-understanding of the profoundly humanistic dimensions of the Islamic tradition.
A Probabilistic Information Processing System (PIP) uses men and machines in a novel way to perform diagnostic information processing. Men estimate likelihood ratios for each datum and each pair of hypotheses under consideration, or a sufficient subset of these pairs. A computer aggregates these estimates by means of Bayes' theorem of probability theory into a posterior distribution that reflects the impact of all available data on all hypotheses being considered. Such a system circumvents human conservatism in information processing, the inability of men to aggregate information in such a way as to modify their opinions as much as the available data justify. It also fragments the job of evaluating diagnostic information into small separable tasks. The posterior distributions that are a PIP's output may be used as a guide to human decision making or may be combined with a payoff matrix to make decisions by means of the principle of maximizing expected value. A large simulation-type experiment compared a PIP with three other information processing systems in a simulated strategic war setting of the 1970s. The difference between PIP and its competitors was that in PIP the information was aggregated by computer, while in the other three systems, the operators aggregated the information in their heads. PIP processed the information dramatically more efficiently than did any competitor. Data that would lead PIP to give 99:1 odds in favor of a hypothesis led the next best system to give 4½:1 odds.
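The aggregation step described in the abstract is the odds form of Bayes' theorem: posterior odds equal prior odds multiplied by the likelihood ratio of each conditionally independent datum. A minimal sketch, with hypothetical numbers rather than data from the experiment described above:

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each conditionally independent likelihood ratio,
    as a PIP's computer would when aggregating human-estimated ratios."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Three data, each judged twice as likely under hypothesis H1 as under H2,
# starting from even (1:1) prior odds:
odds = posterior_odds(1.0, [2.0, 2.0, 2.0])  # 8.0, i.e. 8:1 in favor of H1
p_h1 = odds / (1.0 + odds)                   # odds converted to a probability
```

Human "conservatism," in these terms, means reporting final odds far closer to 1:1 than the product of the ratios warrants; the machine applies the multiplication without attenuation.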
In many learning or inference tasks human behavior approximates that of a Bayesian ideal observer, suggesting that, at some level, cognition can be described as Bayesian inference. However, a number of findings have highlighted an intriguing mismatch between human behavior and standard assumptions about optimality: People often appear to make decisions based on just one or a few samples from the appropriate posterior probability distribution, rather than using the full distribution. Although sampling-based approximations are a common way to implement Bayesian inference, the very limited numbers of samples often used by humans seem insufficient to approximate the required probability distributions very accurately. Here, we consider this discrepancy in the broader framework of statistical decision theory, and ask: If people are making decisions based on samples—but as samples are costly—how many samples should people use to optimize their total expected or worst-case reward over a large number of decisions? We find that under reasonable assumptions about the time costs of sampling, making many quick but locally suboptimal decisions based on very few samples may be the globally optimal strategy over long periods. These results help to reconcile a large body of work showing sampling-based or probability matching behavior with the hypothesis that human cognition can be understood in Bayesian terms, and they suggest promising future directions for studies of resource-constrained cognition.
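The trade-off at the heart of this abstract can be sketched numerically: if each decision is made by majority vote over k posterior samples, more samples raise accuracy but cost time, so the reward *rate* can peak at very small k. The parameters below (posterior probability 0.7, per-sample and per-action times) are illustrative assumptions, not values from the paper:

```python
import math

def p_correct(p, k):
    """Probability that the majority of k iid Bernoulli(p) posterior samples
    favors the correct option (ties broken by a fair coin flip)."""
    total = 0.0
    for i in range(k + 1):
        term = math.comb(k, i) * p**i * (1 - p)**(k - i)
        if 2 * i > k:
            total += term        # majority favors the correct option
        elif 2 * i == k:
            total += 0.5 * term  # tie: guess
    return total

def reward_rate(p, k, t_sample=1.0, t_action=10.0):
    """Expected reward per unit time when each decision takes k samples
    (t_sample each) plus a fixed action/overhead time t_action."""
    return p_correct(p, k) / (k * t_sample + t_action)
```

Under these assumed costs, sweeping k shows that one quick sample per decision yields a higher long-run reward rate than averaging over many, which is the qualitative point the abstract argues for.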
Marr's levels of analysis—computational, algorithmic, and implementation—have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."
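The core selection step of such a recipe can be sketched as choosing, among candidate algorithms for the same problem, the one that maximizes expected utility net of computational cost. The algorithm names, accuracies, and costs below are hypothetical placeholders, not taken from the paper:

```python
def resource_rational_choice(algorithms, cost_weight=1.0):
    """algorithms: dict mapping name -> (expected_accuracy, compute_cost).
    Returns the name maximizing accuracy minus weighted compute cost."""
    return max(
        algorithms,
        key=lambda name: algorithms[name][0] - cost_weight * algorithms[name][1],
    )

candidates = {
    "exact_inference": (0.99, 5.0),   # accurate but expensive
    "ten_samples":     (0.90, 0.5),
    "one_sample":      (0.75, 0.05),  # cheap, locally suboptimal
}
```

As the cost weight rises (time pressure, scarce resources), the rational choice shifts from exact inference toward cheap approximations, which is how the recipe derives algorithmic-level predictions from rationality assumptions.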
Drawing upon two independent national samples of 201 and 241 psychology graduate students, this article describes the development and psychometric evaluation of 4 Web-based student self-report scales tapping student socialization in the responsible conduct of research (RCR) with human participants. The Mentoring the Responsible Conduct of Research Scale (MRCR) is composed of 2 subscales assessing RCR instruction and modeling by research mentors. The 2 subscales of the RCR Department Climate Scale (RCR-DC) assess RCR department policies and faculty and student RCR practices. The RCR Preparedness scale (RCR-P) and the RCR Field Integrity scale (RCR-FI) measure, respectively, students' confidence in their ability to conduct research responsibly and their belief in the RCR integrity of psychology as a discipline. Factor analysis, coefficient alphas, correlations, and multiple regression analyses demonstrated that each of the scales had good internal consistency and concurrent and construct validity.
Islam displaces the ancient idea of time as an implacable enemy with the scriptural image of time as the stage of judgment, a narrow bridge of accountability stretched between creation and eternity. The stark contrast of temporal evanescence with the immutability of eternity challenges Muslim theologians and philosophers of the classic age. The dialectical theologians of the kalam describe time and change atomistically and even occasionalistically, seeking to preserve the absoluteness of the contrast and to avoid compromising the purity of God's creative act and the sheer facticity of its temporal effect. The falasifa, philosophers in the Greek tradition, use Platonic, Aristotelian, and Neoplatonic arguments to reconcile temporality with eternity. Stripped of argument, their emanative and archetypal schemes join the core symbolisms of Islam, but only when accommodated to the Qur'anic ideas of judgment and creation.
Of the philosophers in the West, none, perhaps, is better known by name and less familiar in the actual content of his ideas than the medieval Muslim philosopher, physician, minister, and naturalist Abu Ali Ibn Sina, known since the days of the scholastics as Avicenna. In this book the author, himself a philosopher, and long known for his studies of Arabic thought, presents a factual account of Avicenna's philosophy. Setting the thinker in the context of his often turbulent times and tracing the roots and influences of Avicenna's ideas, this book offers a factual philosophical portrait. It details Avicenna's account of being as a synthesis between the seemingly irreconcilable extremes of Aristotelian eternalism and the creationism of monotheistic scripture. It examines Avicenna's distinctive theory of knowledge, his ideas about immortality and individuality, including the famous "floating man" argument, his contributions to logic, and his probing thoughts on rhetoric and poetics.
This biography of the court scholar Xun Xu explores central areas of intellectual life in third-century China — court lyrics, music, metrology, pitch systems, archeology, and historiography. It clarifies the relevant source texts in order to reveal fierce debates. Besides solving technical puzzles about the material details of court rites, the book unfolds factional struggles that developed into scholarly ones.
The visual system is persistent, inventive, and sometimes rather perverse in building a world according to its own lights; the supplementation is deft, flexible, and often elaborate. [JL: Our eyes/consciousness can "fill in" things that are not there; they can also delete things that are there.]