As a new field, cognitivism began with the total rejection of the old, traditional views of language acquisition and of learning, individual and collective alike. Chomsky was one of the pioneers in this respect, yet he clouds issues by making excessive claims for his originality and by denying the beginner in the art of language acquisition the use of learning by forming and testing hypotheses, even though he acknowledges that researchers, himself included, use this method. The most important novelty of Chomsky's work is his idealization of the field by postulating the existence of the ideal speaker-hearer, and his suggestion that the hidden structure of sentences is revealed by studying together all sentences that are logically equivalent to one another. This is progress, but his tests of equivalence are insufficient, since they all remain within classical logic. This limitation rests on the greatest shortcoming of Chomsky's view: his idea that every sentence has one subject or subject-part, contrary to the claim of Frege and Russell that assertions involving relations (with two-place predicates) are structurally different from those involving properties (with one-place predicates). (See the Appendix below.)
Skeptical theists argue that no seemingly unjustified evil (SUE) could ever lower the probability of God's existence at all. Why? Because God might have justifying reasons for allowing such evils (JuffREs) that are undetectable. However, skeptical theists are unclear about whether God's existence is relevant to the existence of JuffREs, and about whether God's existence is relevant to their detectability. I will argue that, no matter how the skeptical theist answers these questions, the skeptical theist is wrong: SUEs lower the probability of God's existence. To establish this, I will consider the four scenarios regarding the relevance of God's existence to the existence and detectability of JuffREs, and show that in each, after we establish our initial probabilities and then update them given the evidence of a SUE, the probability of God's existence drops.
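The updating step described in this abstract is ordinary Bayesian conditionalization. As a rough illustration with hypothetical numbers (not the paper's own probabilities), evidence E lowers the posterior of a hypothesis G whenever E is less expected on G than on its denial:

```python
# Bayesian conditionalization: P(G | E) = P(E | G) * P(G) / P(E),
# where P(E) = P(E | G) * P(G) + P(E | not-G) * P(not-G).
def posterior(prior_g, p_e_given_g, p_e_given_not_g):
    p_e = p_e_given_g * prior_g + p_e_given_not_g * (1 - prior_g)
    return p_e_given_g * prior_g / p_e

# Hypothetical numbers only: suppose a seemingly unjustified evil (SUE)
# is less likely to be observed if God exists than if God does not.
prior = 0.5
post = posterior(prior, p_e_given_g=0.3, p_e_given_not_g=0.9)
print(round(post, 3))  # 0.25: the posterior falls below the 0.5 prior
```

The qualitative point is independent of the particular numbers: so long as the SUE is at all less expected on theism than on its denial, conditionalizing on it drags the probability of God's existence down.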
Some theists maintain that they need not answer the threat posed to theistic belief by natural evil; they have reason enough to believe that God exists, and that belief renders impotent any threat that natural evil poses to theism. Explicating how God and natural evil coexist is not necessary, since they already know both exist. I will argue that, even granting theists the knowledge they claim, this does not leave them in an agreeable position. It commits the theist to a very unpalatable conclusion: our universe was not designed by God and is instead, most likely, a computer simulation.
In a fictional conversation designed to appeal to both working teachers and social philosophers, three educators take up the question of whether critical thinking itself can, or should, be taught independently of an explicit consideration of issues related to social justice. One, a thoughtful but somewhat traditional Enlightenment rationalist, sees critical thinking as a neutral set of skills and dispositions, essentially unrelated to the conclusions of morality, problems of social organization, or the content of any particular academic discipline. A second interlocutor, steeped in the “critical” pedagogy of Paulo Freire, insists that the problem is the pose of neutrality itself. On this view, all honest and effective approaches to teaching must confront the hegemony of unjust relationships, institutions, and conceptual schemes. The third character attempts to resolve the tension between these two opposed camps.
This paper responds to a recent claim made by the Norwegian philosopher Tarjei Larsen in the Journal of the British Society for Phenomenology that Merleau-Ponty’s own theory of painting undermines the important distinction in his thought between primordial perception and cultural construction, because it requires that perception take different cultural and historical forms in order to account for perspectival painting. I try to show that this distinction is not so easily collapsed, arguing that Larsen has misconstrued Merleau-Ponty’s account of painting as a phenomenological theory of painting and misinterpreted Merleau-Ponty’s concept of painterly style, and that the conclusion that perception must be malleable is therefore not warranted. What is at stake in this debate is whether Merleau-Ponty’s own account of painting threatens the basis of his phenomenological project, as well as his attempt to accord substantial philosophical significance to painting.
Theological incompatibility arguments suggest that God's comprehensive foreknowledge is incompatible with human free will. Logical incompatibility arguments suggest that a complete set of truths about the future is logically incompatible with human free will. Of the two, most think theological incompatibility is the more severe problem, yet hardly anyone thinks either kind of argument presents a real threat to free will. I will argue, however, that sound theological and logical incompatibility arguments exist and that, in fact, logical incompatibility is the more severe problem. A deep analysis of the arguments will reveal that, to avoid a fatalist conclusion, we must reject bivalence and adopt a specific kind of temporal ontology (presentism), which also forces the theist to embrace open theism.
This dialogue discusses a proposal for the legalization of torture under specific circumstances and contrasts it with arguments for a total ban on torture. We consider three types of objection: first, that the difficulty of obtaining adequate knowledge renders the stock “ticking bomb” scenario such a low-probability hypothetical as to present no realistic threat to a policy banning all torture; second, that empirically the information gleaned from torture is so unlikely to be reliable that it could not justify the moral risk; and third, that sanctioning torture, even if only under the most extreme circumstances, would generate a ‘culture of torture,’ thereby undermining fragile advances in international human rights rooted in an unwavering commitment to human dignity. Compelling as these arguments appear, not all the conversants are wholly convinced by them; to this extent the dialogue ends aporetically.
The basic idea of the particular way of understanding mental phenomena that has inspired the "cognitive revolution" is that, as a result of certain relatively recent intellectual and technological innovations, informed theorists now possess a more powerfully insightful comparison or model for mind than was available to any thinkers in the past. The model in question is that of software, or the list of rules for input, output, and internal transformations by which we determine and control the workings of a computing machine's hardware. Although this comparison and its many implications have dominated work in the philosophy, psychology, and neurobiology of mind since the end of the Second World War, it now shows increasing signs of losing its once virtually unquestioned preeminence. Thus we now face the question of whether it is possible to repair and save this model by means of relatively inessential "tinkering", or whether we must reconceive it fundamentally and replace it with something different. In this book, twenty-eight leading scholars from diverse fields of "cognitive science" (linguistics, psychology, neurophysiology, and philosophy) present their latest, carefully considered judgements about what they think will be the future course of this intellectual movement, which in many respects has been a watershed in our contemporary struggles to comprehend that which is crucially significant about human beings. Jerome Bruner, Noam Chomsky, Margaret Boden, Ulric Neisser, Rom Harré, and Merlin Donald, among others, have all written chapters in a non-technical style that can be enjoyed and understood by an interdisciplinary audience of psychologists, philosophers, anthropologists, linguists, and cognitive scientists alike.
We present a rendering of some common grammatical formalisms in terms of evolving algebras. Though our main concern in this paper is with constraint-based formalisms, we also discuss the more basic case of context-free grammars. Our aim throughout is to highlight the use of evolving algebras as a specification tool for obtaining grammar formalisms.
We consider the use of evolving algebra methods for specifying grammars for natural languages. We are especially interested in distributed evolving algebras. We provide the motivation for doing this, and we give a reconstruction of some classic grammar formalisms in directly dynamic terms. Finally, we consider some technical questions arising from the use of direct dynamism in grammar formalisms.
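The "directly dynamic" perspective in these two abstracts treats a grammar as a system of state updates rather than a static set of rules. A toy sketch of that idea (my own illustration, not the formalism of either paper): take the state to be the current sentential form of a context-free derivation, and let each step fire an update that rewrites the leftmost nonterminal by a chosen production.

```python
# Toy illustration: a context-free derivation viewed as state updates.
# The state is the current sentential form (a list of symbols); each
# step rewrites the leftmost nonterminal using the chosen production.
GRAMMAR = {
    "S": [["a", "S", "b"], []],  # S -> a S b | epsilon
}

def step(form, choice):
    """Fire one update: rewrite the leftmost nonterminal, if any."""
    for i, sym in enumerate(form):
        if sym in GRAMMAR:
            return form[:i] + GRAMMAR[sym][choice] + form[i + 1:]
    return form  # terminal state: no nonterminal left, nothing fires

# Derive "aabb": S => aSb => aaSbb => aabb
form = ["S"]
for choice in (0, 0, 1):
    form = step(form, choice)
print("".join(form))  # aabb
```

The evolving-algebra treatment is far more general than this (it specifies the update rules themselves algebraically, and the distributed case allows concurrent agents), but the sketch conveys the basic shift from grammars-as-rule-sets to grammars-as-transition-systems.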