Jonathan Cohen's book provides a lucid and penetrating treatment of the fundamental issues of contemporary analytical philosophy. This field now spans a greater variety of topics and divergence of opinion than it did fifty years ago, and Cohen's book addresses the presuppositions implicit in it and the patterns of reasoning on which it relies.
The tapestry of Wilfrid Sellars’s writings is dauntingly rich in stimulus and suggestion. I shall take up here an intriguing strand of thought that was woven into one of his early papers, ‘Language, Rules and Behavior’, and I shall discuss some of the issues to which it gives rise. Sellars was concerned in that paper with the procedures by which people evaluate actions as right or wrong, arguments as valid or invalid, and cognitive claims as well or ill grounded. He sought to map out a true via media, in his treatment of such procedures, between rationalistic apriorism and what, for want of a better term then, he called ‘descriptivism’, by which he understood the claim that all meaningful concepts and problems belong to the empirical or descriptive sciences, including the sciences of human behaviour. Sellars was thus led to ask not only ‘What sort of a thing is justification?’ but also ‘How do we come to accept the rules or laws that license justifications?’ And he saw these questions as arising in relation to any dialogue in which one party seeks to justify something to another. With regard to the justification of actions he held that the intuitionism of Ross, Prichard and Ewing was reasonably faithful to the phenomenology of moral thought and experience, though he did not agree with their belief in a non-natural quality or relation that may belong to actions over and above their empirical characteristics. With regard to the justification of predictions he held that the relevant covering laws are ultimately rendered acceptable by an appeal to instances of their application. But he was reluctant to conceive this kind of appeal as an inductive procedure.
Instead he called it an application of Socratic method, since the purpose of this method is to make explicit the rules that we have implicitly adopted for thought and action, and Sellars interpreted our judgments to the effect that A causally necessitates B as the expression of a rule governing our use of the terms ‘A’ and ‘B’: science consists in the attempt, by remodeling human language, to develop a system of rule-governed behaviour which will adjust the human organism to the environment.
Recent interest in the problem of verisimilitude stemmed originally from Popper's desire to provide a non-inductive criterion of merit that will select between two false theories. But the problem has also been taken up by others who are not committed to Popper's anti-inductivism. Indeed Ilkka Niiniluoto has argued that the estimated degree of truthlikeness of a generalisation g which is compatible with evidence e can be equated with the inductive probability of g on e, wherever g is a constituent in Hintikka's sense. It might therefore be worthwhile to approach the problem afresh, in order to determine quite generally whether a measure of verisimilitude is what is principally needed for the evaluation of scientific progress. I shall argue that it is not.
No one nowadays would deny the importance of conceptual innovation in the growth of scientific knowledge. But how is it possible? And by this I do not mean: what kinds of social, economic, or mental developments are causally responsible for promoting it? That is a question for historians, sociologists and psychologists of science to answer. Instead I shall concern myself with a more philosophical issue, namely: how can the possibility of conceptual innovation be compatible with the way in which we reason about language, meaning and understanding - i.e., what adjustments in, or constraints on, the framework of such reasoning are forced on us by acceptance of this possibility? In particular, does it fit in with the project of reconstructing scientific reasoning in artificial languages, such as Leibniz proposed in the 17th century or Carnap in the 20th?
First published in 1963, this title considers the philosophical problems encountered when attempting to provide a clear and general explanation of scientific principles, and the basic confrontation between such principles and experience. Beginning with a detailed introduction that considers various approaches to the philosophy and theory of science, Israel Scheffler then divides his study into three key sections – Explanation, Significance and Confirmation – that explore how the complex issues involved have been dealt with in contemporary research. This title, by one of America’s leading philosophers, provides a valuable analysis of the theory and problems surrounding the philosophy of science.
The classical analysis of relevance in probabilistic terms does not fit legal, moral or conversational relevance, and, though analysis in terms of a psychological model may fit conversational relevance, it certainly does not fit legal, moral or evidential relevance. It is important to notice here that some sentences are ambiguous between conversational and non-conversational relevance. But, if and only if R is relevant to a question Q, R is a reason, though not necessarily a complete or conclusive reason, for accepting or rejecting something as an answer to Q. Reasons of this kind are governed by appropriate covering laws or principled probabilities, and a number of questions thus arise about the relationship between relevance and certain formal-logical properties.