Curriculum 2000 has meant significant change for the post-16 sector. New qualifications have been introduced (e.g. the new Advanced Subsidiary examination) and the number of students involved in education and training post-16 has increased. In this scenario, how can the standards of new qualifications, particularly the new Advanced Subsidiary examinations, be compared with those of previous qualifications? One method is to use the prior achievement of candidates (i.e. GCSE results) as a basis for comparison of their results on subsequent qualifications (i.e. A levels and AS). This method of comparability and its limitations will be explored using examples with actual data.
This book puts forward a much-needed reappraisal of Immanuel Kant's conception of and response to skepticism, as set forth principally in the Critique of Pure Reason. It is widely recognized that Kant's theoretical philosophy aims to answer skepticism and reform metaphysics; Michael Forster makes the controversial argument that those aims are closely linked. He distinguishes among three types of skepticism: "veil of perception" skepticism, which concerns the external world; Humean skepticism, which concerns the existence of a priori concepts and synthetic a priori knowledge; and Pyrrhonian skepticism, which concerns the equal balance of opposing arguments. Forster overturns conventional views by showing how the first of these types was of little importance for Kant, but how the second and third held very special importance for him, namely because of their bearing on the fate of metaphysics. He argues that Kant undertook his reform of metaphysics primarily in order to render it defensible against these types of skepticism. Finally, in a critical appraisal of Kant's project, Forster argues that, despite its strengths, it ultimately fails, for reasons that carry interesting broader philosophical lessons. These reasons include inadequate self-reflection and an underestimation of the resources of Pyrrhonian skepticism.
Recent solutions to the curve-fitting problem, described in Forster and Sober (), trade off the simplicity and fit of hypotheses by defining simplicity as the paucity of adjustable parameters. Scott De Vito () charges that these solutions are 'conventional' because he thinks that the number of adjustable parameters may change when the hypotheses are described differently. This, he believes, is exactly what is illustrated in Goodman's new riddle of induction, otherwise known as the grue problem. However, the 'number of adjustable parameters' is actually a loose way of referring to a quantity that is not language dependent. The quantity arises out of Akaike's theorem in a way that ensures its language invariance.
The aim of this book is twofold: to explain the reconciliation of religion and politics in the work of John Locke, and to explore the relevance of that reconciliation for politics in our own time. Confronted with deep social divisions over ultimate beliefs, Locke sought to unite society in a single liberal community. Reason could identify divine moral laws that would be acceptable to members of all cultural groups, thereby justifying the authority of government. Greg Forster demonstrates that Locke's theory is liberal and rational but also moral and religious, providing an alternative to the two extremes of religious fanaticism and moral relativism. This account of Locke's thought will appeal to specialists and advanced students across philosophy, political science and religious studies.
Johann Gottfried Herder is a towering figure in modern thought, but one who has hitherto been severely underappreciated. Michael Forster seeks to rectify that situation by exploring the full range of his ideas, and showing their enormous impact in philosophy, linguistics, anthropology, and comparative literature.
In this essay Forster turns against Belief with a capital B: the fundamentalist belief in a religion as the sole source of truth, but also the belief in the power and violence of great men and ideologies. Against this he sets his own belief with a 'very small b': a belief in freedom, democracy and individualism, in the nobility of the spirit.
Traditional analyses of the curve fitting problem maintain that the data do not indicate what form the fitted curve should take. Rather, this issue is said to be settled by prior probabilities, by simplicity, or by a background theory. In this paper, we describe a result due to Akaike, which shows how the data can underwrite an inference concerning the curve's form based on an estimate of how predictively accurate it will be. We argue that this approach throws light on the theoretical virtues of parsimoniousness, unification, and non ad hocness, on the dispute about Bayesianism, and on empiricism and scientific realism. * Both of us gratefully acknowledge support from the Graduate School at the University of Wisconsin-Madison, and NSF grant DIR-8822278 (M.F.) and NSF grant SBE-9212294 (E.S.). Special thanks go to A. W. F. Edwards, William Harper, Martin Leckey, Brian Skyrms, and especially Peter Turney for helpful comments on an earlier draft.
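The trade-off that Akaike's result underwrites can be made concrete with a small sketch (my own illustration, not code or data from the paper): using the common least-squares form of AIC, n·ln(RSS/n) + 2k, a straight-line model beats a constant model on linearly generated data even after paying the 2k penalty for its extra adjustable parameter.

```python
import math
import random

def aic_least_squares(rss, n, k):
    # Gaussian least-squares form of Akaike's criterion:
    # AIC = n * ln(RSS/n) + 2k, where k counts adjustable parameters.
    return n * math.log(rss / n) + 2 * k

random.seed(0)
n = 50
xs = [i / 10 for i in range(n)]
# Synthetic data from a genuinely linear process plus noise.
ys = [2.0 + 1.5 * x + random.gauss(0, 0.3) for x in xs]

# Model 1: constant y = c (one adjustable mean-structure parameter).
mean_y = sum(ys) / n
rss_const = sum((y - mean_y) ** 2 for y in ys)

# Model 2: line y = a + b*x (two adjustable parameters),
# fitted by the closed-form least-squares estimates.
mean_x = sum(xs) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x
rss_line = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

aic_const = aic_least_squares(rss_const, n, k=1)
aic_line = aic_least_squares(rss_line, n, k=2)
# On linearly generated data the extra parameter earns its keep:
# the line's AIC is lower despite the complexity penalty.
assert aic_line < aic_const
```

The point of the estimate is predictive accuracy, not mere fit: the penalty term is what keeps a more complex model from winning automatically.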
There is a vacuum in three generations of the Grotowski men's lives; this becomes clear within the film's first ten minutes. First Hank wakes alone in the middle of the night, vomits for no apparent reason, and makes a ritual trip to a lonely diner. Next Hank's boy Sonny perfunctorily screws a prostitute who, after they have finished, tells him "you look so sad." Finally, Buck, the eldest, played by Peter Boyle, wanders through the house sucking breath from an oxygen tank, adds a new page to his capital punishment scrapbook, and spits racist epithets at some teenagers of color who wander into his yard.
We shed light on an old problem by showing that the logic LP cannot define a binary connective $\odot$ obeying detachment in the sense that every valuation satisfying $\varphi$ and $(\varphi\odot\psi)$ also satisfies $\psi$ , except trivially. We derive this as a corollary of a more general result concerning variable sharing.
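For readers unfamiliar with LP, here is a minimal sketch (my own illustration; the paper's result is far more general) of the standard countermodel showing that LP's material conditional already fails detachment: a valuation assigning "both" to φ and "false" to ψ satisfies φ and φ ⊃ ψ but not ψ.

```python
# LP's three values, on the usual order f < b < t; a formula is
# satisfied when its value is designated, i.e. true or both.
T, B, F = 1.0, 0.5, 0.0

def neg(x):
    return 1.0 - x          # swaps t and f, fixes b

def lor(x, y):
    return max(x, y)        # disjunction = maximum

def cond(x, y):
    return lor(neg(x), y)   # material conditional: ~x v y

def designated(x):
    return x >= 0.5

# The classic countermodel: v(p) = both, v(q) = false.
p, q = B, F
assert designated(p)           # premise p is satisfied
assert designated(cond(p, q))  # premise p -> q is satisfied
assert not designated(q)       # ...yet q fails: detachment is violated
```

The paper's theorem strengthens this observation: no binary connective definable in LP obeys detachment except trivially.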
We create a database of company codes of ethics from firms listed on the Standard & Poor's 500 Index and, separately, a sample of small firms. The SEC believes that "ethics codes do, and should, vary from company to company." Using textual analysis techniques, we measure the extent of commonality across the documents. We find substantial levels of common sentences used by the firms, including a few cases where the codes of ethics are essentially identical. We consider these results in the context of legal statements versus value statements. While legal writing often mandates duplication, we argue that value-based statements should be held to a higher standard of originality. Our evidence is consistent with isomorphic pressures on smaller firms to conform.
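The abstract does not specify the textual-analysis method, but one simple way to measure sentence-level commonality between two codes is the overlap of their normalized sentence sets. The sketch below is hypothetical (the snippets and the Jaccard measure are my own stand-ins, not the paper's data or metric):

```python
import re

def sentences(text):
    # Naive sentence split plus whitespace/case normalization;
    # a stand-in for a real textual-analysis pipeline.
    parts = re.split(r"[.!?]+", text.lower())
    return {" ".join(p.split()) for p in parts if p.strip()}

def shared_fraction(doc_a, doc_b):
    # Jaccard overlap of the two documents' sentence sets.
    a, b = sentences(doc_a), sentences(doc_b)
    return len(a & b) / len(a | b)

# Made-up code-of-ethics snippets for illustration only.
code_a = ("Employees must obey the law. "
          "Report conflicts of interest. We value candor.")
code_b = ("Employees must obey the law. "
          "Report conflicts of interest. Safety comes first.")
print(shared_fraction(code_a, code_b))  # 2 shared of 4 distinct -> 0.5
```

On this toy pair, half of the distinct sentences are verbatim shared, the kind of commonality the study quantifies at scale.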
The simple question, what is empirical success? turns out to have a surprisingly complicated answer. We need to distinguish between meritorious fit and ‘fudged fit’, which is akin to the distinction between prediction and accommodation. The final proposal is that empirical success emerges in a theory-dependent way from the agreement of independent measurements of theoretically postulated quantities. Implications for realism and Bayesianism are discussed. ‡This paper was written when I was a visiting fellow at the Center for Philosophy of Science at the University of Pittsburgh; I thank everyone for their support. †To contact the author, please write to: Department of Philosophy, University of Wisconsin–Madison, 5185 Helen C. White Hall, 600 North Park Street, Madison, WI 53706; e-mail: firstname.lastname@example.org.
This exploratory study examines how managers and professionals regard the ethical and social responsibility reputations of 60 well-known Australian and International companies, and how this in turn influences their attitudes and behaviour towards these organisations. More than 350 MBA, other postgraduate business students, and participants in Australian Institute of Management (Western Australia) management education programmes were surveyed to evaluate how ethical and socially responsible they believed the 60 organisations to be. The survey sought to determine what these participants considered ‘ethical’ and ‘socially responsible’ behaviour in organisations to be. The survey also examined how the participants’ beliefs influenced their attitudes and intended behaviours towards these organisations. The results of this survey indicate that many managers and professionals have clear views about the ethical and social responsibility reputations of companies. This affects their attitudes towards these organisations which in turn has an impact on their intended behaviour towards them. These findings support the view in other research studies that well-educated managers and professionals are, to some extent, taking into account the ethical and social responsibility reputations of companies when deciding whether to work for them, use their services or buy shares in their companies.
This article leverages insights from the body of Adam Smith's work, including two lesser-known manuscripts, the Theory of Moral Sentiments and the Lectures on Jurisprudence, to help answer the question of how companies should morally prioritize corporate social responsibility (CSR) initiatives and stakeholder claims. Smith makes philosophical distinctions between justice and beneficence and perfect and imperfect rights, and we leverage those distinctions to speak to contemporary CSR and stakeholder management theories. We address the often-neglected question of how far a company should be expected to go in pursuit of CSR initiatives, and we offer a fresh perspective on the role of business in relation to stakeholders and to society as a whole. Smith's moral insights help us to propose a practical framework of legitimacy in stakeholder claims that can help managers select appropriate and responsible CSR activities.
The phrase ‘The iterative conception of sets’ conjures up a picture of a particular set-theoretic universe – the cumulative hierarchy – and the constant conjunction of phrase-with-picture is so reliable that people tend to think that the cumulative hierarchy is all there is to the iterative conception of sets: if you conceive sets iteratively, then the result is the cumulative hierarchy. In this paper, I shall be arguing that this is a mistake: the iterative conception of set is a good one, for all the usual reasons. However, the cumulative hierarchy is merely one way among many of working out this conception, and arguments in favour of an iterative conception have been mistaken for arguments in favour of this one special instance of it. (This may be the point to get out of the way the observation that although philosophers of mathematics write of the iterative conception of set, what they really mean – in the terminology of modern computer science at least – is the recursive conception of sets. Nevertheless, having got that quibble off my chest, I shall continue to write of the iterative conception like everyone else.)
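As a concrete, necessarily finite illustration of my own (not from the paper), the first stages of the cumulative hierarchy arise by iterating the power-set operation on the empty set: V_0 = ∅ and V_{n+1} = P(V_n).

```python
def powerset(s):
    # All subsets of a finite set, returned as frozensets so that
    # sets can themselves be members of sets.
    out = [frozenset()]
    for x in s:
        out += [sub | {x} for sub in out]
    return set(out)

# Finite stages of the cumulative hierarchy:
# V_0 = {} (empty), V_{n+1} = powerset(V_n).
V = [set()]
for _ in range(3):
    V.append(powerset(V[-1]))

sizes = [len(stage) for stage in V]
print(sizes)  # [0, 1, 2, 4] -- each stage has 2**|previous stage| sets
```

The paper's point is precisely that this familiar construction is one instance of the iterative (recursive) conception, not the conception itself.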
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The standard Bayesian folklore about factoring simplicity into the priors, and about convergence theorems as a way of grounding their objectivity, comprises some of the myths that Earman's book does not address adequately. Review of John Earman, Bayes or Bust?, Cambridge, MA: MIT Press, 1992, £33.75 cloth.
What has science actually achieved? A theory of achievement should define what has been achieved, describe the means or methods used in science, and explain how such methods lead to such achievements. Predictive accuracy is one truth-related achievement of science, and there is an explanation of why common scientific practices tend to increase predictive accuracy. Akaike's explanation for the success of AIC is limited to interpolative predictive accuracy. But therein lies the strength of the general framework, for it also provides a clear formulation of many open problems of research.
The likelihood theory of evidence (LTE) says, roughly, that all the information relevant to the bearing of data on hypotheses (or models) is contained in the likelihoods. There exist counterexamples in which one can tell which of two hypotheses is true from the full data, but not from the likelihoods alone. These examples suggest that some forms of scientific reasoning, such as the consilience of inductions (Whewell, 1858, Novum Organon Renovatum, Part II of the 3rd ed. of The Philosophy of the Inductive Sciences; London: Cass, 1967), cannot be represented within Bayesian and Likelihoodist philosophies of science.
Illusions of personal authorship can arise when causation for an event is ambiguous, but people mentally represent an anticipated outcome and then observe a corresponding match in reality. When people do not maintain such high-level outcome representations and focus instead on low-level behavioral representations of concrete actions, illusions of personal authorship can be reduced. One condition that yields specific action plans and thereby moves focus from high-level outcomes to low-level actions is the generation of counterfactual thoughts. Hence, in the present research we tested whether thinking counterfactually can reduce illusory authorship. In line with predictions, generating behavior-regulating counterfactuals reduced susceptibility to the illusion. Importantly, this only occurred when people expected to re-encounter the situation to which the counterfactuals applied. These findings extend existing research on the boundary conditions of illusory experiences of personal authorship and might hint at a relationship between the illusion and behavior regulation.
Ramsey, Stich and Garon (1991) argue that if the correct theory of mind is some parallel distributed processing theory, then folk psychology must be false. Their idea is that if the nodes and connections that encode one representation are causally active, then all representations encoded by the same set of nodes and connections are also causally active. We present a clear and concrete counterexample to RSG's argument. In conclusion, we suggest that folk psychology and connectionism are best understood as complementary theories. Each has different limitations, yet each will co-evolve with the other in an overlapping domain of 'normal' psychology.
Although in most situations approaching desired end-states entails decreasing distance between oneself and an object, and avoiding undesired end-states increases such distance, in some cases distancing can also be a means to approach a given goal. We highlight examples involving responses to obstacles to achievement and self-control dilemmas, showing that motivational direction is not equivalent to the motivational strategy involved when people pursue their goals.
It has become very popular among philosophers to attempt to discredit, or at least set severe limits to, the thesis that there exist conceptual schemes radically different from ours. This fashion is misconceived. Philosophers have attempted to justify it in two main ways: by means of arguments which are a priorist relative to the relevant linguistic and textual evidence (and either independent of or based upon positive theories of meaning, understanding, and interpretation); and by means of arguments which are a posteriorist relative to that evidence. The former approach is misconceived, not only in that its particular arguments fail, but also in principle. The latter approach, while in general the right sort of approach to adopt to the question, arrives at its conclusion only through faulty execution, through misinterpretation of the evidence. Though quite unjustified, philosophers' hostility to the thesis of radically different conceptual schemes is easily explained, namely, in terms of a number of psychologically powerful motives which it subserves. These motives cannot step in to provide the missing justification, however. Instead, they reveal such hostility in an even shadier light.
The theory of fast and frugal heuristics, developed in a new book called Simple Heuristics That Make Us Smart (Gigerenzer, Todd, and the ABC Research Group, in press), includes two requirements for rational decision making. One is that decision rules are bounded in their rationality: rules are frugal in what they take into account, and therefore fast in their operation. The second is that the rules are ecologically adapted to the environment, which means that they 'fit to reality.' The main purpose of this article is to apply these ideas to learning rules, methods for constructing, selecting, or evaluating competing hypotheses in science, and to the methodology of machine learning, of which connectionist learning is a special case. The bad news is that ecological validity is particularly difficult to implement and difficult to understand. The good news is that it builds an important bridge from normative psychology and machine learning to recent work in the philosophy of science, which considers predictive accuracy to be a primary goal of science.
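One of the book's flagship heuristics, "Take The Best", illustrates both requirements: it consults cues one at a time in order of validity and decides on the first cue that discriminates, ignoring everything else. The sketch below is my own minimal rendering with made-up cue values, not code from the book:

```python
def take_the_best(option_a, option_b, cues):
    # cues: cue names ordered from most to least valid.
    # Each option maps a cue name to 1 (positive) or 0 (negative).
    # Decide on the FIRST discriminating cue; ignore all later cues.
    for cue in cues:
        if option_a[cue] != option_b[cue]:
            return "a" if option_a[cue] > option_b[cue] else "b"
    return None  # no cue discriminates: guess

# Hypothetical city-size cues (the book's running example is German
# city populations; these particular values are invented).
cues = ["capital", "has_airport", "university"]
berlin = {"capital": 1, "has_airport": 1, "university": 1}
bonn = {"capital": 0, "has_airport": 1, "university": 1}
print(take_the_best(berlin, bonn, cues))  # prints a: first cue decides
```

The frugality is visible in the early return: only one cue is ever inspected here, which is what makes the rule fast; its ecological rationality depends on how well the cue ordering matches the environment.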
Curve-fitting typically works by trading off goodness-of-fit with simplicity, where simplicity is measured by the number of adjustable parameters. However, such methods cannot be applied in an unrestricted way. I discuss one such correction, and explain why the exception arises. The same kind of probabilistic explanation offers a surprising resolution to a common-sense dilemma.
In this second part it is first shown that, at the beginning of his Jena period, Hegel, exactly like Schelling, takes his orientation from §76 of the Critique of Judgment and Kant's idea of a primal ground in which being and thinking, the subjective and the objective, coincide. After Schelling's departure in 1803, however, Hegel comes increasingly into contact, through his friend Schelver, with Goethe, whose methodology of an intuitive understanding in the sense of §77 of the Critique of Judgment Schelver, as newly appointed professor of botany and director of the botanical garden, has to put into practice. Hegel adopts this methodology at first partially, in the so-called second Jena system draft of 1804/5, and from 1805 onwards fully, in the lectures on the history of philosophy, though he develops it further in one decisive respect. It also underlies the Phenomenology of Spirit. At the same time it becomes clear why the system part 'Logic and Metaphysics' has to be abandoned and metaphysics itself has to become logic, which now, however, must be preceded by just such a 'Phenomenology of Spirit'.
Van Fraassen has argued that quantum mechanics does not conform to the pattern of common cause explanation used by Salmon as a precise formulation of Smart's 'cosmic coincidence' argument for scientific realism. This paper adds to this list some common examples from classical physics that also do not conform to Salmon's explanatory schema. This is bad news and good news for the realist. The bad news is that Salmon's argument for realism does not work; the good news is that realism need not demand hidden variables in quantum mechanics if they are not used in classical mechanics. Many correlations in physics are explained in terms of property identity (contra Salmon). This leads to a new argument against van Fraassen because the unified version of the theory obtained by identifying theoretical properties is always less empirically adequate.
It is shown that, according to NF, many of the assertions of ordinal arithmetic involving the T-function which is peculiar to NF turn out to be equivalent to the truth-in-certain-permutation-models of assertions which have perfectly sensible ZF-style meanings, such as: the existence of wellfounded sets of great size or rank, or the nonexistence of small counterexamples to the wellfoundedness of ∈. Everything here holds also for NFU if the permutations are taken to fix all urelemente.
A good case could be made that Herder is the founder not only of the modern philosophy of language but also of the modern philosophy of interpretation and translation, and that he has many things to say on these subjects from which we may still learn today. This essay will not attempt to make such a case, but it will be concerned with some aspects of Herder's position that would be central to it: three fundamental principles in his philosophy of language which also play fundamental roles in his theory of interpretation and translation. The essay's aim is also threefold: first, to describe the principles in question and their roles in this theory; second, to explain their emergence in a way which helps to make clearer the nature of Herder's contribution; and third, to give at least a sense of their philosophical subtlety and defensibility.
Charles Peirce is often credited with being among the first, perhaps even the first, to develop a scientific metaphysics of indeterminism. After rejecting the received view that Peirce developed his views from Darwin and Maxwell, I argue that Peirce's view results from his synthesis of Immanuel Kant's critical philosophy and George Boole's contributions to formal logic. Specifically, I claim that Kant's conception of the laws of logic as the basis for his architectonic, when combined with Boole's view of probability, yields Peirce's metaphysics of probabilistic laws. Indeterminism provides, therefore, an excellent illustration of how Peirce attempted to use logic to clarify metaphysical problems. "Since everyone must have conceptions of things in general, it is most important that they should be carefully constructed. I shall enter into no criticism of the different methods of metaphysical research, but shall merely say that in the opinions of several great thinkers, the only successful mode yet lighted upon is that of adopting our logic as our metaphysics." (W1: 490, 1866)
Let ZFB be ZF + "every set is the same size as a wellfounded set". Then the following are true. (1) Every sentence true in every (Rieger-Bernays) permutation model of a model of ZF is a theorem of ZFB (i.e., ZFB is the theory of Rieger-Bernays permutation models of models of ZF). (2) ZF and ZFAFA are both extensions of ZFB conservative for stratified formulæ. (3) The class of models of ZFB is closed under creation of Rieger-Bernays permutation models.
Mistakes and errors happen in most spheres of human life and activity, including in medicine. A mistake can be as simple and benign as the collection of an extra and unnecessary urine sample. Or a mistake can cause serious but reversible harm, such as an overdose of insulin in a patient with diabetes, resulting in hypoglycemia, seizures, and coma. Or a mistake can result in serious and permanent damage for the patient, such as the failure to consider epiglottitis in an initial differential diagnosis, resulting in a chronic vegetative state for a seven-year-old boy. Or a mistake can be an error in judgment that leads to a patient's death.