Recent solutions to the curve-fitting problem, described in Forster and Sober (), trade off the simplicity and fit of hypotheses by defining simplicity as the paucity of adjustable parameters. Scott De Vito () charges that these solutions are 'conventional' because he thinks that the number of adjustable parameters may change when the hypotheses are described differently. This he believes is exactly what is illustrated in Goodman's new riddle of induction, otherwise known as the grue problem. However, the 'number of adjustable parameters' is actually a loose way of referring to a quantity that is not language dependent. The quantity arises out of Akaike's theorem in a way that ensures its language invariance.
What was the source of this great flowering? Much of the credit for it has tended to go to Jacobi and Mendelssohn, who in 1785 began a famous public dispute concerning the question whether or not Lessing had been a Spinozist, as Jacobi alleged Lessing had admitted to him shortly before his death in 1781. But Jacobi and Mendelssohn were both negatively disposed towards Spinoza. In On the Doctrine of Spinoza in Letters to Mr.
The aim of this highly original book is twofold: to explain the reconciliation of religion and politics in the work of John Locke, and to explore the relevance of that reconciliation for politics in our own time. Confronted with deep social divisions over ultimate beliefs, Locke sought to unite society in a single liberal community. Reason could identify divine moral laws that would be acceptable to members of all cultural groups, thereby justifying the authority of government. Greg Forster demonstrates that Locke's theory is liberal and rational but also moral and religious, providing an alternative to the two extremes of religious fanaticism and moral relativism. This fresh new account of Locke's thought will appeal to specialists and advanced students across philosophy, political science, and religious studies.
Machine generated contents note: 1. Rationality, idealism, monism, and beyond Michael Della Rocca; 2. Kant's idea of the unconditioned and Spinoza's: the fourth antinomy and the ideal of pure reason Omri Boehm; 3. The question is whether a purely apparent person is possible Karl Ameriks; 4. Herder and Spinoza Michael Forster; 5. Goethe's Spinozism Eckart Förster; 6. Fichte on freedom: the Spinozistic background Allen Wood; 7. Fichte on the consciousness of Spinoza's God Johannes Haag; 8. Spinoza in Schelling's early conception of intellectual intuition Dalia Nassar; 9. Schelling's philosophy of identity and Spinoza's ethica more geometrico Michael Vater; 10. 'Omnis determinatio est negatio' - determination, negation, and self-negation in Spinoza, Kant, and Hegel Yitzhak Y. Melamed; 11. Thought and metaphysics: Hegel's critical reception of Spinoza Dean Moyar; 12. Two models of metaphysical inferentialism: Spinoza and Hegel Gunnar Hindrichs; 13. Trendelenburg and Spinoza Fred Beiser; 14. Replies on behalf of Spinoza Don Garrett.
The phrase ‘The iterative conception of sets’ conjures up a picture of a particular set-theoretic universe – the cumulative hierarchy – and the constant conjunction of phrase-with-picture is so reliable that people tend to think that the cumulative hierarchy is all there is to the iterative conception of sets: if you conceive sets iteratively, then the result is the cumulative hierarchy. In this paper, I shall be arguing that this is a mistake: the iterative conception of set is a good one, for all the usual reasons. However, the cumulative hierarchy is merely one way among many of working out this conception, and arguments in favour of an iterative conception have been mistaken for arguments in favour of this one special instance of it. (This may be the point to get out of the way the observation that although philosophers of mathematics write of the iterative conception of set, what they really mean – in the terminology of modern computer science at least – is the recursive conception of sets. Nevertheless, having got that quibble off my chest, I shall continue to write of the iterative conception like everyone else.)
Traditional analyses of the curve fitting problem maintain that the data do not indicate what form the fitted curve should take. Rather, this issue is said to be settled by prior probabilities, by simplicity, or by a background theory. In this paper, we describe a result due to Akaike, which shows how the data can underwrite an inference concerning the curve's form based on an estimate of how predictively accurate it will be. We argue that this approach throws light on the theoretical virtues of parsimoniousness, unification, and non ad hocness, on the dispute about Bayesianism, and on empiricism and scientific realism. * Both of us gratefully acknowledge support from the Graduate School at the University of Wisconsin-Madison, and NSF grant DIR-8822278 (M.F.) and NSF grant SBE-9212294 (E.S.). Special thanks go to A. W. F. Edwards, William Harper, Martin Leckey, Brian Skyrms, and especially Peter Turney for helpful comments on an earlier draft.
This exploratory study examines how managers and professionals regard the ethical and social responsibility reputations of 60 well-known Australian and International companies, and how this in turn influences their attitudes and behaviour towards these organisations. More than 350 MBA, other postgraduate business students, and participants in Australian Institute of Management (Western Australia) management education programmes were surveyed to evaluate how ethical and socially responsible they believed the 60 organisations to be. The survey sought to determine what these participants considered ‘ethical’ and ‘socially responsible’ behaviour in organisations to be. The survey also examined how the participants’ beliefs influenced their attitudes and intended behaviours towards these organisations. The results of this survey indicate that many managers and professionals have clear views about the ethical and social responsibility reputations of companies. This affects their attitudes towards these organisations which in turn has an impact on their intended behaviour towards them. These findings support the view in other research studies that well-educated managers and professionals are, to some extent, taking into account the ethical and social responsibility reputations of companies when deciding whether to work for them, use their services or buy shares in their companies.
It has become very popular among philosophers to attempt to discredit, or at least set severe limits to, the thesis that there exist conceptual schemes radically different from ours. This fashion is misconceived. Philosophers have attempted to justify it in two main ways: by means of arguments which are a priorist relative to the relevant linguistic and textual evidence (and either independent of or based upon positive theories of meaning, understanding, and interpretation); and by means of arguments which are a posteriorist relative to that evidence. The former approach is misconceived, not only in that its particular arguments fail, but also in principle. The latter approach, while in general the right sort of approach to adopt to the question, arrives at its conclusion only through faulty execution, through misinterpretation of the evidence. Though quite unjustified, philosophers' hostility to the thesis of radically different conceptual schemes is easily explained, namely, in terms of a number of psychologically powerful motives which it subserves. These motives cannot step in to provide the missing justification, however. Instead, they reveal such hostility in an even shadier light.
Ramsey, Stich and Garon (1991) argue that if the correct theory of mind is some parallel distributed processing theory, then folk psychology must be false. Their idea is that if the nodes and connections that encode one representation are causally active then all representations encoded by the same set of nodes and connections are also causally active. We present a clear, and concrete, counterexample to RSG's argument. In conclusion, we suggest that folk psychology and connectionism are best understood as complementary theories. Each has different limitations, yet each will co-evolve with the other in an overlapping domain of 'normal' psychology.
Herder already very early in his career, in the 1760s, established two vitally important and epoch-making principles in the philosophy of language: that thought is essentially dependent on and bounded by language; and that meanings or concepts should be identified - not with such items as the referents involved, Platonic forms, or empiricist 'ideas' - but with word-usages. What did Herder do for an encore? His Treatise on the Origin of Language from 1772 might seem the natural place to look for an answer to this question (since it is his best known work in the philosophy of language by far), but it is really the wrong place to look, because it temporarily regresses to a more conventional and less philosophically interesting position. However, Herder did succeed in making impressive progress in a broader array of works, namely by striving to identify prima facie problem cases confronting his two principles and to reconcile them with the latter. The main ones which he identified were God, animals, and non-linguistic art. In each of these cases, having initially proposed a reconciliation which did not work, he went on to develop a much more plausible one, indeed one which (at least in the two cases that really require one: animals and non-linguistic art) seems broadly correct.
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The standard Bayesian folklore about factoring simplicity into the priors, and about convergence theorems as a way of grounding their objectivity, are among the myths that Earman's book does not address adequately. Review of John Earman: Bayes or Bust?, Cambridge, MA: MIT Press, 1992, £33.75 cloth.
This chapter examines four solutions to the problem of many models, and finds some fault or limitation with all of them except the last. The first is the naïve empiricist view that the best model is the one that best fits the data. The second is based on Popper’s falsificationism. The third approach is to compare models on the basis of some kind of trade-off between fit and simplicity. The fourth is the most powerful: cross-validation testing.
The simple question ‘What is empirical success?’ turns out to have a surprisingly complicated answer. We need to distinguish between meritorious fit and ‘fudged fit’, which is akin to the distinction between prediction and accommodation. The final proposal is that empirical success emerges in a theory dependent way from the agreement of independent measurements of theoretically postulated quantities. Implications for realism and Bayesianism are discussed. ‡This paper was written when I was a visiting fellow at the Center for Philosophy of Science at the University of Pittsburgh; I thank everyone for their support. †To contact the author, please write to: Department of Philosophy, University of Wisconsin–Madison, 5185 Helen C. White Hall, 600 North Park Street, Madison, WI 53706; e-mail: email@example.com.
Ketelaar and Ellis have provided a remarkably clear and succinct statement of Lakatosian philosophy of science and have also argued compellingly that the neo-Darwinian theory of evolution meets the Lakatosian criteria of progressivity. We find ourselves in agreement with much of what Ketelaar and Ellis say about Lakatosian philosophy of science, but have some questions about (1) the place of evolutionary psychology in a Lakatosian framework, and (2) the extent to which evolutionary psychology truly predicts new findings.
What is induction? John Stuart Mill (1874, p. 208) defined induction as the operation of discovering and proving general propositions. William Whewell (in Butts, 1989, p. 266) agrees with Mill’s definition as far as it goes. Is Whewell therefore assenting to the standard concept of induction, which talks of inferring a generalization of the form “All As are Bs” from the premise that “All observed As are Bs”? Does Whewell agree, to use Mill’s example, that inferring “All humans are mortal” from the premise that “John, Peter and Paul, etc., are mortal” is an example of induction? The surprising answer is “no”. How can this be?
Deductive logic is about the validity of arguments. An argument is valid when its conclusion follows deductively from its premises. Here’s an example: If Alice is guilty then Bob is guilty, and Alice is guilty. Therefore, Bob is guilty. The validity of the argument has nothing to do with what the argument is about. It has nothing to do with the meaning, or content, of the argument beyond the meaning of logical phrases such as if…then. Thus, any argument of the following form (called modus ponens) is valid: If P then Q, and P, therefore Q. Any claims substituted for P and Q lead to an argument that is valid. Probability theory is also content-free in the same sense. This is why deductive logic and probability theory have traditionally been the main technical tools in philosophy of science.
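The content-free character of validity can be checked mechanically. A minimal sketch (the helper names are mine, not from the source): enumerate every truth assignment to P and Q and confirm that no assignment makes both premises of modus ponens true while making the conclusion false.

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def modus_ponens_valid():
    # Check every truth assignment to P and Q: whenever both premises
    # ("If P then Q" and "P") are true, the conclusion Q must also be true.
    for p, q in product([True, False], repeat=2):
        premises_true = implies(p, q) and p
        if premises_true and not q:
            return False  # a counterexample: premises true, conclusion false
    return True

print(modus_ponens_valid())
```

Because the check never consults what P and Q mean, only their truth values, it illustrates exactly the sense in which validity is independent of content.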
Sober (1984) has considered the problem of determining the evidential support, in terms of likelihood, for a hypothesis that is incomplete in the sense of not providing a unique probability function over the event space in its domain. Causal hypotheses are typically like this because they do not specify the probability of their initial conditions. Sober's (1984) solution to this problem does not work, as will be shown by examining his own biological examples of common cause explanation. The proposed solution will lead to the conclusion, contra Sober, that common cause hypotheses explain statistical correlations and not matchings between event tokens.
Classical mechanics is empirically successful because the probabilistic mean values of quantum mechanical observables follow the classical equations of motion to a good approximation (Messiah 1970, 215). We examine this claim for the one‐dimensional motion of a particle in a box, and extend the idea by deriving a special case of the ideal gas law in terms of the mean value of a generalized force used to define “pressure.” The examples illustrate the importance of probabilistic averaging as a method of abstracting away from the messy details of microphenomena, not only in physics, but in other sciences as well.
The theory of fast and frugal heuristics, developed in a new book called Simple Heuristics That Make Us Smart (Gigerenzer, Todd, and the ABC Research Group, in press), includes two requirements for rational decision making. One is that decision rules are bounded in their rationality: that rules are frugal in what they take into account, and therefore fast in their operation. The second is that the rules are ecologically adapted to the environment, which means that they 'fit to reality.' The main purpose of this article is to apply these ideas to learning rules, methods for constructing, selecting, or evaluating competing hypotheses in science, and to the methodology of machine learning, of which connectionist learning is a special case. The bad news is that ecological validity is particularly difficult to implement and difficult to understand. The good news is that it builds an important bridge from normative psychology and machine learning to recent work in the philosophy of science, which considers predictive accuracy to be a primary goal of science.
The paper provides a formal proof that efficient estimates of parameters, which vary as little as possible when measurements are repeated, may be expected to provide more accurate predictions. The definition of predictive accuracy is motivated by the work of Akaike (1973). Surprisingly, the same explanation provides a novel solution for a well known problem for standard theories of scientific confirmation — the Ravens Paradox. This is significant in light of the fact that standard Bayesian analyses of the paradox fail to account for the predictive utility of universal laws like 'All ravens are black'.
Charles Peirce is often credited for being among the first, perhaps even the first, to develop a scientific metaphysics of indeterminism. After rejecting the received view that Peirce developed his views from Darwin and Maxwell, I argue that Peirce's view results from his synthesis of Immanuel Kant's critical philosophy and George Boole's contributions to formal logic. Specifically, I claim that Kant's conception of the laws of logic as the basis for his architectonic, when combined with Boole's view of probability, yields Peirce's metaphysics of probabilistic laws. Indeterminism provides, therefore, an excellent illustration of how Peirce attempted to use logic to clarify metaphysical problems. "Since everyone must have conceptions of things in general, it is most important that they should be carefully constructed. I shall enter into no criticism of the different methods of metaphysical research, but shall merely say that in the opinions of several great thinkers, the only successful mode yet lighted upon is that of adopting our logic as our metaphysics." (W1: 490, 1866)
The likelihood theory of evidence (LTE) says, roughly, that all the information relevant to the bearing of data on hypotheses (or models) is contained in the likelihoods. There exist counterexamples in which one can tell which of two hypotheses is true from the full data, but not from the likelihoods alone. These examples suggest that some forms of scientific reasoning, such as the consilience of inductions (Whewell, 1858. In Novum organon renovatum (Part II of the 3rd ed.). The philosophy of the inductive sciences. London: Cass, 1967), cannot be represented within Bayesian and Likelihoodist philosophies of science.
Curve-fitting typically works by trading off goodness-of-fit with simplicity, where simplicity is measured by the number of adjustable parameters. However, such methods cannot be applied in an unrestricted way. I discuss one such correction, and explain why the exception arises. The same kind of probabilistic explanation offers a surprising resolution to a common-sense dilemma.
The distinction itself is best explained as follows. At the empirical level (at the bottom), there are curves, or functions, or laws, such as PV = constant in Boyle’s example, or a = M/r² in Newton’s example. The first point is that such formulae are actually ambiguous as to the hypotheses they represent. They can be understood in two ways. In order to make this point clear, let me first introduce a terminological distinction between variables and parameters. Acceleration and distance (a and r) are variables in Newton’s formula because they represent quantities that are more or less directly measured. The distinction between what is directly measured and what is not is to be understood relative to the context. All I mean is that values of acceleration and distance are determined independently of the hypothesis, or theory, under consideration. I do not mean that their determination involves no kind of inference at all. For instance, acceleration is the instantaneous change in velocity per unit time, and this is not something that is directly determined from raw data that records the position of the moon at consecutive points in time. It is consistent with that raw data that the motion of the moon is actually discontinuous, so that the moon has no acceleration. So, there are definitely theoretical assumptions made about the moon’s motion that are used to estimate the moon’s acceleration at a particular time. But these assumptions are not unique to Newton’s theory. The same assumptions are also made by the rival hypotheses under consideration. In fact, the existence of quantities such as instantaneous acceleration is only called into question by the far more recent theory of quantum mechanics. Likewise, in the case of Boyle’s law, there is no controversy in viewing the volume of the trapped air as being determined in a way that does not make use of the theory that Boyle is introducing.
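The variable/parameter distinction can be made concrete in a minimal sketch (the measurement values below are invented for illustration): P and V are variables, measured independently of the hypothesis, while the constant k in PV = k is an adjustable parameter whose value is fixed only by fitting the hypothesis to the data.

```python
# Hypothetical pressure-volume measurements at fixed temperature.
# V and P are the *variables*: quantities measured independently
# of the hypothesis under consideration.
data = [(1.0, 12.1), (2.0, 5.9), (3.0, 4.1), (4.0, 2.9)]  # (V, P) pairs

# Boyle's law PV = k has one *adjustable parameter*, k.
# Least-squares fit of P = k/V: minimizing sum_i (P_i - k/V_i)^2
# gives the closed form k = sum(P_i/V_i) / sum(1/V_i^2).
k = sum(p / v for v, p in data) / sum(1.0 / v**2 for v, _ in data)

print(k)  # the fitted value of the parameter
```

The ambiguity the passage describes is then visible in the code: "PV = k" with k left free names a family of curves, while "PV = 12.04" (say), with k fitted, names a single curve.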
Although in every inductive inference, an act of invention is requisite, the act soon slips out of notice. Although we bind together facts by superinducing upon them a new Conception, this Conception, once introduced and applied, is looked upon as inseparably connected with the facts, and necessarily implied in them. Having once had the phenomena bound together in their minds in virtue of the Conception, men can no longer easily restore them back to the detached and incoherent condition in which they were before they were thus combined. The pearls once strung, they seem to form a chain by their nature. Induction has given them unity which it is so far from costing us an effort to preserve, that it requires an effort to imagine it dissolved — William Whewell, 1858. (Quoted from Butts (ed.), 1989, p. 143).
Sharvy’s puzzle concerns a situation in which common knowledge of two parties is obtained by repeated observation of each party by the other, no fixed point being reached in finite time. Can a fixed point be reached?
Science in general, and chemistry in particular, advances by methods that are difficult to codify. The availability of theories (models) and instrumentation plays an important role, but indefinable motivations to study individual phenomena are also involved. The area of chromium photophysics has a rich history that spans 150 years. A case history of the progression from the natural history stage to its present state reveals the way in which several factors that are common to much physical science research interact.
Textbooks in quantum mechanics frequently claim that quantum mechanics explains the success of classical mechanics because “the mean values [of quantum mechanical observables] follow the classical equations of motion to a good approximation,” provided “the dimensions of the wave packet be small with respect to the characteristic dimensions of the problem.” The equations in question are Ehrenfest’s famous equations. We examine this case for the one-dimensional motion of a particle in a box, and extend the idea by deriving a special case of the ideal gas law in terms of the mean value of a generalized force, which has been used in statistical mechanics to define ‘pressure’. The example may be an important test case for recent philosophical theories about the relationship between micro-theories and macro-theories in science.
Tarski showed that for any set X, its set w(X) of well-orderable subsets has cardinality strictly greater than that of X, even in the absence of the axiom of choice. We construct a Fraenkel-Mostowski model in which there is an infinite strictly descending sequence under the relation |w(X)| = |Y|. This contrasts with the corresponding situation for power sets, where use of Hartogs' ℵ-function easily establishes that there can be no infinite descending sequence under the relation |P(X)| = |Y|.
Let ZFB be ZF + "every set is the same size as a wellfounded set". Then the following are true: every sentence true in every (Rieger-Bernays) permutation model of a model of ZF is a theorem of ZFB (i.e., ZFB is the theory of Rieger-Bernays permutation models of models of ZF); ZF and ZFAFA are both extensions of ZFB conservative for stratified formulæ; and the class of models of ZFB is closed under creation of Rieger-Bernays permutation models.
Skyrms's formulation of the argument against stochastic hidden variables in quantum mechanics using conditionals with chance consequences suffers from an ambiguity in its "conservation" assumption. The strong version, which Skyrms needs, packs in a "no-rapport" assumption in addition to the weaker statement of the "experimental facts." On the positive side, I argue that Skyrms's proof has two unnoted virtues (not shared by previous proofs): (1) it shows that certain difficulties that arise for deterministic hidden variable theories that exploit a nonclassical probability theory extend to the stochastic case; (2) the use of counterfactual conditionals relates the Bell puzzle to Dummett's (1976) discussion of realism in quantum mechanics.
For the purpose of this article, "hermeneutics" means the theory of interpretation, i.e. the theory of achieving an understanding of texts, utterances, and so on (it does not mean a certain twentieth-century philosophical movement). Hermeneutics in this sense has a long history, reaching back at least as far as ancient Greece. However, new focus was brought to bear on it in the modern period, in the wake of the Reformation with its displacement of responsibility for interpreting the Bible from the Church to individual Christians generally. This new focus on hermeneutics occurred especially in Germany.
I. In this paper I want to sketch an account of the role of skepticism in Kant's critical philosophy. The critical philosophy set forth in the Critique of Pure Reason (henceforth: the Critique) grew from and responds to a complex set of philosophical concerns. Among these, two of special importance are the concerns to address skepticism and to develop a reformed metaphysics. This much is widely recognized. However, it is a fundamental thesis of this paper that those projects belong tightly together, in the following sense: the types of skepticism which really originated and motivated the critical philosophy are ones which target metaphysics; and what originated and motivated the critical philosophy's reform of metaphysics is above all the goal of enabling it to withstand skepticism.
I. This paper concerns a surprisingly sharp disagreement about the nature of ancient Pyrrhonism which first emerges clearly in Kant and Hegel, but which continues in contemporary interpretations. The paper begins by explaining the character of this disagreement, then attempts to adjudicate it in the light of the ancient texts.
A and B in signaling games (Lewis 1969). Members of the population, such as our prehistoric pair, are occasionally faced with the following ‘game’. Let one of the players be the receiver and the other the sender. The receiver needs to know whether B is true or not, but only possesses information about whether A is true or not. In some environmental contexts, A is sufficient for B, in others it is not. The sender knows nothing about A or B, but does know that A is sufficient for B in some environments. This is a higher-order signaling game in which both players can benefit from sharing the information that they possess. How does a communication strategy evolve, and is it evolutionarily stable?
Kenneth Wilson won the Nobel Prize in Physics in 1982 for applying the renormalization group, which he learnt from quantum field theory (QFT), to problems in statistical physics—the induced magnetization of materials (ferromagnetism) and the evaporation and condensation of fluids (phase transitions). See Wilson (1983). The renormalization group got its name from its early applications in QFT. There, it appeared to be a rather ad hoc method of subtracting away unwanted infinities. The further allegation was that the procedure is so horrendously complicated that one cannot see the forest for the trees.
Puzzle solving in normal science involves a process of accommodation—auxiliary assumptions are changed, and parameter values are adjusted so as to eliminate the known discrepancies with the data. Accommodation is often contrasted with prediction. Predictions happen when one achieves a good fit with novel data without accommodation. So, what exactly is the distinction, and why is it important? The distinction, as I understand it, is relative to a model M and a data set D, where M is a set of equations with adjustable parameters (i.e., M is a family of specific equations, each with no free parameters). Definition: Model M predicts data D if and only if either (a) all members of M fit D well, or (b) a particular predictive hypothesis is selected from M by fitting M to other data, and the fitted model fits D well. M merely accommodates D if and only if (i) M does not predict D, and (ii) the predictive hypothesis selected from M using other data does not fit D well. There will be cases in which a model M neither predicts nor accommodates D. These are the cases in which we are willing to say that the data falsify the model. So, the distinction between prediction and accommodation applies only when there is no falsification.
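Clause (b) of the definition can be operationalized directly: select a predictive hypothesis from the family M by fitting its adjustable parameter to one data set, then ask whether the fitted equation fits the second data set D well. A minimal sketch with a one-parameter family and invented data (the threshold for "fits well" is my choice, not the source's):

```python
# Model family M: y = a * x, with one adjustable parameter a.
# Fitting M to "other data" selects a particular predictive hypothesis
# (a specific value of a); M predicts D if that fitted hypothesis fits D well.

def fit_a(points):
    # Closed-form least-squares estimate of a for y = a * x.
    return sum(x * y for x, y in points) / sum(x * x for x, _ in points)

def mean_squared_error(a, points):
    return sum((y - a * x) ** 2 for x, y in points) / len(points)

other_data = [(1, 2.1), (2, 3.9), (3, 6.2)]   # used to select the hypothesis
D = [(4, 8.1), (5, 10.2)]                     # the data to be predicted

a_hat = fit_a(other_data)
predicts = mean_squared_error(a_hat, D) < 0.1  # "fits D well" (chosen threshold)
print(predicts)
```

The crucial feature, matching the definition, is that D plays no role in selecting the hypothesis: a_hat is fixed by other_data alone before D is consulted.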
Wayne Myrvold (2003) has captured an important feature of unified theories, and he has done so in Bayesian terms. What is not clear is whether the virtue of such unification is most clearly understood in terms of Bayesian confirmation. I argue that the virtue of such unification is better understood in terms of other truth-related virtues such as predictive accuracy.
Type 1: This process occurs for half of the population. For this segment of the population, there is a 10% chance of developing the disease. There is a test for the disease such that 90% of the people who have the disease will test positive (event E), while the false positive rate is 10%, meaning that there is a 10% chance of testing positive for the disease when the person does not have the disease.
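For this segment, the stated numbers combine by Bayes' theorem: with a 10% prior, 90% sensitivity, and a 10% false-positive rate, the posterior probability of disease given a positive test E comes out to exactly 0.5. A minimal sketch of the arithmetic:

```python
# Probabilities for the Type 1 segment, as stated above.
prior = 0.10          # P(disease)
sensitivity = 0.90    # P(E | disease): true positive rate
false_pos = 0.10      # P(E | no disease): false positive rate

# Bayes' theorem: P(disease | E) = P(E | disease) * P(disease) / P(E),
# with P(E) expanded by the law of total probability.
p_E = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_E

print(posterior)  # 0.09 / 0.18 = 0.5
```

The tidy result reflects a coincidence in the numbers: true positives (0.9 × 0.1) and false positives (0.1 × 0.9) contribute equally to P(E), so a positive test leaves the two outcomes equally likely.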
The title of my lecture refers not to heated controversies in present-day Hegel reception, but to the section of the same name in the Phänomenologie des Geistes of 1807: “Das geistige Tierreich und der Betrug oder die Sache selbst” (“The spiritual animal kingdom and deceit, or the matter itself”). This comparatively little-noticed, and perhaps even less well understood, section is in my view one of the most important in the whole book. I would therefore like to attempt today to clarify its significance somewhat.
Deductive logic is about the property of arguments called validity. An argument has this property when its conclusion follows deductively from its premises. Here’s an example: If Alice is guilty then Bob is guilty, and Alice is guilty. Therefore, Bob is guilty. The important point is that the validity of this argument has nothing to do with the content of the argument. Any argument of the following form (called modus ponens) is valid: If P then Q, and P, therefore Q. Any claims substituted for P and Q lead to an argument that is valid. Probability theory is also content-free. This is why deductive logic and probability theory have traditionally been the main tools in philosophy of science.
In “Connectionism and the fate of folk psychology”, Forster and Saidel argue that the central claim of Ramsey, Stich and Garon (1991)—that distributed connectionist models are incompatible with the causal discreteness of folk psychology—is mistaken. To establish their claim, they offer an intriguing model which allegedly shows how distributed representations can function in a causally discrete manner. They also challenge our position regarding the projectibility of folk psychology. In this essay, I offer a response to their account and show how their model fails to demonstrate that our original argument was mistaken. While I will discuss several difficulties with their model, my primary criticism will be that the features of their model that are causally discrete are not truly distributed, while the features that are distributed are not really discrete. Concerning the issue of projectibility, I am more inclined to agree with Forster and Saidel, and I offer a revised account of what we should have said originally.
Contrary to Eckart Förster, I argue that the Opus postumum represents more of an evolution than a revolution in Kant's thought. Among other points, I argue that Kant's Selbstsetzungslehre, or theory of self-positing, according to which we cannot have knowledge of the spatio-temporal world except through recognition of the changes we initiate in it by our own bodies, does not constitute a radicalization of Kant's transcendental idealism, but is a development of the realist line of argument introduced by the "Refutation of Idealism" of 1787-90; and I argue that Kant's concept of the highest good, which according to Förster was only revised to connect virtue to collective rather than individual happiness in 1790-93 and was then in any case withdrawn in the Opus postumum, was uninterruptedly focused on collective happiness from the first edition of the first Critique, and that there is no reason to believe that Kant ever retracted it.
In Western countries a considerable number of older people move to a residential home when their health declines. Institutionalization often results in increased dependence, inactivity and loss of identity or self-worth (dignity). This raises the moral question as to how older, institutionalized people can remain autonomous as far as continuing to live in line with their own values is concerned. Following Walker's meta-ethical framework on the assignment of responsibilities, we suggest that instead of directing all older people towards more autonomy in terms of independence, professional caregivers should listen to the life narrative of older people and attempt to find out how their personal identity, relations and values in life can be continued in the new setting. If mutual normative expectations between caregivers and older people are not carefully negotiated, tension arises. This tension is illustrated by the narrative of Mr Powell, a retired successful public servant now living in a residential home. The narrative describes his current life, his need for help, his independent frame of mind, and his encounters with institutional and professional policies. Mr Powell sees himself as a man who has always cared for himself and others, and who still feels that he has to fulfil certain duties in life. Mr Powell's story shows that he is not always understood well by caregivers who respond from a one-sided view of autonomy as independence. This leads to misunderstanding and an underestimation of his need to be noticed and involved in the residential community.
Forster and Sober present a solution to the curve-fitting problem based on Akaike's Theorem. Their analysis shows that the curve with the best epistemic credentials need not always be the curve that most closely fits the data. However, their solution does not, without further argument, avoid the two difficulties that are traditionally associated with the curve-fitting problem: (1) that there are infinitely many equally good candidate-curves relative to any given set of data, and (2) that these best candidates include curves with indefinitely many bumps.
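The trade-off at issue can be sketched numerically. The following is a minimal illustration of the general Akaike-style idea (penalizing adjustable parameters), not Forster and Sober's own formalism; the residual sums of squares are invented numbers chosen purely for the example:

```python
import math

def aic_least_squares(rss, n, k):
    # AIC (up to an additive constant) for a least-squares fit with Gaussian
    # errors: n * ln(RSS / n) + 2k, where k is the number of adjustable
    # parameters. Lower is better: closeness of fit is rewarded, extra
    # parameters are penalized.
    return n * math.log(rss / n) + 2 * k

# Hypothetical residual sums of squares for n = 20 data points: a bumpy
# degree-9 polynomial (10 parameters) fits more closely than a straight
# line (2 parameters), yet scores worse once the 2k penalty is applied.
n = 20
candidates = {"line (k=2)": (18.0, 2), "degree-9 polynomial (k=10)": (10.0, 10)}
for name, (rss, k) in candidates.items():
    print(name, round(aic_least_squares(rss, n, k), 2))
```

On these assumed figures the line scores about 1.89 and the ten-parameter curve about 6.14, so the curve that fits the data most closely is not the one with the best credentials.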
E.M. Forster’s A Passage to India presents Brahman Hindu jurisprudence as an alternative to British rule of law, a utilitarian jurisprudence that hinges on mercantilism, central planning, and imperialism. Building on John Hasnas’s critiques of rule of law and Murray Rothbard’s critiques of Benthamite utilitarianism, this essay argues that Forster’s depictions [...].
This paper reconvenes Samuel Beckett’s psychotherapy with Wilfred Bion during 1934–1936, during which time Beckett conceived and began writing his second novel, Murphy. Based on Beckett’s visits to the Bethlem & Maudsley Hospital and his observation of the male nurses, the climax of Murphy is a chess match between Mr Endon (a male schizophrenic patient) and Murphy (a male psychiatric nurse). The precise notation of the Endon v Murphy chess match tells us that Beckett intended it to be an exemplar of an anti-match, perhaps a metaphor for the tragedy of being locked into madness. It is also argued that the match offers us insight into Beckett’s experience of the process of psychotherapy with Bion. Based on new information from Beckett’s nephew and Bion’s widow, hypotheses about the long-term impact of the Bion-Beckett analysis are advanced as a mutual experience which shaped the lives and later literary output of both men, producing conjoined career writings which continue to offer us stark and sublime condensations of depression, psychosis, and the challenges of therapy and recovery.
Claims about the economic motivations of population groups in the American past are a staple of contemporary political argument, as polemicists of one side seek to impeach the moral standing of the other side by impeaching the moral standing of the forebears of the people on the other side. Sometimes such polemics are presented to the public in the guise of nonpartisan works of popular history. This paper, applying the training of a litigator in preparing an "opposition" or "reply" brief, examines and exposes the "spin" in the economic history offered by popular author Nathaniel Philbrick in his 2006 book Mayflower, in the sections of the book addressing the bloody conflict in New England in 1675-76 known as "King Philip's War." The paper uses the facts Mr. Philbrick himself reports in his book to refute his conclusions, showing that the English colonists fought in legitimate self-defense and not out of greed or racism, against certain (not all) Indian tribes whose warriors, in the words of one of their own chiefs, were like "sticks laid on a heap, till by the multitude of them a great fire came to be kindled."
Referring to Professor Tennessen's article “What Should We Say?” (Inquiry, vol. 2 (1959), pp. 265-90), Mr. Stigen argues that Tennessen fails to distinguish between the speech situation of the speaker and that of the interpreter. He therefore, according to Stigen, confuses the problems relevant to each of them and frequently treats problems of “What should I say?” with considerations relevant only to interpreters, whose proper question is “What does he mean?”, and vice versa. Among other mistakes, according to Stigen, this failure to distinguish the problems leads Tennessen to recommend a Humpty Dumpty attitude for speakers, although it is appropriate only to interpreters. The paper closes with some remarks on the use of modifying expressions.
Dear Mr. Lucas, I was wondering if you had come across Query 44 of George Berkeley's “Analyst: A discourse addressed to an infidel mathematician”? It reads: “Whether the difference between a mere computer and a man of science be not that one computes on principles clearly conceived and by rules evidently demonstrated, whereas the other [i.e. a man] doth not?” Not bad for 1734!
Smart MR contrast agents exhibit modulation of their relaxivity by specific physiological or biochemical trigger events, while targeted MR contrast agents are envisioned to deliver large gadolinium chelates into the target tissue. In an effort to develop novel smart and targeted MR contrast agents, a series of DO3A-based multifunctional chelating agents with variable side-chain lengths has been synthesized. They serve as valuable multipurpose precursors for gadolinium-chelate-based contrast agents in the design of relaxometric MR probes. The presence of the amino group in the side chain of the macrocycle allows for conversion into various functional groups (aminophosphonates, aminocarboxylates, etc.) or for conjugation with different biomolecules, dyes, and polymers. Choice of the functional groups depends on the…
The paper concludes the argument that certain aesthetic objects conduce to a feeling of radical contingency, and to an openness to St Thomas's Third Way proof for the existence of God. Much is conceded to the late Mr Gershon Weiler's criticism of an earlier discussion. The upshot is (a) that Necessary Being as converse of radical contingency may be an Aesthetic Idea/Sublime of Kant's kind, and (b) that without the ‘I AM that I am’, it is empty. The ‘inference’ from radical contingency to Necessary Being may function as George Eliot thought Wit to function, intellectually/aesthetically.
The Strange Case of Dr. B and Mr. Hide: Ethical Sensitivity as a Means to Reflect Upon One’s Actions in Managing Conflict of Interest. Case study, Journal of Bioethical Inquiry, pp. 1-3, DOI 10.1007/s11673-012-9360-4. Author: Marie-Josée Potvin, Programmes de bioéthique, Department of Social and Preventive Medicine, Université de Montréal, C.P. 6128, succ. Centre-ville, Montréal, Québec, Canada H3C 3J7. Online ISSN 1872-4353; Print ISSN 1176-7529.
Mr. Herbert Spencer, the English philosopher, of world-wide celebrity, has contributed to the April number of the Contemporary Review an article entitled “The Coming Slavery,” which commends itself to the attention of English Socialists, because he predicates therein that the Social “changes made, the changes in progress, and the changes urged, are carrying us ... to the desired ideal of the Socialists”; that even the Liberals, the worst enemies of Socialists, “are diligently preparing the way for them”; and that nationalisation of land, banks, railways, mines, factories, and other private instruments of production will be realised in the near future: and because this hopeful idea, entertained by so profound a philosopher, will put fresh courage into the hearts of militant Socialists, and will encourage them to pursue with renewed ardour their propaganda of Communistic theories. But the article has other claims to our attention. It professes to be a powerful and conclusive criticism of Socialism, while it is, in effect, a mere summary of all the commonplace arguments habitually brought against Socialism. That so illustrious a man as Mr. Spencer should fail to find more serious arguments against it, is a very conclusive demonstration, if that were wanted, of the soundness of Socialism. That a thinker like Mr. Spencer, one of the lights of the bourgeoisie, should think it worth his while to bring forward such arguments, makes it incumbent on his opponents to refute them, how trivial and unworthy soever they may be.
Mr. B. A. Farrell has argued that psychoanalysis is refutable, without clarifying different senses of 'refutable'. Once this clarification is done and the relevant literature examined, however, it is seen that psychoanalysis is not refutable in several important senses of 'refutable', although it is refutable in a sense that is quite uninteresting.
Carbon monoxide (CO) intoxication leads to acute and chronic neurological deficits, but little is known about the specific noxious mechanisms. ¹H magnetic resonance spectroscopy (MRS) may allow insight into the pathophysiology of CO poisoning by monitoring neurochemical disturbances, yet only limited information is available to date on the use of this protocol in determining the neurological effects of CO poisoning. To further examine the short-term and long-term effects of CO on the central nervous system, we have studied seven patients with CO poisoning assessed by gray and white matter MRS, magnetic resonance imaging (MRI) and neuropsychological testing. Five patients suffered from acute high-dose CO intoxication and were in coma for 1–6 days. In these patients, MRI revealed hyperintensities of the white matter and globus pallidus and also showed increased choline (Cho) and decreased N-acetyl aspartate (NAA) ratios to creatine (Cr), predominantly in the white matter. Lactate peaks were detected in two patients during the early phase of high-dose CO poisoning. Two patients with chronic low-dose CO exposure and without loss of consciousness had normal MRI and MRS scans. On follow-up, five of our seven patients had long-lasting intellectual impairment, including one individual with low-dose CO exposure. The MRS results showed persisting biochemical alterations despite the MRI scan showing normalization of morphological changes. In conclusion, the MRS was normal in patients suffering from chronic low-dose CO exposure; in contrast, patients with high-dose exposure showed abnormal gray and white matter levels of NAA/Cr, Cho/Cr and lactate, as detected by ¹H MRS, suggesting disturbances of neuronal function, membrane metabolism and anaerobic energy metabolism, respectively. Early increases in Cho/Cr and decreases of NAA/Cr may be related to a poor long-term outcome, but confirmation by future studies is needed.
Timothy Williamson has famously argued that the (KK) principle (roughly, that if one knows that p, then one knows that one knows that p) should be rejected. We analyze Williamson’s argument and show that its key premise is ambiguous, and that when it is properly stated this premise no longer supports the argument against (KK). After canvassing possible objections to our argument, we reflect upon some conclusions that suggest significant epistemological ramifications pertaining to the acquisition of knowledge from prior knowledge by deduction.