I oppose the way John Skorupski characterizes morality in terms of the blameworthy and the role he consequently assigns to punitive feelings in directing one's will and shaping one's character. Skorupski does not hold that the punishment involved in blame- and guilt-feelings grounds the normativity of moral obligation. He defends a specific view of moral psychology and moral practice in which the blame-feeling disposes to the withdrawal of recognition, which involves some sort of casting the transgressor out of the community, resulting in the suffering of repentance, which is necessary to make atonement possible. I argue that this picture threatens to socialize morality. I defend the Kantian idea that the will is aligned to obligation not through castigation but through our consciousness of our vocation as takers and givers of reasons. This highlights very different feelings as essential to the typically moral stance, feelings that are not necessarily punitive, such as respect and reverence.
Theo Verbeek provides the first book-length examination of the initial reception of Descartes's written works. Drawing on his research of primary materials written in Dutch and Latin and found in libraries all over Europe, even including the Soviet Union, Verbeek opens a period of Descartes's life and of the development of Cartesian philosophy that has been virtually closed since Descartes's death. Verbeek's aim is to provide as complete a picture as possible of the discussions that accompanied the introduction of Descartes's philosophy into Dutch universities, especially those in Utrecht and Leiden, and to analyze some of the major problems that philosophy raised in the eyes of Aristotelian philosophers and orthodox theologians. The period covered extends from 1637, the year in which Descartes published his _Discours de la Méthode_, until his death in 1650. Verbeek demonstrates how Cartesian philosophy moved successfully into the schools and universities of Holland and how this resulted in a real evolution of Descartes's thought beyond the somewhat dogmatic position of Descartes himself. Verbeek further argues that this progression was an essential step in the universal propagation of Cartesian philosophy throughout Europe during the second half of the seventeenth century. As he details the disputes between Cartesians and anti-Cartesians in Holland, Verbeek shows how the questions raised were related on the one hand to religious conflicts between the Remonstrants and the Orthodox Calvinists and on the other hand to political conflicts between more liberal factions and those fighting for the union of church and state to enhance religious control of society in general. Contending that Descartes and Cartesian philosophy were central to the development of the modern Dutch state, Verbeek illuminates the role they played in Dutch political, religious, and intellectual life.
David Theo Goldberg engages Achille Mbembe in a wide-ranging conversation on the key lines of analysis of Mbembe's book, Critique of Black Reason. The discussion ranges across a broad swath of key themes: the constitutive feature of racisms in the making of modernity and modern capitalism as conceived through the global black experience; the African and French archives in constituting, resisting, and refashioning 'black reason' and its multiple registers; the centrality of slavery to this constitution and resistance; thinghood and humanity; liberalism as the basis for racial pessimism. The discussion closes with Mbembe's plea for a shared being and an exchange about repair and reparation.
Fueled by ever-growing amounts of data and advances in artificial intelligence, decision-making in contemporary societies is increasingly delegated to automated processes. Drawing from social science theories and from the emerging body of research on algorithmic appreciation and algorithmic perceptions, the current study explores the extent to which personal characteristics can be linked to perceptions of automated decision-making by AI, and the boundary conditions of these perceptions, namely the extent to which such perceptions differ across media, health, and judicial contexts. Data from a scenario-based survey experiment with a national sample show that people are by and large concerned about risks and have mixed opinions about the fairness and usefulness of automated decision-making at a societal level, with general attitudes influenced by individual characteristics. Interestingly, for specific decisions, those taken automatically by AI were often evaluated on a par with, or even better than, those taken by human experts. Theoretical and societal implications of these findings are discussed.
This article argues that hunger in Canada, while being an outcome of unemployment, low incomes, and inadequate welfare, springs also from the failure to recognize and implement the human right to food. Food security has, however, largely been ignored by progressive social policy analysis. Barriers standing in the way of achieving food security include the increasing commodification of welfare and the corporatization of food, the depoliticization of hunger by governments and the voluntary sector, and, most particularly, the neglect by the federal and provincial governments of their obligations to guarantee the domestic right to food as expressed in international human rights law. The interconnectedness of hunger, welfare, and food security issues in a first-world society is explored from the perspective of progressive social policy and food security analysis and the development of alternative strategies. In terms of advancing the human right to food in Canada, particular emphasis is placed on the role of the state and civil society, and on the social and economic rights of citizenship built on an inclusive social policy analysis and politics of welfare, food security, and human rights.
INTRODUCTION When Karl Popper published his definition of closer-to-the-truth, this was an important intellectual event, but not a shocking one. ...
The brain adapts to changes that take place in the body. Deprivation of input results in size reduction of cortical representations, whereas an increase in input results in an increase of representational space. Amputation is one of the most dramatic disturbances of the integrity of the body. The brain adapts in many ways to this breakdown of the afferent–efferent equilibrium. However, almost all studies focus on the sensorimotor consequences; it is not known whether adaptation also takes place at other "levels" in the system. The present study addresses the question whether amputees dream about their intact body, as before the amputation, or about the body after the amputation, and whether the dream content is a function of time since the amputation and type of amputation. The results show that the majority of the dreamers reported dreams about their intact body, although the mean time elapsed since the amputation was twelve years. There is no clear relation with the type of amputation. The results give modest evidence for the existence of a basic neural representation of the body that is at least partly genetically determined and therefore relatively insensitive to changes in the sensory input.
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists.
The naive structuralist definition of truthlikeness is an idealization in the sense that it assumes that all mistaken models of a theory are equally bad. The natural concretization is a refined definition based on an underlying notion of structurelikeness. In Section 1 the naive definition of truthlikeness of theories is presented, using a new conceptual justification in terms of instantial and explanatory mistakes.
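A minimal sketch of the naive definition in its usual set-theoretic rendering (the paper's own formulation may differ in detail): with U the set of conceptual possibilities, T ⊆ U 'the truth' (the set of nomic possibilities), and theories X, Y ⊆ U, theory X is at least as truthlike as Y iff
\[ X \setminus T \subseteq Y \setminus T \quad\text{and}\quad T \setminus X \subseteq T \setminus Y, \]
where X ∖ T collects the explanatory mistakes of X (admitted impossibilities) and T ∖ X its instantial mistakes (wrongly excluded possibilities).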
We examine an article of Paul Ricoeur on autonomy and vulnerability. Ricoeur presents the two notions in the field of justice as intricately woven into each other, and analyzes their interdependence on three levels of human agency. Ricoeur's exposition focuses on judicial judgment. After presenting Ricoeur's argument and an analysis of his main points, the author argues that Ricoeur's reflection aligns with some essential intentions of care ethics. Ricoeur's contribution to care ethics lies in a delicate balance of autonomy and its vulnerability.
There are two principles which bear the name 'Frege's principle': the principle of compositionality and the context principle. The aim of this contribution is to investigate whether this is justified: did Frege accept both principles at the same time, did he hold the one principle but not the other, or did he, at some moment, change his opinion? The conclusion is as follows. There is a development in Frege's position. In the period of Grundlagen he adhered to a strict form of contextuality. He repeated contextuality in later writings, but became less strict. From 1914 on, pushed by the needs of research, he came close to compositionality. But he could never make the final step toward compositionality for principled reasons; therefore he would always reject compositionality.
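For orientation, the principle of compositionality in its usual formal rendering (a textbook gloss, not a formulation attributed to Frege here): the meaning of a compound expression is a function of the meanings of its immediate parts and of the syntactic rule combining them,
\[ m(\sigma(e_1,\ldots,e_n)) = \mu_\sigma(m(e_1),\ldots,m(e_n)), \]
whereas the context principle says that a word has meaning only in the context of a sentence, so that sentence meaning, not word meaning, is the primary notion.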
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists. We may find totally opposite things beautiful: a simple mathematical principle as well as a series of unrepeatable complex contingencies. It is a matter of psychology. (Stephen Jay Gould, translated passage from Kayzer 2000, 30)
In section I the notions of logical and inductive probability will be discussed as well as two explicanda, viz. degree of confirmation, the base for inductive probability, and degree of evidential support, Popper's favourite explicandum. In section II it will be argued that Popper's paradox of ideal evidence is no paradox at all; however, it will also be shown that Popper's way out has its own merits.
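As a hedged illustration of the contrast (the paper's own formal choices may differ): degree of confirmation is naturally explicated as the posterior probability, degree of evidential support as the difference the evidence makes,
\[ c(h,e) = p(h \mid e), \qquad s(h,e) = p(h \mid e) - p(h). \]
Popper's 'ideal evidence' is then evidence e with p(h|e) = p(h): it leaves the degree of confirmation at its prior value and yields zero support, although it intuitively seems highly relevant.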
Straightforward theory revision, taking into account as effectively as possible the established nomic possibilities and the empirical laws induced on their basis, is conducive to (unstratified) nomic truth approximation. The question this paper asks is: is it possible to reconstruct the relevant theory revision steps, on the basis of incoming evidence, in AGM terms? A positive answer will be given in two rounds, first for the case in which the initial theory is compatible with the established empirical laws, then for the case in which it is incompatible with at least one such law.
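For readers who want the AGM vocabulary in view, the standard identities (general AGM background, not specific to this paper) define revision from contraction and vice versa:
\[ K * \varphi = (K \div \lnot\varphi) + \varphi, \qquad K \div \varphi = K \cap (K * \lnot\varphi), \]
the Levi and Harper identities respectively, where + is expansion, i.e. adding φ and closing under consequence. The reconstruction question is then whether the truth-approximation-conducive revision steps can be cast as such * operations on evidence-fed belief sets.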
This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly new generation tasks for further improvement. The paper concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the 'comparative evaluation matrix'.
The qualitative theory of nomic truth approximation presented by Kuipers, in which 'the truth' concerns the distinction between nomic, e.g. physical, possibilities and impossibilities, rests on a very restrictive assumption, viz. that theories always claim to characterize the boundary between nomic possibilities and impossibilities. Fully recognizing two different functions of theories, viz. excluding and representing, this paper drops this assumption by conceiving theories in development as tuples of postulates and models, where the postulates claim to exclude nomic impossibilities and the models claim to represent nomic possibilities. Revising theories then becomes a matter of adding or revising models and/or postulates in the light of increasing evidence, captured by a special kind of theories, viz. 'data-theories'. Under the assumption that the data-theory is true, achieving empirical progress in this way provides good reasons for the abductive conclusion that truth approximation has been achieved as well. Here, the notions of truth approximation and empirical progress are formally direct generalizations of the earlier ones. However, truth approximation is now explicitly defined in terms of increasing truth-content and decreasing falsity-content of theories, whereas empirical progress is defined in terms of lasting increased accepted and decreased rejected content in the light of increasing evidence. These definitions are strongly inspired by a paper of Gustavo Cevolani, Vincenzo Crupi and Roberto Festa, viz. 'Verisimilitude and belief change for conjunctive theories' (2011, 183–222).
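One natural set-theoretic reading of these generalized definitions (an illustrative reconstruction, not necessarily the paper's exact clauses): with U the conceptual possibilities, T ⊆ U the nomic possibilities, and a theory ⟨P, M⟩ claiming T ⊆ P and M ⊆ T, the revised theory ⟨P′, M′⟩ is at least as close to T as ⟨P, M⟩ when
\[ M \cap T \subseteq M' \cap T, \quad M' \setminus T \subseteq M \setminus T, \quad (U \setminus P) \setminus T \subseteq (U \setminus P') \setminus T, \quad (U \setminus P') \cap T \subseteq (U \setminus P) \cap T, \]
i.e. the models gain correctly represented possibilities and shed misrepresentations, while the postulates come to exclude more genuine impossibilities and fewer genuine possibilities.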
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular. On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my "From Instrumentalism to Constructive Realism". The analysis supports the findings of James McAllister in his beautiful "Beauty and Revolution in Science", by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists.
Introduction and Overview. Theo Kuipers (Faculty of Philosophy, University of Groningen, The Netherlands) and Gerhard Schurz (Department of Philosophy, University of Duesseldorf, Germany). Erkenntnis 75(2): 151–163. DOI 10.1007/s10670-011-9288-9.
In this paper it is argued that Hintikka's game theoretical semantics for Independence Friendly logic does not formalize the intuitions about independent choices; it rather is a formalization of imperfect information. Furthermore it is shown that the logic has several remarkable properties (e.g., renaming of bound variables is not allowed). An alternative semantics is proposed which formalizes intuitions about independence.
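A standard illustration of the slash notation and its imperfect-information reading (a textbook example, not taken from the paper):
\[ \forall x\, (\exists y / \forall x)\; x = y \]
is true under game-theoretical semantics only on one-element domains: the verifier must choose y without access to the value of x, so only a constant choice equal to every x can win.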
An interesting consequence of the structuralist theory of truth approximation, as developed in my From Instrumentalism to Constructive Realism, henceforth ICR, concerns so-called 'inference to the best explanation'. It will be argued that this popular rule among scientific realists can better be replaced by various kinds of 'inference to the best theory'.
While the special volumes of the series of Handbooks of the Philosophy of Science address topics relative to a specific discipline, this general volume deals ...
In my From Instrumentalism to Constructive Realism I have shown how an instrumentalist account of empirical progress can be related to nomic truth approximation. However, it was assumed that a strong notion of nomic theories was needed for that analysis. In this paper it is shown, in terms of truth and falsity content, that the analysis already applies when, in line with scientific common sense, nomic theories are merely assumed to exclude certain conceptual possibilities as nomic possibilities.
The issues of just what laws of nature are and what makes statements law-like have been more discussed than advanced. After exploring the general area and uncovering some difficulties which, I suspect, make the case even knottier than generally imagined, I argue that certain resources available only to the theist, in particular counterfactuals of God's freedom, may provide the materials needed for constructing solutions.
During the past four decades, the Netherlands played a leading role in the debate about euthanasia and assisted suicide. Despite the claim that other countries would soon follow the Dutch legalization of euthanasia, only Belgium and the American state of Oregon did. In many countries, intense discussions took place. This article discusses some major contributions to the discussion about euthanasia and assisted suicide as written by Nigel Biggar (2004), Arthur J. Dyck (2002), Neil M. Gorsuch (2006), and John Keown (2002). They share a concern that legalization will undermine a society's respect for the inviolability and sanctity of life. Moreover, the Report of the House of Lords Select Committee on the Assisted Dying for the Terminally Ill Bill (2005) is analyzed. All studies use ethical, theological, philosophical, and legal sources. All these documents include references to experiences from the Netherlands. In addition, two recent Dutch documents are analyzed which advocate further liberalization of the Dutch euthanasia practice, so as to include infants (Groningen Protocol, NVK 2005) and elderly people "suffering from life" (Dijkhuis Report, KNMG 2004).
Physicalism, or roughly the view that the stuff that physics talks about is all the stuff there is, has had a popular press in philosophical circles during the twentieth century. And yet, at the same time, it has become quite fashionable lately to believe that the mind matters in this world after all and that psychology is an autonomous science irreducible to physics. However, if (true, downward) mental causation implies non-reducibility and Physicalism implies the converse, it is hard to see how these two views could be compatible. This paper reviews some classical arguments purportedly showing how the autonomy of the special sciences can be upheld without violating the laws of physics or the principle that physics constitutes a complete and closed system. These arguments are presented in order of increasing strength, indicating how the more popular arguments in fact fall short of establishing anti-reductionism of the intended kind. New arguments are added which claim to demonstrate quite effectively how downward causation is possible compatibly with the reign of physics. The paper begins with a section which distinguishes various kinds of reductionism.
The article sets out a framework for analysing the way discourses construct legitimation for social practices in public communication as well as in everyday interaction. Four key categories of legitimation are distinguished: 1) 'authorization', legitimation by reference to the authority of tradition, custom and law, and of persons in whom institutional authority is vested; 2) 'moral evaluation', legitimation by reference to discourses of value; 3) 'rationalization', legitimation by reference to the goals and uses of institutionalized social action, and to the social knowledges that endow them with cognitive validity; and 4) 'mythopoesis', legitimation conveyed through narratives whose outcomes reward legitimate actions and punish non-legitimate actions. Examples are drawn from texts legitimating or de-legitimating compulsory education, including children's books, brochures for parents, teacher training texts, and media texts.
What meaning does the theological notion of discernment have in a postsecular cultural condition? Three levels of the meaning of postsecular are distinguished: first, the 'postsecular', as a notion that characterises a cultural condition, mainly in Western society, full of diverse religious expressions; second, 'postsecularity', as a reflective model for interpreting religious expressions and behaviour; and third, 'postsecularism', as a cultural-philosophical or theological programme. After elucidating the concept of the postsecular, we consider some key elements in discernment, investigating the subject, the nature and the object of discernment. We then turn to Richard Kearney's Anatheism as a fine example of how the notion of discernment receives a new usage in postsecular reflection. We ponder upon his idea of discernment as 'prereflective carnal response to the advent of the Other' and reflect upon its meaning for postsecular thinking. The concluding section offers a consideration of the meaning of Kearney's interpretation of discernment for judgement in postsecular culture.
Experimental results in Ultimatum, Trust and Social Dilemma games have been interpreted as showing that individuals are, by and large, not driven by selfish motives. But we do not need experiments to know that. In our view, what the experiments show is that the typical economic auxiliary hypothesis of non-tuism should not be generalized to other contexts. Indeed, we know that when the experimental situation is framed as a market interaction, participants will be more inclined to keep more money, share less, and disregard other participants' welfare [Hoffman et al., 1994]. When the same game is framed as a fair division one, participants overall show a much greater concern for the other parties' interests. The data thus indicate that the context of an interaction is of paramount importance in eliciting different motives. The challenge then is to model utility functions that are general enough to subsume a variety of motives and specific enough to allow for meaningful, interesting predictions to be made. For the sake of simplicity (and brevity), in what follows we will concentrate upon the results of experiments that show what appears to be individuals' disposition to behave in a fair manner in a variety of circumstances [Camerer, 2003], though what we are saying can be easily applied to other research areas. Such experimental results have been variously interpreted, each interpretation being accompanied by a specific utility function. We shall consider three such functions and the underlying interpretations that support them, and assess each one on the basis of what they claim to be able to explain and predict.
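The abstract does not name the three functions; as an illustrative example of the kind of utility function at issue, Fehr and Schmidt's inequity-aversion model gives, for the two-player case,
\[ U_i(x) = x_i - \alpha_i \max(x_j - x_i, 0) - \beta_i \max(x_i - x_j, 0), \]
where α_i weights disadvantageous and β_i advantageous inequality, so that purely selfish behaviour is the special case α_i = β_i = 0.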
At the beginning of the fifth century there was a change in the style of clothing worn by Athenian men. When Thucydides speaks of it, he first describes how the Greeks of ancient times used to carry weapons in everyday life, just as the barbarians of his own day still did. The Athenians were the first to lay weapons aside and to take up a relaxed and more luxurious way of life.
The structuralist theory of truth approximation essentially deals with truth approximation by theory revision for a fixed domain. However, variable domains can also be taken into account, where the main changes concern domain extensions and restrictions. In this paper I will present a coherent set of definitions of "more truthlikeness", "empirical progress" and "truth approximation" due to a revision of the domain of intended applications. This set of definitions seems to be the natural counterpart of the basic definitions of similar notions as far as theory revision is concerned. The formal aspects of theory revision strongly suggest an analogy between truth approximation and design research, for example, drug research. Whereas a new drug may be better for a certain disease than an old one, a certain drug may be better for another disease than for the original target disease, a phenomenon which was nicely captured by the title of a study by Rein Vos [1991]: Drugs Looking for Diseases. Similarly, truth approximation may not only take the shape of theory revision but also of domain revision, naturally suggesting the phenomenon of "theories looking for domains". However, whereas Vos documented his title with a number of examples, so far, apart from plausible cases of "truth accumulation by domain extension", I have not found clear-cut empirical instantiations of the analogy, only, as such, very interesting non-empirical examples.
Austrian immigration authorities frequently reject the family reunion applications of immigrant workers. They justify their decisions not only on legal grounds but also on the basis of their own often prejudiced judgements of the applicants' ability to 'integrate' into Austrian society. A discourse-historical method is combined with systemic-functionally oriented methods of text analysis to study the official letters which notify immigrant workers of the rejection of their family reunion applications. The systemic-functionally oriented methods are used in a detailed analysis of a sample of rejection letters, while the discourse-historical method allows this analysis to be intertextually connected to other related genres of discourse and strategies of argumentation, and to the history of post-war immigration in Austria generally.
Three related intuitions are explicated in this paper. The first is the idea that there must be some kind of probabilistic version of the HD-method, a 'Hypothetico-Probabilistic (HP-) method', in terms of something like probabilistic consequences instead of deductive consequences. According to the second intuition, the comparative application of this method should also be functional for some probabilistic kind of empirical progress, and according to the third intuition this should be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate these intuitions by explicating the crucial notions as appropriate 'concretizations' of their deductive analogs, being 'idealizations'. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison (LC-) method applied to the cumulated evidence. This method turns out to be functional not only for probabilistic empirical progress but also for probabilistic truth approximation. The latter is based on a probabilistic threshold theorem, which for this reason constitutes the analog of the deductive success theorem.
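On the abstract's own description, the comparative core can be glossed as follows (a gloss, not the paper's official definitions): given cumulated evidence e, theory Y is probabilistically more successful than theory X when
\[ p(e \mid Y) > p(e \mid X), \]
and the probabilistic threshold theorem is what links such lasting likelihood dominance to probabilistic truth approximation.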
In earlier publications of the first author it was shown that intentional explanation of actions, functional explanation of biological traits and causal explanation of abnormal events share a common structure. They are called explanation by specification (of a goal, a biological function, an abnormal causal factor, respectively) as opposed to explanation by subsumption under a law. Explanation by specification is guided by a schematic train of thought, of which the argumentative steps not concerning questions were already shown to be logically valid (elementary) arguments. Independently, the second author developed a new, inferential approach to erotetic logic, the logic of questions. In this approach arguments resulting in questions, with declarative sentences and/or other questions as premises, are analyzed, and validity of such arguments is defined.
Neo-Fregeans argue that substantial mathematics can be derived from a priori abstraction principles, Hume's Principle connecting numerical identities with one-one correspondences being a prominent example. The embarrassment of riches objection is that there is a plurality of consistent but pairwise inconsistent abstraction principles, so not all consistent abstractions can be true. This paper considers and criticizes various further criteria on acceptable abstractions proposed by Wright, settling on another one, stability, as the best bet for neo-Fregeans. However, an analogue of the embarrassment of riches objection resurfaces in the metatheory, and I conclude by arguing that the neo-Fregean program, at least insofar as it includes a platonistic ontology, is fatally wounded by it.
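For reference, Hume's Principle, the paradigmatic abstraction principle at issue:
\[ \#F = \#G \leftrightarrow F \approx G, \]
where #F denotes the number of Fs and F ≈ G says the Fs and the Gs stand in one-one correspondence; abstraction principles in general share the shape §α = §β ↔ α ∼ β for some equivalence relation ∼.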
Standard accounts of the micro-reduction of phenomenological to kinetic thermostatics, based on the postulate relating empirical absolute temperature to mean kinetic energy, ū = (3/2)kT, face two problems: the standard postulate also allows 'reduction' in the other direction, and it can be criticized from the point of view that reduction postulates need to be ontological identities. This paper presents a detailed account of the reduction, based on the postulate that thermal equilibrium is ontologically identical to having equal mean kinetic energy. In particular, it is shown that this postulate enables reduction only in the appropriate direction, but leaves room for 'evidence transport' in the other. Moreover, it makes possible the derivation (explanation) of the standard postulate, using the existential kinetic hypothesis and phenomenological laws with which it turns out to be laden.
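The standard postulate itself can be recovered from elementary kinetic theory (a standard derivation, included for orientation): for an ideal gas of N point particles, kinetic theory gives pV = (2/3)N ū, while the phenomenological ideal gas law gives pV = NkT, so equating the two yields
\[ \bar{u} = \tfrac{3}{2}\, kT. \]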