Van den Belt recently examined the notion that synthetic biology and the creation of ‘artificial’ organisms are examples of scientists ‘playing God’. Here I respond to some of the issues he raises, including some of his comments on my previous discussions of the value of the term ‘life’ as a scientific concept.
The present discussion of sociobiological approaches to ethnic nepotism takes Pierre van den Berghe's theory as a starting point. Two points, which have not been addressed in previous analyses, are considered to be of particular importance. It is argued that the behavioral mechanism of ethnic nepotism, as understood by van den Berghe, cannot explain ethnic boundaries and attitudes. In addition, I show that van den Berghe's central premise concerning ethnic nepotism contradicts Hamilton's formula, the essential principle of kin selection theory. It is further discussed how other approaches that make reference to ethnic nepotism are related to van den Berghe's account and its problems. I conclude with remarks on the evolutionary explanation of ethnic phenomena.
We revisit the characterization of the Shapley value by van den Brink (Int J Game Theory, 2001, 30:309–319) via efficiency, the Null player axiom, and a fairness axiom. In particular, we show that this characterization also works within certain classes of TU games, including the classes of superadditive and of convex games. Further, we advocate a differential version of the marginality axiom (Young, Int J Game Theory, 1985, 14:65–72), which turns out to be equivalent to the van den Brink fairness axiom on large classes of games.
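As background for the axiomatic discussion above, recall that the Shapley value of a TU game is each player's marginal contribution averaged over all orderings of the players. The sketch below is illustrative only and not taken from the paper; the function names and the three-player "glove game" are my own assumptions.

```python
from itertools import permutations

def shapley_value(players, v):
    """Shapley value by brute force: average each player's marginal
    contribution v(S ∪ {p}) - v(S) over all orderings of the players."""
    values = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            values[p] += v(with_p) - v(coalition)
            coalition = with_p
    return {p: total / len(orders) for p, total in values.items()}

# A simple three-player glove game: a coalition is worth 1 if it holds
# player 1 (the only left glove) and at least one right glove (2 or 3).
def v(S):
    return 1.0 if 1 in S and (2 in S or 3 in S) else 0.0

result = shapley_value([1, 2, 3], v)
# Player 1 (the scarce glove) gets 2/3; players 2 and 3 get 1/6 each.
# Efficiency holds: the values sum to v({1, 2, 3}) = 1.
print(result)
```

Note that the Null player axiom and efficiency are easy to check on the output: a player whose marginal contribution is always zero receives zero, and the values always sum to v of the grand coalition.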
The meaning of sustainability is the subject of intense debate among environmental and resource economists. Perhaps no other issue separates more clearly the traditional economic view from the views of most natural scientists. The debate currently focuses on the substitutability between the economy and the environment, or between “natural capital” and “manufactured capital”, a debate captured in terms of weak versus strong sustainability. In this article, we examine the various interpretations of these concepts. We conclude that natural science and economic perspectives on sustainability are inconsistent. The market-based Hartwick-Solow “weak sustainability” approach is far removed from both the ecosystem-based “Holling sustainability” and the “strong sustainability” approach of Daly and others. Each of these sustainability criteria implies a specific valuation approach, and thus an ethical position, to support monetary indicators of sustainability such as a green or sustainable Gross Domestic Product (GDP). The conflict between “weak sustainability” and “strong sustainability” is more evident in the context of centralized than decentralized decision making. In particular, when firms sell “services” instead of material goods and regard the latter as “capital”, their decisions become more or less consistent with either type of sustainability. Finally, we discuss the implications of global sustainability for such open systems as regions and countries. Open systems have not been dealt with systematically for any of the sustainability criteria.
Computer ethicists have for some years been troubled by the issue of how to assign moral responsibility for disastrous events involving erroneous information generated by expert information systems. Recently, Jeroen van den Hoven has argued that agents working with expert information systems satisfy the conditions for what he calls epistemic enslavement. Epistemically enslaved agents do not, he argues, have moral responsibility for accidents for which they bear causal responsibility. In this article, I develop two objections to van den Hoven’s argument for epistemic enslavement of agents working with expert information systems.
The development of ever smaller integrated circuits at the sub-micron and nanoscale, in accordance with Moore’s Law, drives the production of very small tags, smart cards, smart labels, and sensors. Nanoelectronics and submicron technology support surveillance technology that is practically invisible. I argue that one of the most urgent and immediate concerns associated with nanotechnology is privacy. Computing in the twenty-first century will not only be pervasive and ubiquitous, but also inconspicuous. If these features are not counteracted in design, they will facilitate ubiquitous surveillance practices which are widely available, cheap, and intrusive. RFID technology is an instructive example of what nanotechnology has in store for privacy.
The impact of the Internet on democracy is a widely discussed subject. Many writers view the Internet, potentially at least, as a boon to democracy and democratic practices. According to one popular theme, both e-mail and web pages give ordinary people powers of communication that have hitherto been the preserve of the relatively wealthy (Graham 1999, p. 79). So the Internet can be expected to close the influence gap between wealthy citizens and ordinary citizens, a weakness of many procedural democracies.
Ethical issues of information and communication technologies (ICTs) are important because they can have significant effects on human liberty, happiness, and people’s ability to lead a good life. They are also of functional interest because they can determine whether technologies are used and whether their positive potential can unfold. For these reasons, policy makers are interested in finding out what these issues are and how they can be addressed. The best way of creating ICT policy that is sensitive to ethical issues is to address such issues proactively at an early stage of the technology life cycle. The present paper uses this position as a starting point and discusses how knowledge of ethical aspects of emerging ICTs can be gained. It develops a methodology that goes beyond established futures methodologies to cater for the difficult nature of ethical issues. The authors outline how the description of emerging ICTs can be used for an ethical analysis.
Ethical issues of information and communication technologies (ICTs) are important because they can have significant effects on human liberty, happiness, and people’s ability to lead a good life. They are also of functional interest because they can determine whether technologies are used and whether their positive potential can unfold. For these reasons, policy makers are interested in finding out what these issues are and how they can be addressed. The best way of creating ICT policy that is sensitive to ethical issues would be to be proactive and address such issues at early stages of the technology life cycle. The present paper uses this position as a starting point and discusses how knowledge of ethical aspects of emerging ICTs can be gained. It develops a methodology that goes beyond established futures methodologies to cater for the difficult nature of ethical issues. The paper goes on to outline some of the preliminary findings of a European research project that has applied this method.
We argue that nanotechnology in the form of invisible tags, sensors, and Radio Frequency Identification chips (RFIDs) will give rise to privacy issues that differ in two ways from the traditional privacy issues of recent decades. First, they will not exclusively revolve around the idea of centralization of surveillance and concentration of power, as the metaphor of the Panopticon suggests, but will be about constant observation at decentralized levels. Second, privacy concerns may not exclusively be about constraining information flows but also about the design of materials and nano-artifacts such as chips and tags. We begin by presenting a framework for structuring the current debates on privacy, and then present our arguments.
It is argued that Pettit’s conception of “contestatory democracy” is superior to deliberative, direct, and epistemic democracy. The strong and weak points of these conceptions are discussed, drawing upon the work of, among others, Bruce Bimber. It is further argued that ‘contestation’ and ‘information’ are highly relevant notions in thinking about just, viable, and sustainable design for e-democracy.
When thinking about ethics, technology is often only mentioned as the source of our problems, not as a potential solution to our moral dilemmas. When thinking about technology, ethics is often only mentioned as a constraint on developments, not as a source and spring of innovation. In this paper, we argue that ethics can be a source of technological development rather than just a constraint, and that technological progress can create moral progress rather than just moral problems. We show this by analyzing how technology can contribute to the solution of so-called moral overload or moral dilemmas. Such dilemmas typically create a moral residue that is the basis of a second-order principle telling us to reshape the world so that we can meet all our moral obligations. We can do so, among other things, through guided technological innovation.
Although applications are being developed and have reached the market, nanopharmacy is to date generally still conceived as an emerging technology, and its concept is ill-defined. Nanopharmacy can also be construed as a converging technology, which combines features of multiple technologies ranging from nanotechnology to medicine and ICT. It is still debated whether its features give rise to new ethical issues or whether the issues associated with nanopharmacy are merely an extension of existing issues in the underlying fields. We argue here that, regardless of the alleged newness of the ethical issues involved, developments occasioned by technological advances affect the roles played by stakeholders in the field of nanopharmacy to such an extent that a different approach to responsible innovation is called for in this field. Specific features associated with nanopharmacy itself, and features introduced by the associated converging technologies, bring about a shift in the roles of stakeholders that calls for a different approach to responsibility. We suggest that Value Sensitive Design is a suitable framework for involving stakeholders in addressing moral issues responsibly at an early stage of the development of new nanopharmaceuticals.
In this paper, we consider the meaning, roles, and uses of trust in the economic and public domain, focusing on the task of designing systems for trust in information technology. We analyze this task by means of a survey of what trust means in the economic and public domain, using the model proposed by Lewicki and Bunker, and using the emerging paradigm of value-sensitive design. We explore the difficulties developers face when designing information technology for trust and show how our analysis in conjunction with existing engineering design methods provides means to address these difficulties. Our main case concerns a concrete problem in the economic domain, namely the transfer of control from customs agencies to companies. Control of individual items is increasingly untenable and is replaced by control on the level of companies aimed at determining whether companies can be trusted to be in control of their business and to be in compliance with applicable regulations. This transfer sets the task for companies to establish this trust by means of information technology systems. We argue that this trust can be achieved by taking into account philosophical analyses of trust and by including both parties in the trust relationship as clients for whom the information technology systems are to be designed.
In this article we argue that discourse structure constrains the set of possible constituents in a discourse that can provide the relevant context for structuring information in a target sentence, while information structure critically constrains discourse structure ambiguity. For the speaker, the discourse structure provides a set of possible contexts for continuation, while information structure assignment is independent of discourse structure. For the hearer, the information structure of a sentence together with discourse structure instructs dynamic semantics how rhematic information should be used to update the meaning representation of the discourse (Polanyi and van den Berg, 1996).
Marcin Lewinski: Internet Political Discussion Forums as an Argumentative Activity Type. A Pragma-dialectical Analysis of Online Forms of Strategic Manoeuvring in Reacting Critically. Reviewed by Paul van den Hoven (Utrecht University, Utrecht, The Netherlands). Argumentation, vol. 25, no. 2, pp. 255–259. DOI 10.1007/s10503-011-9201-3. Online ISSN 1572-8374, Print ISSN 0920-427X.
Editorial: Concepts of Animal Welfare. Kristin Hagen (Europäische Akademie zur Erforschung von Folgen wissenschaftlich-technischer Entwicklungen, Bad Neuenahr-Ahrweiler, Germany), Ruud van den Bos (Behavioural Neuroscience, Animals in Science and Society, Faculty of Veterinary Medicine, Rudolf Magnus Institute of Neuroscience, Utrecht University, The Netherlands), and Tjard de Cock Buning (Department of Biology and Society, ATHENA Institute, Faculty of Earth and Life Sciences, Vrije Universiteit, Amsterdam, The Netherlands). Acta Biotheoretica, vol. 59, no. 2, pp. 93–103. DOI 10.1007/s10441-011-9134-0. Online ISSN 1572-8358, Print ISSN 0001-5342.
In this paper, I argue against Peter van Inwagen’s claim (in “Free Will Remains a Mystery”) that agent-causal views of free will could do nothing to solve the problem of free will (specifically, the problem of chanciness). After explaining van Inwagen’s argument, I argue that he does not consider all possible manifestations of the agent-causal position. More importantly, I claim that, in any case, van Inwagen appears to have mischaracterized the problem in some crucial ways. Once we are clear on the true nature of the problem of chanciness, agent-causal views do much to eradicate it.
I. Introduction. “We can and do see the truth about many things: ourselves, others, trees and animals, clouds and rivers—in the immediacy of experience.” Absent from Bas van Fraassen’s list of those things we see are paramecia and mitochondria. We do not see such things, van Fraassen has long maintained, because they are unobservable, that is, they are undetectable by means of the unaided senses. But notice that these two notions, what we can see in the “immediacy” of experience and what is detectable by means of the unaided senses, are not the same. There is no incoherence in maintaining that the immediacy of experience is capable of disclosing to us truths concerning entities that are not detectable by the naked eye. And so, I claim, it does; science and technology provide us with the means to see things we have never seen before. Some of those things are van Fraassen’s unobservables. That suggestion is nothing new. Grover Maxwell long ago emphasized the continuity between seeing with and without instrumentation. Van Fraassen originally provided two responses to Maxwell’s arguments: some things that you can see with instruments you can also see without instruments (and those are the observables); and…
The anti-reductionist who wants to preserve the causal efficacy of mental phenomena faces several problems in regard to mental causation, i.e. mental events which cause other events, arising from her desire to accept the ontological primacy of the physical and at the same time save the special character of the mental. Psychology tries to persuade us of the former, appealing thereby to the results of experiments carried out in neurology; the latter is, however, deeply rooted in our everyday actions and beliefs and, despite the constant opposition of science, still very much alive. Difficulties arise from a combination of two claims that are widely accepted in philosophy of mind, namely physical monism and mental realism, the acceptance of which leads us to the greatest problem of mental causation: the problem of causal exclusion. Since physical causes alone are always sufficient for physical effects, mental properties are excluded from causal explanations of our behaviour, which makes them “epiphenomenal”. The article introduces Van Gulick’s solution to the exclusion problem, which tries to prove that physical properties, in contrast to mental properties, do not have as privileged a status with respect to event causation as is usually ascribed to them. Therefore, it makes no sense to say that physical properties are causally relevant whereas mental properties are not. This is followed by my objection to his argument for levelling mental and physical properties with respect to the causation of events. I try to show that Van Gulick’s argument rests on a premise that no serious physicalist can accept.
Van Heijenoort’s main contribution to history and philosophy of modern logic was his distinction between two basic views of logic, first, the absolutist, or universalist, view of the founding fathers, Frege, Peano, and Russell, which dominated the first, classical period of history of modern logic, and, second, the relativist, or model-theoretic, view, inherited from Boole, Schröder, and Löwenheim, which has dominated the second, contemporary period of that history. In my paper, I present the man Jean van Heijenoort (Sect. 1); then I describe his way of arguing for the second view (Sect. 2); and finally I come down in favor of the first view (Sect. 3). There, I specify the version of universalism for which I am prepared to argue (Sect. 3, introduction). Choosing ZFC to play the part of universal, logical (in a nowadays forgotten sense) system, I show, through an example, how the usual model theory can be naturally given its proper place, from the universalist point of view, in the logical framework of ZFC; I outline another, not rival but complementary, semantics for admissible extensions of ZFC in the very same logical framework; I propose a way to get universalism out of the predicaments in which universalists themselves believed it to be (Sect. 3.1). Thus, if universalists of the classical period did not, in fact, construct these semantics, it was not that their universalism forbade them, in principle, to do so. The historical defeat of universalism was not technical in character. Neither was it philosophical. Indeed, it was hardly more than the victory of technicism over the very possibility of a philosophical dispute (Sect. 3.2).
Van Heijenoort’s account of the historical development of modern logic was composed in 1974 and first published in 1992 with an introduction by his former student. What follows is a new edition with a revised and expanded introduction and additional notes.
The paper aims at drawing the main lines of a reflection on architectonic space, starting from a comparison between two very different hypotheses: Theodor Lipps’ spatial aesthetics and Hans van der Laan’s elemental theory. The emphasis given by both authors to the intersection between directions and way, but also to the mutual subordination between thing and space, allows us to rewrite the obituary of architecture as a spatial art, according to which the Modern Style has turned spatiality into its specular visibility, into spaciousness, into the indefinite continuity of Bigness.
The paper takes as its starting point a dense and notorious quote by Lacan where he takes up in a single gesture three concepts of ancient philosophy: tyche, clinamen, and den. The contention is that all three aim at the status of the object, although by different means and in different philosophical contexts, and the paper tries to spell out some crucial points concerning each. Tyche, usually translated as chance and put into opposition with automaton, requires a reading of the passages of Aristotle’s Physics from which Lacan took it, and an account of the problem of repetition in psychoanalysis. Clinamen, the swerve, stemming from Epicurus and Lucretius, requires a condensed reading of the tradition that took it up, from Cicero to Hegel, Marx, Deleuze, and Badiou, pinpointing the dilemmas and contradictions of this tradition. Den, stemming from Democritus, who coined this neologism, brings up an entity which is neither being nor nothing, neither one nor zero nor multiple. It is perhaps the best evocation, at the dawn of philosophy, of what Lacan would call object a, and it allows one to sidestep the difficulties and pitfalls presented by the other two notions. The paper tries to pin down the minimal requirement for the Lacanian theory with the irreducible and incommensurable (non)relation of ‘minus one’ and den.
Since the first volume appeared in 2005, the collection Controversies has brought together work related to the field of argumentation, giving particular attention to contributions concerned with theoretical and practical problems connected with discursive controversy and confrontation. Authors such as P. Barrotta, M. Dascal, S. Frogel, H. Chang, and D. Walton have edited or written previous volumes of the collection. F. H. van Eemeren and B. Garssen (the former had already, with P. Houtlosser, edited the second volume of this collection) are responsible for compiling and editing the present, sixth volume. In this volume, van Eemeren and Garssen edit works they conceive as being akin to those elements which, in argumentation discourse, serve to resolve, or often to present, differences of opinion. It should be added, however, that this is not a mere editing job, but rather the result of an intellectual collaboration between two international research groups dedicated to a common field, consisting, on the one hand, of controversies and, on the other, of argumentation.
http://dx.doi.org/10.5007/1808-1711.2008v12n1p49 The aim of this article is to offer a rejoinder to an argument against scientific realism put forward by van Fraassen, based on theoretical considerations regarding microphysics. At a certain stage of his general attack on scientific realism, van Fraassen argues, in contrast to what realists typically hold, that empirical regularities should sometimes be regarded as “brute facts”, which do not call for explanation in terms of deeper, unobservable mechanisms. The argument from microphysics formulated by van Fraassen is based on the claim that in microphysics the demand for explanation leads to a demand for so-called hidden-variable theories, which “runs contrary to at least one major school of thought in twentieth-century physics”. It is shown here that this argument does not represent an insurmountable obstacle to scientific realism, not even when a series of important theoretical and experimental results against hidden-variable theories, and not merely a conflict with a certain school of thought, is taken into account.
http://dx.doi.org/10.5007/1808-1711.2008v12n2p121 The aim of this paper is to discuss and develop van Fraassen’s (1987, p. 110) diagnosis of the Hardy-Weinberg law, according to which: 1) it cannot be considered a law to be used as an axiom of population genetics theory, since it is an equilibrium law that holds only under certain special conditions; 2) it only determines a subclass of models; 3) its generalization turns out to be vacuous; and 4) more complex variants of the law can be deduced for more realistic assumptions. The discussion and development of this diagnosis is carried out on the basis of notions proposed by another semantic conception akin to the one developed by van Fraassen, namely the structuralist view of theories, together with a reconstruction of classical population genetics within the framework of such a metatheory, which is also presented in this paper.
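The equilibrium character noted in point (1) above can be made concrete in a few lines. The sketch below is my own illustration, not from the paper, and the function names are hypothetical: for a biallelic locus with allele frequencies p and q = 1 − p, random mating yields genotype frequencies p², 2pq, q², and the allele frequency is then invariant across generations, which is exactly why the law describes an equilibrium that holds only under special conditions (no selection, mutation, migration, or drift).

```python
def hardy_weinberg(p):
    """Genotype frequencies (AA, Aa, aa) under Hardy-Weinberg
    equilibrium for a biallelic locus with allele frequency p."""
    q = 1.0 - p
    return p * p, 2.0 * p * q, q * q

def next_allele_freq(AA, Aa, aa):
    """Frequency of allele A in the next generation under random
    mating: homozygotes contribute fully, heterozygotes half."""
    return AA + 0.5 * Aa

p = 0.7
AA, Aa, aa = hardy_weinberg(p)
assert abs(AA + Aa + aa - 1.0) < 1e-12       # frequencies sum to 1
assert abs(next_allele_freq(AA, Aa, aa) - p) < 1e-12  # p is unchanged: equilibrium
```

The invariance shown in the last assertion is what makes the unrestricted generalization vacuous (point 3): once the special conditions hold, the proportions persist trivially.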
According to the dominant conception of causation, spatiotemporally locatable events that can be designated by singular terms and definite descriptions are the only genuine relata of the causal relation. This supports, and is supported by, the accepted dichotomy between causal explanation, conceived as an intensional relation between facts or truths, and the natural, extensional relation of causation. The essay questions this view and argues for the legitimacy of the notion of causation by facts: the relata of many relations expressed by the sentential connective ‘(The fact) C causes (the fact) E’ can be genuine causes and effects (I). This expanded view of causation is then applied to the problem of mental causation. Assuming the truth of physical realizationism, the essay explores the connection between the causal efficacy and the counterfactual relevance of properties. It is shown that, at least in many cases, the correct counterfactual links required by causation can be found only at the level of realized facts, not at the more basic level of realizing facts (II). Finally, given the similarities between the defense of non-reductive physicalism sketched here and the less modest attempts at scientific justification of the claims of metaphysical materialism, justly criticized by van Fraassen as manifestations of ‘false consciousness’, it is considered whether and how the main argument of the essay can avoid van Fraassen’s critical judgment (III).
I define serendipity as the art of making an unsought finding. And I propose an overview of my collection of serendipities, the largest yet assembled, chiefly in science and technology, but also in art, by giving a list of ‘serendipity patterns’. Although my list of ‘patterns’ is just a list and not a classification, it serves to introduce a new and possibly stimulating perspective on the old subject of serendipity. Knowledge of these ‘serendipity patterns’ might help in expecting also the unexpected and in finding also the unsought. * I acknowledge A. D. de Groot, R. C. M. Noordam, B. P. van Heusden, T. Pinkster, C. J. van den Berg, T. A. F. Kuipers, A. Wegener Sleeswijk, and my referee for their suggestions, and I dedicate this article to T. A. van Kooten. Cases and studies of serendipity are welcome. À propos: a travelling serendipity exhibition is available, also for ‘new democracies’: ‘Freedom of opportunity as developed by democracy is the best human reaction to divergent phenomena. We may, in fact, define “freedom” as “the opportunity to profit from the unexpected.”’ (Langmuir).
It is argued that, contrary to prevailing opinion, Bas van Fraassen nowhere uses the argument from underdetermination in his argument for constructive empiricism. It is explained that van Fraassen’s use of the notion of empirical equivalence in The Scientific Image has been widely misunderstood. A reconstruction of the main arguments for constructive empiricism is offered, showing how the passages that have been taken to be part of an appeal to the argument from underdetermination should actually be interpreted.
Over the last twenty years, Bas van Fraassen has developed a “new epistemology”: an attempt to sail between Bayesianism and traditional epistemology. He calls his own alternative “voluntarism”. A constant pillar of his thought is that rationality involves permission rather than obligation. The present paper aims to offer an appraisal of van Fraassen’s conception of rationality. In section 2, I review the Bayesian structural conception of rationality and argue that it has been found wanting. In sections 3 and 4, I analyse van Fraassen’s voluntarism. I raise some objections to van Fraassen’s reliance on prior opinion and argue that the content of a belief matters to its rationality. In section 5, I criticise van Fraassen’s view that inference to the best explanation is incoherent. Finally, in section 6, I take on van Fraassen’s conception of rationality and show that it is too thin to fully capture rational judgement.
Project: In developing his “constructive empiricism”, Bas van Fraassen has become an indispensable reference for contemporary philosophy of science. After the wave of criticism that, around the 1960s, cost logical empiricism its predominance in the field of ideas, scientific realism seemed to have imposed itself as the only acceptable account of the work and orientations of scientific research. Although Quine had stated what an empiricism freed from its two “dogmas” (the inviolability of the analytic/synthetic distinction, and the reduction of constructions to “facts”) might look like, the program of a renewed empiricist philosophy of science remained a sketch. But in three successive works, The Scientific Image (1980), Laws and Symmetry (1989), and Quantum Mechanics: An Empiricist View (1991), van Fraassen laid the foundations of a viable empiricism, one capable of taking on board most of the specific features that realism claims against classical or logical empiricism, and of accounting for the most recent developments in physics. Against classical or logical empiricism, realists first point out that reducing all reality and every act of reference to phenomena does justice neither to the practice of ordinary language nor to that of the sciences. When someone performs an act of naming, they do not thereby seek to designate a slice of appearance, or some finite, catalogued set of appearances; they point toward “something” whose modes of manifestation, without assignable end, are partly anticipated and partly open.
Likewise, when a scientific researcher speaks of the object of his investigations, he does not limit his discourse to a finite set of experimental results obtained under currently available instrumental conditions; he refers to an entity whose variety of future manifestations is predicted as completely as possible (and with increasing success) by revisable conceptual and theoretical frameworks. In the face of this objection, van Fraassen gives models a crucial role in his version of empiricism.