This paper critically examines the forays into metaphysics of The Dual Nature of Technical Artifacts Program (henceforth, DNP). I argue that the work of DNP is a valuable contribution to the epistemology of certain aspects of artifact design and use, but that it fails to advance a persuasive metaphysic. A central problem is that DNP approaches ontology from within a functionalist framework that is mainly concerned with ascriptions and justified beliefs. Thus, the materiality of artifacts emerges only as the external conditions of realizability of function ascription. The work of DNP has a strong programmatic aspect and much of its foray into metaphysics is tentative, so the intent of my argument is partly synthetic: to sum up these issues as they are presented in the literature and highlight some recognized problems. But I also go beyond that, suggesting that these problems are foundational, arising from the very way in which DNP poses the question of artifact metaphysics. Although it sets out to incorporate objective aspects of technology, DNP places a strong focus on the intentional side of the purported matter-mind duality, bracketing off materiality in an irretrievable manner. Thus, some of the advantages of dualism are lost. I claim that DNP is dualistic, not merely based on “duality”, but that its version of dualism does not appropriately account for the material “nature” of artifacts. The paper ends by suggesting some correctives and alternatives to Dual Nature theory.
Taken at face value, a programming language is defined by a formal grammar. But, clearly, there is more to it. By themselves, the naked strings of the language do not determine when a program is correct relative to some specification. For this, the constructs of the language must be given some semantic content. Moreover, to be employed to generate physical computations, a programming language must have a physical implementation. How are we to conceptualize this complex package? Ontologically, what kind of thing is it? In this paper, we shall argue that an appropriate conceptualization is furnished by the notion of a technical artifact.
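The abstract's point that "naked strings" underdetermine correctness can be illustrated with a small sketch (mine, not the paper's): a toy grammar whose well-formed strings receive different meanings under two hypothetical semantic interpretations, so the grammar alone cannot settle what a program computes.

```python
# Hypothetical illustration (not from the paper): one toy grammar,
# two semantics. Grammar: expr ::= INT | INT "^" expr
def parse(s):
    # Parse right-associatively into a nested tuple AST.
    left, sep, right = s.partition("^")
    return ("op", int(left), parse(right)) if sep else int(left)

def eval_power(ast):
    # Semantics 1: "^" denotes exponentiation.
    if isinstance(ast, int):
        return ast
    _, a, b = ast
    return a ** eval_power(b)

def eval_xor(ast):
    # Semantics 2: "^" denotes bitwise XOR.
    if isinstance(ast, int):
        return ast
    _, a, b = ast
    return a ^ eval_xor(b)

program = "2^3"          # syntactically well-formed under the grammar
print(eval_power(parse(program)))  # 8
print(eval_xor(parse(program)))    # 1
```

The string "2^3" is grammatical under both readings, yet whether a program using it meets a specification ("compute the cube of two") depends entirely on which semantics is in force.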
I want to explore four different exercises of interpretation: (1) the interpretation of texts (or hermeneutics), (2) the interpretation of people (otherwise known as "attribution" psychology, or cognitive or intentional psychology), (3) the interpretation of other artifacts (which I shall call artifact hermeneutics), (4) the interpretation of organism design in evolutionary biology--the controversial interpretive activity known as adaptationism.
Artifacts are objects intentionally made to serve a given purpose; natural objects come into being without human intervention. I shall argue that this difference does not signal any ontological deficiency in artifacts qua artifacts. After sketching my view of artifacts as ordinary objects, I’ll argue that ways of demarcating genuine substances do not draw a line with artifacts on one side and natural objects on the other. Finally, I’ll suggest that philosophers have downgraded artifacts because they think of metaphysics as resting on a distinction between what is “mind-independent” and what is “mind-dependent.” I’ll challenge the use of any such distinction as a foundation for metaphysics.
Beginning with Aristotle, philosophers have taken artifacts to be ontologically deficient. This paper proposes a theory of artifacts, according to which artifacts are ontologically on a par with other material objects. I formulate a nonreductive theory that regards artifacts as constituted by - but not identical to - aggregates of particles. After setting out the theory, I rebut a number of arguments that disparage the ontological status of artifacts.
In this reply to Professor Hookway’s lecture the comments are focused, first, on the topic of what dichotomies really are, since it is an illuminating way of understanding pragmatism in general and Putnam’s pragmatism in particular. Dichotomies are artifacts that we devise with some useful purpose in mind, but when inflated into absolute dichotomies they become metaphysical bogeys, as is illustrated by the twentieth-century distinction between fact and value. Secondly, a brief comment on the so-called “thick” ethical concepts and artifact terms is presented, and finally a word is added on John L. Austin, whose approach to dichotomies is aligned with pragmatism and Putnam.
In his Ideas II, Husserl interprets the apprehension of cultural objects by comparing it to that of the human “flesh” and “spirit.” Such objects are not just “bodies” (Körper) to which a sense is exteriorly added, but instead they are, similarly to human bodies (Leiber), entirely “animated” by a cultural meaning. In fact, this is not just an analogy for Husserl, since, in several of his later notations, he comes to show that cultural objects are actually understood as such by means of an apperception employing empathy, as sediments of subjective acts and performances. Understood as cultural objects, images also point towards a previous subjective doing, and it is precisely by grasping this “pointing” that we comprehend them in their proper significance as artifacts. In my paper, I would like to explore the nature and forms of this empathic “pointing,” focusing on the possible use of Husserl's conception for an interpretation of non-figurative art.
In this paper I criticise a recent account of fictional discourse proposed by Nathan Salmon. Salmon invokes abstract artifacts as the referents of fictional names in both object- and meta-fictional discourse alike. He then invokes a theory of pretence to forge the requisite connection between object-fictional sentences and meta-fictional sentences, in virtue of which the latter can be assigned appropriate truth-values. I argue that Salmon's account of pretence renders his appeal to abstract artifacts as the referents of fictional names in object-fictional discourse explanatorily redundant. I further argue that his account is therefore no improvement over those he criticises, thus leaving his own account unmotivated.
Technical artifacts are embedded in social systems and, to some extent, even shape them. This chapter inquires, then, whether designing artifacts may be regarded as a contribution to social design. I explicate a concept of general design that conceives design as the type fixation of a complex entity. This allows for an analysis of different contributions to the design of social systems without favoring the intended effects of artifacts on a system over those effects that actually show up. First, the clear-cut case of socio-technical systems is considered. Here, functions of artifacts can be planned fairly precisely. In societies, in contrast, the actual functions of an artifact can hardly be predicted, which is due to strong self-organizing processes. Nevertheless, artifact design can be shown to contribute to the design of the system in this case as well.
Relativists maintain that identity is always relative to a general term (RI). According to them, the notion of absolute identity has to be abandoned and replaced by a multiplicity of relative identity relations for which Leibniz’s Law does not hold. For relativists RI is at least as good as the Fregean cardinality thesis (FC), which contends that an ascription of cardinality is always relative to a concept specifying what, in any specific case, counts as a unit. The same train of thought on cardinality and identity is apparent among those – Artifactualists – who take relative identity sentences for artifacts as the norm. The aim of this paper is (i) to criticize the thesis (T1) that from FC it is possible to derive RI, and (ii) to explain why Artifactualists mistakenly believe that RI can be derived from FC. The misunderstanding derives from their assumption that the concept of artifact – like the concept of object – is not a sortal concept.
In his best-selling The Time Falling Bodies Take to Light, William Irwin Thompson intrigued readers with his thoughts on mythology and sexuality. In his newest book, Coming Into Being: Artifacts and Texts in the Evolution of Consciousness, he takes the reader on a journey through the evolution of consciousness from the preverbal communications of early stone carvings, to the writings of Marcel Proust, around the monumental wrappings of Christo and up to the rebirth of interest in the Taoist philosophy of Lao Tzu. Owing as much to the rhythmic constructions of jazz as to established methods of scholarship, Thompson plays a riff on biology and culture, seeing the birth of the mind in Proust’s madeleine, the displacement of humanity in Christo’s wrapping of the Reichstag and, in Lao Tzu’s Tao Te Ching, the path forward to a new planetary culture. In Coming Into Being, William Irwin Thompson presents a fascinating vision of our past, our present, and our future that no one will want to miss.
Philosophers such as Eric Katz and Robert Elliot have argued against ecological restoration on the grounds that restored landscapes are no longer natural. Katz calls them “artifacts,” but the sharp distinction between nature and artifact doesn’t hold up. Why should the products of one particular natural species be seen as somehow escaping nature? Katz’s account identifies an artifact too tightly with the intentions of its creator: artifacts always have more to them than what their creators intended, and furthermore the intention behind some artifacts might explicitly be to allow things to happen unpredictably. Indeed, to build any artifact is to employ forces that go beyond the builder: in this sense all artifacts are natural. Recognizing the naturalness of artifacts can help encourage the key environmental virtues of self-knowledge and humility.
This paper aims to critically discuss the versatility of Popper’s theory of three worlds in the analysis of issues related to the ontological status and character of technical artifacts. Despite having been discussed for years and subjected to numerous criticisms, Popper’s thesis is still little known to have an important bearing on the philosophical characterization of technical artifacts. His key perspectives on the reality, autonomy, and ontological status of artifacts are rarely taken into consideration by scholars known to be engaged in the study of artifacts. This paper consists of two main sections. The first section attempts to present a critical exposition of Popper’s account of the reality and (partial) autonomy of artifacts. Recent discussions about the longstanding distinction between natural objects and artifacts are brought up and the relevance of Popper’s pluralistic thesis to this debate is pointed out. In addition, attention is drawn towards how to read his notion of the autonomy of artifacts. The primary emphasis of the second section is the ontological position of artifacts. Two separate arguments are posed to challenge the dual ontological status of what Popper called ‘embodied’ World 3 objects or artifacts. The first argument is concerned with the material composition and characteristic features of artifacts. The second one addresses the creative and epistemic value of these artificial products.
Mathematical theorems are cultural artifacts and may be interpreted much as works of art, literature, and tool-and-craft are interpreted. The Fundamental Theorem of the Calculus, the Central Limit Theorem of Statistics, and the Statistical Continuum Limit of field theories, all show how the world may be put together through the arithmetic addition of suitably prescribed parts (velocities, variances, and renormalizations and scaled blocks, respectively). In the limit — of smoothness, statistical independence, and large N — higher-order parts, such as accelerations, are, for the most part, irrelevant, affirming that, in the end, most of the world's particulars may be averaged over (a very un-Scriptural point of view). (We work out all of this in technical detail, including a nice geometric picture of stochastic integration, and a method of calculating the variance of the sum of dependent random variables using renormalization group ideas.) These fundamental theorems affirm a culture that is additive, ahistorical, Cartesian, and continuist, sharing in what might be called a species of modern culture. We understand mathematical results as useful because, like many other such artifacts, they have been adapted to fit the world, and the world has been adapted to fit their capacities. Such cultural interpretation is in effect motivation for the mathematics, and might well be offered to students as a way of helping them understand what is going on at the blackboard. Philosophy of mathematics might want to pay more attention to the history and detailed technical features of sophisticated mathematics, as a balance to the usual concerns that arise in formalist or even Platonist positions.
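The "arithmetic addition of suitably prescribed parts" the abstract invokes for the statistical case can be stated in standard textbook form (my formulation, not the paper's): variances of independent parts add, and in the large-N limit the sum's fluctuations are Gaussian.

```latex
% Variance additivity and the Central Limit Theorem, in a standard
% textbook statement (a sketch for orientation, not the paper's own
% formulation). For independent random variables X_1, ..., X_n:
\[
  \operatorname{Var}\!\Big(\sum_{i=1}^{n} X_i\Big)
  = \sum_{i=1}^{n} \operatorname{Var}(X_i),
\]
% and, for i.i.d. X_i with mean \mu and finite variance \sigma^2,
\[
  \frac{1}{\sigma\sqrt{n}} \sum_{i=1}^{n} (X_i - \mu)
  \;\xrightarrow{d}\; \mathcal{N}(0,1)
  \qquad \text{as } n \to \infty .
\]
```

The first identity is the additivity of "suitably prescribed parts" (variances); the second is the averaging-over of higher-order particulars in the limit of large N and statistical independence.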
The passion of inequality exists in the discourse that binds people by their adhesion to beliefs about the hierarchic distribution of positions in society. In this manner the differences that structure the (apparently) natural titles to be governed or to govern are put in a state of aggregation. The apparent naturalness of these titles masks a principle of equality, a necessary artifact that breaches the nature of the social bond. This article argues that despite the hegemonic pressure of inequality, the situation of apprenticeship can contain events of emancipation. For this, the 'master' must articulate two complementary artifacts in his relation with his apprentice. First, he must occupy a posture of ignorance. Second, he must ascribe priority to the object to be known or listened to (the text, the words ...) in such a way that a connection is made between the intelligences that he is interrogating and, thus, that their equality is verified.
In the near future, our lives will ordinarily be surrounded by fairly complicated artifacts, enabled by autonomous robot and brain–machine interface technologies. In this paper, we argue that what we call the responsibility flaw problem and the inappropriate use problem need to be overcome in order for us to benefit from complicated artifacts. In order to solve these problems, we propose an approach to endowing artifacts with an ability of socially communicating with other agents based on the artifact-as-a-half-mirror metaphor. The idea is to have future artifacts behave according to a hybrid intention composed of the owner's intention and the social rules. We outline the approach and discuss its feasibility together with preliminary work.
The lack of symmetry in some stone tools does not entail that the hominids who made them were incapable of doing better. By analogy, the differences between the stone tools of early humans and modern technology arose without genetic change. A conservative assumption is that the symmetry of stone artifacts may have arisen simply because symmetrical tools work better when used for striking and chopping rather than scraping.
Designers’ intentions are important for determining an artifact’s proper function (i.e., its perceived real function). However, there are disagreements regarding why. In one view, people reason causally about artifacts’ functional outcomes, and designers’ intended functions become important to the extent that they allow inferring outcomes. In another view, people use knowledge of designers’ intentions to determine proper functions, but this is unrelated to causal reasoning, having perhaps to do with intentional or social forms of reasoning (e.g., authority). Regarding these latter social factors, researchers have proposed that designers’ intentions operate through a mechanism akin to social conventions, and that therefore both are determinants of proper function. In the current work, participants learned about an object’s creation, about social conventions for its use and about a specific episode where the artifact was used. The function implemented by the user could be aligned with the designer’s intended function, the social convention, both, or neither (i.e., an opportunistic use). Importantly, the use episode always resulted in an accident. Data show that the accident negatively affected proper function judgments and perceived efficiency for conventional and opportunistic functions, but not for designers’ intended functions. This is inconsistent with the view that designers’ intentions are conceptualized as causes of functional outcomes and with the idea that designers’ intentions and social conventions operate through a common mechanism.
Peter Brugger & Kirsten Taylor (B&T) regard positive extrasensory perception (ESP) test results as methodical artifacts. In their view, sequences of guessing, e.g. of symbol cards, being non-random, overlap with finite sequences of non-random targets, and surpluses of hits above chance are deemed to be due to correlated non-randomness. The present author's ESP test data, obtained from his 'ball drawing test' applied with N = 231 psychology majors, were used for testing five hypotheses derived from B&T's claims. B&T would expect increased hit rates given intra-systemic pattern correlation of both guesses with guesses and targets with targets, which are the conditions most favourable for B&T's matching mechanism. But hit rates do not increase under such conditions; they decrease significantly. Moreover, Brugger's 1992 result does not replicate. B&T's 'deadly blow' directed at parapsychology turns out to be a boomerang. The authors wanted to get a 'phantom slain'. They got one slain - their own.
Heng Xian is a previously unknown text reconstructed by Chinese scholars out of a group of more than 1,200 inscribed bamboo strips purchased by the Shanghai Museum on the Hong Kong antiquities market in 1994. The strips have all been assigned an approximate date of 300 B.C.E., and Heng Xian allegedly consists of thirteen of them, but each proposed arrangement of the strips is marred by unlikely textual transitions. The most plausible hypothesis is one that Chinese scholars do not appear to take seriously: that we are missing one or more strips. The paper concludes with a discussion of the hazards of studying unprovenanced artifacts that have appeared during China’s recent looting spree. I believe the time has come for scholars to ask themselves whether their work indirectly abets this destruction of knowledge.
It is a contested question in contemporary theories of religion whether the concept of religion can be defined in a sound way or not. Many theorists maintain that a universal but delimiting definition is impossible. In this study, by contrast, it is argued that a conceptual analysis of religion that holds universally is perfectly possible because the following thesis can be seen as a necessary and sufficient conceptual condition of what religion is: (R) X is a religion if and only if X is a collection of artifacts which has the proper function of representing a supraphysical world. On this thesis, it is argued that artifacts such as pictorial and verbal representations, rituals, symbols, and various tools constitute religion as a cultural object, which, as a collection of artifacts, has the proper function of representing a conceived world that is not entirely physical, and which, allegedly, is a prerequisite for existential welfare in relation to observance. It is here important to understand what is constitutive for these kinds of conceived worlds. Supraphysical world is defined as follows. Given that the actual world is a physical world, a conception S is a construction of a supraphysical world if and only if both of the following conditions apply to S: (1) Metaphysical component: S is a duplicate of the actual world with the addition of an anti-physical substance. (2) Existential-normative component: S is an alleged prerequisite for existential welfare in relation to observance. The core argument of the study is that (R) holds a priori for the concept of religion and as an a posteriori necessity for every instance of a religion. Apart from discussing the methodological problems of defining religion, the study introduces a new theory of religion in terms of (R). It addresses issues in the theory of artifacts; in the theory of representations; and in the theory of conceptual analysis.
Psychological essentialism is an intuitive folk belief positing that certain categories have a non-obvious inner “essence” that gives rise to observable features. Although this belief most commonly characterizes natural kind categories, I argue that psychological essentialism can also be extended in important ways to artifact concepts. Specifically, concepts of individual artifacts include the non-obvious feature of object history, which is evident when making judgments regarding authenticity and ownership. Classic examples include famous works of art (e.g., the Mona Lisa is authentic because of its provenance), but ordinary artifacts likewise receive value from their history (e.g., a worn and tattered blanket may have special value if it was one’s childhood possession). Moreover, in some cases, object history may be thought to have causal effects on individual artifacts, much as an animal essence has causal effects. I review empirical support for these claims and consider the implications for both artifact concepts and essentialism. This perspective suggests that artifact concepts cannot be contained in a theoretical framework that focuses exclusively on similarity or even function. Furthermore, although there are significant differences between essentialism of natural kinds and essentialism of artifact individuals, the commonalities suggest that psychological essentialism may not derive from folk biology but instead may reflect more domain-general perspectives on the world.
An essential property is a property that an object possesses in every possible world in which that object exists. An individual essence is a property (or set of properties) that an object possesses in every world in which that object exists, and that no other object possesses in any possible world. Call the claim that some artifacts possess an individual essence ‘artifactual essentialism’. I will argue that artifactual essentialism is true.
There is much controversy surrounding the nature of the relation between fictional individuals and possible individuals. Some have argued that no fictional individual is a possible individual; others have argued that (some) fictional individuals just are (merely) possible individuals. In this paper, I offer further grounds for believing the theory of fictional individuals defended by Amie Thomasson, viz., Artifactualism, by arguing that her view best allows one to make sense of this puzzling relation. More specifically, when we realize that the view allows for an identification of merely possible individuals with fictional individuals, we see that the utility, and hence the level of credence lent to Artifactualism, is increased. After arguing for this thesis, I respond to three of the most pressing worries.
Function theorists routinely speculate that a viable function theory will be equally applicable to biological traits and artifacts. However, artifact function has received only the most cursory scrutiny in its own right. Closer scrutiny reveals that only a pluralist theory comprising two distinct notions of function, proper function and system function, will serve as an adequate general theory. The first section describes these two notions of function. The second section shows why both notions are necessary, by showing that attempts to do away with one of them fail. This demonstration draws on examples from the artifactual realm to motivate major points of the argument. The third section is an outline of artifact function. It confirms the conclusions of the second section, and also begins the task of describing some of the special features of artifact function needing accommodation within the general theory.
Some well-known translations of the words attributed to the Master in Analects 6.25, "gu bu gu gu zai gu zai," are analyzed and sorted out. It is argued that this passage can be given a consistent reading and an interpretation that coheres with a major theme of the text, namely that the ontological status of a thing, like that of a person, is relative to the practice of constitutive rules and conventions.
Technology moves us to a better world. We contend that through technology people can simplify and solve moral tasks when they are in the presence of incomplete information and possess a diminished capacity to act morally. Many external things, usually inert from the moral point of view, can be transformed into so-called moral mediators. Hence, not all of the moral tools are inside the head; many of them are shared and distributed in “external” objects and structures which function as ethical devices.
Technical artifacts have the capacity to fulfill their function in virtue of their physicochemical make-up. An explanation that purports to explicate this relation between artifact function and structure can be called a technological explanation. It might be argued, and Peter Kroes has in fact done so, that there is something peculiar about technological explanations in that they are intrinsically normative in some sense. Since the notion of artifact function is a normative one (if an artifact has a proper function, it ought to behave in specific ways) an explanation of an artifact’s function must inherit this normativity. In this paper I will resist this conclusion by outlining and defending a ‘buck-passing account’ of the normativity of technological explanations. I will first argue that it is important to distinguish properly between (1) a theory of function ascriptions and (2) an explanation of how a function is realized. The task of the former is to spell out the conditions under which one is justified in ascribing a function to an artifact; the latter should show how the physicochemical make-up of an artifact enables it to fulfill its function. Second, I wish to maintain that a good theory of function ascriptions should account for the normativity of these ascriptions. Provided such a function theory can be formulated — as I think it can — a technological explanation may pass the normativity buck to it. Third, to flesh out these abstract claims, I show how a particular function theory — to wit, the ICE theory by Pieter Vermaas and Wybo Houkes — can be dovetailed smoothly with my own thoughts on technological explanation.
Uttal has written nine LEA titles over the past 25 years. The audience will be the same readers who bought Uttal's past work, as well as those teaching courses in the theory and methods of psychology and those with interests in theoretical psychology and the history and philosophy of psychology.
In a 1993 paper, I argued that empirical treatments of the epistemology used by scientists in experimental work are too abstract in practice to counter relativist efforts to explain the outcome of scientific controversies by reference to sociological forces. This was because, at the rarefied level at which the methodology of scientists is treated by philosophers, multiple mutually inconsistent instantiations of the principles described by philosophers are employed by contesting scientists. These multiple construals change within a scientific community over short time frames, and these different versions of scientific methodology can determine the outcome of a controversy. I illustrated with a comparatively detailed analysis of the methodology used by biologists debating the existence of an entity called the bacterial mesosome between the mid-1950s and the mid-1970s. This 1993 piece has drawn several critiques in the philosophical literature. In this present piece I respond to these critiques and argue that they fail to address the core argument of the original paper, and I reflect further on the methodologies of philosophers of science pursuing empirical or ‘naturalistic’ epistemology.
This paper examines a concrete political controversy in order to shed light on a broad philosophical issue. The controversy is with regard to who owns cultural antiquities: the nations (often in the developing world) on whose soil they originated, or the museums of developed nations that have, through a variety of means, come into possession of them. Despite their opposing views, both sides accept the claim that ownership can be derived from prior facts about cultural identity. Moreover, when their claims are articulated, each side's arguments shed contrasting light on a broader question of property: are some things intrinsically common; that is, do some things have properties that undermine claims of private ownership? Following the logic inherent in arguments made on both sides, this paper defends an affirmative answer to these questions and, in so doing, suggests that we need to broaden our perspective on public goods generally.
The momentum of advances in biology is evident in the history of patents on life forms. As we proceed forward with greater understanding and technological control of developmental biology there will be many new and challenging dilemmas related to patenting of human parts and partial trajectories of human development. These dilemmas are already evident in the current conflict over the moral status of the early human embryo. In this essay, recent evidence from embryological studies is considered and the unbroken continuity of organismal development initiated at fertilization is asserted as clear and reasonable grounds for moral standing. Within this frame of analysis, it is proposed that through a technique of Altered Nuclear Transfer, non-organismal entities might be created from which embryonic stem cells could be morally procured. Criteria for patenting of such non-organismal entities are considered.
Cognitive psychologists tend to treat intentionality as a control variable during experiments, yet ignore it when generating mechanistic descriptions of performance. Wynn's work brings this conflict into striking relief and, when considered in relation to recent neurophysiological findings, makes it clear that intentionality can be regarded mechanistically if one defines it as the planning of distal effects.
Failure is a central notion both in ethics of engineering and in engineering practice. Engineers devote considerable resources to assure their products will not fail, and considerable progress has been made in the development of tools and methods for understanding and avoiding failure. Engineering ethics, on the other hand, is concerned with the moral and social aspects related to the causes and consequences of technological failures. But what is meant by failure, and what does it mean that a failure has occurred? The subject of this paper is how engineers use and define this notion. Although a traditional definition of failure can be identified that is shared by a large part of the engineering community, the literature shows that engineers are also willing to consider as failures events and circumstances that are at odds with this traditional definition. These cases violate one or more of three assumptions made by the traditional approach to failure. An alternative approach, inspired by the notion of product life cycle, is proposed which dispenses with these assumptions. Besides being able to address the traditional cases of failure, it can deal successfully with the problematic cases. The adoption of a life cycle perspective allows the introduction of a clearer notion of failure and allows a classification of failure phenomena that takes into account the roles of stakeholders involved in the various stages of a product life cycle.