This paper aims to contribute to the expanding discourse on inter- and transdisciplinarity. Referring to well-established distinctions in philosophy of science, the paper argues in favor of a plurality of four different dimensions: interdisciplinarity with regard to (a) objects (ontology), (b) knowledge/theories (epistemology), (c) methods/practices (methodology), and further, (d) problem perception/problem solving. Different philosophical thought traditions can be related to these distinguishable meanings. The philosophical framework of the four different dimensions will be illustrated by some of the most popular examples of research programs that are labeled interdisciplinary: nanoresearch/nanoscience/nanotechnology, complex systems theory/chaos theory, biomimicry/bionics, and technology assessment/sustainability research. Thus, a minimal philosophy of science is required to understand and foster inter- and transdisciplinarity.
“Unintended-acceleration” automobile accidents typically begin when the driver first enters the car, starts the engine, and intends to press his/her right foot on the brake while shifting from Park to a drive gear (Drive or Reverse). The driver reports an unintended (uncommanded) full-throttle acceleration, coupled with a loss of braking, until the episode ends in a crash. Pedal misapplications, where the right foot contacts the accelerator instead of the intended brake, have been linked to these accidents (Schmidt, 1989, 1993), which, in the 1980s, were thought to occur only at the start of a driving cycle (and/or with the car in Park). But, in 1997, we identified over 200 pedal errors as the cause of accidents reported in the North Carolina database; these crashes occurred during the driving cycle (Schmidt et al., 1997), and/or with the vehicle in a gear other than Park. Our present work provides a more thorough analysis of these North Carolina Police Accident Reports from 1979 to 1995. The vast majority of pedal misapplications (over 92%) (a) occurred during the driving cycle, (b) were generally in “unhurried” conditions, and (c) were categorically separate from those events referred to as unintended-acceleration episodes at start-up. These ideas help explain the recent (2009-2010) surge of unintended-acceleration reports, perhaps even suggesting that all of these crashes are caused by pedal errors, and that none of them are based on some vehicle defect(s).
In the present enterprise we take a look at the meaning of Autonomy, how the word has been employed, and some of the consequences of its use in the sciences of the artificial. Could and should robots really be autonomous entities? Over and beyond this, we use concepts from the philosophy of mind to spur on enquiry into the very essence of human autonomy. We believe our initiative, like Dennett's life-long research, sheds light upon the problems of robot design with respect to robots' relation with humans.
Collingridge's dilemma is one of the most well-established paradigms presenting a challenge to Technology Assessment (TA). This paper aims to reconstruct the dilemma from an analytic perspective and explicates three assumptions underlying the dilemma: the temporal, knowledge and power/actor assumptions. In the light of the recent transformation of the science, technology and innovation system (in the age of technoscience), these underlying assumptions are called into question. The same result is obtained from a normative angle by Collingridge himself; he criticises the dilemma and advances concepts on how to keep a technology controllable. This paper stresses the relevance of the dilemma and of Collingridge's own ideas on how to deal with the dilemma. Today, a positive interpretation of technoscience for effective TA is possible.
Discussion about the application of scientific knowledge in robotics in order to build people helpers is widespread. The issue herein addressed is philosophically poignant: that of robots that are “people”. It is currently popular to speak about robots and the image of Man. Behind this lurks the dialogical mind and the questions about the significance of an artificial version of it. Without intending to defend or refute the discourse in favour of ‘recreating’ Man, a less familiar question is brought forth: “and what if we were capable of creating a very convincing replica of man (constructing a robot-person), what would the consequences of this be, and would we be satisfied with such technology?” This is a thorny topic: it questions the entire knowledge foundation upon which strong AI/Robotics is positioned. The author argues for improved monitoring of technological progress and thus favours implementing weaker techniques.
This paper aims to contribute to the attempts to clarify and classify the vague notion of “technosciences” from a historical perspective. A key question that is raised is as follows: Does Francis Bacon, one of the founding fathers of the modern age, provide a hitherto largely undiscovered programmatic position, which might facilitate a more profound understanding of technosciences? The paper argues that nearly everything we need today for an ontologically well-informed epistemology of technoscience can be found in the works of Bacon—this position will be called epistemological real-constructivism. Rather than realist or constructivist, empiricist or rationalist, Bacon’s position can best be understood as real-constructivist since it challenges modern dichotomies. Reflection upon the contemporary relevance of Bacon could contribute to the expanding and critical discussion on technoscience. In the following I will reconstruct the term “technoscience”. My finding is that at least four different understandings or types of the term “technoscience” co-exist. In a second step, I will analyze and elaborate on Bacon’s epistemological position. I will identify central elements of the four different understandings in Bacon’s work. Finally, I will conclude that the epistemology of technoscience is, indeed, very old—it is the epistemological position put forward by Bacon.
Within the realm of nano-, bio-, info- and cogno- (or NBIC) technosciences, the ‘power to change the world’ is often invoked. One could dismiss such formulations as ‘purely rhetorical’, interpret them as rhetorical and self-fulfilling, or view them as an adequate depiction of one of the fundamental characteristics of technoscience. In the latter case, a very specific nexus between science and technology, or the epistemic and the constructionist realm, is envisioned. The following paper focuses on this nexus, drawing on theoretical conceptions as well as empirical material. It presents an overview of different technoscientific ways to ‘change the world’—via contemplation and representation, intervention and control, engineering, construction and creation. It further argues that the hybrid character of technoscience makes it difficult (if not impossible) to separate knowledge production from real world interventions and challenges current science and technology policy approaches in fundamental ways.
The objective of this paper is to contribute to the expanding discourse on conceptual elements of TA. As a point of departure, it takes the recent transformation of the science, technology and innovation system (technoscience). We will show that the age of technoscience can be regarded as presenting not only a challenge, but also an opportunity for TA. Embracing this opportunity, however, implies imposing several requirements on TA. In order to specify these requirements and to foster the ongoing discourse on the foundations of TA, this paper suggests a programmatic term: prospective technology assessment (ProTA). This term is intended mainly as a reflection framework, aimed at providing an extension and complement, not a replacement, of well-established TA concepts. Three requirements for ProTA are sketched: (1) early-stage orientation (the temporal dimension), (2) intention and potential orientation (the knowledge dimension), (3) shaping orientation (the power/actor dimension). Examples from fusion and nano research will illustrate the need for ProTA, as well as its specific focus. The paper concedes that ProTA is in its infancy and that there is a clear need for further clarification.
Between Calculability and Non-Calculability: Issues of Calculability and Predictability in the Physics of Complex Systems. The ability to predict has been a very important qualifier of what constitutes scientific knowledge, ever since the successes of Babylonian and Greek astronomy. More recent is the general appreciation of the fact that in the presence of deterministic chaos, predictability is severely limited (the so-called ‘butterfly effect’): nearby trajectories diverge during time evolution; small errors typically grow exponentially with time. The system obeys deterministic laws and still is unpredictable, seemingly a paradox for the traditional viewpoint of Laplacian determinism. With the concept of deterministic chaos, the epistemological issue of an adequate understanding of predictability is no longer just a mere philosophical topic. Physicists on the one hand recognize the limits of (long-term) predictability, computability and even of scientific knowledge; on the other hand they work on concepts for extending the horizon of predictability. It is shown in this paper that the physics of complex systems is useful for clarifying the jungle of different meanings of the terms ‘predictability’ and ‘computability’, with philosophical implications for understanding science and nature. Today, from the physical point of view, philosophers seem to underestimate the relevance of the concepts of predictability, treating them as a merely methodological topic. In the paper I analyse the importance of predictability and computability in the physics of complex systems. I show a way to cope with problems of unpredictability and noncomputability. Nine different concepts of predictability and computability (e.g. open solution, sensitivity/chaos, redundancy/chance) are presented, compared and evaluated.
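The exponential growth of small errors described in this abstract can be made concrete with a short numerical sketch. This is illustrative only and not drawn from the paper itself; the logistic map is simply a textbook example of deterministic chaos, and the helper name `logistic_trajectory` is ours:

```python
# Sensitive dependence on initial conditions ('butterfly effect') in the
# logistic map x_{n+1} = r * x_n * (1 - x_n), fully chaotic at r = 4.

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two fully deterministic trajectories whose starting points differ by 1e-10
a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-10)

# The separation grows (on average) exponentially until it saturates at the
# size of the attractor; from then on the second trajectory carries no
# information about the first -- the practical limit of predictability.
separations = [abs(u - v) for u, v in zip(a, b)]
```

On average the separation doubles per step (the Lyapunov exponent of the r = 4 logistic map is ln 2), so an initial error of 10^-10 reaches order one within a few dozen iterations, even though the law governing the system is exactly known.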
The realist interpretations of quantum theory, proposed by de Broglie and by Bohm, are re-examined and their differences, especially concerning many-particle systems and the relativistic regime, are explored. The impact of the recently proposed experiments of Vigier et al. and of Ghose et al. on the debate about the interpretation of quantum mechanics is discussed. An indication of how de Broglie and Bohm would account for these experimental results is given.
The Siminoff, Burant, and Youngner study in Ohio is strikingly consistent with data from a national study. Both suggest that there might be significant public acceptance of future policies that violate the dead donor rule, or that further extend the boundary between life and death to include brain-damaged patients short of "brain death." Experience with donation suggests that many individuals would donate their loved ones' organs when they have concluded that the brain injury is not survivable, even if all the criteria for "brain death" are not met. It would be very helpful to have research on those who have gone through a real-life clinical situation. Based on the findings of this study and the increasing demand for organs, it may be appropriate for public policy to allow for ways to increase organ procurement from individuals who are not fully "brain dead" beyond the current method of procurement after cardiac death, but any change in this area should go slowly and with significant public input.
The term “synthetic biology” is a popular label of an emerging biotechnological field with strong claims to robustness, modularity, and controlled construction, finally enabling the creation of new organisms. Although the research community is heterogeneous, it advocates a common denominator that seems to define this field: the principles of rational engineering. However, it still remains unclear to what extent rational engineering—rather than “tinkering” or the usage of random-based or non-rational processes—actually constitutes the basis for the techniques of synthetic biology. In this article, we present the results of a quantitative bibliometric analysis of the realized extent of rational engineering in synthetic biology. In our analysis, we examine three issues: (1) We evaluate whether work at three levels of synthetic biology (parts, devices, and systems) is consistent with the principles of rational engineering. (2) We estimate the extent of rational engineering in synthetic biology laboratory practice by an evaluation of publications in synthetic biology. (3) We examine the methodological specialization in rational engineering of authors in synthetic biology. Our analysis demonstrates that rational engineering is prevalent in about half of the articles related to synthetic biology. Interestingly, in recent years the relative number of respective publications has decreased. Despite its prominent role among the claims of synthetic biology, rational engineering has not yet entirely replaced biotechnological methods based on “tinkering” and non-rational principles.
In this article, I establish a theory of knowledge approach for evaluating the use of computers for educational purposes at the university. In so doing, I trace part of the history of the “enabling factor” of Artificial Intelligence in this sector, an important element that has been integrated into everyday learning environments. The result of my reflection is a dialogical structure, directly inspired by past technology assessment research, which illustrates the conceptual advancement of researchers in the field of learning technologies. The notions covered have implications for future policy-related discourse with regard to education.
In this article the meaning and main phases of 'economization' as a civilizing process are outlined. It is argued that 'ecologization' of the current political-economic regime can in a certain sense be regarded as a continuation of this development. Due attention is given to social conditions which may be favourable or impedimental to an ecologization of 'the economy'. It is pleaded that environmental policies should use the so-called trickle-down effect to their advantage.
Many have bowed before the recently acquired powers of ‘new technologies’. However, in the shift from tekhnē to tekhnologia, it seems we have lost human values. These values are communicative in nature, as technological progress has placed barriers like distance, web pages and ‘miscellaneous extras’ between individuals. Certain values, like the interpersonal pleasures of rendering service, have been lost as their domain of predilection has for many become fully commercially oriented, dominated by the cadence of profitability. Though the popular cultures of the artificial have surged forth to deliver us from the twentieth century, they have enabled some very superfluous dreaming—Man has succumbed to the Godly role of simulating himself and creating other beings. Communication is replaced by machines, services are rendered via many automated devices, procreation has entered the public sphere, robots and entertainment agents educate our youth and mesmerising screen-integrating ‘forms of intelligence’ even think for us. As such, this so-called culture threatens the very values Man constructed in the nineteenth and twentieth centuries to guide himself into the future. But what if the phenomena mentioned just reflect our new values? The author presents an investigation into this cultural shift, its impact on human practices with regard to the mind and the body, and evokes some pros and cons of generally accepting the ‘Culture of the Artificial’.
The standard models for mereotopological structures are Boolean subalgebras of the complete Boolean algebra of regular closed subsets of a nonempty connected regular T0 topological space, with an additional "contact relation" C defined by xCy ⟺ x ∩ y ≠ ∅. A (possibly) more general class of models is provided by the Region Connection Calculus (RCC) of Randell et al. We show that the basic operations of the relational calculus on a "contact relation" generate at least 25 relations in any model of the RCC, and hence, in any standard model of mereotopology. It follows that the expressiveness of the RCC in relational logic is much greater than the original 8 RCC base relations might suggest. We also interpret these 25 relations in the standard model of the collection of regular open sets in the two-dimensional Euclidean plane.
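The contact relation on which the RCC is built can be illustrated with a deliberately small model. This is a sketch under simplifying assumptions, not the paper's construction: regions are finite unions of closed intervals on the real line, and the helper names `C`, `P`, and `intersects` are ours:

```python
# Toy model of mereotopology: regular closed sets represented as finite
# unions of closed intervals (lo, hi); contact C(x, y) holds iff x and y
# share at least one point (x ∩ y ≠ ∅), so boundary touching counts.

def intersects(a, b):
    """Two closed intervals (lo, hi) share at least one point."""
    return a[0] <= b[1] and b[0] <= a[1]

def C(x, y):
    """Contact: some interval of x meets some interval of y."""
    return any(intersects(a, b) for a in x for b in y)

def P(x, y):
    """Part-of. In the RCC, P is defined from C alone (P(x, y) iff every
    region in contact with x is in contact with y); in this toy model we
    use the sufficient condition that each interval of x lies inside some
    single interval of y."""
    return all(any(b[0] <= a[0] and a[1] <= b[1] for b in y) for a in x)

x = [(0.0, 1.0)]
y = [(1.0, 2.0)]   # touches x only at the single boundary point 1.0
z = [(0.0, 3.0)]
w = [(5.0, 6.0)]

ec = C(x, y) and not P(x, y) and not P(y, x)  # externally connected
dc = not C(x, w)                              # disconnected
pp = P(x, z) and not P(z, x)                  # proper part
```

Composing such base relations under the operations of the relational calculus (converse, composition, complement) is what generates the 25 relations counted in the paper, far beyond the 8 RCC base relations.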
Temperature reconstructions indicate that the Pliocene was approximately 3°C warmer globally than today, and several recent reconstructions of Pliocene atmospheric CO2 indicate that it was above pre-industrial levels and similar to those likely to be seen this century. However, many of these reconstructions have been of relatively low temporal resolution, meaning that these records may have failed to capture variations associated with the 41 kyr glacial–interglacial cycles thought to have operated in the Pliocene. Here we present a new, high temporal resolution alkenone carbon isotope-based record of pCO2 spanning 3.3–2.8 Ma from Ocean Drilling Program Site 999. Our record is of high enough resolution (approx. 19 kyr) to resolve glacial–interglacial changes beyond the intrinsic uncertainty of the proxy method. The record suggests that Pliocene CO2 levels were relatively stable, exhibiting variation less than 55 ppm. We perform sensitivity studies to investigate the possible effect of changing sea surface temperature (SST), which highlights the importance of accurate and precise SST reconstructions for alkenone palaeobarometry, but demonstrate that these uncertainties do not affect our conclusions of relatively stable pCO2 levels during this interval.
Insomnia is a prevalent disabling chronic disorder. The aim of this paper is fourfold: (a) to review evidence suggesting that dysfunctional forms of cognitive control, such as thought suppression, worry, rumination, and imagery control, are associated with sleep disturbance; (b) to review a new budding field of scientific investigation―the role of dysfunctional affect control in sleep disturbance, such as problems with down-regulating negative and positive affective states; (c) to review evidence that sleep disturbance can impair next-day affect control; and (d) to outline, on the basis of the reviewed evidence, how the repetitive-thought literature and the affective science literature can be combined to further understanding of, and intervention for, insomnia.
Alfred Schmidt positions himself outside the dogma of "Soviet Marxism". He considers texts unpublished during Marx's lifetime, which make it possible to understand the results in the central works. Starting from the concept of nature, he discusses the subject-object and necessity-freedom relations. Nature is mediated socio- ..