The target article discusses various aspects of the relationship between the sympathetic system and pain. To this end, the patients under study are divided into three groups. In the first group, called RSD, the syndrome is characterized by a triad of autonomic, motor, and sensory symptoms, which occur in a distally generalized distribution. The pain is typically felt deeply and diffusely, has an orthostatic component, and is suppressed by the ischemia test. Under those circumstances, the pain is likely to respond to sympatholytic interventions. In a second group, called SMP syndrome, the principal symptoms are spontaneous pain, which is felt superficially and has no orthostatic component, and allodynia. These symptoms, typically confined to the zone of a lesioned nerve, may also be relieved by sympathetic blocks. Since the characteristics of the pain differ between RSD and SMP, the underlying kind of sympathetic–sensory coupling may also vary between these cases. A very small third group of patients exhibits symptoms of both RSD and SMP. The dependence or independence of pain on sympathetic function reported in most published studies seems questionable because the degree of technical success of the block remains uncertain. Therefore, pain should not be reported as independent of sympathetic function until the criteria for a complete sympathetic block have been established and satisfied.
Machine-generated contents note: -- Preface -- Acknowledgments -- Introduction, by Michael Weisberg and Jeffrey Kovac -- 1. Trying to Understand, Making Bonds, by Roald Hoffmann -- Part 1: Chemical Reasoning and Explanation -- 2. Why Buy That Theory?, by Roald Hoffmann -- 3. What Might Philosophy of Science Look Like If Chemists Built It?, by Roald Hoffmann -- 4. Unstable, by Roald Hoffmann -- 5. Nearly Circular Reasoning, by Roald Hoffmann -- 6. Ockham's Razor and Chemistry, by Roald Hoffmann, Vladimir I. Minkin, and Barry K. Carpenter -- 7. Qualitative Thinking in the Age of Modern Computational Chemistry, or What Lionel Salem Knows, by Roald Hoffmann -- 8. Narrative, by Roald Hoffmann -- 9. Learning from Molecules in Distress, by Roald Hoffmann and Henning Hopf -- 10. Why Think Up New Molecules?, by Roald Hoffmann -- 11. Protean, by Roald Hoffmann and Pierre Laszlo -- 12. How Should Chemists Think?, by Roald Hoffmann -- Part 2: Writing and Communicating in Chemistry -- 13. Under the Surface of the Chemical Article, by Roald Hoffmann -- 14. Representation in Chemistry, by Roald Hoffmann and Pierre Laszlo -- 15. The Say of Things, by Roald Hoffmann and Pierre Laszlo -- 16. How Symbolic and Iconic Languages Bridge the Two Worlds of the Chemist: A Case Study from Contemporary Bioorganic Chemistry, by Emily R. Grosholz and Roald Hoffmann -- 17. How Nice to Be an Outsider, by Roald Hoffmann -- 18. The Metaphor, Unchained, by Roald Hoffmann -- Part 3: Art and Science -- 19. Art in Science?, by Roald Hoffmann -- 20. Science and Crafts, by Roald Hoffmann -- 21. Molecular Beauty, by Roald Hoffmann -- Part 4: Chemical Education -- 22. Teach to Search, by Roald Hoffmann -- 23. Some Heretical Thoughts on What Our Students Are Telling Us, by Roald Hoffmann and Brian P. Coppola -- 24. Very Specific Teaching Strategies, and Why They Work, by Roald Hoffmann and Saundra Y. McGuire -- Part 5: Ethics in Science -- 25. Mind the Shade, by Roald Hoffmann -- 26. Science and Ethics: A Marriage of Necessity and Choice for This Millennium, by Roald Hoffmann -- 27. Honesty to the Singular Object, by Roald Hoffmann -- 28. The Material and Spiritual Rationales Are Inseparable, by Roald Hoffmann -- Index.
Discussions concerning belief revision, theory development, and "creativity" in philosophy and AI reveal a growing interest in Peirce's concept of abduction. Peirce introduced abduction in an attempt to provide theoretical dignity and clarification to the difficult problem of knowledge generation. He wrote that "An Abduction is Originary in respect to being the only kind of argument which starts a new idea" (Peirce, CP 2.26). These discussions, however, led to considerable debates about the precise way in which Peirce's abduction can be used to explain knowledge generation (cf. Magnani, 1999; Hoffmann, 1999). The crucial question is that of understanding how we can get the new elements capable of enlarging our theories. Under these circumstances, it might be helpful to step out of the entanglement and reconsider the basis of the problem that originally triggered Peirce's interest in abduction. This will lead us to another Peircean concept, that of "diagrammatic reasoning," which I discuss here in the context of his "pragmatism." In this way, I hope to reach a better understanding of the contribution of "abduction" to the knowledge generation process.
The contributions to this volume originate from the workshop "Hauptsachen und Nebendinge—Pure Science and its Impurities," organized by Christoph Hoffmann, which took place at the Max-Planck-Institute for the History of Science (Berlin) in July 2000. We wish to thank all participants for rich and stimulating talks and discussions.
This experiment investigated the use of positive and negative hypothesis tests and target tests by groups in an adaptation of the 2-4-6 Wason task. The experimental variables were range of rule (small vs. large), amount of evidence (low vs. high), and trial block (1 vs. 2). The results were in accordance with Klayman and Ha's (1987) analysis of base rate probabilities of falsification and with additional theoretical considerations. Base rate probabilities were more descriptive of participants' behaviour in target tests than in hypothesis tests, under a low than under a high amount of evidence, and at the beginning of the process than at its end. The percentage of positive tests was higher under a small than under a large range of rule. Hypothesis tests resulted in more falsifications, relative to verifications, than would be expected from a random process. When evidence is richly available, the relative importance of falsification seems to decrease. An analysis of the group compositions before and after group discussion by the PCD model (Crott, Werner, & Hoffmann, 1996) revealed that the normative weight was approximately twice as large as the informational weight. Groups produced fewer false answers than their members did individually.
Turning to a brief consideration of United States foreign policy, Hoffmann points to particular moral difficulties in U.S. stances and urges the development of superpower rules that are effective and ethical.
The author presents Gernot Böhme’s median mode of being theory, which attempts to find an anthropological middle ground between the rational and the irrational, the spiritual and the corporeal, and the active and the passive in human experience. Böhme’s reflections on the median mode of being are normative in character and linked to the concept of “sovereign man,” which he strongly defends and whose main characteristics Hoffmann outlines in the first part of the essay. Among other things, Hoffmann argues against Böhme’s excessive emphasis on the controlling/restrictive functions of awareness at the cost of those functions which serve to protect and stimulate life, his non-distinction between distance to a cognized object and its intellectual instrumentalisation, and his rather one-sided tendency to seek the sources of European rationalism in the Socratean tradition.
On rationalist infallibilism, a wide range of both (i) analytic and (ii) synthetic a priori propositions can be infallibly justified (or absolutely warranted), i.e., justified to a degree that entails their truth and precludes their falsity. Though rationalist infallibilism is indisputably running its course, adherence to at least one of the two species of infallible a priori justification refuses to disappear from mainstream epistemology. Among others, Putnam (1978) still professes the a priori infallibility of some category (i) propositions, while Burge (1986, 1988, 1996) and Lewis (1996) have recently affirmed the a priori infallibility of some category (ii) propositions. In this paper, I take aim at rationalist infallibilism by calling into question the a priori infallibility of both analytic and synthetic propositions. The upshot will be twofold: first, rationalist infallibilism unsurprisingly emerges as a defective epistemological doctrine; second, and more importantly, the case for the a priori infallibility of one or both categories of propositions turns out to lack cogency.
Engineers fine-tune the design of robot bodies for control purposes; however, a methodology or set of tools is largely absent, and optimization of morphology (shape, material properties of robot bodies, etc.) is lagging behind the development of controllers. This has become even more prominent with the advent of compliant, deformable or "soft" bodies. These carry substantial potential regarding their exploitation for control—sometimes referred to as "morphological computation". In this article, we briefly review different notions of computation by physical systems and propose the dynamical systems framework as the most useful in the context of describing and eventually designing the interactions of controllers and bodies. Then, we look at the pros and cons of simple vs. complex bodies, critically reviewing the attractive notion of "soft" bodies automatically taking over control tasks. We address another key dimension of the design space—whether model-based control should be used and to what extent it is feasible to develop faithful models for different morphologies.
Tailoring the design of robot bodies for control purposes is implicitly performed by engineers; however, a methodology or set of tools is largely absent, and optimization of morphology (shape, material properties of robot bodies, etc.) is lagging behind the development of controllers. This has become even more prominent with the advent of compliant, deformable or "soft" bodies. These carry substantial potential regarding their exploitation for control—sometimes referred to as "morphological computation" in the sense of offloading computation needed for control to the body. Here, we will argue in favor of a dynamical systems rather than a computational perspective on the problem. Then, we will look at the pros and cons of simple vs. complex bodies, critically reviewing the attractive notion of "soft" bodies automatically taking over control tasks. We will address another key dimension of the design space—whether model-based control should be used and to what extent it is feasible to develop faithful models for different morphologies. This paper was also published in the 2014 AISB proceedings: http://aisb50.org/representation-of-reality-humans-animals-and-machines/ - http://doc.gold.ac.uk/aisb50/.
Previous research has shown that subliminally presented stimuli accelerate or delay responses afforded by supraliminally presented stimuli. Our experiments extend these findings by showing that unconscious stimuli even affect free choices between responses. Thus, actions that are phenomenally experienced as freely chosen are influenced without the actor becoming aware of the manipulation. However, the unconscious influence is limited to a response bias, as participants chose the primed response in only up to 60% of the trials. LRP data in free choice trials indicate that the prime was not ineffective in trials in which participants chose the non-primed response, as it then delayed performance of the incongruently primed response.
Among the many problems posed by Peirce's concept of abduction is how to determine the scope of this form of inference, and how to distinguish different types of abduction. This problem can be illustrated by taking a look at one of his best-known definitions of the term: "Abduction is the process of forming an explanatory hypothesis. It is the only logical operation which introduces any new idea; for induction does nothing but determine a value, and deduction merely evolves the necessary consequences of a pure hypothesis." The second half of this quote is not part of the definition, but an explanation of it. However, it adds something to this definition because it says implicitly that there are only three logical ...
Metaphysical nihilism is the thesis that there could have been no concrete objects. Thomas Baldwin (1996) offers an argument for metaphysical nihilism. The premisses of the argument purport to provide a procedure of subtraction that can be iterated until we reach a world where no concrete objects exist. Gonzalo Rodriguez-Pereyra (1997) finds fault with Baldwin’s argument, modifies it, and claims to have proved metaphysical nihilism. My primary aim is to show that Rodriguez-Pereyra’s alleged proof rests on a false assumption. The assumption is that, necessarily, if there are no concrete* objects (in the sense Rodriguez-Pereyra defines), then there are no concrete objects. My secondary aim, with which I begin, is to formulate and then strengthen a succinct version of the subtraction argument.
As a committee of the National Academy of Engineering recognized, ethics education should foster the ability of students to analyze complex decision situations and ill-structured problems. Building on the NAE’s insights, we report on an innovative teaching approach that has two main features: first, it places the emphasis on deliberation and on self-directed, problem-based learning in small groups of students; and second, it focuses on understanding ill-structured problems. The first innovation is motivated by an abundance of scholarly research that supports the value of deliberative learning practices. The second results from a critique of the traditional case-study approach in engineering ethics. A key problem with standard cases is that they are usually described in a fashion that renders the ethical problem too obvious and simplistic. The practitioner, by contrast, may face problems that are ill-structured. In the collaborative learning environment described here, groups of students use interactive and web-based argument visualization software called “AGORA-net: Participate – Deliberate!”. The function of the software is to structure communication and problem solving in small groups. Students are confronted with the task of identifying possible stakeholder positions and reconstructing their legitimacy by constructing justifications for these positions in the form of graphically represented argument maps. The argument maps are then presented in class so that these stakeholder positions and their respective justifications become visible and can be brought into a reasoned dialogue. Argument mapping provides an opportunity for students to collaborate in teams and to develop critical thinking and argumentation skills.
Abductive reasoning takes place in forming "hypotheses" in order to explain "facts." Thus, the concept of abduction promises an understanding of creativity in science and learning. It also raises, however, a number of problems, some of which will be discussed in this paper. After analyzing the difference between induction and abduction (1), I shall discuss Peirce's claim that there is a "logic" of abduction (2). The thesis is that this claim can be understood if we make a clear distinction between inferential elements and perceptive elements of abductive reasoning. For Peirce, the creative act of forming explanatory hypotheses and the emergence of "new ideas" belong exclusively to the perceptive side of abduction. Thus, it is necessary to study the role of perception in abductive reasoning (3). A further problem is the question whether there is a relationship between abduction and Peirce's concept of "theorematic reasoning" in mathematics (4). Both forms of reasoning could be connected, because both are based on perception. The last problem concerns the role of instincts in explaining the success of abductive reasoning in science, and the question whether the concept of instinct might be replaced by methods of inquiry (5).
On rationalist infallibilism, a wide range of both (i) analytic and (ii) synthetic a priori propositions can be infallibly justified, i.e., justified in a way that is truth-entailing. In this paper, I examine the second thesis of rationalist infallibilism, what might be called ‘synthetic a priori infallibilism’. Exploring the seemingly only potentially plausible species of synthetic a priori infallibility, I reject the infallible justification of so-called self-justifying propositions.
This article examines differences in the research approaches of farmers and scientists and analyzes how these differences are related to the conditions under which both groups engage in experimental work. Theoretical considerations as well as practical experiences are presented to emphasize the great potential of farmer–researcher collaboration for rural innovation. In the first part of the article, the innovative power of farmer research and experimentation is acknowledged by presenting examples such as crop and animal breeding, development of new production systems, farm equipment, and social innovations. Considering the respective comparative advantages of farmers and scientists, and inspired by theoretical concepts in the fields of knowledge management and innovation processes, we discuss five topics for optimizing the collaboration between farmers and scientists in the field of technological innovation: user orientation, decentralization, informal modes of experimentation, externalization of tacit knowledge, and economic considerations. A better understanding of such issues could help researchers to define their own role in the research process, acknowledge the strengths and weaknesses of their own and farmers’ research approaches, overcome communication gaps, and find creative solutions for problems that typically occur in the process of participatory technology development.
The possible-worlds analysis of propositions identifies a proposition with the set of possible worlds where it is true. This analysis has the hitherto unnoticed consequence that a proposition depends for its existence on the existence of every proposition that entails it. This peculiar consequence places the possible-worlds analysis in conflict with the conjunction of two compelling theses. One thesis is that a phrase of the form ‘the proposition that S’ is a rigid designator. The other thesis is that a proposition which is directly about an object – a singular proposition – depends for its existence on the existence of the object. I defend these theses and conclude that the cost of the possible-worlds analysis is prohibitively high.
Minimalism is currently the received deflationary theory of truth. On minimalism, truth is a transparent concept and a deflated property of truth bearers. In this paper, I situate minimalism within current deflationary debate about truth by contrasting it with its main alternative―the redundancy theory of truth. I also outline three of the primary challenges facing minimalism, its formulation, explanatory adequacy and stability, and draw some lessons for the soundness of its conception of truth.
This volume represents an important contribution to Peirce’s work in mathematics and formal logic. An internationally recognized group of scholars explores and extends understandings of Peirce’s most advanced work. The stimulating depth and originality of Peirce’s thought and the continuing relevance of his ideas are brought out by this major book.
The aim of this paper is to define a notion of supervenience which can adequately describe the systematic dependence of extrinsic as well as of intrinsic higher-level properties on base-level features. We argue that none of the standard notions of supervenience—the concepts of weak, strong and global supervenience—fulfil this function. The concept of regional supervenience, which is purported to improve on the standard conceptions, turns out to be problematic as well. As a new approach, we develop the notion of property-dependent supervenience. This notion is founded on a criterion of relevance adapting the supervenience base to the considered higher-level properties in a specific way, such that only features which are relevant to the instantiation of the higher-level properties under consideration are taken into account.
This essay is the first attempt to compare Reinhart Koselleck's Historik with Hannah Arendt's political anthropology and her critique of the modern concept of history. Koselleck is well-known for his work on conceptual history as well as for his theory of historical time. It is my contention that these different projects are bound together by Koselleck's Historik, that is, his theory of possible histories. This can be shown through an examination of his writings from Critique and Crisis to his final essays on historical anthropology, most of which have not yet been translated into English. Conversely, Arendt's political theory has in recent years been the subject of numerous interpretations that do not take into account her views about history. By comparing the anthropological categories found in Koselleck's Historik with Arendt's political anthropology, I identify similar intellectual lineages in them as well as shared political sentiments, in particular the anti-totalitarian impulse of the postwar era. More importantly, Koselleck's theory of the preconditions of possible histories and Arendt's theory of the preconditions of the political, I argue, transcend these lineages and sentiments by providing essential categories for the analysis of historical experience.
Had more philosophers of science come from chemistry, their thinking would have been different. I begin by looking at a typical chemical paper, in which making something is the leitmotif, and conjecture/refutation is pretty much irrelevant. What in fact might have been, might be, different? The realism of chemists is reinforced by their remarkable ability to transform matter; they buy into reductionism where it serves them, but make no real use of it. Incommensurability is taken without a blink, and actually serves. The preeminence of synthesis in chemistry could have led philosophers of science to take more seriously questions of aesthetics within science, and to find a place in aesthetics for utility. The necessary motion twixt macroscopic and microscopic views of matter in modern chemistry leads to the coexistence of symbolic and iconic representations. And in another way to the deliberate, creative violation of categories.
Linguistic competence, in general terms, involves the ability to learn, understand, and speak a language. The nativist view in the philosophy of linguistics holds that the principal foundation of linguistic competence is an innate faculty of linguistic cognition. In this paper, close scrutiny is given to nativism's fundamental commitments in the area of metaphysics. In the course of this exploration it is argued that any minimally defensible variety of nativism is, for better or worse, married to two theses: linguistic competence is grounded in a faculty of linguistic cognition that (i) is embodied and (ii) has its operating rules represented in the brains of human language users.
In the 1890s Ludwig Mach employed photography for visualizing streamlines in the emerging field of aerodynamic research. Étienne-Jules Marey developed a similar approach at the turn of the century. The two projects can be related to a number of current discussions on the history of scientific photography. The case of Ludwig Mach demonstrates how the collection of numerical data became both the subject and the challenge of a line of research intimately linked to the capacities of photography. At the end of the nineteenth century, the particular potential of scientific photography is very often defined by comparison with the limited power of the human eye. In contrast, the example of streamline photography underlines that the requirements of the research context are critical for successfully employing photography. Marey's studies point to a tension between his characterization of chronophotography as a method for analyzing the elementary units of processes in nature on the one hand and the necessary summation of single moments in time in his recordings of streamlines on the other. What Marey usually qualified as a cumbersome confusion was here the prerequisite of observation. The 'philosophy in machines' ultimately limited the success of streamline photography; it aided in debates about qualitative matters, but could hardly provide what most interested scientists and engineers: reliable numbers.
It seems that every singular proposition implies that the object it is singular with respect to exists. It also seems that some propositions are true with respect to possible worlds in which they do not exist. The puzzle is that it can be argued that there is a contradiction between these two principles. In this paper, I explain the puzzle and consider some of the ways one might attempt to resolve it. The puzzle is important because it has implications concerning the way we think about the relationship between a proposition and the claim that the proposition is true.
The superassertability theory of truth, inspired by Crispin Wright (1992, 2003), holds that a statement is true if and only if it is superassertable in the following sense: it possesses warrant that cannot be defeated by any improvement of our information. While initially promising, the superassertability theory of truth is vulnerable to a persistent difficulty highlighted by James Van Cleve (1996) and Terrence Horgan (1995) but not properly fleshed out: it is formally illegitimate in a sense similar to that in which unsophisticated epistemic theories of truth are widely acknowledged to be. Sustained analysis reveals that the unrestricted formal legitimacy argument is firmly grounded in first person conceivability evidence.
Technology is not only an object of philosophical reflection but also something that can change this reflection. This paper discusses the potential of computer-supported argument visualization tools for coping with the complexity of philosophical arguments. I will show, in particular, how the interactive and web-based argument mapping software “AGORA-net” can change the practice of philosophical reflection, communication, and collaboration. AGORA-net allows the graphical representation of complex argumentations in logical form and the synchronous and asynchronous collaboration on those “argument maps” on the internet. Web-based argument mapping can overcome limits of space, time, and access, and it can empower users from all over the world to clarify their reasoning and to participate in deliberation and debate. Collaborative and web-based argument mapping tools such as AGORA-net can change the practice of arguing in two dimensions. First, arguing on web-based argument maps in both collaborative and adversarial form can lead to a fundamental shift in the way arguments are produced and debated. It can replace the traditional four-step process of writing, publishing, debating, and responding in new writing—with its clear distinction between individual and social activities—by a process in which these four steps happen virtually simultaneously and individual and social activities become more closely intertwined. Second, by replacing the linear form of arguments with graphical representations of networks of inferential relations which can grow over time in an infinite space, these tools not only allow a clear visualization of structures and relations, but also forms of collaboration in which, for example, participants work on different “construction zones” of larger argument maps, or debates are performed at specific points of disagreement on those maps. I introduce the term synergetic logosymphysis to describe a practice that combines these two dimensions of collaborative and web-based argument mapping.
This book studies medieval theories of angelology insofar as they made groundbreaking contributions to medieval philosophy. The discussion of angels, made famous by the humanist caricature of ‘how many angels can dance on the head of a pin’, was nevertheless a crucial one in medieval philosophical debates. All scholastic masters pronounced themselves on angelology, if only in their Sentence commentaries. The questions concerning angelic cognition, speech, free decision, movement, etc. were springboards for profound philosophical discussions that have to do with anthropology and metaphysics no less than with angelology. Angels qua separate substances were of central importance in medieval metaphysics (with questions on universal hylomorphism, the esse-essentia composition of creatures, and those regarding the individuation of material and immaterial substances). The doctrine of angels has not been the subject of much study in the history of medieval thought, and the volume fills an important gap in the literature. The chapters offer a well-rounded, if not encyclopedic, discussion in the chronological or doctrinal sense. They cover the history of the debate from Augustine and Pseudo-Dionysius until the later Middle Ages, but instead of an author-by-author approach, they focus on seminal ideas with demonstrable relevance to “secular” and modern philosophical concerns.
The primary goal of this chapter is to present a new method—called Logical Argument Mapping—for the analysis of framing processes as they occur in any communication, but especially in conflicts. I start with a distinction between boundary setting, meaning construction, and sensemaking as three forms or aspects of framing, and argue that crucial for the resolution of frame-based controversies is our ability to deal with those “webs” of mutually supporting beliefs that determine sensemaking processes. Since any analysis of framing in conflicts and communication is itself influenced by sensemaking—there is no “frame-neutrality”—the main problem for an analyst is to cope with his or her own cognitive limitations. LAM offers a solution to this problem. The method will be exemplified with an analysis of two conflicting interpretations of how the international community should deal with Hamas after its election victory in 2006.
Starting from the observation that small children can count more objects than numbers—a phenomenon that I am calling the “lifeworld dependency of cognition”—and an analysis of finger calculation, the paper shows how learning can be explained as the development of cognitive systems. Parts of those systems are not only an individual’s different forms of knowledge and cognitive abilities, but also other people, things, and signs. The paper argues that cognitive systems are first of all semiotic systems since they are dependent on signs and representations as mediators. The two main questions discussed here are how the external world constrains and promotes the development of cognitive abilities, and how we can move from cognitive abilities that are necessarily connected with concrete situations to abstract knowledge.
The weak deflationist about truth is committed to two theses: one conceptual, the other ontological. On the conceptual thesis (what might be called a ‘triviality thesis’), the content of the truth predicate is exhausted by its involvement in some version of the ‘truth-schema’. On the ontological thesis, truth is a deflated property of truth bearers. In this paper, I focus on weak deflationism’s ontological thesis, arguing that it generates an instability in its view of truth: the view threatens to collapse into either that of strong deflationism (i.e., truth is not a property) or that of some form of inflationism (i.e., truth is a substantial property). The instability objection to weak deflationism is sketched by way of a truth-property ascription dilemma, the two horns of which its proponent is at pains to circumvent.
This volume provides new sources of knowledge based on Michael Otte’s fundamental insight that understanding the problems of mathematics education – how to teach, how to learn, how to communicate, how to do, and how to represent ...
A central tenet of Heil's ontological conception is a no-levels account of reality, according to which there is just one class of basic properties and relations, while all higher-level entities are configurations of these base-level entities. I argue that if this picture is not to collapse into an eliminativist picture of the world—which, I contend, should be avoided—Heil's ontological framework has to be supplemented by an independent theory of which configurations of basic entities should count as complex entities. However, such an amendment represents a substantial ontological enhancement, so that the ensuing ontological picture is not as parsimonious as Heil claims it to be.
According to Field’s influential incompleteness objection, Tarski’s semantic theory of truth is unsatisfactory since the definition that forms its basis is incomplete in two distinct senses: (1) it is physicalistically inadequate, and for this reason, (2) it is conceptually deficient. In this paper, I defend the semantic theory of truth against the incompleteness objection by conceding (1) but rejecting (2). After arguing that Davidson and McDowell’s reply to the incompleteness objection fails to pass muster, I argue that, within the constraints of a non-reductive physicalism and a holism concerning the concepts of truth, reference and meaning, conceding Field’s physicalistic inadequacy conclusion while rejecting his conceptual deficiency conclusion is a promising reply to the incompleteness objection.
The role of uncertainty within an organization’s environment features prominently in the business ethics and management literature, but how corporate investment decisions should proceed in the face of uncertainties relating to the natural environment is less discussed. From the perspective of ecological economics, the salience of ecology-induced issues challenges management to address new types of uncertainties. These pertain to constraints within the natural environment as well as to institutional action aimed at conserving the natural environment. We derive six areas of ecology-induced uncertainties and propose ecology-driven real options as a conceptual approach for systematically incorporating these uncertainties into strategic management. We combine our results in an integrative investment framework and illustrate its application with the case of carbon constraints.