Whereas repeated exposure to communication is a widespread phenomenon, it has so far received little attention in communication research. This article takes a step towards describing, differentiating, and explaining repeated exposure to communication. It discusses different forms of repeated exposure and then focuses on repeated exposure to narrative films. It explores possible motivations for reusing the same media content again and again, while taking processes of repeated exposure as well as situational and personal variables into account. The initially theoretical considerations are then supported, expanded, and specified both by existent empirical evidence and by findings from a focus group study. Finally, further questions about repeated exposure to narrative content in media are discussed.
Machine-generated contents note: -- Preface -- Acknowledgments -- Introduction, by Michael Weisberg and Jeffrey Kovac -- 1. Trying to Understand, Making Bonds, by Roald Hoffmann -- Part 1: Chemical Reasoning and Explanation -- 2. Why Buy That Theory?, by Roald Hoffmann -- 3. What Might Philosophy of Science Look Like If Chemists Built It?, by Roald Hoffmann -- 4. Unstable, by Roald Hoffmann -- 5. Nearly Circular Reasoning, by Roald Hoffmann -- 6. Ockham's Razor and Chemistry, by Roald Hoffmann, Vladimir I. Minkin, and Barry K. Carpenter -- 7. Qualitative Thinking in the Age of Modern Computational Chemistry, or What Lionel Salem Knows, by Roald Hoffmann -- 8. Narrative, by Roald Hoffmann -- 9. Learning from Molecules in Distress, by Roald Hoffmann and Henning Hopf -- 10. Why Think Up New Molecules?, by Roald Hoffmann -- 11. Protean, by Roald Hoffmann and Pierre Laszlo -- 12. How Should Chemists Think?, by Roald Hoffmann -- Part 2: Writing and Communicating in Chemistry -- 13. Under the Surface of the Chemical Article, by Roald Hoffmann -- 14. Representation in Chemistry, by Roald Hoffmann and Pierre Laszlo -- 15. The Say of Things, by Roald Hoffmann and Pierre Laszlo -- 16. How Symbolic and Iconic Languages Bridge the Two Worlds of the Chemist: A Case Study from Contemporary Bioorganic Chemistry, by Emily R. Grosholz and Roald Hoffmann -- 17. How Nice to Be an Outsider, by Roald Hoffmann -- 18. The Metaphor, Unchained, by Roald Hoffmann -- Part 3: Art and Science -- 19. Art in Science?, by Roald Hoffmann -- 20. Science and Crafts, by Roald Hoffmann -- 21. Molecular Beauty, by Roald Hoffmann -- Part 4: Chemical Education -- 22. Teach to Search, by Roald Hoffmann -- 23. Some Heretical Thoughts on What Our Students Are Telling Us, by Roald Hoffmann and Brian P. Coppola -- 24. Very Specific Teaching Strategies, and Why They Work, by Roald Hoffmann and Saundra Y. McGuire -- Part 5: Ethics in Science -- 25. Mind the Shade, by Roald Hoffmann -- 26. Science and Ethics: A Marriage of Necessity and Choice for This Millennium, by Roald Hoffmann -- 27. Honesty to the Singular Object, by Roald Hoffmann -- 28. The Material and Spiritual Rationales Are Inseparable, by Roald Hoffmann -- Index.
Discussions concerning belief revision, theory development, and "creativity" in philosophy and AI reveal a growing interest in Peirce's concept of abduction. Peirce introduced abduction in an attempt to provide theoretical dignity and clarification to the difficult problem of knowledge generation. He wrote that "An Abduction is Originary in respect to being the only kind of argument which starts a new idea" (Peirce, CP 2.26). These discussions, however, led to considerable debates about the precise way in which Peirce's abduction can be used to explain knowledge generation (cf. Magnani, 1999; Hoffmann, 1999). The crucial question is that of understanding how we can get the new elements capable of enlarging our theories. Under these circumstances, it might be helpful to step out of the entanglement and reconsider the basis of the problem that originally triggered Peirce's interest in abduction. This will lead us to another Peircean concept, that of "diagrammatic reasoning," which I discuss here in the context of his "pragmatism." In this way, I hope to reach a better understanding of the contribution of "abduction" to the knowledge generation process.
The contributions to this volume originate from the workshop "Hauptsachen und Nebendinge—Pure Science and its Impurities," organized by Christoph Hoffmann, which took place at the Max-Planck-Institute for the History of Science (Berlin) in July 2000. We wish to thank all participants for rich and stimulating talks and discussions.
The author presents Gernot Böhme’s median mode of being theory, which attempts to find an anthropological middle ground between the rational and the irrational, the spiritual and the corporeal, and the active and the passive in human experience. Böhme’s reflections on the median mode of being are normative in character and linked to the concept of “sovereign man,” which he strongly defends and whose main characteristics Hoffmann outlines in the first part of the essay. Among other things, Hoffmann argues against Böhme’s excessive emphasis on the controlling/restrictive functions of awareness at the cost of those functions which serve to protect and stimulate life, his non-distinction between the distance to a cognized object and its intellectual instrumentalisation, and his rather one-sided tendency to seek the sources of European rationalism in the Socratean tradition.
This experiment investigated the use of positive and negative hypothesis and target tests by groups in an adaptation of the 2-4-6 Wason task. The experimental variables were range of rule (small vs large), amount of evidence (low vs high), and trial block (1 vs 2). The results were in accordance with Klayman and Ha's (1987) analysis of base rate probabilities of falsification and with additional theoretical considerations. Base rate probabilities were more descriptive of participants' behaviour in target than in hypothesis tests, under low than under high amount of evidence, and at the beginning of the process than at its end. The percentage of positive tests was higher under small than large range of rule. More falsifications than verifications resulted from hypothesis tests than would be expected by a random process. When evidence is richly available, the relative importance of falsification seems to decrease. An analysis of the group compositions before and after group discussion by the PCD model (Crott, Werner, & Hoffmann, 1996) revealed that the normative weight was approximately twice as large as the informational. Groups produced fewer false answers than their members individually.
Turning to a brief consideration of United States foreign policy, Hoffmann points to particular moral difficulties in U.S. stances and urges the development of superpower rules that are effective and ethical.
_The Arcades Project_, the monumental unfinished work of cultural criticism by Walter Benjamin, is the German philosopher’s effort to comprehend urban modernity through the 19th-century Parisian shopping arcade. _The Arcades: Contemporary Art and Walter Benjamin_ combines artworks with archival materials and poetic interventions to form an original, multifaceted response to this collagelike cultural text. Jens Hoffmann astutely pairs works by thirty-six well-known and emerging artists, including Lee Friedlander, Andreas Gursky, Pierre Huyghe, and Cindy Sherman, with the thirty-six “Convolutes,” or themes, in Benjamin’s text. Bound into the main volume is a graphic novelette, from the imagination of Vito Manolo Roma, of Benjamin’s dream the night before he committed suicide while fleeing the Nazis. Scholarly essays by Hoffmann and Caroline A. Jones, texts selected by the poet Kenneth Goldsmith, reproductions of Benjamin’s handwritten notes, and a list of the main Paris arcades discussed by him round out this extraordinary publication.
An action-oriented perspective changes the role of an individual from a passive observer to an actively engaged agent interacting in a closed loop with the world as well as with others. Cognition exists to serve action within a landscape that contains both. This chapter surveys this landscape and addresses the status of the pragmatic turn. Its potential influence on science and the study of cognition are considered (including perception, social cognition, social interaction, sensorimotor entrainment, and language acquisition) and its impact on how neuroscience is studied is also investigated (with the notion that brains do not passively build models, but instead support the guidance of action). A review of its implications in robotics and engineering includes a discussion of the application of enactive control principles to couple action and perception in robotics as well as the conceptualization of system design in a more holistic, less modular manner. Practical applications that can impact the human condition are reviewed (e.g., educational applications, treatment possibilities for developmental and psychopathological disorders, the development of neural prostheses). All of this foreshadows the potential societal implications of the pragmatic turn. The chapter concludes that an action-oriented approach emphasizes a continuum of interaction between technical aspects of cognitive systems and robotics, biology, psychology, the social sciences, and the humanities, where the individual is part of a grounded cultural system.
From the early reception of Thomas Aquinas up to the present, many have interpreted his theory of liberum arbitrium to imply intellectual determinism: we do not control our choices, because we do not control the practical judgments that cause our choices. In this paper we argue instead that he rejects determinism in general and intellectual determinism in particular, which would effectively destroy liberum arbitrium as he conceives of it. We clarify that for Aquinas moral responsibility presupposes liberum arbitrium and thus the ability to do otherwise, although the ability to do otherwise applies differently to praise and blame. His argument against intellectual determinism is not straightforward, but we construct it by analogy to his arguments against other deterministic threats. The non-determinism of the intellect’s causality with respect to the will results from his claims that practical reasoning is defeasible and that the reasons for actions are not contrastive reasons.
Many consumers implicitly associate sustainability with lower product strength. This so-called “ethical = less strong” intuition (ELSI) poses a major threat to the success of sustainable products. This article explores this pervasive lay theory and examines whether it is a key barrier to sustainable consumption patterns. Even more importantly, little is known about the underlying mechanisms that might operate differently at the implicit and explicit levels of the consumer’s decision-making. To fill this gap, three studies examine how the implicit judgments that consumers activate automatically shape their consumption behaviors, in concert with their more controlled explicit beliefs about sustainable products. The Main Study investigates the ELSI’s imprint on actual shopping patterns and disentangles the implicit and explicit mechanisms of the lay theory. This paper also asks how this negative influence can be attenuated by examining whether the consumer’s interest in sustainable consumption reduces reliance on the ELSI. Two follow-up studies confirm the robustness of the findings from different methodological and practical perspectives. Implications for companies and policy makers are derived.
The contribution of the body to cognition and control in natural and artificial agents is increasingly described as “off-loading computation from the brain to the body”, where the body is said to perform “morphological computation”. Our investigation of four characteristic cases of morphological computation in animals and robots shows that the ‘off-loading’ perspective is misleading. Actually, the contribution of body morphology to cognition and control is rarely computational, in any useful sense of the word. We thus distinguish (1) morphology that facilitates control, (2) morphology that facilitates perception, and the rare cases of (3) morphological computation proper, such as ‘reservoir computing,’ where the body is actually used for computation. This result contributes to the understanding of the relation between embodiment and computation: The question for robot design and cognitive science is not whether computation is offloaded to the body, but to what extent the body facilitates cognition and control – how it contributes to the overall ‘orchestration’ of intelligent behaviour.
Medieval authors generally agreed that we have the freedom to choose among alternative possibilities. But most medieval authors also thought that there are situations in which one cannot do otherwise, not even will otherwise. They also thought that, when willing necessarily, the will remains free. The questions, then, are what grounds the necessity or contingency of the will’s acts, and – since freedom is not defined by the ability to choose – what belongs to the essential character of freedom, the ratio libertatis. This article studies medieval theories of freedom without choice from William of Auxerre to William of Ockham and their background in Augustine, Anselm of Canterbury, and Bernard of Clairvaux.
On rationalist infallibilism, a wide range of both (i) analytic and (ii) synthetic a priori propositions can be infallibly justified (or absolutely warranted), i.e., justified to a degree that entails their truth and precludes their falsity. Though rationalist infallibilism is indisputably running its course, adherence to at least one of the two species of infallible a priori justification refuses to disappear from mainstream epistemology. Among others, Putnam (1978) still professes the a priori infallibility of some category (i) propositions, while Burge (1986, 1988, 1996) and Lewis (1996) have recently affirmed the a priori infallibility of some category (ii) propositions. In this paper, I take aim at rationalist infallibilism by calling into question the a priori infallibility of both analytic and synthetic propositions. The upshot will be twofold: first, rationalist infallibilism unsurprisingly emerges as a defective epistemological doctrine, and second, more importantly, the case for the a priori infallibility of one or both categories of propositions turns out to lack cogency.
Most of the epistemological debate on disagreement tries to develop standards that describe which actions or beliefs would be rational under specific circumstances in a controversy. To build things on a firm foundation, much work starts from certain idealizations—for example the assumption that parties in a disagreement share all the evidence that is relevant and are equal with regard to their abilities and dispositions. This contribution, by contrast, focuses on a different question and takes a different route. The question is: What should people who find themselves in deep disagreement with others actually do? And instead of building theory on some “firm foundation,” the paper starts from a specific goal—building consensus by creating new proposals—and asks, first, which actions are suitable to achieve this goal and, second, what are the epistemic conditions of these actions. With regard to the latter, the paper focuses on what has been called framing and reframing in conflict research, and argues that both metaphors need and deserve a suitable epistemological conceptualization.
Engineers fine-tune the design of robot bodies for control purposes; however, a methodology or set of tools is largely absent, and optimization of morphology (shape, material properties of robot bodies, etc.) is lagging behind the development of controllers. This has become even more prominent with the advent of compliant, deformable, or “soft” bodies. These carry substantial potential regarding their exploitation for control—sometimes referred to as “morphological computation”. In this article, we briefly review different notions of computation by physical systems and propose the dynamical systems framework as the most useful in the context of describing and eventually designing the interactions of controllers and bodies. Then, we look at the pros and cons of simple vs. complex bodies, critically reviewing the attractive notion of “soft” bodies automatically taking over control tasks. We address another key dimension of the design space—whether model-based control should be used and to what extent it is feasible to develop faithful models for different morphologies.
Previous research has shown that subliminally presented stimuli accelerate or delay responses afforded by supraliminally presented stimuli. Our experiments extend these findings by showing that unconscious stimuli even affect free choices between responses. Thus, actions that are phenomenally experienced as freely chosen are influenced without the actor becoming aware of the manipulation. However, the unconscious influence is limited to a response bias, as participants chose the primed response only in up to 60% of the trials. LRP data in free choice trials indicate that the prime was not ineffective in trials in which participants chose the non-primed response, as then it delayed performance of the incongruently primed response.
In recent years, semiotics has become an innovative theoretical framework in mathematics education. The purpose of this article is to show that semiotics can be used to explain learning as a process of experimenting with and communicating about one's own representations of mathematical problems. As a paradigmatic example, we apply a Peircean semiotic framework to answer the question of how students learned the concept of "distribution" in a statistics course by "diagrammatic reasoning" and by developing "hypostatic abstractions," that is, by forming new mathematical objects which can be used as means for communication and further reasoning. Peirce's semiotic terminology is used as an alternative for notions such as modeling, symbolizing, and reification. We will show that it is a precise instrument of analysis with regard to the complexity of learning and of communication in the mathematics classroom.
Why do we formulate arguments? Usually, things such as persuading opponents, finding consensus, and justifying knowledge are listed as functions of arguments. But arguments can also be used to stimulate reflection on one’s own reasoning. Since this cognitive function of arguments should be important to improve the quality of people’s arguments and reasoning, for learning processes, for coping with “wicked problems,” and for the resolution of conflicts, it deserves to be studied in its own right. This contribution develops first steps towards a theory of reflective argumentation. It provides a definition of reflective argumentation, justifies its importance, delineates it from other cognitive functions of argumentation in a new classification of argument functions, and it discusses how reflection on one’s own reasoning can be stimulated by arguments.
As a committee of the National Academy of Engineering recognized, ethics education should foster the ability of students to analyze complex decision situations and ill-structured problems. Building on the NAE’s insights, we report on an innovative teaching approach that has two main features: first, it places the emphasis on deliberation and on self-directed, problem-based learning in small groups of students; and second, it focuses on understanding ill-structured problems. The first innovation is motivated by an abundance of scholarly research that supports the value of deliberative learning practices. The second results from a critique of the traditional case-study approach in engineering ethics. A key problem with standard cases is that they are usually described in a fashion that renders the ethical problem as too obvious and simplistic. The practitioner, by contrast, may face problems that are ill-structured. In the collaborative learning environment described here, groups of students use interactive and web-based argument visualization software called “AGORA-net: Participate – Deliberate!”. The function of the software is to structure communication and problem solving in small groups. Students are confronted with the task of identifying possible stakeholder positions and reconstructing their legitimacy by constructing justifications for these positions in the form of graphically represented argument maps. The argument maps are then presented in class so that these stakeholder positions and their respective justifications become visible and can be brought into a reasoned dialogue. Argument mapping provides an opportunity for students to collaborate in teams and to develop critical thinking and argumentation skills.
Abductive reasoning takes place in forming "hypotheses" in order to explain "facts." Thus, the concept of abduction promises an understanding of creativity in science and learning. It also raises, however, a number of problems, some of which will be discussed in this paper. After analyzing the difference between induction and abduction (1), I shall discuss Peirce's claim that there is a "logic" of abduction (2). The thesis is that this claim can be understood if we make a clear distinction between inferential elements and perceptive elements of abductive reasoning. For Peirce, the creative act of forming explanatory hypotheses and the emergence of "new ideas" belongs exclusively to the perceptive side of abduction. Thus, it is necessary to study the role of perception in abductive reasoning (3). A further problem is the question whether there is a relationship between abduction and Peirce's concept of "theorematic reasoning" in mathematics (4). Both forms of reasoning could be connected, because both are based on perception. The last problem concerns the role of instincts in explaining the success of abductive reasoning in science, and the question whether the concept of instinct might be replaced by methods of inquiry (5).
Minimalism is currently the received deflationary theory of truth. On minimalism, truth is a transparent concept and a deflated property of truth bearers. In this paper, I situate minimalism within current deflationary debate about truth by contrasting it with its main alternative―the redundancy theory of truth. I also outline three of the primary challenges facing minimalism, its formulation, explanatory adequacy and stability, and draw some lessons for the soundness of its conception of truth.
Had more philosophers of science come from chemistry, their thinking would have been different. I begin by looking at a typical chemical paper, in which making something is the leitmotif, and conjecture/refutation is pretty much irrelevant. What in fact might have been, might be, different? The realism of chemists is reinforced by their remarkable ability to transform matter; they buy into reductionism where it serves them, but make no real use of it. Incommensurability is taken without a blink, and actually serves. The preeminence of synthesis in chemistry could have led philosophers of science to take more seriously questions of aesthetics within science, and to find a place in aesthetics for utility. The necessary motion twixt macroscopic and microscopic views of matter in modern chemistry leads to the coexistence of symbolic and iconic representations. And in another way to the deliberate, creative violation of categories.
Over the last two decades, the capabilities approach has become an increasingly influential theory of development. It conceptualises human wellbeing in terms of an individual's ability to achieve functionings we have reason to value. In contrast, the African ethic of ubuntu views human flourishing as the propensity to pursue relations of fellowship with others, such that relationships have fundamental value. These two theoretical perspectives seem to be in tension with each other; while the capabilities approach focuses on individuals as the locus of ethical value, an ubuntu ethic concentrates on the relations between individuals. In this article, we ask: to what extent is the capabilities approach compatible with this African ethical theory? We argue that, on reflection, relations play a much stronger role in the capabilities approach than often assumed. There is good reason to believe that relationality is part of the concept of a capability itself, where such relationality has intrinsic ethical value. This understanding of the ethical centrality of relations grounds new normative perspectives on the capabilities approach, and offers a more comprehensive grasp of the relevance of relationships to empirical enquiry.
The philosophy of modality investigates necessity and possibility, and related notions: are they objective features of mind-independent reality? If so, are they irreducible, or can modal facts be explained in other terms? This volume presents new work on modality by established leaders in the field and by up-and-coming philosophers. Between them, the papers address fundamental questions concerning realism and anti-realism about modality, the nature and basis of facts about what is possible and what is necessary, the nature of modal knowledge, and modal logic and its relations to necessary existence and to counterfactual reasoning. The general introduction locates the individual contributions in the wider context of the contemporary discussion of the metaphysics and epistemology of modality.
This essay is the first attempt to compare Reinhart Koselleck's Historik with Hannah Arendt's political anthropology and her critique of the modern concept of history. Koselleck is well-known for his work on conceptual history as well as for his theory of historical time. It is my contention that these different projects are bound together by Koselleck's Historik, that is, his theory of possible histories. This can be shown through an examination of his writings from Critique and Crisis to his final essays on historical anthropology, most of which have not yet been translated into English. Conversely, Arendt's political theory has in recent years been the subject of numerous interpretations that do not take into account her views about history. By comparing the anthropological categories found in Koselleck's Historik with Arendt's political anthropology, I identify similar intellectual lineages in them as well as shared political sentiments, in particular the anti-totalitarian impulse of the postwar era. More importantly, Koselleck's theory of the preconditions of possible histories and Arendt's theory of the preconditions of the political, I argue, transcend these lineages and sentiments by providing essential categories for the analysis of historical experience.
A large body of research in cognitive science differentiates human reasoning into two types: fast, intuitive, and emotional “System 1” thinking, and slower, more reflective “System 2” reasoning. According to this research, human reasoning is by default fast and intuitive, but that means that it is prone to error and biases that cloud our judgments and decision making. To improve the quality of reasoning, critical thinking education should develop strategies to slow it down and to make it more reflective. The goal of such education should be to enable and motivate students to identify weaknesses, gaps, biases, and limiting perspectives in their own reasoning and to correct them. This contribution discusses how this goal could be achieved with regard to reasoning that involves the construction of arguments; or more precisely: how computer-supported argument visualization (CSAV) tools could be designed that support reflection on the quality of arguments and their improvement. Three types of CSAV approaches are distinguished that focus on reflection and self-correcting reasoning. The first is to trigger reflection by confronting the user with specific questions that direct attention to critical points. The second approach uses templates that, on the one hand, provide a particular structure to reason about an issue by means of arguments and, on the other, include prompts to enter specific items. And a third approach is realized in specifically designed user guidance that attempts to trigger reflection and self-correction. These types of approaches are currently realized in only very few CSAV tools. In order to inform the future development of what I call reflection tools, this article discusses the potential and limitations of these types and tools with regard to five explanations of the observation that students hardly ever engage in substantial revisions of what they wrote: a lack of strategies for how to do it; cognitive overload; certain epistemic beliefs; myside bias; and overconfidence in the quality of one's own reasoning. The question is: To what degree can each of the CSAV approaches and tools address these five potential obstacles to reflection and self-correction?
On rationalist infallibilism, a wide range of both (i) analytic and (ii) synthetic a priori propositions can be infallibly justified, i.e., justified in a way that is truth-entailing. In this paper, I examine the second thesis of rationalist infallibilism, what might be called ‘synthetic a priori infallibilism’. Exploring the seemingly only potentially plausible species of synthetic a priori infallibility, I reject the infallible justification of so-called self-justifying propositions.
The role of uncertainty within an organization’s environment features prominently in the business ethics and management literature, but how corporate investment decisions should proceed in the face of uncertainties relating to the natural environment is less discussed. From the perspective of ecological economics, the salience of ecology-induced issues challenges management to address new types of uncertainties. These pertain to constraints within the natural environment as well as to institutional action aimed at conserving the natural environment. We derive six areas of ecology-induced uncertainties and propose ecology-driven real options as a conceptual approach for systematically incorporating these uncertainties into strategic management. We combine our results in an integrative investment framework and illustrate its application with the case of carbon constraints.
The possible-worlds analysis of propositions identifies a proposition with the set of possible worlds where it is true. This analysis has the hitherto unnoticed consequence that a proposition depends for its existence on the existence of every proposition that entails it. This peculiar consequence places the possible-worlds analysis in conflict with the conjunction of two compelling theses. One thesis is that a phrase of the form ‘the proposition that S’ is a rigid designator. The other thesis is that a proposition which is directly about an object – a singular proposition – depends for its existence on the existence of the object. I defend these theses and conclude that the cost of the possible-worlds analysis is prohibitively high.
The aim of this paper is to define a notion of supervenience which can adequately describe the systematic dependence of extrinsic as well as of intrinsic higher-level properties on base-level features. We argue that none of the standard notions of supervenience—the concepts of weak, strong and global supervenience—fulfil this function. The concept of regional supervenience, which is purported to improve on the standard conceptions, turns out to be problematic as well. As a new approach, we develop the notion of property-dependent supervenience. This notion is founded on a criterion of relevance adapting the supervenience base to the considered higher-level properties in a specific way, such that only features which are relevant to the instantiation of the higher-level properties under consideration are taken into account.
This article examines differences in the research approaches of farmers and scientists and analyzes how these differences are related to the conditions under which both groups engage in experimental work. Theoretical considerations as well as practical experiences are presented to emphasize the great potential of farmer–researcher collaboration for rural innovation. In the first part of the article, the innovative power of farmer research and experimentation is acknowledged by presenting examples such as crop and animal breeding, development of new production systems, farm equipment, and social innovations. Considering the respective comparative advantages of farmers and scientists, and inspired by theoretical concepts in the fields of knowledge management and innovation processes, we discuss five topics for optimizing the collaboration between farmers and scientists in the field of technological innovation: user orientation, decentralization, informal modes of experimentation, externalization of tacit knowledge, and economic considerations. A better understanding of such issues could help researchers to define their own role in the research process, acknowledge the strengths and weaknesses of their own and farmers’ research approaches, overcome communication gaps, and find creative solutions for problems that typically occur in the process of participatory technology development.
This volume represents an important contribution to Peirce’s work in mathematics and formal logic. An internationally recognized group of scholars explores and extends understandings of Peirce’s most advanced work. The stimulating depth and originality of Peirce’s thought and the continuing relevance of his ideas are brought out by this major book.