In this paper we consider the major development of mathematical analysis during the mid-nineteenth century. On the basis of Jahnke's (Hist Math 20(3):265–284, 1993) distinction between considering mathematics as an empirical science based on time and space and considering mathematics as a purely conceptual science, we discuss the Swedish nineteenth-century mathematician E.G. Björling's general view of real- and complex-valued functions. We argue that Björling had a tendency to sometimes consider mathematical objects in a naturalistic way. One example, investigated in the paper, is how Björling interprets Cauchy's definition of the logarithm function with respect to complex variables. Furthermore, in view of an article written by Björling (Kongl Vetens Akad Förh Stockholm 166–228, 1852), we consider Cauchy's theorem on power series expansions of complex-valued functions. We investigate the different conditions given by Björling, Cauchy and the Belgian mathematician Lamarle for expanding a complex function of a complex variable in a power series. We argue that one reason why Cauchy's theorem was controversial could be the ambiguities of fundamental concepts in analysis that existed during the mid-nineteenth century. This problem is demonstrated with examples from Björling, Cauchy and Lamarle.
This fundamental and straightforward text addresses a weakness observed among present-day students, namely a lack of familiarity with formal proof. Beginning with the idea of mathematical proof and the need for it, associated technical and logical skills are developed with care and then brought to bear on the core material of analysis in such a lucid presentation that the development reads naturally and in a straightforward progression. Retaining the core text, the second edition has additional worked examples which users have indicated a need for, in addition to more emphasis on how analysis can be used to assess the accuracy of the approximations to the quantities of interest which arise in analytical limits.
Mathematical proofs generally allow for various levels of detail and conciseness, such that they can be adapted for a particular audience or purpose. Using automated reasoning approaches for teaching proof construction in mathematics presupposes that the step size of proofs in such a system is appropriate within the teaching context. This work proposes a framework that supports the granularity analysis of mathematical proofs, to be used in the automated assessment of students' proof attempts and for the presentation of hints and solutions at a suitable pace. Models for granularity are represented by classifiers, which can be generated by hand or inferred from a corpus of sample judgments via machine-learning techniques. This latter procedure is studied by modeling granularity judgments from four experts. The results provide support for the granularity of assertion-level proofs but also illustrate a degree of subjectivity in assessing step size.
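The granularity models described above, classifiers over features of a proof step, can be sketched in a minimal hand-written form; the feature names and thresholds below are illustrative assumptions for the sake of the example, not taken from the system the abstract describes.

```python
# Hypothetical sketch of a hand-written granularity classifier.
# Feature names ("assertion_level_inferences", "new_concept") and the
# thresholds are illustrative assumptions, not from the paper.

def classify_step(step):
    """Classify one proof step as 'appropriate', 'too-detailed', or 'too-coarse'.

    `step` is a dict of features extracted from a student's proof step:
    how many assertion-level inferences it bundles together, and whether
    it introduces a new concept.
    """
    n = step["assertion_level_inferences"]
    if n == 0:
        return "too-detailed"   # purely syntactic restatement, no inference
    if n > 3 or (step.get("new_concept", False) and n > 1):
        return "too-coarse"     # too many inferences compressed into one step
    return "appropriate"

steps = [
    {"assertion_level_inferences": 1},
    {"assertion_level_inferences": 5},
    {"assertion_level_inferences": 0},
]
print([classify_step(s) for s in steps])
# → ['appropriate', 'too-coarse', 'too-detailed']
```

A corpus-trained model would replace the hand-coded rules with a classifier fitted to expert judgments, but the input/output contract stays the same.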
A critical study of McPeck's recent book, in which he strengthens and develops his arguments against teaching critical thinking (CT). Accepting McPeck's basic claim that there is no unitary skill of reasoning or thinking, I argue that his strictures on CT courses or programs do not follow. I set out what I consider the proper justification that programs in CT have to meet, and argue both that McPeck demands much more than is required, and also that it is plausible that this deflated justification can be met. Specifically, I argue that it is reasonable to expect transfer of learning for basic logical skills. Additional topics covered include: the relation of liberal education to critical thinking, argument analysis, testing for CT, and the value of conceptual or linguistic analysis.
This article describes a model for incorporating lesson study into the student teaching placement and reports on the success of the implementation of such a model with student teachers and their cooperating teachers (CTs). Student teachers had the opportunity to discuss many important ideas with each other and their CTs, including "big ideas" of mathematics, and the anticipation of student questions and possible responses. Student teachers also had a built-in opportunity for peer observation on a regular basis and the opportunity to collaborate with their peers. Certain important aspects of lesson study were not present in this implementation: the teachers involved did not discuss the gaps in their own knowledge with the goal of improving their own mathematical understanding, they did not refer to outside sources for ideas for the lessons, and they did not have an overarching affective goal for students. Suggestions are made for teacher preparation in light of these findings.
Interest in the computational aspects of modeling has been steadily growing in philosophy of science. This paper aims to advance the discussion by articulating the way in which modeling and computational errors are related and by explaining the significance of error management strategies for the rational reconstruction of scientific practice. To this end, we first characterize the role and nature of modeling error in relation to a recipe for model construction known as Euler’s recipe. We then describe a general model that allows us to assess the quality of numerical solutions in terms of measures of computational errors that are completely interpretable in terms of modeling error. Finally, we emphasize that this type of error analysis involves forms of perturbation analysis that go beyond the basic model-theoretical and statistical/probabilistic tools typically used to characterize the scientific method; this demands that we revise and complement our reconstructive toolbox in a way that can affect our normative image of science.
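How measures of computational error behave as a discretization is refined can be illustrated with a minimal numerical experiment. The ODE, the forward-Euler integration method, and the error measure below are generic textbook choices for illustration only; they are not the paper's model, and the integration method is distinct from the "Euler's recipe" for model construction mentioned above.

```python
import math

# Illustrative sketch (not from the paper): the global computational error
# of a forward-Euler solution of y' = -y, y(0) = 1, whose exact solution
# is y(t) = exp(-t).  The error at t = 1 shrinks roughly linearly with the
# step size h, the hallmark of a first-order method.

def euler_error(h):
    """Absolute error at t = 1 of forward Euler with step size h."""
    steps = round(1.0 / h)
    y = 1.0
    for _ in range(steps):
        y += h * (-y)          # forward-Euler update for y' = -y
    return abs(y - math.exp(-1.0))

for h in (0.1, 0.01, 0.001):
    print(h, euler_error(h))
```

Separating such purely computational error from error in the model itself is exactly the kind of decomposition the paper's framework is meant to make systematic.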
Philosophy: The Essential Study Guide is a compact and straightforward guide to the skills needed to study philosophy, aimed at anyone coming to the subject for the first time or just looking to improve their performance. Nigel Warburton, bestselling author of Philosophy: The Basics , clarifies what is expected of students and offers strategies and guidance to help them make effective use of their study time and improve their marks. The four main skills covered by the book are: · READING philosophy - both skimming and in-depth analysis of historical and contemporary work, understanding the examples and terminology used · LISTENING to philosophy - formal lectures and informal classroom teaching, preparation, picking up on arguments used, note taking · DISCUSSING philosophy - arguing and exploring, asking questions, communicating in concise and understandable ways · WRITING philosophy - planning and researching essays and other written tasks, thinking up original examples, avoiding plagiarism Written in Nigel Warburton's customary student-friendly style and filled with sound advice and top tips, Philosophy: The Essential Study Guide is an indispensable guide for anyone getting to grips with their first philosophy course.
The need to make young scientists aware of their social responsibilities is widely acknowledged, although the question of how to actually do it has so far gained limited attention. A 2-day workshop entitled “Prepared for social responsibility?” attended by doctoral students from multiple disciplines in climate science, was targeted at the perceived needs of the participants and employed a format that took them through three stages of ethics education: sensitization, information and empowerment. The workshop aimed at preparing doctoral students to manage ethical dilemmas that emerge when climate science meets the public sphere (e.g., to identify and balance legitimate perspectives on particular types of geo-engineering), and is an example of how to include social responsibility in doctoral education. The paper describes the workshop from the three different perspectives of the authors: the course teacher, the head of the graduate school, and a graduate student. The elements that contributed to the success of the workshop, and thus make it an example to follow, are (1) the involvement of participating students, (2) the introduction of external expertise and role models in climate science, and (3) a workshop design that focused on ethical analyses of examples from the climate sciences.
The general public and environmental policy makers often perceive management actions of environmental managers as science, when such actions are, in fact, value judgments about when to intervene in natural processes. The choice of action requires ethical as well as scientific analysis because managers must choose a normative outcome to direct their intervention. I examine a management case study involving prescribed burning of sagebrush (Artemisia tridentata) communities in south-central Montana (USA) to illustrate how to teach students to ethically evaluate a management action by precisely identifying: 1) the proposed management action, 2) the deficiency of the system to be remedied by the action, 3) the stakeholders affected by the action, and 4) the category and type of values affirmed in the management action. Through such analysis, students are taught to recognize implicit and explicit value judgments associated with management actions, identify stakeholders to whom managers have legitimate ethical obligations, and practice a general method of ethical analysis applicable to many forms of environmental management.
Word problems in mathematics seem to constantly pose learning difficulties for all kinds of students. Recent work in math education (for example, [Lakoff, G. & Nuñez, R. E. (2000). Where mathematics comes from: How the embodied mind brings mathematics into being. New York: Basic Books]) suggests that the difficulties stem from an inability on the part of students to decipher the metaphorical properties of the language in which such problems are cast. A 2003 pilot study [Danesi, M. (2003a). Semiotica, 145, 71–83] confirmed this hypothesis in an anecdotal way. This paper reviews the implications of that study and of a follow-up one that is described here as well, in the light of how the metaphorical analysis of word problems allows learners to overcome typical difficulties in word problem-solving by teaching them how to flesh out the underlying concepts and convert them into appropriate representations.
A remarkable development in twentieth-century mathematics is smooth infinitesimal analysis ('SIA'), introducing nilsquare and nilpotent infinitesimals, recovering the bulk of scientifically applicable classical analysis ('CA') without resort to the method of limits. Formally, however, unlike Robinsonian 'nonstandard analysis', SIA conflicts with CA, deriving, e.g., 'not every quantity is either = 0 or not = 0.' Internally, consistency is maintained by using intuitionistic logic (without the law of excluded middle). This paper examines problems of interpretation resulting from this 'change of logic', arguing that standard arguments based on 'smoothness' requirements are question-begging. Instead, it is suggested that recent philosophical work on the logic of vagueness is relevant, especially in the context of a Hilbertian structuralist view of mathematical axioms (as implicitly defining structures of interest). The relevance of both topos models for SIA and modal-structuralism as applied to this theory is clarified, sustaining this remarkable instance of mathematical pluralism.
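The role of nilsquare infinitesimals can be made concrete with a standard SIA-style computation (a textbook illustration, not drawn from the paper under discussion): for an infinitesimal $\varepsilon$ with $\varepsilon^2 = 0$, the derivative of $f(x) = x^2$ falls out algebraically, with no appeal to limits.

```latex
% Derivative of f(x) = x^2 via a nilsquare infinitesimal \varepsilon
% (with \varepsilon^2 = 0):
\begin{align*}
f(x+\varepsilon) &= (x+\varepsilon)^2 \\
                 &= x^2 + 2x\varepsilon + \varepsilon^2 \\
                 &= x^2 + 2x\varepsilon,
\end{align*}
% so f(x+\varepsilon) = f(x) + \varepsilon f'(x) holds exactly, with
% f'(x) = 2x; the method of limits is never invoked.
```

The same pattern, reading off $f'(x)$ as the coefficient of $\varepsilon$, is how SIA recovers the differential calculus of classical analysis.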
The goal of this paper is to stress the significance of ethics for engineering education and to illustrate how it can be brought into the mainstream of higher education in a natural way that is integrated with the teaching objectives of enriching the core meaning of engineering. Everyone will agree that the practicing engineer should be virtuous, should be a good colleague, and should use professional understanding for the common good. But these injunctions to virtue do not reach closely enough the ethic of the engineer as engineer, as someone acting in a uniquely engineering situation, and it is to such conditions that I wish to speak through a set of specific examples from recent history. I shall briefly refer to four controversies between engineers. Then, in some detail I shall narrate three historical cases that directly involve the actions of one engineer, and finally I would like to address some common contemporary issues. The first section, “Engineering Ethics and the History of Innovation” includes four cases involving professional controversy. Each controversy sets two people against each other in disputes over who invented the telegraph, the radio, the automobile, and the airplane. In each dispute, it is possible to identify ethical and unethical behavior or ambiguous ethical behavior that serves as a basis for educational discussion. The first two historical cases described in “Crises and the Engineer” involve the primary closure dam systems in the Netherlands, each one the result of the actions of one engineer. The third tells of an American engineer who took his political boss, a big city mayor, to court over the illegal use of a watershed. The challenges these engineers faced required, in the deepest sense, a commitment to ethical behavior that is unique to engineering and instructive to our students.
Finally, the cases in “Professors and Comparative Critical Analysis” illuminate the behavior of engineers in the design of structures and also how professors can make public criticisms of designs that seem wasteful.
George Boole collected ideas for the improvement of his Mathematical analysis of logic (1847) on interleaved copies of that work. Some of the notes on the interleaves are merely minor changes in explanation. Others amount to considerable extension of method in his mathematical approach to logic. In particular, he developed his technique in solving simultaneous elective equations and handling hypotheticals and elective functions. These notes and extensions provided a source for his later book Laws of thought (1854).
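Boole's handling of elective functions rested on the expansion (or "development") of a function in terms of its values at 0 and 1, $f(x) = f(1)\,x + f(0)\,(1-x)$, valid whenever $x$ is idempotent ($x^2 = x$, i.e. $x$ is 0 or 1). A minimal sketch checking this identity numerically; the code and function names are modern illustrations, not Boole's own notation:

```python
# Boole's expansion of an elective function of one symbol:
#   f(x) = f(1)*x + f(0)*(1 - x)
# which agrees with f on the idempotent values x = 0 and x = 1.
# (Modern illustrative code, not Boole's notation.)

def expand(f, x):
    """Boole's development of f at the idempotent value x."""
    return f(1) * x + f(0) * (1 - x)

f = lambda x: 3 * x + 2      # an arbitrary function of one elective symbol
assert all(expand(f, x) == f(x) for x in (0, 1))
print("expansion agrees on idempotent values")
```

Systems of simultaneous elective equations are then solved by developing each side in this way and equating coefficients, which is the technique the interleaved notes extend.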
(2012). Learning, Teaching and Education Research in the 21st Century: an Evolutionary Analysis of the Role of Teachers. By Joanna Swann. British Journal of Educational Studies: Vol. 60, No. 3, pp. 290-291. doi: 10.1080/00071005.2012.714553.
The papers in this volume represent the views of a range of experts in a variety of language-related disciplines on the role which context plays in language learning and language understanding. The authors provide various theoretical constructs which help impose order on the apparent chaos of contextual factors which may have an influence on the production and comprehension of speech events. They focus on a variety of types of context, including the context established by different speech communities, interpersonal contexts, the classroom context, and the context provided by the linguistic code itself. The papers illustrate how the treatment of context varies across the disciplines of linguistics, historical stylistics, applied linguistics, and psycholinguistics. Each paper is prefaced by an editorial introduction to help the reader trace out common themes and points of conflict.
Starting from a critical review of various motivational frameworks of change that have been applied to the study of eating disorders, the present paper provides an alternative conceptualization of change in psychotherapy by presenting a single case study. We analysed six psychotherapeutic conversations with a bulimic patient and identified narratives “for” and “against” change. We read them in terms of tension between dominance and exchange in I-positions, as described by Hermans. These results indicate that the dialogical analysis of clinical discourse may be a useful method for investigating change from the beginning to the end of therapy.
According to a grand narrative that long ago ceased to be told, there was a seventeenth century Scientific Revolution, during which a few heroes conquered nature thanks to mathematics. This grand narrative began with the exhibition of quantitative laws that these heroes, Galileo and Newton for example, had disclosed: the law of falling bodies, according to which the distance traversed by a falling body is proportional to the square of the time that has elapsed since the beginning of its fall; the law of gravitation, according to which two bodies are attracted to one another in proportion to the product of their masses and in inverse proportion to the square of the distance separating them -- according to his own preferences, each narrator added one or two quantitative laws of this kind. The essential feature was not so much the examples that were chosen, but, rather, the more or less explicit theses that accompanied them. First, mathematization would be taken as the criterion for distinguishing between a qualitative Aristotelian philosophy and the new quantitative physics. Secondly, mathematization was founded on the metaphysical conviction that the world was created pondere, numero et mensura, or that the ultimate components of natural things are triangles, circles, and other geometrical objects. This metaphysical conviction had two immediate consequences: that all the phenomena of nature can be in principle submitted to mathematics and that mathematical language is transparent; it is the language of nature itself and has simply to be picked up at the surface of phenomena. Finally, it goes without saying that, from a social point of view, the evolution of the sciences was apprehended through what has been aptly called the 'relay runner model,' according to which science progresses as a result of individual discoveries. Grand narratives such as this are perhaps simply fictions doomed to ruin as soon as they are clearly expressed.
In any case, the very assumption on which this grand narrative relies can be brought into question: even in the canonical domain of mechanics, the relevant epistemological units crucial to understanding the dynamics of the Scientific Revolution are perhaps not a few laws of motion, but a complex set of problems embodied in mundane objects. Moreover, each of the theses just mentioned was actually challenged during the long period of historiographical reappraisal, out of which we have probably not yet stepped. Against the sharp distinction between a qualitative Aristotelian philosophy and the new quantitative physics, numerous studies insist that Rome wasn't built in a day, so to speak. Since Antiquity, there have always been mixed sciences; the emergence of pre-classical mechanics depends on both medieval treatises and the practical challenges met by Renaissance engineers. It is indeed true that, for Aristotle, mathematics merely captures the superficial properties of things, but the Aristotelianisms were many during the Renaissance and the Early Modern period, with some of them being compatible with the introduction of mathematics in natural philosophy. In addition, the gap between the alleged program of mathematizing nature and its effective realization was underlined as most natural phenomena actually escaped mathematization; at best they were enrolled in what Thomas Kuhn began to rehabilitate under the appellation of the 'Baconian sciences,' i.e., empirical investigations aiming at establishing isolated facts, without relating them to any overarching theory. Hence, mathematization of nature cannot pretend to capture a historical fact: at most, it expresses an indeterminate task for generations to come. On top of these first two considerations, and against the thesis of the neutrality of the mathematical language, it was urged that mathematics is not 'only a language' and that, exactly as other symbolic means or cognitive tools, it has its own constraints. 
For example, it has been thoroughly explained that the Euclidean theory of proportions both guides and frustrates the Galilean analysis of motion; its shortcomings were particularly clear with respect to the expression of continuity, which is crucial in the case of motion. Consequently, when calculus was invented and applied to the analysis of motion, it was not a transposition that left things as they stood. Even more clearly than in the case of a translation from one natural language to another, the shift from one symbolic language to another entails that certain possibilities are opened while others are closed. The cognitive constraints imposed by established mathematical theories, as seen in the theory of proportions or calculus, were not the only ones to be studied in relation to mathematization. Certain schemes dependent on the grammar of natural languages, e.g., the scheme of contrariety, or certain symbolic means of representation, e.g., geometrical diagrams and numerical tables, were also subject to such scrutiny. Lastly, it was insisted that, even if we concede the existence of scientific geniuses, mathematics is largely produced by intellectual communities and embedded within social practices. More attention was consequently paid to the forms of communication in given mathematical networks, or to the teaching of the discipline in, for example, Jesuit colleges and universities. The set of mathematical practices specific to specialized craftsmen, highly-qualified experts and engineers began to be studied in its own right. All these reflections may have helped us change our perspectives on the question of mathematization. It seems, however, that they were instead set aside, both because of a general distrust towards sweeping narratives that are always subject to the suspicion that they overlook the unyielding complexity of real history, and because of a shift in our interests.
The more obscure and idiosyncratic the historical actors are, the more an alchemist, a patron of the sciences or a lunatic collector is nowadays honored in journals of the history of sciences. As for the general issues involved in the question of mathematization, they are rejected as obsolete, or reserved for specialized journals in the history of mathematics. Consequently, before presenting the essays of this fascicle, I would like to say a few words in favor of a renewed study of the forms of mathematization in the history of the early sciences.
From the perspective of a qualitative meta-analysis, the study puts forward an inventory of communist-regime studies along the following lines: 1. The re-evaluation of the social ideology-propaganda-practice relationship concerning equality between the sexes under the communist regime. 2. The contextualization and evolution of the social representations of a woman's role. 3. The effects of certain political decisions, which can be counted as aggression by a state against its citizens (the 770/1966 Decree).
Empirical studies in business ethics often rely on self-reported data, but this reliance is open to criticism. Responses to questionnaires and interviews may be influenced by the subject's view of what the researcher might want to hear, by a reluctance to talk about sensitive ethical issues, and by imperfect recall. This paper reviews the extent to which published research in business ethics relies on interviews and questionnaires, and then explores the possibilities of using secondary data, such as company documents and newspaper reports, as a source for empirical studies in applied ethics. A specific example is then discussed, describing the source material, the method, the development of the research questions, and the way in which reliability and validity were established. In the example, content analysis was used to examine the extent to which the executive virtue of courage was observed or called for in items published in four international daily newspapers, and to explore the meaning which was attributed to "courage" in the papers.
The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy, and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.
In this article I use a case study of 3 newspaper pieces about assisted suicide and euthanasia to show how journalists can use analysis and commentary to highlight the ethical dimension of an important public issue. Using an approach grounded in ethical theory, I examine how these pieces, from the Christian Science Monitor, Los Angeles Times, and New York Times, shed light on ethical issues including matters of duties and consequences. It is argued that an analytical approach that openly frames a topic as having a moral dimension is particularly appropriate for ethics coverage in light of the value-laden nature of writing about ethics.
Recent events have raised concerns about the ethical standards of public and private organisations, with some attention falling on business schools as providers of education and training to managers and senior executives. This paper investigates the nature of, motivation and commitment to, ethics tuition provided by the business schools. Using content analysis of their institutional and home websites, we appraise their corporate identity, level of engagement in socially responsible programmes, degree of social inclusion, and the relationship to their ethics teaching. Based on published research, a schema is developed with corporate identity forming an integral part, to represent the macro-environment, parent institution, the business school and their relationships to ethics education provision. This is validated by our findings.
We give an overview of recent results in ordinal analysis. To this end, we discuss the different frameworks used in mathematical proof theory, namely "subsystems of analysis" (including "reverse mathematics"), "Kripke-Platek set theory", "explicit mathematics", "theories of inductive definitions", "constructive set theory", and "Martin-Löf's type theory".
Under the assumption that business ethics can be enhanced by evaluating the effects of decisions on others, this essay demonstrates a case study in stakeholder analysis. While much normative literature has been compiled on the subject of stakeholder management, even more can be learned from the first-hand observation of stakeholder interactions. The purpose of this essay is to present a model of Basic Manufacturing Technologies' stakeholder universe, and illustrate how this manufacturer of steel interacts with its stakeholders. The conclusion which is drawn suggests that these relationships are more complex than BMT's stakeholder philosophy portrays.
In this paper we study a new approach to classifying mathematical theorems according to their computational content. Basically, we are asking which theorems can be continuously or computably transferred into each other. For this purpose theorems are considered via their realizers, which are operations with certain input and output data. The technical tool to express continuous or computable relations between such operations is Weihrauch reducibility and the partially ordered degree structure induced by it. We have identified certain choice principles, such as co-finite choice, discrete choice, interval choice, compact choice and closed choice, which are cornerstones among Weihrauch degrees, and it turns out that certain core theorems in analysis can be classified naturally in this structure. In particular, we study theorems such as the Intermediate Value Theorem, the Baire Category Theorem, the Banach Inverse Mapping Theorem, the Closed Graph Theorem and the Uniform Boundedness Theorem. We also explore how existing classifications of the Hahn–Banach Theorem and Weak Kőnig's Lemma fit into this picture. Well-known omniscience principles from constructive mathematics such as LPO and LLPO can also naturally be considered as Weihrauch degrees, and they play an important role in our classification. Based on this we compare the results of our classification with existing classifications in constructive and reverse mathematics, and we claim that in a certain sense our classification is finer and sheds some new light on the computational content of the respective theorems. Our classification scheme does not require any particular logical framework or axiomatic setting, but can be carried out in the framework of classical mathematics using tools of topology, computability theory and computable analysis.
We develop a number of separation techniques based on a new parallelization principle, on certain invariance properties of Weihrauch reducibility, on the Low Basis Theorem of Jockusch and Soare, and on the Baire Category Theorem. Finally, we present a number of metatheorems that allow one to derive upper bounds for the classification of the Weihrauch degree of many theorems, and we discuss the Brouwer Fixed Point Theorem as an example.
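For reference, Weihrauch reducibility between realizer-represented problems $f$ and $g$ is standardly defined as follows; this is the modern textbook formulation, reconstructed here rather than quoted from the paper:

```latex
% f is Weihrauch reducible to g when two computable functions K (pre-processing)
% and H (post-processing) uniformly turn any realizer G of g into a realizer of f:
f \le_W g \iff
  \exists\,\text{computable } H, K \;\;
  \forall G \vdash g :\;
  H\langle \operatorname{id},\, G \circ K \rangle \vdash f
```

Here $G \vdash g$ means that $G$ realizes $g$ on names of inputs, and $\langle \cdot,\cdot \rangle$ is a pairing of the original input with the post-processed answer; the induced degree structure is the partial order mentioned in the abstract.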
This desk-based study explores, from a Critical Realist perspective, the possibility of integrating the concept of Learning Cultures within the scope of Critical Discourse Analysis. It proposes a theoretical framework to support and guide the use of textual analysis in the study of Learning Cultures and highlights new opportunities to study technology-enhanced learning communities and communities of practice, leveraging Corpora Analysis and Metaphor Individuation Procedures.
This article will explore pediatric consent through the analysis of a clinical case study using the principles of biomedical ethics approach. Application of the principles of autonomy, nonmaleficence, beneficence, and justice will be dissected in order to attempt to establish resolution of the ethical dilemma. The main conflict in this case study deals with whether the wishes of an adolescent for end-of-life care should be followed or whether the desire of his parents should outweigh this request. In terminal cancer, the hope of early palliative care and dignity in dying serve as priorities in therapy. Application of the moral principles to both sides of the dilemma aided in providing an objective resolution to uphold pediatric consent.
Legal philosophers divide over whether it is possible to analyze legal concepts without engaging in normative argument. The influential analysis of legal rights advanced by Jules Coleman and Jody Kraus some years ago serves as a useful case study to consider this issue because even some legal philosophers who are generally skeptical of the neutrality claims of conceptual analysts have concluded that Coleman and Kraus's analysis manages to maintain such neutrality. But that analysis does depend in subtle but important ways on normative claims. Their argument assumes not only a positivist concept of law, but also that it counts in favor of an analysis of legal rights that it increases the number of options available to legal decisionmakers. Thus, whether Coleman and Kraus's analysis is right in the end depends on whether those normative assumptions are justified. If even their analysis, which makes the thinnest of conceptual claims, depends on normative premises, that fact serves as strong evidence of the difficulty of analyzing legal concepts while remaining agnostic on moral and political questions.
The purpose of this paper is to apply Aristotle's theory of causation to the administrative realm in an attempt to provide the manager/student with a more complete basis for organizational analysis. The authors argue that the traditional approach to administrative case studies limits the manager's/student's perspective to the positivistic world view at the expense of a more encompassing perspective which can be achieved through the use of an Aristotelian approach. Aristotle's four-part theory of causation is juxtaposed with contemporary views of organizational ideology/philosophy, culture, climate and leadership, and staff or personnel. The Mazda automobile plant in Flat Rock, Michigan is provided as a sample case study to demonstrate the comprehensiveness of the Aristotelian method in organizational contexts.
Modeling cognition by structural analysis of representation leads to systematic difficulties which are not resolvable. We analyse the merits and limits of a representation-based methodology to modeling cognition by treating Jackendoff's Consciousness and the Computational Mind as a good case study. We note the effects this choice of methodology has on the view of consciousness he proposes, as well as a more detailed consideration of the computational mind. The fundamental difficulty we identify is the conflict between the desire for modular processors which map directly onto representations and the need for dynamically interacting control. Our analysis of this approach to modeling cognition is primarily directed at separating merits from problems and inconsistencies by a critique internal to this approach; we also step outside the framework to note the issues it ignores.
The paper examines differences of styles of experimentation in the history of science. It presents arguments for a historization of our historical and philosophical notion of "experimentation," which question the common view that "experimental philosophy" was the only style of experimentation in the eighteenth and early nineteenth centuries. It argues, in particular, that "experimental history" and technological inquiry were accepted styles of academic experimentation at the time. These arguments are corroborated by a careful analysis of a case study, which is embedded in a comparative historical overview.
This study uses dialogic theory and philosophy of technology to provide an ethical framework for analysis of newspaper audiotex, or electronic voice information services. It concludes that growth of newspaper audiotex (a) is bound by notions of technological determinism and the technological imperative, (b) is driven by virtuosity values related more to personal aggrandizement of its developers than concern for consequences in the user sphere, and (c) signifies a shift in newspapers' communicative stance with readers to monologic mode emphasizing power/persuasion. Consequences for the coming of the electronic newspaper are considered.
I discuss the applicability of mathematics via a detailed case study involving a family of mathematical concepts known as ‘fractional derivatives.’ Certain formulations of the mystery of applied mathematics would lead one to believe that there ought to be a mystery about the applicability of fractional derivatives. I argue, however, that there is no clear mystery about their applicability. Thus, via this case study, I think that one can come to see more clearly why certain formulations of the mystery of applied mathematics are not convincing.
The influence of Michael Polanyi on William H. Poteat’s teaching from 1967 to 1976 was apparent but not paramount. Cultural conceptual analysis as taught and practiced by Poteat during this period included Polanyian texts, themes, and concepts, but drew extensively from other major conceptual innovators who provided radical alternatives to key cultural conceptual commitments of modernity. This was the period roughly between the completion of Intellect and Hope and the writing of Polanyian Meditations.
A case-study, small-group-discussion (“focal problem”) exercise in the history of medicine was designed, piloted, and evaluated in an overseas course and an on-campus elective course for medical students. Results suggest that this is a feasible approach to teaching history of medicine which can overcome some of the problems often encountered in teaching this subject in the medical curriculum.
The unified theory of dose and effect, as expressed by the median-effect equation for single and multiple entities and for first- and higher-order kinetics/dynamics, was established by T.C. Chou and is based on the physical/chemical principle of the mass-action law (J. Theor. Biol. 59: 253-276, 1976 (質量作用中效定理) and Pharmacological Rev. 58: 621-681, 2006 (普世中效指數定理)). The theory was developed by the principle of mathematical induction and deduction (數學演繹歸納法). Rearrangements of the median-effect equation lead to the Michaelis-Menten, Hill, Scatchard, and Henderson-Hasselbalch equations. The “median” serves as the universal reference point and the “common link” for the relationship of all entities, and is also the “harmonic mean” of kinetic dissociation constants. Over 300 mechanism-specific equations have been derived and published using the mathematical induction-deduction process. These equations can be reduced to several general equations, including the median-mediated whole/part equation, the combination index theorem, the isobologram equation, and the polygonogram. It is proven that “dose” and “effect” are interchangeable, and thus “substance” and “function” are interchangeable, which leads to “the unity theory” (劑效、心物、知行一元論) in quantitative mathematical philosophy (數學的定量哲學) in a functional context. Therefore, a general theory centered on the “median” and based on equilibrium dynamics has evolved. In other words: [「中」的宇宙觀： 以「中」爲基凖的動力學生態平衡]. Based on the median-effect equation of the mass-action law, the fundamental claim is that we can draw “a specific curve” from only two data points, if they are determined accurately. This claim has far-reaching consequences, since it defies the generally held belief that two points can define only a straight line.
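The two-point claim rests on the fact that the median-effect equation, fa/(1 − fa) = (D/Dm)^m, is a straight line in log-log coordinates, so two accurately measured (dose, fraction-affected) points fix both the exponent m and the median dose Dm, and with them the entire sigmoidal curve. A minimal sketch of this idea (the numerical data points are hypothetical, chosen only for illustration):

```python
import math

def median_effect_fit(d1, fa1, d2, fa2):
    """Fit m and Dm of the median-effect equation from two
    (dose, fraction-affected) points via the linearized form:
        log(fa / (1 - fa)) = m*log(D) - m*log(Dm)
    """
    y1 = math.log10(fa1 / (1.0 - fa1))
    y2 = math.log10(fa2 / (1.0 - fa2))
    x1, x2 = math.log10(d1), math.log10(d2)
    m = (y2 - y1) / (x2 - x1)   # slope: sigmoidicity exponent m
    dm = 10.0 ** (x1 - y1 / m)  # x-intercept gives the median dose Dm
    return m, dm

def fraction_affected(dose, m, dm):
    """Median-effect equation solved for fa: fa = 1 / (1 + (Dm/D)^m)."""
    return 1.0 / (1.0 + (dm / dose) ** m)

# Hypothetical data: 20% affected at dose 1.0, 80% affected at dose 4.0.
m, dm = median_effect_fit(1.0, 0.2, 4.0, 0.8)
# By construction, at D = Dm exactly half of the system is affected.
half = fraction_affected(dm, m, dm)
```

Once m and Dm are recovered, the whole dose-effect curve, not merely a straight line, is determined, which is the sense in which two points "draw a specific curve."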
Remarkably, the unity theory (一元論) provides a scientific/mathematical interpretation, in equations and in graphics, of ancient Chinese philosophy, including Fu-Si Ba Gua (伏羲八卦), Dao’s Harmony (和諧), the Confucian doctrine of the mean (儒家中庸之道), and Chou Dun-Yi’s (周敦頤, 1017-1073) From Wu-ji to Tai-ji and Taiji Tu Sho (無極而太極及太極圖說). The modern topological analysis for trinity yields an exact correspondence to the Ba-Gua, which was introduced over 4,000 years ago. Furthermore, the median-centered algorithm promotes modern ecological content (生態學) in the equilibrium dynamic state of harmony. It is concluded that Western science and Eastern philosophy are directly linked and complementary to each other. Since the truth in mathematical quantitative philosophy (數學的定量哲學) has no boundaries, East and West philosophies can flourish together for the common goal and ideal in science and in humanity (世界大同).
A calorimetric study of Te15(Se100-xBix)85 glassy alloys (x = 0, 1, 2, 3 and 4 at.%) is reported. Differential thermal analysis (DTA) was performed at heating rates of 10, 15, 20 and 25 K/min. The spectra were used to determine the glass transition temperature, Tg, the crystallisation temperature, Tc, and the melting temperature, Tm. All these parameters shift to higher values with increasing heating rate. The glass transition temperature and the melting temperature increase, and the crystallisation temperature decreases, with increasing Bi content, x. The activation energy of the glass transition, Eg, was evaluated using the Moynihan and Kissinger methods. The activation energy of crystallisation, Ec, was calculated using modified Kissinger and Matusita approaches. The thermal stability of these glasses has been studied and found to decrease with increasing Bi content. The results obtained are explained on the basis of a chemically ordered network model and an average coordination number.
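The Kissinger method mentioned above extracts an activation energy from the shift of a DTA peak temperature Tp with heating rate β, using the linear relation ln(β/Tp²) = −Ec/(R·Tp) + const, whose slope is −Ec/R. A minimal sketch with synthetic data (the numbers below are illustrative, generated from an assumed Ec; they are not the paper's measurements):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def kissinger_activation_energy(rates, peak_temps):
    """Least-squares fit of ln(beta / Tp^2) against 1/Tp.
    The Kissinger slope equals -Ec/R, so Ec = -slope * R (J/mol)."""
    xs = [1.0 / tp for tp in peak_temps]
    ys = [math.log(b / tp ** 2) for b, tp in zip(rates, peak_temps)]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * R

# Synthetic heating rates consistent with an assumed Ec of 150 kJ/mol,
# so the fit should recover that value (illustration only).
Ec_true, C = 150e3, 10.0
peak_temps = [700.0, 705.0, 710.0, 715.0]  # peak temperatures, K
rates = [tp ** 2 * math.exp(C - Ec_true / (R * tp)) for tp in peak_temps]

ec = kissinger_activation_energy(rates, peak_temps)  # ~150 kJ/mol
```

The same fit applied to measured (β, Tp) pairs at the four heating rates gives the activation energy reported by this kind of study.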
Recent research suggests that the cerebral correlates of cognitive deficits in schizophrenia are nested in the activity of widespread, inter-regional networks rather than being restricted to any specific brain location. One of the networks that have received focus lately is the default mode network. Parts of this network have been reported as hyper-activated in schizophrenia patients (SZ) during rest and during task performance compared to healthy controls (HC), although other parts have been found to be hypo-activated. In contrast to this network, task-positive networks have been reported as hypo-activated in SZ compared to HC during task performance. However, the results are mixed, with e.g. the dorsolateral prefrontal cortex showing both hyper- and hypo-activation in SZ. In this study we were interested in signal increase and decrease differences between a group of SZ and HC in cortical networks, assuming that the regulatory dynamics of alternating task-positive and task-negative neuronal processes are aberrant in SZ. We compared 31 SZ to age- and gender-matched HC, and used fMRI and independent component analysis in order to identify relevant networks. We selected the independent components with the largest signal intensity increases (STG, insula, SMA, ACC and MTG) and decreases (fusiform gyri, occipital lobe, PFC, cingulate, precuneus and angular gyrus) in response to a dichotic auditory cognitive task. These independent components were then tested for group differences. Our findings showed deficient up-regulation of the executive network and a corresponding deficit in the down-regulation of the anterior default mode network during task performance in SZ when compared with HC. These findings may indicate a deficit in the dynamics of alternating task-dependent and task-independent neuronal processes in SZ. The results may cast new light on the mechanisms underlying cognitive deficits in schizophrenia, and may be of relevance for diagnostics and new treatments.