This junior/senior level text is devoted to a study of first-order logic and its role in the foundations of mathematics: What is a proof? How can a proof be justified? To what extent can a proof be made a purely mechanical procedure? How much faith can we have in a proof that is so complex that no one can follow it through in a lifetime? The first substantial answers to these questions have only been obtained in this century. The most striking results are contained in Gödel's work: first, it is possible to give a simple set of rules that suffice to carry out all mathematical proofs; but, second, these rules are necessarily incomplete — it is impossible, for example, to prove all true statements of arithmetic. The book begins with an introduction to first-order logic, Gödel's theorem, and model theory. A second part covers extensions of first-order logic and limitations of the formal methods. The book covers several advanced topics not commonly treated in introductory texts, such as Trakhtenbrot's undecidability theorem, Fraïssé's characterization of elementary equivalence, and Lindström's theorem on the maximality of first-order logic.
This is one of the most important books on quantum mechanics ever written for lay readers, in which an eminent physicist and successful science writer, Heinz Pagels, discusses and explains the core concepts of physics without resorting to complicated mathematics. "Can be read by anyone. I heartily recommend it!" -- New York Times Book Review. 1982 edition.
In Thomas Thomson's System of chemistry of 1802, Bergman and Scheele are actually considered the creators of the analytical concept of an element. A detailed investigation of the work of Bergman and Scheele, however, shows that Thomson's statement contains mistakes as well as inadmissible simplifications and generalizations. It is correct, nonetheless, that in 1774–1777 Bergman specifically anticipated, in essential aspects, the analytical element concept proposed by Lavoisier in 1787–1789.
This compelling book contains a comprehensive analytical treatment of the theory of production in a long-period framework. Although the authors take a 'Classical' approach to their subject, the scope of investigation and methods employed should interest all economic theorists. Professors Kurz and Salvadori explore economic systems that are characterised by a particular kind of primary input in the production process, such as different kinds of labour and natural resources. These systems and the corresponding prices can be understood to reflect characteristic features of a capitalist market economy in an ideal way: they express the pure logic of the relationship between value and distribution in an economic system. Specific chapters deal with prices and income distribution, economic growth, joint production, fixed capital, scarce natural resources, and heterogeneous labour. The historical origins of the concepts used are also discussed in considerable detail.
Abstract: Finding a moral justification for humanitarian intervention has been the objective of a great deal of academic inquiry in recent years. Most of these treatments, however, make certain arguments or assumptions about the morality of humanitarian intervention without fully exploring their precise philosophical underpinnings, which has led to an increasingly disjointed body of literature. The purpose of this essay, therefore, is to suggest that the conventional arguments and assumptions made about the morality of humanitarian intervention can be encompassed in what is essentially a consequentialist framework. After a brief examination of consequentialist ethics, this essay reveals a number of morally relevant factors concerning humanitarian intervention, wherein I suggest that the general consensus in the literature on these factors constitutes 'commonsense morality'. In doing so, I argue that consequentialism as a theory of the right provides the best fit with commonsense morality on humanitarian intervention. This is important not only to reveal the precise philosophical underpinnings of the debate, but also to bring ethical, prudential and political considerations together in a coherent ethical discourse.
The present work critically examines two assumptions frequently stated by supporters of cognitive neuroenhancement. The first, explicitly methodological, assumption is the supposition of effective and side effect-free neuroenhancers. However, there is an evidence-based concern that the most promising drugs currently used for cognitive enhancement can be addictive. Furthermore, this work describes why the neuronal correlates of key cognitive concepts, such as learning and memory, are so deeply connected with mechanisms implicated in the development and maintenance of addictive behaviour that modification of these systems may inevitably run the risk of addiction to the enhancing drugs. Such a potential risk of addiction could only be falsified by in-depth empirical research. The second, implicit, assumption is that research on neuroenhancement does not pose a serious moral problem. However, the potential for addiction, along with arguments related to research ethics and the potential social impact of neuroenhancement, could invalidate this assumption. It is suggested that ethical evaluation needs to consider the empirical data as well as the question of whether and how such empirical knowledge can be obtained.
This paper examines a model in which people's preferences adjust to changes in their relative ability to attain various goals. Preference changes are modeled as changes in the configuration of weights (or values) attached to these goals. The model makes it possible to explain common prototypical preference changes such as the 'sour grapes' and the 'overcompensating' phenomena. It is found that whether the first or the second phenomenon occurs depends on whether a goal is easy or difficult to substitute by other goals. If two goals are sufficiently strong substitutes for each other, no weight will be placed on the goal that is harder to attain. The results readily apply to the standard microeconomic set-up involving goods and prices, rather than abstract goals and abilities. In this case, one implication of the model is that only a subset of the overall commodity space is relevant for everyday consumer choice, which reduces the complexity of the choice procedure. The model also makes it possible to explain how new and unfamiliar products are incorporated into the consumer's preference pattern.
An important distinction between phonology and syntax has been overlooked. All phonological patterns belong to the regular region of the Chomsky Hierarchy, but not all syntactic patterns do. We argue that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for this difference.
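The contrast the abstract draws can be made concrete with a small sketch. The patterns below are hypothetical illustrations, not examples from the text: a phonological-style constraint ("no two adjacent nasals") can be checked left to right with finite memory, so it is regular, whereas a syntax-style pattern of balanced center embedding (strings of the form aⁿbⁿ) requires unbounded counting and lies outside the regular region of the Chomsky Hierarchy.

```python
def obeys_no_adjacent_nasals(word: str) -> bool:
    """Regular pattern: a single left-to-right scan with finite memory
    (remember only whether the previous symbol was a nasal)."""
    nasals = {"m", "n"}
    return all(not (a in nasals and b in nasals)
               for a, b in zip(word, word[1:]))

def is_center_embedded(s: str) -> bool:
    """Non-regular pattern a^n b^n: deciding membership requires counting
    the a's against the b's, which no finite-state machine can do."""
    half = len(s) // 2
    return (len(s) % 2 == 0
            and s[:half] == "a" * half
            and s[half:] == "b" * half)

print(obeys_no_adjacent_nasals("banana"))  # True: nasals never adjacent
print(obeys_no_adjacent_nasals("hymnal"))  # False: "mn" is adjacent
print(is_center_embedded("aaabbb"))        # True: a^3 b^3
print(is_center_embedded("aabab"))         # False
```

The asymmetry — finite memory suffices for the first check but not the second — is the formal difference between the two pattern classes that the authors' learning-mechanism hypothesis is meant to explain.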
The concept of the artes liberales originates in antiquity and was, especially in the Anglo-Saxon world during the 17th and 18th centuries, remodelled into a socially, educationally, and politically modern educational concept. In this process, progress within the empirical sciences and the formation of an early civil public are of the utmost importance. In the course of these transformations, the absolute authority of church and state is called into question, and educational concepts that must be called modern emerge from it.