The non-financial effects (NFE) antitakeover amendment addresses the duties of company directors and management when faced with a possible takeover bid. The NFE amendment either permits or requires managers to consider the interests of the company's stakeholders during takeover bids. Other types of antitakeover devices have been viewed as protecting either stockholder or management interests. The NFE amendment would appear to protect a broad spectrum of interests, including those of company employees, creditors, and the community in which the company operates. Positive market returns to the adoption of NFE amendments provide some evidence that investors approve. The percentages of both management and institutional ownership are positively related to the market reaction to NFE amendment adoption. To the extent that institutional ownership proxies for the broad spectrum of stakeholder interests, NFE devices, unlike some other amendments that have been studied, appear to be in the interests of more than a single interest group.
Focusing on Truth explores the question of what truth is, balancing historical with issue-oriented discussion. The book offers a comprehensive survey of all the major theories of truth. Lawrence Johnson investigates a number of closely related matters of truth in his inquiry, such as: What sorts of things are true or false? What is attributed to them when they are said to be true or false? What do facts have to do with truth? What can we learn from previous theories? The book opens with an analysis of the coherence theory of truth and then the correspondence theory of truth, as developed by Moore, Russell and Wittgenstein. Through a study of the semantic conceptions of truth, the author reveals that an adequate theory of truth must take account of the pragmatics of person, purpose, and circumstance. A full understanding of facts and truth bearers is considered central to Johnson's criticism of the opposing truth theories of J. L. Austin and P. F. Strawson. Drawing on the merits of these theories and others, while identifying their deficiencies, Johnson presents a new account of truth, based on the correlation of referential foci and the use of linguistic conventions. This account is defended as being adequate to meet the legitimate demands made on a theory of truth. Johnson argues that the account leaves scope for statements of many different sorts to be true in their own widely varying ways, without needing to posit fundamentally different kinds of truth.
Adults and infants display a robust ability to perceive the unity of a center-occluded object when the visible ends of the object undergo common motion (e.g. Kellman, P.J., Spelke, E.S., 1983. Perception of partly occluded objects in infancy. Cognitive Psychology 15, 483–524). Ecologically oriented accounts of this ability focus on the primacy of motion in the perception of segregated objects, but Gestalt theory suggests a broader possibility: observers may perceive object unity by detecting patterns of synchronous change, of which common motion is a special case. We investigated this possibility with observations of adults and 4-month-old infants. Participants viewed a center-occluded object whose visible surfaces were either misaligned or aligned, stationary or moving, and unchanging or synchronously changing in color or brightness in various temporal patterns (e.g. flashing). Both alignment and common motion contributed to adults' perception of object unity, but synchronous color changes did not. For infants, motion was an important determinant of object unity, but other synchronous changes and edge alignment were not. When a stationary object with aligned edges underwent synchronous changes in color or brightness, infants showed high levels of attention to the object, but their perception of its unity appeared to be indeterminate. An inherent preference for fast over slow flash rates, and a novelty preference elicited by a change in rate, both indicated that infants detected the synchronous changes, although they failed to use them as information for object unity. These findings favor ecologically oriented accounts of object perception in which surface motion plays a privileged role. © 1999 Elsevier Science B.V. All rights reserved.
Review of extant research on the corporate environmental performance (CEP) and corporate financial performance (CFP) link generally demonstrates a positive relationship. However, some arguments and empirical results have demonstrated otherwise. As a result, researchers have called for a contingency approach to this research stream, which moves beyond the basic question “does it pay to be green?” and instead asks “when does it pay to be green?” In answering this call, we provide a meta-analytic review of CEP–CFP literature in which we identify potential moderators of the CEP–CFP relationship, including environmental performance type (e.g., reactive vs. proactive performance), firm characteristics (e.g., large vs. small firms), and methodological issues (e.g., self-report measures). By analyzing these contingencies, this study attempts to provide a basis on which to draw conclusions regarding some inconsistencies and debates in the CEP–CFP research. Some of the results of the moderator analysis suggest that small firms benefit from environmental performance as much as or more than large firms, US firms seem to benefit more than their international counterparts, and environmental performance seems to have the strongest influence on market measures of financial performance.
The seminal 1993 article by LaFollette and Shanks “Animal Models in Biomedical Research: Some Epistemological Worries” introduced an influential taxonomy into the debate about the value of animal experimentation. The distinction they made between hypothetical and causal analog models served to highlight a concern regarding extrapolating results obtained in animal models to human subjects, which endures today. Although their taxonomy has made a significant contribution to the field, we maintain that it is flawed, and instead, we offer a new practice-oriented taxonomy of animal models as a means to allow philosophers, modelers, and other interested parties to discuss the epistemic merits and shortcomings, purpose, and predictive capacities of specific modeling practices.
The similarity of documents in a large database of published Fractals articles was examined for redundancy. Three different text matching techniques were used on published Abstracts to identify redundancy candidates, and predictions were verified by reading full text versions of the redundancy candidate articles. A small fraction of the total articles in the database was judged to be redundant. This was viewed as a lower limit, because it excluded cases where the concepts remained the same but the text was altered substantially. Far more pervasive than redundant publications were publications that did not violate the letter of redundancy but rather violated its spirit. There appeared to be widespread publication maximization strategies. Studies that resulted in one comprehensive paper decades ago now result in multiple papers that focus on one major problem but are differentiated by parameter ranges or other stratifying variables. This ‘paper inflation’ is due in large part to the increasing use of metrics (publications, patents, citations, etc.) to evaluate research performance, and the researchers’ motivation to maximize the metrics.
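The abstract-matching step described above can be illustrated with a minimal sketch. The study's actual three matching techniques are not specified here, so this stand-in uses Python's `difflib` similarity ratio; the threshold and sample abstracts are illustrative assumptions.

```python
# Hedged sketch: flagging redundancy candidates by pairwise abstract
# similarity. difflib's ratio is one simple stand-in measure, not the
# technique used in the study; the 0.8 threshold is an assumption.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Character-level similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def redundancy_candidates(abstracts, threshold=0.8):
    """Return index pairs whose abstracts exceed the similarity threshold."""
    pairs = []
    for i in range(len(abstracts)):
        for j in range(i + 1, len(abstracts)):
            if similarity(abstracts[i], abstracts[j]) >= threshold:
                pairs.append((i, j))
    return pairs

# Two near-duplicate abstracts and one unrelated abstract (invented).
abstracts = [
    "Fractal dimension of coastline roughness measured by box counting.",
    "Fractal dimension of coastline roughness estimated by box counting.",
    "A study of moral reasoning in business communication.",
]
print(redundancy_candidates(abstracts))  # [(0, 1)]
```

As the abstract notes, any such automated pass gives only a lower bound: paraphrased duplicates with low surface similarity still require verification against full-text versions.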
Collective memory can be used conceptually to examine African-American perceptions of wildlands and black interaction with such places. The middle-American view of wildlands frames these terrains as refuges: pure and simple, sanctified places distinct from the profanity of human modification. However, wild, primitive areas do not exist in the minds of all Americans as uncomplicated or uncontaminated places. Three labor-related institutions (forest labor, plantation agriculture, and sharecropping), along with terrorism and lynching, have negatively impacted black perceptions of wildlands, producing an ambivalence toward such places among African Americans.
This study investigates the connection of moral reasoning to demographic and performance variables in business education, especially business and technical writing. The moral reasoning construct serves as the foundation for one's decision making when confronted with moral dilemmas. Significant relationships are reported between subjects' writing skill and their moral reasoning scores. This research serves as a foundation for questions about writers' moral reasoning and the ethical decisions each writer makes in written communication. In addition, this study supports further research into the connection between moral reasoning and written communication, given the significant relationships reported and the noticeable shortage of related, data-based research.
Lakatos' MSRP is utilized to provide a response to Koertge's claim (in her ‘Does Social Science Really Need Metaphysics?’) that the heuristic significance of metaphysics has been vastly overrated. By outlining the hard cores and positive heuristics of the two major research programmes in economics (namely, the ‘orthodox’ and ‘Marxist’ research programmes), the paper demonstrates (in opposition to Koertge's claim) not only that the metaphysical statements in the respective hard cores are far from vague but also how these exert an important regulative influence both on the theories and economic policy recommendations generated within the respective programmes. It also indicates how the adoption of the MSRP method of appraisal in economics would help to protect against the danger of entrenched metaphysical dogmatism with its implications not only for economic policy recommendations but also for moral and political recommendations. * We appreciate the assistance given by Professor N. Koertge, who provided us with translations of her articles, and the helpful comments of an anonymous referee.
J. M. Kennedy and J. Vervaeke argue that my view of the bodily and imaginative basis of meaning commits me to a mistaken reductionism and to the erroneous view that metaphors actually impose structure on the target domain. I explain the sense in which image schemas are central to the bodily grounding of meaning, although in a way that is not reductionistic. I then show how conceptual metaphors can involve pre-existing image-schematic structure and yet can also be partially constitutive of the conceptual structure of the target domain. In this way human conceptual systems can be both rooted in patterns of our bodily interactions and subject to various kinds of imaginative development and extension.
Following Aristotle (who himself was following Parmenides), philosophers have appealed to the distributional reflexes of expressions in determining their semantic status, and ultimately, the nature of the extra-linguistic world. This methodology has been practiced throughout the history of philosophy; it was clarified and made popular by the likes of Zeno Vendler and J.L. Austin, and is realized today in the toolbox of linguistically minded philosophers. The study of the syntax of natural language was fueled by the belief that there is a conceptually tight connection between the syntax of our language and its semantics, and the belief that there is a similarly tight connection between the semantics of our language and metaphysical facts about the world. We are less confident than our colleagues about the relation syntax has to semantics and metaphysics. In particular, we do not believe that the current status of theoretical syntax (or semantics or metaphysics) provides much support for either of the above two beliefs. We will illustrate our view with a case study regarding the status of complex demonstratives. We will show that a recent and particularly subtle syntactically based argument for the semantic/metaphysical status of complex demonstratives does not in fact establish what semantic category complex demonstratives belong to.
J. Baird Callicott misinterprets both the way in which pain seems important to animal liberationists and why it is thought important. Examination of Callicott’s account reveals its inadequacies and strengthens the animal liberationist’s position. It also indicates that resolution of the dispute between proponents of animal liberation and the land ethic demands consideration of the justifiability of “sentientism.”
J. D. Monk has shown that for first-order languages with finitely many variables there is no finite set of schemata which axiomatizes the universally valid formulas. There are, however, such finite sets of schemata which axiomatize the formulas valid in all structures of some fixed finite size.
Many decisions involve multiple stages of choices and events, and these decisions can be represented graphically as decision trees. Optimal decision strategies for decision trees are commonly determined by a backward induction analysis that demands adherence to three fundamental consistency principles: dynamic, consequential, and strategic. Previous research (Busemeyer et al. 2000, J. Exp. Psychol. Gen. 129, 530) found that decision-makers tend to exhibit violations of dynamic and strategic consistency at rates significantly higher than choice inconsistency across various levels of potential reward. The current research extends these findings under new conditions; specifically, it explores the extent to which these principles are violated as a function of the planning horizon length of the decision tree. Results from two experiments suggest that dynamic inconsistency increases as tree length increases; these results are explained within a dynamic approach–avoidance framework.
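The backward induction analysis mentioned above can be sketched in a few lines: work from the leaves of the tree back to the root, taking expectations at event nodes and maxima at choice nodes. The node encoding, probabilities, and payoffs below are illustrative assumptions, not material from the study.

```python
# Hedged sketch: backward induction on a small decision tree.
# Node encoding (an assumption for this sketch):
#   ("payoff", value)                 - terminal outcome
#   ("chance", [(prob, child), ...])  - event node: take the expectation
#   ("decision", [child, ...])        - choice node: take the best option

def backward_induction(node):
    """Return the optimal expected value of a decision-tree node."""
    kind = node[0]
    if kind == "payoff":
        return node[1]
    if kind == "chance":
        return sum(p * backward_induction(child) for p, child in node[1])
    if kind == "decision":
        return max(backward_induction(child) for child in node[1])
    raise ValueError(f"unknown node kind: {kind!r}")

# A two-stage example: choose between a fair gamble and a sure payoff.
tree = ("decision", [
    ("chance", [(0.5, ("payoff", 100)), (0.5, ("payoff", 0))]),
    ("payoff", 40),
])
print(backward_induction(tree))  # 50.0: the gamble's expectation beats 40
```

Dynamic consistency, in these terms, asks whether a decision-maker who planned to take a branch at the root still chooses it when actually standing at that node; the experiments above examine how often real choices diverge from the backward-induction plan as the tree grows longer.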
This essay surveys and assesses J. G. Merquior's principal English-language contributions to liberal social and political theory. The greatest strength of Merquior's work is his recognition that one can neither understand nor defend liberalism without first understanding and defending modernity. The greatest weakness of Merquior's work is his overly oppositional conception of the relationship between modernity and its postmodern critics, particularly his failure to recognize that both the positive and negative features of postmodernism are simply radicalizations of the positive and negative features of modernity itself. It is argued that the strengths of Merquior's work are best affirmed and its weaknesses best overcome by appropriating it within the context of a ‘critical modernist’ approach to understanding and legitimizing the institutions and practices characteristic of modernity and liberalism.
Ethics of Judaism, by M.J. Routtenberg.--Ethics of Roman Catholicism, by J.P. Fitzpatrick.--Ethics of Protestantism, by A.T. Mollegen.--The ethical culture movement, by J. Nathanson.--Rational ethics, by L. Bryson.--Ethical frontiers, by W.G. Muelder.
This paper utilizes the theories of metaphor of George Lakoff, Mark Johnson and Julian Jaynes to extend Jaynes' metaphor theory of consciousness by treating consciousness as an operator that works with 'covert behavior' so that humans can integrate temporally discontinuous percepts with concepts based on metaphoric extensions of the embodied schemas of direct and immediate perception, and thereby transcend the limitations of direct perception. A theory of first-person expressions and covert behavior is advanced to account for self-conscious awareness as language-based. Subjectivity and objectivity are metaphors based on schemas of perception.
Can a bomb ever be "clean"? Are we relieved to be warned that there will be an "odor" when once we were told that something would "stink"? Or, to put it another way, when is a euphemism a mark of good taste and when is it a sign of verbal obfuscation? To answer such questions, D.J. Enright invited sixteen distinguished writers to ponder and explore the ubiquitous phenomenon of euphemism. The result is a delightful and provocative collection that not only includes general reflections on euphemism and its history but also treats such specific categories as sex, death, and other natural functions; politics; the language of the great Christian texts; euphemisms spoken to and by children; the law; medicine; office life; and the jargon of official spokesmen, military communiqués, and tyrants. Such writers as Diane Johnson, Robert Nisbet, John Gross, Robert Burchfield, and Joseph Epstein bring a variety of perspectives and sensibilities to bear on these topics. Because euphemisms are so intimate and integral to our thinking, any study of them is bound to throw light on the human condition, both past and present. In these essays, humor jostles horror and the homely alternates with the farfetched. Taken together they form an eloquent and often amusing testament to the richness of the subject. About the Author: D.J. Enright is a noted English poet and critic. He recently compiled and edited The Oxford Book of Death.
This book is an exploration of human understanding, from the perspectives of psychology, philosophy, biology and theology. The six contributors are among the most internationally eminent in their fields. Though scholarly, the writing is non-technical. No background in psychology, philosophy or theology is presumed. No other interdisciplinary work has undertaken to explore the nature of human understanding. This book is unique, and highly significant for anyone interested in or concerned about the human condition.
The theory of morality we can call full rule-consequentialism selects rules solely in terms of the goodness of their consequences and then claims that these rules determine which kinds of acts are morally wrong. George Berkeley was arguably the first rule-consequentialist. He wrote, “In framing the general laws of nature, it is granted we must be entirely guided by the public good of mankind, but not in the ordinary moral actions of our lives. … The rule is framed with respect to the good of mankind; but our practice must be always shaped immediately by the rule.” (Berkeley 1712, section 31) Writers often classed as rule-consequentialists include Austin 1832; Harrod 1936; Toulmin 1950; Urmson 1953; Harrison 1953; Mabbott 1953; Singer 1955; 1961; and most prominently Brandt 1959; 1963; 1967; 1979; 1989; 1996; and Harsanyi 1977; 1982; 1993. See also Rawls 1955; Hospers 1972; Haslett 1987; 1994, ch. 1; 2000; Attfield 1987, 103-12; Barrow 1991, ch. 6; Johnson 1991; Riley 1998; 2000; Shaw 1999; and Hooker 2000. Whether J. S. Mill's ethics was rule-consequentialist is controversial (Urmson 1953; Crisp 1997, 102-33).