This paper is concerned with the representation of an organisation theory in academic journal articles and textbooks. In the case of Burns and Stalker’s book The Management of Innovation, summaries of the text by other scholars have arguably differed from the original authors’ and among themselves in their emphases. Similar points have been made about representations of other theorists such as Kurt Lewin and, perhaps most famously, Adam Smith. They all raise issues about the meanings of texts and where such meanings lie: with the author, the reader, the text itself or perhaps some combination of these. They also raise questions about whether texts can be shown to have definitive meanings; and if not, whether there are any criteria for adjudicating on the validity of varied interpretations. Representations by textbook writers are analysed, and questions about the meaning of texts raised by ‘structuralist’ and ‘deconstructionist’ writers are examined. Their writings raise certain questions about textual representations. Perhaps the most extreme of these views is Barthes’ concept of the ‘death of the author’. Like Barthes, Derrida argues, for the reasons mentioned above, that there is no underlying, final decipherable meaning in a text, but he gives more credence to the role of the author, accepting the validity of the author’s consciousness and intentions as one of the sources of meaning in texts. There are also other sources: the situatedness and historical context of the text, and the text itself. Derrida’s concept of ‘différance’ requires the reader to engage in an analysis of the text which offers limitless possibilities for interpretation and a renunciation of the certainty of truth, because the meaning of a text may extend beyond the limits of our knowledge at any one time.
His notion of the ‘logic of supplementarity’ is a further means to analyse texts, as it also disprivileges obvious or overt meanings in texts by overturning hierarchy in oppositions and questioning univocal definitions of meaning. Questions inspired by these and other writers give rise to an exploration of who is speaking in the text; which subject matter is represented as central and which as marginal; binary oppositions within the text; and intertextual connections. The paper then begins the more ambitious task of answering the broader question as to whether it can be shown that there are more and less ‘representative’ or ‘stronger’ interpretations of a text.
This paper addresses representations of Burns and Stalker’s theory that arose soon after its publication in The Management of Innovation in 1961. Different conceptions of Burns and Stalker’s contingency theory as portrayed in organisation and management texts are discussed. It will be argued that what has been represented as their theory stems in the main from ideas based on different positions within the spectrum of the positivistic, functionalist ‘paradigm’.
Problems surrounding representations of texts have previously been raised and discussed, as has the difficulty, in the light of hermeneutic, critical and post-structuralist writers, of arriving at definitive meanings of texts. This paper is part of ongoing research into the problem of evaluating the representation of original texts in the organisation/management area. The texts in question are Burns and Stalker’s The Management of Innovation (1961, 1966) and, to a lesser degree, Lawrence and Lorsch’s Organization and Environment (1967). The representations are those in three widely used textbooks typical of many, and applications in management accounting research. A way round this apparent impasse may be to see whether there are any ‘objective’ or commonly accepted standards and criteria within a particular system of thought, assuming that the texts in question are all located within the same system. These texts will be explored in terms of the paradigmatic boundaries they encompass to see whether the same kinds of problems and solutions are presented, and whether they ultimately lie within the same boundaries. Finally, having argued that they are largely located in different paradigms, the underlying question is raised as to whether one paradigm can be an adequate vehicle for the transmission of a text substantially in another.
Mainstream management scholarship has for the last half century largely legitimated its scholarship and production of knowledge on the grounds that its research is objective, neutral, scientific and uninfluenced either by its researchers or by data distorted by subjectivist human factors. However, over the decades there have been serious and sustained criticisms of aspects of this scholarship, not least from within the field by mainstream scholars, e.g. Otley and Panozzo: on grounds of the inadequacy of synchronic studies that were found to be non-replicable; of the limitations of surveys/questionnaires, so frequently used to acquire data; of abstract idealisations unrelated to the real world; and of an ‘obsessive preoccupation with numbers’. Serious criticisms have been made of this type of scholarship more generally in the social sciences on the grounds of political bias. Claims to the scientific nature, and therefore legitimacy, of this type of research have also been contested. On the other hand, examples of alternative scholarship, advocating and using mixed-method and multi-paradigmatic approaches, have been published in the same high-ranking journals as research using the mainstream approaches mentioned above. These showed a richness in the data generally unobtainable in solely objectivist approaches. Yet despite these factors, mainstream scholarship has in the main continued to produce objectivist, empiricist, quantitative-focussed research and knowledge. Reasons for this have been suggested at the local level of the academy, including its administration, heads of department and journal editors driven by the criteria set by high-ranking journals. At a wider level, the US government’s fright at the successful launch of the Soviet Sputnik led to demands for ‘hard’ science in the management field as well.
Managerialism, a dominant ideology, has been an influence on researchers’ approaches, giving managers the prerogative of having the necessary knowledge and also the power and capability to implement strategies successfully, independently of their subordinates. Although these explanations may have some mileage in explaining the persistence of this type of scholarship, and its resistance to the multitude of criticisms against it and to compelling examples of more inclusive research, further explanations have been sought. It is argued in this paper that deeper explanations may lie in the power of neoliberal ideas, principles and policies, which have spread beyond economics, permeating and seriously affecting other aspects of life including education and scholarship.
In this paper, I begin by defending permissivism: the claim that, sometimes, there is more than one way to rationally respond to a given body of evidence. Then I argue that, if we accept permissivism, certain worries that arise as a result of learning that our beliefs were caused by the communities we grew up in, the schools we went to, or other irrelevant influences dissipate. The basic strategy is as follows: first, I try to pinpoint what makes irrelevant influences worrying, and I come up with two candidate principles. I then argue that one principle should be rejected because it is inconsistent with permissivism. The principle we should accept implies that it is sometimes rational to maintain our beliefs, even upon learning that they were caused by irrelevant influences.
How is medical knowledge made? There have been radical changes in recent decades, through new methods such as consensus conferences, evidence-based medicine, translational medicine, and narrative medicine. Miriam Solomon explores their origins, aims, and epistemic strengths and weaknesses; and she offers a pluralistic approach for the future.
This paper is about the connection between rationality and accuracy. I show that one natural picture about how rationality and accuracy are connected emerges if we assume that rational agents are rationally omniscient. I then develop an alternative picture that allows us to relax this assumption, in order to accommodate certain views about higher order evidence.
Greaves and Wallace argue that conditionalization maximizes expected accuracy. In this paper I show that their result only applies to a restricted range of cases. I then show that the update procedure that maximizes expected accuracy in general is one in which, upon learning P, we conditionalize, not on P, but on the proposition that we learned P. After proving this result, I provide further generalizations and show that much of the accuracy-first epistemology program is committed to KK-like iteration principles and to the existence of a class of propositions that rational agents will be certain of if and only if they are true.
In recent years, permissivism—the claim that a body of evidence can rationalize more than one response—has enjoyed something of a revival. But it is once again being threatened, this time by a host of new and interesting arguments that, at their core, are challenging the permissivist to explain why rationality matters. A version of the challenge that I am especially interested in is this: if permissivism is true, why should we expect the rational credences to be more accurate than the irrational ones? My aim is to turn this challenge on its head and argue that, actually, those who deny permissivism will have a harder time responding to such a challenge than those who accept it.
It has been claimed that, in response to certain kinds of evidence, agents ought to adopt imprecise credences: doxastic states that are represented by sets of credence functions rather than single ones. In this paper I argue that, given some plausible constraints on accuracy measures, accuracy-centered epistemologists must reject the requirement to adopt imprecise credences. I then show that even the claim that imprecise credences are permitted is problematic for accuracy-centered epistemology. It follows that if imprecise credal states are permitted or required in the cases that their defenders appeal to, then the requirements of rationality can outstrip what would be warranted by an interest in accuracy.
The aim of this paper is to apply the accuracy based approach to epistemology to the case of higher order evidence: evidence that bears on the rationality of one's beliefs. I proceed in two stages. First, I show that the accuracy based framework that is standardly used to motivate rational requirements supports steadfastness—a position according to which higher order evidence should have no impact on one's doxastic attitudes towards first order propositions. The argument for this will require a generalization of an important result by Greaves and Wallace for the claim that conditionalization maximizes expected accuracy. The generalization I provide will, among other things, allow us to apply the result to cases of self-locating evidence. In the second stage, I develop an alternative framework. Very roughly, what distinguishes the traditional approach from the alternative one is that, on the traditional picture, we're interested in evaluating the expected accuracy of conforming to an update procedure. On the alternative picture that I develop, instead of considering how good an update procedure is as a plan to conform to, we consider how good it is as a plan to make. I show how, given the use of strictly proper scoring rules, the alternative picture vindicates calibrationism: a view according to which higher order evidence should have a significant impact on our beliefs. I conclude with some thoughts about why higher order evidence poses a serious challenge for standard ways of thinking about rationality.
For this Clarendon Paperback, Dr Griffin has written a new Postscript to bring the original book fully up to date. She discusses further important and controversial questions of fact or interpretation in the light of the scholarship of the intervening years and provides additional argument where necessary. The connection between Seneca's prose works and his career as a first-century Roman statesman is problematic. Although he writes in the first person, he tells us little of his external life or of the people and events that formed its setting. Miriam Griffin addresses the problem by first reconstructing Seneca's career using only outside sources and his de Clementia and Apocolocyntosis, whose political purposes are undisputed. In the second part of the book she studies Seneca's treatment of subjects of political significance, including his views on slavery, provincial policy, wealth, and suicide. On the whole, the word of the philosopher is found to illuminate the work of the statesman, but notable exceptions emerge, and the links that are revealed vary from theme to theme and rarely accord with traditional autobiographical interpretations of Seneca's works.
The question of whether it is ever permissible to believe on insufficient evidence has once again become a live question. Greater attention is now being paid to practical dimensions of belief, namely issues related to epistemic virtue, doxastic responsibility, and voluntarism. In this book, McCormick argues that the standards used to evaluate beliefs are not isolated from other evaluative domains. The ultimate criteria for assessing beliefs are the same as those for assessing action because beliefs and actions are both products of agency. Two important implications of this thesis, both of which deviate from the dominant view in contemporary philosophy, are: (1) it can be permissible to believe for non-evidential reasons, and (2) we have a robust control over many of our beliefs, a control sufficient to ground attributions of responsibility for belief.
It has been argued that Extended Cognition (EXT), a recently much discussed framework in the philosophy of cognition, would serve as the theoretical basis to account for the impact of Brain Computer Interfaces (BCI) on the self and life of patients with Locked-in Syndrome (LIS). In this paper I will argue that this claim is unsubstantiated: EXT is not the appropriate theoretical background for understanding the role of BCI in LIS. I will critically assess what a theory of the extended self would comprise and provide a list of desiderata for a theory of self that EXT fails to accommodate. There is, however, an alternative framework in Cognitive Science, Enactivism, which provides the basis for an account of self that is able to accommodate these desiderata. I will outline some first steps towards an Enactive approach to the self, suggesting that the self could be considered as a form of human autonomy. Understanding the self from an enactive point of view will allow us to shed new light on the questions of whether and how BCIs affect or change the selves of patients with LIS.
The aim of this paper is to describe a problem for calibrationism: a view about higher order evidence according to which one's credences should be calibrated to one's expected degree of reliability. Calibrationism is attractive, in part, because it explains our intuitive judgments, and provides a strong motivation for certain theories about higher order evidence and peer disagreement. However, I will argue that calibrationism faces a dilemma: there are two versions of the view one might adopt. The first version, I argue, has the implausible consequence that, in a wide range of cases, calibrationism is the only constraint on rational belief. The second version, in addition to having some puzzling consequences, is unmotivated. At the end of the paper I sketch a possible solution.
Embodied approaches in cognitive science hold that the body is crucial for cognition. What this claim amounts to, however, still remains unclear. This paper contributes to its clarification by confronting three ways of understanding embodiment—the sensorimotor approach, extended cognition and enactivism—with Locked-in syndrome (LIS). LIS is a case of severe global paralysis in which patients are unable to move and yet largely remain cognitively intact. We propose that LIS poses a challenge to embodied approaches to cognition, requiring them to make explicit the notion of embodiment they defend and its role for cognition. We argue that the sensorimotor and the extended functionalist approaches either fall short of accounting for cognition in LIS from an embodied perspective or do so too broadly, by relegating the body only to a historical role. Enactivism conceives of the body as an autonomous system and of cognition as sense-making. From this perspective embodiment is not equated with bodily movement but with forms of agency that do not disappear with body paralysis. Enactivism offers a clarifying perspective on embodiment and thus currently appears to be the framework in embodied cognition best suited to address the challenge posed by LIS.
The aim of this essay is to argue that, if a robust form of moral realism is true, then moral vagueness is ontic vagueness. The argument is by elimination: I show that neither semantic nor epistemic approaches to moral vagueness are satisfactory.
In Why Not Socialism?, G. A. Cohen defines socialism as the combined application of two moral principles: the egalitarian principle and the principle of community. The desirability of a social order organized around these two principles is illustrated by the ‘camping trip’ example. After describing the fundamental features of the camping trip scenario at reasonable length, Cohen argues that the desirability of such a social model is nearly self-explanatory, concluding therefore that the most significant challenges to socialism lie in its feasibility. This article argues that the desirability of the camping trip model as an appropriate ideal for society is less obvious than Cohen acknowledges. To argue my point, I shall compare the camping trip with another social practice that is equally small-sized and characterized by strong emotional ties among its members, but in which the conditions of what I shall call ‘goal-monism’ and discontinuity in time do not hold, namely, the family.
My main aim is to argue that most conceptions of doxastic agency do not respond to the skeptic’s challenge. I begin by considering some reasons for thinking that we are not doxastic agents. I then turn to a discussion of those who try to make sense of doxastic agency by appeal to belief’s reasons-responsive nature. What they end up calling agency is not robust enough to satisfy the challenge posed by the skeptics. To satisfy the skeptic, one needs to make sense of the possibility of believing for nonevidential reasons. While this has been seen as an untenable view for both skeptics and anti-skeptics, I conclude by suggesting it is a position that has been too hastily dismissed.
Mitchell S. Green presents a systematic philosophical study of self-expression - a pervasive phenomenon of the everyday life of humans and other species, which has received scant attention in its own right. He explores the ways in which self-expression reveals our states of thought, feeling, and experience, and he defends striking new theses concerning a wide range of fascinating topics: our ability to perceive emotion in others, artistic expression, empathy, expressive language, meaning, facial expression, and speech acts. He draws on insights from evolutionary game theory, ethology, the philosophy of language, social psychology, pragmatics, aesthetics, and neuroscience to present a stimulating and accessible interdisciplinary work.
This paper defends a constraint that any satisfactory decision theory must satisfy. I show how this constraint is violated by all of the decision theories that have been endorsed in the literature that are designed to deal with cases in which opinions or values are represented by a set of functions rather than a single one. Such a decision theory is necessary to account for the existence of what Ruth Chang has called “parity” (as well as for cases in which agents have incomplete preferences or imprecise credences). The problem with all of the decision theories that have been defended to account for parity is that they are committed to a claim I call unanimity: when all of the functions in the set agree that an agent ought to do A, then an agent ought to do A. A decision theory committed to unanimity violates the constraint I defend in this paper. Thus, if parity exists, a new approach to decision theory is necessary.
Evidence-Based Medicine (EBM) developed from the work of clinical epidemiologists at McMaster University and Oxford University in the 1970s and 1980s and self-consciously presented itself as a "new paradigm" called "evidence-based medicine" in the early 1990s. The techniques of the randomized controlled trial, systematic review and meta-analysis have produced an extensive and powerful body of research. They have also generated a critical literature that raises general concerns about its methods. This paper is a systematic review of the critical literature. It finds the description of EBM as a Kuhnian paradigm helpful and worth taking further. Three kinds of criticism are evaluated in detail: criticisms of procedural aspects of EBM (especially from Cartwright, Worrall and Howick), data showing the greater than expected fallibility of EBM (Ioannidis and others), and concerns that EBM is incomplete as a philosophy of science (Ashcroft and others). The paper recommends a more instrumental or pragmatic approach to EBM, in which any ranking of evidence is done by reference to the actual, rather than the theoretically expected, reliability of results. Emphasis on EBM has eclipsed other necessary research methods in medicine. With the recent emphasis on translational medicine, we are seeing a restoration of the recognition that clinical research requires an engagement with basic theory (e.g. physiological, genetic, biochemical) and a range of empirical techniques such as bedside observation, laboratory and animal studies. EBM works best when used in this context.
Green product innovation has been recognized as one of the key factors to achieve growth, environmental sustainability, and a better quality of life. Understanding green product innovation as a result of interaction between innovation and sustainability has become a strategic priority for theory and practice. This article investigates green product innovation by means of a multiple case study analysis of 12 small- to medium-sized manufacturing companies based in Italy and Canada. First, we propose a conceptual framework that presents three key environmental dimensions of green product innovation—energy minimization, materials reduction, and pollution prevention—as identified in the life cycle phases of products. Based on insights gained from in-depth interviews, we discuss firms' motivations to develop green products, environmental policies and targets for products, different dimensions of green product innovation, and challenges faced during the development and marketing of green products. Results from the study are then synthesized and integrated in a toolbox that sheds light on various aspects of green product innovation and provides solutions to challenges and risks that are faced by firms. Finally, implications for managers, academia and public policy makers are discussed.
Special 2018 Edition. From the new Introduction by Michelle Fine, Graduate Center, CUNY: "Why now, you may ask, should I return to a book written in 1988? Because, in Maxine's words: 'When freedom is the question, it is always time to begin.'" In The Dialectic of Freedom, Maxine Greene argues that freedom must be achieved through continuing resistance to the forces that limit, condition, determine, and—too frequently—oppress. Examining the interrelationship between freedom, possibility, and imagination in American education, Greene taps the fields of philosophy, history, educational theory, and literature in order to discuss the many struggles that have characterized Americans’ quests for freedom in the midst of what is conceived to be a free society. Accounts of the lives of women, immigrants, and minority groups highlight the ways in which Americans have gone in search of openings in their lived situations, learned to look at things as if they could be otherwise, and taken action on what they found. Greene presents a unique overview of American concepts and images of freedom from Jefferson’s time to the present. She examines the ways in which the disenfranchised have historically understood and acted on their freedom—or lack of it—in dealing with perceived and real obstacles to expression and empowerment. Strong emphasis is placed on the focal role of the arts and art experience in releasing human imagination and enabling the young to reach toward their vision of the possible. The author concludes with suggestions for approaches to teaching and learning that can provoke both educators and students to take initiatives, to transcend limits, and to pursue freedom—not in solitude, but in reciprocity with others, not in privacy, but in a public space.
It seems like we care about at least two features of our credence function: gradational-accuracy and verisimilitude. Accuracy-first epistemology requires that we care about one feature of our credence function: gradational-accuracy. So if you want to be a verisimilitude-valuing accuracy-firster, you must be able to think of the value of verisimilitude as somehow built into the value of gradational-accuracy. Can this be done? In a recent article, Oddie has argued that it cannot, at least if we want the accuracy measure to be proper. I argue that it can. Contents: 1 Introduction; 2 Some Nuts and Bolts; 3 First Attempts; 4 Oddie’s Constraint; 5 The Good; 5.1 Proximity over the disagreement metric; 5.2 Proximity over the magnitude metric; 6 The Bad and the Ugly; 7 Some More Good: The Role of Evenness of Distribution; 8 Some More Bad: Which Propositions to Privilege?; 9 Concluding Thoughts: Accuracy and Practical Value.
Internalists face the following challenge: what is it about an agent's internal states that explains why only these states can play whatever role the internalist thinks these states are playing? Internalists have frequently appealed to a special kind of epistemic access that we have to these states. But such claims have been challenged on both empirical and philosophical grounds. I will argue that internalists needn't appeal to any kind of privileged access claims. Rather, internalist conditions are important because of the way in which we expect them to act as causal mediators between states of the world, on the one hand, and our beliefs and actions on the other.
Recent field studies have broadened our view on cultural performances in animals. This has consequences for the concept of cumulative culture. Here, we deconstruct the common individualist and differential approaches to culture. Individualistic approaches to the study of cultural evolution are shown to be problematic, because culture cannot be reduced to factors on the micro level of individual behavior but possesses a dynamic that only occurs on the group level and profoundly affects the individuals. Naive individuals, as a prerequisite of an atomistic perspective, do not exist. We address the construction of a social approach to culture by introducing an inevitable social embedding of the individual development of social beings. The sociological notion of “habitus” as embodied cultural capital permits us to understand social transmission of behavioral components on a very basic level, resulting in a cumulative effect. Bits of information, movement, handling of material, attitudes, and preferences below distinct functional units are acquired through transfer mechanisms simpler than emulation and imitation, such as peering, participation, co-performance, or engagement with a material environment altered by group members. The search for a zero point of cumulative culture becomes as useless as the search for a zero point of culture. Culture is cumulative.
G. E. Moore observed that to assert, 'I went to the pictures last Tuesday but I don't believe that I did' would be 'absurd'. Over half a century later, such sayings continue to perplex philosophers. In the definitive treatment of the famous paradox, Green and Williams explain its history and relevance and present new essays by leading thinkers in the area.
From the Ancient Greeks, through medieval Christian doctrine, and into the modern age, philosophers have long held envy to be irrational, a position that increasingly accompanies the political view that envy is not a justification for redistributing material goods. After defining the features of envy, and considering two arguments in favour of its irrationality, this article opposes the dominant philosophical and political consensus. It does so by deploying Rawls's much-ignored concept of ‘excusable envy’ to identify a form of envy that is not imprudent and does not mis-describe. With this work completed, the article then argues – no doubt controversially – that excusable envy constitutes good grounds for redistribution or inequality-mitigation. In so doing, the article throws light on the moral significance of certain forms of uncivil disobedience, and also offers a new vocabulary for popular ‘politics of envy’ debates, which are yet to acknowledge the role of social institutions in reproducing envy-excusing economic inequalities.
The modern state claims supreme authority over the lives of all its citizens. Drawing together political philosophy, jurisprudence, and public choice theory, this book forces the reader to reconsider some basic assumptions about the authority of the state. Various popular and influential theories - conventionalism, contractarianism, and communitarianism - are assessed by the author and found to fail. Leslie Green argues that only the consent of the governed can justify the state's claims to authority. While he denies that there is a general obligation to obey the law, he nonetheless rejects philosophical anarchism and defends civility - the willingness to tolerate some imperfection in institutions - as a political virtue.