Plausibility models are Kripke models that agents use to reason about knowledge and belief, both of themselves and of each other. Such models are used to interpret the notions of conditional belief, degrees of belief, and safe belief. The logic of conditional belief contains that modality and also the knowledge modality, and similarly for the logic of degrees of belief and the logic of safe belief. With respect to these logics, plausibility models may contain too much information. A proper notion of bisimulation is required that characterises them. We define that notion of bisimulation and prove the required characterisations: on the class of image-finite and preimage-finite models, two pointed Kripke models are modally equivalent in any of the three logics if and only if they are bisimilar. As a result, the information content of such a model can be expressed equally well in the logic of conditional belief, the logic of degrees of belief, or that of safe belief. We found this a surprising result. Still, it does not mean that the logics are equally expressive: the logics of conditional and degrees of belief are incomparable, the logics of degrees of belief and safe belief are incomparable, while the logic of safe belief is more expressive than the logic of conditional belief. In view of the result on bisimulation characterisation, this is an equally surprising result. We hope our insights may contribute to the growing community working on formal epistemology and on the relation between qualitative and quantitative modelling.
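The characterisation result above concerns a refined notion of bisimulation tailored to plausibility models. As a rough, illustrative sketch of the ordinary modal notion it builds on (our own simplification, not the paper's definition; a faithful version would also have to respect the plausibility ordering), the following Python snippet computes the largest bisimulation between two finite single-agent Kripke models by iterated refinement:

```python
# Illustrative sketch only: naive fixpoint computation of the largest
# bisimulation between two finite single-agent Kripke models.
# Plausibility-model bisimulation as characterised in the paper also has
# to respect the plausibility ordering, which this sketch omits.

def largest_bisimulation(worlds1, val1, rel1, worlds2, val2, rel2):
    """Return the largest relation Z over worlds1 x worlds2 whose pairs
    agree on valuations and satisfy the forth and back conditions."""
    # Start from all pairs that agree on their propositional valuation.
    Z = {(w, v) for w in worlds1 for v in worlds2 if val1[w] == val2[v]}
    changed = True
    while changed:
        changed = False
        for (w, v) in list(Z):
            # Forth: every successor of w is matched by some successor of v.
            forth = all(any((w2, v2) in Z for v2 in rel2[v]) for w2 in rel1[w])
            # Back: every successor of v is matched by some successor of w.
            back = all(any((w2, v2) in Z for w2 in rel1[w]) for v2 in rel2[v])
            if not (forth and back):
                Z.discard((w, v))
                changed = True
    return Z


# Two tiny models that turn out to be bisimilar at their designated points.
worlds1, val1, rel1 = {"w"}, {"w": {"p"}}, {"w": {"w"}}
worlds2 = {"u", "v"}
val2 = {"u": {"p"}, "v": {"p"}}
rel2 = {"u": {"v"}, "v": {"u"}}
print(("w", "u") in largest_bisimulation(worlds1, val1, rel1, worlds2, val2, rel2))  # True
```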
In this paper, we investigate the use of event models for automated planning. Event models are the action-defining structures used to define a semantics for dynamic epistemic logic. Using event models, two issues in planning can be addressed: partial observability of the environment and knowledge. In planning, partial observability gives rise to uncertainty about the world. For single-agent domains, this uncertainty can come from incomplete knowledge of the starting situation and from the nondeterminism of actions. In multi-agent domains, an additional uncertainty arises from the fact that other agents can act in the world, causing changes that are not instigated by the agent itself. For an agent to successfully construct and execute plans in an uncertain environment, the most widely used formalism in the literature on automated planning is “belief states”: sets of different alternatives for the current state of the world. Epistemic logic is a significantly more expressive and theoretically better founded method for representing knowledge and ignorance about the world. Further, epistemic logic allows for planning according to the knowledge of other agents, making it possible to specify a more complex class of planning domains than those concerned merely with simple facts about the world. We show how to model multi-agent planning problems using Kripke models for representing world states and event models for representing actions. Our mechanism makes use of slight modifications to these concepts in order to model the internal view of agents, rather than that of an external observer. We define a type of planning domain called epistemic planning domains, a generalisation of classical planning domains, and show how epistemic planning can successfully deal with partial observability, nondeterminism, knowledge and multiple agents. Finally, we show epistemic planning to be decidable in the single-agent case, but only semi-decidable in the multi-agent case.
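The semantics the abstract refers to centres on the product update of an epistemic model with an event model. As a minimal sketch (assuming our own simplified names, and leaving out the paper's internal-perspective modification and event postconditions), the following Python snippet pairs worlds with the events whose preconditions they satisfy:

```python
# Minimal sketch of the standard DEL product update (our simplification,
# not the paper's exact formalism): an epistemic model is updated with an
# event model by pairing worlds with executable events.

def product_update(worlds, rel, val, events, erel, pre):
    """rel[a] and erel[a] are sets of pairs per agent a; pre[e] is a
    predicate on a world's valuation saying whether event e can occur."""
    new_worlds = {(w, e) for w in worlds for e in events if pre[e](val[w])}
    new_rel = {a: {((w, e), (w2, e2))
                   for (w, w2) in rel[a] for (e, e2) in erel[a]
                   if (w, e) in new_worlds and (w2, e2) in new_worlds}
               for a in rel}
    new_val = {(w, e): val[w] for (w, e) in new_worlds}
    return new_worlds, new_rel, new_val


# Agent 1 privately senses whether p holds; agent 2 cannot tell which
# observation was made, so only agent 1's uncertainty is resolved.
worlds = {"w1", "w2"}
val = {"w1": {"p"}, "w2": set()}
everyone_unsure = {("w1", "w1"), ("w2", "w2"), ("w1", "w2"), ("w2", "w1")}
rel = {1: set(everyone_unsure), 2: set(everyone_unsure)}
events = {"obs_p", "obs_not_p"}
pre = {"obs_p": lambda v: "p" in v, "obs_not_p": lambda v: "p" not in v}
erel = {1: {("obs_p", "obs_p"), ("obs_not_p", "obs_not_p")},
        2: {("obs_p", "obs_p"), ("obs_not_p", "obs_not_p"),
            ("obs_p", "obs_not_p"), ("obs_not_p", "obs_p")}}
print(sorted(product_update(worlds, rel, val, events, erel, pre)[0]))
```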
The “practice turn” in philosophy of science has strengthened the connections between philosophy and scientific practice. Apart from reinvigorating philosophy of science, this also increases the relevance of philosophical research for science, society, and science education. In this paper, we reflect on our extensive experience with teaching mandatory philosophy of science courses to science students from a range of programs at the University of Copenhagen. We highlight some of the lessons we have learned in making philosophy of science “fit for teaching” outside of philosophy circles by taking selected cases from the students’ own field as the starting point. We argue for adapting philosophy of science teaching to particular audiences of science students, and discuss the benefits of drawing on research within science education to inform curriculum and course design. This involves reconsidering teaching resources, assumptions about students, intended learning outcomes, and teaching formats. We also argue that to make philosophy of science relevant and engaging to science students, it is important to consider their potential career trajectories. By anticipating future contexts and situations in which methodological, conceptual, and ethical questions could be relevant, philosophy of science can demonstrate its value in the education of science students.
We present a case study of how mathematicians write for mathematicians. We have conducted interviews with two research mathematicians, the talented PhD student Adam and his experienced supervisor Thomas, about a research paper they wrote together. Over the course of two years, Adam and Thomas revised Adam’s very detailed first draft. At the beginning of this collaboration, Adam was very knowledgeable about the subject of the paper and had good presentational skills but, as a new PhD student, did not yet have experience writing research papers for mathematicians. Thus, one main purpose of revising the paper was to tailor it to its intended audience. For this reason, the changes made to the initial draft and the authors’ purpose in making them provide a window for viewing how mathematicians write for mathematicians. We examined how their paper attracts the interest of the reader and prepares their proofs for validation by the reader. Among other findings, we found that their paper prepares the proofs for two types of validation that the reader can easily switch between.
Ethical conduct in practice has been increasingly recognised as vital to the accountancy profession following the collapse of Andersen. The foundational principles underpinning accountancy ethics receive relatively uniform recognition worldwide, so this paper concentrates on exploring how to introduce these concepts into established courses at undergraduate level. Historically, the teaching of accounting techniques has been isolated from students' personal assimilation of accountancy's ethical values. Alternative approaches are considered: a dedicated 'capstone' ethics course, or more progressive integration within existing parts of an established curriculum. An opportunistic example of the latter is then described, with the rationale, potential benefits, student reactions and practical difficulties assessed. Overall, the paper explains why undergraduates' personal development requires them, alongside acquiring technical skills, to learn how to apply given professional values for themselves. It contributes suggestions as to methodologies, content and material for short modules within financial reporting, taxation, auditing and social/environmental accountancy courses, while reflecting on the limitations and potential of their use.
This article analyses the attempts to promote economic and social development in the Third World through techniques of empowerment and participation. Based on Michel Foucault’s analytics of government - notably the notion of self-technologies - we analyse two empowerment projects for women. We argue, first, that empowerment projects seek to constitute beneficiaries as active and responsible individuals with the ability to take charge of their own lives. Thus, empowerment should be viewed not as a transfer of power to individuals who formerly possessed little or no power, but as a technology seeking to create self-governing and responsible individuals, i.e. modern citizens in the western liberal sense. Second, through the intertwinement of anthropological knowledges and radical action research, knowledge about the local has become an authoritative mode of veridiction in development interventions. By seeking to instigate and activate ‘local knowledges’, participatory development interventions entail a crucial recasting of the governing of the target population who are now supposed - on the basis of rational decision-making, such as cost-benefit analysis - to freely join the power-loaded game of the active citizen. Third and finally, it is also maintained that the role of the developer is profoundly recast. By basing themselves on the subjective involvement of the individual developer, the participatory approaches recast development as an art form that puts at stake the ethical practices of ‘facilitators’ and beneficiaries alike.
Transparency is an increasingly prominent area of research that offers valuable insights for organizational studies. However, conceptualizations of transparency are rarely subject to critical scrutiny and thus their relevance remains unclear. In most accounts, transparency is associated with the sharing of information and the perceived quality of the information shared. This narrow focus on information and quality, however, overlooks the dynamics of organizational transparency. To provide a more structured conceptualization of organizational transparency, this article unpacks the assumptions that shape the extant literature, with a focus on three dimensions: conceptualizations, conditions, and consequences. The contribution of the study is twofold: on a conceptual level, we provide a framework that articulates two paradigmatic positions underpinning discussions of transparency, namely verifiability approaches and performativity approaches; on an analytical level, we suggest a novel future research agenda for studying organizational transparency that pays attention to its dynamics, paradoxes, and performative characteristics.
Experimental studies investigating the contribution of conscious intention to the generation of a sense of agency for one’s own actions tend to rely upon a narrow definition of intention. Often it is operationalized as the conscious sensation of wanting to move right before movement. Existing results and discussion therefore miss crucial aspects of intention, namely the conscious sensation of wanting to move well in advance of the movement. In the present experiment we used an intentional binding paradigm in which we distinguished between immediate (proximal) intention, as usually investigated, and longer-standing (distal) intention. The results showed that the binding effect was significantly enhanced for distal intentions compared to proximal intentions, indicating that the former lead to a stronger sense of agency. Our finding provides empirical support for a crucial distinction between at least two types of intention when addressing the efficacy of conscious intentions.
On Folk Epistemology explores how we ascribe knowledge to ourselves and others. Empirical evidence suggests that we do so early and often in thought as well as in talk. Since knowledge ascriptions are central to how we navigate social life, it is important to understand our basis for making them. A central claim of the book is that factors that have nothing to do with knowledge may lead to systematic mistakes in everyday ascriptions of knowledge. These mistakes are explained by an empirically informed account of how ordinary knowledge ascriptions are the product of cognitive heuristics that are associated with biases. In developing this account, Mikkel Gerken draws on work in cognitive psychology and pragmatics, while also contributing to epistemology. For example, Gerken develops positive epistemic norms of action and assertion and, moreover, critically assesses contextualism, knowledge-first methodology, pragmatic encroachment theories and more. Many of these approaches are argued to overestimate the epistemological significance of folk epistemology. In contrast, this volume develops an equilibristic methodology according to which intuitive judgments about knowledge cannot straightforwardly play a role as data for epistemological theorizing. Rather, critical epistemological theorizing is required to interpret empirical findings. Consequently, On Folk Epistemology helps to lay the foundation for an emerging sub-field that intersects philosophy and the cognitive sciences: the empirical study of folk epistemology.
Mathematicians’ use of external representations, such as symbols and diagrams, constitutes an important focal point in current philosophical attempts to understand mathematical practice. In this paper, we add to this understanding by presenting and analyzing how research mathematicians use and interact with external representations. The empirical basis of the article consists of a qualitative interview study we conducted with active research mathematicians. In our analysis of the empirical material, we primarily used the empirically based frameworks provided by distributed cognition and cognitive semantics, as well as the broader theory of cognitive integration, as an analytical lens. We conclude that research mathematicians engage in generative feedback loops with material representations, that they use representations to facilitate the use of experiences of handling the physical world as a resource in mathematical work, and that their use of representations is socially sanctioned and enabled. These results support the validity of the cognitive frameworks used as the basis for our analysis, but also show the need for augmentation and revision. In particular, we conclude that the social and cultural context cannot be excluded from cognitive analysis of mathematicians’ use of external representations. Rather, representations are socially sanctioned and enabled in an enculturation process.
Recently, several scholars have argued that scientists can accept scientific claims in a collective process, and that the capacity of scientific groups to form joint acceptances is linked to a functional division of labor between the group members. However, these accounts reveal little about how the cognitive content of the jointly accepted claim is formed, and how group members depend on each other in this process. In this paper, I shall therefore argue that we need to link analyses of joint acceptance with analyses of distributed cognition. To sketch how this can be done, I shall present a detailed case study and, on the basis of the case, analyze the process through which a group of scientists jointly accept a new scientific claim and at a later stage jointly accept to revise previously accepted claims. I shall argue that joint acceptance in science can be established in situations where an overall conceptual structure is jointly accepted by a group of scientists while detailed parts of it are distributed among group members with different areas of expertise, a condition that I shall call a heterogeneous conceptual consensus. Finally, I shall show how a heterogeneous conceptual consensus can work as a constraint against scientific change and address the question of how changes may nevertheless occur.
Knowledge ascriptions are a central topic of research in both philosophy and science. In this collection of new essays on knowledge ascriptions, world-class philosophers offer novel approaches to this long-standing topic.
This paper investigates the notion of semiotic scaffolding in relation to mathematics by considering its influence on mathematical activities and on the evolution of mathematics as a research field. We will do this by analyzing the role different representational forms play in mathematical cognition and, more broadly, in mathematical activities. In the main part of the paper, we will present and analyze three different cases. For the first case, we investigate the semiotic scaffolding involved in pencil-and-paper multiplication. For the second case, we investigate how the development of new representational forms influenced the development of the theory of exponentiation. For the third case, we analyze the connection between the development of commutative diagrams and the development of both algebraic topology and category theory. Our main conclusions are that semiotic scaffolding indeed plays a role both in mathematical cognition and in the development of mathematics itself, but mathematical cognition cannot itself be reduced to the use of semiotic scaffolding.
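As a small illustration of the first case (our own sketch, not material from the paper), the column algorithm for multiplication offloads intermediate results onto the page: the written layout of partial products is the kind of external scaffold the notion of semiotic scaffolding points to.

```python
# Illustrative sketch (ours, not the paper's): print the partial-product
# layout of the familiar column algorithm, making the paper-based
# scaffold of intermediate results visible.

def column_multiplication(a: int, b: int) -> str:
    partials = []
    for i, digit in enumerate(reversed(str(b))):
        # Each partial product is a times one digit of b, shifted by place value.
        partials.append(a * int(digit) * 10 ** i)
    width = len(str(a * b)) + 2
    lines = [f"{a:>{width}}", f"x{b:>{width - 1}}", "-" * width]
    lines += [f"{p:>{width}}" for p in partials]
    lines += ["-" * width, f"{a * b:>{width}}"]
    return "\n".join(lines)


print(column_multiplication(47, 36))
```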
The Golden Age of postwar capitalism has been eclipsed, and with it seemingly also the possibility of harmonizing equality and welfare with efficiency and jobs. Most analyses hold that the emerging postindustrial society is overdetermined by massive, convergent forces, such as tertiarization, new technologies, or globalization, all conspiring to make welfare states unsustainable in the future. Social Foundations of Postindustrial Economies takes a second, more sociological and more institutional, look at the driving forces of economic transformation. What stands out, as a result, is postindustrial diversity, not convergence. Macroscopic, global trends are undoubtedly powerful, yet their influence is easily rivalled by domestic institutional traditions, by the kind of welfare regime that, some generations ago, was put in place. It is, however, especially the family economy that holds the key to what kind of postindustrial model will emerge, and to how evolving tradeoffs will be managed. Twentieth-century economic analysis depended on a set of sociological assumptions that are now invalid. Hence, to better grasp what drives today's economy, we must begin with its social foundations.
The paper reviews the current state of play around anti-representationalist attempts at countering Clark and Toribio’s representation-hunger thesis. It introduces a distinction between different approaches to Chemero’s Radical Embodied Cognition (REC) thesis in the form of, on the one hand, those pushing a hard line and, on the other, those who are more relaxed about their anti-representationalist commitments. In terms of overcoming Clark and Toribio’s thesis, hardliners seek to avoid any mention of mental content in the activity they purport to explain. Yet, the paper argues, adopting a hard line complicates this endeavor considerably and unnecessarily. Those promoting a relaxed REC, however, are better off in that they have no problem in recognizing that some types of cognition are hybrid. By turning to Hutto and Myin’s Radical Enactivism as a prime example of a relaxed approach to the REC thesis, the paper points towards the lack of continuity between covariant information and informational content as the gap that would have to be closed in order for RECers to, once and for all, be able to dismiss Clark and Toribio’s hypothesis that certain kinds of cognition are per definition off-limits to anti-representationalism.
Previous accounting ethics research berates auditors for ethical lapses that contributed to the failure of Andersen (e.g., Duska, R.: 2005, Journal of Business Ethics 57, 17–29; Staubus, G.: 2005, Journal of Business Ethics 57, 5–15); however, some of the blame must also fall on regulatory and professional bodies that exist to mitigate auditors’ ethical lapses. In this paper, we consider the ethical and economic context that existed and facilitated Andersen’s failure. Our analysis is grounded in Akerlof’s (1970, Quarterly Journal of Economics August, 488–500) Theory of the Market for Lemons, and we characterize the market for audit reports as a market for lemons. Consistent with Akerlof’s model, we consider the appropriateness of the countervailing mechanisms that existed at the time of Andersen’s demise and that appeared to have failed in counteracting Andersen’s ethical shortcomings. Finally, we assess the appropriateness of the remedies proposed by the Sarbanes–Oxley Act of 2002 (SOA) to ensure that similar ethical lapses will not occur in the future. Our analysis indicates that the SOA regulatory reforms should counteract some of the necessary conditions of the Lemons Model, and thereby mitigate the likelihood of audit failures. However, we contend that the effectiveness of the SOA critically depends upon the focus and attention of the Public Company Accounting Oversight Board (PCAOB) towards assessing the ethical climates of public accounting firms. Assessments by the PCAOB of public accounting firms’ ethical climates are needed to sufficiently ensure that public accounting firms effectively promote and maintain audit quality in situations where unconscious bias or economic incentives may erode the public accounting firm’s independence.
Language is infused with materiality and should therefore not be considered as an abstract system that is isolated from socio-material reality. Expressions materialise language in social practices, thus providing the necessary basis for languaging activities. For this reason, it makes sense to challenge proponents of orthodox linguistics and others who hold that language can be studied in isolation from its concrete manifestations. By exploring the relation between materiality and linguistic activity, the article extends Malafouris’ Material Engagement Theory while clarifying the phenomenon of ‘linguistic denotation’. In so doing, it critiques orthodox approaches to language which trace denotation to abstract meanings and/or mental representations. The article shows how the denotative aspects of language can be cashed out in non-representational terms and, furthermore, that the interrelation of denotation and materiality is crucial to human material culture in that it allows for material engagements to transcend localised contexts. These engagements become global in Latour’s sense, and denotation thereby ceases to demand description in terms of representations.
This paper extends Mori’s uncanny valley hypothesis to include technologies that fail its basic criterion that uncanniness arises when the subject experiences a discrepancy in a machine’s human likeness. In so doing, the paper considers Mori’s hypothesis about the uncanny valley as an instance of what Heidegger calls the ‘challenging revealing’ nature of modern technology. It introduces seeming autonomy and heteronomy as phenomenological categories that ground human being-in-the-world, including our experience of things and people. It is suggested that this categorical distinction is more foundational than Heidegger’s existential structures and phenomenological categories. Having introduced this novel phenomenological distinction, the paper considers the limits of Mori’s hypothesis by drawing on an example from science fiction which shows that uncanniness need not be caused only by machines that resemble human beings. In so doing, it explores how the seeming autonomy-heteronomy distinction clarifies the uncanniness that can arise when humans encounter advanced technology, an uncanniness irreducible to the anthropocentrism that shapes Mori’s original hypothesis.
It is a commonly raised argument against the family resemblance account of concepts that, on this account, there is no limit to a concept's extension. An account of family resemblance which attempts to provide a solution to this problem by including both similarity among instances and dissimilarity to non-instances has been developed by the philosopher of science Thomas Kuhn. Similar solutions have been hinted at in the literature on family resemblance concepts, but the solution has never received a detailed investigation. I shall provide a reconstruction of Kuhn's theory and argue that his solution necessitates a developmental perspective which builds on both the transmission of taxonomies between generations and a progressive development through history.
I ask three questions related to the claims made within the staying alive theory (SAT): Is survival more fitness-enhancing for females than for males? Does the historical record on sex differences in mortality support the SAT? Is it possible to talk about “independent selective pressures on both male and female traits” when all we have are sex/gender comparisons?
A phenomenon is that which appears. In his phenomenology, Jean-Luc Marion shows how a phenomenon that appears in and out of itself evades the metaphysical demand of grounding. Classical philosophy has acknowledged phenomena only in so far as they can be sanctioned by the concepts of the intellect. This holds good also of Husserl’s constitutive ego. Now, Marion distinguishes between such intuitively “poor phenomena” and the “saturated phenomena” that exceed the intentional consciousness; they are given not by the consciousness but to the consciousness in an excess of intuition. This “gift of appearance” is Marion’s main concern, in the visible in general, and in painting in particular. But whereas idols only reflect our own desire to see and to be seen, icons surprise us by the gaze the saint directs on us. A picture is the scene of a possible revelation; and the revelation is nothing but the phenomenon taken in its fullest meaning: intuitive saturation at its maximum. A crucial question, nonetheless, remains: What is the relation between revelation as a phenomenological possibility, and Revelation as a theological dogma of the utmost importance?
Language learning is not primarily driven by a motivation to describe invariant features of the world, but rather by a strong force to be a part of the social group, which by definition is not invariant. It is not sufficient for language to be fit for the speaker's perceptual-motor system. It must also be fit for social interactions.
Over the last century, there have been considerable variations in the frequency of use and types of diagrams used in mathematical publications. In order to track these changes, we developed a method enabling large-scale quantitative analysis of mathematical publications to investigate the number and types of diagrams published in three leading mathematical journals in the period from 1885 to 2015. The results show that diagrams were relatively common at the beginning of the period under investigation. However, beginning in 1910, they were almost completely unused for about four decades before reappearing in the 1950s. The diagrams from the 1950s, however, were of a different type than those used earlier in the century. We see this change in publication practice as a clear indication that the formalist ideology has influenced mathematicians’ choice of representations. Although this could be seen as a minor stylistic aspect of mathematics, we argue that mathematicians’ representational practice is deeply connected to their cognitive practice and to the contentual development of the discipline. These changes in publication style therefore indicate more fundamental changes in the period under consideration.
Electronic computers form an integral part of modern mathematical practice. Several high-profile results have been proven with techniques where computer calculations form an essential part of the proof. In the traditional philosophical literature, such proofs have been taken to constitute a posteriori knowledge. However, this traditional stance has recently been challenged by Mark McEvoy, who claims that computer calculations can constitute a priori mathematical proofs, even in cases where the calculations made by the computer are too numerous to be surveyed by human agents. In this article we point out the deficits in the traditional literature that called for McEvoy’s correction. We also explain why McEvoy’s defence of mathematical apriorism fails, and we discuss how the debate over the epistemological status of computer-assisted mathematics contains several unfortunate conceptual reductions.
This paper presents and discusses empirical results from a survey about the research practice of Danish chemistry students, with a main focus on the question of anomalous data. It seeks to investigate how such data are handled by students, with special attention to so-called ‘questionable research practices’ (QRPs), where anomalous data are simply deleted or discarded. This question of QRPs is of particular importance as the educational practices students experience may influence how they act in their future professional careers, for instance in research. The ethical evaluation of QRPs, however, is not univocal. In parts of the literature QRPs are seen as unquestionably bad, while in other parts certain QRPs are seen as a necessary aspect of scientific practice. Results from the survey of Danish chemistry students show that many students engage in certain types of questionable practices, and that a large minority of the students have been actively encouraged by their teachers to engage in such practices. The paper discusses to what extent and under what circumstances such instructional practices can be defended, and suggests how the instructional practice connected to the handling of anomalous data can be improved.
I develop an approach to action and practical deliberation according to which the degree of epistemic warrant required for practical rationality varies with practical context. In some contexts of practical deliberation, very strong warrant is called for. In others, less will do. I set forth a warrant account (WA) that captures this idea. I develop and defend (WA) by arguing that it is more promising than a competing knowledge account of action due to John Hawthorne and Jason Stanley. I argue that cases of warranted false belief speak in favor of (WA) and against the knowledge account. Moreover, I note some problems with an “excuse maneuver” that proponents of the knowledge account frequently invoke in response to cases of warranted false belief. Finally, I argue that (WA) may provide a strict invariantist account of cases that have been thought to motivate interest-relative or subject-sensitive theories of knowledge and warrant.
Postphenomenologists and performativists criticize classical approaches to phenomenology for isolating human subjects from their socio-material relations. The purpose of this essay is to repudiate their criticism by presenting a nuanced account of phenomenology, thus making it evident that phenomenological theories have the potential for meshing with the performative idiom of contemporary science and technology studies. However, phenomenology retains an apparent shortcoming in that its proponents typically focus on human–nonhuman relations that arise in localized contexts. For this reason, it seems to contrast with one of the core assumptions behind practical ontologies: that socio-practical significance extends beyond an agent’s immediate situatedness in a localized context. Turning to Heidegger’s phenomenology and his notion of ‘de-distancing’, the essay explores how localized phenomena that pertain to human experience connect with global practices and, thus, the possibility of consilience between phenomenological research and present-day STS.
I present a challenge to epistemological pragmatic encroachment theories from epistemic injustice. The challenge invokes the idea that a knowing subject may be wronged by being regarded as lacking knowledge due to social identity prejudices. However, in an important class of such cases, pragmatic encroachers appear to be committed to the view that the subject does not know. Hence, pragmatic encroachment theories appear to be incapable of accounting for an important type of injustice – namely, discriminatory epistemic injustice. Consequently, pragmatic encroachment theories run the risk of obscuring or even sanctioning epistemically unjust judgments that arise due to problematic social stereotypes or unjust folk epistemological biases. In contrast, the epistemological view that rejects pragmatic encroachment – namely, strict purist invariantism – is capable of straightforwardly diagnosing the cases of discriminatory epistemic injustice as such. While the challenge is not a conclusive one, it calls for a response. Moreover, it illuminates very different conceptions of epistemology’s role in mitigating epistemic injustice.
This book concentrates upon how economic rationalities have been embedded into particular historical practices, cultures, and moral systems. Through multiple case studies, situated in different historical contexts of the modern West, the book shows that the development of economic rationalities takes place in the meeting with other regimes of thought, values, and moral discourses. The book offers new and refreshing insights, ranging from the development of early economic thinking to economic aspects and concepts in the works of classical thinkers such as Thomas Hobbes, John Locke and Karl Marx, to the role of economic reasoning in contemporary policies of art and health care. With economic rationalities as the red thread, the reader is offered a unique chance of historical self-awareness and recollection of how economic rationality became the powerful ideological and moral force that it is today.
Mathematicians appear to have quite high standards for when they will rely on testimony. Many mathematicians require that a number of experts testify that they have checked the proof of a result p before they will rely on p in their own proofs without checking the proof of p. We examine why this is. We argue that for each expert who testifies that she has checked the proof of p and found no errors, the likelihood that the proof contains no substantial errors increases because different experts will validate the proof in different ways depending on their background knowledge and individual preferences. If this is correct, there is much to be gained for a mathematician from requiring that a number of experts have checked the proof of p before she will rely on p in her own proofs without checking the proof of p. In this way a mathematician can protect her own work and the work of others from errors. Our argument thus provides an explanation for mathematicians’ attitude towards relying on testimony.
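The argument above is qualitative, but a toy calculation (ours, not the authors') makes the direction of the effect explicit. Suppose, as an idealisation, that checker i independently misses a given substantial error with probability m_i < 1; the authors' point that different experts validate proofs in different ways is what makes such an independence assumption less unreasonable.

```latex
% Toy model (ours, not the authors'): if checker i misses a given
% substantial error with probability m_i, and the checks are treated as
% independent, then the probability that the error survives n checks is
\[
  \Pr(\text{error undetected after } n \text{ checks}) \;=\; \prod_{i=1}^{n} m_i ,
\]
% which strictly decreases with every additional expert who reports
% having found no errors.
```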
While we witness a growing belief in transparency as an ideal solution to a wide range of societal problems, we know less about the practical workings of transparency as it guides conduct in organizational and regulatory settings. This article argues that transparency efforts involve much more than the provision of information and other forms of ‘sunlight’, and are a matter of managing visibilities rather than of providing insight and clarity. Building on actor-network theory and Foucauldian governmentality studies, it calls for careful attention to the ways in which transparency ideals are translated into more situated practices and become associated with specific organizational and regulatory concerns. The article conceptualizes transparency as a force that shapes conduct in organizational and socio-political domains. In the second section, this conceptualization of transparency as a form of ‘ordering’ is substantiated further by using illustrations of the effects of transparency efforts in the internet domain.