This collection of essays breaks new ground by providing an unparalleled snapshot of new work in political philosophy. The book brings together up-and-coming scholars from across the globe using such diverse methodologies as critical theory and social choice theory, historical analysis and conceptual analysis. The volume demonstrates the vibrancy of contemporary political theorizing not only when treating perennial topics--democracy, equality, legitimacy, liberty, patriotism, political freedom, rationality--but also when revivifying topics briefly out of favor--human needs, ideology, judgment, political aesthetics--and tackling topics more recently put on the agenda--citizenship, collective agency, cultural contexts, feminism, identity, multiculturalism, social suffering, subjectivity.
This paper applies emerging research on epistemic virtues to business ethics. Inspired by recent work on epistemic virtues in philosophy, I develop a view in which epistemic virtues contribute to the acquisition of knowledge that is instrumentally valuable in the realisation of particular ends, business ends in particular. I propose a conception of inquiry according to which epistemic actions involve investigation, belief adoption and justification, and relate this to the traditional ‘justified true belief’ analysis of knowledge. I defend the view that epistemic virtues enable and/or motivate people to perform epistemic actions. An examination of the key epistemic virtues of love of knowledge, epistemic courage, temperance, justice, generosity and humility provides some initial evidence suggesting that the way epistemic virtues enable or motivate is by countering a number of biases that have been uncovered by behavioural economics, and also indicates ways in which the instrumental epistemic value view is superior to other approaches to epistemic virtue offered in the literature.
This article applies philosophical work on epistemic injustice and cognate concepts to study gender and racial disparity in financial markets. Members of disadvantaged groups often receive inferior financial services. In most jurisdictions, it is illegal to provide discriminatorily disparate treatment to groups defined by gender and skin colour. Racial disparity in financial services is generally considered to be discriminatory. Among most regulators, however, the standard view is that gender disparity is not discriminatory. Through an analysis of various exemplary cases, I propose testimonial injustice as a candidate explanation for some of the existing forms of racial disparity found in financial services. I show how prejudices about gender and finance decrease epistemic self-confidence, and how this leads to gender disparity. And I consider particularly intractable forms of self-fulfilling testimonial injustice.
In this topical book, Boudewijn de Bruin examines the ethical 'blind spots' that lie at the heart of the global financial crisis. He argues that the most important moral problem in finance is not the 'greed is good' culture, but rather the epistemic shortcomings of bankers, clients, rating agencies and regulators. Drawing on insights from economics, psychology and philosophy, de Bruin develops a novel theory of epistemic virtue and applies it to racist and sexist lending practices, subprime mortgages, CEO hubris, the Madoff scandal, professionalism in accountancy and regulatory outsourcing of epistemic responsibility. With its multidisciplinary reach, Ethics and the Global Financial Crisis will appeal to scholars working in philosophy, business ethics, economics, psychology and the sociology of finance. The many concrete examples and case studies mean that this book will also prove useful to policy-makers and regulators.
This paper presents an argument for the value of privacy based solely on a purely negative concept of freedom. I show that privacy invasions may decrease a person’s negative freedom as well as a person’s knowledge about the negative freedom she possesses. I argue that not only invasions that lead to actual interference, but also invasions that lead to potential interference (many cases of identity theft) constitute actual harm to the invadee’s liberty interests, and I critically examine the courts’ reliance on a principle of ‘no harm, no foul’ in recent data breach cases. Using a number of insights from the psychology of human belief, I also show that the liberal claim for protection of privacy is strengthened by the observation that often the privacy invader cannot be held responsible for the influence on the invadee’s negative freedom.
Contents. Introduction. 1. Preliminaries. 2. Normal Form Games. 3. Extensive Games. 4. Applications of Game Theory. 5. The Methodology of Game Theory. Conclusion. Appendix. Bibliography. Index. Does game theory—the mathematical theory of strategic interaction—provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory—the first monograph on the philosophy of game theory—is an attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. I prove new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and explore in detail the logical form of game theory as it is used in explanatory and normative contexts. I argue that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly owing to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, I argue, has been rather successful in achieving this aim. "The 'epistemic' approach to game theory has emerged over the past twenty-five years. What is this approach? How does it differ from the conventional equilibrium-based approach to game theory? What have been its strengths and weaknesses to date? To find out, read this comprehensive and excellently written account". Adam Brandenburger, J. P. Valles Professor of Business Economics and Strategy, Stern School of Business, New York University. "Reading Boudewijn de Bruin's book should be rewarding both for game theorists interested in the conceptual foundations of their discipline and for philosophers who want to learn more about formal analysis of strategic interaction. It provides an in-depth logical study of the currently dominant epistemic approaches to non-cooperative games, with an eye both to the attractions and to the serious challenges facing the Epistemic Programme". Wlodek Rabinowicz, Professor of Practical Philosophy, Department of Philosophy, Lund University.
Game theory is the mathematical study of strategy and conflict. It has wide applications in economics, political science, sociology, and, to some extent, in philosophy. Where rational choice theory or decision theory is concerned with individual agents facing games against nature, game theory deals with games in which all players have preference orderings over the possible outcomes of the game. This paper gives an informal introduction to the theory and a survey of applications in diverse branches of philosophy. No criticism is reviewed. Game theory is shown at work in discussions about epistemological dependence, liberalism and efficiency, Hume’s concept of convention, morality and rationality, and distributive justice and egalitarianism. A guide to the literature provides hints at applications in collective intentionality, epistemology, ethics, history of philosophy, logic, philosophy of language, and political philosophy.
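The abstract above notes that game theory studies games in which every player has a preference ordering over the possible outcomes. As a minimal illustration (not drawn from the paper), the following sketch encodes a standard Prisoner's Dilemma in normal form and finds its pure-strategy Nash equilibria by checking best responses; the strategy names and payoff numbers are stock textbook values, not the paper's.

```python
from itertools import product

# A standard Prisoner's Dilemma in normal form (illustrative payoffs).
# payoffs[(row, col)] = (row player's payoff, column player's payoff)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def pure_nash_equilibria(payoffs, strategies):
    """Return all profiles from which no player gains by unilateral deviation."""
    equilibria = []
    for r, c in product(strategies, repeat=2):
        row_ok = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in strategies)
        col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in strategies)
        if row_ok and col_ok:
            equilibria.append((r, c))
    return equilibria

print(pure_nash_equilibria(payoffs, strategies))  # [('defect', 'defect')]
```

Mutual defection is the unique pure-strategy Nash equilibrium even though mutual cooperation is better for both players, which is why the game features so often in the debates about morality and rationality the abstract mentions.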
This chapter argues for deregulation of the credit-rating market. Credit-rating agencies are supposed to contribute to the informational needs of investors trading bonds. They provide ratings of debt issued by corporations and governments, as well as of structured debt instruments (e.g. mortgage-backed securities). As many academics, regulators, and commentators have pointed out, the ratings of structured instruments turned out to be highly inaccurate, and, as a result, they have argued for tighter regulation of the industry. This chapter shows, however, that the role of credit-rating agencies in achieving justice in finance is not as great as these commentators believe. It therefore argues instead for deregulation. Since the 1930s, lawgivers have unjustifiably elevated the rating agencies into official, legally binding sources of information concerning credit risk, thereby unjustifiably causing many institutional investors to outsource their epistemic responsibilities, that is, their responsibility to investigate credit risk themselves.
Cloud computing is rapidly gaining traction in business. It offers businesses online services on demand (such as Gmail, iCloud and Salesforce) and allows them to cut costs on hardware and IT support. This is the first paper in business ethics dealing with this new technology. It analyzes the informational duties of hosting companies that own and operate cloud computing datacenters (e.g., Amazon). It considers the cloud services providers leasing ‘space in the cloud’ from hosting companies (e.g., Dropbox, Salesforce). And it examines the business and private ‘clouders’ using these services. The first part of the paper argues that hosting companies, services providers and clouders have mutual informational (epistemic) obligations to provide and seek information about relevant issues such as consumer privacy, reliability of services, data mining and data ownership. The concept of interlucency is developed as an epistemic virtue governing ethically effective communication. The second part considers potential forms of government restrictions on or proscriptions against the development and use of cloud computing technology. Referring to the concept of technology neutrality, it argues that interference with hosting companies and cloud services providers is hardly ever necessary or justified. It is argued, however, that businesses using cloud services (e.g., banks, law firms and hospitals storing client data in the cloud) will have to follow rather more stringent regulations.
This chapter considers finance at its very foundations, namely, at the place where assumptions are made about how to measure the two key ingredients of finance: risk and return. It is well known that returns for a large class of assets display a number of stylized facts that cannot be squared with the traditional views of 1960s financial economics (normality and continuity assumptions, i.e. the Brownian representation of market dynamics). Despite the empirical counterevidence, normality and continuity assumptions remained part and parcel of financial theory and practice, embedded in all financial practices and beliefs. Our aim is to build on this puzzle to extract some clues revealing the use of one research strategy in the academic community: model tinkering, defined as a particular research habit. We choose to focus on one specific moment in the scientific controversies of academic finance: the ‘leptokurtic crisis’ opened by Mandelbrot in 1962. The profundity of the crisis came from the angle of Mandelbrot’s attack: not only did he emphasize the empirical inadequacy of the Brownian representation, but he also argued that this representation was inadequately grounded. We give some insights into this crisis and display the model-tinkering strategies of the financial academic community in the 1970s and the 1980s.
Financial incentives, learning, group consultation, and increased experimental control are among the experimental techniques economists have successfully used to deflect the behavioral challenge posed by research conducted by such scholars as Tversky and Kahneman. These techniques save the economic armchair to the extent that they align laypeople's judgments with economic theory by increasing cognitive effort and reflection in experimental subjects. It is natural to hypothesize that a similar strategy might work to address the experimental or restrictionist challenge to armchair philosophy. To test this hypothesis, a randomized controlled experiment was carried out, as well as two lab experiments. Three types of knowledge attribution tasks were used. No support for the hypothesis was found. The paper describes the close similarities between the economist’s response to the behavioral challenge, and the expertise defense against the experimental challenge, and presents the experiments, results, and an array of robustness checks. The upshot is that these results make the experimental challenge all the more forceful.
The global financial crisis has led to a surprising interest in professional oaths in business. Examples are the MBA Oath, the Economist’s Oath and the Dutch Banker’s Oath, which senior executives in the financial services industry in the Netherlands have been obliged to swear since 2010. This paper is among the first to consider oaths from the perspective of business ethics. A framework is presented for analysing oaths in terms of their form, their content and the specific contribution they make to business ethics management: oaths may foster professionalism, facilitate moral deliberation and enhance compliance. This framework is used to analyse and evaluate the MBA Oath, the Economist’s Oath and the Banker’s Oath as well as various other similar initiatives.
This paper contributes to an increasing literature strengthening the connection between epistemic logic and epistemology (Van Benthem, Hendricks). I give a survey of the most important applications of epistemic logic in epistemology. I show how it is used in the history of philosophy (Steiner's reconstruction of Descartes' sceptical argument), in solutions to Moore's paradox (Hintikka), in discussions about the relation between knowledge and belief (Lenzen) and in an alleged refutation of verificationism (Fitch) and I examine an early argument about the (im)possibility of epistemic logic (Hocutt). Subsequently, I deal with interpretive questions about epistemic logic that, although implicitly, already appeared in the first section. I contend that a conception of epistemic logic as a theory of knowledge assertions is incoherent, and I argue that it does not make sense to adopt a normative interpretation of epistemic logic. Finally, I show ways to extend epistemic logic with other branches of philosophical logic so as to make it useful for some epistemological questions. Conditional logics and logics of public announcement are used to understand causal theories of knowledge and versions of reliabilism. Temporal logic helps understand some dynamic aspects of knowledge as well as the verificationist thesis.
Using epistemic logic, we provide a non-probabilistic way to formalise payoff uncertainty, that is, statements such as ‘player i has approximate knowledge about the utility functions of player j.’ We show that on the basis of this formalisation common knowledge of payoff uncertainty and rationality (in the sense of excluding weakly dominated strategies, due to Dekel and Fudenberg (1990)) characterises a new solution concept we have called ‘mixed iterated strict weak dominance’.
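The solution concept above builds on the Dekel–Fudenberg procedure: one round of elimination of weakly dominated strategies, followed by iterated elimination of strictly dominated strategies. The sketch below is a hedged pure-strategy version of that procedure for two-player games (the paper's concept also quantifies over mixed strategies and adds epistemic conditions, which this code does not attempt); the example game is made up for illustration.

```python
def dominated(P, own, other, strict=True):
    """Strategies in `own` that some other surviving strategy of the same
    player pure-strategy-dominates, given the opponent's surviving set `other`.
    P[i][c] is that player's payoff from strategy i against opponent choice c."""
    out = set()
    for i in own:
        for j in own:
            if i == j:
                continue
            diffs = [P[j][c] - P[i][c] for c in other]
            if strict and all(d > 0 for d in diffs):
                out.add(i)
            if not strict and all(d >= 0 for d in diffs) and any(d > 0 for d in diffs):
                out.add(i)
    return out

def dekel_fudenberg(A, B):
    """One simultaneous round of weak dominance, then iterated strict dominance.
    A[r][c] / B[r][c]: row / column player's payoffs. Returns surviving indices."""
    rows, cols = set(range(len(A))), set(range(len(A[0])))
    Bt = [list(col) for col in zip(*B)]  # column player's payoffs, by column
    # Round 1: remove weakly dominated strategies for both players at once.
    wr = dominated(A, rows, cols, strict=False)
    wc = dominated(Bt, cols, rows, strict=False)
    rows, cols = rows - wr, cols - wc
    # Then iterate strict dominance until nothing changes.
    while True:
        sr = dominated(A, rows, cols, strict=True)
        sc = dominated(Bt, cols, rows, strict=True)
        if not sr and not sc:
            return sorted(rows), sorted(cols)
        rows, cols = rows - sr, cols - sc

# Hypothetical 2x2 game: each player's second strategy is weakly dominated.
A = [[1, 1], [1, 0]]
B = [[1, 0], [1, 1]]
print(dekel_fudenberg(A, B))  # ([0], [0])
```

In this toy game the single round of weak dominance already does all the work; in larger games the subsequent rounds of strict dominance continue to shrink the strategy sets.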
We develop a logical system that captures two different interpretations of what extensive games model, and we apply this to a long-standing debate in game theory between those who defend the claim that common knowledge of rationality leads to backward induction or subgame perfect (Nash) equilibria and those who reject this claim. We show that a defense of the claim à la Aumann (1995) rests on a conception of extensive game playing as a one-shot event in combination with a principle of rationality that is incompatible with it, while a rejection of the claim à la Reny (1988) assumes a temporally extended, many-moment interpretation of extensive games in combination with implausible belief revision policies. In addition, the logical system provides an original inductive and implicit axiomatization of rationality in extensive games based on relations of dominance rather than the usual direct axiomatization of rationality as maximization of expected utility.
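The backward-induction solution at issue in the abstract can be made concrete with a small example. The sketch below (a generic illustration, not the paper's formalism) solves a hypothetical two-player centipede-style tree by recursing from the leaves: at each decision point the mover picks the action whose solved continuation gives her the highest payoff.

```python
# Hypothetical two-player extensive game (a short centipede-style tree).
# An internal node is ("move", player, {action: subtree}); a terminal node
# is a payoff tuple (payoff to player 0, payoff to player 1).
game = ("move", 0, {
    "take": (1, 0),
    "pass": ("move", 1, {
        "take": (0, 2),
        "pass": (3, 1),
    }),
})

def backward_induction(node, history="root"):
    """Return (payoffs, plan); plan maps each history to the action chosen there.
    Ties are broken by keeping the first action encountered."""
    if node[0] != "move":                     # terminal node: a payoff tuple
        return node, {}
    _, player, actions = node
    plan, best = {}, None
    for action, subtree in actions.items():
        payoffs, subplan = backward_induction(subtree, history + "/" + action)
        plan.update(subplan)
        if best is None or payoffs[player] > best[1][player]:
            best = (action, payoffs)
    plan[history] = best[0]
    return best[1], plan

payoffs, plan = backward_induction(game)
print(payoffs)         # (1, 0): player 0 takes immediately
print(plan["root"])    # 'take'
```

Note the characteristic result the Aumann–Reny debate turns on: although both players would do better if both passed, backward induction has player 1 taking at the second node, so player 0 takes at once.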
The paper argues that the Nash Equilibrium Refinement Programme was less successful than its competitor, the Epistemic Programme. The prime criterion of success is the extent to which the programmes were able to reach the key objective guiding non-cooperative game theory for much of the twentieth century, namely, to develop a complete characterisation of the strategic rationality of economic agents in the form of the ultimate solution concept for any normal form and extensive game. The paper explains this in terms of unjustified degrees of mathematisation in the Nash Equilibrium Refinement Programme. While this programme’s mathematical models were often inspired by purely mathematical concerns rather than the economic phenomena they were intended to model, the Epistemic Programme’s models were developed with a keen eye to the role beliefs and desires play in strategic interaction between rational economic agents playing games; that is, their interactive epistemology. The Epistemic Programme succeeded in developing mathematical models formalising aspects of strategic interaction that remained implicit in the Nash Equilibrium Refinement Programme owing to an unjustified degree of mathematisation. As a result, the Epistemic Programme is the more successful theory. Keywords: Epistemic Programme; Game theory; Interactive epistemology; Mathematisation; Nash equilibrium.
Margaret Gilbert's plural subject theory defines social collectives in terms of common knowledge of expressed willingness to participate in some joint action. The author critically examines Gilbert's application of this theory to linguistic phenomena involving "we," arguing that recent work in linguistics provides the tools to develop a superior account. The author indicates that, apart from its own relevance, one should care about this critique because Gilbert's claims about the first person plural pronoun play a role in the argument in favor of her recent theory of political obligation. Key Words: collective agent • Gilbert • plural subject • semantics • we.
Susan Hurley has argued against a well-known argument for freedom of speech, the argument from autonomy, on the basis of two hypotheses about violence in the media and aggressive behaviour. The first hypothesis says that exposure to media violence causes aggressive behaviour; the second, that humans have an innate tendency to copy behaviour in ways that bypass conscious deliberation. I argue, first, that Hurley is not successful in setting aside the argument from autonomy. Second, I show that the empirical data are irrelevant to statutory regulation of media violence. They do not yield a sufficiently strong correlation between exposure to media violence and non-autonomously copied criminal violence, and they do not yield a way ex ante to individuate the viewers who will be affected by media violence.
I argue that game theoretic explanations of human actions make implausible epistemological assumptions. A logical analysis of game theoretic explanations shows that they do not conform to the belief-desire framework of action explanation. Epistemic characterization theorems (specifying sufficient conditions for game theoretic solution concepts to obtain) are argued to be the canonical way to make game theory conform to that framework. The belief formation practices implicit in epistemic characterization theorems, however, disregard all information about players except what can be found in the game itself. And such a practice of belief formation is, I show, implausible.
The mathematical tools of game theory are frequently used in the social sciences and economic consultancy. But how do they explain social phenomena and support prescriptive judgments? And is the use of game theory really necessary? I analyze the logical form of explanatory and prescriptive game theoretical statements, and argue for two claims: (1) explanatory game theory can and should be reduced to rational choice theory in all cases; and (2) prescriptive game theory gives bad advice in some cases, is reducible to rational choice theory in other cases, while it makes no sense in yet other cases.
This paper presents an epistemological or knowledge-theoretic reinterpretation of the role of external accountants. It presents a joint epistemic agent model in which corporate management and accountants together form a source of testimonial knowledge for the firm’s stakeholders about the firm’s financial situation. Recent work from virtue epistemology is used, according to which knowledge is, roughly, true belief that is justified by way of the exercise of epistemic virtue. In the joint epistemic agent model, corporate management provides information, while the accountants ensure justification. The paper argues that to ensure justification, accountants have to exercise self-regarding epistemic virtues such as open-mindedness, but also other-regarding epistemic virtues such as generosity. It is also argued that these virtues are only partly encompassed in existing professional codes of conduct.
We present an argument against a standard evidentialist position on the ethics of belief. We argue that sometimes a person merits criticism for holding a belief even when that belief is well supported by her evidence in any relevant sense. We show how our argument advances the case for anti-evidentialism in the light of other arguments presented in the recent literature, and respond to a set of possible evidentialist rejoinders.
Why are mistaken beliefs about Covid-19 so prevalent? Political identity, education and other demographic variables explain only a part of individual differences in the susceptibility to Covid-19 misinformation. This paper focuses on another explanation: epistemic vice. Epistemic vices are character traits that interfere with acquiring, maintaining, and transmitting knowledge. If the basic assumption of vice epistemology is right, then people with epistemic vices such as indifference to the truth or rigidity in their belief structures will tend to be more susceptible to believing Covid-19 misinformation. We carried out an observational study (US sample, n = 998) in which we measured the level of epistemic vice of participants using a novel Epistemic Vice Scale. We also asked participants questions eliciting the extent to which they subscribe to myths and misinformation about Covid-19. We find overwhelming evidence to the effect that epistemic vice is associated with susceptibility to Covid-19 misinformation. In fact, the association turns out to be stronger than with political identity, educational attainment, scores on the Cognitive Reflection Test, personality, dogmatism, and need for closure. We conclude that this offers evidence in favor of the empirical presuppositions of vice epistemology.
This paper presents two studies on the development and validation of a ten-item scale of epistemic vice and the relationship between epistemic vice and misinformation and fake news. Epistemic vices have been defined as character traits that interfere with acquiring, maintaining, and transmitting knowledge. Examples of epistemic vice are gullibility and indifference to knowledge. It has been hypothesized that epistemically vicious people are especially susceptible to misinformation and conspiracy theories. We conducted one exploratory and one confirmatory observational survey study on Amazon Mechanical Turk among people living in the United States. We show that two psychological traits underlie the range of epistemic vices that we investigated: indifference to truth and rigidity. Indifference manifests itself in a lack of motivation to find the truth. Rigidity manifests itself in being insensitive to evidence. We develop a scale to measure epistemic vice with the subscales indifference and rigidity. The Epistemic Vice Scale is internally consistent; has good convergent, divergent, and discriminant validity; and is strongly associated with the endorsement of misinformation and conspiracy theories. Epistemic vice explains additional variance in the endorsement of misinformation and conspiracy theories over and above demographic and related psychological concepts and shows medium to large effect sizes across outcome measures. We demonstrate that epistemic vice differs from existing psychological constructs, and show that the scale can explain individual differences in dealing with misinformation and conspiracy theories. We conclude that epistemic vice might contribute to “postfactive” ways of thinking.
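The abstract reports that the Epistemic Vice Scale is internally consistent. Internal consistency of a multi-item scale is conventionally summarized by Cronbach's alpha; the sketch below shows the standard formula on made-up response data (a general illustration of the statistic, not the paper's analysis or data).

```python
def cronbach_alpha(items):
    """items: list of k item-score lists, each with one entry per respondent.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k, n = len(items), len(items[0])
    def var(xs):                      # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Made-up responses: 3 items, 4 respondents (NOT the paper's data).
items = [[1, 2, 3, 4],
         [1, 2, 3, 4],
         [2, 2, 4, 4]]
print(round(cronbach_alpha(items), 3))  # 0.975
```

Items that covary strongly push alpha toward 1; a scale like the one described in the abstract would typically be reported with alpha per subscale alongside the validity evidence.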
This paper analyzes the logical form of valuing. I argue that valuing a concept or property is a universal statement qua logical form, that valuing an object is an existential statement qua logical form, and, furthermore, that a correct analysis of the logical form of valuing contains doxastic operators. I show that these ingredients give rise to an interesting interplay between uniform and ununiform quantification, on the one hand, and de dicto and de re beliefs, on the other. I apply this analysis to the value of political freedom. The received view is that the value of freedom lies in the value of the specific things one is free to do. But Ian Carter has recently shown that freedom has irreducible, "non-specific" value, too. I show that underlying the debate between the proponents of the received view and their critics is a disagreement about logical form: ununiform de dicto beliefs about freedom as a concept, for the received view, and uniform half-de dicto-half-de re beliefs about freedom as an object, for its critics.
Several scholars have argued that Wittgenstein held the view that the notion of number is presupposed by the notion of one-one correlation, and that therefore Hume's principle is not a sound basis for a definition of number. I offer a new interpretation of the relevant fragments on philosophy of mathematics from Wittgenstein's Nachlass, showing that if different uses of ‘presupposition’ are understood in terms of de re and de dicto knowledge, Wittgenstein's argument against the Frege-Russell definition of number turns out to be valid on its own terms, even though it depends on two epistemological principles logicist philosophers of mathematics may find too ‘constructivist’.
In this note, I show how Christian List's modal logic of republican freedom (as published in this journal in 2006) can be extended (1) to grasp the differences between liberal freedom (non-interference) and republican freedom (non-domination) in terms of two purely logical axioms and (2) to cover a more recent definition of republican freedom in terms of `arbitrary interference' that is gaining popularity in the literature.
This paper presents new evidence on the impact of socioeconomic status and education on knowledge attribution. I examine a variety of cases, including vignettes where agents have been Gettiered, have false beliefs, and possess knowledge. Early work investigated whether SES might be associated with knowledge attribution (Weinberg et al. in Philosophical Topics 29:429–460, 2001; Seyedsayamdost in Episteme 12:95–116, 2014). But these studies used college education as a dummy variable for SES. I use the recently developed Great British Class Survey (Savage et al. in Sociology 47:219–250, 2013) to measure SES. The paper reports evidence against an association between SES and patterns of knowledge ascription, and reports mixed evidence about education effects.
Jacob Glazer and Ariel Rubinstein proffer an exciting new approach to analyze persuasion, using formal tools from economics to address questions that argumentation theorists, logicians, and cognitive and social psychologists have been interested in since Aristotle's Rhetoric. In this note I examine to what extent their approach is successful, and show ways to extend it.
In this paper I criticize Popper's conception of the rationality principle in the social sciences. First, I survey Popper's outlook on the role of a principle of rationality in theorizing in the social sciences. Then, I critically examine his view on the status of the principle of rationality, concluding that the arguments supporting it are quite weak. Finally, I contrast his standpoint with an alternative conception. This, I show, helps us understand better Popper's reasons for adopting his perspective on rationality.
The ethical practices of credit rating agencies, particularly following the 2008 financial crisis, have been subject to extensive analysis by economists, ethicists, and policymakers. We raise a novel issue facing CRAs that has to do with a problem concerning the transmission of epistemic status of ratings from CRAs to the beneficiaries of the ratings, and use it to provide a new challenge for regulators. Building on recent work in philosophy, we argue that since CRAs have different stakes than the beneficiaries of the ratings in the ratings being accurate, what counts as knowledge concerning credit risk for a CRA may not count as knowledge for the beneficiary. Further, as it stands, many institutional investors are bound by law to make some of their investment decisions dependent on the ratings of officially recognized CRAs. We argue that the observation that the epistemic status of ratings does not transmit from CRAs to beneficiaries makes salient a new challenge for those who think current regulation regarding the CRAs is prudentially justified, namely, to show that the harm caused by acting on a rating that does not have epistemic status for beneficiaries is compensated by the benefit from them acting on a CRA rating that does have epistemic status for the CRA. Unlike most other commentators, therefore, we offer a defeasible reason to drop references to CRAs in prudential regulation of the financial industry.
This paper argues that liberal freedom (non-interference) is epistemologically prior to republican freedom (non-domination). I start by investigating three relations between liberal and republican freedom: (i) Logical Equivalence, or the question whether republican freedom entails liberal freedom (and vice versa); (ii) Degree Supervenience, or whether changes in the degree (amount, quantity) of republican freedom are mirrored by changes in the degree of liberal freedom (and vice versa); and (iii) Epistemological Priority, that is, whether knowledge about arrangements of republican freedom presupposes knowledge about arrangements of liberal freedom. If Logical Equivalence holds, liberals are right to claim that republicans have not introduced a new concept of freedom. It is easy to see, though, that Logical Equivalence does not hold. If Degree Supervenience holds, liberals can maintain that while republican freedom is a new concept of freedom, fostering or promoting it is not very different from fostering or promoting liberal freedom. Degree Supervenience does not hold either, though, and as a result two straightforward counterarguments against republican freedom fail. Yet, I argue, first, that the relation of Epistemological Priority holds: knowing something about arrangements of republican freedom presupposes that you know something about arrangements of liberal freedom. Using Epistemological Priority, I show, second, that the benefits claimed for republican freedom over and above liberal freedom (it minimizes the need for strategic deference, it minimizes uncertainty, and it minimizes subordination) can be accounted for in purely liberal terms.
This article provides an outsider perspective on the scientificity of legal studies. First, I argue that the presence of controversies does not mean that legal studies lack the status of a genuine science. Astronomy, mathematics, and economics have their controversies, too. Second, I show that non-empirical, non-normative research is no less scientific than empirical research. This is illustrated by work in mathematical logic. Third, I demonstrate the same claim for non-empirical, normative research. Here the example is research on social contract theories by means of game-theoretic models.
We explore the developmental paradox of false belief understanding. This paradox follows from the claim that young infants already have an understanding of false belief, despite the fact that they consistently fail the elicited-response false belief task. First, we argue that recent proposals to solve this paradox are unsatisfactory because they (i) try to give a full explanation of false belief understanding in terms of a single system, (ii) fail to provide psychological concepts that are sufficiently fine-grained to capture the cognitive requirements for the various manifestations of false belief understanding, and (iii) ignore questions about system interaction. Second, we present a dual-system solution to the developmental paradox of false belief understanding that combines a layered model of perspective taking with an inhibition-selection-representation mechanism that operates on different levels. We discuss recent experimental findings that shed light on the interaction between these two systems, and suggest a number of directions for future research.
Empirical evidence suggests that people often confabulate when they are asked about their choices or reasons for action. The implications of these studies are the topic of intense debate in philosophy and the cognitive sciences. An important question in this debate is whether the confabulation studies pose a serious threat to the possibility of self-knowledge. In this paper we are not primarily interested in the consequences of confabulation for self-knowledge. Instead, we focus on a different issue: what confabulation implies for the special status of self-attributions, i.e. first-person authority (FPA). In the first part of the paper, we propose that FPA is based on a capacity for self-regulation. Accordingly, FPA depends on the extent to which we are able to bridge the gap between our sayings and doings by aligning our actions with our avowed self-ascriptions and vice versa. FPA is withheld when we fail at such re-alignment. In the second part of the paper, we contrast our view with the accounts of Scaife and Bortolotti (2018). We claim that the apparent fact that we cannot reliably distinguish, from a first-person perspective, when we are confabulating and when we are not, does not necessarily undermine FPA. We argue that a systematic failure to align our actions with our self-ascriptions and vice versa is a genuine threat to FPA. In the last part of the paper, we introduce the concept of self-know-how—the know-how embodied in the way one is disposed to relate to oneself in making sense of oneself with or in the face of others—and briefly explore the importance of diminished or absent self-know-how in clinical cases.
4E cognition (embodied, embedded, enactive, and extended) is a relatively young and thriving field of interdisciplinary research. It assumes that cognition is shaped and structured by dynamic interactions between the brain, body, and both the physical and social environments.

With essays from leading scholars and researchers, The Oxford Handbook of 4E Cognition investigates this recent paradigm. It addresses the central issues of embodied cognition by focusing on recent trends, such as Bayesian inference and predictive coding, and presenting new insights, such as the development of false belief understanding.

The Oxford Handbook of 4E Cognition also introduces new theoretical paradigms for understanding emotion and conceptualizing the interactions between cognition, language, and culture. With an entire section dedicated to the application of 4E cognition in disciplines such as psychiatry and robotics, and critical notes aimed at stimulating discussion, this Oxford handbook is the definitive guide to 4E cognition.

Aimed at neuroscientists, psychologists, psychiatrists, and philosophers, The Oxford Handbook of 4E Cognition will be essential reading for anyone with an interest in this young and thriving field.
De Bruin and Gallagher suggest that the view of embodied simulation put forward in our recent article lacks explanatory power. We argue that the notion of reuse of mental states represented with a bodily format provides a convincing simulational account of the mirroring mechanism and its role in mind-reading.
This individual differences study examined the relationships between three executive functions (updating, shifting, and inhibition), measured as latent variables, and performance on two cognitively demanding subtests of the Adult Decision Making Competence battery: Applying Decision Rules and Consistency in Risk Perception. Structural equation modelling showed that executive functions contribute differentially to performance in these two tasks, with Applying Decision Rules being mainly related to inhibition and Consistency in Risk Perception mainly associated with shifting. The results suggest that the successful application of decision rules requires the capacity to selectively focus attention and inhibit irrelevant (or no longer relevant) stimuli. They also suggest that consistency in risk perception depends on the ability to shift between judgement contexts.
In this article, we investigate the merits of an enactive view of cognition for the contemporary debate about social cognition. If enactivism is to be a genuine alternative to classic cognitivism, it should be able to bridge the "cognitive gap", i.e. provide us with a convincing account of those higher forms of cognition that have traditionally been the focus of its cognitivist opponents. We show that, when it comes to social cognition, current articulations of enactivism are—despite their celebrated successes in explaining some cases of social interaction—not yet up to the task. This is because they (1) do not pay sufficient attention to the role of offline processing or "decoupling", and (2) obscure the cognitive gap by overemphasizing the role of phenomenology. We argue that the main challenge for the enactive view will be to acknowledge the importance of both coupled (online) and decoupled (offline) processes for basic and advanced forms of (social) cognition. To meet this challenge, we articulate a dynamic embodied view of cognition. We illustrate the fruitfulness of this approach by recourse to recent findings on false belief understanding. (Leon C. de Bruin and Lena Kästner, Phenomenology and the Cognitive Sciences, pp. 1–23, DOI 10.1007/s11097-011-9223-1.)
It is generally acknowledged that confabulation undermines the authority of self-attribution of mental states. But why? The mainstream answer is that confabulation misrepresents the actual state of one’s mind at some relevant time prior to the confabulatory response. This construal, we argue, rests on an understanding of self-attribution as first-person mindreading. Recent developments in the literature on folk psychology, however, suggest that mental state attribution also plays an important role in regulating or shaping future behaviour in conformity with normative expectations. We explore an analogue understanding of self-attribution of mental states in terms of first-person mindshaping. The main aim of this paper is to explore how this insight alters the implications of empirical confabulation studies on first-person authority. We also indicate how this sheds new light on the phenomenon of confabulation itself.