My paper characterizes religious beliefs in terms of vagueness. I introduce my topic by providing a general overview of my main claims. In the subsequent section, I develop basic distinctions and terminology for handling the notion of religious tradition and capturing vagueness. In the following sections, I make the case for my claim that religious beliefs are vague by developing a general argument from the interconnection between the referential opacity of religious belief content and the long-term communitarian history of the precisification of what such content means. I start by describing an empirical example in the third section, and then I move to settle the matter in a conceptual, argumentative frame in the fourth. My conclusions in the final section address a few consequences relevant to debates about religious epistemology and religious diversity.
The logic of assertive graphs (AGs) is a modification of Peirce's logic of existential graphs (EGs): it is intuitionistic and takes assertions as its explicit object of study. In this paper we extend AGs into a classical graphical logic of assertions (ClAG) whose internal logic is classical. The characteristic feature is that both AGs and ClAG retain deep-inference rules of transformation. Unlike classical EGs, both AGs and ClAG can do so without explicitly introducing polarities of areas in their language. We then compare the advantages of these two graphical approaches to the logic of assertions with reference to a number of topics in the philosophy of logic and to the deep-inferential nature of their proofs.
We discuss the hints for the disappearance of continuum space and time at the microscopic scale. These include arguments for their discrete nature, or for a fundamental non-locality, in a quantum theory of gravity. We discuss how these ideas are realized in specific quantum gravity approaches. Turning the problem around, we then consider the emergence of continuum space and time from the collective behaviour of discrete, pre-geometric atoms of quantum space, suggesting that spacetime can be understood as a kind of "condensate", and we present the case for this emergence process being the result of a phase transition, dubbed "geometrogenesis". We discuss some conceptual issues of this scenario and of the idea of emergent spacetime in general. As a concrete example, we outline the group field theory (GFT) framework for quantum gravity, and illustrate a tentative procedure for the emergence of spacetime in this framework. Last, we re-examine the conceptual issues raised by the emergent spacetime scenario in light of this concrete example.
My paper provides reasons in support of the view that vague identity claims originate from a conflict between rigidity and precision in designation. To put this strictly, let x be the referent of the referential terms P and Q. Then, the vagueness of the proposition "that any x is both a P and a Q" entails that the semantic intuitions at work in P and Q reveal a conflict between P and Q being simultaneously rigid and precise designators. After briefly commenting on an example of a vague identity claim, I make the case for my proposal by discussing how reference by baptism conflicts with descriptive attitudes towards understanding conceptual contents.
This paper presents an enrichment of the Gabbay–Woods schema of Peirce’s 1903 logical form of abduction with illocutionary acts, drawing from logic for pragmatics and its resources to model justified assertions. It analyses the enriched schema and puts it into the perspective of Peirce’s logic and philosophy.
In this paper I shall adopt a possible reading of the notions of 'explanatory indispensability' and 'genuine mathematical explanation in science' on which the Enhanced Indispensability Argument (EIA) proposed by Alan Baker is based. Furthermore, I shall propose two examples of mathematical explanation in science and I shall show that, if the EIA's partisans accept the reading I suggest, they are easily caught in a dilemma. To escape this dilemma they need to adopt some account of explanation and offer a plausible answer to the following 'question of evidence': What is a genuine mathematical explanation in empirical science, and on what basis do we consider it as such? Finally, I shall suggest how a possible answer to the question of evidence might be given through a specific account of mathematical explanation in science. Nevertheless, the price of adopting this standpoint is that the genuineness of mathematical explanations of scientific facts turns out to be dependent on pragmatic constraints and therefore cannot be plugged into the EIA and used to establish existential claims about mathematical objects.
Philosophical analyses of mathematical knowledge are commonly conducted within the realist/antirealist dichotomy. Nevertheless, philosophers working within this dichotomy pay little attention to the way in which mathematics evolves and structures itself. Focusing on mathematical practice, I propose a weak notion of objectivity of mathematical knowledge that preserves the intersubjective character of mathematical knowledge but does not rest on a view of mathematics as a body of mind-independent necessary truths. Furthermore, I show that the successful application of mathematics in science is an important trigger for the objectivity of mathematical knowledge.
In this paper, we present a number of problems for intellectualism about knowledge-how, and in particular for the version of the view developed by Stanley & Williamson 2001. Their argument draws on the alleged uniformity of 'know how'- and 'know wh'-ascriptions. We offer a series of considerations to the effect that this assimilation is problematic. Firstly, in contrast to 'know wh'-ascriptions, 'know how'-ascriptions with known negative answers are false. Secondly, knowledge-how obeys closure principles whose counterparts fail for knowledge-wh and knowledge-that. Thirdly, as opposed to knowledge-wh and knowledge-that, knowledge-how is inferentially isolated from further knowledge-that. We close by providing some evidence against the further reduction of knowledge-wh to knowledge-that, which is presupposed by the intellectualist theory under discussion.
For the first time in print, this article reports passages from John Rawls's graduate papers and annotations on books and manuscripts from his personal library. The analysis of this material shows the historical inaccuracy of the widespread assumption that Rawls's philosophy owes very little to American pragmatism. Peirce's notion of truth, as well as the holistic critique of pragmatism that Morton White began in the late 1940s, prove significant at the very beginning of Rawls's philosophical enterprise. In the light of this material, it might be argued that Rawls's elaboration of 'reflective equilibrium' started at least in part as an attempt to overcome the pending problems of pragmatism.
The paper proposes two logical analyses of (the norms of) justification. In the first, realist-minded case, truth is logically independent from justification; this leads to a pragmatic logic LP including two epistemic and pragmatic operators, namely assertion and hypothesis. In the second, antirealist-minded case, truth is not logically independent from justification; this results in two logical systems of information and justification, AR4 and AR4′, respectively, provided with a question-answer semantics. The latter admits many more epistemic agents, each corresponding to a wide variety of epistemic norms. After comparing the different norms of justification involved in these logical systems, two hexagons expressing Aristotelian relations of opposition will be gathered in order to clarify how (a fragment of) pragmatic formulas can be interpreted in a fuzzy-based question-answer semantics.
The beliefs involved in the placebo effect are often assumed to be self-fulfilling, that is, the truth of these beliefs would merely require the patient to hold them. Such a view is commonly shared in epistemology. Many epistemologists focused, in fact, on the self-fulfilling nature of these beliefs, which have been investigated because they raise some important counterexamples to Nozick's "tracking theory of knowledge." We challenge the self-fulfilling nature of placebo-based beliefs in multi-agent contexts, analyzing their deep epistemological nature and the role of higher-order beliefs involved in the placebo effect.
Chang's MV algebras are the algebras of the infinite-valued sentential calculus of Łukasiewicz. We introduce finitely additive measures (called states) on MV algebras with the intent of capturing the notion of average degree of truth of a proposition. Since Boolean algebras coincide with idempotent MV algebras, states yield a generalization of finitely additive measures. Since MV algebras stand to Boolean algebras as AF C*-algebras stand to commutative AF C*-algebras, states are naturally related to noncommutative C*-algebraic measures.
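The notion of a state sketched above admits a compact formal statement; the following is a sketch of the standard definition, with ⊕ the MV-algebraic sum and ⊙ its dual product:

```latex
% A state on an MV algebra $(A, \oplus, \neg, 0)$ is a map
% $s : A \to [0,1]$ satisfying:
\begin{align*}
  & s(1) = 1, \\
  & s(x \oplus y) = s(x) + s(y) \quad \text{whenever } x \odot y = 0.
\end{align*}
% On a Boolean algebra the condition $x \odot y = 0$ reduces to
% disjointness, so a state is exactly a finitely additive
% probability measure, recovering the classical case.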
Whistleblowing is the public disclosure of information with the purpose of revealing wrongdoings and abuses of power that harm the public interest. This book presents a comprehensive theory of whistleblowing: it defines the concept, reconstructs its origins, discusses it within the current ethical debate, and elaborates a justification of unauthorized disclosures. Its normative proposal is based on three criteria of permissibility: the communicative constraints, the intent, and the public interest conditions. The book distinguishes between two forms of whistleblowing, civic and political, showing how they apply in the contexts of corruption and government secrecy. The book articulates a conception of public interest as a claim concerning the presumptive interest of the public. It argues that the public interest is defined in opposition to corporate powers, and that its core content is identified by the rights that are all-purposive for the distribution of social benefits. A crucial part of the proposal is dedicated to the impact of security policies and government secrecy on civil liberties. It argues that unrestrained secrecy limits the epistemic entitlement of citizens to know under which conditions their rights are limited by security policies and corporate interests. When citizens are denied the right to assess when these policies are prejudicial to their freedoms, whistleblowing represents a legitimate form of political agency that safeguards the fundamental rights of citizens against the threat of unrestrained secrecy by government power. Finally, the book contributes to shifting the attention of democratic theory from the procedures of consent formation to the mechanisms that guarantee the expression of dissent.
It argues that whistleblowing is a distinctive form of civil dissent that contributes to the demands of institutional transparency in constitutional democracies and explores the idea that the way institutions are responsive to dissent determines the robustness of democracy, and ultimately, its legitimacy. What place dissenters have within a society, whether they enjoy personal safety, legal protection, and safe channels for their disclosure, are hallmarks of a good democracy, and of its sense of justice.
This text marks a radical rethinking of the soul and the afterlife in the writings of al-Ghazālī (d. 505/1111), particularly within his magnum opus, "Reviving ...
The Tibetan term jo mo, generally translated as 'noble Lady,' 'female adept,' or 'nun' and documented from the very beginning of Tibetan history, has a mainly religious meaning. Besides various women adepts referred to as jo mo present throughout Tibetan tradition up to the present day, a hagiographic text from the late thirteenth century entitled Jo mo nyis shus rtsa bzhi'i lo rgyus, "The Stories of the Twenty-four Jo mo," has preserved the short life stories of twenty-four female Tibetan adepts of the eleventh and twelfth centuries, disciples of the Indian Tantric master Pha dam pa sangs rgyas. The realizations attained along the Path by the jo mo in question were mainly attested to by relics and other miraculous objects or events witnessed at the time of their deaths. The aim of this paper is to analyze the religious identities of the twenty-four jo mo as described in the JMLG, while exploring some of the ways in which the Tibetan Buddhist tradition has negotiated the ambiguous religious status of these female Buddhist adepts.
Clinical equipoise (CE) has been proposed as an ethical principle relating uncertainty and moral leeway in clinical research. Although CE has traditionally been indicated as a necessary condition for a morally justified introduction of a new randomized controlled trial (RCT), questions related to the interpretation of this principle remain woefully open. Recent proposals to rehabilitate CE have divided the bioethical community on its ethical merits. This paper presents a new argument that brings out the epistemological difficulties we encounter in justifying CE as a principle to connect uncertainty and moral leeway in clinical ethics. The argument proposes, first, that the methodology of hypothetical retrospection (HR) is applicable to the RCT design and that it can accommodate uncertainty. As currently understood, however, HR should give up its reliance on the assumption of uncertainty transduction, because the latter assumes the principle of indifference, which does not accommodate uncertainty in the right way. The same principle is then seen to distort also the received interpretations of CE.
In their most recent book, Evolving Enactivism: Basic Minds Meet Content, Dan Hutto and Eric Myin claim to give a complete and gapless naturalistic account of cognition, but it comes with a kink. The kink is that content-involving cognition has special properties found nowhere else in nature, making it the case that minds capable of contentful thought differ in kind, in this key respect, from more basic minds. Contra Hutto and Myin, I argue that content-involving practices are themselves simply a further extension of action and do not therefore warrant being called 'different in kind' or 'kinky'. With the help of Ludwig Wittgenstein and John V. Canfield, I show that Enactivism meets the challenge of explaining higher-level cognition; and, contra continuity sceptics, offer 'a philosophically cogent and empirically respectable account' of how human minds can emerge from nonhuman minds.
It is commonly hypothesized that scientists are more likely to engage in data falsification and fabrication when they are subject to pressures to publish, when they are not restrained by forms of social control, when they work in countries lacking policies to tackle scientific misconduct, and when they are male. Evidence to test these hypotheses, however, is inconclusive due to the difficulties of obtaining unbiased data. Here we report a pre-registered test of these four hypotheses, conducted on papers that were identified in a previous study as containing problematic image duplications through a systematic screening of the journal PLoS ONE. Image duplications were classified into three categories based on their complexity, with category 1 being most likely to reflect unintentional error and category 3 being most likely to reflect intentional fabrication. We tested multiple parameters connected to the hypotheses above with a matched-control paradigm, by collecting two controls for each paper containing duplications. Category 1 duplications were mostly not associated with any of the parameters tested, as was predicted based on the assumption that these duplications were mostly not due to misconduct. Categories 2 and 3, however, exhibited numerous statistically significant associations. Results of univariable and multivariable analyses support the hypotheses that academic culture, peer control, cash-based publication incentives and national misconduct policies might affect scientific integrity. No clear support was found for the "pressures to publish" hypothesis. Female authors were found to be equally likely to publish duplicated images compared to males. Country-level parameters generally exhibited stronger effects than individual-level parameters, with developing countries being significantly more likely to produce problematic image duplications.
This suggests that promoting good research practices in all countries should be a priority for the international research integrity agenda.
This paper discusses the prospects of a comprehensive philosophical account of promising that relies centrally on the notion of trust. I lay out the core idea behind the Trust View, showing how it convincingly explains the normative contours and the unique value of our promissory practice. I then sketch three distinct options for how the Trust View can explain the normativity of promises: first, an effect-based view; second, a view drawing on a wider norm demanding respect for those whom one has invited to something; and finally, as a new suggestion, a Normative Interest View. This view holds that promising is a normative power that serves our interest in facilitating or enabling the relationship of trust between promisor and promisee. I argue that only those embracing the third view can fully account for the distinctive obligation that results from the giving of a valid promise in all cases.
In this paper, I argue that the vindicatory/unmasking distinction has so far prevented scholars from grasping a third dimension of genealogical inquiry, one I call possibilising. This dimension has passed unnoticed even though it constitutes a crucial aspect of Foucault's genealogical project from 1978 on. By focusing attention on it, I hope to provide a definitive rebuttal of one of the main criticisms that has been raised against genealogy in general, and Foucauldian genealogy in particular, namely the idea that Foucault's genealogical project lacks normative grounding and is therefore ultimately incapable of telling us why we should resist and fight against the mechanisms of power it nevertheless reveals in an empirically insightful way. This conclusion, I argue, is mistaken because it conceives of Foucauldian genealogy exclusively as an unmasking or problematising method, whereas I claim that Foucault's genealogical project possesses a possibilising dimension that provides his work with sui generis normative force.
Expectations play a central role in understanding scientific and technological changes. Future-oriented representations are also central with regard to nanotechnologies, as they can guide policy activities, provide structures and legitimation, attract different interests, focus policy-makers' attention and foster investments for research. However, the emphasis on future scenarios tends to underrate the complexity of the challenges of the present market of nanotechnologies by flattening them under the needs and promises of scientific research. This is particularly apparent if we consider the viewpoint of the regulator, who faces two different ranges of problems with regard to the market and research and is expected, simultaneously, to manage two different types of regulatory instruments while pursuing the same goals. Instead, by favoring only the future scenarios, the regulator runs the risk of abruptly shifting from more flexible and elastic tools to forms of hard legislation, thus wasting the resources of soft regulation, in particular those of self-regulation. Referring primarily to the European context, human rights, also thanks to their normative structure of principles, can help in strengthening both the legislation needed for regulating the present market and the soft instruments needed for steering research and for fostering the stakeholders' participation without sacrificing the coherence of the regulatory response.
Among the various experiments in 'new governance', the model of Responsible Research and Innovation (RRI) is emerging in the European landscape as quite promising. Up to now, there have been two versions of RRI: a socio-empirical version, which tends to underline the role of democratic processes aimed at identifying values on which governance needs to be anchored, and a normative version, which stresses the role of EU goals as 'normative anchor points' of both governance strategies and policy making. Both versions are unsatisfactory. The first, because it suggests a movable anchorage which could clash with pre-fixed values, such as individual rights. The second, because it does not safeguard fundamental rights in the process of balancing 'anchor points'. This result is counterintuitive because it exposes governance to the risk of facing adverse court decisions in the defense of individual rights, thus losing its anticipative attitude. In order to avoid this outcome, the paper argues that it is only through better integration between the system of human rights and that of EU fundamental rights that the anticipative feature of RRI can be preserved.
This paper challenges a standard interpretation according to which Frege's conception of logic (early and late) is at odds with the contemporary one, because on the latter's view logic is formal, while on Frege's view it is not, given that logic's subject matter is reality's most general features. I argue that Frege – in Begriffsschrift – retained the idea that logic is formal; Frege sees logic as providing the 'logical cement' that ties up together the contentful concepts of specific sciences, not the most general truths. Finally, I discuss how Frege conceives of the application of Begriffsschrift, and of its status as a 'lingua characteristica'.
One of the most formidable challenges to the Error Theory is the Normative Objection, according to which the Error Theory ought to be rejected because of its deeply implausible first-order normative implications. Recently, Bart Streumer has offered a novel and powerful defence of the Error Theory against this objection. Streumer argues that the Error Theory's plausibility deficit when viewed against the background of our normative beliefs does not show the theory's falsity. Rather, it can be explained by the fact that this theory, though true, cannot be believed. In this paper, I argue that Streumer's defence does not succeed. I show that, even if we grant Streumer that we cannot believe the Error Theory, we can still formulate what I call the Undermining Normative Objection, an argument that proceeds only from believable premises to a believable conclusion and shows that the arguments supporting the Error Theory cannot all be sound.
Whistleblowing is the act of disclosing information from a public or private organization in order to reveal cases of corruption that are of immediate or potential danger to the public. Blowing the whistle involves personal risk, especially when legal protection is absent, and charges of betrayal, which often come in the form of legal prosecution under treason laws. In this article we argue that whistleblowing is justified when disclosures are made with the proper intent and fulfill specific communicative constraints in addressing issues of public interest. Three communicative constraints of informativeness, truthfulness and evidence are discussed in this regard. We develop a 'harm test' to assess the intent for disclosures, concluding that it is not sufficient for justification. Along with the proper intent, a successful act of whistleblowing should provide information that serves the public interest. Taking cognizance of the varied conceptions of public interest, we present an account of public interest that fits the framework of whistleblowing disclosures. In particular, we argue that whistleblowing is justified inter alia when the information it conveys is of a presumptive interest for a public insofar as it reveals an instance of injustice or violation of a civil or political right done against and unbeknown to some members of a polity.
Graham Priest proposed an argument for the conclusion that 'nothing' occurs as a singular term and not as a quantifier in a sentence like (1) 'The cosmos came into existence out of nothing'. Priest's point is that, intuitively, (1) entails (C) 'The cosmos came into existence at some time', but this entailment relation is left unexplained if 'nothing' is treated as a quantifier. If Priest is right, the paradoxical notion of an object that is nothing plays a role in our very understanding of reality. In this note, we argue that Priest's argument is unsound: the intuitive entailment relation between (1) and (C) does not offer convincing evidence that 'nothing' occurs as a term in (1). Moreover, we provide an explanation of why (1) is naturally taken to entail (C), which is both plausible and consistent with the standard, quantificational treatment of 'nothing'.
The problem of merging several ontologies has important applications in the Semantic Web, medical ontology engineering and other domains where information from several distinct sources needs to be integrated in a coherent manner. We propose to view ontology merging as a problem of social choice, i.e. as a problem of aggregating the input of a set of individuals into an adequate collective decision. That is, we propose to view ontology merging as ontology aggregation. As a first step in this direction, we formulate several desirable properties for ontology aggregators, we identify the incompatibility of some of these properties, and we define and analyse several simple aggregation procedures. Our approach is closely related to work in judgment aggregation, but with the crucial difference that we adopt an open world assumption, by distinguishing between facts not included in an agent's ontology and facts explicitly negated in an agent's ontology.
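The open-world aggregation setting described above can be illustrated with a minimal sketch. The code below is not the authors' procedure; it is a simple quota-based aggregator of the general kind the abstract mentions, with illustrative fact names and thresholds, showing the three-way distinction between asserted, explicitly negated, and absent (unknown) facts:

```python
# A minimal sketch of ontology aggregation under an open-world
# assumption (fact names and quotas are illustrative).
# Each agent's ontology maps a fact to True (asserted) or
# False (explicitly negated); omitted facts are unknown.

def quota_aggregate(ontologies, accept_quota, reject_quota):
    """Accept a fact if at least accept_quota agents assert it;
    negate it if at least reject_quota agents explicitly negate it;
    otherwise leave it out of the collective ontology (unknown)."""
    facts = set().union(*(o.keys() for o in ontologies))
    merged = {}
    for f in facts:
        yes = sum(1 for o in ontologies if o.get(f) is True)
        no = sum(1 for o in ontologies if o.get(f) is False)
        if yes >= accept_quota:
            merged[f] = True
        elif no >= reject_quota:
            merged[f] = False
        # else: the fact stays unknown, which is distinct
        # from being explicitly negated
    return merged

agents = [
    {"Cat subclass-of Mammal": True, "Virus subclass-of Organism": False},
    {"Cat subclass-of Mammal": True},
    {"Cat subclass-of Mammal": True, "Virus subclass-of Organism": False},
]
merged = quota_aggregate(agents, accept_quota=2, reject_quota=2)
```

Note that quota rules of this simple kind need not preserve logical closure of the merged ontology, which is one way the property incompatibilities mentioned in the abstract can arise.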
This article shows that there is a variety of paths that could lead to more democratic global governance, and that there is a diversity of political, economic and social agents that have an interest in the pursuit of cosmopolitan democracy.
In this paper, I argue that the appropriate answer to the question of the form contemporary neoliberalism gives our lives rests on Michel Foucault's definition of neoliberalism as a particular art of governing human beings. I claim that Foucault's definition consists of three components: neoliberalism as a set of technologies structuring the 'milieu' of individuals in order to obtain specific effects from their behavior; neoliberalism as a governmental rationality transforming individual freedom into the very instrument through which individuals are directed; and neoliberalism as a set of political strategies that constitute a specific, and eminently governable, form of subjectivity. I conclude by emphasising the importance that Foucault's work on neoliberalism, as well as the ancient 'ethics of the care of the self', still holds for us today.
This book also provides new and illuminating accounts of difficult concepts, such as patterns of life, experiencing meaning, meaning blindness, lying and ...
This paper explores the intertwining of uncertainty and values. We consider an important but underexplored field of fundamental uncertainty and values in decision-making. Some proposed methodologies to deal with fundamental uncertainty have included potential surprise theory, scenario planning and hypothetical retrospection. We focus on the principle of uncertainty transduction in hypothetical retrospection as an illustrative case of how values interact with fundamental uncertainty. We show that while uncertainty transduction appears intuitive in decision contexts, it nevertheless fails in an important range of strategic game-theoretic cases. The methodological reasons behind the failure are then examined.
This two-part paper reviews a scholarly debate on an alleged tension in Frege's philosophy of logic. In Section 1 of Part I, I discuss Frege's view that logic is concerned with establishing norms for correct thinking and is therefore a normative science. In Section 2, I explore a different understanding of the role of logic that Frege seems to advance: logic is constitutive of the very possibility of thought, because it sets forth necessary conditions for thought. Hence the tension: the view according to which logic is normative for thought seems to be incompatible with the idea that abiding by the laws of logic forms a precondition for thought. In Section 1 of Part II, I survey a number of interpretations of Frege's conception of logic that deal with this question. I show that they are for the most part either normative readings or constitutive readings. Finally, in Section 2, I adjudicate the debate and aim at reconciling the normative and the constitutive strands in Frege's conception of logic.
In this paper, I investigate the relationship between preference and judgment aggregation, using the notion of ranking judgment introduced in List and Pettit. Ranking judgments were introduced in order to state the logical connections between the impossibility theorem of aggregating sets of judgments and Arrow’s theorem. I present a proof of the theorem concerning ranking judgments as a corollary of Arrow’s theorem, extending the translation between preferences and judgments defined in List and Pettit to the conditions on the aggregation procedure.
This radical reading of Wittgenstein's third and last masterpiece, On Certainty, has major implications for philosophy. It elucidates Wittgenstein's ultimate thoughts on the nature of our basic beliefs and his demystification of scepticism. Our basic certainties are shown to be nonepistemic, nonpropositional attitudes that, as such, have no verbal occurrence but manifest themselves exclusively in our actions. This fundamental certainty is a belief-in, a primitive confidence or ur-trust whose practical nature bridges the hitherto unresolved categorial gap between belief and action.
In his "Découverte d'un nouveau principe de mécanique" Euler offered, for the first time, a proof of the so-called Euler's Theorem. In this paper I will focus on Euler's original proof and I will show how a look at Euler's practice as a mathematician can inform the philosophical debate about the notion of explanatory proofs in mathematics. In particular, I will show how one of the major accounts of mathematical explanation, the one proposed by Mark Steiner in his paper "Mathematical explanation", is not able to account for the explanatory character of Euler's proof. This contradicts the original intuitions of Euler himself, who attributed to his proof a particular explanatory character.
We argue that a cognitive semantics has to take into account the possibly partial information that a cognitive agent has of the world. After discussing Gärdenfors's view of objects in conceptual spaces, we offer a number of viable treatments of partiality of information and we formalize them by means of alternative predicative logics. Our analysis shows that understanding the nature of simple predicative sentences is crucial for a cognitive semantics.
The Red Sea is characterized by thick salt sequences that form a seal for potential hydrocarbon accumulations within Tertiary formations deposited over deep basement structures. The Red Sea "salt" is characterized by halite concentrations embedded in layered evaporite sequences composed of evaporite and clastic lithologies. Salt complicates seismic exploration efforts in the Red Sea by generating vertical and lateral velocity variations that are difficult to estimate by seismic methods alone. Under these conditions, the exploration challenges of independently imaging the subsalt section and providing enhanced velocity-model-building capabilities were addressed by a multigeophysics strategy involving marine electromagnetic and gravity gradiometry surveys colocated with wide-azimuth seismic. Three-dimensional inversion of magnetotelluric (MT) and controlled-source electromagnetic (CSEM) data is performed first with minimal a priori constraints and then by including variable amounts of interpretation in the starting models. The internal variations in the evaporitic overburden, the subsalt, and the basement structures are independently imaged by the combined electromagnetic methods and confirmed by new drilling results. CSEM, in particular, provides unprecedented detail of the internal structures within the salt overburden, while magnetotellurics provides excellent reconstruction of the base of salt and basement. Gravity gradiometry shows primary sensitivity to the basement, and the corresponding 3D inversion provides density distributions structurally consistent with the resistivity volumes. The common-structure, multiparameter models obtained from 3D inversion give seismic interpreters additional aid to further derisk exploration in the Red Sea and add detail to depth-imaging velocity models. The reciprocal consistency of the results shows promise for extending the work toward more analytical integration with seismic, such as joint geophysical inversion.
Victoria S. Harrison’s theory of internal pluralism approaches religious beliefs in terms of conceptual schemes. To her, this approach has the advantage of preserving core pluralist intuitions without being challenged by the usual difficulties. My claim is that this is not the case. After providing a succinct presentation of internal pluralism, I show that the critique of traditional pluralist views such as Hick’s may also be addressed to Harrison. There are two main reasons in support of my claim. Firstly, a believer’s common understanding of religious experiences conflicts with the way in which internal pluralism understands religious belief. Such conflict implies that if internal pluralism were a sound theory, most religious beliefs would turn out to be false, and, contrary to Harrison’s intention, they would be rendered cognitively irrelevant. Secondly, internal pluralism excludes the possibility of religious disagreements. By applying to religions an epistemological approach based on conceptual schemes, doxastic dissent is dismantled at the cost of an entirely solipsistic reading of religious beliefs. In the final section of my paper, I show that such unattractive features are consequences of the notion of conceptual scheme itself.
This paper surveys the impact on neuropsychology of Wittgenstein's elucidations of memory. Wittgenstein discredited the storage and imprint models of memory, dissolved the conceptual link between memory and mental images or representations and, by upholding the context-sensitivity of memory, made room for a family resemblance concept of memory, on which remembering can also amount to doing or saying something. While neuropsychology is still generally under the spell of archival and physiological notions of memory, Wittgenstein's reconceptions can be seen at work in its leading-edge practitioners. However, neuroscientists generally find memory difficult to demarcate from other cognitive and noncognitive processes, and I suggest this is largely due to their counting automatic responses as part of memory, under the label of nondeclarative or implicit memory. Taking my lead from Wittgenstein's On Certainty, I argue that there is only remembering where there is also some kind of mnemonic effort or attention, and, therefore, that so-called implicit memory is not memory at all, but a basic, noncognitive certainty.