Background: To probe the functional role of brain oscillations, transcranial alternating current stimulation (tACS) has proven to be a useful neuroscientific tool. Because of the excessive tACS-caused artifact at the stimulation frequency in electroencephalography (EEG) signals, tACS + EEG studies have mostly been limited to comparing brain activity between recordings made before and after concurrent tACS. Critically, attempts to suppress the artifact in the data cannot ensure that the entire artifact is removed while brain activity is preserved. The current study aims to evaluate the feasibility of specific artifact correction techniques for cleaning tACS-contaminated EEG data.
New Method: In the first experiment, we used a phantom head to have full control over the signal to be analyzed. Driving pre-recorded human brain-oscillation signals through a dipolar current source within the phantom, we simultaneously applied tACS and compared the performance of different artifact-correction techniques: sine subtraction, template subtraction, and signal-space projection (SSP). In the second experiment, we combined tACS and EEG on one human subject to demonstrate the best-performing data-correction approach in a proof of principle.
Results: The tACS artifact was highly attenuated by SSP in both the phantom and the human EEG; thus, we were able to recover the amplitude and phase of the oscillatory activity. In the human experiment, event-related desynchronization could be restored after correcting the artifact.
Comparison With Existing Methods: The best results were achieved with SSP, which outperformed sine subtraction and template subtraction.
Conclusion: Our results demonstrate the feasibility of SSP by applying it to a phantom measurement with a pre-recorded signal and one human tACS + EEG dataset. For a full validation of SSP, more data are needed.
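The signal-space projection idea behind the abstract above can be sketched in a few lines. The sketch assumes that the tACS artifact dominates the recording with a fixed spatial topography across channels, so the leading principal component of the data approximates the artifact subspace; projecting it out attenuates the artifact while largely preserving brain signals. This is an illustrative toy, not the authors' actual pipeline: the channel count, frequencies, and amplitudes below are invented for the demonstration.

```python
import numpy as np

def ssp_clean(data, n_components=1):
    """Attenuate a spatially stationary artifact via signal-space projection.

    data: (n_channels, n_samples) array dominated by the artifact.
    n_components: number of spatial components to project out.
    """
    # With a strong tACS artifact, the leading left singular vectors of the
    # data matrix capture the artifact's spatial topography.
    U, _, _ = np.linalg.svd(data, full_matrices=False)
    A = U[:, :n_components]                  # estimated artifact subspace
    P = np.eye(data.shape[0]) - A @ A.T      # projector onto its complement
    return P @ data

# Toy data: a 10 Hz "brain" oscillation plus a 100x larger 11 Hz "tACS"
# artifact, each with a fixed random topography over 8 channels.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 500.0)               # 2 s at 500 Hz
brain_topo = rng.normal(size=8)
artifact_topo = rng.normal(size=8)
brain = np.outer(brain_topo, np.sin(2 * np.pi * 10 * t))
artifact = 100.0 * np.outer(artifact_topo, np.sin(2 * np.pi * 11 * t))
contaminated = brain + artifact

cleaned = ssp_clean(contaminated, n_components=1)
```

In this toy setting, projecting out a single spatial component reduces the overall signal norm by roughly the ratio of artifact to brain amplitude, at the cost of removing whatever fraction of the brain signal happens to lie along the artifact topography, which mirrors the trade-off the abstract describes.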
A type of transcendental argument for libertarian free will maintains that if acting freely requires the availability of alternative possibilities, and determinism holds, then one is not justified in asserting that there is no free will. More precisely: if an agent A is to be justified in asserting a proposition P (e.g. "there is no free will"), then A must also be able to assert not-P. Thus, if A is unable to assert not-P, due to determinism, then A is not justified in asserting P. While such arguments often appeal to principles with wide appeal, such as the principle that ‘ought’ implies ‘can’, they also require a commitment to principles that seem far less compelling, e.g. the principle that ‘ought’ implies ‘able not to’ or the principle that having an obligation entails being responsible. It is argued here that these further principles are dubious, and that it will be difficult to construct a valid transcendental argument without them.
Ontological dependence is a relation—or, more accurately, a family of relations—between entities or beings. For there are various ways in which one being may be said to depend upon one or more other beings, in a sense of “depend” that is distinctly metaphysical in character and that may be contrasted, thus, with various causal senses of this word. More specifically, a being may be said to depend, in such a sense, upon one or more other beings for its existence or for its identity. Some varieties of ontological dependence may be analyzed in modal terms—that is, in terms of distinctly metaphysical notions of possibility and necessity—while others seem to demand an analysis in terms of the notion of essence. The latter varieties of ontological dependence may accordingly be called species of essential dependence. Notions of ontological dependence are frequently called upon by metaphysicians in their proposed analyses of other metaphysically important notions, such as the notion of substance.
How do we come to know metaphysical truths? How does metaphysical inquiry work? Are metaphysical debates substantial? These are the questions which characterize metametaphysics. This book, the first systematic student introduction dedicated to metametaphysics, discusses the nature of metaphysics - its methodology, epistemology, ontology and our access to metaphysical knowledge. It provides students with a firm grounding in the basics of metametaphysics, covering a broad range of topics in metaontology such as existence, quantification, ontological commitment and ontological realism. Contemporary views are discussed along with those of Quine, Carnap and Meinong. Going beyond the metaontological debate, thorough treatment is given to novel topics in metametaphysics, including grounding, ontological dependence, fundamentality, modal epistemology, intuitions, thought experiments and the relationship between metaphysics and science. The book will be an essential resource for those studying advanced metaphysics, philosophical methodology, metametaphysics, epistemology and the philosophy of science.
Biochemical kinds such as proteins pose interesting problems for philosophers of science, as they can be studied from the points of view of both biology and chemistry. The key question is the relationship between the biological functions of biochemical kinds and the microstructures to which they are related. This leads us to a more general discussion about ontological reductionism, microstructuralism, and multiple realization at the biology-chemistry interface. On the face of it, biochemical kinds seem to pose a challenge for ontological reductionism and hence motivate a dual theory of chemical and biological kinds, a type of pluralism about natural kinds. But it will be argued that the challenge, which is based on multiple realization, can be addressed. The upshot is that there are reasonable prospects for ontological reductionism about biochemical kinds, which corroborates natural kind monism.
Recent work on Natural Kind Essentialism has taken a deflationary turn. The assumptions about the grounds of essentialist truths concerning natural kinds familiar from the Kripke-Putnam framework are now considered questionable. The source of the problem, however, has not been sufficiently explicated. The paper focuses on the Twin Earth scenario, and it will be demonstrated that the essentialist principle at its core (which I call IDENT)—that necessarily, a sample of a chemical substance, A, is of the same kind as another sample, B, if and only if A and B have the same microstructure—must be re-evaluated. The Twin Earth scenario also assumes the falsity of another essentialist principle (which I call INST): necessarily, there is a 1:1 correlation between (all of) the chemical properties of a chemical substance and the microstructure of that substance. This assumption will be questioned, and it will be argued that, in fact, the best strategy for defending IDENT is to establish INST. The prospects for Natural Kind Essentialism and microstructural essentialism regarding chemical substances will be assessed with reference to recent work in the philosophy of chemistry. Finally, a weakened form of INST will be presented.
Some recent arguments defending the genuine causal efficacy of the mental rely on empirical research on neuroprosthetics. This essay presents a critical analysis of these arguments. The problem of mental causation and the basic idea and results of neuroprosthetics are reviewed. It is shown how appealing to the research on neuroprosthetics can be interpreted as supporting the idea of mental causation. However, it does so only in a rather deflationary sense: by holding the mental identical with the neural. So contrary to what the arguments have assumed, neuroprosthetics cannot be used to argue for nonreductive physicalism. It can rather be taken to illustrate just the opposite: how the mental and the physical are identical.
The epistemology of essence is a topic that has received relatively little attention, although there are signs that this is changing. The lack of literature engaging directly with the topic is probably partly due to the mystery surrounding the notion of essence itself, and partly due to the sheer difficulty of developing a plausible epistemology. The need for such an account is clear especially for those, like E.J. Lowe, who are committed to a broadly Aristotelian conception of essence, whereby essence plays an important theoretical role. In this chapter, our epistemic access to essence is examined in terms of the a posteriori vs. a priori distinction. The two main accounts to be contrasted are those of David S. Oderberg and E.J. Lowe.
In formal ontology, infinite regresses are generally considered a bad sign. One debate where such regresses come into play is the debate about fundamentality. Arguments in favour of some type of fundamentalism are many, but they generally share the idea that infinite chains of ontological dependence must be ruled out. Some motivations for this view are assessed in this article, with the conclusion that such infinite chains may not always be vicious. Indeed, there may even be room for a type of fundamentalism combined with infinite descent as long as this descent is “boring,” that is, the same structure repeats ad infinitum. A start is made in the article towards a systematic account of this type of infinite descent. The philosophical prospects and scientific tenability of the account are briefly evaluated using an example from physics.
One of the main lines of response to the infamous causal exclusion problem has been based on the counterfactual account of causation. However, arguments have begun to surface to the effect that the counterfactual theory is in fact ill-equipped to solve the exclusion problem due to its commitment to downward causation. This argumentation is here critically analysed. An analysis of counterfactual dependence is presented, and it is shown that if the semantics of counterfactuals is taken into account carefully enough, the counterfactual notion of causation need not be committed to downward causation. However, it is a further question whether this is eventually enough to solve the exclusion problem, for the analysis shows how the problem itself can take various different forms.
It is argued that if we take grounding to be univocal, then there is a serious tension between truth-grounding and one commonly assumed structural principle for grounding, namely transitivity. The primary claim of the article is that truth-grounding cannot be transitive. Accordingly, it is either the case that grounding is not transitive or that truth-grounding is not grounding, or both.
The mental realm seems different to the physical realm; the mental is thought to be dependent on, yet distinct from, the physical. But how, exactly, are the two realms supposed to be different, and what, exactly, creates the seemingly insurmountable juxtaposition between the mental and the physical? This review identifies and discusses five marks of the mental, features that set characteristically mental phenomena apart from characteristically physical phenomena. These five marks (intentionality, consciousness, free will, teleology, and normativity) are not presented as a set of features that define mentality. Rather, each of them is something we seem to associate with phenomena we consider mental, and each of them seems to be in tension with the physical view of reality in its own particular way. It is thus suggested that there is no single mind-body problem, but a set of distinct but interconnected problems. Each of these separate problems is analyzed, and their differences, similarities and connections are identified. This provides a useful basis for future theoretical work in psychology and philosophy of mind, which until now has too often suffered from unclarities, inadequacies, and conflations.
In this chapter, a generic definition of fundamentality as an ontological minimality thesis is sought and its applicability examined. Most discussions of fundamentality are focused on a mereological understanding of the hierarchical structure of reality, which may be combined with an atomistic, object-oriented metaphysics. But recent work in structuralism, for instance, calls for an alternative understanding and it is not immediately clear that the conception of fundamentality at work in structuralism is commensurable with the mereological conception. However, it is proposed that once we understand fundamentality as an ontological minimality thesis, these two as well as further conceptions of fundamentality can all be treated on a par, including metaphysical infinitism of the ‘boring’ type, where the same structure repeats infinitely.
The thesis of multiple realisation that Borsboom et al. are relying on should not be taken for granted. In dissolving the apparent multiple realisation, the reductionist research strategies in psychopathology research (the Research Domain Criteria [RDoC] framework, in particular) are bound to lead to eliminativism rather than reductionism. Therefore, Borsboom et al. seem to be aiming at a wrong target.
Aristotelian metaphysics is currently undergoing something of a renaissance. This volume brings together fourteen essays from leading philosophers who are sympathetic to this conception of metaphysics, which takes its cue from the idea that metaphysics is the first philosophy. The primary input from Aristotle is methodological, but many themes familiar from his metaphysics will be discussed, including ontological categories, the role and interpretation of the existential quantifier, essence, substance, natural kinds, powers, potential, and the development of life. The volume mounts a strong challenge to the type of ontological deflationism which has recently gained a strong foothold in analytic metaphysics. It will be a useful resource for scholars and advanced students who are interested in the foundations and development of philosophy.
When I say that my conception of metaphysics is Aristotelian, or neo-Aristotelian, this may have more to do with Aristotle’s philosophical methodology than his metaphysics, but, as I see it, the core of this Aristotelian conception of metaphysics is the idea that metaphysics is the first philosophy. In what follows I will attempt to clarify what this conception of metaphysics amounts to in the context of some recent discussion on the methodology of metaphysics (e.g. Chalmers et al. (2009), Ladyman and Ross (2007)). There is a lot of hostility towards the Aristotelian conception of metaphysics in this literature: for instance, the majority of the contributors to the Metametaphysics volume assume a rather more deflationary, Quinean approach towards metaphysics. In the process of replying to the criticisms of Aristotelian metaphysics put forward in recent literature I will also identify some methodological points which deserve more attention and ought to be addressed in future research.
A critical analysis of recent interventionist responses to the causal exclusion problem is presented. It is argued that the response can indeed offer a solution to the problem, but one that is based on renouncing the multiple realizability thesis. The account amounts to the rejection of nonreductive physicalism and would thus be unacceptable to many. It is further shown that if the multiple realizability thesis is brought back in and conjoined with the interventionist notion of causation, inter-level causation is ruled out altogether.
This article reviews the causal exclusion argument and confronts it with some recently proposed refutations based on the interventionist account of causation. I first show that there are several technical and interpretative difficulties in applying the interventionist account to the exclusion issue. Different ways of accommodating the two to one another are considered and all are shown to leave the issue without a fully satisfactory resolution. Lastly, I argue that, on the most consistent construal, the interventionist approach can provide grounds for thinking that higher-level causal notions are as legitimate as lower-level causal notions, but it does not provide grounds for postulating inter-level causal interactions.
The goals of this paper are two-fold: I wish to clarify the Aristotelian conception of the law of non-contradiction as a metaphysical rather than a semantic or logical principle, and to defend the truth of the principle in this sense. First I will explain what it in fact means that the law of non-contradiction is a metaphysical principle. The core idea is that the law of non-contradiction is a general principle derived from how things are in the world. For example, there are certain constraints as to what kind of properties an object can have, and especially: some of these properties are mutually exclusive. Given this characterisation, I will advance to examine what kind of challenges the law of non-contradiction faces; the main opponent here is Graham Priest. I will consider these challenges and conclude that they do not threaten the truth of the law of non-contradiction understood as a metaphysical principle.
Two strains of interventionist responses to the causal exclusion argument are reviewed and critically assessed. On the one hand, one can argue that manipulating supervenient mental states is an effective strategy for manipulating the subvenient physical states, and hence that mental states should count as genuine causes of the subvenient physical states. But unless the supervenient and subvenient states manifest some difference in their manipulability conditions, there is no reason to treat them as distinct, which in turn goes against the basic assumption of nonreductive physicalism. On the other hand, one can preserve the distinction between the two by introducing the asymmetric manipulability conditions that the supervenience thesis entails. But this response can be used to argue that mental causes never have physical effects. However, this argumentation can also be used to show that mental causes can have mental effects.
Communication is sharing and conveying information. In visual communication, especially, visual messages have to be formulated and interpreted. The interpretation is relative to a method of information presentation, which is a human construction. This holds also in the case of visual languages. The notions of syntax and semantics for visual languages are not as well founded as they are for natural languages. Visual languages are both syntactically and semantically dense. The density is connected to the compositionality of the languages. In the paper, Charles Sanders Peirce’s theory of signs is used in characterizing visual languages. This allows us to relate visual languages to natural languages. The foundation of information presentation methods for visual languages is the logic of perception, but only if perception is understood as propositional perception. This allows us to better understand the relativity of information presentation methods, and hence to evaluate the cultural relativity of visual communication.
In this chapter, it is suggested that our epistemic access to metaphysical modality generally involves rationalist, a priori elements. However, these a priori elements are much more subtle than ‘traditional’ modal rationalism assumes. In fact, some might even question the ‘apriority’ of these elements, but I should stress that I consider a priori and a posteriori elements especially in our modal inquiry to be so deeply intertwined that it is not easy to tell them apart. Supposed metaphysically necessary identity statements involving natural kind terms are a good example: the fact that empirical input is crucial in establishing their necessity has clouded the role and content of the a priori input, as I have previously argued. For instance, the supposed metaphysically necessary identity statement involving water and its microstructure can only be established with the help of a controversial a priori principle concerning the determination of chemical properties by microstructure. The Kripke-Putnam framework of modal epistemology fails precisely because it is unclear whether the required a priori element is present. My positive proposal builds on E. J. Lowe’s work. Lowe holds that our knowledge of metaphysical modality is based on our knowledge of essence. Lowe’s account strives to offer a uniform picture of modal epistemology: essence is the basis of all our modal knowledge. This is the basis of Lowe’s modal rationalism. I believe that Lowe’s proposal is on the right lines in the case of abstract objects, but I doubt that it can be successfully applied to the case of natural kinds. Accordingly, the case of natural kinds will be my main focus and I will suggest that modal rationalism, at least as it is traditionally understood, falls short of explaining modal knowledge concerning natural kinds. Yet, I think that Lowe has identified something of crucial importance for modal epistemology, namely the essentialist, a priori elements present in our modal inquiry. The upshot is that rather than moving all the way from modal rationalism to modal empiricism, a type of hybrid approach, ‘empirically-informed modal rationalism’, can be developed.
What is our epistemic access to metaphysical modality? Timothy Williamson suggests that the epistemology of counterfactuals will provide the answer. This paper challenges Williamson's account and argues that certain elements of the epistemology of counterfactuals that he discusses, namely so-called background knowledge and constitutive facts, are already saturated with modal content which his account fails to explain. Williamson's account will first be outlined and the role of background knowledge and constitutive facts analysed. Their key role is to restrict our imagination to rule out irrelevant counterfactual suppositions. However, background knowledge turns out to be problematic in cases where we are dealing with metaphysically possible counterfactual suppositions that violate the actual laws of physics. As we will see, unless Williamson assumes that background knowledge corresponds with the actual, true laws of physics and that these laws are metaphysically necessary, it will be difficult to address this problem. Furthermore, Williamson's account fails to accommodate the distinction between conceivable yet metaphysically impossible scenarios, and conceivable and metaphysically possible scenarios. This is because background knowledge and constitutive facts are based strictly on our knowledge of the actual world. Williamson does attempt to address this concern with regard to metaphysical necessities – as they hold across all possible worlds – but we will see that even in this case the explanation is questionable. These problems, it will be suggested, cannot be addressed in a counterfactual account of the epistemology of modality. The paper finishes with an analysis of Williamson's possible rejoinders and some discussion about the prospects of an alternative account of modal epistemology.
Three popular views regarding the modal status of the laws of nature are discussed: Humean Supervenience, nomic necessitation, and scientific/dispositional essentialism. These views are examined especially with regard to their take on the apparent modal force of laws and their ability to explain that modal force. It will be suggested that none of the three views, at least in their strongest form, can be maintained if some laws are metaphysically necessary, but others are metaphysically contingent. Some reasons for thinking that such variation in the modal status of laws exists will be presented with reference to physics. This drives us towards a fourth, hybrid view, according to which there are both necessary and contingent laws. The prospects for such a view are studied.
This paper defends the idea that there must be some joints in reality, some correct way to classify or categorize it. This may seem obvious, but we will see that there are at least three conventionalist arguments against this idea, as well as philosophers who have found them convincing. The thrust of these arguments is that the manner in which we structure, divide or carve up the world is not grounded in any natural, genuine boundaries in the world. Ultimately they are supposed to pose a serious threat to realism. The first argument that will be examined concerns the claim that there are no natural boundaries in reality, the second one focuses on the basis of our classificatory schemes, which the conventionalist claims to be merely psychological, and the third considers the significance of our particular features in carving up the world, such as physical size and perceptual capabilities. The aim of this paper is to demonstrate that none of these objections succeed in undermining the existence of genuine joints in reality.
A thoroughly physical view of reality and our common sense view of agency and free will seem to be in direct conflict with each other: if everything that happens is determined by prior physical events, so too are all our actions and conscious decisions; you have no choice but to do what you are destined to do. Although this way of thinking has intuitive appeal, and a long history, it has recently begun to gain critical attention. A number of arguments have been raised in defense of the idea that our will could be genuinely free even if the universe is governed by deterministic laws of physics. Determinism and free will have been argued to be compatible before, of course, but these recent arguments seem to take a new step in that they rely on a more profound and concrete view of the central elements of the issue, the fundamental laws of physics and the nature of causal explanation in particular. The basic idea of this approach is reviewed here, and it is shown how a careful analysis of physics and causal explanation can indeed enhance our understanding of the issue. Although it cannot be concluded that the problem of free will is now completely solved (or dissolved), it is clear that these recent developments can bring significant advancement to the debate.
The principle of causal exclusion is based on two distinct causal notions: causal sufficiency and causation simpliciter. The principle suggests that the former has the power to exclude the latter. But that is problematic since it would amount to claiming that sufficient causes alone can take the roles of causes simpliciter. Moreover, the principle also assumes that events can sometimes have both sufficient causes and causes simpliciter. This assumption is in conflict with the first part of the principle that claims that sufficient causes must exclude causes simpliciter.
The priority monist holds that the cosmos is the only fundamental object, of which every other concrete object is a dependent part. One major argument against monism goes back to Russell, who claimed that pluralism is favoured by common sense. However, Jonathan Schaffer turns this argument on its head and uses it to defend priority monism. He suggests that common sense holds that the cosmos is a whole, of which ordinary physical objects are arbitrary portions, and that arbitrary portions depend for their existence on the existence of the whole. In this paper, we challenge Schaffer’s claim that the parts of the cosmos are all arbitrary portions. We suggest that there is a way of carving up the universe such that at least some of its parts are not arbitrary. We offer two arguments in support of this claim. First, we shall outline semantic reasons in its favour: in order to accept that empirical judgements are made true or false by the way the world is, one must accept that the cosmos includes parts whose existence is not arbitrary. Second, we offer an ontological argument: in order for macro-physical phenomena to exist, there must be some micro-physical order which they depend upon, and this order must itself be non-arbitrary. We conclude that Schaffer’s common sense argument for monism cannot be made to work.
The distinction between a priori and a posteriori knowledge has been the subject of an enormous amount of discussion, but the literature is biased against recognizing the intimate relationship between these forms of knowledge. For instance, it seems to be almost impossible to find a sample of pure a priori or a posteriori knowledge. In this paper, it will be suggested that distinguishing between a priori and a posteriori is more problematic than is often suggested, and that a priori and a posteriori resources are in fact used in parallel. We will define this relationship between a priori and a posteriori knowledge as the bootstrapping relationship. As we will see, this relationship gives us reason to seek an altogether novel definition of a priori and a posteriori knowledge. Specifically, we will have to analyse the relationship between a priori knowledge and a priori reasoning, and it will be suggested that the latter serves as a more promising starting point for the analysis of aprioricity. We will also analyse a number of examples from the natural sciences and consider the role of a priori reasoning in these examples. The focus of this paper is the analysis of the concepts of a priori and a posteriori knowledge rather than the epistemic domain of a posteriori and a priori justification.
The project of treating knowledge as an empirical object of study has gained popularity in recent naturalistic epistemology. It is argued here that the assumption that such an object of study exists is in tension with other central elements of naturalistic philosophy. Two hypotheses are considered. In the first, “knowledge” is hypothesized to refer to mental states causally responsible for the behaviour of cognitive agents. Here, the relational character of truth creates a problem. In the second hypothesis, “knowledge” is hypothesized to refer to mental states causally responsible for the evolutionarily successful behaviour of cognitive agents. Here, the problem lies in the fact that evolution by natural selection is not necessarily conducive to truth. The result does not necessarily amount to eliminativism, however, since the naturalist may consistently reject the condition of truth that lies behind these problems.
An overview of the computational prediction of emotional responses to music is presented. The communication of emotions by music has received a great deal of attention in recent years, and a large number of empirical studies have described the role of individual features (tempo, mode, articulation, timbre) in predicting the emotions suggested or invoked by music. However, unlike the present work, relatively few studies have attempted to model continua of expressed emotions using a variety of musical features from audio-based representations in a correlational design. The construction of the computational model is divided into four separate phases, each with a different focus for evaluation. These phases include the theoretical selection of relevant features, empirical assessment of feature validity, actual feature selection, and overall evaluation of the model. Existing research on music and emotions and on the extraction of musical features is reviewed in terms of these criteria. Examples drawn from recent studies of emotions within the context of film soundtracks are used to demonstrate each phase in the construction of the model. These models are able to explain the dominant part of listeners’ self-reports of the emotions expressed by music, and they show potential to generalize over different genres within Western music. Possible applications of the computational models of emotions are discussed.
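The correlational design described in the abstract above — mapping a small set of extracted musical features onto continuous emotion ratings with a linear model and evaluating explained variance — can be illustrated schematically. This is a minimal sketch on invented data, not the authors' actual model: the three feature names and their weights are hypothetical placeholders standing in for features extracted from audio.

```python
import numpy as np

# Phases 1-3 (feature selection, validation, extraction) are collapsed:
# assume three illustrative, z-scored features per musical excerpt.
rng = np.random.default_rng(42)
n_excerpts = 200
features = rng.normal(size=(n_excerpts, 3))   # e.g. tempo, mode, brightness
true_weights = np.array([0.6, 0.8, -0.3])     # hypothetical loadings
ratings = features @ true_weights + rng.normal(scale=0.5, size=n_excerpts)

# Phase 4 (evaluation): ordinary least squares with an intercept, then
# the proportion of rating variance the linear model explains.
X = np.column_stack([np.ones(n_excerpts), features])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
predicted = X @ coef
ss_res = np.sum((ratings - predicted) ** 2)
ss_tot = np.sum((ratings - ratings.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.2f}")
```

With this synthetic setup the linear model recovers most of the rating variance, mirroring the abstract's claim that feature-based models can explain the dominant part of listeners' self-reports; real audio features would of course be noisier and partially collinear.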
The notion of fundamentality, as it is used in metaphysics, aims to capture the idea that there is something basic or primitive in the world. This metaphysical notion is related to the vernacular use of “fundamental”, but philosophers have also put forward various technical definitions of the notion. Among the most influential of these is the definition of absolute fundamentality in terms of ontological independence or ungroundedness. Accordingly, the notion of fundamentality is often associated with these two other technical notions.
The starting point of this paper concerns the apparent difference between what we might call absolute truth and truth in a model, following Donald Davidson. The notion of absolute truth is the one familiar from Tarski’s T-schema: ‘Snow is white’ is true if and only if snow is white. Instead of being a property of sentences, as absolute truth appears to be, truth in a model, that is, relative truth, is evaluated in terms of the relation between sentences and models. I wish to examine the apparent dual nature of logical truth (without dwelling on Davidson), and suggest that we are dealing with a distinction between a metaphysical and a linguistic interpretation of truth. I take my cue from John Etchemendy, who suggests that absolute truth could be considered as equivalent to truth in the ‘right model’, i.e., the model that corresponds with the world. However, the notion of ‘model’ is not entirely appropriate here, as it is closely associated with relative truth. Instead, I propose that the metaphysical interpretation of truth may be illustrated in modal terms, by metaphysical modality in particular. One of the tasks that I undertake in this paper is to develop this modal interpretation, partly building on my previous work on the metaphysical interpretation of the law of non-contradiction (Tahko 2009). After an explication of the metaphysical interpretation of logical truth, a brief study follows of how this interpretation connects with some recent important themes in philosophical logic. In particular, I discuss logical pluralism and propose an understanding of pluralism from the point of view of the metaphysical interpretation.
According to one traditional view, empirical science is necessarily preceded by philosophical analysis. Yet the relevance of philosophy is often doubted by those engaged in the empirical sciences. I argue that these doubts can be substantiated by two theoretical problems that the traditional conception of philosophy is bound to face. First, philosophical problems, theories, and notions have a strongly normative etiology that is difficult to reconcile with descriptive empirical study. Second, conceptual analysis (a role typically assigned to philosophy) seems to lose its object of study if it is granted that terms do not have purely conceptual meanings detached from their actual use in the empirical sciences. These problems are particularly acute for current naturalistic philosophy of science. I suggest a more concrete integration of philosophy and the sciences as a possible way of increasing the impact of philosophy of science.
Aristotle talks about 'the first philosophy' throughout the Metaphysics – and it is metaphysics that Aristotle considers to be the first philosophy – but he never makes it entirely clear what first philosophy consists of. What he does make clear is that the first philosophy is not to be understood as a collection of topics that should be studied in advance of any other topics. In fact, Aristotle seems to have thought that the topics of the Metaphysics are to be studied after those in the Physics. In what sense could metaphysics be the first philosophy in the context of contemporary metaphysics? This is the question examined in this essay. Contemporary topics such as fundamentality, grounding, and ontological dependence are considered as possible ways to understand the idea of first philosophy, but I will argue that the best way to understand it is in terms of essence.
In this paper I offer a counterexample to the so-called vagueness argument against restricted composition. This is done along the lines of a recent suggestion by Trenton Merricks, namely by challenging the claim that there cannot be a sharp cut-off point in a composition sequence. It is suggested that causal powers which emerge when composition occurs can serve as indicators of such sharp cut-off points. The main example is the case of a heap. It seems that heaps might provide a very plausible counterexample to the vagueness argument if we accept the idea that four grains of sand is the least number required to compose a heap, a case that has been supported by W. D. Hart. My purpose here is not to put forward a new theory of composition; I only wish to refute the vagueness argument and point out that we should be wary of arguments of its form.
In this paper I offer a novel understanding of a priori knowledge. My claim is that the sharp distinction usually made between a priori and a posteriori knowledge is groundless. It will be argued that a plausible understanding of a priori and a posteriori knowledge has to acknowledge that they stand in a constant bootstrapping relationship. It is also crucial that we distinguish between a priori propositions that hold in the actual world and merely possible, non-actual a priori propositions, as we will see when considering cases like Euclidean geometry. Furthermore, contrary to what Kripke seems to suggest, a priori knowledge is intimately connected with metaphysical modality; indeed, it is grounded in it. The task of a priori reasoning, on this account, is to delimit the space of metaphysically possible worlds in order for us to be able to determine what is actual.
Supervenient libertarianism maintains that indeterminism may exist at the supervening agency level, consistent with determinism at the subvening physical level. This approach seems to have the potential to break the longstanding deadlock in the free will debate, since it concedes to the traditional incompatibilist that agents can only do otherwise if they can do so in their actual circumstances, holding the past and the laws constant, while nonetheless arguing that this ability is compatible with physical determinism. However, we argue that supervenient libertarianism faces some serious problems and fails to break us free from this deadlock.
The most fundamental issue in the neurosciences is the question of how, or whether, the mind and the body can interact with each other. Several recent studies have suggested that current neuroimaging evidence supports a view on which the mind can have a well-documented causal influence on various brain processes. These arguments are critically analyzed here. First, the metaphysical commitments of the current neurosciences are reviewed. According to both the philosophical and the neuroscientific received views, mental states are necessarily neurally based. It is argued that this leaves no room for a genuine interaction of the mental and the neural. Second, it is shown how conclusions drawn from recent imaging studies are in fact compatible with the fully physicalistic notion of mental causation and how they can thus be easily accommodated within the received view. The fallacious conclusions are argued to result from an overly vague grasp of the conceptual issues involved. The question of whether fundamental physical principles outright exclude the ability of mental states to causally influence the physical world is also addressed, and the reaction of appealing to the apparent loophole provided by quantum physics is assessed. It is argued that linking psychology to quantum physics contradicts many basic tenets of the current neurosciences and is thus not a promising line of study. It is concluded that the interactionist hypothesis enjoys neither conceptual nor empirical support.
This paper challenges the Kripkean interpretation of a posteriori necessities. It will be demonstrated, by an analysis of classic examples, that the modal content of supposed a posteriori necessities is more complicated than the Kripkean line suggests. We will see that further research is needed concerning the a priori principles underlying all a posteriori necessities. In the course of this analysis it will emerge that the modal content of a posteriori necessities can be best described in terms of a Finean conception of modality – by giving essences priority over modality. The upshot of this is that we might be able to establish the necessity of certain supposed a posteriori necessities by a priori means.
A minimal truthmaker for a given proposition is the smallest portion of reality which makes this proposition true. Minimal truthmakers are frequently mentioned in the literature, but there has been no systematic account of what they are or of their importance. In this article we shall clarify the notion of a minimal truthmaker and argue that there is reason to think that at least some propositions have minimal truthmakers. We shall then argue that the notion can play a useful role in truthmaker theory, by helping to explain the truth of certain propositions as precisely as possible.
Can the neo-Aristotelian uphold a pluralist substance ontology while taking seriously the recent arguments in favour of monism based on quantum holism and other arguments from quantum mechanics? In this article, Jonathan Schaffer’s priority monism will be the main target. It will be argued that the case from quantum mechanics in favour of priority monism does face some challenges. Moreover, if the neo-Aristotelian is willing to consider alternative ways to understand ‘substance’, there may yet be hope for a pluralist substance ontology. A speculative case for such an ontology will be constructed based on primitive incompatibility.
The present paper discusses different approaches to metaphysics and defends a specific, non-deflationary approach that nevertheless qualifies as scientifically grounded and, consequently, as acceptable from the naturalistic viewpoint. By critically assessing some recent work on science and metaphysics, we argue that such a sophisticated form of naturalism, which preserves the autonomy of metaphysics as an a priori enterprise yet pays due attention to the indications coming from our best science, is not only workable but recommended.