Background: Innovations in technology have contributed to rapid changes in the way that modern biomedical research is carried out. Researchers are increasingly required to endorse adaptive and flexible approaches to accommodate these innovations and comply with ethical, legal and regulatory requirements. This paper explores how Dynamic Consent may provide solutions to address challenges encountered when researchers invite individuals to participate in research and follow them up over time in a continuously changing environment. Methods: An interdisciplinary workshop jointly organised by the University of Oxford and the COST Action CHIP ME gathered clinicians, researchers, ethicists, lawyers, research participants and patient representatives to discuss experiences of using Dynamic Consent, and how such use may facilitate the conduct of specific research tasks. The data collected during the workshop were analysed using a content analysis approach. Results: Dynamic Consent can provide practical, sustainable and future-proof solutions to challenges related to participant recruitment, the attainment of informed consent, participant retention and consent management, and may bring economic efficiencies. Conclusions: Dynamic Consent offers opportunities for ongoing communication between researchers and research participants that can positively impact research. Dynamic Consent supports inter-sector, cross-border approaches and large-scale data-sharing. Whilst it is relatively easy to set up and maintain, its implementation will require that researchers reconsider their relationship with research participants and adopt new procedures.
Epigenetic and transcriptional variability contribute to the vast diversity of cellular and organismal phenotypes and are key in human health and disease. In this review, we describe different types, sources, and determinants of epigenetic and transcriptional variability, enabling cells and organisms to adapt and evolve to a changing environment. We highlight the latest research and hypotheses on how chromatin structure and the epigenome influence gene expression variability. Further, we provide an overview of challenges in the analysis of biological variability. An improved understanding of the molecular mechanisms underlying epigenetic and transcriptional variability, at both the intra- and inter-individual level, provides great opportunity for disease prevention, better therapeutic approaches, and personalized medicine. Epigenetic and transcriptional variability mediate phenotypic plasticity, enabling adaptation to changing environments. In this review, we describe the sources of inter- and intra-individual variability and discuss epigenetic regulators of gene expression variability, including DNA methylation and chromatin structure. Understanding these molecular mechanisms will improve therapeutic approaches and personalized medicine.
A VERSION OF CARTESIAN METHOD RODERICK M. CHISHOLM Introduction In one of his many profound discussions of the method of philosophy, Körner makes the ...
This comprehensive new book introduces the core history of phenomenology and assesses its relevance to contemporary psychology, philosophy of mind, and cognitive science. From critiques of artificial intelligence research programs to ongoing work on embodiment and enactivism, the authors trace how phenomenology has produced a valuable framework for analyzing cognition and perception, whose impact on contemporary psychological and scientific research and philosophical debates continues to grow. The first part of _An Introduction to Phenomenology_ is an extended overview of the history and development of phenomenology, looking at its key thinkers, focusing particularly on Husserl, Heidegger and Merleau-Ponty, as well as its cultural and intellectual precursors. In the second half Chemero and Käufer turn their attention to the contemporary interpretations and uses of phenomenology in cognitive science, showing that phenomenology is a living source of inspiration in contemporary interdisciplinary studies of the mind. Käufer and Chemero have written a clear, jargon-free account of phenomenology, providing abundant examples and anecdotes to illustrate and to entertain. This book is an ideal introduction to phenomenology and cognitive science for the uninitiated, as well as for philosophy and psychology students keen to deepen their knowledge.
Metaphysical grounding is standardly taken to be irreflexive: nothing grounds itself. Kit Fine has presented some puzzles that appear to contradict this principle. I construct a particularly simple variant of those puzzles that is independent of several of the assumptions required by Fine, instead employing quantification into sentence position. Various possible responses to Fine's puzzles thus turn out to apply only in a restricted range of cases.
I suggest a way of extending Stalnaker’s account of assertion to allow for centered content. In formulating his account, Stalnaker takes the content of assertion to be uncentered propositions: entities that are evaluated for truth at a possible world. I argue that the content of assertion is sometimes centered: the content is evaluated for truth at something within a possible world. I consider Andy Egan’s proposal for extending Stalnaker’s account to allow for assertions with centered content. I argue that Egan’s account does not succeed. Instead, I propose an account on which the contents of assertion are identified with sets of multi-centered worlds. I argue that such a view not only provides a plausible account of how assertions can have centered content, but also preserves Stalnaker’s original insight that successful assertion involves the reduction of shared possibilities.
The view known as animalism asserts that we are human animals—that each of us is an instance of the Homo sapiens species. The standard argument for this view is known as the thinking animal argument. But this argument has recently come under attack. So, here, a new argument for animalism is introduced. The animal ancestors argument illustrates how the case for animalism can be seen to piggyback on the credibility of evolutionary theory. Two objections are then considered and answered.
Fundamental theories are hard to come by. But even if we had them, they would be too complicated to apply. Quantum chromodynamics (QCD) is a case in point. This theory is supposed to govern all strong interactions, but it is extremely hard to apply and test at energies where protons, neutrons and ions are the effective degrees of freedom. Instead, scientists typically use highly idealized models such as the MIT Bag Model or the Nambu Jona-Lasinio Model to account for phenomena in this domain, to explain them and to gain understanding. Based on these models, which typically isolate a single feature of QCD (confinement and chiral symmetry breaking respectively) and disregard many others, scientists attempt to get a better understanding of the physics of strong interactions. But does this practice make sense? Is it justified to use these models for the purposes at hand? Interestingly, these models do not even describe the mass spectrum of protons, neutrons and pions and their lowest-lying excitations well - despite several adjustable parameters. And yet, the models are heavily used. I'll argue that a qualitative story, which establishes an explanatory link between the fundamental theory and a model, plays an important role in model acceptance in these cases.
There is currently disagreement about whether the phenomenon of first-person, or de se, thought motivates a move towards special kinds of contents. Some take the conclusion that traditional propositions are unable to serve as the content of de se belief to be old news, successfully argued for in a number of influential works several decades ago. Recently, some philosophers have challenged the view that there exist uniquely de se contents, claiming that most of the philosophical community has been under the grip of an attractive but unmotivated myth. At the very least, this latter group has brought into question the arguments in favor of positing special kinds of content for de se belief; I think they have successfully shown that these arguments are not as conclusive, or fully articulated, as many have taken them to be. In this paper I will address these challenges directly and I will present and defend an argument for the conclusion that the phenomenon of de se thought does indeed motivate the move to a special kind of content, content that is uniquely de se. First, I characterize a notion of de se belief that is neutral with respect to friends and foes of uniquely de se content. I then argue for a determination thesis relating de se belief to belief content: that there is no difference in de se belief without a difference in belief content. I argue that various proposals for rejecting this determination thesis are unsuccessful. In the last part of the paper, I employ this determination thesis to argue for the existence of a type of belief content that is uniquely de se.
The elucidations and regimentations of grounding offered in the literature standardly take it to be a necessary connection. In particular, authors often assert, or at least assume, that if some facts ground another fact, then the obtaining of the former necessitates the latter; and moreover, that grounding is an internal relation, in the sense of being necessitated by the existence of the relata. In this article, I challenge the necessitarian orthodoxy about grounding by offering two prima facie counterexamples. First, some physical facts may ground a certain phenomenal fact without necessitating it; and they may co-exist with the latter without grounding it. Second, some instantiations of categorical properties may ground the instantiation of a dispositional one without necessitating it; and they may co-exist without grounding it. After arguing that these may be genuine counterexamples, I ask whether there are modal constraints on grounding that are not threatened by them. I propose two: that grounding supervenes on what facts there are, and that every grounded fact supervenes on what grounds there are. Finally, I attempt to provide a rigorous formulation of the latter supervenience claim and discuss some technical questions that arise if we allow descending grounding chains of transfinite length.
This entry sketches the theory of personal identity that has come to be known as animalism. Animalism’s hallmark claim is that each of us is identical with a human animal. Moreover, animalists typically claim that we could not exist except as animals, and that the (biological) conditions of our persistence derive from our status as animals. Prominent advocates of this view include Michael Ayers, Eric Olson, Paul Snowdon, Peter van Inwagen, and David Wiggins.
Causal queries about singular cases are ubiquitous, yet the question of how we assess whether a particular outcome was actually caused by a specific potential cause turns out to be difficult to answer. Relying on the causal power framework, Cheng and Novick () proposed a model of causal attribution intended to help answer this question. We challenge this model, both conceptually and empirically. We argue that the central problem of this model is that it treats causal powers that are probabilistically sufficient to generate the effect on a particular occasion as actual causes of the effect, and thus neglects that sufficient causal powers can be preempted in their efficacy. Also, the model does not take into account that reasoners incorporate uncertainty about the underlying general causal structure and strength of causes when making causal inferences. We propose a new measure of causal attribution and embed it into the structure induction model of singular causation. Two experiments support the model.
In this new book, Ulrich Beck develops his now widely used concepts of second modernity, risk society and reflexive sociology into a radical new sociological ...
We define a notion of difference-making for partial grounds of a fact in rough analogy to existing notions of difference-making for causes of an event. Using orthodox assumptions about ground, we show that it induces a non-trivial division with examples of partial grounds on both sides. We then demonstrate the theoretical fruitfulness of the notion by applying it to the analysis of a certain kind of putative counter-example to the transitivity of ground recently described by Jonathan Schaffer. First, we show that our conceptual apparatus of difference-making enables us to give a much clearer description than Schaffer does of what makes the relevant instances of transitivity appear problematic. Second, we suggest that difference-making is best seen as a mark of good grounding-based explanations rather than a necessary condition on grounding, and argue that this enables us to deal with the counter-example in a satisfactory way. Along the way, we show that Schaffer's own proposal for salvaging a form of transitivity by moving to a contrastive conception of ground is unsuccessful. We conclude by sketching some natural strategies for extending our proposal to a more comprehensive account of grounding-based explanations.
In this paper I consider two strategies for providing tenseless truth-conditions for tensed sentences: the token-reflexive theory and the date theory. Both theories have faced a number of objections by prominent A-theorists such as Quentin Smith and William Lane Craig. Traditionally, these two theories have been viewed as rival methods for providing truth-conditions for tensed sentences. I argue that the debate over whether the token-reflexive theory or the date theory is true has arisen from a failure to distinguish between conditions for the truth of tensed tokens and conditions for the truth of propositions expressed by tensed tokens. I demonstrate that there is a true formulation of the token-reflexive theory that provides necessary and sufficient conditions for the truth of tensed tokens, and there is a true formulation of the date theory that provides necessary and sufficient conditions for the truth of propositions expressed by tensed tokens. I argue that once the views are properly formulated, the A-theorist’s objections fail to make their mark. However, I conclude by claiming that even though there is a true formulation of the token-reflexive theory and a true formulation of the date theory, the New B-theory nonetheless fails to provide a complete account of the truth and falsity of tensed sentences.
Say that two sentences are ground-theoretically equivalent iff they are interchangeable salva veritate in grounding contexts. Notoriously, ground-theoretic equivalence is a hyperintensional matter: even logically equivalent sentences may fail to be interchangeable in grounding contexts. Still, there seem to be some substantive, general principles of ground-theoretic equivalence. For example, it seems plausible that any sentences of the form A ∧ B and B ∧ A are ground-theoretically equivalent. What, then, are in general the conditions for two sentences to stand in the relation of ground-theoretic equivalence, and what are the logical features of that relation? This paper develops and defends an answer to these questions based on the mode-ified truthmaker theory of content presented in my recent paper ‘Towards a theory of ground-theoretic content’ (785–814, 2018).
Susan Carey's account of Quinean bootstrapping has been heavily criticized. While it purports to explain how important new concepts are learned, many commentators complain that it is unclear just what bootstrapping is supposed to be or how it is supposed to work. Others allege that bootstrapping falls prey to the circularity challenge: it cannot explain how new concepts are learned without presupposing that learners already have those very concepts. Drawing on discussions of concept learning from the philosophical literature, this article develops a detailed interpretation of bootstrapping that can answer the circularity challenge. The key to this interpretation is the recognition of computational constraints, both internal and external to the mind, which can endow empty symbols with new conceptual roles and thus new contents.
It is often claimed that the greatest value of the Bayesian framework in cognitive science consists in its unifying power. Several Bayesian cognitive scientists assume that unification is obviously linked to explanatory power. But this link is not obvious, as unification in science is a heterogeneous notion, which may have little to do with explanation. While a crucial feature of most adequate explanations in cognitive science is that they reveal aspects of the causal mechanism that produces the phenomenon to be explained, the kind of unification afforded by the Bayesian framework to cognitive science does not necessarily reveal aspects of a mechanism. Bayesian unification, nonetheless, can place fruitful constraints on causal–mechanical explanation.
1 Introduction
2 What a Great Many Phenomena Bayesian Decision Theory Can Model
3 The Case of Information Integration
4 How Do Bayesian Models Unify?
5 Bayesian Unification: What Constraints Are There on Mechanistic Explanation?
5.1 Unification constrains mechanism discovery
5.2 Unification constrains the identification of relevant mechanistic factors
5.3 Unification constrains confirmation of competitive mechanistic models
6 Conclusion
Appendix.
The emerging consensus in the philosophy of cognition is that cognition is situated, i.e., dependent upon or co-constituted by the body, the environment, and/or the embodied interaction with it. But what about emotions? If the brain alone cannot do much thinking, can the brain alone do some emoting? If not, what else is needed? Do (some) emotions (sometimes) cross an individual's boundary? If so, what kinds of supra-individual systems can be bearers of affective states, and why? And does that make emotions ‘embedded’ or ‘extended’ in the sense cognition is said to be embedded and extended? Section 2 shows why it is important to understand in which sense body, environment, and our embodied interaction with the world contribute to our affective life. Section 3 introduces some key concepts of the debate about situated cognition. Section 4 draws attention to an important disanalogy between cognition and emotion with regard to the role of the body. Section 5 shows under which conditions a contribution by the environment results in non-trivial cases of ‘embedded’ emotions. Section 6 is concerned with affective phenomena that seem to cross the organismic boundaries of an individual, in particular with the idea that emotions are ‘extended’ or ‘distributed’.
‘Transplant’ thought-experiments, in which the cerebrum is moved from one body to another, have featured in a number of recent discussions in the personal identity literature. Though once taken as offering confirmation of some form of psychological continuity theory of identity, these thought-experiments have been challenged by Marya Schechtman and Kathleen Wilkes, who contend that this is not the case: any such apparent support is due to a lack of detail in their description or a reliance on predictions that we are in no position to make. I argue that the case against them rests on two serious misunderstandings of the operation of thought-experiments, and that even if they do not ultimately support a psychological continuity theory, they do major damage to that theory’s opponents.
I consider whether the self-ascription theory can succeed in providing a tenseless (B-theoretic) account of tensed belief and timely action. I evaluate an argument given by William Lane Craig for the conclusion that the self-ascription account of tensed belief entails a tensed theory (A-theory) of time. I claim that how one formulates the self-ascription account of tensed belief depends upon whether one takes the subject of self-ascription to be a momentary person-stage or an enduring person. I provide two different formulations of the self-ascription account of tensed belief, one that is compatible with a perdurantist account of persons and the other that is compatible with an endurantist account of persons. I argue that a self-ascription account of tensed beliefs for enduring subjects most plausibly involves the self-ascription of relations rather than properties. I argue that whether one takes the subject of self-ascription to be a momentary person-stage or an enduring person, the self-ascription theory provides a plausible B-theoretic account of how tensed belief and timely action are possible.
Drawing on a corpus of recorded lines from Tennessee Williams's play A Streetcar Named Desire, this article examines the phonology of exclamative utterances. It establishes that the essential phonological specificities of exclamations are suprasegmental in nature and are observed at the intonational level. After identifying certain recurrent characteristics of the melodic contours found in exclamative utterances, and showing that these are conditioned by the semantic motivation and iconicity of intonation, we turn to the indirect links between the intonation of exclamations and the expression of degree. Building on a reflection on the demarcative function of intonation and the illocutionary value of exclamative utterances, we also suggest that the study of the intonation of exclamations can shed light on the notion of the act of utterance in contexts of spoken interaction.
In his 2010 paper ‘Grounding and Truth-Functions’, Fabrice Correia has developed the first and so far only proposal for a logic of ground based on a worldly conception of facts. In this paper, we show that the logic allows the derivation of implausible grounding claims. We then generalize these results and draw some conclusions concerning the structural features of ground and its associated notion of relevance, which has so far not received the attention it deserves.
Simulation techniques, especially those implemented on a computer, are frequently employed in natural as well as in social sciences with considerable success. There is mounting evidence that the "model-building era" (J. Niehans) that dominated the theoretical activities of the sciences for a long time is about to be succeeded or at least lastingly supplemented by the "simulation era". But what exactly are models? What is a simulation and what is the difference and the relation between a model and a simulation? These are some of the questions addressed in this article. I maintain that the most significant feature of a simulation is that it allows scientists to imitate one process by another process. "Process" here refers solely to a temporal sequence of states of a system. Given the observation that processes are dealt with by all sorts of scientists, it is apparent that simulations prove to be a powerful interdisciplinarily acknowledged tool. Accordingly, simulations are best suited to investigate the various research strategies in different sciences more carefully. To this end, I focus on the function of simulations in the research process. Finally, a somewhat detailed case-study from nuclear physics is presented which, in my view, illustrates elements of a typical simulation in physics.
Bayesian epistemology addresses epistemological problems with the help of the mathematical theory of probability. It turns out that the probability calculus is especially suited to represent degrees of belief (credences) and to deal with questions of belief change, confirmation, evidence, justification, and coherence. Compared to the informal discussions in traditional epistemology, Bayesian epistemology allows for a more precise and fine-grained analysis which takes the gradual aspects of these central epistemological notions into account. Bayesian epistemology therefore complements traditional epistemology; it does not replace it or aim at replacing it.
The concept of supervenience and a regimented concept of grounding are often taken to provide rival explications of pre-theoretical concepts of dependence and determination. Friends of grounding typically point out that supervenience claims do not entail corresponding grounding claims. Every fact supervenes on itself, but is not grounded in itself, and the fact that a thing exists supervenes on the fact that its singleton exists, but is not grounded in it. Common lore has it, though, that grounding claims do entail corresponding supervenience claims. In this article, I show that this assumption is problematic. On one way of understanding it, the corresponding supervenience claim is just an entailment claim under a different name. On another way of understanding it, the corresponding claim is a distinctive supervenience claim, but its specification gives rise to what I call the "reference type problem": to associate the classes of facts that are the relata of grounding with the types of facts that are the relata of supervenience. However it is understood, supervenience rules out prima facie possibilities: alien realizers, blockers, heterogeneous realizers, floaters, and heterogeneous blockers. Instead of being rival explications of one and the same pre-theoretical concept, grounding and supervenience may be complementary concepts capturing different aspects of determination and dependence.
Trish Glazebrook has written an interesting book, and philosophers who care for Heidegger’s writing will do well to read it. The book is fertile and suggestive; it spans a large number of Heidegger’s writings, famous and obscure, and it presents Heidegger’s thinking on science from the same important variety of perspectives that Heidegger himself deems necessary to all philosophizing: science as a thought-system in need of theoretical grounding; science as a practice that involves an existential commitment by the practitioner; science as a cultural possibility within an institutional setting; science as a body of knowledge that has a history; science as a way of comportment in which the world is disclosed. She shows that these perspectives belong together, and thus produces an interesting narrative in which Heidegger’s famous later critique of technology grows more or less directly out of his disastrous attempt at managing university politics, which in turn results from his Kant- and Aristotle-inspired thought on contemporary physics. In the end, Glazebrook can justifiably “hope to have awakened in others an interest in Heidegger’s philosophy of science.” And perhaps to have added momentum to the burgeoning literature on just this topic.
In the past decade well-designed research studies have shown that the practice of collaborative philosophical inquiry in schools can have marked cognitive and social benefits. Student academic performance improves, and so too does the social dimension of schooling. These findings are timely, as many countries in Asia and the Pacific are now contemplating introducing Philosophy into their curricula. This paper gives a brief history of collaborative philosophical inquiry before surveying the evidence as to its effectiveness. The evidence is canvassed under two categories: schooling and thinking skills; and schooling, socialisation and values. In both categories there is clear evidence that even short-term teaching of collaborative philosophical inquiry has marked positive effects on students. The paper concludes with suggestions for further research and a final claim that the presently-available research evidence is strong enough to warrant implementing collaborative philosophical inquiry as part of a long-term policy.
Fundamentality plays a pivotal role in discussions of ontology, supervenience, possibility, and other key topics in metaphysics. However, there are two different ways of characterising the fundamental: as that which is not grounded, and as that which is the ground of everything else. I show that whether these two characterisations pick out the same property turns on a principle—which I call “Dichotomy”—that is of independent interest in the theory of ground: that everything is either fully grounded or not even partially grounded. I then argue that Dichotomy fails: some facts have partial grounds that cannot be complemented to a full ground. Rejecting Dichotomy opens the door to recognising a bifurcation in our notion of fundamentality. I sketch some of the far-reaching metaphysical consequences this might have, with reference to big-picture views such as Humeanism. Since Dichotomy is entailed by the standard account of partial ground, according to which partial grounds are subpluralities of full grounds, a non-standard account is needed. In a technical “Appendix”, I show that truthmaker semantics furnishes such an account, and identify a semantic condition that corresponds to Dichotomy.
Effective field theories have been a very popular tool in quantum physics for almost two decades. And there are good reasons for this. I will argue that effective field theories share many of the advantages of both fundamental theories and phenomenological models, while avoiding their respective shortcomings. They are, for example, flexible enough to cover a wide range of phenomena, and concrete enough to provide a detailed story of the specific mechanisms at work at a given energy scale. So will all of physics eventually converge on effective field theories? This paper argues that good scientific research can be characterised by a fruitful interaction between fundamental theories, phenomenological models and effective field theories. All of them have their appropriate functions in the research process, and all of them are indispensable. They complement each other and hang together in a coherent way which I shall characterise in some detail. To illustrate all this I will present a case study from nuclear and particle physics. The resulting view about scientific theorising is inherently pluralistic, and has implications for the debates about reductionism and scientific explanation.
When do objects at different times compose a further object? This is the question of diachronic composition. The universalist answers, ‘under any conditions whatsoever’. Others argue for restrictions on diachronic composition: composition occurs only when certain conditions are met. Recently, some philosophers have argued that restrictions on diachronic compositions are motivated by our best physical theories. In Persistence and Spacetime and elsewhere, Yuri Balashov argues that diachronic compositions are restricted in terms of causal connections between object stages. In a recent article, Nikk Effingham argues that the standard objections to views that endorse restrictions on composition do not apply to a view that restricts composition according to compliance with the laws of nature. On the face of it, such restrictions on diachronic composition preserve our common-sense ontology while eliminating from it scientifically revisionary objects that travel faster than the speed of light. I argue that these attempts to restrict diachronic composition by appealing to either causal or nomological constraints face insurmountable difficulties within the context of special relativity. I discuss how the universalist should best respond to Hudson’s argument for superluminal objects, and in doing so, I present and defend a new sufficient condition for motion that does not entail that such objects are in superluminal motion.
1 Introduction
2 Diachronic Composition
3 Diachronic Composition and Superluminal Objects
4 Restricting Diachronic Composition
5 Causal and Nomological Restrictions on Composition in a Relativistic Context
6 Superluminal Objects and Motion
7 Conclusion.