The Belmont Report’s distinction between research and the practice of accepted therapy has led various authors to suggest that these purportedly distinct activities should be governed by different ethical principles. We consider some of the ethical consequences of attempts to separate the two and conclude that separation fails along ontological, ethical, and epistemological dimensions. Clinical practice and clinical research, as with yin and yang, can be thought of as complementary forces interacting to form a dynamic system in which the whole exceeds the sum of its parts. Just as effective clinical practice cannot exist without clinical research, meaningful clinical research requires the context of clinical practice. We defend this thesis by triangulation, that is, by outlining how multiple investigators have reached this conclusion on the basis of varied theoretical and applied approaches. More confidence can be placed in a result if different methods/viewpoints have led to that result.
Many have claimed that epistemic rationality sometimes requires us to have imprecise credal states (i.e. credal states representable only by sets of credence functions) rather than precise ones (i.e. credal states representable by single credence functions). Some writers have recently argued that this claim conflicts with accuracy-centered epistemology, i.e., the project of justifying epistemic norms by appealing solely to the overall accuracy of the doxastic states they recommend. But these arguments are far from decisive. In this essay, we prove some new results, which show that there is little hope for reconciling the rationality of credal imprecision with accuracy-centered epistemology.
We prove that certain natural sequent systems for bi-intuitionistic logic have the analytic cut property. In the process we show that the (global) subformula property implies the (local) analytic cut property, thereby demonstrating their equivalence. Applying a version of the Maehara technique modified in several ways, we prove that bi-intuitionistic logic enjoys the classical Craig interpolation property and the Maksimova variable separation property; its Halldén completeness follows.
In "Variations on a theme of Curry," Humberstone conjectured that a certain logic, intermediate between BCI and BCK, is none other than monothetic BCI—the smallest extension of BCI in which all theorems are provably equivalent. In this note, we present a proof of this conjecture.
Historians of philosophy often credit Descartes, Locke, and other seventeenth-century authors with having introduced one of the most vexing problems into epistemology: the problem of mental representations. For these authors claimed that our knowledge of the external world is always mediated by mental representations, so that we have immediate access only to these representations, the ideas in our mind. As is well known, this “veil-of-ideas epistemology” gave rise to a number of skeptical questions. How can we be certain that our ideas are accurate representations of the external world? And how can we be sure that there is an external world at all if we never have immediate access to it? In his highly original and provocative study, Robert Pasnau argues that these questions are not distinctively modern. They were already asked and thoroughly discussed by medieval authors: “much of what is often taken to be novel in the seventeenth and eighteenth centuries was already old news by the fourteenth”. According to Pasnau, it was Thomas Aquinas who introduced some form of representationalism into epistemology by developing the species-theory, and it was first Peter John Olivi and later William Ockham who attacked this theory, insisting that we always have immediate cognitive access to the external world.
This paper addresses the ongoing debate over the relation between belief and credence. A proposal is made to reverse the currently predominant order of analysis, by taking belief as conceptually basic and credence as the phenomenon to be clarified. In brief, the proposal is to explicate an agent’s credence in a proposition P as the agent’s tendency toward believing P. Platitudinous as this reduction may seem, it runs counter to all of the major positions in the debate, including the Threshold View, the Certainty View as conventionally understood, Dualism, Eliminativism, as well as Credence Primitivism. Section 1 gives an overview of the current state of the debate. Section 2 considers unsuccessful predecessors of the proposed belief-first approach to credence. Section 3 motivates and lays out the basics of a conceptual framework for thinking about doxastic states that characterizes such states in terms of two formally independent dimensions, one pertaining to the agent’s tendency toward believing P, the other to the level of resilience with which the agent manifests that tendency. Against this backdrop, it is argued in Section 4 that the present reduction satisfies a set of standard, theoretically neutral criteria of adequacy for theories of credence, at least once they are purged of a quite common conflation of tendency and resilience. Section 5 argues against all of the above competing accounts.
The paper intends to provide an algebraic framework in which subluminal causation can be analysed. The framework merges Belnap's 'outcomes in branching time' with his 'branching space-time' (BST). It is shown that an important structure in BST, called the 'family of outcomes of an event', is a Boolean algebra. We next define a non-stochastic common cause and analyse GHZ-Bell theorems. We prove that there is no common cause that accounts for the results of the GHZ-Bell experiment, but construct common causes for two other quantum mechanical setups. Finally, we investigate why some setups allow for common causes whereas other setups do not.
In this essay, we defend the design of the Salk polio vaccine trial and try to put some limits on the role schemata should play in designing clinical research studies. Our presentation is structured as a response to de Freitas and Pietrobon, who identified the CONSORT statement as a schema that would have, had it existed at the time, ruled out the design of the Salk polio vaccine trial of 1954 in favor of a completely randomized controlled clinical trial. We argue that large-scale public health interventions often require evidence beyond simple efficacy, the limit of what an RCT can provide, and that the design actually adopted for the Salk trial represented a reasonable—albeit imperfect—compromise. This is of more than historical interest in that many contemporary studies are of the scale and scope to require a more pragmatic, rather than explanatory, approach to study design.
We prove that all semisimple varieties of FL_ew-algebras are discriminator varieties. A characterisation of discriminator and EDPC varieties of FL_ew-algebras follows. It matches exactly a natural classification of logics over FL_ew proposed by H. Ono.
In this paper we show that a variety of modal algebras of finite type is semisimple iff it is discriminator iff it is both weakly transitive and cyclic. This fact has been claimed already in [4] (based on joint work by the two authors) but the proof was fatally flawed.
In this paper we explore the thesis that the role of argumentation in practical reasoning in general, and legal reasoning in particular, is to justify the use of defeasible rules to derive a conclusion in preference to the use of other defeasible rules to derive a conflicting conclusion. The defeasibility of rules is expressed by means of non-provability claims as additional conditions of the rules. We outline an abstract approach to defeasible reasoning and argumentation which includes many existing formalisms, including default logic, extended logic programming, non-monotonic modal logic and auto-epistemic logic, as special cases. We show, in particular, that the admissibility semantics for all these formalisms has a natural argumentation-theoretic interpretation and proof procedure, which seem to correspond well with informal argumentation.
Survival of the fittest in evolutionary biology has a counterpart in the evolution of research paradigms. It’s called survival of the funded, and there is a sense in which paradigms are even more adaptable than species. Whereas species may become extinct if their fitness declines below a critical threshold, paradigms can rise again, perhaps with a new name, following fiscal collapse, provided only that funding is once again made available. A current example is the born-again concept of comparative effectiveness research, which achieved resurrection status with the passage of the Affordable Care Act of 2010. The ACA established a private, nonprofit entity to oversee publicly financed CER. According to ..
This study illuminates how a cross-sector social partnership legitimizes itself toward multiple internal and external stakeholders. Within a single-case study design, we collected retrospective and real time data on the partnership between Deutsche Post DHL and The United Nations Office for the Coordination of Humanitarian Affairs. Within this partnership, Deutsche Post DHL provides corporate volunteers that support disaster response after natural disasters on a pro bono basis. The main objects that needed legitimacy as well as the audiences from which legitimacy was mainly sought changed over time. In addition, we identified legitimation work as occurring across objects, audiences, and time. Thus, we introduce legitimation work as the purposeful effort of the legitimacy seeker to avoid certain issues while ensuring other issues that are of importance to the conferrer of legitimacy. These findings contribute to micro-level considerations within institutional theory which view legitimacy as socially constructed between legitimacy seeker and conferrer. Hence, we add another perspective on legitimation to the previously existing conceptualizations of legitimacy as a deterministic consequence of institutionalization.
This paper addresses questions of universality related to ontological engineering, namely, it aims at substantiating (negative) answers to the following three basic questions: (i) Is there a ‘universal ontology’?, (ii) Is there a ‘universal formal ontology language’?, and (iii) Is there a universally applicable ‘mode of reasoning’ for formal ontologies? To support our answers in a principled way, we present a general framework for the design of formal ontologies resting on two main principles: firstly, we endorse Rudolf Carnap’s principle of logical tolerance by giving central stage to the concept of logical heterogeneity, i.e. the use of a plurality of logical languages within one ontology design. Secondly, to structure and combine heterogeneous ontologies in a semantically well-founded way, we base our work on abstract model theory in the form of institutional semantics, as forcefully put forward by Joseph Goguen and Rod Burstall. In particular, we employ the structuring mechanisms of the heterogeneous algebraic specification language HetCasl for defining a general concept of heterogeneous, distributed, highly modular and structured ontologies, called hyperontologies. Moreover, we distinguish, on a structural and semantic level, several different kinds of combining and aligning heterogeneous ontologies, namely integration, connection, and refinement. We show how the notion of heterogeneous refinement can be used to provide both a general notion of sub-ontology as well as a notion of heterogeneous equivalence of ontologies, and finally sketch how different modes of reasoning over ontologies are related to these different structuring aspects.
In this article, it will be argued that tolerance is not necessarily a political or ethical attitude, but rather an abstract one that can be applied to many different dimensions of normative evaluation. More specifically, it will be argued that there are genuinely intellectual forms of tolerance that are epistemically motivated and that need to be assessed on purely epistemic grounds. To establish this claim, an abstract characterization of tolerance will be applied to the epistemic phenomenon of disagreement in order to develop a specific conception of tolerance that picks out a genuinely intellectual attitude towards recognized disagreement. Since the attitude that is picked out by this conception is very popular and widespread, an epistemology of tolerance would be of great significance to our intellectual practice.
Many late medieval Aristotelians assumed that a natural substance has several substantial forms in addition to matter as really distinct parts. This assumption gave rise to a unity problem: why is a substance more than a conglomeration of all these parts? This paper discusses Francisco Suárez’s answer. It first shows that he rejected the idea that there is a plurality of forms, emphasizing instead that each substance has a single form and hence a single structuring principle. It then examines his account of the relationship between matter and form. While accepting the thesis that these two parts are really distinct entities, he claimed that there is a special “mode of union” that binds them together. With this account, he defended the essential unity of a natural substance, but he transformed the program of Aristotelian metaphysics: not substances, but entities and modes inside them, are now the basic building blocks of reality.
In a recent paper in this journal, John Worrall (2008) used the example of a series of trials involving extracorporeal membrane oxygenation (ECMO), a technology for the treatment of respiratory failure in newborns, to illustrate the relationship between ethics and epistemology in medical research. One of the issues considered was whether or not it was ethical to perform a particular clinical trial at all, and he showed clearly that the answer was intimately related to epistemological judgments about the weight to be given existing evidence concerning treatment effectiveness. In the case of ECMO, a trial was initiated at the University of Michigan (Bartlett et al. 1985), despite the fact that the researchers had ..
David Sackett and Jack Wennberg have each introduced and developed ideas and methods that have had major impacts on how we think about and perform clinical research. Sackett is best known for his work in Evidence-Based Medicine (Sackett et al. 1997); Wennberg, upon noting wide geographic (and other) variations in best practices for the same conditions, stressed the importance of comparative effectiveness in clinical decision-making (Wennberg et al. 1993). When these two collaborated in an editorial about the current state of the art and science of clinical research, it came as no great surprise that they produced some truly memorable guidance. And, as is so often the case for inspired—and inspiring—advice, their ..
Varieties like groups, rings, or Boolean algebras have the property that, in any of their members, the lattice of congruences is isomorphic to a lattice of more manageable objects, for example normal subgroups of groups, two-sided ideals of rings, filters (or ideals) of Boolean algebras. Algebraic logic can explain these phenomena at a rather satisfactory level of generality: in every member A of a τ-regular variety 𝒱 the lattice of congruences of A is isomorphic to the lattice of deductive filters on A of the τ-assertional logic of 𝒱. Moreover, if 𝒱 has a constant 1 in its type and is 1-subtractive, the deductive filters on A ∈ 𝒱 of the 1-assertional logic of 𝒱 coincide with the 𝒱-ideals of A in the sense of Gumm and Ursini, for which we have a manageable concept of ideal generation. However, there are isomorphism theorems, for example, in the theories of residuated lattices, pseudointerior algebras and quasi-MV algebras that cannot be subsumed by these general results. The aim of the present paper is to appropriately generalise the concepts of subtractivity and τ-regularity in such a way as to shed some light on the deep reason behind such theorems. The tools and concepts we develop hereby provide a common umbrella for the algebraic investigation of several families of logics, including substructural logics, modal logics, quantum logics, and logics of constructive mathematics.
Throughout its history, the renowned Kaṭha Upaniṣad has often been described as being both incoherent and contradictory. The aim of this paper is to show to what purpose the text was created. To this end, it discusses the connection of the three paths to salvation depicted in the text, viz. the Agnicayana, the Upaniṣadic method of self-knowledge, and yoga. The first part retraces how in the Upaniṣads, the Agnicayana was transformed into a non-material or mental ritual and linked with self-knowledge. The second part analyses how the various salvation goals could be related to each other. First, the authors redefined the Agnicayana’s salvation goal, heaven, to make it identical with liberation. Secondly, they introduced self-knowledge and yoga as alternative and equally powerful means to the same end. In practice, however, the new and world-negating methods were implied to be superior to the costly ritual from which they had drawn their authority. Thus, the authors of the Upaniṣad were more concerned with showing continuity between different religious approaches than with upholding consistency of content.
It seems quite natural that we have cognitive access not only to things around us, but also to our own acts of perceiving and thinking. How is this access possible? How is it related to the access we have to external things? And how certain is it? This paper discusses these questions by focusing on Francisco Suárez’s theory, which gives an account of various forms of access to oneself and thereby presents an elaborate theory of consciousness. It argues that Suárez clearly distinguishes between first-order sensory consciousness and second-order intellectual consciousness. Moreover, Suárez attempts to explain the unity of consciousness by referring to a single soul with hierarchically ordered faculties that is responsible both for first-order and for second-order consciousness.
Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models concerns what the epistemic goal of toy modelling is. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The aim of this article is to precisely articulate and to defend this claim. In particular, we will distinguish between autonomous and embedded toy models, and then argue that important examples of autonomous toy models are sometimes best interpreted to provide how-possibly understanding, while embedded toy models yield how-actually understanding, if certain conditions are satisfied.
Following Bernheim, we examine aspects of 'felicitometrics,' the measurement of the 'quality' term in Quality of Life (QOL). Bernheim argued that overall QOL is best captured as the Gestalt of a global self-assessment and suggested that the Anamnestic Comparative Self Assessment (ACSA) approach, in which subjects' memories of the best and worst times of their lives are used to anchor a Visual Analog Scale (VAS), provided a serious answer to the serious question, 'How have you been?' Bernheim compares and contrasts the ACSA to multi-item questionnaire QOL instruments, such as the SF-36, concluding that the ACSA has a number of advantages. His discussion assumes that the use of QOL outcomes in clinical trials is both relevant and appropriate. In the present paper, we document the reasonableness of this latter assumption, contribute to the characterization of the similarities and differences between multi-item and individualized QOL instruments, and point to some other individualized instruments that may be used in clinical trial contexts. These 'other individualized instruments' differ from the ACSA in fundamental ways; but they are individualized in that the subject defines those areas in his/her life that are most important, and these may vary from subject to subject.
The Royal Society report updates the anthropogenic impacts on ecosystem services and our inability to rise to this challenge. Sustainable development is argued to be a linguistic device that has been instrumental in deflecting us from addressing the paradox at the heart of the oxymoron. The relationships between the social, environmental, and economic are explored together with the utility of the I = PAT equation, with reference to the Hardin Taboo and the Jevons and Easterlin paradoxes. A more prominent role for phronesis in the management of human affairs and the adoption of ethics as the language for dealing with such issues are advocated.
We exhibit a simple inference rule, which is admissible but not derivable in BCK, proving that BCK is not structurally complete. The argument is proof-theoretical.
Many different approaches to describing the players’ knowledge and beliefs can be found in the literature on the epistemic foundations of game theory. We focus here on non-probabilistic approaches. The two most prominent are the so-called Kripke- or Aumann-structures and knowledge structures (non-probabilistic variants of Harsanyi type spaces). Much of the recent work on Kripke structures has focused on dynamic extensions and simple ways of incorporating these. We argue that many of these ideas can be applied to knowledge structures as well. Our main result characterizes precisely when one type can be transformed into another type by a specific type of information update. Our work in this paper suggests that it would be interesting to pursue a theory of “information dynamics” for knowledge structures (and eventually Harsanyi type spaces).
The use of quality of life (QOL) outcomes in clinical trials is increasing as a number of practical, ethical, methodological, and regulatory reasons for their use have become apparent. It is important, then, that QOL measurements and differences between QOL scores be readily interpretable. We study interpretation in two contexts: when determining QOL and when basing decisions on QOL differences. We consider both clinical situations involving individual patients and research contexts, e.g., randomized clinical trials, involving groups of patients. We note the ethical importance of such understanding: proper interpretation and communication facilitate health care decision making. Communication that facilitates interpretation is of moral significance since better communication can attenuate ethical problems and inform choices. Much of what is communication-worthy about QOL assessments is determined by the particular QOL instrument used in the assessment and how it is administered. In practice, these choices will be driven by the purpose of the assessment, but, it is argued, to maximize understanding, we should combine the information garnered from traditional standardized QOL instruments, from individualized QOL assessments, and from a recently proposed dialogic paradigm, where QOL is determined by shared conversation regarding the interpretation of texts. And, while some studies can surely succeed using abbreviated methods of administration (e.g., postal surveys may suffice for certain purposes), we will focus on methods of administration involving interviewer–respondent interaction. We suggest that during the QOL elicitation process, interviewer and respondent should engage in a two-way conversation in order to achieve a shared understanding of the “answers” to QOL “questions” and, finally, to reach a shared interpretation of the individual’s QOL.
Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models is that it is an unsettled question what the epistemic goal of toy modeling is. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The aim of this paper is to precisely articulate and to defend this claim. In particular, we will distinguish between autonomous and embedded toy models, and then argue that important examples of autonomous toy models are sometimes best interpreted to provide how-possibly understanding, while embedded toy models yield how-actually understanding, if certain conditions are satisfied.
We study two logics of knowledge and belief stemming from the work of Stalnaker, omitting positive introspection for knowledge. The two systems are equivalent with positive introspection, but not without. We show that while the logic of belief remains unaffected by omitting introspection for knowledge in one system, it brings significant changes to the other. The resulting logic of belief is non-normal, and its complete axiomatization uses an infinite hierarchy of coherence constraints. We conclude by returning to the philosophical interpretation underlying both models of belief, showing that neither is strong enough to support a probabilistic interpretation, nor an interpretation in terms of certainty or the “mental component” of knowledge.