With the ascent of modern epidemiology in the twentieth century came a new standard model of prediction in public health and clinical medicine. In this article, we describe the structure of the model. The standard model uses epidemiological measures (most commonly, risk measures) to predict outcomes (prognosis) and effect sizes (treatment) in a patient population; these can then be transformed into probabilities for individual patients. In the first step, a risk measure in a study population is generalized or extrapolated to a target population. In the second step, the risk measure is particularized or transformed to yield probabilistic information relevant to a patient from the target population. Hence, we call the approach the Risk Generalization-Particularization (Risk GP) Model. There are serious problems at both stages, especially concerning the extent to which the required assumptions hold and the extent to which we have evidence for them. Given that there are other models of prediction that rest on different assumptions, we should not inflexibly commit ourselves to one standard model. Instead, model pluralism should be standard in medical prediction.
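The two-step inference the abstract describes can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' formalism: the function names, the transported risk ratio, and the numerical values are all hypothetical, and the sketch deliberately encodes the two assumptions the abstract questions (that the study risk ratio transports unchanged, and that the patient shares the target population's baseline risk).

```python
# Hypothetical sketch of the Risk GP two-step inference (illustrative numbers only).

def generalize(study_risk_ratio: float) -> float:
    """Step 1: assume the risk ratio observed in the study population
    transports unchanged to the target population."""
    return study_risk_ratio

def particularize(target_baseline_risk: float, risk_ratio: float) -> float:
    """Step 2: assume the individual patient's probability equals the
    target-population risk under treatment."""
    return target_baseline_risk * risk_ratio

rr = generalize(0.8)                 # e.g. treatment cuts risk by 20% in the trial
p_patient = particularize(0.10, rr)  # patient assumed to share the 10% baseline risk
print(round(p_patient, 3))           # 0.08
```

Both steps are substantive assumptions, which is precisely where the abstract locates the model's problems.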
Current challenges in medical practice, research, and administration demand physicians who are familiar with bioethics, health law, and health economics. Curriculum directors at Association of American Medical Colleges-affiliated medical schools were sent confidential surveys requesting the number of required hours in the above subjects and the years in which they were taught, as well as instructor names. The number of relevant publications since 1990 for each named instructor was assessed by a PubMed search. In sum, teaching in all three subjects combined comprises less than two percent of the total hours in the American medical curriculum, and most instructors have not recently published articles in the fields they teach. This suggests that medical schools should reevaluate their curricula and instructors in bioethics, health law, and health economics.
RATIONALE, AIMS AND OBJECTIVES: Increased awareness of the gap between controlled research and medical practice has raised concerns over whether the special attention of doctors to probability estimates from clinical trials really improves the care of individuals. Evidence-based medicine has acknowledged that research results are not applicable to all kinds of patients and, consequently, has attempted to overcome this limitation by introducing improvements in the design and analysis of clinical trials. METHODS: A clinical case is used to highlight the premises required to support reasonable extrapolations from controlled research to individuals. Then, the prospects of two key methodological improvements - pragmatic randomized controlled trials and subgroup analysis - are critically appraised. RESULTS: A principle to guide therapeutic inferences is suggested. According to this principle, the probabilities of interest for purposes of therapeutic decision making are those of the set defined by everything that is relevant to the patient and the outcome of interest at the time of the decision. It is argued that the conditions necessary to authorize automatic extrapolations of research results to specific patients are highly demanding. Furthermore, these requirements are rarely met in real practice, even when probability estimates come from samples generally taken as representative and are derived from specific subsets of patients. CONCLUSIONS: Clinicians should generally avoid unreflective extrapolations from research and address, as explicitly as possible, the challenge of estimating probabilities for individual patients. A key element of this task is the integration of data from research and non-research sources.
The present work determines the arithmetic complexity of the index sets of u.c.e. families which are learnable according to various criteria of algorithmic learning. Specifically, we prove that the index set of codes for families that are TxtFex-learnable is complete, and that the index set of TxtFex-learnable families and the index set of TxtFext-learnable families are both complete, at their respective levels of the arithmetical hierarchy.
Afshar et al. claim that their experiment shows a violation of the complementarity inequality. In this work, we study their claim using a modified Mach-Zehnder setup that represents a simpler version of the Afshar experiment. We find that our results are consistent with Afshar et al.'s experimental findings. However, we show that within standard quantum mechanics the results of the Afshar experiment do not lead to a violation of the complementarity inequality. We show that their claim originates from a particular technique they use to analyze their results. In their analysis, they assume the classical concept that particles have a definite trajectory before detection, and thus obtain which-way information by particle detection plus path extrapolation, applying momentum conservation. This analysis technique is standard in experimental particle physics; important discoveries such as the detection of vector bosons have been made through its application. We note that particle detection plus path extrapolation is a suitable technique within the de Broglie-Bohm theory of quantum mechanics.
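The complementarity inequality at issue is usually stated as the wave-particle duality relation V^2 + D^2 <= 1, where V is the interference visibility and D the which-way distinguishability. As a minimal numerical sketch (the specific values below are illustrative, not taken from the Afshar experiment):

```python
def duality_lhs(visibility: float, distinguishability: float) -> float:
    """Left-hand side of the duality relation V**2 + D**2 <= 1."""
    return visibility**2 + distinguishability**2

# Illustrative values for an interferometer with partial which-way marking;
# the bound is saturated (V**2 + D**2 = 1) for pure states.
V, D = 0.6, 0.8
assert duality_lhs(V, D) <= 1.0 + 1e-12
print(round(duality_lhs(V, D), 6))   # 1.0
```

A claimed violation would require measuring V and D for the *same* ensemble with V^2 + D^2 > 1; the abstract's point is that path extrapolation does not supply a D that enters this relation in the required way.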
Bell’s spaceship ‘paradox’ in special relativity is a particularly good example to examine with students because, although it deals with accelerated motions, it can be dissolved with elementary space–time diagrams. Furthermore, it forces us to be very clear about the relativity of simultaneity, proper length, and the ‘reality’ of the Lorentz contraction.
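The Lorentz contraction the abstract mentions follows directly from the Lorentz factor; a minimal sketch (illustrative values, with speed given as a fraction of c):

```python
import math

def gamma(v_over_c: float) -> float:
    """Lorentz factor for speed v expressed as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

def contracted_length(proper_length: float, v_over_c: float) -> float:
    """Length measured in a frame where the rod moves at v: L = L0 / gamma."""
    return proper_length / gamma(v_over_c)

print(round(contracted_length(1.0, 0.6), 3))   # 0.8
```

In Bell's setup the subtlety is not this formula but which frame's simultaneity defines the distance between the two ships, which is why the space-time diagram, not the algebra, dissolves the puzzle.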
(1982). A methodology for achieving the aims and objectives of the Forum Humanum: The approach of the Venezuelan group. World Futures: Vol. 18, Alternatives for Humankind: The Role of Latin America, pp. 145-158.
Several studies identify risk factors associated with drug use; among them is adolescence, considered a period of heightened vulnerability. Current research, however, is interested in identifying health-promoting and protective factors, with the aim of preventing the development of risk behaviors even in situations of vulnerability. From this perspective, religiosity has been identified as a protective factor against drug use; nevertheless, little is known about the causal mechanisms of this important phenomenon. This article presents a study that investigated the pedagogical dimensions of the religious practices of a group of adolescents belonging to an evangelical church in the municipality of Canoas/RS, dimensions that act to protect against drug use in adolescence. To this end, semi-structured interviews were conducted with six young people who serve as leaders of the adolescent group of the religious institution participating in the research. This context characterizes the research as a case study. Data were analyzed following the principles of grounded theory. Four pedagogical dimensions were observed in the religious practices of the participating group that could act protectively against drug use in adolescence: education for social support, education for self-regulation, education for conscious entertainment, and education for spirituality. The results point to the relevance of the topic for public health and to the possibility of incorporating the pedagogical dimensions found in this religious context into educational programs for the prevention of drug use in adolescence.
This work applies the competitive exclusion principle and the concept of potential competitors as simple axiomatic tools to generalized situations in ecology. These tools enable apparent competition and its dual counterpart to be explicitly evaluated in poorly understood ecological systems. Within this set-theoretic framework we explore theoretical symmetries and invariances, De Morgan’s laws, frozen evolutionary diversity, and virtual processes. In particular, we find that the exclusion principle constrains the geometrical growth of the number of species. By theoretically extending this principle, we can describe interspecific predation in the dual case. This study also briefly considers the debated situation of intraspecific competition. The ecological consequences of our findings are discussed; in particular, the use of our framework to reinterpret coupled differential equations describing certain ecological processes.
This book's importance derives from three sources: careful conceptualization of teacher induction from historical, methodological, and international perspectives; systematic reviews of research literature relevant to various aspects of teacher induction, including its social, cultural, and political contexts, program components and forms, and the range of its effects; and substantial empirical studies on the important issues of teacher induction, conducted with different kinds of methodologies that exemplify future directions and approaches to research in teacher induction.
In Newtonian physics, there is a clear distinction between a "framework theory", a collection of general physical principles and definitions of physical terms, and theories that describe specific causal interactions such as gravitation, i.e., "interaction theories". I argue that this distinction between levels of theory can also be found in the context of Special Relativity and that recognizing it is essential for a philosophical account of how laws are explained in this theory. As a case study, I consider the history of derivations of mass-energy equivalence, which shows, I argue, that there are two distinct types of theoretical explanation in physics. One type is best characterized by the "top-down" account of scientific explanation, while the other is more accurately described by the "bottom-up" account. What is significant, I argue, is that the type of explanation a law receives depends on whether it is part of the framework theory or part of an interaction theory. The former receives only "top-down" explanations, while the latter can also receive "bottom-up" explanations. Thus, I argue that current debates regarding "top-down" vs. "bottom-up" views of scientific explanation can be clarified by recognizing the distinction between two levels of physical theory.