The publication in 1957 of the Wolfenden Report occasioned a celebrated controversy in which profound theoretical issues concerning the relation between law and morality, and the legal enforcement of morality, were discussed. The principal disputants were Lord Justice Devlin and Professor H. L. A. Hart. It is by now well known that the main recommendation of the Wolfenden Report was the reform of the criminal law so that homosexual behaviour in private between consenting male adults should no longer be a criminal offence. Since homosexual behaviour in Christendom had been punishable first in the ecclesiastical courts and, after their demise, in the secular courts, the Wolfenden recommendation marked a major departure from the long-prevailing state of affairs in which the precepts of Christian morality, especially those relating to sexual morals, were enforced by law.
T × W logic is a combination of tense and modal logic for worlds or histories with the same time order. It is the basis for logics of causation, agency and conditionals, and is therefore an important tool for philosophical logic. Semantically it has been defined by, among others, R. H. Thomason. Using an operator expressing truth in all worlds, first discussed by C. M. Di Maio and A. Zanardo, an axiomatization is given and its completeness proved via D. Gabbay's irreflexivity lemma. Given this lemma, the proof is more or less straightforward. At the end an alternative axiomatization is sketched in which Di Maio's and Zanardo's operator is replaced by a version of 'actually'.
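For orientation, the following is a minimal sketch of Thomason-style T × W semantics as it is standardly presented; the frame components and operator names below are illustrative assumptions, not the paper's own notation.

% A T x W frame is F = (T, <, W): (T, <) a strict linear order of times,
% W a set of worlds; formulas are evaluated at time-world pairs (t, w).
\[
\begin{aligned}
t,w &\Vdash G\varphi &&\iff \forall t' > t:\ t',w \Vdash \varphi && \text{(it will always be)}\\
t,w &\Vdash H\varphi &&\iff \forall t' < t:\ t',w \Vdash \varphi && \text{(it has always been)}\\
t,w &\Vdash \Box\varphi &&\iff \forall w' \in W:\ t,w' \Vdash \varphi && \text{(true in all worlds at time } t\text{)}
\end{aligned}
\]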
The authors' aim is to provide a more complete picture of a non-anthropocentric relational ethics by addressing its failure to account for environmental justice. They argue that environmental ethics is always more than how discourses are layered over place, situating moral agency in the body's affective repertoire of being-in-the-world. Empirical evidence for the argument is drawn from self-reflexive accounts of young Americans travelling to Uluru-Kata Tjuta National Park, Northern Territory, Australia, as part of a study group. These reflexive travel narratives illustrate the dilemmas that even well-prepared visitors face in negotiating the moral pathways invoked by the policy of reconciliation.
E. T. A. Hoffmann is one of the most famous representatives of early German horror literature. He was both inspired by his predecessors and an influence on the work of many of his successors, and hence on the development of the whole genre. The present article examines Hoffmann's story “Vampirismus” from the collection of short stories “Serapions Brüder”. The emphases are, on the one hand, on the mechanisms that cause readers' fear and uncertainty and, on the other, on the peculiarity of the vampire, or Nachzehrer, figure and its function in the story. Firstly, it is shown that the vampire depicted in the work is not actually a vampire but a ghoul, that is, a demon from Arab culture. This ghoul, however, has more to do with the monster outlined by Antoine Galland in his translation of the “One Thousand and One Nights” than with traditional folk beliefs. Secondly, the author concludes that Hoffmann has functionalized the Horrible: this element does not work by itself but serves the author as the background of the action, against which the characters reveal all their weaknesses and dark sides. The violation of the order ruling the represented world engages the reader and at the same time arouses his fear. The resulting excitement and general alienation are further reinforced by the figure of the Nachzehrer, which appears in place of a formerly innocent, graceful girl. The emergence of the supernatural, the final confirmation of the breach of order, is responsible for the effect just described.
This paper began as a generalization of part of the author's PhD thesis about ACFA and ended up with a characterization of groups definable in $T_A$. The thesis concerns minimal formulae of the form $x \in A \wedge \sigma(x) = f(x)$ for an algebraic curve $A$ and a dominant rational function $f: A \rightarrow \sigma(A)$. These are shown to be uniform in the Zilber trichotomy, and the pairs $(A, f)$ that fall into each of the three cases are characterized. These characterizations are definable in families. This paper covers approximately half of the thesis, namely those parts of it which can be made purely model-theoretic by moving from ACFA, the model companion of the class of algebraically closed fields with an endomorphism, to $T_A$, the model companion of the class of models of an arbitrary totally transcendental theory $T$ with an injective endomorphism, if this model companion exists. A $T_A$ analog of the characterization of groups definable in ACFA is obtained in the process. The full characterization of the cases of the Zilber trichotomy in the thesis is obtained from these intermediate results with heavy use of algebraic geometry.
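As background, here is the shape of the minimal formula studied and the standard labels of the Zilber trichotomy by which it is sorted; the informal glosses are the usual textbook ones, offered as a sketch rather than as the paper's own statement.

% The minimal formula, for an algebraic curve A and dominant rational f: A -> sigma(A):
\[
\varphi(x) \;\equiv\; x \in A \;\wedge\; \sigma(x) = f(x)
\]
% Standard Zilber trichotomy for a minimal set D:
% (1) trivial (disintegrated): algebraic closure is degenerate on D;
% (2) locally modular, non-trivial: the geometry is that of a module (group-like case);
% (3) non-locally-modular: the geometry of an algebraically closed field (field-like case).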
In this article I propose to discuss some recent theological contributions to the problem of the historicity of the Gospels, and I wish to suggest that philosophical issues may ultimately be relevant to its solution.
Inspired by Pohlers' local predicativity approach to Pure Proof Theory and Howard's ordinal analysis of bar recursion of type zero, we present a short, technically smooth and constructive strong normalization proof for Gödel's system $T$ of primitive recursive functionals of finite types by constructing an $\varepsilon_0$-recursive function $[\,\cdot\,]_0: T \rightarrow \omega$ so that $a$ reduces to $b$ implies $[a]_0 > [b]_0$. The construction of $[\,\cdot\,]_0$ is based on a careful analysis of the Howard-Schütte treatment of Gödel's $T$ and utilizes the collapsing function $\psi: \varepsilon_0 \rightarrow \omega$ which has been developed by the author for a local predicativity style proof-theoretic analysis of PA. The construction of $[\,\cdot\,]_0$ is also crucially based on ideas developed in the 1995 paper "A proof of strongly uniform termination for Gödel's T by the method of local predicativity" by the author. The results on complexity bounds for the fragments of $T$ which are obtained in this paper strengthen considerably the results of the 1995 paper. Indeed, for given $n$ let $T_n$ be the subsystem of $T$ in which the recursors have type level less than or equal to $n+2$. (By definition, case distinction functionals for every type are also contained in $T_n$.) As a corollary of the main theorem of this paper we obtain (reobtain?) optimal bounds for the $T_n$-derivation lengths in terms of $\omega_{n+2}$-descent recursive functions. The derivation lengths of type one functionals from $T_n$ (hence also their computational complexities) are classified optimally in terms of $< \omega_{n+2}$-descent recursive functions. In particular we obtain (reobtain?) that the derivation lengths function of a type one functional $a \in T_0$ is primitive recursive, thus any type one functional $a$ in $T_0$ defines a primitive recursive function. Similarly we also obtain (reobtain?) a full classification of $T_1$ in terms of multiple recursion. As proof-theoretic corollaries we reobtain the classification of the $I\Sigma_{n+1}$-provably recursive functions. Taking advantage of our finitistic and constructive treatment of the terms of Gödel's $T$ we additionally reobtain (without employing continuous cut elimination techniques) that PRA + PRWO($\varepsilon_0$) $\vdash \Pi^0_2$-Refl(PA) and PRA + PRWO($\omega_{n+2}$) $\vdash \Pi^0_2$-Refl($I\Sigma_{n+1}$), hence PRA + PRWO($\varepsilon_0$) $\vdash$ Con(PA) and PRA + PRWO($\omega_{n+2}$) $\vdash$ Con($I\Sigma_{n+1}$). For programmatic reasons we outline in the introduction a vision of how to apply a certain type of infinitary methods to questions of finitary mathematics and recursion theory. We also indicate some connections between ordinals, term rewriting, recursion theory and computational complexity.
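The main construction can be displayed compactly; the following merely restates the abstract's claim in displayed form, writing the reduction relation of $T$ as $\to_T$ (a notational assumption).

% An epsilon_0-recursive interpretation of the terms of Goedel's T into omega
% that strictly decreases along every reduction step:
\[
[\,\cdot\,]_0 : T \to \omega \quad (\varepsilon_0\text{-recursive}), \qquad
a \to_T b \;\Longrightarrow\; [a]_0 > [b]_0 .
\]
% Hence every reduction sequence starting from a term a has length at most [a]_0,
% so T is strongly normalizing; the interpretation is built using the
% collapsing function psi : epsilon_0 -> omega.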
This article gives a brief introduction to the MacArthur Competence Assessment Tool-Treatment (MacCAT-T) and critically examines its theoretical presuppositions. On the basis of empirical, methodological and ethical critique, it is argued that the cognitive bias underlying the MacCAT-T assessment needs to be modified. On the one hand, it must be admitted that the operationalisation of competence in terms of value-free categories, e.g. rational decision abilities, guarantees objectivity to a great extent; on the other hand, it poses severe problems. Firstly, the cognitive focus is itself a normative convention in the process of anthropological value-attribution. Secondly, it misses the complexity of the decision process in real life. It is therefore suggested that values, emotions and other biographical and context-specific aspects should be considered when interpreting the cognitive standards of the MacArthur model. To bridge the gap between cognitive and non-cognitive approaches, the phenomenological theory of personal constructs is briefly introduced. In conclusion, some main demands for further research towards a multi-step model of competence assessment are outlined.
Three-dimensional material models of molecules were used throughout the 19th century, functioning either as mere representations or as openings to new epistemic horizons. In this paper, two case studies are examined: the 1875 models of van ’t Hoff and the 1890 models of Sachse. What is unique in these two case studies is that both models were not only folded but also conceptualized mathematically. Viewed in light of the chemical research of the period, each of these aspects was exceptional on its own, and taken together they may be thought of as a subversion of the way molecules were chemically investigated in the 19th century. Concentrating on this shared characteristic of van ’t Hoff's and Sachse's models, this paper deals with the shifts and displacements between their operational methods and existence: between their technical and epistemological aspects and the fact that they were folded, which was forgotten or simply ignored in the subsequent development of chemistry.
The problem of free will is deeply linked with the causal relevance of mental events. The causal exclusion argument claims that, in order to be causally relevant, mental events must be identical to physical events. Gibb has recently criticized this argument, suggesting that mental events are causally relevant as double preventers: for Gibb, mental events enable physical effects to take place by preventing other mental events from preventing a behaviour from occurring. The role of mental double preventers is hence similar to what Libet calls free won't, namely the ability to veto an action initiated unconsciously by the brain. In this paper I propose an argument against Gibb's account, the causal irrelevance argument, showing that Gibb's proposal does not overcome the objection of systematic overdetermination of causal relevance: mental double preventers systematically overdetermine physical double preventers, and therefore mental events are causally irrelevant.
The implementation of Responsible Research and Innovation is not without its challenges, and one of these arises when societal desirability is included amongst the RRI principles. We will argue that societal desirability is problematic even though it appears to fit well with the overall ideal. This discord occurs partly because the idea of societal desirability is inherently ambiguous, but more importantly because its scope is unclear. This paper asks: is societal desirability in the spirit of RRI? On von Schomberg's account, it seems clear that it is, but societal desirability can easily clash with what is ethically permissible; for example, when what is desirable in a particular society is bad for the global community. If that society chose not to do what was desirable for it, the world would be better off than if it did. Yet our concern here is with a more complex situation, where there is a clash with ethical acceptability, but where the world would not be better off if the society chose not to do what was societally desirable for itself. This is the situation where it is argued that someone else will do it if we do not. The first section of the paper gives an outline of what we take technology to be, and the second discusses which criteria should be the basis for choosing research and innovation projects, drawing on the account of technology outlined in the first section. This is followed by an examination of a common argument, "If we don't do it, others will". This argument is important because it appears to justify acting in morally dubious ways. Finally, it is argued that societal desirability gives support to the "If we don't…" argument and that this raises some difficulties for RRI.
W. T. Stace's argument about realism is presented, pointing out not that realism is false but only that there is absolutely no reason to consider it true, and therefore we have no reason to believe it. This is applied to the discussion of the question: how do we know that atoms exist? Reference is made to some of the most important known scientific answers, which are, in chronological order: (i) the law of definite proportions, or Proust's law; (ii) the kinetic theory of gases; (iii) Brownian motion; and (iv) scanning tunnelling microscope images.
T. H. Morgan (1866–1945), the founder of the Drosophila research group in genetics that established the chromosome theory of Mendelian inheritance, has been described as a radical empiricist in the historical literature. His empiricism, furthermore, is supposed to have prejudiced him against certain scientific conclusions. This paper aims to show two things: first, that the sense in which the term empiricism has been used by scholars is too weak to be illuminating. It is necessary to distinguish between empiricism as an epistemological position and so-called methodological empiricism. I will argue that the way the latter has been presented cannot distinguish an empiricist methodology from a non-empiricist one. Second, I will show that T. H. Morgan was not an epistemological empiricist as this term is usually defined in philosophy. The reason is that he believed in the existence of genes as material entities when they were unobservable entities introduced to account for the phenotypic ratios found in breeding experiments. These two points, of course, are interrelated. If we were to water down the meaning of empiricism, perhaps we could call Morgan an empiricist. But then we would also fail to distinguish empiricism from realism.
This thesis explores, thematically and chronologically, the substantial concordance between the work of Martin Heidegger and T. S. Eliot. The introduction traces Eliot's ideas of the 'objective correlative' and 'situatedness' to a familiarity with German Idealism. Heidegger shared this familiarity, suggesting a reason for the similarity of their thought. Chapter one explores the 'authenticity' developed in Being and Time, as well as associated themes like temporality, the 'they' (das Man), inauthenticity, idle talk and angst, and applies them to interpreting Eliot's poem 'The Love Song of J. Alfred Prufrock'. Both texts depict a bleak Modernist view of the early twentieth-century Western human condition, characterized as a dispiriting nihilism and homelessness. Chapter two traces the chronological development of Ereignis in Heidegger's thinking, showing the term's two discernible but related meanings: first, our nature as the 'site of the open' where Being can manifest, and second, individual 'Events' of 'appropriation and revelation'. The world is always happening as 'event', but only through our appropriation by the Ereignis event can we become aware of this. Heidegger finds poetry, the essential example of language as the 'house of Being', to be the purest manifestation of Ereignis, taking as his examples Hölderlin and Rilke. A detailed analysis of Eliot's late work Four Quartets reveals how Ereignis, both as an ineluctable and an epiphanic condition of human existence, is central to his poetry, confirming, in Heidegger's words, 'what poets are for in a destitute time', namely to re-found and restore the wonder of the world and existence itself. This restoration results from what Eliot calls 'raid[s] on the inarticulate', the poet's continual striving to enact that openness to Being through which human language and the human world continually come to be. The final chapter shows how both Eliot and Heidegger value a genuine relationship with place as enabling human flourishing. Both distrust technological materialism, which destroys our sense of the world as dwelling place, and both are essentially committed to a genuinely authentic life: not the angstful authenticity of Being and Time, but a richer belonging which affirms our relationship with the earth, each other and our gods.
Background: Thrombolytic drugs to treat an acute ischaemic stroke reduce the risk of death or major disability. The treatment is, however, also associated with an increased risk of potentially fatal intracranial bleeding. This confronts the patient with the dilemma of whether or not to take a risk of a serious side effect in order to increase the likelihood of a favourable outcome. Objective: To explore acute stroke patients' perception of risk and willingness to accept risks associated with thrombolytic drug treatment. Design: Eleven patients who had been informed about thrombolytic drug treatment and had been through the process of deciding whether or not to participate in a thrombolytic drug trial went through repeated qualitative, semistructured interviews. Results: Many patients showed a limited perception of the risks connected with thrombolytic drug treatment. Some perceived the risk as not relevant to them and were reluctant to accept that treatment could cause harm. Others seemed to be aware that treatment would mean exposure to risk. The patients' willingness to take a risk also varied substantially. Several statements revealed ambiguity and confusion about being involved in a decision about treatment. The patients' reasoning about risk was put into the context of their health-related experiences and life histories. Several patients wanted the doctor to be responsible for the decisions. Conclusion: Acute stroke patients' difficulties in perceiving and processing information about risk may reduce their ability to be involved in clinical decisions where risks are involved.
Ted T. Aoki, the most prominent curriculum scholar of his generation in Canada, has influenced numerous scholars around the world. Curriculum in a New Key brings together his work, over a 30-year span, gathered here under the themes of reconceptualizing curriculum; language, culture, and curriculum; and narrative. Aoki's oeuvre is utterly unique: a complex interdisciplinary configuration of phenomenology, post-structuralism, and multiculturalism that is both theoretically and pedagogically sophisticated and speaks directly to teachers, practicing and prospective. Curriculum in a New Key: The Collected Works of Ted T. Aoki is an invaluable resource for graduate students, professors, and researchers in curriculum studies, and for students, faculty, and scholars of education generally.
Libertarianism needs a theory of class. This claim may meet with resistance among some libertarians. A few will say: “The analysis of society in terms of classes and class struggles is a specifically Marxist approach, resting on assumptions that libertarians reject. Why should we care about class?” A greater number will say: “We recognize that class theory is important, but libertarianism doesn't need such a theory, because it already has a perfectly good one.”
The CRISPR system for gene editing can break, repair, and replace targeted sections of DNA. Although CRISPR gene editing has important therapeutic potential, it raises several ethical concerns. Some bioethicists worry CRISPR is a prelude to a dystopian future, while others maintain it should not be feared because it is analogous to past biotechnologies. In the scientific literature, CRISPR is often discussed as a revolutionary technology. In this paper we unpack the framing of CRISPR as a revolutionary technology and contrast it with framing it as a value-threatening biotechnology or business-as-usual. By drawing on a comparison between CRISPR and the Ford Model T, we argue CRISPR is revolutionary as a product, process, and as a force for social change. This characterization of CRISPR offers important conceptual clarity to the existing debates surrounding CRISPR. In particular, conceptualizing CRISPR as a revolutionary technology structures regulatory goals with respect to this new technology. Revolutionary technologies have characteristic patterns of implementation, entrenchment, and social impact. As such, early identification of technologies as revolutionary may help construct more nuanced and effective ethical frameworks for public policy.
I present here a modal extension of T, called KTLM, which is, by several measures, the simplest modal extension of T yet presented. Its axiom uses only one sentence letter and has a modal depth of 2. Furthermore, KTLM can be realized as the logical union of two logics, KM and KTL, each of which has the finite model property (f.m.p.) and so is itself complete. Each of these two component logics is of independent interest as well.
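For reference, here is a minimal reminder of the base system and of the finite model property; the system T and its axiom are standard, while the specific KTLM axiom is deliberately not reproduced, since the abstract does not state it.

% Base system T (standard): the normal modal logic K plus reflexivity.
\[
\mathbf{T} \;=\; \mathbf{K} \oplus (\Box p \rightarrow p)
\]
% Finite model property: a logic L has the f.m.p. iff every formula unprovable
% in L is refuted in some finite model of L. Together with finite
% axiomatizability, the f.m.p. yields decidability, and f.m.p. with respect to
% a frame class yields completeness for that class.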
For a variety of reasons, including the common use of deception in psychology experiments, participants often disbelieve experimenters' assertions about important task parameters. This can lead researchers to conclude incorrectly that participants are behaving non-normatively. The problem can be overcome by deriving and testing normative models that do not assume full belief in key task parameters. A real experimental example is discussed.
T. Macci Plauti Menaechmi, editio altera a F. Schoell recognita. 5 M. 60. The Menaechmi of Plautus, edited on the basis of Brix's edition, by Harold North Fowler, Ph.D.
Rosemont, Jr., Henry, and Roger T. Ames, The Chinese Classic of Family Reverence: A Philosophical Translation of the Xiaojing. Reviewed by Thomas Radice, Department of History, Southern Connecticut State University, New Haven, CT 06515, USA. Dao 10(2): 259–262. DOI 10.1007/s11712-011-9215-4.
In “Why We Need Friendly AI”, Luke Muehlhauser and Nick Bostrom propose that for our species to survive the impending rise of superintelligent AIs, we need to ensure that they would be human-friendly. This discussion note offers a more natural but bleaker outlook: that in the end, if these AIs do arise, they won’t be that friendly.
Do we live in a computer simulation? I will present an argument that the results of a certain experiment constitute empirical evidence that we do not live in, at least, one type of simulation. The type of simulation ruled out is very specific. Perhaps that is the price one must pay to make any kind of Popperian progress.
The majority of papers in this special issue were presented at a conference, ‘The Advancement of Science and the Dilemma of Dual Use: Why We Can’t Afford to Fail’, held on 9–10 November 2007. The conference chairman was Andrzej Górski and its patrons were UNESCO and the President of the Polish Academy of Sciences. Three additional papers on the subject of Dual Use have been included in this issue; the authors are T. A. Cavanaugh, J. Forge and D. Koepsell.
Internalism about a person's good is roughly the view that in order for something to intrinsically enhance a person's well-being, that person must be capable of caring about that thing. I argue in this paper that internalism about a person's good should not be believed. Though many philosophers accept the view, Connie Rosati provides the most comprehensive case in favor of it. Her defense of the view consists mainly in offering five independent arguments to think that at least some form of internalism about one's good is true. But I argue that, on closer inspection, not one of these arguments succeeds. The problems don't end there, however. While Rosati offers good reasons to think that what she calls 'two-tier internalism' would be the best way to formulate the intuition behind internalism about one's good, I argue that two-tier internalism is actually false. In particular, the problem is that no substantive theory of well-being is consistent with two-tier internalism. Accordingly, there is reason to think that even the best version of internalism about one's good is in fact false. Thus, I conclude, the prospects for internalism about a person's good do not look promising.
In a recent paper, Melchior pursues a novel argumentative strategy against the sensitivity condition. His claim is that sensitivity suffers from a 'heterogeneity problem': although some higher-order beliefs are knowable, other, very similar, higher-order beliefs are insensitive and so not knowable. Similarly, the conclusions of some bootstrapping arguments are insensitive, but others are not. In reply, I show that sensitivity does not treat different higher-order beliefs differently in the way that Melchior states, and that while genuine bootstrapping arguments have insensitive conclusions, the cases that Melchior describes as sensitive 'bootstrapping' arguments don't deserve the name, since they are a perfectly good way of getting to know their conclusions. In sum, sensitivity doesn't have a heterogeneity problem.
Implicit God-like and ghost-in-the-machine metaphors underlie much current thinking about genomes. Although many criticisms of such views exist, none has succeeded in establishing a different, widely accepted view. Viewing the genome, with its protein packaging, as a brain gets rid of Gods and ghosts while plausibly integrating machine- and information-based views. While the 'wetware' of brains and genomes is very different, many fundamental principles of how they function are similar. Eukaryotic cells are compound entities, and in their case the nuclear genome might best be thought of more as a government than simply as a brain.
In this contribution, I advocate diminishing the vision of marriage as an isolated and perfectly free choice between two individuals in love, in order to unseat the extent to which students resist the view that marriage is, among other things, a social contract. I summarize views of Immanuel Kant and Claudia Card, then describe my class presentation of the social significance of marriage. I conclude that students at an individualistic and self-creating point in their lives can be under-appreciative of what their public avowals mean to others, and marriage, in one sense, is indeed public.
This volume is a state-of-the-art survey of the psychology of reasoning, based around, and in tribute to, one of the field's most eminent figures: Jonathan St B. T. Evans.
In this paper, I engage with Law's paper 'Evil Pleasure Is Good For You!' I argue that, although his criticism of hedonistic utilitarianism may hold some weight, his analysis of the goodness of pleasure is overly simplistic. I highlight some troubling results which would follow from his analysis and then outline a new account which remedies these problems. Ultimately, I distinguish between Law's 'evil pleasures' and what I call 'virtuous pleasures', and show how we can accept the goodness of virtuous pleasures without being obliged to say that evil pleasures are also good for us.
Berkley's line of reasoning about sex and pain experience suggests a completely different perspective on sex differences in human experimental, clinical, and epidemiological pain research. Although physiological mechanisms may place women at greater risk for pain, women may have found ways to dampen the effect of these mechanisms. Nevertheless, it is a challenge to extrapolate physiological mechanisms in human phenomena from outcomes observed in animal models.
We present some characterizations of the members of $\Delta^{ta}_2$, that class of the topological arithmetical hierarchy which is just large enough to include several fundamental types of sets of points in Euclidean spaces $\mathbb{R}^k$. The limit characterization serves as a basic tool in further investigations. The characterization by effective difference chains of effectively exhaustible sets yields only a hierarchy within a subfield of $\Delta^{ta}_2$. Effective difference chains of transfinite order types, consisting of complements of effectively exhaustible sets, as well as another closely related concept, yield a rich hierarchy within the whole class $\Delta^{ta}_2$. The presentation always first reports analogies between Hausdorff's difference hierarchy within the Borel class $\Delta^B_2$ and Ershov's hierarchy within the class $\Delta^0_2$ of the arithmetical hierarchy; after that the counterparts for $\Delta^{ta}_2$ are developed.
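As a reminder of the classical construction being effectivized here, the finite levels of Hausdorff's difference hierarchy can be sketched as follows; this is the standard definition, offered only as background to the analogy the abstract draws.

% Finite-level Hausdorff differences: for an increasing chain of open sets
% A_0 \subseteq A_1 \subseteq ... \subseteq A_{2k-1}, the associated difference set is
\[
D \;=\; (A_1 \setminus A_0) \,\cup\, (A_3 \setminus A_2) \,\cup\, \cdots \,\cup\, (A_{2k-1} \setminus A_{2k-2}).
\]
% Such sets and their complements stratify the Borel class $\Delta^B_2$
% (transfinite chains exhaust it); Ershov's hierarchy is the effective
% analogue within $\Delta^0_2$, built from differences of r.e. sets.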