Considerable attention has been given to the accessibility of legal documents, such as legislation and case law, in legal information retrieval (query formulation, search algorithms), in legal information dissemination practice (numerous examples of on-line access to formal sources of law), and in legal knowledge-based systems (by translating the contents of those documents into ready-to-use rule-based and case-based systems). Within AI & law, however, little effort has been made to render the contents of sources of law, and the relations among them, more accessible to those without a legal education. This article presents a theory about translating sources of law into information accessible to persons without a legal education. It illustrates the theory with two elaborated examples of such translation ventures. In the first example, formal sources of law in the domain of exchanging police information are translated into rules of thumb useful for police officers. In the second example, the goal of providing non-legal professionals with insight into legislative procedures is translated into a framework for making sources of law available through an integrated legislative calendar. Although the theory itself does not support automating the several stages described, this article gives some hints as to what such automation would have to look like.
“Esotericism” refers, more or less, to what used to be called “the occult.” It comprises such matters as astrology, alchemy, kabbalism, magic, and theosophy—to name just a few. In other words, it refers to just about everything that came to be marginalized in the modern period as “superstition” and “pseudo-science,” and anathematized by scientists and philosophers. In recent decades, there has been an explosion of scholarly interest in esotericism, partly because of research revealing that many “canonical” scientists and philosophers of the past were strongly interested in these “irrational” currents. The philosophers include Leibniz, Kant, Hegel, Schelling, and Schopenhauer; the scientists include Newton.
In this meticulous study, Wouter Hanegraaff examines the structure, themes, and development of Emanuel Swedenborg's massive work _Secrets of Heaven_, published between 1749 and 1756. Although written as a work of biblical exegesis, the book also interpolates material on Swedenborg's visionary experiences, which have long fascinated readers. In the second part of the study, Dr. Hanegraaff examines the contemporary reception of the multi-volume work, particularly the critical reactions of Immanuel Kant and Friedrich Christoph Oetinger. He finds that Swedenborg's biblical exegesis, so important in his divine calling, was largely ignored in favor of the mystical experiences.
This paper aims to contribute to our understanding of the notion of coherence by explicating in probabilistic terms, step by step, what seem to be our most basic intuitions about that notion, to wit, that coherence is a matter of hanging or fitting together, and that coherence is a matter of degree. A qualitative theory of coherence will serve as a stepping stone to formulate a set of quantitative measures of coherence, each of which seems to capture well the aforementioned intuitions. Subsequently it will be argued that one of those measures does better than the others in light of some more specific intuitions about coherence. This measure will be defended against two seemingly obvious objections.
Fictional realists maintain that fictional characters are part of the world’s ontology. In an influential article, Anthony Everett argues that the fictional realist thereby commits herself to problematic entities, among them entities that are indeterminately identical. Recently, Ross Cameron and Richard Woodward have answered Everett’s worry using the same strategy: they argue that the fictional realist can bypass the problematic identities by contending that they are merely semantically indeterminate. This paper concisely surveys Everett’s original argument and Cameron’s and Woodward’s responses, and then argues that the strategy employed by Cameron and Woodward fails to satisfactorily answer Everett’s worries.
If coherence is to have justificatory status, as some analytical philosophers think it has, it must be truth-conducive, if perhaps only under certain specific conditions. This paper is a critical discussion of some recent arguments that seek to show that under no reasonable conditions can coherence be truth-conducive. More specifically, it considers Bovens and Hartmann’s and Olsson’s “impossibility results,” which attempt to show that coherence cannot possibly be a truth-conducive property. We point to various ways in which the advocates of a coherence theory of justification may attempt to divert the threat of these results.
Traditional theory of mind (ToM) accounts of social cognition have formed the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional ToM accounts for understanding real-life social interactions has become all the more pressing. At the same time, it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of considering ToM and interactionism as mutually exclusive opponents, they should be integrated into a more comprehensive account of social cognition. We draw on dual process models of social cognition that contrast two different types of social cognitive processing. The first type (labeled Type 1) refers to processes that are fast, efficient, stimulus-driven, and relatively inflexible. The second type (labeled Type 2) refers to processes that are relatively slow, cognitively laborious, flexible, and may involve conscious control. We argue that while interactionism captures aspects of social cognition mostly related to Type 1 processes, ToM is more focused on those based on Type 2 processes. We suggest that real-life social interactions are rarely based on either Type 1 or Type 2 processes alone. On the contrary, we propose that in most cases both types of processes are simultaneously involved and that social behavior may be sustained by the interplay between these two types of processes. Finally, we discuss how the new integrative framework can guide experimental research on social interaction.
In this paper I consider whether there is a measure of coherence that could be rightly claimed to generalize the notion of logical equivalence. I show that Fitelson’s (2003) proposal to that effect encounters some serious difficulties. Furthermore, there is reason to believe that no mutual-support measure could ever be suitable for the formalization of coherence as generalized logical equivalence. Instead, it appears that the only plausible candidate for such a measure is one of relative overlap. The measure I propose in this paper is quite similar to Olsson’s (2002) proposal but differs from it by not being susceptible to the type of counterexample that Bovens and Hartmann (2003) have devised against it.
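The relative-overlap idea can be made concrete with a small sketch. The formula below is an illustrative reconstruction of an Olsson-style overlap measure, not necessarily the exact measure defended in the paper: for propositions A and B, take the ratio P(A ∧ B) / P(A ∨ B), which equals 1 precisely when the two propositions coincide probabilistically, as logically equivalent propositions do.

```python
def overlap_coherence(p_a: float, p_b: float, p_ab: float) -> float:
    """Relative overlap of two propositions: P(A & B) / P(A or B).

    Illustrative reconstruction of an Olsson-style overlap measure;
    inputs are the two marginal probabilities and the joint probability.
    """
    p_union = p_a + p_b - p_ab  # inclusion-exclusion
    return p_ab / p_union

# Logically equivalent propositions overlap completely:
assert overlap_coherence(0.25, 0.25, 0.25) == 1.0
# Independent propositions of probability 0.5 overlap only partially:
assert round(overlap_coherence(0.5, 0.5, 0.25), 4) == 0.3333
```

On such a measure, maximal coherence requires total probabilistic overlap, which is how it generalizes logical equivalence rather than mere mutual support.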
In a famous experiment by Tversky and Kahneman (Psychol Rev 90:293–315, 1983), featuring Linda the bank teller, the participants assign a higher probability to a conjunction of propositions than to one of the conjuncts, thereby seemingly committing a probabilistic fallacy. In this paper, we discuss a slightly different example featuring someone named Walter, who also happens to work at a bank, and argue that, in this example, it is rational to assign a higher probability to the conjunction of suitably chosen propositions than to one of the conjuncts. By pointing out the similarities between Tversky and Kahneman’s experiment and our example, we argue that the participants in the experiment may assign probabilities to the propositions in question in such a way that it is also rational for them to give the conjunction a higher probability than one of the conjuncts.
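The rule that participants seemingly violate is the conjunction rule: P(A ∧ B) ≤ P(B) for any propositions, since a conjunction can never be more probable than either conjunct. A minimal numerical sketch, with invented probabilities purely for illustration:

```python
# Invented illustrative probabilities, not data from the experiment.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.80   # P(feminist | bank teller), hypothetical

# Chain rule: P(teller & feminist) = P(teller) * P(feminist | teller)
p_conjunction = p_teller * p_feminist_given_teller

# Conjunction rule: the conjunction is never more probable than a conjunct,
# because the conditional probability factor can be at most 1.
assert p_conjunction <= p_teller
```

The paper's claim is then not that this arithmetic fails, but that participants may be assigning probabilities to different propositions than the experimenters assume, in which case the higher rating of the conjunction need not be irrational.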
The research project described in this report represents one of the most extensive studies of character education ever undertaken, including over 10,000 students and 255 teachers in schools across England, Scotland, Northern Ireland and Wales. Research techniques consisted of a mixture of surveys, moral dilemmas and semi-structured interviews. This report explores:
- The current situation in character education, both in the UK and internationally
- How developed British students are with respect to moral character and the extent to which they are able to understand and apply moral virtues, especially those aged 14 and 15
- How teachers in the UK understand their role in terms of students’ moral and character development
- What helps or hinders the development of children’s characters according to teachers in UK schools.
Character education considers teachers to be role models, but it is unclear what this means in practice. Do teachers model admirable character traits? And do they do so effectively? In this article the relevant pedagogical and psychological literature is reviewed in order to shed light on these questions. First, the use of role modelling as a teaching method in secondary education is assessed. Second, adolescents’ role models and their moral qualities are identified. Third, the psychology of moral learners is critically examined, using Bandura’s social learning theory as point of departure. It turns out that role modelling is rarely used as an explicit teaching method and that only a very small percentage of adolescents recognises teachers as role models. If role modelling is to contribute to children’s moral education, teachers are recommended to explain why the modelled traits are morally significant and how students can acquire these qualities for themselves.
According to moral error theory, moral discourse is error-ridden. Establishing error theory requires establishing two claims: that moral discourse carries a non-negotiable commitment to there being a moral reality, and that there is no such reality. This paper concerns the first, so-called non-negotiable commitment claim. It starts by identifying the two existing argumentative strategies for settling that claim. The standard strategy is to argue for a relation of conceptual entailment between the moral statements that comprise moral discourse and the statement that there is a moral reality. The non-standard strategy is to argue for a presupposition relation instead. Error theorists have so far failed to consider a third strategy, which uses a general entailment relation that doesn’t require intricate relations between concepts. The paper argues that both entailment claims struggle to meet a new explanatory challenge and that, since the presupposition option doesn’t, we have prima facie reason to prefer it over the entailment options. The paper then argues that suitably amending the entailment claims enables them to meet this challenge. With all three options back on the table, the paper closes by arguing that error theorists should consider developing the currently unrecognised, non-conceptual entailment claim.
Glymour’s theory of bootstrap confirmation is a purely qualitative account of confirmation; it allows us to say that the evidence confirms a given theory, but not that it confirms the theory to a certain degree. The present paper extends Glymour’s theory to a quantitative account and investigates the resulting theory in some detail. It also considers the question of how bootstrap confirmation relates to justification.
Despite the Aristotelian renaissance in the philosophy of education, the development of virtue has not received much attention. This is unfortunate, because an attempt to draft an Aristotelian model of moral development can help philosophers to evaluate the contribution Aristotelian virtue ethics can make to our understanding of moral development, provide psychologists with a potentially richer account of morality and its development, and help educators to understand the developmental phase people are in. In the article, it is argued that the Aristotelian categories of the ‘morally indifferent’, ‘un-self-controlled’, ‘self-controlled’ and ‘properly virtuous’ can be interpreted as the successive stages or levels of a comprehensive developmental model. For each stage, it will be made clear whether people are committed to virtue, whether they act virtuously or not, whether they act with pleasure or pain, and which desires and reasons they have for acting. The article closes with suggestions about what needs to be done if the proposed Aristotelian account of moral development is to become psychologically more realistic and educationally useful.
I develop a probabilistic account of coherence, and argue that at least in certain respects it is preferable to (at least some of) the main extant probabilistic accounts of coherence: (i) Igor Douven and Wouter Meijs’s account, (ii) Branden Fitelson’s account, (iii) Erik Olsson’s account, and (iv) Tomoji Shogenji’s account. Further, I relate the account to an important, but little discussed, problem for standard varieties of coherentism, viz., the “Problem of Justified Inconsistent Beliefs.”
Kurzban and colleagues carry forward an important contemporary movement in cognitive control research, tending away from resource-based models and toward a framework focusing on motivation or value. However, their specific proposal, centering on opportunity costs, appears problematic. We favor a simpler view, according to which the exertion of cognitive control carries intrinsic subjective costs.
In order to develop a model of equitable and sustainable distribution, this paper advocates integrating the ecological space paradigm and the capabilities approach. As the currency of distribution, this account proposes a hybrid of capabilities and ecological space. Although the goal of distributive justice should be to secure and promote people’s capabilities now and in the future, doing so requires acknowledging that these capabilities are dependent on the biophysical preconditions as well as inculcating the ethos of restraint. Both issues have been highlighted from the perspective of the ecological space paradigm. Concerning the scope of distributive justice, the integration can combine the advantages of the ecological space paradigm regarding the allocation of the responsibilities involved in environmental sustainability with the strength of the capabilities approach regarding people’s entitlements. The pattern of distribution starts from a capability threshold. In order to achieve this threshold, ecological space should be provided sufficiently, and the remaining ecological space budget could then be distributed according to the equal per capita principle.
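The two-step distributive pattern sketched in that abstract — first secure each person's capability threshold, then divide the remaining ecological space budget equally per capita — can be illustrated with a toy allocation (the function name, inputs, and numbers are our illustrative assumptions, not the paper's):

```python
def allocate_ecological_space(threshold_needs, total_budget):
    """Toy sketch of a two-step distributive pattern: meet each person's
    capability threshold first, then split any remaining ecological space
    budget equally per capita. Inputs are illustrative assumptions."""
    base = sum(threshold_needs)
    if base > total_budget:
        raise ValueError("budget cannot cover everyone's capability threshold")
    per_capita_surplus = (total_budget - base) / len(threshold_needs)
    return [need + per_capita_surplus for need in threshold_needs]

# Three people with different threshold needs sharing a budget of 12 units:
print(allocate_ecological_space([1, 2, 3], 12))  # -> [3.0, 4.0, 5.0]
```

Note how the threshold step is sufficientarian (unequal shares, tracking need) while the surplus step is strictly egalitarian (equal shares per capita), mirroring the hybrid the abstract describes.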
A respectful attitude towards nature necessarily also implies a certain distance. -/- ('The silence of nature') If we pretend to speak with nature's voice or on nature's behalf, we risk ventriloquising. A respectful attitude towards nature also requires acknowledging the distance between us and nature, recognizing that nature does not speak. But nature's silence does have something to say to us. In this paper, I analyze the debate in Dutch environmental philosophy between the moral realism of Wim Zweers and Wouter Achterberg and the social constructivist position of Jozef Keulartz and others. It is argued that even though social constructivists rightfully argue (against moral realism) that there exist multiple moral interpretations of the meaning of nature, this does not mean that 'nature does not exist'. The paper instead argues that a hermeneutic environmental ethic can do justice both to the fact that meanings of the environment are culturally mediated and to the moral experience that 'nature' carries moral weight.
This paper introduces and describes new protocols for proving knowledge of secrets without giving them away: if the verifier does not know the secret, he does not learn it. This can all be done while only using one-way hash functions. If the use of encryption is also allowed, these goals can be reached in a more efficient way. We extend and use the GNY authentication logic to prove correctness of these protocols.
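The paper's own protocols are not reproduced here, but the basic one-way-hash idea behind such schemes can be sketched: the prover answers a fresh challenge nonce with H(secret ‖ nonce), which a verifier who already knows the secret can recompute and check, while a verifier without it obtains only a hash digest. This is a generic sketch under the standard one-way assumption, not the GNY-verified protocol itself:

```python
import hashlib
import hmac
import os

def prove(secret: bytes, nonce: bytes) -> bytes:
    """Prover's response to a fresh challenge: H(secret || nonce)."""
    return hashlib.sha256(secret + nonce).digest()

def verify(secret: bytes, nonce: bytes, response: bytes) -> bool:
    """A verifier who knows the secret recomputes the digest and compares
    in constant time; one who doesn't learns nothing useful from it."""
    return hmac.compare_digest(prove(secret, nonce), response)

secret = b"shared-secret"
nonce = os.urandom(16)  # fresh per run, so responses cannot be replayed
assert verify(secret, nonce, prove(secret, nonce))
assert not verify(b"wrong-guess", nonce, prove(secret, nonce))
```

Using a fresh random nonce per challenge is what keeps an eavesdropper from replaying an old response; the secret itself never crosses the channel.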
In this paper, reduction and its pragmatics are discussed in light of the development in computer science of languages to describe processes. The design of higher-level description languages within computer science has had the aim of allowing for description of the dynamics of processes in the (physical) world on a higher level avoiding all (physical) details of these processes. The higher description levels developed have dramatically increased the complexity of applications that came within reach. The pragmatic attitude of a (scientific) practitioner in this area has become inherently anti-reductionist, but based on well-established reduction relations. The paper discusses how this perspective can be related to reduction in general, and to other domains where description of dynamics plays a main role, in particular, biological and cognitive domains.
A social scientific survey on visions of human/nature relationships in western Europe shows that the public clearly distinguishes not only between anthropocentrism and ecocentrism, but also between two nonanthropocentric types of thought, which may be called “partnership with nature” and “participation in nature.” In addition, the respondents distinguish a form of human/nature relationship that is allied to traditional stewardship but has a more ecocentric content, labeled here as “guardianship of nature.” Further analysis shows that the general public does not subscribe to an ethic of “mastery over nature.” Instead, practically all respondents embrace the image of guardianship, while the more radical relationships of partnership and participation also receive significant levels of adherence. The results imply that ethicists should no longer assume that the ethics of the public are merely anthropocentric. Finally, they call into question the idea of a single form of ecocentrism and favor a hermeneutic virtue ethics approach to the study of the interface between ethics and action.
Deductivism is not merely a logical technique, but also a theory of normativity: it provides an objective and universal standard of evaluation. Contemporary dialectical logic rejects deductive normativity, replacing its universal standard by an intersubjective standard. It is argued in this paper that dialectical normativity does not improve upon deductive normativity. A dialogico-rhetorical alternative is proposed.
Abstraction is instrumental for our understanding of how numbers are cognitively represented. We propose that the notion of abstraction becomes testable from within the framework of simulated cognition. We describe mental simulation as embodied, grounded, and situated cognition, and report evidence for number representation at each of these levels of abstraction.
In this study, we investigated the influence of individual learners’ motivation on the collaborative discovery learning process. In doing so, we distinguished the motivation of individual learners and attended to the composition of groups, which could be homogeneous or heterogeneous in terms of motivation. The study involved 73 dyads of 10th‐grade learners. Learners worked in dyads on separate screens in a shared discovery learning environment. They communicated using a chat box. A self‐report questionnaire was used to measure the motivational beliefs of learners. We used on‐line measures to measure communicative and discovery activities of the learners. Task value seems to be an important motivational construct with regard to the composition of dyads. The results show that the performance of a dyad consisting of a highly and a lowly motivated learner can be influenced positively by the highly motivated peer.