Recent changes in service environments have altered the preconditions of service production and consumption. These changes include the unbundling of services from production processes, the growth of the information-rich economy and society, the search for creativity in service production and consumption, and the continuing growth of digital technologies. These contextual changes affect city governments because they provide a range of infrastructure and welfare services to citizens. Concepts such as ‘smart city’, ‘intelligent city’ and ‘knowledge city’ build new horizons for cities in undertaking their challenging service functions in an increasingly cost-conscious, competitive and environmentally oriented setting. What is essential in practically all of them is that they paint a picture of cities with smooth information processes, facilitation of creativity and innovativeness, and smart and sustainable solutions promoted through service platforms. This article discusses this topic, starting from the nature of services and the new service economy as the context of smart local public services. On this basis, we build an overall framework for understanding the basic forms and dimensions of smart public services. The focus is on conceptual systematisation of the key dimensions of smart services and the conceptual modelling of smart service platforms through which digital technology is increasingly embedded in social creativity. We provide examples of real-life smart service applications within the European context.
The book studies philosophical and mathematical-logical problems of modal notions. Its starting points are possible worlds semantics and Kripke models, and it also concentrates on proof-theoretic methods.
This article explores Gerardus van der Leeuw’s view of phenomenology of religion. The phenomenological method he defended is basically a hermeneutical approach in which an observer relates personally and even existentially to the “phenomena” he studies in order to determine their essence. In his anthropology a similar way of relating to the world is discussed: the “primitive mentality” that is characterized by the “need to participate”. Both phenomenology and mentalité primitive imply a critique of modern scholarship. This fundamental criticism of the prevailing approach in the humanities including religious studies explains the growing distance between van der Leeuw and the majority of scholars of religion in the decades after his death in 1950.
In an editorial to a recent issue of Neurology, Richard Dees expresses the same criticism in an even more rigorous epistemic tone.
Purpose – This paper aims to present an overview of the various ethical, societal and critical issues that micro- and nanotechnology-based small, energy self-sufficient sensor systems raise in selected application fields. An ethical approach to the development of these technologies was taken in a very large international, multitechnological European project. The authors’ approach and methodology are presented in the paper and, based on this review, the authors propose general principles for this kind of work. Design/methodology/approach – The authors’ approach is based on extensive experience of working together in multi-disciplinary teams. Ethical issues have usually been handled in the authors’ work to some degree. In this project, the authors had the opportunity to emphasise the human view in technological development, utilise their experience from previous work and customise their approach to this particular case. In short, the authors created a wide set of application scenarios with technical and application field experts in their research project. The scenarios were evaluated with external application field experts, potential consumer users and ethics experts. Findings – Based on the authors’ experiences in this project and in previous work, the authors suggest a preliminary model for construction activity within technology development projects. The authors call this model the Human-Driven Design approach, and Ethics by Design as a more focussed sub-set of this approach. As all enabling technologies have both positive and negative usage possibilities, and so-called ethical assessment tends to focus on negative consequences, there are doubts from some stakeholders about including ethical perspectives in a technology development project.
Research limitations/implications – The authors argue that the ethical perspective would be more influential if it were to provide a more positive and constructive contribution to the development of technology. The main findings related to the ethical challenges based on the actual work done in this project were the following: the main user concerns were in relation to access to information, digital division and the necessity of all the proposed measurements; the ethics experts highlighted the main ethical issues as privacy, autonomy, user control, freedom, medicalisation and human existence. Practical implications – Various technology assessment models and ethical approaches for technological development have been developed and applied for a long time, and recently, a new approach called Responsible Research and Innovation has been introduced. The authors’ intention is to give a concrete example for further development as a part of the development of this approach. Social implications – The authors’ study in this particular case covers various consumer application possibilities for small sensor systems. The application fields studied include health, well-being, safety, sustainability and empathic user interfaces. The authors believe that the ethical challenges identified are valuable to other researchers and practitioners who are studying and developing sensor-based solutions in similar fields. Originality/value – The authors’ study covers various consumer application possibilities of small sensor systems. The studied application fields include health, well-being, safety, sustainability and empathic user interfaces. The findings are valuable to other researchers and practitioners who are studying and developing sensor-based solutions in similar fields.
The article investigates the validity of two different versions of the slippery slope argument construed in relation to human gene therapy: the empirical and the conceptual argument. The empirical version holds that our accepting somatic cell therapy will eventually cause our accepting eugenic medical goals. The conceptual version holds that we are logically committed to accepting such goals once we have accepted somatic cell therapy. It is argued that neither the empirical nor the conceptual version of the argument can provide a conclusive moral reason for banning somatic cell therapy. According to a third interpretation, referred to as the arbitrary result argument, the many apparent similarities between somatic cell therapy and eugenic-based human genetic engineering drive us to make principled choices concerning what differences and similarities between the two practices should be regarded as morally (ir)relevant. Decisions of this kind are likely to have unpredictable moral consequences. Thus formulated, the slippery slope argument has much plausibility. One objects to somatic cell therapy not so much because of what is at the bottom of the slope on which it lies, but because it is on a slope of which one does not know what is at the bottom. While the arbitrary result argument does not provide a conclusive reason for prohibiting human gene therapy, it reminds us of something important: when making bioethical decisions, we should be as specific and as consistent as possible about our basic moral and medical concepts.
Constructions of Intersubjectivity shows that the meaning of grammatical constructions often has more to do with the human cognitive capacity for taking other people's points of view than with describing the world. Treating pragmatics, semantics, and syntax in parallel and integrating insights from linguistics, psychology, and animal communication, Arie Verhagen develops a new understanding of linguistic communication. In doing so he shows the continuity between language and animal communication and reveals the nature of human linguistic specialization. Professor Verhagen uses Dutch and English data from a wide variety of sources and considers the contributions of grammar to the coherence of discourse. He argues that important problems in semantics and syntax may be resolved if language is understood as an instrument for exerting influence and coordinating different perspectives. The grammatical phenomena he discusses include negative expressions, the let alone construction, complementation constructions, and discourse connectives. This powerfully argued and original explanation of the nature and operation of communication will interest a wide range of scholars and advanced students in linguistics, cognitive science, and human evolution.
Many believe that the ethical problems of donation after cardiocirculatory death (DCD) have been "worked out" and that it is unclear why DCD should be resisted. In this paper we will argue that DCD donors may not yet be dead, and therefore that organ donation during DCD may violate the dead donor rule. We first present a description of the process of DCD and the standard ethical rationale for the practice. We then present our concerns with DCD, including the following: irreversibility of absent circulation has not occurred, and the many attempts to claim it has have all failed; conflicts of interest at all steps in the DCD process, including the decision to withdraw life support before DCD, are simply unavoidable; potentially harmful premortem interventions to preserve organ utility are not justifiable, even with the help of the principle of double effect; claims that DCD conforms with the intent of the law and current accepted medical standards are misleading and inaccurate; and consensus statements by respected medical groups do not change these arguments due to their low quality, including being plagued by conflicts of interest. Moreover, some arguments in favor of DCD, while likely true, are "straw-man arguments," such as the great benefit of organ donation. The truth is that honesty and trustworthiness require that we face these problems instead of avoiding them. We believe that DCD is not ethically allowable because it abandons the dead donor rule, has unavoidable conflicts of interest, and implements premortem interventions which can hasten death. These important points have not been, but need to be, fully disclosed to the public and incorporated into fully informed consent. These are tall orders, and require open public debate. Until this debate occurs, we call for a moratorium on the practice of DCD.
Ariès traces Western man's attitudes toward mortality from the early medieval conception of death as the familiar collective destiny of the human race to the modern tendency, so pronounced in industrial societies, to hide death as if it were an embarrassing family secret.
One way to obtain a comprehensive semantics for various systems of modal logic is to use a general notion of non-normal world. In the present article, a general notion of modal system is considered together with a semantic framework provided by such a general notion of non-normal world. Methodologically, the main purpose of this paper is to provide a logical framework for the study of various modalities, notably propositional attitudes. Some specific systems are studied together with semantics using non-normal worlds of different kinds.
Most public discussion has focused on those effects of genetic research that are considered in some way unwanted or unpleasant. For example, there has been much debate concerning the risks and the ethical appropriateness of genetic screening, gene therapy, and agricultural applications based on genetic techniques. It is often claimed that genetic research may cause new problems such as genetic discrimination, stigmatization, environmental risks, or mistreatment of animals. Genes and Morality: New Essays adopts a critical attitude toward genetic research, on both a theoretical and a practical level. It presents some of the most important problems in the ethics of genetic engineering, including the questions of genetic health and disease, genetic testing, responsibility for health, patenting non-human and human life, and problems related to the disclosure of genetic information. The aim of the book is to focus on real ethical and conceptual issues. Consider, for instance, the concept of genetic disease. As one of the contributors, Ingmar Pörn, writes, "fear of genetic disease, or anxiety, is not itself a disease any more than fear of becoming unemployed is a disease. Alleviating such emotions is not a medical task to be discharged by drug therapy." The book also examines the philosophical foundations of these issues by discussing the most influential bioethical theories of today, including utilitarianism and principlism.
Brain death is accepted in most countries as death. The rationales to explain why brain death is death are surprisingly problematic. The standard rationale that in brain death there has been loss of integrative unity of the organism has been shown to be false, and a better rationale has not been clearly articulated. Recent expert defences of the brain death concept are examined in this paper, and are suggested to be inadequate. I argue that, ironically, these defences demonstrate the lack of a defensible rationale for why brain death should be accepted as death itself. If brain death is death, a conceptual rationale for brain death being equivalent to death should be clarified, and this should be done urgently.
Some people claim that evolution is "just a theory". Do you know what a scientific theory really is? Just a Theory is an overview of the modern concepts of science. A clear understanding of the nature of science will enable you to distinguish science from pseudoscience (which illegitimately wraps itself in the mantle of science), and real social issues in science from the caricatures portrayed in postmodernist critiques. Prof. Ben-Ari's style is light (even humorous) and easy to read, bringing the latest concepts of science to the general reader. Of particular interest is his analysis of the terminology of science (fact, law, proof, theory) in relation to the colloquial meaning of these terms. Between chapters are biographical vignettes of scientists — both familiar and unfamiliar — showing their common commitment to the enterprise of science, together with a diversity of backgrounds and personalities. This accessible, informative, and comprehensive work will give lay readers a good grasp of real science.
We describe Peirce’s 1903 system of modal gamma graphs, its transformation rules of inference, and the interpretation of the broken-cut modal operator. We show that Peirce proposed the normality rule in his gamma system. We then show how various normal modal logics arise from Peirce’s assumptions concerning the broken-cut notation. By developing an algebraic semantics we establish the completeness of fifteen modal logics of gamma graphs. We show that, besides logical necessity and possibility, Peirce proposed an epistemic interpretation of the broken-cut modality, and that he was led to analyze constructions of knowledge in the style of epistemic logic.
This article calls into question the charge that frequentist testing is susceptible to the base-rate fallacy. It is argued that the apparent similarity between examples like the Harvard Medical School test and frequentist testing is highly misleading. A closer scrutiny reveals that such examples have none of the basic features of a proper frequentist test, such as legitimate data, hypotheses, test statistics, and sampling distributions. Indeed, the relevant error probabilities are replaced with the false positive/negative rates that constitute deductive calculations based on known probabilities among events. As a result, the ampliative dimension of frequentist induction—learning from data about the underlying data-generating mechanism—is missing.
We present a dynamic approach to Peirce’s original construal of abductive logic as a logic of conjecture making, and provide a new decidable, contraction-free and cut-free proof system for the dynamic logic of abductive inferences with neighborhood semantics. Our formulation of the dynamic logic of abduction follows the philosophical and scientific track that led Peirce to his late, post-1903 characterization of abductive conclusions as investigands, namely invitations to investigate propositions conjectured at the level of pre-beliefs.
Within the space of a few years, the idea of Responsible Research and Innovation, and its acronym RRI, catapulted from an obscure phrase to the topic of conferences and attempts to specify and realize it. How did this come about, and against which backdrop? What are the dynamics at present, and what do these imply for the future of RRI as a discourse, and as a patchwork of practices? RRI is a social innovation which creates openings in existing divisions of moral labour, a notion that is explained with the help of the history of responsibility language. The notion is then applied to the present situation and ongoing developments. Some elements may stabilize, and this creates a path into the future. There will be reductions of the originally open-ended innovation, some productive, others less so. This is a reason to regularly inquire into the value of the reductions and the directions the path is taking.
This paper aims to give a substantive account of how Feynman used diagrams in the first lectures in which he explained his new approach to quantum electrodynamics. By critically examining unpublished lecture notes, Feynman’s use and interpretation of both "Feynman diagrams" and other visual representations will be illuminated. This paper discusses how the morphology of Feynman’s early diagrams was determined by both highly contextual issues, which molded his images to local needs and particular physical characterizations, and an overarching common diagrammatic style, which facilitated Feynman’s movement between different diagrams despite their divergent forms and significance.
This remarkable book--the fruit of almost two decades of study--traces in compelling fashion the changes in Western attitudes toward death and dying from the earliest Christian times to the present day. A truly landmark study, The Hour of Our Death reveals a pattern of gradually developing evolutionary stages in our perceptions of life in relation to death, each stage representing a virtual redefinition of human nature. Starting at the very foundations of Western culture, the eminent historian Philippe Ariès shows how, from Graeco-Roman times through the first ten centuries of the Common Era, death was too common to be frightening; each life was quietly subordinated to the community, which paid its respects and then moved on. Ariès identifies the first major shift in attitude with the turn of the eleventh century when a sense of individuality began to rise and with it, profound consequences: death no longer meant merely the weakening of community, but rather the destruction of self. Hence the growing fear of the afterlife, new conceptions of the Last Judgment, and the first attempts (by Masses and other rituals) to guarantee a better life in the next world. In the 1500s attention shifted from the demise of the self to that of the loved one (as family supplants community), and by the nineteenth century death comes to be viewed as simply a staging post toward reunion in the hereafter. Finally, Ariès shows why death has become such an unendurable truth in our own century--how it has been nearly banished from our daily lives--and points out what may be done to "re-tame" this secret terror. The richness of Ariès's source material and investigative work is breathtaking.
While exploring everything from churches, religious rituals, and graveyards (with their often macabre headstones and monuments), to wills and testaments, love letters, literature, paintings, diaries, town plans, crime and sanitation reports, and grave robbing complaints, Ariès ranges across Europe to Russia on the one hand and to England and America on the other. As he sorts out the tangled mysteries of our accumulated terrors and beliefs, we come to understand the history--indeed the pathology--of our intellectual and psychological tensions in the face of death.
An ℵ1-Souslin tree is a complicated combinatorial object whose existence cannot be decided on the grounds of ZFC alone. But fifteen years after Tennenbaum and Jech independently devised notions of forcing for introducing such a tree, Shelah proved that already the simplest forcing notion—Cohen forcing—adds an ℵ1-Souslin tree. In this article, we identify a rather large class of notions of forcing that, assuming a GCH-type hypothesis, add a λ+-Souslin tree. This class includes Prikry, Magidor, and Radin forcing.
The main aim of this paper is to revisit the curve fitting problem using the reliability of inductive inference as a primary criterion for the ‘fittest' curve. Viewed from this perspective, it is argued that a crucial concern with the current framework for addressing the curve fitting problem is, on the one hand, the undue influence of the mathematical approximation perspective, and on the other, the insufficient attention paid to the statistical modeling aspects of the problem. Using goodness-of-fit as the primary criterion for ‘best', the mathematical approximation perspective undermines the reliability of inference objective by giving rise to selection rules which pay insufficient attention to ‘accounting for the regularities in the data'. A more appropriate framework is offered by the error-statistical approach, where (i) statistical adequacy provides the criterion for assessing when a curve captures the regularities in the data adequately, and (ii) the relevant error probabilities can be used to assess the reliability of inductive inference. Broadly speaking, the fittest curve (statistically adequate) is not determined by the smallness of its residuals, tempered by simplicity or other pragmatic criteria, but by the nonsystematic (e.g. white noise) nature of its residuals. The advocated error-statistical arguments are illustrated by comparing the Kepler and Ptolemaic models on empirical grounds.
This article investigates Charles Peirce’s development of logical calculi for classical propositional logic in 1880–1896. Peirce’s 1880 work on the algebra of logic resulted in a successful calculus for Boolean algebra. This calculus, denoted by PC, is here presented as a sequent calculus and not as a natural deduction system. It is shown that Peirce’s aim was to present PC as a sequent calculus. The law of distributivity, which Peirce states in 1880, is proved using Peirce’s Rule, which is a residuation, in PC. The transitional systems of the algebra of the copula that Peirce develops since 1880 paved the way to the 1896 graphical system of the alpha graphs. It is shown how the rules of the alpha system reinterpret Boolean algebras, answering Peirce’s statement that logical graphs supply a new system of fundamental assumptions to logical algebra. A proof-theoretic analysis is given for the connection between PC and the alpha system.
This article explores the various uses or – according to some authors, such as the sociologist James Beckford – misuses of the term ‘postsecular’. The variations in its use are indeed so broad that the question is justified whether the terminology as such has much analytical value. The prominence of the ‘postsecular’ in present-day debates in my view primarily indicates the inability among scholars, intellectuals and religious interest groups to come to grips with what – for some at least – is an unexpected presence and resurgence of religion in the public domains of presumably secular societies. The work of the cultural anthropologist Talal Asad shows that the secular does not preclude the religious. All kinds of religious arguments, organizations, and agents are very much present in modern ‘secular’ societies. From this perspective, the emergence of the ‘postsecular’ refers to very real phenomena, most importantly the intertwinement of the secular and the religious. For instance, religious actors do not accept the barriers of secular society and claim a role for religion in public and secular arenas. This insight could be one of the most important driving forces behind the popularity of the term ‘postsecular’ in recent years.
The notion of bilattice was introduced by Ginsberg, and further examined by Fitting, as a general framework for many applications. In the present paper we develop proof systems, which correspond to bilattices in an essential way. For this goal we introduce the notion of logical bilattices. We also show how they can be used for efficient inferences from possibly inconsistent data. For this we incorporate certain ideas of Kifer and Lozinskii, which happen to suit well the context of our work. The outcome is a family of paraconsistent logics with many desirable properties.
There might not be a specific nano-ethics, but there definitely is an ethics of new & emerging science and technology (NEST), with characteristic tropes and patterns of moral argumentation. Ethical discussion in and around nanoscience and technology reflects such NEST-ethics. We offer an inventory of the arguments, and show patterns in their evolution, in arenas full of proponents and opponents. We also show that there are some nano-specific issues: in how size matters, and when agency is delegated to smart devices. Our overall approach is a pragmatist ethics, and we conclude that struggle (and learning) might be more productive than models emphasizing consensus.
It is of common use in modern Venn diagrams to mark a compartment with a cross to express its non-emptiness. Modern scholars seem to derive this convention from Charles S. Peirce, with the assumption that it was unknown to John Venn. This paper demonstrates that Venn actually introduced several methods to represent existentials but felt uneasy with them. The resistance to formalize existentials was not limited to diagrammatic systems, as George Boole and his followers also failed to provide a satisfactory symbolic representation for them. This difficulty points out issues that are inherent to the very nature of existentials. This paper assesses the various methods designed for the representation of existential statements with Venn diagrams. First, Venn’s own attempts are discussed and compared with other solutions proposed by his contemporaries and successors, notably Lewis Carroll and Peirce. Since disjunctives hold an important role in an effective representation of existentials, their representation is also discussed. Finally, recent methods for the diagrammatic representation of existing individuals, rather than mere existence, are surveyed.
The Technology Assessment (TA) Program established in 2003 as part of the Dutch R&D consortium NanoNed is interesting for what it did, but also as an indication that there are changes in how new science and technology are pursued: the nanotechnologists felt it necessary to spend part of their funding on social aspects of nanotechnology. We retrace the history of the TA program, and present the innovative work that was done on Constructive TA of emerging nanotechnology developments and on aspects of embedding of nanotechnology in society. One achievement is the provision of tools and approaches to help make the co-evolution of technology and society more reflexive. We briefly look forward by outlining its successor program, TA NanoNextNL, in place since 2011.
The Vatican recently published directives regarding “beginning of life” issues that explain the Catholic Church's position regarding new technologies in this area. We think that it is important to develop a response that presents the traditional Orthodox Jewish position on these same issues in order to present an alternative, parallel system. There are many points of commonality between the Vatican document and traditional Jewish thought as well as several important issues where there is a divergence of opinion. The latter include the status of the zygote as produced during in vitro fertilization, the acceptability of procreation by a method other than the conjugal act, and the permissibility of deriving benefit from the products of an illicit act. These points of agreement and disagreement are discussed in detail in this article.
This paper first develops a theoretically motivated view of narrative as a special form of inferential, cooperative human communication, of the role that the past tense plays in the intersubjective coordination of narrators and readers, viz. that of ‘curtailing’ the immediate argumentative applicability of the represented situation, and of its relation to viewpoint management. In three case studies, it is subsequently shown how this helps to elucidate certain effects of present and past tense alternations in stories. While these effects are multi-faceted and highly text-specific, there is a common denominator of the use of the past tense in the dimension of narrator-reader communication in the narratives. The analysis supports an independently motivated conception of intersubjectivity that assigns a special status to ‘coordination with other minds’, apart from senders and addressees.
Rayleigh and Ramsay discovered the inert gas argon in the atmospheric air in 1895 using a carefully designed sequence of experiments guided by an informal statistical analysis of the resulting data. The primary objective of this article is to revisit this remarkable historical episode in order to make a case that the error‐statistical perspective can be used to bring out and systematize (not to reconstruct) these scientists' resourceful ways and strategies for detecting and eliminating error, as well as dealing with Duhemian ambiguities and underdetermination problems as they arose in the context of their local research settings.
This paper presents an enrichment of the Gabbay–Woods schema of Peirce’s 1903 logical form of abduction with illocutionary acts, drawing from logic for pragmatics and its resources to model justified assertions. It analyses the enriched schema and puts it into the perspective of Peirce’s logic and philosophy.
This article seeks to contribute to the challenge of presenting the silenced voices of excluded groups in society by means of a philosophic community of inquiry composed primarily of children and young adults. It proposes a theoretical model named ‘enabling identity’ that presents the stages whereby, under the guiding role played by the community of philosophic inquiry, the hegemonic meta-narrative of the mainstream society makes room for the identity of members of marginalised groups. The model is based on the recognition of diverse narratives within a web of communal narratives that does not favour the meta-narrative. It reports on the experiences of moderators and students from weak and excluded sectors of society in two countries whose participation in communities of philosophical inquiry gave them not only a “voice” but also a presence and identity.
Peirce considered the principal business of logic to be the analysis of reasoning. He argued that the diagrammatic system of Existential Graphs, which he had invented in 1896, carries the logical analysis of reasoning to the furthest point possible. The present paper investigates the analytic virtues of the Alpha part of the system, which corresponds to the sentential calculus. We examine Peirce’s proposal that the relation of illation is the primitive relation of logic and defend the view that this idea constitutes the fundamental motive of philosophy of notation both in algebraic and graphical logic. We explain how in his algebras and graphs Peirce arrived at a unifying notation for logical constants that represent both truth-function and scope. Finally, we show that Shin’s argument for multiple readings of Alpha graphs is circular.
The dominant approach to the public sphere is characterized by idealism and normativism. It overemphasizes civic-minded or civil discourse, envisions unrealistically egalitarian and widespread participation, has difficulty dealing with consequential public events, and neglects the spatial core of the public sphere and the effects of visibility. I propose a semiotic theory that approaches the public sphere through general sensory access. This approach enables a superior understanding of all public events, discursive or otherwise. It also captures the dialectical relationship between the public sphere and politics by (1) specifying the mechanisms through which visibility and publicity become resources or constraints for political actors, (2) explaining the political regulation of visibility, (3) showing the central role that struggles over the contents of public spaces play in political conflict, and (4) analyzing the links among social structure, social norms, and political action in the transformation of the public sphere.
Sociolinguistic research shows that listeners' expectations of speakers influence their interpretation of the speech, yet this is often ignored in cognitive models of language comprehension. Here, we focus on the case of interactions between native and non-native speakers. Previous literature shows that listeners process the language of non-native speakers in less detail, because they expect them to have lower linguistic competence. We show that processing the language of non-native speakers increases lexical competition and access in general, not only of the non-native speaker's speech, and that this leads to poorer memory of one's own speech during the interaction. We further find that the degree to which people adjust their processing to non-native speakers is related to the degree to which they adjust their speech to them. We discuss implications for cognitive models of language processing and sociolinguistic research on attitudes.
Charles Peirce’s alpha system is reformulated into a deep inference system in which the rules are given in terms of deep graphical structures and each rule has its symmetrical rule in the system. The proof analysis of the reformulated system is given in terms of two embedding theorems: the system and Brünnler’s deep inference system for classical propositional logic can be embedded into each other; and the system and the Gentzen sequent calculus can be embedded into each other.