The contribution begins by outlining the evolution of the scholarly production flow from the print-based paradigm to the digital age, and in this context it explores the opposition between digital and analog modes of representation. It then elaborates on the triple paradigm shift caused by genuinely digital publishing and its specific consequences for the social sciences and humanities (SSH), which in turn re-constitute basic scholarly notions such as “text” and “document”. The paper concludes by discussing the specific value that could be added by systematically using digital text resources as a basis for scholarly work, and states some of the necessary conditions for such a “digital turn” to succeed in the SSH.
From 1925 to 1928, the Berlin publishing house J. M. Spaeth, under the direction of Hans Rosenkranz, published a series of works by authors who were rather unknown at the time but who are, in retrospect, significant figures of the interwar period. This contribution discusses Rosenkranz as a young publisher and admirer of Stefan Zweig. Drawing on archival records, it offers a new perspective on the history of the firm and comments on the literary programme associated with it: Which important publishing projects were undertaken in that short period? What role did Stefan Zweig play in bringing about some of the titles, particularly in the final weeks of the publisher's existence? To what extent can the programme and the economic development of J. M. Spaeth be understood as paradigmatic for Jewish publishing houses in the Weimar Republic? To this end, the firm's failure during the “Bücherkrise” (book crisis) at the end of the 1920s is reconstructed from the sources for the first time.
One of Ludwik Fleck’s ideas about the development of scientific knowledge is that—once a system of interpretation is in place—the process that follows can be characterised as one of inertia: any new evidence comes under strong pressure to be incorporated into the established frame. This can result in what Fleck called a “harmony of illusions”, in which contradictory evidence becomes almost invisible or is incorporated into the established frame only with great effort. The paper analyses early explanations of the tuberculin reaction as a case study of Fleck’s argument. For Robert Koch, who had presented tuberculin in 1890, the compound was supposed to be a diagnostic tool and a cure for tuberculosis. His conception of its effect was rather peculiar, but strictly in line with ideas on the pathogenesis of infectious diseases he had developed much earlier. After tuberculin was released in late 1890, whether Koch’s conception was convincing depended on the place that a given observer had in the medical world of late-nineteenth-century Germany. Inside Koch’s group, the status of the tuberculin reaction remained stable and tuberculin retained its value as a diagnostic and curative tool. Observers from outside that thought collective, however, and in particular from clinical medicine, soon pointed to flaws in its conception. These critics developed a rather different picture of tuberculin as a mysterious and dangerous drug. No reconciliation followed, and what we find instead in German medicine around the year 1900 is the presence of rather contradictory concepts and practices surrounding Koch’s wonder cure.
This imaginative and unusual book explores the moral sensibilities and cultural assumptions that were at the heart of political debate in Victorian and early twentieth-century Britain. It focuses on the role of intellectuals as public moralists, and suggests ways in which their more formal political theory rested upon habits of response and evaluation that were deeply embedded in wider social attitudes and aesthetic judgements. Stefan Collini examines the characteristic idioms and strategies of argument employed in periodical and polemical writing, and reconstructs the sense of identity and of relation to an audience exhibited by social critics from John Stuart Mill and Matthew Arnold to J. M. Keynes and F. R. Leavis. Dr Collini begins by situating the leading intellectuals in the social and political world of the Victorian governing classes. He explores fundamental values like `altruism', `character', and `manliness', which are revealed as the animating dynamic of much of the political thought of the period. The book assesses the impact of increasing academic specialization across a range of disciplines, and offers an illuminating analysis of the public voice of legal theorists like Maine and Dicey. Through a detailed study of J.S. Mill's posthumous reputation Dr Collini uncovers the process by which the genealogy of images of national cultural identity is established; and he concludes with a provocative exploration of the nationalist significance of what he calls `the Whig interpretation of English literature'. Public Moralists is a subtle and illuminating study by a leading intellectual historian which will redirect debate about the distinctive development of modern English culture.
A critical pathway for conceptual innovation in the social sciences is the construction of theoretical ideas based on empirical data. Grounded theory has become a leading approach promising the construction of novel theories. Yet grounded theory-based theoretical innovation has been scarce, in part because of its commitment to letting theories emerge inductively rather than imposing analytic frameworks a priori. We note, along with a long philosophical tradition, that induction does not logically lead to novel theoretical insights. Drawing from the theory of inference, meaning, and action of the pragmatist philosopher Charles S. Peirce, we argue that abduction, rather than induction, should be the guiding principle of empirically based theory construction. Abduction refers to a creative inferential process aimed at producing new hypotheses and theories based on surprising research evidence. We propose that abductive analysis arises from actors' social and intellectual positions but can be further aided by careful methodological data analysis. We outline how formal methodological steps enrich abductive analysis through the processes of revisiting, defamiliarization, and alternative casing.
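For orientation, Peirce's canonical schema for abduction (Collected Papers 5.189), which the notion of abductive analysis builds on, runs:

    The surprising fact, C, is observed;
    But if A were true, C would be a matter of course;
    Hence, there is reason to suspect that A is true.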
Starting from an assessment of how far Robert Koch's bacteriology had developed by the late 1880s, this paper attempts to analyse different aspects of the process that led to the foundation of the Berlin Institute for Infectious Diseases in 1891. With the development of his supposed cure for tuberculosis, tuberculin, Koch attempted to give his research a new direction, earn a fortune from the profits, and become more independent of the Prussian government officials who, up to that point, had had a major influence on his career. In the period following the presentation of the cure in autumn 1890, however, it became clear that tuberculin's value in treatment was at best dubious. Thus, the failure of tuberculin meant that Koch had to drop his own plans and accommodate those of the Prussian Ministry of Culture. As a result he assumed directorship of the newly founded Institute for Infectious Diseases in Berlin. Even though this was definitely a prestigious position, it reaffirmed Koch's dependency on Prussian government officials and was by no means the kind of institution he had aimed for at the outset.
According to Hempel’s influential theory of explanation, explaining why some a is G consists in showing that the truth that a is G follows from a law-like generalization to the effect that all Fs are G together with the initial condition that a is F. While Hempel’s overall account is now widely considered to be deeply flawed, the idea that some generalizations play the explanatory role that the account predicts is still often endorsed by contemporary philosophers of science. This idea, however, conflicts with widely shared views in metaphysics according to which the generalization that all Fs are G is partially explained by the fact that a is G. I discuss two solutions to this conflict that have been proposed recently, argue that they are unsatisfactory, and offer an alternative.
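To fix ideas, Hempel's deductive-nomological schema can be displayed in the abstract's own F/G notation; this is the standard textbook rendering, not a quotation from the paper:

```latex
% Hempel's deductive-nomological schema: the explanandum is deduced
% from a law-like generalization together with an initial condition.
\[
  \underbrace{\forall x\,(Fx \rightarrow Gx)}_{\text{law-like generalization}}, \qquad
  \underbrace{Fa}_{\text{initial condition}}
  \;\;\therefore\;\; \underbrace{Ga}_{\text{explanandum}}
\]
```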
In the past few decades, the growth of ethical consumerism has led brands to increasingly develop conscientiousness and depict an ethical image at the corporate level. However, most research studying business ethics in the field of corporate brand management is either conceptual or has been conducted empirically in goods/products contexts. This is surprising because corporate brands are more relevant in services contexts, given the distinct nature of services and the key role that employees play in the services sector. Accordingly, this article aims to empirically examine the effects of customer perceived ethicality in the context of corporate services brands. Based on data collected for eight service categories using a panel of 2179 customers, the hypothesized structural model is tested using path analysis. The results show that, in addition to a direct effect, customer perceived ethicality has a positive and indirect effect on customer loyalty, through the mediators of customer affective commitment and customer perceived quality. Further, employee empathy positively influences the impact of customer perceived ethicality on customer affective commitment, and customer loyalty positively impacts customer positive word-of-mouth. The first implication of these results is that corporate brand strategy needs to be aligned with human resources policies and practices if brands want to turn ethical strategies into employee behavior. Second, corporate brands should build more authentic communications grounded in their ethical beliefs and supported by evidence from actual employees.
This article analyzes German debates on the microbiology of infectious diseases from 1865 to 1875 and asks how and when organic pollution in tissues became noteworthy for aetiology and pathogenesis. It was with Ernst Hallier's pleomorphistic microbiology that the organic character of alien material in tissues came to be regarded as important for pathology. The process that followed saw both vigorous biological critique and a number of medical applications of Hallier's work. Around 1874 contemporaries reached the conclusion that pleomorphous vegetation was most likely of little importance, if not accidental, in relation to the aetiology of infectious diseases, whereas the idea of monomorphous micro-organisms facilitated a causal explanation. It was only then that notions such as pure cultures, bacterial specificity, etc., favored by Ferdinand Julius Cohn and his school, became popular in medical circles.
Upon entering clinical medicine in the 1940s, antibiotic therapy seemed to complete a transformation of hospitals that had originated in the late nineteenth century. Former death sinks had become harbingers of therapeutic progress. Yet this triumph was short-lived. The arrival of pathologies caused by resistant bacteria, and of nosocomial infections whose spread was helped by antibiotic therapies, seemed to be intimately related to modern anti-infective therapy. The place where such problems culminated was the hospital, which increasingly appeared as a dangerous environment where attempts to combat infectious diseases had instead created hothouses of disease evolution. This paper focuses on one aspect of that history: clinical medicine, and hospital hygiene in particular, came to pay attention to a dimension of infectious disease it had previously largely neglected. The evolution of infectious disease, previously a matter of mostly theoretical interest, came to be useful in explaining many observed phenomena. This did not turn hospital hygienists into geneticists, though it did give them an awareness that the evolution of infectious disease in a broad sense was something that mattered to them. The paper advances its argument by looking at three phases: the growing awareness of the hospital as a dangerous environment in the 1950s, the comprehensive attempts at improving antibiotic therapy and hospital hygiene that followed from the 1960s, and lastly the framing of such challenges as risk factors from the 1970s. In conclusion, I argue that hospital hygiene, inspired in particular by epidemiology and risk factor analysis, discussed its own specific version of disease emergence and thereby contributed to the 1980s debates around such topics. Loosely connected to more specialized studies, this consisted of a re-interpretation of infectious disease centred around the temporality of such phenomena as they were encountered in the day-to-day dealings of clinical wards.
One of the common characteristics of science, technology, and medicine is their ambition to move epistemologically and organizationally beyond the confines of nation states. In practice, however, they develop differently in different countries and regions. Scientists, engineers, and physicians are constrained as well as enabled by national boundaries and specific cultures. The cultural status of such practices is in turn influenced by a country's history, politics, and view of the role of science, technology, and medicine in society. It is the relation between a specific region, Scandinavia, and the history of science, technology, and medicine within this region that this issue of Science in Context sets out to explore. But what is this “Scandinavia”? To many, Scandinavia, besides being a specific geographical region of three countries (Denmark, Sweden, and Norway) with entwined histories and closely related languages, is a way of denoting a specific style or movement. “Scandinavian design” is renowned for three interrelated features: minimalism or simplicity, functionalism, and “design to the people”, i.e. functional products for the average citizen (Beer 1975; Glambek 1997; Fallan 2012).
Need considerations play an important role in empirically informed theories of distributive justice. We propose a concept of need-based justice that is related to social participation and provide an ethical measurement of need-based justice. The β-ε-index satisfies the need-principle, monotonicity, sensitivity, transfer and several »technical« axioms. A numerical example is given.
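The abstract does not reproduce the definition of the authors' β-ε-index, but a purely hypothetical index of the same general kind may help fix intuitions. The shortfall-based toy index below is an illustration of my own, not the authors' measure:

```latex
% Purely illustrative, NOT the authors' beta-epsilon-index: a toy
% need-based index aggregating relative shortfalls. For persons
% i = 1,...,N with needs n_i and endowments x_i:
\[
  s_i = \max(0,\, n_i - x_i), \qquad
  J = 1 - \frac{1}{N}\sum_{i=1}^{N} \frac{s_i}{n_i}
\]
% Numerical example: n = (10, 10) and x = (10, 5) give s = (0, 5),
% hence J = 1 - (0 + 1/2)/2 = 0.75.
```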
How ought you to evaluate your options if you’re uncertain about which axiology is true? One prominent response is Expected Moral Value Maximisation (EMVM), the view that under axiological uncertainty, an option is better than another if and only if it has the greater expected moral value across axiologies. EMVM raises two fundamental questions. First, there’s a question about what it should even mean. In particular, it presupposes that we can compare moral value across axiologies. So to even understand EMVM, we need to explain what it is for such comparisons to hold. Second, assuming that we understand it, there’s a question about whether EMVM is true. Since there are many plausible rivals, we need an argument to defend it. In this paper, I’ll introduce a representation theorem for axiological uncertainty to answer these two questions. Roughly, the theorem shows that if all our axiologies satisfy the von Neumann–Morgenstern axioms, and if the facts about which options are better than which in light of your uncertainty also satisfy these axioms as well as a Pareto condition, then these facts have a relevantly unique expected utility representation. If I’m right, this theorem at once affords us a compelling way to understand EMVM—and specifically intertheoretic comparisons—and a systematic argument for its truth.
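In symbols, EMVM is standardly rendered as follows; the notation is mine, and it presupposes exactly the intertheoretic comparability of the value functions that the paper sets out to explain:

```latex
% A_1,...,A_n are the axiologies you have credence in, c_i your
% credence in A_i, and V_i(o) the moral value of option o according
% to A_i (assumed intertheoretically comparable).
\[
  \mathrm{EMV}(o) = \sum_{i=1}^{n} c_i \, V_i(o),
  \qquad
  o \succ o' \iff \mathrm{EMV}(o) > \mathrm{EMV}(o')
\]
```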
I argue that difference-making should be a crucial element for evaluating the quality of evidence for mechanisms, especially with respect to the robustness of mechanisms, and that it should take centre stage when it comes to the general role played by mechanisms in establishing causal claims in medicine. The difference-making of mechanisms should provide additional compelling reasons to accept the gist of the Russo-Williamson thesis and to include mechanisms in the protocols for Evidence-Based Medicine, as the EBM+ research group has been advocating.
There are several important criticisms of the unificationist model of scientific explanation: unification is a broad and heterogeneous notion, and it is hard to see how a model of explanation based exclusively on unification can distinguish genuine explanatory unification from mere ordering or classification; unification alone cannot solve the asymmetry and irrelevance problems; and unification and explanation pull in different directions and should be decoupled, because good scientific explanation often requires extra ad explanandum information. I present a possible solution to these problems by focusing on an often overlooked but important element of how theoretical unification is achieved: the conceptual frameworks of theories. The core conceptual assumptions behind theories are decisive for discriminating between explanatory and non-explanatory unification. The conceptual framework is also flexible enough to balance the tension between informativeness and maximum systematization in constructing explanatory inferences. A short case study of orthogenetic and Darwinian explanations in paleontology illustrates how my addition to the unificationist model applies to a historical debate between rival explanations.
Prospect Theory (PT) is widely regarded as the most promising descriptive model of decision making under uncertainty. Various tests have corroborated the validity of the characteristic fourfold pattern of risk attitudes implied by the combination of probability weighting and value transformation. But is it also safe to assume stable PT preferences at the individual level? This is not only an empirical but also a conceptual question. Measuring the stability of preferences in a multi-parameter decision model such as PT is far more complex than evaluating single-parameter models such as Expected Utility Theory under the assumption of constant relative risk aversion. There are considerable interdependencies among parameters, such that allegedly diverging parameter combinations can in fact produce very similar preference structures. In this paper, we provide a theoretical framework for measuring the (temporal) stability of PT parameters. To illustrate our methodology, we apply our approach to 86 subjects for whom we elicit PT parameters twice, with a time lag of one month. While documenting remarkable stability of parameter estimates at the aggregate level, we find that a third of the subjects show significant instability across sessions.
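For reference, the parametric forms most commonly used when eliciting PT parameters are the Tversky-Kahneman value and probability-weighting functions; the abstract does not say which parametrization the authors adopt, so the following is only the standard rendering. The fourfold pattern arises from diminishing sensitivity (α, β < 1) and loss aversion (λ > 1) combined with the overweighting of small probabilities (γ < 1):

```latex
% Standard Tversky-Kahneman (1992) functional forms: a piecewise power
% value function over gains/losses and an inverse-S probability weight.
\[
  v(x) =
  \begin{cases}
    x^{\alpha} & x \ge 0\\[2pt]
    -\lambda\,(-x)^{\beta} & x < 0
  \end{cases}
  \qquad
  w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}}
\]
```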
Psychological distance effects have attracted the attention of behavioral economists in the context of descriptive modeling and behavioral policy. Indeed, psychological distance effects have been shown for an increasing number of domains and applications relevant to economic decision-making. The current paper questions whether these effects are robust enough for economists to apply them to relevant policy questions. We demonstrate systematic replication failures for the distance-from-a-distance effect shown by Maglio et al., and relate them to theoretical arguments suggesting that psychological distance theories are currently too poorly specified to make predictions that are precise enough for economic analyses.
This article considers the prospects of inference to the best explanation (IBE) as a method of confirming causal claims vis-à-vis the medical evidence of mechanisms. I show that IBE is actually descriptive of how scientists reason when choosing among hypotheses, that it is amenable to the balance/weight distinction, a pivotal pair of concepts in the philosophy of evidence, and that it can do justice to interesting features of the interplay between mechanistic and population-level assessments.
This volume is the second installment in Stefan Jonsson’s epic study of the crowd and the mass in modern Europe, building on his work in A Brief History of the Masses, which focused on monumental artworks produced in 1789, 1889, and 1989.
The globalization movement in recent decades has meant rapid growth in trade, financial transactions, and cross-country ownership of economic assets. In this article, we examine how the globalization of national business systems has influenced the framing of corporate social responsibility (CSR). This is done using text analysis of CEO letters appearing in the annual reports of 15 major corporations in Sweden during a period of transformational change. The results show that the discourse about CSR in the annual reports has changed from a national and communitarian view of social responsibility (cf. a negotiated view of CSR) toward an international and individualistic view of social responsibility (cf. a self-regulating view of CSR). The article contributes theoretically (1) by adding a national–global dimension to previous conceptualizations of CSR and (2) by showing that the rise of CSR discourse and activities in the last 10 years does not have to imply an increased commitment and interest in corporate responsibility per se, only that there are increased societal expectations that corporations should develop the capability to act more independently as moral agents.
Integrating the study of human diversity into the human evolutionary sciences requires substantial revision of traditional conceptions of a shared human nature. This process may be made more difficult by entrenched, 'folkbiological' modes of thought. Earlier work by the authors suggests that biologically naive subjects hold an implicit theory according to which some traits are expressions of an animal's inner nature while others are imposed by its environment. In this paper, we report further studies that extend and refine our account of this aspect of folkbiology. We examine biologically naive subjects' judgments about whether traits of an animal are 'innate', 'in its DNA', or 'part of its nature'. Subjects do not understand these three descriptions to be equivalent. Both 'innate' and 'in its DNA' carry the connotation that the trait is species-typical. This poses an obstacle to the assimilation of the biology of polymorphic and plastic traits by biologically naive audiences. Researchers themselves may not be immune to the continuing pull of folkbiological modes of thought.
It is widely recognized that the innate versus acquired distinction is a false dichotomy. Yet many scientists continue to describe certain traits as “innate” and take this to imply that those traits are not acquired, or “unlearned.” This article asks what cognitive role, if any, the concept of innateness should play in the psychological and behavioural sciences. I consider three arguments for eliminating innateness from scientific discourse. First, the classification of a trait as innate is thought to discourage empirical research into its developmental origin. Second, the concept lumps together a number of different biological properties that ought to be treated as distinct. Third, innateness is associated with the outmoded folk biological theory of essentialism. In response to these objections, I consider two attempts to revise the concept of innateness so as to make it more suitable for scientific explanation and research. One proposal is that innateness can be defined in terms of the biological property of environmental canalization. On this view, a trait is innate to the extent that it is developmentally buffered against a range of different environments. Another proposal is that innateness serves as an explanatory primitive for cognitive science. This view holds that there exists a sharp boundary between psychological and biological explanations and that to identify a trait as innate means that it falls into the latter explanatory domain. The essay ends with some questions for future research.
Stefan Jonsson uses three monumental works of art to build a provocative history of popular revolt: Jacques-Louis David's _The Tennis Court Oath_, James Ensor's _Christ's Entry into Brussels in 1889_, and Alfredo Jaar's _They Loved It So Much, the Revolution_. Addressing, respectively, the French Revolution of 1789, Belgium's proletarian messianism in the 1880s, and the worldwide rebellions and revolutions of 1968, these canonical images not only depict an alternative view of history but offer a new understanding of the relationship between art and politics and the revolutionary nature of true democracy. Drawing on examples from literature, politics, philosophy, and other works of art, Jonsson carefully constructs his portrait, revealing surprising parallels between the political representation of "the people" in government and their aesthetic representation in painting. Both essentially "frame" the people, Jonsson argues, defining them as elites or masses, responsible citizens or angry mobs. Yet in the aesthetic fantasies of David, Ensor, and Jaar, Jonsson finds a different understanding of democracy: one in which human collectives break the frame and enter the picture. Connecting the achievements and failures of past revolutions to current political issues, Jonsson then situates our present moment in a long historical drama of popular unrest, making his book both a cultural history and a contemporary discussion about the fate of democracy in our globalized world.
According to the standard account of forgiveness, you forgive your wrongdoer by overcoming your resentment towards them. But how exactly must you do so? And when is such overcoming fitting? The aim of this paper is to introduce a novel version of the standard account to answer these questions. Its core idea is that the reactive attitudes are a fitting response not just to someone’s blameworthiness, but to their blameworthiness being significant for you, or worthy of your caring, in virtue of your relationship to it. Someone’s blameworthiness is significant for you to the extent you’re bound up with what grounds it––e.g. with the wrongdoer’s being a participant in human relationships, with their attitudes, or with the victim’s being a source of demands. So you may fittingly not care about someone’s blameworthiness if it’s sufficiently insignificant for you in this manner––e.g. if their wrong happened far off in place and time. And forgiveness revolves around this. You forgive your wrongdoer if and only if, partly out of goodwill towards them, you cease to care about their blameworthiness––a bit as if their wrong had happened far off. If I’m right, this agent-relativity-based account can resolve the apparent ‘paradoxy of forgiveness’, satisfies a number of desiderata, and is plausible on an intuitive level.
Ştefan Aug. Doinaş and Basarab Nicolescu, two great spirits joined by the generosity of a humanist vision, met, maintained an epistolary dialogue, and pursued common projects. Doinaş commented on a few of the innovative concepts proposed by Basarab Nicolescu, and he also gave aesthetic form, in literary pages, to certain concepts of transdisciplinarity.
Intuitively, we lack the standing to blame others in light of moral norms that we ourselves don't take seriously: if Adam is unrepentantly aggressive, say, he lacks the standing to blame Celia for her aggressiveness. But why does blame have this feature? Existing proposals try to explain this by reference to specific principles of normative ethics – e.g. to rule‐consequentialist considerations, to the wrongness of hypocritical blame, or principles of rights‐forfeiture based on this wrongness. In this paper, I suggest a fundamentally different approach. Employing Timothy Williamson's idea of ‘constitutive rules’ of speech acts, I argue that this feature of blame is simply constitutive of any essentially moral form of disapproval. So if Adam had the standing to disapprove of Celia's aggressiveness in some form, necessarily, this disapproval couldn't be blame. If I'm right, this proposal thus not only answers our main question, but also sheds an interesting novel light on the very nature of blame. If we didn't have a form of disapproval with that feature, we wouldn't have our practice of holding each other to moral norms.
The culture of honour hypothesis offers a compelling example of how human psychology differentially adapts to pastoral and horticultural environments. However, there is disagreement over whether this pattern is best explained by a memetic, evolutionary psychological, dual inheritance, or niche construction model. I argue that this disagreement stems from two shortcomings: lack of clarity about the theoretical commitments of these models and inadequate comparative data for testing them. To resolve the first problem, I offer a theoretical framework for deriving competing predictions from each of the four models. In particular, this involves a novel interpretation of the difference between dual inheritance theory and cultural niche construction. I then illustrate a strategy for testing their predictions using data from the Human Relations Area Files. Empirical results suggest that the aggressive psychological phenotype typically associated with honour culture is more common among pastoral societies than among horticultural societies. Theoretical considerations suggest that this pattern is best explained as a case of cultural niche construction.
An English double-embedded relative clause from which the middle verb is omitted can often be processed more easily than its grammatical counterpart, a phenomenon known as the grammaticality illusion. This effect has been found to be reversed in German, suggesting that the illusion is language specific rather than a consequence of universal working memory constraints. We present results from three self-paced reading experiments which show that Dutch native speakers also do not show the grammaticality illusion in Dutch, whereas both German and Dutch native speakers do show the illusion when reading English sentences. These findings provide evidence against working memory constraints as an explanation for the observed effect in English. We propose an alternative account based on the statistical patterns of the languages involved. In support of this alternative, a single recurrent neural network model that is trained on both Dutch and English sentences is shown to predict the cross-linguistic difference in the grammaticality effect.
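The abstract does not describe the network's architecture. As a hedged illustration only, a word-level LSTM language model (a common choice in this literature) can assign each sentence a total surprisal, which can then be compared between verb-omitted and grammatical double-embedded sentences; all names in this PyTorch sketch are my own placeholders, not the authors' code:

```python
# Minimal sketch, not the authors' model: a word-level LSTM language
# model plus a sentence-surprisal function. Comparing surprisal on
# double-embedded sentences with and without the middle verb is one
# way to probe the grammaticality illusion.
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)  # next-word logits at each position

def sentence_surprisal(model: nn.Module, token_ids: torch.Tensor) -> float:
    """Total surprisal -sum(log2 P(w_t | w_<t)) for a (1, T) id tensor."""
    model.eval()
    with torch.no_grad():
        logits = model(token_ids[:, :-1])           # predict words 2..T
        log_probs = torch.log_softmax(logits, dim=-1)
        targets = token_ids[:, 1:].unsqueeze(-1)
        token_lp = log_probs.gather(-1, targets).squeeze(-1)
    return float(-token_lp.sum() / torch.log(torch.tensor(2.0)))
```

Training such a model on a mixed Dutch/English corpus and comparing the surprisal of the two sentence variants per language is one way the cross-linguistic prediction could be operationalized.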
This paper reassesses Robert Koch’s work on tropical infections of humans and cattle as being inspired by an underlying interest in epidemiology. Such an interest developed from the early 1890s, when it became clear that an exclusive focus on pathogens was insufficient as an approach to explain the genesis and dynamics of epidemics. Koch, who had failed to do so before, now highlighted differences between infection and disease and described the role of various sub-clinical states of disease in the propagation and, consequently, in the control of epidemics. Studying pathologies of men and cattle in tropical countries eventually facilitated the application of such measures in Europe through the screening of healthy carriers of typhoid, which was carried out in 1902. The concept of the carrier state can be understood as a spin-off from tropical medicine into the study and control of infectious disease in Europe. With it travelled assumptions that were typical of colonial and veterinary medicine, where the health of indigenous individuals or cattle would be a secondary objective compared to the control of diseases in populations.
Neo-Russellians like Salmon and Braun hold that the semantic contents of sentences are structured propositions whose basic components are objects and properties, that names are directly referential terms, and that a sentence of the form ‘n believes that S’ is true in a context c iff the referent of the name n in c believes the proposition expressed by S in c. This is sometimes referred to as ‘the Naive Russellian theory’. In this paper, I will discuss the Naive Russellian theory primarily in connection with a problem known as Schiffer’s puzzle. Schiffer first presented the puzzle as an argument against the Naive Russellian theory. Schiffer’s argument proceeds in two steps. In step one, Schiffer argues that the Naive Russellian theory is committed to two principles regarding de re belief: the special-case consequence and Frege’s constraint. Then, in step two, Schiffer argues that the special-case consequence is not consistent with Frege’s constraint. Salmon and Braun reply that although the Naive Russellian theory is committed to Frege’s constraint, it is not committed to the special-case consequence. However, I will argue with a new Schiffer-case that even if the Naive Russellian theory is not committed to the special-case consequence, it is still not consistent with Frege’s constraint. In conclusion, I will discuss the possibility of rejecting Frege’s constraint within the Naive Russellian theory.
The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, supporting the entropy-reduction hypothesis. Although this effect is independent of the effect of word surprisal, we find no evidence that these two measures correspond to cognitively distinct processes.
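In the usual statement of the hypothesis (notation mine, not quoted from the paper), the uncertainty after word w_t is the entropy over possible sentence continuations, and processing difficulty on w_t tracks how much that entropy drops:

```latex
% Entropy-reduction predictor: H_t is the uncertainty about how the
% sentence continues after word w_t; the (non-negative) reduction
% Delta H_t is taken to predict reading time on w_t.
\[
  H_t = -\sum_{s} P(s \mid w_1,\dots,w_t)\,\log_2 P(s \mid w_1,\dots,w_t),
  \qquad
  \Delta H_t = \max(0,\; H_{t-1} - H_t)
\]
```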