Going beyond the theory/practice and discourse/matter divides -- Learning and becoming in an onto-epistemology -- The tool of pedagogical documentation -- An intra-active pedagogy and its dual movements -- Transgressing binary practices in early childhood teacher education -- The hybrid-writing-process: going beyond the theory/practice divide in academic writing -- An ethics of immanence and potentialities for early childhood education.
This article explores the possibilities of considering how ‘matter and meaning are mutually constituted’ in the production of knowledge by presenting a diffractive analysis of a piece of interview data with a six-year-old boy in a preschool class. Inspired by Donna Haraway’s and Karen Barad’s theorising, I understand diffractive analysis as an embodied engagement with the materiality of research data: a becoming-with the data as researcher. Understanding the body as a space of transit, a series of open-ended systems in interaction with the material-discursive ‘environment’, diffractive analyses constitute transcorporeal engagements with data. Stacy Alaimo’s theorisation of the transcorporeal is put to work diffractively with Gilles Deleuze and Félix Guattari’s thinking on the process of becoming minor or minoritarian. This implies a reconceptualisation of the very act of thinking as a transcorporeal process of engagement, going beyond the idea of reflexivity and interpretation as inner mental activities taking place in the mind of the researcher understood as separate from the data. Through my example, I argue that diffractive analysis can make visible new kinds of material-discursive realities that can have transformative and political consequences.
This article theorizes and exemplifies reconceptualized teaching practices, both in early childhood education and in a couple of programs within the new Swedish Teacher Education. These programs are closely tied to the last 12 years of reconceptualized early childhood education practices in and around Stockholm, built on deconstructive, co‐constructive, and re‐constructive principles, inspired by poststructural and feminist poststructural theories. The aim is foremost to work towards a dissolution and/or transgression of the modernist theory‐practice binary that dominates ECE and teacher education practices, where theory is meant to be applied to practice. Student teachers, as well as pre‐school teachers, use what I have conceptualized as deconstructive talks as a way of making visible the dominant discourses of childhood, identity, learning, play, and gender in the performed and documented teaching practices. In teacher education, students’ narratives are also deconstructed. The aim is to transgress teaching‐as‐usual, i.e. dominant and normative ways of thinking and acting in teaching and learning situations. I suggest an ethics of ‘resistance’, affirmation, and becoming, inspired by Derridean deconstructionist thinking, as a professional attitude and reflexive mode for teachers, student teachers, and teacher educators.
In this essay we critically evaluate the progress that has been made in solving the problem of meaning in artificial intelligence and robotics. We remain skeptical about solutions based on deep neural networks and cognitive robotics, which in our opinion do not fundamentally address the problem. We agree with the enactive approach to cognitive science that things appear as intrinsically meaningful for living beings because of their precarious existence as adaptive autopoietic individuals. But this approach inherits the problem of failing to account for how meaning as such could make a difference for an agent’s behavior. In a nutshell, if life and mind are identified with physically deterministic phenomena, then there is no conceptual room for meaning to play a role in its own right. We argue that this impotence of meaning can be addressed by revising the concept of nature such that the macroscopic scale of the living can be characterized by physical indeterminacy. We consider the implications of this revision of the mind-body relationship for synthetic approaches.
Three German researchers have just published a comprehensive bibliography (2,250 references) on women's movements and feminist associations, past and present. Works serving as primary sources are mentioned only exceptionally, as are research reports and short essays. The selection focuses mainly on works published after 1970, in English, German, and French, concerning women's movements not only in Europe and America...
I chart the considerable changes in the status and conception of the Lenz-Ising model from 1920 to 1950 in terms of three phases: In the early 1920s, Lenz and Ising introduced the model in the field of ferromagnetism. Based on an exact derivation, Ising concluded that it is incapable of displaying ferromagnetic behavior, a result he erroneously extended to three dimensions. In the next phase, Lenz and Ising’s contemporaries rejected the model as a representation of ferromagnetic materials because of its conflict with the new quantum mechanics. In the third phase, from the early 1930s to the early 1940s, the model was revived as a model of cooperative phenomena. I provide more detail on this history than the earlier accounts of Brush and of Hoddeson, Schubert, Heims, and Baym and question some of their conclusions. Moreover, my account differs from these in its focus on the development of the model in its capacity as a model. It examines three aspects of this development: the attitudes on the degree of physical realism of the Lenz-Ising model in its representation of physical phenomena; the various reasons for studying and using it; and the effect of the change in its theoretical basis during the transition from the old to the new quantum mechanics. A major theme of my study is that even though the Lenz-Ising model is not fully realistic, it is more useful than more realistic models because of its mathematical tractability. I argue that this point of view, important for the modern conception of the model, is novel and that its emergence, while perhaps not a consequence of its study, is coincident with the third phase of its development.
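For orientation, the model the abstract discusses can be stated in its standard modern form; the formulation and notation below are the textbook convention, added here for reference rather than drawn from the paper itself.

\[
  H(\{s\}) = -J \sum_{\langle i,j \rangle} s_i s_j \; - \; h \sum_i s_i, \qquad s_i \in \{+1, -1\},
\]

where the spins $s_i$ sit on the sites of a lattice, $\langle i,j \rangle$ ranges over nearest-neighbor pairs, $J > 0$ is a ferromagnetic coupling, and $h$ is an external magnetic field. Ising's exact one-dimensional solution yields zero spontaneous magnetization, $m(T, h = 0) = 0$, at every temperature $T > 0$: this is the "no ferromagnetism" result that, as the abstract notes, he wrongly extrapolated to three dimensions.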
This is the second in a series of three papers that charts the history of the Lenz–Ising model in considerable detail, from its invention in the early 1920s to its recognition as an important tool in the study of phase transitions by the late 1960s. By focusing on the development in physicists’ perception of the model’s ability to yield physical insight (in contrast to the more technical perspective of previous historical accounts, for example, Brush and Hoddeson et al.), the series aims to cover and explain in depth why this model went from relative obscurity to a prominent position in modern physics, and to examine the consequences of this change. In the present paper, which is self-contained, I deal with the development from the early 1950s to the 1960s and document that this period witnessed a major change in the perception of the model: In the 1950s it was not in the cards that the model was to become a pivotal tool of theoretical physics in the following decade. In fact, I show, based upon recollections and research papers, that many of the physicists in the 1950s interested in understanding phase transitions saw the model as irrelevant for this endeavor because it oversimplifies the nature of the microscopic constituents of the physical systems exhibiting phase transitions. However, one group, Cyril Domb’s in London, held a more positive view during this decade. To bring out the basis for their view, I analyze in detail their motivation and work. In the last part of the paper I document that the model was seen as much more physically relevant in the early 1960s and examine the development that led to this change in perception. I argue that the main factor behind the change was the realization of the surprising and striking agreement between aspects of the model, notably its critical behavior, and empirical features of the physical phenomena.
This is the last in a series of three papers on the history of the Lenz–Ising model from 1920 to the early 1970s. In the first paper, I studied the invention of the model in the 1920s, while in the second paper, I documented a quite sudden change in the perception of the model in the early 1960s, when it was realized that the Lenz–Ising model is actually relevant for the understanding of phase transitions. In this article, which is self-contained, I study how this realization affected attempts to understand critical phenomena, which can be understood as limiting cases of phase transitions, in the epoch from circa 1965 to 1970, in which these phenomena were recognized as a research field in their own right. I focus on two questions: What kinds of insight into critical phenomena was the employment of the Lenz–Ising model thought to give? And how could a crude model, which the Lenz–Ising model was thought to be, provide this understanding? I document that the model played several roles: At first, it played a role analogous to experimental data: hypotheses about real systems, in particular relations between critical exponents and what is now called the hypothesis of scaling, which was advanced by Benjamin Widom and others, were confronted with numerical results for the model, in particular the model’s so-called critical exponents. A positive result of a confrontation was seen as positive evidence for the hypothesis in question. The model was also used to gain insight into specific aspects of critical phenomena, for example the fact that diverse physical systems exhibit similar behavior close to a critical point. Later, a more systematic program of understanding critical phenomena emerged that involved an explicit formulation of what it means to understand critical phenomena, namely, the elucidation of which features of the Hamiltonian of models lead to which kinds of behavior close to critical points. Attempts to accomplish this program culminated in the so-called hypothesis of universality, put forward independently by Robert B. Griffiths and Leo P. Kadanoff in 1970, which divided critical phenomena into classes with similar critical behavior. I also study the crucial role of the Lenz–Ising model in the development and justification of these ideas.
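Since the abstract leans on "critical exponents" and the scaling and universality hypotheses without defining them, a compact reminder may help; the definitions and relations below are the standard ones, added for reference rather than taken from the paper.

\[
  t = \frac{T - T_c}{T_c}, \qquad
  C \sim |t|^{-\alpha}, \qquad
  m \sim (-t)^{\beta} \ (t < 0), \qquad
  \chi \sim |t|^{-\gamma}, \qquad
  m \sim |h|^{1/\delta} \ (t = 0),
\]

where $C$ is the specific heat, $m$ the magnetization, and $\chi$ the susceptibility near the critical temperature $T_c$. Widom's scaling hypothesis implies exponent relations such as $\gamma = \beta(\delta - 1)$ and $\alpha + 2\beta + \gamma = 2$; confronting such relations with the numerically computed exponents of the Lenz–Ising model is the kind of test the abstract describes. Universality, in the sense of Griffiths and Kadanoff, is the claim that systems fall into classes sharing the same exponents, fixed by gross features of the Hamiltonian (dimensionality, symmetry, range of interactions) rather than by microscopic detail.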
This series, founded in 1970, publishes works that combine studies in the history of philosophy with a systematic approach, or systematic studies with reconstructions from the history of philosophy. English-language monographs are published alongside German-language ones. The founding editors are Erhard Scheibe, Günther Patzig, and Wolfgang Wieland. From 1990 to 2007 the series was co-edited by Jürgen Mittelstraß.
This study examined the ability to comprehend conventional and non-conventional implicatures, and the effect of proficiency and learning context on comprehension of implicature in L2 Chinese. Participants were three groups of college students of Chinese: elementary-level foreign language learners, advanced-level foreign language learners, and advanced-level heritage learners. They completed a 36-item computer-delivered listening test measuring their ability to comprehend three types of implicature: conventional indirect refusals, conventional indirect opinions, and non-conventional indirect opinions. Comprehension was analyzed for accuracy and comprehension speed. There was a significant effect of implicature type on accuracy, but not on comprehension speed. A significant effect of participant group was observed on accuracy, but the effect was mixed on comprehension speed.
This collection of essays by scholars from Europe, Asia, North America, and Latin America offers new perspectives on the phenomenological investigation of experiential life on the basis of Husserl’s phenomenology. Not only are well-known works of Husserl interpreted from new angles, but the latest volumes of the Husserliana are also closely examined. In a variety of ways, the contributors explore the emergence of reason in experience that is disclosed in the very regions that are traditionally considered to be “irrational” or “pre-rational.” The leading idea of such explorations is Husserl’s view that perception, affectivity, and volition are the three aspects of reason. Without affectivity, which is supposedly irrational, no rationality can be established in the spheres of representation and volition, whereas volitional and representational acts consistently structure the process of affective experience. In such a framework, it is also shown that theoretical and practical reason are inseparably intertwined. Thus, the papers collected here can be regarded as a collaborative phenomenological investigation into the entanglement and mutual dependency of the supposedly “rational” and the “irrational,” as well as of the “practical” and the “theoretical.”
Against the background of the increasing importance of digitization in health care, the paper examines how medical practitioners who are involved in the development of digital health technologies legitimate and criticize the implementation and use of digital health technologies. Adopting an institutional logics perspective, the study is based on qualitative interviews with persons working at the interface of medicine and digital technologies development in Switzerland. The findings indicate that the developers believe that digital health technologies could harmonize current conflicts between an increasing economization of the health care system and professional–ethical demands. At the same time, however, they show that digital technologies can undermine the demand for medical autonomy, a central element of the medical ethos.
The development of phenomenological philosophy in Japan is a well-established tradition that reaches back to the early 20th century. The past decades have witnessed significant contributions and advances in different areas of phenomenological thought in Japan that remain unknown, or only partially known, to an international philosophical public. This volume offers a selection of original phenomenological research in Japan to an international audience in the form of an English-language publication. The contributions in this volume range over classical figures in the phenomenological movement, recent trends in French phenomenology, and contemporary interdisciplinary approaches. In addition to this diverse engagement with European thinkers, many of the contributions in this volume establish critical and complementary discussions with 20th-century Japanese philosophers.
Open peer commentary on the article “Lived Experience and Cognitive Science: Reappraising Enactivism’s Jonasian Turn” by Mario Villalobos & Dave Ward. Upshot: Villalobos and Ward seem to disclose a fundamental problem without solving it: a problem to which neither the Jonasian nor the Maturanian inference can offer a solution. It should be addressed by a phenomenological analysis of our basic experience of aliveness.
Political scientists should put aside questions about whether voters are rational or irrational, informed or uninformed, and questions about how flawed democracy is. Although they are interesting, these questions are secondary. Answering them in no way helps people: it does not help them with their violent neighborhoods, their declining incomes, their flooded homes, or their dying crops. Instead, researchers should focus on the first-order question of how to improve democratic accountability.
In the course of the debates on Priscian's notion of the perfect sentence, the philosopher Peter Abelard developed a theory that closely resembles modern accounts of propositional attitudes and that goes far beyond the established Aristotelian conceptions of the sentence. According to Abelard, the perfection of a sentence does not depend on the content that it expresses, but on the fact that the content is stated along with the propositional attitude towards the content. This paper tries to provide an analysis and a consistent interpretation of Abelard's arguments within the framework of the mediaeval models of language and mind.
Behavioural phenotypes have been explained by genetic factors (G), environmental factors (E), and their interaction. Here we suggest a rethinking of the E factor. Passively incurred environmental influences (E_pass) and actively copied information and behaviour (E_act) may be distinguished, each at the shared and non-shared level. We argue that E_act underlies mutation and selection and is the basis of gene-independent heritability.
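A schematic way to write the proposed decomposition (the notation below is mine, introduced for illustration; the paper's own formalism may differ):

\[
  P = G + E + G \times E,
  \qquad
  E = E_{\mathrm{pass}} + E_{\mathrm{act}},
\]

with each of $E_{\mathrm{pass}}$ and $E_{\mathrm{act}}$ further split into shared and non-shared components, $E_x = E_x^{(\mathrm{shared})} + E_x^{(\mathrm{non\text{-}shared})}$. The abstract's claim is then that the $E_{\mathrm{act}}$ term, unlike $E_{\mathrm{pass}}$, is itself subject to mutation and selection and so can carry heritability independent of $G$.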