In both biology and the human sciences, social groups are sometimes treated as adaptive units whose organization cannot be reduced to individual interactions. This group-level view is opposed by a more individualistic one that treats social organization as a byproduct of self-interest. According to biologists, group-level adaptations can evolve only by a process of natural selection at the group level. Most biologists rejected group selection as an important evolutionary force during the 1960s and 1970s, but a positive literature began to grow during the 1970s and is rapidly expanding today. We review this recent literature and its implications for human evolutionary biology. We show that the rejection of group selection was based on a misplaced emphasis on genes as “replicators” which is in fact irrelevant to the question of whether groups can be like individuals in their functional organization. The fundamental question is whether social groups and other higher-level entities can be “vehicles” of selection. When this elementary fact is recognized, group selection emerges as an important force in nature and what seem to be competing theories, such as kin selection and reciprocity, reappear as special cases of group selection. The result is a unified theory of natural selection that operates on a nested hierarchy of units. The vehicle-based theory makes it clear that group selection is an important force to consider in human evolution. Humans can facultatively span the full range from self-interested individuals to “organs” of group-level “organisms.” Human behavior not only reflects the balance between levels of selection but it can also alter the balance through the construction of social structures that have the effect of reducing fitness differences within groups, concentrating natural selection at the group level. These social structures and the cognitive abilities that produce them allow group selection to be important even among large groups of unrelated individuals.
In everyday life, as well as in science, we have to deal with and act on the basis of partial (i.e. incomplete, uncertain, or even inconsistent) information. This observation is the source of a broad research activity from which a number of competing approaches have arisen. There is some disagreement concerning the way in which partial or full ignorance is and should be handled. The most successful approaches include both quantitative aspects (by means of probability theory) and qualitative aspects (by means of graphical or causal models or logic). Some of these approaches have important impacts on various disciplines including philosophy, computer science, statistics, mathematics, physics, and social science. Most notably, the relation between causal and probabilistic information is a topic of interest in all of these fields. The general question of how uncertainty is related to ignorance, and the connection thereof to causality, have been the key topics of the international summer school at the University of Konstanz, Germany, in August 2004. This was the third event in a series of summer schools organized by the interdisciplinary research group “Philosophy, Probability, and Modeling”. The group was in operation from September 1, 2001, until August 31, 2005. The first summer school, held in September 2002, focused on the application of probability theory to topics of philosophical interest. Problems and questions that arise when probabilistic models and techniques are being applied in the special sciences were then discussed during the second summer school, held in July 2003. The fourth and final event in the series, in August 2005, was devoted to philosophical questions related to the application and interpretation of probabilities in physics. Each of the four summer schools attracted 40–50 people from many different countries. It was an inspiring forum for an interdisciplinary discussion among researchers and students from many different areas. This special issue collects a selection of the invited and contributed papers presented at the summer school in 2004. We acknowledge the generous support of the Center for Junior Research Fellows at the University of Konstanz, the Gesellschaft für analytische Philosophie e.V., the Centre for Philosophy of Natural and Social Science at the London School of Economics, the Alexander von Humboldt Foundation, the Federal Ministry of Education and Research and the Program for the Investment in the Future (ZIP) of the German Government through a Sofja Kovalevskaja Award. We thank the referees of the papers for their work, and James Moor for his interest in dedicating a special issue of this journal to Causality, Uncertainty and Ignorance.
We prove a number of results motivated by global questions of uniformity in computability theory, and universality of countable Borel equivalence relations. Our main technical tool is a game for constructing functions on free products of countable groups. We begin by investigating the notion of uniform universality, first proposed by Montalbán, Reimann and Slaman. This notion is a strengthened form of a countable Borel equivalence relation being universal, which we conjecture is equivalent to the usual notion. With this additional uniformity hypothesis, we can answer many questions concerning how countable groups, probability measures, the subset relation, and increasing unions interact with universality. For many natural classes of countable Borel equivalence relations, we can also classify exactly which are uniformly universal. We also show the existence of refinements of Martin’s ultrafilter on Turing invariant Borel sets to the invariant Borel sets of equivalence relations that are much finer than Turing equivalence. For example, we construct such an ultrafilter for the orbit equivalence relation of the shift action of the free group on countably many generators. These ultrafilters imply a number of structural properties for these equivalence relations.
Inertial frames and Lorentz transformations have a preferred status in the special theory of relativity (STR). Lorentz transformations, in turn, embody Einstein's convention that the velocity of light is isotropic, a convention that is necessary for the establishment of a standard signal synchrony. If the preferred status of Lorentz transformations in STR is not due to some particular bias introduced by a convention on signal synchronism, but to the fact that the Lorentz transformation group is the symmetry group of the theory, then the signal synchronism is not a matter of convention but rather a matter of fact. In order to explore the conventionalist thesis, that within the frame of STR isotropy in the velocity of light and, hence, signal synchronism is a matter of convention, we need a generalized Lorentz transformation group that does not embody Einstein's isotropy convention, and upon which STR can be based. We present here a new approach to the resulting search for a generalized STR, which is well suited for establishing some well-known results of Winnie as well as some new results.
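For orientation, the synchrony convention at issue is standard in this literature (Reichenbach's ε-formulation, used in Winnie's generalized STR); the formula below is supplied here for the reader rather than taken from the abstract. A light signal leaves clock A at time $t_1$, is reflected at clock B, and returns to A at $t_3$; B is set so that

$$ t_2 = t_1 + \varepsilon\,(t_3 - t_1), \qquad 0 < \varepsilon < 1, $$

with Einstein's isotropy convention being the special case $\varepsilon = 1/2$. A generalized transformation group keeps $\varepsilon$ as a free parameter.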
The US Securities and Exchange Commission recently proposed rules relating to shareholder (independent) director nominations to publicly-traded companies. While shareholder groups, such as institutional investors, consumer groups, and shareholder activists, generally support the proxy reform, the business community, including The Business Roundtable and the US Chamber of Commerce, are critical of the proposal, arguing that it will 'open the door' to special-interest directors, e.g., labour unions or other groups having a social or political agenda contrary to the economic interests of the shareholder owners of the corporation. An analysis of the proposed rules, however, shows that the mechanisms offered to nominate and elect independent directors offer little or no threat of any shareholder group placing special-interest directors on the board of a publicly-traded company in the USA. The article concludes with a recommended managerial course of action for executives and boards to follow in the unlikely event that shareholders are seriously threatening to place a special-interest director(s) on the proxy ballot.
Whether upheld as heroic or reviled as terrorism, people have been willing to lay down their lives for the sake of their groups throughout history. Why? Previous theories of extreme self-sacrifice have highlighted a range of seemingly disparate factors, such as collective identity, outgroup hostility, and kin psychology. In this paper, I attempt to integrate many of these factors into a single overarching theory based on several decades of collaborative research with a range of special populations, from tribes in Papua New Guinea to Libyan insurgents and from Muslim fundamentalists in Indonesia to Brazilian football hooligans. These studies suggest that extreme self-sacrifice is motivated by identity fusion, a visceral sense of oneness with the group, resulting from intense collective experiences or from perceptions of shared biology. In ancient foraging societies, fusion would have enabled warlike bands to stand united despite strong temptations to scatter and flee. The fusion mechanism has often been exploited in cultural rituals, not only by tribal societies but also in specialized cells embedded in armies, cults, and terrorist organizations. With the rise of social complexity and the spread of states and empires, fusion has also been extended to much larger groups, including doctrinal religions, ethnicities, and ideological movements. Explaining extreme self-sacrifice is not only a scientific priority but also a practical challenge as we seek a collective response to suicide, terrorism, and other extreme expressions of outgroup hostility that continue to bedevil humanity today.
Why are US labor unions so weak? Union decline has had important consequences for politics, inequality, and social policy. Common explanations cite employment shifts, public opinion, labor laws, and differences in working class culture and organization. But comparing the United States with Canada challenges those explanations. After following US unionization rates for decades, Canadian rates diverged in the 1960s, and are now nearly three times higher. This divergence was due to different processes of working class political incorporation. In the United States, labor was incorporated as an interest group into a labor regime governed by a pluralist idea. In Canada, labor was incorporated as a class representative into a labor regime governed by a class idea. This led to a relatively stronger Canadian labor regime that better held employers in check and protected workers’ collective bargaining rights. As a result, union density stabilized in Canada while plummeting in the United States.
This E-Special issue brings together a range of articles from the Theory, Culture, & Society archive that directly explore the relations between fiction and social theory. Each article develops a different perspective on these relations, yet they all share a common interest in probing at the different ways in which fiction might enrich and provoke our conceptual imaginations. These articles ask how theory might be used to understand or illuminate fiction, whilst also considering how theory might be extended, challenged or informed by fictional resources. In general terms, the articles take three types of overlapping approach. First, there are those that use fiction to extend the imagination of social theory. Second are the articles that use fiction as a documentary resource and platform for theorizing. And, finally, there are those articles that use theory to reanimate and re-examine fictional forms. In exploring these three intersecting branches the pieces illustrate the different ways in which fiction and social theory might interweave in our thinking. The articles gathered here provide frameworks, ideas and resources through which the reader might continue to think imaginatively and creatively about the social world.
We begin with the history of the discovery of computability in the 1930s, the roles of Gödel, Church, and Turing, and the formalisms of recursive functions and Turing automatic machines (a-machines). To whom did Gödel credit the definition of a computable function? We present Turing’s notion [1939, §4] of an oracle machine and Post’s development of it in [1944, §11], [1948], and finally Kleene-Post [1954] into its present form. A number of topics arose from Turing functionals including continuous functionals on Cantor space and online computations. Almost all the results in theoretical computability use relative reducibility and o-machines rather than a-machines and most computing processes in the real world are potentially online or interactive. Therefore, we argue that Turing o-machines, relative computability, and online computing are the most important concepts in the subject, more so than Turing a-machines and standard computable functions since they are special cases of the former and are presented first only for pedagogical clarity to beginning students. At the end in §10–§13 we consider three displacements in computability theory, and the historical reasons they occurred. Several brief conclusions are drawn in §14.
The theme of this book is formed by a pair of concepts: the concept of formal language as carrier of the precise expression of meaning, facts and problems, and the concept of algorithm or calculus, i.e. a formally operating procedure for the solution of precisely described questions and problems. The book is a unified introduction to the modern theory of these concepts, to the way in which they developed first in mathematical logic and computability theory and later in automata theory, and to the theory of formal languages and complexity theory. Apart from considering the fundamental themes and classical aspects of these areas, the subject matter has been selected to give priority throughout to the new aspects of traditional questions, results and methods which have developed from the needs or knowledge of computer science and particularly of complexity theory. It is both a textbook for introductory courses in the above-mentioned disciplines as well as a monograph in which further results of new research are systematically presented and where an attempt is made to make explicit the connections and analogies between a variety of concepts and constructions.
A crucial socio-political challenge for our age is how to redefine or extend group membership in such a way that it adequately responds to phenomena related to globalization like the prevalence of migration, the transformation of family and social networks, and changes in the position of the nation state. Two centuries ago Immanuel Kant assumed that international connectedness between humans would inevitably lead to the realization of world citizen rights. Nonetheless, globalization does not just foster cosmopolitanism but simultaneously yields the development of new group boundaries. Group membership is indeed a fundamental issue in political processes, for: “the primary good that we distribute to one another is membership in some human community” – it is within the political community that power is being shared and, if possible, held back from non-members. In sum, it is appropriate to consider group membership a fundamental ingredient of politics and political theory. How group boundaries are drawn is then of only secondary importance. Indeed, Schmitt famously declared that “[e]very religious, moral, economic, ethical, or other antithesis transforms into a political one if it is sufficiently strong to group human beings effectively according to friend and enemy”. Even though Schmitt’s idea of politics as being constituted by such antithetical groupings is debatable, it is plausible to consider politics among other things as a way of handling intergroup differences. Obviously, some of the group-constituting factors are more easily discernable from one’s appearance than others, like race, ethnicity, or gender. As a result, factors like skin color or sexual orientation sometimes carry much political weight even though individuals would rather confine these to their private lives and individual identity. Given the potential tension between the political reality of particular group-membership definitions and the – individual and political – struggles against those definitions and corresponding attitudes, citizenship and civic behavior become a complex issue. As Kymlicka points out, it implies for citizens an additional obligation to non-discrimination regarding those groups: “[t]his extension of non-discrimination from government to civil society is not just a shift in the scale of liberal norms, it also involves a radical extension in the obligations of liberal citizenship”. Unfortunately, empirical research suggests that political intolerance towards other groups “may be the more natural and ‘easy’ position to hold”. Indeed, since development of a virtue of civility or decency regarding other groups is not easy, as it often runs against deeply engrained stereotypes and prejudices, political care for matters like education is justified. Separate schools, for example, may erode children’s motivation to act as citizens, erode their capacity for it and finally diminish their opportunities to experience transcending their particular group membership and behave as decent citizens. This chapter outlines a possible explanation for such consequences. That explanation will be found to be interdisciplinary in nature, combining insights from political theory and cognitive neuroscience. In doing so, it does not focus on collective action, even though that is a usual focus for political studies.
For example, results pertaining to collective political action have demonstrated that the relation between attitudes and overt voting behavior or political participation is not as direct and strong as was hoped for. Several conditions, including the individual’s experiences, self-interest, and relevant social norms, turned out to interfere in the link between his or her attitude and behavior. Important as collective action is, this chapter is concerned with direct interaction between agents and the influence of group membership on such interaction – in particular joint action. Although politics does include many forms of action that require no such physical interaction, such physical interaction between individuals remains fundamental to politics – this is the reason why separate schooling may eventually undermine the citizenship of its isolated pupils. This chapter will focus on joint action, defined as: “any form of social interaction whereby two or more individuals coordinate their actions in space and time to bring about a change in the environment”. Cognitive neuroscientific evidence demonstrates that for such joint action to succeed, the agents have to integrate the actions and expected actions of the other person in their own action plans at several levels of specificity. Although neuroscientific research is necessarily limited to simple forms of action, this concurs with a philosophical analysis of joint action, which I will discuss below.
A true Turing machine (TM) requires an infinitely long paper tape. Thus a TM can be housed in the infinite world of Newtonian spacetime (the spacetime of common sense), but not necessarily in our world, because our world (at least according to our best spacetime theory, general relativity) may be finite. All the same, one can argue for the "existence" of a TM on the basis that there is no such housing problem in some other relativistic worlds that are similar ("close") to our world. But curiously enough (and this is the main point of this paper) some of these close worlds have a special spacetime structure that allows TMs to perform certain Turing unsolvable tasks. For example, in one kind of spacetime a TM can be used to solve first-order predicate logic and the halting problem. And in a more complicated spacetime, TMs can be used to decide arithmetic. These new computers serve to show that Church's thesis is a thoroughly contingent claim. Moreover, since these new computers share the fundamental properties of a TM in ordinary operation (e.g. intuitive, finitely programmed, limited in computational capability), a computability theory based on these non-Turing computers is no less worthy of investigation than orthodox computability theory. Some ideas about this new mathematical theory are given.
A generic computation of a subset $A$ of $\mathbb{N}$ consists of a computation that correctly computes most of the bits of $A$, and never incorrectly computes any bits of $A$, but which does not necessarily give an answer for every input. The motivation for this concept comes from group theory and complexity theory, but the purely recursion theoretic analysis proves to be interesting, and often counterintuitive. The primary result of this paper is that there are no minimal pairs for generic computability, answering a question of Jockusch and Schupp.
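For context, the standard way "most of the bits" is made precise in this literature (the Jockusch–Schupp density definition, supplied here rather than quoted from the abstract) is: a partial computable function $\varphi$ generically computes $A$ if

$$ \varphi(n)\!\downarrow\ \Rightarrow\ \varphi(n) = A(n), \qquad \lim_{n \to \infty} \frac{|\operatorname{dom}(\varphi) \cap \{0, \dots, n-1\}|}{n} = 1, $$

i.e. $\varphi$ never errs on $A$ and its domain has asymptotic density 1.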
After leading to a new axiomatic derivation of quantum theory, the new informational paradigm is entering the domain of quantum field theory, suggesting a quantum automata framework that can be regarded as an extension of quantum field theory to include a hypothetical Planck scale, with the usual quantum field theory recovered in the relativistic limit of small wave-vectors. Being derived from simple principles, the automata theory is quantum ab-initio, and does not assume Lorentz covariance and mechanical notions. Being discrete, it can describe localized states and measurements, solving all the issues plaguing field theory that originate from the continuum. These features make the theory an ideal framework for quantum gravity, with relativistic covariance and space-time emergent solely from the interactions, and not assumed a priori. The paper presents a synthetic derivation of the automata theory, showing how the principles lead to a description in terms of a quantum automaton over a Cayley graph of a group. Restricting to Abelian groups, we show how the automata recover the Weyl, Dirac and Maxwell dynamics in the relativistic limit. We conclude with some new routes toward the more general scenario of non-Abelian Cayley graphs. The phenomenology arising from the automata theory in the ultra-relativistic domain and the analysis of the corresponding distorted Lorentz covariance is reviewed in Bisio et al.
Lattice representations are an important tool for computability theorists when they embed nondistributive lattices into degree-theoretic structures. In this expository paper, we present the basic definitions and results about lattice representations needed by computability theorists. We define lattice representations both from the lattice-theoretic and computability-theoretic points of view, give examples and show the connection between the two types of representations, discuss some of the known theorems on the existence of lattice representations that are of interest to computability theorists, and give a simple example of the use of lattice representations in an embedding result.
The prominence given in national or state-wide curriculum policy to thinking, the development of democratic dispositions and preparation for the ‘good life’, usually articulated in terms of lifelong learning and fulfilment of personal life goals, gives rise to the current spate of interest in the role that could be played by philosophy in schools. Theorists and practitioners working in the area of philosophy for schools advocate the inclusion of philosophy in school curricula to meet these policy objectives. This article tests claims that philosophy can aid in the acquisition of democratic dispositions and develop critical thinking, and considers the extent to which these aims are compatible with each other. These considerations are located in the context of certain policy statements relating to the curricula of Western Australia and New Zealand.
Although Kurt Gödel does not figure prominently in the history of computability theory, he exerted a significant influence on some of the founders of the field, both through his published work and through personal interaction. In particular, Gödel’s 1931 paper on incompleteness and the methods developed therein were important for the early development of recursive function theory and the lambda calculus at the hands of Church, Kleene, and Rosser. Church and his students studied Gödel 1931, and Gödel taught a seminar at Princeton in 1934. Seen in the historical context, Gödel was an important catalyst for the emergence of computability theory in the mid 1930s.
Russell Hardin writes from a particular perspective, that of rational choice theory. His broad—and ambitious—overall project is to “understand the sway of groups in our time” or, in an alternative formulation, “to understand the motivations of those who act on behalf of groups and to understand how they come to identify with the groups for which they act”.
What drives much of the current philosophical interest in the idea of group cognition is its appeal to the manifestation of psychological properties—understood broadly to include states, processes, and dispositions—that are in some important yet elusive sense emergent with respect to the minds of individual group members. Our goal in this paper is to address a set of related, conditional questions: If human mentality is real yet emergent in a modest metaphysical sense only, then: (i) What would it mean for a group to have emergent cognitive states? (ii) Is this even a metaphysically coherent view? (iii) Relative to which notion of emergence do we have reason to believe that certain groups in fact have emergent cognitive states? We shall argue that evidence from a wide variety of social science domains makes it plausible that there are group cognitive states and processes no less metaphysically emergent than human cognitive (and other special science) states and processes.
There is currently much interest in bringing together the tradition of categorial grammar, and especially the Lambek calculus, with the recent paradigm of linear logic to which it has strong ties. One active research area is designing non-commutative versions of linear logic (Abrusci, 1995; Retoré, 1993) which can be sensitive to word order while retaining the hypothetical reasoning capabilities of standard (commutative) linear logic (Dalrymple et al., 1995). Some connections between the Lambek calculus and computations in groups have long been known (van Benthem, 1986) but no serious attempt has been made to base a theory of linguistic processing solely on group structure. This paper presents such a model, and demonstrates the connection between linguistic processing and the classical algebraic notions of non-commutative free group, conjugacy, and group presentations. A grammar in this model, or G-grammar, is a collection of lexical expressions which are products of logical forms, phonological forms, and inverses of those. Phrasal descriptions are obtained by forming products of lexical expressions and by cancelling contiguous elements which are inverses of each other. A G-grammar provides a symmetrical specification of the relation between a logical form and a phonological string that is neutral between parsing and generation modes. We show how the G-grammar can be oriented for each of the modes by reformulating the lexical expressions as rewriting rules adapted to parsing or generation, which then have strong decidability properties (inherent reversibility). We give examples showing the value of conjugacy for handling long-distance movement and quantifier scoping both in parsing and generation. The paper argues that by moving from the free monoid over a vocabulary V (standard in formal language theory) to the free group over V, deep affinities between linguistic phenomena and classical algebra come to the surface, and that the consequences of tapping the mathematical connections thus established can be considerable.
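A minimal sketch of the cancellation step the abstract describes, i.e. reducing a product by cancelling contiguous elements that are inverses of each other; the encoding and the toy lexical entries below are hypothetical illustrations, not taken from the paper:

```python
# Words over a free group encoded as (symbol, exponent) pairs with
# exponent +1 or -1; reduction cancels contiguous inverse pairs.
# A stack handles the cascading cancellations correctly.

def reduce_word(word):
    """Return the fully reduced form of a free-group word."""
    stack = []
    for sym, exp in word:
        if stack and stack[-1] == (sym, -exp):
            stack.pop()            # x followed by x^{-1} cancels
        else:
            stack.append((sym, exp))
    return stack

# Toy "lexical expressions" in the spirit of a G-grammar: phonological
# forms multiplied by inverses of the forms they expect to combine with.
phrase = [("john", 1), ("np", -1), ("np", 1), ("sleeps", 1)]
print(reduce_word(phrase))         # [('john', 1), ('sleeps', 1)]
```

Since reduced forms in a free group are unique, this kind of normalization is well defined regardless of the order in which cancellations are performed.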
Quantum cellular automata and quantum walks provide a framework for the foundations of quantum field theory, since the equations of motion of free relativistic quantum fields can be derived as the small wave-vector limit of quantum automata and walks starting from very general principles. The intrinsic discreteness of this framework is reconciled with the continuous Lorentz symmetry by reformulating the notion of inertial reference frame in terms of the constants of motion of the quantum walk dynamics. In particular, among the symmetries of the quantum walk which recovers the Weyl equation—the so-called Weyl walk—one finds a non-linear realisation of the Poincaré group, which recovers the usual linear representation in the small wave-vector limit. In this paper we characterise the full symmetry group of the Weyl walk, which is shown to be a non-linear realisation of a group which is the semidirect product of the Poincaré group and the group of dilations.
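For reference, the equation recovered in the small wave-vector limit is the Weyl equation; this standard form (natural units, Pauli matrices $\boldsymbol{\sigma}$) is supplied here for the reader and is not quoted from the paper:

$$ i\,\partial_t \psi = \pm\,\boldsymbol{\sigma} \cdot \mathbf{p}\,\psi, \qquad \mathbf{p} = -i\nabla, $$

where the two signs correspond to the two chiralities of the massless spin-1/2 field.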
One way to do socially relevant investigations of science is through conceptual analysis of scientific terms used in special-interest science (SIS). SIS is science having welfare-related consequences and funded by special interests, e.g., tobacco companies, in order to establish predetermined conclusions. For instance, because the chemical industry seeks deregulation of toxic emissions and avoiding costly cleanups, it funds SIS that supports the concept of "hormesis" (according to which low doses of toxins/carcinogens have beneficial effects). Analyzing the hormesis concept of its main defender, chemical-industry-funded Edward Calabrese, the paper shows Calabrese and others fail to distinguish three different hormesis concepts, H, HG, and HD. H requires toxin-induced, short-term beneficial effects for only one biological endpoint, while HG requires toxin-induced, net-beneficial effects for all endpoints/responses/subjects/ages/conditions. HD requires using the risk-assessment/regulatory default rule that all low-dose toxic exposures are net-beneficial, thus allowable. Clarifying these concepts, the paper argues for five main claims. (1) Claims positing H are trivially true but irrelevant to regulations. (2) Claims positing HG are relevant to regulation but scientifically false. (3) Claims positing HD are relevant to regulation but ethically/scientifically questionable. (4) Although no hormesis concept (H, HG, or HD) has both scientific validity and regulatory relevance, Calabrese and others obscure this fact through repeated equivocation, begging the question, and data-trimming. Consequently (5) their errors provide some undeserved rhetorical plausibility for deregulating low-dose toxins.
Leibniz coined the word “dynamics,” but his own dynamics has never been completed. However, there are many illuminating ideas scattered in his writings on dynamics and metaphysics. In this paper, I will present my own interpretation of Leibniz’s dynamics and metaphysics. To my own surprise, Leibniz’s dynamics and metaphysics are incredibly flexible and modern. In particular, the metaphysical part, namely Monadology, can be interpreted as a theory of information in terms of monads, which generate both physical phenomena and mental phenomena. The phenomena, i.e., how the world of monads appears to each monad, must be distinguished from its internal states, which Leibniz calls perceptions, and the phenomena must be understood as the results of these states and God’s coding. My distinctive claim is that most interpreters ignored this coding. His dynamics and metaphysics can provide a framework good enough for enabling Einstein’s special relativity. And finally, his dynamics and metaphysics can provide a very interesting theory of space and time. In Part 1, we will focus on the relationship between metaphysics and dynamics. Leibniz often says that dynamics is subordinated to metaphysics. We have to take this statement seriously, and we have to investigate how dynamics and metaphysics are related. To this question, I will give my own answer, based on my informational interpretation. On my view, Leibniz’s metaphysics tries, among others, to clarify the following three: How each monad is programmed. How monads are organized into many groups, each of which is governed by a dominant monad; this can be regarded as a precursor of von Neumann’s idea of cellular automata. And how the same structure is repeated in sub-layers of the organization. This structure is best understood in terms of the hierarchy of programs, a nested structure going down from the single dominant program to subprograms, which again control respective subprograms, and ad infinitum. If we may use a modern term, this is a sort of recursion, although Leibniz himself did not know this word. And one of my major discoveries is that the same recursive structure is repeated in the phenomenal world, the domain of dynamical investigations. Recursion of what, you may ask. I will argue that it is elastic collision. For Leibniz, aside from inertial motions, dynamical changes of motion are brought about by elastic collisions, at any level of the infinite divisibility of matter. This nicely corresponds to the recursive structure of the program of a monad, or of the program of an organized group of monads. This is the crux of his claim that dynamics is subordinated to metaphysics. Moreover, the program of any monad is teleological, whereas the phenomenal world is governed by the efficient cause of dynamics. And it is natural that the pre-established harmony is there, since God is the ultimate programmer, as well as the creator.
A. F. Bentley’s The Process of Government (1908) is widely accepted as an important source of contemporary interest-group study. This paper argues to the contrary that Bentley’s arguments in this area are obscure and have contributed little to the programme of modern interest-group research. His importance is as a contributor to the debate on the nature of social science and social science method and not as the starting-point for interest-group analysis. The judgement about his role as a social scientist should rest on consideration of his body of work and not simply the one book. In terms of his much cited book, Bentley, it is argued, is misread. The central purpose of this article is to explore the consequences of that misinterpretation. The misreading of The Process of Government, and the unmerited assumption that it is directly connected to modern interest-group theory, has led to a misunderstanding of that contemporary theory. In particular his use of the term ‘group’ is much wider in scope than is now usually followed. This means that his claims are not so uni-dimensional as they appear when extracted from their context. Bentley used the term in a sociological sense that included informal social associations as ‘groups’: these are not the sort of formal, collective organizations of the interest-group type as identified in political science. It is argued that the major sources of ideas current in the interest-group field are Truman (1951) and the case-study authors of the 1930s such as Odegard, Childs, Herring and Schattschneider. Bentley’s contribution to political science is not as progenitor of interest-group studies, but his emphasis on process anticipates the policy studies movement.
Since the later decades of the 20th century, Brazilian psychologists have been questioning a theoretical and interventional model in educational contexts which considers psychological phenomena apart from their cultural contexts, in order to develop an approach based on a contextualized viewpoint. Despite progress having been made in educational psychology as a result of this critical paradigm, this area still has problems to overcome: psychologists are becoming increasingly separate from schools, it is now common to find psychologists who are professionally unprepared to perform in this context, and furthermore, school principals do not always understand the role of psychologists in educational settings. However, educational demands exist in psychologists’ daily work in a range of contexts, indicating the relevance of this field and the urgency to improve psychologists’ qualifications. Considering that Brazilian professional development programs for educators and psychologists are usually restricted to technical learning, often ignoring professionals’ real needs and claims, this research aimed to develop a special type of professional development program looking at the group as a source of development, in order to rethink the professional development process from within a collaborative perspective. Research data was generated from a professional development program offered for psychologists and professionals who work within educational settings. This programme involved collaborative group work and was organized and conducted in such a way as to create conditions for change. Concepts of cultural-historical theory – social situation of development, crisis and perezhivanie – were used as analytic tools for data analysis. Analysis indicated the importance of the group as a source of development through dialogue and the co-construction of new ideas and possibilities.
Adaptations can occur at different hierarchical levels, but it can be difficult to identify the level of adaptation in specific cases. A major problem is that selection at a lower level can filter up, creating the illusion of selection at a higher level. We use optimality modeling of the volvocine algae to explore the emergence of genuine group adaptations. We find that it is helpful to develop an explicit model for what group fitness would be in the absence of group-level relationships between traits and group fitness. We call this “counterfactual fitness,” because in many actual cases of interest there are group-level relationships. Once counterfactual fitness is modeled, the difference between effects that filter up and genuine group selection is explicit and so, therefore, is the distinction between apparent and genuine group adaptations. We call the latter group-specific adaptations. Recognizing group-specific adaptations is important because only group-specific adaptations would cause the lower-level units to be maladapted if they were to leave the group and enter a global cell-level population. Thus, as group-specific adaptations evolve, they create selective pressure for increased cohesiveness and individuality of groups. This article suggests that group-specific adaptations could be present in the simplest, earliest branching colonial volvocine species, which do not have distinct specialized cells. The article also makes predictions about the kind of empirical evidence needed to support or refute the hypothesis that a particular trait is a group-level adaptation.
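A toy numerical sketch of the counterfactual-fitness idea, with invented functional forms; nothing below reproduces the paper's actual volvocine model:

```python
# Toy model: compare group fitness with and without a group-level
# relationship between traits and group fitness. The excess of the
# actual over the counterfactual value is the group-specific component.
# All functional forms here are invented for illustration.

def cell_fitness(trait):
    """Lower-level (cell) fitness as a function of an individual trait."""
    return 1.0 + 0.1 * trait

def counterfactual_group_fitness(traits):
    """Group fitness if it were nothing but summed cell fitness;
    any selection this produces has merely 'filtered up'."""
    return sum(cell_fitness(t) for t in traits)

def actual_group_fitness(traits, synergy=0.05):
    """Add a hypothetical group-level benefit of coordination."""
    coordination = synergy * len(traits) * min(traits)
    return counterfactual_group_fitness(traits) + coordination

traits = [0.8, 0.9, 1.0, 0.7]
excess = actual_group_fitness(traits) - counterfactual_group_fitness(traits)
print(excess)   # > 0: a candidate group-specific effect in this toy model
```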
The INBIOSA project brings together a group of experts across many disciplines who believe that science requires a revolutionary transformative step in order to address many of the vexing challenges presented by the world. It is INBIOSA’s purpose to enable the focused collaboration of an interdisciplinary community of original thinkers. This paper sets out the case for support for this effort. The focus of the transformative research program proposal is biology-centric. We admit that biology to date has been more fact-oriented and less theoretical than physics. However, the key leverageable idea is that careful extension of the science of living systems can be more effectively applied to some of our most vexing modern problems than the prevailing scheme, derived from abstractions in physics. While these have some universal application and demonstrate computational advantages, they are not theoretically mandated for the living. A new set of mathematical abstractions derived from biology can now be similarly extended. This is made possible by leveraging new formal tools to understand abstraction and enable computability. [The latter has a much expanded meaning in our context from the one known and used in computer science and biology today, that is "by rote algorithmic means", since it is not known if a living system is computable in this sense (Mossio et al., 2009).] Two major challenges constitute the effort. The first challenge is to design an original general system of abstractions within the biological domain. The initial issue is descriptive leading to the explanatory. There has not yet been a serious formal examination of the abstractions of the biological domain. What is used today is an amalgam; much is inherited from physics (via the bridging abstractions of chemistry) and there are many new abstractions from advances in mathematics (incentivized by the need for more capable computational analyses). Interspersed are abstractions, concepts and underlying assumptions “native” to biology and distinct from the mechanical language of physics and computation as we know them. A pressing agenda should be to single out the most concrete and at the same time the most fundamental process-units in biology and to recruit them into the descriptive domain. Therefore, the first challenge is to build a coherent formal system of abstractions and operations that is truly native to living systems. Nothing will be thrown away, but many common methods will be philosophically recast, just as in physics relativity subsumed and reinterpreted Newtonian mechanics. This step is required because we need a comprehensible, formal system to apply in many domains. Emphasis should be placed on the distinction between multi-perspective analysis and synthesis and on what could be the basic terms or tools needed. The second challenge is relatively simple: the actual application of this set of biology-centric ways and means to cross-disciplinary problems. In its early stages, this will seem to be a “new science”. This White Paper sets out the case of continuing support of Information and Communication Technology (ICT) for transformative research in biology and information processing centered on paradigm changes in the epistemological, ontological, mathematical and computational bases of the science of living systems. Today, curiously, living systems cannot be said to be anything more than dissipative structures organized internally by genetic information.
There is not anything substantially different from abiotic systems other than the empirical nature of their robustness. We believe that there are other new and unique properties and patterns comprehensible at this bio-logical level. The report lays out a fundamental set of approaches to articulate these properties and patterns, and is composed as follows. Sections 1 through 4 (preamble, introduction, motivation and major biomathematical problems) are incipient. Section 5 describes the issues affecting Integral Biomathics and Section 6 describes the aspects of the Grand Challenge we face with this project. Section 7 contemplates the effort to formalize a General Theory of Living Systems (GTLS) from what we have today. The goal is to have a formal system, equivalent to that which exists in the physics community. Here we define how to perceive the role of time in biology. Section 8 describes the initial efforts to apply this general theory of living systems in many domains, with special emphasis on crossdisciplinary problems and multiple domains spanning both “hard” and “soft” sciences. The expected result is a coherent collection of integrated mathematical techniques. Section 9 discusses the first two test cases, project proposals, of our approach. They are designed to demonstrate the ability of our approach to address “wicked problems” which span across physics, chemistry, biology, societies and societal dynamics. The solutions require integrated measurable results at multiple levels known as “grand challenges” to existing methods. Finally, Section 10 adheres to an appeal for action, advocating the necessity for further long-term support of the INBIOSA program. The report is concluded with a preliminary, non-exclusive list of challenging research themes to address, as well as required administrative actions. The efforts described in the ten sections of this White Paper will proceed concurrently. Collectively, they describe a program that can be managed and measured as it progresses.
This article is divided into two main sections. The first discusses “Female Inheritance and the Male Retention Hypothesis.” Permanent groups exist in several species because over generations members share important interests. Considering the association between cooperation and degree of relatedness, it seems to follow that a collective interest is more likely to be achieved when members show a higher degree of relatedness. I argue that if membership is inherited by only one sex, and this is the female sex, this results in a higher degree of relatedness between group members than when membership is inherited by both sexes, or by males only. Indeed, this is found in the overwhelming majority of species of insects, fish, birds, and mammals living in permanent groups. The exceptions to the rule are briefly discussed. Humans are of special interest because human preindustrial societies tend to show either male or female inheritance. The second section asks, “Do Moralizing Gods Raise Paternity Confidence?” Since males inherit valuable membership in patrilocal/lineal societies, they are expected to be more concerned about the probability of paternity than males in matrilocal/lineal societies. Moral rules, and specifically belief in moralizing gods, are expected to reflect this difference. An analysis of cross-cultural data of preindustrial societies does not refute the hypothesis that moralizing gods are more often found in patrilocal/lineal societies, nor is this hypothesis unambiguously supported.
The experimental testing of the Lorentz transformations is based on a family of sets of coordinate transformations that do not comply in general with the principle of equivalence of the inertial frames. The Lorentz and Galilean sets of transformations are the only member sets of the family that satisfy this principle. In the neighborhood of regular points of space-time, all members in the family are assumed to comply with local homogeneity of space-time and isotropy of space in at least one free-falling elevator, to be denoted as Robertson's ab initio rest frame [H. P. Robertson, Rev. Mod. Phys. 21, 378 (1949)]. Without any further assumptions, it is shown that Robertson's rest frame becomes a preferred frame for all member sets of the Robertson family except for, again, Galilean and Einstein's relativities. If one now assumes the validity of Maxwell-Lorentz electrodynamics in the preferred frame, a different electrodynamics spontaneously emerges for each set of transformations. The flat space-time of relativity retains its relevance, which permits an obvious generalization, in a Robertson context, of Dirac's theory of the electron and Einstein's gravitation. The family of theories thus obtained constitutes a covering theory of relativistic physics. A technique is developed to move back and forth between Einstein's relativity and the different members of the family of theories. It permits great simplifications in the analysis of relativistic experiments with relevant “Robertson's subfamilies.” It is shown how to adapt the Clifford algebra version of standard physics for use with the covering theory and, in particular, with the covering Dirac theory.
We develop an account of laboratory models, which have been central to the group selection controversy. We compare arguments for group selection in nature with Darwin's arguments for natural selection to argue that laboratory models provide important grounds for causal claims about selection. Biologists get information about causes and cause-effect relationships in the laboratory because of the special role their own causal agency plays there. They can also get information about patterns of effects and antecedent conditions in nature. But to argue that some cause is actually responsible in nature, they require an inference from knowledge of causes in the laboratory context and of effects in the natural context. This process, cause detection, forms the core of an analogical argument for group selection. We discuss the differing roles of mathematical and laboratory models in constructing selective explanations at the group level and apply our discussion to the units of selection controversy to distinguish between the related problems of cause determination and evaluation of evidence. Because laboratory models are at the intersection of the two problems, their study is crucial for framing a coherent theory of explanation for evolutionary biology.
Continuing work begun in [10], we utilize a notion of forcing for which the generic objects are structures and which allows us to determine whether these “generic” structures compute certain sets and enumerations. The forcing conditions are bounded complexity types which are consistent with a given theory and are elements of a given Scott set. These generic structures will “represent” this given Scott set, in the sense that the structure has a certain weak saturation property with respect to bounded complexity types in the Scott set. For example, if M is a nonstandard model of PA, then M represents the Scott set {{n ∈ ω | M ⊧ “the nth prime divides a”} | a ∈ M}. The notion of forcing yields two main results. The first characterizes the sets of natural numbers computable in all models of a given theory representing a given Scott set. We show that the characteristic function of such a set must be enumeration reducible to a complete existential type which is consistent with the given theory and is an element of the given Scott set. The second provides a sufficient condition for the existence of a structure A such that A represents a countable jump ideal and A does not compute an enumeration of a given family of sets F. This second result is of particular interest when the family of sets which cannot be enumerated is F = Rep[Th(A)]. Under this additional assumption, the second result generalizes a result on TA [6] and on certain other completions of PA [10]. For example, we show that there also exist models of completions of ZF from which one cannot enumerate the family of sets represented by the theory.
Volume II of Classical Recursion Theory describes the universe from a local (bottom-up or synthetical) point of view, and covers the whole spectrum, from the recursive to the arithmetical sets. The first half of the book provides a detailed picture of the computable sets from the perspective of Theoretical Computer Science. Besides giving a detailed description of the theories of abstract Complexity Theory and of Inductive Inference, it contributes a uniform picture of the most basic complexity classes, ranging from small time and space bounds to the elementary functions, with particular attention to polynomial time and space computability. It also deals with primitive recursive functions and larger classes, which are of interest to the proof theorist. The second half of the book starts with the classical theory of recursively enumerable sets and degrees, which constitutes the core of Recursion or Computability Theory. Unlike other texts, usually confined to the Turing degrees, the book covers a variety of other strong reducibilities, studying both their individual structures and their mutual relationships. The last chapters extend the theory to limit sets and arithmetical sets. The volume ends with the first textbook treatment of the enumeration degrees, which admit a number of applications from algebra to the Lambda Calculus. The book is a valuable source of information for anyone interested in Complexity and Computability Theory. The student will appreciate the detailed but informal account of a wide variety of basic topics, while the specialist will find a wealth of material sketched in exercises and asides. A massive bibliography of more than a thousand titles completes the treatment on the historical side.
Are sensory experiences, perceptual beliefs and observation reports faithful encoders of truthful information about the world? The theory-ladenness thesis poses an important challenge to answering this question in the affirmative. Roughly, the thesis holds that theoretical factors affect the content of those experiences, beliefs and reports. In other words, it holds that their content is laden with theory. Theoretical factors here are construed broadly so as to include scientific theories, beliefs and cognitive processes. Two crucial questions arise in relation to the theory-ladenness thesis. First, how pervasive is theory-ladenness? And, second, what is the extent of the distortion theory-ladenness has on the content of experiences, perceptual beliefs and observation reports? If theory-ladenness is not only pervasive, i.e. if it affects the majority of experiences, perceptual beliefs and observation reports, but also highly distortive, then those experiences, beliefs and reports can …
This is not a textbook in mathematical physics—except for one chapter one need not possess much more than geometry and elementary algebra—rather it is a philosophically reflective examination of the cardinal features of special relativity theory. Throughout the book Bohm is not merely doing physics, but thinking about doing physics as well. This metatheoretical reflexion appears in chapters concerning pre-Einsteinian notions of relativity, attempts to save the aether theories, the "ambiguity" of space-time measurements in the new cosmology, "common sense" notions of space and time, and the falsification of scientific theories. There is a long appendix dealing with physics and perception—the relation between scientific objects and perceptual processes. One will, however, by working through the text learn plenty of physics in a rigorous and concise fashion. The author wisely does not attempt to cover the mathematically far more difficult and philosophically more profound General Theory of Einstein. In the area chosen, Bohm has written clearly and felicitously; this should serve as a model for others who like to take their physics with a dollop of philosophy.—P. J. M.
The standard model of subatomic particles and the periodic table of the atoms have the common goal to bring order in the bewildering chaos of the constituents of matter. Their success relies on the presence of fundamental symmetries in their core. The purpose of the book is to share the admiration for the power and the beauty of these symmetries. The reader is taken on a journey from the basic geometric symmetry group of a circle to the sublime dynamic symmetries that govern the motions of the particles. The trail follows the lines of parentage linking groups upstream to the unitary symmetry of the eightfold way of quarks, and to the four-dimensional symmetry of the hydrogen atom. Along the way the theory of symmetry groups is gradually introduced with special emphasis on graphical representations. The final challenge is to open up the structure of Mendeleev's table, which goes beyond the symmetry of the hydrogen atom. Breaking this symmetry to accommodate the multi-electron atoms requires leaving the common ground of linear algebras and exploring the potential of non-linearity.
A number of general points behind the story of this paper may be worth setting out separately, now that we have come to the end. There is perhaps one obvious omission to be addressed right away. Although the word “information” has occurred throughout this paper, it must have struck the reader that we have had nothing to say on what information is. In this respect, our theories may be like those in physics, which do not explain what “energy” is (a notion which seems quite similar to “information” in several ways), but only give some basic laws about its behaviour and transmission. The eventual recommendation made here has been to use a broad type-theoretic framework for studying various more classical and more dynamic notions of proposition in their interaction. This is not quite the viewpoint advocated by many current authors in the area, who argue for a wholesale switch from a ‘static’ to a ‘dynamic’ perspective on propositions. This is not the place, however, to survey the conceptual arguments for and against such a more radical move. This still leaves many questions about possible reductions from one perspective to another. For instance, it would seem that classical systems ought to serve as a ‘limiting case’, which should still be valid after procedural details of some cognitive process have been forgotten. There are various ways of implementing the desired correspondence: e.g. by considering extreme cases with ⫅ equal to identity, or, in the pure relational algebra framework, by considering only pairs (x, x). What we shall want then are reductions of dynamic logics, in those special cases, to classical logic. But perhaps also, more sophisticated views are possible. How do we take a piece of ‘dynamic’ prose, remove control instructions and the like, and obtain a piece of ‘classical’ text, suitable for inference ‘in the light of eternity’? There is also a more technical side to the matter of ‘reduction’. By now, Logic has reached such a state of ‘inter-translatability’ that almost all known variant logics can be embedded into each other, via suitable translations. In particular, once an adequate semantics has been given for a new system, this usually induces an embedding into standard logic: as we know, e.g., for the case of Modal Logic. Likewise, all systems of dynamic interpretation or inference proposed so far admit of direct embedding into an ordinary ‘static’ predicate logic having explicit transition predicates (cf. van Benthem 1988b). Thus, our moral is this. The issue is not whether the new systems of information structure or processing are essentially beyond the expressive resources of traditional logical systems: for, they are not. The issue is rather which interesting phenomena and questions will be put into the right focus by them. The next broad issue concerns the specific use of the perspective proposed here, vis-à-vis concrete proposals for information-oriented or dynamic semantics. The general strategy advocated here is to locate some suitable base calculus and then consider which ‘extras’ are peculiar to the proposal. For instance, this is the spirit in which modal S4 would be a base logic of information models, and intuitionistic logic the special theory devoting itself to upward persistent propositions.
Or, with the examples in Section 4.1, the underlying base logic is our relational algebra, whereas, say, ordinary updates then impose special properties, such as ‘idempotence’: $$xRy \Rightarrow yRy.$$ Does this kind of application presuppose the existence of one distinguished base logic, of which all others are extensions? This would be attractive, and some form of relational algebra or linear logic might be a reasonable candidate. Nevertheless, the enterprise does not rest on this outcome. What matters is an increased sensitivity to the ‘landscape’ of dynamic logics, just as with the ‘Categorial Hierarchy’ in Categorial Grammar (cf. van Benthem 1989a, 1991), where the family of logics with their interconnections seems more important than any specific patriarch.

Finally, perhaps the most important issue in the new framework is the possibility of new kinds of questions arising precisely because of its differences from standard logic. Notably, given the option of regarding propositions as programs, it will be of interest to consider systematically which major questions about programming languages now make sense inside logic too.

EXAMPLE. Correctness. When do we have $$[\![\pi]\!]([\![A]\!]) \subseteq [\![B]\!]$$ for (s, t) propositions A, B and a dynamic (s, (s, t)) proposition π?

Program Synthesis. Which dynamic proposition will take us from an information state satisfying A to one satisfying B? (This question needs refinement, lest there be trivial answers.)

Determinism. Which propositions-as-programs are deterministic, in the sense of defining single-valued functions from states to states?

Querying. What does it mean to ask for information in the present setting? (Again, individual types referring to e will be crucial here.)

This is not merely an agenda for wishful thinking. Within Logic, there are various ways of introducing such concerns into semantics, especially using tools from Automata Theory. (See van Benthem 1989c for further discussion of such computational perspectives in ‘cognitive programming’.)

At least if one believes that ‘dynamics’ is of the essence in cognition (rather than a mere interfacing problem between the halls of eternal truth and the noisy streets of reality), the true test for the present enterprise is the development of a significant new research program, not one merely copying the questions of old.
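To make the 'propositions as programs' reading concrete, here is a minimal sketch (ours, with toy states and update names of our own choosing, not the paper's formalism) in which a dynamic proposition is a relation on information states, and the 'idempotence' and determinism properties above become directly checkable:

# A dynamic proposition modelled as a relation on information states,
# here a set of (input_state, output_state) pairs over frozensets of facts.

STATES = [frozenset(s) for s in [set(), {"p"}, {"q"}, {"p", "q"}]]

def update(fact):
    # The relation of updating with one fact: x R y iff y = x | {fact}.
    return {(x, x | {fact}) for x in STATES}

def compose(r1, r2):
    # Relational composition: first run r1, then r2.
    return {(x, z) for (x, y1) in r1 for (y2, z) in r2 if y1 == y2}

def is_idempotent(r):
    # The condition quoted above: xRy implies yRy.
    return all((y, y) in r for (_, y) in r)

def is_deterministic(r):
    # Single-valued: each input state has at most one output state.
    seen = {}
    return not any(seen.setdefault(x, y) != y for (x, y) in r)

r = compose(update("p"), update("q"))
print(is_idempotent(r), is_deterministic(r))   # True True: these updates are functions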
This article is primarily a study of the group selection controversy, with special emphasis on the period from 1962 to the present and the rise of inclusive fitness theory. Interest is focused on the relations between individual fitness theory and other fitness theories and on the methodological imperatives used in the controversy over the status of these theories. An appendix formalizes the notion of "assertive part" which is used in the informal discussion of the methodological imperatives elicited from the controversy.
Collective Identity, Oppression, and the Right to Self-Ascription argues that groups have an irreducibly collective right to determine the meaning of their shared group identity, and that such a right is especially important for historically oppressed groups. It provides a novel approach to issues of identity politics, group rights, and racial identity, one which combines and develops the insights of contemporary critical theory and race theory, and will thus be of special interest to scholars in these fields.
This paper focuses on doctoral research which explored relationships and interpersonal learning through group dramatherapy and creative interviewing with adolescents in special education. A constructivist grounded theory study, positioning adolescents with intellectual/developmental disabilities as experts of their own relational experiences, revealed a tendency to “copy others.” The final grounded theory presented “copying” as a tool which participants consciously employed “to play with,” “learn from,” and “join in with” others. Participants, who commonly experienced social ostracism, showed awareness that their tendency to “copy others” was underpinned by a need to belong. Belonging was therefore expressed as the ultimate therapeutic experience participants wished to have. Participant responses which link dramatic imitation to a self-identified tendency “to copy” are discussed with regard to how imitation provides an accessible point of dramatic entry from which adolescents in special education begin to explore new ways of being and inter-relating. Recommendations for how dramatherapists might centralize imitative aspects of the dramatic process to achieve therapeutic intent when working alongside adolescents in special education are discussed, with specific focus on creating a space of belonging.

Note on type: Participant quotes extracted from the data are included throughout this article. In order to highlight participants’ contributions, quotes are italicized and presented within speech marks.
This well-written introduction to the theory of recursive functions and effective computability is an English translation of the 1960 German edition. The seven chapters deal with all the usual material, beginning with a treatment of Turing machines and their relation to the intuitive idea of computability, through general recursive functions, to a chapter on such diverse topics as the hierarchy of arithmetical predicates and Fitch's basic logic system. Rather than try to cover the whole subject sketchily, the author confines himself to a narrower range of subjects which he elucidates with admirable clarity. The treatment of Turing machines is similar to that of Davis' book in using quadruples as instruction units in machine "programs," but the treatment differs in some details which might interest the connoisseur. In between technical discussions there are numerous remarks, often of more than a page in length, dealing with the more purely philosophical aspects of the theory under development. Each chapter terminates with a short bibliography. The publishers are to be congratulated for making this valuable work available to a wider range of readers.—P. J. M.
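As a hedged aside (our toy illustration, not the author's presentation): in a quadruple-style formulation, each instruction is a quadruple (state, scanned symbol, action, next state), where the action either writes a symbol or moves the head one cell. A minimal simulator along these lines:

# Quadruple-style Turing machine: instructions (state, symbol, action, next_state),
# where the action is a symbol to write, or 'L'/'R' to move the head.

from collections import defaultdict

def run(quadruples, tape, state="q0", steps=100):
    # Run until no quadruple applies (halt) or the step budget is exhausted.
    table = {(q, s): (a, q2) for (q, s, a, q2) in quadruples}
    cells = defaultdict(lambda: "B")           # blank-filled two-way tape
    cells.update(enumerate(tape))
    head = 0
    for _ in range(steps):
        key = (state, cells[head])
        if key not in table:
            break
        action, state = table[key]
        if action == "L":
            head -= 1
        elif action == "R":
            head += 1
        else:
            cells[head] = action               # write a symbol
    return "".join(cells[i] for i in range(min(cells), max(cells) + 1))

# Example: overwrite every 1 with 0, moving right until a blank is scanned.
prog = [("q0", "1", "0", "q1"),   # write 0 over the scanned 1
        ("q1", "0", "R", "q0"),   # then step right
        ("q0", "B", "B", "qH")]   # on blank, enter a state with no rules: halt
print(run(prog, "111"))           # -> 000B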
The classical Hahn–Banach Theorem states that any linear bounded functional defined on a linear subspace of a normed space admits a norm-preserving linear bounded extension to the whole space. The constructive and computational content of this theorem has been studied by Bishop, Bridges, Metakides, Nerode, Shore, Kalantari, Downey, Ishihara and others, and it is known that the theorem does not admit a general computable version. We prove a new computable version of this theorem without unrolling the classical proof of the theorem itself. More precisely, we study computability properties of the uniform extension operator which maps each functional and subspace to the set of corresponding extensions. It turns out that this operator is upper semi-computable in a well-defined sense. By applying a computable version of the Banach–Alaoglu Theorem we can show that computing a Hahn–Banach extension cannot be harder than finding a zero in a compact metric space. This allows us to conclude that the Hahn–Banach extension operator is $\Sigma^{0}_{2}$-computable, while it is easy to see that it is not lower semi-computable in general. Moreover, we can derive computable versions of the Hahn–Banach Theorem for those functionals and subspaces which admit unique extensions.
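To fix notation (our sketch of the setting described above, not a formula quoted from the paper): for a normed space X, a linear subspace Y of X, and a bounded linear functional f on Y, the uniform extension operator can be written as the set-valued map $$H : (f, Y) \mapsto \{\, g \in X^{*} : g|_{Y} = f \ \text{and} \ \|g\| = \|f\| \,\},$$ and the results above locate H between upper semi-computability, which holds in general, and lower semi-computability, which fails in general, with the selection of an extension being $\Sigma^{0}_{2}$-computable.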
A true Turing machine requires an infinitely long paper tape. Thus a TM can be housed in the infinite world of Newtonian spacetime, but not necessarily in our world, because our world, at least according to our best spacetime theory, general relativity, may be finite. All the same, one can argue for the "existence" of a TM on the basis that there is no such housing problem in some other relativistic worlds that are similar to our world. But curiously enough, and this is the main point of this paper, some of these close worlds have a special spacetime structure that allows TMs to perform certain Turing-unsolvable tasks. For example, in one kind of spacetime a TM can be used to solve first-order predicate logic and the halting problem. And in a more complicated spacetime, TMs can be used to decide arithmetic. These new computers serve to show that Church's thesis is a thoroughly contingent claim. Moreover, since these new computers share the fundamental properties of a TM in ordinary operation, a computability theory based on these non-Turing computers is no less worthy of investigation than orthodox computability theory. Some ideas about this new mathematical theory are given.
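A toy sketch may make the asymmetry vivid (our illustration, not the paper's relativistic construction): an ordinary computer can confirm that a program halts by running it long enough, but can never confirm non-halting within any finite step budget.

# 'Programs' are Python generators that yield once per step and return on halting.

def collatz(n):
    # A sample program: the Collatz iteration, which halts when it reaches 1.
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        yield

def halts_within(run, budget):
    # A finite-budget observer: it can confirm halting, but 'None' never
    # distinguishes a slow halter from a genuine non-halter.
    for steps, _ in enumerate(run):
        if steps >= budget:
            return None        # budget exhausted: undetermined
    return True                # the run finished: the program halted

print(halts_within(collatz(27), budget=200))  # True (27 halts after 111 steps)
print(halts_within(collatz(27), budget=50))   # None: too slow for this observer

An observer with access to an entire infinite run could replace 'None' by a definite 'False', which is exactly the extra power the abstract attributes to these special spacetimes.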
We show that there is a structure of countably infinite signature with $P = N_{2}P$ and a structure of finite signature with $P = N_{1}P$ and $N_{1}P \neq N_{2}P$. We give a further example of a structure of finite signature with $P \neq N_{1}P$ and $N_{1}P \neq N_{2}P$. Together with a result from [10], this implies that for each possibility of P versus NP over structures there is an example of countably infinite signature. Then we show that for some finite ℒ the class of ℒ-structures with $P = N_{1}P$ is not closed under ultraproducts, and obtain as corollaries that this class is not $\Delta$-elementary and that the class of ℒ-structures with $P \neq N_{1}P$ is not elementary. Finally we prove that for all f dominating all polynomials there is a structure of finite signature with the following properties: $P \neq N_{1}P$, $N_{1}P \neq N_{2}P$, the levels $N_{2}TIME(n^{i})$ of $N_{2}P$ and the levels $N_{1}TIME(n^{i})$ of $N_{1}P$ are different for different i, indeed $DTIME(n^{i'}) \nsubseteq N_{2}TIME(n^{i})$ if $i' > i$; $DTIME(f) \nsubseteq N_{2}P$, and $N_{2}P \nsubseteq DEC$, where DEC is the class of recognizable sets with recognizable complements. So this is an example in which the internal structure of $N_{2}P$ is analyzed in more detail. In our proofs we use methods in the style of classical computability theory to construct structures, except for one use of ultraproducts.
Issues of identity and reduction have monopolized much of the philosopher of mind’s time over the past several decades. Interestingly, while investigations of these topics have proceeded at a steady rate, the motivations for doing so have shifted. When the early identity theorists, e.g. U. T. Place (1956), Herbert Feigl (1958), and J. J. C. Smart (1959, 1961), first gave voice to the idea that mental events might be identical to brain processes, they had as their intended foil the view that minds are immaterial substances. But very few philosophers of mind today take this proposal seriously. Why, then, the continued interest in identity and reduction? The concern, as philosophers like Hilary Putnam and Jerry Fodor have expressed it, is that a victory for identity or reduction is a defeat for psychology. For if minds are physical, or if mental events are physical events, then psychologists might as well disassemble their laboratories, making room for the neuroscientists and molecular biologists who are in a better position to explain those phenomena once misdescribed as “psychological.” The worry nowadays is not that locating thought in immaterial souls will make psychology intractable, but that locating thoughts in material brains will make it otiose.
The axiomatic bases of Special Relativity Theory (SRT) are thoroughly re-examined from an operational point of view, with particular emphasis on the status of Einstein synchronization in the light of the possibility of arbitrary synchronization procedures in inertial reference frames. Once correctly and explicitly phrased, the principles of SRT allow for a wide range of “theories” that differ from standard SRT only in the chosen synchronization procedures, but are wholly equivalent to SRT in predicting empirical facts. This results in the introduction, in the full background of SRT, of a suitable synchronization gauge. A complete hierarchy of synchronization gauges is introduced and elucidated, ranging from the useful Selleri synchronization gauge (which should lead, according to Selleri, to a multiplicity of theories alternative to SRT) to the more general Mansouri–Sexl synchronization gauge and, finally, to the even more general Anderson–Vetharaniam–Stedman synchronization gauge. It is shown that all these gauges do not challenge SRT, as claimed by Selleri, but simply lead to a number of formalisms which leave the geometrical structure of Minkowski spacetime unchanged. Several aspects of fundamental and applied interest related to the conventional character of the synchronization choice are discussed, encompassing the issue of the one-way velocity of light in inertial and rotating reference frames, the working of the Global Positioning System (GPS), and the recasting of Maxwell's equations in generic synchronizations. Finally, it is shown how the gauge freedom introduced in SRT can be exploited to give a clear explanation of the Sagnac effect for counter-propagating matter beams.
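For orientation, a standard parametrization of this freedom (a textbook relation, not a formula quoted from the paper) is Reichenbach's ε-synchronization: if a light signal leaves clock A at $t_A$, is reflected at B, and returns to A at $t'_A$, the reflection event is assigned the time $$t_B = t_A + \varepsilon\,(t'_A - t_A), \qquad 0 < \varepsilon < 1,$$ where $\varepsilon = 1/2$ is Einstein synchronization. Other values of ε reassign the one-way light speeds to $c_+ = c/2\varepsilon$ and $c_- = c/2(1-\varepsilon)$ while leaving the round-trip speed, and hence all empirical predictions, unchanged, which is the sense in which the choice is a gauge.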
Self-Determination Theory (SDT) is an empirically based organismic theory of human motivation, development, and well-being that shares many points of interest with the fields of moral development and moral education. Yet SDT has been largely disconnected from these fields so far. How can we define and empirically assess autonomous moral motivation? How is moral autonomy achieved in the course of development? And what are the relationships between leading a moral life and happiness? These questions have been occupying moral psychologists and educators for a long time. They are focal for SDT as well. This special issue highlights various lines of intersection between SDT, morality, and education. Contributions either expand SDT into the moral domain or incorporate elements of SDT into moral theory, with the ultimate goal of integrating fields that inherently belong together.