The positions Ross & Spurrett (R&S) take on issues of information, causality, functionalism, and emergence are actually implicit in the theory and practice of statistical physics, specifically in the way it relates macroscopic collective coordinates to microscopic physics. The reasons for taking macroscopic physical variables like temperature or magnetization to be real apply equally to mental properties like pain.
The brief coda devoted to the Trinity in Schleiermacher's The Christian Faith does not intend to marginalize the doctrine. It indicates that the doctrine, though at present still to be completed, is the recapitulation of the entire scheme of redemption. The central structuring concept in that scheme is that of the genuine union between the divine existence of the infinite creator and human nature in Christ, a pattern replicated in the coming of the Holy Spirit as the inauguration of a second, strictly analogous union of God and humanity. The subtle way in which Schleiermacher conceives these unions in line with his rigorous understanding of radical causality of divine creation requires careful unpacking. Only such an analysis brings to light the architecture of the doctrine of the Trinity, and its function as a kind of meta-doctrine, connecting and coordinating different elements in the doctrine of grace.
Since the very beginning of quantum theory there has been a debate on the proper role of space and time in it. Some authors assumed that space and time have their own algebraic operators. On that basis they supposed that quantum particles had “coordinates of position”, even though those coordinates were not possible to determine with infinite precision. Furthermore, time in quantum physics was taken to be on an equal footing, by means of a so-called “Heisenberg’s fourth relation of indeterminacy” concerning time and energy. In this paper, the proper role of space and time in the core of non-relativistic quantum physics is analyzed. We will find that, rigorously, that relation for time and energy has a different root. For the role of space, it will be argued that there is no “coordinate of position” in the conceptual structure of quantum physics because quantum particles are not point-like objects. DOI:10.5007/1808-1711.2010v14n2p241.
In recent years, Reichenbach’s 1920 conception of the principles of coordination has attracted increased attention after Michael Friedman’s attempt to revive Reichenbach’s idea of a “relativized a priori”. This paper follows the origin and development of this idea in the framework of Reichenbach’s distinction between the axioms of coordination and the axioms of connection. It suggests a further differentiation among the coordinating axioms and accordingly proposes a different account of Reichenbach’s “relativized a priori”.
Philosophers using game-theoretical models of human interactions have, I argue, often overestimated what sheer rationality can achieve. (References are made to David Gauthier, David Lewis, and others.) In particular I argue that in coordination problems rational agents will not necessarily reach a unique outcome that is most preferred by all, nor a unique 'coordination equilibrium' (Lewis), nor a unique Nash equilibrium. Nor are things helped by the addition of a successful precedent, or by common knowledge of generally accepted personal principles. Commitments like those generated by agreements may be necessary for rational expectations to arise. Social conventions, construed as group principles (following the analysis in my book On Social Facts), would suffice for this task.
We re-examine the relationship between coordination, legal sanctions, and free-riding in light of the recent controversy regarding the applicability of the coordination problem paradigm of law-making. We argue that legal sanctions can help solve coordination problems by eliminating socially suboptimal equilibrium outcomes. Once coordination has taken place, however, free-riding cannot lead to the breakdown of coordination outcomes, even if sanctions may still be effective at increasing the equity of such outcomes. Finally, we argue that it is the choice of a legal or constitutional system rather than the choice of law that is paradigmatic of the coordination problem. This view requires a re-assessment of the normative status of sanctions attached to individual laws.
Frege's picture of attitude states and attitude reports requires a notion of content that is shareable between agents, yet more fine-grained than reference. Kripke challenged this picture by giving a case in which the expressions that resist substitution in an attitude report share a candidate notion of fine-grained content. A consensus view developed which accepted Kripke's general moral and replaced the Fregean picture with an account of attitude reporting on which states are distinguished in conversation by their (private) representational properties. I begin in support of the consensus by showing how a sort of de facto coordination on mental symbols is possible, even for unsophisticated agents. But I go on to argue that whenever conditions are ripe for de facto coordination on symbols, there is an inter-subjective relation that supports a fine-grained notion of content resistant to Kripke's challenge. The consensus view corresponds to a Kripke-resistant strain of the Fregean picture.
Reichenbach's Philosophy of Space and Time (1928) avoids most of the logical positivist pitfalls it is generally held to exemplify, notably both conventionalism and verificationism. To see why, we must appreciate that Reichenbach's interest lies in how mathematical structures can be used to describe reality, not in how words like 'distance' acquire meaning. Examination of his proposed "coordinative definition" of congruence shows that Reichenbach advocates a reductionist analysis of the relations figuring in physical geometry (contrary to common readings that attribute to him a holistic conventionalism), while embracing a thoroughly holistic understanding of empirical confirmation (contrary to rival operationalist readings).
This paper argues that multiple coordinations like tall, thin and happy are interpreted in a “flat” iterative process, but using “nested” recursive application of binary coordination operators in the compositional meaning derivation. Ample motivation for flat interpretation is shown by contrasting such coordinations with nested, syntactically ambiguous, coordinate structures like tall and thin and happy. However, new evidence from type shifting and predicate distribution with verb phrases shows motivation for an independent hierarchical ingredient in the compositional semantics of multiple coordination with no parallel hierarchy in the syntax. This establishes a contrast between operations at the syntax-semantics interface and compositional semantic mechanisms. At the same time, such evidence motivates the treatment of operations like type shifting and distributivity as purely semantic.
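The contrast between a flat sequence of conjuncts and nested binary composition can be sketched computationally. Below is a minimal illustration (with made-up predicates and entities, not the paper's formal system): a flat list of adjectival meanings is interpreted by iterated, left-associated application of a single binary coordination operator — a fold.

```python
from functools import reduce

# Binary predicate coordination: [[P and Q]] holds of x iff both P and Q hold of x.
def coord_and(p, q):
    return lambda x: p(x) and q(x)

# Made-up one-place predicates standing in for "tall", "thin", "happy".
tall = lambda e: e["height"] > 180
thin = lambda e: e["weight"] < 70
happy = lambda e: e["mood"] == "happy"

# A flat list of conjuncts, composed by nested binary application:
# ((tall AND thin) AND happy) -- exactly what reduce computes.
tall_thin_happy = reduce(coord_and, [tall, thin, happy])

alice = {"height": 185, "weight": 62, "mood": "happy"}
bob = {"height": 170, "weight": 62, "mood": "happy"}
print(tall_thin_happy(alice), tall_thin_happy(bob))
```

The surface string supplies only the flat list; the nesting lives entirely in the compositional derivation, which is the division of labour the paper describes.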
How do rational agents coordinate in a single-stage, noncooperative game? Common knowledge of the payoff matrix and of each player's utility maximization among his strategies does not suffice. This paper argues that utility maximization among intentions and then acts generates coordination yielding a payoff-dominant Nash equilibrium. ‡I thank the audience at my paper's presentation at the 2006 PSA meeting for many insightful points. †To contact the author, please write to: Philosophy Department, University of Missouri, Columbia, MO 65211; e-mail: WeirichP@missouri.edu.
This article begins by pointing out the difficulties involved in the insertion of freedom into economics: it poses epistemological problems that are not satisfactorily solved by the standard theories. The article suggests that the Aristotelian epistemological frame of practical rationality may be an apt position from which to deal with freedom in economics. Aristotle's concepts of society and economics are first introduced, and the role of virtues in achieving economic coordination is presented. Then the corresponding concept of practical science is described, showing its main characteristics and how they fit in with traditional political economy. The concept of value neutrality receives special attention in the article: a reinterpretation of its meaning is proposed. The article concludes that Aristotle's broad concepts of practical reason and science leave room for a more comprehensive notion of economics.
Why is interaction so simple? This article presents a theory of interaction based on the use of shared representations as “coordination tools” (e.g., roundabouts that facilitate coordination of drivers). By aligning their representations (intentionally or unintentionally), interacting agents help one another to solve interaction problems in that they remain predictable, and offer cues for action selection and goal monitoring. We illustrate how this strategy works in a joint task (building together a tower of bricks) and discuss its requirements from a computational viewpoint.
In her book Rationality and Coordination (Cambridge University Press, 1994) Cristina Bicchieri brings together (and adds to) her own contributions to game theory and the philosophy of economics published in various journals in the period 1987-1992. The book, however, is not a collection of separate articles but rather a homogeneous unit organized around some central themes in the foundations of non-cooperative game theory. Bicchieri’s exposition is admirably clear and well organized. Somebody with a good knowledge of game theory would probably benefit mainly from reading the second part of Chapter 3 (from Section 3.6 onward) and Chapter 4. On the other hand, those who have had little exposure to game theory would certainly benefit from reading the entire book. I shall begin with an overview of the content of the book and then offer some critical comments on what I consider to be the most important part of it.
This comment makes four related points. First, explaining coordination is different from explaining cooperation. Second, solving the coordination problem is more important for the theory of games than solving the cooperation problem. Third, a version of the Principle of Coordination can be rationalized on individualistic grounds. Finally, psychological game theory should consider how players perceive their gaming situation.
The concept of locally specialized functions dominates research on higher brain function and its disorders. Locally specialized functions must be complemented by processes that coordinate those functions, however, and impairment of coordinating processes may be central to some psychotic conditions. Evidence for processes that coordinate activity is provided by neurobiological and psychological studies of contextual disambiguation and dynamic grouping. Mechanisms by which this important class of cognitive functions could be achieved include those long-range connections within and between cortical regions that activate synaptic channels via NMDA-receptors, and which control gain through their voltage-dependent mode of operation. An impairment of these mechanisms is central to PCP-psychosis, and the cognitive capabilities that they could provide are impaired in some forms of schizophrenia. We conclude that impaired cognitive coordination due to reduced ion flow through NMDA-channels is involved in schizophrenia, and we suggest that it may also be involved in other disorders. This perspective suggests several ways in which further research could enhance our understanding of cognitive coordination, its neural basis, and its relevance to psychopathology. Key Words: attention; cerebral cortex; cognitive coordination; cognitive neuropsychiatry; cognitive neuropsychology; context disorganization; gamma rhythms; Gestalt theory; glutamate; grouping; memory; NMDA-receptors; PCP-psychosis; perceptual organization; schizophrenia.
Adaptationists explain the evolution of religion from the cooperative effects of religious commitments, but which cooperation problem does religion evolve to solve? I focus on a class of symmetrical coordination problems for which there are two pure Nash equilibria: (1) ALL COOPERATE, which is efficient but relies on full cooperation; (2) ALL DEFECT, which is inefficient but pays regardless of what others choose. Formal and experimental studies reveal that for such risky coordination problems, only the defection equilibrium is evolutionarily stable. The following makes sense of otherwise puzzling properties of religious cognition and cultures as features of cooperative designs that evolve to stabilise such risky exchange. The model is interesting because it explains lingering puzzles in the data on religion, and better integrates evolutionary theories of religion with recent, well-motivated models of cooperative niche construction.
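The risky coordination problems described here have the payoff structure of a stag hunt. A minimal sketch, with illustrative payoff numbers of my own rather than the paper's: COOPERATE pays well only if the other player also cooperates, DEFECT pays the same regardless, and the safe ALL DEFECT equilibrium is risk-dominant.

```python
# Toy symmetric "risky coordination" (stag-hunt-style) game; payoffs are
# invented for illustration, not taken from the paper.
C, D = 0, 1
payoff = {
    (C, C): (4, 4),   # ALL COOPERATE: efficient, but requires full cooperation
    (C, D): (0, 3),
    (D, C): (3, 0),
    (D, D): (3, 3),   # ALL DEFECT: inefficient, but pays regardless
}

def is_pure_nash(a1, a2):
    """A profile is a pure Nash equilibrium if neither player gains
    by a unilateral deviation."""
    u1, u2 = payoff[(a1, a2)]
    return (all(u1 >= payoff[(alt, a2)][0] for alt in (C, D)) and
            all(u2 >= payoff[(a1, alt)][1] for alt in (C, D)))

equilibria = {(a1, a2) for a1 in (C, D) for a2 in (C, D) if is_pure_nash(a1, a2)}

# Harsanyi-Selten risk dominance: compare products of unilateral deviation
# losses. The larger product marks the risk-dominant equilibrium.
risk_CC = (payoff[(C, C)][0] - payoff[(D, C)][0]) * (payoff[(C, C)][1] - payoff[(C, D)][1])
risk_DD = (payoff[(D, D)][0] - payoff[(C, D)][0]) * (payoff[(D, D)][1] - payoff[(D, C)][1])
print(equilibria, risk_CC, risk_DD)
```

Both pure profiles come out as equilibria, and ALL DEFECT risk-dominates — the feature that, per the abstract, makes the cooperative equilibrium hard to stabilise without something like religious commitment.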
It is widely appreciated that establishment and maintenance of coordination are among the key evolutionary promoters and stabilizers of human language. In consequence, it is also generally recognized that game theory is an important tool for studying these phenomena. However, the best known game theoretic applications to date tend to assimilate linguistic communication with signaling. The individualistic philosophical bias in Western social ontology makes signaling seem more challenging than it really is, and thus focuses attention on theoretical problems - for example, coordination on lexical meaning - that actual evolution did not need to solve by improving humans' strategic or social intelligence relative to the endowments of other primates. At the same time, issues of genuine evolutionary significance related to language, especially those around the tensions between individual and collective agency, and around intergenerational accumulation of knowledge, are obscured. This in turn leads to underestimation of the potential contribution that game theory can make to enlightening models of the evolution of human language. JEL classification: A11, A12, B52, C73, D02, D03, D82, Z13.
On many occasions, individuals are able to coordinate their actions. The first empirical evidence to this effect was described by Schelling (1960) in an informal experiment. His results were corroborated many years later by Mehta et al. (1994a,b) and Bacharach and Bernasconi (1997). From the point of view of mainstream game theory, the success of individuals in coordinating their actions is something of a mystery. If there are two or more strict Nash equilibria, mainstream game theory has no means of explaining why people tend to choose their part of one and the same equilibrium. Textbooks (see, e.g., Rasmusen, 1989 and Kreps, 1990) refer to the fact that players may use focal points (see Schelling (1960)) or social conventions (see Lewis (1969)). Neither notion can easily be incorporated into mainstream game theory, however. The notion of social conventions has recently been extensively studied in the context of evolutionary game theory, where agents in a population interact with each other. The central focus of this paper, however, is on situations where a few players play a game only once, and I study how they may coordinate their actions.
Game theory's paradoxes stimulate the study of rationality. Sometimes they motivate the revising of standard principles of rationality. Other times they call for revising applications of those principles or introducing supplementary principles of rationality. I maintain that rationality adjusts its demands to circumstances, and in ideal games of coordination it yields a payoff-dominant equilibrium.
The principles that cognitive engineers need to design better work environments are ones that explain interactivity and distributed cognition: how human agents interact with themselves and others, their work spaces, and the resources and constraints that populate those spaces. A first step in developing these principles is to clarify the fundamental concepts of environment, coordination, and behavioural function. Using simple examples, I review the changes the distributed perspective forces on these basic notions.
Human social coordination is often mediated by language. Through verbal dialogue, people direct each other's attention to properties of their shared environment, they discuss how to jointly solve problems, share their introspections, and distribute roles and assignments. In this article, we propose a dynamical framework for the study of the coordinative role of language. Based on a review of a number of recent experimental studies, we argue that shared symbolic patterns emerge and stabilize through a process of local reciprocal linguistic alignment. Such patterns in turn come to facilitate and refine social coordination by enabling the alignment, joint construction and navigation of conceptual models and actions. Implications of the framework are illustrated and discussed in relation to a case study where dyads of interlocutors interact verbally to reach joint decisions in a perceptual discrimination task. Keywords: social coordination; language; communication; linguistic alignment; symbolic patterns; affordances; emergence; evolution; adaptivity; interaction.
In the ideal market of general equilibrium theory, choices are made in full knowledge of one another, and all expectations are fulfilled. This pre-harmonization of individual plans does not occur in real-world markets, where decisions must be taken in ignorance of one another. The Austrian school grants this, but claims that real-world price systems are nonetheless effective in coordinating saving and investment decisions, which are motivated by disparate considerations. In contrast, Keynes held that without the pre-reconciliation of individual plans, investment and employment would be less than optimal, and the resulting distribution of income arbitrary and inequitable.
We use a dynamic, context-sensitive approach to abductive interpretation to describe coordinated processes of understanding, generation and accommodation in dialogue. The agent updates the dialogue uniformly for its own and its interlocutors’ utterances, by accommodating a new context, inferred abductively, in which utterance content is both true and prominent. The generator plans natural and comprehensible utterances by exploiting the same abductive preferences used in understanding. We illustrate our approach by formalizing and implementing some interactions between information structure and the form of referring expressions.
The traditional solution concept for noncooperative game theory is the Nash equilibrium, which contains an implicit assumption that players' probability distributions satisfy probabilistic independence. However, in games with more than two players, relaxing this assumption results in a more general equilibrium concept based on joint beliefs (Vanderschraaf, 1995). This article explores the implications of this joint-beliefs equilibrium concept for two kinds of conflictual coordination games: crisis bargaining and public goods provision. We find that, using updating consistent with Bayes' rule, players' beliefs converge to equilibria in joint beliefs which do not satisfy probabilistic independence. In addition, joint beliefs greatly expand the set of mixed equilibria. On the face of it, allowing for joint beliefs might be expected to increase the prospects for coordination. However, we show that if players use joint beliefs, which may be more likely as the number of players increases, then the prospects for coordination in these games decline vis-à-vis independent beliefs.
Prior research suggests that the action system is responsible for creating an immediate sense of self by determining whether certain sensations and perceptions are the result of one's own actions. In addition, it is assumed that declarative, episodic, or autobiographical memories create a temporally extended sense of self or some form of identity. In the present article, we review recent evidence suggesting that action (procedural) knowledge also forms part of a person's identity, an action identity, so to speak. Experiments that addressed self-recognition of past actions, prediction, and coordination provide ample evidence for this assumption. The phenomena observed in these experiments can be explained by the assumption that observing an action results in the activation of action representations, the more so when the action observed corresponds to the way in which the observer would produce it.
Following Schelling (1960), coordination problems have mainly been considered in a context where agents can achieve a common goal (e.g., rendezvous) only by taking common actions. Dynamic versions of this problem have been studied by Crawford and Haller (1990), Ponssard (1994), and Kramarz (1996). This paper considers an alternative dynamic formulation in which the common goal (dispersion) can only be achieved by agents taking distinct actions. The goal of spatial dispersion has been studied in static models of habitat selection, location or congestion games, and network analysis. Our results show how this goal can be achieved gradually, by indistinguishable non-communicating agents, in a dynamic setting.
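The dispersion goal can be illustrated with a toy dynamic of my own devising (not the paper's actual protocol): indistinguishable, non-communicating agents each occupy a site; an agent alone at its site stays put, while colliding agents independently re-randomize, so full dispersion is reached gradually.

```python
import random

def disperse(n_agents=5, n_sites=5, max_rounds=10_000, seed=7):
    """Illustrative dispersion dynamic: agents who share a site re-randomize;
    agents alone at a site repeat their action. Returns the final positions
    and the number of rounds taken."""
    rng = random.Random(seed)
    pos = [rng.randrange(n_sites) for _ in range(n_agents)]
    for rounds in range(max_rounds):
        if len(set(pos)) == n_agents:          # full dispersion reached
            return pos, rounds
        counts = {s: pos.count(s) for s in pos}
        pos = [p if counts[p] == 1 else rng.randrange(n_sites)
               for p in pos]
    return pos, max_rounds

pos, rounds = disperse()
print(sorted(pos), rounds)
```

Each agent conditions only on whether its own site is crowded, so the rule respects both indistinguishability and the absence of communication; dispersion emerges as the unique absorbing state.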
Information processing theories of memory and skills can be reformulated in terms of how categories are physically and temporally related, a process called conceptual coordination. Dreaming can then be understood as a story-understanding process in which two mechanisms found in everyday comprehension are missing: conceiving sequences (chunking categories in time as a higher-order categorization) and coordinating across modalities (e.g., relating the sound of a word and the image of its meaning). On this basis, we can readily identify isomorphisms between dream phenomenology and neurophysiology, and explain the function of dreaming as facilitating future coordination of sequential, cross-modal categorization (i.e., REM sleep lowers activation thresholds, “unlearning”). [Hobson et al.; Nielsen; Solms; Revonsuo; Vertes & Eastman]
The paper presents a variation of the EMAIL Game, originally proposed by Rubinstein (American Economic Review, 1989), in which coordination on the more rewarding-risky joint course of action is shown to obtain, even when the relevant game is, at most, ``mutual knowledge.'' In the example proposed, a mediator is introduced in such a way that two individuals are symmetrically informed, rather than asymmetrically as in Rubinstein, about the game chosen by nature. As long as the message failure probability is sufficiently low, with the upper bound being a function of the game payoffs, conditional beliefs in the opponent's actions can allow players to choose a more rewarding-risky action. The result suggests that, for efficient coordination to obtain, the length of interactive knowledge on the game, possibly up to ``almost common knowledge,'' does not seem to be a major conceptual issue, and that emphasis should be focused instead on the communication protocol and an appropriate relationship between the reliability of communication channels and the payoffs at stake.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally-recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition – in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of: biological visual processing, especially the retinocortical and ventral (“what”) parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs. experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book’s website. • The discovery of several algorithmic similarities between vision and semantics. • The support of all of this by means of simulations, and the packaging of all of this in a coherent theoretical framework.
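Learning vector quantization itself is easy to sketch. The following is a bare-bones LVQ1 illustration in Python (the book's own simulations are in MATLAB, and the data and parameters here are invented for demonstration): one prototype per class is attracted to samples of its own class and repelled by samples of other classes, after which classification is by nearest prototype.

```python
import random

def lvq1_train(data, labels, n_classes=2, lr=0.2, epochs=30, seed=0):
    """Bare-bones LVQ1: the winning (nearest) prototype moves toward a
    sample of its own class and away from a sample of another class."""
    rng = random.Random(seed)
    # Initialize each class prototype at a random sample of that class.
    protos = []
    for c in range(n_classes):
        xs = [x for x, y in zip(data, labels) if y == c]
        protos.append(list(rng.choice(xs)))
    for _ in range(epochs):
        for x, y in zip(data, labels):
            win = min(range(n_classes),
                      key=lambda c: sum((a - b) ** 2 for a, b in zip(protos[c], x)))
            sign = lr if win == y else -lr   # attract if correct, repel if not
            protos[win] = [p + sign * (a - p) for p, a in zip(protos[win], x)]
    return protos

def lvq_classify(protos, x):
    return min(range(len(protos)),
               key=lambda c: sum((a - b) ** 2 for a, b in zip(protos[c], x)))

# Two well-separated toy clusters standing in for "anticorrelated" (class 0)
# vs. "correlated" (class 1) semantic inputs.
data = [(0.0, 0.1), (0.1, 0.0), (0.9, 1.0), (1.0, 0.9)]
labels = [0, 0, 1, 1]
protos = lvq1_train(data, labels)
print(protos)
```

The competitive winner-take-all step plays the role of the selective first layer, and nearest-prototype readout the role of the generalizing second layer, in the two-layer architecture the book describes.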
Schizophrenics exhibit a deficit in theory of mind (ToM), but an intact theory of biology (ToB). One explanation is that ToM relies on an independent module that is selectively damaged. Phillips & Silverstein's analyses suggest an alternative: ToM requires the type of coordination that is impaired in schizophrenia, whereas ToB is spared because this type of coordination is not involved.
Context, connection, and coordination (CCC) describe well where the problems that apply to thought-disordered patients with schizophrenia lie. But they may be part of the experience of those with other symptom constellations. Switching is an important mechanism to allow context to be applied appropriately to changing circumstances. In some cases, NMDA-voltage modulations may be central, but gain and shift are also functions that monoaminergic systems express in CCC.
Studies of aging and autism as outlined by Bertone, Mottron, & Faubert (Bertone et al.) and by Faubert & Bertone suggest that disorders of cognitive coordination involving impairments of dynamic gestalt grouping and context-sensitivity may be common to several different disorders. We agree that such studies may shed light on these processes and their neuronal bases. However, we also emphasize that dynamic grouping and context-sensitivity can fail in various ways, and that, although the underlying pathophysiology may often involve NMDA-receptor malfunction, many different malfunctions are possible, and each of these may result from any one of a number of different etiologies.
This dissertation is based on the compositional model theoretic approach to natural language semantics that was initiated by Montague (1970) and developed by subsequent work. In this general approach, coordination and negation are treated following Keenan & Faltz (1978, 1985) using boolean algebras. As in Barwise & Cooper (1981), noun phrases uniformly denote objects in the boolean domain of generalized quantifiers. These foundational assumptions, although elegant and minimalistic, are challenged by various phenomena of coordination, plurality and scope. The dissertation solves these problems by developing a flexible process of meaning composition, as first proposed by Partee & Rooth (1983). Flexible interpretation involves semantic operations without any phonological counterpart, which participate in the interpretation process and change meanings of overt expressions. The dissertation introduces a novel flexible system where a small number of operations describe the behaviour of complex phenomena such as `non-boolean' and, the scope of indefinites and the semantics of collectivity with quantificational NPs. The proposed theory is based on a distinction between two features of meanings in natural language.
The target article presents a model for schizophrenia spanning four levels of abstraction: molecules, cells, cognition, and syndrome. An important notion in the model is that of coordination, applicable both to the level of cells and to that of cognition. The molecular level provides an “implementation” of the coordination at the cellular level, which in turn underlies the coordination at the cognitive level, giving rise to the clinical symptoms.
This article is a review of Paul Dumouchel's Emotions, which focuses on the two levels of his emotion theory and heuristic. It interprets them both as the expression, in the domain of emotions, of a post-classical conception of nature and science that belongs to the tradition of scientific research on self-organization. Its main thesis, which is also shared by Emotions, is that creativity in nature and science corresponds to a process of coordination.
We calculate the Lebesgue measures of the stability sets of Nash equilibria in pure coordination games. The results allow us to observe that the ordering induced by the Lebesgue measure of stability sets upon strict Nash equilibria does not necessarily agree with the ordering induced by risk-dominance. Accordingly, an equilibrium selection theory based on the Lebesgue measure of stability sets would necessarily differ from one which uses the Nash property as a point of orientation.
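For intuition, a toy computation (payoff numbers my own, not the article's): in a 2×2 pure coordination game the stability set of a strict equilibrium can be approximated as the region of belief pairs under which its actions are unique best replies, and its Lebesgue measure estimated on a grid. In this simple 2×2 case the measure ordering and the risk-dominance (Nash-product) ordering happen to agree; the article's point is that such agreement is not guaranteed in general.

```python
# Toy 2x2 pure coordination game (payoffs invented for illustration):
# (A, A) pays (a1, a2), (B, B) pays (b1, b2), miscoordination pays 0.
a1, a2 = 3.0, 2.0
b1, b2 = 1.0, 1.0

# Player 1 best-replies A when q * a1 > (1 - q) * b1, i.e. q > b1 / (a1 + b1),
# where q is player 1's belief that player 2 plays A (symmetrically for 2).
# Measure the two best-reply regions on a grid over the belief square.
n = 400
grid = [(i + 0.5) / n for i in range(n)]
area_A = sum(1 for p in grid for q in grid
             if q > b1 / (a1 + b1) and p > b2 / (a2 + b2)) / n ** 2
area_B = sum(1 for p in grid for q in grid
             if q < b1 / (a1 + b1) and p < b2 / (a2 + b2)) / n ** 2

# Risk dominance compares the Nash products of the two strict equilibria.
risk_A, risk_B = a1 * a2, b1 * b2
print(area_A, area_B, risk_A, risk_B)
```

Here (A, A) has both the larger belief-region measure and the larger Nash product; games where the two orderings come apart are the cases the article examines.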
The Phillips & Silverstein model of NMDA-mediated coordination deficits provides a useful heuristic for the study of schizophrenic cognition. However, the model does not specifically account for the development of schizophrenia-spectrum disorders. The P&S model is compared to Meehl's seminal model of schizotaxia, schizotypy, and schizophrenia, as well as the model of schizophrenic cognitive dysfunction posited by McCarley and colleagues.
Consideration of color alone can give a misleading impression of the three approaches to category coordination: the nativist, empiricist and culturalist models. Empiricist models can benefit from a wider range of correlational information in the environment. Also, all three approaches may explain a set of perceptual categories within the human repertoire. Finally, a suggestion is offered for supplementing the naming game by varying the social status of agents.
Neurophysiological investigations of the past two decades have consistently demonstrated a deficit in sensory gating associated with schizophrenia. Phillips & Silverstein interpret this impairment as being consistent with cognitive coordination dysfunction. However, the physiological mechanisms that underlie sensory gating have not been shown to involve gamma-band oscillations or NMDA-receptors, both of which are critical neural elements in the cognitive coordination model.
This paper is concerned with adaptive learning and coordination processes. Implementing agent-based modeling techniques (Learning Classifier Systems, LCS), we focus on the twofold impact of cognitive and environmental complexity on learning and coordination. Within this framework, we introduce the notion of Adaptive Learning Agent with Rule-based Memory (ALARM), which is a particular class of Artificial Adaptive Agent (AAA, Holland and Miller 1991). We show that equilibrium is approached to a high degree, but never perfectly reached. We also demonstrate that memorization and learning capacities depend upon the relative discordance between the cognitive complexity of agents' mental models and the degree of stability of the environment.
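The flavor of rule-based adaptive learning can be caricatured in a few lines. The following is a drastic schematic simplification invented for illustration, not the paper's ALARM architecture: condition–action rules carry strengths, action selection is greedy with occasional exploration, and strengths are updated toward received rewards, so equilibrium behavior is approached but, due to continuing exploration, never perfectly reached.

```python
import random

class RuleAgent:
    """Schematic rule-strength learner (hypothetical simplification of a
    Holland-style classifier system; not the paper's ALARM model)."""
    def __init__(self, states, actions, lr=0.2, eps=0.1):
        # rule memory: one strength per (condition, action) pair
        self.strength = {(s, a): 1.0 for s in states for a in actions}
        self.actions = actions
        self.lr, self.eps = lr, eps

    def act(self, state):
        # epsilon-greedy: mostly exploit the strongest matching rule
        if random.random() < self.eps:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.strength[(state, a)])

    def learn(self, state, action, reward):
        # move the fired rule's strength toward the observed reward
        key = (state, action)
        self.strength[key] += self.lr * (reward - self.strength[key])

random.seed(0)
agent = RuleAgent(states=[0, 1], actions=[0, 1])
for _ in range(2000):
    s = random.choice([0, 1])
    a = agent.act(s)
    r = 1.0 if a == s else 0.0   # toy environment: reward matching the signal
    agent.learn(s, a, r)

# Coordination is approached to a high degree but, with ongoing exploration,
# never perfectly reached.
score = sum(agent.act(s) == s for s in [0, 1] * 500) / 1000
print(round(score, 2))
```

The persistent gap between the score and 1.0 is the exploration rate at work, a crude analogue of the paper's observation that equilibrium is approached but never perfectly reached.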
Although interesting, the hypotheses proposed by Phillips & Silverstein lack unifying structure both in specific mechanisms and in cited evidence. They provide little to support the notion that low-level sensory processing and high-level cognitive coordination share dynamic grouping by synchrony as a common processing mechanism. We suggest that more realistic large-scale modeling at multiple levels is needed to address these issues.
Lateralisation is traditionally viewed by neuroscientists and comparative psychologists from the perspective of the individual; however, for many animals lateralisation evolved in the context of group living. Here I discuss the implications of individual lateralisation within the context of the group from an evolutionary ecology perspective, with particular reference to coordinated anti-predator behaviour.
What insights does comparative biology provide for furthering scientific understanding of the evolution of dynamic coordination? Our discussions covered three major themes: (a) the fundamental unity in functional aspects of neurons, neural circuits, and neural computations across the animal kingdom; (b) brain organization–behavior relationships across animal taxa; and (c) the need for broadly comparative studies of the relationship of neural structures, neural functions, and behavioral coordination. Below we present an overview of neural machinery and computations that are shared by all nervous systems across the animal kingdom, and the related fact that there really are no “simple” relationships in coordination between nervous systems and the behavior they produce. The simplest relationships seen in living organisms are already fairly complex by computational standards. These realizations led us to think about ways that brain similarities and differences could be used to produce new insights into complex brain–behavior phenomena (including a critical appraisal of the roles of cortical and noncortical structures in mammalian behavior), and to think briefly about how future studies could best exploit comparative methods to better elucidate general principles underlying the neural mechanisms associated with behavioral coordination. In our view, it is unlikely that the intricacies interrelating neural and behavioral coordination are due to one particular manifestation (such as neural oscillation or the possession of a six-layered cortex). Instead of considering the human cortex to be the standard against which all things are measured (and thus something to crow about), both broad and focused comparative studies on behavioral similarities and differences will be necessary to elucidate the fundamental principles underlying dynamic coordination.
Texas festivals are credited with providing benefits both for the festival's community and for the people who visit the community. As a result of these perceived benefits, communities across Texas stage a broad range of festivals and events. These events require substantial planning and skilled management to be successful. Those involved in the planning are often volunteers with little or no background in event planning and management. Regardless of their experience level, however, most event coordinators have ongoing needs for information that will help them produce successful events. To produce a successful event, coordinators seek information from a variety of sources, ranging from their personal networks of friends and colleagues to professional consultants and formal workshops, conferences, and seminars. The primary purpose of this project is to provide information that will help the Texas Agricultural Extension Service Recreation, Park & Tourism Program evaluate its future role in responding to the needs of Texas festival coordinators. To accomplish this end, this study seeks to identify the information needs of Texas festival coordinators and to describe how current sources of information are utilized by them. This paper outlines the development and implementation of an information needs assessment of Texas festival coordinators and a description of the information sources currently used by them. The project seeks to identify gaps in the provision of information and the access Texas festival coordinators perceive they have to information. The "importance"/"access" scale of this survey clearly identifies a range of important information topics which organizations could address. The following are the top 10 information needs of Texas festival coordinators as indicated by mail survey respondents: 1. Writing press releases; 2. How to measure advertising success; 3. Regulations for food, fire, safety, etc.; 4. How to find regional talent; 5. Insurance issues; 6. Americans with Disabilities Act (ADA); 7. How to find and contract for professional entertainment; 8. How to determine space requirements for event activities; 9. How to create a layout for people/vehicle flow and activities; 10. Estimating the amount and type of security needed. Most coordinators are aware of the six organizations and agencies that serve the needs of the festival industry: 1. International Festivals & Events Association (IFEA); 2. Texas Agricultural Extension Service (TAEX) - Department of Recreation, Park & Tourism at Texas A&M University; 3. Texas Travel Industry Association (TTIA); 4. Texas Festivals & Events Association (TFEA); 5. Texas Department of Economic Development (TDED); 6. Texas Department of Transportation (TxDOT). Among respondents to the needs assessment mail survey, between 72.6% and 82.8% of Texas festival coordinators are aware of the programs and services of these organizations and agencies. While coordinators show a high level of awareness of conferences, seminars, and workshops, only 13% to 24.7% have attended. Coordinator networks and the internet are viable sources of information. The majority (88%) of festival coordinators (n=216) indicated they have access to the internet, and 81.9% of respondents (n=215) said they use the internet. Organizations can use this study and its results to focus on information needs that coordinators indicate are important and have low accessibility. January and February, followed by June and July, would be the best months for conferences, seminars, and workshops. Seminars may be held during the "off-peak" months of January and February, while brochures about how to better market an event can be distributed during the peak festival times of September and April. Coordinators' responses as to the best months they would be able to attend Certified Festival Manager (CFM) training also corresponded with event seasonality patterns. Using multiple distribution methods can improve access to festival planning information. Workshops, internet sites, brochures, and publications are just a few of the ways organizations may vary their distribution of information resources. Coordinators indicated their main source of information is coordinator networks (48.9%). This finding underscores the importance of networking at coordinator informational events. The internet is a viable source of information: 98% of respondent coordinators indicated they have access to the internet, so it is reasonable to suggest that more information can be provided via the internet. There are clearly overlapping functions among these organizations; however, many programs are carried out in partnership in serving the festival industry. While this study did not examine these overlapping roles, areas of overlap should be examined more closely.
Language serves many purposes in our individual lives and our varied interpersonal interactions. Daniel Everett's claim that language primarily emerges from an “interactional instinct” and not a classic “language instinct” gives proper weight to the importance of coordinated communication in meeting our adaptive needs. Yet the argument that language is a “cultural tool”, motivated by an underlying “instinct”, does not adequately explain the complex, yet complementary nature of both linguistic regularities and variations in everyday speech. Our alternative suggestion is that language use, and coordinated communication more generally, is an emergent product of human self-organization processes. Both broad regularities and specific variations in linguistic structure and behavior can be accounted for by self-organizational processes that operate without explicit internal rules, blueprints, or mental representations. A major implication of this view is that both linguistic patterns and behaviors, within and across speakers, emerge from the dynamical interactions of brain, body, and world, which give rise to highly context-sensitive and varied linguistic performances.
This paper examines the extent to which the voluntary adoption of codes of conduct by multinational corporations (MNCs) renders MNCs accountable for the performance of actions specified in a code of conduct. In particular, the paper examines the ways in which codes of conduct coordinate the expectations of relevant parties with regard to the provision of assistance by MNCs on grounds of rescue or justice. The paper argues that this coordinative role of codes of conduct renders MNCs more accountable for the performance of actions specified in a code of conduct than they would be without a code of conduct. This interpretation of the significance of codes of conduct is contrasted with the view that codes of conduct render MNCs accountable for performing actions specified in a code of conduct by grounding contractual obligations for the performance of such actions.
ChickenHawk is a social-dilemma game that distinguishes uncoordinated from coordinated cooperation. In tests with players belonging to a culturally homogeneous population, natural-language cheap talk led to efficient coordination, while nonlinguistic signaling yielded uncoordinated altruism. In a subsequent test with players from a moderately more heterogeneous population nearby, the cheap talk condition still produced better coordination than other signaling conditions, but at a lower level and with fewer acts of altruism overall. Implications are: (1) without language, even willing cooperators coordinate poorly; (2) given a sufficiently homogeneous social group, language can coordinate cooperation in the face of opportunities for anonymous defection; (3) coordination therefore depends not on merely a general propensity to cooperate but on the overlap of social identities, which are always costly to acquire and maintain. So far as linguistic variation establishes how much social identities overlap, natural-language cheap talk is self-insuring, suggesting that linguistic variation is itself adaptive.
The disagreement about intertemporal coordination between Austrians and Keynesians is explained by pointing to differences both in the way expectations and motivations are treated and in the methodological principles assumed by each view. Austrians believe that research should proceed by first showing what guarantees a successful coordination of individuals' plans, and only later showing what could hinder the “natural” course. Keynes, on the contrary, does not start with any ideal state of affairs, but allows economies to work either “well” or “badly” according to the prevailing expectations among entrepreneurs. In this way uncertainty and expectations are fully incorporated into economic theory. A link between the Austrian approach and Popperian situational analysis is suggested.
The value of any kind of data is greatly enhanced when it exists in a form that allows it to be integrated with other data. One approach to integration is through the annotation of multiple bodies of data using common controlled vocabularies or ‘ontologies’. Unfortunately, the very success of this approach has led to a proliferation of ontologies which itself creates obstacles to integration. The Open Biomedical Ontologies (OBO) consortium has set in train a strategy to overcome this problem. Existing OBO ontologies, including the Gene Ontology, are undergoing a process of coordinated reform, and new ontologies are being created on the basis of an evolving set of shared principles governing ontology development. The result is an expanding family of ontologies designed to be interoperable, logically well-formed, and to incorporate accurate representations of biological reality. We describe the OBO Foundry initiative, and provide guidelines for those who might wish to become involved.
Although the application of the emulation model to the control of simple positioning movements is relatively straightforward, extending the scheme to actions requiring multisegmental, interlimb coordination complicates matters a bit. Special consideration of the demands in this case, both on sensory processing and on the process model (two key elements of the Kalman filter), is discussed.
What is the relation between norms (in the sense of ‘socially accepted rules’) and conventions? A number of philosophers have suggested that there is some kind of conceptual or constitutive relation between them. Some hold that conventions are or entail special kinds of norms (the ‘conventions-as-norms thesis’). Others hold that at least some norms are or entail special kinds of conventions (the ‘norms-as-conventions thesis’). We argue that both theses are false. Norms and conventions are crucially different conceptually and functionally in ways that make it a serious mistake to try to assimilate them. They are crucially different conceptually in that whereas conventions are not normative and are behaviour dependent and desire dependent, norms are normative, behaviour independent, and desire independent. They are crucially different functionally in that whereas conventions principally serve the function of facilitating coordination, norms principally serve the function of making us accountable to one another.
The manifest image is teeming with activity. Objects are booming and buzzing by, changing their locations and properties, vivid perceptions are replaced, and we seem to be inexorably slipping into the future. Time—or at least our experience in time—seems a very turbulent sort of thing. By contrast, time in the scientific image seems very still. The fundamental laws of physics don't differentiate between past and future, nor do they pick out a present moment that flows. Except for a minus sign in the relativistic metric, there are few differences between the temporal and spatial coordinates in natural science. We seem to have, to echo another debate, an “explanatory gap” between time as we find it in experience and as we find it in science. Reconciling these two images of the world is the principal goal of philosophy of time.
In this paper I address a structurally similar tension between phenomenalism and realism about matter in Leibniz and Kant. In both philosophers, some texts suggest a starkly phenomenalist view of the ontological status of matter, while other texts suggest a more robust realism. In the first part of the paper I address a recent paper by Don Rutherford that argues that Leibniz is more of a realist than previous commentators have allowed. I argue that Rutherford fails to show that Leibniz is any less an idealist than his main target, Robert Merrihew Adams, does. I distinguish various kinds of idealism about bodies that Leibniz might have held, and attempt to determine which package of views represents his considered view. In the second part of the paper I situate Kant's idealism within the same coordinates. I argue that, abstracting from deep differences in their metaphysics and epistemology, Kant and Leibniz have structurally very similar views on the ontological status of matter and bodies. I conclude that the key to understanding the realist strand in their ontology of matter is understanding the way in which, for both thinkers, the forces in bodies are appearances of forces of more fundamental entities, either monads or things in themselves.
Over the last century, psychoanalysis has transformed the ways in which we think about our relationships with others. Psychoanalytic concepts and methods, such as the unconscious and dream analysis, have greatly impacted on social, cultural and political theory. Reinterpreting the ways in which geography has explored people's mental maps and their deepest feelings about places, The Body and the City outlines a new cartography of the subject. Mapping key coordinates of meaning, identity and power across the sites of body and city, author Steve Pile explores a wide range of critical thinking, particularly the work of Lefebvre, Freud and Lacan, to present a pathbreaking psychoanalysis of space.
According to an often repeated definition, economics is the science of individual choices and their consequences. The emphasis on choice is often used – implicitly or explicitly – to mark a contrast between markets and the state: while the price mechanism in well-functioning markets preserves freedom of choice and still efficiently coordinates individual actions, the state has to rely to some degree on coercion to coordinate individual actions. Since coercion should not be used arbitrarily, coordination by the state needs to be legitimized by the consent of its citizens. The emphasis in economic theory on freedom of choice in the market sphere suggests that legitimization in the market sphere is “automatic” and that markets can thus avoid the typical legitimization problem of the state. In this paper, I shall question the alleged dichotomy between legitimization in the market and in the state. I shall argue that it is the result of a conflation of choice and consent in economics and show how an independent concept of consent makes the need for legitimization of market transactions visible.
Throughout history, dance has maintained a critical presence across all human cultures, defying barriers of class, race, and status. How dance has synergistically co-evolved with humans has fueled a rich debate on the function of art and the essence of aesthetic experience, engaging numerous artists, historians, philosophers, and scientists. While dance shares many features with other art forms, one attribute unique to dance is that it is most commonly expressed with the human body. Because of this, social scientists and neuroscientists are turning to dance and dancers to help answer questions of how the brain coordinates the body to perform complex, precise, and beautiful movements. In the present paper, we discuss how recent advances in neuroscientific methods provide the tools to advance our understanding of not only the cerebral phenomena associated with dance learning and observation but also the neural underpinnings of aesthetic appreciation associated with watching dance. We suggest that future work within the fields of dance neuroscience and neuroaesthetics has the potential to provide mutual benefits to both the scientific and artistic communities.
Machine generated contents note: -- 1. Introduction -- Consciousness and Sensorimotor Dynamics: Methodological Issues -- 2. Computational consciousness, D. Ballard -- 3. Explaining what people say about sensory qualia, J. Kevin O'Regan -- 4. Perception, action, and experience: unraveling the golden braid, A. Clark -- The Two-Visual Systems Hypothesis -- 5. Cortical visual systems for perception and action, A.D. Milner and M.A. Goodale -- 6. Hermann Lotze's Theory of 'Local Sign': evidence from pointing responses in an illusory figure, D.R. Melmoth -- Understanding Agency and Object Perception -- 7. Two visual systems and the feeling of presence, M. Matthen -- 8. Spatial coordinates and phenomenology in the two-visual systems model, P. Jacob and F. de Vignemont -- 9. Perceptual experience and the capacity to act, S. Schellenberg -- Perception and Action: Studies in Cognitive Neuroscience -- 10. Why does the perception-action functional dichotomy not match the ventral-dorsal streams in anatomical segregation: optic ataxia and the function of the dorsal stream, Y. Rossetti et al -- 11. Mapping the neglect syndrome onto neurofunctional streams, G. Vallar and F. Mancini -- 12. Motor representations and the perception of space: perceptual judgments of the boundary of action space, Y. Delevoye-Turrell -- The Role of Action and Sensorimotor Knowledge in Sensorimotor Theories of Perception -- 13. Vision without representation, A. Noe -- 14. Sensorimotor knowledge and the contents of experience, J. Kiverstein -- Boundaries of the Agent -- 15. Extended vision, R. A. Wilson.
Although ‘Rxx’ and ‘Rxy’ are both applications of a two-place predicate to a pair of terms, ‘Rxx’ resembles a one-place predicate in that all one needs to evaluate it is an assignment to ‘x’. A similar point applies to the sequences ‘Fx’, ‘Gx’ and ‘Fx’, ‘Gy’ – even though neither is a one-place predicate. Kit Fine’s semantic relationalism aims to extract a common idea uniting these comparisons, and to use it to provide a Millian solution to Frege’s Puzzle.
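The observation about ‘Rxx’ and ‘Rxy’ can be made concrete with a toy extensional semantics (a hypothetical mini-implementation invented here, not Fine's formalism): evaluating ‘Rxx’ consults only the assignment to ‘x’, just as a one-place predicate would, while ‘Rxy’ needs assignments to both variables.

```python
# Extension of a two-place predicate R over a toy domain (invented data).
R = {(1, 2), (2, 2)}

def Rxx(assignment):
    # behaves like a one-place predicate: only the value of 'x' matters
    return (assignment['x'], assignment['x']) in R

def Rxy(assignment):
    # genuinely two-place: needs values for both 'x' and 'y'
    return (assignment['x'], assignment['y']) in R

print(Rxx({'x': 2}))           # satisfiable with an assignment to 'x' alone
print(Rxy({'x': 1, 'y': 2}))   # requires assignments to both variables
```

The asymmetry in what each evaluation function demands of the assignment is the "common idea" the abstract alludes to, here rendered in the crudest possible form.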
Humans are closely coupled with their environments. They rely on being ‘embedded’ to help coordinate the use of their internal cognitive resources with external tools and resources. Consequently, everyday cognition, even cognition in the absence, may be viewed as partially distributed. As cognitive scientists, our job is to discover and explain the principles governing this distribution: principles of coordination, externalization, and interaction. As designers, our job is to use these principles, especially if they can be converted to metrics, in order to invent and evaluate candidate designs. After discussing a few principles of interaction and embedding, I discuss the usefulness of a range of metrics derived from economics, computational complexity, and psychology.
Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis (IBH) in order to help map the spectrum of possible relations between social interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organization of interaction processes that characterize the dynamics of social engagement. The patterns and synergies of this self-organization help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea links interactive factors to more classical observational scenarios.
This paper begins by raising a puzzle about what function our use of the word ‘rational’ could serve. To solve the puzzle, I introduce a view I call Epistemic Communism: we use epistemic evaluations to promote coordination among our basic belief-forming rules, and the function of this is to make the acquisition of knowledge by testimony more efficient.
This paper expounds the relations between continuous symmetries and conserved quantities, i.e. Noether's “first theorem”, in both the Lagrangian and Hamiltonian frameworks for classical mechanics. This illustrates one of mechanics' grand themes: exploiting a symmetry so as to reduce the number of variables needed to treat a problem. I emphasise that, for both frameworks, the theorem is underpinned by the idea of cyclic coordinates, and that the Hamiltonian theorem is more powerful. The Lagrangian theorem's main “ingredient”, apart from cyclic coordinates, is the rectification of vector fields afforded by the local existence and uniqueness of solutions to ordinary differential equations. For the Hamiltonian theorem, the main extra ingredients are the asymmetry of the Poisson bracket, and the fact that a vector field generates canonical transformations iff it is Hamiltonian.
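The link the abstract draws between cyclic coordinates and conservation can be stated compactly. The following is the standard textbook statement, in generic notation rather than the paper's own:

```latex
% If the coordinate q^k does not appear in L (a cyclic coordinate), the
% Euler-Lagrange equation for q^k reduces to conservation of its
% conjugate momentum p_k:
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}^k}
  - \frac{\partial L}{\partial q^k} = 0,
\qquad
\frac{\partial L}{\partial q^k} = 0
\;\Longrightarrow\;
\dot{p}_k = \frac{d}{dt}\frac{\partial L}{\partial \dot{q}^k} = 0.
% Hamiltonian form: \dot{p}_k = -\partial H/\partial q^k = 0
% when q^k is cyclic in H.
```

Each cyclic coordinate thus removes one variable from the problem, which is the "grand theme" of reduction the abstract mentions.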
The A-theory of time has intuitive and metaphysical appeal, but suffers from tension, if not inconsistency, with the special and general theories of relativity (STR and GTR). The A-theory requires a notion of global simultaneity invariant under the symmetries of the world's laws, those ostensible transformations of the state of the world that in fact leave the world as it was before. Relativistic physics, if read in a realistic sense, denies that there exists any notion of global simultaneity that is invariant under the symmetries of the world's laws. If physics is at least a decent guide to metaphysics--as sympathies for scientific realism would suggest--then relativistic physics supports the B-theory. If there were a physically natural way to modify the symmetries of the physical laws so as to remove those that are repugnant to the A-theory, while retaining empirical adequacy, then such an altered physics might be attractive to the A-theorist and would weaken the support given by relativity to the B-theory. I exhibit a way to do so here, displaying a Lagrangian density explicitly containing distant simultaneity, yet implying Einstein's field equations. The modification involves a change in the nature of the lapse function and makes use of the Dirac-Bergmann formalism of constrained dynamics, which recently has been discussed much by John Earman. Here this formalism is adapted slightly to permit both local and global generalized coordinates. A classification of senses in which time might be absolute or not is made along the way. Some suggestions for extending the work by finding a first principles motivation are made. An appendix outlines an argument why many local presents are insufficient and a global present is attractive, while two more appendices review the Dirac-Bergmann apparatus for GTR and then apply it to the theory at hand.
“Speciesism” accords greater value to human beings and their interests. It is supposed to be opposed to a liberationist stance, since it is precisely the numerous forms of discounting of animal interests which liberationists oppose. This association is mistaken. In this paper I claim that many forms of speciesism are consistent with upholding a robust liberationist agenda. Accordingly, several hotly disputed topics in animal ethics can be set aside. The significance of such clarification is that synthesizing liberationism with speciesism substantially modifies some of the coordinates of the debates over animal ethics. Secondly, defusing some counterintuitive implications of liberationism may make liberationism more popular than it currently is. Liberationism would no longer demand the eradication of ingrained speciesist intuitions. The paper finally presents a form of speciesism that does oppose liberationism, but is too strong and (fortunately) shared by few.
A new event is defined as an intervention in the time-reversible dynamical trajectories of particles in a system. New events are then assumed to be quantum fluctuations in the spatial and momentum coordinates, and mental action is assumed to work by ordering such fluctuations. It is shown that when the cumulative values of such fluctuations over a mean free path of a molecule are magnified by molecular interaction at the end of that path, the momentum of a molecule can be changed from its original direction to any other direction. In this way mental action can produce effects through the ordering of thermal motions. Examples are given which show that the ordering of 10^4–10^5 molecules is sufficient to (a) produce detectable PK results and (b) open sufficient ion channels in the brain to initiate a physical action. The relationship of the above model to the arrow of time is discussed.
I argue that maps do not feature predication, as analyzed by Frege and Tarski. I take as my foil Casati and Varzi (Parts and Places, 1999), which attributes predication to maps. I argue that the details of Casati and Varzi's own semantics militate against this attribution. Casati and Varzi emphasize what I call the Absence Intuition: if a marker representing some property (such as mountainous terrain) appears on a map, then absence of that marker from a map coordinate signifies absence of the corresponding property from the corresponding location. Predication elicits nothing like the Absence Intuition. “F(a)” does not, in general, signify that objects other than a lack property F. On the basis of this asymmetry, I argue that attaching a marker to map coordinates is a different mode of semantic composition than attaching a predicate to a singular term.
The Hard Problem of the mind is addressed and it is argued that physical-phenomenal property identities have the same status as the identification of an ostended bit of physical space with the coordinates assigned to the corresponding spot on a map of the terrain. It is argued, that is to say, that such identities are, or follow from, stipulations which interpret the map.
Many physicists believe that time constitutes a serious problem in quantum mechanics. We show nevertheless that quantum mechanics does not involve a special problem for time, and that there is no fundamental asymmetry between space and time in quantum mechanics over and above the asymmetry that already exists in classical physics. The apparent problem of time arises when the time parameter is put on a par with dynamical position variables rather than with the coordinates of space. The commutation relations and uncertainty relations are generally considered to embody the essential content of elementary quantum mechanics, but the traditional mathematical expression of the uncertainty principle is shown to be quite unsatisfactory. It is the total energy that decrees whether or not the time variables of a system can be sharply determined.
How is temporal information conveyed in language? In languages with tense it is direct; without tense, inference allows the receiver to arrive at an indirect temporal interpretation. I will discuss tensed and tenseless languages, proposing a unified approach that applies to both. I show that a few very general pragmatic principles account for temporal interpretation, direct and indirect. I assume that understanding a sentence requires that the receiver locate an event or state, spatially and temporally: time is one of the basic coordinates for truth-conditional assessment. Sentences in all languages convey information that allows us to determine the temporal location of the situation expressed. One would like to understand how this happens. The pragmatic principles that I suggest constrain direct temporal interpretation and guide indirect interpretation. In languages with tense, tense gives direct temporal information; however, certain apparent possibilities do not arise, due to the pragmatic constraints. In languages without tense, inference allows temporal interpretation. The key point in such languages...
It is argued that the Tractatus Project of Logical Atomism, in which the world is conceived of as the totality of independent atomic facts, can usefully be understood by conceiving of each fact as a bit in logical space. Wittgenstein himself thinks in terms of logical space. His elementary propositions, which express atomic facts, are interpreted as tuples of coordinates which specify the location of a bit in logical space. He says that signs for elementary propositions are arrangements of names. Here, the names are understood as numerical symbols specifying coordinates. It is argued that, using this approach, the so-called colour-exclusion problem, which was Wittgenstein's reason for abandoning the Tractatus, is in fact soluble. However, if logical space is a continuum then some coordinates will need to be expressed by numerical symbols that are infinite in size. How is this to be understood in Tractatus terms? It is shown that, in the Tractatus, Wittgenstein did recognise the possibility of infinite propositions and sentences expressing them. At first sight his approach to infinite sentences, and the approach of the present paper, seem to differ, but it is argued that the difference is superficial. Finally, we address the question of whether Logical Atomism is viable and this raises issues concerning its relationship to natural science.
Empiricists are in general rather suspicious with respect to any kind of abstract entities like properties, classes, relations, numbers, propositions, etc. They usually feel much more in sympathy with nominalists than with realists (in the medieval sense). As far as possible they try to avoid any reference to abstract entities and to restrict themselves to what is sometimes called a nominalistic language, i.e., one not containing such references. However, within certain scientific contexts it seems hardly possible to avoid them. In the case of mathematics some empiricists try to find a way out by treating the whole of mathematics as a mere calculus, a formal system for which no interpretation is given, or can be given. Accordingly, the mathematician is said to speak not about numbers, functions and infinite classes but merely about meaningless symbols and formulas manipulated according to given formal rules. In physics it is more difficult to shun the suspected entities because the language of physics serves for the communication of reports and predictions and hence cannot be taken as a mere calculus. A physicist who is suspicious of abstract entities may perhaps try to declare a certain part of the language of physics as uninterpreted and uninterpretable, that part which refers to real numbers as space-time coordinates or as values of physical magnitudes, to functions, limits, etc. More probably he will just speak about all these things like anybody else but with an uneasy conscience, like a man who in his everyday life does with qualms many things which are not in accord with the high moral principles he professes on Sundays. Recently the problem of abstract entities has arisen again in connection with semantics, the theory of meaning and truth.
Some semanticists say that certain expressions designate certain entities, and among these designated entities they include not only concrete material things but also abstract entities, e.g., properties as designated by predicates and propositions as designated...
Three of Zeno's objections to motion are answered by utilizing a version of nonstandard analysis, internal set theory, interpreted within an empirical context. Two of the objections are without force because they rely upon infinite sets, which always contain nonstandard real numbers. These numbers are devoid of numerical meaning, and thus one cannot render the judgment that an object is, in fact, located at a point in spacetime for which they would serve as coordinates. The third objection, that an arrow never appears to be moving, is answered by showing that it only applies to a finite number of instants of time. A theory of motion is also advanced; it consists of a finite series of contiguous infinitesimal steps. The theory is immune to Zeno's first two objections because the number of steps is finite and each lies outside the domain of observation. Present motion is hypothesized to be an unobservable process taking place within each step. The fact of motion is apparent through a summing (Riemann integration) of the steps.
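The summing claim in this abstract can be illustrated numerically. The sketch below is not the paper's internal-set-theory construction; it is a hedged, conventional illustration of how a finite series of small contiguous steps recovers total displacement as a Riemann sum of velocity, with an illustrative velocity function and step count chosen here.

```python
# Hypothetical illustration (not the paper's internal-set-theory machinery):
# model motion over [t0, t1] as a finite series of contiguous small steps
# and recover total displacement as a Riemann sum of velocity over them.

def riemann_displacement(velocity, t0, t1, n_steps):
    """Sum velocity(t) * dt over n_steps contiguous steps from t0 to t1."""
    dt = (t1 - t0) / n_steps
    total = 0.0
    for k in range(n_steps):
        t_mid = t0 + (k + 0.5) * dt   # midpoint of the k-th step
        total += velocity(t_mid) * dt
    return total

# A uniformly accelerating arrow with v(t) = 2t: displacement over [0, 1] is 1.
approx = riemann_displacement(lambda t: 2.0 * t, 0.0, 1.0, 10_000)
print(approx)  # very close to 1.0
```

Each individual step is too small to observe on its own (standing in for the "unobservable process" of the abstract), yet the sum is the observed fact of motion.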
Written over the course of two months in early 2008, Art as "Night" is a series of essays in part inspired by a January 2007 visit to the Velázquez exhibition at the National Gallery of Art, London, with subsequent forays into related themes and art-historical judgments for and against theories of meta-painting. Art as "Night" proposes a type of a-historical dark knowledge (a-theology and theology, at once) crossing painting since Velázquez, but reaching back to the Renaissance, especially Titian and Caravaggio. As a form of formalism, this "night" is also closely allied with forms of intellection that come to reside in art as pure visual agency or material knowledge while invoking moral agency, a function of art more or less bracketed in modern art for ethical and/or political agency. Not a theory of meta-painting, Art as "Night" restores coordinates arguably lost in painting since the separation of natural and moral philosophy in the Baroque era. It is with Velázquez that we see a turning point, an emphasis on the specific resources of painting as a form of speculative intellect, while it is with contemporary works by Gerhard Richter and Anselm Kiefer that we see the return of the same after the collapse of modernism, and after subsequent postmodern maneuvers to make art discursive yet without the austerities of the formal means present in Art as Art. Art as "Night" argues for a nondiscursive form of intellection fully embodied in the work of art – and, foremost, painting.
A synoptic and intentionally elusive and allusive survey of painting, through the collapse of the art market in late 2007, Art as "Night" suggests by way of this critique of an elective "night" crossing painting that the art world is an endlessly deferred version of pleroma (Hegel’s Absolute Knowledge), a fully synthetic world given to an exploration and appropriation of the given through classical mimesis and epistemology and its complete incorporation and transfiguration in a theory of knowledge and art as pure speculative agency. In effect, Art as "Night" is an incarnational theory of art as absolute knowledge.
“Else-where” is a synoptic survey of the representational values given to art, architecture, and cultural production from 2002 through 2011. Written primarily as a critique of what is suppressed in architecture and what is disclosed in art, the essays are informed by the passage out of post-structuralism and its disciplinary analogues toward the real Real (denoted over the course of the studies as the “Real-Irreal” or “Else-where”). While architecture nominally addresses an environmental ethos, it also famously negotiates its own representational values by way of its putative autonomy (autonomy as self-interest, versus selflessness); its main repression in this regard is “landscape,” figure of the Other and figure of the Real. Engaging forms of spectrality, and not necessarily speculative intelligence per se, architecture is also “conscious” of its own complicity in capitalist orders, a complicity that in part underwrites its avant-garde forms of agitation since the onset of modern architecture. As a result, and over the course of the twentieth century, architectural vanguards have successively been depleted such that they return only as reified half-measures in the late-modern production of difference. As such, the essay “Actually Existing Ground” (2008) examines the failed promise of Landscape Urbanism. Since the 1960s, as with the allied arts, architecture has evacuated many of the utopian gestures given to modernism and embraced a form of ultra-contingency in a direct alliance with the post-modern and post-Marxist concession to markets and to cultural production as principal means of establishing formal hegemony.
This recourse or surrender to the economic-determinist ethos of post-modernity, regardless of attempts to problematize it and/or critique it through types of what Manfredo Tafuri has called “operative criticism” (works of architecture as criticism), has, arguably, all but failed, and with the suggestive return circa 2011 of new forms of resistance an exit from the accommodating spirit of the times is indicative of the expectation of strenuous, yet highly formal and non-discursive operations within artistic and architectural production. The essays collected in “Else-where” cross various disciplines, inclusive of landscape architecture, architecture, and visual art, to develop a nuanced critique of an emergent formal regard in the arts that is also an invocation of the highest coordinates given to the arts – formal ontology as speculative intelligence itself – or the return of the universal as utopian thought “here-and-now.”
What is the meaning of general covariance? We learn something about it from the hole argument, due originally to Einstein. In his search for a theory of gravity, he noted that if the equations of motion are covariant under arbitrary coordinate transformations, then particle coordinates at a given time can be varied arbitrarily - they are underdetermined - even if their values at all earlier times are held fixed. It is the same for the values of fields. The argument can also be made out in terms of transformations acting on the points of the manifold, rather than on the coordinates assigned to the points. So the equations of motion do not fix the particle positions, or the values of fields at manifold points, or particle coordinates, or fields as functions of the coordinates, even when they are specified at all earlier times. It is surely the business of physics to predict these sorts of quantities, given their values at earlier times. The principle of general covariance therefore seems untenable.
We discuss a new theory of the universe in which the vacuum energy is of classical origin and dominates the energy content of the universe. As usual, the Einstein equations determine the metric of the universe. However, the scale factor is controlled by total energy conservation in contrast to the practice in the Robertson–Walker formulation. This theory naturally leads to an explanation for the Big Bang and is not plagued by the horizon and cosmological constant problem. It naturally accommodates the notion of dark energy and proposes a possible explanation for dark matter. It leads to a dual description of the universe, which is reminiscent of the dual theory proposed by Milne in 1937. On the one hand one can describe the universe in terms of the original Einstein coordinates in which the universe is expanding, on the other hand one can describe it in terms of co-moving coordinates which feature in measurements. In the latter representation the universe looks stationary and the age of the universe appears constant. The paper describes the evolution of this universe. It starts out in a classical state with perfect symmetry and zero entropy. Due to the vacuum metric the effective energy density is infinite at the beginning, but diminishes rapidly. Once it reaches the Planck energy density of elementary particles, the formation of particles can commence. Because of the quantum nature of creation and annihilation processes spatial and temporal inhomogeneities appear in the matter distributions, resulting in residual proton (neutron) and electron densities. Hence, quantum uncertainty plays an essential role in the creation of a diversified complex universe with increasing entropy. It thus seems that quantum fluctuations play a role in cosmology similar to that of random mutations in biology. Other analogies to biological principles, such as recapitulation, are also discussed.
This paper engages the extended cognition controversy by advancing a theory which fits nicely into an attractive and surprisingly unoccupied conceptual niche situated comfortably between traditional individualism and the radical externalism espoused by the majority of supporters of the extended mind hypothesis. I call this theory moderate active externalism, or MAE. In alliance with other externalist theories of cognition, MAE is committed to the view that certain cognitive processes extend across brain, body, and world—a conclusion which follows from a theory I develop in “Synergic Coordination: an argument for cognitive process externalism.” Yet, in contradistinction with radical externalism, and in agreement with the internalist orthodoxy, MAE defends the view that mental states are situated invariably inside our heads. This is done, inter alia, by developing a novel hypothesis regarding the vehicles of content (in “Extended cognition without externalized mental states”), and by criticizing arguments in support of mental states externalism (in “Reflections and objections”). The result, I believe, is a coherent theoretical alternative worthy of serious consideration.
Integral Ecology uses multiple perspectives to analyze environmental problems. Four of Integral Ecology's major analytical perspectives (known as the quadrants) correspond to the four divisions of the liberal arts and sciences: fine arts, natural science, social science, and humanities. Integral Ecology also utilizes the analytical perspective provided by the idea of cultural moral development. This perspective helps to reveal how stakeholders at different developmental stages disclose a phenomenon, in this case, a tropical forest that loggers propose to clear-cut. Integral Ecology takes into account all pertinent perspectives, in order to arrive at the best possible solution to environmental problems and conflicts.
Background: One of the all-time questions in evolutionary biology regards the evolution of organismal shapes, and in particular why certain forms appear repeatedly in the history of life, others only seldom and still others not at all. Recent research in this field has deployed the conceptual framework of constraints and natural selection as measured by quantitative genetic methods. Scope: In this paper I argue that quantitative genetics can by necessity only provide us with useful statistical summaries that may lead researchers to formulate testable causal hypotheses, but that any inferential attempt beyond this is unreasonable. Instead, I suggest that thinking in terms of coordinates in phenotypic spaces, and approaching the problem using a variety of empirical methods (seeking a consilience of evidence), is more likely to lead to solid inferences regarding the causal basis of the historical patterns that make up most of the data available on phenotypic evolution.
In the May 15, 1935 issue of Physical Review Albert Einstein co-authored a paper with his two postdoctoral research associates at the Institute for Advanced Study, Boris Podolsky and Nathan Rosen. The article was entitled “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” (Einstein et al. 1935). Generally referred to as “EPR”, this paper quickly became a centerpiece in the debate over the interpretation of the quantum theory, a debate that continues today. The paper features a striking case where two quantum systems interact in such a way as to link both their spatial coordinates in a certain direction and also their linear momenta (in the same direction). As a result of this “entanglement”, determining either position or momentum for one system would fix (respectively) the position or the momentum of the other. EPR use this case to argue that one cannot maintain both an intuitive condition of local action and the completeness of the quantum description by means of the wave function. This entry describes the argument of that 1935 paper, considers several different versions and reactions, and explores the ongoing significance of the issues they raise.
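The correlations EPR exploit can be sketched with a toy classical stand-in (an illustration of the correlation structure only, not a quantum simulation; the sampling ranges are arbitrary choices here). The original EPR state makes the position difference x1 − x2 and the momentum sum p1 + p2 both definite, so measuring either quantity on system 1 fixes the corresponding quantity of system 2:

```python
import random

# Toy sketch of the EPR linkage (classical stand-in, not quantum dynamics):
# every pair satisfies x1 - x2 = d and p1 + p2 = 0, so a position or momentum
# measurement on system 1 determines the same quantity for system 2.

def epr_pair(d=1.0):
    """Sample one correlated pair respecting x1 - x2 = d and p1 + p2 = 0."""
    x1 = random.uniform(-10, 10)   # arbitrary illustrative range
    p1 = random.uniform(-10, 10)
    return (x1, p1), (x1 - d, -p1)

(x1, p1), (x2, p2) = epr_pair()
print(x1 - x2)  # the fixed separation d = 1.0 (up to rounding)
print(p1 + p2)  # 0.0: momenta are perfectly anti-correlated
```

EPR's point, of course, is that such a common-cause story sits in tension with the completeness of the wave-function description; the code only exhibits the perfect correlations themselves.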
Each of us distinguishes between himself and states of himself on the one hand, and what is not himself or a state of himself on the other. What are the conditions of our making this distinction, and how are they fulfilled? In what way do we make it, and why do we make it in the way we do?
What are the relationships between an entity and the space at which it is located? And between a region of space and the events that take place there? What is the metaphysical structure of localization? What is its modal status? This paper addresses some of these questions in an attempt to work out at least the main coordinates of the logical structure of localization. Our task is mostly taxonomic. But we also highlight some of the underlying structural features and we single out the interactions between the notion of localization and nearby notions, such as the notions of part and whole, or of necessity and possibility. A theory of localization--we argue--is needed in order to account for the basic relations between objects and space, and cannot be reduced to a pure part-whole theory. We also provide an axiomatization of the relation of localization and examine cases of localization involving entities different from material objects.
The article argues against the common notion of disciplinary medical traditions, i.e. Obstetrics, as macro-structures that unilinearly structure the practices associated with the discipline. It shows that the various existences of Obstetrics, their relations with practices and vice versa, the entities these obstetrical practices render present and related, and the ways they are connected to experiences, are more complex than the unilinear model suggests. What allows participants to go from one topos to another – from Obstetrics to practice, from practice to politics, from politics to experience – is not self-evidently induced by Obstetrics, but needs to be studied as a surprising range of passages that connect (or don't). Techniques and devices to supervise the delivery, to render present the fetus during pregnancy, and to monitor birth, are described in order to show that such techniques acquire different roles in connecting and creating Obstetrics as a system and obstetrical practices.
Adams’s Thesis, the claim that the probabilities of indicative conditionals equal the conditional probabilities of their consequents given their antecedents, has proven impossible to accommodate within orthodox possible-world semantics. This essay proposes a modification to the orthodoxy that removes this impossibility. The starting point is a proposal by Jeffrey and Stalnaker that conditionals take semantic values in the unit interval, interpreting these (à la McGee) as their expected truth-values at a world. Their theories imply a false principle, namely, that the probability of a conditional is independent of any proposition inconsistent with its antecedent. But they also point to something important, namely, that our uncertainty about conditionals is not confined to uncertainty about the facts (what the actual world is like) but also expresses uncertainty about the counterfacts (what the world would be like if one or another supposition were true). To capture this observation, this essay proposes that the semantic contents of conditionals be treated as sets of vectors of possible worlds, not singleton worlds, with the coordinates of each specifying the world that is or would be true under the supposition that it represents. The probabilities of truth for conditionals will then depend on the joint probabilities of the facts and counterfacts, the latter in turn depending on the mode of supposition. The implication of this treatment is that the probabilities of conditionals are conditional probabilities whenever the mode of supposition is evidential.
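The Jeffrey/Stalnaker/McGee idea mentioned in this abstract admits a small numerical check (the worlds and probabilities below are invented for illustration; this sketches the expected-truth-value treatment, not the essay's full vector semantics): give the conditional A → C value 1 at A∧C worlds, 0 at A∧¬C worlds, and its expected truth-value P(C|A) at ¬A worlds; its overall expectation then equals P(C|A), as Adams's Thesis demands.

```python
# Illustrative sketch (toy worlds and probabilities, not the paper's model):
# a conditional A -> C takes value 1 at A&C worlds, 0 at A&~C worlds, and
# the expected truth-value P(C|A) at ~A worlds. Its expectation over the
# whole space then comes out as P(C|A), vindicating Adams's Thesis.

worlds = [  # (probability, A true?, C true?)
    (0.3, True,  True),
    (0.1, True,  False),
    (0.4, False, True),
    (0.2, False, False),
]

p_A  = sum(p for p, a, c in worlds if a)
p_AC = sum(p for p, a, c in worlds if a and c)
p_C_given_A = p_AC / p_A            # here 0.3 / 0.4 = 0.75

expected_value = sum(
    p * (1.0 if c else 0.0) if a else p * p_C_given_A
    for p, a, c in worlds
)

print(p_C_given_A)     # ~0.75
print(expected_value)  # ~0.75: the expectation matches P(C|A)
```

Algebraically the match is no accident: the expectation is P(A∧C) + P(¬A)·P(C|A) = P(C|A)·(P(A) + P(¬A)) = P(C|A), which is the evidential-supposition case the essay's conclusion describes.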
1. A particle moves back and forth along a line, increasing in speed. Graph. 2. How many equivalence classes in Galilean spacetime are there for a particle that is at rest? A particle that is moving at a constant speed? Why are the previous two questions trick questions? 3. In Galilean spacetime, there is no such thing as absolute velocity. Is there such a thing as absolute acceleration? If not, why not? If so, describe a spacetime in which there is no notion of absolute acceleration. Hint: to move from Aristotelian spacetime to Galilean spacetime, we got rid of the notion of absolute velocity by counting two graphs as equivalent (picturing the same spacetime) if they differed by a shear transformation. Perhaps we can get rid of absolute acceleration with an analogous move? 4. Draw a two-dimensional Cartesian grid. Label the axes x and t, and mark a scale on these axes. Make the x axis the horizontal axis, and the t axis the vertical one. Pick two points that are not on the same vertical line. Name them Ann and Bob. Label each point with its x and t coordinates.
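The shear transformation in the hint to exercise 3 can be sketched numerically (an illustrative aside, not part of the problem set): a Galilean boost x' = x − v·t shifts every velocity by v while leaving acceleration untouched, which is why the shear move erases absolute velocity but leaves absolute acceleration standing.

```python
# Sketch of the shear (Galilean boost) from the hint: x' = x - v*t.
# Velocities shift by v; accelerations do not, so Galilean spacetime
# discards absolute velocity while retaining absolute acceleration.

def boost(trajectory, v):
    """Apply a Galilean boost with velocity v to a list of (t, x) points."""
    return [(t, x - v * t) for t, x in trajectory]

def second_difference(traj):
    """Finite-difference acceleration from three equally spaced samples."""
    (t0, x0), (t1, x1), (t2, x2) = traj[:3]
    dt = t1 - t0
    return (x2 - 2 * x1 + x0) / dt**2

# Uniformly accelerated particle: x(t) = t^2, so acceleration is 2.
traj = [(t, t**2) for t in (0.0, 1.0, 2.0)]
boosted = boost(traj, v=3.0)

print(second_difference(traj))     # 2.0
print(second_difference(boosted))  # still 2.0: acceleration is boost-invariant
```

Killing absolute acceleration analogously would require counting graphs as equivalent under transformations quadratic in t, which is one way to read the exercise's invitation.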
Humans and other animals are able not only to coordinate their actions with their current sensorimotor state, but also to imagine, plan and act in view of the future, and to realize distal goals. In this paper we discuss whether or not their future-oriented conduct implies (future-oriented) representations. We illustrate the role played by anticipatory mechanisms in natural and artificial agents, and we propose a notion of representation that is grounded in the agent’s predictive capabilities. We argue that the ability that characterizes and defines a true cognitive mind, as opposed to a merely adaptive system, is that of building representations of the non-existent, of what is not currently (yet) true or perceivable, of what is desired. A real mental activity begins when the organism is able to endogenously (i.e. not as the consequence of current perceptual stimuli) produce an internal representation of the world in order to select and guide its conduct in a goal-directed way: the mind serves to coordinate with the future.
A simple quantum relativistic model of νμ−ντ neutrino oscillations in the OPERA experiment is presented. This model suggests that the two components in the neutrino beam are separated in space. After being created in a meson decay, the μ-neutrino moves 18 meters ahead of the beam’s center of energy, while the τ-neutrino is behind. Both neutrinos have subluminal speeds, however the advanced start of the νμ explains why it arrives in the detector 60 ns earlier than expected. Our model does violate the special-relativistic ban on superluminal signals. However, usual arguments about violation of causality are not applicable here. The invalidity of standard special-relativistic arguments is related to the interaction-dependence of the boost operator, which implies that boost-transformed space-time coordinates of events with interacting particles do not obey linear and universal Lorentz formulas.