As opposed to the dismissive attitude toward reductionism that is popular in current philosophy of mind, a “ruthless reductionism” is alive and thriving in “molecular and cellular cognition”—a field of research within cellular and molecular neuroscience, the current mainstream of the discipline. Basic experimental practices and emerging results from this field imply that two common assertions by philosophers and cognitive scientists are false: (1) that we do not know much about how the brain works, and (2) that lower-level neuroscience cannot explain cognition and complex behavior directly. These experimental practices involve intervening directly with molecular components of sub-cellular and gene expression pathways in neurons and then measuring specific behaviors. These behaviors are tracked using tests that are widely accepted by experimental psychologists to study the psychological phenomenon at issue (e.g., memory, attention, and perception). Here I illustrate these practices and their importance for explanation and reduction in current mainstream neuroscience by describing recent work on social recognition memory in mammals.
One theme in recent philosophical attention to neuroscience has been that closer, more serious attention to actual neuroscientific research, and its results, challenges the familiar view that psychological properties are multiply realized by neuroscientific properties. Shagrir (1998) presents a number of diverse reasons to think that diversity in neuroscientifically identified structures and properties does not inevitably lead to multiple realization. Bechtel and Mundale (1999) argue that neuroscientific practice extending over a century contradicts the consequences of the hypothesis that psychological functions are multiply realized. Bickle (2003) argues that a series of animal models of the consolidation of short-term memories into long-term memories reveals that this process is uniquely realized by a single biochemical cascade involving cAMP, protein kinase A, and cAMP response element binding proteins. Shapiro (2004) argues that experiments on neuroplasticity do not show that there are many ways in which a brain might be wired in order to achieve a given psychological function.
A new theory for basic function in the nervous system has recently been proposed (Dempsher 1979a, 1979b, 1980, 1981). The major themes of the new theory are as follows: (1) There are two fundamental units of structure and function: the fibre, or conducting mechanism, and the neurocentre, where nervous system function as we know it takes place. (2) The nerve impulse is regarded as a mathematical event; the mathematics is the result of a prescribed fusion of energy and matter. (3) Nervous system function everywhere in the nervous system is mathematical. In the fibre, the prescribed fusion of energy and matter results in a number; in the neurocentre, it results in a mathematical function. Basic function everywhere in the nervous system requires a transformation of a nerve impulse in the fibre into a nerve impulse in the neurocentre with opposing properties. The nerve impulse in the fibre is confined to the fibre; cannot sum with another nerve impulse; can travel long distances with constant form and velocity; curvature in space and time is not a significant feature; and it is regarded as a number. The nerve impulse in the neurocentre, on the other hand, is confined to the neurocentre; can sum with other nerve impulses; cannot travel long distances (even over a very short distance, it changes form); curvature in space and time is a very significant feature; and it is regarded as a mathematical function. The approach to determining how one form of the nerve impulse is transformed into the other at the input region is based on two of the differences listed above: (1) the nerve impulse in the fibre cannot sum with another nerve impulse in the fibre, whereas in the neurocentre several nerve impulses sum to form a larger nerve impulse; (2) the nerve impulse in the fibre is regarded as a number, whereas in the neurocentre it is regarded as a mathematical function.
The commonality of (1) and (2) is that the properties defining the nerve impulse in the fibre are associated with the property of discreteness, whereas the properties defining the nerve impulse in the neurocentre are associated with the property of continuousness. Thus, the basic theme of unification of function at the input region of the neurocentre is the transformation of a phenomenon with the property of discreteness into a phenomenon with the property of continuousness. The solution to this transformation is approached from two directions: biologic and mathematical. In the biologic approach, the unit element of the nerve impulse in the fibre terminations (a s.u., as a wave of energy; a spike in the classical theory) fuses with a calcium-binding protein, causing the release of Ca++. The calcium ions then combine with another protein. Associated with this second reaction is a conformational change in the Ca++-protein complex, and the unit element in the neurocentre, b s.u., is emitted. Individual b s.u. then fuse with acetylcholine; summation occurs and wave b is emitted. In the mathematical approach, the nerve impulse, as a number, is partitioned into two numbers with a precise rule relating them. One possibility suggested is that the number can be regarded as the value of a trigonometric function. This value then gives rise to an angle with sides related in a ratio, or proportionality, fashion: a relationship with the property of continuousness, as contrasted with the discreteness of a single number. The biologic and mathematical approaches are united so as to suggest that the mathematical (trigonometric) function arises as the result of a fusion of energy (a s.u. as a wave of energy) and the calcium-binding protein as matter; following this reaction, b s.u., with opposing properties, is emitted.
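The "trigonometric partition" in this abstract can be given a small worked sketch. The choice of sine below is an assumption made for illustration, not Dempsher's own notation:

```latex
% Reconstruction under an assumption: take the impulse's number x
% (with 0 < x < 1) to be the value of a sine.
x = \sin\theta = \frac{\text{opposite}}{\text{hypotenuse}},
\qquad \theta = \arcsin x .
% The single number x (discrete) is thereby re-expressed as a ratio
% between two sides subtending the angle \theta, i.e., as a relation
% with the continuous character the text attributes to the neurocentre.
```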
The prediction of protein–protein interactions based on independently obtained structural information for each interacting partner remains an important challenge in computational chemistry. Procedures where hypothetical interaction models (or decoys) are generated, then ranked using a biochemically relevant scoring function, have been garnering interest as an avenue for addressing such challenges. The program PatchDock has been shown to produce reasonable decoys for modeling the association between pig alpha-amylase and the VH-domains of a camelid antibody raised against it. We designed a biochemically relevant method by which PatchDock decoys could be ranked in order to separate near-native structures from false positives. Several thousand steps of energy minimization were used to simulate induced fit within the otherwise rigid decoys and to rank them. We applied a partial free energy function to rank each of the binding modes, improving discrimination between near-native structures and false positives. Sorting decoys according to strain energy increased the proportion of near-native decoys near the bottom of the ranked list. Additionally, we propose a novel method which utilizes regression analysis for the selection of minimization convergence criteria and provides an approximation of the partial free energy function as the number of algorithmic steps approaches infinity.
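The extrapolation idea in this abstract (approximating the energy as the number of minimization steps approaches infinity) can be sketched with a regression. This is a minimal illustration under an assumed model E(n) ≈ E_inf + A/n, which is linear in 1/n; the paper's actual functional form, data, and scoring function are not reproduced here.

```python
# Hedged sketch: estimate the n -> infinity limit of a minimization
# energy by linear least-squares regression of E against 1/n.
# Model (an assumption for illustration): E(n) = E_inf + A / n.

def extrapolate_energy(steps, energies):
    """Fit E = E_inf + A * (1/n); return (E_inf, A).

    E_inf is the intercept, i.e. the predicted energy as n -> infinity.
    """
    xs = [1.0 / n for n in steps]
    mean_x = sum(xs) / len(xs)
    mean_e = sum(energies) / len(energies)
    cov = sum((x - mean_x) * (e - mean_e) for x, e in zip(xs, energies))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var                     # A
    intercept = mean_e - slope * mean_x   # E_inf
    return intercept, slope

# Synthetic example: energies that decay toward -120.0 as steps grow.
steps = [500, 1000, 2000, 4000, 8000]
energies = [-120.0 + 5000.0 / n for n in steps]
e_inf, a = extrapolate_energy(steps, energies)
print(round(e_inf, 3))  # -> -120.0
```

The intercept can also serve as a convergence criterion: once further minimization steps change the fitted E_inf by less than a chosen tolerance, iteration can stop.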
o (2000), 243). In particular, the idea is that binding interactions between the relevant expressions and natural language quantifiers are best explained by the hypothesis that those expressions harbor hidden but bindable variables. Recently, however, Herman Cappelen and Ernie Lepore have rejected such binding arguments for the presence of hidden variables on the grounds that they overgeneralize: that, if sound, such arguments would establish the presence of hidden variables in all sorts of expressions where it is implausible that they exist (Cappelen and Lepore (2005), Cappelen and Lepore (2002)). In what follows we respond to Cappelen and Lepore’s attempted reductio by bringing out crucial disanalogies between cases where the binding argument is successful and cases where it is not. But we have a deeper purpose than merely to respond to Cappelen and Lepore: we think the attempted reductio goes wrong by not taking sufficiently seriously the nature of the binding relation that holds between quantifiers and arguments/variables, and that our criticism will serve to highlight the nature and importance of this relation.
Epithets and the pronominals 'en' and 'y' in French have a variety of binding properties that are unexpected on conventional approaches to Binding Theory. We argue that the linguistic variety observed cross-linguistically (and, perhaps more surprisingly, within a single language) derives from the morphological properties of the anaphoric element, which we claim lacks number features. Epithets and pronominals like 'en' and 'y' are predicates modifying null but semantically active nouns, and must therefore refer to the Speaker. These properties, we claim, explain why these elements must be employed in what we define as an Epistemic Context, and are subject to Condition C of Binding Theory.
Recently, an associative learning account of cognitive control has been suggested (Verguts & Notebaert, 2009). In this so-called adaptation by binding theory, Hebbian learning of stimulus–stimulus and stimulus–response associations is assumed to drive the adaptation of human behavior. In this study, we evaluated the validity of the adaptation-by-binding account for the case of implicit learning of regularities within a stimulus set (i.e., the frequency of specific unit digit combinations in a two-digit number magnitude comparison task) and their association with a particular response. Our data indicated that participants indeed learned these regularities and adapted their behavior accordingly. In particular, influences of cognitive control were even able to override the numerical distance effect—one of the most robust effects in numerical cognition research. Thus, the general cognitive processes involved in two-digit number magnitude comparison seem much more complex than previously assumed. Multi-digit number magnitude comparison may not be automatic and inflexible but influenced by processes of cognitive control being highly adaptive to stimulus set properties and task demands on multiple levels.
Since Kaplan’s "Demonstratives", it has become a commonplace to distinguish between the character and content of an expression, where the content of an expression is what it contributes to "what is said" by sentences containing that expression, and the character gives a rule for determining, in a context, the content of an expression. A tacit assumption of theories of character has been that character is autonomous from content – that semantic evaluation starts with character, adds context, and then derives content. One consequence of this autonomy thesis is that the rules for character can contain no variables bound by content-level operators elsewhere in the sentence. Tacit appeal to this consequence features essentially both in Jason Stanley’s recent argument that all contextual ambiguity must be linked to "elements in the actual syntactic structure of the sentence uttered" in his "Context and Logical Form" and in my arguments against character-based theories of complex demonstratives in my "Complex Demonstratives". However, I argue here that the autonomy thesis is unmotivated, and show that we can separate Kaplan’s notion of character into two independent components: an aspect of meaning which is context-sensitive, and an aspect of meaning that is exempted from scopal interactions with other operators. The resulting semantic framework allows constructions similar to Kaplan’s rejected notion of "monsters begat by elegance", but which are both more empirically adequate and more theoretically versatile. Having made the distinction between context-sensitivity and autonomy from scopal interaction, I show how it allows binding into the character of expressions and hence undermines the immediate success of both Stanley’s argument and my former argument against character-based theories of complex demonstratives, and discuss briefly the prospects for reinstating modified versions of those arguments.
Finally, I show how that same distinction allows a defusing of Kripke’s modal argument against a descriptive theory of names. Once autonomy from semantic interaction is separated from context-sensitivity, the first of those two alone can be used to capture the modal rigidity of proper names.
In this paper, we compare the mechanisms of protein synthesis and natural selection. We identify three core elements of mechanistic explanation: functional individuation, hierarchical nestedness or decomposition, and organization. These are now well understood elements of mechanistic explanation in fields such as protein synthesis, and widely accepted in the mechanisms literature. But Skipper and Millstein (2005) have argued that natural selection is neither decomposable nor organized. This would mean that much of the current mechanisms literature does not apply to the mechanism of natural selection.
Combinatory logic (Curry and Feys 1958) is a “variable-free” alternative to the lambda calculus. The two have the same expressive power but build their expressions differently. “Variable-free” semantics is, more precisely, “free of variable binding”: it has no operation like abstraction that turns a free variable into a bound one; it uses combinators—operations on functions—instead. For the general linguistic motivation of this approach, see the works of Steedman, Szabolcsi, and Jacobson, among others. The standard view in linguistics is that reflexive and personal pronouns are free variables that get bound by an antecedent through some coindexing mechanism. In variable-free semantics the same task is performed by some combinator that identifies two arguments of the function it operates on (a duplicator). This combinator may be built into the lexical semantics of the pronoun, into that of the antecedent, or it may be a free-floating operation applicable to predicates or larger chunks of text, i.e. a type-shifter. This note is concerned with the case of cross-sentential anaphora. It adopts Hepple’s and Jacobson’s interpretation of pronouns as identity maps and asks how this can be extended to the cross-sentential case, assuming the dynamic semantic view of anaphora. It first outlines the possibility of interpreting indefinites that antecede non-c-commanded pronouns as existential quantifiers enriched with a duplicator. Then it argues that it is preferable to use the duplicator as a type-shifter that applies “on the fly”. The proposal has consequences for two central ingredients of the classical dynamic semantic treatment: it does away with abstraction over assignments and with treating indefinites as inherently existentially quantified. However, cross-sentential anaphora remains a matter of binding, and the idea of propositions as context change potentials is retained.
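The duplicator described here is standardly the W combinator, with W f x = f x x. A minimal sketch (using a hypothetical curried predicate `praise`, which is my illustration rather than an example from the note) shows how it identifies two arguments of a relation, the role assigned above to reflexive binding:

```python
# The duplicator combinator W: identifies two arguments of the
# curried function it operates on, i.e. W(f)(x) == f(x)(x).
def W(f):
    return lambda x: f(x)(x)

# Hypothetical curried two-place relation for illustration:
# praise(y)(x) reads "x praises y".
def praise(y):
    return lambda x: f"{x} praises {y}"

# Applying the duplicator collapses the two argument places,
# yielding the reflexive reading "x praises x".
praise_self = W(praise)
print(praise_self("Ann"))  # -> Ann praises Ann
```

Whether W lives in the pronoun's lexical entry, in the antecedent's, or applies "on the fly" as a type-shifter is exactly the choice point the note discusses.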
Much contemporary nanotoxicology, nanotherapeutic and nanoregulatory research has been characterised by a focus on investigating how delivery of engineered nanoparticles (ENPs) to cells is dictated primarily by components of the ENP surface. An alternative model, some implications of which are discussed here, begins with fundamental physicochemical research into the interaction of a dynamic nanoparticle-protein corona (NPC) with biological systems. The proposed new model also requires, however, that any such fresh NPC physicochemical research approach should involve integration and targeted collaboration from the earliest stages with nanotoxicology, nanotherapeutics and nanoregulatory expertise. The justification for this integrated approach, we argue, relates not just to efficiency and promotion of innovation, but to an acknowledgement that public-funded basic physicochemical research in particular should now be accepted to incorporate strong higher order public goods elements from its inception, not merely after product development at the technology transfer stage. Issues, in other words, such as university research co-operation, commercialization and intellectual property (IP) protection, safety and cost-effectiveness regulatory assessment, as well as technology transfer should not be viewed as second tier considerations even in a ‘blue sky’ NPC basic research agenda.
The correct locus (or loci) of binding theory has been a matter of much discussion. Theories can be seen as varying along at least two dimensions. The first is whether binding theory is configurationally determined (that is, the theory exploits the geometry of a phrase marker, appealing to such purely structural notions as c-command and government) or whether the theory depends rather on examining the relations between items selected by a predicate (where by selection I intend to cover everything from semantic dependencies to syntactic subcategorization). The second is the level of grammar on which binding is defined. Attempting to roughly equate levels across different theories, suggestions have included the semantics/lexical conceptual structure (Jackendoff 1992), thematic structure (Jackendoff 1972, Wilkins 1988), argument structure/D-structure/initial grammatical relations (Manning 1994, Belletti and Rizzi 1988, Perlmutter 1984), surface syntax/grammatical relations, logical form, linear order, pragmatics (Levinson 1991), and discourse (Iida 1992). The data are sufficiently varied and complex that many theories end up as mixtures, variously employing a combination of elements along both dimensions (for instance, Chomsky (1986) relies purely on configurational notions for the relationship between an anaphor and its antecedent, but uses concepts from selection in the definition of the binding domain of an anaphor). LFG has always rejected a configurational account of binding.
For instance, Simpson (1991) argues that a configurational theory of binding in Warlpiri cannot be maintained, among other reasons because finite clauses lack a VP.
This paper develops a framework for TAG (Tree Adjoining Grammar) semantics that brings together ideas from different recent approaches. Then, within this framework, an analysis of scope is proposed that accounts for the different scopal properties of quantifiers, adverbs, raising verbs and attitude verbs. Finally, including situation variables in the semantics, different situation-binding possibilities are derived for different types of quantificational elements.
In spite of a renewed interest in the relationship between spirituality and managerial thinking, the literature covering the link between Islam and management has been sparse – especially in the area of ethics. One potential reason may be the cultural diversity of nearly 1.3 billion Muslims globally. Yet, one common element binding Muslim individuals and countries is normative Islam. Using all four sources of this religion’s teachings, we outline the parameters of an Islamic model of normative business ethics. We explain how this ethics model seeks to balance the needs of multiple stakeholders, and discuss its enforcement mechanisms. This Islamic approach to business ethics is centered around criteria that are in common with stakeholder theory such as justice and balance, and includes unique additional criteria such as trust and benevolence.
We pose and resolve several vexing decision theoretic puzzles. Some are variants of existing puzzles, such as ‘Trumped’ (Arntzenius and McCarthy 1997), ‘Rouble trouble’ (Arntzenius and Barrett 1999), ‘The airtight Dutch book’ (McGee 1999), and ‘The two envelopes puzzle’ (Broome 1999). Others are new. A unified resolution of the puzzles shows that Dutch book arguments have no force in infinite cases. It thereby provides evidence that reasonable utility functions may be unbounded and that reasonable credence functions need not be countably additive. The resolution also shows that when infinitely many decisions are involved, the difference between making the decisions simultaneously and making them sequentially can be the difference between riches and ruin. Finally, the resolution reveals a new way in which the ability to make binding commitments can save perfectly rational agents from sure losses.
The binding problem is frequently discussed in consciousness research. However, it is by no means clear what the problem is supposed to be and how exactly it relates to consciousness. In the present paper the nature of the binding problem is clarified by distinguishing between different formulations of the problem. Some of them make no mention of consciousness, whereas others are directly related to aspects of phenomenal experience. Certain formulations of the binding problem are closely connected to the classical philosophical problem of the unity of consciousness and the currently fashionable search for the neural correlates of consciousness. Nonetheless, only a part of the current empirical research on binding is directly relevant to the study of consciousness. The main message of the present paper is that the science of consciousness needs to establish a clear theoretical view of the relation between binding and consciousness and to encourage further empirical work that builds on such a theoretical foundation.
(von der Malsburg, 1981), “the binding problem” has captured the attention of researchers across many disciplines, including psychology, neuroscience, computational modeling, and even philosophy. Despite the issue’s prominence in these fields, what “binding” means is rarely made explicit. In this paper, I will briefly survey the many notions of binding and will introduce some issues that will be explored more fully in the reviews that follow. […] with the visual percept of it, so that both are effortlessly perceived as being aspects of a single event. I like to refer to these sorts of problems as perceptual binding problems, since they involve unifying aspects of percepts. In addition, there are cognitive binding problems: they include relating a concept to a percept, such as linking the visual representation of an apple to all the semantic knowledge stored about it (it is edible, how it […]
What is the proper way to draw the semantics-pragmatics distinction, and is what is said by a speaker ever enriched by pragmatics? An influential but controversial answer to the latter question is that the inputs to semantic interpretation contain representations of every contribution from context that is relevant to determining what is said, and that pragmatics never enriches the output of semantic interpretation. The proposal is bolstered by a controversial argument from syntactic binding designed to detect hidden syntactic structure. The following contains an exposition and consideration of the argument.
In “Bayesianism, Infinite Decisions, and Binding”, Arntzenius et al. (Mind 113:251–283, 2004) present cases in which agents who cannot bind themselves are driven by standard decision theory to choose sequences of actions with disastrous consequences. They defend standard decision theory by arguing that if a decision rule leads agents to disaster only when they cannot bind themselves, this should not be taken to be a mark against the decision rule. I show that this claim has surprising implications for a number of other debates in decision theory. I then assess the plausibility of this claim, and suggest that it should be rejected.
The binding problem is to explain how information processed by different sensory systems is brought together to unify perception. The problem has two sides. First, we want to explain phenomenal binding: the fact that we experience a single world rather than separate perceptual fields for each sensory modality. Second, we must solve a functional problem: to explain how a neural net like the brain links instances to types. I argue that phenomenal binding and functional binding require very different treatments. The puzzle of phenomenal binding rests on a confusion and so can be dissolved. So only functional binding deserves explanation. The general solution to that problem is that information to be bound is arrayed along different dimensions. So sensory coding into separate topographic maps facilitates functional binding and there is no need based on the unity of perception for special mechanisms that bring "back together" information in different maps.
Theories of binding have recently come into the focus of the consciousness debate. In this review, we discuss the potential relevance of temporal binding mechanisms for sensory awareness. Specifically, we suggest that neural synchrony with a precision in the millisecond range may be crucial for conscious processing, and may be involved in arousal, perceptual integration, attentional selection and working memory. Recent evidence from both animal and human studies demonstrates that specific changes in neuronal synchrony occur during all of these processes and that they are distinguished by the emergence of fast oscillations with frequencies in the gamma-range.
In The Right and the Good, W. D. Ross commits himself to the view that, in addition to being distinct and defeasible, some prima facie duties are more binding than others. David McNaughton has argued that there appears to be no way of making sense of this claim that is both coherent and consistent with Ross's overall picture. I offer an alternative way of understanding Ross's remarks about the comparative stringency of prima facie duties, which, in addition to being compatible with his view as presented in the text, provides us with a coherent, and indeed plausible, account of what it means for some duties to be more binding than others.
Despite its prominent role in cognitive psychology, its relevance for the research of consciousness, and some helpful clarification (e.g., Revonsuo 1999), the binding problem is still surrounded by considerable confusion. In this paper, I first give an informal but systematic overview on the diversity of forms the binding problem can assume, and then attempt to extract, on the basis of "working definitions" of various much-discussed types of binding, a common denominator. I propose that at the heart of the binding problem lies the notion of representing an entity as having a certain property, and discuss several objections that could be raised against the proposed analysis, as well as its usefulness and implications.
This article proposes an object properties approach to object perception. By thinking about objects as clusters of co-instantiated features that possess certain canonical higher-order object properties we can steer a middle way between two extreme views that are dominant in different areas of empirical research into object perception and the development of the object concept. Object perception should be understood in terms of perceptual sensitivity to those object properties, where that perceptual sensitivity can be explained in a manner consistent with the graded representation approach adopted by some connectionist modellers. The object properties approach does justice to the differences between a perceptual system solving the binding problem, on the one hand, and genuinely perceiving objects, on the other, without running into the theoretical problems associated with treating young infants as 'little scientists'.
Cognitive functions like perception, memory, language, or consciousness are based on highly parallel and distributed information processing by the brain. One of the major unresolved questions is how information can be integrated and how coherent representational states can be established in the distributed neuronal systems subserving these functions. It has been suggested that this so-called “binding problem” may be solved in the temporal domain. The hypothesis is that synchronization of neuronal discharges can serve for the integration of distributed neurons into cell assemblies and that this process may underlie the selection of perceptually and behaviorally relevant information. As we intend to show here, this temporal binding hypothesis has implications for the search of the neural correlate of consciousness. We review experimental results, mainly obtained in the visual system, which support the notion of temporal binding. In particular, we discuss recent experiments on the neural mechanisms of binocular rivalry which suggest that appropriate synchronization among cortical neurons may be one of the necessary conditions for the buildup of perceptual states and awareness of sensory stimuli.
It is important to separate the question of binding from the problem of consciousness. Undoubtedly, there are some close connections between the two: my conscious experience is of a bound unity. But my unconscious experiences -- subliminal impressions, masked primings, etc. -- might be bound too for all I know. Hence, some of the recent commentators speak too loosely when they talk of 40 Hz oscillations solving some problem of conscious perception.
Biomedical ontologies are emerging as critical tools in genomic and proteomic research where complex data in disparate resources need to be integrated. A number of ontologies exist that describe the properties that can be attributed to proteins; for example, protein functions are described by Gene Ontology, while human diseases are described by Disease Ontology. There is, however, a gap in the current set of ontologies—one that describes the protein entities themselves and their relationships. We have designed a PRotein Ontology (PRO) to facilitate protein annotation and to guide new experiments. The components of PRO extend from the classification of proteins on the basis of evolutionary relationships to the representation of the multiple protein forms of a gene (products generated by genetic variation, alternative splicing, proteolytic cleavage, and other post-translational modification). PRO will allow the specification of relationships between PRO, GO and other OBO Foundry ontologies. Here we describe the initial development of PRO, illustrated using human proteins from the TGF-beta signaling pathway (http://pir.georgetown.edu/pro).
This paper considers two recent arguments that structure should not be regarded as the fundamental individuating property of proteins. By clarifying both what it might mean for certain properties to play a fundamental role in a classification scheme and the extent to which structure plays such a role in protein classification, I argue that both arguments are unsound. Because of its robustness, its importance in laboratory practice, and its explanatory centrality, primary structure should be regarded as the fundamental distinguishing characteristic of protein taxonomy.
Theories of perception and of memory are closely allied. The binding problem (which considers how bits of perception are reassembled by the brain) leads to neurophysiological subjectivism. This could be outflanked by arguing with Bergson that perceiving consciousness is out in the world. Thus the brain would bind only behavioral “maps.” In turn, consciousness would retain our personal pasts. Such personal (episodic) memories both help us to recognize present objects and to perform creative acts. Memory, although retentive, is also creative. This is important in rethinking biological and evolutionary memory.
The problem of cortical integration is described and various proposed solutions, including grandmother cells, cell assemblies, feed-forward structures, RAAM and synchronization, are reviewed. One method involving complex attractors, which has received little attention in the literature, is explained and developed. I call this binding through annexation. A simulation study is then presented which suggests ways in which complex attractors could underlie our capacity to reason. The paper ends with a discussion of the efficiency and biological plausibility of the proposals as integration mechanisms for different regions and functions of the brain.
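For readers unfamiliar with attractor dynamics, here is a minimal Hopfield-network sketch (a standard textbook construction, not the paper's "binding through annexation" proposal; the pattern and sizes are arbitrary): a stored pattern becomes a fixed point of the dynamics, so a corrupted input relaxes back to it.

```python
import random

def sign(x):
    return 1 if x >= 0 else -1

def hebbian_weights(patterns):
    # Hebbian outer-product learning with zeroed diagonal.
    n = len(patterns[0])
    return [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / n
             for j in range(n)] for i in range(n)]

def recall(weights, state, steps=200):
    # Asynchronous updates; a stored pattern is a fixed point (attractor).
    state = list(state)
    n = len(state)
    for _ in range(steps):
        i = random.randrange(n)
        state[i] = sign(sum(weights[i][j] * state[j] for j in range(n)))
    return state

random.seed(1)
pattern = [1, -1, 1, 1, -1, -1, 1, -1]
weights = hebbian_weights([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]              # corrupt one unit
restored = recall(weights, noisy)
```

With a single stored pattern and one flipped unit, every update either preserves a correct unit or repairs the corrupted one, so the network settles into the stored attractor; the "complex attractors" of the paper generalize this simple fixed-point picture.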
In 1931 eminent chemist Fritz Paneth maintained that the modern notion of “element” is closely related to (and as “metaphysical” as) the concept of element used by the ancients (e.g., Aristotle). On that basis, the element chlorine (properly so-called) is not the elementary substance dichlorine, but rather chlorine as it is in carbon tetrachloride. The fact that pure chemicals are called “substances” in English (and closely related words are so used in other European languages) derives from philosophical compromises made by grammarians in the late Roman Empire (particularly Priscian [fl. ~520 CE]). When the main features of the constitution of isotopes became clear in the first half of the twentieth century, the formal (IUPAC) definition of a “chemical element” was changed. The features that are “essential” to being an element had previously been “transcendental” (“beyond the sphere of consciousness”) but, by the mid-twentieth century, the defining characteristics of elements, as such, had come to be understood in detail. This amounts to a shift in a “horizon of invisibility” brought about by progress in chemistry and related sciences. Similarly, chemical insight is relevant to currently open philosophical problems, such as the status of “the bundle theory” of the coherence of properties in concrete individuals.
Expressions such as English himself are interpreted as locally bound anaphors in certain syntactic environments and are exempt from the binding conditions in others. This article provides a unified semantics for himself in both of these uses. Their difference is reduced to the interaction with the syntactic environment. The semantics is based on an extension of the treatment of pronominals in variable-free semantics. The adoption of variable-free semantics is inspired by the existence of proxy-readings, which motivate an analysis based on Skolem functions. It is explained why certain anaphor types allow proxy-readings whereas others do not.
This article considers two important traditions concerning the chemical elements. The first is the meaning of the term “element”, including the distinctions between element as basic substance, as simple substance and as combined simple substance. In addition to briefly tracing the historical development of these distinctions, I comment on recent attempts to clarify the fundamental notion of element as basic substance, for which I believe the term “element” is best reserved. This discussion has focused on the writings of Fritz Paneth, which are here analyzed from a new perspective. The other tradition concerns the reduction of chemistry to quantum mechanics and an understanding of chemical elements through their microscopic components such as protons, neutrons and electrons. I claim that the use of electronic configurations has still not settled the question of the placement of several elements and discuss an alternative criterion based on maximizing triads of elements. I also point out another possible limitation to the reductive approach, namely the failure, up to now, to obtain a derivation of the Madelung rule. Mention is made of some recent similarity studies which could be used to clarify the nature of ‘elements’. Although it has been suggested that the notion of element as basic substance should be considered in terms of fundamental particles like protons and electrons, I resist this move and conclude that the quantum mechanical tradition has not had much impact on the question of what an element is, which remains an essentially philosophical issue.
Animal husbandry has been accused of maltreating animals, polluting the environment, and so on. These accusations were thought to be answered when the Dutch research program "Sustainable Technological Development" (STD) suggested a government-initiated conversion from meat to novel protein foods (NPFs). STD reasoned that if consumers converted from meat to NPFs, non-sustainable animal husbandry would no longer be needed. Whereas STD only worried about how to construct NPFs with a meat bite, this paper draws attention to the presumed, but problematic, role for the government in the execution of the STD suggestions. Although vegetarians take the credo "You Are What You Eat" literally and accuse non-vegetarians of being beasts, a different interpretation is more promising: eating meat has become a leading thread in many lifestyles and narratives of self-identity. Since the freedom to follow your own lifestyle or consumptive preferences is a core value in contemporary affluent societies, government intervention in the formation and satisfaction of consumer preferences for meat dishes is a precarious issue. Hence, NPFs might be interesting for a small fraction of society, but we had better not expect too much from a government-initiated conversion from meat to NPFs as the answer to animal husbandry's problems.
In addition to the standard ellipsis process known as VP-ellipsis, another ellipsis process, known as pseudo-gapping, was first brought to the forefront in the 1970’s by Sag (1976) and N. Levin (1986). This process elides subparts of a VP, as in (1): (1) Although I don’t like steak, I do ___ pizza. Developing ideas of K.S. Jayaseelan (1990), Howard Lasnik has proposed an analysis in which pseudo-gapping, which, in some instances, looks as though it is simply deleting a verb, is in fact deletion of a verb phrase, so that pseudo-gapping is really a probe into the structure of the verb phrase. I will examine pseudo-gapping in detail, and will show that it truly is a gold mine of insight into a number of fundamental issues in syntax. More concretely, I will demonstrate that a careful, detailed analysis of this process bears on the derivational level at which Principle A of the binding theory applies, as well as on the amount of explicit encoding within syntactic representations of informational structure, particularly focus. The paper will also re-assess Lasnik’s conclusion that pseudo-gapping provides evidence for Larson’s (1988) V-raising to a higher empty V position, a case of head movement, and will show that the movement involved is actually a case of remnant movement, or XP-movement.
Several of the essays in this collection discuss the 'binding problem', the problem of explaining in neurophysiological terms how it is that we see the various perceptual qualities of a physical object, such as its shape, colour, location and motion, as features of a single object. The perceived object seems to us a unitary thing, but its sensory properties are diverse and turn out to be processed in different areas of the brain. How then does the brain manage the integration? Readers of the essays in this collection may find themselves suffering from an analogous binding problem about the study of consciousness, though this problem is conceptual rather than perceptual, and here the difficulty is to achieve the integration rather than to understand how an effortless integration is achieved. Consciousness is the ideal topic for inter-disciplinary investigation. It is a central concern of such diverse disciplines as neurophysiology, evolutionary biology, psychology, cognitive science, philosophy and theology, among others, yet none of these disciplines has come close to providing full answers to the central questions that consciousness raises. Inter-disciplinary investigation seems an obvious way forward, but it generates the conceptual binding problem that this collection displays. The standard of the essays is very high, but it is extraordinarily difficult to integrate their content into anything like a single picture. We are all apparently talking about the same phenomenon, the conscious awareness of the world that each of us enjoys first-hand, but it is quite unclear how to see the very different things we say about this phenomenon as part of a single picture, or even as parts of different but compatible pictures. Having raised the binding problem for the inter-disciplinary study of consciousness, I hasten to say that I will not attempt even a partial substantive solution here: that is left as an exercise for the readers of this book.
New concepts may prove necessary to profit from the avalanche of sequence data on the genome, transcriptome, proteome and interactome and to relate this information to cell physiology. Here, we focus on the concept of large activity-based structures, or hyperstructures, in which a variety of types of molecules are brought together to perform a function. We review the evidence for the existence of hyperstructures responsible for the initiation of DNA replication, the sequestration of newly replicated origins of replication, cell division and for metabolism. The processes responsible for hyperstructure formation include changes in enzyme affinities due to metabolite-induction, lipid-protein affinities, elevated local concentrations of proteins and their binding sites on DNA and RNA, and transertion. Experimental techniques exist that can be used to study hyperstructures and we review some of the ones less familiar to biologists. Finally, we speculate on how a variety of in silico approaches involving cellular automata and multi-agent systems could be combined to develop new concepts in the form of an Integrated cell (I-cell) which would undergo selection for growth and survival in a world of artificial microbiology.
Temporal binding via 40-Hz synchronization of neuronal discharges in sensory cortices has been hypothesized to be a necessary condition for the rapid selection of perceptually relevant information for further processing in working memory. Binocular rivalry experiments have shown that late stage visual processing associated with the recognition of a stimulus object is highly correlated with discharge rates in inferotemporal cortex. The hippocampus is the primary recipient of inferotemporal outputs and is known to be the substrate for the consolidation of working memories to long-term, episodic memories. The prefrontal cortex, on the other hand, is widely thought to mediate working memory processes, per se. This article reviews accumulated evidence for the role of a subcortical matrix in linking frontal and hippocampal systems to select and "stream" conscious episodes across time (hundreds of milliseconds to several seconds). "Streaming" is hypothesized to be mediated by the selective gating of reentrant flows of information between these cortical systems and the subcortical matrix. The physiological mechanism proposed for this temporally extended form of binding is synchronous oscillations in the slower EEG spectrum (< 8 Hz).
George Herbert Mead's early lectures at the University of Chicago are more important to understanding the genesis of his views in social psychology than some commentators, such as Hans Joas, have emphasized. Mead's lecture series "The Evolution of the Psychical Element," preserved through the notes of student H. Heath Bawden, demonstrates his devotion to Hegelianism as a method of thinking and shows how this influenced his non-reductionistic approach to functional psychology. In addition, Mead's breadth of historical knowledge as well as his commitments in the natural and social sciences are on display here, culminating in the Darwinian observation that human animals achieve the degree of control they have over their environment only through the achievement of social organization.
It has been recently argued that some machine learning techniques known as Kernel methods could be relevant for capturing cognitive and neural mechanisms (Jäkel, Schölkopf, & Wichmann, 2009). We point out that "String kernels," initially designed for protein function prediction and spam detection, are virtually identical to one contending proposal for how the brain encodes orthographic information during reading. We suggest some reasons for this connection and we derive new ideas for visual word recognition that are successfully put to the test. We argue that the versatility and performance of String kernels make a compelling case for their implementation in the brain.
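The orthographic-coding proposal alluded to here represents a word by its ordered letter pairs ("open bigrams"); a string kernel over such codes can be sketched as follows (a toy with a hypothetical set-overlap normalization; the kernels used for protein and spam classification are weighted and gapped rather than this simple):

```python
from itertools import combinations

def open_bigrams(word):
    # All ordered letter pairs, adjacent or not ("open bigrams").
    return {a + b for a, b in combinations(word, 2)}

def string_kernel(w1, w2):
    # Similarity as normalized overlap (Jaccard) of the two bigram sets.
    b1, b2 = open_bigrams(w1), open_bigrams(w2)
    return len(b1 & b2) / max(len(b1 | b2), 1)

sim = string_kernel("word", "work")   # shared bigrams: wo, wr, or
```

Transposed or distant letters still share most of their open bigrams, which is why such codes capture the perceptual similarity of strings like "jugde" and "judge" that motivates the comparison with human reading.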
A new formalism for predicate logic is introduced, with a non-standard method of binding variables, which allows a compositional formalization of certain anaphoric constructions, including donkey sentences and cross-sentential anaphora. A proof system in natural deduction format is provided, and the formalism is compared with other accounts of this type of anaphora, in particular Dynamic Predicate Logic.
Sections 1, 2 and 3 contain the main result, the strong finite axiomatizability of all 2-valued matrices. Since non-strongly finitely axiomatizable 3-element matrices are easily constructed, the result reveals once again the gap between 2-valued and multiple-valued logic. Sec. 2 deals with the basic cases, which include the important F_i from Post's classification. The procedure in Sec. 3 reduces the general problem to these cases. Sec. 4 is a study of basic algebraic properties of 2-element algebras. In particular, we show that equational completeness is equivalent to the Stone-property and that each 2-element algebra generates a minimal quasivariety. The results of Sec. 4 will be applied in Sec. 5 to maximality questions and to a matrix-free characterization of 2-valued consequences in the lattice of structural consequences in any language. Sec. 6 takes a look at related axiomatization problems for finite algebras and matrices. We study the notion of a propositional consequence with equality and, among other things, present explicit axiomatizations of 2-valued consequences with equality.
In this paper it is shown that, in spite of their intuitive starting points, Kuipers' accounts lead to counterintuitive consequences. The counterintuitive results of Kuipers' account of H-D confirmation stem from the fact that Kuipers explicates a concept of partial (as opposed to full) confirmation. It is shown that Schurz-Weingartner's relevant-element approach as well as Gemes' content-part approach provide an account of full confirmation that does not lead to these counterintuitive results. One of the unwelcome results of Kuipers' account of nomic truthlikeness is the consequence that a theory Y, in order to be more truthlike than a theory X (where Y and X are incompatible), must imply the entire nomic truth. It is shown how the relevant-element approach to truthlikeness avoids this result.
This paper investigates relative constructions as in The gifted mathematician that you claim to be should be able to solve this equation, in which the head noun (gifted mathematician) is semantically dependent on an intensional operator in the relative clause (claim), even though it is not c-commanded by it. This is the kind of situation that has led, within models of linguistic description that assume a syntactic level of Logical Form, to analyses in which the head noun is interpreted within the CP-internal gap by reconstruction or interpretation of a lower element of a chain. We offer a solution that views surface representation as the input to semantics. The apparent inverted scope effects are traced back to the interpretation of the head nominal gifted mathematician as applying to individual concepts, and of the relative clause that you claim to be as including an equational statement. According to this view, the complex DP in question refers to the individual concept that exists just in the worlds that are compatible with what is generally supposed to be the case, is a gifted mathematician in those worlds, and is identical to you in those worlds. Our solution is related to the nonreconstructionist analysis of binding of pronouns that do not stand in a c-command relationship to their binder, as in The woman that every man hugged was his mother in Jacobson (in: Harvey, Santelmann (eds.) Proceedings of Semantics and Linguistic Theory IV:161–178, 1994) and Sharvit (in: Galloway, Spence (eds.) Proceedings of Semantics and Linguistic Theory VI:227–244, 1996), and allows us to capture both similarities with and differences from the latter type of construction.
We point out and offer explanations for a number of properties of such relative clauses—in particular their need for an internal intensional operator, their incompatibility with any determiner other than the definite article, and the fact that some of their properties are shared by demonstrably distinct kinds of relative clauses.
The general theory of variable binding term operators is an interesting recent development in logic. It opens up a rich class of semantic and model-theoretic problems. In this paper we survey the recent literature on the topic, and offer some remarks on its significance and on its connections with other branches of mathematical logic.
Proteins with nearly the same structure and function (homologous proteins) are found in increasing numbers in phylogenetically different, even very distant taxa (e.g. hemoglobins in vertebrates, in some invertebrates, and even in certain plants). In discussing the origin of those proteins biologists hardly at all consider convergent evolution, because the origin of proteins is held to be a random process, at least ultimately, since selection can work only on what the random process delivers as having a minimum adaptive value. The repetition of a random process with the same result is considered to be extremely unlikely. The supposed (un)likelihood, however, is almost never determined quantitatively. This paper attempts such a quantitative determination. It appears that the probability for the random origin of a definite protein is greater than what one would expect in view of the enormous number of equally possible nucleotide sequences in the corresponding gene, since what is equally possible is not always equally likely. The probability, however, of the convergent evolution of two proteins with approximately the same structure and function is too low to be plausible, even when all possible circumstances are present which seem to heighten the likelihood of such a convergence. If this is so, then the plausibility of a random evolution of two or more different but functionally related proteins seems hardly greater.
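The distinction the paper presses (equally possible sequences need not be equally likely outcomes) can be made concrete with back-of-the-envelope arithmetic (all numbers below are hypothetical illustrations, not the paper's figures): the probability of hitting any member of a functional class of sequences can dwarf the probability of one exact sequence.

```python
# Contrast the naive probability of one exact amino acid sequence with the
# probability of hitting any functionally equivalent sequence.
n = 100                                # residues in a small protein (hypothetical)
alphabet = 20                          # standard amino acids
p_exact = float(alphabet) ** -n        # one specific sequence: 20^-100
tolerant = 4                           # assumed acceptable residues per position
p_class = (tolerant / alphabet) ** n   # any member of the functional class
ratio = p_class / p_exact              # = tolerant^n, an enormous factor
```

Even a modest tolerance per position multiplies the target probability by tolerant^n (here 4^100, roughly 10^60), which is the sense in which a "definite protein" is far more probable than the raw count of possible sequences suggests; the paper's further point is that the same arithmetic still leaves the convergent evolution of two such proteins implausibly unlikely.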
Stiff Person Syndrome (SPS) is a rare autoimmune disorder associated with antibodies against glutamic acid decarboxylase (GAD-Ab), the key enzyme in γ-aminobutyric acid (GABA) synthesis. In order to investigate the role of cerebral benzodiazepine receptor binding in SPS, we performed [11C]flumazenil (FMZ) positron emission tomography (PET) in a female patient with SPS compared to nine healthy controls. FMZ is a radioligand to the postsynaptic central benzodiazepine receptor, which is co-localized with the GABA-A receptor. In the SPS patient, we found a global reduction of cortical FMZ binding. In addition, distinct local clusters of reduced radiotracer binding were observed. These data provide the first in vivo evidence for a reduced postsynaptic GABA-A receptor availability, which may reflect the loss of GABAergic neuronal inhibition in SPS.
Prediction is more than testing established theory by examining whether the prediction matches the data. To show this, I examine the practices of a community of scientists, known as threaders, who are attempting to predict the final, folded structure of a protein from its primary structure, i.e., its amino acid sequence. These scientists employ a careful and deliberate methodology of prediction. A key feature of the methodology is calibration. They calibrate in order to construct better models. The construction leads to knowledge of how to construct or build an object. Thus, prediction serves a cognitive goal of model construction and not just model or theory testing. The kind of knowledge that results is relevantly different from theoretical knowledge.
The expansion or revision of false theories by true evidence does not always increase their verisimilitude. After a comparison of different notions of verisimilitude the relation between verisimilitude and belief expansion or revision is investigated within the framework of the relevant element account. We are able to find certain interesting conditions under which both the expansion and the revision of theories by true evidence is guaranteed to increase their verisimilitude.
Binding relations are fundamentally semantic in nature. They arise as relations that are established with an interpretation. This is most apparent with dynamic binding, of the kind found in Dynamic Predicate Logic. Here it is the runtime of the evaluation that may permit a binding relation, in..
We develop a general method for applying functional models to natural systems and cite recent progress in protein modeling that demonstrates the power of this approach. Functional modeling constrains the range of acceptable structural models of a system, reduces the difficulty of finding them, and improves their fidelity. However, functional models are distinctly different from the structural models that are more commonly applied in science. In particular, structural and functional models ask different questions and provide different kinds of answers. As we clarify these differences and articulate how to use these models jointly, we extend our ability to do science and gain insight into the proper use of the terms organization, order, and emergence when describing systems in nature.
Consistent with Ruchkin and colleagues' proceduralist account, recent research on grouping and verbal-spatial binding in immediate memory shows continuity across short- and long-term retention, and activation of classes of information extending beyond those typically allowed in modular models. However, Ruchkin et al.'s account lacks well-specified mechanisms for the retention of serial order, binding, and the control of activation through attention.
Names, descriptions, and demonstratives raise well-known logical, ontological, and epistemological problems. Perhaps less well known, amongst philosophers at least, are the ways in which some of these problems not only recur with pronouns but also cross-cut further problems exposed by the study in generative linguistics of morpho-syntactic constraints on interpretation. These problems will be my primary concern here, but I want to address them within a general picture of interpretation that is required if wires are not to be crossed. That picture will be sketched in sections 3 and 4; subsequent sections will focus on pronouns and binding, drawing heavily on what has preceded.
The abstract variable binding calculus (VB-calculus) provides a formal framework encompassing such diverse variable-binding phenomena as lambda abstraction, Riemann integration, existential and universal quantification (in both classical and nonclassical logic), and various notions of generalized quantification that have been studied in abstract model theory. All axioms of the VB-calculus are in the form of equations, but like the lambda calculus it is not a true equational theory, since substitution of terms for variables is restricted. A similar problem with the standard formalism of the first-order predicate logic led to the development of the theory of cylindric and polyadic Boolean algebras. We take the same course here and introduce the variety of polyadic VB-algebras as a pure equational form of the VB-calculus. In one of the main results of the paper we show that every locally finite polyadic VB-algebra of infinite dimension is isomorphic to a functional polyadic VB-algebra that is obtained from a model of the VB-calculus by a natural coordinatization process. This theorem is a generalization of the functional representation theorem for polyadic Boolean algebras given by P. Halmos. As an application of this theorem we present a strong completeness theorem for the VB-calculus. More precisely, we prove that, for every VB-theory T that is obtained by adjoining new equations to the axioms of the VB-calculus, there exists a model D such that T ⊢ s = t iff D ⊨ s = t. This result specializes to a completeness theorem for a number of familiar systems that can be formalized as VB-calculi: for example, the lambda calculus, the classical first-order predicate calculus, the theory of the generalized quantifier "there exist uncountably many", and a fragment of Riemann integration.
The question is broadened from isomorphism to invertible transformation and optimal representation. Motivations are drawn from image compression but with an emphasis on object segmentation. Filling-in is considered as the phenomenal side of the binding process with back-surface filling-in being important. Finally, re-normalization of local filtering by globally integrated context is emphasized.
There are exactly two nonfinitely axiomatizable algebraic matrices with one binary connective ∘ such that x∘(y∘z) is a tautology of the matrix. This answers a question asked by W. Rautenberg, P. Wojtylak and W. Dziobiak. Since every 2-element matrix can be finitely axiomatized, the matrices presented here are of the smallest possible size and in some sense are the simplest possible.
In this paper we propose a theoretical model of protein folding and protein evolution in which a polypeptide (sequence/structure) is assumed to behave as a Maxwell Demon or Information Gathering and Using System (IGUS) that performs measurements aiming at the construction of the native structure. Our model proposes that a physical meaning to Shannon information (H) and Chaitin's algorithmic information (K) parameters can be both defined and referred from the IGUS standpoint. Our hypothesis accounts for the interdependence of protein folding and protein evolution through mutual influencing relationships mediated by the IGUS. In brief, IGUS activity in protein folding determines long term tendencies that emerge at the evolutionary time-scale. Thus, protein evolution is a consequence of measurements executed by proteins at the cellular level, where the IGUS imposes a tendency to attain a highly unique stable native form that promotes the updating of the information content. The folding kinetics observed is, thus, the outcome of an evolutionary process where the polypeptide-IGUS drives the evolution of its linear sequence. Finally, we describe protein evolution as an entropic process that tends to increase the content of mutual algorithmic information between the sequence and the structure. This model enables one: 1. To comprehend that full determination of the three-dimensional structure by the linear sequence is a tendency where satisfaction is only possible at thermodynamic equilibrium. 2. To account for the observed randomness of the amino acid sequences. 3. To predict an alternation of periods of selection and neutral diffusion during protein evolutionary time.
Synchronization of neural activity in oscillatory neural networks is a general principle of information processing in the brain at both preattentional and attentional levels. This is confirmed by a model of attention based on an oscillatory neural network with a central element and models of feature binding and working memory based on multi-frequency oscillations.
Tropical forest conservation in developing countries has repeatedly been highlighted as a new element in international climate policy. However, no clear ideas yet exist as to what shape such a conservation strategy might take. In the present paper, we would like to make some observations to this end. It is shown how projects to reduce CO2 emissions resulting from deforestation and degradation (REDD) can be integrated into a system of tradable emission rights in an industrialised country and which requirements ought to be fulfilled. Instruments are emission credits and emission allowances. Driving actors interested in emission rights through forest projects may be private investors or the rainforest state itself. The efficiency of the system depends to a great extent on a binding reference path for the tolerable emissions from deforestation, which has been agreed upon and adhered to by the rainforest country by means of a forest law aimed at limiting deforestation. Our considerations lead us to conclude that the national baseline approach with an appropriate contribution by the rainforest country, coupled with a decentralised system with private investors, seems the most viable option. Since additional burdens are imposed on the rainforest country to some extent, a compromise could consist of agreeing on a moderate deforestation path, which is harmonised with the benefits from the forest projects. Combining both programmes (offset credits and emission allowances) is particularly attractive because all participants, and especially the industrialised country, benefit from it. The industrialised country can expand its climate conservation programme without any additional costs to a certain degree.
We question the ecological plausibility, as a general model of cognition, of van der Velde and de Kamps's combinatorial blackboard architecture, where knowledge-binding in space and time relies on the structural rules of language. Evidence against their view of the brain and an ecologically plausible alternative model of cognition are brought forward.
It is argued that van der Velde and de Kamps employ binding circuitry that effectively constitutes a form of conjunctive binding. Analogies with prior systems are discussed and hypothetical origins of binding circuitry are examined for credibility.
It has been recently shown that the lattice effect algebras can be treated as a subvariety of the variety of so-called basic algebras. The open problem whether all subdirectly irreducible distributive lattice effect algebras are just subdirectly irreducible MV-chains and the horizontal sum of two 3-element chains is transferred in this paper into a more tractable one. We prove that, modulo distributive lattice effect algebras, the variety generated by MV-algebras and the horizontal sum of two 3-element chains is definable by three simple identities, and the problem now is to check whether these identities are satisfied by all distributive lattice effect algebras or not.
Tsuda's article suggests several plausible concepts of neurodynamic representation and processing, with a thoughtful discussion of their neurobiological grounding and formal properties. However, Tsuda's theory leads to a holistic view of brain functions and to the controversial conclusion that the “binding problem” is a pseudo-problem. By contrast, we stress the role of chaotic patterns in solving the binding problem, in terms of flexible temporal coding of visual scenes through graded and intermittent synchrony.
There is an ongoing debate in economics between the design-based approach and the structural approach. The main locus of contention regards how best to pursue the quest for credible causal inference. Each approach emphasizes one element (sharp study designs versus structural models), but these elements have well-known limitations. This paper investigates where a researcher might look for credibility when, for the causal question under study, these limitations are binding. It argues that seeking variety of evidence (understood specifically as using multiple means of determination to robustly estimate the same causal effect) constitutes such an alternative, and that applied economists actually take advantage of it. Evidential variety is especially relevant for a class of macro-level causal questions for which the design-based and the structural approaches appear to have limited reach. The use of evidential variety is illustrated by drawing on the literature on the institutional determinants of the aggregate unemployment rate.
The theory of event coding by Hommel et al. proposes that feature binding is a central component of action planning. To evaluate the binding hypothesis, we consider findings from studies of action-perception interference and bimanual movements. We argue that although binding of action features may be a valuable concept, interference from partial feature overlap does not provide a parsimonious account for the observed phenomena.
van der Velde & de Kamps argue for the importance of considering the binding problem in accounts of human mental representation. However, their proposed solution fails as a complete account because it represents the bindings between roles and their fillers through associations (or connections). In addition, many of the criticisms the authors level against synchrony-based binding models do not hold.
Electronegativity, described by Linus Pauling as "the power of an atom in a molecule to attract electrons to itself" (Pauling in The nature of the chemical bond, 3rd edn, Cornell University Press, Ithaca, p 88, 1960), is used to predict bond polarity. There are dozens of methods for empirically quantifying electronegativity, including: the original thermochemical technique (Pauling in J Am Chem Soc 54:3570–3582, 1932), numerical averaging of the ionisation potential and electron affinity (Mulliken in J Chem Phys 2:782–784, 1934), effective nuclear charge and covalent radius analysis (Sanderson in J Chem Phys 23:2467, 1955), and the averaged successive ionisation energies of an element's valence electrons (Martynov and Batsanov in Zhurnal Neorganicheskoi Khimii 5:3171–3175, 1980). Indeed, there are such strong correlations between numerous atomic parameters—physical and chemical—that the term "electronegativity" integrates them into a single dimensionless number between 0.78 and 4.00 that can be used to predict, describe, and model much of an element's physical character and chemical behaviour. The design of the common and popular medium form of the periodic table is in large part determined by four quantum numbers and four associated rules. Adding electronegativity, however, completes the construction so that it displays the multi-parameter periodic law operating in two dimensions, down the groups and across the periods, with minimal ambiguity.
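Mulliken's definition, for instance, can be computed directly as the numerical average of ionisation potential and electron affinity. The sketch below uses approximate literature values (illustrative only, and in eV rather than on the rescaled Pauling scale, to which Mulliken values are usually converted by an empirical linear fit):

```python
# Sketch of Mulliken's definition: the numerical average of ionisation
# potential and electron affinity, both in eV. The IP/EA values below
# are approximate literature numbers, used purely for illustration.

def mulliken_electronegativity(ionisation_potential_eV, electron_affinity_eV):
    return (ionisation_potential_eV + electron_affinity_eV) / 2.0

# Fluorine: IP ~ 17.42 eV, EA ~ 3.40 eV
chi_F = mulliken_electronegativity(17.42, 3.40)
# Lithium: IP ~ 5.39 eV, EA ~ 0.62 eV
chi_Li = mulliken_electronegativity(5.39, 0.62)

print(chi_F, chi_Li)  # fluorine comes out far more electronegative
```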
Pessoa et al. (1998a) summarize a wide body of data suggesting that perceptual filling-in phenomena can be attributed to neural filling-in processes. However, they reject, on philosophical grounds, the hypothesis that filled-in representations in the brain are the immediate substrate of visual percepts. It is proposed in this commentary that resonant binding between distributed cortical areas may instead be the crucial ingredient for conscious visual percepts, and that filling-in processes may facilitate the interactions between behaving organisms and object surfaces. These suggestions circumvent some of the philosophical problems associated with the idea of localized visual representations.
Levelt, Roelofs & Meyer present a comprehensive and sophisticated theory of lexical access in production, but we question its reliance on binding-by-checking as opposed to binding-by-timing, and we discuss how the timing of retrieval events is a major factor in both correct and errorful production.
Many people claim to have had direct perceptual awareness of God. William Alston, Richard Swinburne, Gary Gutting, and others have based their philosophical views on these reports. But using analogies from our encounters with humans whose abilities surpass our own, we realize that something essential is missing from these reports. The absence of this element renders it highly unlikely that these people have actually encountered a divine being. (Published Online August 11 2004).
We tested the hypothesis that perception of an alternative image in ambiguous figures would be manifest as high-frequency (gamma) components that become synchronized over multiple scalp sites as a "cognitive binding" process occurs. For 171 combinations of data from 19 electrodes, obtained from 17 subjects and 10 replicate stimuli, we calculated the difference in correlation between the response to first seeing an ambiguous figure and the response when the alternative percept for that figure became consciously realized (cognitively bound). Numerous statistically significant correlation differences occurred in all frequency bands tested with ambiguous-figure stimulation, but not in two kinds of control data (a reaction-time test to sound stimuli and a no-task, mind-wandering test). Statistically significant correlation changes were widespread, involving frontal, parietal, central, and occipital regions of both hemispheres. Correlation changes were evident at each of five frequency bands, ranging up to 62.5 Hz. Most of the statistically significant correlation changes were not between adjacent sites but between relatively distant sites, both ipsilateral and contralateral. Typically, these correlation changes occurred in more than one frequency band. These results suggest that cognitive binding is a distinct mental state that is reliably induced by ambiguous-figure perception tasks. Coherent oscillations at multiple frequencies may reflect the mechanism by which such binding occurs. Moreover, different coherent frequencies may mediate different components of the total cognitive-binding process.
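The core comparison, pairwise inter-electrode correlation differences between two conditions, can be sketched as follows. This is not the authors' exact pipeline: random arrays stand in for band-filtered EEG epochs, and all shapes and names are assumptions.

```python
import numpy as np

# Minimal sketch: compare inter-electrode correlations between two
# conditions, e.g. before vs after the alternative percept of an
# ambiguous figure is consciously realised. Random data stand in for
# band-pass-filtered EEG epochs (19 channels, 256 samples).
rng = np.random.default_rng(0)
n_channels, n_samples = 19, 256
before = rng.standard_normal((n_channels, n_samples))
after = rng.standard_normal((n_channels, n_samples))

corr_before = np.corrcoef(before)   # 19 x 19 correlation matrix
corr_after = np.corrcoef(after)
diff = corr_after - corr_before     # pairwise correlation changes

# Electrode pair whose coupling changed most (upper triangle only,
# so each pair is considered once):
i, j = np.unravel_index(np.abs(np.triu(diff, k=1)).argmax(), diff.shape)
print(i, j, diff[i, j])
```

In the actual study this difference would be computed per frequency band and per stimulus, with significance assessed against the control conditions.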
This paper investigates relative constructions as in The gifted mathematician that you claim to be should be able to solve this equation, in which the head noun (gifted mathematician) is semantically dependent on an intensional operator in the relative clause (claim), even though it is not c-commanded by it. This is the kind of situation that has led, within models of linguistic description that assume a syntactic level of Logical Form, to analyses in which the head noun is interpreted within the CP-internal gap by reconstruction or interpretation of a lower element of a chain. We offer a solution that views surface representation as the input to semantics. The apparent inverted scope effects are traced back to the interpretation of the head nominal gifted mathematician as applying to individual concepts, and of the relative clause that you claim to be as including an equational statement. According to this view, the complex DP in question refers to the individual concept that exists just in the worlds that are compatible with what is generally supposed to be the case, is a gifted mathematician in those worlds, and is identical to you in those worlds. Our solution is related to the non-reconstructionist analysis of binding of pronouns that do not stand in a c-command relationship to their binder, as in The woman that every man hugged was his mother in Jacobson (in: Harvey, Santelmann (eds.) Proceedings of Semantics and Linguistic Theory IV:161-178, 1994) and Sharvit (in: Galloway, Spence (eds.) Proceedings of Semantics and Linguistic Theory VI:227-244, 1996), and allows us to capture both similarities with and differences from the latter type of construction.
We point out and offer explanations for a number of properties of such relative clauses—in particular their need for an internal intensional operator, their incompatibility with any determiner other than the definite article, and the fact that some of their properties are shared by demonstrably distinct kinds of relative clauses.
Most theories of binding in most syntactic frameworks assume that the same notion of surface obliqueness that identifies the subject of a clause is also used for obliqueness conditions on reflexive binding. For instance, in GB (Chomsky), binding theory is standardly defined on S-structure, so that Nancy can bind herself in virtue of the c-commanding configuration that also makes Nancy the subject of the sentence.
This article sketches a theory of time-binding communication, which is to say communication that unifies widely separated times much as space-binding communication unifies widely separated places. Drawing from the work of Harold Innis, it first describes the function and character of time-binding communication as a means to social continuity. Then, following Alasdair MacIntyre and Michael Oakeshott, it explains the nature and necessary circumstances of this sort of time-binding communication, or tradition. It discusses the character, consequences, and causes of decadence (radical discontinuity) as these have been described by Richard Weaver, C. E. M. Joad, and Jacques Barzun. Finally, it turns to David Lowenthal's notion of the past as a 'foreign country' in an effort to explain the relations between modernity and both tradition and decadence, as well as the geography of tradition and decadence in the modern world.
This chapter concentrates on a range of phenomena that have crucially been held to involve (within Government-Binding Theory and now Minimalism) movement of an element to what is known as an argument position: roughly, a position in which an element can be base-generated and bear a crucial semantic role with respect to the main predicate of a clause. It is to be distinguished from movement to an Ā (read A-bar, or non-argument) position. The two types of movement have very different properties.
Binding needs to be task dependent, and cannot usefully be driven by properties of the stimulus alone. However, task dependent binding can only take place after the patterns in a stimulus have been identified. Thus pattern recognition needs to be done prior to binding. Synchronisation may be a consequence of pattern recognition and can be used to localise the pattern and tag its attributes at different levels of information processing.
Endogenous and exogenous processes of attention have been inferred with different types of precues used in allocation of attention to a target location. In the present research, a comparison was made between the typical peripheral single-element precue (SEP), a central precue, and a multiple-element precue (MEP) in order to further understanding of the processes involved in allocation of attention. Two precues were used on each trial in these experiments. An abrupt-onset precue appeared with an SEP, an MEP, or a central precue and was followed 50 or 300 ms later by a screen containing a target and two distractor characters. The abrupt-onset precue and the other precue each could be valid or invalid in indicating the location of the target, as in the study by Juola, Koshino, and Warner (1995). Response times to the targets showed that validity effects of the abrupt-onset precue and the MEP or central precue were additive, whereas those of the abrupt-onset precue and the SEP were interactive. These data suggest that, like a central precue, an MEP is an endogenous precue that guides conscious control of attention and has its attentional effects at a different processing level from an SEP, which is an exogenous precue and may compete for attentional resources with an abrupt-onset precue.
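The additivity logic behind this inference can be illustrated with a hypothetical 2 × 2 table of mean response times (all numbers invented): when the two validity effects are additive, the difference of differences (the interaction term) is near zero, whereas an interactive pattern yields a non-zero term.

```python
# Hypothetical 2 x 2 mean response times (ms): rows = abrupt-onset
# precue valid/invalid, columns = second precue valid/invalid.
# Numbers are invented to show a perfectly additive pattern.
rt = {("valid", "valid"): 400.0, ("valid", "invalid"): 430.0,
      ("invalid", "valid"): 440.0, ("invalid", "invalid"): 470.0}

# Interaction = difference of differences; a value near zero indicates
# that the two validity effects combine additively, consistent with
# processing at separate levels.
interaction = ((rt[("invalid", "invalid")] - rt[("invalid", "valid")])
               - (rt[("valid", "invalid")] - rt[("valid", "valid")]))
print(interaction)  # 0.0 -> additive pattern
```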
Transgenic plants are now being used to develop pharmaceutical and industrial products in addition to their use in crop improvement. Under confinement requirements, these transgenic plants are grown and processed under conditions that prevent intermixing with commodity crops. Regulatory agencies in the United States have provided guidance requiring zero tolerance for intermixing of these new industrial crops with commodity crops. While this is a worthy goal, it is theoretically unattainable: in spite of the best containment practices, any system of production carries a potential risk due to unforeseen incidents, including natural disasters or exposure to workers. The precautionary principle has been applied to numerous regulated articles to address the potential risks of new products and technology on the basis of risk assessment in similar situations. We present here a risk assessment model that could serve as a starting point for developing an accepted model for the industry. The model is based on current risk models used for other regulated articles, but adapted for these types of products. It could be used to determine action levels in the event of an unintended exposure or to ensure that detection or confinement methods are adequate to avoid risks. As an example, aprotinin, a therapeutic protein now being produced in maize, was evaluated for potential risk to humans using this model.
This paper reviews and enhances numerical models for determining thermal, elastic and electrical properties of carbon nanotube-reinforced polymer composites. For the determination of the effective stress–strain curve and thermal conductivity of the composite material, finite-element analysis (FEA), in conjunction with the embedded fibre method (EFM), is used. Variable nanotube geometry, alignment and waviness are taken into account. First, a random morphology of a user-defined volume fraction of nanotubes is generated, and their properties are incorporated into the polymer matrix using the EFM. Next, incremental and iterative FEA approaches are used for the determination of the nonlinear properties of the nanocomposite. For the determination of the electrical properties, a spanning network identification algorithm is used. First, a realistic nanotube morphology is generated from input parameters defined by the user. The spanning network algorithm then determines the connectivity between nanotubes in a representative volume element. Then, interconnected nanotube networks are converted to equivalent resistor circuits. Finally, Kirchhoff's current law is used in conjunction with FEA to solve for the voltages and currents in the system and thus calculate the effective electrical conductivity of the nanocomposite. The model accounts for electrical transport mechanisms such as electron hopping and simultaneously calculates percolation probability, identifies the backbone and determines the effective conductivity. Monte Carlo analysis of 500 random microstructures is performed to capture the stochastic nature of the fibre generation and to derive statistically reliable results. The models are validated by comparison with various experimental datasets reported in the recent literature.
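The Kirchhoff's-current-law step can be sketched as nodal analysis on a toy resistor graph (the topology and conductances below are invented; in the paper, the resistor circuit is derived from generated nanotube morphologies):

```python
import numpy as np

# Toy nodal-analysis sketch: a nanotube contact network reduced to a
# resistor graph is solved for node voltages via Kirchhoff's current law.
# Here: two 1-ohm resistors in series between a 1 V terminal and ground.
edges = [(0, 1, 1.0), (1, 2, 1.0)]  # (node_i, node_j, conductance in S)
n = 3
G = np.zeros((n, n))                # graph Laplacian (conductance matrix)
for i, j, g in edges:
    G[i, i] += g; G[j, j] += g
    G[i, j] -= g; G[j, i] -= g

# Boundary conditions: node 0 held at 1 V, node 2 grounded.
V = np.zeros(n)
V[0] = 1.0
free = [1]                          # interior nodes with unknown voltage
A = G[np.ix_(free, free)]
b = -G[np.ix_(free, [0, 2])] @ V[[0, 2]]
V[free] = np.linalg.solve(A, b)     # KCL: net current at interior nodes = 0

current_in = G[0] @ V               # current injected at the source terminal
print(V[1], current_in)             # midpoint voltage and total current
```

The effective conductance of the network is `current_in` divided by the applied voltage (here 0.5 S, as expected for two 1 Ω resistors in series); the same procedure scales to the large random networks described above.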
This paper provides a unification-based implementation of Binding Theory (BT) for the English language in the framework of feature-based lexicalized tree-adjoining grammar (LTAG). The grammar presented here does not actually coindex any noun phrases; it merely outputs a set of constraints on co- and contraindexation that may later be processed by a separate anaphora resolution module. It improves on previous work by implementing the full BT rather than just Condition A. The main technical innovation consists in allowing lists to appear as values of semantic features.
The biblical story of the binding of Isaac may have originally been written without the figure of the angel. As such, it reads strongly as an account of Abraham disobeying God’s direct command for the sake of Isaac. Interestingly, then, many interpreters since the time of the text’s final redaction read the binding of Isaac as an account of ethical disobedience despite the presence of the angel. In what follows, I consider Levinas’s account of religion, revelation and ethics for the way in which this can impact our reading of the biblical text. In this way, I hope to develop an account of the binding of Isaac which becomes an allegory for the need to mediate all modes of religious and/or political allegiance with concern for the well-being of other people.
Friedrich Paneth’s conception of “chemical element” has functioned as the official definition adopted by the International Union of Pure and Applied Chemistry since 1923. Paneth maintains a distinction between empirical and “transcendental” concepts of element; furthermore, chemical science requires fluctuation between the two. The origin of the empirical-transcendental split is found in Immanuel Kant’s classic Critique of Pure Reason (1781/1787). The present paper examines Paneth’s foundational concept of element in light of Kant’s attempt, late in life, to revoke key distinctions made in his Critique, including that between the regulative and constitutive functions of reason. In a section of his Opus postumum devoted to the “Transition from the Metaphysical Foundations of Natural Science to Physics,” Kant bends his philosophical system to address the newly emerging sciences of matter of his time. Specifically, he tried, without success, to develop the transcendental ground for the microscale motions of bodies encountered in physical, electrical and chemical processes. Paneth’s discussion of chemical element does not take the Opus postumum into account, which is why it begins with a rejection of Kant’s rejection (in his earlier writings) of chemistry’s status as a science. I make the case that Paneth’s definition of element effectively maintains something very like Kant’s critical separation of regulative and constitutive principles, while advancing the concept of chemical science.
The net uptake and loss of any element by a living organism can be described as the quotient of the total amount of the element present in the organism and its residence time in the organism. Theoretically it can be derived that the residence time, τ_i, equals V_i^(1−b)/k, in which b, the morphometric coefficient, is related to the size and shape of the organism (volume V_i), and k, the turnover coefficient, is related to its metabolic activity. The net uptake or excretion, Φ_i, of an element then follows from Φ_i = k · C_i · V_i^b, with C_i being the (average, whole-animal) concentration of the element. Residence times, derived from the quotient of body volume (weight) and daily food intake of various animal species of different sizes, show that, independent of animal or element species, the morphometric coefficient and turnover coefficient have values of b = 0.735 and k = 71.4 dm year⁻¹, respectively. The turnover coefficient may show some variation, related to the metabolic activity of the organism. The value of the morphometric coefficient implies that residence times vary with body size: in small animals the residence times of the elements may be on the order of several hours, whereas in the largest organisms residence times may be as long as several years.
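A worked sketch of these two allometric relations follows. The symbol names are reconstructions (the abstract's Greek letters were lost in extraction), and the volume values are arbitrary illustrations, not data from the paper:

```python
# Sketch of the allometric relations: residence time tau = V**(1-b)/k
# and net flux phi = k * C * V**b, with the abstract's fitted values
# b = 0.735 and k = 71.4 (per year). Symbol names are reconstructions.

B, K = 0.735, 71.4

def residence_time_years(volume):
    """tau = V**(1 - b) / k : mean residence time of an element."""
    return volume ** (1.0 - B) / K

def net_flux(concentration, volume):
    """phi = k * C * V**b : net uptake or excretion of the element."""
    return K * concentration * volume ** B

# Arbitrary illustrative body volumes: residence time grows with size.
for v in (0.001, 1.0, 1000.0):
    print(v, residence_time_years(v))
```

Because the exponent 1 − b = 0.265 is positive, residence time increases monotonically with body volume, which is the qualitative point the abstract draws: hours-scale turnover in small animals versus much longer residence times in large ones.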
The author offers a critical analysis of the so-called deflationary conception of truth. According to the conception in question, an adequate theory of truth contains nothing more than instances of the schema: "p" is true iff p. In short, truth is disquotation. After giving a brief presentation of the main deflationary ideas, the author argues that deflationism conflicts with normative epistemology. In other words, being a form of naturalism, it leads to the elimination of the so-called normative element from the philosophy of science. For example, the deflationary conception of truth is not able to account for the constitutive connections between the normative ideas of truth and reliability.
The creation and consolidation of a memory can rest on the integration of any number of possibly disparate features and contexts: colour, sound, emotion, arousal, context. How is it that these bind together to form a coherent memory? What is the role of binding in memory formation? What are the neural processes that underlie binding? Do these binding processes change with age?

This book offers an unrivalled overview of one of the most debated hotspots of modern memory research: binding. It contains 28 chapters on binding in different domains of memory, presenting classic research from the field of cognitive neuroscience. It is written by renowned scientists and leaders in the field who have made fundamental contributions to the rapidly expanding field of neurocognitive memory research. As well as presenting a state-of-the-art account of recent views on binding and its importance for remembering, it also includes a review of recent publications in the area, of benefit to both students and active researchers. More than just a survey, it supplies the reader with an integrative view on binding in memory, fostering deep insights not only into the processes and their determinants, but also into the neural mechanisms enabling these processes.

The content also encompasses a wide range of binding-related topics, including feature binding, the binding of items and contexts during encoding and retrieval, the specific roles of familiarity and recollection, as well as task- and especially age-related changes in these processes. A major section is dedicated to in-depth analyses of underlying neural mechanisms, focusing on both medial temporal and prefrontal structures. Computational approaches are covered as well.

For all students and researchers in memory, the book will not only enhance their understanding of binding, but will instigate innovative and pioneering ideas for future research.
Many kinds of creativity result from combination of mental representations. This paper provides a computational account of how creative thinking can arise from combining neural patterns into ones that are potentially novel and useful. We defend the hypothesis that such combinations arise from mechanisms that bind together neural activity by a process of convolution, a mathematical operation that interweaves structures. We describe computer simulations that show the feasibility of using convolution to produce emergent patterns of neural activity that can support cognitive and emotional processes underlying human creativity.
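Convolution-based binding of this kind is commonly implemented as circular convolution, as in holographic reduced representations; the sketch below shows binding and approximate unbinding of two random patterns. This is a standard technique, not necessarily the authors' exact model, and all dimensions and names are assumptions:

```python
import numpy as np

# Sketch of convolution-based binding in the spirit of holographic
# reduced representations. Circular convolution binds two patterns;
# circular correlation (convolving with the involution) approximately
# recovers one pattern given the other.
rng = np.random.default_rng(1)
d = 512
a = rng.standard_normal(d) / np.sqrt(d)   # random pattern, "concept A"
b = rng.standard_normal(d) / np.sqrt(d)   # random pattern, "concept B"

# Bind: circular convolution computed via FFT.
bound = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

# Unbind with a: circular correlation, i.e. conjugate in the frequency domain.
b_hat = np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(bound)))

# The noisy reconstruction should resemble b far more than chance.
similarity = np.dot(b_hat, b) / (np.linalg.norm(b_hat) * np.linalg.norm(b))
print(similarity)
```

The key property for the creativity argument is that `bound` is a new distributed pattern, dissimilar from both inputs yet still carrying recoverable information about each.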
Matravers offers as a Kantian model of aesthetic experience a free play of the cognitive faculties with beliefs or propositions. This is false to Kant, whose conception is better interpreted as a free play with elements of cognition such as intuitions and concepts. More importantly, an account closer to Kant's original provides a less restrictive model of aesthetic experience than Matravers's interpretation does, and therefore one that more readily fits a much larger number of cases.
In this paper I use a case study—the discovery of the chaperone function exerted by proteins in the various steps of the hereditary process—to re-discuss the question whether the nucleic acids are the sole repositories of relevant information, as assumed in the information theory of heredity. The evidence I present here of a crucial role for molecular chaperones in the folding of nascent proteins, as well as in DNA duplication, RNA folding and gene control, suggests that the family of proteins acting as molecular chaperones provides information that is complementary to that stored in the nucleic acids, and equally important. A re-evaluation of the role of proteins in the hereditary process is in order, away from the gene-centric approach of the information theory of heredity to which neo-Darwinian evolutionists adhere.
Apoptosis proteins play an essential role in regulating the balance between cell proliferation and death. Successful prediction of the subcellular localization of apoptosis proteins directly from primary sequence greatly benefits the understanding of programmed cell death and drug discovery. In this paper, using Chou’s pseudo amino acid composition (PseAAC), a total of 317 apoptosis proteins are classified by support vector machine (SVM). Jackknife cross-validation is applied to test the predictive capability of the proposed method. The predictive results show an overall prediction accuracy of 91.1%, which is higher than that of previous methods. Furthermore, another dataset containing 98 apoptosis proteins is examined by the proposed method; the overall success rate is 92.9%.
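The jackknife protocol can be sketched as leave-one-out cross-validation of an SVM. The dataset below is synthetic, standing in for PseAAC feature vectors (which, unlike this toy data, encode sequence-order information alongside amino acid composition); class counts and feature dimensions are invented:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Sketch of jackknife (leave-one-out) testing of an SVM classifier.
# Synthetic two-class data stand in for PseAAC vectors of apoptosis
# proteins labelled by subcellular location.
rng = np.random.default_rng(0)
n_per_class, n_features = 30, 20
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_features)),
               rng.normal(1.5, 1.0, (n_per_class, n_features))])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Each protein is held out in turn, the SVM is trained on the rest,
# and the held-out protein is predicted; accuracy is the success rate.
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=LeaveOneOut())
print(f"jackknife accuracy: {scores.mean():.3f}")
```

Jackknife testing is the most conservative of the common cross-validation schemes for small protein datasets, since every sample serves once as an unseen test case.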