Abstract
Values are critical for intelligent behavior, since values determine interests, and interests determine relevance. Therefore we address relevance and its role in intelligent behavior in animals and machines. Animals avoid exhaustive enumeration of possibilities by focusing on relevant aspects of the environment, which emerge into the (cognitive) foreground, while suppressing irrelevant aspects, which submerge into the background. Nevertheless, the background is not invisible, and aspects of it can pop into the foreground if background processing deems them potentially relevant. Essential to these ideas are the questions of how contexts are switched, which defines cognitive/behavioral episodes, and how new contexts are created, which allows the efficiency of foreground/background processing to be extended to new behaviors and cognitive domains. Next we consider mathematical characterizations of the foreground/background distinction, which we treat as a dynamic separation of the concrete space into (approximately) orthogonal subspaces that are processed differently. Background processing is characterized by large receptive fields that project into a space of relatively low dimension to accomplish rough categorization of a novel stimulus and its approximate location. Such background processing is partly innate and partly learned, and we discuss possible correlational (Hebbian) learning mechanisms. Foreground processing is characterized by small receptive fields that project into a space of comparatively high dimension to accomplish precise categorization and localization of the stimuli relevant to the context. We also consider mathematical models of valences and affordances, which are an aspect of the foreground. Cells processing foreground information have no fixed meaning (i.e., their meaning is contextual), so it is necessary to explain how the processing accomplished by foreground neurons can be made relative to the context. Thus we consider the properties of several simple mathematical models of how the contextual representation controls foreground processing. We show how simple correlational processes accomplish the contextual separation of foreground from background on the basis of differential reinforcement. That is, these processes account for the contextual separation of the concrete space into disjoint subspaces corresponding to the foreground and background. Since an episode may comprise the activation of several contexts (at varying levels of activity), we consider models, suggested by quantum mechanics, of foreground processing in superposition. That is, the contextual state may be a weighted superposition of several pure contexts, with a corresponding superposition of the foreground representations and the processes operating on them. This leads us to a consideration of the nature and origin of contexts. Although some contexts are innate, many are learned. We discuss a mathematical model of contexts that allows a context to split into several contexts, to agglutinate from several contexts, or to constellate out of relatively acontextual processing. Finally, we consider the acontextual processing that occurs when the current context is no longer relevant and that may trigger a switch to another context or the formation of a new one. We relate this to the situation known as "breakdown" in phenomenology.
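To make the abstract's central constructions concrete, here is a minimal mathematical sketch; the notation is not taken from the paper and is offered only as one plausible formalization. Write S for the concrete space, F_c and B_c for the context-dependent foreground and background subspaces with projection operators P_{F_c} and P_{B_c}, x for a stimulus, c_1, ..., c_n for pure contexts with weights w_k, y for a foreground response, W for a matrix of correlational weights, and r for a differential-reinforcement signal:

\[ \mathcal{S} \approx F_c \oplus B_c, \qquad F_c \perp B_c \ \text{(approximately)}, \]
\[ P_{F_c} + P_{B_c} = I, \qquad x = P_{F_c}\,x + P_{B_c}\,x, \]
\[ c = \sum_k w_k\, c_k, \qquad \sum_k w_k = 1, \qquad P_{F_c} \approx \sum_k w_k\, P_{F_{c_k}}, \]
\[ \Delta W \propto r\, y\, x^{\mathsf{T}}. \]

On this reading, background processing corresponds to the comparatively low-dimensional image of P_{B_c} (large receptive fields, rough categorization), foreground processing to the higher-dimensional image of P_{F_c}; the weighted sum over pure contexts is one way to read the quantum-inspired superposition of contexts, and the reinforcement-gated correlational (Hebbian) update is one way the differential-reinforcement separation of foreground from background could be realized.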

Similar books and articles

The Use of Situation Theory in Context Modeling. Varol Akman & Mehmet Surav - 1997 - Computational Intelligence 12 (4):1-13.
Internal Context and Top-Down Processing. Peter König, Carl Chiang & Astrid von Stein - 1997 - Behavioral and Brain Sciences 20 (4):691-692.
Foreground and Background in Nietzsche. Frederick C. Copleston - 1968 - Review of Metaphysics 21 (3):506-523.
AI as Complex Information Processing. Hideyuki Nakashima - 1999 - Minds and Machines 9 (1):57-80.
A Theory of Concepts and Their Combinations I: The Structure of the Sets of Contexts and Properties. Diederik Aerts & Liane Gabora - 2005.
Backgrounding Desire. Philip Pettit & Michael Smith - 1990 - Philosophical Review 99 (4):565-592.
