Complexity and non-commutativity of learning operations on graphs

Abstract
We present results from numerical studies of supervised learning operations in recurrent networks, considered as graphs, that lead from a given set of input conditions to predetermined outputs. Graphs that have optimized their output for particular inputs with respect to predetermined outputs are asymptotically stable and can be characterized by attractors, which form a representation space for an associative multiplicative structure of input operations. Since the mapping from a sequence of inputs onto a sequence of such attractors depends on the order of the inputs, this structure is generally noncommutative. Moreover, the size of the set of attractors, indicating the complexity of learning, is found to behave non-monotonically as learning proceeds. A tentative relation between this complexity and the notion of pragmatic information is indicated.
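
To make the noncommutativity claim concrete, here is a minimal Python sketch, not taken from the paper: a small recurrent network (its weight matrix W, the tanh update rule, and the inputs u_a, u_b are all illustrative assumptions) is driven by a constant input until its state settles near a fixed-point attractor. Applying two input operations in different orders can then land the network on different attractors.

```python
import numpy as np

# Hypothetical sketch, not the authors' actual model: a recurrent
# network viewed as a weighted graph relaxes toward a fixed-point
# attractor under a constant input.
rng = np.random.default_rng(0)
N = 8                                   # number of nodes (assumed size)
W = rng.normal(scale=1.2, size=(N, N))  # fixed recurrent weights (assumed)

def relax(state, u, steps=2000):
    """Iterate x <- tanh(W x + u) until (approximate) convergence."""
    for _ in range(steps):
        new = np.tanh(W @ state + u)
        if np.linalg.norm(new - state) < 1e-10:
            break
        state = new
    return state

u_a = rng.normal(size=N)   # input operation A (illustrative)
u_b = rng.normal(size=N)   # input operation B (illustrative)
x0 = np.zeros(N)

x_ab = relax(relax(x0, u_a), u_b)  # apply A, then B
x_ba = relax(relax(x0, u_b), u_a)  # apply B, then A

# If the composition of input operations were commutative, the two
# final attractors would coincide; generically they do not.
print("||x_AB - x_BA|| =", np.linalg.norm(x_ab - x_ba))
```

In this toy setting the attractor reached after the sequence (A, B) generically differs from the one reached after (B, A), which is the sense in which the multiplicative structure of input operations fails to commute.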