Complexity and non-commutativity of learning operations on graphs

Abstract
We present results from numerical studies of supervised learning operations in recurrent networks considered as graphs, leading from a given set of input conditions to predetermined outputs. Graphs that have optimized their output for particular inputs with respect to predetermined outputs are asymptotically stable and can be characterized by attractors, which form a representation space for an associative multiplicative structure of input operations. Since the mapping from a series of inputs onto a series of such attractors depends on the order of the inputs, this structure is in general non-commutative. Moreover, the size of the set of attractors, which indicates the complexity of learning, is found to behave non-monotonically as learning proceeds. A tentative relation between this complexity and the notion of pragmatic information is indicated.
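The order-dependence claimed in the abstract can be illustrated with a minimal numerical sketch. The Python snippet below assumes a simple multistable recurrent dynamics x <- tanh(g*J@x + u), with a random symmetric coupling matrix J, a gain g, a relax routine, and two external input conditions u_a and u_b; all of these choices are illustrative assumptions, not the model actually studied in the paper. Because several attractors coexist at high gain, the attractor reached after presenting input a and then input b typically differs from the one reached in the reverse order.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 12  # number of network nodes (illustrative choice)

    # Random symmetric couplings at high gain make the network multistable:
    # several fixed points (attractors) coexist, so the state reached
    # depends on the history of inputs.
    J = rng.normal(size=(N, N))
    J = (J + J.T) / 2.0
    np.fill_diagonal(J, 0.0)
    g = 2.5  # gain; large enough for multiple coexisting attractors

    def relax(x, u, steps=300):
        # Iterate x <- tanh(g*J@x + u) under a constant external input u
        # until the state has effectively settled onto an attractor.
        for _ in range(steps):
            x = np.tanh(g * (J @ x) + u)
        return x

    u_a = rng.normal(size=N)  # input condition "a" (hypothetical)
    u_b = rng.normal(size=N)  # input condition "b" (hypothetical)

    x0 = np.zeros(N)
    # Present a then b, relaxing to an attractor after each input,
    # and compare with the reverse order starting from the same state.
    x_ab = relax(relax(x0, u_a), u_b)
    x_ba = relax(relax(x0, u_b), u_a)

    # A clearly nonzero distance means the two input orders ended in
    # different attractors, i.e. the input operations do not commute.
    print("|x_ab - x_ba| =", np.linalg.norm(x_ab - x_ba))

With generic parameters the printed distance is nonzero, which is the sense in which the composition of input operations fails to commute; for a contractive (monostable) network the last input alone would fix the final state, so the multistability assumed here is what makes the history matter.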