Complexity and non-commutativity of learning operations on graphs

We present results from numerical studies of supervised learning operations in recurrent networks considered as graphs that lead from a given set of input conditions to predetermined outputs. Graphs whose outputs have been optimized for particular inputs with respect to these predetermined targets are asymptotically stable and can be characterized by attractors, which form a representation space for an associative multiplicative structure of input operations. Since the mapping from a sequence of inputs onto a sequence of such attractors generally depends on the order of the inputs, this structure is in general noncommutative. Moreover, the size of the set of attractors, which indicates the complexity of learning, behaves non-monotonically as learning proceeds. A tentative relation between this complexity and the notion of pragmatic information is indicated.
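
The order-dependence claimed above can be illustrated with a minimal sketch, which is not the authors' actual simulation: a small recurrent network is adapted to two input/target pairs in both orders, and the attractors reached by the two resulting networks are compared. All names, sizes, and hyperparameters here (N, settle, learn, the learning rate) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8  # number of nodes in the graph (assumed size)

def settle(W, u, steps=200):
    """Iterate x <- tanh(W x + u) to an (approximate) attractor."""
    x = np.zeros(N)
    for _ in range(steps):
        x = np.tanh(W @ x + u)
    return x

def learn(W, u, target, lr=0.05, epochs=300):
    """Crude gradient descent on the squared error between the settled
    state and the target; returns the adapted weight matrix."""
    W = W.copy()
    for _ in range(epochs):
        x = settle(W, u)
        err = x - target
        # One-step approximation of the gradient through the fixed point.
        g = (1 - x**2) * err
        W -= lr * np.outer(g, x)
    return W

W0 = 0.1 * rng.standard_normal((N, N))          # shared initial graph
u_a, u_b = rng.standard_normal(N), rng.standard_normal(N)   # two inputs
t_a = np.sign(rng.standard_normal(N))           # target for input A
t_b = np.sign(rng.standard_normal(N))           # target for input B

# Apply the two learning operations in both orders.
W_ab = learn(learn(W0, u_a, t_a), u_b, t_b)     # A then B
W_ba = learn(learn(W0, u_b, t_b), u_a, t_a)     # B then A

# Probing both trained networks with the same input generally yields
# different attractors: the composition of learning operations on the
# graph does not commute.
print(np.linalg.norm(settle(W_ab, u_a) - settle(W_ba, u_a)))
```

Under these assumptions the printed distance is nonzero, so the two learning sequences end in distinct attractors, mirroring the noncommutative structure described in the abstract.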