Applying forward models to sequence learning: A connectionist implementation

Abstract
The ability to process events in their temporal and sequential context is a fundamental skill made mandatory by constant interaction with a dynamic environment. Sequence learning studies have demonstrated that subjects exhibit detailed — and often implicit — sensitivity to the sequential structure of streams of stimuli. Current connectionist models of performance in the so-called Serial Reaction Time Task (SRT), however, fail to capture the fact that sequence learning can be based not only on sensitivity to the sequential associations between successive stimuli, but also on sensitivity to the associations between successive responses, and on the predictive relationships that exist between these sequences of responses and their effects in the environment. In this paper, we offer an initial exploration of an alternative architecture for sequence learning, based on the principles of Forward Models.
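The forward-model principle the abstract appeals to can be illustrated with a toy sketch: a network receives the current stimulus together with the response just made, and learns to predict the next stimulus. This is a minimal, hypothetical illustration (the toy sequence, network sizes, and the assumption that the response simply matches the current stimulus are ours, not the paper's architecture):

```python
import numpy as np

# Hypothetical sketch of a forward model for an SRT-like task:
# input = [current stimulus, response just made] -> predicted next stimulus.
# The sequence, sizes, and learning rate are illustrative assumptions.

rng = np.random.default_rng(0)

SEQ = [0, 1, 2, 1, 3, 2]   # toy repeating stimulus sequence
N = 4                      # number of distinct stimuli / responses
H = 16                     # hidden units

def one_hot(i, n=N):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# One hidden layer; softmax output over the predicted next stimulus.
W1 = rng.normal(0.0, 0.5, (H, 2 * N)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (N, H));     b2 = np.zeros(N)

def forward(x):
    h = np.tanh(W1 @ x + b1)
    z = W2 @ h + b2
    p = np.exp(z - z.max())
    return h, p / p.sum()

def step(x, target, lr=0.1):
    """One backprop step of cross-entropy on the next-stimulus prediction."""
    global W1, b1, W2, b2
    h, p = forward(x)
    loss = -np.log(p[target])
    dz = p.copy(); dz[target] -= 1.0          # softmax + cross-entropy gradient
    dh = W2.T @ dz
    da = dh * (1.0 - h ** 2)                  # tanh derivative
    W2 -= lr * np.outer(dz, h); b2 -= lr * dz
    W1 -= lr * np.outer(da, x); b1 -= lr * da
    return loss

def run_epoch():
    total = 0.0
    for t in range(len(SEQ)):
        s, nxt = SEQ[t], SEQ[(t + 1) % len(SEQ)]
        # Assume the response reproduces the current stimulus, as in SRT.
        x = np.concatenate([one_hot(s), one_hot(s)])
        total += step(x, nxt)
    return total / len(SEQ)

losses = [run_epoch() for _ in range(200)]
```

After training, the average prediction error drops toward the limit set by the sequence's residual ambiguity (stimulus 1 is followed by either 2 or 3), which is exactly the kind of sequential sensitivity a forward model can acquire from response–effect pairings.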