Connectionist content

Abstract
If the arguments of chapter 1 are correct, associationist connectionist models (such as ultralocal ones) yield the clearest alternatives to the LOT hypothesis. While such models may not provide a general account of cognition, they may account for important aspects of it, such as low-level perception (e.g., via the interactive activation model of reading) or the mechanisms that distinguish experts from novices at a given skill (e.g., via dependency-network models). Since these models stand a fighting chance of applying to some aspects of cognition, it is important from a philosophical standpoint that we have appropriate tools for understanding them. In particular, we want a theory of the semantic content of representations in certain connectionist models. In this chapter, I consider the prospects for applying a specific sort of "fine-grained" theory of content to such models.