Abstract
“Naturalistic” semantic theories attempt to specify sufficient conditions, in a non-intentional and non-semantic vocabulary, for a mental representation’s having a particular meaning. Information-based theories, for example, identify the meaning of a mental representation with the cause of its tokening in certain specifiable circumstances.¹ Teleological theories hold that the meaning of a mental representation is determined by its biological function, what it was selected for.²
Notes
1. See Stampe 1977, Dretske 1981, 1986, and Fodor 1990. There are, of course, important differences in these accounts.
2. See, primarily, Millikan 1984 and Papineau 1987.
3. Fodor’s (1990) asymmetrical dependency account specifies three conditions jointly sufficient to determine a mental representation’s meaning. These conditions will be satisfied by undetached cow parts whenever they are satisfied by cows. With respect to the asymmetrical dependency condition, undetached cow parts causing “cow” tokenings is symmetrically dependent upon cows causing “cow” tokenings: undetached cow parts would not cause “cow” tokenings unless cows did, and vice versa.
4. It might be argued that an object ontology is innate. There is some evidence from studies with infants that this may be true (see Spelke 1990). But the claim that certain basic categories are innate does not help the naturalistic semanticist, who is attempting to show how semantic properties are physically grounded.
5. See, for example, Fodor 1975, Pylyshyn 1984, and Cummins 1989.
6. The directness requirement needs to be made precise; at a minimum, however, it requires that independently characterized states of the device covary with elements of the intended domain. The requirement that the mapping be direct precludes interpreting the wall behind me as an adding machine, since the assignment of numbers to states of the wall requires the interpreter to compute the addition function herself. The system is not doing the work. (A toy sketch of this contrast follows these notes.)
7. The account applies to both classical and connectionist computational models.
8. See “A Pragmatic Account of Mental Content” (forthcoming) for a more extensive defense of this view.
9. See Egan 1995 for elaboration of this point.
10. In the absence of a causal connection between the device and the stock market, or the opponent’s chess moves, these “accidental” correlations are of no real interest.
11. The objection is originally from Searle 1980.
12. I assume that this is what Horgan means by a structure figuring in a “content-appropriate way” in a system.
13. This might happen if general environmental constraints tacitly assumed by the visual system do not hold in a particular situation.
14. See Chomsky (1995), p. 56.
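The contrast drawn in note 6 can be made concrete with a minimal sketch. This is purely illustrative and not from the chapter; all names here (Adder, Wall, interpret, parasitic_interpret) are hypothetical. The point it encodes: an interpretation is direct when a fixed state-to-number mapping suffices because the device’s own dynamics track the addition function, and parasitic when the interpreter must compute the sums herself.

```python
# Hypothetical illustration of the directness requirement: a device
# counts as an adder only if its own state transitions, under a fixed
# interpretation, covary with sums.

class Adder:
    """A toy device whose internal dynamics do the arithmetical work."""
    def __init__(self):
        self.state = 0          # the device's physical state

    def poke(self, n):
        self.state += n         # the device's causal dynamics

def interpret(device):
    # Direct interpretation: a fixed state-to-number mapping.
    # No arithmetic happens here; the state is simply read off.
    return device.state

class Wall:
    """The wall behind me: its states do not change with the inputs."""
    def __init__(self):
        self.state = "brick"

def parasitic_interpret(wall, inputs):
    # Any numerical assignment to the wall's unchanging states must
    # compute the sum in the interpreter, not in the system.
    return sum(inputs)

# The Adder's states covary with the sums; interpret() does no work.
adder = Adder()
adder.poke(2)
adder.poke(3)
assert interpret(adder) == 5

# The Wall yields the right answer only because the interpreter
# computes the addition function herself: a parasitic mapping.
assert parasitic_interpret(Wall(), [2, 3]) == 5
```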
© 1999 Springer Science+Business Media Dordrecht
Cite this chapter
Egan, F. (1999). Pragmatic Aspects of Content Determination. In: Fisette, D. (ed.) Consciousness and Intentionality: Models and Modalities of Attribution. The Western Ontario Series in Philosophy of Science, vol. 62. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-9193-5_10
Print ISBN: 978-90-481-5300-8
Online ISBN: 978-94-015-9193-5