To transform the phenomena: Feyerabend, proliferation, and recurrent neural networks

Philosophy of Science 64 (4):420 (1997)
Paul Feyerabend recommended the methodological policy of proliferating competing theories as a means to uncovering new empirical data, and thus as a means to increasing the empirical constraints that all theories must confront. Feyerabend's policy is here defended as a clear consequence of connectionist models of explanatory understanding and learning. An earlier connectionist "vindication" is criticized, and a more realistic and penetrating account is offered in terms of the computationally plastic cognitive profile displayed by neural networks with a recurrent architecture.
DOI 10.1086/392618

Citations of this work
Matthew J. Brown & Ian James Kidd (2016). Introduction: Reappraising Paul Feyerabend. Studies in History and Philosophy of Science Part A 57:1-8.
Christopher J. Preston (2005). Restoring Misplaced Epistemology. Ethics, Place and Environment 8 (3):373 – 384.

