Neural information and the problem of objectivity


Abstract

A fascinating research program in neurophysiology attempts to quantify the amount of information transmitted by single neurons. The claims that emerge from this research raise new philosophical questions about the nature of information. What kind of information is being quantified? Do the resulting quantities describe empirical magnitudes like those found elsewhere in the natural sciences? In this article, it is argued that neural information quantities have a relativistic character that makes them distinct from the kinds of information typically discussed in the philosophical literature. It is also argued that despite this relativistic character, there are cases in which neural information quantities can be viewed as robustly objective empirical properties.


Notes

  1. The entropy associated with a single value of some random variable is given by the log of the reciprocal of its relative frequency: \(\log_{2}(1/0.5) = 1\) bit and \(\log_{2}(1/0.25) = 2\) bits. To find the entropy of an entire distribution, we take a weighted sum over these individual entropies. So the marginal entropy \(H(X) = 0.5(1) + 0.25(2) + 0.25(2) = 1.5\) bits. The computation required to find the marginal entropy \(H(Y)\) is identical to that required for \(H(X)\). To compute the joint entropy \(H(X,Y)\), we take a weighted sum over the individual entropies associated with each of the six terms in the center of the table. Three of those terms evaluate to 0. Once they are removed, the remaining terms constitute an expression identical to that for \(H(X)\), which, as we just saw, is equal to 1.5 bits.
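
To make the computation concrete, here is a minimal sketch in Python. Since the table itself is not reproduced here, the joint distribution in the sketch is an assumption: any table whose three nonzero joint probabilities are 0.5, 0.25, and 0.25 reproduces the values just given.

```python
# A minimal sketch of the entropy computations in note 1. The joint
# distribution below is assumed, not taken from the paper's table: any
# table with nonzero cells 0.5, 0.25, 0.25 gives the same numbers.
from math import log2

def entropy(probs):
    # Weighted sum of surprisals log2(1/p); zero-probability cells contribute nothing.
    return sum(p * log2(1 / p) for p in probs if p > 0)

joint = [[0.50, 0.00],   # hypothetical joint distribution p(x, y);
         [0.00, 0.25],   # three of its six cells are zero, as in the note
         [0.00, 0.25]]

p_x = [sum(row) for row in joint]                   # marginal distribution of X
h_x = entropy(p_x)                                  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
h_xy = entropy([p for row in joint for p in row])   # same nonzero terms, so also 1.5 bits

print(h_x, h_xy)  # 1.5 1.5
```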

  2. It is worth noting here that the hair-in-the-wind argument exploits a perfectly contingent fact about humans. Filiform hairs on the legs of crickets move with the local air currents too, but in crickets, neural receiver mechanisms exploit the hair–air correlation for predator detection (Magal et al. 2006). There, hair direction really is an informational signal, and, at least in principle, the amount of information it transmits could be quantified, as the sketch below illustrates.
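
In principle, the quantification would proceed just as in note 1: estimate the joint distribution of air direction and hair deflection and compute the mutual information between them. The probabilities below are invented purely for illustration.

```python
# A toy sketch (invented probabilities) of quantifying the hair-air signal:
# mutual information I(Air; Hair) = H(Air) + H(Hair) - H(Air, Hair).
from math import log2

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

# Hypothetical joint distribution over two discretized directions, with the
# hair usually, but not always, tracking the local air current.
joint = [[0.40, 0.10],
         [0.10, 0.40]]

p_air = [sum(row) for row in joint]
p_hair = [sum(col) for col in zip(*joint)]
mi = entropy(p_air) + entropy(p_hair) - entropy([p for row in joint for p in row])

print(round(mi, 3))  # about 0.278 bits per observation
```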

  3. I have not given a definition of the term “function.” As the size of the philosophical literature on the subject suggests, it is not easy to say exactly what it means for an object or process to have a biological function. For many aspects of biological theory, including the kind of function discussed here, I favor the modern history theory of functions, as expressed in Godfrey-Smith (1994). But my view is compatible with other theories of biological function as well. It is important, however, that the notion of function have some relation to natural history. Without that connection, it loses some of the objectivity that I argue is worth retaining in neurophysiological theory. See section “Is neural information objective?” for further commentary on this point.

  4. A very recent survey is Floridi (2015). Other prominent surveys include Sayre (1976), Adams (2003), and Harms (2006).

  5. I have suppressed the role of time in this discussion because it complicates the mathematics without changing the conceptual issues under consideration. Notice that the argument does not change significantly if we consider two non-identical strings sent from one location to another over time. If the strings share the same statistical properties, their transmission can achieve the same information rate, expressed in bits/s. That is still no reason to think that the two strings have the same meaning (see the sketch below).
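
A small illustration of the point, with invented strings: two messages that are anagrams of one another share their symbol statistics, so an empirical entropy estimate, and hence the achievable rate in bits/s at a fixed symbol rate, cannot tell them apart.

```python
# A minimal sketch: identical symbol statistics yield identical entropy
# estimates, whatever the strings mean. The example strings are invented.
from collections import Counter
from math import log2

def entropy_per_symbol(s):
    # Empirical (zeroth-order) entropy over symbol frequencies; counts are
    # sorted so that equal multisets of counts sum in the same order.
    counts = sorted(Counter(s).values())
    n = len(s)
    return sum((c / n) * log2(n / c) for c in counts)

a = "dog bites man"
b = "man bites dog"  # same symbols, very different meaning

print(entropy_per_symbol(a) == entropy_per_symbol(b))  # True
```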

  6. Another, more controversial, reason to resist the semantic interpretation of information-theoretic estimates is that the activity in a single neuron seems to be too low-level for semantic properties to emerge at all. If there are no semantic properties at the level of individual neurons, then, clearly, information estimates describing the behavior of individual neurons cannot be interpreted semantically. Rosa Cao has defended this anti-semantic position on the basis of an interesting dilemma. The signals transmitted by individual neurons either lack sufficiently robust connections to the external world to carry content on their own, or the connections they exhibit are too inflexible to deserve an informational, as opposed to merely causal, mode of description (Cao 2012).

  7. See, for example, Cover and Thomas (1991).

  8. The phrase “For Shannon” in this definition might be misleading. Shannon was many things, but he was not a metaphysician. He was not interested in trying to divide the world into informational phenomena and non-informational phenomena. In fact, in a short paper entitled “The Bandwagon,” Shannon warns that information theory is easily misused when applied outside the realm of communications technology (Shannon 1956).

  9. Although Bergstrom and Rosvall’s article is focused on genetic information, they suggest that their definition can be extended to include neural information quantities.

References

  • Adams F (2003) The informational turn in philosophy. Minds Mach 13(4):471–501

  • Bar-Hillel Y, Carnap R (1953) Semantic information. Br J Philos Sci 4(14):147–157

  • Bergstrom CT, Rosvall M (2011) The transmission sense of information. Biol Philos 26(2):159–176

  • Borst A, Theunissen FE (1999) Information theory and neural coding. Nat Neurosci 2(11):947–957

  • Cao R (2012) A teleosemantic approach to information in the brain. Biol Philos 27(1):49–71

  • Cover TM, Thomas JA (1991) Elements of information theory, 1st edn. Wiley, New York

  • Dayan P, Abbott LF (2001) Theoretical neuroscience. MIT Press, Cambridge

  • Dretske F (1981) Knowledge and the flow of information. MIT Press, Cambridge

  • Floridi L (2015) Semantic conceptions of information. In: Zalta EN (ed) The Stanford Encyclopedia of Philosophy. Spring 2015 edition

  • Frye MA, Dickinson MH (2001) Fly flight: a model for the neural control of complex behavior. Neuron 32(3):385–388

  • Gallistel CR, King AP (2009) Memory and the computational brain: why cognitive science will transform neuroscience, 1st edn. Wiley, Chichester

  • Godfrey-Smith P (1994) A modern history theory of functions. Noûs 28(3):344–362

  • Godfrey-Smith P (2011) Senders, receivers, and genetic information: comments on Bergstrom and Rosvall. Biol Philos 26(2):177–181

  • Godfrey-Smith P, Sterelny K (2008) Biological information. In: Zalta EN (ed) The Stanford Encyclopedia of Philosophy. Fall 2008 edition. https://plato.stanford.edu/archives/fall2008/entries/information-biological/

  • Harms WF (2006) What is information? Three concepts. Biol Theory 1(3):230–242

  • Magal C, Dangles O, Caparroy P, Casas J (2006) Hair canopy of cricket sensory system tuned to predator signals. J Theor Biol 241(3):459–466

  • Neri P (2006) Spatial integration of optic flow signals in fly motion-sensitive neurons. J Neurophysiol 95(3):1608–1619

  • Piccinini G, Scarantino A (2011) Information processing, computation, and cognition. J Biol Phys 37(1):1–38

  • Purves D, Augustine GJ, Fitzpatrick D, Katz L, LaMantia A-S, McNamara JO, Williams S (2001) Neuroscience, 2nd edn. Sinauer Associates, Sunderland

  • Reiss J, Sprenger J (2014) Scientific objectivity. In: Zalta EN (ed) The Stanford Encyclopedia of Philosophy. Fall 2014 edition. https://plato.stanford.edu/archives/fall2014/entries/scientific-objectivity/

  • Rieke F, Warland D, de Ruyter van Steveninck R, Bialek W (1999) Spikes: exploring the neural code. MIT Press, Cambridge

  • Sayre K (1976) Cybernetics and the philosophy of mind. Routledge, Abingdon-on-Thames

  • Shannon CE (1956) The bandwagon. IRE Trans Inf Theory 2(1):3

  • Shannon CE, Weaver W (1949) The mathematical theory of communication. University of Illinois Press, Urbana

  • Skyrms B (2010) Signals: evolution, learning, and information. Oxford University Press, Oxford

  • Van Hateren J, Kern R, Schwerdtfeger G, Egelhaaf M (2005) Function and coding in the blowfly H1 neuron during naturalistic optic flow. J Neurosci 25(17):4343–4352


Acknowledgements

Funding was provided by the National Science Foundation (Grant No. 1430601). Thanks to Peter Godfrey-Smith, Rosa Cao, Ron Planer, Matteo Colombo, Daniel Kostic, and Gualtiero Piccinini for their insightful criticisms of my initial writings on this topic. Thanks also to two anonymous referees for their detailed commentary on an earlier draft of this manuscript.

Author information

Correspondence to Charles Rathkopf.


Cite this article

Rathkopf, C. Neural information and the problem of objectivity. Biol Philos 32, 321–336 (2017). https://doi.org/10.1007/s10539-017-9561-7

