Abstract
Shannon information is commonly assumed to be the wrong way to conceive of information in most biological contexts. Since the theory deals only in correlations between systems, the argument goes, it can apply to any and all causal interactions that affect a biological outcome. Since informational language is generally confined to only certain kinds of biological process, such as gene expression and hormone signalling, Shannon information is thought unable to account for this restriction. It is often concluded that a richer, teleosemantic sense of information is needed. I argue against this view, and show that a coherent and sufficiently restrictive theory of biological information can be constructed with Shannon information at its core. This can be done by paying due attention to some crucial distinctions: between the quantity of information and its fitness value, and between carrying information and having the function of doing so. From this I construct an account of how informational functions arise, and show that the “subject matter” of these functions can readily be identified with the natural information dealt with by Shannon’s theory.