Abstract
The concept of information tempts us as a theoretical primitive: partly because of the respectability lent to it by the highly successful applications of Shannon’s information theory, partly because of its broad applicability across domains, and partly because of its neutrality with respect to what basic sorts of things there are. This versatility, however, is the very reason why information cannot be the theoretical primitive we might like it to be. “Information,” as it is variously used, is systematically ambiguous with respect to whether it involves continuous or discrete quantities, causal or noncausal relationships, and intrinsic or relational properties. Many uses can nevertheless be firmly grounded in existing theory. Continuous quantities of information involving probabilities can be related to information theory proper. Information defined relative to systems of rules or conventions can be understood in terms of the theory of meaning. A number of causal notions may be located relative to standard notions in physics. Precise specification of these distinct properties involved in the common notion of information can allow us to map the relationships between them. Consequently, while information is not in itself the sort of single thing that can play a significant unifying role, analyzing its ambiguities may help us make headway toward that goal.