THE RELEVANCE OF PHILOSOPHICAL ONTOLOGY TO INFORMATION AND COMPUTER SCIENCE Barry Smith Preprint version of a paper to appear in Ruth Hagengruber and Uwe Riss (eds.), Philosophy, Computing and Information Science, London: Pickering and Chatto, 2014, 75-83 1 ARTIFICIAL COMPANIONS Ontology as a branch of philosophy is the science of what is, of the kinds and structures of objects, properties, events, processes and relations in every area of reality. The earliest use of the term 'ontology' (or 'ontologia') seems to have been in 1606 in the book Ogdoas Scholastica by the German Protestant scholastic Jacob Lorhard. For Lorhard, as for many subsequent philosophers, 'ontology' is a synonym of 'metaphysics' (a label meaning literally: 'what comes after the Physics'), a term used by early students of Aristotle to refer to what Aristotle himself called 'first philosophy'. Some philosophers use 'ontology' and 'metaphysics' to refer to two distinct, though interrelated, disciplines: the former to refer to the study of what might exist; the latter to the study of which of the various alternative possible ontologies is in fact true of reality.1 The term – and the philosophical discipline of ontology – has enjoyed a checkered history since 1606, with a significant expansion, and consolidation, in recent decades (see Figure 1, which records references to 'ontology' in items indexed by Google). We shall not discuss here the successive rises and falls in philosophical acceptance of the term, but rather focus on certain phases in the history of recent philosophy which are most relevant to the consideration of its recent advance, and increased acceptance, also outside the discipline of philosophy. Figure 1: 'Ontology' chart from Google history (http://tiny.cc/vZszP, last accessed June 10, 2010) 1 R. Ingarden, Time and Modes of Being, translated by H. Michejda (Springfield: Charles Thomas, 1964). 
2 VARIETIES OF PHILOSOPHICAL ONTOLOGY For the philosophical ontologist, ontology seeks to provide a definitive and exhaustive classification of entities in all spheres of being. The classification should be definitive in the sense that it can serve as an answer to such questions as: What classes of entities are needed for a complete description and explanation of all the goings-on in the universe? Or: What classes of entities are needed to give an account of what makes true all truths? It should be exhaustive in the sense that all types of entities should be included in the classification, and it should include also all the types of relations by which entities are tied together to form larger wholes. Different schools of philosophy offer different approaches to the provision of such classifications. One large division is that between what we might call substantialists and fluxists, which is to say between those who conceive ontology as a substance- or thing- (or continuant-) based discipline and those who favour an ontology centred on events or processes (or occurrents). Another large division is between what we might call adequatists and reductionists. Adequatists seek a taxonomy of the entities in reality at all levels of aggregation, from the microphysical to the cosmological, and including also the middle world (the mesocosmos) of human-scale entities (carpets, caves, caravans, carpal tunnel syndromes) in between. Reductionists see reality in terms of some one privileged level of existents, normally the smallest. They thereby seek to establish the 'ultimate furniture of the universe' by decomposing reality into its simplest constituents, or they seek to 'reduce' in some other way the apparent variety of types of entities existing in reality, often by providing recipes for logically translating assertions putatively about entities at higher levels into assertions allowable from the reductionist perspective. 
In the work of adequatist philosophical ontologists such as Aristotle, Ingarden,2 Johansson,3 Chisholm,4 and Lowe,5 the proposed taxonomies are in many ways comparable to those produced and used in empirical sciences such as biology or chemistry, though they are of course radically more general than these. Adequatism – which is the view defended also by the author of this paper – transcends the dichotomy between substantialism and fluxism, since its adherents accept categories of both continuants and occurrents. 2 R. Ingarden, Time and Modes of Being, translated by H. Michejda (Springfield: Charles Thomas, 1964). 3 I. Johansson, Ontological Investigations. An Inquiry into the Categories of Nature, Man and Society (New York, London: Routledge, 1989). 4 R. Chisholm, A Realistic Theory of Categories: An Essay on Ontology (Cambridge: Cambridge University Press, 1996). 5 E. Lowe, The Four-Category Ontology: A Metaphysical Foundation for Natural Science (Oxford: Oxford University Press, 2006). Ontology, for the adequatist, is a descriptive enterprise. It is distinguished from the special sciences not only in its radical generality but also in its goal: the ontologist seeks not prediction, but rather description of a sort that is based on adequate classification. Adequatists study the totality of those objects, properties, processes and relations that make up the world on different levels of granularity, whose different parts and moments are studied by the different scientific disciplines – often, as in the case of all the adequatists listed above, with a goal of providing the philosophico-ontological tools for the unification or integration of science. 3 METHODS OF ONTOLOGY The methods of ontology in philosophical contexts include the development of theories of wider or narrower scope and the refinement of such theories by measuring them either against difficult counterexamples or against the results of science. These methods were familiar already to Aristotle himself. 
In the course of the twentieth century a range of new formal tools became available to ontologists for the development, expression and refinement of their theories. Ontologists nowadays have a choice of formal frameworks (deriving from algebra, category theory, mereology, set theory, topology) in terms of which their theories can be formulated. These new formal tools, along with the languages of formal logic, allow philosophers to express intuitive principles and definitions in clear and rigorous fashion, and, through the application of the methods of formal semantics, they can allow also for the testing of theories for consistency and completeness. When we examine the work of computational ontologists below, we shall see how they have radicalized this approach, using formal methods as implemented in computers as a principal method of ontology development. 4 THE ROLE OF QUINE Some philosophers have thought that the way to do ontology is exclusively through the investigation of scientific theories. With the work of Quine there arose in this connection a new conception of the proper method of philosophical ontology, according to which the ontologist's task is to establish what kinds of entities scientists are committed to in their theorizing.6 Quine thereby took ontology seriously. His aim was to use science for ontological purposes, which means: to find the ontology in scientific theories. Ontology is for him a network of claims (a web of beliefs) about what exists, deriving from the natural sciences. Each natural science has, Quine holds, its own preferred repertoire of types of objects to the existence of which it is committed. Ontology is then not the meta-level study of the ontological commitments or presuppositions embodied in the different natural-scientific theories. Ontology, for Quine, is rather these commitments themselves. 6 W. v. O. Quine, 'On What There Is', reprinted in: From a Logical Point of View (New York: Harper & Row, 1953). 
Quine fixes upon the language of first-order logic as the medium of canonical representation in whose terms these commitments will become manifest. He made this choice not out of dogmatic devotion to some particular favoured syntax, but rather because he holds that the language of first-order logic is the only really clear form of language we have. His so-called 'criterion of ontological commitment' is captured in the slogan: To be is to be the value of a bound variable. This should not be understood as signifying some reductivistic conception of being – as if to exist were a merely logico-linguistic matter – something like a mere façon de parler. Rather it is to be interpreted in practical terms: to determine what the ontological commitments of a scientific theory are, it is necessary to determine the values of the quantified variables used in its canonical (first-order logical) formalization. One problem with this approach is that the objects of scientific theories are discipline-specific. How, then, are we to approach the issue of the compatibility of these different sets of ontological commitments? Various solutions have been suggested for this problem, including reductionistic solutions, based on the conception of a future perfected state of science captured by a single logical theory and thus marked by a single, consistent and exhaustive set of ontological commitments. At the opposite extreme is a relativistic approach, which renounces the very project of a single unitary scientific world view (and which might in principle include into the mix the ontological commitments of non-scientific world views, as embraced for example by different religious cultures). The 'external' question of the relations between objects belonging to different disciplinary domains falls out of bounds for an approach along these lines. 
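Quine's criterion can be illustrated with a toy regimentation (the example sentence and its predicates are mine, introduced for illustration only, not Quine's own):

```latex
% A theory T whose canonical first-order formalization includes
%
\[
  \exists x \,\bigl(\mathrm{Electron}(x) \wedge \mathrm{Charged}(x)\bigr)
\]
%
% is, on Quine's criterion, ontologically committed to electrons:
% for the sentence to be true, the bound variable x must take some
% electron as its value. A theory in which talk of electrons can be
% paraphrased away before regimentation carries no such commitment.
```

The practical procedure, then, is to regiment the theory in first-order logic and read its commitments off the ranges of its quantified variables.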
The adequatist approach to ontology stands in contrast to both of these perspectives, holding that the issue of how different scientific theories (or how the objects described by such theories) relate to each other is of vital importance, and can be resolved in a way which does justice to the sciences themselves. For Quine himself, the best we can achieve in ontology lies in the quantified statements of particular theories, theories supported by the best evidence we can muster. We have no extra-scientific way to rise above the particular theories we have and to harmonize and unify their respective claims. This implies also that philosophers lack the authority to interfere with the claims and methods and empirical data of scientists. In the current age of information-driven science, however, tasks of the sort which were in earlier epochs addressed by philosophical ontologists, and which in the era of Quine and Carnap (and of their precursors in the Vienna Circle) were seen as falling in the province of logicians, are now being addressed by computer scientists. 5 ON THE WAY TO COMPUTATIONAL ONTOLOGY As scientists must increasingly rely on the use of computer systems to absorb the vast quantities of information with which they are confronted, and as computers are being applied to the storage and integration of multiple different kinds of scientific data, computer scientists are being called upon to address problems which were earlier addressed by those with philosophical training. 
In a development hardly noticed by philosophers, the term 'ontology' has thereby gained currency in the field of computer and information science, initially through the avenue of Quine, whose work on ontological commitment attracted the attention of researchers in artificial intelligence such as John McCarthy7 and Patrick Hayes, and of figures from the programming world such as Peter Naur.8 As McCarthy expressed it in 1980, citing Quine in his use of 'ontology', builders of logic-based intelligent systems must first 'list everything that exists, building an ontology of our world'. In 1999, a new wave of computationally oriented ontology developments began in the world of bioinformatics with the creation of the Gene Ontology (GO).9 The GO addresses the problem of data integration for biologists working on so-called 'model organisms' – genetically tailored mice or fish or other organisms – which are used in experiments designed to yield results which will bring consequences for our understanding of human health and of the effects of different kinds of treatment. The problem faced by the GO's authors turned on the fact that each group of model organism researchers had developed its own idiosyncratic vocabularies for describing the phenomena revealed in their respective bodies of data. Moreover, these vocabularies were in turn not consistent with the vocabularies used to describe the human health phenomena to the understanding of which their research was directed. Different groups of researchers used identical labels but with different meanings, or they expressed the same meaning using different names. With the explosive growth of bioinformatics, ever more diverse groups became involved in sharing and translating ever more diverse varieties of information at all levels, from molecular pathways to populations of organisms, and the problems standing in the way of putting this information together within a single system began to increase geometrically. 7 J. McCarthy, 'Circumscription – A Form of Non-Monotonic Reasoning', Artificial Intelligence, 13:5 (1980), pp. 27–39. 8 P. Hayes, 'The Second Naive Physics Manifesto', in J. Hobbs and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985), pp. 1–36, and P. Hayes, 'Naïve Physics I: Ontology for Liquids', in J. Hobbs and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985), pp. 71–108. 9 M. Ashburner, et al., 'Gene Ontology: tool for the unification of biology', Nature Genetics 25 (2000), pp. 25–9. By providing a solution to these problems in the form of a common, species-neutral, controlled vocabulary covering the entire spectrum of biological processes, the GO has proved tremendously successful (see Figure 2, from Bodenreider and Stevens10) and is almost certainly the first real demonstration case of the advantages brought by ontological technology in supporting the integration of data for scientific purposes. Figure 2: Number of articles on ontology or ontologies in PubMed/MEDLINE (GO in blue, other ontologies in yellow) As the GO community has discovered, however, the success of an ambitious ontology initiative along these lines faces a constant need to identify and resolve the inconsistencies which arise as its terminological resources are expanded through the contributions of multiple groups engaged in different kinds of biological research. Initially, such incompatibilities were resolved by the GO – and by the authors of the new ontologies which had grown up in its wake – on a case-by-case basis. Gradually, however, it came to be recognized in the field of bio-ontologies that the provision of common reference ontologies – effectively, shared taxonomies of entities – might provide significant advantages over such case-by-case resolution. 
An ontology is in this context a dictionary of terms formulated in a canonical syntax and with commonly accepted definitions designed to yield a lexical or taxonomical framework for knowledge-representation which can be shared by different information systems communities. 10 O. Bodenreider, and R. Stevens, 'Bio-ontologies: current trends and future directions', Briefings in Bioinformatics 7:3 (2006), pp. 256–74. More ambitiously, an ontology is a formal theory within which not only definitions but also a supporting framework of axioms is included (perhaps the axioms themselves provide implicit definitions of the terms involved). The methods used in the construction of ontologies thus conceived are derived on the one hand from earlier initiatives in database management systems. But they also include methods similar to those employed in philosophy (as described already in Hayes11), including the methods used by logicians when developing formal semantic theories. 6 UPPER-LEVEL ONTOLOGIES The potential advantages of ontology for the purposes of information management are obvious. Each group of data analysts would need to perform the task of making its terms and relations compatible with those of other such groups only once – by calibrating its results against a single canonical backbone language. If all databases were calibrated in terms of just one common ontology (a single consistent, stable and highly expressive set of category labels), then the prospect would arise of leveraging the thousands of person-years of effort that have been invested in creating separate database resources in fields such as biochemistry or computational biology in such a way as to create, in more or less automatic fashion, a single integrated knowledge base. 
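The calibration step just described can be sketched in a few lines of Python. This is a toy illustration of the idea only: the database records and local labels are invented for the example, and the single shared identifier used (GO:0006915, the GO term for the apoptotic process) stands in for a canonical backbone vocabulary.

```python
# Toy sketch of calibration against a shared ontology backbone.
# Two research groups annotate the same phenomenon under different
# local labels; once each group maps its vocabulary to a canonical
# ontology ID, their records can be pooled without further negotiation.

# Each group performs its calibration exactly once:
MOUSE_DB_TO_ONTOLOGY = {"apoptosis (mm)": "GO:0006915"}
FISH_DB_TO_ONTOLOGY = {"programmed cell death, type I": "GO:0006915"}

def calibrate(records, mapping):
    """Replace a database's local labels with canonical ontology IDs."""
    return [(gene, mapping[label]) for gene, label in records]

mouse_data = [("Bcl2", "apoptosis (mm)")]
fish_data = [("bcl2a", "programmed cell death, type I")]

# After calibration, integration across databases is trivial:
integrated = (calibrate(mouse_data, MOUSE_DB_TO_ONTOLOGY)
              + calibrate(fish_data, FISH_DB_TO_ONTOLOGY))

by_term = {}
for gene, term in integrated:
    by_term.setdefault(term, []).append(gene)

print(by_term)  # {'GO:0006915': ['Bcl2', 'bcl2a']}
```

The design point is that each group maps its terms to the backbone once, rather than negotiating pairwise mappings with every other group.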
The obstacles standing in the way of the construction of a single shared ontology in the sense described are unfortunately prodigious, ranging from technical difficulties in the choice of an appropriate logical framework,12 through difficulties in the coordination of different ontology-authoring communities, to difficulties which flow from the entrenched tendencies of many computer scientist communities to react negatively to the idea of reusing already created computational artefacts and to prefer instead to create something new for each successive customer.13 Added to this are the difficulties which arise at the level of adoption. To be widely accepted an ontology must be neutral as between different data communities, and there is, as experience has shown, a formidable trade-off between this constraint of neutrality and the requirement that an ontology be maximally wide-ranging and expressively powerful – that it should contain canonical definitions for the largest possible number of terms. One solution to this trade-off problem is the idea of a top-level ontology, which would confine itself to the specification of such highly general (domain-independent) categories as: time, space, inherence, instantiation, identity, measure, quantity, functional dependence, process, event, attribute, boundary, and so on. 11 P. Hayes, 'The Second Naive Physics Manifesto', in J. Hobbs and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985), pp. 1–36. 12 S. Schulz, H. Stenzhorn, M. Boeker and B. Smith, 'Strengths and Limitations of Formal Ontologies in the Biomedical Domain', Electronic Journal of Communication, Information and Innovation in Health, Special Issue on Ontologies, Semantic Web and Health, 3:1 (2009), pp. 31–45. 13 B. Smith, 'Ontology (Science)', in C. Eschenbach and M. Gruninger (eds.), Formal Ontology in Information Systems. Proceedings of the Fifth International Conference (Amsterdam: IOS Press, 2008), pp. 21–35. 
The top-level ontology would then be designed to serve as a common neutral backbone, which would be supplemented by the work of ontologists working in more specialized domains on, for example, ontologies of geography, or medicine, or ecology, or law. An ambitious strategy along these lines is currently being realized in the domains of biology and biomedicine,14 and it is marked especially by the adoption of a common top-level ontology of relations, which provides the common formal glue to link together ontologies created by different communities of researchers.15 7 SOME CRITICAL REMARKS ON CONCEPTUALIZATIONS Drawing on the technical definition of 'conceptualization' introduced by Genesereth and Nilsson in their Logical Foundations of Artificial Intelligence,16 Tom Gruber introduced in 1993 an influential definition of 'ontology' as meaning: 'the specification of a conceptualization'.17 One result of Gruber's work was that it became fashionable in computer circles to conceive of 'ontology' as meaning just: 'conceptual model'. Applied Ontology, the principal journal of the ontology engineering field, accordingly has the subtitle An Interdisciplinary Journal of Ontological Analysis and Conceptual Modeling. For Gruber, 'A conceptualization is an abstract, simplified view of the world that we wish to represent for some purpose. Every knowledge base, knowledge-based system, or knowledge-level agent is committed to some conceptualization, explicitly or implicitly.'18 The idea, here, is as follows. As we engage with the world from day to day we use information systems, databases, specialized languages, and scientific instruments. We also buy insurance, negotiate traffic, invest in bond derivatives, make supplications to the gods of our ancestors. Each of these ways of behaving involves, we can say, a certain conceptualization. 
What this means is that it involves a system of concepts in terms of which the corresponding universe of discourse is divided up into objects, processes and relations in different sorts of ways. Thus in a religious ritual setting we might use concepts such as salvation and purification; in a scientific setting we might use concepts such as virus and nitrous oxide; in a story-telling setting we might use concepts such as leprechaun and dragon. Such conceptualizations are often tacit; that is, they are often not thematized in any systematic way. But tools can be developed to specify and to clarify the concepts involved and to establish their logical structure, and thus to render explicit the underlying taxonomy. An 'ontology' in Gruber's sense is then the result of such clarification employing appropriate logical tools. 14 B. Smith, M. Ashburner, C. Rosse, J. Bard, W. Bug, W. Ceusters, L. Goldberg, K. Eilbeck, A. Ireland, C. Mungall, The OBI Consortium, N. Leontis, P. Rocca-Serra, A. Ruttenberg, S. Sansone, R. Scheuermann, N. Shah, P. Whetzel and S. Lewis, 'The OBO Foundry: Coordinated Evolution of Ontologies to Support Biomedical Data Integration', Nature Biotechnology 25:11 (2007), pp. 1251–5. 15 Cf. B. Smith, W. Ceusters, B. Klagges, J. Köhler, A. Kumar, J. Lomax, C. Mungall, F. Neuhaus, A. Rector and C. Rosse, 'Relations in Biomedical Ontologies', Genome Biology 6:5 (2005), R46. 16 M. Genesereth and N. Nilsson, Logical Foundations of Artificial Intelligence (Los Altos, California: Morgan Kaufmann, 1987). 17 T. Gruber, 'A Translation Approach to Portable Ontology Specifications', Knowledge Acquisition 5 (1993), pp. 199–220. 18 T. Gruber, 'Toward Principles for the Design of Ontologies Used for Knowledge Sharing', International Journal of Human and Computer Studies 43:5/6 (1995), pp. 907–28, on p. 908. 
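The kind of clarification described here can be given a minimal computational rendering. The sketch below is my own toy illustration, with invented concept names and subsumption links: once a conceptualization's tacit subsumption relations are written down explicitly, its underlying taxonomy can be traversed and inspected by machine.

```python
# Toy sketch: rendering a tacit taxonomy explicit as is_a links.
# The concept names and links are invented for illustration.
IS_A = {
    "leprechaun": "mythical being",
    "dragon": "mythical being",
    "mythical being": "being",
    "virus": "biological entity",
    "biological entity": "being",
}

def ancestors(concept):
    """Walk the is_a chain from a concept up to the taxonomy's root."""
    chain = []
    while concept in IS_A:
        concept = IS_A[concept]
        chain.append(concept)
    return chain

print(ancestors("leprechaun"))  # ['mythical being', 'being']
print(ancestors("virus"))       # ['biological entity', 'being']
```

Nothing here asks whether leprechauns exist; the sketch merely makes explicit the structure of one conceptualization, which is exactly the pragmatic stance discussed next.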
Ontology, for Gruber and for the many computer scientists who have followed in his wake, thus concerns itself not with the question of ontological realism, that is, with the question of whether its conceptualizations are true of some independently existing reality. Rather, it is a strictly pragmatic enterprise. It starts with conceptualizations, and goes from there to the description of corresponding domains of objects – often themselves confusingly referred to as 'concepts' – which are not real-world entities but rather abstract nodes in simplified computer models created for specific application purposes. Against this background, the project of developing a top-level ontology, a common ontological backbone, begins to seem rather like the attempt to find some highest common denominator that would be shared in common by a plurality of true and false theories. Seen in this light, the principal reason for the failure of so many attempts to construct shared top-level ontologies lies precisely in the fact that these attempts were made on the basis of a methodology which treated all application domains on an equal footing. It thereby overlooked the degree to which the different conceptualizations which serve as inputs to ontology are likely to be not only of wildly differing quality but also mutually inconsistent. The Open Biomedical Ontologies (OBO) Foundry,19 which is one promising attempt to create an interoperable suite of ontologies sharing a common top-level ontology, seems to be succeeding in this respect primarily because it is restricted to domains where an independently existing reality – of biological and biomedical entities studied by science – serves as a constraint on the content of the ontologies included within the OBO framework. Ontology for the OBO Foundry, in other words, is not a matter of conceptual modelling.20 19 B. Smith, Ashburner, M., Rosse, C., Bard, J., 
Bug, W., Ceusters, W., Goldberg, L., Eilbeck, K., Ireland, A., Mungall, C., The OBI Consortium, Leontis, N., Rocca-Serra, P., Ruttenberg, A., Sansone, S., Scheuermann, R., Shah, N., Whetzel, P., Lewis, S., 'The OBO Foundry: Coordinated Evolution of Ontologies to Support Biomedical Data Integration', Nature Biotechnology 25:11 (2007), pp. 1251–5. 20 Portions of this essay are based on material taken from my chapter 'Ontology', in L. Floridi (ed.), Blackwell Guide to the Philosophy of Computing and Information (Oxford: Blackwell, 2003), pp. 155–66, and from H. Stenzhorn, S. Schulz, M. Boeker and B. Smith, 'Adapting Clinical Ontologies in Real-World Environments', Journal of Universal Computer Science, 14:22 (2008), pp. 3767–80. REFERENCES 1. Ashburner, M. et al., 'Gene Ontology: tool for the unification of biology', Nature Genetics 25 (2000), pp. 25–9. 2. Bodenreider, O., and R. Stevens, 'Bio-ontologies: current trends and future directions', Briefings in Bioinformatics 7:3 (2006), pp. 256–74. 3. Chisholm, R., A Realistic Theory of Categories: An Essay on Ontology (Cambridge: Cambridge University Press, 1996). 4. Genesereth, M. and N. Nilsson, Logical Foundations of Artificial Intelligence (Los Altos, California: Morgan Kaufmann, 1987). 5. Gruber, T., 'A Translation Approach to Portable Ontology Specifications', Knowledge Acquisition 5 (1993), pp. 199–220. 6. Gruber, T., 'Toward Principles for the Design of Ontologies Used for Knowledge Sharing', International Journal of Human and Computer Studies 43:5/6 (1995), pp. 907–28. 7. Gruber, T., 'What is an Ontology?', at: http://www-ksl.stanford.edu/kst/what-is-an-ontology.html [accessed 20 September 2013]. 8. Guarino, N., 'Formal Ontology, Conceptual Analysis and Knowledge Representation', International Journal of Human-Computer Studies 43 (1995), pp. 625–40. 9. Guarino, N. (ed.), Formal Ontology in Information Systems (Amsterdam, Berlin, Oxford, DC: IOS Press, 1998). 10. 
Hayes, P., 'The Second Naive Physics Manifesto', in J. Hobbs and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985), pp. 1–36. 11. Hayes, P., 'Naïve Physics I: Ontology for Liquids', in J. Hobbs and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985), pp. 71–108. 12. Hobbs, J. and R. Moore (eds.), Formal Theories of the Common-Sense World (Norwood: Ablex, 1985). 13. Ingarden, R., Time and Modes of Being, translated by H. Michejda (Springfield: Charles Thomas, 1964). 14. Johansson, I., Ontological Investigations. An Inquiry into the Categories of Nature, Man and Society (New York, London: Routledge, 1989). 15. Koepsell, D., The Ontology of Cyberspace: Law, Philosophy, and the Future of Intellectual Property (Chicago: Open Court, 2000). 16. Koepsell, D. (ed.), Proceedings of the Buffalo Symposium on Applied Ontology in the Social Sciences, special issue of The American Journal of Economics and Sociology, 58:2 (1999). 17. Lowe, E., The Four-Category Ontology: A Metaphysical Foundation for Natural Science (Oxford: Oxford University Press, 2006). 18. McCarthy, J., 'Circumscription – A Form of Non-Monotonic Reasoning', Artificial Intelligence, 13:5 (1980), pp. 27–39. 19. Mulligan, K., 'Promisings and Other Social Acts: Their Constituents and Structure', in K. Mulligan (ed.), Speech Act and Sachverhalt. Reinach and the Foundations of Realist Phenomenology (Dordrecht, Boston, Lancaster: D. Reidel, 1987), pp. 29–90. 20. Naur, P., 'Programming as Theory Building', Microprocessing and Microprogramming 15 (1985), pp. 254–61. 21. Øhrstrøm, P., J. Andersen and H. Schärfe, 'What Has Happened to Ontology?', in F. Dau, M.-L. Mugnier and G. Stumme (eds.), Conceptual Structures: Common Semantics for Sharing Knowledge. Lecture Notes in Computer Science 3596 (Berlin, Heidelberg: Springer, 2005), pp. 425–38. 22. Øhrstrøm, P., S. Uckelman and H. Schärfe, 'Historical and Conceptual Foundations of Diagrammatical Ontology', in U. 
Priss, S. Polovina and R. Hill (eds.), Conceptual Structures: Knowledge Architectures for Smart Applications, Proceedings of the 15th International Conference on Conceptual Structures (Berlin: Springer, 2007), pp. 374–86. 23. Quine, W. v. O., 'On What There Is', reprinted in: From a Logical Point of View (New York: Harper & Row, 1953). 24. Schulz, S., H. Stenzhorn, M. Boeker and B. Smith, 'Strengths and Limitations of Formal Ontologies in the Biomedical Domain', Electronic Journal of Communication, Information and Innovation in Health, Special Issue on Ontologies, Semantic Web and Health, 3:1 (2009), pp. 31–45. 25. Smith, B., 'Ontology (Science)', in C. Eschenbach and M. Gruninger (eds.), Formal Ontology in Information Systems. Proceedings of the Fifth International Conference (Amsterdam: IOS Press, 2008), pp. 21–35. 26. Smith, B., W. Ceusters, B. Klagges, J. Köhler, A. Kumar, J. Lomax, C. Mungall, F. Neuhaus, A. Rector and C. Rosse, 'Relations in Biomedical Ontologies', Genome Biology 6:5 (2005), R46. 27. Smith, B., Ashburner, M., Rosse, C., Bard, J., Bug, W., Ceusters, W., Goldberg, L., Eilbeck, K., Ireland, A., Mungall, C., The OBI Consortium, Leontis, N., Rocca-Serra, P., Ruttenberg, A., Sansone, S., Scheuermann, R., Shah, N., Whetzel, P., Lewis, S., 'The OBO Foundry: Coordinated Evolution of Ontologies to Support Biomedical Data Integration', Nature Biotechnology 25:11 (2007), pp. 1251–5. 28. Stenzhorn, H., S. Schulz, M. Boeker and B. Smith, 'Adapting Clinical Ontologies in Real-World Environments', Journal of Universal Computer Science 14:22 (2008), pp. 3767–80.