OntologySummit2013 Track C: Building Ontologies to Meet Evaluation Criteria    (3K9J)

Track Co-champions: MatthewWest & MikeBennett    (3KAL)

Background    (3K9K)

There are two approaches that can be taken to assure the quality of an ontology:    (3K96)

1. Measure the quality of the result against the requirements that it should meet.    (3K9D)

2. Use a process or methodology which will ensure the quality of the resultant ontology.    (3K97)

If you wait until the end of ontology development to measure quality, the costs of correcting any errors are likely to be high. Therefore, using a process or methodology that builds quality into an ontology can have significant benefits. At present, however, it is unclear whether there is any process or methodology that, if followed, is sufficient to guarantee the quality of the resulting ontology, and most of those that do exist are relatively informal and tend to require expert support.    (3K9I)

A further consideration in evaluating ontologies is the different scenarios in which they are used. For example, one ontology might be used as a formal conceptual model to inform development, while another might be used within an ontology-based application. Both the evaluation criteria and the development methodologies employed may vary widely between such scenarios.    (3K98)

Mission    (3K9L)

To investigate the state of the art in ontology development methodologies, including key achievements and key gaps that currently exist.    (3K9A)

Objectives    (3K9M)

1. Examine the explicit and implicit methodologies that are known to exist.    (3K9E)

2. Understand the role that upper ontologies play in ontology development methodologies.    (3K9F)

3. Understand the role of ontological patterns in ontology development methodologies.    (3K9G)

4. Identify how to apply, within the applicable development methodologies, the intrinsic and extrinsic aspects of ontology evaluation identified by the other tracks.    (3K9H)

5. Identify how to frame the applicable ontology development methodologies within the frameworks of established quality assurance regimes (such as ISO 9000 and CMMI) for industrial applications.    (3K9C)


Synthesis & Track Input to the Communique    (3QNX)

see: http://ontolog.cim3.net/forum/ontology-summit/2013-04/msg00000.html    (3QNY)

Please find, below, the initial input from Track C for the Summit Communique.    (3V5K)

For section B: Introduction    (3V5L)

This section should answer mainly these questions:    (3V5M)

(1) Why is ontology evaluation important?    (3V5N)

(2) What is the scope of this document?    (3V5R)

In the communique we focus on the evaluation of ontologies under the following aspects:    (3V5S)

For section C: The State of the Art of Ontology Evaluation    (3V5X)

This section should cover these topics:    (3V5Y)

Track C noted that for integrating ontologies, consistency was a critical property. Achieving consistency across large and potentially geographically and culturally diverse development and maintenance teams was a particular challenge in methodology development.    (3V61)
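As a minimal illustration of how one such property can be checked mechanically, the sketch below (in Python, using the owlready2 library and its bundled HermiT reasoner) tests a merged ontology for logical consistency. The file name merged.owl is only an assumption for the example, not a reference to any actual Track C artefact.

    # Minimal sketch: checking logical consistency of a (hypothetical) merged
    # ontology with owlready2 and the bundled HermiT reasoner.
    from owlready2 import (get_ontology, sync_reasoner, default_world,
                           OwlReadyInconsistentOntologyError)

    onto = get_ontology("file://merged.owl").load()   # file name is illustrative

    try:
        with onto:
            sync_reasoner()   # invokes HermiT; raises if the ontology is inconsistent
    except OwlReadyInconsistentOntologyError:
        print("The merged ontology is logically inconsistent.")
    else:
        # Unsatisfiable classes can reveal local modelling clashes even when
        # the ontology as a whole remains consistent.
        for cls in default_world.inconsistent_classes():
            print("Unsatisfiable class:", cls)

Automated checks of this kind address only the logical aspect of consistency; terminological and modelling-style consistency across distributed teams still requires methodological and review support.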

The development process for an ontology needs to have a number of stages, just as a data model does in a traditional information systems development process. Similarly, requirements need to be identified at several levels: starting with the capabilities of the overall system of which the ontology is a component, to the capabilities of the ontology itself in that setting, to high-level requirements such as consistency, down to detailed requirements such as conformance to naming standards. Ontology development then needs to go through matching stages, equivalent to conceptual, logical, and physical data model development in information systems. There are architectural decisions to be made in terms of the ontological commitments the ontology needs to make and does make, and there are choices of ontology language and implementation environment. There is little evidence of this in current practice, where ontology development often seems to start with someone simply writing some OWL or CL.    (3V63)
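To make the notion of a detailed, physical-level requirement concrete, the following sketch checks one hypothetical naming standard (class names in CamelCase) over an OWL/RDF file using the rdflib Python library. The file name ontology.ttl and the naming rule itself are assumptions made for illustration, not requirements stated by the Summit.

    # Minimal sketch: checking a detailed requirement (CamelCase class names)
    # against an ontology at the physical (OWL/RDF) level using rdflib.
    import re
    from rdflib import Graph, RDF, OWL, URIRef

    CAMEL_CASE = re.compile(r"^[A-Z][A-Za-z0-9]*$")   # illustrative naming rule

    g = Graph()
    g.parse("ontology.ttl", format="turtle")          # file name is illustrative

    for cls in g.subjects(RDF.type, OWL.Class):
        if isinstance(cls, URIRef):                   # skip blank-node class expressions
            local_name = str(cls).split("#")[-1].split("/")[-1]
            if not CAMEL_CASE.match(local_name):
                print("Naming-standard violation:", cls)

Checks like this belong at the physical end of the pipeline; the earlier conceptual- and logical-level requirements still depend on human review and on the architectural decisions described above.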

There is little or no integrated tool support for multilevel/multistage ontology development, beyond a number of tools that directly support the development of ontologies at the physical level.    (3V65)

For section D: Future Steps    (3V66)

What needs to be addressed in order to improve the situation for ontology evaluation and thus, indirectly, the quality of the ontologies in use? This includes theoretical contributions (e.g., a better understanding of ontology characteristics or the development of better metrics) as well as filling gaps in tool support. [all tracks]    (3V67)


 --
 maintained by the Track-C champions: MatthewWest & MikeBennett ... please do not edit    (3K9T)