OntologySummit2013 Track C: Building Ontologies to Meet Evaluation Criteria (3K9J)
Track Co-champions: MatthewWest & MikeBennett (3KAL)
Background (3K9K)
There are two approaches that can be taken to assuring the quality of an ontology: (3K96)
1. Measure the quality of the result against the requirements that it should meet. (3K9D)
2. Use a process or methodology which will ensure the quality of the resultant ontology. (3K97)
If you wait until the end of ontology development to measure quality, the cost of correcting any errors is likely to be high. Using a process or methodology that builds quality into an ontology can therefore have significant benefits. At present, however, it is unclear whether any process or methodology, if followed, is sufficient to guarantee the quality of the resulting ontology, and most of those that do exist are relatively informal and tend to require expert support. (3K9I)
A consideration in evaluating ontologies is the different scenarios in which they are used. For example, one might be used as a formal conceptual model to inform development, while another might be used in an ontology-based application. Both the evaluation criteria and the development methodologies employed may vary widely between such scenarios. (3K98)
Mission (3K9L)
To investigate the state of the art in ontology development methodologies, including key achievements and key gaps that currently exist. (3K9A)
Objectives (3K9M)
1. Examine the explicit and implicit methodologies that are known to exist. (3K9E)
2. Understand the role that upper ontologies play in ontology development methodologies. (3K9F)
3. Understand the role of ontological patterns in ontology development methodologies. (3K9G)
4. Identify how to apply the intrinsic and extrinsic aspects of ontology evaluation identified by the other tracks, within the applicable development methodologies. (3K9H)
5. Identify how to frame the applicable ontology development methodologies within the frameworks of established quality assurance regimes (such as ISO 9000 and CMMI) for industrial applications. (3K9C)
Synthesis & Track Input to the Communique (3QNX)
see: http://ontolog.cim3.net/forum/ontology-summit/2013-04/msg00000.html (3QNY)
Please find, below, the initial input from Track C for the Summit Communique. (3V5K)
For section B: Introduction (3V5L)
This section should answer mainly these questions: (3V5M)
(1) Why is ontology evaluation important? (3V5N)
- Establishing requirements, agreed between the users and developers of an ontology, that the ontology must satisfy in order to meet the needs of its application means that those developing the ontology have a better chance of meeting those requirements (you cannot fail to meet unstated requirements). (3V5O)
- Confirming that an ontology meets its requirements should be part of its acceptance in a wider systems development context. There may be several stages of development and maintenance, with different levels of requirements at different stages. (3V5P)
- When looking to reuse rather than reinvent an ontology, an evaluation of the ontology in terms of the requirements it meets will make it easier to identify an ontology that may appropriately be reused, in whole or in part, for some other purpose. (3V5Q)
(2) What is the scope of this document? (3V5R)
We focus in the communique on the evaluation of ontologies under the following aspects: (3V5S)
- Is the domain represented appropriately (given the requirements of the IT system)? (3V5T)
- Is the ontology human-intelligible? (3V5U)
- Is the ontology maintainable? (3V5V)
- Does the query/reasoning capability and performance meet the requirements of the IT system? (3V5W)
For section C: The State of the Art of Ontology Evaluation (3V5X)
This section should cover these topics: (3V5Y)
- (1) The terminological distinctions that we use in the rest of the text. [all tracks] (3V5Z)
- (2) What are the desirable characteristics of ontologies and how are they measured? For each of the main kinds of ontology evaluation, it should highlight desirable characteristics of ontologies (e.g., reusability) and measurable metrics (e.g., natural language definitions of classes and relations) linked to them. This communique should not strive for an exhaustive list, but should focus on the most important characteristics. [track A, track B] (3V60)
Track C noted that for integrating ontologies, consistency was a critical property. Achieving consistency across large and potentially geographically and culturally diverse development and maintenance teams was a particular challenge in methodology development. (3V61)
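As a rough illustration of how such a consistency check can be automated, the sketch below runs a description-logic reasoner over an integrated OWL ontology and reports unsatisfiable classes, a common symptom of conflicting axioms introduced when independently developed modules are combined. It is only an illustration, assuming Python with the owlready2 package, a Java runtime for the bundled HermiT reasoner, and a hypothetical local file integrated.owl; it is not tied to any particular methodology.

    # Minimal sketch, assuming owlready2 and a Java runtime are installed,
    # and a hypothetical local ontology file "integrated.owl".
    from owlready2 import (get_ontology, sync_reasoner, default_world,
                           OwlReadyInconsistentOntologyError)

    onto = get_ontology("file:///path/to/integrated.owl").load()

    try:
        with onto:
            sync_reasoner()  # run the bundled HermiT reasoner
    except OwlReadyInconsistentOntologyError:
        print("The integrated ontology is logically inconsistent.")
    else:
        # Classes inferred to be equivalent to owl:Nothing are unsatisfiable,
        # which usually points to conflicting axioms from different sources.
        for cls in default_world.inconsistent_classes():
            print("Unsatisfiable class:", cls)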
- (3) What best practices should one adopt (across the whole ontology life cycle) to ensure that ontologies have the desired characteristics identified in C-2? Ideally this section should be organized by the characteristics mentioned in C-2; at a minimum there needs to be a clear correlation between the desirable characteristics and best practices. [track C] (3V62)
The development process for an ontology needs to have a number of stages, just like the data model in a traditional information systems development process. Requirements likewise need to be identified at a number of levels, starting with the capabilities of the overall system of which the ontology is a component, then the capabilities of the ontology itself in that setting, then high-level requirements, such as consistency, down to detailed requirements, such as conforming to naming standards. Ontology development needs to go through matching stages, equivalent to conceptual, logical, and physical data model development in information systems. There are architectural decisions to be made about the ontological commitments the ontology needs to make and does make, and there are choices of ontology language and implementation environment. There is little evidence of this in current practice, where ontology development seems to start with someone writing some OWL or CL. (3V63)
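Detailed requirements of the kind just mentioned, such as every class carrying a name and a natural-language definition, can often be checked mechanically. The sketch below is only an illustration of such a check, assuming Python with the rdflib package and a hypothetical Turtle file my-ontology.ttl; the choice of skos:definition or rdfs:comment as the place where definitions are recorded is likewise an assumption, not a standard.

    # Minimal sketch, assuming rdflib is installed and a hypothetical
    # Turtle file "my-ontology.ttl" containing the ontology under review.
    from rdflib import Graph, Namespace, URIRef, RDF, RDFS, OWL

    SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")

    g = Graph()
    g.parse("my-ontology.ttl", format="turtle")

    for cls in g.subjects(RDF.type, OWL.Class):
        if not isinstance(cls, URIRef):
            continue  # skip anonymous class expressions
        has_label = any(True for _ in g.objects(cls, RDFS.label))
        has_definition = (any(True for _ in g.objects(cls, SKOS.definition))
                          or any(True for _ in g.objects(cls, RDFS.comment)))
        if not (has_label and has_definition):
            print("Missing label or definition:", cls)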
- (4) What tool support is currently available to support the evaluation of the characteristics identified in C-2 and the best practices identified in C-3? Again, the point is not to give an exhaustive list of all available tools, but to draw an explicit connection between the results of the other tracks and the findings of the tool track. [track D] (3V64)
There is little or no integrated tool support for multilevel/multistage ontology development, beyond some tools that directly support the development of ontologies at the physical level. (3V65)
For section D: Future Steps (3V66)
What needs to be addressed in order to improve the situation for ontology evaluation and thus, indirectly, improve the quality of the ontologies out there? This includes theoretical contributions (e.g., a better understanding of the desirable characteristics or the development of better metrics) as well as addressing the lack of tool support. [all tracks] (3V67)
- A better understanding of the relationships between requirements at different levels and how low level requirements support higher level requirements. (3V68)
- Ontology development methodologies that align with information systems development and recognize similar stages, with distinct conceptual, logical, and physical phases, so that ontology development does not start at the physical level with the choice of an implementation language. (3V69)
- A clearer understanding of the architecture of ontology development and the different aspects of architecture that are relevant, from ontological commitments to language choices. (3V6A)
-- maintained by the Track-C champions: MatthewWest & MikeBennett ... please do not edit (3K9T)