Unfortunately I can't participate in the meeting tomorrow, but, concerning point 3, I would like to include the criteria used to express how well an ontology serves its main function, i.e., making the intended meaning of a vocabulary explicit.
In the past, I have isolated and discussed such criteria: precision, completeness, accuracy. See
In preparation for Thursday's Pre-launch of Ontology Summit 2013,
here are some proposals for potential tracks.
One objective of the session will be to decide on the tracks and
specific topics that will be addressed within the tracks.
Please note that this list of potential tracks is meant to start the
discussion -- if you have other ideas for tracks, please join us
on Thursday's call!
1. Dimensions of Ontology Evaluation
- addresses notions of verification, validation, quality, ranking, ...
2. Evaluation and the Ontology Application Framework
- looks at the problem of ontology evaluation from the perspective of
the applications that use the ontologies. This Framework was one of the
outcomes of Ontology Summit 2011
3. Best Practices in Ontological Analysis
- focuses on evaluating the ontology in its own right, using
logical criteria (consistency, completeness, modularity) and
ontological analysis techniques (e.g. OntoClean).
4. Requirements for Ontologies
- how do we specify the requirements against which we evaluate ontologies?
5. Environments for Developing and Evaluating Ontologies
- what are best practices for evaluation that we can adapt from software
engineering, particularly from distributed open-source software development?
Michael Gruninger and Matthew West
co-chairs of Ontology Summit 2013
Msg Archives: http://ontolog.cim3.net/forum/ontology-summit/
Community Files: http://ontolog.cim3.net/file/work/OntologySummit2013/
Community Wiki: http://ontolog.cim3.net/cgi-bin/wiki.pl?OntologySummit2013
Community Portal: http://ontolog.cim3.net/wiki/