This is a recurring issue of the Ontology Summit: the topic cannot be broken down into neatly disjoint subproblems. However, in the past that has not turned out to be a problem as long as the people who run the different tracks work together. The main purpose of these tracks is to give some structure to the discussions and to focus attention on a given aspect of the topic of the Summit. As long as they address identifiably different questions, it does not matter whether the boundaries are fuzzy. And I think this is the case here -- at least in my interpretation of Michael's email -- because the tracks address different questions:
The Dimension-track answers: What kind of activities are performed under the label "Ontology Evaluation"?
The Application Framework-track answers: Given that my ontology performs function X in my application, which kinds of ontology evaluation techniques are relevant to me?
The Requirements-track answers: Now that I know what kinds of evaluation techniques are relevant to me, how do I capture the specific requirements of my application in a way that supports these evaluation techniques?
On Dec 11, 2012, at 5:55 PM, Todd J Schneider wrote:
Items 1, 2, and 4 of your suggested tracks are, or should be,
intimately related. In the context of systems development the
requirements (hence uses) are the driver.
Michael Gruninger wrote on 12/11/2012 12:59 PM:
From: Michael Gruninger <gruninger@xxxxxxxxxxxxxxx>
To: Ontology Summit 2013 <ontology-summit@xxxxxxxxxxxxxxxx>
Date: 12/11/2012 12:59 PM
Subject: [ontology-summit] Potential Tracks for Ontology Summit 2013
Sent by: ontology-summit-bounces@xxxxxxxxxxxxxxxx
Hello everyone, in preparation for Thursday's Pre-launch of Ontology Summit 2013,
here are some proposals for potential tracks.
One objective of the session will be to decide on the tracks and
specific topics that will be addressed within the tracks.
Please note that this list of potential tracks is meant to start the
discussion -- if you have other ideas for tracks, please join us
on Thursday's call!
1. Dimensions of Ontology Evaluation
- addresses notions of verification, validation, quality, ranking, ...
2. Evaluation and the Ontology Application Framework
- looks at the problem of ontology evaluation from the perspective of
the applications that use the ontologies. This Framework was one of the
outcomes of Ontology Summit 2011
3. Best Practices in Ontological Analysis
- focuses on evaluation based on the ontology itself, using
logical criteria (consistency, completeness, modularity) and
ontological analysis techniques (e.g., OntoClean).
4. Requirements for Ontologies
- how do we specify the requirements against which we evaluate ontologies?
5. Environments for Developing and Evaluating Ontologies
- what are best practices for evaluation that we can adapt from software
engineering, particularly distributed open-source software development?
Michael Gruninger and Matthew West
co-chairs of Ontology Summit 2013
Msg Archives: http://ontolog.cim3.net/forum/ontology-summit/
Community Files: http://ontolog.cim3.net/file/work/OntologySummit2013/
Community Wiki: http://ontolog.cim3.net/cgi-bin/wiki.pl?OntologySummit2013
Community Portal: http://ontolog.cim3.net/wiki/