I was about to write almost exactly what Ed wrote below. :-)
There is a strong philosophical *intuition* that a set of OWL or CLIF statements is only a *representation* and that an *ontology* is the information such a
representation expresses. However, like Ed, I don't think that notion of an ontology is scientifically useful. What, exactly, *is* an ontology on this approach? What distinguishes one such ontology from another? What, exactly, is the "expressing" relation between a representation and an ontology so understood? It seems to me that there are no scientifically rigorous answers to these questions.
The only tangible, testable objects we have to work with are the representations themselves. Hence, it seems to me that the cleanest approach (as I've argued before and as Ed also suggests below) is to define any set of statements (perhaps meeting certain minimal conditions) in a given ontology language to be an ontology. We can then easily and cleanly define a number of useful notions for categorization and comparison, e.g., two ontologies are *equivalent* if each can be interpreted in the other, and *inconsistent* with one another if their union is unsatisfiable.
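To make the point concrete, here is a minimal sketch (in Python, purely illustrative) of treating an "ontology" as nothing more than a set of statements and defining comparison notions directly on such sets. The atoms, statement encoding, and function names are my own toy assumptions, not any standard ontology API; equivalence is approximated here by mutual entailment over a fixed propositional vocabulary rather than full relative interpretability.

```python
from itertools import product

# Toy "ontologies": sets of propositional statements, each encoded as a
# function from a truth assignment (dict of atom -> bool) to bool.
# Atom names are hypothetical, chosen only for illustration.
ATOMS = ["Dog", "Mammal", "Animal"]

def satisfiable(statements):
    """Brute force: is there some truth assignment making every statement true?"""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(s(v) for s in statements):
            return True
    return False

def entails(o1, o2):
    """o1 entails o2 iff every model of o1 also satisfies every statement of o2."""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(s(v) for s in o1) and not all(s(v) for s in o2):
            return False
    return True

def equivalent(o1, o2):
    """Mutual entailment: each ontology can be recovered from the other."""
    return entails(o1, o2) and entails(o2, o1)

def inconsistent_together(o1, o2):
    """Two ontologies are jointly inconsistent iff their union has no model."""
    return not satisfiable(o1 | o2)

# Two formulations of "dogs are animals":
O1 = {lambda v: (not v["Dog"]) or v["Mammal"],      # Dog -> Mammal
      lambda v: (not v["Mammal"]) or v["Animal"]}   # Mammal -> Animal
O2 = {lambda v: (not v["Dog"]) or v["Animal"]}      # Dog -> Animal

print(entails(O1, O2))     # True: O1's consequences include O2
print(equivalent(O1, O2))  # False: O2 says nothing about Mammal

# A statement set contradicting O1:
O3 = {lambda v: v["Dog"], lambda v: not v["Dog"]}
print(inconsistent_together(O1, O3))  # True: the union has no model
```

The point of the sketch is that once ontologies just *are* sets of statements, these comparison relations are ordinary, mechanically checkable definitions; nothing analogous is available for an "ontology" conceived as the intangible information behind a representation.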
On Jan 11, 2011, at 5:49 PM, Ed Barkmeyer wrote:
> Bill Burkett wrote:
>> Chris, Ed:
>> I disagree that an ontology is (1) an artifact and (2) is something that can be engineered. (Thus I support Peter's question of whether "ontology engineer" is a useful term.) It is the *representation/manifestation* of an ontology that is the artifact that is created - it's the OWL representation (or CL representation or whatever) that is the artifact.
> This is a subject that has been discussed in this forum before. I hold
> with those who believe the ontology is the artifact -- the captured
> knowledge. Captured knowledge has a form of _expression_. Knowledge that
> is without external form is not an 'ontology'. Knowledge whose external
> form is not 'suitable for automated reasoning' is not an 'ontology'.
> That is the difference between an 'ontology' and an XML schema, or a
> Java program (perhaps), or a PDF text, or a relational database.
> The OWL formulation is an ontology; the CL formulation is an ontology;
> they are different ontologies, even when both are said to represent
> exactly the same knowledge. If they do represent exactly the same
> knowledge, they are "equivalent".
> I am indebted to Professor Hyunbo Cho for the characterization of
> uncaptured knowledge as "unknown knowledge" -- a fitting oxymoron.
>> There is also the intangible aspect of what the representation of the ontology means, which is not subject to engineering discipline but rather depends more on individual interpretation and perspective.
> Which is only to say that the formulation of the captured knowledge has
> some level of residual ambiguity. One must realize, however, that
> perspective often adds relationships of concepts in the ontology to
> concepts that go beyond the scope of the ontology, and such differences
> in perspective may not lead to any difference in the interpretation of
> the concepts in the ontology per se.
> In fairness, the biggest issue in most ontologies is the number of
> primitive concepts -- concepts whose definitions are written in natural
> language and are formally testable only to the extent of the axioms
> provided for them, if any. Primitive concepts usually leave much to the
> intuition of the human reader. OTOH, there are a number of 'primitive
> concepts' in Cyc that are so strongly characterized by axioms that it is
> difficult to imagine anyone misunderstanding what was meant -- a false
> intuition will quickly lead to a contradiction.
>> An ontology is not like a chair or a car or a building that is engineered to meet specific, concrete, physical requirements, and can be measured whether or not it meets those requirements.
> Rich Cooper answered that:
>> I emphatically disagree! If the ontology doesn’t meet a specific set
>> of needs, whether documented as requirements or some other
>> documentation method, the need drives the usage. If there are no
>> needs, the ontology stays in the college or academy where it was
>> originated or partnered with.
>> Requirements, i.e. real human needs, always drive the market.
> My sentiments exactly.
> I agree that a lot of what is published as ontologies are academic toys,
> but that has been true of all kinds of software for 40 years, and it is
> not restricted to software artifacts. Rich is right. Commissioned
> ontologies have a functional purpose; otherwise there would be no
> investment. The purpose of most academic stuff is to learn the trade,
> and get a degree by pretending to model a real domain, and in the
> process learning about the ugliness of reality. Civil engineers build
> concrete boats, or cardboard ones. MEs build fighting robots.
>> While I agree that training and experience can make one a better ontology designer, I don't think it's possible to completely remove individual bias from the process.
> Well, that depends on what the process is. If you have one knowledge
> engineer working with one or more domain experts, and the result is
> nominally a consensus model, it will be biased by the opinions in the
> room that are backed by the most personally or politically powerful
> individuals, like any other such effort. And the sole ontology engineer
> is in a position of some power. But if that result is subjected to a
> fair and open review process, by peers of the knowledge engineer and
> peers of the domain experts, a lot of the biases are exposed and
> eliminated. That is, we are talking about what the engineering practice
> is. (If we start by saying this is not engineering, the practitioners
> will never be forced to consider what good practice for their trade
> might be.)
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx