On Jan 11, 2011, at 11:13 PM, Burkett, William [USA] wrote:

Chris, Ed:
So given your interpretation, if I have (1) an ontology written in OWL about, say, making a cup of coffee, and (2) another ontology written in CLIF (or the KM language of your choice) that is about the exact same making-a-cup-of-coffee process, then they are two separate and distinct ontologies rather than different representations of the same ontology? That goes against my understanding of conceptual modelling vis-à-vis physical data modelling, and my understanding of the RDF abstract model vis-à-vis its various representations. For a given set of concepts and relationships in my mind, there are many different physical ways to represent, manifest, or write them down. Why are ontologies different?
I don't think they are; I think matters are exactly the same in the case of RDF "abstract models". I think that part of our apparent disagreement here is due to the notorious word "model", which has very different but, unfortunately, very entrenched uses in different communities. On the one hand, in, e.g., the database community, a model is often itself some sort of syntactic object, like a Bachman ER model with its boxes, diamonds, and arrows. On the other hand, in mathematical logic, a model is a certain kind of mathematical object, an abstract characterization of the meaning of a set of sentences (or a diagram) in a given representation language. Oftentimes, especially in discussions of knowledge representation, these two uses are run together. I wonder if there might be a bit of that in our exchange here.

I'm sure you would agree that we need to distinguish clearly between syntactic representations — e.g., sets of RDF/OWL/CLIF sentences — and their intended semantic models. I use the term "ontology" for the former; you seem inclined to use it for the latter. Obviously, if that is the case, then all we have here is a quibble — an argument over how a word should be used — rather than a substantive disagreement.
The reason I would argue for my position is two-fold. First, it is often notoriously difficult to pin down the notion of an intended semantic model precisely. Second (perhaps just a more precise form of the first), semantic models are by nature abstract, mathematical (when properly defined) objects; and if they contain, e.g., the real numbers -- as a model for a physical system might -- they are uncountably infinite as well. Our syntactic representations, by contrast, are precisely defined, tangible, and finite (or, at worst, countably infinite).
I myself would therefore express your intuition by saying that there are concepts and relationships in the world (i.e., that portion of the world the ontology is intended to describe) that constitute the intended semantic model of a given ontology. Equivalent but distinct ontologies — your CLIF cup-o-coffee ontology and your OWL cup-o-coffee ontology — are "the same" in the sense that they express the same intended semantic model (more exactly, they have exactly the same semantic models, intended or not).
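To make that concrete, here is a minimal sketch in Python, assuming the rdflib library; the coffee vocabulary is invented for the example. It builds one abstract RDF graph, serializes it two ways, and checks that both representations parse back to isomorphic graphs: distinct syntax, same abstract structure.

    # One abstract RDF graph, two concrete serializations (sketch;
    # rdflib assumed, coffee vocabulary invented for the example).
    from rdflib import Graph, Namespace, RDFS
    from rdflib.compare import isomorphic

    EX = Namespace("http://example.org/coffee#")

    g = Graph()
    g.add((EX.Espresso, RDFS.subClassOf, EX.Coffee))
    g.add((EX.Coffee, RDFS.subClassOf, EX.Beverage))

    # Two distinct syntactic representations of the one graph.
    as_turtle = g.serialize(format="turtle")
    as_rdfxml = g.serialize(format="xml")

    # Parsing either back yields isomorphic graphs: the representations
    # differ, the abstract "model" (in the RDF sense) does not.
    g1 = Graph().parse(data=as_turtle, format="turtle")
    g2 = Graph().parse(data=as_rdfxml, format="xml")
    assert isomorphic(g1, g2)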
If my coffee-making ontologies /are/ different ontologies, then what do you call the set of concepts that they share and represent?
I'd likely call them "the set of concepts they share and represent". :-)
Bill
(geez - what am I doing arguing with the team of Chris and Ed - if John joins them, I'm doomed!!! :-) )
:-) Seems to me you set up the issue very nicely.
-chris
-----Original Message-----
From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Christopher Menzel
Sent: Tuesday, January 11, 2011 9:21 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] I ontologise, you ontologise, we all mess up... (was: Modeling a money transferring scenario, or any of a range of similar dialogues)

+1 I was about to write almost exactly what Ed wrote below. :-)

There is a strong philosophical *intuition* that a set of OWL or CLIF statements is only a *representation* and that an *ontology* is the information such a representation expresses. However, like Ed, I don't think that notion of an ontology is scientifically useful. What, exactly, *is* an ontology on this approach? What distinguishes one such ontology from another? What, exactly, is the "expressing" relation between a representation and an ontology so understood? It seems to me that there are no scientifically rigorous answers to these questions. The only tangible, testable objects we have to work with are the representations themselves.

Hence, it seems to me that the cleanest approach (as I've argued before and as Ed also suggests below) is to define any set of statements (perhaps meeting certain minimal conditions) in a given ontology language to be an ontology. We can then easily and cleanly define a number of useful notions for categorization and comparison, e.g., two ontologies are *equivalent* if each can be interpreted in the other, *inconsistent* if their union is inconsistent, etc.

-chris

On Jan 11, 2011, at 5:49 PM, Ed Barkmeyer wrote:

Bill Burkett wrote:
Chris, Ed:
I disagree that an ontology is (1) an artifact and (2) something that can be engineered. (Thus I support Peter's question of whether "ontology engineer" is a useful term.) It is the *representation/manifestation* of an ontology that is the artifact that is created - it's the OWL representation (or CL representation or whatever) that is the artifact.
This is a subject that has been discussed in this forum before. I hold with those who believe the ontology is the artifact -- the captured knowledge. Captured knowledge has a form of expression. Knowledge that is without external form is not an 'ontology'. Knowledge whose external form is not 'suitable for automated reasoning' is not an 'ontology'. That is the difference between an 'ontology' and an XML schema, or a Java program (perhaps), or a PDF text, or a relational database.
The OWL formulation is an ontology; the CL formulation is an ontology; they are different ontologies, even when both are said to represent exactly the same knowledge. If they do represent exactly the same knowledge, they are "equivalent".
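This sense of "equivalent" can be illustrated in a toy setting: a hedged Python sketch using sympy's propositional logic, with invented encodings standing in for the OWL and CL formulations. Two axiom sets are equivalent when each entails every sentence of the other, and entailment can be tested as unsatisfiability of the axioms plus the negated conclusion.

    # "Same knowledge, different formulation" in miniature (sketch;
    # sympy assumed, encodings invented for the example).
    from sympy import symbols
    from sympy.logic.boolalg import And, Not, Implies
    from sympy.logic.inference import satisfiable

    Coffee, Hot, Beverage = symbols("Coffee Hot Beverage")

    # Two formulations of the same toy knowledge.
    owl_style = [Implies(Coffee, Beverage), Implies(Coffee, Hot)]
    cl_style = [Implies(Coffee, And(Beverage, Hot))]

    def entails(axioms, sentence):
        # axioms |= sentence  iff  axioms + {not sentence} is unsatisfiable
        return not satisfiable(And(*axioms, Not(sentence)))

    def equivalent(a, b):
        return all(entails(a, s) for s in b) and all(entails(b, s) for s in a)

    assert equivalent(owl_style, cl_style)  # same models, different syntax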
I am indebted to Professor Hyunbo Cho for the characterization of uncaptured knowledge as "unknown knowledge" -- a fitting oxymoron.
There is also the intangible aspect of what the representation of the ontology means, which is not subject to engineering discipline but rather depends more on individual interpretation and perspective.
Which is only to say that the formulation of the captured knowledge has some level of residual ambiguity. One must realize, however, that perspective often adds relationships between concepts in the ontology and concepts that are beyond its scope, and such differences in perspective may not lead to any difference in the interpretation of the concepts in the ontology per se.
In fairness, the biggest issue in most ontologies is the number of primitive concepts -- concepts whose definitions are written in natural language and are formally testable only to the extent of the axioms provided for them, if any. Primitive concepts usually leave much to the intuition of the human reader. OTOH, there are a number of 'primitive concepts' in Cyc that are so strongly characterized by axioms that it is difficult to imagine anyone misunderstanding what was meant -- a false intuition will quickly lead to a contradiction.
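That last point can be mocked up in the same toy style: a hedged sketch (sympy again, axioms invented for the example) in which the axioms for a primitive concept leave a false intuition no consistent reading.

    # Axioms strong enough that a misreading is immediately contradictory
    # (sketch; sympy assumed, axioms invented for the example).
    from sympy import symbols
    from sympy.logic.boolalg import And, Not, Implies
    from sympy.logic.inference import satisfiable

    Espresso, Brewed, Instant = symbols("Espresso Brewed Instant")

    axioms = And(
        Implies(Espresso, Brewed),      # every espresso is brewed
        Implies(Brewed, Not(Instant)),  # brewed coffee is never instant
    )

    # A reader's false intuition: "espresso is a kind of instant coffee."
    false_intuition = And(Espresso, Instant)

    # No model satisfies both; the misreading surfaces as a contradiction.
    assert satisfiable(And(axioms, false_intuition)) is False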
An ontology is not like a chair or a car or a building, which is engineered to meet specific, concrete, physical requirements and can be measured as to whether or not it meets those requirements.
Rich Cooper answered that:
I emphatically disagree! If the ontology doesn't meet a specific set of needs, whether documented as requirements or by some other method, the need drives the usage. If there are no needs, the ontology stays in the college or academy where it originated or was partnered.
Requirements, i.e. real human needs, always drive the market.
My sentiments exactly.
I agree that a lot of what is published as ontologies amounts to academic toys, but that has been true of all kinds of software for 40 years, and it is not restricted to software artifacts. Rich is right. Commissioned ontologies have a functional purpose; otherwise there would be no investment. The purpose of most academic work is to learn the trade and get a degree by pretending to model a real domain, learning in the process about the ugliness of reality. Civil engineers build concrete boats, or cardboard ones; mechanical engineers build fighting robots.
While I agree that training and experience can make one a better ontology designer, I don't think it's possible to completely remove individual bias from the process.
Well, that depends on what the process is. If you have one knowledge engineer working with one or more domain experts, and the result is nominally a consensus model, it will be biased by the opinions in the room that are backed by the most personally or politically powerful individuals, like any other such effort. And the sole ontology engineer is in a position of some power. But if that result is subjected to a fair and open review process, by peers of the knowledge engineer and peers of the domain experts, a lot of the biases are exposed and eliminated.

That is, we are talking about what the engineering practice is. (If we start by saying this is not engineering, the practitioners will never be forced to consider what good practice for their trade might be.)
-Ed
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx