Chris, Ed: (01)
So given your interpretation, if I have (1) an ontology written in OWL about,
say, making a cup of coffee, and (2) another ontology written in CLIF (or KM
language of your choice) that is about the exact same making-a-cup-of-coffee
process, then they are two separate and distinct ontologies rather than
different representations of the same ontology? That goes against my
understanding of conceptual modelling vis-à-vis physical data modelling, and my
understanding of the RDF abstract model vis-à-vis its various representations.
For a given set of concepts and relationships in my mind, there are many
different physical ways to represent, manifest, or write them down. Why are
ontologies different? If my coffee-making ontologies /are/ different
ontologies, then what do you call the set of concepts that they share and
represent? (02)
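For concreteness, here is how one and the same trivial assertion about the
coffee domain might be written in each language (an illustrative sketch only;
the names Brewing and Process are invented for this example). In OWL (Turtle
serialization):

    @prefix :     <http://example.org/coffee#> .
    @prefix owl:  <http://www.w3.org/2002/07/owl#> .
    @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

    :Brewing a owl:Class ;
        rdfs:subClassOf :Process .

and in CLIF:

    (forall (x) (if (Brewing x) (Process x)))

Both texts say "every brewing is a process", yet as character strings they
have nothing in common. Are those two ontologies, or one?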
Bill (03)
(geez - what am I doing arguing with the team of Chris and Ed - if John
joins them, I'm doomed!!! :-) ) (04)
-----Original Message-----
From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx
[mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Christopher Menzel
Sent: Tuesday, January 11, 2011 9:21 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] I ontologise, you ontologise, we all mess up...
(was: Modeling a money transferring scenario, or any of a range of similar
dialogues) (05)
+1 (06)
I was about to write almost exactly what Ed wrote below. :-) (07)
There is a strong philosophical *intuition* that a set of OWL or CLIF
statements is only a *representation* and that an *ontology* is the information
such a representation expresses. However, like Ed, I don't think that notion of
an ontology is scientifically useful. What, exactly, *is* an ontology on this
approach? What distinguishes one such ontology from another? What, exactly, is
the "expressing" relation between a representation and an ontology so
understood? It seems to me that there are no scientifically rigorous answers
to these questions. (08)
The only tangible, testable objects we have to work with are the
representations themselves. Hence, it seems to me that the cleanest approach
(as I've argued before and as Ed also suggests below) is to define any set of
statements (perhaps meeting certain minimal conditions) in a given ontology
language to be an ontology. We can then easily and cleanly define a number of
useful notions for categorization and comparison, e.g., two ontologies are
*equivalent* if each can be interpreted in the other, *jointly inconsistent*
if their union is unsatisfiable, etc. (09)
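In rough symbols (a sketch only, glossing over how the translations are
specified): writing |= for entailment and t, t' for translations between
the two languages,

    O1 and O2 are equivalent            iff  O2 |= t(O1) and O1 |= t'(O2)
    O1 and O2 are jointly inconsistent  iff  t(O1) together with O2 is
                                             unsatisfiable

Nothing in these definitions appeals to an abstract "ontology" over and
above the statement sets themselves.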
-chris (010)
On Jan 11, 2011, at 5:49 PM, Ed Barkmeyer wrote: (011)
> Bill Burkett wrote:
>> Chris, Ed:
>>
>> I disagree that an ontology is (1) an artifact and (2) something that can
>> be engineered. (Thus I support Peter's question of whether "ontology
>> engineer" is a useful term.) It is the *representation/manifestation* of
>> an ontology that is the artifact that is created - it's the OWL
>> representation (or CL representation or whatever) that is the artifact.
>
> This is a subject that has been discussed in this forum before. I hold
> with those who believe the ontology is the artifact -- the captured
> knowledge. Captured knowledge has a form of expression. Knowledge that
> is without external form is not an 'ontology'. Knowledge whose external
> form is not 'suitable for automated reasoning' is not an 'ontology'.
> That is the difference between an 'ontology' and an XML schema, or a
> Java program (perhaps), or a PDF text, or a relational database.
>
> The OWL formulation is an ontology; the CL formulation is an ontology;
> they are different ontologies, even when both are said to represent
> exactly the same knowledge. If they do represent exactly the same
> knowledge, they are "equivalent".
>
> I am indebted to Professor Hyunbo Cho for the characterization of
> uncaptured knowledge as "unknown knowledge" -- a fitting oxymoron.
>
>> There is also the intangible aspect of what the representation of the
>> ontology means, which is not subject to engineering discipline but rather
>> depends more on individual interpretation and perspective.
>
> Which is only to say that the formulation of the captured knowledge has
> some level of residual ambiguity. One must realize, however, that
> perspective often adds relationships between concepts in the ontology and
> concepts beyond its scope, and such differences in perspective may not
> lead to any difference in the interpretation of the concepts in the
> ontology per se.
>
> In fairness, the biggest issue in most ontologies is the number of
> primitive concepts -- concepts whose definitions are written in natural
> language and are formally testable only to the extent of the axioms
> provided for them, if any. Primitive concepts usually leave much to the
> intuition of the human reader. OTOH, there are a number of 'primitive
> concepts' in Cyc that are so strongly characterized by axioms that it is
> difficult to imagine anyone misunderstanding what was meant -- a false
> intuition will quickly lead to a contradiction.
>
>> An ontology is not like a chair or a car or a building that is engineered
>> to meet specific, concrete, physical requirements, and whose conformance
>> to those requirements can be measured.
>
> Rich Cooper answered that:
>
>> I emphatically disagree! If the ontology doesn't meet a specific set
>> of needs, whether documented as requirements or some other
>> documentation method, the need drives the usage. If there are no
>> needs, the ontology stays in the college or academy where it was
>> originated or partnered with.
>>
>> Requirements, i.e. real human needs, always drive the market.
>>
>
> My sentiments exactly.
>
> I agree that a lot of what is published as ontologies is academic toys,
> but that has been true of all kinds of software for 40 years, and it is
> not restricted to software artifacts. Rich is right. Commissioned
> ontologies have a functional purpose; otherwise there would be no
> investment. The purpose of most academic stuff is to learn the trade,
> and get a degree by pretending to model a real domain, and in the
> process learning about the ugliness of reality. Civil engineers build
> concrete boats, or cardboard ones. MEs build fighting robots.
>
>> While I agree that training and experience can make one a better ontology
>> designer, I don't think it's possible to completely remove individual bias
>> from the process.
>>
>
> Well, that depends on what the process is. If you have one knowledge
> engineer working with one or more domain experts, and the result is
> nominally a consensus model, it will be biased by the opinions in the
> room that are backed by the most personally or politically powerful
> individuals, like any other such effort. And the sole ontology engineer
> is in a position of some power. But if that result is subjected to a
> fair and open review process, by peers of the knowledge engineer and
> peers of the domain experts, a lot of the biases are exposed and
> eliminated. That is, we are talking about what the engineering practice
> is. (If we start by saying this is not engineering, the practitioners
> will never be forced to consider what good practice for their trade
> might be.)
>
> -Ed (012)
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx (013)