
Re: [ontolog-forum] I ontologise, you ontologise, we all mess up...

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Ed Barkmeyer <edbark@xxxxxxxx>
Date: Wed, 12 Jan 2011 15:41:54 -0500
Message-id: <4D2E1212.7020502@xxxxxxxx>
Bill,    (01)

you wrote:
> Chris, Ed:
>
> So given your interpretation, if I have (1) an ontology written in OWL about, 
>say, making a cup of coffee, and (2) another ontology written in CLIF (or KM 
>language of your choice) that is about the exact same making-a-cup-of-coffee 
>process, then they are two separate and distinct ontologies rather than 
>different representations of the same ontology?      (02)

All we are saying is that the ontology is the captured description of 
the concepts.  And the choice of language is intimately involved with 
capturing the description.  Further, the choice of language is closely 
related to computational use of that description.     (03)

> That goes against my understanding of conceptual modelling vis-à-vis physical 
>data modelling, and my understanding of the RDF abstract model vis-à-vis the 
>various representations of it.      (04)

That distinction is artificial.  RDF is only one language.  If a 
"language" has more than one "notation" (an idea that seems to apply 
only to computational languages and printing), then an ontology that is 
written in that language is the same ontology, regardless of the 
notation used.  The idea that a language has an abstract syntax and 
multiple concrete syntaxes is pure computer science, and a sop to the 
inability of standards communities to actually make a useful standard.  
As long as the semantics and expressiveness of the language are 
associated entirely with the abstract syntax, the difference in concrete 
syntax is rather like choosing a page layout and a print font, or 
distinguishing spoken English from written English or typed English.  
One can perform a rote transformation of the syntax without changing any 
aspect of its interpretation.    (05)

In the particular case of OWL and CLIF, the grammar and formal semantics 
of the two languages are significantly different.  It is possible to 
"translate" an OWL ontology into a CLIF ontology, but there are multiple 
ways to do that, and each has different consequences for first-order 
reasoners.    (06)
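To make that concrete (a hypothetical sketch -- the class names are invented, 
and the Lisp-style ";;" comments are for readability only, not CLIF proper): a 
single OWL axiom admits at least two quite different CLIF renderings, and a 
first-order reasoner treats them differently:

```
;; OWL (functional syntax): SubClassOf( :Coffee :Beverage )

;; CLIF rendering 1: each OWL class becomes a unary relation.
(forall (x) (if (Coffee x) (Beverage x)))

;; CLIF rendering 2: each OWL class becomes an individual,
;; subclass-of is reified as a binary relation, and a bridging
;; axiom (using CL's names-in-predicate-position feature)
;; connects the two levels.
(subclass-of Coffee Beverage)
(forall (c d x)
  (if (and (subclass-of c d) (c x))
      (d x)))
```

Roughly speaking, rendering 1 is flat and cheap for a resolution prover, while 
rendering 2 lets you quantify over classes but makes every subsumption 
inference go through the bridging axiom.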

One could argue that an ontology expressed in one CL dialect is "the 
same as" an ontology with the same concepts expressed in a different CL 
dialect.  But trying to define what is meant by "same as" is pointless 
effort.  As Chris says, one can produce a well-defined concept of 
"equivalence" and stop.    (07)
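One way to pin that down (a sketch, and not the only possible one; 
"translation" here means a syntactic mapping between the two languages):

```
O1 (in language L1) and O2 (in language L2) are equivalent iff
there are translations t : L1 -> L2 and u : L2 -> L1 such that
    O2 entails t(O1)   and   O1 entails u(O2)
```

That captures the "each can be interpreted in the other" idea without ever 
having to say what the shared "abstract ontology" is.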

> For a given set of concepts and relationships in my mind, there are many 
>different physical ways to represent, manifest, or write them down.  Why are 
>ontologies different?       (08)

Until you physically express the concepts and relationships in your 
mind, they are unavailable to any automated function, or any function 
implemented by another person.  Once you express them, they will be 
interpreted by the automaton or the other person according to the 
vocabulary and rules of the language you used.  The difference between 
expressing your concepts in speech and in writing is much less 
significant for the recipient than the difference between expressing 
your concepts in English and in Chinese, or in OWL.  The ontology is 
what you expressed in the language you chose, because that is the 
concept set that is available to your "audience" (noting that that word 
originally meant the people who heard you speak).    (09)

> If my coffee-making ontologies /are/ different ontologies, then what do you 
>call the set of concepts that they share and represent?
>       (010)

I would call that the "conceptualization" -- what you have in your 
head.  The idea that "representing" those concepts in different 
languages is just "representation" has linguistically inappropriate 
connotations.  What we are talking about is a difference in the 
"expression" of those concepts.  If the language has a word for walking 
alertly through tall grass (Swahili does), you use the word to express 
that concept; if it has no such word (like English), you use a complex 
circumlocution.  The same idea applies to ontologies.  If your language 
distinguishes classifiers from Boolean properties and you assign 
different semantics to those concepts, then you have to develop some 
circumlocution to carry those additional semantics in a CLIF model, 
because CL doesn't make such a distinction -- they are just unary 
relations.  And how you do the circumlocution affects how much of your 
concept is "shared" between the two "representations", as interpreted by 
the audience, and in consequence, what they have in their heads as a 
result of your "utterance".    (011)
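A sketch of what such a circumlocution might look like (all names invented; 
Lisp-style ";;" comments for readability only):

```
;; In the source language, Employed is a classifier and
;; isRetired a Boolean property, with different intended
;; semantics.  In CLIF both collapse to unary relations:
(Employed John)
(isRetired John)

;; One circumlocution: carry the lost distinction as explicit
;; meta-level assertions that an aware reasoner can exploit.
(Classifier Employed)
(BooleanProperty isRetired)
```

A reasoner that ignores the meta-level assertions sees only two 
indistinguishable unary relations; how much of the original distinction is 
"shared" depends on whether the audience's tools read them.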

This is very important, because the difference in expressiveness of 
ontology languages is not trivial.  Translating a CL ontology to OWL is 
a bit like looking at the English subtitles for a French or Swedish 
movie.  Some percentage of the original content survives.     (012)

> Bill  
>
> (geez - what am I doing arguing with the team of Chris and Ed - if 
>John joins them, I'm doomed!!!  :-) )
>       (013)

I didn't realize this was an argument.  I thought it was a discussion.  
And in the words of Stephen Hopkins, "I've never seen any topic so 
dangerous that it couldn't be discussed."  (He was talking about the 
motion of the Commonwealth of Virginia that proposed independence from 
the British crown.)  Doom is off the table. :-)    (014)

-Ed    (015)

>
> -----Original Message-----
> From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx 
>[mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Christopher Menzel
> Sent: Tuesday, January 11, 2011 9:21 PM
> To: [ontolog-forum]
> Subject: Re: [ontolog-forum] I ontologise, you ontologise, we all mess up... 
>(was: Modeling a money transferring scenario, or any of a range of similar 
>dialogues)
>
> +1
>
> I was about to write almost exactly what Ed wrote below. :-)
>
> There is a strong philosophical *intuition* that a set of OWL or CLIF 
>statements is only a *representation* and that an *ontology* is the 
>information such a representation expresses. However, like Ed, I don't think 
>that notion of an ontology is scientifically useful. What, exactly, *is* an 
>ontology on this approach?  What distinguishes one such ontology from another? 
>What, exactly, is the "expressing" relation between a representation and an 
>ontology so understood?  It seems to me that there are no scientifically 
>rigorous answers to these questions.
>
> The only tangible, testable objects we have to work with are the 
>representations themselves.  Hence, it seems to me that the cleanest approach 
>(as I've argued before and as Ed also suggests below) is to define any set of 
>statements (perhaps meeting certain minimal conditions) in a given ontology 
>language to be an ontology.  We can then easily and cleanly define a number of 
>useful notions for categorization and comparison, e.g., two ontologies are 
>*equivalent* if each can be interpreted in the other, inconsistent if their 
>union is inconsistent, etc.
>
> -chris
>
>
> On Jan 11, 2011, at 5:49 PM, Ed Barkmeyer wrote:
>
>   
>> Bill Burkett wrote:
>>     
>>> Chris, Ed: 
>>>
>>> I disagree that an ontology is (1) an artifact and (2) is something that 
>can be engineered. (Thus I support Peter's question of whether "ontology 
>engineer" is a useful term.)  It is the *representation/manifestation* of an 
>ontology that is the artifact that is created - it's the OWL representation 
>(or CL representation or whatever) that is the artifact.  
>>>       
>> This is a subject that has been discussed in this forum before. I hold 
>> with those who believe the ontology is the artifact -- the captured 
>> knowledge. Captured knowledge has a form of expression. Knowledge that 
>> is without external form is not an 'ontology'. Knowledge whose external 
>> form is not 'suitable for automated reasoning' is not an 'ontology'. 
>> That is the difference between an 'ontology' and an XML schema, or a 
>> Java program (perhaps), or a PDF text, or a relational database.
>>
>> The OWL formulation is an ontology; the CL formulation is an ontology; 
>> they are different ontologies, even when both are said to represent 
>> exactly the same knowledge. If they do represent exactly the same 
>> knowledge, they are "equivalent".
>>
>> I am indebted to Professor Hyunbo Cho for the characterization of 
>> uncaptured knowledge as "unknown knowledge" -- a fitting oxymoron.
>>
>>     
>>> There is also the intangible aspect of what the representation of the 
>ontology means that is not subject to engineering discipline, but rather depends 
>more on individual interpretation and perspective.  
>>>       
>> Which is only to say that the formulation of the captured knowledge has 
>> some level of residual ambiguity. One must realize, however, that 
>> perspective often adds relationships of concepts in the ontology to 
>> concepts that go beyond the scope of the ontology, and such differences 
>> in perspective may not lead to any difference in the interpretation of 
>> the concepts in the ontology per se.
>>
>> In fairness, the biggest issue in most ontologies is the number of 
>> primitive concepts -- concepts whose definitions are written in natural 
>> language and are formally testable only to the extent of the axioms 
>> provided for them, if any. Primitive concepts usually leave much to the 
>> intuition of the human reader. OTOH, there are a number of 'primitive 
>> concepts' in Cyc that are so strongly characterized by axioms that it is 
>> difficult to imagine anyone misunderstanding what was meant -- a false 
>> intuition will quickly lead to a contradiction.
>>
>>     
>>> An ontology is not like a chair or a car or a building that is engineered 
>to meet specific, concrete, physical requirements, and can be measured whether 
>or not it meets those requirements.  
>>>       
>> Rich Cooper answered that:
>>
>>     
>>> I emphatically disagree! If the ontology doesn't meet a specific set 
>>> of needs, whether documented as requirements or some other 
>>> documentation method, the need drives the usage. If there are no 
>>> needs, the ontology stays in the college or academy where it was 
>>> originated or partnered with.
>>>
>>> Requirements, i.e. real human needs, always drive the market.
>>>
>>>       
>> My sentiments exactly.
>>
>> I agree that a lot of what is published as ontologies is academic toys, 
>> but that has been true of all kinds of software for 40 years, and it is 
>> not restricted to software artifacts. Rich is right. Commissioned 
>> ontologies have a functional purpose; otherwise there would be no 
>> investment. The purpose of most academic stuff is to learn the trade, 
>> and get a degree by pretending to model a real domain, and in the 
>> process learning about the ugliness of reality. Civil engineers build 
>> concrete boats, or cardboard ones. MEs build fighting robots.
>>
>>     
>>> While I agree that training and experience can make one a better ontology 
>designer, I don't think it's possible to completely remove individual bias 
>from the process.
>>>
>>>       
>> Well, that depends on what the process is. If you have one knowledge 
>> engineer working with one or more domain experts, and the result is 
>> nominally a consensus model, it will be biased by the opinions in the 
>> room that are backed by the most personally or politically powerful 
>> individuals, like any other such effort. And the sole ontology engineer 
>> is in a position of some power. But if that result is subjected to a 
>> fair and open review process, by peers of the knowledge engineer and 
>> peers of the domain experts, a lot of the biases are exposed and 
>> eliminated. That is, we are talking about what the engineering practice 
>> is. (If we start by saying this is not engineering, the practitioners 
>> will never be forced to consider what good practice for their trade 
>> might be.)
>>
>> -Ed
>>     
>
>  
> _________________________________________________________________
> Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
> Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
> Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
> Shared Files: http://ontolog.cim3.net/file/
> Community Wiki: http://ontolog.cim3.net/wiki/ 
> To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
> To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx
>  
>       (016)

-- 
Edward J. Barkmeyer                        Email: edbark@xxxxxxxx
National Institute of Standards & Technology
Manufacturing Systems Integration Division
100 Bureau Drive, Stop 8263                Tel: +1 301-975-3528
Gaithersburg, MD 20899-8263                Cel: +1 240-672-5800    (017)

"The opinions expressed above do not reflect consensus of NIST, 
 and have not been reviewed by any Government authority."    (018)


