
Re: [ontolog-forum] Ontology for Climate Change - need input

To: ontolog-forum@xxxxxxxxxxxxxxxx
From: John F Sowa <sowa@xxxxxxxxxxx>
Date: Wed, 03 Apr 2013 16:58:28 -0400
Message-id: <515C97F4.9060000@xxxxxxxxxxx>
Ed, Rich, and Doug,

I agree with your observations.  But I would qualify or revise the
conclusions.

EB
> The whole idea in ontology development is that we sort all [those
> definitions] out in the formal language.  In a formal ontology,
> all terms are Humpty Dumpty words:  They mean exactly what we say
> they mean, neither more nor less.

I agree.  But I would add that there's a major problem in getting anyone
other than Humpty D. to understand what he meant in whatever arcane
notation he chose to use.  Even Humpty himself is unlikely to remember
from one day to the next the exact details of every definition.

How many programmers can recall exactly what assumptions they made
in a program they wrote ten years ago?  Ten days ago?

Can anyone expect an ontologist (or worse, a committee of ontologists)
to remember the definitions of every term?  To reread those definitions
every time they use the terms?  To understand exactly what they read?
To use what they thought they understood in the way it was intended?

EB
> An ontology in which most of the terms are primitive (not formally defined)
> does not provide much of a foundation for inference.  In particular, most
> OWL models I see are information models, not ontologies.  They don’t DEFINE
> terms using both necessary and sufficient (iff) characteristics.

I agree.  But this isn't just a problem with OWL.  The same issues occur
in systems that have much more detailed definitions.  But when you put
all that detail into the definitions, the likelihood that anybody will
read and understand them exactly as intended is very, very low.
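
To make the contrast concrete, here is a toy sketch in Python rather
than OWL.  The class, the two conditions, and the examples are invented
for illustration; a real iff definition of "river" would be far harder
to state (which is part of my point):

    from dataclasses import dataclass

    @dataclass
    class WaterBody:
        flowing: bool
        natural: bool

    # Primitive term: membership in "River" is merely asserted somewhere.
    # Nothing lets a reasoner check it or derive it.
    asserted_rivers = {"Amazon"}

    # Defined term: "River" holds iff the necessary-and-sufficient
    # conditions hold, so membership can be computed, not just asserted.
    def is_river(x: WaterBody) -> bool:
        return x.flowing and x.natural

    amazon = WaterBody(flowing=True, natural=True)
    canal  = WaterBody(flowing=True, natural=False)

    print("Amazon" in asserted_rivers)  # True, but only by assertion
    print(is_river(amazon))   # True  -- inferable from the definition
    print(is_river(canal))    # False -- a sufficient condition fails

The defined version supports inference, but every detail in it is one
more thing for the next reader to misread or forget.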

It doesn't matter how precise your definitions are if the data entry
people don't read, understand, remember, or use them.

RC quoting "Ordered Chaos"
> We have discussed earlier the possibility of naming ontological nodes
> with non-linguistic labels, perhaps like X3-05023-C, which makes it
> clear that there is NO linguistic relationship between ontological
> nodes and words.  But that has been an unworkable approach as well.
> The authors of the nodes assign "meaningful" names, such as "river",
> and do not exhaustively characterize the node "river" to include all
> the varieties of river which we have discussed in the past.

In short, the only things that people remember and use are the English
phrases in the "meaningful names".  The pretense that anybody will
remember or use the formal definitions is more hope than reality.

RC continuing the quote:
> The point is that mnemonic names for ontological nodes must necessarily
> be misleading at some point in using the ontology.  Blaming "erroneous
> interpretations" on the NL side in order to leave the ontological side
> blameless is just an exercise in self delusion.

Yes, indeed!

DF
> If the term and its meaning(s) is/are known to an ontologist, all that is
> needed is to map the NL term to the term(s) in the ontology that
> it may represent.  If meanings are known, it is irrelevant to the ontologist
> whether any of the meanings may be calculable from the NL term.

Several qualifications:

The assumption that any two ontologists who are using the same formal
ontology understand and use all the terms in exactly the same way is
unlikely to be true.  It's even unlikely that the *same* ontologist
will remember and use every term in the same way from one month (or
even one day) to the next.
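
Here is a toy sketch of that mapping step in Python.  Except for
X3-05023-C, which is taken from the quote above, the node IDs, the
senses, and both lexicons are hypothetical:

    # Two ontologists' term maps for the "same" formal ontology.
    lexicon_A = {
        "river": ["X3-05023-C",    # perennial natural watercourse
                  "X3-05024-A"],   # intermittent or seasonal watercourse
        "bank":  ["X7-00112-B"],   # land along the side of a river
    }

    lexicon_B = {
        "river": ["X3-05023-C"],   # B never added the intermittent sense
        "bank":  ["X7-00112-B",
                  "X9-31416-D"],   # B also maps the financial sense
    }

    print(lexicon_A["river"])   # ['X3-05023-C', 'X3-05024-A']
    print(lexicon_B["river"])   # ['X3-05023-C']

Both maps look reasonable in isolation, and nothing in the formalism
flags the divergence.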

For anybody who has ever written a program, just think of how many bugs
crop up from mistakes in remembering the details of every definition
in every function or subroutine in a large system.  Just look at the
huge number of patches that emanate from Redmond, Washington, every week.

Every patch changes the definition of something in the system.  And
there is nobody who understands the whole thing.

RC
> The solution to this conundrum has yet to be
> found, and indeed may not even exist.

There is a solution.  And C. S. Peirce stated it over a century ago:

CSP
> It is easy to speak with precision upon a general theme. Only, one
> must commonly surrender all ambition to be certain. It is equally
> easy to be certain. One has only to be sufficiently vague. It is not
> so difficult to be pretty precise and fairly certain at once about
> a very narrow subject. (CP 4.237)

In other words, it's easy to be precise in a universal ontology,
but your precision will have no correspondence with reality.

It's also easy to have a broad ontology that is widely applicable,
but only if it is "sufficiently vague".  That is why schema.org is
much more likely to be generally applicable than any universal
formal ontology with detailed definitions.

But as CSP said, you can be "pretty precise and fairly certain"
about a very narrow subject.

Solution:  You need a broad "vague" upper level.  And the
only places where you can have enough precision for detailed
reasoning (or specification) are in the very narrow, specialized
microtheories.
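
A toy sketch of that architecture, again in Python rather than a logic.
All the names and numbers are hypothetical, chosen to fit this thread:

    class Process:
        """Upper level: deliberately vague.  It applies almost
        anywhere, but it supports almost no inference."""
        pass

    class RiverDischargeMeasurement(Process):
        """Microtheory term: narrow enough to be pretty precise and
        fairly certain at once."""
        def __init__(self, cubic_meters_per_second: float):
            if cubic_meters_per_second < 0:
                raise ValueError("discharge cannot be negative")
            self.q = cubic_meters_per_second

        def exceeds(self, threshold: float) -> bool:
            # A checkable inference -- valid inside this narrow
            # theory, not across the whole ontology.
            return self.q > threshold

    m = RiverDischargeMeasurement(250_000.0)
    print(m.exceeds(100_000.0))   # True

The precision lives entirely in the leaf class; the upper level stays
vague enough to be shared.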

John

