
Re: [ontolog-forum] Just What Is an Ontology, Anyway?

To: "'Christopher Menzel'" <cmenzel@xxxxxxxx>, "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "Ian Bailey" <ian@xxxxxxxxxxxxxxxx>
Date: Mon, 26 Oct 2009 17:54:27 -0000
Message-id: <00f101ca5665$5fb31d20$1f195760$@com>
Hi Chris,    (01)

Sorry, yes, that was a typo for "non-well-founded". By which I mean (at
least from my limited understanding) that there is no requirement to build
everything up from the empty set. All the IDEAS Group work has been based
on this principle, and the ontologies seem to be perfectly implementable to
me. I doubt they'd support reasoning, though.     (02)

There's a link here
(http://en.wikipedia.org/wiki/Non-well-founded_set_theory) which probably
explains it in terms of logic. Personally, I can't understand a bloody word
of it.     (03)

I'm not sure the AI level-nine wizards entirely co-opted the term either.
All the ISO15926 and BORO work was done without any requirement for machine
reasoning, and philosophers talk about ontology quite a bit too.     (04)

For me, it's much more important for an ontology to closely mirror the real
world, because what I care about is building better information systems.
Sure, there has to be some formality in how an ontology is developed, but
that doesn't mean it has to support reasoning. I get particularly worried
when AI folks tell me I can't do things like have types whose instances are
themselves types, or relationships between relationships. They're bending
their "ontology" to suit the tool at hand. If all you have is a war-hammer,
everything starts to look like an orc.     (05)
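
Just to show those aren't exotic requirements, here's a quick sketch in
Python (purely my own illustration, with made-up names, nothing to do with
IDEAS, BORO or any particular tool) of both patterns:

    # Illustration only: a type whose instances are types, and a
    # relationship that has another relationship as one of its ends.
    from dataclasses import dataclass
    from typing import Any, Tuple

    # 1. A type whose instances are themselves types: a metaclass.
    class VehicleType(type):
        """Instances of VehicleType are themselves types (classes)."""

    class Car(metaclass=VehicleType):
        pass

    my_car = Car()                       # Car has ordinary instances...
    assert isinstance(Car, VehicleType)  # ...and is itself an instance of VehicleType

    # 2. A relationship to a relationship: reify the first one so the
    #    second can take it as one of its ends.
    @dataclass(frozen=True)
    class Relationship:
        kind: str
        ends: Tuple[Any, ...]            # ends may themselves be Relationships

    owns = Relationship("owns", ("Ian", "car_123"))
    recorded = Relationship("recordedBy", (owns, "some_registry"))

Whether any given reasoner can cope with that sort of thing is, of course,
a different question.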

Cheers
--
Ian    (06)

-----Original Message-----
From: Christopher Menzel [mailto:cmenzel@xxxxxxxx] 
Sent: 26 October 2009 17:30
To: ian@xxxxxxxxxxxxxxxx; [ontolog-forum] 
Subject: Re: [ontolog-forum] Just What Is an Ontology, Anyway?    (07)

On Oct 26, 2009, at 11:43 AM, Ian Bailey wrote:
> Er...what does ontology have to do with automated reasoning?    (08)

Ever since the term was co-opted (not entirely without warrant) by the  
CS/AI community, a (perhaps the) central motivation has been to  
facilitate automated reasoning on large knowledge bases.    (09)

> The scope of ontology is far wider than that, and there are lots of  
> ontologies out there that are really useful for real world  
> applications, but don't meet the narrow requirements for finite-time  
> reasoning.    (010)

Example being...?  Do you really mean it's in a logic without a proof  
theory?  Or do you simply mean that the ontology is not formally  
specified?  I don't doubt that a semi-formal ontology could be  
useful for, e.g., facilitating a common understanding of a domain  
among human agents.  But, ultimately, complete clarity (and  
computational support) comes only when an informal ontology has been  
rendered in a logical language.  And if you've got a genuine logical  
language, you'll have some sort of proof theory and hence something  
amenable to automated reasoning.    (011)
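
As a toy illustration of what "rendered in a logical language" amounts to
(the predicate names here are invented for the example), the informal
statement "every purchase order has exactly one buyer" might come out in
first-order logic as:

    \forall x \, \bigl( \mathrm{PurchaseOrder}(x) \rightarrow
        \exists! y \, ( \mathrm{Buyer}(y) \wedge \mathrm{hasBuyer}(x, y) ) \bigr)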

> On the other hand, there are ontologies out there that have been  
> built only for reasoning, and are no use whatsoever in real world  
> applications...in fact there are rather a lot of these, mostly  
> funded by our taxes, unfortunately.    (012)

So there are bad, well-funded ontologies; nothing new there.    (013)

> I'm not sure a complete proof theory is required either.    (014)

You are right; partial proof theories for well-specified fragments of  
a given logic could also be useful.  The point was that one needs a  
rigorous proof theory for a logic to support any kind of automated  
reasoning.    (015)

> The none-well-founded stuff seems to work quite well (assuming  
> that's what Chris meant by "proof").    (016)

I don't have any clear idea what you have in mind by "none-well-founded"
stuff.  I'm guessing you mean "non-well-founded" but I'm still not sure
what you mean.  Perhaps you are alluding to well-founded semantics (WFS)?
That is indeed a framework that in general does not have a complete proof
theory, but there are a number of interesting completeness results for
WFS-based systems when certain conditions are imposed on models.    (017)

-chris    (018)



