
Re: [ontolog-forum] Two ontologies that are inconsistent but both needed

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Waclaw Kusnierczyk <Waclaw.Marcin.Kusnierczyk@xxxxxxxxxxx>
Date: Wed, 13 Jun 2007 22:47:50 +0200
Message-id: <467057F6.40701@xxxxxxxxxxx>
Kathryn Blackmond Laskey wrote:

> Barry, you caricature Tom Gruber by saying he wants to build 
> ontologies of concepts while you're building ontologies of the world.

Gruber's definition should be read in the context of AI, and it may not 
be appropriate in the context of biomedical ontology;  here, we talk 
about agents whose reality (so to speak) is specified by an ontology:

"For AI systems, what "exists" is that which can be represented."
[http://www-ksl.stanford.edu/kst/what-is-an-ontology.html]    (04)

And what can be represented is specified by the ontology.

In this sense, an ontology is both a representation and a specification.
I would argue that it is the developer's specification of the agents' 
reality (the developer plays God here), as well as a specification of 
the agents' conceptualization of that reality (the agent can only 
represent).
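To make the point concrete, here is a minimal sketch of that reading: the ontology delimits what the agent can represent at all.  All names here (the Ontology class, can_represent, the biomedical terms) are illustrative inventions, not the vocabulary of any real ontology language:

```python
# Sketch: an ontology as the developer's specification of what an agent
# can represent. Assertions using terms outside the ontology are simply
# not representable -- for this agent, they do not "exist".

class Ontology:
    def __init__(self, classes, relations):
        self.classes = set(classes)      # kinds of things the agent may posit
        self.relations = set(relations)  # ways they may be related

    def can_represent(self, subject_class, relation, object_class):
        """An assertion is representable only if every term in it
        is specified by the ontology."""
        return (subject_class in self.classes
                and relation in self.relations
                and object_class in self.classes)

onto = Ontology(classes={"Cell", "Nucleus"}, relations={"part_of"})

print(onto.can_represent("Nucleus", "part_of", "Cell"))  # True
print(onto.can_represent("Soul", "inhabits", "Cell"))    # False
```

The developer, in choosing the classes and relations, fixes the agents' reality; the agents can only represent within it.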

> That ain't so. Anyone who builds an ontology is specifying a
> conceptualization.  A good ontologist specifies conceptualizations
> that are as true as possible to the structure of the world. But an
> ontology IS a specification of a conceptualization.  It is NOT a
> specification of the reality.  Only God can specify reality.  We can
> describe reality and act in it, but we can't specify it.  We describe
> reality by specifying our conceptualizations of it, arguing over
> them, refining them, and hammering out consensus agreements on them.

When we come to the world of biomedical ontologies, they are clearly not 
specifications of reality;  they are representations of reality, and 
specifications of conceptualizations of reality -- by representing 
reality, they specify how we should think about it (conceptualize it).

> If we bite the bullet and admit that ontologies specify 
> conceptualizations [of a domain], then it's easy to argue that 
> conceptualizations should be allowed to have probabilities when we 
> don't have enough information for a complete specification.  This 
> argument makes sense even if we don't think the probabilities 
> themselves are ontological.

Sure.  An ontology understood as a specification of a conceptualization 
may specify that you should be uncertain, and to what degree, as to 
whether Px given Qx.  As a representation of reality, though, it cannot 
say that Px is probable or uncertain.  (Unless it is our knowledge that 
is the represented part of reality.)
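A tiny sketch of that distinction, with entirely made-up names: the specification can attach a degree of belief to a conditional ("if Qx, then Px to degree 0.9") as an instruction about how the agent should conceptualize, without asserting that anything in reality is itself probabilistic:

```python
# Sketch: an uncertain conditional as part of the *specification of a
# conceptualization* -- it prescribes a degree of belief, it does not
# assert a probabilistic fact about reality. All names are illustrative.

uncertain_axioms = {
    # (antecedent Q, consequent P): prescribed degree of belief P(Px | Qx)
    ("HasMutationQ", "DevelopsPhenotypeP"): 0.9,
}

def degree_of_belief(antecedent, consequent):
    """How strongly, per the specification, should the agent expect
    the consequent given the antecedent? None if unspecified."""
    return uncertain_axioms.get((antecedent, consequent))

print(degree_of_belief("HasMutationQ", "DevelopsPhenotypeP"))  # 0.9
```

The number lives in the specification of the conceptualization, not in the represented reality.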

vQ

Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
