
[ontolog-forum] Ontology similarity and accurate communication

To: <edbark@xxxxxxxx>, "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "Patrick Cassidy" <pat@xxxxxxxxx>
Date: Thu, 6 Mar 2008 08:00:29 -0500
Message-id: <002601c87f8a$0ddf1fc0$299d5f40$@com>
In the discussion on "orthogonal", Ed Barkmeyer pointed out:    (01)

> My position is that two agents don't need to have non-overlapping
> ontologies to be unable to communicate effectively.  Their ontologies
> can have a 90% overlap, but if there is one critical idea that one has
> and the other does not understand, they can't do business.
>    (02)

 Ed focused on the problem that arises when one 'critical idea' differs
between the ontologies (or assumptions) of two communicating agents.  I
suspect that the problem can also arise when even minor differences are
present in the interpretations of communicated information, because the
interpretation of many concepts involves a very large number of
implications and associated inferences.    (03)

   This question seems to me worthy of a separate field of investigation:
precisely how different can two ontologies be while still sustaining an
adequate level of accuracy in interpreting communications that rely on
them?    (04)

  My own suspicion is that the similarity **in the fundamental concepts**
has to be very close to 100%.  The reasoning goes something like this: if
the brain (or a simulation) does as much computation as one of our laptops,
then it can run at least 1 million inferences per second.  If (crudely
calculating) the inferences supported by two differing ontologies disagree
at a rate of 1 in 1000, then the two ontologies will generate 1000
differing inferences per second from the same information.  How much
difference can be tolerated before something goes badly wrong - perhaps a
direct logical contradiction?  My guess is that each serious "fact" that we
rely on to support our everyday activities rests on at least 1000
assumptions, and getting one in a thousand of those wrong would invalidate
the meaning of such facts, making normal action with predictable results
effectively impossible.  A similarity of 99.9% between two fundamental
ontologies may therefore not be enough for any meaningful level of
communication.  But, as I said at the start, this is an issue that needs
investigation.    (05)
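   To make the arithmetic concrete, here is a small Python sketch of the
estimate above.  The figures are the same rough guesses used in the
paragraph, not measurements:

    # Back-of-envelope sketch of the divergence estimate above.
    # All numbers are rough guesses from the text, not measurements.
    inferences_per_second = 1_000_000   # assumed inference rate of an agent
    disagreement_rate = 1 / 1000        # assumed fraction of differing inferences
    print(inferences_per_second * disagreement_rate)   # -> 1000.0 per second

    # If each everyday "fact" rests on ~1000 assumptions, the chance that
    # all of them agree, at a 99.9% per-assumption match rate, is:
    print(0.999 ** 1000)   # -> ~0.37, so ~63% of such facts are affected

On those (admittedly crude) assumptions, roughly two out of three facts
that each depend on 1000 assumptions would rest on at least one assumption
on which the two ontologies disagree.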

   We all know that people differ in assumptions and beliefs, and yet in
most cases we manage to communicate reasonably well.  How can that be?
Probably because we **know** that we have different assumptions and
beliefs, and so when communicating we assume only that a certain
fundamental set of knowledge is held in common, and rely only on that
basic set of shared assumptions and beliefs to express the ideas we want
to communicate.  If we 'misunderestimate' what our fellow conversant
knows, there can be and often is a miscommunication.  The ability to
communicate effectively depends on the ability to guess correctly which
facts, assumptions, and beliefs are likely to be shared by those with whom
we communicate.  Among specialists, of course, a lot more common technical
knowledge is assumed.    (06)

   An issue that has occupied some of my attention lately is the question
of what basic ontological concepts are sufficient to support accurate
communication.  I frame the issue as analogous to the "defining
vocabulary" that some dictionaries use as a controlled vocabulary with
which to define all of their other words; for the Longman dictionary, it
is around 2,000 words.  The analogous question: how many fundamental
ontological elements (types/classes and relations/functions) are needed to
logically specify, to some adequate level of detail, the meanings of all
other terms used in a reasonably complex domain (one having perhaps
100,000+ terms)?  I don't know, but I think this question is important
enough to warrant substantial effort.  My guess is in the range of 6,000
to 10,000 concepts, and that many of those are fundamental enough to be
common to many complex domains.    (07)
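   To make the analogy concrete, here is a toy Python sketch of what
"logically specifying all other terms from a small set of fundamental
elements" would mean operationally.  The primitives and definitions are
invented placeholders for illustration, not a proposed inventory:

    # Toy model: a small set of primitive concepts, with every other term
    # defined as a composition of primitives (or of already-defined terms).
    primitives = {"Animal", "Human", "Young", "Female", "partOf"}

    definitions = {
        "Child": {"Human", "Young"},
        "Girl":  {"Child", "Female"},
        "Limb":  {"partOf", "Animal"},
    }

    def grounded(term):
        # A term is grounded if it is primitive, or if every element of
        # its definition is itself grounded (assumes acyclic definitions).
        if term in primitives:
            return True
        if term not in definitions:
            return False
        return all(grounded(part) for part in definitions[term])

    print(all(grounded(t) for t in definitions))   # -> True

The research question above then becomes: how large must the set of
primitives be before every term in a 100,000+ term domain vocabulary can
be grounded in this way, to an adequate level of detail?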

  Any other guesses?    (08)

Patrick Cassidy
MICRA, Inc.
908-561-3416
cell: 908-565-4053
cassidy@xxxxxxxxx    (09)



_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx    (010)
