This mail is publicly posted to a distribution list as part of a process
of public discussion, any automatically generated statements to the
contrary notwithstanding. It is the opinion of the author and does not
represent an official company view.
The critical business problem in data exchange is ensuring consistent
interpretation (semantics and pragmatics) of the terms of a message, not
merely using a consistent set of terms. We have twenty years of
experience of this, and I would expect the business problem to be the
same with ontologies. I am persistently annoyed by academic papers which
"solve" ontology alignment on the basis of the ontology itself, rather
than on the basis of the world it represents.
I hypothesise that the uptake of ontology technologies will be based on
a risk assessment of the reliability of the implementers of the ontology.
Sean Barker
BAE SYSTEMS - Advanced Technology Centre
Bristol, UK
+44 (0)117 302 8184
________________________________
From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Adrian Walker
Sent: 06 March 2008 15:28
To: [ontolog-forum]
Subject: Re: [ontolog-forum] Ontology similarity and accurate communication
Hi Pat --
Using a defining vocabulary is all very well, but suppose that different
organizations interpret a concept in the defining vocabulary
differently. What then? Back to the drawing board?
I'd suggest that the way things actually work in practice is that
divergent conceptual models get pulled into alignment by usage, context
and explanations. If that is indeed the case, a static dictionary,
however good, is at best only a small part of the solution.

What do you think?

Cheers, -- Adrian
Internet Business Logic
A Wiki and SOA Endpoint for Executable Open Vocabulary English
Online at www.reengineeringllc.com. Shared use is free.

Adrian Walker
Reengineering
On Thu, Mar 6, 2008 at 8:00 AM, Patrick Cassidy <pat@xxxxxxxxx> wrote:
In the discussion on "orthogonal", Ed Barkmeyer pointed out:

> My position is that two agents don't need to have non-overlapping
> ontologies to be unable to communicate effectively. Their ontologies
> can have a 90% overlap, but if there is one critical idea that one has
> and the other does not understand, they can't do business.
Ed focused on the problem that arises when one 'critical idea' differs
between the ontologies (or assumptions) of two communicating agents. I
suspect that the problem can also arise when even minor differences are
present in the interpretations of communicated information, because the
interpretation of many concepts involves a very large number of
implications and associated inferences.
This question appears to me to be worthy of a separate field of
investigation: precisely how different can ontologies be while
sustaining an adequate level of accuracy in interpreting communications
that rely on those ontologies?
My own suspicion is that the similarity **in the fundamental concepts**
has to be very close to 100%. The reasoning is something like this: if
the brain (or a simulation) does as much computation as one of our
laptops, then it can run at least a million inferences per second. If,
crudely calculating, the inferences supported by two differing
ontologies differ by 1 in 1000, then the two ontologies will generate
1000 differing inferences per second from the same information. How much
difference can be tolerated before something goes badly wrong - perhaps
a direct logical contradiction? My guess is that each serious "fact" we
rely on to support our everyday activities is supported by at least 1000
assumptions, and getting one in a thousand of those wrong would
invalidate the meaning of these facts, making it effectively impossible
to act normally and expect predictable results. A similarity of 99.9%
between two fundamental ontologies may therefore not be enough for any
meaningful level of communication. But, as I said at the start, this is
an issue that needs investigation.
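
To make the back-of-envelope numbers concrete, here is a minimal Python
sketch of the arithmetic. The rates and the 1-in-1000 divergence figure
are the illustrative assumptions from the paragraph above, not
measurements, and the independence assumption is an added simplification:

    # Sketch of the back-of-envelope argument above. All figures are
    # illustrative assumptions from the text, not measurements.

    inferences_per_second = 1_000_000   # assumed reasoning rate of an agent
    divergence_fraction = 1 / 1000      # assumed 1-in-1000 ontology difference
    assumptions_per_fact = 1000         # assumed support base of a "fact"

    # Differing inferences generated per second from the same information.
    differing_per_second = inferences_per_second * divergence_fraction
    print(f"differing inferences/sec: {differing_per_second:.0f}")   # 1000

    # If each assumption is independently wrong with probability 1/1000,
    # the chance that a fact resting on 1000 assumptions is untouched:
    p_intact = (1 - divergence_fraction) ** assumptions_per_fact
    print(f"P(fact unaffected): {p_intact:.3f}")      # ~0.368
    print(f"P(fact undermined): {1 - p_intact:.3f}")  # ~0.632

On these crude independence assumptions, roughly 63% of such facts would
rest on at least one divergent assumption, which is one way of seeing
why 99.9% similarity might not be enough.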
We all know that people differ in assumptions and beliefs, and yet we do
manage to communicate reasonably well in most cases. How can that be? It
happens, probably, because we **know** that we have different
assumptions and beliefs, and when communicating we assume only a certain
fundamental set of knowledge in common, and rely only on that basic set
of common assumptions and beliefs to express the ideas we want to
communicate. If we 'misunderestimate' what our fellow conversant knows,
there can be and often is a miscommunication. The ability to communicate
effectively depends on the ability to guess correctly what facts,
assumptions, and beliefs are likely to be shared by those with whom we
communicate. Among specialists, of course, a lot more common technical
knowledge is assumed.
An issue that has occupied some of my attention lately is the question
of what basic ontological concepts are sufficient to support accurate
communication. I frame the issue as analogous to the "defining
vocabulary" used by some dictionaries as a controlled vocabulary with
which they define all of their words; for the Longman dictionary, it is
around 2,000 words. The analogous question is how many fundamental
ontological elements (types/classes and relations/functions) are needed
to logically specify, to some adequate level of detail, the meanings of
all other terms used in a reasonably complex domain having perhaps
100,000+ terms. I don't know, but I think this question is important
enough to warrant substantial effort. My guess is in the 6,000-10,000
concept range, and that many of those are fundamental enough to be
common to many complex domains.
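
The defining-vocabulary analogy can be stated mechanically: given a
lexicon in which each term is defined using other terms, check that
every definition bottoms out, without circularity, in the core set. A
minimal Python sketch; the toy core set, lexicon, and function name are
invented purely for illustration, not drawn from any actual ontology:

    # Toy sketch of the "defining vocabulary" idea: every term must
    # reduce, directly or transitively, to a small core set.

    core = {"thing", "part", "cause", "before"}  # stand-in for primitives

    lexicon = {
        "event": ["thing", "cause", "before"],
        "process": ["event", "part"],
        "artifact": ["thing", "cause"],
        "assembly": ["artifact", "part"],
    }

    def grounded(term, seen=frozenset()):
        """True if `term` reduces to the core vocabulary without cycles."""
        if term in core:
            return True
        if term in seen or term not in lexicon:
            return False  # circular or undefined definition
        return all(grounded(t, seen | {term}) for t in lexicon[term])

    # Every defined term should bottom out in the core set.
    assert all(grounded(t) for t in lexicon)

The open empirical question is then how large the core has to be: 2,000
words for Longman's defining vocabulary, perhaps 6,000-10,000 concepts
for a formal ontology.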
Any other guesses?
Patrick Cassidy
MICRA, Inc.
908-561-3416
cell: 908-565-4053
cassidy@xxxxxxxxx
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx