
Re: [ontolog-forum] Relating and Reconciling Ontologies

To: "[ontolog-forum] " <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "AzamatAbdoullaev" <abdoul@xxxxxxxxxxxxxx>
Date: Thu, 21 Apr 2011 22:15:58 +0300
Message-id: <8354218D867E43B5B6773EBC10E54A6A@personalpc>
Cory Casanave wrote:
"Instead consider a "multi hub" approach where we attempt to minimize the 
number of reference models but accept that there may be more than one,
even for a single domain. Endpoints (the viewpoint specific schema and 
ontologies) may be grounded in more than one such reference models and 
reference models may be partially federated - this provides for federation 
where there is ANY reference ontology in common."    (01)

This reminds me of the SW discussion on a Federal Ontology System: 
http://lists.w3.org/Archives/Public/semantic-web/2009Jan/0030.html. There 
it was suggested:
"One can merge ontologies of different schemes, languages, scope, degree, 
granularity in several ways, like the different cultures in a society:
a) multiculturalism (multi-ontologies, loose and free as birds, like a 
bottom-up folksonomy, a people's taxonomy);
b) melting pot (mixing and amalgamating ontologies);
c) Monoculturism (absorbing all numerosity of ontologies into a single 
whole);
d) Core culture (Leitkultur, a top-bottom globally federated ontology)."
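
In implementation terms, option (a) keeps the source ontologies as separate
graphs tied together by explicit cross-mappings, while (b) and (c) collapse
everything into one graph.  A rough sketch of that difference, assuming
rdflib and using invented files and URIs:

    # Rough sketch only: the ontology files and URIs below are placeholders.
    from rdflib import Graph, URIRef
    from rdflib.namespace import OWL

    anatomy = Graph().parse("anatomy.owl", format="xml")      # one source ontology
    pathology = Graph().parse("pathology.owl", format="xml")  # another source ontology

    # (a) "multiculturalism": keep the graphs separate, record cross-mappings
    mappings = Graph()
    mappings.add((URIRef("http://example.org/anatomy#Heart"),
                  OWL.equivalentClass,
                  URIRef("http://example.org/pathology#Heart")))

    # (b)/(c) "melting pot" or "monoculturalism": one amalgamated graph
    merged = anatomy + pathology + mappings

Option (d), on this reading, would keep one privileged core graph and map
the other ontologies to it rather than to each other.
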
Azamat Abdoullaev    (02)

----- Original Message ----- 
From: "Cory Casanave" <cory-c@xxxxxxxxxxxxxxx>
To: "[ontolog-forum] " <ontolog-forum@xxxxxxxxxxxxxxxx>
Sent: Thursday, April 21, 2011 9:49 PM
Subject: Re: [ontolog-forum] Relating and Reconciling Ontologies    (03)


> Azamat,
> Re: Given that, I am convinced that obtaining the General Semantic
> Interoperability standard, costing hundreds of billions per year, means
> developing a single world reference model in the first place.
>
> [cbc] I must disagree with you on this point.  The spectrum runs from a
> single universal model at one end to chaos at the other, and we are
> currently closer to the chaos end.  While I do agree
> that we can have general and reusable reference ontologies that serve to
> bind the many different representations, I don't think we will achieve
> this with a "single world reference model".  Such a universal model
> would require the conception and integration of too many viewpoints and
> theories.
>
> Instead consider a "multi hub" approach where we attempt to minimize the
> number of reference models but accept that there may be more than one,
> even for a single domain.  Endpoints (the viewpoint specific schema and
> ontologies) may be grounded in more than one such reference models and
> reference models may be partially federated - this provides for
> federation where there is ANY reference ontology in common.    This
> would allow for more of an open community leveraging and developing
> reference ontologies where by those that are most successful at being
> reused and federated will grow in authority.  It would allow a community
> to develop their reference ontology and, before or after the fact,
> relate it to other references to improve interoperability.
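
The federation rule here - federation wherever there is ANY reference
ontology in common - is easy to state in code.  A minimal sketch, with
made-up endpoint and reference-ontology names:

    # Minimal sketch of the "multi hub" rule; all names below are invented.
    groundings = {
        "state_court_schema":  {"justice_ref", "person_ref"},
        "federal_case_schema": {"justice_ref", "event_ref"},
        "hospital_schema":     {"clinical_ref", "person_ref"},
    }

    def can_federate(a: str, b: str) -> bool:
        # Federation is possible wherever the grounding sets overlap at all.
        return bool(groundings[a] & groundings[b])

    print(can_federate("state_court_schema", "federal_case_schema"))  # True: share justice_ref
    print(can_federate("federal_case_schema", "hospital_schema"))     # False: nothing in common
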
>
> Such a set of reference ontologies, more loosely coupled, seems to
> better fit our social systems, our capacity to agree, and our desire for
> local control.  It also allows more rapid results in narrow domains.
> What is required for this is the technology and culture for modularity
> and relations between models.  I am involved in some of the efforts you
> reference, such as NIEM, and feel that a more domain-focused reference
> ontology with links to schemas such as NIEM would provide a substantial
> advantage.  That is the approach we are pursuing in "SIMF"
> (http://www.omgwiki.org/architecture-ecosystem/doku.php?id=semantic_information_modeling_for_federation_rfp),
> previously posted.
AA: The linked page shows no content: "This topic does not exist yet..."
>
> Regards,
> Cory Casanave
>
>
> -----Original Message-----
> From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx
> [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of
> AzamatAbdoullaev
> Sent: Thursday, April 21, 2011 1:36 PM
> To: [ontolog-forum]
> Subject: Re: [ontolog-forum] Relating and Reconciling Ontologies
>
> AA
>> Interoperability is a critical idea needing depth and breadth and a
>> common foundation framework.
> JS
>> At that level of detail, we agree.  But the framework must support
>> existing systems, future systems, and people at all levels of
>> education.
> John,
> We are in accord here.
> But for any thing, product, system, service, network, or technology to be
> interoperable, it must be compatible with the same standard, ideally with
> a standard ontology.
> Let me recall some key points mentioned in my last message:
> 1. There are nation-level programs such as the EU Interoperability
> Framework, the US NIEM, and the UK e-GIF.
> 2. Interoperability in general implies common standards, specifications,
> formats, categorizations, and integration, as well as unifying models and
> schemas.
> 3. The General Interoperability Framework (GIF) is closely connected
> with a world/domain reference model as a common foundation ontology.
> Take what closely concerns most of the Forum: the US National
> Information Exchange Model, http://www.niem.gov/.  "It is designed to
> develop, disseminate and support enterprise-wide information exchange
> standards and processes that can enable jurisdictions to effectively
> share critical information in emergency situations, as well as support
> the day-to-day operations of agencies throughout the nation."
> Its syntactic interoperability is to be achieved by using the XML Schema
> data model, constructs, and methods, thus, seemingly, supporting existing
> "legacy systems" across all levels of government: federal, state,
> and local.
> However, the issue of issues is how to achieve computable Semantic
> Interoperability among any and all communicating entities, legacy ones
> or not.  Seemingly, this is to be done by developing the GIF, which
> implies a fundamental set of basic entities and relationships providing
> the semantic basis (meaning exchange/interpretation standards and
> processes) for more specialized domains, fields, and applications.
> Given that, I am convinced that obtaining the General Semantic
> Interoperability standard, costing hundreds of billions per year, means
> developing a single world reference model in the first place.
> Azamat Abdoullaev
>
> ----- Original Message -----
> From: "John F. Sowa" <sowa@xxxxxxxxxxx>
> To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
> Cc: "Smith, Barry" <phismith@xxxxxxxxxxx>; "Azamat"
> <abdoul@xxxxxxxxxxxxxx>
> Sent: Thursday, April 21, 2011 4:50 PM
> Subject: Relating and Reconciling Ontologies
>
>
>> Barry, Azamat, et al.,
>>
>> The note copied below was addressed to the Ontology Summit list.
>> But it addresses important issues that should be discussed in the
>> wider forum.  I presented some related slides there on Tuesday:
>>
>>    http://www.jfsowa.com/talks/par.pdf
>>
>> The concluding slide 6 advocated automated methods for relating
>> ontologies to one another and for extracting ontologies from
>> legacy software and from natural language texts.  This morning,
>> I added some pointers to suggested readings for further detail.
>>
>> BS
>>> The mappings I know of between ontologies in practical use
>>> (for example between different anatomy ontologies) involve very
>>> costly manual effort, and even then they are still imperfect
>>> (and fragile as the mapped ontologies themselves change).
>>
>> I agree.
>>
>> Even worse, inter-annotator agreement among professionals who use
>> the ontologies (and the related terminologies) is very poor.  At
>> the Ontology Summit, I was discussing the issues with a physician
>> who cited a discouraging result:  agreement between any two
>> ophthalmologists who assign SNOMED codes to a set of cases is
>> about 60%.
>>
>> The annotators don't even agree with themselves.  In the study,
>> the experimenters retested exactly the same ophthalmologists
>> a year later on a subset of exactly the same cases.  For each
>> of the "experts", their new answers had about a 60% agreement
>> with their answers the year before.
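
Read as simple percent agreement - the fraction of cases on which two sets
of assigned codes coincide - the arithmetic behind such figures is trivial.
A toy calculation, with invented placeholder codes rather than real SNOMED
identifiers:

    # Toy percent-agreement calculation; the codes are invented placeholders.
    def percent_agreement(codes_a, codes_b):
        matches = sum(1 for a, b in zip(codes_a, codes_b) if a == b)
        return matches / len(codes_a)

    year1 = ["code_07", "code_12", "code_07", "code_33", "code_12"]
    year2 = ["code_07", "code_33", "code_07", "code_33", "code_51"]  # same expert, a year later
    print(percent_agreement(year1, year2))  # 0.6
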
>>
>> This is the fatal flaw in any system that depends on human experts
>> to link real-world data to formal definitions.  Unique identifiers
>> of formal definitions are hopelessly unreliable in any system that
>> depends on human annotators to select an option from a menu.
>>
>> BS
>>> Can John point to examples of practically useful mappings created
>>> and updated automatically through appeal to some sort of Lindenbaum
>>> lattice-based technology?
>>
>> Yes, indeed.  Every *correct* alignment of any two ontologies that
>> has ever been done by human or machine is a successful application
>> of the mappings shown in a Lindenbaum lattice.
>>
>> The lattice is actually a very simple structure that can be
>> specified on one page.  It is the formal foundation for every
>> method of theory revision or ontology alignment.
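
One toy way to picture that structure (not Sowa's formulation, just an
illustration with invented theories): treat each theory as a set of axioms,
so that adding axioms specializes it; the most specific common
generalization of two theories is then the axioms they share, and their
most general common specialization asserts everything both do.

    # Toy lattice of theories-as-axiom-sets; names and axioms are invented.
    from itertools import combinations

    theories = {
        "vehicle": frozenset({"has_wheels", "self_propelled"}),
        "bicycle": frozenset({"has_wheels", "pedal_driven"}),
        "car":     frozenset({"has_wheels", "self_propelled", "four_wheeled"}),
    }

    def common_generalization(t1, t2):   # axioms shared by both theories
        return t1 & t2

    def common_specialization(t1, t2):   # axioms asserted by either theory
        return t1 | t2

    def specializes(t1, t2):             # t1 asserts everything t2 does
        return t2 <= t1

    print(specializes(theories["car"], theories["vehicle"]))   # True
    for a, b in combinations(theories, 2):
        print(a, b, sorted(common_generalization(theories[a], theories[b])))
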
>>
>> The lattice is like arithmetic.  People were counting on their
>> fingers long before Peano stated his axioms.  The theory doesn't
>> say that counting on fingers is bad, but it can distinguish sound
>> methods from flaky ones.  Furthermore, it can provide guidelines
>> for designing automated and semi-automated tools that can be
>> much faster and more reliable than finger exercises.
>>
>> AA
>>> Interoperability is a critical idea needing depth and breadth
>>> and a common foundation framework.
>>
>> At that level of detail, we agree.  But the framework must support
>> existing systems, future systems, and people at all levels of
>> education.  (And even experts in one field are novices in others.)
>>
>>  1. There are trillions of dollars of legacy software that run the
>>     world economy.  It won't be replaced for a long, long time.
>>
>>  2. Anything that replaces a legacy system has to interoperate with
>>     it during a long period of transition.  In fact, most systems
>>     that replace a legacy system build on and extend the implicit
>>     ontology in the old system.
>>
>>  3. Anything that depends on people using unique identifiers must
>>     address the problem that even experts in a subject can't agree
>>     on what codes or categories to assign.
>>
>> John
>>
>> -------- Original Message --------
>> Subject: Re: [ontology-summit] Official Communique Feedback Thread
>> Date: Wed, 20 Apr 2011 10:39:51 -0400
>> From: Barry Smith
>> To: Ontology Summit 2011 discussion <ontology-summit@xxxxxxxxxxxxxxxx>
>>
>> On Wed, Apr 20, 2011 at 9:50 AM, John F. Sowa <sowa@xxxxxxxxxxx> wrote:
>>> AGC
>>>> ... having one single ontology does not solve the problem.  Actually,
>>>> IMHO, it does not solve anything.  It could probably be a good idea to
>>>> address the issue of interoperability across ontologies rather than
>>>> pretending to have "one ontology per domain".
>>>
>>> Yes, indeed.
>>>
>>> There are already a huge number of implemented and proposed ontologies,
>>> and the largest number of potential ontologies comes from the trillions
>>> of dollars of legacy software.  The total number is finite, but it is
>>> sufficiently large that infinity is the only practical upper bound.
>>>
>>> BS
>>>> Who will keep the N-squared mappings up to date, for an N that is
>>>> increasing, if AGC gets his way, without limit? Who will pay for this
>>>> ever increasing mapping effort? Who will oversee the mapping effort?
>>>
>>> The only reasonable solution is to provide automated methods for
>>> discovering the mappings.  Adolf Lindenbaum showed how to do that
>>> over 80 years ago -- it's called the Lindenbaum lattice.
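
The scale of the problem Barry raises is simple arithmetic: direct pairwise
alignments grow quadratically in the number of ontologies, while mapping
each ontology once to a shared reference (or into a common lattice) grows
only linearly.  A quick illustration:

    # N-squared vs. hub-style mapping counts (pure arithmetic, no real data).
    def pairwise_mappings(n):   # one alignment per unordered pair
        return n * (n - 1) // 2

    def hub_mappings(n):        # one alignment per ontology, to a shared reference
        return n

    for n in (10, 100, 1000):
        print(n, pairwise_mappings(n), hub_mappings(n))
    # 10 -> 45 vs 10; 100 -> 4950 vs 100; 1000 -> 499500 vs 1000
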
>>>
>>>     For a brief survey, see Sections 6 and 7 of the following paper:
>>>
>>>     http://www.jfsowa.com/pubs/rolelog.pdf
>>>
>>> John
>>
>> It would be nice if it worked. But in practice, at least in the areas
>> with which I am familiar, it doesn't. The mappings I know of between
>> ontologies in practical use (for example between different anatomy
>> ontologies) involve very costly manual effort, and even then they are
>> still imperfect (and fragile as the mapped ontologies themselves
>> change). See e.g. the papers by Bodenreider (who does the best work in
>> this field) listed here:
>>
>> http://mor.nlm.nih.gov:8000/pubs/offi.html
>>
>> (and especially the items co-authored with Zhang).
>> Can John point to examples of practically useful mappings created and
>> updated automatically through appeal to some sort of Lindenbaum
>> lattice-based technology?
>>
>> BS
>
>


_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J    (05)
