
Re: [ontology-summit] Ontology driven Data Integration using owl:equivalentClass relations

To: ontology-summit@xxxxxxxxxxxxxxxx
From: Ron Wheeler <rwheeler@xxxxxxxxxxxxxxxxxxxxx>
Date: Sat, 08 Feb 2014 22:13:56 -0500
Message-id: <52F6F274.4080805@xxxxxxxxxxxxxxxxxxxxx>
On 08/02/2014 6:26 PM, Patrick Cassidy wrote:

Kingsley, just a reply to one point:

 

[PC]  >>  If one only wants to create equivalencies and use that to perform probabilistic or pattern-matching reasoning,

>> that may lead to useful results that can be helpful for the humans who evaluate the results. 

>>  But don’t expect the kind of accuracy that would be needed to allow the computers

>> to make mission-critical decisions without human intervention.  

>
[KI] > Computers cannot be left alone to make mission-critical decisions for humans. What they can

> do is perform a lot of the grunt work that helps human beings make better decisions, more productively.


It is believed that medical errors kill over 400,000 people a year in the US. (http://www.fiercehealthcare.com/story/hospital-medical-errors-third-leading-cause-death-dispute-to-err-is-human-report/2013-09-20)
What will be the acceptable loss rates for computers making mission-critical decisions?
It appears that highly trained professionals have a very high rate of error.

Car accidents caused over 34,000 deaths in the US in 2012. (http://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)
What will be an acceptable rate when computers control the cars?
We are already starting to build cars that talk to each other and make decisions about the level of danger in a situation before the driver is aware of it.

Ron




     Well, giving computers an increasing ability to perform without humans is, I believe, one of the goals that motivates many workers in knowledge-based systems (myself included), and in other fields as well.  We hear a lot about self-driving cars, or airplane defense systems that respond automatically to perceived threats, or billion-dollar rovers that have to negotiate uncertain terrain on other planets.  At present such systems need to be carefully constructed for restricted domains, but increased capability will make them more useful, and is eagerly sought after.

 

My own interest is precisely to learn what is required for systems to become as capable as humans in a wide range of situations.  Those who have immediate problems to solve in a short time frame will find this of no direct relevance.  But if one needs to communicate **accurately** between systems, the tactic of using a single language in the communication process is unavoidable.  For two, or a small number of, systems that can agree ahead of time on the meanings of terms or data fields, that single language is established by such prior communication.  For the case where one party puts data on the web and another party wants to use the data without prior communication with the first party, the user still needs to understand the meanings of the terms used by the data supplier . . . and if one wants accuracy over a range of topics, there is no alternative that I have seen to precise logical descriptions of the terms that both parties have access to.  Precise logical descriptions of terms are the function of the foundation ontology, and both communicating parties need to use the same foundation ontology (of which there could be more than one).

[PC] >> Or, if one only wants to extract one or two properties of an entity (e.g. the director of a film), equating “film” in two different ontologies may work as intended.  But extreme caution is advised.

>>
[KI] > Of course this has to be done with caution. I've never perceived ontologies as a mechanism for making computers perform tasks for which they are ill equipped.
> Computers are tools, and when used properly, extremely powerful productivity tools for humans.

   Yes, computers are tools and the tasks for which they are properly equipped are becoming increasingly sophisticated.  For the task of communication among computer systems that need to properly interpret the transmitted data, some prior agreement on meaning is required, and a common foundation ontology can provide that agreement over a broad range of topics with the least effort.  Otherwise one will have probabilistic interpretations that will be suitable for some applications, but too inaccurate for others.

 

Pat

Patrick Cassidy

MICRA Inc.

cassidy@xxxxxxxxx

1-908-561-3416

 

From: ontology-summit-bounces@xxxxxxxxxxxxxxxx [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Kingsley Idehen
Sent: Friday, February 07, 2014 11:31 PM
To: ontology-summit@xxxxxxxxxxxxxxxx
Subject: Re: [ontology-summit] Ontology driven Data Integration using owl:equivalentClass relations

 

On 2/7/14 3:53 PM, Patrick Cassidy wrote:

There is a serious problem with the suggested methodology:

 

>  Two ontologies or vocabularies (for instance FOAF and Schema.org) include definitions for the same class (or kind) of entity e.g., an Organization,

>  and as a consequence we end up with Web accessible   documents comprised of RDF statements that describe Organizations as instances of  foaf:Organization or schemaorg:Organization.
>
>  Challenge: How do we get a merged view of all the organizations, irrespective of how they've been described across various RDF documents?
>
>  Solution:
>
>  1. Make a mapping/bridge/meta ontology that uses owl:equivalentClass relations to indicate the fact

>  that <http://xmlns.com/foaf/0.1/Organization> and <http://schema.org/Organization> are equivalent.
>
>  2. Load the mapping/bridge/meta ontology document into a data management system that's capable

>  of applying reasoning and inference to the equivalence claim based on its comprehension of the relation semantics expressed
>
>  3. Access instances of the <http://xmlns.com/foaf/0.1/Organization> class (e.g., by seeking a description

>   of <http://xmlns.com/foaf/0.1/Organization>, which should produce a solution that includes subjects

>  of instanceOf (rdf:type) relations) -- and this will show a union of all instances across <http://xmlns.com/foaf/0.1/Organization> and <http://schema.org/Organization>
>
>  4. Reverse the action in step 3 above -- the results should be the same.

   If you create an equivalence mapping between entities in  independently developed ontologies  and try to “reason” with it in any but the most highly restricted manner, you will almost certainly find unintended inferences, likely logical inconsistency, and potentially a great deal of gibberish.    When you look at the logical specifications of entities of the same name in different ontologies, they are often quite different, even though the intuition for the meanings may be similar.  For “organization”, for example, some ontologies have that as a subtype of “group of people”.  In some legal jurisdictions, an Organization can exist without any members – i.e., no people.   That can lead to logical contradictions if different definitions are equated.  I have never seen “process” defined the same way in any two independent ontologies.
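
A minimal sketch of that failure mode, assuming Python with rdflib. None of the axioms below come from FOAF, Schema.org, or any published ontology; they are invented purely to show how the unintended inference arises.

# Hypothetical axioms: ontology A treats Organization as a kind of
# GroupOfPeople, ontology B allows a memberless organization, and a
# bridge axiom equates the two Organization classes.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/data#")

axioms = """
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix onta: <http://example.org/ontA#> .
@prefix ontb: <http://example.org/ontB#> .
@prefix ex:   <http://example.org/data#> .

onta:Organization rdfs:subClassOf onta:GroupOfPeople .
onta:Organization owl:equivalentClass ontb:Organization .
ex:shell_co a ontb:Organization .
"""

g = Graph()
g.parse(data=axioms, format="turtle")

# Propagate the equivalence and the subclass axiom by hand (an OWL
# reasoner would do this automatically).
for c1, c2 in g.subject_objects(OWL.equivalentClass):
    for inst in list(g.subjects(RDF.type, c2)):
        g.add((inst, RDF.type, c1))
for sub, sup in g.subject_objects(RDFS.subClassOf):
    for inst in list(g.subjects(RDF.type, sub)):
        g.add((inst, RDF.type, sup))

# ex:shell_co is now typed as onta:GroupOfPeople -- an inference the
# author of ontology B never intended.
print(list(g.objects(EX.shell_co, RDF.type)))

If ontology A further required every GroupOfPeople to have at least one member, a complete OWL reasoner would report the merged graph as inconsistent rather than merely drawing this unintended type.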


Yes, but nothing I've outlined contradicts the point you are making.

My point is that you can opt to apply reasoning and inference based on classes (entity types/kinds) described across different ontologies. There is no such thing as one ontology that rules all. What you can do is opt to apply meta/bridge/mapping ontologies, according to your situation.

In my example, I am demonstrating how an understanding of FOAF and Schema.org classes can be used to integrate instances of classes from either ontology. Note that I specifically include examples with and without inference and reasoning, to drive home the point that this is all loosely coupled, i.e., its application and use is a choice made by the system user.

 

   If one only wants to create equivalencies and use that to perform probabilistic or pattern-matching reasoning, that may lead to useful results that can be helpful for the humans who evaluate the results.  But don’t expect the kind of accuracy that would be needed to allow the computers to make mission-critical decisions without human intervention.  


Computers cannot be left alone to make mission-critical decisions for humans. What they can do is perform a lot of the grunt work that helps human beings make better decisions, more productively.


Or, if one only wants to extract one or two properties of an entity (e.g. the director of a film), equating “film” in two different ontologies may work as intended.  But extreme caution is advised.


Of course this has to be done with caution. I've never perceived ontologies as a mechanism for making computers perform tasks for which they are ill equipped.

Computers are tools, and when used properly, extremely powerful productivity tools for humans.

Kingsley

 

     Pat

 

Patrick Cassidy

MICRA Inc.

cassidy@xxxxxxxxx

1-908-561-3416

 

From: ontology-summit-bounces@xxxxxxxxxxxxxxxx [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Andrea Westerinen
Sent: Friday, February 07, 2014 1:17 PM
To: Ontology Summit 2014 discussion
Subject: Re: [ontology-summit] Ontology driven Data Integration using owl:equivalentClass relations

 

Kingsley, +1 ... Your mapping/bridge/meta ontology is my "integrating ontology".  And, you captured the essence extremely well in your demos.

 

The keys are: 

1.  Creating the mappings

2.  Reasoning with the mappings

 

Clearly this works over data that is Linked Data or data in ontologies.

 

 

 

On Fri, Feb 7, 2014 at 10:01 AM, Kingsley Idehen <kidehen@xxxxxxxxxxxxxx> wrote:

All,

Starting a new thread based on the theme above to make what I am trying to demonstrate clearer.

Situation:

Schema.org [1] is a collaborative effort aimed at simplifying structured data publication on the Web. As part of this effort, the collaborators have collectively produced a number of shared vocabularies under the "schema.org" namespace.

In addition to what's being produced by Schema.org, there are thousands of shared ontologies and vocabularies that have been constructed and published to the Web from a plethora of sources. Many of these have been aggregated by services such as LOV (Linked Open Vocabularies) [2], which basically accentuates the TBox and RBox aspects of the Linked Open Data (LOD) Cloud.

Typical Integration Problem:

Two ontologies or vocabularies (for instance FOAF and Schema.org) include definitions for the same class (or kind) of entity e.g., an Organization, and as a consequence we end up with Web accessible documents comprised of RDF statements that describe Organizations as instances of  foaf:Organization or schemaorg:Organization.

Challenge: How do we get a merged view of all the organizations, irrespective of how they've been described across various RDF documents?


Solution:

1. Make a mapping/bridge/meta ontology that uses owl:equivalentClass relations to indicate the fact that <http://xmlns.com/foaf/0.1/Organization> and <http://schema.org/Organization> are equivalent.

2. Load the mapping/bridge/meta ontology document into a data management system that's capable of applying reasoning and inference to the equivalence claim based on its comprehension of the relation semantics expressed

3. Access instances of the <http://xmlns.com/foaf/0.1/Organization> class (e.g., by seeking a description of <http://xmlns.com/foaf/0.1/Organization>, which should produce a solution that includes subjects of instanceOf (rdf:type) relations) -- and this will show a union of all instances across <http://xmlns.com/foaf/0.1/Organization> and <http://schema.org/Organization>

4. Reverse the action in step 3 above -- the results should be the same. (A short code sketch of these four steps follows below.)
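
A minimal sketch of steps 1-4, assuming Python with rdflib. The instance data (ex:acme, ex:globex) is invented for illustration, and the owl:equivalentClass closure is applied by hand here; a store such as Virtuoso that comprehends the relation's semantics would derive it automatically.

# A minimal sketch (not the actual demo setup) of the mapping-ontology
# approach, using rdflib. Instance data is hypothetical.
from rdflib import Graph, Namespace, RDF, OWL

FOAF = Namespace("http://xmlns.com/foaf/0.1/")
SCHEMA = Namespace("http://schema.org/")

# Step 1: a tiny mapping/bridge ontology asserting the equivalence.
mapping_ttl = """
@prefix owl:    <http://www.w3.org/2002/07/owl#> .
@prefix foaf:   <http://xmlns.com/foaf/0.1/> .
@prefix schema: <http://schema.org/> .

foaf:Organization owl:equivalentClass schema:Organization .
"""

# Hypothetical instance data, described against the two vocabularies.
data_ttl = """
@prefix foaf:   <http://xmlns.com/foaf/0.1/> .
@prefix schema: <http://schema.org/> .
@prefix ex:     <http://example.org/> .

ex:acme   a foaf:Organization .
ex:globex a schema:Organization .
"""

g = Graph()
g.parse(data=mapping_ttl, format="turtle")
g.parse(data=data_ttl, format="turtle")

# Step 2: apply the owl:equivalentClass semantics. Only the part needed
# for this demo is implemented by hand: every instance of one class is
# also typed as an instance of the equivalent class, in both directions.
for c1, c2 in g.subject_objects(OWL.equivalentClass):
    for inst in list(g.subjects(RDF.type, c1)):
        g.add((inst, RDF.type, c2))
    for inst in list(g.subjects(RDF.type, c2)):
        g.add((inst, RDF.type, c1))

# Steps 3 and 4: asking for instances of either class now yields the
# union of both -- ex:acme and ex:globex in each case.
print(sorted(g.subjects(RDF.type, FOAF.Organization)))
print(sorted(g.subjects(RDF.type, SCHEMA.Organization)))

Both queries print the same two-member set, which is the merged view of step 3; because the closure is symmetric, reversing the question (step 4) changes nothing.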


Live Demo Link:

[1] http://lod.openlinksw.com/describe/?url= -- description of <http://xmlns.com/foaf/0.1/Organization> *without inference and reasoning enabled*, so the relations presented are specific to the aforementioned class.

[2] http://lod.openlinksw.com/describe/?url= -- description of <http://schema.org/Organization> *without inference and reasoning enabled*, so the relations presented are specific to the aforementioned class.

[3] http://lod.openlinksw.com/describe/?url= -- description of <http://xmlns.com/foaf/0.1/Organization> *with inference and reasoning enabled*.

[4] http://lod.openlinksw.com/describe/?url= -- description of <http://schema.org/Organization> *with inference and reasoning enabled*.

--

Regards,

Kingsley Idehen
Founder & CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter Profile: https://twitter.com/kidehen
Google+ Profile: https://plus.google.com/+KingsleyIdehen/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen










 




 




-- 
 
Regards,
 
Kingsley Idehen             
Founder & CEO 
OpenLink Software     
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter Profile: https://twitter.com/kidehen
Google+ Profile: https://plus.google.com/+KingsleyIdehen/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen
 
 
 
 


 


-- 
Ron Wheeler
President
Artifact Software Inc
email: rwheeler@xxxxxxxxxxxxxxxxxxxxx
skype: ronaldmwheeler
phone: 866-970-2435, ext 102

_________________________________________________________________
Msg Archives: http://ontolog.cim3.net/forum/ontology-summit/   
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontology-summit/  
Unsubscribe: mailto:ontology-summit-leave@xxxxxxxxxxxxxxxx
Community Files: http://ontolog.cim3.net/file/work/OntologySummit2014/
Community Wiki: http://ontolog.cim3.net/cgi-bin/wiki.pl?OntologySummit2014  
Community Portal: http://ontolog.cim3.net/wiki/