
Re: [ontolog-forum] Polysemy and Subjectivity in Ontologies - the HDBIexa

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "doug foxvog" <doug@xxxxxxxxxx>
Date: Wed, 17 Nov 2010 16:19:33 -0500 (EST)
Message-id: <64247.71.178.11.66.1290028773.squirrel@xxxxxxxxxxxxxx>
Rich,    (01)

A general comment before going into detail:
  Terms in ontologies should be treated as different things from words
  in natural languages.  A discussion of the properties of ontologies
  is not a discussion of NLP.    (02)

On Wed, November 17, 2010 13:13, Rich Cooper said:    (03)

> John, you posted a response to my question on 11/6 - sorry I just got
> around to having time to answer it.  Please see comments below,    (04)

> -Rich    (05)

> Sincerely,
> Rich Cooper
> EnglishLogicKernel.com
> Rich AT EnglishLogicKernel DOT com
> 9 4 9 \ 5 2 5 - 5 7 1 2    (06)


> John Sowa wrote:
> On 11/6/2010 12:55 PM, Rich Cooper wrote:    (07)

>>> Then perhaps I don't understand the reasons why you define an
>>> ontology as monosemous.  Why don't you think a practical ontology
>>> MUST be polysemous if you agree with the conclusion I reached?    (08)

>> For many years, I have been saying that there is no such thing as one
>> ideal ontology of everything.  In my 1984 book, the last chapter had
>> the title "Limits of Conceptualization," in which I outlined the many
>> problems with assuming one ideal ontology.    (09)

>> In my 2000 book, I covered similar material in more detail in Ch 6,
>> which had the title "Knowledge Soup."  That was also the title of
>> a talk I gave in 1987, and a paper I published in 1991.  In 2004,
>> I wrote a longer version, "The Challenge of Knowledge Soup":    (010)

>>     http://www.jfsowa.com/pubs/challenge    (011)

>> There are several points, which I have emphasized over and over
>> and over again in those publications and many email notes:    (012)

>>   1. There is no such thing as an ideal ontology with one unique
>>      meaning for every term.    (013)

>>   2. But for any system of formal reasoning, we must have one
>>      meaning for each term in order to avoid contradictions.    (014)

> Assuming this (point 2) is true implies we    (015)

This is not a restriction on what "we" can do, but on what a "system
of formal reasoning" can do.  Human reasoning deals not with
ontologies and formal logic, but often with languages in which terms
have multiple meanings.    (016)

> cannot reason with ambiguous
> representations, but that is clearly not true.    (017)

The following discussion does not deal with ontologies, but with
natural language processing.  Although ontologies are often used
to assist in NLP, they are two different topics.    (018)

This is the wrong forum to discuss NLP methods, except insofar as
the application of ontologies to NLP is the topic.    (019)

> Disambiguation in language
> only goes a short way before reaching the many, many interpretations that
> can be placed on a phrase ...    (020)

There is no disagreement with you in this forum on that matter.    (021)

An ontology would map its specific terms to the various senses of
specified words.  A given word could denote several different terms,
and a given term could be denoted by several words and phrases.    (022)
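
To make that N-to-N mapping concrete, here is a minimal Python sketch.
The words, term names, and helper function are invented for
illustration, not drawn from any particular ontology:

from collections import defaultdict

# Many-to-many mapping between NL words and ontology terms.
word_to_terms = defaultdict(set)   # word -> ontology terms it can denote
term_to_words = defaultdict(set)   # term -> words/phrases that denote it

def add_denotation(word, term):
    """Record that `word` is one way of denoting ontology term `term`."""
    word_to_terms[word].add(term)
    term_to_words[term].add(word)

add_denotation("bank", "RiverBank")
add_denotation("bank", "FinancialInstitution")
add_denotation("financial institution", "FinancialInstitution")

print(word_to_terms["bank"])
# one word, several denotations: {'RiverBank', 'FinancialInstitution'}
print(term_to_words["FinancialInstitution"])
# one term, several denoting phrases: {'bank', 'financial institution'}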

> Yet for NLP, we MUST handle ambiguity.  ...    (023)

> Avoiding contradictions is bad policy for NLP, IMHO.    (024)

Agreed.  Proper use of ontologies whose terms have single meanings,
with each word mapped to the different terms it can denote, is one
technique NLP systems can use to handle such contradictions.    (025)

> ... NLP requires
> ontologies of parallel interpretations (unless you can provide an
> alternative way to handle contradictions).    (026)

I wouldn't call these "ontologies of interpretations".  I would say
that NLP using ontologies requires the ontology to encode multiple
interpretations of terms in the language.    (027)

>>   3. Therefore, we can handle requirements #1 and #2 by providing
>>      an open-ended number of theories (or microtheories, as Cyc
>>      calls them), each of which has one meaning per term.    (028)

>>   4. But we can have terms with the same spelling, but different
>>      definitions (or axiomatizations) in different theories.    (029)

>>   5. In order to manage all that multiplicity of theories and to
>>      organize them in a way that enables us to find the one we need
>>      for any particular problem, we can arrange them in a generalization
>>      hierarchy.    (030)

I would note that this hierarchy is a DAG, not a tree.    (031)
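
To illustrate, a toy Python sketch follows.  The theory names are
invented; each node would fix one meaning per term, and a theory may
have several generalizations, which is what makes the structure a DAG
rather than a tree:

# A toy generalization DAG of theories (microtheories).
# Edges point from a theory to its generalizations.
generalizations = {
    "VertebrateTheory": [],
    "PetTheory": [],
    "DogAsAnimalTheory": ["VertebrateTheory"],
    "DogAsPetTheory": ["DogAsAnimalTheory", "PetTheory"],  # two parents
}

def ancestors(theory):
    """All generalizations reachable from `theory` in the DAG."""
    seen = set()
    stack = [theory]
    while stack:
        for parent in generalizations[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

print(ancestors("DogAsPetTheory"))
# {'DogAsAnimalTheory', 'VertebrateTheory', 'PetTheory'}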

> But a generalization hierarchy, while useful, is not the only approach.    (032)

Thus the word "can".  I interpreted this to mean that one method was
presented, but no claim was made that it was the only method.    (033)

> There are plenty of ways to represent concurrent interpretations and to
> search among them for those interpretations which can be mapped word by
> word, synset by synset into graphs of possible interpretations.    (034)

Sure.  Some of these would use ontologies, while others wouldn't.    (035)

> Especially with relational representations using relational database
> technologies, a table of interpretations can be constructed, with each
> interpretation thread elaborating its own row in that table.    (036)

This is an example that is not based on ontologies, and so is beyond
the scope of this forum.  [Note that I am in no way disputing this,
merely noting that it is off topic.]    (037)
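
For concreteness only (and still off topic here), one way such a table
of interpretations might look in a relational store.  The schema,
rows, and scores below are invented for illustration:

import sqlite3

# Each interpretation thread owns one row and elaborates it as
# processing proceeds; implausible readings can be pruned by score.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE interpretations (
        thread_id INTEGER PRIMARY KEY,
        phrase    TEXT,
        reading   TEXT,   -- one candidate sense assignment
        score     REAL    -- plausibility, updated as evidence arrives
    )""")
conn.executemany(
    "INSERT INTO interpretations VALUES (?, ?, ?, ?)",
    [(1, "saw the bank", "FinancialInstitution", 0.7),
     (2, "saw the bank", "RiverBank",            0.3)])

# Retrieve the surviving interpretations, best first.
for row in conn.execute(
        "SELECT * FROM interpretations ORDER BY score DESC"):
    print(row)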

>>   6. The complete hierarchy of all those theories would be an infinite
>>      lattice, and we can't implement all of them.  But any one(s) we need
>>      can be created by combinations and modifications of ones we have.    (038)

> I'm not certain that what is needed is a hierarchy - processing of
> ambiguous threads need not follow the same interpretation TYPE hierarchy.    (039)

The hierarchy discussed is not one of partially processed threads, but
of theories of meanings of sets of terms.  You are talking past each other
here.    (040)

John is discussing ontologies in this thread.  He is NOT discussing
NLP techniques.    (041)

> As you have pointed out many times, ambiguity in language is not
> equivalent to ambiguity in the word meaning lattice.  Even phrases
> like "throw the game", which have no meaningful physical
> interpretation, still have meaning in the recipient's and sender's
> heads about what "throw" means, and it isn't heaving a mass in a
> direction.    (042)

And an ontology that covers this topic should include ThrowingAGame
among its many denotations of "throw", with linguistic restrictions
requiring the sentence's direct object to be a competition and its
subject to be a competitor in that competition.    (043)
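
A hedged sketch of such a denotation entry with its restrictions (the
concept names, type assignments, and checking function below are all
invented, not taken from any actual lexicon):

# Each denotation of a word pairs an ontology term with selectional
# restrictions on the parse.
DENOTATIONS = {
    "throw": [
        {"term": "PropelPhysicalObject",        # heaving a mass
         "subject": "Agent", "direct_object": "PhysicalObject"},
        {"term": "ThrowingAGame",               # deliberately losing
         "subject": "Competitor", "direct_object": "Competition"},
    ],
}

# Toy type assignments for a few nouns.
TYPES = {"ball": {"PhysicalObject"},
         "game": {"Competition"},
         "pitcher": {"Agent", "Competitor"}}

def candidate_terms(verb, subj, dobj):
    """Denotations of `verb` whose restrictions the arguments satisfy."""
    return [d["term"] for d in DENOTATIONS.get(verb, ())
            if d["subject"] in TYPES.get(subj, set())
            and d["direct_object"] in TYPES.get(dobj, set())]

print(candidate_terms("throw", "pitcher", "game"))  # ['ThrowingAGame']
print(candidate_terms("throw", "pitcher", "ball"))  # ['PropelPhysicalObject']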

>>   7. When we're not sure which of the many theories with a particular
>>      term, say the predicate dog(x), is intended, we can select an
>>      underspecified theory near the top.  As we get more info, we can
>>      move to (or create) a more specialized theory that adds any detail
>>      we need.    (044)

Here, I'd part ways with John.  I'd use different terms in the ontology
for different meanings, such as throwing a game vs. physically throwing
a physical object.  However, for shades of meaning (throwing dice vs.
throwing a basketball), theory specialization can be quite useful.    (045)
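
In code terms, the distinction might look like this (all names and the
substring-matching rule are invented, purely to illustrate the two
strategies):

# Strategy A: unrelated meanings get distinct ontology terms outright.
THROW_TERMS = ["PropelPhysicalObject", "ThrowingAGame"]

# Strategy B: shades of one meaning share a term but live in
# progressively more specialized theories.
PARENT_THEORY = {
    "ThrowingDiceTheory":       "PropellingTheory",
    "ThrowingBasketballTheory": "PropellingTheory",
    "PropellingTheory":         None,  # underspecified theory near the top
}

def specialize(theory, evidence):
    """Move to a more specialized theory once evidence licenses it;
    otherwise stay with the underspecified one."""
    for child, parent in PARENT_THEORY.items():
        if parent == theory and evidence in child:
            return child
    return theory

print(specialize("PropellingTheory", "Dice"))        # ThrowingDiceTheory
print(specialize("PropellingTheory", "Basketball"))  # ThrowingBasketballTheory
print(specialize("PropellingTheory", "Javelin"))     # stays underspecified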

>> I'm sure that I repeated this in about 2,376 different notes over
>> the past ten years.  For the record, following is the most recent
>> talk in which I discussed it:    (046)

>>     http://www.jfsowa.com/talks/iss
>>     Integrating Semantic Systems    (047)

>> Please read slides 61 to 81.  Then bookmark it and reread it
>> whenever you have a question like the one above.    (048)

> I just revisited (for the fourth time) slides 61 to 81.  They are
> excellent slides for people who are concerned about repositories and
> about the portability of ontologies.  But the subject I am most
> concerned with is just the NLP aspects,    (049)

This is clear.  This results in you talking past each other.    (050)

> not interfacing databases of one type to those of another
> type by means of an ontology.    (051)

> All the years of work by many people in NLP have not paid off very well.
> The basic capability needed is in active voice (or even text) interactions
> having a natural feel to them, so that training is minimal for people who
> work with those systems.  Cars, phones, even the Kinect sensor, have
> leaned more and more toward active voice interfaces, but they all lack
> real effectiveness.    (052)

These in general do not use controlled languages.  Thus parsing and
ambiguity are issues.  Parsing of controlled languages is unambiguous.    (053)

> I believe there is more at risk here than JUST LOGIC.    (054)

Sure.  Things that are off topic in the ONTOLOG-FORUM.    (055)

> The actual designation of objects, structure of language utterances,
> and discovery of user intent require much more than JUST logic, so
> what else is missing for NLP to become useful as a computer interface
> technique?    (056)

This seems to be the wrong forum for such questions.  (Not that they
aren't interesting.)    (057)

> You mentioned Cyc and its microtheories for interpreting each meaning
> of a word.  It doesn't seem to have worked, in that Cyc hasn't gotten
> out of the public weal, with contract work only, and only for research,
> not for practical applications.    (058)

Cycorp has been much more focused on building the ontologies than on
NLP.  It has been used for practical applications for over a decade,
for example by a pharmaceutical company.    (059)

The microtheories do work, but more logic as well as more word sense
discrimination is needed for a general purpose text understanding system.    (060)

> So there is more than LOGIC at stake here.  I am trying to characterize
> what is needed to build a roadmap toward full NLP.  Any suggestions on
> what milestones that roadmap should have would be appreciated.    (061)

Some suggested considerations for the roadmap:
* Distinguish between NL words and ontology terms; there should be no
  overlap.
* Establish N-N mappings between NL words (and phrases) and ontology terms.
* Specify linguistic features of the words depending on the word sense.
* Establish general topics for the ontology terms (word senses).
* When selecting among possible word senses, favor those that share
  topics with each other and the broader context (a minimal sketch of
  this appears below).    (062)
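
The last suggestion amounts to topic-overlap sense selection.  A
minimal sketch, with an invented sense inventory and topics:

# Each candidate sense of each word carries a set of topics.
SENSES = {
    "bank":  {"RiverBank": {"geography", "water"},
              "FinancialInstitution": {"finance"}},
    "check": {"BankCheck": {"finance"},
              "Verify": {"procedure"}},
}

def pick_senses(words, context_topics):
    """For each word, favor the sense sharing the most topics with the
    other words' candidate senses and with the broader context."""
    chosen = {}
    for word in words:
        # Topics contributed by the other words' candidate senses.
        others = set().union(*(t for w in words if w != word
                               for t in SENSES[w].values()))
        chosen[word] = max(SENSES[word],
                           key=lambda s: len(SENSES[word][s] &
                                             (others | context_topics)))
    return chosen

print(pick_senses(["bank", "check"], context_topics={"finance"}))
# {'bank': 'FinancialInstitution', 'check': 'BankCheck'}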

-- doug f    (063)

> -Rich    (064)

>> John    (065)

=============================================================
doug foxvog    doug@xxxxxxxxxx   http://ProgressiveAustin.org    (066)

"I speak as an American to the leaders of my own nation. The great
initiative in this war is ours. The initiative to stop it must be ours."
    - Dr. Martin Luther King Jr.
=============================================================    (067)


