I don't disagree with the way John characterizes the use of human languages,
but the issue that a common foundation ontology addresses is the need for a
reliably stable set of *logical* terms whose meanings are fixed. If, after a
testing and refining period for an FO, someone needs a new logical element
for *any* reason, the proper procedure is to add a new one and relate it to
those already in the ontology, not to change the meanings of the labels in
the FO. Any older terms that are eventually rendered unnecessary can be
deleted when the working ontology is extracted from the FO -- as I
mentioned, it is unnecessary and probably unlikely that the full FO would
be used in any practical application, so keeping vestigial unused elements
is a cost-free way to avoid changes in meaning. It is fundamentally
important in this discussion to remember, and not to confuse, the way human
languages are used and the way an FO with fixed meanings for its terms would
be used.
Programmers may of course change the meanings of the terms they use as
much as they want as long as they have control over their use. For *general
interoperability* no programmer has control over the meanings of the terms
in the FO - those meanings, once adopted, would be fixed, and changes to the
ontology would have to come from additions - unless logical errors were
found that had to be corrected. If programmers want their programs to
interoperate via the FO, they would have to change the mapping for any new
term or term of changed meaning if it is used in communicating with other
systems via the FO. Terms and processes used locally that do not affect the
communications can be controlled and changed locally as the programmers see
fit.
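The mapping discipline just described can be sketched in a few lines. This is a hypothetical illustration, not a proposal for an actual API: all of the term names, the two system mappings, and the "fo:" prefix are invented for the example.

```python
# Hypothetical sketch of interoperation through an FO: each system keeps
# its own local vocabulary and only maintains a mapping from the terms it
# communicates to fixed FO terms. All names below are invented.

fo_terms = {"fo:Person", "fo:Organization"}  # fixed once adopted

# Each program controls its local names; only the mapping must target
# the (unchanging) FO vocabulary.
system_a_map = {"Customer": "fo:Person", "Vendor": "fo:Organization"}
system_b_map = {"Client": "fo:Person", "Supplier": "fo:Organization"}

def to_fo(local_term, mapping):
    """Translate a local term to its FO term before communicating."""
    fo = mapping[local_term]
    assert fo in fo_terms, "mapping must target a fixed FO term"
    return fo

def from_fo(fo_term, mapping):
    """Translate a received FO term back into the receiver's local term."""
    inverse = {v: k for k, v in mapping.items()}
    return inverse[fo_term]

# System A sends its "Customer"; system B receives it as its own "Client".
msg = to_fo("Customer", system_a_map)
print(from_fo(msg, system_b_map))  # Client
```

Renaming "Customer" locally in system A requires changing only system A's mapping; the FO terms, and therefore system B, are untouched.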
As for Cyc, the parts that are released for free use include few of the
rules; this matters because the meanings of the terms in an ontology are
determined only by the relations (including isa), and the meanings of the
relations are determined only by the rules expressing the implications of
some relation holding
between entities. But as I mentioned, even if all of Cyc were public, that
would only be the *start* of the process of demonstrating the utility of the
common FO for interoperability. I reiterate: the point of the FO project
is (1) to *agree on* (not just to create) a common FO; (2) to create
multiple independently developed useful demo programs that use an ontology
mapped to the FO; and (3) to demonstrate communication among those programs
by means of the FO. Only after all that would the wider public be able to
evaluate the utility of an FO for enabling accurate interoperability. In
the course of all this, I presume that the initial FO would be changed to
address any deficiencies made evident by the testing phase. SUMO, because
it does have a lot of rules included, might even be adopted as the starting
base before CYC. But I cannot visualize reaching the stage of adoption of
any FO (within my lifetime) without a project aimed directly at that goal,
*coordinating* a large number of users. I believe that the notion that an
uncoordinated interaction of multiple groups would eventually arrive at some
common understanding is fundamentally flawed: such a process will most
likely create another *human* language, with terms having multiple meanings
and multiple terms having the same intended meaning. And if, against
expectations, that process somehow did arrive at a high level of agreement
(over 99% agreement is essential), it would likely take a very long time to
get there. Meanwhile, the money wasted because of that delay would dwarf the
cost of a coordinated project of the kind I am suggesting. (01)
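The point that term meanings are fixed only by relations plus the rules over those relations can be shown with a toy forward-chaining sketch. This is not Cyc's actual machinery, and the ontology content (Dog, Mammal, the part-of rule) is a made-up miniature example.

```python
# Toy sketch: a term's meaning is constrained by its relations (isa,
# part-of) together with rules stating what those relations imply.
# Facts are (relation, subject, object) triples; content is hypothetical.

facts = {
    ("isa", "Dog", "Mammal"),
    ("isa", "Mammal", "Animal"),
    ("part-of", "Heart", "Mammal"),
}

def closure(facts):
    """Apply two rules to a fixed point:
    1. isa is transitive:  isa(x,y) & isa(y,z)      -> isa(x,z)
    2. parts inherit down: part-of(p,y) & isa(x,y)  -> part-of(p,x)
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for (r1, a, b) in facts:
            for (r2, c, d) in facts:
                if r1 == "isa" and r2 == "isa" and b == c:
                    new.add(("isa", a, d))
                if r1 == "part-of" and r2 == "isa" and b == d:
                    new.add(("part-of", a, c))
        if not new <= facts:
            facts |= new
            changed = True
    return facts

inferred = closure(facts)
print(("isa", "Dog", "Animal") in inferred)     # True, via transitivity
print(("part-of", "Heart", "Dog") in inferred)  # True, via inheritance
```

Without the rules, the triples alone say nothing about what "isa" or "part-of" *mean*; it is the rules that give the relations their consequences, which is why releasing an ontology without its rules releases only part of the meaning.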
>> PC> I have created definitions for 600+ words not in the Longman DV,
> and those definitions with their trace back to the LDV can be found
> at... (02)
[JS] Those are definitions in English, which are comparable to the glosses
on WordNet, which has over a hundred times more terms together with machine
readable information that has proved to be very useful. If you think
Longman's has anything to contribute, I suggest that you collaborate with
the WordNet group to improve it. (03)
Of course, they are definitions in English and I have pointed to the use of
the Longman DV as *presumptive evidence* (I never suggested that it was
proof) that a comparable small inventory of ontological concepts would serve
analogously to specify the meanings of many other concepts. The definitions
in the supplementary vocabulary list I prepared are not intended to be
comprehensive - there may well be much more one will want to say about each
term. But **the point of that exercise** was to demonstrate that,
*whatever* one wants to say, it can be said with terms grounded in the LDV.
If an accurate NLU program could be developed, that would allow the logical
specifications of the ontology terms to be automatically created from the
English definitions. If you agree that perfectly reasonable-looking and
comprehensive *English language* definitions *can* be grounded in the LDV,
we can forget about that analogy and focus on the technical issues of the FO
and its manner of usage. (04)
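The recursive-grounding idea can be illustrated with a toy checker. The defining vocabulary and definitions below are tiny invented stand-ins for the roughly 2000-word Longman DV and my supplementary list; the point is only the recursion, not the content.

```python
# Minimal sketch of "recursive grounding": every word in a definition must
# either be in the defining vocabulary (standing in for the Longman DV)
# or itself have a definition whose words ground recursively.
# Vocabulary and definitions are hypothetical miniatures.

defining_vocab = {"a", "an", "the", "animal", "kept", "by", "people",
                  "for", "food", "or", "work", "young", "of"}

definitions = {
    "livestock": "animal kept by people for food or work",
    "calf": "the young of a cow",
    "cow": "an animal kept by people for food",
}

def grounded(word, seen=None):
    """True if `word` bottoms out in the defining vocabulary."""
    seen = set() if seen is None else seen
    if word in defining_vocab:
        return True
    if word in seen or word not in definitions:
        return False  # circular or undefined: not grounded
    seen.add(word)
    return all(grounded(w, seen) for w in definitions[word].split())

print(grounded("livestock"))  # True: every word is in the DV
print(grounded("calf"))       # True: "cow" grounds through its definition
```

The claim about the LDV is the analogous one: whatever supplementary terms one defines, the chain of definitions terminates in the fixed defining vocabulary.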
On WordNet: I think that the FO project should include the use of
WordNet to the extent that its inheritance hierarchy represents a valid
inheritance hierarchy. But the most abstract levels have a lot of problems,
which have been recognized for a while. My own work includes some effort to
map COSMO to WordNet where possible. But a simple mapping, as has been done
for Cyc and SUMO, is inadequate, because some WordNet synsets include
multiple meanings, and some different synsets have overlapping meanings that
need to be disentangled. For NLU a mapping to some hierarchy somewhat similar
to WordNet would be quite useful, but at least some of WordNet would have to
be changed to function that way. I have talked to the WordNet group, and
they are aware of the problem. I believe that some work is being done to
address the issue of the hierarchy, but I believe that the work is not using
(as of a year ago) a logic-based ontology to specify the meanings of those
synsets. If at any time some group decides to restructure WordNet so that
it is reorganized into a proper logical inheritance hierarchy, I will be
delighted to work with them - I believe I could provide some useful
suggestions. I am not aware of a current project of that type. (05)
> -----Original Message-----
> From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-
> bounces@xxxxxxxxxxxxxxxx] On Behalf Of John F. Sowa
> Sent: Saturday, January 30, 2010 10:39 PM
> To: [ontolog-forum]
> Subject: Re: [ontolog-forum] Foundation ontology, CYC, and Mapping
> Pat and Ali,
> Some of our disagreements result from different views about the
> nature of natural languages and the source of the multiple word
> senses. People used to think that multiple word senses were
> a flaw in NLs and that the cure was to legislate a single word
> sense or a small number of fixed senses for each word.
> Peirce observed that "symbols grow" and that a person who interprets
> a word derives a meaning that is *rarely* identical to the meaning
> of the speaker or author. In fact, the interpreter can sometimes
> derive a meaning that is richer or more detailed than the author
> had intended. That is called "reading between the lines".
> The inevitable changes in the world are the most common source
> of multiple meanings and shifts in meaning. A definition of
> 'automobile' in 1910, for example, would be very different from
> a definition in 1960 and much more different in 2010.
> In computer systems, where precise definitions are critical,
> the meanings of words change even more rapidly. Terms like
> 'file', 'datatype', 'operating system', 'display', 'browser',
> 'CPU', 'interrupt', 'compiler', 'interpreter', or 'database'
> change with each vendor, each implementer, each release of
> a new version, and even each patch to a current version.
> In linguistics, Alan Cruse coined the term 'microsense' for the
> subtle changes in word meaning with every new application (or
> language game, in Wittgenstein's terms). For the Semantic Web,
> some people hoped that assigning a unique identifier to each word
> sense would magically stop meanings from changing. But that hope
> is incredibly naive.
> You can put any definition you choose at the end of a URI, but
> it won't guarantee that the people who annotate their web pages
> will use that definition. In fact, it won't ensure that people
> will read it -- or understand it even if they read it. And it
> won't stop programmers from changing the semantics of their
> programs when they fix bugs or make updates.
> Some comments:
> PC> Cyc is (1) mostly proprietary, and a public language is essential
> > (2) no effort by a single group can demonstrate interoperability
> > among independent development groups.
> You said that many times, and I always make the same response:
> 1. Doug Lenat & Co. have already released a large part of Cyc
> to the OpenCyc project.
> 2. They need funding to continue their R & D.
> 3. For a small fraction of the $30 million you're requesting, I'm sure
> that they would agree to release a very large amount of their
> ontology to OpenCyc and make it freely available.
> 4. There is already a large community that is familiar with Cyc,
> people can begin using OpenCyc immediately, and they can upgrade
> to full Cyc as soon as it would be released.
> I am definitely *not* convinced that Cyc is the ideal ontology that
> the world needs. But the alternatives are much less convincing, they
> would cost more and take longer to develop, and nobody would have
> any experience in using them.
> PC> A related question that John has raised in the past is, if the CYC
> > approach is technically adequate to support effective AI applications,
> > why do we not see such applications? If we assume that even the
> > proprietary applications that CYC has built (not available for public
> > inspection) are also not particularly impressive, there are possible
> > reasons other than that the approach itself is flawed.
> I strongly agree with that statement. But I have little confidence
> that any new ontology would be much better.
> PC> I not only haven't seen anything from CYC that is public and
> > impressive, I haven't seen anything from anyone else that is public
> > and impressive that uses ontology to do things not easily done by
> > other techniques (and there are *many* ontologies around).
> I strongly agree with that statement.
> PC> As for whether CYC has "failed", they are currently getting most
> > of their revenue from commercial projects.
> I agree. Back in 1991, I suggested to Doug that he should devote
> more effort to commercial applications. But his response then was
> that he didn't want to dilute his research by putting any effort
> into applications.
> I replied that if he spent some percentage (say 20%) of his resources
> on commercial applications, the revenue could support much, if not
> all of the research. But he didn't want to work on applications.
> However, the research funds have now dried up, and his only chance
> of survival is to do what I believe he should have done many years
> ago: implement applications that can bring in some revenue.
> PC> I have created definitions for 600+ words not in the Longman DV,
> > and those definitions with their trace back to the LDV can be
> > found at...
> Those are definitions in English, which are comparable to the glosses
> on WordNet, which has over a hundred times more terms together with
> machine readable information that has proved to be very useful. If
> you think Longman's has anything to contribute, I suggest that you
> collaborate with the WordNet group to improve it. That would cost
> much less than $30 million, and it would immediately benefit the
> large community of WordNet users.
> PC> I have in the past suggested that if anyone doubts the adequacy
> > of the Longman vocabulary to properly define (in English) any terms
> > of interest, they should prepare what they consider to be a proper
> > definition of some term, and I will try to demonstrate how the
> > words of that definition can themselves be grounded recursively
> > in the LDV.
> Those 600 definitions in your file aren't sufficiently precise
> to do machine reasoning. Unless you can write definitions at the
> level of precision and detail of Cyc, they can't be used in
> a computer system. And if you want to write at that level of
> precision, I think you should start with Cyc.
> PC> From past experience, I feel it likely that anything I choose
> > would be dismissed as not convincing.
> I agree. Please note the opening comments of this note. They
> apply equally well to Cyc, WordNet, and every other proposed
> formal ontology. I have just as many (but slightly different)
> doubts about them as I have about your proposals.
> In summary, those points imply that *every* attempt to define
> a fixed, frozen ontology is doomed. That doesn't mean that
> work on Cyc, SUMO, Dolce, BFO, etc., has been wasted. But
> it does imply that work can only be salvaged by making it
> part of a more dynamic ontology along the following lines:
> A Dynamic Theory of Ontology
> As I said to Ali, I believe that his approach is compatible
> with the lattice, and it could be combined with it. But no
> fixed ontology would be suitable (except as part of a larger
> open-ended collection).
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx (09)