Patrick, (01)
I would expect Longman's to be based on semiotics as a proxy for semantics.
That is, it relies on knowledge-based procedures to invoke the correct sign
in the reader, but without identifying what those procedures are. These may
include non-logical methods such as Bayesian belief networks and fuzzy
logic. This may be a valid approach, but it does not fit directly into an
ontology. (02)
If you want to define a set of "primitives", you would need to (03)
a) demonstrate that complex ideas can be generated from the primitives by
the addition of predicates which would support the semiotic inferences
needed;
b) demonstrate that the whole system of predicates would not lead to an
inconsistent or incomplete understanding of the world. (04)
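To make (a) concrete, here is a toy sketch in Python; the predicates and
the defined concept are purely illustrative and not drawn from any actual
ontology. A non-primitive concept is built as a logical combination of
primitive predicates, and (a) amounts to showing that this kind of
construction scales up to the complex ideas one actually needs.

    # Toy sketch: a "complex idea" defined as a combination of primitive
    # predicates. All names here are illustrative.

    def is_animal(x):           # primitive
        return x.get("kind") == "animal"

    def is_domesticated(x):     # primitive
        return x.get("domesticated", False)

    def is_pet(x):              # non-primitive: conjunction of primitives
        return is_animal(x) and is_domesticated(x)

    print(is_pet({"kind": "animal", "domesticated": True}))   # True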
I have not noted anything about (a) in the ontology literature, and I
suspect (b) would be very difficult. Further, the need to re-evaluate (b)
every time additional predicates were added would make the system
unsupportable. (05)
I suspect that what you want is a decent data exchange standard, of which
there are already hundreds, if not thousands. More specifically, the place
to start is with the big, integrative standards, such as ISO 10303, ISO
15926, GJXML, .... (06)
If you could start by unifying the first three, that would be a great help
:-). (07)
Sean Barker
BAE SYSTEMS - Advanced Technology Centre Bristol, UK
+44(0) 117 302 8184 (08)
-----Original Message-----
From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx
[mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Patrick Cassidy
Sent: 16 February 2010 19:12
To: '[ontolog-forum] '
Subject: Re: [ontolog-forum] Foundation ontology, CYC, and Mapping (09)
Chris,
Thanks for the comment below. It's a relief to see that at least some of
the participants in this list do not share the misinterpretations of what I
have been saying about the FO. Try as I might to be unambiguous, my
comments seem to get distorted in bizarre ways that then provoke irate
replies. (012)
> On Feb 15, 2010, at 4:28 PM, Rob Freeman wrote:
> > Pat,
> >
> > What is clear is that:
> >
> > 1) You keep repeating your belief that a single complete
> > axiomatization (with useful coverage?) is possible.
> >
> > 2) You admit you don't know how to prove it is possible.
> >
> > 3) You ignore/don't understand that other people have proven it is
> impossible.
> >
> > "A complete and consistent logic complex enough to include
> > arithmetic was shown by Kurt Goedel to be impossible..."
> (013)
Chris Menzel replied (amid a delightful exposition of what Goedel said):
>
> As for charge #3 that you direct at Pat above, Pat has never claimed
> that a FO would include a complete, consistent, recursive
> axiomatization of arithmetic. You should be more careful about
> lecturing others about what they don't understand.
>
Right. I am not concerned with the foundations of mathematics; I leave
that to others far more qualified than I am. I am concerned with the
quickest and most direct route to general and accurate semantic
interoperability. In brief, the FO proposal can be summarized as follows:
(1) There are some ontology elements whose intended meanings cannot be
expressed solely as an FOL combination of other ontology elements. I call
these "primitives". Anyone else can call them anything they please.
(2) For any given group of domain ontologies, there will be some set of
such primitives that is sufficient to logically specify, by FOL combination,
the intended meanings of the non-primitive elements of those ontologies.
These meanings will not necessarily be complete descriptions of the intended
real-world referents; they will be sufficient to perform the computations
desired for the tasks at hand.
(3) To *accurately* translate logical assertions among those domain
ontologies, the most parsimonious tactic (and probably the fastest) would be
to identify the primitives in common among those domain ontologies, include
them in an FO, and use them to create translations of assertions between the
domain ontologies. Those translations will use "bridging axioms" to convert
assertions from the form in one ontology to the form in another ontology.
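As a toy sketch of what a bridging axiom through shared primitives amounts
to in practice, consider the following Python fragment; the two domain
vocabularies ("EmployedBy", "worksFor") and the assumed FO primitive
("has_employer") are invented purely for illustration:

    # Toy sketch of a bridging axiom, with invented vocabularies.
    # Ontology A asserts:  ("EmployedBy", person, company)
    # Ontology B asserts:  ("worksFor", agent, organization)
    # Both are specified in terms of an assumed FO primitive "has_employer".

    def a_to_fo(triple):
        rel, person, company = triple
        assert rel == "EmployedBy"
        return ("has_employer", person, company)   # A -> FO primitive

    def fo_to_b(triple):
        rel, agent, org = triple
        assert rel == "has_employer"
        return ("worksFor", agent, org)            # FO primitive -> B

    def bridge_a_to_b(triple):
        # The composition of the two mappings plays the role of the
        # bridging axiom from ontology A to ontology B.
        return fo_to_b(a_to_fo(triple))

    print(bridge_a_to_b(("EmployedBy", "Alice", "AcmeCorp")))
    # ('worksFor', 'Alice', 'AcmeCorp')

In a real FO the same role would be played by FOL axioms rather than
functions, but the translation path (domain form -> FO primitives -> other
domain form) is the same.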
(4) To minimize the changes in the FO as new domain ontologies are linked to
(mapped to or logically expressed by) the FO, it is advisable to try to
identify as many of the possible primitives as can be identified, at the
earliest stages of testing of the FO. This should reduce the number of new
primitives that need to be created as new domain ontologies are linked to
the FO. Since the test has never been done, we do not know whether or how
quickly the need for new primitives will drop with each new domain. That
can only be determined by testing the FO process. It is possible that new
primitives will need to be added continually; even so, this method appears
(to me) to be the most effective way to achieve the maximum and most
accurate semantic interoperability that is possible at any given time.
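The bookkeeping that (4) calls for is easy to sketch; the domain names and
"primitive sets" below are invented only to show the shape of the test:

    # Toy sketch of the measurement described in (4): as each new domain
    # ontology is linked, count how many of its primitives are not yet in
    # the FO. The question is whether that count drops toward zero.

    fo = {"object", "process", "part_of", "before", "quantity"}

    domains = {
        "manufacturing": {"object", "process", "part_of", "tolerance"},
        "medicine":      {"object", "process", "before", "dose", "organism"},
        "finance":       {"quantity", "before", "obligation"},
    }

    for name, prims in domains.items():
        new = prims - fo
        print(f"{name}: {len(new)} new primitive(s) needed: {sorted(new)}")
        fo |= new   # the FO grows as each new domain is linked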
(5) As possible inventories of primitives that should be included in a
*starting* FO, to aim for the broadest coverage as quickly as possible, I
suggest using the senses associated with the Longman dictionary defining
vocabulary - 2148 words, and probably over 4000 senses. Longman has been
tested for its ability to linguistically define all other words in the
dictionary, but whether there could be a similar small inventory of
primitive ontology elements that can combine to specify *all* other ontology
elements is unknown and may be impossible. The more relevant question is
whether a set of primitive ontology elements can be found that will not need
*significant* supplementation as new domains are linked to the FO; if little
supplementation is needed, the FO should be stable enough for most practical
tasks requiring semantic interoperability. This question can only be
answered by testing multiple domain ontologies against some common FO.
Other possible sources of essential primitives could be the 3000 most
frequent Chinese characters (covering 98.9% of modern text) and the 2000
most common signs of AMESLAN. But these symbols have not been tested as a
"defining vocabulary".
(6) The fastest method to test the FO hypothesis is to create a consortium
of multiple diverse groups of ontology developers and potential users (and
relevant standards groups) to conduct a carefully organized study: agree
on an FO, test its utility in practical applications, and test its ability
to support accurate semantic interoperability. (014)
NOTE that there is nothing in here about a "complete" theory of anything,
just the most practical and expeditious way to address a practical
problem. The goal is the fastest and best possible solution to semantic
interoperability for any given set of ontologies. Theoretical completeness
of the inventory of primitives is unnecessary, though it is an asymptote
worth trying to approximate. (015)
There is more detail about this approach to semantic interoperability in the
presentation:
http://www.micra.com/COSMO/TheFoundationOntologyForInteroperability.ppt (016)
There are doubtless questions that are not answered even in that document;
I will be happy to answer any of them as best I can, and then add the
answers to the ppt. (017)
If one does not consider broad semantic interoperability to be of
significant practical utility, or doubts that an FO can contribute to that
goal, so be it. I think the problem is serious, and that all serious
proposals to address it deserve serious consideration. I am, of course,
happy to hear well-considered technical concerns. But many of the
objections I have heard so far are based on misinterpretations of what I
have been saying. If anything sounds wrong, do look at the ppt
presentation, and if there still seems to be a problem, do let me know. (018)
Pat (019)
Patrick Cassidy
MICRA, Inc.
908-561-3416
cell: 908-565-4053
cassidy@xxxxxxxxx (020)
> -----Original Message-----
> From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-
> bounces@xxxxxxxxxxxxxxxx] On Behalf Of Christopher Menzel
> Sent: Tuesday, February 16, 2010 12:50 PM
> To: [ontolog-forum]
> Subject: Re: [ontolog-forum] Foundation ontology, CYC, and Mapping
>
> On Feb 15, 2010, at 4:28 PM, Rob Freeman wrote:
> > Pat,
> >
> > What is clear is that:
> >
> > 1) You keep repeating your belief that a single complete
> > axiomatization (with useful coverage?) is possible.
> >
> > 2) You admit you don't know how to prove it is possible.
> >
> > 3) You ignore/don't understand that other people have proven it is
> impossible.
> >
> > "A complete and consistent logic complex enough to include
> > arithmetic was shown by Kurt Goedel to be impossible..."
>
> Better, in characterizing Gödel's theorem, to talk about theories
> rather than logics. Also better to emphasize simplicity rather than
> complexity, as one needs only a very modest bit of arithmetic to prove
> incompleteness in general. That modest bit is usually called Q. Q
> contains three axioms for the successor function, two of which also
> axiomatize 0:
>
> 1. ∀x(0 ≠ sx) -- "0 is not the successor of any number."
> 2. ∀x(x ≠ 0 → ∃y(x = sy)) -- "Every nonzero number has a predecessor"
> 3. ∀xy(sx = sy → x = y) -- "The successor function is 1-1"
>
> In addition, Q provides the obvious axioms for addition and
> multiplication:
>
> 4. ∀x(x + 0 = x)
> 5. ∀xy(x + sy = s(x + y))
> 6. ∀x(x • 0 = 0)
> 7. ∀xy(x • sy = (x • y) + x)
>
> Finally, it is absolutely critical in a characterization of the
> theorem to mention recursive axiomatizability (basically, the property
> a theory has when it is possible (i.e., you can write a program) to
> list its theorems), since there are in fact complete, consistent (but
> not recursively axiomatizable) theories that include that modest bit
> of arithmetic (a trivial example being simply the theory consisting of
> all true sentences in the language of arithmetic). So,
> specifically, what Gödel showed was:
>
> (*) No consistent, complete theory that includes Q is
> recursively axiomatizable.
>
> And, actually, Gödel didn't prove exactly (*) in his famous 1931
> paper; he proved something weaker, although (*) itself can be proved
> using only the foundations that he laid in that paper (as was done
> by Rosser in 1936).
>
> As for charge #3 that you direct at Pat above, Pat has never claimed
> that a FO would include a complete, consistent, recursive
> axiomatization of arithmetic. You should be more careful about
> lecturing others about what they don't understand.
>
> > I'm also finding references to a guy named Thoralf Skolem. Anyone
> else
> > heard of him?
>
> Of course. He's a prominent figure in the history of mathematical
> logic in the early 20th century, particularly well known for his work
> in model theory.
>
> > "In the 1922 lecture the Löwenheim-Skolem theorem was applied to a
> > formalization of set theory. The result was a relativization of the
> > notion of set, later known as the Skolem paradox: If the axiomatic
> > system (e.g. as presented by Zermelo) is consistent, i.e. if it is
> > at all satisfiable, then it must be satisfiable within a countable
> > ``Denkbereich'' (domain). But does this not contradict Cantor's
> > theorem of the uncountable, the existence of a never-ending sequence
> > of transfinite powers? The ``paradox'' of Skolem is no contradiction.
>
> Indeed not. It is a theorem about the expressive limitations of
> first-order logic.
>
> > Roughly speaking it asserts that there is no complete axiomatization
> > of mathematics,
>
> Actually, the Löwenheim-Skolem theorem doesn't say that at all. It
> says that a theory with infinite models has a countable model. In and
> of itself, that does not imply incompleteness. (If it did, Löwenheim
> and Skolem would be credited with discovering arithmetical
> incompleteness in the early 1920s.)
>
> > and that certain concepts must be interpreted relative to a given
> > axiomatization and its models and thus have no ``absolute''
> > meaning."
>
> That's in fact a highly controversial (some would even argue
> decisively refutable) philosophical interpretation of the L-S theorem.
> Regardless of your philosophical view of the theorem, the idea that
> the notion of set is "relative" to an axiomatization is certainly not
> a *mathematical* consequence of it.
>
> > "Towards the end of the 1929 paper Skolem expressed some doubts
> > about the complete axiomatizability of mathematical concepts. His
> scepticism
> > was based on the set-theoretic relativism which follows from the
> > Löwenheim-Skolem theorem. In 1929 he could give only some partial
> > results, but in a paper from 1934 (and a previous one from 1933)
> > ``Über die Nichtcharacterisierbarkeit der Zahlenreihe mittels
> > endlich oder abzählbar unendlich vieler Aussagen mit ausschliesslich
> > Zahlenvariablen'' he could prove that there is no finite or
> > countably infinite set of sentences in the language of Peano
> > arithmetic which characterizes the natural numbers. Today, this
> > follows as a simple consequence of Gödel's completeness theorem. The
> > technique used by Skolem was a more direct model-theoretic
> > construction. And this technique, suitably refined to the so-called
> > ``ultraproduct''
> > construction, has been an important tool in recent work on model
> > theory."
> >
> > http://www.hf.uio.no/ifikk/filosofi/njpl/vol1no2/skobio/node1.html
> >
> > By the way, my alternative is to work directly with observations of
> > different kinds, perhaps indexed by labels, and implement
> > interoperability based on overlaps between sets of these, as the
> > task demands.
>
> Your alternative to *what*? The above is just a description of
> certain historical developments in mathematical logic. It doesn't
> make sense to talk about an "alternative" to it, especially one that
> involves contemporary notions out of ontological engineering like
> interoperability.
>
> Chris Menzel
>
>
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx (024)