
Re: [ontology-summit] ontology as logical theory?

To: Ontology Summit 2007 Forum <ontology-summit@xxxxxxxxxxxxxxxx>
From: "John F. Sowa" <sowa@xxxxxxxxxxx>
Date: Sat, 24 Feb 2007 12:07:18 -0500
Message-id: <45E070C6.3000505@xxxxxxxxxxx>
Folks,    (01)

I'd like to comment on various issues raised in this thread:    (02)

CM replying to CW> the idea of expressiveness vis-a-vis
 > natural languages inherently has no precise meaning....    (03)

I agree, but it's not realistic to demand a formal definition
for an informal language.  As a basis for an informal definition,
the following principles seem to hold for every comparison of
any natural language with any formal language ever proposed:    (04)

  1. Every statement s in any formal language L can be translated
     to a sentence s' in at least one natural language and,
     very likely, in any natural language.    (05)

  2. The designers of the language L use sentences such as s'
     when they talk among themselves about the "meaning" of s,
     especially during the design stages when the details of the
     syntax and semantics of L have not yet been fully specified.    (06)

  3. People who use the language L continue to use sentences
     such as s' when they explain the "meaning" (in whatever
     informal sense anyone would care to define) of statements
     such as s to their colleagues and students.    (07)

I am not aware of any formal language or logic for which these
principles do not hold.  (And if there are any exceptions, I would
be *very* interested to see them.)  Therefore, I conjecture
that the expressive power of any natural language is equal to
or greater than the total expressive power of all the formal
languages ever invented.  (I won't claim that the expressive
power of NLs could never be exceeded by any formal language,
but I suspect that any such formal language would have to be
radically different from anything designed so far.)    (08)

MFU> But I'm talking about the real world meaning perspective.
> How is a term intended to be used?  In that sense, the only
> hint at how a term is intended to be used comes from its term
> (e.g. 'event', or 'FOAF:person'...)    (09)

I agree that an informal explanation of the intended meaning of
any formal notion is essential, even for the most sophisticated
mathematicians and logicians.    (010)

Slightly over a century ago, Peano founded a journal whose
goal was to translate all of mathematics into his notation for
logic.  No NL phrases were permitted except in the titles of
the articles.  That journal was exceedingly unpopular, and it
was shut down after the first one or two issues.    (011)

Textbooks on mathematics normally include explanations along
the lines that Mike suggests.  They are essential, but some
mathematicians, notably Dieudonné, hoped to eliminate them.    (012)

But Dieudonné was a hypocrite.  He published a book on
geometry that did not contain a single diagram.  Yet when
he was writing a proof at the blackboard during a lecture,
he once got stuck in the middle.  After staring at the board
for a few minutes, he stepped to the side, covered what he was
doing, drew a diagram, erased it, and continued his proof.    (013)

LJO> There is also the notion of descriptive complexity
 > (under Finite Model Theory), which tries to characterize
 > the expressive complexity of various logics....    (014)

That's an interesting exercise, but the authors of that approach
have made the assumption that the primary purpose of a logic is
to prove theorems.  Yet the most widely used version of FOL is
SQL, which is *never* used to prove theorems, but to evaluate
queries and constraints against a relational database.  For that
purpose, the algorithms have at worst polynomial complexity,
and when the columns of the relations are indexed, the algorithms
can process many queries and constraints in logarithmic time.    (015)
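
To make that concrete, here is a minimal sketch of what "evaluating
a constraint rather than proving a theorem" looks like.  (This is a
toy example I am inventing for illustration; the table, data, and
constraint are not from any real system.)  It uses Python's built-in
sqlite3 module:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY,"
             " name TEXT, manager_id INTEGER)")
conn.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                 [(1, "Alice", None), (2, "Bob", 1), (3, "Carol", 1)])
# Index the column used in the lookup, so each probe is logarithmic.
conn.execute("CREATE INDEX idx_manager ON employee (manager_id)")

# The constraint, in FOL terms: every employee with a manager_id
# refers to an existing employee.  SQL does not prove this as a
# theorem; it evaluates the query and lists the violations, if any.
violations = conn.execute("""
    SELECT e.id FROM employee e
    WHERE e.manager_id IS NOT NULL
      AND NOT EXISTS (SELECT 1 FROM employee m WHERE m.id = e.manager_id)
""").fetchall()
print("constraint violations:", violations)    # prints []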

There has been an enormous amount of confusion about complexity,
and I'd like to make the following observations:    (016)

  1. Research on the computational complexity of *algorithms* has
     proved to be valuable for many practical purposes.    (017)

  2. However, some of that research has had a *negative* influence
     on important research areas.  For example, somebody finds a
     particular algorithm A for solving a particular problem P and
     publishes an article proving that A has polynomial complexity
     -- e.g., it takes N-cubed time to process N items of data.
     That is a useful result.    (018)

  3. But then people confuse algorithm A with the problem P, and
     they make claims that the *problem* P has N-cubed complexity.
     As a result, people abandon the attempt to process data for
     which N is larger than 1000, since N-cubed would be a billion.
     Yet other algorithms for solving P may exist with a smaller
     exponent in the polynomial (perhaps even 1), as the sketch
     after this list illustrates.    (019)

  4. The unfortunate result in point #3 above comes from a confusion
     of algorithms with problems.  An even worse result comes from
     confusing problems with languages and making the claim that
     certain *languages* have a certain inherent complexity.  That
     is why I *very strongly* object to claims that the work on
     descriptive complexity is meaningful for the choice of language.    (020)
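
To make point #3 concrete, here is a small sketch (my own toy
example, not anything from the literature): the same problem --
"do any three numbers in a list sum to zero?" -- solved by two
different algorithms.  Publishing only the first one would not
make the *problem* cubic.

def three_sum_cubic(nums):
    """Obvious algorithm: try every triple -- N-cubed time."""
    n = len(nums)
    return any(nums[i] + nums[j] + nums[k] == 0
               for i in range(n)
               for j in range(i + 1, n)
               for k in range(j + 1, n))

def three_sum_quadratic(nums):
    """Another algorithm for the same problem: sort the list, then
    walk two pointers inward -- N-squared time."""
    nums = sorted(nums)
    n = len(nums)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = nums[i] + nums[lo] + nums[hi]
            if s == 0:
                return True
            elif s < 0:
                lo += 1
            else:
                hi -= 1
    return False

data = [7, -3, 1, 9, 2, -8, 5]
assert three_sum_cubic(data) == three_sum_quadratic(data)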

To emphasize these points, I often cite the work by Bill Andersen
and his colleagues in automatically extracting subsets of FOL
that are appropriate for different algorithms.  I believe that
the following paper should be required reading for anybody who
has ever thought about complexity:    (021)

   Peterson, Brian J., William A. Andersen, & Joshua Engel (1998)
   "Knowledge bus: generating application-focused databases from
   large ontologies," Proc. 5th KRDB Workshop, Seattle, WA.
   http://sunsite.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-10/    (022)

They developed a "knowledge compiler" which extracted a subset of
axioms from Cyc to drive a deductive database.  It translated Cyc
axioms, stated in a superset of FOL, to constraints for an SQL
database and to Horn-clause rules for an inference engine. Although
the knowledge engineers had used a very expressive dialect of logic,
84% of the axioms they wrote could be translated directly to Horn-
clause rules (4667 of the 5532 axioms extracted from Cyc).  The
remaining 865 axioms were translated to SQL constraints, which would 
ensure that all database updates were consistent with the axioms.    (023)
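
The details of their compiler are in the paper, but the routing idea
can be sketched in a few lines.  (This is a toy illustration I am
making up, not the Knowledge Bus code; the clause representation is
invented.)  A clause in clausal form is Horn if it has at most one
positive literal, so a compiler can send the Horn clauses to the rule
engine and everything else to the constraint checker:

from dataclasses import dataclass

@dataclass
class Literal:
    predicate: str
    positive: bool = True

def is_horn(clause):
    """A clause (a list of literals, read as a disjunction) is Horn
    if at most one of its literals is positive."""
    return sum(1 for lit in clause if lit.positive) <= 1

def route(axioms):
    """Split clausal-form axioms into Horn rules and constraints."""
    rules, constraints = [], []
    for clause in axioms:
        (rules if is_horn(clause) else constraints).append(clause)
    return rules, constraints

# "person(x) -> mortal(x)" is the Horn clause {~person(x), mortal(x)};
# "thing(x) -> animate(x) v inanimate(x)" has two positive literals,
# so it is not Horn and goes to the constraint checker.
axioms = [
    [Literal("person", False), Literal("mortal")],
    [Literal("thing", False), Literal("animate"), Literal("inanimate")],
]
rules, constraints = route(axioms)
print(len(rules), "rules,", len(constraints), "constraints")   # 1 rules, 1 constraints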

That is the ideal approach:  use compiler technology to enable
knowledge engineers (and even domain experts) to express knowledge
in the simplest and most readable form *without* worrying about
complexity.  Then use compilers to extract and transform that
specification into various formats that are suitable for various
problems and algorithms.    (024)

Summary:  Never confuse the complexity of *algorithms* with
the complexity of the *problems* they solve.  And most definitely
do not confuse the complexity of algorithms or problems with the
complexity of the *languages* in which they are stated.    (025)

John Sowa    (026)

