
Re: [ontolog-forum] Semantic Systems

To: "Rich Cooper" <rich@xxxxxxxxxxxxxxxxxxxxxx>, "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "John F. Sowa" <sowa@xxxxxxxxxxx>
Date: Mon, 13 Jul 2009 05:58:56 EST
Message-id: <4a5b0560.1661.0@xxxxxxxxxxx>
Rich and Frank,    (01)

RC> Contrast is by definition a comparison.  Ordinarily,
> it means comparison with neighboring pixels, against frequency
> buckets...    (02)

Those are computer implementation details.  You don't have pixels
or frequency buckets in the brain.    (03)

Nobody knows exactly how the brain works, but many kinds of
neural operations are sensitive to changes, in either temporal
or spatial directions.  Such operations can be modeled quite
well with Fourier transforms in spatial and/or temporal
coordinates.  There is some evidence that the equivalent of a temporal
Fourier transform is performed even in the neural
endings in the ear -- before the signals actually reach the
brain.  In terms of such transforms, contrast can be defined by
the shape of the transformed signal.    (04)
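To make that last point concrete, here is a minimal NumPy sketch (purely illustrative, not a model of neural processing): a flat stimulus concentrates all of its spectral energy at frequency zero, while a stimulus with a sharp edge spreads energy into higher spatial frequencies, so the fraction of energy outside the DC component can serve as a crude contrast measure defined entirely by the shape of the transformed signal.

```python
import numpy as np

# Two 1-D "stimuli": a uniform field and one containing a sharp edge.
uniform = np.ones(64)
edge = np.concatenate([np.zeros(32), np.ones(32)])

def high_freq_energy(signal):
    """Fraction of spectral energy outside the DC (zero-frequency) bin.

    A flat, zero-contrast signal has all of its energy at frequency 0;
    an edge spreads energy into higher spatial frequencies, so this
    fraction acts as a crude, transform-based contrast measure.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    total = spectrum.sum()
    return 0.0 if total == 0 else spectrum[1:].sum() / total

print(high_freq_energy(uniform))  # approximately 0: no contrast
print(high_freq_energy(edge))     # well above 0: contrast present
```

The same idea carries over to 2-D images and to temporal signals: contrast shows up as energy away from the zero-frequency bin of the transform.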

In any case, there is an enormous literature about what kinds
of operations seem to be more primitive than others.  But as
any neuroscientist would say, there are still many, many
unknowns.    (05)

FK> This is why now it was high time to see that no ontology
> is correct without mental operations identified within the FO
> language system, of which abstraction is one operation that
> results in a property.    (06)

I agree that mental operations should be considered in any kind of ontology
that attempts to be comprehensive.  But given that
nobody knows exactly how the brain works, it is impossible for
anyone today to develop a truly comprehensive ontology.    (07)

In any case, a child can learn language far better and faster
than any computer system today, and there is now evidence
that the child has little, if any, built-in ontology.  But by
the time a child starts to use language, he or she already has
a lot of low-level facts and models about how the world works,
the people in it, his or her own body, and how all those things
interact.  But that ontology almost certainly does not include
much knowledge about mental operations.    (08)

This is one of the many reasons why I have
maintained that any upper-level ontology should have very few
axioms -- because the more axioms you have the greater the
likelihood of error, contradiction, and confusion.    (09)

For detailed reasoning, you do need axioms.  But both people and
computers do detailed reasoning only at the very low levels
required for solving specific problems.    (010)

Summary:  We need an upper level along the lines of a
sparsely axiomatized and systematized WordNet.  The detailed
reasoning is always done in the low-level microtheories, of
which we need an enormous number.    (011)

John    (012)

_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx    (013)
