
Re: [ontolog-forum] Foundations for Ontology

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Rob Freeman <lists@xxxxxxxxxxxxxxxxxxx>
Date: Tue, 4 Oct 2011 17:28:29 +0800
Message-id: <CAKAf4GjaV2KQQuhsDgU6jxW3+7PAG1voCHh-=K8TVtgZpq-XpQ@xxxxxxxxxxxxxx>
Good luck selling your software, John. I always respect your advocacy
for subjectivity in logical representations.    (01)

I also like the emphasis you are putting on Syd Lamb's network model
in your latest presentation.    (02)

Any other interested parties should note that vector/distributed
representations are a good way to model networks.    (03)
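
As a rough sketch of what I mean (a toy example of my own, not anything
from John's slides): give every node of a small network a random
high-dimensional index vector, and represent each node by the sum of its
neighbours' index vectors. Nodes with overlapping neighbourhoods then end
up with similar vectors, which plain cosine similarity can detect.

    # Toy distributed encoding of a network: a node's representation is
    # the superposition of random index vectors of its neighbours.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 512                        # dimensionality of the distributed code

    # a small made-up network, as an adjacency list
    edges = {
        "cat":  ["mammal", "pet"],
        "dog":  ["mammal", "pet", "guard"],
        "fern": ["plant"],
    }

    # every distinct node gets a random index vector
    nodes = sorted({n for k, vs in edges.items() for n in [k] + vs})
    index = {n: rng.standard_normal(dim) for n in nodes}

    # a node's distributed representation = sum of its neighbours' index vectors
    rep = {n: sum(index[m] for m in nbrs) for n, nbrs in edges.items()}

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    print(cosine(rep["cat"], rep["dog"]))    # high: overlapping neighbourhoods
    print(cosine(rep["cat"], rep["fern"]))   # near zero: disjoint neighbourhoods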

-Rob    (04)

On Mon, Oct 3, 2011 at 12:59 AM, John F. Sowa <sowa@xxxxxxxxxxx> wrote:
> Rob,
>
> I posted an updated version of my slides, which clarify and elaborate
> many of the issues.  I'll cite them in responding to your note:
>
>    http://www.jfsowa.com/talks/ontofound.pdf
>
> JFS
>>> The distinction I was trying to make is between a symbolic knowledge
>>> representation, which can be translated to and from a natural language...
>
> RF
>> Had anyone ever found a way to do that using logic, your argument
>> might hold water.
>
> I have criticized many aspects of formal semantics, but on this point
> they have succeeded.  Their batting average on arbitrary NL texts is
> very low, but when they are successful, they can hit a home run.
>
> No statistical software can translate its internal forms to a coherent
> phrase or sentence in any NL.  But logic-based systems can and do.
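
(A toy sketch of the direction John is describing, mapping a logical form
back to an English sentence. Real logic-based generators are far more
sophisticated; this only shows that the mapping is straightforward when
the internal form keeps its structure. The predicate names and lexicon
entries are made up.)

    # Minimal "logical form -> English" realizer for binary ground atoms.
    LEXICON = {"loves": "loves", "sees": "sees"}   # predicate -> verb (made up)

    def realize(predicate, args):
        """Render a binary atom such as loves(John, Mary) as a sentence."""
        subj, obj = args
        return "{0} {1} {2}.".format(subj, LEXICON[predicate], obj)

    print(realize("loves", ("John", "Mary")))   # -> John loves Mary.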
>
> RF
>> I personally worked for years on an MT system that attempted this.
>> We did not succeed. No one has succeeded since.
>
> That depends on your definition of success.  If you mean FAHQMT
> (Fully Automated High Quality Machine Translation) on a broad range
> of texts, no MT system of any kind has reached that goal.
>
> But for narrow domains, such as weather reports, the METEO system
> from the 1980s did a good job of translating English or French source
> to a logical form and generating a report in the other language.
> But METEO never attempted to translate the more complex commentary
> that might accompany a report about unusual or extreme weather.
>
> My recommendation is to develop flexible, adaptable systems that
> can handle an open-ended number of domains of any size.  And we
> have been doing that at VivoMind -- see Slides 75 to 105.
>
> RF
>> Humans may be able to get back and forth to a useful degree, but
>> exactly how they do that is what is at issue.
>
> I agree.  That is why I went into quite a bit of detail about how people
> process language (psycholinguistics & neuroscience) in Slides 13 to 60.
>
> See Slides 24 to 27 about Sydney Lamb's "neurocognitive networks".  See his
> course notes for more detail:  http://www.owlnet.rice.edu/~ling411
>
> Slides 28 to 31 cover Peirce's logic and an article by the psychologist
> Johnson-Laird: http://mentalmodels.princeton.edu/papers/2002peirce.pdf
>
> Slides 42 to 44 discuss background knowledge and mental maps, images,
> and models.  Those are issues that depend on mappings between images
> and the words, phrases, and sentences of NL.  People do that very well.
> Even current logic-based systems sometimes do that, but not with the
> consistency, efficiency, and broad coverage we would like.
>
> RF
>> But no one logic can in itself express all the distinctions necessary
>> to make the choice.
>
> I agree.  For over 20 years, I have published and discussed the need
> for an open-ended variety of logics and ontologies.  That is the point
> of my discussion of Wittgenstein and my recommendations for a hierarchy
> of multiple theories to implement something that resembles W's
> open-ended language games.  See Slides 45 to 74.
>
> RF
>> I agree that, historically, little has been done to implement
>> compositionality using distributed representations in particular.
>> That doesn't mean it can't be done.
>
> I wasn't criticizing the lack of implementations.  I was just pointing
> out the obvious:  Every "bag o' words" method throws away the structure
> of sentences, paragraphs, and discourse.  There is no way to recover
> the missing information by composing multiple vectors that all suffer
> from the same lack of structural information.
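
(The structural point is easy to demonstrate: a bag-of-words vector
discards word order, so two sentences with opposite meanings can map to
exactly the same vector.)

    # Two different sentences, one and the same bag-of-words vector.
    from collections import Counter

    s1 = "the dog bit the man"
    s2 = "the man bit the dog"

    print(Counter(s1.split()) == Counter(s2.split()))   # True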
>
> But you can do very accurate work with vectors that represent
> *both* structure *and* ontology.  See Slides 76 to 80.
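
(I don't know what encoding those slides use, so as a stand-in here is
one published way of keeping role/filler structure inside fixed-width
vectors, Tony Plate's holographic reduced representations: bind roles to
fillers with circular convolution, superpose the bindings, and unbind
with circular correlation. The two sentences from the bag-of-words
example above now get clearly different vectors, and the bindings can be
queried.)

    # Holographic reduced representations: structure in fixed-width vectors.
    import numpy as np

    rng = np.random.default_rng(1)
    dim = 2048

    def randvec():
        return rng.standard_normal(dim) / np.sqrt(dim)

    def cconv(a, b):            # circular convolution = binding
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    def ccorr(a, b):            # circular correlation = approximate unbinding
        return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    AGENT, PATIENT = randvec(), randvec()     # role vectors
    dog, man = randvec(), randvec()           # filler vectors

    # "the dog bit the man" vs. "the man bit the dog"
    s1 = cconv(AGENT, dog) + cconv(PATIENT, man)
    s2 = cconv(AGENT, man) + cconv(PATIENT, dog)

    print(cosine(s1, s2))                   # low: word order now matters
    print(cosine(ccorr(AGENT, s1), dog))    # high: the agent of s1 is the dog
    print(cosine(ccorr(AGENT, s1), man))    # low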
>
> I also want to emphasize that I am strongly in favor of using
> statistical methods when they are useful. I was just making the
> point that, by themselves, they are not adequate to do language
> understanding.  I say that in Slide 12.
>
> John
>

_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J    (06)
