
Re: [ontolog-forum] Practical Semantic Primitives

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Gary Berg-Cross <gbergcross@xxxxxxxxx>
Date: Fri, 9 Aug 2013 19:12:30 -0400
Message-id: <CAMhe4f3NxBz=+DqUty1MRvk1Hv2wxUOQqPpErSfpRfdZRywOLQ@xxxxxxxxxxxxxx>
Bruce,

This expression of “mastering how a word embodies ‘meaning’” may be a bit misleading.

A word-symbol seems like an encoding. In that view it must carry some representational content. But it seems more like a signal of potential information that is carried to a cognitive agent, where it is interpreted by cognitive processes that draw on activated background knowledge.

You can think of what happens as a signal that activates portions of a semantic net…

On this view the meaning is not in, or embodied in, the word-symbol. Rather, the personal meaning of a word-symbol is the totality of what gets activated in a cognitive agent. It takes an agent to verify or validate the potential of what is being signaled. And of course different agents with different semantic-net knowledge will have different activations... as will the same person in different circumstances.
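
As a rough illustration of this view (a toy sketch only, not a claim about any particular cognitive architecture; the network contents and decay parameters below are invented), one can model a word-symbol as nothing more than a key that triggers spreading activation over an agent's own semantic net, so that the "meaning" is whatever subgraph lights up, and differs from agent to agent:

    # Toy spreading-activation sketch (illustrative only).
    from collections import defaultdict

    def activate(semantic_net, word, decay=0.5, threshold=0.1):
        """Return the portion of an agent's semantic net activated by a word."""
        activation = defaultdict(float)
        frontier = [(word, 1.0)]
        while frontier:
            node, energy = frontier.pop()
            if energy < threshold or activation[node] >= energy:
                continue
            activation[node] = energy
            for neighbor in semantic_net.get(node, []):
                frontier.append((neighbor, energy * decay))
        return dict(activation)

    # Two agents with different background knowledge activate differently:
    agent_a = {"bank": ["river", "shore"], "river": ["water"]}
    agent_b = {"bank": ["money", "loan"], "money": ["debt"]}
    print(activate(agent_a, "bank"))  # {'bank': 1.0, 'shore': 0.5, 'river': 0.5, 'water': 0.25}
    print(activate(agent_b, "bank"))  # {'bank': 1.0, 'loan': 0.5, 'money': 0.5, 'debt': 0.25}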

Thus the expression: “without people, text has no meaning.”

This is a derivative/constructionist interpretation of language, after Bickhard & Terveen (1995) and others such as Croft & Cruse, Cognitive Linguistics (Cambridge: Cambridge University Press, 2004).


Mark H. Bickhard & Loren Terveen, Foundational Issues in Artificial Intelligence and Cognitive Science: Impasse and Solution, 1995.


Gary Berg-Cross, Ph.D.  
NSF INTEROP Project  
SOCoP Executive Secretary
Knowledge Strategies    
Potomac, MD
240-426-0770


On Fri, Aug 9, 2013 at 6:35 PM, Bruce Schuman <bruceschuman@xxxxxxx> wrote:

Thanks so much for this reply.  I’ve been bumping through the Wikipedia citations, and considering how to interpret those ideas.  They’re helpful, and express a hopeful and idealistic spirit.  It’s true that my own efforts are somewhat tinged by the Leibnizian dream of a universal semantics – but to introduce that idea seriously into a world where Google and computer science are so strong, I’d say we need to make some major concessions to hard-edged analysis.

 

So it’s probably true that my use of the word “primitive” could be a little misleading – given that there are these substantial existing efforts to identify a particular set of “words” as “primitives” – with the hope that maybe “the right set of words” could be the answer…

 

For me – the concept of “primitive” must involve something deeper than “words”.  As I see it – a word is a label or a name for some concept – and that concept has implicit structure.  What I feel we have to do – is to drill down beneath the level of words – in the process, mastering how a word embodies “meaning” – and explore the structure of the “abstract objects” that words are naming – and see if we can generalize the construction mechanics of those abstract objects – showing how words and concepts and meaning are “constructed” – and from what.

 

For me – every word, every concept, is a name for an abstract symbolic object assembled as a composite body of “distinctions”.  Those distinctions are the fundamental building blocks of any concept – and hence, I would say, any word.  For me – it is this “distinction” – or “cutting” process – that is the key “primitive” through which all language and all conceptual structure can/should be defined.

 

I’d say there’s an analogy with the way “bits” are built up into alphabets and from there into larger composite units (words, sentences, paragraphs, books…).  Concepts are composite bodies of distinctions – defined just as Aristotle suggested, and as John Sowa describes in slide 17 of  http://www.jfsowa.com/talks/kdptut.pdf
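
A crude way to picture this composition (purely illustrative; the feature names below are invented for the example) is to treat each “distinction” as a named binary cut, and a concept as the set of cuts that carve it out of its genus, so that larger concepts are assembled from smaller ones much as words are assembled from bits:

    # Toy sketch: concepts as composite bodies of distinctions.
    # A distinction is a named binary cut; a concept is the frozen set
    # of cuts that define it within its genus.

    def define(genus: frozenset, *cuts: str) -> frozenset:
        """Aristotelian-style definition: genus plus differentiae."""
        return genus | frozenset(cuts)

    thing  = frozenset()
    animal = define(thing, "animate", "sentient")
    human  = define(animal, "rational")        # "rational animal"

    # One concept subsumes another when its distinctions are a subset:
    print(animal <= human)   # True: every human is an animal
    print(human <= animal)   # False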

 

What I think I am seeing when I look at a natural language – any natural language – is a dimensional parsing of conceptual space,  with words naming composite abstract units that can be defined with absolute fluid plasticity.

 

So, for me, if there is a “universal language” behind any and all instances of natural language – it is a language of dimensions and distinctions that can and should be defined in a kind of universal algebra, and which each culture defines in its own particular way – assigning labels (“words”) to composite blocks of distinctions that are interesting for them.

 

I did write up a brief review of this idea for this list, at http://sharedpurpose.net/groupdocs/introtoontolog.docx

 

There’s a bibliography here: http://originresearch.com/sd/biblio.cfm

 

John Sowa’s 1984 comments on concepts: http://originresearch.com/sd/sd4.cfm

 

Thanks so much for the discussion and comment.

From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Obrst, Leo J.
Sent: Friday, August 09, 2013 12:05 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] Practical Semantic Primitives

 

The linguist Anna Wierzbicka has attempted to define a set of semantic primes or primitives for language (i.e., all languages): http://en.wikipedia.org/wiki/Semantic_primes, perhaps similar in notion to what Pat Cassidy is attempting with COSMOS. There is also Swadesh’s list of core words for historical linguistics, with many variations: http://en.wikipedia.org/wiki/Swadesh_list.

 

It’s the dream of many. Personally, I think it is a lost cause when considered as a reduction to semantic primitives, but there may be some merit in looking for a set of common words in many languages.

 

It also strikes me as an effort of lexical decomposition similar to that of the Generative Semanticists of the late 1960s/early 1970s, and some of the semantic-feature based work of Jackendoff, etc.


Thanks,

Leo

 

From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Bruce Schuman


Sent: Friday, August 09, 2013 11:43 AM
To: '[ontolog-forum] '
Subject: Re: [ontolog-forum] Practical Semantic Primitives

 

Good morning from Santa Barbara.  As a new member of this very interesting forum, thanks to all for being here.

 

On this issue of “primitives” – my instinct is to go to the basic theory of concepts, and ask how any concept is defined or “constructed”.  For me, the answer is more or less found in the Aristotelian approach to definition, as described by John Sowa in slide 17 of http://www.jfsowa.com/talks/kdptut.pdf  -- a process which defines a “distinction within a genus”.

 

When I look at systems defined by primitives – to my eye and understanding, these elements are usually not what I would call primitive – not fundamental – not truly “ontological”.  They are most often composite/holistic objects with a complex but undefined and implicit internal structure that we are asked to take on faith, on the assumption that these “units” are somehow basic.

 

I want to see an approach to primitives that constructs everything – every possible concept – from a simple fundamental algebraic process of “drawing a distinction”, as per the Aristotelian method.

 

As I see it, the concept of “distinction” or differentiation is related to the fundamental mathematical concept of “cut” – as per the Dedekind Cut at the foundation of mathematics and the definition of continuity and the real number line.   From my point of view, we should be building our fundamental conceptual units from this foundation.
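
To make that connection concrete (a minimal sketch only; the assumption is simply that a real number can be represented by a computable predicate over the rationals), a Dedekind cut identifies a real number with nothing more than a yes/no “distinction” drawn across the rationals:

    # Toy sketch of a Dedekind cut: a real number represented purely as
    # a distinction (a predicate) that splits the rational numbers in two.
    from fractions import Fraction

    def sqrt2_cut(q: Fraction) -> bool:
        """Lower set of the cut defining sqrt(2): True if q lies below the cut."""
        return q < 0 or q * q < 2

    def approximate(cut, lo=Fraction(0), hi=Fraction(2), steps=20) -> Fraction:
        """Recover a numeric approximation from the distinction alone, by bisection."""
        for _ in range(steps):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if cut(mid) else (lo, mid)
        return lo

    print(float(approximate(sqrt2_cut)))   # about 1.41421, recovered from the cut alone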

 

As regards the “atoms/molecules” analogy – for me, the right approach is to look for a “fundamental particle”.  Even atoms are composite structures.  If we are going to take a bottom-up approach to constructing every possible cognitive unit, we need to build these units from something truly fundamental.

 

In pursuit of this basic approach, I am developing a model of conceptual structure based on dimensionality and taxonomy that I call “synthetic dimensionality”.  I put a brief intro written for this list online:  http://sharedpurpose.net/groupdocs/introtoontolog.docx

 

Thanks so much for this discussion.

 

Bruce Schuman

(805) 966-9515 Santa Barbara

http://interspirit.net | http://sharedpurpose.net | http://bridgeacrossconsciousness.net

 

From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Patrick Cassidy
Sent: Thursday, August 08, 2013 8:30 PM
To: '[ontolog-forum] '
Subject: Re: [ontolog-forum] Case relations as Practical Semantic Primitives - was Context and Inter-annotator agreement

 

Gary,  

    On two points:

[GB-C]

> He provided a highlight of work, but in that list I didn't see Fillmore's Case grammar,
> which did have an important role in other parts of John's postings, such as the
> Verb Semantics Ontology project.  This might not provide ultimate primitives, but these are
> perhaps molecules of a deeper chemistry.

 

   I have been tempted to refer to primitive concepts as “atoms” that build up “molecules” of meaning, but there are important differences that make the analogy misleading.  Many “primitive” concepts that are types within a hierarchy will be distinguished not by necessary and sufficient conditions (a logical “definition”), but only by necessary conditions.  This leaves a lot of potential instances unspecified, and differs from the fixed properties of atoms; I believe that is indeed the way people use the primitives – they are only as specific as necessary for particular communication tasks.  Perhaps even in the ‘atom’ analogy there can be some flexibility, since the isotopes of elements can have differing properties, but even that variability is much less than one sees with many conceptual primitives.
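
One way to make that contrast vivid (a contrived sketch; the predicates and features below are invented for illustration) is that necessary conditions can only rule instances out, while necessary-and-sufficient conditions pin a type down completely:

    # Toy contrast between a type given only necessary conditions and
    # one given necessary and sufficient conditions.

    def could_be_vehicle(x: dict) -> bool:
        """Necessary conditions only: failing them rules x out, but passing
        them leaves x underspecified (maybe a vehicle, maybe not)."""
        return x.get("artifact", False) and x.get("carries_things", False)

    def is_bachelor(x: dict) -> bool:
        """Necessary and sufficient: satisfying this *is* the definition."""
        return x.get("adult", False) and x.get("male", False) and not x.get("married", True)

    wheelbarrow = {"artifact": True, "carries_things": True}
    print(could_be_vehicle(wheelbarrow))   # True, yet the question is not settled either way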

 

[GB-C]

> Case relations may not be the final word, but they provide a
> starting point for core meta-relations that can be used to develop canonical propositions.

 

    Yes, case relations are among the relations I believe are primitive, but they are still only a small part of the total number of primitive relations.

    As my earlier note suggested, these hypotheses (however well motivated) need careful experimental testing to warrant strong assent, but the current trends in funding of NL research suggest that proper testing is still years in the future.

 

Pat

 

Patrick Cassidy

MICRA Inc.

cassidy@xxxxxxxxx

1-908-561-3416

 

From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Gary Berg-Cross
Sent: Thursday, August 08, 2013 3:47 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] Case relations as Practical Semantic Primitives - was Context and Inter-annotator agreement

 

The Context and Inter-annotator agreement topic seems to have wound down, but along the path of that discussion there was this idea of semantic primitives.

John Sowa provided a historical list of people who have addressed this seductive, common-sense idea of selecting a small number of primitives for defining everything. It is, as he noted, "one of the oldest in the history of philosophy, logic, linguistics, and AI.  It can be traced back at least to 500 BC with Pythagoras, Plato, and Aristotle."

He provided a highlight of work, but in that list I didn't see Fillmore's Case grammar, which did have an important role in other parts of John's postings, such as the Verb Semantics Ontology project. This might not provide ultimate primitives, but these are perhaps molecules of a deeper chemistry. Case relations may not be the final word, but they provide a starting point for core meta-relations that can be used to develop canonical propositions. As John noted, more research is needed, but this is one tool that can be used for now.
 
 

Gary Berg-Cross, Ph.D.  

gbergcross@xxxxxxxxx     

http://ontolog.cim3.net/cgi-bin/wiki.pl?GaryBergCross

NSF INTEROP Project  

http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0955816

SOCoP Executive Secretary

Knowledge Strategies    

Potomac, MD

240-426-0770

 

On Sun, Aug 4, 2013 at 1:28 PM, John F Sowa <sowa@xxxxxxxxxxx> wrote:

Pat,

PC

> The point at issue is whether all of the senses of a particular word
> needed for language understanding can be included in a semantic lexicon.
> My experience suggests that they can, even though new senses are being
> developed all the time.  The new senses can also be included in the lexicon,
> if they are important enough to warrant the effort.

That claim is vague enough to cover all bases.  If you want a project
that includes all word senses anyone considers important, I suggest
Wiktionary.  It has "3,476,017 entries with English definitions from
over 500 languages":

    http://en.wiktionary.org/wiki/Wiktionary:Main_Page

Large numbers of people around the world are actively updating and extending
Wiktionary.  When the number of senses is in the millions and growing,
it seems hard to claim that there is any finite upper limit.

PC

> JFS seems to be saying that failure of some groups to achieve a goal means
> that no amount of effort trying a related but different way can succeed

More precisely, the idea of selecting a small number of primitives for
defining everything is one of the oldest in the history of philosophy,
logic, linguistics, and AI.  It can be traced back at least to 500 BC
with Pythagoras, Plato, and Aristotle.  For summaries and references,
see http://www.jfsowa.com/talks/kdptut.pdf .

Slides 13 to 18:  Aristotle's categories, definitions, and the Tree
    of Porphyry for organizing them graphically.

Slides 91 to 93:  Universal language schemes in the 17th and 18th
    centuries.  John Wilkins developed the largest and most impressive
    set of primitives (40 genera subdivided into 2,030 species).  Wilkins
    got help from other members to define 15,000 words in those terms.
    For more information about these and other schemes, see references
    by Knowlson (1975), Eco (1995), and Okrent (2009).

Slides 94 to 97:  Ramon Llull's Great Art (Ars Magna), which included
    Aristotle's categories, the Tree of Porphyry, rotating circles
    for combining categories, and a methodology for using them to
    answer questions.  Leibniz was inspired by Llull to encode the
    primitive categories in prime numbers and use multiplication
    to combine them and division to analyze them.  (A small arithmetic
    sketch of this encoding follows these slide notes.)

Slide 98:  Leibniz's method generated a lattice.  For modern
    lattice methods, see FCA and Ranganathan's facet classification.
    Click on the URLs to see FCA lattices that are automatically
    derived from WordNet and from Roget's Thesaurus.

Slides 99 to 101:  Categories by Kant and Peirce.  A suggested
    updated version of Wilkins' hierarchy that includes more
    modern developments.

Slides 102 to 107:  Issues about the possibility of ever having
    a complete, consistent, and finished ontology of everything.
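
Returning to the Leibniz method of slides 94 to 97: the arithmetic is easy to state concretely (a toy sketch; the category names and prime assignments below are invented). Assign each primitive category a distinct prime, represent a composite concept by the product of its primes, and test whether one concept contains another by divisibility:

    # Toy sketch of Leibniz's prime-number encoding of categories.
    primes = {"animate": 2, "rational": 3, "mortal": 5, "corporeal": 7}

    def encode(*categories: str) -> int:
        """A composite concept is the product of its primitive primes."""
        n = 1
        for c in categories:
            n *= primes[c]
        return n

    human  = encode("animate", "rational", "mortal", "corporeal")   # 210
    animal = encode("animate", "corporeal")                         #  14

    # "Division analyzes": human contains animal iff animal divides human.
    print(human % animal == 0)   # True
    print(animal % human == 0)   # False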

For modern computational linguistics, the idea of selecting a set
of primitives for defining everything was proposed and implemented
in the late 1950s and early '60s:

1961 International Conf. on Machine Translation.  See the table
    of contents: http://www.mt-archive.info/NPL-1961-TOC.htm .
    At that conference, Margaret Masterman proposed a list of 100
    primitive concepts, which she used as the basis for lattices
    that combine them in all possible ways.  Yorick Wilks worked
    with Masterman and others at CLRU, and he continued to use
    her list of primitives for his later work in NLP.  For the
    list, see http://www.mt-archive.info/NPL-1961-Masterman.pdf

TINLAP (three conferences on Theoretical Issues in Natural Language
    Processing from 1975 to 1987).  The question of primitives was
    the focus of these conferences.  Yorick Wilks was one of the
    organizers.  Roger Schank (who also had a set of primitives for
    defining action verbs) was prominent in them.  For summaries,
    see http://www.aclweb.org/anthology-new/T/T78/T78-1000.pdf
    and http://www.aclweb.org/anthology-new/T/T87/T87-1001.pdf .

Anna Wierzbicka spent many years working on issues of selecting and
    using a proposed set of primitives for defining words in multiple
    languages.  From Wikipedia:  "She is especially known for Natural
    Semantic Metalanguage, particularly the concept of semantic primes.
    This is a research agenda resembling Leibniz's original "alphabet
    of human thought", which Wierzbicka credits her colleague, linguist
    Andrzej Bogusławski, with reviving in the late 1960s."  Many people
    tried to use her "semantic primes" in computational linguistics,
    but none of those projects were successful.

I never said "No amount of effort trying a related but different way
can succeed."  In fact, I have been proposing and *using* related
methods, but I always insist on keeping all options open.

There is no evidence that a fixed set exists, and an overwhelming
amount of evidence that Zipf's Law holds:  there is an extremely long
tail to the distribution of word senses.  But if you keep your options
open and *if* a fixed set of primitives is sufficient, then you will
discover that set.  That is my recommended strategy.
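
As a rough numerical illustration of why that long tail matters (purely schematic; the rank exponent and vocabulary size below are made up), under a Zipf-like distribution the rare senses collectively carry a large share of the total mass, so no fixed head of the distribution captures nearly everything:

    # Schematic Zipf-style tail: frequency of the k-th ranked sense ~ 1/k.
    N = 1_000_000                      # hypothetical number of word senses
    freqs = [1.0 / k for k in range(1, N + 1)]
    total = sum(freqs)

    head = sum(freqs[:1000]) / total   # mass covered by the 1,000 commonest senses
    print(f"top 1,000 senses cover {head:.0%} of occurrences")   # roughly 52%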


> So the statistical approach has become vastly more funded than
> the ontological/analytical.

I certainly agree with you that a deeper analysis with ontologies and
related lexical resources is essential for NL understanding.  I believe
that statistical methods are useful as a *supplement* to the deeper
methods.   At VivoMind Research, we use *both*, but the emphasis is
on a syntactic and semantic analysis by symbolic methods.


> the current strong emphasis on the statistical approach is, I believe
> retarding progress by failing to develop even the most basic resources
> needed for the analytical stage 2 function.

I wholeheartedly agree.  But from a selfish point of view, that gives
us a competitive advantage.  We got a contract with the US Dept. of
Energy based on a competition with a dozen groups that used their
favorite methods of NLP.

For the test, all competitors were asked to extract certain kinds of
data from a set of research reports and present the results in a table.
The scores were determined by the number of correct answers.  Our score
was 96%.  The next best was 73%.  Third best was above 50%, and all the
rest were below 50%.

For analyzing the documents, we used very general lexical resources
and a fairly simple general ontology.  But we supplemented it with
a detailed ontology that was specialized for chemical compounds,
chemical formulas, and the related details of interest.

For an example of a spreadsheet with the results, see slides 49 & 50
of http://www.jfsowa.com/talks/relating.pdf .

John


_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
