
Re: [ontolog-forum] Foundations for Ontology

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
Cc: Ali SH <asaegyn+out@xxxxxxxxx>
From: Rob Freeman <lists@xxxxxxxxxxxxxxxxxxx>
Date: Thu, 29 Sep 2011 16:00:15 +0800
Message-id: <CAKAf4GgERfWB6P8jMVdQ-TSC5hdLWp5agjvLH8cZW0EWaTFN7g@xxxxxxxxxxxxxx>
Ali,    (01)

I'm delighted to see this paper. Thanks for posting the link (and
thanks to John S. for once again catalyzing a discussion about
fundamental issues.)    (02)

A vector compositional model of this kind is exactly what I've been
promoting on this list and elsewhere for some years now.    (03)

Just recently I found the papers Daoud Clarke cites: Mitchell and
Lapata (2008), Stephen Clark, and, more recently, Grefenstette. I was
delighted to see that a small community seems to have sprung up
pursuing the idea.    (04)

I hadn't seen this paper, though, or Daoud Clarke's original PhD thesis.    (05)

Apart from this group of publications, the only other thread of vector
compositional publications I have found has been associated with
Simon Levy and Ross Gayler. Have you seen these? E.g.:    (06)

Vector Symbolic Architectures: A New Building Material for AGI
(home.wlu.edu/~levys/presentations/Haskins_12_APR_2010.pdf)    (07)

http://video.google.com/videoplay?docid=-6666777138089848257    (08)

I don't think Clarke cites them, though others among his references
cite Tony Plate's "Holographic Reduced Representation" work, by which
I think Gayler was heavily influenced.    (09)
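For readers unfamiliar with Plate's work: the binding operation in
Holographic Reduced Representations is circular convolution, whose key
property is that the bound representation keeps the same dimensionality
as its inputs. A minimal sketch in plain Python (my own illustration,
not Plate's code; the vector size 512 is arbitrary):

```python
import random

def circ_conv(x, y):
    """Circular convolution -- the binding operation in Plate's HRRs.
    Crucially, the result has the SAME dimensionality as the inputs."""
    n = len(x)
    return [sum(x[k] * y[(i - k) % n] for k in range(n)) for i in range(n)]

def circ_corr(x, y):
    """Circular correlation -- approximate inverse of binding (unbinding)."""
    n = len(x)
    return [sum(x[k] * y[(i + k) % n] for k in range(n)) for i in range(n)]

def cos(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sum(a * a for a in u) ** 0.5 * sum(b * b for b in v) ** 0.5)

random.seed(0)
n = 512
# Random vectors with elements drawn from N(0, 1/n), as Plate prescribes
a = [random.gauss(0, (1.0 / n) ** 0.5) for _ in range(n)]
b = [random.gauss(0, (1.0 / n) ** 0.5) for _ in range(n)]

bound = circ_conv(a, b)      # still length n -- no dimensional blowup
b_hat = circ_corr(a, bound)  # noisy reconstruction of b

print(len(bound))            # 512
print(cos(b_hat, b))         # well above chance: unbinding recovers b, noisily
```

The unbound vector is only an approximate, noisy copy of b, which is the
price paid for keeping the representation at a fixed size.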

Getting back to this community of publications centered around the
U.K.: from memory, Mitchell and Lapata and others conclude that tensor
product vector combination has the best properties, but suffers from
an exponential increase in the dimensionality of the representation.
The solution followed by Grefenstette and others seems to have been to
assume quantum categories, and use the mathematics of Hilbert spaces.
So, broadly speaking, they solve a bottom-up explosion in the
dimensionality of combined vector representations by assuming
top-down variables with quantum properties.    (010)
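The explosion is easy to see: the tensor product of an m-dimensional
and an n-dimensional vector has m*n dimensions, so each word combined
multiplies the size of the representation. A toy illustration (the
per-word dimensionality of 10 is arbitrary):

```python
def tensor_product(u, v):
    """Flattened outer product of two vectors: dimensionality multiplies."""
    return [a * b for a in u for b in v]

dim = 10                # per-word dimensionality (arbitrary for the demo)
word = [1.0] * dim
rep = word
sizes = [len(rep)]
for _ in range(3):      # combine four words in total
    rep = tensor_product(rep, word)
    sizes.append(len(rep))

print(sizes)   # [10, 100, 1000, 10000] -- exponential in phrase length
```

With realistic word vectors of hundreds of dimensions, a short sentence
already demands billions of components, which is the problem the
Hilbert-space approaches are trying to escape.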

Dominic Widdows seems to be especially active in this. There is now a
conference called Quantum Interaction, which looks at all kinds of
phenomena amenable to this sort of top-down quantum interpretation.
Here's a website with links:    (011)

http://www.quantuminteraction.org/    (012)

New Scientist recently published an article on it:    (013)

http://www.newscientist.com/article/mg21128285.900-quantum-minds-why-we-think-like-quarks.html    (014)

Personally, I think borrowing maths from quantum mechanics is not the
way to go. It is viable, I think, but unnecessarily complex and hard
to implement.    (015)

Since about 2000 I've been promoting my own vector combination model.
It solves the problem of an exponential increase in the size of the
representation in a very simple way, one entirely reasonable to
linguists: briefly, it assumes you can use analogy to relate
dimensions.    (016)

I published about this in 2000:    (017)

Freeman R. J., Example-based Complexity--Syntax and Semantics as the
Production of Ad-hoc Arrangements of Examples, Proceedings of the ANLP/NAACL
2000 Workshop on Syntactic and Semantic Complexity in Natural Language
Processing Systems, pp. 47-50. ( http://acl.ldc.upenn.edu/W/W00/W00-0108.pdf)    (018)

You can find some other draft papers, presentations and the like on
this theme at http://independent.academia.edu/RobFreeman. Sorry about
the sign in. Write to me if you want them directly.    (019)

A particular query about Daoud Clarke's paper: do you know why he
assumes associativity? I assume exactly the opposite. Surely
non-associativity is what you want. Assuming the vector products are
non-associative would explain why different parses of the same
sentence have different meanings.    (020)
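A toy illustration of the point: even the simple weighted-additive
composition from Mitchell and Lapata (p = alpha*u + beta*v) is
non-associative whenever alpha differs from beta, so the two
bracketings of a three-word string come out as different vectors. The
vectors and weights below are arbitrary:

```python
def compose(u, v, alpha=0.4, beta=0.6):
    """Weighted additive composition, p = alpha*u + beta*v (one of the
    Mitchell and Lapata models). With alpha != beta it is not associative."""
    return [alpha * a + beta * b for a, b in zip(u, v)]

a, b, c = [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]   # arbitrary toy word vectors

left  = compose(compose(a, b), c)   # ((a b) c): one parse
right = compose(a, compose(b, c))   # (a (b c)): the other parse

print(left)            # roughly [0.76, 0.84]
print(right)           # roughly [0.76, 0.6]
print(left != right)   # True: different bracketings, different meanings
```

On a non-associative account, that divergence is a feature, not a bug:
the two parses of an ambiguous sentence should come out different.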

Best regards,    (021)

Rob Freeman    (022)

On Wed, Sep 28, 2011 at 2:25 AM, Ali SH <asaegyn+out@xxxxxxxxx> wrote:
>
> Oops, had meant to also send to the list.
> Ali
>
> On Tue, Sep 27, 2011 at 2:15 PM, Ali SH <asaegyn+out@xxxxxxxxx> wrote:
>>
>> Hi John,
>> Would be curious to hear your take on the following paper (link to preprint
>> provided below, the accepted Computational Linguistics version comes out
>> tomorrow). The author claims to provide a novel foundation for (computational)
>> meaning, specifically in what he calls a "context-theoretic" framework
>> (explicitly making the analogy to a model-theoretic framework):
>> http://arxiv.org/PS_cache/arxiv/pdf/1101/1101.4479v1.pdf
>>
>> A Context-theoretic Framework for Compositionality in Distributional Semantics
>> Daoud Clarke
>> (Submitted on 24 Jan 2011)
>>
>> Abstract
>> Techniques in which words are represented as vectors have proved useful in
>> many applications in computational linguistics, however there is currently no
>> general semantic formalism for representing meaning in terms of vectors. We
>> present a framework for natural language semantics in which words, phrases and
>> sentences are all represented as vectors, based on a theoretical analysis
>> which assumes that meaning is determined by context.
>> In the theoretical analysis, we define a corpus model as a mathematical
>> abstraction of a text corpus. The meaning of a string of words is assumed to
>> be a vector representing the contexts in which it occurs in the corpus model.
>> Based on this assumption, we can show that the vector representations of words
>> can be considered as elements of an algebra over a field. We note that in
>> applications of vector spaces to representing meanings of words there is an
>> underlying lattice structure; we interpret the partial ordering of the lattice
>> as describing entailment between meanings. We also define the
>> context-theoretic probability of a string, and, based on this and the lattice
>> structure, a degree of entailment between strings.
>> We relate the framework to existing methods of composing vector-based
>> representations of meaning, and show that our approach generalises many of
>> these, including vector addition, component-wise multiplication, and the
>> tensor product.
>>
>> Best,
>> Ali    (023)
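For reference, the three composition operations the quoted abstract
says the framework generalises can be sketched side by side (toy
vectors, my own illustration, not Clarke's code):

```python
def vec_add(u, v):
    return [a + b for a, b in zip(u, v)]        # dimensionality preserved

def vec_mult(u, v):
    return [a * b for a, b in zip(u, v)]        # dimensionality preserved

def vec_tensor(u, v):
    return [a * b for a in u for b in v]        # dimensionality multiplies

u, v = [1.0, 2.0], [3.0, 4.0]   # arbitrary toy vectors
print(vec_add(u, v))     # [4.0, 6.0]
print(vec_mult(u, v))    # [3.0, 8.0]
print(vec_tensor(u, v))  # [3.0, 4.0, 6.0, 8.0]
```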

_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J    (024)
