ontology-summit

Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

To: "'Ontology Summit 2007 Forum'" <ontology-summit@xxxxxxxxxxxxxxxx>
From: "Chris Partridge" <mail@xxxxxxxxxxxxxxxxxx>
Date: Sun, 22 Apr 2007 12:41:44 +0100
Message-id: <000001c784d3$34d46890$0200a8c0@POID7204>

Leo,

 

I think something is being missed here.

 

Leo said:

Insofar as we share a common notion of the things in the world, we do have a common semantics, i.e., via a common reference.

 

It seems to me that some kinds of model (languages, concepts, etc.) attempt to map the “things in the world” directly via reference – and others do not.

 

Language is a case where this does not hold – the goal of most language use is not to describe the world, but to persuade, cajole, etc. – I recall a particularly good description of this in George Lakoff’s Women, Fire, and Dangerous Things.

 

Even when it is about describing a situation – it is not always clear how reference works. David Armstrong gives as an example the statement “there are at least two people in the room” – when there are a lot more. What does the statement refer to (e.g. which two people?) – you have to go through quite a few contortions to rescue reference.

 

To give a more practical example: in a recent ontology-mining exercise on an operational system, the code included the notion of a node, a point in space described by the traditional three co-ordinates. The problem was that, because the system needed to be distributed, the representation of the same node could (and did) appear a number of times. This is a very simple example of a common feature of currently implemented systems: the requirements of system performance lead to a downgrading of the importance of maintaining a direct relationship with the “things in the world” via reference, and hence (legitimately) reference becomes indirect.
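
To make the duplication concrete, here is a minimal sketch (not the actual mined code; every name here is invented) of how object identity stops tracking identity in the world, so that sameness has to be recovered indirectly from the co-ordinates:

    # Minimal sketch: in the distributed system, one node in the world
    # ends up with several distinct representations, so in-memory
    # identity no longer tracks identity in the world.
    from dataclasses import dataclass

    @dataclass
    class NodeRecord:
        host: str   # which machine holds this copy of the node
        x: float
        y: float
        z: float

        def same_world_node(self, other: "NodeRecord") -> bool:
            # Reference has become indirect: two records denote the
            # same node iff their co-ordinates coincide.
            return (self.x, self.y, self.z) == (other.x, other.y, other.z)

    a = NodeRecord("server-1", 1.0, 2.0, 3.0)
    b = NodeRecord("server-2", 1.0, 2.0, 3.0)
    assert a is not b              # two representations...
    assert a.same_world_node(b)    # ...of one "thing in the world"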

 

So what seems to me to characterise a model that is an ontology is the desire to map the “things in the world” directly via reference – and language, concepts, etc. do not necessarily share that desire.

 

I am not sure that this desire has been made explicit in the current Ontology Framework Draft Statement for the Ontology Summit – and I think it might usefully be.

 

Regards,

Chris

 

 


From: ontology-summit-bounces@xxxxxxxxxxxxxxxx [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Bill Andersen
Sent: 21 April 2007 22:17
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

 

Hi Leo..

 

This deserves a longer discussion, because it's a good answer.  However, in short, I think you're making my point, not refuting it.  My point was: why use the term "concept" to describe that which we're quantifying over in our ontologies (topic maps, thesauri, et al. are not exempt) *unless* we're willing to go through something like the exercise you have below?  The point is that the things we're quantifying over are (hopefully) not concepts at all, but rather the things we wish to talk about.  We may also wish to talk about tree-concepts as well as trees, precisely as you did below (BTW, thanks for previewing, for anyone interested, what such a theory might look like).  A second and more pressing problem is that, while computer science ontology-talk blithely discards any kind of realism through the extensive (and, as Barry points out, slipshod) use of the term "concept", there seems to be a hasty implicit willingness to at least regard concepts as real.  One finds a contradiction here: concept talk lends reality to at least one ontological category - the concept - or at least accepts reliable epistemic access to such things.  Why accept this, yet have no confidence in our epistemic access to trees and dogs?  Concepts seem much more difficult to get a handle on, especially if they're private.  Dogs and trees are public - we share our access to them.

 

          .bill

 

On Apr 21, 2007, at 16:26, Obrst, Leo J. wrote:



Bill,

 

Although I think concepts are internal (call them instead ideas or semantic senses, if you wish), I think they appropriately point to things in the world, so the shared semantics has two aspects: sense and reference. The latter is the thing in the world; the former is the placeholder for that, and is indexed by our language constructs. Insofar as we share a common notion of the things in the world, we do have a common semantics, i.e., via a common reference. One might say in fact that the degree to which our thoughts match or map to the things in the world is the degree to which we have a common way of thinking about the things of the world. And then, finally, the degree to which our terminology and compositions of our terminology align with those common thoughts is the degree to which we can communicate with reasonably shared semantics.

 

What I don't understand is how my term can map directly to a thing in the world and bypass my thoughts.

 

Let's do an experiment:

 

Refer directly to a specific tree without 1) using language, 2) pointing, or 3) thinking about that tree. I'd say you can probably do without (1) or (2) (e.g., most animals), but how can you do without (3)?

 

Can the tree be physically in my head? If you say that the term-reference relation is in my head, then I would say that the term-reference relation is a concept (yes, an entity, class, relation, property, instance, logical operator, if you will -- as a way of characterizing the kinds of concepts), a reified representation in my head, and that that term-reference relation in fact can dispense with the term, since we think that animals can know the world without necessarily having language. Is the tree in the head of my dog? I don't think so: I think the dog has an idea about the tree.

 

I still think that one of the causes of our dissonance is that we are typically talking about 1) ontology, 2) logic, and 3) semantics, and not keeping these things straight. I would say that we build engineering models (call them engineering ontologies) which try to represent the real world. However, those engineering models consist of two items: 1) labels, and 2) the representation of the meaning of those labels, where the meaning is expressed as formal classes, entities, relations, properties, instances, rules, etc., that are supposed to align with what we think is the way the real world is and "means". Now, labels are terms, i.e., the names we give to these representations. As such they are elements of our language, abstracted or idealized. We have other terms we use in ordinary communication that index those labels. Both the terms and the labels are vocabulary; their interpretations, i.e., the actual formal models (stand-ins or representations for the real-world things) they map to, and the mappings, are their semantics. Because a logic itself is a language, we have another filigree of potential dissonance.
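
A minimal sketch of this layering (all names hypothetical): ordinary-language terms index labels in the engineering model, and an interpretation maps labels to formal stand-ins for real-world things.

    # Terms of ordinary language index labels in the engineering model;
    # the interpretation maps labels to formal stand-ins for real things.
    terms_to_labels = {
        "tree":  "Tree",    # two ordinary terms...
        "arbre": "Tree",    # ...indexing one ontology label
    }

    labels_to_models = {
        # a class extension standing in for the real trees
        "Tree": frozenset({"oak#1", "elm#7"}),
    }

    def semantics(term: str) -> frozenset:
        # A term's semantics is reached indirectly: term -> label -> formal model.
        return labels_to_models[terms_to_labels[term]]

    assert semantics("tree") == semantics("arbre")  # shared semantics via a common mapping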

 

I use "concept" not because I am a conceptualist (I'm not), but because I think that that notion abstracts over stuff like entity, class, property, relation, attribute, instance, rule, etc.

 

Thanks,

Leo

_____________________________________________
Dr. Leo Obrst       The MITRE Corporation, Information Semantics
lobrst@xxxxxxxxx    Center for Innovative Computing & Informatics
Voice: 703-983-6770 7515 Colshire Drive, M/S H305
Fax: 703-983-1379   McLean, VA 22102-7508, USA
 

 

 


From: ontology-summit-bounces@xxxxxxxxxxxxxxxx [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Bill Andersen
Sent: Saturday, April 21, 2007 2:23 AM
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

Hey Leo...

 

Your note hits precisely on the issue that I think plagues concept talk.  

 

(1) If concepts are private to their bearers, then why pretend to talk about "shared semantics", "shared meanings", and so on, as is common in the literature on ontology in computer science?  For this to work, there would have to be something (anything!) by virtue of which such concepts could be shared.  There are, as you know, theories about how that may come about (I'm thinking of Wittgenstein's private language argument), but nobody's talking about that in computer science ontology.

 

(2) If concepts are not private, then there must be some nexus that supports the non-private component of them that is shared.  Generally, we of a realist bent take that to be *reality*.  Barry's comment concerning the bio-ontologist who thinks of their computational bio-ontology as representing not concepts but biological reality comes to mind.  If we still want concepts (say, to talk about someone's personal concepts), some form of conceptual realism can be employed to relate the two.  

 

In either case, I don't think any useful work whatsoever is done by calling the things denoted by linguistic terms in *computational* ontologies "concepts" with no further comment.  We humans (at least those of us who are not concept theorists) seem to resort to the use of the term "concept" for the same reason that we call something we can't remember the name of a "thingy" or "whatchamacallit"; in this form it's a kind of forgivable intellectual laziness.  That, or we're *really* talking about concepts, in which case we have lots of work to do.  Rather, wouldn't it be better, especially if one doesn't care about (philosophically motivated) ontology, simply to use the more neutral terms "property", "relation", and "object", which can be taken to correspond to the denotations of relation-terms (unary and greater-than-unary) and constant-terms in mathematical logic?  Nicola Guarino, later joined by Chris Welty, went in this direction.  This relates to Welty's comment of yesterday about there being nothing new in computer science ontology: it's almost as if computer scientists engaged in the "semantic technology" field are afraid to use terms that might make their enterprise seem less sexy and "semantic", so they stick with "concept".

 

On Apr 20, 2007, at 21:27, Obrst, Leo J. wrote:



[Opinion on]

 

Everything is a concept: entities, relations among them, properties, attributes, even many instances/individuals (days of the week; Joe Montana; etc.). Especially when you think of a concept, in the animal mental apparatus, as a placeholder for something real in the real world (I am a realist). Sure, I have a concept for 'Joe Montana'. Is that concept a general notion, i.e., a class of something? No.

 

The general problem (from my perspective) is that we are typically addressing two perspectives: 1) ontology, i.e., what exists in the world? and 2) semantics, i.e., what is the relationship between our ways of talking/thinking and those things in the world? To me it's clear that we are talking about (1), the things of the world, but our language (and our thought, I would say) interposes another layer or two. I would say there are minimally 3 things: 1) our language (terms and compositions of terms), 2) the senses of terms (and their compositions), which we might characterize as concepts, and 3) real-world referents that those senses or concepts somehow point to. In formal semantics, a good theory of reference (i.e., (3)) is hard to come by.

 

[Opinion off]

 

 

_____________________________________________
Dr. Leo Obrst       The MITRE Corporation, Information Semantics
lobrst@xxxxxxxxx    Center for Innovative Computing & Informatics
Voice: 703-983-6770 7515 Colshire Drive, M/S H305
Fax: 703-983-1379   McLean, VA 22102-7508, USA
 

 

 


From: ontology-summit-bounces@xxxxxxxxxxxxxxxx [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Uschold, Michael F
Sent: Friday, April 20, 2007 9:01 PM
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

Methinks this is a leftover from DL-speak, in which 'concept' refers to the classes, not the relationships.
I prefer the broader use of 'concept', whereby one speaks of the concept of having a brother, or of being a mentor (which of course are relationships).
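
To make the DL convention concrete (a made-up axiom; the names are illustrative, not from any real ontology):

    \[
    \mathit{Person} \sqsubseteq \exists \mathit{hasBrother}.\mathit{Person}
    \]

Here Person is a "concept" in the DL sense, while hasBrother is a role; though even DL can repackage the relationship as a concept, since \(\exists \mathit{hasBrother}.\top\) is exactly the class of things that have a brother, which is the broader reading above.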

 

Good to raise this ambiguity.

Mike

 

 

==========================
Michael Uschold
M&CT, Phantom Works
425 373-2845
michael.f.uschold@xxxxxxxxxx 
==========================

----------------------------------------------------
COOL TIP: to skip the phone menu tree and get a human on the phone, go to: http://gethuman.com/tips.html

 

 

 


From: Bill Andersen [mailto:andersen@xxxxxxxxxxxxxxxxx]
Sent: Friday, April 20, 2007 5:58 PM
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

Correction.  Second sentence should read:

 

Are relations not "conceptual" in the way that "concepts" are?

 

Sorry 'bout that.

 

On Apr 20, 2007, at 20:57, Bill Andersen wrote:



Pat,

 

How come "relations" are a separate category from "concepts"?  Are relations not "conceptual" in the way that "conceptual" are?  If it is the case that 'concept' is just parlor speak for those things that we typically represent with nodes in a taxonomy or unary predicates in a logic, and if 'relation' is used to talk about those things that are not "concepts" (i.e. the things we like to represent with predicate terms of arity greater than one), then the distinction seems artificial.  Should there not be just "concepts" divided into the 1-, 2- ... n-ary cases?
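
A minimal sketch of the arity point (invented names): nothing in the formalism separates the unary case from the rest.

    # "Concepts" as unary predicates and "relations" as higher-arity
    # predicates: both are just predicates, differing only in arity.
    from typing import Callable

    is_tree: Callable[[str], bool] = lambda x: x in {"oak#1", "elm#7"}
    brother_of: Callable[[str, str], bool] = lambda x, y: (x, y) in {("abel", "cain")}

    assert is_tree("oak#1")             # arity 1: a "concept"
    assert brother_of("abel", "cain")   # arity 2: a "relation"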

 

.bill

 

On Apr 20, 2007, at 19:12, Cassidy, Patrick J. wrote:



In discussions I use: "A representation of the structure of concepts and the relations between them, in a form that a computer can reason with."

Pat

 

Patrick Cassidy
CNTR-MITRE
260 Industrial Way West
Eatontown NJ 07724
Eatontown: 732-578-6340
Cell: 908-565-4053

 

 

-----Original Message-----
From: ontology-summit-bounces@xxxxxxxxxxxxxxxx On Behalf Of Peter F Brown
Sent: Friday, April 20, 2007 7:08 PM
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

 

Too many too's... ;-)

But seriously, are we looking for a Gartner Group-style four-word mission statement to make it sound good, or do we want a formulation that actually does mean something and that we can agree on? Brevity does not always equate with clarity: if I have to choose to sacrifice one, it would be brevity.

Peter

 

-----Original Message-----
From: ontology-summit-bounces@xxxxxxxxxxxxxxxx On Behalf Of Deborah MacPherson
Sent: 20 April 2007 16:02
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

 

 "a formal description of terms that represent concepts and

relationships in as

chosen subject matter of interest"

 

is too long, too much of a mouthful of too many words.

 

Debbie

 

On 4/20/07, Uschold, Michael F <michael.f.uschold@xxxxxxxxxx> wrote:

It's almost good enough... But an ontology is about more than just terms.

How about:

"a formal description of terms that represent concepts and relationships in a chosen subject matter of interest"

Mike

 

 

 

==========================
Michael Uschold
M&CT, Phantom Works
425 373-2845
==========================

----------------------------------------------------
COOL TIP: to skip the phone menu tree and get a human on the phone, go to: http://gethuman.com/tips.html

 

 

 

-----Original Message-----
From: Peter F Brown [mailto:peter@xxxxxxxxxx]
Sent: Friday, April 20, 2007 3:08 PM
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

 

I agree: we've worked with the definition "a formal description of terms and the relationships between them" [1] as being good enough to know what we're talking about when we're talking about what we're talking about... and "good enough" should be good enough.

Peter

[1] From 'OASIS Reference Model for Service-Oriented Architecture', p. 17.

 

 

 

-----Original Message-----
From: ontology-summit-bounces@xxxxxxxxxxxxxxxx On Behalf Of Chris Welty
Sent: 19 April 2007 20:23
To: Ontology Summit 2007 Forum
Subject: Re: [ontology-summit] Ontology Framework Draft Statement for the Ontology Summit

 

 

Surely after 15 years we can do better than "specification of a conceptualization"?  Isn't it time we put that one to rest?

-Chris

 

Obrst, Leo J. wrote:

All,

Here is our draft statement about the Ontology Framework. We invite you to consider and discuss this -- now and in next week's sessions. We intend this to be an inclusive characterization of what an ontology is. Inclusive, meaning that we invite you to consider where you and your community are with respect to these dimensions. If you have concerns or issues, restatements or elaborations, please let us know now and next week. This will shortly be posted on the Framework Wiki page: meworksForConsideration.

Thanks much,

Tom Gruber, Michael Gruninger, Pat Hayes, Deborah McGuinness, Leo Obrst

 

_____________________________________________
Dr. Leo Obrst       The MITRE Corporation, Information Semantics
lobrst@xxxxxxxxx    Center for Innovative Computing & Informatics
Voice: 703-983-6770 7515 Colshire Drive, M/S H305
Fax: 703-983-1379   McLean, VA 22102-7508, USA

 

 

 

 

 

 


 

--
Dr. Christopher A. Welty    IBM Watson Research Center
+1.914.784.7055             19 Skyline Dr.
cawelty@xxxxxxxxx           Hawthorne, NY 10532

 


 

 


 

 

 

--
*************************************************
Deborah L. MacPherson
Specifier, WDG Architecture PLLC
Projects Director, Accuracy & Aesthetics

The content of this email may contain private and confidential information. Do not forward, copy, share, or otherwise distribute without explicit written permission from all correspondents.
**************************************************

 


 

Bill Andersen (andersen@xxxxxxxxxxxxxxxxx)
Chief Scientist
Ontology Works, Inc. (www.ontologyworks.com)
3600 O'Donnell Street, Suite 600
Baltimore, MD 21224
Office: 410-675-1201
Cell: 443-858-6444



 

 




 




_________________________________________________________________
Msg Archives: http://ontolog.cim3.net/forum/ontology-summit/ 
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontology-summit/  
Unsubscribe: mailto:ontology-summit-leave@xxxxxxxxxxxxxxxx
Community Files: http://ontolog.cim3.net/file/work/OntologySummit2007/
Community Wiki: http://ontolog.cim3.net/cgi-bin/wiki.pl?OntologySummit2007
Community Portal: http://ontolog.cim3.net/