
Re: [ontolog-forum] Self Interest Ontology

To: "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "Rich Cooper" <rich@xxxxxxxxxxxxxxxxxxxxxx>
Date: Fri, 24 Aug 2012 21:48:00 -0700
Message-id: <7C37230F05BD47F086B4213FEC434467@Gateway>

Dear John,

 

I agree: consciousness is not the place to start, which is why I am treating it as an unknown variable, with structure, form, and content still to be determined. 

 

It will eventually have to be elaborated further, but first I think it would be wise to concentrate on the other term, which he called paralysis. 

 

Paralysis, in his rendition, has to do with the loss of specific capabilities that neuroscience SMEs have names for.  That use of the word is what I am trying to capture in the Self Interest Ontology. 

 

To ultimately define the whole Ontology, we first have to define the self, in the words that specialist SMEs use.  So both consciousness and paralysis eventually have to be refined in an iterative way. 

 

Paralysis has to be refined first, IMHO, because consciousness is created in part by the experiences we associate with actions and objects. 

 

Our experiences are memories of the objects we have encountered in the past, and memories of the actions we have observed, performed ourselves, or seen performed by other selves. 

 

Those stored memories are retrieved, and they define what we can think about, and what we experience, in the future.

 

Paralysis refers to the loss of the capabilities enumerated in the vocabulary of the actions and objects we can no longer perform or engage with.  So paralysis and that vocabulary are closely interrelated semantically. 
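
To make that interrelation concrete, here is a minimal sketch in TypeScript notation, assuming a very simple vocabulary representation; the names ActionTerm, ObjectTerm, Capability, and ParalysisTerm are my own illustrations, not terms from the neuroscience literature or from Damasio:

    // Hypothetical sketch only: paralysis read as the loss of capabilities
    // drawn from a vocabulary of actions and objects.  All names are mine.
    interface ActionTerm { name: string }    // e.g. "grasp", "walk"
    interface ObjectTerm { name: string }    // e.g. "cup", "stair"

    // A capability pairs an action with the objects it can act upon.
    interface Capability {
      action: ActionTerm;
      objects: ObjectTerm[];
    }

    // Paralysis, in this reading, is simply the set of capabilities lost.
    interface ParalysisTerm {
      lostCapabilities: Capability[];
    }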

 

Thus I am suggesting a top-down design of the Self Interest Ontology, using words drawn from and related to his theory of what maps into or out of what, in neurophysiological terms. 

 

That way, in future work, we can refine those terms still further.  Wherever the need for refinement exists and is justifiable, that is where we should focus. 

 

I think he is saying that consciousness is based on sensory and motor feedback loops, and on the purposes to which they are put, viewed from the neuroscience standpoint. 

 

Clearly we can only go so far before reaching a practical limit on what it is worth to us.  Consider this exploratory.  Refinement stops when no remaining TBDs are worth the effort of further refinement. 

 

That makes it a top level ontology waiting for further applications. 

 

Each application, by contrast, starts with its own words at the bottom of the ontology.  At some point, it refines the words of the upper ontology onto those application words to render the application. 
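
As a rough illustration of that refinement step, here is a sketch in TypeScript-style declarations; every name below is invented for illustration and does not come from this discussion:

    // Hypothetical sketch: an upper-ontology term left as a placeholder,
    // and one application narrowing it onto its own vocabulary.
    type TBD = unknown;                  // upper-ontology placeholder

    interface UpperSelf {
      consciousness: TBD;                // refined later by each application
      paralysis: TBD;
    }

    // An imagined rehabilitation-clinic application refines the
    // placeholders onto the words its own SMEs use.
    interface ClinicSelf extends UpperSelf {
      consciousness: { awarenessScore: number };
      paralysis: { affectedLimbs: string[] };
    }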

 

The result is a Self Interest Ontology in refinement.  We started by defining Self.  Now there is a neuroscientist with a clear and elaborate explanation of what Self actually is, in physiological terms.  That is the basis of the Self Interest Ontology as I have proposed it. 

 

Interest comes later. 

 

-Rich

 

Sincerely,

Rich Cooper

EnglishLogicKernel.com

Rich AT EnglishLogicKernel DOT com

9 4 9 \ 5 2 5 - 5 7 1 2


From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of John Bottoms
Sent: Friday, August 24, 2012 6:55 PM
To: ontolog-forum@xxxxxxxxxxxxxxxx
Subject: Re: [ontolog-forum] Self Interest Ontology

 

Rich,

It certainly is an interesting topic for anyone who has passed time thinking about consciousness.

However...I am more than a little suspicious about any neuro*-work that has been turned into a music composition.

Further, consciousness is such a large topic that it touches on every bit of an ontology. It might even define the scope and domain of the ontology. That is too large a work area to attempt. A reduction in scope is in order. Don't get me wrong; I am intrigued but it reminds me of my 7th grade days when I wanted to study calculus. My mother explained that I needed algebra, trig and geometry before that would be possible.

And, as I have said before, it is nigh on impossible to make any progress in defining a useful ontology without a problem statement or a set of use cases. Linguists know that environment defines and shapes a language. It is true of ontologies also.

-John Bottoms
 FirstStar Systems
 Concord, MA

On 8/24/2012 8:33 PM, Rich Cooper wrote:

If there is any interest in this topic, here is a TED talk by Antonio Damasio, a neuroscientist, who correlates the self with conscious awareness of self, locates it in the brain, and describes many related things (structure, fiber pathways, damage to specific regions …) and how those things interact with the experience of self. 

 

http://www.ted.com/talks/antonio_damasio_the_quest_to_understand_consciousness.html

 

His rendition, from a neuroscience perspective, is that the information sensed by each body sensor and processed in the cortex (the visual and auditory parts of the brain) is “made available to the motor cortex and the hindbrain”. 

 

That is where he locates the self: in two small adjacent regions spanning the width of the brainstem. 

 

His perspective is literate and informative, and he spells out how, in his view, that perspective is justified. I propose that his rendition of the self is what should be nominated as the official

 

            SELF INTEREST ONTOLOGY

 

And I so nominate it.  Now the problem will be to codify it into assertions that can be agreed on.  It would be useful to transcribe his statements into text.  Does anyone agree, disagree, or object to this (or has everyone quit reading)?

 

Damage to each of those two regions has unique results, which he describes eloquently in terms of “consciousness” on the one hand and “paralysis” on the other.   So I propose this, in my own chosen form of logic:

 

type Self =
    Consciousness : TBD0;
    Paralysis     : TBD1;
;
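
For readers who prefer a conventional notation, here is one possible rendering as a TypeScript record, keeping TBD0 and TBD1 as unresolved placeholders; this is only a sketch of my reading of the declaration above, not a refinement of it:

    // Sketch of the proposed Self type; only the names Self,
    // Consciousness, Paralysis, TBD0, and TBD1 come from the proposal.
    type TBD0 = unknown;   // consciousness: structure, form, content still to be determined
    type TBD1 = unknown;   // paralysis: loss of specific, named capabilities

    interface Self {
      consciousness: TBD0;
      paralysis: TBD1;
    }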

 

At this point, that’s all I have to contribute about how to refine the Self in each of the ways that the good professor related so well. 

 

Refinements of TBD0 and TBD1 according to his perspective would be welcome, if anyone wants to comment.  Are there any specialists in the crowd?

 

-Rich

 

 

 

The regions seem to be paired, rather like the pairing of the kidneys, and they sit further down in the brainstem than I find comfortable. 

Together, a cross-section of the combined region would be about the size of the nerve bundle in your neck.  That's thinner than I am comfortable with, but who would have known?

 

 

Sincerely,

Rich Cooper

EnglishLogicKernel.com

Rich AT EnglishLogicKernel DOT com

9 4 9 \ 5 2 5 - 5 7 1 2




 

 


