
Re: [ontolog-forum] Child architecture

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Steven Ericsson-Zenith <steven@xxxxxxx>
Date: Sun, 21 Dec 2014 10:31:14 -0800
Message-id: <CAAyxA7ub745akAg_5oXv4mGPaa9amLioe0FEXFjztmbUt8iC4Q@xxxxxxxxxxxxxx>
Thank you, John. This is a cogent response. I expect terms like those discussed in a thesis to relate to some behavior under discussion; in other words, I expect such discussion to contribute to the thesis.

Using a discussion of a term such as "consciousness" (again, however it is meant) simply to hide an inadequacy of the thesis does a disservice to the other good work done. Phil appears to be saying (to me) "I do not know what this term means, but let's call it something that you may be familiar with," and in doing so "I'll discuss what others have said," but "I will not offer any explanation or speak to what it might do."

I will probably have the same problem with "signature of consciousness" unless you are referring to some simple correlation, such as power utilization while a network router is turned on.

Phil's last response probably deserves a reply from me. However, I am stumped by his expectation that his work is a basis for future work in this field; that is a naive expectation for a new PhD. I recall my own response on the completion of my doctorate more than twenty years ago (in parallel computation): not satisfaction, but rather a discomfort that ultimately led me to my current research.

Being charitable, and revealing something of my own interest, I assume "consciousness" is in the thesis because there is some expectation that "feeling" is somehow involved in the claim for "human-level intelligence." That is certainly my own expectation, held intuitively and confirmed after studying results from biophysics over the past two decades. But it should be clear to all that if it, like the general notion of "sense," is indeed involved, then there is something that it must do and some way to account for it mathematically in physics.

Anyone who thinks the matter is resolved by computation or by electronic sensing devices has had their head in the sand for the past ten years. I suspect that one day, perhaps soon, we will be able to build human-level intelligence, but these "machines" will be "machines that experience," in which the development of "feeling" is a principal goal and in which this property plays a central role (giving us what Charles Peirce called his "third"), enabling low-power "general recognition" and low-power decisions across diverse large-scale structures. But it should be clear that such machines will not be conventional electronic devices, nor will they be computational by our (Turing-based) conception.

Regards,
Steven


On Sun, Dec 21, 2014 at 6:25 AM, John F Sowa <sowa@xxxxxxxxxxx> wrote:
Phil and Steven,

These issues arise from trying to relate words of ordinary language
to technical terms in various branches of cognitive science. I agree
that some discussion of the relationships is useful in a book with
the scope of Phil's dissertation.

But it's important to distinguish formal terms in a theory, data
derived by experimental procedures and observations, and informal
words whose meanings evolved through thousands of years of usage.
They are related, but not in any one-to-one correspondence.

Phil
> Consciousness is a dynamic property of a system observing itself and
> its relation to its environment. This property is evidenced by the
> creation of conceptual structures that represent a system's observations
> and thoughts about itself and its environment.

This is an attempt to define a common word by a technical definition.
It's misleading on both sides: (1) attempts to legislate how people
use words have little or no effect on practice; (2) using informal
words in a technical sense is confusing for the reader, who can't
avoid mixing preconceived ideas with the formal presentation.

As an example, I would cite Dehaene's term 'signature of consciousness'.
That's a technical term with 'signature' as the head word (focus).
The common word 'consciousness' appears in a qualifier that serves as
a reminder that this technical term is related to the informal one. *

In my writing, I use the term 'conceptual graph' as a technical term,
in which the head word is 'graph'.  I also use the word 'concept',
but I emphasize that it has no formal meaning other than "a node
in a graph".

If anybody asks what a concept means in the theory, I just repeat:
"Formally, a concept is a node in a graph.  Its only meaning comes
from the operations that are defined on the graphs."
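To make that concrete, here is a minimal sketch in Python. The names (Concept, ConceptualGraph, join) and the join rule are illustrative assumptions, not Sowa's formal notation or the CG standard; the point is only that a concept is nothing but a labelled node, and its label matters solely because of what a graph operation does with it.

from dataclasses import dataclass, field

@dataclass(eq=False)          # identity-based hashing: each node is a distinct object
class Concept:
    """A concept is just a node with a type label and an optional referent."""
    type_label: str
    referent: str = "*"       # "*" marks a generic (unspecified) referent

@dataclass
class ConceptualGraph:
    concepts: set = field(default_factory=set)
    relations: set = field(default_factory=set)   # (label, (node, node, ...))

    def join(self, other, type_label):
        """Merge two graphs, collapsing generic concepts of the given type label.

        This is where any 'meaning' enters: two nodes count as the same
        concept only because this operation treats their labels that way.
        """
        merged = ConceptualGraph(self.concepts | other.concepts,
                                 self.relations | other.relations)
        matches = {c for c in merged.concepts
                   if c.type_label == type_label and c.referent == "*"}
        if len(matches) > 1:
            keep = next(iter(matches))
            merged.concepts -= matches
            merged.concepts.add(keep)
            merged.relations = {
                (label, tuple(keep if c in matches else c for c in args))
                for (label, args) in merged.relations}
        return merged

if __name__ == "__main__":
    cat, mat = Concept("Cat"), Concept("Mat")
    g1 = ConceptualGraph({cat, mat}, {("On", (cat, mat))})
    cat2, animal = Concept("Cat"), Concept("Animal")
    g2 = ConceptualGraph({cat2, animal}, {("IsA", (cat2, animal))})
    joined = g1.join(g2, "Cat")
    print(len(joined.concepts), "concepts,", len(joined.relations), "relations")
    # -> 3 concepts, 2 relations: the two generic Cat nodes were merged.

Nothing in the program records what "Cat" stands for; the only sense in which
the node means anything is that the join operation merges matching labels.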

* Dehaene, Stanislas (2014) _Consciousness and the Brain_,
   New York: Viking.

John


