To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Steven Ericsson-Zenith <steven@xxxxxxx>
Date: Sun, 21 Dec 2014 10:31:14 -0800
Message-id: <CAAyxA7ub745akAg_5oXv4mGPaa9amLioe0FEXFjztmbUt8iC4Q@xxxxxxxxxxxxxx>
Thank you, John. This is a cogent response. I expect terms like those discussed in a thesis to relate to some behavior under discussion. IOW, I expect such discussion to contribute to the thesis.
Using a discussion of a term, such as "consciousness" (again, however it is meant), simply to hide an inadequacy of the thesis does a disservice to the other good work done. Phil appears to be saying (to me) "I do not know what this term means, but let's call it something that you may be familiar with," and in doing so, "I'll discuss what others have said," but "I will not offer any explanation or speak to what it might do." I will probably have the same problem with "signature of consciousness" unless you are referring to some simple correlation, such as power utilization while the network router is turned on.

Phil's last response probably deserves a response from me. However, I am stumped by his expectation that his work is a basis for future work in this field. This is a naive expectation of a new PhD. I recall my own response on the completion of my doctorate more than twenty years ago (in parallel computation): not satisfaction but rather a discomfort that ultimately led me to my current research.

Being charitable, and revealing something of my own interest, I assume "consciousness" is in the thesis because there is some expectation that "feeling" is somehow involved in the claim for "human level intelligence." This is certainly my expectation, both intuitively and as confirmed by studying results from biophysics over the past two decades. But it should be clear to all that if it, like the general notion of "sense," is indeed involved, then there is something that it must do and some way to account for it mathematically in physics. Anyone who thinks the matter is resolved by computation or electronic sensing devices has had their head in the sand for the past ten years.

I suspect that one day, perhaps soon, we will be able to build human level intelligence, but these "machines" will be "machines that experience," in which the development of "feeling" is a principal goal and in which this property plays a central role (and gives us what Charles Peirce called his "third"), enabling low-power "general recognition" and low-power decisions across diverse large-scale structures. But it should be clear that such machines will not be conventional electronic devices, nor will they be computational by our (Turing-based) conception.

Regards,
Steven

On Sun, Dec 21, 2014 at 6:25 AM, John F Sowa <sowa@xxxxxxxxxxx> wrote:
Phil and Steven,