
Re: [ontolog-forum] Child architecture

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Philip Jackson <philipcjacksonjr@xxxxxxxxxxx>
Date: Sun, 21 Dec 2014 10:40:45 -0500
Message-id: <SNT147-W39EDBC15EB1B205C424E92C1690@xxxxxxx>
John,

Thanks for your comments:
 
JS: 
> But it's important to distinguish formal terms in a theory, data
> derived by experimental procedures and observations, and informal
> words whose meanings evolved through thousands of years of usage.
> They are related, but not in any one-to-one correspondence.
Agreed.
 
> PJ:
> > Consciousness is a dynamic property of a system observing itself and
> > its relation to its environment. This property is evidenced by the
> > creation of conceptual structures that represent a system's observations
> > and thoughts about itself and its environment.
JS:
> This is an attempt to define a common word by a technical definition.
> It's misleading on both sides: (1) attempts to legislate how people
> use words have little or no effect on practice; (2) using informal
> words in a technical sense is confusing for the reader, who can't
> avoid mixing preconceived ideas with the formal presentation.
 
To clarify, I was not stating this as a definition of consciousness. It was just part of an answer to Steven E-Z's question about whether I was claiming that consciousness resides in bit patterns on hard disks (which I am not).
 
The problem of defining consciousness is similar to the problem of defining intelligence, which has challenged AI since its inception. Section 2.1 presents the thesis's approach to defining 'human-level AI' and explains why the thesis needs to consider consciousness in discussing human-level AI. Section 2.3.4 discusses previous research on artificial consciousness and introduces Aleksander & Morton's 'axioms of being conscious', which section 3.7.6 adapts for the thesis's definition of artificial consciousness, i.e., what is required in order to say a system has (artificial) consciousness.
 
JS:
> As an example, I would cite Dehaene's term 'signature of consciousness'.
> That's a technical term with 'signature' as the head word (focus).
> The common word 'consciousness' is in a qualifier that serves as a
> reminder that this technical term is related to the informal term. *
>
> In my writing, I use the term 'conceptual graph' as a technical term,
> in which the head word is 'graph'. I also use the word 'concept',
> but I emphasize that it has no formal meaning other than "a node
> in a graph".
>
> If anybody asks what a concept means in the theory, I just repeat:
> "Formally, a concept is a node in a graph. Its only meaning comes
> from the operations that are defined on the graphs."
>
> * Dehaene, Stanislas (2014) _Consciousness and the Brain_,
> New York: Viking.
 
I take your point. In retrospect, perhaps the thesis should have made more use of the term 'artificial consciousness' throughout, to avoid confusion with the common word 'consciousness' and to avoid philosophical debates about whether an AI system really is or is not conscious.
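 
As a side note, your point that a concept's only formal meaning comes from the operations defined on the graphs is easy to make concrete. Here is a minimal sketch in Python; the class names and the simplified join operation are my own illustration, not taken from your conceptual graph tools or from the CGIF standard:

# Illustrative sketch only: a "concept" is just a node carrying a type
# label; any meaning it has comes from the operations defined on graphs.

class Concept:
    """A node in a conceptual graph; it carries only a type label."""
    def __init__(self, type_label):
        self.type_label = type_label

class Relation:
    """A labeled arc linking concept nodes."""
    def __init__(self, label, *args):
        self.label = label
        self.args = args  # tuple of Concept nodes

class ConceptualGraph:
    def __init__(self):
        self.concepts = []
        self.relations = []

    def add_concept(self, type_label):
        c = Concept(type_label)
        self.concepts.append(c)
        return c

    def add_relation(self, label, *concepts):
        r = Relation(label, *concepts)
        self.relations.append(r)
        return r

    def join(self, other):
        """Simplified stand-in for a formation operation: merge two graphs.
        (A real join would unify matching concept nodes.)"""
        g = ConceptualGraph()
        g.concepts = self.concepts + other.concepts
        g.relations = self.relations + other.relations
        return g

# Example: the graph for "a cat is on a mat", [Cat]->(On)->[Mat]
g = ConceptualGraph()
cat = g.add_concept("Cat")
mat = g.add_concept("Mat")
g.add_relation("On", cat, mat)

The Concept class holds nothing but its label; whatever a concept "means" in this toy version comes entirely from how add_relation, join, and any further operations are defined over the graphs, which I take to be exactly your point.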
 
Phil

