
Re: [ontolog-forum] Ontology, Information Models and the 'Real World': C

To: ontolog-forum@xxxxxxxxxxxxxxxx
From: KCliffer@xxxxxxx
Date: Mon, 28 May 2007 08:19:42 EDT
Message-id: <bdd.1318d773.338c22de@xxxxxxx>
An observation and some thoughts emerging from it:
Note how much discussion was generated by a simple unintentional error in coding/terminology - an instance of a difference between what was meant and what was perceived in a proposition - in this case a type of instance not included in our recent previous discussion (a simple unintentional error in expression). Had the proposer fully reviewed and revised the proposition, it would have sailed through the discussion far more smoothly than it actually did.
Note - I mean to cast no aspersions - I make plenty of such mistakes and am the first to hope for them to be excused or treated without negative judgment, and furthermore to be corrected by me or others before they cause any negative effects. My purpose here is to point out another kind of example that systems must take into account when categorizing or handling propositions - their meaning may vary or be uncertain for many reasons, including simple error in composition, as well as differences in perspectives, perceptions, experience, etc.
In fact, one characteristic of a well-functioning system is that it can accommodate and correct such errors through its ordinary processes, without causing "collateral damage" to the fallible human involved or to that person's ability to contribute constructively, and without negatively affecting other aspects of the system's functioning -- as, I might point out, appears eventually to have happened here, as far as I can tell, to this discussion's credit.
The stakes in such functionality depend on the functional purpose of the system - for example if it's a medical system in which lives or health are at stake, the importance of such robustness with respect to errors is obvious. In other kinds of systems, the nature and importance of how they deal with error may not be so obvious. In complex systems, small variations can have surprisingly great and hard-to-predict effects (sometimes represented by the "butterfly effect," in which a butterfly's wing-flapping theoretically could result in a hurricane elsewhere in the world). Stories abound about how small, understandable human errors have had disastrous results in systems that were not robust enough to accommodate and correct them or correct for their effects (including in high-stakes systems).
In a message dated 5/28/2007 2:13:09 A.M. Eastern Daylight Time, Waclaw.Marcin.Kusnierczyk@xxxxxxxxxxx writes:
nothing else, in fact;  i would be surprised were it not that definition.

but i am aware of the immenseness of my ignorance, and was eager to
discover that there are other definitions one may coherently use.


Christopher Menzel wrote:
> What did you expect?
> On May 28, 2007, at 12:55 AM, Waclaw Kusnierczyk wrote:
>> pretty standard.
>> ok.
>> vQ
>> Christopher Menzel wrote:
>>> On May 27, 2007, at 4:08 PM, Waclaw Kusnierczyk wrote:
>>>> this was my guess, though i was wondering whether you use a
>>>> different definition of 'asymmetric'.
>>> (if (R x y) (not (R y x)))
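[For readers outside the quoted thread: the formula above is the standard definition of an asymmetric relation - for all x and y, if R(x, y) holds then R(y, x) does not. A minimal sketch of checking this property for a finite relation; the function name and the sample relations are my own illustration, not anything from the thread:

```python
def is_asymmetric(relation):
    """Check asymmetry of a finite relation given as a set of (x, y) pairs:
    for every pair (x, y) in R, the reversed pair (y, x) must not be in R."""
    return all((y, x) not in relation for (x, y) in relation)

# "less than" restricted to a few integers is asymmetric
less_than = {(1, 2), (1, 3), (2, 3)}

# "not equal" is symmetric, hence not asymmetric
not_equal = {(1, 2), (2, 1)}

print(is_asymmetric(less_than))  # True
print(is_asymmetric(not_equal))  # False
```

Note that this definition also rules out reflexive pairs: if (x, x) were in R, its own reversal (x, x) would be too, so an asymmetric relation is necessarily irreflexive.]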
Kenneth Cliffer, Ph.D.
cell: 703-919-0104
e-mail: KCliffer@xxxxxxx


Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx
