Answers inline:
>> Context is a set
>
> You are speaking in the singular. Do you mean to
> imply that that is a single thing called
> "context" which is a *particular* set?
DN: The way CCTS did this was to establish eight major factors that
influence how a generalized data element concept becomes a very specific
data element. Collectively these factors were referred to as the context,
which implies a set of things. The set was deemed non-exclusive, as there
was no way to know whether it was complete.
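A rough Python sketch of how such a context set might be modelled (the
eight category names are as I recall them from the CCTS spec; the function
and variable names are my own, purely illustrative):

    # The eight CCTS context categories; a "context" is then a set of
    # qualifier values, at most one per category. The set is non-exclusive:
    # absent categories simply impose no constraint.
    CONTEXT_CATEGORIES = {
        "BusinessProcess", "ProductClassification", "IndustryClassification",
        "Geopolitical", "OfficialConstraints", "BusinessProcessRole",
        "SupportingRole", "SystemCapabilities",
    }

    def make_context(**qualifiers):
        """Build a context as a mapping of category -> qualifier value."""
        unknown = set(qualifiers) - CONTEXT_CATEGORIES
        if unknown:
            raise ValueError("not CCTS context categories: %s" % unknown)
        return qualifiers

    # A partial context: only two of the eight categories constrained.
    ctx = make_context(BusinessProcess="Procurement", Geopolitical="Canada")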
>
>> of zero or more qualifiers
>
> What is a 'qualifier'?
DN: It is a UN/CEFACT term. It roughly translates to an external force or
aspect of context which has a specializing effect on the instance being
discussed. Example: if I am an instance of the human class (some say this
might be a stretch) and am in Canada, that makes me a Canadian human,
which is more specific than a mere human.
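A toy Python illustration of that specializing effect (the function is
mine, not UN/CEFACT's):

    def qualify(term, qualifier):
        """Apply a context qualifier to a term, yielding a more
        specialized term."""
        return qualifier + " " + term

    # The generic class "human", placed in the Canada context:
    print(qualify("human", "Canadian"))   # -> "Canadian human"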
>
>> that affect various aspects of
>> the semantics of a given statement.
>
> Of the semantics or of the meaning?
DN: Specifically, in UN/CEFACT terms, it modifies the semantics by making
the instance more specialized for the specific context. A generic date
data element can be refined and constrained to be a
"PurchaseOrder.Ship.Date", which is a type of date that takes on
additional characteristics.
Note that all of the preceding is done during the modeling work to refine
metadata by making the semantics of data elements more specialized.
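Here is a small Python sketch of that refinement, assembling the
dictionary entry name from object class, property term and representation
term in the ISO 11179 / CCTS naming style (the class and attribute names
are mine):

    from dataclasses import dataclass

    @dataclass
    class DataElementConcept:
        object_class: str         # e.g. "PurchaseOrder"
        property_term: str        # e.g. "Ship"
        representation_term: str  # e.g. "Date"

        @property
        def dictionary_entry_name(self):
            # ObjectClass . PropertyTerm . RepresentationTerm
            return "%s.%s.%s" % (self.object_class, self.property_term,
                                 self.representation_term)

    # The generic Date representation refined for a purchase order context:
    ship_date = DataElementConcept("PurchaseOrder", "Ship", "Date")
    print(ship_date.dictionary_entry_name)   # -> "PurchaseOrder.Ship.Date"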
>
>> Different context qualifiers can impact
>> one or more aspects of the semantics of such statements including
>> representation terms and concepts.
>>
>> Example:
>>
>> A glass of water
>
> What is this an example of? Is the glass of water
> the topic of an ontology, or an example of a
> representation of something?
DN: An instance of a class in either a taxonomy or an ontology. Perhaps
this is not the best example, as I noted after I wrote it.
>
>> Context one:
>>
>> Glass of water is sitting on your kitchen table. To the average Western
>> observer, it is quite in its place in this context and is not raised in
>> one's internal tuple stores.
>
> ?? What internal tuple stores are you talking
> about? I very much doubt if my brain works using
> RDF. And what does 'raise' mean?
>
DN: An analogy to say that you would not raise it in your general level
of awareness in this context, as there is nothing that strikes you as
unusual or out of place. OTOH, if you were very thirsty, you might
immediately fixate your attention on this object.
>> Concept: water, two molecules of hydrogen bound to one of oxygen in a glass
>> container.
>> Dangerous: possibly
>> State: liquid
>> Use: quenching thirst
>> Mass: about 200 grams
>>
>> Context two:
>>
>> You are driving down the road in sub zero weather and the same glass of
>> water is in the middle of the road.
>
> No, it isn't. Not the SAME glass of water. OK,
> these are two different circumstances, and if
> fully described in an ontological formalism would
> have very different descriptions. But what has
> this fact to do with contexts? Ontologies aren't
> living in the world like we are, driving cars and
> drinking glasses of water. Living in a changing
> world raises a host of new issues that go beyond
> ontology engineering. This kind of difference you
> describe here is more relevant to KR in AI than
> to ontology engineering.
DN: Perhaps. I am here to try and learn/understand this. Your help has
been great so far.
>
> Here's a quick way to say the difference between
> OE and AI. In many AI applications, a reasoner is
> living IN a changing world, one that it needs to
> perceive and act in. In OE applications, the job
> of the ontology is to DESCRIBE possibly-changing
> worlds, but not to live IN them.
>
DN: Makes sense.
>> Since it represents a hazard for your
>> vehicle, you raise it to the highest layers of consciousness in your
>> internal tuple stores and give it active attention.
>
> Ah, this is what 'raise' means? OK, but what has
> this to do with ontologies? AFAIK, the notion of
> attention simply doesn't arise in mechanical
> reasoning.
>
DN: It is sort of orthogonal to ontology; however, in the CI field,
cognitive skills rely on an agent recognizing that X is an instance of
<ontology_class> or <taxonomy_item>, etc. The level of attention a given
instance X has in any observer's consciousness (or machine-equivalent
tuple stores) varies depending on the surrounding context in which the
instance X occurs. This is of course not a 100% proven hypothesis, only
my opinion. I am not sure if others agree.
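If it helps, here is a deliberately toy Python sketch of that hypothesis
(the function and the numbers are mine and purely illustrative, nothing
more than my opinion encoded):

    def salience(instance_class, context):
        """Toy model: how much attention an observer gives an instance,
        depending on the surrounding context in which it occurs."""
        if instance_class == "GlassOfWater" and context == "kitchen table":
            return 0.1   # unremarkable, stays in the background
        if instance_class == "GlassOfWater" and context == "icy road ahead":
            return 0.9   # recognized as a hazard, gets active attention
        return 0.5       # default level of awareness

    print(salience("GlassOfWater", "kitchen table"))   # -> 0.1
    print(salience("GlassOfWater", "icy road ahead"))  # -> 0.9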