
Re: [ontolog-forum] Visual Complexity

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: paola.dimaio@xxxxxxxxx
Date: Sat, 3 Feb 2007 11:40:49 +0700
Message-id: <c09b00eb0702022040w18356bb2m8c7805f8dc381a4@xxxxxxxxxxxxxx>
***LONG BUT INTERESTING***    (01)


Hi Pat
You make so many points...
I am glad that with more clarification my arguments raise fewer objections    (02)

just a few more (quick?) thoughts:
- my ideas are sometimes 'packed' and conveyed with inexact
expressions, simply because I assume (incorrectly, obviously) that I am
communicating with people who share assumptions, language and
background - while I realise now that this may not be the case. We may
both claim to have some expertise and understanding in ontology and
semantics, yet obviously have very different views/perceptions of our
domain. This however really shows the point of having an ontology, and
partly supports my argument that given two 'experts' (or more), the
boundaries of the subject matter they claim expertise about should not
be set by any of them alone. This is where I take (slight) issue when
people say ''ontology' is this, and not that'. I think we are all
entitled to participate in the description and analysis of what
ontology is for us, at least of what we think it is, and that any one
of us ruling out another view is simply restricting the view, rather
than widening it. Basically I think that some things are definitely
black and some things are definitely white, but that does not mean
that everything is black and white.    (03)


>>>> the fact that you are a clever and nice guy - as you
> >certainly are - means that you understand everything about ontology
> >and semantics.
>
> Well, not *everything*. I hope I didn't ever say that.    (04)

No, you never said that, but it sounds like you have made your mind up
about all things already and that you have divided the world between
'true' and 'untrue'.
I am just saying, 'true' and 'untrue' are often not absolute terms,
but relative to conditions.
In the open world, there are a lot of variables    (05)

> if you feel that you have a
> new perspective on all this stuff from an
> entirely different point of view, and especially
> a different *disciplinary* point of view (social
> science? ethnomethodology?  some part of
> linguistics?) then please don't take my comments
> or their tone as anything other than an opening
> move in a cross-disciplinary bout of (hopefully)
> mutual education.
well, not sure how to define where I come from in terms of discipline,
as my stance does not reflect only my brief studies (both social
science and technology, still looking for a PhD supervisor - anyone?),
but also reflects my view of reality through the lens that I have
developed from acquiring different knowledge during different work
experiences, travels, living in different cultures, and cultivating
personal interests (some of them as weird as those of other members of
this community, I read).
Studying bacterial behaviour may give you a different insight into the
human world, etc.    (06)

Also, what I study is the interaction of different fields and
disciplines, but such a discipline does not yet exist formally - how
do we go about launching it? - and maybe that's why I ended up in this
forum. Some scientists deny the existence of metaphysics as a
scientific field, and I do not blame them.
I like to think that my perspective of the day is 'Information
systems applied to dynamic heterogeneous social contexts' - how does
that sound?    (07)



> I think this applies to us all, regrettable
> though it may be. (I take it that, like the rest
> of us, you also do not claim to have a direct
> route to Reality.)
LOL. Understanding Reality is my ultimate personal pet project. Yet I
do not claim to have learned it all yet... also because it keeps on
changing (darn)    (08)


> >I refer to ontology as conceptual frameworks,  Gruber's first definition.    (09)

An ontology is a specification of a conceptualization.
www-ksl.stanford.edu/kst/what-is-an-ontology.html    (010)

This one. I remember I started reading Gruber and Sowa and thought:
this is what I wanted to study (even if I only understood the
pictures). I think Gruber's work may have evolved from there, probably
influenced by his own field and surroundings. But 'specification of a
conceptualization' is what I intend, unless otherwise specified, as
ontology.    (011)


> Gruber was (and still is) talking about
> formalized conceptual frameworks, and he was at
> the time talking specifically about such
> frameworks in the context of knowledge
> representation in AI (though the scope has since
> broadened, the technology and theory used in the
> field has all been inherited from AI/KR work.)
Yes, that's what I refer to when I use the word ontology, unless
otherwise specified    (012)


>
> From http://www-ksl.stanford.edu/kst/what-is-an-ontology.html (emphasis added)
>
> "A body of FORMALLY REPRESENTED knowledge is
> based on a conceptualization: the objects,
> concepts, and other entities that are assumed to
> exist in some area of interest and the
> relationships that hold among them (Genesereth &
> Nilsson, 1987) . A conceptualization is an
> abstract, simplified view of the world that we
> wish to represent for some purpose. Every
> knowledge base, knowledge-based system, or
> knowledge-level agent is committed to some
> conceptualization, explicitly or implicitly.
>
> An ontology is an EXPLICIT SPECIFICATION of a
> conceptualization. ... For AI systems, what
> "exists" is that which can be represented. When
> the knowledge of a domain is represented in a
> declarative formalism, the set of objects that
> can be represented is called the universe of
> discourse. This set of objects, and the
> describable relationships among them, are
> reflected in the representational vocabulary with
> which a knowledge-based program represents
> knowledge. Thus, in the context of AI, we can
> describe the ontology of a program by defining a
> set of representational terms. In such an
> ontology, definitions associate the names of
> entities in the universe of discourse (e.g.,
> classes, relations, functions, or other objects)
> with human-readable text describing what the
> names mean, AND FORMAL AXIOMS that constrain the
> interpretation and well-formed use of these
> terms. Formally, AN ONTOLOGY IS THE STATEMENT OF
> A LOGICAL THEORY."    (013)

yes, it is probably more than that, but I still think this definition is sound    (014)


>
> >An ontology is first formed at conceptual level (design); being
> >implemented is the last step.
>
> What do you mean by "implemented"?    (015)

I mean that the LOGICAL THEORY does not have a single language attached to it.
You can express a logical theory using a language, but the theory IS NOT
the language. Implementation independence is a fundamental concept in modern
software engineering    (016)

>> A  formalization (eg, in OWL, or IKL if you prefer a
> more exotic notation) is not an implementation of
> anything in the usual sense. Ontologies are not
> software! (BTW, I think that this is not widely
> recognized enough. Thinking of ontologies as
> software is one of the recurrent motivations for
> applying software engineering principles to
> ontology design, which may be a mistake. Or not,
> of course.)  <<There is a wider argument there, later-PDM>>    (017)

'implemented' is the physical dimension of a conceptual/knowledge formalism;
a formalisation is still implementation independent.
I think we learn this when we study systems engineering:
a system has a functional design, a logical design and an implementation.
I can implement a formalisation using different languages, OWL being just one    (018)
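To make the point concrete, here is a toy sketch in Python (purely
illustrative - the classes and predicates are made up by me) of one small
'logical theory' held as an abstract structure and then written out in two
different concrete notations:

# Toy sketch: the "theory" is just the abstract data structure below;
# an OWL-ish Turtle rendering and a Prolog rendering are merely two
# possible implementations of the same thing. All names are invented.
theory = [
    ("Ambulance", "subClassOf", "Vehicle"),
    ("FieldHospital", "subClassOf", "Facility"),
]

def as_turtle(axioms):
    return "\n".join(f":{s} rdfs:subClassOf :{o} ." for s, _, o in axioms)

def as_prolog(axioms):
    return "\n".join(f"subclass_of({s.lower()}, {o.lower()})." for s, _, o in axioms)

print(as_turtle(theory))   # one notation
print(as_prolog(theory))   # another notation, same theory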


> But a set of terms alone does not constitute
> an ontology, except in a very trivial sense.    (019)

I don't think I have said that    (020)

> [Later: I see that with your nonstandard usage of
> "term", in fact it does.]
>
> BTW - this might be central - what do you mean here by a "semantic reference"?    (021)

I cannot find the original context where I used the above expression, but I
guess I mean
'the term that has a relationship with the semantic convention adopted' -
you just use the words that have been chosen to represent the knowledge.
Pain (English) vs pain (French) is a good recent example    (022)

>
>
> >Impossible? Is this the same 'impossible' that people said when people
> >wanted to cross the sky, or go to the moon, or develop the internet?
>
> Maybe the entire AI/NL field
> is stuck in a mental box. But there is a
> recurring history here of optimism being replaced
> again and again by a more cautious assessment of
> progress, so I want to see something more
> positive than optimistic rhetoric, before
> agreeing to accept success in this area as a
> *requirement*.
>
PAT: natural language is increasingly used in information systems not
because AI is advancing, but because we are moving towards 4th
generation programming languages.
Basically, we are not making the machines capable of speaking natural
language (too ambitious, I agree); we are simply creating a semantic
layer capable of translating NL queries into computable instructions.
The advance here is in the new generation of web-based software that
can retrieve information and perform functions without requiring the
operator to write the query in a programming language (say SQL) or to
learn how to use a specific programme (say, Protege). It's happening,
albeit slowly and not working too well yet    (023)
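Just to illustrate the kind of 'semantic layer' I mean (a toy sketch in
Python, not any real product - the question pattern, table and field names
are all invented):

# Toy sketch of a "semantic layer": a pattern that turns one kind of
# plain-language question into an SQL query, so the operator never
# writes SQL herself. Purely illustrative.
import re

def nl_to_sql(question):
    m = re.match(r"how many (\w+) in (\w+)\??", question.lower())
    if m:
        thing, place = m.groups()
        return f"SELECT COUNT(*) FROM {thing} WHERE location = '{place}';"
    return None

print(nl_to_sql("How many shelters in Trang?"))
# -> SELECT COUNT(*) FROM shelters WHERE location = 'trang';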

> >I would be very careful in defining 'possible' and 'impossible' these
> >days
>
> Perhaps I should clarify that I did not mean
> impossible in principle: only well beyond the
> state of the technological art, so best not
> considered a requirement.    (024)

sorry, I cannot describe the state of the art in semantic applications -
that would be too long (and painful)    (025)


>
> Then I guess I will just wish you luck, and let
> us talk again when you have succeeded. I know you
> are concerned with disaster relief. I would have
> guessed that might be an area where to claim to
> have ALL relevant knowledge might be a particular
> overstatement. But I bow to your expertise in
> this matter.    (026)

Oh no! The 'all available knowledge' cannot come from any one person alone,
hence the need for collaborative environments, and adequate ontology models
to support them    (027)

>> >okay 'all available'= 'all available knowledge that can be put
> >together within given constraints'
>
> Well, that rather begs the question. Of course we
> can get all that we can get. Yes, let us agree on    (028)

OK, that's settled then.    (029)


>
> I see (reading ahead) that you are using the
> third sense, as in "terms of the treaty", where
> it means roughly the propositions contained in
> the treaty (or ontology). That is not
> conventional usage in the field, so you run the
> risk (as we are here illustrating) of being
> seriously misunderstood; but OK, yes of course if
> you include the entire apparatus of the ontology
> under the phrase "terms of the ontology" then
> defining the terms indeed amounts to creating the
> ontology.    (030)

yes, I intended the entire apparatus (I was trying to use the generally
understood 'term', but obviously that does not work with the very clever
people)    (031)

> >i need to clarify what is
> >'terms of an ontology' are not just the words, but all the knowledge
> >representation artifacts used
> >to define the ontology
>
> Thanks for the clarification, indeed it helps a
> lot. I think I will not be the only one who
> misunderstood you here. I'd suggest using a less
> misleading terminology if you possibly can.    (032)

Thanks to you for helping me make the clarification, because for me the
meaning was obvious, but obviously it wasn't    (033)


>
> "Collective and distributed" is one matter. The
> existence of a particular social barrier between
> "academics" and "experts" is another claim. I do
> not dispute the importance of providing
> distributed means for collective ontology
> creation. I do however resist (and have seen no
> evidence supporting) a claim that there is an
> important [academic]/[implementors+users]
> cultural 'gap' that needs to be bridged.
>
Yes, I have some evidence there if you want it - studies not written up yet.
Person a) expert, knows all about the domain but has a limited understanding
of ontology, so insists that the only knowledge he knows is the only
knowledge, but will not point to the source because it's confidential.
Person b) ontology engineer, does not see how you can build a coherent
system without an ontology, and without explicit representation    (034)

Person c) programmer, is concerned about his code and does not
understand how, and why, he should worry about the 'knowledge', which
is abstract, and which he does not really have (someone else has the domain
knowledge, not the programmer)    (035)



> Unfortunately I do not have access to
> IEEEexplore. Is there an openly readable copy, do
> you know?
hm, don't know where I got that paper from, let me check    (036)


> An excellent paper, I agree, which everyone should read.    (037)

glad you liked that paper    (038)

> ...
>
> Yes, and I stand corrected on this point. Others
> have pointed out to me that free access (as on
> the Web) does not automatically give for example
> rights to copy, for example. So open licencing is
> indeed an issue.
>    (039)

>
> OK, I see you were using 'source of knowledge'
> far more broadly. OK, I will agree then with your
> requirement. But we must recognize that many
> ontologies have no such 'source' in this sense,
> or may represent a distillation from many such
> sources. I look forward to a future in which
> ontologies themselves are considered to be
> definitive sources of knowledge, so that your
> 'broad' view and my 'narrow' view may become
> closer in scope.
>
well, I think that is the general, overall scope of an ontology.
If I want to create a system for the Red Cross to use, then I need to
develop an ontology that reflects the Red Cross view of the world;
but if I want to create a system that works for 'any' emergency
response, then I have to create a new, more neutral ontology. My
problem is to reconcile different points of view to create a common
language, in terms of conceptual as well as semantic reference.
Maybe this is what I mean by semantic reference -
see the recent argument 'pain' (English) vs 'pain' (French), but also
'pours' rain and 'pours tea'    (040)



> less. We were at cross purposes. But I don't
> think you are describing it properly (or at any
> rate, your usage here is likely to be
> misunderstood by more people than just me.) It
> isn't at all clear that these various doctors are
> REASONING differently, in the sense of using
> different logical principles. In fact it is
> likely they are not. They are however reasoning
> from different sets of assumptions, and reasoning
> about different concepts and using different
> views of what is important or salient. In other
> words, they are using different ontologies :-)    (041)

sure -    (042)

>
> >    (043)

>
> Im really not sure what would count as "declaring
> reasoning".    (044)

in the same way that you declare a variable when you write code,
you simply state at the top of your deliverable:    (045)

a) is a variable and it represents this set (define set)    (046)

> Take your example of the various
> doctors, and suppose each of them writes an
> ontology. What would you accept as a reasonable
> declaration of the reasoning in each case?    (047)

I haven't really worked out how a decent declaration should look;
for example:    (048)

ontology:  medical, for treatment of cancer patients
declaration of reasoning:  Chinese medicine    (049)

would describe what book they are advocating as the source of
knowledge for that ontology (Confucius vs Aristotle, for example), or
more specifically
(the Book of Changes vs the Materia Medica).
Gotta work it out, don't know yet    (050)

> Not
> the ontology itself, but the reasoning associated
> with (what? Behind? Giving rise to? to be used in
> the context of?) the ontology. Just in broad
> outline, to give an idea of what you mean.    (051)

for example:
I mean that today, when we attempt to design a system that can be synched up
with emergency providers, we are told by the experts (some experts,
the most expert that we can put our hands on) that the best knowledge
source to date is the Red Cross, which has the most complete set of
conceptual and semantic definitions in the world.
But I have a problem with that    (052)

1) the Red Cross ontology is not publicly accessible, and if it is, it is
not visible (I could not find it online)
2) assuming I can find it, the Red Cross operations are not smooth,
not transparent, and not necessarily efficient. This is (I
argue) also because their ontology has been developed top down, and does not
allow anyone to provide feedback. There could be intrinsic bias,
and knowledge misrepresentation, due to the point of view represented
not being a collective one, of diverse communities, but a 'standard'
one, that may not reflect the reality of an emergency.    (053)

I am studying this a little, and I have reason to believe that what I
say above is true, although I do not have results to share as such yet.    (054)

So, the Red Cross ontology may well be the best ontology in emergency
today, but we don't know on what assumptions it was developed (racial,
gender, age and class discrimination, for example, may all be built
into the system, and people would never know - why on earth do FEMA
and the Red Cross operate the way they do?). It could be because their
information system is designed to reflect very partial knowledge.    (055)

I think that in order to be useful and widely adopted, an ontology
should be accessible, visible and transparent, in the sense of declaring
explicitly what assumptions it is based on.
I hope we are still talking about the same thing at this stage    (056)
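To show roughly what I mean by 'declaring' (still only a hypothetical
sketch in Python - I have not worked out a real format, and every name and
value below is invented for illustration):

# Hypothetical sketch: an "assumptions header" declared alongside an
# ontology, the way variables are declared at the top of a program.
declaration = {
    "ontology": "emergency-response",
    "knowledge_sources": ["Red Cross field manuals", "WHO guidelines"],
    "reasoning_tradition": "western clinical medicine",
    "point_of_view": "relief agency (top down)",
    "known_biases": ["urban settings over-represented"],
}

for key, value in declaration.items():
    print(f"{key}: {value}")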


> >>  Given this, then, what this amounts to is that
> >>  the ontology should identify what language it is
> >>  written in. Which is a good idea, but again a
> >>  solved problem, so a non-issue, at least if it is
> >>  written using XML; since the XML spec provides
> >>  for just such declarations using the XML header.
> >    (057)

>
> Declare how? What would one say? Would a
> reference to a cultural tradition do? Or are you
> asking for a formalized logic to be used on the
> ontology (as I was presuming)? Or something in
> between? What?    (058)

haven't worked it out yet -
you tell me    (059)

>
> What is the purpose of this declaration? Will it
> influence how the ontology is to be processed by
> machines? (I presume not.) Knowing the purpose
> might help answer the above questions.    (060)

also knowing the hidden agendas of an organisation    (061)

>    (062)

>
> So am I, but I am considering near or forseeable future.
I think long term    (063)


 "Who is"  simply tells Google to use a specialized set of
> criteria in its next retrieval. And what you get
> back certainly isn't natural language.    (064)

I consider my knowledge queries on search engines a good example.
I type every day define:natural language, or what is: bla bla,
and I get a set of documents written in natural language.
By natural language I mean 'not code' - am I wrong?    (065)



>
> Well, allow me in turn to ask that your ambition
> is based on some acquaintance with the state of
> this particular art.    (066)

semantic technology    (067)

> People have been trying to
> do this since the beginning of AI, and more
> recently under huge commercial pressure and with
> large resources, with very limited success.
>
obviously, they must be doing something wrong...
:-)    (068)

>
> And I insist that it is. My insistence is based
> on a fairly close contact with people actively
> working in the field. What is yours based on?    (069)

a different view of what I consider state of the art, perhaps,
and the consideration that someone doing something unsuccessfully
does not mean
that it cannot be done using a different approach entirely
(unless someone decides a priori that something cannot be done)    (070)

>
> >I mean that can be parsed by a computer
>
> OK, thanks.
I should have said 'machine readable language', sorry for the abbreviation    (071)

>    (072)

> >
> >
> >yes, sure Pat
> >but again. ask a russian undergraduate student, and he will give you a
> >different view
> >wont't he?
>
> But he might find OWL easier than English. My
> point is merely that NL is not automatically more
> humanly usable than what are often called
> 'formal' notations.    (073)

I see, accepted. My point is that I think people should not be required
to study OWL or any other specialised language to access Specialised Knowledge.
Probably what I mean is that I'd like ontologies to be used as knowledge
repositories,
and not just as specialised reservoirs of knowledge with high barriers    (074)


> So just a bit of training to use an ontology editor
> might be acceptable, if the resulting combination
> of user+editor is more effective than forcing
> users to interview a near-dyslexic NL interface
> to find out what it 'knows'.
LOL
yes, but my ambition is to make Knowledge transparent and accessible
I believe the semantic web and our friends in the search engine business
can really help us there :-)    (075)


>
> >  >
> >>  But the central point is that an ontology, by its
> >>  very nature, is ultimately a text written in some
> >>  language; and so to understand it, you have to
> >>  know that language. (And to ward off possible
> >>  misunderstanding, I'm here using 'language'
> >>  broadly to include, eg, map-making and
> >>  diagrammatic conventions; so that for example
> >>  circuit diagrams or flowcharts or social networks
> >>  displayed as graphs are all kinds of language.
> >>  The basic point still applies.)    (076)

diagrams with clear notation really can be understood with
limited NL language. You would be surprised how I manage to communicate
in the countryside of East Asia just using pen and paper and even
sign language.
The sign for 'sleep' (two hands paired on the side of one ear and a
tilted head) is not easily misunderstood by any human on earth.
Basic communication is easily done with a few signs, provided the
communicator has a clear idea of what they are trying to say, and is
capable of expressing that clearly.
I don't want to start another thread on semiotics - just an example:
cognitive representation should not be tied to any particular
'formalism', and its abstract layer should be formalism independent    (077)


> >i think diagrams and vocabularies should be sufficient - I think we
> >should be able to simplify
> >our standards so that they are accessible, see some of the papers
> >reference in the list above (not just my idea)
>
> Yes, I agree, though Im not sure they need
> simplification so much as re-description. In fact
> I think OWL, the current standard, can be made
> pretty easy for lay users to understand. Most of
> the trip-up issues have to do with making clear
> the limitations of the notation rather than the
> basic concepts, and many limitations can be
> imposed (and hidden) by good interface design.
>    (078)

Okay, to learn OWL you need to have some prior knowledge (be skilled
at some things) or undergo the expensive training at Stanford (2500
USD). Hey, isn't this a barrier?
Can I interact with the knowledge of your ontology without OWL?
> >  >    (079)


> But my point is that any particular ontology is
> going to be represented in SOMETHING: it might be
> OWL or CL or GOFOL or Prolog or RDF or Concept
> Graphs or CGIF or who knows what. But it can't be
> represented in nothing, and it can't be
> represented in some kind of supervening
> übernotation, because there is no such universal
> notation.    (080)

my point is that knowledge in any domain is first represented by concepts and
words - I think grammars and logical diagrams, E/R notation, plus
controlled vocabularies
do the trick to represent an ontology from the KR point of view    (081)

choice of formalism is personal, and knowledge representation should be
as independent as possible from formalisms to be widely usable    (082)
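A last toy sketch (Python, everything below invented for illustration): the
abstract layer could be as simple as a controlled vocabulary of definitions
plus E/R-style relations, with no commitment yet to any particular
formalism:

# Toy sketch: a formalism-independent "abstract layer" - a controlled
# vocabulary plus E/R-style relations. Any of OWL, CL, Prolog, etc.
# could later be generated from this; the knowledge itself lives here.
vocabulary = {
    "Shelter": "a place where displaced people can stay",
    "Volunteer": "a person offering unpaid help",
}

relations = [
    ("Volunteer", "works_at", "Shelter"),
]

for subject, relation, obj in relations:
    print(f"{subject} ({vocabulary[subject]}) --{relation}--> {obj} ({vocabulary[obj]})")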



> Oh, so am I! I *detest* OWL. I think adopting OWL
> as a Web standard was a huge mistake.
> Nevertheless, here it is, being used, and we all
> have to get used to it. But not being OWL is one
> thing, and being "independent" of all formal
> notations is quite another.
not of all formal notations    (083)

I don't have any feelings towards OWL, other than that I am looking for
someone who can teach me how to use it and I have not succeeded yet. I
can learn Chinese over here, but OWL I
can't. I have also tried to use the Protege tutorial, and haven't gotten anywhere;
I also asked around 'can you teach me OWL?' - no luck yet. I can go to the
Protege training at Stanford this spring, but do you know what that
means? Long-distance travel, plus tuition fees, plus a new passport
and the risk of being sent back on entry because I am a threat to national
security.    (084)

But I can understand any/most knowledge in plain language, so maybe
OWL should not be a requirement for knowledge representation on the
web. I guess that's the point of that requirement.    (085)

I am advocating 'clear abstraction'.
I can understand that abstraction can be alienating for some, and that something
abstract does not mean anything to you, and that you cannot visualise
'abstract knowledge'. I guess I'll have to work on that.    (086)

>
> BTW, if I may blow a different trumpet for a
> second, the best candidate so far for a single
> overarching KR notation is I think the IKL
> language Chris Menzel and I recently developed
> (closely modelled on KIF). An ontology written in
> IKL can for example describe concepts defined in
> most other formal languages and in other
> ontologies, referring to those languages and
> ontologies by name, and relate their meanings to
> one another in ways that are sensitive to the
> local context, if required. But IKL in its
> current incarnation is probably too 'logicky' for
> widespread use, and there are not as yet any
> complete formal reasoners for IKL.
>
let me look at it, references? we can always work on it    (087)


> That sounds like multiple ontologies. I agree
> they can be important. But they can also be a
> damn nuisance. How many different views of time
> do we need? How many different ontologies of
> geographical space?    (088)

everything that is in the given boundary must be modelled, or
the system is not adequately representing the reality it aims to
model    (089)

> It is very easy to get confusions
> and inconsistencies unless great care is taken to
> keep divergent concepts separated from one
> another. You might find the literature of
> contextual reasoning and the use of
> 'microtheories' in Cyc illuminating in this
> regard.    (090)

that's what I mean when I say we may not be speaking the same language,
so let's try to focus on the arguments that we can share, and leave the
rest out for the moment    (091)

>
> >Impossible? I would reconsider that statement soon Pat...
>
> Check more carefully what I said was impossible.
> Of course we can have multiple views of things.
> We have that now, and it is a major source of
> problems.
I think the problems occur when we cannot reconcile our views; there
are a lot of misunderstandings
>    (092)

thanks Pat
let us know what you see as good requirements for you, and stick them on
the wiki perhaps?    (093)

look forward to seeing your KR model    (094)

cheers    (095)

Paola Di Maio
(first quarter of weekend gone)    (096)

_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx    (097)
