Not quite, AJ: see some comments below.
I hope this helps a bit.
Leo
_____________________________________________
Dr. Leo Obrst            The MITRE Corporation, Information Semantics
lobrst@xxxxxxxxx         Center for Innovative Computing & Informatics
Voice: 703-983-6770      7515 Colshire Drive, M/S H305
Fax: 703-983-1379        McLean, VA 22102-7508, USA
I read Deborah's paper, Leo's presentation, and other materials on the ontology
spectrum that were mentioned on this list. I found it a nice way to view how
knowledge representation has evolved. If I understand correctly, the ontology
spectrum view implies that the term "ontology" is used almost as a replacement
for "knowledge representation". [LEO:] Not quite. The focus of the Ontology
Spectrum/Semantic Continuum is more on the expressiveness of the "semantic
model", from low to high. On the Semantic Continuum diagram, both vocabularies
and models are presented. (On the Ontology Spectrum, if you extend the lower
left below Taxonomy you would be in the realm of vocabularies; I chose to
exclude those because my focus was on models -- although, again, one could
introduce mathematical ordering/structure here, which is implicit in the
diagrams: one can consider the ascending models as increasingly more
structured.) Though "knowledge representation" is closely related, I would
prefer to keep KR as part of AI, where it began.
The problem is that these diagrams are pedagogical and thus simplify things a
bit, focusing as they do on the question of "what is an ontology" rather than
on the pure expressivity or complexity of the model or logic (and most of the
"semantic models" on the spectrum are not based on logics). For my purpose
(and I think for the purposes of Deborah McGuinness, Mike Uschold, Mike
Gruninger, Chris Welty, and Fritz Lehmann), the overriding concern was to
differentiate a range of less to more expressive models, some of which can be
called "ontologies": to sort out what was an ontology (a logical theory), what
could be an ontology (a conceptual model), and what wasn't an ontology (a
thesaurus, a taxonomy), and thereby teach folks about these notions. However,
these are simple diagrams, and as such they eliminate some complexity they
really shouldn't, if one wants to clarify these notions in any detail.
Some details one could differentiate:

1) Orderings or structure. This is mathematical order/structure. Typically
assuming set theory or category theory, one can define increasing "levels" of
order. Using set theory, it is probably something like: sets, ordered pairs,
partially ordered sets, ..., lattices, ..., where the "order" relation(s) are
typically defined by mathematical properties such as reflexivity,
antisymmetry, transitivity, etc. With category theory, you get the simple
structural notions of objects (nodes) and morphisms (arrows), which you can
generalize to functors, natural transformations, cartesian closed categories,
topoi, etc.
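
A toy sketch of (1), purely illustrative (the three-class taxonomy below is
made up), showing in Python what it means for a subclass-of relation to be a
partial order, i.e., reflexive, antisymmetric, and transitive:

# Hypothetical toy taxonomy: each class mapped to the set of its
# (non-strict) superclasses, with the reflexive-transitive closure
# already included for simplicity.
subclass_of = {
    "Dog":    {"Dog", "Mammal", "Animal"},
    "Mammal": {"Mammal", "Animal"},
    "Animal": {"Animal"},
}

def leq(x, y):
    """x <= y iff x is a (non-strict) subclass of y."""
    return y in subclass_of[x]

classes = list(subclass_of)

# Reflexivity: every class is a subclass of itself.
assert all(leq(x, x) for x in classes)

# Antisymmetry: mutually related classes must be identical.
assert all(x == y for x in classes for y in classes if leq(x, y) and leq(y, x))

# Transitivity: subclass-of chains compose.
assert all(leq(x, z) for x in classes for y in classes for z in classes
           if leq(x, y) and leq(y, z))

print("subclass_of is a partial order on", classes)
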
2) Formal. Whether there is a language or logic (and a logic is a formal
language) associated with the model, i.e., again rather simplistically: is
there a formal syntax and a formal semantics for the modeling language?
Sometimes we call this a "formal" model. In general, taxonomies and thesauri,
or rather the languages they are expressed in, are not formalized, though one
could develop a formal language for each. This formalization must be based on
(1). Necessarily related to this is the notion of "machine-interpretability"
and thus "automated reasoning/inference": if there is no logic behind the
model, then the machine cannot "semantically interpret" that model, and thus
cannot "automatically reason/infer."
3) Philosophically ontological. Whether the model (ontology) is rooted in
reality, and, if so, what the philosophical perspective of that rooting in
reality is: realist, idealist, nominalist, conceptualist, etc. I would also
say that formal semantics comes into play here, because we use the symbols of
a language to represent whatever real-world referents those symbols refer to,
leading to an intermediate realm of "concepts", or representations of those
referents.
4) Use cases. What kinds of applications do these models address? For
example, if you want to place documents into gross topical buckets (doc X is
about China), for navigation or search that is a bit more structured than
free-text (e.g., Google) search, then use a topic/term taxonomy (a toy sketch
follows below). If you need great precision (e.g., you want to specify,
generate, or find a specific service or application), you need a conceptual
model or a logical theory; and if you want automated reasoning support for
the latter, you need a logical theory.
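
Here is the toy taxonomy-bucket sketch referred to in (4) above -- the topics
and terms are made up -- just to show how little machinery the coarse-bucketing
use case needs, and why no logic or inference is involved:

# Toy topic/term taxonomy for coarse document bucketing (made-up terms).
topic_terms = {
    "China":   {"china", "beijing", "yuan"},
    "Finance": {"bank", "yuan", "bond"},
}

def topics_for(doc):
    """Return every topic whose term set overlaps the document's words."""
    words = set(doc.lower().split())
    return {topic for topic, terms in topic_terms.items() if words & terms}

print(topics_for("The yuan rose after the Beijing announcement"))
# -> {'China', 'Finance'}: gross topical buckets, good enough for navigation
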
So, the notion of semantic model or ontology
brings together: mathematics, logic, theoretical computer science, AI,
formal language theory, formal semantics, formal
ontology, etc. As you
can see, it is difficult to correlate all these sciences (and their related
engineering or applied disciplines) in a simple single-dimensional
spectrum focused just on "ontology". But I am optimistic and think that,
though difficult, it is possible, especially if the purpose is
pedagogical, and the result will help disparate communities that
address similar "semantic" notions converge on common vocabulary and
meaning for characterizing what they are about and how they are systematically
distinguishable from others.
In addition to the broad view from the ontology spectrum, I'm interested in
looking deeply into the more strictly defined ontology space, where an
ontology represents a specific aspect of the real world using class objects
that can be expressed in clearly defined languages (like RDF and OWL). By
focusing on this strict ontology space, I hope we may be able to come up with
some sort of framework that will help people develop more reusable ontologies
(in RDF/OWL or newer languages). A classification scheme may be one component
of this framework.
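
(For concreteness, a minimal sketch of the kind of "strict" RDF/OWL ontology
described above -- a made-up vehicle example, assuming the rdflib Python
library is available; it is purely illustrative:)

# Purely illustrative: a tiny OWL/RDFS class hierarchy built with rdflib
# (assumed installed) and serialized to Turtle.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/vehicle#")   # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# Two classes and a subclass relationship.
g.add((EX.Vehicle, RDF.type, OWL.Class))
g.add((EX.Car, RDF.type, OWL.Class))
g.add((EX.Car, RDFS.subClassOf, EX.Vehicle))

# A datatype property tied to the Vehicle class via its domain.
g.add((EX.hasWheelCount, RDF.type, OWL.DatatypeProperty))
g.add((EX.hasWheelCount, RDFS.domain, EX.Vehicle))

print(g.serialize(format="turtle"))
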
There is probably a lot of work done in this direction already that I don't
know about. Please provide pointers to any relevant work you know of.
Coming from an object-oriented software development background, my sense is
that we should be able to learn a lot from reusable software design. When an
ontology is coded in a language, it's conceptually possible to map the
ontology to layers of ontologies designed at different granularities, and
these different layers of ontologies should be able to talk to each other in
an integrated application. [LEO:] I think we come full circle on this, since
most of OO came out of AI and theoretical CS, i.e., from Smalltalk and similar
languages and the early pre-Common Lisp notions of OO (Flavors, LOOPS):
declarative representations of real-world objects, ways of structuring those
objects, message-passing and eventually methods for communicating between
objects and doing work, data abstraction and hiding, method/type coercion,
inheritance, etc.
Again, using examples of real ontologies to further understand the problem of
reusability is necessary. In my previous post, I tried to use my own examples
to illustrate the problem. I hope more people will bring their examples into
the discussion as well.
AJ
--
AJ Chen, PhD
http://www.web2express.org
"Open Data on Semantic Web"
Quoting "Deborah L. McGuinness" <dlm@xxxxxxxxxxxxxxxx>:
> hi - i am catching up on email so sorry to come in mid stream. i have not
> gone back through the whole thread but figured i would chime in now while
> the discussion is on a spectrum. i have gotten a LOT of leverage out of the
> notion of an ontology spectrum.
>
> In 1999 a number of us were on an ontology panel at aaai. as homework for
> the panel, we had a pre-meeting about what each of us were willing to call
> ontologies. It might be useful to explain my perspective when i participated
> (which actually still reflects my perspective now on the topic). this was
> just after i had done a fairly comprehensive consulting project looking at
> "naturally occurring" things that might be considered ontologies as a LARGE
> crawl to obtain starting points for a very comprehensive ontology. In that
> effort i looked in excruciating detail at a number of mostly taxonomies like
> yahoo shopping, lycos, amazon, as well as a number of subject specific
> taxonomies and things i consider light weight ontologies - class hierarchies
> where the classes have a small number of properties, sometimes with value
> restrictions. i was working in the context of a startup with a very broad
> user base so somewhat by necessity, i needed to consider what the general
> public might consider an ontology and how a broad set of people might use
> it. i also was in the midst of a large ontology-driven project - hpkb -
> where my team had to answer questions using 80 kbs as input. the knowledge
> bases were all essentially kif statements (or they had been translated into
> kif) and they were generated by people at least reasonably trained in kr and
> the kbs had a lot of structure. so i had been doing A LOT of ontology and
> knowledge base merging at that time in my life - and it was driven somewhat
> by two fairly different desires and needs for ontologies - 1. heavy weight
> theorem proving to generate answers and 2. very light weight
> ontology-enhanced search. (i still find myself driven by these 2 needs on a
> very regular basis).
>
> the ai panel members (welty, gruninger, uschold, lehmann, and myself) came
> up with an ontology spectrum that i have grown to like quite a bit - but of
> course that is because it reflects a lot of my personal experiences with
> structured declarative knowledge representations from extremely lightly
> structured things to very principled, detailed structured artifacts. the
> belief i came to then and the one i still stand by now is that we are more
> likely to have structured knowledge get used in applications if we work with
> the notion of a spectrum and help people move along it as their needs and
> education permit.
>
> I gave a talk in early 2000 about the pull i was experiencing for ontologies
> at a dagstuhl meeting and used the spectrum as an organizer. I wrote a paper
> describing my view of each of the points on that spectrum along with some
> examples of my experiences in each of those points. i wrote the paper in
> 2000 with a small update in 2001 but the actual published version of it came
> out in a book from that event MUCH later - actually 2003. an online
> "preprint" version is up at:
> http://www.ksl.stanford.edu/people/dlm/papers/ontologies-come-of-age-mit-press-(with-citation).htm
>
> one reason i bring up that paper is that i still find that a lot of people
> tell me they get value out of that paper, i think for 2 reasons:
> 1. the simple spectrum (yes - i think one dimension of expressiveness is
> really too limiting... but it is convenient pedagogically.)
> 2. the examples of each point on the spectrum.
>
> i would be happy to co-author a next generation of something like that
> paper.... and in fact, i have been asked for such a paper on a regular basis
> with more examples and more current references.
>
> deborah
>
> Obrst, Leo J. wrote:
>> Charles,
>> I agree with you. A number of us through the years have come up with
>> similar ontology continuums or spectrums. I prefer my Ontology Spectrum*,
>> but that's natural, I guess. It was developed over time to act as an
>> educational aid. I found that many folks understood notions such as
>> taxonomies, database schemas, and UML models, but they didn't know how
>> these related to the new kid on the block, ontologies. Was a thesaurus an
>> ontology? No. Was a UML model? No, not yet. And term vs. concept
>> (placeholder for a real-world referent) is a crucial distinction. The
>> former is a word/phrase (string, utterance) that indexes the latter, which
>> is a representation of the meaning of that term (at least approximately).
>> The important point is that these concepts/placeholders are meant to stand
>> in for real-world referents, since ontology is about the things of the
>> world. I also attach a newer slide that tries to show those distinctions,
>> along with their typical use cases:
>> OntologySpectrumApplication-Obrst06.jpg.
>> Thanks,
>> Leo
>>
>> *If you look at the current Wikipedia article on the subject, it's not
>> completely accurate: http://en.wikipedia.org/wiki/Semantic_spectrum. I
>> independently developed the Ontology Spectrum in Fall 1999, and it really
>> represents one dimension, though it is depicted diagonally (for increased
>> space) as though it were two-dimensional: the one dimension is the
>> expressivity of the model. Also, the 4 way stations of taxonomy, thesaurus,
>> conceptual model, and logical theory are semantic models; that is why I
>> don't include glossaries, term lists, etc., directly -- they are not models
>> but are human-language lists and definitions. Mike Uschold, Mike Gruninger,
>> Chris Welty, and I have talked about this topic of the co-invention of the
>> semantic/ontology spectrum for quite some time. Personally, I prefer my
>> Ontology Spectrum because I overlay onto the specific models additional
>> information, such as the kind of parent-child relation, related database
>> and modeling languages, and logic information. But all of these ontology
>> spectrum/semantic continuums are sound: they represent the best
>> distillations of solid generalizations, especially good for educational
>> purposes. You are probably referring to the presentations I gave at Ontolog
>> last Jan 19/26 2006: "*What is an ontology? - A Briefing on the Range of
>> Semantic Models*",
>> http://ontolog.cim3.net/cgi-bin/wiki.pl?ConferenceCall_2006_01_12.
>>
>> _____________________________________________
>> Dr. Leo Obrst            The MITRE Corporation, Information Semantics
>> lobrst@xxxxxxxxx         Center for Innovative Computing & Informatics
>> Voice: 703-983-6770      7515 Colshire Drive, M/S H305
>> Fax: 703-983-1379        McLean, VA 22102-7508, USA
>>
>> ------------------------------------------------------------------------
>> *From:* ontology-summit-bounces@xxxxxxxxxxxxxxxx
>> [mailto:ontology-summit-bounces@xxxxxxxxxxxxxxxx] *On Behalf Of* Charles D
>> Turnitsa
>> *Sent:* Monday, January 22, 2007 1:39 PM
>> *To:* Ontology Summit 2007 Forum
>> *Subject:* Re: [ontology-summit] Defining "ontology"
>>
"ontology" >> >> One of the big schisms
in types of ontology that I see is a >> difference
in an ontological representation (model) that is >>
intended to organize knowledge at the level of terms, and a
model >> that is intended to organize knowledge at
the level of meaning. >> >> If you look
at the Ontology Spectrum that was presented to the >>
Ontolog group last year by Dr. Leo Obrst, you see a progression
of >> ontology representation techniques, from
controlled vocabularies >> and simple data models,
up through thesauri, taxonomy techniques, >> up to
axiomatized systems and logic based models (and beyond). One >>
of the big shifts I have seen is the difference in emphasis
of >> lower level models (thesauri and controlled
vocabularies, for >> instance) on terms, and the
attempt of upper level models (axiom >> based
systems, logic models) on definitions. For different >>
communities, differently focused applications, both appear
equally >> useful, but they are very
different. >> >> From all of this,
possibly an axis of differentation for >>
ontologies can exist to show the focus of what the ontology
is >> defining, and the depth of it's intended
use. >> >>
Chuck >> >> Charles
Turnitsa >> Project Scientist >>
Virginia Modeling, Analysis & Simulation
Center >> Old Dominion University Research
Foundation >> 7000 College Drive >>
Suffolk, Virginia 23435 >> (757)
638-6315 (voice) >> (757) 686-6214
(fax) >> cturnits@xxxxxxx
<mailto:cturnits@xxxxxxx> >> >>
>> -----ontology-summit-bounces@xxxxxxxxxxxxxxxx wrote: -----
>>
>> To: Ontology Summit 2007 Forum <ontology-summit@xxxxxxxxxxxxxxxx>
>> From: Patrick Durusau <patrick@xxxxxxxxxxx>
>> Sent by: ontology-summit-bounces@xxxxxxxxxxxxxxxx
>> Date: 19/01/2007 08:53AM
>> Subject: [ontology-summit] Defining "ontology"
>>
>> Greetings,
>>
>> I am concerned with the suggestions that it is possible to create a
>> continuum along which to organize what are known as "ontologies" in one or
>> more circles.
>>
>> At least unless we are willing to concede that the creation of such a
>> continuum is itself an imposition of assumptions from an undisclosed
>> ontology.
>>
>> I am sure there are those who would say that folksonomies are "missing"
>> features that are present in "formal" ontologies. Perhaps, but folksonomies
>> predate "formal" ontologies by several millennia and have proven robust
>> enough for many purposes. If the goal is to represent the opinions of the
>> many rather than the few, perhaps it is "formal" ontologies that are
>> "missing" features.
>>
>> I am not taking a position one way or the other. But I do think it is
>> important to realize that any attempt to construct a continuum comes with
>> an unstated choice of a winner before the continuum is populated.
>>
>> Hope everyone is looking forward to a great weekend!
>>
>> Patrick
>>
>> --
>> Patrick Durusau
>> Patrick@xxxxxxxxxxx
>> Chair, V1 - Text Processing: Office and Publishing Systems Interface
>> Co-Editor, ISO 13250, Topic Maps -- Reference Model
>> Member, Text Encoding Initiative Board of Directors, 2003-2005
>>
>> Topic Maps: Human, not artificial, intelligence at work!
_________________________________________________________________
Msg Archives: http://ontolog.cim3.net/forum/ontology-summit/
Subscribe/Config: http://ontolog.cim3.net/mailman/listinfo/ontology-summit/
Unsubscribe: mailto:ontology-summit-leave@xxxxxxxxxxxxxxxx
Community Files: http://ontolog.cim3.net/file/work/OntologySummit2007/
Community Wiki: http://ontolog.cim3.net/cgi-bin/wiki.pl?OntologySummit2007
Community Portal: http://ontolog.cim3.net/