Thank you for the comments, Patrick.
> . . . but perhaps you have still underestimated the magnitude. As one who
> has tried to map ontologies to each other, I have concluded that it is
> essentially impossible to do it accurately without participation of the
> developers of the individual ontologies, because
The amount of work that has been conducted up to now is a good
indication of the complexity of the problem. This is not lost on me,
hence the baby steps I have set out for myself. The restrictions I
have put on the domains of my mappings limit me heavily, specifically
with respect to NLP, since natural language is the most expressive
language out there.
> (1) it requires not only human-level intelligence to understand the
> meanings of the ontology components to be mapped, it requires an expert in
> ontology to understand them; if we ever get to the point where a machine can
> do this, automated ontology mapping will not be needed - the machines will
> generate ontologies directly from text.
Let me clarify that when I said:
"This can naturally be extended to the web, and semantic
classification of documents."
I meant that the size and number of ontologies will only grow over
time, and manual intervention will become impractical. Fully
automated systems may be put into place out of necessity, with a
corresponding hit to precision. I absolutely agree that this holy
grail of generating ontologies directly from natural language
documents is still a ways away.
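
To make the precision hit concrete, here is a minimal sketch, in
Python, of the kind of purely lexical matcher a fully automated
pipeline might fall back on. The class labels, the threshold, and the
difflib-based similarity are all invented for illustration; this is
not my system or anyone else's, just a caricature of label matching.

# A minimal sketch of naive, label-only ontology mapping.
# Everything here (labels, threshold, similarity measure) is invented
# purely for illustration.
from difflib import SequenceMatcher

# Hypothetical class labels from two small ontologies.
ONTOLOGY_A = ["Person", "Organization", "Publication", "Bank"]
ONTOLOGY_B = ["Human", "Organisation", "Paper", "RiverBank"]

def label_similarity(a: str, b: str) -> float:
    """Crude lexical similarity between two class labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def naive_mappings(source, target, threshold=0.6):
    """Propose, for each source class, its best lexical match in the target."""
    proposals = []
    for a in source:
        best = max(target, key=lambda b: label_similarity(a, b))
        score = label_similarity(a, best)
        if score >= threshold:
            proposals.append((a, best, round(score, 2)))
    return proposals

if __name__ == "__main__":
    for a, b, score in naive_mappings(ONTOLOGY_A, ONTOLOGY_B):
        print(f"{a} -> {b} (similarity {score})")

Running this accepts the correct Organization/Organisation pairing,
but it also accepts Bank/RiverBank (a false positive, since the
financial and geographic senses cannot be told apart from the labels
alone) and misses Person/Human entirely. That, in miniature, is the
precision and recall hit I expect when no human is in the loop.
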
> (2) even if a machine had that capability, it still wouldn't be able to
> map accurately, because the documentation is never adequate to resolve
> ambiguities, and almost always very sparse. Except in the simplest cases,
> one needs the original developer to explain the intended meanings.
I equate this to Wikipedia, where definitions change and are written
in different styles, and hence still need members of the site to
clean them up, with authors' comments recorded on the "history" and
"discussion" pages associated with each wiki page.
>
> So, if you understand that the mappings will always be of low quality
> compared to a human product, and have an application in mind (such as
> search) that can tolerate large errors, then good luck.
For NLP, absolutely, but I'm hoping that with rigorous structures
imposed on ontology authors, this will be lessened. The consolidation
effort you mentioned, producing an upper ontology which everyone uses,
would be great. For now I think we can focus on creating tools which
at least guide authors towards a particular structure, and the work in
this field has been growing as well. The hope is that this effort will
make the task of mapping these ontologies easier.
>
> But there may also be useful things that can be done with automatically
> generated mappings, and if one must go that route I suppose that it is
> worthwhile trying to improve the process. Just be aware that you are
> tackling a project that is daunting for human experts, and be sure to know
> what applications could use a result with a high error rate.
>
I'm humbled by the task, and am in the process of setting realistic
goals for my research. This forum has been a great source of
information in that regard.
--
Bart Gajderowicz
MSc Candidate, '10
Dept. of Computer Science
Ryerson University
http://www.scs.ryerson.ca/~bgajdero