Michel, Leo, and Michael, (01)
JFS
> But what was produced [by the SW] failed to address the requirements Tim
> proposed and many others (including Robert and me) believe are essential. (02)
MD
> can you list/summarize the requirements and why you think the steps that
> the semantic web effort has made do *not* contribute to those requirements? (03)
As I've said repeatedly, three words that Tim B-L emphasized in the DAML
proposal of 2000 were diversity, heterogeneity, and interoperability. (04)
In the final DAML report of 2005, two of them (diversity and
interoperability) were mentioned just once and heterogeneity was
never mentioned at all. (05)
I also believe that Robert Meersman's short summary is very good: (06)
http://starlab.vub.ac.be/website/files/MeersmanBuffaloAug2007.pdf
> Why "the" Semantic Web has failed.
> * Data models vs. ontologies
> * Legacy systems
> * Scalability
> * Methodology (07)
For point #1, RDF + SPARQL is just YADM -- Yet Another Data Model.
It has few advantages and many disadvantages compared with data models
that have been in use for decades. I have no objection to YADM if
people find it useful, but I have serious objections to decreeing any
single data model as a requirement for the Semantic Web. (08)
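To make that point concrete, here is a minimal sketch in Python (the
example.org names and the tiny employee table are made up, and rdflib
is a third-party package): the same fact stored and queried first as
a relational row and then as RDF triples. Neither representation can
say anything the other cannot -- which is the sense in which RDF is
just another data model.

    from rdflib import Graph, Literal, Namespace   # pip install rdflib
    import sqlite3

    # The fact "Alice works for dept7" as a relational row ...
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE emp(id TEXT, name TEXT, dept TEXT)")
    db.execute("INSERT INTO emp VALUES ('emp1', 'Alice', 'dept7')")
    print([n for (n,) in db.execute(
        "SELECT name FROM emp WHERE dept = 'dept7'")])

    # ... and as RDF triples queried with SPARQL.
    EX = Namespace("http://example.org/")
    g = Graph()
    g.add((EX.emp1, EX.worksFor, EX.dept7))
    g.add((EX.emp1, EX.name, Literal("Alice")))
    print([str(row.n) for row in g.query(
        "SELECT ?n WHERE { ?e <http://example.org/worksFor> "
        "<http://example.org/dept7> ; <http://example.org/name> ?n }")])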
For point #2, Tim B-L noted the importance of interoperability with
legacy systems, but the DAML report ignored them completely. I can't
blame them for not doing everything in five years, but they have not
done *anything* to support legacy systems in the past 13 years. (09)
And please do not repeat the claim that they provided a tool to convert
RDBs to RDF. Interoperability means that the legacy systems work with
the new tools *concurrently* -- not by means of forced conversion. (010)
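A toy sketch of what concurrent access would look like (my own
hypothetical wrapper in Python, not any W3C tool): the new tools send
their questions to a thin layer that queries the legacy RDB at the
moment each question is asked, so nothing is copied or converted and
the legacy applications never notice.

    import sqlite3

    # A legacy RDB that keeps running, untouched, under its old applications.
    legacy = sqlite3.connect(":memory:")
    legacy.execute("CREATE TABLE emp(id TEXT, name TEXT, dept TEXT)")
    legacy.execute("INSERT INTO emp VALUES ('emp1', 'Alice', 'dept7')")

    def triples_for_dept(conn, dept):
        # Answer a triple-pattern request by querying the live RDB
        # on demand; the relational data stays where it is.
        for emp_id, name in conn.execute(
                "SELECT id, name FROM emp WHERE dept = ?", (dept,)):
            yield (emp_id, "worksFor", dept)
            yield (emp_id, "name", name)

    print(list(triples_for_dept(legacy, "dept7")))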
For point #3, the SW people claim that OWL is decidable. That only
means that the decision procedure terminates in *finite* time -- even
though that time might be greater than the age of the universe. For
anything the size of the WWW, scalability means no worse than
O(N log N) time. (011)
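A back-of-the-envelope calculation (my numbers, purely illustrative)
shows the gap between 'finite' and 'scalable':

    from math import log2

    OPS_PER_SEC = 1e9          # optimistic: 10^9 inference steps per second
    AGE_OF_UNIVERSE = 4.35e17  # seconds, about 13.8 billion years
    N = 1e9                    # a web-scale number of statements

    print(N * log2(N) / OPS_PER_SEC)              # N log N: about 30 seconds
    print(2**100 / OPS_PER_SEC / AGE_OF_UNIVERSE) # a mere 2^100 steps: about
                                                  # 2,900 ages of the universe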
For point #4, please reread Robert M's slides for an example of what
a methodology can and should support. (012)
Leo
> The closest that relational databases get to having a semantic model
> is the conceptual schema, which is a type of conceptual model (modeled
> in a graphic Entity-Relation-Attribute language, with cardinality
> restrictions). (013)
Unfortunately, there was never a standard for a conceptual schema, and
the vendors merely pasted the term 'conceptual schema' on top of what
they were doing anyway. They turned it into an advertising slogan. (014)
E-R-A + cardinality is a necessary part of any conceptual schema
(or ontology), but it's far from sufficient. And most of the
published OWL ontologies do little or nothing to go beyond that
level. (015)
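A tiny example of the gap, with made-up tables (a sketch, not a
recommendation):

    import sqlite3

    db = sqlite3.connect(":memory:")

    # E-R-A + cardinality: every employee works for exactly one department.
    # This much any conceptual schema -- or any OWL ontology -- can state.
    db.execute("CREATE TABLE dept(id TEXT PRIMARY KEY)")
    db.execute("""CREATE TABLE emp(
                      id   TEXT PRIMARY KEY,
                      mgr  TEXT,
                      dept TEXT NOT NULL REFERENCES dept(id))""")

    # But an axiom such as "an employee's manager must work in the same
    # department as the employee" relates several rows at once.  It goes
    # beyond E-R-A diagrams and cardinalities; it needs a rule or a
    # logical axiom (in an RDB it could only be policed by a trigger or
    # by application code).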
From 1978 to 2000, the published R & D on the conceptual schema and
related issues went far beyond what the vendors provided. Tim B-L
cited some of that work, but the DAML developers ignored it. (016)
Leo
> Now the above view does have rare exceptions in the database world: e.g.,
> Matthew West's work immediately springs to mind. Similarly, HighFleet
> (formerly Ontology Works) tries to bridge the ontology-database connection.
> Also, of course deductive databases try to combine logic programming +
> relational constructs, though these just focus on the implementational
> apparatus you would need for more expressive ontologies, but say nothing
> in particular about ontologies. (017)
I agree that the systems you mention are good. But there were many
years of very good systems that the SW ignored. Deductive DBs were
proposed in the 1970s -- note Planner and Microplanner. RDBs combined
with Prolog and other AI tools have been widely used since the '80s.
Tim B-L cited them in his DAML proposal of 2000, but the SW ignored
them. (018)
By the way, two commercial companies *based* on Prolog + RDBs are
Mathematica and Experian. Mathematica started with Prolog as their
underlying reasoning engine in the '80s, and they have developed the
foundation into a very rich logic-programming system that uses RDBs
for external storage. (019)
Experian uses Prolog + RDBs for Big Data -- much bigger than any
application that uses RDF + OWL. They compute everybody's credit
rating on a daily basis with every imaginable input they can find.
They use Prolog so heavily that they bought the Prologia company. (020)
MB
> But I have to add that the transition between data model and ontology
> is fluent. In practice, you often have to make compromises - for example to
> enable better querying or because knowledge and application data cannot be
> untied easily. (021)
I agree. And those issues were addressed in the 3-schema strategy of
the original ANSI/SPARC TR in 1978. The conceptual schema -- which
is very close, if not identical, to what we now call formal ontology --
was at the heart of the proposal. The physical schema, which is very
close, if not identical, to what is called the data model, specifies
the data formats, layout, and structure. The application schema
specifies the APIs of the software. (022)
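To make the three-way split concrete, here is a toy sketch in Python
(my own illustration with made-up names, not anything from the
ANSI/SPARC TR itself):

    from dataclasses import dataclass
    import sqlite3

    # Conceptual schema: what the data means, independent of storage or APIs.
    @dataclass
    class Employee:
        id: str
        name: str
        dept: str      # every employee works for exactly one department

    # Physical schema: one particular storage layout for the same concepts.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE emp(id TEXT PRIMARY KEY, name TEXT, dept TEXT)")

    # Application schema: the API an application sees.  It could be
    # re-implemented over a different physical schema without touching
    # the conceptual schema at all.
    def employees_in(dept):
        rows = db.execute(
            "SELECT id, name, dept FROM emp WHERE dept = ?", (dept,))
        return [Employee(*row) for row in rows]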
I also agree that the detailed ontologies will often use primitives
and operations that have a simple mapping to the preferred data model.
That is another reason why I have recommended an underspecified upper
level ontology with families of "microtheories" for more specialized
ontologies that are optimized for different kinds of applications. (023)
But those issues get into details that we have discussed many
times before, and I won't repeat them now. (024)
John (025)