
Re: [ontolog-forum] Fw: Next steps in using ontologies as standards

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Ed Barkmeyer <edbark@xxxxxxxx>
Date: Tue, 06 Jan 2009 17:47:13 -0500
Message-id: <4963DF71.8080708@xxxxxxxx>
Rich Cooper wrote:

> Past history indicates that the killer app comes first, and the standards
> follow.

I think that was John Sowa's original point.

> Wirth's Pascal solved a lot of problems in programming language
> design in the late seventies, showing that a programming language could be
> both elegant and effective.  Ada resulted from the failed DoD attempt to
> standardize Pascal and turn compilers into a commodity.

I would like to correct the history on this, having participated in both
Pascal and Ada standardization in the late 1970s.

First, Wirth's Pascal language was published in late 1970 or early 1971,
and Kathy Jensen got her Ph.D. for writing the first Pascal compiler in
1971.  The ANSI/IEEE and BSI/ISO Pascal standardization projects began
in 1978.  Wirth told the ISO standards group in 1979 that standardizing
Pascal was a stupid idea: Pascal was intended as an experimental
language, unsuitable for production software; its purpose was to
educate students in good programming techniques (I think he was all
about Modula at the time).  The military had almost nothing to do
with Pascal per se.  NASA was the source of the U.S. Government interest
in Pascal.

Interestingly, by 1980, several instrument companies, making systems for
automotive and aircraft manufacturers, were using a Pascal subset to do
what the military created the Ada project for -- writing software for
what we would now call "embedded computer systems".

DARPA funded the "embedded systems language" project in 1975, with four
competing awards, dubbed the red, green, yellow and blue languages.  In
1977 the evaluating board pared the continuation projects down to two,
and in 1979 chose one: Ada.  Ada was the "green" language, developed by
Jean Ichbiah and colleagues.  All four were structured programming
languages with real-time features and similar type definition
capabilities.  (Euclid (PARC) and Praxis (BBN) were red and blue, as I
recall.)  Ichbiah's work, his competitors' work, and Wirth's work all
came out of several international conferences on programming language
design in the late 1960s that were seen as the successors to the Algol
conferences of the late 1950s.  That work gave rise to Algol 68 (which
had all the Pascal features and most of the Ada features, but in
un-compilable syntax -- trust me, many of us tried) and several others.
It also gave us the Vienna Development Method -- a first cut at a
language for formal programming language specifications.

DoD went out of its way to avoid turning Ada into a family of languages
(and compilers into a commodity).  It required that no subset of the
language could be called "Ada", even though few programmers needed the
real-time features even for embedded systems, and it required the
least compile-time-efficient (but most runtime-efficient) handling of
"generic functions".  These two choices made Ada compilers harder to
build and more expensive to run -- high-overhead tools ill-suited to
academic work.  In so doing, DoD ate its own children: universities
could not develop subsets and could not use the commercial tools, so
they did not train Ada programmers.
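
For readers who never met Ada generics, the mandated treatment amounted,
roughly, to instantiation by expansion.  Here is a loose sketch of that
strategy in C macros -- names invented, and Ada itself not shown:

  /* A loose C-macro sketch of "generics by expansion" -- not Ada, just
     the strategy: each instantiation stamps out a fresh, fully typed
     copy of the function, so the runtime code is direct but the
     compiler repeats its work for every instantiation. */
  #include <stdio.h>

  #define DEFINE_MAX(T) \
      static T max_##T(T a, T b) { return a > b ? a : b; }

  DEFINE_MAX(int)        /* instantiation 1: defines max_int()    */
  DEFINE_MAX(double)     /* instantiation 2: defines max_double() */

  int main(void)
  {
      printf("%d\n", max_int(2, 3));        /* prints 3   */
      printf("%g\n", max_double(2.5, 3.5)); /* prints 3.5 */
      return 0;
  }

Every instantiation produces another fully typed copy: the generated
code is fast, but the compiler does its work over again each time --
the trade described above.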

> C, which was licensed for $1 to universities in its early days, made that
> language widely used, though not standardized.

Exactly.  C was the university substitute for Ada.  C was standardized
by ANSI in 1989 and by ISO in 1990, and within a few years almost all C
compilers (save the Berkeley one) conformed to ISO 9899.  To this day, C
is still a primary language for embedded systems and other real-time
systems.

> C++, a freeware layer on top
> of C, added the object oriented layer that took off from there.  C++ is now
> the most widely used programming language, but the AT&T product came first.

The "now" in the above must mean the date on this email is erroneous. 
Java and VBasic have been the "most widely used" programming languages 
since 1999.  C++ was a creature of the 1990s, whose time has essentially 
passed.  Bjorne Stroustrup's team went out of its way to create an 
object programming language that would produce efficient runtime code. 
The result is an ugly and inconsistent language that produces code that 
is efficient in time but not in space (which is why embedded systems 
programmers of the 1990s still used C).  Jim Gosling realized in 1992 
that in the future, few programmers would care about either time or 
space.  (And if you do, Java is not for you.)    (013)

One other note:  What really made compilers a "commodity" was:
- Dave Gries's 1971 book, which taught children all the basics of
compiler writing that we professionals had struggled to learn in the
1960s; and
- yacc and lex, Unix-based freeware tools of 1976 that let the
programmer write a grammar and generated a C implementation of the
parser for it, leaving the user to complete the semantic stubs.  Yes,
you had to learn something about symbol tables and code generation --
read Gries.

My NIST predecessor in the Pascal standards effort, Justin Walker, used
yacc and lex to write a Pascal compiler/interpreter, with almost no
previous background in compiler writing.
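
To give a feel for that workflow, here is a minimal, hypothetical
sketch -- a four-function calculator, not Walker's Pascal grammar --
with the lexer hand-written in C so that everything fits in one yacc
file:

  /* calc.y -- build with:  yacc calc.y && cc y.tab.c -o calc  */
  %{
  #include <stdio.h>
  #include <ctype.h>
  int yylex(void);
  int yyparse(void);
  void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }
  %}

  %token NUMBER
  %left '+' '-'
  %left '*' '/'

  %%
  input : /* empty */
        | input line
        ;
  line  : expr '\n'      { printf("= %d\n", $1); }
        ;
  expr  : expr '+' expr  { $$ = $1 + $3; }  /* the "semantic stubs" */
        | expr '-' expr  { $$ = $1 - $3; }
        | expr '*' expr  { $$ = $1 * $3; }
        | expr '/' expr  { $$ = $2 ? $1 / $3 : $1 / $3; }
        | '(' expr ')'   { $$ = $2; }
        | NUMBER
        ;
  %%

  /* A hand-written lexer, small enough here to stand in for lex. */
  int yylex(void)
  {
      int c = getchar();
      while (c == ' ' || c == '\t') c = getchar();
      if (isdigit(c)) {
          yylval = 0;
          while (isdigit(c)) { yylval = yylval * 10 + (c - '0'); c = getchar(); }
          ungetc(c, stdin);
          return NUMBER;
      }
      return (c == EOF) ? 0 : c;
  }

  int main(void) { return yyparse(); }

yacc turns the grammar into a parser in y.tab.c; the code in braces is
exactly the sort of semantic stub the user was left to complete.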

> SQL was built on the need for a database technology.  Precursors to that
> technology were already in use in commercial applications as flat files,
> B-Trees, indexed sequential access methods, and many other partial solutions
> to the problem of flexible persistent data storage.

Most importantly, SQL/DB2 (vintage 1980) was built on IBM's need for a
competitor to the database implementations more or less conforming to
the Codasyl standard for "navigational databases" of 1974, which
included products from Univac, Honeywell and NCR, and two commercial
products from independent software/service houses that ran on IBM
mainframes.  To avoid being one among many, IBM turned to the
10-year-old work by Ted Codd, et voilà -- a new technology.  (To their
surprise, a startup made the same choice at the same time.  They called
their company Oracle.)  I believe John Sowa can provide a lot more
detail on this -- he was there.  (I worked on the Honeywell IDMS.)

The Codasyl NDB standard (the rage of the 1970s, together with
hierarchical systems) was essentially the structure of OODBs -- variant
record structures with embedded pointers -- without the "object" paint
job.  The DML was designed to be embedded in COBOL programs.
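
As a hypothetical sketch (plain C, invented record names), a
Codasyl-style set looked roughly like this -- an owner record chains
its members through embedded pointers, and the program navigates one
link at a time:

  /* A hypothetical sketch of a Codasyl-style navigational set in C.
     Record and field names are invented for illustration. */
  #include <stdio.h>

  struct department;                     /* forward declaration */

  struct employee {
      char               name[32];
      struct employee   *next_in_dept;   /* embedded "set" pointer to next member */
      struct department *owner;          /* embedded pointer back to the set owner */
  };

  struct department {
      char             name[32];
      struct employee *first_member;     /* head of the member chain */
  };

  /* Navigation is explicit, one link at a time: the program, not the
     DBMS, decides which pointer to follow (FIND NEXT WITHIN SET, etc.). */
  static void list_members(const struct department *d)
  {
      const struct employee *e;
      for (e = d->first_member; e != 0; e = e->next_in_dept)
          printf("%s works in %s\n", e->name, d->name);
  }

  int main(void)
  {
      struct department dept  = { "Integration", 0 };
      struct employee   alice = { "Alice", 0, &dept };
      struct employee   bob   = { "Bob",   0, &dept };
      dept.first_member  = &alice;
      alice.next_in_dept = &bob;
      list_members(&dept);
      return 0;
  }

Swap the C structs for COBOL records and the pointer-chasing loop for
FIND/GET verbs and you have the 1970s product; OODBs later wrapped the
same embedded-pointer navigation in object syntax.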

> The point is that the market determines success first, without standards.
> Standards are only practical after one or more killer apps have shown what
> can be done.  When standards are posted before the killer app comes out,
> little or no impact is felt in the community.  Investors have more proven
> vehicles for their assets.

I agree.  The exceptions are rare.  But in this community, OWL is one of
the exceptions.  The effect of OWL was to create "commercial use" of a
relatively long-standing academic technology.  Before OWL, no one could
read or use what anyone else in the AI community wrote.  The products,
or their academic prototypes, were in place, but the commercial use was
non-existent.

> Nothing equivalent to these older products has been produced for ontologies
> yet.  There is, IMHO, no killer ontology app at this time.

In fairness, SAP was the first killer app in the database business, and
it came 10 years after Adabas, Oracle and SQL/DB2.  The enabling
technology isn't the killer app.  The killer app is the one that has
direct business value.

We are now nearly 20 years into ontology development and 40 years into
reasoning tooling.  The killer app is the ontology, or the
ontology-based application, that will produce a clear ROI for some
business or government activity.  (And in the U.S., it is the
intelligence community that is trying very hard to find and fund the
killer app.)

> There is need
> for integrating multiple databases and for unifying poorly automated
> processes, but unifying ontological approaches don't yet exist.  The
> technology has not been demonstrated in practice.

Agreed.  Unsurprisingly, human intelligence is more efficient than
artificial intelligence in solving these problems.  Our experience (over
7 years) is that more than half of an integration problem is in the
interfaces, not in the information.  Wrapping an application with a web
service or reading a screwball file format is not a semantic problem,
but it is always a part of the integration problem.  The semantics-based
integrator is 25% of the solution cost.  And when the problem is simply
transforming one XML form to another, a human engineer can do it faster
with an XSLT front-end than with an XML-to-ontology and ontology-to-XML
mapping.

And there are no fully automated ontology mapping tools, because (among
other things) there is no starting point.  And mapping tools don't
understand circumlocution; humans, and especially OWL ontologists, do.

There is no killer app hiding in these bushes.

> So it seems extremely premature to discuss standard ontologies until there
> are killer apps to make the market pay attention.  After the usefulness of
> ontologies is demonstrated (not anticipated or projected), there will be
> fruitful efforts to use some ontologies on a widespread basis.

I agree with this.  The ontologies that are used and found useful will
become the reference ontologies, warts and all.  It is nice to think
that we could prevent the warts by some prophylactic effort, but the
best prophylaxis we can reliably practice is to teach and exemplify.
As Andreas Tolk said, ontologies are models, and we have to teach
would-be knowledge engineers how to model.  And we have to do it within
the constraints of the supported languages.

And no matter how well published the best practices are, there is always
the danger that the ontology in the killer app will be Windows 95.

-Ed

-- 
Edward J. Barkmeyer                        Email: edbark@xxxxxxxx
National Institute of Standards & Technology
Manufacturing Systems Integration Division
100 Bureau Drive, Stop 8263                Tel: +1 301-975-3528
Gaithersburg, MD 20899-8263                FAX: +1 301-975-4694

"The opinions expressed above do not reflect consensus of NIST,
  and have not been reviewed by any Government authority."

