Ed, (01)
I mostly agree, but I'd like to state a few qualifications. (02)
> I share John's general concern about the plethora of standards, but my
> particular concern is primarily with competing exchange standards, which
> only serve to reduce interoperability of software tools. In that area,
> one standard may be productive; two standards is necessarily
> counterproductive. (03)
We have huge numbers of exchange standards at many different levels,
and those that have been reconciled with one another work together
so well that most people are completely unaware of them. (04)
For example, security people are extremely concerned about the
details of the packets at the TCP/IP level. But anyone using
http, ftp, email, etc. is usually blissfully ignorant of packets. (05)
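For instance (a minimal sketch using Python's standard library; the
URL is only an illustration), an application can fetch a web page
while remaining completely ignorant of the TCP segments and IP
packets underneath:

    from urllib.request import urlopen

    # Application-level code: it speaks HTTP and knows nothing about
    # the TCP segments or IP packets that actually carry the bytes.
    with urlopen("http://example.com/") as response:
        page = response.read()

    print(len(page), "bytes received without ever touching a packet")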
The worst kinds of exchange standards are ones that were never
properly thought out as levels that had to interact with anything
else. For example, PDF has become a de facto standard for
exchanging documents. But anybody who has ever had to process
the text in those documents quickly becomes aware of "PDF hell". (06)
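For instance (a sketch that assumes the third-party pdfminer.six
package and a made-up file name), getting the text back out is easy
to ask for, but the characters come back in the order they were drawn
on the page, which need not be reading order:

    # Assumes pdfminer.six is installed; "report.pdf" is a made-up name.
    from pdfminer.high_level import extract_text

    text = extract_text("report.pdf")

    # Words may be run together, hyphenated fragments left split, and
    # columns interleaved -- the usual symptoms of "PDF hell".
    print(text[:500])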
The problem that Postscript and PDF addressed was to create
a standard that would recreate an exactly formatted visual image
on a wide range of printers and display devices. That worked
better than any other alternative available. (07)
But in the process, Postscript focused on preserving the visual
image on the page at the expense of scrambling the string of
words, characters, and images in the document. It allows so many
"creative" ways of generating a page image that even Adobe has
never solved the problem of "decompiling" PDF to text in the face
of the many clever coding tricks. (08)
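Here is a toy illustration of the difficulty (plain Python with
invented data, not real Postscript): a generator is free to emit a
line as independently positioned fragments, in any order, and a naive
"decompiler" that sorts fragments by position cannot tell a kerning
adjustment from a word break:

    # Each fragment is (x, y, string): positioned marks on a page,
    # emitted in whatever order the generator found convenient.
    fragments = [
        (156.0, 700.0, "or"),
        (100.0, 700.0, "To"),
        (126.0, 700.0, "be"),
        (180.0, 700.0, "not"),
    ]

    # Naive decompilation: sort left to right and glue the pieces together.
    fragments.sort(key=lambda f: f[0])
    print("".join(s for _, _, s in fragments))
    # Prints "Tobeornot": the spaces were never in the file, and only
    # heuristics about gap widths can guess where the word breaks were.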
The *ML family is far better than Postscript as an exchange standard,
but it too creates huge numbers of problems through "useful" syntactic
features that scramble the semantics. These problems were also
created by short-sighted fixes that address special-case issues. (09)
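To give one small example of what I mean (a sketch using Python's
standard library; the element names are invented), XML offers two
equally legal ways to state the same fact, and nothing in the syntax
tells a generic tool that they are the same assertion:

    import xml.etree.ElementTree as ET

    # Two serializations that a human reads as the same statement.
    a = ET.fromstring('<person name="Ada Lovelace"/>')
    b = ET.fromstring('<person><name>Ada Lovelace</name></person>')

    print(a.get("name"))                        # Ada Lovelace
    print(b.findtext("name"))                   # Ada Lovelace
    print(ET.tostring(a) == ET.tostring(b))     # False -- different trees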
> ... where they differ or conflict, one
> has to ask what the motivations, resources and target products are
> like. The engineering practice for a 3-year greenfield project with a
> team of 50 and a 10% annual staff turnover is going to be different from
> the practice for a 2-man 6-month feature addition to an existing
> software product, managed in parallel with remedial maintenance and
> related new product development. (010)
Yes. And all the issues that Fred Brooks discusses arise when
multiple independent design committees collaborate on one enormous
project that has ramifications into every facet of everything else. (011)
Microsoft addresses that issue with their "daily build" -- and we
have all had painful experiences with the results. (012)
> I have seen several projects fail, because they were led by 'experts'
> who brought to the project a religious methodology that was ill-suited
> to the problem and staff at hand. As John will confirm... (013)
Well, yes. But I would say that there are valid reasons for having
different methodologies for radically different kinds of applications.
Among the conventions that came out of Fred Brooks' shop, two were
universally accepted at IBM: (014)
1. The two characters "//" at the beginning of every command for
both OS/360 and DOS. (015)
2. The conventions observed by the link editors that combine object
code from a compiler into an executable module. (016)
The first was a trivial syntactic feature that was totally useless
for any kind of compatibility. The second was an important semantic
convention that enabled compilers written to run under one operating
system to be ported to others that also ran System/360 machine code. (017)
Unfortunately, the link-edit convention was at too low a level to
ensure that code from different compilers, even on the same OS,
could be combined -- but that is still a problem today. It is
partially solved by compilers like gcc that compile multiple
languages. But problems can still arise for code generated by
different versions of the "same" compiler. (018)
> ... The trick is to recognize and clearly
> specify the common concepts, and segregate the well-defined concepts
> that are method-dependent and may therefore be ignored or superseded by
> other methodologies... (019)
Yes, but. There are two critical distinctions: syntax vs. semantics,
and the level of the convention. All exchange specifications are
ultimately stated in some kind of syntax. The TCP/IP conventions
work because they preserve the syntax at a very low level: a stream
of uninterpreted bits. That ensures that the semantics of any level
above the bits will be preserved. (020)
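As a minimal sketch of that guarantee (standard-library Python; the
payload is arbitrary), whatever byte string a higher level hands to
the stream comes out the other end bit for bit, so any semantics
encoded in those bytes survives untouched:

    import socket

    # A connected pair of stream sockets stands in for a TCP connection.
    sender, receiver = socket.socketpair()

    payload = "any higher-level message: HTTP, SMTP, XML, ...".encode("utf-8")
    sender.sendall(payload)
    sender.close()

    received = receiver.recv(4096)
    receiver.close()

    # The transport never interprets the bits, so whatever the bytes
    # meant at the higher level, they still mean it here.
    assert received == payload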
But the Postscript/PDF conventions preserve the syntax at the level
of a visual image, which can be created in an open-ended number of
ways. That is OK for anything that depends on the integrity of
the image -- such as a printer or a human reader. But that's bad
for software that needs to process the original text. (021)
John (022)