Doug and David, (01)
DMcD> And the reason this is worrisome is that it introduces massive cost
> and lost opportunities into enterprises. (02)
I agree, but there are many things that are creating massive cost
overruns. This is just one of many, and probably not the most
pressing problem enterprises must face. (03)
DMcD> Something I would like to know, but seriously doubt if I'll ever
> learn, is how these technical dispersed environments sync up their
> changes. I could be wrong, but I seriously doubt if there's a
> software configuration management (SCM) tool that natively handles
> Windows/Unix/AS400/mainframe in one fell swoop. (04)
That will never happen. (05)
The only things those programs "synch up" with other programs
are data formats, entity names, units of measure, and terminologies.
They don't use anything like a detailed, axiomatized ontology. And
there is no evidence that a formal ontology would be of much use
to them, even if somebody gave it to them on a silver platter.
(In fact, the platter would probably be more valuable.) (06)
See those slides I presented in June and July: (07)
http://www.jfsowa.com/talks/iss.pdf (08)
Note slide 4: The amount of legacy software in current use is half
a trillion lines of code. Slide 7 notes that the typical cost to
produce one line of fully debugged code is $18 to $45. The great
bulk of that code will still be running 20 years from now, and
some will still be in daily use 50 years from now. (09)
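To make the scale concrete, here is a back-of-the-envelope calculation
from those two figures (my own arithmetic, not a number on the slides):

```python
# Replacement cost implied by the figures above:
# half a trillion lines of legacy code, $18 to $45 per
# fully debugged line.
lines_of_code = 500_000_000_000           # half a trillion lines
cost_low, cost_high = 18, 45              # dollars per line

low = lines_of_code * cost_low            # $9 trillion
high = lines_of_code * cost_high          # $22.5 trillion
print(f"Replacement cost: ${low/1e12:.1f} to ${high/1e12:.1f} trillion")
# Replacement cost: $9.0 to $22.5 trillion
```

That is why "rip and replace" is not an option, and why that code
will still be running decades from now.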
See slides 62 to 67 about the issues of dealing with legacy systems:
just use a "surface" (AKA "superficial" or "underspecified") ontology
that covers the interfaces and treat the programs as "black boxes"
whose inner workings are unknown. (010)
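The black-box approach can be sketched roughly as follows (an
illustrative Python sketch; the class and field names are my own
invention, not anything from the slides): a surface spec records only
the externally visible data format, entity names, and units of
measure, and two programs are treated as compatible when those
surface descriptions agree.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InterfaceSpec:
    """Surface-level description of a legacy program's interface.

    Records only externally visible facts -- data format, entity
    names, units of measure -- never the program's inner workings.
    """
    data_format: str
    entities: frozenset
    units: dict  # entity name -> unit of measure

def compatible(a: InterfaceSpec, b: InterfaceSpec) -> bool:
    """Two black boxes can exchange data if their surface specs
    agree on the format and on the units of every shared entity."""
    if a.data_format != b.data_format:
        return False
    shared = a.entities & b.entities
    return all(a.units.get(e) == b.units.get(e) for e in shared)

# Hypothetical example: a payroll system and a ledger system.
payroll = InterfaceSpec("EBCDIC-fixed",
                        frozenset({"employee", "salary"}),
                        {"salary": "USD"})
ledger = InterfaceSpec("EBCDIC-fixed",
                       frozenset({"account", "salary"}),
                       {"salary": "USD"})
print(compatible(payroll, ledger))  # True
```

The point of the sketch is that nothing about either program's
internals appears anywhere in the spec: the ontology covers only
the interfaces.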
DE> The current state is that most organizations don't have a clue
> what they have for applications, much less how changing something
> in Silo A impacts Silo Z... so they just make the change in A &
> eventually pay for fixing the mess in Z (probably without ever
> knowing the connection).
>
> I have recently conducted an informal survey... and learned
> in no uncertain terms that "inventory/impact analysis" is NOT
> something to discuss. People DO NOT CARE. (011)
Some do care. (012)
Note slides 91 to 98, which discuss a method for analyzing both
the legacy software and documentation for a large corporation
(on the Fortune 1000 list). That project analyzed and compared
1.5 million lines of COBOL code and 100 megabytes of miscellaneous
documentation, some of which was up to 40 years old. (013)
Those tools aren't yet widely used, but that kind of technology
is at the research level today, and it will become more widely
available as time goes by. (014)
John (015)
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx (016)