
Re: [ontolog-forum] Ontologies and languages

To: "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: "Bruce Schuman" <bruceschuman@xxxxxxx>
Date: Fri, 19 Jun 2015 08:21:41 -0700
Message-id: <006a01d0aaa3$a501bc90$ef0535b0$@net>

Thank you, Ed, I appreciate your vision.  I did bump around on your website at http://users.rcn.com/eslowry and found a lot of interesting resources.  I did manage to find the article by Peter Wegner entitled “Towards Empirical Computer Science”, which might be pushing these complexities in a more widely grounded direction.

 

And I have to take a close look at this – this notion of “perfection in the microstructures”  – which does, I would say, describe exactly what I am talking about:

 

See "Toward Perfect Information Microstructures" on the web site. That in turn
has concealed the potential for very general purpose precise language which
can serve as a lingua franca for technical literacy -- and perhaps most
practical ontologies. 

I want to see a series of very simple basic definitions that are based on something like this – and maybe exactly this -- so that universally-meaningful concepts like “identity” or “comparison” or “similarity” or “difference” or “analogy” can be given exact and very simple definitions with some hope of becoming consensual – in terms that are not only consistent with the needs of computer science – but which can directly translate to any other science that depends on those concepts.  Define these things in truly universal terms, and logic cascades between disciplines would become far simpler and less noisy.  I would say this agenda goes directly to the list of 4 points John Sowa just posted – maybe with some critique of the elements he defines as foundational, perhaps with emphasis on something like your “perfect microstructure” idea.

 

When I look at your “peg” model – I must say I don’t fully understand it – and I thought I might privately ask you a series of simple-minded questions that could help me see exactly what you are saying.  More or less, the diagram looks like a cascade of descending levels of generality, and it reminds me of the strongly generalized cascade of “part-of” and “is-a” hierarchies represented by Grady Booch, linked below.

 

http://originresearch.com/docs/booch/Booch_CH01.docx

 

For many specialized applications, generality may not matter.  But for me, coming from a broadly interdisciplinary point of view, it absolutely does matter.  Observations and principles that characterize one system ought very often to characterize another – and Booch presents his analysis in these terms.  That, I would say, makes him a great engineer, and is no doubt one reason why he has been so widely influential.

 

I did read through several of your articles on reducing complexity and fundamental design principles – and more or less, I agree with it all.  This is a very noisy subject area – no doubt for good reason – and I understand your point about simplification being a hard sell.  As regards whether I am receptive or sympathetic to simplicity as a consensual value – the answer is absolutely yes.  Actually, I would say that there are probably ways to “compress” complexity in some process of optimization, where some measure like “simplicity” becomes the driving variable.  But this is kind of apocalyptic stuff – and until some magician helps put it all together, it’s no wonder people remain skeptical or indifferent.

 

Thanks

 

- Bruce Schuman, Santa Barbara

http://networknation.net/vision.cfm

 


From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Ed Lowry
Sent: Thursday, June 18, 2015 6:04 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] Ontologies and languages

 

Bruce

I applaud your foundational focus.  The diffuseness of these discussions
seems to confirm such a need.
I have been working on a similar "common foundation" for some time,
or at least a beginning for one.  I do not attempt to represent the real
world or natural language, only models which can be captured in
computers.
At the most fundamental level, I propose basic data objects structured
like these:

[inline diagram: a "person" peg with links suspended from it, each link pointing to one of two person objects]

The person peg effectively represents the set of 2 persons suspended from it.
Other pegs with lists of links suspended from them can represent other sets
of objects pointed to by the links. Such sets exist by construction.
Two executed references are equal only if they refer to the same object.
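
(Roughly, in Python, one possible reading of this structure; the class and method
names below, Peg, Person, hang, members, are purely illustrative and are not part
of the actual design:)

    # Sketch only: a peg holds a list of links, each link points to an object,
    # and the set the peg represents exists simply by construction.
    class Peg:
        def __init__(self, name):
            self.name = name
            self.links = []            # links suspended from this peg

        def hang(self, obj):
            self.links.append(obj)     # suspend a link pointing to obj

        def members(self):
            return self.links          # the set represented by the peg

    class Person:
        def __init__(self, label):
            self.label = label

    person = Peg("person")
    alice, bob = Person("Alice"), Person("Bob")
    person.hang(alice)
    person.hang(bob)

    # Two executed references are equal only if they refer to the same object:
    assert person.members()[0] is alice
    assert alice is not bob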

Of course there are many possible ways to represent such information structures.
What gives this way some prospect of providing "a meeting point for a zillion
collaborative developments" is that the idea of simplification (that is: eliminating
extraneous complexity) has a lot of common sense appeal -- at least for people
who do not have vested interests in making other people's lives complicated.

For the past 40+ years the technology community has collectively obstructed progress
in designing precise language that facilitates simplicity compared with a design
developed at IBM in the early 1970s. See "Inexcusable complexity for 40 years"
on my web site.  That obstruction has concealed the fact that simplification leads
to convergence toward fundamental building blocks of information that are an
enduring practical optimum. They are at least very close to those illustrated above. 
See "Toward Perfect Information Microstructures" on the web site. That in turn
has concealed the potential for very general purpose precise language which
can serve as a lingua franca for technical literacy -- and perhaps most
practical ontologies. 

I would be very interested in any valid long term reason for departing from the
proposed practical optimum data object structure. Or any real obstacle to
precise language with much more generality than is used at present. The
connections built directly into the primitive objects provide for iteration through
sets.  That resolves a false dichotomy between languages with rich data structures
and those with simple plural expressions, a dichotomy that continues to plague
software technology.
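
(Continuing in the same illustrative Python, a plural expression over such a set can
be nothing more than a walk down the suspended links; a minimal stand-in for the
peg is repeated here so the snippet stands on its own:)

    # Illustration only: because the links are built into the peg itself,
    # iterating the set needs no separate index, cursor, or query layer.
    class Peg:
        def __init__(self, name):
            self.name, self.links = name, []

    employee = Peg("employee")
    employee.links.extend(["Ada", "Grace", "Edsger"])

    # A plural expression over the set is just iteration over the suspended links:
    for person in employee.links:
        print(person)

    shouted = [p.upper() for p in employee.links]   # rich structure, simple plural form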

Students everywhere are taught how to arrange pieces of information by
educators who are unaware of elementary pieces of information that are
well designed to be easily arranged. To me, few educational practices seem
more obviously unreasonable.

I wonder if you are receptive to simplicity as a value around which to seek
consensus on foundational choices.  I have found it a tough sell for many years.

Ed Lowry
http://users.rcn.com/eslowry 
 
                                *****************
On 6/18/2015 1:47 PM, Bruce Schuman wrote:

Well.  Just downloaded the Pope’s encyclical on the environment and “our common home”.  I’d say it is nothing less than stunning.  And in the context of this ontolog discussion, and the challenges that seem to come up here,  I wanted to pick up on just one sentence, in this message from Thomas Johnston:

 

Even when, as you indicate, you are referring to languages for formal ontologies (not to other kinds of formal languages, and not to natural languages), the terminological issues are too vague to pin down. In other words, we mean too many things by "language" and "ontology", i.e. our usages are not well-regulated by agreement on sets of necessary and sufficient conditions for correct usage

 

There are many themes in this recent conversation here I find fascinating.  I spent years exploring things like “the dimensional decomposition of features” and how that relates to classification.  And all this kind of exploration does feel critically important to me today, in a global context where “we humans” are having such a hard time bringing ourselves into common focus on high-pressure issues that demand collective response.

 

So, I have this little phrase that came to me last night, as a way it might be possible to package something that might work.

 

A ZERO-AMBIGUITY COMMON FOUNDATION

 

Yes, as Thomas Johnston notes, “the terminological issues are too vague to pin down”.  We “mean too many [different] things [by the words we use]”.

 

Our usages are “not well-regulated by agreement” – and some will say there’s no way they can be – reality is just too complicated…

 

So –  far too briefly, and no doubt inexpertly, and just as a sketch – I am pushing this thing, as a test/seminal hypothesis, into the maw of confusion.

 

As suggested a day or two ago in the quote from John Sowa on sets, and the wide range of meanings and interpretations that concept has – and in the context of recognizing that these definitions are “not well-regulated” and maybe cannot be – I am looking for stable bedrock, a non-transient industry standard at the lowest level of all machine-processing languages – which, I am daring to suppose, can probably be translated into an interpretation for natural language as well.

 

I am looking for a way to stipulate, to intentionally affirm, to postulate, to assert.

 

For the moment, I am looking at a few basic elements.  On the definition of “set” – I like the (?) “absolutely non-confusing” notion that a set is a container – a range of boundaries – maybe “boundary values” – within which something exists.  But keep it simple.  A set is a container.

 

So – to make this scientific, and not a matter of opinion or ungrounded floating abstraction, I want to define this container in some hard-core industry-standard way, in a “machine space” – at “the lowest possible level”.  Don’t define this thing in terms of “concepts” or ideas or abstractions.  Define it in some material/substantial way, in the simplest possible terms that “computer scientists everywhere” might see as basic and obvious.

 

We got a cell.  We got something inside the cell.  It’s mechanically represented.  It has physical instantiation.  Isn’t that what it takes?
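
Just as a crude illustration of what I mean, and nothing more, something like this, where a plain dictionary stands in for machine memory and all of the names are invented for the sketch:

    # Crude illustration only: a "cell" as the lowest-level unit, an address with
    # something inside it.  Real machines do this in hardware; here a dictionary
    # stands in for memory.
    memory = {}                         # address -> contents

    def store(address, value):
        memory[address] = value         # put something inside the cell

    def fetch(address):
        return memory[address]          # look at what the cell contains

    store(0x10, 42)
    assert fetch(0x10) == 42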

 

Every computer program in the world runs on something like this presumption.  Every computer language and program in the world is built up from something like this, in terms of some ascending cascade of abstractions.

 

Maybe those languages and programs immediately diverge in a zillion ways from this point of departure.  But isn’t this the basic foundation for all data assembly, for every structure, for every representation process, for every computation, for every function?

 

I’ve been gathering just a few basic terms from the conversation here that might be the immediate first elements in some common definition cascade.

 

For me, it looks like one very robust way to approach this – that does appear to be something like an industry standard – is this general discussion of an “object” – with a very simple model saying something like the entire business of data modeling involves a “real object” and an “abstract object” – and the mapping between them.

 

And there are some guiding general principles, such as the apparent fact that a taxonomic decomposition is always and necessarily context-specific to some specific environment or motivation.  This is the essential core reason we have a zillion overlapping but logically independent semantic ontologies.  Every situation is different, no one-size-fits-all, reality never parses the same way twice.  But maybe – if we stipulate it in the right way --  all models branch from the same root.

 

So – we have the “real object” – and we have a locally-motivated context-dependent interpretation of the real object, which enables us to assign which “properties” of this real object belong in our “abstract object” representational model.  The entire undertaking of science generally involves this process, and the entire activity of scientific confirmation and validation involves testing the correlation between the abstract object and the real object.  “Does this abstract (theoretical) model accurately describe the actual reality?”  Test, probe, iterate, confirm….
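
As a toy sketch of that mapping, with made-up property names and a made-up helper, it might look something like this:

    # Toy illustration of the real-object / abstract-object mapping.  The real
    # thing is only observed; the abstract object keeps whichever properties this
    # particular context cares about.  All names here are made up.
    def abstract_of(real_observations, relevant_properties):
        return {k: real_observations[k] for k in relevant_properties
                if k in real_observations}

    observed = {"height_cm": 172, "eye_color": "brown", "mood": "cheerful"}
    medical_view = abstract_of(observed, ["height_cm", "eye_color"])

    # Validation then means re-checking the abstract object against fresh
    # observations of the real object:
    assert medical_view["height_cm"] == observed["height_cm"]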

 

This is more or less the process in large simple terms.

 

So, into this context, we propose a few big simple definitions and explorations of “best method” – maybe following some explorations of the meaning of “best” – such as simplest, fastest, most general, etc.

 

Real object < --------- > Abstract object

 

Now, if that is right – a few very basic simple definitions – maybe starting with those primal concepts suggested by John –

 

Similarity, identity, difference

 

And into that conversation – I would want to suggest that this primal tension between “real object” and “abstract object” has a lot of implications for things we’ve been talking about here on ontolog.  For one – all perception of the “real object” involves inference – because we natively see the world of real objects through a lens of presumptive categories.  Two “real objects” cannot be defined as identical or not identical – because the properties of “identity” are abstractions and properties of our abstract model, not properties of the “real object” – which actually has no properties innate to itself.  Properties are abstract conceptual structures that we insert into our abstract model because they are useful for some context-specific purpose (e.g., this is how this particular medical office or local medical network agrees to handle this particular kind of issue).
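
Continuing the toy sketch, and only as one possible way to write it down, identity, similarity and difference would then be defined over the abstract objects rather than over the real ones; the measures below are invented purely for illustration:

    # Identity, similarity and difference as operations on abstract objects
    # (dictionaries of selected properties); the measures are made up.
    def identical(a, b):
        return a == b                    # same selected properties

    def similarity(a, b):
        shared = [k for k in a if k in b and a[k] == b[k]]
        return len(shared) / max(len(a), len(b), 1)

    def difference(a, b):
        return {k for k in set(a) | set(b) if a.get(k) != b.get(k)}

    view1 = {"height_cm": 172, "eye_color": "brown"}
    view2 = {"height_cm": 172, "eye_color": "blue"}
    print(identical(view1, view2))    # False
    print(similarity(view1, view2))   # 0.5
    print(difference(view1, view2))   # {'eye_color'}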

 

Get confused on that point – and we are off the rails.  This looks to me like something close to a basic convention.

 

What is happening for me, I think – is that I am staring at this very fundamental principle (“how to map the real to the abstract”) for the definition of reality, and what it might take to create a universal shared agreement on the semantic fundamentals of abstract description.  If that basic process could be mapped in a 100% clear way to the fundamental bedrock of computer processing, where we would probably be safe to assume any language can be constructed, we would have a clear point of focus – a line of demarcation – common ground.  Build a zillion processes, in a zillion directions – but build from this common root – where we can always meet each other, and build together in coherent ways --

 

If that basis were stable – then a bunch of fundamental definitions might quickly emerge – terms like similarity and difference and identity could be given common-ground machine-space (“cellular”) definitions, and a flood of interrelated higher and more composite terms could then also emerge – all kinds of terms relating to object description and comparison, terms like analogy or metaphor or “universal” or “class” or “instance” – and maybe not always in the same way, but retaining their context-relative fluency for immediate local applications – such that these terms could continue to be seen as relativistic purpose-specific definitions – as they are in actual practice today.

 

I can think of a flood of confusing issues that might fall into place if this kind of definition-basis were to emerge in a sound consensual way.

 

Don’t build all of this on the basis of “natural language” – by empirical observation of natural language -- because natural language is a vastly complex organic and variable thing with no innate perfect rationale.  Instead, build it on the basis of a man-made human-constructed intentional and stipulated “synthetic” language – because our synthetic language can be controlled and specified and perfected in micro-detail – and in that context, it is still true that “we can map anything to any desired degree of specificity”, including natural language.  There’s nothing we can’t do with it.  It’s the ultimate in fine-grained fluency, and comes as close to continuously variable as we feel like pushing it.

 

An approach like this would give us all a fundamental coherence underlying everything we do.  And it would not constrain the endless number of alternative ways people are actually doing things today, because it affirms nothing beyond the absolute foundation.  But it would give us all a meeting point for a zillion collaborative developments that the world is calling for today.

 

 

From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Thomas Johnston
Sent: Wednesday, June 17, 2015 1:45 PM
To: [ontolog-forum]
Subject: Re: [ontolog-forum] Ontologies and languages

 

Hi Juan,

 

Even when, as you indicate, you are referring to languages for formal ontologies (not to other kinds of formal languages, and not to natural languages), the terminological issues are too vague to pin down. In other words, we mean too many things by "language" and "ontology", i.e. our usages are not well-regulated by agreement on sets of necessary and sufficient conditions for correct usage. 

 

But I would suggest, given this proviso, that an ontology is a formalization of the semantic relationships among the members of a lexicon, and a language is a means for combining lexical elements into statements. So it's somewhat analogous to the distinction, in a natural language, between expressions in a language, and the statements created from combinations of those expressions according to grammatical rules.

 

Of course, the semantic relationships among a set of expressions would themselves be expressed in statements, e.g. the semantic rule that "bachelor" means "unmarried adult male", in an ontology, would be expressed by the statement "A bachelor is an unmarried adult male" in the language for that ontology. So given this fact, a language is indeed a means for expressing the semantic relationships among expressions. The sentences thus produced are analytic sentences, ones true by definition.

 

But that's not all that a language (for a formal ontology) is. Another class of statements are the synthetic ones, the ones true if and only if what they state to be the case is, in fact, the case. A language for an ontology should be able to express synthetic statements as well.
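
(As a very small illustration of that analytic/synthetic split, in made-up Python notation rather than in any particular ontology language:)

    # Tiny illustration with invented data.  Analytic statements are true by the
    # definitions in the lexicon; synthetic ones are true only if the world
    # happens to be that way.
    lexicon = {"bachelor": {"married": False, "adult": True, "sex": "male"}}

    def analytic_check(term, properties):
        # "A bachelor is an unmarried adult male": true by definition
        return lexicon[term] == properties

    world = {"john": {"married": False, "adult": True, "sex": "male"}}

    def synthetic_check(individual, term):
        # "John is a bachelor": true only if the facts about John say so
        return all(world[individual].get(k) == v
                   for k, v in lexicon[term].items())

    assert analytic_check("bachelor",
                          {"married": False, "adult": True, "sex": "male"})
    assert synthetic_check("john", "bachelor")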

 

Since there is no Aristotelian definition (by genus and specific difference) of "ontology" and "ontology language" (within the language community we are both part of), and since, as it seems to me, there will be almost as many putative definitions as there are people attempting to give them, this is little more than my own personal view. 

 

But I would remind anyone else willing to step up to the plate and attempt their own definitions of these terms that a definition of an expression is more than a collection of true statements about what that expression purportedly represents. Aristotelian definition may be an ideal rather than an achievable objective, but it is still the gold standard.

 

Regards,

 

Tom Johnston

 




