Hi John,
I kinda object to the use of "T" because it conflicts with the extremely
long history of dynamic systems, discrete-time systems, and even electronics,
where signals are often spread out in a frequency vs. time plane. Wavelets,
Fourier analysis, control systems, optimal control, discrete sampled systems,
and zillions of other engineering marvels have used "T" for centuries. It
seems unnecessary to displace it now.
< snippage >
-----Original Message-----
From: John F. Sowa

On 8/17/2010 6:29 AM, Rich Cooper wrote:
> I interpret "comprehension" in this passage as referring to the degree
> of specialization of a "term", or symbol.
< snip >
I'd also like to relate this discussion to the term used for the top of
a type hierarchy. My preferred term is the symbol T for top, because it
avoids all possible confusion with words like 'thing' or 'concept'. If
anybody wants a pronounceable word, I recommend 'entity', because it is
a technical term that avoids all kinds of pointless controversy about
whether an event or a property is a thing.
The crucial point about T (or whatever else you want to call it) is
that it has maximum extension: the corresponding predicate T(x) is
true of every and any x that anybody can imagine. There is one and
only one axiom that defines the predicate T(x):

    For every x, T(x).

But T also has the minimum possible comprehension (or intension):
zero. That single axiom, which is true of everything, says nothing
about anything. T has no attributes or properties of any kind.
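That single axiom can be sketched in a few lines of Python. This is my
own illustration, not anything from the thread; the function name T and
the sample universe are mine:

```python
# A minimal sketch of the top predicate T, whose only axiom is
# "for every x, T(x)". It holds of anything whatsoever (maximum
# extension) while constraining nothing (zero intension).
def T(x):
    """True of every and any x that anybody can imagine."""
    return True

# T holds of anything we care to construct:
assert all(T(x) for x in [42, "an event", None, object(), [1, 2]])
```

Because T(x) returns True unconditionally, knowing that T holds of
some x tells you nothing at all about x, which is exactly the point
about zero comprehension.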
> I interpret "comprehension" in this passage as referring to the
> degree of specialization of a "term", or symbol.
It's better not to try to explain it. Just think in terms of the
logic: the comprehension (or intension) is determined by the
differentiae (monadic predicates) that define it. Adding more
differentiae makes a term more specialized, and deleting differentiae
makes it more generalized. If you erase all the differentiae, you
get T.
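The inverse relation between intension and extension described above
can be sketched in Python. Again this is my own illustration, under my
own assumed names (make_term, is_number, and the toy universe), not
anything from the thread:

```python
# A hedged sketch: model a term's intension as a set of differentiae
# (monadic predicates). A term holds of x iff every differentia holds.
def make_term(*differentiae):
    """Conjoin monadic predicates into a single term."""
    return lambda x: all(d(x) for d in differentiae)

# Two toy differentiae over a tiny sample universe:
is_number = lambda x: isinstance(x, (int, float))
is_positive = lambda x: is_number(x) and x > 0

universe = [-3, 0, 2, 7.5, "cat"]

T = make_term()                                       # no differentiae: the top
number = make_term(is_number)                         # one differentia
positive_number = make_term(is_number, is_positive)   # more specialized

def extension(term):
    """Everything in the universe that the term is true of."""
    return [x for x in universe if term(x)]

# Adding differentiae shrinks the extension; erasing them all gives T,
# which is true of everything:
assert len(extension(T)) >= len(extension(number)) >= len(extension(positive_number))
assert extension(T) == universe
```

Deleting a differentia (going from positive_number back to number)
generalizes the term, and erasing both recovers T, matching the claim
that T sits at the top with maximum extension and zero intension.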
< remainder snipped >