Bruce:
If
I had my way, every natural language term would be (stipulatively)
grounded (somehow) in the real number line, with exact measurements in x
number of decimal places in explicit dimensions with a known error
tolerance. A “dog” is exactly this….
Me:
I have thought of a way to do that, using an "information theory" measure.
Say that a word has N meanings, with probabilities p1, ..., pN.
Then the "word uncertainty" is sum(-pi log pi) over i = 1 to N
(the Shannon entropy of the meaning distribution).
If we don't know the probabilities we can, for lack of a better idea,
take pi = 1/N, in which case the word uncertainty is log N, traditionally
measured in bits (log base 2).
Let "proposition uncertainty" = sum(word uncertainty) over words in proposition.
Think of this uncertainty as a quality measure for the context of the proposition.
If the context has defined all the words in the proposition,
we have a perfect context with uncertainty 0.
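The two measures above are simple enough to sketch in a few lines of code. Here is a minimal Python illustration; the function names and the toy meaning counts are mine, not part of the original proposal:

```python
from math import log2

def word_uncertainty(probs):
    """Shannon entropy (in bits) of a word's meaning probabilities."""
    return sum(-p * log2(p) for p in probs if p > 0)

def uniform_uncertainty(n_meanings):
    """Fallback when the probabilities are unknown: pi = 1/N gives log2(N) bits."""
    return log2(n_meanings)

def proposition_uncertainty(meaning_counts):
    """Sum of (uniform) word uncertainties over the words of a proposition.
    meaning_counts maps each word to its number of meanings N."""
    return sum(uniform_uncertainty(n) for n in meaning_counts.values())

# A word with 4 equally likely meanings carries log2(4) = 2 bits of uncertainty.
print(word_uncertainty([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# If the context has defined every word (N = 1 each), the proposition
# uncertainty is 0: a "perfect context" in the sense above.
print(proposition_uncertainty({"the": 1, "dog": 1, "barks": 1}))  # → 0.0
```

Note that a skewed distribution (say p1 = 0.9, p2 = 0.1) gives less than log2(2) = 1 bit, so the uniform fallback is the worst case for a given N.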
Dick McCullough
Context Knowledge Systems
Name your propositions !