DN: I would assert that a
key engineering problem is how to capture the context in which a question is
posed. For example, the human brain thinks in terms of cognition, yet
pragmatics are intrinsically linked to that context.
AA: Some explication.
Humans think in terms of objects and processes, things and actions, entities and
changes, or objects and events. If "cognition" is meant as conception and notion,
then it is framed in terms of the concepts of objects, states, changes, and relations.
DN: For example, if you
are having drinks in a bar and you see a glass of beer on an adjacent table, you
recognize it, but you may not elevate it to the highest tuples of your brain.
AA: There is an
old Russian anecdote about the mentality of different nations, which could
have been tested had the neuroscientists and cognitive engineers managed to
invent a sort of "mental projector". With one well-known nation (keeping
political correctness), a glass of beer was the dominating cognition.
As for WA, the search
system has little to do with any meaningful equations, whether physically,
biologically, or socially meaningful, and so with the semantic primes of
meanings, senses, reference, representation, truth, and context. Its scope
is a low-level domain of statistics, numbers, and correlations.
----- Original Message -----
Sent: Wednesday, June 03, 2009 11:02
Subject: Re: [ontolog-forum]
Wolfram|Alpha and NL QA
On 6/2/09 6:08 PM, "Adrian Walker" wrote:
Because 'context' in WolframAlpha is limited to
the equations (I'm using the term 'equations' loosely) of a particular
knowledge domain, the meaning of a term can change depending on how it is
used, giving you completely wrong answers, with no warning that they are
wrong. All of this is enough to make a data governance geek cringe. The only
thing worse than being wrong is being wrong and believing you are right, and
acting on it.
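The point above, that a term's meaning is resolved per knowledge domain with no warning when the assumed domain is wrong, can be shown with a toy resolver. This is a hypothetical sketch, not Wolfram|Alpha's actual behavior or API; the dictionary and function names are invented for illustration:

```python
# Toy per-domain term resolver. Each domain assigns its own sense to a
# term; if the asker's intended domain differs from the one assumed,
# the answer is silently wrong.
DOMAINS = {
    "chemistry": {"mercury": "element Hg, liquid metal"},
    "astronomy": {"mercury": "innermost planet of the Solar System"},
}


def answer(term, domain="chemistry"):
    # No check that `domain` matches the asker's intent, and no warning
    # when it does not: this is exactly the failure mode described above.
    return DOMAINS.get(domain, {}).get(term, "unknown")


print(answer("mercury"))               # chemistry sense, even if you meant the planet
print(answer("mercury", "astronomy"))  # correct only if the domain guess matches intent
```

The system is confidently wrong in the mismatched case, which is the "being wrong and believing you are right" scenario.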
A key technical problem would seem to be how to
prevent the system from answering questions it does not understand. I
would assert that a key engineering problem is how to capture the context in
which a question is posed. For example, the human brain thinks in terms
of cognition, yet pragmatics are intrinsically linked to that context.
For example, if you are having drinks in a bar and you see a glass of
beer on an adjacent table, you recognize it, but you may not elevate it to the
highest tuples of your brain.
If you are driving down a freeway at 100
KPH and see the same glass of beer approaching your windshield, it has an
entirely different meaning.
Given that the WA engine cannot understand the context
in which questions are posed, it will also fail to deliver
semantically meaningful answers.
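The bar-versus-freeway example above can be sketched as a toy function: the same percept gets a different priority depending on an explicit context tag. The function and label names are hypothetical, invented purely to restate the argument in code:

```python
# Same object, different pragmatics: salience depends on the situational
# context, not on the object alone.
def salience(percept, context):
    if percept == "glass of beer":
        if context == "bar":
            return "background"  # recognized, but not escalated
        if context == "freeway at 100 km/h":
            return "urgent"      # same object, radically different meaning
    return "ignored"


print(salience("glass of beer", "bar"))                  # background
print(salience("glass of beer", "freeway at 100 km/h"))  # urgent
```

A QA engine that resolves the percept without the context tag cannot distinguish these two cases, which is the failure mode the paragraph above describes.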
Technical Evangelist - Adobe Systems
Chair - OASIS SOA RM Technical Committee
Manager - Adobe LiveCycle ES Developers List
Duane's World TV: http://www.duanesworldtv.org
Author - Web 2.0 Architecture
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J