DN: I would assert that a key engineering problem is how to capture the context in which a question is posed. For example, the human brain thinks in terms of cognition, yet pragmatics are intrinsically linked to that context.
AA: Some explication.
Humans think in terms of objects and processes, things and actions, entities and changes, or objects and events. If "cognition" is meant as conception and notion, then it is in terms of the concepts of objects, states, changes, and relations.
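As a rough illustration only, here is a toy Python sketch of those four categories; the class names Object, State, Change and Relation are placeholders of my own, not any standard ontology:

from dataclasses import dataclass

@dataclass
class Object:
    name: str

@dataclass
class State:
    obj: Object
    prop: str
    value: str

@dataclass
class Change:              # an event: a transition between two states
    before: State
    after: State

@dataclass
class Relation:
    name: str
    terms: tuple

# Example: a glass of beer standing on a table, then being emptied.
glass = Object("glass_of_beer")
table = Object("table")
on_table = Relation("on", (glass, table))
full = State(glass, "fill_level", "full")
empty = State(glass, "fill_level", "empty")
drinking = Change(before=full, after=empty)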
DN: For example, if you are having drinks in a bar and you see a glass of beer on an adjacent table, you recognize it, but you may not elevate it to the highest tuples of your brain.
There is an old Russian anecdote about the mentality of different nations that would have been confirmed by scanning, had neuroscientists and cognitive engineers managed to invent a sort of "mental projector". For one well-known nation (keeping political correctness), a glass of beer was the dominating cognition and image... :)
As for WA, the search system has little to do with any meaningful equations, whether physically, biologically, or socially meaningful, and so little to do with the semantic primes of meaning, sense, reference, representation, truth, and context. Its scope is the low-level domain of statistics, numbers, correlational statistics, etc.
Azamat Abdoullaev
----- Original Message -----
Sent: Wednesday, June 03, 2009 11:02 AM
Subject: Re: [ontolog-forum] Wolfram|Alpha and NL QA
Inline:
On 6/2/09 6:08 PM, "Adrian Walker" <adriandwalker@xxxxxxxxx> wrote:
Because 'context' in WolframAlpha is limited to
the equations (I'm using the term 'equations' loosely) of a particular
knowledge domain, the meaning of a term can change depending on how it is
used, giving you completely wrong answers, with no warning that they are
wrong. All of this is enough to make a data governance geek cringe. The only
thing worse than being wrong is being wrong and believing you are right, and
acting on it.
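To make that failure mode concrete, here is a hypothetical Python sketch; the DOMAINS table, the "pound" example and the conversion rates are invented for illustration and are not taken from WA:

# Each knowledge domain resolves the same term against its own formulas,
# so the "answer" depends silently on which domain happened to match.
DOMAINS = {
    # domain -> (interpretation of "pound", conversion to a base unit)
    "physics": ("mass in pounds",  lambda x: x * 0.4536),  # -> kilograms
    "finance": ("price in pounds", lambda x: x * 1.23),    # -> dollars (made-up rate)
}

def answer(query_value: float, domain: str) -> str:
    meaning, convert = DOMAINS[domain]
    # No check that the chosen domain matches the asker's intent,
    # and no warning is attached to the result.
    return f"{meaning}: {convert(query_value):.2f}"

print(answer(10, "physics"))   # 'mass in pounds: 4.54'
print(answer(10, "finance"))   # 'price in pounds: 12.30'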
A key technical problem would seem to be how to
prevent the system from answering questions it does not
understand.
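One way to sketch such a guard, purely hypothetically (the interpretations() lookup and the domain names are invented for illustration):

def interpretations(question: str) -> list[str]:
    """Hypothetical lookup returning every domain that claims the question."""
    matches = []
    if "pound" in question:
        matches += ["physics", "finance"]
    return matches

def guarded_answer(question: str) -> str:
    candidates = interpretations(question)
    # Refuse instead of guessing when zero or several domains claim the question.
    if len(candidates) != 1:
        return ("I do not understand this question well enough to answer it "
                f"(candidate domains: {candidates or 'none'}).")
    return f"Answering within the '{candidates[0]}' domain..."

print(guarded_answer("convert 10 pounds"))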
I
would assert that a key engineering problem is how to capture the context in
which a question is posed. For example, the human brain thinks in terms
of cognition, yet pragmatics are intrinsically linked to that context.
For example, if you are having drinks in a bar and you see a glass of beer on an adjacent table, you recognize it, but you may not elevate it to the highest tuples of your brain.
If you are driving down a freeway at 100
KPH and see the same glass of beer approaching your windshield, it has
different semantics.
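A hypothetical sketch of what capturing that context could look like; the Context fields and the interpret() function are made up for illustration, not a description of any existing system:

from dataclasses import dataclass

@dataclass
class Context:
    location: str       # e.g. "bar", "freeway"
    speed_kph: float    # speed of the observer

def interpret(percept: str, ctx: Context) -> str:
    # The same percept gets different semantics depending on the context.
    if percept == "glass of beer":
        if ctx.location == "bar":
            return "background object; low salience"
        if ctx.location == "freeway" and ctx.speed_kph > 0:
            return "incoming projectile; evasive action required"
    return "unknown"

print(interpret("glass of beer", Context("bar", 0)))
print(interpret("glass of beer", Context("freeway", 100)))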
Given that the WA engine cannot understand the context in which questions are posed, it will also fail to deliver semantically meaningful answers in absolute terms.
Just MHO.
D
--
Sr. Technical Evangelist | Adobe Systems
Chair | OASIS SOA RM Technical Committee
Manager | Adobe LiveCycle ES Developers List
Blog: http://technoracle.blogspot.com
Twitter: duanechaos
Duane's World TV: http://www.duanesworldtv.org
Band: http://www.myspace.com/22ndcentury
Author | Web 2.0 Architecture: http://technoracle.blogspot.com/2009/05/web-20-architecture-book-is-here.html