Thanks for the (ever) thoughtful comments and reply.
I'd agree with most of your points. What I guess my concern boils down to
(and perhaps yours too) is the consequences of failure, and the scale of
those consequences. Admittedly, a failure can be as devastating whether
committed by human or machine, but can the machine be held to account?
When the trade-off is accountability/responsibility against quality of
service, where do you draw the line? And surely the fact that it is us
humans asking such questions should point us to the answer?
Best regards for the holiday season,
| -----Original Message-----
| From: ontolog-forum-bounces@xxxxxxxxxxxxxxxx [mailto:ontolog-forum-
| bounces@xxxxxxxxxxxxxxxx] On Behalf Of John F. Sowa
| Sent: Wednesday, 22 December 2010 02:24
| To: ontolog-forum@xxxxxxxxxxxxxxxx
| Subject: Re: [ontolog-forum] Fwd: [New post] The Newest from SOA: The SOA
| Ontology Technical Standard
| I very strongly agree with your concerns. But I would also note that the
| opposite is equally chilling: final control by some annotator who made a
| mistake in choosing among the senses of words like 'fire', 'is', or 'can'.
| > I'm happy that some automated system might *help* me model my world
| > (even extensively), but I will *always* want the final say.
| Two points:
| First, note that Yahoo tried to improve their search engine by using an
| ontology derived by human volunteers. Google's automated system proved
| to be far more effective.
| Second, note that all of us are being evaluated constantly by automated
| software, or by semi-automated software run by poorly educated clerks
| who are "just following orders". That is happening every day at banks,
| credit bureaus, insurance companies, airlines, etc., and none of them
| will give us the final say about their decisions about us (or even let
| us know the reasons).
| > Decades of AI failures and false dawns demonstrate that we should
| > never hand control over to fully automated systems.
| I agree. But such failures happen with every system, independently of
| how it is implemented. Armies of clerks can be just as bad or worse than
| computers. In fact, some of the worst abuses are caused by human systems
| run by profit-making companies whose major concern is the size of the
| bonuses their executives (and clerks) can get for denying service.
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx