
Re: [ontolog-forum] The history of computing volume 6

To: "[ontolog-forum]" <ontolog-forum@xxxxxxxxxxxxxxxx>
From: Ed Barkmeyer <edbark@xxxxxxxx>
Date: Tue, 22 Feb 2011 11:55:51 -0500
Message-id: <4D63EA97.6090005@xxxxxxxx>
Gian Piero Zarri wrote:    (01)

> Note that SNOBOL was, at least partly, a spin-off of Victor Yngve's
> COMIT, the first (I think) computer language practically used for string
> processing and computational linguistics-like applications. I (very
> satisfactorily) used COMIT for several applications in the mid-60s at the
> Center for Cybernetics and Linguistic Applications of the University of
> Milan, including a small generative grammar program for Italian inspired
> by Yngve's "Little Train" work.    (02)

I had forgotten that.  Yes, even the SNOBOL4 (1968) manual referred to 
COMIT and Yngve as the ancestors.    (03)

John F. Sowa wrote:    (04)

> A serious deficiency of string languages, however, is that their
> unit of parsing is a single character.  That is not bad for analyzing
> morphology (word endings, for example), but it is a serious limitation
> for grammars, in which you need to group characters into words,
> words into phrases, and phrases into larger phrases.
> In his history of LISP, McCarthy noted that limitation of string
> languages.  The ability to group lists into larger lists was a major
> advantage of list languages, including IPL and FLPL as predecessors
> to LISP.      (05)

All true.  So, for parsing formal languages in SNOBOL, you first do the 
pattern matches on the source strings, and from them you create strings 
that are list structures of symbols, some of which refer to variables 
that contain strings, some of which may themselves be list structures of 
symbols, etc.  This is, of course, a rather crude approximation to the native 
form of LISP.  OTOH, one can then recursively perform pattern matches on 
the parsed symbol strings, which in many cases gives you a direct 
implementation of grammatical production rules, or translation rules, or 
optimization rules.  In LISP, you have to code those rules as functions 
that analyze lists.     (06)
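The two-pass scheme described above -- first pattern-match the raw 
character string into a string of symbols, then recursively pattern-match 
the symbol string to apply production rules -- can be sketched in Python. 
This is only an illustrative approximation: regex stands in for SNOBOL 
patterns, and the toy grammar, rule set, and function names are all 
assumptions, not anything from the SNOBOL manual.

```python
import re

# Pass 1: pattern-match the raw character string into a sequence of
# symbols (the SNOBOL-style tokenization step; a regex stands in for
# SNOBOL patterns here).
def tokenize(src):
    return re.findall(r"\d+|[a-zA-Z_]\w*|[+*()]", src)

def classify(tok):
    # Map each raw token to a grammar symbol.
    return "NUM" if tok.isdigit() else tok

# Pass 2: recursively rewrite the symbol string -- the "direct
# implementation of grammatical production rules" as pattern matches
# over the parsed symbols.  This toy grammar is a made-up example.
RULES = [
    (["NUM"], "EXPR"),
    (["EXPR", "+", "EXPR"], "EXPR"),
    (["(", "EXPR", ")"], "EXPR"),
]

def reduce_symbols(symbols):
    """Repeatedly rewrite any window that matches a rule's left-hand side."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in RULES:
            n = len(lhs)
            for i in range(len(symbols) - n + 1):
                if symbols[i:i + n] == lhs:
                    symbols = symbols[:i] + [rhs] + symbols[i + n:]
                    changed = True
                    break
            if changed:
                break
    return symbols

symbols = [classify(t) for t in tokenize("(1 + 2) + 3")]
print(reduce_symbols(symbols))   # -> ['EXPR']
```

The same `reduce_symbols` machinery would serve equally well for 
translation or optimization rules; only the rule table changes, which is 
the point Ed makes about pattern matching over parsed symbol strings.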

The important thing here is that parsing input from humans has always 
been a critical idea in all "artificial intelligence", and that includes 
the capture of algorithms for solutions to all kinds of problems, using 
all kinds of approaches.  Programming languages were invented to make 
that task easier, and they were optimized for certain approaches to 
certain kinds of problem solving.  The rest is about whether one 
approach was better than another, or one was easier to use for more 
applications than another, or one was promulgated in the community that 
got funded, or promulgated by the biggest IT company of its time, or 
just got better press.  There was a lot of work from the 1950s 
through the 1970s that was every bit as good as the work that got the 
recognition.    (07)

-Ed    (08)

"Full many a gem of purest ray serene,
 The dark unfathom'd caves of ocean bear:
 Full many a flow'r is born to blush unseen,
 And waste its sweetness on the desert air."
  - Thomas Gray, "Elegy Written in a Country Churchyard"    (09)

Edward J. Barkmeyer                        Email: edbark@xxxxxxxx
National Institute of Standards & Technology
Manufacturing Systems Integration Division
100 Bureau Drive, Stop 8263                Tel: +1 301-975-3528
Gaithersburg, MD 20899-8263                Cel: +1 240-672-5800    (010)

"The opinions expressed above do not reflect consensus of NIST, 
 and have not been reviewed by any Government authority."    (011)

Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/  
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/  
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/ 
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx    (012)
