Peter will probably kill me for this, but I cannot resist the
opportunity to ask a question of a group that really knows its history
of computing.
In 1972, the University of Western Ontario produced a limited edition of
50 posters of computer art that I created using a PDP-8 with a storage
screen. The poster was made from a photograph of the screen taken while
the program ran, and it was printed using normal poster printing
technology. It looks a lot like the CBS "eye" logo.
I have the original photographs of several images of different graphic
designs made by a small group of Masters students and department
technicians, from which mine was chosen for publication, and I have a
few copies of the actual paper posters.
My question is: where does this fit in the history of computing and of
published computer-generated art?
Ron
On 18/02/2011 2:35 PM, Ed Barkmeyer wrote:
> More old fogey discussion of computer science in the 1960s, and its
> possible relevance to the present.
>
> John F. Sowa wrote:
>> On 2/17/2011 5:15 PM, Ed Barkmeyer wrote:
>>
>>> Unfortunately, the amount of high-quality
>>> research that was done under the name AI has been confused with the
>>> amount of high quality research in computational technologies that was
>>> not done under the name AI, because they have all been integrated into
>>> the foundations of computer science.
>>>
>> That is true, especially since many of the pioneers crossed over from
>> one to the other quite freely. For dates, I'm relying on
>>
>> _History of Programming Languages_, edited by Richard L. Wexelblat,
>> Academic Press, 1981.
>>
> Ah. That is rather after my time in the compiler business. My
> recollections are from Jean Sammet's book, about 1970.
> (Jean was another IBMer, and one of the "Grandes Dames of Computing" of
> the 1950s and 60s.)
> I wrote compilers for a living in the late 1960s; but I wrote my last
> commercial compiler in 1973.
>
>> On p. 176, McCarthy wrote that he had been using FORTRAN to write
>> a chess program in 1957-1958, which led him to propose the
>> "conditional expression", which he then proposed for Algol
>> in a letter to the _Communications of the ACM_.
>>
>> McCarthy had also collaborated with Nat Rochester and Herb Gelernter
>> at IBM on the development of FLPL (FORTRAN List Processing Language).
>> He tried to write a program to do symbolic differentiation in FLPL and
>> realized that recursion was necessary, but FORTRAN didn't support it.
>>
> Ah, yes. Symbolic differentiation, inspired by McCarthy, gave rise to
> Project MAC, which gave rise to MacLISP -- the dominant LISP
> implementation of the late 1960s. I think we can say that Mathematica
> had its origins in McCarthy's work of 1960.
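[An aside for readers who have not seen it: the reason symbolic differentiation pushed McCarthy toward recursion is that the textbook rules are themselves recursive over expression structure. A toy sketch in modern Python -- purely illustrative, not period code; the tuple representation is my own:]

```python
# Symbolic differentiation by the recursive textbook rules.
# Expressions are tuples like ("+", a, b) or ("*", a, b),
# a variable name (str), or a numeric constant.

def deriv(e, x):
    """d(e)/dx, computed by structural recursion over e."""
    if isinstance(e, (int, float)):        # d(c)/dx = 0
        return 0
    if isinstance(e, str):                 # d(x)/dx = 1, d(y)/dx = 0
        return 1 if e == x else 0
    op, a, b = e
    if op == "+":                          # sum rule
        return ("+", deriv(a, x), deriv(b, x))
    if op == "*":                          # product rule
        return ("+", ("*", deriv(a, x), b), ("*", a, deriv(b, x)))
    raise ValueError(f"unknown operator {op!r}")

# d(x*x + 3)/dx
print(deriv(("+", ("*", "x", "x"), 3), "x"))
# -> ('+', ('+', ('*', 1, 'x'), ('*', 'x', 1)), 0)
```

Every rule calls deriv on subexpressions, which is exactly what a language without recursive subroutine calls cannot express directly.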
>
>> The design of LISP began in the fall of 1958, and the implementation
>> happened almost by accident. As McCarthy wrote (p. 179), they started
>> with a recursive definition of the eval function to evaluate LISP
>> expressions. But then "S. R. Russell noticed that eval could serve
>> as an interpreter for LISP, promptly hand coded it, and we now had
>> a programming language with an interpreter."
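[Russell's observation can be sketched in modern terms: a recursive eval over S-expressions is already an interpreter. A hypothetical Python miniature -- not the 1958 code; the environment layout and the handful of special forms here are my simplifications:]

```python
# Minimal eval for a toy LISP. Expressions are Python lists and atoms:
# strings are variables, numbers are self-evaluating.

def lisp_eval(expr, env):
    """Evaluate a toy S-expression against an environment dict."""
    if isinstance(expr, str):              # variable reference
        return env[expr]
    if not isinstance(expr, list):         # self-evaluating atom
        return expr
    op, *args = expr
    if op == "quote":                      # (quote datum)
        return args[0]
    if op == "cond":                       # McCarthy's conditional expression
        for test, result in args:
            if lisp_eval(test, env):
                return lisp_eval(result, env)
        return None
    if op == "lambda":                     # (lambda (params) body)
        params, body = args
        return lambda *vals: lisp_eval(body, {**env, **dict(zip(params, vals))})
    # application: evaluate the operator and operands, then apply
    fn = lisp_eval(op, env)
    return fn(*[lisp_eval(a, env) for a in args])

env = {"zerop": lambda n: n == 0}
# ((lambda (n) (cond ((zerop n) (quote done)))) 0)
prog = [["lambda", ["n"], ["cond", [["zerop", "n"], ["quote", "done"]]]], 0]
print(lisp_eval(prog, env))   # -> done
```

The point of the anecdote survives: once eval exists as a function, hand-coding that one function yields an interpreter for the whole language.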
>>
>> P. 181: "The first successful LISP compiler was... written in LISP
>> and was claimed to be the first compiler written in the language
>> to be compiled."
>>
>>
>>> I believe many of the programming language pioneers of the late 1950s
>>> would argue that McCarthy took from their discussions as much as he
>>> provided, and that if-then-else and recursion, which were features of
>>> Algol 58, hardly originated with LISP.
>>>
>> We both get partial credit. The conditional expression was proposed
>> for Algol 58, but it was McCarthy who proposed it.
>>
> Bear in mind that Fortran 57 had a conditional statement of the
> if...then...else variety, but in a cruder form taken directly from the
> thinking of the IBM 701/704 instruction set:
>     IF (numeric expression) positive-destination, zero-destination,
>         negative-destination
> but it was all based on goto's and statement labels. Algol 58 had only:
>     if (logical expression) statement
> where <statement> could be begin <statement>* end.
> It seems likely that McCarthy's contribution was "else".
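[The distinction being drawn here can be restated in modern notation: the Fortran arithmetic IF is a three-way control transfer, while McCarthy's conditional is an expression that yields a value. A hypothetical Python sketch, for illustration only:]

```python
def sign_via_statement(n):
    # Statement style, in the spirit of the Fortran arithmetic IF:
    # the test selects one of three destinations, and control flows there.
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

def sign_via_expression(n):
    # McCarthy-style conditional *expression*: the whole construct is a
    # single value, usable anywhere inside a larger expression.
    return "negative" if n < 0 else ("zero" if n == 0 else "positive")

print(sign_via_statement(-5), sign_via_expression(-5))  # negative negative
```

The expression form is what LISP's cond provides and what the goto-based arithmetic IF cannot express without labels and jumps.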
>
>>> ... we finally did see IBM 360 Fortran H -- an optimizing compiler
>>> written in Fortran! Whether John got the idea from McCarthy I couldn't say.
>>>
>> 1963 was very late in the programming language game.
>>
> We...ell, it wasn't the originating work, but I wouldn't call it "very late".
>
> I think Bell Labs SNOBOL (vI) appeared in late 1962. It was a
> semi-procedural rule-based language for string processing, whose
> execution model was based on a complex backtracking string-matching
> algorithm. (Later versions of the SNOBOL approach showed up in AWK and
> other tools around 1970.) And of course, Niklaus Wirth's journey through
> experimental programming languages began in 1963-4.
>
> By 1963, the beginnings of a discipline were emerging in the compiler
> community, to replace the former seat-of-the-pants stuff. The period
> 1961-1965 saw publication of many of the references on formal language
> grammars, although the LR(k) terminology is almost 10 years later. IBM
> (Backus) and CSC and U. Illinois and ICL had been working on
> "optimization" since about 1960, but the reference papers appeared in
> 1965 and 1968. The first 7090 Fortran compiler to generate relocatable
> binary code instead of intermediate assembly code was the U. Illinois
> compiler of 1964, although the Arden/Galler/Graham gang at U. Michigan
> had done that for MAD in 1960. To the best of my knowledge CSC's
> TransGen was the first compiler-compiler; it appeared in 1963. The GE
> POPS compilers, which generated a common byte-code, instead of object
> code, first appeared in 1962, and allowed GE to provide one of the
> original 'time-sharing services'. (They did the byte-code thing,
> because they didn't know the target hardware platform when the project
> started!)
>
> Although many regard the Algol 60 report as the first programming
> language standard, the first formal programming language standard was
> ANSI Fortran, X3.9-1966. The ANSI COBOL spec appeared in 1968. In the
> late 1970s I was the NBS Pascal standards rep, with an office on
> "language row" -- Mabel Vickers (COBOL 68, 74), Betty Holberton (Fortran
> 66, 77), John Cugini (Basic), me, and then Joe Wegstein, former Chair of
> the Algol 60 committee. (All of my knowledge is second-hand.)
>
>>> But I think Wegner and Gear and Backus and Aronson have equal right to claim
>>> if-then-else, recursion, and the foundations of compiler technology.
>>>
>> McCarthy certainly didn't invent compilation, since he used FORTRAN
>> and FLPL before beginning to define LISP. But it was his letter
>> about the weaknesses of the FORTRAN IF that proposed the conditional.
>>
>> Recursive functions were common in logic since the 1930s, but
>> computational algorithms rarely used them. The ALGOL article in
>> Wexelblat's book indicates that ALGOL 58 was a specification
>> rather than an implementation. Many incompatible versions were
>> implemented in the 1959-60 years with different names, such as
>> MAD, JOVIAL, NELLIAC. Page 86 says that recursion was required
>> for Algol 60. For Algol 58, the only comment is that it was not
>> specifically forbidden.
>>
> Exactly. Fortran and MAD disallowed recursion, because they used a
> standard IBM 704 subroutine interface architecture that wasn't based on
> a stack. But JOVIAL (Jules Schwartz's Own Version of the International
> Algebraic Language, from SDC) allowed recursion because it was
> originally written for a UNIVAC real-time control machine that had a
> stack structure (the subroutine call was PLJ (push-location-and-jump)
> rather than TSX (transfer and set index)). The Burroughs Algol 58
> compiler also supported recursion, because the stack architecture was
> supported by hardware in the B5000 machine, and its successors.
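[The PLJ-vs-TSX point -- why recursion needs a stack -- can be shown with a toy simulation (hypothetical Python, not period code). A TSX-style convention keeps one fixed return-address slot per routine, so a recursive call overwrites the pending return; a PLJ-style push preserves all of them:]

```python
# Simulate recording a "return address" (here, just the call depth)
# under the two linkage conventions.
single_cell = None    # TSX style: one fixed slot for the routine
return_stack = []     # PLJ style: push-location-and-jump

def enter(depth, use_stack):
    """Record where this activation should return to, then recurse."""
    global single_cell
    if use_stack:
        return_stack.append(depth)   # push: every pending return survives
    else:
        single_cell = depth          # overwrite: earlier returns are lost
    if depth < 3:
        enter(depth + 1, use_stack)

enter(1, use_stack=False)
print(single_cell)     # -> 3: only the innermost return point remains
enter(1, use_stack=True)
print(return_stack)    # -> [1, 2, 3]: all pending return points remain
```

With the single-cell convention, the outer activations can never find their way back to their callers, which is exactly why Fortran's 704-style linkage ruled out recursion.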
>
>> the adoption of garbage collection makes Java much
>> closer in spirit to LISP+CLOS than to C++.
> In the spirit of memory management strategy, John means. I personally
> would not have considered the underlying storage management strategy to
> be a primary factor in characterizing programming languages. But in
> languages in which you can create "new" things on the "heap" (which
> effectively began with LISP, although those terms are taken from
> Algol 68), storage management strategy is important.
>
>> Smalltalk undoubtedly
>> had an influence on Java, but Smalltalk was also very strongly
>> influenced by LISP.
> Really? Is there some description of this, or is it simply a matter of
> the PARC gang being well-educated in programming language concepts?
> The only resemblance I see is that Smalltalk is closer to typeless
> languages like LISP than to strongly typed languages like Algol and
> Ada. But Java is strongly typed. It is true that Smalltalk
> implementations also do garbage collection and can do something a bit
> like 'eval'.
>
>> Even so, Java is more similar to LISP+CLOS than to Smalltalk.
> And much more similar to strongly typed languages like Algol, Ada, C and
> C++. Strong typing is essential for the concepts of subsumption and
> inheritance and encapsulation and 'polymorphism' (of a sort) that are
> said to be the hallmarks of OOPLs. For that reason, CLOS is more
> strongly typed than LISP.
>
> The only real resemblance I see is the Java 'interface' notion, which is
> closer to the CLOS and Smalltalk views of typing than the Ada/C++ view.
> Stated simply, the Ada/C++ view is that an object IS what its base type
> is, and all of its properties are dependent on the type hierarchy. The
> Smalltalk/CLOS/Java view is that an object is what its structure is, but
> that is encapsulated. What interfaces it can provide, and thus how it
> is perceived outside of itself, is a separate concern, not directly
> dependent on its structure. (But this was also the foundation concept
> for DCE and CORBA, which were also important to Gosling's
> conceptualization for Java.)
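[The nominal-vs-structural contrast described above can be sketched with Python's typing module -- an anachronistic stand-in for the languages discussed, used here only because it supports both views side by side:]

```python
# Nominal (Ada/C++ style): an object IS what its declared base type is.
# Structural (Smalltalk/CLOS/Java-interface style): what matters is the
# set of operations the object actually provides.
from typing import Protocol, runtime_checkable

@runtime_checkable
class Drawable(Protocol):          # an "interface": just a required operation
    def draw(self) -> str: ...

class Circle:                      # never declares Drawable as a base type
    def draw(self) -> str:
        return "circle"

c = Circle()
print(isinstance(c, Drawable))     # True: structure, not ancestry, decides
print(issubclass(Circle, Drawable))  # True for the same reason
```

Under a strictly nominal regime, Circle would have to name Drawable in its inheritance list to count as one; the interface view separates "what it provides" from "what it is built from".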
>
>> The similarity between LISP and Java is
>> apparent in the ability to compile LISP to Java bytecodes --
>> and by the many professors who had taught LISP for AI and
>> switched to Java for AI.
>>
> John, as a logician, you should be ashamed of that... (This is just too
> much of a stretch to ignore. :-) )
>
> LISP environments have never interfaced well with others, with the
> consequence that LISP is not really viable for a lot of 21st century
> network-based applications, even though it may be better for describing
> the processing algorithms. The network interface libraries are designed
> to work procedurally, with standard C or Java interfaces. As a
> consequence, the AI community is converting LISP code, and sometimes
> LISP programmers, to the Java environment. Ergo??, Java is similar to LISP.
>
> And the fact that LISP can be compiled into Java byte-codes (with an
> 'eval' function in the class library) doesn't say much. By way of
> comparison, I would point out that the famous basic LISP elements CAR
> (head) and CDR (rest) originally referred to the placement of the 'this'
> and 'next' links in the IBM 704 representation of the list. The 'this'
> link (CAR) was placed in the "address" field of the word (bits 21-35)
> and the 'next' link (CDR) was placed in the "decrement" field of the
> word (bits 3-17). To use these fields, you took the "Contents of the
> Address part of Register" or the "Contents of the Decrement part of
> Register". By John's logic, we might conclude
> that LISP was similar to IBM 704 assembly language!
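[The address/decrement mechanics above amount to simple bit-field extraction, which can be mimicked today. Hypothetical Python; the layout here -- a 15-bit field at the low end of the word and another shifted 18 bits up -- is a simplification for illustration:]

```python
# CAR and CDR as field extractions from one machine word,
# in the spirit of the IBM 704 list representation.
FIELD_MASK  = (1 << 15) - 1   # a 15-bit field
CDR_SHIFT   = 18              # the 'next' field sits 18 bits higher

def car(word):
    """The 'this' link: contents of the address part of the word."""
    return word & FIELD_MASK

def cdr(word):
    """The 'next' link: contents of the decrement part of the word."""
    return (word >> CDR_SHIFT) & FIELD_MASK

# Pack a cons cell: next pointer 0o1234, this pointer 0o4321.
cell = (0o1234 << CDR_SHIFT) | 0o4321
print(oct(car(cell)), oct(cdr(cell)))   # -> 0o4321 0o1234
```

Which, of course, only underlines Ed's point: the names encode an accident of one machine's word format, not a deep semantic kinship.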
>
> The other major similarity of LISP to Java is that Jim Gosling was an
> active member of the AI gang at MIT and was one of the programmers of
> emacs, which is written entirely in LISP. Like the PARC developers of
> Smalltalk, Jim was well-educated in diverse programming language
> technologies.
>
> McCarthy is a great man of computing, and LISP was a major contribution
> to computer science, and to the development of AI. LISP doesn't have to
> have had any technical relationship to Java to be worth our recognition.
>
>> [EB]
>>
>>> My point is only that it is now convenient for John to recall many of
>>> the 'something else' technologies as 'AI contributions', when "AI" was
>>> for 20 years a reserved designation for a handful of universities and
>>> other programs working directly on knowledge modeling and language
>>> processing.
>>>
>> Actually the split between AI and mainstream commercial programming
>> occurred in 1864, when IBM switched to the 360 architecture.
> I'm sure John meant 1964, but with that date, I was expecting a
> reference to Peirce. :-)
>
>> Up to that
>> time, most AI programming was done on IBM 704-7094 systems, and most
>> LISP systems called the FORTRAN library for arithmetic functions.
>>
>> But when IBM stopped making 7094s, MIT and Stanford decided to buy
>> the DEC 10, which cost as much as the 360/50 but had the performance
>> of the 360/65. The net result was that mainstream data processing
>> stayed with IBM, but most universities followed MIT and Stanford
>> in moving to the DEC 10, DEC 20, and VAXen.
>>
> Good point! Until 1968 IBM had had the lion's share of academic
> research computer systems (and corporate ones as well) -- they were all
> IBM 704 successors that were 'high end performance' for their time, so
> there had been a common platform for software development for 10 years.
> The 360 changed the hardware architecture -- the platform -- which
> necessitated software conversion in any case. Further, the IBM
> marketing strategy of the time interfered with the delivery of
> cost-effective 'higher performance' machines, and created windows for
> DEC, GE/Honeywell and Univac in the marketplace. And at the real high
> performance mathematics end, CDC took over. (The VAX phenomenon was 10
> years later -- the next generation.) While the others were more
> successful commercially, a large chunk of the academic research
> community in general moved to the DECsystem-10. Part of the reason for
> that, BTW, was that DEC was already established in the laboratories with
> the PDP-8, and was introducing the PDP-11, and DEC understood how to
> work with academic tinkerers.
>
> (NBS itself replaced an IBM 7094 with a Univac 1108 in 1969, and
> minicomputers moved into the scientific labs in a big way. Most of them
> were Interdata 4s and DEC PDP-11s. NBS maintained a small IBM 360 to run
> specialized printing software for scientific reports. The IBM marketing
> strategy had computerized many organizations in that industry with IBM
> 360s, including the Government Printing Office.)
>
> -Ed
>
>
>
>> John
>>
>> _________________________________________________________________
>> Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
>> Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
>> Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
>> Shared Files: http://ontolog.cim3.net/file/
>> Community Wiki: http://ontolog.cim3.net/wiki/
>> To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J
>> To Post: mailto:ontolog-forum@xxxxxxxxxxxxxxxx
>>